It’s hard to believe I’ll be retiring just two weeks from today. Like most big changes in life, retirement is not so much about the end of something, but rather the start of something new. It also provides a wonderful opportunity for retrospection. So, if you’ll indulge me, I’d like to share with you some of the key strategic inflection points over the last 25 years that have brought the industry to where it is today.
Today, data centres and connectivity are inextricably linked. For data centres to function effectively, they require high-speed, reliable, and secure connectivity to transmit data between servers, networks, and end-users. In the early days, internet exchanges emerged in locations with strong connectivity, but those were few and far between. Data centres had to follow them to those same locations, making the industry incredibly network-dependent. When we look at where we started, it's almost inconceivable that we would end up with the right connectivity in as many locations as we have today. The establishment of the network was key to setting the industry on its current trajectory.
One of the most significant developments of the last 25 years was the advent of the ASHRAE recommended envelope, which specifies a long-term operating environment for greater IT equipment reliability. Before ASHRAE, data centre operators cooled to unspecified temperatures that varied from facility to facility. With no hot-aisle containment, hot and cold air cross-contaminated, making data centres incredibly inefficient. The ASHRAE window of 18 to 27 degrees Celsius provided a new way of thinking about data centre cooling.
This also enabled the focus to shift to energy efficiency. The first time I engineered a data centre to a PUE target was around 2008. At the time we were deploying free-cooling chillers to save energy and create a more efficient data centre. We were still, however, using chilled water temperatures of 6-12°C, which meant we had to run the chillers harder to achieve 16°C in the room. With the ASHRAE recommendations we were able to uplift those temperatures to 10-17°C and still achieve free cooling. This was the birth of PUE, so to speak. At a time when energy prices are skyrocketing, PUE remains a simple and effective metric for maximising efficiency and keeping energy costs down - and while I'm thinking about it, we have a very good whitepaper on that - well worth a read!
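For readers less familiar with the metric: PUE (Power Usage Effectiveness) is simply the total energy a facility draws divided by the energy that actually reaches the IT equipment, so 1.0 is the theoretical ideal. A minimal sketch, with made-up annual figures purely for illustration:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT energy.

    1.0 means every kilowatt-hour reaches the IT load; the overhead
    above 1.0 is cooling, power distribution losses, lighting, etc.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Hypothetical annual figures (illustrative only):
# 12,000 MWh total site consumption, 10,000 MWh delivered to the IT load.
print(round(pue(12_000, 10_000), 2))  # → 1.2
```

The appeal of the metric is exactly this simplicity: two meter readings give an at-a-glance measure of how much of your energy bill is doing useful computing versus running the building.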
When I think about the variety of data centres I’ve built all around the world, there is one key lesson I’ve learned along the way – one size does not fit all. A solution that works at our Harlow campus won’t necessarily work in other locations. For example, in Harlow we use indirect free cooling to help achieve a PUE of 1.2, in part by using water to help cool the room. In a location like Singapore that solution won’t work, as the air is already saturated with heat and humidity and can’t absorb the water released back into the atmosphere. California is another example, but for very different reasons: regular drought conditions and water restrictions limit how much water a data centre operator can utilise. You've got to choose the systems that work best for the climate you are in.
A huge satisfaction over the last 25 years has been watching sustainability shift from a 'nice to have' to a fundamental element designed and built into the core of new data centre facilities. Every RFP we've seen over the last five years has included sustainability among the top three requirements, and that's exactly where it should sit. In that vein, I was particularly pleased back in July 2021 to champion the use of HVO fuel at our Harlow campus. Kao Data was the first data centre developer/operator in Europe to use HVO fuel, and we demonstrated how this simple switch can cut carbon emissions from back-up generator usage by 90%. Since then we've seen the likes of Amazon, Microsoft, Google and other colo operators follow suit.
One of the things I’ve appreciated most in my career is the opportunity to work on projects that enabled a continuous evolution of design. In my time at Digital Realty I would have seven different projects underway at any one time, all in different phases from brief to commissioning. No two projects were built to exactly the same design, because every time we saw an opportunity to tweak the design, we would incorporate it into the next project and improve the efficiency the next time around.
It’s a design principle we’ve incorporated into Kao Data from day one. The innovative design we used for KLON-01 to create a highly resilient, sustainable and energy efficient data centre facility has seamlessly transitioned into our newest building, KLON-02, opening later this year. In many ways this reflects the high-powered compute many of our customers are using. Machine learning and artificial intelligence are all about continuous learning and improvement. When we apply these same concepts to data centre design, we are able to build faster, more efficient, and hopefully, cheaper data centre facilities.
It’s amazing to look back and think about what I, with the brilliant support of so many others in the industry, have been able to accomplish. Working in some truly world-class teams, I have been responsible for $850 million worth of data centre design across Europe, Australia and South East Asia, and I’ve built over 150 megawatts of capacity across 24 data centres, including 15 new builds. I’m equally proud of that achievement, and thankful for the memories, experiences, laughter and life-long friendships I’ve gathered along the way.