11 May 2020
The explosion in Artificial Intelligence (AI) applications has fast become a fundamental game changer for many industries across the globe. From weather prediction and climate change modelling to bio-computational analysis and life science exploration, the requirement to process complex data sets faster and with higher fault tolerance is a key driver for equipment manufacturers, data centres and High Performance Computing (HPC) organisations.
A 2018 IDC Report predicts that by 2021, 75% of commercial enterprise apps will use AI, over 90% of consumers will interact with customer support bots, and over 50% of new industrial robots will leverage AI. I believe we can anticipate that AI will become an ever-more influential consideration in next-generation data centres and help to further demonstrate the benefit of digitisation to other connected industries.
Today it is probable that people connect with global data centres upwards of 200 times each day. Yet outside of our sector, many remain unaware of the crucial role these facilities play in hosting and transmitting critical data. Data centres are indeed the backbone of the world’s infrastructure systems: ultra-secure buildings that contain the inner workings of ‘the cloud’. But for AI and HPC businesses across London and the UK Innovation Corridor alike, they offer users a scalable platform to drive Exascale-level computing, perform quantitative data analysis and build computational innovations that can dramatically change the course, or the perspective, of an entire generation.
The data centre industry itself is at the leading edge of digital infrastructure, responsible for powering global economies and supercomputing systems alike, yet it depends on collaboration with one of today’s least digitised industries in order to succeed. According to the McKinsey Global Institute, the construction sector is also one of the least productive: its productivity growth averaged 1% per year over the past two decades, compared with 2.8% for the total world economy.
Despite the endemic industry skills shortage, which is slowly being addressed by graduate programmes and science, technology, engineering and mathematics (STEM) initiatives, it is remarkable how much expertise remains within the data centre ecosystem. From contractors and engineers to manufacturers and developers, we bring together leading experts to collaborate and deliver continuous innovation in the design of next-generation facilities, often built to new standards of technical excellence.
As with many industries where massive investment is made in the bricks and mortar of their trade, change and disruption are often resisted. I have been involved with the Kao Data brand since its inception and believe it is truly an example of where going against the grain has created positive outcomes, especially in terms of the customers we support, our environmental impact and total cost of ownership (TCO).
For example, when Kao Data raced to become the first OCP-Ready data centre in Europe, many thought that was a big step; others did not see the relevance. Yet that decision opened the door to the HPC community and demonstrated the key differentiators that set Kao Data apart from legacy facilities. We found that when situations required it, an engineering-led and collaborative response created the greatest outcomes. Those calculated enhancements within the development process helped to deliver a ground-breaking, highly flexible and scalable technology home for HPC across the UK Innovation Corridor.
It is my belief that the data centre industry is beginning to witness monumental change. One where AI will continue to gain significant influence and become a crucial part of customer decision-making. This is true not only for the HPC applications that leverage evolving GPU (Graphics Processing Unit) technologies within our own technology suites, but more fundamentally within the organisations that design, build and operate data centres.
Within the development of the Kao Data campus, we have built the very foundations to house and deliver greater levels of collaboration and productivity through AI. It is in the very fabric of our being to push the boundaries of design, adopting a continuous learning process, which enables us to innovate as we begin the next stage of campus developments.
The construction industry indeed remains one of the last sectors waiting to be digitised, but for many, the incentives are there to transform. Great change can be enabled through use of software and digital twins to design and model new smart buildings, with AI, predictive machine learning and complex data analysis giving users the ability to truly understand the impact of large-scale projects on the environment and anticipate their future needs.
For those unaware, digital twins are virtual replicas of physical infrastructure systems and devices, including data centres and smart buildings. They offer industry professionals the ability to run simulations that will test how an environment may perform over its lifecycle, before it is deployed and fully operational.
I believe that by using simulation and Virtual Reality (VR), the data centre industry can, for example, create more accurate bills of materials (BoM) and refine work schedules that consider not only the need for on-site equipment but also the movements of in-house personnel alongside third-party contractors. We can anticipate seasonal climate conditions and long-term energy usage, predict future supply-chain scenarios and design more accurately for a lower TCO.
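To make the idea concrete, here is a minimal, purely illustrative sketch of the kind of calculation a digital twin might run: estimating annual cooling energy for a data hall across seasonal outdoor temperatures. The load figures, temperatures and the toy cooling model are all assumptions for illustration, not Kao Data’s actual design parameters.

```python
# Hypothetical digital-twin-style calculation: estimate annual cooling energy
# from seasonal outdoor temperatures. All figures are illustrative assumptions.

IT_LOAD_KW = 8000  # assumed constant IT load (kW), e.g. one 8MW data hall

# Assumed average monthly outdoor temperatures (degrees C), UK-like climate.
MONTHLY_TEMPS_C = [5, 5, 8, 11, 14, 17, 19, 19, 16, 12, 8, 5]

def cooling_power_kw(it_load_kw: float, outdoor_c: float) -> float:
    """Toy model: free-air cooling below 18 C costs only a small fan overhead;
    above that, mechanical cooling overhead grows with outdoor temperature."""
    fan_overhead = 0.03 * it_load_kw
    if outdoor_c <= 18:
        return fan_overhead
    return fan_overhead + 0.02 * (outdoor_c - 18) * it_load_kw

def annual_cooling_energy_mwh() -> float:
    """Sum each month's cooling power over its hours, returning MWh."""
    hours_per_month = 730  # approximation: 8760 hours / 12 months
    total_kwh = sum(
        cooling_power_kw(IT_LOAD_KW, t) * hours_per_month
        for t in MONTHLY_TEMPS_C
    )
    return total_kwh / 1000

if __name__ == "__main__":
    print(f"Estimated annual cooling energy: {annual_cooling_energy_mwh():.0f} MWh")
```

A real digital twin would replace this toy model with calibrated physics and live telemetry, but the principle is the same: run the facility forward under many assumed conditions before, and while, it operates.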
The digital twin is not simply a better construction tool; to think so is short-sighted. It is a next-generation lifecycle platform that adds value from design and build throughout the operational life of our facilities. It enables detailed site monitoring to identify unseen issues and alert owners so that early remedial action can take place, thereby reducing delays and cost. It offers dynamic, real-time analytics and industry-wide context into both predictions and outcomes, allowing better operational decisions to be made.
In future, it is possible to envisage a situation where AI-based technology will monitor every level of our operations, enabling us to reduce or eliminate even more faults and errors from our processes, whilst increasing power availability and IT densities. Moreover, should an increasing number of data centre operators begin to utilise digital twin techniques in their designs, this could help the industry take a more positive step toward greater openness in reporting and environmental impact.
One might even consider that this could offer an anonymous, confidential and automatic reporting function into an organisation such as DCIRN, which would provide a basis for a comprehensive and well-rounded reporting on the data centre industry.
Today, the Kao Data Campus has capacity for another three new 8MW data centres and is engineered to meet the demands of AI and Machine Learning, with customisable architectures that leverage high-capacity dark fibre wavelengths to offer diverse, ultra-low-latency connectivity routes. They deliver unmatched renewable power provision and high-density infrastructure that future-proofs against evolving GPUs, cooling them with refrigerant-free air or direct-to-chip liquid-cooling technologies.
The infrastructure housed within the walls of Kao Data London One is designed to deliver the highest levels of technical capabilities, providing customers across London and the Innovation Corridor with the most technologically advanced and scalable platforms for data analysis, Deep Learning and HPC processing in the UK.
This post was originally featured as a guest post on ComputerWeekly.