29 Oct 2020

Power-hungry: Why GPUs are taking over the data centre

The rate of change within the technology space often brings with it feelings of excitement, anticipation and, in some cases, fear. Fresh into my new role at Kao Data, and with feelings predominantly of excitement, I thought it particularly appropriate to dive into the “core” of these technology advancements – please excuse the pun!

In 1999, when a niche Silicon Valley company called NVIDIA released what it claimed was the “world’s first GPU”, the announcement did not resonate much beyond the world of video games. Back then, GPUs (graphics processing units) were incredibly important for giving the latest gaming PCs the power to generate immersive 3D worlds, but not hugely in demand otherwise.

Since then, however, GPUs have been adopted for enterprise computing, cryptocurrency mining and high performance computing (HPC) tasks such as running AI applications and quantitative research tasks. In August 2020, the revenue from NVIDIA’s data centre business surpassed its gaming revenue for the first time. GPUs are becoming a vital part of the modern data centre.

The racehorses of the processing world

The main reason for this change is that GPUs excel at parallel processing - the ability to carry out many tasks simultaneously. A traditional CPU (central processing unit) has a handful of cores designed to process tasks in sequence. A GPU can have thousands of cores optimised for simultaneous processing, primarily of mathematical operations.

Imagine, for example, that your computing task involves checking the records in a vast spreadsheet. A CPU will run through the rows of data until it answers your query. A GPU could assign a core to each row of data and check them all at the same time - and in many HPC applications, speed = money saved.
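To make the analogy concrete, here is a minimal sketch in CUDA C++ of the “one core per row” idea. The kernel name, the dummy data and the threshold are illustrative assumptions rather than anything from a real application: the point is simply that each GPU thread independently tests one row, so a million rows are checked at roughly the same time instead of one after another.

#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

// Hypothetical kernel: one GPU thread per spreadsheet row, each testing
// whether its row's value exceeds a threshold (both are made up for
// illustration only).
__global__ void check_rows(const float* values, int* matches, int n, float threshold)
{
    int row = blockIdx.x * blockDim.x + threadIdx.x;  // this thread's row
    if (row < n)
        matches[row] = (values[row] > threshold) ? 1 : 0;
}

int main()
{
    const int n = 1 << 20;                        // ~1 million rows
    std::vector<float> host_values(n, 1.0f);      // dummy data

    float* d_values = nullptr;
    int*   d_matches = nullptr;
    cudaMalloc(&d_values, n * sizeof(float));
    cudaMalloc(&d_matches, n * sizeof(int));
    cudaMemcpy(d_values, host_values.data(), n * sizeof(float), cudaMemcpyHostToDevice);

    // Launch enough 256-thread blocks to cover every row at once.
    int threads = 256;
    int blocks  = (n + threads - 1) / threads;
    check_rows<<<blocks, threads>>>(d_values, d_matches, n, 0.5f);
    cudaDeviceSynchronize();

    printf("Checked %d rows in parallel\n", n);

    cudaFree(d_values);
    cudaFree(d_matches);
    return 0;
}

On a CPU the equivalent work would be a loop visiting the rows one at a time; here every row gets its own lightweight thread, which is exactly the kind of workload where a GPU's thousands of cores pay off.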

It’s fair to wonder why, if GPUs are so fantastic, we still bother with CPUs. The answer is that CPUs are better at a wider range of tasks, such as running a general-purpose computer. They’re like farm horses that might be pulling a plough in the morning and taking the farmer to market in the afternoon. GPUs, on the other hand, are the racehorses that would turn up their noses at even the slightest mention of ploughing the turnip field...

An increasing range of uses

However, the range of tasks for which GPUs are ideal turns out to be broad. About 10 years ago they began to be used to ‘mine’ cryptocurrencies such as Bitcoin, because the mining process involves performing vast numbers of simple, repetitive calculations at speed. Crypto miners have since moved on to an even more specialised technology, known as application-specific integrated circuits (ASICs).

As demand from crypto miners dried up, NVIDIA saw its share price slump. But GPUs were already being used in data centres for high performance computing tasks such as visualisation, artificial intelligence computation and data-heavy applications. NVIDIA made this more of a focus, as did rivals AMD and UK-based start-up Graphcore. Meanwhile, CPU giant Intel has renewed its attempts at building a GPU business after several abandoned efforts over the past 25 years.

For an increasingly broad range of uses, from the Internet of Things to autonomous vehicles, and from facial recognition to healthcare analytics, GPUs are the ideal tool, which is why more and more data centres are installing them.

Power-hungry processors

The problem with GPUs is that, with all those high-performance cores packed into a small space, they require a lot of power and get very hot. Many older data centres are simply not designed to deal with this amount of heat. As mentioned above, a GPU can have thousands of cores, and a server might have as many as 16 GPUs.

In May, NVIDIA announced the latest in its cutting-edge DGX range, the DGX A100. It performs 20 times better than its predecessor thanks to the Ampere architecture, which crams 54 billion transistors onto a single chip. It is possible to combine eight of them into one giant processor weighing around 50 pounds. The company says the GPU’s focus on intensive AI tasks makes it ideal for research into the Covid-19 pandemic, to help in the search for cures and vaccines.

The days of the GPU as solely a gaming device are long gone. HPC increasingly depends on GPU-powered data centres, and in the coming years more and more customers will see them as a requirement. Kao Data is one of the very few data centres in the UK engineered specifically to cater for these kinds of deployments.

Exciting times lie ahead for everyone within the advanced computing space, especially in the Cambridge and UK Innovation Corridor - and I’m delighted to be leading the charge with Kao Data.

Tom Bethell

Tom Bethell is one of Kao Data's Business Development Directors. With a background in IT infrastructure, Tom has been working in data centre colocation and high performance computing for a number of years.


