10 Feb 2021

Life after GPUs

GPUs have become the accepted norm for all kinds of fast processing, but are there signs of something beyond the GPU?

Back in October, Kao Data’s Tom Bethel told us how GPUs are taking the spotlight from CPUs in data centres, thanks to their high performance and their ability to handle increasingly important tasks in fields like analytics and pattern recognition. Starting out as co-processors for gamers, then moving on to Bitcoin mining, they are now mainstream in the world’s fastest computers. But he warned that their high performance generates a lot of heat - and that is a serious issue.

Now, let’s remember that GPUs don’t replace CPUs, they augment them. However, in many high performance computers the balance is skewed so heavily towards GPUs that the systems are referred to as 'GPU-based'.

Supercomputer makers set great store by performance per watt, because over its working life the energy a fast processor consumes can cost more than the processor itself. GPUs deliver more performance per watt than CPUs, but because systems cram in so many of them, the GPUs draw more power in total than the CPUs do.
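To see why, here’s a rough back-of-envelope calculation in Python. Every figure in it is an illustrative assumption - a 300W processor, five years of continuous operation, a guessed facility overhead and electricity tariff - not vendor data:

# Back-of-envelope: lifetime energy cost of a processor vs its purchase price.
# All figures are illustrative assumptions, not measured or quoted values.
power_kw = 0.3              # assumed average draw: 300 W
hours = 5 * 365 * 24        # five years of round-the-clock operation
pue = 1.5                   # assumed facility overhead (cooling, power losses)
tariff = 0.12               # assumed electricity price, USD per kWh
purchase_price = 2000       # assumed cost of the processor, USD

energy_cost = power_kw * hours * pue * tariff
print(f"Energy cost over 5 years: ${energy_cost:,.0f}")   # about $2,365 here
print(f"Purchase price:           ${purchase_price:,}")

On these assumptions the electricity bill does indeed overtake the hardware cost; change the tariff or the utilisation and the balance shifts, which is exactly why performance per watt matters so much.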

GPU-based machines have dominated the Top500 list of supercomputers in recent years, but in the last year or so there have been signs of change. The top spot is currently held by Japan’s Fugaku, a CPU-based system using Fujitsu’s implementation of an Arm processor with vector extensions.

GPUs still figure in Frontier, at Oak Ridge National Laboratory, the US’s hope to beat Fugaku by creating an exascale computer - one that can handle one exaflops, or a billion billion (10^18) calculations per second.

China, meanwhile, has boosted the AMD Radeon GPU, with local server maker Sugon adopting AMD’s architecture to make processors that get round US export rules.

So is it still all about GPUs and CPUs? Not necessarily. Step back from the race for ever-faster supercomputers, and there’s a massive market emerging for AI work - and it turns out that a lot of it needs something slightly different.

GPUs were designed as specialised graphics processing units. They reached a bigger stage when their parallel operations turned out to have more general use. That gave a huge boost to GPU makers, led by NVIDIA. It also led to so-called GPGPUs (General Purpose GPUs) - a somewhat self-contradictory acronym, given that GPU refers to their original specialised task.

Now, new workloads are emerging that might need something different, which could be provided by FPGAs (field programmable gate arrays).

As the name implies, FPGAs can be re-programmed in the field, adding some flexibility to the benefits of having hardware tailored to a specific job. Like GPUs, they operate alongside a CPU, addressing specific tasks - though if those tasks become very significant, one could imagine systems where their role overshadows the CPU.

FPGAs have been engineered to be very power efficient - a key issue, as GPU densities reach levels where liquid cooling is once again standard for supercomputers.

The need for FPGAs is emerging with the rise of AI (artificial intelligence). AI requires two kinds of system: training, where a lot of fast computation generates an AI model, and inferencing, where that model is applied to real-world data. Training develops the “brain”, and inferencing deploys copies of that brain for actual use.
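The distinction is easiest to see in code. Here’s a deliberately tiny sketch in Python using a linear model as a toy stand-in for a real neural network, with synthetic data throughout - the expensive fitting step happens once, and the cheap prediction step can then run anywhere:

import numpy as np

# Illustrative only: a toy "model" showing the two phases described above.
# Real AI systems train far larger models on far more data.

# --- Training: heavy, repeated computation over data builds the model ---
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 3))                     # synthetic training data
y = X @ np.array([2.0, -1.0, 0.5]) + 0.1 * rng.normal(size=1000)
weights, *_ = np.linalg.lstsq(X, y, rcond=None)    # the learned "brain"

# --- Inferencing: light, the fixed model is applied to new data ---
new_sample = np.array([[0.3, 1.2, -0.7]])
prediction = new_sample @ weights
print(prediction)

Training needs a lot of fast parallel hardware for a limited time; inferencing is a lighter, repetitive job that runs continuously on live data - the kind of workload where efficiency dominates.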

Research has found that FPGAs are very efficient at the inferencing part of the problem. And, as AI comes into widespread use, we are going to have a lot of inferencing systems, operating on local data, deployed in local sites.

This is a key point: inferencing will take place outside specialist supercomputers, in regular data centres and new Edge facilities, where power density is constrained - so efficiency matters.

The rise of GPUs brought NVIDIA to the fore, as its systems moved out of the niche of gaming machines. If AI does something similar for FPGAs, then it will benefit the market leader there, Xilinx.

FPGAs won’t push GPUs off their perch, but in this new sector, watch for their rise!

Peter Judge (Guest)

Peter Judge is Global Editor at Datacenter Dynamics and a freelance tech writer on data centres, the cloud and networking. You can follow him at: @judgecorp


