How the Met Office's new supercomputer could save the UK £2 billion
A 140-tonne, £97 million Cray XC40
How did the Met choose the Cray XC40?
The XC40 is the latest generation of Cray supercomputer; its predecessor, the XC30, can be found at the European Centre for Medium-Range Weather Forecasts (ECMWF) in Reading.
"We wanted to act like an intelligent customer and to make sure the specifications in the tender were challenging, but realistic," says Underwood, "so we went to the various super-computing conferences to get a good understanding of the state of the industry, who the major players were, what the technologies are on offer and what the likely levels of performance might be." From seven manufacturers, it was whittled down to four, then three, finally arriving at Cray becoming the preferred bidder.
What is the lifecycle of the Cray XC40?
Supercomputers aren't an off-the-shelf product, and the Met Office's latest is due to be implemented in three phases. "The first phase will be operational by September 2015, when in terms of performance it will be a like-for-like replacement of the current IBM POWER7, and based on the Intel Haswell chip," says Underwood. By February or March 2016 the Met Office will have upgraded to the Phase 1B machines using the Intel Broadwell chip, while the final stage, in March 2017, is hoped to feature the Intel Skylake processor.
"At each stage of the process we're going to use ever more capable processor technology," says Underwood. "We expect the XC40 to last four to five years. It's not the system itself that wears out, but the rate at which it performs becomes no longer state of the art, so it slows-up the rate at which we can bring new science into our services."
Why do we need supercomputers?
"Across a huge spectrum of human endeavour, High Performance Computing (HPC) is the engine and primary tool humanity is using to advance what we value in society," says Jason Cori, Director EMEA/APAC at SGI, which has just been chosen by NASA to upgrade the Discover supercomputer at the NASA Center for Climate Simulation (NCCS) at NASA's Goddard Space Flight Center in Greenbelt, Maryland.
"Supercomputers' incredible processing speeds – as much as quadrillions upon quadrillions of calculations per second at peak performance – can gain insights from data that it would take humans many years to do. They also assist in providing a key mechanism for collaborations, both within scientific communities and across scientific disciplines."
What's NASA doing?
The Discover supercomputer at the NCCS conducts weather and climate simulations that span time scales from seasons to centuries. It's being upgraded with 1.9 petaflops of SGI Rackable clusters featuring Intel Xeon E5-2696 v3 processors, so it can handle more data and develop higher-resolution weather and climate simulations.
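To put 1.9 petaflops in context, here's a rough back-of-envelope calculation in Python. The chip figures are assumptions rather than numbers from SGI or NASA: the Xeon E5-2696 v3 is taken to be an 18-core part at a 2.3GHz base clock, with AVX2 fused multiply-add giving each core 16 double-precision operations per cycle.

```python
# Back-of-envelope peak-FLOPS arithmetic for the Discover upgrade.
# Assumed chip specs (not from the article): 18 cores, 2.3 GHz base clock,
# and 16 double-precision flops per core per cycle via AVX2 FMA.

CORES_PER_CHIP = 18
CLOCK_HZ = 2.3e9
FLOPS_PER_CORE_PER_CYCLE = 16  # 4-wide AVX2 x 2 (fused multiply-add) x 2 ports

peak_per_chip = CORES_PER_CHIP * CLOCK_HZ * FLOPS_PER_CORE_PER_CYCLE
print(f"Peak per chip: {peak_per_chip / 1e12:.2f} teraflops")  # ~0.66 TF

TARGET = 1.9e15  # the 1.9 petaflops quoted for the upgrade
print(f"Chips needed: {TARGET / peak_per_chip:,.0f}")          # ~2,900
```

On those assumptions a single chip peaks at roughly 0.66 teraflops, so an upgrade on this scale implies somewhere near 3,000 processors working in parallel.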
"In just a couple of years we have seen the science industry explode with data, enabling ground-breaking insights we couldn't imagine before," says Jorge Titinger, president and CEO of SGI. "High performance computing is a fundamental part of ensuring researchers have the tools to access all the data at their fingertips."
Only 1.9 petaflops? It may not compare to the Met Office's new supercomputer, but in the US weather forecasting is split between myriad agencies, so it's pointless to compare by the petaflop.
In what other industries are supercomputers used?
"There are a number of select industries that rely more on computing horsepower than others," says Cori, citing Earth Science as one of the. "In addition to weather and atmospheric research around the world, oil and gas is also a major consumer of HPC," he says, citing Total's supercomputer Pangea, which is helping engineers run simulations at 10 times the resolution of existing oil and gas reservoir models.
"Ultimately this new research should provide a clearer picture of what is happening beneath Earth's surface, and provide better direction to the decision makers at Total on where to place their valuable resources."
Supercomputers are also being used in computational biology and bioinformatics. "In my own work with a group at Argonne National Laboratory, [high performance computing] is accelerating advances well beyond its initial applications in the physical sciences," says Thiruvathukal.
"For example, computational biology and bioinformatics are emerging to understand the data coming from next-generation sequencing devices [and] it is likely that high performance computing will help us to understand virus mutation and evolution in real-time, and identify emerging threats."