Google Cloud developer calculates 100 trillion digits of pi

The long and chequered history of record-breaking pi calculations has gained a new chapter thanks to Google Cloud.

Google Developer Advocate Emma Haruka Iwao has successfully calculated pi to 100 trillion digits using the company's cloud platform.

What's even more striking is that this is the second time in just three years that Iwao has broken the record.

Why does this matter?

Mathematicians have been chipping away at calculating pi to ever greater precision since the days of ancient Egypt, Greece, and Babylon.

Google openly admitted that you might not need to “calculate trillions of decimals of pi” but said the “massive calculation demonstrates how Google Cloud’s flexible infrastructure lets teams around the world push the boundaries of scientific experimentation”.

Though pi-related calculations pop up in everything from the theory of relativity to engineering problems and GPS mapping, extreme calculations like this one are generally used by computer scientists as a benchmarking tool to prove and assess the power of their hardware.
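The article doesn't name the algorithm or software behind the record, so purely as a small-scale illustration of what a high-precision pi computation looks like, here is a minimal sketch of the well-known Chudnovsky series using Python's built-in decimal module. The function name, guard-digit margin, and iteration count are my own choices, not details from Google's run.

```python
from decimal import Decimal, getcontext

def chudnovsky_pi(digits):
    """Approximate pi to `digits` decimal places with the Chudnovsky series."""
    getcontext().prec = digits + 10           # extra guard digits to absorb rounding error
    C = 426880 * Decimal(10005).sqrt()
    M, L, X, K = 1, 13591409, 1, 6
    S = Decimal(L)
    for i in range(1, digits // 14 + 2):      # each term contributes ~14 digits
        M = M * (K**3 - 16 * K) // i**3       # exact integer update of the series factor
        L += 545140134
        X *= -262537412640768000
        S += Decimal(M * L) / X
        K += 12
    getcontext().prec = digits + 1            # final precision: "3." plus `digits` decimals
    return C / S

print(chudnovsky_pi(50))   # 3.14159265358979323846... to 50 decimal places
```

Record attempts work at a vastly different scale, using heavily optimized software and distributed storage rather than a single-process script like this.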

How did they do it?

Google Cloud says it used its generally available Compute Engine service to make the record calculation.

The tech giant attributed its improvement over the previous attempt in 2019 to better networking and storage.

The project achieved 100 Gbps of egress bandwidth, a huge improvement on the 16 Gbps available for the 31.4-trillion-digit calculation in 2019.
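To put those egress figures in perspective, here is a hedged back-of-envelope sketch. The 100 TB dataset size is an assumption (one byte per decimal digit stored as plain text), and real transfer times depend on protocol overhead and sustained rather than peak throughput.

```python
# Rough time to stream the full pi dataset once at the quoted egress rates.
# The 100 TB figure is an illustrative assumption, not a number from Google.
dataset_bytes = 100e12                        # ~100 TB if each of 100 trillion digits is one byte

for label, gbps in [("2019 run (16 Gbps)", 16), ("2022 run (100 Gbps)", 100)]:
    bytes_per_sec = gbps * 1e9 / 8            # convert gigabits/s to bytes/s
    hours = dataset_bytes / bytes_per_sec / 3600
    print(f"{label}: ~{hours:.1f} hours to move the full dataset once")
```

Under those assumptions, the same transfer drops from roughly 14 hours at 16 Gbps to a little over 2 hours at 100 Gbps, which is why the networking upgrade matters for a job this data-heavy.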

The project used the new Google Virtual NIC (gVNIC) network driver, which is integrated with Google's Andromeda virtual network stack.

Google also attributed the success of the project in large part to improved storage, saying that as the “dataset doesn't fit into main memory, the speed of the storage system was the bottleneck of the calculation”.

For this job, the team decided to use Balanced Persistent Disk, a newer type of persistent disk that Google said offers up to 1,200 MB/s read and write throughput and 15-80k IOPS.
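A quick sketch shows why that storage throughput, rather than CPU, can become the bottleneck when the working set won't fit in memory. The dataset size and striping factors below are illustrative assumptions; the article doesn't say how many disks or nodes Google actually used.

```python
# Why disk speed matters: the result alone dwarfs any single machine's RAM,
# so it has to stream through persistent disks. Sizes and disk counts here
# are illustrative assumptions, not figures from Google.
dataset_bytes = 100e12                        # ~100 TB, assuming one byte per decimal digit
disk_mb_per_sec = 1200                        # quoted Balanced Persistent Disk throughput

single_disk_hours = dataset_bytes / (disk_mb_per_sec * 1e6) / 3600
print(f"One full pass over ~100 TB at 1,200 MB/s: ~{single_disk_hours:.0f} hours")

for disks in (8, 32):                         # hypothetical striping factors
    print(f"Spread across {disks} disks: ~{single_disk_hours / disks:.1f} hours per pass")
```

At 1,200 MB/s a single pass over 100 TB would take roughly a day on one disk, so spreading the data across many disks and machines is what keeps the calculation fed.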

Those interested in the nitty-gritty of the project can head to GitHub to find the code Google used.

Google will also be hosting a live webinar on June 15 to share more about the experimentation process and results.

Will McCurdy has been writing about technology for over five years. He has a wide range of specialities including cybersecurity, fintech, cryptocurrencies, blockchain, cloud computing, payments, artificial intelligence, retail technology, and venture capital investment. He has previously written for AltFi, FStech, Retail Systems, and National Technology News and is an experienced podcast and webinar host, as well as an avid long-form feature writer.
