Two Quintillion

Doing research has been getting faster and faster and faster. Now, with the push of a button, answers come back in a single second for people researching daunting topics like the universe, neuroscience, aerospace, cancer, drugs, clean fusion and more. That’s faster than I can type the letter “K”.   :))))))))

Depending on how old you are, you likely remember when you used your first computer, laptop or mobile device. Clunky in scale, it amazed each of us with its capacity to make what were, at the time, lightning-fast computations. Type in a search, and bang, the info was there. Crunch a spreadsheet, and the data appeared before your eyes. Call a friend and they answered within seconds. I can remember when we first added connected computers to our campus here at KHT; the team was thrilled. Each iteration and improvement made us even more efficient at solving your PIA (Pain in the @%$) Jobs! … something we enjoy doing every day.

I was reading the WSJ the other day and came across an article about a new “exascale” computer named Aurora. Inside a vast data center on the outskirts of Chicago, the most powerful supercomputer in the world is coming to life. Its high-performance capabilities, matched with the latest advances in artificial intelligence, will help scientists research challenges like cancer, nuclear fusion, vaccines, climate change, encryption, cosmology and other complex sciences and technologies. Special thanks to WSJ writer Scott Patterson for the article and Google for the history.

Aurora is housed at the Energy Department’s Argonne National Laboratory and is among a new breed of machines known as “exascale” supercomputers. In a single second, an exascale computer can perform one quintillion operations—a billion billion, or a one followed by 18 zeros… (it looks like this 1,000,000,000,000,000,000 – yeeeowsa!).

An exascale computer refers to a computing system capable of performing one exaflop, which is a quintillion (10^18) floating-point operations per second (FLOPS). In simpler terms, it signifies the ability to execute a billion billion calculations per second. This level of computational power is a significant milestone in high-performance computing (HPC) and represents a thousandfold increase in performance compared to the previous generation of supercomputers.
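To put that quintillion in perspective, here is a minimal back-of-the-envelope sketch in Python. The desktop speed of one gigaflop (10^9 operations per second) is a round number I’ve assumed purely for illustration, not a figure from the article:

```python
# Back-of-the-envelope: how long would an ordinary desktop take
# to perform the 10**18 operations an exascale machine does in one second?
EXAFLOP = 10**18          # operations in one exascale-second
DESKTOP_FLOPS = 10**9     # assumed sustained rate of an everyday desktop

seconds = EXAFLOP / DESKTOP_FLOPS        # 1,000,000,000 seconds
years = seconds / (365.25 * 24 * 3600)   # convert seconds to years

print(f"{seconds:.0f} seconds, or about {years:.1f} years")
```

Run the numbers and one second of exascale work comes out to roughly three decades of nonstop crunching for that hypothetical desktop. Yeeeowsa indeed.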

Aurora is the size of two tennis courts and weighs 600 tons. Behind Aurora’s computing muscle are more than 60,000 graphics processing units, or GPUs, technology developed for advanced videogaming systems that has become the powerhouse of supercomputers. (That compares with the nearly 40,000 GPUs in Frontier, Oak Ridge National Laboratory’s machine, which came online last year as the first operational exascale computer and retains its title as the world’s No. 1. Aurora will likely “exceed Frontier … when finished.”)

Aurora, built by Intel and Hewlett Packard Enterprise, is slowly being turned on, rack by rack. Unlike regular computers, these high-powered machines take months to bring online as technicians look for flaws like mechanics testing a Formula One car before a race. Aurora is expected to become fully operational in 2024.

Just some of what it can do …

  • increase the accuracy of climate forecasts (the more-accurate estimates will allow planners to better prepare for the potential impacts of floods, wildfires or storms on a facility or neighborhood).
  • screen 22 billion drug molecules to accelerate drug discovery.
  • map connections in the brain, a task so complicated it could take Aurora a full day to process a tiny sliver of the brain.
  • handle the biggest large language model—a predictive AI system similar to ChatGPT—ever deployed.
  • deploy automated labs that will let the computer conduct real-world experiments using robots. It can analyze the results and modify the experiments, speeding up the research process.
  • help with the design and production of batteries that hold more power and charge faster. Batteries run on chemical reactions. Rather than test new battery chemistries in labs, scientists can use Aurora to model billions of combinations at high speed, providing scientists new insights into potentially breakthrough technologies.
  • leverage Argonne’s massive X-ray machine, the Advanced Photon Source, or APS, to examine materials at the atomic scale and feed the data into Aurora. The upgraded APS increases the brightness of the X-rays, which means experiments can be much faster. In fact, they can be so fast that humans can’t control them anymore.

There are more of these powerful machines coming soon. Lawrence Livermore National Laboratory in California is building a $600 million exascale supercomputer called El Capitan, after the famed rock formation in Yosemite National Park. It is expected to be deployed next year and could eventually exceed Aurora’s computational firepower.

According to Elon Musk, Tesla is spending more than $1 billion to build an exascale supercomputer called Dojo. China might have exascale machines, but it doesn’t provide them to outsiders for testing. Computer scientists in the U.K. and elsewhere are trying to produce their own exascale computers.

History of supercomputers, searched using ChatGPT from OpenAI (how cool is that!)

1950s-1960s: The concept of supercomputing emerged in the 1950s with the development of the first electronic computers. However, the term “supercomputer” was not widely used until the 1960s. Early supercomputers like the Control Data Corporation (CDC) 6600, introduced in 1964, were characterized by their exceptional processing speed and capability for scientific and engineering applications.

1970s: Seymour Cray, known as the “father of supercomputing,” played a crucial role in the development of supercomputers during this period. Cray founded Cray Research and introduced a series of supercomputers, including the Cray-1 in 1976, which was the first supercomputer to use vector processing.

1980s: Vector processing became a dominant feature in supercomputers during the 1980s. Cray continued to produce successful models like the Cray-2. Other companies, such as IBM with its IBM 3090 and 390, also contributed to the supercomputing landscape.

1990s: Parallel processing, where multiple processors work together on a task, gained prominence. This era saw the development of massively parallel supercomputers. Thinking Machines Corporation introduced the Connection Machine, and Cray Research produced the Cray T3D and T3E.

2000s: The rise of clusters and distributed computing marked this period. Clustered systems, composed of multiple interconnected computers, became a cost-effective way to achieve supercomputer performance; notable examples included IBM’s Blue Gene series and Cray’s XT5.

2010s: GPU (Graphics Processing Unit) acceleration gained popularity for certain types of computations, and heterogeneous computing became more common. China’s Tianhe-1A and later the Sunway TaihuLight were among the world’s fastest supercomputers during this period.

2020s: The race for exascale computing intensified, with countries and organizations aiming to build supercomputers capable of performing one exaflop or more. Frontier and Aurora supercomputers in the United States are part of this exascale push.

The supercomputer’s high-performance capabilities will be matched with the latest advances in artificial intelligence, with hard-to-imagine outcomes. Together they will be used by scientists researching cancer and other diseases, nuclear fusion, safer vaccines, climate patterns that help avoid disasters, financial and data encryption, cosmology and other complex sciences, and will aid technologies that deliver better products and services which in the past would have taken far longer to develop.

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::

DO YOU LIKE CONTESTS?
Me, too.

As you may know the Kowalski Heat Treating logo finds its way
into the visuals of my Friday posts.
I.  Love.  My.  Logo.
One week there could be three logos.
The next week there could be 15 logos.
And sometimes the logo is very small or just a partial logo showing.
But there are always logos in some of the pictures.
So, I challenge you, my beloved readers, to count them and send me a
quick email with the total number of logos in the Friday post.
On the following Tuesday I’ll pick a winner from the correct answers
and send that lucky person some great KHT swag.
So, start counting and good luck!  
Oh, and the logos at the very top header don’t count.
Got it? Good.  :-))))
Have fun!!

::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::