- Jul 24, 2021
The U.S. Department of Energy (DOE) has announced that Lawrence Berkeley National Laboratory’s National Energy Research Scientific Computing Center (NERSC) has signed a contract with Cray for NERSC’s next-generation supercomputer, a pre-exascale machine slated for delivery in 2020. The announcement comes just a day after the DOE unveiled the world’s third-fastest supercomputer at Lawrence Livermore National Laboratory.
Named Perlmutter after Berkeley Lab’s Nobel Prize-winning astrophysicist Saul Perlmutter, the system has a total contract value of $146 million and is expected to more than triple the computational power currently available at NERSC, the high-performance computing facility for the DOE Office of Science.
The contract value includes multiple years of service and support. In addition, the new system offers a number of innovative capabilities that will facilitate the analysis of massive data sets from scientific experimental facilities, a growing challenge for scientists across multiple disciplines.
The machine will be built on Cray’s Shasta architecture. Cray supercomputer systems consistently lead the industry in performance and efficient scaling, and Shasta’s hardware and software innovations tackle the bottlenecks, manageability, and job-completion issues that emerge or are magnified as core counts grow, compute-node architectures proliferate, and workflows expand to incorporate AI at scale.
The new supercomputer will be a heterogeneous system comprising both CPU-only and GPU-accelerated cabinets. It will include a number of innovations designed to meet the diverse computational and data analysis needs of NERSC’s user base and speed their scientific productivity.
Cray believes the technology will open up opportunities for expansion to meet the AI needs of companies in energy, manufacturing, automotive, and aerospace sectors, as well as financial services and insurance.