Moore’s Law and nuclear computing

1 May 2024



The nuclear sector has benefited from the computational power of High-Performance Computing (HPC), with its ability to analyse complex data, simulate intricate processes and optimise operations. Owen Thomas, founder of Red Oak Consulting, argues that, despite several challenges, the sector will continue to thrive as HPC increasingly moves to the cloud.


Above: With rising costs and shrinking space for ever more transistors on a chip, the nuclear sector faces a new dilemma

Moore’s Law, first articulated by Gordon Moore in 1965 and later revised, predicted that the number of transistors on an integrated circuit chip would double roughly every two years, leading to an exponential increase in computing power. The law has had profound implications for the development of High-Performance Computing (HPC), not least in the nuclear sector, and for the evolution of cloud computing, shaping the landscape of modern technology.

It is now well recognised that Moore’s Law is nearing its end. Since its formulation there has been roughly a trillion-fold increase in the computing power applied to predictive models. To improve these high-performance models further we need exponentially more computing power; without it, the necessary gains in accuracy will diminish. But with fabrication costs rising and chips running out of room for ever more transistors, all sectors, including nuclear, face a new dilemma.
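As a back-of-envelope sketch (an illustration added here, with the trillion-fold figure taken from the paragraph above), the arithmetic behind that growth under a two-year doubling rule looks like this:

```python
# Illustrative arithmetic only: how many doublings a trillion-fold
# increase in computing power implies under a two-year doubling rule.
import math

factor = 1e12                  # ~one trillion-fold increase
doublings = math.log2(factor)  # doublings needed to reach that factor
years = 2 * doublings          # one doubling every two years

print(f"{doublings:.1f} doublings, ~{years:.0f} years of scaling")
# -> 39.9 doublings, ~80 years of scaling
```

Roughly 40 doublings are needed, which is why sustaining such gains through transistor scaling alone is no longer realistic.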

McKinsey estimates that global power consumption will triple by 2050. With the impact of climate change adding urgency to reducing energy use and waste, the nuclear industry is accelerating innovation to drive impact and outcomes at scale. As the McKinsey report states: ‘Technologies like CCUS (carbon capture, utilisation, and storage) and nuclear will likely see additional growth if renewables build-out remains constrained’.

In addition, artificial intelligence (AI), advanced analytics, 3D imaging and the Internet of Things (IoT), all supported by HPC, are contributing to nuclear power production and smoothing the transition to a more sustainable pathway. HPC also contributes to the optimisation of power generation and distribution systems, including nuclear power plants and smart grids. Advanced simulation tools allow engineers to design more efficient turbines, boilers and cooling systems, reducing energy losses and environmental impacts, while real-time monitoring and control systems empowered by HPC are enhancing grid resilience, enabling rapid responses to outages, fluctuations and even cyber threats.

HPC in practice in the nuclear sector

The nuclear energy industry harnesses HPC across many areas, from research and education to enhancing nuclear power plant designs, predicting the behaviour of nuclear materials under extreme conditions with unprecedented accuracy, replacing real-world nuclear testing with virtual testing and, crucially, improving safety to reduce the probability of nuclear incidents. HPC assists ground-breaking research and development, accelerating the pace of discovery and enabling deeper insights into nuclear physics, materials science and reactor engineering.

Above: HPC finds extensive application across various domains within nuclear, including materials science, structural integrity, neutronics, thermal hydraulics and reactor design, and could play a pivotal role in enabling commercial fusion

For plant life management, and for delivering operation beyond design life, HPC enables thorough assessment and management of ageing degradation in structures and components through high-fidelity modelling, while reducing uncertainties and optimising operations.

Los Alamos National Laboratory in New Mexico, USA, applies HPC across many areas, from testing how materials behave inside a nuclear reactor to designing new materials from scratch in simulation, in a bid to produce better materials for clean nuclear energy production.

HPC finds extensive application across various domains within nuclear, including materials science, structural integrity, neutronics and thermal hydraulics, as well as reactor design through high-fidelity modelling.
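To make the computational burden concrete, the following is a deliberately simplified, hypothetical sketch of Monte Carlo neutron transport (a 1D slab with assumed cross-section and absorption values, far removed from the full 3D physics of production neutronics codes). Monte Carlo accuracy improves only as 1/√N, so halving the statistical error means quadrupling the particle count, which is precisely why these workloads gravitate to HPC:

```python
# Toy Monte Carlo neutron transmission through a 1D slab (hypothetical
# parameters; illustrative only, not a production neutronics code).
import math
import random

SIGMA_T = 0.5      # assumed total macroscopic cross-section (1/cm)
ABSORB_P = 0.3     # assumed probability a collision absorbs the neutron
THICKNESS = 10.0   # slab thickness (cm)

def transmitted(n_particles: int, rng: random.Random) -> float:
    """Fraction of neutrons that escape through the far side of the slab."""
    escaped = 0
    for _ in range(n_particles):
        x, mu = 0.0, 1.0                    # start at left face, moving right
        while True:
            # Sample an exponentially distributed free-flight distance
            x += mu * -math.log(1.0 - rng.random()) / SIGMA_T
            if x >= THICKNESS:
                escaped += 1                # leaked out the far side
                break
            if x < 0.0 or rng.random() < ABSORB_P:
                break                       # leaked backwards or absorbed
            mu = rng.uniform(-1.0, 1.0)     # isotropic scatter (toy model)
    return escaped / n_particles

rng = random.Random(42)
for n in (1_000, 100_000):
    print(n, transmitted(n, rng))  # noise shrinks only as 1/sqrt(n)
```

Because every particle history is independent, the work parallelises almost perfectly, which is what lets supercomputers push particle counts, and hence fidelity, so high.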

Furthermore, it could play a pivotal role in enabling commercial fusion. At an ISC focus session in May 2023, where AI and HPC were very much under debate, Rob Akers, director of computing programmes at the UK Atomic Energy Authority (UKAEA), stated: “We simply don’t have time to deliver fusion against the timeline we’ve been given using test-based design”. He added that the answer lies in exploiting the “enormous power of data science and supercomputing at scale.”

Scalability and the future of Moore’s Law

The scalability and cost-effectiveness driven by Moore’s Law have significantly influenced the development of cloud computing. The ability to pack more transistors onto a chip has led to more powerful and affordable hardware, making it feasible for cloud service providers to offer robust computing resources at lower cost, leveraging the principles of virtualisation and on-demand resource allocation. The technologies and innovation behind Moore’s Law have empowered cloud providers to continually enhance their infrastructure, giving nuclear companies the ability to scale up or down as needed.
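As a hypothetical illustration of that elasticity (the simulation function and figures below are placeholders, not a real workload), the same embarrassingly parallel parameter sweep runs unchanged whether the pool behind it is four local cores or thousands of cloud cores provisioned on demand:

```python
# Elastic scaling sketch: only the worker pool changes between a laptop
# run and a cloud run; the sweep itself is untouched. Hypothetical example.
from concurrent.futures import ProcessPoolExecutor

def simulate(power_mw: float) -> float:
    """Stand-in for one expensive reactor simulation (placeholder physics)."""
    return power_mw * 0.33  # assumed thermal efficiency, for illustration

if __name__ == "__main__":
    cases = [100.0 * i for i in range(1, 65)]  # 64 independent design cases
    # On the cloud, the pool (or a batch service behind it) scales up for
    # the sweep and back down afterwards; the code stays the same.
    with ProcessPoolExecutor(max_workers=4) as pool:
        results = list(pool.map(simulate, cases))
    print(f"{len(results)} cases complete")
```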

Furthermore, the rapid evolution of semiconductor technology has spurred innovation in cloud services. Cloud providers can leverage the latest hardware advancements to offer new and improved services to their users. This continuous cycle of innovation enhances the agility of cloud platforms, allowing them to adapt to changing technological landscapes.

While the growth of high-performance computing and the cloud has so far aligned with Moore’s predictions, it faces challenges, including physical limitations and the diminishing returns of miniaturisation. As transistors approach atomic scales, alternative technologies such as quantum computing may become necessary to sustain the pace of progress.

We could be forgiven, then, for thinking we are close to the limits of available computational power. But that is not necessarily the case: the cloud will continue to be the principal catalyst for realising HPC’s impact across all sectors, so long as we work better with the tools we have to improve efficiencies and outcomes.

Much of that will come down to training, and much to funding, but crucially it is about understanding where the true power lies: in systems that can process petabytes of data at extraordinary speed.

Over time, needs will evolve, as will the nature of support required. What is critical, however, is that as the nuclear sector evolves with HPC, it has the support to make optimal use of that power and realise HPC’s benefits. And, despite everything, Moore’s Law is still guiding the nuclear sector to look at new ways of enhancing computational power, increasing efficiencies for operators and putting greater power at the fingertips of consumers.


These themes are explored further in the Red Oak Consulting report Incorporating the cloud into the HPC mix.


