Solving the fusion multi-challenge

31 August 2023



The digital world may be key to making fusion a reality in a realistic timeframe. It will rely on step changes in high performance computing and artificial intelligence, and an open-source approach to collaboration.


Above: virtual developments may overtake ITER on the road to fusion

The world needs clean energy at scale, and fast. For many, the hope is that the need will be met by fusion. But while fusion may offer the necessary scale, the decades required to develop it, progressively through the Joint European Torus (JET) in the UK and the International Thermonuclear Experimental Reactor (ITER), still under construction in France, have so far ruled it out of power suppliers’ practical options in the short term.

New fusion initiatives want to move much faster, towards operation in good time to make a significant contribution to cutting carbon emissions by 2050. Now UKAEA, which hosts JET and is a centre for fusion development in the UK, has joined with the University of Cambridge, Dell and Intel to achieve that goal by moving from the real to the digital world to design and test a fusion reactor. The team will be able to take advantage of new supercomputers – and deploy a new open approach to development that should make it faster and more robust.

Launching the collaboration, Dr Rob Akers, Director of Computing Programmes at UKAEA, said that the concept at the heart of the so-called STEP (Spherical Tokamak for Energy Production) mission is to put fusion on the electricity grid in the 2040s. He described this as “a moonshot programme to prove fusion can be economically viable”. The collaboration is an important part of that mission, and of developing and nurturing the supply chain that will design and construct the world’s first fusion power plants.

Akers was blunt about the task. He said, “there is insufficient time to do engineering [for the fusion plant] if we do it the way we have been doing it for decades”. Historically, engineering has been carried out through an iterative, test-based approach, in which “we design and build prototypes and then evaluate them and move forwards”.

But that is time-consuming and expensive, and now “we have 17 years to stand up STEP and plug it in. We need to think differently, change the engineering design process and take it into the virtual world.”

This is a path that is well-trodden in other industries; examples include the move from wind tunnels to computational fluid dynamics. But the fusion challenge is more difficult because “it is an incredibly complex, strongly coupled system. The models underpinning it are limited in accuracy, there are coupling mechanisms to be taken into account, there is physics that spans the whole machine, from structural forces, to heat loads, through the power plant, to electromagnetism and radiation.” A single change to a subsystem can have huge ramifications across the plant, and the designers will have to look for emergent behaviour that would otherwise only become apparent once the plant is built.
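
To make the coupling problem concrete, here is a minimal, hypothetical sketch (toy models, not UKAEA’s codes or the STEP toolchain) of how a coupled simulation steps several interacting subsystem models forward together, so that a change in one subsystem feeds back into all the others:

```python
# Hypothetical illustration of a strongly coupled simulation loop.
# The "physics" below are toy stand-ins, not real plasma, thermal or magnet models.

def plasma_heat_output(field_strength):
    # Toy model: heat load on the wall grows with magnetic field strength.
    return 10.0 * field_strength

def wall_temperature(prev_temp, heat_load, cooling):
    # Toy model: wall temperature relaxes towards the applied heat load.
    return prev_temp + 0.1 * (heat_load - cooling * prev_temp)

def magnet_field(wall_temp):
    # Toy model: hotter structures slightly degrade magnet performance.
    return 5.0 / (1.0 + 0.01 * wall_temp)

def simulate(steps, cooling):
    temp, field = 300.0, 5.0
    for _ in range(steps):
        heat = plasma_heat_output(field)              # plasma physics
        temp = wall_temperature(temp, heat, cooling)  # thermal/structural response
        field = magnet_field(temp)                    # electromagnetics feeds back
    return temp, field

# Changing a single subsystem (here, the cooling design) shifts the whole coupled
# state: the kind of plant-wide ramification the designers have to track.
print(simulate(1000, cooling=0.5))
print(simulate(1000, cooling=0.4))
```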

The need to “simulate everything everywhere all at once” requires supercomputing and artificial intelligence. Akers said UKAEA and its partners need to use the world’s largest supercomputers and run simulations at 10^18 calculations per second (exaflops) to reduce the time to a solution, optimise the plant design and quantify engineering risk.

The UK government recently announced additional funding of £250m (US$318m) to boost research into artificial intelligence, quantum technologies and engineering biology; it will also fund an exascale supercomputer. The team hopes to use that machine over the next 10 years to produce a digital version of STEP that can be used to dramatically reduce the need for real-world validation.

Dr Paul Calleja, Director of Research Computing Services at the University of Cambridge, talked in more detail about the ‘multi-challenge’ of a simulation that has to couple different types of physics – fluids, plasma and materials – and do so on a range of timescales, some very long.

Even with an ‘exa’ supercomputer to call on, there is a long way to go in developing the tools that will use the full potential of a machine that will cost £600m (US$762m) and draw 20 MW of power in operation. Exaflop computing power is wasted if there is a ‘bottleneck’ in the computation process, such as slow performance of the code or slow access to data.
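
As a rough illustration of why such bottlenecks matter (with assumed figures, not project data), Amdahl’s law shows how even a small serial or data-bound fraction of a workload caps the benefit of adding more compute:

```python
# Amdahl's law: speedup = 1 / ((1 - p) + p / n),
# where p is the fraction of the work that parallelises and n is the number
# of parallel resources available.
def max_speedup(parallel_fraction, n):
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / n)

# If 5% of a run is stuck in a serial bottleneck (slow code or data access),
# even an effectively unlimited machine delivers at most about a 20x speedup.
print(max_speedup(0.95, 1_000_000))   # ~20
print(max_speedup(0.999, 1_000_000))  # ~999
```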

With Intel and Dell on board, the partnership brings together hardware and application providers and scientists to look at this as a holistic problem. Moving to Intel’s new GPU systems provides an order of magnitude more performance per flop, while Nigel Green, Director of Emerging Technologies and Solutions, EMEA, at Dell, said there would be a step change in how the companies work together. Adam Roe, HPC Technical Director at Intel, said the company was excited by the initiative, adding that, once informed by data, high performance computing can move from simulation to more complex science.

What is more, by taking an open-source approach, the partners are effectively calling on the global industry to help with the challenge. All work is on open-standard hardware and open-source software. The partners are also looking at the so-called ‘middleware’, and at how to make the programme accessible to a broad range of scientists and engineers who are not used to supercomputing technologies.

Tool chains and engineering approaches will be open, and that brings the benefit of expert, concerted development, the partners explained. Open source means “the methodology gets critical analysis”, they said, while tool chains are “more critically analysed, more sane and better reviewed”. What is more, such software remains up to date.

Open source applies to the design and build of the applications. Will the information be open? The project is funded by the UK government, and that always involves a requirement to share learnings and sometimes data; other fusion developers are taking the same approach. But data sets and intellectual property will remain proprietary.


Addressing open-source security concerns

The ‘open source’ approach raises an issue in a world where there is a growing fear of ‘spyware’ or loss of control to unfriendly regimes.

The partners dismissed that bogeyman, saying “very strong cyber security” would protect data. Such fears may or may not be justified, but in technical sectors that require political support, if they are not addressed at the highest level the effect can be dramatic. In 2020 the UK government forced China’s Huawei out of its 5G networks, following on from US sanctions, fearing that the company could spy on businesses and citizens. Huawei saw its UK turnover fall from £1.28bn in 2018 to £359.1m (US$1.63bn to US$456m) for the year ended December 2022. The day before the UKAEA launch, UK television showed an investigation into whether Chinese-supplied cameras used for surveillance in UK cities give China an opportunity to do its own surveillance. For the public, the fear clearly remains that ‘back doors’ allow software to be controlled by others: it is a fear that has to be addressed.

Exaflop computing power may be key to accelerated practical fusion
New fusion reactor designs may emerge from open-source collaboration

