Data is power

30 July 2020



The influx of new data technologies collectively known as Industry 4.0 promises to bring innovation to nuclear sites and projects. It is already finding roles that accelerate decommissioning. Jeremy Gordon talks to Robert Plana, chief technical officer of Assystem, about some of the company’s experience so far.


INDUSTRY 4.0 IS THE TERM for future work environments with total data mobility and transparency, where plant components collect data on their operation and share it with each other. Even today, IBM estimates that a typical factory produces 1TB of data each day, but that only 1% of it is ever analysed. In future the volume of data will balloon, while more advanced analytics using artificial intelligence (AI) and machine learning (ML) techniques will take on more and more roles helping managers make sense of it. Taken to the extreme, some factories would even see distributed decision-making between autonomous plant components, integrated with data coming from supply and distribution chains. On-site workers would be presented with exactly the information they need at the moment they need it, and their work schedules could be optimised in real time.

The cumulative impact of these incoming technologies will be transformative for many industries, but probably less so for the nuclear industry. While nuclear plants have long been full of sensors constantly recording data, the regulatory environment is not conducive to rapid innovation in sharing, processing and using that data. It is at this point that Industry 4.0 becomes very challenging for the nuclear industry, especially in power plant operation. The need for security restricts the transmission of operational data, and it can be very difficult to justify any changes to systems related to safety or security. According to the World Institute for Nuclear Security, “The convergence of IT and operational technology is inevitable, and with it come significant opportunities and risks,” but the issue remains unsolved and unscoped as we enter the 2020s.

Expert intelligence

While a sensor-led digital revolution is not forthcoming for the nuclear industry, it can instead take advantage of the wealth of structured data it already holds. Robert Plana, chief technical officer of Assystem, explained some of the approaches the company has taken.

Usually in AI and ML applications, the best performance comes from training new systems on huge datasets, sometimes drawn from several sources across a range of fields. While nuclear industry security constraints make that approach impossible, an alternative for Plana and Assystem has been to create expert systems on a standalone basis, in which all the data remains the property of the customer and under its control at all times. Lacking a large pool of external training data, the company usually uses 80% of a customer’s real-life data for training and the remaining 20% for testing. What might be lost in the raw performance of learning systems may be compensated for by the rigour of the input data and the expert knowledge encoded on top, says Plana.
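As a rough, hypothetical illustration of that split, the sketch below partitions a customer dataset 80/20 using scikit-learn; the article does not describe Assystem’s actual tooling, and the records here are invented.

```python
# Minimal sketch of the 80/20 split described above, using scikit-learn.
# `records` and `labels` are invented stand-ins for a customer's
# prepared dataset.
from sklearn.model_selection import train_test_split

records = [[0.2, 1.4], [0.7, 0.3], [1.1, 0.9], [0.5, 2.0], [0.9, 1.1]]
labels = [0, 1, 1, 0, 1]

# 80% of the data goes to training; the remaining 20% is held out for testing.
X_train, X_test, y_train, y_test = train_test_split(
    records, labels, test_size=0.2, random_state=42
)
```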

Supporting a forthcoming decommissioning project in France, Assystem created a dedicated search engine based on an archive of thousands of paper documents. The process is illustrated in Figure 1.

The first stage was to scan the documents and use optical character recognition (OCR) to convert them to machine-readable form. They were then classified and made available in full for search.
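A minimal sketch of that first stage might look like the following, here using the open-source Tesseract engine via pytesseract; the article does not say which OCR tools were actually used, and the archive path is hypothetical.

```python
# Hedged sketch of the scan-and-OCR stage using Tesseract via pytesseract.
from pathlib import Path

import pytesseract
from PIL import Image

corpus = {}
for page in Path("archive_scans").glob("*.png"):
    # Convert each scanned page to machine-readable text;
    # lang="fra" assumes French-language plant documents.
    corpus[page.name] = pytesseract.image_to_string(Image.open(page), lang="fra")
# The extracted text can then be classified and indexed for full-text search.
```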

Next the team built up an ontology – a map of the subject domain based on vocabulary and relationships apparent in the text – which further informed the search.
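As a toy illustration of what such an ontology might look like in code, the graph below links a few invented domain terms using networkx; a real ontology would be mined from the document text itself.

```python
# Toy ontology as a directed graph of domain terms and relationships.
# The vocabulary below is invented for illustration.
import networkx as nx

ontology = nx.DiGraph()
ontology.add_edge("building A", "storage tank 3", relation="contains")
ontology.add_edge("storage tank 3", "caesium-137", relation="holds")
ontology.add_edge("caesium-137", "radioactive material", relation="is_a")

# A query for "radioactive material" can be expanded to the concrete
# items the ontology links to it.
for source, _, data in ontology.in_edges("radioactive material", data=True):
    print(source, data["relation"], "radioactive material")
```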

Naturally the highly structured information in plant and regulatory documents made a good basis for this, but for Plana the best starting points are sometimes the tables.

“Engineers are very structured and in the tables they summarise the important numbers,” he says, explaining that the documents are usually exploded, with plain text, glossary and tables handled separately. After analysis, the machine may propose to begin from the data tables, but the engineers will always guide and tailor the process.
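One way to picture that exploding step is a parsed document split into prose, glossary entries and tables so each can be handled separately; the input structure below is purely illustrative, not Assystem’s format.

```python
# Illustrative sketch of "exploding" a parsed document into separate
# streams: plain text, glossary entries and the data tables engineers
# use to summarise the important numbers. The input format is invented.
def explode(document: dict) -> dict:
    parts = {"text": [], "glossary": [], "tables": []}
    for block in document["blocks"]:
        if block["type"] == "table":
            parts["tables"].append(block["rows"])
        elif block["type"] == "glossary":
            parts["glossary"].append(block["entry"])
        else:
            parts["text"].append(block["content"])
    return parts

doc = {"blocks": [
    {"type": "table", "rows": [["isotope", "activity (Bq)"], ["Cs-137", "4.2e9"]]},
    {"type": "text", "content": "Tank 3 was drained in 1998."},
]}
print(explode(doc)["tables"])  # the machine might propose starting here
```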

To complement the decommissioning documents and ontology, a set of 20,000 questions and answers was added, encoding valuable institutional knowledge. “I call that the digital heritage of the company,” says Plana. “If you mix the knowledge of the expert, the most frequent questions, with the deep learning techniques you are able to build a very smart search engine.” In that way a comparatively small input of data – 48GB of documents – created a powerful subject-specific knowledge tool, including a site inventory database.
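A very simplified sketch of how curated question-and-answer pairs can power retrieval is shown below, matching a query against the stored questions with TF-IDF similarity; the pairs are invented, and the production system described in the article will be far more sophisticated.

```python
# Sketch: match an incoming query against a curated Q&A set using
# TF-IDF similarity. The two pairs below are invented; the real system
# holds around 20,000 and uses deep learning techniques.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

qa_pairs = [
    ("Where are the radioactive materials in building A stored?",
     "In shielded casks in room A-12; see inventory sheet INV-204."),
    ("When was storage tank 3 drained?", "In 1998, per report D-1998-17."),
]

questions = [q for q, _ in qa_pairs]
vectorizer = TfidfVectorizer().fit(questions)

def answer(query: str) -> str:
    # Return the stored answer whose question best matches the query.
    scores = cosine_similarity(
        vectorizer.transform([query]), vectorizer.transform(questions)
    )[0]
    return qa_pairs[scores.argmax()][1]

print(answer("radioactive materials in building A"))
```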

The final stage was the addition of a natural language interface. Engineers could investigate anomalies and their meaning with questions like, ‘Can you identify the radioactive materials in building A?’ or ‘Can you compare the radioactivity levels listed in documents with those measured when the dismantling team started to operate on the site?’
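As a toy example of such a front end, the sketch below maps one question pattern onto an inventory lookup with a regular expression; production systems use trained language models, and the inventory shown is invented.

```python
# Toy natural-language front end: recognise one question pattern and
# turn it into an inventory lookup. The inventory data is invented.
import re

inventory = {"building A": ["Cs-137 casks", "contaminated pipework"]}

def ask(question: str) -> str:
    match = re.search(r"radioactive materials in (building \w+)", question, re.I)
    if match:
        items = inventory.get(match.group(1), [])
        return ", ".join(items) if items else "No inventory records found."
    return "Question not understood."

print(ask("Can you identify the radioactive materials in building A?"))
```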

Plana says the system “can retrieve information in seconds that would otherwise take a week.”

Digital twinning

Detailed knowledge of the precise state of a facility is essential to plan its decommissioning. Only from a complete picture is it possible to correctly define the size of the team, the tools and the processes needed, and therefore the schedule and cost. To achieve this, the digital twin must encompass the plant schematics and components down to the smallest possible level, including all the documentation to describe the parts and their maintenance.
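A minimal sketch of the per-component record such a twin needs might look like the following; the field names are illustrative, not Assystem’s schema.

```python
# Minimal per-component record for a digital twin: identity, location,
# and links to the drawings and maintenance history that describe it.
# Field names are illustrative only.
from dataclasses import dataclass, field

@dataclass
class Component:
    tag: str                                          # plant identifier
    location: str                                     # room / grid reference
    drawings: list = field(default_factory=list)      # design documents
    maintenance_records: list = field(default_factory=list)

pump = Component(
    tag="P-104",
    location="building A, room 12",
    drawings=["DWG-0432 rev C"],
    maintenance_records=["WO-2015-118: seal replaced"],
)
print(pump.tag, "->", pump.drawings)
```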

For many of the facilities entering decommissioning today there are only 2D drawings and descriptions, usually of how the facility was designed, plus updates from subsequent modifications. However, no facility is ever built exactly according to its design. There are always small issues and optimisations when it actually comes to screwing things onto the wall. “A digital twin creates the environment to move between the two,” says Plana.

A first step in reconciling all this design information is 3D laser scanning of the facility to capture its current topology. It is then possible to proceed to what is known as ‘Build4D’: the concept of checking a space through time as operations take place, for example installing large new components. As well as simulating the procedures for major tasks, Build4D also enables the management of coactivity at the facility – the avoidance of conflicts where two teams need to access the same space at the same time, or where the sheer number of people on site would run into safety or security constraints. If a specialised team cannot complete its work on a given day, there is no guarantee that the personnel or the tools will be available the next day. These simple coactivity conflicts can quickly throw a project off schedule.
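At its simplest, the coactivity check can be pictured as interval overlap: flag any two tasks that claim the same space in overlapping time windows. The sketch below uses invented task data; a real 4D model checks volumes in the scanned geometry, not room labels.

```python
# Simplified coactivity check: flag tasks that need the same space in
# overlapping time windows. Task data is invented.
from itertools import combinations

tasks = [
    ("remove pipework", "room 12", 1, 4),   # (name, space, start_day, end_day)
    ("scaffold erection", "room 12", 3, 5),
    ("radiological survey", "room 7", 2, 3),
]

def conflicts(task_list):
    for (n1, s1, a1, b1), (n2, s2, a2, b2) in combinations(task_list, 2):
        if s1 == s2 and a1 < b2 and a2 < b1:   # same space, overlapping days
            yield n1, n2, s1

for t1, t2, space in conflicts(tasks):
    print(f"Conflict: '{t1}' and '{t2}' both need {space}")
```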

Scheduling can also be informed by knowledge gained as tasks are completed. For example, if OEM information suggests it will take three days to remove a certain part but in practice it usually takes four, the overall model can be updated with that result.
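One simple way to fold those observations back in is a running average that drifts the planning estimate from the OEM figure towards what crews actually achieve; the smoothing factor below is an illustrative choice, not a published method.

```python
# Sketch: revise a task-duration estimate as actual durations come in,
# using an exponential moving average. alpha is an illustrative choice.
def updated_estimate(current: float, observed: float, alpha: float = 0.3) -> float:
    """Blend the current estimate (days) with a newly observed duration."""
    return (1 - alpha) * current + alpha * observed

estimate = 3.0                       # OEM says three days
for actual in [4.0, 4.0, 3.5]:       # what the removal actually took
    estimate = updated_estimate(estimate, actual)
print(f"Revised planning estimate: {estimate:.2f} days")
```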

“The more data you have, the more your model is representing reality and more you will be able to be precise concerning project delivery,” says Plana.

Build4D has also proven useful in optimising outage management in the French nuclear fleet.


Author information: Jeremy Gordon, Director at Fluent in Energy, an independent communications consultancy

Robert Plana is the chief technical officer of Assystem
Figure 1: Process of building an advanced search engine (Source: Assystem)

