Although the nuclear energy industry has taken steps to improve cyber security, supported by the International Atomic Energy Agency (IAEA), a report by Chatham House suggested that the sector still lacks the expertise to deal with such security issues compared with other industries. This is partly a result of regulatory requirements, which mean that digital systems are adopted later than in other types of infrastructure, and partly due to the industry’s longstanding focus on physical protection and safety: while those aspects of security are very robust, less attention has been paid to developing cyber security.

Nuclear facilities are increasingly reliant on digital systems and make growing use of commercial “off-the-shelf” software, which offers considerable cost savings but increases vulnerability to hacking. Combined with a lack of executive-level awareness of the risks, this could mean that nuclear plant personnel are not fully aware of their vulnerability to a cyber attack.

There is a widespread belief that nuclear facilities are fully “air-gapped” – completely isolated from the public internet – and that this protects them from cyber attacks. This is not entirely the case. Air gaps can be breached in several ways, sometimes as simply as by plugging in a flash drive. The commercial benefits of internet connectivity mean that some nuclear facilities have virtual private networks and other connections that were installed by contractors and other legitimate third-party operators, sometimes long ago, and potentially undocumented or forgotten.

Meanwhile, hacking has become easier to conduct and more widespread. Automated packages targeting known and newly discovered vulnerabilities are widely available; the advanced techniques used by malware such as the Stuxnet worm are now known and copied; and search engines can readily identify critical infrastructure components that are connected to the internet.

Challenges for the industry

Cyber security incidents at nuclear facilities are infrequent, which makes it difficult to assess the true extent of the risk. In addition, there is limited collaboration and information sharing compared with other industries, so the nuclear industry is slow to learn from industries that are more advanced in this field. A shortage of regulatory standards, and limited communication between cyber security companies and vendors, are also concerns.

It has been reported that cyber security training at nuclear facilities is insufficient. In particular, there is a lack of integrated cyber security drills between nuclear plant personnel and cyber security personnel.

Many industrial control systems were designed and built before cyber security was an issue, and as a result, cyber security measures were not designed in from the beginning. Standard IT solutions, such as patching, are difficult to implement at nuclear facilities, mainly because of concern that a patch could break a system. In addition, supply chain vulnerabilities mean that equipment used at a nuclear facility risks compromise at any point in the supply chain.

These factors suggest that the industry’s risk assessment on cyber security may underestimate the risk.

Known cyber security incidents at nuclear facilities

There have been several cyber security incidents reported at various nuclear facilities. There may be others, as some operators are reluctant to report incidents for fear of reputational damage. This makes it difficult to assess the extent of the problem, and it can foster the belief that incidents are rare, reinforcing the view that cyber security is not a major issue. It also means the industry learns only slowly from incidents that have occurred and is slow to enhance its defences. Since a cyber attack technique attempted against one facility may well be attempted against others, this lack of disclosure is a cultural issue that has to be overcome.

One expert said he believed that there may have been up to 50 actual control systems cyber incidents in the nuclear industry. It is not possible to verify this estimate, but it suggests there may be many other unreported incidents.

Some of the known incidents include:

Ignalina, Lithuania 1992

A technician at Ignalina nuclear power plant intentionally introduced a virus into the industrial control system. He claimed that this was to highlight the cyber security vulnerabilities of such plants.

This illustrates the danger of the insider threat. In this case little harm was caused but if there had been malicious intent a serious incident could have been initiated. Air-gapping does not protect against threats of this nature.

Davis-Besse, US, 2003

In January 2003, the Davis-Besse nuclear power plant was infected by the ‘Slammer’ worm. The worm first infected a consultant’s network. From there, it infected the corporate network of FirstEnergy Nuclear, which operates the plant. This corporate network was connected directly to a SCADA system at Davis-Besse, and the worm spread to this system, where it generated so much traffic that the system was overwhelmed. The safety parameter display system was unavailable for five hours. Fortunately, the reactor was not operating at the time, but the same failure could have occurred while it was online. A patch for the vulnerability had been released six months earlier, which would have prevented the infection, but it had not been installed on any of the systems.

This problem arose because the vendor was permitted to access the network without protections or control. This provided a source of vulnerability, enabling malware to enter the network. The problem was exacerbated by not keeping up-to-date with protections against specific, known vulnerabilities.

Protecting against this threat requires attention being paid to all elements that connect to the network, and ensuring proper control of these systems.

Browns Ferry, US, 2006

In August 2006, Browns Ferry experienced a malfunction of both the reactor recirculation pumps and the condensate demineraliser.

Both of these contain microprocessors that send and receive data over an Ethernet network, which makes them susceptible to failure if they receive too much traffic. This is what happened at Browns Ferry, and the plant’s Unit 3 had to be manually shut down.

Although this was not a cyber attack, it shows the potential impact one might have. If a hacker were to cause a recirculation pump to fail, in combination with an infection by a worm like ‘Slammer’ (which could disable the sensors warning of a problem), a serious incident could result.

Hatch, US, 2008

In March 2008, Hatch experienced a shutdown as an unintended consequence of a contractor update. An engineer from Southern Company, the contractor that manages the plant’s technology operations, installed an update to a computer on the plant’s business network. The computer was connected to one of the plant’s industrial control system networks and the update was intended to synchronise the two. The synchronisation briefly reset the control system’s data to zero. However, the plant’s safety system interpreted this as indicating that there was insufficient water to cool the reactor core, and put the unit into automatic shutdown for 24 hours.

This demonstrates that nuclear owners and operators must be aware of the full ramifications of connecting their business networks to a plant’s industrial control systems. In this instance, the update’s unforeseen consequences did not put the plant in danger, although they did cause a costly shutdown. It does, however, demonstrate how a hacker might attack an industrial control system by making a change to a plant’s business network. The military historian Liddell Hart characterised this type of attack as the ‘Strategy of the Indirect Approach’.

Natanz and Bushehr, Iran, 2010

The Stuxnet computer worm infected both the Natanz nuclear facility and the Bushehr nuclear power plant in Iran, partially destroying around 1000 centrifuges at Natanz. The worm is believed to have been designed by the US and Israeli governments and specifically targeted to disrupt Iran’s uranium enrichment programme. Neither the US nor Israel has openly acknowledged any involvement in the worm’s development or its intended use, however.

It is considered probable that the worm initially spread when infected USB flash drives were introduced into these facilities, which were compromised despite being air-gapped.

Stuxnet infects computers that run the Microsoft Windows operating system, taking advantage of vulnerabilities in the system that allow it to obtain system-level access. The worm also makes use of stolen digital certificates so that the files it installs appear to come from a legitimate company, thus deceiving anti-virus software.

Stuxnet was aimed at inflicting damage on centrifuges at an enrichment plant, but its capabilities demonstrate the destructive potential of such technologies, and it is believed that other countries are developing similar offensive cyber capabilities.

Unnamed Russian nuclear power plant, 2010

Eugene Kaspersky, founder and CEO of Kaspersky Lab, said in 2013 that Stuxnet infected a Russian nuclear power plant in 2010, but the plant has not been identified. Kaspersky said the plant’s internal network, which was air-gapped, had been “badly infected”.

Korea Hydro and Nuclear Power Company, 2014

In December 2014, hackers infiltrated and stole data from the commercial network of Korea Hydro and Nuclear Power, which operates 23 of South Korea’s nuclear reactors.

The hackers gained access through phishing emails sent to employees, some of whom clicked on the links, causing malware to download. The hackers obtained the blueprints and manuals of two reactors, as well as personal data on 10,000 employees and radiation exposure estimates for local residents.

The hackers demanded money, threatening to release the data otherwise. South Korea blamed North Korea for the attack; North Korea denied any involvement, and there the matter ended.

The incident does demonstrate the rise in extortion as a motivation for hackers.

Responses

It is evident from these examples that the potential threats come from a variety of sources: insider attack; infection from contractor software; microprocessor failure; government-sponsored cyber attack; and unknown methods of infection. From these, it is clear that air-gapping succeeds as a protection only if the isolation of the network from external influences is maintained. In each of these confirmed cases, infection took place when the air gap was breached, whether by flash drive, contractor connection, or internal operator override.

The first and most robust protection against cyber attacks is to maintain the air gap at all times. Since flash drives and unauthorised access can circumvent an air gap, it is critically important to control such access points. Basic cyber security protocols, such as preventing the use of unauthorised flash drives, can improve protection, although they cannot guarantee security.

The nature of threats can evolve swiftly, and attack tools are constantly being modified. While the first line of defence is ensuring that a potential infection cannot gain access in the first place, robust systems must also be in place to deal with infections that have occurred. Cyber threats can be extremely sophisticated at propagation and concealment once inside a system, and they typically deploy techniques to evade anti-virus protections.

It is important for nuclear facilities to share information on threats. There can be a reluctance to disclose information about cyber attacks and potential indicators of compromise, partly due to concern about reputational damage. However, it is important for everyone to have full knowledge of potential threats.

The changing nature of threats

It is commonplace to assume that a cyber attack will necessarily be directed either against the control systems of a nuclear facility, if the objective is to cause damage or disruption, or against financial data on the network, if the threat is financial in nature.

But these are not necessarily the only potential routes. Ian Bonnett, former director of Ridgewood Europe, said that cyber security was not the only issue: the organisation had to be ready to deal with hybrid threats as well. A hybrid threat is one in which a low-level cyber attack is used to facilitate another form of attack.

An example of a hybrid threat might be one in which a cyber attack is used to access employee information or to clone an onsite pass for a contractor. This would make it possible for an unauthorised person to gain access to the site, giving them a greater range of options, one of which might be using a flash drive to install malware that would otherwise not be able to access the network.

Such hybrid threats, while difficult to organise, are also difficult to protect against. It is worth noting that much of the literature on hybrid threats focuses on a mixture of malware techniques, such as combining a Trojan with a worm that is used to drop a virus. The literature also looks at the different effects malware can have, such as destroying data, providing access, or leaking information. In some cases ‘hybrid’ is used to refer to a multiplicity of effects.

However, according to the European Parliamentary Research Service, a hybrid threat should be considered as one resulting from the convergence and interconnection of different elements, which together form a more complex and multidimensional threat.

Based on this, the combination of cyber and physical methods of attacking the security of nuclear facilities needs a coordinated response.

However, there is something of a cultural clash between nuclear plant personnel, who are primarily operations technology (OT) engineers, and cyber security personnel, who are IT engineers. They can often have conflicting priorities. One engineer who attended an IAEA meeting said OT and IT engineers had such different perspectives that communication was difficult. He said: “The OT engineers want security added to a system, without invalidating any of the previous tests. However, it is often not possible to introduce security without involving a change that would require the previous tests to be invalidated and need to be carried out again.”

He gave the example of adding security to a valve controller. This might introduce incompatibilities between the security and the safety system, especially if the plant wanted to connect the valve controller to the network, to gain easier access to plant data.

Consequently, one of the key elements in cyber security is improving communication between the people in the various elements of plant operation, who have different priorities and attitudes.

Because there have been relatively few cyber security incidents, and not all of these have been disclosed, it is difficult to assess the extent of the threat, which may lead nuclear industry personnel to believe that it is not a high priority. In addition, there is limited collaboration with other industries or within the nuclear industry itself, so this is a field in which the industry tends to be slow to learn.