
Reducing Cybersecurity Risks for the Electric Power Grid: Applications for Decision Science

May 8, 2013

This is derived from an assignment for my Analysis of Uncertain Social Systems course.

1. Introduction

In recent years, concerns have grown about the cybersecurity risks to our electrical grid infrastructure. In a recent Executive Order (2013), President Obama stated that “the cyber threat to critical infrastructure continues to grow and represents one of the most serious national security challenges we must confront.” An attack or incident affecting the electrical grid is of particular concern because electricity underpins other basic services such as healthcare, transportation, and financial infrastructure. Although security technology is in place, many security procedures depend on daily human decision-making to set priorities and respond to incidents. Findings from the judgment and decision-making literature can be applied to improve these daily decisions and increase the effectiveness of cybersecurity efforts.

1.1. Cybersecurity Risks and Regulation

Cybersecurity includes all of the technologies, processes, and practices designed to protect networks, computers, and data from attack, damage, or unauthorized access. In the context of the electrical grid, vulnerabilities extend beyond computers and servers to include all network-connected sensors and control devices. Common cyber exploits include denial-of-service (DoS) attacks, phishing, viruses, Trojan horses, worms, and SQL injections, among others (GAO, 2012). These attacks can have physical impacts, such as long-lasting blackouts and damaged infrastructure, as well as psychological impacts, such as loss of faith in the government (NRC, 2012).

The North American Electric Reliability Corporation (NERC) is a non-profit entity certified by the Federal Energy Regulatory Commission (FERC) to develop and enforce reliability standards (EIA, 2007). These standards detail rules for general operations as well as for protection in the event of natural and man-made disasters (e.g., accidents, cyberattacks, bombings) (NERC, 2013b). NERC has divided the country into eight regions, each of which ensures compliance at the regional level by sending auditors to utilities.

One problem with this compliance-based regulatory system is that many of the rules are ambiguous. For example, a key element of compliance is identifying “critical assets” and prioritizing their protection. However, even the authorities disagree about the exact definition. In a recent order (NERC, 2013a), FERC remanded NERC’s interpretation of “essential to the operation of the Critical Asset”. NERC had determined that “a Cyber Asset that ‘may’ be used, but is not ‘required’ (i.e., without which a Critical Asset cannot function as intended), for the operation of a Critical Asset is not ‘essential to the operation of the Critical Asset’ for purposes of Requirement R3.” However, FERC found that “the proposed interpretation of ‘essential’ may leave certain cyber assets lacking the required CIP Reliability Standards protection that could, if compromised, affect the operation of associated Critical Assets”. These fundamental ambiguities make enforcement of the reliability standards difficult and can divert resources from actual security efforts. Given this uncertainty, security officers and auditors must decide on a daily basis what does or does not constitute a security risk.

1.2. Applications for Decision Science

Given the uncertain nature of cyber threats and the importance of decision-making in prioritizing resources and ensuring compliance with security procedures, research in judgment and decision-making can be used to improve assessments and personnel training. Security officers at utilities may hold many misconceptions about cybersecurity risks (Pietre-Cambacedes et al., 2011). As a result, they may misallocate resources and underestimate threats. By identifying the specific biases and heuristics at play, we can begin to identify potential interventions to increase the effectiveness of cybersecurity. This paper outlines three concepts from decision science that could be used to improve cybersecurity efforts: the affect heuristic, optimism bias, and the additivity of risk.

2. Using Decision Science to Improve Cybersecurity Risk Assessment

2.1 Affect Heuristic

People understand risk in two ways: through the “analytic system” (risk as analysis – logical, rational, effortful) and the “experiential system” (risk as feelings – intuitive, fast, automatic). These systems operate in parallel, and risk assessment should include inputs from both. Affect is an important aspect of risk perception that shapes probability judgments – for example, more is spent on deterring terrorist attacks than on other, more prevalent risks because terrorism is perceived as worse (Slovic et al., 2004). Consequently, for cybersecurity, compliance officers are likely to spend more time protecting their systems from the worst attacks, those that would disrupt operations. While terrorists might be interested in causing a blackout, others, such as environmental “hacktivists”, might simply be interested in discovering illegal dumping practices. Given the high uncertainty surrounding cyber threats, compliance officers tend to make decisions based on a desire to feel secure rather than on an objective measure of security (McDermott, 2012). As a result, they focus on reducing the risk of terrorist attacks, which would produce a clear indication of an event, rather than working to prevent espionage, which is harder to detect. Ultimately, this means that money and effort are diverted from the bigger risk.

Viruses are the most frequent cyberattack vector and are preventable through software patching (CSI, 2008). Although this type of attack is unlikely to disrupt operations, it can slow down systems, waste computing resources, and expose the company to espionage. Rationally, then, compliance officers should prioritize low-cost preventative maintenance such as applying patches regularly. However, in an industrial control system it can be difficult to upgrade the operating systems and software that control discontinued equipment; if the commercial software is no longer supported, it may not run on a newer operating system. In that case, the alternative is to employ strict procedures (e.g., no USB sticks) to ensure that the machine remains isolated. When preventative maintenance is difficult and the consequences seem minor, inaction is easier to justify. This leads to misconceptions that involve a denial of reality and rely on reductive views of security. It is an affective response to the problem, and it does not lead to an optimal allocation of resources.

2.2 Optimism Bias

It is likely that compliance officers are also affected by optimism bias when assessing cybersecurity risks. Since high uncertainty makes the risks difficult to quantify, qualitative judgment is used to allocate resources. Unfortunately, there is evidence that optimism bias increases for negative events that are perceived as highly controllable (Harris, 1996). In particular, people, especially heavy internet users, tend to be unrealistically optimistic about internet events; this finding was correlated with controllability, desirability, and personal experience (Campbell et al., 2007). As a result, optimism bias is likely to grow as more security measures are put in place, because additional measures tend to increase the perceived controllability of the risk. This contributes to many of the misconceptions noted above.

The tendency to engage in practices that exhibit optimism bias can be attributed in part to the fact that people tend to perceive an inverse relationship between risks and benefits (Slovic, 1987). For example, on the isolated control system running an old operating system described above, an employee might perceive a benefit from the improved workflow of using a USB stick to transfer files to that machine. Despite the known risks of USB sticks, the high perceived benefit would make the risk seem lower. The employee would likely be confident that the USB stick is clean, even without evidence to support that belief. Because these decisions are made at the employee level, it is difficult to ascertain whether all security procedures are being followed. Compliance officers could use methods such as drills and regular training to evaluate compliance, but some uncertainty would remain in any such estimate.

2.3 Additivity of Risk

It is well known that unpacking a risk – listing all of the possible scenarios within a risk category – leads to a higher estimate of risk than judging the category as a whole. This has been observed among both experts and non-experts (Tversky and Koehler, 1994). Risk assessments for terrorism are particularly sensitive to this phenomenon because terrorism incites fear and publicity, which can bias predictions. Mandel (2005) found that unpacking leads to subadditive results (P(terrorist attack) < P(attack by al-Qaeda) + P(attack by non-al-Qaeda)) and that refocusing leads to superadditive results (P(attack) > 1 − P(no attack)).

However, risk assessments can be designed to elicit probabilities systematically in a way that reduces bias. Evidence suggests that the refocusing effect is reduced when transparency is high and the complementary task is made salient (Mandel, 2005). Compliance officers should consider all of the possible scenarios when evaluating the risk to which a particular component is exposed; this reduces the likelihood of underestimating the risk. In addition, refocusing can ensure that the risk is not overestimated. Together, these two effects can be used to bound the probability of attack. This exercise will help compliance officers estimate more accurately which aspects of the system are exposed to more risk and therefore need more security resources.
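
To make the bounding exercise concrete, the sketch below shows one way such an elicitation could be tallied. It is only a rough illustration: the probabilities and the scenario breakdown are hypothetical assumptions, not a protocol or data from Mandel (2005).

```python
# Hypothetical elicitation sketch -- all numbers are made up for illustration,
# not drawn from Mandel (2005). Per the effects described above, a refocused
# judgment (1 - P(no attack)) tends to come in low, while the sum of unpacked
# scenario judgments tends to come in high, so the two can bracket a direct
# (packed) judgment of the risk to a given component.

direct = 0.10                    # direct judgment: P(successful attack on this component)
unpacked = [0.05, 0.04, 0.03]    # judged P for each specific attack scenario
p_no_attack = 0.93               # judged P(no attack), elicited separately

upper = min(sum(unpacked), 1.0)  # unpacking guards against underestimation
lower = 1.0 - p_no_attack        # refocusing guards against overestimation

print(f"refocused estimate (lower): {lower:.2f}")
print(f"direct judgment:            {direct:.2f}")
print(f"unpacked sum (upper):       {upper:.2f}")
# A wide spread between the bounds flags an incoherent assessment that is
# worth revisiting before security resources are allocated to this component.
```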

3. Conclusion

There are many biases and heuristics that affect the assessment of cybersecurity risk. This paper highlighted only three: the affect heuristic, optimism bias, and the additivity of risk. As a consequence of the affect heuristic, compliance officers tend to focus on preventing low-probability cyberattacks that would interrupt operations rather than more common espionage attacks. Preventing observable attacks increases the sense of security without actually increasing the overall security of the system, and can even decrease it by diverting resources. In addition, many cybersecurity behaviors are likely to exhibit optimism bias because of perceived controllability and perceived benefits in a specific context. However, it is possible to design risk assessment protocols that reduce the effects of bias by unpacking and refocusing the risks for each aspect of the system.

It is important for regulators to consider these biases when designing regulations and incentives for compliance. By focusing on processes rather than outcomes, regulators can hold compliance officers accountable while increasing the accuracy of their judgments (Lerner and Tetlock, 1999). It may also be useful to encourage a security culture, which establishes common beliefs and values that define behavioral norms. Sorensen (2002) identified clear communication and commitment from senior management as important for fostering a safety culture; the same is likely true for a security culture. An organization with a strong security culture would likely pay more attention to security issues and be less likely to underestimate risks. If there is a common belief that avoiding a cybersecurity incident matters most, then the utility of taking risks (such as using the USB stick on the isolated computer) will be lower and people will be less likely to engage in those behaviors.
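
As a toy illustration of that last point, the sketch below runs a simple expected-utility comparison. The benefit, probability, and cost figures are entirely hypothetical; the point is only to show how a culture that raises the perceived cost of an incident can flip the decision.

```python
# Toy expected-utility comparison with hypothetical numbers -- purely illustrative.
# Using the USB stick on the isolated machine saves some time but carries a small
# perceived probability of introducing malware. A strong security culture raises
# the perceived cost of an incident, which can flip the expected utility negative.

time_saved_benefit = 1.0     # perceived benefit of the shortcut (arbitrary units)
p_infection = 0.02           # perceived probability the stick introduces malware

for perceived_incident_cost in (10.0, 200.0):   # weak vs. strong security culture
    expected_utility = time_saved_benefit - p_infection * perceived_incident_cost
    decision = "take the shortcut" if expected_utility > 0 else "follow procedure"
    print(f"incident cost {perceived_incident_cost:6.1f}: "
          f"expected utility {expected_utility:+.2f} -> {decision}")
```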

In addition, it is important to consider how biases and heuristics may shape perceptions of risk in order to design effective communications. If compliance officers are systematically overestimating the risks from terrorism relative to espionage, then it may be useful to design a communication campaign that increases the availability of the risks from espionage. Alternatively, if optimism bias is the bigger driver, then statistics about risks will not encourage behavior change; risk communication will instead need to highlight personal rather than overall risk. Ultimately, more research is needed to identify which heuristics and biases are most influential in shaping how compliance officers perceive cybersecurity risks. Once identified, communications can be designed to mitigate their effects and improve security behaviors.

4. References

Campbell, J., Greenauer, N., Macaluso, K., & End, C. (2007). Unrealistic optimism in internet events. Computers in Human Behavior, 23(3), 1273–1284.

Richardson, R. (2008). CSI Computer Crime and Security Survey. Computer Security Institute. Retrieved on May 5, 2013, from https://www.hlncc.com/docs/CSIsurvey2008.pdf.

Exec. Order No. 13636, 78 FR 11737-11744 (2013). Improving Critical Infrastructure Cybersecurity. Retrieved on April 15, 2013, from https://federalregister.gov/a/2013-03915.

Harris, P. (1996). Sufficient Grounds For Optimism?: The Relationship Between Perceived Controllability and Optimistic Bias. Journal of Social and Clinical Psychology, 15(1), 9–52.

Kahneman, D., & Tversky, A. (1979). Prospect Theory: An Analysis of Decision under Risk. Econometrica, 47(2), 263–292.

Lerner, J. S., & Tetlock, P. E. (1999). Accounting for the Effects of Accountability. Psychological Bulletin, 125(2), 255–275.

Mandel, D. R. (2005). Are Risk Assessments of a Terrorist Attack Coherent? Journal of Experimental Psychology: Applied, 11(4), 277–288.

McDermott, R. (2012). Emotion and security. Communications of the ACM, 55(2), 35.

National Research Council (NRC). (2012). Terrorism and the Electric Power Delivery System. Washington, DC: The National Academies Press.

North American Electric Reliability Corporation (NERC). (2013a). Order on Interpretation of Reliability Standard, 142 F.E.R.C. ¶ 61,204. Retrieved on April 15, 2013, from http://www.ferc.gov/whats-new/comm-meet/2013/032113/E-11.pdf.

North American Electric Reliability Corporation (NERC). (2013b). Reliability Standards for the Bulk Electric Systems of North America (pp. 113–551). Retrieved on April 15, 2013, from http://www.nerc.com/docs/standards/rs/Reliability_Standards_Complete_Set.pdf.

Pietre-Cambacedes, L., Tritschler, M., & Ericsson, G. N. (2011). Cybersecurity Myths on Power Control Systems: 21 Misconceptions and False Beliefs. IEEE Transactions on Power Delivery, 26(1), 161–172.

Slovic, P., Finucane, M. L., Peters, E., & MacGregor, D. G. (2004). Risk as Analysis and Risk as Feelings: Some Thoughts about Affect, Reason, Risk, and Rationality. Risk Analysis, 24(2), 311–322.

Slovic, P. (1987). Perception of Risk. Science, 236(4799), 280–285.

Sorensen, J. N. (2002). Safety Culture: A Survey of the State-of-the-Art. Reliability Engineering & System Safety, 76(2), 189–204.

Tversky, A., & Koehler, D. J. (1994). Support Theory: A Nonextensional Representation of Subjective Probability. Psychological Review, 101(4), 547–567.

U.S. Energy Information Administration (EIA). (2007). Electric Power Industry Overview 2007. Retrieved April 15, 2013, from http://www.eia.gov/cneaf/electricity/page/prim2/figure7.html.

U.S. Government Accountability Office (GAO). (2012). Cybersecurity: Challenges in Securing the Modernized Electricity Grid. (Publication No. GAO-12-507). Retrieved April 15, 2013, from http://www.gao.gov/products/GAO-12-507T.
