
NAE Engineering for You Video Entry: Skin Deep

April 4, 2014

I’m currently taking the Motion Picture Fundamentals course at Pittsburgh Filmmakers. For the digital project, I decided to make a movie for the NAE Engineering for You Video Contest. We brainstormed about a bunch of different ideas and, as you can see, settled on artificial skin.

Since I’m an engineer, I had to make a prototype version using iMovie before we learned how to use Final Cut Pro in the class. Making the prototype was really helpful for thinking through how to put the movie together before I ran out of time for filming. You can see that prototype version (which I entered in the CMU version of the NAE contest) here.

If You Feel Secure, You’re Probably Not Secure

June 23, 2013

As I head home from the Trustworthy Cyber Infrastructure for the Power Grid (TCIPG) Summer School, I wanted to capture some of my immediate reactions. The photo is from the Ameren Smart Grid Training Platform.

  • It is difficult for operations and IT staff to communicate because they use the same words to mean very different things. For example, in operations ‘security’ centers on reliability, while in IT it is more about preventing attacks. IT people think of communications as routable over a network, but electricity in the power grid is not routable to the same extent.
  • There is a need for more personnel trained specifically in cybersecurity. People are very worried about the workforce pipeline.
  • SCADA systems are particularly vulnerable because they don’t get patched frequently (due to reliability concerns, unsupported software, maintenance contracts, etc.), often run on devices without a conventional operating system and thus can’t support anti-virus software, and lack basic security features such as encryption.
  • AMI systems have specific vulnerabilities because the hardware sits outside a secure perimeter (and is thus at risk of tampering) and there is a monoculture of devices, so a single vulnerability could affect a lot of people. However, luckily, AMI is also not a very good pathway into a SCADA system.
  • Security is defined by three aspects: confidentiality, integrity, and availability (CIA). Different systems put more emphasis on different aspects. For example, an enterprise system is most concerned about confidentiality while a control system is more concerned about availability. Security is addressed through people, process, and technology (and policy).
  • Main defense/hygiene capabilities include cryptography (encryption), authentication (certificates, key management), and redundancy.
  • Some engineering requirements conflict with security; latency is one example. Encryption increases the latency of communications, which in an emergency can make the difference between a small blackout and a large one.
  • There is a lack of documentation. Most power plants rely on CAD drawings. However, there isn’t a way to capture all of the software settings in this format.
  • I started thinking about the value of freedom vs. need for secrecy in the government. I should write another post on that.
  • Corporate culture might play a big part in how well companies address cybersecurity.
  • I really enjoyed the hands-on lab experience. It’s one thing to talk about sniffing a network in an abstract sense and quite another to actually see it. I was impressed by how much you can learn about a network with a couple of simple commands (a rough sketch of what that looks like follows this list).
  • There is a need for simplicity in network architecture. Utilities with huge systems cannot keep track of all that complexity.
  • There are many third-party connections, such as contractors and vendors, that reduce security.
  • Defense in depth is a good strategy. Defense by obscurity is not.
  • There is an intersection between physical and cyber security. Physical security tools such as cameras can be used both to commit and to identify cybercrime.
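
For a rough sense of what that lab exercise felt like, here is a minimal Python sketch that uses the Scapy library to passively summarize traffic. This is not the tooling we used at TCIPG; the interface name is a placeholder, and running it requires root privileges plus the scapy package.

    from scapy.all import sniff

    def summarize(pkt):
        # Print a one-line summary (source, destination, protocol) for each packet seen.
        print(pkt.summary())

    # Passively capture 50 packets on a (hypothetical) local interface.
    sniff(iface="eth0", prn=summarize, count=50)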

I also made a lot of great contacts with people at utilities, NERC, FERC, DOE, PJM, and contractors involved in NERC CIP compliance. In terms of my research, there seems to be specific interest in work related to:

  • Spear phishing -> it is a problem, and training and technology have not solved it. Companies run phishing campaigns to train employees not to click links, but many people still fall for them. Ultimately, these attacks prey on people’s kindness, which may be hard to train out of them. A different strategy could be more effective.
  • Insider threat -> this is a very dangerous attack vector and difficult to protect against.
  • Regulatory uncertainty -> it might be interesting to add a time component to identify how beliefs and misconceptions change as regulations change (e.g., CIP v3 to v5).
  • Information sharing -> this is not incentivized, and utilities are reluctant to trust the ISAC because it is part of NERC (the compliance authority).

Again, these are all immediate impressions and not necessarily backed up by data.

Thoughts on the NIST Cybersecurity Framework Workshop

May 30, 2013

Since it was on campus, I attended part of the NIST Cybersecurity Framework Workshop to get a sense of how to develop my mental models project. They reported on some initial findings from the Request for Information (RFI) and had breakout groups to discuss the gaps that needed more information. I attended two of the four breakout sessions – “The Business of Cyber Risk” and “Threat Management”.

A couple things popped out to me based on the discussions:

  1. There are fundamental internal and external communication problems - This was true for the RFI as well as for our discussions. We spent time debating the definitions of threat vs. vulnerability vs. risk because everyone was using those words differently. It’s hard to have a conversation across sectors without a common language for cybersecurity. In addition, individuals raised questions about how to communicate cybersecurity risks to others within the business, e.g., executives.
  2. People desire simplicity – they want the framework to be simple (as well as risk-based, flexible, etc.). Some individuals from the Energy Sector also spoke of a desire for simpler regulations that are less onerous for compliance. This reminded me of the notion that simple policy is better policy. Smaller companies just want cybersecurity to be easy – they want the framework to point them to best practices and tell them what to do. For most businesses, cybersecurity is a distraction that they want to go away. Particularly for the Energy Sector, there is also a desire for certainty so that they can make business decisions without worrying about what next year’s cybersecurity rules will be.
  3. The most useful information-sharing is happening in informal channels - This is because official sources of information, particularly the kind that is published publicly, are not timely, and business entities do not trust that they will be protected if they share information. For official information, by the time you are notified of a threat, it is too late to do anything about it. More useful, timely information comes from peers who have built a trust relationship and share real-time threat information. Some of this happens under NDAs in a quid pro quo environment where you can get kicked out for never sharing information (unlike an ISAC). There is a lot of concern about liability if, for example, shared information is wrong. There was a desire to completely separate the compliance and information-sharing functions of regulatory agencies – for the Energy Sector, NERC is the regulatory agency and also runs the ES-ISAC. As a result, business entities are hesitant to share information that may implicate them in non-compliance.
  4. Threat information is repetitive - I’m not sure about this, but it seems like it could have implications for attention due to heuristics and biases. Threats that are “over-shared” may seem more or less important than others. If this information is perceived as noise, it might not be considered as carefully. It probably depends on the level of analytics in use (if any). This is also why the informally shared information is more useful – it is generally more relevant, specific, and actionable.
  5. There are perverse incentives – Particularly in the Energy Sector, the compliance-based approach creates numerous perverse incentives. Utilities delay upgrades to make auditing easier. For example, early cybersecurity regulations were concerned with IP protocols, so some utilities avoided upgrading their serial control systems in order to avoid regulation – even when IP protocols offered needed improvements (e.g., improved situational awareness). There is also little incentive to be certified as a black start asset, since there are additional costs associated with securing those assets – as a result, the reliability of the system is lower because there are fewer black-start-designated resources.

I can’t verify the validity of all of these statements – but they were thoughts/stories shared during the workshop that I found particularly interesting.

What’s your Energy Pet Peeve?

May 13, 2013

This was a question posed during the NIEES workshop I attended in April. For some reason, that question really stuck with me. At the time, I said something about the split incentive problem because I am increasingly grossed out by the insulation problems in my house (if you stand near some of the windows while it’s raining really hard, you get wet – forget air leakage, there is water leakage).

However, as I thought more about it, I wish I had said something about anti-science environmentalism instead. I’m referring to the type of environmentalism that ignores the real complexity that exists around sustainability and energy issues. For example, carpooling does not reduce carbon emissions in every context. Sure, carpooling is better than everyone driving separate cars, but it’s not better than taking the bus (which would be operating anyway). Carpooling is also not the more environmentally friendly option if you convince a friend to go out of their way to pick you up and drive you to an event that they would not otherwise have attended – in this case, it really would have been better to just drive yourself. I feel like a lot of self-proclaimed environmentalists have a tendency to label a certain action as GREEN across the board without layering that critical carbon footprint/life cycle analysis (LCA) lens on it.

As someone who has (briefly) studied LCA and tried to deal with these types of complexities, I find this incredibly irritating. It’s not ok to complain about our lack of investment in renewable energy when you are using air-conditioning in coal country and are staunchly anti-nuclear. What if environmental regulations are bad for the economy? We have to deal with that. What if these regulations are just moving our pollution to other countries? That’s not ok either. The point is – it’s complicated – and you can’t avoid that by following 10 Tips for Living a Green Life. You can’t get mad that low-income women use disposable diapers because they don’t have the time or resources to do laundry that often. Maybe instead of using green cleaning products, you should just clean less (this is why I’m a bad roommate, but my point stands).

So rather than becoming bitter, I’ve been trying to think about how I can get involved and make a difference. I’ve become increasingly interested in regulation and the ethical issues related to it as I more seriously consider a career in government. Some ideas that have crossed my mind:

I recently went on a tour of downtown Pittsburgh led by the Pittsburgh History and Landmarks Foundation. I was mostly shocked by how many of the buildings are empty. That’s one of the things I like about Pittsburgh: it has so much potential – it’s a hopeful place. There’s a real desire to make changes here (although maybe I’m just optimistic).

I want to do some sort of public outreach through art. I’ll have to think about it.

Learning to Take Care of Plants

May 12, 2013

Fairy Garden

I went to the May Market and one booth was selling fairy gardens (miniature perennials)! Obviously, I was smitten.

I ended up purchasing three plants that do well in the shade to create a little fairy world on my office windowsill. They include:

I hope I can keep them alive!

Canning Season is Coming!

May 12, 2013

[Image: grapefruit in ginger mint syrup]

In April, I got some grapefruits from my grandparents while visiting them in AZ for NIEES. So far, I have made grapefruit in ginger mint syrup (shown above) and grapefruit jam. Unfortunately, supreming grapefruit is extremely difficult (so I got Caitlin to do it for me) and I am still not great at jam … so I’m not optimistic. We will be testing out the jam tomorrow. I have a feeling it will overset and become more candy than jam because I may have cooked it too long. Clearly, I need to invest in a candy thermometer. I’m still planning to make a citrus marmalade … so maybe that will go better?

In the meantime, I am going through my cookbooks and highlighting recipes to try this year. My goal for this summer is to figure out how to make jam.

To Make:

  • Raspberry Jam
  • Strawberry Jam
  • Honey Lemon Apple Jam
  • Pickled Brussels Sprouts
  • Marinara Sauce
  • Sweet Pumpkin Pickles
  • Pepper Jelly
  • Watermelon Jelly

In addition to old favorites like brandied peaches, maple bourbon pickles, nectarine-lime jam, dilly beans, salsa, and spicy pickled squash.

Reducing Cybersecurity Risks for the Electric Power Grid: Applications for Decision Science

May 8, 2013

This is derived from an assignment for my Analysis of Uncertain Social Systems course.

1. Introduction

In recent years, concerns have grown related to the cybersecurity risks of our electrical grid infrastructure. In a recent Executive Order (2013), President Obama states that “the cyber threat to critical infrastructure continues to grow and represents one of the most serious national security challenges we must confront.” An attack or incident impacting the electrical grid is of particular concern due to the interconnectedness of electricity and other basic services such as healthcare, transportation, and financial infrastructure. Although there is security technology in place, some elements of security procedures involve daily human decision-making to establish priorities and respond to incidents. Findings from the judgment and decision-making literature can be applied to improve these daily decisions to increase the effectiveness of cybersecurity efforts.

1.1. Cybersecurity Risks and Regulation

Cybersecurity includes all of the technologies, processes, and practices designed to protect networks, computers, and data from attack, damage, or unauthorized access. In the context of the electrical grid, vulnerabilities extend beyond computers and servers to include all network-connected sensors and control devices. Common cyber exploits include denial-of-service (DoS) attacks, phishing, viruses, Trojan horses, worms, SQL injection, and many more (GAO, 2012). These attacks can lead to physical impacts, such as long-lasting blackouts and damaged infrastructure, as well as psychological impacts, such as loss of faith in the government (NRC, 2012).

The North American Electric Reliability Corporation (NERC) is a non-profit entity certified by the Federal Energy Regulatory Commission (FERC) to develop and enforce reliability standards (EIA, 2007). These standards detail rules for general operations as well as for protection in the event of natural and man-made disasters (e.g., accidents, cyber-attacks, bombings) (NERC, 2013b). NERC has divided the country into eight regions, which ensure compliance at a regional level by sending auditors to utilities.

One problem with this compliance-based regulatory system is that many of the rules are ambiguous. For example, a key element of compliance is identifying “critical assets” and prioritizing their protection. However, even the authorities disagree about the exact definition. In a recent order (NERC, 2013a), FERC remanded NERC’s interpretation of “essential to the operation of the Critical Asset”. NERC determined that “a Cyber Asset that ‘may’ be used, but is not ‘required’ (i.e., without which a Critical Asset cannot function as intended), for the operation of a Critical Asset is not ‘essential to the operation of the Critical Asset’ for purposes of Requirement R3.” However, FERC identified that “the proposed interpretation of ‘essential’ may leave certain cyber assets lacking the required CIP Reliability Standards protection that could, if compromised, affect the operation of associated Critical Assets”. These types of fundamental ambiguities make enforcement of the reliability standards difficult and can divert resources from actual security efforts. Given this uncertainty, security officers and auditors must make daily decisions about what does or does not constitute a security risk.

1.2. Applications for Decision Science

Given the uncertain nature of cyber threats and the importance of decision-making in prioritizing resources and ensuring compliance with security procedures, research in judgment and decision-making can be used to improve assessments and personnel training. Security officers at utilities may hold many misconceptions about cybersecurity risks (Pietre-Cambacedes et al., 2011). As a result, they may misallocate resources and underestimate threats. By identifying the specific biases and heuristics at play, we can begin to identify potential interventions and solutions to increase the effectiveness of cybersecurity. This paper outlines three specific concepts from decision science that could be used to improve cybersecurity efforts: the affect heuristic, optimism bias, and the additivity of risk.

2. Using Decision Science to Improve Cybersecurity Risk Assessment

2.1 Affect Heuristic

People understand risk in two ways: through the “analytic system” (risk as analysis – logical, rational, effortful) and the “experiential system” (risk as feelings – intuitive, fast, automatic). These systems operate in parallel, and risk assessment should include inputs from both. Affect is an important component of risk perception that shapes probability judgments – for example, more is spent on deterring terrorist attacks than on other, more prevalent risks because terrorism is perceived as worse (Slovic et al., 2004). Consequently, for cybersecurity, compliance officers are likely to spend more time protecting their systems from the worst attacks, which would disrupt operations. While terrorists might be interested in causing a blackout, others, such as environmental “hacktivists”, might simply be interested in discovering illegal dumping practices. Given the high uncertainty regarding the nature of cybersecurity, compliance officers tend to make decisions based on a desire to feel secure rather than on an objective measure of security (McDermott, 2012). As a result, they focus on reducing the risks of terrorist attacks, which would produce a clear indication that an event occurred, rather than working to prevent espionage, which is harder to detect. Ultimately, this means that money and effort are being diverted from the bigger risk.

Viruses are the most frequent cyberattack vector and are preventable via software patching (CSI, 2008). Although this type of attack is unlikely to disrupt operations, it could slow down systems, waste computing resources, and expose the company to espionage. Therefore, rationally, compliance officers should prioritize low-cost efforts at regular preventative maintenance, such as applying patches. However, in an industrial control system, it can be difficult to upgrade operating systems and software that control discontinued equipment. If the commercial software is no longer supported, it may not be able to run on a newer operating system. In this case, the alternative is to employ strict procedures (e.g., no USB sticks) to ensure that the particular machine remains isolated. When it is difficult to perform preventative maintenance and the consequences seem minor, it is easier to justify inaction. This leads to misconceptions that involve a denial of reality and rely on reductive views of security. It is an inefficient affective response to the problem and does not lead to an optimal allocation of resources.

2.2 Optimism Bias

It is likely that compliance officers are affected by the optimism bias when assessing cybersecurity risks. Since it is difficult to quantify these risks due to the high uncertainty, qualitative judgment is used to allocate resources. Unfortunately, there is evidence that optimism bias increases for negative events that have high perceived controllability (Harris, 1996). In particular, people – especially heavy internet users – tend to be more optimistic about internet-related events in general, a finding correlated with controllability, desirability, and personal experience (Campbell et al., 2007). As a result, optimism bias is likely to increase as more security measures are put into place, because this will tend to increase the perceived controllability of the risk. This likely contributes to many of the misconceptions described above.

The tendency to engage in practices that exhibit optimism bias can be attributed in part to the fact that people tend to perceive an inverse relationship between costs and benefits (Slovic, 1987). For example, in the isolated control system described above, which runs an old operating system, an employee might perceive a benefit from an improved workflow when using a USB stick to transfer files to that machine. Despite the known risks of using USB sticks, this context would make the risks seem lower as a consequence of the high perceived benefits. The employee would likely be confident that this USB stick is clean, even with no evidence to support that. Given that these types of decisions are made at the employee level, it is difficult to ascertain whether all security procedures are being followed. Compliance officers could use methods such as drills and regular training to evaluate compliance, but there would still be some uncertainty associated with this estimate.

2.3 Additivity of Risk

It is well known that unpacking a risk, or listing all of the possible scenarios within a risk category, will lead to a higher estimate of risk than simply listing the category. This has been observed among both experts and non-experts (Tversky and Koehler, 1994). In addition, risk assessments for terrorism are particularly sensitive to this phenomenon because terrorism incites fear and publicity, which can bias predictions. Mandel (2005) found that unpacking leads to subadditive results (the judged probability of a terrorist attack is less than the sum of the judged probabilities of an attack by al-Qaeda and an attack by a non-al-Qaeda group) and that refocusing leads to superadditive results (the judged probability of an attack exceeds one minus the judged probability of no attack).
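
To make the two patterns concrete (the numbers below are invented purely for illustration and are not from Mandel’s data):

    P(attack) < P(attack by al-Qaeda) + P(attack by a non-al-Qaeda group)   (unpacking: subadditive)
    P(attack) > 1 − P(no attack)                                            (refocusing: superadditive)

For example, an assessor might judge P(attack) = 0.30 while separately judging P(attack by al-Qaeda) = 0.20 and P(attack by a non-al-Qaeda group) = 0.15 (a sum of 0.35), and might also judge P(no attack) = 0.80, so that 1 − P(no attack) = 0.20 falls below the direct estimate of 0.30.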

However, risk assessments can be designed to systematically solicit probabilities in a way that reduces bias. Evidence suggests that the refocusing effect is reduced when transparency is high and the complementary task is made salient (Mandel, 2005). Compliance officers should consider all of the possible scenarios when evaluating the risk to which a particular component is exposed; this will reduce the likelihood of underestimating the risk. In addition, refocusing can ensure that the risk is not overestimated. These two effects can be used to bound the probability of attack. This exercise will help compliance officers more accurately estimate which aspects of the system are exposed to more risk and therefore need more security resources.
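
As a minimal sketch of how unpacking and refocusing could be combined in an assessment worksheet – the function, scenario list, and numbers here are hypothetical, not part of any standard or NERC-mandated method:

    def bounded_probability(packed, unpacked_scenarios, p_no_event):
        """Return a (low, high) band for the judged probability of compromise."""
        unpacked_total = min(1.0, sum(unpacked_scenarios))  # unpacking tends to raise estimates
        refocused = 1.0 - p_no_event                        # refocusing tends to lower them
        judgments = [packed, unpacked_total, refocused]
        return min(judgments), max(judgments)

    # Example judgments for one substation control network (made-up numbers):
    low, high = bounded_probability(
        packed=0.30,                            # "any compromise this year"
        unpacked_scenarios=[0.15, 0.10, 0.10],  # phishing, infected USB stick, vendor connection
        p_no_event=0.80,                        # "no compromise this year"
    )
    print(f"Judged probability of compromise: between {low:.2f} and {high:.2f}")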

3. Conclusion

There are many biases and heuristics that affect the assessment of cybersecurity risk. This paper highlighted only three: the affect heuristic, optimism bias, and additivity of risk. As a consequence of the affect heuristic, compliance officers tend to focus on preventing low-probability cyberattacks that would interrupt operations rather than more common espionage attacks. Preventing observable attacks increases the sense of security without actually increasing the overall security of the system – and sometimes decreases it by diverting resources. In addition, many cybersecurity behaviors are likely to exhibit the optimism bias because of perceived controllability and perceived benefits in a specific context. However, it is possible to design risk assessment protocols that reduce the effects of bias by unpacking and refocusing the risks for each aspect of the system.

It is important for regulators to consider these types of biases when designing regulations and incentives for compliance. By focusing on processes, rather than outcomes, regulators can hold compliance officers accountable while increasing the accuracy of their judgments (Lerner and Tetlock, 1999). It may be useful to encourage a security culture, which establishes common beliefs and values in order to define behavioral norms. Sorensen (2002) identified clear communication and a commitment from senior management as important for fostering a safety culture. The same is likely true for a security culture. An organization with a strong security culture would likely pay more attention to security issues and be less likely to underestimate risks. If there is a common belief that avoiding a cybersecurity incident is paramount, then the utility of taking risks (such as using the USB stick on the isolated computer) will be lower and people will be less likely to engage in those behaviors.

In addition, it’s important to consider how biases and heuristics may shape perceptions of risk in order to design effective communications. If compliance officers are systematically overestimating the risks from terrorism relative to espionage, then it may be useful to design a communication campaign that increases the availability of the risks from espionage. Alternatively, if optimism bias is a bigger driver, then statistics about risks won’t encourage behavior change; it will be important for risk communication to highlight personal rather than overall risk. Ultimately, more research is needed to identify which heuristics and biases are most influential in determining how compliance officers perceive cybersecurity risks. Once identified, communications can be designed to mitigate their effects and improve security behaviors.

4. References

Campbell, J., Greenauer, N., Macaluso, K., & End, C. (2007). Unrealistic optimism in internet events. Computers in Human Behavior, 23(3), 1273–1284.

Richardson, R. (2008). CSI Computer Crime and Security Survey. Computer Security Institute. Retrieved on May 5, 2013, from https://www.hlncc.com/docs/CSIsurvey2008.pdf.

Exec. Order No. 13636, 78 FR 11737-11744 (2013). Improving Critical Infrastructure Cybersecurity. Retrieved on April 15, 2013, from https://federalregister.gov/a/2013-03915.

Harris, P. (1996). Sufficient Grounds For Optimism?: The Relationship Between Perceived Controllability and Optimistic Bias. Journal of Social and Clinical Psychology, 15(1), 9–52.

Kahneman, D., & Tversky, A. (1979). Prospect Theory: An Analysis of Decision under Risk. Econometrica, 47(2), 263–292.

Lerner, J. S., & Tetlock, P. E. (1999). Accounting for the Effects of Accountability. Psychological Bulletin, 125(2), 255–275.

Mandel, D. R. (2005). Are Risk Assessments of a Terrorist Attack Coherent? Journal of Experimental Psychology: Applied, 11(4), 277–288.

McDermott, R. (2012). Emotion and security. Communications of the ACM, 55(2), 35.

National Research Council (NRC). (2012). Terrorism and the Electric Power Delivery System. Washington, DC: The National Academies Press.

North American Electric Reliability Corporation (NERC). (2013a). Order on Interpretation of Reliability Standard, 142 F.E.R.C. ¶ 61,204. Retrieved on April 15, 2013, from http://www.ferc.gov/whats-new/comm-meet/2013/032113/E-11.pdf.

North American Electric Reliability Corporation (NERC). (2013b). Reliability Standards for the Bulk Electric Systems of North America (pp. 113–551). Retrieved on April 15, 2013, from http://www.nerc.com/docs/standards/rs/Reliability_Standards_Complete_Set.pdf.

Pietre-Cambacedes, L., Tritschler, M., & Ericsson, G. N. (2011). Cybersecurity Myths on Power Control Systems: 21 Misconceptions and False Beliefs. IEEE Transactions on Power Delivery, 26(1), 161–172.

Slovic, P., Finucane, M. L., Peters, E., & MacGregor, D. G. (2004). Risk as Analysis and Risk as Feelings: Some Thoughts about Affect, Reason, Risk, and Rationality. Risk Analysis, 24(2), 311–322.

Slovic, P. (1987). Perception of Risk. Science, 236(4799), 280–285.

Sorensen, J. N. (2002). Safety Culture: A Survey of the State-of-the-Art. Reliability Engineering & System Safety, 76(2), 189–204.

Tversky, A., & Koehler, D. J. (1994). Support Theory: A Nonextensional Representation of Subjective Probability. Psychological Review, 101(4), 547–567.

U.S. Energy Information Administration (EIA). (2007). Electric Power Industry Overview 2007. Retrieved April 15, 2013, from http://www.eia.gov/cneaf/electricity/page/prim2/figure7.html.

U.S. Government Accountability Office (GAO). (2012). Cybersecurity: Challenges in Securing the Modernized Electricity Grid. (Publication No. GAO-12-507). Retrieved April 15, 2013, from http://www.gao.gov/products/GAO-12-507T.
