
What’s your Energy Pet Peeve?

May 13, 2013

This was a question posed during the NIEES workshop I attended in April. For some reason, that question really stuck with me. At the time, I said something about the split incentive problem because I am increasingly grossed out by the insulation problems in my house (if you stand near some of the windows while it’s raining really hard, you get wet – forget air leakage, there is water leakage).

However, as I thought more about it, I wish I had said something about anti-science environmentalism instead. I’m referring to the type of environmentalism that ignores the real complexity that exists around sustainability and energy issues. For example, carpooling does not reduce carbon emissions in every context. Sure, carpooling is better than everyone driving separate cars, but it’s not better than taking the bus (which would be operating anyway). Carpooling is also not the more environmentally friendly option if you convince a friend to go out of their way to pick you up and drive you to an event that they would have otherwise not attended – in this case, it really would have been better to just drive yourself. I feel like a lot of self-proclaimed environmentalists have a tendency to label a certain action as GREEN across the board without applying that critical carbon footprint/life cycle analysis (LCA) lens.

As someone who has (briefly) studied LCA and tried to deal with these types of complexities, I find this incredibly irritating. It’s not ok to complain about our lack of investment in renewable energy when you are running air-conditioning in coal country and are staunchly anti-nuclear. What if environmental regulations are bad for the economy? We have to deal with that. What if these regulations are just moving our pollution to other countries? That’s not ok either. The point is – it’s complicated – and you can’t avoid that by following 10 Tips for Living a Green Life. You can’t get mad that low-income women use disposable diapers because they don’t have the time or resources to do laundry that often. Maybe instead of using green cleaning products, you should just clean less (this is why I’m a bad roommate, but my point stands).

So rather than becoming bitter, I’ve been trying to think about how I can get involved and make a difference. I’ve become increasingly interested in regulation and the ethical issues related to it as I more seriously consider a career in government. Some ideas that have crossed my mind:

I recently went on a tour of downtown Pittsburgh led by the Pittsburgh History and Landmarks Foundation. I was mostly shocked by how many of the buildings are empty. That’s one of the things I like about Pittsburgh: it has so much potential – it’s a hopeful place. There’s a real desire to make changes here (although maybe I’m just optimistic).

I want to do some sort of public outreach through art. I’ll have to think about it.


Learning to Take Care of Plants

May 12, 2013

Fairy Garden

I went to the May Market and one booth was selling fairy gardens (miniature perennials)! Obviously, I was smitten.

I ended up purchasing three plants that do well in the shade to create a little fairy world on my office windowsill. They include:

I hope I can keep them alive!

Canning Season is Coming!

May 12, 2013

[Image: grapefruit in ginger mint syrup]

In April, I got some grapefruits from my grandparents while visiting them in AZ for NIEES. So far, I have made grapefruit in ginger mint syrup (shown above) and grapefruit jam. Unfortunately, supreming grapefruit is extremely difficult (so I got Caitlin to do it for me) and I am still not great at jam … so I’m not optimistic. We will be testing out the jam tomorrow. I have a feeling it will overset and become more candy than jam because I may have cooked it too long. Clearly, I need to invest in a candy thermometer. I’m still planning to make a citrus marmalade … so maybe that will go better?

In the meantime, I am going through my cookbooks and highlighting recipes to try this year. My goal for this summer is to figure out how to make jam.

To Make:

  • Raspberry Jam
  • Strawberry Jam
  • Honey Lemon Apple Jam
  • Pickled Brussels Sprouts
  • Marinara Sauce
  • Sweet Pumpkin Pickles
  • Pepper Jelly
  • Watermelon Jelly

In addition to old favorites like brandied peaches, maple bourbon pickles, nectarine-lime jam, dilly beans, salsa, and spicy pickled squash.

Reducing Cybersecurity Risks for the Electric Power Grid: Applications for Decision Science

May 8, 2013

This is derived from an assignment for my Analysis of Uncertain Social Systems course.

1. Introduction

In recent years, concerns have grown about the cybersecurity risks to our electrical grid infrastructure. In a recent Executive Order (2013), President Obama states that “the cyber threat to critical infrastructure continues to grow and represents one of the most serious national security challenges we must confront.” An attack or incident affecting the electrical grid is of particular concern because electricity is so tightly interconnected with other basic services such as healthcare, transportation, and financial infrastructure. Although there is security technology in place, some elements of security procedures involve daily human decision-making to establish priorities and respond to incidents. Findings from the judgment and decision-making literature can be applied to improve these daily decisions and increase the effectiveness of cybersecurity efforts.

1.1. Cybersecurity Risks and Regulation

Cybersecurity includes all of the technologies, processes, and practices designed to protect networks, computers, and data from attack, damage, or unauthorized access. In the context of the electrical grid, vulnerabilities extend beyond computers and servers to include all network-connected sensors and control devices. Common cyber exploits include denial-of-service (DoS) attacks, phishing, viruses, Trojan horses, worms, and SQL injection, among others (GAO, 2012). These attacks can lead to physical impacts, such as long-lasting blackouts and damaged infrastructure, as well as psychological impacts, such as loss of faith in the government (NRC, 2012).

The North American Electric Reliability Corporation (NERC) is a non-profit entity certified by the Federal Energy Regulatory Commission (FERC) to develop and enforce reliability standards (EIA, 2007). These standards detail rules for general operations as well as for protection in the event of natural and man-made (e.g., accidents, cyber-attacks, bombings) disasters (NERC, 2013b). NERC has divided North America into eight regions, which ensure compliance at the regional level by sending auditors to utilities.

One problem with this compliance-based regulatory system is that many of the rules are ambiguous. For example, a key element of compliance is identifying “critical assets” and prioritizing their protection. However, even the authorities disagree about the exact definition. In a recent order (NERC, 2013a), FERC remanded NERC’s interpretation of “essential to the operation of the Critical Asset”. NERC determined that “a Cyber Asset that ‘may’ be used, but is not ‘required’ (i.e., without which a Critical Asset cannot function as intended), for the operation of a Critical Asset is not ‘essential to the operation of the Critical Asset’ for purposes of Requirement R3.” However, FERC found that “the proposed interpretation of ‘essential’ may leave certain cyber assets lacking the required CIP Reliability Standards protection that could, if compromised, affect the operation of associated Critical Assets”. These types of fundamental ambiguities make enforcement of the reliability standards difficult and can divert resources from actual security efforts. Given this uncertainty, security officers and auditors must decide on a daily basis what does or does not constitute a security risk.

1.2. Applications for Decision Science

Given the uncertain nature of cyber threats and the importance of decision-making in prioritizing resources and ensuring compliance with security procedures, research in judgment and decision-making can be used to improve assessments and personnel training. Security officers at utilities may hold many misconceptions about cybersecurity risks (Pietre-Cambacedes et al., 2011). As a result, they may misallocate resources and underestimate threats. By identifying the specific biases and heuristics at play, we can begin to identify potential interventions and solutions to increase the effectiveness of cybersecurity. This paper outlines three specific concepts from decision science that could be used to improve cybersecurity efforts: the affect heuristic, optimism bias, and the additivity of risk.

2. Using Decision Science to Improve Cybersecurity Risk Assessment

2.1 Affect Heuristic

People understand risk in two ways: through the “analytic system” (risk as analysis – logical, rational, effortful) and the “experiential system” (risk as feelings – intuitive, fast, automatic). These systems operate in parallel, and risk assessment should include inputs from both. Affect is an important aspect of risk perception that influences probability judgments – for example, more is spent on deterring terrorist attacks than on other, more prevalent risks because terrorism is perceived as worse (Slovic et al., 2004). Consequently, for cybersecurity, compliance officers are likely to spend more time protecting their systems from the worst attacks – those that would disrupt operations. While terrorists might be interested in causing a blackout, others, such as environmental “hacktivists”, might simply be interested in discovering illegal dumping practices. Given the high uncertainty regarding the nature of cybersecurity threats, compliance officers tend to make decisions based on a desire to feel secure rather than on an objective measure of security (McDermott, 2012). As a result, they focus on reducing the risks of terrorist attacks, which may involve a clear indication that an event has occurred, rather than working to prevent espionage, which is harder to detect. Ultimately, this means that money and effort are diverted from the bigger risk.

Viruses are the most frequent cyberattack vector, and they are largely preventable via software patching (CSI, 2008). Although this type of attack is unlikely to disrupt operations, it can slow down systems, waste computing resources, and expose the company to espionage. Rationally, then, compliance officers should prioritize low-cost, regular preventative maintenance such as applying software patches. However, in an industrial control system, it can be difficult to upgrade the operating systems and software that control discontinued equipment. If the commercial software is no longer supported, it may not run on a newer operating system. In that case, the alternative is to employ strict procedures (e.g., no USB sticks) to ensure that the machine remains isolated. When preventative maintenance is difficult and the consequences seem minor, it is easier to justify inaction. This fosters misconceptions that involve a denial of reality and rely on reductive views of security. However, this is an inefficient, affective response to the problem and does not lead to an optimal allocation of resources.

2.2 Optimism Bias

It is likely that compliance officers are affected by optimism bias when assessing cybersecurity risks. Since high uncertainty makes the risks difficult to quantify, qualitative judgment is used to allocate resources. Unfortunately, there is evidence that optimism bias increases for negative events with high perceived controllability (Harris, 1996). People, especially heavy internet users, also tend to be more optimistic about internet events in general; this optimism was correlated with controllability, desirability, and personal experience (Campbell et al., 2007). As a result, optimism bias is likely to increase as more security measures are put into place, because additional measures tend to increase the perceived controllability of the risk. This contributes to many of the misconceptions described above.

The tendency to engage in practices that exhibit optimism bias can be attributed in part to the fact that people tend to perceive an inverse relationship between risks and benefits (Slovic, 1987). For example, in the isolated control system running an old operating system described above, an employee might perceive a benefit from an improved workflow if they use a USB stick to transfer files to that machine. Despite the known risks of using USB sticks, this context would make the risks seem lower as a consequence of the high perceived benefits. The employee would likely be confident that the USB stick is clean, even without evidence to support that belief. Given that these types of decisions are made at the employee level, it is difficult to ascertain whether all security procedures are being followed. Compliance officers could use methods such as drills and regular training to evaluate compliance, but there would still be some uncertainty associated with that estimate.

2.3 Additivity of Risk

It is well known that unpacking a risk, or listing all of the possible scenarios within a risk category, leads to a higher estimate of risk than simply presenting the category. This has been observed among both experts and non-experts (Tversky and Koehler, 1994). Risk assessments for terrorism are particularly sensitive to this phenomenon because terrorism incites fear and publicity, which can bias predictions. Mandel (2005) found that unpacking leads to subadditive results (the judged probability of a terrorist attack is less than the judged probability of an attack by al-Qaeda plus the judged probability of an attack by a non-al-Qaeda group) and that refocusing leads to superadditive results (the judged probability of an attack is greater than 1 minus the judged probability of no attack).

However, risk assessments can be designed to systematically elicit probabilities in a way that reduces bias. Evidence suggests that the refocusing effect is reduced when transparency is high and the complementary task is made salient (Mandel, 2005). Compliance officers should consider all of the possible scenarios when evaluating the risk to which a particular component is exposed; this reduces the likelihood of underestimating the risk. In addition, refocusing can help ensure that the risk is not overestimated. Together, the two effects can be used to bound the probability of attack. This exercise will help compliance officers more accurately estimate which aspects of the system are exposed to more risk and therefore need more security resources.
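To make the bounding concrete, here is a rough sketch of how the two elicitations could be combined (the notation and the particular scenario breakdown are my own illustration, not taken from Mandel’s study). Write P( ) for a compliance officer’s judged probability, and unpack the event “this component is attacked” into mutually exclusive scenarios such as a terrorist attack, a hacktivist attack, and espionage:

P(attack) ≤ P(terrorist attack) + P(hacktivist attack) + P(espionage attack)   [unpacking: the summed estimate is the high anchor]

1 − P(no attack) ≤ P(attack)   [refocusing: the complement-based estimate is the low anchor]

Eliciting both the unpacked sum and the refocused complement brackets the direct judgment, and the width of that bracket could serve as a rough signal of how much confidence to place in the qualitative assessment.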

3. Conclusion

There are many biases and heuristics that affect the assessment of cybersecurity risk. This paper highlighted only three: the affect heuristic, optimism bias, and the additivity of risk. As a consequence of the affect heuristic, compliance officers tend to focus on preventing low-probability cyberattacks that would interrupt operations rather than on more common espionage attacks. Preventing observable attacks increases the sense of security without actually increasing the overall security of the system, and can even decrease it by diverting resources. In addition, many cybersecurity behaviors are likely to exhibit optimism bias because of the perceived controllability and benefits in a specific context. However, it is possible to design risk assessment protocols that reduce the effects of these biases by unpacking and refocusing the risks for each aspect of the system.

It is important for regulators to consider these types of biases when designing regulations and incentives for compliance. By focusing on processes rather than outcomes, regulators can hold compliance officers accountable while increasing the accuracy of their judgments (Lerner and Tetlock, 2004). It may also be useful to encourage a security culture, which establishes common beliefs and values in order to define behavioral norms. Sorensen (2002) identified clear communication and commitment from senior management as important for fostering a safety culture; the same is likely true for a security culture. An organization with a strong security culture would likely pay more attention to security issues and be less likely to underestimate risks. If there is a common belief that avoiding a cybersecurity incident matters most, then the utility of taking risks (such as using the USB stick on the isolated computer) will be lower and people will be less likely to engage in those behaviors.

In addition, it’s important to consider how biases and heuristics may shape perceptions of risk in order to design effective communications. If compliance officers are systematically overestimating the risks from terrorism relative to espionage, then it may be useful to design a communication campaign that increases the availability of the risks from espionage. Alternatively, if optimism bias is the bigger driver, then statistics about risks won’t encourage behavior change; instead, risk communication should highlight personal rather than overall risk. Ultimately, more research is needed to identify which heuristics and biases are most influential in determining how compliance officers perceive cybersecurity risks. Once they are identified, communications can be designed to mitigate their effects and improve security behaviors.

4. References

Campbell, J., Greenauer, N., Macaluso, K., & End, C. (2007). Unrealistic optimism in internet events. Computers in Human Behavior, 23(3), 1273–1284.

Richardson, R. (2008). CSI Computer Crime and Security Survey. Computer Security Institute. Retrieved on May 5, 2013, from https://www.hlncc.com/docs/CSIsurvey2008.pdf.

Exec. Order No. 13636, 78 FR 11737-11744 (2013). Improving Critical Infrastructure Cybersecurity. Retrieved on April 15, 2013, from https://federalregister.gov/a/2013-03915.

Harris, P. (1996). Sufficient Grounds For Optimism?: The Relationship Between Perceived Controllability and Optimistic Bias. Journal of Social and Clinical Psychology, 15(1), 9–52.

Kahneman, D., & Tversky, A. (1979). Prospect Theory: An Analysis of Decision under Risk. Econometrica, 47(2), 263–292.

Lerner, J. S., & Tetlock, P. E. (2004). Accounting for the Effects of Accountability. Psychological Bulletin, 125(2), 255–275.

Mandel, D. R. (2005). Are Risk Assessments of a Terrorist Attack Coherent? Journal of Experimental Psychology: Applied, 11(4), 277–288.

McDermott, R. (2012). Emotion and security. Communications of the ACM, 55(2), 35.

National Research Council (NRC). (2012). Terrorism and the Electric Power Delivery System. Washington, DC: The National Academies Press.

North American Electric Reliability Corporation (NERC). (2013a). Order on Interpretation of Reliability Standard, 142 F.E.R.C. ¶ 61,204. Retrieved on April 15, 2013, from http://www.ferc.gov/whats-new/comm-meet/2013/032113/E-11.pdf.

North American Electric Reliability Corporation (NERC). (2013b). Reliability Standards for the Bulk Electric Systems of North America (pp. 113–551). Retrieved on April 15, 2013, from http://www.nerc.com/docs/standards/rs/Reliability_Standards_Complete_Set.pdf.

Pietre-Cambacedes, L., Tritschler, M., & Ericsson, G. N. (2011). Cybersecurity Myths on Power Control Systems: 21 Misconceptions and False Beliefs. IEEE Transactions on Power Delivery, 26(1), 161–172.

Slovic, P., Finucane, M. L., Peters, E., & MacGregor, D. G. (2004). Risk as Analysis and Risk as Feelings: Some Thoughts about Affect, Reason, Risk, and Rationality. Risk Analysis, 24(2), 311–322.

Slovic, P. (1987). Perception of Risk. Science, 236(4799), 280–285.

Sorensen, J. N. (2002). Safety Culture: A Survey of the State-of-the-Art. Reliability Engineering & System Safety, 76(2), 189–204.

Tversky, A., & Koehler, D. J. (1994). Support Theory: A Nonextensional Representation of Subjective Probability. Psychological Review, 101(4), 547–567.

U.S. Energy Information Administration (EIA). (2007). Electric Power Industry Overview 2007. Retrieved April 15, 2013, from http://www.eia.gov/cneaf/electricity/page/prim2/figure7.html.

U.S. Government Accountability Office (GAO). (2012). Cybersecurity: Challenges in Securing the Modernized Electricity Grid. (Publication No. GAO-12-507). Retrieved April 15, 2013, from http://www.gao.gov/products/GAO-12-507T.

Incentives Are Not Enough For Cybersecurity

May 8, 2013

This is derived from an assignment for my Cybersecurity in Critical Infrastructure Protection course.

Regulation is critical for ensuring that our critical infrastructure is protected from cyber threats. In a market-based system, security is an externality that private companies have no obligation to pay for unless forced to by regulation. Although some companies may be compelled by patriotic duty to invest in cybersecurity, this is not a reliable mechanism for ensuring that we meet our national security goals. Even with potential economic losses from cybersecurity breaches, there is no way to ensure compliance across the board with a voluntary framework; there will always be a company trying to undercut its competitors by avoiding security costs. Given the interconnected nature of critical infrastructure (e.g., the financial, energy, and water sectors), a breach at one company could have broader physical and psychological impacts. It is in both the private and the national interest to institute regulations rather than a voluntary framework. This would ensure that national security goals are met and that no company is put at risk by another company’s lack of investment. The loss of critical infrastructure services, even temporarily, could have ripple effects through the economy, which ultimately hurts all businesses, regardless of their attention to cybersecurity issues.

At the very least, regulators need to institute strong incentives to encourage adoption of a voluntary framework. Successful incentives should have both financial and psychological elements. Financial elements could include tax breaks for companies that achieve specific levels of compliance or the ability to write off security expenditures. These types of incentives aim to reduce the costs of increasing security so that companies are not disincentivized from investing. Fining companies that fail to invest in security can create a perverse incentive if companies are not always caught and the fines are smaller than the cost of investment (Farahmand et al., 2013). Psychological elements could include, for example, publishing a list of companies that achieve compliance with the voluntary framework to create a norm for security practices. However, this may raise security concerns, since companies that do not appear on the list would be known to be more vulnerable. In addition, companies that participate in the voluntary framework could be eligible for specific privileges – for example, a dinner with the President. This would emphasize that cybersecurity is a national issue and reward companies that invest in a social, rather than financial, way.

Beyond incentives, the accountability literature indicates that it is important to focus on processes rather than outcomes, because this increases the accuracy of judgments and decreases commitment to potentially poor decisions (Lerner and Tetlock, 2004). Companies should therefore meet compliance by having good processes, not by being lucky enough to escape an attack. That said, methods like accountability only work when performance can be improved by paying more attention. If there is not enough information, or if more training is needed to improve decisions, accountability does not improve decision-making (Lerner and Tetlock, 2004). Still, there are many aspects of cybersecurity that could benefit from simply an increase in attention.

References

Farahmand, F., Atallah, M. M. J., & Spafford, E. H. (2013). Incentive Alignment and Risk Perception: An Information Security Application. IEEE Transactions on Engineering Management, 60(2), 238–246.

Lerner, J. S., & Tetlock, P. E. (2004). Accounting for the Effects of Accountability. Psychological Bulletin, 125(2), 255–275.

Graduate Coursework Review

May 8, 2013

Now that I have officially finished all of my required graduate coursework (celebrate!), I wanted to take a moment to reflect on what I did right and where I went wrong. Due to the nature of my interdisciplinary degree, it’s no surprise that I overloaded on social science courses:

  • Applied Microeconomics
  • Seminar in Electric Power (covering the history of regulation in the electricity industry)
  • Risk Perception and Communication
  • Analysis of Uncertain Social Systems
  • Communication Design and Analysis

I’m considering rounding out my focus on communication design with a graphic design course. I really enjoyed all of these courses (except economics … too many proofs). Each one ended up being a lot more interesting and useful than I had expected.

I also did my fair share of science and engineering coursework:

  • Experimental Design for Behavioral and Social Sciences
  • Engineering and Economics of Electric Energy Systems
  • Intermediate Statistics
  • Optimization
  • Introduction to Econometric Theory
  • Cybersecurity in Critical Infrastructure Protection

Here is where I benefit from hindsight. I ended up doing my statistics coursework in completely the wrong order. I learned very little in my Experimental Design course, which suffered from too-detailed directions in the labs. Ultimately, I should have taken statistics in the policy school rather than the statistics department – I barely passed Intermediate Statistics, although I learned a lot about the fundamental arguments in statistics (e.g., frequentist vs. Bayesian, confidence intervals vs. hypothesis testing). Intermediate Statistics was interesting, even though it was too difficult for me, which is really a testament to the professor. I tacked on the cybersecurity course at the end of this semester in preparation for a potential cybersecurity project that I will be involved with.

The sad part is that I went around to a bunch of older students during registration for my first semester to get advice on what courses to take. Everyone’s advice conflicted, of course, so I took some of it and not the rest. Now I look back and finally understand whose advice I should have taken and didn’t. Alas, such is life.

Last Day of NIEES

April 12, 2013

I had a wonderful time at NIEES! I will write some more reflective posts during my flight home tomorrow.

In the meantime, a recent article about the ethics of geoengineering and a video about how to communicate with the public (from a workshop on Climate Change and Infrastructure).