It's Just Not Logical: Project Planning and Your Brain

Introduction to Those Pesky Moderators of Logical Thinking

By Dr. Josh Ramirez and Dr. Shari De Baets

You know all those projects that get planned optimistically? How about the schedule overruns, budget overruns, and underperforming projects that seem to make the news? Projects tend to have some pretty poor stats when it comes to finishing on time and on budget. 

Well, it's not just execution that's to blame. A big part of it is the plan. And it turns out it isn't just optimism bias, per se, that causes optimistic planning. There's a whole host of cognitive (thinking) features in the brain that cause us to plan less than adequately. 

This is an introduction to some of the moderators of rational cognition. You can think of cognitive moderators as those aspects of your brain that prevent you from making purely logical decisions. 

The brain is like a computer - sort of. It takes in inputs, stores the information in memory, processes it, and sends it to outputs. A computer processes in a linear, logical fashion, computing data without bias or any other threat to accuracy. Each step is dry, devoid of emotion, and purely logical. The brain, on the other hand, does not operate quite as rationally.

The processing of information through the brain is shaped by its drive to ensure its own survival, real and perceived. The brain is constantly trying to conserve energy, maintain a state of comfort, avoid situations that seem dangerous (real or perceived), and protect itself from challenges to what it thinks is real. Because survival is a higher priority than anything else, this processing keeps the brain from being wholly rational or logical.

Thus, the brain does not operate via pure rationality. A number of cognitive moderators are at work, distorting objective reality. While we would like to believe that we are rational creatures in an objective world, that is far from the truth. Our brain filters (moderates) all input. Cognitive moderators are ways of thinking that may cause, moderate, or contribute to cognitive biases. They distort our perception, processing of information, and decision-making. They are the lens through which we see and process the world around us. Here are some of the ways your brain moderates, filters, and modifies logic.

(The following is a condensed version of the cognitive moderators from NeuralPlan.)

Time Pressure

Time pressure is very familiar to the project manager, as projects are time-constrained. We all know the dangers of rushing things. Imagine that you are looking for a side street. If you are speeding down the main street at 100 miles an hour, chances are high that you will miss your turn (and get a ticket, but let's not focus on that). When rushed to make decisions, the brain does not have enough time to consider all alternatives, let alone analyze risks, resources, and so on. Time pressure triggers automatic thinking, known to cognitive scientists as System 1 thinking. Because projects are time-constrained, project personnel experience higher degrees of automatic thinking. The resulting increase in System 1 thinking not only causes options to be bypassed, like the side street in our example; it also increases reliance on heuristics (see below), which in turn leads to thinking errors, cognitive biases, reduced creativity, and, last but not least, poorer decision-making in risk and safety.
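
If you like to see ideas in code, here is a minimal sketch of the effect. This is our own toy model with invented numbers, not anything from the research; it simply shows that a shrinking time budget means fewer alternatives examined, and a better chance of missing the best one:

```python
import random

# Toy model (ours, not the authors'): a time budget limits how many
# candidate plans get examined, so the best option is more likely missed.
random.seed(42)
options = [random.random() for _ in range(20)]  # 20 candidate plans, scored 0..1

def best_found(time_budget: int) -> float:
    # Under time pressure we only get through the first `time_budget` options.
    return max(options[:time_budget])

for budget in (3, 10, 20):  # tight, moderate, and ample time
    print(f"considered {budget:2d}/20 -> best score found: {best_found(budget):.2f}")
print(f"true best available: {max(options):.2f}")
```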

Heuristics

Heuristics are the brain's way of referencing information in a split second, without having to think a situation through. A heuristic is a mental rule of thumb, much like a search engine that suggests phrases the moment you start typing in the search bar. The engine is constantly making comparisons as you type in more information, and heuristics do the same thing: your brain continually indexes what it sees and hears against what it thinks it knows, giving you split-second feedback on which to base a decision. The problem is that heuristics lead to more automatic decisions, which can lead to errors. In planning, heuristics can cause us to default to inaccurate information when predicting time and resources. While heuristics, or mental shortcuts, are not bad in themselves and make our lives significantly easier, they carry the inherent danger of leading to cognitive biases.
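
To make the search-engine analogy concrete, here is a minimal sketch. The tasks and estimates are invented for illustration; the point is that a prefix match over past experience is instant, but it can only return what is already indexed, stale or not:

```python
# A minimal sketch of the autocomplete analogy (our illustration; the tasks
# and estimates are invented). The "heuristic" instantly returns whatever
# matches a prefix in memory: fast, but it can only surface what it has
# already indexed, and stale entries come back just as quickly.
memory = {
    "install turbine": "6 weeks",  # from a project years ago, possibly stale
    "install piping": "3 weeks",
    "inspect welds": "4 days",
}

def heuristic_lookup(query: str) -> list[str]:
    """Split-second suggestions: a prefix match against past experience."""
    return [f"{task}: {est}" for task, est in memory.items()
            if task.startswith(query.lower())]

print(heuristic_lookup("ins"))  # instant answers, accurate or not
```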

Cognitive Load

Imagine running your computer all day long, opening more and more programs as you go. You now have Word open, Excel open, your email open, a YouTube video running, and you are editing photos. Meanwhile, your computer is running all the background processes that keep it functioning: updating programs, handling the mouse, monitoring battery power, and so on. Your computer is bogged down and slow because its Random Access Memory (RAM) is almost entirely full. It can no longer run at full capacity, and its performance is compromised.

Cognitive load in the brain works the same way. The more information you put into it throughout the day, the lower and slower the performance. When making project predictions, such as during planning, cognitive load slows and degrades your decisions. And whether we want to believe it or not, no amount of coffee or motivation is going to change that (sorry, project estimators who pull planning sessions into the wee hours of the night; consider this your excuse to go get some sleep).
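
For the script-inclined, the RAM analogy fits in a few lines. This is our own toy simulation with made-up numbers, not a validated model of working memory:

```python
# Toy simulation (ours; the numbers are invented): every open "task" consumes
# working memory, and performance on the focal planning task degrades as the
# total load grows. This is the RAM analogy, nothing more.
def performance(open_tasks: tuple, capacity: float = 100.0) -> float:
    load = sum(open_tasks)            # memory consumed by everything open
    free = max(capacity - load, 0.0)  # what's left for the planning task
    return free / capacity            # 1.0 = full capacity, 0.0 = swamped

print(performance((10,)))             # one light task open: 0.90
print(performance((10, 25, 30, 30)))  # a full day of juggling: 0.05
```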

Decision Fatigue

Like cognitive load, decision fatigue occurs when the computer between your ears (your brain) loses energy through the decisions it makes. Where cognitive load represents the memory being used up throughout the day, decision fatigue is like the computer spending that memory on actions. Each decision, large or small, adds up over the day. Every decision burns calories and uses oxygen (about 20% of your body's oxygen is consumed by your brain). As the day goes on, your energy for decisions decreases, much as your arms or legs tire when you use your muscles. And just like with a muscle, even small actions have a cumulative effect in draining energy.

Decision fatigue increases automatic (System 1) thinking. As you now know, that means greater reliance on heuristics and, consequently, a higher chance of cognitive biases, which will degrade your decision-making quality. It is therefore not the best idea to start planning your project at the end of a meeting-filled day. If you want a reliable plan that doesn't end in overruns, unnecessary costs, and missed opportunities, you should probably clear your schedule as much as possible on planning days.
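
Here is the same idea as a rough sketch in code. It is our illustration, with invented depletion rates, not a measured model of the brain's energy budget:

```python
# Decision fatigue as a depleting budget (ours; the rates are invented):
# every decision, large or small, draws down the same reserve, so a planning
# session after a meeting-filled day starts nearly empty.
def energy_after(decisions: int, start: float = 100.0, cost: float = 0.8) -> float:
    return max(start - cost * decisions, 0.0)

print(energy_after(10))   # morning session: 92.0, plenty left for planning
print(energy_after(110))  # after a day of back-to-back meetings: 12.0
```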

Cognitive Dissonance

This term describes the mental discomfort experienced when someone holds two or more contradictory beliefs, ideas, or values simultaneously and suffers psychological stress because of it. When two actions or ideas are not mentally consistent with each other, people try to revise them until they are consistent with each other and with their pre-existing knowledge and opinions.

To illustrate dissonance, imagine this scenario: you made a plan for a project and handed it off to the project manager to deliver. After the project was delivered, you learn that major errors in the plan caused it to finish behind schedule. Because you believed you were good at planning, you now hold two conflicting pieces of information that create cognitive dissonance:

  1. You are a good planner, and 
  2. You made a significant judgment error in your planning

When people experience dissonance, they decide how to resolve it, which results in one of the following actions:

  1. Accept the new information (not a common response)
  2. Reject the new information
  3. Discredit the new information
  4. Minimize the new information

This mental discomfort drives people toward decisions that may not be purely logical or rational, because people often choose the option that reduces discomfort over the option that is correct. Cognitive dissonance is an underlying contributor to many cognitive biases, and it causes people to avoid discussing risk (among other things) in project planning sessions.

Optimism bias is an example of avoiding mental discomfort: a person holds an unrealistically positive view of the future. Optimism suppresses negative views of the future, and with them the associated mental discomfort. Deliberate ignorance follows a similar theme: avoidance of information that causes cognitive dissonance.
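
One countermeasure can be sketched in a few lines of code. This is our illustration of the general idea of correcting an optimistic inside-view estimate with the overruns of similar past projects; it is not a method prescribed in this article, and the figures are invented:

```python
# Hedged countermeasure (our example, invented figures): uplift an optimistic
# estimate by the average overrun observed on comparable past projects,
# rather than trusting the inside view alone.
past_overruns = [1.15, 1.40, 1.25, 1.60, 1.30]    # actual/planned ratios
uplift = sum(past_overruns) / len(past_overruns)  # mean overrun = 1.34

optimistic_weeks = 20
adjusted_weeks = optimistic_weeks * uplift
print(f"optimistic: {optimistic_weeks} wk -> adjusted: {adjusted_weeks:.1f} wk")  # 26.8 wk
```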

Social Pressure

This phenomenon causes many human decision errors in planning. Social pressure is the pressure we experience from others to make decisions that match their will or desires. The pressure can be real or perceived, and it is grounded in the social expectations of the culture, the organization, or a small group within the organization. It can also arise in temporary groups, such as a business meeting. Pressure from other people often leads us to decisions that are not completely logical.

For example, in a safety or planning meeting, a subject matter expert (SME) may raise a risk that the group finds uncomfortable to discuss. Sensing that shared discomfort, the SME decides not to push the issue, and the risk drops out of the discussion. But not discussing the risk did not make it go away; it just kept it from being mitigated. In this case, social pressure increased the risk to the project by suppressing the logical decision. Social pressure is also associated with strategic misrepresentation, a common cause of optimistic project planning.

Inertia

The inertia phenomenon describes the brain's tendency to maintain a stable state: inaction, or persistence in a given direction. Consider a car in motion. Once the car is moving in a particular direction, inertia keeps it going, and any steering to the left or right meets resistance. The brain operates similarly. Once people start moving in a certain direction in their decisions, actions, or mental state, any change of direction introduces friction and discomfort.

The brain resists this change because dealing with the friction of changing inertia takes more mental energy. Inertia is associated with status quo bias and can be one of the causes of resistance to change. It can also be harnessed to improve decision-making: by setting defaults (as in choice architecture, or nudge theory), we place the right decision in the path of movement, whether physical or mental. Believe it or not, we can capitalize on inertia and use it to our advantage, redesigning planning processes so that they produce better predictions.
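
Here is what a default in the path of movement can look like in a planning tool. This is our hypothetical example; the 15% buffer is an assumed default for illustration, not a recommendation:

```python
# A minimal sketch of choice architecture (our hypothetical example): doing
# nothing yields a buffered plan, and opting out requires an active choice,
# so inertia works for the plan instead of against it.
def schedule_with_buffer(task_weeks: float, buffer_pct: float = 0.15) -> float:
    """Contingency buffer applied by default; overriding takes deliberate effort."""
    return task_weeks * (1 + buffer_pct)

print(schedule_with_buffer(10))                # default path: 11.5 weeks
print(schedule_with_buffer(10, buffer_pct=0))  # the override must be explicit
```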

Psychological Safety

One of the most basic moderators of cognition, and probably the best known, is the brain's response to threat; most of us have heard of "fight, flight, or freeze." Humans, like other mammals, constantly scan the environment for threats. Before we lived in towns with a relative degree of safety, humans were far more exposed to the elements, predators, and other dangers. In a dangerous situation, the brain is on high alert. Faced with an immediate threat, we respond by fighting it, fleeing it, or in some cases freezing and not responding (a natural reaction if one does not want to be seen by a predator). Above all, the brain is trying to survive in every situation. And just because we now live in safer environments does not mean the brain has switched off its threat detection. It is simply looking for subtler threats: in the office, in a conversation with the boss, or in a project team meeting.

Psychological safety can generally be defined as the belief that one is safe to take interpersonal risks in the organization. If a team, or a whole organization, does not feel safe, performance, learning, innovation, and risk identification all suffer, among many other things. A lack of psychological safety erodes confidence; people fear being rejected, punished, embarrassed, or socially ostracized. It also significantly reduces the level of trust in the organization. Like social pressure, a lack of psychological safety can cause strategic misrepresentation: as trust decreases, individuals may not feel safe communicating the realities of duration or cost estimates. As most people know, trust plays a big part in a high-performing culture, and a lack of psychological safety is a direct contributor to decreased trust in an organization.

Conclusion

Everything starts with the brain. Everything goes through the brain. Your brain filters all information through all of these moderators simultaneously, regardless of how logical you think you are being. And, of course, our ego tells us that we are the exception to the rule, more logical and rational than others. The simple fact is that we are all human, and all of these cognitive moderators apply to all of us. The only way to improve decisions and reduce thinking errors is to become aware of how these moderators shape our thinking and take steps to counter them.

We get more reliable results when we understand human cognition and how it affects planning and forecasting accuracy, and then design our processes around it. And who doesn't want more projects finished on time and on budget, with happier customers?


About the Authors

Dr. Josh Ramirez and Dr. Shari De Baets are cognitive scientists who study prediction in planning and forecasting at the Institute for Neuro & Behavioral Project Management. Their recent work includes completion of the NeuralPlan credential course, which can be found at www.neural-plan.com.

