This Guest paper, from which the following abstract has been drawn, was submitted for publication in November 2019; Part 2 is published here in March 2021.
It is copyright to Dr. Philip Crosby, CSIRO Astronomy and Space Science.

Author's Comment

Last month in Part 1 of this paper I observed that:

"This study presumes that project success, as defined above, is not indeterminate by nature, and that undertaking certain activities, coupled with application of particular policies and launch conditions at the front end, positions a project for success. I followed that with Study approach, methodology and findings."

This month in Part 2, I shall complete the findings, together with Conclusions, Discussion and a brief Summary.

Risk and Contingency

This is Launch Conditioning #4

High-tech projects carry inherent risk as a consequence of their raison d'être, and higher risk must be matched by proportionally higher contingency.[75] A standard method of dealing with project execution risk is a register-type tool that uses a rating system to score each anticipated risk on the likelihood and consequence of the defined event occurring. Management decisions are then taken to accept, mitigate, or remove those risks. This approach is reasonably effective at project start-up for the known knowns, but takes little account of the unknown unknowns, i.e., events, circumstances or results that are invisible to the project.
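As a simple illustration of the register logic just described, the sketch below scores hypothetical risks by likelihood and consequence and maps each score to a treatment decision. The scales, thresholds and example risks are illustrative assumptions only, not items drawn from the study.

    # Illustrative sketch only: a minimal register-type tool that scores each
    # risk as likelihood x consequence and maps the score to a treatment.
    # Scales, thresholds and example risks are hypothetical.
    from dataclasses import dataclass

    @dataclass
    class Risk:
        description: str
        likelihood: int   # 1 (rare) .. 5 (almost certain)
        consequence: int  # 1 (negligible) .. 5 (catastrophic)

        @property
        def score(self) -> int:
            return self.likelihood * self.consequence

        @property
        def treatment(self) -> str:
            # Example decision bands; real projects set their own
            if self.score >= 15:
                return "remove/transfer"
            if self.score >= 8:
                return "mitigate"
            return "accept"

    register = [
        Risk("Key supplier fails acceptance test", likelihood=3, consequence=4),
        Risk("Exchange rate moves against budget", likelihood=4, consequence=3),
        Risk("Site power supply interrupted", likelihood=2, consequence=5),
    ]

    for r in sorted(register, key=lambda r: r.score, reverse=True):
        print(f"{r.score:>2}  {r.treatment:<16} {r.description}")

Sorting by score simply surfaces the highest-rated risks for the accept, mitigate, or remove decision described above.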

The notion of our inability, or reluctance, to plan for unexpected massive negative (or positive) impacts is most fully developed by Taleb, who classifies such events as "Black Swans".[76] Taleb forcefully argues that the human tendency to scale and smooth predictions based on assumed or unfounded (i.e., guessed) data is tainted by a process known as "anchoring", and guarantees erroneous forecasts.

My observations within the high-tech mega-project environment confirmed these tendencies: highly rational scientists and engineers deal expertly with the possible conditions of a project's physical systems, but are notably less proficient when dealing with programmatic and social matters.

As Taleb puts it: "We simply assume that individuals will be rational in the future and thus act predictably".[77] Green adds the crucial point that project planners should not be fooled by the "statistically insignificant" frequency of Black Swans, but instead should pay close attention to the potential catastrophic consequences.[78]

The conventional approach is therefore problematic because of the perception that steps have been put in place to deal with all unexpected events,[79] whereas it actually relies heavily on experiential hindsight as a risk predictor. Geraldi et al. contend that in projects "it is not a question of if but when unexpected events will emerge".[80] These may include transactional issues such as exchange rate fluctuations, market changes, etc.[81]

On average, projects encounter five unexpected events in their formative stages, and some confront as many as twelve.[82] The interruption of the ALMA telescope's gas energy supply is one example, where an early assumption that appeared fully reliable was unexpectedly revoked.[83]

If the substantive risk to the project is unknown, how might we deal with it at the project formative stage? This study suggests a dual response: applied contingency, and threat readiness.

"Project contingency" (rather than specific threat contingency) includes those external factors or events that cannot yet be pinpointed but will seriously jeopardize the project when they materialize. However quantifying contingency is non-trivial. The PMBOK mentions reserves and contingency but not how they are computed or applied within the project.[84]

Nicholas offers a calculator, as well as suggesting an overrun allowance in some circumstances.[85] NASA has developed the "Joint Confidence Level — Probabilistic Calculator" (JCL-PC), founded on the hypothesis that a project's early phases hold many unknown risks.[86] Many contemporary high-tech project reports recommend early budgeting for (cash) reserves of around 20%-25%.[87], [88]
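To make the probabilistic idea concrete, the sketch below simulates correlated cost and schedule outcomes and counts how often a candidate 20% cost reserve and 15% schedule margin would jointly cover the result. The point estimates, distributions and correlation are illustrative assumptions only; this is not NASA's JCL-PC model itself.

    # Illustrative Monte Carlo sketch of the joint-confidence idea: simulate
    # correlated cost and schedule outcomes, then count the trials that a
    # candidate budget reserve and schedule margin would jointly cover.
    # All figures and distributions are assumptions, not NASA's JCL-PC model.
    import random

    BASE_COST = 100.0      # $M point estimate (hypothetical)
    BASE_SCHEDULE = 48.0   # months point estimate (hypothetical)
    RESERVE = 0.20         # candidate 20% cost reserve (cf. the 20%-25% guidance)
    MARGIN = 0.15          # candidate 15% schedule margin (hypothetical)
    TRIALS = 100_000

    budget = BASE_COST * (1 + RESERVE)
    deadline = BASE_SCHEDULE * (1 + MARGIN)

    covered = 0
    for _ in range(TRIALS):
        growth = random.lognormvariate(0.0, 0.20)        # shared uncertainty driver
        cost = BASE_COST * growth
        schedule = BASE_SCHEDULE * (0.6 + 0.4 * growth)  # schedule partly tracks cost
        if cost <= budget and schedule <= deadline:      # joint success criterion
            covered += 1

    print(f"Joint confidence of {RESERVE:.0%} reserve and {MARGIN:.0%} margin: "
          f"{covered / TRIALS:.0%}")

With these particular assumptions the sketch reports a joint confidence of roughly 80%; the point is not the number itself, but that reserve and margin levels can be tested against a distribution rather than guessed.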

Since the unknown cannot be planned for in detail, an alternative method is to plan for everything (the Napoleon approach), expecting that something will go wrong and that a solution will be needed as each challenge emerges. Referring to the aggressive, revolutionary high-tech F117 Stealth Fighter program, Nicholas writes: "Expecting the unexpected is often better preparation for coping with risk than preparing extensive plans and believing that the unexpected has been eliminated."[89]

General managerial alertness is clearly required to scan broadly for potential threats. Smith (2007) describes "uncertainty spotting" skills: the early seeking out and challenging of threats and assumptions.[90] Being watchful, and informed by timely and accurate trend-type data, is indicated as a key strategy for building response capability.

Coupled with this are task force response teams (aka "tiger" teams), which are shown to operate effectively to contain and direct events.[91] The strength of task forces lies in their combined expertise, detachment from the project, and freedom from project bureaucracy.[92] Power is concentrated through limiting numbers and very careful participant selection.[93]

However, task forces take time to establish and become effective, and this paper posits that one or more task force panels might be anticipated and assembled virtually during project start-up, then periodically offered a project "health" report so that a dormant state of readiness is maintained. In the event of an unforeseen disruption, a panel of previously enrolled experts is far better placed to begin problem solving than a hastily assembled, ad hoc group.

In addition to applying contingency and maintaining a quiescent threat readiness, wise project managers will practice skillful early-stage planning to try to avoid unplanned events. Activities such as response training, stakeholder negotiation skills, avoidance of panic and over-reaction, and speedy approval processes all serve to strengthen resilience.[94]

75. Fisher, J. R. (2010). Large instrument development for radio astronomy: Astro2010 technology development white paper. Observatory.
76. Taleb, N. N. (2010). The black swan: The impact of the highly improbable. USA: Random House.
77. Ibid, p. 184.
78. Green, N. (2012). Keys to success in managing a black swan event, Aon Risk Solutions White Paper. Retrieved February 14, 2012, from aon.com/getmedia/bd217775-ef15-4d24-9126-50ed33d99b5c/Keys-To-Success-In-Managing-A-Black-Swan-Event.aspx.
79. Pender, S. (2001). Managing incomplete knowledge: Why risk management is not sufficient. International Journal of Project Management, 19, 79-87.
80. Geraldi, J. G., Lee-Kelley, L., & Kutsch, E. (2010). The Titanic sunk, so what? Project manager response to unexpected events. International Journal of Project Management, 28, 547-558.
81. Nicholas, J. M. (2004). Project management for business and engineering: Principles and practice. Oxford: Elsevier Butterworth-Heinemann.
82. Miller, R., & Lessard, D. (2000). The strategic management of large engineering projects. Massachusetts Institute of Technology, USA.
83. ALMA, 2007.
84. Pender, S. (2001). Managing incomplete knowledge: Why risk management is not sufficient. International Journal of Project Management, 19, 79-87.
85. Nicholas, J. M. (2004). Project management for business and engineering: Principles and practice. Oxford: Elsevier Butterworth-Heinemann.
86. Butts, G., & Linton, K. (2009). Joint confidence level paradox — A history of denial. NASA 2009 Cost Estimating Symposium, Galveston, USA.
87. Fellows, J. D., & Alexander, J. K. (Eds.). (2010). Decadal science strategy surveys: Report of a workshop, National Research Council, 2007. Retrieved October 1, 2010, from nap.edu/catalog/11894.html.
88. JPL (2010). James Webb Space Telescope independent comprehensive review — final report. JPL D-67250, Jet Propulsion Laboratory, California Institute of Technology.
89. Nicholas, J. M. (2004). Project management for business and engineering: Principles and practice. Oxford: Elsevier Butterworth-Heinemann.
90. Smith, C. (2007). Making sense of project realities. Hampshire, UK: Gower Publishing Ltd.
91. Crosby, P. (2012a). Characteristics and techniques of successful high-technology project managers. International Journal of Project Organization and Management, 4(2).
92. CERN, 2009.
93. Pavlak, A. (2004b). Modern Tiger Teams: Team problem solving for the 21st century. Thales Research Inc, USA.
94. Geraldi, J. G., Lee-Kelley, L., & Kutsch, E. (2010). The Titanic sunk, so what? Project manager response to unexpected events. International Journal of Project Management, 28, 547-558.
 