
Unobvious pitfalls of big projects - from the construction industry to organisational change.


Numbers, facts, and more numbers


Every day, managers make significant decisions. Some are critical from a strategic point of view; some directly affect the financial outcome and delivered benefits of a project. The issue becomes even more serious when we consider that no decision is anonymous – behind each choice stands a decision-maker: a human being with all their flaws, proneness to delusion, and inherent subjectivity.

Big programmes and projects – not only construction uses programmes as a delivery platform; so does organisational change

Programmes around the world are getting bigger than they were in the past. Moreover, we can observe that programmes in general have become a favourite delivery platform for the public and business worlds (Flyvbjerg, 2017a). The problem emerges when we look closer at the historical data and compare them with more recent figures. Ansar et al. (2017) examined data on 245 big dams built over a span of 73 years, from 1934 to 2007. The results are surprising: almost half of the dams investigated were over budget. What is more striking, across all those years the study shows no improvement in forecast accuracy. A similar conclusion was reached by Flyvbjerg (2008; 2014), who investigated forecasts for infrastructure projects. He gathered data from ventures carried out over a 70-year period and found that, despite the technological advancement of forecasting methods, the accuracy of forecasts has remained at a similar level.



When we dive deeper into the numbers, the facts get even more disturbing. The cost overruns for each type of project were significant. For big dams, the average gap between forecast and actual costs was 96%, while the median was 27% higher than expected (Ansar et al., 2017, p. 15). These findings correspond with the figures for rail projects, whose forecasts were, on average, inaccurate by 44.7%; the inaccuracy for bridges/tunnels was 33.8% and for roads 20.4% (Flyvbjerg, 2008, p. 4). Moreover, Flyvbjerg also examined rail passenger and road traffic forecasts, for which 30 years of data were available. Here too, the results confirmed the trend – the estimates have not improved (Flyvbjerg, 2008). See Fig. 1.
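The gap between the 96% mean and the 27% median is itself informative. A minimal Python sketch with made-up overrun figures (not the actual dam data) shows how a few extreme outliers pull the mean far above the typical project:

    from statistics import mean, median

    # Hypothetical sample of cost overruns, in % over budget.
    overruns = [5, 10, 18, 27, 30, 41, 55, 120, 350, 640]

    print(f"mean overrun:   {mean(overruns):.1f}%")    # 129.6 - driven by two outliers
    print(f"median overrun: {median(overruns):.1f}%")  # 35.5 - the typical project

When the mean sits far above the median like this, the distribution of outcomes is skewed to the right: most projects overrun moderately, while a minority overrun catastrophically.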

Flyvbjerg et al. (2003, pp. 1-10) noticed a “paradox” that can be observed nowadays. Programmes are getting bigger and more complex, and demand for them keeps increasing. It is therefore hard to comprehend that managers in most cases stick with a so-called break-fix model (Flyvbjerg, 2017a, p. 14). The model presumes a standard approach to project management without sufficient knowledge and preparation. Managers are optimistic about forecasts, execution, costs, schedule, and benefits; therefore, in most cases, projects “break”. The process of “fixing” follows: the project is stopped and reorganised, which usually means readjusting the costs. The model harms the entire project and leads to cost overruns and benefit shortfalls. A perfect example of this approach is given by Flyvbjerg et al. (2014) in their report on the Hong Kong Express Rail Link. The researchers discovered many flaws in the project and inaccurate budget and schedule forecasts. They found that there was still a 31% risk of further delay and a 67% risk that the forecast cost would be exceeded; the budget was too tight to buffer unforeseen future events.


Sources of flaws, cases, solutions


What is the reason for this situation? As this paper touches on the issue of rationality in managers’ decisions, we first need to look much deeper to find out what stands behind decisions in project management and what the background for each of them is. Ariely (2008), in his book “Predictably Irrational: The Hidden Forces That Shape Our Decisions”, discusses the rationality of human beings in a wider sense. Through a series of examples, he shows that people want to be rational in their choices and perception, but they are susceptible to irrational shortcuts in how they see the world.


Kahneman and Tversky (1979) published a paper discussing the planning fallacy: people’s proneness to focus on the particular details of a problem instead of paying attention to the distribution of outcomes in comparable situations. Their early research confirmed that, in general, people rely mainly on singular information and are not sensitive enough to distributional data. Moreover, they display confidence in their predictions without having clear reasons for it. By paying attention only to benefits and ignoring costs, we become more prone to taking risks, which is why managers should guard against optimism bias. Furthermore, Kahneman (2011) presented a concept that distinguishes two systems of human thinking: System 1, responsible for intuitive thinking, and System 2, responsible for rational thinking. The former is fast and relies mainly on heuristics and cognitive biases, which makes it flawed when it comes to making optimal decisions. The latter embodies logical thinking and careful evaluation.


Flyvbjerg (2004; 2008; 2014) has described the planning fallacy in the context of project management. There are two well-grounded explanations for inaccurate forecasts: psychological and political-economic. The first is linked with optimism bias, which accounts for the fact, mentioned earlier, that most people are overly optimistic when asked about future events. The London Olympics are an excellent example of a project affected by optimism bias. In 2002, the initial forecast delivered by Arup set the cost at £1.8B. One year later, a more detailed estimate was prepared by PwC and gave a much higher sum of £3.1B. The corrected cost required a public subsidy of £1.3B, and PwC’s risk assessment stated that there was less than a 2% probability that this amount would rise to £2B. In 2005, the International Olympic Committee received a bid quoting £4.2B and deemed it realistic without any need for further review. However, in 2007 the Department for Culture, Media and Sport reviewed the budget again, ending up with a total cost of £6.5B (Kay, 2013).

The political-economic explanation is linked with strategic misrepresentation, which does not involve the cognitive aspect. It is associated with the deliberate overestimation of benefits in the planning phase in order to gain the support of decision-makers. For the same reason, the forecast cost can be brought down, just to justify a favourable decision at project appraisal. Moreover, strategic misrepresentation is more probable in a highly competitive environment where political or organisational pressures are present. The idea was first pointed out in studies by Wachs (1986; 1989; 1990). He rejected the explanation that the large number of wrong forecasts is caused by technical errors, pointing out that lying is a more probable reason. His conclusion came from interviews with people directly involved in underestimated forecasts in public transport planning. The participants admitted that they had fabricated and adjusted data without any scientific basis, just to fulfil the expectations of decision-makers. As the biggest flaw of Wachs’s research was a sample too small to assess how many projects are underestimated, Flyvbjerg et al. (2002) corroborated the study while providing far more extensive data. Their research covered a much larger, statistically significant sample and led to the conclusion that 86% of all examined projects had actual costs higher than forecast. The study also showed that, in general, 9 in 10 public works projects had initial cost estimates lower than the ultimate cost.


A good example of strategic misrepresentation and optimism bias working together is the A2 motorway in Poland, two sections of which (49.1 km in total) were in 2009 awarded to the Chinese consortium Covec. The investment had the highest priority: the road was to be built before the EURO 2012 football championship in Poland, as it would connect two big cities and relieve heavy traffic that could have been a huge problem during the event. Covec won the tender thanks to a critically low price of $450M – about half of the Polish government’s own forecast, which prompted the losing European firms to criticise the winner publicly. The low bid was intentional, meant to blaze a trail for entering other European markets, in the hope that the tight budget could somehow be met – up to that time the company had not been successful in Europe. Unfortunately, in June 2011, after failing to pay subcontractors, the company’s CEO announced that they could only continue the works for an additional $320M, which would have made the overall price 70% higher. After negotiations, the government decided to re-run the tender and choose a new partner (Areddy, 2012).


In addition to the above concepts, Hubbard (2009) confirms that “catastrophic” overconfidence is often the most problematic factor in forecasting and planning: people tend to be far too optimistic when they think about the future. He also introduced the concept of “calibration” (p. 103). A calibrated person can set aside their subjective perspective and take a different look at a particular case, and there are specific tests that can be used in calibration training. If only the managers involved in projects were aware of their biases, cost overruns and delivery delays could be reduced significantly.
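As an illustration of how such a test might be scored, consider the short Python sketch below; the intervals and answers are hypothetical placeholders, not Hubbard’s actual test items. The subject states a 90% confidence interval for each trivia question, and a calibrated estimator should capture the true value roughly 90% of the time.

    # Each tuple is the subject's stated 90% confidence interval (low, high).
    intervals = [(1_000, 5_000), (10, 40), (200, 300), (50, 90), (3, 8)]
    true_values = [4_200, 55, 260, 70, 5]  # the actual answers

    hits = sum(low <= truth <= high
               for (low, high), truth in zip(intervals, true_values))
    hit_rate = hits / len(true_values)

    print(f"hit rate: {hit_rate:.0%} (a calibrated estimator scores ~90%)")

A hit rate well below 90% signals overconfidence: the stated intervals are too narrow, which mirrors the overly tight budgets and schedules discussed above.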


To bypass people’s biases, Kahneman and Tversky (1979) proposed reference class forecasting. In practice, the concept was applied by Flyvbjerg and COWI (2004) in a study for the British Department for Transport, which later established it as a standard procedure for large projects. Used properly, reference class forecasting makes it possible to avoid both optimism bias and strategic misrepresentation. The cure for the biases can be depicted by two complementary concepts: the inside view and the outside view. The former is the view of people closely involved in the project, and it is therefore biased; positioned inside the project’s affairs, they fail to see many of the possible risks, so the risk of benefit shortfalls, cost increases, and schedule delays is high. To get rid of the biases, a person has to take the latter position (Flyvbjerg, 2008; 2014).


The outside view means taking a step back and seeing the project positioned among other, comparable groups of projects. Reference class forecasting requires managers to compare their project with a similar group of projects, which lets them see where it sits in a statistical distribution of outcomes. Specific steps are recommended for unveiling the biases, but the fundamental part is having access to reliable data and finding the right reference class: one comparable to the particular project, yet statistically meaningful enough to provide the manager with a credible outcome (Flyvbjerg, 2004; 2008; 2014).
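A minimal Python sketch of these mechanics, assuming a small, hypothetical reference class rather than the Department for Transport’s actual data: the uplift to be applied to the inside-view estimate is read off the empirical distribution of past overruns at whatever level of certainty the decision-maker requires.

    import math

    # Hypothetical reference class: fractional cost overruns observed on
    # comparable, completed projects (negative = under budget).
    reference_overruns = [-0.05, 0.02, 0.10, 0.15, 0.20,
                          0.27, 0.34, 0.45, 0.60, 0.96]

    def uplift(overruns, certainty):
        """Smallest uplift that would have covered `certainty` of the class."""
        ranked = sorted(overruns)
        idx = math.ceil(certainty * len(ranked)) - 1
        return ranked[idx]

    base_estimate = 100.0  # the project's own (inside-view) estimate
    u = uplift(reference_overruns, 0.80)  # accept a 20% chance of overrun
    print(f"P80 uplift: {u:.0%} -> budget of {base_estimate * (1 + u):.0f}")

In the published guidance the uplifts are tabulated by project type; the sketch only shows the mechanics of replacing a single-point, inside-view figure with a position in a distribution of comparable outcomes.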


Another aspect of the issue is that major programmes are usually big in terms of spend, interconnections among their constituent projects, and the complexity of tasks (Ferns, 1991). Ansar et al. (2017) discuss the general difference between “big” and “scalable” and then apply the theory to particular dam projects. The size of projects keeps increasing because they are expected to be efficient and scalable; however, this is very simplified thinking. Big size is believed to support these expectations according to the heuristics “bigger is better” and “build ahead of demand”. Unfortunately, large ventures are also more vulnerable because of their inherent complexity, long delivery times, and much higher financial risk. The paper concludes with a sentence that unambiguously condemns irrational decisions: “Ought decision-makers abandon all big ventures? Of course not. But decisionmakers must carefully assess when bigger is better, instead of unthinkingly assume this is the case” (p. 31).

Taleb (2007) came up with the idea of black swans, which denote statistical outliers that, because of the ubiquity of the Gaussian bell curve, were usually ignored – creating a paradigm of basing everything around us on average values. Black swan events are associated with fat tails in the probability distribution. Many experts are good at forecasting ordinary events, but when it comes to unusual circumstances, they fail (p. 159). Notably, he attributes this flawed approach to prediction to human weaknesses: “We attribute our successes to our skills, and our failures to external events outside our control, namely to randomness. We feel responsible for the good stuff, but not for the bad. This causes us to think that we are better than others at whatever we do for a living” (p. 152).
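The gap between thin-tailed and fat-tailed thinking can be made concrete with a toy Python comparison; both models below are hypothetical and not fitted to any dataset cited here. Under a Gaussian model an extreme overrun is effectively impossible, while under a power-law model it is rare but entirely real.

    import math

    def normal_tail(x, mu, sigma):
        """P(X > x) under a normal (thin-tailed) model."""
        return 0.5 * math.erfc((x - mu) / (sigma * math.sqrt(2)))

    def pareto_tail(x, x_min, alpha):
        """P(X > x) under a Pareto (fat-tailed, power-law) model."""
        return (x_min / x) ** alpha if x > x_min else 1.0

    # Hypothetical models of % cost overrun with similar 'typical' behaviour.
    print(f"P(overrun > 200%), normal: {normal_tail(200, mu=30, sigma=40):.1e}")
    print(f"P(overrun > 200%), Pareto: {pareto_tail(200, x_min=20, alpha=1.4):.1e}")

The normal model puts the chance of a 200% overrun at roughly one in a hundred thousand; the fat-tailed model puts it at a few in a hundred – the difference between treating black swans as impossible and merely as uncommon.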

In his later work, Taleb (2012) argues that big projects cannot be fully controlled. Managers can control many aspects of a project, but a certain level of uncertainty always remains. Black swans are events that are unpredictable yet significant, and they may change the course of an entire project. There are scientifically proven methods of risk management and forecasting, but they are worthless in the face of black swans. What is more, this kind of approach may be deceptive: unpredictable events are, by their nature, impossible to foresee, so believing in new methods that supposedly capture black swans makes us even more vulnerable to them. Ultimately, he concludes with a solution: “We should try to create institutions that won't fall apart when we encounter black swans—or that might even gain from these unexpected events” (p. 1).

There is no learning without reflection

Discussing why managers do not learn from their failures, Budzier and Flyvbjerg (2013) concluded that the reason is the “framing of the failure” (p. 22). If a problematic project is perceived as a random failure, there is no room for learning; managers come across as poor victims of fate. They could learn and understand the problem by looking not only at big failures – small overruns, too, can be a good early warning of significant danger. In their 2013 paper, the authors focused on IT projects and found that cost and schedule risk in this kind of project has fat tails.

One of the most popular examples of a black swan project in IT is the baggage handling system at Denver International Airport. Mainly because of this system, the airport opened 16 months later than scheduled and cost $4.8B instead of the forecast $2B – nearly two and a half times the original estimate. The problem paralysed the project, and the cost to the city of Denver skyrocketed to $1.1M per day of delay (Montealegre and Keil, 1998).

Black swans occur in every industry where high complexity and long project time frames are common. A classic example of such a venture in the oil and gas industry is the huge Sakhalin II project, executed with Shell as the main shareholder. The project concerned production installations at Sakhalin Island. It was sanctioned in 2003 at $10B – a sum that exceeded Shell’s entire net income for 2002. In 2005, Shell announced in its Form 6-K report that the cost had doubled to $20B (Dodson and Westney, 2014) and that first deliveries were expected to be delayed by 6 months. As Shell had already been forced to review and reduce its estimates of oil reserves several times in 2004, investors were strongly concerned (Seager, 2005).

Looking deeper, we also find the concept of the benevolent hiding hand, introduced by Hirschman (1967a; 1967b), which embodies the belief that certain mistakes in the forecasting phase can propel a fixing mechanism that leads to the ultimate success of a venture. He presents a set of projects that, in his view, prove the existence of the phenomenon. However, it has to be pointed out that Flyvbjerg (2016) undermined the theory by showing that Hirschman’s data collection was biased, his sample too small, and misrepresentation of findings and false results prominent (pp. 22-23). Moreover, some of the projects that Hirschman indicated as examples of the beneficial role of the hiding hand turned out in practice to be failures (Flyvbjerg, 2016; 2017b).

Concluding the above, we can say that irrational decisions are the biggest problem in selecting and delivering big programmes. They are not, however, the sole factor to be taken into consideration: major programmes are complex and full of interdependencies, so irrational decisions are an important root cause of failure, but there is much more to consider. Certainly, we can attribute a dose of irrationality to optimism bias. Managers are ignorant of their biases, or they lack the knowledge to bypass them; if they learned more about reference class forecasting or the idea of calibration, they could be much more successful in estimating and delivering their projects. They should also know more about the relationship between big and fragile, which would allow them to select more suitable and safer ventures. Although black swans in project management are difficult to avoid, managers could pay more attention to outliers and create additional scenarios, bearing in mind that the planning phase is crucial for avoiding black swans and delivering the expected benefits. On the other hand, irrational decisions are not the only problem in selecting and delivering programmes: the theory of strategic misrepresentation tells us that in some cases it is not about rationality but about conscious decisions, or deceptions, leading to a desired outcome.

We can conclude that, until now, most managers have worked by the break-fix method or counted on the benevolent hiding hand to save them – which would only be justified if these concepts had been proven helpful in real life. Do not let yourself fail; learn from the failures of others, and reflect on the lessons from the industry that lies at the foundation of programme management – the discipline that enables us to deliver all kinds of complex undertakings, from construction projects to change initiatives, from executing a company’s strategy to a turnaround.

#organisationalchange #blackswan #programmemanagement #strategy #consulting #changemanagement #optimismbias #construction #referenceclassforecasting #programmanagement #turnaround #transformation


References:

[1] Ansar, A., Flyvbjerg, B., Budzier, A., Lunn, D. 2017. Big is fragile: An attempt at theorizing scale. In: Bent Flyvbjerg, ed., The Oxford Handbook of Megaproject Management, Chapter 4, pp. 60-95. Oxford: Oxford University Press.
[2] Areddy, J. T. 2012. European project trips China builder. Wall Street Journal (Online), June 05, New York, NY.
[3] Ariely, D. 2008. Predictably Irrational: The Hidden Forces That Shape Our Decisions. New York: Harper Collins Publishers.
[4] Budzier, A., Flyvbjerg, B. 2013. Making sense of the impact and importance of outliers in project management through the use of power laws. Proceedings of IRNOP (International Research Network on Organizing by Projects), Oslo, Volume 11, June 1. Retrieved from https://ssrn.com/abstract=2289549
[5] Dodson, K., Westney, R. 2014. Predictable projects in a world of black swans. Westney Consulting Group. Retrieved from http://www.westney.com/wp-content/uploads/2014/05/Predictable-Projects-in-a-World-of-Black-Swans.pdf
[6] Flyvbjerg, B. 2017a. Introduction: The iron law of megaproject management. In: Bent Flyvbjerg, ed., The Oxford Handbook of Megaproject Management, Chapter 1, pp. 1-21. Oxford: Oxford University Press.
[7] Flyvbjerg, B. 2017b. Planning fallacy or hiding hand: Which is the better explanation? World Development, vol. 103 (2018), pp. 383-386. doi: 10.1016/j.worlddev.2017.10.002
[8] Flyvbjerg, B. 2008. Curbing optimism bias and strategic misrepresentation in planning: Reference class forecasting in practice. European Planning Studies, 16, pp. 3-21.
[9] Flyvbjerg, B. 2014. From Nobel Prize to project management: Getting risks right. Project Management Journal, 37, pp. 457-467.
[10] Flyvbjerg, B. 2016. The fallacy of beneficial ignorance: A test of Hirschman's hiding hand. World Development, vol. 84, May, pp. 176-189.
[11] Flyvbjerg, B., Bruzelius, N., Rothengatter, W. 2003. The megaprojects paradox. In: Megaprojects and Risk: An Anatomy of Ambition. Cambridge: Cambridge University Press, pp. 1-10.
[12] Flyvbjerg, B., COWI. 2004. Procedures for dealing with optimism bias in transport planning: Guidance document. London: UK Department for Transport.
[13] Flyvbjerg, B., Kao, T.-C., Budzier, A. 2014. Report to the Independent Board Committee. October 28. Retrieved from https://ssrn.com/abstract=2516300
[14] Flyvbjerg, B., Skamris, M., Buhl, S. 2002. Underestimating costs in public works projects: Error or lie? Journal of the American Planning Association, vol. 68, no. 3, Summer, pp. 279-295.
[15] Hirschman, A. O. 1967a. Development Projects Observed. Washington, DC: Brookings Institution.
[16] Hirschman, A. O. 1967b. The principle of the hiding hand. The Public Interest, Winter, pp. 10-23.
[17] Hubbard, D. W. 2009. The Failure of Risk Management: Why It's Broken and How to Fix It. John Wiley & Sons.
[18] Kahneman, D. 2011. Thinking, Fast and Slow. New York: Farrar, Straus and Giroux.
[19] Kahneman, D., Tversky, A. 1979. Intuitive prediction: Biases and corrective procedures. In: S. Makridakis & S. C. Wheelwright (eds), Studies in the Management Sciences: Forecasting, 12. Amsterdam: North Holland, pp. 313-327.
[20] Kay, J. 2013. The Olympic optimism bias has left the taxpayer out of pocket. Financial Times, November 26. Retrieved from https://www.ft.com/content/10979672-55e2-11e3-96f5-00144feabdc0
[21] Montealegre, R., Keil, M. 1998. Denver International Airport's automated baggage handling system: A case study of de-escalation of commitment. Academy of Management Proceedings, 08, pp. 1-9.
[22] Seager, A. 2005. Shell costs double at Sakhalin. The Guardian, July 15. Retrieved from https://www.theguardian.com/business/2005/jul/15/oilandpetrol.news
[23] Taleb, N. N. 2012. Learning to love volatility. The Wall Street Journal, Saturday Essay, November 16.
[24] Taleb, N. N. 2007. The Black Swan: The Impact of the Highly Improbable. New York: Random House.
[25] Wachs, M. 1986. Technique vs. advocacy in forecasting: A study of rail rapid transit. Urban Resources, 4(1), pp. 23-30.
[26] Wachs, M. 1989. When planners lie with numbers. Journal of the American Planning Association, 55(4), pp. 476-479.
[27] Wachs, M. 1990. Ethics and advocacy in forecasting for public policy. Business and Professional Ethics Journal, 9(1-2), pp. 141-157.

Author: Radek Jaros, MBA, MSc (Oxon)

©️ 2019 Radek Jaros. All rights reserved.


First published in:

©️ 2019 Oxford Business Journal - Oxford Business Consulting. All rights reserved.


