Sunk cost

The sunk cost fallacy has also been called the "Concorde fallacy": the UK and French governments took their past expenses on the costly supersonic jet as a rationale for continuing the project, as opposed to "cutting their losses".

In economics and business decision-making, a sunk cost (also known as retrospective cost) is a cost that has already been incurred and cannot be recovered.[1][2][3] Sunk costs are contrasted with prospective costs, which are future costs that may be avoided if action is taken.[4] In other words, a sunk cost is a sum paid in the past that is no longer relevant to decisions about the future.[3] Even though economists argue that sunk costs are no longer relevant to rational decision-making about the future, in everyday life people often factor previous expenditures, such as money spent repairing a car or house, into future decisions about those properties.

Bygones principle

According to classical economics and traditional microeconomic theory, only prospective (future) costs are relevant to a rational decision.[5] At any moment in time, the best thing to do depends only on current alternatives.[6] The only things that matter are the future consequences.[7] Past mistakes are irrelevant.[6] Any costs incurred prior to making the decision have already been incurred no matter what decision is made. They may be described as "water under the bridge,"[8] and making decisions on their basis may be described as "crying over spilt milk."[9] In other words, people should not let sunk costs influence their decisions; sunk costs are irrelevant to rational decisions. This is known as the bygones principle[7][10] or the marginal principle.[11]

The bygones principle is grounded in the branch of normative decision theory known as rational choice theory, particularly in the expected utility hypothesis. Expected utility theory relies on a property known as cancellation, which says that it is rational in decision-making to disregard (cancel) any state of the world that yields the same outcome regardless of one's choice.[12] Past decisions, including sunk costs, meet that criterion.
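The cancellation property can be made concrete with a small numerical sketch (all payoffs below are invented for illustration): a sunk cost is a constant added to every outcome of every option, so it cannot change which option maximizes expected value.

```python
# Illustrative sketch of the cancellation property (all numbers invented).
# A sunk cost adds the same constant to every outcome of every option, so
# it cannot change which option has the higher expected value.

def expected_value(payoffs, probs):
    """Probability-weighted sum of future payoffs."""
    return sum(p * x for p, x in zip(probs, payoffs))

probs = [0.5, 0.5]            # two equally likely future states of the world
options = {
    "continue": [100, -40],   # future payoffs if the project continues
    "abandon": [20, 20],      # future payoffs if it is abandoned
}
sunk = -500                   # already spent; identical under either choice

best_ignoring_sunk = max(options, key=lambda c: expected_value(options[c], probs))
best_including_sunk = max(
    options,
    key=lambda c: expected_value([x + sunk for x in options[c]], probs),
)
assert best_ignoring_sunk == best_including_sunk  # the sunk term cancels
```

Here "continue" has an expected value of 30 versus 20 for "abandon", and shifting both by the sunk $500 (to -470 and -480) leaves the ranking unchanged.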

Until a decision-maker irreversibly commits resources, the prospective cost is an avoidable future cost and is properly included in any decision-making processes.[10] For instance, if someone is considering pre-ordering movie tickets, but has not actually purchased them yet, the cost remains avoidable.

Both retrospective and prospective costs can be either fixed costs (incurred continuously for as long as the business operates, regardless of output volume) or variable costs (dependent on volume).[13] However, many economists consider it a mistake to classify sunk costs as "fixed" or "variable." For example, if a firm sinks $400 million on an enterprise software installation, that cost is "sunk": it was a one-time expense that cannot be recovered once spent. A "fixed" cost would be the monthly payments made under a service contract or licensing deal with the company that set up the software. The upfront, irretrievable installation payment should not be treated as a "fixed" cost whose expense is spread out over time; sunk costs should be kept separate. The "variable costs" for this project might include data centre power usage, for example.
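The taxonomy in the software example can be sketched as follows (the amounts and category labels are hypothetical, chosen only to mirror the text above):

```python
# A sketch of the three cost categories in the software example above
# (all amounts are hypothetical).
costs = {
    "installation": {"amount": 400_000_000, "kind": "sunk"},      # one-time, unrecoverable
    "service_contract": {"amount": 50_000, "kind": "fixed"},      # recurring, output-independent
    "datacentre_power": {"amount": 120_000, "kind": "variable"},  # scales with usage
}

# Only non-sunk (prospective) costs belong in forward-looking decisions:
prospective = {name: c for name, c in costs.items() if c["kind"] != "sunk"}
assert "installation" not in prospective
assert set(prospective) == {"service_contract", "datacentre_power"}
```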

There are cases in which taking sunk costs into account in decision-making, violating the bygones principle, is rational.[14] For example, for a manager who wishes to be perceived as persevering in the face of adversity and privately holds information about the undesirability of abandoning a project, it may be rational to display the sunk cost effect and persist with the project.[15]

The fallacy effect

The bygones principle does not accord with real-world behavior. Sunk costs do, in fact, influence people's decisions,[8][14] with people believing that past investments (i.e., sunk costs) justify further expenditures.[16] People demonstrate "a greater tendency to continue an endeavor once an investment in money, effort, or time has been made."[17][18] This is the sunk cost fallacy; such behavior may be described as "throwing good money after bad"[19][14] and as a refusal to "cut one's losses."[14]

The term "Concorde fallacy"[20] derives from the fact that the British and French governments continued to fund the joint development of the costly Concorde supersonic airplane even after it became apparent that there was no longer an economic case for the aircraft. The British government privately regarded the project as a commercial disaster that should never have been started. However, political and legal issues made it impossible for either government to pull out.[10]

In an everyday example, a person may purchase a ticket to a baseball game and find after several innings that they are not enjoying the game. Their options at this point are:

  1. Accepting the waste of money on the ticket price and watching the remainder of the game without enjoyment; or
  2. Accepting the waste of money on the ticket price and leaving to do something else.

The economist will suggest that, since the second option involves suffering in only one way (wasted money), while the first involves suffering in two (wasted money plus wasted time), option two is preferable. In either case, the ticket-buyer has paid the price of the ticket so that part of the decision should no longer affect the future. If the ticket-buyer regrets buying the ticket, the current decision should be based on whether he wants to see the game at all, regardless of the price, just as if he were to go to a free baseball game.
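The comparison can be written out with invented dollar figures; the ticket price appears identically on both sides of the inequality, so it cannot tip the decision:

```python
# The baseball-ticket comparison, with invented values for illustration.
ticket_price = 30          # already paid; identical under both options, hence sunk
value_of_staying = -5      # negative: the spectator is not enjoying the game
value_of_leaving = 10      # value of doing something else instead

# Only future consequences differ between the options:
assert value_of_leaving > value_of_staying

# Subtracting the sunk ticket price from both sides changes nothing:
assert (value_of_leaving - ticket_price) > (value_of_staying - ticket_price)
```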

Many people, however, would feel obliged to stay for the rest of the game despite not really wanting to, perhaps because they feel that doing otherwise would be wasting the money they spent on the ticket. They may feel they have passed the point of no return. Economists regard this behavior as irrational. It is inefficient because it misallocates resources by taking irrelevant information into account.

The idea of sunk costs is often employed when analyzing business decisions. A common example of a sunk cost for a business is the promotion of a brand name. This type of marketing incurs costs that cannot normally be recovered. It is not typically possible to later "demote" one's brand names in exchange for cash. A second example is research and development (R&D) costs. Once spent, such costs are sunk and should have no effect on future pricing decisions. So a pharmaceutical company's attempt to justify high prices because of the need to recoup R&D expenses is fallacious. The company will charge market prices whether R&D had cost one dollar or one million dollars. However, R&D costs, and the ability to recoup those costs, are a factor in deciding whether to spend the money on R&D or not.[21]

The sunk cost effect may cause cost overrun. In business, an example of sunk costs may be an investment into a factory or research that now has a lower value or no value whatsoever. For example, $20 million has been spent on building a power plant; the value now is zero because it is incomplete (and no sale or recovery is feasible). The plant can be completed for an additional $10 million or abandoned and a different but equally valuable facility built for $5 million. Abandonment and construction of the alternative facility is the more rational decision, even though it represents a total loss of the original expenditure—the original sum invested is a sunk cost. If decision-makers are irrational or have the wrong incentives, the completion of the project may be chosen. For example, politicians or managers may have more incentive to avoid the appearance of a total loss. In practice, there is considerable ambiguity and uncertainty in such cases, and decisions may in retrospect appear irrational that were, at the time, reasonable to the economic actors involved and in the context of their incentives. A decision-maker might make rational decisions according to their incentives, outside of efficiency or profitability. This is considered to be an incentive problem and is distinct from a sunk cost problem.
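The power-plant figures above can be worked through directly. Because the two facilities are assumed equally valuable, only the prospective costs need comparing, and adding the sunk $20 million to both branches cannot reverse the ordering:

```python
# The power-plant example worked through (figures from the text, in $ millions).
sunk = 20.0               # already spent on the incomplete plant; unrecoverable
finish_original = 10.0    # prospective cost to complete the existing plant
build_alternative = 5.0   # prospective cost of an equally valuable new facility

# Rational choice: minimize prospective cost, since the outcomes are equal in value.
assert build_alternative < finish_original

# Including the sunk $20M on both branches cannot reverse the ordering:
assert sunk + build_alternative < sunk + finish_original
```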

Sunk costs are distinct from economic losses. For example:

[W]hen a new car is purchased, it can subsequently be resold; however, it will probably not be resold for the original purchase price. The economic loss is the difference (including transaction costs). The sum originally paid should not affect any rational future decision-making about the car, regardless of the resale value—if the owner can derive more value from selling the car than not selling it, [then] it should be sold, regardless of the price paid.[3][10]
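The distinction in the quotation can be illustrated with invented figures: the economic loss is real, but only forward-looking values enter the sell-or-keep decision.

```python
# The car example with invented figures: economic loss vs. the forward-looking choice.
purchase_price = 25_000        # sunk: paid in the past
resale_price = 15_000          # what the car would fetch today
transaction_costs = 500

economic_loss = purchase_price - (resale_price - transaction_costs)
assert economic_loss == 10_500

# The rational decision compares only future values; the purchase price drops out.
value_of_keeping = 14_000                             # owner's use-value (assumed)
value_of_selling = resale_price - transaction_costs   # net proceeds of selling
decision = "sell" if value_of_selling > value_of_keeping else "keep"
assert decision == "sell"
```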

Some research has also noted circumstances where the sunk cost effect is reversed; that is, where individuals appear irrationally eager to write off earlier investments in order to take up a new endeavor.[22]

Plan continuation bias

A related phenomenon is plan continuation bias,[23][24][25][26][27] also called get-there-itis or press-on-itis, which is an unwise tendency to persist with a plan that is failing.

This is a hazard for ships' captains and aircraft pilots, who may stick to a planned course even when it is leading to disaster and they should abort instead. A famous example is the Torrey Canyon oil spill, in which the tanker ran aground when its captain persisted with a risky course rather than accepting a delay.[28] The bias has been a factor in numerous air crashes: an analysis of 279 approach and landing accidents (ALAs) found that it was the fourth most common cause, occurring in 11% of cases,[29] and another analysis of 76 accidents found it was a contributory factor in 42% of cases.[30]

Projects often suffer cost overruns and delays due to the planning fallacy and related factors including excessive optimism, an unwillingness to admit failure, groupthink and aversion to loss of sunk costs.[31]

Psychological factors

Evidence from behavioral economics suggests that there are at least five specific psychological factors underlying the sunk cost effect:

  • Loss aversion, whereby the price paid becomes a benchmark for value, even though the price paid should be irrelevant.
  • Framing effects, a cognitive bias where people decide on options based on whether the options are presented with positive or negative connotations; e.g. as a loss or as a gain.[32] People tend to avoid risk when a positive frame is presented but seek risks when a negative frame is presented.[33]
  • An overoptimistic probability bias, whereby after an investment the evaluation of one's investment-reaping dividends is increased.
  • The requisite of personal responsibility. Sunk cost appears to operate chiefly in those who feel a personal responsibility for the investments that are to be viewed as a sunk cost.
  • The desire not to appear wasteful—"One reason why people may wish to throw good money after bad is that to stop investing would constitute an admission that the prior money was wasted."[18]

Taken together, these results suggest that the sunk cost effect may reflect non-standard measures of utility, which is ultimately subjective and unique to the individual.

Experiments have shown that the sunk cost fallacy and loss aversion are common; hence economic rationality, as assumed by much of economics, is limited. This has significant implications for finance, economics, and securities markets in particular. Daniel Kahneman won the Nobel Prize in Economics in part for his extensive work in this area with his collaborator, Amos Tversky.

Loss aversion

Many people have strong misgivings about "wasting" resources (loss aversion) that may contribute to the sunk cost effect. However, David Gal and Derek Rucker argue that the sunk cost effect cannot be due to loss aversion because there is no comparison of a loss to a gain.[34]

Overoptimistic probability bias

In 1968, Knox and Inkster[35] approached 141 horse bettors: 72 had just finished placing a $2.00 bet within the past 30 seconds, and 69 were about to place a $2.00 bet in the next 30 seconds. Their hypothesis was that people who had just committed themselves to a course of action (betting $2.00) would reduce post-decision dissonance by believing more strongly than ever that they had picked a winner. Knox and Inkster asked the bettors to rate their horse's chances of winning on a 7-point scale. People who were about to place a bet rated the chance that their horse would win at an average of 3.48, corresponding to a "fair chance of winning," whereas people who had just finished betting gave an average rating of 4.81, corresponding to a "good chance of winning." Their hypothesis was confirmed: after making a $2.00 commitment, people became more confident their bet would pay off. Knox and Inkster performed an ancillary test on the patrons of the horses themselves and managed (after normalization) to repeat their finding almost identically. Other researchers have also found evidence of inflated probability estimations.[36][37]

Sense of personal responsibility

In a study of 96 business students, Staw and Fox[38] gave the subjects a choice between making an R&D investment either in an underperforming company department, or in other sections of the hypothetical company. Staw and Fox divided the participants into two groups: a low responsibility condition and a high responsibility condition. In the high responsibility condition, the participants were told that they, as manager, had made an earlier, disappointing R&D investment. In the low responsibility condition, subjects were told that a former manager had made a previous R&D investment in the underperforming division and were given the same profit data as the other group. In both cases subjects were then asked to make a new $20 million investment. There was a significant interaction between assumed responsibility and average investment, with the high responsibility condition averaging $12.97 million and the low condition averaging $9.43 million. Similar results have been obtained in other studies.[39][36][40]

Desire not to appear wasteful

A ticket-buyer who purchases a ticket in advance to an event they eventually turn out not to enjoy has made a semi-public commitment to watching it. To leave early is to make this lapse of judgment manifest to strangers, an appearance they might otherwise choose to avoid. The person may also not want to leave the event because they have already paid, and so feel that leaving would waste their expenditure. Alternatively, they may take a sense of pride in having recognized the opportunity cost of the alternative use of time.


References

  1. ^ Mankiw, N. Gregory (2009). Principles of Microeconomics (5th ed.). Mason, OH: Cengage Learning. pp. 296–297. ISBN 978-1-111-80697-2.
  2. ^ Mankiw, N. Gregory (2018). Principles of Economics (8th ed.). Boston, MA: Cengage Learning. pp. 274–276. ISBN 978-1-305-58512-6.
  3. ^ a b c Hussain, Tahir (2010). Engineering Economics. New Delhi: Laxmi Publications, Ltd. ISBN 978-93-80386-47-8.
  4. ^ Warnacut, Joyce I. (2017). The Monetary Value of Time: Why Traditional Accounting Systems Make Customers Wait. Taylor & Francis. ISBN 978-1-4987-4967-1.
  5. ^ Sharma, Sanjay; Sharma, Pramodita (2019). Patient Capital. Cambridge University Press. ISBN 978-1-107-12366-3.
  6. ^ a b Lipsey, Richard G.; Harbury, Colin (1992). First Principles of Economics. Oxford University Press. p. 143. ISBN 978-0-297-82120-5.
  7. ^ a b Ryan, Bob (2004). Finance and Accounting for Business. Cengage Learning EMEA. pp. 229–230. ISBN 978-1-86152-993-0.
  8. ^ a b Bernheim, B. Douglas; Whinston, Michael Dennis (2008). Microeconomics. McGraw-Hill Irwin. ISBN 978-0-07-721199-8.
  9. ^ Jain, P. K. (2000). Cost Accounting. Tata McGraw-Hill Education. ISBN 978-0-07-040224-9.
  10. ^ a b c d Gupta, K. P. (2009). Cost Management: Measuring, Monitoring & Motivating Performance. Global India Publications. ISBN 978-93-80228-02-0.
  11. ^ Samuelson, Paul A. (2010). Economics. Tata McGraw-Hill Education. ISBN 978-0-07-070071-0.
  12. ^ Tversky, Amos; Kahneman, Daniel (1986). "Rational choice and the framing of decisions". The Journal of Business. 59 (4): S251–S278. doi:10.1086/296365. ISSN 0021-9398. JSTOR 2352759.
  13. ^ Sherman, Roger (2008). Market Regulation. Pearson / Addison Wesley. ISBN 978-0-321-32232-6.
  14. ^ a b c d Parayre, Roch (1995). "The strategic implications of sunk costs: A behavioral perspective". Journal of Economic Behavior & Organization. 28 (3): 417–442. doi:10.1016/0167-2681(95)00045-3. ISSN 0167-2681.
  15. ^ Staw, Barry M.; Ross, Jerry (1987). "Knowing When to Pull the Plug". Harvard Business Review (March 1987). ISSN 0017-8012. Retrieved 2019-08-09.
  16. ^ Arkes, Hal (2000). "Think Like a Dog". Psychology Today. 33 (1): 10. ISSN 0033-3107. Retrieved 2019-08-05.
  17. ^ Arkes, Hal R.; Ayton, Peter (1999). "The sunk cost and Concorde effects: Are humans less rational than lower animals?". Psychological Bulletin. 125 (5): 591–600. doi:10.1037/0033-2909.125.5.591. ISSN 1939-1455.
  18. ^ a b Arkes, Hal R; Blumer, Catherine (1985). "The psychology of sunk cost". Organizational Behavior and Human Decision Processes. 35 (1): 124–140. doi:10.1016/0749-5978(85)90049-4. ISSN 0749-5978.
  19. ^ "sunk cost fallacy". Cambridge English Dictionary. Cambridge University Press. 2019. Retrieved 2019-08-07.
  20. ^ Weatherhead, P.J. (1979). "Do Savannah Sparrows Commit the Concorde Fallacy?". Behav. Ecol. Sociobiol. Springer Berlin. 5 (4): 373–381. doi:10.1007/BF00292525.
  21. ^ Yoram, Bauman; Klein, Grady (2010). The Cartoon Introduction to Economics. Volume One: Microeconomics (1st ed.). New York: Hill and Wang. pp. 24–25. ISBN 978-0-8090-9481-3.
  22. ^ Heath, Chip (1995). "Escalation and de-escalation of commitment in response to sunk costs: The role of budgeting in mental accounting". Organizational Behavior and Human Decision Processes. 62 (1): 38–54.
  23. ^ "Flying in the rear view mirror". Critical Uncertainties. 2011-06-26. Retrieved 2019-12-28.
  24. ^ Admin (2015-06-20). "Safety and The Sunk Cost Fallacy". SafetyRisk.net. Retrieved 2019-12-28.
  25. ^ "17 Cognitive Biases which Contribute to Diving Accidents". www.thehumandiver.com. Retrieved 2019-12-28.
  26. ^ Winter, Scott R.; Rice, Stephen; Capps, John; Trombley, Justin; Milner, Mattie N.; Anania, Emily C.; Walters, Nathan W.; Baugh, Bradley S. (2020-03-01). "An analysis of a pilot's adherence to their personal weather minimums". Safety Science. 123: 104576. doi:10.1016/j.ssci.2019.104576. ISSN 0925-7535.
  27. ^ "FAA Safety Briefing - July August 2018" (PDF). FAA.
  28. ^ Harford, Tim (18 January 2019), "Brexit lessons from the wreck of the Torrey Canyon", Financial Times
  29. ^ Khatwa, Ratan; Helmreich, Robert (November 1998 – February 1999), "Analysis of Critical Factors During Approach and Landing in Accidents and Normal Flight" (PDF), Flight Safety Digest, pp. 1–77
  30. ^ Berman, Benjamin A.; Dismukes, R. Key (December 2006), "Pressing the Approach" (PDF), Aviation Safety World, pp. 28–33
  31. ^ Behavioural Insights Team (July 2017). "A review of optimism bias, planning fallacy, sunk cost bias and groupthink in project delivery and organisational decision making" (PDF). An Exploration of Behavioural Biases in Project Delivery at the Department for Transport. GOV.UK.
  32. ^ Plous 1993
  33. ^ Tversky & Kahneman 1981
  34. ^ Gal, David; Rucker, Derek D. (2018). "The Loss of Loss Aversion: Will It Loom Larger Than Its Gain?". Journal of Consumer Psychology. 28 (3): 497–516. doi:10.1002/jcpy.1047. ISSN 1532-7663.
  35. ^ Knox, RE; Inkster, JA (1968). "Postdecision dissonance at post time". Journal of Personality and Social Psychology. 8 (4): 319–323. doi:10.1037/h0025528. PMID 5645589.
  36. ^ a b Arkes, Hal; Blumer, Catherine (1985). "The Psychology of Sunk Cost". Organizational Behavior and Human Decision Processes. 35: 124–140. doi:10.1016/0749-5978(85)90049-4.
  37. ^ Arkes, Hal; Hutzel, Laura (2000). "The Role of Probability of Success Estimates in the Sunk Cost Effect". Journal of Behavioral Decision Making. 13 (3): 295–306. doi:10.1002/1099-0771(200007/09)13:3<295::AID-BDM353>3.0.CO;2-6.
  38. ^ Staw, Barry M.; Fox, Frederick V. (1977). "Escalation: The Determinants of Commitment to a Chosen Course of Action". Human Relations. 30 (5): 431–450. doi:10.1177/001872677703000503. Retrieved 2019-08-06.
  39. ^ Staw, Barry M. (1976). "Knee-deep in the big muddy: A study of escalating commitment to a chosen course of action" (PDF). Organizational Behavior and Human Performance. 16 (1): 27–44. doi:10.1016/0030-5073(76)90005-2. ISSN 0030-5073. Retrieved 2019-08-05.
  40. ^ Whyte, Glen (1986). "Escalating Commitment to a Course of Action: A Reinterpretation". The Academy of Management Review. 11 (2): 311–321. doi:10.2307/258462. ISSN 0363-7425. JSTOR 258462.
