Hofstadter’s Law and the Planning Fallacy

May 22, 2019

[Image: xkcd comic on estimating time. Source: xkcd.com]

In his book Gödel, Escher, Bach: An Eternal Golden Braid, Douglas Hofstadter discusses the difficulty of estimating how long it will take to complete a computer program. From his observations about why such estimates go wrong, he coined what is now referred to as Hofstadter’s Law. It states:

“It always takes longer than you expect, even when you take into account Hofstadter’s Law.”

The applicability of Hofstadter’s Law has since been expanded to cover the problem of estimating the time and difficulty of completing any complex task.

Hofstadter’s Law is a form of the Planning Fallacy. It occurs because the more complex a project is, the more steps there are from start to finish, and each step is an opportunity to hit a snag and incur a delay. The quick calculation below shows how fast those opportunities compound.
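
To make the compounding concrete, here is a minimal sketch in Python. The 10% per-step snag rate and the step counts are illustrative assumptions, not figures from the post; the point is only that independent opportunities for delay multiply quickly.

```python
# A minimal sketch of how per-step snag risk compounds across a project.
# The 10% per-step probability is an illustrative assumption.

def prob_at_least_one_snag(steps: int, p_snag: float = 0.10) -> float:
    """Chance that at least one of `steps` independent steps hits a snag."""
    return 1 - (1 - p_snag) ** steps

for steps in (1, 5, 10, 20, 50):
    print(f"{steps:2d} steps -> {prob_at_least_one_snag(steps):.0%} "
          "chance of at least one delay")
```

With these assumed numbers, a 20-step project already carries roughly an 88% chance of at least one delay, and a 50-step project over 99%. As steps accumulate, a snag-free run becomes the exception rather than the rule.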

Hofstadter’s Law commonly shows up when undertaking a construction project, implementing a new software system, writing a book, or launching a new website. High-profile examples include:

  • The Tesla Model 3 was announced in March 2016, and Tesla began taking deposits for cars expected to ship in mid-2017. Delays pushed delivery back to early 2018, and then to late 2018.
  • The Boeing 787 Dreamliner suffered many setbacks, and its rollout was delayed by years.

These examples are not unusual. Complex projects take longer than we estimate, even when we know they will take longer than expected and plan for delays.

Nassim Taleb offers a great discussion of this phenomenon in The Black Swan, where he examines the impact of scalable randomness on delays. He takes Hofstadter’s Law a step further, arguing that expected delays grow and pile up as you experience them:

Let’s say a project is expected to terminate in 79 days . . . . On the 79th day, if the project is not finished, it will be expected to take another 25 days to complete. But on the 90th day, if the project is still not completed, it should have about 58 days to go. On the 100th, it should have 89 days to go. On the 119th, it should have an extra 149 days. On day 600, if the project is not done, you will be expected to need an extra 1,590 days. As you see, the longer you wait, the longer you will be expected to wait.
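
Taleb attributes this pattern to scalable (power-law) randomness. The simulation below is a minimal sketch of the idea, not a reconstruction of his numbers: it assumes project durations follow a Pareto distribution with shape α = 1.5 and a 30-day minimum (both parameters are my illustrative choices) and estimates the expected remaining time given that the project is still unfinished at each checkpoint.

```python
# Sketch: under a fat-tailed duration distribution, the expected remaining
# time *grows* the longer a project has already run. The Pareto shape
# (alpha = 1.5) and 30-day minimum are illustrative assumptions; Taleb's
# figures come from a different (empirical) distribution.
import random

random.seed(0)
ALPHA, MIN_DAYS = 1.5, 30.0
durations = [random.paretovariate(ALPHA) * MIN_DAYS for _ in range(1_000_000)]

for elapsed in (79, 90, 100, 119, 600):
    unfinished = [d for d in durations if d > elapsed]
    mean_remaining = sum(d - elapsed for d in unfinished) / len(unfinished)
    print(f"still unfinished on day {elapsed:>3}: "
          f"expect ~{mean_remaining:,.0f} more days")
```

For a Pareto distribution with shape α, the expected remaining time given survival to day t is t/(α − 1), so with α = 1.5 the simulation prints roughly twice the elapsed time at every checkpoint. The exact ratio depends on the assumed distribution, but the qualitative behavior matches Taleb’s description: the longer you wait, the longer you should expect to keep waiting.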

How can we combat Hofstadter’s Law and the Planning Fallacy? One potential solution is, when planning the time needed for completion, to seek guidance from someone with an outside view and experience with similar projects. There is a great discussion of this in Daniel Kahneman’s book Thinking, Fast and Slow, where he describes serving on a committee that was writing the curriculum and textbook for a high school course on decision science. After a year of meetings and work on the project, Kahneman asked the following:

I asked everyone to write down an estimate of how long it would take us to submit a finished draft of the textbook to the Ministry of Education. I was following a procedure that we already planned to incorporate into our curriculum: the proper way to elicit information from a group is not by starting with a public discussion but by confidentially collecting each person’s judgment. This procedure makes better use of the knowledge available to members of the group than the common practice of open discussion. I collected the estimates and jotted the results on the blackboard. They were narrowly centered around two years; the low end was one and a half, the high end two and a half years.

Then I had another idea. I turned to Seymour, our curriculum expert, and asked whether he could think of other teams similar to ours that had developed a curriculum from scratch. This was a time when several pedagogical innovations like “new math” had been introduced, and Seymour said he could think of quite a few. I then asked whether he knew the history of these teams in some detail, and it turned out that he was familiar with several. I asked him to think of these teams when they had made as much progress as we had. How long, from that point, did it take them to finish their textbook projects?

He fell silent. When he finally spoke, it seemed to me that he was blushing, embarrassed by his own answer: “You know, I never realized this before, but in fact not all the teams at a stage comparable to ours ever did complete their task. A substantial fraction of the teams ended up failing to finish the job.”

This was worrisome; we had never considered the possibility that we might fail. My anxiety rising, I asked how large he estimated that fraction was. “About 40%,” he answered. By now, a pall of gloom was falling over the room. The next question was obvious: “Those who finished,” I asked. “How long did it take them?” “I cannot think of any group that finished in less than seven years,” he replied, “nor any that took more than ten.”

I grasped at a straw: “When you compare our skills and resources to those of the other groups, how good are we? How would you rank us in comparison with these teams?” Seymour did not hesitate long this time. “We’re below average,” he said, “but not by much.” This came as a complete surprise to all of us—including Seymour, whose prior estimate had been well within the optimistic consensus of the group. Until I prompted him, there was no connection in his mind between his knowledge of the history of other teams and his forecast of our future.

Our state of mind when we heard Seymour is not well described by stating what we “knew.” Surely all of us “knew” that a minimum of seven years and a 40% chance of failure was a more plausible forecast of the fate of our project than the numbers we had written on our slips of paper a few minutes earlier. But we did not acknowledge what we knew. The new forecast still seemed unreal, because we could not imagine how it could take so long to finish a project that looked so manageable. No crystal ball was available to tell us the strange sequence of unlikely events that were in our future. All we could see was a reasonable plan that should produce a book in about two years, conflicting with statistics indicating that other teams had failed or had taken an absurdly long time to complete their mission. What we had heard was base-rate information, from which we should have inferred a causal story: if so many teams failed, and if those that succeeded took so long, writing a curriculum was surely much harder than we had thought. But such an inference would have conflicted with our direct experience of the good progress we had been making. The statistics that Seymour provided were treated as base rates normally are—noted and promptly set aside.

We should have quit that day. None of us was willing to invest six more years of work in a project with a 40% chance of failure. Although we must have sensed that persevering was not reasonable, the warning did not provide an immediately compelling reason to quit. After a few minutes of desultory debate, we gathered ourselves together and carried on as if nothing had happened. The book was eventually completed eight(!) years later. By that time I was no longer living in Israel and had long since ceased to be part of the team, which completed the task after many unpredictable vicissitudes. The initial enthusiasm for the idea in the Ministry of Education had waned by the time the text was delivered and it was never used.

This embarrassing episode remains one of the most instructive experiences of my professional life. I eventually learned three lessons from it. The first was immediately apparent: I had stumbled onto a distinction between two profoundly different approaches to forecasting, which Amos and I later labeled the inside view and the outside view. The second lesson was that our initial forecasts of about two years for the completion of the project exhibited a planning fallacy. Our estimates were closer to a best-case scenario than to a realistic assessment. I was slower to accept the third lesson, which I call irrational perseverance: the folly we displayed that day in failing to abandon the project. Facing a choice, we gave up rationality rather than give up the enterprise.
