About TED-Ed Originals

TED-Ed Original lessons feature the words and ideas of educators brought to life by professional animators.

Meet The Creators

  • Educator Derek Abbott
  • Script Editor Alex Gendler
  • Director Brett Underhill
  • Animator Brett Underhill
  • Narrator Addison Anderson


Additional Resources for you to Explore
This exercise has gotten you to think about probability and likelihood, and in particular about cases where unanimous agreement is so unexpected that a systematic failure or a systemic bias is the more probable explanation.
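One way to make this intuition concrete is to compare two explanations for a unanimous result: genuinely independent agreement versus a shared systematic failure. The sketch below is a minimal Bayesian comparison; the per-witness accuracy, the prior probability of a systemic failure, and the assumption that a failure guarantees unanimity are all illustrative choices, not figures from the lesson.

```python
def prob_unanimous_independent(n, p=0.9):
    """Probability that n independent witnesses, each correct with
    probability p, all pick the same (correct) answer."""
    return p ** n

def posterior_systemic_bias(n, p=0.9, prior_bias=0.01):
    """Posterior probability that a systemic bias explains a unanimous
    result, assuming (for illustration) that a bias forces unanimity
    with probability 1."""
    p_unanimous_if_sound = p ** n          # honest unanimity gets rare fast
    num = prior_bias * 1.0                 # bias -> unanimity for sure
    den = num + (1 - prior_bias) * p_unanimous_if_sound
    return num / den

# As the number of unanimous witnesses grows, honest unanimity becomes
# ever less likely, so the bias explanation takes over.
for n in (3, 10, 20, 50):
    print(n, round(prob_unanimous_independent(n), 4),
          round(posterior_systemic_bias(n), 4))
```

With these illustrative numbers, three unanimous witnesses are unremarkable, but fifty unanimous witnesses make a systemic bias the leading explanation, even though the prior on bias was only 1%.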

In the case of a police lineup, brainstorm as many crazy ideas as you can think of that might bias independent witnesses to choose the wrong person. Search the internet and see if you can find real cases of this happening, such as this one here.

Think back to the example of the nine apples and one rubber ball, where the witnesses were only given 10 seconds through a keyhole. How could many independent witnesses all agree on the incorrect number of apples? Perhaps some apples had accidentally rolled forward, looked larger than the rest in the lineup, and were preferentially counted? Brainstorm as many other crazy possibilities as you can think up. The crazier the better!
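The apple example can be simulated. In the sketch below, each witness independently miscounts with some probability; in a second scenario, a scene-level glitch (such as apples rolling forward) occasionally pushes every witness toward the same wrong count. All the probabilities are invented for illustration, but the contrast is the point: unanimous wrong answers are essentially impossible under independence and routine under a shared bias.

```python
import random

def count_unanimous_wrong(n_witnesses=10, n_trials=10_000,
                          p_correct=0.9, shared_bias=False, seed=1):
    """Fraction of trials in which ALL witnesses report the same
    wrong apple count. With shared_bias, a scene glitch (prob 0.2,
    illustrative) gives everyone the same misleading view."""
    rng = random.Random(seed)
    true_count, wrong_count = 9, 8
    unanimous_wrong = 0
    for _ in range(n_trials):
        if shared_bias and rng.random() < 0.2:
            # Everyone sees the same misleading scene.
            reports = [wrong_count] * n_witnesses
        else:
            # Independent errors: each witness is right with p_correct.
            reports = [true_count if rng.random() < p_correct
                       else wrong_count
                       for _ in range(n_witnesses)]
        if all(r == wrong_count for r in reports):
            unanimous_wrong += 1
    return unanimous_wrong / n_trials

print(count_unanimous_wrong(shared_bias=False))  # ~0: needs 10 independent errors
print(count_unanimous_wrong(shared_bias=True))   # roughly the glitch rate, ~0.2
```

Ten independent witnesses all erring together requires ten coincident mistakes (probability 0.1¹⁰ here), so when you do observe a unanimous wrong answer, a shared cause is by far the better bet.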

See here and here to get the idea of how systematic failure and systemic bias can falsely create a unanimous outcome. Search the internet to find other real examples of this.

It’s said that when Walt Disney presented a new project to his executives, he would not go ahead if all the executives unanimously thought it was a great idea. There is some wisdom in this. By rejecting the unanimous vote, Walt Disney was, in effect, guarding against the possibility of a systematic failure caused by his executives simply being “yes men” rather than really thinking through the proposal carefully. Can you search the internet and find other real examples of strategies that guard against systemic bias or systematic failure?

Is it possible for a large sequence of measurements or observations that all support a hypothesis to, counterintuitively, decrease our confidence in it? Can unanimous support be too good to be true?
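Yes, and the effect can be sketched numerically. The toy model below computes the probability that a suspect is guilty given n unanimous identifications, while allowing a small chance that the procedure itself failed and would produce unanimity regardless of guilt. All parameter values (identification accuracy, failure probability, prior) are illustrative assumptions, not data from any real case.

```python
def p_guilty_given_unanimous(n, p_id=0.8, p_fail=0.01, prior=0.5):
    """P(guilty | n unanimous identifications), in a toy model where
    with probability p_fail the procedure fails and yields unanimity
    no matter what. Otherwise each witness IDs the suspect with
    probability p_id if guilty and (1 - p_id) if innocent.
    All numbers illustrative."""
    like_guilty = p_fail + (1 - p_fail) * p_id ** n
    like_innocent = p_fail + (1 - p_fail) * (1 - p_id) ** n
    num = prior * like_guilty
    return num / (num + (1 - prior) * like_innocent)

# Confidence first rises with more unanimous witnesses, then falls
# back toward the prior: eventually a procedural failure is the most
# plausible source of the unanimity.
for n in (1, 3, 5, 10, 30):
    print(n, round(p_guilty_given_unanimous(n), 4))
```

In this model, a handful of unanimous identifications is strong evidence, but dozens of them are not: as n grows, both likelihoods are dominated by the failure term, so the posterior drifts back toward the 0.5 prior. That is the sense in which unanimous support can be too good to be true.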