Ballparking with Enrico Fermi

We’re all familiar with ‘ballparking’ – making rough estimates or vague guesses. Today I want to talk about a particular kind of ballparking, namely how we solve a type of mental puzzle called “Fermi questions”. They were employed by physicist Enrico Fermi during the Manhattan Project as fun puzzles to keep up morale (at least that’s what he thought he was doing). Nowadays, they’re used by employers like Google and Microsoft in interviews to test candidates’ problem-solving skills. They point to some interesting things about the mind, but I also think they’re a great exercise in testing the limits of thinking. Here are some typical examples of Fermi questions.

How many playgrounds are in New York?

How long would it take to bike from Seattle to Los Angeles?

How many mice could you fit in an average-sized SUV?

Obviously, no one knows the answer to these off the cuff (unless your job requires you specifically to cram mice into SUVs, in which case – stop). The goal isn’t to get the right answer exactly. Instead, the goal is to come up with a method that gives you an answer in the right ballpark (say, within one order of magnitude).

But you don’t have a clue about these questions, right? Well, that’s not true. You know it wouldn’t take a year to bike from Seattle to LA, and you know it wouldn’t take 2 hours. You have some kind of sense of the right answer, so it’s just a matter of refining that feeling. You can do this by making estimates about other variables.

Take the first question. You might think something like this to yourself. “Well, there are eight million people in NYC, and the city seems to have a reasonable number of playgrounds (after all, I don’t typically see the playgrounds overflowing, but I don’t see them consistently empty either). Let’s say ten percent of those people are young enough to use a playground. That’s 800,000 young kids. Some of them will almost never use a playground or use one only rarely. Other kids will use them almost every day, and some will be in between. So let’s say there are five visits per year on average per child, which means 4 million playground visits per year. How many visits does the average playground get every day? Well, again it depends. In summer the playgrounds probably have hundreds of visits; in winter or in rain, way fewer. So let’s say 10 visits per day on average, which gives us 3,650 visits per playground per year. So now we divide the total 4 million visits that we expect to happen by the 3,650 visits per playground, which gives 1,095 playgrounds.”
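The chain of guesses above is just a few multiplications and a division, so it’s easy to lay out as a script. This is only a sketch of the reasoning in the paragraph – every number here is one of the rough guesses from the text, not real data:

```python
# Fermi estimate of NYC playgrounds, using the rough guesses from the text.
population = 8_000_000                 # people in NYC
kid_fraction = 0.10                    # guess: share young enough for playgrounds
visits_per_kid_per_year = 5            # guess: average visits per child per year
visits_per_playground_per_day = 10     # guess: averaged over all seasons

total_visits = population * kid_fraction * visits_per_kid_per_year
visits_per_playground_per_year = visits_per_playground_per_day * 365

playgrounds = total_visits / visits_per_playground_per_year
print(int(playgrounds))  # → 1095
```

Writing the estimate out like this makes the structure obvious: each guess is an independent knob, and you can see at a glance how sensitive the final answer is to any one of them.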

How close were we? Surprisingly close! The actual answer is just under a thousand. I have no idea whether my intermediate figures were correct, but it doesn’t much matter if one of them was too high or too low, because I’m just as likely to overestimate as underestimate, so the highs and lows will probably cancel out (this is related to the wisdom-of-crowds effect).
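That cancellation claim can be checked with a quick simulation. The setup below is my own sketch, not from the post: it takes a made-up four-factor estimate, perturbs each factor by up to 2x in either direction (uniform in log space), and asks how far the product tends to land from the truth. Even though the worst case is a 16x error, almost every run stays within one order of magnitude:

```python
import random

random.seed(0)

# Hypothetical four-factor Fermi estimate whose true product is 1000.
true_factors = [8.0, 5.0, 5.0, 5.0]
true_answer = 1000.0

def noisy_estimate(factors):
    """Multiply the factors, each perturbed by up to 2x either way
    (uniform in log space), simulating rough over- and underestimates."""
    est = 1.0
    for f in factors:
        est *= f * 2 ** random.uniform(-1, 1)
    return est

# Error ratio: how many times too big or too small each run comes out.
ratios = []
for _ in range(10_000):
    est = noisy_estimate(true_factors)
    ratios.append(max(est, true_answer) / min(est, true_answer))

within_10x = sum(r <= 10 for r in ratios) / len(ratios)
print(f"worst possible error: 16x; share of runs within 10x: {within_10x:.1%}")
```

The intuition is that for the errors to compound badly, every guess would have to miss in the same direction at once, which is unlikely when the guesses are independent.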

Other than being fun puzzles, and maybe helping you get a job, Fermi problems raise a couple of interesting philosophical issues.

One relates to how many things you know. It might seem like I don’t have a clue how many buses I could fit into Grand Central Station, or how much milk the average person drinks per year, but what this exercise shows is that you actually do have *some* idea about these questions. Some philosophers think that when you believe something, there’s a pattern of neuronal firing in your head that corresponds to that belief. So, if we knew how to interpret the neurons correctly, we could (in theory) crack open your head and take a look. Other philosophers are doubtful about this idea. For example, do you think President Obama knows that elephants don’t have wings? Of course he does. But is there a little bit of Obama’s brain dedicated to storing the idea “elephants don’t have wings”? Surely not – there are an infinite number of such ‘obvious’ beliefs we can ascribe to him, or to anyone. So philosophers nowadays tend to distinguish between ‘occurrent’ and ‘dispositional’ beliefs – occurrent ones are actual neuronal events happening in your brain right now, while dispositional beliefs are ones like the example above – things you know or believe but have never actually thought about. And Fermi problems illustrate just how many of those there are – enough to let us give reasonable answers to a never-ending array of questions.

Fermi problems also show us something interesting about the role played by intuition in decision-making. There’s definitely a deliberate and linear process involved in properly answering Fermi problems – you have to do some careful math – but answering them also relies on some kind of intuitive sense of which values are too big and too small. This kind of cognitive intuition plays a big role in how we assess ideas. If I told you that there were more murders in Brooklyn last year than in the whole of Africa, you’d probably tell me I was full of it. But how would you immediately know that what I said was false? It seems we get a subtle kind of feeling when we encounter new information that helps us decide whether or not it’s likely to be true. Sometimes we can go on to find arguments or reasons to support or undermine those feelings, of course. Daniel Kahneman has suggested this kind of case is an example of ‘System 1’ reasoning – roughly, the set of unconscious processes that go on in the background of our minds, like the stagehands at a theatrical production.

Like many other System 1 processes, this kind of cognitive intuition is very useful in giving us swift answers to problems, but it’s not 100% reliable. Here’s a real-life example: I once heard that the American prison population was by far the largest in the world. I got an instant cognitive intuition that it had to be false. It wasn’t hard to justify: really, bigger than India and China? Bigger than countries with much worse poverty and social problems, like Brazil and Mexico? But this was a case where my cognitive intuition was drastically wrong. And those are precisely the cases we need to be on the lookout for – they usually mean we have something surprising and important to learn.

