### Newcomb’s Problem

There are two boxes, red and blue. You are given two options: (1) take both the red and blue boxes; or (2) take only the blue box. The red box always contains $1,000. The blue box contains either $0 or $1,000,000. The contents of the blue box are decided by a being called the Swami. The Swami has accurately predicted the choice of everybody who has faced this decision in the past. You are aware of this, the Swami is aware you are aware of this, and so on. You therefore trust the Swami to accurately predict the choice you will make. Now, here’s how the Swami decides whether or not to put $1,000,000 in the blue box: if the Swami predicts you will choose option 1 (take both boxes), it puts $0 in the blue box; if the Swami predicts you will choose option 2 (take only the blue box), it puts $1,000,000 in the blue box. The order of operations is: (1) the Swami makes a prediction, (2) the Swami either puts $1,000,000 in the blue box or does not, and (3) you make your choice. What do you do?

The boxes can be in one of two states, depending on what the Swami predicts you will do.

| The Swami’s prediction | Red box | Blue box |
| --- | --- | --- |
| You take both boxes | $1,000 | $0 |
| You take only the blue box | $1,000 | $1,000,000 |

Assuming you are interested in maximizing your payout, let’s examine both options. At first you may see only one clear option and not understand why anybody would choose the alternative.

### Take Only the Blue Box

If the Swami is such an accurate predictor, then the smart decision is to take only the blue box. The Swami will have predicted that you would do this, placed $1,000,000 in the blue box, and you will be $1,000,000 richer. Likewise, if you take both boxes, the Swami will have predicted this and not placed the $1,000,000 in the blue box; you will therefore get only $1,000. While it helps to ascribe supernatural talents to the Swami - say it’s a genie, a god, or a superintelligent artificial intelligence - we don’t need to. As long as we know that many people have faced this choice before us, that everyone who chose both boxes ended up with $1,000, and that everyone who chose only the blue box ended up with $1,000,000, we should factor this track record into our decision. Why take the chance of bucking the trend?
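The one-boxer’s reasoning can be put as an expected-value calculation. Here is a minimal sketch, assuming (our addition, not part of the original problem statement) that the Swami predicts correctly with some probability `p`; the original treats the Swami as essentially infallible, i.e. `p` near 1.

```python
def expected_payout(p: float) -> dict[str, float]:
    """Expected payout of each option, given the Swami predicts
    your choice correctly with probability p."""
    # One-box: you get $1,000,000 only when the Swami correctly
    # predicted you would take just the blue box.
    one_box = p * 1_000_000
    # Two-box: you always get the red box's $1,000, plus $1,000,000
    # when the Swami wrongly predicted you would one-box.
    two_box = 1_000 + (1 - p) * 1_000_000
    return {"blue box only": one_box, "both boxes": two_box}

print(expected_payout(0.99))
```

On this accounting, one-boxing comes out ahead whenever `p` exceeds about 0.5005, so even a mediocre predictor would justify taking only the blue box.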

### Take Both Boxes

The Swami places the $1,000,000 (or nothing) in the blue box before you make your choice. Therefore, when it comes time to make a selection, the $1,000,000 is either already in the blue box or it is not. Nothing you do changes the contents of the boxes once the Swami has made its choice. Therefore, you maximize your payout by selecting both boxes. Why? If there is $1,000,000 in the blue box and you take both boxes, you get $1,001,000 instead of the $1,000,000 you would get by selecting just the blue box. If there is no money in the blue box, you get $1,000 for selecting both boxes instead of nothing for selecting just the blue box. Think of it this way: an external observer who can see into both boxes after the Swami has decided will always advise you to select both boxes, because taking both always yields more.
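The dominance argument above can be checked in a few lines. This sketch just encodes the payouts from the text: for each of the two possible (already fixed) states of the blue box, taking both boxes pays exactly $1,000 more.

```python
RED = 1_000  # the red box always contains $1,000

def payout(take_both: bool, blue_contents: int) -> int:
    """Payout given your choice and the already-fixed blue-box contents."""
    return blue_contents + (RED if take_both else 0)

for blue in (0, 1_000_000):  # the two possible fixed states
    both = payout(True, blue)
    blue_only = payout(False, blue)
    assert both == blue_only + RED  # two-boxing dominates by exactly $1,000
    print(f"blue box holds ${blue:>9,}: both=${both:,}, blue only=${blue_only:,}")
```

The catch, of course, is that this treats the blue box’s contents as independent of your choice, which is exactly the premise the one-boxer rejects.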

### The Right Choice

It may comfort you - or not - to know that answers to Newcomb’s paradox are typically pretty split among the general public. In 2016 The Guardian presented a poll to readers, and the results after 31,854 votes were 53.5% for the blue box only and 46.5% for both boxes. Robert Nozick himself claimed to have put the problem to many students and friends, with decisions split almost evenly. It seems that 50 years after Nozick popularized the problem, we still lack consensus on the right answer.

While Newcomb’s problem prompts philosophers to ponder free will and determinism and decision theorists to squabble over the right choice, it also has application in everyday life. Newcomb’s problem presents a scenario where one choice seems obvious at first, but upon closer inspection there are real merits to the other side. It teaches us to keep all perspectives in mind when trying to decipher the world and the way things work.