Decision making is integral to trading. Our philosophy emphasizes concepts that help us make more informed decisions. Here are some general examples of those concepts and how they relate to trading.


Review the following table of data and answer this question: What is the relationship between pain and rainy days?

            Rain    No Rain
Pain         14        6
No Pain       7        2

Many people will answer that a positive correlation exists between pain and rainy days. Why? In judging the relationship, many people mistakenly overweight the positive-positive cases (pain/rain) relative to all other cases. As a result, they incorrectly conclude that the two events are positively correlated. In fact, in this example, a slight negative correlation exists. (2/3 of the people feel pain on rainy days, whereas 3/4 of the people feel pain on days with no rain.)
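
To make the arithmetic concrete, here is a minimal sketch in Python (using the counts from the table above) that compares the frequency of pain on rainy days with the frequency on days with no rain:

```python
# Counts from the contingency table above.
pain_rain, no_pain_rain = 14, 7   # rainy days
pain_dry, no_pain_dry = 6, 2      # days with no rain

p_pain_given_rain = pain_rain / (pain_rain + no_pain_rain)  # 14/21 = 2/3
p_pain_given_dry = pain_dry / (pain_dry + no_pain_dry)      # 6/8  = 3/4

print(f"P(pain | rain)    = {p_pain_given_rain:.3f}")  # 0.667
print(f"P(pain | no rain) = {p_pain_given_dry:.3f}")   # 0.750
# Pain is slightly less frequent on rainy days, so the correlation is slightly negative.
```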

This thinking is an example of illusory correlation, a form of confirmation bias in decision making. Searching for, or interpreting, information in a way that supports our preconceptions can lead to a biased outcome. Experiments designed to prove (rather than disprove) a theory can result in erroneous conclusions about the accuracy of the theory. In the example above, the cases where pain or rain is absent are equally relevant to determining the correlation between the two events.

In pricing securities, it is important to find information that supports our estimates of the fair price, but it is also important to search for information that disconfirms our estimates. When we cannot find information that disconfirms our prices, we can begin to believe those prices may be accurate.

For more information about the confirmation bias, please visit the links below:
Confirmation Bias Activity
Confirmation Bias Article

Let’s play a game. Write down the last two digits of your social security number. Do you think the average annual temperature in Chugwater, WY (in degrees Fahrenheit) is higher or lower than this number? Now tell me what you believe the average annual temperature in Chugwater, WY actually is.

How does your answer to the second question relate to the first one? The two numbers are completely unrelated; the last two digits of your social security number should have no influence on your estimate of the temperature in Chugwater.

In tests, however, people's answers to the temperature question are positively correlated with their social security digits: individuals with higher numbers submit higher estimates of the Chugwater temperature. This is an example of anchoring, the error of assessing a number (or probability) relative to a reference point that overly influences (anchors) the estimate.

Anchoring and adjustment, the error of overweighting an initial estimate when forming a later one, can lead traders to set inaccurate prices. Because the starting value is overweighted, adjustments based on new information tend to be too small. This shortcut heuristic is helpful in many settings, but it can lead us astray when someone wants to influence our perception of value.
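
As an illustration only, the stylized sketch below simulates anchoring and adjustment: an estimator starts at an arbitrary anchor and adjusts only part of the way toward a noisy signal of the true value. The true value, noise level, and adjustment fraction are assumptions chosen for the example, not data from any study.

```python
import random

random.seed(0)

TRUE_VALUE = 100.0   # assumed true value of the quantity being estimated
ADJUSTMENT = 0.4     # assumed fraction of the gap closed; < 1 means under-adjustment

def anchored_estimate(anchor: float) -> float:
    signal = TRUE_VALUE + random.gauss(0, 5)        # noisy new information
    return anchor + ADJUSTMENT * (signal - anchor)  # partial move away from the anchor

for anchor in (20, 100, 180):
    estimates = [anchored_estimate(anchor) for _ in range(10_000)]
    print(f"anchor {anchor:3d} -> average estimate {sum(estimates) / len(estimates):6.1f}")
# Higher anchors produce higher estimates even though the true value never changes.
```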

For more information about anchoring, visit Psychology Today.

You flip a coin ten times and the outcome is ten heads in a row. If the coin is fair, is the outcome of the next flip more likely to be heads or tails?

For a fair coin, the answer is that both outcomes are equally likely. Believing that the next flip is more likely to be tails because “tails is due to come up” is known as the gambler’s fallacy, an example of the availability bias.

In general, the availability bias occurs when our estimates of probabilities are influenced by what is most “available” in our memories. These memories are biased; the unusual result of 10 consecutive heads (which had an initial probability of 1 in 1,024 of occurring) could influence our judgment of the likelihood of outcomes on the next flip.
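
The relevant probabilities are easy to check directly; here is a minimal sketch in Python:

```python
# Probability of 10 consecutive heads with a fair coin.
p_ten_heads = 0.5 ** 10
print(f"P(10 heads in a row) = {p_ten_heads}")  # 0.0009765625 = 1/1024

# The flips are independent, so the streak says nothing about flip 11.
p_next_heads = 0.5
p_next_tails = 0.5
print(f"P(heads on flip 11) = {p_next_heads}")
print(f"P(tails on flip 11) = {p_next_tails}")
```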

As traders assess new information, all observations must be appropriately weighted in prices or estimates of probabilities. If traders are unduly influenced by availability bias, the resulting estimates may not be accurate.

The following example is taken from a well-known study by Amos Tversky and Daniel Kahneman, "The Framing of Decisions and the Psychology of Choice," published in Science, Vol. 211, 30 January 1981, pp. 453-458. Read the full article here.

Imagine the country is preparing for a disease outbreak that is expected to kill 600 people. Two alternative programs have been proposed, and it is up to you to choose one:

Program A: 200 people will be saved.
Program B: There is a 1/3 chance that all 600 people will be saved and a 2/3 chance that no one will be saved.

Which program did you choose?

Now consider two alternative programs and choose one:

Program C: 400 people will die.
Program D: There is a 1/3 chance that no one will die and a 2/3 chance that all 600 people will die.

Which program did you choose?

Programs A & C are the same (if 200 people are saved then 400 people die) and Programs B & D are the same. In all four choices, 200 people are expected to live and 400 people are expected to die (B and D have greater variability than A and C).
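
A quick calculation confirms this; the sketch below (in Python) computes the expected number of people saved, and the variance, for each program as described above:

```python
# Each program is a list of (probability, people saved) outcomes.
programs = {
    "A": [(1.0, 200)],             # 200 saved for certain
    "B": [(1/3, 600), (2/3, 0)],   # 1/3 chance all 600 saved, 2/3 chance none saved
    "C": [(1.0, 200)],             # 400 die for certain, i.e. 200 saved
    "D": [(1/3, 600), (2/3, 0)],   # 1/3 chance no one dies, 2/3 chance all 600 die
}

for name, outcomes in programs.items():
    mean = sum(p * saved for p, saved in outcomes)
    var = sum(p * (saved - mean) ** 2 for p, saved in outcomes)
    print(f"Program {name}: expected saved = {mean:.0f}, variance = {var:,.0f}")
# All four programs save 200 people in expectation; B and D simply carry more variability.
```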

However, in choosing between A & B, 72% of the respondents initially given this problem by Tversky and Kahneman chose A, and in choosing between C & D, 78% chose D. If A & C are the same, why did people "switch"?

In this famous experiment, Tversky and Kahneman shed light on the problem of framing: how we ask a question, or frame a problem, can influence the choice we make. In this example, people prefer to "save" 200 people and choose A, but when the same outcome is framed as 400 people "dying," they avoid C and instead choose the riskier course of action, even though the expected outcome is the same.

Traders constantly assess alternatives in making decisions about prices. In evaluating alternatives, it is important to understand that a mistake in framing the problem can lead to choosing an inferior alternative.