
On Making Decisions

0. Key points

1. Why care about correct beliefs?

You probably have some things in life you want to achieve, things you prefer to happen over other things. You probably want to stay healthy, have good relationships, and succeed in whatever is important to you, depending on your Values. To achieve those things, you are making decisions all the time: small decisions in your everyday life, up to bigger decisions that change your life trajectory all at once. Depending on the decision you take, you might end up in a preferable world state, or in one that is less preferable. Obviously, making the right decision is better for you, if you care about reaching your goals and living out your values. Making good decisions means shifting the probability of ending up in a desired world state upwards.

For that, we need beliefs about the world that are as correct as possible. Our decisions are based on our beliefs about the world. If I believe that sugar is healthy and that there is nothing wrong with smoking cigarettes, I might choose to base my diet on sweets and end up smoking a pack a day, which would make me more likely to develop health issues. Fair, these two statements are obviously false. But sometimes even subtle differences in our beliefs about the world can have quite noticeable impacts on where we end up. Whether I believe in the potential for development (growth mindset) or in fixed traits (fixed mindset) affects how I approach new challenges, how I learn and grow. In the long run, this difference in belief about oneself can have quite an impact on my decisions and subsequently on how my life plays out.

Beliefs are hard

Sadly, having correct beliefs about the world is not as straightforward as applying Bayes' theorem.

We humans run on faulty and corrupted hardware, and we are far from an ideal Bayesian agent that updates the right amount according to new evidence. No, we use heuristics and cognitive moves that are far from ideal. We tend to believe things we hear without thinking about them first, especially when they are embedded in a narrative (conspiracy theories, health advice from Instagram, prejudices, the Great Wall of China being visible from space, …). We use Motivated Reasoning, a style of thinking that starts with the conclusion and then looks for arguments that support it. The conclusion is chosen not by virtue of being true, but mostly for other reasons, e.g. because it signals our affiliation with a tribe (ahem ahem all of political discussion ahem).

The world is an uncertain place. Before actually happening, a thing has a probability of happening, alongside other things that could happen. Our beliefs about the world should reflect this by being probabilistic - we cannot be sure about what is the case. But we tend to think in black and white: either a thing happens / is true, or it is not. This all-or-nothing thinking, this thinking in only 100% or 0%, makes it hard for us to update on new evidence, because the only move available is from completely right to completely wrong. And being wrong feels bad [CITATION NEEDED], so we tend not to update at all and stick to our initial guess. Probabilistic thinking, on the other hand, would let us update a little on each new piece of evidence, reflecting the world more faithfully.
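To make this concrete, here is a minimal sketch of what "updating a little on each piece of evidence" looks like, using Bayes' rule. All the numbers (the 50% prior, the 60%/40% likelihoods) are made up for illustration:

```python
# Incremental Bayesian updating on a single yes/no hypothesis.
# All probabilities here are invented for the example.

def update(prior: float, p_evidence_if_true: float, p_evidence_if_false: float) -> float:
    """Return the posterior P(hypothesis | evidence) via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

belief = 0.5  # start undecided: 50% that the hypothesis is true

# Three pieces of weak supporting evidence, each only slightly more
# likely if the hypothesis is true (60%) than if it is false (40%).
for _ in range(3):
    belief = update(belief, 0.6, 0.4)
    print(f"belief is now {belief:.2f}")  # 0.60, then 0.69, then 0.77
```

Each observation nudges the belief a bit instead of flipping it between 0% and 100% - exactly the move that all-or-nothing thinking does not allow.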

Decisions and Updating are hard

But wait, there is more!

Evolution gave us quite a few more heuristics that were probably useful for survival back then, but are not useful if we want to find truth today. We can identify many phenomena that distort our beliefs and our decision making.

Shit we humans do:

All this is of course only a selection of everything that makes us less than optimal decision makers. But even knowing them all doesn't seem to help that much in itself. Being smarter could even be a problem if left unchecked: per the bias blind spot, we default to recognizing the faults in others' thinking, but not in our own.

So, is there something we can do about that?

2. Are we screwed?

Short answer: Probably, but we can try.

This is not a good predicament to find oneself in - we want to make good decisions to achieve our goals, but our brains are working against us, thanks to evolutionarily trained-in behavioral patterns that stopped working when we changed our environment to one that does not resemble our ancestral one in the slightest. So we have to use some tools to help us work against these default patterns.

The first tool is a perspective, a framework to view decisions through: viewing them as bets. It's what you do all the time anyway: basically, a decision is a bet on a future outcome. You are betting that the decision you take is the one most likely to lead to an outcome you like. It helps to make this explicit and really ask yourself: "(what) would you bet on it?" - this prompts you to think about the beliefs your decision is founded on, and it helps you think in probabilities.
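As an illustration, here is a small sketch that makes the bet explicit by comparing options via expected value. The options, probabilities, and payoffs are all invented for the example:

```python
# Treating a decision as a bet: compare options by expected value.
# Options, probabilities, and payoffs are made up for illustration.

options = {
    # option: list of (probability, payoff) pairs; probabilities sum to 1
    "take the new job":   [(0.6, +50), (0.4, -20)],
    "stay where you are": [(0.9, +10), (0.1, -5)],
}

for name, outcomes in options.items():
    assert abs(sum(p for p, _ in outcomes) - 1.0) < 1e-9
    expected_value = sum(p * payoff for p, payoff in outcomes)
    print(f"{name}: expected value = {expected_value:+.1f}")

# take the new job:   0.6 * 50 - 0.4 * 20 = +22.0
# stay where you are: 0.9 * 10 - 0.1 * 5  = +8.5
```

The point is not the arithmetic but that writing the bet down forces you to state the probabilities and stakes you were implicitly assuming - which is exactly where others can then hold you accountable.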

This works best with other people. If you do it by yourself, it is easy to weasel out and skip the "actually thinking about it" part, because there is no one to hold you accountable, and you are pretty good at convincing yourself that you did the work when you did not. With other people around, this is not so easy. Your social brain is a really powerful motivator (this is why Focusmate works), so it would be wise to use it to your advantage. Having other people to reflect on your decisions and beliefs with is probably the best you can do, conditional on you all getting the social dynamics right.

Building a self-correcting epistemic community

Ideally, we'd want a community, a group, a culture with norms that reward being open and intellectually honest. We humans crave approval. We usually end up in echo chambers, where your opinion is the same as the group's and everyone pats everyone else on the back for having such good opinions and views. If we don't want to end up there, we have to base our approval on accuracy and intellectual honesty. Reinforcing each other for sharing our true viewpoints in all possible detail leads to a diversity of viewpoints and better arguments that you can then run your model against to make it stronger (steelmanning).

We don't have to invent this Alcoholics Anonymous for truth-seeking from scratch; we can look to existing work. Robert Merton formulated some principles for his ideal-type model of a self-correcting epistemic community. His CUDOS norms are:

- Communism (communalism): findings belong to the community; share them openly instead of hoarding them.
- Universalism: evaluate claims by impersonal criteria, no matter who makes them.
- Disinterestedness: act for the benefit of the common enterprise rather than for personal gain.
- Organized Skepticism: subject every claim to structured critical scrutiny before accepting it.

Finding a group where you can establish these rules seems like one of the most useful things to do if you value having true beliefs and making good decisions. Some patterns and other behavioral tools I would like to see in such a group:

It seems that the process of building a group/community like this has to be deliberate. Especially in the beginning, you would have to make a conscious effort to stick to these rules, to act according to these principles instead of running on autopilot. Building a collective epistemic immune system probably takes some time and effort.

More tools to use

Some other useful tools and frames you might want to use to work around your biases:

Possible Problems

We don't want to end up in a community where everything is about criticizing each other, tearing each other down before we have built anything. Criticism that is only destructive is not useful either. We might need to kill Socrates. Learn to recognize these patterns of unhelpful criticism and to differentiate them from actually helpful criticism.

3. Resources

