A lot of the problems we cause ourselves – whether as individuals or as a community – arise from the way we've evolved to economise on thinking time by taking mental shortcuts.
We are a thinking animal, but there are two problems. First, we have to make so many thousands of decisions in the course of a day – most of them trivial, such as whether to take another sip of coffee – that there simply isn't enough time to think about more than a few of them.
Second, using our brains to think requires energy, in the form of glucose. But glucose is not in infinite supply. So we've evolved to save energy by minimising the thinking we do.
As Daniel Kahneman – an Israeli-American psychologist who won the Nobel prize in economics for his work with the late Amos Tversky on decision-making – explains in his bestselling Thinking, Fast and Slow, our brains solve these two problems by making all but the biggest, non-urgent decisions unconsciously.
This is Thinking Fast. We don't think about taking another sip of coffee; we just notice ourselves reaching for the cup.
But even when we are Thinking Slow, carefully considering a big decision – such as which house to buy, or whether to marry the person we've been seeing – we still have a tendency to save glucose by relying on what Kahneman and Tversky dubbed "heuristics" – mental shortcuts.
They stressed that our use of such shortcuts is, in general, a good thing. We fall into the habit of jumping to certain conclusions because, most of the time, they give us the right answer while saving brain fuel.
But they don't give us the right answer in every circumstance, and it's the classes of cases where they lead us astray that are most interesting and worth knowing about.
Kahneman and Tversky kicked off a small industry of psychologists thinking up different potentially misleading mental shortcuts and giving them fancy names.
I have a couple of my own I'd like to add to the list.
I call the first one "box labelling" – saving thinking time by consigning things or people to boxes with particular labels.
For example: "I regularly vote Labor/Liberal, therefore I don't have to think about the rights and wrongs of all the policy issues the pollies argue over, but can get my opinion just by checking which side my party's on."
You can see how common this is if you look at those media opinion polls that show you how many people support or oppose a particular policy – say, curbing negative gearing – then break the results down by which party those people would vote for at an election.
Much more often than not, people take their lead on an issue from the position their favoured party takes.
You also see it by watching what happens to the index of consumer confidence when there's a change of government. Almost all those who voted for the losing party switch from optimism to pessimism, while those who voted for the winner switch from pessimism to optimism.
My second mental shortcut is "magic numbers". Experts develop and carefully calculate some economic or financial indicator, based on various assumptions.
The indicator measures changes in something we know is important, so we get used to watching it closely for an indication of how things are going.
Trouble is, we end up putting too much reliance on the indicator, using it as a mental shortcut – a substitute for thinking hard about what's going on.
We turn it into a magic number – a single figure that tells us all we need to know. We use it to inform us about things it wasn't designed to measure.
But, above all, we forget about all the assumptions on which it's built, assumptions that can become inappropriate or misleading without us noticing. That's when our magic numbers hit us on the head.
The American economic historian Barry Eichengreen attributes part of the blame for the global financial crisis to Wall Street's excessive reliance on a financial indicator called "value at risk" or VaR.
As Wikipedia tells us, VaR "estimates how much a set of investments might lose, given normal market conditions, in a set time period such as a day. VaR is typically used by firms and regulators in the financial industry to gauge the amount of assets needed to cover possible losses."
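That definition can be made concrete with a few lines of code. Here's a minimal sketch of one common way a one-day VaR is estimated – the "historical simulation" method, which ranks past daily returns and reads off the loss at a chosen percentile. Everything in it (the simulated returns, the $10 million portfolio, the 95 per cent confidence level) is an illustrative assumption of mine, not anything from Eichengreen's account or any bank's actual model.

```python
import numpy as np

# Illustrative one-day 95% VaR by "historical simulation":
# rank a history of daily returns and read off the loss at the
# chosen percentile. All figures here are made up for illustration.

rng = np.random.default_rng(seed=1)
daily_returns = rng.normal(loc=0.0005, scale=0.01, size=250)  # a year of fake daily returns

portfolio_value = 10_000_000  # a hypothetical $10m book
confidence = 0.95

# The 5th-percentile return is the threshold we expect to breach
# on only 5% of days - provided tomorrow resembles the sampled past.
worst_case_return = np.percentile(daily_returns, (1 - confidence) * 100)
var = -worst_case_return * portfolio_value

print(f"One-day 95% VaR: ${var:,.0f}")
```

Notice the assumption baked into that percentile line: the future will look like the sampled past. That's exactly the "given normal market conditions" caveat – and exactly what gets forgotten when the number becomes magic.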
Eichengreen tells of the banking boss who, late each afternoon, would call for the figure giving the investment bank's VaR. If it fell within a certain range, the banker would go home content. If it was outside the range, he'd stay until he'd done whatever was needed to get it back into range.
The problem was his neglect of the assumptions on which the calculation was based, in particular, "given normal market conditions". Conditions stopped being normal without him realising and – like all its competitors – his bank got into deep trouble.
But the most notorious magic number is gross domestic product, GDP. It was developed by economists during and after World War II to help them manage the macro economy, but has since been widely adopted as the single indicator of economic progress.
Economists know that GDP is good at what it measures, but that it was never designed to be a broader measure of wellbeing. This, however, doesn't stop them treating the ups and downs of GDP as the be-all and end-all of economics, as a substitute for thought.
Another word for this is "bottomlinism" – don't bother me with the details, just give me the bottom line.
But if you never inquire beyond the bottom line, you'll often end up misleading yourself or getting into trouble. That's particularly true of people who hear the words "deficit" and "debt" and immediately assume the worst.
In business, however, the most dangerous magic numbers – the most egregious substitute for the effort of thought – are known as KPIs – key performance indicators.