Another long weekend, another personal question: how honest are you? According to the people who study these things, not as honest as you think you are.
In one experiment, people were asked to solve puzzles and were paid a set amount for each puzzle they solved. Some participants were told to check their answers against an answer sheet, count the number of questions they'd answered correctly, put their answer form through a shredder, then report the number they got right to the experimenter and receive the money they had earned.
A second group had to report how many they got right without shredding their answers first. The shredders, whose claims couldn't be checked, claimed to have got significantly more correct than the second group.
Those who cheated probably counted a problem they would have answered correctly if only they hadn't made a careless mistake. Or they counted a problem they would have got right if only they'd had another 10 seconds.
In other words, they didn't tell blatant lies; they just gave themselves the benefit of any doubt and bent the rules a little in their own favour. And get this: they wouldn't have thought they were cheating.
When subjects are asked to rate how ethical they are compared with other people on a scale of 0-100, where 50 is average, the average rating is usually about 75. That is, almost all of us consider ourselves to be more ethical than other people.
Clearly, that's not possible. In their book, Blind Spots, Max Bazerman, a professor of business administration at Harvard Business School, and Ann Tenbrunsel, a professor of business ethics at the University of Notre Dame, say most of us behave ethically most of the time.
Even so, most of us overestimate our ethicality relative to others. We're unaware of the gap between how ethical we think we are and how ethical we actually are. We suffer from blind spots.
Bazerman and Tenbrunsel are exponents of the emerging field of ''behavioural ethics'' - the study of how people actually behave when confronted with ethical dilemmas. They say our ethical behaviour is often inconsistent and, at times, even hypocritical.
''People have the innate ability to maintain a belief while acting contrary to it,'' they say. ''Moral hypocrisy occurs when individuals' evaluations of their own moral transgressions differ substantially from their evaluations of the same transgressions committed by others.''
Hypocrisy is part of the human condition; we're all guilty of it. So you could say that accusing someone else of hypocrisy is itself a hypocritical act.
Some people are consciously, deliberately unethical. But Bazerman and Tenbrunsel stress their interest is in unintentional ethical misbehaviour. How can we behave unethically and not realise it?
We suffer from ''bounded ethicality'' because we suffer from ''bounded awareness'' - the common tendency to exclude important and relevant information from our decisions by placing arbitrary and dysfunctional boundaries around our definition of a problem.
One way we limit our awareness is by making decisions on the basis of the information that's immediately available to us - perhaps information someone has presented to us - rather than asking what information would be relevant to making the best decision, including other aspects of the situation and other people affected by it.
An organisation's ethical gap is more than just the sum of the ethical gaps of its individual employees, the authors say. Group work, the building block of organisations, creates additional ethical gaps.
Groupthink - the tendency for cohesive groups to avoid a realistic appraisal of alternative courses of action in favour of unanimity - can prevent groups from challenging questionable decisions.
And functional boundaries can prevent individuals from viewing a problem as an ethical one. Organisations often allocate different aspects of a decision to different parts of the organisation.
''As a result, the typical ethical dilemma tends to be viewed as an engineering, marketing or financial problem, even when the ethical relevance is obvious to other groups,'' the authors say. So everyone can avoid coming to grips with the ethical issue by assuming someone else is dealing with it.
Now consider this. You're a 55-year-old and have just been diagnosed with early-stage cancer. You consult a surgeon, who wants to operate to try to remove the cancer. You consult a radiologist who recommends blasting the cancer with radiation. You consult a homeopathic doctor who believes you should use less intrusive medicine and wait to see how the cancer develops.
Many of us would assume each specialist is lying so as to drum up business. But it's actually more complicated. Each person genuinely believes their treatment to be superior, but they fail to recognise their beliefs are biased in a self-serving manner.
They don't realise their training, incentives and preferences prevent them from offering objective advice. They just don't realise they're facing an ethical dilemma. They don't see they face a conflict of interest because they view conflicts of interest as problems of intentional corruption.
Bounded ethicality occurs because our cognitive limitations - the limitations of the way our brains work - leave us unaware of the moral implications of our decisions. Aspects of everyday work life - including goals, rewards, compliance systems and informal pressures - contribute to ''ethical fading,'' a process by which ethical dimensions are eliminated from a decision.
It's common for decisions at work to be classified as a ''business decision'' rather than an ''ethical decision,'' thus increasing the likelihood we will behave unethically.
Sometimes differences in language allow ethical fading. Albert Speer, one of Hitler's ministers and trusted advisers, admitted after the war that by labelling himself an ''administrator'' of Hitler's plan he convinced himself that issues relating to the treatment of people were not part of his job.
Why does the way we classify decisions matter? Because classification often affects the decisions that follow. When we fail to recognise a decision as an ethical one, whether due to our own cognitive limitations or because external forces cause ethical fading, this failure could well affect how we analyse the decision and steer us towards unintended, unethical behaviour.
Why do we predict we will behave one way and then behave another way, over and over throughout our lives? General principles and attitudes drive our predictions; we see the forest but not the trees. As the situation approaches, however, we begin to see the trees and the forest disappears.
Our behaviour is driven by details, not abstract principles.