COMMENTARY
Bending the rules, or just cheating?
By Shankar Vedantam
Washington Post
Both athletes were stars. Both faltered, then staged dramatic comebacks, displaying the tenacity that separates heroes from also-rans. Both now face drug charges that could end their careers.
After winning five medals at the 2000 Olympic Games but failing miserably at the 2004 Games, sprinter Marion Jones won the 100-meter race at the U.S. National Championships in June. But a urine sample taken there has come back positive for erythropoietin, an endurance booster, according to reports over the weekend.
After falling to 11th place in the 2006 Tour de France, Floyd Landis shot back into contention with a gritty ride in Stage 17 through the Alps. "I was very, very disappointed yesterday for a little while," Landis said at the time. "Today I thought I could show that at least I would keep fighting."
Landis went on to win, but two urine samples collected that day came back positive for artificial testosterone. His title may be stripped once his appeal is reviewed. Commentators are bemoaning what cheating is doing to sports.
Talk about cheating usually carries a high moral tone. In this, it is fair to say, most people are hypocrites.
You and I may never get to ride in the Tour de France, but a great many studies show that most human beings are open to, and extraordinarily adept at, bending moral rules when it is convenient.
Most people report telling lies fairly regularly and being largely untroubled by them. When pressed, people say their lies are innocuous.
Nor can the world be divided cleanly into cheaters and honest people: A variety of ingenious experiments show that large majorities of people can be induced to do the wrong thing, depending on the circumstances.
Among the most potent motivators to cheat is the sense that one has lost the limelight, is falling behind and will be judged harshly. People are also more likely to cheat if they think other people are cheating.
One experiment asked volunteers to perform a simple mechanical task: tracking a rapidly moving light beam with a stylus. Volunteers were told (by someone they thought was another volunteer but was really part of the research team) that it was necessary to cheat to get a decent score.
After five practice trials, all the volunteers were told that they were not doing well and that they needed to make rapid improvement to catch up to the others. Then they were asked to keep track of their scores and were left alone. Volunteers did not know researchers were independently monitoring the scores.
More than three-quarters of the subjects lied about their performance. Volunteers who had done especially poorly on the practice runs seemed more likely to cheat than those who had done well. Carl I. Malinowski, an associate professor of marketing at Pace University in New York, said personality traits, anxiety levels, temptation and situational factors all played roles.
But people who do the wrong thing are fully aware of what they have done, right? Not always.
"We have a whole quiver full of rationalizations," said C. Daniel Batson, a psychologist at the University of Kansas who has closely studied cheating.
Batson does not know what happened in the Tour de France, but he does understand how athletes in general can rationalize a decision to cheat. All they need to do is think of a drug or a steroid as a relatively small offense that is evened out by other factors.
"We're very good at explaining to ourselves why we are doing something," he said. "Maybe I have a cold and I know I am going to underperform. Well, I have trained all this time, and in order to compensate for this misrepresentation in my performance ... "
When Batson asked volunteers to divide up an interesting task and a boring task with another person, most people chose the interesting task and assigned the boring task to the other person. The interesting task carried a bonus of $30.
Batson then handed volunteers a coin, to suggest a more equitable way to divide the tasks. Half the volunteers agreed to flip the coin. But something strange happened when the volunteers were left alone in a room. Whichever way the coin landed, people ended up choosing the better task.
Batson tried labeling the coin, so there would be no ambiguity about what heads and tails stood for. Mysteriously, when people were left alone, the coins still invariably pointed toward giving the boring task to the other person.
Batson wondered what people would do if the unpleasant task was not boring but something painful, like receiving an electric shock. It made absolutely no difference.
Volunteers later told Batson that they would normally have agreed to take on the painful task themselves and spare the other person, but electric shocks were the only kind of pain they just could not handle.
"When you are talking about a moral issue, it is something we feel we ought to do. But the fact we label it as 'moral' means it is probably not something we want to do," Batson said. "So we are in a bind of wanting to do what we don't want to do."
"Moral language is really the language of victims," he said. "We use it more to condemn other people's behavior than we do to motivate our own."