"Why Does It Seem Like Everything Is Against Me?"

Reality seems to have a negative bias, doesn't it? Could it be true? Is everything really against you?

Reality doesn't really have any bias at all, of course. But there are some ways the nature of reality works against an optimistic point of view. Not always, and not in all things, but it really sticks out when it happens. Why? Because it's usually easier to notice when something goes wrong than when things go right. That's why Murphy's Law has become so popular: It expresses what seems to be true.

Murphy's original law says, "If anything can go wrong, it will." That isn't true, of course, but when you have a project and you're trying to make it go right, and every aspect of it goes right — all the various parts are working well — except one thing, what do you notice? What is your attention on? What causes you to feel intense emotions? What will you remember most? The one thing that went wrong. Most of your attention goes to that single failure, not to everything that went right.

One of my favorite corollaries to Murphy's Law is, "The chance of the buttered side of the bread falling face down is directly proportional to the cost of the carpet." If it lands face up, you may feel a little relief, but you'll probably forget about it soon after it happens. If the buttered bread lands face down, on the other hand, especially on expensive carpet, you may get upset, you'll have to clean it up, and so on. It is more emotionally laden, involves more work and inconvenience and frustration, and is therefore more memorable.

A good description of this phenomenon is in the book, How We Know What Isn't So: The Fallibility of Human Reason in Everyday Life. The author, Thomas Gilovich, says that the entrance to the Psychology Department at Cornell University, where he works, has six doors. Every once in a while, the janitor neglects to unlock one of them. Gilovich enters the building from different directions on different days, and every once in a while, he tries to go through a locked door. Even though he wrote a book on how to avoid being fooled by perceptions like these, it feels to him as if he always gets the locked door. Why? Because every time he goes through an unlocked door, it doesn't register in his mind as an event. Nothing really happened. He went to work. No emotion. Nothing memorable.

But when he expects the door to open and pushes on it and is stopped — that little moment of surprise and frustration is more memorable, and reminds him of every other similar experience when he tried to push open one of those doors and it stopped him. Lots of memories of a similar experience come to mind at that moment, and no memories at all of going through an unlocked door come to mind because there wasn't much to remember about those times (even though there were far more of them).

His natural conclusion, based on his personal experience, is that he always gets the locked door. Notice, also, that he would never find out if one of the other doors was locked on the days he happened to pick an unlocked one. He doesn't try all six doors each day. If the door opens, he goes through and goes on about his business. Reality, in this sense, is biased. Every time he goes through an unlocked door, reality does not alert him to the fact that one of the other doors was locked and he avoided it. But reality alerts him every time he tries to go through a locked door.

Drawing the conclusion, "I always get the locked door," is trivial — it won't make much of a difference. But what if you draw the same kind of conclusion about your boss or one of your children? Your conclusion seems to be a fact you have lots of evidence for. Your boss is always angry. Your son never cleans his room. This phenomenon can lead to the formation of pessimistic, cynical, or defeatist beliefs and ways of looking at the world.

another way reality has a negative bias

Sometimes, no matter what decision is made, the chances are good it'll turn out badly. But since reality is not an experiment, you don't get to see what would have happened if another decision had been made. We don't get to look at a control group.

For example, the U.S. used nuclear weapons on Hiroshima and Nagasaki. Many people believe that was a bad idea. So many lives were lost. So much destruction. It wasn't necessary to end the war.

Another alternative was to use conventional warfare until Japan surrendered, which, given the way they fought on the other islands, and given the fact that they did not believe in surrender, may have cost many more lives on both sides. But we don't get to see how it would have turned out had another decision been made.

As horrible as Hiroshima and Nagasaki were, that decision may have been the least horrible of the alternatives. And if a different decision had been made, no doubt that decision would have been criticized too. Reality was biased — the cards were stacked against the decision-makers — no matter what was decided, it would have terrible consequences.

Sometimes life demands you make a decision between a bad choice and an even worse choice. The result is sometimes a cynical attitude about all the terrible things people have decided to do, formed without considering that we don't know whether the alternatives would have been worse.

Another way reality can appear to have a negative bias is in dealing with others. Psychologists use a game called The Prisoner's Dilemma to test various ideas. The game mimics real life in a way: We need to choose to either cooperate with someone for our mutual benefit or do something that benefits only ourselves, and maybe even takes away from someone else.

For example, Joe steals a stapler from the company he works for. It benefits him and takes something away from the company. But Joe is also taking the risk of getting caught and having a lot taken away from him. The Prisoner's Dilemma has that feature also.

In the Prisoner's Dilemma, a game is set up with two people who imagine they have committed a crime together. They are brought in for questioning in separate rooms. Each is offered the same deal. Let's say you are one of them. Here's the deal: If you confess and your partner does not confess, you go free and your partner gets ten years. If neither of you confesses, you each get only one year. If you both confess, you each get five years. Imagine yourself with that dilemma. What would you choose?
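To make the deal concrete, here is a minimal sketch in Python that simply encodes the sentences described above; the labels and variable names are mine, not part of any standard formulation:

    # Years in prison for (my choice, partner's choice); lower is better.
    years = {
        ("confess", "silent"):  (0, 10),   # I go free; my partner gets ten years
        ("silent",  "confess"): (10, 0),   # the reverse
        ("silent",  "silent"):  (1, 1),    # neither confesses: one year each
        ("confess", "confess"): (5, 5),    # both confess: five years each
    }

    # Whatever my partner does, confessing lowers my own sentence...
    for partner in ("silent", "confess"):
        for me in ("silent", "confess"):
            my_years = years[(me, partner)][0]
            print(f"partner {partner}, me {me}: {my_years} years for me")
    # ...yet if we both follow that logic, we each get five years instead of one.
    # That tension is the whole dilemma.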

This is often considered a way to test cooperative versus competitive attitudes. Usually the people in a given experiment play it again and again with the same partner, because this mimics real life even more closely. Most of the people you interact with are people you will interact with again and again. And how you interact in the future — how competitively or cooperatively you treat any given person — usually does (and should) take into account how they have interacted with you in the past.

One of the findings of these experiments is pretty obvious: If you are a naturally cooperative person, which I'll bet you are, and you are pitted against a competitive person, you will lose. If I put a piece of candy on a table between you and a competitive person, who is likely to get the candy? The competitive person, not trying to be fair, will quickly grab the candy. You will get nothing.

Hopefully you would learn in multiple interactions to be more competitive with that person. But let's look at what has happened. A more beneficial strategy (cooperation) has been replaced with a less beneficial strategy (cut-throat competition). This phenomenon can lead to a cynical or pessimistic view of life.

When you have interactions with the same people over and over, you're likely to see more cooperative behavior. But when interactions are between people who will never see each other again, you're likely to see more cut-throat behavior. This is simple to explain. If you are a natural cooperator, and you let people merge in front of you on the freeway, yield, and stay polite and kind, you'll sometimes run into competitive behavior, so you'll get the short end of the stick again and again. Eventually, because you remember negative events better than positive events, your point of view will change and you may develop the belief that the "world is full of selfish people." And since you need to get to work on time too, you'll exhibit more competitive behavior. As time goes on in an impersonal environment, cooperative people turn into more competitive people out of what feels like self-preservation.
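Here is a small sketch of that repeated play, reusing the years-in-prison numbers from the deal above (lower totals are better). The strategy names, and the "copycat" strategy that remembers what the other person did last time, are my own illustrations rather than anything from a specific experiment:

    YEARS = {
        ("silent",  "silent"):  (1, 1),
        ("silent",  "confess"): (10, 0),
        ("confess", "silent"):  (0, 10),
        ("confess", "confess"): (5, 5),
    }

    def always_cooperate(their_past_moves):
        return "silent"

    def always_compete(their_past_moves):
        return "confess"

    def copycat(their_past_moves):
        # Cooperate at first, then do whatever the other person did last time.
        return their_past_moves[-1] if their_past_moves else "silent"

    def play(strategy_a, strategy_b, rounds=20):
        seen_by_a, seen_by_b = [], []   # what each player has seen the other do
        total_a = total_b = 0
        for _ in range(rounds):
            move_a = strategy_a(seen_by_a)
            move_b = strategy_b(seen_by_b)
            years_a, years_b = YEARS[(move_a, move_b)]
            total_a += years_a
            total_b += years_b
            seen_by_a.append(move_b)
            seen_by_b.append(move_a)
        return total_a, total_b

    print(play(always_cooperate, always_compete))    # (200, 0): the pure cooperator loses badly
    print(play(always_cooperate, always_cooperate))  # (20, 20): mutual cooperation stays cheap
    print(play(copycat, always_compete))             # (105, 95): remembering the past limits the damage

Even in this toy version, a cooperator who never adapts keeps piling up years against a competitor, which is exactly the pull toward competitiveness described above.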

And so you're likely to find people in small towns being "nicer" (more cooperative) to strangers and people in big cities being colder and meaner (more competitive) toward strangers. When people are packed together and forced into impersonal interactions, reality is biased toward the negative, toward selfishness, toward meanness, as cooperative people feel forced into a lower state by the circumstances.

Another aspect of this is that if a person has been convinced to deal with people competitively, their own experiences will tend to validate their belief. If a person is selfish and inconsiderate with others, she is much more likely to experience a world that seems to be full of selfish and inconsiderate people, making it seem obvious that with all these selfish and inconsiderate (competitive) people around, she'd better act selfish just to survive. Her original proposition that the world is a tough place that requires a cold-hearted attitude will make itself true. It functions as a self-fulfilling prophecy.

And one final way reality functions as if it had a negative bias is that the brain seeks evidence to confirm rather than to disconfirm. So as soon as one of these pessimistic, cynical, or defeatist beliefs starts to form, your mind starts looking for evidence that you're right, and the belief coalesces and hardens into a firm, mistaken, unnecessarily negative view of the world: a view that makes you less effective at dealing with the world (especially other people), makes you feel bad more often, and actually harms your health. Reality's quicksand has caught another victim.

perception of reality

Stephen Jay Gould, the famous evolutionary biologist, says the general public tends to believe humans are a violent species. But we are remarkably friendly and kind to each other. He says that when ethologists (people who study wild animals living in their natural environment) see individual animals have only one or two aggressive encounters in tens of hours of observation, they rate the species as peaceful. "But think," he says, "of how many millions of hours we can log for most people on most days without noting anything more threatening than a raised third finger once a week or so."

The problem is, of course, that an act of aggression or violence is supremely noticeable, and normal courteous interactions are not nearly as noticeable. When the lady at the checkout counter is polite, what is there to notice? Does it make your day? Do you remember it later? Do you tell anyone about it?

But what would happen if she insulted you or slapped you? Would you remember it later? You bet you would! Tell anyone about it? Are you kidding?!

There is a natural bias in our perception and memory of reality. It is heavily biased toward the negative. Not for all experiences — obviously, we do remember good events. But for a certain class of experiences, the bias is negative (experiences where the expected event isn't very noticeable and the negative event is very noticeable). This is one very important way pessimism worms its way into your mind.

For example, Gilovich says that at big schools, professors "learn early on that unless they are careful, it is easy to be exposed mainly to the alibis and complaints of the most difficult students and rarely see the more successful and more pleasant students who make teaching so gratifying."

And of course that would be the case. The good students listen in class, so they have fewer dumb questions and fewer problems with the work, and they do their homework, so they don't show up in the professor's office asking for an extension on a due date. They are not nearly as noticeable as the slacker students. Just by the nature of reality and perception, the professor's experience will be biased toward a negative opinion about students in general unless she compensates for it by deliberately trying to notice the good students.

This glitch in reality is a major source of the development of cynical beliefs. Think about how many things function well in government, for example. Thousands upon thousands of things go right every day. But when a senator does something wrong, we hear about it for days or weeks — in the news, in late-night comedians' jokes, in conversations with your co-workers. It is noticeable. It is easily remembered.

When senators do their normal work, what is there to notice? What is reported? Would you ever hear on the news, "A senator today did his job well"? No. It's not newsworthy. You're not going to go around telling all your friends about it. And why not? Because most senators on most days do what they are supposed to be doing, and that just isn't news because it's so normal. And yet the end result of the media magnifying reality's negative bias is that many people have formed a cynical view of the world, of politics, of big business, and of just about everything else — a view that isn't really justified by the facts, but that seems completely justified by the facts, because the only facts about those things that make it to the average person are negative events, which are newsworthy because they are unusual.

promotional distortions

If you were trying to sell something or raise money for a cause, what would be an effective way to get people to hand over their money? One tried-and-true method is to scare people (about something like the safety of their tap water or pesticides in their fresh fruit).

A while back (and for about six months) our mailbox was flooded with requests for donations by lobbying groups, all of them saying the Roe v. Wade decision was threatened by a newly appointed Supreme Court justice (the decision legalized abortion in the U.S. in 1973).

The lobbying groups were all saying the new justice was against abortion, so there was now a “razor-thin” 5-4 majority, and if anything happened, abortion could be made illegal. If you don’t want this to happen, they all said breathlessly, then send us some money to help us fight it. And send it now!

But when I looked into it a little bit, I found that the U.S. Supreme Court had a 6-3 majority in favor of legalized abortion. Hardly "razor-thin." One of those justices in the majority did, in fact, vote against abortion once, but only in a special case (a “partial-birth” procedure using “a particularly grotesque abortion method”). Justice Anthony Kennedy made it very clear at the time why he voted against that very specific procedure. He even made a pointed statement at the time that he had not turned against legalized abortion and that this was a special situation.

In other words, there was no “crisis” in the Supreme Court. The lobbying groups, who rely on donations to stay in business, bent the truth to get donations. And bent the truth in a negative direction, making things seem worse than they actually are. The use of this kind of scare tactic is yet one more way pessimism worms its way into the minds of so many people.

The people responsible for these promotional campaigns are probably not evil. But in the competition for donation funds, who will get more money — those who make us believe a crisis needs our urgent help? Or those who don’t?

Since I was young, I always assumed people working on "noble" causes behaved honorably (and people who are merely trying to make a profit did not). But by a slow accumulation of contrary examples, I finally had to admit that no matter what their mission, everybody has to pay their bills to stay afloat, and if fudging the truth a little helps a good cause, some will do it.

For example, ecology is a noble cause. An ecologist at a university who has to obtain grants to do his research has an incentive to make alarming predictions of future doom. This might make his work seem more urgent than that of other people asking for grant money, so he might win the grant by using this tactic. He might have to bend the truth, but maybe the end justifies the means. After all, at the moment, he has nothing to lose. Nobody can prove he’s wrong until time has passed. By then, he will already have spent his grant money.

In 1970, Life magazine published an article saying, “Scientists have solid experimental and theoretical evidence to support...the following predictions: In a decade, urban dwellers will have to wear gas masks to survive air pollution...by 1985 air pollution will have reduced the amount of sunlight reaching earth by one half...” This was a mainstream magazine.

They had plenty of experts to back up those alarming predictions.

People who have an incentive to alter our points of view — even people working for noble causes — will sometimes use negative, distorted, and even fabricated information. The negative angle sells, it gets on the front page, and it helps recruit followers. That's all fine and dandy for the writers and publishers, but those of us who see those headlines are influenced to see the world more negatively than it really is.

It is not just lobbyists and professors and nonprofit organizations who find it necessary to use negative motivation. It is much bigger than that. Because of our own brain's negative bias, this approach will work better than any alternatives as long as people don't know about it.

But it ceases to work once you're aware of it. If you'd like to help make the world a more positive place, share this information with your friends and family. We don't need to outlaw promotional distortions. A widespread understanding of how it works will make it less profitable to promote pessimism.

Adam Khan is the author of Antivirus For Your Mind: How to Strengthen Your Persistence and Determination and Feel Good More Often and co-author with Klassy Evans of How to Change the Way You Look at Things (in Plain English). Follow his podcast, The Adam Bomb.
