"Why Am I So Negative?"

The researcher John Cacioppo showed volunteers positive, negative, and neutral pictures while he recorded the electrical activity of their brains. The positive pictures were designed to give the volunteers pleasant feelings — pictures of a Ferrari or a pizza, for example. The negative pictures, such as a mutilated face or a dead cat, produced unpleasant feelings.

Cacioppo found that the volunteers' brains had more electrical activity when they looked at negative pictures. Their brains reacted more strongly to negative pictures than to positive or neutral ones.

Several different kinds of studies have shown the same thing. In other words, your brain reacts more intensely to negative events than to positive ones. This probably doesn't surprise you, although you might never have considered its implications. Dealing with dangerous, scary, threatening information is fundamental. It's about survival. Life or death. It doesn't get much more fundamental than that.

Researchers at the University of Essex in England found that even mildly anxious people fixated more on threatening images than on other images. The threatening images captured their attention more quickly, and the volunteers had a harder time pulling their attention away from them.

By fixating more strongly on threatening images, a person can conduct more "detailed cognitive processing of potential threats in their environment," as the lead researcher, Elaine Fox, put it. That seems like a useful survival strategy.

Other more general studies on stress show something similar. Stress gives everyone a strong tendency to fixate on unpleasant thoughts and threatening information.

Our minds naturally and quite spontaneously tend to fixate on the negative and overlook the positive — under normal circumstances, and especially under stress.

CAUGHT BY THE NEGATIVE

Coming from a completely different angle, the researcher Mihaly Csikszentmihalyi discovered something along the same lines. He found that when your mind isn't engaged in anything in particular, it tends to drift randomly. Thoughts of all kinds stream through an idle mind.

But eventually, in its random meandering, the mind will think of something negative, and then what happens?

It sticks.

Your mind stops meandering and sticks on the negative thought because negative thoughts are stickier; they hold attention more firmly than positive or neutral thoughts do. We get caught in the worried or angry thought, and it doesn't pass by the way a neutral thought might. We naturally, and even against our will, give threatening information extra attention.

There's more. Because of the way our brains are constructed, we make certain kinds of mistakes. For example, human brains like yours and mine tend to overgeneralize and to see the world in black-or-white, all-or-nothing terms. And unless we have trained ourselves to avoid it, we also have a tendency to draw conclusions too quickly. These are naturally occurring mistakes, the kind of errors every brain is prone to make.

In a way, these "mistakes" are simply the side-effects of a well-functioning, incredibly capable brain. Let me explain what I mean by that.

SEEING PATTERNS

Researchers at Duke University Medical Center hooked people up to a high-resolution functional MRI machine (to track blood flow in the brain) and flashed pictures in front of them. Each picture showed either a square or a circle. The volunteers were asked to push a button with their right hand when they saw a square and a button with their left hand when they saw a circle.

The squares and circles were presented in a random order, but of course short patterns would sometimes emerge — a string of all squares, for example, or alternation between a square and a circle for several cycles.

Their brains reacted when one of these short patterns ended. Their brains automatically detected and generalized patterns, and very quickly. The volunteers were given no reward for detecting patterns. They were not asked to detect patterns. In fact, they were told the pictures would be flashed randomly. Yet still, without any effort on their part, their brains saw patterns in the random events and generalized — began to expect what the next picture would be. In earlier, similar studies measuring reaction time, volunteers responded more slowly when an expected pattern was broken, revealing that they were detecting patterns automatically.

Your brain is predisposed to generalize. It automatically tries to see patterns. And for the most part, our ability to generalize is a good thing. Many moons ago, Ignaz Semmelweis noticed that when a doctor performed a dissection and then assisted at a birth, the new mother had a tendency to get childbed fever. He was able to detect a pattern and make a generalization. His ability to generalize led to the practice of antisepsis and sterilization, preventing millions of unnecessary deaths over time.

Charles Darwin saw a pattern that governs the evolution of all life on earth. Quite a generalization! From that single generalization came new understandings of disease that greatly improved the effectiveness of doctors. In fact, whole new sciences have grown out of it.

What I'm trying to say is: The mistakes our brains tend to make (like overgeneralizing) are the inevitable secondary results of our great intelligence.

Your ability to recognize a face comes from your brain's ability to complete a pattern with minimal clues. It has long been exceedingly challenging to create computers that can do it, and yet your brain recognizes faces without any effort on your part. Your brain is so good at completing a pattern that, even in dim light — even if you can only see half of the face — you recognize immediately who it is.

But this amazing ability also sometimes causes us to see patterns that don't really exist. We see a man in the moon. We see a horse in the clouds. We see the Big Dipper, the Little Dipper, Orion's Belt. Our brains can take the scantest clues and see a pattern, without us making even the smallest effort to do so.

But especially given our brains' bias toward negativity, we also see patterns that create pessimism, cynicism, and defeatism — patterns our brains have created out of minimal clues — patterns that don't actually exist.

I used to work with a woman who had two failed marriages and concluded, "All men are pigs." From only two examples, she created a generalization that included three billion men! Her cynicism, her unwillingness to allow any man to get close to her, was the side-effect of two things our brains naturally do: 1) see a pattern from minimal clues, and 2) look for evidence that confirms an already-existing conclusion.

SEEKING EVIDENCE

Once you have concluded something, you have a strong tendency to notice evidence that supports your conclusion and to explain away or ignore information that contradicts it, not only in your immediate perception, which is bad enough, but also in your memory.

In an experiment, for example, volunteers were asked to read a story about a woman. Let's call her Clare. Two days later, half the volunteers were asked to recall the story and decide how well suited Clare was for a career as a real estate agent. The other half were asked to rate her suitability for a job as a librarian. Both groups were asked to remember examples of Clare's introversion and extroversion.

The volunteers looking at her ability as a real estate agent remembered more examples of Clare's extroversion.

Those assessing her ability as a librarian recalled more instances of Clare's introversion.

The volunteers were not asked to bias their data. They had no stake in the matter. They weren't rewarded for answering one way or another. But that's what human brains do. Your brain naturally and automatically looks at the world, and at your own memory, as if it were trying to confirm whatever conclusions you've already drawn.

This is not to say you are the helpless victim of your brain's natural functioning. You can do something about it. But here we're looking at how the virus of negativity can enter the system. We're asking the question: "At what points are we vulnerable to infection?" How do otherwise healthy, reasonable people become pessimistic, cynical, and defeatist? One way is through the natural mistakes human brains are prone to make, combined with the brain's negative bias.

Another mistake our brains make is jumping to conclusions too quickly. You can see how this makes the other mistakes all the worse. You might see a pattern that doesn't really exist, overgeneralize about it, and form a conclusion so fast you don't even know you're doing it. Then you hold onto your conclusion, and even defend it, when you get evidence against it.

I'm sure you don't do that often, or at least not as often as other people you know. Why? Because you have trained yourself (or have been trained) to deliberately prevent yourself from doing what your brain does naturally. But even so, it is very likely you still make those mistakes, no matter how careful you are. Consider the following examples of experts making these mistakes in their own fields.

Before trains could go very fast, experts in Germany predicted that if trains went faster than 24 miles per hour, people would get severe nosebleeds. Experts in the United States predicted people would go insane when they saw a train for the first time. The speed and the noise would be just too much for people to handle.

Many experts in the ship-building business were quite sure that a ship made out of iron couldn't possibly float.

In 1943, Thomas Watson, the chairman of IBM, is reported to have said, "I think there is a world market for maybe five computers."

In 1929, Irving Fisher, an economics professor at Yale, said, "Stock prices have reached what looks like a permanently high plateau." Days later, the market crashed, and the Great Depression followed.

In 1872, Pierre Pachet, professor of physiology at Toulouse, said, "Louis Pasteur's theory of germs is ridiculous fiction."

In 1981, Bill Gates is reported to have said, "640K ought to be enough for anybody."

There is no end to examples like these. I've got a file full of them. These experts had perfectly normal human brains, maybe even better than average, and yet they still jumped to conclusions in their own fields of expertise.

Their brains did it, and so do yours and mine.

Let's recap. Human brains react more strongly to negative information than to positive. They make certain kinds of mistakes in the way they process information: overgeneralizing, seeing things in black-or-white terms, seeking to confirm conclusions they have already formed, and jumping to conclusions quickly and easily, even when they don't yet know enough to decide.

And because the brain is already biased toward the negative, those cognitive mistakes tend to be made in the direction of pessimism, cynicism, and defeatism.

A form of therapy called cognitive therapy has sprung up to deal directly with this phenomenon. A cognitive therapist tries to root out the mistakes clients make in their thinking, mistakes that cause or sustain the clients' depression or anxiety. The therapy is simple, straightforward, and short-term, and yet it has proven surprisingly effective. Cognitive therapy is the most thoroughly researched form of therapy, and in comparisons with other forms it has come out ahead, in both independent evaluations and clients' own assessments. Simple and straightforward as it is, it is one of the most effective therapies we have.

If you were a client, the most important thing a cognitive therapist would do for you is undermine your confidence in your mistaken conclusions. Overconfidence in our own conclusions is one of the worst mistakes we naturally make. We have a natural propensity, built into the brain, to draw conclusions from insufficient evidence, to hold those conclusions with excessive confidence, and to defend them with unjustified ardor.

A TASTE OF THE TILL

Here's a good example of holding conclusions with too much certainty. When John Patterson founded the National Cash Register Company, now known as NCR, almost no stores used cash registers. But most store owners had a pilfering problem. In those days, a "taste of the till" was an accepted part of the wages for a clerk or bartender, much like waiters' tips are today. With no way to keep tabs on what was actually being sold, it was easy for an employee to pocket some of the money without anyone knowing.

One of the biggest benefits Patterson pitched, the one he thought would make every store owner jump at the opportunity, was that his machine could eliminate pilfering. The cash register would not open until a sale was rung up, and everything rung up was printed on a little spool of paper inside the machine. The only person with a key to that spool was the owner. Voilà! The owner could prevent employees from stealing the profits.

But Patterson ran into a deep-seated pessimism. Owners were quite sure a machine could never stop what they perceived to be human nature. Petty theft was accepted as inevitable.

It was a classic case of defeatism, and very hard for Patterson's salespeople to overcome. The owners had concluded "that's just the way people are," and they held onto their conclusion with far too much confidence.

The conclusion was wrong, as many (if not most) pessimistic conclusions are. When the machines were put into service, they did actually cut down on pilfering and more than paid for themselves in savings.

This kind of overconfidence is nothing new. It's a common feature of history. You could almost write history by telling the story of beliefs that people through the ages have held with excessive confidence, only to have them proven wrong. The earth is flat. The sun revolves around the earth. A man will never walk on the moon.

People throughout history — experts, people who should know better — have made statements with certainty when they really weren't certain at all. They felt certain, but feeling certain has nothing to do with being correct.

A common phrase in the 1930s and '40s was "when the kid next door walks on the moon." People said it when they meant "It'll never happen." It was an example of widespread defeatism — a certainty about a pessimistic conclusion that wasn't justified.

A story circulated around the internet a few years ago about Neil Armstrong. Maybe you've read it. The story goes that when Armstrong first stepped on the moon and said, "One small step for man, one giant leap for mankind," he had made some other remarks, including, "Good luck, Mr. Gorsky."

Nobody knew who Armstrong was referring to, and when anyone ever asked him, he just smiled. Twenty-six years later, after giving a speech, someone brought it up again and Armstrong said, "Well, Mr. Gorsky has passed away, so I guess it's okay to answer that question. When I was a kid, I was playing baseball in my backyard and when I chased a ball to a place under my neighbors' window, I overheard Mrs. Gorsky shouting at Mr. Gorsky, 'Oral sex? You want oral sex!? You'll get oral sex when the kid next door walks on the moon!'"

The story isn't true, but it is believable because anyone growing up in the forties knows that was a common expression. A man walk on the moon? Yeah, right. It'll never happen. And everyone knew it would never happen. They had concluded it with excessive confidence.

When you do that with something petty, it's not a big deal. But when you do it with your ambitions, it can cost you quite a bit.

SELF-FULFILLING PROPHECIES

Of course, I'm not going to leave you hanging. There are things you can do about your brain's natural negative bias. You probably do some of them already. We'll get to that a little later.

But a pessimist, even if he found out what he could do, might automatically think, "It'll take too much work," or "That's just the human condition," or "I'll never be able to change it. I'm not persistent enough. I have no willpower," and so on. All those thoughts will stop a pessimist from trying to change, and of course they are all more of the same: pessimistic conclusions declared with far too much certainty.

The tendency to draw negative conclusions and then see the world through those conclusions can often create self-fulfilling prophecies. Pessimistic, cynical, and defeatist conclusions can make themselves come true.

For example, a waiter gets three lousy tips in a row and thinks, "All my customers tonight are bad tippers." Three bad tippers in a row is not statistically unusual in a random sample, but the waiter's brain sees a pattern, overgeneralizes, draws a conclusion, and becomes completely convinced of it.
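If you want to check the arithmetic, here's a rough sketch in Python. The numbers in it (a one-in-four chance that any given customer tips badly, and twenty customers a night) are purely illustrative assumptions, not data from any study; the point is only that short streaks show up in random sequences far more often than our pattern-seeking brains expect.

    import random

    # Illustrative assumptions only: a 1-in-4 chance that any customer tips
    # badly, independent of the service, and 20 customers per night.
    P_BAD_TIP = 0.25
    CUSTOMERS_PER_NIGHT = 20
    NIGHTS = 100_000

    def has_three_bad_in_a_row():
        """Simulate one night; return True if three consecutive bad tips occur."""
        streak = 0
        for _ in range(CUSTOMERS_PER_NIGHT):
            if random.random() < P_BAD_TIP:
                streak += 1
                if streak == 3:
                    return True
            else:
                streak = 0
        return False

    hits = sum(has_three_bad_in_a_row() for _ in range(NIGHTS))
    print(f"Nights with three bad tips in a row: {hits / NIGHTS:.1%}")

With those made-up numbers, the streak shows up on roughly one night in five, which says nothing at all about "all my customers tonight."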

So what does he do? He gives up the fight. He becomes pessimistic, defeated, cynical, at least for the rest of the night. He doesn't try to give good service because it doesn't matter. He's going to get a lousy tip no matter what he does. Why try?

And sure enough, people are not at all impressed with his halfhearted service and tip him badly. His negative conclusion has become a reality, and he brought it about himself.

There is an old joke that reveals an understanding of this principle. One day a man gets a flat tire on a remote road and discovers he doesn't have a jack. He figures they might have one at a farmhouse he sees up the road, so he starts walking toward it.

As he walks, he starts thinking to himself, and his ruminations are biased toward the negative. "They'll probably be suspicious of a stranger out here in the middle of nowhere, and they won't answer the door. Then I'll have to walk another mile to the next place, and they won't have a jack. When I eventually find someone who answers the door and has a jack, they'll make me leave my wallet or something so I don't run off with their jack. What's the matter with these people? Can't they help their fellow man without making him jump through hoops or thinking the worst of him?!"

By the time he reaches the first house, his ruminations have built up into an indignant anger. A woman answers the door and says, "Can I help you?"

He yells, "I wouldn't take your help if you begged me! And you can keep your stupid jack!"

The brain's natural negative bias, combined with its built-in tendency to jump to negative conclusions quickly and feel certain about them, sometimes produces self-fulfilling prophecies. That is one way pessimism can worm its way into your mind.

Adam Khan is the author of Antivirus For Your Mind: How to Strengthen Your Persistence and Determination and Feel Good More Often and co-author with Klassy Evans of How to Change the Way You Look at Things (in Plain English). Follow his podcast, The Adam Bomb.
