QUICK TIPS FOR GETTING TO GRIPS WITH A WORLD IN HYPER-DRIVE

Commit to becoming an Empowered Decision-Maker.

See the Ten Steps as the tool kit to help you in this quest. The Tips that follow will get you thinking – they will get you started on your journey. We will develop them further as the book progresses.

Become aware that we have to make over 10,000 decisions a day. Begin to think about which of these you actually need to make, and whether you are really prioritising the important ones.

Start thinking about how it is you make decisions. Do you consult others and gather a range of opinions, or do you take everything upon yourself? Do you make decisions quickly? Or do you tend to want to mull options over first?

Start noting who or what typically influences the choices that you make.

Acknowledge that smart decision-making needs time and space. Begin to think about how you can reshape your environment to achieve this. How can you limit distractions and disruptions? Can you take technology-free Sabbaths? Can you initiate a new policy at work that limits who is cc’d on emails and under which circumstances?

Start considering how you have typically viewed experts and conventional big-hitters until now. Do you tend to accept their views and ideas without question?

Ask yourself more generally who it is you trust and why.

Contemplate your current strategies for dealing with the digital deluge. Think about the short cuts you take, and how you determine which information to base your decisions on. Begin thinking about how these may be impacting on the quality of your thinking.

KEEP YOUR EYES WIDE OPEN

STEP TWO
See the Tiger and the Snake
The Tiger and the Snake

In 2005, the prominent American cognitive psychologist Professor Richard Nisbett began an extraordinary experiment.

After some careful planning, he showed a group of American students and a group of Chinese students a set of images for just three seconds each. The images were pretty varied: a plane in the sky, a tiger in a forest, a car on the road – you get the idea.

How would the American and Chinese students view these images, the Professor wondered. Would they see them differently? Would they see the same things? If there was a snake on the ground, say, would the Americans or the Chinese notice it?1

The Professor’s methods were clinical. Upon entering the room, each student was placed in a chair, with their chin in a chin rest at a distance of precisely 52.8cm from the screen. They were then strapped into 120Hz head-mounted eye-movement trackers. Nisbett could track every squint, glance or flicker of attention.

The differences between the two sets of students were immediately apparent.

The Americans focused on the focal object: the plane, the tiger, the car. They pretty much fixated on these, and barely looked at the background.

The Chinese, on the other hand, took longer to focus on the focal object – 118 milliseconds longer. And once they had done that, their eyes continued to dart around the image. They took in the sand, the sunlight, the mountains, the clouds, the leaves.

So if there was a snake on the ground behind the tiger, it would be the Chinese, not the Americans, who would see it.

All That Glitters …

In a complex world of hidden dangers and fleeting opportunities we all need to be able to see snakes as well as tigers.

We have to understand that the picture we see at first may not give us all the information we need to make the best possible decision. We need to learn to see beyond what is obvious, beyond what we are culturally or conventionally attuned to focus on.

Of course, this doesn’t mean that we shouldn’t ever act until we’ve gathered every single piece of information out there that may be relevant to our decision. That would be excessively time-consuming, and our brains wouldn’t be able to cope with all that data anyway.2

But what the tiger-and-the-snake experiment tells us is that the information we’re most prone to focus on may only give us a very partial story, a fragment of the truth, and therefore risks misleading us. Being aware of this, and adjusting accordingly, will make a profound difference to the decisions you make.

Take internet dating. Studies show that men are most attracted to photos of women with large eyes, a big smile and high cheekbones.3 Glossy hair, full lips and smooth skin are also a big draw.4 It’s also the case that women who describe themselves as ‘voluptuous’ or ‘portly’, or ‘large but shapely’, are contacted far less online than women who are slightly underweight.5

That’s how it goes with a lot of male browsers. What about women? Well, women seem to focus more on height. Men listed as between six foot three and six foot four receive about 60 per cent more first-contact emails than men in the five foot seven to five foot eight category.

But are superficial features such as the fullness of a woman’s lips or a man’s height really the best things to focus on if you’re looking to choose the right partner?

The answer, unsurprisingly, is a categorical ‘no’. Studies of successful long-term relationships point to less superficial qualities such as sense of humour, shared interests and common values as being much better indicators of whether a couple are well matched.6 Deep down, most of us know this, yet the majority of online daters only focus on one part of the picture.

Politicians, economists and investors can easily fall into the same trap.

They tend to focus overwhelmingly on economic growth as the ultimate indicator of how well a country is doing. Yet growth indicators tell us nothing about how that growth was achieved.

If you cut all your trees down to sell the wood for kindling, that would be good for growth; but would it really be a measure of how successful you were as a nation? Clearly not. Think of all the problems that the destruction of the rainforests has caused to local habitats as well as its contribution to global warming. Then add to that the very significant associated future economic costs of this kind of short-term thinking.

‘Growth’ also does not tell us anything about how spoils are shared, about inequality levels, or gender differences, or the rumblings of discontent that can mutate into revolution. It yields, in reality, relatively little information on how a country is actually faring, or will fare in the future. In Russia, the economy has been growing at the same time as life expectancy has been falling.7 In the United States, during the noughties there was growth, but at the same time median family income declined, and there was no net job creation.8 So if you are a prime minister or an investor, or just a concerned citizen, the modern economic obsession with digits of growth can end up being the tiger that means you don’t see the snake creeping up on you on the forest floor.

Think also of the doctor who no longer carries out a physical examination or takes a detailed case history, but instead focuses overwhelmingly on blood tests or scan results – you’ve probably come across the type.

While blood tests and scans can of course yield important, often life-saving information, in many cases these static findings cannot on their own determine what is wrong with you. The best doctors are still those who listen to you, touch you and try to establish the individual characteristics of your situation. Such doctors use the static readings of your test results as potential clues to what’s wrong, but not as the sole way of defining and diagnosing you.9 As Dr Athina Tatsioni, of the prestigious Tufts University School of Medicine, remarks:

Usually what happens is that the doctor will ask for a suite of biochemical tests – liver fat, pancreas function, and so on … The tests could turn up something, but they’re probably irrelevant. Just having a good talk with the patient and getting a close history is much more likely to tell me what’s wrong.10

Doctors, like investors, like dating-site users, like all of us, need to get better at thinking about the whole picture. We need to look beyond what is immediately obvious or easy to see. For if we are to see a snake as well as a tiger, if we are to see what it is we need to see, we must remember that the information that glitters most brightly may not actually be what will serve us best.

The Unicycling Clown

Sometimes we are so focused on one thing in particular that we actually stop seeing the other things we need to take into account.

Just ask Dustin Randall.

One might think that, dressed in a vivid purple-and-yellow outfit, with large shoes and a bright-red nose, Dustin the clown would be pretty impossible to miss.

Especially when he was riding a unicycle around a small university campus square.

But researchers at Western Washington University discovered otherwise. When Dustin rode by people who were crossing the square while using their mobile phones, the vast majority completely failed to notice him.11 It was as if the unicycling clown simply wasn’t there.

This is an example of a phenomenon known as ‘inattentional blindness’,12 and it’s what happens when we’re very focused on one thing in particular – in this case a telephone conversation or an important text message. When we are focused like this, we typically don’t register new data points, new things that may come into our sensory orbit. We can even miss highly visible objects we are looking directly at, because our attention is elsewhere.13 One professor rather brilliantly describes it as being ‘as if while the eyes “see” the object, the brain does not’.14

This is something we are all likely to have experienced. Have you ever bumped into a lamp-post while walking and texting? Or somehow missed seeing an important email in your inbox during a particularly busy week? If so, you can blame inattentional blindness.

There are times, of course, when tunnel vision clearly pays off – think of your hunter-gatherer ancestors, seeking out food and doing everything they could to avoid mortal danger. Once they heard the lion’s roar, there would only be two things to think about: working out where that roar was coming from, and running in the other direction. They would not have wanted to be distracted by anything else – the pain in their muscles as they ran, the sound of birds singing above, the sight of their favourite snack hanging from a tree – these would all have been irrelevant to their immediate survival.

But fast-forward a few millennia, and if you don’t want to get knocked over by a unicycling clown, miss a critical email or fail to spot a wrong charge on your credit-card statement, you’ll need to take your blinkers off and, with eyes wide open, improve your powers of attention and perception.

Think about what it is that is consuming your focus. At work, this may be your latest sales figures, or your company’s current stock price. At home, it may be the football scores. How many times a day do you check these? What might this mean you’re not paying attention to? Could you spend a day (or even a few hours) without looking at them? What would you notice that you hadn’t previously? And who around you could bring the unexpected to your attention? More on this point to come.

From PowerPoint to Hypnotised Chickens

Sometimes it is not our fault that we’ve only got partial vision.

Edward Tufte (pronounced tuff-TEE), an American statistician and Professor Emeritus of Political Science, Statistics and Computer Science at Yale University, knows this all too well.

A man of many talents, Tufte was hired by President Obama in 2010 to keep an eye on how his $787 billion stimulus package was being spent.15 He also has a sideline gig as an exhibiting sculptor. But in his academic career he has undertaken substantial research into how informational graphics impact on decision-making.16

One case he has looked at in real depth is the Columbia Space Shuttle disaster of February 2003. Seven astronauts lost their lives when their NASA spacecraft disintegrated shortly before the conclusion of a successful sixteen-day mission.

It is now well-known that a briefcase-sized piece of insulation foam from the Shuttle’s external fuel tank collided with its left wing during take-off, meaning that the Shuttle was unable to shield itself from the intense heat experienced during re-entry to the earth’s atmosphere.

But the official investigation also revealed a story that is both fascinating and curiously everyday in its nature.

A key underlying factor that led to the disaster was the way in which NASA’s engineers shared information.

In particular, the Columbia Accident Investigation Board singled out the ‘endemic’ use of a computer program. A program we more usually associate with corporate seminars or high-school classrooms: Microsoft PowerPoint.

The investigators believed that by using PowerPoint to present the risks associated with the suspected wing damage, the potential for disaster had been significantly understated.

As the Columbia circled the earth, Boeing Corporation engineers scrambled to work out the likely consequences of the foam striking the thermal tiles on the Shuttle’s left wing. Tragically, when they presented their findings, their methods of presentation proved to be deeply flawed. Information was ‘lost’, priorities ‘misrepresented’, key explanations and supporting information ‘filtered out’. The ‘choice of headings, arrangement of information and size of bullets … served to highlight what management already believed’, while ‘uncertainties and assumptions that signalled danger dropped out of the information chain’.17

In other words, the very design tools that underpin the clarity of PowerPoint had served to eclipse the real story. They had distracted its readers. They had served to tell a partial, and highly dangerous, story.

Tufte, applying his expertise in information design, investigated these claims further, and found even more to be concerned about.

He analysed all twenty-eight PowerPoint slides that had been used by the engineers to brief NASA officials on the wing damage and its implications during Columbia’s two-week orbit of the earth, and discovered that some were highly misleading.

The title of a slide supposedly assessing the destructive potential of loose debris, ‘Review of Test Data Indicates Conservatism for Tile Penetration’, was, in Tufte’s words, ‘an exercise in misdirection’. What the title did not make clear was that the pre-flight simulation tests had used a piece of foam 640 times smaller than that which slammed against the Shuttle. This crucial information was buried towards the bottom of the PowerPoint slide. Nobody seemed to take any notice of it – they were too focused on the headline at the top, and did not take in the full picture.18

Tufte found that the limited space for text on PowerPoint slides led to the use of compressed phrases, with crucial caveats squeezed into ever smaller font sizes. This created a reliance on ‘executive summaries’ or slide titles that lost the nuance of uncertainties and qualifications.

In cases such as this, in other words, oversimplification leads to the loss of vital detail.

Complexity, a reality of executive decision-making, is something the medium of PowerPoint dangerously disregards.

It’s not just NASA or professors who are concerned about the potential of PowerPoint to blinker our vision.

General James N. Mattis, who served as Commander of the United States Central Command after taking over from the subsequently disgraced General David Petraeus in 2010, has always had a way with words. He once advised his marines in Iraq to ‘Be polite. Be professional. But have a plan to kill everybody you meet.’19 Mattis’s assessment of PowerPoint was equally merciless: ‘PowerPoint makes us stupid,’ he said in 2010, at a military conference in North Carolina.

This is a feeling corroborated by his military colleagues. Brigadier General H.R. McMaster, who banned PowerPoint when leading the successful assault on the Iraqi city of Tal Afar in 2005, said, ‘It’s dangerous, because it can create the illusion of understanding and the illusion of control.’ Indeed, so aware is the US Army of the medium’s powers of evasion that it intentionally deploys PowerPoint when briefing the media, a tactic known as ‘hypnotising chickens’.20

Of course, this is not to say that we don’t ever need information to be summarised for us. There are times when we clearly do. But if we’re making decisions on the basis of summaries, we need to remind ourselves of the significant risk that in the process key details and subtleties may well be overlooked. Or be written in such a small font size that we don’t pay attention to them.

We must also take a moment to remind ourselves that what someone else has deemed important may not be what matters most to us, or what will really make a difference to the decision we make.

So next time you’re looking at a PowerPoint presentation, or indeed any type of executive summary, look beyond the large font and the bold headline points. The information you actually need may be somewhat more buried. Or it may not even be on the slides at all. Do also put your presenter on the spot. If it matters, ask them to clarify and expand, and provide more information.

If you don’t, you risk being shown only tigers, and never snakes.

The Cult of the Measurable

It’s not only particular forms of presentation that can blinker our vision.

One type of information often dominates our attention – numbers. This in itself can be problematic.

Numbers can give us critical information – if we want to know whether to put a coat on, we’ll look at the temperature outside; if we want to know how well our business is doing, we’ll need to keep an eye on revenues and expenditures; and if we want to be able to compare the past with the future, we’ll need standard measures to do so.

But the problem is that the Cult of the Measurable means numerical values are sometimes given to things that can’t really be measured.21

Can a wine really be 86 per cent ‘good’? That doesn’t sound right to me, but one multi-billion-dollar industry doesn’t agree. Robert Parker, probably the most famous wine critic in the world, ranks wines on a scale between fifty and one hundred. And winemakers prostrate themselves before him, praying for a rating of ninety-two or above, because such a number practically guarantees commercial success, given how influential Parker’s ratings have become with wine drinkers.22

Yet what really is the difference between a rating of ninety-two and an eighty-six? How can we tell what that six-point difference means?

And are someone else’s tastes necessarily going to correlate with yours? I’ve already raised the question of whether Joe from Idaho on TripAdvisor is really the best person to steer your choice as to where to go on holiday. We also need to ask ourselves whether what I mean by four stars is what you mean by four stars. Whether my Zagat score of twenty means the same as yours. As the Financial Times’s wine critic Jancis Robinson points out on the subject of Parker’s influence, ‘Make no mistake about it, wine judging is every bit as subjective as the judging of any art form.’23

So, before you make a decision based on a number, think about what the number is capturing, and also what it is not telling you.

Risk is another area where our attempts to assign clear measures are often bound to fail. While it’s true that there are some areas where we can meaningfully quantify risk – such as engineering, where we can come up with a number for how likely a building is to withstand an earthquake, say; or medicine, where we can estimate a patient’s chance of responding to a particular drug – this approach doesn’t work effectively in all spheres. Indeed, in our turbocharged world, in which changes happen quicker and less predictably than ever before, many of our attempts to assign probabilities to future events are likely to be pretty meaningless. As President Obama said, reflecting on the huge range of probabilities that various senior intelligence folk had proffered back in March 2011 as to whether Osama bin Laden was the ‘high value target’ spotted within a high-walled compound in Abbottabad in Pakistan – with confidence levels that ranged between 30 and 95 per cent – ‘What you started getting was probabilities that disguised uncertainty as opposed to actually providing you with more useful information.’24

In our desire to reduce everything to some sort of standardised measure, to create universal meanings for things that will always be subjective, and to create the illusion of certainty when uncertainty is in fact what prevails, do we not risk making decisions on the basis of what may seem to be intelligence-rich information, but is in truth pretty meaningless?

Not everything can be measured, not everything can be compared, especially in a world as complex as ours. Indeed, Obama, realising this, responded to the various probabilities he’d been presented with: ‘Look guys, this is a flip of the coin. I can’t base this decision on the notion that we have any greater certainty than that.’25 A flip of the coin which, we now know, President Obama won and Osama bin Laden lost.
