
Looking for Support Rather than Illumination

2012 February 5

I recently attended a meeting where a marketing executive, in all seriousness, said, “we only present research results that we already know to be true.”  Really?  Then what’s the point of research in the first place?

Every year, millions of dollars are spent on research of varying quality.  If you look hard enough, you’ll always be able to find something, somewhere, that can back up virtually any argument.  Yet by doing so, you are using data much like a drunk uses a lamppost—for support rather than illumination.

That’s a big problem. Ours is a messy world, with lots of seemingly contradictory data and confusing logic.  To glean the truth, we need to be careful.  Make no mistake, choosing facts to fit your argument isn’t clever or professional, it’s sophistry and it’s fraudulent.  Here’s a quick guide to spotting—and avoiding—some of the most common pitfalls.

Cargo Cult Marketers Revisited

I described the problem before in a post about Cargo Cult Marketers, which was based on Richard Feynman’s famous commencement speech about cargo cult science.

He got the name from islanders in the South Pacific who saw military cargo planes landing at airfields during World War II.  After the war, they built their own mock airfields in the belief that the Gods would smile on them as well.  Alas, no cargo ever came, no matter how closely they followed the rituals of a military airfield.

His point was that insight takes discipline.  As he said, “The first principle is that you must not fool yourself—and you are the easiest person to fool. So you have to be very careful about that.”  

Michio Kaku makes a similar point in this video:

Ideas can be dangerous things.  They can take us places we really shouldn’t go.  Often, what we “know is true” can be patently false.

Where Ideas Come From

As I noted in an earlier post, we don’t have much control over where our ideas come from. Some are based on fact, others on intuition, still others based on upbringing and genetic makeup. Sometimes, we have good reason for thinking what we do, but often we don’t.

One example is priming.  In a landmark paper, Nobel laureate Daniel Kahneman and Amos Tversky found that when subjects watched a wheel of fortune that appeared to generate a random number between 0 and 100, the outcome would affect their answers to factual questions, such as the percentage of African countries in the United Nations.

Similar results have been found in other studies.  When subjects are asked to write down the last two digits of their social security number, those with higher numbers will bid more in an auction.  When people are told a bottle of wine costs more, they like it more. We’re very susceptible to subtle cues and our thoughts are not entirely our own.

The priming effect can have long-term consequences because of confirmation bias.  We tend to favor information that confirms our ideas (however we arrive at them) and ignore data to the contrary.  Once an idea takes root, we seek to defend it.

Funny Numbers

Of course, just any idea won’t do in a professional setting.  You have to have numbers to back them up.  Unfortunately, numbers can lie.

Take the idea that smaller schools educate better than large schools.  It makes a lot of sense.  Students get more attention, there are not as many distractions and so on.  Do a little research and you’ll find that the highest achieving schools are, in fact, smaller than average.

But wait, small schools have disadvantages too.  They tend to have fewer resources and offer a narrower curriculum.  So maybe you conclude that smaller schools are worse.  No problem: go to the same research and you will find that the worst-performing schools are also small ones.

The problem, in actuality, isn’t small schools but small numbers.  Smaller samples are more likely to result in extreme values.  If you are merely seeking data to prove a preconceived belief, little statistical quirks will pass right by you.  As Feynman said, you have to be very careful.
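The small-numbers effect is easy to demonstrate.  Here is a minimal sketch in Python, assuming every student, at every school, is drawn from the same ability distribution (a mean of 500 and standard deviation of 100, both invented for illustration).  School size is the only thing that varies, yet the smallest schools still dominate both the top and bottom of the league table, simply because their averages are noisier:

```python
import random
import statistics

random.seed(42)

def school_averages(school_size, n_schools=1000):
    """Simulate average test scores for schools of a given size.
    Every student is drawn from the SAME distribution (mean 500,
    sd 100) -- school size is the only difference."""
    averages = []
    for _ in range(n_schools):
        scores = [random.gauss(500, 100) for _ in range(school_size)]
        averages.append(statistics.mean(scores))
    return averages

small = school_averages(25)    # small schools: 25 students each
large = school_averages(400)   # large schools: 400 students each

# The spread of school averages shrinks roughly as 1/sqrt(n), so
# both the "best" and "worst" schools tend to be the small ones.
print(max(small) - min(small))   # wide range of averages
print(max(large) - min(large))   # much narrower range
print(statistics.stdev(small) > statistics.stdev(large))  # True
```

Nothing about small schools causes the extremes here; the variance of a sample mean alone produces them, which is exactly the quirk that slips past anyone merely hunting for confirmation.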

Funny Logic

We not only have math to worry about, but logic as well.  For instance, when social media came on the scene, traditional media took a beating.  Many assumed that the first caused the second, but that logic fell flat when traditional media made a comeback.

This is known as a post hoc fallacy.  Just because something preceded something else doesn’t mean it caused it.

Another common error is affirming the consequent, which confuses necessary and sufficient causes.  For instance, if it rains, the streets will be wet, but wet streets don’t mean that it rained.  There are lots of things that can make the street wet.  In much the same way, just because digital is growing doesn’t mean TV is dying (it’s not), although many people think it does.

These are pretty simple examples which seem easy enough to avoid, but when we’re hot on an idea, we tend to overlook pesky little details like logic and math. That’s why we need to discipline ourselves to look for uncomfortable facts that do not jibe with what we want to be true.

Sobering Up

As Shakespeare wrote in Macbeth, life is “a tale told by an idiot, full of sound and fury.”  It’s a messy world, with lots of stuff around to lead us astray.  Cognitive bias, funny math and logic, as well as our own emotional need for certainty, all conspire.  There are many more ways to get it wrong than to get it right.

So what to do?  Over the years I have come up with a few principles that help me eliminate the most egregious analytical errors:

Data First:  Do your best to approach research without preconceptions.  That’s easier said than done but, with some effort, you can make some real progress.  Also, look at raw data yourself, rather than simply reading an executive summary or a PowerPoint deck.  If you’re not using Excel, you’re not doing a thorough analysis.

Null Hypothesis:  Always assume that there is nothing there unless you have a very good reason to believe otherwise.  This is known as the null hypothesis.  The data never “says” anything; we interpret it one way or another.  As David Hume pointed out, even our belief that the sun will rise tomorrow rests on habit rather than logical proof.

Bayes Theorem:  Continually update your information and factor it back into your analysis in the spirit of Bayes’ theorem.  The earliest information is not necessarily the best.

Expand Your Network: Probably the best thing you can do is continually strive to connect with people of varying perspectives.  As Solomon Asch’s conformity experiments showed, local majorities can be incredibly persuasive.
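The Bayesian habit above can be made concrete.  Here is a minimal sketch in which the numbers are entirely invented for illustration: suppose you start with a 30% prior that a campaign works, and a test market gives positive signals that appear 80% of the time when the campaign works and 20% of the time when it doesn’t.

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) via Bayes' theorem."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

# Hypothetical numbers: 30% prior, signals that are 80% likely if
# the campaign works and 20% likely if it doesn't.
belief = 0.30
for _ in range(3):  # three independent positive signals arrive
    belief = bayes_update(belief, 0.80, 0.20)
    print(round(belief, 3))  # prints 0.632, then 0.873, then 0.965
```

Note how no single reading settles the question: each new signal shifts the estimate, and the earliest information carries no special weight over what comes later.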

We all have a tendency to marry our ideas.  Passion and integrity require a certain constancy.  However, being a professional entails testing our beliefs.  If we don’t, the market will.

– Greg
