5 horror stories of bad data

From widely varying sentiment scores to inflated ‘impressions,’ we look at the realm of lies, damned lies, and statistics. So, shed that prom dress and start mining relevant numbers.

Warning: I’m getting up on my Bad Data soapbox here. Still, all the soap in the world is not going to clean up the gigantic pile of dirty data that has been landing on my desktop over the past few weeks. Each day has brought another round of “You have got to be freakin’ kidding me!”

Yes, dealing with a certain amount of bad data is not unusual around here. One of my jobs as “Measurement Sherpa” for my clients is to make sure that the data on which they base decisions is valid, accurate, and reliable. So we spend a lot of time digging in the dirt that is modern data.

Here are a few horror stories that you might recognize from your own quest for reliable, relevant data:

1. Four vendors, four very different sets of data

For one client we are monitoring four vendors to see which one has the most accurate sentiment analysis tool. My conclusion: Who the hell knows?

Each vendor’s results differ from the next by 20 percentage points or more. One reports a “positive sentiment score” of 58 percent. Another says it’s 12 percent, and a third puts it roughly in between at 33 percent. The “neutral” scores are even further apart: one scores 78 percent of all mentions neutral; another says it’s 30 percent. Negatives, in case you’re wondering, range from 9.5 percent to 26.5 percent.

The weirdest thing is that two of them use human coders and two rely entirely on machine coding, and when I have my own coders run a validity check, their codes agree more with the machine coding than with the other humans’. Go figure.
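
If you want to run that kind of validity check yourself, the standard yardstick is Cohen’s kappa, which measures how often two coders agree after correcting for chance. Below is a minimal Python sketch with hypothetical labels; a kappa near 1 means near-perfect agreement, and a kappa near 0 means the coders might as well be flipping coins.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Agreement between two coders, corrected for chance agreement."""
    n = len(labels_a)
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: the probability both coders would assign the same
    # category if each coded at random from their own category frequencies.
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    expected = sum((freq_a[c] / n) * (freq_b[c] / n)
                   for c in set(freq_a) | set(freq_b))
    return (observed - expected) / (1 - expected)

# Hypothetical codes for the same ten mentions, coded three ways.
my_coders      = ["pos", "neg", "neu", "neu", "pos", "neg", "neu", "pos", "neu", "neu"]
vendor_machine = ["pos", "neg", "neu", "neu", "pos", "neu", "neu", "pos", "neu", "neg"]
vendor_human   = ["pos", "pos", "pos", "neu", "pos", "neu", "pos", "pos", "neu", "pos"]

print("vs. machine-coded vendor:", round(cohens_kappa(my_coders, vendor_machine), 2))  # 0.68
print("vs. human-coded vendor:  ", round(cohens_kappa(my_coders, vendor_human), 2))    # 0.19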

2. The case of the positive prom dress

For another project we are testing which vendor’s “alerts” are actually alerting the company to the right stuff. One day my editorial assistant started laughing at an alert, and I knew we were in trouble.

Someone whose last name matched the name of the manufacturing company we were monitoring tweeted a picture of herself in her prom dress. The vendor’s data rated it as “positive” toward the manufacturer. Wouldn’t you love to see those coding instructions?

3. Lost in bad travel data

Then there was the travel destination data that contained duplicate articles when it arrived. Easy enough to fix, I thought: Just run our de-duping routine and poof! Except that one copy had been coded positive, and the other had been coded neutral. Which do I keep? I then had to go in and reread and recode every duplicate. What was the vendor thinking?
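
One way to automate that triage: de-dupe as usual, but never let the routine silently pick a winner when duplicate copies carry different codes. Here’s a minimal Python sketch, with hypothetical field names ('url', 'headline', 'sentiment') standing in for whatever your vendor actually delivers.

```python
from collections import defaultdict

def dedupe_with_conflict_check(articles):
    """De-dupe coded articles; route conflicting duplicates to a human."""
    groups = defaultdict(list)
    for article in articles:
        # Key on URL when available; fall back to a normalized headline.
        key = article.get("url") or article["headline"].strip().lower()
        groups[key].append(article)

    keep, reread = [], []
    for copies in groups.values():
        if len({a["sentiment"] for a in copies}) == 1:
            keep.append(copies[0])   # true duplicate: keep one copy
        else:
            reread.extend(copies)    # codes disagree: reread and recode
    return keep, reread
```

You still have to reread the conflicts, but at least the routine tells you exactly which ones, instead of quietly keeping whichever copy it met first.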

No sooner had I gotten that bit of confusion cleaned up than I noticed that coverage for my client had dropped by 50 percent from the previous quarter. It turns out the vendor was using search strings so narrow that they captured only about half of the relevant items. So we asked them to loosen some of the search term restrictions. They did, but then, of course, they delivered dozens of calendar listings, traffic reports, and police reports. Sixty percent of the new content was irrelevant.
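
What the vendor was fighting is the classic precision-recall tradeoff: narrow search strings miss relevant coverage, and loose ones bury you in calendar listings. Two one-line functions make the tradeoff concrete; the counts below are illustrative, loosely fitted to the numbers above.

```python
def precision(relevant_retrieved, total_retrieved):
    return relevant_retrieved / total_retrieved   # how much of what you got is on-topic

def recall(relevant_retrieved, total_relevant):
    return relevant_retrieved / total_relevant    # how much of what exists you got

# Narrow search strings: almost everything delivered is relevant,
# but roughly half of the relevant coverage never arrives.
print(precision(95, 100), recall(95, 190))    # 0.95 precision, 0.50 recall

# Loosened search strings: coverage recovered, but 60 percent of
# the new content is calendar listings and police reports.
print(precision(176, 380), recall(176, 190))  # ~0.46 precision, ~0.93 recall
```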

4. Who needs 6.8 trillion impressions?

About a year ago, I was called in to an agency that had been laughed out of a CMO’s office for reporting that it had generated 6.8 trillion impressions the previous year. The CMO had pointed out that the agency was, in effect, claiming that every person on the planet had seen news of the company nearly 1,000 times.

As it turns out, the agency’s “impressions” counted every single Facebook posting as “2.5 billion impressions”: there were 850 million people signed up on Facebook at the time, and the agency applied a 3x multiplier on top. Bigger isn’t always better.
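
The sanity check takes a minute with a calculator, or a few lines of Python. The only number below that isn’t from the story is the world population, which I’m assuming was roughly 7 billion at the time.

```python
facebook_users = 850_000_000        # people signed up on Facebook at the time
multiplier = 3                      # the agency's 3x multiplier
per_post = facebook_users * multiplier
print(f"claimed impressions per post: {per_post:,}")   # 2,550,000,000

total_claimed = 6.8e12              # 6.8 trillion impressions for the year
world_population = 7e9              # assumption: roughly 7 billion people
print(f"views per person on Earth: {total_claimed / world_population:,.0f}")  # ~971
```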

5. Tweet or be damned

Beware of a certain social monitoring vendor that prominently features “impressions” on its reporting dashboard. Unfortunately, it took asking the right questions before the people there admitted that the number counts Twitter alone, despite their claim that they monitor dozens of sites.

Katie Delahaye Paine helps companies define success and design measurement programs for their PR, social media, and communications programs. A version of this article first appeared on PainePublishing.
