5 metrics misconceptions—and how to avoid them

You’ve gotten the budget approval, selected a measurement tool and started gathering data, so you’re all set, right? Not so fast, Sherlock. It’s not all that elementary.


A major challenge in measuring communications is surviving the first few months of a new program.

Once you’ve weathered RFPs and vendor selection, you’ll probably have high expectations to see your data roll in and the insights roll out. It’s not that easy.

If you are going to survive and even prosper while you get your program running smoothly, you must look out for common pitfalls and unrealistic expectations.

Here are my top five unrealistic expectations for starting a measurement program, along with the cold realities:

Expectation No. 1: “I’ll get my magic number.”

You expect to find a “measurement number” that will justify your budget and answer all questions the moment you implement a new tool. It’s not going to happen.

Reality: Measurement isn’t a single number, so stop searching for unicorns.

Your CFO doesn’t go into the board of directors meeting armed only with your company’s stock price or your sales for the quarter. He or she probably presents metrics showing profit, customer satisfaction and reputation, as well as sales growth and possibly employee retention—all things that will affect the organization’s future success.


So, why would the team handling communications, comprising so many disparate components, think it can demonstrate results with one number? Agree upon three to five communications metrics that reflect your department’s efforts as they apply to the organization’s goals.

An optimal content score or social engagement index is a good place to start, because it forces you and your leaders to agree on the key factors of media relations and social media that affect the business.
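A content score is just a weighted checklist that you and your leaders agree on in advance. As a purely illustrative sketch (the factors, weights and 0–100 scale below are hypothetical, not a standard formula), it might look like this:

```python
# Hypothetical weighted content score for a single media mention.
# The factors and weights are illustrative only -- the real ones must
# be negotiated with your leadership, as the article advises.

DEFAULT_WEIGHTS = {
    "key_message": 40,           # did our key message appear?
    "desirable_visibility": 20,  # headline or first paragraph
    "quote": 20,                 # spokesperson quoted
    "tier_one_outlet": 20,       # appeared in a priority outlet
}

def content_score(mention, weights=DEFAULT_WEIGHTS):
    """Score one mention on a 0-100 scale by summing the weights
    of every agreed-upon factor the mention satisfies."""
    return sum(w for factor, w in weights.items() if mention.get(factor))

example = {
    "key_message": True,
    "desirable_visibility": True,
    "quote": True,
    "tier_one_outlet": False,
}
print(content_score(example))  # 80
```

Averaging that score across all mentions in a month gives you one of your three to five agreed metrics, and the negotiation over the weights is where the real alignment with business goals happens.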

Reality: Tools do not measure success.

Tools provide data, and if you buy the right one, you might actually get the right data to demonstrate your success. Do not expect a tool to provide you with instant measurement, no matter what the smooth salesperson said.

Reality: Measurement is a long-term process.

It will take time for you to identify changes and trends. Your manufacturing VP doesn’t report on the productivity for today; he or she reports on monthly, quarterly and annual changes in productivity. That’s why you must set expectations correctly. It will take up to nine months after your program is up and running before you can demonstrate trends.

Expectation No. 2: “My results are on the way.”

Reality: It will take time to get your ducks in a row.

Spend time with your measurement vendor’s staff to clearly define the parameters of your program. Identify subjects, messages, campaigns and any other details you wish to track.

If you give your vendor the wrong messages, incorrect competitors or out-of-date product names, it will be up to you to discover it and tell them. If it’s on your website, they will assume it’s current. If it’s not on your website and you haven’t told them about it, how will they know it exists?

Reality: A single month does not results make.

There’s a logical timeline, and it does not begin with yesterday and end with the day after tomorrow. Double-check the data (see how to do a validity test) before showing it to anyone. A month’s worth of data looks silly on a chart, because you have nothing to compare it against.

Even if you’re measuring just one event, it’s always more useful if you can compare results against those of other events or prior years. One month’s data does not a terrific report make. Good reporting takes time.

Expectation No. 3: “Now that I’ve sorted out my vendors, everything will be easy.”

Reality: Nope.

You’ve just gone through an incredibly complicated screening process to find the right vendor, and now you expect everything will be simple and easy? Don’t expect to sign the purchase order and have your metrics magically appear. It doesn’t work that way. There must be setup time, verification and testing for any tool.

No matter how simple or sophisticated a tool is, it will always take at least eight weeks to deliver consistent and useful data. Given the setup, training and validation, you’ll be lucky to get your results by the end of the quarter.

Expectation No. 4: “I’m spending a ton of money, so my data will be 100 percent perfect from day one.”

Reality: Chances are your data are dirty, dirty, dirty.

The more data there are, the greater the possibility that some of the information is wrong. Expect glitches in your data, especially to start. Computers, like children, take a while to learn the rules, and without rules they’ll spend a lot of time getting dirty.

For the first few months you’ll be sorting out search terms and media outlet specifics, and you and your computer(s) are going to make mistakes:

  • At first you will define search terms that are too broad, and you will be inundated with mentions that have nothing to do with your organization. In a recent study of Google’s Calico project, I anticipated a large number of calico cat videos, but I had no idea there were so many varieties of calico fabric for quilts. You might also ask the folks at SAS (the analytics software company, not the airline or the Semester at Sea folks) how many “exclude” terms they have. Last I checked, it was around 2,000.
  • Beware stock ticker symbols; Twitter turns up some bizarre abbreviations. AstraZeneca’s ticker symbol (AZN), for instance, is also Twitter slang for “Asian” and turns up some really nasty stuff. Unless you want to spend your days reading bizarre tweets, be very careful about your search strings.
  • Assume it will take at least three months to get it right. You can shorten this cycle by working with a precise list of media outlets, clearly defined by specific URLs. That means not asking simply for “Forbes”: Which of the 100-plus URLs did you have in mind?
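The cleanup the list above describes boils down to exclude-term filtering. Here is a minimal sketch of the idea; the terms are made up for illustration, and a real monitoring tool would express this in its own Boolean query syntax rather than in code:

```python
# Minimal exclude-term filter for media mentions. The search term and
# exclude list below are illustrative, riffing on the Calico example;
# they are not from any real monitoring configuration.

EXCLUDE_TERMS = ["calico cat", "calico fabric", "quilt"]

def is_relevant(mention_text, search_term="Calico", excludes=EXCLUDE_TERMS):
    """Keep a mention only if it matches the search term
    and matches none of the exclude terms."""
    text = mention_text.lower()
    if search_term.lower() not in text:
        return False
    # Drop mentions that hit any exclude term.
    return not any(term in text for term in excludes)

print(is_relevant("Google's Calico project announced new research"))  # True
print(is_relevant("Adorable calico cat video goes viral"))            # False
```

In practice the exclude list grows for months as you review the mentions that slip through, which is exactly why the setup period takes as long as it does.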

Expectation No. 5: “My first report will explain everything and make me a hero.”

Reality: Your first report will be confusing.

No matter how clear the program seems, your first report will throw you for a loop. There will be charts that look weird and language you might not recognize. Until you see it up there on the screen, you probably won’t understand what you’ve ordered. That is the nature of data. Some early data will not be what you expected. Your job is to translate that confusion into something your senior leaders can easily digest.

Reality: You will be pecked to death by questions.

When seeing something new, people will pepper you with questions. They will question your data and the methodology, so be prepared. Anticipate every question. If you have solid answers, you’ll come out a hero and the doubters will go silent.

Katie Paine is the CEO, publisher, consultant and founder of Paine Publishing, where a version of this post first appeared. Find her on Twitter @queenofmetrics.
