(Author’s note: In the interest of transparency, I run a consulting organization that designs measurement programs so I could conceivably profit from helping people fill out the framework. I’ve also developed a different framework that I use in my work called the 6-Step System for Perfect Measurement.)
The hottest thing in measurement this month is AMEC’s new integrated evaluation framework, unveiled with great fanfare at AMEC’s International Summit in London.
It’s an interactive version of the original AMEC framework, intended to make it easier for clients to implement Barcelona Principles/standards-compliant measurement in their organizations.
It features seven multi-colored squares, each of which requires you to provide information about your organization’s campaign or program: Objectives, Inputs, Activities, Outputs, Out-takes, Outcomes and Impact.
When you click on each square it asks you questions such as, “What are the broad objectives of your organization?” and then, “What are your communications objectives?”
Visually, it is certainly an enormous improvement over the old framework, and its interactive nature is a lot less daunting than the old PowerPoint version. In the end, it will be just as challenging to fill out as the earlier version because all the problems inherent in the old framework still exist, despite the sexy new front end.
Sure, there’s a taxonomy that offers examples of the types of answers they’re looking for, but the confusion will persist because the very smart people who created the framework live and breathe measurement every day of their working lives. Many of them, if not most, work primarily with large, sophisticated organizations that have at least some background in measuring results.
Unfortunately, that is not your typical PR enterprise. The vast majority of PR is done for small to mid-size businesses, restaurants, NGOs and government agencies. These are the people who show up at conferences, attend workshops and participate in webinars, and they’re asking far more basic questions than the framework will answer.
I filled out the new framework twice, using a recent PR campaign. In the first round, I used the responses that members of the PR team gave at our first meeting:
The initial conversation revealed large gaps between what they saw as the organization’s objectives and the objectives of the particular event. There was considerable confusion among inputs, activities and outputs, and no clear connection between the organization’s objectives and what the actual impact would be.
In working with the client, we did eventually identify target audiences and objectives, and we connected the dots between the communications activity and the ultimate impact. Furthermore, they now have a wonderful working dashboard offering information with which to update management on what was, and was not, a waste of effort. It just took a bit of time to sort out the definitions of “working” and “not working.”
I filled out the framework again, this time with the information I used to make their dashboard. After a long day of checking the taxonomy and making sure I had put everything in the right boxes, I have these tips for anyone trying to do the same:
1. Before you Google “AMEC framework,” do your homework.
What all measurement requires, regardless of what framework, tool or anything else you employ, is a thorough understanding of the organization’s business goals, such as, “What is the mission?” and, “How does it make money?” and, “What is the perceived role of PR in that process?”
For agency folks, this is your biggest weakness and one reason why PR gets no respect when budgets get tight. If you don’t have an agreed-upon definition of how PR contributes to the success of the organization, then you’ll never get past square one (literally).
So meet with your boss, your boss’s boss or whoever is asking, “What have you done for me lately?” and make sure you agree on what the organizational goals are and how PR contributes to them.
2. Bake cookies to find your “inputs.”
Some answers to the framework questions about target audiences and strategy may lie in other departments. Depending on the size of your organization, information on specific target audiences/personas or even overall strategy may well lie in sales, marketing or customer intelligence. If you’re at a nonprofit, answers may reside in membership or development. In government agencies, there may be a data center or a committee that has the answers.
Visit whatever departments hold the clues, and bring treats. I’ve always gotten much more information with chocolate chip cookies than with an email. Depending on their stress level, a good scotch can also be an excellent persuader.
3. Look at the communications budget for “activities.”
What you’re really doing with this framework (and any measurement program, for that matter) is determining which efforts are worthwhile and which are not. The fundamental concept is “worth,” which implies a financial or resource commitment. Rather than offering a list of activities, which could quickly become a nightmare of random metrics, you should list only those activities that require a significant amount of money, time or other resources. (You can get to the others later.)
Because the Conclave on Social Media Measurement Standards has determined that you “earn” a share, I wouldn’t even bother with the “S” column; just include any shared data under “earned.” Also, note that “earned” doesn’t mean what you have already earned, but rather what you plan to do in terms of “earned” media, e.g., what you’re writing, the nature of the media outreach, speechwriting or anything that is going to require resources.
4. Outputs are what you’ve checked off your to-do list.
After you’ve listed all the activities, you need to see what actually happened: Did any of that activity reach the agreed-upon target audiences? This is where you can “count” the number of news items that ran or that you “earned.” Tally up the paid media placements and anything that was shared. Add the data on clicks, time on site or whatever metrics you’ve agreed are important from your web analytics platform.
If you’re measuring events, count the number of attendees, as well as anyone who used your hashtag. Whatever you do, try to avoid completely inaccurate definitions of “reach” and “impressions.”
5. If you don’t have good survey or engagement data, skip the “outtakes” section.
Essentially, outtakes are what your target audience actually takes away from all the stuff you’ve listed in step 4. In order to understand what a member of your audience actually “takes away,” you have to ask their opinion. In other words: Are they more aware or more likely to consider or prefer your brand? Although engagement is not the same as awareness, it may be an acceptable proxy for evidence of attention on the part of your target audiences.
6. “Outcomes” should be the same as the communications goals you listed in Step 1.
Go back to Step 1, cut and paste your communications goals into the “Outcomes” section and change the tenses. For example, if the goal was “Increase preference in the new brand by 10 percent,” then the outcome should be “Increased preference in the new brand by 10 percent, as measured by pre/post testing.” If that didn’t happen, prepare a good explanation.
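To make that outcome check concrete, here is a minimal sketch in Python using hypothetical pre/post survey figures (the numbers, and the assumption that “10 percent” means a relative lift over the pre-campaign baseline rather than percentage points, are mine, not the framework’s; settle that definition with your client first):

```python
# Hypothetical pre/post survey results; all figures below are illustrative.
pre_preference = 0.32   # share of respondents preferring the brand before the campaign
post_preference = 0.36  # share preferring the brand after the campaign
goal_lift = 0.10        # goal: "Increase preference in the new brand by 10 percent"

# Assumption: "10 percent" is a relative lift over the pre-campaign baseline.
actual_lift = (post_preference - pre_preference) / pre_preference
goal_met = actual_lift >= goal_lift

print(f"Measured lift: {actual_lift:.1%}; goal met: {goal_met}")
```

If the lift falls short, the same numbers become the starting point for the “good explanation” you’ll owe management.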
7. “Impact” should be the same as the organizational goals you listed in Step 1.
Go back to Step 1, copy and paste the business goals into the “Impact” section and change the tenses. For instance, if your organizational goal was “Generate a 10 percent increase in qualified leads from communications activities” or, “Increase support for independence by 10 percent among Scottish women,” your impact should be “Increased support for independence among Scottish women by 10 percent, as measured by an average of public polls.” If the impact is different, prepare a good explanation.
Ultimately, I love this framework, not because it is perfect, nor even particularly easy to use, but because it poses the kind of questions that I’ve been answering for 30 years.
What has been your experience using this framework? Please offer your insights in the comments section.
A version of this article first appeared on Paine Publishing.