Sunday, March 23, 2008

We Don't Need to Understand Engagement to Measure It

This blog is all about creating brand-building experiences for consumers. The key to successful Experiential Marketing (XM) is engagement, but what is engagement? And can we measure it without having a common understanding of what it is?

I read an interesting article on iMedia that asks if it is necessary for marketers to settle on a common definition of engagement. Tom Hespos notes that several presenters at the recent iMedia Breakthrough Summit suggested such a definition would be a good idea, since "advertising's history is characterized by standard definitions... (and) adoption is contingent on the industry deciding on a common definition that is accepted by all."

Hespos contends (and I agree) that there is no reason for the industry to settle on one definition, because what constitutes engagement can vary so greatly from one brand, strategy, or medium to another. Unlike traditional media where the "engagement" is relatively straightforward--did the consumer see the TV, print, or OOH ad?--XM engagement can be defined in an endless variety of ways.

For a simple example, let's consider an event marketing program involving a pop-up truck filled with interactive exhibits, a sweepstakes entry, and a post-event email invitation to visit the brand Web site. Which of the following defines the "engagement" for this marketing program?
  • People who see the event truck?
  • People who enter the event footprint?
  • People who interact with one exhibit? Two exhibits? More exhibits?
  • People who enter the sweepstakes?
  • People who speak to event reps?
  • People who sample the product while at the event?
  • People who request more info from the brand or take some other sort of conversion action (such as a purchase)?
  • People who open the post-event email invitation?
  • People who click through to the Web site?
The answer to the question, "Which of these defines engagement?" is "All of the above." This small example demonstrates the futility of trying to arrive at a standard definition of engagement. If we can't settle on a single definition of engagement within one campaign, how can we hope to devise one across the breadth of strategies that make up the field of XM?

Part of the problem is that engagement isn't a single thing that is or isn't. In the language of statistics, engagement is not a dichotomous variable (a simple yes/no, either/or measure); it is a continuous variable, because a program can generate a little engagement or a lot.
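To make that concrete, here is a minimal sketch of scoring each attendee of the truck-tour example above on a continuous 0-to-1 scale rather than flagging them as engaged or not. The action names and weights are purely illustrative assumptions on my part, not any standard.

```python
# Hypothetical engagement scoring for the event-truck example above.
# Action names and weights are illustrative assumptions, not an industry standard.

ACTION_WEIGHTS = {
    "saw_truck": 0.05,
    "entered_footprint": 0.10,
    "used_exhibit": 0.15,        # counted once per exhibit used
    "entered_sweepstakes": 0.10,
    "spoke_to_rep": 0.15,
    "sampled_product": 0.20,
    "requested_info": 0.25,
    "opened_email": 0.10,
    "clicked_to_site": 0.15,
}

def engagement_score(actions):
    """Return a continuous 0-to-1 engagement score from a list of recorded actions."""
    raw = sum(ACTION_WEIGHTS.get(action, 0.0) for action in actions)
    return min(raw, 1.0)  # cap the score so it stays on a 0-to-1 scale

# A passer-by who only saw the truck vs. a visitor who did much more:
print(round(engagement_score(["saw_truck"]), 2))                      # 0.05
print(round(engagement_score(["entered_footprint", "used_exhibit",
                              "used_exhibit", "sampled_product",
                              "requested_info"]), 2))                  # 0.85
```

The point isn't the particular weights; it's that every consumer lands somewhere on a spectrum rather than in a yes/no bucket.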

Furthermore, until someone manages to define an objective standard for measuring human attention, it would seem we'd face the same challenge trying to define engagement that Justice Potter Stewart faced in 1964 in trying to define pornography: He famously wrote that he couldn't define pornography, "but I know it when I see it."

Rather than trying to define engagement, those in the XM business should focus on how to measure the effectiveness of their programs. For the most part, our current measurement methodologies are failing us. Much like traditional advertising metrics, too many of the success metrics in XM rely on tallying people rather than gauging how their attitudes were altered.

In traditional advertising, we measure reach (how many people saw an ad) and frequency (how often an average consumer was exposed to the ad), and in interactive and XM, we count the number of people who enter a footprint or visit a site and how many completed a paper or online form. The problem with these metrics is that, although they are cheap and easy to collect, they are focused on executional effectiveness and not ROI. They ask how many consumers saw or did something and not whether the campaign achieved any of the intended goals.
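As a quick illustration of how little those counts say by themselves, here is the classic exposure-counting arithmetic with made-up figures:

```python
# Illustrative numbers only: the classic exposure-counting arithmetic.
impressions = 12_000_000    # total ad exposures delivered
reach = 4_000_000           # distinct people exposed at least once
frequency = impressions / reach   # average exposures per person reached
print(frequency)            # 3.0 -- still says nothing about whether attitudes changed
```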

The need to measure the effectiveness of engagement is especially important because XM tactics will always lose to traditional media in one of the simplest of comparative metrics in marketing, Cost Per Touch (CPT). The cost to produce a TV spot and buy ad time that reaches millions of people is considerable, but when divided by the large number of people who are theoretically exposed to that ad, the CPT is quite low. But CPT for XM tactics is high--comparatively fewer people walk through a physical event or visit a site, so the CPT is greater for XM than for mass media.
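To see the gap, here is the same CPT arithmetic applied to a hypothetical TV buy and a hypothetical event tour; all costs and audience figures below are invented for illustration:

```python
# Cost Per Touch (CPT) = total cost / number of people touched.
# All figures below are invented for illustration.

def cost_per_touch(total_cost, people_touched):
    return total_cost / people_touched

tv_cpt = cost_per_touch(total_cost=2_000_000, people_touched=10_000_000)  # $0.20 per touch
event_cpt = cost_per_touch(total_cost=500_000, people_touched=100_000)    # $5.00 per touch

print(f"TV CPT: ${tv_cpt:.2f} vs. event CPT: ${event_cpt:.2f}")
```

On this yardstick alone, XM will never win; the question is whether each of those far more expensive touches is worth more.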

Of course, even though CPT is greater for XM, so is engagement--which brings us back to the need to measure the value of engagement. Brand marketers understand that engagement is higher for experiential programs, but when faced with a limited marketing budget and substantial demands to produce results, it is difficult for marketers to justify investing in higher-CPT tactics without metrics that prove those tactics are worthwhile. It isn't enough for those of us in XM to claim and believe that deeper engagements are worthwhile to brands; we have to prove it.

In the iMedia article, Hespos notes how smart marketers conduct what he calls "3D Brand Studies" to measure campaign effectiveness. While getting the study right can be complex, the approach is quite simple--take one control group that hasn't seen your campaign or been part of your marketing experience and compare their brand perceptions or actions against those of an experimental group that has seen your campaign or been exposed to your marketing experience.

If the experimental group has measurably higher brand awareness, brand preference, awareness of key brand attributes, or purchase intent, then you've taken a big step forward in proving the value of the program. Of course, this still doesn't really prove ROI; for example, what is the dollar value of a five-point increase in brand awareness? But even if this approach doesn't result in a definitive measure of ROI, proving real impact on brand criteria is a vastly more powerful thing to deliver to clients than are simple tallies of visitors.
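As a rough sketch of what that comparison looks like in practice, the snippet below runs a standard two-proportion z-test on hypothetical survey counts to check whether the exposed group's awareness lift is larger than chance would explain. The survey numbers and the choice of test are my assumptions, not anything from the iMedia article.

```python
import math

def two_proportion_z(hits_exposed, n_exposed, hits_control, n_control):
    """Two-proportion z-test: is the exposed group's rate reliably higher than the control's?"""
    p_e = hits_exposed / n_exposed
    p_c = hits_control / n_control
    p_pool = (hits_exposed + hits_control) / (n_exposed + n_control)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_exposed + 1 / n_control))
    return (p_e - p_c) / se, p_e - p_c

# Hypothetical survey counts: respondents who say they are aware of the brand.
z, lift = two_proportion_z(hits_exposed=340, n_exposed=1000,   # saw the XM program
                           hits_control=290, n_control=1000)   # control, never exposed
print(f"Awareness lift: {lift:.1%}, z = {z:.2f}")  # 5.0% lift, z ~ 2.4 (> 1.96, significant at 95%)
```

A result like that doesn't put a dollar figure on the lift, but it does show the program moved the needle in a way that raw attendance counts never can.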

While understanding how to measure a program's impact on consumer perceptions is easy, actually getting it done is a challenge for XM. Gathering data from control and experimental groups requires budget and careful planning, but timelines are generally compressed, and the attention of those executing an XM program goes to the exciting consumer experience rather than to the relatively dry subject of measurement.

But if XM is to prove its worth and demonstrate that the experiences we craft for brands and their consumers create real value, we must begin to take measurement much more seriously. XM agencies that understand this will create better programs and stronger value-based relationships with clients, while those that simply count feet and eyeballs will find themselves fighting a losing battle to earn their clients' trust and business.
