Advertising Analytics 2.0


One of our clients, a consumer electronics giant, had long gauged its advertising impact one medium at a time. As most businesses still do, it measured how its TV, print, radio, and online ads each functioned independently to drive sales. The company hadn’t grasped the notion that ads increasingly interact. For instance, a TV spot can prompt a Google search that leads to a click-through on a display ad that, ultimately, ends in a sale. To tease apart how its ads work in concert across media and sales channels, our client recently adopted new, sophisticated data-analytics techniques. The analyses revealed, for example, that TV ate up 85% of the budget in one new-product campaign, whereas YouTube ads—a 6% slice of the budget—were nearly twice as effective at prompting online searches that led to purchases. And search ads, at 4% of the company’s total advertising budget, generated 25% of sales. Armed with those rich findings and the latest predictive analytics, the company reallocated its ad dollars, realizing a 9% lift in sales without spending a penny more on advertising.

That sort of insight represents the holy grail in marketing—knowing precisely how all the moving parts of a campaign collectively drive sales and what happens when you adjust them. Until recently, the picture was fuzzy at best. Media-mix modeling, introduced in the early 1980s, helped marketers link scanner data with advertising and decide how to allocate marketing resources. For about 20 years, everyone gorged on this low-hanging fruit, until the advent of digital marketing in the late 1990s. With the ability to monitor every mouse click, measuring the cause-and-effect relationship between advertising and purchasing became somewhat easier. Marketers started tracking a consumer’s most recent action online—say, a click on a banner ad—and attributing a purchase behavior to it.
Such last-click attribution, combined with a handful of time-honored measurement techniques—consumer surveys, focus groups, and media-mix models—has lulled many marketers into complacency. They mistakenly think they have a handle on how their advertising actually affects behavior and drives revenue. But that approach is backward-looking: It largely treats advertising touch points—in-store and online display ads, TV, radio, direct mail, and so on—as if each works in isolation. Making matters worse, different teams, agencies, and media buyers operate in silos and use different methods of measurement as they compete for the same resources. This still-common practice, which we call swim-lane measurement, explains why marketers often misattribute specific outcomes to their marketing activities and why finance tends to doubt the value of marketing. (See the exhibit “Get Out of Your Swim Lanes.”) As one CFO of a Fortune 200 company told us, “When I add up the ROIs from each of our silos, the company appears twice as big as it actually is.”
Marketers commonly measure the performance of each of their marketing activities as if they work independently of one another—so-called swim-lane measurement. This may result in significant over- or underattribution of advertising revenues, because ads in one medium can exert a powerful influence on, or assist, those in another. Swim-lane measurement ignores those assisted effects. Data analysis of one campaign revealed that swim-lane measurement grossly underestimated the revenues attributable to social-media marketing and display advertising while overestimating PR and paid-search revenue.

Today’s consumers are exposed to an expanding, fragmented array of marketing touch points across media and sales channels. Imagine that while viewing a TV spot for a Toyota Camry, a consumer uses her mobile device to Google “sedans.” Up pops a paid search link for Camry, as well as car reviews. She clicks through to Car and Driver’s website to read some reviews, and while perusing, she notices a display ad from a local dealership but doesn’t click on it. One review contains a link to YouTube videos people have made about their Camrys. On YouTube she also watches Toyota’s clever “Camry Reinvented” Super Bowl ad from eight months earlier. During her commute to work that week she sees a Toyota billboard she hadn’t noticed before and then receives a direct-mail piece from the company offering a time-limited deal. She visits local dealerships’ websites, including those promoted on Car and Driver and in the direct-mail piece, and at last heads to a dealer, where she test-drives the car and buys it.
Toyota’s chief marketing officer should ask two questions: How did this combination of ad exposures interact to influence this consumer? Is Toyota investing the right amounts at the right points in the customer-decision journey to spark her to action?
Data Deluge
Seismic shifts in both technology and consumer behavior during the past decade have produced a granular, virtually infinite record of every action consumers take online. Add to that the oceans of data from DVRs and digital set-top boxes, retail checkout, credit card transactions, call center logs, and myriad other sources, and you find that marketers now have access to a previously unimaginable trove of information about what consumers see and do.
The opportunity is clear, but so is the challenge. As the celebrated statistician and writer Nate Silver put it, “Every day, three times per second, we produce the equivalent of the amount of data that the Library of Congress has in its entire print collection. Most of it is...irrelevant noise. So unless you have good techniques for filtering and processing the information, you’re going to get into trouble.”
In this new world, marketers who stick with traditional analytics 1.0 measurement approaches do so at their peril. Those methods, which look backward a few times a year to correlate sales with a few dozen variables, are dangerously outdated. Many of the world’s biggest multinationals are now deploying analytics 2.0, a set of capabilities that can chew through terabytes of data and hundreds of variables in real time. It allows these companies to create an ultra-high-definition picture of their marketing performance, run scenarios, and change ad strategies on the fly. Enabled by recent exponential leaps in computing power, cloud-based analytics, and cheap data storage, these predictive tools measure the interaction of advertising across media and sales channels, and they identify precisely how exogenous variables (including the broader economy, competitive offerings, and even the weather) affect ad performance. The resulting analyses, put simply, reveal what really works. With these data-driven insights, companies can often maintain their existing budgets yet achieve improvements of 10% to 30% (sometimes more) in marketing performance.

Drawing on the pioneering mathematical models developed by UCLA marketing professor and MarketShare cofounder Dominique Hanssens, our firm provides analytics 2.0 solutions to many large global companies. The models quantify cross-media and cross-channel effects of marketing, as well as direct and indirect effects of all business drivers, and the software employs cloud-computing and big-data capabilities. The cases we present in this article are drawn from our client companies. Numerous other firms—such as VivaKi, Omniture, and DoubleClick—have emerged in recent years to meet the growing demand for advanced analytics.
The Move to 2.0
Powered by the integration of big data, cloud computing, and new analytical methods, analytics 2.0 provides fundamentally new insights into marketing’s effect on revenue. It involves three broad activities: attribution, the process of quantifying the contribution of each element of advertising; optimization, or “war gaming,” the use of predictive-analytics tools to run scenarios for business planning; and allocation, the real-time redistribution of resources across marketing activities according to optimization scenarios. Although those activities are described in this article as sequential steps, in practice they may occur simultaneously; outputs from one activity feed into another iteratively, so the analytics capability continuously improves.
Electronic Arts (EA), one of the world’s largest software gaming companies, creates some of the best-known titles across all gaming platforms, including Madden NFL, Battlefield, and Sims. EA faced challenges that are common in creative industries: high volatility; high-risk, high-reward development cycles; short product life cycles; a premium on creative quality; and a reliance on hit products. Like other creative businesses, EA also relied heavily on intuition in its decision making.
Senior VP of marketing Laura Miele and head of decision sciences Zachery Anderson recognized several years ago that although relying on traditional analytics and instinct in its marketing had served the company adequately, its advertising performance had fallen off. One reason, they surmised, was that the company’s tech-savvy core audience was spending more time online, beyond the reach of EA’s traditional marketing efforts. In that new environment, they wanted to answer questions about a variety of strategic issues, including the company’s investment strategies, marketing activities, cross-media and cross-channel efforts, and the effect of online initiatives on in-store sales.
The company ultimately decided to retool its marketing analytics by applying the attribution, optimization, and allocation framework to its entire game portfolio. EA had been measuring advertising performance using traditional methods such as customer surveys and media-mix models, and it had been attributing year-to-year and title-to-title variations in sales to creativity in advertising and game quality.
In the attribution step, the analytics engine homed in on hundreds of EA’s business drivers, including advertising, reviews, sales data, pricing, game quality, distribution, and online chatter. The exercise uncovered several important facts. First, in-theater advertising, a tactic favored by the organization, was underperforming. Second, the effect on sales from search, digital, and online-video advertising (such as YouTube) was significantly greater than believed. Finally, EA discovered that the “flighting” of its advertising—that is, the timing of campaign tactics and the intervals between them—was suboptimal.
Then EA moved to the optimization phase, war-gaming hundreds of advertising scenarios for collaborative review by people in marketing, finance, operations, and other functions. This optimization process led to an allocation plan, to be executed by EA’s agencies and channel partners, that shifted ad investments from TV to search and online video, as well as a new flighting schedule for the holidays.
Before the analytics were deployed, the campaign for previous versions of Battlefield allocated about 80% of its budget to television and included very little paid search, social media, or online video. The increased budget for Battlefield 3 reflected big shifts in allocation: only 50% to television, with significant spending on both online video and paid search. These changes helped make Battlefield 3 the most successful launch in EA’s history. Shifts in the marketing budget alone accounted for an estimated 23% increase in EA’s sales of Battlefield 3 compared with previous versions of the game.

Attribution. To determine how your advertising activities interact to drive purchases, start by gathering data. Many companies we’ve worked with claim at first that they lack the required data in-house. That is rarely the case. Companies are awash in data, albeit data that are dispersed and, often, unintentionally hidden. Relevant data typically exist within sales, finance, customer service, distribution, and other functions outside marketing.
Knowing what to focus on—the signal rather than the noise—is a critical part of the process. To accurately model their businesses, companies must collect data across five broad categories: market conditions, competitive activities, marketing actions, consumer response, and business outcomes. (See the exhibit “Optimizing Advertising.”)
Statistical models that reveal the effect of advertising on consumer behavior and business results must account for hundreds of variables related to market conditions, marketing actions, and competitive activities. A software analytics engine uses those models to attribute each variable’s effect accurately, to optimize the marketing mix, and to guide spending allocation. Data on consumer response and business outcomes feed back into the engine, allowing marketers to fine-tune their cross-media spending in real time.
With detailed data that parse product sales and advertising metrics by medium and location, sophisticated analytics can reveal the impact of marketing activities across swim lanes—for example, between one medium, say television, and another, social media. We call these indirect effects “assist rates.” Recognizing an assist depends on the ability to track how consumer behavior changes in response to advertising investments and sales activities. To oversimplify a bit: An analysis could pick up a spike in consumers’ click-throughs on an online banner ad after a new TV spot goes live—and link that effect to changes in purchase patterns. This would capture the spot’s “assist” to the banner ad and provide a truer picture of the TV ad’s ROI. More subtly, analytics can reveal the assist effects of ads that consumers don’t actively engage with—showing, for example, a 12% jump in search activity for a product after deployment of a banner ad that only 0.1% of consumers click on.
This insight translates directly to any advertising that consumers encounter but may not specifically act on, including TV ads, social-media placements, PR, online or outdoor displays, mobile ads, and in-store promotions. Think of the billboard ad on our Toyota buyer’s commute. The ad itself probably didn’t cause her to drive to the dealership and purchase a car. But it may have nudged her to look at the direct-mail piece when it arrived, which ultimately inspired the visit to the dealership—a complete customer journey we can now measure. It’s difficult or impossible to quantify such assist effects at an individual level, particularly when they involve off-line ads, so analytics 2.0 works by exposing those effects. It uses a sophisticated series of simultaneous-equation statistical models that reassemble various interrelated effects into a view that accurately explains the market behavior.
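To make the notion of an assist concrete, here is a deliberately simplified sketch in Python. It regresses daily branded-search clicks on the previous day’s TV pressure to estimate a cross-media lift. The data, variable names, and one-day lag are all hypothetical, and real analytics 2.0 systems rely on systems of simultaneous equations over hundreds of such variables rather than a single two-variable regression.

```python
# A minimal, hypothetical sketch of detecting a cross-media "assist":
# does yesterday's TV spend predict today's branded-search clicks?
# Real systems use simultaneous equations over hundreds of variables;
# this is ordinary least squares on just two of them, with made-up data.
import numpy as np

rng = np.random.default_rng(42)
days = 120

tv_grps = rng.gamma(shape=2.0, scale=50.0, size=days)       # daily TV pressure (hypothetical)
baseline_searches = 1_000 + rng.normal(0, 60, size=days)     # organic branded-search volume

# Assume, for illustration, that each GRP aired yesterday adds ~1.5 searches today.
assist_per_grp = 1.5
searches = baseline_searches.copy()
searches[1:] += assist_per_grp * tv_grps[:-1]

# Regress today's searches on yesterday's TV GRPs (one-day lag).
x = tv_grps[:-1]
y = searches[1:]
X = np.column_stack([np.ones_like(x), x])
(intercept, assist_rate), *_ = np.linalg.lstsq(X, y, rcond=None)

print(f"Estimated searches per lagged GRP (assist rate): {assist_rate:.2f}")
print(f"Estimated baseline daily searches: {intercept:.0f}")
```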
The hazards of simplistic swim-lane measurement were personal for a marketing executive at one of our client companies. Early in his career, at a high-profile e-commerce company, the marketing team presented to finance some campaign results that had been generated using traditional analytics methods. Summed across the channel teams, the claimed revenue came to $160 million.

Things quickly became awkward when finance pointed out that the business unit had generated only $110 million in revenue, $50 million short of the reported total. The discrepancy arose because, lacking good data, leaders in each swim lane claimed the same bucket of revenue.

That lesson stuck with this executive as he set out to help solve the industry problem of incorrect attribution. He eventually joined a consumer technology company that has enthusiastically embraced analytics 2.0. There he created an analytics platform to reveal how the company’s advertising and sales force activities interacted.
Examples like these necessarily distill the complexity of analytics 2.0. In actual analyses run by a large company, statistical models may account for hundreds or thousands of permutations of advertising and sales tactics, as well as exogenous variables such as geography, employment rates, pricing, season of the year, competitive offerings, and so on. When you analyze every permutation of an ad campaign according to those variables, the complexity of the task and the necessity for cloud computing and storage become clear. You also realize that such analyses allow you, for example, to instantly see how a new TV ad affects consumers’ online search patterns—and then to change your keyword-search bidding strategy to buy up relevant words as the ad is running. They might also help you identify Facebook’s actual effect on both short-term revenue and long-term brand equity.
Optimization. Once a marketer has quantified the relative contribution of each component of its marketing activities and the influence of important exogenous factors, war gaming is the next step. It involves using predictive-analytics tools to run scenarios for business planning. Maybe you want to know what will happen to your revenue if you cut outdoor display advertising for a certain product line by 10% in San Diego—or if you shift 15% of your product-related TV ad spending to online search and display. Perhaps you need to identify the implications for your advertising if a competitor reduces prices in Tokyo or if fuel prices go up in Sydney.
Working with the vast quantities of data collected and analyzed through the attribution process, you can assign an “elasticity” to every business driver you’ve measured, from TV advertising to search ads to fuel prices and local temperatures. (Elasticity is the ratio of the percentage change in one variable to the percentage change in another.) Knowing the elasticities of your business drivers helps you predict how specific changes you make will influence particular outcomes. If your TV ads’ elasticity in relation to sales is .03, for example, doubling your TV ad budget will yield a 3% lift in sales, when all other variables remain constant. In short, analytics 2.0 modeling reveals how all driver elasticities interact to affect sales. (See the exhibit “How Ads Interact to Boost Sales.”)
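As a minimal illustration of that arithmetic, the sketch below applies the first-order reading of elasticity: a driver’s elasticity times its fractional change, summed across drivers, approximates the fractional change in sales when changes are modest. The elasticity values and driver names are hypothetical, and real models also capture diminishing returns and interactions among drivers.

```python
# First-order elasticity reading: percent change in sales is approximated by
# each driver's elasticity times its fractional change, summed across drivers.
# Elasticities and driver names here are hypothetical.
def predicted_sales_lift(elasticities: dict, spend_changes: dict) -> float:
    """Approximate fractional change in sales for small-to-moderate changes."""
    return sum(elasticities[driver] * change
               for driver, change in spend_changes.items())

elasticities = {"tv": 0.03, "search": 0.07, "online_video": 0.05}

# Doubling the TV budget (a +100% change), all else held constant:
print(predicted_sales_lift(elasticities, {"tv": 1.0}))        # 0.03, i.e. a 3% lift
# A mixed scenario: trim TV by 20%, raise search and online video by 50% each:
print(predicted_sales_lift(elasticities,
                           {"tv": -0.20, "search": 0.50, "online_video": 0.50}))  # 0.054
```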
How Ads Interact to Boost Sales: In this holiday campaign for a consumer electronics product, online searches on the manufacturer’s name spiked in direct response to TV advertising. Analytics revealed that the company could have made better use of cross-media effects on retail traffic: although just 15% of its campaign budget went to digital marketing, digital accounted for 38% of the product’s retail sales. Measuring cross-media, cross-channel effects drove significant reallocation recommendations that ultimately generated 9% more revenue with the same budget.
War gaming uses the actual elasticities of your business drivers to run hundreds or thousands of scenarios within minutes. In a typical war-gaming process, team members define marketing goals (such as a certain revenue target, share goal, or margin goal), often across multiple products and markets. Crunching the vast database of driver elasticities, optimization software generates a set of most-likely scenarios along with marketing recommendations to achieve them. The software also can test specific what-if scenarios: For instance, how will sales of our midsize pickup truck in Denver be affected if gas prices climb 5% and we launch a combined TV and online campaign promoting a $300 rebate?
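A toy version of that scenario sweep, again with hypothetical elasticities and budgets, fits in a few lines. It assumes a multiplicative constant-elasticity response curve, so large reallocations show diminishing returns, and it simply enumerates budget splits under a fixed total; production war-gaming tools search far larger scenario spaces under constraints on reach, frequency, pricing, and channel capacity.

```python
# A toy war-gaming loop: enumerate reallocations of a fixed budget and rank
# them by predicted sales lift. Elasticities and budgets are hypothetical; a
# multiplicative constant-elasticity response curve is assumed so that large
# shifts show diminishing returns.
current_spend = {"tv": 8.0, "search": 1.0, "online_video": 1.0}   # $ millions
elasticities  = {"tv": 0.03, "search": 0.07, "online_video": 0.05}

def predicted_lift(scenario):
    """Predicted fractional sales change under a constant-elasticity curve."""
    lift = 1.0
    for driver, new_spend in scenario.items():
        lift *= (new_spend / current_spend[driver]) ** elasticities[driver]
    return lift - 1.0

total = sum(current_spend.values())
best_score, best_plan = float("-inf"), None
# Sweep TV budgets from $4M to $8M in $0.5M steps; split the remainder
# between search and online video in 10% increments.
for tv in (4.0 + 0.5 * step for step in range(9)):
    remainder = total - tv
    for share in (0.1 * s for s in range(1, 10)):
        plan = {"tv": tv,
                "search": remainder * share,
                "online_video": remainder * (1.0 - share)}
        score = predicted_lift(plan)
        if score > best_score:
            best_score, best_plan = score, plan

print(f"Best predicted lift: {best_score:.1%}")
print("Allocation ($M):", {k: round(v, 2) for k, v in best_plan.items()})
```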
At Ford, marketing communications director Matthew VanDyke leads a cross-functional team involving IT, finance, marketing, and other functions. The group is tasked with optimizing Ford’s $1 billion in advertising spending. Using advanced analytics, the team routinely runs thousands of scenarios involving hundreds of variables to gauge the probable effects of different ad strategies under a range of complex circumstances. The analyses incorporate insights from the attribution step, allowing Ford to predict from one scenario to the next how changes in advertising investment in one medium are likely to affect ad performance in others, and how exogenous factors might influence outcomes.
For example, as consumers’ interest in fuel-efficient vehicles has grown, Ford’s marketing science manager Mike Macri and his team have used war gaming to quickly assess which markets will be receptive to creative messages about fuel efficiency and have redirected advertising resources accordingly via their agency partners. Indeed, these war games are driving several current cross-media campaigns for Ford.
Predictive analytics also allow Ford to war-game changes in media planning and purchasing, both nationally and locally. For instance, it discovered that the company’s overall digital spending, though appropriate, was overemphasizing digital display and underinvesting in search. In addition, before the firm used war-game scenario planning, national and local marketing budgets were treated separately and rarely coordinated. It had been difficult for Ford to determine, for example, how much it should provide in matching funds to dealer groups, whether consumer incentive levels should differ among the various cars and regions in its portfolio, and how boosting social-media spending and reducing traditional media buys would affect sales to young drivers. War gaming allowed Ford to predict how those scenarios would play out before actually making changes. The result: Shifts from the national budget to local budgets have produced tens of millions of dollars in new revenues, with no net change in the total ad budget.

Marketers are also using analytics 2.0 to run what-if scenarios for advertising new-product launches, ad buys in markets where data are limited, and the potential effects of surprise moves by competitors. For instance, as a global consumer electronics company client of ours was preparing to launch a game-changing product in an emerging market where historical sales-marketing data were scarce, it used advanced analytics to review advertising behavior by competitors and accurately predict their spending for upcoming releases. Using those predictions and optimization scenarios, the company successfully entered the market with a much clearer understanding of the strategic landscape and adjusted its plans quickly to address new competitive dynamics.
Allocation. Gone are the days of setting a marketing plan and letting it run its course—the so-called run-and-done approach. As technology, media companies, and media buyers continue to remove friction from the process, advertising has become easier to transact, place, measure, and expand or kill. Marketers can now readily adjust or allocate advertising in different markets on a monthly, weekly, or daily basis—and, online, even from one fraction of a second to the next. Allocation involves putting the results of your attribution and war-gaming efforts into the market, measuring outcomes, validating models (that is, running in-market experiments to confirm the findings of an analysis), and making course corrections.
At one of the world’s largest software companies, senior management realized that it needed more accountability and precision in its marketing, as allocation decisions had historically not been based on scientific analysis. To understand which marketing activities were driving leads to its website, resellers, and retail partners—and thereby generating sales—the marketing leadership team used analytics 2.0 to reveal how all its marketing components interacted.
By using models that ultimately accounted for hundreds of variables, the company quantified the precise combination of ads that most effectively stimulated software trials, which activities by resellers generated the most profits, and how advertising in one product category influenced purchasing in other categories. With those insights, the firm reallocated marketing dollars for its various B2B and B2C products. Shifts between off-line and online spending, as well as investments in brand building, have boosted revenues by millions of dollars incrementally.
This company’s analytics 2.0 system has gained credibility with executive management, is now driving minute-to-minute allocation decisions, and is being rolled out globally. As a result, the firm’s advertising ROI has nearly doubled over the past three years.
Five Steps to Implementation
Analytics, once a back-of-the-house research function, is becoming entwined in daily strategy development and operations. Executives who were pioneering early digital marketing teams 10 years ago are advancing to the CMO office. Already wired for measurement, they are often amazed at the analytics immaturity of the broader advertising industry. These new CMOs are taking more responsibility for technology budgets and are creating a culture of fact-based decision making within advertising. Technology consultancy Gartner estimates that within five years, most CMOs will have a bigger technology budget than chief technology officers do.
Technology is necessary but not sufficient to move an organization to analytics 2.0. In our experience, these initiatives require five steps, which can be implemented even by small companies:
First, embrace analytics 2.0 as an organization-wide effort that must be championed by a C-level executive sponsor. Often, pockets of resistance to new analytics approaches crop up, as they challenge closely held beliefs about what works and what doesn’t. Senior-level buy-in is essential to help promote clarity of vision and alignment in the early stages.


Second, assign an analytics-minded director or manager to become the point person for the effort. It should be someone with strong analytical skills and a reputation for objectivity. This person can report to the CMO or sit on a cross-functional team between marketing and finance. As the project expands, he or she can help guide business planning and resource allocation across units.
Third, armed with a prioritized list of questions you seek to answer, conduct an inventory of data throughout the organization. Intelligence that is essential to successful analytics 2.0 efforts is often buried in many functions beyond marketing, from finance to customer service. Identify and consolidate those disparate data sets and create systems for ongoing collection. Treat the data as you would intellectual property, given its asset value.
Fourth, start small with proofs of concept involving a particular line of business, geography, or product group. Build limited-scope models that aim to achieve early wins.
Fifth, test aggressively and feed the results back into the model. For instance, if your optimization analysis suggests that shifting some ad spending from TV to online display will boost sales, try a small, local experiment and use the results to refine your calculations. In-market testing is old hat—what’s new is getting the cross-media attribution right so that your testing is more effective.
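As one hedged illustration of closing that loop, the sketch below reads the results of such a test with a simple difference-in-differences comparison between test and control markets, then converts the incremental lift into an elasticity estimate to feed back into the model. All figures are hypothetical, and a real in-market test would also control for seasonality, market size, and other factors.

```python
# A minimal difference-in-differences read on an in-market test. All figures
# are hypothetical: test markets shifted part of their TV budget to online
# display; control markets kept the original plan.
test_before, test_after = 4.20, 4.26    # average weekly sales ($M), test markets
ctrl_before, ctrl_after = 3.90, 3.92    # same periods, control markets

test_change = (test_after - test_before) / test_before
ctrl_change = (ctrl_after - ctrl_before) / ctrl_before
observed_lift = test_change - ctrl_change       # lift attributable to the shift

online_spend_change = 0.25                      # online spend rose 25% in test markets
implied_elasticity = observed_lift / online_spend_change

print(f"Observed incremental lift: {observed_lift:.1%}")
print(f"Implied online elasticity to feed back into the model: {implied_elasticity:.3f}")
```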
When businesses have multiple sales channels such as retail, online, value-added resellers, or multiple products and geographies, analytics 2.0 may become more complex than internal teams can handle. That’s when vendors with specific analytics and computing capabilities are needed. But any company can begin the journey and build much of the required infrastructure for analytics—and the culture of adaptive marketing—in-house. The challenge is as much organizational as computational. Either way, the writing is on the wall: Marketing is rapidly becoming a war of knowledge, insight, and asymmetric advantage gained through analytics 2.0. Companies that don’t adopt next-generation analytics will be overtaken by those that do.
Harvard Business Review