Archive for July, 2008

I promised more regarding the future standard of response campaign tracking via multiple attribution so here is the next, albeit short, installment. This post focuses a bit on the devil’s advocate position on the arbitrary nature of current multiple attribution protocol reporting.

I had a chance to speak with Doubleclick last week, who reminded me that their “Exposure to Conversion” custom reports are in fact multiple attribution reports, which have actually been in-market longer than the Atlas Engagement Mapping reporting. The format is very different from that of Atlas, which developed a slick interface to go along with what is essentially a very complex process. Doubleclick provides the data in a more raw format to allow agencies to manage the data sets manually (using Excel, Access, or a data-mining or dashboard tool, for example).

Ultimately Atlas and Doubleclick offer different means to reach similar multiple attribution reporting ends.

Of course, there is no “magic button” to make the multivariate process of attribution simpler. Whether a slick interface or raw data is provided, there are too many variables for any current solution to be deemed accurate. However, I still believe that any multiple attribution reporting has to be better than the last-ad standard. The reality is that accuracy lies somewhere in between the two approaches.

Weighting of criteria such as format, ad size, and even recency is very subjective and lacks the support of empirical data to direct it. Over time the ad servers will crunch aggregate-level data to establish more standards in this regard. However, none of the current models take environment into account. In theory, relevancy of placement (another arbitrary and subjective variable) would have to be included in the equation somehow. For example: if a consumer is exposed to a rich media unit in a non-contextually relevant environment yesterday and a standard ad in a contextually relevant environment today, how would the inter-relationship between relevancy, format, and recency affect influence? In other words, how would these variables be weighted in the multiple attribution protocol equation?

As you can see, there are no simple answers. One of the issues this poses is the level of adoption among an industry struggling to keep up with the data-intensity needs of today, let alone those of tomorrow. To a degree, agencies and marketers are indeed seeking the “magic button” that doesn’t exist.

I still proclaim this to be part of the future standard, even if it takes a couple more years to get there. I look forward to a bright future of multiple attribution reporting, not only for response campaigns, but also for brand-based campaigns where multiple touch points create aggregate influence. We currently lack the necessary insight to prove where that influence is coming from. Roll over, John Wanamaker!

Atlas Advertiser Suite

I am incredibly excited to discuss the first in a series of posts about the topic of the next generation of online media analytics.

Why is there a resounding silence and echo when I say that? Why is the industry not jumping at the opportunity to apply a more sophisticated model to our efforts?

Folks – this is the future standard!

I’m excited! This is one of the biggest improvements in online advertising since the release of centralized post click/post impression tracking 10 years ago!

Bringing More Precision to DR Accountability
I recently had the pleasure of receiving a demo of Atlas Engagement Mapping, which is essentially their application of the “multiple attribution protocol”: providing partial credit for each ad exposed to or engaged with prior to a conversion event. For direct response marketers this is the panacea that further fulfills the promise of accountability that digital media has held out for such a long time.

I have been yearning to apply this type of methodology to the tracking of client campaigns for many years. I sat on a Doubleclick advisory board for about five years and have always wished for the ability to easily mine Doubleclick log files to apply multiple attribution to each conversion. It can be done, by using a third party like Theorem or Blackfoot, but it’s not an easy, turnkey, or affordable process.

Particularly as search came to claim the lion’s share of online ad spending over the last several years, marketers have been seeking better analytics on the inter-relationship between search and display ads, while agencies have been in desperate need of a deeper ability to generate insight that leads to more intelligent optimization. The industry data just scratches the surface on that one…until now!

Atlas beat Doubleclick to the punch and has set the ball in motion, and all I can say is – “thank you for forcing the industry to progress”!

According to Atlas research, due to the “last-click standard” we have been using to date, we have been ignoring 94% of the “engagement touch points” (influence) before each online conversion, and 66% of converters are exposed to ads on multiple sites before the conversion occurs. Up until now we have only been giving credit to the proverbial straw that broke the camel’s back, if you will.
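To make the contrast concrete, here is a minimal sketch of my own (not Atlas’s actual model, and the site names are purely hypothetical) comparing the last-click standard with a naive equal-credit multi-touch split for a single conversion path:

```python
# Hypothetical illustration: one converter saw three ads before converting.
# Under the last-click standard, the final touch gets 100% of the credit;
# a naive multi-touch split divides the credit evenly across all touches.

def last_click_credit(touchpoints):
    """Give 100% of the conversion credit to the final touchpoint."""
    return {tp: (100.0 if i == len(touchpoints) - 1 else 0.0)
            for i, tp in enumerate(touchpoints)}

def equal_credit(touchpoints):
    """Split the conversion credit evenly across every touchpoint."""
    share = 100.0 / len(touchpoints)
    return {tp: share for tp in touchpoints}

path = ["display_site_A", "display_site_B", "paid_search_click"]

print(last_click_credit(path))  # search gets everything
print(equal_credit(path))       # each touch gets a third
```

Even this crude equal split illustrates the 94% point above: under last-click, every display exposure in the path is reported as worthless.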

Engagement Mapping Visualizer

Atlas Engagement Mapping allows the agency to apply weighting to criteria such as recency, ad format, and ad size, and to change the weighting for active (engagement) versus passive (exposure) variables. All of this adds up to a conversion credit percentage for each touchpoint.
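As a rough sketch of how such a weighting scheme might work (the weight values, the half-life decay, and the field names here are my own assumptions, not Atlas’s actual formula), each touchpoint earns a score from its subjective criteria weights and a recency decay, and the scores are then normalized into conversion credit percentages:

```python
# Hypothetical engagement-mapping sketch. The weights are subjective inputs,
# which is exactly the kind of arbitrary judgment call discussed earlier.
FORMAT_WEIGHTS = {"rich_media": 2.0, "standard_banner": 1.0, "search": 1.5}
ACTION_WEIGHTS = {"engagement": 2.0, "exposure": 1.0}  # active vs. passive
RECENCY_HALF_LIFE_DAYS = 7.0  # assumed decay: older touches count for less

def touch_score(fmt, action, days_before_conversion):
    """Score one touchpoint from its format, action type, and recency."""
    decay = 0.5 ** (days_before_conversion / RECENCY_HALF_LIFE_DAYS)
    return FORMAT_WEIGHTS[fmt] * ACTION_WEIGHTS[action] * decay

def conversion_credit(touches):
    """Normalize touchpoint scores into credit percentages summing to 100."""
    scores = [touch_score(*t) for t in touches]
    total = sum(scores)
    return [100.0 * s / total for s in scores]

# Each touch: (format, active/passive, days before the conversion)
path = [("rich_media", "exposure", 10.0),
        ("standard_banner", "engagement", 3.0),
        ("search", "engagement", 0.0)]

credits = conversion_credit(path)
print([round(c, 1) for c in credits])  # the recent search engagement earns the largest share
```

Note that every constant in this sketch is an editorial choice; change the half-life or the format weights and the credit split moves, which is precisely why the current protocols feel arbitrary.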

So why has Doubleclick not formally released a similar product? I know this has been on their radar for a number of years now. I wonder what the hit to search marketing credit, and the potential revenue impact for [parent] Google, would be. In fact, in the case study released by Atlas, search took a 60% hit in conversion credit compared to the current last-click standard. But Doubleclick will not sit idly by as this evolution creates a big value difference between the analytics provided by DART and those provided by Atlas. At any given time, either of these two leading ad serving and analytics systems offers one or two features that the other doesn’t, and it’s normally not a “make or break” feature. However, this new development is a game changer, and I look forward to seeing the far-reaching ramifications.

Note: I did reach out to Doubleclick but have yet to receive their official position on where they stand with their application of the Multiple Attribution Protocol. I will update the blog with that info as soon as I have a chance to chat with my friends at Doubleclick.

Not A Perfect World, But More Perfect Than Before
Of course, there is no such thing as a perfect closed-loop tracking system. Will MAP be the solution to all of our problems? No, of course not. The attribution of a subjective percentage of post-impression conversions, as it relates to isolating online media from other media, has been questioned for years, and this argument will re-emerge and take new shape. But multiple attribution tracking brings us a lot closer to the complete picture than we have been up until now, at least within the confines of the digital ecosystem.

A big question I have been asking myself relates to frequency. We have learned as an industry that high frequencies are not necessary to drive response, conversion, or branding effectiveness. This fact has been an indication of digital media’s role in an integrated media mix, as well as of the need for relevancy within the context of digital media being intent-driven. But it also prompts the following thought…

Based on the last-ad standard, we have been able to report on aggregate frequency’s impact on DR and branding metrics for years, and lower frequencies have been more efficient and effective (prior to hitting a reach saturation point); regardless of attribution, the whole is still the sum of its parts. Even at an optimal frequency of 2 or 3, multiple attribution testing will have a huge impact on conversion credit reporting. Therefore, although powerful for display ads alone, as I mentioned earlier, the mac daddy application here, if you will, is providing insight into the manner in which search and display work cooperatively, and the resulting attribution shifts.

Next Stop, Brandville
When this type of systematic approach to multiple attribution credit is overlaid onto branding effectiveness data such as Dynamic Logic’s, we will see another seismic shift in our ability to home in on what works best for any particular client. Imagine partial attribution of increased awareness, brand favorability, and purchase intent, a dynamic that nobody denies.

A Widening Digital Divide Among Marketers?
If you’re working in media planning, buying & management, I would definitely advise you to check out the demo of the new Atlas Engagement Mapping visualizer. If you’re an Atlas client already, give your rep a call and try it out. If you’re a Doubleclick client we’ll have more news for you soon, but you can always inquire yourself. If you are not using either system you may be out of luck for now, but eventually this will have to become the standard for everyone, or the industry will continue to polarize the “data haves” and the “data have nots”, which is not a pretty picture. The press has used the term “digital divide” to describe consumers with or without access to the internet. There is a greater digital divide between small-to-mid-sized businesses and the larger businesses that can afford the tools and data to glean insights that make running a business that much more efficient and profitable. The reality is that data storage and bandwidth costs have dropped significantly, and there is no reason why these tools cannot one day become affordable for all marketers. Sure, certain features will be set aside for the “data elite” who can afford them, but ultimately it is important that most marketers have access to these sophisticated tools.

Stay tuned for more on this topic!

As the shift of strategy development and stewardship continues to move into the realm of the media agencies, IPG has launched a group to manage and oversee all of the media services groups within the holding company. This move gives IPG the equivalent of WPP’s GroupM, a hub that creates efficiency through resource and information sharing. This is an essential move in this era of increasingly labor intensive, complex and collaborative client service requirements. The next step will be data systems that finally bind these units together from an analytic and insight perspective, across media. The eventual dashboards from each media services conglomerate will be the value center of the organization.

“The moves we are making today are part of an ongoing evolution in our approach to media as an increasingly strategic and high-value marketing service,” said Michael I. Roth, IPG’s Chairman and CEO. “The creation of ‘Mediabrands’ will allow our media companies to share and leverage resources, as required to meet the needs of our clients in a highly complex and rapidly-changing media landscape that’s being transformed by digital and the proliferation of content and media platforms.”…

The acquisitions and vital components of the “new agency” structure are being aggregated by each of the holding companies. So far it seems that WPP has the most momentum in the process, but ultimately this will be a focal area of competition among the agency holding company heavyweights as they aim to continue to serve the largest clients in the ecosystem.

I decided to add a new posting format to TheDigitalBlur. The “Digital Marketing Round-Up” will be posted around the end of each month and will be a combination of short thoughts on issues that I feel will have a big impact on us marketers in the not-so-distant future. These range from acquisitions and company restructurings to new applications of technology and new ad programs. I hope you enjoy it!

So without further delay, the inaugural Digital Marketing Round-Up for June 2008…

Google Applying Cookie Data: Despite the cries of privacy advocates, this can be a major breakthrough in online advertising. A few years ago Google changed its privacy policy to state that they might eventually use cookie data to “display customized content and advertising.” Apparently a securities analyst has discovered that they are indeed doing so, and this was confirmed by Google. Well, I certainly hope so!

I am waiting for the true integration of the Google and Doubleclick units, and although this will present a fine privacy line as it relates to the personally identifiable data that Google does indeed have via Gmail, etc., there should be an easy way of firewalling that data if need be. We live in a data-driven world, folks. This is the future of content and marketing distribution. Creating increased relevancy for the consumer is a good thing. I have posted many thoughts on this matter, and I expect that we will get past the perceived privacy issues as we have with every other aspect of digital marketing to date. Doubleclick has been the martyr of at least one round of this issue in the past. Relevancy is a benefit; I wish we could all just get over it and move on.

Microsoft Acquires Semantic Search Technology: After the failed attempt at acquiring Yahoo, Microsoft last week announced the acquisition of semantic search company Powerset. Of course this was in the works for a long time, but the timing of the announcement was classic. Does Microsoft + Powerset = a threat to Google? Not in a million years. The momentum of Google’s stronghold on search is going to be tough to beat, or even compete with, as Yahoo and Microsoft have both learned the hard way to date. But the advances in semantic technology will in theory make for better search experiences over time, and this is Microsoft’s first step toward developing a better search mousetrap, or at least improving the existing one. I’ve reported previously about Yahoo adopting semantic web standards, and have predicted that the application of semantic technology will fuel the next evolution of the web itself. In the increasingly data-driven world we live in, I fiercely stand by that prediction.

Nokia Acquires Remaining Part of Symbian: It’s no secret that consumers’ and marketers’ dependence on the carriers for on-deck mobile opportunities will change over the next few years. Nokia has been making headway in the mobile advertising space, and the acquisition of Symbian should prove to be part of paving the road to the golden goose. Symbian currently runs on over half of the smart phones in the global market. However, with Apple’s iPhone and the soon-to-be-rolled-out open platform “Android” from Google, Symbian’s market share can be eroded quite easily. By standardizing an open platform, Nokia should be able to entice additional development and remain a major player in the mobile OS world.

More Print Shift To The Web: The LA Times slashed 250 jobs last week; the finding: consumers don’t have the time to read the paper anymore. Editor Russ Stanton stated that “The Web and print departments will be merged into one operation with a single budget,” and that the company will also refocus on being more versatile. We’ve heard these sentiments before, and we’ll hear them again from others.

Average TV Network Viewer Age = 50 Years Old: Of course this varies from network to network (CW’s median age is only 34), but the trend shows that TV viewing audiences are getting older as media continues to fragment. It’s a brave new world out there, and as digital media consumption increases, we need to solve some of the basic issues that have plagued our industry since the dawn of online marketing history, including establishing more industry-level research and data on the correlation between various aspects of advertising and effectiveness, as well as educating marketers about digital measurement in general. It still boggles my mind how many marketers (and agencies, for that matter) misalign their KPIs (key performance indicators) with their objectives, or choose to use irrelevant metrics like CTR. There’s a lot of experimentation happening with emerging media, and most have not mastered the basics yet. A year has passed since I published an article in MediaPost on this very subject, and on an industry level I haven’t seen or heard of much change.

MySpace & Facebook – Battle of The Redesigns: Facebook is quickly catching up to MySpace’s market dominance, in part due to its open platform for developers and the streamlined nature of its profile design and application of the social graph. With Facebook’s upcoming redesign, applications will be moving to a separate tab, and the news feed will become even more prominent than it is currently. This is a big change amid marketers’ experimentation, which revolves primarily around launching applications and subsequently trying to foster participation. Meanwhile, MySpace rolled out a redesign a few weeks ago, primarily focused on streamlining the chaotic mess of a structure that consumer profiles had become. Cleaner navigation and increased application of the social graph have been Facebook’s strong point and MySpace’s Achilles heel. MySpace had no choice but to update, and ‘they done good’. Even though MySpace is a leader today, there is always the chance of it being displaced, as we have seen with other social networks like Friendster.

Publicis Consolidates and Creates Vivaki: Next among the big agencies to announce the consolidation of digital assets is Publicis. WPP and Carat have already done so in varying capacities, and inevitably all the others will follow suit soon enough. Note to David Kenny & Jack Klues: the first step to proving that Vivaki is the right digital solution is following best practices. That 10-second Flash intro on the new Vivaki website needs to go! Rishad, same to you, buddy, on the Denuo site.

This is a topic near and dear to my heart, and I often write about the morphing agency structure. The fragmentation of media and the shift to a data-driven marketplace have moved general marketing strategy from the creative agencies to the media agencies. Many of the holding companies have even developed units that specialize in the development and stewardship of strategy. We will continue to see re-bundling of agency services, although to a degree the specialist is needed more than ever. Agencies must attract and recruit specialized individuals to ensure proficient execution across an ever-growing palette of channels. As an additional trend lately, we have seen many senior digital agency execs moving to the client and publisher side. Integration of services to offer a big-picture approach while maintaining proficiency in the specialties will be the new agency positioning.

Social Media As A Formal Discipline?: As the opportunity cost of not monitoring the conversations and interactions surrounding your brands and products increases, the roles of full-time Social Media Strategist and Community Manager have crept into recent rounds of recruitment for marketers and agencies alike. The required commitment to the social media ecosystem has made it apparent that the attention of at least one full-time staffer on the agency or client side is going to be a requirement at some point for all brands. Although brands can have their agencies assign a full-time person to their brand (today there are many specialized and integrated agencies who offer social marketing services), there is an economic reality that brands may be best served handling this internally, with support from agencies for specific tasks and projects. It’s far too early to tell, but if I were a major brand I’d be looking for an internal manager at this point. The costs of the monitoring tools are coming down, and the players are becoming more diverse. The social media ecosystem is evolving before our eyes, and it’s a lot to keep up with. Brands must commit to being committed: hire a social media manager, or at least an agency that can help you wrap your arms around what’s happening in social media and what it means to your brand.