Archive for the ‘Data Driven’ Category

Hey direct marketers – guess what – you have an attribution problem. You may not realize it yet, but if you are running campaigns in multiple digital marketing channels like search, display, affiliate marketing, and comparison shopping engines, you are double counting some conversions (sometimes triple counting), over-crediting certain channels and under-crediting others. Your model needs a fixin’.
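To make the double counting concrete, here’s a toy sketch (hypothetical channels and numbers, not anyone’s real data) of how channel-siloed, last-touch reporting inflates totals – each channel’s own tracking claims full credit for every conversion it touched:

```python
# One record per converting user: the ordered list of channels that touched them.
conversion_paths = [
    ["display", "search"],                  # saw a banner, then clicked a search ad
    ["affiliate", "search"],
    ["comparison", "display", "search"],
    ["search"],
]

# Channel-siloed reporting: every channel that touched the user counts the conversion.
claimed = {}
for path in conversion_paths:
    for channel in set(path):
        claimed[channel] = claimed.get(channel, 0) + 1

actual_conversions = len(conversion_paths)   # what really happened: 4 sales
total_claimed = sum(claimed.values())        # what the channel reports add up to: 8

print(actual_conversions, total_claimed)     # 4 8 -- double (and triple) counting
```

Four real conversions become eight on paper, with search claiming all four – exactly the over-crediting problem an attribution tool is meant to surface.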

Attribution reporting tools have been on the market for several years now, yet they remain among the most underutilized tools available to agencies and advertisers today. In part this is due to the complexity of the planning involved in making the data actionable – in other words, planning to go from attribution reporting to media mix modeling based on the analysis. At the same time, digital media planners and agencies have been worked to the bone and have very little time to innovate and discover new and better ways to work and serve their clients.

So when Hollis Thomases from ClickZ reached out to me for input on this important topic, I was more than happy to help. It’s a topic I am quite passionate about. I think she put together a great article that highlights the issue.

Under-utilizing great marketing technology is nothing new. As an industry we have a history of under-utilizing the capabilities of the marketing technologies that are at our fingertips. I’ll never forget when DoubleClick wanted to shut down their Boomerang product, the very first retargeting technology on the market. Our agency was one of a small handful using it, and although circa 2000 it was tough to scale retargeting programs, we managed to run some of the first successful programs for large portfolio companies like Cendant. Fast forward to today. Retargeting has not only become one of the most popular forms of behavioral / data-based targeting, but an entire retargeting industry has sprouted up in the last couple of years.

Granted, there are hundreds of marketing technologies and capabilities pitched to agencies on a regular basis. It’s hard to find the time to vet them all. But we do work in DIGITAL media, and it is important to be on the lookout for tools and systems that can deliver bottom-line impact for clients and agencies. Embrace marketing technologies in particular that create efficiency in workflow, accuracy in analytics, and better experiences for consumers.

I’d love to hear about any new marketing technology tools that you find useful.

 

If you enjoyed this post, please consider adding a comment, subscribing to post updates via email or subscribing to the RSS feed. Thanks.


I posted a short piece on the latest Nielsen product in the Laredo Group newsletter in response to many agencies wondering what Nielsen’s new Campaign Ratings system means for them. Here’s the answer…

Mention online GRPs to a group of marketers or agencies and you’ll get reactions ranging from relief to rage. The notion of using the traditional media metric of the GRP (technically the TRP) has been the source of much debate, and in the crosshairs of the industry’s leading media measurement companies, Nielsen and comScore, for years.

Let’s remember that the GRP is used in two facets of advertising:

A)  predicting the outcome of specific levels of media weight during planning; and

B)  confirming ratings after campaigns have run

The launch of Nielsen Campaign Ratings is big news, and the tool focuses on the latter measurement.

Nielsen will be working with large web publishers, including Facebook, who will provide anonymous aggregate reach and frequency data in age and gender buckets, which will be combined with Nielsen data from its TV and online panels, resulting in a single report showing R/F and GRPs for specific campaigns. Quantcast had tried to plant the same stake in the ground, from a solely digital perspective, and with a vastly different methodology. Nielsen takes a giant leap forward by partnering with large publishers, and combining the reports with the industry standard Nielsen TV panel.

What Nielsen created here is most valuable to brand advertisers who are trying to maximize R/F against specific audiences across a media mix. The big caveat is that this only uses broad age and gender buckets (and falls short of all the wonderful psychographic profile data used in other tools like @plan) – but that normalizes against the TV targets of most brand advertisers.
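For readers newer to the metric, the underlying arithmetic is simple – GRPs are reach (as a percentage of the target) times average frequency. A quick sketch with made-up campaign numbers (this is the textbook definition, not Nielsen’s methodology):

```python
# Back-of-envelope GRP math: GRPs = reach (% of target) x average frequency.
# Equivalently, 100 x impressions / target population.

def grps(impressions, people_reached, target_population):
    """GRPs for one campaign against one demo bucket."""
    reach_pct = 100.0 * people_reached / target_population
    avg_frequency = impressions / people_reached
    return reach_pct * avg_frequency

# Hypothetical target of 100MM people:
tv_grps = grps(impressions=240_000_000, people_reached=60_000_000,
               target_population=100_000_000)      # 60% reach x 4.0 freq = 240 GRPs
online_grps = grps(impressions=50_000_000, people_reached=25_000_000,
                   target_population=100_000_000)  # 25% reach x 2.0 freq = 50 GRPs

print(tv_grps, online_grps)  # 240.0 50.0
```

The value of a cross-media report is that both numbers land on the same scale, so a planner can finally compare the TV and online legs of a campaign side by side.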

With the difference in media currency between digital and traditional media, and previous attempts at comparing cost per point, we can expect to see online buys showing less GRP coverage than planned. In some instances this will result in an increase in digital budget allocation to close the gap, and in other cases, a decrease in budgets due to inefficiency in reaching certain targets as compared to other media. It is generally accepted that a diverse media mix will maximize the realization of R/F goals, but the ideal mix model is not an easy nut to crack.

Nielsen’s new Campaign Ratings system definitely represents progress in cross-media R/F and GRP measurement. However, in the grand scheme of the GRP conversation, I’d argue that the predictive side – planning with GRPs – is more important.

Official announcement.

The Privacy Man cometh. Now it’s time to figure out who is going to payeth!

The industry has formally taken a stance to thwart the strong arm of the FTC by enforcing compliance with self-regulatory guidelines on data collection and usage. The Digital Advertising Alliance, which is comprised of the leading advertising trade organizations – the AAAA, AAF, ANA, DMA, IAB, NAI and the BBB – has selected Better Advertising’s monitoring technology to help enforce compliance. Enforcement is said to begin January 1, 2011, and the enforcer will actually be the BBB. I’ll also add that this initiative is in its very nascent stages and surely will continue to be shaped by adoption and the economics of the process. AdSafe Media and TRUSTe have partnered to become the second provider of compliance enforcement, and will be applying for the same accreditation that Better Advertising received from the DAA. You can bet that a handful of others will enter the space as it grows. This is a good thing. We need multiple options for healthy competition, as well as many minds working to keep the FTC from passing ‘baby & the bathwater’ types of regulations for online advertising targeting. However, it is notable that Better Advertising is the only company focusing solely on this.

Compliance is Everyone’s Responsibility, Sort Of

The onus of compliance is going to fall more on the publisher/network/DSP side than on advertisers. When placing behaviorally targeted buys on networks, exchanges, DSPs, or even sites directly, an advertiser can license the icon via the publisher or choose to use their own, so to speak. The icon overlay can be served via the publisher’s account over the advertiser’s third-party-served ad without any technical implementation by the advertiser, or the advertiser can work directly with Better Advertising (and, at some point in the future, a provider of choice). Of course the cost always comes back to the advertiser somehow, and adding more line items of marketing technology fees is not something advertisers embrace quickly. So it is too early to tell whether standard practice will make the advertiser or the publisher responsible for compliance. However, the big agency holding companies, along with the ad networks and publishers, have all bought in and are slowly ramping up delivery of BT ads with the “Advertising Options Icon”, which provides consumers with disclosure regarding data collection and usage, and the ability to opt out of specific targeting. Currently, when consumers opt out of targeting they are opting out at the data/targeting provider level, not at the advertiser level.

The Cost of Compliance Enforcement

Publishers and advertisers who are compliant will be able to license the use of the icon for $5,000 per year (the fee is waived if annual BT revenue is less than $2MM). This fee helps fund the DAA and its enforcement of compliance. Better Advertising is paid a nominal CPM for the service, which consists of delivering a JavaScript overlay of the icon along with the disclosure functionality, compliance monitoring, and opt-out facilitation. They also offer additional reporting services for additional fees. Essentially, the bigger media companies, networks and agencies will subsidize the early stages of these initiatives by adopting and paying for the technology so that eventually the costs for everyone can come down with volume.

Challenge: The industry will have to fork out millions of dollars for this.

A new planning task will be mapping out where compliance is necessary, in which case the icon needs to be visible, and where it is not. Note to agencies – there may be an audit trail requirement here to ensure that you are not paying for enforcement on non-BT-targeted ads. Nominal CPM or not, it adds up, just like ad serving or ad verification. Ideally it would be nice to have this built into the ad server – but I can say that about so many things! Assume that compliance and the use of the icon will be part of media terms & conditions in the not-too-distant future.

One of the elephants in the room is the slightly ambiguous definition of compliance – or, more accurately, of what compliance is ensuring. The following are the DAA’s Self-Regulatory Principles:

The Education Principle calls for organizations to participate in efforts to educate individuals and businesses about online behavioral advertising and the Principles.

The Transparency Principle calls for clearer and easily accessible disclosures to consumers about data collection and use practices associated with online behavioral advertising. It will result in new, enhanced notice on the page where data is collected through links embedded in or around advertisements, or on the Web page itself.

The Consumer Control Principle provides consumers with an expanded ability to choose whether data is collected and used for online behavioral advertising purposes. This choice will be available through a link from the notice provided on the Web page where data is collected.

The Consumer Control Principle also requires “service providers”, a term that includes Internet access service providers and providers of desktop application software such as Web browser “toolbars”, to obtain the consent of users before engaging in online behavioral advertising, and to take steps to de-identify the data used for such purposes.

The Data Security Principle calls for organizations to provide appropriate security for, and limited retention of, data collected and used for online behavioral advertising purposes.

The Material Changes Principle calls for obtaining consumer consent before a Material Change is made to an entity’s Online Behavioral Advertising data collection and use policies unless that change will result in less collection or use of data.

The Sensitive Data Principle recognizes that data collected from children and used for online behavioral advertising merits heightened protection, and requires parental consent for behavioral advertising to consumers known to be under 13 on child-directed Web sites. This Principle also provides heightened protections to certain health and financial data when attributable to a specific individual.

The Accountability Principle calls for development of programs to further advance these Principles, including programs to monitor and report instances of uncorrected non-compliance with these Principles to appropriate government agencies. The CBBB and DMA have been asked and agreed to work cooperatively to establish accountability mechanisms under the Principles.

It’s a Big Job But Somebody’s Got to Do It

Can an amalgamation of a number of industry trade groups that historically have been involved in neither technology nor enforcement keep the FTC satisfied? We’d better hope so.

Personally I feel that a lot of it has to do with the economics behind the process. Can the DAA generate sufficient revenue to properly resource enforcement? Will the industry accept these costs in stride? Do we all understand the alternative?


 

Today Ad Age reported that Best Buy is shifting more advertising dollars to TV this holiday season.

The consumer electronics giant wouldn’t give exact figures, but it is increasing its spending by a low double-digit percentage. In 2009, it spent $150 million on TV advertising, according to Kantar; network TV ads accounted for $65 million of that figure. To free up funds for TV, the retailer is pulling money away from inserts and trimming distribution in parts of the country where newspaper readership has suffered.

Quoting a Best Buy exec: “When you have big budgets like we do, a 5% to 10% improvement is a big deal.”

What’s Good For The Goose…

Believe it or not, as a digital strategist I am actually thrilled to see this move and think it is wholeheartedly the right one – because it is supported by sophisticated media mix models that predict the impact and outcome of the media investments. Make no mistake about it – media mix econometric modeling is neither simple nor absolute, but more companies need to attempt to crack this nut. Unfortunately, today a lot of media investment is based on intuition and debate under the guise of collaborative channel planning, rather than on a systematic approach to modeling a mix. And by the way – you don’t need a nine-figure budget to take a deeper look at the way your media works together. You may not be able to reach the sophistication of a comprehensive econometric model, but there are many different ways to analyze your data. It starts with the desire to do so and a willingness to step outside your comfort zone – because trust me, that’s where you’ll find yourself very quickly.
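To show how low the barrier to entry really is, here’s a deliberately tiny sketch of the idea – attributing sales lift to channel spend with basic algebra. The numbers are invented, and a real econometric model adds adstock decay, diminishing returns, seasonality and much more; this is only the first baby step:

```python
# Toy mix model: assume sales = a*tv_spend + b*digital_spend, then solve
# for a and b exactly from two weeks of (hypothetical) observations
# using Cramer's rule for a 2-unknown linear system.

def solve_2x2(a11, a12, b1, a21, a22, b2):
    """Solve a11*x + a12*y = b1 and a21*x + a22*y = b2."""
    det = a11 * a22 - a12 * a21
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y

# week:     tv $MM   digital $MM   sales $MM
# week 1:     5          1            27
# week 2:     6          2            34
tv_lift, digital_lift = solve_2x2(5, 1, 27, 6, 2, 34)
print(tv_lift, digital_lift)  # 5.0 2.0 -- here each TV $MM maps to ~$5MM in sales
```

Real models regress over many more weeks (and channels) and never fit this cleanly, but the exercise of even writing down the equation forces the cross-channel conversation.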

Based on the model’s recommendations, Best Buy has also tweaked its digital spending, putting more money into display advertising. And it’s also considering putting more money into events. Mr. Panayiotou said that the model made other suggestions, which the retailer is still evaluating. His team is looking at everything from events to the loyalty program, digital to online search.

The fact of the matter is that all media spend, whether direct-response or branding focused, has the same objective – to influence consumers and sell product. The primary difference is where in the sales cycle you reach a consumer and how long it takes to influence the sale. This is extremely oversimplified, but a fact nonetheless.

Media Investment Predictability

The beauty of the concept of GRPs is that a historical level of media weight could somewhat reliably predict the business outcome in the market. One of the challenges of digital media for large brand advertisers is that, unlike with traditional media, it’s hard to predict the in-market outcome of ad spending. To a degree this is because of the small budget allocations to digital, but it is also due to differences in media currency and the lack of significant corollary research on investment impact. Many brands believe in the power of digital media, but most have yet to quantify the marginal increase to their businesses as media dollars shift between traditional and digital media. We can talk ad nauseam about how digital is an essential part of the mix (and it is!) – but as an industry we must do a better job of proving it.

Digital Media is Growing Up

Of course, within the digital ecosystem there is significant evolution all around us. The market is still dominated by direct response marketers – and supports these efforts at scale. Most DR marketers have yet to hit a point of diminishing returns and the market is evolving to push that point even further. Even within this space most agencies and marketers fail to use available tools like attribution reporting to properly model a digital mix and prevent duplicate tracking and over-crediting of activation channels like search and retargeting – a huge issue that plagues every multi-channel digital marketer, particularly retailers, whether they take the time to realize it or not.

As marketers get savvier about the word “accountability” not equating to “direct response”, we will see more branding dollars shifting to the web. But this won’t happen until every company has a champion to drive modeling that incorporates and measures digital in a more intelligent fashion than is happening today, where we use disconnected proxy metrics and great salesmanship to feed brands’ (and often our own) desire to believe in digital. DO believe – digital media IS integral to your or your clients’ businesses. But take a systematic approach to how everything works together, because it’s only becoming more fragmented and complicated. CMOs today have a tough job, and they are dropping like flies, with an average tenure of less than two years. Maybe the role of the CMO needs a fundamental shift. Maybe the transformation is underway already. Enter the era of the “Chief Modeling Officer”.


The increase in buzz and actual growth of the Demand Side Platform (DSP) / Real Time Bidding (RTB) market is not news anymore. The trend of separating audience profiles from media and empowering media buyers to bid on specific audience profiles across large exchanges of media inventory is a hot topic of conversation, and rightfully so. But of course with any growing trend, it is essential to take the time to identify which players provide true and unique value propositions to the marketplace. Beware of impostors trying to capitalize on the hype rather than helping to perfect the concept of what a DSP facilitates.

In Theory, Practice & Theory are The Same – In Practice They Are Not

In theory, each agency or media buyer needs only one DSP to bid into the entire exchange and second channel media ecosystem, with all the data plugins available at their disposal. The market would have a high degree of bid density (a lot of actual demand side activity) and liquidity (stable supply of replicable “inventory” that establishes and holds its value – which of course is an entire issue in and of itself). Of course, we don’t live in a perfect world, yet. Neither bid density nor an ability to value inventory properly exists in the RTB marketplace.

Give Me a D…

Adding the acronym DSP to your product offering gets more feet in more doors today, so we will see many large networks and new players adding “DSP” to their offerings. However, many of these new platforms are limited to specific network inventory (albeit large amounts of it), static data profiles or targeting options (albeit fairly sophisticated ones), and sometimes lack the total transparency that savvier buyers have come to expect from a true DSP (albeit some are willing to work on a CPA basis, so sometimes the buyer doesn’t care). A “true DSP” is one that can bid in real time into the entire exchange and second channel ecosystem, works with all or at least most of the data providers, and maintains total transparency on media and pricing. The holding-company-level media agencies have all either developed their own or white-labeled AppNexus or other third-party technologies. Much like with ad servers, as the market evolves your agency or in-house buyers will only work with one DSP (or maybe we will start calling them real-time bidding engines at some point?) – or at least a primary DSP. Speaking of ad servers – I predict that ultimately Google (DART) and Microsoft (Atlas) will be the two leading DSPs on the market (although MediaMind will be a third major player, particularly with the impending IPO). This will happen through acquisition, and the first in the category was Invite Media – check one off for Google. Some of the other current acquisition contenders include DataXu, X+1, MediaMath, and AppNexus, with new players claiming market entry seemingly monthly. The sophistication of advanced optimization engines should soon become a key point of differentiation between companies.

Wanted: A Stable Market

Imagine a series of interconnected Venn diagrams, where the overlapping areas represent consumers who satisfy multiple advertisers’ criteria. These criteria are compiled from data points supplied by several data providers, all integrated into your DSP and selectable from an intuitive interface. Every available impression in the exchange is assessed in real time by every DSP on the market, and multiple bids from the appropriate advertisers within each DSP are all processed in real time. The buyers with the highest bids get the inventory. All of this bidding happens in real time – billions of times per day. Sound familiar? The bidding part at least? Google built the biggest cash cow in our industry on a similar model – using far less data and sans cross-category competition for the same consumer.
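For the technically curious, the per-impression auction loop described above can be sketched in a few lines. I’m assuming second-price mechanics here (highest bid wins, winner pays the runner-up’s price), and the advertisers and bids are entirely hypothetical:

```python
# Sketch of a single-impression, second-price RTB auction.

def run_auction(bids):
    """bids: list of (advertiser, cpm_bid). Returns (winner, clearing_cpm)."""
    if len(bids) < 2:
        raise ValueError("need at least two bids for a second-price auction")
    ranked = sorted(bids, key=lambda b: b[1], reverse=True)
    winner, _ = ranked[0]
    _, clearing_cpm = ranked[1]   # winner pays the second-highest bid
    return winner, clearing_cpm

# One impression; bids collected across DSPs for different advertiser categories.
bids = [("retailer_A", 4.50), ("auto_B", 3.75), ("cpg_C", 2.10)]
print(run_auction(bids))  # ('retailer_A', 3.75)
```

Now imagine this exact loop firing billions of times a day, with each bid itself computed from layered audience data – that’s the scale problem the DSP/exchange ecosystem is racing to solve.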

Until there is a higher level of bid density and inventory availability, the marketplace will not be ripe for all advertisers and will favor select categories, and not all publishers will provide inventory into the RTB marketplace. It’s the classic chicken-and-egg problem. Hence some of the non-DSP DSPs.

The Opposite of DSP

Publishers, on the other hand, utilize yield optimizers to interface with the DSPs, manage inventory and data relationships, handle real-time bidding and maximize the revenue generated. Companies such as Rubicon Project, AdMeld and PubMatic will soon become necessities for any publisher who wishes to participate in the second channel (inventory not sold at a premium directly by its sales force – which, BTW, is a lot of inventory!). Even some of these companies are releasing so-called demand-side products. Can they sit in the middle ground of both the supply-side and demand-side worlds? Only time will tell, but my gut says no way. While technology might be able to play both sides of the fence more objectively than people can, buyers still want separation from sellers. Of course DoubleClick did it – there’s the whole DFP/DFA thing – but the instances of one company becoming a leader on both the buy and sell side of the same technology coin are few and far between.

Anyway, as you can see it’s all really simple …

But in all seriousness, it is as exciting as it is complex. We are participating in the evolution of the digital media world as we know it.


With the announcement of the Open Graph, Facebook has once again provided an evolutionary leap for the entire industry. Publishers, brands and consumers alike will benefit from “a smarter, personalized web that gets better with every action taken”, as concisely described by Mark Zuckerberg at yesterday’s F8 conference.  With Facebook’s critical mass (nearly 500 million members as of today), the Open Graph is poised to become the most powerful move the company has made so far – if successful it will revolutionize the web as we know it and propel Facebook into a position to compete with Google for the throne of dominance.

The Open Graph – We Like

Facebook is already fairly ubiquitous among consumers. Facebook Connect has extended that ubiquity to sites outside of Facebook, but the process was not seamless for consumers, publishers or marketers. While successful, Connect was not the technology that would extend the social experience of Facebook to the entire web. But that is exactly what the Open Graph will do. Facebook has simplified the process of implementing the code for developers, and of sharing and connecting with content and brands for consumers. It is truly a win-win-win. One line of code (an iFrame, for those who care) will enable publishers to include a “Like” button, which will facilitate social actions anywhere on the web. As long as you are logged into Facebook, your cookie will allow your social graph to augment the experience on any site with the code. Bret Taylor said it best at F8 yesterday: “Lowering the friction of sharing will increase the volume of sharing”.

Vaults of Data

Facebook already sits on a data goldmine, but these vaults will become far deeper, with wider-ranging application, as the Open Graph further connects the social graphs of individuals, brands and publishers around the web. For now the targeting opportunities resulting from the additional data will be limited, most likely giving marketers the opportunity to target interests that have been “liked”. But the potential of the data applications is profound – think Minority Report, as mobile and geo-location converge on the Open Graph.

Privacy

Inevitably there will be some privacy backlash, as all forms of behavioral data application are under severe scrutiny by the FTC and advocacy groups. Of course Facebook thought about this too – they will be rolling out a new, simplified privacy panel where you can opt out of the Open Graph. Ultimately there will be collection of non-personally identifiable data at a scale we have never witnessed before, and the proximity and ability to connect it to personally identifiable information will most likely become the issue at hand. But the value the Open Graph adds to the overall consumer experience, and consumers’ affinity for Facebook as a trusted brand powering the collection, storage and usage of the data, will trump any privacy backlash. Make no mistake about it – there will be some backlash – there always is – but we will get past it rather quickly. The social web is here in a big way, our lives have been changed forever – and soon everyone will realize it.

A Monumental Day

For marketers, in the short term this turning point will make it exponentially easier to turn fans into advocates, identify new prospective customers, and drive peer influence through the coveted Facebook newsfeed. In the long term, the potential is far wider reaching, and as mobile and geo-location (Facebook is launching its own geo-location service as well) converge with the Open Graph, this may be the catalyst that soon connects the online and offline worlds. It is truly a monumental day.

You can see the F8 conference videos here.


At the risk of sounding clichéd, welcome to a new decade of marketing. Indeed it is an exciting time to be a marketer. The past decade may prove to be the most pivotal ever in terms of the changes in how we communicate with consumers. It was also the decade of aggregation – or, better put, the decade that killed the “big idea”. The era of the big idea is over (in the context of marketing communication). Since the explosion of digital marketing during the last decade, the new big idea has morphed into an aggregation of many smaller “ideas”. This aggregation has a bigger impact than any one “big idea” ever could, by distributing risk and providing more chances to develop successful approaches.

Marketing evolution continues in 2010, and here are some of the areas to keep your eyes on.

Mobile Forges Forward

We keep joking about how “this year has been the year of mobile” for the last few years. Well, we’re waist deep in the age of mobile and moving further along every day. We’ve crossed the proverbial tipping point. The handsets and data speeds provide better experiences, and the data plans are affordable. Over 60 million US consumers access the web via mobile device. Globally we’re on track for more people to access the mobile web than the PC web (of course I’ll need to save that post for Jan 2020). The thing is, the distribution of this access is skewed significantly towards the iPhone. While the iPhone catalyzed mobile web usage, competition is not far behind with the Droid, Pre, new BlackBerrys and other devices to come in 2010. Consumers are using, and even paying for, mobile applications and mobile websites that provide value. Of course, as with any marketing channel, there are plenty of misguided executions that do not focus on the consumer, provide little value and flop. Unfortunately, oftentimes the medium gets blamed for poor strategy on the part of the marketer and/or agency. Provide experiential or utility-based value to consumers and you’ll reap the rewards of consumer engagement. Additionally, keep your sights set on the convergence of mobile and social experiences. This will prove big in 2010.

Location Based Applications

As consumers become more comfortable with GPS-enabled smartphones and the first generation of applications that incorporate GPS into the experience, the marketing opportunities that utilize geo-location data will come to fruition. However, it will be 100% predicated on permission, transparency and trust. Of course the recurring theme of providing actual value to the consumer experience is key as well. An early success story is FourSquare, which combines social actions and geo-tagging. But FourSquare is definitely not for everyone. Marketers will have to provide utility in order to gain access to consumers’ private lives and geo-location data. A few bad apples can spoil the bunch very easily here. Where your brand attributes meet consumers’ needs is a good crossroads to aim for. Note: It would be interesting to see Facebook acquire FourSquare and incorporate it into its current platform.

Real Time Search & Social Search

As social media has become ingrained in the digital media experience for consumers and marketers alike, real-time search was inevitable. Information is distributed via so many channels, including consumers’ social media feeds, that not including real-time data in search results created a void in the relevancy of results at the major engines. Google’s roll-out of social search results from “people in your social circle” also fills a void that was otherwise filled directly by social media sites like Twitter and Facebook. Real-time search will indeed make search results more relevant, but the algorithms for filtering signal from noise will be an interesting evolution to watch and participate in. The implication for marketers is a new era of SEO that ties even more tightly into social media.

Social Media Expands Its Journey

There are two major areas to keep an eye on here. First is the portable social graph. Facebook Connect really took off in 2009, and 2010 marks the tipping point for social graph / data portability. The social graph is just beginning to become part of the overarching digital platform. Through this ubiquity consumers are empowered, taking the influence and social activities of their social connections with them everywhere they go (well not everywhere, but soon enough). Check out one of my favorite implementations of Facebook Connect so far in the Prototype trailer. Try it out. It takes a minute to load, but it’s worth the wait.

The social graph has become portable on the PC-based and mobile web, and the second area to keep an eye on is the expansion of the social graph to your television. Samsung was the first to release high-end flat-screen TVs with internet-based widgets that allow you to access Twitter on your TV (currently via Yahoo, but inevitably this will become more open very soon). Expect the social graph to become a standard part of our TV viewing experience in the future (note: not in 2010).

Multiple Attribution

While all marketers would agree that reaching consumers at multiple marketing touchpoints is essential, most marketers still maintain disparate data systems and use the last-ad standard protocol when it comes to attributing influence or conversions. Multiple attribution tracking capabilities have existed at the major ad servers for a couple of years now and provide a solution for attribution modeling, yet they are underutilized by the industry. Third parties, such as ClearSaleing, also offer dashboard, reporting and analytics platforms that provide multiple attribution reporting for marketers. Let’s face it: we are constantly increasing the number of digital marketing channels we work in, and as an industry our analytical capability – or, more accurately, marketers’ and agencies’ willingness to use the tools available – has been lagging behind. Some of the dashboard tools can incorporate a limited set of non-digital channels as well. If you are not using a multiple attribution system currently, make 2010 the year to do so. There is simply no excuse not to.
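If you want to see how little math is involved, here’s a sketch comparing the last-ad standard to a simple even-credit (linear) model on the same conversion paths. This is illustrative only – hypothetical paths, and not ClearSaleing’s model or any particular ad server’s implementation:

```python
# Two attribution models applied to the same conversion paths.

def last_click(paths):
    """Full credit to the final touchpoint in each path (the 'last ad' standard)."""
    credit = {}
    for path in paths:
        credit[path[-1]] = credit.get(path[-1], 0.0) + 1.0
    return credit

def linear(paths):
    """Split each conversion's credit evenly across every touchpoint."""
    credit = {}
    for path in paths:
        share = 1.0 / len(path)
        for channel in path:
            credit[channel] = credit.get(channel, 0.0) + share
    return credit

paths = [
    ["display", "email", "search"],
    ["display", "search"],
    ["search"],
]
print(last_click(paths))  # search gets all 3 conversions
print(linear(paths))      # display ~0.83, email ~0.33, search ~1.83
```

Note that both models credit exactly three conversions in total – the argument is only over who gets the credit, which is precisely the conversation most marketers are avoiding.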

The Privacy Issue Marches On

Privacy is always a heated topic, and has been since the dawn of digital data collection. The issue is over-hyped by the media and advocacy groups, but there are some underlying truths to the hype. As we have seen with the privacy policy changes on Facebook over the last year, consumers do pay attention and now have the means to spread the word quickly. As digital marketing technologies evolved, more parties gained access to more data. Although most of this data does not actually contain personally identifiable information (PII), in some instances it can be associated with other data assets that do. Tying the vast amounts of anonymous and PII data together will become a bigger focus of the FTC and advocacy groups as the portable social graph continues to morph the internet as we knew it into one big social web. The FTC warned the industry in 2009 that a day of reckoning was near and that self-governance was not working due to a lack of enforcement. The industry will have to take the issue more seriously in 2010, or the government will do so for us.

Augmented Reality

For the uninitiated, augmented reality (AR) is conceptually any technology that ties the real and virtual worlds together. On a mobile device, AR uses the built-in GPS, compass, and video camera, creating unlimited potential to layer content onto any physical location in the real world. On the PC, AR uses the webcam to overlay data, usually in the form of a virtual hologram. Most augmented reality executions to date have focused on the novelty factor and have not provided consumers with much actual value. The few mobile AR applications available, including Yelp, are beginning to provide real utility. I see a bright future there. On the PC side of things, the USPS Priority Mail box simulator is by far the most useful webcam-based AR implementation to date.

The key to AR is to hone in on the utility aspect and provide real value to the consumer. (That concept is starting to sound awfully familiar, huh?)

Long Live Display

Display ads get a bad rap. The reality is that online advertising works, and not only for direct response. While search sees the lion’s share of industry ad spending, display is a standard part of the mix and will continue to be for the long term. That is not to say that display doesn’t have its issues. Lack of creative prowess, challenges with media currency, and an inefficient process still plague the industry, but all are common topics of conversation, and ad hoc workarounds are being implemented every day. Most agencies and media buyers have had to develop large infrastructures to support the inefficiency of digital media. Clients constantly challenge the process and costs. Yet very little industry-level research is being conducted to better the situation. Neither the IAB nor any other industry body has set out to develop the correlational research required to make advertisers feel more comfortable about the market-level impact of online advertising; the last industry-level research was released almost 10 years ago. Some individual agencies embark on this type of research on a client-by-client basis, but there is little public-domain research readily available to most marketers, who, for the most part, park the vast majority of their brand budgets elsewhere. Hopefully in 2010 we will see more industry collaboration to develop research and studies, and the tools and systems to create more efficiency in the media buying and management process, without commoditizing it.

Even with all that said, display ads do work at creating influence. This can be, and is, measured by many marketers and agencies, and display is a standard part of the media mix just like any other medium. The degree of inclusion is what is in question, and hopefully we will at least see more discussion and proposed improvements that make advertisers confident enough to allocate more brand dollars online.

The Elephant in The Room … The Economy

All indicators point to a slow and steady economic recovery ahead of us, but it will happen at a different pace for each category and client. The reality for digital marketing is that most marketers have not been experimenting much and won’t be, focusing instead on the more accountable (read: direct response) channels and tactics. I do expect budgets to open up for social media and mobile. Amid the greatest recession of our lives we witnessed the explosive growth of social media. Some marketers had the budgets to allocate proper resources to understanding, monitoring, and integrating social media into their corporate culture, while others put forth a minimal effort and yielded an equally minimal impact. The brands that embraced social media have developed social voices separate from their brand voices and are on their way to becoming accepted social brands. Most are still playing catch-up, and we’ll see a lot of that in 2010.

So there you have it, some areas to keep your sights set on for 2010 and beyond. Have any additional thoughts about what else will be big in 2010? Post your ideas in the comments.