March 10, 2012
Just ahead of our SXSW session on "How to personalize without being creepy," Pew Internet yesterday published the findings of a study on Search Engine Use in 2012 (here), with some interesting data on users' preferences for personalized search results and targeted advertising.
Unsurprisingly, the key takeaway of the report is: "Most search users disapprove of personal information being collected for search results or for targeted advertising."
More specifically, 73% of those polled say that they do NOT want a "search engine keeping track of your searches and using that information to personalize your future search results because you feel it is an invasion of privacy," and 68% stated, "I'm NOT OKAY with targeted advertising because I don't like having my online behavior tracked and analyzed."
It gets really interesting when you put these numbers into context with some of the other findings in the report – and I'd love to see the above variables cross-tabulated with these other statements:
- Most internet users don’t know how to control or limit the information that is being collected about them
- “55% of search engine users say that, in their experience, the quality of search results is getting better over time, while just 4% say it has gotten worse”
- 86% of search users “learned something new or important that really helped them or increased their knowledge”
It seems there is a theme here – people don't like the "black box" of personalization and behavioral targeting, and often have little idea of what is collected, how it is being used, and most importantly how to control what a website can or cannot do.
Looking at examples of personalization, that theme seems to hold up: personalization is welcomed and appreciated if it's derived in obvious ways and can be easily controlled – but if it's "magic" pieced together from unknown sources, we get scared. A few examples:
- News aggregators or personalized newspapers (e.g. Zite, Washington Post’s Personal Post, etc.) – The user declares the sections he or she is interested in, and the consumption and ongoing rating of content continues to shape the news stream for the user. Hugely transparent, full control.
- Dating websites occupy both ends of the spectrum: While Match.com is rather straightforward in letting users set their "preferences" explicitly, other companies like eHarmony pride themselves on a proprietary matching algorithm whose results are largely opaque to users.
- Amazon's recommendation lists and emails are an odd one – while they are very clearly based on stuff you've looked at or bought in the past (augmented with a bit of clearly flagged collaborative filtering), they seem nearly impossible for the user to control effectively (even long after the original "intent" for the item has gone)!
- Music streaming services and video rentals – often flagged "because you viewed/listened to this, you might enjoy that." While the "magic" behind it (mostly collaborative filtering) is opaque, the connection is easily made by the user. Not creepy at all.
- The infamous "following shoes" – re-targeting (while effective) is personalization at its worst: looking at an item on a website without completing the purchase, and then finding that this very item shows up featured in banner ads on completely different websites later. Black-box magic with no apparent way to control it.
- Facebook's sponsored stories and social ads: Seeing your friend's face used in an ad for a company he "liked" on Facebook was just the first step; with sponsored stories, this form of advertising is bound to step up.
- Targeted TV advertising through DVRs: Based on your TV viewing behavior, the ads you see during the Super Bowl might not be the same ads your friends see – with little or no way to control it.
- Lastly, Google+ is now a social layer across all of Google's products, with knowledge of a user's real identity, demographics, likes, and behaviors – and, judging by the recent Safari break-in, with little regard for a user's choice in opting out. This will be a huge playground for targeting and personalizing at scale (Susan Wojcicki seems to confirm this).
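The collaborative filtering behind several of the examples above is conceptually simple – "people who bought X also bought Y" – which is part of why those recommendations feel transparent rather than creepy. A minimal sketch, using hypothetical items and a made-up purchase matrix, might look like:

```python
# Minimal item-based collaborative filtering sketch (hypothetical data).
# The recommendation is derived only from co-occurring purchases, so the
# user can easily make the connection themselves.
from math import sqrt

# Rows = users, columns = items; 1 = bought/viewed, 0 = not.
items = ["hiking boots", "tent", "golf clubs", "sleeping bag"]
matrix = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
]

def cosine(a, b):
    """Cosine similarity between two purchase vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na, nb = sqrt(sum(x * x for x in a)), sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def similar_items(item):
    """Rank the other items by similarity of their user-purchase columns."""
    i = items.index(item)
    col = lambda j: [row[j] for row in matrix]
    scores = [(items[j], cosine(col(i), col(j)))
              for j in range(len(items)) if j != i]
    return sorted(scores, key=lambda s: -s[1])

print(similar_items("tent"))
# "sleeping bag" ranks first (bought together twice), "golf clubs" last.
```

The point of the sketch is that the signal is the user's own visible behavior – nothing here requires demographics, identity, or cross-site tracking.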
Now, I am not at all opposed to personalization or targeting – I believe it is a useful application of technology to eliminate wasted advertising (I really don't need to watch TV ads for female products, I don't want to buy golf clubs, and there is no point pitching a vacation in Mallorca to me) and provide me with relevant offerings (do tell me about the outdoor gear sale, do pitch that sailing trip to me, and yes, it's noon and I'm downtown – so it's ok to talk to me about restaurant lunch offers).
What is paramount, however, is that (a) you have my consent – so you had better be a company I trust to hold some of my personal information; (b) you're transparent about the data use – so don't go off selling it to third parties or do stuff behind my back that I didn't consent to; and (c) you let me control this relationship – I'm happy to share a lot if I think my data is safe and I trust the company, I might share more if there is something in it for me in return, and I want the ability to turn it off completely at any time.
No need to be creepy!