- The industry should move toward anonymizing users.
- The focus should shift to analyzing trends around the context in order to do right by consumers and their privacy advocates.
With the exception of a few bad actors — here's looking at you, Cambridge Analytica — the digital ad tech industry plays by the rules that govern the protection of PII. For instance, enterprises will anonymize the CRM data they export to their DMPs for activation. Partners in a second-party data exchange will send their customer data to a third party such as LiveRamp to strip away any PII and provide both parties with a completely anonymized list for targeting.
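The anonymization step is often as simple as hashing direct identifiers before the file leaves the building. Here is a minimal sketch, assuming a shared salt and a hypothetical `anonymize_record` helper; real match services such as LiveRamp use their own proprietary processes:

```python
import hashlib

SALT = "example-shared-salt"  # assumption: a secret the exchange partners agree on

def anonymize_record(record):
    """Replace direct identifiers with salted hashes before export.

    Illustrative only: normalizes email/phone, hashes them, and passes
    non-identifying attributes (like segment) through untouched.
    """
    hashed = {}
    for field in ("email", "phone"):
        value = record.get(field, "").strip().lower()  # normalize so partners' hashes match
        hashed[field + "_hash"] = hashlib.sha256((SALT + value).encode()).hexdigest()
    hashed["segment"] = record.get("segment")
    return hashed

print(anonymize_record({"email": "Jane@Example.com", "phone": "555-0100", "segment": "runners"}))
```

Because both sides normalize and salt identically, matching records produce identical hashes, so the lists can be joined for targeting without either party seeing the other's raw PII.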
There’s no denying that our industry goes to great technological and financial lengths to protect consumer privacy in all advertising and marketing initiatives.
And yet, we are absolutely failing to protect the privacy of consumers, our kids, even the President of the United States. In their article, How to Track President Trump, New York Times reporters Stuart A. Thompson and Charlie Warzel describe how they were able to obtain “a dataset with more than 50 billion location pings from the phones of more than 12 million people in this country. It was a random sample from 2016 and 2017, but it took only minutes — with assistance from publicly available information — for us to deanonymize location data and track the whereabouts of President Trump.” The reporters also used that anonymized data to identify where Secret Service agents live, who they live with, who their friends are, and so on.
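The underlying technique requires no special tooling: joining "anonymous" device IDs against a handful of publicly known locations is enough to re-identify someone. A toy illustration with made-up pings and places (not the Times reporters' actual code or data):

```python
# Hypothetical "anonymized" location pings: (device_id, place, time).
# No names or PII anywhere — yet the data still identifies people.
pings = [
    ("device-123", "home-A", "02:00"),
    ("device-123", "office-X", "10:00"),
    ("device-456", "home-B", "02:30"),
]

# Publicly available information: a specific person lives at home-A
# and works at office-X.
known_places = {"home-A", "office-X"}

# Any device observed at all of those places is almost certainly that person.
candidates = {
    device for device, *_ in pings
    if known_places <= {place for d, place, _ in pings if d == device}
}
print(candidates)  # {'device-123'}
```

With billions of pings, the intersection of home and workplace alone narrows most devices to a single person, which is why masking the name while keeping the ID protects no one.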
That article is just one installment of the New York Times Privacy Project, an in-depth look at the many ways people of all ages are losing their right to anonymity. As part of this initiative, reporters are methodically pulling back the curtains of our industry, explaining how we collect and analyze people’s data and use it for targeting.
The Times isn’t alone in taking on this topic. Last year, Harvard Business School professor Shoshana Zuboff published a sweeping book, The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. On just about every page, Zuboff warns readers that companies in our ecosystem are collecting data on our individual actions and selling it to others (read: advertisers) so they can control our behavior at scale. She calls these strategies a threat to civil liberties and democracy itself.
Whether or not you agree with Zuboff’s characterization of what we do is somewhat beside the point. The truth is, we’ve been hiding behind the mantra of “no PII” for a long time, and it’s not working. The Privacy Project, The Age of Surveillance Capitalism, even GDPR and the CCPA are warning shots we need to heed. We’ve built an industry around barely concealed people data, and consumers are not pleased. They now have allies who are doing the work of figuring out how it all works and explaining it to the public. Once those allies articulate how our industry operates, consumer privacy advocates will have the vocabulary they need to demand regulation.
ID Targeting vs. Aggregate Targeting
As I’ve thought about this issue, it has become clear to me that there are two distinct spaces (for lack of a better word) for targeting: the ID space and the behavioral space. The former is the world we currently occupy: tracking consumers based on their cookie IDs, IP addresses, device IDs, you name it. At its core, this space is focused on the individual. It tracks what this individual has purchased, the stores this individual visits, the content this individual consumes. The space associates the individual with his or her work computer, mobile phone, and other devices that share his or her home and campus WiFi networks. Given this level of tracking, does it even matter that this individual’s PII is masked? People want to go about their lives assuming that no one is watching them, but the reliance on the ID space makes that wish impossible.
The behavioral space offers them more protection, though I freely admit that there are a lot of obstacles to overcome before it can become a reality. In the behavioral space, user behavior is not just anonymized but aggregated, so that it becomes impossible to know that a specific mobile ID belongs to, say, a specific Secret Service agent or high school student.
In the behavioral space, it’s not a matter of knowing what I, anonymized Tim Burke, am doing, but figuring out the patterns and behaviors that are common among people who are like me. It’s a matter of analyzing trends around context and deploying a new form of contextual targeting. Let’s say I wake up on Saturday morning and go to the running shop to buy a new pair of sneakers. It’s not important that I, Tim, purchased those sneakers. What’s important is that millions of runners everywhere consume a specific kind of content, favor specific foods, share specific values and so on. We all live contextually inside of behavioral spaces, and identifying those aggregated affinities across large numbers of consumers will allow marketers to target effectively without unnerving the consumer and their privacy advocates.
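One way to picture the behavioral space is as an aggregation step that throws away IDs entirely and keeps only context-level counts, suppressing any context too small to hide an individual (a k-anonymity-style floor). A hypothetical sketch, with invented events and an assumed cohort minimum:

```python
# Individual events (device_id, context) — the ID-space view of the data.
events = [
    ("id1", "running-blog"), ("id2", "running-blog"), ("id3", "running-blog"),
    ("id1", "trail-maps"), ("id2", "trail-maps"),
    ("id4", "rare-forum"),  # a context only one person visits
]

MIN_COHORT = 3  # assumption: drop any context seen by fewer devices than this

def aggregate(events, k=MIN_COHORT):
    """Collapse per-ID events into context-level counts, discarding
    any context whose audience is too small to hide an individual."""
    per_context = {}
    for device, context in events:
        per_context.setdefault(context, set()).add(device)
    return {ctx: len(ids) for ctx, ids in per_context.items() if len(ids) >= k}

print(aggregate(events))  # {'running-blog': 3}
```

The output contains no device IDs at all — only the trend ("runners congregate on running blogs"), which is exactly what a marketer needs for contextual targeting, while the one-person context is dropped rather than exposed.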
I recognize this isn’t a trivial thing to accomplish. I get that mapping the commonalities of behavioral data is a massive undertaking, but it needs to be done. Consumers look to publishers for access to free or low-cost content and apps; publishers look to advertisers to pay for the content that consumers want. Advertisers are game to pay, but they want to know that their ads will deliver meaningful business outcomes, and they look to the digital ad tech ecosystem to help them do that. We need to step up. Until we can tell, say, a marketer of vitamins and supplements or wireless services that people who are most inclined to buy such products are likely to congregate on these types of sites and share these types of values and affinities, we will continue to operate in the ID space, and incur the consumer’s wrath. We need to reinvent the way the digital world is monetized.