Change is coming for digital advertising, challenging our long-term experiment in ubiquitous personalization. This could mean the end of millions of dollars of marketplace value. Or it could mean innovative privacy-centric adaptations for advertising technology.
For more than 20 years, the ad industry has flourished alongside the explosive growth of the internet. Content providers and marketers have together built an elaborate latticework of data connections to fuel the ad marketplace.
Consumers’ digital footprints have fueled this system. Gaining consumer attention has been its goal. Each time we click or tap, scroll or swipe, ripples of our behavior spread across the web, triggering cascades of near-instantaneous signals and transactions.
Targeted advertising is so powerful that it has led to the widespread misconception that our various devices listen to our conversations.
But this intricate commercial hub — adtech’s glittering Emerald City — was built over a weakening fault line. Its infrastructure was predicated on the laissez-faire flow of consumers’ personal data. Fueling the adtech metropolis with personal data was a choice, not an inevitability.
Once the choice was made, however, reliance on this source of value became a foundational part of the way ad markets grew. Alternatives, like contextual ads based on what the consumer is doing in the moment, have seen far less adoption, and far less investment.
Pressure from regulators and platforms, fueled by consumer privacy concerns, is now building to a seismic shift in the function of online advertising.
U.S. state privacy laws are now in force that, at a minimum, require robust opt-out capabilities for third-party data sharing, including support for nascent universal opt-out mechanisms. U.S. President Joe Biden has endorsed a ban on all targeted advertising for minors, a common trend in youth privacy bills and a move that major platforms like Meta and Google have already voluntarily implemented.
Apple’s decision to impose an opt-in consent mechanism for mobile apps to access ad tracking capabilities caused an overnight shift in the landscape — and wiped $10 billion from Meta’s ad business. By 2024, perhaps, Google’s twice-delayed deprecation of third-party cookies in its Chrome browser will be made real. EU regulators, meanwhile, threaten to reject every legal theory propping up the current system.
Without major changes to its operating model, the targeted advertising infrastructure inches closer to collapse each day. Those who build and maintain the adtech metropolis now face a decision: shore up defenses around their strategic interests with slight tweaks and creative legal theories, buying themselves as much time as possible, or rebuild before the crash.
We still see marketplace participants taking both paths. Uncertainty around the precise meaning of new regulatory requirements has led to widespread bickering and diverging plans among major players and coalitions.
Some claim that the death of targeted advertising is equivalent to the death of the ad-supported internet. This is a fallacy. Ad value is not intrinsically tied to the extraction of personal data.
But, moving away from today’s data-driven adtech architecture will require coordinated effort on multiple fronts: redesigning architectures of consumer choice, bringing to market new privacy-enhancing technologies, and standardizing contractual arrangements between market participants. At every step, privacy experts will need to be in the room to help design a new system that can flourish in our new reality.
Fortunately, collaborative efforts are already underway. On the technological front, the Global Privacy Control represents the latest effort to create a universal opt-out mechanism for web browsers, a frictionless way to respect consumer choice. As regulators increasingly demand such solutions, similar efforts will be needed for mobile devices, smart TVs, extended reality technologies, and whatever comes next. Controls for personal data are required across modalities, whether the data is collected through a screen or a gyroscope or a hologram.
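For the technically curious, the Global Privacy Control proposal defines a simple signal: a participating browser sends the HTTP request header `Sec-GPC: 1` (and exposes `navigator.globalPrivacyControl` to scripts), which a site can treat as an opt-out of sale or sharing. A minimal sketch of the server-side check — the `honors_gpc` helper and sample headers are illustrative, not from any particular framework:

```python
def honors_gpc(headers: dict) -> bool:
    """Return True if the request carries the Global Privacy Control signal.

    Per the GPC proposal, a user agent asserting the signal sends the
    HTTP request header `Sec-GPC: 1`.
    """
    return headers.get("Sec-GPC", "").strip() == "1"


# Hypothetical incoming request headers, as a plain dict for illustration.
request_headers = {"Sec-GPC": "1", "User-Agent": "ExampleBrowser/1.0"}

if honors_gpc(request_headers):
    # A compliant site would suppress third-party data sharing here.
    print("GPC signal present: treat as an opt-out of sale/sharing")
```

The appeal of the mechanism is exactly this simplicity: one header to send, one header to check, no per-site dialog for the consumer.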
New developments in privacy-enhancing technologies also promise the ability to deliver some degree of ad personalization in a manner that does not rely on the widespread sharing of personal data. Google’s Topics proposal is one such initiative. A W3C working group, the Private Advertising Technology Community Group, has led to other proposals like Interoperable Private Attribution, a method of ads measurement co-authored by Meta and Mozilla. Other new market entrants are leaning into the promise of high-value contextual ads or embracing new technologies like multiparty computation to preserve privacy.
Meanwhile, the Interactive Advertising Bureau released its Multi-State Privacy Agreement in an attempt to help standardize the legal conversation. The flexibility of the template reflects the legal uncertainty and divergence among market participants. This uncertainty will resolve in one of two ways: through protracted court fights that challenge the intent of consumer privacy laws, or through mutual agreement among the advertising community.
It may seem impossible to reach consensus on something as difficult as privacy controls, but we have been here before.
More than a decade ago, facing an earlier threat of regulatory scrutiny, market players came together and adopted self-regulatory principles for targeted advertising. Although these have mostly been outpaced by new rules, their existence reminds us that industry-wide coordination is feasible.
Together, the industry built from scratch a protocol to display a special opt-out icon on top of every targeted ad on the internet. This was a technical feat. With privacy professionals working across disciplines, such a feat is possible again.
Innovation will solve this crisis, just as it always does. Those who innovate in ways that recognize the staying power of consumer privacy will be poised to reap the rewards. Those who don’t may not survive the quake.
Here’s what else I’m thinking about:
- The Senate Commerce Committee approved two youth privacy bills — again. This time, the bills are moving along with the explicit support of President Biden. The updated draft Kids Online Safety Act, S.1409, has been voted out of committee with a substantial tack-on amendment offered by Sen. John Thune, R-S.D., who inserted most of his Filter Bubble Transparency Act into the bill. Seven other amendments were passed, including a manager’s amendment with a variety of changes throughout the bill offered by sponsoring Sen. Marsha Blackburn, R-Tenn. The Children and Teens’ Online Privacy Protection Act, S.1418 or “COPPA 2.0” as it’s affectionately called, was also voted out of committee with amendments. Neither bill has been introduced in the House. But Sen. Maria Cantwell, D-Wash., said we should expect even more privacy action from the Senate Commerce Committee this fall.
- What about all the other U.S. federal privacy proposals? My colleague, Müge Fazlioglu, CIPP/E, CIPP/US, collated and analyzed the wide variety of introduced legislation in the 118th U.S. Congress. Artificial intelligence governance proposals are also proliferating, along with crowded hearings to explore “principles for regulating AI.”
- Health data tips from the U.S. Federal Trade Commission. On its business blog, the FTC posted a “baker’s dozen” takeaways from its recent slate of health enforcement actions. The summary highlights the expanded scope of sensitive data related to health, which may include “anything that conveys information — or enables an inference — about a consumer’s health.” This is consistent with other policy developments for health-related data. Companies should double-check whether they really are exempt from heightened requirements for sensitive data types, whether they collect location, shopping, movement, or any other potentially health-adjacent data type.
Thought-provoking summer reads:
- Could the expanding scope of biometric laws smother the immersive tech industry in its cradle? A new detailed analysis from the Future of Privacy Forum explores the interaction between biometric privacy rules and the use of covered data in extended reality devices and services. From face detection to tracking bodily movements, the use of biometric-related data is essential for the functioning of immersive technologies, but not necessarily for purposes of identifying individuals.
- How enforceable are companies’ voluntary commitments to follow responsible AI principles? In a blog post, Kelley, Drye & Warren Of Counsel Jessica Rich reminds us that, at least when it comes to the specific and actionable commitments that large AI companies made to the Biden administration, they are likely enforceable under consumer protection law. As explored in the IAPP AI Governance Dashboard, a few companies are also taking the further step of establishing an industry forum to build best practices for the development of foundation models.
Privacy people on the move:
- Krysten Jenci, long-serving director of the Office of Digital Services Industries in the Department of Commerce’s International Trade Administration, has joined Cisco’s government affairs team.
Please send feedback, updates and postcards from Oz to firstname.lastname@example.org.