University of Illinois at Chicago

File(s) under embargo until file(s) become available

Empirical Analysis of Google’s Privacy-Preserving Targeted Advertising

posted on 2024-05-01, 00:00 authored by Giuseppe Calderonio
Google recently announced plans to phase out third-party cookies and is currently rolling out the Chrome Privacy Sandbox, a collection of APIs and web standards that offer privacy-preserving alternatives to existing technologies, particularly for the digital advertising ecosystem. This includes FLEDGE, also referred to as the Protected Audience API, which provides the mechanisms for conducting real-time bidding and ad auctions directly within users' browsers. FLEDGE is designed to eliminate the invasive data collection and pervasive tracking practices used for remarketing and targeted advertising. In this thesis, we provide a study of the FLEDGE ecosystem both before and after its official deployment in Chrome. We find that even though multiple prominent ad platforms have entered the space, Google ran 99.8% of the auctions we observed, highlighting its dominant role. To conduct our research, we gathered data over a period of four months with an automated crawler, with no human involvement in the process. Specifically, we first collected the 70k most popular websites from the public Tranco dataset, and we leveraged the Chrome DevTools Protocol, integrated with the Puppeteer Node.js library, to create wrapper code around the JavaScript functions navigator.joinAdInterestGroup(), navigator.leaveAdInterestGroup(), and navigator.runAdAuction(), storing their arguments for each website scraped. We then provide the first in-depth empirical analysis of FLEDGE and uncover a series of severe design and implementation flaws. We leverage these to conduct 12 novel attacks, including tracking, cross-site leakage, service disruption, and pollution attacks. While FLEDGE aims to enhance user privacy, our research demonstrates that it currently exposes users to significant risks, and we outline mitigations for addressing the issues we have uncovered.
We have also responsibly disclosed our findings to Google so as to kickstart remediation efforts. We believe that our research highlights the dire need for more in-depth investigations of the entire Privacy Sandbox, due to the massive effect it will have on user privacy.
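The wrapping approach described in the abstract can be sketched as follows. This is a minimal, self-contained illustration (not the thesis's actual crawler code): it monkey-patches the Protected Audience functions so that every call's arguments are recorded before the original function runs. In the real crawler such a snippet would be injected into each page via Puppeteer (e.g. through page.evaluateOnNewDocument) and would patch the browser's `navigator` object; here a stub object stands in for `navigator` so the sketch runs outside Chrome. The stub methods and their return values are assumptions for illustration only.

```javascript
// Calls captured by the wrappers: one entry per intercepted API call.
const recorded = [];

// Replace target[name] with a wrapper that logs the arguments and then
// delegates to the original function.
function wrapApi(target, name) {
  const original = target[name];
  target[name] = function (...args) {
    // Store a plain-data snapshot of the arguments alongside the API name.
    recorded.push({ api: name, args: JSON.parse(JSON.stringify(args)) });
    return typeof original === "function"
      ? original.apply(this, args)
      : undefined;
  };
}

// In Chrome these functions live on `navigator`; this stub makes the
// sketch runnable anywhere (stub signatures are illustrative).
const navigatorStub = {
  joinAdInterestGroup(group, durationSeconds) { return Promise.resolve(); },
  leaveAdInterestGroup(group) { return Promise.resolve(); },
  runAdAuction(auctionConfig) { return Promise.resolve(null); },
};

["joinAdInterestGroup", "leaveAdInterestGroup", "runAdAuction"].forEach(
  (name) => wrapApi(navigatorStub, name)
);

// Example call, as a page's ad script might make it:
navigatorStub.joinAdInterestGroup(
  { owner: "https://dsp.example", name: "shoes" },
  86400
);
console.log(recorded[0].api); // "joinAdInterestGroup"
```

In a crawl, the contents of `recorded` would then be exfiltrated from the page context (e.g. via an exposed Puppeteer binding) and persisted per website.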



Jason Polakis


Computer Science

Degree Grantor

University of Illinois Chicago

Degree Level

  • Masters

Degree name

MS, Master of Science

Committee Member

  • Chris Kanich
  • Stefano Zanero
  • Marco Santambrogio

Thesis type

