A shocking new tracking admission from Google, one that hasn’t yet made headlines, should be a serious warning to Chrome’s 2.6 billion users. If you’re one of them, this nasty new surprise should be a genuine reason to quit.

Behind the slick marketing and feature updates, the reality is that Chrome is in a mess when it comes to privacy and security. It has fallen behind rivals in protecting users from tracking and data harvesting, its plan to ditch nasty third-party cookies has been awkwardly postponed, and the replacement technology it said would prevent users from being profiled and tracked turns out to have just made everything worse.

“Ubiquitous surveillance... harms individuals and society,” Firefox developer Mozilla warns, and “Chrome is the only major browser that does not offer meaningful protection against cross-site tracking... and will continue to leave users unprotected.”

Google readily (and ironically) admits that such ubiquitous web tracking is out of hand and has resulted in “an erosion of trust... [where] 72% of people feel that almost all of what they do online is being tracked by advertisers, technology firms or others, and 81% say the potential risks from data collection outweigh the benefits.”

So, how can Google continue to openly admit that this tracking undermines user privacy, and yet enable such tracking by default on its flagship browser? The answer is simple: follow the money. Restricting tracking will materially reduce ad revenue from targeting users with sales pitches, political messages, and opinions. And right now, Google doesn’t have a Plan B: its grand idea for anonymized tracking is in disarray.
“Research has shown that up to 52 companies can theoretically observe up to 91% of the average user’s web browsing history,” a senior Chrome engineer told a recent Internet Engineering Task Force call, “and 600 companies can observe at least 50%.”

Google’s Privacy Sandbox is supposed to fix this, to serve the needs of advertisers seeking to target users in a more “privacy-preserving” way. But the issue is that even Google’s staggering level of control over the internet advertising ecosystem is not absolute. There is already a complex spider’s web of trackers and data brokers in place. Any new technology simply adds to that complexity and cannot exist in isolation.

It’s this unhappy situation that’s behind the failure of FLoC, Google’s self-heralded attempt to deploy anonymized tracking across the web. It turns out that building a wall around only half a chicken coop is not especially effective, particularly when some of the foxes are already hanging around inside.

Rather than target you as an individual, FLoC assigns you to a cohort of people with similar interests and behaviors, defined by the websites you all visit. So, you’re not 55-year-old Jane Doe, sales assistant, residing at 101 Acacia Avenue. Instead, you’re presented as a member of Cohort X, from which advertisers can infer what you’ll likely do and buy from the common websites the group members visit. Google would inevitably control the entire process, and advertisers would inevitably pay to play.

FLoC came under immediate fire. The privacy lobby called out the risk that data brokers would simply add cohort IDs to other data collected on users, such as IP addresses, browser identities, or any first-party web identifiers, giving them even more knowledge of individuals. There was also the risk that cohort IDs might betray sensitive information: politics, sexuality, health, finances...
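To make the cohort idea concrete: Chrome’s FLoC trial derived a cohort ID from the domains a user visited using a locality-sensitive hash (SimHash), so that people with similar browsing histories tend to land in the same or nearby cohorts. The following is a minimal toy sketch of that technique, not Google’s actual implementation; all function names, the 8-bit cohort size, and the example domains are illustrative assumptions.

```python
import hashlib

def domain_vote(domain: str, bit: int) -> int:
    # Hash a (domain, bit-position) pair to a pseudo-random +1/-1 vote.
    digest = hashlib.sha256(f"{domain}:{bit}".encode()).digest()
    return 1 if digest[0] & 1 else -1

def simhash_cohort(visited_domains: set[str], bits: int = 8) -> int:
    # SimHash: each visited domain casts a +1/-1 vote per output bit;
    # the sign of the summed votes decides that bit of the cohort ID.
    # Histories that overlap heavily produce mostly identical votes,
    # so similar users tend to collide into the same cohort.
    cohort = 0
    for bit in range(bits):
        votes = sum(domain_vote(d, bit) for d in visited_domains)
        if votes > 0:
            cohort |= 1 << bit
    return cohort

# Hypothetical browsing histories (illustrative only).
alice = {"news.example", "recipes.example", "gardening.example"}
bob = {"news.example", "recipes.example", "gardening.example", "weather.example"}
print(simhash_cohort(alice), simhash_cohort(bob))
```

The privacy criticism follows directly from this design: the cohort ID is a stable, history-derived number that every site can read, so a broker holding a first-party identifier can simply record it alongside, and cohorts dominated by sensitive sites can leak what those users have in common.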
No, Google assured us as it launched its controversial FLoC trial, telling me in April that “we strongly believe that FLoC is better for user privacy compared to the individual cross-site tracking that is prevalent today.”

Not so, Google has now admitted, telling the IETF that “today’s fingerprinting surface, even without FLoC, is easily enough to uniquely identify users,” but that “FLoC adds new fingerprinting surfaces.” Let me translate that: just as the privacy lobby had warned, FLoC makes things worse, not better.