37 Million Users, Zero Awareness

A security researcher who goes by Q Continuum just published a list of 287 Chrome extensions that exfiltrate browsing history to data brokers. The extensions have a combined 37.4 million installations. Many of them are productivity tools, ad blockers, and VPN utilities that users installed thinking they were enhancing their privacy.

The data flows to companies like Similarweb, Semrush, ByteDance, and Alibaba. Some of it is disclosed in privacy policies that nobody reads. Some of it violates Chrome Web Store policies but persists anyway. All of it represents a systematic extraction of user behavior data through tools that users trusted.

For founders, this is both a cautionary tale about the extension ecosystem and a lesson about how data economies actually work.

The Business Model in Plain Sight

Similarweb is a publicly traded company. Their business is providing web analytics and competitive intelligence. Their SEC filings explicitly state that their "platform and solutions depend in part on the ability to obtain data from our contributory network through browser extensions."

This isn't hidden. It's just not prominently advertised. Users install a free PDF tool or tab manager. Buried in the privacy policy is language about "contributing to analytics." The extension watches every URL the user visits and sends that data upstream.

The data gets aggregated and sold to enterprises that want traffic patterns for competitor websites, engagement metrics for online services, and user behavior across the web. It's valuable data, which is why companies are willing to fund free extensions to collect it.

Why This Persists

Chrome's Web Store has a Limited Use policy that's supposed to prevent exactly this kind of data harvesting. Extensions aren't supposed to share browsing data with data brokers. But the policy has an exception that can be exploited: if users consent, different rules apply.

The consent comes from privacy policies. Install the extension, agree to the terms, and you've technically consented to data collection. Most users don't read the policies. Many wouldn't understand them if they did. The consent is legally valid and practically meaningless.

Google's incentive to enforce strictly is limited. Every extension they remove is a worse user experience for whoever installed it. The data collection doesn't directly harm Google. Aggressive enforcement creates PR problems and developer relations issues. The result is a policy that exists but isn't rigorously enforced.

What This Means for Enterprise

If you're running a company, your employees' browsing history is being harvested by their browser extensions. Some of that browsing includes your internal tools, your dashboards, your customer URLs. The extensions don't distinguish between reddit.com and your-internal-app.company.com.

This is a data leakage vector that most security teams haven't fully addressed. Browser extensions run with high privileges. They can see everything the browser sees. And the typical employee has installed several, often without IT's awareness.
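To make "they can see everything the browser sees" concrete: a Manifest V3 extension only needs a short manifest to observe every navigation. The sketch below is hypothetical, not taken from any of the 287 extensions; the field names are real Chrome manifest keys.

```json
{
  "manifest_version": 3,
  "name": "Hypothetical Tab Helper",
  "version": "1.0",
  "permissions": ["tabs"],
  "host_permissions": ["<all_urls>"],
  "background": { "service_worker": "background.js" }
}
```

With the "tabs" permission alone, the background script can listen to `chrome.tabs.onUpdated` and receive the full URL of every page the user opens; `host_permissions` of `<all_urls>` additionally lets it inject content scripts into any site. Both appear routinely in legitimate productivity extensions, which is why permission prompts alone don't separate safe tools from harvesters.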

The remediation is either extension whitelisting (painful to implement, creates friction) or browser isolation (expensive, also creates friction). Neither is free. But neither is having your competitive intelligence leaked through a tab manager your marketing team installed.
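If you go the whitelisting route, Chrome's enterprise policies support it directly: block everything, then allow specific extension IDs. A minimal sketch of the managed-policy JSON (on Linux, Chrome reads these from /etc/opt/chrome/policies/managed/; the extension ID shown is a placeholder, not a real extension):

```json
{
  "ExtensionInstallBlocklist": ["*"],
  "ExtensionInstallAllowlist": [
    "aaaabbbbccccddddeeeeffffgggghhhh"
  ]
}
```

The friction mentioned above is real: every new extension an employee wants becomes an IT ticket. But it converts an invisible risk into an auditable list.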

The Founder Angle

If you're building products that handle sensitive user data, the extension ecosystem is a threat model you need to consider. Your beautifully secured application can have its URL patterns leaked by a third-party extension your users installed independently.

There's no perfect defense here. You can't control what extensions your users run. But you can design systems that assume hostile browser environments. That means avoiding sensitive data in URLs, minimizing what's visible in the browser context, and building as if everything the browser sees will be logged somewhere.
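To make the first point concrete, here is a minimal sketch (hypothetical URLs and parameter names) contrasting a URL that leaks sensitive data into the browser's address bar with one that exposes only an opaque identifier:

```python
from urllib.parse import urlencode

# Anti-pattern: sensitive fields in the query string. Any extension that can
# read tab URLs sees all of this, and so does anything it forwards data to.
leaky_url = "https://app.example.com/reports?" + urlencode({
    "customer_email": "alice@example.com",
    "plan": "enterprise",
})

# Better: an opaque identifier in the path. The sensitive details travel in
# request headers or the POST body, which tab-URL harvesting doesn't capture.
safer_url = "https://app.example.com/reports/r_8f3a2c91"
headers = {"Authorization": "Bearer <session-token>"}
```

The same reasoning applies to page titles and DOM content: anything rendered in the browser should be treated as potentially logged by a third party.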

A more sophisticated approach is to build trustworthy browser extensions yourself, capturing the extension slot before a malicious one does. If your users need PDF tools or productivity features related to your product, building or partnering on clean extensions reduces the chance they'll install compromised alternatives.

The Bigger Market

Data brokers exist because data has value. The extensions are just collection infrastructure. The actual market is in the aggregated intelligence that browsing history enables.

Knowing which websites a company's employees visit tells you their tech stack. Knowing traffic patterns tells you market share. Knowing which pages users engage with tells you product strategy. This is valuable information, which is why the collection infrastructure is so well-funded.

For founders building in competitive intelligence, this is the landscape you're operating in. Some of your competitors' data comes from sources with questionable consent. That's a moat if you can build equivalent insights from cleaner sources. It's a threat if regulators eventually crack down and you haven't diversified.

The Regulatory Trajectory

GDPR theoretically covers browsing history as personal data. CCPA gives California users deletion rights. But enforcement against extension-based data collection has been minimal. The Q Continuum research might change that, at least temporarily.

The pattern with privacy enforcement is that public research creates pressure, pressure creates enforcement actions, enforcement actions create compliance investments, and then the cycle repeats. The 287 extensions identified will probably see some removals. New ones will appear. The economics haven't changed.

For founders, the takeaway is that privacy regulation is an ongoing cost, not a one-time compliance exercise. Whatever you build today needs to be flexible enough to adapt as enforcement patterns shift.