The Location Data Industry That Knows Everything
Your phone company knows where you sleep. They know where you work. They know when you visit a hospital, a gun store, a protest, a mistress. They know which Airbnb you checked into and how long you stayed. And they're selling that information to anyone willing to pay.
This isn't speculation or privacy paranoia. It's the documented, legal, extraordinarily profitable business model of the location data industry. And if you're building any kind of product that touches location, you're either participating in this ecosystem, competing with it, or getting steamrolled by it.
How the Machine Works
The data pipeline starts with your phone. Every app that requests location permissions—weather apps, games, maps, shopping apps—can harvest your coordinates and sell them downstream. The carriers themselves collect location data as part of normal network operation. Cell tower triangulation gives approximate position; GPS gives precise position; WiFi fingerprinting fills in the gaps.
This raw location data flows to aggregators and brokers. Companies like Gravy Analytics, Venntel, and dozens of others buy, clean, and package location data from thousands of sources. They attach advertising IDs (Apple's IDFA, Google's AAID) that allow data from multiple apps to be linked to a single device. Over time, they build comprehensive movement profiles: where you go, how often, for how long.
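To make the linkage concrete, here's a minimal sketch (Python, with invented field names and device IDs) of how pings from unrelated apps collapse into a single movement profile once they share an advertising ID:

```python
from collections import defaultdict

def build_profiles(pings):
    """Group location pings from many apps into per-device movement
    profiles, keyed on the shared advertising ID (IDFA/AAID)."""
    profiles = defaultdict(list)
    for p in pings:
        profiles[p["ad_id"]].append((p["ts"], p["lat"], p["lon"]))
    for ad_id in profiles:
        profiles[ad_id].sort()  # chronological movement trace
    return dict(profiles)

# Two different apps report the same device; the ad ID ties them together.
pings = [
    {"ad_id": "dev-1", "ts": 100, "lat": 37.77, "lon": -122.42, "app": "weather"},
    {"ad_id": "dev-1", "ts": 50,  "lat": 37.79, "lon": -122.40, "app": "game"},
    {"ad_id": "dev-2", "ts": 75,  "lat": 40.71, "lon": -74.00,  "app": "weather"},
]
profiles = build_profiles(pings)
print(profiles["dev-1"])  # one ordered trace, stitched from two apps
```

The real pipelines add deduplication, map-matching, and venue lookup, but the join key is the same.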
The buyers are everyone. Law enforcement agencies—including ICE, which is actively seeking bids for "ad tech and big data" tools for investigations. Insurance companies that want to know if you actually go to the gym. Hedge funds tracking foot traffic to retail stores. Private investigators. Stalkers. Anyone with a budget.
The Consent Theater
The legal fig leaf is "consent." Somewhere in the 47,000 words of privacy policies that no one reads, you technically agreed to let these apps share your location with third parties. The consent is broad, the language is vague, and the implications are invisible. You clicked "Allow" because you wanted to check the weather. You didn't realize you were opting into a surveillance marketplace.
Regulators are starting to push back. The FTC has brought enforcement actions against data brokers selling sensitive location information without adequate safeguards. California just launched a tool that lets residents request deletion of their data from brokers—150,000 people signed up in the first month. But enforcement is slow, fragmented, and perpetually behind the industry's innovation curve.
The Founder's Dilemma
If you're building a consumer app, you face a genuine strategic choice about location data. The options aren't pretty.
Option 1: Participate in the data marketplace. You collect location data, you sell it to aggregators, you make money. This is the default business model for apps that can't monetize directly. Weather apps, flashlight apps, casual games—many of them are doing exactly this. The revenue is real, the ethics are questionable, and the regulatory risk is increasing.
Option 2: Collect but don't sell. You keep location data for your own product purposes—personalization, analytics, feature improvement—but you don't share it downstream. This is more defensible but limits your monetization options. It also means you're still a custodian of sensitive data, with all the security and compliance obligations that entails.
Option 3: Minimize or avoid collection entirely. You request location permissions only when essential, process location on-device where possible, and avoid storing precise coordinates. This is the most privacy-respecting approach but may limit your feature set. It's also increasingly a selling point—"we don't track you" becomes a marketing message in a world where everyone else does.
What the Data Actually Reveals
The depth of location data inference is staggering. From raw coordinates alone, analysts can determine:
Home and work locations. Where does your phone spend nights? Where does it spend weekdays 9-5? These become your home and work addresses, with high confidence.
Relationship patterns. Whose phones are frequently co-located with yours? These become your family members, romantic partners, close friends. The pattern of "two phones that used to be together every night, now never together" is a breakup or separation.
Sensitive activities. Visits to abortion clinics, addiction treatment centers, religious institutions, political events, gun stores, strip clubs—all visible in the data. The location of your phone is a proxy for the location of your body, and the location of your body reveals your private life.
Travel patterns. When you travel for work. When you take vacation. Which hotel you stayed at. Whether you visited the conference venue or spent the whole trip at the hotel bar.
The Airbnb example from the opening isn't hypothetical. Location data can trivially identify which short-term rental you stayed at, how long you were there, and whose phone was there with you. For a private investigator looking for evidence of infidelity, this data is gold.
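The home/work inference above requires nothing sophisticated. Here's a toy sketch with hypothetical pings: the most common rounded coordinate during night hours becomes "home," and the same trick with business hours yields "work":

```python
from collections import Counter

def infer_home(pings, night_hours=range(0, 6)):
    """Guess a device's home location: the most common (rounded)
    coordinate observed during night hours. A toy version of what
    data brokers do at scale."""
    night = [
        (round(p["lat"], 3), round(p["lon"], 3))
        for p in pings
        if p["hour"] in night_hours
    ]
    if not night:
        return None
    return Counter(night).most_common(1)[0][0]

pings = [
    {"hour": 2,  "lat": 37.7701, "lon": -122.4203},  # asleep at home
    {"hour": 3,  "lat": 37.7702, "lon": -122.4199},
    {"hour": 14, "lat": 37.7893, "lon": -122.4008},  # afternoon elsewhere
]
print(infer_home(pings))  # the two night pings agree: that's "home"
```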
The Regulatory Landscape
The US approach to location privacy is a patchwork. No comprehensive federal law exists. The FTC uses its authority over unfair and deceptive practices to go after the most egregious cases, but most location data trading is technically legal. State laws vary wildly—California is aggressive, most states are permissive.
Europe under GDPR is stricter in theory, but enforcement has been inconsistent. The recent trend toward emergency enforcement actions (like Finland's shutdown of Meta's teen data processing) suggests regulators are getting more aggressive, but the location data industry hasn't yet faced the kind of existential regulatory threat that would force business model changes.
The practical implication for founders: you can probably get away with aggressive data practices today. You might not be able to get away with them in three years. Building privacy-respecting systems now is insurance against future regulatory environments.
The Technical Reality
For founders thinking about location features, some technical considerations:
Precision trade-offs. Do you actually need GPS-level coordinates? City-level granularity might serve your product needs while dramatically reducing privacy risk. The difference between "user is in San Francisco" and "user is at 123 Main Street" is enormous from a privacy perspective.
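A sketch of what coarsening can look like in practice: simply rounding decimal degrees before anything is stored or transmitted. The cell sizes in the comments are approximate:

```python
def coarsen(lat, lon, decimals=2):
    """Truncate coordinate precision before storage or transmission.
    Roughly: 2 decimals ~ 1.1 km cells, 1 decimal ~ 11 km, 0 ~ 111 km
    (at the equator; longitude cells shrink toward the poles)."""
    return (round(lat, decimals), round(lon, decimals))

precise = (37.79458, -122.39383)        # a specific street address
print(coarsen(*precise, decimals=2))    # neighborhood-level
print(coarsen(*precise, decimals=1))    # city-level
```

Both mobile platforms also offer coarse-location permission modes, so in many cases you can let the OS do the blurring before your code ever sees a precise fix.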
On-device processing. Modern phones are powerful enough to do sophisticated location processing locally. You can compute distances, match geofences, and personalize experiences without sending raw coordinates to your servers. Apple's approach with on-device ML is a template.
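A geofence check, for instance, needs nothing a phone can't do locally. A minimal sketch using the standard haversine formula; only the yes/no result would ever need to leave the device:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two coordinates."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_geofence(lat, lon, fence_lat, fence_lon, radius_m):
    """Runs entirely on-device: the server only ever needs the boolean,
    never the raw coordinates."""
    return haversine_m(lat, lon, fence_lat, fence_lon) <= radius_m

# Is the user within 500 m of a (hypothetical) store location?
print(in_geofence(37.7751, -122.4194, 37.7749, -122.4194, 500))
```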
Temporal limitations. Do you need historical location data, or just current location? Storing location history creates a much richer (and more sensitive) dataset than ephemeral current-location checks.
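One way to enforce that limit is structural rather than procedural: a store that can only ever hold the latest fix, with a short expiry, so a history physically cannot accumulate. A sketch, with an illustrative retention window:

```python
import time

class EphemeralLocation:
    """Keep only the most recent fix, and expire it quickly, instead
    of accumulating a movement history. max_age_s is illustrative."""
    def __init__(self, max_age_s=300):
        self.max_age_s = max_age_s
        self._fix = None  # (timestamp, lat, lon) or None

    def update(self, lat, lon, now=None):
        self._fix = (now if now is not None else time.time(), lat, lon)

    def current(self, now=None):
        if self._fix is None:
            return None
        ts, lat, lon = self._fix
        if (now if now is not None else time.time()) - ts > self.max_age_s:
            self._fix = None  # stale: drop it rather than keep it around
            return None
        return (lat, lon)

loc = EphemeralLocation(max_age_s=300)
loc.update(37.77, -122.42, now=1000)
print(loc.current(now=1100))   # fresh: coordinates returned
print(loc.current(now=2000))   # expired: nothing retained, returns None
```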
Purpose limitation. If you collect location for one purpose (showing nearby restaurants), don't repurpose it for another (selling to advertisers). This sounds obvious but is the core of most location data abuse.
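Purpose limitation can also be enforced in code rather than policy. A toy sketch (names invented): every read of location data must declare a purpose, which is checked against the purposes declared at collection time, so a later "sell to advertisers" code path fails loudly instead of silently reusing the data:

```python
ALLOWED_PURPOSES = {"nearby_search"}   # declared when location was collected

class PurposeError(Exception):
    pass

def get_location(purpose, fetch):
    """Gate every read of location data on the purpose it was collected
    for. `fetch` is whatever actually reads the sensor or the store;
    the gate in front of it is the point."""
    if purpose not in ALLOWED_PURPOSES:
        raise PurposeError(f"location was not collected for: {purpose}")
    return fetch()

coords = get_location("nearby_search", lambda: (37.77, -122.42))  # allowed
# get_location("ad_targeting", ...) would raise PurposeError
print(coords)
```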
What the Scandals Show
The AT&T settlement—$177 million for a data breach affecting millions—shows what happens when location data gets loose. Users can claim up to $7,500 for proven financial losses. The company's reputation takes a hit that's hard to quantify.
But most location data privacy violations don't result in lawsuits, because users never know they happened. Your data is sold, used, and discarded without any notification. The harm is invisible until it isn't—until someone shows up at your door because they know where you live, or until your insurance premium spikes because your phone visits too many fast food restaurants.
The Founder's Position
If you're building anything with location features, you have to choose what kind of company you want to be. The exploitative path is paved and profitable. The privacy-respecting path is harder but increasingly differentiated.
The market is shifting. Users are more aware. Regulators are more active. The companies that build trust now will have an advantage when the ones that didn't face their reckoning.
Your phone company knows exactly which Airbnb you slipped away to. The question is whether your product is part of that surveillance ecosystem or an alternative to it. There's money on both sides. Choose accordingly.