
Your Data Is Not Your Own (And That Should Make You Furious)

March 4, 2026 · 12 minute read

Take control

Part 2 of my series on dark patterns and digital exploitation. If you haven't read The Slow Decay: How Dark Patterns Are Ruining Everything, start there. This one goes deeper.


The Context

I saw an ad for a data removal service the other day. You know the type — "we'll scrub your personal information from the internet for $8 a month."

My first reaction was skepticism.

My second was curiosity.

My third was a four-hour rabbit hole that left me genuinely angry.

Not at the ad itself, but at the fact that this service needs to exist at all.

The Rabbit Hole

Here's what I learned: there are roughly 4,000 data brokers operating globally. These are companies whose entire business model is collecting your personal information — your name, home address, phone number, email, employment history, family members, court records, political affiliation, health conditions, shopping habits, location history — and selling it for as little as fifty cents per record.

Let that number sink in a bit. Your identity, packaged up and sold, for less than a pack of gum (is gum still that cheap?).

They don't ask your permission. They scrape it from public records, purchase it from credit card companies and retailers, pull it from social media profiles, and aggregate it from apps that track your location. A 2019 study found that a person can be re-identified with 99.98% certainty using just 15 demographic data points. Even when brokers claim the data is "anonymized," it's not, not really.

I work in UX and product design and spend my days thinking about how users interact with systems, how to make experiences intuitive and respectful, how to build trust. Dark patterns — the manipulative design choices I wrote about in Part 1 — are the antithesis of everything I believe in professionally. But data brokers? They're not even bothering with the dark pattern. They skip the manipulation and go straight to exploitation. There's no deceptive UI. There's no misleading opt-in. They just take your data and sell it, and the system is built to let them.

Why Is This Legal?

This is the part that turned my curiosity into anger.

In Europe, GDPR operates on a simple principle: you can't collect my data unless I say yes. Opt-in. Consent first. The United States takes the opposite approach: we're taking your data unless you figure out how to say no. Opt-out. Good luck finding the form, sucker.

That's a dark pattern baked into the legal framework itself.

And there is no federal data privacy law. None. Congress has tried twice — the American Data Privacy and Protection Act in 2022 and the American Privacy Rights Act in 2024 — and both died. The reasons are complicated but mostly come down to money, lobbying, and a disagreement about whether a federal law should override stronger state protections. The data broker industry generates billions of dollars. They spend accordingly to keep regulations weak.

So it's been left to the states. And the results are... uneven.

California is leading the charge. Their DELETE Act created a platform called DROP that launched in January 2026 — a single free portal where residents can submit one deletion request that reaches every registered data broker in the state. Brokers who don't comply face fines of $200 per consumer per day. California's privacy agency has already been fining and shutting down non-compliant brokers. That's real teeth.

New Jersey got the NJDPA (New Jersey Data Privacy Act) in January 2025. It gives residents the right to access, correct, delete, and transfer their personal data, plus the ability to opt out of data sales and targeted advertising. The NJDPA explicitly prohibits companies from using dark patterns to obtain consent. That's a meaningful step. But enforcement has only just begun, and the cure period for violations doesn't expire until mid-2026.

Meanwhile, roughly twenty states now have comprehensive privacy laws on the books. The other thirty? Varying degrees of "good luck." Their legislatures seem content to let their constituents' (and their own) personal information get sold and traded like Pokémon cards.

Who's Profiting, and Who's Getting Hurt?

Data brokers aren't just feeding annoying targeted ads. That would be irritating but manageable. The harm they're doing goes much deeper.

Duke University researchers found in 2023 that data brokers were willing to sell mental health data — including diagnoses of depression, anxiety, and bipolar disorder — organized by zip code, age, and marital status. Think about that. Someone's mental health diagnosis, packaged and sold to the highest bidder.

The FTC had to step in when General Motors and OnStar were caught collecting and selling drivers' precise geolocation data and driving behavior to insurance companies without adequate consent. Your car was snitching on you so your insurance premiums could go up. And car manufacturers are still integrating these systems into their vehicles; even if you never use the service, many drivers were quietly enrolled during onboarding, so the software can still be leveraged against you. Read about it; it's deplorable: FTC Takes Action Against General Motors for Sharing Drivers’ Precise Location and Driving Behavior Data Without Consent

Government agencies like ICE have been purchasing data broker information instead of obtaining warrants — using the commercial market to bypass constitutional protections against unreasonable search. The Fourth Amendment doesn't apply when someone else already sold the data.

People-search sites like Whitepages, Spokeo, and BeenVerified publish your home address, phone number, and family members' names for anyone to find. Domestic violence survivors who've relocated? Stalkers can find their new address for a few dollars. And here's the dark pattern kicker: to opt out of many of these sites, you have to provide more personal information to verify your identity. The system designed to remove your data requires you to hand over more data. That's not an accident. That's by design. Darker-than-midnight design.

Who's Fighting the Good Fight?

It's not all bleak. There are organizations doing real work on this front, and they deserve attention and support.

The Electronic Frontier Foundation (EFF) has been defending digital privacy for 35 years. They fight in courts and Congress, develop privacy-protecting tools, and actively recruit volunteer technologists to contribute to open-source projects. If you're technical and want to help, their "Coding with EFF" program is a real way in.

Privacy Rights Clearinghouse has been focused exclusively on data privacy since 1992. They were instrumental in getting California's DELETE Act passed. In 2025, they built a unified database of 750+ data brokers by cross-referencing state registries and reported hundreds of non-compliant companies to enforcement agencies. They also have a complaint form where individuals can submit concerns that get shared with the FTC and state attorneys general. Your complaint can actually reach people with enforcement power.

Fight for the Future, EPIC (Electronic Privacy Information Center), Access Now, and noyb (None of Your Business, a European digital rights center) are all actively litigating, lobbying, and advocating for stronger protections.

These are the good guys. They need funding, volunteers, and public support. Research them to learn about their work, make sure it aligns with your principles, and consider donating, spreading their work, or using the tools they build.

What You Can Do Right Now

As I said in part 1, I'm not going to pretend there's one simple solution. This problem is systemic, and fixing it requires pressure on multiple fronts. But there are real, practical steps you can take today to reduce your exposure and make it harder for brokers to profit off your data.

Start here — these take minutes:

Google yourself. Seriously. Search your name, your name plus your city, your phone number, your email address. Expand outward from one identifier to discover related accounts, documents, and data sources. Start with exact match searches like:

  • "First Last"
  • "First Middle Last"
  • "First M Last"
  • "Last, First"

See what comes up and then wait for your eyebrows to return from the back of your head before going on.
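If you'd rather generate the full set of variants programmatically, here's a minimal Python sketch of the permutation idea. The names and city below are placeholders, and the quoting mirrors how search engines treat exact-match phrases:

```python
# Build exact-match ("quoted") search queries in the formats listed above.
def name_queries(first, last, middle=None, city=None):
    variants = [f"{first} {last}", f"{last}, {first}"]
    if middle:
        # Insert full-middle and middle-initial forms between the two.
        variants[1:1] = [f"{first} {middle} {last}",
                         f"{first} {middle[0]} {last}"]
    queries = [f'"{v}"' for v in variants]
    if city:
        # Narrow each name variant with a location.
        queries += [f"{q} {city}" for q in queries]
    return queries

print(name_queries("Jane", "Doe", middle="Quinn", city="Trenton"))
```

Run the same idea with your phone number and email address in place of the name to widen the sweep.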

Audit your own online footprint using OSINT (Open-Source Intelligence) techniques. OSINT refers to the process of collecting and analyzing publicly available information from sources such as search engines, social media, public records, breach datasets, and documents to build a picture of a person or organization.

Sites like Whitepages, Spokeo, FastPeopleSearch, and BeenVerified will likely have a profile on you. Most of these sites have opt-out forms. They're deliberately hard to find and tedious to complete, but they work. Some do have that Cache-22 caveat of requiring more information to prove you are who you say you are, so exercise caution.

Set up Google's Results About You tool. It lets you request that Google deindex search results containing your personal information. Free, and surprisingly effective.

Opt out of pre-approved credit offers at OptOutPrescreen.com. Freeze your credit with Experian, Equifax, and TransUnion. These two steps cut off major data pipelines.

Check if your state has a comprehensive privacy law. If you're in one of the handful of states with protections, exercise your rights. File deletion requests. Opt out of data sales. Use the tools your state has given you.

As of this writing, the states with comprehensive privacy laws on the books are:

  • California
  • Colorado
  • Connecticut
  • Oregon
  • New Jersey
  • Delaware
  • Maryland
  • Minnesota
  • Virginia
  • Texas
  • Florida
  • Montana
  • Tennessee
  • Indiana
  • Kentucky
  • New Hampshire
  • Nebraska
  • Iowa
  • Utah
  • Rhode Island

Quick context: Most of these laws follow either the California model (stronger) or the Virginia model (business-friendlier). California remains the most aggressive privacy regulator in the U.S., with a dedicated enforcement agency and new tools like the data-broker deletion portal.

Then build better habits:

Stop giving real information to services that don't need it. Use a dedicated email alias for signups that aren't verified essential (banks, utilities, medical). Proton Mail's SimpleLogin integration lets you generate unique addresses per service, so you can trace exactly who sells your data when spam starts arriving. Use a Google Voice number instead of your real phone number for any form that asks for one.
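The reason per-service aliases work is that each alias maps to exactly one service, so a spammed alias identifies the leaker. A minimal Python sketch of the bookkeeping (the addresses and domain are invented for illustration; SimpleLogin's actual alias format differs):

```python
import secrets

# One alias per service; the random suffix keeps aliases unguessable.
# (Domain and service names below are made up for illustration.)
def make_alias(service, domain="alias.example.com"):
    return f"{service}.{secrets.token_hex(3)}@{domain}"

alias_book = {make_alias(s): s for s in ["shopmart", "newsapp", "gymchain"]}

def who_leaked(spammed_address):
    # If spam arrives at an alias, the lookup names the service that sold it.
    return alias_book.get(spammed_address, "unknown")
```

Alias services do this mapping for you; the point is that traceability falls out of the one-alias-per-service discipline, not from any special technology.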

Turn on Global Privacy Control (GPC) in your browser. Under the NJDPA and similar state laws, companies are now required to honor this signal. It's a single setting that communicates your opt-out preference to every site you visit.
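Under the hood, GPC is just an HTTP header (`Sec-GPC: 1`) that the browser attaches to every request, plus a JavaScript property (`navigator.globalPrivacyControl`) that sites can read. A Python sketch of what the signal looks like on the wire; no request is actually sent here:

```python
from urllib.request import Request

# A GPC-enabled browser adds this header to every outgoing request.
req = Request("https://example.com/", headers={"Sec-GPC": "1"})

# Server-side, honoring the signal is a one-line check; under laws like
# the NJDPA, a true result must be treated as an opt-out of sale/sharing.
def wants_opt_out(headers):
    return headers.get("Sec-GPC") == "1"
```

One setting, one header, every site — which is exactly why it scales better than filing opt-outs one company at a time.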

Be intentional about what you share and where. Every form you fill out, every app you grant location access to, every "sign in with Google" you click — that's data entering the pipeline. Not all of it is avoidable, but a lot of it is.

For the nerds — go further:

There are open-source tools that automate the data removal process. Eraser is a free, open-source tool that sends GDPR/CCPA removal request emails to 750+ data brokers automatically. It tracks responses and flags which brokers need manual follow-up. It's a Go-based CLI tool, and if you have a home server or NAS, you can schedule it to run monthly.
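To make the "sends removal request emails" part concrete, here's a hedged Python sketch of the kind of message such tools generate. This is not Eraser's actual code, and the broker address and template wording are invented for illustration:

```python
from email.message import EmailMessage

# Draft a CCPA-style deletion request of the kind removal tools automate.
# (Recipient address and wording are illustrative, not a legal template.)
def deletion_request(your_name, your_email, broker_email, law="CCPA"):
    msg = EmailMessage()
    msg["From"] = your_email
    msg["To"] = broker_email
    msg["Subject"] = f"{law} Request: Delete My Personal Information"
    msg.set_content(
        f"To whom it may concern,\n\n"
        f"Under the {law}, I request that you delete all personal "
        f"information you hold about me and confirm in writing.\n\n"
        f"Name: {your_name}\nEmail: {your_email}\n"
    )
    return msg
```

A message built this way can be handed to `smtplib.SMTP.send_message(...)` on a monthly schedule, which is essentially what running such a tool from a home server amounts to.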

Privotron uses Playwright to automate browser-based opt-out submissions. It's newer and only covers a handful of brokers so far, but the YAML-based playbook system makes it easy to contribute new broker configurations.
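To give a feel for what contributing a broker configuration involves, here's a hypothetical playbook fragment. The field names and selectors are invented for illustration; check Privotron's repository for the real schema:

```yaml
# Hypothetical opt-out playbook sketch (field names invented;
# see the project's docs for the actual format).
broker: examplepeoplesearch
opt_out_url: https://examplepeoplesearch.example/opt-out
steps:
  - fill: { selector: "#email", value: "{{ user.email }}" }
  - fill: { selector: "#full-name", value: "{{ user.name }}" }
  - click: "#submit-request"
  - wait_for: ".confirmation-message"
```

The appeal of a declarative format like this is that adding a new broker means describing its form, not writing new automation code.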

The Big Ass Data Broker Opt-Out List on GitHub is the most comprehensive, regularly updated resource for manual opt-outs. Bookmark it.

If you're willing to spend a little money, services like Optery offer a free tier that scans 150+ broker sites and provides DIY opt-out guides with quarterly monitoring. Running a paid removal service for the first year to clear accumulated data, then transitioning to open-source tools and better personal habits for ongoing maintenance, is a strong strategy.

The Bigger Question

Here's what I keep coming back to: we don't own our data.

Not in any meaningful sense. We generate it. It describes us. It can be used to manipulate, discriminate against, and endanger us. But the legal and commercial frameworks treat it as a commodity that belongs to whoever grabs it first.

That's fundamentally wrong.

I called it "borderline domestic terrorism" in a conversation recently, and while that's obviously hyperbolic, the sentiment isn't entirely off. When your personal information can be bought by anyone for fifty cents, when government agencies use commercial data purchases to sidestep warrant requirements, when stalkers can find a domestic violence survivor's new address through a people-search site — the harm is real, even if the legal system hasn't caught up to calling it what it is.

The data broker industry exists in a regulatory gray area that our laws haven't been designed to address. And until we demand better — from our legislators, from the companies we give our business to, and from ourselves — it will keep operating exactly as it does.

Your Choices Are Your Greatest Weapon

If you read Part 1, you know where I stand: I believe our choices are the most powerful tool we have. Every time you opt out of a data broker, switch to a privacy-respecting service, file a complaint with your state's attorney general, or simply refuse to hand over information that isn't necessary — you're making the data broker business model slightly less profitable.

That matters. Not because one person can dismantle a multi-billion dollar industry. But because critical mass is built one intentional choice at a time.

Support the organizations fighting this fight. Use the tools available to you. Exercise the rights your state has given you. And talk about it — with your family, your friends, your coworkers. Most people don't know this industry exists, let alone how deeply it affects them.

They should.


This is Part 2 of an ongoing series about digital exploitation, dark patterns, and what we can do about it. If you want to keep the conversation going, reach out. I'm also exploring how these issues connect to ethical AI development and community frameworks for responsible technology — more on that soon.

If you want to know when I post something new, drop your email below. No spam — just a heads up when there's a new post.