Surveillance pricing is suddenly in the spotlight, thanks in part to a recent push by Canada’s New Democratic Party to curb the practice. At its core, the concept is simple but unsettling: companies use personal data to tailor prices as closely as possible to what each individual customer is willing to pay. It’s not about offering a universally fair price – it’s about extracting maximum value from each person, one by one.

None of this exists in a vacuum. The collection of personal data has become so routine that it barely registers anymore. Membership cards, online tracking, purchase histories – we hand over this information almost automatically. And often, it feels harmless, even beneficial. Getting coupons tailored to your habits or a surprise discount can feel like a small win.

I remember using a membership card while buying dog food for my mother years ago. A few weeks later, I received a cheerful package in the mail congratulating me on my “new puppy,” complete with coupons. It was inaccurate, but it felt thoughtful in a strange, algorithmic way. That’s the hook: when personalization feels like a perk, we don’t question the machinery behind it.

But the line between personalization and manipulation is thinner than it appears.

More recently, I noticed a discount in my “my offers” section for a large package of toilet paper – a product I buy infrequently and only on sale. For my small household, one purchase lasts a long time. When I mentioned the deal to a friend with a much larger household, she checked her account and found no such offer. The implication was clear: because she buys more often – and presumably needs it more – she wasn’t given the same break.

That’s where surveillance pricing stops feeling like a perk and starts looking like a penalty. Two customers, two different prices, based not on cost or fairness, but on predicted behavior. One is rewarded for being flexible; the other is quietly charged more for being consistent.

To be clear, pricing has always been a nuanced exercise. Businesses must balance costs, margins, and what the market will bear. But there has traditionally been a shared baseline – a price that, while sometimes adjusted through sales or promotions, remained broadly visible and consistent. Surveillance pricing erodes that baseline. It replaces it with a fluid, opaque system where the price you see is not necessarily the price I see, and neither of us knows why.

Years ago, I attended a sales workshop where this philosophy was laid out in blunt terms: discard standard pricing guides and instead evaluate each customer individually. Charge what you think you can get. I remember pushing back. The idea of charging someone more simply because they seemed able to pay – or because they were in a bind – felt ethically dubious at best. “Smarmy” was the word that came to mind then, and it still fits.

At the time, my resistance was framed as being out of step with modern sales thinking. Ironically, that mindset has now been supercharged by technology. What was once a subjective judgment call is now driven by data – vast, detailed, and constantly updated. With advances in artificial intelligence, companies can analyze purchasing habits, browsing behavior, and even timing patterns with a precision that far exceeds human intuition.

The result is a marketplace where consumers are no longer just buyers – they are data profiles to be optimized against. Every click, every purchase, every hesitation feeds a system designed to calculate how much more you can be nudged to spend.

This raises a fundamental question: at what point does smart pricing become unfair pricing? When essential goods are involved, the stakes are even higher. If those who need a product most are systematically denied discounts, the system begins to resemble a quiet form of discrimination – one that operates without visibility or accountability.

Surveillance pricing isn’t new in principle, but its scale and sophistication are unprecedented. The tools now exist to individualize nearly every price we encounter. That may be efficient, but efficiency is not the same as fairness.

As this debate gains traction, it’s worth asking not just what companies can do with our data, but what they should do. Because once pricing becomes a personalized negotiation conducted in the dark, trust – arguably the most important currency in any marketplace – starts to erode.

Nicole Fawcett