Ethical Design: Why Dark Patterns Are Becoming Legally Stupid
Let’s get something straight: most dark patterns aren’t designed by “evil people.” They are designed by normal product teams under immense pressure. Someone wants growth, someone wants to hit a conversion target, and someone else needs to fix a drop-off. Suddenly, your checkout flow looks like a casino designed by a psychopath—not because you’re a villain, but because manipulation is the easiest shortcut to results.
We’ve learned that accessibility is just good UX, but dark patterns are the exact opposite: they are bad UX disguised as success. The reason this topic is exploding right now isn’t just about “doing the right thing” or being ethical for the sake of it. It’s that in the EU, dark patterns are increasingly treated as illegal interface design. The Digital Services Act (DSA) is effectively turning manipulative design into a high-stakes regulatory risk.
Ethical design is often framed like charity—being “nice” to users. That is a weak framing. The real definition is much sharper: ethical design means the user stays in control. No manipulation, no shame, no hidden exits, and no sneaky defaults. It respects autonomy, ensuring people understand what they are doing and are free to choose without psychological tricks. In one line: ethical design persuades, while dark patterns pressure.
Trust is not a “fluffy” concept; it is a business asset. Dark patterns—or “deceptive design”—are UI patterns crafted to push people into actions they didn’t fully intend. While the short-term results look like growth, underneath the surface, you’re brewing a toxic cocktail of annoyed users, higher churn, more refunds, and negative reviews. If your product needs manipulation to convert, you don’t have a conversion problem—you have a value problem.
Let’s look at the four dark patterns you should kill first, starting with Forced Continuity. This is the classic “free trial” that quietly turns into a subscription. You enter payment details for a 7-day trial, and the cancellation process is made intentionally vague or hidden. It creates a customer relationship based on surprise and resentment. The ethical alternative? Make renewal explicit with clear end dates and reminders. You can still sell; you just stop tricking.
Next is Confirmshaming, which is pure emotional manipulation. This happens when the “decline” option is written to make the user feel stupid or guilty. Think of buttons like “No thanks, I prefer to pay full price” or “No, I don’t care about my health.” It’s the UI equivalent of a salesperson rolling their eyes at you. The fix is simple: write neutral options. A user should never be punished for saying “no.”
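A shame audit like this can even be partially automated. As a minimal sketch (the function name and word list below are my own illustration, not an established tool), a copy lint can flag decline labels that lean on guilt or sarcasm instead of neutral language:

```typescript
// Hypothetical confirmshaming lint: flags decline-button copy that uses
// guilt or sarcasm. The marker list is a starting point, not an
// exhaustive detector; real audits still need a human read-through.
const SHAME_MARKERS = [
  "i don't care",
  "i prefer to pay",
  "i hate",
  "i like missing out",
];

function isConfirmshaming(declineLabel: string): boolean {
  const label = declineLabel.toLowerCase();
  return SHAME_MARKERS.some((marker) => label.includes(marker));
}

// Neutral copy passes; guilt-trip copy gets flagged.
isConfirmshaming("No thanks");                            // false
isConfirmshaming("No, I don't care about my health");     // true
isConfirmshaming("No thanks, I prefer to pay full price"); // true
```

Running something like this over your string catalog is a cheap way to surface candidates before a manual review.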
Then we have Misdirection, where the UI steers you like a shopping cart with one broken wheel. This is where the “good for the company” option is visually dominant while the real choice is buried. We’ve all seen the massive green “Accept All” button next to a tiny, grey “Manage Options” link. Real choice requires equal clarity: equal visual weight, simple language, and zero trick phrasing.
The fourth one to axe is the Roach Motel. It’s the pattern that makes it incredibly easy to get in, but nearly impossible to get out. Signing up takes one click; cancelling requires a ten-step quest involving hidden settings, customer support tickets, and email confirmations. Ethically, this screams that your product doesn’t deserve to keep its users. Your cancellation process should be just as easy as your sign-up.
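"Just as easy as sign-up" is measurable. As a rough sketch (the types and tolerance are my own assumptions), you can model each flow as a list of steps and fail the build when cancellation is meaningfully longer than sign-up:

```typescript
// Illustrative Roach Motel check: a cancellation flow fails if it has
// noticeably more steps than the sign-up flow it mirrors.
interface Flow {
  name: string;
  steps: string[];
}

function isSymmetric(signup: Flow, cancel: Flow, tolerance = 1): boolean {
  // Allow at most `tolerance` extra steps on the way out.
  return cancel.steps.length - signup.steps.length <= tolerance;
}

const signup: Flow = { name: "signup", steps: ["enter email", "confirm"] };
const cancel: Flow = {
  name: "cancel",
  steps: ["find hidden setting", "open chat", "wait for agent", "confirm by email"],
};

isSymmetric(signup, cancel); // false: this is a Roach Motel
```

The exact tolerance is a judgment call; the point is that asymmetry becomes an explicit, reviewable number instead of a vague feeling.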
In the past, these patterns were “just” a brand risk. Now, they are a legal liability. The European Commission has made it clear that under the Digital Services Act, online platforms must not design interfaces that deceive or manipulate. Regulators are already investigating large platforms for deceptive design. This is the direction of travel: less tolerance, more enforcement power, and consequences that are far more expensive than a few angry tweets.
From a business perspective, dark patterns are becoming legally stupid. When regulators start looking at “deceptive interface design,” they aren’t looking at your intentions; they are looking at the impact on the user. If your UI relies on double negatives (“Don’t not share my data”) or confusing button hierarchies, you are painting a target on your back for audits and massive fines.
As a UX strategist, you’ve probably been in that meeting where someone asks: “Can we make the cancel button harder to find?” This is where you have to stop being the moral police and start being a trust engineer. You aren’t there to be ethical for ethics’ sake; you are there to protect user trust as a core product asset. Trust makes products scalable; manipulation makes them fragile.
UX is the system that translates business intent into user experience. That means designers and product owners are the last line of defense before persuasion becomes coercion. To sniff out a dark pattern in your own product, ask yourself a simple question: “Would we still design it this way if we were forced to explain the logic out loud to our users?” If the answer is a hesitant “no,” then you already know it’s a trap.
Check your interface for the “Trap Test.” If a user chooses the option that doesn’t benefit the business, do you make them feel stupid? Do you hide that option visually? Do you add extra friction or use confusing copy? If you do any of these things, you aren’t designing a choice—you are designing a cage. And users are getting much faster at spotting the bars.
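The Trap Test above can be sketched as a small checklist function (all names here are illustrative, not an established API): a choice passes only if none of the four manipulation tactics is present.

```typescript
// Illustrative Trap Test: the decline path fails if any tactic from
// the checklist is present in the design.
interface ChoiceDesign {
  shamesDecline: boolean;        // guilt or sarcasm in the decline copy
  hidesDeclineVisually: boolean; // decline is smaller, grey, or buried
  addsExtraFriction: boolean;    // extra steps only on the decline path
  usesConfusingCopy: boolean;    // double negatives, trick phrasing
}

function passesTrapTest(design: ChoiceDesign): boolean {
  return !(
    design.shamesDecline ||
    design.hidesDeclineVisually ||
    design.addsExtraFriction ||
    design.usesConfusingCopy
  );
}

// The classic cookie banner: big green "Accept All", tiny grey link.
passesTrapTest({
  shamesDecline: false,
  hidesDeclineVisually: true,
  addsExtraFriction: false,
  usesConfusingCopy: false,
}); // false
```

Encoding the checklist this way makes it usable in design reviews: each boolean is a question you answer out loud, and one "yes" fails the design.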
The shift toward ethical design is a shift toward maturity. We are moving away from the “growth at all costs” Wild West and into an era where product quality is measured by how much a user trusts the interface. Dark patterns are a sign of a weak product that can’t survive on its own merits. They are the “junk food” of UX—satisfying for a second, but damaging in the long run.
Ultimately, killing dark patterns is just good business. It reduces support volume, lowers chargeback rates, and increases long-term retention. When people feel respected by a UI, they stay. When they feel tricked, they leave the moment they find an alternative—and they usually leave a nasty review on the way out.
Ethical design isn’t about being virtuous or “soft.” It’s about building products that people actually want to keep using without regret or hidden exits. It’s about being professional enough to realize that manipulation is just a lazy way to avoid fixing your actual product flaws.
So, here is the takeaway: WCAG 2.2 and the DSA are converging. Accessibility and ethics are no longer optional “side quests.” They are the new baseline for what defines a professional, high-quality digital product. Stop tricking your users and start building something that deserves their time and money.
My Top 3 Tips for Ethical Design:
- The “Symmetry Rule”: If it takes one click to subscribe, it must take one click to unsubscribe. If it takes one click to “Accept All,” the “Reject All” option must be just as prominent.
- Audit your Copy for “Shame”: Search your UI for any buttons or links that use sarcasm or guilt to influence a choice. If you find them, replace them with neutral, descriptive text immediately.
- Use “Active Choice” for Consent: Instead of pre-checked boxes or hidden settings, present the user with two clear, equal choices. It builds more trust and actually leads to higher-quality data and leads.
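One way to enforce active choice in code (a sketch under my own assumptions; the types are hypothetical) is to model consent with no default value at all, so a pre-checked state simply cannot exist:

```typescript
// Active-choice consent: there is deliberately no default. Until the
// user explicitly picks one of two equal options, consent is
// "undecided" and nothing may be tracked.
type Consent = "accepted" | "rejected";

interface ConsentState {
  analytics?: Consent; // undefined = the user has not chosen yet
}

function mayTrack(state: ConsentState): boolean {
  // Only an explicit "accepted" enables tracking; "undecided" and
  // "rejected" are treated identically.
  return state.analytics === "accepted";
}

mayTrack({});                        // false: no choice made yet
mayTrack({ analytics: "rejected" }); // false
mayTrack({ analytics: "accepted" }); // true
```

Making "undecided" a first-class state keeps the UI honest: it has to ask, and it has to accept "no" as cheaply as "yes".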