July 9, 2021
State of California Brings the Hammer Down on the Masters of Dark Patterns
This week the Golden State took a big step toward improving its already impressive digital privacy legislation, the CCPA (California Consumer Privacy Act).
To do that, California banned the use of dark patterns: tricks used in apps and websites that purposefully make users do things they didn’t mean to do, or intentionally overcomplicate actions that are disadvantageous to the service, such as unsubscribing.
Unlike bad or lazy design, dark patterns are created specifically to deceive the end user and maximize profits for the service. Many deceptive techniques fit the ‘dark pattern’ description, and we will review some of them in this article.
You have definitely encountered most of them, certainly more than once. If you’re an avid Reddit reader, you have probably seen plenty of dark patterns showcased and ridiculed on one particularly popular subreddit.
We will not review all possible dark pattern types, but concentrate on those that can be particularly malicious and costly for the end user.
Dark Patterns Are Virtually Everywhere
People who are not familiar with the term would probably have trouble pointing one out in an interface.
Not because dark patterns are rare, but because it seems like all the apps we use are built top to bottom from them, and over the years they have become something we are simply used to.
This phenomenon can be compared to “the billboard problem”: older generations noticed how digital advertisements obstructed the view and ruined the landscape, while younger generations didn’t seem bothered, having never experienced anything different.
Dark patterns have been around for quite some time now, though: one of the first mentions goes all the way back to 2010, when UX designer Harry Brignull coined the term and created a website highlighting examples of dark patterns.
They didn’t appear accidentally either.
These money-driven elements sneak into every stage of your app, social network, or email interaction. They are strategically placed along your user journey: the homepage, sign-in forms, profile info, engagement prompts, checkout, and unsubscribe panels.
For example, a barely visible [x] button for closing an annoying ad can be considered a dark pattern, especially if clicking it redirects you to yet another ad in a classic bait and switch.
It’s a widespread dark pattern, but not the most malicious one.
Dubious Techniques and Dark Patterns Combined
There are many dark patterns you can encounter online, but we will focus on those designed to impair or subvert the user’s choice to opt out of having their personal data sold.
Referring to CCPA, we can highlight some of the dark patterns this act bans:
- The use of confusing language and double negatives. For example, “Don’t Not Sell My Personal Information”.
- The complication of opt-out procedures. For example, forcing users to click through the reasons why they should not request an opt-out.
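To see why the double-negative label is so insidious, here is a minimal sketch (hypothetical function names, not from any real consent library) of the boolean logic behind a checkbox labeled “Don’t Not Sell My Personal Information”:

```python
# A hedged sketch of double-negative consent logic.
# Label on the checkbox: "Don't Not Sell My Personal Information".
# "Not sell" is the opt-out; "Don't [not sell]" cancels the opt-out.
# So CHECKING the box actually permits the sale of personal data,
# the opposite of what a privacy-minded user expects.

def may_sell_data(box_checked: bool) -> bool:
    opt_out = not box_checked   # checking the box cancels the opt-out
    return not opt_out          # no opt-out means the data may be sold

# A user who checks the box "for privacy" has in fact opted in:
assert may_sell_data(True) is True     # checked  -> data may be sold
assert may_sell_data(False) is False   # unchecked -> opt-out stands
```

The pattern works precisely because users must unwind two negations in their head; most will not, and the default reading goes against their interest.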
Needless to say, if an organization requires you to sign up for something, it must also let you backtrack and close your account permanently. If the sign-up is made particularly easy but the ‘undo’ is very complicated, that’s a “Roach Motel”.
Some dark patterns can sneak things into your basket, such as subscriptions to magazines or services you were never interested in, by hiding the confirmation in a tricky question or an ambiguous sentence. This can bring hidden costs and forced continuity too!
Forced continuity is when a supposedly free trial silently converts into a paid subscription without any advance warning.
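The mechanics are trivial, which is part of why the pattern is so common. A hedged sketch, assuming a hypothetical 30-day trial and an invented `charge_due` helper, of billing that kicks in with no reminder step:

```python
from datetime import date, timedelta

# Hedged sketch of "forced continuity": a hypothetical service simply
# starts billing the moment the free trial ends. There is no reminder
# email and no confirmation step; the card on file is charged silently.
TRIAL_DAYS = 30

def charge_due(signup: date, today: date) -> bool:
    # The only "logic" is a date comparison; the user is never asked.
    return today >= signup + timedelta(days=TRIAL_DAYS)

assert charge_due(date(2021, 6, 1), date(2021, 6, 15)) is False  # still in trial
assert charge_due(date(2021, 6, 1), date(2021, 7, 1)) is True    # silently billed
```

An ethical design would insert a notification and an explicit confirmation before that comparison ever results in a charge.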
There’s also a technique aptly called “confirmshaming”. One of the clearest examples can be seen on websites like Cosmopolitan, which offer subscription choices along the lines of “yes” and “I’m boring”. Here’s one from Loft.
Not very classy, is it?
Short-Term and Long-Term Effects of Dark Patterns
Dark patterns are used so often because they deliver short-term returns.
In the long run, however, they drive consumers away from the brand: who likes being lied to or taken advantage of in such a way? Users will switch to more ethical products and services, or at least try the alternatives.
For the companies themselves, this means more returns and support calls.
One would think that some of the most popular apps would have the wisdom to stay away from dark patterns and deliver an ethical, transparent service, but reality is different. Instagram, one of the social media giants, is a prime example.
One of Instagram’s updates introduced a new bottom menu layout that replaced the app’s traditional buttons with Reels and Shop tabs.
All to exploit users’ muscle memory: people accidentally tap Reels or Shop where the old buttons used to be.
Interestingly enough, Instagram stated that the move was meant to prioritize its e-commerce platform over traditional posts. If the company is so open about it, can it still be considered a ‘dark pattern’?
Even if not, IG has plenty of other cases that definitely fall into the dark pattern category. When the app asks you to turn on notifications, it offers only two options: “Ok” and “Not Now”. Many of us don’t even see a problem here, because the pattern is so widespread, yet there is no way to give a permanent “No”.
But that’s just the tip of the iceberg.
Instagram is one of the most data-hungry apps on the market today. According to a pCloud study, it is the most invasive app of all, collecting 79% of its users’ personal data, including search history, contacts, financial info, and location, and sharing it with third parties.
So an app’s popularity does not automatically mean the service acts in good faith. On the contrary, it suggests that market leaders set the tone of ethical practice for other apps, and in this case it’s a pretty bad one.
Don’t let these dark patterns take advantage of you: learn more about the other sneaky ways of controlling user behavior, and stay safe.