
Dark UX Patterns.

What makes a company or brand great? Is it their success? Their size? Their brand awareness? Their morality?

Rather than focusing on the usual—impressive profit, massive global expansion, or aesthetically pleasing branded billboards spanning cityscapes—today we'll look at the idea of morality. We'll be discussing 'Dark' UX, and advocating for why it should always be avoided.

Some of the companies that you perceive as 'great' will undoubtedly be guilty of making shady UX decisions to manipulate users... which may make you rethink your perception of said brands.

What is Dark UX?

Simply put, Dark UX (or Dark UX Patterns) refers to user experience decisions that are not designed with the user's best interest in mind. Instead, they tend to be self-serving decisions that aim to influence or manipulate a user into a particular behaviour that is favourable to the business.

Arguably many (or most!) businesses exist to sell their product or service, so it's understandable why a brand might do all they can to increase sales. Some may argue that good advertising campaigns attempt to do the same, influencing potential customers. However, the reason we will be arguing against Dark UX is from a question of honesty, transparency, and morality.

Dark UX patterns tend to be conscious decisions to exploit cognitive bias, or play on common psychological thinking to force choices that would otherwise not be made. When the goal is no longer to empower the user, the brand enters the risky territory of alienating itself from the market by being exposed, shamed or ignored. To expand on this, we can look at the manipulation matrix:

If the brand is providing something beneficial, it may seem less 'evil' to trick users into forced interaction. However, if a brand cannot succeed honestly, then perhaps the strategic plans have gone extremely wrong.

To understand why brands choose to 'trick' their own customers, it's helpful to look at common examples of Dark UX.

Dark UX Examples.

Imagine you're booking a flight.

You've consciously made the decision to fly with a particular airline, on a particular date, and have decided the advertised cost is acceptable. So you proceed with the airline's website to make your purchase.

After selecting your flight you're asked for any meal preferences, such as vegan or kosher. This comes at no extra cost, so you pick your option and click next.

You're then shown a map of the cabin with a seat highlighted for you. You're told you can continue with seat E11 by clicking next, or change by clicking any seat. Certain seats are marked in yellow with +$5 indicated below them. You decide you don't want to pay extra, so simply click next.

Then you're shown a page with additional extras such as ski equipment, musical instruments, and even pets. You're asked to click the tick-box for any extra services you require.

Finally, you are shown a page offering additional insurance, again with a tick-box. You decide not to click this, and continue to the checkout.

To many, the surprise of the final checkout suddenly carrying additional costs will be familiar. It's common to discover here that you have a seat cost and an insurance cost. So how did this happen? Dark UX is at play.

The two prevalent issues here are known as 'Sneak into Basket' and 'Bait and Switch'. We'll outline exactly what they refer to, along with a host of other

Types of Dark UX.

Sneak into Basket

In the case of the airline example, an upsell took place without the user knowing. By preselecting a seat on the plane, and designing the UI to make the 'next' button prominent—yet the 'randomly assign seat' option fairly obscure—the user unwittingly adds a paid upgrade to their purchase.

The most common example of this is having additional products pre-selected, forcing the user to 'opt-out' of items they never expressed interest in.
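The mechanics above can be sketched in a few lines. This is a hypothetical illustration (the item names, prices, and function names are invented for the example, not taken from any real airline): extras arrive pre-selected, so the user must actively opt out to avoid the charge, whereas an honest checkout only bills what the user explicitly chose.

```javascript
// Hypothetical checkout sketch: some extras arrive pre-selected,
// so the user must actively opt OUT to avoid being charged for them.
const extras = [
  { name: "Seat E11", price: 5, preselected: true },          // sneaked in
  { name: "Travel insurance", price: 12, preselected: true }, // sneaked in
  { name: "Extra baggage", price: 20, preselected: false },
];

// The honest pattern: nothing is billed unless the user opts in.
function honestTotal(baseFare, chosenExtras) {
  return baseFare + chosenExtras.reduce((sum, e) => sum + e.price, 0);
}

// The dark pattern: every preselected extra is billed unless
// the user notices it and deselects it by name.
function sneakyTotal(baseFare, deselected = []) {
  return baseFare + extras
    .filter((e) => e.preselected && !deselected.includes(e.name))
    .reduce((sum, e) => sum + e.price, 0);
}

console.log(honestTotal(99, [])); // 99 — the advertised fare
console.log(sneakyTotal(99));     // 116 — $17 of extras never asked for
```

The user clicking 'next' through every screen pays the sneaky total; only a user who spots and unticks each preselected item gets back to the advertised price.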

Bait and Switch

Again with the airline example, a classic Dark UX trick here is the 'Bait and Switch' technique. The user is given a false sense of security by several pages where clicking continue doesn't add any upsell items—then the pattern changes unexpectedly, and suddenly the continue button adds an item to their order automatically.

Very commonly, this can be achieved by using favourable (or less favourable) shades and UI button styles. For example, a greyed out button suggests the button doesn't yet work—discouraging users from clicking it.

Even if the user aims to complete one particular task, the bait and switch can cause a completely different and undesirable action to occur.

Disguised Ads.

Almost unavoidable, internet adverts are everywhere. As attention spans dwindle and users spend their time swimming through social media pages and websites alike, advertisers spot the potential to grab a huge virtual audience. While some adverts earnestly aim to showcase their product or service, there are many adverts designed to exploit user UI expectations.

Many websites offer some sort of downloadable such as PDFs, MP3s, programs, or documents. An advertising space on a download page may seek to exploit user expectations by tricking them with a large 'DOWNLOAD' button graphic.

Essentially, disguised adverts try to blend in with the user interface to dupe users into clicking on their content.

Forced Continuity.

Sometimes a free or heavily discounted offer can seem too good to be true. A business might offer a one-month free trial of their software, or perhaps a weekly product delivery service might offer 90% off your first delivery. Forced continuity refers to such 'freebies' or 'purchases' that are actually tied to a subscription—a.k.a. a continuity of service that was forced upon you.

A business may immorally make information about their offer purposely vague, focusing purely on the up-front savings or freebies they're using to lure people in. The hidden subscription aims to fly under the radar so that users not only accept the terms, but make the follow-up payments unwittingly. By the time the user realises and unsubscribes, they could have paid large recurring sums that they can't legally claim back.

A major warning flag here would be any free service or trial that asks for your credit card details. If the card is required, then you probably need to double-check the fine print.

This tactic is sadly often paired effectively with our next Dark UX discussion point, roach motels.

Roach Motel.

A 'roach motel' is a situation that is very easy to get into, but extremely difficult to get back out of. This is often demonstrated by subscriptions with an intuitive and simple sign-up process, filled with support options—yet the unsubscribe option and contact details are hidden away and purposely made tedious to action.

When combined with forced continuity tactics, some users may be perplexed enough just to live with the subscription costs. A massive (negative) win for the Dark UX teams.

Friend Spam.

Professional networking platform 'LinkedIn' was guilty of this deceptive tactic. When signing up for an account, the platform asks for additional permissions (such as access to your email contacts or other social media accounts) under the guise of helping you grow your network or improving your account experience. Instead, the brand spams all of your contacts with messages claiming to be from you.

Hidden Costs.

Most countries list pricing simply. The price shown is what you pay. Some countries, like the USA, like to take a more awkward approach of separating the tax from the pricing... meaning if a shirt is $99, you'll not actually be able to buy it with a $100 note.

Having billable items hidden—such as tax and delivery—gives the impression a purchase will be more affordable than it actually is. Sometimes the user is guided through a lengthy checkout process, perhaps making an account and entering card details, before being shown the final cost at confirmation.

Another example to consider could be unadvertised minimum spend costs. Perhaps a restaurant offers a main dish at $9, but the minimum order for delivery is $12—forcing the user to purchase more items to complete their order.

Misdirection.

Beautiful interfaces are designed to guide the user through beneficial actions, with clarity. However, preying on how accustomed we have all become to various common features, it has become increasingly easy to predict expectations and exploit them with misdirection tactics.

This could be demonstrated with icons that subvert their common association—such as a backwards arrow that doesn't take users to the previous page—or by the colouring and sizing of buttons and links being set up to prompt a particular action from a user.

A large button with 'Recommended' under it has more appeal than a simple text link in a small font, purposely positioned off-centre.

Price Comparison Prevention.

Perhaps one of the rarer Dark UX patterns, price comparison prevention aims to make it difficult for users to accurately or easily compare information given with that of competitors or other sources.

An interesting example of this was found via job listing website 'theladders'.

"The user must apply for a free account to browse jobs, but still must buy a premium account to apply for the job. Text highlighting is disabled through javascript, so the job cannot be searched for easily elsewhere, where it is free to apply for. In addition, the one-month membership option continues payment when it expires, which is stated in small fine print."

Whilst many Dark UX patterns are packed into that awkward service, disabling text highlighting effectively dissuades users from defecting to other platforms. This could be replicated across many product and service pages that may seem unique, but actually have better-priced competitors offering a near-identical product; if a user can't easily compare information, they may simply not bother.
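The selection-blocking trick described above is technically trivial: a page simply cancels the browser's `selectstart` event so highlighting never begins. The sketch below is a hypothetical illustration (the function and variable names are invented), with a minimal stand-in object so it runs outside a browser:

```javascript
// Hypothetical sketch of blocking text selection via JavaScript:
// a 'selectstart' handler that cancels the event before the
// browser can begin highlighting text.
function disableSelection(element) {
  element.addEventListener("selectstart", (event) => {
    event.preventDefault(); // user drags to highlight; nothing happens
  });
}

// Minimal stand-in for a DOM element so the sketch runs outside a browser.
const fakeJobListing = {
  handlers: {},
  addEventListener(type, fn) { this.handlers[type] = fn; },
};

disableSelection(fakeJobListing);

// Simulate the user starting a text selection on the job listing.
let prevented = false;
fakeJobListing.handlers["selectstart"]({
  preventDefault: () => { prevented = true; },
});
console.log(prevented); // true — the selection never happens
```

In a real browser the same effect is often achieved with the `user-select: none` CSS property; either way the user can't copy the job title to search for it elsewhere.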

Privacy Zuckering.

Named after Facebook CEO Mark Zuckerberg, this tactic aims to trick users into publicly sharing more information than intended.

Originally, Facebook's privacy settings were fairly basic, which led to a lot of accidental 'oversharing' online—with many new users not fully aware of how easily discoverable and public their content would be.

Despite privacy centres on Facebook now being much more powerful, there is now a much bigger worry about behind-the-scenes zuckering. As the data brokerage industry booms, so does the accidental widespread sharing of information by users. This usually follows this flow:

  1. When you use a service, you are prompted to accept the terms and conditions to proceed.
  2. As Terms and Conditions tend to be incredibly lengthy and dense, most people blindly accept them. Within the terms, the service provider has expressed that they can and will share any data collected with third-party providers if deemed desirable (to them).
  3. Data brokers then buy your data and combine it with larger compiled lists, and public data, to then resell your data further.

Currently the data mining and brokering industry is not well regulated. See stopdatamining.me for more information.

Trick Questions.

Simple to understand, and sometimes simple to spot. Trick questions try to catch people out with clever wording that disguises or confuses their true meaning—while giving the brand a safety net of consent should they receive any complaints.

While filling in a form, you respond to a question that tricks you into giving an answer you didn't intend. Glanced at quickly, the question appears to ask one thing; read carefully, it asks another thing entirely.

Imagine a button that read:

"I do not wish to miss out on none of these offers".

Another typical example is to have two similar questions side by side with an opposite approach. This is often asking the user to click a box if they don't want mail from the brand, and to click the other box if they do want mail from 3rd parties. Our eyes would likely scan the question, and presume the same format for both—meaning we end up with an undesired newsletter either way.
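The two-checkbox trick above boils down to a tiny piece of inverted logic. This hypothetical sketch (the function name and field names are invented) shows how a user who skims the form and leaves both boxes unticked still ends up subscribed:

```javascript
// Hypothetical form sketch: two adjacent checkboxes with opposite logic.
// Box 1: tick if you do NOT want mail from the brand.
// Box 2: tick if you DO want mail from third parties.
// A skimming user assumes both boxes follow the same format.
function mailingOutcome(box1Ticked, box2Ticked) {
  return {
    brandMail: !box1Ticked,     // an UNTICKED box 1 means "send me mail"
    thirdPartyMail: box2Ticked, // a TICKED box 2 means "send me mail"
  };
}

// The common skim-through case: both boxes left unticked.
console.log(mailingOutcome(false, false));
// { brandMail: true, thirdPartyMail: false } — subscribed anyway

// A user who ticks both, assuming ticked means "no mail", gets third-party spam.
console.log(mailingOutcome(true, true));
// { brandMail: false, thirdPartyMail: true }
```

Whatever uniform assumption the user makes, one of the two mailing lists catches them—which is exactly the point of the pattern.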

Confirm-shaming.

Brand personalities shine through in many marketing campaigns, for big industry names and newcomers alike...

And there's nothing wrong with taking a strong stance and keeping your tone, voice, and approach consistent. However, there are times when professional tones, clear language, general good practice, and simplicity should be adopted. Here, we're talking about confirmation alerts that aren't simply asking a user to confirm a decision, but also to accept a 'shame' added by the UI choice.

As an example, a streetwear company may have the following buttons appear after a user clicks 'unsubscribe' from their newsletter:

'Keep me subscribed for awesome deals and news'
'No thanks, I don't want to be stylish'

The unsubscribe option here makes the user feel guilty, or shamed, for not selecting the desired option.

Fear of Missing Out.

False time constraints can be a powerful motivator. Websites and applications have introduced lots of useful data analytics to relay key information back to users and help their experience, such as:

  • How many of a particular item are still available.
  • Available sizes/colours/shapes/versions.
  • Available delivery slots.
  • Discount percentage and comparison to original price.
  • Amount of items sold.
  • Last purchase time.
  • Overall review score.

All genuine analytics can be of interest when making a decision, but these tools can also be used—and manipulated—to influence our behaviour.

You may rush a purchase after seeing 'only 2 left in stock' next to a particular item you want. Unfortunately, many websites and apps now use fake stock counters and false time constraints to tempt sales. Websites offering 3-hour or 5-hour flash sales tend to almost always be disingenuous; unless a flash sale has proper surrounding marketing and reason, it's usually just a Dark UX pattern hoping to stop you thinking too much about the purchase they want you to make.
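A fake stock counter like the one described above takes only a few lines to build, which is partly why it's so widespread. This is a hypothetical sketch (the function name and threshold are invented): whatever the real inventory, the displayed figure is clamped down to a 'panic' number:

```javascript
// Hypothetical sketch of a fake scarcity counter: regardless of the
// real inventory, the UI clamps the displayed figure to a low
// 'panic' number to pressure the user into buying now.
function displayedStock(actualStock) {
  const PANIC_THRESHOLD = 2; // always claim "only 2 left"
  return actualStock > 0 ? Math.min(actualStock, PANIC_THRESHOLD) : 0;
}

console.log(displayedStock(500)); // 2 — "only 2 left in stock!"
console.log(displayedStock(1));   // 1 — genuinely low stock looks identical
console.log(displayedStock(0));   // 0
```

Because a genuinely low count and a clamped count look identical to the shopper, there is no way to tell honest urgency from manufactured urgency from the outside.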

Who Uses Dark UX?

Our stance is that Dark UX patterns are undesirable, and we advise against them as their manipulative goals are against our principles for good business. At first, it might seem like only a few hustling bloggers or 'contrepreneurs' are using them to push their zero-to-millions software and solutions, but some of the biggest brands in the market unfortunately capitalise on this dastardly route.

With the digital world still lacking a lot of regulation, many independent watchdog websites have been set up to study, monitor, and call out Dark UX Patterns. In a report from forbrukerradet.no, it was found that Facebook and Google were guilty of purposely making their privacy flows easier for the options they wished users to take:

For instance, in Facebook and Google’s privacy settings process, the more private options are simply disabled by default, and users not paying close attention will not know that there was a choice to begin with. You’re always opting out of things, not in. To enable these options is also a considerably longer process: 13 clicks or taps versus 4 in Facebook’s case.
Source: https://techcrunch.com/ and forbrukerradet.no

Based on the extremely high usage of both Facebook and Google, it can be argued that these companies set the standard for common practice. It's understandable that startups will look to the most successful brands for inspiration and the 'gold standard' for flows that serve business interests. Therefore, when Dark Patterns exist at the top, we can expect to see them replicated, trickling down through more and more user experience designs.

It has to be asked how harmful it is for the likes of Samsung to subtly add sale banners with 'limited time only' ominously unattached to any sense of what that time may be. Or, in the case of eCommerce websites like 'TheRealReal', which has set up its process so that customers can only view the catalogue by first subscribing to the website... which might be marketed as 'exclusivity', but is simply a forced membership model (instantly harvesting customer details for remarketing purposes).

What Matters Most?

When creating or repositioning a brand, teams are always filled with grand ideas, hopes, and visions.

Brands might be created with noble and exciting values... built with purpose to fill a particular need and add value. We refer to this purpose as the brand's 'why'—the reason why they exist. When a business has 'found their why', and pursues being genuinely beneficial to their target market, success becomes easier and more likely.

With all this in mind, we can see that good UX design works in tandem with good purpose. And conversely, using Dark UX Patterns can contradict how the brand wishes to be perceived. To remain transparent, honest, earnest, and true to your principles, you should identify and avoid Dark UX Patterns wherever possible—your audience deserves that. As users and governments globally start holding a magnifying glass up to unethical web practices, businesses have a duty to self-audit their design and ensure they're on the right side of UX.

The 'Front Page' Test.

To conclude, we can call upon the ethical 'Front Page Test' to aid our UX audits.

That test requires asking yourself: How would I feel if the course of action I am considering were reported on the front page of the local newspaper or blog? If you would be at all uncomfortable, the best course of action is not to do it — end of analysis.
Source: Western City.com

Word of mouth is powerful, and your brand reputation will come under scrutiny depending on the hoops you do (or don't) make customers jump through. A short-term spike in sales is unlikely to outweigh long-term positive brand positioning and customer trust.

If your UX was examined, would you be proud of it or apologising?

When selecting a UX agency, make sure your values align with theirs—and don't compromise on the wellbeing of your customers.

#workwithmad
