Dark Patterns are designed to confuse and enroll

Accepted submission by exec at 2016-07-29 14:45:53
News

Story automatically generated by StoryBot Version 0.0.1f (Development).

Note: This is the complete story and will need further editing. It may also be covered
by Copyright and thus should be acknowledged and quoted rather than printed in its entirety.

FeedSource: [HackerNews] collected from rss-bot logs

Time: 2016-07-28 16:40:20 UTC

Original URL: http://arstechnica.com/security/2016/07/dark-patterns-are-designed-to-trick-you-and-theyre-all-over-the-web/ [arstechnica.com]

Title: Dark Patterns are designed to confuse and enroll

Suggested Topics by Probability (Experimental) : 23.5 science 17.6 digiliberty 17.6 OS 11.8 business 5.9 mobile 5.9 hardware 5.9 careersedu 5.9 careers 5.9 breaking

--- --- --- --- --- --- --- Entire Story Below --- --- --- --- --- --- ---
 
 

Dark Patterns are designed to confuse and enroll

Everyone has been there. So in 2010, London-based UX designer Harry Brignull decided he’d document it. Brignull’s website, darkpatterns.org [darkpatterns.org], offers plenty of examples of deliberately confusing [wikipedia.org] or deceptive user interfaces. These dark patterns trick unsuspecting users into a gamut of actions: setting up recurring payments, purchasing items surreptitiously added to a shopping cart, or spamming all contacts through prechecked forms on Facebook games.

Dark patterns aren’t limited to the Web, either. The Columbia House mail-order music club of the '80s and '90s famously charged users exorbitant rates for music they didn’t choose if they forgot to specify what they wanted. In fact, negative-option billing began as early as 1927, when a book club decided to bill members in advance and ship a book to anyone who didn’t specifically decline. Another common offline example? Some credit card statements boast a 0 percent balance transfer rate but don’t make it clear that the rate will shoot up to a ridiculously high number unless the reader wades through a long agreement in tiny print.

“The way that companies implement the deceptive practices has gotten more sophisticated over time,” said UX designer Jeremy Rosenberg [jeremyrosenberg.co.uk], a contributor to the Dark Patterns site. “Today, things are more likely to be presented as a benefit or obscured as a benefit even if they’re not.”

When you combine the interactive nature of the Web, increasingly savvy businesses, and the sheer amount of time users spend online, you have a recipe for dark-pattern disaster. And once you gain an awareness of this kind of deception, you'll recognize it's nearly ubiquitous.

With six years of data, Brignull has broken dark patterns down into 14 categories. There are hidden costs users don’t see until the end. There’s misdirection, where sites attract user attention to one section of a page to distract them from another. Other categories include sites that prevent price comparison or pose tricky or misleading opt-in questions. One type, Privacy Zuckering [darkpatterns.org], refers to confusing interfaces that trick users into sharing more information than they intend to. (It’s named after Facebook CEO Mark Zuckerberg, of course.) Perhaps the worst class of dark pattern is forced continuity: the common practice of collecting credit card details for a free trial and then automatically billing users for a paid service without an adequate reminder.

But while hackers and even SEO firms are often distinguished as “white hat” or “black hat,” intent isn’t always as clear when it comes to dark patterns. Laura Klein, Principal at Users Know [usersknow.com] and author of UX for Lean Startups, is quick to point out that sometimes it’s just a really, really poor design choice. “To me, dark patterns are very effective in their goal, which is to trick the user into doing something that they would not otherwise do,” she said. Shady patterns [medium.com], on the other hand, simply push the company’s agenda over the user’s desires without being explicitly deceptive.

Examples of bad design choices that may be accidental aren’t hard to find. British Airways lists flights [darkpatterns.org] that are the second-lowest price as the lowest, and it’s hard to tell whether this misdirection is intentional. And examples of deceptive patterns that are, strictly speaking, completely legal are a dime a dozen. There’s the unclear language hidden in 30-page Terms of Service agreements, which lull users into a sense of complacency as they hit “agree” on every page. Sometimes users agree to allow apps to post on their Twitter feed or Facebook walls but later forget that this feature is enabled. The app doesn’t let them know at the moment it’s going to post, of course.

“The companies that know what they’re doing operate in sort of a safe zone where they’re not likely to be prosecuted or get into trouble legally,” Brignull explained.

Over time, users have been desensitized to these permissions. There are subscription sites that renew automatically without sending a reminder a few days in advance, or ones that are very easy to sign up for online but force users to cancel by phone during business hours. And the vicious cycle of online advertising is even more difficult to pierce. There are ads that follow you around the Web, known as behavioral targeting, and ads based directly on things like your Web history or search terms. Opting out of this is so difficult that UX designer and Dark Patterns contributor James Offer [codehesive.com] considers it a dark pattern in its own right.

Even though the line between outright deception and poor user design is often hard to distinguish, Brignull said “there are some sites where it’s clearly intentional—they’re doing too many things for it to be by accident.” As an example, he points to The Boston Globe, which was recently called out [rationalconspiracy.com] for multiple dark patterns. Among the offenses, the site didn’t inform subscribers of price increases and buried rates in the site’s FAQ.


-- submitted from IRC


Original Submission