Some things are difficult by design.
Consider Amazon. The company perfected the one-click checkout. But canceling a $119 Prime subscription is a labyrinthine process that requires multiple screens and clicks.
Or Ticketmaster. Online customers are bombarded with offers for ticket insurance and subscription services for razors and other items, and when they navigate through those, they can expect a battery of text messages from the company with no clear way to stop them.
These are examples of “dark patterns,” the techniques that companies use online to get consumers to sign up for things, keep subscriptions they might otherwise cancel or turn over more personal data. They come in countless variations: giant blinking sign-up buttons, hidden unsubscribe links, red X’s that actually open new pages, countdown timers and pre-checked options for marketing spam. Think of them as the digital equivalent of trying to cancel a gym membership.
In a recent experiment testing some of the most commonly used tactics, like multiple opt-out screens and double negatives, a University of Chicago Law School professor, Lior Strahilevitz, and a law clerk, Jamie Luguri, found that dark patterns are extremely effective at getting consumers to pay for services they didn’t necessarily want. Participants in the study who were subjected to the digital cajoling were nearly four times as likely as a control group to keep a paid data protection service they had been automatically signed up for.
More than one in 10 e-commerce sites rely on dark patterns, according to another study, which also found that many online customer testimonials (“I wouldn’t buy any other brand!”) and tickers counting recent purchases (“7,235 customers bought this service in the past week”) were phony, randomly generated by software programs.
“Everyone is frustrated with dark patterns,” Mr. Strahilevitz said. “Companies are taking a calculated risk that they won’t get caught doing deceptive things because there is no consistent enforcement mechanism for this.”
Read more at The New York Times