
You Can’t Handle The Truth About Facebook Ads, New Harvard Study Shows

Accepted submission by Arthur T Knackerbracket at 2018-05-11 07:43:19
Digital Liberty


The study [hbs.edu], based on research conducted at Harvard Business School and published in the Journal of Consumer Research, is an inquiry into the tradeoffs between transparency and persuasion in the age of the algorithm. Specifically, it examines what happens if a company reveals to people how and why they’ve been targeted for a given ad, exposing the algorithmic trail that, say, inferred that you’re interested in discounted socks based on a constellation of behavioral signals gleaned from across the web. Such targeting happens to virtually everyone who uses the internet, almost always without context or explanation.

In the Harvard study, research subjects were asked to browse a website where they were presented with various versions of an advertisement — identical except for accompanying text about why they were being shown the ad. Time and time again, people who were told that they were targeted based on activity elsewhere on the internet were turned off and became less interested in what the ad was touting than people who saw no disclosure or were told that they were targeted based on how they were browsing the original site. In other words, if you track people across the internet, as Facebook [facebook.com] routinely [facebook.com] does [zdnet.com], and admit that fact to them, the transparency will poison the resulting ads. The 449 paid subjects in the targeting research, who were recruited online [mturk.com], were about 24 percent less likely to be interested in making a purchase or visiting the advertiser if they were in the group that was told they were tracked across websites, researchers said [hbr.org].

In a related research effort described in the same study, a similar group of subjects was 17 percent less interested in purchasing if they had been told they’d been targeted for an advertisement based on “information that we inferred about you,” as compared to people who were told they were targeted based on information they themselves provided or who were told nothing at all. Facebook makes inferences about its users not only by leveraging third-party data, but also through the use of artificial intelligence [theintercept.com].
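
The arithmetic behind both headline figures is just a relative difference between group means. Below is a minimal Python sketch of that comparison; the interest scores are hypothetical values invented for illustration, since the paper reports the relative effects rather than these raw numbers.

    # Hypothetical illustration of the "percent less likely" comparisons.
    # The group means are invented; only the relative drops (~24% and ~17%)
    # come from the study's reported results.

    def relative_drop(baseline_mean: float, disclosure_mean: float) -> float:
        """Percent drop in purchase interest for a disclosure group
        relative to its comparison group."""
        return (baseline_mean - disclosure_mean) / baseline_mean * 100

    # Experiment 1: told they were tracked across websites,
    # vs. no disclosure (hypothetical means on a fixed interest scale).
    print(f"{relative_drop(4.2, 3.2):.0f}% less interested")  # prints 24

    # Experiment 2: told the ad used information inferred about them,
    # vs. told it used information they provided themselves.
    print(f"{relative_drop(4.2, 3.5):.0f}% less interested")  # prints 17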

It’s easy to see the conflict this represents for a company that has recently re-dedicated itself to transparency and honesty, yet derives much of its stock market value from opacity.

The paper inadvertently offers an answer to a crucial question of our time: Why won’t Facebook just level with us? Why all the long, vague transparency pledges and congressional evasion [theintercept.com]? The study concludes that when the data mining curtain is pulled back, we really don’t like what we see. There’s something unnatural about the kind of targeting that’s become routine in the ad world, this paper suggests, something taboo, a violation of norms we consider inviolable — it’s just harder to tell they’re being violated online than off. But the revulsion we feel when we learn how we’ve been algorithmically targeted, the research suggests, is much the same as what we feel when our trust is betrayed in the analog world.

The research was, as the study puts it, “premised on the notion that ad transparency undermines ad effectiveness when it exposes marketing practices that violate consumers’ beliefs about ‘information flows’ — how their information ought to move between parties.” So if a clothing store asks you for your email address so that it can send you promotional spam, you may not enjoy it, but you probably won’t consider it a breach of trust. But if that same store were, say, covertly following your movements between the aisles by tracking your cellphone, that would be unnerving, to say the least. Given that Facebook runs its advertising business largely on the basis of data harvesting that’s conducted invisibly or behind the veil of trade secrecy, it has more in common with our creepy hypothetical retailer.

Facebook claims that it does offer advertising transparency, in the form of a tiny, hard-to-locate button that will disclose an extremely vague summary of why you were targeted for a given ad.

“Conspicuous disclosure is uncommon in today’s marketplace,” the study notes. “Digital advertisements are not usually accompanied by information on how they were generated, and when they are, this information is typically inconspicuous, merely made available for the motivated consumer to find.”

The research team tested what would happen if targeted ads were automatically accompanied with explanations of the targeting process, rather than requiring curious users to find the right button. The results are stark and telling:

Ad transparency reduced ad effectiveness when it revealed cross-website tracking — an information flow that consumers deem unacceptable, as identified by our inductive study. … Ad transparency that revealed unacceptable information flows heightened concern for privacy over interest in personalization, reducing ad effectiveness.

In other words, for the same reasons you might not actually want to look at the dingy kitchen that just cooked your greasy burger, ad transparency can be deeply alarming.

For those following Mark Zuckerberg’s various apologias this year, this sounds at odds with one of the Facebook CEO’s favorite lines: People actually want targeted ads. This rationale made a notable appearance during Zuckerberg’s first day of congressional testimony:

Senator, people have a control over how their information is used in ads in the product today. So if you want to have an experience where your ads aren’t — aren’t targeted using all the information that we have available, you can turn off third-party information.

According to Leslie John, an associate professor at Harvard Business School and one of the paper’s authors, this defense by Zuckerberg “oversimplifies things.” If internet users have no choice about whether they see ads at all, they may well prefer so-called relevant ones. But, as John wrote in a Harvard Business Review article [hbr.org] accompanying her paper, “the research supporting ad personalization has tended to study consumers who were largely unaware that their data dictated which ads they saw.”

Or as John explained via email, “If I have to see ads, then yeah, I’d generally prefer ones that are relevant than not relevant but I’d add the qualifier: as long as I get the sense that you are treating my personal information properly. As soon as people feel that you are violating their privacy, they can become uneasy and understandably, distrustful of you.” Zuckerberg’s claim that you prefer to have your most personal information and online behavior tracked and analyzed on an industrial scale probably only checks out if you’re unaware it’s happening.

Assuming the validity of the research here, it’s no wonder Facebook doesn’t want to show its math: The ads that are its lifeblood will stop working as well. John agreed that “there’s a disincentive for firms to reveal unsavory information flows, so that could plausibly explain trying to hide it.” Facebook is, after all, one big, world-spanning, unsavory information flow.


Original Submission