
posted by Fnord666 on Tuesday December 17 2019, @12:34PM
from the browse-with-circularly-polarized-glasses dept.

How Facebook's Political Ad System Is Designed to Polarize

Amid the tense debate over online political advertising, it may seem strange to worry that Facebook gives campaigns too little control over whom their ads target. Yet that's the implication of a study released this week by a team of researchers at Northeastern University, the University of Southern California, and the progressive nonprofit Upturn. By moonlighting as political advertisers, they found that Facebook's algorithms make it harder and more expensive for a campaign to get its message in front of users who don't already agree with it—even when it's trying to.

[...] The paper, still in draft form, is a follow-up to research the group did earlier this year, which found that Facebook's algorithms can dramatically skew the delivery of ads along racial and gender lines even when the advertiser doesn't intend it. That's because while Facebook allows advertisers to design their audience—that's ad targeting—the platform's algorithms then influence who within the audience actually sees the ad, and at what price. That's ad delivery. Because Facebook wants users to see ads that are "relevant" to them, the algorithm essentially pushes a given ad toward users it thinks are most likely already interested in its message. This, the researchers found, can reinforce stereotypes. For example, of the users who saw ads for jobs in the lumber business, 90 percent were male, even though the intended audience was evenly split between men and women. (Facebook is also facing litigation for allegedly allowing advertisers to intentionally discriminate.)
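To make the targeting-versus-delivery distinction concrete, here is a toy simulation of the lumber-ad example. This is not Facebook's actual system; the `relevance()` scores are invented for illustration, standing in for whatever the platform's models predict about a user's interest in the ad:

```python
import random

random.seed(42)

# The advertiser targets an audience evenly split between men and
# women, but the platform delivers each impression with probability
# proportional to a predicted "relevance" score.
audience = [{"gender": "male"} for _ in range(5000)] + \
           [{"gender": "female"} for _ in range(5000)]

def relevance(user):
    # Hypothetical learned scores: the model "thinks" men are far more
    # likely to engage with a lumber-industry job ad.
    return 0.9 if user["gender"] == "male" else 0.1

# Deliver 1,000 impressions, sampling users weighted by relevance.
weights = [relevance(u) for u in audience]
delivered = random.choices(audience, weights=weights, k=1000)

male_share = sum(u["gender"] == "male" for u in delivered) / len(delivered)
print(f"Targeted audience: 50% male; delivered audience: {male_share:.0%} male")
```

Run it and the delivered audience comes out roughly 90 percent male, matching the skew the researchers observed: the advertiser never asked for it, and it emerges entirely in the delivery step.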

For the new study, the team decided to explore whether the algorithm also skews political ad delivery along partisan lines. Because the company doesn't share that information, they had to run a number of experiments, essentially going undercover to figure out where targeting ends and Facebook's algorithms begin.

[...] What seemed to most bother the political strategists I spoke with was not so much the existence of that machinery as its invisibility. In one of the cleverest twists of the experiment, the researchers created a neutral voter registration ad that secretly served code to make Facebook think it directed to one of the campaigns' sites. In other words, to users the ad was completely neutral, but Facebook had been tricked into thinking it was partisan. Lo and behold, the skew was still there—and it could only have come from Facebook's end. Significantly, that would indicate the algorithm was judging the ad's relevance not by its content but purely by who it thought was behind it.
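The article doesn't spell out the mechanism, but one plausible reading is cloaking: if Facebook classifies an ad partly by crawling its destination URL, a server can show the crawler one thing and human visitors another. The sketch below is a hypothetical illustration under that assumption, not the paper's documented method; the Flask route and the campaign URL are invented, while `facebookexternalhit` is the user-agent prefix Facebook documents for its link crawler:

```python
# Hypothetical cloaking sketch (illustrative only): show partisan
# content to Facebook's crawler so the ad gets classified as partisan,
# while every human visitor sees a neutral voter-registration page.
from flask import Flask, request, redirect

app = Flask(__name__)

# Facebook's link crawler identifies itself with this user-agent
# prefix (documented by Facebook for site operators).
FB_CRAWLER_PREFIX = "facebookexternalhit"

@app.route("/register-to-vote")
def landing():
    ua = request.headers.get("User-Agent", "")
    if ua.startswith(FB_CRAWLER_PREFIX):
        # The classifier is sent to a campaign site...
        return redirect("https://example-campaign-site.test/")
    # ...but real users see only neutral content.
    return "<h1>Register to vote</h1><p>Nonpartisan registration info.</p>"

if __name__ == "__main__":
    app.run()
```

Because every human saw the identical neutral page, any partisan skew in who received the ad could only have come from Facebook's classification of the cloaked destination, which is exactly the isolation the experiment needed.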

"This ultimately comes down to a lack of honesty and transparency on the part of Facebook—and that is toxic for our democracy," said Betsy Hoover, a former campaign strategist and the cofounder of the progressive tech incubator Higher Ground Labs, in an email. If the platform is pre-judging which voters should hear from which candidates, regardless of the message, it could be locking campaigns into filter bubbles they aren't even aware of.


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 0) by Anonymous Coward on Tuesday December 17 2019, @11:57PM (#933489)

    Using psychological techniques to impose anxiety with malicious intent is a form of battery. It is legally actionable and frequently a subject of debate in divorce trials. That it is perpetrated by people with doctorates in psychology (typical of advertising executives) makes the effect, and the premeditated nature of it, even more egregious.

    Psychology is pretty universally maligned by the TV industry. Healthcare professionals are typically portrayed as manipulators, or the techniques they use portrayed as malicious.

    Advertisers perpetrate battery to create mass psychosis. TV generally abets this by convincing the victims that the abuse they've experienced hasn't happened, or by teaching them to fear the mechanisms that could relieve their anxiety.

    There is a public harm to ad-tracking-based advertising. It can be quantified. The means of restitution are not unknown to our system of law. But there is a chicken-and-egg problem. It is essentially the same problem that addiction counselors face all the time. How do you convince a disordered mind to abandon the harmful precepts that it believes to be its own savior? How do you convince people not to suffer, when they've been told since childhood by an ever-present digital nanny that suffering is the apex of human behavior?

    How do you unbrainwash 12 jurors who would feel alone, confounded, and addled if separated from a TV for more than a day?