
posted by Fnord666 on Tuesday January 31 2017, @03:46AM   Printer-friendly
from the max-headroom-lives dept.

This thought-provoking and somewhat frightening article in Vanity Fair describes the state of play of technology with the potential to warp our world-view far beyond anything that has come before.

At corporations and universities across the country, incipient technologies appear likely to soon obliterate the line between real and fake. Or, in the simplest of terms, advancements in audio and video technology are becoming so sophisticated that they will be able to replicate real news—real TV broadcasts, for instance, or radio interviews—in unprecedented, and truly indecipherable, ways. One research paper published last year by professors at Stanford University and the University of Erlangen-Nuremberg demonstrated how technologists can record video of someone talking and then change their facial expressions in real time. The professors' technology could take a news clip of, say, Vladimir Putin, and alter his facial expressions in real time in hard-to-detect ways. In fact, in a video demonstrating the technology, the researchers show how they manipulated Putin's facial expressions and responses, among those of other people, too.

This is eerie, to say the least. But it's only one part of the future fake-news menace. Other similar technologies have been in the works in universities and research labs for years, but they have never really pulled off what computers can do today. Take for example "The Digital Emily Project," a study in which researchers created digital actors that could be used in lieu of real people. For the past several years, the results have been crude and easily detectable as digital re-creations. But technologies that are now used by Hollywood and the video-game industry have largely rendered digital avatars almost indecipherable from real people. (Go and watch the latest Star Wars to see if you can tell which actors are real and which are computer-generated. I bet you can't tell the difference.) You could imagine some political group utilizing that technology to create a fake hidden video clip of President Trump telling Rex Tillerson that he plans to drop a nuclear bomb on China. The velocity with which news clips spread across social media would also mean that the administration would have frightfully little time to respond before a fake-news story turned into an international crisis.

Audio advancements may be just as harrowing. At its annual developer's conference, in November, Adobe showed off a new product that has been nicknamed "Photoshop for audio." The product allows users to feed about 10 to 20 minutes of someone's voice into the application and then type words that are expressed in that exact voice. The resultant voice, which is composed of the person's phonemes—the distinct units of sound that distinguish one word from another in each language—doesn't sound even remotely computer-generated or made up. It sounds real. This sort of technology could make it possible to feed one of Trump's interviews or stump speeches into the application, and then type sentences or paragraphs in his spoken voice. You could very easily imagine someone creating fake audio of Trump explaining how he dislikes Mike Pence, or how he lied about his taxes, or that he did indeed enjoy that alleged "golden shower" in the Russian hotel suite. Then you could circulate that audio around the Internet as a comment that was overheard on a hot microphone. Worse, you could imagine a scenario in which someone uses Trump's voice to call another world leader and threaten some sort of violent action. And perhaps worst of all, as the quality of imitation gets better and better, it will become increasingly difficult to discern between what is real behavior and what isn't.

Regardless of what ideology you subscribe to, what politician you support, or what media source you visit, you fundamentally must be able to trust the information you see. If there is no way, barring forensic analysis, to tell truth from falsehood, how can we know anything? Plausible lies could literally be the end of the world.


Original Submission

  • (Score: 5, Funny) by butthurt on Tuesday January 31 2017, @05:46AM

    by butthurt (6141) on Tuesday January 31 2017, @05:46AM (#461074) Journal

    We already had a story about the Adobe software, in November.

    "Adobe Voco 'Photoshop-for-Voice' Causes Concern [soylentnews.org]"

  • (Score: 2) by Some call me Tim on Tuesday January 31 2017, @05:57AM

    by Some call me Tim (5819) on Tuesday January 31 2017, @05:57AM (#461078)

    OK, I gave you a funny because F$ck Adobe. ;-)

    --
    Questioning science is how you do science!
  • (Score: 1) by charon on Tuesday January 31 2017, @06:21AM

    by charon (5660) on Tuesday January 31 2017, @06:21AM (#461087) Journal

    I looked for a dupe before I submitted, but I missed that. Thank you.