from the geeks-discover-pron dept.
Submitted via IRC for TheMightyBuzzard
Pornhub is one of the pre-eminent porn sites on the web. Each year Pornhub releases a year-in-review post with anonymous details about the site's users. More and more Linux users are visiting Pornhub: Linux saw an impressive 14% increase in traffic share in 2016.
[...] While Windows continues to dominate when it comes to which operating system users count on to watch Pornhub (about 80% of desktop users), Mac OS and Linux are on the rise, with Mac OS up 8% in traffic share and Linux up an impressive 14%.
Moving on to mobile, the playing field is pretty even, with Android and Apple iOS almost on par with one another. Android leads with 3% more users on Pornhub than Apple iOS (47% of Pornhub's mobile users), and Android's mobile market share has increased by 5% over the last year.
Look, it wasn't all me. I swear.
Pornhub has begun to use machine learning to automatically tag videos:
Artificial intelligence has proven to be a dab hand at recognizing what's going on in photos and videos, but the datasets it's usually trained on are pretty genteel. Not so for Pornhub, which announced today that it's using machine learning to automatically catalog its videos.
The site is starting small, deploying facial recognition software that will detect 10,000 individual porn stars and tag them in footage. (Usually this information is provided by uploaders and viewers, who will still play a part by verifying the software's choices.) It plans to scan all 5 million of its videos "within the next year," and then move on to more complicated territory: using the software to identify the specific categories videos belong to, like "public" and "blonde."
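Pornhub hasn't published technical details, but automated performer tagging of this kind is typically built on face embeddings: each known performer gets a reference vector, and faces detected in video frames are matched against that gallery by similarity. A minimal sketch of the matching step, assuming a cosine-similarity match against a gallery (the names, vectors, and threshold below are all hypothetical; a real system would get its embeddings from a deep face-recognition model):

```python
import math

# Hypothetical gallery: performer name -> reference face embedding.
# Real systems derive these from a deep face-recognition model; the
# 4-dimensional vectors here are toy values for illustration only.
GALLERY = {
    "performer_a": [0.9, 0.1, 0.3, 0.2],
    "performer_b": [0.1, 0.8, 0.4, 0.3],
}

SIMILARITY_THRESHOLD = 0.9  # below this, leave the face untagged

def cosine_similarity(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def tag_face(face_embedding):
    """Return the best-matching performer, or None if no match is confident.

    Mirrors the workflow the article describes: the software proposes a
    tag, and uploaders/viewers then verify it.
    """
    best_name, best_score = None, 0.0
    for name, ref in GALLERY.items():
        score = cosine_similarity(face_embedding, ref)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= SIMILARITY_THRESHOLD else None

print(tag_face([0.88, 0.12, 0.28, 0.22]))  # close to performer_a
```

The verification step matters because a nearest-gallery match always returns *something*; the threshold plus human confirmation is what keeps low-confidence matches from becoming wrong tags.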
In a press statement, Pornhub VP Corey Price said the company was joining the trend of firms using AI to "expedite antiquated processes." However, the speed at which Pornhub's AI processes the data doesn't seem like it would be an improvement on its current crowdsourced system. While in beta, the machine learning software apparently scanned some 50,000 videos in a month. At this rate it would take nearly a decade to scan the entire site, but presumably improvements are being made.
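As a back-of-the-envelope check on that "nearly a decade" figure (assuming the beta rate of roughly 50,000 videos per month stays constant, which the article itself suggests it won't):

```python
# Rough sanity check of the article's estimate: 5 million videos
# scanned at the beta rate of ~50,000 videos per month.
TOTAL_VIDEOS = 5_000_000
VIDEOS_PER_MONTH = 50_000

months = TOTAL_VIDEOS / VIDEOS_PER_MONTH  # 100 months
years = months / 12                       # ~8.3 years

print(f"{months:.0f} months, about {years:.1f} years")
```

100 months is about 8.3 years, which matches the article's "nearly a decade" characterization.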
Meanwhile, a security firm has warned that millions of Pornhub users were targeted by "malvertising" for more than a year:
Millions of Pornhub users were targeted with a malvertising attack that sought to trick them into installing malware on their PCs, according to infosec firm Proofpoint.
By the time the attack was uncovered, it had been active "for more than a year", Proofpoint said, having already "exposed millions of potential victims in the US, Canada, the UK, and Australia" to malware by pretending to be software updates to popular browsers.
Pornhub, the world's largest pornography site with 26bn yearly visits according to data from ranking firm Alexa, and its advertising network have shut down the infection pathway, but the attack is still ongoing on other sites.
Related: BugReplay - Finding How Ads Get Past the Blockers
Linux Use on Pornhub Surged 14% in 2016
Malvertising Campaign Finds a Way Around Ad Blockers
Pornhub's Newest Videos Can Reach Out and Touch You
Pornhub will be deleting "deepfakes" — AI-generated videos that realistically edit new faces onto pornographic actors — under its rules against nonconsensual porn, following in the footsteps of platforms like Discord and Gfycat. "We do not tolerate any nonconsensual content on the site and we remove all said content as soon as we are made aware of it," the company told Motherboard, which first reported on the deepfakes porn phenomenon last year. Pornhub says that nonconsensual content includes "revenge porn, deepfakes, or anything published without a person's consent or permission."
Update: The infamous subreddit itself, /r/deepfakes, has been banned by Reddit. /r/CelebFakes and /r/CelebrityFakes have also been banned for their non-AI porn fakery (they had existed for over 7 years). Other subreddits like /r/fakeapp (technical support for the software) and /r/SFWdeepfakes remain intact. Reported at Motherboard, The Verge, and TechCrunch.
Motherboard also reported on some users (primarily on a new subreddit, /r/deepfakeservice) offering to accept commissions to create deepfakes porn. This is seen as more likely to result in a lawsuit: