Security lapse exposed Clearview AI source code – TechCrunch:
Since it exploded onto the scene in January after a newspaper exposé, Clearview AI has quickly become one of the most elusive, secretive and reviled companies in the tech startup scene.
The controversial facial recognition startup allows its law enforcement users to take a picture of a person, upload it and match it against its alleged database of 3 billion images, which the company scraped from public social media profiles.
But for a time, a misconfigured server exposed the company's internal files, apps and source code for anyone on the internet to find.
Mossab Hussein, chief security officer at Dubai-based cybersecurity firm SpiderSilk, found the repository storing Clearview's source code. Although the repository was protected with a password, a misconfigured setting allowed anyone to register as a new user to log in to the system storing the code.
The repository contained Clearview's source code, which could be used to compile and run the apps from scratch. The repository also stored some of the company's secret keys and credentials, which granted access to Clearview's cloud storage buckets. Inside those buckets, Clearview stored copies of its finished Windows, Mac and Android apps, as well as its iOS app, which Apple recently blocked for violating its rules. The storage buckets also contained early, pre-release developer app versions that are typically only for testing, Hussein said.
The repository also exposed Clearview's Slack tokens, according to Hussein, which, if used, could have allowed password-less access to the company's private messages and communications.
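Leaked keys, tokens, and credentials of the kind described above are exactly what automated secret scanners look for in source repositories. A minimal illustrative sketch in Python, using a few common credential patterns; the pattern set and function name are hypothetical examples, not taken from Clearview or any specific scanning tool:

```python
import re

# Illustrative credential patterns (not exhaustive) of the sort that
# secret-scanning tools match against repository contents.
PATTERNS = {
    "aws_access_key": re.compile(r"AKIA[0-9A-Z]{16}"),
    "slack_token": re.compile(r"xox[bpoas]-[0-9A-Za-z-]{10,}"),
    "private_key": re.compile(r"-----BEGIN (?:RSA |EC )?PRIVATE KEY-----"),
}

def scan_text(text):
    """Return a list of (pattern_name, matched_string) pairs found in text."""
    hits = []
    for name, pattern in PATTERNS.items():
        for match in pattern.finditer(text):
            hits.append((name, match.group(0)))
    return hits

# Example: a hardcoded Slack-style token left in source code.
sample = 'SLACK_TOKEN = "xoxb-123456789012-fakeexampletoken"'
print(scan_text(sample))
```

Real scanners add entropy checks and path filtering to cut false positives, but the core idea is the same: credentials committed to a repository are trivially discoverable by anyone who can read it.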
(Score: 2) by krishnoid on Monday April 20 2020, @09:15PM (2 children)
Seems like anyone can look at the source now (and maybe examine it for bugs), but they don't have the data against which to run it. Considering their explicitly stated mission already makes people hate them anyway, how bad is this, really, for Clearview? Are people gonna complain about inconsistent indentation or undesirable algorithm complexity?
(Score: 1) by DECbot on Monday April 20 2020, @09:45PM
Obviously they train the AI compiled from the source code with a dataset that will guarantee false positives and show the method is unreliable. Then it will be on Clearview to demonstrate the quality of their training dataset and how their running AI is superior to the AI trained on the inferior dataset. In a dick measuring contest, 300 millimeters beats 12 inches.
cats~$ sudo chown -R us /home/base
(Score: 2) by DannyB on Tuesday April 21 2020, @02:29PM
Data wants to be free!
(no, it's his brother that wants to be free)
What if the source could be written by an AI having the properties:
The lower I set my standards the more accomplishments I have.