Facial-recognition data is typically used to prompt more vending machine sales:
Canada-based University of Waterloo is racing to remove M&M-branded smart vending machines from campus after outraged students discovered the machines were covertly collecting facial-recognition data without their consent.
The scandal started when a student using the alias SquidKid47 posted an image on Reddit showing a campus vending machine error message, "Invenda.Vending.FacialRecognitionApp.exe," displayed after the machine failed to launch a facial recognition application that nobody expected to be part of the process of using a vending machine.
"Hey, so why do the stupid M&M machines have facial recognition?" SquidKid47 pondered.
The Reddit post sparked an investigation from a fourth-year student named River Stanley, who was writing for a university publication called MathNEWS.
Stanley sounded the alarm after consulting Invenda sales brochures that promised "the machines are capable of sending estimated ages and genders" of every person who used the machines, without ever requesting consent.
[...] A University of Waterloo spokesperson, Rebecca Elming, eventually responded, confirming to CTV News that the school had asked to disable the vending machine software until the machines could be removed.
[...] Adaria Vending Services told MathNEWS that "what's most important to understand is that the machines do not take or store any photos or images, and an individual person cannot be identified using the technology in the machines. The technology acts as a motion sensor that detects faces, so the machine knows when to activate the purchasing interface—never taking or storing images of customers."
According to Adaria and Invenda, students shouldn't worry about data privacy because the vending machines are "fully compliant" with the world's toughest data privacy law, the European Union's General Data Protection Regulation (GDPR).
"These machines are fully GDPR compliant and are in use in many facilities across North America," Adaria's statement said. "At the University of Waterloo, Adaria manages last mile fulfillment services—we handle restocking and logistics for the snack vending machines. Adaria does not collect any data about its users and does not have any access to identify users of these M&M vending machines."
Under the GDPR, face image data is considered among the most sensitive data that can be collected, typically requiring explicit consent to collect, so it's unclear how the machines may meet that high bar based on the Canadian students' experiences.
According to a press release from Invenda, Mars, the maker of M&M candies, was a key part of Invenda's expansion into North America. It was only after closing a $7 million funding round, including deals with Mars and other major clients like Coca-Cola, that Invenda could push for the expansive global growth that seemingly vastly expands its smart vending machines' data collection and surveillance opportunities.
"The funding round indicates confidence among Invenda's core investors in both Invenda's corporate culture, with its commitment to transparency, and the drive to expand global growth," Invenda's press release said.
But University of Waterloo students like Stanley now question Invenda's "commitment to transparency" in North American markets, especially since the company is seemingly openly violating Canadian privacy law, Stanley told CTV News.
(Score: 5, Insightful) by mcgrew on Wednesday February 28 2024, @01:38AM (2 children)
The damned corporations just keep getting more evil. Why is it legal to require that I use your app to charge my car at your station? It's like needing a different app for every brand of gas station!
We need a privacy law like Europe has. I thought Canada had one, I guess not.
It is a disgrace that the richest nation in the world has hunger and homelessness.
(Score: -1, Troll) by Anonymous Coward on Wednesday February 28 2024, @02:26AM (1 child)
> The damned corporations just keep getting more evil.
Why stop at corps? Isn't it likely that, "in Soviet Russia, vending machine watches you!"
(Score: 2) by turgid on Wednesday February 28 2024, @07:41AM
In Soviet Russia the lethargic economic system would not have allowed that sort of technology to develop to the point it was possible to put it in vending machines.
I refuse to engage in a battle of wits with an unarmed opponent [wikipedia.org].
(Score: 2) by ChrisMaple on Wednesday February 28 2024, @02:18AM (3 children)
When I was in college, about 1970, one of the bigger fools in my dorm kicked in the front of a vending machine. No particular reason, just the random malice and anti-capitalism so popular among college students in those days. I would have loved to see him identified and jailed.
(Score: 2, Funny) by Anonymous Coward on Wednesday February 28 2024, @02:31AM (2 children)
My college friends were more resourceful. They worked out how to take things out of one of the "sandwich" type of vending machines--it had multiple doors, and in normal operation you rotated an internal mechanism until what you wanted was behind one of the doors. Then you selected and paid, and that door was unlatched.
The next step after working out how to open the doors took a little leap--as well as taking things out, they put interesting things back in! Ones I remember were a joint (not very good pot), sandwiches with various electronic components (resistor sandwich anyone?) and a variety of other clever things.
(Score: 2) by RS3 on Wednesday February 28 2024, @06:10PM (1 child)
That is a riot. I wonder if people then bought those swapped-in things?
(Score: 0) by Anonymous Coward on Thursday February 29 2024, @08:26PM
Either bought, or other thieves liberated them--a suitably bent piece of wire was all it took to trip the release on the individual doors. I'm pretty sure they were gone before the machine was officially restocked.
(Score: 2) by bzipitidoo on Wednesday February 28 2024, @02:20AM (4 children)
What does it mean that the equipment and software for facial recognition is so cheap it can be included in a vending machine of all things? Facial recognition is or will be everywhere.
(Score: 4, Touché) by Tork on Wednesday February 28 2024, @02:26AM
🏳️🌈 Proud Ally 🏳️🌈
(Score: 2) by RS3 on Wednesday February 28 2024, @06:14PM (2 children)
Just speculating: they may be sending simple image files to a powerful processor / server somewhere.
(Score: 3, Informative) by aafcac on Wednesday February 28 2024, @09:40PM (1 child)
Probably, although a Raspberry Pi can do image recognition now, so it could be either. I think there would need to be data going back to a server for practical reasons, though. Otherwise each machine would have to relearn the faces on its own.
https://www.raspberrypi.com/news/add-face-recognition-with-raspberry-pi-hackspace-38/ [raspberrypi.com]
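As a rough sketch of that "data going back to a server" idea, a Pi-class board could compute a face embedding locally and upload only the numeric vector for aggregation across machines. This is purely illustrative: it uses the open-source face_recognition package, and the endpoint URL, machine_id, and JSON field names are invented for the example, not anything Invenda has documented.

    # Hypothetical sketch: compute a face embedding on the machine itself and
    # send only the numeric vector (not the image) to a central server.
    import face_recognition
    import requests

    def report_embedding(image_path: str, machine_id: str) -> None:
        image = face_recognition.load_image_file(image_path)   # RGB numpy array
        encodings = face_recognition.face_encodings(image)     # one 128-d vector per detected face
        if not encodings:
            return  # no face in the frame; nothing to report
        payload = {
            "machine_id": machine_id,              # invented field name
            "embedding": encodings[0].tolist(),    # numbers only; the image never leaves the device
        }
        # Invented endpoint; stands in for whatever backend would aggregate the data.
        requests.post("https://example.invalid/metrics/faces", json=payload, timeout=5)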
(Score: 2) by RS3 on Friday March 01 2024, @07:04PM
For sure, some kind of front-end preprocessing would make a lot of sense.
Why does your username remind me of an insurance company... something about a quacking duck... :)
(Score: 2) by Mojibake Tengu on Wednesday February 28 2024, @02:26AM (3 children)
Candies like M&Ms are simple enough to make in a home kitchen, at about 1/100 of the total price for an equivalent quantity of ingredients.
Sometimes, even hacking funny machines is overrated.
Rust programming language offends both my Intelligence and my Spirit.
(Score: 3, Touché) by epitaxial on Wednesday February 28 2024, @06:42AM (2 children)
Let's be generous and round up with a small bag of them costing $2 at the store. You're telling me I can make them in my kitchen for $0.02?
(Score: 3, Interesting) by mhajicek on Wednesday February 28 2024, @08:42AM
Perhaps if you make them by the 55 gallon drum, buy ingredients wholesale, and don't count your labor or shipping.
The spacelike surfaces of time foliations can have a cusp at the surface of discontinuity. - P. Hajicek
(Score: 2) by cereal_burpist on Thursday February 29 2024, @05:20AM
It's the final step of stamping an "m" on each one that jacks up the price.
(Score: 3, Interesting) by darkfeline on Wednesday February 28 2024, @06:21AM (3 children)
Their statement is certainly plausible. Face detection is very simple and definitely could be done entirely on-device without storing any data.
The usefulness of such a feature is dubious, but not entirely zero. If the feature works correctly (that is, it is not crashing), then it is slightly more convenient for a customer to be able to order immediately without first touching the screen to disable the attract mode.
If true, it is also unfortunate that people jump at internal names like "Invenda.Vending.FacialRecognitionApp.exe", because that just incentivizes developers to use shitty, inaccurate names, which helps no one.
I will give them some credit though, their publication looks pretty good for college students.
Join the SDF Public Access UNIX System today!
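For what it's worth, the detection-only design darkfeline describes is easy to sketch. The snippet below is a minimal illustration using OpenCV's bundled Haar cascade, assuming a camera at index 0; the "wake the purchasing interface" hook is a placeholder, not anything from Invenda's software. Each frame is checked and immediately discarded, so nothing is stored or transmitted.

    # Minimal sketch of face detection used purely as a presence sensor.
    # Frames are inspected and dropped; no images or face data are saved.
    import time
    import cv2

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    camera = cv2.VideoCapture(0)  # assumed camera index

    def customer_present() -> bool:
        """True if at least one face is visible in the current frame."""
        ok, frame = camera.read()
        if not ok:
            return False
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        return len(faces) > 0  # frame goes out of scope here; nothing retained

    while True:
        if customer_present():
            print("face detected -> wake the purchasing interface")  # placeholder hook
        time.sleep(0.5)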
(Score: 5, Insightful) by mhajicek on Wednesday February 28 2024, @08:45AM (2 children)
If it's capable of doing what they say it does, it's also capable of saving and transmitting faces, or compressed facial recognition data. It's also very likely vulnerable to hacking, and could be repurposed without the company's knowledge.
The spacelike surfaces of time foliations can have a cusp at the surface of discontinuity. - P. Hajicek
(Score: 2) by SomeRandomGeek on Wednesday February 28 2024, @05:01PM (1 child)
You are capable of going on a shooting spree, but that does not make you a mass murderer. By your logic, any computer with an attached camera should be outlawed, since a hacker is capable of hacking in, installing malicious software, and grabbing pictures of things the hacker shouldn't have pictures of.
The "could be used for if a hacker hacked in and altered it" standard is too high. Choose a lower standard, like "was actually used for" or "was intended to be used for"
(Score: 3, Interesting) by aafcac on Wednesday February 28 2024, @10:06PM
Corporations spying on people is far more common than mass-shootings. And that's even in America where we're largely numb to it due to how frequently it happens.
(Score: 3, Insightful) by corey on Wednesday February 28 2024, @11:26PM
This is what they’ll do.
ren FacialRecognitionApp.exe ProductDeploymentProcess.exe
Solved.