Submitted via IRC for SoyCow1984
Audio device maker Sennheiser has issued a fix for a monumental software blunder that makes it easy for hackers to carry out man-in-the-middle attacks that cryptographically impersonate any big-name website on the Internet. Anyone who has ever used the company's HeadSetup for Windows or macOS should take action immediately, even if they later uninstalled the app.
To allow Sennheiser headphones and speakerphones to work seamlessly with computers, HeadSetup establishes an encrypted WebSocket connection with the browser. It does this by installing a self-signed TLS certificate in the central place an operating system reserves for storing browser-trusted certificate authority roots. In Windows, this location is called the Trusted Root CA certificate store. On Macs, it's known as the macOS Trust Store.
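A quick way to check whether one of these certificates is still present on a Windows machine takes only a few lines; a minimal sketch, assuming Python with the cryptography package installed and matching on a "senn" substring rather than the exact certificate name:

    # Minimal Windows-only sketch: list Trusted Root CA certificates whose
    # subject mentions Sennheiser. The "senn" substring match is a heuristic,
    # not the exact certificate name.
    import ssl
    from cryptography import x509
    from cryptography.hazmat.primitives import hashes

    for der_cert, encoding, trust in ssl.enum_certificates("ROOT"):
        if encoding != "x509_asn":      # skip PKCS#7 entries
            continue
        cert = x509.load_der_x509_certificate(der_cert)
        subject = cert.subject.rfc4514_string()
        if "senn" in subject.lower():
            print("Suspicious root certificate:", subject)
            print("  SHA-1 thumbprint:", cert.fingerprint(hashes.SHA1()).hex())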
The critical HeadSetup vulnerability stems from a self-signed root certificate installed by version 7.3 of the app that kept the private cryptographic key in a format that could be easily extracted. [...] the sensitive key was encrypted with the passphrase "SennheiserCC" (minus the quotation marks). That passphrase-protected key was in turn encrypted with a separate AES key and then base64 encoded. The passphrase was stored in plaintext in a configuration file, and the AES key was found by reverse-engineering the software binary.
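To see why that layering added nothing, here is a hedged sketch of the unwrapping, with placeholder values standing in for the AES key, IV, and cipher mode that were actually recovered from the binary:

    # Sketch of why the wrapping added no security. The AES key, IV, and
    # CBC mode below are placeholders; the real values were extracted from
    # the HeadSetup binary, and the passphrase "SennheiserCC" sat in a
    # plaintext configuration file right next to the key.
    import base64
    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    AES_KEY = bytes(16)   # hypothetical 128-bit key from the binary
    AES_IV = bytes(16)    # hypothetical IV

    def recover_private_key(wrapped_b64: str):
        # Undo the outer layers: base64, then AES.
        ciphertext = base64.b64decode(wrapped_b64)
        decryptor = Cipher(algorithms.AES(AES_KEY), modes.CBC(AES_IV)).decryptor()
        pem = decryptor.update(ciphertext) + decryptor.finalize()
        pem = pem[: -pem[-1]]    # strip PKCS#7 padding
        # What remains is an ordinary passphrase-protected PEM key,
        # openable with the passphrase found in the config file.
        return serialization.load_pem_private_key(pem, password=b"SennheiserCC")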
[...] A later version of the Sennheiser app made a botched attempt to fix the snafu. It, too, installed a root certificate, but this one didn't include the private key. In a major omission, however, the update failed to remove the older root certificate, leaving anyone who had installed the earlier version susceptible to the trivial TLS forgeries. Just as significant, uninstalling the app didn't remove the root certificates that made users vulnerable.
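For anyone cleaning up by hand, the enumeration sketch above extends naturally into removal commands. The exact certificate names vary by HeadSetup version, so verify each match before deleting anything; this rough sketch assumes certutil's thumbprint syntax on Windows (on a Mac, security delete-certificate against the System keychain is the rough equivalent):

    # Cleanup sketch: print a removal command for each Sennheiser-looking
    # root certificate. Verify every match by hand first; deleting the
    # wrong root certificate will break TLS for legitimate software.
    import ssl
    from cryptography import x509
    from cryptography.hazmat.primitives import hashes

    for der_cert, encoding, trust in ssl.enum_certificates("ROOT"):
        if encoding != "x509_asn":
            continue
        cert = x509.load_der_x509_certificate(der_cert)
        if "senn" in cert.subject.rfc4514_string().lower():
            thumbprint = cert.fingerprint(hashes.SHA1()).hex()
            # certutil accepts a SHA-1 thumbprint as the certificate ID;
            # run the printed command from an elevated prompt.
            print(f"certutil -delstore Root {thumbprint}")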
Source: Original source
(Score: 3, Interesting) by Anonymous Coward on Thursday November 29 2018, @07:48PM (2 children)
Why is there anything to install?
(Score: 1, Informative) by Anonymous Coward on Thursday November 29 2018, @07:58PM
DUUUUH!!!!
But for a serious answer:
still pretty dumb
(Score: 0) by Anonymous Coward on Thursday November 29 2018, @08:11PM
Apping the apps that app the app.
(Score: 1, Funny) by Anonymous Coward on Thursday November 29 2018, @07:57PM (1 child)
I bet the Schneier ones have much better crypto.
(Score: 2, Funny) by Anonymous Coward on Thursday November 29 2018, @10:29PM
Sennheiser means "without Schneier" in German. That's the problem right there.
(Score: 4, Insightful) by ikanreed on Thursday November 29 2018, @08:00PM
The way we made our innocuous physical object spy on you for our profit also allows nefarious individuals to spy on you for their profit. Our bad.
(Score: -1, Flamebait) by Anonymous Coward on Thursday November 29 2018, @08:16PM
Fuck em, they deserve it.
(Score: 2) by Runaway1956 on Thursday November 29 2018, @09:21PM
All of my headphones have seams. I'm the only kid on the block with seamed headphones. The shame! Not even my pumped-up kicks make up for the seams.
-signed Foster
Abortion is the number one killer of children in the United States.
(Score: 4, Insightful) by TheFool on Thursday November 29 2018, @09:26PM
You really, really don't want to lock an admin user out of modifying the root certificate store. The day you do that is the day we truly lose control of our computers. And if admin users can do it, code they run (like this exceptionally silly installer) can do it.
But why the websocket? Loading a driver, OK, but... a websocket? What could they even possibly be using it for? "Telemetry" is the knee-jerk answer, but I wonder if it's something even more ridiculous.
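For what it's worth, the usual reason for the websocket: a page served over https can't open a plain ws:// connection to localhost (mixed content), so vendors run a local TLS WebSocket server that their web pages talk to, and that server needs a certificate the browser trusts for 127.0.0.1. A minimal sketch of the pattern in Python, not Sennheiser's actual code; the websockets package, the port, and the cert file are all assumptions:

    # Toy local TLS WebSocket server of the kind vendor "companion"
    # software runs. Assumes the 'websockets' package and a cert/key pair
    # for 127.0.0.1 in localhost.pem; this is the pattern, not
    # Sennheiser's code.
    import asyncio
    import ssl

    import websockets

    async def handler(ws):
        # Echo whatever the vendor's web page sends; real software would
        # report device state or accept control commands here.
        async for message in ws:
            await ws.send(f"device says: {message}")

    async def main():
        ssl_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
        ssl_ctx.load_cert_chain("localhost.pem")   # hypothetical cert + key
        async with websockets.serve(handler, "127.0.0.1", 8443, ssl=ssl_ctx):
            await asyncio.Future()                 # serve forever

    asyncio.run(main())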
(Score: 4, Insightful) by requerdanos on Thursday November 29 2018, @09:33PM (5 children)
The way to make a set of speakers or headphones "work seamlessly with computers" is not, repeat not, "to establish an encrypted websocket" by breaching the security of the store of certificates.
If you have an employee who tells you this, you should fire that employee and escort him out, or at the very least send him to security training and not allow him to touch product design again until he has learned, to the satisfaction of actual computer security professionals.
If you yourself feel as though you should rise to defend the statement, then you may be the employee that needs to be fired or extensively retrained.
See "Sony Rootkit" for further examples of why this is thinly veiled computer crime, not slick product design.
(Score: 5, Informative) by TheFool on Thursday November 29 2018, @09:58PM (4 children)
These "value-add software" guys in hardware companies get paid extremely little, because... well, the software itself isn't what they are selling, so corporate treats those teams like garbage. There is no training budget for people on teams like this and it's often unpaid/low-paid interns doing the work to keep the costs even lower. I don't know if it's industry-wide, but I've run across this with pretty much any major hardware vendor I've worked with.
I won't defend it, but yeah, none of what you suggest will happen. I imagine that as far as they are concerned this is a PR problem now, not a personnel or training problem. And that PR work is probably cheaper than Doing the Right Thing in 2018.
(Score: 4, Insightful) by edIII on Thursday November 29 2018, @11:21PM (3 children)
Guess what is even cheaper? Not buying all that crap in the first place.
I stopped buying anything that had Bluetooth in it, or that, for some inane reason, demanded to be configured by a smartphone app just because. Especially when they're fucking headphones. Anything that demands a persistent connection to the Internet to give a company telemetry is also banned from my life. I don't allow any 3rd-party company to collect telemetry on me, and heck, I even have a deal with my doctor (who is older) that he doesn't do data entry of medical records. He keeps it on paper and deliberately keeps as little info as possible in the system.
All of my audio in the last few years was switched entirely over to analog, and wired. That's because Bluetooth already went through its security Armageddon, with billions of devices that can't be easily upgraded now possessing a critical security flaw. So I can't use Bluetooth ever again. Won't trust it, especially without the ability to upgrade firmware via a USB cable.
Software and security is such utter fucking shit these days.
Technically, lunchtime is at any moment. It's just a wave function.
(Score: 3, Insightful) by legont on Friday November 30 2018, @03:52AM (2 children)
Unfortunately it is everywhere and one can't avoid it. For example, half (yes, half) of the cost of a modern car is this software junk.
"Wealth is the relentless enemy of understanding" - John Kenneth Galbraith.
(Score: 1, Insightful) by Anonymous Coward on Friday November 30 2018, @05:49AM
The problem isn't 'software junk'. It is junk software.
(Score: 2) by Bot on Sunday December 02 2018, @12:43AM
in other words, electronics provide the scapegoat for cars to cost more.
Account abandoned.
(Score: 4, Insightful) by ledow on Friday November 30 2018, @09:18AM (2 children)
Complete lack of permissioning.
Where the hell was the dialog saying "This app is trying to install a certificate that will allow it to intercept all your secure web sessions, including banking and financial transactions, carried out by any installed system application. Do you want to allow this?"
Nowhere. Because Microsoft don't want to burden you with decisions and have kowtowed to apps "just doing anything they like" on your machine without question... even if you're an expert user. Meanwhile Android gives you a bunch of individual permissions and, while it only "allows" or "disallows" installation in its entirety, at least you can join the dots if you have half a brain.
Honestly, we are 20-30 years overdue for just running apps in entirely contained bottles that can't affect anything, even the user's data, until you set an individual permission for them to do so, one that no automated program can set for you (i.e. it takes human interaction to set it, whether that human's a network admin or a home user). Literally pass applications only a copy-on-write copy of the user's documents when the user says to open a file in a certain program, keep a revision history of everything they touch, and then completely isolate them from everything else in the system, and things like this go away.
And then when you uninstall them, anything you HAVE agreed to... it just disappears with the bottle. Because at no point was any program ever allowed to add anything anywhere other than in its bottle, which extends to a bunch of executables, and a /proc like filesystem to request permissions / interact with the wider system / install certificates that overlay into the system if the user so wishes. Remove the bottle, that overlaid file disappears, and it's gone from everywhere.
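A toy sketch of the copy-on-write part of that idea, with hypothetical helpers (nothing like a real sandbox, which would live in the OS, not in Python): the app only ever sees a private copy, and publishing changes back takes an explicit user action that also keeps a revision.

    # Toy illustration of the "bottle" idea: the app gets a private copy
    # of a document, and only an explicit user action publishes changes
    # back, keeping a timestamped revision of what was overwritten. A real
    # implementation would be OS-level (overlay filesystems, brokered
    # file dialogs), not a Python helper.
    import shutil
    import time
    from pathlib import Path

    BOTTLE = Path("bottle")      # the app's private scratch area
    HISTORY = Path("history")    # revision history of everything it touched

    def open_for_app(user_file: Path) -> Path:
        """Hand the app a private copy; the original is never exposed."""
        BOTTLE.mkdir(exist_ok=True)
        scratch = BOTTLE / user_file.name
        shutil.copy2(user_file, scratch)
        return scratch

    def commit_back(user_file: Path, scratch: Path) -> None:
        """Explicit user action: keep the old version, then publish."""
        HISTORY.mkdir(exist_ok=True)
        revision = HISTORY / f"{user_file.name}.{int(time.time())}"
        shutil.copy2(user_file, revision)
        shutil.copy2(scratch, user_file)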
(Score: 1, Interesting) by Anonymous Coward on Friday November 30 2018, @12:13PM
Containers and sandboxing do exist; they just aren't all that popular outside industry. Considering that Jess Frazelle now works for Microsoft, though, they may get a "containerized-everything" userland before any major Linux distro.
(Score: 2) by acid andy on Friday November 30 2018, @02:01PM
That's a great idea, but I imagine in cases like TFA's, the programmers might see the need to release their software installer as an operating system upgrade, in the same vein as the rootkits we have now, punching a hole in those sandbox protocols by "upgrading" that part of the operating system. I can't see how that can be stopped without also locking the user out of upgrading their own operating system. I suppose your first point about the increased permission granularity would at least mean they'd be told what OS component was about to be replaced, rather than the generic "this application needs root / admin access. OK?".
Master of the science of the art of the science of art.
(Score: 0) by Anonymous Coward on Friday November 30 2018, @09:15PM
Sounds like a series of clusterfucks to me.