Arthur T Knackerbracket has processed the following story:
BrakTooth is a collection of flaws affecting commercial Bluetooth stacks on more than 1,400 chipsets used in billions of devices – including smartphones, PCs, toys, internet-of-things (IoT) devices and industrial equipment – that rely on Bluetooth Classic (BT) for communication.
On Thursday, CISA urged manufacturers, vendors and developers to patch or employ workarounds.
The PoC has been made available on the BrakTooth website on GitHub.
As the paper pointed out, all attackers need to exploit the BrakTooth bugs is an off-the-shelf ESP32 board (available for $14.80, or as little as $4 for an alternative board on AliExpress), custom Link Manager Protocol (LMP) firmware, and a computer to run the PoC tool.
Researchers from the Singapore University of Technology and Design disclosed the initial group of 16 vulnerabilities (now up to 22), collectively dubbed BrakTooth, in a paper published in September. They found the bugs in the closed commercial BT stacks used by 1,400+ embedded chip components and detailed a host of attack types they can cause: mainly denial of service (DoS) via firmware crashes (the term “brak” is Norwegian for “crash”). One of the bugs can also lead to arbitrary code execution (ACE).
Since the paper was published, there have been a number of updates, as vendors have scrambled to patch or to figure out whether or not they will in fact patch, and as researchers have uncovered additional vulnerable devices.
I have long wondered why software is so often broken and buggy. Searching about this one weekend, I stumbled upon Formal Methods [wikipedia.org], which seem to elevate programming from something of an art to a proper engineering discipline, where one can have good confidence that the software will function as intended.
Anyone here care to speculate whether formal methods will catch on or when that might happen? Alternatively, how much pain might we have to go through before such methods are utilized more broadly?
Let's see... how much more does it cost to use those formal methods instead of our trusty "it compiles, ship it"?
And what do we have to pay if (ok, when) security holes in our hard- and software surface?
Thank you for your suggestion; we will formally discuss it in the next board meeting. I appreciate and value your input, and you can rest assured that your concern will be taken seriously. If you'll excuse me now...
For some reason I have memory of Microsoft and Mozilla collaborating on a TLS implementation that was created using a new language by Microsoft Research that allowed inline formal proofs of correctness alongside the code. I remember thinking, this is huge, and will lead to much more robust software.
But it must be a false memory, because now I can't find anything of the sort actually existing. Oh well.
You are thinking of https://github.com/project-everest/hacl-star [github.com]. It isn't a full formal-methods implementation of the cryptographic primitives, but it is formally verified to have certain properties.
I discussed formal methods in my CompSci university interview circa 1990, and have never used or heard much about them since. Probably mostly useful in cryptography and safety-critical systems.
Aviation uses them happily. Well, in Europe at least.
Who wants to engineer software with formal methods when you can fake it in less than half the time for less than 1/4th the cost? Show me a company that would rather go to market with something 99.9% correct in three years when they could have something 99% correct in 18 months using a team half the size and cost?
This is why I stopped using Bluetooth years ago. Don't want it on keyboards, mice, speakers, etc. I have it turned off on my phone at practically all times. I relent *only* when in the car moving, because the speakerphone is just too damned useful, but turn it off when I get somewhere. The exploits are too many, the devices have outdated software and few updates, and therefore my level of confidence is about zero. If there were a really high-quality speakerphone available for Android phones based on something else, I'm listening. Problem is, the built-in one for cars that integrates with the music, car speakers, and steering wheel is also the best audio quality.
Manufacturers have no impetus, via financial negatives or positives, to focus on security first. We see the results.
Other wireless technologies are not immune, but the commercial equipment isn't that bad when combined with 802.1X/EAP and RADIUS authentication methods. AFAIK, there is nothing comparable in Bluetooth. Worse, Bluetooth is also kind of a shit connection. I've lost and had to rebuild far more Bluetooth connections than 2.4GHz wireless ones.
Bluetooth is just a pile of worthless shit, but it certainly is convenient to use and everywhere isn't it?
Our embedded wireless guy is pretty fresh out of the Navy. I had a debate with him about "roll our own" vs using the Bluetooth stack. While I'm all about the benefits of using established libraries with more functionality built in, more thoroughly tested security, etc. it was obvious 3 years ago when we were making the decision that Bluetooth wasn't / isn't mature enough to avoid issues like in OP. Still, idealism won out (as well as a lust for some particular feature in Bluetooth that was easier than hand coding it to our chipset), and so our product rests on a commercial Bluetooth stack.
Now, every time one of these things gets published, instead of saying "we use a proprietary protocol which was verified and validated internally to fulfill all of our requirements for functionality and security," and fixing whatever bugs we may find on our own schedule and terms, we get to respond to upper management with reports about how our commercial Bluetooth stack is or isn't impacted by the latest bug reports.
Just as a consumer, I find Bluetooth performance to be too spotty for my taste - random dropouts, variable connection times - sometimes quite long, weird glitches... I listen to Bluetooth speakers at my desk maybe 20 hours a week. At that level of testing, I'm noticing glitches that a proprietary protocol might have been able to mask two or three times a week. Our users are only "listening" to the wireless link maybe 3-4 hours a week, but glitches in their audio can potentially have much higher consequences than a little annoyance in a music stream.
You should be aware that formal methods do not add anything to the development cycle. The fundamental question is whether a program is "correct" (which is hard to define in itself). You can start writing pages of proofs, but this will not improve the code, nor will it ever prove that the code is correct, as the correctness of the proof itself has to be proved (i.e., an infinite regress). Just consider the function that adds one to an integer. If you start to prove that your implementation is correct, you can only obscure your solution. I believe that this is the reason why formal methods have stayed irrelevant ever since their invention (well, I heard that NASA formally developed the code for the Apollo missions).
Code should be written so clearly as to serve as a proof by and in itself. Together with clear APIs, programs can then be understood and modified. This is complemented with unit testing. As a matter of fact, in mathematical logic the truth of a statement can be shown by checking all values of the free variables. Unlike formal proofs, this step can be completely formalized in software, which is why it was more successful IMO.
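That "check all values of the free variables" idea is mechanizable whenever the domain is small. Here is a minimal Java sketch (class and method names are mine, purely illustrative): a branch-free absolute-value trick is checked against the obvious spec for every 16-bit input, which is a fully formalized proof by exhaustion.

```java
// Exhaustively verify a bit-twiddling abs() against the obvious spec
// for every 16-bit input -- a tiny, fully mechanized "proof by cases".
public class ExhaustiveCheck {
    // Candidate implementation: branch-free absolute value (illustrative).
    static short fastAbs(short x) {
        int mask = x >> 15;              // all 1s if x is negative, else all 0s
        return (short) ((x + mask) ^ mask);
    }

    public static void main(String[] args) {
        for (int i = Short.MIN_VALUE; i <= Short.MAX_VALUE; i++) {
            short x = (short) i;
            // Spec: Math.abs on the widened value, narrowed back to short
            // (both sides overflow identically at Short.MIN_VALUE).
            short expected = (short) Math.abs(i);
            if (fastAbs(x) != expected) {
                throw new AssertionError("mismatch at " + i);
            }
        }
        System.out.println("all 65536 cases pass");
    }
}
```

For a 32- or 64-bit domain this brute force no longer flies, which is where property-based testing and real proof tools pick up.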
Without getting into a long digression on formal methods, I don't think they'll make much difference in this case, because the problem with Bluetooth is that the spec runs to thousands of pages, containing every feature every vendor who contributed could think up, and then some more stuff thrown in just for fun. Who would have thought that would lead to lots of bugs?
Thesis: "To maintain the rule of law in modern society, it is necessary that all commercial communications software from major technology companies shall at all times in every released version have at least one security bug of each type PE, RCE, DOS"
your thesis has a flaw. it communicates "one flaw" but it should be "two". the rest is garbled, can't read past the first axiom.
There are numerous languages that make it harder or impossible to create the most common C bug vulnerabilities in code.
The C programmers will universally complain about any past, present or future language that takes away the ability to hang yourself.
I could pick many examples, but I'll pick two.
1. Java doesn't let you mess with pointers. It has precise GC to eliminate three entire classes of memory management bugs: (i) not freeing something, (ii) freeing it twice, (iii) using a pointer to an object after it has been freed. Furthermore, I would point out that GC reduces latency on the money-earning threads, because they never execute a single CPU instruction related to memory management (e.g., reference counting or some other strategy). The cleanup happens later, on other CPU cores not running the money-earning threads. The money-earning thread pays the freight to operate those GC threads, and in return gets lower latency. You can never buy back latency. Crank up the clock speed all you want.
2. Languages that have a string length indicator (most modern languages). What I'm saying is that a string object contains a length indicator, not a zero-terminated string.
3. (I said two, but this is a bonus item) Non overflowable buffers. Ultimately this comes down to array bounds checking. Modern languages make it impossible to overflow an array. Oh, but the cost of that checking! In many cases it can be done once at the start of a loop by the compiler to guarantee no overflow. The cost seems trivial if it prevents creating a vulnerability such as a stack overwrite of the return address.
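Items 2 and 3 can be illustrated with a small Java sketch (class and method names are mine, not from the thread): an out-of-range access raises an exception instead of silently overwriting adjacent memory, and a length-prefixed string has no terminator scan to run away.

```java
// Sketch: in a bounds-checked language, an out-of-range access throws an
// exception instead of silently corrupting adjacent memory the way a C
// buffer overflow would.
public class BoundsDemo {
    // Sums the first 'count' elements; the per-access bounds check inside
    // the loop can typically be hoisted by the JIT into one up-front
    // range check, making the runtime cost close to zero.
    static int sumFirst(int[] data, int count) {
        int total = 0;
        for (int i = 0; i < count; i++) {
            total += data[i];   // checked access: throws, never corrupts
        }
        return total;
    }

    public static void main(String[] args) {
        System.out.println(sumFirst(new int[]{1, 2, 3, 4}, 4)); // 10

        try {
            sumFirst(new int[4], 8);        // "overflow" attempt
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("overflow stopped at the boundary");
        }

        // Strings carry an explicit length, so an embedded NUL is just
        // another character -- no strlen()-style scan past the end.
        System.out.println("hello\0world".length()); // 11
    }
}
```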
It's not that we CAN'T fix these problems. It's that people simply WON'T fix them in a systematic way. The solution is to endeavor to make these problems impossible: array overflows (e.g., buffer overflows and other overflows), runaway strcpy operations, and the three memory management bugs (above), which by themselves have been the most common source of bugs over several decades of microcomputer evolution.
I won't waste my breath with strong typing to prevent assigning a string to a floating point value.
1) Formal methods don't mean your code is correct. They only prove your code matches the specification. They say nothing about whether the requirements are correct, or whether the spec itself is bugged or has gaps. If you want a subtraction machine and I go build an adding machine and prove it works using formal methods, so what? Perhaps you try to add a negative number and then we realize my proof doesn't take into account numbers below zero.

2) It increases the cost of software development by a factor of around 1000. If you want all software development to use formal methods, expect a standard PC to cost more than a house.

3) Random developers aren't smart enough to use formal methods. If they were adopted, there would be a MASSIVE shortage of developers.
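The adding-machine example in point 1 can be made concrete. In the Lean sketch below (names are mine, purely illustrative), the proof goes through perfectly, yet it only certifies the wrong spec: a proof ties the code to the stated property, not to what you actually wanted.

```lean
-- A "subtraction machine" accidentally implemented as addition.
def machine (a b : Int) : Int := a + b

-- The formal proof succeeds -- but it only certifies the (wrong) spec.
theorem machine_meets_spec (a b : Int) : machine a b = a + b := rfl

-- The real requirement, machine a b = a - b, is simply false:
-- machine 5 3 evaluates to 8, not 2. No proof effort can fix a bad spec.
```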
Software is so buggy because its complexity increases like crazy. To write high-quality software, over 80% of your code will be error handling, and writing in that style is truly annoying and isn't needed most of the time. Plus, what should you do if you get an error when saving a file while the program is exiting? Should you somehow halt exiting, and thus perhaps prevent the machine from shutting down? If the drive is full and the user can't save, can't log out and can't open another program, WTF is your calculator app supposed to do when it's trying to store your last calculation so that if you reopen it you'll be where you left off? If you want truly robust software, the developers have to deal with nearly non-existent edge cases like that.