posted by chromas on Monday July 23 2018, @10:22PM   Printer-friendly
from the drm dept.

Hugo Landau has written a blog post about why Intel will never let hardware owners control the Management Engine. The Intel Management Engine (ME) is a secondary microprocessor ensconced in recent Intel x86 chips, running an Intel-signed, proprietary, binary blob which provides remote access over the network as well as direct access to memory and peripherals. Because of the code signing restrictions enforced by the hardware, it cannot be modified or replaced by the user.
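The lockout described above ultimately rests on a verify-before-execute check in the boot ROM. Here is a minimal sketch of that gate in Python, using HMAC with a baked-in key as a stand-in for the real RSA public key fused into silicon; every name below is illustrative, not Intel's actual scheme:

```python
import hashlib
import hmac

# Hypothetical stand-in for the vendor key fused into the chip at the
# factory. The owner cannot read or replace it.
FUSED_VENDOR_KEY = b"intel-only-knows-this"

def sign_firmware(image: bytes, key: bytes) -> bytes:
    """Produce a signature over a firmware image (HMAC simplification)."""
    return hmac.new(key, image, hashlib.sha256).digest()

def boot_rom_load(image: bytes, signature: bytes) -> str:
    """The verify-before-execute gate: only vendor-signed images run."""
    expected = sign_firmware(image, FUSED_VENDOR_KEY)
    if hmac.compare_digest(expected, signature):
        return "executing firmware"
    return "halt: unsigned firmware"

vendor_image = b"official ME blob"
vendor_sig = sign_firmware(vendor_image, FUSED_VENDOR_KEY)
print(boot_rom_load(vendor_image, vendor_sig))            # executing firmware
print(boot_rom_load(b"owner-built replacement", b"\x00" * 32))  # halt
```

Because the verification key lives in silicon, an owner-controlled replacement firmware can never pass the check; that is the lockout in a nutshell.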

Intel/AMD will never allow machine owners to control the code executing on the ME/PSP because they have decided to build a business on preventing you from doing so. In particular, it's likely that they're actually contractually obligated not to let you control these processors.

The reason is that Intel literally decided to collude with Hollywood to integrate DRM into their CPUs; they conspired with media companies to lock you out of certain parts of your machine. After all, this is the company that created HDCP.

This DRM functionality is implemented on the ME/PSP. Its ability to implement DRM depends on you not having control over it, and not having control over the code that runs on it. Allowing you to control the code running on the ME would directly compromise an initiative which Intel has been advancing for over a decade.


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2) by c0lo on Tuesday July 24 2018, @03:35AM (4 children)

    by c0lo (156) Subscriber Badge on Tuesday July 24 2018, @03:35AM (#711556) Journal

    but an assurance that the CPU will do only what you, the supposed owner, tell it to do, no more and no less

    You realize that this doesn't hold much value; the question of your trust in the ... ummm ... assuring party still remains.
    I mean, how could one verify the "no more, no less" assurance if one doesn't have access to the design of the CPU?

    It doesn't even need to be a false assurance given in bad faith; to err is human, and unintentional bugs in hardware aren't new.
    Even knowing the implementation details doesn't keep you out of harm's way: branch prediction, out-of-order execution, and all the other mechanisms exploited by Spectre had been known design principles for a long time.
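    A toy model of the Spectre v1 pattern mentioned above (purely an illustrative simulation, not a working exploit; real attacks mistrain the hardware branch predictor and recover the byte through cache timing):

```python
# Toy simulation of the Spectre v1 bounds-check-bypass pattern.
SECRET = b"K"        # a byte living "out of bounds"
array1 = b"public"   # legitimately accessible data
cache = set()        # which values left a footprint in the "cache"

def victim(i: int, predictor_says_taken: bool) -> None:
    in_bounds = i < len(array1)
    # Speculation: the access runs even when the bounds check fails,
    # because the (mistrained) predictor guessed the branch was taken.
    if in_bounds or predictor_says_taken:
        value = (array1 + SECRET)[i]   # out-of-bounds read
        cache.add(value)               # side effect survives rollback

# Attacker calls with an out-of-bounds index and a mistrained predictor,
# then "probes the cache" to recover the secret byte.
victim(len(array1), predictor_says_taken=True)
leaked = bytes(cache)
print(leaked)  # b'K'
```

The point matches the comment: nothing here is a bug in the usual sense; every mechanism involved was a documented, deliberate design principle for decades before anyone noticed the side channel.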

    --
    https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
  • (Score: 3, Insightful) by stormwyrm on Tuesday July 24 2018, @06:20AM (3 children)

    by stormwyrm (717) on Tuesday July 24 2018, @06:20AM (#711613) Journal

    Gaining access to the design of the CPU to be able to look at it does not automatically give you the rights to the design of the CPU, any more than having a peek at the source code for Windows gives you any right to use it the way you'd be able to use code that was released under a Free license. Your previous post talked about "rights on the design of the CPU". An independent third-party auditor might be able to gain enough access to the designs to audit them under a non-disclosure agreement, and yes, it's naturally going to be a matter of trust in the auditors as well.

    But that's also true for a lot of things out there. We have it just about entirely on trust that every bit of Free Software out there doesn't have malicious misfeatures. The only difference is that that trust is highly distributed, making it a lot stronger than trust in, say, Microsoft alone. Plenty of other people can and have had a good look at the code.

    If you had an open hardware design for a CPU, I doubt that you, personally, would have the time, knowledge, and inclination to do a thorough job of assuring its trustworthiness all by yourself, any more than you would for something as complex as the Linux kernel. The only advantage is that auditing it is a lot easier: you don't have to sign NDAs or pay Danegeld to Intel or AMD to be able to audit the design of RISC-V or some other Free CPU, which increases the chances that someone trustworthy has done the auditing already. But again, that's also something you still probably need to take entirely on trust.

    Unintentional bugs are out of scope here. We're talking only about outright malicious features put there deliberately by the designer, specifically to subvert the user's intentions, like Intel's ME or AMD's PSP. You can get unintentional bugs in any design, Free or proprietary or anything in between. Even so, I'd rather have a system that at minimum strives to be loyal [gnu.org] to me, instead of having deliberate back doors baked into it to allow itself to betray me to whoever is supposed to be its true master. Unintentional mistakes are a fact of life. Deliberate disloyalty and betrayal shouldn't be.

    --
    Numquam ponenda est pluralitas sine necessitate.
    • (Score: 3, Insightful) by anubi on Tuesday July 24 2018, @07:01AM (2 children)

      by anubi (2828) on Tuesday July 24 2018, @07:01AM (#711617) Journal

      Agreed, I have no rights to the CPU design. But I find myself in the same position as if I were buying locks.

      Say, for instance, I love Schlage locks. But some suit-man over there accepts the idea that all his locks should open with a master key. Now, the playing field is wide open to whoever gets a copy of the master key. Now, anybody who has an interest in violating my lock is free to do so.

      ( Just for example... mechanical locks like this are not secure at all: think "bump keys". And any locksmith can pick one. A lot of kids can, too. )

      I know I am going to be moderated "redundant" for this post. Everyone on this forum is saying the same thing! Dammit! Just how can one explain, in words that even a Hand-Shaking SuitMan can comprehend, that having any hardware with a cooked-in-silicon master key backdoor is a really, really, really bad idea?

      This is one issue I have been screaming about ever since the advent of "scripts" which mix code and data. Never execute code you can't verify or hold someone accountable for. And for crying out loud, don't willy-nilly stake your business on trusting some lock, especially if you know even a script kiddie will have free run of your place once he jimmies it. The technology exists for you to have the only key in existence, yet you choose a technology where others, you have no idea who, also have a key?

      Once the trick gets out how to get in your machine through that backdoor, there will be no patch. And all your stuff is right out there for anyone who knows how the "open sesame" works.

      And to think that suit-men have spent so much effort on copyright. They are watching their candy jar while someone else is making off with their bank account!

      My grandpa, an old dirt-farmer, was smarter than that... when the local kids took to setting outhouses on fire, or moving them back four feet, he built his out of cinderblock.

      Yet, how many of those executives are going to authorize purchase and implementation of this backdoored technology in *their own* corporation?

      --
      "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
      • (Score: 2) by stormwyrm on Wednesday July 25 2018, @05:30AM (1 child)

        by stormwyrm (717) on Wednesday July 25 2018, @05:30AM (#712189) Journal
        Agree with the main points, but will quibble on how you say mechanical locks are "not secure at all". Yes, any such mechanical lock can be picked with greater or lesser degrees of ease, and if I really wanted in I could use a blowtorch and melt the lock or use a hacksaw to cut it apart. Does that make such locks not secure? Of course not. Locks and safes buy you time depending on the skills and equipment of the potential adversaries. The cheap lock I use on my locker at the gym can probably be picked by a thief in under ten minutes, but since I'd probably be able to walk by and brain the thief with a barbell before he got it open, it's secure enough for my purposes. A heavy bank vault might be opened within an hour given dynamite, but it's still very much secure if the police can be at the vault to apprehend the thieves in less time than that once they hear the first explosions. Real-world security is never about absolutes.
        --
        Numquam ponenda est pluralitas sine necessitate.
        • (Score: 1) by anubi on Wednesday July 25 2018, @08:06AM

          by anubi (2828) on Wednesday July 25 2018, @08:06AM (#712222) Journal

          I was mostly referring to the lock I compared it to... a standard, common house front door lock.

          Like you say, they come in varieties. There's the super cheap lock I use on a gate, just to let people know that I don't welcome uninvited visitors; should they insist and force it open anyway ( it can be done with a paper clip ), another circuit will sense the open gate and make a fuss.

          I have a G&S dial lock on an outside door.... just in case I lock myself out of my own house. It'll be easier to bash the door down than to open that one without its combination.

          Generally, it's hard to compare mechanical locks to electronic locks, as it's usually hard to violate a mechanical lock in private, whereas an electronic lock can be hammered at from the other side of the planet for years if it comes to that.
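          Some rough, assumed numbers make that asymmetry concrete (rates are illustrative guesses, not measurements):

```python
# Remote attacker: a modest botnet hammering a network-facing lock.
remote_guesses_per_sec = 1000
seconds_per_year = 365 * 24 * 3600
guesses_in_a_year = remote_guesses_per_sec * seconds_per_year

# A 6-digit PIN (10**6 keyspace) falls in well under an hour remotely:
seconds_to_exhaust = 10**6 / remote_guesses_per_sec
print(seconds_to_exhaust < 3600)  # True

# In-person attacker: one manual attempt every 10 seconds, for a whole
# 3-hour evening, barely scratches the same keyspace:
in_person_attempts = 3 * 3600 / 10
print(in_person_attempts)         # 1080.0
```

A year of remote hammering yields tens of billions of attempts; physical presence is the rate limiter that mechanical locks get for free.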

          --
          "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]