
posted by martyb on Tuesday April 06 2021, @08:03AM

IBM Bets Homomorphic Encryption Is Ready To Deliver Stronger Data Security For Early Adopters | Venturebeat:

The topics of security and data have become almost inseparable as enterprises move more workloads to the cloud. But unlocking new uses for that data, particularly driving richer AI and machine learning, will require next-generation security.

To that end, companies have been developing confidential computing to allow data to remain encrypted while it is being processed. But as a complement to that, a security process known as fully homomorphic encryption is now on the verge of making its way out of the labs and into the hands of early adopters after a long gestation period.

Researchers like homomorphic encryption because it provides a certain type of security that can follow the data throughout its journey across systems. In contrast, confidential computing tends to be more reliant upon special hardware that can be powerful but is also limiting in some respects.

Companies such as Microsoft and Intel have been big proponents of homomorphic encryption. Last December, IBM made a splash when it released its first homomorphic encryption services. That package included educational material, support, and prototyping environments for companies that want to experiment.

[...] With FHE, the data can remain encrypted while being used by an application. Imagine, for instance, a navigation app on a phone that can give directions without actually being able to see any personal information or location.
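
To make the property concrete, here is a minimal toy sketch in Python of a Paillier-style additively homomorphic scheme, using tiny hand-picked parameters that are completely insecure and chosen purely for illustration. Paillier only supports addition on encrypted data (full FHE also supports multiplication), but it shows the essential trick the article describes: anyone holding only the public key can combine ciphertexts into a meaningful encrypted result without ever seeing the underlying numbers. Nothing here reflects IBM's actual toolkit.

# Toy Paillier-style additively homomorphic encryption.
# WARNING: tiny hand-picked parameters, illustration only -- completely insecure.
import math
import random

p, q = 11, 13                       # toy primes (real schemes use ~2048-bit primes)
n = p * q                           # public modulus
n2 = n * n
g = n + 1                           # standard Paillier generator choice
lam = math.lcm(p - 1, q - 1)        # private key component

def L(x):
    return (x - 1) // n

mu = pow(L(pow(g, lam, n2)), -1, n)     # private key component

def encrypt(m):
    """Encrypt plaintext m (0 <= m < n) under the public key with fresh randomness."""
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    return (L(pow(c, lam, n2)) * mu) % n

# The homomorphic property: multiplying ciphertexts adds the hidden plaintexts,
# so a server can produce an encrypted sum without ever seeing 42 or 57.
a, b = 42, 57
c_sum = (encrypt(a) * encrypt(b)) % n2
assert decrypt(c_sum) == (a + b) % n
print("encrypted sum decrypts to", decrypt(c_sum))   # 99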

Companies are potentially interested in FHE because it would allow them to apply AI to data, such as from finance and health, while being able to promise users that the company has no way to actually view or access the underlying data.

While the concept of homomorphic encryption has been of interest for decades, the problem is that FHE has taken a huge amount of compute power, so much so that it has been too expensive to be practicable.

But researchers have made big advances in recent years.

[...] Maass said in the near term, IBM envisions FHE being attractive to highly regulated industries, such as financial services and health care.

"They have both the need to unlock the value of that data, but also face extreme pressures to secure and preserve the privacy of the data that they're computing upon," he said.

But he expects that over time a wider range of businesses will benefit from FHE. Many sectors want to improve their use of data, which is becoming a competitive differentiator. That includes using FHE to help drive new forms of collaboration and monetization. As this happens, IBM hopes these new security models will drive wider enterprise adoption of hybrid cloud platforms.


Original Submission

 
  • (Score: 3, Interesting) by VLM on Tuesday April 06 2021, @01:36PM (6 children)

    by VLM (445) on Tuesday April 06 2021, @01:36PM (#1133856)

    Companies are potentially interested in FHE because it would allow them to apply AI to data, such as from finance and health, while being able to promise users that the company has no way to actually view or access the underlying data.

    "With my fingers crossed behind my back I promise to never ask for the result of a single record times one or a single record plus zero"

    People would have to be pretty dumb to trust those companies.

    My guess, when you cut through the feel-good crap, is this will only be useful for sensitive work (aka DoD military) performed on pwned (aka ALL) cloud services. So the air force could do AI image analysis of bomb damage craters on AWS despite it being foreign controlled.

    The problem with homomorphic encryption is it's always been so incredibly computationally expensive that it's cheaper and faster to rely on Moore's Law and all that and do stuff locally. Even if AWS were free it would still be cheaper and faster to use an old Z-80 from 1985 than going homomorphic. But someday it might actually work... someday.

    I last researched homomorphic encryption a couple of years ago and I don't recall it being terribly robust toward either innocent or non-innocent errors. Note that if you make 1+1=? a trillion times more computationally intensive than just doing it, you make it a trillion times more vulnerable to cosmic rays and innocent electromagnetic-interference type stuff. As for non-innocent errors, it would seem a low error rate would completely jam some historical systems that were never optimized for real-world error rates. What I'm getting at is not just cosmic rays hitting CPU chips and RAM chips: over a "five nines" communication system I can easily discuss 1+1=2 with you all, but if I have to transmit exabytes of data across networks to discuss 1+1 then we have communications problems.

    To some extent the field a couple of years ago was like digital cash before the first Bitcoin implementation. Oh that would be interesting but academic approaches have never been workable..... not yet.....

  • (Score: 2) by bradley13 on Tuesday April 06 2021, @05:35PM (5 children)

    by bradley13 (3053) on Tuesday April 06 2021, @05:35PM (#1133936) Homepage Journal

    "With my fingers crossed behind my back I promise to never ask for the result of a single record times one or a single record plus zero"

    I was tangentially involved in a project a couple of years ago. Nothing to do with encryption, but the goals were similar: competitors in a particular industry voluntarily combined their data into a central database. The job of the project was to make sure that no queries were possible that would reveal data about a single competitor. What you don't know: there is a whole niche industry specializing in this kind of stuff.

    Sorry, but it cannot work. To oversimplify: if you prohibit adding zero, or multiplying by one, then someone will come along and take the modulus of a zillion. If you can operate on the data, there *will* be a way to exfiltrate it.
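
    As a minimal sketch of that kind of leak (the interface, company names, and figures are all invented for illustration), an aggregate-only query service that refuses single-record queries can still be differenced to recover one participant's row:

    # Hypothetical shared database of competitors' figures (all values invented).
    revenue = {"AcmeCo": 120, "BetaCorp": 95, "GammaInc": 210, "TargetLtd": 310}

    def aggregate_sum(names):
        """The 'safe' interface: only answers sums over groups, never single rows."""
        if len(names) < 2:
            raise ValueError("queries on single records are refused")
        return sum(revenue[n] for n in names)

    # Differencing attack: two permitted aggregate queries reveal one record.
    everyone = list(revenue)
    without_target = [n for n in everyone if n != "TargetLtd"]
    leaked = aggregate_sum(everyone) - aggregate_sum(without_target)
    print("TargetLtd figure recovered:", leaked)   # prints 310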

    --
    Everyone is somebody else's weirdo.
    • (Score: 2) by dwilson on Wednesday April 07 2021, @12:10AM (4 children)

      by dwilson (2599) on Wednesday April 07 2021, @12:10AM (#1134086)

      I'm a bit confused here. As I understood it, homomorphic encryption allowed you to take some encrypted data you don't have the key to, perform operations on it, and as a result get another bit of encrypted data... that you still don't have the key to.

      Or, to put it another way, I receive from company X some encrypted data (a series of random-looking bytes), and I add zero to it, or multiply by one (to use your examples), and the result of those operations is a different series of random-looking bytes, which then get returned to company X. At the end of the day, I still don't have the key.

      Have I been understanding it wrong?

      --
      - D
      • (Score: 0) by Anonymous Coward on Wednesday April 07 2021, @12:45AM

        by Anonymous Coward on Wednesday April 07 2021, @12:45AM (#1134103)

        You are right, that is roughly how it works. I'm not exactly sure what threat model they are imagining, where they don't already have the key or just get encrypted results they cannot use. I think they are trying to come up with models where they aim at individual users, which is easy enough to prevent, or at the aggregate. As an example of the latter, if I wanted to know the average income of my competitors I could load my table with all encrypted zeros, add everyone from all tables together, divide it out, decrypt the result, and then solve for the competitors' average, since my own rows were all zeros. Sure, that would give me some information about what the aggregate average is, but that doesn't allow for individualized results in the way they seem to imagine, at least without further details of what exactly they are picturing.
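
        To spell out the bookkeeping in that example, here is the same attack as a plain, unencrypted Python sketch with invented figures; in the scenario described, the sum would be computed over ciphertexts and only the final total would ever be decrypted:

        # Invented figures: three competitors' rows plus my padding rows of zeros.
        competitor_incomes = [88_000, 102_000, 95_000]   # encrypted in the real scenario
        my_rows = [0, 0, 0, 0]                           # I contribute encrypted zeros

        # The only thing I am allowed to decrypt: the grand total over all rows.
        grand_total = sum(competitor_incomes) + sum(my_rows)

        # Knowing my own rows are zeros, I can back out the competitors' average.
        print(grand_total / len(competitor_incomes))   # 95000.0 -- the aggregate leaks, no single row does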

      • (Score: 2) by VLM on Wednesday April 07 2021, @01:53PM

        by VLM (445) on Wednesday April 07 2021, @01:53PM (#1134266)

        The way the Federal Census handles privacy is simply locking up the data for 70 years and then releasing everything in plaintext. Meanwhile, you have to trust them that they won't go all 1940s Germany on releasing your private data during those 70 years; HOWEVER, they do extensive aggregate data analysis (most of which might be correct?) and they publish that aggregate data.

        So an analysis of reported race in some city 80 years ago can be done from primary sources: literally look at the returned forms and see who's white vs jewish vs black vs asian vs whatever.

        Now if you want an analysis you have to talk the census dept into doing it, then trust their results, wait a while for them to get around to it, and when they publish it everyone has a level playing field.

        With homomorphic encryption, VERY theoretically, your process would look like "obtain encrypted individual census results" "spend terawatt-hours doing homomorphic operations to generate a percentage breakdown of a city by race" "ask the census oracle server to decrypt one, and precisely one, message today which theoretically is this supposedly legit statistical analysis and not something illegal (cross my fingers behind my back)"

        Of course everyone who's ever done a "workplace feelings" survey at work knows you can abuse aggregated data to de-anonymize it. If your company has 1050 employees worldwide in the category of 29 years of age with 7 years experience and 2 years at the company, your workplace feelings survey is anonymous. If your boss has 3 employees and you're the only guy in that "aggregated category" then your survey is completely de-anonymized. You see this with medical records: break down the aggregate data to individual blocks in a subdivision with enough finely aggregated demographic criteria and you can completely de-anonymize the data, so there's lots of HIPAA whining about it, especially if you rub one report up against another report.
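
        A toy version of that small-cell problem, with invented names and answers: the same aggregate-only query is harmless over a large group and fully identifying over a group of one.

        # Invented survey data: (department, years_at_company, feels_overworked)
        responses = [
            ("Engineering", 2, True),
            ("Engineering", 5, False),
            ("Engineering", 2, False),
            ("Sales",       7, True),    # the only respondent in the Sales / 7-years cell
        ]

        def pct_overworked(department, years):
            """Aggregate-only query: how many respondents matched, and what share said yes."""
            cell = [r for r in responses if r[0] == department and r[1] == years]
            return len(cell), 100 * sum(r[2] for r in cell) / len(cell)

        print(pct_overworked("Engineering", 2))  # (2, 50.0)  -- genuinely aggregate
        print(pct_overworked("Sales", 7))        # (1, 100.0) -- one person, fully identified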

        The problem is homomorphic encryption is, or used to be, so incredibly inefficient that it would be faster and cheaper and more environmentally responsible to buy the Census dept a supercomputer and then politely ask them for legal results rather than trying to calculate it yourself. It's really, really, really slow. Or used to be.

        There was some online game theoretical proposal where you could have open source untrusted clients do something between homomorphic and digital cash transactions with each other such that if some individual client was "cheating" for at least some values of cheating, then at least half the other clients could unblind the anonymity of the cheating player.

      • (Score: 2) by DeVilla on Sunday April 11 2021, @03:52PM (1 child)

        by DeVilla (5354) on Sunday April 11 2021, @03:52PM (#1136036)

        Performing an operation on 2 encrypted values should get you a third encrypted value. I would assume you can also perform operations on an encrypted value and a non-encrypted value to produce an encrypted value. So I don't know how much money is in the account, but I know I need to decrement it by 1.

        I would also assume I can compare 2 encrypted values. So I may not know the account ID or the ID of the account the transaction applies to, but I can compare the encrypted values and know the records need to be processed together (to subtract the amount from the transaction (which I don't know) from the account balance (which I also don't know)).

        Now assume I have 2 encrypted values, cval1 & cval2, and I have a loop like so:

        int i = 0;
        while ( cval2 - cval1 != cval2 - i ) {
                i++;
        }

        When the loop exits i must be equal to cval1. Swap cval2 & cval1 to find cval2.

        Even if all values must be encrypted, you can dig through the encrypted data looking for a one (if cval1 times any cval2 equals cval2, then cval1 is one) and a zero (if cval1 plus any cval2 equals cval2, then cval1 is zero). Once you've found those you use a modified version of the loop above.

        If there is no way to compare two values for equality, then you are seriously limited in the useful computation you can do.

        Actually, once I know the encrypted form of zero & one, I can calculate any other value I like.

        int encrypt(int val){
                int cval = cZero;                 /* start from the known encryption of zero */
                for(int i = 0 ; i < val ; ++i){   /* add the known encryption of one, val times */
                        cval = cval + cOne;
                }
                return cval;
        }

        • (Score: 0) by Anonymous Coward on Tuesday April 13 2021, @11:18PM

          by Anonymous Coward on Tuesday April 13 2021, @11:18PM (#1137156)

          You need to read up on homomorphic encryption. It doesn't work like that. For one thing, you are seemingly assuming a simple substitution cipher with a 1:1 correspondence between ciphertext and plaintext. You also don't seem to understand that the operation for subtracting two encrypted values is different from the operation for subtracting an encrypted value and a non-encrypted value. Doing a comparison like that doesn't work in the way you think. In fact, there are an unknown number of values of your ciphertext that will cause that loop to exit and, if they exist at all, they aren't all equal to i when the loop exits. I know it sounds weird, but it doesn't take a mathematician to understand the ramifications of E(x) + E(y) == E(x) + y, but it does take quite a few to design something like this.
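
          For a concrete illustration of that point, here is a self-contained toy Paillier-style sketch in Python (tiny, insecure demo parameters, purely illustrative): encryption is randomized, so encrypting the same plaintext twice almost never yields the same ciphertext, and comparing ciphertexts bit-for-bit says nothing about the plaintexts underneath.

          # Toy Paillier-style encryption (insecure demo parameters, illustration only).
          import math
          import random

          p, q = 11, 13
          n = p * q
          n2, g = n * n, n + 1

          def encrypt(m):
              """Encrypt m under the public key with fresh randomness, as Paillier does."""
              r = random.randrange(1, n)
              while math.gcd(r, n) != 1:
                  r = random.randrange(1, n)
              return (pow(g, m, n2) * pow(r, n, n2)) % n2

          c1, c2 = encrypt(5), encrypt(5)
          print(c1, c2, c1 == c2)   # almost always two different ciphertexts -> False
          # Ciphertext equality therefore reveals nothing about plaintext equality,
          # so the brute-force loop above has no reliable exit condition.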