
posted by janrinok on Tuesday May 08 2018, @06:13PM   Printer-friendly
from the Oh,-that's-what-it-means! dept.

What is edge computing?

The word edge in this context means literal geographic distribution. Edge computing is computing that's done at or near the source of the data, instead of relying on the cloud at one of a dozen data centers to do all the work. It doesn't mean the cloud will disappear. It means the cloud is coming to you. [...] One great driver for edge computing is the speed of light. If Computer A needs to ask Computer B, half a globe away, before it can do anything, the user of Computer A perceives this delay as latency. The brief moments after you click a link, before your web browser starts to actually show anything, are in large part due to the speed of light. Multiplayer video games implement numerous elaborate techniques to mitigate true and perceived delay between you shooting at someone and you knowing, for certain, that you missed.

Voice assistants typically need to resolve your requests in the cloud, and the roundtrip time can be very noticeable. Your Echo has to process your speech, send a compressed representation of it to the cloud, the cloud has to uncompress that representation and process it — which might involve pinging another API somewhere, maybe to figure out the weather, and adding more speed-of-light-bound delay — and then the cloud sends your Echo the answer, and finally you can learn that today you should expect a high of 85 and a low of 42, so definitely give up on dressing appropriately for the weather.

So, a recent rumor that Amazon is working on its own AI chips for Alexa should come as no surprise. The more processing Amazon can do on your local Echo device, the less your Echo has to rely on the cloud. It means you get quicker replies, Amazon's server costs go down, and conceivably, if enough of the work is done locally you could end up with more privacy — if Amazon is feeling magnanimous.
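
To put rough numbers on the speed-of-light budget mentioned above, here is a back-of-the-envelope sketch in Java. The half-globe distance and the fiber slowdown factor are assumptions for illustration, not measurements:

    public class CloudLatency {
        public static void main(String[] args) {
            final double cKmPerMs = 299.792458;  // speed of light in vacuum, km per millisecond
            final double fiberFactor = 0.67;     // light in optical fiber travels at roughly 2/3 c
            final double halfGlobeKm = 20_000;   // roughly half of Earth's circumference, in km

            double oneWayMs = halfGlobeKm / (cKmPerMs * fiberFactor);
            System.out.printf("one way: ~%.0f ms, round trip: ~%.0f ms%n",
                    oneWayMs, 2 * oneWayMs);
            // prints: one way: ~100 ms, round trip: ~199 ms
        }
    }

That is roughly 200 ms of physics alone, before any queueing, routing, or server processing, and it is the part of the delay budget that edge computing attacks by moving the work closer.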

The phrase seems to be popping up more this week due to developments at Microsoft's Build 2018 conference:

Microsoft delivers new edge-computing tools that use its speech, camera, AI technologies

Wikipedia's article on edge computing, complete with multiple issues.


Original Submission

  • (Score: 2) by DannyB on Tuesday May 08 2018, @06:18PM (3 children)

    by DannyB (5839) Subscriber Badge on Tuesday May 08 2018, @06:18PM (#677117) Journal

    Your Echo has to process your speech, send a compressed representation of it to the cloud, the cloud has to uncompress that representation and process it — which might involve pinging another API somewhere, maybe to figure out the weather, and adding more speed-of-light-bound delay — and then the cloud sends your Echo the answer,

    My favorite question used to be to ask Alexa what movies came out in 1995. There would be a very, very noticeable pause. Then she would start reciting the list. You could just let it go on, and on, and on, and on until you were sick of it.

    Alexa! Stop!

    Now, she tells the first few movies and then tells you where you can go.

    --
    Would a Dyson sphere [soylentnews.org] actually work?
    • (Score: 4, Interesting) by turgid on Tuesday May 08 2018, @06:35PM (1 child)

      by turgid (4318) Subscriber Badge on Tuesday May 08 2018, @06:35PM (#677123) Journal

      Alexa, what are the digits of pi?

      • (Score: 2) by DannyB on Tuesday May 08 2018, @06:53PM

        by DannyB (5839) Subscriber Badge on Tuesday May 08 2018, @06:53PM (#677133) Journal

        I've never tried that one. What happens? Have they "fixed" it? ("fixed" in the sense that you take a pet to the vet to be "fixed" to prevent offspring.)

        I will have to think about how one would calculate PI in a way that you can continue producing a sequence of digits forever.

        The way I do it requires knowing up front how many digits you want.

        import java.math.BigDecimal;
        import java.math.MathContext;

        public class CalcPi {

            public static void main(String[] args) {
                System.out.println( "Calculating pi..." );
                System.out.flush();
                System.out.println( calcPiBigDecimal( 1000 ) );
            }

            // Sums the Bailey-Borwein-Plouffe series:
            //   pi = sum over n of (1/16^n) * ( 4/(8n+1) - 2/(8n+4) - 1/(8n+5) - 1/(8n+6) )
            public static BigDecimal calcPiBigDecimal( int digits ) {
                final MathContext prc = new MathContext( digits+1 );

                final BigDecimal one = BigDecimal.ONE;
                final BigDecimal two = one.add( one );
                final BigDecimal four = two.add( two );
                final BigDecimal five = four.add( one );
                final BigDecimal six = four.add( two );
                final BigDecimal eight = four.add( four );
                final BigDecimal sixteen = eight.add( eight );

                BigDecimal eightN = BigDecimal.ZERO;                 // running value of 8n
                final BigDecimal oneSixteenth = BigDecimal.ONE.divide( sixteen, prc );
                BigDecimal oneSixteenthToNPower = BigDecimal.ONE;    // running value of (1/16)^n

                BigDecimal pi = BigDecimal.ZERO, oldPi = BigDecimal.ZERO;

                while( true ) {
                    // Add the nth term of the series.
                    pi = pi.add(
                                       four.divide( eightN.add( one ), prc )
                            .subtract( two.divide( eightN.add( four ), prc ) )
                            .subtract( one.divide( eightN.add( five ), prc ) )
                            .subtract( one.divide( eightN.add( six ), prc ) )

                            .multiply( oneSixteenthToNPower )
                        , prc );

                    // Converged: another term no longer changes the sum at this precision.
                    if( pi.equals( oldPi ) ) {
                        // Trim off one inaccurate digit.
                        pi = pi.round( new MathContext( digits ) );
                        break;
                    }
                    oldPi = pi;

                    eightN = eightN.add( eight );
                    oneSixteenthToNPower = oneSixteenthToNPower.multiply( oneSixteenth );
                }

                return pi;
            }
        }

        I suppose the algorithm could use unlimited precision, rather than limiting it. Then, each iteration, compare to the previous one. Once, say, two or more digits have "stabilized", produce them as new digits. The iterations keep going forever, and the internal variable values keep getting longer and longer, with more and more decimal digits.

        I will have to work on expressing that algorithm as both:
        * A Java Iterator (and Stream)
        * A Clojure lazy list

        Then I will have to generalize the approach so that different "iteration algorithms" can be plugged in to produce an infinite sequence of digits in decimal (or your preferred radix?).
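
        For what it's worth, here is a minimal sketch of the Iterator half, using Gibbons' unbounded spigot algorithm (a different formula from the BBP series above, but one that streams digits forever instead of needing a digit count up front):

            import java.math.BigInteger;
            import java.util.Iterator;

            public class PiDigitIterator implements Iterator<Integer> {
                private static final BigInteger TWO = BigInteger.valueOf( 2 );
                private static final BigInteger THREE = BigInteger.valueOf( 3 );
                private static final BigInteger FOUR = BigInteger.valueOf( 4 );
                private static final BigInteger SEVEN = BigInteger.valueOf( 7 );

                // State of the linear fractional transformation, following the paper's names.
                private BigInteger q = BigInteger.ONE, r = BigInteger.ZERO,
                                   t = BigInteger.ONE, k = BigInteger.ONE,
                                   n = THREE,          l = THREE;

                @Override public boolean hasNext() { return true; }   // never runs dry

                @Override public Integer next() {
                    while( true ) {
                        // 4q + r - t < n*t means the next digit has stabilized.
                        if( FOUR.multiply( q ).add( r ).subtract( t )
                                .compareTo( n.multiply( t ) ) < 0 ) {
                            int digit = n.intValue();
                            BigInteger newR = BigInteger.TEN.multiply( r.subtract( n.multiply( t ) ) );
                            n = BigInteger.TEN.multiply( THREE.multiply( q ).add( r ) )
                                    .divide( t ).subtract( BigInteger.TEN.multiply( n ) );
                            q = q.multiply( BigInteger.TEN );
                            r = newR;
                            return digit;
                        }
                        // Otherwise fold the next series term into the state.
                        BigInteger newR = TWO.multiply( q ).add( r ).multiply( l );
                        BigInteger newN = q.multiply( SEVEN.multiply( k ).add( TWO ) )
                                .add( r.multiply( l ) ).divide( t.multiply( l ) );
                        q = q.multiply( k );
                        t = t.multiply( l );
                        k = k.add( BigInteger.ONE );
                        l = l.add( TWO );
                        n = newN;
                        r = newR;
                    }
                }

                public static void main( String[] args ) {
                    PiDigitIterator pi = new PiDigitIterator();
                    for( int i = 0; i < 50; i++ ) {
                        System.out.print( pi.next() );   // 31415926535...
                    }
                    System.out.println();
                }
            }

        From there, java.util.stream.Stream.generate( pi::next ) gives the Stream version for free.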

        --
        Would a Dyson sphere [soylentnews.org] actually work?
    • (Score: 5, Informative) by Snow on Tuesday May 08 2018, @06:39PM

      by Snow (1601) on Tuesday May 08 2018, @06:39PM (#677128) Journal

      99% of my use of Google home goes like this:

      OK Google, make a (Doggie|Pig|Horsie|Monkey|Elephant) sound. My daughter loves it.

      Rarely will I get it to play music. It's actually pretty useless at everything else.

  • (Score: 4, Informative) by bob_super on Tuesday May 08 2018, @06:35PM (1 child)

    by bob_super (1357) on Tuesday May 08 2018, @06:35PM (#677122)

    So, now you're gonna push the cloud all the way into the customers' building, for The Ultimate Low Latency!
    The workers will love the low latency, the beancounters will love not owning the hardware, and the provider will love not paying for the building, bandwidth and power!
    One big pool of proprietary VMs accessed through thin clients ... it's like the main processing, but all in one frame ... now how could we name that?

    • (Score: 3, Touché) by captain normal on Tuesday May 08 2018, @07:58PM

      by captain normal (2205) on Tuesday May 08 2018, @07:58PM (#677162)

      How about "Big Brother"?

      --
      "It is easier to fool someone than it is to convince them that they have been fooled" Mark Twain
  • (Score: 2) by turgid on Tuesday May 08 2018, @06:36PM

    by turgid (4318) Subscriber Badge on Tuesday May 08 2018, @06:36PM (#677124) Journal

    ...Digitalization, Big Data, Edge.

  • (Score: 5, Informative) by Thexalon on Tuesday May 08 2018, @06:53PM (4 children)

    by Thexalon (636) Subscriber Badge on Tuesday May 08 2018, @06:53PM (#677132)

    1. $NEW_BUZZWORD means that we should do all the hard computing work on high-powered servers managed by professionals, and have relatively "thin" clients for the users that basically just display the results and handle user input.
    2. $NEW_BUZZWORD means that we should do more of the hard computing work on those clients that are otherwise sitting there relatively idle, thus easing the load on the high-powered and expensive servers.
    3. But wouldn't it be great if the clients were less busy? Let's shift more of the work back to the servers. Goto step 1.

    Of course, what none of that changes is that the computation has to be done somewhere by something.

    --
    The only thing that stops a bad guy with a compiler is a good guy with a compiler.
    • (Score: 3, Insightful) by FatPhil on Tuesday May 08 2018, @09:55PM (3 children)

      Whilst I agree with your main thrust, I disagree with your conclusion. MS Word 2 on a 90MHz Pentium was blisteringly quick, in particular if you had one of those graphics cards with accelerated 2D graphics. Way quicker than whatever OpenOffice or Google Docs is currently available on multicore, multi-GHz, 64-bit processors. And yet they're not actually doing any more now; in Google Docs' case, they're doing less. Despite using 100 times the computational power. Which means that 99% of the computations don't actually need to be done at all.
      --
      Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
      • (Score: 0) by Anonymous Coward on Tuesday May 08 2018, @10:40PM (1 child)

        by Anonymous Coward on Tuesday May 08 2018, @10:40PM (#677211)

        Functionally, I believe you're correct as far as the user is concerned, but there are so many "smart" things programs do these days.

        • (Score: 3, Insightful) by mhajicek on Wednesday May 09 2018, @04:33AM

          by mhajicek (51) Subscriber Badge on Wednesday May 09 2018, @04:33AM (#677339)

          And most of them are things we'd rather they not do.

          --
          The spacelike surfaces of time foliations can have a cusp at the surface of discontinuity. - P. Hajicek
      • (Score: 2) by Thexalon on Wednesday May 09 2018, @01:28AM

        by Thexalon (636) Subscriber Badge on Wednesday May 09 2018, @01:28AM (#677282)

        Of course you should be trying to make your software efficient rather than a bloated horrible mess.

        That's a separate issue from trying to pretend that the inherent complexity in the problem you're trying to solve doesn't exist because a different processor is responsible for carrying it out.

        --
        The only thing that stops a bad guy with a compiler is a good guy with a compiler.
  • (Score: 3, Insightful) by Gaaark on Tuesday May 08 2018, @07:08PM (2 children)

    by Gaaark (41) Subscriber Badge on Tuesday May 08 2018, @07:08PM (#677137) Journal

    In regard to anything to do with MS, it stands for living on the edge.

    Leave more than gaming in the hands of MS? Yeah, nah.

    --
    --- Please remind me if I haven't been civil to you: I'm channeling MDC. ---Gaaark 2.0 ---
    • (Score: 4, Funny) by DannyB on Tuesday May 08 2018, @07:19PM (1 child)

      by DannyB (5839) Subscriber Badge on Tuesday May 08 2018, @07:19PM (#677144) Journal

      Using Edge to browse or Internet Exploiter. That puts one on Edge.

      Any diode can be light emitting -- at least once.

      --
      Would a Dyson sphere [soylentnews.org] actually work?
      • (Score: 3, Funny) by Phoenix666 on Wednesday May 09 2018, @01:55PM

        by Phoenix666 (552) on Wednesday May 09 2018, @01:55PM (#677425) Journal

        I thought "edging" meant something entirely different and inappropriate.

        On second thought, that too describes Microsoft well.

        --
        Washington DC delenda est.
  • (Score: 5, Funny) by BsAtHome on Tuesday May 08 2018, @07:19PM (2 children)

    by BsAtHome (889) on Tuesday May 08 2018, @07:19PM (#677145)

    My Bullshit-Bingo card just got filled all the way in each direction. Can I have another, please?

    And I'd like to collect my BB prize?

  • (Score: 3, Insightful) by PiMuNu on Tuesday May 08 2018, @07:43PM (1 child)

    by PiMuNu (3823) on Tuesday May 08 2018, @07:43PM (#677154)

    Particle physicists have been doing edge computing for years, for the same reason - it is too difficult to transport data to a remote batch farm and then back to the experiment, where the results are needed in real time. So particle physicists invented complex logic for "triggering" the detector system, based sometimes on quite high-level quantities, like the total energy of the triggering event.

    • (Score: 2, Interesting) by suburbanitemediocrity on Wednesday May 09 2018, @12:20AM

      by suburbanitemediocrity (6844) on Wednesday May 09 2018, @12:20AM (#677263)

      I used to do that. Logic gates are very fast and rack-mounted. Timing between gates is realized using different lengths of wire; this matters because the event you are looking for is identified by detecting decay particles that have to be correlated temporally and spatially. If detected events can't be correlated, then it is some other junk coming off the beam line.
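
      In software terms, that hard-wired coincidence logic boils down to something like the sketch below. The Hit type, the window, and the threshold are made-up illustrative values, not any real experiment's numbers:

          public class CoincidenceTrigger {
              static final long WINDOW_NS = 10;              // coincidence window, nanoseconds
              static final double MIN_TOTAL_ENERGY = 500.0;  // summed-energy threshold, MeV

              record Hit(long timeNs, double energyMeV) {}   // one detector pulse

              // Two hits form one candidate event only if they line up in time
              // (the job the matched wire lengths did in hardware) and carry enough energy.
              static boolean fires(Hit a, Hit b) {
                  boolean coincident = Math.abs(a.timeNs() - b.timeNs()) <= WINDOW_NS;
                  boolean energetic = a.energyMeV() + b.energyMeV() >= MIN_TOTAL_ENERGY;
                  return coincident && energetic;  // anything else is junk off the beam line
              }

              public static void main(String[] args) {
                  System.out.println(fires(new Hit(1000, 300), new Hit(1004, 250)));  // true
                  System.out.println(fires(new Hit(1000, 300), new Hit(1500, 250)));  // false: too far apart in time
              }
          }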

  • (Score: 2, Insightful) by Anonymous Coward on Tuesday May 08 2018, @08:40PM (6 children)

    by Anonymous Coward on Tuesday May 08 2018, @08:40PM (#677177)

    ...How is this different from processing locally on the user's machine? We've been doing that since, well, since computers were invented. Please excuse my confusion and lack of excitement for new, seemingly useless terminology.

    • (Score: 4, Funny) by sonamchauhan on Tuesday May 08 2018, @09:04PM (4 children)

      by sonamchauhan (6546) on Tuesday May 08 2018, @09:04PM (#677185)

      Ah, the difference? ... you no longer own (or even have access to) the software you run. In fact, the edge-compute provider can even be running a coin miner on your node, and you won't know.

        In short, the cloud is you.

      • (Score: 2) by stormwyrm on Wednesday May 09 2018, @04:02AM (3 children)

        by stormwyrm (717) on Wednesday May 09 2018, @04:02AM (#677331) Journal
        That's basically always been true for all proprietary software. You can only take the author's word for it that the opaque chunk of machine code you got from them actually does what they say it does, no more and no less. Who knows what else that proprietary blob of code you've got there is doing behind your back, like mining cryptocurrency for its author, or packaging up your secrets and sending them via a covert channel back to the mothership for later sale to the highest bidder. Without source code, it is very difficult to tell for certain.
        --
        Numquam ponenda est pluralitas sine necessitate.
        • (Score: 2) by All Your Lawn Are Belong To Us on Wednesday May 09 2018, @12:00PM (2 children)

          by All Your Lawn Are Belong To Us (6553) on Wednesday May 09 2018, @12:00PM (#677416) Journal

          Very difficult, but not impossible. On your local machine you can run a process monitor. On your local machine you can build a low-level monitor to observe hardware-level interactions. On your local machine you can build a dedicated, separate hardware-level interface monitor for any component of your system, including running a second PC as a sniffer off your ethernet (or other) connection to search out unauthorized activity. On your local machine you can get open source software and either compile it yourself and/or compare build hashes. (I realize you were talking about proprietary blobs, so I mostly focused on how you can empirically monitor your system.)
          You know, the sorts of things that otaku types do because we can.

          Tell me how you do those things with most cloud solutions.
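
          For the compare-hashes step, a minimal sketch; the file and the published digest come in as arguments, so nothing here pretends to be a real vendor value:

              import java.nio.file.Files;
              import java.nio.file.Path;
              import java.security.MessageDigest;

              public class VerifyBuild {
                  // Usage: java VerifyBuild <file> <published-sha256-hex>
                  public static void main(String[] args) throws Exception {
                      byte[] bytes = Files.readAllBytes(Path.of(args[0]));
                      byte[] digest = MessageDigest.getInstance("SHA-256").digest(bytes);

                      StringBuilder hex = new StringBuilder();
                      for (byte b : digest) hex.append(String.format("%02x", b));

                      System.out.println(hex.toString().equalsIgnoreCase(args[1])
                              ? "digest matches the published hash"
                              : "digest MISMATCH - do not trust this build");
                  }
              }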

          --
          This sig for rent.
          • (Score: 2) by Runaway1956 on Wednesday May 09 2018, @02:28PM

            by Runaway1956 (2926) Subscriber Badge on Wednesday May 09 2018, @02:28PM (#677440) Homepage Journal

            Well - thanks for a new word, I guess. I'll probably forget it real soon, because it ain't English. Otaku. I'll keep OCD to describe that kind of people. Of course, if I do remember it, I can use it at work to describe some of my coworkers . . .

            --
            Abortion is the number one killer of children in the United States.
          • (Score: 1) by sonamchauhan on Wednesday May 09 2018, @10:17PM

            by sonamchauhan (6546) on Wednesday May 09 2018, @10:17PM (#677636)

            Completely agree. Plus you could do a memory dump and decompile, if you wanted to (some apps more easily than others). That may or may not have been illegal but it kept vendors honest.

      Now this 'trust but verify' ability is gone. The IoT era is turning into the IoLDD era: locked-down devices.

      The Xbox, Azure Kinect, Kindle, Echo ... are all IoLDD. With some Kindles you can't even snoop network interactions without illegal kit.

    • (Score: 2) by crafoo on Tuesday May 08 2018, @10:52PM

      by crafoo (6639) on Tuesday May 08 2018, @10:52PM (#677217)

      It's no different; it's just a new name for an old idea. You have to keep selling the ideas to the new batch of managers and marketing spergs, though. It's just the way it is.

      I've heard Edge Computing used to describe the processing and accumulation that microcontrollers do with sensor input and other logged data. It gets processed down into more useful and comprehensible information, then transmitted to "the cloud" when it's possible or convenient to do so. Not all sensor packages and other equipment operate in an environment where they can be in continuous, reliable contact with a network.
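
      A minimal sketch of that accumulate-then-forward pattern; the class and method names are illustrative, not any particular device's API:

          import java.util.ArrayList;
          import java.util.List;

          public class EdgeSensorNode {
              private final List<Double> samples = new ArrayList<>();

              // Raw readings pile up locally while the node is offline.
              public void record(double reading) {
                  samples.add(reading);
              }

              // When a link is available, ship a compact summary instead of the raw stream.
              public void flush(boolean linkUp) {
                  if (!linkUp || samples.isEmpty()) return;
                  double mean = samples.stream().mapToDouble(Double::doubleValue).average().orElse(0.0);
                  double peak = samples.stream().mapToDouble(Double::doubleValue).max().orElse(0.0);
                  transmit(mean, peak, samples.size());
                  samples.clear();
              }

              // Stand-in for whatever uplink the device actually has.
              private void transmit(double mean, double peak, int count) {
                  System.out.printf("uplink: mean=%.2f peak=%.2f n=%d%n", mean, peak, count);
              }

              public static void main(String[] args) {
                  EdgeSensorNode node = new EdgeSensorNode();
                  node.record(21.3); node.record(22.1); node.record(35.9);
                  node.flush(true);  // uplink: mean=26.43 peak=35.90 n=3
              }
          }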

      There is a gold rush on for data right now. More and better sensors in everything to feed the training sets of machine learning algorithms.

  • (Score: 5, Insightful) by jmorris on Tuesday May 08 2018, @09:37PM (1 child)

    by jmorris (4844) on Tuesday May 08 2018, @09:37PM (#677197)

    We do this thing where we flip between extremes. Centralized mainframes gave way to the decentralized liberation of the PC; then we realized that Windows was inescapable and an unmanageable nightmare, so we pushed it all into the Cloud; that was a different security and privacy nightmare and, as noted in TFA, had serious latency problems, so.....

    But notice the difference this time: the Cloud wants to push the computing back to you but retain full ownership.

    • (Score: 0) by Anonymous Coward on Tuesday May 08 2018, @10:53PM

      by Anonymous Coward on Tuesday May 08 2018, @10:53PM (#677219)

      I have never worked anywhere, ever, that thought Windows was inescapable and an unmanageable nightmare and therefore concluded it should push everything into the Cloud.

      No, the people in charge wanted to save money and get rid of all that equipment and the people supporting it.

      You speak like you were one of the people responsible for the layoffs and outsourcing, and have to come up with some reason to rationalize it.

  • (Score: 0) by Anonymous Coward on Wednesday May 09 2018, @02:39PM

    by Anonymous Coward on Wednesday May 09 2018, @02:39PM (#677443)

    We didn't even get rid of the thick clients before they came back in vogue.
