
posted by Fnord666 on Wednesday August 21 2019, @09:47AM
from the met-his-endian dept.

Submitted via IRC for Bytram

RIP Danny Cohen: The computer scientist who gave world endianness meets his end aged 81

The computer scientist who created the first visual flight simulator, gave us the compsci concept of endianness and whose pioneering work blazed a trail for modern VOIP services has died at the age of 81.

Dr Danny Cohen worked on one of the first ever computer-based flight simulations in the early 1970s, an era where most think of computing as something that was still reliant on punch cards instead of advanced graphics processing and display technologies.

In addition, Cohen gave us the compsci notion of endianness and developed some of the first clustered computing deployments – paving the way for modern cloud technology.

The flight simulator created by the Israeli-born mathematician is very basic by modern standards but wouldn't be bested by generally available software until the advent of home gaming consoles more than a decade later.

What made his flight sim achievements even more remarkable was that it wasn't until after he developed the simulator that Cohen learned to fly in real life, as he told Wired in a 2012 interview.

Cohen also carried out some early work on what he described as "digital voice teleconferencing" in 1978, as this YouTube video published from an account seemingly in Cohen's name sets out.

[...] The Internet Hall of Fame inducted him into their ranks in 2012, recognising him as a pioneer.

[...] Danny Cohen, computer scientist. 9 December 1937 – 12 August 2019.


Original Submission

 
  • (Score: 2) by rob_on_earth on Wednesday August 21 2019, @11:08AM (13 children)

    by rob_on_earth (5485) on Wednesday August 21 2019, @11:08AM (#883052) Homepage

    In computing, endianness refers to the order of bytes (or sometimes bits) within a binary representation of a number. It can also be used more generally to refer to the internal ordering of any representation, such as the digits in a numeral system or the sections of a date.
    https://en.wikipedia.org/wiki/Endianness [wikipedia.org]

    Had to deal with this in the dial-up days; everyone and every system seemed to want something different.
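
    A minimal C sketch of what that definition means in practice (my illustration, not from the Wikipedia article): store a known multi-byte value and see which byte ends up at the lowest address.

      #include <stdio.h>
      #include <stdint.h>

      int main(void) {
          uint32_t value = 0x01020304;        /* four distinct bytes */
          uint8_t *bytes = (uint8_t *)&value; /* view the same memory byte by byte */

          if (bytes[0] == 0x04)
              printf("little-endian: least significant byte at the lowest address\n");
          else if (bytes[0] == 0x01)
              printf("big-endian: most significant byte at the lowest address\n");
          return 0;
      }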

  • (Score: 4, Informative) by FatPhil on Wednesday August 21 2019, @12:32PM

    by FatPhil (863) <reversethis-{if.fdsa} {ta} {tnelyos-cp}> on Wednesday August 21 2019, @12:32PM (#883078) Homepage
    And this was a concept that was absolutely at the core of computer i/o right since the early days (e.g. https://en.wikipedia.org/wiki/UNIVAC_1100/2200_series#Data_formats ), way before Cohen came up with a catchy name that only describes, and sometimes ambiguously at that, two of the most common cases for sub-field ordering, where others were known to exist. Not just that, but his catchy name wasn't even original, being a literary reference. He invented a meme associated with the concept, not the concept itself.
    --
    Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
  • (Score: 4, Insightful) by Coward, Anonymous on Wednesday August 21 2019, @12:45PM (7 children)

    by Coward, Anonymous (7017) on Wednesday August 21 2019, @12:45PM (#883082) Journal

    The names big and little endian have always confused me. Which end is big, which end is little?

    • (Score: 2) by theluggage on Wednesday August 21 2019, @12:54PM (3 children)

      by theluggage (1797) on Wednesday August 21 2019, @12:54PM (#883084)

      The names big and little endian have always confused me. Which end is big, which end is little?

      Neither. One is small, the other is far away... :-)

      Seriously: big-endian is the most significant byte first. If it bothers you, just draw your computer's memory map upside down...

      • (Score: 1) by noelhenson on Wednesday August 21 2019, @01:11PM (2 children)

        by noelhenson (6184) on Wednesday August 21 2019, @01:11PM (#883093)

        The endianness of byte ordering matters to the underlying architecture of the CPU core. Big-endian is more human-readable when one reads bytes in memory, e.g. in hex dumps; which I don't think programmers do much today. I recently had to give a long-time programmer a quick lesson on what hexadecimal was and what it meant. Little-endian makes for a simpler ALU in the CPU core, especially in the old days of 8- and 16-bit CPUs. This is because of the add/subtract carry/borrow bits: the next byte could be fetched while the add/subtract was being performed, with the program counter simply incremented to index the next byte in memory.
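
        A rough sketch of the hex-dump point (illustrative C, not from the comment): dump the in-memory bytes of a value and compare with how you would write the constant.

          #include <stdio.h>
          #include <stdint.h>
          #include <string.h>

          int main(void) {
              uint32_t value = 0x11223344;
              uint8_t dump[4];
              memcpy(dump, &value, sizeof value);   /* capture the in-memory representation */

              /* A big-endian machine prints 11 22 33 44, matching how the constant is written;
                 a little-endian machine prints 44 33 22 11, which takes a moment to read. */
              for (int i = 0; i < 4; i++)
                  printf("%02x ", (unsigned)dump[i]);
              printf("\n");
              return 0;
          }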

        • (Score: 2) by DannyB on Wednesday August 21 2019, @04:01PM

          by DannyB (5839) Subscriber Badge on Wednesday August 21 2019, @04:01PM (#883196) Journal

          Oh how I remember using TMON (a low-level debugger with amazingly high-level features) on Classic Mac (68000 processors).

          How nice it was that HEX was easily read because it was, unbeknownst to me, Big-endian.

          A few years later . . . Wow! Intel is DOING IT WRONG! Or so I thought. Later, I came to realize the advantages of Little-endian.

          I recently had to give a long-time programmer a quick lesson on what hexadecimal was and what it meant.

          My how far things have come!

          --
          The lower I set my standards the more accomplishments I have.
        • (Score: 2) by kazzie on Thursday August 22 2019, @08:24AM

          by kazzie (5309) Subscriber Badge on Thursday August 22 2019, @08:24AM (#883514)

          Little-endian makes for a simpler ALU in the CPU core; especially in the old days of 8- and 16-bit CPUs.

          How is it less of an issue with 32-bit (+) CPUs?

    • (Score: 2) by DannyB on Wednesday August 21 2019, @03:53PM

      by DannyB (5839) Subscriber Badge on Wednesday August 21 2019, @03:53PM (#883191) Journal

      Me too. When I first heard the terms big-endian and little-endian, I found them confusing. Which is which? For years, I had to look it up whenever I ran across it; I rarely needed to know, and it came up infrequently enough that I had to look it up each time. I didn't consider myself stupid or dim-witted. I just found the naming non-descriptive and confusing. Eventually got the hang of it enough that I now just keep a browser bookmark handy.

      --
      The lower I set my standards the more accomplishments I have.
    • (Score: 2) by DeathMonkey on Wednesday August 21 2019, @06:06PM

      by DeathMonkey (1380) on Wednesday August 21 2019, @06:06PM (#883255) Journal

      The names big and little endian have always confused me. Which end is big, which end is little?

      It's a reference to Gulliver's Travels and I think the joke is that it doesn't matter. [upenn.edu]

    • (Score: 3, Informative) by stormreaver on Wednesday August 21 2019, @07:26PM

      by stormreaver (5101) on Wednesday August 21 2019, @07:26PM (#883282)

      In Big Endian, the big end (Most Significant Byte) comes first in memory. In Little Endian, the little end (Least Significant Byte) comes first in memory.
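
      One practical consequence, sketched in C (my example, not from the comment): if a file or wire format says the big end comes first, you can rebuild the value with shifts and never care which order the host machine uses internally.

        #include <stdio.h>
        #include <stdint.h>

        /* Rebuild a 32-bit value from four bytes stored big end first. */
        static uint32_t read_be32(const uint8_t *p) {
            return ((uint32_t)p[0] << 24) |
                   ((uint32_t)p[1] << 16) |
                   ((uint32_t)p[2] << 8)  |
                    (uint32_t)p[3];
        }

        int main(void) {
            uint8_t wire[4] = {0x00, 0x00, 0x04, 0x1A};  /* 1050, big end first */
            printf("%u\n", read_be32(wire));             /* prints 1050 on any host */
            return 0;
        }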

  • (Score: 4, Interesting) by theluggage on Wednesday August 21 2019, @12:46PM

    by theluggage (1797) on Wednesday August 21 2019, @12:46PM (#883083)

    In computing, endianness refers to the order of bytes (or sometimes bits) within a binary representation of a number.

    The name, of course (like Yahoo!, but probably not Swift) comes from Gulliver's Travels and the holy war over which way up you should eat your egg, which raises a chicken/egg question: In the article cited in TFA, Cohen refers specifically to the "holy wars" about byte order - so it was clearly not a new concept that he introduced, although his paper may have been a seminal description of the debate. So was it simply the case that he coined the name "endian"?

    NB: Danny Cohen obviously contributed more than enough to the subject - and the wider world of computing - to command respect, so it is only the reporting I'm nit-picking here. Otherwise, glass duly raised.

  • (Score: 2) by inertnet on Wednesday August 21 2019, @01:11PM (2 children)

    by inertnet (4071) on Wednesday August 21 2019, @01:11PM (#883091) Journal

    On that page I didn't find the optimization reason for little endianness as I understood it way back when this was significant, but this quote comes close:

    As carry propagation must start at the least significant bit (and thus byte), multi-byte addition can then be carried out with a monotonically-incrementing address sequence, a simple operation already present in hardware

    In order to add 16-bit numbers, the processor first fetched the least significant byte. The addition could then already start while the most significant byte was being read, so the carry bit would be known early. Reading the bytes in big-endian order (on a byte-by-byte bus without cache) would require more processing time in a number of calculations: not only additions, but also relative jump address calculations.
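
    A loose C analogue of that byte-at-a-time addition (hypothetical sketch, not from the comment): with little-endian storage the loop just walks addresses upward and carries into the next byte.

      #include <stdio.h>
      #include <stdint.h>

      /* Add two little-endian multi-byte numbers one byte at a time,
         the way an 8-bit ALU with a carry flag would. */
      static void add_le(uint8_t *dst, const uint8_t *src, int len) {
          unsigned carry = 0;
          for (int i = 0; i < len; i++) {       /* index 0 is the least significant byte */
              unsigned sum = dst[i] + src[i] + carry;
              dst[i] = (uint8_t)sum;            /* keep the low 8 bits */
              carry = sum >> 8;                 /* propagate into the next byte */
          }
      }

      int main(void) {
          uint8_t a[2] = {232, 3};              /* 1000 stored little end first */
          uint8_t b[2] = {50, 0};               /* 50 */
          add_le(a, b, 2);
          printf("%u\n", a[0] + 256u * a[1]);   /* prints 1050 */
          return 0;
      }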

    • (Score: 2) by jmorris on Thursday August 22 2019, @12:31AM (1 child)

      by jmorris (4844) on Thursday August 22 2019, @12:31AM (#883375)

      That only mattered with the earliest and most primitive processors, i.e. the Intel crap and the 6502. Pretty much anything else was complex enough to support both incrementing and decrementing pointers.

      So a 16-bit add of a little-endian number would look something like:

      ;Add secnum to firstnum, storing result in firstnum, little endian
      ; firstnum will end up with 26,4 or 1050
      ldx #firstnum ; immediate load
      ldy #secnum
      lda x
      adda y+
      sta x+
      lda x
      adca y
      sta x
      ret
      firstnum: .byte 232,3 ; 1000
      secnum: .byte 50,0 ; 50

      ;Same thing in Big Endian
      ; firstnum ends up with 4,26 or 1050
      ldx #firstnum+1 ;point at LSB
      ldy #secnum+1
      lda x
      adda y-
      sta x-
      lda x
      adca y
      sta x
      ret
      firstnum: .byte 3,232 ; 1000
      secnum: .byte 0,50 ; 50

      There were some oddball processors that only gave post increment and pre decrement but the code doesn't change much to accommodate that. So no, little endian is just stupid, a legacy of the earliest Intel processors we will suffer from for the next thousand years. Forever converting from sane network byte ordering to the stupid intel byte ordering since ARM gave up and pretty much adopted it as well.
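
      For what it's worth, that host-to-network conversion is exactly what the standard socket helpers do; a minimal C sketch (my example, not from the comment):

        #include <inttypes.h>
        #include <stdio.h>
        #include <arpa/inet.h>   /* htonl / ntohl: host <-> network (big-endian) order */

        int main(void) {
            uint32_t host = 0x0A0B0C0DU;
            uint32_t wire = htonl(host);   /* byte-swapped on little-endian hosts, a no-op on big-endian ones */
            printf("host 0x%08" PRIx32 " -> network 0x%08" PRIx32 " -> back 0x%08" PRIx32 "\n",
                   host, wire, ntohl(wire));
            return 0;
        }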

      • (Score: 2) by inertnet on Thursday August 22 2019, @09:37AM

        by inertnet (4071) on Thursday August 22 2019, @09:37AM (#883527) Journal

        I remember back then that expensive manager types simply advised companies to buy IBM computers with backward processors, instead of Macintoshes with much more advanced 68000s in them. Other managers then decided to buy IBM; they never listened to tech types.