

posted by Fnord666 on Wednesday August 21 2019, @09:47AM
from the met-his-endian dept.

Submitted via IRC for Bytram

RIP Danny Cohen: The computer scientist who gave world endianness meets his end aged 81

The computer scientist who created the first visual flight simulator, gave us the compsci concept of endianness and whose pioneering work blazed a trail for modern VOIP services has died at the age of 81.

Dr Danny Cohen worked on one of the first ever computer-based flight simulations in the early 1970s, an era when most people think of computing as still reliant on punch cards rather than advanced graphics processing and display technologies.

In addition, Cohen gave us the compsci notion of endianness and developed some of the first clustered computing deployments – paving the way for modern cloud technology.

The flight simulator created by the Israeli-born mathematician was very basic by modern standards, but it wouldn't be bested by generally available software until the advent of home gaming consoles more than a decade later.

What made his flight sim achievements even more remarkable was that it wasn't until after he developed the simulator that Cohen learned to fly in real life, as he told Wired in a 2012 interview.

Cohen also carried out some early work on what he described as "digital voice teleconferencing" in 1978, as this YouTube video published from an account seemingly in Cohen's name sets out.

[...] The Internet Hall of Fame inducted him into their ranks in 2012, recognising him as a pioneer.

[...] Danny Cohen, computer scientist. 9 December 1937 – 12 August 2019.


Original Submission

  • (Score: 2) by rob_on_earth on Wednesday August 21 2019, @11:08AM (13 children)

    by rob_on_earth (5485) on Wednesday August 21 2019, @11:08AM (#883052) Homepage

    In computing, endianness refers to the order of bytes (or sometimes bits) within a binary representation of a number. It can also be used more generally to refer to the internal ordering of any representation, such as the digits in a numeral system or the sections of a date.
    https://en.wikipedia.org/wiki/Endianness [wikipedia.org]

    Had to deal with this in the dial-up days; everyone and every system seemed to want something different.
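
    A minimal C sketch of what that ordering difference looks like in memory (the value 0x0A0B0C0D is just an arbitrary example, not taken from the article or the Wikipedia page):

    #include <stdio.h>
    #include <stdint.h>
    #include <string.h>

    int main(void)
    {
        uint32_t value = 0x0A0B0C0D;
        unsigned char bytes[sizeof value];

        memcpy(bytes, &value, sizeof value);   /* copy the in-memory representation */

        /* a little-endian host prints 0D 0C 0B 0A; a big-endian host prints 0A 0B 0C 0D */
        for (size_t i = 0; i < sizeof bytes; i++)
            printf("%02X ", bytes[i]);
        printf("\n");
        return 0;
    }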

    • (Score: 4, Informative) by FatPhil on Wednesday August 21 2019, @12:32PM

      by FatPhil (863) <{pc-soylent} {at} {asdf.fi}> on Wednesday August 21 2019, @12:32PM (#883078) Homepage
      And this was a concept that was absolutely at the core of computer i/o right since the early days (e.g. https://en.wikipedia.org/wiki/UNIVAC_1100/2200_series#Data_formats ), way before Cohen came up with a catchy name that only describes, and sometimes ambiguously at that, two of the most common cases for sub-field ordering, where others were known to exist. Not just that, but his catchy name wasn't even original, being a literary reference. He invented a meme associated with the concept, not the concept itself.
      --
      Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
    • (Score: 4, Insightful) by Coward, Anonymous on Wednesday August 21 2019, @12:45PM (7 children)

      by Coward, Anonymous (7017) on Wednesday August 21 2019, @12:45PM (#883082) Journal

      The names big and little endian have always confused me. Which end is big, which end is little?

      • (Score: 2) by theluggage on Wednesday August 21 2019, @12:54PM (3 children)

        by theluggage (1797) on Wednesday August 21 2019, @12:54PM (#883084)

        The names big and little endian have always confused me. Which end is big, which end is little?

        Neither. One is small, the other is far away... :-)

        Seriously: big-endian is the most significant byte first. If it bothers you, just draw your computer's memory map upside down...

        • (Score: 1) by noelhenson on Wednesday August 21 2019, @01:11PM (2 children)

          by noelhenson (6184) on Wednesday August 21 2019, @01:11PM (#883093)

          The endianness of byte ordering is important to the underlying architecture of the CPU core. Big-endian is more human-readable when one reads bytes in memory, e.g. hex dumps; which I don't think programmers do much today. I recently had to give a long-time programmer a quick lesson on what hexadecimal was and what it meant. Little-endian makes for a simpler ALU in the CPU core, especially in the old days of 8- and 16-bit CPUs. This is because of the add/subtract carry/borrow bits: the next byte could be fetched while the add/subtract was being performed, with the address simply incremented to index the next byte in memory.
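
          A rough C sketch of that point, with an illustrative function name (this is not anyone's actual ALU, just the idea): when the bytes sit least-significant first, the carry travels through memory in the same direction a simple incrementing index does.

          #include <stddef.h>
          #include <stdint.h>

          /* Add src into dst, both stored least-significant byte first.
             Walking the addresses upward is also the direction the carry
             propagates, so one incrementing index is all that is needed. */
          static void add_little_endian(uint8_t *dst, const uint8_t *src, size_t len)
          {
              unsigned carry = 0;
              for (size_t i = 0; i < len; i++) {   /* index 0 holds the LSB */
                  unsigned sum = dst[i] + src[i] + carry;
                  dst[i] = (uint8_t)sum;
                  carry = sum >> 8;
              }
          }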

          • (Score: 2) by DannyB on Wednesday August 21 2019, @04:01PM

            by DannyB (5839) Subscriber Badge on Wednesday August 21 2019, @04:01PM (#883196) Journal

            Oh how I remember using TMON on Classic Mac (68000 processors). (a low level debugger with amazingly high level features)

            How nice it was that HEX was easily read because it was, unbeknownst to me, Big-endian.

            A few years later . . . Wow! Intel is DOING IT WRONG! Or so I thought. Later, I came to realize the advantages of Little-endian.

            I recently had to give a long-time programmer a quick lesson on what hexadecimal was and what it meant.

            My how far things have come!

            --
            To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
          • (Score: 2) by kazzie on Thursday August 22 2019, @08:24AM

            by kazzie (5309) Subscriber Badge on Thursday August 22 2019, @08:24AM (#883514)

            Little-endian makes for a simpler ALU in the CPU core; especially in the old days of 8- and 16-bit CPUs.

            How is it less of an issue with 32-bit (+) CPUs?

      • (Score: 2) by DannyB on Wednesday August 21 2019, @03:53PM

        by DannyB (5839) Subscriber Badge on Wednesday August 21 2019, @03:53PM (#883191) Journal

        Me too. When I first heard the terms big-endian and little-endian, I found it confusing. Which is which? For years, I had to look it up when I ran across it. I rarely needed to know, but when I did, it was infrequent enough that I had to look it up. I didn't consider myself stupid or dim-witted. I just found it non-descriptive and confusing. Eventually I got the hang of it enough that I now just keep a browser bookmark handy.

        --
        To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
      • (Score: 2) by DeathMonkey on Wednesday August 21 2019, @06:06PM

        by DeathMonkey (1380) on Wednesday August 21 2019, @06:06PM (#883255) Journal

        The names big and little endian have always confused me. Which end is big, which end is little?

        It's a reference to Gulliver's Travels and I think the joke is that it doesn't matter. [upenn.edu]

      • (Score: 3, Informative) by stormreaver on Wednesday August 21 2019, @07:26PM

        by stormreaver (5101) on Wednesday August 21 2019, @07:26PM (#883282)

        In Big Endian, the big end (Most Significant Byte) comes first in memory. In Little Endian, the little end (Least Significant Byte) comes first in memory.
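
        A small C check of which convention the host uses follows directly from that definition: store the integer 1 in a word and look at the lowest-addressed byte (a sketch, assuming nothing beyond standard C):

        #include <stdio.h>
        #include <stdint.h>
        #include <string.h>

        int main(void)
        {
            uint16_t one = 1;
            unsigned char first;

            memcpy(&first, &one, 1);   /* the lowest-addressed byte of the word */

            if (first == 1)
                printf("little-endian: the little end (LSB) comes first in memory\n");
            else
                printf("big-endian: the big end (MSB) comes first in memory\n");
            return 0;
        }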

    • (Score: 4, Interesting) by theluggage on Wednesday August 21 2019, @12:46PM

      by theluggage (1797) on Wednesday August 21 2019, @12:46PM (#883083)

      In computing, endianness refers to the order of bytes (or sometimes bits) within a binary representation of a number.

      The name, of course (like Yahoo!, but probably not Swift) comes from Gulliver's Travels and the holy war over which way up you should eat your egg, which raises a chicken/egg question: In the article cited in TFA, Cohen refers specifically to the "holy wars" about byte order - so it was clearly not a new concept that he introduced, although his paper may have been a seminal description of the debate. So was it simply the case that he coined the name "endian"?

      NB: Danny Cohen obviously contributed more than enough to the subject - and the wider world of computing - to command respect, so it is only the reporting I'm nit-picking here. Otherwise, glass duly raised.

    • (Score: 2) by inertnet on Wednesday August 21 2019, @01:11PM (2 children)

      by inertnet (4071) on Wednesday August 21 2019, @01:11PM (#883091) Journal

      On that page I didn't find the optimization reason for little endianness as I understood it way back when this was significant, but this quote comes close:

      As carry propagation must start at the least significant bit (and thus byte), multi-byte addition can then be carried out with a monotonically-incrementing address sequence, a simple operation already present in hardware

      In order to add 16-bit numbers, the processor first fetched the least significant byte. The addition could then already start while reading the most significant byte, so the carry bit would be known early. Reading the bytes in big-endian order (on a byte-by-byte bus without cache) would require more processing time in a number of calculations: not only additions but also calculating relative jump addresses.

      • (Score: 2) by jmorris on Thursday August 22 2019, @12:31AM (1 child)

        by jmorris (4844) on Thursday August 22 2019, @12:31AM (#883375)

        That only mattered with the earliest and most primitive processors, i.e. the Intel crap and the 6502. Pretty much anything else was complex enough to support both incrementing and decrementing pointers.

        So a 16bit add of a little endian number would look something like:

        ;Add secnum to firstnum, storing result in firstnum, little endian
        ; firstnum will end up with 26,4 or 1050
        ldx #firstnum ; immediate load
        ldy #secnum
        lda x
        adda y+
        sta x+
        lda x
        adca y
        sta x
        ret
        firstnum: .byte 232,3 ; 1000
        secnum: .byte 50,0 ; 50

        ;Same thing in Big Endian
        ; firstnum ends up with 4,26 or 1050
        ldx #firstnum+1 ;point at LSB
        ldy #secnum+1
        lda x
        adda y-
        sta x-
        lda x
        adca y
        sta x
        ret
        firstnum: .byte 3,232 ; 1000
        secnum: .byte 0,50 ; 50

        There were some oddball processors that only gave post increment and pre decrement but the code doesn't change much to accommodate that. So no, little endian is just stupid, a legacy of the earliest Intel processors we will suffer from for the next thousand years. Forever converting from sane network byte ordering to the stupid intel byte ordering since ARM gave up and pretty much adopted it as well.
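
        The network-order conversion grumbled about here is what the POSIX socket helpers htonl() and ntohl() do; a minimal sketch (the value 1050 simply echoes the example above):

        #include <stdio.h>
        #include <stdint.h>
        #include <arpa/inet.h>   /* htonl()/ntohl(), POSIX */

        int main(void)
        {
            uint32_t host_value = 1050;
            uint32_t wire_value = htonl(host_value);  /* host order -> big-endian network order */

            /* on a little-endian host these differ; on a big-endian host
               htonl()/ntohl() are effectively no-ops */
            printf("host: 0x%08X  network: 0x%08X\n",
                   (unsigned)host_value, (unsigned)wire_value);
            printf("round trip: %u\n", (unsigned)ntohl(wire_value));
            return 0;
        }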

        • (Score: 2) by inertnet on Thursday August 22 2019, @09:37AM

          by inertnet (4071) on Thursday August 22 2019, @09:37AM (#883527) Journal

          I remember back then that expensive manager types simply advised companies to buy IBM computers with backward processors, instead of Macintoshes with much more advanced 68000s in them. Other managers then decided to buy IBM; they never listened to tech types.

  • (Score: 3, Informative) by DannyB on Wednesday August 21 2019, @03:49PM (2 children)

    by DannyB (5839) Subscriber Badge on Wednesday August 21 2019, @03:49PM (#883188) Journal

    <no-sarcasm>
    Back in the day I was using the UCSD p-System to write Pascal. It was portable across surprisingly different systems. It was amazing considering how primitive the tech was in those daze. 64 K was still a LOT of memory. And it was all a 16-bit world.

    Because of the portability, binary files had something called a 'byte sex' indicator. (yes, I'm serious. See the <no-sarcasm> tags.) This was a word (two bytes). The integer value of 1 was stored in that pair of bytes. If you looked at the integer and its value was 256, then you knew that the binary had the opposite byte sex from the implementation the present system was running on. So you byte-flipped all words. This worked very neatly without even knowing what Endianness your system had. The p-Code binary just worked. Other binary files just worked.

    I heard that term for several years within the p-System documentation and associated materials. I had never heard of Endianness until much, much later.
    </no-sarcasm>

    --
    To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
    • (Score: 3, Informative) by DannyB on Wednesday August 21 2019, @04:05PM (1 child)

      by DannyB (5839) Subscriber Badge on Wednesday August 21 2019, @04:05PM (#883198) Journal

      Oh, I should mention that byte-sex marks were used for things sent over the network. Yes, there were some LANs that the p-System worked on. The best of that era was Corvus Omninet.

      Also there was a utility that could flip the byte sex of certain recognized binary file types, like executables, system libraries, etc.

      When building any file format or network format, a developer learned to put in byte-sex marks and recognize them, without needing to know the endianness of the system they were running on. In Pascal, you look at that marker (integer): if it is 1, no byte flipping is needed; if it is 256, bytes within words must be swapped. You could then be sure your compiled code (not just source) would run on any p-System. This was long before Java.
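
      A hedged C sketch of that marker check (none of this is actual p-System code; swap16 and fix_byte_sex are made-up names for illustration): the writer stores the 16-bit integer 1 as the marker, and a reader that sees 256 knows every word needs its bytes swapped.

      #include <stddef.h>
      #include <stdint.h>

      /* swap the two bytes of a 16-bit word */
      static uint16_t swap16(uint16_t w)
      {
          return (uint16_t)((w << 8) | (w >> 8));
      }

      /* Returns 0 on success, -1 if the marker is not recognised. */
      static int fix_byte_sex(uint16_t marker, uint16_t *words, size_t count)
      {
          if (marker == 1)
              return 0;                /* same byte sex: nothing to do  */
          if (marker == 256) {         /* opposite byte sex: flip words */
              for (size_t i = 0; i < count; i++)
                  words[i] = swap16(words[i]);
              return 0;
          }
          return -1;
      }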

      --
      To transfer files: right-click on file, pick Copy. Unplug mouse, plug mouse into other computer. Right-click, paste.
      • (Score: 0) by Anonymous Coward on Wednesday August 21 2019, @05:55PM

        by Anonymous Coward on Wednesday August 21 2019, @05:55PM (#883246)

        I'm confused, your no-sarcasm tag ended, so is this all sarcasm?

  • (Score: 2) by istartedi on Wednesday August 21 2019, @04:12PM

    by istartedi (123) on Wednesday August 21 2019, @04:12PM (#883201) Journal

    Mine says he died at 20736. Wait... something must be off.

    --
    Appended to the end of comments you post. Max: 120 chars.
  • (Score: 2, Interesting) by Anonymous Coward on Wednesday August 21 2019, @05:13PM (1 child)

    by Anonymous Coward on Wednesday August 21 2019, @05:13PM (#883230)

    has died at the age of 81

    2019 - 81 = 1938

    Israeli-born mathematician

    However, Israel was declared a state in 1948.

    Conclusion: Said scientist, aged thusly, could have been born in Palestine, but surely not in Israel.

    • (Score: 2) by Coward, Anonymous on Thursday August 22 2019, @04:42AM

      by Coward, Anonymous (7017) on Thursday August 22 2019, @04:42AM (#883470) Journal

      In Hebrew, the Palestine of 1938 is called Palestine (Land of Israel). Would you say that a Palestinian who was born in the pre-1988 West Bank is not Palestinian-born? The State of Palestine was declared in that year.

  • (Score: 2) by Bot on Friday August 23 2019, @08:38AM

    by Bot (3902) on Friday August 23 2019, @08:38AM (#883990) Journal

    yMc noodelcnse .

    --
    Account abandoned.