Submitted via IRC for Bytram
RIP Danny Cohen: The computer scientist who gave world endianness meets his end aged 81
The computer scientist who created the first visual flight simulator, gave us the compsci concept of endianness and whose pioneering work blazed a trail for modern VoIP services has died at the age of 81.
Dr Danny Cohen worked on one of the first ever computer-based flight simulations in the early 1970s, an era when most people think of computing as still reliant on punch cards rather than advanced graphics processing and display technologies.
In addition, Cohen gave us the compsci notion of endianness and developed some of the first clustered computing deployments – paving the way for modern cloud technology.
The flight simulator created by the Israeli-born mathematician is very basic by modern standards but wouldn't be bested by generally available software until the advent of home gaming consoles more than a decade later.
What made his flight sim achievements even more remarkable was that it wasn't until after he developed the simulator that Cohen learned to fly in real life, as he told Wired in a 2012 interview.
Cohen also carried out some early work on what he described as "digital voice teleconferencing" in 1978, as this YouTube video published from an account seemingly in Cohen's name sets out.
[...] The Internet Hall of Fame inducted him into their ranks in 2012, recognising him as a pioneer.
[...] Danny Cohen, computer scientist. 9 December 1937 – 12 August 2019.
(Score: 2) by inertnet on Wednesday August 21 2019, @01:11PM (2 children)
On that page I didn't find the optimization reason for little endianness as I understood it way back when this was significant, but this quote comes close:
In order to add 16-bit numbers, the processor first fetched the least significant byte. The addition could then already start while the most significant byte was being read, so the carry bit would be known early. Reading the bytes in big endian order (on a byte-by-byte bus without cache) would require more processing time for a number of calculations: not only additions, but also relative jump address calculations.
(Score: 2) by jmorris on Thursday August 22 2019, @12:31AM (1 child)
That only mattered with the earliest and most primitive processors, i.e. the Intel crap and the 6502. Pretty much anything else was complex enough to support both incrementing and decrementing pointers.
So a 16-bit add of a little endian number would look something like:
;Add secnum to firstnum, storing result in firstnum, little endian
; firstnum will end up with 26,4 or 1050
ldx #firstnum ; immediate load
ldy #secnum
lda x
adda y+
sta x+
lda x
adca y
sta x
ret
firstnum: .byte 232,3 ; 1000
secnum: .byte 50,0 ; 50
;Same thing in Big Endian
; firstnum ends up with 4,26 or 1050
ldx #firstnum+1 ;point at LSB
ldy #secnum+1
lda x
adda y-
sta x-
lda x
adca y
sta x
ret
firstnum: .byte 3,232 ; 1000
secnum: .byte 0,50 ; 50
There were some oddball processors that only gave post-increment and pre-decrement, but the code doesn't change much to accommodate that. So no, little endian is just stupid, a legacy of the earliest Intel processors we will suffer from for the next thousand years. We'll be forever converting from sane network byte ordering to the stupid Intel byte ordering, since ARM gave up and pretty much adopted it as well.
(Score: 2) by inertnet on Thursday August 22 2019, @09:37AM
I remember that back then, expensive manager types simply advised companies to buy IBM computers with backward processors instead of Macintoshes with the much more advanced 68000s in them. Other managers then decided to buy IBM; they never listened to tech types.