posted by janrinok on Saturday April 28 2018, @12:54PM   Printer-friendly
from the there-are-still-some-of-us-left dept.

Over at ACM Yegor Bugayenko writes:

In the 1970s, when Microsoft and Apple were founded, programming was an art only a limited group of dedicated enthusiasts actually knew how to perform properly. CPUs were rather slow, personal computers had a very limited amount of memory, and monitors were lo-res. To create something decent, a programmer had to fight against actual hardware limitations.

In order to win in this war, programmers had to be both trained and talented in computer science, a science that was at that time mostly about algorithms and data structures.

[...] Most programmers were calling themselves "hackers," even though in the early 1980s this word, according to Steven Levy's book Hackers: Heroes of the Computer Revolution, "had acquired a specific and negative connotation." Since the 1990s, this label has become "a shibboleth that identifies one as a member of the tribe," as linguist Geoff Nunberg pointed out.

[...] it would appear that the skills required of professional and successful programmers are drastically different from the ones needed back in the 1990s. The profession now requires less mathematics and algorithms and instead emphasizes more skills under the umbrella term "sociotech." Susan Long illustrates in her book Socioanalytic Methods: Discovering the Hidden in Organizations and Social Systems that the term "sociotechnical systems" was coined by Eric Trist et al. in the World War II era based on their work with English coal miners at the Tavistock Institute in London. The term now seems more suitable to the new skills and techniques modern programmers need.


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 4, Interesting) by MichaelDavidCrawford on Saturday April 28 2018, @08:42PM (4 children)

    by MichaelDavidCrawford (2339) Subscriber Badge <mdcrawford@gmail.com> on Saturday April 28 2018, @08:42PM (#673119) Homepage Journal

    It had to be updated in sync with the electron beam scan. Real work was done during the vertical blanking interval.

    Bob Polaro is a friend of mine who was one of Atari's first coders.

    http://polaro.com

    Look under Articles & Interviews on the left

    --
    Yes I Have No Bananas. [gofundme.com]
  • (Score: 4, Interesting) by VanessaE on Sunday April 29 2018, @02:16AM (2 children)

    by VanessaE (3396) <vanessa.e.dannenberg@gmail.com> on Sunday April 29 2018, @02:16AM (#673211) Journal

    The Atari did not have a one-pixel buffer. That would be impossible, as each pixel is only a bit over 100 nanoseconds long, and the 6502 averages about 35 times that, per instruction.

    No, Atari used a display list prepared by a user program. That list is itself a program, executed entirely by the ANTIC display chip. ANTIC uses DMA to read the list in real time and to fetch the display data it dictates, and it keeps reusing the list, so the CPU need only prepare it once to create a static image. Of course, if you want to avoid tearing, you do have to stay ahead of the raster, or stay in the vertical blank interval, when updating the list or changing the data it's about to fetch.
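
    For concreteness, here is a rough sketch of what such a display list looks like, written out as a C byte array rather than the 6502 assembly or BASIC it would normally be built from. The opcodes follow the standard ANTIC encoding for a 40-column text screen (BASIC's GRAPHICS 0); the SCREEN and DLIST addresses are placeholders, not values from any real program.

        #include <stdint.h>

        #define SCREEN 0x4000u            /* placeholder screen-memory address */
        #define DLIST  0x3F00u            /* placeholder display-list address  */
        #define LO(a)  ((uint8_t)((a) & 0xFFu))
        #define HI(a)  ((uint8_t)((a) >> 8))

        static const uint8_t display_list[] = {
            0x70, 0x70, 0x70,              /* 3 x "8 blank scan lines": skip the top overscan area     */
            0x42, LO(SCREEN), HI(SCREEN),  /* row 1: ANTIC mode 2 text + LMS, sets the screen address  */
            0x02, 0x02, 0x02, 0x02, 0x02,  /* rows 2-24: 23 more mode-2 text rows                      */
            0x02, 0x02, 0x02, 0x02, 0x02,
            0x02, 0x02, 0x02, 0x02, 0x02,
            0x02, 0x02, 0x02, 0x02, 0x02,
            0x02, 0x02, 0x02,
            0x41, LO(DLIST), HI(DLIST)     /* JVB: jump back to the top and wait for vertical blank    */
        };

    ANTIC walks this list by DMA on every frame, so once the CPU has written it (and the screen RAM it points at), a static image costs the 6502 nothing further.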

    • (Score: 0) by Anonymous Coward on Sunday April 29 2018, @11:33PM

      by Anonymous Coward on Sunday April 29 2018, @11:33PM (#673522)

      Thanks Vanessa, super interesting.

    • (Score: 1, Interesting) by Anonymous Coward on Monday April 30 2018, @01:24AM

      by Anonymous Coward on Monday April 30 2018, @01:24AM (#673557)

      The 2600 drew anything more complex than the original Pong by altering the TIA registers in software, during hblank, on every line. The tricky bit is that there's no hblank interrupt. Beyond the small handful of registers, there's no framebuffer or tilemap or anything. It's not even like a one-pixel buffer; you basically just move the sprites, change the background data, or swap colors each line to get the image you want.
      The 5200/Atari 8-bit computer series expands on this a bunch by using a display list to do that kind of tricky, timing-sensitive work automatically, like you described.
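
      To make the "no framebuffer" point concrete, here is a loose, C-flavored sketch of the per-scan-line trick described above. It is illustrative only: on the real 2600 this loop is hand-cycle-counted 6502 code, not C. WSYNC ($02) and COLUBK ($09) are standard TIA registers; everything else is made up for the example.

          #include <stdint.h>

          /* Pretend the TIA registers are memory-mapped bytes at their usual
             addresses, as they are on the 2600.                              */
          #define TIA(reg)  (*(volatile uint8_t *)(uintptr_t)(reg))
          #define WSYNC   0x02   /* strobe: halts the CPU until the next scan line starts */
          #define COLUBK  0x09   /* background color register                             */

          static void kernel(void)
          {
              /* With no framebuffer, the picture is whatever the registers hold
                 as the beam sweeps by, so the program rewrites them every line. */
              for (uint8_t line = 0; line < 192; line++) {   /* 192 visible NTSC lines */
                  TIA(WSYNC)  = 0;      /* wait out the rest of the current line        */
                  TIA(COLUBK) = line;   /* e.g. change the background color per line    */
              }
          }

      That toy kernel would just paint a different background color on each line, which is about the simplest possible demonstration of "racing the beam."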

  • (Score: 1) by suburbanitemediocrity on Monday April 30 2018, @10:33PM

    by suburbanitemediocrity (6844) on Monday April 30 2018, @10:33PM (#673952)

    Wow. He was one of my heroes when I was a kid.