
posted by janrinok on Sunday March 13 2022, @02:14PM

10 years of Raspberry Pi: The $25 computer has come a long way:

This little device has revolutionized computing since it came on the scene. We take a look back at its journey.

The UK in the 1980s was ground zero for the microcomputer revolution. Cheap computers based on 8-bit processors flooded the market, teaching a generation to program using built-in BASIC interpreters. Homes had devices like Sinclair's ZX81 and Spectrum, while schools used Acorn's BBC Micro.

These weren't like today's PCs. They were designed and built to be accessible, with IO ports that could be accessed directly from the built-in programming environments. Turn one on, and you were ready to start programming.

But then things changed: 16-bit machines were more expensive, and technical and marketing failures started to remove pioneers from the market. The final nail in the coffin was the IBM PC and its myriad clones, focused on the business market and designed to run, not build, applications.

It became harder to learn computing skills, with home computers slowly replaced by gaming consoles, smartphones and tablets. How could an inquisitive child learn to code or build their own hardware?

The answer first came from the Arduino, a small AVR-based developer board that served as a target for easy-to-learn programming languages. But it wasn't a computer; you couldn't hook it up to a keyboard and screen and use it.

Eben Upton, an engineer at chip manufacturer Broadcom, was frustrated with the status quo. Looking at the current generation of ARM-based chips, he realized it was possible to use a low-cost (and relatively low-power) part to build a single-board computer. Using a system-on-a-chip architecture, you could bundle the CPU, GPU, and memory on a single chip. Using the SoC's general-purpose I/O ports, you could build it into a device that was easily expandable, booting from a simple SD storage card.

Work on what was to become the Raspberry Pi began in 2006, with a team of volunteers working with a simple ARM SoC.

Can anyone remember the first program that they actually wrote (rather than copied from a magazine or downloaded from a friend's cassette tape)? Mine simply moved an asterisk around the screen 'bouncing' off the edges, and was written in Z80 assembly language. That is all I had on my Nascom 1.
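For flavor, that classic first program's logic can be sketched in a few lines of Python rather than Z80 assembly (the grid size and step rule here are invented for illustration, not taken from the original):

```python
# Toy version of the "bouncing asterisk": move a point one cell per step
# across a WIDTH x HEIGHT grid, reversing direction whenever it would
# leave the screen -- the same logic the Z80 original implemented by hand.
WIDTH, HEIGHT = 40, 20

def bounce(steps, x=0, y=0, dx=1, dy=1):
    """Return the (x, y) position after `steps` moves with edge bouncing."""
    for _ in range(steps):
        if not 0 <= x + dx < WIDTH:   # would cross a vertical edge
            dx = -dx
        if not 0 <= y + dy < HEIGHT:  # would cross a horizontal edge
            dy = -dy
        x += dx
        y += dy
    return x, y

print(bounce(5))   # starting at (0, 0), moving down-right -> (5, 5)
```

On real hardware you would redraw the asterisk at the new position each frame; here the movement rule is the whole point.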


Original Submission

 
  • (Score: 0) by Anonymous Coward on Sunday March 13 2022, @08:46PM (#1228960) (1 child)

    > What should "kids today" start programming in? I tried to encourage mine to try Scratch when they were about 7-10... no real interest shown. One latched on to Tux Paint, and the other YouTube. I might have done something similar, if I hadn't been stuck with BASIC.

    Assembly. Without learning the low level first, computers are too abstract: you end up taking 10,000 things as given and feeling like an idiot. Unfortunately, pedagogy takes the opposite approach and starts with the super-complex high-level stuff, which drives away non-rote learners.

  • (Score: 2) by JoeMerchant (3937) on Monday March 14 2022, @01:53AM (#1228996)

    >Assembly.

    I sort of agree. I studied electrical engineering at university, and it took them over 3 years to get around to unraveling the one "magic" bit of computers I never understood before school: multiplexers. I got how transistors get built up into logic gates, and how gates are made into things like CPU registers, adders, and multipliers. But the mystery for me was: how does the CPU address memory? In the end, it's basically a really big multiplexer. Once I had that, I felt like I actually understood how computers were built and how everything worked. I bet today it would take me less than 3 months to learn what it took university 3 years to teach me about that one thing I was curious about. Not that the rest of university wasn't fun and informative along the way; they just dribbled out some of the information painfully slowly.
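    The "memory addressing is just a big multiplexer" idea is easy to sketch in software (a toy model for illustration, not real hardware: the `mux` function and the four-cell memory here are made up):

```python
# Toy N-to-1 multiplexer: select one "memory cell" by address using only
# gate-level logic -- an AND over matched select bits per line, then an OR
# across all lines. Address decoding in a real RAM works on the same idea.

def mux(select_bits, inputs):
    """Route inputs[address] to the output; select_bits[b] is address bit b."""
    n = len(select_bits)
    out = 0
    for line, value in enumerate(inputs):
        # This line is enabled only when every select bit matches its address bit.
        enabled = all(((line >> b) & 1) == select_bits[b] for b in range(n))
        out |= value if enabled else 0  # OR the single enabled line through
    return out

memory = [0xDE, 0xAD, 0xBE, 0xEF]  # four one-byte "cells"
address = [1, 0]                   # bit 0 = 1, bit 1 = 0 -> line 1
print(hex(mux(address, memory)))   # reads memory[1] -> 0xad
```

    Scale the same structure up to millions of lines and you have, conceptually, how an address bus picks out one word of RAM.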

    So, given working hardware, you program it in assembly; assembly can build compilers and/or interpreters, which can build higher-level compilers and interpreters, and eventually you end up with HTML, CSS, and JavaScript interpreted in your browser.

    And, how interesting is assembly, really? Hopefully very interesting for a few thousand new students a year, otherwise we're all going to be screwed when the low level magic is lost and we're all stuck doing cargo-cult construction with ultra-complex building blocks that nobody understands. If the rumors are true, that has happened in the hard drive world: middle layers of the drive controllers have become a copy-pasta fest where nobody really knows how they work anymore, they just use the old code that works.
