10 years of Raspberry Pi: The $25 computer has come a long way
This little device has revolutionized computing since it came on the scene. We take a look back at its journey.
The UK in the 1980s was ground zero for the microcomputer revolution. Cheap computers based on 8-bit processors flooded the market, teaching a generation to program using built-in BASIC interpreters. Homes had devices like Sinclair's ZX81 and Spectrum, while schools used Acorn's BBC Micro.
These weren't like today's PCs. They were designed and built to be accessible, with IO ports that could be accessed directly from the built-in programming environments. Turn one on, and you were ready to start programming.
But then things changed: 16-bit machines were more expensive, and technical and marketing failures started to remove pioneers from the market. The final nail in the coffin was the IBM PC and its myriad clones, focused on the business market and designed to run, not build, applications.
It became harder to learn computing skills, with home computers slowly replaced by gaming consoles, smartphones and tablets. How could an inquisitive child learn to code or build their own hardware?
The answer first came from the Arduino, a small AVR-based development board that served as a target for easy-to-learn programming languages. But it wasn't a computer; you couldn't hook it up to a keyboard and screen and use it.
Eben Upton, an engineer at chip designer Broadcom, was frustrated with the status quo. Looking at the then-current generation of ARM-based chips, he realized it was possible to use a low-cost (and relatively low-power) chip to build a single-board computer. Using a system-on-a-chip architecture, you could bundle CPU, GPU and memory on a single chip. Using the SoC's general-purpose IO ports, you could build a device that was easily expandable, booting from a simple SD card.
Work on what was to become the Raspberry Pi began in 2006, with a team of volunteers working with a simple ARM SoC.
Can anyone remember the first program that they actually wrote (rather than copied from a magazine or downloaded from a friend's cassette tape)? Mine simply moved an asterisk around the screen, 'bouncing' off the edges, and was written in Z80 assembly language. That was all I had on my Nascom 1.
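From memory, the logic was roughly this (a C++ sketch of the same idea, using ANSI escape codes rather than poking the Nascom's video RAM; the playfield size is a guess):

    // Rough re-creation of the "bouncing asterisk" idea:
    // move a '*' around a fixed-size screen, reversing direction at the edges.
    #include <chrono>
    #include <cstdio>
    #include <thread>

    int main() {
        const int width = 40, height = 20;      // playfield size (assumed)
        int x = 1, y = 1, dx = 1, dy = 1;       // position and direction

        for (int frame = 0; frame < 500; ++frame) {
            // clear the screen, then draw the asterisk at (row, col)
            std::printf("\033[2J\033[%d;%dH*", y + 1, x + 1);
            std::fflush(stdout);

            x += dx;
            y += dy;
            if (x <= 0 || x >= width - 1)  dx = -dx;   // bounce off left/right edges
            if (y <= 0 || y >= height - 1) dy = -dy;   // bounce off top/bottom edges

            std::this_thread::sleep_for(std::chrono::milliseconds(50));
        }
    }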
(Score: 5, Interesting) by JoeMerchant on Sunday March 13 2022, @04:44PM (3 children)
BASIC worked for me. I spent my life savings, $750 in 1982 / 10th grade, on an Atari 800 - and the only thing available on the machine was BASIC, or 6502 assembly shoe-horned into BASIC. I did a fair amount of both until about 1987 / Junior year in college when I started using other machines more.
I dabbled a bit with Fortran and Pascal because I had classes in them, poked at Logo and other things because they looked interesting, but they weren't enough to get me away from BASIC. C moved me away from BASIC. When I started work in 1991, I tried, briefly, to use C++ for work, but it just wasn't ready for prime time yet - so we stuck with C/DOS until about 1997, when we made the jump to C++/Windows 95 - skipping Win 3.1, which wasn't much more compelling than DOS for our use cases.
40 years later, we just gave away the Raspberry Pi 4 setup my son had been using, having finally replaced it with a "real" art tablet computer similar in cost to my Atari 800, but of course 20,000x more powerful. I set him up with Krita and walked him through a couple of drawing tutorials, and he did them easily, but he still uses Tux Paint on the new machine (he installed it himself using Google), because it's what he knows.
BASIC worked for me mostly because it was the only thing available at the time and I semi-imprinted on it. C made a much stronger impression, and I still mostly use C++ today. Of course, you could do C++ on a Raspberry Pi, and the cost of entry is now down to the wages from a couple of 4-hour shifts at McDonalds (for computer, case, power supply, monitor, keyboard and mouse)... in 1984 I would have had to work 60+ shifts to earn the Atari 800. In other words, most parents can afford to gift the Pi to their kids without a second thought. The Atari 800, kitted out with a floppy drive and the various accessories I bought for it, cost more than my first used car, which I didn't get until 1985 - but apparently kids today aren't as desperate to have a car as we were, probably because they can "reach" their friends through smartphones any place, any time, whereas we barely had access to landline telephone communication.
What should "kids today" start programming in? I tried to encourage mine to try Scratch when they were about 7-10... no real interest shown. One latched on to Tux Paint, and the other You Tube. I might have done something similar, if I hadn't been stuck with BASIC.
(Score: 0) by Anonymous Coward on Sunday March 13 2022, @08:46PM (1 child)
> What should "kids today" start programming in? I tried to encourage mine to try Scratch when they were about 7-10... no real interest shown. One latched on to Tux Paint, and the other onto YouTube. I might have done something similar, if I hadn't been stuck with BASIC.
Assembly. Computers are too abstract to understand if you don't learn the low level first; you end up taking 10,000 things as a given and feeling like an idiot. Unfortunately, pedagogy takes the opposite approach and starts with the super-complex high-level stuff that drives away non-rote learners.
(Score: 2) by JoeMerchant on Monday March 14 2022, @01:53AM
>Assembly.
I sort of agree. I studied electrical engineering in University, and it took them over 3 years to get around to unraveling the one "magic" bit of computers that I never understood before school: multiplexers. I got how transistors get built up into logic gates, and how gates are made into things like CPU registers, adders, multipliers, etc. But the mystery for me was: how does the CPU address memory? In the end, it's basically a really big multiplexer. Given that, I felt like I actually understood how computers were built, how everything worked. I bet today it would have taken me less than 3 months to learn what it took University 3 years to teach me about that thing I was curious about. Not that the rest of University wasn't fun and informative along the way, just that they dribbled out some of the information painfully slowly.
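As a toy illustration of what I mean (my own C++ sketch, obviously not how real hardware is described), the address lines do nothing but select one cell out of many - a really wide multiplexer on a read, a demultiplexer on a write:

    // Toy model of memory addressing as a big multiplexer:
    // the address bits only select which one of N cells is
    // connected to the data bus for a read or a write.
    #include <array>
    #include <cstdint>
    #include <iostream>

    constexpr std::size_t kAddressBits = 4;
    constexpr std::size_t kCells = 1u << kAddressBits;   // 16 one-byte cells

    struct TinyMemory {
        std::array<uint8_t, kCells> cells{};

        // "Multiplexer": route the selected cell to the data output.
        uint8_t read(std::size_t address) const { return cells.at(address); }

        // "Demultiplexer": route the data input to the selected cell.
        void write(std::size_t address, uint8_t data) { cells.at(address) = data; }
    };

    int main() {
        TinyMemory mem;
        mem.write(0x3, 42);                      // address lines select cell 3
        std::cout << int(mem.read(0x3)) << "\n"; // same lines select it for the read: 42
    }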
So, given working hardware, you program it in assembly, assembly can make compilers and/or interpreters, which can build higher level compilers and interpreters, and eventually you end up with HTML, CSS, Javascript and what-all interpreted in your browser.
And, how interesting is assembly, really? Hopefully very interesting for a few thousand new students a year, otherwise we're all going to be screwed when the low-level magic is lost and we're all stuck doing cargo-cult construction with ultra-complex building blocks that nobody understands. If the rumors are true, that has happened in the hard drive world: middle layers of the drive controllers have become a copy-pasta fest where nobody really knows how they work anymore; they just use the old code that works.
(Score: 1, Insightful) by Anonymous Coward on Monday March 14 2022, @01:58AM
Javascript.
It has all the same advantages that BASIC once did: every computer comes with an interpreter and the necessary tools to write the code, you can see your results immediately, and it has exactly the right number of flaws - not so many that it's painful to use, but just enough to show you why you would want to use other languages.
And as a bonus, unlike BASIC, Javascript is actually used in the real world, so you're learning an actual useful skill, not just a toy language.
Stuff like Scratch will trigger the pandering reflex and cause a loss of interest. In 80s terms, every school taught Logo, and while it was actually a reasonable language, you don't hear a lot of programmers saying they developed an interest in computers because of Logo. It's always from playing games or experimenting with BASIC and assembly at home.
It's kind of a shame that there's no more bare-metal programming on computers, where you could see the whole path from the physical hardware to the code, but you can do that with an Arduino (those have about the same capability as a 1981-ish micro).
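The classic first Arduino sketch is about as close as you get these days to seeing that whole path - a few lines of C++ that put a voltage on a physical pin you can hang an LED or a meter off (LED_BUILTIN is pin 13 on most boards, adjust for yours):

    // Classic Arduino "blink": toggle the built-in LED pin once a second.
    const int kLedPin = LED_BUILTIN;   // on most boards this is pin 13

    void setup() {
        pinMode(kLedPin, OUTPUT);      // configure the pin as a digital output
    }

    void loop() {
        digitalWrite(kLedPin, HIGH);   // drive the pin high: LED on
        delay(1000);                   // wait one second
        digitalWrite(kLedPin, LOW);    // drive the pin low: LED off
        delay(1000);
    }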