posted by janrinok on Sunday March 13 2022, @02:14PM

10 years of Raspberry Pi: The $25 computer has come a long way:

This little device has revolutionized computing since it came on the scene. We take a look back at its journey.

The UK in the 1980s was ground zero for the microcomputer revolution. Cheap computers based on 8-bit processors flooded the market, teaching a generation to program using built-in BASIC interpreters. Homes had devices like Sinclair's ZX81 and Spectrum, while schools used Acorn's BBC Micro.

These weren't like today's PCs. They were designed and built to be accessible, with IO ports that could be accessed directly from the built-in programming environments. Turn one on, and you were ready to start programming.

But then things changed: 16-bit machines were more expensive, and technical and marketing failures started to remove pioneers from the market. The final nail in the coffin was the IBM PC and its myriad clones, focused on the business market and designed to run, not build, applications.

It became harder to learn computing skills, with home computers slowly replaced by gaming consoles, smartphones and tablets. How could an inquisitive child learn to code or build their own hardware?

The answer first came from the Arduino, a small AVR-based developer board that served as a target for easy-to-learn programming languages. But it wasn't a computer; you couldn't hook it up to a keyboard and screen and use it.

Eben Upton, an engineer at microcontroller chip manufacturer Broadcom, was frustrated with the status quo. Looking at the current generation of ARM-based microcontrollers he realized it was possible to use a low-cost (and relatively low power) chip to build a single-board computer. Using a system-on-a-chip architecture, you could bundle CPU and GPU and memory on a single chip. Using the SOC's general purpose IO ports, you could build it into a device that was easily expandable, booting from a simple SD storage card.

Work on what was to become the Raspberry Pi began in 2006, with a team of volunteers working with a simple ARM SOC.

Can anyone remember the first program that they actually wrote (rather than copied from a magazine or downloaded from a friend's cassette tape)? Mine simply moved an asterisk around the screen 'bouncing' off the edges, and was written in Z80 assembly language. That is all I had on my Nascom 1.


Original Submission

Related Stories

Jeff Geerling on Rumors of a Raspberry Pi IPO 9 comments

Vlogger Jeff Geerling has an analysis of rumors of a future IPO for Raspberry Pi Trading Ltd.

But long-term, will Eben's vision for what makes Raspberry Pi change? Will there be turnover, with some of the people who make the Pi a joy to use gone?

Will the software side start leaning on subscriptions to increase revenues to make shareholders happy?

And ultimately, could Eben be replaced, and would that change things? Yes, probably, but I won't speculate about any of that here. See my blog post about enshittification from last month if you wanna read more about that topic.

What I will do is answer some misconceptions I've seen about Raspberry Pi and the IPO.

The Register covered the IPO discussion the other day and while bankers have been appointed to the task, the CEO asserts that nothing will change.

"The business is in a much better place than it was last time we looked at it. We partly stopped because the markets got bad. And we partly stopped because our business became unpredictable."

"Unpredictable" is an understatement for many who attempted to acquire certain models of the computer during the supply chain crunches of recent years. "The public markets value predictability as much as they value performance," said Upton.

Previously:
(2023) Arm Acquires Minority Stake in Raspberry Pi
(2023) Eben Upton Interview on Raspberry Pi Availability Update and Painful Decisions
(2023) Raspberry Pi Produced 10 Million RP2040s in 2021, More Pi Stores Likely
(2022) 10 Years of Raspberry Pi: the $25 Computer Has Come a Long Way
(2021) Raspberry Pi Raises Price for First Time, Reintroduces 1 GB Model for $35
... and many more.


Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 0) by Anonymous Coward on Sunday March 13 2022, @02:36PM (5 children)

    by Anonymous Coward on Sunday March 13 2022, @02:36PM (#1228894)

    Some reports on the state of CS education in the UK would be nice, showing the impact of a quadrupled-in-price product whose owners are 95% already-computer-literate over-25s.

    My first self-invented code was in BASIC on an Apple ][+; it showed a rocket ship taking off. (I was 12.)

    • (Score: 4, Touché) by RedGreen on Sunday March 13 2022, @02:58PM (1 child)

      by RedGreen (888) on Sunday March 13 2022, @02:58PM (#1228899)

      You must have been living in a cave or on some far-off island. The world today does not work on facts but on hype, where you tout the greatness of your project and of its leader. Facts are an old-fashioned notion; it is the vision you have that is important to communicate to the masses, and how your product is the best thing since sliced bread...

      --
      "I modded down, down, down, and the flames went higher." -- Sven Olsen
      • (Score: 0) by Anonymous Coward on Tuesday March 15 2022, @06:51PM

        by Anonymous Coward on Tuesday March 15 2022, @06:51PM (#1229408)

        The world has always worked like this.

    • (Score: 5, Informative) by kazzie on Sunday March 13 2022, @05:28PM (1 child)

      by kazzie (5309) Subscriber Badge on Sunday March 13 2022, @05:28PM (#1228935)

      Quick summary of CS education in the UK over the decades:

      In the 80s, the focus had been on introducing the population as a whole to the idea of computers. The BBC Micro was originally commissioned for a TV series introducing (mainly) adults to computing, and then got adopted into the government-funded Computers for Schools scheme, where loads of kids got to grips with them. As with many early microcomputers, there was initially little existing software, and because it booted straight into a BASIC prompt (and it came with a decent manual), loads of enterprising teachers, and older pupils, got a hands-on introduction to computer programming.

      (I've got a 1960s A-Level maths textbook with sections on computer programming in FORTRAN and COBOL, despite the fact that hardly any school would have access to a computer back then!)

      Come the 90s, machines were booting into GUIs and well-established firms were making software. While computers weren't as scary, neither did they offer obvious encouragement to compose your own code, other than in a specific environment (such as LOGO). There was good backward compatibility with BBC software thanks to the widespread use of Acorn Archimedes machines in schools, but the growing dominance of the IBM-compatible resulted in a shift away come the mid-90s. In the meantime, the educational focus had shifted firmly toward educating kids in how to use computer software, rather than about computing. The World-Wide-Web was the new thing, and there was some teaching focus on HTML and creating web pages, but that's as close to a programming language as most kids got. (I went through six years of high school IT/ICT lessons, including some electives, waiting for it to turn into a proper computers course, but it never did.)

      The Raspberry Pi, as TFA states, first landed during a wave of reverting to computing as a curriculum subject. (All these things come in waves in education: the old way is usurped by a new way of doing things, then the new way gives way to the new-new (old) way again.) In my time in high-school education, circa 2012-2017, some schools had a few Raspberry Pis, but there was limited engagement from the kids. The computer club I helped run had far more kids interested in putting programs together in Scratch, or building and programming Lego EV3 robots for an inter-school competition. The Pi's GPIO possibilities weren't something we really explored in-depth, partially due to a lack of time on my part.

      In higher education, I know of one university that's used large numbers of Raspberry Pis in order to teach hands-on courses on computer networking, security, and penetration testing, etc. The small size and affordability of these machines is what makes it feasible for groups of students to set up, configure, and attack their own networks without needing acres of desk space.

      To me, that's the key feature of the Raspberry Pi from an individual-educational point of view. Because they're so small and cheap (yet fully functional), they can be given to a kid to play around with without much risk. The kid/teenager can have a go at doing whatever with it, in their own time. In my time, I was gifted an old 486 for a birthday, and the simple fact that it wasn't the family computer, and nobody else would be waiting to use it, meant that I could experiment with the hardware and software much more extensively. It didn't matter if I hosed the hard drive, or if it took me a week to learn how to fix it, because my parents wouldn't be breathing down my neck waiting to type a letter in the meantime. These days there's no need for a big beige box plus CRT to do this, thanks to the likes of the Pi.

      • (Score: 1) by anubi on Monday March 14 2022, @07:56AM

        by anubi (2828) on Monday March 14 2022, @07:56AM (#1229030) Journal

        Is there any interest in Arduinos in the USA?

        Seems I am in love with the things, but I have yet to meet anyone who has even heard of them.

        I love their simplicity, adaptability, reliability, and how easy they are to build and program. Cheap to make too.

        Southern California too, no less.

        --
        "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
    • (Score: 0) by Anonymous Coward on Sunday March 13 2022, @06:04PM

      by Anonymous Coward on Sunday March 13 2022, @06:04PM (#1228940)

      https://www.raspberrypi.org/blog/ [raspberrypi.org]

      They did that already and are on to doing other shit now, like moving those UK jobs to India and Africa.

      It's not a quadrupled-in-price product either. It's just over doubled, and only if you want more RAM. An industry-wide chip shortage that affects everyone does not count.

  • (Score: 4, Informative) by KritonK on Sunday March 13 2022, @03:32PM

    by KritonK (465) on Sunday March 13 2022, @03:32PM (#1228902)

    My first program was written in FORTRAN (no, not Fortran!) on a mainframe, as an assignment for a computer programming course. It was to compute the day of the week corresponding to a given date, using Zeller's congruence [wikipedia.org].

    At that time, we had not yet been taught about IF statements, but the way the assignment had been phrased, it would have been much easier to implement it using such statements. I had to figure out how to calculate the formula using only integer arithmetic, just as it is given in the Wikipedia article.
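
    For the curious, a minimal Python sketch of that integer-only approach (the original assignment was FORTRAN on a mainframe; the function name and test date here are just for illustration):

    def day_of_week(day, month, year):
        # Zeller's congruence, Gregorian form: 0 = Saturday, 1 = Sunday, ..., 6 = Friday.
        # January and February count as months 13 and 14 of the previous year;
        # the shift below makes that adjustment with integer arithmetic, no IF needed.
        shift = (14 - month) // 12            # 1 for Jan/Feb, 0 otherwise
        y = year - shift
        m = month + 12 * shift
        return (day + (13 * (m + 1)) // 5 + y + y // 4 - y // 100 + y // 400) % 7

    print(day_of_week(13, 3, 2022))           # 1 -> Sunday, the date this story ran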

    This was the first time I ever used a computer. Needless to say, I got hooked, and have been working with computers ever since.

  • (Score: 2) by looorg on Sunday March 13 2022, @04:09PM (15 children)

    by looorg (578) on Sunday March 13 2022, @04:09PM (#1228911)

    I like how these stories, or journalists, always believe that it was BASIC that somehow did it. While you might start with your classic "hello world" in BASIC and some other stuff, you fairly quickly run into the limitations and realize how bad, or at least how limiting, it is. Not to mention how annoying it is to have to use actual line numbers, and what stupid things you resort to because of them. All BASIC really did was infuse you with the knowledge that you needed to learn assembly. That is the lesson of BASIC: to learn something else and better. Perhaps not actually better, but let's say faster. Because back in the day speed mattered; there were no extra system resources just milling about that could be assigned easily. I'm not sure what the modern equivalent would be. It probably isn't C anymore, but something like C# or Java.

    I really like it when you find some thousands-of-lines BASIC program and think about how somebody actually wrote this and thought it was a good idea. Hopefully they at least wrote some machine-code routines that they call, because otherwise it is going to be horrific.

    First program I recall writing in BASIC that wasn't from a book, manual or whatnot: having read several, and looked at little programs, I tried to write a little game that moved a few PETSCII characters around to fight. Clearly heavily influenced by things I had seen. I seem to recall it was something with ninjas and samurai, or something 80s-cool like that. But it eventually got very bogged down by the horrors of BASIC. It was very slow. I think the main lesson learned was better pre-planning, and to increment the line numbers a lot more (not the best solution!). There is always more you have to get in there, and using GOTO and a line number is not good programming.

    • (Score: 5, Interesting) by JoeMerchant on Sunday March 13 2022, @04:44PM (3 children)

      by JoeMerchant (3937) on Sunday March 13 2022, @04:44PM (#1228917)

      BASIC worked for me. I spent my life savings, $750 in 1982 / 10th grade, on an Atari 800 - and the only thing available on the machine was BASIC, or 6502 assembly shoe-horned into BASIC. I did a fair amount of both until about 1987 / Junior year in college when I started using other machines more.

      I dabbled a bit with Fortran and Pascal because I had classes in them, poked at Logo and other things because they looked interesting, but they weren't enough to get me away from BASIC. C moved me away from BASIC. When I started work in 1991, I tried, briefly, to use C++ for work but it just wasn't ready for prime time yet - so we stuck with C/DOS until about 1997 when we made the jump to C++/Windows 95 - skipping Win 3.1 which wasn't much more compelling than DOS for our use cases.

      40 years later, we just gave away a Raspberry Pi 4 setup my son had been using, finally replaced it with a "real" art tablet computer similar in cost to my Atari 800, but of course 20,000x more powerful. I set him up with Krita and walked him through a couple of drawings, tutorials, and he did them easily, but he still uses Tux Paint on the new machine (installed it himself using Google), because it's what he knows.

      BASIC worked for me mostly because it was the only thing available at the time and I semi-imprinted on it. C made a much stronger impression, and I still mostly use C++ today. Of course, you could do C++ on a Raspberry Pi, and the cost of entry is now down to the wages from a couple of 4 hour shifts at McDonalds (for computer, case, ps, monitor, keyboard and mouse)... in 1984 I would have had to work 60+ shifts to earn the Atari 800. In other words, most parents can afford to gift the Pi to their kids without a second thought. The Atari 800, kitted out with a floppy drive and various accessories I bought for it cost more than my first used car which I didn't get until 1985 - but apparently kids today aren't as desperate to have a car as we were, probably because they can "reach" their friends through smartphones any place any time, whereas we barely had access to landline telephone communication.

      What should "kids today" start programming in? I tried to encourage mine to try Scratch when they were about 7-10... no real interest shown. One latched on to Tux Paint, and the other You Tube. I might have done something similar, if I hadn't been stuck with BASIC.

      --
      🌻🌻 [google.com]
      • (Score: 0) by Anonymous Coward on Sunday March 13 2022, @08:46PM (1 child)

        by Anonymous Coward on Sunday March 13 2022, @08:46PM (#1228960)

        > What should "kids today" start programming in? I tried to encourage mine to try Scratch when they were about 7-10... no real interest shown. One latched on to Tux Paint, and the other You Tube. I might have done something similar, if I hadn't been stuck with BASIC.

        Assembly. Computers are too abstract and hard to understand without taking 10,000 things as a given and feeling like an idiot if you don't learn the low level first. Unfortunately, pedagogy takes the opposite approach and starts with the super-complex high-level stuff that drives away non-rote learners.

        • (Score: 2) by JoeMerchant on Monday March 14 2022, @01:53AM

          by JoeMerchant (3937) on Monday March 14 2022, @01:53AM (#1228996)

          >Assembly.

          I sort of agree. I studied electrical engineering in University, and it took them over 3 years to get around to unraveling the one "magic" bit of computers that I never understood before school: multiplexers. I got how transistors get built up into logic gates, and how gates are made into things like CPU registers, adders, multipliers, etc. But the mystery for me was: how does the CPU address memory? In the end, it's basically a really big multiplexer. Given that, I felt like I actually understood how computers were built, how everything worked. I bet today it would have taken me less than 3 months to learn what it took University 3 years to teach me about that thing I was curious about. Not that the rest of University wasn't fun and informative along the way, just that they dribbled out some of the information painfully slowly.

          So, given working hardware, you program it in assembly, assembly can make compilers and/or interpreters, which can build higher level compilers and interpreters, and eventually you end up with HTML, CSS, Javascript and what-all interpreted in your browser.

          And, how interesting is assembly, really? Hopefully very interesting for a few thousand new students a year, otherwise we're all going to be screwed when the low level magic is lost and we're all stuck doing cargo-cult construction with ultra-complex building blocks that nobody understands. If the rumors are true, that has happened in the hard drive world: middle layers of the drive controllers have become a copy-pasta fest where nobody really knows how they work anymore, they just use the old code that works.

          --
          🌻🌻 [google.com]
      • (Score: 1, Insightful) by Anonymous Coward on Monday March 14 2022, @01:58AM

        by Anonymous Coward on Monday March 14 2022, @01:58AM (#1228997)

        What should "kids today" start programming in?

        Javascript.

        It has all the same advantages that BASIC once did: every computer comes with an interpreter and the necessary tools to write the code, you can see your results immediately, and it has exactly the right amount of flaws. Not so many that it's painful to use, but just enough to show you why you would want to use other languages.

        And as a bonus, unlike BASIC, Javascript is actually used in the real world, so you're learning an actual useful skill, not just a toy language.

        Stuff like Scratch will trigger the pandering reflex and cause a loss of interest. In 80s terms, every school taught Logo, and while it was actually a reasonable language, you don't hear a lot of programmers saying they developed an interest in computers because of Logo. It's always from playing games or experimenting with BASIC and assembly at home.

        It's kind of a shame that there's no more bare-metal programming on computers, where you could see the whole path from the physical hardware to the code, but you can do that with an Arduino (those have about the same capability as a 1981-ish micro).

    • (Score: 0) by Anonymous Coward on Sunday March 13 2022, @05:04PM

      by Anonymous Coward on Sunday March 13 2022, @05:04PM (#1228926)

      Here you go: https://taipangame.com/BASIC.txt [taipangame.com]

    • (Score: 3, Interesting) by Snotnose on Sunday March 13 2022, @05:18PM (1 child)

      by Snotnose (1623) on Sunday March 13 2022, @05:18PM (#1228934)

      I didn't have a problem with BASIC, probably because I didn't know any better, until I wrote Conway's Game of Life. I couldn't figure out why it wasn't working: everything looked fine, but the damned screen never updated.

      One day I made lunch while it was running; when I came back, the screen had changed. Turned out each generation was taking several minutes.

      That's when I learned Z-80 assembly and never looked back.

      --
      Bad decisions, great stories
      • (Score: 3, Interesting) by JoeMerchant on Sunday March 13 2022, @06:07PM

        by JoeMerchant (3937) on Sunday March 13 2022, @06:07PM (#1228941)

        On the 1980s Atari (6502) platform, BASIC was... BASIC, but you had a somewhat easy entry to doing short bursts of 6502 assembly, and reasonable documentation of the hardware like the graphics and sound chips, so it was very possible to do clever things with a handful of bytes of 6502 assembly coded into a BASIC program for stuff BASIC never had a hope of touching.

        Some of the more common tricks were vertical blank interrupt routines to play music (primitive MIDI-like music) in the background, and horizontal interrupt handlers to change graphics registers as the screen drew, so you could show one set of 16 colors above the interrupt line and a different set of 16 colors below it. You could do this on dozens to hundreds of horizontal interrupts, so rainbow demonstrations were quite common.

        I took a widely available (and horribly slow) BBS software written in BASIC and identified a couple of major bottlenecks, addressed them with assembly routines, and the whole package ran hundreds of times faster. It wasn't a situation of low hanging fruit, it was more a jungle begging for somebody with a machete to clear a path.

        --
        🌻🌻 [google.com]
    • (Score: 0) by Anonymous Coward on Sunday March 13 2022, @05:58PM

      by Anonymous Coward on Sunday March 13 2022, @05:58PM (#1228939)

      > "hello world" in BASIC

      "Hello World" was a C thing, and latter-1980s experienced programmers were ones using it. Into the 1990s it started being used in tutorials and retroactively claimed to have been The Universal First Program.

      Early 1980s was more like:
      10 INPUT "PLEASE TYPE IN YOUR NAME"; A$
      20 PRINT "HELLO, "; A$

      Me in Year 7/first year of high school once typed that in.

      Scott, in Year 9, sitting at the Apple][ next to mine 'pirated' it :), doing the following modification:
      20 PRINT "HELLO, SIR "; A$

      Brian, in Year 11, came up to Scott's computer and typed in "CUMSIZE"...

      Steven Levy's "Hackers" account of what MIT undergrads were doing reminded me so much of the community we had at school.

    • (Score: 1, Funny) by Anonymous Coward on Sunday March 13 2022, @08:48PM (1 child)

      by Anonymous Coward on Sunday March 13 2022, @08:48PM (#1228961)

      >> While you might start with your classic "hello world" in BASIC and some other stuff you fairly quickly run into the limitations and realize how bad this is or how limiting it is.

      Not true... Bill Gates wrote a whole operating system in BASIC. The only reason they stopped using it sometime around 1995 is that they wanted to add some telemetry libraries that were only available in C++

    • (Score: 3, Interesting) by theluggage on Monday March 14 2022, @01:59PM

      by theluggage (1797) on Monday March 14 2022, @01:59PM (#1229056)

      I like how these stories, or journalists, always believe that it was BASIC that somehow did it.

      Try using a compiled language like C, Pascal or Fortran on a £70 ZX-81 with 1K of RAM, an 8K EPROM and a slightly wonky domestic cassette recorder... Even using an assembler was a bit of a pain.

      The big revolution was the availability of really cheap computers that most individuals could actually afford (or that schools could buy in quantity).
      However, BASIC was an important part of that, since it provided an easy route into programming without having to jump straight into the deep end of assembly language.

      Also, in the case of UK education (...and the slightly better heeled consumer), the BASIC in question was often BBC BASIC [wikipedia.org] which was considerably more sophisticated than the Microsoft/Apple BASIC of the day, with long variable names, named procedures and functions (no need for GOSUB line_number), repeat/until, powerful byte, word and string indirection operators (instead of the usual PEEK/POKE) and a built in assembler. It was also considerably faster than contemporary BASIC (...it was written by one of the designers of the ARM processor, who really knew how to make a 6502 sit up and beg...) If you needed GOTO in BBC BASIC 1 it was because you needed over-long chunks of conditional code that you couldn't be arsed to break down into functions/procedures, and by BBC BASIC 5 on the Archimedes you had multi-line IF/THEN/ELSE/ENDIF and CASE statements for that (and a full-screen editor that hid line numbers).

      Also, in the UK, a BBC Micro was about half the price of an Apple II (...and in many respects rather more powerful).

      I think the main lesson learned was better pre-planning and increment the line number a lot more

      RENUMBER 1000,10 (BBC BASIC ~ 1981 and pretty much every 'toolkit' utility or extended BASIC for anything else...)

      and using GOTO and a line number is not good programming.

      Compared to what? Lovingly hand-crafted machine code... or the Pascal compiler that you can run after buying a floppy drive that cost 5x as much as your Sinclair?

      The other language that ran well on 8-bit micros of that era was FORTH which ran at near assembly speeds and technically had high-level/structured code - but was also the source of the term "write-only language" (reverse-Polish yay!).

    • (Score: 3, Interesting) by Anonymous Coward on Monday March 14 2022, @06:26PM (3 children)

      by Anonymous Coward on Monday March 14 2022, @06:26PM (#1229115)

      There is always more you have to get in there and using GOTO and a line number is not good programming.

      There has never been a compelling case for this made. Very quickly after that Dijkstra ACM letter [arizona.edu], this idea was picked up by people because it allowed one to separate the masses into the "good" programmers and the "bad" programmers based upon whether they used it or not. This quickly became unquestionable dogma in academic circles, largely because Dijkstra had said it was so! GOTO is a programming tool, and like most parts of a language, there are better ways to use it than others. Frank Rubin had written a letter about this [archive.org], which led Dijkstra to finally respond in a very condescending and pithy way [utexas.edu], not in defense of his arguments (he explicitly expressed frustration that those in his camp had failed to defend his position adequately), but in personal attacks on Rubin: "By my standards, a competent professional programmer in 1987 should know . . ." He criticizes Rubin's counter-argument by saying that if he were a competent professional programmer, he should not only know the theorem of the bounded linear search, but should be able to derive its proof. In other words, "I am a competent professional programmer, and if you don't do it my way, you are NOT a competent professional programmer." Never mind that the "next" in a FOR-NEXT loop is a GOTO statement, the "break" statement is a GOTO statement, etc. We need this kind of redirect statement, but as long as we name it something else, we can worship at the altar of the competent professional programmer (at least BASIC loops were honest about it). If you have a situation where an efficacious GOTO would suffice, one is compelled to completely change the logic (while deriving a mathematical theorem here and there) to maintain an air of academic purity.
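
      As a concrete illustration of the "break is a GOTO" point (a hedged sketch in Python, not taken from any of the letters cited): the same bounded linear search written once with break, a disciplined jump out of the loop, and once with the flag variable that a strict no-jumps style forces on you.

      def first_negative_with_break(xs):
          # Bounded linear search: index of the first negative value, or len(xs) if none.
          found = len(xs)
          for i, x in enumerate(xs):
              if x < 0:
                  found = i
                  break                     # a jump out of the loop, i.e. a disciplined GOTO
          return found

      def first_negative_with_flag(xs):
          # The same search with no jumps at all, only a flag threaded through the loop.
          found = len(xs)
          searching = True
          i = 0
          while searching and i < len(xs):
              if xs[i] < 0:
                  found = i
                  searching = False
              else:
                  i += 1
          return found

      print(first_negative_with_break([3, 1, -2, 5]))   # 2
      print(first_negative_with_flag([3, 1, -2, 5]))    # 2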

      Those views quickly caught on in the academic world in the 70s and 80s, I think, because computer programming had become very accessible, and BASIC very easy to program. It became a very easy-to-apply metric to separate people and languages into castes. "Not only do BASIC programmers use GOTOs, I also heard that they eat their toast with the butter side down!" I remember well being shown examples of horrendous spaghetti code and being told the cautionary tales that this is where one's code will end up if one heads down that path. "This is your brain on GOTO." As a scientist in training, I was never too captivated by those sorts of computer-science academic arguments; I needed code that worked and gave me the correct answer. I didn't give a rat's ass whether using recursive functions made my code more elegant unless I needed to speed things up. We were ten years past the punch card, too far along for me to care about issues like that. To me it was no different than being considered uncouth because I didn't know which fork I was supposed to pick up first at a fancy restaurant.

      I was once a FORTRAN wizard and have written many thousands of lines in that language, and though FORTRAN provided a GOTO, I had never had a need to use one. To give an example of the depths these ideas reached: just within the last month, I overheard a young acquaintance talking to a young colleague of mine who maintains some scientific FORTRAN code, expressing pity that he had to work with that old FORTRAN code because it must be hard, since "FORTRAN requires you to use GOTOs."

      • (Score: 2) by looorg on Tuesday March 15 2022, @01:22AM

        by looorg (578) on Tuesday March 15 2022, @01:22AM (#1229204)

        That was not the issue really. The issue is/was that if you fill a program with GOTO statements, because you run out of line numbers or something similar, it doesn't really become very readable or good code as you keep jumping back and forth. It has nothing to do with Dijkstra or academic coding practices. I wasn't old enough for that or to even know who Dijkstra was at the time. My conclusion was based solely on the fact that all these GOTO lines and all that jumping back and forth weren't really doing wonders for the code or its readability. It didn't exactly do wonders for the performance either.

      • (Score: 2) by hendrikboom on Tuesday March 15 2022, @01:38AM

        by hendrikboom (1125) on Tuesday March 15 2022, @01:38AM (#1229207) Homepage Journal

        That third letter is marvelous. Dijkstra criticizes several of the programs as being wrong. The only one in which he finds no error is the first one, which was written explicitly with go-to's, and made no attempts to avoid them.

      • (Score: 2) by hendrikboom on Tuesday March 15 2022, @02:09AM

        by hendrikboom (1125) on Tuesday March 15 2022, @02:09AM (#1229215) Homepage Journal

        Years after this controversy, I had written a program (in Turbo Pascal, by the way) that had a search process composed of two recursive searches through a complicated data structure. The inner recursion was used every time the outer recursion had a candidate, to tell whether it had found what it was looking for.
        Each recursion nest would signal it had found what it was looking for by a go to out of the entire mess. The inner recursion, on success, would thus jump into a success: label within the outer recursion, and the outer recursion would jump out of the entire search altogether.
        At one point I converted the entire thing to Modula 3, which did not have go to statements.
        Lots of Boolean flags passed around.
        It became a model of unclarity.
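
        A hedged sketch of that shape in Python, using an exception as the non-local exit where the Turbo Pascal original used a goto (the tree structure and the search predicate here are invented for illustration):

        class Found(Exception):
            # Raised to jump straight out of both nested recursions on success.
            def __init__(self, node):
                self.node = node

        def inner_check(node, target):
            # Inner recursion: decide whether this candidate subtree holds the target.
            if node["value"] == target:
                raise Found(node)               # the "go to out of the entire mess"
            for child in node["children"]:
                inner_check(child, target)

        def outer_search(node, target):
            # Outer recursion: walk the structure, handing each candidate to the inner check.
            inner_check(node, target)
            for child in node["children"]:
                outer_search(child, target)

        def search(tree, target):
            try:
                outer_search(tree, target)
            except Found as hit:                # success path, no Boolean flags threaded through
                return hit.node
            return None                         # both recursions exhausted without a hit

        tree = {"value": 1, "children": [
            {"value": 2, "children": []},
            {"value": 3, "children": [{"value": 4, "children": []}]},
        ]}
        print(search(tree, 4)["value"])         # 4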

  • (Score: 2) by kazzie on Sunday March 13 2022, @04:48PM (1 child)

    by kazzie (5309) Subscriber Badge on Sunday March 13 2022, @04:48PM (#1228919)

    I saw the tail-end of the BBC Micros as I started school, and used the Acorn Archimedes machines until they were usurped by Windows 95 PCs. No computer at home until 1997.

    While we didn't do any programming in BASIC or similar, we did a lot of procedural thinking, etc., plotting vectors with LOGO. The first program I was dead-proud of was when (following writing routines for drawing squares, hexagons, etc.) I worked out how to draw a circle, as REPEAT 360 [FD 1, RT 1]. Unfortunately there was a schedule for when everyone got time to try their programs on the class computer, and I'd had my stroke of genius a few days after my slot. I handed my program to a friend to try. It worked, and I got a 'well done' from the teacher, but also a telling off for jumping the queue.

    (I was then set the task of optimising the program, because it ran so slowly. The solution: polygonise it to something like REPEAT 36 [FD 10, RT 10]. It still looked round on a 90's CRT, but it executed much quicker.)
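
    For anyone who never met LOGO: roughly the same two programs in Python's turtle module, which imitates it (the figures are the ones from the anecdote above):

    import turtle

    t = turtle.Turtle()

    # REPEAT 360 [FD 1 RT 1]: a 360-sided polygon that looks like a circle
    for _ in range(360):
        t.forward(1)
        t.right(1)

    # The optimised version: 36 sides of length 10 still looks round on screen
    # but needs a tenth of the pen movements.
    t.penup(); t.goto(120, 0); t.pendown()    # step aside so the two shapes don't overlap
    for _ in range(36):
        t.forward(10)
        t.right(10)

    turtle.done()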

    • (Score: 5, Interesting) by kazzie on Sunday March 13 2022, @04:59PM

      by kazzie (5309) Subscriber Badge on Sunday March 13 2022, @04:59PM (#1228924)

      I'll follow that up with another anecdote, of when I programmed an 80's micro in secondary school c. 2001. My school was celebrating its 50th anniversary, and holding a special open-evening-style event to celebrate. A teacher from the science department approached me (as the resident computer geek) with the school's very first computer, a Sinclair ZX80. (The only reason it was still there was because it had been owned by the science department - there was no computing department yet - and hidden at the back of a storeroom.)

      We still had some CRT TVs around to link it up to, so that wasn't a problem. The 1K of built-in memory was a bit more of a challenge, though. I ended up writing a ballistics program where you input a force and angle for a projectile. The program plotted a blocky track of the projectile's path, then reported the distance covered and maximum altitude. It was written in BASIC, because even though I had a manual, I wasn't in a position to learn Z80 assembler in the week or two I had before the event.

      My abiding memory of the whole experience was how horrid the Sinclair membrane keyboards were. Fine if you'd never used anything better, I suppose, but by then I definitely had.
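
      A rough sketch of that sort of ballistics program in Python, for anyone curious what it involved (the original fitted into 1K of ZX80 BASIC; the step size and plot dimensions here are just illustrative):

      import math

      speed = float(input("Launch speed (m/s)? "))
      angle = math.radians(float(input("Angle (degrees)? ")))
      g, dt = 9.81, 0.05

      # Step the projectile forward until it comes back to the ground.
      x = y = 0.0
      vx, vy = speed * math.cos(angle), speed * math.sin(angle)
      track = []
      while y >= 0.0:
          track.append((x, y))
          x += vx * dt
          y += vy * dt
          vy -= g * dt

      max_x = max(px for px, _ in track) or 1.0
      max_y = max(py for _, py in track) or 1.0

      # Blocky character plot, roughly what a ZX80 could manage.
      cols, rows = 60, 15
      grid = [[" "] * cols for _ in range(rows + 1)]
      for px, py in track:
          c = min(int(px / max_x * (cols - 1)), cols - 1)
          r = min(int(py / max_y * rows), rows)
          grid[rows - r][c] = "#"
      for row in grid:
          print("".join(row))

      print(f"Distance covered: {track[-1][0]:.1f} m, maximum altitude: {max_y:.1f} m")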

  • (Score: 0) by Anonymous Coward on Sunday March 13 2022, @06:45PM (4 children)

    by Anonymous Coward on Sunday March 13 2022, @06:45PM (#1228945)

    1980s BASIC is similar in some ways to programming in present-day BASH: very slow execution speed; hard to write clean, modular code; strings and ints for just about all variables; and it usually has to integrate with software written in a more powerful language to do the heavy lifting. Still, BASIC back then could do graphics and sound, something BASH would struggle with.

    • (Score: 0) by Anonymous Coward on Sunday March 13 2022, @06:57PM

      by Anonymous Coward on Sunday March 13 2022, @06:57PM (#1228947)

      Commodore VIC-20 and C64 owners would disagree about the sound and graphics... (everything was done with PEEKs and POKEs on the stock systems)

    • (Score: 0) by Anonymous Coward on Sunday March 13 2022, @09:44PM (1 child)

      by Anonymous Coward on Sunday March 13 2022, @09:44PM (#1228969)

      And that's why people with any brains used Borland Turbo Basic (I already had Turbo C, so switching environments was dirt simple). For a $100 BASIC compiler, I sold my first BASIC program for $1,500, distributed as an .exe, so good luck modifying the source code to remove the customizations for the individual customer.

      With an ordinary DOS/BASIC program, you're giving away the source, the gift that could potentially keep on giving you $$$. And people who pirated the program but needed to change the customizations to remove the original customer's name (stored encrypted) are great advertising.

      Stallman was full of shit when he said closed source was immoral. An exchange that benefits both sides and is freely agreed upon is no more immoral than paying money for a dozen eggs. Want your eggs for free? Get your own damn chicken.

      Paying for the tools gives you confidence in being in the right to charge for your work product. Same as a carpenter or mechanic who invests in their tools, or a chef in their kitchen.

      If our food supply was based on the open source model (a few people produce the food for free, and everyone freeloads), we'd all be starving.

      • (Score: 2) by looorg on Monday March 14 2022, @02:02PM

        by looorg (578) on Monday March 14 2022, @02:02PM (#1229058)

        It took a bit for compilers for BASIC to come about. But yes, they had the added benefit of obfuscating the code and turning it into an executable. Doing so also tended to speed it up somewhat, since you didn't have to interpret every line in realtime. Still not as fast as hand-coded machine code or assembly tho, but a lot faster than plain interpreted BASIC. Borland Turbo Basic is very late 80's tho, and I think most of us here were talking about the early 80's or even earlier than that, so it doesn't really compare. All the Borland products looked the same as I recall, whether it was C or PASCAL or whatnot; I don't think I ever saw Borland Turbo Basic.

        Along with compilers, tho, you also tend to have decompilers. So it's not like you are safe in that regard; or did you incorporate some devilish DRM within your EXE? If not, that thing was quite open for editing with a simple hex editor or machine code monitor if you wanted to.

    • (Score: 0) by Anonymous Coward on Monday March 14 2022, @06:44AM

      by Anonymous Coward on Monday March 14 2022, @06:44AM (#1229029)

      80s basic didn’t need to fork off a process per command like shell.
      Sometimes I start an emulator to hack up a simple program to calculate something.

  • (Score: -1, Offtopic) by Anonymous Coward on Sunday March 13 2022, @07:07PM

    by Anonymous Coward on Sunday March 13 2022, @07:07PM (#1228950)

    A smartphone SOC is anything but simple. The Pi is a big black box that few people understand completely. But that's the world we live in: don't actually TRY to understand the hardware, or the layers of software on it, just use it.

  • (Score: -1, Flamebait) by Anonymous Coward on Sunday March 13 2022, @07:28PM (2 children)

    by Anonymous Coward on Sunday March 13 2022, @07:28PM (#1228953)

    Is the GPU still closed? Is documentation freely available?

    • (Score: 0) by Anonymous Coward on Sunday March 13 2022, @09:40PM

      by Anonymous Coward on Sunday March 13 2022, @09:40PM (#1228967)

      Silly consumer. You don't need to see documentation. Just consume.

    • (Score: 0) by Anonymous Coward on Sunday March 13 2022, @11:01PM

      by Anonymous Coward on Sunday March 13 2022, @11:01PM (#1228976)

      Sort of. While not a totally open platform, some fully open source firmware does exist. Unfortunately the project for building the open firmware was dead for three years so it's a little behind.

      https://github.com/librerpi/rpi-open-firmware [github.com]
      https://github.com/Yours3lf/rpi-vk-driver [github.com]

      Note that the PC also requires at least three mystery blobs and typically about five, some of which you don't even get to choose from, much less see the source of.

  • (Score: 2, Interesting) by pTamok on Sunday March 13 2022, @09:54PM

    by pTamok (3042) on Sunday March 13 2022, @09:54PM (#1228973)

    ...was written on a paper coding form and mailed off to the county computing facility where it was retyped into the mainframe for execution. You got the results a week later, if the poor secretary doing the retyping hadn't made any mistakes. I guess the first one was something like putting two numbers into variables and printing their sum.

    The following year, the school got a teletype where you could prepare your programs offline by using the teletype to punch them to paper tape. We would dial up with a 300 baud audio-coupler and upload programs *in real time*. As the phone calls were not free, time was limited.

    A year later, we could actually use interactive sessions. The school computer club did things like playing Star Trek [wikipedia.org] on the teletype. Used a lot of fanfold paper.

  • (Score: 0) by Anonymous Coward on Sunday March 13 2022, @10:57PM (2 children)

    by Anonymous Coward on Sunday March 13 2022, @10:57PM (#1228975)

    But: Has anyone found Raspis available for purchase in recent months, not at a multiple of the $25 or $30 they ought to cost?

    • (Score: 0) by Anonymous Coward on Monday March 14 2022, @12:49AM

      by Anonymous Coward on Monday March 14 2022, @12:49AM (#1228985)

      Someone's got to pay for Eben's hookers and blow.

    • (Score: 2) by hendrikboom on Tuesday March 15 2022, @01:45AM

      by hendrikboom (1125) on Tuesday March 15 2022, @01:45AM (#1229210) Homepage Journal

      Sounds like the usual pandemic supply-chain breakage.

  • (Score: 3, Interesting) by Rich on Monday March 14 2022, @01:06AM

    by Rich (945) on Monday March 14 2022, @01:06AM (#1228987) Journal

    Before I had an Apple II, there was a Sinclair ZX80, and before that, a Casio FX-502P calculator.

    The first program on the Apple I remember was a primitive one-ship, one-alien variant of Space Invaders. I don't remember if I got anything meaningful out of the ZX80, but the Casio taught me to write compact code. I eventually wrote a one-armed-bandit program for it, using the degree°minute°second display for the wheels. While that might have been well later than when I got the Apple, I wonder today if I still could pull that off in the 256 program steps it had.

    I remember the last Basic program I wrote on an Apple //c, ca. 1990, which was called "Tape Mix Arranger". I had a Mac Plus by then, and a IIgs, and high-level languages, but I wanted quick results (much as you'd use Python today) and just hacked it down in Applesoft. It would take a list of songs with their runtimes, show the total (so a tape side was perfectly filled), then you could tap the BPM for the songs, and it would sort the songs by BPM. The timing for the latter part might have used assembly. I'm pretty certain that the sort was either Bubblesort or Shellsort.
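
    A hedged Python sketch of that idea, since the Applesoft original is long gone (the song list, runtimes and BPM figures are made up for illustration):

    songs = [
        # (title, runtime in seconds, beats per minute)
        ("Track A", 223, 118),
        ("Track B", 251, 102),
        ("Track C", 198, 126),
        ("Track D", 240, 110),
    ]

    side_length = 45 * 60   # one side of a C90 cassette, in seconds
    total = sum(runtime for _, runtime, _ in songs)
    print(f"Total: {total // 60}:{total % 60:02d} of {side_length // 60}:00 per side")

    # Sort by tempo so the mix builds up; the original used Bubblesort or Shellsort,
    # Python's built-in sort does the same job here.
    for title, runtime, bpm in sorted(songs, key=lambda s: s[2]):
        print(f"{bpm:3d} BPM  {runtime // 60}:{runtime % 60:02d}  {title}")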

    My Raspis (3 and 4) are probably used like most of them: one sits behind the DSL router to serve a tiny bit of web (having replaced a VIA C3 after 18 years to reduce the electricity bill), and the other waits for something useful to do. But there's a Casio FX-502P on my desk right now (although I'm mostly an RPN convert by now).

  • (Score: 2) by hendrikboom on Tuesday March 15 2022, @02:02AM

    by hendrikboom (1125) on Tuesday March 15 2022, @02:02AM (#1229213) Homepage Journal

    The original BASIC was designed for interactive educational use on a time-shared computer with many terminals.
    So editing, compiling, and running had to be fast.
    They didn't want the slowdowns that resulted from interpretation.
    So they designed a system where every line could be compiled by itself and they would all work together harmoniously.
    Each line was compiled when it was entered into the program.
    Many of the constraints in the original language were there to permit this line-by-line independence -- the limited number of variables (with one-letter names, if I recall correctly) were stored in statically allocated space.
    They were doing this style of compilation as a research project. To keep things simple enough to accomplish it in reasonable time their main design criterion was that the language should be well behind the current state of the art in language design.
    It compiled.
    It ran fast.
    Success!

    And you see why the language got used for serious work on personal computers only after being reimplemented interpretively with a lot of language improvements.
