

posted by Fnord666 on Tuesday November 12 2019, @12:07PM
from the insights-into-education dept.

[UPDATE 20191112_223013 UTC: Per original author's request, I hereby note this is an edited excerpt and not an exact quote from the blog post linked below. --martyb]

Submitted via IRC for Bytram

Three of the Hundred Falsehoods CS Students Believe

Jan Schauma recently posted a list of one hundred Falsehoods CS Students (Still) Believe Upon Graduating. There is much good fun here, especially for a prof who tries to help CS students get ready for the world, and a fair amount of truth, too. I will limit my brief comments to three items that have been on my mind recently even before reading this list.

18. 'Email' and 'Gmail' are synonymous.

CS grads are users, too, and their use of Gmail, and systems modeled after it, contributes to the truths of modern email: top posting all the time, with never a thought of trimming anything. Two-line messages sitting atop icebergs of text which will never be read again, only stored in the seemingly infinite space given us for free.

38. Employers care about which courses they took.

It's the time of year when students register for spring semester courses, so I've been meeting with a lot of students. (Twice as many as usual, covering for a colleague on sabbatical.) It's interesting to encounter students on both ends of the continuum between not caring at all what courses they take and caring a bit too much. The former are so incurious I wonder how they fell into the major at all. The latter are often more curious but sometimes are captive to the idea that they must, must, must take a specific course, even if it meets at a time they can't attend or is full by the time they register.

90. Two people with a CS degree will have a very similar background and shared experience/knowledge.

This falsehood operates in a similar space to #38, but at the global level I reached at the end of my previous paragraph. Even students who take most of the same courses together will usually end their four years in the program with very different knowledge and experiences.

The complete list is available at www.netmeister.org.


Original Submission

  • (Score: 2) by YeaWhatevs on Tuesday November 12 2019, @01:23PM (9 children)

    by YeaWhatevs (5623) on Tuesday November 12 2019, @01:23PM (#919363)

    Sure, it's just someone's personal gripe list, dressed up as humor, but it's hard to read without feeling the list is off.

    Take these two:
        "Linux and Unix are synonymous"
        "Bash and sh are synonymous"

    They're not the same, and the author cares greatly about the difference. Thing is, I don't really care. They are equivalent in the ways that matter to me. Also, I have much more pressing things to worry about. If I hit a problem specifically tied to the differences, I care for about 5 minutes before I go back to not caring anymore.

    • (Score: 3, Insightful) by bart on Tuesday November 12 2019, @01:55PM (4 children)

      by bart (2844) on Tuesday November 12 2019, @01:55PM (#919371)

      If you don't really care, you probably haven't run into issues caused by the differences (bash vs sh is pretty significant!)

      • (Score: 0) by Anonymous Coward on Tuesday November 12 2019, @02:21PM

        by Anonymous Coward on Tuesday November 12 2019, @02:21PM (#919384)

        Plus, the security fixes for that exploit a number of years back broke a TON of old scripts that made use of the behavior, intentionally or unintentionally, as did later coreutils versions, which require a baroque environment variable to run in legacy compatibility mode. All of the Loki Games-based installers required that mode, which is one of the reasons the installer outlasted the company by 5-10 years and then suddenly died overnight.

      • (Score: 2) by HiThere on Tuesday November 12 2019, @05:11PM (1 child)

        by HiThere (866) Subscriber Badge on Tuesday November 12 2019, @05:11PM (#919465) Journal

        Well, if you've got bash installed, sh is probably a link to it. But there are several shells that are "pretty much the same" but which have differences in detail, and they all tend to get linked to sh if they're installed as the system shell. I think the original UNIX shell was named sh, but I haven't run into it in a decade. If I did it would need to be renamed to work on my system.

        FWIW, I did at one time work on an AT&T UNIX system, but I never learned the details, even though I was the "system administrator". There was no one around to teach me, and my real job was to design and implement a multi-user database accessed by modem. So while I used the command "sh", I don't know whether it was the native command, or a link to something else.

        --
        Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
        • (Score: 0) by Anonymous Coward on Wednesday November 13 2019, @05:37AM

          by Anonymous Coward on Wednesday November 13 2019, @05:37AM (#919727)

          The first UNIX shell was, I believe, the Thompson shell. This, however, was too limited, so they replaced it with PWB in Version 5; then the Bourne shell (the infamous /bin/sh) came with System 7 or 8, IIRC. From there, the Korn shell, C shell, Almquist shell, and a few others splintered out for different reasons. It wasn't until '89, when the first POSIX came out, that /bin/sh was formally declared to be (somehow) an executable for a POSIX-compatible shell. Because of this, a number of shells made or maintained after that will check their argv[0] to see whether they should activate their "POSIX-conformant" mode by default. This is why your cross-platform scripts' shebang should almost always be /bin/sh, but your user shell should be /bin/bash or /bin/tcsh or /bin/fish or whatever.
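
          For illustration only, here is a minimal C sketch (hypothetical, not taken from any actual shell's source) of the argv[0] trick described above: a program looking at the name it was invoked under to decide whether to start in a POSIX-compatible mode.

          #include <stdio.h>
          #include <string.h>

          /* Hypothetical sketch: decide whether to enable POSIX mode based on
           * the name the program was invoked as (argv[0]), the way bash and
           * several other shells do when invoked as "sh". */
          int main(int argc, char *argv[])
          {
              int posix_mode = 0;
              (void)argc;  /* unused in this sketch */

              /* Strip any leading directory components from argv[0]. */
              const char *name = strrchr(argv[0], '/');
              name = name ? name + 1 : argv[0];

              /* A leading '-' conventionally marks a login shell; skip it. */
              if (name[0] == '-')
                  name++;

              if (strcmp(name, "sh") == 0)
                  posix_mode = 1;

              printf("invoked as %s, POSIX mode %s\n", name,
                     posix_mode ? "on" : "off");
              return 0;
          }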

      • (Score: 1) by bmimatt on Wednesday November 13 2019, @08:25AM

        by bmimatt (5050) on Wednesday November 13 2019, @08:25AM (#919759)

        You've jogged my aged Solaris memories, where sh, not bash, was used as the default root shell. One of the reasons: bash was not a 'built-in' shell, so you could lock yourself out if you changed root's default shell to bash, whereas sh was part of the shipped system.

    • (Score: 0) by Anonymous Coward on Tuesday November 12 2019, @01:57PM (2 children)

      by Anonymous Coward on Tuesday November 12 2019, @01:57PM (#919373)

      But those 2 quotes from the article are factually true.
      The fact that "you don't care" is irrelevant.

      You only care about Linux. That's fine, but it doesn't invalidate the point that students getting a CS education should learn that OSes are not limited to Linux, macOS, and Windows. Linux in many areas is not the pinnacle of decades of OS design and refinement. Much of what less-educated people think is "Unix" is purely Linux (or GNU bash) "innovations."

      • (Score: 2) by YeaWhatevs on Tuesday November 12 2019, @02:52PM (1 child)

        by YeaWhatevs (5623) on Tuesday November 12 2019, @02:52PM (#919396)

        I know the differences, and I have had to program around them, just like the differences between them and Windows and MacOS systems. Still don't care. I'm fine with you caring; I just hope you see it has nothing to do with having a freshly minted CS degree.

        • (Score: 0) by Anonymous Coward on Tuesday November 12 2019, @04:33PM

          by Anonymous Coward on Tuesday November 12 2019, @04:33PM (#919443)

          Fair enough.

    • (Score: 3, Informative) by maxwell demon on Tuesday November 12 2019, @02:36PM

      by maxwell demon (1608) on Tuesday November 12 2019, @02:36PM (#919390) Journal

      On some systems, bash and sh are really the same shell under another name.

      When invoked as /bin/sh, bash more or less behaves like a standard sh.

      Of course, some distros use another shell for /bin/sh.

      --
      The Tao of math: The numbers you can count are not the real numbers.
  • (Score: 3, Interesting) by mth on Tuesday November 12 2019, @01:30PM (9 children)

    by mth (2848) on Tuesday November 12 2019, @01:30PM (#919364) Homepage

    The following are listed as falsehoods, but in my opinion they're not false in general:

    9. Sprinkling printf statements is an efficient debugging technique.

    All debugging is inefficient, but in my experience logging (which printf statements are a primitive form of) is often a more efficient way of debugging than single-stepping.

    28. Command-line tools should print colorized output.

    Of course adding color doesn't automatically improve readability, but if used well, I do think colorized output is easier to understand than monochrome text.

    78. The humanities requirements were a waste of time.

    While there is merit to having non-technical courses in the curriculum, some of the ones I took about 20 years ago turned out to indeed be a waste of time. So maybe this sentiment shouldn't be dismissed as students being short-sighted; instead, it's worth checking whether those courses are actually teaching useful skills and/or insights.

    • (Score: 5, Insightful) by nobu_the_bard on Tuesday November 12 2019, @02:10PM (3 children)

      by nobu_the_bard (6373) on Tuesday November 12 2019, @02:10PM (#919375)

      I agree, #28 shouldn't be considered a "must", but selective use of color can improve readability. Unless it's dark blue on a black background, that is; then it's the opposite of helpful.

      #78, it's too blanket a statement to be useful. Humanities have their purposes, but it's not universal. I studied Logic in the Philosophy department in college and it was very good at showing how to make arguments; the teacher even observed he had a lot of comp-sci students and occasionally went on tangents of things we might find useful. At the same time, I had an English class that was basically "popular film criticism" which was sort of useless as we just discussed subjective things pointlessly (it was more "this movie's plot was bad because I don't like plots centered on saving the world" vs. the potentially more thoughtful "let us discuss how this movie attempts to make a specific point but inadvertently appears to provide proof of the opposite").

      • (Score: 0) by Anonymous Coward on Wednesday November 13 2019, @07:38PM (2 children)

        by Anonymous Coward on Wednesday November 13 2019, @07:38PM (#919974)

        I teach Logic at a local university. My current intro class is about 1/3 engineers, 1/3 CS, and 1/3 everyone else (mostly Philosophy or legal studies). I make sure I cover things like set theory and Venn Diagrams, fuzzy logic, N-order logic, many-sorted logic, modal logic, and many-valued logic (after experimenting with different systems, I'm thinking of sticking with the functionally-complete version of ternary SQL logic as the most applicable and easily understandable). Each of those has direct implications to the CS and engineering fields, even if the students don't see that from the ground.

        If any of you have any suggestions as to what to add to my curriculum or tweaks to make, I'd appreciate them. Specifically, what did you find helpful when you learned them, or what do you wish you'd learned earlier? Any insight would be appreciated, as most who take the class are not Philosophy majors but I, obviously, have a Philosophy background, so do not share the same background or training.

        And no, Steve, if you see this, I'm not teaching them that crazy IEEE logic you want in an intro course!
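
        For anyone who hasn't met the three-valued logic mentioned above, here is a rough C sketch of Kleene/SQL-style AND, OR, and NOT. It is an illustration only, not the poster's course material; the names and encoding are my own assumptions.

        #include <stdio.h>

        /* Kleene / SQL-style three-valued logic: FALSE < UNKNOWN < TRUE.
         * AND is the minimum of its operands, OR is the maximum,
         * NOT swaps TRUE and FALSE and leaves UNKNOWN alone. */
        enum tri { T_FALSE = 0, T_UNKNOWN = 1, T_TRUE = 2 };

        static const char *names[] = { "FALSE", "UNKNOWN", "TRUE" };

        static enum tri tri_and(enum tri a, enum tri b) { return a < b ? a : b; }
        static enum tri tri_or(enum tri a, enum tri b)  { return a > b ? a : b; }
        static enum tri tri_not(enum tri a)             { return (enum tri)(T_TRUE - a); }

        int main(void)
        {
            /* Print the full truth tables, e.g. UNKNOWN AND FALSE = FALSE,
             * UNKNOWN OR TRUE = TRUE, NOT UNKNOWN = UNKNOWN. */
            for (int a = T_FALSE; a <= T_TRUE; a++)
                for (int b = T_FALSE; b <= T_TRUE; b++)
                    printf("%7s AND %7s = %-7s   %7s OR %7s = %s\n",
                           names[a], names[b], names[tri_and(a, b)],
                           names[a], names[b], names[tri_or(a, b)]);
            printf("NOT UNKNOWN = %s\n", names[tri_not(T_UNKNOWN)]);
            return 0;
        }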

        • (Score: 2) by nobu_the_bard on Thursday November 14 2019, @07:22PM (1 child)

          by nobu_the_bard (6373) on Thursday November 14 2019, @07:22PM (#920471)

          I am not an expert in computer science. I went to college for it, but what I learned was that I am not expert enough to be teaching it, hahaha.

          I'd advise having a chat with the computer science professors sometime. They'd know better than a chump like me. Find one who recognizes how important philosophy is; in my experience it's hard to tell at a casual glance, you have to talk to them.

          At my college the administration liked to have a "Science vs Humanities" power struggle and force them to compete for resources, so they mostly didn't get along, but that's stupid. I lucked out and got an advisor who recognized it was stupid. They don't exist without each other. They should get along.

          • (Score: 0) by Anonymous Coward on Thursday November 14 2019, @07:29PM

            by Anonymous Coward on Thursday November 14 2019, @07:29PM (#920472)

            That was a general invitation to everyone for general ideas, but it was spurred because you sounded like you had a few. I do talk to the different departments (mathematics, EE, and CS, to name the big ones), and that is why I include some of the things I do. Those hacking their way through the trees with a machete want to know which foods are poisonous and which animals are venomous. Meanwhile, when you look down from the ivory tower, you can only see the forest and sometimes forget it is made of more than trees.

    • (Score: 1, Insightful) by Anonymous Coward on Tuesday November 12 2019, @02:10PM (2 children)

      by Anonymous Coward on Tuesday November 12 2019, @02:10PM (#919376)

      I think the last item should be amended to be correct: MANY university requirements, technical and non-technical, are a waste of time.
      One class that is NOT a waste of time is TECHNICAL WRITING.
      It's completely different from the usual writing students are taught their entire academic careers: essays, literary analysis, long papers.
      I work with a sea of millennials and it's obvious that they desperately need this skill.

      • (Score: 2) by YeaWhatevs on Tuesday November 12 2019, @03:02PM

        by YeaWhatevs (5623) on Tuesday November 12 2019, @03:02PM (#919399)

        I agree. Though, I see bad writing from all ages. Also, I see dumb people. Finally, Technical Writing IS a CS degree requirement in most places these days. It wasn't back in my day.

      • (Score: 2) by Alfred on Tuesday November 12 2019, @03:17PM

        by Alfred (4006) on Tuesday November 12 2019, @03:17PM (#919403) Journal
        My tech writing class was BS. Yet I recognize the importance of tech writing even though it wasn't really taught to me. Even more sadly, I am better at it than those I have to interact with.
    • (Score: 0) by Anonymous Coward on Tuesday November 12 2019, @04:45PM

      by Anonymous Coward on Tuesday November 12 2019, @04:45PM (#919451)

      Ah ha...found the M$ programmer!!

    • (Score: 2) by acid andy on Wednesday November 13 2019, @12:59AM

      by acid andy (1683) on Wednesday November 13 2019, @12:59AM (#919633) Homepage Journal

      I disagreed with these:

      27. Real Programmers(TM) use neon-green on black terminals.

      I don't always use exactly that but a dark or warm background makes for a good night mode so it's easier to get to sleep. And anyway, everyone knows it's amber on black. ;)

      30. Software with version numbers ending in '.0' are buggy and you should wait until the next release.

      Well these days the '.1' will probably be buggy too.

      This was a good one though:

      60. Object-oriented programming is the best and most common programming paradigm.

      --
      If a cat has kittens, does a rat have rittens, a bat bittens and a mat mittens?
  • (Score: -1, Offtopic) by Anonymous Coward on Tuesday November 12 2019, @01:57PM (20 children)

    by Anonymous Coward on Tuesday November 12 2019, @01:57PM (#919372)

    IT grads in general are useless nowadays. The last year before my retirement was a neverending facepalm at the level of incompetence displayed by the new recruits fresh out of school.

    The crushing, inescapable reality is that if you've never had to program in assembly, if you don't know the difference between expanded and extended memory, if you've never had to manually set non-conflicting memory, I/O, and interrupt settings using DIP switches and jumpers on expansion cards, then as far as CS is concerned, you are useless.

    • (Score: 1, Insightful) by Anonymous Coward on Tuesday November 12 2019, @02:03PM (1 child)

      by Anonymous Coward on Tuesday November 12 2019, @02:03PM (#919374)

      You are really showing your early Windows PC background.
      Assembly aside, those other things are not critical to knowing how a computer works. If you started your career on Sun or SGI workstations, that other crap never existed. You'd learn all about SCSI issues and the Forth-based boot interpreter instead. ;-) Still a lot less hassle than a PC. If you started with Macs, SCSI was also a thing, plus who knows what other issues peculiar to Macs.

      • (Score: 2) by Alfred on Tuesday November 12 2019, @03:23PM

        by Alfred (4006) on Tuesday November 12 2019, @03:23PM (#919408) Journal
        Macs are perfect; there is no troubleshooting to learn. /s I remember getting the wrong SCSI IDs set, or not having the terminator. But that didn't matter, because "Dark Castle" booted from floppy anyway.
    • (Score: 4, Insightful) by Immerman on Tuesday November 12 2019, @02:20PM (13 children)

      by Immerman (3985) on Tuesday November 12 2019, @02:20PM (#919383)

      You're talking about technical details - there's absolutely no reason those should be taught in Computer Science programs. There are far too many, and most of them are irrelevant to any particular career path.

      As the old saying goes: Computer Scientists don't work with hardware. Similarly, Computer Scientists don't write code.

      Of course, Computer Scientists also mostly don't have jobs. There just isn't that much work for Computer Scientists. Mostly you want Software Engineers, Network Engineers, IT technicians, etc., etc. Computer Science provides a decent foundation for becoming those, especially if a student decides to specialize in one of those directions, but it does NOT encompass them. If you want to be able to hire cheap employees right out of school and have them be useful, then you need to provide training for the specific skills you need. That's the trade-off: either you pay market price for an experienced professional, or you provide the necessary training yourself. Or just bumble along in the face of a steady stream of incompetent fuck-ups; that's an option too, I guess.

      TL;DR: Undergraduate degrees are NOT supposed to make you a competent employee - that's what trade schools and on-the-job training are for. They're designed to lay the foundation for you to eventually *become* much more competent than you would be able to without that foundation, or to go on to a graduate school where you will acquire much more specialized skills that will generally be applicable to a career in a much narrower field.

      • (Score: 3, Interesting) by ikanreed on Tuesday November 12 2019, @02:32PM (5 children)

        by ikanreed (3164) Subscriber Badge on Tuesday November 12 2019, @02:32PM (#919388) Journal

        I agree with your final sentiment. We've broken the concept of a 4 year degree over the knee of the service economy.

        All of the following are completely fucked up

        • Computer science as software engineering for business
        • Communications as a prep degree for PR/HR
        • Business degrees in general - there's nothing a middle manager needs to do that requires one, and they don't attach to any deeper theory
        • The widespread conclusion that degrees in philosophy, history, and the like are "useless"

        The original understanding of a bachelorreate was that you were a person who genuinely understood something and had developed a well-rounded general intellectual skillset, not high school++. The way we expect every programmer to have a 4-year degree in computer science is like expecting every musician to have a 4-year degree in music.

        • (Score: 4, Touché) by Anonymous Coward on Tuesday November 12 2019, @04:08PM (4 children)

          by Anonymous Coward on Tuesday November 12 2019, @04:08PM (#919432)

          I miss the days when baccalaureates were conferred only on those who could spell “baccalaureate.”

          • (Score: 3, Insightful) by captain normal on Tuesday November 12 2019, @05:10PM (3 children)

            by captain normal (2205) on Tuesday November 12 2019, @05:10PM (#919464)

            I miss the days when undergrads were required to have some knowledge of Latin.
            https://www.theclassroom.com/baccalaureate-degree-4603623.html [theclassroom.com]

            --
            When life isn't going right, go left.
            • (Score: 2) by ikanreed on Tuesday November 12 2019, @06:47PM (2 children)

              by ikanreed (3164) Subscriber Badge on Tuesday November 12 2019, @06:47PM (#919499) Journal

              As someone with entirely too much Latin and not enough real language study in his background: why?

              • (Score: 3, Funny) by barbara hudson on Tuesday November 12 2019, @08:24PM (1 child)

                by barbara hudson (6443) <barbara.Jane.hudson@icloud.com> on Tuesday November 12 2019, @08:24PM (#919531) Journal
                Maybe if they knew what caveat emptor meant they wouldn't spend so much money on useless degrees? Nah, who am I kidding?
                --
                SoylentNews is social media. Says so right in the slogan. Soylentnews is people, not tech.

      • (Score: 2) by tangomargarine on Tuesday November 12 2019, @04:42PM (4 children)

        by tangomargarine (667) on Tuesday November 12 2019, @04:42PM (#919449)

        You're talking about technical details - there's absolutely no reason those should be taught in Computer Science programs. There's far too many, and most of them are irrelevant to any particular career paths.

        I'd argue that every CS student should be forced to take a little assembly; it's useful for understanding the underlying reasons behind a lot of logic that nobody really explains elsewhere.* Apparently my university agreed, because it was a required course.

        *registers vs RAM (why certain operations are faster, register width), endianness, computational costs of branching...writing a program that you can't debug was also interesting :)

        --
        "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
        • (Score: 3, Informative) by Immerman on Tuesday November 12 2019, @04:57PM (3 children)

          by Immerman (3985) on Tuesday November 12 2019, @04:57PM (#919458)

          That much I'll tentatively agree on - assembly is (really close to) where the hardware meets the software, and the insights are likely useful to any application of CS knowledge.

          I'd add in cache performance as another thing that should be spotlighted in such a "bare metal" assembly segment (I'm not sure assembly is worth a dedicated course) - I've encountered many people who don't understand that traversing large data sets sequentially is almost guaranteed to be several times faster than any other pattern, simply because it's pretty much the only pattern that cache prefetching can effectively anticipate - and using cache effectively gives you the capacity of RAM with most of the speed of registers. (The really scary thing? I've encountered such ignorance at professional supercomputing conferences...)
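
          A quick sketch of that point (illustrative only; exact timings depend on the machine): both loops below sum the same matrix, but the row-major loop walks memory sequentially, which the prefetcher can follow, while the column-major loop jumps a full row each step and typically runs several times slower.

          #include <stdio.h>
          #include <stdlib.h>
          #include <time.h>

          #define N 4096  /* 4096 x 4096 doubles, roughly 128 MiB */

          int main(void)
          {
              double *a = malloc((size_t)N * N * sizeof *a);
              if (!a)
                  return 1;
              for (size_t i = 0; i < (size_t)N * N; i++)
                  a[i] = 1.0;

              double sum = 0.0;
              clock_t t0 = clock();
              for (size_t i = 0; i < N; i++)        /* sequential: row-major */
                  for (size_t j = 0; j < N; j++)
                      sum += a[i * N + j];
              clock_t t1 = clock();
              for (size_t j = 0; j < N; j++)        /* strided: column-major */
                  for (size_t i = 0; i < N; i++)
                      sum += a[i * N + j];
              clock_t t2 = clock();

              printf("sum=%g  row-major %.2fs  column-major %.2fs\n", sum,
                     (double)(t1 - t0) / CLOCKS_PER_SEC,
                     (double)(t2 - t1) / CLOCKS_PER_SEC);
              free(a);
              return 0;
          }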

          • (Score: 2) by tangomargarine on Tuesday November 12 2019, @11:00PM (2 children)

            by tangomargarine (667) on Tuesday November 12 2019, @11:00PM (#919591)

            The one I took was a split course between x86 assembly and Java threading, yeah.

            --
            "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
            • (Score: 2) by Immerman on Wednesday November 13 2019, @02:07AM (1 child)

              by Immerman (3985) on Wednesday November 13 2019, @02:07AM (#919666)

              Ever tried your hand at 68000 assembler? Not much call for it anymore, but *so* much nicer to program in. One of the differences that springs to mind was a "function call" instruction that did all the stack manipulation, etc. needed for a normal clean function call in a single instruction, while you needed a half-dozen instructions in x86.

              • (Score: 2) by tangomargarine on Wednesday November 13 2019, @05:38PM

                by tangomargarine (667) on Wednesday November 13 2019, @05:38PM (#919930)

                Oh, I totally believe that other assembly languages are nicer than x86. The platform didn't win out for technical reasons, but because it was cheaper and pushed by a bigger company (Intel).

                --
                "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
      • (Score: 2) by acid andy on Wednesday November 13 2019, @01:09AM (1 child)

        by acid andy (1683) on Wednesday November 13 2019, @01:09AM (#919636) Homepage Journal

        Many of the best professionals learned the trade for fun in their spare time. The degree formalizes that knowledge and puts it into a scientific context. I think people who try to study Computer Science purely to get at the revenue stream and don't have a passion for it will often have a bad time in the workplace if they even make it through the degree. I'm sure there are exceptions, though.

        --
        If a cat has kittens, does a rat have rittens, a bat bittens and a mat mittens?
        • (Score: 3, Insightful) by Immerman on Wednesday November 13 2019, @01:58AM

          by Immerman (3985) on Wednesday November 13 2019, @01:58AM (#919662)

          >The degree formalizes that knowledge and puts it into a scientific context.

          I have to disagree. Yeah, there's some formalization, but where a degree really shines is granting breadth of knowledge. There are plenty of good self-taught programmers out there, but I doubt all that many of them are familiar with big-O notation, graph theory, boolean algebra, etc., etc. (to mention some examples from the first class that springs to mind). All things that are very useful to have in your mental toolbox, but that you probably aren't going to learn organically under your own initiative - they're just not obviously relevant to much until you already know what they are.

          Degrees excel at exposing you to lots of relevant background knowledge, while self-guided learning excels at developing practical knowledge. I'd venture a guess that most really excellent professionals have both.

    • (Score: 2) by HiThere on Tuesday November 12 2019, @05:28PM (3 children)

      by HiThere (866) Subscriber Badge on Tuesday November 12 2019, @05:28PM (#919470) Journal

      Why would one need to know the difference between "expanded" and "extended" memory? I seem to remember caring at one time, but these days I don't know the difference anymore. What's significant is which memory will remain resident for fast access, and which will get rolled out. And that varies from system to system, and by amount of load.

      I got no permanent advantage out of learning to set DIP switches, soldering my own connections, etc. It was useful at the time on specific hardware, but that went out of use within 5 years.

      Assembler, though, is really valuable. OTOH, assembler for a virtual machine is just as valuable as assembler for an actual machine as far as learning how things work. I sympathize with Knuth wanting to use MIX for his Art of Computer Programming, even though it made the things a pain to use as a reference. At one time I programmed a modem controller for a bunch of i8088 systems in assembler. And hand-made the cables needed to interface the terminal to the printer port. But most of that was of no permanent value.

      Most of what you learn by programming in assembler you could learn from K&R C with a really simple (non-optimizing) compiler, and a few routines that you wrote in assembler to interface. I specify K&R C because the current version incorporates too much magic for learning. Unfortunately, to really understand, you need to run it on a simple computer where there's only one program running at a time and no operating system. So a virtual computer is the best choice. Possibly a virtual i8008 or i6502. Not a Z80, as that adds a bit too much complication. But MIX might be even better; it's just that I don't know that a C compiler was ever written for MIX.

      --
      Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
      • (Score: 3, Informative) by barbara hudson on Tuesday November 12 2019, @08:34PM (1 child)

        by barbara hudson (6443) <barbara.Jane.hudson@icloud.com> on Tuesday November 12 2019, @08:34PM (#919537) Journal
        There's no longer expanded or extended memory once the boot environment is set up. 64-bit CPUs no longer need expanded or extended RAM since they operate on a flat memory address space: the hardware allocates chunks of RAM for each program and maps them into a virtual memory space with the same virtual start address, so programs always start at the same entry point.

        This was because of bad design both in hardware (segmented memory) and software (non-relocatable code). But it made for fun times when first encountering it after leaving an OS that supported position-independent code.

        --
        SoylentNews is social media. Says so right in the slogan. Soylentnews is people, not tech.

      • (Score: 0) by Anonymous Coward on Tuesday November 12 2019, @09:54PM

        by Anonymous Coward on Tuesday November 12 2019, @09:54PM (#919564)

        i6502?

  • (Score: -1, Troll) by Anonymous Coward on Tuesday November 12 2019, @02:13PM (10 children)

    by Anonymous Coward on Tuesday November 12 2019, @02:13PM (#919378)

    CS grads are guaranteed a six figure income: $000,000. All zeroes.

    In the real world, corporations take open source code straight from CS grads’ “GitHub resumes” and incorporate it into proprietary products and services that generate billions of dollars in revenue. Corporations don’t need to pay any coders anything because CS grads are naive enough to believe that showing off their talent by working for free will land them huge salaries at top tech companies. The reality is top tech became top tech by paying exactly nothing for top talent. Tech companies invented the myth of the lucrative job market for CS grads to guarantee an unending supply of young naive idiots for big tech to leech from. Anyone foolish enough to get a CS degree is unemployable for life.

    Learn to Code, Earn Zero, Die Poor.

    • (Score: 3, Informative) by ikanreed on Tuesday November 12 2019, @02:24PM (6 children)

      by ikanreed (3164) Subscriber Badge on Tuesday November 12 2019, @02:24PM (#919385) Journal

      Ah yes, the "steal wholesale and don't bother to integrate at all" strategy of development. Have you ever actually worked in a corporate shop? The percentage of open source code that's "enterprise ready" to link into existing APIs is close to zero.

      • (Score: 2) by DutchUncle on Tuesday November 12 2019, @04:08PM (2 children)

        by DutchUncle (5370) on Tuesday November 12 2019, @04:08PM (#919434)

        But try to convince an EE hardware manager that it's worth buying a tested certified stack instead of just copying one off github for free.

        • (Score: 2) by ikanreed on Tuesday November 12 2019, @04:12PM

          by ikanreed (3164) Subscriber Badge on Tuesday November 12 2019, @04:12PM (#919435) Journal

          I have done that before. Hell, quite recently I've pitched going from systems I know how to use and that are currently working to ones that are more stable but less familiar to me and that I struggle with (ask me what I'm putting off today). Maybe the reality you've experienced is so radically different from mine that we just see completely different kinds of behavior.

        • (Score: 0) by Anonymous Coward on Tuesday November 12 2019, @06:56PM

          by Anonymous Coward on Tuesday November 12 2019, @06:56PM (#919503)

          I think an EE manager would recognize the value of the certified stack, and deprecate the GitHub one. Many EEs still have an engineering mentality, while CS people don't mind winging it and dealing with the aftermath when it breaks.

      • (Score: 3, Funny) by tangomargarine on Tuesday November 12 2019, @04:35PM (2 children)

        by tangomargarine (667) on Tuesday November 12 2019, @04:35PM (#919444)

        Have you ever actually worked in a corporate shop?

        You're asking the "no jobs exist for CS grads" guy whether he's worked a CS job before? Really??

        --
        "Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
        • (Score: 4, Funny) by aristarchus on Tuesday November 12 2019, @07:08PM (1 child)

          by aristarchus (2645) on Tuesday November 12 2019, @07:08PM (#919510) Journal

          I was wondering what happened to the "no jobs exist for CS grads" guy! Hasn't been around for a while, and I was worried he had the misfortune of finding a job in CS. Thank goodness that appears not to be the case.

          • (Score: 1) by khallow on Wednesday November 13 2019, @03:34AM

            by khallow (3766) Subscriber Badge on Wednesday November 13 2019, @03:34AM (#919697) Journal
            Indeed, we are safe for another posting cycle.
    • (Score: 1, Interesting) by Anonymous Coward on Tuesday November 12 2019, @04:17PM (1 child)

      by Anonymous Coward on Tuesday November 12 2019, @04:17PM (#919437)

      As someone who makes a six figure income coding for multinational corporations...

      The problem I find is CS grads who _think_ they can just copy shit off github and integrate it into my enterprise codebase. Their resumes are easy to spot because they tend to copy their acronym soup definitions straight from wikipedia. I wouldn’t consider them for a $000,000 position.

      I would sooner hire someone at my salary with a BA in Ballet and a MFA in Portuguese poetry if they could successfully bang out “Hello World” in three or more languages than most of the CS resumes I see.

      • (Score: 0) by Anonymous Coward on Tuesday November 12 2019, @06:30PM

        by Anonymous Coward on Tuesday November 12 2019, @06:30PM (#919492)

        Their resumes are easy to spot because they tend to copy their acronym soup definitions straight from wikipedia.

        They do that because there are all too many companies that won't even consider you unless your resume includes a ridiculous number of buzzwords, particularly if the resumes have to go through some poorly-designed automated process first.

    • (Score: 3, Informative) by turgid on Friday November 15 2019, @09:04PM

      by turgid (4318) Subscriber Badge on Friday November 15 2019, @09:04PM (#920800) Journal

      I have done plenty of "jobs for CS grads" and I don't even have a CS degree. I do have a degree. And no, they don't steal open source code. They're usually pretty tight on license compliance. But then I don't generally work in the Wild West for cowboys and gangsters.

  • (Score: 5, Insightful) by fadrian on Tuesday November 12 2019, @03:50PM (11 children)

    by fadrian (3194) on Tuesday November 12 2019, @03:50PM (#919421) Homepage

    Yawn. Another get-off-my-lawn article.

    Get with it, man. Nobody cares about alternatives anymore. The world's too busy to deal with plumbing and just wants shit to work. Plumbers don't bore you with the fact that there's more than one type of wrench before they get down to work, do they? I know, in the end, you're training computer plumbers, so maybe they need to know, but they (and the author) shouldn't obsess about it - train them to use any fucking wrench and get the job done.

    This touches on a topic that troubles me lately, namely treating computers with more importance than they deserve, because in the end they're nothing but tools. That's really all they are. There's been so little change in the computer world for the last forty years, it's no longer fresh. So unless you have a new, interesting tool, don't bug me (and if you have a new, interesting tool, don't bug me too much). In any case, computers' world-changing aspects are over and by now they're about as interesting as any wrench. This could change with any actual new innovation in computer systems, but I'm not holding my breath, as the computer industry has changed from one where changing the world was a co-priority (along with getting rich) to one where milking the cash cow is most important.

    In the end, lamenting this shit makes me seem older than the author of the article. Get off my fucking lawn. Damn.

    --
    That is all.
    • (Score: 3, Insightful) by Anonymous Coward on Tuesday November 12 2019, @05:05PM (7 children)

      by Anonymous Coward on Tuesday November 12 2019, @05:05PM (#919463)

      This touches on a topic that troubles me lately, namely treating computers with more importance than they deserve, because in the end they're nothing but tools. That's really all they are. There's been so little change in the computer world for the last forty years, it's no longer fresh. So unless you have a new, interesting tool, don't bug me (and if you have a new, interesting tool, don't bug me too much).

      Really? Were you actually around for the past 40 years? I really don't want to spend the 5 minutes searching online, but off the top of my head, it sounds like you are suggesting the following were considered non-big changes:
      1) Shrinking computers from room-sized to microwave-oven-sized.
      2) Advent of electronic spreadsheets, including nearly-failproof calculations (especially compound interest)
      3) Color monitors
      4) Network connectivity, especially the Internet
      5) Making the internet be international
      6) Online commerce (remember, "the internet is for porn" trope... and nobody would use it to buy shoes or check when a nearby movie is showing)
      7) Streaming
      8) Social networking
      9) Mobile network connectivity
      10) Electronic maps and GPS (when's the last time you searched an index for a street name to find the map square your destination was in?)
      11) Electronic directories (when's the last time you memorized a phone number?)
      12) Expert Systems/Artificial Intelligence
      13) RSA/PGP/etc.

      I could literally go on, but I have more important things to do now.

      In any case, computers' world-changing aspects are over and by now they're about as interesting as any wrench. This could change with any actual new innovation in computer systems, but I'm not holding my breath, as the computer industry has changed from one where changing the world was a co-priority (along with getting rich) to one where milking the cash cow is most important.

      Reminds me of this. [amasci.com]

      I'm *SURE* that things like quantum computing, image recognition and driverless cars, deep fakes, machine-derived pharmaceuticals, "artificial intelligence" (whatever that is), and everything will turn out to be dead ends.

      I'd get off your lawn, but I'm not convinced you actually live in this neighborhood.

      • (Score: 2) by HiThere on Tuesday November 12 2019, @05:40PM (3 children)

        by HiThere (866) Subscriber Badge on Tuesday November 12 2019, @05:40PM (#919475) Journal

        Even though I've upmodded you, I disagree. Computers *ARE* overrated. People pay more attention to devices than to what they do. Not that they don't *use* what they do, but they don't think about it.

        I *think* I'm different about this, but who knows. Consider "facial recognition". What's important is not what the technology is, but how it's used and who controls it. Or computerized selection of job applicants.

        It's not the computer that's important, it's how it's getting used. And who's controlling the results, and what selection criteria they use.

        Computers are an enabling tool. As long as they "only follow orders", that's all they are, even if the orders are several layers removed from observation. And what's important is what those orders are, and what coercive force is behind them. Until computers are self-directed, it's not the computers that are important.

        That said, I'm not a political animal, I'm a technologist. I can control my computer (to an extent). And I can choose to buy a newer, fancier, one. And that's where my attention naturally lives. But I *know* that that's not what's socially important, and pretending that it is, is a mistake.

        --
        Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
        • (Score: 0) by Anonymous Coward on Tuesday November 12 2019, @06:28PM

          by Anonymous Coward on Tuesday November 12 2019, @06:28PM (#919491)

          Thanks for the upmod, and fair point. I think there were several points in that post I reacted to.

          There's been so little change in the computer world for the last forty years, it's no longer fresh... This could change with any actual new innovation in computer systems, but I'm not holding my breath...

          This is the part which got a proverbial bee in my bonnet, and it was the primary point of my response. I strongly disagree with it. If the poster had said "no big innovations in the past 2 years," and had said it in 2015 I may have agreed... but things have changed a TON in recent years, let alone the past 40 years.

          This touches on a topic that troubles me lately, namely treating computers with more importance than they deserve, because in the end they're nothing but tools.

          I think this is what you were reacting to, and to a large degree I agree. However, saying something is "just a tool" is both literally true, and I think misrepresentative of the reality of the situation. I could likewise say fire, electricity, steam engines, currency, elections, pacemakers, and gunpowder are all "just tools." You are right that on their own they do nothing (case in point: Chinese people invented gunpowder... and only used it for ceremonial purposes until a westerner had the clever idea to use it to project objects into enemies in combat). Steel beams are "just a tool," but imagine trying to build a skyscraper without them. Computers are "just a tool" much like humans are "just some hydrogen, oxygen, carbon, nitrogen, and a few other trace elements."

          I will agree with your general point, though, that it's not the tool or its strength that matters, it's how it is used. (e.g. the dotcom bust of the early 2000s, where things like pets.com flopped... just being "on the internet" isn't good enough.)

        • (Score: 1) by khallow on Wednesday November 13 2019, @03:44AM (1 child)

          by khallow (3766) Subscriber Badge on Wednesday November 13 2019, @03:44AM (#919700) Journal

          Consider "facial recognition". What's important is not what the technology is, but how it's used and who controls it.

          Don't forget the infrastructure for databases matching human faces to names, collecting the data which is scanned for faces, and of course, being able to act on it.

          • (Score: 2) by HiThere on Wednesday November 13 2019, @05:31PM

            by HiThere (866) Subscriber Badge on Wednesday November 13 2019, @05:31PM (#919927) Journal

            I included that in "how it's used and who controls it".

            --
            Javascript is what you use to allow unknown third parties to run software you have no idea about on your computer.
      • (Score: 2) by barbara hudson on Tuesday November 12 2019, @08:41PM (1 child)

        by barbara hudson (6443) <barbara.Jane.hudson@icloud.com> on Tuesday November 12 2019, @08:41PM (#919542) Journal
        Shrinking computers from room-sized to microwave-sized? Only if the microwave is designed solely to nuke chocolate bars. My phone is more powerful than those old room-sized computers, and you could probably fit 100 of them in a decent microwave.
        --
        SoylentNews is social media. Says so right in the slogan. Soylentnews is people, not tech.

      • (Score: 2) by Rupert Pupnick on Tuesday November 12 2019, @11:26PM

        by Rupert Pupnick (7277) on Tuesday November 12 2019, @11:26PM (#919603) Journal

        But these really aren’t fundamental changes to CS, and in fact the von Neumann architecture on which all classical computers are based is part of almost every CS curriculum being taught today. What you are talking about are the immense changes in scale, and the applications that these changes have made possible.

        True AI as I understand it is really NN based, and probably doesn’t fall under CS, but having no exposure to present day CS curricula I could be totally wrong on this.

    • (Score: 0) by Anonymous Coward on Tuesday November 12 2019, @07:11PM (2 children)

      by Anonymous Coward on Tuesday November 12 2019, @07:11PM (#919513)

      Yawn. Another get-off-my-lawn article.

      "Tortured Old Man" submission? Runaway is older than dirt, and just as intelligent.

      • (Score: 3, Interesting) by barbara hudson on Tuesday November 12 2019, @08:49PM (1 child)

        by barbara hudson (6443) <barbara.Jane.hudson@icloud.com> on Tuesday November 12 2019, @08:49PM (#919545) Journal
        Never read much sci-fi, I guess. Computronium (aka smart dust) is the next step after nanites, and eventually the whole solar system is converted to it. Or maybe it already happened and we're living in a simulation. Prove otherwise.

        https://en.wikipedia.org/wiki/Computronium?wprov=sfti1 [wikipedia.org]

        computronium is a material hypothesized by Norman Margolus and Tommaso Toffoli of the Massachusetts Institute of Technology in 1991 to be used as "programmable matter", a substrate for computer modeling of virtually any real object.[1]

        It also refers to a theoretical arrangement of matter that is the best possible form of computing device for that amount of matter.

        --
        SoylentNews is social media. Says so right in the slogan. Soylentnews is people, not tech.

  • (Score: 0) by Anonymous Coward on Tuesday November 12 2019, @04:06PM

    by Anonymous Coward on Tuesday November 12 2019, @04:06PM (#919427)

    Anybody actually cares what Jan Schauma has to say about anything.

  • (Score: 0) by Anonymous Coward on Tuesday November 12 2019, @06:37PM (3 children)

    by Anonymous Coward on Tuesday November 12 2019, @06:37PM (#919495)

    One difference with other fields is that they often have a great many regulations and legal requirements to get into. Anyone can get into programming without a degree, and because of that, those who spent time getting real-world experience may outperform those who spent that same time getting a degree, by the time the person with the degree gets it.

    One can argue that this often restricts access to other fields, while others can argue that those restrictions keep people out of fields where they may cause harm if they don't know what they are doing. I think the truth is somewhere in between. I do think our educational system, and the requirement to get a license/education to do anything, can be a racket intended to enrich licensing bureaus and educational institutions while denying people the real-world experience necessary to improve their skills during the time they are pursuing their education. But I also think that education can be beneficial and give people broader knowledge outside of their particular niche.

    • (Score: 2) by barbara hudson on Tuesday November 12 2019, @08:52PM (2 children)

      by barbara hudson (6443) <barbara.Jane.hudson@icloud.com> on Tuesday November 12 2019, @08:52PM (#919546) Journal
      Taxi driver comes to mind for some reason. There was never a reason for a medallion to cost 6-7 figures.
      --
      SoylentNews is social media. Says so right in the slogan. Soylentnews is people, not tech.
      • (Score: 0) by Anonymous Coward on Wednesday November 13 2019, @12:12AM

        by Anonymous Coward on Wednesday November 13 2019, @12:12AM (#919612)

        >Taxi driver comes to mind for some reason. There was never a reason for a medallion to cost 6-7 figures.

        Cattle want to work so much, they'll pay for the privilege.


  • (Score: 2) by progo on Wednesday November 13 2019, @02:16AM

    by progo (6356) on Wednesday November 13 2019, @02:16AM (#919670) Homepage

    Huh? Is that some kind of point about how someone spent time and money so it could be delivered to you? Same with the free bathroom water in your hotel that the hotel kindly didn't charge you for.

    Free software IS free.
