Non-Programmers are Building More of the World's Software

posted by janrinok on Friday May 20 2022, @03:37PM   Printer-friendly
from the be-afraid,-be-very-afraid dept.

Nonprogrammers are building more of the world's software: A computer scientist explains 'no-code':

Traditional computer programming has a steep learning curve that requires learning a programming language, for example C/C++, Java or Python, just to build a simple application such as a calculator or Tic-tac-toe game. Programming also requires substantial debugging skills, which easily frustrates new learners. The study time, effort and experience needed often stop nonprogrammers from making software from scratch.

No-code is a way to program websites, mobile apps and games without writing code, scripts or sets of commands. People readily learn from visual cues, which led to the development of "what you see is what you get" (WYSIWYG) document and multimedia editors as early as the 1970s. WYSIWYG editors allow you to work in a document as it appears in finished form. The concept was extended to software development in the 1990s.

There are many no-code development platforms that allow both programmers and nonprogrammers to create software through drag-and-drop graphical user interfaces instead of traditional line-by-line coding. For example, a user can drag a label and drop it onto a website. The no-code platform will show how the label looks and create the corresponding HTML code. No-code development platforms generally offer templates or modules that allow anyone to build apps.
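To make that translation concrete, here is a toy sketch (in Python; the component names and fields are invented for illustration, and no real platform works exactly this way) of a builder turning dropped components into HTML:

```python
# Hypothetical sketch: each dropped component is recorded as plain data,
# and the "builder" emits the corresponding HTML for it.
def render(component: dict) -> str:
    if component["type"] == "label":
        return f'<span class="label">{component["text"]}</span>'
    if component["type"] == "button":
        return f'<button>{component["text"]}</button>'
    raise ValueError(f"unknown component: {component['type']}")

# The state produced by dragging and dropping two components onto a page.
page = [
    {"type": "label", "text": "Hello, world"},
    {"type": "button", "text": "Save"},
]

print("\n".join(render(c) for c in page))
```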

[...] There are many current no-code website-building platforms such as Bubble, Wix, WordPress and GoogleSites that overcome the shortcomings of the early no-code website builders. Bubble allows users to design the interface by defining a workflow. A workflow is a series of actions triggered by an event. For instance, when a user clicks on the save button (the event), the current game status is saved to a file (the series of actions).
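The event-to-actions wiring described above is easy to sketch in ordinary code. The event and action names below are invented for illustration and are not Bubble's actual API; a real platform generates the equivalent of this from its visual workflow editor:

```python
# Hypothetical sketch of a workflow engine: an event name maps to a
# series of actions that run in order when the event fires.
workflows = {}

def on(event):
    """Register the decorated function as one step in an event's workflow."""
    def register(action):
        workflows.setdefault(event, []).append(action)
        return action
    return register

@on("save_clicked")
def persist_game_status():
    with open("game_status.txt", "w") as f:
        f.write("level=3;score=1200")   # placeholder game state

@on("save_clicked")
def notify_user():
    print("Game saved.")

def fire(event):
    for action in workflows.get(event, []):
        action()

fire("save_clicked")   # the click (the event) triggers the series of actions
```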

Meanwhile, Wix launched an HTML5 site builder that includes a library of website templates. In addition, Wix supports modules—for example, data analysis of visitor data such as contact information, messages, purchases and bookings; booking support for hotels and vacation rentals; and a platform for independent musicians to market and sell their music.

WordPress was originally developed for personal blogs. It has since been extended to support forums, membership sites, learning management systems and online stores. Like WordPress, GoogleSites lets users create websites with various embedded functions from Google, such as YouTube, Google Maps, Google Drive, calendar and online office applications.

[...] No-code platforms help increase the number of developers, in a time of increasing demand for software development. No-code is showing up in fields such as e-commerce, education and health care.

I expect that no-code will play a more prominent role in artificial intelligence, as well. Training machine-learning models, the heart of AI, requires time, effort and experience. No-code programming can help reduce the time to train these models, which makes it easier to use AI for many purposes. For example, one no-code AI tool allows nonprogrammers to create chatbots, something that would have been unimaginable even a few years ago.
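As a rough illustration of what such a tool automates, here is a minimal sketch in which a declarative configuration (standing in for choices made in a GUI) drives ordinary model training with scikit-learn. The config keys and the menu of models are assumptions for the example, not any product's actual format:

```python
# Hypothetical sketch: the "no-code" user only fills in user_config;
# the platform maps those choices onto real training code.
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

user_config = {"model": "decision_tree", "test_fraction": 0.2}

MODELS = {
    "decision_tree": DecisionTreeClassifier,
    "logistic_regression": LogisticRegression,
}

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=user_config["test_fraction"], random_state=0)

model = MODELS[user_config["model"]]()   # the GUI choice selects the estimator
model.fit(X_train, y_train)
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```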

I suppose that I expect the comments to be divided into two groups: those written by programmers and those that are not. But what do you think of the idea? How would you go about testing such software, and who is responsible for how the final code behaves?


Original Submission

Related Stories

Why Are There So Many Programming Languages? 68 comments

Over at ACM.org, Doug Meil posits that programming languages are often designed with certain tasks or workloads in mind, and in that sense most languages differ less in what they make possible, and more in terms of what they make easy:

I had the opportunity to visit the Computer History Museum in Mountain View, CA, a few years ago. It's a terrific museum, and among the many exhibits is a wall-size graph of the evolution of programming languages. This graph is so big that anyone who has ever written "Hello World" in anything has the urge to stick their nose against the wall and search section by section to try to find their favorite languages. I certainly did. The next instinct is to trace the "influenced" edges of the graph with their index finger backwards in time. Or forwards, depending on how old the languages happen to be.

[...] There is so much that can be taken for granted in computing today. Back in the early days everything was expensive and limited: storage, memory, and processing power. People had to walk uphill and against the wind, both ways, just to get to the computer lab, and then stay up all night to get computer time. One thing that was easier during that time was that the programming language namespace was greenfield, and initial ones from the 1950's and 1960's had the luxury of being named precisely for the thing they did: FORTRAN (Formula Translator), COBOL (Common Business Oriented Language), BASIC (Beginner's All-purpose Symbolic Instruction Code), ALGOL (Algorithmic Language), LISP (List Processor). Most people probably haven't heard of SNOBOL (String Oriented and Symbolic Language, 1962), but one doesn't need many guesses to determine what it was trying to do. Had object-oriented programming concepts been more fully understood during that time, it's possible we would be coding in something like "OBJOL" —an unambiguously named object-oriented language, at least by naming patterns of the era.

It's worth noting and admiring the audacity of PL/I (1964), which was aiming to be that "one good programming language." The name says it all: Programming Language 1. There should be no need for 2, 3, or 4. Though PL/I's plans of becoming the Highlander of computer programming didn't play out like the designers intended, they were still pulling on a key thread in software: why so many languages? That question was already being asked as far back as the early 1960's.

The author goes on to reason that new languages are mostly created for control and fortune, citing Microsoft's C#, their answer to Java, as an example of a middleware language they could control.

Related:
Non-Programmers are Building More of the World's Software
Twist: MIT's New Programming Language for Quantum Computing
10 Most(ly dead) Influential Programming Languages


Original Submission

It's the End of Programming as We Know it -- Again 37 comments

As AI assumes more software development work, developers may eventually be working with training models more than they do with coding tools:

Over the past few decades, various movements, paradigms, or technology surges -- whatever you want to call them -- have roiled the software world, promising either to hand a lot of programming grunt work to end users, or automate more of the process. CASE tools, 4GL, object-oriented programming, service oriented architecture, microservices, cloud services, Platform as a Service, serverless computing, low-code, and no-code all have theoretically taken the onerous burdens out of software development. And, potentially, threaten the job security of developers.

Yet, here we are. Software developers are busier than ever, with demand for skills only increasing.

[...] Matt Welsh, CEO and co-founder of Fixie.ai, for one, predicts that "programming will be obsolete" within the next decade or so. "I believe the conventional idea of 'writing a program' is headed for extinction," he predicts in a recent article published by the Association for Computing Machinery. "Indeed, for all but very specialized applications, most software, as we know it, will be replaced by AI systems that are trained rather than programmed."

In situations where one needs a "simple program -- after all, not everything should require a model of hundreds of billions of parameters running on a cluster of GPUs -- those programs will, themselves, be generated by an AI rather than coded by hand," Welsh adds.

Although some of the article delves into businesspeak, it does speculate on what the roles of IT professionals and developers may be in a future where most of the code writing grunt work is done by AI.

Previously:


Original Submission

  • (Score: 5, Insightful) by Anonymous Coward on Friday May 20 2022, @03:42PM (1 child)

    by Anonymous Coward on Friday May 20 2022, @03:42PM (#1246601)

    It's obviously created by non-programmers... severely retarded non-programmers.

    • (Score: 0) by Anonymous Coward on Friday May 20 2022, @06:29PM

      by Anonymous Coward on Friday May 20 2022, @06:29PM (#1246649)

      Fully-retarded non-programmers (FRNPs) is a more appropriate term.

  • (Score: 5, Insightful) by ilsa on Friday May 20 2022, @04:15PM (6 children)

    by ilsa (6082) Subscriber Badge on Friday May 20 2022, @04:15PM (#1246606)

    So we can look forward to an even greater acceleration of the number of hacked and broken systems than we are already seeing.

    Because apparently nobody learned anything from the Visual Basic years. At least I know I'll be employed for as long as I want to be.

    • (Score: 5, Insightful) by JoeMerchant on Saturday May 21 2022, @01:24AM (5 children)

      by JoeMerchant (3937) on Saturday May 21 2022, @01:24AM (#1246746)

      It isn't simplicity that makes code insecure, it's power.

      Most "no code" systems are actually not very powerful at all. A WYSIWYG editor (without macros, which are far from WYSIWYG) can only create static documents, if that's all you let it do, and if they're stored in a "non powerful" format, then they can't be trojaned. 100% secure.

      Power, like web pages that can launch executables on your PC (brought to you by Internet Explorer in the early days) - that's dangerous, especially when the executables essentially have root privileges.

      --
      🌻🌻 [google.com]
      • (Score: 3, Interesting) by Booga1 on Sunday May 22 2022, @06:53AM

        by Booga1 (6333) on Sunday May 22 2022, @06:53AM (#1246990)

        Even with Wordpress, crappy as it is, most of the vulnerabilities now come from plugins. For a basic blog it's actually fairly secure if you just use Wordpress by itself. It's when people add a plugin that "requires" admin access that things get nasty and suddenly the site is full of spam, serving malware.

        There's only so much Wordpress itself can do when the half-baked plugins have administrative rights to the entire site.

      • (Score: 2) by ilsa on Tuesday May 24 2022, @05:31PM (3 children)

        by ilsa (6082) Subscriber Badge on Tuesday May 24 2022, @05:31PM (#1247508)

That is true. It is also true that these low/no-code tools may start off simple, but they _never_ remain that way.

        It's not as if these tools are some unique new paradigm. It's yet another marketing bullshit-spin on technology that we've had for over 30 years in various forms. And those tools have been universally garbage. The products those tools produced have universally been garbage. Horribly inefficient. Security swiss cheese.

And worst of all, you end up with a segment of the workforce that are complete morons who think they're the best programmers in the world cause the ramp-up was "so easy". I'm all for empowering people, but there's a difference between empowerment and giving people rose-coloured glasses and an endless supply of rope to hang not only themselves but everyone around them.

        • (Score: 2) by JoeMerchant on Tuesday May 24 2022, @05:44PM (2 children)

          by JoeMerchant (3937) on Tuesday May 24 2022, @05:44PM (#1247513)

          In 1989 I "invented" a no-code signal processing software assembly system layered on a CAD tool we had for circuit design. Lay in your software modules like chips and connect them with "wires" to implement the signal flow. Yes, LabView was actually already around in 1989 as well, but being 1989 it took me 8 months of research and development on this, my Master's Thesis, before I stumbled upon LabView and also some Apple thing that was similar. Purdue published some papers on "systolic processing" which was a pretty good match for how my compiler implemented the signal flow behind the scenes, and of course, being 1989, it was "implemented" on a parallel machine which was scary-close in architecture to the processor used in the PS3 many years later: a Motorola 16 bit processor to administrate the system and an array of 8 DSPs to handle the signal flow work.

          So, yeah, 8 billion people on this planet, tens of millions of computer engineers, hundreds of thousands of "exceptionally good ones" spread across a few dozens of specialties. Not quite an infinite number of monkeys, but still thousands of scary sharp people in any given area of specialty all grinding on the same problems... we're gonna be repeating each other quite a bit.

          --
          🌻🌻 [google.com]
          • (Score: 2) by ilsa on Tuesday May 24 2022, @05:49PM (1 child)

            by ilsa (6082) Subscriber Badge on Tuesday May 24 2022, @05:49PM (#1247518)

            Very cool! But I'm not seeing the point you're trying to make.

            • (Score: 2) by JoeMerchant on Tuesday May 24 2022, @06:59PM

              by JoeMerchant (3937) on Tuesday May 24 2022, @06:59PM (#1247534)

              All Of This Has Happened Before And Will Happen Again

              https://cityoftongues.com/non-fiction/all-of-this-has-happened-before/ [cityoftongues.com]

              O.K. not _exactly_ that, but "exciting breakthrough tech" like code-free programming has been around for decades, even though it is introduced as new and innovative every few years. It's an inevitable feature of a large society where people "invent" things in ignorance of virtually identical things done long before they had their ideas.

              --
              🌻🌻 [google.com]
  • (Score: 5, Insightful) by Anonymous Coward on Friday May 20 2022, @04:34PM (11 children)

    by Anonymous Coward on Friday May 20 2022, @04:34PM (#1246611)

    This is no different from Amazon or eBay making it possible to set up a storefront without knowing how to code, or Excel making it possible to make spreadsheets and charts, or Discord/Slack or Twitch making it possible to set up bots that handle routine moderation tasks. Once upon a time that would have required programming, and then it didn't.

    This is just the normal process of coders building software that allows users to do things with their computers.

Writing an AI that can write code is a harder problem than even artificial general intelligence, because the AI must not only be able to write code, but understand what the non-programmer human wants. The hard part was never learning how to program the system. People have been making this same mistake ever since the 1960s, when they thought the hard part of programming was learning the language and invented SQL and COBOL to make the syntax more "English like." All this accomplished was making the language worse, because the hard part is untangling the muddled, incoherent thoughts floating around the mind of whoever wants the software built. Most humans are not smart enough to do this.

    These "no code" things are the computing equivalent of Lego bricks. But just because you can put Legos together, that doesn't make you an engineer.

    • (Score: 0) by Anonymous Coward on Friday May 20 2022, @05:54PM

      by Anonymous Coward on Friday May 20 2022, @05:54PM (#1246641)

      There is nothing wrong with SQL. (Perfectionists excepted.)
      It's a very limited language that allows you to work with data adhering to a rigid model: the relational data model.
      What is true is that you still have to have the skills of a scripter in addition to relational data skills to use it.

    • (Score: 2) by mcgrew on Friday May 20 2022, @08:37PM (1 child)

      by mcgrew (701) <publish@mcgrewbooks.com> on Friday May 20 2022, @08:37PM (#1246685) Homepage Journal

You needed at least a Master's in mathematics to write "hello world" before this lady [wikipedia.org] came along. She came up with assembly, which was the first programming language, and later FORTRAN and COBOL. Her mission was to make programming simple enough that anyone could do it. 1906-1992.

      --
      mcgrewbooks.com mcgrew.info nooze.org
      • (Score: 5, Insightful) by Thexalon on Friday May 20 2022, @10:33PM

        by Thexalon (636) on Friday May 20 2022, @10:33PM (#1246722)

And they ran into the same problem that exists today: the hard part of software development is usually figuring out exactly what a system should do, not describing that process in code. We've tried visual coding tools and lots of different kinds of language models, and there's always a bunch of complexity that cannot be simplified, because the human processes being replaced by software are complicated, and humans can adapt and learn in a way software can't.

        --
        The only thing that stops a bad guy with a compiler is a good guy with a compiler.
    • (Score: 2) by JoeMerchant on Saturday May 21 2022, @01:27AM (7 children)

      by JoeMerchant (3937) on Saturday May 21 2022, @01:27AM (#1246750)

      >Writing an AI that can write code is a harder problem than even artificial general intelligence

      It all depends on the application, and the semantics. Some "AI" is already automatically programming "things" - it just needs to know what its optimization weights are, what sequences are permitted, and the dataset to optimize for. That's not "AI" enough for you? This is why "AI" is always 5 years away, because every time it conquers a domain that used to be considered "proof of AI" the goalposts are moved.

      --
      🌻🌻 [google.com]
      • (Score: 4, Insightful) by PiMuNu on Sunday May 22 2022, @08:33AM (6 children)

        by PiMuNu (3823) on Sunday May 22 2022, @08:33AM (#1246999)

You should include the full quote:

        > Writing an AI that can write code is a harder problem than even artificial general intelligence, because the AI must not only be able to write code, but understand what the non-programmer human wants

This has never been achieved by AI. It is barely achievable by non-AI.

        • (Score: 2) by JoeMerchant on Sunday May 22 2022, @10:56PM (5 children)

          by JoeMerchant (3937) on Sunday May 22 2022, @10:56PM (#1247117)

          >It is barely achievable by non-A I

Agreed. And my Google Home frequently misunderstands the simplest of commands. However, the Google story matching algorithm (which can't really be called anything but AI) seems to know what I'm thinking before I do - far better than most people I ask to look up things for me.

          --
          🌻🌻 [google.com]
          • (Score: 2) by ilsa on Tuesday May 24 2022, @05:46PM (4 children)

            by ilsa (6082) Subscriber Badge on Tuesday May 24 2022, @05:46PM (#1247515)

            Yes, it *can* be called anything but AI. AI is artificial intelligence. No system on earth has achieved that yet. Everything we have right now falls under the category of either "Expert system" or "machine learning", which have specific definitions. WRT ML, all it is doing is correlating very large quantities of data and coming up with a prediction on what the next piece of data will be. That's it. No unique insights. Just plain old regurgitation of facts based on required match criteria.
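That "predict the next piece of data" idea can be made concrete with a deliberately tiny toy model. The bigram sketch below is only an illustration of prediction-from-correlation, not a claim about how any modern system is built:

```python
# Toy bigram "model": count which word follows which, then predict the
# most frequent successor. Prediction from correlation, nothing more.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat slept".split()

successors = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    successors[current][following] += 1

def predict(word: str) -> str:
    """Return the word most often seen after `word` in the corpus."""
    return successors[word].most_common(1)[0][0]

print(predict("the"))   # -> "cat" (seen twice after "the", vs "mat" once)
```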

            • (Score: 2) by JoeMerchant on Tuesday May 24 2022, @06:28PM (3 children)

              by JoeMerchant (3937) on Tuesday May 24 2022, @06:28PM (#1247529)

Self-definition of terms like "intelligence" is very slippery... as intelligent as an average beagle? A komodo dragon? Are we even qualified to judge the intelligence of a cold-blooded animal? So "human level intelligence" at what task? Playing a game like Go? Retrieving relevant medical research? Predicting what people are likely to buy on their next trip to the store? These are all things that humans do with a certain level of proficiency, and the algorithms are getting better and better compared to even the best humans in the field. The Turing test? At this point success on the Turing test has more to do with the interface than with how the machine responds to inputs.

              > No system on earth has achieved that yet.

              In all fields at once? No. In specific fields, fields thought "uncrackable by computers" just a decade or two ago, they are now exceeding performance of the best humans on the planet.

              --
              🌻🌻 [google.com]
              • (Score: 2) by ilsa on Thursday May 26 2022, @01:12PM (2 children)

                by ilsa (6082) Subscriber Badge on Thursday May 26 2022, @01:12PM (#1247984)

                As long as you let marketing control the definitions of words, then yes, all bets are off. Anything can mean anything.

                But in real comp sci, AI, ML, etc, all have formal definitions, and we definitely don't have AI.

                • (Score: 2) by JoeMerchant on Thursday May 26 2022, @02:12PM (1 child)

                  by JoeMerchant (3937) on Thursday May 26 2022, @02:12PM (#1247997)

                  So, "real comp sci" disagrees with the Oxford English Dictionary?

                  noun

                  the theory and development of computer systems able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, decision-making, and translation between languages.

                  --
                  🌻🌻 [google.com]
                  • (Score: 1, Informative) by Anonymous Coward on Friday May 27 2022, @08:15AM

                    by Anonymous Coward on Friday May 27 2022, @08:15AM (#1248238)

                    Yes. Obviously. It is not uncommon for jargon used in a particular field to disagree with the general vernacular of society.

  • (Score: 5, Insightful) by bloodnok on Friday May 20 2022, @04:36PM (5 children)

    by bloodnok (2578) on Friday May 20 2022, @04:36PM (#1246612)

    I imagine many are concerned that this will lower code quality. But before anyone condemns this, ask yourself where the quality bar is set currently. It is hard to believe that quality can fall much further.

    There are few examples of good code, good systems and well-managed projects out there, and those few that exist (I'm thinking of postgres and struggling to find others) are unlikely to adopt a no-code approach.

    There is so much wrong with current software development practices that I can't get excited about the idea of non-programmers making things any worse.

    __
    The major

    • (Score: 4, Informative) by stormreaver on Friday May 20 2022, @05:02PM (4 children)

      by stormreaver (5101) on Friday May 20 2022, @05:02PM (#1246621)

I'm not stressing over "no code". Having been writing software since 1985, I find it painfully obvious that logic and subsystem communications are by far the hardest part of programming. The user interface, which is 90% of what "no code" platforms provide, is far and away the least significant part of software development.

Anyone who can't write code-based programs is stumped by program logic (sequencing, iteration, decisions; the entire concept of programming) and subsystem communication (the whole general category of subsystem I/O), neither of which is at all addressable by "no code" systems.

      "No code" can get you just far enough to make you hate yourself, but no further. Barring an A.I. that can understand human wants and desires based on very vague criteria, such as, "I want a fun game the involves shooting stuff," the "no code" systems are just setting people up for failure. Take Wix, for example (since it was mentioned in the article). It is point and click for the user interface and whatever predefined (by programmers -- hint, hint) functions are available. But if I want my Wix-based website to use my external PostgreSQL server for my data storage and stored procedures, I'm out of luck. This is something that a semi-competent programmer can create rather readily (not counting the schema design and store procedure programming), but that "no code" languages cannot do unless they are programmed (by a programmer; hint, hint) to do so. And even then, they can not create customized interactions. They will always be limited to whatever features a programmer (hint, hint) includes in them.

      In summary, "no code" systems are lowest-common-denominator systems. They always have been (I first saw them in the early 1990's) and always will be.

      • (Score: 4, Informative) by bloodnok on Friday May 20 2022, @07:44PM

        by bloodnok (2578) on Friday May 20 2022, @07:44PM (#1246665)

The user interface, which is 90% of what "no code" platforms provide, is far and away the least significant part of software development.

Although I'm inclined to agree with you, being more of a server-side guy myself, I find that UIs are often mysterious and appalling. Implementing the mechanics of a UI is simple enough, but making sane choices of labels, field positioning and so on seems to be beyond most development groups.

        I have lost track of the number of web pages and apps where the use of even a simple thing like a submit button is inconsistent from one page to the next. Or apps with important buttons (like submit) not visible until you scroll down and with no indication that anything is beyond the visible part of the page. Or radio buttons that indicate, not the current state of things, but what the developer thinks you might want to change it to.

        And of course the current fashion of changing parts of the UI every other iteration does nothing to help.

        __
        The Major

      • (Score: 3, Interesting) by Snotnose on Friday May 20 2022, @10:34PM (2 children)

        by Snotnose (1623) on Friday May 20 2022, @10:34PM (#1246723)

The user interface, which is 90% of what "no code" platforms provide, is far and away the least significant part of software development.

        I'm gonna have to call horse hockey on this. I've been at this for 40 years. Device drivers, Real Time Systems, and I helped port Linux to the SH-4 20 years ago.

The hardest thing I've ever coded is the UI. Not that I can't make one, that's fairly straightforward. More like I'm really good at saying "this UI sucks ass", but I really suck at saying "here's how to make it better".

        Then again, I've never used a "no code" platform. It's always been 8086 assembly, C, or Java.

        --
        When the dust settled America realized it was saved by a porn star.
        • (Score: 1, Interesting) by Anonymous Coward on Saturday May 21 2022, @01:23AM

          by Anonymous Coward on Saturday May 21 2022, @01:23AM (#1246745)

          A sloppy device driver doesn't work. A sloppy UI works, sort of. So in that sense, device drivers are harder to write than a UI.

But look at the UIs for complex applications. They're typically difficult to learn, difficult to use, prone to hangs and crashes, give useless errors if some dependency is corrupt, and eat memory like mad. I use IntelliJ IDEA for work and it's a fine program, but when an obscure error starts to appear, if I can't figure it out in two hours I uninstall the whole program, delete leftover files, restart my computer, and reinstall. And arguably IntelliJ is a high-quality UI product.

        • (Score: 2) by stormreaver on Sunday May 22 2022, @01:10AM

          by stormreaver (5101) on Sunday May 22 2022, @01:10AM (#1246955)

          The hardest thing I've ever coded is the UI.

Everyone has different strengths and weaknesses, so I can understand if UIs are harder for some than for others. However, "no code" systems aren't going to help anyone with the qualitative parts of user interfaces. They will just help with the quantitative parts (the nuts and bolts).

          For any given program, the user interface will be the smallest part (assuming a proper separation between UI and business logic, a proper OOP GUI toolkit, and IDE support for GUI building). This has proven itself to be true of most types of software I've written since 1985 (with games being a notable exception, where the interface and logic are roughly 50/50). The user interface tends to be about five to ten percent of the overall program, whereas the business logic (the largest component) and data I/O (usually databases) are the rest.

Your mention of device drivers reminded me that I used to make simple device drivers (mouse, keyboard, serial) in assembly and C back in the MS-DOS years, but I find modern device drivers all but impossible to comprehend.

  • (Score: 4, Touché) by Ingar on Friday May 20 2022, @04:44PM (2 children)

    by Ingar (801) on Friday May 20 2022, @04:44PM (#1246614) Homepage

    And that's why so much software these days is bloated trash.

    Yeah, I'm a programmer.

    • (Score: 3, Informative) by acid andy on Friday May 20 2022, @08:08PM (1 child)

      by acid andy (1683) on Friday May 20 2022, @08:08PM (#1246673) Homepage Journal

Yep. Regarding web development, if you ever view the HTML, Javascript and related files produced by some of these tools, it's enough to induce nightmares, especially if you have to port it to a different development environment. It's no wonder websites these days are slow, glitchy pieces of crap that constantly stop working unless you're running the very latest G**gle Chrome with all the scripts, ads, and cross-domain abominations enabled.

      --
      If a cat has kittens, does a rat have rittens, a bat bittens and a mat mittens?
      • (Score: 4, Interesting) by JoeMerchant on Saturday May 21 2022, @01:30AM

        by JoeMerchant (3937) on Saturday May 21 2022, @01:30AM (#1246752)

Sounds like .NET code to me. When I make a feature addition in Qt there's a handful of files that get touched, maybe 4-6, and they have clear additions and deletions that you can read in a git diff easily. When my .NET colleagues make a minor change, there are 30+ files with changes scattered throughout, and understanding what was changed from the "source code" diff would amount to a major reverse-engineering effort.

        Same for those "HTML generation tools" put out by Redmond over the years. Why do something straightforward when you can obfuscate the back end code so hopelessly that you are forced to use their tools to make changes?

        --
        🌻🌻 [google.com]
  • (Score: 5, Insightful) by Anonymous Coward on Friday May 20 2022, @05:04PM (5 children)

    by Anonymous Coward on Friday May 20 2022, @05:04PM (#1246623)

As a programmer myself, I have been watching tools that could enable non-programmers to program come and go for about 30 years now. The value proposition is obvious, and it is pretty easy to sell these to non-programmers. The problem with these tools is that they are inflexible. They are designed to do a limited set of things, with a small set of customizations supported. When you try to use one in the real world, someone inevitably insists on customizing the thing in a way it wasn't designed for. Next thing you know, you go from "So easy I don't need programmers" to "I need programmers for the tricky bit" to either "I need an expert for the parts that the programmers can't figure out" or "My programmers have abandoned the tool and started working in a general purpose language." By making the things within the design scope easier, you have made everything else much, much harder.

    • (Score: 1, Insightful) by Anonymous Coward on Friday May 20 2022, @05:59PM (3 children)

      by Anonymous Coward on Friday May 20 2022, @05:59PM (#1246643)

      As long as you have realistic expectations of what you can produce with "no code" products, you are fine.
      "No code" allows you to make some simple customizations to a "canned" product. That's all it is. If you like the "canned" product, then fine.

      • (Score: 0) by Anonymous Coward on Friday May 20 2022, @06:01PM

        by Anonymous Coward on Friday May 20 2022, @06:01PM (#1246645)

        I will add, you ought to expect to "throw away" your "no code" solution if your business needs grow. Maybe it is best viewed as a "cheap, light duty" product.

      • (Score: 3, Touché) by sjames on Saturday May 21 2022, @03:01PM

        by sjames (2882) on Saturday May 21 2022, @03:01PM (#1246843) Journal

        The problem is, the people who are sufficiently knowledgeable to form realistic expectations are programmers, exactly the people these products are NOT marketed to.

      • (Score: 0) by Anonymous Coward on Saturday May 21 2022, @03:24PM

        by Anonymous Coward on Saturday May 21 2022, @03:24PM (#1246849)

        The only time I've had a manager with realistic expectations was when I was at a startup that was still in the all-engineers phase.

    • (Score: 2) by SomeGuy on Saturday May 21 2022, @02:01AM

      by SomeGuy (5632) on Saturday May 21 2022, @02:01AM (#1246761)

      The thing is, everything above the level of a transistor is "limited" in some way.

      Every language, every environment, every set of libraries, is designed with a specific problem domain in mind. Stray too far from that problem domain, and it probably won't do what you want it to do. It becomes a matter of finding the right tool for the job.

      One wouldn't screw in a screw with a hammer. Good luck getting that nail in with a pair of pliers.

No one is going to waste their time writing the equivalent of a shell script in assembler, or laying out their own transistors on a chip, even if it is possible. Most would rather drag and drop a few fields onto a GUI form than write an entire new database management system in C.

One always has to look at the requirements and see if a specific language or environment will meet their needs well enough, and draw a line at what they think the limits should be. The best that can be done is to make sure it can interface with other languages or environments; of course, that may not help if the missing functionality is something core to the language itself.

Yes, unfortunately some "needs" have changed faster than these tools can be adapted. When the web became popular, many scrambled to embed their existing Win32 applications into web browsers via ActiveX. Similarly with "cloud" stuff: all that custom web scripting needs to be modified to support clustering or whatever.

And unfortunately, many choose the wrong tool for the job simply because it was popular or it was the only way to do one part. Then it winds up a clunky mess. For example, there are now many web-based spreadsheets, some of them even starting to look slick, but there are many things that they will never be able to do, or do well, compared to an OS-native binary spreadsheet. A spreadsheet is outside the problem domain of a web browser, whose problem domain is to retrieve and render documents.

      The point is, there is no single tool that can do everything, and those that try to do more than one wind up doing it poorly.

  • (Score: 3, Insightful) by looorg on Friday May 20 2022, @05:09PM (5 children)

    by looorg (578) on Friday May 20 2022, @05:09PM (#1246625)

I doubt I'm alone in this here, but how do they have a steep learning curve? I learned to program in BASIC from a few books written in a language (English) that I barely spoke at the time. On my own. No internet. Nobody to ask. I'm not saying I was great or anything, but still. It was not a steep learning curve. Then I somehow managed to learn assembler, C and Pascal without the internet or anyone to really ask either. Just by reading books and doing it. Then, once I already knew it, you eventually found other people that also knew it and you could exchange ideas and tips and tricks or what you will from that.

I'm not saying things have not become more complex over the ages, but this steep learning curve is a myth. But then I guess it goes hand-in-hand with this whole idea that everyone can learn or should be a coder.

Also this steep learning curve bullshit just reinforces the image that computers are big black magic boxes and getting them to do anything is borderline wizardry.

    So actual programmers now code nocode tools for the nonprogrammers to do nonprograms on? I doubt this will solve the "shortage" of coders. Oh right they are all called "developers" now instead. Right. My bad!

    • (Score: 3, Insightful) by fliptop on Friday May 20 2022, @06:21PM

      by fliptop (1666) on Friday May 20 2022, @06:21PM (#1246648) Journal

      I doubt I'm alone in this here but how do they have a steep learning curve?

      Personally, I believe the only thing that's steep when it comes to programming is understanding logic. It's a wall you either climb or walk around. Nonprogrammers seem content w/ doing the latter.

      --
      Our Constitution was made only for a moral and religious people. It is wholly inadequate to the government of any other.
    • (Score: 0) by Anonymous Coward on Friday May 20 2022, @06:54PM

      by Anonymous Coward on Friday May 20 2022, @06:54PM (#1246653)

      There actually is a steep learning curve, which presents itself in several dimensions:
      1. Learn troubleshooting skills
      2. Learn code organization skills
      3. Learn the idiosyncrasies of a particular language
      4. Learn the idiosyncrasies of a particular environment
      5. Learn the requirements of the program you are building
      6. Learn how to use each of the tools you are going to use
         

    • (Score: 2) by acid andy on Friday May 20 2022, @08:14PM (1 child)

      by acid andy (1683) on Friday May 20 2022, @08:14PM (#1246676) Homepage Journal

I guess part of the problem is some people expect to only ever do things related to coding and learning to code from 9-5. They see it as work and a way they think they'll make big money, rather than an interesting pastime. You need to have a real passion for it, and be learning about it for fun in your spare time, to really succeed. So I can see why someone might call that a steep learning curve. I guess it's part of devaluing or misunderstanding the value of coding skills.

      --
      If a cat has kittens, does a rat have rittens, a bat bittens and a mat mittens?
      • (Score: 1, Interesting) by Anonymous Coward on Saturday May 21 2022, @01:26AM

        by Anonymous Coward on Saturday May 21 2022, @01:26AM (#1246748)

        You are onto something.

        The technologies are not managed or developed in a professional way. We use open source basically because it is free, but then we put up with no documentation and amateurish, crap APIs that make it very hard to just pick up the technology and start using it. And since it's FREE, why would anyone bother producing those things for the programmers? And many places don't limit the tech stack enough. If you use several complicated, totally different technologies to get the job done, are you better off than if you used only a few that everybody could truly master?

    • (Score: 1, Interesting) by Anonymous Coward on Saturday May 21 2022, @01:36AM

      by Anonymous Coward on Saturday May 21 2022, @01:36AM (#1246754)

      Either you have an unusual level of creativity, or the books you were working with were of higher than average quality. I cut my teeth on Basic and then Pascal back before I had internet access, and I was motivated but it was a nightmare. Book: "This is a variable. That is an if-statement." Me: "Okay, I understand them. Now what can I do with it?" Book: "Good luck with that." Taking the leap from what I read to making a choose-your-own-adventure game or a dice roller or a todo list, which are things I could do in a few hours now, seemed impossible then. To put it another way, I had an easy time learning how to fit any two Lego pieces together but no idea how to make a pirate ship.

      You might be flat out smarter than me. I'm open to that possibility. Today, more than twenty years later, I've debugged printer driver errors and written web services and Android applications. But the learning curve was horrible for me at first. I suspect a lot of people could be proper software engineers if they got a better education, and only washed out because of a bad environment.

  • (Score: 0) by Anonymous Coward on Friday May 20 2022, @05:29PM (1 child)

    by Anonymous Coward on Friday May 20 2022, @05:29PM (#1246634)

Solving problems by throwing overpowered hardware at them to cut corners on programmers' salaries will now cost more and more.

    • (Score: 2) by Dr Spin on Saturday May 21 2022, @11:00PM

      by Dr Spin (5239) on Saturday May 21 2022, @11:00PM (#1246938)

All real coders write PHP by throwing cow-pats at the screen with a Wiimote.
It's been like that forever (for social media values of forever).

      --
      Warning: Opening your mouth may invalidate your brain!
  • (Score: 0) by Anonymous Coward on Friday May 20 2022, @05:36PM (2 children)

    by Anonymous Coward on Friday May 20 2022, @05:36PM (#1246635)

Anyone remember Garry Kitchen's GameMaker [wikipedia.org] for the C64? It wasn't no-code, and I think it would be extremely difficult to get to a point of no-code logic, but it was definitely simplified coding. Many tools were available for editing things like sprites, backgrounds, game music, and sounds. There was still some coding involved, but the language was simplified, and the menu-driven interface for writing the code had some features that might not be all that different from the code editors in modern IDEs. I suppose this could be done visually using flowcharts like the Model Builder in ArcGIS or equivalent functionality in QGIS, which might hide the coding from the user. However, you still have to design the game logic one way or another, whether it's done visually or through simplified code. Using a tool like GameMaker would have some performance hit, but it was vastly simpler than writing games in assembly language for the 6510. No, you wouldn't use a tool like that to write the KERNAL or the BASIC interpreter, but it made creating games a lot simpler.

    In many ways, this was one of the earliest attempts at a game engine, and many of these concepts exist in modern game engines. Unreal, Unity, and Godot all simplify the process of creating games, and significantly reduce the amount of coding that needs to be done. Godot uses GDScript, which is fairly similar to Python, though more limited. A lot of editing is done visually. Sure, simplifying the process of creating games means that there are more low-quality games out there. But even people with limited programming experience can make quality games if they put in the effort. Again, it's not no-code, but it's dramatically simpler, and made game programming feasible for many people for whom game programming wouldn't otherwise be accessible.

    • (Score: 2) by looorg on Friday May 20 2022, @05:56PM

      by looorg (578) on Friday May 20 2022, @05:56PM (#1246642)

I don't recall that one in particular, but I recall a lot of similar projects such as the Shoot-'Em-Up Construction Kit (SEUCK), and then you had various "demo-makers" where you could select things (select a scroller, write the scrolltext, pick a tune, pick a few images, some effects, etc.) and then it would make them by just putting the various parts together. Sure, no code, or little code. But as noted it was somewhat limited in scope and span. Yes, there were sprite editors, some code screens or various boxes to put input into, etc. But quite limiting in what you could do. They were tools for you to more or less make one thing and then endless derivatives of that.

I think, or seem to recall, that some of the early Boulder Dash games came with their own construction kit (or level editor/maker) too. You could make your own levels in that regard, but still then limited to doing Boulder Dash maps and levels essentially.

    • (Score: 0) by Anonymous Coward on Saturday May 21 2022, @09:36AM

      by Anonymous Coward on Saturday May 21 2022, @09:36AM (#1246807)

      I remember it, but time has faded those memories. Two things stand out:

      1. Nobody called it “GameMaker”, it was always spoken of as “Gary Kitchen’s GameMaker”. Now that’s marketing done right.

      2. I may be wrong but you wrote your program by picking out commands and parameters from contextualized lists, kinda like Maniac Mansion. Programming by picking out words with the joystick was less than fun for me.

  • (Score: 0) by Anonymous Coward on Friday May 20 2022, @05:51PM (7 children)

    by Anonymous Coward on Friday May 20 2022, @05:51PM (#1246640)

More as an absolute quantity? - i.e., last year there were X no-code software artifacts, now there are Y...
More as a percentage? - it has grown to Z% of all software artifacts...

Oh what, you have no actual numbers? No definition of software? Nothing measurable? ...

    Is it software if no code went into it? (web pages used to be documents... not software.)

    • (Score: 0) by Anonymous Coward on Friday May 20 2022, @07:48PM (6 children)

      by Anonymous Coward on Friday May 20 2022, @07:48PM (#1246666)

      web pages have always been code. markup language interpreted at run-time by a compiler (browser) ipso facto noto nocodeo.

      • (Score: 2, Insightful) by shrewdsheep on Friday May 20 2022, @08:36PM (2 children)

        by shrewdsheep (5215) on Friday May 20 2022, @08:36PM (#1246684)

Markup language means that pieces of text are annotated with meta-information. This does not qualify as any programming, let alone Turing-completeness. Actually, the only thing you can do is parse it; the rest is fantasy.

        • (Score: 2) by mcgrew on Friday May 20 2022, @08:47PM

          by mcgrew (701) <publish@mcgrewbooks.com> on Friday May 20 2022, @08:47PM (#1246691) Homepage Journal

          I agree, but I have written a useful computer program using nothing but HTML. [mcgrew.info] The article starts "My intention wasn’t to use HTML as a programming language—it’s a markup language. I didn’t realize what I had done until I had done it."

The one command in the hypertext markup language is "GOTO", or "JMP". In HTML it's "<a href".

          The code can be found at mcgrewbooks.com/Scoreboard. It's a scoreboard for shuffleboard.
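The trick generalizes: treat each page as a program state and each <a href> as a GOTO to the next state. Here is a minimal sketch of that principle (not the actual Scoreboard code; the file names and layout are invented) that generates a tiny two-player score counter as plain HTML pages:

```python
# Hypothetical sketch: one HTML file per (a, b) score state; every link
# is a jump to an adjacent state. Clicking a link "executes" one GOTO.
def page_name(a: int, b: int) -> str:
    return f"score_{a}_{b}.html"

for a in range(4):
    for b in range(4):
        links = []
        if a < 3:
            links.append(f'<a href="{page_name(a + 1, b)}">Point for A</a>')
        if b < 3:
            links.append(f'<a href="{page_name(a, b + 1)}">Point for B</a>')
        body = f"<h1>A: {a} / B: {b}</h1>\n" + "<br>\n".join(links)
        with open(page_name(a, b), "w") as f:
            f.write(f"<html><body>\n{body}\n</body></html>")

# Start at score_0_0.html in a browser and click through the states.
```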

          --
          mcgrewbooks.com mcgrew.info nooze.org
        • (Score: 1, Informative) by Anonymous Coward on Sunday May 22 2022, @09:43PM

          by Anonymous Coward on Sunday May 22 2022, @09:43PM (#1247104)

Markup languages can be Turing-complete. Just look at XSLT or TeX or the Game of Life. And programming languages can be non-Turing-complete. See basic SQL or Datalog or Charity and any other total functional language.

      • (Score: 2) by sjames on Saturday May 21 2022, @04:19PM (1 child)

        by sjames (2882) on Saturday May 21 2022, @04:19PM (#1246855) Journal

Only if you consider a Word document to be code. It is a bit obfuscated because the tags aren't in the form of human-readable text, but it can be translated. If you go back a ways, there was the old WordStar where the 'codes' (tags) were explicit and you could see them with a 'show code' mode.

        Static web pages are DOCUMENTS. They are data that tells code (the browser engine itself) what to do.

        Javascript does complicate things. That is code that interacts with the document (through the DOM: Document Object Model). When non-programmers produce web pages, their tool may (or may not) offer a menu of canned bits of code that can be attached to bits of the DOM for a desired effect. This is not coding, this is using code that someone else wrote. It is similar to a programmer calling library functions except that in the case of web pages, the author is often un-qualified to vet the quality of the canned code and he/she often has no idea how it is actually attached.

The result tends to be a bulky thing packed with a few kitchen sinks' worth of unused and unneeded code just waiting to be exploited. Let an actual programmer do the scripting and you get a LOT less script that does exactly what is wanted, without needing the browser to run on a multi-core CPU to have a chance of rendering the page any time today. If that programmer is skilled, it's also less exploitable. It might even be portable.

        If writing web pages is computer programming, then so is setting a DVR to record something.

        • (Score: 1, Interesting) by Anonymous Coward on Sunday May 22 2022, @01:10AM

          by Anonymous Coward on Sunday May 22 2022, @01:10AM (#1246956)

Many people cannot use a DVR. Instructing a machine to do an operation at some temporal shift from now is coding. Get over it.

      • (Score: 2) by PiMuNu on Sunday May 22 2022, @08:41AM

        by PiMuNu (3823) on Sunday May 22 2022, @08:41AM (#1247000)

        HTML is not Turing complete.

        https://stackoverflow.com/questions/30719221/is-html-turing-complete [stackoverflow.com]

        I assert that in the context of this discussion, that means that writing HTML is not programming/coding in any reasonable sense.

  • (Score: 2) by istartedi on Friday May 20 2022, @09:00PM

    by istartedi (123) on Friday May 20 2022, @09:00PM (#1246698) Journal

    Lim "no code" (specification accuracy) --> 100% = code.

    --
    Appended to the end of comments you post. Max: 120 chars.
  • (Score: 0) by Anonymous Coward on Saturday May 21 2022, @01:09AM (1 child)

    by Anonymous Coward on Saturday May 21 2022, @01:09AM (#1246742)

    Steep learning curve? Are you serious? Any moron can program in Python or Java.

    • (Score: 1, Insightful) by Anonymous Coward on Saturday May 21 2022, @01:30AM

      by Anonymous Coward on Saturday May 21 2022, @01:30AM (#1246751)

      Really? Because learning to program in Java or Python is very little about learning the language itself and almost completely about learning the frameworks and libraries associated with them. That part is not easy.

  • (Score: 2) by Mojibake Tengu on Saturday May 21 2022, @01:25AM

    by Mojibake Tengu (8598) on Saturday May 21 2022, @01:25AM (#1246747) Journal

    What I am scared most of is the non-programmers actually building compilers.

    --
    Respect Authorities. Know your social status. Woke responsibly.
  • (Score: 0) by Anonymous Coward on Saturday May 21 2022, @04:17AM

    by Anonymous Coward on Saturday May 21 2022, @04:17AM (#1246778)

    I think that makes them programmers.

  • (Score: 4, Interesting) by jb on Saturday May 21 2022, @07:30AM (1 child)

    by jb (338) on Saturday May 21 2022, @07:30AM (#1246799)

    Is it just me, or does this sound like exactly the same marketing hype that surrounded so-called 4GLs, way back in the 1980s?

    The problem back then was *not* that it was difficult to learn to write programs (it wasn't then and it still isn't now -- unless you're unfortunate enough to have a poor teacher or to pick a badly written book, but that's true of *any* discipline).

    The real problem then (and I suspect still now) was that *software engineering* (which is much broader than just programming) is a non-trivial discipline ... and those who opted for the "simplicity" of the 4GL "draw your program" approach tended to be even less likely to study *the rest* of software engineering than those who took the time to learn to program were. The results were, for the most part, predictable.

    Just as understanding how to use a torque wrench does not make me an automotive engineer, understanding how to write a program [or having some fancy tool to do it for them] does not make someone a software engineer.

    • (Score: 0) by Anonymous Coward on Saturday May 21 2022, @06:40PM

      by Anonymous Coward on Saturday May 21 2022, @06:40PM (#1246893)

      It's just the propaganda machine. There will be more of this going into fall. You can't take it seriously.

      - People who build software systems are programmers
      - More people are building software systems
      - Somehow these people aren't programmers

      Eww nerds. Ok. Can't hear it over the pile of cash I'm accumulating.
