posted by hubie on Wednesday March 15 2023, @06:37AM   Printer-friendly
from the things-expand-to-exceed-the-space-provided dept.

Hackaday has a story about a simple non-scientific calculator that packs an Alwinner A50 tablet SoC and the Android operating system:

As shipped they lack the Android launcher, so they aren't designed to run much more than the calculator app. Of course that won't stop somebody who knows their way around Google's mobile operating system for very long - at the end of the review, there are some shots of the gadget running Minecraft and playing streaming video.

But it does raise the question of why such a product was put into production when the same task could have been performed by a very cheap microcontroller. Further, having gone to that expense, they made it a non-scientific machine, not even bestowing it with anything that could possibly justify the hardware.

Embedded has a more general related post about overengineering in embedded systems:

Embedded systems have traditionally been resource-constrained devices that have a specific purpose. They are not general computing devices but often some type of controller, sensor node, etc. As a result, embedded systems developers often are forced to balance bill-of-material (BOM) costs with software features and needs, resulting in a system that does a specific purpose efficiently and economically.

Over the last few years, I've noticed many systems being built that seem to ignore this balance. For example, I've seen intelligent thermostats that could be built using an Arm Cortex-M4 with a clock speed of less than 100 MHz and several hundred kilobytes of memory. Instead, these systems are designed using multicore Arm Cortex-M7 (or even Cortex-A!) parts running at 600 MHz+ with several megabytes of memory! This leads me to ask, are embedded systems developers today overengineering their systems?

I think there are more systems today that are designed with far more memory and processing power than is necessary to get the job done. To some degree, the push for IoT and edge devices has driven a new level of complexity into embedded systems that were once optimized for cost and performance. In addition, connectivity and the need to potentially add new features to a product for a decade or more into the future are leading developers to overestimate their needs and overengineer their systems.

While leaving extra headroom in a system for future expansion is always a great idea, I've seen the extras recently move into the excess. It's not uncommon for me to encounter a team that doesn't understand its system's performance or software requirements. Yet, they've already selected the most cutting-edge microcontroller they can find. When asked about their part selection based on requirements, I've heard multiple times, "We don't know, so we picked the biggest part we could find just in case". Folks, that's not engineering; that's design by fear!


Original Submission

 
This discussion was created by hubie (1068) for logged-in users only, but now has been archived. No new comments can be posted.
  • (Score: 5, Insightful) by sjames on Wednesday March 15 2023, @05:02PM (1 child)

    by sjames (2882) on Wednesday March 15 2023, @05:02PM (#1296272) Journal

    It's exactly the opposite problem, they're UNDERengineering.

    A skilled engineer could carefully work out exactly what is needed and do a custom design that exactly meets the specs. Then a skilled firmware developer could do their part and you have a traditional embedded device.

    This is what happens when a less skilled engineer picks a generic part that exceeds requirements because they aren't sure what the requirements are and don't want to undershoot. Then they want it to be capable of running Android or something like it so they can use a less skilled software developer who doesn't know enough to code to bare metal or something like FreeRTOS.

    Some of this can be justified by component prices coming way down, but there are other reasons to (for example) select a Cortex-M4 rather than a Cortex-A, including power consumption. There are even good cases for going with an 8-bit AVR over ARM. For example, an underclocked AVR will tolerate a serious undervoltage when batteries get low. But doing that requires a better grade of software developer.

    On the other hand, from the hardware hacker perspective, this is a great way to get really inexpensive hardware to have fun with. Often things like the calculator in TFA are a LOT cheaper than buying the parts as an individual in single quantity.

  • (Score: 3, Interesting) by DannyB on Wednesday March 15 2023, @05:38PM

    by DannyB (5839) Subscriber Badge on Wednesday March 15 2023, @05:38PM (#1296283) Journal

    You're right! It is UNDER engineering in a very real sense, because they aren't having to do a lot of the engineering work required for a less powerful processor.

    If a $10 calculator has this much compute power with a way over capable display for a calculator, this would seem to make it very attractive to people who like to hack hardware to do unintended things.

    It seems like it is engineered to make it easier or possible to cheat on tests using a cheap simple calculator which might be permitted on the test.

    --
    People today are educated enough to repeat what they are taught but not to question what they are taught.