from the things-expand-to-exceed-the-space-provided dept.
Hackaday has a story about a simple non-scientific calculator that packs an Allwinner A50 tablet SoC and the Android operating system:
As shipped they lack the Android launcher, so they aren't designed to run much more than the calculator app. Of course that won't stop somebody who knows their way around Google's mobile operating system for very long - at the end of the review, there are some shots of the gadget running Minecraft and playing streaming video.
But it does raise the question of why such a product was put into production when the same task could have been performed using a very cheap microcontroller. Further, having done so, they made it a non-scientific machine, not even bestowing it with anything that could possibly justify the hardware.
Embedded has a more general related post about overengineering in embedded systems:
Embedded systems have traditionally been resource-constrained devices that have a specific purpose. They are not general computing devices but often some type of controller, sensor node, etc. As a result, embedded systems developers are often forced to balance bill-of-material (BOM) costs with software features and needs, resulting in a system that serves a specific purpose efficiently and economically.
Over the last few years, I've noticed many systems being built that seem to ignore this balance. For example, I've seen intelligent thermostats that could be built using an Arm Cortex-M4 with a clock speed of less than 100 MHz and several hundred kilobytes of memory. Instead, these systems are designed using multicore Arm Cortex-M7 (or even Cortex-A!) parts running at 600 MHz+ with several megabytes of memory! This leads me to ask: are embedded systems developers today overengineering their systems?
I think there are more systems today that are designed with far more memory and processing power than is necessary to get the job done. To some degree, the push for IoT and edge devices has driven a new level of complexity into embedded systems that were once optimized for cost and performance. In addition, connectivity and the need to potentially add new features to a product for a decade or more into the future are leading developers to overestimate their needs and overengineer their systems.
While leaving extra headroom in a system for future expansion is always a great idea, I've seen the extras recently move into excess. It's not uncommon for me to encounter a team that doesn't understand its system's performance or software requirements. Yet, they've already selected the most cutting-edge microcontroller they can find. When asked how their part selection relates to their requirements, I've heard multiple times, "We don't know, so we picked the biggest part we could find just in case". Folks, that's not engineering; that's design by fear!
(Score: 4, Insightful) by TheReaperD on Wednesday March 15, @07:29AM
Designed by someone who padded their resume.
Ad eundum quo nemo ante iit
(Score: 4, Interesting) by Anonymous Coward on Wednesday March 15, @08:17AM (3 children)
Working in the space industry, where in many cases we don't have the luxury of putting in any cheap, super-high-performance SoC, we end up having to explain to "embedded software" engineers complaining about the resource constraints that it's their fault if they cannot implement basic spacecraft control algorithms in the same low resources that others could five years ago.
(Score: 5, Insightful) by TheReaperD on Wednesday March 15, @09:18AM (1 child)
Intel and Microsoft have long pushed the narrative that throwing more CPU and RAM at a problem is more economical than coming up with quality software, and the fact that components have become dirt cheap (comparatively) has just made the problem even worse. For the most part, it's not an issue, except that it leaves even more security holes for attackers to exploit.
The space program is where this comes to a screeching halt. Satellites, or worse, manned space vehicles, can't have components easily repaired or replaced, if at all. This means that chips and components have to be guaranteed to perform in the environment for the specified mission duration, which may span decades without maintenance. As a result, you can't use the newest and highest performance components hot off the silicon presses. At best, you can use five-year-old components that have just cleared stress testing. In reality, that's a highly optimistic scenario. Many units are a decade or more old because the design has to have been proven to work.
The problem, as you stated, is you get these new software engineers who have been trained to slop together a commercial application with high-level languages and framework upon framework, and now they're having to work with components that are a decade or more out of date and run high-end calculations on them, which requires low-level languages and working directly with the hardware with little to no abstraction layers. Modern programmers simply aren't trained to do this anymore. It might actually be more efficient for NASA and other space organizations to hire kids right out of high school and train them how to program from scratch. It'd save having to try to get them to unlearn all the bad habits that programmers are taught in college these days. It'd also have the advantage of repopulating the ranks of embedded and low-level driver programmers, which is quickly turning into a lost art. (Looking at driver install packages in the hundreds of MB!)
We can't keep counting on boomers to keep putting off retirement. We need to start training the next generations before all the boomers die off, which is already starting to happen. We wait another 5-10 years and it'll be too late. It takes years to train skills like this to any level of proficiency and years more for mastery!
Ad eundum quo nemo ante iit
(Score: 3, Interesting) by DannyB on Wednesday March 15, @05:27PM
I am reminded of the pilot episode of ST:TOS, or the two part episode The Menagerie, where the inhabitants living underground had forgotten how to repair the machines built by their ancestors.
The driver part is a tiny fraction of that. The rest is all advertising, spyware, bloatware, things you don't need and didn't ask for. Shovelware. Craptacular special offers if you sign up online. Etc.
Can't large language models be put in charge of resolving ethical issues related to the use of AI?
(Score: 2, Touché) by Anonymous Coward on Wednesday March 15, @10:55AM
You mean you can't just `pip install pyControl`?? That's what they say to do on StackOverflow!
(Score: 4, Insightful) by Anonymous Coward on Wednesday March 15, @08:18AM (3 children)
The extra CPU and RAM is for additional spying, monitoring and whatever other stuff they want to do.
(Score: 5, Insightful) by SomeGuy on Wednesday March 15, @11:58AM
Don't forget advertising. I'd be surprised if there isn't advertising sitting in there ready to go after you have used it for a while. That is one of the big reasons why big corps want to embed this kind of crap into everything.
Oh, and built in obsolescence - after using it for more than 5 seconds "to keep your device secure" you must throw it away and buy a new one!
Then turn everything in to a subscription. You want your square root button to work? That will be $50 a month, forever! Muhahahha.
I hate this planet.
(Score: 2, Interesting) by DadaDoofy on Wednesday March 15, @04:10PM
It's the same reason every single product Apple makes, with the possible exception of AirTags, has a microphone in it.
(Score: 2) by DannyB on Wednesday March 15, @05:31PM
I thought the Management Engine is for the "whatever other stuff they want to do". The extra CPU and RAM is to keep you satiated while the Management Engine does its job.
How long before microcontrollers have management engines?
Then RAD hardened management engines that can withstand the radiation environment of the Jovian moons.
Can't large language models be put in charge of resolving ethical issues related to the use of AI?
(Score: 4, Insightful) by Mojibake Tengu on Wednesday March 15, @08:59AM (1 child)
It's been a long trip from the simple mechanical bi-metal switch with an adequately precise temperature adjustment.
Then, instead of a panel, along comes a full webserver and planner in your thermostat, for your convenience, tied to your fancy smartphone application.
What could possibly go wrong with that?
The edge of 太玄 cannot be defined, for it is beyond every aspect of design
(Score: 2) by krishnoid on Wednesday March 15, @05:51PM
Maybe it's a manufacturing capacity issue in the wafer fabrication plants, where standard-issue general-purpose ARM SOCs [arstechnica.com] available in every Android smartphone are readily available for scooping out of the chocolate or vanilla ice cream buckets over the more boutiquey flavors, especially for high-volume consumer products.
When you can run something via microcode in an emulator [google.com] on any Android device, and can get 2-year old leftover generic SOCs in bulk for cheap ... why not?
(Score: 5, Insightful) by shrewdsheep on Wednesday March 15, @09:18AM (6 children)
"Overengineered": in my book this seems to mean something different than what's described. I would call the system not designed at all. Indeed, all the choices were made to avoid having to design something. I would call it system assembly.
(Score: 5, Touché) by choose another one on Wednesday March 15, @11:14AM
The cost differentials are probably simply traded off against "not having to design anything" (not just cost of design but also time-to-market).
End of the day, if what you have, and know how to use, is infinite orbital-nukes, it's so much easier to just nuke everything from orbit.
"Eeek, a mouse"
"no problem, we'll just nuke it from orbit".
(Score: 5, Interesting) by garfiejas on Wednesday March 15, @12:49PM (2 children)
Agreed; but I would also suggest that it's likely to do with bootstrapping/coding; it's pretty trivial to design a calculator using Google tools, and very cost effective to use an Android SoC and whatever touch panels you have. An actual ARM Cortex-M7 with ARM CMSIS drivers, on the other hand... even something as dev-friendly as https://www.pjrc.com/store/teensy41.html [pjrc.com] - in C using a commercial or GCC toolchain ... requires detailed knowledge of how the tin actually works, and getting a real working GUI plugged into it is definitely non-trivial...
(Score: 2) by jb on Thursday March 16, @02:57AM (1 child)
Why on earth would you need even C to build something as simple as a ("non-scientific") calculator?
All you need is a couple of registers, a serial adder, and a few strategically placed inverters, and you have addition, subtraction, memory & clear taken care of. A small ROM with a lookup table in it is probably still the most efficient way to add support for multiplication, division & square roots. Pretty sure a "non-scientific" calculator doesn't need more than that.
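(For the curious, the recipe above - a serial adder, inverters for two's-complement subtraction, and a small ROM lookup table - can be modeled in a few lines of software. This is just an illustrative Python sketch of the idea, not anyone's actual design; the names are made up.)

```python
def serial_add(a, b, width=16):
    """Bit-serial addition: one full-adder step per 'clock', carry held in a 1-bit register."""
    carry, result = 0, 0
    for i in range(width):
        bit_a = (a >> i) & 1
        bit_b = (b >> i) & 1
        result |= (bit_a ^ bit_b ^ carry) << i          # sum bit of the full adder
        carry = (bit_a & bit_b) | (carry & (bit_a ^ bit_b))  # carry-out becomes next carry-in
    return result  # final carry-out is dropped, as in a fixed-width register

def serial_sub(a, b, width=16):
    """Subtraction via the 'strategically placed inverters': add the two's complement of b."""
    mask = (1 << width) - 1
    return serial_add(a, serial_add(~b & mask, 1, width), width)

# A small ROM with a lookup table, as suggested for operations beyond add/subtract:
# here, single-digit multiplication.
MUL_ROM = {(x, y): x * y for x in range(10) for y in range(10)}

print(serial_add(1234, 4321))  # 5555
print(serial_sub(1000, 1))     # 999
print(MUL_ROM[(7, 8)])         # 56
```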
If design costs are your worry (small production runs), note that the average 2nd year EE student should be capable of designing something like that in the course of a 3 hour exam (and a top student will also manage to prove his design correct within that time limit). If your engineers can't, just fire them and hire a (much cheaper) graduate who can.
If unit costs are your worry (large production runs), note that tools exist to transform such a design into an ASIC layout that can be fabricated cheaply at scale.
As to "a real working GUI", there's nothing more "real" (nor, in most cases, more "working") than mechanical buttons (plus a series of 7-seg displays for output, of course). For a simple calculator-only device, adding a touch-screen will serve only to *diminish* usability and *diminish* service life ... but then again, I guess shipping barely usable devices that are built to break down seems to be the *goal* of most electronics manufacturers today...
(Score: 3, Insightful) by Immerman on Thursday March 16, @01:37PM
Commercial design ultimately has one goal: maximizing the profit-versus-cost ratio.
These days you can get a mature and massively-too-powerful generic SoC off the shelf for well under a dollar. Good luck producing a custom calculator circuit for anywhere close to that - the necessary ROM and RAM alone will likely cost almost as much, as they are increasingly specialty hardware (because why would you buy them separately when you can get lots more of both, already integrated with a powerful CPU, for a similar price?). Add in the adder and support circuitry and you're almost certainly more expensive than the SoC.
And that's before you consider the assembly costs, which increase linearly with the number of components - it costs just as much for a pick-and-place machine to place a single resistor as it does to place an entire SoC.
As for design...
Writing software is almost always vastly faster and easier than designing an equivalent circuit, is MUCH easier and faster to debug, and adds zero additional per-unit costs. And if you find a serious bug in the software late in production you can quickly re-flash all the existing inventory, while a bug in hardware probably means throwing it all away.
And on the hardware side, with a SoC the wiring is basically just connect power and I/O devices - very few components, very few connections, which means very little that can possibly be screwed up.
(Score: 5, Interesting) by SomeRandomGeek on Wednesday March 15, @03:44PM
I strongly suspect that we simply do not understand the design trade offs that the engineers were making. Is the simpler part really cheaper? How much custom supporting hardware does each part need? How do the software development costs for the parts compare? Are both parts even available in today's market? Perhaps the calculator was overengineered or underengineered, but we are in no position to judge. For all we know, using an overpowered but mass produced android SoC is the absolute cheapest way to make a calculator in this economic environment. The calculator retailed for $10, and I doubt that the chip accounts for even a significant fraction of that cost.
(Score: 3, Interesting) by corey on Wednesday March 15, @09:33PM
Yep. The younger guys at work live by the motto “just chuck a micro in it”, then they write a quick bit of software.
I’m old school so I try to go simple as possible but I ack that sometimes that increases the NRE.
(Score: 4, Interesting) by RamiK on Wednesday March 15, @12:04PM (2 children)
Btw, there have been scientific calculators with 4G meant for cheating ( https://www.aliexpress.com/item/33002124253.html [aliexpress.com] ), so maybe it was made to look like some specific legit calculator or something? Regardless, it's a pretty good size, and having a touch scroll to go through the history isn't a bad thing to have for some folks. Of course, battery life is probably only in the hours so...
Anyhow, there's a lot of weird and legitimate devices out there in this space. Like, I mostly use a tenkeyless, so sometimes I need a numpad, and when I was looking for a dedicated numpad and such I ended up finding all sorts of combo devices that have dedicated calculators, mechanical keyboards, USB hubs, touchpads... Like, this oddity just came up looking for "numpad wireless": https://www.aliexpress.com/item/1005005292306610.html [aliexpress.com]
Point is, there might be a market demand behind these.
(Score: 2) by VLM on Thursday March 16, @01:08PM (1 child)
Got it... the linked article is funny. The point isn't to sideload Minecraft but to ask the prof or TA "Can I use my dumb 4-function-only calculator on the midterm exam?" and then log into Google Drive to access the cheat sheet full of notes for the test and the PDF of the entire textbook.
Or get permission to use the 4-function built in calculator then run an HP48 emulator or Wolfram Alpha online or similar.
(Score: 2) by RamiK on Thursday March 16, @02:23PM
Yup. Another speculated use case mentioned in the CNX Software comments was elderly users who want a scrollable history for their computations but find scientific calculators too burdensome due to the multitude of small keys. I would also add some sort of checkout cashier use case, running tallies, and/or going through inventory...
(Score: 5, Informative) by ElizabethGreene on Wednesday March 15, @02:02PM
The calculator on my desk, a TI-84 Plus, has a venerable Zilog Z80 chip in it. It's $2 for the chip. That part's obsolete, but it's in the right neighborhood. An ARM Cortex EFM32PG23B210F256IM48-C [mouser.com] SoC is $2.85 in quantity. There isn't a huge difference.
Where there is a huge difference is in the displays. They easily cost 10x what the SoC does. If I were a gambler, I'd say they got a great deal on the display and used the manufacturer's reference design for an Android tablet, folded into a calculator form factor.
(Score: 0) by Anonymous Coward on Wednesday March 15, @04:36PM
...like asking if water is wet.
The newer a thing, the higher the probability of it being overengineered. Simple, stupid things aren't profitable and don't result in job security.
(Score: 5, Insightful) by sjames on Wednesday March 15, @05:02PM (1 child)
It's exactly the opposite problem, they're UNDERengineering.
A skilled engineer could carefully work out exactly what is needed and do a custom design that exactly meets the specs. Then a skilled firmware developer could do their part and you have a traditional embedded device.
This is what happens when a less skilled engineer picks a generic part that exceeds requirements because he isn't sure what the requirements are and doesn't want to undershoot. Then they want it to be capable of running Android or something like it, so they can use a less skilled software developer who doesn't know enough to code to bare metal or to something like FreeRTOS.
Some of this can be justified by component prices coming way down, but there are other reasons to (for example) select a Cortex-M4 rather than a Cortex-A, including power consumption. There are even good cases for going with an 8-bit AVR over ARM. For example, an underclocked AVR will tolerate a serious undervoltage when batteries get low. But doing that requires a better grade of software developer.
On the other hand, from the hardware hacker perspective, this is a great way to get really inexpensive hardware to have fun with. Often things like the calculator in TFA are a LOT cheaper than buying the parts as an individual in single quantity.
(Score: 3, Interesting) by DannyB on Wednesday March 15, @05:38PM
You're right! It is UNDER engineering in a very real sense, because they aren't having to do a lot of the engineering work required for a less powerful processor.
If a $10 calculator has this much compute power with a way over capable display for a calculator, this would seem to make it very attractive to people who like to hack hardware to do unintended things.
It seems like it is engineered to make it easier or possible to cheat on tests using a cheap simple calculator which might be permitted on the test.
Can't large language models be put in charge of resolving ethical issues related to the use of AI?
(Score: 2) by istartedi on Wednesday March 15, @06:55PM (8 children)
I think the most likely reason for that device being the way it is, as somebody commenting on TFS said, is that the company didn't want to try selling the hardware into other markets. That might cannibalize sales of higher-end general-purpose devices. They were better off building this, perhaps even taking a loss on it, to avoid taking a bigger loss someplace else.
Also, laziness. Finding engineers that can make this run *tight* on hardware no better than it needs to be is difficult. Finding engineers to build a lame 4-banger on Android is easy. This is probably not even the most extreme example of the "developer cycles are more expensive than CPU cycles" mindset.
A lot of us have used, say, an old TV as a door stop or something. Now imagine you're a business and you have a whole *warehouse* of old TVs but you really want to sell brand-new TVs. Suddenly you're in the decorator doorstop business.
(Score: 0) by Anonymous Coward on Wednesday March 15, @07:50PM (7 children)
The old TV thing is a bad analogy because retro gamers and collectors are known to pay out the ass for old CRTs. That company could make a little fortune selling those at $4-5K apiece.
(Score: 2) by istartedi on Wednesday March 15, @09:39PM (6 children)
Seriously? Dang. I'm not really that sorry I donated my old CRTs though. They would have been a PiTA to move across country.
(Score: 0) by Anonymous Coward on Wednesday March 15, @11:25PM (1 child)
Depending on the CRT and its condition, yes. Plenty of old games only look correct on CRT monitors because the developers took into account stuff like glare (etc) when they designed the graphics. And then there's stuff like lightgun games that flat out don't work with non-CRT displays.
(Score: 0) by Anonymous Coward on Thursday March 16, @04:39AM
The thing I heard, in addition to your points, is the analog input: no digital delays that drive gamers to great unhappiness.
(Score: 2) by VLM on Thursday March 16, @01:12PM (3 children)
The market is very thin.
An example of a thin market is that the highest-paid NFL quarterback made just under $50M, but a couple guys down the list it's already down to $30M, and a couple thousand places down the list you've got used car salesmen, so going into football is not viable.
If you have a top quality undamaged sony trinitron CRT that fits arcade cabinets you can get a small pile of money. On the other hand, 99.99% of old CRTs... not so much.
(Score: 2) by istartedi on Thursday March 16, @04:52PM (2 children)
That makes sense. Even though I saw whole *pallets* of CRTs destined for the Chinese recycling programs, you figure there are still vast numbers of them. I still see them getting illegally dumped sometimes. It might be worthwhile to research and take notes on what's hot. I still see them at thrift stores once in a while.
(Score: 2) by toddestan on Friday March 17, @03:47AM (1 child)
Pretty much anything Sony. JVC D-Series. Some Toshiba and Mitsubishi sets. Generally, the smaller CRTs seem to be more desirable than the massive sets that take up half your living room. Smaller sets with higher quality inputs like S-video and component are especially desirable. Also, any professional video monitors, many of which aren't technically televisions (no built-in tuner).
What's less hot are the late HD CRT televisions, which suffer from input lag and poor support for SD content, and stuff like projection TVs which were junk even when they were new.
It's kind of interesting to see that what was considered junk just a few years ago, stuff you had to pay to get rid of, is now considered collector's items. It could be that there's just not that many left after years of them getting unceremoniously trashed. On the other hand, there could still be a large number of them lurking in people's closets and attics that'll start making an appearance once word gets around that they are worth money, flooding a market that I suspect isn't actually all that big.
CRT computer monitors are also an interesting case. They don't seem to be as in demand as the CRT televisions, but on the other hand they seem to have survived in much smaller numbers, making them harder to find for the people who are looking for one.
(Score: 2) by istartedi on Saturday March 18, @03:44AM
Just for grins and giggles I checked up on the ol' 1702 Commodore monitor. I actually had one. Prices were low $200s, but then there was also a complete C-64 setup w/1541, the CPU, *and* the monitor and a printer for about $450 out there, so I still don't feel bad about getting rid of all that. It might have all gotten jarred into oblivion with all the moves I did from the 90s to now.

We had an old black-and-white Philco when I was a real little kid, a set with not just a CRT but vacuum tubes also. That sucker still worked in the 80s and I hooked the C-64 up to it one time for the same kind of reason: seeing a computer hooked up to vacuum tube tech, there was just something so wild about that, spanning history. It's one of those fond little geek moments I look back on. We yard-saled the Philco, probably $10; let's check that... about $400 maybe. I don't see one that looks exactly like ours. I know we didn't have a Predicta. That's one weird looking TV, which is probably why they're asking $2700 for it.
(Score: 2) by ilsa on Wednesday March 15, @09:41PM (1 child)
Not knowing anything about the current state of the chip industry, the first question that pops into my mind is the cost and availability of the hardware.
Do they even still make the ICs we used to have with older calculators?
For example, Intel produced the 8088 as late as 1998, according to Wikipedia, and they were used as controllers for things like washing machines. There is still a need for low-performance microcontrollers, but my understanding is that due to myriad factors, it's actually cheaper and/or more power efficient to produce and use higher-capability chips and clock them down.
(Score: 0) by Anonymous Coward on Thursday March 16, @04:47AM
Not sure if this was your point, but designers want their (our) designs to last some time. Usually you can know if and when a part will cease being produced, so you might jump to the newer part in your design. Many IC spec sheets may say "not recommended for new designs".
Cynically, as someone seems to have alluded to above, I think many hw and sw designers want to keep their resumes as updated as possible, so they use the greatest technology they can, no matter that it's way overkill. Web page designers are very guilty of this.
(Score: 2) by VLM on Thursday March 16, @01:14PM (1 child)
An area often missed WRT security is that you can't hack a TI-81 via a JSON deserializer buffer overflow.
HUGE appliances that never get any security patches have a huge security risk surface.
(Score: 2) by ElizabethGreene on Thursday March 16, @02:52PM
No, to hack the TI-81 you have to buffer overflow a specific rom routine. The procedure is described here [ticalc.org].
Other models, like my 84+, are much less asinine about execution of arbitrary code. With it I can compile on my pc with the Zilog compiler, drop it over on the calculator with TI Connect, and run it from the program menu.