Tunguska, the ternary computer emulator:
"I wonder how it would be if my computer was ternary?", I asked myself a while back. I googled around, and found a few stub-ish wikipedia articles, a few pages explaining why ternary computing is vastly superior to everything, and some documents discussing russian experiments in the 1950's.
Obviously, this wouldn't do. I wanted a hands-on computer I could play with. So I got to work, and a few months later, this is the result.
So, the purpose is to provide a simple, accessible, yet powerful playground for ternary computing for the man in the street (one with a decent understanding of assembly programming and general computer infrastructure).
[N.B. This is an old story but I found it to be interesting so I thought I would share it. Have any of you ever wondered what it would be like if our computers were ternary? Would it have been possible to build such a thing in silicon back in the early days of computing? - Fnord]
(Score: 2) by Frosty Piss on Monday January 04 2021, @04:33AM (1 child)
I’m afraid to read the article because it’s hosted on SourceForge without a valid cert... Given SourceForge’s history, I don’t want to download something nasty.
(Score: 3, Funny) by RS3 on Monday January 04 2021, @04:37AM
Are you saying it could be a forged document?
(Score: 4, Informative) by c0lo on Monday January 04 2021, @04:44AM (26 children)
3 is the integer base with the lowest average radix economy [wikipedia.org], which is the reason ternary is argued to be a better choice than binary when it comes to power consumption.
See flip-flap-flop [wikipedia.org] for possible approaches in hardware.
Design of a Ternary Edge-Triggered D Flip-Flap-Flop for Multiple-Valued Sequential Logic [arxiv.org] (which shows some Iranians know a lot more about ternary than fnord666 - grin)
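For anyone who wants to poke at the radix-economy claim, here is a minimal Python sketch (a pure digit-count cost model, nothing to do with actual hardware cost):

    import math

    def digits(n, base):
        """Number of digits needed to write n in the given base."""
        d = 0
        while n:
            n //= base
            d += 1
        return max(d, 1)

    def radix_economy(n, base):
        """Classic cost proxy: radix times digit count."""
        return base * digits(n, base)

    # Asymptotically the cost per value is proportional to b / ln(b), which is
    # minimised at b = e ~ 2.718; base 3 is the closest integer to that optimum.
    for b in (2, 3, 4, 10):
        print(b, round(b / math.log(b), 3), radix_economy(10**9, b))

Running it shows base 3 edging out base 2 by only a few percent, which feeds into the cost/benefit discussion further down the thread.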
(Score: 0, Funny) by Anonymous Coward on Monday January 04 2021, @04:56AM (6 children)
We have five vowels, let's use them all. Now we have Flap-Flep-Flip-Flop-Flup, 5 states. We can call the computer "La Quinta Madre"
(Score: 0) by Anonymous Coward on Monday January 04 2021, @05:01AM
flerp
(Score: 2) by Runaway1956 on Monday January 04 2021, @05:54AM (4 children)
No flyps?
(Score: 0) by Anonymous Coward on Monday January 04 2021, @06:32AM (1 child)
Flyps are a quantum superposition between flap and flip, some other kettle of fish
(Score: 2) by Runaway1956 on Monday January 04 2021, @06:51AM
42 then?
(Score: 0) by Anonymous Coward on Monday January 04 2021, @09:22AM (1 child)
Don't worry. Everybody always forgets about the flwps too.
(Score: 1) by pTamok on Monday January 04 2021, @04:08PM
And the fløps, flåps, flæps, flœps... and so on. An almost complete set would be documented in the IPA vowel sound chart [wikipedia.org].
flips, flyps, flɨps, flʉps, flɯps, flups
flɪps, flʏps, flʊps
fleps, fløps, flɘps, flɵps, flɤps, flops
fle̞ps, flø̞ps, fləps, flɤ̞ps, flo̞ps
flɛps, flœps, flɜps, flɞps, flʌps, flɔps
flæps, flɐps
flaps, flɶps, fläps, flɑps, flɒps
(Score: 2) by RS3 on Monday January 04 2021, @05:16AM (17 children)
Any thoughts / insight into why ternary hasn't gone mainstream?
(Score: 1, Insightful) by Anonymous Coward on Monday January 04 2021, @05:28AM
Throwing out decades of optimization for binary systems isn't worth a tiny efficiency gain?
(Score: 2) by krishnoid on Monday January 04 2021, @05:46AM (10 children)
The nomenclature [schlockmercenary.com] might be a minor issue (bottom panels and explanation at the bottom).
Maybe voltage thresholding is simpler -- if you can encode two values with one voltage threshold, maybe encoding three values with two thresholds isn't as helpful as being able to encode four values with the same two thresholds and two clock cycles?
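A toy sketch of that threshold trade-off, with made-up voltages and evenly spaced thresholds (an illustration only, not a circuit model):

    import math

    def decode(voltage, thresholds):
        """Symbol value = number of thresholds the sampled voltage exceeds."""
        return sum(voltage > t for t in thresholds)

    # Hypothetical 0..1 V swing on a single wire.
    schemes = {
        "binary (1 threshold)":      [0.50],
        "ternary (2 thresholds)":    [0.33, 0.67],
        "quaternary (3 thresholds)": [0.25, 0.50, 0.75],
    }
    for name, th in schemes.items():
        levels = len(th) + 1
        print(f"{name}: {math.log2(levels):.2f} bits per sample, decode(0.6) -> {decode(0.6, th)}")

Two thresholds only buy about 1.58 bits per sample, while a second clock tick on a binary wire gets a clean 2 bits with just one threshold to discriminate per sample, which seems to be the comparison the parent is gesturing at.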
(Score: 3, Informative) by RS3 on Monday January 04 2021, @06:09AM (6 children)
That's pretty funny. I read the explanation first, then saw the cartoon.
Yes, your point about voltage thresholds is a big one. Even with one threshold we have issues with electrical noise, race conditions, and inductive ringing causing false state transitions. You'd need tighter threshold tolerances and hysteresis (a deadband; see the sketch below).
Another problem is that a transition from 0 to 2 would take longer than 0 to 1 or 1 to 2, so timings would be much more difficult to synchronize. Or, looked at another way, the system speed would be limited by the 0-to-2 transition time, though that might still give more total throughput.
Good old cost/benefit analysis.
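The hysteresis/deadband idea mentioned above, in a toy Schmitt-trigger-style form (the 0.4 V / 0.6 V thresholds are arbitrary illustrative numbers):

    class HysteresisComparator:
        """Binary threshold with a deadband: the output only flips once the
        input crosses the upper or lower threshold, so noise near the nominal
        switching point doesn't cause spurious transitions."""

        def __init__(self, low=0.4, high=0.6):
            self.low, self.high = low, high
            self.state = 0

        def sample(self, v):
            if self.state == 0 and v > self.high:
                self.state = 1
            elif self.state == 1 and v < self.low:
                self.state = 0
            return self.state

    comp = HysteresisComparator()
    noisy = [0.10, 0.45, 0.55, 0.48, 0.65, 0.58, 0.52, 0.35]
    print([comp.sample(v) for v in noisy])  # -> [0, 0, 0, 0, 1, 1, 1, 0]

A ternary input would need two such deadbands stacked inside the same voltage swing, which is exactly the tighter-tolerance problem RS3 is pointing at.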
(Score: 2, Interesting) by Anonymous Coward on Monday January 04 2021, @10:52AM (5 children)
That 0-2 transition time is fatal all by itself. Suppose that 0-1 and 1-2 take the same time, and 0-2 takes the combined time (approximately the best case for any given technology). Using only the 0-1 interval to implement binary logic on otherwise identical trinary gates gives a 2x latency improvement right off the bat, which directly corresponds to a 2x clock speed increase. Now, binary requires more gates for a given computation, so assuming a 50% deeper mesh (which is probably high) we get 1.5*0.5=0.75, so binary still gets 3/4 the latency of trinary using otherwise identical hardware. That is a 1/3 speed improvement, which is enough to choose binary over trinary logic even on trinary-capable hardware.
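The parent's back-of-envelope estimate, written out in Python; every number here (the 2x swing time, the 50% deeper mesh) is the parent's assumption, not a measurement:

    t_step = 1.0                  # time for a 0-1 or 1-2 swing (arbitrary unit)
    t_binary_gate  = t_step       # binary logic only uses the 0-1 interval
    t_trinary_gate = 2 * t_step   # worst-case 0-2 swing sets the trinary gate time

    depth_trinary = 100                   # arbitrary logic depth in gates
    depth_binary  = 1.5 * depth_trinary   # the assumed "50% deeper mesh"

    latency_trinary = depth_trinary * t_trinary_gate
    latency_binary  = depth_binary  * t_binary_gate
    print(latency_binary / latency_trinary)   # 0.75, i.e. binary ~1/3 faster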
(Score: 4, Interesting) by RS3 on Monday January 04 2021, @03:09PM (4 children)
Good analysis. The numbers might be difficult to calculate, but your logic is sound. It seems like you get more done per clock cycle with trinary (or more states), but why not just add more bits to the bus?
All that said, you could store more data in a ternary RAM if you could reliably differentiate the stored voltage levels, but I think that would be difficult 1) at current clock speeds, 2) at scale, and 3) at production volumes. Oh, and temperature variation would kill the whole thing.
So ya, like others have said, interesting concept, but it's a non-starter.
(Score: 1, Interesting) by Anonymous Coward on Monday January 04 2021, @10:01PM (1 child)
There are many ternary designs that don't have that drawback. For example, there are designs with [-1, 0, +1] where the time to switch between all states is equal. Optical systems are the most obvious, but electrical systems have this property as well. There are [0, 1, Z] and [F, U, T] and [2F, 1T1F, 2T] systems that accomplish much the same thing. Some of those designs are actually quite competitive, but so far the added complexity isn't worth it outside of relatively slow storage where density is king. There is some prediction that, with computers being "fast enough" and starting to hit various limits, more research will be done on ternary systems to overcome their current shortcomings.
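The [-1, 0, +1] scheme is balanced ternary (the representation the Soviet Setun used); here is a minimal sketch of the number system itself, independent of any particular circuit:

    def to_balanced_ternary(n):
        """Encode an integer as balanced-ternary trits (-1, 0, +1), least significant first."""
        trits = []
        while n:
            r = n % 3
            if r == 2:           # write 2 as -1 and carry 1 into the next trit
                r = -1
            trits.append(r)
            n = (n - r) // 3
        return trits or [0]

    def from_balanced_ternary(trits):
        return sum(t * 3**i for i, t in enumerate(trits))

    trits = to_balanced_ternary(42)
    print(trits, from_balanced_ternary(trits))          # [0, -1, -1, -1, 1] 42
    print(from_balanced_ternary([-t for t in trits]))   # -42: negation is just flipping trits

Symmetric transition distances and trivial negation are the sort of properties that make these designs attractive on paper.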
(Score: 0) by Anonymous Coward on Tuesday January 05 2021, @06:11AM
0, 1, Z is a thing, but going from -1 to 1 is not as fast as going from -1 to 0, or 0 to 1, because these are voltage levels, and -1 to 1 has to pass through 0.
Or rather, it *can* be as fast, but the circuit has to be overengineered relative to the shorter leaps to allow it.
Since clock frequencies are uniform(ish) and need to allow for voltage change time and settle time, the worst-case transition sets the limit.
(Score: 3, Funny) by jb on Tuesday January 05 2021, @03:20AM (1 child)
"Bit" is short for "Binary digIT". If we're working in ternary (rather than binary), then surely we're working with ternary digits, or "tits" for short.
Your suggestion then becomes "why not just add more tits to the bus" ... which sounds like something we can all get on board with ;)
(Score: 2) by RS3 on Tuesday January 05 2021, @05:11AM
Unless we're using the -1 state; I'm not on board with that.
(Score: 3, Interesting) by HiThere on Monday January 04 2021, @03:20PM (2 children)
IIRC, voltage thresholds are why binary won over decimal back in the late 1940s or early 1950s. Ternary would seem to be a lot easier, but the number of three-way logic choices is a lot smaller than the number of binary choices. FORTRAN used to have a three-way IF, but the number of times two of the choices were the same was huge. (Of course the choice was 0, but that's three choices.) IIRC the logical IF was a later addition.
So I don't think ternary would actually be an optimization, because most choices are binary. Three-handed logic is good to think about, but relatively infrequent in application. (And occasionally you get more than 3 choices... thus the switch statement.)
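For readers who never met it, FORTRAN's arithmetic IF (IF (expr) 10, 20, 30) jumped to one of three labels depending on the sign of the expression; a rough Python analogue of the three-way branch HiThere is describing:

    def arithmetic_if(value, negative_case, zero_case, positive_case):
        """Rough analogue of FORTRAN's arithmetic IF: pick one of three
        branches based on whether the value is negative, zero, or positive."""
        if value < 0:
            return negative_case()
        if value == 0:
            return zero_case()
        return positive_case()

    # As HiThere notes, two of the three branches were very often identical:
    result = arithmetic_if(
        7 - 7,
        negative_case=lambda: "keep searching",
        zero_case=lambda: "found it",
        positive_case=lambda: "keep searching",
    )
    print(result)   # found it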
(Score: 2) by HiThere on Tuesday January 05 2021, @01:32AM
whoops: "Of course the choice was less than zero, zero, or greater than zero". ASCII equivalents got auto-deleted. (Plain Old Text isn't *that* plain.)
(Score: 2) by RS3 on Tuesday January 05 2021, @05:20AM
Excellent points. Having roots in hw, I was thinking, as were others here, that the point was electrical / signal states. But your point about programmatic branches brings up an interesting thought about "if" statements with multiple possibilities, like a "case" statement, and how it might be implemented in hardware for some amazing optimization. But I'm tired and thinking about it is making me sleepy...
(Score: 2) by fakefuck39 on Monday January 04 2021, @09:06AM (4 children)
It's been a while since my times with Altera, but I'd say it hasn't gone mainstream because per mm² of silicon, you get less compute density. It's real simple to make something that either lets current through or not, or flips between one voltage and another. That doesn't require any logic. If you're talking about multiple values, that's a bigger transistor. So for the same die space, you could fit a bunch of on/off switches, or you could fit one that handles multiple levels. The compute capacity you get from the on/off switches for the same area is more.
It's one of those things that sounds good, but when you get under the higher level concept of registers and such, having a very simple gate as your base unit of operation gets you more bang for your buck. Look at electrons leaking out of transistors for example. With binary - who cares - if you got volts, even if some amps leak, you got a 1. No current, it's a zero. You have a good noise floor with that. If however there's always current and you need to behave differently based on the amount of it, leakage is important, so at nanoscale fabs, it's not an optimal way to go.
(Score: 1, Insightful) by Anonymous Coward on Monday January 04 2021, @06:46PM (3 children)
To summarize: digital computing is implemented with digital switches. Since the 1960s, the switches are transistors. What can you build that is simpler than an ON/OFF switch? Just about nothing. This is why we use base-2 logic. Simpler means smaller, faster, cheaper. There is NOT ONE SINGLE advantage to a non-binary computer.
(Score: 1, Troll) by fakefuck39 on Monday January 04 2021, @07:20PM
i hope for a living you're an account executive and your target is pompous c-levels overcompensating for their physical shortcomings with the company checkbook. if not, you're wasting talent. dell is gonna start hiring again early q2.
(Score: 2) by Rupert Pupnick on Tuesday January 05 2021, @12:27AM (1 child)
Yup, ternary state machines imply either operating transistors linearly to get more than two states (big power penalty along with lower noise immunity) or a higher transistor count.
(Score: 2) by RS3 on Tuesday January 05 2021, @05:28AM
Hmmm, wanna collaborate on a multi-state transistor? :)
Transistor saturation is the enemy of switching speed, hence the Schottky-diode-clamped "S" TTL, and ECL (someone here used to tout that stuff - maybe you?). IIRC ECL isn't saturating. Of course MOS stuff pretty much saturates, but IIRC MOSFETs don't store charge like a BJT.
Bring back analog computers! :)
(Score: 0) by Anonymous Coward on Monday January 04 2021, @05:52AM
Which in real reality amounts to exactly nothing.
Decimal 100 in base 3 is 5 digits: "radix economy" 15; in base 2 it is 7 digits: radix economy 14, so binary actually comes out ahead there. Decimal 1,000,000,000 is 19 digits vs 30: economy 57 vs 60, a whole 5% "better".
3 is not that much nearer than 2 to the optimum e=2.71828... to be worth ANY extra complexity.
The old and tired urban legend of "ternary computing" is long deserving eternal rest.
(Score: -1, Flamebait) by Anonymous Coward on Monday January 04 2021, @05:38AM
No. Runaway is really old. Boomer old. And just about to be fired for being too fucking old. So what if computers were, um non-digital? Unmathematical? What if computers keep old farts like Runaway off of them, like before AOL? Ever think of that? This whole White Supremashist thing would not even exist.
(Score: 5, Funny) by Anonymous Coward on Monday January 04 2021, @09:27AM
There are 10 kinds of people in the world:
(Score: 3, Interesting) by kazzie on Monday January 04 2021, @12:21PM (6 children)
One scenario I've thought about is what if we had eight digits instead of ten.
Assuming we settled on the decimal system because we have ten fingers and thumbs, a race with eight would likely settle on octal. While conversion between binary and decimal is a non-trivial task for most people, there's a direct mapping between binary and octal digits. If our children learned numbers in octal instead, binary would be much more accessible.
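A quick illustration of that direct mapping: each octal digit corresponds to exactly three bits, so conversion is just regrouping (no such clean grouping exists for decimal):

    n = 0o725                     # octal literal
    print(format(n, "b"))         # 111010101: the groups 111 010 101 are the octal digits 7 2 5
    for digit in "725":
        print(digit, "->", format(int(digit, 8), "03b"))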
A lot of our early computing machines were decimal in nature (from Babbage onwards), and many processors had specific modes for doing decimal calculations. If we hadn't had to deal with all that nonsense, just how much further along would we be in the race to the stars?
(Score: 0) by Anonymous Coward on Monday January 04 2021, @01:23PM (2 children)
I did always wonder why it was not 8 fingers and two check digits (thumbs)?
(Score: 3, Funny) by RS3 on Monday January 04 2021, @03:12PM (1 child)
Thumb = parity bit?
(Score: 0) by Anonymous Coward on Tuesday January 05 2021, @06:18AM
FTFY
(Score: 2) by sjames on Monday January 04 2021, @06:41PM
The decimal options on CPUs were mostly done in binary-coded decimal (BCD). Even the Z-80 had some hardware support in the form of a half-carry flag and the DAA (Decimal Adjust Accumulator) instruction. Hardware BCD didn't hold us back much, if at all. We can still see the distant echoes of that in COBOL today.
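A sketch of what packed BCD and the DAA-style fix-up look like, in Python rather than Z-80 assembly (the helper names and byte layout are just for illustration):

    def to_packed_bcd(n):
        """Pack a two-digit decimal number into one byte, one decimal digit per nibble."""
        assert 0 <= n <= 99
        return ((n // 10) << 4) | (n % 10)

    def bcd_add(a, b):
        """Add two packed-BCD bytes, applying the per-nibble 'add 6' correction
        that DAA-style instructions performed after a plain binary add."""
        lo = (a & 0x0F) + (b & 0x0F)
        carry = lo > 9
        lo = lo + 6 if carry else lo
        hi = (a >> 4) + (b >> 4) + carry
        out_carry = hi > 9
        hi = hi + 6 if out_carry else hi
        return ((hi & 0x0F) << 4) | (lo & 0x0F), out_carry

    total, carry = bcd_add(to_packed_bcd(38), to_packed_bcd(45))
    print(hex(total), carry)   # 0x83 False, i.e. decimal 83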
(Score: 0) by Anonymous Coward on Monday January 04 2021, @07:02PM (1 child)
Quoth the parent:
"If we hadn't had to deal with all that nonsense, just how much further long would we be in the race to the stars?"
Absolutely no further along than we are now.
Base 10 arithmetic logic served a purpose in the very early days of computing. We don't need it now because if we need base 10 arithmetic (financial calculations), we just implement the base 10 arithmetic in software instead of hardware because computers are fast enough now that that is a better trade-off.
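That trade-off is visible in everyday tooling; Python's standard decimal module is one example of base-10 arithmetic done purely in software:

    from decimal import Decimal, getcontext

    print(0.1 + 0.2)                          # 0.30000000000000004 (binary float)
    print(Decimal("0.10") + Decimal("0.20"))  # 0.30 (exact decimal arithmetic)

    getcontext().prec = 28                    # precision is a software knob
    print(Decimal(1) / Decimal(7))            # 0.1428571428571428571428571429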
Also, it is not a waste to try different approaches to a problem along the way. This is how we learn. It does not slow things down at all.
(Score: 2) by kazzie on Tuesday January 05 2021, @07:38AM
But we're still implementing it in software now. And it's a conceptual hurdle for some programmers/users.
(In the meantime, CPU designers spent silicon on implementing it, and programmers of that era had to deal with it too.)
I was thinking more in terms of how accessible computing/programming is to our youngsters. If every schoolkid was effectively fluent in binary anyway, might that lower the bar for entry into the field?