From The Guardian...
"Very few Yugoslavians had access to computers in the early 1980s: they were mostly the preserve of large institutions or companies. Importing home computers like the Commodore 64 was not only expensive, but also legally impossible, thanks to a law that restricted regular citizens from importing individual goods that were worth more than 50 Deutsche Marks (the Commodore 64 cost over 1,000 Deutsche Marks at launch). Even if someone in Yugoslavia could afford the latest home computers, they would have to resort to smuggling.
In 1983, engineer Vojislav "Voja" Antonić was becoming more and more frustrated with the senseless Yugoslavian import laws.
Antonić was pondering this while on holiday with his wife in Risan in Montenegro in 1983. "I was thinking how would it be possible to make the simplest and cheapest possible computer," says Antonić. "As a way to amuse myself in my free time. That's it. Everyone thinks it is an interesting story, but really I was just bored!" He wondered whether it would be possible to make a computer without a graphics chip – or a "video controller" as they were commonly known at the time.
Instead of having a separate graphics chip, Antonić thought he could use part of the CPU to generate a video signal, and then replicate some of the other video functions using software. It would mean sacrificing processing power, but in principle it was possible, and it would make the computer much cheaper."
And the Galaksija (Galaxy) was born.
(Score: 4, Informative) by Rich on Sunday October 27, @01:31PM (4 children)
Any idea how the video generation works? There's a 2716 on the address bus, a '166 shifter behind it, and that's it. IIRC the ZX80/81 have the CPU slide across synthetic zeroes (which are NOPs) for address generation (toy sketch at the end of this comment), but here?! None of that; it seems to be feature-by-omission.
Here are two pages with more details, but they don't explain the video part:
https://blog.vladovince.com/galaksija/ [vladovince.com]
https://revspace.nl/Galaksija [revspace.nl]
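To show what I mean by the ZX80/81 trick, here's a toy C model as I understand it: the CPU is pointed at the display file and "executes" it, but hardware jams 0x00 (NOP) onto the data bus, so the advancing PC doubles as the video address counter. The buffer size and names are made up for illustration.

    #include <stdint.h>
    #include <stdio.h>

    #define LINE_CHARS 32                   /* characters per scan line */

    int main(void)
    {
        uint8_t display_file[LINE_CHARS];
        uint16_t pc = 0;                    /* program counter = video address */

        for (int i = 0; i < LINE_CHARS; i++)
            display_file[i] = (uint8_t)('A' + i % 26);

        while (pc < LINE_CHARS) {
            uint8_t ch = display_file[pc];  /* fetch lands in the display file */
            uint8_t opcode = 0x00;          /* ...but the bus is forced to NOP */
            (void)opcode;                   /* CPU executes NOP; PC just moves */
            printf("addr %04X -> char '%c' goes to the character ROM\n",
                   (unsigned)pc, ch);
            pc++;
        }
        return 0;
    }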
(Score: 2) by JoeMerchant on Sunday October 27, @02:40PM
Atari had fairly good documentation for their GTIA chip. Essentially, you task the CPU with producing the same signals. If you only want monochrome, I believe you can get all the levels you need from two output pins; if you want some grayscale capability, you can build a makeshift DAC with resistors on the digital outputs.
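A minimal sketch of that resistor-DAC arithmetic, assuming push-pull outputs and common hobbyist resistor values (my assumptions, not anything from an Atari schematic):

    #include <stdio.h>

    #define VCC    5.0      /* logic-high voltage */
    #define R_LOAD 75.0     /* TV input impedance (ohms) */
    #define R_SYNC 1000.0   /* resistor on the sync-level pin */
    #define R_VID  470.0    /* resistor on the video-level pin */

    /* Node voltage at the load: Norton sum of each high pin's current over
     * total conductance (low pins act as grounds through their resistor). */
    static double level(int sync_hi, int video_hi)
    {
        double g = 1.0 / R_LOAD + 1.0 / R_SYNC + 1.0 / R_VID;
        double i = (sync_hi ? VCC / R_SYNC : 0.0)
                 + (video_hi ? VCC / R_VID : 0.0);
        return i / g;
    }

    int main(void)
    {
        printf("sync tip (0,0): %.2f V\n", level(0, 0)); /* deepest level   */
        printf("black    (1,0): %.2f V\n", level(1, 0)); /* ~0.3 V blanking */
        printf("white    (1,1): %.2f V\n", level(1, 1)); /* ~0.95 V white   */
        printf("spare    (0,1): %.2f V\n", level(0, 1)); /* mid-level gray  */
        return 0;
    }

The leftover (0,1) combination lands at a mid level, which is where a third resistor or more pins would start buying you real grayscale.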
(Score: 3, Interesting) by JoeMerchant on Sunday October 27, @02:46PM (1 child)
I suppose I should add: composite video is a single analog time-series signal. The 2D picture is encoded as large-amplitude sync pulses with lower-amplitude picture intensity levels between them.
It seems to me that an ordinary CPU from the 80s wouldn't have much capability left over while producing video, but I suppose you could at least get some general-purpose cycles done during the blanking portions of the frame scan.
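To make that concrete, here's one scan line as a time series, a sketch using ballpark NTSC-ish numbers (63.5 us per line, ~4.7 us sync pulse), sampled at roughly 1 us per sample; the timings are my approximations, not broadcast-accurate:

    #include <stdio.h>

    #define SAMPLES 64
    #define V_SYNC  0.0     /* sync tip: the deepest excursion */
    #define V_BLACK 0.3     /* blanking / black level          */
    #define V_WHITE 1.0     /* peak white                      */

    int main(void)
    {
        double line[SAMPLES];

        for (int t = 0; t < SAMPLES; t++) {
            if (t < 5)
                line[t] = V_SYNC;                      /* horizontal sync    */
            else if (t < 10)
                line[t] = V_BLACK;                     /* back porch         */
            else if (t < 62)
                line[t] = (t % 2) ? V_WHITE : V_BLACK; /* picture: test bars */
            else
                line[t] = V_BLACK;                     /* front porch        */
        }

        /* A CPU bit-banging this has hard real-time work during the active
         * region; the porches and vertical blanking are where spare cycles
         * live. */
        for (int t = 0; t < SAMPLES; t++)
            printf("t=%2d us  %.1f V\n", t, line[t]);
        return 0;
    }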
(Score: 1, Informative) by Anonymous Coward on Sunday October 27, @03:05PM
Lancaster's "Cheap Video Cookbook" explains how to do it with a 6502.
As you suspected, it takes almost all of the CPU's time.
(Score: 3, Informative) by Rich on Monday October 28, @01:10AM
I watched the 2012 29C3 presentation on the machine by Tomaz Solc. They use the refresh counter, interleaved with "useless" single-byte instructions. The missing 8th bit of the Z80 refresh address is reconstructed from a bit in the '174 latch. (I think everyone - especially 4164 users - would have been happy if Zilog had eventually added 8-bit refresh in a later stepping.)
And they were SO strapped for ROM space that they dual-used ASCII text as the "useless" instructions in the video driver (complementing some bytes so that, executed as opcodes, they wouldn't mess up state). They also used a hard-to-layout keyboard matrix that saved a few bytes in the decoding routine.
Impressive minimalism!
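Out of curiosity, here's a toy C model of that address trick as I understand it from the talk: R supplies the low 7 address bits, the '174 latch bit supplies A7, and I'm assuming the driver flips that bit when the 7-bit counter wraps (the names and that last detail are my guesses, not the real netlist).

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        uint8_t r = 0;      /* Z80 refresh register: only bits 0-6 count */
        int latch_a7 = 0;   /* extra address bit held in the '174 latch  */

        for (int fetch = 0; fetch < 160; fetch++) {
            /* Video address seen by the character ROM path during the
             * refresh half of each single-byte instruction's M1 cycle. */
            uint8_t vaddr = (uint8_t)((latch_a7 << 7) | (r & 0x7F));

            if (fetch < 3 || fetch == 128)
                printf("fetch %3d -> video addr 0x%02X\n",
                       fetch, (unsigned)vaddr);

            /* R bumps on every opcode fetch, but bit 7 never carries. */
            r = (uint8_t)((r & 0x80) | ((r + 1) & 0x7F));

            /* My assumption: the driver toggles the latched A7 on wrap. */
            if ((r & 0x7F) == 0) {
                latch_a7 ^= 1;
                printf("7-bit counter wrapped; latch A7 -> %d\n", latch_a7);
            }
        }
        return 0;
    }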