(Score: 2) by Thexalon on Tuesday October 03 2023, @12:27PM (4 children)
One of my favorite computer books of all time is Undocumented DOS [archive.org]. It turned out that if you knew what you were doing, you could read and even alter DOS internals to learn all sorts of interesting things about your computer. This was especially true on an IBM PC, which lacked the CPU and memory privilege protections that are common now: you could, if you wanted, do things like write a new interrupt handler and tell the CPU to use it instead of the OS's version.
Of course, viruses and other assorted bad guys also knew how to take advantage of this stuff.
The only thing that stops a bad guy with a compiler is a good guy with a compiler.
(Score: 2) by Freeman on Tuesday October 03 2023, @08:30PM
Eh, viruses weren't a real threat for the vast majority of users until the age of Windows. Though DOS was around for a good while and wasn't really supplanted until the Windows 2000/NT/XP era.
Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
(Score: 3, Informative) by JoeMerchant on Saturday October 07 2023, @03:49PM (2 children)
>you could if you wanted to do things like write a new interrupt handler and tell the CPU to use that instead of the OS's version.
In 1991 this was standard practice under MS-DOS if you wanted your RS-232 data to travel in and out of the PC intact and usefully connected to a processing program.
Fun tidbit: RS-232 handler tutorials of the day (mostly found in bookstores) would publish examples of the interrupt-based receiver and buffer, and they mostly worked well, but EVERY SINGLE ONE would leave the interrupt-driven transmit and buffering code "as an exercise for the reader." The reason: the standard 8250 UART — the one-byte-buffer serial chip in PCs (the 8259 next to it was the interrupt controller), including the versions integrated into system-on-chip silicon — had a bug in the transmit interrupt hardware. It just didn't work as intended or documented; you basically had to hand-feed data to the transmitter without relying on an interrupt from it to tell you when it was ready for the next byte.
By the mid-1990s, when modems had gotten fast enough that people were connecting graphical browsers to the internet, the standard PC serial chips moved up to the 16550 and similar 16-byte-FIFO models, and then everything worked. But by then we had settled on a commercial RS-232 interface library, letting its vendor re-code for every new release of DOS/Windows while we interfaced with their more stable API. See, those "undocumented DOS" techniques, while they were the only thing that was really practical at the time, were also virtually guaranteed to break every time a new "99% compatible with legacy code" version of DOS was released, which was several times a year back in those days.
🌻🌻 [google.com]
(Score: 2) by turgid on Sunday October 08 2023, @11:51AM (1 child)
I really, really hate those "exercises for the reader" where they don't tell you the whole story, i.e., that something is broken and you'll have to work around it yourself. When I was younger and tended to believe my elders and betters, I would have gone mad wondering what was wrong with my code when the actual hardware was at fault.
I refuse to engage in a battle of wits with an unarmed opponent [wikipedia.org].
(Score: 2) by JoeMerchant on Sunday October 08 2023, @12:41PM
I fought the problem from 1990 until 1995 and didn't learn that the hardware was at fault until 2003. I knew the system was borked, just not at what level for sure. Until 2003 I assumed Microsoft had screwed up somehow. Given their position in the field, I still feel comfortable laying the blame at their feet for ignoring the problem for over a decade.
🌻🌻 [google.com]