
Journal of mcgrew (701)

Wednesday May 10, 23
01:28 AM
Digital Liberty

This was originally a comment in an article here at S/N that was posted a couple of days ago. Since I'm a little late posting it, I'll repeat it here.

The article was about a pop songwriter being sued by the late Marvin Gaye's estate for copyright infringement. The greedsters lost the case, thankfully. BUT,

Those Marvin Gaye songs would no longer be under copyright, and this case could never have come to trial, had it not been for Sonny Bono and a corrupt judicial system.

The rich pop singer Sonny Bono got himself elected to the US House of Representatives and made a lot of friends there before suffering a rich man's death in a skiing accident most could never afford. So for the poor dead talentless pop singer, Congress raised the copyright term from twenty-eight years, extendable another twenty-eight with proper paperwork and a fee as previously, to the author's life plus seventy years, ninety-five years for a corporate work, without any copyright paperwork at all unless it gets to court.

Since the constitution allows copyrights and patents only for limited times, anybody with two functioning brain cells can't possibly believe that the better part of a century past the author's lifetime ("The Congress shall have Power To...promote the Progress of Science and useful Arts, by securing for limited Times to Authors and Inventors the exclusive Right to their respective Writings and Discoveries;") is in any way constitutional. A lifetime plus most of a century is logically and reasonably unlimited.

I thought the law was passed as a result of bribery, but when folks tell me all congressmen are crooked, I say it's almost statistically impossible that 535 people would ALL be corrupt. Then I realized, they don't have to be corrupt. 535 cowards fearful of losing an election are easy prey for the likes of the Music And Film Association of America (MAFIAA). "Nice campaign, be a shame if anything happened to it. You know I'm a good citizen who gives you and your opponent both fifty million buck campaign contributions. Be a shame if he got a hundred million and you got nothing." Too bad none of those 535 have the courage to outlaw "contributing" to more than one candidate in any given race. It should be a felony with mandatory prison time.

But they can't extort the Supreme Court like that. They face no election and hold office until they retire or die. It's been shown publicly that Clarence Thomas broke every ethical rule the judicial system has, except that the rules don't apply to the Supreme Court! Since the other eight refuse to enact a code of ethics, I must assume all nine are as God damned dirty as Thomas.

That not only explains "Limited means whatever Congress says it means" but also why Citizens United treated a corporation as a person (one that can't go to prison or be executed for any crime, only fined).

America's slide into fascism is well underway. As you're aware, under fascism, business runs government. Under communism, government runs business. I don't see a lot of difference, both require dictatorships. I'm just glad I'm old enough to miss the end of this shit show, I won't exist thirty years from now.

Wednesday April 26, 23
01:28 PM
Science

TV meteorologists, every single one of them, are victims of our abysmal educational system. It shuts off children’s thinking and demands they not learn, but memorize. For example, history class. I always hated history until I reached college. In public school, they want you to memorize names and dates without ever mentioning why those names and dates are important, or how what happened in the past affects you and can happen again.
        So like almost everyone else in our once great nation, fallen greatly at the hands of the rich and the politicians they have purchased, meteorologists don’t think. It’s a wonder they could graduate college after the damage done in public school. Let’s outlaw private school! If the rich were forced to attend public school, things would vastly change for the better, because public schools would then be well funded.
        So it’s no surprise that their “feels like” temperature calculations are missing variables; that’s the first thing wrong with “feels like”. In the summer, the formula takes into account temperature and humidity, since hot wet air feels hotter than hot dry air. But of two eighty degree days with identical humidities, the one with a breeze won’t feel as hot.
        But they leave that variable out. Laziness, perhaps?
        In the winter, it’s temperature and wind. But then they ignore humidity, which works in the opposite direction in the winter: on two windless days with identical temperatures, the high humidity day will feel colder than the low humidity day.
        Summer or winter, the wind affects temperature. But the wind almost always changes, never a steady speed all day, making any “feels like” temperature flat out wrong almost any minute of any day.
        The one break I’ll cut them is that their science is still in its infancy, not really existing at all until we put up satellites. Maybe someone from Sweden or somewhere that they value education and teachers will set our dumbass meteorologists straight.
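For the record, the published formulas bear this out. Here’s a quick Python sketch of the NWS heat index regression and the NWS wind chill formula; note that the heat index function has no wind input, and the wind chill function has no humidity input, exactly the missing variables:

```python
import math

def heat_index_f(temp_f, rel_humidity):
    """NWS (Rothfusz) heat index regression: takes temperature (deg F)
    and relative humidity (%) only -- wind is not an input."""
    t, r = temp_f, rel_humidity
    return (-42.379 + 2.04901523*t + 10.14333127*r
            - 0.22475541*t*r - 6.83783e-3*t*t - 5.481717e-2*r*r
            + 1.22874e-3*t*t*r + 8.5282e-4*t*r*r - 1.99e-6*t*t*r*r)

def wind_chill_f(temp_f, wind_mph):
    """NWS wind chill formula: takes temperature (deg F) and wind speed
    (mph) only -- humidity is not an input."""
    v = wind_mph ** 0.16
    return 35.74 + 0.6215*temp_f - 35.75*v + 0.4275*temp_f*v

# A 90 deg F day at 70% humidity "feels like" roughly 106 deg F...
print(round(heat_index_f(90, 70)))   # about 106
# ...and a 20 deg F day with a 20 mph wind "feels like" about 4 deg F.
print(round(wind_chill_f(20, 20)))   # about 4
```

Neither function can even be handed the other’s variable, which is the complaint in a nutshell.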

Monday April 17, 23
06:39 PM
/dev/random

I didn't add much to the story, as I've been at the hospital visiting my daughter, who went into the ICU Thursday night with ketoacidosis. She went home this morning. It prompted the only part I've written, which follows:

        The band got on stage to start having a good time playing, as playing children always do. Of course, nobody ever really grows up, not even the geriatric. Not inside, anyway. Some people’s souls die, but otherwise there’s a child inside every old codger.
        Bill finished up in the pilot room, cursing that damned Mort for dying, and hurrying to the commons. Maybe he could actually catch a show tonight, if that damned phone would shut up and let him be for a while. He sat down next to Mary, who started trying to get the best of him, female style.
        Nobody ever really grows up. She pulled out a joint.
        Bill wrinkled his already wrinkled old nose. “Excuse me,” he said, and moved to the table Joe was sitting at by himself. After perfunctories, he said “That Mary! I’m glad I’m not Ralph or Jerry. Damned woman was hitting on me. I’m four times her age!”
        Joe grinned. “Is that what the company records of your entropy say?”
        “No, that’s what the tax collector says, charging me a year’s taxes for a three month run.”
        “Good evening, ladies and gentlemen. We’re going to start with a very, very old number called ‘Moondance’.”
        Sue started playing her flute.
        Harold, as usual, was missing the show, dealing with the various miseries elderly geezers always have most of the time.
        “It hurts when I raise my arm like that.”
        “Then don’t do that.”
        “Ha, Ha.”
        “Look, George, gettin’ old ain’t for wimps, you know? You think I don’t have all the aches and pains and heartaches and misery as everybody on the ship?”
        “Can’t you give me something?”
        “You have aspirin, don’t you?”
        “Yeah, but...”
        Harold rolled his eyes. “Let me tell you a little ancient medical history. About 1800, not sure the actual year...”
        “Krodley! ancient is right. How could it apply today? They didn’t even have electricity, did they?”
        “I don’t know, but they made a drug named ‘morphine’ out of a plant that’s now extinct called a poppy. It was kind of like a modern pain diffuser, but if you took too much for too long, you had a physical need for it, so they made strict rules, laws, actually, for its use.
        “They developed more and more powerful drugs in that class, but in the twentieth century fascism was born, and was nearly wiped out in a world wide war but the nascent movement started taking hold world wide in the twenty first...”
        “They taught us all this in high school!”
        “Not all of it, they didn’t. Just about how the entire planet became a fascist dictatorship. Now, the drug industry...”
        “The drug what?”
        “Believe it or not, producing drugs, actually all aspects of health care were monetized. A diabetic without the means to afford enough medication was doomed to a horrible death by ketoacidosis...”
        “You lost me.”
        “Their blood turns to acid.”
        “They were really that cruel?”
        “That’s what happens under fascism. Poverty could result in death by torture. But anyway, the opioids, as they were called, were legally only used for [FIXME] pain until the heartless drug dealers, very rich people who made medicines that doctors prescribed, somehow convinced everyone that their drugs could be safely used for [FIXME]. The result was millions of people addicted to the drugs the drug salesmen pushed, dying from overdoses, stealing to support their habits... it was awful. Believe me, you don’t want to go back to that. How about using a diffuser if it hurts that bad?” His instruments told him that George was in less pain than he was.
        He shook his head. “I can’t think straight with one of those.”
        “Drugs would be worse. Let’s get a beer and listen to some music.”
        “It’s Saturday?”
        “Well, yeah!”
        They walked down, and entered the room as raucous applause was ringing. “Good,” Doc said, “We didn’t miss it!”
        Before they reached a table, the applause died, and Bob’s amplified voice said “Thank you! Thank you! You’ve been a great audience, we’ll see you next Saturday!”
        “Well, shit.”

Wednesday April 12, 23
02:14 PM
Answers

I still haven’t found a catchy yet fitting name, so for the time being it’s Anglada.odt. It has turned out to be a sequel to Mars, Ho! and Voyage to Earth, and a prequel to Nobots. Bill Kelly returns, aged 245 Martian time, 61 relativity time. Einstein’s theory is the story’s main theme.
        This story has a lot I’ve not used before, like a dystopia. I’ve become really tired of reading future dystopias, it seems that’s the only thing kids can write these days. Probably because we’re sliding headlong into one, thanks to people like Elon Musk, Mark Zuckerberg, Bezos, the Sacklers, the Waltons, the Kochs, everyone who’s dirty, filthy, stinking rich with vast stock portfolios that include oil stock, who bribe legislators into shifting taxes from themselves onto the working class* while legally stealing its labor.
        But there are two “worlds” here: Earth, and the spacers on Mars and in the asteroids. Those living on asteroids are called “asterites,” a word Poul Anderson coined in his Industrial Revolution. The spacers (Asimov coined that one) live well, all of them what we would call overweight, a boon to someone living on Sylvia or even Mars because of the low gravity. Work is voluntary, and there’s a mandatory retirement age of sixty.
        Earth is my first dystopia, a real hellhole, with hurricanes on land, five mile wide EF-5 tornadoes, and everyone living underground, even the Amish. At the beginning of the story, the Yellowstone supervolcano has exploded, killing millions instantly and billions by war and starvation afterwards. It begins with Earth a dictatorship that resembles both fascism and communism; basically, the whole world is North Korea, with weather driving everyone underground and everyone skin and bones, always hungry. You kids like dystopias? There’s a pandemic that kills three quarters of the population... but fortunately, very little of the story concerns Earth. Most of it is on the trip to Centauri, and the Martian base.
        It’s also my first story with a sad part; I hate sad stories. Also the only story with a little kid, an orphan whose Grandpa is headed to Anglada.
        Here are some snippets, which may or may not be in the final book. It starts off:

History’s first human venture outside our star’s heliosphere was an utter catastrophe that ended in insanity.

        After a few paragraphs, most of Grommler is in it. Everyone thinks the insanity is from the plants on Grommler, but it’s the time stretch.
        Almost everyone in the story is elderly; the youngest three on the ship are in their fifties, and explaining why would involve a spoiler. The youngest, except for the psychologists in their early fifties, is a fifty five year old musician, there to put on shows for the crew. So the story’s a lot about music and all that goes with it, like insane copyrights, which have stretched to infinity in the story: everything before the twenty first century is public domain, and everything afterwards is under perpetual copyright owned by corporations.
        Computers write all books, plays, music... A geologist named Will is an amateur guitarist (there are no more professionals, it’s all computers) who thinks he sucks. Sue is a hydrologist who also plays a mean flute.

He finished the tune. “I told you I sucked,” he said as he put the guitar back on its stand.
        Sue was applauding. Bob said “Dude, that’s a much better version than what the computer plays.”
        “You’re just being nice.”
        Sue said, “No, really, that was good! Bob’s right, it was better than the computer version. The computer version has a lot more notes but no soul at all. You could make money playing that!”
        “You think so?” he said.
        “No,” Bob interjected. “A two hundred year old Earthian law says that an ancient corporation owns the tune and you have to pay them. There’s no way you could profit. Copyrights have been perpetual for two hundred fifty years now. Let me teach you some of the old, pre-copyright tunes. Here, here’s one called a Bolero...”

        It isn’t mentioned by name, but the song Stairway to Heaven is in it, as is...

Three days later, Bob Black sat on the stage in the commons with his guitar, a real antique, a Fender Stratocaster, tuning it with a normal electronic tuner like they’d had almost since the Strat had been invented. The computer generated Muzak that Bob hated played. Bar stools were all occupied and a large fraction of the tables were, as well. Half of the people there had never heard real music, played on a real musical instrument by a real person before.
        Bob’s family had been musically inclined for generations. He had been named after another guitar player long ago, his great grandfather Rob Black; both were named “Robert Black” on birth certificates.
        Not only had he seemingly inherited his musical talent (science never said talent was hereditary, but musicians did), he had also inherited books and books of sheet music going back centuries. He’d had them digitized, and the physical books were locked up in a warehouse on Mars.
        His guitar tuned up, he started with an ancient tune called “Thirty Days in the Hole” from one of the antique books. He never had found out what “Newcastle Brown” was, a disease, maybe?

        Unlike way too much science fiction, mine always actually has real science, scientists, and possible future engineering. The main science in this one is psychology, although there are other fields.
        There are no computer scientists in the story, but lots of computers. I wonder what OS they’ll be running in a few hundred years?
        So far it’s about 27,000 words and 85 pages, maybe a third of the way finished.

* In 1940, the income at which the lowest federal tax bracket began was over four times the median income. In the 1950s and '60s a single paycheck paid a family's bills, and the minimum wage would support a young couple with a child. We have been ROBBED silently.

Monday March 27, 23
07:49 PM
Code

Sorry I'm just linking my personal site but there's just too much formatting to move it here, and I'm lazy today. I haven't even worked on the novel.

But here's a small bit that needs no formatting:

Ever since 1946, when ENIAC was unveiled (or rather, the presidential election of 1952, when CBS News introduced the computer to America), computers have been called "electronic brains". The name is half right; they are, in fact, electronic. But they're not brains.
<snip>
You can do a lot with numbers. You can compute orbital trajectories, predict orbits of comets and asteroids, engineering, cooking... you can even create simulations and recordings of auditory and visual signals, but they can't create or mimic reality. But people still call them "electronic brains" and speak of "artificial intelligence".

You can't mimic intelligence, but you can fake it. Margarine is more honestly called "butter" than what a computer does can be called "intelligence". The only intelligence is the real, chemical, analog intelligence: that of the programmer.

It's a trick, not unlike the ones David Copperfield performs.

I learned magic at age seven. When my sister's grandson was four, she was showing me her new computer, and the child asked her how computers work. She shrugged, and said "it's magic." As Arthur C. Clarke said, "Any sufficiently advanced technology is indistinguishable from magic."

That's why those of us who actually understand how computers work are called "Wizards".

Magicians use subterfuge and misdirection, among other tools. The AI misdirection is from anthropomorphism and animism, two powerful forces on the human psyche.

People are easy to fool.

I thought of this as a huge problem for the future, when some evil man will use "artificial intelligence" to subjugate populations. I later found that I wasn't the only one; in the beginning of Frank Herbert's Dune there had been a jihad against "thinking machines", which were therefore illegal.

I decided to do something about it and wrote a program to convince people that computers couldn't really think, by writing one that seemed to but was insane. The problem was, when I explained that it couldn't really think, that it was just trickery, they wouldn't believe the Wizard, probably because of that Oz guy.

There is more at the link, including some original source code and a scan of part of its printout.

Friday March 24, 23
01:38 PM
Hardware

Every now and then I see something on the internet, usually at Soylent News, about pre-digital music recording. It's almost always incorrect; something someone just thought up or heard from someone else. Some of these people are actually pretty knowledgeable about most technologies.
        First, one needs to know the difference between analog and digital. One is computer codes and the other is, literally, an analogy of the sound wave. With analog music, the more money you spent on equipment, especially speakers but all of it, the more it would sound like real instruments rather than an analogy. This was called High Fidelity when it was actually accurate enough that you couldn’t tell a recorded timpani from a real drum. With digital equipment it doesn’t matter as much; tricks have been developed in the last few decades to fool your ears — no, actually, to fool your brain.
        So I thought I'd start at the beginning, with the birth of recorded sound and dispel all the falsehoods while I'm at it; or at least, the ones I’ve heard.
        I have personally lived through the last seventy years of innovation and change. When I took a physics class on sound and its recording, digital sound recording had yet to be invented.
        Ever since the 1940s or possibly earlier, all albums were copies. One difference between analog and digital is that with every child copy, an analog signal degrades, but a copied digital signal is identical to its parent, because it is no longer a signal. It’s a series of numbers, measured voltages. In analog, as the signal from the microphone gets stronger, the voltage feeding the tape head gets stronger.
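You can sketch that difference in a few lines of Python. The noise model here is purely illustrative (a little random noise added per analog generation), but the digital side is the whole point: a copy of numbers is the same numbers:

```python
import math
import random

random.seed(1)

# A "master" signal: part of a sine wave, stored as voltage samples.
master = [0.5 * math.sin(i / 10) for i in range(200)]

def analog_copy(signal, noise=0.01):
    # Hypothetical model: each analog dub adds a little random noise.
    return [s + random.gauss(0, noise) for s in signal]

def digital_copy(signal):
    # A digital copy is just the same numbers again; nothing degrades.
    return list(signal)

def rms_error(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

tape = master
for _ in range(10):              # ten generations of analog dubbing
    tape = analog_copy(tape)

cd = master
for _ in range(10):              # ten generations of digital copying
    cd = digital_copy(cd)

print(rms_error(master, tape))   # noise has accumulated, generation by generation
print(cd == master)              # True: the tenth digital copy is the master
```

The exact noise figure doesn’t matter; what matters is that the analog error only ever grows, while the digital copy never changes at all.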
        In 1877, a century before I attended that class, Thomas Edison invented the phonograph, named from the Greek for “sound writing”. The first recordings were on tin foil. In 1896 and 1897 he mass produced his phonograph players and wax cylinders. You can hear some of them at the National Park Service website.
        At one point he developed a talking doll, with a phonograph inside. It was a commercial flop; women had to scream into the recorder, as electronics wouldn't exist until 1904, when Fleming developed the vacuum tube (called the “valve” in Britain; both names are accurate). They had one of the dolls on the TV show Innovation Nation. I imagine they would have scared the hell out of little girls.
        In 1900 he patented a method of making his cylinders out of celluloid, one of the first hard plastics. Cylinders had been produced in France since 1893, but were not mass produced as Edison’s 1900 cylinders were. Dictaphones used wax cylinders until 1947.
        Alexander Graham Bell is often credited with inventing the gramophone, probably because of its name, but it was patented in 1887 by Emile Berliner, who named it. He manufactured the disks in 1889. He came up with the lateral cut, where the needle moved side to side rather than up and down as with Edison’s phonograph.
        The first records were 12.5 cm (about five inches). Berliner's associate, Eldridge R. Johnson, improved the gramophone with a spring loaded motor and a speed governor, making the sound as good as Edison's cylinders. However, it would be a few decades before high fidelity.
        The earliest records ran at “about 70 RPM”; 78 RPM became the standard between 1912 and 1925, the year all companies standardized on it. Records have since been produced at 8 1/3, 16 2⁄3, 33 1⁄3, 45, and 78 RPM, and modern turntables still play them.
        I have seen comments saying you can’t do deep bass in vinyl because the needle would jump out of the groove, which is one of those things that’s partly right while still being completely wrong.
        This was solved by a “rollover frequency”: records were cut with the bass attenuated, then the bass was returned to full volume on playback. However, it created another problem. The records you produced, played on a record player you produced, sounded pretty good. But played on anybody else’s record player, you would have to adjust the tone control to make them sound any good at all.
        This is why the Recording Industry Association of America (RIAA) was formed; to standardize the “rollover frequency”. It’s described well in Wikipedia. Since then, anyone’s record will play on anyone else’s player, and the quality depended on the quality of the disk and the equipment it was played on.
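For the curious, the standardized playback curve can be computed from its three published RIAA time constants (3180, 318, and 75 microseconds). Here’s a minimal Python sketch of the de-emphasis magnitude, normalized to 0 dB at 1 kHz, assuming the textbook one-zero, two-pole form:

```python
import math

T1, T2, T3 = 3180e-6, 318e-6, 75e-6   # RIAA time constants, in seconds

def riaa_playback_gain(freq_hz):
    """Magnitude of the RIAA playback (de-emphasis) response: a zero at
    about 500 Hz between poles at about 50 Hz and 2122 Hz."""
    w = 2 * math.pi * freq_hz
    num = 1 + (w * T2) ** 2
    den = (1 + (w * T1) ** 2) * (1 + (w * T3) ** 2)
    return math.sqrt(num / den)

def riaa_db(freq_hz):
    """Playback gain in dB relative to 1 kHz."""
    return 20 * math.log10(riaa_playback_gain(freq_hz)
                           / riaa_playback_gain(1000))

print(round(riaa_db(20), 1))      # about +19.3 dB: bass restored on playback
print(round(riaa_db(20000), 1))   # about -19.6 dB: treble boost cut back down
```

Recording uses the exact inverse curve, which is why any record plays correctly on any player that applies this standard.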
        However, the curve wasn’t standardized until the middle 1950s, when I was a child and high fidelity, usually called “hi-fi”, came about. Its aim was to reproduce the sound as accurately as possible, so good that a blind person couldn’t tell the difference between a recording and a live performance. They never quite got there, but they got really close. They gave up on fidelity when they invented the Compact Disc.
        An old, pre-digital myth presented itself when I was a teenager. My dad’s friend was an audiophile, and once asked me if I thought he should buy a more powerful amplifier.
        “What’s the loudest you turn it up?” I asked.
        “About three.”
        “Nope,” I then answered. “more watts doesn’t make it sound any better, only louder.”
        Some folks think the more watts, the better it will sound. It’s a myth. Or that you need a lot of watts for deep bass. Also a myth; a 1974 Kenwood 777 speaker with its fifteen inch woofer had plenty of deep bass, low enough to feel, with a portable monophonic cassette recorder powered by C batteries. Hardly high fidelity, but deep bass, and treble as good as cheap stereos. With a high fidelity receiver they would fool most into thinking it was live.
        Today’s “sub woofers” are magic; magic as in David Copperfield magic. They fool the brain into thinking there’s deep bass, because they transmit subsonics you can feel, making it seem like the bass is good, but play it with real high fidelity speakers on the same equipment and you’ll hear what a real woofer can do as opposed to a subwoofer. If you need a subwoofer, you don’t really have much bass at all. It’s a trick. There’s a lot of sound on that record that simply doesn’t come out of those cheap speakers that you can hear clearly with a pair of quality speakers with real woofers.
        By the 1950s the sound was good enough, if you could afford the high fidelity speakers, that the only way adult ears could tell the difference was noise: tape hiss and dust on the final record. Tape hiss was minimized and even eliminated by speed; the faster the tape passed the heads, the higher the frequency of the hiss. At about 15 IPS (inches per second) the hiss was inaudible, as it was above the range of human hearing.
        The best high fidelity home tape decks ran at 15 IPS and were very expensive. Studio recordings were made at 30 IPS, twice the speed needed for hiss removal. Fidelity can’t get much higher than that unless they vastly increase the sample rate of digital recording, or get the ferrite grains on the tape smaller.
        It was about this time that stereo was invented. Stereo tape was easy: simply have two separate coils in the tape head, each sending a signal when recording, and receiving it when playing back. These would play both channels mixed together on monophonic tape machines. However, playback is slightly different from recording, so all but the cheapest recorders have separate heads for recording and playback.
        But how can you have two signals in a single groove of a vinyl record? How do you maintain the backwards compatibility that had existed since the Gramophone was invented? I found out in a physics class in the late 1970s.
        As mentioned earlier, the needle wiggles side to side in the same shape as the sound waves. For stereo, the side to side motion carries the sum of the two channels, which is why a mono player still works, and the up and down motion carries their difference. On playback, the cartridge adds and subtracts the two signals to recover the separate left and right channels.
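The arithmetic is just a sum and a difference, simple enough to sketch in Python; the lateral (sum) signal is all a mono cartridge ever sees, which is how backwards compatibility was kept:

```python
def cut_groove(left, right):
    """Encode one stereo sample pair into groove motion: lateral
    (side to side) carries the sum, vertical (up and down) the difference."""
    lateral = (left + right) / 2   # a mono player reads only this
    vertical = (left - right) / 2
    return lateral, vertical

def play_groove(lateral, vertical):
    """A stereo cartridge adds and subtracts to recover both channels."""
    return lateral + vertical, lateral - vertical

# One sample from each channel (values chosen to be exact in binary).
l, r = 0.75, -0.25
lat, vert = cut_groove(l, r)
print(play_groove(lat, vert))   # (0.75, -0.25): the round trip recovers both
print(lat)                      # 0.25: the mono mix an old player would reproduce
```

The same sum-and-difference idea later carried over to FM stereo broadcasting, for the same compatibility reason.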
        I couldn’t remember which channel was which, so I googled, and wow! The internet is certainly full of nonsense. One site with “labs” in its name gave an explanation that was very complicated, was believable, and completely wrong, with images that could fool you.
        Even if it’s published in a bound book it may be bullshit. I have a half century old paperback titled Chariots of the Gods that “proves” that the earthen lines in Peru are evidence of extraterrestrial visitation, but it’s obvious to me from looking at them that they were ART. We artists do things like that, even though normal people don’t understand. The book was nonsense, the type of nonsense we call “conspiracy theory” in the 21st century. Way too many people think if a thing could be, that it must be. Occam’s Razor and my college professors’ teachings say they’re artworks.
        I’ve seen comments claiming that in the fifties and sixties they made records with attenuated bass and treble so they would sound okay in car radios, which is patent nonsense. They weren’t recorded with attenuated bass and treble; you simply can’t get bass from a three inch speaker, and radios were AM only back then. AM radio and its tiny speaker were the limitation, not the music they played.
        They always strove for the highest fidelity possible in the uber-expensive stereo systems that cost thousands of dollars; if you bought a record that made your expensive stereo sound like a Fisher-Price toy, would you buy another record produced by that company?
        Car radio sucked because cars then had abysmal acoustics, and AM has never been remotely capable of high fidelity. Even analog FM falls short, due to bandwidth constraints. Car radios were all amplitude modulation (AM); frequency modulation (FM) was new, not much used until the 1960s, and car radios remained AM only until after 1970. AM radio has a very limited frequency response and unlimited noise: hisses and crackles from things like lightning in Tierra Del Fuego that frequency modulation lacks.
        I’m not going into detail about radio broadcasting here, perhaps in a later article. But if you had a copy of an early record from Jerry Lee Lewis or Chuck Berry, or even something silly like “My Boomerang Won’t Come Back” (it’s on YouTube, I’m sure), on a high end stereo it will sound like Mr. Lewis or Mr. Berry are in the room with you, except that the dust on the record will sound like it’s raining, with an occasional hailstone.
        Now, my dad bought a furniture hi-fi stereo that he paid hundreds of dollars for after his friend introduced him to high fidelity stereo classical music back in the early 1960s. He worked over his vacation to pay for it. This was when a McDonald’s hamburger was fifteen cents and the minimum wage was a dollar (note that the burger’s price stayed the same after the minimum wage went up to a buck fifty, despite politicians’ lies that raising the minimum wage causes inflation, a non-music, non-tech debunking).
        Even Dad’s expensive stereo wasn’t high enough fidelity to fool you, but I bought a stereo system when I was stationed in Thailand that would; sound equipment was expensive in America because of crazily high tariffs. I would have spent ten times as much on that stereo in America, but GIs could import duty-free. A Chuck Berry record played on that stereo sounded like Chuck Berry was in the room with you, with rain from the dust and scratches.
        I don’t remember exactly when Dad bought that furniture stereo, which now sits in my garage, but it was probably a couple of years before the cassette was invented in 1963 by the Dutch. Originally for dictation, the earliest ones were far from high fidelity. The eight track was invented a year later by a consortium of companies, wanting to bring stereo music to the automobile; no car had FM or stereo then.
        The cassette used eighth inch tape and the eight track quarter inch, which should have made the eight track superior, as should its 3 3/4 IPS speed, twice as fast as a cassette’s 1 7/8.
        I never had an eight track, unless you count the player in the stereo my wife owned when I married her. I’d had cheap reel to reel portables since I was twelve, and bought a portable monophonic cassette recorder when I started working in 1968.
        One myth wasn’t a myth to begin with. In 1964, the eight track was indeed superior to the cassette, due to its size and speed, as I mentioned. But eight tracks have disadvantages, and their possible advantage wasn’t followed.
        Cassettes got better and better fidelity until factory recorded cassettes surpassed factory eight tracks; they had invented eight tracks for cars and cassettes for dictation. But cars had abysmal acoustics back then, far worse than even today. Plus, nobody but the very richest had air conditioning in cars, so the stereo had to compete with wind and road noise, and producers didn’t bother with fidelity.
        By 1970 the studios had started producing pre-recorded cassettes, which sounded better than pre-recorded eight tracks because eight tracks were designed for cars, but people still thought eight tracks were superior despite their terrible habit of cutting off songs in the middle. Relatively few had cassettes; most folks had eight tracks, because of the myth. I busted that myth for a buddy in the Air Force in 1971 by simply playing a cassette.
        I always thought that designating eight tracks for cars and cassettes for homes was incredibly stupid, completely backwards. You could fit a cassette in a shirt pocket, but a cartridge was roughly four times as big as a cassette while holding the same amount of music.
        The eight track got its name from its eight tracks: four stereo programs of two channels each, rather than one or two, which gave away the format's tape-width advantage. Packing four programs side by side let more tape time fit in the cartridge, but it meant four program changes as opposed to the cassette's and vinyl's two sides. And if the tape was “eaten”, pulled from its cartridge and wrapped around inside the player, it was almost impossible to repair, unlike a cassette, which was relatively easy.
        Dolby noise reduction was developed for recording studios’ master tapes in 1965 and introduced to cassettes in 1968. It worked in a fashion similar to the RIAA equalization for vinyl: when recording, the higher frequencies are greatly boosted, then reduced on playback. As the treble is attenuated, the tape hiss is attenuated with it.
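        If you're curious how that trick looks in code, here's a toy sketch of the pre-emphasis/de-emphasis principle, not Dolby's actual circuit; the coefficient and the first-order filters are my own illustrative choices. Boost the highs going onto the tape, cut them back on playback, and any hiss the tape added in between gets cut too:

```python
# Toy pre-emphasis/de-emphasis noise reduction sketch (the principle behind
# Dolby-style systems, NOT the real Dolby circuit or its coefficients).
import math
import random

A = 0.9  # emphasis coefficient; an illustrative value, not a spec

def pre_emphasis(x):
    # y[n] = (x[n] - A*x[n-1]) / (1-A): boosts high frequencies
    y, prev = [], 0.0
    for s in x:
        y.append((s - A * prev) / (1 - A))
        prev = s
    return y

def de_emphasis(y):
    # exact inverse: z[n] = (1-A)*y[n] + A*z[n-1], a high-frequency cut
    z, prev = [], 0.0
    for s in y:
        prev = (1 - A) * s + A * prev
        z.append(prev)
    return z

# A low-frequency "music" tone survives the round trip unchanged...
tone = [math.sin(2 * math.pi * 100 * n / 44100) for n in range(2000)]
restored = de_emphasis(pre_emphasis(tone))
assert max(abs(a - b) for a, b in zip(tone, restored)) < 1e-9

# ...while hiss injected *between* the stages (the tape) comes out quieter,
# because the playback stage cuts its high-frequency energy.
random.seed(1)
hiss = [random.uniform(-1, 1) for _ in range(2000)]
quieted = de_emphasis(hiss)
print(sum(s * s for s in quieted) / sum(s * s for s in hiss))  # well below 1
```

The signal goes through both stages and is untouched; the hiss only goes through the second, so it loses most of its energy.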
        A twenty year old high end cassette deck is cheap. With the best, high priced equipment, a cassette can sound as good and have almost as good a frequency response as a CD, (up to 18 kHz compared to CD’s 20 kHz), although not CD’s dynamic range, which is even better than vinyl. But a CD can’t match vinyl’s frequency response, being capped at 20 kHz because of the Nyquist limit, which I’ll discuss shortly.
        “Quadraphonics” was introduced in the early 1970s, what we call “surround sound” today. There were four separate channels, two in the front and two in the rear, and the movie studios and theaters got it entirely wrong. Those two rear channels often detract from the movie, removing the magic and bringing you back to reality when the moronic director stupidly makes everyone’s head twist around to see what made that sound behind them. The four speakers should be positioned at the four corners of the screen, so sound can move up and down as well as side to side.
        Quadraphonic stereo was easy to make with eight tracks and cassettes. You simply added two coils to the tape head, each coil feeding a separate channel. This actually improved eight tracks, since there was only one track change. Cassettes had none, because they could be recorded on one side only, since a cassette only has four tracks. That’s all that could fit on a tape that narrow, so quadraphonic cassettes had to be rewound.
        An album was a different matter. I remember that once I had a stereo album that I had to replace; I don’t remember why, but its replacement was quadraphonic and didn’t sound as good as the stereo version on my turntable. Something was missing, and I couldn’t tell what. It sounded the same, but it didn’t. At the time, I had never heard a song in quadraphonic stereo. I didn’t know why it was different until I found out in that physics class later.
        They solved the problem of how to get four channels out of a single groove with electronics. They modulated the rear channels onto a supersonic carrier, around 30 kHz, and mixed it with the front channels; on playback the front channels were limited to 20 kHz and the rear channels demodulated.
        What was missing was the supersonic harmonics, over the 20kHz cutoff. Very few speakers back then and none today were good enough to tell the difference, but the pair I had went all the way to 30 kHz. You can’t hear tones above 20 kHz. For most people it’s closer to fifteen, especially older people. However, those high frequency harmonics affect the audible tones, and sound engineers can’t seem to understand that, insisting that sounds higher than you can hear can’t affect sounds you can, but I heard the difference with my own twenty five year old ears and learned what was missing the next year after the professor explained how quadraphonics worked.
        I say test it. Get thirty or forty children and teenagers and high quality headphones capable of faithfully reproducing super high frequency tones, and feed a 17 kHz sine wave to the headphones, with instructions to the kids to push a button when the sound changes. After a short time after the trial starts, change the tone from a sine to a sawtooth. I say the majority will press the button right after the tones change, the engineers say I’m full of shit. Stop making assumptions like a conspiracy theorist and TEST it scientifically! Science, bitches! Aristotle was a long time ago.
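        For anyone who wants to check the premise before rounding up the kids, here's a sketch of the stimulus in Python; the 192 kHz rate and the single-bin DFT are my own choices, not part of any standard test. A 17 kHz sine and a 17 kHz sawtooth share the same fundamental; everything that distinguishes them lives in harmonics at 34 kHz and up, above the CD's cutoff:

```python
# Sketch of the proposed listening test's stimulus: a 17 kHz sine vs. a
# 17 kHz sawtooth, generated at 192 kHz so the harmonics aren't cut off.
import math

RATE = 192000   # sample rate, chosen high enough to keep 34 kHz and 51 kHz
FREQ = 17000    # the test tone
N = 19200       # 0.1 s of signal; exactly 1700 cycles, so DFT bins line up

def sine(n):
    return math.sin(2 * math.pi * FREQ * n / RATE)

def sawtooth(n):
    t = (FREQ * n / RATE) % 1.0
    return 2.0 * t - 1.0

def power_at(signal, freq_hz):
    # crude single-bin DFT: normalized power of the signal at one frequency
    re = sum(signal[n] * math.cos(2 * math.pi * freq_hz * n / RATE) for n in range(N))
    im = sum(signal[n] * math.sin(2 * math.pi * freq_hz * n / RATE) for n in range(N))
    return (re * re + im * im) / (N * N)

sin_wave = [sine(n) for n in range(N)]
saw_wave = [sawtooth(n) for n in range(N)]

# Both carry the 17 kHz fundamental...
print(power_at(sin_wave, 17000), power_at(saw_wave, 17000))
# ...but only the sawtooth has energy at the 34 kHz second harmonic.
print(power_at(sin_wave, 34000), power_at(saw_wave, 34000))
```

Cap everything at 20 kHz and the only thing separating the two waves is gone, which is exactly the author's point of contention.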
        This, I say, is what’s wrong with “digital sound”, which is actually a misnomer. All sound is analog; an analog of the original sound comes out of the speakers regardless of whether the recording is stored in analog or digital form. It was invented when the biggest, most expensive computers on the planet were finally fast enough to record sounds up to 20 kHz, the limit of human hearing, and cheap computers were capable of playing them back.
        Digital sound, again a Dutch invention (Philips, joined by Sony), works by periodically measuring the voltage coming out of the microphone. With a CD, the voltage is sampled 44,100 times a second and those numbers are stored on the disc. The 20 kHz cap comes from the Nyquist limit, named for Harry Nyquist, the man who discovered it.
        Back to our teenager test with the sine and sawtooth waves: a 17 kHz sine wave sampled 44,100 times per second and cut off at 20 kHz comes out the same as a sawtooth, as there are fewer than three samples per cycle, far too few to distinguish a sine from a sawtooth. But that untested theory says if you can’t hear it, it can’t color what you do hear. But a 17 kHz tone will audibly affect a 1000 Hz tone, even if your old ears can no longer hear a 17 kHz tone by itself.
        Double the sample rate and that 17 kHz tone gets about five samples per cycle. Multiply it by five and the differences should be striking, and digital should beat analog. But not at its present sample rate.
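        The arithmetic is easy to check:

```python
# Samples per cycle is just the sample rate divided by the tone's frequency.
def samples_per_cycle(sample_rate_hz, tone_hz):
    return sample_rate_hz / tone_hz

print(round(samples_per_cycle(44100, 17000), 1))      # 2.6 at the CD rate
print(round(samples_per_cycle(88200, 17000), 1))      # 5.2 at double the rate
print(round(samples_per_cycle(44100 * 5, 17000), 1))  # 13.0 at five times
```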
        The reason for the cutoff is that without it, ugly noise is introduced into a digital recording: aliasing, where any frequency above half the sample rate folds back down as a false, audible tone. Too few samples in a wave changes its shape completely.
        That is why I earlier said that sample rates and bits per sample could be high fidelity and even surpass vinyl if they vastly raised the sample rate. They couldn’t when the CD was invented, they certainly can now that CPUs are thousands of times faster.

Thursday March 16, 23
01:53 PM
OS

It's been a while since I installed Kubuntu, and since I bought the Windows 11 notebook I had forgotten what a pain in the ass any new OS is. But Mint is the least PITA in a long time, far easier than Kubuntu was, or even than setting everything up on the Windows 11 computer that already had it installed; installation is the easy part, at least after I used the OEM install. Except, have you ever installed Windows? Jesus, but any Linux distro is a breeze in comparison, unless they've changed it since XP.

The first headache was the ultra high definition. I had to bend forward and squint to be able to read the tiny type on the screen. At least the focusing muscles that operate the CrystaLens in my left eye got plenty of exercise. I really need to have one installed in my right eye. But things in this desktop's menu system are extremely easy to find, so I increased the font size to 16 points. Then I installed Audacity.

I wanted to get it going so I could record and play from it, but you couldn't read any of the tool items, because most of the text is covered up by the neighboring tool bar and they're unmovable. Oh, well. I'll get back to it, I thought. Let's see if I can open the novel I'm working on.

Libre Office is installed by default, and since there are no images, it will do nicely, although I'll have to see if Open Office is available. I started it (Mint's menu system is head and shoulders better than any modern Windows; everything is laid out logically and rationally, while Microsoft appears to want it to be difficult), hit the "file open" button, and had to tell it "other [something]"; lots of mouse clicks to get to the network drive, which showed its two directories, one marked "share" (the drive shipped with those two directories). But when I clicked it, it said it wasn't a directory.

Damn. Will I have to copy the files to the local drive? No, going into Mint's file manager and clicking the file opened it in Libre Office. Rather than its native Gentium Book Basic that I use for the body text of all my books, some huge sans serif font came up, so I'll have to install some fonts. Or just keep writing on the Windows computer, it's a lot easier to open the files there.

Still better than kubuntu. I couldn't access the network at all with it. But if I'm in Mint, at least I won't have to change computers to open a document.

Back to Audacity: I went looking for a way to change the screen resolution, found it, and set it to DVD resolution. Then I had to go back and reduce the font size. Fonts aren't all I'll have to fix, of course. I got the sound set up so it will play in Audacity, and found where in its administration to change it from HDMI to Line Out and back, but I still need to set up and test its recording. If I don't have it done by Friday it will be on the Kubuntu side this weekend, because KSHE will probably have their monthly "no repeat weekend", where I usually get a lot of new songs.

Lots more to do. A new OS is always a pain in the ass. Remember your first smartphone?

Wednesday March 15, 23
08:37 PM
OS

It finally fell into place. I got the network going in it, I had remembered that something, tfa? ...said that if a device wanted a login, to use “user” as the user name. But that didn’t work. Maybe it was connecting it to an Android. But it was too simple for me to immediately understand.
        In its file manager (I didn’t pay attention to its name), under “network” it listed everything connected to the router, including the drive. Clicking that pulled up the log in screen, and all that was needed was clicking “connect”.
        The answers in the last journal were all helpful and mostly informative, but no matter what method I used, its security blocked me. Damn, Windows is easy to hack! One mentioned another Linux on a disk or a thumb drive, and after a couple of beers I thought, with a start, “Hey, I have another Linux on that drive!” So today after lunch with my oldest, Leila, at D’Arcy’s, I fired up the Linux computer and booted to the kubuntu side. I was able to easily access the Mint side with Dolphin, but the key directories that would let me at it were blocked.
        I probably could have used a terminal, but I avoided it, read the links you guys supplied and tried them. All were excellent, but didn’t work. And after an hour or two my brain kicked itself in the donkey and said “dumbass, how long did it take to install Mint, and how much time have you spent trying to reset the password? Is your math that bad, dimwit?”
        So my brain rubbed its ass and put the disk back in. Now, when I first installed it, I wasn’t sure if Mint would be the answer to my problems, or suck, so I tried it out from the CD. When I saw that I could get on my network, I installed it from an icon on its desktop.
        That was a mistake. It was the cause of all my woes. This time, I chose OEM install from the boot menu. It came up as before, and I told it to overwrite the previous Mint installation.
        That made a world of difference! Poking around in this still dark and unknown distro, I discovered that I was in a temporary administrator account. A window popped up to update, so I updated, using the brain dead password I had told it when I first installed it. Then I went to install Audacity and XMMS and the few other apps not installed by default that I needed, once again being thankful that I didn’t have to uninstall a lot of third party cruft like you always have to do with a Windows computer.
        But XMMS said its package manager was broken. So I went to install GIMP, one of the package manager’s highlighted apps. It, too, said it was broken, but I knew better. There was just too much data trying to fly through the wires at once, and the installer choked. The updater was moving like molasses in January, so I switched to the Windows computer and started typing this.
        I flipped over to the Linux computer (my TV is my monitor), and there was a login field. Damn. So I moved the mouse/keyboard dongle back to the Linux computer, and kept entering wrong passwords until I realized I wasn’t logging in as the temporary administrator, which I’ll have to delete.
        Everything works now, and I want to thank you guys for your help. I may do another reinstall later and just give it the whole two terabytes.

Tuesday March 14, 23
07:05 PM
Code

Yesterday I finally got around to installing Linux Mint (Cinnamon) dual-boot on the Linux box that still has Kubuntu.

The OS seems every bit as good as Mandrake used to be, I've missed that distro. It's really slow compared to kubuntu, but kubuntu is incredibly limited and hard to use. Plus, it looks clunky compared to Mint, although I really don't care what my tools look like as long as they work well. Kubuntu's don't. I hate Kate, worst text editor I've used. Well, Debug might have been worse...

I installed it after finding while trying it from the CD that connecting it to the network would be a breeze if I had the correct IP. On installation, I wrote the root password down in Notepad on the new Dell notebook and saved it on a thumb drive.

So today I fired it up after typing maybe half a page of the new novel (I hit a wall, must need a break or something, it's about a quarter way done already) and armed with the network drive's IP address, it wanted the drive's password. It has no password, so I left it blank, and it said I had its password wrong.

Well, I thought, I'll see if I can install GIMP and Audacity and tackle the network drive later.

Who says you can't teach an old dog new tricks? I'm 71 next month and I found the installation app, and unlike kubuntu I actually got it to work. It had both programs proudly highlighted, so when I went to install Audacity, the Linux computer's primary app, it of course wanted the root password.

So I got the thumb drive it was stored on, plugged it in, and copied it and pasted it into the password field. I had constructed a jumbled password that almost demanded copy/paste.

Apparently I typed it into Notepad wrong. So, first, is it possible to retrieve a root password in Mint? If so, how?

If not, I'll reinstall Mint with a password that's impossible to screw up, then change it to something less stupid when I can build a crazy hard password that I can simply paste in.

I'll have to go in the basement after the Unix book to remember how to change the root password. I hate going down there because of the spiderwebs. There's probably a tool in the graphic desktop these days, I'll bet. Maybe I won't have to go down there until I need to change the furnace filter...

Wednesday February 15, 23
06:11 PM
Hardware

Most of this knowledge is from experience; forty years of trial and error. That’s about as long as anyone has been cooking with a microwave; commercial kitchens have never used them for anything but reheating.
        Some, like restaurant owners, think that microwaves are good for reheating but not for cooking, which is completely understandable. You learned to cook on a stove, have years or decades of experience cooking on the stove, but never learned to cook in a microwave, or practiced those skills. Different technologies require different methods; ask someone who learned to cook on a gas stove, then was forced to use an electric stove. A microwave is far more different from either than they are from each other.
        The art of cooking with external heat has been practiced for three hundred thousand to three million years; they’re not sure how long. Microwaves are less than a century old; at least, our manipulation of them is. We first used microwave radio frequencies for radar in World War Two for detection of enemy aircraft.
        Then in 1945 a man named Percy Spencer, who worked for Raytheon on their military radar, noticed a candy bar in his pocket melting. He used this discovery to invent the microwave oven, and the first thing cooked was popcorn. The second thing was an egg, which, as Wikipedia says, “exploded in the face of one of the experimenters.”
        There are a lot of people still alive for whom there was never such a thing as a microwave oven in their youth. Compared to the campfire, or even the stove, they’re brand new. This explains why there is so much misinformation about microwave cooking; it’s too new.
        So I’m going to start with breakfast. Eggs, because there are so many misconceptions about microwaving eggs; actually, microwaving in general. But we’ll start with eggs.
        It has been written, incorrectly, that if you cook an egg in the microwave it will explode and possibly start a fire. You can see from the early history of the microwave why this was believed.
        It is incorrect. Well, a little incorrect.
        Some think that you have to poke the yolk with a pin or it will explode. Also false. However, piercing the yolk will help your egg cook evenly; otherwise the whites will cook faster, since the whites have more water, and it won’t turn out as well. I pierce it with a piece of the shell, if the yolk doesn’t break when it’s dropped in the bowl.
        With eggs or anything else, especially meat, you will sometimes hear little explosions. They’re harmless. At the worst you may have to wipe up a little spilled food from the oven.
        My ex-wife used to boil eggs in the microwave. That is, until one day a quarter of a century ago when she and my youngest were in the back yard hanging clothes after turning the microwave on, and the oldest and I were in the living room. She was watching television and I was reading, and there was a very loud BOOM that sounded like it was a big explosion in the back yard. My daughter and I both ran to the back door looking out of its window, and they were out there hanging clothes like nothing had happened.
        Then I heard an intermittent buzzing behind me. The microwave door was open and smoking, and sparks were shooting out. I hurriedly yanked the plug.
        What had happened was that the water had boiled dry and the eggs, in their shells, exploded. That was the last time she boiled eggs in the microwave! Also the last time that particular microwave ever worked again, it was ruined.
        Left out of water and microwaved in its shell, an egg will indeed explode. If you’re going to boil eggs, do it on the stove! However, you can still cook eggs in the microwave.
        Now, when cooking anything in the microwave, the first “secret” of microwave cooking: the food’s not done when the microwave beeps. Microwaves cook from the inside out by exciting water molecules. When the oven signals it’s done, that only means that the transmitter has ceased transmitting radio signals and the turntable has stopped turning. It needs another minute for the heat to radiate. Note that this is only when cooking, not reheating.
        Illustrating this: if you get a bowl of pre-made frozen chili like the Converse Street Bar sells, the instructions will say to cook the frozen bowl for three minutes, let it sit for a minute and a half, then cook it for another two minutes. This is because the grease floated to the top of the bowl before it was frozen, and while the liquid is boiling, the grease is still frozen, since there is no water in grease. The heat rising from the chili melts the grease as it sits. A better way is to microwave it for two minutes, remove the lid, break up the grease and submerge it in the chili, nuke it for another half minute to a minute and a half, let it sit, then remove it from the microwave and take off the lid; it will be too hot to eat, but not as bad as with the package directions. It illustrates how the microwave cooks: water in the food produces the heat that cooks the rest of it.
        My old microwave, the one I bought to replace the one that exploded in the ’90s, took a lot less time to cook; it was 1000 watts, the new one is 750. Energy Star; the oven is less powerful but uses the same amount of electricity to cook!
        Your tax dollars at work. Remember, government employees are mostly no smarter or knowledgeable than you, don’t use their brains any more than they have to, like you, and get paid less than you do if you’re working the same job in the private sector.
        My mother, who considered the microwave oven to be the greatest invention of the 20th century, once wondered aloud if you could cook bacon in the microwave. “Yes, you can!” I told her. “In fact, some packages of bacon give you cooking instructions for frying, baking, and microwaving.” I’m pretty sure that my mom never read bacon packages, as she had been cooking bacon almost since God invented pigs. It’s about the only kind of meat that comes out okay, at least that I’ve seen.
        The time will vary from a minute to five, depending on your oven’s wattage and how crisp you like your bacon. As I wear false teeth, I no longer eat crisp bacon; it gets under the dentures and is painful. The look and texture of the bacon may not be what you’re used to, but the taste will be the same.
        To cook bacon in the microwave, place it on a plate and cover it with a paper towel, because it will splatter like it’s fried in a pan. Even cooked with an egg, it will splatter.
        The bacon package directions, when available, say to place the bacon on paper towels to soak the grease, but I never do. When it’s done I pour the grease in a jar for later use, like my grandparents and your great grandparents did.
        A really quick breakfast is to lay a piece of bacon in a shallow bowl, crack an egg in it, and stick it in the microwave for two and a half to three minutes. That’s with a 700 watt oven, a higher wattage will use less time; it was two minutes with the old, more powerful oven. You will have to experiment, since power isn’t standard with these devices. That’s why your prepackaged microwave items have instructions like “6 to 7½ minutes”. Don’t forget to let it set for a minute or so before removing it. It will still be too hot to eat and needs to cool a minute. Sssh, it’s a “secret”!
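        If you want a starting point instead of blind experiment, here's my own rule of thumb, not anything from a package: the oven delivers energy at roughly its rated power, so scale the time inversely with wattage and fine-tune from there:

```python
# Rule-of-thumb conversion of a cook time between ovens of different wattage.
# Assumes delivered energy ~ power x time; it ignores standing time and how
# the particular food absorbs heat, so treat the result as a first guess.
def adjusted_time_seconds(reference_seconds, reference_watts, your_watts):
    return reference_seconds * reference_watts / your_watts

# 2 1/2 minutes (150 s) in a 700 W oven...
print(adjusted_time_seconds(150, 700, 1000))  # 105.0 s in a 1000 W oven
print(adjusted_time_seconds(150, 700, 750))   # 140.0 s in a 750 W oven
```

And remember the "secret" above: the standing minute afterward stays the same no matter the wattage.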
        It’s easier, quicker, and far cheaper than one of those Jimmy Dean breakfast bowls; two eggs and a slice of bacon are less than a dollar, but a pre-made breakfast bowl is three or four bucks and you have to pierce the plastic wrap, cook it for two minutes, stir it, and cook it for another minute. Or crack a couple eggs into a bowl, add a slice of bacon, and microwave.
        If you don’t like the idea of the egg and bacon mixed, you’ll have one more thing to clean. Throw the bacon in the bowl, cover it with a paper towel, and cook it. Then remove the bacon and put it on a plate, then crack the egg into the bowl with the bacon grease.
        The egg won’t stick badly to the bowl, but if you’re just cooking an egg or two by themselves, you should add butter, grease, or oil, as I found out when I bought my first anti-stick skillet and tried it out with an egg. It fried okay, but was the blandest egg I’ve ever eaten. The microwave is the same, you need a blandness remover. This is one reason why some think microwaved food isn’t tasty. If you’ve cooked something on the stove with something like oil to keep it from sticking, the oil is part of the taste. If you use oil in a pan, use it in the microwave, too.
          You can scramble it first, or just crack it into the bowl. You may be able to make one sunny side up, I haven’t really tried; the egg in the illustration had a solid yolk. It’s likely it will be as difficult as cooking chicken in a microwave and far more trouble than frying it. The yolk cooks only slightly slower than the white, so you would have to separate the yolk and set it aside, cook the white until it started to congeal, then add the yolk back. Way too much trouble when a sunny side up egg is so easy to cook on a stove.
        Often I just put it in the bowl and forget to pierce the yolk. Sometimes the yolk breaks and sometimes it doesn’t.
        Another secret is anything you cook in your kitchen, no matter how you cook it, will taste better than something pre-cooked and frozen, like those Jimmy Dean breakfast things or TV Dinners. A homemade chicken pot pie will taste far better than one from a food factory, if you’re any good at cooking at all. Even home-made potato chips are better than corporate potato chips.
        As most everyone has discovered by now, a microwave will make stale bread soft. You can’t bake in a microwave; the “oven” moniker is very misleading. You can heat a pot pie in the microwave, but you can’t bake one. You can make a Shepherd’s Pie in the microwave, since it has no crust. I just buy them from D’Arcy’s Pint; I don’t really like to cook.
        I like to make omelettes, and they’re especially good in the microwave. I make a Denver omelette; a Denver omelette has egg, meat (usually ham), cheese, green pepper, onion, and tomato. That’s a Western Omelette with added tomato. Sometimes I add hash browns and corned beef and call it a Western Irish Omelette. I usually lay the cheese on top. A pat of butter in the bowl makes it better.
        If you’re making a Denver omelette in the microwave, it will need to cook longer to evaporate the water in the tomato. Also, as might be expected, two eggs take longer than one egg.
        To make a Dr. Seuss omelette, add a drop of blue food coloring to the scrambled egg and microwave it with ham.
        I had an astronaut omelette this morning, a cheese steak omelette with a little steak I had left over from yesterday. I seldom made omelettes before I found out how good eggs were from the microwave, if cooked properly, because it’s a lot more work on the stove.
        I bought a food processor to chop all the stuff up, and discovered that shredded potatoes turn black overnight in the refrigerator, obviously oxidizing. I doubt they’re bad for you, but I’m not eating black hash browns! So I’ll give the food processor to a daughter. I’ve since bought a small handle-operated cheese grater to shred the vegetables for my omelettes, and started buying the smallest potatoes I could find. I also found that shredded potatoes keep well in the freezer, but start darkening as soon as they start thawing.
        I’ve bought pre-shredded frozen hash browns, and they kept in the refrigerator for weeks, so they must have added BHT (butylated hydroxytoluene) to keep them from oxidizing. As the Food and Drug Administration has limits on how much BHT you can add to pre-processed food, it’s probably not very good for you. Actually, any pre-processed food isn’t much good for you.
        My sister and her husband won’t use their microwave for anything but heating a cold cup of coffee because “I heard that the microwaves change the chemistry of the food.” It’s true, but the change in chemistry is from the heat, not from the microwaves themselves. The chemistry of the food changes exactly the same in a convection oven, microwave, or a pan on the stove. The differences are in moisture, especially the microwave because of how a microwave produces heat.
        Now, when people hear the word “radiation” they think of radioactivity and call microwaving “nuking”. But your gasoline vehicle has a radiator, and houses with steam heat used to have radiators; heat was radiated from them. In a microwave, the radiation is simple radio waves, like the radio in your car. The only difference is the frequency; the same difference as the difference between two radio stations, and enclosing those radio waves in a steel box.
        About “frequency”: AC stands for alternating current; DC is direct current that travels in one direction, while AC switches directions, the frequency being the speed at which it changes. American wall current is 60 Hz (Hertz, named after Heinrich Hertz, who proved that Maxwell’s “electromagnetic waves” were real), meaning it changes direction, or “polarity”, sixty times a second. European electricity is 50 Hz. FM radio sits in the middle of the television frequencies, 88 MHz (megahertz, eighty-eight million cycles per second) to 108 MHz. Microwave ovens are 2.45 GHz; GHz is gigahertz, billions of cycles per second, so 2.45 GHz is 2,450,000,000 Hz, or two billion four hundred fifty million cycles per second.
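        The prefixes are just powers of a thousand, which a few lines of Python make concrete (the helper function is mine, invented for illustration):

```python
# Convert a value with an SI frequency prefix to plain hertz.
def to_hz(value, unit):
    factors = {"Hz": 1, "kHz": 1_000, "MHz": 1_000_000, "GHz": 1_000_000_000}
    return value * factors[unit]

print(to_hz(60, "Hz"))     # wall current: 60
print(to_hz(108, "MHz"))   # top of the FM band: 108000000
print(to_hz(2.45, "GHz"))  # microwave oven: 2450000000.0
```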
        It’s roughly the same frequency as the telephone in your pocket, which is why people have ignorant, superstitious fears that cell phones cause brain cancer. If these fears had actually been warranted, brain and groin cancer rates would have spiked in the quarter century since cell phones became common. They haven’t.
        But “radiation causes cancer!” Again, the word “radiation” can mean different things depending on what is radiating and how it radiates. A radio, like your phone or a TV broadcasting station, radiates electromagnetic energy; it’s exactly like waving a magnet around. In fact, if you take a bar magnet, drill a hole in the middle and stick a stick loosely in the hole, then turn on an induction cooker and hold the magnet just above the burner, it will spin in step with the alternating field. You will have built a crude electric motor.
        Light is electromagnetic radiation, just like the low frequency magnetic radiation an induction cooker uses, or the colors you can see, or the signals beaming to your TV set and car radio and telephone, or your microwave oven. All are light. We just can’t see those colors with our eyes.
        Radioactivity is also light, but unlike the colors you can see, or the colors a microwave oven or telephone transmits, the photons that make up gamma rays and X-rays contain enormous amounts of energy. Comparing microwave frequencies to gamma ray frequencies is like comparing a candle flame to the sun, each at a distance of a thousand miles. That may be a bit of an exaggeration, but you get the point.
        Your phone uses microwaves, those extremely high frequencies, because with digital signals, the higher the frequency the greater the bandwidth; meaning the more phones the towers can connect to. Your microwave oven uses those higher frequencies because it’s the frequency that excites water molecules. The reason you shouldn’t put metal in a microwave is because metal is opaque to microwave frequencies of light, so it will reflect back to the transmitter and ruin it, like a high powered laser pointed straight at a mirror.
        A gas stove produces its heat by burning natural gas, and a traditional electric oven produces heat by passing electricity through an electrically resistive coil.
        There is a new type of electric stove, the induction cooker. It heats a steel pan with a low frequency electromagnetic wave; rather than microwaves, it uses much longer waves, tens of kilohertz rather than gigahertz. These won’t work with anything but a steel or iron pot or pan; like a microwave heats the food directly and needs water to work, the induction cooker heats the pan or pot directly and needs a ferromagnetic material like iron or steel to work. Microwave frequencies react with water (water is burned hydrogen, the way rust is burned iron), while those much longer waves react with iron. However, laying a steel plate on top of the cooker makes it act like a normal electric burner.
        In either case, the heat is introduced from outside the food, like a campfire and unlike a microwave. The microwave energy is a radio frequency that excites water molecules; the water inside your food heats up, so it’s cooked from the inside out rather than the outside in, but the chemical changes are identical. The difference is how and where the food’s water escapes as vapor, which is why a convection oven can bake a pizza and a microwave can’t (although there are some special pizza boxes that “kinda” do, heating a pre-cooked frozen pizza without making the crust soggy).
Because of this, it's extremely difficult, almost impossible, to cook edible chicken in the microwave, or even to reheat it, because of how the fat is situated in the meat of a bird, unlike mammal meat. I've managed to cook edible chicken breasts in the microwave, but it's so hard to do that you're better off cooking them in the convection oven, or deep frying them. If I want chicken I'll just buy it already fried; frying chicken is hard work and a mess and I'm too old for that shit! I'm not one who loves to cook; I cook for the same reason I worked: to eat.
        Mammal meat doesn’t cook well in a microwave, either; the taste is almost the same, but it comes out very unappetizingly ugly. Here’s a photo. I tested part of a raw T-bone, putting it in the microwave for two minutes. It came out like the photo here, but the taste was almost identical to the piece cooked in a frying pan, except for being more juicy and tender than the part I cooked on the stove. Yes, that’s science. One steak, part cooked on the stove and part in a microwave. Better science would do that hundreds of times and document all of the results.
        Afterwards, I experimented with barbecuing a pork steak, again, with most of it cooked in a pan. I thought perhaps the sauce would disguise the looks.
        I was wrong. It still looked disgusting when it was done. That’s what science is for; testing preconceptions. Some things that seem to be a certain way really aren’t.
        Of course, what was cooked in a pan was nowhere as good as one cooked on a grill. I put a small piece on a plate and microwaved it for three minutes, fearing undercooked pork.
        It was way overcooked; chewy, but didn’t taste any different than the barbecue in the pan. Had I overcooked it that badly in a pan it would have been like burned shoe leather.
However, with the exception of chicken and fish, you can reheat meat in the microwave, again being sure to reheat it, not recook it. The biggest reason mammal meat doesn't cook well in a microwave is that you can't brown it in a microwave.
        However, if the meat is in a dish, like ham and beans, or beef stew, or chili con carne (that’s Spanish for “chili with meat”), maybe a casserole (I never tried making a casserole in the microwave because I don’t much care for casseroles), it cooks fine in the microwave.
        If you’ve fried a steak or a hamburger or a pork chop or such on the stove or grill, and find when you cut it or bite it that it isn’t done, a minute or two in the microwave will finish its cooking without altering its taste or appearance unless you cook it too long.
        That is, unless it’s a fats food burger, those are really nasty reheated. They slap the condiments and tomatoes and other garbage on them to cover the taste of the very low-quality meat. Just give it to your dog, if he will eat it.
        But good quality hamburger you cooked on the stove heats well in a microwave, unlike a fats food burger. No, that wasn’t a typo, “fast food” is; it’s no faster than a sit-down restaurant with wait staff, but it will make you fat.
        One thing I discovered about forty years ago was that if you barbecue pork on a charcoal grill, refrigerate the leftover meat overnight, then re-heat it in the microwave the next day, it tastes twice as good as when it was first cooked! It probably has to do with the water heating the fat, but that’s just a guess.
        Anything that you normally boil will be identical in the microwave. I’ve found that it doesn’t have to boil to cook in water, making your food healthier, since less water will evaporate.
        You should always use filtered water when cooking, either in a pot on the stove or in a microwave, because evaporation will concentrate all of the inorganic poisons, like lead and arsenic. If you drink filtered or bottled water, you should cook with it, too, or you’re wasting your money. If there are 135 parts per million of nastiness coming out of your tap, like the last time I tested Springfield water (my filter pitcher came with a tester), boiled halfway down doubles that to 270 PPM.
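That doubling is just arithmetic: the dissolved solids stay behind while the water leaves. A quick sketch of the calculation (the function name is mine, purely for illustration):

```python
# Illustration of the concentration effect described above: dissolved solids
# (lead, arsenic, etc.) don't evaporate, so the parts-per-million figure
# rises as the water volume shrinks.

def ppm_after_boiling(start_ppm, start_volume, remaining_volume):
    """Concentration after evaporation: solids are unchanged, water isn't."""
    solids = start_ppm * start_volume   # total dissolved "nastiness"
    return solids / remaining_volume    # same solids, less water

# 135 PPM tap water boiled down to half its volume:
print(ppm_after_boiling(135, 1.0, 0.5))  # 270.0
```

Boil it down to a quarter of the volume and you'd be at 540 PPM, which is why the longer something simmers in tap water, the worse the tap water matters.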
        I have yet to find any vegetable that doesn’t come out of the microwave tasting delicious, as long as it’s cooked well, which in most cases is just heating long enough. But some require extra for the best taste. I mentioned bacon grease earlier; when I was growing up, whenever my mom made green beans she cooked them in a pot (there were no home microwave ovens back then) with water and bacon. I understand that’s how most Americans except Jews and Muslims cook green beans.
        In a microwave, I’ve used the bacon itself, but the way I cook dinner makes it better to just put the frozen beans and filtered water in a tall ten ounce cup, and add a little bacon grease.
The tall cups allow me to heat three vegetables at the same time. Maybe I should have subtitled this "Cooking for One or Two". Before, I used bowls, and was thinking about buying two more microwaves, but don't have the room in my little kitchen. If you're cooking for one with the microwave, you'll need to pre-cook most vegetables until they're the desired softness. In the microwave, of course.
        The tall cups also, unfortunately, boil water a lot faster than in a bowl, and as soon as it boils, it boils over. The obvious answer is to not leave it in long enough for it to boil. Test it first with just tap water so you will know the maximum time it takes your microwave to boil water in a tall cup. With the cups I have and the microwave I use and the liquid at room temperature, a minute and a half is the maximum without making a mess. Veggies straight from the freezer can take two minutes; you can test this with cold water and an ice cube. When it beeps, start it again until it’s done.
Now, another misconception about microwave cooking is that you should only reheat food in a microwave once or it will somehow become poisonous, the old "microwaves change the chemistry" nonsense. If you overcook anything, whether in a microwave, on the stove, or in an oven, it will taste nasty. It won't be poison, but I can see how you might think that, although if you use tap water there's a tiny grain of truth in it. Again, water from which some has evaporated is in fact more poisonous than before heating. Whether cooking on a stove or in a microwave, you should use the purest water you can.
The biggest microwave cooking "secret" is getting the food just hot enough. A lot of people have a really bad habit of overcooking in the microwave, probably because it's so much faster than other cooking methods. I once saw someone put a TV dinner in a microwave for ten minutes, then complain about how bad it tasted. I looked at the box—the instructions were to cook it for two minutes, turn the meat over and stir the potatoes, then cook it for another minute to a minute and a half. No wonder it sucked! But back before microwaves they came in an aluminum tin and took half an hour to forty five minutes to cook in a conventional oven.
        Remember that if it’s not done enough you can put it right back in the microwave, but uncooking a thing is physically impossible.
        It’s been said that “everything’s better with butter.” That’s simply not true. I tried cooking buttered broccoli and cauliflower and it was awful! You would think that butter beans would be better with butter. Ironically, buttered butter beans are disgusting.
        Broccoli and cauliflower are best just cooked in water until they’re soft, or eaten raw, the most nutritious way to eat them, although cooking will kill bacteria. Of course, there are probably some very delicious recipes with those vegetables. Butter beans and Lima beans are also best just cooked in water. Peas are very good with minced onion.
Corn, carrots, and potatoes are all better with butter, and the more butter the better. If you buy or grow fresh carrots you'll be amazed at how much stronger the taste is than canned or frozen. I've started to buy fresh vegetables when they're available, and freeze them myself. They're cheaper, more nutritious, and taste better; they must boil the hell out of carrots, peas, and green beans before they freeze them to get rid of the taste and nutrition. I've never liked canned peas, but love them fresh or frozen.
        A good rule of thumb is whatever you put with any given vegetable when cooked on the stove will work with a microwave, like butter with corn, or bacon with green beans. Often it will taste better than on a stove top. Again, that includes anything you put in the pan to resist sticking.
The most rational way to cook is to use the method that produces the best taste and nutrition, is the easiest to prepare, and costs the least. Of course, with any kind of food, there will be trade-offs between those three variables.
Cooking with microwaves has many advantages for things like eggs, vegetables, and soups. Actually, for anything except meat, and even meat cooks fine as long as it's in a recipe, like chili or beef stew, as mentioned before.
        Microwaved vegetables are more nutritious than vegetables cooked on a stove, because on a stove, most of the vitamins and minerals are poured down the drain. As it takes longer to cook on the stove, there will be more poison from the evaporation on a stove, compared to the microwave. On cold winter days when the air is dry I put a big pot of tap water on the stove to boil. It keeps my lips and sinuses from cracking, and the flu virus can’t live in higher than forty percent humidity. You should see what’s left in the pan when the water’s boiled out!
        A gas stove may be cheaper to cook on than a microwave; that would have to do with what you’re paying for each form of energy. But in the summer, that gas stove will run your electric bill up from making your air conditioner work harder; that’s the appliance that uses the most electricity in your house. At least in the summer and possibly all year, the microwave costs less to run.
        An electric stove will cost a whole lot more to cook on than a microwave, whether a traditional resistance stove or a new induction cooker. Each separate burner in a resistance stove uses up to 2400 watts, an induction burner uses up to 1800 watts. Your microwave maxes out at 750 to 1000 watts, and it runs for less than a fourth of the time it takes to cook on the stove.
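Using the wattages above, you can ballpark the cost per dish yourself. The cooking times and electricity price in this sketch are my assumptions, not figures from any utility:

```python
# Rough per-dish energy cost comparison using the wattages in the text.
# The 20-minute stove time, 5-minute microwave time, and $0.15/kWh rate
# are assumptions for illustration only.

RATE = 0.15  # assumed electricity price, dollars per kilowatt-hour

def cost(watts, minutes, rate=RATE):
    """Dollar cost of running an appliance at a given wattage for a time."""
    kwh = watts / 1000 * minutes / 60
    return kwh * rate

stove_cost = cost(2400, 20)     # resistance burner for 20 minutes (assumed)
microwave_cost = cost(1000, 5)  # microwave at full power, a quarter the time
print(stove_cost, microwave_cost)  # stove about 12 cents, microwave about a penny
```

Under those assumptions the burner costs roughly ten times what the microwave does for the same dish, which is the author's point in different units.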
        The microwave is a lot less work than the stove, especially cleaning up the mess. In the microwave, there are no pots, pans, or skillets, you serve the food from the container it was cooked in. Except the vegetables; you will need a slotted spoon to dish them out. There are no tongs or spatulas to wash because when food is cooked from the inside rather than the bottom, you don’t need to turn it.
Now, where do all the myths come from? Mostly, as I said, from misunderstanding something that's been heard. But also from those dishonest rich people who stand to make more or lose less money because of the myths: gas and electric companies. If you cook your vegetables on a gas stove, the gas company gets paid, but not if you cook them in a microwave. If you have an electric stove, the power company prefers you use the stove because it takes a lot more electricity than a microwave, so they make more money from you.
        It’s better to throw away the myths and keep your money.