I finally got around to installing Linux Mint (Cinnamon) dual-boot on the Linux box that still has Kubuntu yesterday.
The OS seems every bit as good as Mandrake used to be; I've missed that distro. It's really slow compared to Kubuntu, but Kubuntu is incredibly limited and hard to use. Plus, it looks clunky compared to Mint, although I really don't care what my tools look like as long as they work well. Kubuntu's don't. I hate Kate, the worst text editor I've ever used. Well, Debug might have been worse...
I installed it after discovering, while running it from the live CD, that connecting to the network would be a breeze once I had the correct IP. During installation, I wrote the root password down in Notepad on the new Dell notebook and saved it to a thumb drive.
So today I fired it up after typing maybe half a page of the new novel (I hit a wall, must need a break or something; it's about a quarter done already). Armed with the network drive's IP address, I tried to connect, and it wanted the drive's password. The drive has no password, so I left the field blank, and it told me the password was wrong.
Well, I thought, I'll see if I can install GIMP and Audacity and tackle the network drive later.
Who says you can't teach an old dog new tricks? I'm 71 next month, and I found the installation app, and unlike with Kubuntu I actually got it to work. It had both programs proudly highlighted, so when I went to install Audacity, the Linux computer's primary app, it of course wanted the root password.
So I got the thumb drive it was stored on, plugged it in, and copied it and pasted it into the password field. I had constructed a jumbled password that almost demanded copy/paste.
Apparently I typed it into Notepad wrong. So, first, is it possible to retrieve a root password in Mint? If so, how?
If not, I'll reinstall Mint with a password that's impossible to screw up, then change it to something less stupid when I can build a crazy hard password that I can simply paste in.
I'll have to go in the basement after the Unix book to remember how to change the root password. I hate going down there because of the spiderwebs. There's probably a tool in the graphic desktop these days, I'll bet. Maybe I won't have to go down there until I need to change the furnace filter...
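In case it helps anyone else in the same boat, here's a sketch of the no-reinstall route, assuming a standard Mint install with the GRUB boot menu (the username dave is made up; substitute your own):

```shell
# From the GRUB menu: Advanced options -> recovery mode -> root shell.
# The filesystem is usually mounted read-only there, so remount it:
mount -o remount,rw /

# Reset the password for your normal login (replace 'dave' with yours):
passwd dave

# Note: Mint, like Ubuntu, normally has no separate root password at
# all; the password the installer asked for is your user's, and sudo
# uses that one. If you really want a distinct root password:
passwd root

reboot
```

Passwords can't be retrieved (they're stored hashed), but they can always be reset this way if you have physical access to the machine.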
No, I haven't emigrated yet, but I've used this weekend to book a cheap hotel in St. Catharines and zip around southern Ontario on their incredibly awesome GO Transit system. I saw some of Burlington, a little of Mississauga at the bus terminal, *all* of Hamilton along both King and Main streets--twice!--and got to visit Thorold today on the St. Catharines local lines.
This entire place, so far, has been a revelation. It feels completely different from everywhere I've been in the US, and in almost every single way, it feels better. A lot better. The one downside is that money doesn't go quite as far for most things here, though with the inflation issues in the US, everyday food actually costs *less* here, which pleasantly surprised me. This is despite the higher price tags: the raw numbers are vertiginous, but multiply by ~0.75 to get the price in USD. Eating out is another story. I just paid $17 for my one and only meal out, an admittedly gigantic mission-style burrito. Then again, that's about USD$12.50, so still not awful. A similar order at a Chipotle in Buffalo comes to well over $12, and I suspect it's a smaller burrito, so yeah.
Then there's the Bulk Barn. Steel cut oats, my go-to, are $3.50 (remember, ~$2.80 USD) for *a kilogram.* I can get an entire month plus of oats for way, way less than the cost of that burrito above. Parboiled rice is $5.90 (~$4.50 USD) a kilogram. Hard whole wheat flour is just over $5/kg. Both green and red gram (lentils) are $4/kg. Even veggies are cheaper here! Carrots appear to be about $2.25/kg for example.
What I notice most is that "working poor" goes a lot further here than in the US. Even the notionally low-end grocery stores like Food Basics look like Christmas came early to me. The selection is incredibly huge, there are plain-label generics -- often literally called "no name" -- that are just as good, and there's a much healthier spread overall. Zehr's, which is apparently only considered mid-end, looks like something out of the sort of dream I might have after 2 weeks on a starvation diet.
The transit, as mentioned, is amazingly good: buses come every hour at worst, usually every half-hour, and they all seem to be coordinating with one another so that waiting for transfers doesn't take very long at all. The fares are expensive, $3 here and $3.25 in Hamilton (but remember, $2.25 and ~$2.50 in US money!), but they have 10-ride cards, day passes, monthly passes, and transfers good for 2 hours at minimum for unlimited rides. St. Catharines is not a large city, so that effectively $2.25 fare would let me get anywhere and possibly back.
This place is also amazingly clean. There are segmented trash/recycle bins everywhere, and people actually use them properly. There's very little plastic waste, almost all the paper packaging is recycled kraft paper, and most places straight up don't have plastic bags, or even bags at all. This is definitely a BYOB nation and I am all for it.
Somewhat surprisingly, if I take the worst-case scenario of making CAD$16/hr ($32,000/yr gross) up here as a pharmacy assistant, the position that most closely matches the US definition of "pharmacy tech," my tax burden is slightly *lower* than it would be in the US! And I'd get actual services, like healthcare, for that tax, so the real number for take-home pay is going to be around $1,500 higher per year considering that I won't need health insurance any longer. Apparently, while certain powerful people are trying their damnedest to turn Canada into "USA 2.0 with a side of Timbits," enough people are awake (for now...) to stave it off for at least a while.
The major problem is rent. Apparently, just 5 years ago, a decent 1 bedroom would have been under $1000, all-inclusive. Now? Good luck getting a bachelor(ette) for that, *without* heat and hydro thrown in. For this reason I am very likely going to attempt to land in, in this order, Thorold, Niagara Falls, and Fort Erie rather than any of the larger cities, though Hamilton still has some decent pricing in areas that Canadians consider bad neighborhoods...which, of course, I consider a vacation spot. There definitely are some parts of Hamilton even I wouldn't rent in, but very few, and they're all in the small gap between about 400-800 Main Street East and its nearby E/W corridors. Rent definitely does worry me, as apparently it's almost doubled over the last 10 years all over Ontario, but my girlfriend and I live simply and don't spend much other than the basics.
So what's next? Well, turns out the PTCB credential doesn't transfer across borders, but the good news is that 1) there are PA courses, 2) they are available online and I can take them in the US, 3) they are not very long at all -- 34 weeks but "self-directed," and 4) the whole shebang will cost me less than $4,000 US. Unfortunately the next set of these at George Brown doesn't start matriculation until July, though I haven't looked into McMaster or Niagara College yet, so it will likely be a while before I get the cert. Then, of course, I'd need a work permit and to find housing. And then, it goes without saying, I'd want to become a legal permanent resident ASAP, and hopefully a citizen once the 5-year period is over.
Wish me luck. I may be too late, or I may not have enough money, but I'm damn sure going to try.
I wrote recently asking for advice about how to help a newbie tactfully.
I have been very surprised since then about some of the things our newbie has asked me and what I have had to explain to him. Is it a "young people today" thing, or am I completely out of touch?
The first thing came up when code reviewing this young guy's C. As you probably know, when you are working in a programming team there are certain things you have to agree on, such as the format of the code. It may sound trivial, but if different people format the code differently, it makes tools like diff harder to use, and it makes it harder to spot structural and functional changes in the code. We use spaces for indentation. Tabs were creeping in!
More seriously, though, I found out that he didn't know about git diff. I found out from someone else that he didn't know about branching in git. I would have thought a trivial search online would have given some clues?
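For reference, the two things in question take about five commands to demonstrate; a throwaway sketch (repo contents and names are made up):

```shell
# Build a throwaway repo in a temp directory (all names illustrative)
cd "$(mktemp -d)"
git init -q
git config user.email "demo@example.com"
git config user.name "Demo"

printf 'int x = 1;\n' > main.c
git add main.c
git commit -qm "first commit"

git checkout -b experiment    # create a branch and switch to it
printf 'int y = 2;\n' >> main.c

git diff                      # uncommitted changes vs the last commit
git branch                    # lists branches; '*' marks the current one
```

Ten minutes of playing with a scratch repo like this teaches more than any amount of being told.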
Function prototypes are there for a reason. They define the public interface to functions in a source module. That's why they go in a separate header, and if your program wants to call the functions in the other module, you #include that header. I found redefined function prototypes in the program source (not the module where the functions were implemented) and they had the wrong argument lists for the functions! Why?
The other strange thing was I had to explain the difference between ASCII and binary. I had to explain, with an ASCII chart (which he didn't understand), that NUL has code 0. I also had to explain why editing binary data in a text editor was not a great idea.
Also, you don't need to terminate a binary array with '\0'. Certainly not outside the array bounds.
I've tried to keep it polite, professional and helpful, but the other day he informed me that he is only used to getting one or two code review comments and they're only about "the functionality."
Kids today? He has nearly 10 years of experience. Am I too harsh? It's a lot of code, so it's got a steep learning curve, and it's for an important job.
Memory safe languages are coming back into fashion nowadays.
Back into fashion? They're not new?
That raises the question as to why the world's software is built on memory unsafe languages such as C and C++.
In a word, efficiency. If we look back in history a bit to where high level languages were becoming common, we can see why.
My experience with programming begins in the 1980s, so I have no direct experience of how things were prior to that, but I did start out on 8-bit microcomputers with BASIC interpreters.
To give you some context, these machines had CISC CPUs that ran at a few MHz (1 to 4 usually) and took several clock cycles to complete a single machine instruction. The primitive BASIC interpreter lived in a few kB of ROM and was written in machine code.
Those BASIC interpreters were slow, but they were fairly user-friendly. When you declared or even just used a new variable (integer, floating-point or string) for the first time it would be initialised. When you concatenated strings, it happened as if by magic. You could insert and delete substrings. You could take input from the user in strings of arbitrary length that would adjust dynamically. There was no such thing as overflow. Array bounds were checked. The program would stop with an error code if you tried to read or write out of bounds.
In previous decades, compiled languages were developed to run on mainframes and minicomputers. A typical 1970s-vintage minicomputer was about as big (RAM) and powerful (speed) as a 1980s home microcomputer. Compilers were complex, notoriously arcane, and generally not to be trusted; most were buggy, apart from some very specialised and expensive cases.
Along came languages like C and Pascal. Pascal was nice but considered a "toy"; it needed extensions to be useful. C, which appeared in the early 1970s, was simple and powerful. A C compiler could run on a relatively small machine (a few tens of kB of RAM) and produce usefully efficient machine code, fast enough that writing in plain machine code or assembly language wasn't really worth it any more. There was also the advantage of portability: the compiler could target different architectures, so one piece of source code could compile and run on very different hardware.
This simplicity and efficiency came at a price. The language and the compiler did not support such things as array bounds checking, variables were not automatically initialised and strings were effectively static, not dynamic. Library routines were provided to supply some of that functionality.
The run-time checking (and safety) was eliminated in favour of the programmer being all knowing and all seeing and infallible.
With hindsight, better choices might have been made but remember that the machines of the day were so small and slow that it was the only way to get efficient enough software out the door in a reasonable timeframe.
Most computers were not networked. The Internet wasn't even a thing. Security wasn't so important. A lot of the shortcomings in the languages and their libraries weren't that much of an issue.
Other things did come along but the "good enough" solutions had become entrenched. Ada was one such alternative. It didn't catch on because compilers were expensive (to call a compiler Ada it had to go through an expensive validation process) and there was a bit of a reaction against "strict" languages (for all the wrong reasons).
Programming languages are there to express mathematical concepts. They need to be precise and unambiguous. The human brain is neither. Experience tells us that we should choose languages and compilers that help us reduce ambiguity and express with greater precision that which we wish to achieve.
There is no perfect programming language. It always used to be that you would use your programming language to build up a framework of "subroutines" to help you solve your problem. FORTH was great fun for doing this. LISP looks even better, and is indeed the essence of programming, it seems.
Back to my original point: memory safety got sidelined for efficiency. Efficiency was important because forty years ago hardware was so slow and primitive. I haven't been on a slow computer in over ten years. I have computers from over ten years ago that aren't slow.
Efficiency, in compilers or in the runtime, is not a problem any more. Why are we not programming with memory safe languages as a matter of course? What wheels is Rust reinventing?
We hired a new person at work at a fairly junior level. He's about 20 years younger than me and worked for one of those body shop companies doing embedded development. We interviewed several, but this guy had a bit of a spark, was genuinely interested in what he was doing and despite being very junior, recognised this and is keen to learn.
We got a new PHB some months ago who is not an embedded developer, to free me up from PHB duties to do Other Things. Since then I have been doing all sorts of Other Things and very few of them in my job description.
I was asked to write some C code as a bit of an emergency (the company doesn't believe in proper analysis, requirements and communication) which was relatively straight-forward for me (I can churn out a few thousand lines of code in a week if I have to).
Other projects needed my assistance urgently too so the PHB gave my code to the new guy to finish off. There are "just a few bits" that need doing but they involved having a sufficient level of understanding and a measured approach to implementation to ensure that they don't end up full of terrible bugs. I wrote my code with (to me at least) nice clear, simple unit tests in a TDD fashion and with some scripted regression tests, all run automatically.
The poor young new guy has just had the entire company induction reading to go through and a bunch of training courses, is in a new industry sector with all that implies and has been given 3k lines of code written by me "to finish off" with, at least to begin with, a ludicrous deadline.
Fortunately, after putting thumbscrews on other PHBs and enlightening them as to various common existing command-line utilities, the deadline has vanished, which is good.
However, I'm doing the hand-holding and advising. When I was similarly young I was fortunate to be in a great team who were very helpful most of the time and quite patient. I want to be like that.
Here's my problem. This guy has worked for a very large company doing a very specific job. All of his C coding has been done in a proprietary IDE using a proprietary compiler. Now he is faced with git, gcc and GNU Make (and watching me code using vim). He's trying very hard to get stuff done but doesn't know what he doesn't know. I think this is the first time he has seen TDD. He has put all sorts of debugging code in the main source file that really isn't necessary if you understand the unit tests.
I would like to say, "Go away and read the fine manual." However, I want to do it politely without sounding too condescending or critical. I need to introduce the gcc command line options for compiling and linking. I could just give him a URL to the gcc manual, but that would be pretty tactless.
Then of course, I'm very busy with all the other nonsense going on at work so my frame of mind isn't always the right one when he asks for help and it's difficult to change modes.
Most of this knowledge is from experience; forty years of trial and error. That’s about as long as anyone has been cooking with a microwave; commercial kitchens have never used them for anything but reheating.
Some, like restaurant owners, think that microwaves are good for reheating but not for cooking, which is completely understandable. You learned to cook on a stove, have years or decades of experience cooking on the stove, but never learned to cook in a microwave, or practiced those skills. Different technologies require different methods; ask someone who learned to cook on a gas stove, then was forced to use an electric stove. A microwave is far more different from either than they are from each other.
The art of cooking with external heat has been practiced for three hundred thousand to three million years; they’re not sure how long. Microwaves are less than a century old; at least, our manipulation of them is. We first used microwave radio frequencies for radar in World War Two for detection of enemy aircraft.
Then in 1945 a man named Percy Spencer, who worked for Raytheon on their military radar, noticed a candy bar in his pocket melting. He used this discovery to invent the microwave oven, and the first thing cooked was popcorn. The second thing was an egg, which, as Wikipedia says, “exploded in the face of one of the experimenters.”
There are a lot of people still alive for whom there was never such a thing as a microwave oven in their youth. Compared to the campfire, or even the stove, they’re brand new. This explains why there is so much misinformation about microwave cooking; it’s too new.
So I’m going to start with breakfast. Eggs, because there are so many misconceptions about microwaving eggs; actually, microwaving in general. But we’ll start with eggs.
It has been written, incorrectly, that if you cook an egg in the microwave it will explode and possibly start a fire. You can see from the early history of the microwave why this was believed.
It is incorrect. Well, a little incorrect.
Some think that you have to poke the yolk with a pin or it will explode. Also false. However, poking the yolk with a pin will help your egg cook evenly, as otherwise the whites will cook faster, since the whites have more water, and it won’t turn out as good. I pierce it with a bit of the shell, if it doesn’t break when it’s dropped in the bowl.
With eggs or anything else, especially meat, you will sometimes hear little explosions. They’re harmless. At the worst you may have to wipe up a little spilled food from the oven.
My ex-wife used to boil eggs in the microwave. That is, until one day a quarter of a century ago when she and my youngest were in the back yard hanging clothes after turning the microwave on, and the oldest and I were in the living room. She was watching television and I was reading, and there was a very loud BOOM that sounded like it was a big explosion in the back yard. My daughter and I both ran to the back door looking out of its window, and they were out there hanging clothes like nothing had happened.
Then I heard an intermittent buzzing behind me. The microwave door was open and smoking, and sparks were shooting out. I hurriedly yanked the plug.
What had happened was that the water had boiled dry and the eggs, in their shells, exploded. That was the last time she boiled eggs in the microwave! It was also the last time that particular microwave ever worked; it was ruined.
Left out of water and microwaved in its shell, an egg will indeed explode. If you’re going to boil eggs, do it on the stove! However, you can still cook eggs in the microwave.
Now, when cooking anything in the microwave, the first “secret” of microwave cooking: the food’s not done when the microwave beeps. Microwaves cook from the inside out by exciting water molecules. When the oven signals it’s done, that only means that the transmitter has stopped transmitting and the turntable has stopped turning. The food needs another minute for the heat to spread. Note that this is only when cooking, not reheating.
Illustrating this: if you get a bowl of pre-made frozen chili like the Converse Street Bar sells, the instructions will say to cook the frozen bowl for three minutes, let it set for a minute and a half, and cook it for another two minutes. This is because the grease will have floated to the top of the bowl before it was frozen, and while the liquid is boiling, the grease is still frozen, since there is no water in grease. The heat rising from the chili melts the grease as it sits. A better way is to microwave it for two minutes, remove the lid, break up the grease and submerge it in the chili, and nuke it for another half minute to a minute and a half. Let it set, then remove it from the microwave and take off the lid; it will be too hot to eat, but not as bad as with the package directions. It illustrates how the microwave cooks: water in the food produces the heat that cooks the rest of it.
My old microwave, the one I bought to replace the one that exploded in the ’90s, took a lot less time to cook; it was 1000 watts, the new one is 750. Energy Star; the oven is less powerful but uses the same amount of electricity to cook!
Your tax dollars at work. Remember, government employees are mostly no smarter or knowledgeable than you, don’t use their brains any more than they have to, like you, and get paid less than you do if you’re working the same job in the private sector.
My mother, who considered the microwave oven to be the greatest invention of the 20th century, once wondered aloud if you could cook bacon in the microwave. “Yes, you can!” I told her. “In fact, some packages of bacon give you cooking instructions for frying, baking, and microwaving.” I’m pretty sure that my mom never read bacon packages, as she had been cooking bacon almost since God invented pigs. It’s about the only kind of meat that comes out okay, at least that I’ve seen.
The time will vary from a minute to five, depending on your oven’s wattage and how crisp you like your bacon. As I wear false teeth, I no longer eat crisp bacon; it gets under the dentures and is painful. The look and texture of the bacon may not be what you’re used to, but the taste will be the same.
To cook bacon in the microwave, place it on a plate and cover it with a paper towel, because it will splatter like it’s fried in a pan. Even cooked with an egg, it will splatter.
The bacon package directions, when available, say to place the bacon on paper towels to soak the grease, but I never do. When it’s done I pour the grease in a jar for later use, like my grandparents and your great grandparents did.
A really quick breakfast is to lay a piece of bacon in a shallow bowl, crack an egg in it, and stick it in the microwave for two and a half to three minutes. That’s with a 700 watt oven, a higher wattage will use less time; it was two minutes with the old, more powerful oven. You will have to experiment, since power isn’t standard with these devices. That’s why your prepackaged microwave items have instructions like “6 to 7½ minutes”. Don’t forget to let it set for a minute or so before removing it. It will still be too hot to eat and needs to cool a minute. Sssh, it’s a “secret”!
It’s easier, quicker, and far cheaper than one of those Jimmy Dean breakfast bowls; two eggs and a slice of bacon are less than a dollar, but a pre-made breakfast bowl is three or four bucks and you have to pierce the plastic wrap, cook it for two minutes, stir it, and cook it for another minute. Or crack a couple eggs into a bowl, add a slice of bacon, and microwave.
If you don’t like the idea of the egg and bacon mixed, you’ll have one more thing to clean. Throw the bacon in the bowl, cover it with a paper towel, and cook it. Then remove the bacon and put it on a plate, then crack the egg into the bowl with the bacon grease.
The egg won’t stick badly to the bowl, but if you’re just cooking an egg or two by themselves, you should add butter, grease, or oil, as I found out when I bought my first non-stick skillet and tried it out with an egg. It fried okay, but it was the blandest egg I’ve ever eaten. The microwave is the same; you need a blandness remover. This is one reason why some think microwaved food isn’t tasty. If you’ve cooked something on the stove with something like oil to keep it from sticking, the oil is part of the taste. If you use oil in a pan, use it in the microwave, too.
You can scramble it first, or just crack it into the bowl. You may be able to make one sunny side up, I haven’t really tried; the egg in the illustration had a solid yolk. It’s likely it will be as difficult as cooking chicken in a microwave and far more trouble than frying it. The yolk cooks only slightly slower than the white, so you would have to separate the yolk and set it aside, cook the white until it started to congeal, then add the yolk back. Way too much trouble when a sunny side up egg is so easy to cook on a stove.
Often I just put it in the bowl and forget to pierce the yolk. Sometimes the yolk breaks and sometimes it doesn’t.
Another secret is anything you cook in your kitchen, no matter how you cook it, will taste better than something pre-cooked and frozen, like those Jimmy Dean breakfast things or TV Dinners. A homemade chicken pot pie will taste far better than one from a food factory, if you’re any good at cooking at all. Even home-made potato chips are better than corporate potato chips.
As most everyone has discovered by now, a microwave will make stale bread soft. You can’t bake in a microwave; the “oven” moniker is very misleading. You can heat a pot pie in the microwave, but you can’t bake one. You can make a Shepherd’s Pie in the microwave, since it has no crust. I just buy them from D’Arcy’s Pint; I don’t really like to cook.
I like to make omelettes, and they’re especially good in the microwave. I make a Denver omelette; a Denver omelette has egg, meat (usually ham), cheese, green pepper, onion, and tomato. That’s a Western Omelette with added tomato. Sometimes I add hash browns and corned beef and call it a Western Irish Omelette. I usually lay the cheese on top. A pat of butter in the bowl makes it better.
If you’re making a Denver omelette in the microwave, it will need to cook longer to evaporate the water in the tomato. Also, as might be expected, two eggs take longer than one egg.
To make a Dr. Seuss omelette, add a drop of blue food coloring to the scrambled egg and microwave it with ham.
I had an astronaut omelette this morning, a cheese steak omelette with a little steak I had left over from yesterday. I seldom made omelettes before I found out how good eggs were from the microwave, if cooked properly, because it’s a lot more work on the stove.
I bought a food processor to chop all the stuff up, and discovered that shredded potatoes turn black overnight in the refrigerator, obviously oxidizing. I doubt they’re bad for you, but I’m not eating black hash browns! So I’ll give the food processor to a daughter. I’ve since bought a small handle-operated cheese grater to shred the vegetables for my omelettes, and started buying the smallest potatoes I could find. I also found that shredded potatoes keep well in the freezer, but start darkening as soon as they start thawing.
I’ve bought pre-shredded frozen hash browns, and they kept in the refrigerator for weeks, so they must have added BHT (butylated hydroxytoluene) to keep them from oxidizing. As the Food and Drug Administration has limits on how much BHT you can add to pre-processed food, it’s probably not very good for you. Actually, any pre-processed food isn’t much good for you.
My sister and her husband won’t use their microwave for anything but heating a cold cup of coffee because “I heard that the microwaves change the chemistry of the food.” It’s true, but the change in chemistry is from the heat, not from the microwaves themselves. The chemistry of the food changes exactly the same in a convection oven, microwave, or a pan on the stove. The differences are in moisture, especially the microwave because of how a microwave produces heat.
Now, when people hear the word “radiation” they think of radioactivity and call microwaving “nuking”. But your gasoline vehicle has a radiator, and houses with steam heat used to have radiators; heat was radiated from them. In a microwave, the radiation is simple radio waves, like the radio in your car. The only difference is the frequency; the same difference as the difference between two radio stations, and enclosing those radio waves in a steel box.
About “frequency”: AC stands for alternating current; DC is direct current that travels in one direction, while AC switches directions, the frequency being the speed at which it changes. American wall current is 60 Hz (Hertz, named after Heinrich Hertz, who proved that Maxwell’s “electromagnetic waves” were real), meaning it changes direction, or “polarity,” sixty times a second. European electricity is 50 Hz. FM radio sits in the middle of the television frequencies, 88 MHz (megahertz, or eighty-eight million cycles per second) to 108 MHz. Microwave ovens are 2.45 GHz; GHz is gigahertz, billions of cycles per second. 2.45 GHz is 2,450,000,000 Hz, or two billion four hundred fifty million cycles per second.
It’s roughly the same frequency as the telephone in your pocket, which is why people have ignorant, superstitious fears that cell phones cause brain cancer. If these fears had actually been warranted, brain and groin cancer rates would have spiked in the quarter century since cell phones became common. They haven’t.
But “radiation causes cancer!” Again, the word “radiation” can mean different things depending on what is radiating and how it radiates. A radio, like your phone or a TV broadcasting station, radiates electromagnetic energy; it’s exactly like waving a magnet around. In fact, if you take a bar magnet, drill a hole in the middle, and stick a stick loosely in the hole, then hold it just above the burner of a running induction cooker, the magnet will try to spin in step with the alternating field. You will have built a crude electric motor.
Light is electromagnetic radiation, just like the low frequency magnetic radiation an induction cooker uses, or the colors you can see, or the signals beaming to your TV set and car radio and telephone, or your microwave oven. All are light. We just can’t see those colors with our eyes.
Radioactivity is also light, but unlike the colors you can see, or the colors a microwave oven or telephone transmits, the photons that make up gamma rays and X-rays contain enormous amounts of energy. Comparing microwave frequencies to gamma ray frequencies is like comparing a candle flame to the sun, each at a distance of a thousand miles. That may be a bit of an exaggeration, but you get the point.
Your phone uses microwaves, those extremely high frequencies, because with digital signals, the higher the frequency, the greater the bandwidth, meaning the more phones the towers can connect to. Your microwave oven uses those higher frequencies because that’s the frequency that excites water molecules. The reason you shouldn’t put metal in a microwave is that metal is opaque to microwave frequencies of light, so the energy will reflect back into the transmitter and ruin it, like a high-powered laser pointed straight at a mirror.
A gas stove produces its heat by burning natural gas, and a traditional electric oven produces heat by passing electricity through an electrically resistive coil.
There is a newer type of electric stove, the induction cooker. It heats a steel pan with an oscillating electromagnetic field at a far lower frequency than a microwave; tens of kilohertz, in the neighborhood of radio long waves rather than the 60 Hz AC from your wall. These won’t work with anything but a steel or iron pot or pan; just as a microwave heats the food directly and needs water to work, the induction cooker heats the pan or pot directly and needs a ferromagnetic material like iron or steel to work. As microwave frequencies react with water (chemically, water is burned hydrogen the same way rust is burned iron), those long wave frequencies react with iron. However, laying a steel plate on top of the cooker makes it behave like a normal electric burner.
In either case, the heat is introduced from outside the food, like a campfire and unlike a microwave. The microwave energy is a radio frequency that excites water molecules; the water inside your food heats up, so it’s cooked from within (at least through the outer inch or so that microwaves penetrate) rather than from the outside in, but the chemical changes are identical. The difference is how and where the food’s water escapes as vapor, which is why a convection oven can bake a pizza and a microwave can’t (although there are some special pizza boxes that “kinda” do, heating a pre-cooked frozen pizza without making the crust soggy).
Because of this, it’s extremely difficult, almost impossible, to cook edible chicken in the microwave, or even to reheat it, because of how fat is distributed in the meat of a bird, unlike mammal meat. I’ve managed to cook edible chicken breasts in the microwave, but it’s so hard to do you’re better off cooking them in the convection oven, or deep frying them. If I want chicken I’ll just buy it already fried; frying chicken is hard work and a mess and I’m too old for that shit! I’m not one who loves to cook; I cook for the same reason I worked: to eat.
Mammal meat doesn’t cook well in a microwave, either; the taste is almost the same, but it comes out very unappetizingly ugly. Here’s a photo. I tested part of a raw T-bone, putting it in the microwave for two minutes. It came out like the photo here, but the taste was almost identical to the part cooked in a frying pan; in fact, the microwaved piece was juicier and more tender. Yes, that’s science: one steak, part cooked on the stove and part in a microwave. Better science would do that hundreds of times and document all of the results.
Afterwards, I experimented with barbecuing a pork steak, again, with most of it cooked in a pan. I thought perhaps the sauce would disguise the looks.
I was wrong. It still looked disgusting when it was done. That’s what science is for; testing preconceptions. Some things that seem to be a certain way really aren’t.
Of course, what was cooked in a pan was nowhere near as good as one cooked on a grill. I put a small piece on a plate and microwaved it for three minutes, fearing undercooked pork.
It was way overcooked; chewy, but didn’t taste any different than the barbecue in the pan. Had I overcooked it that badly in a pan it would have been like burned shoe leather.
However, with the exception of chicken and fish, you can reheat meat in the microwave, again being sure to reheat it, not recook it. The biggest reason mammal meat doesn’t cook well in a microwave is that you can’t brown it in a microwave.
However, if the meat is in a dish, like ham and beans, or beef stew, or chili con carne (that’s Spanish for “chili with meat”), maybe a casserole (I never tried making a casserole in the microwave because I don’t much care for casseroles), it cooks fine in the microwave.
If you’ve fried a steak or a hamburger or a pork chop or such on the stove or grill, and find when you cut it or bite it that it isn’t done, a minute or two in the microwave will finish its cooking without altering its taste or appearance unless you cook it too long.
That is, unless it’s a fats food burger, those are really nasty reheated. They slap the condiments and tomatoes and other garbage on them to cover the taste of the very low-quality meat. Just give it to your dog, if he will eat it.
But good quality hamburger you cooked on the stove heats well in a microwave, unlike a fats food burger. No, that wasn’t a typo, “fast food” is; it’s no faster than a sit-down restaurant with wait staff, but it will make you fat.
One thing I discovered about forty years ago was that if you barbecue pork on a charcoal grill, refrigerate the leftover meat overnight, then re-heat it in the microwave the next day, it tastes twice as good as when it was first cooked! It probably has to do with the water heating the fat, but that’s just a guess.
Anything that you normally boil will come out identical from the microwave. I’ve found that the water doesn’t even have to boil to cook the food, making your food healthier, since less water will evaporate.
You should always use filtered water when cooking, either in a pot on the stove or in a microwave, because evaporation will concentrate all of the inorganic poisons, like lead and arsenic. If you drink filtered or bottled water, you should cook with it, too, or you’re wasting your money. If there are 135 parts per million of nastiness coming out of your tap, like the last time I tested Springfield water (my filter pitcher came with a tester), boiled halfway down doubles that to 270 PPM.
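For the skeptical, the arithmetic is simple conservation: the dissolved solids stay in the pot while only pure water leaves as steam. Here is a quick sketch in Python (the function name is mine, not from any library):

```python
def boiled_ppm(start_ppm, fraction_of_water_left):
    """Dissolved solids stay behind while water evaporates, so the
    concentration scales with the inverse of the water remaining."""
    return start_ppm / fraction_of_water_left

# Springfield tap water at 135 PPM, boiled down to half its volume:
print(boiled_ppm(135, 0.5))  # 270.0
```

Boil away three quarters of the water instead of half and you’re at 540 PPM.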
I have yet to find any vegetable that doesn’t come out of the microwave tasting delicious, as long as it’s cooked well, which in most cases is just heating long enough. But some require extra for the best taste. I mentioned bacon grease earlier; when I was growing up, whenever my mom made green beans she cooked them in a pot (there were no home microwave ovens back then) with water and bacon. I understand that’s how most Americans except Jews and Muslims cook green beans.
In a microwave, I’ve used the bacon itself, but the way I cook dinner makes it better to just put the frozen beans and filtered water in a tall ten ounce cup, and add a little bacon grease.
The tall cups allow me to heat three vegetables at the same time. Maybe I should have subtitled this “Cooking for One or Two”. Before, I used bowls, and was thinking about buying two more microwaves, but I don’t have the room in my little kitchen. If you’re cooking for one with the microwave, you’ll need to pre-cook most vegetables until they’re the desired softness. In the microwave, of course.
The tall cups also, unfortunately, boil water a lot faster than a bowl does, and as soon as it boils, it boils over. The obvious answer is not to leave it in long enough to boil. Test it first with just tap water so you will know the maximum time it takes your microwave to boil water in a tall cup. With the cups I have and the microwave I use and the liquid at room temperature, a minute and a half is the maximum without making a mess. Veggies straight from the freezer can take two minutes; you can test this with cold water and an ice cube. When it beeps, start it again until it’s done.
Now, another misconception about microwave cooking is that you should only reheat food in a microwave once or it will somehow become poisonous, the old “microwaves change the chemistry” nonsense. If you overcook anything, whether in a microwave, on the stove, or in an oven, it will taste nasty. It won’t be poison, but I can see how you might think that, although if you use tap water there’s a tiny grain of truth in it. Again, water that has partly evaporated is in fact more concentrated with those poisons than before heating. Whether cooking on a stove or in a microwave, you should use the purest water you can.
The biggest microwave cooking “secret” is getting the food just hot enough. A lot of people have a really bad habit of overcooking in the microwave, probably because it’s so much faster than other cooking methods. I once saw someone put a TV dinner in a microwave for ten minutes, then complain about how bad it tasted. I looked at the box; the instructions were to cook it for two minutes, turn the meat over and stir the potatoes, then cook it for another minute to a minute and a half. No wonder it sucked! But back before microwaves, TV dinners came in an aluminum tin and took half an hour to forty-five minutes to cook in a convection oven.
Remember that if it’s not done enough you can put it right back in the microwave, but uncooking a thing is physically impossible.
It’s been said that “everything’s better with butter.” That’s simply not true. I tried cooking buttered broccoli and cauliflower and it was awful! You would think that butter beans would be better with butter. Ironically, buttered butter beans are disgusting.
Broccoli and cauliflower are best just cooked in water until they’re soft, or eaten raw, the most nutritious way to eat them, although cooking will kill bacteria. Of course, there are probably some very delicious recipes with those vegetables. Butter beans and Lima beans are also best just cooked in water. Peas are very good with minced onion.
Corn, carrots, and potatoes are all better with butter, and the more butter the better. If you buy or grow fresh carrots you’ll be amazed at how much stronger the taste is than canned or frozen. I’ve started buying fresh vegetables when they’re available and freezing them myself. They’re cheaper, more nutritious, and taste better; the packagers must boil the hell out of carrots, peas, and green beans before they freeze them to get rid of the taste and nutrition. I’ve never liked canned peas, but love them fresh or frozen.
A good rule of thumb is whatever you put with any given vegetable when cooked on the stove will work with a microwave, like butter with corn, or bacon with green beans. Often it will taste better than on a stove top. Again, that includes anything you put in the pan to resist sticking.
The most rational way to cook is to use the method that produces the best results: taste and nutrition, ease of preparation, and lowest cost. Of course, with any kind of food, there will be trade-offs among those three variables.
Cooking with microwaves has many advantages for things like eggs, vegetables, and soups; actually, anything except meat, and even meat cooks fine in the microwave as part of a recipe like chili or beef stew, as mentioned before.
Microwaved vegetables are more nutritious than vegetables cooked on a stove, because on a stove, most of the vitamins and minerals are poured down the drain. And since it takes longer to cook on the stove, more water evaporates, concentrating more of those inorganic poisons than in the microwave. On cold winter days when the air is dry I put a big pot of tap water on the stove to boil. It keeps my lips and sinuses from cracking, and flu viruses reportedly don’t survive well above forty percent humidity. You should see what’s left in the pan when the water’s boiled out!
A gas stove may be cheaper to cook on than a microwave; that depends on what you’re paying for each form of energy. But in the summer, that gas stove will run your electric bill up by making your air conditioner work harder; the air conditioner is the appliance that uses the most electricity in your house. At least in the summer, and possibly all year, the microwave costs less to run.
An electric stove will cost a whole lot more to cook on than a microwave, whether a traditional resistance stove or a new induction cooker. Each separate burner in a resistance stove uses up to 2400 watts, an induction burner uses up to 1800 watts. Your microwave maxes out at 750 to 1000 watts, and it runs for less than a fourth of the time it takes to cook on the stove.
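Here’s a rough kilowatt-hour comparison. The cook times below are assumptions I made up for a typical single dish, not measurements, so adjust them for your own cooking:

```python
def kwh(watts, minutes):
    """Energy used by an appliance drawing `watts` for `minutes`."""
    return watts * minutes / 60 / 1000

resistance_burner = kwh(2400, 20)  # assumed ~20 minutes on a big burner
induction_burner  = kwh(1800, 15)  # induction heats a bit quicker (assumed)
microwave         = kwh(1000, 5)   # a fourth of the stove time or less

print(resistance_burner, induction_burner, microwave)
```

Even with generous guesses the microwave comes out to a small fraction of either kind of electric burner.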
The microwave is a lot less work than the stove, especially cleaning up the mess. In the microwave, there are no pots, pans, or skillets, you serve the food from the container it was cooked in. Except the vegetables; you will need a slotted spoon to dish them out. There are no tongs or spatulas to wash because when food is cooked from the inside rather than the bottom, you don’t need to turn it.
Now, where do all the myths come from? Mostly, as I said, misunderstanding something that’s been heard. But also from those dishonest rich people who stand to make more or lose less money because of the myths: gas and electric companies. If you cook your vegetables on a gas stove, the gas company gets paid, but not if you cook it in a microwave. If you have an electric stove, the power company prefers you use the stove because it takes a lot more electricity than a microwave, so they make more money from you.
It’s better to throw away the myths and keep your money.
A few days ago, aristarchus started a thread about warp speed and the vast distances across interstellar and intergalactic space. I posted an AC reply explaining what the warp scale means and briefly discussed inconsistencies in Star Trek. Perhaps I'd be better off posting this on the Daystrom Research Institute subreddit, but I'll give it a try here.
The writers of TNG wanted the Enterprise-D to be considerably faster than the Enterprise (no bloody A, B, C, or D). At the same time, they modified the warp scale so that speeds are higher at every warp factor and warp 10 is essentially infinite velocity. Still, the writers wanted the galaxy to seem quite large, with vast swathes of uncharted space. There are major inconsistencies between TOS and TNG, and even within the TNG era. I'll provide two examples.
In the TOS episode That Which Survives, the Enterprise travels 990.7 light years at maximum warp, which takes 11.33 hours, an implied speed of roughly 766,000c. For the TOS Enterprise, warp 8 is maximum warp, and under the traditional cubic scale (speed equals the warp factor cubed) that's only 512c, which would make the trip take almost two years. By comparison, Voyager was hurled 70,000 light years into the Delta Quadrant. If they traveled at the speed implied by That Which Survives without stopping, it should have taken a little under 34 days to return to their original location. Instead, in Caretaker, they say it would take 75 years at maximum warp.
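For anyone who wants to check the arithmetic, here's a quick sketch in Python (using a Julian year of 8,766 hours):

```python
HOURS_PER_YEAR = 24 * 365.25  # 8766

# That Which Survives: 990.7 light years in 11.33 hours
implied_speed_c = 990.7 / (11.33 / HOURS_PER_YEAR)  # roughly 766,000 c
cubic_warp8_c = 8 ** 3                              # 512 c under the TOS cubic rule

# Voyager's 70,000 light years at the implied TOS speed:
return_days = 70000 / implied_speed_c * 365.25      # a little under 34 days

print(round(implied_speed_c), cubic_warp8_c, round(return_days, 1))
```

The gap between the implied speed and the cubic-rule speed is about three orders of magnitude, which is the whole inconsistency in one number.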
In First Contact, Picard states that the Federation is 8,000 light years across with 150 members. Maximum warp for Voyager is 9.975, which is really, really fast. By comparison, maximum warp for the Enterprise-D is 9.6, but Data states in Encounter at Farpoint that warp 9.8 is possible with extreme risk. These are some of the Federation's fastest ships, and even the Defiant only has a maximum warp of 9.5. It should take these ships many years to cross Federation space, so an ordinary ship would likely take much longer to cross Federation space. Instead, we typically see travel times on the order of days or weeks, though that isn't from one end of the Federation to the other. Still, the travel times are far quicker than they should be with the size of Federation space. Again, this doesn't make sense with the travel time for Voyager's return to the Alpha Quadrant.
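For a rough sense of the numbers, the TNG Technical Manual's rule of thumb is speed equals the warp factor to the 10/3 power. That rule is only meant for factors below warp 9 (the canon curve bends toward infinity at warp 10), so treat this as a conservative floor:

```python
def tng_warp_to_c(warp):
    """Rough TNG-scale conversion, valid below warp 9;
    canon speeds above warp 9 are somewhat higher than this."""
    return warp ** (10 / 3)

speed = tng_warp_to_c(9.6)      # roughly 1,900 c
years_to_cross = 8000 / speed   # Federation stated as 8,000 ly across

print(round(speed), round(years_to_cross, 1))
```

Even at the Enterprise-D's maximum sustainable warp, crossing the Federation should take over four years, not the days or weeks we see on screen.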
Of course, there are a couple of real reasons for the inconsistency. One is that there weren't really any standards applied in the TOS era to be consistent about the technology, stardates, or many of the other details. The speed of the Enterprise was pretty much whatever the writer wanted it to be for that episode. The other main reason is that science fiction writers generally don't do a good job representing the vast scale of interstellar space. Other inconsistencies have been resolved, such as why Klingons look different in TOS than in later series. But no explanation has been given on screen to explain the vast differences in speed.
As a creative writing exercise, I ask: If you were a writer for Star Trek, if you were trying to explain the inconsistencies in the warp scale, how would you do so? I'll offer an idea, but I'm interested in hearing other ideas.
A lot of interstellar communication appears to be instantaneous or very nearly so, particularly in the TNG era. This is explained by a network of subspace relays that accelerate the speed of subspace radio signals within Federation space. I would explain that the subspace relays don't just affect communications but also boost the speed of ships. As long as you're within the network of subspace relays, the speed of your ship gets boosted by a couple orders of magnitude. Get outside of Federation space and you lose the massive boost. It would allow ships to cross Federation space quickly while still making it difficult to travel to uncharted regions of the galaxy. Analogous to the Roman Empire's network of paved roads, travel is very efficient until you try to go beyond the network.
The apparent speeds in TOS often get explained away, with things like traveling to the other side of the galaxy referring to crossing the galactic plane. I highly doubt that this is the meaning the writers intended. It's actually much more interesting if the TOS Enterprise really did travel to the Gamma and Delta Quadrants, but they later became effectively inaccessible to Federation ships. Of course, TNG era ships still have to be faster.
In TNG's Where No One Has Gone Before, the Traveler is able to accelerate the Enterprise-D to incredible speeds. Kosinski thinks his theories about warp propulsion have unlocked much faster speeds, not realizing that it's only because of the Traveler. I would explain the incredible speeds in TOS in the same way. The warp engines were really much slower than in the TNG era, but the Federation was the recipient of outside help. Perhaps the Federation even tried to engineer new warp engines around the theories they thought would allow them to travel at those much faster speeds, but those experiments ultimately failed. That could explain why the Excelsior's transwarp drive in Star Trek III was never seen again, and the ship later on has standard warp engines. Much like Kosinski, the Federation didn't realize or want to believe that they were getting outside help.
A lot of TOS is an allegory for the Cold War, where the Klingon Empire represents the Soviet Union. I would write that the Q Continuum wanted humanity to survive and evolve, perhaps eventually to be like themselves. Q actually said that if he hadn't hurled the Enterprise 7,000 light years to encounter the Borg at system J-25, humanity would have been assimilated. I would reveal that the Q were also assisting the TOS era Federation, just without revealing their presence. Without the interference of the Continuum, the Federation would have been conquered by the Klingon Empire, the region would spend centuries under a military dictatorship, and humanity would eventually have been reduced to near extinction under the brutal conditions. As a twist, I'd also add that Daniels from Enterprise was also one of the Q, again protecting humanity from threats from the future during the Temporal Cold War.
Basically, faster travel times would give the Federation a huge advantage moving troops and supplies over long distances. This would allow them to not only avoid being conquered by the Klingon Empire but to prosper. Once the Klingon Empire was significantly weakened due to Praxis exploding, there would no longer be any need for outside assistance, and the Q stopped intervening to prevent humanity from starting too many wars and becoming conquerors. The interference was unnecessary and might have been harmful to peace in the region once the Khitomer Accords were signed.
These are my ideas for how the writers could reconcile the speed differences. I think it's much more interesting if TOS era ships actually traveled great distances than to use gimmicks like the other side of the galaxy meaning to cross the galactic plane. If you were writing for Star Trek and needed to resolve these inconsistencies in warp speed, how would you explain them?
I had an idea for a story I don't think anyone has written, although I'm probably wrong, as it seems obvious. I had Mars, ho!, and a voyage to Earth, and I thought, "it's time we left this solar system." I mean, I'd spent a lot of time on Mars and several of the larger asteroids, what's next?
Research for Grommler informed me that you could get to Sirius and back in a little over ten years, although a hundred years would pass on Mars during that time, so I thought "Why not Alpha Proxima?" Note: although college research prohibits using encyclopedias as sources, researching a fiction story needs no citations.
I found that the calculations I got from Wikipedia for a trip to Sirius at 1g thrust were wrong (so sue me), but at that amount of thrust you could get to Proxima Centauri in about three and a half years of ship time, so I'll round it up for the story. Of course, it will seem a lot longer to us here, but I haven't figured out how much yet. Math guys?
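Since the post asks for math guys: here is a sketch using the standard relativistic rocket equations for a flip-and-burn trip, meaning accelerate at 1 g to the midpoint and then decelerate at 1 g the rest of the way. I'm assuming Proxima Centauri is about 4.25 light years away:

```python
import math

C = 299_792_458.0   # speed of light, m/s
G = 9.80665         # 1 g, m/s^2
LY = 9.4607e15      # metres per light year
YEAR = 3.15576e7    # seconds per Julian year

def flip_and_burn_1g(distance_ly):
    """Ship time and Earth time in years for constant 1 g proper
    acceleration to the midpoint, then 1 g deceleration to arrival."""
    half_m = distance_ly * LY / 2
    # distance x = (c^2/a)(cosh(a*tau/c) - 1), solved for proper time tau
    tau_half = (C / G) * math.acosh(1 + half_m * G / C**2)
    t_half = (C / G) * math.sinh(G * tau_half / C)
    return 2 * tau_half / YEAR, 2 * t_half / YEAR

ship_years, earth_years = flip_and_burn_1g(4.25)
print(round(ship_years, 1), round(earth_years, 1))  # roughly 3.5 and 5.9
```

By the same equations, a round trip to Sirius (about 8.6 light years each way) works out to roughly nine years of ship time and twenty-one years for the folks back home, so the hundred-year figure really was off.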
Proxima B is in that star's habitable zone, but Jesus, what a shitty name for a planet! It will probably be hundreds of years before we get to the point where we can produce that much thrust for that long a time, and by then we will have had an awful lot of probes to our actually existing but poorly named planet.
Proxima Centauri is a red dwarf that orbits A and B, which should make some really cool visuals for a movie set on that planet, and also hints at what its name will be in a few hundred years. It certainly won’t be “B”. Bee, maybe? Could life begin on a planet like that? I have my doubts, but could be wrong.
Also, the star's sisters have more "normal" star names, Rigil Kentaurus (Alpha Centauri A), and Toliman (B).
Anybody have a good name for this planet? And its poorly named star? Or should I just name it the same as Isaac Asimov did in Foundation and Earth? I've forgotten what it was, I'll have to read the book again.
I named the CEO of the Green-Osbourne Transportation Company after the guy who thought of whores in space while we were discussing Nobots in Felber's beer garden as a coven of crack whores walked down the street, and I gave him an acknowledgement. If I use your planet name and you wish, I'll do the same for you.
We used to not have any worker safety laws. Nobody forced us to put doors on the elevators. If somebody died, so fucking what? We could foul the water so badly rivers caught fire, nobody cared what poisonous garbage we poured in it. We could spew so much poison in the air that you had to roll the windows up driving past our Monsanto plant because of the pain it caused your lungs. We could get the government to wage pointless wars so that gold would pour into our coffers, even having the government draft men to die for our filthy money.
Then those pesky kids came along.
They picketed, demonstrated, voted, and wrote letters on paper to their elected representatives. They got the war stopped, damn them, and the draft with it. They got OSHA started, so now we have to put doors on our Purina elevators and install guard rails. Guard rails, for Christ’s sake! The nerve!
What’s worse, they got the EPA started. Damn them!
But we’ll show ‘em. Now, we’re paying all the federal tax, and they’ve made us actually pay our full-time workers a living minimum wage. But we’ll fix their wagons. All it will take is for us to keep raising the price of the worthless junk the fools buy from us until the minimum wage won’t buy shit, and bracket creep will raise taxes so high everybody will be paying them. Then we’ll shower the dishonest fools in congress with cash and convince them to lower OUR taxes to the point only the poor will pay them.
The best part? We’ll fuck up the education system to the point that their grandchildren in the twenty first century will be too stupid and apathetic and feel too powerless to do a damned thing about it. They’ll be so damned dumb they’ll vote against their own interests. We’ll be BILLIONAIRES!!