posted by cmn32480 on Thursday October 27 2016, @02:52PM
from the resist-the-urge-to-get-amped-up dept.

According to the Natural Resources Defense Council, Americans waste up to $19 billion annually in electricity costs due to "vampire appliances," always-on digital devices in the home that suck power even when they are turned off.

But University of Utah electrical and computer engineering professor Massood Tabib-Azar and his team of engineers have come up with a way to produce microscopic electronic switches for appliances and devices that can grow and dissolve wires inside the circuitry, instantly connecting and disconnecting electrical flow. With this technology, consumer products such as smartphones and laptop computers could run at least twice as long on a single battery charge, and newer all-digital appliances such as televisions and video game consoles could be much more power-efficient.
...
"Whenever they are off, they are not completely off, and whenever they are on, they may not be completely on," says Tabib-Azar, who also is a professor with the Utah Science Technology and Research (USTAR) initiative. "That uses battery life. It heats up the device, and it's not doing anything for you. It's completely wasted power."

Tabib-Azar and his team have devised a new kind of switch for electronic circuits that uses solid electrolytes such as copper sulfide to literally grow a wire between two electrodes when an electrical current passes through them, turning the switch on. When you reverse the polarity of the current, the metallic wire between the electrodes breaks down -- leaving a gap between them -- and the switch is turned off. A third electrode is used to control this process of growing and breaking down the wire.

He did not get the memo--reducing vampire current is not what the Internet of Things is all about.


  • (Score: 4, Interesting) by TheRaven (270) on Thursday October 27 2016, @04:19PM (#419459)

    The number is quite a bit higher than I expected. It works out at $165/year/household. A rough rule of thumb is that one Watt for something that's always on costs about $1/year. I find it a bit hard to believe that most households have 165W of this stuff. Most appliances consume 2-10W in standby, so that's somewhere between 16 and 80 appliances per household. Replacing even 16 would likely cost far more (even purely in terms of energy consumption) than leaving them on.
    --
    sudo mod me up
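
    A quick sanity check of that arithmetic (a sketch only; the household count and electricity price below are assumptions, not figures from the comment):

        # Back-of-envelope check of the $165/household figure (inputs are rough assumptions)
        total_cost = 19e9          # NRDC estimate, dollars per year
        households = 115e6         # assumed number of US households
        price_per_kwh = 0.12       # assumed average electricity price, $/kWh

        per_household = total_cost / households                      # ~$165/year
        dollars_per_always_on_watt = 8760 / 1000 * price_per_kwh     # 1 W running all year ~ $1.05
        implied_watts = per_household / dollars_per_always_on_watt   # ~157 W of always-on load
        print(per_household, dollars_per_always_on_watt, implied_watts)
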
  • (Score: 3, Interesting) by Anonymous Coward on Thursday October 27 2016, @04:54PM (#419475)

    I'm not worried about the 1W stuff. Those are really insignificant in the big picture; your heating and cooling is where you should look for your savings first (don't overdo the cooling/heating, and turn it off when nobody is around).

    HOWEVER on the topic of always on appliances, I think those set top boxes/satellite/cable TV stuff use a lot more than 1W. If you put your hand on them you can tell they get rather hot. The internet/WiFi combo routers consume more than 1W too but less than those set top boxes. Basically if it doesn't get very warm then you don't really have to worry about it so much.

    If you're that worried about 1 watt devices you should also disable flash, run ad blockers on your computer and use noscript, etc. Some of those stupid ads can keep an entire CPU core occupied even when running in the background, and on my PC that can be an extra 10-25W. Nowadays it doesn't seem that bad - I haven't had luck finding a site that's as bad as before (I guess that's because Google Chrome and Firefox have started disabling flash by default).

    Keep in mind that if you drive your car a bit more efficiently you could save a fair bit too...
    • (Score: 2) by VLM (445) on Friday October 28 2016, @12:19PM (#419808)

      I think those set top boxes/satellite/cable TV stuff use a lot more than 1W.

      I know they used to, because I own a Kill A Watt meter, and all "power off" did on a '00s-era SD cable box was output a perfect black-screen NTSC signal to greenwash the customer into thinking "off" did anything. Not even a flicker on the Kill A Watt: 30 watts continuous, all day.

      Of course, due to the pace of technological change, something like a MythTV frontend or a cable set-top box has gone from a "wattage is no object" gamer PC with multiple fans in '00, to a little thing that draws 15 watts using VDPAU in '10, to an essentially zero-power Raspberry Pi in '15.

      It's also worth pointing out that where I live the ratio of heating to cooling degree days is in excess of 20 to 1, so as a practical matter 60 watts of electricity isn't "wasted"; it merely means a bit less natgas burned. And like many people (and seemingly all local retail businesses) I subscribe to 100% renewable energy, so for every kWh I use, the electric co buys a kWh from a local windmill or solar plant; it might be expensive, but the more electricity I "waste", the lower my carbon footprint becomes. Admittedly probably 90% of the population lives south of me / warmer than me, but you have to go surprisingly far south before annual heating and cooling degree days are equal.

      I guess my point is that often none of the watts are wasted, often there is no carbon footprint at all, and thanks to strong economic and technological pressures the wattage wasted STRONGLY tends toward zero.

      My 5-year-old Roku draws essentially zero, and I don't envision ever going back to the days of the 30-watt set-top box.

      Maybe the first Star Trek holodeck will draw 1500 watts. In fact I almost guarantee it. And probably 20 years later it'll be higher res with a hundred times the storage, and follow the usual trajectory to "a watt or less" with the working parts the size of a pack of cards. It's just not something to worry about.
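
      For scale, a quick sketch of what a 30-watt box versus a low-power streamer works out to over a year (the streamer's wattage and the electricity price are assumptions):

        # Annual energy and cost of an always-on load (assumed $0.12/kWh)
        def annual_cost(watts, price_per_kwh=0.12):
            kwh = watts * 8760 / 1000   # watts running all year -> kWh
            return kwh, kwh * price_per_kwh

        print(annual_cost(30))   # '00s-era cable box: ~263 kWh, ~$32/year
        print(annual_cost(2))    # assumed Roku-class device: ~18 kWh, ~$2/year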

  • (Score: 2) by LoRdTAW (3755) on Thursday October 27 2016, @06:16PM (#419512)

    Modern dryers and washers have standby power so the light turns on when you open the door
    Stoves and microwaves all stay on to display clocks and wait for user input
    Some toasters now have soft power buttons with LEDs
    Coffee makers with timers
    Phone chargers (though most in standby are in the mW-µW range)
    Cable boxes
    DVRs
    Media sticks like the Fire Stick
    IoT garbage
    Smart thermostats (see IoT garbage)
    and on and on.

    It's easy to see how you can quickly collect upward of 20-30 vampire gadgets without realizing it.
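
    A rough tally of what a pile of gadgets like that adds up to (every wattage below is an assumed typical standby figure, not a measurement):

        # Assumed standby draws, in watts, for a household's always-on gadgets
        standby_watts = {
            "washer/dryer": 2, "stove clock": 2, "microwave clock": 2,
            "toaster soft-power": 1, "coffee maker timer": 1,
            "cable box": 15, "DVR": 10, "media stick": 2,
            "smart thermostat": 1, "assorted IoT": 5, "phone chargers": 0.5,
        }
        total_w = sum(standby_watts.values())
        annual_kwh = total_w * 8760 / 1000
        print(total_w, annual_kwh, annual_kwh * 0.12)   # ~41.5 W, ~364 kWh, ~$44/year at $0.12/kWh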

    • (Score: 3, Interesting) by Nuke (3162) on Thursday October 27 2016, @07:48PM (#419546)

      Some of the stuff I have (cooker and microwave come to mind), if turned off, refuses to resume working unless you first go through an elaborate set-up procedure, including setting the date, time and default preferences. They are enough of a pain after the occasional power cut.

      • (Score: 2) by Grishnakh (2831) on Friday October 28 2016, @04:30PM (#419879)

        Why did you buy that microwave? You should have taken that back as soon as you found that out about it. Any decent microwave oven will just show 0s when the time isn't set (and the date shouldn't even apply to a microwave). Lots of people simply never bother to set the clocks on their microwaves.

    • (Score: 2) by pnkwarhall (4558) on Thursday October 27 2016, @11:36PM (#419622)

      I was given a Bunn coffee pot as a gift, and I thought it was great until I realized that it *always* keeps the water hot. Even at two pots per day, that seems like a lot of energy waste. What's great as the workhorse of a ship's coffee mess doesn't necessarily apply at home.

      Now I'm curious to know how much energy was spent each day keeping the water hot. It doesn't seem like it would be an insignificant amount.

      --
      Lift Yr Skinny Fists Like Antennas to Heaven
      • (Score: 2) by Geezer (511) on Friday October 28 2016, @09:56AM (#419781)

        We use a Bunn at our house. The thing to remember is that it's more efficient to achieve and maintain a stable brewing temperature than to heat from ambient on demand. Also, cold-start elements use much more power to brew quickly. The actual difference in total energy consumed is very small. Fun science fair project for the kiddies: run a trace with a recording ammeter over 24 hours with the same number of pots brewed on a Bunn vs. a Mr. Coffee.

        • (Score: 0, Troll) by toddestan (4982) on Saturday October 29 2016, @12:57AM (#420004)

          That doesn't make any sense. It takes the same amount of energy to heat the water to brewing temperature no matter how you do it. If it heats it on demand when it's needed, that's the amount of energy used. If it heats it beforehand and then keeps it hot until you need it, that costs the same amount of energy to heat it initially, plus whatever it takes to keep it hot until it's needed, which will equal the heat lost from the reservoir, since it will never be perfectly insulated. The only possible benefit is that the on-demand coffee maker has to draw a lot of current to heat the water quickly, whereas the always-hot coffee maker can afford to draw less current since it can take longer to heat the water.

          Plus there are other considerations. Since it's heating all the time, even when no one's around, it's more likely to start a fire. And if the reservoir isn't perfectly sealed, the water can eventually boil off, and then what?
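
          A rough sketch of that comparison (the reservoir size, temperatures, and standby loss are all assumed figures, only meant to show where the extra energy goes):

            # Heat-on-demand vs. keep-hot reservoir (all inputs are assumptions)
            liters = 1.5                # assumed reservoir size
            delta_t = 95 - 20           # heat 20 C tap water to a 95 C brewing temperature
            joules_per_brew = liters * 1000 * 4.186 * delta_t    # mass (g) * specific heat * delta T
            wh_per_brew = joules_per_brew / 3600                 # ~131 Wh per tankful, either way

            standby_loss_w = 30         # assumed average loss keeping the tank hot
            standby_wh_per_day = standby_loss_w * 24             # ~720 Wh/day on top of the brewing energy
            print(wh_per_brew, standby_wh_per_day)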