
posted by cmn32480 on Thursday October 27 2016, @02:52PM   Printer-friendly
from the resist-the-urge-to-get-amped-up dept.

According to the Natural Resources Defense Council, Americans waste up to $19 billion annually in electricity costs due to "vampire appliances," always-on digital devices in the home that suck power even when they are turned off.

But University of Utah electrical and computer engineering professor Massood Tabib-Azar and his team of engineers have come up with a way to produce microscopic electronic switches for appliances and devices that can grow and dissolve wires inside the circuitry that instantly connect and disconnect electrical flow. With this technology, consumer products such as smartphones and computer laptops could run at least twice as long on a single battery charge, and newer all-digital appliances such as televisions and video game consoles could be much more power efficient.
...
"Whenever they are off, they are not completely off, and whenever they are on, they may not be completely on," says Tabib-Azar, who also is a professor with the Utah Science Technology and Research (USTAR) initiative. "That uses battery life. It heats up the device, and it's not doing anything for you. It's completely wasted power."

Tabib-Azar and his team have devised a new kind of switch for electronic circuits that uses solid electrolytes such as copper sulfide to literally grow a wire between two electrodes when an electrical current passes through them, turning the switch on. When you reverse the polarity of the electrical current, then the metallic wire between the electrodes breaks down -- leaving a gap between them -- and the switch is turned off. A third electrode is used to control this process of growing and breaking down the wire.

He did not get the memo--reducing vampire current is not what the Internet of Things is all about.


Original Submission

  • (Score: 4, Insightful) by AthanasiusKircher on Thursday October 27 2016, @03:20PM

    by AthanasiusKircher (5291) on Thursday October 27 2016, @03:20PM (#419433) Journal

    The technology sounds potentially interesting, but is it really necessary to solve most cases where people WANT to limit "vampire" power usage?

    I'm pretty sure 99% of these devices could get rid of the "vampire" power simply by having a physical switch to turn them off (or, in other cases, designing the circuit properly so the physical switch actually turns everything off). I have a DVD player I bought 15 years ago that has a physical power button which you need to push to click in if you want to turn it on. Yes, it also has a "sleep mode" that it will go into if you leave the switch in the "on" state, thus allowing you to turn it fully back on with the remote. But since I don't watch DVDs that often anymore, I almost always hit that physical power switch. (Actually, even when I did watch DVDs, I tended to hit that switch too... why pay the electric company money to power a sensor just so I can use a remote, given that I need to make physical contact with the DVD player to load and unload the DVD in the first place?)

    Anyhow, it seems more and more devices are coming without those physical switches. Some have no "power button" at all. Others may have one, but it effectively just toggles between "on" and a form of "sleep," rather than turning the device fully off. In most cases, this seems to exist so you can just use your remote to turn the device back on. (And, as I said, some other devices just have poorly designed power supplies or circuits that keep running and drawing power for no reason when the switch is "off" -- these just need to be designed better.)

    So, if we really want to stop a lot of this "vampire" power drain, shouldn't we just be encouraging companies to make sure there's a literal power switch? Isn't that a simpler solution? Many folks already do this by connecting such devices to a power strip, which they tend shut off when they're not in use.

    In the use cases where you actually WANT to be able to turn on something with a remote or some other case that the device needs to drain some minimal "vampire" power, then the contention that "... it's not doing anything for you. It's completely wasted power" is false. It is doing something for you, e.g., providing a convenient "sleep" mode so you don't have to wait to boot up a device again or whatever.

    Perhaps this tech can improve situations like that and reduce power drain during inactive times. But let's not pretend that this "vampire" power is completely "wasted" in those use cases -- what you're using that power for is to provide convenience.

    • (Score: 2) by Scruffy Beard 2 on Thursday October 27 2016, @04:34PM

      by Scruffy Beard 2 (6030) on Thursday October 27 2016, @04:34PM (#419466)

      The old-school policy for appliances was to unplug them when not in use. (Not sure of the exact reason, but it may have had to do with less reliable cords.)

      I wonder if that will come back in fashion. I think it is unlikely, because nobody likes to crawl through a rats' nest of cables to unplug something.

      • (Score: 0) by Anonymous Coward on Thursday October 27 2016, @05:02PM

        by Anonymous Coward on Thursday October 27 2016, @05:02PM (#419480)

        I'm guessing there are a variety of scenarios that all lead to "electrical fire". Bad circuit design, flaw in wiring insulation, appliances that can't handle a power surge, etc.

        • (Score: 2) by bob_super on Thursday October 27 2016, @05:51PM

          by bob_super (1357) on Thursday October 27 2016, @05:51PM (#419500)

          Reasons I've heard from my parents for unplugging are: significant power drain even when off (leaky AC transformers), and lightning safety (a lightning strike within a not-so-short distance could start a fire).

          • (Score: 2) by VLM on Friday October 28 2016, @12:03PM

            by VLM (445) on Friday October 28 2016, @12:03PM (#419806)

            lightning safety

            My parents did that in the early 80s with the very earliest home computers.

            Ironically, a decade (or more?) before ATX power supplies, "off" meant an airgapped switch of probably higher quality than a $5 all-plastic (including the ground "conductor") power strip. Also, a lightning strike that just said "whatever" to an air gap of 2 miles or so isn't going to be very impressed by another quarter inch.

            My recent experience is that the most likely victim of lightning is insteon/x10 home automation components. Nothing more annoying than hearing a "bang" during a thunderstorm and finding that the two-year-old automation switch you just installed is dead.

        • (Score: 1, Interesting) by Anonymous Coward on Friday October 28 2016, @07:48AM

          by Anonymous Coward on Friday October 28 2016, @07:48AM (#419763)

          I'm guessing there are a variety of scenarios that all lead to "electrical fire". Bad circuit design, flaw in wiring insulation, appliances that can't handle a power surge, etc.

          Over here we got to hear every year around December how important it was to unplug TVs because TVs tended to catch fire. Then somebody started wondering why are these spots only played in December, and looked at the statistics. Turned out that TVs did indeed tend to catch fire in December, the same time that everyone put Christmas decorations with lots of candles on top of their TVs.

          That took the spots about TVs catching fire off the air for a few years. Later the problem went away completely with flat-screen TVs that are too slim to put a Christmas decoration on top of.

      • (Score: 2) by DECbot on Thursday October 27 2016, @05:24PM

        by DECbot (832) on Thursday October 27 2016, @05:24PM (#419493) Journal

        I already have one power strip that I can use a remote to turn off the outlets. I've considered getting more for my entertainment center and such, but I haven't bothered yet because of cost.

        Note to self: use the Kill-a-Watt on that power strip to test power consumption in the "off" state. Just how off is off and how much power does it take to listen to a remote? Maybe I just need to wire the outlet to a light switch.

        --
        cats~$ sudo chown -R us /home/base
      • (Score: 2) by Grishnakh on Friday October 28 2016, @04:20PM

        by Grishnakh (2831) on Friday October 28 2016, @04:20PM (#419878)

        You don't need to unplug things. If you want to really stop the "vampire" energy usage, just use a simple power strip that you can get for $5. This has the bonus effect of giving you more outlets, since house-builders *still* haven't figured out that two outlets every 20 feet (or whatever it is) is never enough. They really should be putting in quad outlets (rather than double), every 6 feet minimum, and every 3 feet in a kitchen. Until they do, power strips are an absolute necessity. The switches on those might not be the very best quality, but usually they're just fine, and if one goes bad you can get another strip for $5, so you might as well just stock up.

        • (Score: 1) by Scruffy Beard 2 on Saturday October 29 2016, @02:05PM

          by Scruffy Beard 2 (6030) on Saturday October 29 2016, @02:05PM (#420110)

          The code actually requires outlets every 12 ft. That way, 6ft appliance cords can reach without an extension cord. 6ft cords also prevent you from reaching a non-GFI protected outlet from inside the bathroom.

          As far as I can tell, legally, extension cords and power strips don't really exist. If you read the fine-print on the package for a power strip, you will note that it is for "temporary use only."

          • (Score: 2) by Grishnakh on Monday October 31 2016, @03:39PM

            by Grishnakh (2831) on Monday October 31 2016, @03:39PM (#420899)

            6ft cords also prevent you from reaching a non-GFI protected outlet from inside the bathroom.

            Not a problem in my house. The entire second floor is wired through a single GFCI outlet in my bedroom, including all the outlets in both bedrooms, all the bedroom lights, and the bathroom lights and outlets. The GFCI breaker does pop from time to time, leaving me and my housemates in darkness.

    • (Score: 2, Interesting) by GDX on Friday October 28 2016, @04:54AM

      by GDX (1950) on Friday October 28 2016, @04:54AM (#419729)

      Today it is possible to get sleep-mode power draw below 2 mW (20 mW if you add an LED) for a cost of less than $5, which doesn't add much to the price of most appliances. In practice the cost can go down to $1 if some custom hardware is developed.

      • (Score: 0) by Anonymous Coward on Friday October 28 2016, @07:55AM

        by Anonymous Coward on Friday October 28 2016, @07:55AM (#419764)

        It used to be possible to get down to zero with a simple power switch. On some devices, including PCs, the power button now sits where the power switch used to be. The only "advantage" for most people is that they get to hold the power button down for eight seconds whenever Windows freezes.

        Those of us who use Wake-On-Lan are the exception, not the norm.

  • (Score: 0) by Anonymous Coward on Thursday October 27 2016, @03:34PM

    by Anonymous Coward on Thursday October 27 2016, @03:34PM (#419436)

    Tabib-Azar and his team have devised a new kind of switch for electronic circuits that uses solid electrolytes such as copper sulfide to literally grow a wire between two electrodes when an electrical current passes through them, turning the switch on. When you reverse the polarity of the electrical current, then the metallic wire between the electrodes breaks down -- leaving a gap between them -- and the switch is turned off. A third electrode is used to control this process of growing and breaking down the wire.

    Wait, isn't this just a memristor technically?

    • (Score: 3, Funny) by bob_super on Thursday October 27 2016, @05:54PM

      by bob_super (1357) on Thursday October 27 2016, @05:54PM (#419502)

      It's just a fancy relay.
      I bet it doesn't even give you that satisfying click...

  • (Score: 2, Informative) by Anonymous Coward on Thursday October 27 2016, @03:44PM

    by Anonymous Coward on Thursday October 27 2016, @03:44PM (#419440)

    The United States has ~325 million people, so $19 billion annually represents about $60 per person per year.

    So if you throw away all your current electronic products and replace them with new products, you can expect to save Up To™ a whole $60 in just one year! What savings!
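As a quick sanity check of the arithmetic above (a sketch using the $19 billion and ~325 million figures cited in this thread):

```python
# Back-of-envelope: per-person cost of "vampire" power.
total_waste_usd = 19e9    # NRDC estimate quoted in the summary
population = 325e6        # approximate US population, as cited above
per_person = total_waste_usd / population
print(round(per_person))  # about $58/person/year, i.e. roughly "$60"
```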

    • (Score: 0) by Anonymous Coward on Thursday October 27 2016, @03:58PM

      by Anonymous Coward on Thursday October 27 2016, @03:58PM (#419448)

      WAIT, did you remember to account for the Amish in that calculation? If not, it may be as high as $61/year!

    • (Score: 0) by Anonymous Coward on Thursday October 27 2016, @04:05PM

      by Anonymous Coward on Thursday October 27 2016, @04:05PM (#419452)

      Or get adapters with a switch and, since we are at it, a fuse. I once was going to DJ at a fair (Plastikman FTWTF), I hooked up the adapter, had the fuse literally go bang. If there wasn't the bang, I would have hooked up the SL 1210s next...

    • (Score: 3, Informative) by Anonymous Coward on Thursday October 27 2016, @04:11PM

      by Anonymous Coward on Thursday October 27 2016, @04:11PM (#419456)

      So this is just a really bad introduction on the part of the article's authors.

      The actual process being researched has nothing whatsoever to do with so-called "vampire current".

      The researchers are talking about leakage current. This refers to the current that flows across a transistor when it is "switched off" (on an n-channel FET, the source-drain current when the gate voltage is low). On an ideal transistor this is 0, but real transistors are not perfect and the current is slightly more than 0. It is an efficiency loss for powered-on devices!
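To make the scale concrete, leakage loss is roughly the supply voltage times the summed off-state current. The numbers below are illustrative assumptions, not figures from the article:

```python
# Toy leakage-power estimate: P_leak = V_dd * I_leak_total.
v_dd = 1.0             # supply voltage in volts (assumed)
i_leak_per_fet = 1e-9  # 1 nA off-state leakage per transistor (assumed)
n_transistors = 1e9    # a billion-transistor chip (assumed)
p_leak_watts = v_dd * i_leak_per_fet * n_transistors
print(p_leak_watts)    # about 1 W lost while the chip is powered on
```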

    • (Score: 4, Interesting) by TheRaven on Thursday October 27 2016, @04:19PM

      by TheRaven (270) on Thursday October 27 2016, @04:19PM (#419459) Journal
      The number is quite a bit higher than I expected. It works out at $165/year/household. A rough rule of thumb is that one watt for something that's always on costs about $1/year. I find it a bit hard to believe that most households have 165W of this stuff. Most appliances consume 2-10W in standby, so that's somewhere between 16 and 80 appliances per household. Replacing even 16 would likely cost far more (even purely in terms of energy consumption) than leaving them on.
      --
      sudo mod me up
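The dollar-per-watt-year rule of thumb is easy to verify; a sketch assuming a residential rate of about $0.12/kWh (rates vary by region):

```python
# Annual cost of a 1 W always-on load at an assumed electricity rate.
watts = 1.0
hours_per_year = 24 * 365                      # 8760 h
kwh_per_year = watts * hours_per_year / 1000   # 8.76 kWh
rate_usd_per_kwh = 0.12                        # assumed average rate
cost = kwh_per_year * rate_usd_per_kwh
print(f"${cost:.2f}/year")                     # $1.05/year
```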
      • (Score: 3, Interesting) by Anonymous Coward on Thursday October 27 2016, @04:54PM

        by Anonymous Coward on Thursday October 27 2016, @04:54PM (#419475)
        I'm not worried about the 1W stuff. It's really insignificant in the big picture; your heating and cooling is where you should look for savings first (don't overdo the heating/cooling, and turn it off when nobody is around).

        HOWEVER, on the topic of always-on appliances, I think those set-top boxes/satellite/cable TV boxes use a lot more than 1W. If you put your hand on them you can tell they get rather hot. The internet/WiFi combo routers consume more than 1W too, but less than those set-top boxes. Basically, if it doesn't get very warm you don't really have to worry about it so much.

        If you're that worried about 1-watt devices you should also disable Flash, run ad blockers on your computer, use NoScript, etc. Some of those stupid ads can keep an entire CPU core occupied even when running in the background, and on my PC that can be an extra 10-25W. Nowadays it doesn't seem that bad - I haven't found a site that's as bad as before (I guess because Google Chrome and Firefox have started disabling Flash by default).

        Keep in mind that if you drive your car a bit more efficiently you could save a fair bit too...
        • (Score: 2) by VLM on Friday October 28 2016, @12:19PM

          by VLM (445) on Friday October 28 2016, @12:19PM (#419808)

          I think those set top boxes/satellite/cable TV stuff use a lot more than 1W.

          I know they used to, because I own a Kill-A-Watt meter, and all "power off" did on a '00s-era SD cable box was output a perfect black-screen NTSC signal to greenwash the customer into thinking "off" did anything. Not even a flicker on the Kill-A-Watt: 30 watts continuous, all day.

          Of course, given the pace of technological change, something like a MythTV frontend or a cable set-top box has gone from a "wattage is no object" gamer PC with multiple fans in '00, to a little thing that draws 15 watts using VDPAU in '10, to an essentially zero-power Raspberry Pi in '15.

          It's also worth pointing out that where I live the ratio of heating to cooling degree days is in excess of 20 to 1, so as a practical matter 60 watts of electricity isn't "wasted": it merely means a bit less natgas burned. And like many people (and seemingly all local retail businesses) I subscribe to 100% renewable energy, so for every kWh I use, the electric co buys a kWh from a local windmill or solar plant; it might be expensive, but the more electricity I "waste" the lower my carbon footprint becomes. Admittedly probably 90% of the population lives south of me / warmer than me, but you have to go surprisingly far south before annual heating and cooling degree days are equal.

          I guess my point is that often none of the watts are wasted, often there is no carbon footprint at all, and thanks to strong economic and technological pressures the wattage wasted STRONGLY tends toward zero.

          My 5-year-old Roku draws essentially zero, and I can't envision ever going back to the days of the 30-watt set-top box.

          Maybe the first Star Trek holodeck will draw 1500 watts. In fact I almost guarantee it. And probably 20 years later it'll be higher-res, with a hundred times the storage, and follow the usual trajectory to "a watt or less" with the working parts the size of a pack of cards. It's just not something to worry about.

      • (Score: 2) by LoRdTAW on Thursday October 27 2016, @06:16PM

        by LoRdTAW (3755) on Thursday October 27 2016, @06:16PM (#419512) Journal

        Modern dryers and washers have standby so the light turns on when you open the door
        Stoves and microwaves all stay on to display clocks and wait for user input
        Some toasters now have soft power buttons with LEDs
        Coffee makers with timers
        Phone chargers (though most in standby are in the mW-uW range)
        Cable boxes
        DVRs
        Media sticks like the Fire Stick
        IoT garbage
        Smart thermostats (see IoT garbage)
        and on and on.

        It's easy to see how you can quickly collect upward of 20-30 vampire gadgets without realizing it.

        • (Score: 3, Interesting) by Nuke on Thursday October 27 2016, @07:48PM

          by Nuke (3162) on Thursday October 27 2016, @07:48PM (#419546)

          Some of the stuff I have (cooker and microwave come to mind), if turned off, refuses to resume working unless you first go through an elaborate set-up procedure, including setting the date, time and default preferences. They are enough of a pain after the occasional power cut.

          • (Score: 2) by Grishnakh on Friday October 28 2016, @04:30PM

            by Grishnakh (2831) on Friday October 28 2016, @04:30PM (#419879)

            Why did you buy that microwave? You should have taken that back as soon as you found that out about it. Any decent microwave oven will just show 0s when the time isn't set (and the date shouldn't even apply to a microwave). Lots of people simply never bother to set the clocks on their microwaves.

        • (Score: 2) by pnkwarhall on Thursday October 27 2016, @11:36PM

          by pnkwarhall (4558) on Thursday October 27 2016, @11:36PM (#419622)

          Was given a Bunn coffee pot as a gift, and I thought it was great until I realized that it *always* keeps the water hot. Even at two pots-per-day, that seems like a lot of energy waste. What's great as the workhorse of a ship's coffee mess doesn't necessarily apply at home.

          Now I'm curious to know how much energy was spent each day keeping the water hot. It doesn't seem like it would be an insignificant amount.

          --
          Lift Yr Skinny Fists Like Antennas to Heaven
          • (Score: 2) by Geezer on Friday October 28 2016, @09:56AM

            by Geezer (511) on Friday October 28 2016, @09:56AM (#419781)

            We use a Bunn at our house. The thing to remember is that it's more efficient to achieve and maintain a stable brewing temperature than to heat from ambient on demand. Also, cold-start elements use much more power to brew quickly. The difference in total power consumed is actually very little. Fun science fair project for the kiddies: run a trace with a recording ammeter over 24 hours with the same number of pots brewed on a Bunn vs. a Mr. Coffee.

            • (Score: 0, Troll) by toddestan on Saturday October 29 2016, @12:57AM

              by toddestan (4982) on Saturday October 29 2016, @12:57AM (#420004)

              That doesn't make any sense. It uses the same amount of energy to heat the water to the brewing temperature no matter how you do it. If it heats it on-demand when it's needed, that's the amount of energy used. If it heats it beforehand then keeps it hot until you need it, that costs the same amount of energy to heat it initially, plus whatever it needs to keep it hot until it's needed. Which will equal the amount of heat lost from the reservoir, as it will never be perfectly insulated. The only possible benefit is that the on-demand coffee maker will draw a lot of current to heat the water quickly, whereas the always hot coffee maker can afford to draw less current as it can take longer to heat the water.

              Plus there are other considerations. Since it's heating all the time, even when no one's around, it's more likely to start a fire. And if the reservoir isn't perfectly sealed, the water can eventually boil off, and what then?
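The energy argument above can be sketched with rough numbers; everything here is an assumption for illustration (pot size, temperatures, standby loss), not a measurement of any actual Bunn or Mr. Coffee:

```python
# Heating a pot takes the same energy either way; the keep-it-hot
# design additionally pays for standby heat loss from the reservoir.
SPECIFIC_HEAT_WATER = 4186   # J/(kg*K)
pot_kg = 1.5                 # ~1.5 L of water per pot (assumed)
delta_t = 75                 # 20 C tap water to 95 C brew temp (assumed)
heat_per_pot_j = SPECIFIC_HEAT_WATER * pot_kg * delta_t  # ~471 kJ

standby_loss_w = 30          # assumed reservoir heat loss while idle
standby_j_per_day = standby_loss_w * 24 * 3600           # ~2.6 MJ

pots_per_day = 2
on_demand_j = pots_per_day * heat_per_pot_j              # heat only
keep_hot_j = on_demand_j + standby_j_per_day             # heat + standby
print(on_demand_j / 3.6e6, keep_hot_j / 3.6e6)  # ~0.26 vs ~0.98 kWh/day
```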

    • (Score: 1, Informative) by Anonymous Coward on Thursday October 27 2016, @04:50PM

      by Anonymous Coward on Thursday October 27 2016, @04:50PM (#419474)

      Bought a batch of little switches that plug in and have one outlet. Power vampire problem solved for wall warts and others.

      Some are plain, like this, http://www.adrprovita.com/Outlet_Shut_Off_Switch [adrprovita.com]
      Others have a light for devices that are not so obviously on or off, these are low priced a little over $2 each in qty. 6,
          http://www.ebay.com/itm/Lot-1-2-5-6-10-Wall-Tap-Power-Adapter-Plug-Outlet-ON-OFF-Switch-Unplugging-Cord-/122024749916?var=&hash=item1c693deb5c:m:m5KPDuVH8gCkprfFX85EBJQ [ebay.com]
      This one is a triple and has grounded outlets,
          http://www.ebay.com/itm/NEW-Triple-Plug-Outlet-Energy-Saving-On-Off-Switch-Kitchen-Office-Bathroom-Home-/181964227424?hash=item2a5de9c360:g:c2IAAOSwAKxWYmFN [ebay.com]

    • (Score: 0) by Anonymous Coward on Friday October 28 2016, @05:13AM

      by Anonymous Coward on Friday October 28 2016, @05:13AM (#419736)

      Or 600 dollars over 10 years. Hmmm more interesting.

      Over a lifetime, 4800 bucks. Yeah fuck that noise. Just toss that money away.

      Here is a tip on investing: slow and steady wins the race; there are no home runs.

  • (Score: 3, Interesting) by bradley13 on Thursday October 27 2016, @04:22PM

    by bradley13 (3053) on Thursday October 27 2016, @04:22PM (#419460) Homepage Journal

    The odd thing is: there is a huge variance among appliances. We have a couple of laser printers in our home office: one consumes almost nothing in sleep mode (hard to measure, probably 1-2 watts). The other consumes 50 watts (!) in sleep mode - we try to remember to turn that one off. Stupid design? A manufacturer that doesn't give a shit?

    It's the same damned thing for batteries. Swiss TV had a program a couple of weeks ago where a university lab tested a whole slew of devices. Some used the battery down to the dregs - leaving maybe 1% of its energy before demanding a new battery. Other devices used less than a third of the charge. Again: stupid design. Again, a manufacturer who doesn't give a shit.

    The average consumer never thinks to ask "how much power does this device use in sleep mode?", or "does this device actually use its batteries efficiently?". Government regulations requiring efficient power use seem to be the only solution. [wikipedia.org]

    --
    Everyone is somebody else's weirdo.
    • (Score: 0) by Anonymous Coward on Thursday October 27 2016, @04:54PM

      by Anonymous Coward on Thursday October 27 2016, @04:54PM (#419476)

      They thought I was nuts at the cable tv office, but I had them pull out all the different types of digital cable boxes they had. There were 3 or 4 different brands/models that all did the same thing. The power ratings on the sticker varied from something like 20 watts up to 70 watts. Who knows if these ratings are accurate (without a Kill-A-Watt), but I chose the low power Cisco, seems to work fine, and only gets a little warm.

    • (Score: 2) by MrGuy on Thursday October 27 2016, @05:15PM

      by MrGuy (1007) on Thursday October 27 2016, @05:15PM (#419489)

      The odd thing is: there is a huge variance among appliances. We have a couple of laser printers in our home office: one consumes almost nothing in sleep mode (hard to measure, probably 1-2 watts). The other consumes 50 watts (!) in sleep mode - we try to remember to turn that one off. Stupid design? A manufacturer that doesn't give a shit?

      There's a third option here, and that's convenience. Should we waste power to avoid wasting time? If so, how much is "worth it?"

      With your laser printer example: laser printers need a heated fuser unit to melt the toner onto the page, which consumes a lot of energy (relatively speaking). I'd be willing to bet one of your printers powers down the fuser when it sleeps and the other doesn't. If that's correct, the more energy-efficient one will be slower to print when coming out of sleep mode - it will have to heat the fuser back up - whereas the wasteful one can go right to printin'.

      And, when you're looking at making a consumer level device, maybe you're more worried about complaints like "Gawd, this thing takes FOREVER to print!" from customers who only print occasionally, than you are about comments about the power consumption in sleep mode.

      Another reason might be cost. You can usually get a lot more efficiency out of purpose-built circuitry that only does what you need it to than you can with general-purpose "off the shelf" components, but that's a potentially big investment. Also, even for pieces where you're working with pre-built components, it can often be the case that more power-efficient components are more costly than less efficient alternatives. Depending on what you're building, power-efficiency can be a big tax on your device.

      If you can build a device that costs $50, or a device that costs $90 and is more power efficient, and your competitors are selling similar devices at $60, which do you choose to produce? Also, how long does it take for the additional power costs to make the extra cost of the more expensive device "pay for itself"?

      That's not to say manufacturers always make long-term socially efficient decisions on either convenience or on cost - if my hypothetical lower-power $90 device saved you $30 a year in electricity, they really SHOULD make the more expensive one. But if people only look at the price, it's hard to succeed, even if the device is "better," because people aren't always rational.

      That's not to say poor design/engineering doesn't sometimes (often?) play a role. But there are reasons for poor energy efficiency that are more nuanced than "they're bad at designing."

    • (Score: 2) by Snotnose on Thursday October 27 2016, @05:38PM

      by Snotnose (1623) on Thursday October 27 2016, @05:38PM (#419497)

      I've got a Uverse DVR, it draws the same power turned off as it does turned on. This is a hugely stupid design decision.

      --
      When the dust settled America realized it was saved by a porn star.
      • (Score: 2) by bob_super on Thursday October 27 2016, @06:02PM

        by bob_super (1357) on Thursday October 27 2016, @06:02PM (#419504)

        The US consumer requires their TV to turn on right now.
        The broadcasters agree that it'd be a bad idea if consumers changed their mind about watching TV just because it took an extra 20 seconds to turn on and the phone beeped with yet another insipid tweet.

        So we waste power, because we need to optimize advertising delivery.

  • (Score: 3, Interesting) by RamiK on Thursday October 27 2016, @04:29PM

    by RamiK (1813) on Thursday October 27 2016, @04:29PM (#419463)

    TLDR: If you don't force the supply factories to use better designs and components through regulation, they'll never compete on quality and prices won't go down for higher efficiency supplies.

    When it comes to logic components, there are plenty of incentives for fabs and consumers to bring TDP down and battery life up.

    However, when it comes to on-grid power supplies, consumers don't notice the difference, so the factories keep producing inefficient PSUs using low-cost components. This affects everything from small chargers ( http://www.righto.com/2012/10/a-dozen-usb-chargers-in-lab-apple-is.html [righto.com] ) to the power supplies you'd find in appliances (e.g. the bronze \ silver \ gold rating in PSUs, and whatever your fridge is using).

    The only thing that works there is forcing the manufacturers to use higher-quality components in ALL their products by regulating minimum efficiency ratings. Sometimes that comes from government, like the EPA's Energy Star. Sometimes the OEMs apply preemptive pressure, like 2007's ATX12V v2.3 PSUs when the EPA \ FCC threatened to intervene. Regardless, if the government were to ban all supplies under 75% efficiency, the industry would be forced to abandon inefficient designs and parts. Eventually the costs would even go down, since production times and bills of materials are roughly the same once it's scaled.

    --
    compiling...
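To put rough numbers on a minimum-efficiency rule like the one suggested above (the 100 W load is an arbitrary assumption):

```python
# Wasted input power for a fixed DC-side load at two supply efficiencies.
load_w = 100.0                       # assumed device draw on the DC side
for eff in (0.75, 0.90):
    input_w = load_w / eff           # AC power drawn from the wall
    waste_w = input_w - load_w       # dissipated in the supply itself
    print(f"{eff:.0%} efficient: {waste_w:.1f} W wasted")
# 75% efficient: 33.3 W wasted; 90% efficient: 11.1 W wasted
```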
  • (Score: 2) by cellocgw on Thursday October 27 2016, @11:42PM

    by cellocgw (4190) on Thursday October 27 2016, @11:42PM (#419625)

    Turn off those all-night illuminated stores in just a couple malls -- and kill the damn parking lot lights once the cleaning crew goes home -- and you've saved far more megajoules than a zillion (statistically made-up number) homes.

    --
    Physicist, cellist, former OTTer (1190) resume: https://app.box.com/witthoftresume
    • (Score: 2) by Grishnakh on Friday October 28 2016, @04:40PM

      by Grishnakh (2831) on Friday October 28 2016, @04:40PM (#419882)

      That would definitely help, though those are private property so it's not that easy to force correct behavior. However, another good place to save energy and taxpayer money is with street lighting; we don't really need the streets lit up like it's daytime; that's why cars have headlights. It's bad for the environment and wildlife, it's bad for making light pollution, it's bad for energy efficiency, and it's bad for the taxpayer since those lights are paid by taxes. Turn them off and remove them (at least in many places; I'm not saying every single street light is a bad idea, but they're massively overused). The only problem here is all the morons who'll scream about crime rates, even though there's zero evidence that street lights reduce crime.

      • (Score: 1) by Scruffy Beard 2 on Saturday October 29 2016, @02:10PM

        by Scruffy Beard 2 (6030) on Saturday October 29 2016, @02:10PM (#420112)

        My local municipality is slowly replacing old street lights with lower-power, more properly focused LED ones. (The colour reproduction of 'white' vs. amber is actually better.)

        After initial funding, they pay for the program through power savings.

        • (Score: 2) by Grishnakh on Monday October 31 2016, @03:42PM

          by Grishnakh (2831) on Monday October 31 2016, @03:42PM (#420900)

          They could save a lot more money by simply removing the lights altogether.