posted by martyb on Thursday May 16 2019, @08:52PM
from the live-long-and-prosper? dept.

Phys.org:

Today's switching power supplies are lightweight and compact, but they are also failure-prone because of their built-in electrolytic capacitors. Film capacitors would have much longer service lives, but they need up to ten times more space. Scientists at KIT's[*] Light Technology Institute (LTI) have now developed a digital control method that allows film capacitors to be used while needing only slightly more space.

The control method runs on a microprocessor integrated into the supply unit and detects disturbing environmental influences, so that, for example, larger voltage fluctuations can be compensated. Hence, storage capacitors of smaller capacitance are sufficient. Michael Heidinger of the LTI summarizes the advantages: "Use of these film capacitors eliminates the main cause of power supply failure, i.e. electrolytic capacitors. Depending on the design, service life may be increased by a factor of up to three." The result is much-reduced maintenance expenditure.

This one is digital.

[*] Karlsruhe Institute of Technology.
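The idea of substituting digital control for capacitance can be sketched in simulation. The following is a hypothetical illustration, not the actual KIT/LTI algorithm: a discrete PI controller regulates the voltage across a deliberately small storage capacitor despite 100 Hz ripple on the input current, where an uncompensated supply of the same capacitance would drift far out of spec. All component values and gains here are invented for the example.

```python
import math

# Hypothetical illustration (not the KIT/LTI method): active digital
# compensation lets a small storage capacitor hold its voltage within
# spec despite input ripple.
C = 100e-6            # storage capacitance [F] -- deliberately small
V_REF = 12.0          # target output voltage [V]
I_LOAD = 1.0          # constant load current [A]
DT = 10e-6            # control-loop period [s]
KP, KI = 5.0, 2000.0  # PI gains (chosen for illustration)

def simulate(closed_loop: bool, steps: int = 100_000) -> float:
    """Return the worst-case deviation from V_REF over a 1 s run."""
    v = V_REF
    integral = 0.0
    worst = 0.0
    for n in range(steps):
        t = n * DT
        # 100 Hz ripple on the available input current
        disturbance = 0.4 * math.sin(2 * math.pi * 100 * t)
        if closed_loop:
            err = V_REF - v
            integral += err * DT
            i_in = I_LOAD + KP * err + KI * integral
        else:
            i_in = I_LOAD  # naive: just supply the nominal load current
        i_in += disturbance
        v += (i_in - I_LOAD) * DT / C  # capacitor charge balance
        worst = max(worst, abs(v - V_REF))
    return worst

print(f"open-loop worst deviation:   {simulate(False):.3f} V")
print(f"closed-loop worst deviation: {simulate(True):.3f} V")
```

With this small capacitance the uncompensated supply swings by several volts, while the controlled one stays within tens of millivolts; the alternative way to get the same ripple would be a much larger capacitor, which is the trade-off the article describes.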


Original Submission

 
  • (Score: 2, Informative) by Acabatag on Friday May 17 2019, @12:27AM (4 children)

    by Acabatag (2885) on Friday May 17 2019, @12:27AM (#844542)

    Let's add hundreds of thousands of transistors, hopefully
    with something other than flash memory for the program (which
    rots over time), to replace a 10 cent part.

    You know, there are ten cent microprocessors out there.

  • (Score: 2) by exaeta on Saturday May 18 2019, @04:28PM (3 children)

    by exaeta (6957) on Saturday May 18 2019, @04:28PM (#845057) Homepage Journal
    I don't know if you'd call them microprocessors rather than microcontrollers. The difference is that a microprocessor supports multiple "rings of execution" and so can host an operating system. The cheapest one I found on Mouser was the Z84C2006PEG at around $6. A microcontroller needs no operating system and can be much cheaper, around $0.33 for an ATTINY. I guess you can get them a bit cheaper depending on where you source them, but $0.10 is still a bit low. And that still only gets you an 8-bit microcontroller.
    --
    The Government is a Bird
    • (Score: 2) by Acabatag on Saturday May 25 2019, @03:03AM (2 children)

      by Acabatag (2885) on Saturday May 25 2019, @03:03AM (#847476)

      Why would a microprocessor -oops, you want to call it a microcontroller- embedded in a power supply run an operating system? Are you uncomfortable with the reset vector?

      • (Score: 2) by RS3 on Saturday May 25 2019, @03:53PM

        by RS3 (6367) on Saturday May 25 2019, @03:53PM (#847629)

        I was going to make a wisecrack, but then I remembered TFA talks about IoT, which can still be done OS-less, but knowing most businesses' habits, there would be minimal or no security.

        Which leads me to think that IoT stuff should have some kind of central controller that would be smart and secure, intelligently limiting access, etc. The idea would be that each IoT device would not speak IP directly, only the central controller's protocol.

        You bring up an excellent point: OS or just straight code? Or maybe FreeDOS? Or QNX or VxWorks? The decision depends on the application. At one end of the scale or the other the choice will be obvious.

        Another factor is developer's time and cost. With CPUs getting cheaper and more powerful, it might make sense to use an OS if the response time ("determinism") is within necessary specs, and if you need existing libraries, OS and programming language functions, drivers and modules.

        The good thing about writing all of the code yourself (no 3rd-party OS) is you know what's there. If there are bugs, you wrote them. If an OS has bugs or quirks, you might not find out until a later panic.

        Some years ago I had to write some specialized BASIC code that ran in a PLC module. "Real time" is relative to the needs, right? Anyway, I got the code working, except it was much too slow for the process. After much testing, I discovered that every BASIC control transfer caused the (very stupid) interpreter to start at line 0 and scan until it found the target line. Every.Time.Through. Wha??? So all init code went at the end, and everything had to be prioritized by response time, loop counts, etc. I.e., no compiler optimizations: I had to do my own loop unrolls, inlining, etc., but in limited RAM. I got it all working with tons of time to spare, but if I hadn't discovered that quirk, it never would have worked. I would have much preferred an RTOS and writing in C.
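        The quirk described above can be modeled in a few lines. This is a toy sketch (hypothetical; the line numbers and programs are invented, not the actual PLC code): a GOTO costs one comparison per source line scanned from the top, so moving the hot loop ahead of the init code cuts the per-iteration cost.

        ```python
        # Toy model of a line-scanning BASIC interpreter: resolving a
        # GOTO means scanning the program from the first line until the
        # target line number is found, so cost grows with the target's
        # position in the source.

        def scan_cost(program, target):
            """Lines examined to resolve one GOTO to `target`."""
            for cost, line_no in enumerate(program, start=1):
                if line_no == target:
                    return cost
            raise ValueError("no such line")

        # Program A: init code first (lines 10-40), hot loop at the end.
        init_first = [10, 20, 30, 40, 50, 60]   # hot loop: lines 50-60
        # Program B: hot loop first, init moved to the end.
        loop_first = [50, 60, 10, 20, 30, 40]

        ITERATIONS = 10_000  # the hot loop jumps back to line 50 each pass
        print("init first:", scan_cost(init_first, 50) * ITERATIONS, "lines scanned")
        print("loop first:", scan_cost(loop_first, 50) * ITERATIONS, "lines scanned")
        ```

        In this toy case reordering cuts the scan work per loop iteration from 5 lines to 1; in a real program with hundreds of init lines the difference is what makes or breaks the timing budget.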

      • (Score: 2) by exaeta on Saturday May 25 2019, @05:09PM

        by exaeta (6957) on Saturday May 25 2019, @05:09PM (#847659) Homepage Journal
        They don't, which is why we'd use a microcontroller instead of a more expensive microprocessor. :P
        --
        The Government is a Bird