

posted by janrinok on Saturday April 20 2019, @11:10AM   Printer-friendly
from the schschschschschschschsch dept.

Famed hardware hacker Bunnie Huang has posted an overview of his notes on designing an open source entropy generator. His summary links to the full notes which include schematics, measurement results, as well as other key details.

The final optimized design takes <1 cm² area and draws 520 µA at 3.3 V when active and 12 µA in standby (mostly 1.8 V LDO leakage for the output stage, included in the measurement but normally provided by the system), and it passes preliminary functional tests from 2.8-4.4 V and 0-80 °C. The output levels target a 0-1 V swing, meant to be sampled using an on-chip ADC from a companion MCU, but one could add a comparator and turn it into a digital-compatible bitstream I suppose. I opted to use an actual diode instead of an NPN B-E junction, because the noise quality is empirically better and anecdotes on the Internet claim that NPN B-E junctions fail over time when operated as noise sources. I'll probably go through another iteration of tweaking before final integration, but afaik this is the smallest, lowest-power open-source avalanche noise generator to date (slightly smaller than this one [PDF]).
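For readers curious what the comparator-to-bitstream step might look like downstream of the ADC, here is a rough Python sketch. The midpoint threshold and the von Neumann debiasing pass are illustrative assumptions, not part of bunnie's design:

```python
# Hypothetical sketch: turn ADC samples of the noise output (0-1 V swing)
# into a debiased bitstream. Threshold at the midpoint, then apply
# von Neumann debiasing to cancel any constant bias in the raw bits.
def raw_bits(samples, threshold=0.5):
    """Threshold analog samples (in volts) into raw 0/1 bits."""
    return [1 if s > threshold else 0 for s in samples]

def von_neumann(bits):
    """Debias non-overlapping pairs: 01 -> 0, 10 -> 1, 00/11 discarded."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

samples = [0.12, 0.87, 0.91, 0.33, 0.64, 0.05, 0.77, 0.48]
print(von_neumann(raw_bits(samples)))  # → [0, 1, 1, 1]
```

The debiasing step throws away a little more than half the raw bits, which is why hardware sources are typically oversampled relative to the rate of random bits actually needed.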


Original Submission

 
  • (Score: 2, Interesting) by Rupert Pupnick on Saturday April 20 2019, @01:14PM (8 children)

    by Rupert Pupnick (7277) on Saturday April 20 2019, @01:14PM (#832545) Journal

    Nice circuit design, but I've never heard the term "entropy" used in hardware engineering, including in applications such as broadband noise generators for labs or RNGs. Isn't entropy a thermodynamic or statistical mechanics concept rather than a circuit function?

  • (Score: 1, Informative) by Anonymous Coward on Saturday April 20 2019, @02:18PM

    by Anonymous Coward on Saturday April 20 2019, @02:18PM (#832576)
    Entropy is a common way of discussing how random or noisy information is. Wikipedia on "Entropy (information theory)" [wikipedia.org]
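Shannon entropy in that information-theoretic sense is easy to compute; a small sketch:

```python
# Sketch: Shannon entropy of a symbol stream, in bits per symbol.
# This is the sense of "entropy" used when rating randomness sources.
from collections import Counter
from math import log2

def shannon_entropy(data):
    counts = Counter(data)
    n = len(data)
    # H = sum over symbols of p * log2(1/p)
    return sum((c / n) * log2(n / c) for c in counts.values())

print(shannon_entropy("00000000"))  # 0.0 — perfectly predictable
print(shannon_entropy("01010101"))  # 1.0 — one full bit per symbol
```

Note this measures only symbol frequencies, not patterns: "01010101" scores a full bit per symbol despite being trivially predictable, which is why real entropy-source assessments use much stronger tests.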
  • (Score: 1, Informative) by Anonymous Coward on Saturday April 20 2019, @02:30PM (1 child)

    by Anonymous Coward on Saturday April 20 2019, @02:30PM (#832579)

    There is a concern: the external NOISE_ON signal is directly coupled onto the input of the amplifier. Any digital noise in it might exceed the noise of the diode.

    Has the author looked for the switcher frequency in the spectrum of the noise? It might be strong enough to matter, yet still invisible on a scope.

    The 5.1M divider is not practical due to leakage; you'd have to add a protective guard trace and conformal-coat the board. It's easier to add two more resistors and feed the divider from a lower, cleaner voltage. Myself, I'd use a voltage reference as the bias source, since that voltage has to be extremely clean.

    The +1.8V rail must be linearly regulated within the device and feed only these parts. The required regulator in a small SOT package won't hurt.
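The leakage concern above can be put in rough numbers. A sketch with hypothetical values (the 5.1M figure is from the comment; the supply voltage and leakage current are illustrative guesses):

```python
# Rough sketch (hypothetical values): how surface leakage shifts the bias
# point of a high-impedance divider. A 5.1M / 5.1M divider has a Thevenin
# resistance of ~2.55M, so even a few nA of leakage moves the node
# by tens of millivolts.
r_top = 5.1e6      # ohms
r_bot = 5.1e6      # ohms
v_supply = 12.0    # volts (illustrative bias supply)

r_thevenin = (r_top * r_bot) / (r_top + r_bot)
v_nominal = v_supply * r_bot / (r_top + r_bot)

i_leak = 5e-9      # 5 nA of board leakage pulling on the node
v_shift = i_leak * r_thevenin
print(f"Nominal bias: {v_nominal:.2f} V, "
      f"shift from 5 nA leakage: {v_shift * 1e3:.1f} mV")
```

A lower-impedance divider fed from a cleaned rail, or a voltage reference as suggested, shrinks that Thevenin resistance and hence the sensitivity to leakage.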

    • (Score: 2) by RS3 on Saturday April 20 2019, @04:13PM

      by RS3 (6367) on Saturday April 20 2019, @04:13PM (#832614)

      If there is noise in the NOISE_ON signal, it can simply be filtered with another resistor and a capacitor.

      If by "switcher frequency" you mean the power supply, I agree: it could be a significant factor. It needs to be housed in a metal can (shielded), perhaps with two stages (poles) of power-supply filtering (two resistors and two capacitors), and/or located far away, with good ground-plane PC board layout techniques throughout.
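Sizing the RC filters suggested above is quick; a sketch with hypothetical component values (not taken from the actual design):

```python
# Sketch (hypothetical values): corner frequency of a single RC low-pass
# stage, f_c = 1 / (2 * pi * R * C). Two identical cascaded stages roll
# off at roughly -40 dB/decade well above the corner.
from math import pi

def rc_cutoff_hz(r_ohms, c_farads):
    return 1.0 / (2.0 * pi * r_ohms * c_farads)

# e.g. 10 kOhm and 1 uF per stage
fc = rc_cutoff_hz(10e3, 1e-6)
print(f"Single-stage corner: {fc:.1f} Hz")
```

With a corner around 16 Hz, a switcher running at hundreds of kHz would be attenuated by many decades through two such stages, at the cost of some series resistance in the supply path.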

  • (Score: 4, Informative) by rigrig on Saturday April 20 2019, @02:31PM (4 children)

    by rigrig (5129) Subscriber Badge <soylentnews@tubul.net> on Saturday April 20 2019, @02:31PM (#832580) Homepage

    Isn’t entropy really a thermodynamic or statistical mechanics concept

    It is [wikipedia.org], but then the cryptographers [wikipedia.org] needed a nice sounding word for randomness [wikipedia.org].

    --
    No one remembers the singer.
    • (Score: 1) by Rupert Pupnick on Saturday April 20 2019, @02:46PM (3 children)

      by Rupert Pupnick (7277) on Saturday April 20 2019, @02:46PM (#832583) Journal

      Thanks, that’s what I thought. As if terminology in cryptography weren’t sexy enough. 🙄

      Is there any expectation as to what the probability distribution function of the output should be? Is it supposed to be flat over the 0 - 0.999 normalized range? Is Gaussian OK if you do the output scaling appropriately?

      Is the raw random output variable supposed to be the time interval between the sawtooth peaks on that scope trace?

      • (Score: 2) by RS3 on Saturday April 20 2019, @03:58PM (2 children)

        by RS3 (6367) on Saturday April 20 2019, @03:58PM (#832606)

        I think the "Noise = Entropy" thing came from extending the use of "entropy" from simple thermodynamics to very large-scale systems, all the way out to the entire universe. (Notice I started that sentence with "I think".)

        > Is the raw random output variable supposed to be the time interval between the sawtooth peaks on that scope trace?

        That's one thing to look at. Another is the amplitude (height) variation. Another is the slope.

        Speaking numerically (digitally), it all depends on how it's sampled: the "bitness" of the sampler, i.e., its resolution. If you have a low-bit ADC, you'll miss many of the smaller peaks and valleys, regardless of the sample rate.
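That quantization effect is easy to demonstrate; a small sketch (values are illustrative):

```python
# Sketch: uniform quantization at different bit depths, showing how a
# low-bit ADC flattens small peaks and valleys regardless of sample rate.
def quantize(sample, bits, full_scale=1.0):
    """Quantize a 0..full_scale sample onto 2**bits uniform levels."""
    levels = 2 ** bits
    step = full_scale / levels
    code = min(int(sample / step), levels - 1)
    return code * step

wiggle = [0.50, 0.52, 0.51, 0.53]        # small peaks and valleys
print([quantize(s, 2) for s in wiggle])  # 2-bit: all collapse to 0.5
print([quantize(s, 8) for s in wiggle])  # 8-bit: the variation survives
```

At 2 bits the whole wiggle falls inside one 0.25 V-wide code bin and vanishes; at 8 bits each sample lands in a distinct bin.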

        (An aside: To get high bit counts without building complicated ADCs, some brilliant EEs and mathematicians came up with "delta-sigma" or "1-bit sampling" where we just look at the difference between 2 successive sample points: is the next one higher or lower? To do this we have to "oversample" by a large margin and then do filtering and processing algorithmically, but it works. And we lose DC reference. I know of commercial products which use low-quantization bit + oversampling. To get DC, they interrupt themselves regularly to recalibrate to a DC reference. They use 2 channels of ADC so you don't lose any signal- one continues sampling while the other is resetting / recalibrating.)
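The "is the next one higher or lower?" scheme in the aside can be sketched as simple 1-bit delta modulation (a simplification of real delta-sigma converters, with an illustrative step size):

```python
# Sketch of 1-bit delta modulation as described above: each output bit
# says only whether the tracked estimate steps up or down. The decoder
# integrates the bitstream; heavy oversampling keeps the step error small.
def delta_encode(samples, step=0.05):
    estimate, bits = 0.0, []
    for s in samples:
        if s > estimate:
            bits.append(1)
            estimate += step
        else:
            bits.append(0)
            estimate -= step
    return bits

def delta_decode(bits, step=0.05):
    estimate, out = 0.0, []
    for b in bits:
        estimate += step if b else -step
        out.append(estimate)
    return out

ramp = [0.0, 0.1, 0.2, 0.3, 0.4]
bits = delta_encode(ramp)
print(bits)                # → [0, 1, 1, 1, 1]
print(delta_decode(bits))  # estimate chases the ramp, one step behind
```

Note the DC ambiguity the comment mentions: the decoder only ever knows differences, so any constant offset in the input is invisible without periodic recalibration against a reference.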

        https://en.wikipedia.org/wiki/Quantization_(signal_processing) [wikipedia.org]

        Sample rate is another discussion. Most of you probably know about the "Nyquist" rate, which says you must sample at at least 2X the highest frequency in the signal or you'll get "aliasing". True noise contains all frequencies, so we have to give up some of our randomness by limiting the bandwidth.
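The aliasing effect mentioned above can be computed directly: a tone above half the sample rate folds back to a lower apparent frequency. A sketch with illustrative frequencies:

```python
# Sketch: where a tone lands after sampling. A signal at f_signal sampled
# at f_sample appears at the alias frequency |f - round(f / fs) * fs|;
# below the Nyquist limit (fs / 2) it is unchanged.
def alias_frequency(f_signal, f_sample):
    return abs(f_signal - round(f_signal / f_sample) * f_sample)

print(alias_frequency(7000, 10000))  # 7 kHz at 10 kS/s folds to 3 kHz
print(alias_frequency(3000, 10000))  # 3 kHz is below Nyquist: unchanged
```

For a noise source this folding isn't fatal by itself, but it lets out-of-band interference (like a switcher harmonic) masquerade as in-band "randomness", which is one reason to band-limit before the ADC.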

        • (Score: 2, Interesting) by Rupert Pupnick on Tuesday April 23 2019, @02:10PM (1 child)

          by Rupert Pupnick (7277) on Tuesday April 23 2019, @02:10PM (#833857) Journal

          Thanks, but I still really dislike the use of the term “entropy” in this context, especially when the term “randomness” is much more clear, simple, and descriptive.

          In science literature, entropy is frequently described as a measure of the degree of disorder in a system, but even that I think is unsatisfactory and misleading. I think of a high-entropy system as being highly uniform: there's no place where there are any gradients or structures that can store any potential energy or information. The ultimate example, as you say, is the heat death of the universe.

          Let’s get the term “entropy” out of circuit design— or at least keep it out of the terminology!

          • (Score: 2) by RS3 on Tuesday April 23 2019, @03:07PM

            by RS3 (6367) on Tuesday April 23 2019, @03:07PM (#833872)

            You have my vote! I don't remember hearing the word "entropy" until taking thermodynamics in college. I did fairly well (in spite of most people complaining about how hard it was waaaa) and it stuck with me, including the concepts and terms. I'll have to dig out my old college textbook and see what the definition was (late '80s). Don't hold your breath though. A good NASA reference: https://www.grc.nasa.gov/www/k-12/airplane/enthalpy.html [nasa.gov]

            Some definition / usage: https://en.wikipedia.org/wiki/Entropy [wikipedia.org] I think I see the problem. In statistics (remember that?) they use Greek capital omega, and in EE we use omega for resistor Ohms, and resistors are noisy, hence the connection and confusion. (I jest but there may be a connection, even if it was made and promulgated by non-technical-types.)

            And some good etymology: http://www.eoht.info/page/Entropy+(etymology) [eoht.info]

            Bottom line: It seems that you and I and some others have learned "entropy" in a very specific context. At one of my first jobs I had to program (configure really) "protocol converters" (data communication boxes) and that was my only exposure to the word "protocol". I got confused when I heard it used in common language, until I looked up the dictionary definition. I love learning, but some days it's tiring.