
posted by janrinok on Saturday April 20 2019, @11:10AM   Printer-friendly
from the schschschschschschschsch dept.

Famed hardware hacker Bunnie Huang has posted an overview of his notes on designing an open-source entropy generator. His summary links to the full notes, which include schematics, measurement results, and other key details.

The final optimized design takes <1cm2 area and draws 520uA at 3.3V when active and 12uA in standby (mostly 1.8V LDO leakage for the output stage, included in the measurement but normally provided by the system), and it passes preliminary functional tests from 2.8-4.4V and 0-80C. The output levels target a 0-1V swing, meant to be sampled using an on-chip ADC from a companion MCU, but one could add a comparator and turn it into a digital-compatible bitstream I suppose. I opted to use an actual diode instead of a NPN B-E junction, because the noise quality is empirically better and anecdotes on the Internet claim the NPN B-E junctions fail over time when operated as noise sources. I'll probably go through another iteration of tweaking before final integration, but afaik this is the smallest, lowest power open-source avalanche noise generator to date (slightly smaller than this one [PDF]).


Original Submission

  • (Score: 1, Informative) by Anonymous Coward on Saturday April 20 2019, @01:13PM (12 children)

    by Anonymous Coward on Saturday April 20 2019, @01:13PM (#832544)

    https://betrusted.io/avalanche-noise [betrusted.io]

    This project is probably better posted to Hackaday, but while I have people's attention I'd like to ask your opinion on the comment posted on the page at the OP's link.

    Do you believe that randomness can be 'improved' algorithmically, whether by randomly selecting among random generators as that comment suggests, or by any other means? In other words, is there such a thing as 'better than ordinary' randomness?

    • (Score: 2) by canopic jug on Saturday April 20 2019, @01:55PM (3 children)

      by canopic jug (3949) Subscriber Badge on Saturday April 20 2019, @01:55PM (#832568) Journal

      This project is probably better posted to Hackaday ...

      Those who are more interested can read his summary and follow the link from there, at least while linking is allowed. That's why only his summary of his notes was linked from here. That is about the right level for this site, especially since randomness is sought after wherever it can be found [smbc-comics.com] for many software activities.

      Anyway, my maths skills are weak, but from what I've read, algorithms cannot increase the randomness of real noise, and will actually introduce predictability at some level when overlaid on a real source.

      --
      Money is not free speech. Elections should not be auctions.
      • (Score: 3, Funny) by driverless on Sunday April 21 2019, @03:28AM (2 children)

        by driverless (4770) on Sunday April 21 2019, @03:28AM (#832840)

        I also designed an avalanche noise generator many years ago. Problem was it only worked when there was a lot of snow, so the design was rejected by the patent office.

        • (Score: 2) by JNCF on Sunday April 21 2019, @08:17PM (1 child)

          by JNCF (4317) on Sunday April 21 2019, @08:17PM (#833099) Journal

          I feel like the inventor of the landslide noise generator had a valid prior art claim on you, anyhow.

          • (Score: 2) by driverless on Tuesday April 23 2019, @03:19AM

            by driverless (4770) on Tuesday April 23 2019, @03:19AM (#833687)

            I did cite him as prior art, but the patent has long since expired so it wasn't really a problem.

    • (Score: 4, Insightful) by janrinok on Saturday April 20 2019, @02:22PM (2 children)

      by janrinok (52) Subscriber Badge on Saturday April 20 2019, @02:22PM (#832577) Journal

      This project is probably better posted to Hackaday,

      This project could have been posted to Hackaday, but seeing that we do, from time to time, publish stories from Hackaday, the topic is perfectly at home here as well. There are so many uses for good-quality randomly generated values in STEM that a reasonable proportion of our community will be interested in this story. That may or may not equate to lots of comments; however, if we relied entirely on comment count, this site would consist only of stories where the US members of our community throw political mud at each other. ...And that is most certainly not what this site is about.

      • (Score: 3, Funny) by RS3 on Saturday April 20 2019, @02:56PM

        by RS3 (6367) on Saturday April 20 2019, @02:56PM (#832589)

        This is awesome. Now we can throw electrified mud! That'll leave a mark for sure!

      • (Score: -1, Troll) by Anonymous Coward on Saturday April 20 2019, @10:20PM

        by Anonymous Coward on Saturday April 20 2019, @10:20PM (#832738)

        If Hillary had a better random number generator, her emails would still be secure.

    • (Score: 0) by Anonymous Coward on Saturday April 20 2019, @02:57PM

      by Anonymous Coward on Saturday April 20 2019, @02:57PM (#832590)

      For a true RNG, maybe not, but for a PRNG there could be a chance[0].

      [0]: Increasing the quality of pseudo-random number generator based on fuzzy logic [iop.org]

    • (Score: 2) by anotherblackhat on Saturday April 20 2019, @04:02PM

      by anotherblackhat (4722) on Saturday April 20 2019, @04:02PM (#832610)

      do you believe that randomness can be 'improved' algorithmically ...

      "Randomness" is not well defined.
      If you pick any particular metric, then you can probably find an improvement for that metric.
      Can you make a biased sequence less biased? Sure.
      Can you make a predictable sequence less predictable? Sure.
      However, for most of the metrics we measure there's a limit.
      A sequence can at best be 100% unbiased, and most are already 99.999+% unbiased.

      And sometimes we have conflicting goals for "randomness":
      Can you make a sequence more repeatable? Sure.
      Can you make a sequence less repeatable? Sure.
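      For the "less biased" case, the classic trick is the von Neumann extractor: read the bits in pairs, output 0 for a 01 pair, 1 for a 10 pair, and discard 00 and 11 pairs. A toy sketch in Python (the biased source is simulated here, not taken from any real hardware):

```python
import random

def von_neumann_extract(bits):
    """Debias a bit sequence: 01 -> 0, 10 -> 1, equal pairs are discarded.
    The output is unbiased if the input bits are independent with a fixed bias."""
    out = []
    for i in range(0, len(bits) - 1, 2):
        a, b = bits[i], bits[i + 1]
        if a != b:
            out.append(a)   # '10' contributes 1, '01' contributes 0
    return out

# A heavily biased simulated source (~80% ones)
biased = [1 if random.random() < 0.8 else 0 for _ in range(100_000)]
unbiased = von_neumann_extract(biased)
print(sum(biased) / len(biased), sum(unbiased) / len(unbiased))  # ~0.80 vs ~0.50
```

      The cost is throughput: only about 2·p·(1-p) of the input pairs survive, so a heavily biased source gets thinned out a lot.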

    • (Score: 0) by Anonymous Coward on Saturday April 20 2019, @06:10PM

      by Anonymous Coward on Saturday April 20 2019, @06:10PM (#832650)

      Thanks for both links, guys!

      'Twas a trick question... but hey, I didn't know that one could fuzz PRNGs to increase entropy -- which, now that I've written it out, makes sense of course.

      To the other person wondering about shaped randomness: random is an absolute, like infinity (at least in my mind). If you need it to have a shape, you filter it, which means you multiply it by the curve you want it to conform to.
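      To make that "filter it / multiply it by a curve" idea concrete, here is a rough sketch (NumPy, purely illustrative) that shapes flat white noise by multiplying its spectrum by a chosen curve, in this case a 1/f-ish "pink" slope:

```python
import numpy as np

rng = np.random.default_rng()
white = rng.standard_normal(1 << 16)        # flat-spectrum ("white") noise

# Desired spectral shape: amplitude ~ 1/sqrt(f), i.e. ~1/f in power ("pink")
spectrum = np.fft.rfft(white)
freqs = np.fft.rfftfreq(white.size, d=1.0)  # d = sample period, arbitrary here
shape = np.ones_like(freqs)
shape[1:] = 1.0 / np.sqrt(freqs[1:])        # leave the DC bin alone
pink = np.fft.irfft(spectrum * shape, n=white.size)
```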

    • (Score: 2) by epitaxial on Saturday April 20 2019, @10:04PM

      by epitaxial (3165) on Saturday April 20 2019, @10:04PM (#832732)

      I always thought having a geiger tube triggering on background radiation would be part of a good entropy source.
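      It has been done that way; the usual trick is to compare the lengths of successive inter-arrival intervals. A sketch with simulated Poisson pulse times standing in for a real Geiger tube:

```python
import random

def bits_from_intervals(timestamps):
    """Take inter-arrival gaps in pairs and emit 1 if the first gap is longer,
    0 if shorter, skipping (practically impossible) ties. For independent,
    identically distributed gaps each bit is unbiased."""
    gaps = [t2 - t1 for t1, t2 in zip(timestamps, timestamps[1:])]
    bits = []
    for g1, g2 in zip(gaps[0::2], gaps[1::2]):
        if g1 != g2:
            bits.append(1 if g1 > g2 else 0)
    return bits

# Simulated background radiation: exponentially distributed inter-arrival times
t, times = 0.0, []
for _ in range(10_000):
    t += random.expovariate(1.0)   # mean of one count per unit time
    times.append(t)
print(len(bits_from_intervals(times)), "bits extracted")
```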

    • (Score: 1) by khallow on Sunday April 21 2019, @03:13AM

      by khallow (3766) Subscriber Badge on Sunday April 21 2019, @03:13AM (#832836) Journal

      Do you believe that randomness can be 'improved' algorithmically, whether by randomly selecting among random generators as that comment suggests, or by any other means? In other words, is there such a thing as 'better than ordinary' randomness?

      Yes, since the generator isn't optimal. But there is a hard upper limit to how much entropy can be generated per unit time, and it is probably proportional to the heat generated by the part that's generating the randomness (in the circuit [betrusted.io] of the story, the TPS61158 chip).

  • (Score: 2, Interesting) by Rupert Pupnick on Saturday April 20 2019, @01:14PM (8 children)

    by Rupert Pupnick (7277) on Saturday April 20 2019, @01:14PM (#832545) Journal

    Nice circuit design, but I've never heard the term "entropy" used in hardware engineering, including in applications such as broadband noise generators for labs or RNGs. Isn't entropy really a thermodynamic or statistical mechanics concept rather than a circuit function?

    • (Score: 1, Informative) by Anonymous Coward on Saturday April 20 2019, @02:18PM

      by Anonymous Coward on Saturday April 20 2019, @02:18PM (#832576)
      Entropy is a common way of discussing how random or noisy information is. Wikipedia on "Entropy (information_theory)" [wikipedia.org]
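      In that usage the entropy of a byte stream is just a number you can compute, H = -sum(p * log2(p)); a quick sketch:

```python
import math
import os
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte (8.0 would be perfectly uniform bytes)."""
    counts = Counter(data)
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(shannon_entropy(os.urandom(1 << 20)))  # ~7.9998 for a good source
print(shannon_entropy(b"A" * 1024))          # 0.0 for a constant stream
```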
    • (Score: 1, Informative) by Anonymous Coward on Saturday April 20 2019, @02:30PM (1 child)

      by Anonymous Coward on Saturday April 20 2019, @02:30PM (#832579)

      There is a concern: the external NOISE_ON signal is directly coupled onto the input of the amplifier. Any digital noise in it might exceed the noise of the diode.

      Has the author looked for the switcher frequency in the spectrum of the noise? It might be strong enough, yet still invisible on the scope.

      The 5.1M divider is not practical due to leakage; you'd have to add a guard trace and bathe the board in conformal coating. It would be easier to add two more resistors and feed the divider from a lower, cleaned-up voltage. Myself, I'd use a voltage reference as the bias source, since that voltage has to be super clean.

      The +1.8V rail should be linearly regulated within the device and should feed only these parts. The extra regulator in a small SOT package will not hurt.

      • (Score: 2) by RS3 on Saturday April 20 2019, @04:13PM

        by RS3 (6367) on Saturday April 20 2019, @04:13PM (#832614)

        If there is noise in the NOISE_ON signal, it can simply be filtered with another resistor and a capacitor.

        If by "switcher frequency" you mean the power supply, I agree -- it could be a significant factor. It needs to be housed in a metal (shielded) can, with maybe two stages (poles) of power-supply filtering (two resistors and two capacitors), and/or located far away, and good ground-plane PC board layout techniques used.

    • (Score: 4, Informative) by rigrig on Saturday April 20 2019, @02:31PM (4 children)

      by rigrig (5129) <soylentnews@tubul.net> on Saturday April 20 2019, @02:31PM (#832580) Homepage

      Isn’t entropy really a thermodynamic or statistical mechanics concept

      It is [wikipedia.org], but then the cryptographers [wikipedia.org] needed a nice-sounding word for randomness [wikipedia.org].

      --
      No one remembers the singer.
      • (Score: 1) by Rupert Pupnick on Saturday April 20 2019, @02:46PM (3 children)

        by Rupert Pupnick (7277) on Saturday April 20 2019, @02:46PM (#832583) Journal

        Thanks, that’s what I thought. As if terminology in cryptography weren’t sexy enough. 🙄

        Is there any expectation as to what the probability distribution function of the output should be? Is it supposed to be flat over the 0 - 0.999 normalized range? Is Gaussian OK if you do the output scaling appropriately?

        Is the raw random output variable supposed to be the time interval between the sawtooth peaks on that scope trace?

        • (Score: 2) by RS3 on Saturday April 20 2019, @03:58PM (2 children)

          by RS3 (6367) on Saturday April 20 2019, @03:58PM (#832606)

          I think the "Noise = Entropy" thing came from extending the use of "entropy" from simple thermodynamics to very large-scale systems, all the way out to the entire universe. (Notice I started that sentence with "I think".)

          > Is the raw random output variable supposed to be the time interval between the sawtooth peaks on that scope trace?

          That's one thing to look at. Another is the amplitude (height) variation. Another is the slope.

          Speaking numerically (digitally), it all depends on how it's sampled: the "bitness" of the sampler, i.e. its resolution. If you have a low-bit ADC, you'll miss many of the smaller peaks and valleys, regardless of the sample rate.

          (An aside: To get high bit counts without building complicated ADCs, some brilliant EEs and mathematicians came up with "delta-sigma" or "1-bit sampling", where we just look at the difference between two successive sample points: is the next one higher or lower? To do this we have to "oversample" by a large margin and then do the filtering and processing algorithmically, but it works. And we lose the DC reference. I know of commercial products which use low-bit quantization plus oversampling; to get DC, they interrupt themselves regularly to recalibrate against a DC reference. They use two channels of ADC so you don't lose any signal -- one continues sampling while the other is resetting / recalibrating.)
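          (What I described above is really closer to plain delta modulation; a toy sketch of that 1-bit idea, assuming a heavily oversampled input:)

```python
import math

def delta_modulate(samples, step=0.01):
    """1-bit encoding: each output bit says whether the tracked estimate must
    step up (1) or down (0) to follow the input. Needs heavy oversampling."""
    estimate, bits = 0.0, []
    for x in samples:
        bit = 1 if x > estimate else 0
        estimate += step if bit else -step
        bits.append(bit)
    return bits

def delta_demodulate(bits, step=0.01):
    """Reconstruct by integrating the bit stream (low-pass filter it in practice)."""
    estimate, out = 0.0, []
    for bit in bits:
        estimate += step if bit else -step
        out.append(estimate)
    return out

# Round-trip a heavily oversampled sine wave (640 samples per cycle)
signal = [0.5 * math.sin(2 * math.pi * k / 640) for k in range(6400)]
reconstructed = delta_demodulate(delta_modulate(signal))
```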

          https://en.wikipedia.org/wiki/Quantization_(signal_processing) [wikipedia.org]

          Sample rate is another discussion. Most of you probably know about the "Nyquist" rate, which says you must sample at least 2X the highest frequency of the signal or you'll get "aliasing". True noise contains all frequencies, so we have to remove some of our randomness by limiting bandwidth.
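          A tiny numerical illustration of the aliasing point (frequencies are made up): a 7 kHz tone sampled at 10 kS/s produces exactly the same samples as a 3 kHz tone, so after sampling the two are indistinguishable.

```python
import numpy as np

fs = 10_000                              # sample rate in Hz, so Nyquist is 5 kHz
n = np.arange(32)                        # sample indices
tone_7k = np.cos(2 * np.pi * 7_000 * n / fs)
tone_3k = np.cos(2 * np.pi * 3_000 * n / fs)
print(np.allclose(tone_7k, tone_3k))     # True: 7 kHz aliases down to 3 kHz
```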

          • (Score: 2, Interesting) by Rupert Pupnick on Tuesday April 23 2019, @02:10PM (1 child)

            by Rupert Pupnick (7277) on Tuesday April 23 2019, @02:10PM (#833857) Journal

            Thanks, but I still really dislike the use of the term “entropy” in this context, especially when the term “randomness” is much more clear, simple, and descriptive.

            In science literature, entropy is frequently described as a measurement of the degree of disorder in a system, but even that I think is unsatisfactory and misleading. I think of a high entropy system as being highly uniform— there's no place where there are any gradients or structures that can store any potential energy or information. The ultimate example, as you say, is the heat death of the universe.

            Let’s get the term “entropy” out of circuit design— or at least keep it out of the terminology!

            • (Score: 2) by RS3 on Tuesday April 23 2019, @03:07PM

              by RS3 (6367) on Tuesday April 23 2019, @03:07PM (#833872)

              You have my vote! I don't remember hearing the word "entropy" until taking thermodynamics in college. I did fairly well (in spite of most people complaining about how hard it was, waaaa) and it stuck with me, including the concepts and terms. I'll have to dig out my old college textbook and see what the definition was (late '80s). Don't hold your breath though. A good NASA reference: https://www.grc.nasa.gov/www/k-12/airplane/enthalpy.html [nasa.gov]

              Some definition / usage: https://en.wikipedia.org/wiki/Entropy [wikipedia.org] I think I see the problem. In statistics (remember that?) they use Greek capital omega, and in EE we use omega for resistor Ohms, and resistors are noisy, hence the connection and confusion. (I jest but there may be a connection, even if it was made and promulgated by non-technical-types.)

              And some good etymology: http://www.eoht.info/page/Entropy+(etymology) [eoht.info]

              Bottom line: It seems that you and I and some others have learned "entropy" in a very specific context. At one of my first jobs I had to program (configure really) "protocol converters" (data communication boxes) and that was my only exposure to the word "protocol". I got confused when I heard it used in common language, until I looked up the dictionary definition. I love learning, but some days it's tiring.

  • (Score: 2) by BsAtHome on Saturday April 20 2019, @01:17PM

    by BsAtHome (889) on Saturday April 20 2019, @01:17PM (#832547)

    It always reminds me of the quote: "Random numbers are something best not left to chance." And then, computers and randomness are two worlds apart, each striving for the exact opposite of the other.

    There is real science in randomness and, as the paper and article acknowledge, a great deal of intricate details to be handled. I love the engineering effort in these small niches, when they are well documented and presented.

  • (Score: 0) by Anonymous Coward on Saturday April 20 2019, @02:04PM (2 children)

    by Anonymous Coward on Saturday April 20 2019, @02:04PM (#832574)

    A diode is basically a transistor minus the base, so no big difference. If the NPN is failing, try an SCR.

    • (Score: 2, Informative) by Rupert Pupnick on Saturday April 20 2019, @02:48PM (1 child)

      by Rupert Pupnick (7277) on Saturday April 20 2019, @02:48PM (#832584) Journal

      You mean missing the emitter or collector. If you remove the base (in the middle) you remove both PN junctions.

      • (Score: 2) by RS3 on Saturday April 20 2019, @04:07PM

        by RS3 (6367) on Saturday April 20 2019, @04:07PM (#832611)

        AC's idea would still work with a high enough voltage ( > Vbce0 ). :-}

        I'm not sure why he's suggesting an SCR, but if the B-E junctions are failing, and it's due to the cyclic charging / avalanche discharge of the stored charge (which makes sense: microscopic sparks), maybe using a power transistor would solve the problem, or at least buy more time. I would stick with zeners.

  • (Score: 3, Insightful) by Anonymous Coward on Saturday April 20 2019, @02:53PM (15 children)

    by Anonymous Coward on Saturday April 20 2019, @02:53PM (#832587)

    The third edition of Horowitz and Hill's "The Art of Electronics" describes a better circuit in section 13.14.7. The schematic is in figure 13.121 on page 984.

    This circuit was designed in the mid-90s and the component and interface choices reflect that. It was used as part of a collaboration between Professor Paul Horowitz (Harvard University Physics Department) and Professor William Press (at the time Harvard University astronomy and physics) -- both members of the Jason Defense Advisory Group and both blazingly smart and utterly practical -- to create "the best 'random' bits anywhere". That random bit collection was published on the CD-ROM that came with Press's "Numerical Recipes" books.

    Anybody who is serious about creating true random numbers should study this work and the process that Horowitz and Press went through in generating these random bits.

    • (Score: 2, Interesting) by Rupert Pupnick on Saturday April 20 2019, @05:39PM (13 children)

      by Rupert Pupnick (7277) on Saturday April 20 2019, @05:39PM (#832642) Journal

      Would be nice if you could provide a link or a diagram!

      Why not use the simplest noise generator of all: a large resistor at the input of a bunch of gain stages. Sure, the noise is bandlimited by the gain stages and board parasitics, etc., but any design is inherently limited to some finite bandwidth. Again, what probabilistic requirements are people trying to design to?

      • (Score: 2) by RamiK on Sunday April 21 2019, @04:49AM (1 child)

        by RamiK (1813) on Sunday April 21 2019, @04:49AM (#832854)

        Why not use [...] a large resistor at the input

        Without even looking, power efficiency and heat.

        --
        compiling...
        • (Score: 3, Informative) by Rupert Pupnick on Sunday April 21 2019, @12:35PM

          by Rupert Pupnick (7277) on Sunday April 21 2019, @12:35PM (#832920) Journal

          Not a power or heat issue at all. The resistor doesn’t dissipate any power other than from the DC input bias current of the gain stage it’s connected to, which is typically in the microamperes. You pick a large resistor to get a large noise voltage, but not so large as to disrupt the biasing of the input stage.

          The underlying principle has been known to communications systems designers since the early days of analog circuit design and can be found here: https://en.m.wikipedia.org/wiki/Johnson–Nyquist_noise [wikipedia.org]
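          For anyone who wants the numbers, the noise voltage density is sqrt(4·k·T·R); a quick back-of-envelope, taking 1 Mohm as an example value at room temperature:

```python
import math

k_B = 1.380649e-23     # Boltzmann constant, J/K
T = 300.0              # room temperature, K
R = 1e6                # example source resistance: 1 Mohm
B = 1e6                # example noise bandwidth: 1 MHz

v_density = math.sqrt(4 * k_B * T * R)   # V / sqrt(Hz)
v_rms = v_density * math.sqrt(B)         # total RMS voltage in bandwidth B
print(f"{v_density * 1e9:.0f} nV/sqrt(Hz), {v_rms * 1e6:.0f} uV rms")
# ~129 nV/sqrt(Hz) and ~129 uV rms -- microvolts, hence the need for lots of gain
```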

      • (Score: 3, Informative) by Rich on Sunday April 21 2019, @02:01PM (10 children)

        by Rich (945) on Sunday April 21 2019, @02:01PM (#832938) Journal

        The Horowitz/Hill circuit is the B-E junction of a 2N4401 feeding a 24 dB/oct high-pass filter whose middle poles are also fixed-gain stages. The output runs into a fast comparator, which feeds a PAL programmed as a shift register that outputs RS-232 serial. The circuit is about as power hungry as it gets (shunted voltage, all analog resistors 3k). The entropy hinges on the offset voltage of the comparator and requires adjustment based on counting the bit balance in the data output. I find their assumption of a fixed noise level at the comparator input rather optimistic, having dabbled with a noise circuit for a classic analog synthesizer; depending on the transistor batch used, these levels may vary WIDELY. (And there is the interesting story of that mysterious "selected" 2SC828 in 808s.)

        The power consumption of this is absolutely not what Bunnie would have wanted. The Bunnie circuit is all about being small and low-power. There's only a single stage: the zener noise runs straight into a modern low-voltage op-amp with a gain of 6. I guess he wants to sample that using an on-controller A/D. The most interesting part is the TPS61158 application that ups the voltage on demand enough for an avalanche zener to work. (BTW: these micro-BGAs, and especially their accompanying 01005 resistors, can be mistaken for dust, and I'd rather leave those to Louis Rossman for showing off his skills.)

        I have no experience as to whether zener noise levels are more predictable than those of abused transistors, but I wouldn't want to bet that this works reliably across different batches. If the post-A/D level is too small, he can just use more samples for the needed entropy, but if the gain slams into the rails (which is something that happened to me when experimenting with a classic circuit and 1:1 values), it looks ugly.

        • (Score: 1) by Rupert Pupnick on Tuesday April 23 2019, @01:51PM (1 child)

          by Rupert Pupnick (7277) on Tuesday April 23 2019, @01:51PM (#833846) Journal

          Thanks for that info, Rich. You raise an excellent point about keeping the noise amplitude within the input scale of the A/D. Otherwise your converter output will have lots of full scale and zero scale values. Maybe the solution to that is to simply throw those values out?

          Still the question I keep asking about the nature of the noise and randomness remains unanswered: is the power spectral density of the noise source of any importance? Why does the design you describe include high pass filter stages? What clocks the A/D circuit— does that clock have to have randomness, too? And as other posters have suggested, what is the appropriate figure of merit to use when evaluating output randomness? There’s more to good circuit design than small layout area and low power consumption.

          • (Score: 2) by RS3 on Tuesday April 23 2019, @04:42PM

            by RS3 (6367) on Tuesday April 23 2019, @04:42PM (#833905)

            I'm in a rush so I'm not thinking, but generally if you throw samples out, you radically alter the data that you want. "Digital clipping" is what we're referring to and it's a Very Bad Thing. You want to be careful with levels, and if you're not sure, use analog compression / limiters (aka AGC or Automatic Gain Control) or something similar. Simple diode limiting is no good: it will still clip and the resulting square-ish output will contain unwanted harmonics.

            In the old days of 16-bit audio sampling we had to be very very careful with input levels and clipping. Now with 24 bit and higher, we have much more "headroom", but clipping is still a Very Bad Thing (digital clipping creates a big splatter of frequencies and sounds much worse than most analog clipping). 24 bits sampling gives you 20 log (2 ^ 24) = 144 dB dynamic range (16 bits is 96 dB). In practical usage, 16 bits is very fiddly, and 24 bits is pretty easy.

            You definitely don't want random clocks! Interesting to think about though...

            In the audio world, a popular mod (hot rodding, upmod, upgrade, etc.) is to replace the sampling master clock oscillator with a much more stable (low-jitter, temperature stabilized) clock oscillator. They're not super expensive so I'm not sure why manufacturers don't do it and brag about it in the sales literature.

            The clock needs to be at least 2 times the highest frequency you're sampling, known as the "Nyquist rate", or you'll get aliasing.

            The best solution is good analog low-pass filters plus oversampling: a sample rate much higher than necessary, because digital band-pass filtering is easy.

            You asked about power spectral density. Yes, you want a flat long-term average (voltage-squared) spectral density. Any peaks, by definition, show that a particular frequency is being favored; better said, the output is no longer truly random.

            A good not too technical reference: https://www.analog.com/en/analog-dialogue/articles/practical-filter-design-precision-adcs.html [analog.com]
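            For checking that flatness in practice, averaging the squared FFT magnitude over many frames (a poor man's Welch estimate) is usually enough; a sketch with synthetic data standing in for captured ADC samples:

```python
import numpy as np

def averaged_psd(x, frame=4096):
    """Average |FFT|^2 over non-overlapping Hann-windowed frames."""
    window = np.hanning(frame)
    frames = x[: (len(x) // frame) * frame].reshape(-1, frame) * window
    return np.mean(np.abs(np.fft.rfft(frames, axis=1)) ** 2, axis=0)

rng = np.random.default_rng()
noise = rng.standard_normal(1 << 20)   # stand-in for digitized noise samples
psd = averaged_psd(noise)
# A flat source gives a roughly constant level across bins (ignoring DC);
# a strong peak in one bin would betray, e.g., switcher feedthrough.
print(psd[1:].max() / psd[1:].mean())
```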

        • (Score: 2) by RS3 on Tuesday April 23 2019, @04:02PM (7 children)

          by RS3 (6367) on Tuesday April 23 2019, @04:02PM (#833894)

          You might be an EE, or certainly knowledgeable. Almost any small-signal circuit can be optimized for low-power consumption (and I'm sure you know that and I'm just commenting for anyone who might read this and not know).

          The first comment in this discussion contains a link to an article where the author tries some different transistors and zeners: https://betrusted.io/avalanche-noise [betrusted.io] As I commented above, I had not heard about the transistor B-E "wearout" problem, but I don't do a lot in that world- I'm usually trying to get RID of noise! I'd stick with zeners. They're meant to do that (avalanche).

          I agree about noise, batches, etc. It's not too difficult to hand-select them (or even automate it) but they may vary over long time periods anyway.

          My hunch is that the higher the zener breakdown voltage, the more noise you get.

          Since f = 1 / (2 Pi R C), the higher the zener's bias R, the more low-frequency content you'll get. And that's easily calculated. Of course you'll need a FET-input (very high input impedance) op amp.
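          For example, with made-up values of a 1 Mohm bias resistor and 100 pF of coupling/stray capacitance:

```python
import math

def rc_corner_hz(r_ohms, c_farads):
    """Corner frequency of an RC network: f = 1 / (2 * pi * R * C)."""
    return 1.0 / (2 * math.pi * r_ohms * c_farads)

print(f"{rc_corner_hz(1e6, 100e-12):.0f} Hz")   # ~1592 Hz
```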

          Then I would use some nice adjustable band-pass filters (parametric EQ) to get the sound I wanted. (I'm an EQ freak on soundboards, but that's my quirk / OCD).

          A solution might be to have several zeners creating the noise, each fed into 1 op amp, and the outputs summed. Noise is additive, and the more zeners you use, the flatter your spectral output should be ("should" is the favorite word of engineers, and the most hated by everyone else).

          You could use different zeners and transistors, and then control the levels (simple mixer) and / or mute them individually.

          A good place to start might be a TL084 quad op amp.

          Or, if you don't want to mess around, you could use one of several noise generators, like the "electric druid" https://www.banzaimusic.com/electric-druid-pentanoise-noise-generator.html [banzaimusic.com]

          Or the TI SN76477 (which is out of production and superseded by the supposedly still available ICS76477).

          • (Score: 2, Interesting) by Rupert Pupnick on Tuesday April 23 2019, @09:43PM (6 children)

            by Rupert Pupnick (7277) on Tuesday April 23 2019, @09:43PM (#834051) Journal

            Yeah, I'm an EE, temporarily retired, with a few decades of hardware experience in telecom and computers. For me, as for most of the commenters here, noise has always been something you try to rid yourself of, so the objective of deliberately creating noise to provide random numbers for cryptography is something I'd never seen before, and it immediately piqued my interest. The first question that comes to mind is "What's the spec?" This is of course a reflection of the business culture of the commercial design environments I've worked in.

            When engineers start talking about filters, bandwidth, and signal conditioning, that generally suggests there's some desired outcome in terms of power spectral density, or the frequency content of the noise, if you like. That distribution, in turn, sets limits on the behavior of the noise signal over time, which has an obvious impact on randomness. Looking at the write-up (thanks for the more direct link), it seems the emphasis is really on limiting power consumption while at the same time getting a decent output level. The scope pictures were captured on a relatively late-model Tek that probably has an FFT function, and the fact that no FFT traces are presented suggests to me that the designer probably isn't interested in the spectral content of the noise.

            So maybe that's not important, especially if you can post-process the data that comes out of the A/D to randomize it further (e.g. do some big floating-point operation and throw out everything but the decimal remainder to get a traditional random seed number between 0 and 1).
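            On the post-processing point, the more common conditioner is to run a block of raw samples through a cryptographic hash rather than a floating-point trick; a hedged sketch (the function name and block size are mine, not from the write-up):

```python
import hashlib
import os

def condition(raw_samples: bytes, out_len: int = 32) -> bytes:
    """Compress possibly biased, possibly correlated raw ADC bytes into out_len
    bytes, on the assumption that the block contains at least 8 * out_len bits
    of entropy in total. SHA-256 acts as the randomness extractor here."""
    return hashlib.sha256(raw_samples).digest()[:out_len]

raw = os.urandom(4096)   # stand-in for a block of raw ADC readings
print(condition(raw).hex())
```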

            • (Score: 2) by RS3 on Wednesday April 24 2019, @02:01AM (5 children)

              by RS3 (6367) on Wednesday April 24 2019, @02:01AM (#834157)

              Oh, good, we can communicate. EE here too. I'm generally a Tek snob, and prefer analog, but I have a LeCroy with FFT and an HP 4194A, as well as various audio A/D samplers (up to 192 kHz @ 24 bits), and there's all kinds of software to look at spectral density. Anyway...

              Yes, the spec. I'm definitely not an expert, but I remember some bits and pieces (pun intended, ugh, sorry). My interpretation is that cryptographers need truly random numbers on which to base ciphers, and that any repeating patterns (the full number, or even subsections / digit combinations that repeat more often than others) show a potentially predictable weakness in the algorithm. Over the years I remember reading about how /dev/random is not truly random, and here and there flaws have been found in most computed pseudo-random number generators; a Holy Grail is needed, but nobody has answered these questions three.

              So if we can generate true pure noise, which by definition means all frequencies represented equally when averaged over time, we can give them a good basis for encryption.

              My gut feeling (again, not an expert) is that _any_ processing done won't help the randomness. I haven't done this stuff in too many years, so I have to study up on windowing, FFTs, etc., for either the flattest end result or a predictable result that can be mathematically compensated out. A really flat window is the Dolph–Chebyshev. Amazing write-up with nice pictures: https://en.wikipedia.org/wiki/Window_function [wikipedia.org] Let me know if you know that stuff. It's what I pursued, including some MS coursework, but nobody would hire me; they all wanted 3-5 years of experience. Sigh.

              It all depends on the number of bits of "entropy" they want, but let's say 2^12 possible values (12 bits). It could be based on a combination of frequencies and amplitudes, but the end result must be that the long-term average histogram of all possible numbers is flat-topped. I'm not sure exactly what I mean by long-term, but we could derive that; it should be based on the lowest frequency and the number of frequencies we're looking at. We could let it run for hours and see how it looks. (I do NOT need another project, but now I want to run it! Oh gosh, this is too easy not to do...)

              Again, not an expert, so I'm not sure what a simple averaged amplitude histogram would look like. My hunch is Gaussian; what say you?
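              If you do let it run, the flat-histogram check is easy to script; a sketch of a chi-square test against a uniform distribution over 2^12 bins, with os.urandom standing in for the sampled data as a known-good baseline:

```python
import os

def chi_square_uniform(samples, bins=4096):
    """Chi-square statistic of a histogram against a flat distribution.
    For a good source it should hover around `bins` (the degrees of freedom)."""
    counts = [0] * bins
    for s in samples:
        counts[s % bins] += 1
    expected = len(samples) / bins
    return sum((c - expected) ** 2 / expected for c in counts)

# One million 12-bit values derived from a known-good source
data = os.urandom(2_000_000)
values = [int.from_bytes(data[i:i + 2], "little") & 0x0FFF
          for i in range(0, len(data), 2)]
print(chi_square_uniform(values))   # expect roughly 4096 +/- ~90
```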

              • (Score: 1) by Rupert Pupnick on Wednesday April 24 2019, @04:46PM (4 children)

                by Rupert Pupnick (7277) on Wednesday April 24 2019, @04:46PM (#834399) Journal

                I guess a few comments. First on my circuit design idea.

                So the noise voltage out of a 1M resistor as I suggested previously is about 0.13 uV/Hz^0.5. In a 1 MHz bandwidth (just picking an op-ampish number) that's a factor of a thousand or about 0.13 mV. To get to a 1.3V amplitude requires a gain of about 10^4 which might be doable with three gain stages with an off the shelf op amp (haven't looked at any spec sheets for GBW product for typical op amp, was guessing 50 MHz). So right away my solution's probably not cheaper. It does get you a reasonably flat spectrum, though, from the low frequency corner of the amplifier coupling to the limiting bandwidth of the system. But it's not clear to me that spectral flatness is an important or necessary condition for randomness, when I think about it some more.

                This is because as soon as you have a bandlimited signal, you establish a correlation between two successive samples in time: the lower the noise bandwidth, the more time you need between samples to make them uncorrelated. Probably not a serious functional restriction since I doubt you need to make RNG samples at MHz rates, but I think it illustrates a limitation of all noise generation systems for RNG.

                Anyway it all comes back to the question of how random is random enough for the application, which is really cryptography-- I don't have a lot of experience there. As you point out, a computationally derived RNG isn't really random because it in fact derives from a deterministic process. But practically speaking, the attacker would have to know all about the seed generation routine to defeat it (including undocumented tolerances, initial conditions and states, how long the state machine has been running, and other sources of perturbation-- e.g. data from TOY clock), and this seems really unlikely to me.

                • (Score: 2) by RS3 on Thursday April 25 2019, @06:46PM

                  by RS3 (6367) on Thursday April 25 2019, @06:46PM (#834872)

                  As too often, I'm in a rush and can't write up all my thoughts. But basically I wasn't going into the circuit details yet- that's just grunt work. I was trying to establish the need, philosophy, proof of concept, etc. Yes, resistors are great noise sources. "Shot noise" from any noise source could be a problem- more study needed (when I have more time...)

                  1) I'm not expert but I've read here and there that good cryptography is based on truly random numbers, and any kind of repeating pattern can lead to breaking the cipher. Again, I'm not expert, but I believe them and see a need for true randomness.

                  2) By definition true pure noise is composed of all frequencies within a given range of frequencies, occurring at random times and amplitudes.

                  So if we can generate pure noise and correctly digitize it, we can provide true random numbers. Before getting into too many circuit details, I want to understand and define the system theory.

                  One area I'm having trouble with (and might consult a professor about) is sampling. I know a fair bit about sampling, Nyquist, etc., but Nyquist alone isn't going to cover varying amplitudes (and I have strong thoughts about audio sampling, but more on that another time). The issue I have: noise, by definition, contains frequencies whose amplitudes are varying. That variation, also known as amplitude modulation, creates sideband frequencies which might exceed Nyquist. So to do this correctly we need to oversample, but I'm not sure by how much, and I don't have time to think it through (gotta pay bills; wish this topic were paying my bills!)

                • (Score: 2) by RS3 on Thursday April 25 2019, @09:59PM

                  by RS3 (6367) on Thursday April 25 2019, @09:59PM (#834928)

                  Okay, upon further thought: I need to see that the frequencies all reach the same maximal amplitude (peak-hold display), and that all frequencies have the same average amplitude (or energy); either way, the long-term average has to be a flat-topped graph. All that before I can qualify a noise source as truly random.

                • (Score: 2) by RS3 on Thursday April 25 2019, @10:18PM (1 child)

                  by RS3 (6367) on Thursday April 25 2019, @10:18PM (#834936)

                  Some really good reading: https://en.wikipedia.org/wiki/Hardware_random_number_generator [wikipedia.org], especially at "Quantum random properties" where they discuss quantum mechanics, molecular motion, etc., in systems above absolute zero, hence: "entropy". It's good when things connect.

                  Also in that section they discuss various noise sources, frequency compensation to achieve flat spectral response, etc. If I get time I'll set up some tests and see what kind of spectral response I get from various noise sources.

                  • (Score: 1) by Rupert Pupnick on Friday April 26 2019, @05:17PM

                    by Rupert Pupnick (7277) on Friday April 26 2019, @05:17PM (#835198) Journal

                    Good link with a nice summary of the considerations, thanks. Sounds like a 1 bit comparator into a shift register is the most elegant solution on the digital side once you have a noise source you’re happy with.
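                    In software terms that's just thresholding each sample and packing the bits; a minimal sketch (the threshold and the sample values are made up):

```python
def pack_comparator_bits(samples, threshold=0.5):
    """Turn analog samples into bytes, one bit per sample (above/below the
    threshold), MSB first -- the software equivalent of a comparator feeding
    a shift register. Leftover bits that don't fill a byte are dropped."""
    out, byte, nbits = bytearray(), 0, 0
    for s in samples:
        byte = (byte << 1) | (1 if s > threshold else 0)
        nbits += 1
        if nbits == 8:
            out.append(byte)
            byte, nbits = 0, 0
    return bytes(out)

print(pack_comparator_bits([0.1, 0.9, 0.7, 0.2, 0.6, 0.4, 0.8, 0.3]).hex())  # '6a'
```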

    • (Score: 0) by Anonymous Coward on Sunday April 21 2019, @08:30PM

      by Anonymous Coward on Sunday April 21 2019, @08:30PM (#833104)

      Aha, thank you!

  • (Score: 0) by Anonymous Coward on Saturday April 20 2019, @10:15PM (4 children)

    by Anonymous Coward on Saturday April 20 2019, @10:15PM (#832736)

    "afaik this is the smallest"

    Bunny's always got his eye out for a way to get himself in the headlines. Self-promotion is job #1.

    I'm still waiting for a utility to back up his filesystem. Y'know, like dump(8), or restore(8).

    • (Score: 0) by Anonymous Coward on Sunday April 21 2019, @01:44AM (3 children)

      by Anonymous Coward on Sunday April 21 2019, @01:44AM (#832816)

      Using a zener diode in avalanche breakdown mode for an RNG is a REALLY OLD idea. Isn't it surprising, then, that Huang doesn't cite any previous circuit designs (of which there are many) when he presents "his" design? How does it differ from the others in meaningful ways? So I agree with you; Huang presents his done-to-death circuit as something new and innovative.

      • (Score: 2) by janrinok on Sunday April 21 2019, @09:16AM (2 children)

        by janrinok (52) Subscriber Badge on Sunday April 21 2019, @09:16AM (#832894) Journal

        He explains why he did it in his notes:

        I had to do a new design because the existing open-source ones I could find were too large and power hungry to integrate into a mobile device.

        You asked "How does it differ from the others in meaningful ways?" and the question had already been answered.

        • (Score: 0) by Anonymous Coward on Sunday April 21 2019, @05:48PM (1 child)

          by Anonymous Coward on Sunday April 21 2019, @05:48PM (#833042)

          He states that other designs were too large and power hungry, but he does not state which designs he rejected. That information is nowhere to be found in his notes.

          • (Score: 2) by janrinok on Monday April 22 2019, @07:29AM

            by janrinok (52) Subscriber Badge on Monday April 22 2019, @07:29AM (#833290) Journal

            The design of the circuit is irrelevant if building it results in something too large to fit in his mobile device or something that drains too much power, although it is true that those other designs might still be of interest to some. This was the smallest design he could build that met his very limited power budget.
