Studying chaos with one of the world's fastest cameras:
[...] chaotic systems [...] are notable for exhibiting behavior that is predictable at first, but grows increasingly random with time.
[...] In the latest issue of Science Advances, [Caltech's Lihong] Wang describes how he has used an ultrafast camera of his own design that recorded video at one billion frames per second to observe the movement of laser light in a chamber specially designed to induce chaotic reflections.
"Some cavities are non-chaotic, so the path the light takes is predictable," Wang says. But in the current work, he and his colleagues have used that ultrafast camera as a tool to study a chaotic cavity, "in which the light takes a different path every time we repeat the experiment."
The camera makes use of a technology called compressed ultrafast photography (CUP), which Wang has demonstrated in other research to be capable of speeds as fast as 70 trillion frames per second. The speed at which a CUP camera takes video makes it capable of seeing light—the fastest thing in the universe—as it travels.
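To get a feel for those frame rates, it helps to work out how far light actually moves between consecutive frames; the short sketch below does just that (the two rates are the ones quoted above, and the function name is mine):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_per_frame_m(frames_per_second):
    """How far light travels between two consecutive frames."""
    return C / frames_per_second

# At 1 billion fps, light covers about 30 cm per frame -- coarse enough to
# watch a beam cross a tabletop cavity frame by frame.
print(distance_per_frame_m(1e9))

# At 70 trillion fps, light moves only a few micrometres per frame.
print(distance_per_frame_m(70e12))
```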
But CUP cameras have another feature that makes them uniquely suited to studying chaotic systems. Unlike a traditional camera, which shoots one frame of video at a time, a CUP camera essentially shoots all of its frames at once. This allows the camera to capture the entirety of a laser beam's chaotic path through the chamber in one go.
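The "all frames at once" capture rests on compressed sensing: many unknowns are mixed into one coded measurement, then recovered by exploiting sparsity. The toy sketch below is *not* Wang's actual reconstruction pipeline; the random sensing matrix, problem sizes, and the orthogonal-matching-pursuit solver are illustrative stand-ins for the idea.

```python
import numpy as np

rng = np.random.default_rng(0)
n_meas, n_unknowns, sparsity = 80, 128, 5

# A random sensing matrix stands in for the camera's coded aperture:
# each measurement is a weighted mix of all unknowns.
A = rng.normal(size=(n_meas, n_unknowns))
A /= np.linalg.norm(A, axis=0)

# A sparse "scene": only a few nonzero entries, kept away from zero.
x_true = np.zeros(n_unknowns)
support_true = rng.choice(n_unknowns, sparsity, replace=False)
x_true[support_true] = rng.uniform(1.0, 2.0, sparsity) * rng.choice([-1.0, 1.0], sparsity)

y = A @ x_true  # the single multiplexed measurement

# Orthogonal matching pursuit: greedily pick the column that best matches
# the residual, then re-fit by least squares on the selected columns.
residual, support = y.copy(), []
for _ in range(sparsity):
    support.append(int(np.argmax(np.abs(A.T @ residual))))
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    residual = y - A[:, support] @ coef

x_hat = np.zeros(n_unknowns)
x_hat[support] = coef
```

Even though there are 128 unknowns and only 80 measurements, the sparse scene is recovered exactly; that underdetermined-but-recoverable regime is what lets a single coded exposure stand in for a whole stack of frames.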
Journal Reference:
Linran Fan, Xiaodong Yan, Han Wang, et al. Real-time observation and control of optical chaos [open], Science Advances (DOI: 10.1126/sciadv.abc8448)
(Score: 2) by fakefuck39 on Sunday January 17 2021, @10:35AM (1 child)
A lot of cameras lately capture things fast, allowing for super-high-frame-rate video or captures of near-instantaneous events like atoms being blown apart by a laser pulse. But let's take a look at what they're doing on the other end of that: capturing long-exposure space images. Those are some very expensive NASA-style cameras, where the sensor collects light for a full hour. But why not cheap, and for the backyard?
To get high resolution, you need tiny pixels, which means each one gets less light as you increase resolution. So the sensor needs to be extra sensitive and low-noise, the lens huge, and the whole thing ends up costing hundreds of thousands.
But why not this: regular-sized sensors, just a much larger die. A slightly imperfect large lens is cheap too, and you move the sensor far back. That hides most of the lens imperfections, especially for things captured over an hour; the earth spins a little anyway. So how much is a CMOS sensor for a regular camera? Well, let's just put 16 of those together. Now you've got high resolution, it's dirt cheap, but it takes at least several minutes to snap any kind of photo.
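The light-per-pixel argument is just area arithmetic. A rough sketch (the photon flux value is an invented placeholder for a dim target, not a measured number, and quantum efficiency and fill factor are ignored):

```python
import math

def photons_collected(pixel_pitch_um, exposure_s, flux_photons_per_um2_s):
    """Photons a square pixel of the given pitch collects over one exposure."""
    return pixel_pitch_um ** 2 * exposure_s * flux_photons_per_um2_s

FLUX = 5.0  # assumed photons/µm²/s -- purely illustrative

small = photons_collected(1.0, 3600, FLUX)  # 1 µm pixel, one-hour exposure
large = photons_collected(4.0, 3600, FLUX)  # 4 µm pixel, same exposure

# Quadrupling the pitch gives 16x the photons; shot-noise-limited SNR goes
# as sqrt(photon count), so that is a 4x SNR gain.
print(large / small, math.sqrt(large) / math.sqrt(small))
```

That square-law relationship is why the big-pixel, long-exposure route wins on sensitivity even with mediocre optics.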
I bet there's a big market for those - probably as many people as buy little hobby telescopes.
(Score: 0) by Anonymous Coward on Tuesday January 19 2021, @03:03AM
CMOS scales very nicely. Because the electronics are located at the pixel, you can seamlessly stitch sensors together on the die, whereas with CCDs you need gaps between them to get the data out. However, for hard-core astronomy, CCDs still give you better performance than CMOS, which is why astronomers will live with the gaps between sensors [stanford.edu], though CMOS is still used.
I think the problem with what you're talking about is that it's not hard to get high resolution; the hard part is getting high resolution onto the sensor. The resolution is determined by the size of the pixels, but also by the size of the blur spot, which goes as the f-number of the system. If you want a small blur spot, you need a small f-number, which means you need large optics and short focal lengths. However, getting decent optical performance at low f-numbers is very hard, and very expensive. So to optimize your performance, you need to match your pixel size to your optics. Complicating matters is that the atmosphere dances around (there are tricks you can play there, if you have enough signal, and for the backyard astronomer you can get some very nice results [cloudynights.com] for bright enough objects).
One thing you are spot on about is the lack of a need to go to smaller pixel sizes, and there are very good reasons not to go that way. The size of the pixels now is overkill for the sizes of the blur spots you get from reasonable telescopes anyway. You want the blur spot to be about 2-3 times the size of the pixel (known as the "Q" of the imaging system). Putting more pixels across the blur spot is just wasted effort. If you have a larger blur spot, you ideally want a larger pixel, and as you mentioned, that comes with added benefits such as better pixel performance. Unfortunately, on the consumer side, pixel count is being used like processor frequency was 15 years ago: "megapixel" has replaced "gigahertz." The optical designers who make those ridiculously small lenses are already having fits trying to get good performance at the tiny pixel sizes we have now, and making the pixels smaller just makes that job even harder.
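Both relations above are easy to put in numbers: for a diffraction-limited system the blur spot is the Airy disk, whose diameter is about 2.44 λ N, and the matched pixel follows from the 2-3 pixels-per-blur-spot rule of thumb. A small sketch (the helper names and the 2.5-sample choice are mine):

```python
def airy_diameter_um(wavelength_um, f_number):
    """Diffraction-limited blur (Airy disk) diameter: d ≈ 2.44 · λ · N."""
    return 2.44 * wavelength_um * f_number

def matched_pixel_um(blur_um, samples_across_blur=2.5):
    """Pixel pitch so the blur spot spans roughly 2-3 pixels."""
    return blur_um / samples_across_blur

WAVELENGTH = 0.55  # µm, middle of the visible band
for n in (2.0, 8.0):
    blur = airy_diameter_um(WAVELENGTH, n)
    print(f"f/{n:g}: blur {blur:.2f} µm, matched pixel ~{matched_pixel_um(blur):.2f} µm")
```

At f/2 the blur is under 3 µm, so today's ~1 µm phone pixels are already at the edge of what the optics can feed them; at f/8 a ~4 µm pixel is plenty.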
(Score: 2) by crafoo on Sunday January 17 2021, @12:27PM (1 child)
Chaotic dynamical systems do grow more chaotic with time, but they will exhibit patterns if observed long enough: strange attractors. https://www.stsci.edu/~lbradley/seminar/attractors.html [stsci.edu]
A very good book on the subject
https://www.routledge.com/A-First-Course-In-Chaotic-Dynamical-Systems-Theory-And-Experiment/Devaney/p/book/9780367235994?PageSpeed=noscript [routledge.com]
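Both halves of that, sensitive dependence on initial conditions *and* confinement to a bounded attractor, show up in the classic Lorenz system. A minimal sketch using forward Euler with a small step and the standard parameters σ=10, ρ=28, β=8/3 (the step size and run length are my choices):

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the Lorenz equations."""
    x, y, z = state
    return state + dt * np.array([
        sigma * (y - x),
        x * (rho - z) - y,
        x * y - beta * z,
    ])

# Two trajectories differing by one part in a billion.
a = np.array([1.0, 1.0, 1.0])
b = a + np.array([1e-9, 0.0, 0.0])

for _ in range(3000):  # integrate out to t = 30
    a, b = lorenz_step(a), lorenz_step(b)

# The tiny initial difference has been amplified by many orders of
# magnitude, yet both trajectories stay on the bounded butterfly attractor.
print(np.linalg.norm(a - b), np.linalg.norm(a))
```

That is the point the summary makes about the cavity: run-to-run predictions diverge wildly, but the long-run statistics still trace out a structured pattern.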
(Score: 0) by Anonymous Coward on Sunday January 17 2021, @08:46PM
Thanks for the good links. Wish I had time to go through that textbook. Wish it'd been written back when I was in college!
(Score: 1, Insightful) by Anonymous Coward on Monday January 18 2021, @04:01AM (1 child)
Y'all should see my ex-wife without her makeup on. Eeee-gads!
(Score: 0) by Anonymous Coward on Monday January 18 2021, @04:13PM
That's not chaos, that's entirely predictable. Replace every 5-7 years.