from the NOW-I-can-see-what-you-did-there dept.
NASA has caught on to the High Dynamic Range craze:
While thousands turned out to watch NASA's Space Launch System (SLS) recently complete a full-scale test of its booster, few were aware of the other major test occurring simultaneously. NASA's High Dynamic Range Stereo X (HiDyRS-X) project, a revolutionary high-speed, high dynamic range camera, filmed the test, recording propulsion video data in never-before-seen detail.
The HiDyRS-X project originated from a problem that exists when trying to film rocket motor tests. Rocket motor plumes, in addition to being extremely loud, are also extremely bright, making them difficult to record without drastically cutting down the exposure settings on the camera. Doing so, however, darkens the rest of the image, obscuring other important components on the motor.
[...] When the team reviewed the camera footage, they saw a level of detail on par with the other successful HiDyRS-X tests. The team saw several elements never before caught on film in an engine test. "I was amazed to see the ground support mirror bracket tumbling and the vortices shedding in the plume," Conyers says. The team was able to gather interesting data from the slow motion footage, and Conyers also discovered something else by speeding up the playback. "I was able to clearly see the exhaust plume, nozzle and the nozzle fabric go through its gimbaling patterns, which is an expected condition, but usually unobservable in slow motion or normal playback rates."
The camera was developed as part of the Game Changing Development Program. An enhanced version is already in the works. Video on YouTube. Here is a stabilized version without the slow motion.
According to a statement from NASA, scientists tried out the camera while testing its booster, QM-2. They monitored the camera from a safe distance, but its automatic timer failed to go off, meaning scientists had to start it manually.
And apparently, the force of the booster test was so great that it disconnected the camera's power source. So NASA got confirmation that its camera works, but also that its rocket is very powerful.
(Score: 2, Insightful) by Anonymous Coward on Tuesday August 09 2016, @12:06PM
Suppose you have a nice evening with the moon present, and some clouds around, illuminated by the moon.
Now, try taking a picture of the scene.
With current tech, it's pretty much impossible to have proper exposure for both the moon and the clouds: either you'll have the moon nice and dandy but the clouds hardly visible, or you have nice clouds and the moon totally burnt out. This is independent of whether you use RAW (e.g. 14bit/channel for my Canon) or JPG output.
To solve this, you'll have to take several pictures and blend them into an HDR result. Add some movement to such a scene, and you lose that option.
So, having the sensors provide (way) more dynamic range is definitely of interest, as is the possibility of handling > 8bit/channel (e.g. GIMP 2.9 instead of 2.8).
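A minimal sketch of that blending step, assuming three bracketed JPEGs and using OpenCV's Mertens exposure fusion; the file names are placeholders, not anything from the parent's setup:

    # Merge three bracketed shots of the moon-and-clouds scene into one
    # image that keeps detail in both. File names are made up for illustration.
    import cv2
    import numpy as np

    # Bracketed exposures: underexposed (moon detail), normal, overexposed (cloud detail)
    files = ["moon_under.jpg", "moon_normal.jpg", "moon_over.jpg"]
    images = [cv2.imread(f) for f in files]

    # Mertens fusion blends the exposures directly; no camera response curve
    # or tone mapping step is needed, so it works even from 8-bit JPEGs.
    merge = cv2.createMergeMertens()
    fused = merge.process(images)  # float image, roughly in [0, 1]

    # Scale back to 8 bits for saving/viewing on an ordinary monitor.
    cv2.imwrite("moon_fused.jpg", np.clip(fused * 255, 0, 255).astype("uint8"))

If anything moves between the frames you still get ghosting, so the shots usually need aligning first (cv2.createAlignMTB() can do that) -- which is exactly why a sensor with more dynamic range in a single shot is attractive.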
(Score: 3, Interesting) by ledow on Tuesday August 09 2016, @02:43PM
I have to say that this is cool, yes, maybe even pretty. But however the camera is made, it's just a composite of various ranges from different sensors at the end of the day. It just so happens that one sensor "chip" contains many sensors.
There are already cameras that profess to have "infinite focus" too - where they can focus on all distances simultaneously through the use of tricky lenses and post-processing.
And although using them for science gives you more accurate data (literally just more sensors of the appropriate range), using HDR for photography? Really? Is it because my eye adjusts to what it's looking at and therefore your picture is what I would see if I scanned each little bit of the scene with my eye and let it adjust, and then put it all together in one humungous panorama? It's hardly a "realistic" image at all.
For science, extra sensors is the way to go and this is just a way of getting the image to two sensors of differing sensitivity rather than just one, surely? We don't care about colour accuracy and relative brightness when we're looking for data.
But for photography, HDR is just a way of photoshopping in-camera, isn't it?
Is it really photography when you mess around so much, or is it photography when you use the limitations of your tools to come up with a difficult result that a layman would struggle to reproduce?
There's a photography competition show on UK TV at the moment, and they spend the last few hours of every task in post-production tweaking the images and fixing all the stupid things they did in the original image (composition, layout, colouration, exposure, etc.). It drives me mad. Is that really being a good photographer? Or are you now just a graphic artist using a source photo that you happened to press the button for?
I enjoy astrophotography, but I really have no interest in "faking" things, even from "real" data. Even image-stacking I find uninteresting. And when people superimpose their captured meteor storm over the top of a ground-scenery image, it drives me mad. My prized personal photo is actually one of the Moon that I took afocally (i.e. shoving a camera down the telescope eyepiece and capturing what I could see with my eye).
So although I see the use for science, HDR is really just the fad that they tried to push on me with Half-Life 2 as well. It makes unrealistic images that don't look like what I'd see if I looked at the real scene they're depicting.
(Score: 2) by Scruffy Beard 2 on Tuesday August 09 2016, @03:12PM
You need unrealistic processing to see HDR images on a non-HDR monitor.
Maybe that is the benefit of 4k resolution: you can "fake" HDR with dithering.
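A toy sketch of that dithering idea, assuming nothing about any particular panel: quantize a smooth ramp to a handful of levels, once plainly and once with noise added first, then compare how close each gets to the original after local averaging (roughly what the eye does across neighbouring pixels).

    # Toy demo of "faking" extra tonal depth with dithering: add noise before
    # quantizing, and the banding turns into fine-grained noise that averages
    # back toward the smooth ramp -- the trade a high-resolution panel can make.
    # The numbers (4096-step ramp, 4-bit display) are arbitrary for illustration.
    import numpy as np

    rng = np.random.default_rng(0)
    gradient = np.tile(np.linspace(0.0, 1.0, 4096), (64, 1))  # smooth horizontal ramp
    levels = 16  # pretend the display only offers 4 bits per channel
    step = 1.0 / (levels - 1)

    # Plain quantization: 16 flat bands.
    banded = np.round(gradient / step) * step

    # Dithered quantization: roughly one quantization step of noise, then round.
    noise = (rng.random(gradient.shape) - 0.5) * step
    dithered = np.clip(np.round((gradient + noise) / step) * step, 0.0, 1.0)

    def block_mean(img, k=16):
        # Average k neighbouring pixels along the ramp, mimicking how the eye
        # (or a downscale) blends fine detail.
        return img.reshape(img.shape[0], -1, k).mean(axis=2)

    target = block_mean(gradient)
    print("error after averaging, banded:  ", np.abs(block_mean(banded) - target).mean())
    print("error after averaging, dithered:", np.abs(block_mean(dithered) - target).mean())
    # The dithered version comes out several times closer to the original ramp.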
(Score: 2) by bob_super on Tuesday August 09 2016, @04:37PM
It's definitely highly useful for science. I saw the video of a space shuttle launch filmed at a higher bit depth, then processed to show both the machine and the subtleties of each flame. That was both amazing and highly informative for the ground guys.
My phone taking three pictures and showing both the campfire and the moon is a bit cheating, but it fits in my pocket and produces an output that matches reality. I wouldn't try to claim any artistic value, obviously.
(Score: 0, Disagree) by Anonymous Coward on Wednesday August 10 2016, @08:06AM
My phone taking three pictures and showing both the campfire and the moon is a bit cheating, but it fits in my pocket and produces an output that matches reality.
Matches the imaginary image in your brain, not reality. Your eyes and brain adjust the brightness levels depending on whether you're looking at the campfire or the moon. Your pupils close and open accordingly, your retina gets less sensitive to bright stuff and there's lots of "postprocessing" (which explains some optical illusions).
(Score: 2) by bob_super on Wednesday August 10 2016, @04:28PM
So, what's reality to you? A cat will see the same scene with much higher brightness, a hawk with much higher resolution.
When I look out of my dark room through the window, and can see both the birds in the sky and the bumps on the inside wall, that's my reality. The fact that a camera needs HDR to produce the same image doesn't make it not real, it just points out how the human visual processing system has higher dynamic range than the camera.
(Score: 0) by Anonymous Coward on Tuesday August 09 2016, @11:01PM
With old-skool film (b&w) you can do it.
First you expose the film in the camera and develop it; then you put the 35mm film in another "camera" and project it onto non-transparent photosensitive paper, which you "wash" in some special smelly liquid and then, after rinsing the paper, fix in another liquid.
Since two "cameras" are used, i would say for the moon and clouds you would pick a small but lengthy aperature for 35mm but a big aperature and short exposure time for the paper?