Photography website PetaPixel reports that researchers affiliated with Google and MIT have devised an algorithm that automatically removes reflections and obstructions from photographs, provided multiple frames are captured. Even more intriguing: this algorithm can also recover a reflected image.
By comparing the different shots, the algorithm detects and extracts the differences between the reflection or obstruction and the scene behind it, producing one clear photo of the obstruction-free background scene and one clear photo of the extracted obstruction (e.g. a reflection or a fence).
And that's where things get even crazier: the algorithm can also produce a clear photo of what the reflection itself shows....
The MIT Technology Review provides more details about the algorithm:
Michael Rubinstein, a research scientist at Google who worked as a postdoctoral researcher at Microsoft Research while some of the work was conducted, says the basic principle behind the algorithm is the phenomenon of motion parallax....
Tianfan Xue, lead author of the paper and a graduate student at MIT, says that beyond reflections in windows and chain-link fences, the algorithm can also correct for other kinds of obstructions on windows, such as raindrops or dirt.
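The motion-parallax idea is that the background and the reflection sit at different depths, so they shift by different amounts between frames, and most background pixels show through cleanly in at least one frame. A minimal NumPy sketch of that separation idea follows. To be clear, this is not the authors' actual optimization-based algorithm: the per-pixel-minimum heuristic, the synthetic frames, and the assumption of pre-aligned frames with an additive reflection are all simplifications for illustration.

```python
import numpy as np

def separate_layers(frames):
    """Toy layer separation for frames already aligned to the background.

    Heuristic: a reflection only adds light, so the per-pixel minimum
    across aligned frames approximates the reflection-free background,
    and each frame's residual approximates its reflection layer.
    """
    stack = np.stack(frames).astype(float)   # shape (N, H, W)
    background = stack.min(axis=0)           # darkest value seen per pixel
    reflections = stack - background         # per-frame reflection layers
    return background, reflections

# Synthetic demo: a fixed background plus a bright patch (the "reflection")
# that moves between frames, mimicking motion parallax.
rng = np.random.default_rng(0)
bg = rng.uniform(0.2, 0.8, size=(32, 32))
frames = []
for shift in (0, 8, 16, 24):                 # patch position changes per frame
    refl = np.zeros((32, 32))
    refl[:, shift:shift + 8] = 0.3
    frames.append(bg + refl)

est_bg, est_refl = separate_layers(frames)
print(float(np.abs(est_bg - bg).max()))      # -> 0.0 (background fully recovered)
```

Because the patch occupies a different strip of columns in each frame, every pixel is reflection-free in at least one frame, so the minimum recovers the background exactly in this toy case; the real algorithm instead solves a joint motion-and-appearance optimization that tolerates imperfect alignment.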
Fairly impressive! Does this have the potential to become a new standard tool for photographers, or will this appeal primarily to cameraphone-toting consumers?
(Score: 2) by TheLink on Thursday August 06 2015, @07:01PM
But things should start to get safer once they've added this, the other stuff they've missed, and the other cool stuff they should do - e.g. mounting some cameras/sensors at near bumper height to detect moving objects (people, especially children, animals, etc.) behind tall vehicles by looking _under_ those vehicles.