We all know what the problem is with the current crop of AR "glasses": bulky devices which cause you to discover all kinds of new and unknown neck muscles the longer you wear them.
But ho-ho-ho. A team at Stanford University, the University of Hong Kong and NVIDIA has now worked something out, leaning massively on machine learning and sci-fi stuff called -- how unoriginal -- optical metasurfaces. Metasurfaces, in case you didn't know, are components engineered to bend light in unusual ways. Research on metasurfaces and other metamaterials has led to, among other discoveries, invisibility cloaks that can hide objects from light, sound, heat, and other types of waves. Whoa.
In the boffins' own words, they created
a unique combination of inverse-designed full-colour metasurface gratings, a compact dispersion-compensating waveguide geometry and artificial-intelligence-driven holography algorithms. These elements are co-designed to eliminate the need for bulky collimation optics between the spatial light modulator and the waveguide and to present vibrant, full-colour, 3D AR content in a compact device form factor. To deliver unprecedented visual quality with our prototype, we develop an innovative image formation model that combines a physically accurate waveguide model with learned components that are automatically calibrated using camera feedback. Our unique co-design of a nanophotonic metasurface waveguide and artificial-intelligence-driven holographic algorithms represents a significant advancement in creating visually compelling 3D AR experiences in a compact wearable device.
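The most code-shaped piece of that abstract is the "artificial-intelligence-driven holography" bit, so here's a rough idea of what that kind of loop looks like. Holographic displays in this vein are typically driven by optimizing the phase pattern on a spatial light modulator (SLM) via gradient descent through a differentiable simulation of the optics, with the simulation's learned parts calibrated against camera photos of the real hardware. Below is a minimal PyTorch sketch of that optimization loop. Caveats in bold crayon: this is not the paper's pipeline -- the free-space angular-spectrum propagator, the identity "correction" fields and every numeric value are invented stand-ins for the authors' physically accurate waveguide model and camera-calibrated learned components.

```python
# Illustrative sketch only -- NOT the paper's method. Shows the generic
# "optimize a phase hologram through a differentiable optics model" idea:
# simulate the image a candidate SLM phase pattern produces, compare it
# to the target, backpropagate, repeat.
import torch

def angular_spectrum(field, wavelength, distance, pitch):
    """Propagate a complex field over `distance` (angular spectrum method)."""
    n, m = field.shape
    fy = torch.fft.fftfreq(n, d=pitch)
    fx = torch.fft.fftfreq(m, d=pitch)
    fyy, fxx = torch.meshgrid(fy, fx, indexing="ij")
    arg = 1.0 / wavelength**2 - fxx**2 - fyy**2
    kz = 2 * torch.pi * torch.sqrt(torch.clamp(arg, min=0.0))
    transfer = torch.exp(1j * distance * kz) * (arg > 0)  # drop evanescent waves
    return torch.fft.ifft2(torch.fft.fft2(field) * transfer)

# Target image the viewer should see (random stand-in data here).
target = torch.rand(256, 256)

# Stand-ins for the learned, camera-calibrated corrections: in a real
# system these would be fit beforehand so the simulation matches
# photographs of the physical display. Here they are just identities.
amp_corr = torch.ones(256, 256)
phase_corr = torch.zeros(256, 256)

# Phase-only SLM pattern, optimized by gradient descent.
phase = torch.zeros(256, 256, requires_grad=True)
opt = torch.optim.Adam([phase], lr=0.05)

wavelength, distance, pitch = 520e-9, 1e-3, 8e-6  # green, 1 mm, 8 um pixels (made up)

for step in range(500):
    opt.zero_grad()
    slm_field = amp_corr * torch.exp(1j * (phase + phase_corr))
    recon = angular_spectrum(slm_field, wavelength, distance, pitch).abs()
    loss = torch.nn.functional.mse_loss(recon, target)
    loss.backward()
    opt.step()
```

The paper's contribution, roughly, is making the simulated-optics box in that loop faithful enough -- a physical waveguide model plus learned, camera-calibrated corrections -- that the optimized hologram actually looks right through the real metasurface waveguide.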
Again: whoa -- them scientists created AR glasses as thin as ordinary glasses.
There are still a few problems, though -- the usable field of view is rather limited, for example -- but I'll leave it to the engineering types around here to pick the team's Nature article [nature.com] apart. For everyone else, there's a slightly more readable IEEE Spectrum article [ieee.org] to enjoy.
Me, I'm going to brush up on Doom III.