Nvidia Algorithm Turns 2D Photos Into a 3D Scene "Almost Instantly"

Accepted submission by takyon at 2022-03-26 14:05:06
Software

Nvidia shows off AI model that turns a few dozen snapshots into a 3D-rendered scene [theverge.com]

Nvidia's latest AI demo is pretty impressive: a tool that quickly turns a "few dozen" 2D snapshots into a 3D-rendered scene. In the video below you can see the method in action, with a model dressed like Andy Warhol holding an old-fashioned Polaroid camera. (Don't overthink the Warhol connection: it's just a bit of PR scene dressing.)

The tool is called Instant NeRF, referring to "neural radiance fields [matthewtancik.com]" — a technique developed by researchers from UC Berkeley, Google Research, and UC San Diego in 2020. If you want a detailed explainer of neural radiance fields, you can read one here [medium.com], but in short, the method maps the color and light intensity of different 2D shots, then generates data to connect these images from different vantage points and render a finished 3D scene. In addition to images, the system requires data about the position of the camera.
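
Since the paragraph above only gestures at how a radiance field turns camera positions and images into a rendered view, here is a minimal, self-contained sketch of the volume-rendering step at the heart of the NeRF idea. The radiance_field function is a hard-coded toy stand-in for the trained neural network (a glowing sphere), and the camera pose, sample counts, and scene bounds are illustrative assumptions, not Nvidia's implementation.

```python
import numpy as np

# Toy radiance field: stands in for the trained network, which maps a 3D point
# (and viewing direction) to an RGB color and a volume density (sigma).
# Here a glowing sphere at the origin is hard-coded purely for illustration.
def radiance_field(points, view_dir):
    dist = np.linalg.norm(points, axis=-1, keepdims=True)
    density = np.where(dist < 1.0, 5.0, 0.0)        # opaque inside the sphere
    color = np.clip(0.5 + 0.5 * points, 0.0, 1.0)   # color varies with position
    return color, density

# Volume rendering: march along one camera ray, query the field at sample
# points, and alpha-composite the colors weighted by accumulated transmittance.
def render_ray(origin, direction, near=0.0, far=4.0, n_samples=128):
    t = np.linspace(near, far, n_samples)
    points = origin + t[:, None] * direction        # sample points on the ray
    color, sigma = radiance_field(points, direction)
    delta = np.diff(t, append=far)[:, None]         # spacing between samples
    alpha = 1.0 - np.exp(-sigma * delta)            # opacity of each segment
    trans = np.cumprod(1.0 - alpha + 1e-10, axis=0) # light surviving so far
    trans = np.roll(trans, 1, axis=0)
    trans[0] = 1.0
    weights = alpha * trans
    return (weights * color).sum(axis=0)            # final pixel color

# One ray from a camera two units back along -z, looking toward the origin.
pixel = render_ray(np.array([0.0, 0.0, -2.0]), np.array([0.0, 0.0, 1.0]))
print(pixel)
```

Repeating this per pixel gives a rendered image from any camera pose, which is why the training data must include camera positions: the network's predictions are compared against the real photos taken from those same viewpoints.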

Researchers have been improving this sort of 2D-to-3D model for a couple of years now [newscientist.com], adding more detail to finished renders and increasing rendering speed. Nvidia says its new Instant NeRF model is one of the fastest yet developed, cutting rendering from a process that takes a few minutes to one that finishes "almost instantly."

Also at Tom's Hardware [tomshardware.com] and PetaPixel [petapixel.com].

Previously: Breakthrough AI Technique Enables Real-Time Rendering of Scenes in 3D From 2D Images [soylentnews.org]


Original Submission