It's 2019, the Year Blade Runner Takes Place: I Can Has Flying Cars?

Accepted submission by upstart at 2019-01-02 04:29:24
/dev/random


Submitted via IRC for Bytram

It's 2019, the year Blade Runner takes place: I can has flying cars? [theregister.co.uk]

Feature Welcome to 2019, the year in which Ridley Scott's 1982 sci-fi film masterpiece Blade Runner [imdb.com] is set. And as predicted in this loose adaptation of a 1968 Philip K. Dick story, we have flying cars.

The reason you don't have a flying car was explained by author William Gibson, who famously observed, more or less [quoteinvestigator.com], "The future is already here – it's just not evenly distributed."

If you're Sebastian Thrun, you've already flown in Kitty Hawk's Flyer [youtube.com], which is more flying boat than flying car. If you're not, chances are you will have to wait a bit longer to live your sci-fi noir transport fantasy.

The flying cars of Blade Runner are called spinners. The name suggests rotor-powered flight, but according to designer Syd Mead [youtube.com], the fictional road-and-sky craft relied on a system called an aerodyne. Press kits from the 1982 release reportedly describe [archive.org] the cars flying through a combination of internal combustion, jet, and anti-gravity technology.

Flying cars have been foretold for decades. In 1940, Henry Ford said, "Mark my words. A combination of airplane and motorcar is coming. You may smile. But it will come."

That's how consulting biz Deloitte opened its January report (PDF) [deloitte.com] on passenger drones and flying cars. The report recounts efforts to develop various flying car prototypes and predicts commercial availability by 2020.

There have been some false starts. The Moller Skycar [theregister.co.uk], which received widespread coverage two decades ago during the first dot-com boom, never made it past prototype.

But startups like EHang [youtube.com] and Volocopter [volocopter.com], more established firms like Uber, and aerospace giants like Airbus [airbus-sv.com] are all developing flying cars of a sort. The first models to hit the market might more accurately be described as personal helicopters. Terrafugia's Transition [terrafugia.com] may not have quite the visual panache of a Syd Mead design, but it qualifies as a flying car. It's supposed to ship in 2019.

The real hangup isn't the technology (except for slow battery advancement); it's the regulation and integration into existing infrastructure. Given that Blade Runner is set in November 2019, we can wait a few months for the rules to be ironed out.

In the story that inspired Blade Runner, Philip K. Dick's "Do Androids Dream of Electric Sheep?", protagonist Rick Deckard wants to buy a live animal to replace his electric sheep. Artificial pets show up in the film too, in the form of a fake snake and genetic designer J.F. Sebastian's various creations.

We've had artificial pets for years, in the form of Sony's Aibo, Ugobe's Pleo, and less sophisticated mechanical contraptions, not to mention virtual pets like Tamagotchi. And let's not forget the pet rock.

Artificial pets may have some therapeutic benefit for certain conditions like autism, but it's hard to imagine that we will ever overcome our preference for living things, aka biophilia [pbs.org], and prefer the company of machines. Part of developing a relationship with an animal is knowing that there's no off switch to get it to stop pestering you to take it out for a walk.

To develop leads in his search for the missing replicants, Deckard in the film version of the story uses a device referred to in the script as the Esper Machine. Speaking voice commands to the machine, he is able to zoom in on and enhance obscure details in a photo. The filmmakers weren't visionary enough to anticipate the touchscreen; voice interaction has entertainment value but isn't great for precise picture edits.

Since Blade Runner, released eight years before Adobe Photoshop 1.0, more than a few movies and TV shows [youtube.com] have depicted photo enhancement efforts. Today, innovations like computational photography, light field photography, and cameras with multiple lenses and sensors provide extra data about images, allowing people to go beyond the Esper Machine.

Open source neural network code [github.com] is now readily available [github.com] to upscale images [letsenhance.io]. Websites offer upscaling as a service. Rotating perspective around an arbitrary point doesn't work all that well without multiple source perspectives – using AI code to guess about occluded image data isn't as reliable as recorded image data from an offset lens. But yesterday's photo enhancement fiction is pretty much today's reality.
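The gap between classical upscaling and neural "enhancement" is easy to see in code. As a minimal, hypothetical sketch (plain nearest-neighbor interpolation on a grayscale pixel grid, no neural network involved), this is the baseline that super-resolution models improve on by predicting plausible detail rather than merely duplicating pixels:

```python
def upscale_nearest(pixels, factor):
    """Nearest-neighbor upscaling: every source pixel becomes a
    factor x factor block in the output. No new information is
    created, which is why zoomed images look blocky; neural
    super-resolution instead hallucinates plausible detail.

    pixels: list of rows, each row a list of grayscale values.
    """
    out = []
    for row in pixels:
        # Stretch the row horizontally, then repeat it vertically.
        expanded = [p for p in row for _ in range(factor)]
        out.extend([expanded[:] for _ in range(factor)])
    return out


# A 2x2 grayscale "photo" blown up to 4x4.
image = [[0, 255],
         [128, 64]]
bigger = upscale_nearest(image, 2)
```

Each source pixel simply repeats, so `bigger` is a 4x4 grid of 2x2 blocks; any occluded or sub-pixel detail the Esper Machine would reveal simply isn't there to recover.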

The Voight-Kampff machine depicted in Blade Runner is used to measure the subject's empathy response to questions designed to evoke an emotional response. The idea is that replicants and humans will respond differently and those differences can be spotted by measuring capillary dilation and involuntary eye fluctuations.

In terms of emotional assessment, real-world analogs already exist, namely the International Affective Picture System [ufl.edu], to say nothing of polygraph tests. Neuroimaging, via fMRI scans, is also being explored for lie detection.

But mostly, we use CAPTCHAs to separate people from robots. These "completely automated public Turing tests to tell computers and humans apart" don't do their job very well, as can be seen from the number of fake accounts on social networks and popular online services. It wasn't easy to distinguish people from machines in Blade Runner either.
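For illustration, the core of a classic text CAPTCHA is just a random challenge plus a lenient comparison. This toy sketch is an assumption for demonstration, not the API of any real CAPTCHA service, and it omits the image distortion that (briefly) made such tests hard for machines:

```python
import random
import string


def make_captcha(length=6, rng=None):
    """Generate a random challenge string, skipping glyphs like
    0/O and 1/I/L that humans confuse once the text is distorted."""
    rng = rng or random.Random()
    alphabet = [c for c in string.ascii_uppercase + string.digits
                if c not in "0O1IL"]
    return "".join(rng.choice(alphabet) for _ in range(length))


def check_captcha(challenge, response):
    """Case-insensitive, whitespace-tolerant comparison of the
    user's answer against the challenge."""
    return response.strip().upper() == challenge
```

The weakness the article describes lives outside this code: once OCR models read distorted text better than people do, the underlying challenge stops discriminating at all.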

Blimps bedecked with ads preceded Blade Runner and can still be found. As the Washington Post recently put it [washingtonpost.com], "the modern airship business is the ad business." Today's ad-laden airships don't quite match the jumbotron-toting, searchlight-spraying excess depicted in Blade Runner, but the filmic achievement was long ago unlocked.

Blade Runner, we're told in the opening credits, is set after colonies have been established on other planets. The best we've managed to date is the International Space Station. No, Starman [twitter.com] does not count as sustained human presence.

In Blade Runner, Tyrell Corporation genetic designer Hannibal Chew creates eyes for replicants. Presumably, he grows them from genetic blueprints. Work on lab-grown organs is underway. In 2012, a windpipe created from stem cells [cbsnews.com] was grown and implanted in a 2-year-old girl.

The most visible technological innovation depicted in Blade Runner, intelligent androids or replicants, remains a pipe dream. That's likely to be the case for the next few decades at least, and not just because of battery technology limitations [theregister.co.uk] or the uncertainty about how to get from machine learning to general artificial intelligence.

"Perhaps it's going to take 30 years, perhaps more, to reach human-level AI," said [youtube.com] Yann LeCun, Facebook's chief AI researcher, during a presentation earlier this year. He has suggested that AI systems at present fall short of rats in terms of overall intelligence.

Machines can already surpass humans in specific intellectual tasks, like playing chess or Go. But no one is sure how long it will take before we have an artificial brain that can learn on its own and reason about the world at a level comparable to humans.

What's more, it's not clear anyone would really want to build such a system except to prove it could be done. By definition, an artificial intelligence might disagree with its maker and refuse to cooperate. Imagine a smart bomb that refuses to explode due to ethical concerns. Or a smart fridge that refuses to surrender its cheesecake out of concern for your health. Or a smart car with its own ideas about where you should go.

No one wants tools with minds of their own; they want compliant tools that perform predictably. If robots are not obedient, they'll get hunted down and retired by blade runners. We'll settle for machine-gun-toting land drones, along the lines of what Boston Dynamics has been developing, operating semi-autonomously – with a remote operator for kill decisions.

But let's assume for a moment the wetware were available and someone wanted a freethinking robot. The hardware to house it, said Hod Lipson, a professor of mechanical engineering and data science at Columbia University in an interview last year with Live Science [livescience.com], could take a century.

"No one has any clue how to make a machine that's nimble, that can store power inside and walk for days," he said.

We have no idea how long it will take to get there. Who does? ®


Original Submission