The BBC published a rambling report on AI and tech billionaires building large, fully autonomous "basements" in various locations. I love the quote: "I once met a former bodyguard of one billionaire with his own 'bunker', who told me his security team's first priority, if this really did happen, would be to eliminate said boss and get in the bunker themselves. And he didn't seem to be joking."
Mark Zuckerberg is said to have started work on Koolau Ranch, his sprawling 1,400-acre compound on the Hawaiian island of Kauai, as far back as 2014
It is set to include a shelter, complete with its own energy and food supplies, though the carpenters and electricians working on the site were banned from talking about it by non-disclosure agreements, according to a report by Wired magazine.
Asked last year if he was creating a doomsday bunker, the Facebook founder gave a flat "no". The underground space spanning some 5,000 square feet is, he explained, "just like a little shelter, it's like a basement".
Then there is the speculation around other tech leaders, some of whom appear to have been busy buying up chunks of land with underground spaces, ripe for conversion into multi-million pound luxury bunkers.
Reid Hoffman, the co-founder of LinkedIn, has talked about "apocalypse insurance". This is something about half of the super-wealthy have, he has previously claimed, with New Zealand a popular destination for homes.
So, could they really be preparing for war, the effects of climate change, or some other catastrophic event the rest of us have yet to know about?
In the last few years, the advancement of artificial intelligence (AI) has only added to that list of potential existential woes. Many are deeply worried at the sheer speed of the progression.
Ilya Sutskever, chief scientist and a co-founder of OpenAI, is reported to be one of them.
In a meeting, Mr Sutskever suggested to colleagues that they should dig an underground shelter for the company's top scientists before such a powerful technology was released on the world, [...] according to a book by journalist Karen Hao.
"We're definitely going to build a bunker before we release AGI," he's widely reported to have said, though it's unclear who he meant by "we".
What's more, it's unlikely to arrive as a single moment. Rather, AI is a rapidly advancing technology on a journey, and there are many companies around the world racing to develop their own versions of it.
But one reason the idea excites some in Silicon Valley is that it's thought to be a precursor to something even more advanced: ASI, or artificial super intelligence - tech that surpasses human intelligence.
It was back in 1958 that the concept of "the singularity" was attributed posthumously to Hungarian-born mathematician John von Neumann. It refers to the moment when computer intelligence advances beyond human understanding.
Those in favour of AGI and ASI are almost evangelical about its benefits. It will find new cures for deadly diseases, solve climate change and invent an inexhaustible supply of clean energy, they argue.
Elon Musk has even claimed that super-intelligent AI could usher in an era of "universal high income".
"If it's smarter than you, then we have to keep it contained," warned Tim Berners Lee, creator of the World Wide Web, talking to the BBC earlier this month.
Governments are taking some protective steps. In the US, where many leading AI companies are based, President Biden passed an executive order in 2023 that required some firms to share safety test results with the federal government - though President Trump has since revoked some of the order, calling it a "barrier" to innovation.
Meanwhile in the UK, the AI Safety Institute - a government-funded research body - was set up two years ago to better understand the risks posed by advanced AI.
And then there are those super-rich with their own apocalypse insurance plans.
"Saying you're 'buying a house in New Zealand' is kind of a wink, wink, say no more," Reid Hoffman previously said. The same presumably goes for bunkers.
But there's a distinctly human flaw.
I once met a former bodyguard of one billionaire with his own "bunker", who told me his security team's first priority, if this really did happen, would be to eliminate said boss and get in the bunker themselves. And he didn't seem to be joking.
Neil Lawrence is a professor of machine learning at Cambridge University. To him, this whole debate in itself is nonsense.
"The notion of Artificial General Intelligence is as absurd as the notion of an 'Artificial General Vehicle'," he argues.
"The right vehicle is dependent on the context. I used an Airbus A350 to fly to Kenya, I use a car to get to the university each day, I walk to the cafeteria... There's no vehicle that could ever do all of this."
"The technology we have [already] built allows, for the first time, normal people to directly talk to a machine and potentially have it do what they intend. That is absolutely extraordinary... and utterly transformational.
Current AI tools are trained on mountains of data and are good at spotting patterns: whether tumour signs in scans or the word most likely to come after another in a particular sequence. But they do not "feel", however convincing their responses may appear.
Ultimately, though, no matter how intelligent machines become, biologically the human brain still wins. It has about 86 billion neurons and 600 trillion synapses, many more than the artificial equivalents.
"If you tell a human that life has been found on an exoplanet, they will immediately learn that, and it will affect their world view going forward. For an LLM [Large Language Model], they will only know that as long as you keep repeating this to them as a fact," says Mr Hodjat.
"LLMs also do not have meta-cognition, which means they don't quite know what they know. Humans seem to have an introspective capacity, sometimes referred to as consciousness, that allows them to know what they know."
It is a fundamental part of human intelligence - and one that is yet to be replicated in a lab.
(Score: 5, Insightful) by Thexalon on Monday October 20, @11:00AM (5 children)
I mean, near the top of my personal list has to be climate change: Not only are we well and truly screwed right now, but most people with enough power to have a significant impact on the problem have responded with, in essence, "whatever, let everything burn, I'll be long gone before it all happens anyways". Among other things, that's why some of the ultra-rich are trying to get to Mars or at least the Moon in some kind of permanent way.
"Think of how stupid the average person is. Then realize half of 'em are stupider than that." - George Carlin
(Score: 1) by spiraldancing on Monday October 20, @11:37AM (2 children)
Depends on which day you ask me, but Climate Change and Resource Depletion are kind of tied for first place ... and oftentimes, when I'm feeling more philosophical, I even consider "The Internet" as the number one problem -- which, there is so much to unpack there, but basically, in the sense of "it got bigger, faster than we learned how to legislate/regulate it".
Also ... I am literally wearing a t-shirt right now, with that Carlin quote on it.
Let's go exploring.
(Score: 1) by DECbot on Monday October 20, @04:15PM (1 child)
I'm not too worried about resource depletion. That just means at some point it becomes cost effective to mine asteroids.
However, that would make me start worrying about the asteroid mining process, because some PHB with a Master's in Business will decide that mining asteroids would be more efficient if the asteroids were relocated to LEO for processing. Thus they'd spend trillions on the delta-V to hurl dinosaur-extinction-sized rocks at Earth. Good chance the troglodyte making the vector calculations on these resource relocations would eventually make a rounding error that would negate any current Climate Change concerns. When that happens, the billionaire bunkers of today probably won't be deep enough or provisioned enough to survive the aftermath.
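For what it's worth, the rounding-error scenario is easy to put numbers on. A back-of-the-envelope sketch (entirely made-up figures, not from any real mission design): even a 0.1% error in a modest relocation burn compounds into an enormous miss distance over a months-long transfer.

```python
# Toy estimate of how far off-target a relocated asteroid ends up when the
# computed burn vector carries a small fractional error. All numbers are
# illustrative assumptions, not real mission parameters.

burn_dv = 3000.0               # m/s - assumed magnitude of the relocation burn
relative_error = 1e-3          # 0.1% rounding error in the computed vector
transfer_time = 180 * 86400    # seconds in a six-month transfer

velocity_error = burn_dv * relative_error            # 3 m/s of unintended velocity
miss_distance_km = velocity_error * transfer_time / 1000.0

print(f"{miss_distance_km:,.0f} km off target")      # tens of thousands of km
```

That works out to roughly 46,000 km, several Earth diameters, which cuts both ways: a rounding error is as likely to turn a planned near-miss into a hit as the reverse.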
cats~$ sudo chown -R us /home/base
(Score: 2) by mcgrew on Tuesday October 21, @03:25PM
I'm not too worried about resource depletion. That just means at some point it becomes cost effective to mine asteroids.
I was thinking that the only resources that can be depleted are fossil fuels, and I doubt there's much of them on asteroids! Everything else can be reused or recycled. In a thousand years they'll be mining our landfills.
I suspect global warming will destroy us far sooner than the fossils are depleted.
Why do the mainstream media act as if Donald Trump isn't a pathological liar with dozens of felony fraud convictions?
(Score: 5, Insightful) by VLM on Monday October 20, @01:43PM (1 child)
The other, bigger problem with "climate change" is the proposed solutions are NEVER about fixing the problem, just a mixture of raw power grabs, making a profit, and occasionally feel good antics (aka prayer for atheists).
People will put up with a lot to solve problems, but when it's the usual people pulling the usual scam, their only hope is propaganda and authoritarianism.
(Score: 0) by Anonymous Coward on Friday October 24, @12:57AM
China does seem to have a chance of fixing their side of things. They are still burning more coal, but the scale of their solar and wind build-out doesn't appear to be for show or mere greenwashing. There are EVs everywhere in China. So there's a good chance that their coal and oil consumption will go down in the future.
I guess they do think more long term than the next election/quarter.