The problem with teleological narratives is that they make us ignore the fact that change is often made by people who didn’t intend that change to happen…
Tagged with “singularity” (29)
In this thought-provoking episode we tackle the future of AI, the singularity, racism in science fiction, space opera, and so much more. An interview with Charles Stross. Music: Kevin MacLeod. Blind Panels is the inclusive geek podcast. It’s created by Comics Empower, the comic book store for the blind and the visually impaired. Check it out: ComicsEmpower.com
Original video: https://soundcloud.com/user-817263528/ep-38-the-future-of-science-fiction-part-3charles-stross
Cory Doctorow and Charles Stross have just signed with Tor Books to co-author a fix-up novel based on a series of short stories called Rapture of the Nerds. The authors and their editor told us what to expect:
Cory and Charlie intend to write a third novella in the sequence begun with "Jury Service" and "Appeals Court," and THE RAPTURE OF THE NERDS will consist of all three novellas, possibly with some small additional connective tissue if necessary.
Many distinguished SF "novels" have actually been stitched together from short-fiction series like this; the venerable industry term for such a book is "fix-up", which doesn’t imply anything deprecatory.
Are we nearing the singularity, the point where philosophers say the computer programs we create will be smarter than us?
Artificial intelligence is all around us: in phones, in cars, in our homes. Voice recognition systems, predictive algorithms, GPS. Sometimes they may not work very well, but they are improving all the time; you might even say they are learning.
Come on an entertaining journey through the ethics of artificial intelligence (AI), the science behind intelligent computer programs, and robotics. Some software engineers think about the philosophy of the artificial intelligence they are creating; others really don’t care.
You’ll also hear two very human AI stories: the strange tale of the robotic resurrection of science fiction author Philip K. Dick, and the resulting android that made headlines around the world; and Karen Jacobsen, an Australian woman with a talent for making voice systems warm and human-like, which has brought her international fame as the voice of Siri and given the Aussie accent some much-needed street cred.
You know her well - she’s in your phone, in your car and in your home. Is the singularity nigh?
James Barrat, author of Our Final Invention: Artificial Intelligence and the End of the Human Era, discusses the future of Artificial Intelligence (AI). Barrat takes a look at how to create friendly AI with human characteristics, which other countries are developing AI, and what we could expect with the arrival of the Singularity. He also touches on the evolution of AI and how companies like Google and IBM and government entities like DARPA and the NSA are developing artificial general intelligence devices right now.
Vinge began by declaring that he still believes that a Singularity event in the next few decades is the most likely outcome— meaning that self-accelerating technologies will speed up to the point of so profound a transformation that the other side of it is unknowable.
And this transformation will be driven by Artificial Intelligences (AIs) that, once they become self-educating and self-empowering, soar beyond human capacity with shocking suddenness.
He added that he is not convinced by the fears of some that the AIs would exterminate humanity.
He thinks they would be wise enough to keep us around as a fallback and backup— intelligences that can actually function without massive connectivity!
(Later in the Q&A I asked him about the dangerous period when AIs are smart enough to exterminate us but not yet wise enough to keep us around.
How long would that period be?
“About four hours,” said Vinge.)
Since a Singularity makes long-term thinking impractical, Vinge was faced with the problem of how to say anything useful in a Seminar About Long-term Thinking, so he came up with a plausible set of scenarios that would be Singularity-free.
He noted that they all require that we achieve no faster-than-light space travel.
The overall non-Singularity condition he called “The Age of Failed Dreams.”
The main driver is that software simply continues failing to keep pace with hardware improvements.
One after another, enormous billion-dollar software projects simply do not run, as has already happened at the FBI, in air traffic control, at the IRS, and elsewhere.
Some large automation projects fail catastrophically, with planes running into each other.
So hardware development eventually lags, and materials research lags, and no strong AI develops.
To visually differentiate his three sub-scenarios, Vinge showed a graph ranging over the last 50,000 and next 50,000 years, with power (in maximum discrete sources) plotted against human population on a log-log scale.
Thus the curve begins at the lower left with human power of 0.3 kilowatts and under a hundred thousand population, curves up through steam engines with one megawatt of power and a billion population, up further to present plants generating 13 gigawatts.
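The shape of that curve can be roughed out from the three data points quoted above. A minimal Python sketch (the present-day population figure is an assumption, not stated in the text) shows where each point sits on Vinge’s log-log axes:

```python
import math

# Rough data points read off the talk summary:
# (human population, maximum discrete power source in watts)
points = {
    "early humans":  (1e5, 300.0),  # ~0.3 kW, under 100,000 people
    "steam engines": (1e9, 1e6),    # 1 MW, ~1 billion people
    "modern plants": (7e9, 13e9),   # 13 GW plants today (~7 billion people assumed)
}

# On a log-log plot each point sits at (log10 population, log10 watts)
for label, (pop, watts) in points.items():
    print(f"{label:13s}  log10(pop) = {math.log10(pop):5.2f}"
          f"   log10(W) = {math.log10(watts):5.2f}")
```

Even this crude tabulation shows why Vinge used a log-log scale: both axes span five or more orders of magnitude, and the curve still bends upward at the right edge.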
His first scenario was a bleak one called “A Return to MADness.”
Driven by increasing environmental stress (that a Singularity might have cured), nations return to nuclear confrontation and policies of “Mutually Assured Destruction.”
One “bad afternoon,” it all plays out, humanity blasts itself back to the Stone Age and then gradually dwindles to extinction.
His next scenario was a best-case alternative named “The Golden Age,” where population stabilizes around 3 billion, and there is a peaceful ascent into “the long, good time.”
Humanity catches on that the magic ingredient is education, and engages the full plasticity of the human psyche, empowered by hope, information, and communication.
A widespread enlightened populism predominates, with the kind of tolerance and wise self-interest we see embodied already in Wikipedia.
One policy imperative of this scenario would be a demand for research on “prolongevity”— “Young old people are good for the future of humanity.”
Far from deadening progress, long-lived youthful old people would have a personal stake in the future reaching out for centuries, and would have personal perspective reaching back for centuries.
The final scenario, which Vinge thought the most probable, he called “The Wheel of Time.”
Catastrophes and recoveries of various amplitudes follow one another.
Enduring heroes would be archaeologists and “software dumpster divers” who could recover lost tools and techniques.
What should we do about the vulnerabilities in these non-Singularity scenarios?
Vinge’s main concern is that we are running only one, perilously narrow experiment on Earth.
“The best hope for long-term survival is self-sufficient off-Earth settlements.”
We need a real space program focused on bringing down the cost of getting mass into space, instead of “the gold-plated sham” of present-day NASA.
There is a common critique that there is no suitable place for humans elsewhere in the Solar System, and the stars are too far.
“In the long now,” Vinge observed, “the stars are not too far.”
(Note: Vinge’s detailed notes for this talk, and the graphs, may be found online at: http://rohan.sdsu.edu/faculty/vinge/longnow/index.htm ) —Stewart Brand
Noted author and futurist Vernor Vinge is surprisingly optimistic when it comes to the prospect of civilization collapsing.
“I think that [civilization] coming back would actually be a very big surprise,” he says in this week’s episode of the Geek’s Guide to the Galaxy podcast. “The difference between us and us 10,000 years ago is … we know it can be done.”
Vinge has a proven track record of looking ahead. His 1981 novella True Names was one of the first science fiction stories to deal with virtual reality, and he also coined the phrase “the Technological Singularity” to describe a future point at which technology creates intelligences beyond our comprehension. The term is now in wide use among futurists.
But could humanity really claw its way back after a complete collapse? Haven’t we plundered the planet’s resources in ways that would be impossible to repeat?
“I disagree with that,” says Vinge. “With one exception — fossil fuels. But the stuff that we mine otherwise? We have concentrated that. I imagine that ruins of cities are richer ore fields than most of the natural ore fields we have used historically.”
That’s not to say the collapse of civilization is no big deal. The human cost would be horrendous, and there would be no comeback at all if the crash leaves no survivors. A ravaged ecosphere could stymie any hope of rebuilding, as could a disaster that destroys even the ruins of cities.
“I am just as concerned about disasters as anyone,” says Vinge. “I have this region of the problem that I’m more optimistic about than some people, but overall, avoiding existential threats is at the top of my to-do list.”
A.I., artificial intelligence, has had a big run in Hollywood. The computer HAL in Kubrick’s “2001” was fiendishly smart, and there have been plenty of robots and server farms beyond HAL. Real-life A.I. has had a tougher launch over the decades. But slowly, gradually, it has certainly crept into our lives.
Think of all the “smart” stuff around you. Now an explosion in Big Data is driving new advances in “deep learning” by computers. And there’s a new wave of excitement.
Guests: Yann LeCun, professor of Computer Science, Neural Science, and Electrical and Computer Engineering at New York University.
Peter Norvig, director of research at Google Inc.
One reason lots of people don’t want to think long term these days is because technology keeps accelerating so rapidly, we assume the world will become unrecognizable in a few years and then move on to unimaginable. Long-term thinking must be either impossible or irrelevant.
The commonest shorthand term for the runaway acceleration of technology is “the Singularity”—a concept introduced by science fiction writer Vernor Vinge in 1984. The term has been enthusiastically embraced by technology historians, futurists, extropians, and various trans-humanists and post-humanists, who have generated variants such as “the techno-rapture,” “the Spike,” etc.
It takes a science fiction writer to critique a science fiction idea.
Along with being one of America’s leading science fiction writers and technology journalists, Bruce Sterling is a celebrated speaker armed with lethal wit. His books include The Zenith Angle (just out), Hacker Crackdown, Holy Fire, Distraction, Mirrorshades (cyberpunk compendium), Schismatrix, The Difference Engine (with William Gibson), Tomorrow Now, and Islands in the Net.
The Seminar About Long-term Thinking on June 10-11 was Bruce Sterling examining “The Singularity: Your Future as a Black Hole.” He treated the subject of hyper-acceleration of technology as a genuine threat worth alleviating and as a fond fantasy worth cruel dismemberment.
Legendary comics author and novelist Warren Ellis joins me on The DisinfoCast for a conversation about the future that was, artificial intelligence, the Singularity, aliens (ancient and otherwise), the legacy of Hunter S. Thompson, porn and even a little bit about comic books. Tune in.