chrispederick / tags / singularity

Tagged with “singularity” (5)

  1. Vernor Vinge: What If the Singularity Does NOT Happen? - The Long Now

    Non-Singularity scenarios

    Vinge began by declaring that he still believes that a Singularity event in the next few decades is the most likely outcome— meaning that self-accelerating technologies will speed up to the point of so profound a transformation that the other side of it is unknowable.

    And this transformation will be driven by Artificial Intelligences (AIs) that, once they become self-educating and self-empowering, soar beyond human capacity with shocking suddenness.

    He added that he is not convinced by the fears of some that the AIs would exterminate humanity.

    He thinks they would be wise enough to keep us around as a fallback and backup— intelligences that can actually function without massive connectivity!

    (Later in the Q&A I asked him about the dangerous period when AIs are smart enough to exterminate us but not yet wise enough to keep us around.

    How long would that period be?

    “About four hours,” said Vinge.)

    Since a Singularity makes long-term thinking impractical, Vinge was faced with the problem of how to say anything useful in a Seminar About Long-term Thinking, so he came up with a plausible set of scenarios that would be Singularity-free.

    He noted that they all require that we achieve no faster-than-light space travel.

    The overall non-Singularity condition he called “The Age of Failed Dreams.”

    The main driver is that software simply continues failing to keep pace with hardware improvements.

    One after another, enormous billion-dollar software projects simply do not run, as has already happened at the FBI, at the IRS, in air traffic control, and elsewhere.

    Some large automation projects fail catastrophically, with planes running into each other.

    So hardware development eventually lags, and materials research lags, and no strong AI develops.

    To differentiate his three sub-scenarios visually, Vinge showed a graph spanning the last 50,000 and the next 50,000 years, with power (in maximum discrete sources) plotted against human population on a log-log scale.

    Thus the curve begins at the lower left with human power of 0.3 kilowatts and a population under a hundred thousand, curves up through steam engines with one megawatt of power and a billion population, and rises further to present-day plants generating 13 gigawatts.

    His first scenario was a bleak one called “A Return to MADness.”

    Driven by increasing environmental stress (that a Singularity might have cured), nations return to nuclear confrontation and policies of “Mutually Assured Destruction.”

    One “bad afternoon,” it all plays out, humanity blasts itself back to the Stone Age and then gradually dwindles to extinction.

    His next scenario was a best-case alternative named “The Golden Age,” where population stabilizes around 3 billion, and there is a peaceful ascent into “the long, good time.”

    Humanity catches on that the magic ingredient is education, and engages the full plasticity of the human psyche, empowered by hope, information, and communication.

    A widespread enlightened populism predominates, with the kind of tolerance and wise self-interest we see embodied already in Wikipedia.

    One policy imperative of this scenario would be a demand for research on “prolongevity”— “Young old people are good for the future of humanity.”

    Far from deadening progress, long-lived youthful old people would have a personal stake in the future reaching out for centuries, and would have personal perspective reaching back for centuries.

    The final scenario, which Vinge thought the most probable, he called “The Wheel of Time.”

    Catastrophes and recoveries of various amplitudes follow one another.

    Enduring heroes would be archaeologists and “software dumpster divers” who could recover lost tools and techniques.

    What should we do about the vulnerabilities in these non-Singularity scenarios?

    Vinge’s main concern is that we are running only one, perilously narrow experiment on Earth.

    “The best hope for long-term survival is self-sufficient off-Earth settlements.”

    We need a real space program focussed on bringing down the cost of getting mass into space, instead of “the gold-plated sham” of present-day NASA.

    There is a common critique that there is no suitable place for humans elsewhere in the Solar System, and the stars are too far.

    “In the long now,” Vinge observed, “the stars are not too far.”

    (Note: Vinge’s detailed notes for this talk, and the graphs, may be found online at: http://rohan.sdsu.edu/faculty/vinge/longnow/index.htm) —Stewart Brand

    http://longnow.org/seminars/02007/feb/15/what-if-the-singularity-does-not-happen/

    —Huffduffed by chrispederick

  2. Vernor Vinge Is Optimistic About the Collapse of Civilization

    Noted author and futurist Vernor Vinge is surprisingly optimistic when it comes to the prospect of civilization collapsing.

    “I think that [civilization] coming back would actually be a very big surprise,” he says in this week’s episode of the Geek’s Guide to the Galaxy podcast. “The difference between us and us 10,000 years ago is … we know it can be done.”

    Vinge has a proven track record of looking ahead. His 1981 novella True Names was one of the first science fiction stories to deal with virtual reality, and he also coined the phrase, “The Technological Singularity” to describe a future point at which technology creates intelligences beyond our comprehension. The term is now in wide use among futurists.

    But could humanity really claw its way back after a complete collapse? Haven’t we plundered the planet’s resources in ways that would be impossible to repeat?

    “I disagree with that,” says Vinge. “With one exception — fossil fuels. But the stuff that we mine otherwise? We have concentrated that. I imagine that ruins of cities are richer ore fields than most of the natural ore fields we have used historically.”

    That’s not to say the collapse of civilization is no big deal. The human cost would be horrendous, and there would be no comeback at all if the crash leaves no survivors. A ravaged ecosphere could stymie any hope of rebuilding, as could a disaster that destroys even the ruins of cities.

    “I am just as concerned about disasters as anyone,” says Vinge. “I have this region of the problem that I’m more optimistic about than some people, but overall, avoiding existential threats is at the top of my to-do list.”

    http://www.wired.com/underwire/2012/03/vernor-vinge-geeks-guide-galaxy/

    —Huffduffed by chrispederick

  3. Bruce Sterling: The Singularity: Your Future as a Black Hole - The Long Now

    One reason lots of people don’t want to think long term these days is that technology keeps accelerating so rapidly that we assume the world will become unrecognizable in a few years and then move on to the unimaginable. Long-term thinking must be either impossible or irrelevant.

    The commonest shorthand term for the runaway acceleration of technology is “the Singularity”—a concept introduced by science fiction writer Vernor Vinge in 1984. The term has been enthusiastically embraced by technology historians, futurists, extropians, and various trans-humanists and post-humanists, who have generated variants such as “the techno-rapture,” “the Spike,” etc.

    It takes a science fiction writer to critique a science fiction idea.

    Along with being one of America’s leading science fiction writers and technology journalists, Bruce Sterling is a celebrated speaker armed with lethal wit. His books include The Zenith Angle (just out), Hacker Crackdown, Holy Fire, Distraction, Mirrorshades (cyberpunk compendium), Schismatrix, The Difference Engine (with William Gibson), Tomorrow Now, and Islands in the Net.

    The Seminar About Long-term Thinking on June 10-11 was Bruce Sterling examining “The Singularity: Your Future as a Black Hole.” He treated the subject of hyper-acceleration of technology as a genuine threat worth alleviating and as a fond fantasy worth cruel dismemberment.

    http://longnow.org/seminars/02004/jun/11/the-singularity-your-future-as-a-black-hole/

    —Huffduffed by chrispederick

  4. Charlie Stross on Singularity 1 on 1: The World is Complicated. Elegant Narratives Explaining Everything Are Wrong!

    Want to find out why Charlie Stross thinks that the singularity, if it happens at all, may not leave any room for humans? Check out his interview for www.SingularityWeblog.com

    Today my guest on Singularity 1 on 1 is award-winning science fiction author Charles Stross. It was his seminal singularity book Accelerando that not only won the 2006 Locus Award (in addition to being a finalist for the John W. Campbell Memorial Award and on the final ballot for the Hugo Award) but was also at least in part responsible for my launching SingularitySymposium.com and SingularityWeblog.com.

    During my conversation with Charlie we discuss issues such as: his early interest in and love for science fiction; his work as a “code monkey” for a start-up company during the first dot-com boom of the late nineties and the resulting short sci-fi story Lobsters (which eventually turned into Accelerando); his upcoming book Rule 34; and his take on the human condition, brain uploading, the technological singularity, and our chances of surviving it.

    Charles Stross, 46, is a full-time science fiction writer and resident of Edinburgh, Scotland. The winner of two Locus Reader Awards and of the 2005 and 2010 Hugo Awards for best novella, Stross has seen his works translated into over twelve languages.

    Like many writers, Stross has had a variety of careers, occupations, and job-shaped-catastrophes in the past, from pharmacist (he quit after the second police stake-out) to first code monkey on the team of a successful dot-com startup (with brilliant timing he tried to change employer just as the bubble burst).

    http://singularityblog.singularitysymposium.com/charlie-stross-on-singularity-1-on-1-the-world-is-complicated-elegant-narratives-explaining-everything-are-wrong/

    —Huffduffed by chrispederick

  5. Jaron Lanier on technology and humanity

    Jaron Lanier, pioneering computer scientist, musician, visual artist, and author, discusses his book, You Are Not a Gadget: A Manifesto. He describes the effects of the web becoming “regularized” and the dangers he sees in “hive mind” production, which he claims leads to “crummy design.” He also explains why he thinks advertising is a misnomer, contending that modern advertising is more about access to potential consumers than expressive or creative form. Lanier also advocates for more peer-to-peer rather than hub-and-spoke transactions, discusses why he’s worried about the disappearance of the middle class, claims that “free” isn’t really free, talks about libertarian ideals, and explains why he’s ultimately hopeful about the future.

    http://surprisinglyfree.com/2011/02/15/jaron-lanier/

    —Huffduffed by chrispederick