lot49a / Tim Maly


Huffduffed (33)

  1. Bruce Sterling: The Singularity: Your Future as a Black Hole - The Long Now

    Your future as a black hole

    One reason lots of people don’t want to think long term these days is because technology keeps accelerating so rapidly, we assume the world will become unrecognizable in a few years and then move on to unimaginable. Long-term thinking must be either impossible or irrelevant.

    The commonest shorthand term for the runaway acceleration of technology is “the Singularity”—a concept introduced by science fiction writer Vernor Vinge in 1984. The term has been enthusiastically embraced by technology historians, futurists, extropians, and various trans-humanists and post-humanists, who have generated variants such as “the techno-rapture,” “the Spike,” etc.

    It takes a science fiction writer to critique a science fiction idea.

    Along with being one of America’s leading science fiction writers and technology journalists, Bruce Sterling is a celebrated speaker armed with lethal wit. His books include The Zenith Angle (just out), Hacker Crackdown, Holy Fire, Distraction, Mirrorshades (cyberpunk compendium), Schismatrix, The Difference Engine (with William Gibson), Tomorrow Now, and Islands in the Net.

    The Seminar About Long-term Thinking on June 10-11 was Bruce Sterling examining “The Singularity: Your Future as a Black Hole.” He treated the subject of hyper-acceleration of technology as a genuine threat worth alleviating and as a fond fantasy worth cruel dismemberment.

    Sterling noted that the first stating of the Singularity metaphor and threat came from John von Neumann in the 1950s in conversation with Stan Ulam—“the rate of change in technology accelerates until it is mind-boggling and beyond social control or even comprehension.” But it was science fiction writer Vernor Vinge who first published the idea, in novels and a lecture in the early 1980s, and it was based on the expectation of artificial intelligence surpassing human intelligence. Vinge wrote: “I believe that the creation of greater than human intelligence will occur during the next thirty years. I’ll be surprised if this event occurs before 2005 or after 2030.” Vinge was not thrilled at the prospect.

    The world-changing event would happen relatively soon, it would be sudden, and it would be irrevocable.

    “It’s an end-of-history notion,” Sterling drawled, “and like most end-of-history notions, it is showing its age.” It’s almost 2005, and the world is still intelligible. Computer networks have accelerated wildly, but water networks haven’t—in fact we’re facing a shortage of water.

    The Singularity feels like a 90s dot-com bubble idea now—it has no business model. “Like most paradoxes it is a problem of definitional systems involving sci-fi handwaving around this magic term ‘intelligence.’ If you fail to define your terms, it is very easy to divide by zero and reach infinite exponential speed.” It was catnip for the intelligentsia: “Wow, if we smart guys were more like we already are, we’d be godlike.”

    Can we find any previous Singularity-like events in history? Sterling identified three—the atomic bomb, LSD, and computer viruses. The bomb was sudden and world changing and hopeful—a new era! LSD does FEEL like it’s world changing. Viruses proliferate exponentially on the net. LSD is pretty much gone now. Mr. Atom turned out to be not our friend and has blended in with other tools and problems.

    Singularity proponents, Sterling observed, are organized pretty much like virus writers—loose association, passionate focus, but basically gentle. (They’d be easily rounded up.) “They don’t have to work very hard because they are mesmerized by the autocatalyzing cascade effect. ‘Never mind motivating voters, raising funds, or persuading the press; we’ve got a mathematician’s smooth line on a 2D graph! Why bother, since pretty soon we’ll be SUPERHUMAN. It’s bound to happen to us because we are EARLY ADAPTERS.’”

    Vernor Vinge wrote: “For me, superhumanity is the essence of the Singularity. Without that we would get a glut of technical riches, never properly absorbed.” Said Sterling, “A glut of technical riches never properly absorbed sounds like a great description of the current historical epoch.”

    Sterling listed five kinds of reactions to the Singularity. 1) Don’t know and don’t care (most people). 2) The superbian transhumanists. 3) The passive singularitarians—the Rapture of the Nerds. 4) The terrified handflapping apocalypse millennialists (a dwindling group, too modish to stay scared of the same apocalypse for long). 5) The Singularity resistance—Bill Joy killjoys who favor selective relinquishment. Sterling turned out to be a fellow traveler of the Resistance: “Human cognition becoming industrialized is something I actually worry about.”

    Vinge did a great thing, said Sterling. The Singularity has proved to be a rich idea. “In the genre of science fiction it is more important to be fruitfully mistaken than dully accurate. That’s why we are science fiction writers, not scientists.”

    Suppose some kind of Singularity does come about. Even though it is formally unthinkable to characterize post-Singularity reality, Sterling proposed you could probably be sure of some things. The people there wouldn’t feel like they are “post”-anything. For them, most things would be banal. There wouldn’t be one Singularity but different ones on different schedules, and they would keep on coming. It would be messy. Death would continue as the great leveler.

    Suppose humanity elected to slow an approaching Singularity to a manageable pace: what could we actually do? How do you stop a runaway technology?

    —You could reverse the order of scientific prestige from honoring the most dangerous new science (such as nuclear physics) to honoring the most responsible restraint—a Relinquishment Nobel. It would be scientific self-regulation.

    —You could destroy scientific prestige through commercialization. Scientists diminish into mercenaries—”put-upon Dilberts and top-secret mandarins.” It would still be dangerous, though.

    —Have a few cities leveled by a Singularity technology and you could bounce into world government with intense surveillance and severe repression of suspect technologies and technologists. Most societies are already anti-science; this would fulfill their world view. “You can run but you can’t hide! You will be brought to justice or justice will be brought to you! Into the steel cage, Mr. Singularity. Into Guantanamo till you tell us who your friends are. Then they join you in there.” Or maybe it’s not that fierce, and it’s all done by benevolent non-governmental organizations.

    Sterling concluded: “It does come down to a better way to engage with the passage of time. The loss of the future is becoming acute. The most effective political actors on the planet right now are guys who want to blow themselves up—they really DON’T want to get out of bed in the morning and face another day. People need a motivating vision of what comes next and the awareness that more will happen after that, that the future is a process not a destination. The future is a verb, not a noun. Our minds may reach the ends of their tethers, but we’ll never stop futuring.”

           —Stewart Brand


    —Huffduffed by lot49a

  2. Bonus: Conversations with AI, featuring Brian Roemmele

    Voice-first advocate Brian Roemmele returns for a chat with Rene Ritchie about the current status of Siri at Apple, and its place among other voice assistants. In January of this year, he told Rene the company's reluctance to let the Siri feature become the SiriOS platform is holding them back. As of December 2018, let's see where things stand now.




    —Huffduffed by lot49a

  3. Two (Totally Opposite) Ways to Save the Planet (Ep. 346) - Freakonomics

    The environmentalists say we’re doomed if we don’t drastically reduce consumption. The technologists say that human ingenuity can solve just about any problem. A debate that’s been around for decades has become a shouting match. Is anyone right?


    —Huffduffed by lot49a

  4. TFSR: “The Inspection House”, surveillance, Bentham, Foucault & intentions (with Emily Horne & Tim Maly) | The Final Straw Radio (AshevilleFM)

    Jeremy Bentham (died 1832) on display at University College London into the 1970s.

    Note his mummified head between his feet. This week William speaks with Emily Horne and Tim Maly about their book “The Inspection House: An Impertinent Field Guide to Modern Surveillance”, which was published in October 2014 by Coach House Books in their Exploded Views series. This interview comes right before the authors’ book tour of locations in Canada.

    From the book’s website:

    “In 1787, British philosopher and social reformer Jeremy Bentham conceived of the panopticon, a ring of cells observed by a central watchtower, as a labor-saving device for those in authority. While Bentham’s design was ostensibly for a prison, he believed that any number of places that require supervision—factories, poorhouses, hospitals, and schools—would benefit from such a design. The French philosopher Michel Foucault took Bentham at his word. In his groundbreaking 1975 study, Discipline and Punish, the panopticon became a metaphor to describe the creeping effects of personalized surveillance as a means for ever-finer mechanisms of control.

    Forty years later, the available tools of scrutiny, supervision, and discipline are far more capable and insidious than Foucault dreamed, and yet less effective than Bentham hoped. Public squares, container ports, terrorist holding cells, and social networks all bristle with cameras, sensors, and trackers. But, crucially, they are also rife with resistance and prime opportunities for revolution.”

    In the interview, Emily and Tim talk about Jeremy Bentham’s life, the intended and actual uses of the panopticon, the dangers of the well intentioned, and more!

    The book has a lot of good stuff in it, history and analysis and humor. For more info about “The Inspection House” and about the author’s Canadian tour, you can visit http://www.chbooks.com/catalogue/inspection-house

    The Panopticam (live streaming & timelapse from the top of the cabinet in which Jeremy Bentham sits)

    Metro.UK article on Jeremy Bentham’s attendance record at University College London since his passing in 1832.

    Episode playlist: www.ashevillefm.org/node/11859

    Download This Episode


    —Huffduffed by lot49a

  5. Bruce Sterling Closing Talk - SXSW Interactive 2015 by SXSW | Free Listening on SoundCloud

    Bruce Sterling Closing Talk - SXSW Interactive 2015

    by SXSW

    published on 2015/03/18 22:53:30 +0000

    World traveler, science fiction author, journalist, and future-focused design critic Bruce Sterling spins the globe a few rounds as he wraps up the Interactive Conference with his peculiar view of the state of the world. Always unexpected, invented on the fly, a hash of trends, trepidations, and creative prognostication. Don't miss this annual event favorite. What will he cover in 2015?

    Download Bruce Sterling Closing Talk - SXSW Interactive 2015


    License: all-rights-reserved


    —Huffduffed by lot49a

Page 1 of 4