"The actual path of a raindrop as it goes down the valley is unpredictable, but the general direction is inevitable," says digital visionary Kevin Kelly — and technology is much the same, driven by patterns that are surprising but inevitable. Over the next 20 years, he says, our penchant for making things smarter and smarter will have a profound impact on nearly everything we do. Kelly explores three trends in AI we need to understand in order to embrace it and steer its development. "The most popular AI product 20 years from now that everyone uses has not been invented yet," Kelly says. "That means that you’re not late."
The 15th-century Renaissance was triggered, Lloyd began, by a flood of new information which changed how people thought about everything, and the same thing is happening now.
All of us have had to shift, just in the last couple decades, from hungry hunters and gatherers of information to overwhelmed information filter-feeders.
Information is physical.
A bit can be represented by an electron here to signify 0, and there to signify 1.
Information processing is moving electrons from here to there.
But for a “qubit” in a quantum computer, an electron is both here and there at the same time, thanks to “wave-particle duality.”
Thus with “quantum parallelism” you can do massively more computation than in classical computers.
It’s like the difference between the simple notes of plainsong and all that a symphony can do—a huge multitude of instruments interacting simultaneously, playing arrays of sharps and flats and complex chords.
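The bit-versus-qubit contrast above can be made concrete with a toy statevector simulation (my own illustration, not from Lloyd's talk; the variable names are assumptions): a classical bit is one of two states, while a qubit carries complex amplitudes over both at once, and n qubits carry 2^n amplitudes simultaneously.

```python
# Toy single-qubit statevector sketch (illustrative only).
import numpy as np

ket0 = np.array([1, 0], dtype=complex)  # electron "here"  -> |0>

# Hadamard gate: turns a definite bit into an equal superposition
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

state = H @ ket0                  # now "both here and there"
print(np.abs(state) ** 2)         # measurement probabilities: [0.5 0.5]

# "Quantum parallelism": n qubits in uniform superposition hold
# 2**n amplitudes at once -- the symphony versus the single note
n = 20
amplitudes = np.full(2 ** n, 1 / np.sqrt(2 ** n))
print(len(amplitudes))            # 1048576
```

The exponential growth of the amplitude vector is what makes classical simulation of large quantum systems so hard, and what quantum hardware exploits directly.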
Quantum computers can solve important problems like enormous equations and factoring—cracking formerly uncrackable public-key cryptography, the basis of all online commerce.
With their ability to do “oodles of things at once,” quantum computers can also simulate the behavior of larger quantum systems, opening new frontiers of science, as Richard Feynman pointed out in the 1980s.
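Why factoring matters for cryptography can be sketched with a toy example (my own illustration; the function and numbers are assumptions, not from the talk). An RSA-style public key hides two primes inside their product: trial division recovers them instantly at toy scale, but its cost grows exponentially with key length, whereas Shor's quantum algorithm factors in polynomial time.

```python
# Toy sketch: RSA-style security rests on the hardness of factoring.
def factor(n):
    """Return the smallest nontrivial factor of n by trial division."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d
        d += 1
    return n  # n is prime

n = 61 * 53          # toy modulus; real RSA keys use ~1024-bit primes
p = factor(n)
print(p, n // p)     # 53 61
```

At 2048-bit key sizes the loop above would run longer than the age of the universe, which is why a large quantum computer running Shor's algorithm would break this kind of public-key cryptography.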
Simple quantum computers have been built since 1995, by Lloyd and a growing number of others.
Mechanisms tried so far include: electrons within electric fields; nuclear spin (clockwise and counterclockwise); atoms simultaneously in ground and excited states; photons polarized both horizontally and vertically; superconducting loops going clockwise and counterclockwise at the same time; and many more.
To get the qubits to perform operations—to compute—you can use an optical lattice or atoms in whole molecules or integrated circuits, and more to come.
The more qubits, the more interesting the computation.
Starting with 2 qubits back in 1996, some systems are now up to several dozen qubits.
Over the next 5-10 years we should go from 50 qubits to 5,000 qubits, first in special-purpose systems but eventually in general-purpose computers.
Lloyd added, “And there’s also the fascinating field of using funky quantum effects such as coherence and entanglement to make much more accurate sensors, imagers, and detectors.”
Like, a hundred thousand to a million times more accurate.
GPS could locate things to the nearest micron instead of the nearest meter.
Even with small quantum computers we will be able to expand the capability of machine learning, sifting vast collections of data to detect patterns and moving on from supervised learning (“That squiggle is a 7”) toward unsupervised learning: systems that learn to learn.
The universe is a quantum computer, Lloyd concluded.
Biological life is all about extracting meaningful information from a sea of bits.
For instance, photosynthesis uses quantum mechanics in a very sophisticated way to increase its efficiency.
Human life is expanding on what life has always been—an exercise in machine learning.
Max Allen and Ted Nelson discuss the future of computers.
On 12 November 1971, in the presidential palace in the Republic of Chile, President Salvador Allende and a British theorist named Stafford Beer engaged in a highly improbable conversation. Beer was a world-renowned cybernetician and Allende was the newly elected leader of the impoverished republic.
The physical reality of our digital world - Future Tense - ABC Radio National (Australian Broadcasting Corporation)
We often think of our digital world as something that’s not about physical stuff, but about things that happen out there in the air, in space. We speak of cyberspace and cloud computing. But how much of our digital infrastructure is grounded in physical reality? And what are some of the future implications of the growing push to move more of our data into cloud-based technology?
Andrew Blum, Correspondent for Wired and Contributing Editor to Metropolis. Author of ‘Tubes: Behind The Scenes At The Internet’.
Dr danah boyd, Senior Researcher at Microsoft Research and Research Assistant Professor in Media, Culture and Communication at New York University.
Ted Striphas, Associate Professor of Media and Cultural Studies at Indiana University’s Department of Communication and Culture.
John Naughton, Professor of the Public Understanding of Technology at the Open University in the UK and columnist for The Observer Newspaper.
Gary Cook, Senior Policy Analyst, Cool IT Campaign, Greenpeace International.
Rich Wolski, Chief Technology Officer and Co-founder of Eucalyptus Systems Inc. And Professor of Computer Science at the University of California, Santa Barbara.
Title: Tubes: Behind The Scenes At The Internet
Author: Andrew Blum
Publisher: Viking (Penguin Australia)
Andrew Blum’s website (http://andrewblum.net/)
Rich Wolski’s webpage (http://www.cs.ucsb.edu/~rich/)
Ted Striphas’ website (http://www.indiana.edu/~cmcl/faculty/striphas.shtml)
GreenPeace Cool IT Challenge (http://www.greenpeace.org/international/en/campaigns/climate-change/cool-it/)
danah boyd’s website (http://www.danah.org/)
John Naughton’s Guardian Profile (http://www.guardian.co.uk/profile/johnnaughton)
Are computers changing human character? Is our closeness with computers changing us as a species? Alix and Lulu look at the ways technology affects us.
The coming war on general purpose computing - Future Tense - ABC Radio National (Australian Broadcasting Corporation)
Sci-fi author and digital rights activist Cory Doctorow talks about a coming ‘war on general purpose computing’, which could have far reaching consequences for our society.
In today’s programme: have we all become cyborgs without even knowing it?
We’ve always extended our human bodies, ever since we first picked up rocks or sticks as tools; it’s part of human nature. So are the digital tools of today any different? Aleks asks just how far we’ve come, and how far we’re willing to go, to become one with our technology and become cyborg.
Aleks hears from filmmaker Rob Spence, better known as Eyeborg, about the reactions he gets to the camera he has where his right eye used to be. Artist and composer Neil Harbisson uses a different type of eye: born entirely colour-blind, Neil uses an electronic eye on an antenna attached to his skull to hear colours. It’s now such a part of how he perceives the world that he hears colours in his dreams!
Brandy Ellis is a very different type of cyborg: having suffered from depression for years, she opted to have electronics implanted in her brain to control her symptoms. Her feelings are literally regulated by a machine.
Ultimately Aleks finds out from anthropologist Amber Case how we’re all every bit as cyborg as Rob, Neil or Brandy in how we coexist symbiotically with our digital devices.
Robot traders are dominating stock markets using high-speed computer algorithms. Human traders and government regulators can't keep up, and markets could be one programming glitch away from the next big crash. Stan Correy investigates.
In the 1940s and 1950s, a group of brilliant engineers led by John von Neumann gathered in Princeton, New Jersey, with the joint goal of realizing Alan Turing’s theoretical universal machine, a thought experiment that scientists use to understand the limits of mechanical computation. As a result of their fervent work, the crucial advancements that dominated 20th-century technology emerged. In Turing’s Cathedral, technology historian George Dyson recreates the scenes of focused experimentation, mathematical insight, and creative genius that broke the distinction between numbers that mean things and numbers that do things, giving us computers, digital television, modern genetics, and models of stellar evolution. Dyson, also a philosopher of science, previously wrote Baidarka, Darwin Among the Machines, and Project Orion. (recorded 3/13/2012)