Tagged with “computers” (26)

  1. BBC Radio 4 - The Life Scientific, Stephanie Shirley on computer coding

    As a young woman, Stephanie Shirley worked at the Dollis Hill Research Station building computers from scratch, but she told young admirers that she worked for the Post Office, hoping they would think she sold stamps. In the early 60s she changed her name to Steve and started selling computer programs to companies who had no idea what they were or what they could do, employing only mothers who worked from home, writing code by hand with pen and pencil and then posting it to her. By the mid-80s her software company employed eight thousand people, still mainly women with children. She made an absolute fortune, but these days Stephanie thinks less about making money and much more about how best to give it away.

    http://www.bbc.co.uk/programmes/b05pmvl8

    —Huffduffed by adactio

  2. BBC Radio 4 - Great Lives, Series 31, Konnie Huq on Ada Lovelace

    Lord Byron’s only legitimate child is championed by Konnie Huq.

    From banking to air traffic control systems to the United States defence department, there’s a computer language called ‘Ada’, named after Ada Lovelace, a 19th-century mathematician and daughter of Lord Byron. Ada Lovelace is this week’s Great Life. She has been called many things, but perhaps most poetically an ‘enchantress of numbers’ by Charles Babbage, with whom she worked on a steam-driven calculating machine called the Difference Engine, just as her similarly mathematical mother had been called a "princess of parallelograms" by Lord Byron. Augusta ‘Ada’ Byron was born in 1815, but her parents’ marriage was short and unhappy; they separated when Ada was one month old and she never saw her father again. He died when she was eight years old. Her mother, Annabella, concerned that Ada might inherit Byron’s "poetic tendencies", had her schooled in maths and science to try to combat any madness inherited from her father. She is championed by TV presenter and writer Konnie Huq, best known for presenting the BBC children’s programme ‘Blue Peter’. Together with expert Suw Charman-Anderson, a social technologist, and presenter Matthew Parris, she lifts the lid on the life of this mathematician, now regarded as the first computer programmer.

    https://www.bbc.co.uk/programmes/b03b0ydy

    —Huffduffed by adactio

  3. Did You Get the Memo? | Dell Technologies United States

    Communication’s come a long way since the days of Samuel Morse. We crack the code on how it’s evolved in this episode.

    https://www.delltechnologies.com/en-us/perspectives/podcasts/trailblazers/s02-e02-did-you-get-the-memo.htm#autoplay=false&autoexpand=true&episode=201&transcript=false

    —Huffduffed by adactio

  4. CHM Live│Programmed Inequality

    According to the National Center for Women & Information Technology, women held just 25 percent of professional computing jobs in the US in 2015. How damaging is this gender gap to the future of the tech industry?

    The rise and fall of Britain’s electronic computing industry between 1944 and 1974 holds clues. In her book, Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing, historian Marie Hicks explores how gender discrimination, changing labor demographics, and government policy during this 30-year period shaped the UK’s path in computing. She also explains how this path had detrimental economic effects on the UK—and why the US may be facing similar risks today.

    Dr. Marie Hicks sits down with David C. Brock, Director of the Museum’s Center for Software History, to share insights from her book.

    Hicks received her BA from Harvard University and her MA and PhD from Duke University. Before entering academia, she worked as a UNIX systems administrator. She is currently an assistant professor at the University of Wisconsin-Madison. Her work focuses on how gender and sexuality bring hidden technological dynamics to light and how women’s experiences change the core narratives of the history of computing.

    ===
    Original video: https://www.youtube.com/watch?v=WTLJ7saIV3o

    —Huffduffed by adactio

  5. Zeynep Tufekci: Machine intelligence makes human morals more important | TED Talk | TED.com

    Machine intelligence is here, and we’re already using it to make subjective decisions. But the complex way AI grows and improves makes it hard to understand and even harder to control. In this cautionary talk, techno-sociologist Zeynep Tufekci explains how intelligent machines can fail in ways that don’t fit human error patterns — and in ways we won’t expect or be prepared for. "We cannot outsource our responsibilities to machines," she says. "We must hold on ever tighter to human values and human ethics."

    https://www.ted.com/talks/zeynep_tufekci_machine_intelligence_makes_human_morals_more_important

    —Huffduffed by adactio

  6. Kevin Kelly: How AI can bring on a second Industrial Revolution

    "The actual path of a raindrop as it goes down the valley is unpredictable, but the general direction is inevitable," says digital visionary Kevin Kelly — and technology is much the same, driven by patterns that are surprising but inevitable. Over the next 20 years, he says, our penchant for making things smarter and smarter will have a profound impact on nearly everything we do. Kelly explores three trends in AI we need to understand in order to embrace it and steer its development. "The most popular AI product 20 years from now that everyone uses has not been invented yet," Kelly says. "That means that you’re not late."

    http://www.ted.com/talks/kevin_kelly_how_ai_can_bring_on_a_second_industrial_revolution

    —Huffduffed by adactio

  7. Seth Lloyd: Quantum Computer Reality - The Long Now

    The 15th-century Renaissance was triggered, Lloyd began, by a flood of new information which changed how people thought about everything, and the same thing is happening now.

    All of us have had to shift, just in the last couple decades, from hungry hunters and gatherers of information to overwhelmed information filter-feeders.

    Information is physical.

    A bit can be represented by an electron here to signify 0, and there to signify 1.

    Information processing is moving electrons from here to there.

    But for a “qubit” in a quantum computer, an electron is both here and there at the same time, thanks to “wave-particle duality.”

    Thus with “quantum parallelism” you can do massively more computation than in classical computers (a small state-vector sketch after this item shows how the numbers grow).

    It’s like the difference between the simple notes of plainsong and all that a symphony can do—a huge multitude of instruments interacting simultaneously, playing arrays of sharps and flats and complex chords.

    Quantum computers can solve important problems like enormous equations and factoring—cracking formerly uncrackable public-key cryptography, the basis of all online commerce.

    With their ability to do “oodles of things at once,” quantum computers can also simulate the behavior of larger quantum systems, opening new frontiers of science, as Richard Feynman pointed out in the 1980s.

    Simple quantum computers have been built since 1995, by Lloyd and a growing number of others.

    Mechanisms tried so far include: electrons within electric fields; nuclear spin (clockwise and counter); atoms in ground state and excited state simultaneously; photons polarized both horizontally and vertically; superconducting loops going clockwise and counter-clockwise at the same time; and many more.

    To get the qubits to perform operations—to compute—you can use an optical lattice or atoms in whole molecules or integrated circuits, and more to come.

    The more qubits, the more interesting the computation.

    Starting with 2 qubits back in 1996, some systems are now up to several dozen qubits.

    Over the next 5-10 years we should go from 50 qubits to 5,000 qubits, first in special-purpose systems but eventually in general-purpose computers.

    Lloyd added, “And there’s also the fascinating field of using funky quantum effects such as coherence and entanglement to make much more accurate sensors, imagers, and detectors.”

    Like, a hundred thousand to a million times more accurate.

    GPS could locate things to the nearest micron instead of the nearest meter.

    Even with small quantum computers we will be able to expand the capability of machine learning by sifting vast collections of data to detect patterns and move on from supervised learning (“That squiggle is a 7”) toward unsupervised learning—systems that learn to learn.

    The universe is a quantum computer, Lloyd concluded.

    Biological life is all about extracting meaningful information from a sea of bits.

    For instance, photosynthesis uses quantum mechanics in a very sophisticated way to increase its efficiency.

    Human life is expanding on what life has always been—an exercise in machine learning.

    —Stewart Brand

    http://longnow.org/seminars/02016/aug/09/quantum-computer-reality/

    —Huffduffed by adactio
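
    Not part of Brand’s summary, but a minimal sketch of the “quantum parallelism” point above: the state of an n-qubit register needs 2^n amplitudes on a classical machine, which is exactly the space a quantum computer explores natively. The snippet below (Python with numpy, my own illustration rather than anything from the talk) applies Hadamard gates to a three-qubit register, leaving it in an equal superposition of all eight basis states at once.

        import numpy as np

        # Hadamard gate: sends |0> to (|0> + |1>)/sqrt(2), i.e. "here and there at once".
        H = np.array([[1.0,  1.0],
                      [1.0, -1.0]]) / np.sqrt(2)

        def apply_single_qubit_gate(state, gate, target, n_qubits):
            """Apply a one-qubit gate to one qubit of an n-qubit state vector."""
            op = np.array([[1.0]])
            for q in range(n_qubits):
                op = np.kron(op, gate if q == target else np.eye(2))
            return op @ state

        n = 3
        state = np.zeros(2 ** n)   # 2**n amplitudes for n qubits
        state[0] = 1.0             # start in |000>

        for q in range(n):
            state = apply_single_qubit_gate(state, H, q, n)

        print(state)               # eight equal amplitudes, each 1/sqrt(8)

    Doubling the register to n = 6 already needs 64 amplitudes; that exponential growth is what makes classical simulation of large quantum systems hard, and it is the point of Feynman’s argument for quantum simulators mentioned above.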

  8. Ideas: Max Allen and Ted Nelson

    Max Allen and Ted Nelson discuss the future of computers.

    https://archive.org/details/ideas-maxallen-tednelson

    —Huffduffed by adactio

  9. Nineteen Seventy Three • Damn Interesting

    On 12 November 1971, in the presidential palace in the Republic of Chile, President Salvador Allende and a British theorist named Stafford Beer engaged in a highly improbable conversation. Beer was a world-renowned cybernetician and Allende was the newly elected leader of the impoverished republic.

    http://www.damninteresting.com/nineteen-seventy-three/

    —Huffduffed by adactio

  10. The physical reality of our digital world - Future Tense - ABC Radio National (Australian Broadcasting Corporation)

    We often think of our digital world as something that’s not about physical stuff, but about things that happen out there in the air, in space. We speak of cyberspace and cloud computing. But how much of our digital infrastructure is grounded in physical reality? And what are some of the future implications of the growing push to move more of our data into cloud-based technology?

    Guests:
    Andrew Blum, Correspondent for Wired and Contributing Editor to Metropolis. Author of ‘Tubes: Behind The Scenes At The Internet’.

    Dr danah boyd, Senior Researcher at Microsoft Research and Research Assistant Professor in Media, Culture and Communication at New York University.

    Ted Striphas, Associate Professor of Media and Cultural Studies at Indiana University’s Department of Communication and Culture.

    John Naughton, Professor of the Public Understanding of Technology at the Open University in the UK and columnist for The Observer Newspaper.

    Gary Cook, Senior Policy Analyst, Cool IT Campaign, Greenpeace International.

    Rich Wolski, Chief Technology Officer and Co-founder of Eucalyptus Systems Inc. And Professor of Computer Science at the University of California, Santa Barbara.

    Publications:
    Title: Tubes: Behind The Scenes At The Internet
    Author: Andrew Blum
    Publisher: Viking (Penguin Australia)

    Further Information:
    Andrew Blum’s website (http://andrewblum.net/)
    Rich Wolski’s webpage (http://www.cs.ucsb.edu/~rich/)
    Ted Striphas’s website (http://www.indiana.edu/~cmcl/faculty/striphas.shtml)
    Greenpeace Cool IT Challenge (http://www.greenpeace.org/international/en/campaigns/climate-change/cool-it/)
    danah boyd’s website (http://www.danah.org/)
    John Naughton’s Guardian Profile (http://www.guardian.co.uk/profile/johnnaughton)

    http://www.abc.net.au/radionational/programs/futuretense/the-physical-reality-of-our-digital-world/4150766

    —Huffduffed by adactio

Page 1 of 3