
Tagged with “computers” (29)

  1. Computers at Home

    In the 1980s, ‘micro computers’ invaded the home. In this episode, Hannah Fry discovers how the computer was transported from the office and the classroom right into our living room.

    From eccentric electronics genius Clive Sinclair and his ZX80, to smart-suited businessman Alan Sugar and the Amstrad PC, she charts the 80s computer boom - a time when the UK had more computers per head of population than anywhere else in the world.

    Presented by Hannah Fry

    Produced by Michelle Martin

    https://www.bbc.co.uk/programmes/b06bnq0y

    —Huffduffed by adactio

  2. In 1968, computers got personal: How the ‘mother of all demos’ changed the world

    A 90-minute presentation in 1968 showed off the earliest desktop computer system. In the process it introduced the idea that technology could make individuals better – if government funded the research.

    https://theconversation.com/in-1968-computers-got-personal-how-the-mother-of-all-demos-changed-the-world-101654

    —Huffduffed by adactio

  3. James Bridle: The nightmare videos of children’s YouTube — and what’s wrong with the internet today | TED Talk

    Writer and artist James Bridle uncovers a dark, strange corner of the internet, where unknown people or groups on YouTube hack the brains of young children in return for advertising revenue. From "surprise egg" reveals and the "Finger Family Song" to algorithmically created mashups of familiar cartoon characters in violent situations, these videos exploit and terrify young minds — and they tell us something about where our increasingly data-driven world is headed. "We need to stop thinking about technology as a solution to all of our problems, but think of it as a guide to what those problems actually are, so we can start thinking about them properly and start to address them," Bridle says.

    https://www.ted.com/talks/james_bridle_the_nightmare_videos_of_childrens_youtube_and_what_s_wrong_with_the_internet_today

    —Huffduffed by adactio

  4. BBC Radio 4 - The Life Scientific, Stephanie Shirley on computer coding

    As a young woman, Stephanie Shirley worked at the Dollis Hill Research Station building computers from scratch, but she told young admirers that she worked for the Post Office, hoping they would think she sold stamps. In the early 60s she changed her name to Steve and started selling computer programs to companies who had no idea what they were or what they could do, employing only mothers who worked from home, writing code by hand in pencil and then posting it to her. By the mid-80s her software company employed eight thousand people, still mainly women with children. She made an absolute fortune, but these days Stephanie thinks less about making money and much more about how best to give it away.

    http://www.bbc.co.uk/programmes/b05pmvl8

    —Huffduffed by adactio

  5. BBC Radio 4 - Great Lives, Series 31, Konnie Huq on Ada Lovelace

    Lord Byron’s only legitimate child is championed by Konnie Huq.

    From banking to air traffic control systems to the United States defence department, there’s a computer language called ‘Ada’, named after Ada Lovelace, the 19th-century mathematician and daughter of Lord Byron. Ada Lovelace is this week’s Great Life. She has been called many things, but perhaps most poetically by Charles Babbage, with whom she worked on a steam-driven calculating machine called the Difference Engine: an ‘enchantress of numbers’, just as her similarly mathematical mother had been called a ‘princess of parallelograms’ by Lord Byron. Augusta ‘Ada’ Byron was born in 1815, but her parents’ marriage was short and unhappy; they separated when Ada was one month old and she never saw her father, who died when she was eight years old. Her mother, Annabella, concerned that Ada might inherit Byron’s ‘poetic tendencies’, had her schooled in maths and science to try to combat any madness inherited from her father. She is championed by TV presenter and writer Konnie Huq, best known for presenting the BBC children’s programme ‘Blue Peter’. Together with social technologist Suw Charman-Anderson and presenter Matthew Parris, they lift the lid on the life of this mathematician, now regarded as the first computer programmer.

    https://www.bbc.co.uk/programmes/b03b0ydy

    —Huffduffed by adactio

  6. Did You Get the Memo? | Dell Technologies United States

    Communication’s come a long way since the days of Samuel Morse. We crack the code on how it’s evolved in this episode.

    https://www.delltechnologies.com/en-us/perspectives/podcasts/trailblazers/s02-e02-did-you-get-the-memo.htm#autoplay=false&autoexpand=true&episode=201&transcript=false

    —Huffduffed by adactio

  7. CHM Live│Programmed Inequality

    According to the National Center for Women & Information Technology, women held just 25 percent of professional computing jobs in the US in 2015. How damaging is this gender gap to the future of the tech industry?

    The rise and fall of Britain’s electronic computing industry between 1944–1974 holds clues. In her book, Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing, historian Marie Hicks explores how gender discrimination, changing labor demographics, and government policy during this 30-year period shaped the UK’s path in computing. She also explains how this path had detrimental economic effects on the UK—and why the US may be facing similar risks today.

    Dr. Marie Hicks sits down with David C. Brock, Director of the Museum’s Center for Software History, to share insights from her book.

    Hicks received her BA from Harvard University and her MA and PhD from Duke University. Before entering academia, she worked as a UNIX systems administrator. She is currently an assistant professor at the University of Wisconsin-Madison. Her work focuses on how gender and sexuality bring hidden technological dynamics to light and how women’s experiences change the core narratives of the history of computing.

    ===
    Original video: https://www.youtube.com/watch?v=WTLJ7saIV3o

    —Huffduffed by adactio

  8. Zeynep Tufekci: Machine intelligence makes human morals more important | TED Talk | TED.com

    Machine intelligence is here, and we’re already using it to make subjective decisions. But the complex way AI grows and improves makes it hard to understand and even harder to control. In this cautionary talk, techno-sociologist Zeynep Tufekci explains how intelligent machines can fail in ways that don’t fit human error patterns — and in ways we won’t expect or be prepared for. "We cannot outsource our responsibilities to machines," she says. "We must hold on ever tighter to human values and human ethics."

    https://www.ted.com/talks/zeynep_tufekci_machine_intelligence_makes_human_morals_more_important

    —Huffduffed by adactio

  9. Kevin Kelly: How AI can bring on a second Industrial Revolution

    "The actual path of a raindrop as it goes down the valley is unpredictable, but the general direction is inevitable," says digital visionary Kevin Kelly — and technology is much the same, driven by patterns that are surprising but inevitable. Over the next 20 years, he says, our penchant for making things smarter and smarter will have a profound impact on nearly everything we do. Kelly explores three trends in AI we need to understand in order to embrace it and steer its development. "The most popular AI product 20 years from now that everyone uses has not been invented yet," Kelly says. "That means that you’re not late."

    http://www.ted.com/talks/kevin_kelly_how_ai_can_bring_on_a_second_industrial_revolution

    —Huffduffed by adactio

  10. Seth Lloyd: Quantum Computer Reality - The Long Now

    The 15th-century Renaissance was triggered, Lloyd began, by a flood of new information which changed how people thought about everything, and the same thing is happening now.

    All of us have had to shift, just in the last couple decades, from hungry hunters and gatherers of information to overwhelmed information filter-feeders.

    Information is physical.

    A bit can be represented by an electron here to signify 0, and there to signify 1.

    Information processing is moving electrons from here to there.

    But for a “qubit” in a quantum computer, an electron is both here and there at the same time, thanks to “wave-particle duality”.

    Thus with “quantum parallelism” you can do massively more computation than in classical computers.

    It’s like the difference between the simple notes of plainsong and all that a symphony can do—a huge multitude of instruments interacting simultaneously, playing arrays of sharps and flats and complex chords.

    Quantum computers can solve important problems like enormous systems of equations and factoring large numbers—cracking formerly uncrackable public-key cryptography, the basis of all online commerce.

    With their ability to do “oodles of things at once”, quantum computers can also simulate the behavior of larger quantum systems, opening new frontiers of science, as Richard Feynman pointed out in the 1980s.

    Simple quantum computers have been built since 1995, by Lloyd and a growing number of others.

    Mechanisms tried so far include: electrons within electric fields; nuclear spin (clockwise and counter); atoms in ground state and excited state simultaneously; photons polarized both horizontally and vertically; superconducting loops going clockwise and counter-clockwise at the same time; and many more.

    To get the qubits to perform operations—to compute—you can use an optical lattice or atoms in whole molecules or integrated circuits, and more to come.

    The more qubits, the more interesting the computation.

    Starting with 2 qubits back in 1996, some systems are now up to several dozen qubits.

    Over the next 5-10 years we should go from 50 qubits to 5,000 qubits, first in special-purpose systems but eventually in general-purpose computers.

    Lloyd added, “And there’s also the fascinating field of using funky quantum effects such as coherence and entanglement to make much more accurate sensors, imagers, and detectors.”

    Like, a hundred thousand to a million times more accurate.

    GPS could locate things to the nearest micron instead of the nearest meter.

    Even with small quantum computers we will be able to expand the capability of machine learning by sifting vast collections of data to detect patterns, and move on from supervised learning (“That squiggle is a 7”) toward unsupervised learning—systems that learn to learn.

    The universe is a quantum computer, Lloyd concluded.

    Biological life is all about extracting meaningful information from a sea of bits.

    For instance, photosynthesis uses quantum mechanics in a very sophisticated way to increase its efficiency.

    Human life is expanding on what life has always been—an exercise in machine learning.

    —Stewart Brand

    http://longnow.org/seminars/02016/aug/09/quantum-computer-reality/

    —Huffduffed by adactio
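    Brand’s summary glosses over what “here and there at the same time” buys you. As an illustration only—a classical state-vector simulation, not anything from Lloyd’s seminar—an n-qubit register can be modelled as a vector of 2^n amplitudes. Applying a Hadamard gate to each qubit puts the register into an equal superposition of all 2^n basis states at once, which is the “quantum parallelism” the summary describes:

    ```python
    from math import sqrt

    def hadamard(state, target, n):
        """Apply a Hadamard gate to qubit `target` of an n-qubit state vector."""
        h = 1 / sqrt(2)
        new = state[:]
        for i in range(2 ** n):
            if not (i >> target) & 1:      # basis state with target bit = 0 ...
                j = i | (1 << target)      # ... paired with the bit = 1 partner
                a, b = state[i], state[j]
                new[i] = h * (a + b)
                new[j] = h * (a - b)
        return new

    n = 3
    state = [0.0] * (2 ** n)
    state[0] = 1.0                         # start in |000>: one definite, classical value
    for q in range(n):
        state = hadamard(state, q, n)      # each qubit now "here and there" at once

    # All 2^n = 8 basis states carry equal amplitude 1/sqrt(8):
    # one register simultaneously holding every 3-bit value.
    print([round(a, 3) for a in state])
    ```

    A single classical 3-bit register holds one of the eight values; the simulated quantum register holds an amplitude for all eight, which is why simulating n qubits classically costs memory exponential in n.
    
    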

Page 1 of 3