Tagged with “information” (27)

  1. Alex Wright: Glut: Mastering Information Through the Ages - The Long Now

    A Series of Information Explosions

    As usual, microbes led the way.

    Bacteria have swarmed in intense networks for 3.5 billion years.

    Then a hierarchical form emerged with the first nucleated cells that were made up of an enclosed society of formerly independent organisms.

    That’s the pattern for the evolution of information, Alex Wright said.

    Networks coalesce into hierarchies, which then form a new level of networks, which coalesce again, and so on.

    Thus an unending series of information explosions is finessed.

    In humans, classification schemes emerged everywhere, defining how things are connected in larger contexts.

    Researchers into “folk taxonomies” have found that all cultures universally describe things they care about in hierarchical layers, and those hierarchies are usually five layers deep.

    Family tree hierarchies were accorded to the gods, who were human-like personalities but also represented various natural forces.

    Starting 30,000 years ago the “ice age information explosion” brought the transition to collaborative big game hunting, cave paintings, and elaborate decorative jewelry that carried status information.

    It was the beginning of information’s “release from social proximity.”

    5,000 years ago in Sumer, accountants began the process toward writing, beginning with numbers, then labels and lists, which enabled bureaucracy.

    Scribes were just below kings in prestige.

    Finally came written narratives such as Gilgamesh.

    The move from oral culture to literate culture is profound.

    Oral is additive, aggregative, participatory, and situational, where literate is subordinate, analytic, objective, and abstract.

    (One phenomenon of current Net culture is re-emergence of oral forms in email, twittering, YouTube, etc.)

    Wright honored the sequence of information-ordering visionaries who brought us to our present state.

    In 1883 Charles Cutter devised a classification scheme that led in part to the Library of Congress system and devised an apparatus of keyboard and wires that would fetch the desired book.

    H.G. Wells proposed a “world brain” of data and imagined that it would one day wake up.

    Teilhard de Chardin anticipated an “etherization of human consciousness” into a global noosphere.

    The greatest unknown revolutionary was the Belgian Paul Otlet.

    In 1895 he set about freeing the information in books from their bindings.

    He built a universal decimal classification and then figured out how that organized data could be explored, via “links” and a “web.”

    In 1910 Otlet created a “radiated library” called the Mundaneum in Brussels that managed search queries in a massive way until the Nazis destroyed the service.

    Alex Wright showed an astonishing video of how Otlet’s distributed telephone-plus-screen system worked.

    Wright concluded with the contributions of later visionaries: Vannevar Bush (“associative trails” in his Memex system) and Eugene Garfield’s Science Citation Index, the predecessor of page ranking.

    Doug Engelbart built a working hypertext system, shown in the “mother of all demos.”

    Finally, Ted Nelson helped inspire both Engelbart and Berners-Lee; Wright considers him “directly responsible for the generation of the World Wide Web.”

    —Stewart Brand

    —Huffduffed by adactio

  2. Infinite Scroll

    Our distant ancestors often felt overloaded by information. (“Have you read Cicero’s latest speech?” “I don’t have time!”) Throughout history we’ve invented shortcuts like tables of contents, indexes, book reviews, and encyclopedias. What technological solutions might help us cope with the information overload we experience today? Guests include: Stewart Butterfield, CEO of Slack, and Nathan Jurgenson, Snapchat sociologist.

    —Huffduffed by adactio

  3. A Little Less Conversation

    Some people thought the laying of the trans-Atlantic cable might bring world peace, because connecting humans could only lead to better understanding and empathy. That wasn’t the outcome—and recent utopian ideas about communication (Facebook might bring us together and make us all friends!) have also met with a darker reality (Facebook might polarize us and spread false information!). Should we be scared of technology that promises to connect the world? Guests include: Robin Dunbar, inventor of Dunbar’s Number; Nancy Baym, Microsoft researcher.

    —Huffduffed by adactio

  4. Episode 7: Misinformation on the Internet - Untangling the Web

    How did the internet become a tangled web of misinformation? Miles speaks to danah boyd, Principal Researcher at Microsoft Research, founder of Data & Society, and Visiting Professor at New York University. boyd offers insight into the history of misinformation on the internet and the role social media plays in the proliferation of fake news. It’s an interview we did for our upcoming series on "junk news" for the PBS NewsHour.

    —Huffduffed by adactio

  5. Claude Shannon, Father of Information Theory | Internet History Podcast

    Claude Shannon was a mathematician, electrical engineer, and cryptographer known as “the father of information theory.” In the pantheon of cool people who made the modern information era possible, he’s right up there. Today, we’re going to talk about Shannon’s life with Jimmy Soni and Rob Goodman, authors of a great biography of the man called A Mind at Play: How Claude Shannon Invented the Information Age. Especially you software engineers out there, if you don’t know who Claude Shannon was, get educated. You owe your livelihood to this man.
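
Shannon’s central quantity is entropy, H = −Σ p(x) log₂ p(x), the average number of bits per symbol a message carries. A minimal sketch (my illustration, not from the episode) computing it in Python:

```python
from collections import Counter
from math import log2

def entropy(message: str) -> float:
    """Shannon entropy of a message, in bits per symbol."""
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

# A fair coin flip carries exactly 1 bit per symbol:
print(entropy("HT"))    # 1.0
# A constant message carries no information:
print(entropy("AAAA"))  # 0.0
```

The skewed case sits in between: a mostly-predictable stream needs fewer bits per symbol than a uniform one, which is the intuition behind compression.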

    —Huffduffed by adactio

  6. The Far Future

    How do we prepare for the distant future? Helen Keen meets the people who try to.

    If our tech society continues, we can leave data for future generations in huge, mundane quantities, detailing our every tweet and Facebook ‘like’. But how long could this information be stored? And if society as we know it ends, will our achievements vanish with it? How do we plan for and protect those who will be our distant descendants and yet may have hopes, fears, languages, beliefs, even religions that we simply cannot predict? What if anything can we, should we, pass on?

    —Huffduffed by adactio

  7. Seth Lloyd: Quantum Computer Reality - The Long Now

    The 15th-century Renaissance was triggered, Lloyd began, by a flood of new information which changed how people thought about everything, and the same thing is happening now.

    All of us have had to shift, just in the last couple decades, from hungry hunters and gatherers of information to overwhelmed information filter-feeders.

    Information is physical.

    A bit can be represented by an electron here to signify 0, and there to signify 1.

    Information processing is moving electrons from here to there.

    But for a “qubit” in a quantum computer, an electron is both here and there at the same time, thanks to “wave-particle duality.”

    Thus with “quantum parallelism” you can do massively more computation than in classical computers.
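
The superposition behind this “quantum parallelism” can be sketched with a toy state-vector simulation (an illustration under my own assumptions, not code from the talk): a Hadamard gate puts one qubit into an equal mix of 0 and 1, and n such qubits carry amplitude on all 2ⁿ basis states at once.

```python
from math import sqrt

def hadamard(state):
    """Apply a Hadamard gate to a single-qubit state [a0, a1]."""
    a0, a1 = state
    return [(a0 + a1) / sqrt(2), (a0 - a1) / sqrt(2)]

def tensor(s1, s2):
    """Combine two state vectors into one joint state vector."""
    return [x * y for x in s1 for y in s2]

# Start each of 3 qubits in |0> and apply a Hadamard to each.
q = hadamard([1.0, 0.0])   # equal amplitude on 0 and 1
state = q
for _ in range(2):
    state = tensor(state, q)

# All 2**3 = 8 basis states now carry equal probability:
probs = [abs(a) ** 2 for a in state]
print(probs)  # each value is approximately 1/8
```

A classical register of 3 bits holds exactly one of those 8 values; the quantum state holds an amplitude for every one of them simultaneously, which is the “massively more computation” Lloyd describes.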

    It’s like the difference between the simple notes of plainsong and all that a symphony can do—a huge multitude of instruments interacting simultaneously, playing arrays of sharps and flats and complex chords.

    Quantum computers can solve important problems like enormous equations and factoring—cracking formerly uncrackable public-key cryptography, the basis of all online commerce.

    With their ability to do “oodles of things at once,” quantum computers can also simulate the behavior of larger quantum systems, opening new frontiers of science, as Richard Feynman pointed out in the 1980s.

    Simple quantum computers have been built since 1995, by Lloyd and ever more others.

    Mechanisms tried so far include: electrons within electric fields; nuclear spin (clockwise and counter); atoms in ground state and excited state simultaneously; photons polarized both horizontally and vertically; superconducting loops going clockwise and counter-clockwise at the same time; and many more.

    To get the qubits to perform operations—to compute—you can use an optical lattice or atoms in whole molecules or integrated circuits, and more to come.

    The more qubits, the more interesting the computation.

    Starting with 2 qubits back in 1996, some systems are now up to several dozen qubits.

    Over the next 5-10 years we should go from 50 qubits to 5,000 qubits, first in special-purpose systems but eventually in general-purpose computers.

    Lloyd added, “And there’s also the fascinating field of using funky quantum effects such as coherence and entanglement to make much more accurate sensors, imagers, and detectors.”

    Like, a hundred thousand to a million times more accurate.

    GPS could locate things to the nearest micron instead of the nearest meter.

    Even with small quantum computers we will be able to expand the capability of machine learning by sifting vast collections of data to detect patterns and move on from supervised learning (“That squiggle is a 7”) toward unsupervised learning—systems that learn to learn.

    The universe is a quantum computer, Lloyd concluded.

    Biological life is all about extracting meaningful information from a sea of bits.

    For instance, photosynthesis uses quantum mechanics in a very sophisticated way to increase its efficiency.

    Human life is expanding on what life has always been—an exercise in machine learning.

    —Stewart Brand

    —Huffduffed by adactio

  8. The Big Web Show #142: Information Architecture is Still Very Much a Thing, with Abby Covert

    Jeffrey Zeldman’s guest is Abby Covert, Information Architect; curator of IA Summit; co-founder of World IA Day; president of IA Institute; teacher in the Products of Design MFA program at New York’s School of Visual Arts; and author of How to Make Sense of Any Mess.

    —Huffduffed by adactio

  9. On Learning and Comprehension

    We often see blog posts about optimizing our images or HTML, or even our team’s work flow. But what about optimizing our comprehension? In an ever-changing industry where tools, ideas, and opinions grow exponentially, how can we keep up? This is a topic very close to my heart as somebody who is both stubbornly ambitious and also has a really terrible memory.

    Front end developers are often bombarded with so many tasks, options, and stimuli that we end up overwhelmed by choices, causing complete paralysis and blocking us from getting anything done at all. This is called option paralysis or analysis paralysis (that’s a real thing). And it doesn’t help that we work on the internet, where opening a new tab is like walking into a new room, causing us to forget what we were just focused on and to latch onto a new stimulus instead.

    The Doorway Effect

    Tammy Everts wrote a really good blog post about this phenomenon — "The Doorway Effect" — where she relates neuroscience to the need for quick and efficient website performance. In short, the Doorway Effect explains why we can go searching for something in one room, walk into another room to look for it, and forget what we were looking for in the first place. This is due to our sensory memory, which works surprisingly similarly to a computer’s memory bank.

    This graphic demonstrates the theory of Persistence of Vision: a phenomenon where an after-image persists in human memory. A radially spinning rope is perceived as an unbroken circle if it completes a revolution in under 100ms, while breaks are noticed when it spins slower. Sensory memory, responsible for Persistence of Vision, works in 100ms bursts. Once this time is up, we simply move on to the next sensory input to take up that space. Google’s Urs Hölzle, in this talk at Velocity 2010, describes how their goal is to make web pages seamless — like they are pages in a book. He says, "we’re really aiming, something very, very, high here, at something like 100ms." Ironic? Maybe. Maybe not.

    Learning Modalities

    While there is some disagreement on whether people learn better when tailoring their experience to a particular "learning style" (AKA the "are-learning-styles-even-real?" debate), it is universally agreed that various stimuli influence our ability to recall information. These stimuli come in four "modalities": visual, auditory, read/write, and kinesthetic.

    According to Sunni Brown’s research, in order to really comprehend information and do something with it, we must engage either two of those modalities, or any one of them paired with an emotional response. This is because memory is stored in terms of meaning, and activating multiple senses during any experience makes it more meaningful and memorable.


    Last week, Jodi Cutler responded to one of my Instagram photos of a doodle I drew in an important meeting with a link to this TED Talk. At first, I thought, "shit, I must be in trouble". But the TED Talk outlines how negative perceptions of doodling are inconsistent with findings from education research.

    Doodling gets a bad reputation as something people do when they’re uninterested or unfocused, but that is far from true! According to recent neuroscience studies, people who doodle are more likely to retain information than those who do not (about 29% more, to be exact). In addition to increased retention, doodlers have proven to be more creative and to spark more ideas. Drawing an image helps stimulate new images in your brain that spark ideas. Brown preaches that "doodling should be leveraged in … situations where information density is very high."

    This made me feel a lot better about my doodling, which has covered the sides of my school notes and has recently found itself creeping onto Post-Its and sketchbooks. It can be awkward, especially around new teams, where people may think you’re ignoring them or not listening. Education is key here. To further avoid distraction, I personally find that the best method for me is to doodle repetitive, geometric shapes; it keeps my body moving and my mind in a neutral state.

    The dot grid doodle from a morning meeting last week

    The side comment in the notebook says "Chemi’s class crits doodle page." Chemi was the professor.

    "Doodling helps you concentrate and grasp new concepts because it keeps the brain at an ideal state of arousal"

    This idea relates to a lot of the reasons why Developers Work At Night. When we’re at 100% brain power, and buzzing, we’re thinking about too many other things to concentrate on a single task. Swizec Teller’s theory of the "sleepy brain" says that we work best at night because "there isn’t enough left-over brainpower to afford losing concentration."

    Another fun fact from Teller’s book: we’re more productive in the morning, yet more creative in the evening.


    Something I just started doing lately (unlike doodling) is listening to blog posts instead of reading them. This stemmed from my intense desire to multi-task as much as possible, and my enjoyment of the freedom that podcasts allow. I think it’s been working pretty well. If I really want to focus on the content, I will do both (read and listen to the same text). That way, if I get distracted and end up on another web page, there is still an auditory stimulus that keeps my sensory memory flowing (thus avoiding the "doorway effect") and keeps me on track.

    I liked listening to my blog posts so much that I started turning ebooks into audiobooks

    Listening also allows for that mild, ideal temperament to best retain information. One night, while relaxing and crafting holiday cards, I found myself going through 7 or 8 blog posts. The way I do this online is via a Chrome extension called Select and Speak. So blogs are a good start, but I also have several e-books on my iPad that I wish I could listen to instead of read. I discovered an app called Voice Dream Reader ($9.99) for the iPad, and have been using that to turn e-books into audiobooks. Voice Dream Reader also integrates with Pocket.

    Keep in mind, none of these are ideal solutions — TTS (Text to Speech) has a long way to go, especially when it comes to development or code-heavy posts. The screen reader trying to read code is just downright hilarious (but also very confusing/annoying). That’s why I’ve decided to provide an audio version for this post, and all future blog posts. It also makes the post less robotic to hear, since it’ll be coming from the author’s own voice. Regardless, audio works best for non-code-focused articles.

    The task you’re doing while listening is also important. For example, I can’t write while I listen to another audio source, making coding and blogging difficult. However, I do listen to blog posts while casually browsing the internet, driving, or crafting. It’s a great way to passively absorb information, especially when you’ve got a long list like I do.

    Actively Avoid Your Other Tasks

    Focusing is something that I often have trouble with. I have so many ideas, so many thoughts, so many things I want to do right now! So yes, I understand how advising someone to "focus on one task" literally means nothing. However, one of my co-workers the other day told me the same thing, but from a different perspective. She said:

    Pick one thing and try not to think about any of the other things you’re not doing.

    Well, of course! That makes sense. Instead of actively trying to focus on one task, actively avoid doing other tasks. If you are making a conscious decision that it is in your best interest to put those other tasks aside, it alleviates the anxiety and guilt of not doing them.

    Distractions are doorways. Avoiding distractions (or avoiding "The Doorway Effect") will help you complete whatever learning you’ve begun rather than getting lost in a tangle of disjointed links. There are a lot of tools I use to help with distractions while working.

    Self Control (the App)

    The Self Control App allows you to block content on distracting websites (i.e. Facebook) for a set period of time. The app prevents your computer from accessing any of the sites on your self-determined "block list," reminding you to get back to the task and preventing you from falling into a black hole of distraction.

    Leverage Habituation

    Nutritional studies show that participants who ate the same food each day were more likely to lose weight than those who consumed a more varied diet (even when they ate mac & cheese every day). The researchers said "habituation" — the body’s decreasing response to a stimulus after repeated exposure — was the cause.

    When doing development work, listen to music that’s familiar to you. If I put on new music (i.e. someone else’s playlist), and hear something I like, I’ll end up researching lyrics, adding it to my own playlists, searching other songs by that artist, etc. You get the point. So what I’m basically saying here is, play music that won’t be an unintended distraction.

    tl;dr: Doodle at meetings, try listening to blog posts, and actively avoid unrelated tasks.

    I’m curious to hear if these tips help you out. Don’t hesitate to leave a comment or tweet at me!

    —Huffduffed by adactio

  10. James Gleick: Bits and Bytes

    Former ‘New York Times’ writer James Gleick (the man who popularised "the butterfly effect" in ‘Chaos’) has produced the definitive history of the age in which we live, ‘The Information’. In it, Gleick speaks about the information "flood". He talks with Robyn Williams, presenter of ABC Science and ABC Radio National.

    We are in a predicament: we can reach out and get facts easily, but access does not necessarily bring knowledge. The gatekeepers of information are more important than ever, due to our reliance on these authorities for truth.

    This event was presented by Sydney Writers’ Festival 2011.

    James Gleick is an author, journalist and biographer whose books explore the cultural ramifications of science and technology. His books have popularised concepts such as "The Butterfly Effect" and sold bucketloads around the world. His most recent book, ‘The Information: A History, a Theory, a Flood’, is being hailed as his crowning work. Gleick is also the author of the bestselling books ‘Chaos’, ‘Genius’, ‘Faster’ and a biography of Isaac Newton. Three of these books have been Pulitzer Prize and National Book Award finalists, and have been translated into more than 20 languages. James divides his time between New York City and Florida.

    Robyn Williams has presented science programs on ABC radio and television since 1972. He is the first journalist to be elected a fellow of the Australian Academy of Science, was a visiting fellow at Balliol College, Oxford, and is a visiting professor at the University of NSW.

    —Huffduffed by adactio
