Old records are breaking, cassette tapes are warping, even digital recordings can become obsolete. The Library of Congress is working to save millions of the nation’s recordings before they’re lost.
Deep beneath the border of France and Switzerland, the world's most massive physics machine is smashing subatomic particles into each other at nearly the speed of light. Physicists working with the 17-mile-long Large Hadron Collider hope it will help solve some of the universe's mysteries.
But first, researchers must overcome two very mundane hurdles: how to handle all of the data the LHC generates, and how to get non-scientists to care.
One physicist has a novel way to solve both problems: sound.
"I have some musician friends that I was talking to about physics, which I do a lot, if people will let me, and I was doing impersonations of particles — as you do — or maybe not," Lily Asquith says with a laugh. She is a physicist who until recently worked with the LHC at CERN, the European Organization for Nuclear Research.
Here’s How It Works
The concept underlying the LHC Sound project is a principle called sonification: using data to make sound. At the most basic level, sonification correlates a physical property, such as distance, speed, or direction, with a sound property such as loudness, pitch, or duration. On her blog, Asquith explains how to make sound out of almost anything.

The project's audio clip is sonified data from the ATLAS detector at the Large Hadron Collider. Here's how the sound was made. As a beam of particles is fired through the detector, three data points are collected and mapped to sound parameters: (1) the distance a particle travels away from the beam (labeled dR in the project's diagram) becomes the sound's pitch, (2) the amount of energy the particle has sets the volume, and (3) how far the particle travels sets the timing of the notes.

Asquith, like many physicists, spends a lot of time thinking about particles like the elusive Higgs boson, the subatomic particle that scientists say endows everything in the universe with mass. Proving the existence of the Higgs boson is one of the main goals of the collider.
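The three-way mapping described above can be sketched in a few lines. This is an illustrative sketch, not the LHC Sound project's actual code: the frequency range, clip length, and function names are assumptions made for the example.

```python
# Illustrative sketch of the dR -> pitch, energy -> volume,
# length -> timing mapping. All ranges here are invented for the
# example, not taken from the LHC Sound project.

def rescale(x, lo, hi, out_lo, out_hi):
    """Linearly map x from [lo, hi] onto [out_lo, out_hi]."""
    if hi == lo:
        return out_lo
    return out_lo + (x - lo) * (out_hi - out_lo) / (hi - lo)

def sonify_tracks(tracks):
    """Map (dR, energy, length) rows to (onset_s, freq_hz, amplitude) notes.

    dR     -> pitch  (two octaves above 220 Hz)
    energy -> volume (amplitude between 0.1 and 1.0)
    length -> timing (note onsets spread over a 10-second clip)
    """
    dr = [t[0] for t in tracks]
    en = [t[1] for t in tracks]
    ln = [t[2] for t in tracks]
    notes = []
    for d, e, l in tracks:
        freq = 220.0 * 2.0 ** rescale(d, min(dr), max(dr), 0.0, 2.0)
        amp = rescale(e, min(en), max(en), 0.1, 1.0)
        onset = rescale(l, min(ln), max(ln), 0.0, 10.0)
        notes.append((onset, freq, amp))
    return sorted(notes)  # play back in onset order
```

Each resulting tuple could then be rendered as a tone in any synthesis environment; the point is simply that every column of detector numbers gets its own audible dimension.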
"You tend to personify things that you think about a lot," she says. She gives particles personalities, colors and sounds. "I think electrons, perhaps, sound like a glockenspiel to me."
In the process of the search for the Higgs, the collider generates a massive amount of information — more than 40 million pieces of data every second. And that’s just from the ATLAS detector, one of the four main detectors in the deep underground complex that tunnels back and forth across the French-Swiss border.
So Asquith was trying to figure out a new way to understand and sort through all of this data. The LHC currently produces colorful images as an output from the data — sprays of particles in different directions.
"It’s quite easy to step from there, really, to consider that there could be some kind of sound associated with these things," she says.
Making Sound From The Data
She thought about a heart monitor in a hospital; it turns the electrical data from your heart into sound.
"You don’t have to watch the monitor because you can hear it without making any effort," she says. "Just a steady beep — you can quite easily detect if it starts going quicker or if it stops even for a second."
She wondered what would happen if she used music composition software to turn data from the collider into sound. So she fed in a sample of the LHC data — three columns of numbers.
"So we’ll map, for example, the first column of numbers, which may be a distance, to time," Asquith says. "And we may map the second column of numbers to pitch, and the third, perhaps, to volume."
What she got isn’t quite music, but sounds that are more out of this world — bells, beeps and clangs.
Interpreting The Sounds
Right now, Asquith says, the sounds don't tell scientists very much. But she hopes that in the future, sonification could help them understand the data in new ways.
Video: Colliding Particles. This animation shows a collision between particles in the ATLAS detector at the Large Hadron Collider. (The video clip has no sound. Credit: ATLAS Experiment)

She says that in certain situations, it's much easier to use your ears than your eyes, particularly with something that's changing over time. Collider data do exactly that.
"You could certainly have an alarm system which told you when, for example, you have an event which looks ridiculous according to what you’ve expected," she says. "And that’s quite difficult to do using your eyes."
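The alarm idea can be sketched as a simple threshold check on a stream of event summaries: learn what "normal" looks like, then sound a tone when a value lands far outside it. Everything here, including the baseline size, the five-sigma threshold, and the function name, is an assumption made for illustration, not part of the actual ATLAS monitoring.

```python
# Toy sketch of an audible-alarm monitor: learn a baseline from the
# first events, then flag any later event that deviates wildly.
# All numbers are invented for the example.

def anomaly_alarm(events, n_baseline=50, n_sigma=5.0):
    """Yield the index of every event more than n_sigma standard
    deviations from the mean of the first n_baseline events."""
    baseline = events[:n_baseline]
    mean = sum(baseline) / len(baseline)
    var = sum((x - mean) ** 2 for x in baseline) / len(baseline)
    std = var ** 0.5 or 1.0  # guard against a perfectly flat baseline
    for i, x in enumerate(events[n_baseline:], start=n_baseline):
        if abs(x - mean) > n_sigma * std:
            yield i  # a real monitor would play an alert tone here
```

Because the alarm is a sound rather than a plot, a physicist could notice the ridiculous-looking event without watching a screen, exactly as with the hospital heart monitor Asquith describes.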
But the project is doing something else — making what’s going on at the collider accessible and interesting to people without a Ph.D. That includes many of Lily’s friends who are musicians.
They are really interested in — even fascinated by — what’s going on at the LHC. But she says they start to look frightened when she brings up the hard science.
"I just think that’s unnecessary that it frightens people — it should be something that everyone should enjoy," she says.
This week Jamillah talks to the creator of electronics that let people with disabilities use games consoles, a guy who tells stories in 3D online, a couple of chaps who created music using camera sounds, and the creator of Dr Puppet, a web-based animation linked to the 50th anniversary of Doctor Who.
Claudia Hammond travels to Japan to investigate a condition known as hikikomori, in which people withdraw almost entirely from social life.
Emily Maitlis discusses the digital future with Google head Eric Schmidt; data journalist James Ball; curator Honor Harger; and risk expert David Spiegelhalter.
Douglas Coupland and William Gibson discuss culture, technology, and the craft of writing. Communications technologies are a global memory prosthesis, says Gibson, and aspire to an experience in which distinctions between the "virtual" and the "real" are dissolved. We are already the Borg, Gibson says.
Long ago, listening gave clues for survival. Now we listen unconsciously, blocking noise and tuning in to what we want to hear. Yet the unwanted sounds we filter out tell us a lot about our environment and our lives. Broadcaster Teresa Goff listens for the messages in our walls of sound.
As civilization has become more mechanized, more urbanized and more digitized, the amount of noise has increased in tandem. This noise, according to Garrett Keizer, author of The Unwanted Sound of Everything We Want: A Book About Noise, "is a window for understanding some of the paradoxes and contradictions of being human." If you take the sum total of all sounds within any area, what you have is an intimate reflection of the social, technological, and natural conditions of that place.
Hildegard Westerkamp, a founding member of the World Forum for Acoustic Ecology, says that "Environmental sound is like a spoken word with each sound or soundscape having its own meanings and expressions." So when you listen to the noise, what does it have to tell you? "Noise is a pit of interpretation," says noise musician Brian Chippendale. Broadcaster Teresa Goff goes into the pit with her documentary, The Signal of Noise.
Tom Kane is the man behind thousands of video games, commercials, and yes, movie trailers.
Cartoonist Matt Groening remembers how he created The Simpsons 25 years ago.
This week’s episode of the CoP Show explains what transmedia storytelling is and why producers might want to use it.
The simplest definition of transmedia storytelling is that it is a technique used to tell stories across multiple platforms: TV, radio, games, novels, social media, online or anywhere a story can unfold.
A transmedia storyteller may create many "entrypoints" across different platforms, so that, for example, a fan of a drama can read the online diaries of their favourite characters or follow their comments on Twitter.
The theory goes that, by doing this, you can not only give your audience more of what they want and love, but also bring in a whole new audience that would not otherwise find your content.
Joining presenter Simon Smith are Chris Sizemore, Executive Editor of BBC's Learning & Knowledge Online; Adrian Hon, Chief Creative at transmedia specialist Six to Start; and Meg Jayanth, a BBC multi-platform producer.