Dolby’s Poppy Crum wants to give you sensory superpowers

Violinist-turned-neuroscientist Poppy Crum talks sound, neuroplasticity, and sensory superpowers.

“Our sensory experience of the world should be controllable by us, so that we are not limited by what our physical senses let us experience. I want to build superpowers,” says Poppy Crum, a Senior Scientist at Dolby Laboratories. A neuroscientist by training, she uses knowledge of how the brain perceives sound and other sensory input to inform the design of Dolby’s products. Dolby just announced the Dolby 3D format at NAB, a system for creating glasses-free 3D content. “I study sensory perception because it fascinates me at every level,” she says.

Crum often pauses in intense silence or starts sentences she doesn’t finish as she grapples with how to describe a concept. Precision is important to her. While she clearly loves her work, Crum started off as a violinist, not a neuroscientist. At music school she took an elective course in neuroscience. “One of the papers was from Professor Eric Knudsen at Stanford, who studied representations of what we hear and see and how we form a multi-sensory map of an object based on these two senses. When he reared owls with prism glasses on (so what they saw was inconsistent with what they heard), they developed a secondary map that integrated auditory and visual space.”

Crum has absolute pitch, which she describes as hearing sound the way other people see color, though the names you give to those auditory colors depend on a learned pitch center. In Western music, for example, the note A is assigned to the frequency 440 Hz, whereas in early music — music from the medieval and Renaissance periods — it’s 415 Hz, a semitone lower. “I had a map of pitch that was at 440. Playing early music was driving me crazy for about six months. Every time I played a note, I saw it on the page and heard something different in my brain. One day I tuned up my early music violin at 415 and realized that I had absolute pitch at 415. I had developed an entirely new map of pitch. At that point, I decided that I had to understand my brain a little bit more.”
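
For readers curious about the arithmetic behind those two tunings: in equal temperament each semitone corresponds to a frequency ratio of 2^(1/12), so dropping A = 440 Hz by one semitone lands at roughly 415.3 Hz, the "415" Crum retuned to. Here is a minimal sketch of that calculation (not from the article, just the standard formula):

```python
# Equal-temperament arithmetic behind the 440 Hz vs. 415 Hz pitch standards.
# Each semitone corresponds to a frequency ratio of 2**(1/12).

SEMITONE_RATIO = 2 ** (1 / 12)

def shift_semitones(freq_hz: float, semitones: int) -> float:
    """Return the frequency `semitones` above (or below, if negative) freq_hz."""
    return freq_hz * SEMITONE_RATIO ** semitones

modern_a = 440.0                           # modern concert pitch
baroque_a = shift_semitones(modern_a, -1)  # one semitone down
print(f"One semitone below 440 Hz is {baroque_a:.1f} Hz")  # ~415.3 Hz
```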

The Science of Sound

Sound poses a complex problem for the brain. If you hear a dog barking, your brain needs to figure out which sound frequencies are associated with the dog, integrate those sounds with what you see, and determine where the dog is. “It’s like monitoring just one square foot of the waves on a lake and figuring out how many boats are on that lake, how fast they are moving, and where they are located. That’s the problem our ears are having to solve.”

Our brain is constantly reweighting the information it receives from different senses. “The ability to locate something visually has less error, so we weight what we see more than what we hear, but in terms of when something happens, we weight our sonic information much more than our visual information. As you get to lower light levels, the model starts to change and auditory information becomes more relevant.”
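
This kind of reliability-based reweighting is often modeled in the perception literature as inverse-variance (maximum-likelihood) cue combination: each sense contributes in proportion to how reliable it currently is, so a noisier cue, such as vision in dim light, simply counts for less. A toy sketch of that idea (my own illustration, not a description of any Dolby algorithm):

```python
# Toy inverse-variance cue combination: the more reliable (lower-variance) cue
# gets more weight in the combined estimate of where a sound source is.
# Illustrative only; not a model taken from the article.

def combine_cues(visual_est: float, visual_var: float,
                 auditory_est: float, auditory_var: float) -> float:
    """Combine two noisy location estimates, weighting each by 1/variance."""
    w_vis = 1.0 / visual_var
    w_aud = 1.0 / auditory_var
    return (w_vis * visual_est + w_aud * auditory_est) / (w_vis + w_aud)

# Bright light: vision is precise, so the combined estimate hugs the visual one.
print(combine_cues(visual_est=10.0, visual_var=1.0,
                   auditory_est=14.0, auditory_var=9.0))   # ~10.4
# Dim light: vision gets noisy, and the auditory estimate pulls the answer over.
print(combine_cues(visual_est=10.0, visual_var=25.0,
                   auditory_est=14.0, auditory_var=9.0))   # ~12.9
```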

Different sounds constantly mask and obscure other sounds. Our brain uses context and expectation to make sense of it all. “When you listen to an old record, it’s got scratches everywhere, but you can still hear the music. The experience of the music and even the emotion is often just as strong. When there’s a big scratch on a record, to our brain something is still there. Information is missing, but we fill it in dynamically — in frequency, in time, and in spatial position.”

Sensory Superpowers

Dolby is constantly developing new sound and imaging technologies, and an understanding of how the brain perceives is vital to doing that effectively. “All of our products take advantage of perception in some way. Our codecs are computational neural models that reduce information but maintain the perceptual experience by getting rid of information which I wouldn’t experience in real life.” One of the products in development is an imaging technology that can produce up to 20,000 nits (a measure of light emitted per unit area as perceived by the human eye), as opposed to the 450 to 1,000 nits of a typical HDTV display. When Crum watched a video of fire on one of these new screens, something strange happened.
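
To give a flavor of what it means to discard information a listener would never perceive, perceptual audio coding generally drops spectral components that a much louder nearby sound would mask anyway. The sketch below is a deliberately crude toy version of that idea; the thresholds and the whole approach are my own simplification, not Dolby’s codec:

```python
# Crude "perceptual" pruning: drop spectral components that sit well below a
# much louder nearby component, on the assumption a listener would not hear them.
# The 20 dB margin and 200 Hz neighborhood are arbitrary illustrative values.

MASKING_MARGIN_DB = 20.0
NEIGHBORHOOD_HZ = 200.0

def prune_masked(components):
    """components: list of (freq_hz, level_db). Return the components kept."""
    kept = []
    for freq, level in components:
        masked = any(
            abs(freq - other_freq) < NEIGHBORHOOD_HZ
            and other_level - level > MASKING_MARGIN_DB
            for other_freq, other_level in components
        )
        if not masked:
            kept.append((freq, level))
    return kept

spectrum = [(440.0, -6.0), (470.0, -40.0), (2000.0, -12.0)]
print(prune_masked(spectrum))  # the quiet 470 Hz tone next to the loud 440 Hz tone is dropped
```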

“I was watching a variety of content, all of which was producing the same amount of nits, but when the content was fire, I experienced my cheeks getting warm,” Crum explains. “So I used thermal imaging cameras to track people’s faces, and there were changes when they saw flame. When we see flame in real life, our bodies are already preparing to expel heat based on the luminescence which is reaching our retina in conjunction with the fact that we know it’s fire. I ran the same test on HD displays, and you don’t see anything like this. This technology is truly creating a realistic experience by tricking the body.”

If we can create technology that can trick the body, all kinds of new sensory experiences become possible. “You don’t just want to create reality, you want to create something that’s even better. By using the synergistic effects of our senses on each other, we can amplify them so we have heightened experiences and potentially heightened emotional responses. Many species have superpowers, like bats and their ability to navigate. You can look at these species and how their brains have solved problems and use technology to create an experience that is not limited by the physical capabilities of our senses.”

Musical Brain Training

Crum is not only concerned with understanding and manipulating the way the brain perceives, but also with how we can use sensory input to alter the structure of the brain itself.

She teaches a class at Stanford on neuroplasticity and musical gaming that covers how to create games that alter the organization of the brain in a targeted manner. “You have cells that have what is called a receptive field, the stimulus set which optimally produces a response from that cell out of some vast number of stimuli. With neuroplasticity, you might find that different cells change the stimulus they respond to, or that they become more selective, so you are developing more specificity.”
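
To make the receptive-field idea concrete, auditory neurons are commonly described with a tuning curve: a bell-shaped response centered on a preferred frequency. In that picture, the plasticity Crum describes shows up as the curve shifting to a new preferred stimulus or narrowing so the cell responds more selectively. A simplified toy model (mine, not code from the class):

```python
import math

def tuning_response(stimulus_hz: float, preferred_hz: float, bandwidth_hz: float) -> float:
    """Gaussian tuning curve: the cell responds most strongly at its preferred frequency."""
    return math.exp(-((stimulus_hz - preferred_hz) ** 2) / (2 * bandwidth_hz ** 2))

# Before training: a broadly tuned cell centered at 1000 Hz.
before = tuning_response(1100.0, preferred_hz=1000.0, bandwidth_hz=300.0)
# After training: same preferred frequency, but narrower (more selective) tuning.
after = tuning_response(1100.0, preferred_hz=1000.0, bandwidth_hz=100.0)
print(f"Response to an off-preferred 1100 Hz tone: before {before:.2f}, after {after:.2f}")
```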

Crum’s students have developed all kinds of sound-based games to target different skills. One game trains people to detect phonetic differences that are usually lost in adulthood in order to allow them to become bilingual more quickly. Students have created games to increase accuracy in localizing sound and train digit span to improve memory.

One application of this kind of gaming is compensating for age-related sensory decline. “If I have hearing loss, I am relying too much on sonic information. I’ve had students re-weight how we integrate audio and visual information to allow people to hear better in noisy environments, to rely more on their visual cues when their auditory cues are compromised. In that case, they were training people to detect phonetic differences in lip reading and then progressing to different words.”

Our future may be full of devices that can create the impression of reality or even heighten it, manipulate our sensory experiences and emotions, and even reshape our brains. This is less of a leap than it seems. “The brain’s job is not to get it right, it’s to be robust – the most impervious to error in action,” says Crum. “We are usually experiencing some kind of illusion.”
