The Symphony of Sound: Decoding the Brain’s Auditory Secrets
Neuroscientist Michael Burger’s innovative research blends genetics and virtual reality to explore how the ear and brain work in harmony to process sound.
The field of neuroscience is ever evolving, driven by advancements in technology and the tenacity of researchers who push boundaries. Research by neuroscientist Michael Burger and members of his laboratory has yielded profound insights into the auditory system. From manipulating genes in embryonic chicken models to using virtual reality for synaptic analysis, his work aims to decode the intricate relationship between the ear and the brain.
Sound is everywhere, and categorizing and locating these sounds is critical as we navigate our surroundings. To make sense of sound, a major function of the ear is to separate frequencies, a process that allows you to appreciate the complexity of music and language. It achieves this by processing each frequency in a separate “channel,” as neurons in the ear respond to low or high frequencies independently. In the coiled tube of the inner ear, or cochlea, vibrations produced by sounds are converted to neural activity before being sent to the brain.
Hair cells in the cochlea convert sound into neural signals, and the neurons they connect to are specialized to process specific frequencies. Due to mechanical properties within the cochlear duct, low frequencies resonate at one end and high frequencies at the other, like a musical instrument. The cochlea translates frequency into place, a fundamental organizing principle of auditory processing, Burger says. This mapping of “frequency to place” is called tonotopy.
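To make the frequency-to-place idea concrete, here is a minimal sketch using the Greenwood function, an empirical fit for the human cochlea. It illustrates tonotopy in general; it is not the avian map studied in Burger’s lab, and the constants are the standard published values for humans rather than anything specific to this research.

```python
# Minimal illustration of "frequency to place" (tonotopy) using the Greenwood
# function, an empirical fit for the human cochlea. This is a generic example,
# not a model of the chick ear studied in the Burger lab.

def greenwood_frequency(x, A=165.4, a=2.1, k=0.88):
    """Characteristic frequency (Hz) at relative position x along the cochlea,
    where x = 0 is the apex (low frequencies) and x = 1 is the base (high)."""
    return A * (10 ** (a * x) - k)

if __name__ == "__main__":
    for x in (0.0, 0.25, 0.5, 0.75, 1.0):
        print(f"position {x:.2f} -> ~{greenwood_frequency(x):,.0f} Hz")
```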
This tonotopic organization is then remapped everywhere in the brain where sounds are processed. The tonotopic organization of hearing is of particular interest to Burger. His lab has identified several properties of auditory neurons that appear to “tune” their own frequencies along the tonotopy within the brain. The key question is, how did this perfect tuning arise in development? Burger thinks it might be explained by one of two theories. One suggests that the tonotopic properties first arise in the ear, then during development, the ear drives the tuning of neurons in the brain. Alternatively, brain organization may develop independently of the ear, instead relying on mapping cues present in the developing brain itself to establish tonotopic patterns.
“How do the neurons in the brain develop their specializations, since they're spatially distributed?” says Burger, professor of neuroscience in the department of biological sciences. “Do the neurons become who they are because of where they live? Or alternatively, are they instructed to become that way by the ear?”
Pioneering Techniques in Auditory Research
Burger's research examines how the brain organizes and processes auditory information, focusing on understanding how the brain adapts to changes in hearing. Using chicken embryos, his team alters the development of the inner ear to study how the brain responds to modified sound input. By injecting a gene that carries the instructions to produce bone morphogenetic protein 7 (BMP7) into one of a chicken embryo’s developing ears, they change its auditory structure so that it picks up mainly low-frequency sounds, while high-frequency processing is dramatically diminished. This creates a uniquely patterned ear for studying how the brain’s neurons develop with this altered input pattern. Do they adjust to this new sound range, or do they stick to their original role?
Chickens are ideal for this research because their auditory system is simpler than that of mammals, and the embryos grow in an accessible egg outside of the mother. Like humans, chickens’ ears and brains use tonotopic organization, but unlike our coiled cochleae, chickens’ cochleae are straight, making it easier to study how hair cells are organized.
“In the birds, the hair cells are electrically tuned to resonate different frequencies the same way you turn the dial of an old-fashioned radio and change the electrical tuning of the capacitors in the radio to find your favorite station,” Burger says.
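The radio analogy rests on a textbook relation: in an LC tuning circuit, the resonant frequency depends on the capacitance, so turning the dial retunes which frequency the circuit selects. A minimal statement of that relation, offered only to illustrate the analogy and not as a model of hair-cell tuning:

```latex
% LC-resonance relation behind the radio analogy (not a model of hair cells):
% turning the dial changes the capacitance C, shifting the selected frequency.
\[
  f_{\text{res}} = \frac{1}{2\pi\sqrt{LC}}
\]
```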
The team injects these genetic constructs into embryos just two days after the eggs start incubating. They open a small hole in an egg, inject the gene into one of the chick's developing ears, and then seal the egg to let it grow normally. This creates a chicken with one ear that hears all frequencies normally and one ear that only hears low frequencies.
"What we're asking right now is if we change the organization of the ear, do we change the organization of this synaptic pattern in the brain?"

Why does this matter? The team wants to know whether neurons in the brain are programmed based on their location (like being “born” a high-frequency neuron) or if they can change based on the sounds they receive. Their findings show it’s the sounds—or the input—that determine what kind of neuron develops.
“One of the things that differentiates low and high frequency cells in the brain is that the low frequency cells get many small inputs from the ear, about 10 to 13 tiny little synaptic inputs,” Burger says. “The high frequency cells, though, get one to three very large inputs with synapses that are large enough to easily see in the microscope. So, what we're asking right now is if we change the organization of the ear, do we change the organization of this synaptic pattern in the brain?”
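As a rough illustration of the pattern Burger describes, a toy sketch might flag whether an annotated cell looks “low-frequency-like” (many small inputs) or “high-frequency-like” (a few large inputs) from its number of synaptic inputs. This is not the lab’s analysis code; the data structure and thresholds are invented for the example.

```python
# Toy illustration of the synaptic pattern described above: low-frequency
# neurons receive many small inputs, high-frequency neurons a few large ones.
# The AnnotatedCell structure and example values are hypothetical.
from dataclasses import dataclass

@dataclass
class AnnotatedCell:
    cell_id: str
    synapse_areas_um2: list  # one entry per synaptic terminal contacting the cell

def classify_input_pattern(cell: AnnotatedCell) -> str:
    n = len(cell.synapse_areas_um2)
    if n >= 10:
        return "many small inputs (low-frequency-like)"
    if n <= 3:
        return "few large inputs (high-frequency-like)"
    return "intermediate"

if __name__ == "__main__":
    example = AnnotatedCell("cell_01", [1.2] * 12)  # 12 small terminals
    print(example.cell_id, "->", classify_input_pattern(example))
```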
Applications of Virtual Reality in Synaptic Analysis
Burger’s lab isn’t just innovating at the genetic level; they’re also revolutionizing how data is analyzed. Once doctoral student Kwame Owusu-Nyantakyi prepares slices of a chicken’s brain and images them, the lab uses the collected data to compute a virtual three-dimensional representation of each cell. Using virtual reality goggles with handheld controllers, undergraduate neuroscience major Audrey Snyder ’26 interacts with representations of neural synapses, rotating and exploring them in virtual space. This immersive approach allows for precise measurements of synaptic structures and facilitates collaboration across the lab.
“Previously there was a system that did this, but in a more tedious way,” Owusu-Nyantakyi says. “So, for all of the data we get, we use confocal imaging, and you get optical slices of each of the segments through the tissue. Back then you had to go and literally stitch up all those images back together to give you the 3D volume, do some post-hoc analysis, and make sure your image is good, and then try to pull out the data you want. Now, once you do the annotation, you've got your data, which saves up so much time. And it works with so many different types of tissues, not just brains.”
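For readers who want a concrete picture of what “stitching optical slices back into a 3D volume” can look like in code, here is a minimal sketch assuming each confocal slice was saved as its own TIFF file. The file names are hypothetical, and the lab’s SyGlass-based pipeline may work quite differently.

```python
# Minimal sketch: load 2D confocal optical slices and stack them into a
# 3D (z, y, x) volume. File names/paths are hypothetical examples only.
import glob
import numpy as np
import tifffile

def stack_optical_slices(pattern="slices/cell_01_z*.tif"):
    """Read optical slices in z-order and stack them into a single 3D array."""
    paths = sorted(glob.glob(pattern))
    slices = [tifffile.imread(p) for p in paths]
    return np.stack(slices, axis=0)

if __name__ == "__main__":
    volume = stack_optical_slices()
    print("volume shape (z, y, x):", volume.shape)
```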
Snyder can work within a cell—zooming in and out in high resolution or rotating it to assign labels and colors to cells, synapses, axons, and terminals for differentiation. She can also determine cellular boundaries (e.g., where axons and synaptic terminals start and end) and mark uncertainties directly on the cell with virtual questions or notes for later team review. The collected data can then be easily exported into Microsoft Excel spreadsheets.
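As a generic illustration of that final export step (the column names are hypothetical, and SyGlass provides its own export tools), a small pandas script could write such measurements to a spreadsheet:

```python
# Generic sketch of exporting annotation measurements to an Excel file with
# pandas. The column names and values are hypothetical examples.
import pandas as pd

measurements = [
    {"cell_id": "cell_01", "structure": "synaptic terminal", "volume_um3": 42.7},
    {"cell_id": "cell_01", "structure": "axon segment", "volume_um3": 12.3},
]

df = pd.DataFrame(measurements)
df.to_excel("annotations.xlsx", index=False)  # requires the openpyxl package
print(df)
```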
Beyond the research applications, the lab offers advanced experiential learning opportunities. Snyder says her experiences in the lab will help her when she goes on to a graduate program. A native of Ghana, Owusu-Nyantakyi came to Lehigh from Knox College and says the lab supports him as he continues to hone his research skills.

“I have enjoyed working in the Burger lab,” Owusu-Nyantakyi says. “I have acquired skills such as patch clamp techniques and using VR systems to analyze data that I would not have been able to get in any other lab. I find SyGlass technology very cool because it allows me to visually interact with my data in ways I have not experienced throughout my science career. This has greatly improved my work as it is now easy to visualize and analyze my data in real time, which saves me a lot of time to work on my other graduate work. I also think it is a great tool for undergraduate research, as it is very engaging and keeps you coming back to work on your data. I have also found that it is a great teaching tool for visualizing biological structures in their complexity. Additionally, working with Mike has been great. I have been able to present at conferences and definitely grown as a scientist.”
Burger’s lab exemplifies the synergy between traditional scientific expertise and emerging technologies. Burger remains cautiously optimistic about the future. “Sometimes as a scientist, I’m afraid to talk too prospectively about things that may or may not work,” he says. Yet, the strides already made—from gene manipulation in embryos to VR-based synaptic analysis—underscore the transformative potential of his work.
The journey ahead is filled with unanswered questions. How do developmental processes in the ear shape the brain’s auditory capabilities? Can the principles uncovered in the chicken model be translated to human health? And what new technologies will emerge to further unravel these mysteries?
For now, Burger’s lab continues to chart new territory, driven by curiosity and a commitment to advancing our understanding of the brain. “We’re finally getting to some of the bigger questions,” Burger says. The possibilities are as expansive as the intricate networks of the brain itself.