Jennifer M. Groh
Professor of Psychology and Neuroscience
Research in my laboratory concerns how sensory and motor systems work together, and how neural representations play a combined role in sensorimotor and cognitive processing (embodied cognition). Most of our work focuses on the interactions between vision and hearing: we tend to perceive visual and auditory stimuli as bound together when they seem likely to have arisen from a common source. That's why we rarely notice that the speakers on TV sets or in movie theatres sit beside, not behind, the screen.

My laboratory investigates how the brain coordinates the information arising from the eyes and ears. Our findings challenge the historical view of the brain's sensory processing as automatic, autonomous, and immune from outside influence. We have recently established that neurons in the auditory pathway (inferior colliculus, auditory cortex) alter their responses to sound depending on where the eyes are pointing. The different sensory pathways thus meddle in one another's supposedly private affairs: the process of bringing signals from two senses into a common frame of reference begins at a surprisingly early point along the primary sensory pathways.
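The coordinate transformation alluded to here can be illustrated in its simplest one-dimensional form. Sounds are initially localized relative to the head (from interaural timing and level cues), while visual targets are localized relative to the eyes; aligning the two amounts, at its simplest, to subtracting eye position. This is only a minimal sketch of the idea, with illustrative names that do not come from the publications below:

```python
def head_to_eye_centered(sound_azimuth_deg: float, eye_position_deg: float) -> float:
    """Convert a sound's head-centered azimuth to eye-centered coordinates.

    In the simplest one-dimensional case the transformation is a
    subtraction: a sound 20 degrees right of the head, viewed with the
    eyes rotated 10 degrees right, lies 10 degrees right of the line
    of sight. (Illustrative sketch only, not a model from the lab.)
    """
    return sound_azimuth_deg - eye_position_deg


# A sound straight ahead of the head, eyes deviated 15 degrees right:
# the sound is 15 degrees *left* of where the eyes point.
print(head_to_eye_centered(0.0, 15.0))
```

The example makes concrete why an eye-position signal must reach the auditory pathway at all: without it, a head-centered sound location cannot be compared with an eye-centered visual one.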
Metzger, Ryan R., et al. “Auditory saccades from different eye positions in the monkey: implications for coordinate transformations.” Journal of Neurophysiology, vol. 92, no. 4, Oct. 2004, pp. 2622–27. doi:10.1152/jn.00326.2004.
Groh, Jennifer M., et al. “A monotonic code for sound azimuth in primate inferior colliculus.” Journal of Cognitive Neuroscience, vol. 15, no. 8, Nov. 2003, pp. 1217–31. doi:10.1162/089892903322598166.
Werner-Reiss, Uri, et al. “Eye position affects activity in primary auditory cortex of primates.” Current Biology, vol. 13, no. 7, Apr. 2003, pp. 554–62. doi:10.1016/s0960-9822(03)00168-4.
Groh, J. M., and M. S. Gazzaniga. “How the brain keeps time.” Daedalus, vol. 132, no. 2, 2003, pp. 56–61.
Groh, J. M. “Converting neural signals from place codes to rate codes.” Biological Cybernetics, vol. 85, no. 3, Sept. 2001, pp. 159–65. doi:10.1007/s004220100249.
Boucher, L., et al. “Afferent delays and the mislocalization of perisaccadic stimuli.” Vision Research, vol. 41, no. 20, Sept. 2001, pp. 2631–44. doi:10.1016/s0042-6989(01)00156-0.
Groh, J. M., et al. “Eye position influences auditory responses in primate inferior colliculus.” Neuron, vol. 29, no. 2, Feb. 2001, pp. 509–18. doi:10.1016/s0896-6273(01)00222-7.
Born, R. T., et al. “Segregation of object and background motion in visual area MT: effects of microstimulation on eye movements.” Neuron, vol. 26, no. 3, June 2000, pp. 725–34. doi:10.1016/s0896-6273(00)81208-8.