
  • Book
    Adrian K.C. Lee, Mark T. Wallace, Allison B. Coffin, Arthur N. Popper, Richard R. Fay, editors.
    Summary: Auditory behavior, perception, and cognition are all shaped by information from other sensory systems. This volume examines this multisensory view of auditory function at levels of analysis ranging from the single neuron to neuroimaging in human clinical populations.

    Chapters and contributors:
      • Visual Influence on Auditory Perception (Adrian K.C. Lee and Mark T. Wallace)
      • Cue Combination Within a Bayesian Framework (David Alais and David Burr)
      • Toward a Model of Auditory-Visual Speech Intelligibility (Ken W. Grant and Joshua G.W. Bernstein)
      • An Object-Based Interpretation of Audiovisual Processing (Adrian K.C. Lee, Ross K. Maddox, and Jennifer K. Bizley)
      • Hearing in a "Moving" Visual World: Coordinate Transformations Along the Auditory Pathway (Shawn M. Willett, Jennifer M. Groh, and Ross K. Maddox)
      • Multisensory Processing in the Auditory Cortex (Andrew J. King, Amy Hammond-Kenny, and Fernando R. Nodal)
      • Audiovisual Integration in the Primate Prefrontal Cortex (Bethany Plakke and Lizabeth M. Romanski)
      • Using Multisensory Integration to Understand Human Auditory Cortex (Michael S. Beauchamp)
      • Combining Voice and Face Content in the Primate Temporal Lobe (Catherine Perrodin and Christopher I. Petkov)
      • Neural Network Dynamics and Audiovisual Integration (Julian Keil and Daniel Senkowski)
      • Cross-Modal Learning in the Auditory System (Patrick Bruns and Brigitte Röder)
      • Multisensory Processing Differences in Individuals with Autism Spectrum Disorder (Sarah H. Baum Miller and Mark T. Wallace)

    About the editors:
      • Adrian K.C. Lee is Associate Professor in the Department of Speech & Hearing Sciences and the Institute for Learning and Brain Sciences at the University of Washington, Seattle.
      • Mark T. Wallace is the Louise B. McGavock Endowed Chair, Professor in the Departments of Hearing and Speech Sciences, Psychiatry, and Psychology, and Director of the Vanderbilt Brain Institute at Vanderbilt University, Nashville.
      • Allison B. Coffin is Associate Professor in the Department of Integrative Physiology and Neuroscience at Washington State University, Vancouver, WA.
      • Arthur N. Popper is Professor Emeritus and Research Professor in the Department of Biology at the University of Maryland, College Park.
      • Richard R. Fay is Distinguished Research Professor of Psychology at Loyola University Chicago.

    Contents:
    Intro; Acoustical Society of America; Series Preface; Springer Handbook of Auditory Research; Preface 1992; Volume Preface; Contents; Contributors;
    Chapter 1: Visual Influence on Auditory Perception; 1.1 Introduction; 1.1.1 Basic Concepts and Historical Perspectives; 1.2 Volume Roadmap; 1.3 Outlook; References;
    Chapter 2: Cue Combination Within a Bayesian Framework; 2.1 Multisensory Integration and the Problem of Cue Combination; 2.2 Cue Combination in a Bayesian Framework; 2.3 The Maximum Likelihood Estimation Model; 2.4 Maximum Likelihood Estimation: A Flexible Cue Combination Model; 2.5 Maximum Likelihood Estimation Cue Combination in the Time Domain; 2.6 Changes in Maximum Likelihood Estimation Cue Weightings Over Development; 2.7 Cross-Modal Calibration During Development; 2.8 Cross-Modal Calibration and Sensory Deficits; 2.9 Summary; References;
    Chapter 3: Toward a Model of Auditory-Visual Speech Intelligibility; 3.1 Introduction; 3.1.1 The Importance of Signal-Based Models of Speech Intelligibility; 3.1.2 The Overlooked Problem of Auditory-Visual Speech Intelligibility; 3.1.3 Speech-Feature Complementarity and the Relative Importance of Different Spectral Regions; 3.1.4 Auditory-Visual Integration Efficiency; 3.1.5 Auditory-Visual Asynchrony; 3.1.6 Perception of Auditory-Visual Coherence and the Enhancement of the Auditory Speech Envelope; 3.2 Modeling Auditory-Visual Speech Intelligibility; 3.3 Future Challenges; 3.3.1 Complex Auditory Backgrounds; 3.3.2 Individual Differences: Hearing Acuity, Visual Acuity, and Integration Efficiency; 3.4 Summary; References;
    Chapter 4: An Object-Based Interpretation of Audiovisual Processing; 4.1 Introduction; 4.1.1 Multisensory Cocktail Party: Disambiguating Sound Mixtures Using Visual Cues; 4.1.2 Object-Based Attention; 4.1.3 The Auditory Perspective; 4.2 Visual, Auditory, and Auditory-Visual Objects; 4.2.1 Integration Versus Binding; 4.2.1.1 Unity Assumption; 4.2.1.2 Stimulus Factors Guiding Multisensory Integration; Spatial Colocation; Temporal Coincidence; Context Influencing Multisensory Integration; 4.2.2 Strong Test of Multisensory Binding and Multisensory Objecthood; 4.2.3 Models of Audiovisual Integration and the Role of Attention; 4.3 Reinterpreting Classic Audiovisual Illusions: Binding or Multisensory Integration?; 4.3.1 Ventriloquism; 4.3.2 Sound-Induced Flash Illusion; 4.3.3 McGurk Effect; 4.4 Competing Objects in the Audiovisual Scene; 4.4.1 Prediction from Unisensory Object-Based Attention Theory; 4.4.2 Effect of Spatial Cues; 4.4.3 Effect of Temporal Coherence; 4.5 Summary; References;
    Chapter 5: Hearing in a "Moving" Visual World: Coordinate Transformations Along the Auditory Pathway; 5.1 Introduction; 5.2 The Why and How of Linking Visual and Auditory Signals in Space; 5.3 Auditory Reference Frames in the Superior Colliculus; 5.4 Reference Frames Throughout the Brain; 5.4.1 Reference Frames in the Parietal and Frontal Cortices
    Digital Access: Springer, 2019