Signal processing

Auditory attention decoding

Object-based Speech Perception

In this project, we analyze a wide range of sound signals, including human speech, animal vocalizations, and even noise, to understand human auditory perception using time-frequency analysis and neural responses (mostly EEG). We are currently developing a Python-based time-frequency analysis toolbox and designing an auditory psychophysics experiment to test whether stable time-frequency structures are important for signal detection and speech perception. For more information, refer to this GitHub page!
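As a rough illustration of the kind of time-frequency analysis described above, the sketch below computes a log-magnitude spectrogram of a synthetic speech-like signal with SciPy's short-time Fourier transform. The toolbox mentioned in the project description is not assumed to be public, so the code uses only standard NumPy/SciPy calls and generic variable names rather than the toolbox's actual API.

```python
# Minimal time-frequency analysis sketch (assumed example, not the project's toolbox).
import numpy as np
from scipy.signal import stft

fs = 16_000                      # sampling rate in Hz (typical for speech)
t = np.arange(0, 1.0, 1 / fs)    # 1 second of signal

# Synthetic test signal: a tone gliding from 300 Hz to 3 kHz plus white noise,
# standing in for a speech-like signal with a moving spectral peak.
chirp = np.sin(2 * np.pi * (300 * t + 0.5 * 2700 * t**2))
noise = 0.1 * np.random.randn(t.size)
x = chirp + noise

# Short-time Fourier transform: 25 ms Hann windows with 50% overlap.
nperseg = int(0.025 * fs)
f, frames, Z = stft(x, fs=fs, window="hann", nperseg=nperseg, noverlap=nperseg // 2)

# Log-magnitude spectrogram: the basic time-frequency representation from which
# more stable structures (harmonics, onsets, modulations) could then be extracted.
spectrogram_db = 20 * np.log10(np.abs(Z) + 1e-12)
print(spectrogram_db.shape)      # (frequency bins, time frames)
```

A representation like this could serve as the starting point for the psychophysics experiment described above, e.g., by manipulating the stability of time-frequency structures in stimuli before resynthesis.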