Methodological developments in auditory reverse-correlation experiments

One of the main focuses of my research is the development and implementation of a new, purely behavioral psycholinguistic approach known as Auditory Classification Images (ACI). This approach draws inspiration from reverse-correlation studies in visual psychophysics, which link random fluctuations in a stimulus to the participant's perceptual responses in a task, on a trial-by-trial basis. The objective of this project is to adapt and operationalize the reverse-correlation technique for the auditory modality, with the ultimate goal of applying it to the exploration of speech perception, particularly phoneme comprehension. Auditory Classification Images allow us to visualize the « listening strategy » of a participant, revealing the information they extract from the auditory stimulus to perform the task.
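In its simplest form, the reverse-correlation logic works as follows: the random noise added to the target sound on each trial is averaged separately according to the listener's response, and the difference between these conditional averages reveals which regions of the noise pushed the decision one way or the other. The following MATLAB sketch illustrates this computation on simulated data; all variable names are hypothetical and the snippet is not taken from the fastACI toolbox.

    % Minimal reverse-correlation sketch on simulated data. Variable names are
    % hypothetical; this illustrates the principle only, it is not fastACI code.
    n_trials = 1000;                        % number of trials
    n_bins   = 64;                          % e.g. time or time-frequency bins of the noise
    noise    = randn(n_trials, n_bins);     % one random noise profile per trial
    response = randi([0 1], n_trials, 1);   % simulated binary responses of the listener

    % Classification image: mean noise on trials answered "1" minus mean noise on
    % trials answered "0" (ignoring the stimulus dimension of the full analysis).
    ci = mean(noise(response == 1, :), 1) - mean(noise(response == 0, :), 1);

    plot(ci);
    xlabel('Noise bin'); ylabel('Weight');
    title('Classification image (simulated data)');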

The MATLAB toolbox that we are developing is available as a GitHub repository: https://github.com/aosses-tue/fastACI. It can be used to replicate previous auditory reverse-correlation (revcorr) experiments or to design new ones.

With this toolbox, you can run the listening experiments used in the studies listed in the table below, and reproduce most of the figures from the corresponding references. The toolbox was also used to replicate Ahumada et al.'s seminal experiment (Ahumada, Marken & Sandusky, 1975). Experiments can be run on human participants over headphones, or on auditory models (Osses & Varnet, 2021; see also Osses & Varnet, 2024). A minimal usage example is sketched after the table.

Reference | fastACI experiment name | Task | Background noise | Target sounds
Ahumada, Marken & Sandusky (1975) | replication_ahumada1975 | tone-in-noise detection | white | 500-Hz pure tone vs. silence
Varnet & Lorenzi (2022) | modulationACI | amplitude-modulation detection | white | modulated vs. unmodulated tones
Varnet et al. (2013) | speechACI_varnet2013 | phoneme categorization | white | /aba/-/ada/, female speaker
Varnet et al. (2015) | speechACI_varnet2015 | phoneme categorization | white | /alda/-/alga/-/arda/-/arga/, male speaker
Osses & Varnet (2021) | speechACI_varnet2013 | phoneme categorization | speech-shaped noise (SSN) | /aba/-/ada/, female speaker
Osses & Varnet (2024) | speechACI_Logatome | phoneme categorization | white, bump, MPS | /aba/-/ada/, male speaker from the OLLO database
Carranante et al. (in press) | speechACI_Logatome | phoneme categorization | bump | pairs of contrasts among /aba/, /ada/, /aga/, /apa/, /ata/, same male speaker (S43M) from OLLO
Osses et al. (2023) | segmentation | word segmentation | random prosody | pairs of contrasts: /l’amie/-/la mie/, /l’appel/-/la pelle/, /l’accroche/-/la croche/, /l’alarme/-/la larme/
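As an illustration, launching one of these experiments should amount to a single call to the toolbox's entry point, with the experiment name taken from the table above. The function name, argument order, and subject identifier below are assumptions made for the example; please check the repository README for the exact syntax.

    % Assumed usage of the fastACI entry point (check the README for the exact
    % signature); 'S01' is a hypothetical subject identifier.
    fastACI_experiment('speechACI_varnet2013', 'S01');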

Administrative details

Collaborators: Alejandro Osses

Funding:

NSCo Doctoral School, L. Varnet’s thesis scholarship (2012-2015)

Agence Nationale de la Recherche, project ANR-20-CE28-0004 « Exploration des représentations phonétiques et de leur adaptabilité par la méthode des Images de Classification Auditive rapides » [Exploring phonetic representations and their adaptability with the fast Auditory Classification Image method] (2021-2023)

Selected publications and presentations:

English

General overview of the toolbox

Varnet, L. Presentation to the Laboratoire de Psychologie et NeuroCognition (2023, Grenoble): « Using reverse correlation to study speech perception » (slides)

Varnet, L. Presentation to the Hearing Institute (2021, Paris): « New methodologies for studying listening strategies in phoneme categorization tasks » (slides)

Osses, A., Lorenzi, C., Varnet, L. (2022). Assessment of individual listening strategies in amplitude-modulation detection and phoneme categorisation tasks. ICA (article)

Varnet, L. Presentation to ARO 2022: « Auditory Classification Images: A Psychophysical Paradigm to Explore Listening Strategies in Phoneme Perception » (slides, video)

About the use of auditory models within the toolbox

Osses, A. & Varnet, L. (2021). Consonant-in-noise discrimination using an auditory model with different speech-based decision devices. DAGA proceedings (article)

Osses, A. & Varnet, L. (2022) Using reverse correlation to study individual perception: Including an auditory model in the experimental design loop. (slides)

Osses, A. & Varnet, L. (2023). Using auditory models to mimic human listeners in reverse correlation experiments from the fastACI toolbox. Forum Acusticum (article)

Articles using the toolbox

Varnet, L., Knoblauch, K., Meunier, F., Hoen M. (2013). Using auditory classification images for the identification of fine acoustic cues used in speech perception. Frontiers in Human Neuroscience, 7:865. doi: 10.3389/fnhum.2013.00865. (article)

Varnet, L., Knoblauch, K., Serniclaes, W., Meunier, F., Hoen M. (2015). A Psychophysical Imaging Method Evidencing Auditory Cue Extraction during Speech Perception: A Group Analysis of Auditory Classification Images. PLoS ONE, 10(3):e0118009. doi: 10.1371/journal.pone.0118009. (article)

Osses, A. & Varnet, L. (2021). Consonant-in-noise discrimination using an auditory model with different speech-based decision devices. DAGA proceedings (article)

Osses, A. & Varnet, L. (2024). A microscopic investigation of the effect of random envelope fluctuations on phoneme-in-noise perception. Journal of the Acoustical Society of America. (article)

Carranante, G., Cany, C., Farri, P., Giavazzi, M., & Varnet, L. (in press). Mapping the spectrotemporal regions influencing perception of French stop consonants in noise. (bioRxiv)

Varnet, L., Lorenzi, C. (2022). Probing temporal modulation detection in white noise using intrinsic envelope fluctuations: A reverse-correlation study. Journal of the Acoustical Society of America, 151(2), 1353-1366. (article)

French

Varnet, L. (2015). Identification des indices acoustiques utilisés dans la compréhension de la parole dégradée [Identification of the acoustic cues used in the comprehension of degraded speech]. Université Claude Bernard Lyon 1. (PhD thesis)

Varnet, L. Presentation at the Fête de la Science (2020, ENS Paris): « Observer l’esprit humain sans neuroimagerie » [Observing the human mind without neuroimaging] (video)

« Visualiser ce que nous écoutons pour comprendre les sons de parole » [Visualizing what we listen to in order to understand speech sounds] (video)

On this blog:
L’image de classification auditive, partie 1 : Le cerveau comme boîte noire [The auditory classification image, part 1: The brain as a black box]
L’Image de Classification Auditive, partie 2 : À la recherche des indices acoustiques de la parole [The Auditory Classification Image, part 2: In search of the acoustic cues of speech]
A visual compendium of auditory revcorr studies