Kanona et al. (2022) Cochlear Implants International

Do patients with Meniere’s Disease need more intensive rehabilitation following cochlear implantation? Meniere’s Disease is known to produce fluctuations in hearing (and balance) over time. However, some people with Meniere’s Disease become completely deaf and are therefore given a cochlear implant. Small-scale studies have shown that these individuals may need greater assistance because the performance…

Douaud et al. (2022) Nature

Do milder cases of COVID-19 infection produce changes in the brain? Severe cases of COVID-19 are known to be associated with abnormal changes in the brain. Using the UK Biobank, we ask whether COVID-19 can produce brain abnormalities in milder cases where a hospital visit is not necessary. The UK Biobank is a large-scale, long-term…

Vasquez-Lopez et al. (2017) eLife

In the auditory cortex, do neighbouring neurons receive similar input from lower levels of processing? In the auditory cortex, neighbouring neurons tend to like sounds that are similar in frequency. We ask whether this is because neighbouring neurons receive similar input from lower levels of the brain (specifically the thalamus). If we zoom out and…

Nodal et al. (2010) J Neurosci Methods

Can ferrets wear earphones? Earphones make it possible to precisely control the sound heard by each ear. This allows us to simulate different forms of hearing loss and create virtual auditory environments. It also enables us to produce unnatural sounds that can create auditory illusions. All of these features make earphones an invaluable experimental tool…

Dahmen et al. (2010) Neuron

Do humans and neurons adapt to recent sound statistics? When people spend time listening to sounds that come from a very narrow range of locations (statisticians would call this a low-variance distribution), they become very sensitive to even small changes in sound location. On the other hand, if people spend time listening to sounds that…

King et al. (2011) Neurosci Biobehav Rev

How is sound localization affected by previous experience? Sound localization is an excellent model system for understanding the neural basis of auditory learning and adaptation. In particular, sounds heard previously affect auditory perception at multiple different timescales. This review article summarizes recent advances in our understanding of this topic. King AJ, Dahmen JC, Keating P, …

Keating et al. (2013) JARO

Can ferrets use different sound localization cues? Sound localization relies upon a variety of different cues. For example, when a sound is located on one side of the head, it will be louder in one ear than the other (known as an Interaural Level Difference). It will also arrive earlier at one ear than the…
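The timing cue mentioned above, the Interaural Time Difference (ITD), can be illustrated with a simple back-of-the-envelope calculation. The sketch below is not taken from the paper: it uses the classic Woodworth spherical-head approximation, and the head radius and speed of sound are assumed typical adult-human values (a ferret's head is much smaller, so its ITDs are correspondingly smaller).

```python
import math

def itd_seconds(azimuth_deg, head_radius_m=0.0875, speed_of_sound=343.0):
    """Woodworth approximation of the interaural time difference (in seconds)
    for a distant sound source at the given azimuth (0 = straight ahead).
    Default head radius and speed of sound are assumed adult-human values."""
    theta = math.radians(azimuth_deg)
    # Path-length difference between the two ears, divided by the speed of sound
    return (head_radius_m / speed_of_sound) * (theta + math.sin(theta))

# A source directly to one side (90 degrees) gives the largest ITD,
# under a millisecond for a human-sized head.
print(round(itd_seconds(90) * 1000, 2))  # ITD in milliseconds
```

Even at its maximum, the ITD is well under a millisecond, which is why the auditory system needs exquisitely precise timing mechanisms to use this cue at all.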

Keating et al. (2013) Curr Biol

What happens to sound localization if you experience hearing loss during development? Approximately 80% of children experience some form of hearing loss before the age of 2. This is commonly caused by ‘glue ear’, which often produces a hearing loss in one ear. Sound localization in the horizontal plane typically relies on differences between the…

Keating & King (2013) Front Syst Neuro

What are the developmental effects of a hearing loss in one ear? The developing brain has a remarkable capacity for rewiring and learning. This review paper outlines recent advances in our understanding of how the developing auditory brain is shaped by experience, considering the impact of asymmetric hearing loss on sound localization. Keating P, King…

Keating et al. (2014) EJN

Does experience affect which cues to sound location are used at different sound frequencies? Humans locate low-frequency sounds using differences in the timing of sound heard by each ear (a sound presented on one side of the head takes longer to reach the far ear). At high frequencies, humans rely more on differences in the…