
You searched for subject:(Audiovisual speech integration). Showing records 1 – 30 of 22,187 total matches.
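These matches are listed 30 per page; the total page count follows directly from the figures in the summary above. A minimal, purely illustrative Python check:

```python
import math

total_matches = 22187  # total matches reported in the search summary
per_page = 30          # records shown per results page

print(math.ceil(total_matches / per_page))  # 740 pages of results in all
```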



Washington University in St. Louis

1. Gaunt, Lauren. Investigating the Relationship Between Gaze Behavior and Audiovisual Benefit Across Various Speech-to-Noise Ratios.

Degree: MA(AM/MA), Psychology, 2019, Washington University in St. Louis

Speech perception improves when listeners are able to see as well as hear a talker, compared to listening alone. This phenomenon is commonly referred to… (more)

Subjects/Keywords: audiovisual integration, eyetracking, speech perception, audiovisual benefit; Cognitive Psychology


APA (6th Edition):

Gaunt, L. (2019). Investigating the Relationship Between Gaze Behavior and Audiovisual Benefit Across Various Speech-to-Noise Ratios. (Thesis). Washington University in St. Louis. Retrieved from https://openscholarship.wustl.edu/art_sci_etds/1975

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Gaunt, Lauren. “Investigating the Relationship Between Gaze Behavior and Audiovisual Benefit Across Various Speech-to-Noise Ratios.” 2019. Thesis, Washington University in St. Louis. Accessed December 04, 2020. https://openscholarship.wustl.edu/art_sci_etds/1975.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Gaunt, Lauren. “Investigating the Relationship Between Gaze Behavior and Audiovisual Benefit Across Various Speech-to-Noise Ratios.” 2019. Web. 04 Dec 2020.

Vancouver:

Gaunt L. Investigating the Relationship Between Gaze Behavior and Audiovisual Benefit Across Various Speech-to-Noise Ratios. [Internet] [Thesis]. Washington University in St. Louis; 2019. [cited 2020 Dec 04]. Available from: https://openscholarship.wustl.edu/art_sci_etds/1975.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Gaunt L. Investigating the Relationship Between Gaze Behavior and Audiovisual Benefit Across Various Speech-to-Noise Ratios. [Thesis]. Washington University in St. Louis; 2019. Available from: https://openscholarship.wustl.edu/art_sci_etds/1975

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
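For readers who want to assemble these citation strings programmatically, the APA and Vancouver forms above can be generated from a single record. The sketch below is a rough illustration under assumed field names; the `Record` type and its attributes are not the repository's actual data model or export format.

```python
from dataclasses import dataclass


@dataclass
class Record:
    # Hypothetical fields chosen for this sketch, not the site's schema.
    author_last: str
    author_initials: str  # e.g. "L." or "D. J."
    year: int
    title: str
    institution: str
    url: str


def apa_thesis(r: Record) -> str:
    # Approximates the "APA (6th Edition)" lines shown in these results.
    return (f"{r.author_last}, {r.author_initials} ({r.year}). {r.title}. "
            f"(Thesis). {r.institution}. Retrieved from {r.url}")


def vancouver_thesis(r: Record) -> str:
    # Approximates the "Vancouver" lines: surname plus unpunctuated initials.
    initials = r.author_initials.replace(".", "").replace(" ", "")
    return (f"{r.author_last} {initials}. {r.title}. [Internet] [Thesis]. "
            f"{r.institution}; {r.year}. Available from: {r.url}.")


gaunt = Record(
    "Gaunt", "L.", 2019,
    "Investigating the Relationship Between Gaze Behavior and Audiovisual "
    "Benefit Across Various Speech-to-Noise Ratios",
    "Washington University in St. Louis",
    "https://openscholarship.wustl.edu/art_sci_etds/1975",
)
print(apa_thesis(gaunt))
print(vancouver_thesis(gaunt))
```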


University of Toronto

2. Overton, Dawson James. Audiovisual Integration Deficits in Schizotypal Personality and Implications for Populations Diagnosed with Schizophrenia.

Degree: 2015, University of Toronto

Some evidence exists for audiovisual speech integration deficits in schizophrenia, but the generality of these deficits is unclear. We sought to characterize these deficits more… (more)

Subjects/Keywords: Audiovisual Integration; Multisensory Integration; Neuroanatomy of Audiovisual Binding; Schizophrenia; Schizotypal Personality; Speech Perception; 0633


APA (6th Edition):

Overton, D. J. (2015). Audiovisual Integration Deficits in Schizotypal Personality and Implications for Populations Diagnosed with Schizophrenia. (Masters Thesis). University of Toronto. Retrieved from http://hdl.handle.net/1807/70523

Chicago Manual of Style (16th Edition):

Overton, Dawson James. “Audiovisual Integration Deficits in Schizotypal Personality and Implications for Populations Diagnosed with Schizophrenia.” 2015. Masters Thesis, University of Toronto. Accessed December 04, 2020. http://hdl.handle.net/1807/70523.

MLA Handbook (7th Edition):

Overton, Dawson James. “Audiovisual Integration Deficits in Schizotypal Personality and Implications for Populations Diagnosed with Schizophrenia.” 2015. Web. 04 Dec 2020.

Vancouver:

Overton DJ. Audiovisual Integration Deficits in Schizotypal Personality and Implications for Populations Diagnosed with Schizophrenia. [Internet] [Masters thesis]. University of Toronto; 2015. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/1807/70523.

Council of Science Editors:

Overton DJ. Audiovisual Integration Deficits in Schizotypal Personality and Implications for Populations Diagnosed with Schizophrenia. [Masters Thesis]. University of Toronto; 2015. Available from: http://hdl.handle.net/1807/70523


Queens University

3. Buchan, Julie N. Cognitive resources in audiovisual speech perception.

Degree: Psychology, 2011, Queens University

 Most events that we encounter in everyday life provide our different senses with correlated information, and audiovisual speech perception is a familiar instance of multisensory… (more)

Subjects/Keywords: attention ; cognitive load ; audiovisual distractors ; audiovisual speech perception ; temporal integration ; perception ; multisensory integration ; selective attention


APA (6th Edition):

Buchan, J. N. (2011). Cognitive resources in audiovisual speech perception . (Thesis). Queens University. Retrieved from http://hdl.handle.net/1974/6835

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Buchan, Julie N. “Cognitive resources in audiovisual speech perception .” 2011. Thesis, Queens University. Accessed December 04, 2020. http://hdl.handle.net/1974/6835.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Buchan, Julie N. “Cognitive resources in audiovisual speech perception .” 2011. Web. 04 Dec 2020.

Vancouver:

Buchan JN. Cognitive resources in audiovisual speech perception . [Internet] [Thesis]. Queens University; 2011. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/1974/6835.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Buchan JN. Cognitive resources in audiovisual speech perception . [Thesis]. Queens University; 2011. Available from: http://hdl.handle.net/1974/6835

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Queens University

4. Nahanni, Celina. Sources and Correlates of Performance Enhancement in Audiovisual Speech Perception.

Degree: Neuroscience Studies, 2014, Queens University

 In a noisy environment, speech intelligibility is greatly enhanced by seeing the speaker’s face. This enhancement results from the integration of auditory and visual signals,… (more)

Subjects/Keywords: Speech-in-noise ; Confusions ; McGurk illusion ; Audiovisual speech ; Audiovisual integration ; Integration enhancement ; Word identification ; Open-set


APA (6th Edition):

Nahanni, C. (2014). Sources and Correlates of Performance Enhancement in Audiovisual Speech Perception . (Thesis). Queens University. Retrieved from http://hdl.handle.net/1974/12299

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Nahanni, Celina. “Sources and Correlates of Performance Enhancement in Audiovisual Speech Perception .” 2014. Thesis, Queens University. Accessed December 04, 2020. http://hdl.handle.net/1974/12299.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Nahanni, Celina. “Sources and Correlates of Performance Enhancement in Audiovisual Speech Perception .” 2014. Web. 04 Dec 2020.

Vancouver:

Nahanni C. Sources and Correlates of Performance Enhancement in Audiovisual Speech Perception . [Internet] [Thesis]. Queens University; 2014. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/1974/12299.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Nahanni C. Sources and Correlates of Performance Enhancement in Audiovisual Speech Perception . [Thesis]. Queens University; 2014. Available from: http://hdl.handle.net/1974/12299

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Texas Medical Center

5. Sertel, Muge O. Investigating the Neural Basis of Audiovisual Speech Perception with Intracranial Recordings in Humans.

Degree: PhD, 2017, Texas Medical Center

Speech is inherently multisensory, containing auditory information from the voice and visual information from the mouth movements of the talker. Hearing the voice is… (more)

Subjects/Keywords: Intracranial EEG; Electrocorticography; Speech Perception; Audiovisual Integration; Cognitive Neuroscience; Systems Neuroscience


APA (6th Edition):

Sertel, M. O. (2017). Investigating the Neural Basis of Audiovisual Speech Perception with Intracranial Recordings in Humans. (Doctoral Dissertation). Texas Medical Center. Retrieved from https://digitalcommons.library.tmc.edu/utgsbs_dissertations/797

Chicago Manual of Style (16th Edition):

Sertel, Muge O. “Investigating the Neural Basis of Audiovisual Speech Perception with Intracranial Recordings in Humans.” 2017. Doctoral Dissertation, Texas Medical Center. Accessed December 04, 2020. https://digitalcommons.library.tmc.edu/utgsbs_dissertations/797.

MLA Handbook (7th Edition):

Sertel, Muge O. “Investigating the Neural Basis of Audiovisual Speech Perception with Intracranial Recordings in Humans.” 2017. Web. 04 Dec 2020.

Vancouver:

Sertel MO. Investigating the Neural Basis of Audiovisual Speech Perception with Intracranial Recordings in Humans. [Internet] [Doctoral dissertation]. Texas Medical Center; 2017. [cited 2020 Dec 04]. Available from: https://digitalcommons.library.tmc.edu/utgsbs_dissertations/797.

Council of Science Editors:

Sertel MO. Investigating the Neural Basis of Audiovisual Speech Perception with Intracranial Recordings in Humans. [Doctoral Dissertation]. Texas Medical Center; 2017. Available from: https://digitalcommons.library.tmc.edu/utgsbs_dissertations/797


University of Maryland

6. Jenkins III, Julian. MEG, PSYCHOPHYSICAL AND COMPUTATIONAL STUDIES OF LOUDNESS, TIMBRE, AND AUDIOVISUAL INTEGRATION.

Degree: Biology, 2011, University of Maryland

 Natural scenes and ecological signals are inherently complex and understanding of their perception and processing is incomplete. For example, a speech signal contains not only… (more)

Subjects/Keywords: Neurosciences; Audiovisual Integration; Auditory Cognition; MEG; Psychophysics; Speech; Vowels


APA (6th Edition):

Jenkins III, J. (2011). MEG, PSYCHOPHYSICAL AND COMPUTATIONAL STUDIES OF LOUDNESS, TIMBRE, AND AUDIOVISUAL INTEGRATION. (Thesis). University of Maryland. Retrieved from http://hdl.handle.net/1903/12084

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Jenkins III, Julian. “MEG, PSYCHOPHYSICAL AND COMPUTATIONAL STUDIES OF LOUDNESS, TIMBRE, AND AUDIOVISUAL INTEGRATION.” 2011. Thesis, University of Maryland. Accessed December 04, 2020. http://hdl.handle.net/1903/12084.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Jenkins III, Julian. “MEG, PSYCHOPHYSICAL AND COMPUTATIONAL STUDIES OF LOUDNESS, TIMBRE, AND AUDIOVISUAL INTEGRATION.” 2011. Web. 04 Dec 2020.

Vancouver:

Jenkins III J. MEG, PSYCHOPHYSICAL AND COMPUTATIONAL STUDIES OF LOUDNESS, TIMBRE, AND AUDIOVISUAL INTEGRATION. [Internet] [Thesis]. University of Maryland; 2011. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/1903/12084.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Jenkins III J. MEG, PSYCHOPHYSICAL AND COMPUTATIONAL STUDIES OF LOUDNESS, TIMBRE, AND AUDIOVISUAL INTEGRATION. [Thesis]. University of Maryland; 2011. Available from: http://hdl.handle.net/1903/12084

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of California – Irvine

7. Venezia, Jonathan Henry. Psychophysical and neurophysiological investigations from three approaches to understanding human speech processing.

Degree: Psychology, 2014, University of California – Irvine

 Human speech processing (perception and in some cases production) is approached from three levels. At the top level, I investigate the role of the motor… (more)

Subjects/Keywords: Cognitive psychology; Neurosciences; Psychology; Audiovisual Speech; Auditory Field Maps; Sensorimotor Integration; Signal Detection; Speech Perception; Speech Production


APA (6th Edition):

Venezia, J. H. (2014). Psychophysical and neurophysiological investigations from three approaches to understanding human speech processing. (Thesis). University of California – Irvine. Retrieved from http://www.escholarship.org/uc/item/4st223zk

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Venezia, Jonathan Henry. “Psychophysical and neurophysiological investigations from three approaches to understanding human speech processing.” 2014. Thesis, University of California – Irvine. Accessed December 04, 2020. http://www.escholarship.org/uc/item/4st223zk.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Venezia, Jonathan Henry. “Psychophysical and neurophysiological investigations from three approaches to understanding human speech processing.” 2014. Web. 04 Dec 2020.

Vancouver:

Venezia JH. Psychophysical and neurophysiological investigations from three approaches to understanding human speech processing. [Internet] [Thesis]. University of California – Irvine; 2014. [cited 2020 Dec 04]. Available from: http://www.escholarship.org/uc/item/4st223zk.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Venezia JH. Psychophysical and neurophysiological investigations from three approaches to understanding human speech processing. [Thesis]. University of California – Irvine; 2014. Available from: http://www.escholarship.org/uc/item/4st223zk

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Universitat Pompeu Fabra

8. Sánchez García, Carolina, 1984-. Cross-modal predictive mechanisms during speech perception.

Degree: Departament de Ciències Experimentals i de la Salut, 2013, Universitat Pompeu Fabra

 The present dissertation addresses the predictive mechanisms operating online during audiovisual speech perception. The idea that prediction mechanisms operate during the perception of speech at… (more)

Subjects/Keywords: Habla audiovisual; Predicción; Percepción del habla; Integración multisensorial; Predicción fonológica; Audiovisual speech; Predictive coding; Speech perception; Multisensory integration; Event-related potentials; Phonology based-prediction; 81


APA (6th Edition):

Sánchez García, C. (2013). Cross-modal predictive mechanisms during speech perception. (Thesis). Universitat Pompeu Fabra. Retrieved from http://hdl.handle.net/10803/293266

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Sánchez García, Carolina, 1984-. “Cross-modal predictive mechanisms during speech perception.” 2013. Thesis, Universitat Pompeu Fabra. Accessed December 04, 2020. http://hdl.handle.net/10803/293266.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Sánchez García, Carolina, 1984-. “Cross-modal predictive mechanisms during speech perception.” 2013. Web. 04 Dec 2020.

Vancouver:

Sánchez García C. Cross-modal predictive mechanisms during speech perception. [Internet] [Thesis]. Universitat Pompeu Fabra; 2013. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/10803/293266.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Sánchez García C. Cross-modal predictive mechanisms during speech perception. [Thesis]. Universitat Pompeu Fabra; 2013. Available from: http://hdl.handle.net/10803/293266

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

9. Deonarine, Justin. Noise reduction limits the McGurk Effect.

Degree: 2011, University of Waterloo

 In the McGurk Effect (McGurk & MacDonald, 1976), a visual depiction of a speaker silently mouthing the syllable [ga]/[ka] is presented concurrently with the auditory… (more)

Subjects/Keywords: speech perception; psycholinguistics; audiovisual integration; McGurk Effect



APA (6th Edition):

Deonarine, J. (2011). Noise reduction limits the McGurk Effect. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/6046

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Deonarine, Justin. “Noise reduction limits the McGurk Effect.” 2011. Thesis, University of Waterloo. Accessed December 04, 2020. http://hdl.handle.net/10012/6046.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Deonarine, Justin. “Noise reduction limits the McGurk Effect.” 2011. Web. 04 Dec 2020.

Vancouver:

Deonarine J. Noise reduction limits the McGurk Effect. [Internet] [Thesis]. University of Waterloo; 2011. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/10012/6046.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Deonarine J. Noise reduction limits the McGurk Effect. [Thesis]. University of Waterloo; 2011. Available from: http://hdl.handle.net/10012/6046

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

10. Reetzke, Rachel Denise. Developmental and cultural factors of audiovisual speech perception in noise.

Degree: MA, Communication Sciences and Disorders, 2014, University of Texas – Austin

 The aim of this project is two-fold: 1) to investigate developmental differences in intelligibility gains from visual cues in speech perception-in-noise, and 2) to examine… (more)

Subjects/Keywords: Audiovisual Integration; Speech perception-in-noise



APA (6th Edition):

Reetzke, R. D. (2014). Developmental and cultural factors of audiovisual speech perception in noise. (Masters Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/25904

Chicago Manual of Style (16th Edition):

Reetzke, Rachel Denise. “Developmental and cultural factors of audiovisual speech perception in noise.” 2014. Masters Thesis, University of Texas – Austin. Accessed December 04, 2020. http://hdl.handle.net/2152/25904.

MLA Handbook (7th Edition):

Reetzke, Rachel Denise. “Developmental and cultural factors of audiovisual speech perception in noise.” 2014. Web. 04 Dec 2020.

Vancouver:

Reetzke RD. Developmental and cultural factors of audiovisual speech perception in noise. [Internet] [Masters thesis]. University of Texas – Austin; 2014. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/2152/25904.

Council of Science Editors:

Reetzke RD. Developmental and cultural factors of audiovisual speech perception in noise. [Masters Thesis]. University of Texas – Austin; 2014. Available from: http://hdl.handle.net/2152/25904


Rice University

11. Basu Mallick, Debshila. Factors Affecting Audiovisual Speech Perception as Measured by the McGurk Effect.

Degree: PhD, Social Sciences, 2016, Rice University

 Multisensory speech perception occurs when an individual integrates spoken sounds and mouth movements of a talker into a coherent percept, e.g., during face-to-face conversations. Under… (more)

Subjects/Keywords: Audiovisual; speech perception; McGurk effect


APA (6th Edition):

Basu Mallick, D. (2016). Factors Affecting Audiovisual Speech Perception as Measured by the McGurk Effect. (Doctoral Dissertation). Rice University. Retrieved from http://hdl.handle.net/1911/96262

Chicago Manual of Style (16th Edition):

Basu Mallick, Debshila. “Factors Affecting Audiovisual Speech Perception as Measured by the McGurk Effect.” 2016. Doctoral Dissertation, Rice University. Accessed December 04, 2020. http://hdl.handle.net/1911/96262.

MLA Handbook (7th Edition):

Basu Mallick, Debshila. “Factors Affecting Audiovisual Speech Perception as Measured by the McGurk Effect.” 2016. Web. 04 Dec 2020.

Vancouver:

Basu Mallick D. Factors Affecting Audiovisual Speech Perception as Measured by the McGurk Effect. [Internet] [Doctoral dissertation]. Rice University; 2016. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/1911/96262.

Council of Science Editors:

Basu Mallick D. Factors Affecting Audiovisual Speech Perception as Measured by the McGurk Effect. [Doctoral Dissertation]. Rice University; 2016. Available from: http://hdl.handle.net/1911/96262


McMaster University

12. Wong, Nadia P. HEAR THIS, READ THAT; AUDIOVISUAL INTEGRATION EFFECTS ON RECOGNITION MEMORY.

Degree: MSc, 2014, McMaster University

Our experience with the world depends on how we integrate sensory information. Multisensory integration generates contextually rich experiences, which are more distinct and more easily… (more)

Subjects/Keywords: Multisensory Integration; Audiovisual; Recognition Memory


APA (6th Edition):

Wong, N. P. (2014). HEAR THIS, READ THAT; AUDIOVISUAL INTEGRATION EFFECTS ON RECOGNITION MEMORY. (Masters Thesis). McMaster University. Retrieved from http://hdl.handle.net/11375/16513

Chicago Manual of Style (16th Edition):

Wong, Nadia P. “HEAR THIS, READ THAT; AUDIOVISUAL INTEGRATION EFFECTS ON RECOGNITION MEMORY.” 2014. Masters Thesis, McMaster University. Accessed December 04, 2020. http://hdl.handle.net/11375/16513.

MLA Handbook (7th Edition):

Wong, Nadia P. “HEAR THIS, READ THAT; AUDIOVISUAL INTEGRATION EFFECTS ON RECOGNITION MEMORY.” 2014. Web. 04 Dec 2020.

Vancouver:

Wong NP. HEAR THIS, READ THAT; AUDIOVISUAL INTEGRATION EFFECTS ON RECOGNITION MEMORY. [Internet] [Masters thesis]. McMaster University; 2014. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/11375/16513.

Council of Science Editors:

Wong NP. HEAR THIS, READ THAT; AUDIOVISUAL INTEGRATION EFFECTS ON RECOGNITION MEMORY. [Masters Thesis]. McMaster University; 2014. Available from: http://hdl.handle.net/11375/16513


University of Manchester

13. Banks, Briony. Perceptual Plasticity in Adverse Listening Conditions: Factors Affecting Adaptation to Accented and Noise-Vocoded Speech.

Degree: 2016, University of Manchester

 Adverse listening conditions can be a hindrance to communication, but humans are remarkably adept at overcoming them. Research has begun to uncover the cognitive and… (more)

Subjects/Keywords: Speech perception; Cognition; Audiovisual speech; Eye-tracking


APA (6th Edition):

Banks, B. (2016). Perceptual Plasticity in Adverse Listening Conditions: Factors Affecting Adaptation to Accented and Noise-Vocoded Speech. (Doctoral Dissertation). University of Manchester. Retrieved from http://www.manchester.ac.uk/escholar/uk-ac-man-scw:297082

Chicago Manual of Style (16th Edition):

Banks, Briony. “Perceptual Plasticity in Adverse Listening Conditions: Factors Affecting Adaptation to Accented and Noise-Vocoded Speech.” 2016. Doctoral Dissertation, University of Manchester. Accessed December 04, 2020. http://www.manchester.ac.uk/escholar/uk-ac-man-scw:297082.

MLA Handbook (7th Edition):

Banks, Briony. “Perceptual Plasticity in Adverse Listening Conditions: Factors Affecting Adaptation to Accented and Noise-Vocoded Speech.” 2016. Web. 04 Dec 2020.

Vancouver:

Banks B. Perceptual Plasticity in Adverse Listening Conditions: Factors Affecting Adaptation to Accented and Noise-Vocoded Speech. [Internet] [Doctoral dissertation]. University of Manchester; 2016. [cited 2020 Dec 04]. Available from: http://www.manchester.ac.uk/escholar/uk-ac-man-scw:297082.

Council of Science Editors:

Banks B. Perceptual Plasticity in Adverse Listening Conditions: Factors Affecting Adaptation to Accented and Noise-Vocoded Speech. [Doctoral Dissertation]. University of Manchester; 2016. Available from: http://www.manchester.ac.uk/escholar/uk-ac-man-scw:297082


University of Toronto

14. Richards, Michael David. Audiovisual Processing and Integration in Amblyopia.

Degree: PhD, 2018, University of Toronto

 Amblyopia is a developmental visual disorder caused by abnormal visual experience during early life. Accumulating evidence points to perceptual deficits in amblyopia beyond vision, in… (more)

Subjects/Keywords: Amblyopia; Audiovisual integration; Audiovisual processing; Multisensory integration; Multisensory processing; Psychophysics; 0317


APA (6th Edition):

Richards, M. D. (2018). Audiovisual Processing and Integration in Amblyopia. (Doctoral Dissertation). University of Toronto. Retrieved from http://hdl.handle.net/1807/82956

Chicago Manual of Style (16th Edition):

Richards, Michael David. “Audiovisual Processing and Integration in Amblyopia.” 2018. Doctoral Dissertation, University of Toronto. Accessed December 04, 2020. http://hdl.handle.net/1807/82956.

MLA Handbook (7th Edition):

Richards, Michael David. “Audiovisual Processing and Integration in Amblyopia.” 2018. Web. 04 Dec 2020.

Vancouver:

Richards MD. Audiovisual Processing and Integration in Amblyopia. [Internet] [Doctoral dissertation]. University of Toronto; 2018. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/1807/82956.

Council of Science Editors:

Richards MD. Audiovisual Processing and Integration in Amblyopia. [Doctoral Dissertation]. University of Toronto; 2018. Available from: http://hdl.handle.net/1807/82956

15. Shatzer, Hannah Elizabeth. Visual and Temporal Influences on Multimodal Speech Integration.

Degree: MA, Psychology, 2015, The Ohio State University

 Auditory and visual speech cues are often used in tandem to maximize understanding of a speech signal when communicating. A neural model by Bhat et… (more)

Subjects/Keywords: Psychology; audiovisual speech; integration; visual informativeness; duration; onset asynchrony



APA (6th Edition):

Shatzer, H. E. (2015). Visual and Temporal Influences on Multimodal Speech Integration. (Masters Thesis). The Ohio State University. Retrieved from http://rave.ohiolink.edu/etdc/view?acc_num=osu1437403560

Chicago Manual of Style (16th Edition):

Shatzer, Hannah Elizabeth. “Visual and Temporal Influences on Multimodal Speech Integration.” 2015. Masters Thesis, The Ohio State University. Accessed December 04, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=osu1437403560.

MLA Handbook (7th Edition):

Shatzer, Hannah Elizabeth. “Visual and Temporal Influences on Multimodal Speech Integration.” 2015. Web. 04 Dec 2020.

Vancouver:

Shatzer HE. Visual and Temporal Influences on Multimodal Speech Integration. [Internet] [Masters thesis]. The Ohio State University; 2015. [cited 2020 Dec 04]. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=osu1437403560.

Council of Science Editors:

Shatzer HE. Visual and Temporal Influences on Multimodal Speech Integration. [Masters Thesis]. The Ohio State University; 2015. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=osu1437403560


Vanderbilt University

16. Nidiffer, Aaron Ross. Temporal Correlation and Its Role in Multisensory Feature Integration and Binding.

Degree: PhD, Hearing and Speech Sciences, 2018, Vanderbilt University

 Our successful interaction with the environment requires the brain to appropriately combine and segregate sensory information coming from various events. These events frequently produce energy… (more)

Subjects/Keywords: binding; behavior; audiovisual; integration; proximity; similarity


APA (6th Edition):

Nidiffer, A. R. (2018). Temporal Correlation and Its Role in Multisensory Feature Integration and Binding. (Doctoral Dissertation). Vanderbilt University. Retrieved from http://hdl.handle.net/1803/13916

Chicago Manual of Style (16th Edition):

Nidiffer, Aaron Ross. “Temporal Correlation and Its Role in Multisensory Feature Integration and Binding.” 2018. Doctoral Dissertation, Vanderbilt University. Accessed December 04, 2020. http://hdl.handle.net/1803/13916.

MLA Handbook (7th Edition):

Nidiffer, Aaron Ross. “Temporal Correlation and Its Role in Multisensory Feature Integration and Binding.” 2018. Web. 04 Dec 2020.

Vancouver:

Nidiffer AR. Temporal Correlation and Its Role in Multisensory Feature Integration and Binding. [Internet] [Doctoral dissertation]. Vanderbilt University; 2018. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/1803/13916.

Council of Science Editors:

Nidiffer AR. Temporal Correlation and Its Role in Multisensory Feature Integration and Binding. [Doctoral Dissertation]. Vanderbilt University; 2018. Available from: http://hdl.handle.net/1803/13916


McMaster University

17. Chuen, Lorraine. Evaluating the influence of audiovisual unity in cross-modal temporal binding of musical stimuli.

Degree: MSc, 2015, McMaster University

An observer’s inference that multimodal signals come from a common underlying source can facilitate cross-modal binding in the temporal domain. This ‘unity assumption’ can cause… (more)

Subjects/Keywords: temporal perception; audiovisual integration; unity assumption


APA (6th Edition):

Chuen, L. (2015). Evaluating the influence of audiovisual unity in cross-modal temporal binding of musical stimuli. (Masters Thesis). McMaster University. Retrieved from http://hdl.handle.net/11375/18054

Chicago Manual of Style (16th Edition):

Chuen, Lorraine. “Evaluating the influence of audiovisual unity in cross-modal temporal binding of musical stimuli.” 2015. Masters Thesis, McMaster University. Accessed December 04, 2020. http://hdl.handle.net/11375/18054.

MLA Handbook (7th Edition):

Chuen, Lorraine. “Evaluating the influence of audiovisual unity in cross-modal temporal binding of musical stimuli.” 2015. Web. 04 Dec 2020.

Vancouver:

Chuen L. Evaluating the influence of audiovisual unity in cross-modal temporal binding of musical stimuli. [Internet] [Masters thesis]. McMaster University; 2015. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/11375/18054.

Council of Science Editors:

Chuen L. Evaluating the influence of audiovisual unity in cross-modal temporal binding of musical stimuli. [Masters Thesis]. McMaster University; 2015. Available from: http://hdl.handle.net/11375/18054


University of Toronto

18. Loria, Tristan. The Influence of Action-based Attention on Audiovisual Integration.

Degree: PhD, 2020, University of Toronto

 We inhabit a world that offers a multitude of sensory cues that need to be disambiguated in order to perceive and interact with our surrounding… (more)

Subjects/Keywords: attention; audiovisual; multisensory integration; upper-limb; 0575


APA (6th Edition):

Loria, T. (2020). The Influence of Action-based Attention on Audiovisual Integration. (Doctoral Dissertation). University of Toronto. Retrieved from http://hdl.handle.net/1807/100972

Chicago Manual of Style (16th Edition):

Loria, Tristan. “The Influence of Action-based Attention on Audiovisual Integration.” 2020. Doctoral Dissertation, University of Toronto. Accessed December 04, 2020. http://hdl.handle.net/1807/100972.

MLA Handbook (7th Edition):

Loria, Tristan. “The Influence of Action-based Attention on Audiovisual Integration.” 2020. Web. 04 Dec 2020.

Vancouver:

Loria T. The Influence of Action-based Attention on Audiovisual Integration. [Internet] [Doctoral dissertation]. University of Toronto; 2020. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/1807/100972.

Council of Science Editors:

Loria T. The Influence of Action-based Attention on Audiovisual Integration. [Doctoral Dissertation]. University of Toronto; 2020. Available from: http://hdl.handle.net/1807/100972


York University

19. Ferland, Melissa. Audiovisual Integration in Adults: Using a Dynamic Task to Measure Differences in Temporal Binding Windows Across Stimuli.

Degree: MA, Psychology (Functional Area: Clinical-Developmental), 2018, York University

 Being able to integrate information from multiple sensory modalities, such as hearing and sight, is essential for everyday functioning. The temporal binding window (TBW) refers… (more)

Subjects/Keywords: Psychology; Audiovisual integration; Temporal binding window; Methodologies


APA (6th Edition):

Ferland, M. (2018). Audiovisual Integration in Adults: Using a Dynamic Task to Measure Differences in Temporal Binding Windows Across Stimuli. (Masters Thesis). York University. Retrieved from http://hdl.handle.net/10315/34476

Chicago Manual of Style (16th Edition):

Ferland, Melissa. “Audiovisual Integration in Adults: Using a Dynamic Task to Measure Differences in Temporal Binding Windows Across Stimuli.” 2018. Masters Thesis, York University. Accessed December 04, 2020. http://hdl.handle.net/10315/34476.

MLA Handbook (7th Edition):

Ferland, Melissa. “Audiovisual Integration in Adults: Using a Dynamic Task to Measure Differences in Temporal Binding Windows Across Stimuli.” 2018. Web. 04 Dec 2020.

Vancouver:

Ferland M. Audiovisual Integration in Adults: Using a Dynamic Task to Measure Differences in Temporal Binding Windows Across Stimuli. [Internet] [Masters thesis]. York University; 2018. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/10315/34476.

Council of Science Editors:

Ferland M. Audiovisual Integration in Adults: Using a Dynamic Task to Measure Differences in Temporal Binding Windows Across Stimuli. [Masters Thesis]. York University; 2018. Available from: http://hdl.handle.net/10315/34476


University of Ontario Institute of Technology

20. McCracken, Heather. Audiovisual multisensory integration in young adults with and without a diagnosis of Attention-Deficit/Hyperactivity Disorder.

Degree: 2018, University of Ontario Institute of Technology

 Attention-Deficit/Hyperactivity Disorder (ADHD) is a neurodevelopmental disorder with behavioural and neurophysiological characteristics. Several cortical structures that are altered in ADHD are involved in the process… (more)

Subjects/Keywords: Multisensory integration; ADHD; EEG; Response time; Audiovisual


APA (6th Edition):

McCracken, H. (2018). Audiovisual multisensory integration in young adults with and without a diagnosis of Attention-Deficit/Hyperactivity Disorder. (Thesis). University of Ontario Institute of Technology. Retrieved from http://hdl.handle.net/10155/1058

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

McCracken, Heather. “Audiovisual multisensory integration in young adults with and without a diagnosis of Attention-Deficit/Hyperactivity Disorder.” 2018. Thesis, University of Ontario Institute of Technology. Accessed December 04, 2020. http://hdl.handle.net/10155/1058.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

McCracken, Heather. “Audiovisual multisensory integration in young adults with and without a diagnosis of Attention-Deficit/Hyperactivity Disorder.” 2018. Web. 04 Dec 2020.

Vancouver:

McCracken H. Audiovisual multisensory integration in young adults with and without a diagnosis of Attention-Deficit/Hyperactivity Disorder. [Internet] [Thesis]. University of Ontario Institute of Technology; 2018. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/10155/1058.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

McCracken H. Audiovisual multisensory integration in young adults with and without a diagnosis of Attention-Deficit/Hyperactivity Disorder. [Thesis]. University of Ontario Institute of Technology; 2018. Available from: http://hdl.handle.net/10155/1058

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Louisville

21. Wu, Jia. Speech perception and the McGurk effect : a cross cultural study using event-related potentials.

Degree: PhD, 2009, University of Louisville

  Previous research has indicated the important role of visual information in the speech perception process. These studies have elucidated the areas of the brain… (more)

Subjects/Keywords: Speech perception; Event-related potentials; ERP; N400; McGurk; Multi-sensory; Chinese; Audiovisual; McGurk effect; Audiovisual integration


APA (6th Edition):

Wu, J. (2009). Speech perception and the McGurk effect : a cross cultural study using event-related potentials. (Doctoral Dissertation). University of Louisville. Retrieved from 10.18297/etd/1597 ; https://ir.library.louisville.edu/etd/1597

Chicago Manual of Style (16th Edition):

Wu, Jia. “Speech perception and the McGurk effect : a cross cultural study using event-related potentials.” 2009. Doctoral Dissertation, University of Louisville. Accessed December 04, 2020. 10.18297/etd/1597 ; https://ir.library.louisville.edu/etd/1597.

MLA Handbook (7th Edition):

Wu, Jia. “Speech perception and the McGurk effect : a cross cultural study using event-related potentials.” 2009. Web. 04 Dec 2020.

Vancouver:

Wu J. Speech perception and the McGurk effect : a cross cultural study using event-related potentials. [Internet] [Doctoral dissertation]. University of Louisville; 2009. [cited 2020 Dec 04]. Available from: 10.18297/etd/1597 ; https://ir.library.louisville.edu/etd/1597.

Council of Science Editors:

Wu J. Speech perception and the McGurk effect : a cross cultural study using event-related potentials. [Doctoral Dissertation]. University of Louisville; 2009. Available from: 10.18297/etd/1597 ; https://ir.library.louisville.edu/etd/1597
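The Wu (2009) record above lists a bare DOI (10.18297/etd/1597) alongside the repository URL. If a clickable link is needed, the standard doi.org resolver can be prefixed to the DOI, for example:

```python
doi = "10.18297/etd/1597"  # DOI given in the Wu (2009) record above
resolver_url = f"https://doi.org/{doi}"
print(resolver_url)  # https://doi.org/10.18297/etd/1597
```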


University of Toronto

22. Yi, Astrid. Gaze Strategies and Audiovisual Speech Enhancement.

Degree: 2010, University of Toronto

Quantitative relationships were established between speech intelligibility and gaze patterns when subjects listened to sentences spoken by a single talker at different auditory SNRs while… (more)

Subjects/Keywords: audiovisual speech enhancement; gaze strategies; speech intelligibility; 0541


APA (6th Edition):

Yi, A. (2010). Gaze Strategies and Audiovisual Speech Enhancement. (Masters Thesis). University of Toronto. Retrieved from http://hdl.handle.net/1807/25527

Chicago Manual of Style (16th Edition):

Yi, Astrid. “Gaze Strategies and Audiovisual Speech Enhancement.” 2010. Masters Thesis, University of Toronto. Accessed December 04, 2020. http://hdl.handle.net/1807/25527.

MLA Handbook (7th Edition):

Yi, Astrid. “Gaze Strategies and Audiovisual Speech Enhancement.” 2010. Web. 04 Dec 2020.

Vancouver:

Yi A. Gaze Strategies and Audiovisual Speech Enhancement. [Internet] [Masters thesis]. University of Toronto; 2010. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/1807/25527.

Council of Science Editors:

Yi A. Gaze Strategies and Audiovisual Speech Enhancement. [Masters Thesis]. University of Toronto; 2010. Available from: http://hdl.handle.net/1807/25527

23. Banks, Briony. Perceptual plasticity in adverse listening conditions : factors affecting adaptation to accented and noise-vocoded speech.

Degree: PhD, 2016, University of Manchester

 Adverse listening conditions can be a hindrance to communication, but humans are remarkably adept at overcoming them. Research has begun to uncover the cognitive and… (more)

Subjects/Keywords: 401; Speech perception; Cognition; Audiovisual speech; Eye-tracking


APA (6th Edition):

Banks, B. (2016). Perceptual plasticity in adverse listening conditions : factors affecting adaptation to accented and noise-vocoded speech. (Doctoral Dissertation). University of Manchester. Retrieved from https://www.research.manchester.ac.uk/portal/en/theses/perceptual-plasticity-in-adverse-listening-conditions-factors-affecting-adaptation-to-accented-and-noisevocoded-speech(c5227984-13b8-4e33-9233-5e1715cf8516).html ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.684765

Chicago Manual of Style (16th Edition):

Banks, Briony. “Perceptual plasticity in adverse listening conditions : factors affecting adaptation to accented and noise-vocoded speech.” 2016. Doctoral Dissertation, University of Manchester. Accessed December 04, 2020. https://www.research.manchester.ac.uk/portal/en/theses/perceptual-plasticity-in-adverse-listening-conditions-factors-affecting-adaptation-to-accented-and-noisevocoded-speech(c5227984-13b8-4e33-9233-5e1715cf8516).html ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.684765.

MLA Handbook (7th Edition):

Banks, Briony. “Perceptual plasticity in adverse listening conditions : factors affecting adaptation to accented and noise-vocoded speech.” 2016. Web. 04 Dec 2020.

Vancouver:

Banks B. Perceptual plasticity in adverse listening conditions : factors affecting adaptation to accented and noise-vocoded speech. [Internet] [Doctoral dissertation]. University of Manchester; 2016. [cited 2020 Dec 04]. Available from: https://www.research.manchester.ac.uk/portal/en/theses/perceptual-plasticity-in-adverse-listening-conditions-factors-affecting-adaptation-to-accented-and-noisevocoded-speech(c5227984-13b8-4e33-9233-5e1715cf8516).html ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.684765.

Council of Science Editors:

Banks B. Perceptual plasticity in adverse listening conditions : factors affecting adaptation to accented and noise-vocoded speech. [Doctoral Dissertation]. University of Manchester; 2016. Available from: https://www.research.manchester.ac.uk/portal/en/theses/perceptual-plasticity-in-adverse-listening-conditions-factors-affecting-adaptation-to-accented-and-noisevocoded-speech(c5227984-13b8-4e33-9233-5e1715cf8516).html ; http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.684765


University of California – Riverside

24. Dias, James William. Crossmodal Influences in Selective Speech Adaptation.

Degree: Psychology, 2016, University of California – Riverside

 Repeated presentation of speech syllables can change identification of ambiguous syllables, a perceptual aftereffect known as selective speech adaptation (e.g., Eimas & Corbit, 1973). Adaptation… (more)

Subjects/Keywords: Psychology; Cognitive psychology; audiovisual speech; lexical; multisensory perception; phonetic recalibration; selective speech adaptation; speech perception


APA (6th Edition):

Dias, J. W. (2016). Crossmodal Influences in Selective Speech Adaptation. (Thesis). University of California – Riverside. Retrieved from http://www.escholarship.org/uc/item/5sd725cp

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Dias, James William. “Crossmodal Influences in Selective Speech Adaptation.” 2016. Thesis, University of California – Riverside. Accessed December 04, 2020. http://www.escholarship.org/uc/item/5sd725cp.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Dias, James William. “Crossmodal Influences in Selective Speech Adaptation.” 2016. Web. 04 Dec 2020.

Vancouver:

Dias JW. Crossmodal Influences in Selective Speech Adaptation. [Internet] [Thesis]. University of California – Riverside; 2016. [cited 2020 Dec 04]. Available from: http://www.escholarship.org/uc/item/5sd725cp.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Dias JW. Crossmodal Influences in Selective Speech Adaptation. [Thesis]. University of California – Riverside; 2016. Available from: http://www.escholarship.org/uc/item/5sd725cp

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Oregon

25. Shen, Chia-Ni Jennie. Effectiveness of Audiovisual Training on Non-Native English Speech Production and Perception.

Degree: 2019, University of Oregon

 This Project examines the effectiveness of audio-visual training on non-native English speech perception. Previous research utilizing audio-visual training has been employed in the field of… (more)

Subjects/Keywords: Psychology; Speech Learning; Speech Perception; Audiovisual Training; Non-Native Sounds; Speech Production


APA (6th Edition):

Shen, C. J. (2019). Effectiveness of Audiovisual Training on Non-Native English Speech Production and Perception. (Thesis). University of Oregon. Retrieved from https://scholarsbank.uoregon.edu/xmlui/handle/1794/25058

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Shen, Chia-Ni Jennie. “Effectiveness of Audiovisual Training on Non-Native English Speech Production and Perception.” 2019. Thesis, University of Oregon. Accessed December 04, 2020. https://scholarsbank.uoregon.edu/xmlui/handle/1794/25058.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Shen, Chia-Ni Jennie. “Effectiveness of Audiovisual Training on Non-Native English Speech Production and Perception.” 2019. Web. 04 Dec 2020.

Vancouver:

Shen CJ. Effectiveness of Audiovisual Training on Non-Native English Speech Production and Perception. [Internet] [Thesis]. University of Oregon; 2019. [cited 2020 Dec 04]. Available from: https://scholarsbank.uoregon.edu/xmlui/handle/1794/25058.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Shen CJ. Effectiveness of Audiovisual Training on Non-Native English Speech Production and Perception. [Thesis]. University of Oregon; 2019. Available from: https://scholarsbank.uoregon.edu/xmlui/handle/1794/25058

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

26. Basu Mallick, Debshila. An investigation of audiovisual speech perception using the McGurk effect.

Degree: MA, Social Sciences, 2014, Rice University

 Integrating information from the auditory and visual modalities is vital for speech perception. In this thesis, I describe two studies of audiovisual speech perception that… (more)

Subjects/Keywords: Audiovisual; McGurk effect; Speech perception; Illusions; Multisensory integration; Systems; Cognitive neuroscience



APA (6th Edition):

Basu Mallick, D. (2014). An investigation of audiovisual speech perception using the McGurk effect. (Masters Thesis). Rice University. Retrieved from http://hdl.handle.net/1911/76346

Chicago Manual of Style (16th Edition):

Basu Mallick, Debshila. “An investigation of audiovisual speech perception using the McGurk effect.” 2014. Masters Thesis, Rice University. Accessed December 04, 2020. http://hdl.handle.net/1911/76346.

MLA Handbook (7th Edition):

Basu Mallick, Debshila. “An investigation of audiovisual speech perception using the McGurk effect.” 2014. Web. 04 Dec 2020.

Vancouver:

Basu Mallick D. An investigation of audiovisual speech perception using the McGurk effect. [Internet] [Masters thesis]. Rice University; 2014. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/1911/76346.

Council of Science Editors:

Basu Mallick D. An investigation of audiovisual speech perception using the McGurk effect. [Masters Thesis]. Rice University; 2014. Available from: http://hdl.handle.net/1911/76346

27. Yi, Han-Gyol. Audiovisual integration for perception of speech produced by nonnative speakers.

Degree: MA, Communication Sciences and Disorders, 2013, University of Texas – Austin

Speech often occurs in challenging listening environments, such as masking noise. Visual cues have been found to enhance speech intelligibility in noise. Although the facilitatory… (more)

Subjects/Keywords: Speech perception; Nonnative speech; Foreign-accented speech; Audiovisual integration; Speech perception in noise; Speech intelligibility; Visual cues; Visemes; IAT; Implicit association test; Sociophonetics


Record DetailsSimilar RecordsGoogle PlusoneFacebookTwitterCiteULikeMendeleyreddit

APA · Chicago · MLA · Vancouver · CSE | Export to Zotero / EndNote / Reference Manager

APA (6th Edition):

Yi, H. (2013). Audiovisual integration for perception of speech produced by nonnative speakers. (Masters Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/25876

Chicago Manual of Style (16th Edition):

Yi, Han-Gyol. “Audiovisual integration for perception of speech produced by nonnative speakers.” 2013. Masters Thesis, University of Texas – Austin. Accessed December 04, 2020. http://hdl.handle.net/2152/25876.

MLA Handbook (7th Edition):

Yi, Han-Gyol. “Audiovisual integration for perception of speech produced by nonnative speakers.” 2013. Web. 04 Dec 2020.

Vancouver:

Yi H. Audiovisual integration for perception of speech produced by nonnative speakers. [Internet] [Masters thesis]. University of Texas – Austin; 2013. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/2152/25876.

Council of Science Editors:

Yi H. Audiovisual integration for perception of speech produced by nonnative speakers. [Masters Thesis]. University of Texas – Austin; 2013. Available from: http://hdl.handle.net/2152/25876


University of Toronto

28. Narinesingh, Cindy. Examining Audiovisual Integration in Amblyopia using the Sound-induced Flash Illusion.

Degree: 2016, University of Toronto

Amblyopia is a neurodevelopmental disorder associated with reduced vision in one or both eyes, caused by abnormal visual experience during early childhood. The sound-induced flash… (more)

Subjects/Keywords: amblyopia; audiovisual integration; multisensory integration; sound-induced flash illusion; temporal binding window; visual deprivation; 0381

APA (6th Edition):

Narinesingh, C. (2016). Examining Audiovisual Integration in Amblyopia using the Sound-induced Flash Illusion. (Masters Thesis). University of Toronto. Retrieved from http://hdl.handle.net/1807/75831

Chicago Manual of Style (16th Edition):

Narinesingh, Cindy. “Examining Audiovisual Integration in Amblyopia using the Sound-induced Flash Illusion.” 2016. Masters Thesis, University of Toronto. Accessed December 04, 2020. http://hdl.handle.net/1807/75831.

MLA Handbook (7th Edition):

Narinesingh, Cindy. “Examining Audiovisual Integration in Amblyopia using the Sound-induced Flash Illusion.” 2016. Web. 04 Dec 2020.

Vancouver:

Narinesingh C. Examining Audiovisual Integration in Amblyopia using the Sound-induced Flash Illusion. [Internet] [Masters thesis]. University of Toronto; 2016. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/1807/75831.

Council of Science Editors:

Narinesingh C. Examining Audiovisual Integration in Amblyopia using the Sound-induced Flash Illusion. [Masters Thesis]. University of Toronto; 2016. Available from: http://hdl.handle.net/1807/75831

29. Andersen, Tobias S. Model-Based Assessment of Factors Influencing Categorical Audiovisual Perception.

Degree: 2005, Helsinki University of Technology

Information processing in the sensory modalities is not segregated but interacts strongly. The exact nature of this interaction is not known and might differ for… (more)

Subjects/Keywords: categorical audiovisual perception; speech perception; rapid flashes; beeps; mathematical modeling

APA (6th Edition):

Andersen, T. S. (2005). Model-Based Assessment of Factors Influencing Categorical Audiovisual Perception. (Thesis). Helsinki University of Technology. Retrieved from http://lib.tkk.fi/Diss/2005/isbn9512275481/

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Andersen, Tobias S. “Model-Based Assessment of Factors Influencing Categorical Audiovisual Perception.” 2005. Thesis, Helsinki University of Technology. Accessed December 04, 2020. http://lib.tkk.fi/Diss/2005/isbn9512275481/.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Andersen, Tobias S. “Model-Based Assessment of Factors Influencing Categorical Audiovisual Perception.” 2005. Web. 04 Dec 2020.

Vancouver:

Andersen TS. Model-Based Assessment of Factors Influencing Categorical Audiovisual Perception. [Internet] [Thesis]. Helsinki University of Technology; 2005. [cited 2020 Dec 04]. Available from: http://lib.tkk.fi/Diss/2005/isbn9512275481/.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Andersen TS. Model-Based Assessment of Factors Influencing Categorical Audiovisual Perception. [Thesis]. Helsinki University of Technology; 2005. Available from: http://lib.tkk.fi/Diss/2005/isbn9512275481/

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Vanderbilt University

30. Simon, David Michael. Electrophysiological Signatures of Multisensory Temporal Processing in the Human Brain.

Degree: PhD, Neuroscience, 2018, Vanderbilt University

 Dissertation under the direction of Professor Mark T. Wallace. Events in… (more)

Subjects/Keywords: Event Related Potentials; EEG; Speech; Audiovisual; Multisensory; Temporal

APA (6th Edition):

Simon, D. M. (2018). Electrophysiological Signatures of Multisensory Temporal Processing in the Human Brain. (Doctoral Dissertation). Vanderbilt University. Retrieved from http://hdl.handle.net/1803/10909

Chicago Manual of Style (16th Edition):

Simon, David Michael. “Electrophysiological Signatures of Multisensory Temporal Processing in the Human Brain.” 2018. Doctoral Dissertation, Vanderbilt University. Accessed December 04, 2020. http://hdl.handle.net/1803/10909.

MLA Handbook (7th Edition):

Simon, David Michael. “Electrophysiological Signatures of Multisensory Temporal Processing in the Human Brain.” 2018. Web. 04 Dec 2020.

Vancouver:

Simon DM. Electrophysiological Signatures of Multisensory Temporal Processing in the Human Brain. [Internet] [Doctoral dissertation]. Vanderbilt University; 2018. [cited 2020 Dec 04]. Available from: http://hdl.handle.net/1803/10909.

Council of Science Editors:

Simon DM. Electrophysiological Signatures of Multisensory Temporal Processing in the Human Brain. [Doctoral Dissertation]. Vanderbilt University; 2018. Available from: http://hdl.handle.net/1803/10909
