
You searched for subject:(Audiovisual integration). Showing records 1 – 30 of 38 total matches.




McMaster University

1. Wong, Nadia P. HEAR THIS, READ THAT; AUDIOVISUAL INTEGRATION EFFECTS ON RECOGNITION MEMORY.

Degree: MSc, 2014, McMaster University

Our experience with the world depends on how we integrate sensory information. Multisensory integration generates contextually rich experiences, which are more distinct and more easily… (more)

Subjects/Keywords: Multisensory Integration; Audiovisual; Recognition Memory


APA (6th Edition):

Wong, N. P. (2014). HEAR THIS, READ THAT; AUDIOVISUAL INTEGRATION EFFECTS ON RECOGNITION MEMORY. (Masters Thesis). McMaster University. Retrieved from http://hdl.handle.net/11375/16513

Chicago Manual of Style (16th Edition):

Wong, Nadia P. “HEAR THIS, READ THAT; AUDIOVISUAL INTEGRATION EFFECTS ON RECOGNITION MEMORY.” 2014. Masters Thesis, McMaster University. Accessed September 29, 2020. http://hdl.handle.net/11375/16513.

MLA Handbook (7th Edition):

Wong, Nadia P. “HEAR THIS, READ THAT; AUDIOVISUAL INTEGRATION EFFECTS ON RECOGNITION MEMORY.” 2014. Web. 29 Sep 2020.

Vancouver:

Wong NP. HEAR THIS, READ THAT; AUDIOVISUAL INTEGRATION EFFECTS ON RECOGNITION MEMORY. [Internet] [Masters thesis]. McMaster University; 2014. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/11375/16513.

Council of Science Editors:

Wong NP. HEAR THIS, READ THAT; AUDIOVISUAL INTEGRATION EFFECTS ON RECOGNITION MEMORY. [Masters Thesis]. McMaster University; 2014. Available from: http://hdl.handle.net/11375/16513


University of Toronto

2. Richards, Michael David. Audiovisual Processing and Integration in Amblyopia.

Degree: PhD, 2018, University of Toronto

 Amblyopia is a developmental visual disorder caused by abnormal visual experience during early life. Accumulating evidence points to perceptual deficits in amblyopia beyond vision, in… (more)

Subjects/Keywords: Amblyopia; Audiovisual integration; Audiovisual processing; Multisensory integration; Multisensory processing; Psychophysics; 0317


APA (6th Edition):

Richards, M. D. (2018). Audiovisual Processing and Integration in Amblyopia. (Doctoral Dissertation). University of Toronto. Retrieved from http://hdl.handle.net/1807/82956

Chicago Manual of Style (16th Edition):

Richards, Michael David. “Audiovisual Processing and Integration in Amblyopia.” 2018. Doctoral Dissertation, University of Toronto. Accessed September 29, 2020. http://hdl.handle.net/1807/82956.

MLA Handbook (7th Edition):

Richards, Michael David. “Audiovisual Processing and Integration in Amblyopia.” 2018. Web. 29 Sep 2020.

Vancouver:

Richards MD. Audiovisual Processing and Integration in Amblyopia. [Internet] [Doctoral dissertation]. University of Toronto; 2018. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/1807/82956.

Council of Science Editors:

Richards MD. Audiovisual Processing and Integration in Amblyopia. [Doctoral Dissertation]. University of Toronto; 2018. Available from: http://hdl.handle.net/1807/82956


Washington University in St. Louis

3. Gaunt, Lauren. Investigating the Relationship Between Gaze Behavior and Audiovisual Benefit Across Various Speech-to-Noise Ratios.

Degree: MA(AM/MA), Psychology, 2019, Washington University in St. Louis

 Speech perception improves when listeners are able to see as well as hear a talker, compared to listening alone. This phenomenon is commonly referred to… (more)

Subjects/Keywords: audiovisual integration, eyetracking, speech perception, audiovisual benefit; Cognitive Psychology


APA (6th Edition):

Gaunt, L. (2019). Investigating the Relationship Between Gaze Behavior and Audiovisual Benefit Across Various Speech-to-Noise Ratios. (Thesis). Washington University in St. Louis. Retrieved from https://openscholarship.wustl.edu/art_sci_etds/1975

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Gaunt, Lauren. “Investigating the Relationship Between Gaze Behavior and Audiovisual Benefit Across Various Speech-to-Noise Ratios.” 2019. Thesis, Washington University in St. Louis. Accessed September 29, 2020. https://openscholarship.wustl.edu/art_sci_etds/1975.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Gaunt, Lauren. “Investigating the Relationship Between Gaze Behavior and Audiovisual Benefit Across Various Speech-to-Noise Ratios.” 2019. Web. 29 Sep 2020.

Vancouver:

Gaunt L. Investigating the Relationship Between Gaze Behavior and Audiovisual Benefit Across Various Speech-to-Noise Ratios. [Internet] [Thesis]. Washington University in St. Louis; 2019. [cited 2020 Sep 29]. Available from: https://openscholarship.wustl.edu/art_sci_etds/1975.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Gaunt L. Investigating the Relationship Between Gaze Behavior and Audiovisual Benefit Across Various Speech-to-Noise Ratios. [Thesis]. Washington University in St. Louis; 2019. Available from: https://openscholarship.wustl.edu/art_sci_etds/1975

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Vanderbilt University

4. Nidiffer, Aaron Ross. Temporal Correlation and Its Role in Multisensory Feature Integration and Binding.

Degree: PhD, Hearing and Speech Sciences, 2018, Vanderbilt University

 Our successful interaction with the environment requires the brain to appropriately combine and segregate sensory information coming from various events. These events frequently produce energy… (more)

Subjects/Keywords: binding; behavior; audiovisual; integration; proximity; similarity


APA (6th Edition):

Nidiffer, A. R. (2018). Temporal Correlation and Its Role in Multisensory Feature Integration and Binding. (Doctoral Dissertation). Vanderbilt University. Retrieved from http://hdl.handle.net/1803/13916

Chicago Manual of Style (16th Edition):

Nidiffer, Aaron Ross. “Temporal Correlation and Its Role in Multisensory Feature Integration and Binding.” 2018. Doctoral Dissertation, Vanderbilt University. Accessed September 29, 2020. http://hdl.handle.net/1803/13916.

MLA Handbook (7th Edition):

Nidiffer, Aaron Ross. “Temporal Correlation and Its Role in Multisensory Feature Integration and Binding.” 2018. Web. 29 Sep 2020.

Vancouver:

Nidiffer AR. Temporal Correlation and Its Role in Multisensory Feature Integration and Binding. [Internet] [Doctoral dissertation]. Vanderbilt University; 2018. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/1803/13916.

Council of Science Editors:

Nidiffer AR. Temporal Correlation and Its Role in Multisensory Feature Integration and Binding. [Doctoral Dissertation]. Vanderbilt University; 2018. Available from: http://hdl.handle.net/1803/13916


McMaster University

5. Chuen, Lorraine. Evaluating the influence of audiovisual unity in cross-modal temporal binding of musical stimuli.

Degree: MSc, 2015, McMaster University

An observer’s inference that multimodal signals come from a common underlying source can facilitate cross-modal binding in the temporal domain. This ‘unity assumption’ can cause… (more)

Subjects/Keywords: temporal perception; audiovisual integration; unity assumption


APA (6th Edition):

Chuen, L. (2015). Evaluating the influence of audiovisual unity in cross-modal temporal binding of musical stimuli. (Masters Thesis). McMaster University. Retrieved from http://hdl.handle.net/11375/18054

Chicago Manual of Style (16th Edition):

Chuen, Lorraine. “Evaluating the influence of audiovisual unity in cross-modal temporal binding of musical stimuli.” 2015. Masters Thesis, McMaster University. Accessed September 29, 2020. http://hdl.handle.net/11375/18054.

MLA Handbook (7th Edition):

Chuen, Lorraine. “Evaluating the influence of audiovisual unity in cross-modal temporal binding of musical stimuli.” 2015. Web. 29 Sep 2020.

Vancouver:

Chuen L. Evaluating the influence of audiovisual unity in cross-modal temporal binding of musical stimuli. [Internet] [Masters thesis]. McMaster University; 2015. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/11375/18054.

Council of Science Editors:

Chuen L. Evaluating the influence of audiovisual unity in cross-modal temporal binding of musical stimuli. [Masters Thesis]. McMaster University; 2015. Available from: http://hdl.handle.net/11375/18054


University of Toronto

6. Loria, Tristan. The Influence of Action-based Attention on Audiovisual Integration.

Degree: PhD, 2020, University of Toronto

 We inhabit a world that offers a multitude of sensory cues that need to be disambiguated in order to perceive and interact with our surrounding… (more)

Subjects/Keywords: attention; audiovisual; multisensory integration; upper-limb; 0575


APA (6th Edition):

Loria, T. (2020). The Influence of Action-based Attention on Audiovisual Integration. (Doctoral Dissertation). University of Toronto. Retrieved from http://hdl.handle.net/1807/100972

Chicago Manual of Style (16th Edition):

Loria, Tristan. “The Influence of Action-based Attention on Audiovisual Integration.” 2020. Doctoral Dissertation, University of Toronto. Accessed September 29, 2020. http://hdl.handle.net/1807/100972.

MLA Handbook (7th Edition):

Loria, Tristan. “The Influence of Action-based Attention on Audiovisual Integration.” 2020. Web. 29 Sep 2020.

Vancouver:

Loria T. The Influence of Action-based Attention on Audiovisual Integration. [Internet] [Doctoral dissertation]. University of Toronto; 2020. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/1807/100972.

Council of Science Editors:

Loria T. The Influence of Action-based Attention on Audiovisual Integration. [Doctoral Dissertation]. University of Toronto; 2020. Available from: http://hdl.handle.net/1807/100972

7. Ferland, Melissa. Audiovisual Integration in Adults: Using a Dynamic Task to Measure Differences in Temporal Binding Windows Across Stimuli.

Degree: MA -MA, Psychology(Functional Area: Clinical-Developmental), 2018, York University

 Being able to integrate information from multiple sensory modalities, such as hearing and sight, is essential for everyday functioning. The temporal binding window (TBW) refers… (more)

Subjects/Keywords: Psychology; Audiovisual integration; Temporal binding window; Methodologies


APA (6th Edition):

Ferland, M. (2018). Audiovisual Integration in Adults: Using a Dynamic Task to Measure Differences in Temporal Binding Windows Across Stimuli. (Masters Thesis). York University. Retrieved from http://hdl.handle.net/10315/34476

Chicago Manual of Style (16th Edition):

Ferland, Melissa. “Audiovisual Integration in Adults: Using a Dynamic Task to Measure Differences in Temporal Binding Windows Across Stimuli.” 2018. Masters Thesis, York University. Accessed September 29, 2020. http://hdl.handle.net/10315/34476.

MLA Handbook (7th Edition):

Ferland, Melissa. “Audiovisual Integration in Adults: Using a Dynamic Task to Measure Differences in Temporal Binding Windows Across Stimuli.” 2018. Web. 29 Sep 2020.

Vancouver:

Ferland M. Audiovisual Integration in Adults: Using a Dynamic Task to Measure Differences in Temporal Binding Windows Across Stimuli. [Internet] [Masters thesis]. York University; 2018. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/10315/34476.

Council of Science Editors:

Ferland M. Audiovisual Integration in Adults: Using a Dynamic Task to Measure Differences in Temporal Binding Windows Across Stimuli. [Masters Thesis]. York University; 2018. Available from: http://hdl.handle.net/10315/34476


University of Ontario Institute of Technology

8. McCracken, Heather. Audiovisual multisensory integration in young adults with and without a diagnosis of Attention-Deficit/Hyperactivity Disorder.

Degree: 2018, University of Ontario Institute of Technology

 Attention-Deficit/Hyperactivity Disorder (ADHD) is a neurodevelopmental disorder with behavioural and neurophysiological characteristics. Several cortical structures that are altered in ADHD are involved in the process… (more)

Subjects/Keywords: Multisensory integration; ADHD; EEG; Response time; Audiovisual


APA (6th Edition):

McCracken, H. (2018). Audiovisual multisensory integration in young adults with and without a diagnosis of Attention-Deficit/Hyperactivity Disorder. (Thesis). University of Ontario Institute of Technology. Retrieved from http://hdl.handle.net/10155/1058

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

McCracken, Heather. “Audiovisual multisensory integration in young adults with and without a diagnosis of Attention-Deficit/Hyperactivity Disorder.” 2018. Thesis, University of Ontario Institute of Technology. Accessed September 29, 2020. http://hdl.handle.net/10155/1058.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

McCracken, Heather. “Audiovisual multisensory integration in young adults with and without a diagnosis of Attention-Deficit/Hyperactivity Disorder.” 2018. Web. 29 Sep 2020.

Vancouver:

McCracken H. Audiovisual multisensory integration in young adults with and without a diagnosis of Attention-Deficit/Hyperactivity Disorder. [Internet] [Thesis]. University of Ontario Institute of Technology; 2018. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/10155/1058.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

McCracken H. Audiovisual multisensory integration in young adults with and without a diagnosis of Attention-Deficit/Hyperactivity Disorder. [Thesis]. University of Ontario Institute of Technology; 2018. Available from: http://hdl.handle.net/10155/1058

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Toronto

9. Overton, Dawson James. Audiovisual Integration Deficits in Schizotypal Personality and Implications for Populations Diagnosed with Schizophrenia.

Degree: 2015, University of Toronto

Some evidence exists for audiovisual speech integration deficits in schizophrenia, but the generality of these deficits is unclear. We sought to characterize these deficits more… (more)

Subjects/Keywords: Audiovisual Integration; Multisensory Integration; Neuroanatomy of Audiovisual Binding; Schizophrenia; Schizotypal Personality; Speech Perception; 0633


APA (6th Edition):

Overton, D. J. (2015). Audiovisual Integration Deficits in Schizotypal Personality and Implications for Populations Diagnosed with Schizophrenia. (Masters Thesis). University of Toronto. Retrieved from http://hdl.handle.net/1807/70523

Chicago Manual of Style (16th Edition):

Overton, Dawson James. “Audiovisual Integration Deficits in Schizotypal Personality and Implications for Populations Diagnosed with Schizophrenia.” 2015. Masters Thesis, University of Toronto. Accessed September 29, 2020. http://hdl.handle.net/1807/70523.

MLA Handbook (7th Edition):

Overton, Dawson James. “Audiovisual Integration Deficits in Schizotypal Personality and Implications for Populations Diagnosed with Schizophrenia.” 2015. Web. 29 Sep 2020.

Vancouver:

Overton DJ. Audiovisual Integration Deficits in Schizotypal Personality and Implications for Populations Diagnosed with Schizophrenia. [Internet] [Masters thesis]. University of Toronto; 2015. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/1807/70523.

Council of Science Editors:

Overton DJ. Audiovisual Integration Deficits in Schizotypal Personality and Implications for Populations Diagnosed with Schizophrenia. [Masters Thesis]. University of Toronto; 2015. Available from: http://hdl.handle.net/1807/70523


Queens University

10. Buchan, Julie N. Cognitive resources in audiovisual speech perception.

Degree: Psychology, 2011, Queens University

 Most events that we encounter in everyday life provide our different senses with correlated information, and audiovisual speech perception is a familiar instance of multisensory… (more)

Subjects/Keywords: attention; cognitive load; audiovisual distractors; audiovisual speech perception; temporal integration; perception; multisensory integration; selective attention


APA (6th Edition):

Buchan, J. N. (2011). Cognitive resources in audiovisual speech perception. (Thesis). Queens University. Retrieved from http://hdl.handle.net/1974/6835

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Buchan, Julie N. “Cognitive resources in audiovisual speech perception.” 2011. Thesis, Queens University. Accessed September 29, 2020. http://hdl.handle.net/1974/6835.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Buchan, Julie N. “Cognitive resources in audiovisual speech perception.” 2011. Web. 29 Sep 2020.

Vancouver:

Buchan JN. Cognitive resources in audiovisual speech perception. [Internet] [Thesis]. Queens University; 2011. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/1974/6835.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Buchan JN. Cognitive resources in audiovisual speech perception. [Thesis]. Queens University; 2011. Available from: http://hdl.handle.net/1974/6835

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


Queens University

11. Nahanni, Celina. Sources and Correlates of Performance Enhancement in Audiovisual Speech Perception.

Degree: Neuroscience Studies, 2014, Queens University

 In a noisy environment, speech intelligibility is greatly enhanced by seeing the speaker’s face. This enhancement results from the integration of auditory and visual signals,… (more)

Subjects/Keywords: Speech-in-noise; Confusions; McGurk illusion; Audiovisual speech; Audiovisual integration; Integration enhancement; Word identification; Open-set


APA (6th Edition):

Nahanni, C. (2014). Sources and Correlates of Performance Enhancement in Audiovisual Speech Perception. (Thesis). Queens University. Retrieved from http://hdl.handle.net/1974/12299

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Nahanni, Celina. “Sources and Correlates of Performance Enhancement in Audiovisual Speech Perception.” 2014. Thesis, Queens University. Accessed September 29, 2020. http://hdl.handle.net/1974/12299.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

MLA Handbook (7th Edition):

Nahanni, Celina. “Sources and Correlates of Performance Enhancement in Audiovisual Speech Perception.” 2014. Web. 29 Sep 2020.

Vancouver:

Nahanni C. Sources and Correlates of Performance Enhancement in Audiovisual Speech Perception. [Internet] [Thesis]. Queens University; 2014. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/1974/12299.

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Council of Science Editors:

Nahanni C. Sources and Correlates of Performance Enhancement in Audiovisual Speech Perception. [Thesis]. Queens University; 2014. Available from: http://hdl.handle.net/1974/12299

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation


University of Toronto

12. Narinesingh, Cindy. Examining Audiovisual Integration in Amblyopia using the Sound-induced Flash Illusion.

Degree: 2016, University of Toronto

Amblyopia is a neurodevelopmental disorder associated with reduced vision in one or both eyes, caused by abnormal visual experience during early childhood. The sound-induced flash… (more)

Subjects/Keywords: amblyopia; audiovisual integration; multisensory integration; sound-induced flash illusion; temporal binding window; visual deprivation; 0381


APA (6th Edition):

Narinesingh, C. (2016). Examining Audiovisual Integration in Amblyopia using the Sound-induced Flash Illusion. (Masters Thesis). University of Toronto. Retrieved from http://hdl.handle.net/1807/75831

Chicago Manual of Style (16th Edition):

Narinesingh, Cindy. “Examining Audiovisual Integration in Amblyopia using the Sound-induced Flash Illusion.” 2016. Masters Thesis, University of Toronto. Accessed September 29, 2020. http://hdl.handle.net/1807/75831.

MLA Handbook (7th Edition):

Narinesingh, Cindy. “Examining Audiovisual Integration in Amblyopia using the Sound-induced Flash Illusion.” 2016. Web. 29 Sep 2020.

Vancouver:

Narinesingh C. Examining Audiovisual Integration in Amblyopia using the Sound-induced Flash Illusion. [Internet] [Masters thesis]. University of Toronto; 2016. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/1807/75831.

Council of Science Editors:

Narinesingh C. Examining Audiovisual Integration in Amblyopia using the Sound-induced Flash Illusion. [Masters Thesis]. University of Toronto; 2016. Available from: http://hdl.handle.net/1807/75831


Texas Medical Center

13. Sertel, Muge O. Investigating the Neural Basis of Audiovisual Speech Perception with Intracranial Recordings in Humans.

Degree: PhD, 2017, Texas Medical Center

  Speech is inherently multisensory, containing auditory information from the voice and visual information from the mouth movements of the talker. Hearing the voice is… (more)

Subjects/Keywords: Intracranial EEG; Electrocorticography; Speech Perception; Audiovisual Integration; Cognitive Neuroscience; Systems Neuroscience


APA (6th Edition):

Sertel, M. O. (2017). Investigating the Neural Basis of Audiovisual Speech Perception with Intracranial Recordings in Humans. (Doctoral Dissertation). Texas Medical Center. Retrieved from https://digitalcommons.library.tmc.edu/utgsbs_dissertations/797

Chicago Manual of Style (16th Edition):

Sertel, Muge O. “Investigating the Neural Basis of Audiovisual Speech Perception with Intracranial Recordings in Humans.” 2017. Doctoral Dissertation, Texas Medical Center. Accessed September 29, 2020. https://digitalcommons.library.tmc.edu/utgsbs_dissertations/797.

MLA Handbook (7th Edition):

Sertel, Muge O. “Investigating the Neural Basis of Audiovisual Speech Perception with Intracranial Recordings in Humans.” 2017. Web. 29 Sep 2020.

Vancouver:

Sertel MO. Investigating the Neural Basis of Audiovisual Speech Perception with Intracranial Recordings in Humans. [Internet] [Doctoral dissertation]. Texas Medical Center; 2017. [cited 2020 Sep 29]. Available from: https://digitalcommons.library.tmc.edu/utgsbs_dissertations/797.

Council of Science Editors:

Sertel MO. Investigating the Neural Basis of Audiovisual Speech Perception with Intracranial Recordings in Humans. [Doctoral Dissertation]. Texas Medical Center; 2017. Available from: https://digitalcommons.library.tmc.edu/utgsbs_dissertations/797


University of Rochester

14. Lee, Susan Sojung. Audiovisual Integration During Language Comprehension:The Neural Basis of Social Communication in Autism and Typical Development.

Degree: PhD, 2011, University of Rochester

 Nonverbal information, such as motion cues from the lips, face, hands, and body, contributes significantly to social communication as it provides helpful information for understanding… (more)

Subjects/Keywords: Autism; Audiovisual Integration; Gesture; Multimodal; Superior Temporal Sulcus


APA (6th Edition):

Lee, S. S. (2011). Audiovisual Integration During Language Comprehension:The Neural Basis of Social Communication in Autism and Typical Development. (Doctoral Dissertation). University of Rochester. Retrieved from http://hdl.handle.net/1802/15887

Chicago Manual of Style (16th Edition):

Lee, Susan Sojung. “Audiovisual Integration During Language Comprehension:The Neural Basis of Social Communication in Autism and Typical Development.” 2011. Doctoral Dissertation, University of Rochester. Accessed September 29, 2020. http://hdl.handle.net/1802/15887.

MLA Handbook (7th Edition):

Lee, Susan Sojung. “Audiovisual Integration During Language Comprehension:The Neural Basis of Social Communication in Autism and Typical Development.” 2011. Web. 29 Sep 2020.

Vancouver:

Lee SS. Audiovisual Integration During Language Comprehension:The Neural Basis of Social Communication in Autism and Typical Development. [Internet] [Doctoral dissertation]. University of Rochester; 2011. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/1802/15887.

Council of Science Editors:

Lee SS. Audiovisual Integration During Language Comprehension:The Neural Basis of Social Communication in Autism and Typical Development. [Doctoral Dissertation]. University of Rochester; 2011. Available from: http://hdl.handle.net/1802/15887


University of Rochester

15. Smith, Elizabeth Gayle. Multisensory integration and temporal synchrony in autism.

Degree: PhD, 2013, University of Rochester

 Multisensory integration is the process by which individuals combine information from multiple sources (e.g., vision, audition, touch) to produce a unique and unitary percept that… (more)

Subjects/Keywords: Audiovisual; Autism; Integration; Multisensory; Synchrony; Temporal; Localization; Identification


APA (6th Edition):

Smith, E. G. (2013). Multisensory integration and temporal synchrony in autism. (Doctoral Dissertation). University of Rochester. Retrieved from http://hdl.handle.net/1802/27919

Chicago Manual of Style (16th Edition):

Smith, Elizabeth Gayle. “Multisensory integration and temporal synchrony in autism.” 2013. Doctoral Dissertation, University of Rochester. Accessed September 29, 2020. http://hdl.handle.net/1802/27919.

MLA Handbook (7th Edition):

Smith, Elizabeth Gayle. “Multisensory integration and temporal synchrony in autism.” 2013. Web. 29 Sep 2020.

Vancouver:

Smith EG. Multisensory integration and temporal synchrony in autism. [Internet] [Doctoral dissertation]. University of Rochester; 2013. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/1802/27919.

Council of Science Editors:

Smith EG. Multisensory integration and temporal synchrony in autism. [Doctoral Dissertation]. University of Rochester; 2013. Available from: http://hdl.handle.net/1802/27919


Colorado State University

16. Becker, Katherine M. Prosodic influence in face emotion perception: evidence from behavioral measures and functional near-infrared spectroscopy.

Degree: MS(M.S.), Psychology, 2017, Colorado State University

 The perception of another person's emotional state is formed by the intersection of simultaneously presented affective vocal and facial information. These two channels are highly… (more)

Subjects/Keywords: bimodal bias; emotion perception; right hemisphere; categorical perception; audiovisual integration; prosody


APA (6th Edition):

Becker, K. M. (2017). Prosodic influence in face emotion perception: evidence from behavioral measures and functional near-infrared spectroscopy. (Masters Thesis). Colorado State University. Retrieved from http://hdl.handle.net/10217/183912

Chicago Manual of Style (16th Edition):

Becker, Katherine M. “Prosodic influence in face emotion perception: evidence from behavioral measures and functional near-infrared spectroscopy.” 2017. Masters Thesis, Colorado State University. Accessed September 29, 2020. http://hdl.handle.net/10217/183912.

MLA Handbook (7th Edition):

Becker, Katherine M. “Prosodic influence in face emotion perception: evidence from behavioral measures and functional near-infrared spectroscopy.” 2017. Web. 29 Sep 2020.

Vancouver:

Becker KM. Prosodic influence in face emotion perception: evidence from behavioral measures and functional near-infrared spectroscopy. [Internet] [Masters thesis]. Colorado State University; 2017. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/10217/183912.

Council of Science Editors:

Becker KM. Prosodic influence in face emotion perception: evidence from behavioral measures and functional near-infrared spectroscopy. [Masters Thesis]. Colorado State University; 2017. Available from: http://hdl.handle.net/10217/183912


Virginia Tech

17. Lorenzi, Jill Elizabeth. Ability of Children with Autism Spectrum Disorders to Identify Emotional Facial Expressions.

Degree: MS, Psychology, 2012, Virginia Tech

 Previous research on emotion identification in Autism Spectrum Disorders (ASD) has demonstrated inconsistent results. While some studies have cited a deficit in emotion identification for… (more)

Subjects/Keywords: Children; Emotion Identification; Autism Spectrum Disorders; Eye Tracking; Audiovisual Integration

APA (6th Edition):

Lorenzi, J. E. (2012). Ability of Children with Autism Spectrum Disorders to Identify Emotional Facial Expressions. (Masters Thesis). Virginia Tech. Retrieved from http://hdl.handle.net/10919/42642

Chicago Manual of Style (16th Edition):

Lorenzi, Jill Elizabeth. “Ability of Children with Autism Spectrum Disorders to Identify Emotional Facial Expressions.” 2012. Masters Thesis, Virginia Tech. Accessed September 29, 2020. http://hdl.handle.net/10919/42642.

MLA Handbook (7th Edition):

Lorenzi, Jill Elizabeth. “Ability of Children with Autism Spectrum Disorders to Identify Emotional Facial Expressions.” 2012. Web. 29 Sep 2020.

Vancouver:

Lorenzi JE. Ability of Children with Autism Spectrum Disorders to Identify Emotional Facial Expressions. [Internet] [Masters thesis]. Virginia Tech; 2012. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/10919/42642.

Council of Science Editors:

Lorenzi JE. Ability of Children with Autism Spectrum Disorders to Identify Emotional Facial Expressions. [Masters Thesis]. Virginia Tech; 2012. Available from: http://hdl.handle.net/10919/42642


University of Maryland

18. Jenkins III, Julian. MEG, PSYCHOPHYSICAL AND COMPUTATIONAL STUDIES OF LOUDNESS, TIMBRE, AND AUDIOVISUAL INTEGRATION.

Degree: Biology, 2011, University of Maryland

 Natural scenes and ecological signals are inherently complex and understanding of their perception and processing is incomplete. For example, a speech signal contains not only… (more)

Subjects/Keywords: Neurosciences; Audiovisual Integration; Auditory Cognition; MEG; Psychophysics; Speech; Vowels

APA (6th Edition):

Jenkins III, J. (2011). MEG, PSYCHOPHYSICAL AND COMPUTATIONAL STUDIES OF LOUDNESS, TIMBRE, AND AUDIOVISUAL INTEGRATION. (Thesis). University of Maryland. Retrieved from http://hdl.handle.net/1903/12084

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Jenkins III, Julian. “MEG, PSYCHOPHYSICAL AND COMPUTATIONAL STUDIES OF LOUDNESS, TIMBRE, AND AUDIOVISUAL INTEGRATION.” 2011. Thesis, University of Maryland. Accessed September 29, 2020. http://hdl.handle.net/1903/12084.

MLA Handbook (7th Edition):

Jenkins III, Julian. “MEG, PSYCHOPHYSICAL AND COMPUTATIONAL STUDIES OF LOUDNESS, TIMBRE, AND AUDIOVISUAL INTEGRATION.” 2011. Web. 29 Sep 2020.

Vancouver:

Jenkins III J. MEG, PSYCHOPHYSICAL AND COMPUTATIONAL STUDIES OF LOUDNESS, TIMBRE, AND AUDIOVISUAL INTEGRATION. [Internet] [Thesis]. University of Maryland; 2011. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/1903/12084.

Council of Science Editors:

Jenkins III J. MEG, PSYCHOPHYSICAL AND COMPUTATIONAL STUDIES OF LOUDNESS, TIMBRE, AND AUDIOVISUAL INTEGRATION. [Thesis]. University of Maryland; 2011. Available from: http://hdl.handle.net/1903/12084


University of Western Ontario

19. Schormans, Ashley L. Cortical Plasticity following Adult-Onset Hearing Loss.

Degree: 2019, University of Western Ontario

 The consequences of hearing loss are not confined to how the central auditory system processes sound; crossmodal plasticity also occurs, which is characterized by an… (more)

Subjects/Keywords: hearing loss; crossmodal plasticity; audiovisual integration; audiovisual perception; extracellular electrophysiology; lateral extrastriate visual cortex; Systems Neuroscience

APA (6th Edition):

Schormans, A. L. (2019). Cortical Plasticity following Adult-Onset Hearing Loss. (Thesis). University of Western Ontario. Retrieved from https://ir.lib.uwo.ca/etd/6050

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Schormans, Ashley L. “Cortical Plasticity following Adult-Onset Hearing Loss.” 2019. Thesis, University of Western Ontario. Accessed September 29, 2020. https://ir.lib.uwo.ca/etd/6050.

MLA Handbook (7th Edition):

Schormans, Ashley L. “Cortical Plasticity following Adult-Onset Hearing Loss.” 2019. Web. 29 Sep 2020.

Vancouver:

Schormans AL. Cortical Plasticity following Adult-Onset Hearing Loss. [Internet] [Thesis]. University of Western Ontario; 2019. [cited 2020 Sep 29]. Available from: https://ir.lib.uwo.ca/etd/6050.

Council of Science Editors:

Schormans AL. Cortical Plasticity following Adult-Onset Hearing Loss. [Thesis]. University of Western Ontario; 2019. Available from: https://ir.lib.uwo.ca/etd/6050


Universitat Pompeu Fabra

20. Sánchez García, Carolina, 1984-. Cross-modal predictive mechanisms during speech perception.

Degree: Departament de Ciències Experimentals i de la Salut, 2013, Universitat Pompeu Fabra

 The present dissertation addresses the predictive mechanisms operating online during audiovisual speech perception. The idea that prediction mechanisms operate during the perception of speech at… (more)

Subjects/Keywords: Habla audiovisual; Predicción; Percepción del habla; Integración multisensorial; Predicción fonológica; Audiovisual speech; Predictive coding; Speech perception; Multisensory integration; Event-related potentials; Phonology-based prediction; 81

APA (6th Edition):

Sánchez García, Carolina, 1984-. (2013). Cross-modal predictive mechanisms during speech perception. (Thesis). Universitat Pompeu Fabra. Retrieved from http://hdl.handle.net/10803/293266

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Sánchez García, Carolina, 1984-. “Cross-modal predictive mechanisms during speech perception.” 2013. Thesis, Universitat Pompeu Fabra. Accessed September 29, 2020. http://hdl.handle.net/10803/293266.

MLA Handbook (7th Edition):

Sánchez García, Carolina, 1984-. “Cross-modal predictive mechanisms during speech perception.” 2013. Web. 29 Sep 2020.

Vancouver:

Sánchez García, Carolina 1984-. Cross-modal predictive mechanisms during speech perception. [Internet] [Thesis]. Universitat Pompeu Fabra; 2013. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/10803/293266.

Council of Science Editors:

Sánchez García, Carolina 1984-. Cross-modal predictive mechanisms during speech perception. [Thesis]. Universitat Pompeu Fabra; 2013. Available from: http://hdl.handle.net/10803/293266


University of California – Irvine

21. Venezia, Jonathan Henry. Psychophysical and neurophysiological investigations from three approaches to understanding human speech processing.

Degree: Psychology, 2014, University of California – Irvine

 Human speech processing (perception and in some cases production) is approached from three levels. At the top level, I investigate the role of the motor… (more)

Subjects/Keywords: Cognitive psychology; Neurosciences; Psychology; Audiovisual Speech; Auditory Field Maps; Sensorimotor Integration; Signal Detection; Speech Perception; Speech Production

APA (6th Edition):

Venezia, J. H. (2014). Psychophysical and neurophysiological investigations from three approaches to understanding human speech processing. (Thesis). University of California – Irvine. Retrieved from http://www.escholarship.org/uc/item/4st223zk

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Venezia, Jonathan Henry. “Psychophysical and neurophysiological investigations from three approaches to understanding human speech processing.” 2014. Thesis, University of California – Irvine. Accessed September 29, 2020. http://www.escholarship.org/uc/item/4st223zk.

MLA Handbook (7th Edition):

Venezia, Jonathan Henry. “Psychophysical and neurophysiological investigations from three approaches to understanding human speech processing.” 2014. Web. 29 Sep 2020.

Vancouver:

Venezia JH. Psychophysical and neurophysiological investigations from three approaches to understanding human speech processing. [Internet] [Thesis]. University of California – Irvine; 2014. [cited 2020 Sep 29]. Available from: http://www.escholarship.org/uc/item/4st223zk.

Council of Science Editors:

Venezia JH. Psychophysical and neurophysiological investigations from three approaches to understanding human speech processing. [Thesis]. University of California – Irvine; 2014. Available from: http://www.escholarship.org/uc/item/4st223zk


Université de Grenoble

22. Coutrot, Antoine. Influence du son lors de l’exploration de scènes naturelles dynamiques : prise en compte de l’information sonore dans un modèle d’attention visuelle : Influence of sound on visual exploration of dynamic natural scenes : integration of auditory information in a visual attention model.

Degree: Docteur es, Signal, image, paroles, télécoms, 2014, Université de Grenoble

We study the influence of different audiovisual attributes on the visual exploration of dynamic natural scenes. We demonstrate that while the way we explore a scene… (more)

Subjects/Keywords: Mouvements oculaires; Scènes naturelles dynamiques; Saillance audiovisuelle; Intégration; Visages; Eye movements; Natural dynamic scenes; Audiovisual saliency; Integration; Faces; 620

APA (6th Edition):

Coutrot, A. (2014). Influence du son lors de l’exploration de scènes naturelles dynamiques : prise en compte de l’information sonore dans un modèle d’attention visuelle : Influence of sound on visual exploration of dynamic natural scenes : integration of auditory information in a visual attention model. (Doctoral Dissertation). Université de Grenoble. Retrieved from http://www.theses.fr/2014GRENT119

Chicago Manual of Style (16th Edition):

Coutrot, Antoine. “Influence du son lors de l’exploration de scènes naturelles dynamiques : prise en compte de l’information sonore dans un modèle d’attention visuelle : Influence of sound on visual exploration of dynamic natural scenes : integration of auditory information in a visual attention model.” 2014. Doctoral Dissertation, Université de Grenoble. Accessed September 29, 2020. http://www.theses.fr/2014GRENT119.

MLA Handbook (7th Edition):

Coutrot, Antoine. “Influence du son lors de l’exploration de scènes naturelles dynamiques : prise en compte de l’information sonore dans un modèle d’attention visuelle : Influence of sound on visual exploration of dynamic natural scenes : integration of auditory information in a visual attention model.” 2014. Web. 29 Sep 2020.

Vancouver:

Coutrot A. Influence du son lors de l’exploration de scènes naturelles dynamiques : prise en compte de l’information sonore dans un modèle d’attention visuelle : Influence of sound on visual exploration of dynamic natural scenes : integration of auditory information in a visual attention model. [Internet] [Doctoral dissertation]. Université de Grenoble; 2014. [cited 2020 Sep 29]. Available from: http://www.theses.fr/2014GRENT119.

Council of Science Editors:

Coutrot A. Influence du son lors de l’exploration de scènes naturelles dynamiques : prise en compte de l’information sonore dans un modèle d’attention visuelle : Influence of sound on visual exploration of dynamic natural scenes : integration of auditory information in a visual attention model. [Doctoral Dissertation]. Université de Grenoble; 2014. Available from: http://www.theses.fr/2014GRENT119

23. Reetzke, Rachel Denise. Developmental and cultural factors of audiovisual speech perception in noise.

Degree: MA, Communication Sciences and Disorders, 2014, University of Texas – Austin

 The aim of this project is two-fold: 1) to investigate developmental differences in intelligibility gains from visual cues in speech perception-in-noise, and 2) to examine… (more)

Subjects/Keywords: Audiovisual Integration; Speech perception-in-noise

APA (6th Edition):

Reetzke, R. D. (2014). Developmental and cultural factors of audiovisual speech perception in noise. (Masters Thesis). University of Texas – Austin. Retrieved from http://hdl.handle.net/2152/25904

Chicago Manual of Style (16th Edition):

Reetzke, Rachel Denise. “Developmental and cultural factors of audiovisual speech perception in noise.” 2014. Masters Thesis, University of Texas – Austin. Accessed September 29, 2020. http://hdl.handle.net/2152/25904.

MLA Handbook (7th Edition):

Reetzke, Rachel Denise. “Developmental and cultural factors of audiovisual speech perception in noise.” 2014. Web. 29 Sep 2020.

Vancouver:

Reetzke RD. Developmental and cultural factors of audiovisual speech perception in noise. [Internet] [Masters thesis]. University of Texas – Austin; 2014. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/2152/25904.

Council of Science Editors:

Reetzke RD. Developmental and cultural factors of audiovisual speech perception in noise. [Masters Thesis]. University of Texas – Austin; 2014. Available from: http://hdl.handle.net/2152/25904


University of Western Ontario

24. Holloway, Ian Douglas. Symbolizing Number: fMRI investigations of the semantic, auditory, and visual correlates of Hindu-Arabic numerals.

Degree: 2012, University of Western Ontario

 Humans are born with a sensitivity to numerical magnitude. In literate cultures, these numerical intuitions are associated with a symbolic notation (e.g., Hindu-Arabic numerals). While a… (more)

Subjects/Keywords: Hindu-Arabic numerals; fMRI; numerical magnitude representation; audiovisual integration; intraparietal sulcus; fusiform gyrus; superior temporal gyrus; Cognitive Neuroscience

APA (6th Edition):

Holloway, I. D. (2012). Symbolizing Number: fMRI investigations of the semantic, auditory, and visual correlates of Hindu-Arabic numerals. (Thesis). University of Western Ontario. Retrieved from https://ir.lib.uwo.ca/etd/711

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Holloway, Ian Douglas. “Symbolizing Number: fMRI investigations of the semantic, auditory, and visual correlates of Hindu-Arabic numerals.” 2012. Thesis, University of Western Ontario. Accessed September 29, 2020. https://ir.lib.uwo.ca/etd/711.

MLA Handbook (7th Edition):

Holloway, Ian Douglas. “Symbolizing Number: fMRI investigations of the semantic, auditory, and visual correlates of Hindu-Arabic numerals.” 2012. Web. 29 Sep 2020.

Vancouver:

Holloway ID. Symbolizing Number: fMRI investigations of the semantic, auditory, and visual correlates of Hindu-Arabic numerals. [Internet] [Thesis]. University of Western Ontario; 2012. [cited 2020 Sep 29]. Available from: https://ir.lib.uwo.ca/etd/711.

Council of Science Editors:

Holloway ID. Symbolizing Number: fMRI investigations of the semantic, auditory, and visual correlates of Hindu-Arabic numerals. [Thesis]. University of Western Ontario; 2012. Available from: https://ir.lib.uwo.ca/etd/711

25. Deonarine, Justin. Noise reduction limits the McGurk Effect.

Degree: 2011, University of Waterloo

 In the McGurk Effect (McGurk & MacDonald, 1976), a visual depiction of a speaker silently mouthing the syllable [ga]/[ka] is presented concurrently with the auditory… (more)

Subjects/Keywords: speech perception; psycholinguistics; audiovisual integration; McGurk Effect

APA (6th Edition):

Deonarine, J. (2011). Noise reduction limits the McGurk Effect. (Thesis). University of Waterloo. Retrieved from http://hdl.handle.net/10012/6046

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Deonarine, Justin. “Noise reduction limits the McGurk Effect.” 2011. Thesis, University of Waterloo. Accessed September 29, 2020. http://hdl.handle.net/10012/6046.

MLA Handbook (7th Edition):

Deonarine, Justin. “Noise reduction limits the McGurk Effect.” 2011. Web. 29 Sep 2020.

Vancouver:

Deonarine J. Noise reduction limits the McGurk Effect. [Internet] [Thesis]. University of Waterloo; 2011. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/10012/6046.

Council of Science Editors:

Deonarine J. Noise reduction limits the McGurk Effect. [Thesis]. University of Waterloo; 2011. Available from: http://hdl.handle.net/10012/6046


University of Louisville

26. Wu, Jia. Speech perception and the McGurk effect : a cross cultural study using event-related potentials.

Degree: PhD, 2009, University of Louisville

  Previous research has indicated the important role of visual information in the speech perception process. These studies have elucidated the areas of the brain… (more)

Subjects/Keywords: Speech perception; Event-related potentials; ERP; N400; McGurk; Multi-sensory; Chinese; Audiovisual; McGurk effect; Audiovisual integration

APA (6th Edition):

Wu, J. (2009). Speech perception and the McGurk effect : a cross cultural study using event-related potentials. (Doctoral Dissertation). University of Louisville. Retrieved from 10.18297/etd/1597 ; https://ir.library.louisville.edu/etd/1597

Chicago Manual of Style (16th Edition):

Wu, Jia. “Speech perception and the McGurk effect : a cross cultural study using event-related potentials.” 2009. Doctoral Dissertation, University of Louisville. Accessed September 29, 2020. 10.18297/etd/1597 ; https://ir.library.louisville.edu/etd/1597.

MLA Handbook (7th Edition):

Wu, Jia. “Speech perception and the McGurk effect : a cross cultural study using event-related potentials.” 2009. Web. 29 Sep 2020.

Vancouver:

Wu J. Speech perception and the McGurk effect : a cross cultural study using event-related potentials. [Internet] [Doctoral dissertation]. University of Louisville; 2009. [cited 2020 Sep 29]. Available from: 10.18297/etd/1597 ; https://ir.library.louisville.edu/etd/1597.

Council of Science Editors:

Wu J. Speech perception and the McGurk effect : a cross cultural study using event-related potentials. [Doctoral Dissertation]. University of Louisville; 2009. Available from: 10.18297/etd/1597 ; https://ir.library.louisville.edu/etd/1597

27. Shatzer, Hannah Elizabeth. Visual and Temporal Influences on Multimodal Speech Integration.

Degree: MA, Psychology, 2015, The Ohio State University

 Auditory and visual speech cues are often used in tandem to maximize understanding of a speech signal when communicating. A neural model by Bhat et… (more)

Subjects/Keywords: Psychology; audiovisual speech; integration; visual informativeness; duration; onset asynchrony

APA (6th Edition):

Shatzer, H. E. (2015). Visual and Temporal Influences on Multimodal Speech Integration. (Masters Thesis). The Ohio State University. Retrieved from http://rave.ohiolink.edu/etdc/view?acc_num=osu1437403560

Chicago Manual of Style (16th Edition):

Shatzer, Hannah Elizabeth. “Visual and Temporal Influences on Multimodal Speech Integration.” 2015. Masters Thesis, The Ohio State University. Accessed September 29, 2020. http://rave.ohiolink.edu/etdc/view?acc_num=osu1437403560.

MLA Handbook (7th Edition):

Shatzer, Hannah Elizabeth. “Visual and Temporal Influences on Multimodal Speech Integration.” 2015. Web. 29 Sep 2020.

Vancouver:

Shatzer HE. Visual and Temporal Influences on Multimodal Speech Integration. [Internet] [Masters thesis]. The Ohio State University; 2015. [cited 2020 Sep 29]. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=osu1437403560.

Council of Science Editors:

Shatzer HE. Visual and Temporal Influences on Multimodal Speech Integration. [Masters Thesis]. The Ohio State University; 2015. Available from: http://rave.ohiolink.edu/etdc/view?acc_num=osu1437403560

28. Uutela, Kimmo. Estimating Neural Currents from Neuromagnetic Measurements.

Degree: 2001, Helsinki University of Technology

This thesis concerns three new methods for estimating the electrical activity of the human brain from the magnetic fields measured outside the head. The first… (more)

Subjects/Keywords: magnetoencephalography; inverse problem; minimum-norm estimate; multidipole model; head movements; visuomotor interaction; visual attention; audiovisual integration; sign language

APA (6th Edition):

Uutela, K. (2001). Estimating Neural Currents from Neuromagnetic Measurements. (Thesis). Helsinki University of Technology. Retrieved from http://lib.tkk.fi/Diss/2001/isbn9512256991/

Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation

Chicago Manual of Style (16th Edition):

Uutela, Kimmo. “Estimating Neural Currents from Neuromagnetic Measurements.” 2001. Thesis, Helsinki University of Technology. Accessed September 29, 2020. http://lib.tkk.fi/Diss/2001/isbn9512256991/.

MLA Handbook (7th Edition):

Uutela, Kimmo. “Estimating Neural Currents from Neuromagnetic Measurements.” 2001. Web. 29 Sep 2020.

Vancouver:

Uutela K. Estimating Neural Currents from Neuromagnetic Measurements. [Internet] [Thesis]. Helsinki University of Technology; 2001. [cited 2020 Sep 29]. Available from: http://lib.tkk.fi/Diss/2001/isbn9512256991/.

Council of Science Editors:

Uutela K. Estimating Neural Currents from Neuromagnetic Measurements. [Thesis]. Helsinki University of Technology; 2001. Available from: http://lib.tkk.fi/Diss/2001/isbn9512256991/


Université Paris-Sud – Paris XI

29. Bouchara, Tifanie. Comparaison et combinaison de rendus visuels et sonores pour la conception d'interfaces homme-machine : des facteurs humains aux stratégies de présentation à base de distorsion : Comparison and combination of visual aud audio renderings to conceive human-computer interfaces : from human factors to distortion-based presentation strategies.

Degree: Docteur es, Informatique, 2012, Université Paris-Sud – Paris XI

 Although more and more audio and audiovisual data are available, the majority of interfaces that provide access to them rely solely on a… (more)

Subjects/Keywords: Interfaces multimodales; Design d’interaction sonore; Exploration de collection multimédia; Stratégies de presentation de l’information; Techniques focus+contexte; Intégration audiovisuelle en sortie; Perception visuelle, auditive et multisensorielle; Multimodal interfaces; Sonic Interaction Design; Exploration of multimedia collection; Presentation strategies; Focus+context techniques; Audiovisual integration; Visual, auditory and multisensorial perception

APA (6th Edition):

Bouchara, T. (2012). Comparaison et combinaison de rendus visuels et sonores pour la conception d'interfaces homme-machine : des facteurs humains aux stratégies de présentation à base de distorsion : Comparison and combination of visual aud audio renderings to conceive human-computer interfaces : from human factors to distortion-based presentation strategies. (Doctoral Dissertation). Université Paris-Sud – Paris XI. Retrieved from http://www.theses.fr/2012PA112248

Chicago Manual of Style (16th Edition):

Bouchara, Tifanie. “Comparaison et combinaison de rendus visuels et sonores pour la conception d'interfaces homme-machine : des facteurs humains aux stratégies de présentation à base de distorsion : Comparison and combination of visual aud audio renderings to conceive human-computer interfaces : from human factors to distortion-based presentation strategies.” 2012. Doctoral Dissertation, Université Paris-Sud – Paris XI. Accessed September 29, 2020. http://www.theses.fr/2012PA112248.

MLA Handbook (7th Edition):

Bouchara, Tifanie. “Comparaison et combinaison de rendus visuels et sonores pour la conception d'interfaces homme-machine : des facteurs humains aux stratégies de présentation à base de distorsion : Comparison and combination of visual aud audio renderings to conceive human-computer interfaces : from human factors to distortion-based presentation strategies.” 2012. Web. 29 Sep 2020.

Vancouver:

Bouchara T. Comparaison et combinaison de rendus visuels et sonores pour la conception d'interfaces homme-machine : des facteurs humains aux stratégies de présentation à base de distorsion : Comparison and combination of visual aud audio renderings to conceive human-computer interfaces : from human factors to distortion-based presentation strategies. [Internet] [Doctoral dissertation]. Université Paris-Sud – Paris XI; 2012. [cited 2020 Sep 29]. Available from: http://www.theses.fr/2012PA112248.

Council of Science Editors:

Bouchara T. Comparaison et combinaison de rendus visuels et sonores pour la conception d'interfaces homme-machine : des facteurs humains aux stratégies de présentation à base de distorsion : Comparison and combination of visual aud audio renderings to conceive human-computer interfaces : from human factors to distortion-based presentation strategies. [Doctoral Dissertation]. Université Paris-Sud – Paris XI; 2012. Available from: http://www.theses.fr/2012PA112248

30. Moro, Stefania Siera. Long-Term Consequences of Early Eye Enucleation on Audiovisual Processing.

Degree: PhD, Psychology (Functional Area: Brain, Behaviour & Cognitive Science), 2018, York University

 A growing body of research shows that complete deprivation of the visual system from the loss of both eyes early in life results in changes… (more)

Subjects/Keywords: Neurosciences; Neuroscience; Visual processing; Auditory processing; Visual deprivation; Monocular enucleation; Audiovisual processing; Multisensory integration; Temporal binding window; Double flash illusion; Visual capture; McGurk illusion; Person identification; Object identification; Medial geniculate body; MRI

APA (6th Edition):

Moro, S. S. (2018). Long-Term Consequences of Early Eye Enucleation on Audiovisual Processing. (Doctoral Dissertation). York University. Retrieved from http://hdl.handle.net/10315/34989

Chicago Manual of Style (16th Edition):

Moro, Stefania Siera. “Long-Term Consequences of Early Eye Enucleation on Audiovisual Processing.” 2018. Doctoral Dissertation, York University. Accessed September 29, 2020. http://hdl.handle.net/10315/34989.

MLA Handbook (7th Edition):

Moro, Stefania Siera. “Long-Term Consequences of Early Eye Enucleation on Audiovisual Processing.” 2018. Web. 29 Sep 2020.

Vancouver:

Moro SS. Long-Term Consequences of Early Eye Enucleation on Audiovisual Processing. [Internet] [Doctoral dissertation]. York University; 2018. [cited 2020 Sep 29]. Available from: http://hdl.handle.net/10315/34989.

Council of Science Editors:

Moro SS. Long-Term Consequences of Early Eye Enucleation on Audiovisual Processing. [Doctoral Dissertation]. York University; 2018. Available from: http://hdl.handle.net/10315/34989
