You searched for subject:(Multisensory Integration MSI). Showing records 1 – 30 of 11,801 total matches.

University of Ontario Institute of Technology
1.
Karellas, Antonia.
The influence of subclinical neck pain on multisensory integration.
Degree: 2018, University of Ontario Institute of Technology
URL: http://hdl.handle.net/10155/1057
Subclinical neck pain (SCNP) is defined as neck pain of mild to moderate severity for which individuals have not yet been treated. This condition is associated with altered somatosensory integration, which has been revealed through motor learning studies as well as work incorporating mental rotation of objects and proprioception tasks. These findings predict that multisensory integration (MSI), a process the brain carries out continuously throughout the day, may also be affected by untreated neck pain. Study one of this research investigated differences in MSI task performance and event-related potentials (ERPs), recorded by electroencephalography (EEG), between an SCNP group and a neck-pain-free control group. This study showed decrements in reaction time (RT) to all stimulus conditions in the presence of neck pain, accompanied by reductions in overall neural activity and multisensory enhancement. In the second experiment, the effect of a six-week chiropractic intervention program was assessed between an SCNP control and an SCNP treatment group. Together, these studies indicated not only that analysis of the divergence between MSI and SUM waveforms is effective in measuring MSI, but also that treatment may be an effective mechanism to improve overall neural activity levels and RT to various sensory inputs in those who present with neck pain.
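The MSI-versus-SUM comparison mentioned in the abstract contrasts the ERP evoked by a combined audio-visual stimulus with the point-by-point sum of the unisensory ERPs. A minimal sketch of that additive-model computation follows; all waveform values are invented for illustration and are not data from the thesis:

```python
# Illustrative additive-model test for multisensory enhancement:
# compare the response to a combined audio-visual stimulus (MSI)
# against the sum of the unisensory responses (SUM = A + V).
# All numbers are made up for demonstration.

def divergence(msi, audio, visual):
    """Point-by-point difference between the multisensory ERP and the
    sum of the unisensory ERPs. Positive values indicate super-additive
    (enhanced) integration at that time point."""
    return [round(m - (a + v), 6) for m, a, v in zip(msi, audio, visual)]

audio  = [0.0, 0.2, 0.5, 0.3]   # hypothetical auditory-only ERP (µV)
visual = [0.0, 0.1, 0.4, 0.2]   # hypothetical visual-only ERP (µV)
msi    = [0.0, 0.4, 1.2, 0.6]   # hypothetical audio-visual ERP (µV)

print(divergence(msi, audio, visual))  # [0.0, 0.1, 0.3, 0.1] -> super-additive
```

In practice this comparison is made on averaged EEG epochs per condition, and a reduced MSI-minus-SUM divergence is taken as reduced multisensory enhancement.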
Advisors/Committee Members: Murphy, Bernadette, Yielder, Paul.
Subjects/Keywords: Multisensory Integration (MSI); Subclinical neck pain (SCNP); Electroencephalography (EEG); Event related potential (ERP)

Vanderbilt University
2.
Kwakye, Leslie Ellen Dowell.
Disruptions in the spatial filtering and temporal processing of low-level unisensory and multisensory stimuli in autism spectrum disorders.
Degree: PhD, Neuroscience, 2011, Vanderbilt University
URL: http://hdl.handle.net/1803/14908
Autism spectrum disorders (ASD) are characterized by disrupted social interactions, disordered language and communication, and restricted interests and repetitive behaviors. Although not part of the main diagnostic criteria, alterations in sensory responsiveness and perception are also prevalent in autistic individuals. Neurobiologically based theories of the mechanisms underlying autism, such as an increased excitation/inhibition ratio, minicolumnopathy, and a temporal binding deficit, suggest that several fundamental aspects of sensory processing may be disrupted in ASD. At the core of these ideas is the concept that neurons in the autistic brain may have a reduced capacity for filtering sensory information compared to neurons in a typically developing brain, and that timing-related aspects of sensory processing may be altered. To test these filtering- and temporal-based aspects of sensory processing in ASD, we employed a battery of tasks to contrast performance in ASD and typically developing (TD) individuals. We established that autism is associated with disruptions in sensory filtering by investigating differences in the critical bandwidth (the range of frequencies able to mask a pure-tone or Gabor target) in both the auditory and visual modalities. We also demonstrated that children with ASD show differences in auditory, but not visual, temporal acuity by measuring thresholds on visual and auditory temporal order judgment (TOJ) tasks. Finally, we demonstrated that autism is associated with disrupted multisensory temporal processing by determining the temporal window of multisensory integration (the duration within which integration of information from multiple modalities is likely) for three distinct multisensory tasks: the flash-beep illusion, the multisensory TOJ, and the detection of multisensory targets. In each of these tasks, children with ASD showed a dramatic enlargement of the multisensory temporal window, establishing disrupted multisensory temporal processing as an integral factor in the sensory abnormalities that characterize autism. These changes in filtering and temporal processing in individuals with ASD represent a unique way of looking at the nature of the sensory disturbances seen in autism, and may offer an important window into the etiology of autistic symptomatology and its relationship to the core disrupted domains, including communication and social interaction. Furthermore, these results are likely to provide important insights for the development of better interventional strategies for those living with autism.
Advisors/Committee Members: Daniel Polley (committee member), Micah Murray (committee member), Mark Wallace (committee member), Craig Kennedy (Committee Chair).
Subjects/Keywords: multisensory integration; autism

Georgia Tech
3.
Shadi, Kamal.
Inference of structural brain networks and modeling of cortical multi-sensory integration.
Degree: PhD, Computer Science, 2019, Georgia Tech
URL: http://hdl.handle.net/1853/62258
Recent advances in neuroimaging have enabled major progress in the field of brain connectomics, i.e., constructing maps of connections between brain regions at different scales. Diffusion MRI (dMRI) and probabilistic tractography algorithms are state-of-the-art methods for mapping the structural connectome of the brain non-invasively and in vivo. Although probabilistic tractography can detect many major connections in the brain, it also reports some spurious ones. We propose and evaluate a method, referred to as MANIA (Minimum Asymmetry Network Inference Algorithm), that can infer the structural brain network interconnecting a given set of regions of interest (ROIs) from probabilistic tractography data in a threshold-free manner. Given that diffusion MRI is unable to detect the direction of each connection, we formulate the network inference process as an optimization problem that minimizes the (appropriately normalized) asymmetry of the observed network. The most fundamental property of the human connectome, its density, is still elusive and debated. MANIA is well positioned to address this open question because it does not depend on an arbitrary weight threshold. We use MANIA to infer the human cortico-cortical connectome from the data published by the Human Connectome Project (HCP). MANIA reports connectomes that are highly consistent across individuals at a density of approximately 3.2%. We validate the accuracy of these connectomes by comparing the connections inferred using MANIA from 3T MRI acquisitions against 7T high-resolution MRI acquisitions of the same subjects. Having a structural network is instrumental in analyzing communication dynamics and information processing in the brain. The last research problem we focus on relates to multi-sensory integration in the cortex. We model this process on the mouse cortical connectome (provided by the Allen Institute) by employing an Asynchronous Linear Threshold (ALT) diffusion model on that connectome. The ALT model captures how evoked activity that originates at a primary sensory region of the cortex “ripples through” other cortical regions. We validate the ALT model using voltage-sensitive dye (VSD) imaging data. Our results show that a small number of cortical regions (including the claustrum) integrate almost all sensory information streams, suggesting that the cortex uses an hourglass architecture to integrate and compress multi-sensory information.
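The asymmetry-minimization idea behind MANIA can be illustrated with a toy sketch: since dMRI cannot detect direction, an edge seen A→B but not B→A is suspect, so among candidate thresholds we prefer the one whose binarized network is most symmetric. This is only a didactic caricature under invented data, not the published algorithm (which also normalizes the asymmetry appropriately):

```python
# Toy illustration of threshold-free inference by asymmetry minimization,
# in the spirit of MANIA (not the published algorithm). W[i][j] is a
# hypothetical tractography connection score from ROI i to ROI j.

def asymmetry(adj):
    """Fraction of detected ROI pairs present in only one direction."""
    n = len(adj)
    one_way = sum(1 for i in range(n) for j in range(n)
                  if i != j and adj[i][j] != adj[j][i])
    total = sum(1 for i in range(n) for j in range(n)
                if i != j and (adj[i][j] or adj[j][i]))
    return one_way / total if total else 0.0

def infer(W, thresholds):
    """Binarize W at each candidate threshold; keep the network
    with minimal asymmetry (ties broken by first threshold tried)."""
    best = None
    for t in thresholds:
        adj = [[1 if w >= t else 0 for w in row] for row in W]
        for i in range(len(adj)):   # ignore self-connections
            adj[i][i] = 0
        score = asymmetry(adj)
        if best is None or score < best[0]:
            best = (score, adj)
    return best[1]

W = [[0.0, 0.9, 0.1],
     [0.8, 0.0, 0.0],
     [0.2, 0.1, 0.0]]
net = infer(W, [0.05, 0.1, 0.2, 0.5])   # keeps only the reciprocal 0<->1 edge
```

The interesting property the abstract highlights is that no single arbitrary weight cutoff has to be chosen in advance; the symmetry objective selects it.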
Advisors/Committee Members: Dovrolis, Constantine (committee member), Zegura, Ellen (committee member), Keilholz, Shella (committee member), Dyer, Eva (committee member), Kira, Zsolt (committee member).
Subjects/Keywords: Anatomical brain connectome; Multisensory integration

McMaster University
4.
Wong, Nadia P.
HEAR THIS, READ THAT; AUDIOVISUAL INTEGRATION EFFECTS ON RECOGNITION MEMORY.
Degree: MSc, 2014, McMaster University
URL: http://hdl.handle.net/11375/16513
Our experience with the world depends on how we integrate sensory information. Multisensory integration generates contextually rich experiences, which are more distinct and more easily retrievable than their unisensory counterparts. Here, we report a series of experiments examining the impact that semantic audiovisual (AV) congruency has on recognition memory. Participants were presented with AV word pairs that could be either the same or different (e.g., hear “ring”, see “phone”), followed by a recognition test. Recognition memory was found to be improved for words following incongruent presentations. The results suggest that higher cognitive processes may be recruited to resolve sensory conflicts, leading to superior recognition for incongruent words. Integration may ease the processing of multisensory events, but it does not promote the processing needed to make them distinctive.
Thesis
Master of Science (MSc)
Advisors/Committee Members: Shore, David I., Psychology.
Subjects/Keywords: Multisensory Integration; Audiovisual; Recognition Memory

University of Guelph
5.
Cloke, Jacob.
Multisensory Integration Impairment in Rodent Models of Schizophrenia: Converging Evidence for Remediation by Nicotinic Receptor Stimulation of the GABAergic System.
Degree: PhD, Department of Psychology, 2016, University of Guelph
URL: https://atrium.lib.uoguelph.ca/xmlui/handle/10214/9915
This thesis investigated the selective multisensory integration impairment in ketamine-treated rats, a rodent model of schizophrenia. Previous findings have shown that ketamine-treated rats are impaired on a crossmodal object recognition (CMOR) task, but not on similar tests that do not require multisensory binding. In the current thesis, this effect was replicated and extended using a novel multisensory object oddity (MSO) task, which directly assesses multisensory binding in the absence of significant memory demands; ketamine-treated rats were impaired on tactile-visual and olfactory-visual MSO, but not on similar unimodal oddity tasks. Systemic administration of nicotine reversed the CMOR and MSO impairments in ketamine-treated rats. This effect appears to be mediated by α4β2 nicotinic acetylcholine receptors (nAChRs), as the selective α4β2 agonist ABT-418, but not the selective α7 nAChR agonist GTS-21, restored CMOR and MSO performance in ketamine-treated rats. The involvement of nAChRs in remediating this cognitive impairment was explored in the orbitofrontal (OFC) and medial prefrontal (mPFC) cortices. Intra-OFC, but not intra-mPFC, ABT-418 restored multisensory cognition in ketamine-treated rats. Prompted by past research indicating dysfunctional GABAergic transmission in schizophrenia and in ketamine-treated rats, we hypothesized that decreased OFC GABAergic function disrupts multisensory cognition and that OFC nAChRs restore performance in ketamine-treated rats by enhancing GABA release. Accordingly, the GABA-A antagonist bicuculline blocked the remediating effect of intra-OFC ABT-418 on CMOR and MSO performance. Additionally, whole-cell electrophysiology revealed disrupted GABAergic transmission in the OFC of ketamine-treated rats, and activation of α4β2 nAChRs restored GABAergic function. Moreover, immunohistochemical analyses indicated that GABAergic parvalbumin-interneuron (PV-IN) expression was decreased in the OFC of ketamine-treated rats. Thus, DREADDs (Designer Receptors Exclusively Activated by Designer Drugs) were used to silence OFC PV-INs in PV-Cre mice, revealing a selective MSO deficit that could be reversed by ABT-418. This thesis therefore proposes that dysfunction of OFC PV-INs in ketamine-treated rats causes their multisensory impairment, and that activation of α4β2 nAChRs on PV-INs restores essential inhibitory transmission, reversing the multisensory deficit. Given the parallels between schizophrenia and the current rodent model, pharmacological therapies targeting the GABAergic system and/or GABAergic-nicotinic interactions hold promise for treating specific cognitive symptoms in schizophrenia.
Advisors/Committee Members: Winters, Boyer (advisor).
Subjects/Keywords: Schizophrenia; Multisensory Integration; Cognition

Georgia Tech
6.
May, Keenan Russell.
Impact of Action-Object Congruency on the Integration of Auditory and Visual Stimuli in Extended Reality.
Degree: PhD, Psychology, 2020, Georgia Tech
URL: http://hdl.handle.net/1853/62783
Extended reality (XR) systems are currently of interest to both academic and commercial communities. XR systems may involve interacting with many objects in three-dimensional space. The usability of such systems could be improved by playing sounds that are perceptually integrated with visual representations of objects. In the multisensory integration process, humans take into account various types of crossmodal congruency to determine whether auditory and visual stimuli should be bound into unified percepts. In XR environments, spatial and temporal congruency may be unreliable. As such, the present research expands on associative congruency, which refers to content-congruency effects acquired via perceptual learning through exposure to co-occurring stimuli or features. A new type of associative congruency is proposed, called action-object congruency. Research in ecological sound perception has identified a number of features of objects and actions that humans can discern from the sounds produced by sound-producing events. Since humans can infer such information through sound, this information should also inform the integration of auditory and visual stimuli. When perceiving a realistic depiction of a sound-producing event such as a strike, scrape, or rub, integration should be more likely to occur if a concurrently presented sound is congruent with the objects and action that are seen. These effects should occur even if the visual objects and the sound are novel and unrecognizable, as long as the relevant features can be ascertained both visually and via sound. To evaluate this, the temporal and spatial ventriloquism illusions were used to assess the impact of action congruency and object congruency on multisensory integration. Visual depictions of interacting objects were displayed in virtual reality, and congruent or incongruent sounds were played over speakers. In two types of trials, participants either localized the sounds by pointing or judged whether the sounds and visual events were simultaneous. Action-object-congruent visual and auditory pairings led to greater localization biasing and higher rates of perceived simultaneity, reflecting stronger integration of stimuli. Action and object congruency were both impactful, but action congruency had the larger effect. The effects of action and object congruency were additive, supporting the linear-summation model of combining congruency types. These results suggest that action-object congruency can be used both to better understand how humans perform multisensory integration and to improve MSI in future XR environments.
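The additive pattern reported above (action and object congruency combining linearly, with action congruency weighted more heavily) can be written as a toy linear model. The weights below are invented for illustration; the abstract only reports that the effects are additive and that action congruency is larger:

```python
# Toy linear-summation model of congruency effects on integration
# strength. Weights are hypothetical, chosen only to reflect the
# reported ordering (action effect > object effect).

def integration_strength(action_congruent, object_congruent,
                         base=0.2, w_action=0.4, w_object=0.25):
    """Predicted tendency to bind the audio and visual stimuli.
    Inputs are 1 (congruent) or 0 (incongruent)."""
    return base + w_action * action_congruent + w_object * object_congruent

# Additivity: the combined benefit over the fully incongruent baseline
# equals the sum of the individual action and object benefits.
neither = integration_strength(0, 0)
both    = integration_strength(1, 1)
action  = integration_strength(1, 0)
obj     = integration_strength(0, 1)
assert abs((both - neither) - ((action - neither) + (obj - neither))) < 1e-9
```

Any interaction between the two congruency types would show up as a violation of this equality, which is how a linear-summation account is typically distinguished from a super- or sub-additive one.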
Advisors/Committee Members: Walker, Bruce (advisor), Gorman, Jamie (advisor), Catrambone, Richard (advisor), Brown, Thackery (advisor), Gandy, Maribeth (advisor).
Subjects/Keywords: perception; extended reality; multisensory integration

University of Toronto
7.
Bak, Katherine Joanne.
Audio-visual Temporal Judgments and the Schutz-Lipscomb Illusion in Younger and Older Adults.
Degree: 2020, University of Toronto
URL: http://hdl.handle.net/1807/103276
When objects collide, they produce an auditory and a visual event. Studying impact events can expand what is known about audio-visual integration. The present study used the Schutz-Lipscomb illusion to examine how an audio-visual impact event is perceived by younger and older adults, and whether temporal and spatial offsets of the audio and visual stimuli influence judgments. Twenty-one older and twenty-one younger adults completed tone-duration and temporal order judgments (TOJs) for audio-visual stimuli in which the spatial and temporal offsets were manipulated. Results demonstrated that the strength of the illusion, based on tone duration, did not differ between groups and was not influenced by spatial or temporal offsets. Older adults showed an “auditory bias” when making TOJs. TOJ precision and illusion strength were not correlated. The current findings extend beyond traditionally used audio-visual paradigms and help provide a further understanding of age-related differences in audio-visual integration.
M.A.
Advisors/Committee Members: Campos, Jennifer L, Psychology.
Subjects/Keywords: Aging; Integration; Multisensory; 0621

University of Toronto
8.
Diaconescu, Andreea.
The Co-occurrence of Multisensory Facilitation and Competition in the Human Brain and its Impact on Aging.
Degree: 2011, University of Toronto
URL: http://hdl.handle.net/1807/29703
Perceptual objects often comprise a visual and an auditory signature, which arrive simultaneously through distinct sensory channels, and multisensory features are linked by virtue of being attributed to a specific object. The binding of familiar auditory and visual signatures can be referred to as semantic audiovisual (AV) integration because it involves higher-level representations of naturalistic multisensory objects. While integration of semantically related multisensory features is behaviorally advantageous, multisensory competition, in which one modality dominates at the expense of another, impairs performance. Multisensory facilitation and competition effects on performance are exacerbated with age: older adults show a significantly larger performance gain from bimodal presentations compared to unimodal ones. In the present thesis project, magnetoencephalography (MEG) recordings of semantically related bimodal and unimodal stimuli captured the spatiotemporal patterns underlying both multisensory facilitation and competition in young and older adults. We first demonstrate that multisensory processes unfold in multiple stages: first, posterior parietal neurons respond preferentially to bimodal stimuli; second, regions in superior temporal and posterior cingulate cortices detect the semantic category of the stimuli; and finally, at later processing stages, orbitofrontal regions process crossmodal conflicts when complex sounds and pictures are semantically incongruent. Older adults, in contrast to young adults, are more efficient at integrating semantically congruent multisensory information across auditory and visual channels. Moreover, in these multisensory facilitation conditions, increased neural activity in medial fronto-parietal brain regions predicts faster motor performance in response to bimodal stimuli in older compared to younger adults. Finally, by examining the variability of the MEG signal, we also show that an increase in local entropy with age is behaviourally adaptive in the older group, as it significantly correlates with more stable and more accurate performance in older compared to young adults.
PhD
Advisors/Committee Members: McIntosh, Anthony Randal, Psychology.
Subjects/Keywords: multisensory integration; aging; 0989; 0633

University of Waterloo
9.
Tugac, Naime.
The Contribution of Visual & Somatosensory Input to Target Localization During the Performance of a Precision Grasping & Placement Task.
Degree: 2017, University of Waterloo
URL: http://hdl.handle.net/10012/11285
▼ Objective: Binocular vision provides the most accurate and precise depth information; however, many people have impairments in binocular visual function. It is currently unknown whether depth information from another modality can improve depth perception during action planning and execution. Therefore, the goal of this thesis was to assess whether somatosensory input improves target localization during the performance of a precision placement task. It was hypothesized that somatosensory input regarding target location would improve task performance.
Methods: Thirty visually normal participants performed a bead-threading task with their right hand during binocular and monocular viewing. Upper-limb kinematics and eye movements were recorded using the Optotrak and EyeLink 2 while participants picked up the beads and placed them on a vertical needle. In study 1, somatosensory and visual feedback provided input about needle location (i.e., participants could see their left hand holding the needle). In study 2, only somatosensory feedback was provided (i.e., view of the left hand holding the needle was blocked, and practice trials were standardized). The main outcome variables examined were placement time, peak acceleration, and mean position and variability of the limb along the trajectory. A repeated-measures analysis of variance with two factors, Viewing Condition (binocular/left-eye monocular/right-eye monocular) and Modality (vision/somatosensory), was used to test the hypothesis.
Results: Results from study 1 were in accordance with our hypothesis, showing a significant interaction between viewing condition and modality for placement time (p=0.0222). Specifically, when somatosensory feedback was provided, placement time was >150 ms shorter in both monocular viewing conditions compared to the vision-only condition. In contrast, somatosensory feedback did not significantly affect placement time during binocular viewing. There was no evidence that motor planning improved when somatosensory input about end-target location was provided. Limb trajectory deviated toward needle location along azimuth at various kinematic markers during movement execution when somatosensory feedback was provided. Results from study 2 showed a main effect of modality for placement time (p=0.0288); however, the interaction between modality and vision was not significant. The results also showed that somatosensory input was associated with faster movement times and higher peak accelerations. As in study 1, limb trajectory deviated toward needle location at various kinematic markers during movement execution when somatosensory feedback was provided.
Conclusions: This study demonstrated that information from another modality can improve planning and execution of reaching movements under certain conditions. It may be that somatosensory input is less effective when practice is not administered. It is important to note that despite the improved performance when somatosensory input was…
Subjects/Keywords: Binocular Vision; Multisensory Integration; Prehension
10.
O'Sullivan, Aisling.
The impact of visual speech on neural processing of auditory speech.
Degree: School of Engineering. Discipline of Electronic & Elect. Engineering, 2021, Trinity College Dublin
URL: http://hdl.handle.net/2262/94842
▼ When we listen to someone speak, seeing their face can help us to understand them better, especially when there is background noise or other people speaking at the same time. Research examining the neural processes underlying this benefit has centered on isolated syllables and words presented with multiple repetitions. While these approaches have provided important insights, they fail to capture the rapid dynamics of natural speech. In this thesis, we use natural speech together with electroencephalography (EEG) recordings from humans, employing recently developed analysis techniques for studying speech processing in more natural settings (Crosse et al., 2016; de Cheveigne et al., 2018), to investigate the impact of visual speech on auditory speech processing.
Our first study shows that attention to visual speech impacts auditory speech tracking, an effect thought to be driven by enhanced visual cortical processing as well as multisensory interactions when the visual speech matches the attended auditory speech. We also investigated how visual speech impacts auditory spectrogram and phonetic processing by quantifying the strength of the encoding of those features in the EEG using canonical correlation analysis. We found multisensory interactions at both stages of processing, and these interactions were more pronounced at the level of phonetic processing for speech in noise relative to speech in quiet, indicating that listeners rely more on articulatory details from visual speech in challenging listening conditions. The final study in this thesis tested the effect of blurring the mouth while retaining its overall dynamics. Phonetic encoding was reduced when the mouth was blurred compared with when it was clear, whereas spectrogram encoding was unaffected by the blurring, suggesting that mouth details provide visual phonetic information that helps to improve understanding of the speech.
Together, the findings from these studies support the notion that the integration of audio and visual speech is a flexible, multistage process that adapts to optimize comprehension based on the current listening conditions and the available visual information.
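The encoding analyses described above quantify how strongly speech features are represented in the EEG. The thesis uses canonical correlation analysis; as a simpler, illustrative stand-in, the sketch below fits a linear temporal response function (TRF) by ridge regression to synthetic data. All signals and parameter values here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: a 1-D speech feature (e.g. the acoustic envelope) and an
# "EEG" channel that encodes it through a short temporal response function.
n, lags = 2000, 10
stim = rng.standard_normal(n)
true_trf = np.hanning(lags)
eeg = np.convolve(stim, true_trf)[:n] + 0.5 * rng.standard_normal(n)

# Lagged design matrix: column k holds the stimulus delayed by k samples.
X = np.column_stack([np.roll(stim, k) for k in range(lags)])
X[:lags] = 0.0  # discard samples contaminated by the circular shift

# Ridge-regression estimate of the TRF (regularization deliberately untuned).
lam = 1.0
trf_hat = np.linalg.solve(X.T @ X + lam * np.eye(lags), X.T @ eeg)

# Encoding strength: correlation between predicted and recorded EEG.
r = np.corrcoef(X @ trf_hat, eeg)[0, 1]
```

In real analyses the model is fit and evaluated on separate data folds; a single in-sample correlation suffices here to illustrate the idea.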
Advisors/Committee Members: Reilly, Richard.
Subjects/Keywords: speech; multisensory integration; EEG

University of Toronto
11.
Richards, Michael David.
Audiovisual Processing and Integration in Amblyopia.
Degree: PhD, 2018, University of Toronto
URL: http://hdl.handle.net/1807/82956
▼ Amblyopia is a developmental visual disorder caused by abnormal visual experience during early life. Accumulating evidence points to perceptual deficits in amblyopia beyond vision, in the realm of audiovisual multisensory perception. This thesis presents a systematic psychophysical investigation of audiovisual processing and integration in adults with unilateral amblyopia. Study I examines audiovisual spatial integration and reveals amblyopic deficits in localization precision for unisensory visual and auditory stimuli, but statistically optimal integration according to the maximum likelihood estimation model of multisensory integration. Study II confirms the novel deficit in sound localization described in Study I, and reveals a non-uniform spatial pattern of sound localization deficits that implicates the superior colliculus as a primary neural locus affected by abnormal visual experience. Study III measures audiovisual simultaneity perception, and shows that asynchronous audiovisual pairs are perceived as synchronous over wider temporal intervals than normal, regardless of which eye is viewing. Study IV examines audiovisual temporal integration using the temporal ventriloquism effect, and reveals successful temporal integration in amblyopia, but possibly over a wider interval of audiovisual asynchrony. In sum, the findings suggest that the capacities for spatial and temporal audiovisual integration are intact in amblyopia, but that non-integrative multisensory processes, including cross-modal temporal matching and cross-sensory calibration of sound localization, are impaired.
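The maximum likelihood estimation (MLE) model referenced in Study I predicts that cues are weighted by their reliability (inverse variance) and that the combined estimate is more precise than either cue alone. A minimal sketch of the standard model, with hypothetical numbers:

```python
import numpy as np

def mle_combine(est_v, sigma_v, est_a, sigma_a):
    """Reliability-weighted combination of a visual and an auditory
    location estimate under the standard MLE cue-combination model."""
    w_v = sigma_a**2 / (sigma_v**2 + sigma_a**2)  # weight on the visual cue
    combined = w_v * est_v + (1.0 - w_v) * est_a
    # Predicted precision of the combined estimate (never worse than
    # the better single cue).
    sigma_av = np.sqrt(sigma_v**2 * sigma_a**2 / (sigma_v**2 + sigma_a**2))
    return combined, sigma_av

# Hypothetical cues: precise vision (sigma 1.0) vs. noisier audition (2.0).
loc, sigma_av = mle_combine(est_v=0.0, sigma_v=1.0, est_a=4.0, sigma_a=2.0)
```

With these numbers the visual cue gets weight 0.8, so the combined estimate lands much closer to the visual one, and the predicted combined sigma (about 0.89) beats the visual sigma of 1.0. Testing whether observers actually achieve this predicted precision is how "statistically optimal integration" is assessed.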
Advisors/Committee Members: Wong, Agnes M. F., Goltz, Herbert C., Medical Science.
Subjects/Keywords: Amblyopia; Audiovisual integration; Audiovisual processing; Multisensory integration; Multisensory processing; Psychophysics; 0317

Vanderbilt University
12.
Kurela, LeAnne Renee.
Insights into the influences of sensory experience and serotonin on multisensory processing in the superior colliculus.
Degree: PhD, Neuroscience, 2017, Vanderbilt University
URL: http://hdl.handle.net/1803/11170
▼ The ability to integrate information across the senses is vital for coherent perception of and interaction with the surrounding world. In the mammalian brain, the superior colliculus (SC) is critical for this multisensory processing to occur. Much is known regarding the organization and function of neurons within the SC, including the necessity of normal sensory experience for proper development of these neurons, the influence of modulatory neurotransmitter systems, and how these specific neurons are involved in multisensory integrative processing. However, open questions remain regarding how neurotransmitter systems and developmental parameters are involved in multisensory processing in adulthood. Previous work has shown that sensory experience throughout development is essential for proper multisensory integrative capacity of SC neurons. Here, it is established that this requirement for normal sensory experience is maintained throughout the lifetime: perturbation of visual experience in adulthood also affects the multisensory integrative capacities of SC neurons. In addition, the studies detailed here sought to determine the role of the serotonergic (5-HT) system in multisensory processing within the SC. Through electrophysiological and pharmacological methods, a modulatory role of the 5-HT system was demonstrated, as alterations in serotonergic signaling within the SC affected the responsivity, receptive field characteristics, and integrative capacities of multisensory neurons. These studies further our knowledge of the mechanisms at work in the SC that produce and maintain proper multisensory processing capabilities.
Advisors/Committee Members: Mark Wallace (committee member), Alex Maier (committee member), Joseph Neimat (committee member), Jon Kaas (Committee Chair).
Subjects/Keywords: superior colliculus; multisensory; multisensory integration; serotonin; sensory experience

University of California – Riverside
13.
DeLoss, Denton.
Age-Related Differences in Audiovisual Multisensory Integration.
Degree: Psychology, 2015, University of California – Riverside
URL: http://www.escholarship.org/uc/item/3hp1n5q8
▼ Recent research has demonstrated that multisensory integration, once thought to be isolated to later stages of processing in polysensory areas, may play a significant part in nearly all sensory processing. A large volume of research has also found a wide array of perceptual and cognitive changes with age. Given these sensory declines, we would expect similar declines in multisensory integration. The present studies examined age-related changes in multisensory integration using the sound-induced flash illusion, in which a number of visual flashes is paired with a discrepant number of auditory beeps. The first experiment compared younger and older individuals on the illusion and found stronger multisensory integration in older individuals. The second experiment examined whether this increased integration could be due to decreased inhibitory control in older individuals, that is, a reduced ability to ignore the auditory beeps, by adding an unrelated task in the visual and auditory modalities. The added task influenced the strength of the illusion in both older and younger individuals, but this effect did not differ by age, indicating that attentional differences are not the cause of the increased integration in older individuals. The third experiment examined whether the strength of the visual and auditory stimuli influences the strength of integration and whether this differs for older and younger individuals. Weaker stimuli produced a stronger illusion, but no age-related differences were found for the sound-induced flash illusion.
However, older individuals showed a greater change in integration with decreased stimulus strength for the reverse illusion, in which participants report the number of beeps presented instead of the number of flashes. Lastly, the fourth experiment examined whether spatial disparity influences the illusion and whether this changes with age. Spatial disparity between the auditory and visual stimuli had little influence on the illusion, and no age-related differences were found.
Subjects/Keywords: Psychology; Aging; Aging; Audition; Multisensory Integration; Vision

McMaster University
14.
Zhou, Yichu.
The Temporal Window of Visuotactile Integration.
Degree: MSc, 2016, McMaster University
URL: http://hdl.handle.net/11375/20429
▼ The simultaneity judgment (SJ) and temporal order judgment (TOJ) tasks are the two most widely used methods for measuring the window of multisensory integration; however, there are indications that these two tasks involve different cognitive processes and therefore produce unrelated results. The present study measured observers' visuotactile window of integration using both tasks in order to examine whether SJs and TOJs produce consistent results for this particular pairing of modalities. Experiment 1 revealed no significant correlations between the SJ and TOJ tasks, indicating that they appear to measure distinct processes in visuotactile integration, and further showed that both sensory and decisional factors contribute to this difference. These findings were replicated in Experiment 2, which, along with Experiment 3, also showed that the limited reliability of the SJ and TOJ tasks may in part be responsible for the lack of agreement between them. A secondary result concerned the point of subjective simultaneity (PSS), which was tactile-leading across all three experiments, contradicting some of the previous literature on visuotactile integration. Manipulating the spatial distance between the visual and tactile stimuli (Experiment 2) and the certainty of stimulus location (Experiment 3) did not significantly change the location of the PSS.
Thesis
Master of Science (MSc)
Perception often involves more than one sensory modality at the same time; for example, touching an object usually produces sensory signals in both the visual and tactile modalities. Because the time needed to transmit and process sensory signals differs among the modalities, the brain tolerates a certain time difference between signals from various pairs of modalities that it will still treat as coming from one event. Two tasks commonly used to measure these allowable time differences are the simultaneity judgment (SJ) and temporal order judgment (TOJ) tasks. Although they are usually used interchangeably, the present data show that the results from these tasks in the visuotactile pairing of modalities are unrelated, and a major contributing reason appears to be the limited reliability of the tasks themselves.
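Both the SJ and TOJ tasks yield a psychometric function from which the PSS is read off. The sketch below fits a cumulative Gaussian to hypothetical TOJ data by grid search (real analyses typically use maximum-likelihood fitting); a negative fitted PSS corresponds to the tactile-leading result described above. All numbers here are illustrative, not the thesis's data.

```python
import math

# Hypothetical TOJ data: proportion of "visual first" responses at each
# stimulus-onset asynchrony (ms; negative = tactile presented first).
soas = [-120, -80, -40, 0, 40, 80, 120]
p_vis_first = [0.05, 0.12, 0.30, 0.62, 0.85, 0.95, 0.98]

def cum_gauss(x, pss, sd):
    return 0.5 * (1.0 + math.erf((x - pss) / (sd * math.sqrt(2.0))))

def fit_pss_jnd(soas, props):
    """Least-squares grid search over the cumulative Gaussian's mean (the
    PSS) and SD (taken here as the JND, one common 84%-point convention)."""
    best_err, best_pss, best_sd = float("inf"), None, None
    for pss in range(-60, 61):        # candidate PSS values, 1 ms steps
        for sd in range(5, 151):      # candidate slopes
            err = sum((cum_gauss(x, pss, sd) - p) ** 2
                      for x, p in zip(soas, props))
            if err < best_err:
                best_err, best_pss, best_sd = err, pss, sd
    return best_pss, best_sd

pss, jnd = fit_pss_jnd(soas, p_vis_first)
```

With these data the fitted PSS is negative, meaning the tactile stimulus must lead for the pair to feel simultaneous — the tactile-leading pattern reported in the thesis.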
Advisors/Committee Members: Shore, David, Psychology.
Subjects/Keywords: Multisensory integration; Simultaneity judgment; Temporal order judgment

University of Toronto
15.
Loria, Tristan.
The Influence of Action-based Attention on Audiovisual Integration.
Degree: PhD, 2020, University of Toronto
URL: http://hdl.handle.net/1807/100972
▼ We inhabit a world that offers a multitude of sensory cues that must be disambiguated in order to perceive and interact with our surrounding environment. However, it is not clear how goal-directed actions reciprocally influence multisensory integration processes. The current dissertation examined how directing attention through visual gaze and to-be-reached locations influenced multisensory perception of visual and auditory cues. Participants performed pointing movements towards one of three potential targets while looking above or beside that target. At the onset of the movement, multisensory stimuli (i.e., one or two flashes combined with one or two beeps) could be presented at the target of the reaching action or in one of the non-target locations. Audiovisual perception was quantified by examining the influence of the auditory beeps on the number of perceived flashes (i.e., the audiovisual illusion). It was hypothesized that looking at and/or reaching to a target would modulate audiovisual perception. The results revealed that audiovisual perception (specifically the fusion illusion, though not the fission illusion) was enhanced at the target relative to the non-target locations. Audiovisual integration associated with the fusion illusion was further reduced with increasing eccentricity of non-target locations. As well, when gaze locations were decoupled from target locations, audiovisual perception was additively driven by gaze and reaching locations. Moreover, this modulation of audiovisual perception at target vs. non-target locations was observed only in the presence, not the absence, of a goal-directed movement. Overall, the results indicated that deploying attention by looking and reaching to a target location may enhance audiovisual perception at that location, emphasizing the critical importance of voluntary movements to the perception of our surrounding environment. The data also suggest that the fusion and fission illusions may arise from distinct integration mechanisms.
Advisors/Committee Members: Tremblay, Luc, Exercise Sciences.
Subjects/Keywords: attention; audiovisual; multisensory integration; upper-limb; 0575

University of Toronto
16.
Manson, Gerome.
The Role of Visuomotor Regulation Processes on Perceived Audiovisual Events.
Degree: 2013, University of Toronto
URL: http://hdl.handle.net/1807/43131
▼ Recent evidence suggests audiovisual perception changes as one engages in action. Specifically, if an audiovisual illusion composed of 2 flashes and 1 beep is presented during the high-velocity portion of upper-limb movements, the influence of the auditory stimuli is subdued. The goal of this thesis was to examine whether visuomotor regulation processes that rely on information obtained while the limb is traveling at high velocity could explain this perceptual modulation. In the present study, to control for engagement in visuomotor regulation processes, vision of the environment was manipulated. In conditions without vision of the environment, participants did not show the noted modulation of the audiovisual illusion. Also, analysis of the movement trajectories and endpoint precision revealed that movements without vision were less controlled than movements performed with vision. These results suggest that engagement in visuomotor regulation processes can influence perception of certain audiovisual events during goal-directed action.
MAST
Advisors/Committee Members: Tremblay, Luc, Exercise Sciences.
Subjects/Keywords: multisensory integration; perception; action; vision; 0384

University of Edinburgh
17.
Makovac, Elena.
Audio-visual interactions in manual and saccadic responses.
Degree: PhD, 2013, University of Edinburgh
URL: http://hdl.handle.net/1842/8040
Chapter 1 introduces the notions of multisensory integration (the binding of information coming from different modalities into a unitary percept) and multisensory response enhancement (the improvement of the response to multisensory stimuli, relative to the response to the most efficient unisensory stimulus), as well as the general goal of the present thesis, which is to investigate different aspects of the multisensory integration of auditory and visual stimuli in manual and saccadic responses. The subsequent chapters report experimental evidence of different factors affecting the multisensory response: spatial discrepancy, stimulus salience, congruency between cross-modal attributes, and the inhibitory influence of concurring distractors. Chapter 2 reports three experiments on the role of the superior colliculus (SC) in multisensory integration. In order to achieve this, the absence of S-cone input to the SC was exploited, following the method introduced by Sumner, Adamjee, and Mollon (2002). I found evidence that the spatial rule of multisensory integration (Meredith & Stein, 1983) applies only to SC-effective (luminance-channel) stimuli, and does not apply to SC-ineffective (S-cone) stimuli. The same results were obtained with an alternative method for the creation of S-cone stimuli: the tritanopic technique (Cavanagh, MacLeod, & Anstis, 1987; Stiles, 1959; Wald, 1966). In both cases significant multisensory response enhancements were obtained using a focused attention paradigm, in which the participants had to focus their attention on the visual modality and to inhibit responses to auditory stimuli. Chapter 3 reports two experiments showing the influence of shape congruency between auditory and visual stimuli on multisensory integration, i.e. the correspondence between structural aspects of visual and auditory stimuli (e.g., spiky shapes and “spiky” sounds). Detection of audio-visual events was faster for congruent than incongruent pairs, and this congruency effect also occurred in a focused attention task, where participants were required to respond only to visual targets and could ignore irrelevant auditory stimuli. This particular type of cross-modal congruency was evaluated in relation to the inverse effectiveness rule of multisensory integration (Meredith & Stein, 1983). In Chapter 4, the locus of the cross-modal shape congruency effect was evaluated by applying the race model analysis (Miller, 1982). The results showed that the violation of the model is stronger for some congruent pairings than for incongruent pairings. Evidence of multisensory depression was found for some pairs of incongruent stimuli. These data imply a perceptual locus for the cross-modal shape congruency effect. Moreover, it is evident that multisensoriality does not always induce an enhancement, and in some cases, when the attributes of the stimuli are particularly incompatible, a unisensory response may be more effective than the multisensory one. Chapter 5 reports experiments centred on saccadic generation mechanisms…
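The two measures at the heart of this record, multisensory response enhancement and the race model inequality (Miller, 1982), can be sketched numerically. The response-time values below are invented for illustration only, not data from the thesis:

```python
import numpy as np

# Hypothetical response times (ms); illustrative values only.
rt_v = np.array([320.0, 340.0, 310.0, 330.0, 350.0])   # visual-only
rt_a = np.array([300.0, 310.0, 305.0, 315.0, 320.0])   # auditory-only
rt_av = np.array([270.0, 280.0, 275.0, 285.0, 290.0])  # audiovisual

# Multisensory response enhancement: percentage speed-up relative to the
# most efficient (fastest) unisensory condition.
best_uni = min(rt_v.mean(), rt_a.mean())
mre = 100.0 * (best_uni - rt_av.mean()) / best_uni

# Race model inequality: if a race between independent unisensory channels
# explained the speed-up, the multisensory CDF could never exceed the sum
# of the unisensory CDFs; any exceedance is a violation.
def ecdf(samples: np.ndarray, t: float) -> float:
    return float(np.mean(samples <= t))

t_grid = np.arange(250.0, 360.0, 5.0)
violation = any(ecdf(rt_av, t) > ecdf(rt_v, t) + ecdf(rt_a, t) for t in t_grid)
```

With these fabricated numbers the enhancement comes out near 10% and the multisensory distribution violates the race model, the pattern usually taken as evidence of genuine integration rather than mere statistical facilitation.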
Subjects/Keywords: multisensory integration; cross-modal congruency; saccadic inhibition
APA (6th Edition):
Makovac, E. (2013). Audio-visual interactions in manual and saccadic responses. (Doctoral Dissertation). University of Edinburgh. Retrieved from http://hdl.handle.net/1842/8040

University of Edinburgh
18.
Hunter, Edyta Monika.
Multisensory integration of social information in adult aging.
Degree: PhD, 2011, University of Edinburgh
URL: http://hdl.handle.net/1842/8742
Efficient navigation of our social world depends on the generation, interpretation and combination of social signals within different sensory systems. However, the influence of adult aging on cross-modal integration of emotional stimuli remains poorly understood. Therefore, the aim of this PhD thesis is to understand the integration of visual and auditory cues in social situations and how this is associated with other factors important for successful social interaction such as recognising emotions or understanding the mental states of others. A series of eight experiments were designed to compare the performance of younger and older adults on tasks related to multisensory integration and social cognition. Results suggest that older adults are significantly less accurate at correctly identifying emotions from one modality (faces or voices alone) but perform as well as younger adults on tasks where congruent auditory and visual emotional information are presented concurrently. Therefore, older adults appear to benefit from congruent multisensory information. In contrast, older adults are poorer than younger adults at detecting incongruency from different sensory modalities involved in decoding cues to deception, sarcasm or masking of emotions. It was also found that age differences in the processing of relevant and irrelevant visual and auditory social information might be related to changes in gaze behaviour. A further study demonstrated that the changes in behaviour and social interaction often reported in patients post-stroke might relate to problems in integrating the cross-modal social information. The pattern of findings is discussed in relation to social, emotional, neuropsychological and cognitive theories.
Subjects/Keywords: 305.26; Aging; Emotion perception; Multisensory integration
APA (6th Edition):
Hunter, E. M. (2011). Multisensory integration of social information in adult aging. (Doctoral Dissertation). University of Edinburgh. Retrieved from http://hdl.handle.net/1842/8742

Virginia Tech
19.
Bruce, Madeleine D.
Multisensory Integration in Social and Nonsocial Events and Emerging Language in Toddlers.
Degree: M. S., Developmental Science, 2019, Virginia Tech
URL: http://hdl.handle.net/10919/96597
Multisensory integration allows children to make sense of information received across their senses. Previous research has shown that events containing simultaneous and overlapping sensory information aid children in learning about objects. However, research has yet to evaluate whether children's multisensory integration abilities are related to language learning. Thus, this study's first goal was to examine whether toddlers are equally skilled at integrating multisensory information in social and nonsocial contexts, and whether multisensory integration skills are related to toddlers' language skills. This study's second goal was to examine whether parenting behaviors and/or familial access to resources (i.e., socioeconomic status) play a role in the hypothesized relationship between multisensory integration and language in toddlerhood. Results indicated that toddlers show better multisensory integration abilities when viewing social as opposed to nonsocial sensory information, and that social multisensory integration skills were significantly related to their language skills. Also, maternal parenting behaviors, but not socioeconomic status, were significantly related to toddlers' language abilities. These findings suggest that at 24 months of age, both sensitive maternal parenting and the ability to integrate social multisensory information are important to the development of language in toddlerhood.
Advisors/Committee Members: Panneton, Robin K. (committeechair), Bell, Martha Ann (committee member), Axsom, Danny K. (committee member).
Subjects/Keywords: multisensory integration; language; maternal sensitivity; toddlerhood
APA (6th Edition):
Bruce, M. D. (2019). Multisensory Integration in Social and Nonsocial Events and Emerging Language in Toddlers. (Masters Thesis). Virginia Tech. Retrieved from http://hdl.handle.net/10919/96597

University of Toronto
20.
Ramkhalawansingh, Robert Charles.
Age-related changes in multisensory self-motion perception.
Degree: PhD, 2018, University of Toronto
URL: http://hdl.handle.net/1807/82948
To derive the precise estimates of self-motion necessary to perform mobility-related tasks like walking and driving, humans integrate information about their movement from across their sensory systems (e.g. visual, auditory, proprioceptive, vestibular). However, recent evidence suggests that the way in which multiple sensory inputs are integrated by the adult brain changes with age. The objective of this thesis was to consider, for the first time, whether age-related changes in multisensory integration are observed in the context of self-motion perception. Two research approaches were used. First, I used a simple, simulated driving task to provide visual cues to self-motion and to manipulate the availability of auditory and/or vestibular cues to self-motion (i.e., unisensory versus multisensory conditions). The results revealed that relative to younger adults, older adults generally demonstrate greater differences in performance between multisensory and unisensory conditions. However, the driving task could not disentangle the effects of age-related differences in real-world driving experience from age-related differences in sensory integrative mechanisms. Second, I used an established and highly controlled psychophysical heading perception task to evaluate whether, like younger adults, older adults integrate visual and vestibular cues to self-motion in a statistically optimal fashion. I considered conditions where each of the two cues was presented alone, in combination and congruent, or in combination but indicating conflicting heading angles. Results showed that while older adults did demonstrate optimal integration during congruent conditions, they were comparatively less tolerant of spatial conflicts between the visual and vestibular inputs. Overall, these results may have important implications for the way that older adults perform mobility-related tasks under various perceptual and environmental conditions.
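"Statistically optimal" integration in studies like this usually means maximum-likelihood cue combination, where each cue is weighted by its reliability (inverse variance). A minimal sketch, with invented visual and vestibular noise values rather than figures from the thesis:

```python
import math

# Hypothetical single-cue noise levels (standard deviations, in degrees
# of heading); chosen only to illustrate the arithmetic.
sigma_vis = 4.0    # visual heading estimate
sigma_vest = 8.0   # vestibular heading estimate

# Reliability-proportional weights: the less noisy cue counts for more.
w_vis = sigma_vest**2 / (sigma_vis**2 + sigma_vest**2)
w_vest = 1.0 - w_vis

# Combined estimate for two slightly conflicting heading cues.
heading_vis, heading_vest = 10.0, 14.0  # degrees, hypothetical
heading_combined = w_vis * heading_vis + w_vest * heading_vest

# Optimal integration predicts combined noise below that of either cue
# alone; this is the benchmark such studies test observers against.
sigma_combined = math.sqrt(
    (sigma_vis**2 * sigma_vest**2) / (sigma_vis**2 + sigma_vest**2)
)
```

With these numbers the combined estimate (10.8°) sits closer to the more reliable visual cue, and the predicted combined noise (about 3.58°) beats both unisensory values; departures from these predictions are what mark sub-optimal integration.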
Advisors/Committee Members: Campos, Jennifer L, Psychology.
Subjects/Keywords: Aging; Integration; Motion; Multisensory; Optimal; Perception; 0621
APA (6th Edition):
Ramkhalawansingh, R. C. (2018). Age-related changes in multisensory self-motion perception. (Doctoral Dissertation). University of Toronto. Retrieved from http://hdl.handle.net/1807/82948

University of Ontario Institute of Technology
21.
McCracken, Heather.
Audiovisual multisensory integration in young adults with and without a diagnosis of Attention-Deficit/Hyperactivity Disorder.
Degree: 2018, University of Ontario Institute of Technology
URL: http://hdl.handle.net/10155/1058
Attention-Deficit/Hyperactivity Disorder (ADHD) is a neurodevelopmental disorder with behavioural and neurophysiological characteristics. Several cortical structures that are altered in ADHD are involved in the process of multisensory integration (MSI). MSI is a fundamental form of sensory processing involved in many everyday tasks. Therefore, it is important to know whether those with ADHD experience altered MSI. Two different paradigms were used to assess MSI in adults with a diagnosis of ADHD. First, a simple response time (RT) task was completed. Electroencephalography (EEG) analysis revealed that MSI occurred in those with ADHD, although there were significant differences in brain activity between groups. Study two employed a two-alternative forced-choice discrimination task. Those with ADHD responded faster than controls. EEG analysis revealed that those with ADHD have enhanced MSI. Activity differences were found in brain regions that are structurally altered in those with ADHD, indicating that structural alterations in ADHD may promote sensory processing.
Advisors/Committee Members: Yielder, Paul, Murphy, Bernadette.
Subjects/Keywords: Multisensory integration; ADHD; EEG; Response time; Audiovisual
APA (6th Edition):
McCracken, H. (2018). Audiovisual multisensory integration in young adults with and without a diagnosis of Attention-Deficit/Hyperactivity Disorder. (Thesis). University of Ontario Institute of Technology. Retrieved from http://hdl.handle.net/10155/1058
Note: this citation may be lacking information needed for this citation format:
Not specified: Masters Thesis or Doctoral Dissertation
22.
Brooks, Cassandra.
Integration of auditory and visual temporal rate in aging.
Degree: 2017, University of Melbourne
URL: http://hdl.handle.net/11343/192299
This thesis investigated how aging affects the integration of visual flicker (the temporal modulation of luminance) with auditory flutter (the temporal modulation of sound amplitude) to produce a unified audiovisual percept of temporal modulation rate. A group of younger and older adults judged the temporal rate of an auditory and/or visual stimulus oscillating at 10 Hz. Whichever sensory modality discriminates temporal rate more precisely contributes more to the audiovisual percept. Consequently, the first experiment explored how aging affected the precision of auditory temporal rate discrimination relative to vision. Auditory temporal rate discrimination in older adults was degraded by an age-related impairment in sensitivity to auditory amplitude modulation. In subsequent audiovisual experiments, auditory modulation depth was individually tailored to equate flutter and flicker temporal rate discrimination thresholds, normalising for this age-related sensory loss. When auditory and visual rates were conflicting, partial integration distorted perceived rate such that the auditory or visual rate subjectively equivalent to a reference was nonveridical. Distortions in perceived rate were unaffected by older age, indicating that the ability to integrate conflicting auditory and visual rates is preserved in aging. However, younger adults' heightened sensitivity to auditory amplitude modulation was sufficient to increase the influence of audition on perceived rate when the modulation depth of auditory flutter was the same as for the average older adult. Therefore, the age-related impairment in auditory rate discriminability is expected to increase visual influence on audiovisual rate perception in older adults. When auditory and visual rates were identical, temporal rate discrimination thresholds improved in line with statistically optimal integration in younger but not older adults. This indicates an age-related impairment in integration, which will be further compounded by the age-related decline in auditory temporal rate discriminability under natural conditions. These findings indicate that older adults will perceive audiovisual temporal rate differently to younger adults. These age-related changes in audiovisual rate perception will be the complex product of the age-related interaction between rate congruence and integration ability, and the age-related decline in auditory temporal rate discrimination.
Subjects/Keywords: vision; audition; temporal perception; multisensory integration; aging
APA (6th Edition):
Brooks, C. (2017). Integration of auditory and visual temporal rate in aging. (Masters Thesis). University of Melbourne. Retrieved from http://hdl.handle.net/11343/192299

University of Edinburgh
23.
Saunders, Ian.
Closed-loop prosthetic hand : understanding sensorimotor and multisensory integration under uncertainty.
Degree: PhD, 2012, University of Edinburgh
URL: http://hdl.handle.net/1842/9516
To make sense of our unpredictable world, humans use sensory information streaming through billions of peripheral neurons. Uncertainty and ambiguity plague each sensory stream, yet remarkably our perception of the world is seamless, robust and often optimal in the sense of minimising perceptual variability. Moreover, humans have a remarkable capacity for dexterous manipulation. Initiation of precise motor actions under uncertainty requires awareness not only of the statistics of our environment but also of the reliability of our sensory and motor apparatus. What happens when our sensory and motor systems are disrupted? Upper-limb amputees fitted with state-of-the-art prostheses must learn to both control and make sense of their robotic replacement limb. Tactile feedback is not a standard feature of these open-loop limbs, fundamentally limiting the degree of rehabilitation. This thesis introduces a modular closed-loop upper-limb prosthesis, a modified Touch Bionics i-limb hand with a custom-built linear vibrotactile feedback array. To understand the utility of the feedback system in the presence of multisensory and sensorimotor influences, three fundamental open questions were addressed: (i) What are the mechanisms by which subjects compute sensory uncertainty? (ii) Do subjects integrate an artificial modality with visual feedback as a function of sensory uncertainty? (iii) What are the influences of open-loop and closed-loop uncertainty on prosthesis control? To optimally handle uncertainty in the environment, people must acquire estimates of the mean and uncertainty of sensory cues over time. A novel visual tracking experiment was developed in order to explore the processes by which people acquire these statistical estimators. Subjects were required to simultaneously report their evolving estimate of the mean and uncertainty of visual stimuli over time. This revealed that subjects could accumulate noisy evidence over the course of a trial to form an optimal continuous estimate of the mean, hindered only by natural kinematic constraints. Although subjects had explicit access to a measure of their continuous objective uncertainty, acquired from sensory information available within a trial, this was limited by a conservative margin for error. In the Bayesian framework, sensory evidence (from multiple sensory cues) and prior beliefs (knowledge of the statistics of sensory cues) are combined to form a posterior estimate of the state of the world. Multiple studies have revealed that humans behave as optimal Bayesian observers when making binary decisions in forced-choice tasks. In this thesis these results were extended to a continuous spatial localisation task. Subjects could rapidly accumulate evidence presented via vibrotactile feedback (an artificial modality), and integrate it with visual feedback. The weight attributed to each sensory modality was chosen so as to minimise the overall objective uncertainty. Since subjects were able to combine multiple sources of sensory information with respect to their sensory…
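The continuous estimate of a mean and its uncertainty described above can be sketched as a sequential Gaussian (Kalman-style) update, in which each noisy sample pulls the estimate toward the evidence and shrinks the posterior variance. The samples and noise variance below are hypothetical, not values from the thesis:

```python
# Hypothetical noisy observations of a fixed cue position, with an
# assumed observation noise variance; all numbers are illustrative.
obs_noise_var = 4.0
samples = [9.5, 10.4, 10.1, 9.8, 10.2]

# Start from a near-flat prior: essentially no idea where the cue is.
mean, var = 0.0, 1e6
for x in samples:
    gain = var / (var + obs_noise_var)  # trust placed in the new sample
    mean += gain * (x - mean)           # pull the estimate toward it
    var *= (1.0 - gain)                 # uncertainty shrinks with evidence
```

After five samples the estimate settles near the sample mean (10.0) with posterior variance close to obs_noise_var / 5, i.e. the observer's reportable uncertainty falls as evidence accumulates within the trial.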
Subjects/Keywords: 612.8; prosthetic hand; uncertainty; sensorimotor; multisensory integration
APA (6th Edition):
Saunders, I. (2012). Closed-loop prosthetic hand : understanding sensorimotor and multisensory integration under uncertainty. (Doctoral Dissertation). University of Edinburgh. Retrieved from http://hdl.handle.net/1842/9516

Vanderbilt University
24.
Butera, Iliza M.
Audiovisual listening in cochlear implant users.
Degree: PhD, Neuroscience, 2019, Vanderbilt University
URL: http://hdl.handle.net/1803/14135
Cochlear implants (CIs), widely considered the most successful neuroprosthetic devices, afford over half a million users worldwide access to sound perception following severe-to-profound hearing loss. However, visual cues remain vitally important for many CI users to interpret the impoverished auditory information that implants convey. While auditory-only speech understanding is well characterized in clinical outcome measures, relatively little is known about audiovisual (AV) speech comprehension in this cohort. In total, we recruited 116 normal-hearing controls and 86 CI users (with 119 implanted ears) for a series of experiments that test: perception of the McGurk illusion and the sound-induced flash illusion (chapter 2), unisensory and multisensory listening in noise (chapter 4), speech-evoked cortical activity via near-infrared spectroscopy (chapters 3 and 4), and the underlying temporal processing for how auditory and visual speech and non-speech stimuli are integrated (chapter 5). The results of these studies suggest that CI users perceptually weight visual speech more highly than auditory speech, a strategy that we describe as "visuocentric listening." Given that clinical outcome measures for CI users are both highly variable and difficult to predict, an important factor in addressing this variability is to identify where in the auditory pathway it is introduced. This dissertation addresses central mechanisms of brain plasticity with a focus on multimodal sensory integration. The overall goal of this work is to better understand audiovisual integration and how it relates to the speech comprehension of CI users, particularly in ecological listening conditions that are naturally multisensory. This knowledge is essential for our understanding of proficiency with a CI and, most importantly, how users can best utilize all sensory information to enhance speech intelligibility and improve quality of life.
Advisors/Committee Members: Mark T. Wallace (committee member), G. Christopher Stecker (committee member), Daniel H. Ashmead (committee member), René H. Gifford (committee member), Troy A. Hackett (Committee Chair).
Subjects/Keywords: fNIRS; auditory neuroscience; cochlear implants; multisensory integration
APA (6th Edition):
Butera, I. M. (2019). Audiovisual listening in cochlear implant users. (Doctoral Dissertation). Vanderbilt University. Retrieved from http://hdl.handle.net/1803/14135

University of St. Andrews
25.
Mansouri Benssassi, Esma.
Bio-inspired multisensory integration of social signals.
Degree: 2020, University of St. Andrews
URL: http://hdl.handle.net/10023/20182
▼ Understanding emotions is a core aspect of human communication. Our social behaviours are closely linked to expressing our emotions and understanding others' emotional and mental states through social signals. Emotions are expressed in a multisensory manner: humans use social signals from different sensory modalities such as facial expression, vocal changes, or body language. The human brain integrates all relevant information to create a new multisensory percept and derives emotional meaning.
There is great interest in emotion recognition in fields such as HCI, gaming, marketing, and assistive technologies. This demand is driving an increase in research on multisensory emotion recognition. The majority of existing work proceeds by extracting meaningful features from each modality and applying fusion techniques at either the feature level or the decision level. However, these techniques are ineffective in translating the constant talk and feedback between different modalities. Such constant talk is particularly crucial in continuous emotion recognition, where one modality can predict, enhance and complete the other.
This thesis proposes novel architectures for multisensory emotion recognition inspired by multisensory integration in the brain. First, we explore the use of bio-inspired unsupervised learning for unisensory emotion recognition in the audio and visual modalities. Then we propose three multisensory integration models, based on different pathways for multisensory integration in the brain: integration by convergence, early cross-modal enhancement, and integration through neural synchrony. The proposed models are designed and implemented using third-generation neural networks, Spiking Neural Networks (SNNs), with unsupervised learning. The models are evaluated using widely adopted, third-party datasets and compared to state-of-the-art multimodal fusion techniques, such as early, late and deep learning fusion. Evaluation results show that the three proposed models achieve results comparable to state-of-the-art supervised learning techniques. More importantly, this thesis shows models that can translate a constant talk between modalities during the training phase: each modality can predict, complement and enhance the other using constant feedback. This cross-talk between modalities adds an insight into emotions compared to traditional fusion techniques.
Advisors/Committee Members: Ye, Juan (advisor).
Subjects/Keywords: Multisensory integration; Spiking neural networks; Emotions recognition
APA (6th Edition):
Mansouri Benssassi, E. (2020). Bio-inspired multisensory integration of social signals. (Thesis). University of St. Andrews. Retrieved from http://hdl.handle.net/10023/20182

University of Edinburgh
26.
Bates, Sarah Louise.
Multisensory integration of spatial cues in old age.
Degree: PhD, 2015, University of Edinburgh
URL: http://hdl.handle.net/1842/19530
▼ Spatial navigation is essential for everyday function. It is successfully achieved by combining internally generated information – such as vestibular and self-motion cues (known as path integration) – with external sources of information such as visual landmarks. These multiple sources and sensory domains are often associated with uncertainty and can provide conflicting information. The key to successful navigation is therefore how best to integrate information from these internal and external sources. Healthy younger adults do this in a statistically optimal fashion by considering the perceived reliability of a cue during integration, consistent with the rules of Bayesian integration. However, the precise impact of ageing on the component senses of path integration and the integration of such self-motion with external information is currently unclear. Given that impaired spatial ability is a common problem associated with ageing and is often a primary indicator of Alzheimer’s disease, this thesis asks whether age-related navigational impairments are related to fundamental deficits in the components of path integration and/or inadequate integration of spatial cues. Part 1 focussed on how ageing impacts the vestibular, kinaesthetic and visual components of path integration during linear navigation in the real world. Using path reproduction, distance estimation and depth perception tasks, I found that older adults showed no performance deficits in conditions that replicated those of everyday walking when visual and self-motion cues were present. However, they were impaired when relying on vestibular information alone. My results suggest that older adults are especially vulnerable to sensory deprivation but that weaker sensory domains can be compensated for by other sensory information, potentially by integrating different spatial cues in a Bayesian fashion, where the impact of unreliable/diminished senses can be minimised.
Part 2 developed the conclusions of Part 1 by testing younger and older adults’ integration of visual landmarks and self-motion information during a simple homing task. I investigated the hypothesis that the integration of spatial information from multiple sensory domains is driven by Bayesian principles and that old age may affect the efficiency and elasticity of reliability-driven integration. Younger and older participants navigated to a previously visited location using self-motion and/or visual information. In some trials there was a conflict of information, which revealed the relative influence of self-motion and visual landmarks on behaviour. Findings revealed that both younger and older adults integrated visual and self-motion information to improve accuracy and precision, but older adults did not place as much influence on visual information as would have been optimal. This may have been the result of increased noise in the underlying spatial representations of older adults. Furthermore, older adults did not effectively re-weight visual and self-motion cues in line with the changing…
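The reliability-weighted ("Bayesian optimal") cue combination that this abstract repeatedly invokes has a standard closed form: each cue is weighted by its inverse variance, and the fused estimate is more precise than either cue alone. A minimal sketch, with hypothetical cue means and variances not taken from the thesis:

```python
# Reliability-weighted (Bayesian) combination of two spatial cues.
# Each cue's weight is proportional to its inverse variance (its reliability),
# so the less noisy cue dominates the fused estimate.

def combine_cues(mu_a, var_a, mu_b, var_b):
    """Return the statistically optimal fused estimate and its variance."""
    r_a, r_b = 1.0 / var_a, 1.0 / var_b   # reliabilities (inverse variances)
    w_a = r_a / (r_a + r_b)               # normalized weight of cue A
    w_b = r_b / (r_a + r_b)               # normalized weight of cue B
    mu = w_a * mu_a + w_b * mu_b          # fused location estimate
    var = 1.0 / (r_a + r_b)               # fused variance < min(var_a, var_b)
    return mu, var

# Example: a precise visual-landmark cue vs. a noisy self-motion cue.
mu, var = combine_cues(mu_a=0.0, var_a=1.0, mu_b=4.0, var_b=4.0)
print(mu, var)  # → 0.8 0.8
```

The age-related deficit described above corresponds to using suboptimal weights, e.g. underweighting the visual cue relative to what its reliability would warrant.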
Subjects/Keywords: 612.6; spatial navigation; Bayesian integration; multisensory integration; ageing
APA (6th Edition):
Bates, S. L. (2015). Multisensory integration of spatial cues in old age. (Doctoral Dissertation). University of Edinburgh. Retrieved from http://hdl.handle.net/1842/19530

University of South Carolina
27.
Mac Adams, Spencer Lawrence.
Sound-Evoked Activations of Visual Cortex and the Principles of Multisensory Integration.
Degree: PhD, Psychology, 2019, University of South Carolina
URL: https://scholarcommons.sc.edu/etd/5605
▼ Multisensory integration (MSI) refers to the neural processes that integrate information from multiple different sensory systems and follows three established principles: the spatial, temporal and inverse effectiveness principles. Evidence now suggests that MSI can occur at the earliest stages of sensory processing in primary sensory cortices, including audiovisual integration in primary visual cortex; however, the mechanism responsible for audiovisual MSI enhancements remains elusive. Recently, unimodally presented sounds have been shown to activate visual cortex; however, no research has been conducted to evaluate whether these sound-evoked responses reflect the auditory contribution to audiovisual integration in primary visual cortex. Here we conducted a series of three studies in which we systematically evaluated whether sound-evoked responses in visual cortex operated in a manner consistent with the principles of MSI, by manipulating different auditory stimulus features while neural activity was recorded using electroencephalography (EEG). In the first study (Chapter 2), two experiments were conducted in which sound location was manipulated to allow us to evaluate if sound-evoked responses had the necessary spatial specificity to result in MSI in visual cortex. We observed a novel early-latency event-related potential (ERP) in primary visual cortex, the rapid occipital auditory-evoked response (ROAR), that satisfied both the spatial and temporal rules, and showed that the established late-latency sound-evoked response, the auditorily-evoked contralateral occipital positivity (ACOP), failed to meet the spatial and temporal principles. Chapters 3 and 4 manipulated sound intensity and frequency, respectively, to evaluate if the two observed sound-evoked responses operated in a manner consistent with the principle of inverse effectiveness. The ROAR displayed inverse effectiveness to sound intensity, but not to sound frequency, whereas the ACOP did not display inverse effectiveness to either sound intensity or sound frequency. Taken together, we believe our results indicate that the ACOP does not reflect a mechanism of audiovisual integration in visual cortex, while the ROAR satisfies all three integration principles and likely plays a causal role in audiovisual integration within primary visual cortex.
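The inverse effectiveness principle tested above is usually quantified with the classic multisensory enhancement index (Stein and Meredith): the percentage gain of the combined response over the strongest unisensory response. A minimal sketch with hypothetical response magnitudes, not data from the dissertation:

```python
# Multisensory enhancement index: percentage gain of the combined (audiovisual)
# response over the strongest unisensory response. Inverse effectiveness
# predicts larger proportional gains when unisensory responses are weak.

def enhancement(av_response, a_response, v_response):
    """Return percent multisensory enhancement over the best unisensory response."""
    best_unisensory = max(a_response, v_response)
    return 100.0 * (av_response - best_unisensory) / best_unisensory

# Weakly effective stimuli yield a large proportional gain...
print(enhancement(6.0, 2.0, 3.0))     # → 100.0
# ...while highly effective stimuli yield a small one.
print(enhancement(55.0, 40.0, 50.0))  # → 10.0
```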
Advisors/Committee Members: Jessica Green.
Subjects/Keywords: Psychology; visual; cortex; multi sensory; integration; Multisensory integration
APA (6th Edition):
Mac Adams, S. L. (2019). Sound-Evoked Activations of Visual Cortex and the Principles of Multisensory Integration. (Doctoral Dissertation). University of South Carolina. Retrieved from https://scholarcommons.sc.edu/etd/5605
28.
Felch, Daniel L.
Cross-Modal Interactions in the Optic Tectum of Xenopus laevis Tadpoles.
Degree: PhD, Neuroscience, 2015, Brown University
URL: https://repository.library.brown.edu/studio/item/bdr:419494/
▼ Early in the development of Xenopus laevis tadpoles, as the animals become capable of actively navigating their environment, individual neurons in the optic tectum become cross-modal. That is, they receive information both from the eye, via retinal ganglion cell axons, and also from mechanosensory nuclei in the hindbrain. At present it is unknown how, or even whether, these two modalities interact in these young tectal neurons and in the tectal circuit generally. To begin to address these questions, I here utilize an isolated-brain preparation to stimulate these afferent pathways and record, from single cells, either the excitatory and inhibitory synaptic inputs each receives, or the output that each generates upon activation with cross-modal stimulus combinations, as well as with uni-modal (within-modality) combinations. Additionally, to investigate how these relationships might change over a developmental epoch characterized by extensive experience-dependent plasticity, I collect data from two groups: stages 44–46 and stages 48–49. My results show that cross-modal stimuli do indeed interact in individual neurons of the developing tectum, such that the magnitude (i.e., total number) and onset latency of responses are both dependent on inter-stimulus interval. Furthermore, the data show a selective sensitivity of these responses for cross-modal combinations, which is developmentally regulated. Critically, although the pharmacological blockade of inhibition abolishes this differential integration of cross-modal and uni-modal combinations, no differences are seen between cross-modal and uni-modal effectiveness in the enhancement of synaptic inhibition, or excitation, at any stage of development. Additional experiments show a developmentally regulated increase in the extent of the recurrent, intra-tectal connections that are activated by cross-modal combinations, however. These results thus indicate a mechanism for cross-modal sensitivity that is more nuanced than a simple balance between excitation and inhibition, and illustrate important roles for synapse location and dendritic integration.
Advisors/Committee Members: Aizenman, Carlos (Director), Berson, David (Reader), Kauer, Julie (Reader), Connors, Barry (Reader), Chen, Chinfei (Reader).
Subjects/Keywords: Multisensory
APA (6th Edition):
Felch, D. L. (2015). Cross-Modal Interactions in the Optic Tectum of Xenopus laevis Tadpoles. (Doctoral Dissertation). Brown University. Retrieved from https://repository.library.brown.edu/studio/item/bdr:419494/

Universiteit Utrecht
29.
van der Stoep, N.
Into the depths of spatial attention and multisensory integration.
Degree: 2015, Universiteit Utrecht
URL: http://dspace.library.uu.nl:8080/handle/1874/322764
▼ During our daily lives our senses are flooded with information. We can see, hear, feel, smell, and taste all at the same time. Luckily, our brain helps us make sense of this abundant information by combining information from different senses. The simultaneous presentation of information to different senses often results in behavioral benefits, such as faster detection and localization, compared to when only a single sense is stimulated. Two processes through which such benefits can occur are crossmodal exogenous spatial attention and multisensory integration (MSI). These two processes are essential for spatial orienting and are central to the studies described in the current thesis. In the first part, studies investigating when and how crossmodal exogenous spatial attention and MSI contribute to multisensory improvements in spatial perception are reported. Among others, it was shown that MSI is the main cause of multisensory benefits when sound and light are presented in close spatial and temporal proximity. However, both MSI and crossmodal exogenous spatial attention contribute when the time interval between sound and light increases. At longer intervals MSI no longer contributes and spatial attention takes over. It was also shown that MSI is reduced when a multisensory stimulus is attended compared to when it is unattended.
Several findings from neurophysiological and neuropsychological studies of attention and multisensory perception have revealed that the brain processes information coming from near and far space differently. In the studies described in part 2 it was investigated how the behavioral benefits of crossmodal exogenous spatial attention and MSI change as a function of the location of sound and light in three-dimensional space. These studies revealed that crossmodal exogenous spatial attention is distance-specific: sounds presented in far space improve visual perception in far space, but not in near space, and the other way around. Further support for the idea that the attention system takes the distance of information into account comes from a study of spatial attention in stroke patients. It was observed that impairments in visuospatial attention could be distance-specific: patients could have attention impairments in near space without visuospatial attention impairments in far space, attention deficits in far space only, or deficits in both near and far space. In a different study, we observed that the benefits of multisensory integration for the detection and localization of sound and light are enhanced in far relative to near space. Not only do these studies add to the understanding of human multisensory perception, they also provide a foundation for the application of these findings to more practical, real-life situations. For example, in situations where responses should be as fast and as accurate as possible (e.g., while driving), it can be very helpful to present a multisensory warning signal to warn of an approaching threat. After all, it…
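The "faster detection" benefits this abstract describes are conventionally tested against Miller's race-model inequality: a redundant audiovisual signal may be fast simply because two independent unisensory "races" run in parallel, so only speedups exceeding P(T_A ≤ t) + P(T_V ≤ t) count as evidence of genuine integration. A minimal sketch with illustrative reaction times (in ms), not data from the thesis:

```python
# Miller's race-model inequality for redundant-signal reaction times (RTs).
# A positive violation at time t means the audiovisual RT distribution is
# faster than any race between independent unisensory detections could be,
# which is taken as behavioral evidence of multisensory integration.

def ecdf(sample, t):
    """Empirical probability that a reaction time is <= t."""
    return sum(rt <= t for rt in sample) / len(sample)

def race_model_violation(rt_av, rt_a, rt_v, t):
    """Return ecdf_AV(t) minus the race-model bound at time t."""
    bound = min(1.0, ecdf(rt_a, t) + ecdf(rt_v, t))
    return ecdf(rt_av, t) - bound

rt_a  = [320, 340, 360, 380, 400]   # auditory-only RTs (illustrative)
rt_v  = [330, 350, 370, 390, 410]   # visual-only RTs (illustrative)
rt_av = [260, 280, 300, 320, 340]   # audiovisual RTs, faster than either alone

print(race_model_violation(rt_av, rt_a, rt_v, t=300))  # → 0.6 (violation)
```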
Advisors/Committee Members: Postma, Albert, van der Stigchel, Stefan, Nijboer, Tanja.
Subjects/Keywords: Multisensory integration; spatial attention; crossmodal; space; depth; audition; vision
APA (6th Edition):
van der Stoep, N. (2015). Into the depths of spatial attention and multisensory integration. (Doctoral Dissertation). Universiteit Utrecht. Retrieved from http://dspace.library.uu.nl:8080/handle/1874/322764

University of Edinburgh
30.
Kanellopoulos, Athanasios.
Multisensory Integration Effect on Feature Binding in Visual Working Memory.
Degree: 2013, University of Edinburgh
URL: http://hdl.handle.net/1842/8603
▼ The focus of this experiment is to assess the effect of spatial attention biases on access to information in perceptual and visual working memory by cuing trials with auditory, visual or audiovisual signals in a change detection paradigm. Audiovisual cues elicited the strongest effect on shifting attention, being super-additive in comparison to their unimodal components, while visual cues were effective, though weaker, and auditory cues were totally ineffective. This effect of spatial attention on performance was gradually inhibited as the delay interval between presentation of the memory and test arrays increased, changing significantly between the 500 ms and 1500 ms marks. Results are interpreted in the context of contemporary theories of visual feature binding, multisensory integration, and exogenous versus endogenous attention.
Advisors/Committee Members: Logie, Robert.
Subjects/Keywords: visual working memory; feature binding; multisensory integration; attention control
APA (6th Edition):
Kanellopoulos, A. (2013). Multisensory Integration Effect on Feature Binding in Visual Working Memory. (Thesis). University of Edinburgh. Retrieved from http://hdl.handle.net/1842/8603