Our world is packed full of information. This information is received by our senses, somehow combined and then interpreted so that we have a good understanding of the world.
Along the way, other processes also occur. Some information is more important than the rest in a given moment, so it is prioritised and placed at the centre of our focus. The remainder is still valuable, giving us a more general sense of what the world is like and directing which information should be prioritised next. Finally, some information may be useful in the future, so it is stored - and this stored knowledge can then be used to guide the processes above.
The complex interactions between processes such as sensation, perception, attention, memory and cognition are made even more complex by the variability of the different sources of information we have, for example, vision and audition. It is important that these interactions are studied, as they shape our day-to-day behaviour, and any faults in the process may lead to a poorer quality of life. A good understanding of these interactions can also be applied in different scenarios to take advantage of different parts of the processes.
There are multiple approaches to studying this topic, the most familiar of which is behavioural studies, where changes in behaviour are measured when distinct changes to the outside world are made. This gives us insight into how the brain operates, but it does not shed light on the features of the processes involved.
Computational approaches take ideas from a wide range of disciplines, including behavioural and neurophysiological studies, to give insight into what processes may occur in the brain without cutting one open. Possible features of the larger process are modelled, and the results can be compared with behavioural studies to test whether those features may be involved. This has the added advantage of furthering knowledge in computer science as well - biological principles have already served as a foundation for important areas of research, such as machine vision.
The project I have already begun this summer aims to investigate the cross-modal influences of auditory information on vision of naturalistic scenes through computational modelling. Does hearing the sound of an object enhance your ability to find it within a scene? If so, how might the brain do this? Can computational efforts mimic the way in which human observers carry out this task?
This task may sound unnatural at first. Usually, the location of a sound, as well as whether the sound syncs up with any visual changes in our environment, is what helps us find objects in our surroundings. However, previous studies have shown that the identity of the object, when spatial and temporal properties are taken out of the equation, is also a guiding factor. This suggests that visual and auditory information may also interact at a high level, where information exists as abstract semantic ideas (lower-level information includes visual orientation and auditory pitch).
A cross-modal effect of audition on vision may have implications for research on health and wellbeing. If sounds are useful in a visual task, the principles behind this process may be used to design more sophisticated prostheses for those who are visually impaired. These prostheses would provide an enriched experience of the world, improving their quality of day-to-day living. Understanding the mechanisms behind interactions between visual and auditory stimuli can also help guide research into multisensory stimulation techniques for stroke rehabilitation. For example, it may be important to distinguish whether recovery involves one modality taking priority over the damaged one, or whether the integrated information is merely weaker than it was prior to the stroke. Finally, interactions between different senses could benefit individuals without neurological deficits as well. For example, the positive effects of natural spaces and the negative effects of urban spaces could be modulated when representations of these spaces are perceived through multiple senses.
The research I will undertake is far removed from the applications described above - it is merely one piece in a very large jigsaw that is nowhere near complete. However, studying processes such as perception and attention is still important, as they guide the more applied research that has a direct effect on people's lives. So far, I have been fascinated reading about how these processes may interact with each other, and by realising how little is known about the nature of these interactions.
I’m very excited to undertake this research under the mentorship of Dr Karla K. Evans and her PhD student, Cameron Kyle-Davidson, with the Complex Cognitive Processing Lab here at the University of York. The work so far has been challenging (in a good way), and I’m looking forward to what’s to come.