Oscillatory synchronization model of attention to moving objects

dc.citation.epage: 36
dc.citation.spage: 20
dc.citation.volumeNumber: 29-30
dc.contributor.author: Yilmaz, O.
dc.date.accessioned: 2016-02-08T09:47:01Z
dc.date.available: 2016-02-08T09:47:01Z
dc.date.issued: 2012
dc.department: National Magnetic Resonance Research Center (UMRAM)
dc.department: Department of Psychology
dc.description.abstract: The world is a dynamic environment; hence, it is important for the visual system to be able to deploy attention on moving objects and attentively track them. Psychophysical experiments indicate that the processes of both attentional enhancement and inhibition are spatially focused on the moving objects; however, the mechanisms of these processes are unknown. The studies indicate that the attentional selection of target objects is sustained via a feedforward-feedback loop in the visual cortical hierarchy, and that only the target objects are represented in attention-related areas. We suggest that feedback from the attention-related areas to early visual areas modulates the activity of neurons: it establishes synchronization with respect to a common oscillatory signal for target items via excitatory feedback, and de-synchronization for distractor items via inhibitory feedback. A two-layer computational neural network model with integrate-and-fire neurons is proposed and simulated for simple attentive tracking tasks. Consistent with previous modeling studies, we show that, via temporal tagging of neural activity, distractors can be attentively suppressed from propagating to higher levels. However, the simulations also suggest attentional enhancement of activity for distractors in the first layer, which represents the neural substrate dedicated to low-level feature processing. Inspired by this enhancement mechanism, we developed a feature-based object tracking algorithm with surround processing. Surround processing improved tracking performance by 57% on the PETS 2001 dataset by eliminating target features that are likely to suffer from faulty correspondence assignments. © 2012 Elsevier Ltd.
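The central mechanism described in the abstract, temporal tagging of target versus distractor activity through phase-locking of integrate-and-fire neurons to a common oscillatory feedback signal, can be illustrated with a minimal sketch. This is not the authors' published two-layer model: the leaky integrate-and-fire formulation, the 40 Hz feedback frequency, and all parameter values below are illustrative assumptions.

import numpy as np

dt = 0.1            # time step (ms)
T = 500.0           # simulation length (ms)
tau = 10.0          # membrane time constant (ms)
v_thresh = 1.0      # spiking threshold (arbitrary units)
v_reset = 0.0       # reset potential after a spike
osc_freq = 40.0     # assumed gamma-band feedback frequency (Hz)

steps = int(T / dt)
t = np.arange(steps) * dt                                        # time axis (ms)
feedforward = 0.9                                                # shared stimulus drive (subthreshold on its own)
oscillation = 0.6 * np.sin(2 * np.pi * osc_freq * t / 1000.0)    # common oscillatory feedback signal

def simulate(feedback_sign):
    """Leaky integrate-and-fire neuron; feedback_sign = +1 for a target neuron
    (excitatory feedback) or -1 for a distractor neuron (inhibitory feedback)."""
    v = 0.0
    spikes = []
    for i in range(steps):
        drive = feedforward + feedback_sign * oscillation[i]
        v += dt * (-v + drive) / tau        # Euler step of tau * dv/dt = -v + I
        if v >= v_thresh:
            spikes.append(t[i])
            v = v_reset
    return np.array(spikes)

target_spikes = simulate(+1)      # spikes phase-lock to the common oscillation
distractor_spikes = simulate(-1)  # spikes are pushed toward the opposite phase

def mean_phase(spike_times):
    """Circular mean phase of spike times relative to the common oscillation."""
    phases = 2 * np.pi * osc_freq * spike_times / 1000.0
    return np.angle(np.mean(np.exp(1j * phases))) % (2 * np.pi)

print("target mean phase:     %.2f rad" % mean_phase(target_spikes))
print("distractor mean phase: %.2f rad" % mean_phase(distractor_spikes))

Under these assumptions, target spikes cluster near one phase of the common oscillation and distractor spikes near the opposite phase, so a downstream stage that reads out activity only around the oscillation peak would propagate the target and suppress the distractor, which is the temporal-tagging idea the abstract describes.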
dc.identifier.doi: 10.1016/j.neunet.2012.01.005
dc.identifier.issn: 0893-6080
dc.identifier.uri: http://hdl.handle.net/11693/21483
dc.language.iso: English
dc.publisher: Elsevier
dc.relation.isversionof: http://dx.doi.org/10.1016/j.neunet.2012.01.005
dc.source.title: Neural Networks
dc.subject: Attention
dc.subject: Cortical oscillations
dc.subject: Neural synchrony
dc.subject: Object tracking
dc.subject: Computational neural networks
dc.subject: Data sets
dc.subject: Desynchronization
dc.subject: Dynamic environments
dc.subject: Enhancement mechanism
dc.subject: Feature-based
dc.subject: Inhibitory feedback
dc.subject: Integrate-and-fire neurons
dc.subject: Low-level features
dc.subject: Modeling studies
dc.subject: Moving objects
dc.subject: Neural activity
dc.subject: Neural substrates
dc.subject: Object tracking algorithm
dc.subject: Oscillatory signals
dc.subject: Oscillatory synchronization
dc.subject: Psychophysical experiments
dc.subject: Target feature
dc.subject: Target object
dc.subject: Tracking performance
dc.subject: Two layers
dc.subject: Visual areas
dc.subject: Visual cortical
dc.subject: Visual systems
dc.subject: Brain
dc.subject: Neural networks
dc.subject: Feedback
dc.title: Oscillatory synchronization model of attention to moving objects
dc.type: Article

Files

Original bundle

Name: Oscillatory synchronization model of attention to moving objects.pdf
Size: 1.32 MB
Format: Adobe Portable Document Format
Description: Full printable version