Title:
Model-Based Approach to Study of Mechanisms of Complex Image Viewing
Source:
Optical Memory & Neural Networks, 18(2):114-121
Publisher Information:
New York, NY: Allerton Press / Springer, 2009.
Publication Year:
2009
Physical Description:
print, 26 references
Original Material:
INIST-CNRS
Time:
4266
Document Type:
Journal Article
File Description:
text
Language:
English
Author Affiliations:
A.B. Kogan Research Institute for Neurocybernetics, Southern Federal University, Stachka Ave. 194/1, Rostov-on-Don 344090, Russian Federation
Psychology Department, Southern Federal University, Russian Federation
ISSN:
1060-992X
Rights:
Copyright 2009 INIST-CNRS
CC BY 4.0
Unless otherwise stated above, the content of this bibliographic record may be used under a CC BY 4.0 licence by Inist-CNRS.
Notes:
Metrology

Vertebrates: nervous system and sense organs
Accession Number:
edscal.21805981
Database:
PASCAL Archive

Further Information

A model-based approach to the study of complex image viewing mechanisms and the first results of its implementation are presented. The choice of the most informative regions (MIRs) is performed according to the results of psychophysical tests with high-accuracy tracking of eye movements. For three test images, the MIRs were determined as image regions with the maximal density of gaze fixations across all subjects (n = 9). Individual image viewing scanpaths (n = 49) were classified into three basic types (i.e. viewing, object-consequent, and object-returned scanpaths). Task-related and temporal dynamics of eye-movement parameters were found for the same subjects. Artificial image scanpaths similar to the experimental ones were obtained by means of a gaze attraction function.
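The abstract does not give the authors' algorithm, but the two computational steps it names (extracting MIRs as regions of maximal pooled fixation density, and generating artificial scanpaths from a gaze attraction function) can be illustrated with a minimal sketch. The sketch below is an assumption-laden approximation, not the paper's method: fixation coordinates are pooled into a Gaussian density map, MIRs are taken as above-threshold regions, and an artificial scanpath is sampled with probability proportional to that density. All parameter values and function names are illustrative.

```python
# Sketch only: pooled fixation density -> MIR mask -> density-driven scanpath.
# Grid size, sigma, threshold and temperature are arbitrary illustrative choices.
import numpy as np

def fixation_density_map(fixations, image_shape, sigma=20.0):
    """Pooled fixation-density map via Gaussian kernel density estimation."""
    h, w = image_shape
    ys, xs = np.mgrid[0:h, 0:w]
    density = np.zeros((h, w), dtype=float)
    for fx, fy in fixations:  # (x, y) fixation coordinates in pixels
        density += np.exp(-((xs - fx) ** 2 + (ys - fy) ** 2) / (2.0 * sigma ** 2))
    return density / density.max()

def most_informative_regions(density, threshold=0.6):
    """Binary mask of image regions whose fixation density exceeds the threshold."""
    return density >= threshold

def sample_scanpath(density, n_fixations=10, temperature=0.1, rng=None):
    """Artificial scanpath: fixation points drawn with probability rising
    with the density map, a crude stand-in for a gaze attraction function."""
    if rng is None:
        rng = np.random.default_rng()
    p = density.ravel() ** (1.0 / temperature)
    p /= p.sum()
    idx = rng.choice(density.size, size=n_fixations, p=p)
    rows, cols = np.unravel_index(idx, density.shape)
    return np.column_stack([cols, rows])  # (x, y) order

# Usage example with synthetic fixations (two clusters on a 200 x 300 image).
rng = np.random.default_rng(0)
fix = np.vstack([rng.normal([80, 60], 10, (30, 2)),    # cluster near (x=80, y=60)
                 rng.normal([220, 140], 10, (30, 2))])  # cluster near (x=220, y=140)
dens = fixation_density_map(fix, image_shape=(200, 300))
mir_mask = most_informative_regions(dens)
path = sample_scanpath(dens, n_fixations=8, rng=rng)
print("MIR pixels:", int(mir_mask.sum()))
print("Artificial scanpath:\n", path)
```

In this toy setup the sampled fixations cluster inside the MIRs, which is the qualitative behaviour the abstract attributes to the gaze attraction function; the actual form of that function and the classification of scanpath types are described in the full article.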