Result: Gaze-based prediction of pen-based virtual interaction tasks

Title:
Gaze-based prediction of pen-based virtual interaction tasks
Source:
International Journal of Human-Computer Studies, 73:91-106
Publisher Information:
Oxford: Elsevier, 2015.
Publication Year:
2015
Physical Description:
print, 3/4 p
Original Material:
INIST-CNRS
Subject Terms:
Computer science, Psychology, psychopathology, psychiatry, Exact sciences and technology, Applied sciences, Computer science; control theory; systems, Software, Computer systems and distributed systems. User interface, Memory organisation. Data processing, Data processing. List processing. Character string processing, Simulation, Artificial intelligence, Pattern recognition. Digital image processing. Computational geometry, User assistance, Keyboard, User behavior, Mode coupling, Freehand sketch, Input output equipment, Intention, Graphical interface, Multimodal interface, User interface, Control lever, Menu, Eye movement, Gaze, Virtual reality, Computer simulation, Man machine system, Feature representation, Feature selection, Gaze-based interfaces, Multimodal databases, Multimodal interaction, Predictive interfaces, Sketch-based interaction
Document Type:
Academic Journal Article
File Description:
text
Language:
English
Author Affiliations:
Intelligent User Interfaces Lab, Department of Computer Engineering, Koç University, Istanbul 34450, Turkey
ISSN:
1071-5819
Rights:
Copyright 2015 INIST-CNRS
CC BY 4.0
Unless otherwise stated above, the content of this bibliographic record may be used under a CC BY 4.0 licence by Inist-CNRS
Notes:
Computer science; theoretical automation; systems
Accession Number:
edscal.28891985
Database:
PASCAL Archive

Further Information

In typical human-computer interaction, users convey their intentions through traditional input devices (e.g. keyboards, mice, joysticks) coupled with standard graphical user interface elements. Recently, pen-based interaction has emerged as a more intuitive alternative to these traditional means. However, existing pen-based systems are limited by the fact that they rely heavily on auxiliary mode switching mechanisms during interaction (e.g. hard or soft modifier keys, buttons, menus). In this paper, we describe how eye gaze movements that naturally occur during pen-based interaction can be used to reduce dependency on explicit mode selection mechanisms in pen-based systems. In particular, we show that a range of virtual manipulation commands that would otherwise require auxiliary mode switching elements can be issued with an 88% success rate with the aid of users' natural eye gaze behavior during pen-only interaction.
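The abstract describes predicting a user's intended task from gaze behavior that accompanies pen input, in place of explicit mode switches. The paper's actual features and classifier are not reproduced here; the following is a minimal illustrative sketch in pure Python, assuming hypothetical two-dimensional gaze features (mean gaze-to-pen distance, mean fixation duration) and a simple nearest-centroid classifier. All names, labels, and values are invented for illustration.

```python
from statistics import mean

# Hypothetical training data: one gaze feature vector per interaction
# window, as (mean gaze-to-pen distance in px, mean fixation duration
# in ms), paired with the intended task. Values are illustrative only,
# not taken from the paper's dataset.
TRAIN = [
    ((12.0, 340.0), "draw"),
    ((15.0, 310.0), "draw"),
    ((140.0, 90.0), "command"),
    ((160.0, 120.0), "command"),
]

def centroids(samples):
    """Average the feature vectors of each label into one centroid."""
    by_label = {}
    for vec, label in samples:
        by_label.setdefault(label, []).append(vec)
    return {
        label: tuple(mean(dim) for dim in zip(*vecs))
        for label, vecs in by_label.items()
    }

def predict(vec, cents):
    """Return the label whose centroid is nearest (squared Euclidean)."""
    return min(
        cents,
        key=lambda lab: sum((a - b) ** 2 for a, b in zip(vec, cents[lab])),
    )

cents = centroids(TRAIN)
print(predict((14.0, 300.0), cents))   # gaze stays near the pen tip -> draw
print(predict((150.0, 100.0), cents))  # gaze jumps away from the pen -> command
```

In this toy setup, windows where gaze tracks the pen tip classify as drawing, while windows where gaze leaves the pen (e.g. toward a target object) classify as a manipulation command; the published system uses richer gaze features and achieves the reported 88% success rate.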