ETRA 2008: Savannah, Georgia, USA
- Kari-Jouko Räihä, Andrew T. Duchowski:
Proceedings of the Eye Tracking Research & Application Symposium, ETRA 2008, Savannah, Georgia, USA, March 26-28, 2008. ACM 2008, ISBN 978-1-59593-982-1
Keynote abstract
- Shumin Zhai:
On the ease and efficiency of human-computer interfaces. 9-10
Eye typing
- Jacob O. Wobbrock, James Rubinstein, Michael W. Sawyer, Andrew T. Duchowski:
Longitudinal evaluation of discrete consecutive gaze gestures for text entry. 11-18
- Outi Tuisku, Päivi Majaranta, Poika Isokoski, Kari-Jouko Räihä:
Now Dasher! Dash away!: longitudinal study of fast text entry by Eye Gaze. 19-26
- Marco Porta, Matteo Turina:
Eye-S: a full-screen input modality for pure eye-based communication. 27-34
Late breaking results: oral presentations
- Jeffrey B. Mulligan:
Measurement of eye velocity using active illumination. 35-38
- Sami Pietinen, Roman Bednarik, Tatiana Glotova, Vesa Tenhunen, Markku Tukiainen:
A method to study visual attention aspects of collaboration: eye-tracking pair programmers simultaneously. 39-42
- Matthew K. Feusner, Brian Lukoff:
Testing for statistically significant differences between groups of scan patterns. 43-46
- Geoffrey Tien, M. Stella Atkins:
Improving hands-free menu selection using eyegaze glances and fixations. 47-50
- Anke Huckauf, Mario H. Urbina:
Gazing with pEYEs: towards a universal input for various applications. 51-54
- I. Scott MacKenzie, Xuang Zhang:
Eye typing using word and letter prediction and a fixation algorithm. 55-58
Late breaking results: poster presentations
- Craig Hennessey, Peter D. Lawrence:
3D point-of-gaze estimation on a volumetric display. 59
- Wayne J. Ryan, Andrew T. Duchowski, Stanley T. Birchfield:
Limbus/pupil switching for wearable eye tracking under variable lighting conditions. 61-64
- Manu Kumar, Jeff Klingner, Rohan Puranik, Terry Winograd, Andreas Paepcke:
Improving the accuracy of gaze input for interaction. 65-68
- Jeff Klingner, Rakshit Kumar, Pat Hanrahan:
Measuring the task-evoked pupillary response with a remote eye tracker. 69-72
- Minoru Nakayama, Yosiyuki Takahasi:
Estimation of certainty for multiple choice tasks using features of eye-movements. 73-76
- Minoru Nakayama, Makoto Katsukura:
Assessing usability with eye-movement frequency analysis. 77
- Emiliano Castellina, Fulvio Corno, Paolo Pellegrino:
Integrated speech and gaze control for realistic desktop environments. 79-82
- Sara Dalzel-Job, Craig Nicol, Jon Oberlander:
Comparing behavioural and self-report measures of engagement with an embodied conversational agent: a first report on eye tracking in Second Life. 83-85
- Cihan Topal, Ömer Nezih Gerek, Atakan Dogan:
A head-mounted sensor-based eye tracking device: eye touch system. 87-90
- Sven-Thomas Graupner, Michael Heubner, Sebastian Pannasch, Boris M. Velichkovsky:
Evaluating requirements for gaze-based interaction in a see-through head mounted display. 91-94
- Takashi Nagamatsu, Junzo Kamahara, Takumi Iko, Naoki Tanaka:
One-point calibration gaze tracking based on eyeball kinematics using stereo cameras. 95-98
- Roman Bednarik, Markku Tukiainen:
Temporal eye-tracking data: evolution of debugging strategies with multiple representations. 99-102
- Zhang Yun, Zhao Xin-Bo, Zhao Rong-Chun, Zhou Yuan, Zou Xiao-Chun:
EyeSecret: an inexpensive but high performance auto-calibration eye tracker. 103-106
- Oleg Spakov, Kari-Jouko Räihä:
KiEV: a tool for visualization of reading and writing processes in translation of text. 107-110
- Frederick Shic, Brian Scassellati, Katarzyna Chawarska:
The incomplete fixation measure. 111-114
- Inger Ekman, Antti Poikola, Meeri Mäkäräinen, Tapio Takala, Perttu Hämäläinen:
Voluntary pupil size change as control in eyes only interaction. 115-118
- Tony Poitschke, Markus Ablaßmeier, Gerhard Rigoll, Stanislavs Bardins, Stefan Kohlbecher, Erich Schneider:
Contact-analog information representation in an automotive head-up display. 119-122
- Selina Sharmin, Oleg Spakov, Kari-Jouko Räihä, Arnt Lykke Jakobsen:
Effects of time pressure and text complexity on translators' fixations. 123-126
- Margarita Vinnikov, Robert S. Allison, Dominik Swierad:
Real-time simulation of visual defects with gaze-contingent display. 127-130
- Yoshiko Habuchi, Muneo Kitajima, Haruhiko Takeuchi:
Comparison of eye movements in searching for easy-to-find and hard-to-find information in a hierarchically organized information structure. 131-134
- Stefan Kohlbecher, Stanislavs Bardins, Klaus Bartl, Erich Schneider, Tony Poitschke, Markus Ablaßmeier:
Calibration-free eye tracking by reconstruction of the pupil ellipse in 3D space. 135-138
- Sheng Liu, Hong Hua:
Spatialchromatic foveation for gaze contingent displays. 139-142
- Moran Cerf, Edward Paxon Frady, Christof Koch:
Using semantic content as cues for better scanpath prediction. 143-146
- Harri Rantala:
Eye2i: coordinated multiple views for gaze data. 147
- Thiago S. Barcelos, Carlos Hitoshi Morimoto:
GInX: gaze based interface extensions. 149-152
- Sylvain Chartier, Patrice Renaud:
An online noise filter for eye-tracker data recorded in a virtual environment. 153-156
Looking at faces, chess boards, and maps
- Sheree Josephson, Michael E. Holmes:
Cross-race recognition deficit and visual attention: do they all look (at faces) alike? 157-164
- Pieter J. Blignaut, Tanya René Beelders, C.-Y. So:
The visual span of chess players. 165-171
- Mauro Cherubini, Marc-Antoine Nüssli, Pierre Dillenbourg:
Deixis and gaze in collaborative work at a distance (over a shared map): a computational model to detect misunderstandings. 173-180
Advances in eye tracking technology
- Susan M. Munn, Jeff B. Pelz:
3D point-of-regard, position and head orientation from a portable monocular video-based eye tracker. 181-188
- Jixu Chen, Yan Tong, Wayne D. Gray, Qiang Ji:
A robust 3D eye gaze tracking system using noise reduction. 189-196
- Dale Roberts, Mark Shelhamer, Aaron Wong:
A new "wireless" search-coil system. 197-204
Gaze interfaces
- Dan Witzner Hansen, Henrik H. T. Skovsgaard, John Paulin Hansen, Emilie Møllenbach:
Noise tolerant selection by gaze-controlled pan and zoom in 3D. 205-212
- Yvonne Kammerer, Katharina Scheiter, Wolfgang Beinhauer:
Looking my way through the menu: the impact of menu design and multimodal input on gaze-based menu selection. 213-220
- Howell O. Istance, Richard Bates, Aulikki Hyrskykari, Stephen Vickers:
Snap clutch, a moded approach to solving the Midas touch problem. 221-228
Prediction, bias, estimation
- Oleg Komogortsev, Javed I. Khan:
Eye movement prediction by Kalman filter with integrated linear horizontal oculomotor plant mechanical model. 229-236
- Elias Daniel Guestrin, Moshe Eizenman, Jeffrey J. Kang, Erez Eizenman:
Analysis of subject-dependent point-of-gaze estimation bias in the cross-ratios method. 237-244
- Hirotake Yamazoe, Akira Utsumi, Tomoko Yonezawa, Shinji Abe:
Remote gaze estimation with a single camera based on facial-feature tracking without special calibration actions. 245-250
Calibration
- Martin Böhme, Michael Dorr, Mathis Graw, Thomas Martinetz, Erhardt Barth:
A software framework for simulating eye trackers. 251-258
- Juan J. Cerrolaza, Arantxa Villanueva, Rafael Cabeza:
Taxonomic study of polynomial regressions applied to the calibration of video-oculographic systems. 259-266
- Elias Daniel Guestrin, Moshe Eizenman:
Remote point-of-gaze estimation requiring a single-point calibration for applications with infants. 267-274