ETRA 2018: Warsaw, Poland
- Bonita Sharif, Krzysztof Krejtz:
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications, ETRA 2018, Warsaw, Poland, June 14-17, 2018. ACM 2018
Cognition
- Manuele Reani, Niels Peek, Caroline Jay:
An investigation of the effects of n-gram length in scanpath analysis for eye-tracking research. 1:1-1:8
- Unaizah Obaidellah, Mohammed Al Haek:
Evaluating gender difference on algorithmic problems using eye-tracker. 2:1-2:8
- Chaitra Yangandul, Sachin Paryani, Madison Le, Eakta Jain:
How many words is a picture worth?: attention allocation on thumbnails versus title text regions. 3:1-3:5
- Tobias Appel, Christian Scharinger, Peter Gerjets, Enkelejda Kasneci:
Cross-subject workload classification using pupil-related measures. 4:1-4:8
- Pierre Weill-Tessier, Hans Gellersen:
Correlation between gaze and hovers during decision-making interaction. 5:1-5:5
- Pieter Potgieter, Pieter J. Blignaut:
A system to determine if learners know the divisibility rules and apply them correctly. 6:1-6:8
Fundamental eye tracking
- Andoni Larumbe, Rafael Cabeza, Arantxa Villanueva:
Supervised descent method (SDM) applied to accurate pupil detection in off-the-shelf eye tracking systems. 7:1-7:8
- Wolfgang Fuhl, David Geisler, Thiago Santini, Tobias Appel, Wolfgang Rosenstiel, Enkelejda Kasneci:
CBF: circular binary features for robust and real-time pupil center detection. 8:1-8:6
- Kai Dierkes, Moritz Kassner, Andreas Bulling:
A novel approach to single camera, glint-free 3D eye model fitting including corneal refraction. 9:1-9:9
- Argenis Ramirez Gomez, Hans Gellersen:
Smooth-i: smart re-calibration using smooth pursuit eye movements. 10:1-10:5
- Pawel Kasprowski, Katarzyna Harezlak:
Comparison of mapping algorithms for implicit calibration using probable fixation targets. 11:1-11:8
- Xucong Zhang, Yusuke Sugano, Andreas Bulling:
Revisiting data normalization for appearance-based gaze estimation. 12:1-12:9
Digital interactions
- Nelson Silva, Tobias Schreck, Eduardo E. Veas, Vedran Sabol, Eva Eggeling, Dieter W. Fellner:
Leveraging eye-gaze and time-series features to predict user interests and build a recommendation model for visual analysis. 13:1-13:9
- Yulia Gizatdinova, Oleg Spakov, Outi Tuisku, Matthew A. Turk, Veikko Surakka:
Gaze and head pointing for hands-free text entry: applicability to ultra-small virtual keyboards. 14:1-14:9
- Vijay Rajanna, John Paulin Hansen:
Gaze typing in virtual reality: impact of keyboard design, selection method, and motion. 15:1-15:10
- Alexandra Papoutsaki, Aaron Gokaslan, James Tompkin, Yuze He, Jeff Huang:
The eye of the typer: a benchmark and analysis of gaze behavior during typing. 16:1-16:9
- Christina P. Katsini, George E. Raptis, Christos Fidas, Nikolaos M. Avouris:
Towards gaze-based quantification of the security of graphical authentication schemes. 17:1-17:5
- Raphael Menges, Hanadi Tamimi, Chandan Kumar, Tina Walber, Christoph Schaefer, Steffen Staab:
Enhanced representation of web pages for usability analysis with eye tracking. 18:1-18:9
Mobile eye tracking
- Martin Weier, Thorsten Roth, André Hinkenjann, Philipp Slusallek:
Predicting the gaze depth in head-mounted displays using multiple feature regression. 19:1-19:9
- Karishma Singh, Mahmoud Kalash, Neil D. B. Bruce:
Capturing real-world gaze behaviour: live and unplugged. 20:1-20:9
- Seonwook Park, Xucong Zhang, Andreas Bulling, Otmar Hilliges:
Learning to find eye region landmarks for remote gaze estimation in unconstrained settings. 21:1-21:10
- Mihai Bâce, Sander Staal, Gábor Sörös:
Wearable eye tracker calibration at your fingertips. 22:1-22:5
- Julian Steil, Michael Xuelin Huang, Andreas Bulling:
Fixation detection for head-mounted eye tracking based on visual similarity of gaze targets. 23:1-23:9
- Michael Barz, Florian Daiber, Daniel Sonntag, Andreas Bulling:
Error-aware gaze-based interfaces for robust mobile gaze interaction. 24:1-24:10
Gaze-based interaction
- Eduardo Velloso, Flavio Luiz Coutinho, Andrew T. N. Kurauchi, Carlos H. Morimoto:
Circular orbits detection for gaze interaction using 2D correlation and profile matching algorithms. 25:1-25:9
- Toshiya Isomoto, Toshiyuki Ando, Buntarou Shizuki, Shin Takahashi:
Dwell time reduction technique using Fitts' law for gaze-based target acquisition. 26:1-26:7
- Thomas Mattusch, Mahsa Mirzamohammad, Mohamed Khamis, Andreas Bulling, Florian Alt:
Hidden pursuits: evaluating gaze-selection via pursuits when the stimuli's trajectory is partially hidden. 27:1-27:5
- Florian Jungwirth, Michael Haslgrübler, Alois Ferscha:
Contour-guided gaze gestures: using object contours as visual guidance for triggering interactions. 28:1-28:10
- Fabian Göbel, Peter Kiefer, Ioannis Giannopoulos, Andrew T. Duchowski, Martin Raubal:
Improving map reading with gaze-adaptive legends. 29:1-29:9
- Brent D. Parsons, Richard B. Ivry:
Rapid alternating saccade training. 30:1-30:5
Social and natural behaviors
- Philipp Müller, Michael Xuelin Huang, Xucong Zhang, Andreas Bulling:
Robust eye contact detection in natural multi-person interactions using gaze and speaking behaviour. 31:1-31:10
- Deepak Akkil, Biju Thankachan, Poika Isokoski:
I see what you see: gaze awareness in mobile video collaboration. 32:1-32:9
- Pernilla Qvarfordt, Matthew Lee:
Gaze patterns during remote presentations while listening and speaking. 33:1-33:9
- Mathias Trefzger, Tanja Blascheck, Michael Raschke, Sarah Hausmann, Thomas Schlegel:
A visual comparison of gaze behavior from pedestrians and cyclists. 34:1-34:5
- Jan Petruzálek, Denis Sefara, Marek Franek, Martin Kabelác:
Scene perception while listening to music: an eye-tracking study. 35:1-35:5
- Oleg Spakov, Howell O. Istance, Tiia Viitanen, Harri Siirtola, Kari-Jouko Räihä:
Enabling unsupervised eye tracker calibration by school children through games. 36:1-36:9
Clinical and emotional
- Anke Huckauf:
Systematic shifts of fixation disparity accompanying brightness changes. 37:1-37:5
- Alessandro Grillini, Daniel Ombelet, Rijul S. Soans, Frans W. Cornelissen:
Towards using the spatio-temporal properties of eye movements to classify visual field defects. 38:1-38:5
- Nora Castner, Enkelejda Kasneci, Thomas C. Kübler, Katharina Scheiter, Juliane Richter, Thérése Eder, Fabian Hüttig, Constanze Keutel:
Scanpath comparison in medical image reading skills of dental students: distinguishing stages of expertise development. 39:1-39:9
- Pawel Kasprowski, Katarzyna Harezlak, Sabina Kasprowska:
Development of diagnostic performance & visual processing in different types of radiological expertise. 40:1-40:6
- Nina A. Gehrer, Michael Schönenberg, Andrew T. Duchowski, Krzysztof Krejtz:
Implementing innovative gaze analytic methods in clinical psychology: a study on eye movements in antisocial violent offenders. 41:1-41:9
- Minoru Nakayama:
Ocular reactions in response to impressions of emotion-evoking pictures. 42:1-42:5
- Krzysztof Krejtz, Katarzyna Wisiecka, Izabela Krejtz, Pawel Holas, Michal Olszanowski, Andrew T. Duchowski:
Dynamics of emotional facial expression recognition in individuals with social anxiety. 43:1-43:9
Notes (short papers)
- Vagner Figueredo de Santana, Juliana Jansen Ferreira, Rogério Abreu de Paula, Renato Fontoura de Gusmão Cerqueira:
An eye gaze model for seismic interpretation support. 44:1-44:10
- Benjamin I. Outram, Yun Suen Pai, Tanner Person, Kouta Minamizawa, Kai Kunze:
Anyorbit: orbital navigation in virtual environments with eye-tracking. 45:1-45:5
- Andrew D. Wilson, Shane Williams:
Autopager: exploiting change blindness for gaze-assisted reading. 46:1-46:5
- Laura Sesma-Sanchez, Dan Witzner Hansen:
Binocular model-based gaze estimation with a camera and a single infrared light source. 47:1-47:5
- Wolfgang Fuhl, Shahram Eivazi, Benedikt Hosp, Anna Eivazi, Wolfgang Rosenstiel, Enkelejda Kasneci:
BORE: boosted-oriented edge optimization for robust, real time remote pupil center detection. 48:1-48:5
- Kévin Bannier, Eakta Jain, Olivier Le Meur:
Deepcomics: saliency estimation for comics. 49:1-49:5
- Kai Otto, Nora Castner, David Geisler, Enkelejda Kasneci:
Development and evaluation of a gaze feedback system integrated into eyetrace. 50:1-50:5
- Saleh Mozaffari, Pascal Klein, Jouni Viiri, Sheraz Ahmed, Jochen Kuhn, Andreas Dengel:
Evaluating similarity measures for gaze patterns in the context of representational competence in physics education. 51:1-51:5
- Michael Burch, Kuno Kurzhals, Niklas Kleinhans, Daniel Weiskopf:
EyeMSA: exploring eye movement data with pairwise and multiple sequence alignment. 52:1-52:5
- Sandeep Vidyapu, V. Vijaya Saradhi, Samit Bhattacharya:
Fixation-indices based correlation between text and image visual features of webpages. 53:1-53:5
- Jeroen S. Benjamins, Roy S. Hessels, Ignace Th. C. Hooge:
Gazecode: open-source software for manual mapping of mobile eye-tracking data. 54:1-54:4
- Maurice Koch, Kuno Kurzhals, Daniel Weiskopf:
Image-based scanpath comparison with slit-scan visualization. 55:1-55:5
- Mamoru Hiroe, Michiya Yamamoto, Takashi Nagamatsu:
Implicit user calibration for gaze-tracking systems using an averaged saliency map around the optical axis of the eye. 56:1-56:5
- Diederick C. Niehorster, Marcus Nyström:
Microsaccade detection using pupil and corneal reflection signals. 57:1-57:5
- Jutta Hild, Michael Voit, Christian Kühnle, Jürgen Beyerer:
Predicting observer's task from eye movement patterns during motion image analysis. 58:1-58:5
- Lukas Greiter, Christoph Strauch, Anke Huckauf:
Pupil responses signal less inhibition for own relative to other names. 59:1-59:5
- Roman Bednarik, Piotr Bartczak, Hana Vrzakova, Jani Koskinen, Antti-Pekka Elomaa, Antti Huotarinen, David Gil de Gómez Pérez, Mikael von und zu Fraunberg:
Pupil size as an indicator of visual-motor workload and expertise in microsurgical training tasks. 60:1-60:5
- Thiago Santini, Wolfgang Fuhl, Enkelejda Kasneci:
PuReST: robust pupil tracking for real-time pervasive eye tracking. 61:1-61:5
- Nilavra Bhattacharya, Jacek Gwizdka:
Relating eye-tracking measures with changes in knowledge on search tasks. 62:1-62:5
- Filip Dechterenko, Jirí Lukavský:
Robustness of metrics used for scanpath comparison. 63:1-63:5
- Maryam Keyvanara, Robert S. Allison:
Sensitivity to natural 3D image transformations during eye movements. 64:1-64:5
- Haofei Wang, Jimin Pi, Tong Qin, Shaojie Shen, Bertram E. Shi:
SLAM-based localization of 3D gaze using a mobile eye tracker. 65:1-65:5
- William Rosengren, Marcus Nyström, Björn Hammar, Martin Stridh:
Suitability of calibration polynomials for eye-tracking data with simulated fixation inaccuracies. 66:1-66:5
- Jan Ehlers, Christoph Strauch, Anke Huckauf:
Training facilitates cognitive control on pupil dilation. 67:1-67:5
- Poika Isokoski, Jari Kangas, Päivi Majaranta:
Useful approaches to exploratory analysis of gaze data: enhanced heatmaps, cluster maps, and transition maps. 68:1-68:9
ETRA doctoral symposium abstracts
- Zhe Zeng, Matthias Roetting:
A text entry interface using smooth pursuit movements and language model. 69:1-69:2
- Fabian Deitelhoff:
Asynchronous gaze sharing: towards a dynamic help system to support learners during program comprehension. 70:1-70:3
- M. J. de Boer, Deniz Baskent, Frans W. Cornelissen:
Audio-visual interaction in emotion perception for communication: doctoral symposium, extended abstract. 71:1-71:2
- Wivine Blekic, Mandy Rossignol:
Automatic detection and inhibition of neutral and emotional stimuli in post-traumatic stress disorder: an eye-tracking study: eye-tracking data of an original antisaccade task. 72:1-72:2
- Itziar Lozano, Ruth Campos, Mercedes Belinchón:
Eye-tracking measures in audiovisual stimuli in infants at high genetic risk for ASD: challenging issues. 73:1-73:3
- Christophe Antony Lounis, Vsevolod Peysakhovich, Mickaël Causse:
Intelligent cockpit: eye tracking integration to enhance the pilot-aircraft interaction. 74:1-74:3
- Sabine Dziemian, Nicolas Langer:
Investigating the multicausality of processing speed deficits across developmental disorders with eye tracking and EEG: extended abstract. 75:1-75:2
- Hsing-fen Tu:
Seeing in time: an investigation of entrainment and visual processing in toddlers. 76:1-76:3
- Maria Timoshenko:
Seeing into the music score: eye-tracking and sight-reading in a choral context. 77:1-77:2
- Christian Schlösser:
Towards concise gaze sharing. 78:1-78:3
- Carolina Barzantny:
Training operational monitoring in future ATCOs using eye tracking: extended abstract. 79:1-79:3
- Birte Gestefeld, Alessandro Grillini, Jan-Bernard C. Marsman, Frans W. Cornelissen:
Using eye tracking to simplify screening for visual field defects and improve vision rehabilitation: extended abstract. 80:1-80:2
- Marius Rubo, Matthias Gamer:
Virtual reality as a proxy for real-life social attention? 81:1-81:2
ETRA video presentation abstracts
- John Paulin Hansen, Alexandre Alapetite, Martin Thomsen, Zhongyu Wang, Katsumi Minakata, Guangtao Zhang:
Head and gaze control of a telepresence robot with an HMD. 82:1-82:3
- Raimondas Zemblys, Oleg Komogortsev:
Developing photo-sensor oculography (PS-OG) system for virtual reality headsets. 83:1-83:3
- Adam Bykowski, Szymon Kupinski:
Automatic mapping of gaze position coordinates of eye-tracking glasses video on a common static reference image. 84:1-84:3
- Sigrid Klerke, Janus Askø Madsen, Emil Juul Jacobsen, John Paulin Hansen:
Substantiating reading teachers with scanpaths. 85:1-85:3
- Marius Rubo, Matthias Gamer:
Tracing gaze-following behavior in virtual reality using wiener-granger causality. 86:1-86:2
- Chaiyawan Auepanwiriyakul, Alex Harston, Pavel Orlov, Ali Shafti, A. Aldo Faisal:
Semantic fovea: real-time annotation of ego-centric videos with gaze context. 87:1-87:3
- Korok Sengupta, Min Ke, Raphael Menges, Chandan Kumar, Steffen Staab:
Hands-free web browsing: enriching the user experience with gaze and voice modality. 88:1-88:3
- Zofija Tupikovskaja-Omovie, David J. Tyler:
Mobile consumer shopping journey in fashion retail: eye tracking mobile apps and websites. 89:1-89:3
- Tim Claudius Stratmann, Uwe Gruenefeld, Susanne Boll:
EyeMR: low-cost eye-tracking for rapid-prototyping in head-mounted mixed reality. 90:1-90:2
- Pavel Orlov, Ali Shafti, Chaiyawan Auepanwiriyakul, Noyan Songur, A. Aldo Faisal:
A gaze-contingent intention decoding engine for human augmentation. 91:1-91:3
- Peyman Toreini, Moritz Langner, Alexander Maedche:
Use of attentive information dashboards to support task resumption in working environments. 92:1-92:3
- Shahram Eivazi, Maximilian Maurer:
Eyemic: an eye tracker for surgical microscope. 93:1-93:2
- Islam Akef Ebeid, Jacek Gwizdka:
Real-time gaze transition entropy. 94:1-94:3
- Michiya Yamamoto, Ryoma Matsuo, Satoshi Fukumori, Takashi Nagamatsu:
Modeling corneal reflection for eye-tracking considering eyelid occlusion. 95:1-95:3
- Raphael Menges, Hanadi Tamimi, Chandan Kumar, Tina Walber, Christoph Schaefer, Steffen Staab:
Enhanced representation of web pages for usability analysis with eye tracking. 96:1-96:9
- Miika Toivanen, Visajaani Salonen, Markku S. Hannula:
Self-made mobile gaze tracking for group studies. 97:1-97:2
ETRA demo presentation abstracts
- Dillon J. Lohr, Samuel-Hunter Berndt, Oleg Komogortsev:
An implementation of eye movement-driven biometrics in virtual reality. 98:1-98:3
- Benjamin I. Outram, Yun Suen Pai, Tanner Person, Kouta Minamizawa, Kai Kunze:
Anyorbit: orbital navigation in virtual environments with eye-tracking. 99:1-99:5
- Iyad Aldaqre, Roberto Delfiore:
Robust marker tracking system for mapping mobile eye tracking data. 100:1-100:3
- Mikhail Startsev, Ioannis Agtzidis, Michael Dorr:
Deep learning vs. manual annotation of eye movements. 101:1-101:3
- Vijay Rajanna, Tracy Hammond:
A gaze gesture-based paradigm for situational impairments, accessibility, and rich interactions. 102:1-102:3
- Stanislav Popelka, Jitka Dolezalová, Marketa Beitlova:
New features of scangraph: a tool for revealing participants' strategy from eye-movement data. 103:1-103:2
- Mathias Trefzger, Tanja Blascheck, Michael Raschke, Sarah Hausmann, Thomas Schlegel:
A visual comparison of gaze behavior from pedestrians and cyclists. 104:1-104:5
- Drew T. Guarnera, Corey A. Bryant, Ashwin Mishra, Jonathan I. Maletic, Bonita Sharif:
iTrace: eye tracking infrastructure for development environments. 105:1-105:3
- Shahram Eivazi, Thomas C. Kübler, Thiago Santini, Enkelejda Kasneci:
An inconspicuous and modular head-mounted eye tracker. 106:1-106:2