COGAIN Symposium:
Communication by Gaze Interaction

Denver, Colorado • June 25-28, 2019

In collaboration with:

Important dates

Abstract due: Jan 25, 2019 (extended)
Papers due: Jan 25, 2019 (extended)
Feedback: Feb 18, 2019
Rebuttals: Feb 25, 2019
Decisions: Mar 04, 2019
Camera-ready: Mar 29, 2019

General co-chairs
Päivi Majaranta
John Paulin Hansen

Program co-chairs
Diako Mardanbegi
Ken Pfeuffer

The 2019 COGAIN Symposium

The Symposium on Communication by Gaze Interaction organized by the COGAIN Association will be co-located with ETRA 2019, the ACM Symposium on Eye Tracking Research & Applications. ETRA 2019 will take place in Denver, Colorado, from June 25th to the 28th.

The COGAIN Symposium will be organized as a "special session" at ETRA. By combining our efforts with ETRA, we hope to encourage a broader exchange of knowledge and experiences amongst the communities of researchers, developers, manufacturers, and users of eye trackers.

We invite authors to prepare and submit papers and notes following ETRA's ACM format. Long papers are up to 8 pages (plus 2 additional pages for references); short papers are up to 4 pages (plus 2 additional pages for references). During the submission process to ETRA 2019, you will be asked whether you would like your paper to be presented at the COGAIN Symposium. Papers accepted for the COGAIN Symposium will be published as part of the ETRA 2019 ACM Proceedings.

The COGAIN Symposium focuses on all aspects of gaze interaction, with special emphasis on eye-controlled assistive technology. The symposium will present advances in these areas, leading to new capabilities in gaze interaction, gaze-enhanced applications, gaze-contingent devices, and more. Topics of interest include, but are not limited to:

  • Eye-controlled assistive technology
  • Eye-typing
  • Gaze-contingent devices
  • Gaze-enhanced games
  • Gaze-controlled robots and vehicles
  • Gaze interaction with mobile devices
  • Gaze-controlled smart-home devices
  • Gaze interfaces for wearable computing
  • Gaze interaction in 3D (VR/AR/MR & real world)
  • Gaze interaction paradigms
  • Usability and UX evaluation of gaze-based interfaces
  • User context estimation from eye movements
  • Gaze-supported multimodal interaction (gaze with multitouch, mouse, gesture, etc.)