About

What current and future challenges does incorporating eye tracking into game design and development create? The Second EyePlay workshop at CHI Play 2018 brings together academic researchers and industry practitioners from the fields of eye tracking and games to explore these questions. In recent years, gaming has been at the forefront of the commercial popularisation of eye tracking. In this workshop, we will share experiences in the development of gaze-enabled games, discuss best practices and tools, and explore future challenges for research. Topics of interest lie at the intersection of eye tracking and games, including, but not limited to, novel interaction techniques and game mechanics, development processes and tools, accessible games, evaluation, and future visions.


The initial submission deadlines for workshop participants have already passed. However, if you would still like to participate in this workshop without a paper submission, please contact Michael Lankes (), the main contact person of the organizer team.


Background

Since the EyePlay workshop at CHI Play 2014 [8], eye tracking and gaze-based interaction have gained popularity, with gaming as a main driver of wider adoption. Tobii Gaming has been leading this charge since the release of its affordable Tobii EyeX and 4C eye trackers, and through the continued integration of eye tracking capabilities into over 100 game titles. The integration of eye tracking in games has enabled a plethora of novel game mechanics (see Velloso and Carter’s survey for an overview [10]), opening up an entirely new design space. Eye tracking remains a promising technology for games, as evidenced by its incorporation into emerging platforms such as VR [1].

Recent works have further expanded the reach of gaze-enabled games: from single-player towards multi-player contexts [2,3,4,5], towards AIs that adapt to players’ gaze patterns [6,11], and towards novel platforms such as AR and VR [1,9]. Although commercial applications of gaze in games have demonstrated a wide range of exciting implementations, these new directions remain underexplored. As such, there is a great opportunity for academic researchers and industry practitioners to exchange ideas and discuss future directions for eye tracking in games. At the same time, we recognize that eye-based gaming in both academia and industry has come a long way, and that it is time for researchers to critically reflect on the opportunities and challenges at hand [7], especially with the advent of new developments in eye tracking technology.

The main goals of this workshop are two-fold. First, to bring together academic researchers and industry partners from the fields of eye tracking and games to discuss the past, present, and future challenges of eye-based gaming. Second, to review and advance the goals of the first EyePlay workshop by raising questions such as: How can we further integrate existing eye-tracking research into commercial games to create novel gameplay experiences? The outcomes of this workshop will be compiled and communicated to the broader community, possibly through a publication.

How to Participate

Participants are invited to submit in either of the categories below. Submissions should also include a brief biography (150-200 words) and should be submitted in PDF format via email to Michael Lankes (), the main contact person of the organizer team.

  • Position Paper, up to 4 pages (excluding references) written in the SIGCHI CHI Extended Abstract format (https://chiplay.acm.org/2018/authors-guidelines/). Position papers may describe a particular vision or concept, preliminary results (work-in-progress), or case study. Participants may also submit a demonstrative video as a supplement to their position paper.
  • Presentation Abstract/Interactive Demo with slides and/or video. This type of submission is recommended ONLY for industry participants. Participants are required to accompany their submission with a 2-4 page description and motivation of their work (no format requirements).

Submissions should not be anonymised. Submissions will be reviewed, and up to 20 will be selected based on originality, quality, and research diversity.

Important Dates

Workshop Date: 28 October 2018
Main Conference Dates: 28–31 October 2018

Program

Before the Workshop

Participants will be asked to familiarize themselves with one another’s work and to prepare a one-sentence summary of their research or interests before the workshop. Participants are also asked to prepare and bring any prototypes they would like to demo or use in the play sessions. The organizers will prepare a list of research challenges in EyePlay, augmented with challenges drawn from the accepted submissions. Furthermore, the organizers will prepare equipment for the play sessions.

During the Workshop

The workshop will be made up of two parts: a discussion phase (focusing on the theory of EyePlay) and an interactive hands-on design experience (design phase). In the discussion phase, participants will be challenged to create an “EyePlay Map” to develop a common understanding of the research field and to categorize their submissions. This phase will consist of three steps: first, participants will be asked to formulate themes of EyePlay (e.g. social gaze, navigation via gaze, etc.). Afterwards, participants will position their submissions on the map in order to contextualize the papers and to establish connections between them. The last step will include a discussion of the empty spaces of EyePlay (themes without submissions) and the current and future research challenges of each theme. The outcome of the discussion phase will comprise an EyePlay Map and a list of design challenges. In the design phase, workshop participants will collaborate to create conceptual or rough interactive prototypes that address the challenges of EyePlay.

Schedule

  • 09.00-09.10: Workshop Kick-off. Workshop organizers present the agenda for the day.
  • 09.10-10.25: Introductions. Participants are given 5 minutes each to introduce themselves, present their submissions, and convey their interests.
  • 10.25-12.05: Discussion. Participants will collaboratively create the “EyePlay Map”, position their submissions on it, and identify empty spaces (themes without submissions) as well as current and future research challenges of each theme. The outcome of this phase will comprise an EyePlay Map and a list of design challenges.
  • 12.05-13.05: Lunch Break
  • 13.05-14.20: Design Phase 1. Participants will form groups and will design solutions for research challenges formulated in the discussion phase.
  • 14.20-16.15: Design Phase 2. Each group will develop a prototype based on the outcome of design phase 1.
  • 16.15-17.15: Presentations
  • 17.15-17.30: Workshop wrap-up

Organizers

References

  1. Mohamed Khamis, Carl Oechsner, Florian Alt, and Andreas Bulling. 2018. VRpursuits: Interaction in Virtual Reality Using Smooth Pursuit Eye Movements. In Proceedings of the 2018 International Conference on Advanced Visual Interfaces (AVI ’18). ACM, New York, NY, USA, Article 18, 8 pages. DOI: http://dx.doi.org/10.1145/3206505.3206522
  2. Michael Lankes, Bernhard Maurer, and Barbara Stiglbauer. 2016. An Eye for an Eye: Gaze Input in Competitive Online Games and Its Effects on Social Presence. In Proceedings of the 13th International Conference on Advances in Computer Entertainment Technology (ACE ’16). ACM, New York, NY, USA, Article 17, 9 pages. DOI: http://dx.doi.org/10.1145/3001773.3001774
  3. Bernhard Maurer, Michael Lankes, and Manfred Tscheligi. 2018. Where the eyes meet: Lessons learned from shared gaze-based interactions in cooperative and competitive online games. Entertainment Computing 27 (2018), 47 – 59. DOI: http://dx.doi.org/10.1016/j.entcom.2018.02.009
  4. Joshua Newn, Fraser Allison, Eduardo Velloso, and Frank Vetere. 2018. Looks Can Be Deceiving: Using Gaze Visualisation to Predict and Mislead Opponents in Strategic Gameplay. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18). ACM, New York, NY, USA, Article 261, 12 pages. DOI: http://dx.doi.org/10.1145/3173574.3173835
  5. Joshua Newn, Eduardo Velloso, Fraser Allison, Yomna Abdelrahman, and Frank Vetere. 2017. Evaluating Real-Time Gaze Representations to Infer Intentions in Competitive Turn-Based Strategy Games. In Proceedings of the Annual Symposium on Computer-Human Interaction in Play (CHI PLAY ’17). ACM, New York, NY, USA, 541–552. DOI: http://dx.doi.org/10.1145/3116595.3116624
  6. Ronal Singh, Tim Miller, Joshua Newn, Liz Sonenberg, Eduardo Velloso, and Frank Vetere. 2018. Combining Planning with Gaze for Online Human Intention Recognition. In Proceedings of the 17th International Conference on Autonomous Agents and Multiagent Systems (AAMAS ’18). International Foundation for Autonomous Agents and Multiagent Systems, Richland, SC.
  7. Veronica Sundstedt, Diego Navarro, and Julian Mautner. 2016. Possibilities and Challenges with Eye Tracking in Video Games and Virtual Reality Applications. In SIGGRAPH ASIA 2016 Courses (SA ’16). ACM, New York, NY, USA, Article 17, 150 pages. DOI: http://dx.doi.org/10.1145/2988458.2988466
  8. Jayson Turner, Eduardo Velloso, Hans Gellersen, and Veronica Sundstedt. 2014. EyePlay: Applications for Gaze in Games. In Proceedings of the First ACM SIGCHI Annual Symposium on Computer-human Interaction in Play (CHI PLAY ’14). ACM, New York, NY, USA, 465–468. DOI: http://dx.doi.org/10.1145/2658537.2659016
  9. Hidde van der Meulen, Andrew L. Kun, and Orit Shaer. 2017. What Are We Missing?: Adding Eye-Tracking to the HoloLens to Improve Gaze Estimation Accuracy. In Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces (ISS ’17). ACM, New York, NY, USA, 396–400. DOI: http://dx.doi.org/10.1145/3132272.3132278
  10. Eduardo Velloso and Marcus Carter. 2016. The Emergence of EyePlay: A Survey of Eye Interaction in Games. In Proceedings of the 2016 Annual Symposium on Computer-Human Interaction in Play (CHI PLAY ’16). ACM, New York, NY, USA, 171–185. DOI: http://dx.doi.org/10.1145/2967934.2968084
  11. Stefanie Wetzel, Katharina Spiel, and Sven Bertel. 2014. Dynamically Adapting an AI Game Engine Based on Players’ Eye Movements and Strategies. In Proceedings of the 2014 ACM SIGCHI Symposium on Engineering Interactive Computing Systems (EICS ’14). ACM, New York, NY, USA, 3–12. DOI: http://dx.doi.org/10.1145/2607023.2607029