
SocialEyes: Scaling Mobile Eye-tracking to Multi-person Social Settings

Abstract

Eye movements provide a window into human behaviour, attention, and interaction dynamics. Challenges in real-world, multi-person environments have, however, restrained eye-tracking research predominantly to single-person, in-lab settings. We developed a system to stream, record, and analyse synchronised data from multiple mobile eye-tracking devices during collective viewing experiences (e.g., concerts, films, lectures). We implemented lightweight operator interfaces for real-time monitoring, remote troubleshooting, and gaze projection from individual egocentric perspectives to a common coordinate space for shared gaze analysis. We tested the system in a live concert and a film screening with 30 simultaneous viewers during each of two public events (N=60). We observed precise time synchronisation between devices, measured through recorded clock offsets, and accurate gaze projection in challenging dynamic scenes. Our novel analysis metrics and visualizations illustrate the potential of collective eye-tracking data for understanding collaborative behaviour and social interaction. This advancement promotes ecological validity in eye-tracking research and paves the way for innovative interactive tools.
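The abstract's gaze-projection step, mapping each wearer's egocentric gaze point into a common coordinate space, is commonly done with a planar homography when the shared target (e.g., a stage or screen) is approximately flat. The sketch below is illustrative only, not the paper's implementation: the function name and the example matrices are hypothetical, and in practice the homography would be estimated per frame by matching each scene-camera view to a reference view.

```python
# Minimal sketch (assumption: a planar homography suffices because viewers
# share a roughly planar target such as a screen or stage).
# H is a 3x3 row-major matrix mapping one wearer's scene-camera pixel
# coordinates into the shared reference frame; the matrices below are
# illustrative values, not calibration results from the paper.

def project_gaze(H, point):
    """Apply homography H (3x3 nested lists) to a 2D gaze point (x, y)."""
    x, y = point
    # Homogeneous multiply: [x', y', w'] = H @ [x, y, 1]
    xp = H[0][0] * x + H[0][1] * y + H[0][2]
    yp = H[1][0] * x + H[1][1] * y + H[1][2]
    w = H[2][0] * x + H[2][1] * y + H[2][2]
    # Perspective divide back to 2D shared-space coordinates
    return (xp / w, yp / w)

# Identity homography leaves the gaze point unchanged; a real H, estimated
# from feature matches against the reference view, would warp it into the
# common frame where gaze from all viewers can be compared.
identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
shift = [[1, 0, 10], [0, 1, -5], [0, 0, 1]]  # pure translation example
print(project_gaze(identity, (320.0, 240.0)))  # -> (320.0, 240.0)
print(project_gaze(shift, (320.0, 240.0)))     # -> (330.0, 235.0)
```

Once all gaze points live in the same reference frame, shared-gaze metrics (e.g., pairwise gaze distance or overlap) reduce to simple comparisons of 2D coordinates.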

Authors

Saxena S; Visram A; Lobo N; Mirza Z; Khan M; Pirabaharan B; Nguyen A; Fink LK

Pagination

pp. 1-19

Publisher

Association for Computing Machinery (ACM)

Publication Date

April 26, 2025

DOI

10.1145/3706598.3713910

Name of conference

Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems