Interactive Storytelling with Gaze-Responsive Subtitles

Authors

Andrew T. Duchowski
Patrícia Araújo Vieira
Ítalo Alves Pinto de Assis
Krzysztof Krejtz
Dr Chris Hughes C.J.Hughes@salford.ac.uk
Pilar Orero
Abstract
The paper describes an eye-tracking framework for offline analysis of, and real-time interaction with, gaze-responsive subtitled media. The eventual goal is to introduce and evaluate gaze-responsive subtitles, which allow the video to be paused while subtitles are read. Initial modes of interaction include look-to-read and look-to-release: the former pauses the video as long as gaze is detected over the subtitles; the latter pauses the video until gaze falls on the subtitles. To avoid disrupting the perception of media content, an additional ambient soundtrack matched to the general content of the video is proposed. This is potentially revolutionary, as it would require an entirely novel approach to film direction. Just as Audio Description is now included in most modern films, ambient sound would also be needed to fill brief temporal gaps when the viewer's visual attention is directed toward subtitles. Concomitantly, the eye-tracking framework fosters quantitative analysis of attention to audiovisual content, complementing the qualitative evaluation on which most subtitling standardization is based.
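The two interaction modes can be sketched as a small gaze-contingent pause controller. This is a minimal illustration of the behaviour the abstract describes, not the paper's actual framework; all names (`GazeSubtitleController`, `SubtitleRegion`, the mode constants) are assumptions introduced here for clarity.

```python
from dataclasses import dataclass

# Illustrative sketch of the two interaction modes named in the abstract.
# Class, method, and constant names are assumptions, not the paper's API.

LOOK_TO_READ = "look-to-read"        # pause while gaze rests on subtitles
LOOK_TO_RELEASE = "look-to-release"  # pause until gaze reaches subtitles


@dataclass
class SubtitleRegion:
    """Axis-aligned subtitle area in normalised screen coordinates."""
    x: float
    y: float
    w: float
    h: float

    def contains(self, gx: float, gy: float) -> bool:
        return (self.x <= gx <= self.x + self.w
                and self.y <= gy <= self.y + self.h)


class GazeSubtitleController:
    """Decides, per gaze sample, whether the video should be paused."""

    def __init__(self, mode: str, region: SubtitleRegion) -> None:
        self.mode = mode
        self.region = region
        self._released = False  # latch for look-to-release mode

    def new_subtitle(self) -> None:
        """Reset the latch when a new subtitle appears on screen."""
        self._released = False

    def video_paused(self, gaze_x: float, gaze_y: float) -> bool:
        on_subtitles = self.region.contains(gaze_x, gaze_y)
        if self.mode == LOOK_TO_READ:
            # Pause for as long as gaze is detected over the subtitles.
            return on_subtitles
        # look-to-release: stay paused until gaze has fallen on the
        # subtitles at least once for the current subtitle.
        if on_subtitles:
            self._released = True
        return not self._released
```

Called once per gaze sample from the eye tracker, the controller returns whether playback should be held; look-to-release is modelled as a latch so that, once the subtitle has been looked at, the video keeps playing until the next subtitle resets it.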
| Presentation Conference Type | Conference Paper (published) |
|---|---|
| Conference Name | Proceedings of the ACM International Conference on Interactive Media Experiences Workshops |
| Start Date | Jun 3, 2025 |
| Acceptance Date | May 1, 2025 |
| Online Publication Date | Jun 3, 2025 |
| Deposit Date | Jun 4, 2025 |
| Publicly Available Date | Jun 10, 2025 |
| Peer Reviewed | Peer Reviewed |
| Pages | 19-25 |
| DOI | https://doi.org/10.5753/imxw.2025.9779 |
Files

Published Version (26.2 MB, PDF)
Publisher Licence URL: http://creativecommons.org/licenses/by/4.0/
You might also like

Real-Time Mobile Transition Matrix Entropy Based on Eye and Head Movements (2025) — Presentation / Conference Contribution
Subtitles in VR 360° video. Results from an eye-tracking experiment (2023) — Journal Article
Eye-tracked Evaluation of Subtitles in Immersive VR 360° Video (2023) — Presentation / Conference Contribution
VR 360º subtitles: designing a test suite with eye-tracking technology (2022) — Journal Article