Subtitles in VR 360° video. Results from an eye-tracking experiment

Authors
Marta Brescia-Zapata
Krzysztof Krejtz
Andrew T. Duchowski
Chris J. Hughes (C.J.Hughes@salford.ac.uk)
Pilar Orero
Abstract
Virtual and Augmented Reality, collectively known as eXtended Reality, are key technologies for the next generation of human–computer–human interaction. In this context, 360° videos are becoming ubiquitous and are especially suitable for providing immersive experiences thanks to the proliferation of affordable devices. This new medium has untapped potential for the inclusion of modern subtitles to foster media content accessibility (Gejrot et al., 2021), e.g. for deaf and hard-of-hearing people, and also to promote cultural inclusivity via language translation (Orero, 2022). Prior research on the presentation of subtitles in 360° videos relied on subjective methods and involved small numbers of participants (Brown et al., 2018; Agulló, 2019; Oncins et al., 2020), leading to inconclusive results. The aim of this paper is to compare two subtitle variables in 360° videos: position (head-locked vs fixed) and colour (monochrome vs colour). The empirical analysis relies on a novel triangulation of data from three complementary methods: psycho-physiological measures of attentional processes (eye movements), performance measures (media content comprehension), and subjective task load and preferences (self-report measures). Results show that head-locked coloured subtitles are the preferred option.
Citation
Brescia-Zapata, M., Krejtz, K., Duchowski, A. T., Hughes, C. J., & Orero, P. (2023). Subtitles in VR 360° video. Results from an eye-tracking experiment. Perspectives, 1-23. https://doi.org/10.1080/0907676x.2023.2268122
| Journal Article Type | Article |
| --- | --- |
| Acceptance Date | Sep 1, 2023 |
| Online Publication Date | Nov 13, 2023 |
| Publication Date | Nov 13, 2023 |
| Deposit Date | Nov 13, 2023 |
| Publicly Available Date | Nov 14, 2023 |
| Journal | Perspectives |
| Print ISSN | 0907-676X |
| Publisher | Taylor & Francis (Routledge) |
| Peer Reviewed | Peer Reviewed |
| Pages | 1-23 |
| DOI | https://doi.org/10.1080/0907676x.2023.2268122 |
| Keywords | Linguistics and Language |
Files
Published Version
(3.4 Mb)
PDF
Publisher Licence URL
http://creativecommons.org/licenses/by-nc-nd/4.0/
You might also like
Eye-tracked Evaluation of Subtitles in Immersive VR 360° Video
(2023)
Presentation / Conference Contribution
VR 360º subtitles: designing a test suite with eye-tracking technology
(2022)
Journal Article
3D Gaze in Virtual Reality: Vergence, Calibration, Event Detection
(2022)
Journal Article
Universal access: user needs for immersive captioning
(2021)
Journal Article