Subtitles in VR 360° video. Results from an eye-tracking experiment

Status: VoR
cris.lastimport.scopus: 2025-08-25T03:12:23Z
dc.abstract.en: Virtual and Augmented Reality, collectively known as eXtended Reality, are key technologies for the next generation of human–computer–human interaction. In this context, 360° videos are becoming ubiquitous and are especially suitable for providing immersive experiences thanks to the proliferation of affordable devices. This new medium has untapped potential for the inclusion of modern subtitles to foster media content accessibility (Gejrot et al., Citation2021), e.g., for deaf or hard-of-hearing people, and to promote cultural inclusivity via language translation (Orero, Citation2022). Prior research on the presentation of subtitles in 360° videos relied on subjective methods and involved small numbers of participants (Brown et al., Citation2018; Agulló, Citation2019; Oncins et al., Citation2020), leading to inconclusive results. The aim of this paper is to compare two conditions of subtitles in 360° videos: position (head-locked vs. fixed) and colour (monochrome vs. colour). The empirical analysis relies on a novel triangulation of data from three complementary methods: psycho-physiological measures of attentional processes (eye movements), performance measures (media content comprehension), and subjective task-load and preference measures (self-reports). Results show that head-locked coloured subtitles are the preferred option.
dc.affiliation: Ośrodek Badań Okulograficznych, Wydział Psychologii w Warszawie
dc.affiliation: Instytut Psychologii
dc.affiliation: Wydział Psychologii w Warszawie
dc.contributor.author: Brescia-Zapata, Marta
dc.contributor.author: Krejtz, Krzysztof
dc.contributor.author: Duchowski, Andrew T.
dc.contributor.author: Hughes, Chris J.
dc.contributor.author: Orero, Pilar
dc.date.access: 2023-11-13
dc.date.accessioned: 2023-11-14T11:18:09Z
dc.date.available: 2023-11-14T11:18:09Z
dc.date.created: 2023-09-01
dc.date.issued: 2023-11-13
dc.description.accesstime: at_publication
dc.description.grantnumber: CA19142
dc.description.granttitle: COST Action LEAD ME
dc.description.physical: 1-24
dc.description.version: final_published
dc.identifier.doi: 10.1080/0907676X.2023.2268122
dc.identifier.eissn: 1747-6623
dc.identifier.issn: 0907-676X
dc.identifier.uri: https://share.swps.edu.pl/handle/swps/148
dc.identifier.weblink: https://www.tandfonline.com/doi/full/10.1080/0907676X.2023.2268122?src=
dc.language: en
dc.pbn.affiliation: psychologia
dc.rights: CC-BY-NC-ND
dc.rights.question: Yes_rights
dc.share.article: OTHER
dc.subject.en: Immersive environments
dc.subject.en: 360° video
dc.subject.en: media accessibility
dc.subject.en: subtitles
dc.subject.en: captions
dc.subject.en: eye-tracking
dc.swps.sciencecloud: nosend
dc.title: Subtitles in VR 360° video. Results from an eye-tracking experiment
dc.title.journal: Perspectives: Studies in Translation Theory and Practice
dc.type: JournalArticle
dspace.entity.type: Article