Interactive Storytelling with Gaze-Responsive Subtitles

Status: VoR
dc.abstract.en: The paper describes an eye-tracking framework for offline analysis of and real-time interaction with gaze-responsive subtitled media. The eventual goal is to introduce and evaluate gaze-responsive subtitles, which afford pausing of the video while subtitles are read. Initial modes of interaction include look-to-read and look-to-release. The former pauses the video as long as gaze is detected over the subtitles; the latter pauses the video until gaze falls on the subtitles. To avoid disrupting perception of the media content, an additional ambient soundtrack matched to the general content of the video is proposed. Note that this is potentially revolutionary, as it would require an entirely novel approach to film direction. Just as Audio Description is now included in most modern films, ambient sound would also be needed to fill brief temporal gaps when the user’s visual attention is directed toward the subtitles. Concomitantly, the eye-tracking framework fosters quantitative analysis of attention to audiovisual content, apart from the qualitative evaluation on which most subtitling standardization is based.
dc.affiliation: Wydział Psychologii w Warszawie
dc.affiliation: Instytut Psychologii
dc.conference: 2025 ACM International Conference on Interactive Media Experiences
dc.conference.country: Brazil
dc.conference.coverage: international
dc.conference.datefinish: 2025-06-06
dc.conference.datestart: 2025-06-03
dc.conference.place: Rio de Janeiro
dc.conference.series: ACM International Conference on Interactive Media Experiences
dc.conference.seriesshortcut: IMX
dc.conference.seriesweblink: https://imx.acm.org/
dc.conference.shortcut: IMX 2025
dc.conference.weblink: https://imx.acm.org/2025/
dc.contributor.author: Duchowski, Andrew T.
dc.contributor.author: Araújo Vieira, Patrícia
dc.contributor.author: Pinto de Assis, Ítalo Alves
dc.contributor.author: Krejtz, Krzysztof
dc.contributor.author: Hughes, Chris J.
dc.contributor.author: Orero, Pilar
dc.contributor.editor: Ghinea, George
dc.contributor.editor: Conci, Aura
dc.contributor.editor: Van den Broeck, Wendy
dc.contributor.editor: Viterbo, José
dc.contributor.editor: Willrich, Roberto
dc.date.access: 2025-06-06
dc.date.accessioned: 2025-11-07T12:52:07Z
dc.date.available: 2025-11-07T12:52:07Z
dc.date.created: 2025
dc.date.issued: 2025
dc.description.abstract: The paper describes an eye-tracking framework for offline analysis of and real-time interaction with gaze-responsive subtitled media. The eventual goal is to introduce and evaluate gaze-responsive subtitles, which afford pausing of the video while subtitles are read. Initial modes of interaction include look-to-read and look-to-release. The former pauses the video as long as gaze is detected over the subtitles; the latter pauses the video until gaze falls on the subtitles. To avoid disrupting perception of the media content, an additional ambient soundtrack matched to the general content of the video is proposed. Note that this is potentially revolutionary, as it would require an entirely novel approach to film direction. Just as Audio Description is now included in most modern films, ambient sound would also be needed to fill brief temporal gaps when the user’s visual attention is directed toward the subtitles. Concomitantly, the eye-tracking framework fosters quantitative analysis of attention to audiovisual content, apart from the qualitative evaluation on which most subtitling standardization is based.
dc.description.accesstime: at_publication
dc.description.physical: 19-25
dc.description.sdg: QualityEducation
dc.description.sdg: GenderEquality
dc.description.sdg: ReducedInequalities
dc.description.series: Proceedings of the ACM International Conference on Interactive Media Experiences Workshops
dc.description.version: final_published
dc.identifier.doi: 10.5753/imxw.2025.9779
dc.identifier.uri: https://share.swps.edu.pl/handle/swps/1971
dc.identifier.weblink: https://sol.sbc.org.br/index.php/imxw/article/view/35221/35011
dc.language: en
dc.pbn.affiliation: psychologia
dc.publisher.ministerial: Association for Computing Machinery (ACM)
dc.relation.book: 2025: Proceedings of the ACM International Conference on Interactive Media Experiences Workshops
dc.relation.pages: 159
dc.rights: CC-BY
dc.rights.question: Yes_rights
dc.share.mono: OTHER
dc.subject.en: eye tracking
dc.subject.en: subtitles
dc.subject.en: interaction
dc.swps.sciencecloud: send
dc.title: Interactive Storytelling with Gaze-Responsive Subtitles
dc.title.journal: Proceedings of the ACM International Conference on Interactive Media Experiences Workshops (ACM IMXw 2025)
dc.type: MonographyChapterConference
dspace.entity.type: Book
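
The look-to-read and look-to-release modes named in the abstract lend themselves to a brief illustration. The sketch below is an assumption-laden reconstruction, not the authors' framework: GazeSample, Rect, and VideoPlayer are hypothetical stand-ins for an eye-tracker sample, the on-screen subtitle region, and a pausable media player.

# Minimal sketch (assumed, not the authors' implementation) of the two
# gaze-contingent modes described in the abstract. Rect, GazeSample and
# VideoPlayer are hypothetical placeholder types.
from dataclasses import dataclass


@dataclass
class Rect:
    x: float
    y: float
    w: float
    h: float

    def contains(self, px: float, py: float) -> bool:
        return self.x <= px <= self.x + self.w and self.y <= py <= self.y + self.h


@dataclass
class GazeSample:
    x: float
    y: float


class VideoPlayer:
    """Stub player; a real one would wrap a media backend with pause/play."""
    def __init__(self) -> None:
        self.paused = False

    def pause(self) -> None:
        self.paused = True

    def play(self) -> None:
        self.paused = False


def look_to_read(gaze: GazeSample, subtitles: Rect, player: VideoPlayer) -> None:
    """Pause the video while gaze dwells on the subtitle area; resume otherwise."""
    if subtitles.contains(gaze.x, gaze.y):
        player.pause()
    else:
        player.play()


def look_to_release(gaze: GazeSample, subtitles: Rect, player: VideoPlayer,
                    unread: bool) -> bool:
    """Keep the video paused until gaze falls on the subtitle area.

    Returns the updated 'unread' flag: True while the current subtitle has not
    yet been looked at, False once it has been read and playback is released.
    """
    if unread and subtitles.contains(gaze.x, gaze.y):
        unread = False
    if unread:
        player.pause()
    else:
        player.play()
    return unread

In use, a framework of this kind would call one of these handlers on every gaze sample for the currently displayed subtitle, with the active mode deciding whether playback is held or released.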