Automatic retrieval of closed captions for web clips from broadcast TV content
Authors
Hughes, CJ; Armstrong, M
Abstract
As broadcasters’ websites become more media-rich, it would be prohibitively expensive to manually caption all of the videos provided. However, many of these videos have been clipped from broadcast television and would have been captioned at the point of broadcast.

The recent FCC ruling requires all broadcasters to provide closed captions for all ‘straight lift’ video clips that have been broadcast on television from January 2016. From January 2017, captions will be required for ‘montages’, which consist of multiple clips, and the requirement to caption clips from live or near-live television will apply from July 2017.

This paper presents a method of automatically finding a match for a video clip within a set of off-air television recordings. It then shows how the required set of captions can be autonomously identified, retimed and reformatted for use with IP delivery. It also shows how captions can be retrieved for each sub-clip within a montage and combined to create a single set of captions. Finally, it describes how, with a modest amount of human intervention, live captions can be corrected for errors and timing to provide improved captions for video clips presented on the web.
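The abstract outlines four steps: matching a clip against off-air recordings, retiming and reformatting the matched captions for IP delivery, combining captions across the sub-clips of a montage, and correcting live captions. As a rough illustration of the first three, the Python sketch below uses a generic per-frame fingerprint alignment for the matching step and emits WebVTT as the IP-delivery format. Every name here (`Cue`, `best_offset`, `retime_cues`, `montage_cues`, `to_webvtt`) is hypothetical, and the fingerprint alignment is a common stand-in technique, not necessarily the method the paper itself describes.

```python
"""Illustrative sketch only: the paper's actual matching and retiming
pipeline is not reproduced here. All names are hypothetical."""
from dataclasses import dataclass


@dataclass
class Cue:
    start: float  # seconds from the start of the off-air recording
    end: float
    text: str


def best_offset(clip_fps: list[int], rec_fps: list[int]) -> int:
    """Generic stand-in for the clip-matching step: slide the clip's
    per-frame fingerprints (e.g. 64-bit perceptual hashes) along the
    recording's and return the frame index with the smallest total
    Hamming distance."""
    n = len(clip_fps)
    best_d, best_i = float("inf"), 0
    for i in range(len(rec_fps) - n + 1):
        d = sum(bin(a ^ b).count("1")
                for a, b in zip(clip_fps, rec_fps[i:i + n]))
        if d < best_d:
            best_d, best_i = d, i
    return best_i


def retime_cues(cues, clip_start, clip_end, offset=0.0):
    """Keep cues that overlap [clip_start, clip_end) in broadcast time,
    clip their edges, and shift them so the clip begins at `offset`."""
    kept = []
    for c in cues:
        if c.end <= clip_start or c.start >= clip_end:
            continue  # cue falls entirely outside the extracted clip
        kept.append(Cue(max(c.start, clip_start) - clip_start + offset,
                        min(c.end, clip_end) - clip_start + offset,
                        c.text))
    return kept


def montage_cues(subclips):
    """Combine per-sub-clip captions into a single track: `subclips`
    is a list of (cues, clip_start, clip_end) in montage order."""
    combined, pos = [], 0.0
    for cues, start, end in subclips:
        combined.extend(retime_cues(cues, start, end, offset=pos))
        pos += end - start
    return combined


def _ts(seconds: float) -> str:
    h, rem = divmod(seconds, 3600)
    m, s = divmod(rem, 60)
    return f"{int(h):02d}:{int(m):02d}:{s:06.3f}"


def to_webvtt(cues) -> str:
    """Serialise cues as WebVTT, one common caption format for IP delivery."""
    out = ["WEBVTT", ""]
    for c in cues:
        out += [f"{_ts(c.start)} --> {_ts(c.end)}", c.text, ""]
    return "\n".join(out)
```

In practice the matched frame index from `best_offset` would be converted to broadcast time via the recording's frame rate before calling `retime_cues`. The fourth step the abstract mentions, correcting live captions, involves human intervention by design and is not sketched here.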
Citation
Hughes, C., & Armstrong, M. (2015, April). Automatic retrieval of closed captions for web clips from broadcast TV content. Presented at 2015 NAB Broadcast Engineering Conference, Las Vegas, USA
| Presentation Conference Type | Other |
|---|---|
| Conference Name | 2015 NAB Broadcast Engineering Conference |
| Conference Location | Las Vegas, USA |
| Start Date | Apr 6, 2015 |
| End Date | Apr 11, 2015 |
| Publication Date | Apr 10, 2015 |
| Deposit Date | Jan 28, 2019 |
| Related Public URLs | http://www.nabstore.com/NAB_Broadcast_Engineering_Conference_Proceedings_p/cp150.htm |
| Additional Information | Event Type: Conference |