[FFmpeg-user] synching multilingual subtitles using ffmpeg ...

Albretch Mueller lbrtchx at gmail.com
Fri Oct 19 11:24:16 CEST 2012


 I double as a teacher and tech monkey, and once saw a feature on
ted.com/talks that I think could be used very effectively for teaching
~
 1) go to: ted.com/talks/julian_assange_why_the_world_needs_wikileaks.html
~
 2) click on the "show transcript" JavaScript widget and select a
language (say Arabic)
~
 3) then click on any text segment and you will see how the video feed
gets synched to the point where that segment is spoken (in English, in
this case)
~
 They use the Flash player and JavaScript. If you look into the HTML
source you will see something like:
~
<p>
<a href="#420000" class="transcriptLink" rel="nofollow">CA: I mean, if
you did receive thousands</a>
<a href="#422000" class="transcriptLink" rel="nofollow">of U.S.
embassy diplomatic cables ...</a>
</p>
<p><a href="#425000" class="transcriptLink" rel="nofollow">JA: We
would have released them. (CA: You would?)</a>
</p>
~
 with a transcriptSeek function as part of a very long piece of
JavaScript. So all you need are those "#425000" timing/synching
anchors, which of course are derived from the timing of the audio and
video streams in the file
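~
 Those "#420000"-style anchors look like millisecond offsets from the
start of the video. Assuming that is what they are, here is a rough
Java sketch (the class and method names are just mine, for
illustration) that turns a few (offset, text) pairs scraped from the
transcript HTML into an SRT file, which ffmpeg can later mux back in
as a subtitle stream; the end of each cue is simply the start of the
next anchor, and the last end time is a guess:

import java.util.ArrayList;
import java.util.List;

public class TranscriptToSrt {

    // One caption segment: start/end offsets in milliseconds plus its text.
    static class Cue {
        final long startMs;
        final long endMs;
        final String text;
        Cue(long startMs, long endMs, String text) {
            this.startMs = startMs;
            this.endMs = endMs;
            this.text = text;
        }
    }

    // Format a millisecond offset as an SRT timestamp: HH:MM:SS,mmm
    static String srtTime(long ms) {
        long h = ms / 3600000L;
        long m = (ms % 3600000L) / 60000L;
        long s = (ms % 60000L) / 1000L;
        long milli = ms % 1000L;
        return String.format("%02d:%02d:%02d,%03d", h, m, s, milli);
    }

    static String toSrt(List<Cue> cues) {
        StringBuilder sb = new StringBuilder();
        int index = 1;
        for (Cue c : cues) {
            sb.append(index++).append('\n');
            sb.append(srtTime(c.startMs)).append(" --> ").append(srtTime(c.endMs)).append('\n');
            sb.append(c.text).append("\n\n");
        }
        return sb.toString();
    }

    public static void main(String[] args) {
        // Offsets lifted from the anchors above (#420000, #422000, #425000 are
        // taken to be milliseconds); the last end time is a guess, since the
        // page only exposes start anchors.
        List<Cue> cues = new ArrayList<Cue>();
        cues.add(new Cue(420000L, 422000L, "CA: I mean, if you did receive thousands"));
        cues.add(new Cue(422000L, 425000L, "of U.S. embassy diplomatic cables ..."));
        cues.add(new Cue(425000L, 428000L, "JA: We would have released them. (CA: You would?)"));
        System.out.print(toSrt(cues));
    }
}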
~
 I would like to use ffmpeg and C, C++ or Java to do such a thing
~
 I have read about extracting the sound stream or subtitles, editing
them and synching them back into the file using ffmpeg, but I haven't
been able to find such a feature
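~
 For the extract-and-mux-back part, the plain ffmpeg command line
already seems to cover it; here is a rough Java sketch that just
shells out to ffmpeg (the file names are made up, and it assumes the
container actually carries a text-based subtitle stream):

import java.io.IOException;
import java.util.Arrays;

// Rough sketch of driving ffmpeg from Java with ProcessBuilder.
// File names are placeholders; the ffmpeg options are the usual
// stream-mapping / codec-copy ones, nothing TED-specific.
public class MuxSubtitles {

    static void run(String... cmd) throws IOException, InterruptedException {
        Process p = new ProcessBuilder(cmd).inheritIO().start();
        if (p.waitFor() != 0) {
            throw new IOException("command failed: " + Arrays.toString(cmd));
        }
    }

    public static void main(String[] args) throws Exception {
        // 1) Pull an existing text subtitle stream out of the container,
        //    converting it to SRT (works only if the stream is text-based).
        run("ffmpeg", "-i", "talk.mkv", "-map", "0:s:0", "talk.en.srt");

        // 2) After editing/translating the .srt (or generating it from the
        //    transcript anchors), mux it back next to the original streams.
        run("ffmpeg", "-i", "talk.mkv", "-i", "talk.ar.srt",
            "-map", "0", "-map", "1",
            "-c", "copy",
            "talk.with-ar-subs.mkv");
    }
}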
~
 What steps should one go through to synch a textual feed and a video
feed à la TED talks?
~
 thanks
 lbrtchx

