[FFmpeg-user] synching multilingual subtitles using ffmpeg ...

Francois Visagie francois.visagie at gmail.com
Fri Oct 19 13:08:38 CEST 2012


> -----Original Message-----
> From: ffmpeg-user-bounces at ffmpeg.org [mailto:ffmpeg-user-
> bounces at ffmpeg.org] On Behalf Of Albretch Mueller
> Sent: 19 October 2012 11:24
> To: ffmpeg-user at ffmpeg.org
> Subject: [FFmpeg-user] synching multilingual subtitles using ffmpeg ...
> 
>  I double as teacher and tech monkey and once saw a functionality on
> ted.com/talks that I think could be used very effectively for teaching ~
>  1) go: ted.com/talks/julian_assange_why_the_world_needs_wikileaks.html ~
>  2) click on the "show transcript" JavaScript widget to select a language
> (say Arabic) ~
>  3) then click on any text segment and you will see how the video feed
> gets synched to what is said in English (in this case) ~
>  They use Flash player and JavaScript. If you look into the HTML source
> you will see something like: ~
>
> <p>
> <a href="#420000" class="transcriptLink" rel="nofollow">CA: I mean, if you
> did receive thousands</a> <a href="#422000" class="transcriptLink"
> rel="nofollow">of U.S. embassy diplomatic cables ...</a>
> </p>
> <p><a href="#425000" class="transcriptLink" rel="nofollow">JA: We would
> have released them. (CA: You would?)</a> </p> ~
>
>  with a transcriptSeek function as part of a very long piece of JavaScript.
> So all you need are those "#425000" timing/synching anchors, which of
> course are extracted from the audio stream + video synching track in the
> video ~
>  I would like to use ffmpeg and C, C++ or Java to do such a thing ~
>  I have read about extracting the sound stream or subtitles, editing them
> and synching them back into the file using ffmpeg, but I haven't been able
> to find such a functionality ~
>  What would be the steps one should follow to synch a textual and video
> feed a la TED talks?

Certainly a worthwhile concept for teaching purposes. 

In some cases a video already contains superimposed (burned-in) or encoded
subtitles, which you can extract with suitable tools and varying degrees of
difficulty; in other cases you need to author the subtitles yourself from
scratch.
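
If the subtitles are present as a separate text-based stream (soft subs),
ffmpeg itself can pull them out and put an edited version back. A rough
sketch, assuming made-up file names, the first subtitle stream, and an MP4
target; bitmap subtitle formats such as DVB or PGS would need OCR instead:

  # extract the first text subtitle stream to SRT
  ffmpeg -i input.mkv -map 0:s:0 transcript.srt

  # remux the video with an edited SRT as a soft subtitle track
  ffmpeg -i input.mp4 -i transcript.srt -map 0:v -map 0:a -map 1:0 \
         -c copy -c:s mov_text output.mp4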

In its most basic form a subtitle source is a plain-text transcript
containing the spoken utterances only. A timed subtitle file adds time or
frame-count references which enable a player or other rendering tool to
synchronise the text with the target video/audio.
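
The "#420000"-style anchors on the TED page are just millisecond offsets, so
they map directly onto subtitle timestamps (420000 ms = 00:07:00,000 in SRT
notation). A hand-made fragment built from the transcript you quoted could
look like the following; the end times are guesses, since the page only
exposes start offsets:

  1
  00:07:00,000 --> 00:07:02,000
  CA: I mean, if you did receive thousands

  2
  00:07:02,000 --> 00:07:05,000
  of U.S. embassy diplomatic cables ...

  3
  00:07:05,000 --> 00:07:08,000
  JA: We would have released them. (CA: You would?)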

Tools like www.aegisub.org attempt to make creating and timing subtitles
easier. Aegisub displays your authored subtitles on top of the target video
as they would appear, can show an audio waveform to make identifying
utterance timing easier, and includes some basic timing tools.
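
Since you mentioned Java: converting those millisecond anchors into SRT
timestamps is plain integer arithmetic; scraping the anchors out of the HTML
is the only TED-specific part. A minimal sketch (class and method names are
my own invention, not anything ffmpeg provides):

  // Turns a TED-style millisecond anchor (e.g. 425000) into an SRT
  // timestamp of the form HH:MM:SS,mmm.
  public class SrtTime {
      static String fromMillis(long ms) {
          long h = ms / 3600000;
          long m = (ms / 60000) % 60;
          long s = (ms / 1000) % 60;
          long frac = ms % 1000;
          return String.format("%02d:%02d:%02d,%03d", h, m, s, frac);
      }

      public static void main(String[] args) {
          System.out.println(fromMillis(425000)); // prints 00:07:05,000
      }
  }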

www.videohelp.com has a wealth of further information on subtitles.

Enjoy!

> ~
>  thanks
>  lbrtchx
>  ffmpeg-user at ffmpeg.org: synching multilingual subtitles using ffmpeg ...


