[FFmpeg-user] syncing live streams when mosaicing

Daniel Oberhoff daniel at danieloberhoff.de
Wed Feb 6 00:47:30 CET 2013


We need to fuse live streams from two cameras, so that frames always 
come in corresponding pairs. We have had some success using ffmpeg like 

ffmpeg -rtsp_transport tcp -r 25 -i rtsp://user:password@ \
       -rtsp_transport tcp -r 25 -i rtsp://user:password@ \
       -filter_complex "[0:0]pad=iw*2:ih[a];[a][1:0]overlay=w" ...

which arranges the two streams side by side. The problem is that ffmpeg 
starts reading the two streams at different times, yet pairs the first 
frame of stream a with the first frame of stream b, and so on, so the 
streams remain time-shifted relative to each other by the starting 
delay forever.

Now, we know the streams are timestamped correctly via the pts (in an 
h264 stream). Can ffmpeg somehow use the pts to avoid the shift?
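A sketch of the direction we imagine (untested; the option names below are our best guess at what to look for, not a confirmed recipe):

```shell
# Guess: the input-side "-r 25" may be forcing a constant frame rate and
# discarding the demuxed pts, so dropping it might let the overlay filter
# pair frames by timestamp instead of by arrival order.
ffmpeg -rtsp_transport tcp -i rtsp://user:password@ \
       -rtsp_transport tcp -i rtsp://user:password@ \
       -filter_complex "[0:0]pad=iw*2:ih[a];[a][1:0]overlay=w" ...

# If the in-stream pts of the two cameras are not on a common clock,
# stamping each input with the receiver's wall clock might also work:
#   -use_wallclock_as_timestamps 1   (given per input, before each -i)
```

Is either of these the right track, or is there a dedicated mechanism for this?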

Help much appreciated!
