[FFmpeg-user] Realtime Audio Transcoding delay

shahab shirazi shahab.sh.70 at gmail.com
Fri Dec 21 23:01:18 CET 2012


Hi,

I'm writing an Android app. This app gets audio stream from the mic,
encodes it and sends this stream to my server (through a tcp connection).
The server is supposed to read this stream, transcode it on the fly and
send it to an external device which only supports g726 audio format.

Now the problem is the delay, which is currently about 10 seconds. After some
digging, here is what I found:
The Android app -> server leg seems to be working fine. It sends an audio
stream in AMR-NB format, 8000 Hz, 1 channel, flt, 12 kb/s.
The server feeds ffmpeg (or vlc) at about 1.6 KBytes/s, and it takes about 8
seconds for ffmpeg/vlc to output any data. This is the bottleneck.

Here is the command I used for ffmpeg:
    ffmpeg -i - -acodec g726 -ar 8k -ac 1 -b:a 8k -f wav -

Input stream is passed to the stdin of ffmpeg and the transcoded stream is
read from the stdout.
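One thing I plan to try next is shrinking ffmpeg's input probing window, since by default it buffers a few seconds of input to detect the stream parameters before any output is produced. A sketch of the variant (untested on my side; it assumes the app sends a raw AMR stream with the standard "#!AMR" header, so "-f amr" applies):

```shell
# Declare the input format explicitly and shrink the probe window so
# ffmpeg does not buffer several seconds of stdin before transcoding.
ffmpeg -f amr -probesize 32 -analyzeduration 0 -fflags nobuffer \
    -i - \
    -acodec g726 -ar 8k -ac 1 -b:a 8k \
    -f wav -
```

If that helps, the remaining delay would presumably come from buffering on my server or on the device side rather than from ffmpeg itself.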

I tried different formats (e.g. vorbis) as the input or output, with
different parameters, but the delay remained unacceptable.
Why is the delay so high? Can you think of any way to reduce it?

I have full control over the server and the app but Android (2.3+) only
supports encoding AAC-LC, AMR-NB, AMR-WB.

