[FFmpeg-user] Encoding and streaming raw pixel stream via WiFi

Moritz Barsnick barsnick at gmx.net
Wed May 17 23:11:30 EEST 2017


On Wed, May 17, 2017 at 19:29:52 +0200, Ralf Ramsauer wrote:

So many questions in just one e-mail. *sigh* ;-)

> My intention is to capture a v4l2 cam device, do some magic with opencv
> and stream with ffserver and RTSP via WiFi, as the device is a flying
> microcopter platform with an unstable WiFi connection.

May I ask what you do with opencv? Is it something an ffmpeg filter
could perhaps do?

> However, the pixel stream has no fixed framerate; it produces as many
> frames as possible. IOW, a new frame is sent to stdout as soon as it is
> available. Effectively, this results in a framerate of about ~28fps.

I'm not sure ffmpeg handles variable framerates as a raw input (though
my opinion is it should).

> This is how I invoke ffmpeg:
> 
> ./my_opencv_app | ffmpeg -f rawvideo -r 30 -pixel_format gray
> -video_size 640x480 -tune zerolatency -i - http://localhost:8081/foo.ffm
> 
> Please find the corresponding ffserver.conf below.
> 
> Everything *somehow* works, but not as intended. The video delay is
> constantly increasing, and it produces high network load (~1.4MB/s) for
> rather simple pictures (grayscaled).
> 
> - My opencv app currently produces as many frames as possible. Should it
>   produce frames at a (more or less) constant frame rate?

I'm not sure, I'll let someone else answer. Optimally, ffmpeg would add
the wallclock as timestamp when receiving, and make a VFR stream of it.
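If it helps, a hedged sketch of that idea: ffmpeg has a generic demuxer
option, -use_wallclock_as_timestamps, which stamps input packets with
the wall clock. Whether it plays nicely with piped rawvideo here is an
assumption on my part, not something I have verified:

```shell
# Sketch (untested assumption): timestamp incoming raw frames with the
# wall clock instead of a fixed -r rate, and keep the output VFR.
./my_opencv_app | ffmpeg -f rawvideo -use_wallclock_as_timestamps 1 \
    -pixel_format gray -video_size 640x480 -i - \
    -vsync vfr http://localhost:8081/foo.ffm
```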

> - Apparently, this approach makes ffmpeg use the mpeg4 codec by
>   default. This results in high network load, though my frames are only
>   grayscaled.

This depends on how ffmpeg was configured and built, and on ffmpeg's
defaults for the output format. If your ffmpeg supports other codecs,
you are free to say "-c:v othercodec".

(BTW, I believe ffmpeg's mpeg4 encoder defaults to 200 kbits/s, that is
not really high network load!)
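(For comparison: unencoded gray 640x480 at ~28 fps is 640*480*28 bytes
≈ 8.6 MB/s, so your 1.4 MB/s already reflects compression.) To pick the
codec and bitrate explicitly, something like the following, assuming
your build includes libx264 (check with "ffmpeg -codecs"):

```shell
# Hedged example: explicit codec and bitrate instead of the mpeg4
# default. libx264 is an assumption about your ffmpeg build.
./my_opencv_app | ffmpeg -f rawvideo -r 30 -pixel_format gray \
    -video_size 640x480 -i - \
    -c:v libx264 -b:v 500k http://localhost:8081/foo.ffm
```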

>   So I tried to switch to h26[45]. This broke everything. I get a bunch
>   of errors and no video stream when I try to watch it with mplayer.

Now I have a question: If you have an issue you would like help with,
why don't you show us these errors? Where do we get crystal balls from?
In other words: the command line and the complete, uncut console output
are missing.

>   I tried to replace my opencv app by a direct stream of the webcam with
>   ffmpeg. Same issues.

ffmpeg -f v4l2 -i /dev/video0 ??
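I.e., something like this minimal test to take your opencv app out of
the equation entirely (the feed URL is assumed to match your
ffserver.conf):

```shell
# Sketch: feed the webcam directly into the ffserver feed, to isolate
# whether the errors come from the pipe or from the encoding/serving.
ffmpeg -f v4l2 -i /dev/video0 http://localhost:8081/foo.ffm
```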

> - What are 'recommended' codec settings for my use case? Which codec
>   would probably be the best for me? (Sorry, I'm not an A/V guy ;) )

What is your case? I.e. what do the clients require? How much bandwidth
do you have available? How much latency can you tolerate? How much CPU
power do you have to spare? A/V is not trivial, indeed.

> - I have to specify the framerate as a ffmpeg parameter "-r 30", and in
>   the ffserver.conf. Why do I have to specify it twice? Why do I have to
>   specify a framerate at all?

I don't know much about ffserver. I pass.

> - I'd like to keep latency as low as possible. Are there some special
>   tweaks for achieving low latency with ffmpeg?

Certainly. Let me google that for you:
https://trac.ffmpeg.org/wiki/StreamingGuide#Latency
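The usual knobs from that page, sketched for your pipeline (the flag
values are illustrative, not tuned, and again libx264 is an assumption
about your build). Note that -tune zerolatency is an x264 output
option, so it belongs after -i, not before it as in your command line:

```shell
# Hedged sketch of common low-latency x264 settings; real latency also
# depends on ffserver buffering and the WiFi link.
./my_opencv_app | ffmpeg -f rawvideo -r 30 -pixel_format gray \
    -video_size 640x480 -i - \
    -c:v libx264 -preset ultrafast -tune zerolatency \
    -b:v 500k http://localhost:8081/foo.ffm
```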

Sorry, I left some stuff unanswered for others to pick up. :)

Moritz
