[FFmpeg-user] Transcoding, recording, and displaying multiple webcams
rob51 at mac.com
Tue Dec 11 16:58:41 EET 2018
I've been using mjpg_streamer to display live video from 2 webcams on a
web page (in separate iframes).
It works, but now I need to also record the video while this happens.
Can ffmpeg do this?
Here's the setup:
* The webcams are typical USB cameras that output either motion JPEG
or raw video. I'm currently using 640x480 MJPEG @30 fps.
* They're connected to USB ports on a Raspberry Pi 3 B+.
* I want to record the video streams in a space-efficient format while
simultaneously serving them to browsers connected to the Pi's web server.
* Currently using 2 cameras but would like the option for 3 or 4.
The ffmpeg docs are pretty intimidating for this noob; what would the
command(s) to do this look like? (The feeds are available as /dev/video0
and /dev/video1.)
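A minimal sketch of the basic capture-and-record step, assuming a v4l2 webcam on /dev/video0 that offers MJPEG (the output filename is a placeholder):

```shell
# Capture 640x480 MJPEG at 30 fps from the first webcam and
# re-encode it to H.264 in an MP4 file (much smaller than MJPEG).
ffmpeg -f v4l2 -input_format mjpeg -framerate 30 -video_size 640x480 \
       -i /dev/video0 \
       -c:v libx264 -preset veryfast -crf 23 cam0.mp4
```

Running one such command per camera (with /dev/video1, cam1.mp4, and so on) is the simplest way to record several webcams at once, provided the Pi's CPU can keep up with the encodes.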
* Directly recording raw or MJPEG streams would use far too much disk
space. What would the best format for the disk files be? Could it be
something that would be easy to burn to a DVD? (just a thought, not
a high priority)
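On the disk-format question above: H.264 in an MP4 or Matroska container is a common space-efficient choice, and ffmpeg's `-target` presets can later convert a recording to DVD-compliant MPEG-2. A sketch, with placeholder filenames:

```shell
# Record compactly: raising -crf shrinks files at some quality cost
# (roughly, crf 23 is a good default, crf 28 noticeably smaller).
ffmpeg -f v4l2 -input_format mjpeg -i /dev/video0 \
       -c:v libx264 -preset veryfast -crf 28 cam0.mp4

# If a DVD-Video disc is ever needed, convert afterwards with a
# DVD target preset (ntsc-dvd or pal-dvd), which sets MPEG-2,
# frame size, rate, and muxing for authoring tools:
ffmpeg -i cam0.mp4 -target ntsc-dvd cam0_dvd.mpg
```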
* Would it be better and more CPU-efficient to start with the raw
video from the cameras?
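On raw vs. MJPEG input: raw capture skips a JPEG decode on the Pi but uses far more USB bandwidth, which can matter with several cameras on one bus. A sketch for checking what the camera offers and capturing raw YUYV instead (pixel format is an assumption; verify with the first command):

```shell
# List the pixel formats and sizes the camera actually supports.
ffmpeg -f v4l2 -list_formats all -i /dev/video0

# Capture raw YUYV (if offered) and encode it to H.264.
ffmpeg -f v4l2 -input_format yuyv422 -framerate 30 -video_size 640x480 \
       -i /dev/video0 \
       -c:v libx264 -preset ultrafast -crf 23 cam0.mp4
```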
* What's the best format to send to browsers, in terms of quality and
overall load on the Pi?
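For browsers, HLS (an .m3u8 playlist plus short .ts segments served as plain files by the existing web server) is one widely supported option. A sketch using ffmpeg's tee muxer so one encode feeds both the recording and the live stream (paths are placeholders):

```shell
# One H.264 encode, two outputs: an MP4 recording and an HLS
# playlist under the web root. -g 60 forces a keyframe every 2 s
# at 30 fps so segments can be cut cleanly.
ffmpeg -f v4l2 -input_format mjpeg -framerate 30 -video_size 640x480 \
       -i /dev/video0 \
       -map 0:v -c:v libx264 -preset veryfast -crf 23 -g 60 \
       -f tee "cam0.mp4|[f=hls:hls_time=4:hls_list_size=5:hls_flags=delete_segments]/var/www/html/cam0.m3u8"
```

The browser side then just points a `<video>` element (or hls.js on browsers without native HLS) at cam0.m3u8; encoding once and writing twice keeps the CPU load the same as recording alone.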
* Is there a way to "burn" software-generated text (similar to
captions) into one or more of the video feeds?
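Yes: the drawtext filter burns text into the frames. A sketch, assuming an ffmpeg build with libfreetype; the font path and label are placeholders:

```shell
# Overlay a label and the wall-clock time in the top-left corner,
# on a semi-transparent black box for readability.
ffmpeg -f v4l2 -input_format mjpeg -i /dev/video0 \
       -vf "drawtext=fontfile=/usr/share/fonts/truetype/dejavu/DejaVuSans.ttf:text='Cam 0 %{localtime}':x=10:y=10:fontsize=24:fontcolor=white:box=1:boxcolor=black@0.5" \
       -c:v libx264 -preset veryfast -crf 23 cam0_labeled.mp4
```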
* Is there a way to combine 2 input streams into a single "side by
side" video stream for recording? IOW, 2 views of the same action,
displayed in one video window and combined in one recording.
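The side-by-side question maps onto the hstack filter: take both camera inputs, stack them horizontally, and encode the composite once. A sketch, assuming both cameras run at the same size and rate (hstack requires matching heights):

```shell
# Read both webcams, place cam0 on the left and cam1 on the right,
# and record the combined 1280x480 picture as a single stream.
ffmpeg -f v4l2 -input_format mjpeg -framerate 30 -video_size 640x480 -i /dev/video0 \
       -f v4l2 -input_format mjpeg -framerate 30 -video_size 640x480 -i /dev/video1 \
       -filter_complex "[0:v][1:v]hstack=inputs=2[v]" \
       -map "[v]" -c:v libx264 -preset veryfast -crf 23 both.mp4
```

The same filter graph output can feed the tee muxer instead of a plain file, so the combined view can be recorded and served to browsers at the same time.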
Tips greatly appreciated.