[FFmpeg-user] capture from camera (macOS) and pass video to ffplay

Peter Gusev gpeetonn at gmail.com
Wed Sep 12 02:25:04 EEST 2018

Hi there! I need to capture video from a camera (macOS) and pass it
through a file pipe to ffplay for display. I can't figure out how to
specify ffmpeg's output pixel format: when I pass -pix_fmt argb, ffmpeg
says the format is incompatible with the input device (?). I expected
ffmpeg to transcode whatever the incoming pixel format is into the one I
specified, so that ffplay can read from the file pipe like this: "ffplay
-f rawvideo -vcodec rawvideo -s 320x180 -i file_pipe".
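For reference, here's roughly what I'm attempting; option placement is my
best guess at the problem, since ffmpeg applies options before -i to the
input device and options after -i to the output (the avfoundation device
index "0" and frame rate are guesses for my setup):

```shell
# Guess: options BEFORE -i configure the capture device; -pix_fmt placed
# AFTER -i is an output option, so ffmpeg should convert whatever the
# camera delivers into argb rather than demanding argb from the device.
ffmpeg -f avfoundation -framerate 30 -i "0" \
       -f rawvideo -pix_fmt argb -s 320x180 -y file_pipe
```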
Also, can ffmpeg write to a file pipe on disk (as opposed to the pipe:
protocol <http://ffmpeg.org/ffmpeg-protocols.html#pipe>)?
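In other words, I'm hoping a named-pipe (FIFO) setup like the following
would work, since a FIFO looks like an ordinary file path to both tools
(the FIFO path and the avfoundation device "0" are placeholders):

```shell
# Create a named pipe; ffmpeg should treat it like a regular output file.
mkfifo /tmp/video.fifo

# Writer: capture from the camera, convert to raw ARGB, write to the FIFO.
# (-y is needed because the FIFO already exists on disk.)
ffmpeg -f avfoundation -framerate 30 -i "0" \
       -f rawvideo -pix_fmt argb -s 320x180 -y /tmp/video.fifo &

# Reader: ffplay opens the same path; opening a FIFO blocks until both
# the writer and the reader have attached.
ffplay -f rawvideo -pixel_format argb -video_size 320x180 /tmp/video.fifo
```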


Peter Gusev <https://www.linkedin.com/in/peter-gusev-8135441a/>
gpeetonn at gmail.com

+1 213 587-27-48

Research Scholar @ REMAP UCLA <http://www.remap.ucla.edu/home/about>
Video streaming/ICN networks/Creative Coding/Interactive Media
dj peetonn <https://soundcloud.com/peter-gusev>
