[FFmpeg-devel] [RFC] Move ffplay engine to separate library

wm4 nfxjfg at googlemail.com
Tue Oct 1 00:25:49 CEST 2013

On Mon, 30 Sep 2013 23:29:33 +0200
Lukasz M <lukasz.m.luki at gmail.com> wrote:

> Hi,
> I'd like to propose a new library in the ffmpeg project that provides a
> middle layer between the current ffmpeg libraries and a player that uses them.
> There is a lot of work needed to make an ffmpeg-based player work
> correctly, and most of it is probably common to all players.
> It would be good to make this part of the ffmpeg project so it can be
> maintained centrally and people can contribute to it.
> I once made an iOS player for my own purposes, based on ffmpeg, with a
> ported ffplay as its core.
> Unfortunately it is hard to keep it up to date with git master; I believe
> there are many people who would appreciate this kind of library inside
> ffmpeg.
> I have done some work as a kind of "proof of concept". I pushed it to
> https://github.com/lukaszmluki/ffmpeg/tree/libavengine
> This is not ready to merge; on the contrary, a lot of things are
> unfinished. It is just a rough vision of how it could look.

Well... I think ffmpeg would benefit from the following things:

  1) Simplifying APIs to reduce the amount of code needed to accomplish
     simple things. Generally you need too much boilerplate for all
     kinds of rather simple day-to-day tasks.

  2) Moving ffplay parts to the core. (Why the heck does
     ffplay.c have a function like blend_subrect()?)

  3) Adding a high-level API that just allows opening a file and
     getting raw audio/video data out of it. All ffplay should care
     about is feeding data to SDL. Getting packets from the demuxer and
     feeding them to the decoder, determining PTS, setting up video and
     audio filters, etc. should all be done by the high-level API.
     Looking at ffms2 (https://github.com/FFMS/ffms2) might help give an
     idea of what such an API could look like, although ffms2 in
     particular was made for exact random access, not playback.
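To make point 3 concrete, such a high-level API might look roughly like the sketch below. All names here (pl_open, pl_read, PlayerFrame, etc.) are hypothetical, and the toy implementation just synthesizes dummy frames so the sketch is self-contained; a real implementation would wrap the libavformat/libavcodec/libavfilter calls behind these entry points:

```c
#include <stdint.h>
#include <stdlib.h>
#include <string.h>

/* Hypothetical frame handed to the player: already demuxed, decoded,
 * filtered and stamped with a presentation time -- the player's only
 * remaining job is to render it. */
typedef struct PlayerFrame {
    int is_audio;   /* 0 = video, 1 = audio */
    double pts;     /* presentation time in seconds */
    uint8_t *data;  /* raw pixels or samples */
    size_t size;
} PlayerFrame;

typedef struct PlayerContext {
    /* A real implementation would hold the demuxer, decoders and
     * filter graphs here; this dummy one only counts frames. */
    int frames_left;
} PlayerContext;

/* Open a media URL and set up demuxing/decoding behind the scenes. */
PlayerContext *pl_open(const char *url)
{
    PlayerContext *c = malloc(sizeof(*c));
    if (!c)
        return NULL;
    c->frames_left = 3;  /* dummy: pretend the "file" has 3 frames */
    (void)url;
    return c;
}

/* Pull the next ready-to-present frame; returns 0 on success, <0 at EOF. */
int pl_read(PlayerContext *c, PlayerFrame *out)
{
    if (c->frames_left <= 0)
        return -1;
    memset(out, 0, sizeof(*out));
    out->is_audio = 0;
    out->pts = 3 - c->frames_left;  /* dummy 1 fps timestamps */
    c->frames_left--;
    return 0;
}

void pl_close(PlayerContext *c)
{
    free(c);
}
```

With an interface shaped like this, the open/read/close loop is the entire surface a simple player has to learn, instead of the demuxer/decoder/filter plumbing.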

Taking the hairball that ffplay.c is and putting it into a library so
that everyone can stop bothering with the libav* low-level API isn't
really a solution. (If you just want to embed a video player - maybe
use something like libVLC?)

> In general I wanted to make a library that is platform independent (it
> depends only on ffmpeg). As it requires threads, synchronization, etc.,
> these would be provided as callbacks by the player (with ffplay acting
> as the built-in player). These callbacks would be global (initialized once).

IMO callbacks for very basic things like locking primitives are a really
crap approach to portability, but maybe that's just me.
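For reference, libavcodec's own av_lockmgr_register() is exactly this style of interface: the application registers one global callback that the library asks to create, obtain, release and destroy locks. The sketch below mirrors that shape with hypothetical names (lib_register_lock_cb, LockOp) and a pthread-based callback such as a "player" would supply; it is an illustration of the pattern being criticized, not FFmpeg's actual code:

```c
#include <pthread.h>
#include <stdlib.h>

/* Operations the library asks the application to perform on a lock. */
typedef enum LockOp {
    LOCK_CREATE,
    LOCK_OBTAIN,
    LOCK_RELEASE,
    LOCK_DESTROY
} LockOp;

typedef int (*lock_cb)(void **mutex, LockOp op);

/* Global callback, registered once by the application. */
static lock_cb g_lock_cb;

int lib_register_lock_cb(lock_cb cb)
{
    g_lock_cb = cb;
    return 0;
}

/* A pthread-based callback the "player" would supply. */
static int pthread_lock_cb(void **mutex, LockOp op)
{
    pthread_mutex_t *m;
    switch (op) {
    case LOCK_CREATE:
        m = malloc(sizeof(*m));
        if (!m || pthread_mutex_init(m, NULL))
            return -1;
        *mutex = m;
        return 0;
    case LOCK_OBTAIN:
        return pthread_mutex_lock(*mutex) ? -1 : 0;
    case LOCK_RELEASE:
        return pthread_mutex_unlock(*mutex) ? -1 : 0;
    case LOCK_DESTROY:
        pthread_mutex_destroy(*mutex);
        free(*mutex);
        *mutex = NULL;
        return 0;
    }
    return -1;
}
```

Every port has to write this boilerplate correctly before the library is even safe to use, which is the portability cost being objected to above.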

> There would also be callbacks for player-side tasks like audio
> initialization/closing, picture rendering, etc.
> The player would need to call (names are kept from ffplay)
> video_refresh periodically, and sdl_audio_callback whenever the audio
> system needs data.
> At this point I don't want to go into details; I just want to hear your
> opinions on the idea.
> Best Regards,
> Lukasz Marek
> _______________________________________________
> ffmpeg-devel mailing list
> ffmpeg-devel at ffmpeg.org
> http://ffmpeg.org/mailman/listinfo/ffmpeg-devel
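The pull model described in the proposal (the player's loop calls video_refresh periodically, and the audio backend invokes sdl_audio_callback when it needs samples) could be sketched as below. The Engine struct and its fields are hypothetical stand-ins for the real picture/sample queues, kept trivially simple so the sketch is self-contained:

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical engine state: in the real design this would hold the
 * decoded picture and sample queues; here it is just counters. */
typedef struct Engine {
    int pictures_shown;
    size_t samples_fed;
} Engine;

/* Called periodically by the player's event loop (name kept from
 * ffplay): decide whether a new picture is due and "display" it. */
void video_refresh(Engine *e)
{
    /* Real code would compare the queued frame's PTS against the
     * master clock before showing it; the dummy just counts. */
    e->pictures_shown++;
}

/* Called by the audio backend (e.g. SDL) whenever it needs data
 * (name kept from ffplay). */
void sdl_audio_callback(Engine *e, uint8_t *stream, int len)
{
    /* Real code would drain decoded samples into `stream`; the dummy
     * fills silence and accounts for what was consumed. */
    for (int i = 0; i < len; i++)
        stream[i] = 0;
    e->samples_fed += (size_t)len;
}
```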
