[FFmpeg-devel] [RFC] Move ffplay engine to separate library

Lukasz M lukasz.m.luki at gmail.com
Sat Nov 23 01:26:05 CET 2013


On 1 October 2013 12:53, Stefano Sabatini <stefasab at gmail.com> wrote:

> On date Monday 2013-09-30 23:29:33 +0200, Lukasz M encoded:
> > Hi,
> >
> > I'd like to propose a new library for the ffmpeg project that provides a
> > middle layer between the current ffmpeg tools and a player that uses them.
> > There is a lot to do to make a player based on ffmpeg work correctly,
> > and most of it is probably common to all players.
> > It would be good to make this part of the ffmpeg project so it can be
> > maintained globally and people can contribute to it.
> >
> > Some time ago I made an iOS player for my own purposes, based on ffmpeg,
> > and I ported ffplay as its core.
> > Unfortunately it is hard to keep it up to date with git master, and I
> > believe there are many people who would appreciate this kind of library
> > inside ffmpeg.
> >
> > I did some work as a kind of "proof of concept" and pushed it to
> > https://github.com/lukaszmluki/ffmpeg/tree/libavengine
> >
> > This is not ready to merge; on the contrary, a lot of things are not
> > finished. It's just a vision of more or less how it would look.
> >
> > In general I wanted to make a library that is platform independent (it
> > depends only on ffmpeg). As it requires threads, synchronization, etc.,
> > these would be provided as callbacks by the player (ffplay as an internal
> > player). These callbacks would be global (initialized once).
> >
> > There would also be callbacks for player-side tasks like audio
> > initialization/closing, picture rendering, etc.
> > The player would need to call (names kept from ffplay) video_refresh
> > periodically and sdl_audio_callback whenever the audio system needs
> > data.
> >
>
> > At this moment I don't want to go into details; I just want to hear your
> > opinion about this idea.
>
> One problem with your approach is that you depend on SDL. Ideally a
> player should be able to select the output device depending on
> availability/preference, and a generic framework like FFmpeg should be
> able to abstract away from any specific implementation.
>
> As I envision FFmpeg, a player should be built from the basic
> blocks provided by the libraries, in particular by composing filters
> and input/output devices.
>
> For example you could have something like this:
>
> [input device] -> [movie source] -> [filters] -> [output device]
> |--------------------------------------------------------------|
>                             ^
>                             |
>                             v
>                     [control device]
>
> and provide some way to send interactive commands, e.g. to adjust
> filtering and/or to seek back.
>
> What we lack right now, among other things: better output
> devices (e.g. opengl, audio outputs), an efficient way to build the
> filtergraph, more filter commands (e.g. to seek back in the movie
> source), and a high-level API and/or high-level language bindings.


A bit of an old topic, but I have been working on it in my spare time.
Your vision of it works well. I made a proof of concept, and a simple player
using pulseaudio + xv works well.
It has CPU and memory usage very similar to ffplay's.
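
For illustration, here is a minimal sketch of how the quoted
"[movie source] -> [filters]" stage maps onto libavfilter, built around the
movie source filter; the file name and filter chain are placeholders rather
than what the proof of concept actually does:

#include <libavfilter/avfilter.h>
#include <libavutil/error.h>

/* Sketch only: demux/decode inside the graph via the movie source filter,
 * run the result through some filters and terminate in a buffersink the
 * player can pull frames from. */
static int open_movie_graph(AVFilterGraph **pgraph, AVFilterContext **psink)
{
    AVFilterGraph *graph = avfilter_graph_alloc();
    AVFilterInOut *inputs = NULL, *outputs = NULL;
    AVFilterContext *sink = NULL;
    int ret;

    if (!graph)
        return AVERROR(ENOMEM);

    /* The chain has no dangling input (movie is a source), only one
     * dangling video output, returned in "outputs". */
    ret = avfilter_graph_parse2(graph,
                                "movie=input.mkv, scale=640:-1, format=yuv420p",
                                &inputs, &outputs);
    if (ret < 0 || !outputs) {
        if (ret >= 0)
            ret = AVERROR_INVALIDDATA;
        goto fail;
    }

    /* Terminate the chain with a buffersink. */
    ret = avfilter_graph_create_filter(&sink,
                                       avfilter_get_by_name("buffersink"),
                                       "out", NULL, NULL, graph);
    if (ret < 0)
        goto fail;

    ret = avfilter_link(outputs->filter_ctx, outputs->pad_idx, sink, 0);
    if (ret < 0)
        goto fail;

    avfilter_inout_free(&inputs);
    avfilter_inout_free(&outputs);

    if ((ret = avfilter_graph_config(graph, NULL)) < 0)
        goto fail;

    *pgraph = graph;
    *psink  = sink;
    return 0;

fail:
    avfilter_inout_free(&inputs);
    avfilter_inout_free(&outputs);
    avfilter_graph_free(&graph);
    return ret;
}

After avfilter_graph_config() the player loop would just call
av_buffersink_get_frame() on the sink and hand each frame to whichever
output device was selected.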

One important thing I skipped is support for filters. I just used src +
sink for implicit format conversion when needed.
Can you elaborate on what you mean by an efficient way to build the
filtergraph?
I honestly haven't investigated the filter module deeply yet, but as far as
I can see, a graph is usually built by providing a string with a textual
description of the filtergraph.
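
The alternative to parsing a string that I can see is building the graph
programmatically. A minimal sketch, assuming a fixed s16/stereo/44100 input
format; on libavfilter versions that still require it, avfilter_register_all()
must have been called once at startup:

#include <libavfilter/avfilter.h>
#include <libavutil/error.h>

/* Sketch only: build abuffer -> volume -> abuffersink without a graph
 * description string.  Error handling trimmed to the essentials. */
static int build_audio_graph(AVFilterGraph **pgraph,
                             AVFilterContext **psrc, AVFilterContext **psink)
{
    AVFilterGraph *graph = avfilter_graph_alloc();
    AVFilterContext *src = NULL, *vol = NULL, *sink = NULL;
    int ret;

    if (!graph)
        return AVERROR(ENOMEM);

    /* Source: describes the frames the player will push into the graph. */
    ret = avfilter_graph_create_filter(&src, avfilter_get_by_name("abuffer"),
                                       "src",
                                       "sample_rate=44100:sample_fmt=s16:"
                                       "channel_layout=stereo:time_base=1/44100",
                                       NULL, graph);
    if (ret < 0)
        goto fail;

    /* A filter in the middle; here, volume at unity gain. */
    ret = avfilter_graph_create_filter(&vol, avfilter_get_by_name("volume"),
                                       "vol", "volume=1.0", NULL, graph);
    if (ret < 0)
        goto fail;

    /* Sink: the player pulls filtered frames from here. */
    ret = avfilter_graph_create_filter(&sink,
                                       avfilter_get_by_name("abuffersink"),
                                       "sink", NULL, NULL, graph);
    if (ret < 0)
        goto fail;

    if ((ret = avfilter_link(src, 0, vol, 0)) < 0 ||
        (ret = avfilter_link(vol, 0, sink, 0)) < 0 ||
        (ret = avfilter_graph_config(graph, NULL)) < 0)
        goto fail;

    *pgraph = graph;
    *psrc   = src;
    *psink  = sink;
    return 0;

fail:
    avfilter_graph_free(&graph);
    return ret;
}

This avoids formatting and parsing strings at runtime, but I don't know
whether that is the kind of efficiency you meant.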

This is just a high-level vision, but I think a player should be able to
change a filter's parameters at runtime. (I'm not sure, but I have a feeling
this was discussed at some point.)
For example, you may have a volume filter connected somehow to a slider in
the player's UI, and changing the slider position should just change the
volume option in the filter's configuration; a sketch of how I imagine that
is below.
I'm aware it is not that simple, but I'm asking for some hints, warnings
about pitfalls, and your opinion.
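
A minimal sketch of what I have in mind, assuming the volume filter instance
is named "vol" (as in the graph above) and that the volume filter accepts a
"volume" command:

#include <stdio.h>
#include <libavfilter/avfilter.h>
#include <libavutil/error.h>

/* Sketch only: UI slider callback that forwards the new value to the
 * volume filter instance named "vol" in an already configured graph. */
static int on_volume_slider(AVFilterGraph *graph, double value)
{
    char arg[32];
    char res[128] = { 0 };
    int ret;

    snprintf(arg, sizeof(arg), "%f", value);
    ret = avfilter_graph_send_command(graph, "vol", "volume", arg,
                                      res, sizeof(res), 0);
    if (ret < 0)
        fprintf(stderr, "Setting volume failed: %s\n", av_err2str(ret));
    return ret;
}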

Another thing that concerns me is what you mean by a filter command to seek
back in the movie. I assumed seeking requires changing the position in the
stream at the I/O (protocol) level.
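
That is, something along these lines, done on the demuxer (a sketch; the
player would still have to flush its packet queues, decoders and the
filtergraph afterwards):

#include <stdint.h>
#include <libavformat/avformat.h>

/* Sketch only: seek the whole file to target_sec seconds.  The caller is
 * expected to flush packet queues, call avcodec_flush_buffers() on each
 * open decoder and reset/flush the filtergraph afterwards. */
static int player_seek(AVFormatContext *ic, double target_sec)
{
    int64_t ts = (int64_t)(target_sec * AV_TIME_BASE);

    return avformat_seek_file(ic, -1, INT64_MIN, ts, INT64_MAX, 0);
}

So I'd like to understand how a seek command on the movie source would fit
into that picture.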

