[Libav-user] need advice on new player
nicolas.george at normalesup.org
Mon Feb 20 00:05:52 CET 2012
On primidi, 1er Ventôse, an CCXX, Deron wrote:
> So, I presume then that avformat_open_input (or?) does the interpreting
> in the background and amends the stream list etc so that packet info is
> basically the output from the filtergraph? How would I implement that in
> some other app? Pass "lavfi" to avformat_open_input as the input format?
Exactly, and the filtergraph as the file name. But if you write your own
application, you can also use audio filters directly.
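As a sketch of that approach, something like the following should work with a reasonably recent FFmpeg (untested here; the filtergraph string is just an illustrative example):

```c
/* Sketch: opening a lavfi filtergraph as if it were an input file.
 * Requires libavdevice and libavformat. */
#include <libavdevice/avdevice.h>
#include <libavformat/avformat.h>

int open_lavfi_graph(AVFormatContext **ctx, const char *graph)
{
    avdevice_register_all();                 /* registers the lavfi device */
    const AVInputFormat *lavfi = av_find_input_format("lavfi");
    if (!lavfi)
        return AVERROR_DEMUXER_NOT_FOUND;
    /* The filtergraph description takes the place of the file name. */
    return avformat_open_input(ctx, graph, lavfi, NULL);
}

/* e.g. open_lavfi_graph(&ctx, "testsrc=size=320x240:rate=25"); */
```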
> Looking at the existing devices, I only see 4 output devices (alsa, sdl,
> oss, and sndio). None of those work like the decklink (well, SDL can be
> the keeper of time as demonstrated in ffplay but the libavdevice simply
> draws every time it gets an image). Since timing was the biggest mplayer
> problem, I am not at all confident that this can be made to work.
> I'm willing to try, if you can confirm that this is not an issue. As I
> understand it, the device will need to create 2 queues (one for video,
> one for audio), with timing, and then simply not return(block) when a
> highwater mark is reached in the queues to stop ffmpeg from further
> decoding until the decklink catches up. Any examples of this in action
> as a libavdevice? Or more complex (output) libavdevices?
I do not know how the decklink card works at all, so I can only make some
guesses.
An output device in lavd looks almost exactly like a muxer: you can try to
create various streams in it (though usually it will cause an error if they
are not the expected number and type; the same would happen if you tried to
mux video into WAV), and then mux packets. The device will usually want its
streams to be raw video or PCM, but there is no constraint.
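To illustrate the muxer-like shape of an output device, here is an untested sketch using the modern FFmpeg API; the "alsa" device and the PCM parameters are illustrative assumptions, not taken from this mail:

```c
/* Sketch: opening an output device exactly as one would open a muxer. */
#include <libavdevice/avdevice.h>
#include <libavformat/avformat.h>

int open_audio_device(AVFormatContext **ctx)
{
    int ret;
    AVStream *st;

    avdevice_register_all();
    /* "alsa" names the device/muxer, "default" acts as its "file name". */
    ret = avformat_alloc_output_context2(ctx, NULL, "alsa", "default");
    if (ret < 0)
        return ret;
    st = avformat_new_stream(*ctx, NULL);
    if (!st)
        return AVERROR(ENOMEM);
    st->codecpar->codec_type  = AVMEDIA_TYPE_AUDIO;
    st->codecpar->codec_id    = AV_CODEC_ID_PCM_S16LE; /* raw PCM, as noted above */
    st->codecpar->sample_rate = 48000;
    /* set channel layout etc. here, then: */
    ret = avformat_write_header(*ctx, NULL);
    /* ...followed by av_interleaved_write_frame() for each PCM packet. */
    return ret;
}
```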
As for timing, there are various solutions:
The audio devices usually write continuously to the sound card, either
blocking or returning EAGAIN (depending on whether nonblocking mode has been
activated) when the driver's buffer is full. There is an API to get accurate
output timestamps and allow syncing with other events:
av_get_output_timestamp; currently, only ALSA implements it.
The SDL driver is assumed to be instantaneous: whatever you write_frame() to
it is considered to be displayed on the screen immediately. Delaying and
syncing are left to the application. For that reason, if you use ffmpeg -i
... -f sdl -, the video will play as fast as your CPU allows, completely
disregarding the normal speed. You can get slightly better results with -re,
but that is only an approximation.
If you wanted to write an application using avdevice for both audio and
video output, I guess you would assume the video output is instantaneous,
feed the audio continuously to the audio device, keep a queue of the decoded
video frames and send them to the video device when av_get_output_timestamp
on the audio device goes beyond their PTS.
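The sync rule just described can be sketched as a small pure function; the audio clock is passed in as a plain number here, whereas in a real player it would come from av_get_output_timestamp() on the audio device (all names below are made up for illustration):

```c
#include <stddef.h>
#include <stdint.h>

/* Count how many queued video frames (sorted by PTS) are due for display,
 * i.e. whose PTS the audio clock has already reached or passed. */
size_t frames_due(const int64_t *pts_queue, size_t n, int64_t audio_clock)
{
    size_t i = 0;
    while (i < n && pts_queue[i] <= audio_clock)
        i++;
    return i;
}
```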
If the decklink card can handle both audio and video, I guess you would need
to write an avdevice muxer that is able to accept both a video and an audio
stream and internally ensure the sync, either blocking or returning EAGAIN.
And if the video output has a delay too... then the problem is not easy.
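The high-water-mark queue mentioned above can be sketched as a fixed-capacity ring buffer whose push reports "would block" (EAGAIN-style) when full, so the caller stops decoding until the device drains it. The capacity and all names here are hypothetical:

```c
#include <stdbool.h>
#include <stddef.h>

#define QUEUE_CAP 16   /* high-water mark: assumed, tune per device */

typedef struct {
    int    items[QUEUE_CAP];
    size_t head, count;
} PktQueue;

bool queue_push(PktQueue *q, int pkt)
{
    if (q->count == QUEUE_CAP)
        return false;                 /* full: caller must wait (EAGAIN) */
    q->items[(q->head + q->count) % QUEUE_CAP] = pkt;
    q->count++;
    return true;
}

bool queue_pop(PktQueue *q, int *pkt)
{
    if (q->count == 0)
        return false;                 /* empty: nothing to feed the device */
    *pkt = q->items[q->head];
    q->head = (q->head + 1) % QUEUE_CAP;
    q->count--;
    return true;
}
```

In a real device the items would be AVPackets and the pop side would run at the card's pace, which is what throttles the decoder.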
> >Ideally, MPlayer and ffmpeg should be made to be able to use libavdevice for
> >audio and video output.
> Are you saying that ffmpeg does not currently support libavdevice for
> audio/video output? Or that the ideal way is to add a decklink
> libavdevice so that mplayer/ffmpeg supports decklink "automatically"?
I was saying that someone ought to write the code in mplayer to allow
something like this:
mplayer -vo lavd:device=sdl -ao lavd:device=alsa
That way, anything added to libavdevice would automatically be available in
mplayer for no additional cost.
I may give it a shot sometime, but my TODO list is quite full right now.