[FFmpeg-devel] [PATCH 2/4] lavd: add device capabilities API

Michael Niedermayer michaelni at gmx.at
Thu Feb 6 17:16:07 CET 2014


On Thu, Feb 06, 2014 at 05:12:14PM +0100, Michael Niedermayer wrote:
> On Thu, Feb 06, 2014 at 01:06:03PM +0100, Lukasz Marek wrote:
> > On 06.02.2014 03:57, Don Moir wrote:
> > >
> > >----- Original Message ----- From: "Don Moir" <donmoir at comcast.net>
> > >To: "FFmpeg development discussions and patches" <ffmpeg-devel at ffmpeg.org>
> > >Sent: Wednesday, February 05, 2014 7:18 PM
> > >Subject: Re: [FFmpeg-devel] [PATCH 2/4] lavd: add device capabilities API
> > >
> > >
> > >>
> > >>----- Original Message ----- From: "Lukasz Marek"
> > >><lukasz.m.luki at gmail.com>
> > >>To: <ffmpeg-devel at ffmpeg.org>
> > >>Sent: Wednesday, February 05, 2014 6:59 PM
> > >>Subject: Re: [FFmpeg-devel] [PATCH 2/4] lavd: add device capabilities API
> > >>
> > >>
> > >>>>>>Hi Lukasz,
> > >>>>>>
> > >>>>>>I am not sure about your device list API.
> > >>>>>>
> > >>>>>>You have:
> > >>>>>>
> > >>>>>>typedef struct AVDeviceInfoList {
> > >>>>>>    AVDeviceInfo *devices;    /**< list of autodetected devices */
> > >>>>>>    int nb_devices;           /**< number of autodetected devices */
> > >>>>>>    int default_device;       /**< index of default device */
> > >>>>>>} AVDeviceInfoList;
> > >>>>>>
> > >>>>>>int avdevice_list_devices(struct AVFormatContext *s,
> > >>>>>>                          AVDeviceInfoList **device_list);
> > >>>>>>Not sure why I would need an AVFormatContext, but I may be missing
> > >>>>>>something there.
> > >>>>>
> > >>>>>To get device capabilities you need the context, for the options
> > >>>>>among other things. In the implementation you need to "open" the
> > >>>>>device to check whether a configuration really works, or to list
> > >>>>>property ranges. For example, pulse audio allows playing on a remote
> > >>>>>server; you need to know that the user wants to query that remote
> > >>>>>server, and that is passed via device options.
> > >>>>>
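A minimal sketch of that flow, assuming the API lands as proposed above,
that a matching avdevice_free_list_devices() is part of the patch, and that
the pulse output device keeps its "server" option; list_pulse_sinks() is
just a hypothetical helper name:

#include <stdio.h>
#include <libavdevice/avdevice.h>
#include <libavformat/avformat.h>
#include <libavutil/opt.h>

static void list_pulse_sinks(const char *server)
{
    AVFormatContext *ctx = NULL;
    AVDeviceInfoList *list = NULL;
    int i;

    avdevice_register_all();
    if (avformat_alloc_output_context2(&ctx, NULL, "pulse", NULL) < 0)
        return;

    /* this option is the reason the context is needed: it tells the
     * pulse device which server to enumerate */
    av_opt_set(ctx->priv_data, "server", server, 0);

    if (avdevice_list_devices(ctx, &list) >= 0) {
        for (i = 0; i < list->nb_devices; i++)
            printf("%s%s (%s)\n",
                   i == list->default_device ? "* " : "  ",
                   list->devices[i].device_name,
                   list->devices[i].device_description);
        avdevice_free_list_devices(&list);
    }
    avformat_free_context(ctx);
}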
> > >>>>>>It's just not exactly clear, so this is what makes the most sense
> > >>>>>>to me.
> > >>>>>>
> > >>>>>>1)  In avdevice_list_devices, identify the type of device (video
> > >>>>>>and/or audio).
> > >>>>>
> > >>>>>Make sure you distinguish devices at the lavd level (pulseaudio,
> > >>>>>alsa, oss for audio; fbdev, xv, opengl, sdl for video) from the
> > >>>>>device names within each of them (sound outputs, sound cards, etc.).
> > >>>>>This function lists the latter for a given lavd device. Maybe the
> > >>>>>function name should be changed to avoid confusion.
> > >>>>>
> > >>>>>>2)  Provide a list and their capabilities at the same time. So maybe:
> > >>>>>>
> > >>>>>>typedef struct AVDeviceInfo {
> > >>>>>>    char *device_name;         /**< device name, format depends on device */
> > >>>>>>    char *device_description;  /**< human friendly name */
> > >>>>>>    // either list or count
> > >>>>>>    int  n_capabilities;
> > >>>>>>    AVDeviceCapabilities *capabilities;
> > >>>>>>} AVDeviceInfo;
> > >>>>>>
> > >>>>>>I know you have ways of doing it, but it seems awkward at best, and
> > >>>>>>it is more work to first find devices and then look up capabilities
> > >>>>>>for each device. I see I must also init something to get
> > >>>>>>capabilities, so I just don't see how that falls in line well.
> > >>>>>
> > >>>>>It was already discussed. I started with something similar, but
> > >>>>>unfortunately it is not suitable for all cases. You cannot just
> > >>>>>return a list of capabilities, because they can interact with each
> > >>>>>other and they may differ for each device name.
> > >>>>
> > >>>>I guess I don't understand how devices interact with each other.
> > >>>
> > >>>It's not devices that interact with each other, but the caps of a
> > >>>single device: when you set one parameter, it may affect the others.
> > >>>
> > >>>>Each device I know of has unique names and capabilities. Could be
> > >>>>audio and/or video. I don't consider OpenGL and SDL to be true
> > >>>>devices, if that is where you are coming from.
> > >>>
> > >>>I don't know what you mean by "true devices". Yes, neither opengl nor
> > >>>SDL is hardware, but they are "devices" in the sense of the
> > >>>documentation:
> > >>>"The libavdevice library provides the same interface as libavformat.
> > >>>Namely, an input device is considered like a demuxer, and an output
> > >>>device like a muxer, and the interface and generic device options are
> > >>>the same provided by libavformat (see the ffmpeg-formats manual)."
> > >>>
> 
> > >>>>>The simple flow I see for video output is:
> > >>>>>pick lavd device.
> > >>>>>list device names.
> > >>>>>pick device name
> > >>>>>start cap query
> > >>>>>set frame_width/height
> > >>>>>query codecs
> > >>>>>set codec
> > >>>>>query formats
> > >>>>>set valid format in filterchain sink
> > >>>>>finish cap queries
> > >>>>>
> > >>>>>And I don't think it is too complicated.
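Roughly, that flow could look like the sketch below. The query object and
the names used here (avdevice_capabilities_create(),
avdevice_capabilities_free(), and the "frame_width"/"frame_height"/"codec"/
"pixel_format" options) are assumed from the API proposed in this patch
series and may change; ctx is assumed to be an AVFormatContext already set
up for the chosen lavd device:

#include <libavcodec/avcodec.h>
#include <libavdevice/avdevice.h>
#include <libavutil/opt.h>

static void probe_device_caps(AVFormatContext *ctx)
{
    AVDeviceCapabilitiesQuery *caps = NULL;
    AVOptionRanges *ranges = NULL;

    if (avdevice_capabilities_create(&caps, ctx, NULL) < 0)
        return;

    /* constrain the query first ... */
    av_opt_set_int(caps, "frame_width",  1920, 0);
    av_opt_set_int(caps, "frame_height", 1080, 0);

    /* ... then ask which codecs are still possible at that frame size */
    if (av_opt_query_ranges(&ranges, caps, "codec", 0) >= 0)
        av_opt_freep_ranges(&ranges);

    /* pick one of the reported codecs and query its pixel formats */
    av_opt_set_int(caps, "codec", AV_CODEC_ID_RAWVIDEO, 0);
    if (av_opt_query_ranges(&ranges, caps, "pixel_format", 0) >= 0) {
        /* a format from here would be fed into the filterchain sink */
        av_opt_freep_ranges(&ranges);
    }

    avdevice_capabilities_free(&caps, ctx);
}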
> > >>>>
> 
> > >>>>I am developing for windows and then mac, so for windows I am
> > >>>>interested in dshow devices. Currently, I enumerate the devices, their
> > >>>>names and their capabilities in one step rather than the 10 steps you
> > >>>>suggest. During
> 
> Isn't that a purely cosmetic difference?
> 
> 
> > >>>>the enumeration of the devices I am already there, so it is good to
> > >>>>get the capabilities in one step. It appears that your steps may cause
> > >>>>the dshow code in ffmpeg to go through the same code multiple times.
> > >>>>There is no need to set a frame width / height to query the formats,
> > >>>>as this is all in the same structure, for dshow at least.
> > >>>
> > >>>You give dshow as an example, but I don't want to make an interface
> > >>>just for dshow; I want a generic one, for all already implemented and
> > >>>future devices.
> > >>>In many cases it would be possible to return everything at once, yes,
> > >>>but that is an assumption that may not hold at some point.
> > >>
> > >>It can probably be met for any true hardware device, which is what I
> > >>am interested in. SDL and OpenGL and the like, to me, fall more into
> > >>the category of application issues.
> 
> True hw devices have complex limitations. For example, look at any
> high speed camera: chances are the 1000fps mode will be at a significantly
> lower resolution than the lower frame rates.
> Have you considered that the reason why you don't see complex
> limitations is not that they don't exist, but rather that you don't
> look at the hw itself, only at a high level interface on mac/windows?
> 
> Also, about formats: it's quite likely that hw which can do realtime
> encoding to h264 and mjpeg will support higher resolutions or
> framerates with the computationally simpler encoder.
> 
> 
> > >>
> > >>>Basically the resulting structure would need to be more complex,
> > >>
> 
> > >>Better to have a more complex structure than to have a complex
> > >>interface to it.
> 
> The complex structure would, if it supported all cases, probably be
> quite unwieldy and hard to use.
> Why do I think that? Nothing posted so far came close to a structure that
> supports all cases, and some proposals were already somewhat complex;
> nothing like the single flat structure you seem to imagine.
> 
> 
> 
> > >>Probably leads to less usage of the thing you are
> > >>spending time on.
> > >>
> > >
> > >Might be good to separate this out some to simplify it. A lot of people
> > >are interested in knowing only about capture devices and couldn't care
> > >less about things like SDL and OpenGL in ffmpeg.
> > >
> 
> > >Could be a simple interface for enumerating capture devices. Like I
> > >said before, you don't really want to walk through the enumeration
> > >possibly several times for some devices. The enumeration can cause
> > >loading and unloading of resources, and you never know what a device
> > >might be initializing. Some do this quickly and some slowly.
> 
> Gathering the information about the hardware or API and presenting it
> can be 2 different steps. The 5 calls could easily read from the
> cached output of the hw or of an API wrapper over the hw.
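Purely as an illustration of that split (none of these names exist in
lavd), a device context could run the expensive OS enumeration once and
answer every later query from the cached result:

#include <stdlib.h>

/* hypothetical: stands in for whatever the native enumeration returns */
typedef struct NativeCaps { int nb_modes; } NativeCaps;

typedef struct HypotheticalDevCtx {
    NativeCaps *caps;                 /* filled once, reused afterwards */
} HypotheticalDevCtx;

/* stands in for the slow dshow/CoreAudio walk through the hardware */
static NativeCaps *enumerate_native_caps(void)
{
    return calloc(1, sizeof(NativeCaps));
}

static const NativeCaps *get_caps(HypotheticalDevCtx *ctx)
{
    if (!ctx->caps)                   /* gather the information once ... */
        ctx->caps = enumerate_native_caps();
    return ctx->caps;                 /* ... present it as often as needed */
}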
> 
> 
> > 
> > The solution you suggest is the same one I proposed before, and it was
> > rejected. So I am giving up any further work on it until you figure out
> > what it should look like.
> 
> Maybe a solution is to do both?
> Have a very simple flat structure that lists limitations but
> would not be able to represent complex real hw, for example
> like these:  http://gopro.com/product-comparison-hero3-cameras
> 
> So it would then possibly list 30fps and 1080p as the maximum

This was supposed to be 60fps.


> while the AVOption interface would list that it can also do
> 4K at 15fps, 960p at 100fps and WVGA at 240fps.
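As an illustration of that "very simple flat structure" idea (hypothetical,
not part of the patch), such a structure could only describe the outer
envelope, e.g. 1080p/60fps for the camera above, while the AVOption-based
query would expose the coupled modes:

#include <libavutil/rational.h>

/* hypothetical flat limits; the coupled modes (4K at 15fps, 960p at 100fps,
 * WVGA at 240fps) would still need the AVOption-based query */
typedef struct AVDeviceSimpleLimits {
    int        max_width;        /* largest frame width in any mode  */
    int        max_height;       /* largest frame height in any mode */
    AVRational max_fps;          /* highest frame rate in any mode   */
    int        max_sample_rate;  /* audio: highest sample rate       */
    int        max_channels;     /* audio: most channels             */
} AVDeviceSimpleLimits;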
> 
> [...]
> 
> -- 
> Michael     GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB
> 
> If a bugfix only changes things apparently unrelated to the bug with no
> further explanation, that is a good sign that the bugfix is wrong.





-- 
Michael     GnuPG fingerprint: 9FF2128B147EF6730BADF133611EC787040B0FAB

Answering whether a program halts or runs forever is
on a Turing machine, in general, impossible (Turing's halting problem).
On any real computer, always possible, as a real computer has a finite number
of states N, and will either halt in fewer than N cycles or never halt.

