[FFmpeg-devel] [DOC] FFSERVER Configuration Files.

Piero Bugoni ffmpeg.devel
Sat Aug 25 14:30:03 CEST 2007


--- Piero Bugoni <ffmpeg.devel at yahoo.com> wrote:

> > > 
> > > There is no option mentioned in the documentation for
> > > "VideoBitRateRange".
> > 
> > the documentation is in ISO c format
> > 
> 
> Nice.
> 
> 
> > > 
> > > If no "VideoQMin/Max" settings are mentioned, will the Q vary
> > > automatically up to the limit allowed for the bitrate? Or between
> > > 0.1 and 31, or something like that?
> > 
> > yes
> 
> OK, so the best quality image for a given bitrate will be used, and the
> lower the bitrate specified, the higher the "Q" value will be set, up to a
> maximum of 31.
> 
> 
> > > 
> > > As I have said before,
> > > 1) No image with a "Q" more than 12 did I consider worthwhile for
> > > anything.
> > > 2) Security camera applications mean that a face must be recognizable.
> > 
> > look, if your connection can transmit at 500 kbit/sec, and you set max
> > rate to 500 and bitrate somewhat lower, and libavcodec chooses, let's
> > say, qp=20, that means libavcodec cannot encode the images within the
> > bitrate at a lower qp. If you force qp=5, your stream will end up needing
> > maybe 5000 kbit/sec, and you will not recognize any faces, because you
> > will not see anything, simply because your network is too slow.
> 
> So use the bitrate as the throttling mechanism and, as much as possible,
> match it to network speeds.
> 
> What happens when the network becomes temporarily loaded, and won't support
> your bitrate setting? If "VideoBitRateRange" is set from 0-N, the stream will
> use up to "N" if available, and down to no transmission whatsoever if no
> bandwidth is available?
> 
> The issue with "Q" is this:
> Higher "Q" values make the image look more and more like a mosaic. With
> files at least, any "Q" beyond 12 produces an image that I do not want to
> even look at. (I typically watch full-screen.) So the goal at this point is
> to incorporate the method you suggest, but set a limit on the
> image-crappiness that is allowed. So the way to do this would be to use
> VideoBitRateRange, not specify QMin, but do specify QMax?
> 
> Finally, where does the rate-control buffer fit into all this, and how does
> one arrive at a setting (other than by trial and error, as I did)?
> 
> 
> P. 
> 
> 
> 

OK, since no one has responded to this post, can I assume that my assumptions
herein are correct? If so, I will edit the files I have posted accordingly.

Otherwise, let me know in advance so I do not have to keep re-writing a bunch
of bullshit.

Again, the point of all this was that getting ffserver to work at all is a
pain in the ass. (But, amazingly, it does actually work.) The existing
documentation approaches worthlessness, and as far as I can tell from almost a
year of subscription to this list, no one else is taking care of it.

I have been using Linux since kernel 1.1.59 (Slackware 3). I still had to fuck
with this thing for hours to get a working system. (No other software package
can claim this title. Cinelerra is a close second.) Apparently, however, the
team working on the UNIVAC port of X Windows is having similar difficulties.

Pre-set configuration files are a worthwhile contribution. If anyone has
anything else to say, please speak up so I do not have to keep rewriting.

So, as this stands now:
1) Bitrate will be the limiting factor.
2) Where applicable, conf files will be customized to network bandwidth.
3) A "brake" will be set on "Q" to limit image-crappiness.

(Image-Crappiness is a non-technical term. Whereas image quality can be
expressed as the faithfulness of a reproduction to its original,
Image-Crappiness is the amount of distortion you are willing to tolerate.)
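To make points 1-3 concrete, here is a rough sketch of the kind of ffserver.conf
Stream section I have in mind. The stream name, feed path, and all numeric
values are placeholder assumptions to be tuned per network, not tested
settings; the directives themselves follow the usual ffserver.conf format:

```
# Sketch only -- all values are guesses; match VideoBitRate to the
# actual available network bandwidth (points 1 and 2).
Port 8090
BindAddress 0.0.0.0
MaxClients 10

<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 5M
</Feed>

<Stream camera.mpg>
Feed feed1.ffm
Format mpeg
NoAudio
VideoFrameRate 15
VideoSize 352x288
VideoBitRate 400             # target rate in kbit/s, matched to the link
VideoBitRateRange 100-500    # let the rate float with available bandwidth
VideoQMax 12                 # the "brake" on image-crappiness (point 3)
VideoBufferSize 40           # rate-control buffer; start here and adjust
</Stream>
```

Note that QMin is deliberately left unset so the encoder can pick the best
quality the bitrate allows, while QMax caps how bad things are permitted to
get.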


Thanks,

P.








