[FFmpeg-devel] lavf: different flush_packet options...

Reimar Döffinger Reimar.Doeffinger at gmx.de
Sat Jun 28 19:27:18 CEST 2014


Hello,
options_table.h contains these:
{"fflags", NULL, OFFSET(flags), AV_OPT_TYPE_FLAGS, {.i64 = AVFMT_FLAG_FLUSH_PACKETS }, INT_MIN, INT_MAX, D|E, "fflags"},
{"flush_packets", "reduce the latency by flushing out packets immediately", 0, AV_OPT_TYPE_CONST, {.i64 = AVFMT_FLAG_FLUSH_PACKETS }, INT_MIN, INT_MAX, D, "fflags"},

and:

{"flush_packets", "enable flushing of the I/O context after each packet", OFFSET(flush_packets), AV_OPT_TYPE_INT, {.i64 = 1}, 0, 1, E},

There are several things here that confuse me completely.
a) What is the difference between these two?
b) Why is the first one marked as decode-only and the second as
encode-only, when both are actually checked in the same place:
mux.c:    if (s->flush_packets && s->pb && ret >= 0 && s->flags & AVFMT_FLAG_FLUSH_PACKETS)
(and this is, btw., the _only_ use of AVFMT_FLAG_FLUSH_PACKETS; I see
absolutely no reason why it additionally requires flush_packets to be
set, other than making the flag needlessly hard to use)
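
A practical consequence of that condition: since both options default
to enabled, someone who wants to _disable_ the per-packet avio_flush()
can apparently clear either one of the two, e.g. (untested):

    /* clear the flag bit... */
    av_opt_set(s, "fflags", "-flush_packets", 0);
    /* ...or, just as effective, clear the integer option: */
    av_opt_set_int(s, "flush_packets", 0, 0);

which is exactly the kind of redundancy that makes this hard to use.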

I suspect the documentation of the non-flag option should instead say
"allow explicit flushing to reduce latency by writing NULL packets",
assuming the option is removed from the above if condition.
The flag version should probably warn that it might break with some
muxers (not sure, just a suspicion).
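
To be clear, by "writing NULL packets" I mean the explicit flush
mechanism the muxing API already has, i.e. roughly (sketch):

    /* after submitting latency-critical packets, force them out;
     * a NULL packet tells muxers that buffer data internally to
     * flush it to the output: */
    av_write_frame(s, NULL);

The non-flag option would then, I imagine, only decide whether such a
flush is also pushed through the I/O context.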

Regards,
Reimar

