[FFmpeg-trac] #2343(avformat:new): Stream is blocked by 20 seconds or more when reading an mjpeg stream at low resolutions

FFmpeg trac at avcodec.org
Sun Mar 10 10:04:47 CET 2013


#2343: Stream is blocked by 20 seconds or more when reading an mjpeg stream at low
resolutions
-------------------------------------+------------------------------------
             Reporter:  virtuald     |                    Owner:
                 Type:  enhancement  |                   Status:  new
             Priority:  wish         |                Component:  avformat
              Version:  git-master   |               Resolution:
             Keywords:               |               Blocked By:
             Blocking:               |  Reproduced by developer:  0
Analyzed by developer:  0            |
-------------------------------------+------------------------------------
Changes (by cehoyos):

 * priority:  normal => wish
 * type:  defect => enhancement
 * version:  1.1.3 => git-master
 * component:  undetermined => avformat


Comment:

 Replying to [ticket:2343 virtuald]:
 > The problem shows up on version 1.1.3 on Windows (32-bit) and Linux
 (64-bit).

 Am I correct that you get the same problem with current git head?

 > It also shows up on the version installed on my Ubuntu box by default,
 0.8.5-6:0.8.5-0ubuntu0.12.10.1.

 (This is an intentionally broken version of FFmpeg that is not supported
 here.)

 > Note that in the logfile I attached, it hangs right before the line
 "[mjpeg @ 0x1016500] max_analyze_duration 5000000 reached at 5000000" for
 about 25 seconds or so. And yes, it ends in an error saying that there is
 no output file, but an output file isn't necessary to illustrate the
 issue. The point is that it hangs. :)

 (As your post explains, it does not hang; it waits until 5000000
 microseconds (5 seconds) of audio/video data have been analyzed because
 you did not specify another value on the command line.)
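
 For reference, a shorter analysis window can be requested on the command
 line, e.g. something like the following (the input URL and output file
 name are placeholders; -analyzeduration takes microseconds):

     ffmpeg -analyzeduration 100000 -i http://camera.example/stream.mjpg out.avi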

 > I originally found this issue when using OpenCV, but the ffmpeg
 executable has the same problem. I was able to remove the delay in OpenCV
 by setting max_analyze_duration to 50000 on the context. However, it seems
 like the default setting should just work.

 How should FFmpeg know that you are playing a stream with an unusually low
 bitrate before it has looked into the stream? It analyzes the stream for
 the duration you specify on the command line; if you don't specify one,
 the default value, which is very low for many real-world streams, is used.

 Reducing the default value for -analyzeduration seems unacceptable to me.
 Maybe an additional option -analyze_real_world_duration could be
 implemented.
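
 For reference, the workaround described above (setting
 max_analyze_duration on the context before probing) can be expressed
 directly with the libavformat C API. This is only a sketch: the input path
 comes from argv, error handling is minimal, and 50000 is simply the value
 quoted from the ticket:

     #include <libavformat/avformat.h>

     int main(int argc, char **argv)
     {
         AVFormatContext *ctx = NULL;

         if (argc < 2)
             return 1;

         av_register_all();  /* required on the 1.x releases named in the ticket */

         if (avformat_open_input(&ctx, argv[1], NULL, NULL) < 0)
             return 1;

         /* Limit how much stream time avformat_find_stream_info() may
          * examine; the value is in AV_TIME_BASE units (microseconds). */
         ctx->max_analyze_duration = 50000;

         if (avformat_find_stream_info(ctx, NULL) < 0) {
             avformat_close_input(&ctx);
             return 1;
         }

         avformat_close_input(&ctx);
         return 0;
     }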

-- 
Ticket URL: <https://ffmpeg.org/trac/ffmpeg/ticket/2343#comment:1>
FFmpeg <http://ffmpeg.org>
FFmpeg issue tracker

