[Ffmpeg-devel-irc] ffmpeg.log.20160429

burek burek021 at gmail.com
Sat Apr 30 02:05:01 CEST 2016


[01:01:35 CEST] <spirou> neuro_sys: what linux distro/version are you using?
[01:32:48 CEST] <smthorwhat> [15:46:11] <smthorwhat> hey guys. anyone know what this means " [graph 1 aresample for input stream 0:2 @ 0x207b280] [SWR @ 0x21de040] Failed to compensate for timestamp delta of 273.516250"
[01:32:48 CEST] <smthorwhat> [15:46:16] <smthorwhat> is there something i can use to fix that ?
[01:36:26 CEST] <pandb> i'm using the muxing.c example to live stream on youtube, but the video pauses frequently (to buffer, i presume) and the video plays back slightly faster than it should
[01:36:45 CEST] <pandb> where should I start looking?
[01:38:50 CEST] <pandb> the frame->pts gets incremented before being sent to the encoder, then the resulting packet is given to av_packet_rescale_ts() with the codec's time_base and the stream's time_base fields as the second and third parameters
[01:41:04 CEST] <pandb> I've managed to slow playback down by writing two or three copies of the same frame before getting more input image data, but that feels like an overly kludgy way to do things
[01:42:32 CEST] <pandb> should I write my program to only grab a new frame after a certain number of milliseconds have elapsed (based on my preferred frame rate)?
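(A minimal sketch of the pts flow pandb describes, based on the muxing.c example of that era; enc, st, oc and next_pts are placeholder names, not taken from his code:)
    frame->pts = next_pts++;                 /* counted in enc->time_base ticks, e.g. 1/25 */
    AVPacket pkt = { 0 };
    av_init_packet(&pkt);
    int got_packet = 0;
    if (avcodec_encode_video2(enc, &pkt, frame, &got_packet) >= 0 && got_packet) {
        /* convert from the codec time base to the stream time base before muxing */
        av_packet_rescale_ts(&pkt, enc->time_base, st->time_base);
        pkt.stream_index = st->index;
        av_interleaved_write_frame(oc, &pkt);
    }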
[02:11:22 CEST] <spiderkeys> I'm looking at execution times for my pipeline and found that av_read_frame is taking ~250us per frame (1080p h264 stream), while av_write_frame only takes ~50us (muxing into mp4)
[02:11:35 CEST] <spiderkeys> Does anyone know of any options or methods I can use to speed up av_read_frame?
[02:12:26 CEST] <spiderkeys> my ReadPacket( void *userDataIn, uint8_t *avioBufferOut, int avioBufferSizeAvailableIn ) function only takes ~3-5us of the time spent by av_read_frame
[02:15:42 CEST] <spiderkeys> wondering if the "AVFMT_FLAG_NOPARSE" flag might help, since I am absolutely sure that what I pass it is a single h264 frame, and all I need to do is mux it into a fragmented mp4 container
[02:23:26 CEST] <spiderkeys> hmm, closed the stream's parser, which effectively would have done the same. no noticeable difference
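(For reference, a sketch of how the flag spiderkeys mentions would be set; avio_ctx stands for an AVIOContext wrapping his ReadPacket() callback and is an assumption:)
    AVFormatContext *ic = avformat_alloc_context();
    ic->pb     = avio_ctx;                                   /* custom AVIOContext (assumed) */
    ic->flags |= AVFMT_FLAG_NOPARSE | AVFMT_FLAG_NOFILLIN;   /* the API docs say these two go together */
    avformat_open_input(&ic, NULL, av_find_input_format("h264"), NULL);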
[03:44:33 CEST] <pandb> What should I do when I constantly see messages like "Delay between the first packet and last packet in the muxing queue is 10066000 > 10000000: forcing output"?
[04:24:58 CEST] <needmorespeed> Are there any known issues with FFmpeg locking a file and not releasing once everything is freed?
[07:46:33 CEST] <pinPoint> does ffmpeg support .r3d yet?
[07:47:24 CEST] <Betablocker> is this for red cameras ?
[07:48:14 CEST] <Betablocker> https://trac.ffmpeg.org/ticket/2690
[07:49:31 CEST] <pinPoint> yeap
[10:53:19 CEST] <pkajaba> Hello guys, have you tried to compile ffmpeg with the new OpenCV 3.1?
[10:53:34 CEST] <pkajaba> I have this problem: http://koji.russianfedora.pro/kojifiles/work/tasks/4865/74865/build.log
[10:57:31 CEST] <jkqxz> From your output:  q(Include the log file "config.log" produced by configure as this will help solve the problem.)
[10:58:23 CEST] <jkqxz> Though mainly it looks like you haven't installed opencv in a place that the ffmpeg configure can see it.  Does pkg-config give you something sensible for it directly?
[11:09:34 CEST] <pkajaba> jkqxz, I will try it. However, there is the file /usr/lib64/pkgconfig/opencv.pc
[11:10:31 CEST] <pkajaba> so pkg-config should be OK and openCV installed
[11:15:15 CEST] <jkqxz> Look at the pkg-config command that config.log shows the ffmpeg configure running, and try it standalone.
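(A standalone check along the lines jkqxz suggests, assuming the package name matches the opencv.pc file mentioned above:)
    pkg-config --exists --print-errors opencv && pkg-config --cflags --libs opencv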
[11:18:08 CEST] <JEEB> you can add directories to pkg-config's search path with PKG_CONFIG_PATH
[11:18:25 CEST] <JEEB> PKG_CONFIG_PATH=/opt/huehue/lib/pkgconfig ./configure --muh-configure-parameters
[11:37:16 CEST] <Fyr> guys, people who upload porn videos usually show tiled screenshots of the movie. how do they do that?
[11:37:34 CEST] <Fyr> is it some kind of script for ffmpeg?
[11:40:03 CEST] <Fyr> those screenshots are usually not the same, which means the program compares the screenshots and, if they are close/identical, moves the timeline and makes another screenshot.
[11:42:49 CEST] <Fyr> mosaic of screenshots, that's the right definition, I guess.
[12:06:32 CEST] <jkqxz> Fyr:  You want to make a set of images from a stream, where you look at all the frames and output a new image on each frame if it is "sufficiently different" somehow from the previous one?
[12:07:44 CEST] <jkqxz> Maybe look at the mpdecimate filter?
[12:08:34 CEST] <Fyr> jkqxz, I meant a thumbnailer.
[12:08:54 CEST] <Fyr> is there a small python/bash/winthing tool to create thumbnails?
[12:09:01 CEST] <Fyr> every porn movie has thumbnails.
[12:09:47 CEST] <Fyr> when it comes to creating thumbnails from a normal movie I'm about to upload, I get stuck googling for that program.
[12:11:22 CEST] <Carlrobertoh> Could anybody please help me out? I'm able to capture the screen with the h.264 codec and save it on disk as a test.h264 file. But if I want to save it on disk as test.mp4, then it doesn't play the video.
[12:11:26 CEST] <Carlrobertoh> Also, the .mp4 video can be played in MPlayer but cannot be played in VLC.
[12:11:29 CEST] <Carlrobertoh> Also, if I use the ffmpeg command line tool to encode the .h264 byte-stream to mp4, then it works like a charm
[12:12:33 CEST] <Fyr> Carlrobertoh, missing -f mp4 option?
[12:13:06 CEST] <Carlrobertoh> I'm writing code for screen capture. I'm not using the command line tool
[12:13:17 CEST] <Fyr> ok
[12:13:49 CEST] <Carlrobertoh> what could be the issue here? some missing headers perhaps?
[12:18:07 CEST] <jkqxz> Carlrobertoh:  Can you write anything other than an H.264 elementary stream?  (You are using libavformat to do the writing here, right?)
[12:19:36 CEST] <Carlrobertoh> Yes, I've tried mpeg as well
[12:20:33 CEST] <Carlrobertoh> But how is it possible that MPlayer can play the mp4 container but VLC can't?
[12:22:50 CEST] <jkqxz> Fyr:  Do you actually just want a thumbnail every time interval rather than comparing frames?  Try something like q(ffmpeg -i ... -vf 'fps=1/10,scale=160x90' out%03d.jpeg) to make a 160x90 image every 10 seconds.
[12:23:48 CEST] <jkqxz> Carlrobertoh:  Is it actually an mp4 container at all?  (What do file and mplayer actually say about it?)
[12:25:18 CEST] <Carlrobertoh> jkqxz, http://www.upload.ee/files/5767920/test.mp4.html
[12:48:58 CEST] <Fyr> guys, I'm trying to find out what -vf select=gt(scene\,0.3) means
[12:49:11 CEST] <Fyr> what does "gt" mean?
[12:49:29 CEST] <Fyr> what is that number after "gt"?
[12:49:29 CEST] <furq> greater than
[12:49:52 CEST] <Fyr> ok, furq, what are the numbers for scenes?
[12:51:12 CEST] <furq> https://ffmpeg.org/ffmpeg-filters.html#select_002c-aselect
[12:51:20 CEST] <t4nk321> hello guys
[12:53:05 CEST] <Fyr> "Output file is empty, nothing was encoded (check -ss / -t / -frames parameters if used)"
[12:53:06 CEST] <Fyr> =)
[12:53:12 CEST] <Fyr> ='(
[12:53:27 CEST] <t4nk321> I'm making an overlay video from two video inputs, but I have a problem choosing one of the audio streams (both videos come with audio) and turning the other one off?
[12:54:23 CEST] <t4nk321> how can we achieve that, thanks.
[13:02:32 CEST] <Fyr> guys, how do I create a mosaic from a movie by scene selection, while setting the number of tiles?
[13:02:44 CEST] <Fyr> the movie is too long to get all the scenes.
[13:03:07 CEST] <Fyr> I need only 20 or 60 scenes.
[13:10:54 CEST] <jkqxz> Carlrobertoh:  "test.mp4: JVT NAL sequence, H.264 video @ L 50" - that's an elementary stream.  You're not actually writing an mp4, you've just changed the file name.
[13:11:24 CEST] <Carlrobertoh> Okay, but how could I write the actual mp4 file?
[13:17:58 CEST] <t4nk283> hello, I'm overlaying two video inputs, each of which comes with its own audio; how can I choose to output just the second one in filter_complex? thanks.
[13:21:38 CEST] <jkqxz> Carlrobertoh:  You're using avformat_alloc_output_context2(), presumably?  If it isn't doing the right thing from the filename, you must be passing something unhelpful in one of the other arguments.
[13:23:03 CEST] <Carlrobertoh> no, i'm using avformat_alloc_context() and output_format_ctx->oformat = av_guess_format()
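(A minimal sketch of the muxing setup jkqxz is pointing at; the "test.mp4" name and the omitted error handling are assumptions:)
    AVFormatContext *oc = NULL;
    avformat_alloc_output_context2(&oc, NULL, NULL, "test.mp4"); /* picks the mp4 muxer from the name */
    AVStream *st = avformat_new_stream(oc, NULL);
    /* ... copy the encoder parameters into the stream here ... */
    avio_open(&oc->pb, "test.mp4", AVIO_FLAG_WRITE);
    avformat_write_header(oc, NULL);   /* without a muxer and header you just get a raw H.264 elementary stream */
    /* ... av_interleaved_write_frame(oc, &pkt) for each encoded packet ... */
    av_write_trailer(oc);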
[13:35:48 CEST] <trfl> I just followed the centos compile guide to build the exact same source on rhel6 and centos7, and for some reason the rhel6 build lets me use yuv444p with libvpx-vp9, while the c7 build only supports yuv420p... c7 also warns me that vp9 is experimental while rh6 doesn't, but c7 somehow has a higher version number for libavcodec etc in the ffmpeg -version output. What's going on?
[14:30:15 CEST] <GimmeSpace> hello guys..
[14:30:21 CEST] <Fyr> hey
[14:30:25 CEST] <GimmeSpace> here's my CLI command on pastebin
[14:30:29 CEST] <GimmeSpace> http://pastebin.com/L6trzV25
[14:31:30 CEST] <GimmeSpace> the result of it gives me the audio stream from the first video; I need the result to give the second one. thanks.
[14:33:37 CEST] <GimmeSpace> how to achieve it? thanks.
[14:33:47 CEST] <Fyr> just do -an and then use second command:
[14:33:47 CEST] <Fyr> ffmpeg -i vid2.mp4 -c:a copy -vn -f mp4 1.m4a
[14:33:47 CEST] <Fyr> ffmpeg -i gen.mp4 -i 1.m4a -c:v copy -c:a copy final.mp4
[14:34:00 CEST] <Fyr> two-stage muxing
[14:34:11 CEST] <trfl> could it be that the lack of yuv444p support is due to missing cpu features? the centos7 box is a core2duo T6600 from 2009, doesn't have VTx or anything like that
[14:34:24 CEST] <Fyr> you can also use filter_complex, which I don't know. =))))))
[14:35:08 CEST] <GimmeSpace> yeah, I need a one-line CLI command using filter_complex, but I didn't find any suggestion on the net
[14:35:46 CEST] <Fyr> two commands are as good as one.
[14:36:05 CEST] <Fyr> less trouble =)
[14:36:55 CEST] <trfl> this might work: -map 1:a
[14:37:00 CEST] <GimmeSpace> I have to use it in bulk, so two stages would be wasted time, not to mention I have an old computer. thanks.
[14:37:21 CEST] <Fyr> I would use pipes. =)
[14:38:12 CEST] <trfl> also, if you're just copying streams without reencoding them then two stages will be *almost* as fast as one (hint: -vcodec copy)
[14:38:21 CEST] <GimmeSpace> trfl, where do we put "-map 1:a"?
[14:38:41 CEST] <GimmeSpace> oh, I didn't know that, trfl, thanks.
[14:39:10 CEST] <trfl> try right before the filename you are saving to
[14:39:39 CEST] <GimmeSpace> I did, but it didn't work; it still gives me an error.
[14:39:55 CEST] <trfl> what's the error?
[14:41:13 CEST] <GimmeSpace> Filter format has an unconnected output
[14:41:42 CEST] <trfl> hmm
[14:42:52 CEST] <trfl> just a guess, but drop the -map thing and instead change [bg][fg] to [1:a][bg][fg]
[14:45:12 CEST] <GimmeSpace> ERROR : Too many inputs specified for the "overlay" filter.
[14:45:13 CEST] <GimmeSpace> Error initializing complex filters.
[14:45:13 CEST] <GimmeSpace> Invalid argument
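(For illustration only, since the pastebin command isn't quoted here: one single-command form of what GimmeSpace is after labels the overlay output and maps it together with the second input's audio; the file names and overlay settings are placeholders:)
    ffmpeg -i vid1.mp4 -i vid2.mp4 -filter_complex "[0:v][1:v]overlay[v]" -map "[v]" -map 1:a -c:a copy out.mp4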
[15:53:13 CEST] <Fyr> how will -c:a copy work with multi-channel audio?
[15:53:36 CEST] <JEEB> that works on streams, not channels
[15:54:02 CEST] <Fyr> ok, with many audio streams.
[15:55:08 CEST] <JEEB> standard ffmpeg cli behavior is that it picks the "best" video and audio track so those selected tracks will get handled
[15:55:17 CEST] <JEEB> use -map to map the input streams you want to handle
[15:55:29 CEST] <Fyr> ok
[15:55:38 CEST] <JEEB> that disables the automagic so you will specify everything you need
[15:56:00 CEST] <JEEB> -map 0:v maps all video tracks from first input f.ex.
[15:56:14 CEST] <Fyr> ok
[15:56:14 CEST] <Fyr> JEEB, is there any chance that the delay bug will be fixed in this century?
[15:56:15 CEST] <JEEB> -map 0:a:2 picks the third audio track from the first input
[15:57:03 CEST] <Fyr> thanks for the mapping, it will help.
[16:15:01 CEST] <Fyr> JEEB, do you know how to create a mosaic thumbnail from scenes of a movie, with a specified number of tiles (like 16)? I found out how to make it from scenes; however, FFMPEG creates as many as it finds scenes, with no upper limit.
[16:15:46 CEST] <Fyr> what if I need only 20 of them, dispersed over the whole timespan?
[16:18:26 CEST] <furq> Fyr: -frames:v will limit the frames but it'll just be the first 20
[16:19:16 CEST] <furq> i've been meaning to use the API to do the same thing
[16:19:35 CEST] <furq> but if you're not on a headless box then you might as well use mplayer
[16:20:05 CEST] <Fyr> ok, guys, I'm out of words, how to do this:
[16:20:05 CEST] <Fyr> http://s32.postimg.org/bs1ltmzpx/pic.jpg
[16:20:06 CEST] <Fyr> with FFMPEG?
[16:20:22 CEST] <Fyr> the picture may hurt your feelings.
[16:20:27 CEST] <furq> yeah that is what i was hoping to do
[16:20:32 CEST] <furq> except less explicit
[16:20:59 CEST] <furq> like i said you can do it with mplayer if you just want a command line method
[16:22:24 CEST] <durandal_1707> thumbnail filter
[16:23:22 CEST] <Fyr> durandal_1707, ok, if the file were "input.mp4", how would I create that nice picture?
[16:23:24 CEST] <furq> oh neat
[16:23:41 CEST] <Fyr> with the timestamps, scene selection, some info on the top.
[16:23:43 CEST] <furq> Fyr: you'd need two passes, or more likely one ffprobe pass first
[16:24:19 CEST] <furq> grab the info with ffprobe, use the total frame count to get the right argument for the thumbnail filter, then use the tile/drawbox/drawtext filters to fill in the info
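(A hedged example of the ffprobe pass furq describes, counting the decoded video frames of an assumed input.mp4:)
    ffprobe -v error -count_frames -select_streams v:0 -show_entries stream=nb_read_frames -of default=nokey=1:noprint_wrappers=1 input.mp4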
[16:24:33 CEST] <durandal_1707> select filter plus thumbnail filter
[16:24:47 CEST] <Fyr> guys, I need the exact command.
[16:24:53 CEST] <furq> we don't have the exact command
[16:25:02 CEST] <furq> if i knew it i'd have already been using it
[16:25:12 CEST] <Fyr> then, I can say the same: vf filter + ffprobe.
[16:25:16 CEST] <durandal_1707> send me $$$ and I will tell it
[16:25:19 CEST] <Fyr> very useful answer.
[16:25:47 CEST] <Fyr> been there, tried that.
[16:26:20 CEST] <Fyr> I posted a bug report and sent a donation of $5.
[16:26:23 CEST] <Fyr> something simple
[16:26:36 CEST] <Fyr> still, the bug is not fixed.
[16:26:40 CEST] <durandal_1707> and there is the tile filter, too
[16:27:02 CEST] <durandal_1707> What bug?
[16:27:10 CEST] <Fyr> I'll never donate to FFMPEG again.
[16:27:20 CEST] <Fyr> I don't remember. =)
[16:27:30 CEST] <Fyr> it's in the logs.
[16:27:52 CEST] <durandal_1707> about what?
[16:30:56 CEST] <Fyr> durandal_1707, look through your logs at [22 Nov 15 16:27:28]
[16:31:32 CEST] <Fyr> [22 Nov 15 16:30:28] * durandal_1707: you mean coverart in m4a ?
[16:31:32 CEST] <Fyr> [22 Nov 15 16:30:32] * Fyr: yes
[16:31:47 CEST] <Fyr> you've seen it.
[16:33:45 CEST] <s00b5u> I have a question: if we have a video which has Variable VBR and we transcode it to Constant VBR... are there chances of losing lip sync?
[16:34:05 CEST] <durandal_170> select=gt(scene\,0.4),scale=qcif,tile
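(Putting durandal_170's filter chain into a full command, as a sketch: input.mp4 is an assumed file name, and tile=5x4 caps the mosaic at the 20 tiles Fyr asked for:)
    ffmpeg -i input.mp4 -vf "select=gt(scene\,0.4),scale=qcif,tile=5x4" -frames:v 1 mosaic.png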
[16:47:10 CEST] <durandal_1707> Fyr: I posted command
[16:48:19 CEST] <Fyr> oh
[16:58:06 CEST] <vade> I'm curious about how I might go about troubleshooting this issue: I have a video transcode that appears to be outputting valid h.264 to an mp4 container via libavformat/libx264. VLC plays it back and throws no errors / shows no errors / retiming log info, etc. However, upon opening the file with QuickTime, I get a kVTVideoDecoderUnsupportedDataFormatErr error.
[16:58:06 CEST] <vade> Do I need to manually hint profiles, or provide some side data to the encoder to satisfy other decoders, even though VLC seems to be ok, or at least silently recover from whatever idiocy I am doing? :)
[16:58:54 CEST] <vade> I'm specifying AV_CODEC_ID_H264 into a container that is MP4 part 14
[17:00:01 CEST] <jkqxz> AV_PIX_FMT_YUV420P?
[17:00:55 CEST] <vade> yup
[17:01:03 CEST] <vade> AV_PIX_FMT_YUV420P
[17:01:40 CEST] <vade> is it possible to provide PTS / DTS that is valid for libx264 but maybe not for other decoders?
[17:03:19 CEST] <jkqxz> What does ffprobe say about the H.264 stream inside the mp4 container?
[17:03:47 CEST] <kepstin> I'm not sure what that sentence means; libx264 is not a decoder, and you don't give dts to an encoder
[17:04:54 CEST] <vade> ohhh
[17:05:35 CEST] <vade> do you have to set the resulting AVPacket's stream index manually?
[17:05:46 CEST] <vade> (from the encoder?)
[17:05:54 CEST] <vade> sorry, first time playing with LibAV :)
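(In the muxing.c-style flow, yes: the packet's stream_index is set by hand before it goes to the muxer; video_st and oc are placeholder names:)
    pkt.stream_index = video_st->index;
    av_interleaved_write_frame(oc, &pkt);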
[18:02:53 CEST] <amdbcg> hello
[18:14:41 CEST] <durandal_1707> hello
[18:31:21 CEST] <amdbcg> Line 45 in libavutil\common.h: [Error] libavutil/avconfig.h: No such file or directory
[18:31:41 CEST] <amdbcg> I looked in libavutil, and there was no avconfig.h :o
[18:32:26 CEST] <amdbcg> so either the config is wrong or I have not made/installed something.
[18:43:24 CEST] <c_14> make distclean and try again. Are you building from git? What's your configure line?
[18:50:23 CEST] <amdbcg> I'm on windows, and just discovered gcc is unable to create an executable file from the configure. ... I'm installing more dev packages to fix.
[18:52:14 CEST] <amdbcg> and... I might just use a dev binary instead.
[18:59:58 CEST] <amdbcg> c_14 : what should the configure line look like?
[19:02:39 CEST] <c_14> ./configure
[19:02:45 CEST] <c_14> plus whatever you want to add
[19:34:41 CEST] <Kurogane> Hello, I have a problem installing ffmpeg. I followed this guide https://trac.ffmpeg.org/wiki/CompilationGuide/Centos and the problem I have is that it says yasm is not found/too old: http://pastie.org/private/7ebsixq9smdon32khsgq
[19:35:28 CEST] <DHE> centos 6 has too old a version of yasm. I built mine from source and run that instead
[19:36:11 CEST] <DHE> hmm.. a quick check shows it may have been upgraded. EPEL has 1.2
[19:36:44 CEST] <Kurogane> i built from git.....
[19:36:46 CEST] <drv> if you are trying to use yasm from your ~/bin directory, that needs to be in $PATH
[19:37:31 CEST] <drv> or use --yasmexe to point to it
[19:43:23 CEST] <Kurogane> I don't understand this warning now: "WARNING: pkg-config not found, library detection may fail." If we need other custom paths, why not put that in the guide, or am I missing something I'm not aware of?
[19:45:07 CEST] <furq> the guide mentions that you need to install pkg-config
[19:45:22 CEST] <furq> and that you need to set PKG_CONFIG_PATH
[19:45:45 CEST] <furq> if your distro calls pkg-config anything other than pkg-config then you'll need to set --pkg-config="pkgconf" or whatever your binary is called
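(Tying drv's and furq's hints together, a configure invocation in the layout the CentOS guide uses might look like the line below; the paths are assumptions taken from that guide:)
    PKG_CONFIG_PATH="$HOME/ffmpeg_build/lib/pkgconfig" ./configure --prefix="$HOME/ffmpeg_build" --yasmexe="$HOME/bin/yasm"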
[19:51:02 CEST] <Kurogane> furkan, yes, you're right, but I still don't understand why it doesn't detect yasm
[19:51:18 CEST] <Kurogane> furq, yes, you're right, but I still don't understand why it doesn't detect yasm*
[20:50:54 CEST] <graphitemaster> I'm explicitly building ffmpeg with --disable-swscale and yet the libavfilter.so is complaining that it needs "libswscale.so.3, needed by ../../thirdparty/usr/lib/libavfilter.so"
[20:55:43 CEST] <kepstin> graphitemaster: a build in a clean source tree?
[20:56:27 CEST] <graphitemaster> yes
[21:16:15 CEST] <c_14> graphitemaster: what was your configure line? what's the output of ldd libavfilter.so? Are you building from git?
[21:21:11 CEST] <vade> I have an AVFrame that is being decoded, marked bt709 color primaries / transfer etc etc. However, my output / encoded stream, per ffprobe, doesn't indicate that my video stream is marked BT709
[21:21:54 CEST] <vade> I don't see an AVStream option / ivar I can fiddle with to set the color space, transfer and primaries / color range etc
[21:22:03 CEST] <vade> how would I go about marking that? :X
[21:24:24 CEST] <vade> oh, I'm an ass, it's in the output stream -> codec
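(What vade found, sketched out: the colour metadata lives on the output codec context; enc_ctx is a placeholder for the output stream's AVCodecContext:)
    enc_ctx->color_primaries = AVCOL_PRI_BT709;
    enc_ctx->color_trc       = AVCOL_TRC_BT709;
    enc_ctx->colorspace      = AVCOL_SPC_BT709;
    enc_ctx->color_range     = AVCOL_RANGE_MPEG;   /* limited/TV range */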
[21:31:15 CEST] <jadesoturi> hi all. I have a question. I used ffmpeg to rip audio from a file that is set to be 60fps and added it to a file with 24fps. The problem now is the sync: if I sync it up at one part of the video, it's out of sync a few minutes later.. how does one solve this?
[21:32:55 CEST] <jadesoturi> I get the same sync problem in both avidemux and vlc... and the merging was also done with ffmpeg
[21:33:31 CEST] <kepstin> to fix that you'll have to adjust the speed of either the audio or video track
[21:34:50 CEST] <jadesoturi> ok. total noob here. how does one do that? can I use ffmpeg?
[21:35:42 CEST] <jadesoturi> it's an aac file (used -acodec copy and saved as .aac when ripping)
[21:35:43 CEST] <kepstin> sure, there are filters that can do either in ffmpeg. you'll want to find out the difference in length between your audio and video tracks to find out how much adjustment is needed.
[21:48:14 CEST] <jadesoturi> it is the aac file I am talking about. it cuts at 1:37:09 but says it's 1:48:17. the original file I ripped it from was 1:37:20 (I skipped the first 11 seconds)
[21:48:43 CEST] <c_14> Then it's probably 1:37:09
[21:49:07 CEST] <jadesoturi> ok. but how do I make it "match" the 60fps 1:37:20 video file?
[21:50:34 CEST] <jadesoturi> is it the atempo filter you guys are talking about? 'cause I can only specify a ratio there, and I'm not sure what that ratio should be when the difference is apparently only 7 seconds or so in time..
[21:51:35 CEST] <c_14> should be around 1.00189
[21:51:56 CEST] <jadesoturi> ok. thx, I'll try that..
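(A hedged example of applying c_14's ratio; the file names are placeholders, and atempo values above 1.0 speed the audio up while values below 1.0 slow it down, so the ratio may need inverting depending on which track runs long:)
    ffmpeg -i video24.mp4 -i audio.aac -map 0:v -map 1:a -c:v copy -filter:a atempo=1.00189 synced.mp4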
[22:12:07 CEST] <vade> hrm. I'm setting my output stream's and codec context's frame rate, prior to opening, to AVRational {num = 25, den = 1}
[22:12:18 CEST] <vade> but ffprobe shows my stream as being FPS 25.11 -
[22:13:03 CEST] <vade> is that a PTS thing for my frames, or is it a container / stream nuance?
[22:56:01 CEST] <kepstin> vade: what container?
[22:56:08 CEST] <vade> mp4
[22:56:35 CEST] <kepstin> huh, I'm not familiar with how pts/framerate is represented in mp4.
[22:56:49 CEST] <vade> i feel like the further I dig the less I know :P
[22:57:48 CEST] <kepstin> I would expect 25fps to work fine in mp4, that's a common framerate for deinterlaced PAL stuff :/
[22:59:14 CEST] <pandb> After a minute or so of live streaming to youtube, my muxer is outputting messages that say "Delay between the first packet and last packet in the muxing queue is 10067000 > 10000000: forcing output"
[23:00:11 CEST] <vade> you might be feeding packets in faster than you output them to your muxer, and the queue has a set length?
[23:00:21 CEST] <vade> just guessing from that message
[23:01:01 CEST] <pandb> How can I rectify that?
[23:01:17 CEST] <pandb> I've looked at the AVFormatContext documentation but I didn't see a suitable-looking setting
[23:01:34 CEST] <pandb> (assuming that's where I should look)
[23:02:02 CEST] <vade> are you encoding too somewhere?
[23:02:47 CEST] <pandb> yes, I think
[23:03:08 CEST] <pandb> I'm taking a sws_scale'd frame and encoding it with avcodec_encode_video
[23:03:30 CEST] <pandb> then calling av_interleaved_write_frame with the resulting packet
[23:04:20 CEST] <pandb> it's actually a modified version of the muxing.c example that comes with ffmpeg
[23:05:50 CEST] <pandb> Perhaps AVFormatContext::max_picture_buffer would be relevant?
[23:17:14 CEST] <vade> yea (sorry walked off for a moment) - I was thinking something along those lines
[23:17:35 CEST] <vade> the encoder buffers n frames based on a bunch of variables I don't quite understand, but you can toy with it
[23:26:30 CEST] <kepstin> assuming you're using x264, you probably want to use the "tune=zerolatency" avoption to make the encoder not buffer lots of frames
[23:35:38 CEST] <DHE> x264 has a default lookahead buffer of 40 (!) frames
[23:35:55 CEST] <DHE> though the speed presets change that
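(kepstin's suggestion expressed in code, as a sketch; enc_ctx and codec are placeholder names:)
    AVDictionary *opts = NULL;
    av_dict_set(&opts, "tune", "zerolatency", 0);  /* stops libx264 buffering a lookahead of frames */
    avcodec_open2(enc_ctx, codec, &opts);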
[00:00:00 CEST] --- Sat Apr 30 2016

