[Ffmpeg-devel-irc] ffmpeg.log.20181029

burek burek021 at gmail.com
Tue Oct 30 03:05:02 EET 2018


[01:09:00 CET] <asterismo_l> furq, is it a bug in the x264 lib?
[09:42:17 CET] <Shibe> Hi, what is wrong with this command? sudo ffmpeg -f kmsgrab -i - -vf 'hwmap=derive_device=vaapi,scale_vaapi=w=1920:h=1080:format=nv12' -vcodec h264_vaapi -f pulse -ac 2 -i 1 -vcodec libx264 -crf 0 -preset ultrafast -acodec pcm_s16le output.mkv
[09:42:27 CET] <Shibe> Option vf (set video filters) cannot be applied to input url 1 -- you are trying to apply an input option to an output file or vice versa. Move this option before the file it belongs to.
[09:57:41 CET] <durandal_1707> Shibe: why do you have two -vcodec options?
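The error above is about option placement: -vf and the second -vcodec sit between the two inputs, so ffmpeg applies them to the pulse input "1". A reordered sketch, assuming h264_vaapi is the one video encoder actually wanted (the stray libx264/-crf/-preset options are dropped):

    sudo ffmpeg -f kmsgrab -i - -f pulse -ac 2 -i 1 \
        -vf 'hwmap=derive_device=vaapi,scale_vaapi=w=1920:h=1080:format=nv12' \
        -c:v h264_vaapi -c:a pcm_s16le output.mkv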
[10:20:07 CET] <MarcoT> Hello
[10:20:18 CET] <MarcoT> may you help me with my problem? I am using the example here (https://ffmpeg.org/doxygen/trunk/decoding__encoding_8c_source.html#l00503) to build a decoder. However, in the example they read from a file, while I should read from a socket, and in the framework I am using I already have TCP packets as input. What I do now is feed those packets to avcodec_decode, but that results in some warnings (ac-tex damaged, Warning MVs not available)
[10:20:31 CET] <MarcoT> And at a certain point avcodec_decode fails, returning a negative number (Error: slice too small)
[10:21:43 CET] <bencoh> Most codecs need complete encoded frames to be able to decode a video stream
[10:22:06 CET] <MarcoT> I am using mpeg
[10:22:28 CET] <bencoh> meaning you need some code to chunk the stream properly before feeding it to the decoder
[10:23:58 CET] <bencoh> you can either write it (it can get pretty complex with some formats), re-use code from some other project, or somehow manage to feed your data to avformat before handing it to avcodec
[10:25:24 CET] <MarcoT> But if I read a video from file, with chunks of size 1448 (length of the packets I receive) it works
[10:25:25 CET] <bencoh> (see AVIOContext for instance ... assuming this is still the way to do it nowadays)
[10:25:38 CET] <bencoh> it "works" yeah
[10:29:42 CET] <MarcoT> Yes, actually I looked at other projects but found nothing that solves my problem, because they use AVInputFormat and AVStream to read directly from a URL...
[10:32:03 CET] <MarcoT> Anyway thanks for your help @bencoh, I will see if I can solve with AVIOContext ;)
[10:32:44 CET] <bencoh> :)
[10:33:00 CET] <bencoh> you might want to check first whether this is still the proper way to do it :)
[10:33:39 CET] <MarcoT> Hmm what do you mean?
[10:34:13 CET] <bencoh> I haven't written ffmpeg-related code for a few years, so I might have missed API changes since then
[10:36:04 CET] <MarcoT> Ah ok, but at least you gave me hope :)
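A minimal sketch of the AVIOContext route bencoh suggests: wrap the socket in a read callback so avformat can probe and re-chunk the stream into complete packets before avcodec sees them. read_tcp() and sock are hypothetical stand-ins for MarcoT's own socket code:

    #include <libavformat/avformat.h>

    /* hypothetical: read_tcp() fills buf with up to buf_size bytes from the socket */
    static int read_packet(void *opaque, uint8_t *buf, int buf_size)
    {
        int n = read_tcp(opaque, buf, buf_size);
        return n > 0 ? n : AVERROR_EOF;
    }

    /* in the setup code, with 'sock' being the connected socket handle: */
    AVFormatContext *fmt = avformat_alloc_context();
    uint8_t *iobuf = av_malloc(4096);
    fmt->pb = avio_alloc_context(iobuf, 4096, 0 /* read-only */,
                                 sock, read_packet, NULL, NULL);
    avformat_open_input(&fmt, NULL, NULL, NULL); /* probes + demuxes the stream */
    /* av_read_frame() then yields complete packets to hand to avcodec */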
[10:41:28 CET] <mertyildiran> Hi, I have a video file that keeps being updated by another program (let's say every 10 seconds, and it's adding more and more frames at the end of the video). What I want to do here is to play that video file using ffmpeg and continue playing the new frames whenever the file is updated by the other program. How can I do that?
[10:41:59 CET] <BtbN> What format is the file in?
[10:42:13 CET] <mertyildiran> I also wonder whether we are able to replay the video whenever the file is updated?
[10:42:21 CET] <mertyildiran> BtbN: it's .mp4
[10:42:28 CET] <BtbN> Impossible then
[10:42:36 CET] <BtbN> mp4 cannot be played until it's finalized
[10:42:53 CET] <mertyildiran> it's finalized
[10:42:59 CET] <BtbN> You just said the opposite
[10:43:45 CET] <mertyildiran> there is a playable file with, let's say, 4 seconds duration. After a 10-second delay we add 2 more seconds to the video file, so it becomes 6 seconds.
[10:43:59 CET] <mertyildiran> what I want to do is continue to play that 2 seconds
[10:44:21 CET] <BtbN> It can't be mp4 then. Unless you completely re-write the whole file from scratch every time, which sounds highly impractical and slow.
[10:44:41 CET] <BtbN> mp4 has a final header with an index on how to play the whole file. That final header sits at the end of the file
[10:45:07 CET] <BtbN> So appending to an mp4 file is impossible, and playing an mp4 file without that header is also impossible.
[10:45:23 CET] <mertyildiran> it's .mp4 and yeah it's rewritten every time. I'm not debating whether it's efficient or not
[10:45:34 CET] <mertyildiran> just want to continue to play the file
[10:45:45 CET] <BtbN> You cannot reasonably "continue" then, as there is no meaningful way to continue.
[10:45:46 CET] <mertyildiran> also wondering if it's possible to replay it
[10:46:02 CET] <BtbN> You'll have to wait until it's no longer being re-written
[10:46:27 CET] <c_14> you can store the current timestamp, reopen the file and seek to the timestamp (assuming the file at that point is complete and indexed)
[10:46:53 CET] <BtbN> You'll need to write your own application for that though. Pretty sure nothing out of the box supports such a weird mode
[10:47:03 CET] <ritsuka> BtbN: it's possible if you use a fragmented mp4, but the player should support it
[10:47:14 CET] <BtbN> ritsuka, this doesn't sound like fragmented mp4 at all
[10:47:23 CET] <BtbN> The easy solution would be to not use mp4, but mkv or mpeg-ts.
[10:47:27 CET] <BtbN> Those can be appended to
[10:47:48 CET] <mertyildiran> OK how can I achieve this if the file is mkv?
[10:47:54 CET] <ritsuka> for example QuickTime Player on macOS supports it, you can watch a mp4 exported by Final Cut and it will update as soon as a new part is ready
[10:47:56 CET] <mertyildiran> could you give me the command?
[10:48:06 CET] <BtbN> Just pipe it into ffmpeg with tail -f
[10:48:36 CET] <BtbN> (and make sure tail -f starts at the beginning, not actually just at the tail)
[10:49:36 CET] <mertyildiran> I was actually using ffplay to play the file. Is there any option for ffplay?
[10:49:51 CET] <c_14> the same thing, but with ffplay instead of ffmpeg
[10:49:52 CET] <BtbN> no, but you can use the exact same method
[10:50:17 CET] <mertyildiran> if there is none then please provide the complete command for ffmpeg because I did not understand the usage
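A sketch of the pipe BtbN means, assuming the recording is now Matroska and only ever appended to; with GNU tail, -c +1 makes it start from the first byte instead of at the tail:

    tail -c +1 -f /path/to/recording.mkv | ffplay -
    tail -c +1 -f /path/to/recording.mkv | ffmpeg -i - -c copy out.mkv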
[10:50:26 CET] <BtbN> One problem with this, though, is that it will never know the file has actually finished, and will hang forever at the end
[10:50:54 CET] <mertyildiran> "hangover at the end" sure that's the exact behavior I'm looking for
[10:50:56 CET] <BtbN> What even is the use case for that? It sounds as weird as it gets.
[10:51:12 CET] <mertyildiran> live-stream teaching solution
[10:51:18 CET] <BtbN> Well, at some point, it will _actually_ end. And you have no way to finish cleanly then
[10:51:20 CET] <bencoh> uh?
[10:51:42 CET] <BtbN> And why is your live stream in an mp4 file that gets constantly fully re-written?
[10:51:53 CET] <mertyildiran> https://github.com/3b1b/manim
[10:52:14 CET] <mertyildiran> This library generates math animations (using ffmpeg by the way)
[10:52:57 CET] <mertyildiran> so I'm investigating methods to turn that video generation into live-stream
[10:53:19 CET] <mertyildiran> though I couldn't figure it out directly from the library itself and looking for a workaround
[10:53:33 CET] <BtbN> Unless that thing generates a constant and steady flow of frames, any kind of livestream from it will fail
[10:54:44 CET] <Kleiner0901> Does anybody know how to input a H264 fragment into FFMpeg?
[10:55:16 CET] <BtbN> define h264 fragment. Just a plain .h264 file with raw bitstream?
[10:55:17 CET] <Kleiner0901> I have raw H264 frames created both by extracting from a mp4 with ffmpeg and created programmatically
[10:55:46 CET] <mertyildiran> it generates animation by animation. Watch one of his videos so you will understand the flow https://www.youtube.com/watch?v=WUvTyaaNkzM
[10:57:42 CET] <Kleiner0901> >"define h264 fragment. Just a plain .h264 file with raw bitstream?"    -   a file beginning 0000 0001 <h264 bitstream>
[10:58:11 CET] <BtbN> ffmpeg can take that as normal input. Use one of the various concat methods to combine them.
[10:58:56 CET] <Kleiner0901> thanks I'll look for those
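Raw Annex B streams can simply be byte-concatenated, so the concat protocol is likely the easiest of those methods; a sketch assuming two fragment files:

    ffmpeg -f h264 -i "concat:frag1.h264|frag2.h264" -c copy out.mp4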
[11:07:51 CET] <sibok> Hi, I'm on Funtoo GNU/Linux and compiling ffmpeg fails saying smbclient does not exist. Samba is already installed on the system and the needed /usr/include/samba-4.0/libsmbclient.h also exists. This [1] is the full (13k lines) log file. And this [2] is the last 500 lines of the config.log file. Could someone take a look? thx [1]https://drive.google.com/open?id=1f7SAahTjBUJIoXtvOlOExULC5Z8h5KHI [2]https://pastebin.com/Dn3502cu
[11:19:29 CET] <BtbN> sibok, there's a bunch of errors in that header, so it can't compile the test program.
[11:19:59 CET] <BtbN> try latest ffmpeg master, if it doesn't build there, report a bug
[11:27:39 CET] <mertyildiran> BtbN: I changed the output format to ".mkv" So what's the easy solution you were talking about?
[11:28:06 CET] <mertyildiran> I'm also able to get .mov format
[11:28:25 CET] <BtbN> That depends... does it still re-create the whole file every time? If it just keeps appending, just use tail -f ... | ffmpeg ...
[11:28:29 CET] <BtbN> mov is mp4.
[11:28:42 CET] <BtbN> at least for all that matters
[11:29:24 CET] <mertyildiran> BtbN: I don't understand, what do you mean with "-f ... | ffmpeg ..."?
[11:29:42 CET] <mertyildiran> "tail -f ... | ffmpeg ..."
[11:29:54 CET] <mertyildiran> what are these ....s?
[11:30:02 CET] <BtbN> dots, put the right stuff you need there..
[11:30:47 CET] <BtbN> Also, if it still re-writes the whole file every time, that won't work. I don't think there's a proper way to do that at all
[12:02:15 CET] <Kleiner0901> BtbN do you know where I should look for converting a raw frame to an image? My attempt so far is resulting in failure https://i.imgur.com/BuwwnLX.jpg
[12:02:34 CET] <BtbN> raw frame as in raw yuv/rgb?
[12:02:59 CET] <Kleiner0901> raw h264 extracted from yuv420p
[12:03:36 CET] <BtbN> That looks like you told ffmpeg to interpret the h264 file as raw uncompressed yuv.
[12:04:00 CET] <Kleiner0901> ohhh
[12:25:28 CET] <Kleiner0901> "looks like you told ffmpeg to interpret the h264 file as raw uncompressed yuv." - thanks got it to work :)
[13:31:01 CET] <fling> Is there a fine filter for making a slow motion video?
[13:31:12 CET] <fling> The problem is the source has only 25 fps
[13:40:08 CET] <durandal_1707> fling: minterpolate... there are other solutions too, but not in lavfi
[13:42:19 CET] <fling> durandal_1707: thanks will try it
[13:48:16 CET] <fling> durandal_1707: I'm about to make a side-by-side comparison of two clips, wanted to slow them down to make the similarity more obvious to the viewer.
[13:49:18 CET] <JEEB> you could use minterpolate and then slow it down (or in the other order of first slowing down and then doing minterpolate)
[13:49:58 CET] <fling> I will script this today.
[13:50:42 CET] <fling> Both clips are at 25fps but the object in the video is moving at different speeds, so I will need to change the fps twice on one of them or drop some frames.
[14:25:42 CET] <analogical> how do I use FFmpeg to split randomvideo.mp4 into one video track and one audio track ?
[14:26:48 CET] <BtbN> -c:v copy -an -sn /// -c:a copy -vn -sn
[14:28:22 CET] <analogical> what does the /// do ?
[14:29:20 CET] <spaam> it's a separator between the first set of arguments and the second one.
[14:29:50 CET] <analogical> is it really necessary?
[14:30:25 CET] <spaam> you need to run ffmpeg twice ..
[14:30:55 CET] <BtbN> Technically you can do it in one invocation, but just running it twice is simple enough
[14:58:10 CET] <analogical> [NULL @ 0000000002c61080] Unable to find a suitable output format for '///'
[14:58:46 CET] <durandal_1707> you are not supposed to put that
[15:03:10 CET] <analogical> ffmpeg.exe -i video.mp4 -c:v copy -an -sn doesn't work :/
[15:03:34 CET] <fling> How do I slowdown with minterpolate properly? Should I just set output fps somehow?
[15:03:50 CET] <JEEB> I think setpts filter forces the frame rate instead of dropping/adding frames
[15:04:02 CET] <JEEB> and then minterpolate, which interpolates that lower fps up to a higher one
[15:04:43 CET] <fling> Or I can set the input fps
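A sketch of JEEB's ordering for a 2x slowdown of 25 fps material: stretch the timestamps first, then let minterpolate synthesize frames back up to 25 fps (the filenames and the 2x factor are assumptions):

    ffmpeg -i clip.mp4 -vf "setpts=2.0*PTS,minterpolate=fps=25" slowmo.mp4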
[15:06:40 CET] <analogical> please tell me exactly what to type so I can split the audio and video from movie.mp4 into two separate tracks?
[15:07:20 CET] <TheSashmo> does anyone know if SMPTE 2110-20 support has been added to any ffmpeg builds or any other open source solutions?
[15:09:34 CET] <JEEB> TheSashmo: there's rtpdec_rfc4175
[15:10:05 CET] <TheSashmo> thanks JEEB
[15:10:24 CET] <JEEB> which supposedly is related
[15:13:00 CET] <JEEB> seems like its naming was mostly related to the fact that at the point of merging 2110-20 was not yet published :P
[15:45:17 CET] <fling> What does this mean? [ffmpeg/video] h264: Invalid NAL unit 0, skipping.
[15:45:30 CET] <fling> it is sometimes unit 7 or unit 8 in mpv output
[15:48:35 CET] <dustobub> Would anyone be interested in doing a small custom (paid) patch on libavfilter/vf_pad?
[15:49:41 CET] <JEEB> fling: means that the NAL unit (AVC and HEVC packets are called NAL units) type id was invalid
[15:49:53 CET] <JEEB> aka "I dunno what type of NAL unit this is, ignoring"
[15:50:30 CET] <JEEB> dustobub: I would guess durandal_1707 is one of the filtering people
[15:50:49 CET] <fling> JEEB: is this bad? :P
[15:51:29 CET] <JEEB> could be anything from "This is a bug" to "the file just has random data in there"
[15:51:44 CET] <JEEB> if you have a sample available you could post it on the trac
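For context: in H.264 Annex B, the NAL unit type is the low 5 bits of the byte following the start code, so a parser classifies it roughly like this (a sketch; an "invalid" 7 or 8 usually means the parameter set payload itself failed to parse):

    int nal_unit_type = nal[0] & 0x1F; /* first byte after 00 00 01 */
    /* 0 = unspecified (invalid), 1 = non-IDR slice, 5 = IDR slice,
       7 = SPS, 8 = PPS */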
[15:52:02 CET] <dustobub> JEEB: thanks! I'll ping them.
[15:52:20 CET] <JEEB> dustobub: also generally I guess you could make a feature request on the trac issue tracker
[15:53:18 CET] <fling> It works!
[15:53:34 CET] <fling> First I padded the first clip ffmpeg -y -i "concat:$annexb_pad|$annexb_first" -c copy -bsf:a aac_adtstoasc $padded_first
[15:53:37 CET] <dustobub> JEEB: yeah, but it's for a very specific use-case and I doubt would be generally useful. I need the padding color to be unique per frame (with around 1000 unique colors/frames possible)
[15:53:55 CET] <fling> Then I stacked it with the second clip and it looks great!
[15:56:51 CET] <fling> analogical: ffmpeg -i randomvideo.mp4 -c:v copy -an video-track.mp4
[15:57:30 CET] <fling> analogical: ffmpeg -i randomvideo.mp4 -c:a copy -vn audio-track.mp4
[15:57:50 CET] <fling> analogical: might also want -sn
[16:20:30 CET] <fling> ffmpeg -r 30 -i $padded_first -r 25 -i $second -filter_complex hstack&
[16:20:41 CET] <fling> How do I slow it down properly? ^
[16:20:52 CET] <q3cpma> Hello, does anyone know if the "100 buffers queued in out_0_0, something may be wrong." message is important?
[16:20:53 CET] <fling> -filter_complex hstack,setpts=?
[16:24:52 CET] <fling> Looks like I'm not doing it properly
[16:25:18 CET] <fling> minterpolate should go before hstack/vstack
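A sketch of that ordering, with each input slowed and interpolated on its own branch before stacking (filenames, the 2x factor, and the 25 fps target are assumptions; both branches need matching fps and height for hstack):

    ffmpeg -i first.mp4 -i second.mp4 -filter_complex \
      "[0:v]setpts=2.0*PTS,minterpolate=fps=25[a]; \
       [1:v]setpts=2.0*PTS,minterpolate=fps=25[b]; \
       [a][b]hstack" out.mp4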
[16:40:01 CET] <durandal_1707> q3cpma: it may be bad if it increases to 1000 buffers, what filtergraph?
[16:42:28 CET] <q3cpma> durandal_1707: None, just copying some streams and converting truehd 24bit to flac s16.
[16:43:06 CET] <durandal_1707> ah, then it should not increase to 1000 at all
[16:43:26 CET] <q3cpma> I suppose, the interweb only gave me filtering problems.
[17:10:29 CET] <fling> How do I convert to anaglyph?
[17:10:57 CET] <fling> Should I just stack and apply the filter I use for viewing 3d in glasses?
[17:13:53 CET] <durandal_1707> fling: anaglyph is old tech
[17:15:23 CET] <durandal_1707> if you have separate files for each eye and want to do one of anaglyphs use hstack/vstack & stereo3d filters
[17:15:44 CET] <fling> durandal_1707: this is what I'm thinking about.
[17:17:18 CET] <durandal_1707> http://trac.ffmpeg.org/wiki/Stereoscopic
[17:26:32 CET] <fling> durandal_1707: this worked for me -vf stereo3d=sbsl:arch
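Putting durandal_1707's two steps together, a sketch assuming separate per-eye files: stack them side by side, then convert with stereo3d (sbsl = side-by-side left-eye-first input, arch = red/cyan half-color anaglyph output):

    ffmpeg -i left.mp4 -i right.mp4 -filter_complex \
      "[0:v][1:v]hstack,stereo3d=sbsl:arch" anaglyph.mp4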
[18:29:59 CET] <gh0st3d> Hey everyone, looking for some help. I've got 2 commands that create an animated gif with the first 10 seconds of a video. I want to change the 2nd line to also include a png file as an overlay on the gif... Is this even doable in 2 commands? Here's the 2 lines and my attempt at changing the 2nd line: https://hastebin.com/ecayexufec.md
[18:31:19 CET] <durandal_1707> that link looks broken to me
[18:32:56 CET] <gh0st3d> Hmm, works for me, here's a different one though:   https://pastebin.com/tVvY3ehD    the ffmpeg.pastebin.com link from the channel info is not working for me so I used these
[18:33:28 CET] <gh0st3d> Ugh, also ignore the ffmpeg typo in the first line. That was from me editing out the python code
[18:36:35 CET] <durandal_1707> gh0st3d: you need to merge two complex graphs into one
[18:37:20 CET] <gh0st3d> Ohh ok. I'll try and figure that out in a few and get back on here if I need more help
[18:37:24 CET] <gh0st3d> Thanks for the info!
[18:39:00 CET] <durandal_1707> gh0st3d: ffmpeg -t 10 -i "/tmp/videofile.mp4" -i overlay.png -i /tmp/palette.png -filter_complex "[0:v][1:v]overlay=25:25:enable='between(t,0,10)',fps=10,scale=320:-1:flags=lanczos[x];[x][2:v]paletteuse"
[18:39:08 CET] <durandal_1707> something like that
[18:39:51 CET] <durandal_1707> note that you may need to use palettegen with overlay too, to get best results
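The matching first pass would push the same overlaid chain through palettegen, so the palette also covers the overlay's colors; a sketch reusing the filenames above:

    ffmpeg -t 10 -i /tmp/videofile.mp4 -i overlay.png -filter_complex \
      "[0:v][1:v]overlay=25:25,fps=10,scale=320:-1:flags=lanczos,palettegen" \
      /tmp/palette.png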
[18:50:23 CET] <devinheitmueller> Does anyone have any thoughts on how safe it would be to call avfilter_graph_alloc_filter() from within a filter itself?
[18:51:14 CET] <devinheitmueller> The use case is a filter which includes a bunch of business logic which decides what combination of scaling/fps/pad to run against video frames.
[18:53:16 CET] <devinheitmueller> Doh, meant to say avfilter_graph_create_filter, not avfilter_graph_alloc_filter.
[18:53:28 CET] <durandal_1707> you mean separate graph?
[18:53:53 CET] <devinheitmueller> No, I want a filter which adds other filters to the current graph.
[18:54:35 CET] <devinheitmueller> I don't want to create a huge filter which replicates the functionality in scale/fps/pad, and I would prefer not to jam the logic into ffmpeg.c or avfiltergraph.c, since it's specific to one use case.
[18:54:51 CET] <durandal_1707> ugh, that cannot be possible
[18:55:40 CET] <durandal_1707> to be sure, also ask on devel mailing list
[18:55:50 CET] <devinheitmueller> Fair enough.  Figured I would start here first.
[19:11:35 CET] <iive> devinheitmueller, maybe you want to create a separate graph, so your filter takes the input, processes it through the graph and outputs the result of the graph...
[19:11:51 CET] <iive> no idea if that is possible at all .
[19:11:56 CET] <devinheitmueller> Yeah, I had considered that.
[19:12:36 CET] <devinheitmueller> It's not a terrible idea. It obscures from the outside, though, that the filter references other filters, which could break things if the filter uses fifos.
[19:13:13 CET] <devinheitmueller> I do like that today you can dump out the entire filter graph to stderr, and see all the elements in one view.
[19:14:36 CET] <devinheitmueller> It feels like a concept that could be really useful (i.e. filters which realize they need to invoke other filters), but I can imagine it causing issues with stuff that is in the process of iterating over the list of graph filters.
[19:15:03 CET] <devinheitmueller> We do this today for the scaling filter, but it's buried in avfiltergraph.c and isn't very flexible.
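For reference, this is the call under discussion; a minimal sketch of creating a scale instance inside an existing graph and splicing it after some neighbor (graph and prev_ctx are assumed to already exist):

    #include <libavfilter/avfilter.h>

    AVFilterContext *scale_ctx;
    int ret = avfilter_graph_create_filter(&scale_ctx,
            avfilter_get_by_name("scale"), /* filter definition */
            "auto_scale",                  /* instance name */
            "1280:720",                    /* options string */
            NULL, graph);
    if (ret >= 0)
        ret = avfilter_link(prev_ctx, 0, scale_ctx, 0); /* splice after prev_ctx */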
[19:33:48 CET] <aloo_shu> hey, I think I can describe a long-standing and reproducible bug, but would not like to go through filing a proper report myself. Describing it is simple. Anybody willing to listen for a minute?
[19:36:22 CET] <durandal_1707> ?
[19:37:26 CET] <ChocolateArmpits> aloo_shu, well we can listen
[19:37:56 CET] <aloo_shu> the documentation suggests, in two places, to use the image2 filter as an input filter, in order to combine a series of images into a video
[19:38:15 CET] <ChocolateArmpits> image2 format * :)
[19:38:25 CET] <aloo_shu> like, for instance ffmpeg -f image2 -pattern_type glob -i 'foo-*.jpeg' -r 12 -s WxH foo.avi
[19:38:48 CET] <aloo_shu> where WxH needs to be specified
[19:39:40 CET] <aloo_shu> what it does instead, as I've repeatedly observed over the years, and now up to v. 4, is:
[19:40:29 CET] <aloo_shu> 1) it *overwrites* the input files with resized versions of themselves
[19:41:06 CET] <aloo_shu> 2) it converts exactly one image only to the desired video format
[19:43:42 CET] <aloo_shu> either a) image2 needs fixing, or b) combining images needs a different approach, like e.g. concatenate, or c) I've been blatantly missing something
[19:44:43 CET] <durandal_1707> how can it overwrite, when that has never worked?
[19:45:21 CET] <aloo_shu> try to reproduce, you'll see
[19:46:05 CET] <durandal_1707> aloo_shu: what version of ffmpeg you use?
[19:46:11 CET] <aloo_shu> seems to be a first pass before converting
[19:46:27 CET] <aloo_shu> wait a sec, but various
[19:49:07 CET] <furq> aloo_shu: http://vpaste.net/4UrjN
[19:49:41 CET] <durandal_1707> i just tried and: 1) did not happen, 2) same
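For comparison, the usual modern form of that command, with the rate given as an input option (-framerate) and the resize done by a filter instead of -s (filenames and size are placeholders):

    ffmpeg -framerate 12 -pattern_type glob -i 'foo-*.jpeg' -vf scale=640:480 foo.avi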
[19:54:27 CET] <aloo_shu> ah, getting nearer, I thought I had newer versions. I've got 1.0.10 and a statically compiled one from about the same time, both armv7; the other instance was on ppc Debian a while back. That turns it into a different question: is any newer version available, backported to Debian wheezy (the highest that will run on the kernel that Android is forcing me to use), for armhf/armv7?
[19:59:25 CET] <furq> https://www.johnvansickle.com/ffmpeg/
[19:59:29 CET] <furq> you can try the arm builds on here
[20:07:10 CET] <aloo_shu> furq: I think that's where I got my static build from (built 2014), but there seem to be newer ones. I shall manually check wheezy-backports too; deb-multimedia gives me 1.0.10 only
[20:08:34 CET] <aloo_shu> thanks so far, keep up the good work :) (I'd love to see all linux software so well documented)
[22:12:55 CET] <flying_sausages> Hey guys, I'm getting a flood of output like the example in the paste even when using the -v quiet option. Anyone know why that is and how can i get rid of it?
[22:12:57 CET] <flying_sausages> https://privatebin.net/?028e33cc9cfd269e#PdjX93xID+7HUSLcJc+4+0/u8NJgQ0LNDOlMnlPDiQk=
[22:15:55 CET] <durandal_1707> flying_sausages: are you using some ffprobe command?
[22:18:42 CET] <ariyasu> try adding -stats
[22:18:56 CET] <ariyasu> ffmpeg -v quiet -stats 'rest of your command here'
[22:18:57 CET] <flying_sausages> durandal_1707, not as far as I'm aware, I'm on WSL
[23:09:14 CET] <brimestone> Hey guys, I'm trying to use ffprobe -select_streams v:0 -show_entries stream=avg_frame_rate which works... but how can I look at the timecode? It's TAG:timecode=23:49:38:00
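A sketch of pulling just that tag out, assuming the timecode really is in the video stream's tags as shown:

    ffprobe -v error -select_streams v:0 \
            -show_entries stream_tags=timecode \
            -of default=noprint_wrappers=1:nokey=1 input.mov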
[00:00:00 CET] --- Tue Oct 30 2018