[Ffmpeg-devel-irc] ffmpeg.log.20170127

burek burek021 at gmail.com
Sat Jan 28 03:05:02 EET 2017


[01:00:20 CET] <the_k> hello ppl
[01:03:12 CET] <the_k> http://imgur.com/nSTYQdn
[01:03:21 CET] <the_k> http://i.imgur.com/nSTYQdn.png
[01:03:59 CET] <the_k> i'm recording a stream here.. just wondering if anyone could point out anything that could be corrected as i see some errors in the display
[01:04:02 CET] <the_k> it does work though
[01:05:59 CET] <the_k> is there a build for windows with pthread support, and would it help?
[01:17:24 CET] <furq> recording from a live source to a non-streaming container seems like a bad idea
[01:17:35 CET] <furq> also -b and -r don't do anything if you're copying streams
[01:18:19 CET] <thebombzen> is there a way for libopus to do vbr encoding (like libvorbis's -q:a or libx264
[01:18:26 CET] <thebombzen> or its -crf:v)
[01:18:53 CET] <furq> -b:a
[01:19:25 CET] <furq> there is no abr mode in opus, -b:a 192k is more or less the same thing as lame -V2
[01:19:39 CET] <furq> it's nominally 192k but won't actually attempt to hit that bitrate
[01:21:00 CET] <furq> -vbr on is the default iirc but if not then turn that on too
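furq's advice above, collected into a single command-line sketch (filenames are placeholders; -vbr on is the libopus default, spelled out for clarity):

```shell
# Sketch: Opus VBR encode. -b:a sets a nominal target, not a hard cap,
# so 192k here behaves roughly like lame -V2 rather than a fixed bitrate.
opus_cmd="ffmpeg -i input.wav -c:a libopus -b:a 192k -vbr on out.opus"
echo "$opus_cmd"
```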
[01:28:30 CET] <faLUCE> hello. I'm trying to h264-encode YUYV422 images. I already encoded YUV420P images, and all worked fine. But with YUYV422 I obtain this error: [libx264 @ 0x29d81a0] Input picture width (320) is greater than stride (0). I'm sure that I have filled the img's stride with av_image_alloc(frame->data, frame->linesize, 640, 480, AV_PIX_FMT_YUYV422, 1); and I have a linesize of {1280, 0, 0}... so... why this error?
[01:36:21 CET] <the_k> furq, so i should convert it?
[01:36:36 CET] <the_k> i wanted the raw input so that i don't have any loss
[01:38:08 CET] <furq> if you want 60fps output then you have to convert it
[01:38:20 CET] <furq> although that's just going to duplicate every frame three times, so i don't see why you would
[01:38:28 CET] <furq> you should definitely use a more robust container like mpegts though
[01:42:55 CET] <the_k> ah yeah that part of the string was a suggestion from a friend
[01:43:01 CET] <the_k> i tried asking him why it was needed
[01:43:04 CET] <the_k> didn't answer
[01:43:13 CET] <the_k> same goes for the 3000kbit part..
[01:43:35 CET] <the_k> couldn't figure out how if it was saving the raw input it would need a bandwidth figure
[01:47:09 CET] <thebombzen> the_k: if you're just trying to dump the stream to view it later, you can use ffmpeg -i rtsp://input_stream -c copy output_stream.ts
[01:47:27 CET] <thebombzen> that will never end if the stream never ends though, so be careful with that
[01:47:42 CET] <the_k> -i is input?
[01:48:09 CET] <the_k> i was going to terminate it after every 24 hours
[01:48:23 CET] <the_k> with -t 24:00:)0
[01:48:26 CET] <the_k> with -t 24:00:00
[01:48:32 CET] <thebombzen> "-i foo" means use "foo" as an input
[01:48:37 CET] <the_k> yep ok
[01:48:44 CET] <furq> if you want 24-hour chunks you'd be better off using the segment muxer for that
[01:49:41 CET] <the_k> ah
[01:50:43 CET] <furq> -f segment -strftime 1 -segment_time 86400 out%F-%T.ts
[01:51:10 CET] <furq> will give something like out2017-01-27-00:00:00.ts
[01:51:12 CET] <thebombzen> does the segment muxer only mux mpegts into segments?
[01:51:24 CET] <furq> the examples show it working with nut
[01:51:28 CET] <furq> i'd imagine flv etc work as well
[01:51:36 CET] <thebombzen> how do you specify the format then
[01:51:42 CET] <furq> er
[01:51:43 CET] <thebombzen> it appears you did via filename but -f won't work
[01:51:46 CET] <furq> oh right
[01:51:59 CET] <thebombzen> so like how would you force the format
[01:52:04 CET] <furq> is there a need to
[01:52:08 CET] <thebombzen> no, just curious
[01:52:17 CET] <furq> i don't see how you'd use -f segment to output to anything other than files
[01:52:21 CET] <thebombzen> or is this one of those muxers like image2 where the filename is important
[01:52:27 CET] <thebombzen> so you can't
[01:52:37 CET] <furq> probably not
[01:52:58 CET] <furq> oh
[01:53:00 CET] <furq> -segment_format
[01:53:00 CET] <furq> duh
[01:53:11 CET] <the_k>  Could not get segment filename with strftime
[01:53:25 CET] <furq> weird
[01:53:36 CET] <the_k> it's a camera stream
[01:53:40 CET] <furq> well `-f segment -segment_time 86400 out%d.ts` works too
[01:53:44 CET] <the_k> should it contain a time index?
[01:53:51 CET] <furq> no that's based off wallclock time
[01:53:53 CET] <the_k> rtsp stream
[01:53:58 CET] <the_k> right
[01:53:59 CET] <furq> it's not affected by the input
[01:54:02 CET] <the_k> oh
[01:54:04 CET] <the_k> right ok
[01:54:48 CET] <the_k> ok this works but it's showing the bitrate as N/A
[01:54:55 CET] <the_k> and size=N/A
[01:54:59 CET] <the_k> which is different
[01:55:00 CET] <furq> you can also use `-segment_atclocktime 1` if you want it to always change over at midnight
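Putting furq's segment options together, a daily-chunk recording command might look like this (the rtsp URL is a placeholder; the strftime pattern is the Windows-safe one discussed below):

```shell
# Sketch: dump a live stream into 24-hour segments with stream copy,
# rolling over at clock time thanks to -segment_atclocktime 1.
seg_cmd="ffmpeg -i rtsp://camera/stream -c copy -f segment -strftime 1 -segment_time 86400 -segment_atclocktime 1 out%Y%m%d-%H%M%S.ts"
echo "$seg_cmd"
```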
[01:55:25 CET] <faLUCE> hello. I'm trying to h264-encode YUYV422 images. I already encoded YUV420P images, and all worked fine. But with YUYV422 I obtain this error: [libx264 @ 0x29d81a0] Input picture width (320) is greater than stride (0). I'm sure that I have filled the img's stride with av_image_alloc(frame->data, frame->linesize, 640, 480, AV_PIX_FMT_YUYV422, 1); and I have a linesize of {1280, 0, 0}... so... why this error?
[01:55:28 CET] <the_k> the file plays fine though
[01:55:36 CET] <furq> maybe the segment muxer breaks the bitrate/size display
[01:55:44 CET] <furq> since it'd have to keep track of multiple files
[01:55:57 CET] <furq> it's not really useful if you're copying anyway
[01:56:03 CET] <furq> as long as the frame counter works you know it's running
[01:56:23 CET] <the_k> ok well the file actually plays now while it's saving to disk
[01:56:30 CET] <the_k> so that's a good improvement
[01:56:32 CET] <furq> yeah that's one of the benefits of using mpegts
[01:56:40 CET] <the_k> ok great
[01:56:46 CET] <the_k> thanks for that!!
[01:56:54 CET] <furq> it also won't make the file unplayable if the input drops
[01:57:12 CET] <the_k> right ok
[01:57:31 CET] <furq> i wonder why strftime doesn't work
[01:57:35 CET] <furq> maybe it's a windows thing
[01:57:53 CET] <the_k> i've no idea but i could test a few commands if you want
[01:58:01 CET] <furq> oh
[01:58:04 CET] <furq> no %F on windows
[01:58:07 CET] <the_k> ah!
[01:58:32 CET] <furq> %F-%T is the same as %Y-%m-%d-%H:%M:%S
[01:58:39 CET] <furq> so try it with that if you want the files timestamped
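The equivalence furq states can be checked with GNU date (epoch 0 used as a fixed example):

```shell
# GNU date check: %F-%T renders the same string as %Y-%m-%d-%H:%M:%S
# (Windows strftime lacks %F and %T, hence the error above).
short=$(date -u -d @0 +%F-%T)
long=$(date -u -d @0 +%Y-%m-%d-%H:%M:%S)
echo "$short"   # 1970-01-01-00:00:00
```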
[01:58:56 CET] <the_k> so i'll add -strftime 1
[01:58:59 CET] <furq> yeah
[01:59:12 CET] <the_k> i can add this just before the file output name right?
[01:59:26 CET] <furq> anywhere between the input and output filenames
[01:59:39 CET] <the_k> hm
[01:59:40 CET] <the_k> out27.ts
[01:59:44 CET] <the_k> this is the filename
[01:59:53 CET] <furq> what's your command
[02:00:05 CET] <the_k> .... out%d.ts
[02:00:10 CET] <furq> oh right
[02:00:14 CET] <the_k> i want %Y-%m-%d-%H:%M:%S ?
[02:00:24 CET] <furq> yeah i meant out%Y-%m-%d-%H:%M:%S.ts
[02:00:31 CET] <furq> or replace out with whatever prefix you want
[02:01:00 CET] <the_k> [segment @ 0000000003b73820] Failed to open segment 'fdout2017-01-27-01:00:46.ts'
[02:01:00 CET] <the_k> Could not write header for output file #0 (incorrect codec parameters ?): Protocol not found
[02:01:13 CET] <furq> uhh
[02:01:18 CET] <the_k> colons
[02:01:21 CET] <furq> yeah
[02:01:24 CET] <furq> that's stupid
[02:01:34 CET] <furq> maybe it'll work in quotes, but w/e
[02:01:50 CET] <the_k> fdout2017-01-27-01-01-40.ts
[02:01:56 CET] <the_k> cool :)
[02:02:09 CET] <the_k> no, windows won't allow a colon
[02:02:30 CET] <the_k> could use a ;
[02:02:30 CET] <furq> oh right yeah no colons in filenames
[02:02:51 CET] <the_k> fdout2017-01-27-01;02;43.ts
[02:02:57 CET] <the_k> looks a little silly but better
[02:03:19 CET] <furq> i mean i'd probably just use %Y%m%d-%H%M%S
[02:03:21 CET] <furq> but whatever suits you
[02:04:17 CET] <the_k> mm
[02:04:22 CET] <the_k> ok thanks for that
[02:06:22 CET] <thebombzen> no colons in filenames
[02:06:26 CET] <thebombzen> what?
[02:06:45 CET] <thebombzen> I know NTFS already has weird restrictions like no parentheses or ?
[02:06:55 CET] <thebombzen> but colons?
[02:07:04 CET] <thebombzen> why not
[02:08:35 CET] <the_k> ah this is great. i can play the file directly from the drive instead of taking up more network bandwidth
[02:08:55 CET] <the_k> and yet i can also now skip back and forth in time
[02:12:27 CET] <furq> thebombzen: drive letters
[02:12:51 CET] <thebombzen> oh right
[02:13:06 CET] <furq> that one at least makes sense
[02:13:10 CET] <thebombzen> but other than \, :, and null, why does windows not allow other weird things
[02:13:16 CET] <furq> i'm not sure why <>*| aren't permitted
[02:13:33 CET] <thebombzen> oh because in windows you can't backslash a character
[02:13:35 CET] <furq> / is usually an acceptable substitute for \ and also that would be a nightmare for nfs and stuff
[02:13:58 CET] <furq> oh and ? isn't allowed as well
[02:14:00 CET] <thebombzen> forward slash can be interpreted as a path separator for compatibility
[02:14:06 CET] <furq> some of these are probably dos holdovers
[02:14:25 CET] <thebombzen> well * and ? are primitive expansions but you should be able to quote them.
[02:14:31 CET] <furq> yeah
[02:16:01 CET] <furq> this page i'm reading this on also mentions PATH length, and don't get me started on that
[02:16:08 CET] <furq> er, path length. not $PATH length
[02:30:03 CET] <faLUCE> can I x264-encode directly from YUYV422 frames? I can't open the codec with this format (Specified pixel format yuyv422 is invalid or not supported) when calling avcodec_open2(), and I'm forced to resample it to YUV420P, but I see that the ffmpeg command line can encode with YUYV422 format.... what's wrong? how can I debug?
[02:30:26 CET] <thebombzen> furq: silly goose
[02:30:29 CET] <thebombzen> it's %PATH%
[02:33:34 CET] <furq> faLUCE: you have to convert it to planar
[02:33:39 CET] <furq> yuv422p should work
[02:36:15 CET] <faLUCE> furq: nope, when executing:  "ffmpeg -f v4l2 -i /dev/video0 -pix_fmt yuyv422 -c:v libx264 -tune zerolatency -b:v 900k -listen 1 -f mpegts http://localhost:5556"  it works, and also the player shows yuyv422 info, without the "p" (planar) token
[02:37:51 CET] <faLUCE> furq: sorry, you are right
[02:38:35 CET] <faLUCE> the strange thing is that the ffmpeg cli doesn't show the planar token:   Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 640x480, 147456 kb/s, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
[02:41:42 CET] <furq> that's the input stream
[02:42:04 CET] <faLUCE> furq: yes, sorry again.
[02:43:05 CET] <furq> the cli will normally try to convert to the closest accepted pixel format
[02:43:36 CET] <furq> i didn't know it did that if you specify -pix_fmt but apparently it does now
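For faLUCE's case, furq's fix is to feed libx264 a planar format; from the command line the same conversion can be forced explicitly. This mirrors faLUCE's own capture command with a format filter added (device path and URL are from his example):

```shell
# Sketch: force packed yuyv422 input to planar yuv422p before libx264.
cap_cmd="ffmpeg -f v4l2 -i /dev/video0 -vf format=yuv422p -c:v libx264 -tune zerolatency -b:v 900k -f mpegts http://localhost:5556"
echo "$cap_cmd"
```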
[02:46:27 CET] <faLUCE> well. Now my lib is finished, after three weeks of hard work. I can stream x264 over http with very low latency,
[02:46:49 CET] <faLUCE> with both audio and video perfectly in sync
[03:00:43 CET] <thebombzen> what's the practical difference between -pix_fmt and -vf format, and similarly between -s and -vf scale?
[03:02:03 CET] <furq> i'm pretty sure one is an alias for the other in both cases
[03:02:20 CET] <furq> iirc the difference is that the flag just appends it to the end of the filterchain
[03:02:30 CET] <furq> rather than letting you put it where you want
[03:45:39 CET] <beastwick> Hi, I am trying to use libav to open an x11grab device and encode the video. I *kind of* understand how this is all put together as I was working off of two different tutorials. I am able to see the encoding going for 250 frames and write to a file, that does fill with data, but when I try playing it back in VLC I see nothing. There is no duration information or image.
[03:46:08 CET] <beastwick> http://pastebin.com/u3pvTGEX
[04:01:39 CET] <beastwick> My bad, that pastebin had a bug.
[04:02:01 CET] <beastwick> http://pastebin.com/6zxXRJaR
[04:02:07 CET] <beastwick> same issue
[04:10:50 CET] <NermaN> Hello, when i'm trying to encode video using h264_vaapi encoder it seems to get only empty frames (full debug log here: http://pastebin.com/Pb9PdMx8 ) is there any way to solve this issue?
[05:11:47 CET] <bms20> Hi All, I've got a question r/e reference frames and uploading textures to GL.  I am wondering if I can coerce the decoder into writing its output into a PBO which I can render from another thread.  My concern is that when the PBO is locked for rendering the decoder will not be able to use it as a reference frame.  Is there a way to do this, or am I missing the point?
[05:40:52 CET] <thebombzen> bms20__: an avcodec decoder won't necessarily output the right pixel format - for example afaik OpenGL can't render a yuv420p buffer
[05:41:04 CET] <thebombzen> it might need to be bgra, bgr24, or bgr0
[05:42:12 CET] <bms20__> thebombzen: yuv420 is fine - just convert it in the pixel shader.
[05:42:35 CET] <thebombzen> well then I don't know. I'm not very experienced with GL framebuffers
[05:43:38 CET] <NermaN> drm.debug=0x06 doesn't show anything about the ffmpeg issue =(
[05:43:53 CET] <NermaN> it still can't encode using h264_vaapi
[05:44:09 CET] <NermaN> should i file this as a bug against ffmpeg or against vaapi?
[07:20:17 CET] <Nune> I am trying to learn how to render a h.264 mp4 with alpha transparency from an image sequence of semi-transparent .PNGs?
[07:20:27 CET] <Nune> Any help greatly appreciated. :)
[07:29:13 CET] <Nune> This seems pretty close: ffmpeg -framerate 60 -i dialog_line_%03d.png -vcodec png dialog_line.mp4
[07:29:27 CET] <Nune> Oh, actually, I think this is it :D
[08:13:13 CET] <spiderkeys> Can anyone point me to some literature or reference for the exponential golomb encoding procedure that uses a table (as it is done in FFMPEG and x264)? I imagine this is a performance optimization over the rote algorithm, but would like to know how that approach was derived.
[10:43:32 CET] <NermaN> Even the current git version doesn't work with vaapi =(
[10:43:38 CET] <NermaN> And nobody knows why =(
[10:46:43 CET] <jkqxz> Can you try a different version of the driver?
[10:50:01 CET] <NermaN> jkqxz: you mean libva-intel-driver?
[10:50:11 CET] <NermaN> i can try to build another version
[10:56:22 CET] <Traktorrr> Hi people. We derive analysis data such as LUFS and dBFS from the audio stream, using the command: ffmpeg -f s16le -ac 2 -ar 12k -i http://xxx-icecast0.xxx.xxx:8000/ch_wav -af ebur128=peak=true -f null - 2>&1
[10:56:38 CET] <jkqxz> NermaN:  Yeah.  From what you've said so far it doesn't look like an ffmpeg issue, rather something further down the stack.
[10:58:27 CET] <Traktorrr> Now we also need the phase between the left and right audio channels. Is that possible?
[11:00:27 CET] <Traktorrr> in general we get the stream data (LUFS, dBFS); how do we get the phase between the left and right channels?
[11:05:05 CET] <Traktorrr> (((
[11:08:37 CET] <Traktorrr> i need help!!!!!!!!!!!!!!
[11:26:07 CET] <Traktorrr> well, you bastards
[11:26:16 CET] <Traktorrr> fucking jerks
[11:26:58 CET] <NermaN> Traktorrr: what's wrong?
[11:27:17 CET] <Traktorrr> Hurraaay, Russiaaans
[11:27:18 CET] <Traktorrr> )))
[11:27:34 CET] <Traktorrr> ah hell, nobody here can help
[11:27:44 CET] <NermaN> same for me, let's hug xD
[11:27:51 CET] <Traktorrr> hahaha
[11:34:44 CET] <Traktorrr> Hi people. We derive analysis data such as LUFS and dBFS from the audio stream, using the command: ffmpeg -f s16le -ac 2 -ar 12k -i http://xxx-icecast0.xxx.xxx:8000/ch_wav -af ebur128=peak=true -f null - 2>&1
[11:34:55 CET] <Traktorrr> Now we also need the phase between the left and right audio channels. Is that possible?
[11:37:57 CET] <durandal_1707> Traktorrr: aphasemeter filter?
[11:38:44 CET] <Traktorrr> yes
[11:39:26 CET] <Traktorrr> example please
[11:40:21 CET] <durandal_1707> Traktorrr: if you just need numbers, then call it with video output disabled
[11:41:52 CET] <Traktorrr> i used an audio icecast stream, NOT VIDEO. i need console output: DBFS left, DBFS right, LUFS, and PHASE as numbers between [-1, 1]
[11:43:52 CET] <durandal_1707> that's what aphasemeter does. it attaches metadata to each frame when video output is not enabled
[11:44:53 CET] <durandal_1707> but it doesn't give a global value, because that wouldn't make sense
[11:48:17 CET] <Traktorrr> how do we get the phase? we need to display the signal levels of the left and right channels and also display the phase
[11:51:03 CET] <Traktorrr> the ebur128 filter displays the LUFS and dBFS levels we need; how do we force it to also display the phase???
[11:52:53 CET] <durandal_1707> you cant with ebur128
[11:53:08 CET] <durandal_1707> but can with another filter
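A hedged sketch of what durandal_1707 suggests: aphasemeter with video output disabled attaches per-frame phase metadata, which the ametadata filter can print to the console (the input URL is a placeholder standing in for the icecast stream):

```shell
# Sketch: print per-frame stereo phase values in [-1, 1] to the console,
# without producing any video output.
phase_cmd="ffmpeg -i http://example.org:8000/ch_wav -af aphasemeter=video=0,ametadata=mode=print -f null -"
echo "$phase_cmd"
```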
[11:55:26 CET] <Traktorrr> [Parsed_ebur128_0 @ 0x249af80] t: 0.0999792  M:-120.7 S:-120.7     I: -70.0 LUFS     LRA:   0.0 LU  FTPK: -14.6 -13.6 dBFS  TPK: -14.6 -13.6 dBFS
[11:55:45 CET] <Traktorrr> [Parsed_ebur128_0 @ 0x249af80] t: 0.0999792  M:-120.7 S:-120.7     I: -70.0 LUFS     LRA:   0.0 LU  FTPK: -14.6 -13.6 dBFS  TPK: -14.6 -13.6 dBFS [Parsed_ebur128_0 @ 0x249af80] t: 0.199979   M:-120.7 S:-120.7     I: -70.0 LUFS     LRA:   0.0 LU  FTPK: -14.3 -13.2 dBFS  TPK: -14.3 -13.2 dBFS [Parsed_
[11:56:00 CET] Last message repeated 1 time(s).
[12:47:04 CET] <chamath> Hi I have a problem with video splitting. The time value seems not accurate.
[12:48:31 CET] <chamath> ffmpeg -y -ss 0 -accurate_seek -i <input mp4> -ss 0.570 -f lavfi -i aevalsrc=0 -t 6.690000 -strict -2 -c:v libx264 -preset faster -crf 28 -acodec aac -map_metadata -1 -movflags faststart <output>
[12:49:34 CET] <chamath> I expect final output video to be length of 6.69 seconds, but it is 6.712 seconds long.
[12:49:48 CET] <chamath> I calculate the duration of the video using ffprobe
[12:50:19 CET] <chamath> ffprobe -i <output file> -show_entries format=duration -v quiet -of csv="p=0"
[12:50:30 CET] <chamath> Any idea what can be wrong here?
[12:54:40 CET] <iive> JFY, what is your fps? what is the frame duration?
[13:16:27 CET] <chamath> fps is 60
[13:22:12 CET] <Guest49090> Hey! I get the following error when I try to stream an mp4 file to rtmp: [flv @ 0x2ecb980] video stream discovered after head already parsed
[13:22:42 CET] <Guest49090> I hope someone can explain why I get this
[13:22:54 CET] <Guest49090> and im not sure if it is an error
[14:12:45 CET] <brontosaurusrex> would that 'ffmpeg -i in.mov -vf minterpolate=fps=50 -c:v prores -r 25 out.mov' be a valid command line, assuming input is 25fps?
[14:20:58 CET] <brontosaurusrex> nope.
[14:23:07 CET] <brontosaurusrex> rephrasing: a minterpolate example that would slo-mo to 50%, making the output 2 times longer?
[14:29:12 CET] <brontosaurusrex> cough
[14:55:49 CET] <brontosaurusrex> Nicks #ffmpeg: [@`md @durandal_1707 @superdump @ubitux +uau [-T-] [mbm] ]R[ _0x5eb_ __jack__ _corrupt _Kate A3G1S aarwine aballier adgtl Aerroon Akira^^_ alexspeller Alina-malina alu alucryd Amadiro antlarr araml Arokh Arthur_D artista_frustrad Arwalk ashka asm89 AssPirate Asterisk AstralStorm atomnuker Azelphur Balliad balrog barhom basisbit bbert beastwick beatdown bencoh Benjojo benwilber besc bjoe2k4
[14:55:51 CET] <brontosaurusrex> Blaxar blb blinky42 bmhm boiled_sugar Bombo bove bret brontosaurusrex BtbN c_14 cantstanya CARAM__ cbdev cbsrobot chamath chandoo__ charims Chloe[m] chovy Cldfire Cloudef codebam CoJaBo colde comgot Cork cosmo1t courrier Croepha CruX| CustosLimen cyphase D-ion dagobert__ damdai damnlie_ dantti dashcloud_ DasMoeh davidmichaelkarr db01 DeathShot debianuser decay devster31 DHE Diag dl2s4 dlb76 dongs donics
[14:55:53 CET] <brontosaurusrex> doogaille dro_bot drv dustinm` dv_ e Ekho Elysion_ enp0s3 Exagone313 f0lder faLUCE feliwir fengshaun fflogger fidothe Filarius22 firewyre_ flarunt Fletcher fling flux fnords foonix fritsch furq fusl fzn G gaeta gen93 georgie ghost1 ghoti gix glebihan glebihan_ gmh GRMrGecko Guest27309 Guest47316 Guest49090 Guest94912 gusto haagch haasn hamsheet HarryHallman hawken Hello71 hfb Hink Hobbyboy Holbrook howitdo
[14:55:55 CET] <brontosaurusrex> hreinnbeck hurricanehrndz hyponic igitoor_ iive ikevin Intrepd irock ItWasntMe2013 ivanich jA_cOp JackWinter jacobwg Jaex jarainf jarryd jason_- jbermudes JEEB jfmcarreira Jikan jimbankoski__ jkqxz john51_ JohnPreston72 joshbaptiste JoshX jpf3 jswagner justinmrkva jya k-man K1rk k_sze[work] kam187 kasper93 KDDLB Kei_N kepstin kerio Keshl ketas kevc klaxa kode54 kraft Kronuz kuroro Kuukunen kvz larsi lavalike
[14:55:57 CET] <brontosaurusrex> lebster LegendThinker Len lesderid limbo_ lkiesow llamapixel lomancer LRN lroe luc4 luminarys m1dnight_ madprops MadWasp manuelschneid3r maqr marcoslater Matador mateo` MatthewAllan93 matthiaskrgr Mavrik Me4502 merzo michaelni microchip_ Mista_D mixfix41 mixi miyalys monokrome mosb3rg moser MSG|Maverick mundus2018 Muzer Myrsloik Nanashi nate nd neonfuz Neville nirvanko Nitori nitrix Nothing4You nwoki
[14:55:59 CET] <brontosaurusrex> nyuszika7h ObsidianX ocrete olspookishmagus oorm ootje Orphis ossifrage pa paperManu parasite_ ParkerR PaulCapestany pbos PharaohAnubis phillipk Phlarp_ phryk pigeon pigoz pinPoint ploop Plorkyeran_ podman Polochon_street pomaranc pprkut ps-auxw psychicist__ ptx0 pzich r0r0 raijin rcombs relaxed Relict retard rexbron ribasushi rikai ritsuka rjp421 rkantos_ road|runner rossome roundtrip rsully runawayfive
[14:56:29 CET] <madprops> finally im in a list
[14:56:35 CET] <colde> lol madprops
[14:56:41 CET] <iive> you were quite fast :)
[14:56:43 CET] <colde> I can add you to a few lists if you like ;)
[14:56:57 CET] <DHE> how not to use IRC?
[14:57:12 CET] <colde> How to piss off everybody he wants help from, at least
[14:57:30 CET] <larsi> hey guys I need help
[14:57:33 CET] <larsi> but let me annoy everyone first
[14:57:37 CET] <ploop> christ
[14:57:52 CET] <ikevin> oO
[14:57:56 CET] <furq> well he did succeed in getting all the idlers to respond
[14:58:08 CET] <rjp421> heh
[14:58:28 CET] <ploop> if you guys have weechat you should install the colored nicks plugin, it turns spam into beautiful quilts https://lep.pw/img/839b91b4d2a76380.png
[14:59:01 CET] <DHE> purdy
[14:59:19 CET] <DHE> (that's supposed sound like "pretty")
[15:14:08 CET] <Nanashi> *All* the idlers? http://i.imgur.com/PSWd2jl.png
[15:14:23 CET] <Nanashi> Anyway, he should head off to discord, where there's @everyone or @here unless disabled.
[15:23:29 CET] <MSG|Maverick> there was a cat on his keyboard
[15:31:01 CET] <faLUCE> Hello. How is it possible that on the first 3 or 4 frames to encode, avcodec_encode_video2(...., got_packet_ptr) returns 0 (success) and *got_packet_ptr is 0 at the same time?
[15:31:33 CET] <faLUCE> (codec = x264)
[15:39:18 CET] <DHE> faLUCE: b-frames buffer a lot of packets in before providing frames out
[15:41:10 CET] <faLUCE> DHE: where are they buffered? In some internal buffer of the codec context?
[15:41:32 CET] <JEEB> libx264 handles required buffering and avcodec is just feeding to it
[15:41:48 CET] <faLUCE> JEEB: I see
[15:41:54 CET] <JEEB> if you need low latency just set tune to zerolatency or so
[15:42:01 CET] <JEEB> that will disable features that add latency
[15:42:21 CET] <JEEB> other parts of lavc/lavf will still of course need to be optimized, but libx264 itself will at that point be set to minimal latency
[15:43:06 CET] <faLUCE> then the pkt filled by avcodec_encode_video2() does not correspond to the current frame, but to 4 frames before?
[15:43:30 CET] <JEEB> when you start getting packets they will have their timestamps etc
[15:43:51 CET] <JEEB> but yes, at the end you will have to flush the encoder, which IIRC is documented
[15:50:18 CET] <faLUCE> is there a way to set the x264 buffer size other than zerolatency ?
[15:50:52 CET] <JEEB> yes, but if you need low latency then that's what you use :P
[15:51:20 CET] <JEEB> otherwise you use stuff like the lookahead parameters and stuff like that to limit the amount of required buffering
[15:51:40 CET] <faLUCE> JEEB :-) tnx
[15:52:07 CET] <JEEB> as the zerolatency tune also adjusts stuff like the threading mode
[15:52:22 CET] <JEEB> because frame threading adds latency
[15:52:39 CET] <JEEB> while slice threading lessens compression but enables threading within a picture
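JEEB's points collapse into a short command-line sketch; the commented -x264-params line is a partial, assumed expansion of what -tune zerolatency sets, not an exhaustive or authoritative list (input/output names are placeholders):

```shell
# Sketch: low-latency libx264 encode via the zerolatency tune.
ll_cmd="ffmpeg -i input.mkv -c:v libx264 -tune zerolatency out.ts"
echo "$ll_cmd"
# Partial manual equivalent (assumption): disables b-frames, lookahead,
# and frame threading in favour of slice threading:
# -x264-params bframes=0:rc-lookahead=0:sync-lookahead=0:sliced-threads=1
```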
[15:55:19 CET] <faLUCE> JEEB: in addition, if I want to reduce the audio latency, I suppose I have to minimize the MPEG audio frame size, right? Currently it's automatically set to 1152 samples by the audioEncoder's CodecContext and I don't know how to minimize it
[15:55:52 CET] <JEEB> depends on the encoder
[15:55:56 CET] <faLUCE> mp2
[15:56:13 CET] <faLUCE> (but I could use another one)
[15:57:18 CET] <JEEB> I won't be able to tell you :P I haven't cared about audio personally so you will have to read the code of the encoder to know if it can do that at all
[15:57:28 CET] <JEEB> (and that will show you how to do it as well)
[15:57:45 CET] <faLUCE> ok, I'll do that
[16:16:56 CET] <kerio> opus is pretty good if you have strict requirements on latency
[16:22:03 CET] <jubalh> hi
[16:22:55 CET] <jubalh> I have several .mp4.ts files which i would like to convert into one mp4 file, could someone help me with that?
[16:24:51 CET] <jubalh> I assume something like: ffmpeg *.ts -acodec copy -vcodec copy output.mp4
[16:24:59 CET] <jubalh> but that doesnt seem to be right
[16:25:10 CET] <jubalh> ffmpeg -i *.ts <- I mean
[16:25:24 CET] <JEEB> welcome to the hell of concatenation in FFmpeg
[16:25:34 CET] <JEEB> there's like three or four different concat modules on different levels
[16:26:11 CET] <JEEB> I think in  your case the simplest way would be to just append the ts files in the correct order with cat, and then remux with `ffmpeg -i all.ts -c copy out.mp4`
[16:26:36 CET] <JEEB> MPEG-TS just happens to be something that should work like that :P
[16:27:25 CET] <JEEB> also you can use stdout output with cat and do input from that
[16:27:28 CET] <JEEB> with piping
[16:27:56 CET] <JEEB> -i - (a "-" as the 'input file') should be stdin)
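JEEB's cat-then-remux approach, sketched with placeholder segment names; the printf lines stand in for real .ts files so the concatenation step itself is runnable, while the ffmpeg remux commands are only echoed:

```shell
# MPEG-TS can be byte-concatenated, then remuxed with stream copy.
printf 'seg1' > 01.mp4.ts
printf 'seg2' > 02.mp4.ts
cat 01.mp4.ts 02.mp4.ts > all.ts
echo "remux: ffmpeg -i all.ts -c copy out.mp4"
echo "piped: cat *.mp4.ts | ffmpeg -i - -c copy out.mp4"
```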
[16:29:08 CET] <dl2s4> good old mass highlight.. i was like "wow, i am important i am highlighted in #ffmpeg" =)
[16:32:44 CET] <jubalh> JEEB: worked like a charm! Thanks a lot :)
[16:33:37 CET] <thebombzen> jubalh: for future reference if you like the concat filter, you can use ffmpeg $(echo -i\ *.ts)
[16:33:53 CET] <thebombzen> ohwait nvm that works with {} but not *
[16:33:54 CET] <thebombzen> hmmm
[16:34:40 CET] <thebombzen> if the ts files are labeled 01..20.mp4.ts or something, you could use ffmpeg $(echo -i\ {01..24}.mp4.ts)
[16:34:46 CET] <thebombzen> and then use the concat filter
[16:35:50 CET] <jubalh> ok
[16:35:55 CET] <furq> wtf
[16:35:57 CET] <furq> how does that work
[16:36:07 CET] <JEEB> concat filter is after decoding
[16:36:09 CET] <JEEB> so -c copy won't work
[16:36:22 CET] <JEEB> concat demuxer or protocol might work, but it's a mess
[16:36:35 CET] <JEEB> so if you've just got mpeg-ts I prefer to just go around that
[16:37:11 CET] <thebombzen> furq: the shell expands -i\ {01..20}.mp4.ts into twenty different arguments: "-i 01.mp4.ts" "-i 02.mp4.ts" etc.
[16:37:26 CET] <furq> yeah how does that work
[16:37:37 CET] <thebombzen> {01..20} is a shell thing
[16:37:43 CET] <thebombzen> it works similar to {a,b,c}
[16:38:02 CET] <thebombzen> but the space is part of the argument, so filtering it through echo removes the space, and you end up with all 40 arguments instead of 20
[16:38:09 CET] <furq> oh duh
[16:38:28 CET] <furq> i missed that \
[16:38:35 CET] <thebombzen> oh lol yea it's crucial
[16:38:41 CET] <furq> yeah that makes perfect sense now
[16:39:15 CET] <furq> that's a nice trick
[16:39:25 CET] <furq> shame it doesn't work with globs though
[16:39:38 CET] <thebombzen> yea it won't work with *. that requires more work
[16:42:06 CET] <thebombzen> with * as long as you have no spaces in filenames, you can do ffmpeg -i $(echo *glob*.ts | sed 's/ / -i /'g)
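Both expansion tricks above can be verified without ffmpeg; the filenames are hypothetical and this assumes bash (brace expansion is not POSIX sh):

```shell
# {01..03} brace-expands before quote removal, so the escaped space stays
# inside each word; the unquoted $(...) then re-splits into -i/file pairs.
brace=$(echo -i\ {01..03}.mp4.ts)
echo "$brace"
# the sed variant for globs: inject " -i " between space-separated names
glob=$(echo a.ts b.ts | sed 's/ / -i /g')
echo "-i $glob"
```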
[16:46:14 CET] <furq> bye
[18:00:06 CET] <Bombo> how do i add an rpath?
[18:00:31 CET] <Bombo> wrong window
[18:00:32 CET] <Bombo> ;=)
[18:01:06 CET] <Bombo> (--enable-rpath)
[18:01:55 CET] <Franciman> Hi
[18:03:35 CET] <Bombo> ho
[18:03:48 CET] <Franciman> I'd like to extract audio from a video file and get data to generate a waveform (not using the waveform filter), is libavfilter the right way to do that?
[18:10:43 CET] <Bombo> Franciman: you want to do an audio player?
[18:10:59 CET] <Bombo> good question ;)
[18:11:00 CET] <Franciman> kind of, yeah
[18:11:25 CET] <Franciman> it's a program to edit subtitles
[18:11:31 CET] <JEEB> sounds like aegisub
[18:11:55 CET] <Franciman> kind of
[18:12:21 CET] <JEEB> but yes, libav{format,codec} to get decoded samples, and then libavfilter for adding filters that take in images or audio samples and output one or the other (or both)
[18:12:39 CET] <JEEB> the first part you can simplify by using something like ffms2, which is what aegisub is doing as well
[18:13:02 CET] <JEEB> (although I guess plugging in avfilter becomes a wee bit harder since you no longer deal with avcodec output avframes)
[18:13:32 CET] <Bombo> hmm does it use the audio data to synch it to the subtitle?
[18:14:03 CET] <Franciman> yes
[18:14:17 CET] <Bombo> sounds quite useful
[18:14:18 CET] <Franciman> I mean, you need to do that by hand, eh!
[18:14:27 CET] <Bombo> oh ;(
[18:14:29 CET] <Bombo> ;)
[18:14:29 CET] <Franciman> but maybe... one day...
[18:14:50 CET] <Bombo> but it displays the waveform
[18:14:52 CET] <JEEB> you will never release stuff with any automated timing thing, you generally always want to at the very least finish it up manually
[18:15:10 CET] <Bombo> does aegisub do that? i never did subtitles
[18:15:13 CET] <JEEB> yes
[18:15:17 CET] <JEEB> aegisub has a waveform
[18:15:38 CET] <JEEB> and no, I'm pretty sure it has no automated timing stuff although it could be possible
[18:15:45 CET] <JEEB> that said, the heuristics for it are hard :P
[18:15:47 CET] <Franciman> yeah, that'd pretty hard, I guess
[18:15:56 CET] <Bombo> just an idea ;)
[18:15:57 CET] <Franciman> JEEB, do you do subtitles?
[18:16:04 CET] <JEEB> I used to do them more
[18:16:08 CET] <Franciman> cool!
[18:16:18 CET] <JEEB> nowadays I'm a grumpy man doing video processing
[18:16:28 CET] <Franciman> ahah, cool!
[18:16:49 CET] <JEEB> but yeah, if I was developing something for subtitles I would most likely start by seeing if I could plug something into aegisub
[18:16:55 CET] <JEEB> since it already has a whole lot of useful stuff
[18:17:00 CET] <JEEB> and export modules etc
[18:17:09 CET] <JEEB> cross-platform too
[18:17:40 CET] <Franciman> yeah it's cool, absolutely. In my defense, I'm doing this by myself as a personal exercise
[18:17:43 CET] <Franciman> ahah
[18:18:55 CET] <JEEB> one thing that pops up would be to somehow make aegisub use mpv's opengl renderer stuff for the preview
[18:19:06 CET] <JEEB> since it's currently opengl as well
[18:19:17 CET] <JEEB> and mpv's is just much more feature-filled
[18:19:31 CET] <Franciman> JEEB, shhh that's my killer feature
[18:19:41 CET] <JEEB> lol
[18:21:48 CET] <Franciman> aegisub is also written in C++11, neat!
[18:24:18 CET] <Franciman> anyways, thanks a lot for your help Bombo and JEEB
[18:25:27 CET] <JEEB> another thing I've thought of was export into blu-ray subpictures (since the spec for that is known). have libass (or VSFilter) render onto a video-sized canvas and write 4:4:4 YCbCr images
[18:34:02 CET] <Franciman> that'd be really cool
[18:55:44 CET] <hron84> Hi! I'm trying to stream a WebM video to an IceCast2 server, but I cannot get past 4-7 FPS, while OGG streams work normally (24-28 FPS). I tried to stream to /dev/null to make sure it's not an IceCast limitation or a network issue, but it seems it is not. Also, I cannot tell ffmpeg to not re-encode the stream and just copy it, because if I specify -f webm -c copy then ffmpeg exits instantly without starting the stream. Can anyone help me with this?
[19:03:41 CET] <JEEB> you might want to see the error related :P
[19:04:06 CET] <JEEB> although most likely you're trying to push streams not allowed in "webm" (subset of matroska defined by el GOOG)
[19:04:14 CET] <JEEB> and yes, libvpx is slow
[19:04:18 CET] <JEEB> esp. if you're trying to do vp9
[19:06:19 CET] <hron84> JEEB: it does not display an error message, just exits
[19:06:46 CET] <hron84> If you mean for -c copy
[19:07:13 CET] <hron84> so I specify the parameters, press enter, it displays the input and output stream details and instantly quits
[19:07:44 CET] <JEEB> hron84: I would bet a piece of virtual money on that it gives you an error message
[19:07:47 CET] <hron84> without any error. I tried enabling debug logging; it reads the stream and the log ends without any information about what's happening
[19:07:58 CET] <JEEB> just pastebin the full terminal log and link your paste here
[19:08:02 CET] <JEEB> use whatever hits your fancy
[19:08:07 CET] <hron84> let me do it
[19:08:10 CET] <JEEB> either pastebin.com or something else
[19:11:00 CET] <hron84> JEEB: http://pastebin.com/nJi3SdDZ
[19:11:14 CET] <hron84> I cannot see any error here.
[19:11:33 CET] <hron84> The next line is my prompt again.
[19:12:00 CET] <furq> that's working correctly
[19:12:13 CET] <hron84> except it does not do anything?
[19:12:13 CET] <furq> you probably want -re to write at the input framerate
[19:12:28 CET] <furq> that's copying the entire file into the output, just like you asked
[19:12:33 CET] <furq> it's just doing it at 351x realtime
[19:12:46 CET] <hron84> Ahh.
[19:12:49 CET] <hron84> Hmm.
[19:12:50 CET] <furq> add -re before -i if you're streaming from a file
[19:13:06 CET] <hron84> let me check the output
[19:13:56 CET] <Hello71> does it drop frames if it can't keep up then
[19:14:15 CET] <hron84> So that's what -re is for!
[19:14:20 CET] <hron84> I never understood it.
[19:14:38 CET] <hron84> furq: you saved my ass. Thanks!
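A minimal sketch of the fix discussed above. Filenames are placeholders, ffv1/Matroska stands in for hron84's webm files so the sketch runs without libvpx, and the icecast URL shape in the comment is only indicative:

```shell
command -v ffmpeg >/dev/null || exit 0  # skip the sketch if ffmpeg is not installed
# Make a short stand-in input (hron84's real input is a webm file).
ffmpeg -hide_banner -loglevel error -f lavfi -i testsrc=d=2:s=64x64:r=10 \
  -c:v ffv1 -y in.mkv
# -re throttles reading to the input's native rate, so a file is pushed out
# in real time instead of as fast as the disk allows; -c copy remuxes without
# re-encoding.  For icecast the output would be the server URL, something
# like icecast://source:password@host:8000/mount, instead of out.mkv.
ffmpeg -hide_banner -loglevel error -re -i in.mkv -c copy -y out.mkv
```

With -re the second command takes roughly as long as the input's duration; without it, the remux finishes almost instantly, which is exactly the behaviour hron84 mistook for ffmpeg "exiting without starting the stream".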
[19:15:07 CET] <JEEB> -re is very fragile tho
[19:15:29 CET] <JEEB> like, if your input suddenly has a timestamp jump it will derp you up
[19:15:34 CET] <hron84> A risky question: can it work with -f concat too (I mean -c copy and -f concat)? It would be wonderful.
[19:15:41 CET] <furq> "very" is maybe an exaggeration
[19:15:47 CET] <furq> i've never had any trouble with well-formed inputs
[19:15:58 CET] <JEEB> well yes, but even well-formed transport streams can derp it up
[19:16:01 CET] <furq> but yeah i wouldn't trust anything valuable to it
[19:16:31 CET] <JEEB> hron84: might. concat related things work for very limited subsets of things
[19:16:32 CET] <hron84> Can you explain how it can "derp"? A few dropped frames could be okay if the stream can keep up.
[19:16:38 CET] <furq> hron84: it should do, but concat is a bit weird in itself
[19:16:44 CET] <Threads> if i worked on the matroskaenc to allow aac bitrate display would it be looked at by ff-devs ?
[19:17:03 CET] <furq> is that something that the muxer has control over
[19:17:05 CET] <JEEB> all of the concat filters, demuxers etc have their capabilities and limitations
[19:17:26 CET] <Threads> furq ac3/dts can be displayed
[19:17:32 CET] <furq> well yeah those are cbr
[19:17:43 CET] <Threads> mp3 even
[19:17:53 CET] <JEEB> hron84: like ffmpeg.c going to wait for a few years because of a timestamp reset or so
[19:17:58 CET] <JEEB> as in, -re
[19:18:04 CET] <furq> i'm pretty sure that's flagged somewhere in the bitstream itself
[19:18:05 CET] <hron84> JEEB: all i want is take N webm files and blow them to icecast. No reencoding, no remuxing, nothing but a simple 1:1 copy.
[19:18:16 CET] <furq> it's nothing to do with the container, matroska doesn't have metadata for that afaik
[19:18:16 CET] <hron84> JEEB: wow.
[19:18:27 CET] <JEEB> furq: it has "tags" :DDD
[19:18:29 CET] <furq> that's why it works in mp4, because mp4 does
[19:18:53 CET] <JEEB> so you have timestamps in the container and then tags for example for the "duration"
[19:18:53 CET] <furq> well yeah you could store that information somewhere, but you could equally well write it on a post-it note and stick it to your dog
[19:18:56 CET] <JEEB> or "bit rate"
[19:19:08 CET] <furq> it's going to be readable by roughly as many people either way
[19:19:08 CET] <JEEB> I really facepalmed when mkvmerge started writing that stuff
[19:19:42 CET] <JEEB> hron84: it might work, the timestamp resets between multiple files are what might break -re completely
[19:19:45 CET] <JEEB> in other words, try it out :P
[19:20:02 CET] <JEEB> it will still remux of course, that's what -c copy does
[19:20:08 CET] <hron84> Doing it right now :D
[19:20:10 CET] <JEEB> it does demux completely and then remux
[19:20:17 CET] <furq> that's what you want though
[19:20:29 CET] <JEEB> well he specifically wanted to just throw the already muxed data as-is
[19:20:36 CET] <JEEB> just saying he won't get exactly that
[19:20:45 CET] <JEEB> he will get a demuxed and then once again muxed data
[19:20:48 CET] <furq> well he said that's what he wanted but i'm pretty sure it isn't actually
[19:20:51 CET] <hron84> Meh, I'm not into this video-encoding thing, I'm just learning the whole stuff now; I have to maintain an icecast server and help my friend stream videos
[19:21:08 CET] <furq> does webm over icecast work reliably yet
[19:21:13 CET] <furq> i tried it just after it landed and it was a mess
[19:21:17 CET] <JEEB> furq: that might be true but warning him that it isn't 100% what he said is still OK in my books :P
[19:21:32 CET] <furq> sure, i was just clarifying to him that remuxing is in fact what he wants
[19:21:53 CET] <JEEB> well, what he wants would be possible with a way higher level thing as well as far as I can tell
[19:22:05 CET] <JEEB> but yeh, with ffmpeg.c -c copy is what he wants
[19:22:57 CET] <hron84> :-)
[19:23:26 CET] <JEEB> furq: btw for nightmare food go look at the vsync code in ffmpeg.c
[19:23:40 CET] <JEEB> I don't think even the people who originally wrote that fully understand WTF is going on
[19:23:50 CET] <JEEB> s/food/fuel/
[19:24:09 CET] <furq> i'd rather hang on to the illusion of reliability from ffmpeg.c
[19:24:19 CET] <furq> the best way to do that is to not ever read it
[19:24:33 CET] <JEEB> yeah, it's surprising how many things are hanging on and kind of working with it :P
[19:25:01 CET] <JEEB> the API itself is nowadays much saner, but ffmpeg.c is a special snowflake full of random stuff that screams "can we just make this stuff go away"
[19:25:57 CET] <hron84> it seems like -f concat and -c copy are friends in this combination. \o/
[19:26:10 CET] <JEEB> just make sure you tested with -re if you're going to be using it
[19:26:12 CET] <hron84> and no derps even with -re
[19:26:15 CET] <JEEB> ok
[19:26:18 CET] <JEEB> good luck then
[19:26:31 CET] <JEEB> you're treading in deep waters that look shallow
[19:26:32 CET] <JEEB> :D
[19:26:46 CET] <hron84> It could make the tomorrow stream great again! :-)
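The -f concat + -c copy combination hron84 ended up with can be sketched as follows. The clips, list file, and output names are placeholders, and ffv1/Matroska again stands in for webm; note the concat demuxer needs matching codec parameters across entries, which is why the two stand-in clips are identical:

```shell
command -v ffmpeg >/dev/null || exit 0  # skip the sketch if ffmpeg is not installed
# Two identical stand-in clips (the real inputs are hron84's webm files).
ffmpeg -hide_banner -loglevel error -f lavfi -i testsrc=d=1:s=64x64:r=10 \
  -c:v ffv1 -y a.mkv
cp a.mkv b.mkv
# The concat demuxer reads a list file and plays the entries back to back.
printf "file 'a.mkv'\nfile 'b.mkv'\n" > list.txt
# As above, adding -re before -i would pace this in real time, and for an
# actual stream the output would be the icecast URL instead of a local file.
ffmpeg -hide_banner -loglevel error -f concat -i list.txt -c copy -y joined.mkv
```

As JEEB warns, the timestamp resets between files are the part most likely to upset -re, so test the combination before relying on it.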
[19:28:44 CET] <faLUCE> Well, this is terribly confusing. How can I allocate an AVPacket on the heap with libav 2.8?  I see there's  av_new_packet(),  but if I access the pkt's fields soon after this allocation, I obtain a segfault. Same thing if I call av_init_packet() after this allocation.
[19:29:35 CET] <faLUCE> I don't have this problem if the AVPacket is on the stack
[19:30:11 CET] <faLUCE> (sorry, I had problems with my internet connection)
[19:31:07 CET] <phillipk_> thanks in advance: I'm getting an output with audio in stream 1 and video in stream 2 (I want it the other way around)... see my cmd:
[19:31:08 CET] <phillipk_> http://pastebin.com/uN5nnSCr
[19:32:01 CET] <furq> phillipk_: https://www.ffmpeg.org/ffmpeg.html#Advanced-options
[19:32:04 CET] <JEEB> use map
[19:32:48 CET] <JEEB> also update your FFmpeg if you're using -strict experimental for the AAC encoder
[19:32:58 CET] <JEEB> that enables you to remove that since the AAC encoder got improved
[19:39:26 CET] <phillipk_> ok
[19:39:48 CET] <phillipk_> so, I'm confused with how map works/conflicts with the filter_complex I made
[19:41:30 CET] <furq> -map 0:v -map "[out]"
[19:42:59 CET] <furq> and append [out] after amix=4
[19:43:58 CET] <phillipk_> let me test it, thanks
[19:44:12 CET] <furq> i thought [out] was the default name for an unlabelled output but apparently not
[19:44:22 CET] <furq> the docs say it is but it doesn't seem to work
[19:46:05 CET] <JEEB> probably implied in something else :P and "map" then tries to explicitly use it or so
[19:46:22 CET] <furq> http://vpaste.net/4R1mi
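A sketch of the -map fix furq describes (the vpaste contents are not preserved here, so this is a reconstruction): label the amix result and map the video stream before it. Input names are placeholders, and only two tones are mixed where phillipk_ mixes four:

```shell
command -v ffmpeg >/dev/null || exit 0  # skip the sketch if ffmpeg is not installed
# Stand-in inputs: one video and two sine tones.
ffmpeg -hide_banner -loglevel error -f lavfi -i testsrc=d=1:s=64x64:r=10 \
  -c:v ffv1 -y v.mkv
ffmpeg -hide_banner -loglevel error -f lavfi -i sine=frequency=440:duration=1 -y t1.wav
ffmpeg -hide_banner -loglevel error -f lavfi -i sine=frequency=880:duration=1 -y t2.wav
# Label the amix result [out], then map the video first and the mixed audio
# second, so the output stream order is video, audio.
ffmpeg -hide_banner -loglevel error -i v.mkv -i t1.wav -i t2.wav \
  -filter_complex "[1:a][2:a]amix=inputs=2[out]" \
  -map 0:v -map "[out]" -c:v copy -c:a pcm_s16le -y muxed.mkv
```

The -map order is what determines the stream order in the output file, which is exactly the video-first/audio-second arrangement phillipk_ asked for.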
[21:04:58 CET] <Nune> I have a sequence of semi-transparent images that I want to concatenate into a semi-transparent 6 second video clip (MP4). I've done this so far, yay! While the image sequence is not seamless, I'd like to loop the playback and crossfade the final frames into the initial frames. Can FFMPEG accomplish this?
[21:25:45 CET] <durandal_170> Nune: how much to loop?
[21:29:50 CET] <Nune> durandal_170 The video is 6 seconds. In the last 1 second of the 6, I'd like to blend out the frames towards alpha=0, and blend in the first frames towards alpha=1, so that at the end of the video, we're showing the first frame at alpha=1.
[21:35:14 CET] <Nune> Ah, and obviously the clip would need to shrink by 1s to accommodate the crossfade. So the result would be 5s
[21:50:16 CET] <Nune> This would be a command in Windows 10, btw.
[21:50:25 CET] <Nune> Found this: https://superuser.com/questions/1001039/what-is-an-efficient-way-to-do-a-video-crossfade-with-ffmpeg
[21:50:32 CET] <Nune> Not quite able to translate it into what I need yet.
[21:55:17 CET] <monokrome> O_O
[22:41:29 CET] <Nune> ok I think I need to use the overlay filter
[22:42:13 CET] <Nune> I have a 6 second clip. My bottom layer needs to render 0..5s at full opacity. My top layer needs to render 5..6s, starting at full opacity and fading out to zero opacity. Both layers start playing at the same time.
[22:51:44 CET] <durandal_170> Nune: for looping short sequence try loop filter
[22:53:13 CET] <Nune> durandal_170: Oh, really? That will crossfade the loop? That'd be great!
[22:55:16 CET] <durandal_170> Nune: loop just loops frames you want to loop
[22:56:59 CET] <Nune> Ah, no, that's not what I want then.
[22:57:16 CET] <Nune> have a 6 second video that isn't seamless
[23:08:51 CET] <furq> Nune: how many times do you want it to loop
[23:12:22 CET] <furq> i have no idea why the guy in this SO answer is jerking about with trim and concat, you don't need that
[23:13:41 CET] <Nune> hah
[23:13:47 CET] <Nune> I don't want the video to loop at all.
[23:13:53 CET] <Nune> It will be played back on loop by the media player.
[23:14:00 CET] <Nune> furq: ^
[23:14:09 CET] <furq> http://vpaste.net/d1MHP
[23:14:15 CET] <furq> that's all you need for a crossfade
[23:14:27 CET] <Nune> thank you, reading and trying
[23:14:39 CET] <furq> you could just encode that to something lossless and then trim the start and end off to get a perfect loop
[23:15:10 CET] <Nune> right..
[23:15:23 CET] <furq> you could probably do that in the same command
[23:15:29 CET] <Nune> is there a way to do this from an image sequence?
[23:15:33 CET] <Nune> that's my original source content
[23:15:43 CET] <furq> just replace the inputs with img%03d.png or whatever you've got
[23:15:49 CET] <Nune> right
[23:16:15 CET] <furq> and replace st=4 in the 0:v filter, and FRAME_RATE*4 in the 1:v filter, to your input length minus one second
[23:16:31 CET] <furq> or however long you want the fade to be (in which case change d=1 too)
[23:16:37 CET] <Nune> right
[23:16:42 CET] <Nune> 1s fade should be fine
[23:17:05 CET] <Nune> st=4; is that the start point of the fade out?
[23:17:39 CET] <furq> yeah
[23:18:12 CET] <Nune> i think i need that to be st=5 then, since the original clip is 6s long
[23:19:13 CET] <Nune> receiving 3 warnings that "Past duration 0.619.. too large"
[23:20:30 CET] <Nune> regardless, it yielded a result; which i then trimmed
[23:20:38 CET] <Nune> didnt seem to give the right result tho
[23:20:51 CET] <furq> [0:v][tmp3]overlay,trim=start=1:end=6[out]
[23:20:56 CET] <furq> that should give you a perfect loop
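Furq's exact vpaste commands are not preserved, but the crossfade-loop recipe being described can be sketched like this for a 6 s clip with a 1 s fade. The input, sizes, and frame rate are placeholders (a testsrc stand-in is generated here), and split is used so the clip feeds both layers:

```shell
command -v ffmpeg >/dev/null || exit 0  # skip the sketch if ffmpeg is not installed
# Stand-in 6 s clip (Nune's real input is an image sequence / webm).
ffmpeg -hide_banner -loglevel error -f lavfi -i testsrc=d=6:s=64x64:r=10 \
  -c:v ffv1 -y clip.mkv
# Put the first second underneath, shifted to start at t=5; fade the tail's
# alpha out over 5..6 s on top of it; then trim 1..6 s so the result is a
# 5 s clip whose last frame blends into its first.
ffmpeg -hide_banner -loglevel error -i clip.mkv -filter_complex \
"color=black:s=64x64:r=10:d=10[base]; \
[0:v]split[a][b]; \
[a]trim=end=1,setpts=PTS-STARTPTS+5/TB[head]; \
[b]format=yuva420p,fade=t=out:st=5:d=1:alpha=1[tail]; \
[base][head]overlay[tmp]; \
[tmp][tail]overlay,trim=start=1:end=6,setpts=PTS-STARTPTS[v]" \
  -map "[v]" -c:v ffv1 -y loop.mkv
```

The format=yuva420p step matters: without an alpha plane, fade darkens to black instead of revealing the layer underneath, which would break the crossfade.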
[23:21:11 CET] <furq> the start will be offset but i guess that's not a problem
[23:21:13 CET] <Nune> http://vpaste.net/eyKCE
[23:21:23 CET] <Nune> yeah offset wouldnt matter
[23:21:41 CET] <furq> yeah just replace the existing overlay bit of the filterchain with that
[23:21:42 CET] <Nune> thanks for the help
[23:21:45 CET] <Nune> ok
[23:22:37 CET] <furq> oh
[23:22:39 CET] <Nune> not sure what the FRAME_RATE bit is; do I want FRAME_RATE*5 since its 6s long?
[23:22:43 CET] <furq> yeah
[23:22:44 CET] <Nune> k
[23:22:44 CET] <furq> i was just typing that
[23:22:51 CET] <Nune> did i find all the other 4->5 caveats?
[23:22:59 CET] <furq> that offsets the timestamps of the second input by FRAME_RATE*5 frames
[23:23:06 CET] <Nune> ah
[23:23:10 CET] <furq> those are the only two
[23:23:46 CET] <Nune> hmm
[23:23:49 CET] <Nune> ok, properly yields a 5s clip
[23:23:54 CET] <Nune> not seamlessly looping though
[23:24:06 CET] <Nune> http://vpaste.net/om89L
[23:25:55 CET] <Nune> this is my image sequence -> video snippet, btw: http://vpaste.net/BVGTT
[23:26:29 CET] <furq> you can probably just use the sequence as inputs to the main command
[23:26:34 CET] <Nune> ahuh
[23:27:02 CET] <furq> cropdetect doesn't actually crop btw
[23:27:10 CET] <furq> it just prints out the params you should use with crop
[23:27:14 CET] <Nune> thats.. what i thought... but then it did?
[23:27:30 CET] <furq> er
[23:27:34 CET] <furq> well it shouldn't have ;_;
[23:28:03 CET] <Nune> was anticipating needing to take the final result (aggregate crop dimensions?) and plug that into a crop filter, but the output was cropped already so i went with it :P
[23:28:10 CET] <furq> weird
[23:28:31 CET] <furq> well yeah if nothing else you shouldn't use webm for that, use something lossless like ffv1
[23:28:43 CET] <furq> otherwise you're compressing it twice
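Furq's two points, sketched together: cropdetect only prints suggested crop=w:h:x:y values on stderr, you apply them yourself with the crop filter, and the intermediate should be lossless so it isn't compressed twice. The input and the crop numbers here are made up for illustration:

```shell
command -v ffmpeg >/dev/null || exit 0  # skip the sketch if ffmpeg is not installed
# Stand-in input; Nune's source is an image sequence (e.g. img%03d.png).
ffmpeg -hide_banner -loglevel error -f lavfi -i testsrc=d=1:s=128x72:r=10 \
  -c:v ffv1 -y src.mkv
# Apply the crop values reported by cropdetect (these numbers are
# hypothetical), and keep the intermediate lossless with ffv1 rather than
# re-encoding to webm, so no generation loss accumulates.
ffmpeg -hide_banner -loglevel error -i src.mkv -vf "crop=96:64:16:4" \
  -c:v ffv1 -y cropped.mkv
```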
[23:28:50 CET] <Nune> i see..
[23:29:08 CET] <Nune> (very new to ffmpeg)
[23:29:49 CET] <Nune> happy to rework that and optimize it, but want to get the seamless transition part ironed out first
[23:30:02 CET] <Nune> something isnt quite right with what ive got so far
[23:30:07 CET] <furq> not sure about that
[23:30:43 CET] <furq> i just tried with that exact filterchain and it works
[23:30:52 CET] <Nune> hmm
[23:31:08 CET] <Nune> trying again then
[23:31:19 CET] <furq> http://vpaste.net/Xc56J
[23:31:22 CET] <furq> that's the exact command i used
[23:31:43 CET] <furq> i changed color=black:d=10 but i don't think that should matter
[23:32:01 CET] <furq> that shouldn't be visible after 8 seconds and you're trimming before then anyway
[23:32:52 CET] <Nune> how do you define 'testsrc' in this command?
[23:33:01 CET] <furq> -f lavfi -i testsrc is just a builtin test video source
[23:33:04 CET] <Nune> oh
[23:33:09 CET] <furq> it displays colour bars and a timestamp
[23:33:34 CET] <furq> what player are you using btw
[23:33:39 CET] <furq> maybe it's hitching at the loop point
[23:33:53 CET] <Nune> VLC (quick test) and OBS Studio (final usage)
[23:33:55 CET] <furq> `mpv --loop-file=inf out.mkv` works here
[23:35:35 CET] <Nune> dialog_line_cropped.webm: https://drive.google.com/open?id=0B3kkDhHHsCMhMmVGNzlIRktKWTg
[23:36:24 CET] <Nune> i rendered your exact command with the testsrc and it worked fine; replaced the testsrc definition with the ^ content and, unless im asking the wrong question, it didnt seem to work as expected
[23:37:19 CET] <Nune> though it also played to 6s, not 5s, so maybe it is setup but didnt trim properly, trimming manually..
[23:37:57 CET] <furq> weird, doesn't work here either
[23:38:11 CET] <Nune> hmm newp
[23:38:33 CET] <Nune> glad im not crazy then, could it be a problem with using .webm? i can render another format
[23:38:41 CET] <furq> maybe try using the image sequence as inputs
[23:38:44 CET] <Nune> ok
[23:39:09 CET] <phillipk> Is there a way to figure out exactly the version of ffmpeg I have if I got a compiled one--for ffmpeg -version I see: N-71481-g1c37848
[23:39:59 CET] <drv> that g<hex stuff> at the end is the git short hash
[23:41:00 CET] <phillipk> right on
[23:44:52 CET] <Nune> furq: no dice
[23:45:20 CET] <Nune> yielded a 6s video that loops at 5s, but not seamless
[23:45:42 CET] <Nune> maybe has to do with the alpha data in the images? its a semi-transparent render
[23:46:11 CET] <furq> those webms are yuv420p so that shouldn't make any difference
[23:46:53 CET] <Nune> any chance youd be interested in the source content (image sequence)? tho i dont see why starting with them vs the dialog_line_cropped.webm would be any different
[23:47:15 CET] <furq> oh hang on
[23:47:34 CET] <furq> try with color=black=d:10:s=976x400
[23:47:55 CET] <Nune> sure
[23:48:11 CET] <furq> lol
[23:48:15 CET] <furq> yeah i think that's it
[23:49:27 CET] <Nune> getting an argument error with that
[23:50:06 CET] <furq> d=10 sorry
[23:50:15 CET] <Nune> ah
[23:51:37 CET] <furq> actually nvm it's something different
[23:51:46 CET] <Nune> hehe
[23:51:46 CET] <furq> if i remove trim the output file is still 6 seconds long
[23:51:51 CET] <furq> something's gone to fuck with the timestamps
[23:52:16 CET] <Nune> possible the input is broke?
[23:52:34 CET] <Nune> maybe since its the cropdetect output? i did try from the images tho, so cant JUST be that.
[00:00:00 CET] --- Sat Jan 28 2017


More information about the Ffmpeg-devel-irc mailing list