[Ffmpeg-devel-irc] ffmpeg.log.20161103

burek burek021 at gmail.com
Fri Nov 4 03:05:02 EET 2016


[00:06:04 CET] <c_14> Can you put ffv1 in anything besides matroska or nut?
[00:08:50 CET] <kerio> wikipedia claims "Contained by: AVI, MKV, MOV, NUT", etc.
[00:10:35 CET] <kerio> hm what are the advantages of mkv over nut?
[00:12:40 CET] <c_14> Supported by not-ffmpeg
[00:13:11 CET] <kerio> lmao
[00:13:37 CET] <c_14> As in, if you want your video to be read in by players/whatever not based on libav* go with matroska over nut
[00:17:53 CET] <kerio> huh, can i drop the "encoder" metadata from the ffmpeg command line?
[00:19:24 CET] <c_14> -fflags +bitexact afair
[00:26:33 CET] <kerio> ty
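For reference, a minimal sketch of the trick c_14 describes, with hypothetical filenames: -fflags +bitexact keeps the muxer from writing a version-specific encoder tag, and -map_metadata -1 additionally drops the global metadata copied from the input.

    ffmpeg -i in.mkv -c copy -map_metadata -1 -fflags +bitexact out.mkv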
[00:37:15 CET] <tobiasBora> Hello,
[00:38:38 CET] <tobiasBora> I don't know why, but when I start ffmpeg, the conversion speed is correct for a few seconds (around 0.4x real time), but then it drops to 0.08x: the video is really slow to convert!
[00:39:23 CET] <DHE> most codecs buffer frames in memory for rate estimation, b-frames, etc. a quick burst of input at the start is common
[01:00:49 CET] <tobiasBora> Ok. And is a ratio of 20x the video's duration common for converting? It's pretty long to wait 20 min to convert 1 min of video...
[01:05:04 CET] <DHE> depends on the CPU mainly. and whether multithreading is enabled. I can do 1080p in real-time if I throw my whole core i7 at it (all 4 threads with hyperthreading)
[01:12:52 CET] <iive> tobiasBora: are you encoding hvec?
[01:13:15 CET] <tobiasBora> hvec .
[01:13:25 CET] <tobiasBora> iive: *hvec ?
[01:13:27 CET] <iive> hevc?
[01:13:48 CET] <iive> h265
[01:14:08 CET] <tobiasBora> no, I'm encoding webm using libvpx
[01:15:16 CET] <iive> oh, that one is also insanely slow.
[01:16:13 CET] <tobiasBora> Do you know of a codec that's not too slow and is "open source" (no fees...)
[01:16:28 CET] <tobiasBora> and if possible usable in a webm container
[01:17:11 CET] <iive> h264
[01:22:24 CET] <tobiasBora> h264 is supposed to be "free" only for non-commercial use I think, no?
[01:22:49 CET] <DHE> it's more complicated than that. but for any small scale distribution (under 1/4 million users iirc?) it's free
[01:22:58 CET] <DHE> been a while since I read the terms
[01:23:08 CET] <TD-Linux> read the terms to be sure
[01:23:27 CET] <TD-Linux> tobiasBora, you can try a faster speed setting for libvpx, with -speed.
[01:23:36 CET] <TD-Linux> 0 is slowest 8 is fastest
[01:23:46 CET] <tobiasBora> TD-Linux: Does it change the quality ?
[01:23:55 CET] <DHE> yes
[01:24:01 CET] <TD-Linux> yes, faster will be lower quality per bit
[01:24:20 CET] <DHE> if you specify a constant quality mode, the quality will be about the same but a bigger file
[01:25:59 CET] <iive> afair h264 is free for internet distribution. they kept the terms. the user thing is for devices.
[01:26:30 CET] <DHE> ah, okay.
[01:26:45 CET] <TD-Linux> you also have to pay for encoder/decoder implementations
[01:27:20 CET] <TD-Linux> also vp9 still beats x264 at most speed levels (dependent on settings of course)
[01:27:38 CET] <DHE> encoding speed?
[01:27:38 CET] <TD-Linux> *libvpx vp9
[01:27:48 CET] <TD-Linux> yes
[01:28:51 CET] <tobiasBora> TD-Linux: you mean ? If I use -speed 8 I would have a similar result to the one I would get using x264 ?
[01:29:24 CET] <DHE> "-speed 8" is likely similar to x264's "-preset superfast"
[01:29:30 CET] <DHE> at least conceptually
[01:29:42 CET] <TD-Linux> tobiasBora, with x264 at a setting that encodes at a similar speed. though it does vary based on bitrate and the like
[01:30:31 CET] <TD-Linux> you can do a test encode to compare yourself
[01:31:03 CET] <TD-Linux> libvpx does best at low bitrates where x264 falls off a cliff. at high bitrates it can be reversed
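To illustrate the -speed trade-off discussed above, a sketch of a constant-quality libvpx-vp9 encode at a faster speed setting (filenames and the crf value are placeholders):

    ffmpeg -i in.mp4 -c:v libvpx-vp9 -crf 30 -b:v 0 -speed 4 -c:a libopus out.webm

Lower -speed values encode more slowly but squeeze more quality out of each bit.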
[02:05:39 CET] <Zeranoe> Why can't x264's bit depth be selected on the fly instead of having to be compiled in?
[02:40:12 CET] <fa0> Hello all
[02:44:58 CET] <fa0> There are many dependencies that ffmpeg can be compiled against at build time without needing them installed afterwards, but I noticed that when you do this, the compile options shown in ffmpeg's output still include the /path where all the dependencies were compiled; http://pastebin.ca/3735717
[02:46:15 CET] <c_14> So?
[02:46:28 CET] <c_14> The configuration line is just useful information for people trying to help you with issues.
[02:46:32 CET] <fa0> These are the --extra-cflags= & --extra-ldflags= which in my pastebin show this /tmp path. I'm assuming it isn't going to cause harm, but it seems odd to keep something like this when it was only a build-time /path and it no longer exists
[02:46:51 CET] <c_14> The path is set at configure time and never changes
[02:46:58 CET] <c_14> Well, the whole line
[02:47:04 CET] <c_14> It's also never used except to print that line
[02:47:35 CET] <fa0> I just found it interesting; wasn't sure if in the future there could be a way to remove the temp build path from the output
[02:50:36 CET] <fa0> I guess some people might keep their /tmp build path around for whatever reason(s), but the output would look better, cleaner, if the code had the ability to strip these tmp build paths; maybe something for the future
[02:51:14 CET] <klaxa> i don't build in /tmp
[02:52:02 CET] <fa0> The path isn't the important part, it's just the fact the output is displaying something left over from build time
[02:52:43 CET] <c_14> The configure line isn't parsed for "useful" and "not useful" information
[02:52:54 CET] <klaxa> the information may or may not be relevant for pastes, just removing it because it seems "out of place" seems quite harsh
[02:53:00 CET] <c_14> It'd be too much effort for too little benefit with too much risk of hiding valuable information.
[02:57:14 CET] <dsc_> Was just wondering, if you took a random 2mb part of a h264/x264 (don't know the difference) movie, would ffmpeg be able to generate a thumbnail/screenshot (from found keyframes I guess)
[02:57:31 CET] <dsc_> if so - how do I determine how large my range (2mb?) needs to be - does it depend on the bitrate or the number of keyframes
[02:57:41 CET] <fa0> ok, no worries, just wanted to ask and find out... thx
[02:57:43 CET] <dsc_> would I first read the file header (and codec information) and determine my 'range' based on that
[02:57:47 CET] <c_14> You need at least one keyframe
[02:58:19 CET] <dsc_> c_14: how many bytes would I need to download to have a keyframe?
[02:58:22 CET] <dsc_> what determines that?
[02:58:48 CET] <dsc_> is there some thing (file header?) I can look at to make it less heuristic
[02:59:24 CET] <dsc_> also, does ffmpeg not break when I let it generate thumbnails from non-complete movie files
[02:59:45 CET] <dsc_> would I need to write something custom using ffmpeg libraries?
[02:59:58 CET] <dsc_> questions ;)
[03:00:11 CET] <c_14> For the last one you can use -frames:v 1 as an output option
[03:00:17 CET] <c_14> It'll abort as soon as it produces a frame
[03:00:33 CET] <dsc_> c_14: so ffmpeg does not need a complete mp4 file?
[03:00:41 CET] <dsc_> you can just have a random 2mb stream
[03:00:44 CET] <c_14> yes and no
[03:00:46 CET] <dsc_> (stream of bytes)
[03:01:25 CET] <c_14> If the input format is mp4 you'll need the moov atom which is usually either at the end (or beginning) of the file
[03:01:57 CET] <dsc_> cool, now im getting somewhere :)
[03:02:10 CET] <dsc_> lets say the input is indeed .mp4
[03:02:49 CET] <dsc_> I'd grab this moov atom
[03:02:54 CET] <dsc_> now what?
[03:03:21 CET] <dsc_> use that as a parameter for ffmpeg cli?
[03:03:58 CET] <c_14> If you just want a random thumbnail, the best thing is to just point the ffmpeg cli at the file and tell it to spit out 1 frame (potentially using the thumbnail filter or something)
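A sketch of what c_14 suggests, assuming a seekable local file: -ss seeks near the point of interest, the thumbnail filter picks a representative frame from a batch, and -frames:v 1 stops after one output frame.

    ffmpeg -ss 00:01:00 -i input.mp4 -vf thumbnail -frames:v 1 thumb.png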
[03:05:00 CET] <dsc_> the file would not be locally stored - i'tll be either ftp/webdav/scp/cifs/
[03:05:17 CET] <dsc_> protocols that support fetching parts of files at an offset
[03:06:03 CET] <kuroro> hello. im using the following command to merge three short clips (1 sec each) in ffmpeg - http://pastebin.com/raw/mgN84fVR, but it takes a long time to complete on android (4-5 seconds)
[03:06:33 CET] <kuroro> any suggestions on how I can speed up the concat filter?
[03:06:41 CET] <kuroro> or see where the bottleneck is?
[03:07:03 CET] <c_14> ffmpeg supports at least 2 of those natively; the problem with pointing ffmpeg at a byte offset in a file is that it might not be able to find the headers that tell it how to decode the file
[03:07:36 CET] <c_14> kuroro: probably either the scale filters or io bottleneck
[03:09:30 CET] <kuroro> hmm. if its the scale filters, i can probably pre-scale clips so that doing the merge itself would be faster
[03:09:42 CET] <kuroro> for io, what do you mean by it exactly
[03:09:51 CET] <c_14> disk
[03:09:52 CET] <kuroro> like how would i improve that?
[03:10:15 CET] <c_14> write into a ramdisk (tmpfs or something) or raid-0 or ssd
[03:10:27 CET] <dsc_> c_14: with file headers you meant the moov atom, I take it. where the moov atom contains information about the video
[03:10:39 CET] <dsc_> so first job is finding that metadata
[03:11:02 CET] <dsc_> you know
[03:11:07 CET] <dsc_> im just gonna go through ffmpeg source code
[03:11:09 CET] <kuroro> c_14: interesting, i'll try to look into whether android supports ramdisk
[03:11:23 CET] <kuroro> but i'll test removing the scale filters first
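On a typical Linux box the ramdisk part looks like the sketch below (requires root; whether Android allows this varies by device):

    mount -t tmpfs -o size=512m tmpfs /mnt/ramdisk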
[04:06:36 CET] <babadoc> hey gus
[04:06:37 CET] <babadoc> guys
[04:06:50 CET] <babadoc> does ffmpeg have any options to scan through vide for a specific frame
[04:06:55 CET] <babadoc> video*
[04:07:30 CET] <babadoc> anyone know how i could scan through video for a frame?
[04:13:38 CET] <SchrodingersScat> yes
[04:55:27 CET] <k_sze[work]> When I run ffprobe -show_frames on a video file, is there any guarantee that the lines inside each [FRAME][/FRAME] are always in the same order?
[04:55:51 CET] <k_sze[work]> And that every [FRAME][/FRAME] for the same stream has the same lines?
[05:23:43 CET] <Nanashi> I have a short clip with a variable framerate. How can I find out how long each frame lasts?
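For both questions above, asking ffprobe for a structured output format avoids depending on line order, and per-frame durations can be requested explicitly; a sketch with a hypothetical input:

    ffprobe -of json -select_streams v:0 -show_entries frame=pkt_pts_time,pkt_duration_time input.mkv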
[06:50:04 CET] <Alex> *wave* - anyone able to register on trac? Creating a new account just hangs on the POST..
[06:50:42 CET] <Alex> Oh, it worked. Just took 55s. Nm :)
[07:28:36 CET] <thebombzen> hmmm. I'm having trouble looping an input video
[07:29:14 CET] <thebombzen> I tried using ffmpeg -stream_loop -1 -i input.mkv -c:v ffv1 -c:a flac -t 10 out.mkv
[07:29:24 CET] <thebombzen> for some reason the timestamps in the new file are corrupted.
[07:48:45 CET] <fling> How do I align videos together to a single multi-angle video?
[08:13:25 CET] <kerio> how do i "benchmark" the reading speed of a video?
[08:13:31 CET] <kerio> output as rawvideo to /dev/null ?
[08:16:12 CET] <thebombzen> fling: try using the overlay filter
[08:16:45 CET] <thebombzen> kerio: the -f null format discards output. I'd use -c copy -f null
[08:17:00 CET] <kerio> thebombzen: ty
[08:17:20 CET] <kerio> right now i'm using rawvideo to stdout piped to md5 so i can also check for integrity
[08:17:33 CET] <kerio> but -f null seems way better than outputting to /dev/null
[08:17:48 CET] <thebombzen> yea you skip the device driver. the discarding is done one step sooner.
[08:18:16 CET] <kerio> aite mjpeg2000 is WAAAAAAAY slower
[08:18:19 CET] <kerio> than ffv1
[08:19:25 CET] <kerio> and ffv1 is even slightly smaller
[08:22:28 CET] <furq> -c copy won't decode the streams
[08:22:36 CET] <furq> you probably want -c:v rawvideo -f null
[08:22:37 CET] <kerio> holy crap it's like a 6x slowdown
[08:22:53 CET] <kerio> 1778,11s user 10,33s system vs 319,82s user 5,88s system
[08:23:33 CET] <kerio> hmm can't ffv1 decode with more threads than 2?
[08:25:16 CET] <furq> it uses more than that here
[08:25:25 CET] <kerio> ffmpeg is stuck at 200% cpu :(
[08:26:08 CET] <kerio> thebombzen: apparently the default output codec for "null" is wrapped_avframe
[08:26:10 CET] <kerio> which does decode
[08:27:03 CET] <furq> oh that's helpful
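Putting the pieces together, a sketch of a decode-speed benchmark: -f null - discards the output after the implicit wrapped_avframe encode (so the file is fully decoded), and -benchmark prints timing at the end.

    ffmpeg -benchmark -i input.mkv -f null -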
[08:27:17 CET] <kerio> i have NO idea of what -threads is about
[08:27:20 CET] <kerio> 8 is actually 2
[08:27:29 CET] <kerio> i requested 32 and i got told that anything greater than 16 is not recommended
[08:27:40 CET] <kerio> and i'm only using 4
[08:28:09 CET] <furq> what resolution is the input
[08:28:13 CET] <kerio> 640x480
[08:28:16 CET] <kerio> gray16
[08:28:21 CET] <furq> i guess it depends on the input
[08:28:29 CET] <furq> i get two threads used for 480p but six for 1080p50
[08:28:36 CET] <kerio> fair enough
[08:31:19 CET] <fling> thebombzen: sounds complex!
[08:31:57 CET] <fling> Another question is how to wipe a few seconds of sound here and there in a video?
[08:32:51 CET] <kerio> ...am i using an old ffmpeg
[08:33:07 CET] <kerio> istr there was some special casing done so you could setsar even when using vcodec copy
[08:33:20 CET] <fling> kerio: ffmpeg -version
[08:33:28 CET] <kerio> 3.2
[08:33:40 CET] <kerio> ok it's -aspect apparently
[08:34:08 CET] <bencoh> and it's a DAR not a SAR
[08:34:33 CET] <furq> yeah you can't change the sar without reencoding
[08:34:37 CET] <furq> at least not with ffmpeg
[08:34:46 CET] <kerio> but it's the same thing :<
[08:34:59 CET] <fling> Do I need an external editor for manipulating sound? Do I also need an external editor for aligning multiple videos?
[08:35:29 CET] <bencoh> kerio: no! :)
[08:35:29 CET] <furq> you can usually set the dar at container level
[08:35:47 CET] <furq> -aspect just sets a tag in the output container
[08:36:20 CET] <fling> Is there also another aspect somewhere in videostream itself?
[08:36:59 CET] <furq> the sar is part of the bitstream in h264 at least
[08:37:14 CET] <kerio> well this is either mj2 or ffv1
[08:37:30 CET] <fling> and it gets ignored when dar is set on the container level right?
[08:37:35 CET] <kerio> fling: "maybe"
[08:37:42 CET] <kerio> media players be crazy
[08:37:47 CET] <fling> depends on the player ok
[08:38:10 CET] <fling> There could be also container level crop which gets ignored by mplayer
[08:38:23 CET] <fling> Have not tried one from mkv with mpv
[08:39:36 CET] <kerio> what does it mean when sar and dar are displayed between []?
[08:45:40 CET] <kerio> ok this is ridiculous how do i drop the "encoder" metadata
[08:45:41 CET] <kerio> i did it before
[09:10:32 CET] <kerio> is there a way to only keep specific metadata from an input stream?
[09:34:12 CET] <kerio> ok i got it
[09:34:33 CET] <kerio> apparently fflags +bitexact will add "Lavf" instead of "Lavf57.56.100" or whatever the version
[09:34:52 CET] <kerio> however, with vcodec copy, +bitexact will actually strip the encoder ;o
[09:42:59 CET] <bencoh> how is that bitexact then?
[09:43:13 CET] <kerio> i guess "bitexact copy" has no encoder field
[09:43:26 CET] <kerio> whereas "bitexact encoding" will have a version-agnostic encoder field
[09:43:36 CET] <kerio> i dunno, you tell me
[09:46:41 CET] <DHE> well, a copy is either an exact copy or it isn't. usually it is, to the point that sometimes you'll be instructed to add a bitstream filter manually if the copy can't take place properly
[10:17:26 CET] <noggor> Hello, I'm trying to transmux an rtmp live stream to mpeg-ts over http with ffmpeg. I tried to find something in forums and google, but no luck. Could someone please help me get on track
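A sketch of one way to do this, assuming an ffmpeg built with the http listen option (the URLs are placeholders; this serves a single client):

    ffmpeg -i rtmp://example.com/live/stream -c copy -f mpegts -listen 1 http://0.0.0.0:8080/stream.ts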
[11:49:48 CET] <bwe> Hi, I converted a .mov to .mkv via `ffmpeg -i in.mov -c:v libx264 -preset slow -crf 18 -c:a copy -pix_fmt yuv420p out.mkv`. The audio precedes the video about a couple of seconds in the mkv. How can I prevent that?
[11:57:50 CET] <thebombzen> bwe: try using a setpts filter. i.e. -vf 'setpts=PTS-STARTPTS'
[12:02:49 CET] <thebombzen> fling: try the volume filter. you can reduce the volume of a segment to zero with it.
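A sketch of the volume approach, muting seconds 30-35 via the filter's timeline support (times and filenames are placeholders):

    ffmpeg -i in.mp4 -c:v copy -af "volume=enable='between(t,30,35)':volume=0" out.mp4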
[12:03:07 CET] <thebombzen> and fling no you don't need an external editor for aligning multiple videos. I already answered that question... use the overlay filter.
[12:28:18 CET] <bwe> thebombzen: Okay. I've tried `ffmpeg -i in.mov -c:v libx264 -vf 'setpts=PTS-STARTPTS' -preset slow -crf 18 -c:a copy -pix_fmt yuv420p out.mkv`. The result seems slightly better. But still, audio is not in sync with video. What can I try?
[12:38:39 CET] <bwe> Hold on, I've mixed filenames up! Let me check...
[12:55:43 CET] <bwe> thebombzen: No, I can't get audio and video in sync with the above command.
[13:47:18 CET] <bwe> I've deleted the .mkv file and then re-executed the above command. Yet the audio is not in sync with video.
[13:47:36 CET] <bwe> in in.mov (the input file) it is in sync.
[14:15:21 CET] <fling> thebombzen: how will overlay filter help me?
[14:34:34 CET] <Nils__> anybody here who can help me with a little problem ?
[14:40:24 CET] <pomaranc> hello, if in h.264 there is a frame coded as two fields (non-interlaced), at what point in ffmpeg are those fields joined up? only at progressive encoding, or sooner?
[15:08:56 CET] <Nils__> can sb tell me why the ffplayer isn't showing up when I type 'brew optioins ffmpeg' in my terminal ?
[15:09:15 CET] <iive> ffplay?
[15:09:29 CET] <pomaranc> options?
[15:09:42 CET] <Mavrik> Nils__, --with-sdl2"
[15:10:31 CET] <Nils__> i'm not too certain if I installed sdl2 correctly
[15:11:08 CET] <Mavrik> huh
[15:11:14 CET] <Mavrik> brew will pull it itself anyway.
[15:11:27 CET] <Nils__> ah okay thanks
[15:12:07 CET] <Nils__> brew options ffmpeg --with-sdl2 is not showing it either tho
[15:16:56 CET] <Nils__> would be awesome if you could help me out
[15:25:30 CET] <Nils__> i have a  problem using the ffplay tool. using homebrew its not showing up if i type in 'brew options ffmpeg' can sb help me out ? that would be awesome
[15:27:10 CET] <kerio> mpv
[15:27:31 CET] <kerio> :D
[15:29:30 CET] <Nils__> I need it to stream from my gopro wireless
[15:31:05 CET] <bencoh> to stream or to play?
[15:31:13 CET] <Nils__> stream
[15:31:22 CET] <bencoh> then why ffplay?
[15:31:44 CET] <Nils__> I have no idea how to do it otherwise
[15:32:04 CET] <bencoh> if you just need to stream without displaying it to screen you can use ffmpeg
[15:32:19 CET] <Nils__> i do want to display it
[15:32:43 CET] <Nils__> live stream from the gopro wireless to the display
[15:33:06 CET] <Nils__> found this instruction which says ffplay udp://:8554 would work if i connect myself to the gopro wifi network
[15:33:49 CET] <bencoh> then you want to play it (maybe from a stream, but meh) :)
[15:34:39 CET] <Nils__> sorry about the wrong explanation then
[15:35:59 CET] <Nils__> still I can't find the ffplay tool at all -.-
[15:37:55 CET] <bencoh> Nils__: I suspect it'd work with mpv and/or vlc (maybe with a different uri) as well
[15:38:44 CET] <kerio> mpv should "almost" be a drop-in replacement for ffplay
[15:39:27 CET] <Nils__> i gonna try it out with mpv thanks for the help so far
[15:48:35 CET] <pgorley> hi all, does the patcheck tool take as argument only the .patch file?
[15:49:44 CET] <BtbN> it doesn't gake any arguments iirc
[15:49:46 CET] <BtbN> *take
[15:50:00 CET] <BtbN> it takes the patch on stdin, and generates the report on stdout
[15:52:07 CET] <pgorley> thanks
[15:58:45 CET] <waltercruz> Hi all
[15:59:19 CET] <waltercruz> I'm trying to do something like that: http://stackoverflow.com/questions/5015771/merge-video-and-audio-with-ffmpeg-loop-the-video-while-audio-is-not-over
[15:59:27 CET] <waltercruz> join a video and a audio
[15:59:38 CET] <waltercruz> the video is shorter, so I want to loop it until audio ends
[15:59:51 CET] <waltercruz> and I always get: dev/fd/63: No such file or directory
[16:00:44 CET] <waltercruz> I'm using the script pointed in the link
[16:00:45 CET] <waltercruz> https://gist.github.com/waltercruz/8f66c6169cc361159f503f2313f5f498
[16:01:34 CET] <waltercruz> any ideas?
[16:15:38 CET] <dl2s4> waltercruz, i would just do it in several steps. also if you would paste your command and output there would be a higher chance anybody could help
[16:15:58 CET] <ikevin> hi
[16:16:20 CET] <waltercruz> ok dl2s4
[16:16:29 CET] <waltercruz> I'm using this script: https://gist.github.com/LeonB/0a9c7f944d0e1bf2c8e6fd2de0bbecea
[16:17:04 CET] <ikevin> i'm trying to make an infinite video of 1 jpg file, where the jpg file is updated by another program, can i use ffmpeg to do that?
[16:17:44 CET] <waltercruz> the commands and the results: https://gist.github.com/waltercruz/88f010e6146bcfc6949ee737e683f7bd
[16:18:01 CET] <waltercruz> the problem seems to be in this part: <(for i in $(seq 1 ${n_loops}); do printf "file '%s'\n" ${video}; done)
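Process substitution hands ffmpeg a /dev/fd path that may not survive (e.g. when the script runs under sh instead of bash); writing a real list file for the concat demuxer sidesteps it. A sketch reusing the script's variables (the rest of the pipeline is assumed):

    for i in $(seq 1 "$n_loops"); do printf "file '%s'\n" "$video"; done > list.txt
    ffmpeg -f concat -safe 0 -i list.txt -c copy looped.mp4

The looped output can then be muxed with the audio in a second step.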
[16:46:22 CET] <bwe> I've tried `ffmpeg -i in.mov -c:v libx264 -vf 'setpts=PTS-STARTPTS' -preset slow -crf 18 -c:a copy -pix_fmt yuv420p out.mkv` to get audio and video in sync. However the audio is not in sync with the video.
[16:46:30 CET] <bwe> What can I do?
[17:06:34 CET] <kerio> ...why's metal gear solid twin snakes a format
[17:07:20 CET] <furq> the same reason wing commander 3 movie is a format
[17:08:15 CET] <bencoh> :))
[17:10:51 CET] <SchrodingersScat> is it really?
[17:12:13 CET] <zeryx> if I want to extract the first audio stream, and then re-attach it later (after doing something to the video) generically, what would my command look like?
[17:12:13 CET] <zeryx> right now it's failing
[17:12:57 CET] <Threads> do you mean mux back into a file ?
[17:13:09 CET] <zeryx> my extraction command is: ffmpeg -vn -acodec copy audiofile.aac videofile.mp4 -y
[17:13:45 CET] <zeryx> my attachment command is: ffmpeg -i editedVideoFile.mp4 -i audiofile.aac -codec copy -shortest complete_video.mp4 -y
[17:14:02 CET] <zeryx> not sure what the difference between mux and attach/detach is
[17:14:04 CET] <thebombzen> fling: read the documentation.
[17:14:08 CET] <zeryx> but if they're the same, then yeah
[17:14:30 CET] <zeryx> that command I just wrote doesn't always work, depends on the input & output file formats and the audio formats
[17:14:32 CET] <zeryx> I want to generalize it
[17:15:27 CET] <thebombzen> fling: see https://ffmpeg.org/ffmpeg-filters.html#overlay-1
[17:15:37 CET] <thebombzen> one of the examples specifically puts two videos sidebyside
[17:16:54 CET] <zeryx> any ideas Threads?
[17:17:02 CET] <thebombzen> zeryx: if you want to be super general, you can use the "nut" container, which is an ffmpeg creation. afaik it can hold any stream that ffmpeg is capable of producing.
[17:17:45 CET] <zeryx> can I specify the output container in the command without putting it in the output file extension?
[17:18:08 CET] <thebombzen> yes. that's the "-f" option. i.e. "-f nut" or "-f matroska" or "-f mp4" etc. etc.
[17:18:11 CET] <zeryx> I'd like to have an extensionless temporary audio file while I do work on the individual frames of the video, then reencode them back together
[17:18:14 CET] <zeryx> oh neat kk
[17:18:28 CET] <thebombzen> You also don't have to extract it.
[17:18:30 CET] <zeryx> so ffmpeg -vn acodec copy -f nut audiofile videofile.mp4 -y
[17:18:34 CET] <thebombzen> You can do something like this:
[17:18:54 CET] <thebombzen> ffmpeg -i editedVideoFile.mp4 -i originalVideoFile.mp4 -map 0:v -map 1:a -c copy output_file.mp4
[17:18:56 CET] <zeryx> oh I could copy the streams from the old video file into the new one
[17:19:01 CET] <zeryx> nice
[17:19:01 CET] <thebombzen> yes.
[17:19:03 CET] <zeryx> yeah that's a good idea
[17:19:05 CET] <zeryx> I like that
[17:19:14 CET] <zeryx> however, the output file might have a different FPS
[17:19:20 CET] <zeryx> that should still line up though right?
[17:19:27 CET] <zeryx> (assuming I handle fps correctly)
[17:19:36 CET] <thebombzen> the "-map" option tells ffmpeg which streams to select. 0:v means "the video stream from input 0" and "1:a" means "the audio stream from input 1"
[17:19:45 CET] <thebombzen> those will line up as long as you did the video editing correctly.
[17:19:52 CET] <zeryx> kk
[17:20:02 CET] <zeryx> why would I map the 0:v stream?
[17:20:13 CET] <thebombzen> well do you want video? :)
[17:20:21 CET] <zeryx> oh I see what you mean
[17:20:23 CET] <thebombzen> if you specify -map, it'll select only streams you map.
[17:20:32 CET] <zeryx> I have a pre-processed video and a post-processed video
[17:20:41 CET] <zeryx> and then a post-processed with audio
[17:21:01 CET] <zeryx> but I guess I could add the audio channel inside of the image2 reencoder command yeah?
[17:21:16 CET] <zeryx> (not that it's really any time savings, remuxing the audio stream back in is almost instant)
[17:21:18 CET] <thebombzen> no. image2 is a container format that literally means "sequence of images"
[17:21:57 CET] <bwe> thebombzen: May I kindly ask about my audio-video sync issue: is the problem that I'm not applying -vf 'setpts=PTS-STARTPTS' correctly, or does it come from something else?
[17:22:15 CET] <zeryx> so ffmpeg -framerate 25 -i /tmp/files-%05d.png -i originalVideoFile.mp4 -map 1:a -c copy output_file.mp4 won't work?
[17:22:30 CET] <thebombzen> bwe: first of all, what player are you using to check if the video/audio are in-sync?
[17:22:40 CET] <thebombzen> zeryx: no that'll work
[17:22:42 CET] <zeryx> (also I'm totally fine with maping all other channels besides the video channel from the original into the edited video)
[17:23:28 CET] <zeryx> I might split up the operations just to keep things clean
[17:23:28 CET] <thebombzen> although
[17:23:28 CET] <zeryx> but thats good to know
[17:23:28 CET] <thebombzen> it'll work but it won't map any video
[17:23:28 CET] <thebombzen> you still need to -map 0:v. if you specify any "-map" switches, it'll disable auto-selection and you have to select all the streams you want manually.
[17:23:29 CET] <zeryx> yeah that's fine, the `-i /tmp/files-%05d.png` should create that video stuff I want
[17:23:34 CET] <zeryx> oh
[17:23:37 CET] <zeryx> interesting
[17:23:47 CET] <zeryx> so in that case -map 0:v would map to itself?
[17:24:00 CET] <zeryx> because I don't want to copy the video from the original file over
[17:24:02 CET] <zeryx> I just want to copy the other streams
[17:24:44 CET] <bencoh> '0:v means "the video stream from input 0"'
[17:24:44 CET] <thebombzen> the -map option selects the stream you pass to it. so -map 0:v selects the stream "0:v" which means "the video stream from input 0"
[17:24:50 CET] <zeryx> oh ok
[17:24:57 CET] <thebombzen> I did say that above...
[17:25:13 CET] <zeryx> I thought 0:v meant copy the 0th video stream from the first previous input
[17:25:33 CET] <thebombzen> [12:18:57] <thebombzen> the "-map" option tells ffmpeg which streams to select. 0:v means "the video stream from input 0" and "1:a" means "the audio stream from input 1"
[17:25:46 CET] <zeryx> so in ffmpeg -i video_file.mp4 -map 0:v -c copy -i other_file.mp4 complete_file.mp4 I thought it would copy the first video stream from the first input into the second?
[17:26:53 CET] <bwe> thebombzen: vlc.
[17:26:59 CET] <bwe> thebombzen: I observed the original problem (before trying that -vf 'setpts=PTS-STARTPTS' switch) in my youtube video after uploading the mkv. I can confirm the same with vlc on the same file.
[17:27:08 CET] <thebombzen> you should specify all of the -map options after the -i options.
[17:27:09 CET] <zeryx> ah
[17:27:09 CET] <thebombzen> as well as -c copy. In general you should try to put all of your input options first
[17:27:09 CET] <thebombzen> and then all the output options. some options don't care about location but most do.
[17:27:09 CET] <thebombzen> In particular, putting -c before -i overrides ffmpeg's autodetection of the input file's codec.
[17:27:09 CET] <thebombzen> bwe: try with ffplay. vlc sucks.
[17:28:12 CET] <thebombzen> also what is the output of "ffprobe in.mov" and "ffprobe out.mkv"
[17:29:46 CET] <zeryx> thebombzen: how can I copy all streams that aren't the video stream?
[17:29:46 CET] <thebombzen> zeryx: the "-vn" option says "do not select video streams"
[17:29:46 CET] <thebombzen> so if you use "-map 0 -vn" that should do it.
[17:29:46 CET] <zeryx> could I do -map 0:v -map 1 -vn ?
[17:30:00 CET] <zeryx> not sure how clean the options DSL is for stuff like this
[17:30:03 CET] <thebombzen> zeryx: doing -map 0:v -vn is silly because -map 0:v selects only the video stream from 0, and -vn doesn't select it.
[17:30:19 CET] <zeryx> but would -map 0:v -map 1 conflict?
[17:30:24 CET] <thebombzen> no?
[17:30:28 CET] <thebombzen> why would they?
[17:30:29 CET] <bwe> thebombzen: ffprobe: .mov => https://bpaste.net/show/2660eadf73b3 .mkv => https://bpaste.net/show/0106779a2e1a
[17:30:33 CET] <zeryx> because I'm selecting a video stream twice
[17:30:54 CET] <thebombzen> bwe: [mov,mp4,m4a,3gp,3g2,mj2 @ 0x7ff013800000] multiple edit list entries, a/v desync might occur, patch welcome
[17:31:09 CET] <thebombzen> try demuxing it without libavformat. looks like a bug.
[17:31:20 CET] <thebombzen> i.e. try using MP4Box to repack it or something.
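A sketch of the repack idea, assuming GPAC's MP4Box is installed; rewriting the container may flatten the edit list that triggers the warning:

    MP4Box -add in.mov -new repacked.mp4
    ffmpeg -i repacked.mp4 -c:v libx264 -preset slow -crf 18 -c:a copy -pix_fmt yuv420p out.mkv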
[17:32:40 CET] <bwe> demuxing means getting audio and video separately?
[17:32:58 CET] <thebombzen> zeryx: there's no rule that says you can't have two video streams in one file.
[17:32:58 CET] <thebombzen> It's generally confusing to media players but totally allowed.
[17:32:58 CET] <zeryx> yeah, I figured there wasn't
[17:32:58 CET] <zeryx> I only want one video stream
[17:32:58 CET] <zeryx> but I want every other stream from the other video that isn't a video stream
[17:33:20 CET] <thebombzen> so you want all the streams from input 0 that aren't video, and you want the video from input 1?
[17:33:22 CET] <zeryx> (I'm editing each frame of the video generically and I don't want things to break because of a weird video structure)
[17:33:30 CET] <zeryx> yea
[17:33:38 CET] <zeryx> well, all the streams that won't break stuff
[17:33:43 CET] <thebombzen> well how many streams are in input 0?
[17:33:44 CET] <zeryx> (that might be a tall order to add)
[17:34:02 CET] <zeryx> between 1-n I assume? I'd prefer to only take audio tracks
[17:34:06 CET] <zeryx> but I'm not sure if you can do that
[17:34:14 CET] <zeryx> actually you know what, hmm
[17:34:23 CET] <zeryx> this is tough because subtitles are considered video streams right?
[17:34:31 CET] <c_14> -map 0:v:0 -map 0:a -map 0:s
[17:35:00 CET] <thebombzen> you can also use -map 0 -map 1 -map -0:v
[17:35:16 CET] <zeryx> the -map 0:v supercedes the others?
[17:35:17 CET] <thebombzen> sorry I mean -map 0 -map -0:v -map 1:v
[17:35:26 CET] <zeryx> interesting
[17:35:29 CET] <thebombzen> note the negative sign before a stream specifier. it means "not this."
[17:35:31 CET] <zeryx> what would that do as a structure?
[17:35:37 CET] <zeryx> oooh
[17:35:40 CET] <zeryx> ahhh
[17:35:42 CET] <zeryx> I get it :D
[17:35:52 CET] <zeryx> didn't see that initially
[17:36:01 CET] <bencoh> isn't -map 0:a enough to get all audio streams from 0, like c_14 suggested?
[17:36:10 CET] <zeryx> yeah but I'm thinking about subtitle streams as well
[17:36:16 CET] <thebombzen> -map 0 says "everything in input 0" "-map -0:v" says "except for the video" and then "-map 1:v" says "but do use the video from input 1"
[17:36:17 CET] <c_14> 0:s maps subtitle streams
[17:36:24 CET] <c_14> But thebombzen's is better
[17:36:32 CET] <thebombzen> c_14 there's often data streams like attached fonts to go with ASS subs.
[17:36:33 CET] <zeryx> :+1:
[17:36:42 CET] <c_14> thebombzen: which is why yours is better
[17:36:47 CET] <thebombzen> :D
[17:36:47 CET] <bencoh> :)
[17:36:49 CET] <zeryx> nice guys, thanks
[17:36:49 CET] <c_14> Though you could also map 0:d
[17:37:01 CET] <thebombzen> sounds like it's getting cluttered already :)
[17:37:03 CET] <zeryx> would i need to copy the codecs still?
[17:37:06 CET] <zeryx> I'm not encoding or reencoding
[17:37:08 CET] <zeryx> so I figure no?
[17:37:13 CET] <c_14> -c copy
[17:37:14 CET] <thebombzen> yea you coudl specify -c copy
[17:37:18 CET] <zeryx> kk
[17:37:36 CET] <zeryx> I thought codec copying was only useful when reencoding or changing encoding?
[17:37:51 CET] <zeryx> (actually I might be re-encoding if the user specifies a different output container from the input)
[17:38:07 CET] <bwe> thebombzen: https://bpaste.net/show/ef98beb7f91a Is it right to assume that I feed ffmpeg the audio file _track1.mov separately instead of using -c:a copy?
[17:38:09 CET] <thebombzen> no. codec copying literally means "don't reencode. pass the encoded stuff from input to output"
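Collecting the mapping discussion into one sketch (filenames hypothetical): everything from the original except its video, plus the video from the edited file, all stream-copied.

    ffmpeg -i original.mp4 -i edited.mp4 -map 0 -map -0:v -map 1:v -c copy output.mp4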
[17:39:01 CET] <thebombzen> bwe: this is getting past my knowledge. sorry.
[17:39:14 CET] <thebombzen> I'm not very good at working around bugs in libavformat.
[17:39:34 CET] <zeryx> is there a ffprobe compare option?
[17:39:43 CET] <zeryx> so I can compare the differences between two files?
[17:40:00 CET] <zeryx> (I could probably make a tmux session and put them side by side, but I figure there's an easier way)
[17:40:59 CET] <bwe> thebombzen: No problem. I would need to do it only without libavformat? I have now separated audio from video with mp4box: https://bpaste.net/show/873be604f213
[17:41:04 CET] <furq> if you want to check whether two files contain identical streams then use the hash muxer
[17:50:40 CET] <pgorley> is there a way to disable vaapi-x11 without disabling xlib and enabling vaapi-drm? i've fiddled around with configure options but haven't found anything yet
[17:54:34 CET] <jkqxz> pgorley:  Can you clarify what combination you want?  No vaapi at all is just --disable-vaapi.  Having vaapi enabled but neither vaapi_x11 nor vaapi_drm would mean it's built in but only usable from libraries.
[17:55:45 CET] <zeryx> thebombzen: I have another question for you, do you know how to set the jpeg compression ratio with the image2 muxer?
[17:55:54 CET] <furq> -q:v
[17:55:57 CET] <furq> 2 is best, 31 is worst
[17:56:13 CET] <zeryx> I'm right now saving as png files and files blow up pretty badly when I don't compress
[18:00:07 CET] <zeryx> why does the best value stop at 2 instead of 0 or 1? Also is that a real jpeg compression ratio or an internal parameter defined by ffmpeg?
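The 2-31 range is libavcodec's internal qscale (quantizer) scale used by its MJPEG-family encoders, not a JPEG quality percentage; lower means better. A sketch of dumping frames as JPEG at near-best quality (paths hypothetical):

    ffmpeg -i input.mp4 -q:v 2 frames/out-%05d.jpg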
[18:03:53 CET] <pgorley> jkqxz: i want vaapi-drm, but not vaapi-x11
[18:04:02 CET] <pgorley> so vaapi enabled
[18:06:13 CET] <fling> thebombzen: will I use the sidebyside video for calculating the offsets and fps multiplier?
[18:06:36 CET] <fling> Another question: do I even need mp4 and avi and other containers, or should I just use mkv everywhere?
[18:06:57 CET] <bencoh> zeryx: why are you saving as png? do you need to do editing on a frame basis/on picture files?
[18:06:57 CET] <zeryx> yeah, I'm editing each frame bencoh, then stitching the images back together
[18:06:57 CET] <zeryx> using an image processing algorithm (which is generic)
[18:07:05 CET] <zeryx> bencoh can you think of a better approach then using the image2 demuxer for it?
[18:07:33 CET] <zeryx> (also yeah it's kind of painful because it essentially decompresses the video file, which can get pretty darn enormous, but it works pretty well)
[18:07:34 CET] <jkqxz> pgorley:  But xlib has to be enabled?  You can't get that combination without minor hackery.  You could rm /usr/include/va/va_x11.h (or wherever it is), or you could comment out lines 6004-6006 of configure.
[18:07:41 CET] <bencoh> I suppose you could somehow process frames at the same time, but....
[18:07:45 CET] <zeryx> yeah
[18:07:48 CET] <zeryx> I'm processing frames in parallel
[18:07:57 CET] <bencoh> no I mean, while decoding/encoding
[18:08:00 CET] <zeryx> around 50 parallel threads on 5 different machines on average
[18:08:07 CET] <zeryx> oh yeah, I did that initially but data bandwidth was the limiter
[18:08:09 CET] <pgorley> jkqxz: ideally xlib would be enabled, yes
[18:08:10 CET] <jkqxz> pgorley:  Or if you built libva yourself you can build it without x11 support.
[18:08:18 CET] <zeryx> aws is pretty fast, but not fast enough
[18:08:35 CET] <pgorley> alright, thanks for the tips jkqxz!
[18:08:42 CET] <bencoh> I wouldn't exactly call it "fast" but .... ;)
[18:08:46 CET] <zeryx> it's unfortunately more efficient to decode/re-encode on a single machine, although multi-threading might be viable
[18:08:58 CET] <bencoh> (certainly not when talking about video)
[18:09:01 CET] <zeryx> yea
[18:09:08 CET] <zeryx> well, the video is pretty quick to download and upload
[18:09:20 CET] <zeryx> it's the images & image files that totally annihilate our bandwidth :P
[18:14:10 CET] <zeryx> I try to reduce the number of data transfer round trips as much as possible
[18:14:10 CET] <zeryx> and keep as much work on one machine as I can
[18:14:10 CET] <zeryx> (pretty analogous to memory management in the gpu world tbh)
[18:14:11 CET] <zeryx> so that means right now I'm decoding a video into frames with the image2 demuxer, doing a par_iter map where I upload & process batches of frames on other servers, and wait for all the threads to finish with a .join(), then concatenate them back into a video again
[18:14:11 CET] <bencoh> it still doesn't explain why you can't locally process frames as you decode/encode, but ...
[18:14:12 CET] <zeryx> servers are different, but if I interacted with libffmpeg directly I might be able to..
[18:14:12 CET] <zeryx> actually that's a good idea beacuse C++ FFI is incredibly easy with rust
[18:14:12 CET] <bencoh> dunno about rust+libavfilter though :D
[18:14:12 CET] <zeryx> (which this app is made with)
[18:14:12 CET] <bencoh> no idea how well that would map
[18:14:12 CET] <zeryx> there are a few wrappers but I'd want it to return me a rayon par_iter map of concurrent threads each containing a frame
[18:14:12 CET] <zeryx> which may or may not be impossible :P
[18:14:12 CET] <bencoh> but I'd personally probably go for a pipeline approach, yeah
[18:14:12 CET] <zeryx> containing a future frame* which I then realize by collapsing it with a .wait()
[18:14:12 CET] <zeryx> yeah
[18:14:12 CET] <zeryx> cool
[18:14:12 CET] <zeryx> I already built the basic version but the data transit time over aws was a total killer
[18:14:12 CET] <bencoh> (well, not exactly surprising considering the kind of projects I've been working on ^^)
[18:14:12 CET] <zeryx> https://algorithmia.com/algorithms/media/VideoAlgorithms
[18:14:12 CET] <zeryx> rebuilding that so it's less shit and unwieldy
[18:47:13 CET] <bwe> thebombzen: I've fed ffmpeg with video and audio separately. However the result is the same: audio precedes video by a couple of seconds.
[18:47:19 CET] <bwe> thebombzen: Which alternatives to ffmpeg could I use to try to transform mov into mkv (with audio and video in sync)?
[18:48:12 CET] <furq> mkvmerge
[18:49:31 CET] <kerio> is there a way to use a program as a video filter?
[18:50:40 CET] <bwe> furq: and how could I recode that >300 MiB file into some ~30 MiB file via x264 or similar (like ffmpeg does but without a/v-sync)?
[18:51:24 CET] <bwe> furq: source .mov file would be https://bpaste.net/show/2660eadf73b3
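A sketch of the mkvmerge route (whether it fixes the sync depends on how it handles the mov's edit lists): remux first, then encode from the remuxed file.

    mkvmerge -o remuxed.mkv in.mov
    ffmpeg -i remuxed.mkv -c:v libx264 -preset slow -crf 18 -c:a copy -pix_fmt yuv420p out.mkv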
[19:04:52 CET] <ChocolateArmpits> kerio: What do you mean ?
[19:05:34 CET] <kerio> meh i guess it's way too complicated
[19:06:07 CET] <kerio> because of possibly variable framerates and all that
[19:06:47 CET] <kerio> ChocolateArmpits: execute a program, pass frames to its stdin, read frames from its stdout
[19:06:58 CET] <kerio> possibly in different formats
[19:10:18 CET] <ChocolateArmpits> kerio: Yeah absolutely.
[19:10:35 CET] <ChocolateArmpits> You just have to be able to read the supplied format and write the requested format
[19:10:55 CET] <kerio> ChocolateArmpits: yea but without running ffmpeg twice
[19:11:05 CET] <kerio> and outputting/inputting from a pipe
[19:11:16 CET] <ChocolateArmpits> Maybe via named pipes ?
[19:11:31 CET] <kerio> ...that's not really the point :\
[19:11:32 CET] <ChocolateArmpits> certainly not via traditional cli pipe
[19:11:48 CET] <ChocolateArmpits> I know however if you're on Windows that won't be doable
[19:11:48 CET] <kerio> that's why i wanted something as a video filter oslt
[19:12:35 CET] <ChocolateArmpits> Anyways I would run two instances, it's not like you're fighting for memory, it'll be simpler in the end
[19:13:29 CET] <ChocolateArmpits> Is the data being read from a file ?
[19:14:14 CET] <kerio> yeah
[19:14:35 CET] <kerio> running two ffmpegs is a pain tho
[19:14:40 CET] <kerio> you have to remember to silence one
[19:14:44 CET] <kerio> or the output on stderr is a mess
[19:16:30 CET] <ChocolateArmpits> What about running -loglevel error ?
[19:16:35 CET] <ChocolateArmpits> or -loglevel quiet
[19:17:21 CET] <kerio> i wonder if it's worth it to put everything in a single ffmpeg invocation regardless
[19:19:04 CET] <ChocolateArmpits> In the end your filter application will have to work in pretty much the same way
[19:19:06 CET] <ChocolateArmpits> So focus on that
[19:20:28 CET] <kerio> my filter application is already done
[19:21:00 CET] <ChocolateArmpits> oh ok then
[19:23:56 CET] <kerio> aww, ffmpeg gets stuck waiting for my filter's output before starting with the filter's input
[19:24:08 CET] <kerio> i wonder if there's a way to make it not wait
[19:24:52 CET] <ChocolateArmpits> -analyzeduration or -probesize maybe ?
[19:25:27 CET] <kerio> oh wait i should tell ffmpeg that these are fifos maybe
[19:27:23 CET] <BtbN> ffmpeg.c is single threaded. It processes all filters, decoders, encoders sequentially
[20:07:05 CET] <kepstin> and i've run into issues with the fact that raw video frames are sometimes bigger than the OS pipe buffers, which forces stuff to run in lockstep. If you're doing a separate filtering app, make sure you're using threaded/async io if possible :)
[20:08:26 CET] <kerio> kepstin: that is... a good point i guess?
[20:08:36 CET] <kerio> nothing seems to bottleneck tho
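A sketch of the named-pipe arrangement discussed above; myfilter is a hypothetical program reading/writing raw rgb24 frames on stdin/stdout, and the size/rate must match the source:

    mkfifo raw_in raw_out
    ffmpeg -loglevel error -i src.mkv -f rawvideo -pix_fmt rgb24 -y raw_in &
    ./myfilter < raw_in > raw_out &
    ffmpeg -f rawvideo -pix_fmt rgb24 -video_size 640x480 -framerate 25 -i raw_out -c:v libx264 out.mkv

As kepstin notes, a frame can exceed the OS pipe buffer, so the filter should read and write concurrently to avoid deadlock.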
[20:09:53 CET] <kerio> honestly, why isn't there a "matplotlib imshow" video filter in ffmpeg?
[20:10:06 CET] <kerio> hm maybe i can find something like that in the opencv filter
[20:26:49 CET] <kerio> is rgb24 to yuv420p "lossless"?
[20:27:08 CET] <kerio> or should i figure out what the colors are supposed to be in yuv in the first place?
[20:27:26 CET] <klaxa> no
[20:29:44 CET] <klaxa> yuv420p has 6 bytes for a 2x2 macropixel with 4 Y, 1 U and 1 V values, averaging 1.5 bytes per pixel, whereas rgb24 has 3 bytes per pixel
[20:29:44 CET] <klaxa> so by design it cannot be lossless
[20:30:14 CET] <kerio> i meant more from a color theory perspective
[20:30:39 CET] <klaxa> still no?
[20:30:59 CET] <klaxa> YUV color space is smaller than rgb, and not all rgb -> yuv conversions will yield the same yuv -> rgb result
[20:31:08 CET] <kerio> smaller is fine
[20:31:17 CET] <kerio> the end result will have to be yuv420 anyway
[20:31:18 CET] <kerio> because of quicktime
[20:31:20 CET] <klaxa> you can do some "mapping" if you want
[20:31:37 CET] <kerio> ok actually let's go from the beginning
[20:31:43 CET] <kerio> i have some gray16 lossless video
[20:32:14 CET] <kerio> that doesn't actually contain brightness values
[20:32:26 CET] <klaxa> ok
[20:32:39 CET] <klaxa> from my understanding grayscale is nothing BUT brightness?
[20:32:47 CET] <kerio> so what i want to do is take each frame, map the lowest intensity to 0, the highest to 1, and then pass it through a colormap
[20:33:00 CET] <kerio> (with everything in between being scaled linearly)
[20:33:31 CET] <kerio> klaxa: nominally, yes
[20:33:51 CET] <kerio> in my particular case, each pixel's value is an absolute temperature in units of hundredths of a K
[20:36:01 CET] <kerio> anyway
[20:36:09 CET] <kerio> right now i'm outputting my colormapped values as rgb24
[20:36:39 CET] <kerio> i could try to figure out a functional representation of my colormap in YUV space
[20:36:46 CET] <kerio> but is it worth it?
[20:37:33 CET] <klaxa> ah i see
[20:37:33 CET] <klaxa> your usecase seems to overlap with some of my work, i basically did that
[20:37:34 CET] <klaxa> just in my case i read data from a netcdf file instead of using a video as the source
[20:37:34 CET] <klaxa> is this supposed to be just visualization?
[20:37:34 CET] <klaxa> or do you plan to keep the data and work with it?
[20:37:34 CET] <Sashmo> can  anyone help me understand why ffmpeg is adding a delay to my transcodes?  http://pastebin.com/uQNWLq1L  706ms added to my encodes
[20:37:45 CET] <klaxa> kerio: is the rgb24 mapped data supposed to be read back and used as the original data?
[20:37:50 CET] <kerio> oh god of course not
[20:38:07 CET] <Sashmo> its actually worse.... that was in the menu section, if I look at the audio portion, its double that.... http://pastebin.com/QF3mvnB9
[20:38:15 CET] <klaxa> if it is just for visualization rgb -> yuv conversion shouldn't make it super bad
[20:38:25 CET] <kerio> especially because i'm also encoding as h264
[20:38:30 CET] <klaxa> subtle changes in hues
[20:38:40 CET] <klaxa> well h264 supports rgb24 :^)
[20:38:44 CET] <kerio> :^)
[20:38:48 CET] <kerio> the thing is
[20:39:00 CET] <kerio> i'd need a way to put the dynamic range back to what it was initially
[20:39:16 CET] <kerio> even if i wanted some hybrid monstrosity of a data file that's also a visible colormapped video
[20:39:39 CET] <kerio> klaxa: does netcdf support ffv1? :^)
[20:40:01 CET] <kerio> because honestly, i'm getting about a 300% compression
[20:40:06 CET] <kerio> that's pretty good
[20:40:25 CET] <klaxa> use a metadata file or add it as metadata to your file?
[20:40:25 CET] <klaxa> hmmm...
[20:40:25 CET] <klaxa> what
[20:40:29 CET] <klaxa> netcdf is for scientific data, not videos
[20:40:50 CET] <kerio> this is scientific data, fam
[20:40:57 CET] <klaxa> yes, but in a video file
[20:41:04 CET] <kerio> because it is a video
[20:41:09 CET] <kerio> ...kinda
[20:41:17 CET] <klaxa> doesn't make netcdf a container format :P
[20:41:24 CET] <klaxa> anyway, gotta go~
[20:41:49 CET] <kerio> raw data for this particular video is 11gb, ffv1 yields 3.6
[20:56:57 CET] <Sashmo> can  anyone help me understand why ffmpeg is adding a delay to my transcodes?  http://pastebin.com/uQNWLq1L  706ms added to my encodes
[21:22:27 CET] <cluelessperson> hey guys, how can I record the screen with a timestamp overlay, but rotate the file I'm saving to in one-hour segments?
[21:24:42 CET] <cluelessperson> I'm on debian
[21:37:03 CET] <Sashmo> does anyone know why I get a start time added to my transcodes?  Duration: 23:17:30.63, start: 11677.213522, bitrate: 1 kb/s
[21:37:03 CET] <Sashmo> it should be start 0
[21:42:11 CET] <furq> cluelessperson: https://www.ffmpeg.org/ffmpeg-formats.html#segment_002c-stream_005fsegment_002c-ssegment
[21:51:41 CET] <cluelessperson> furq, Thank you
[21:51:52 CET] <cluelessperson> furq, I'm having difficulty finding how to overlay a timestamp, any ideas?
[21:55:51 CET] <ChocolateArmpits> Sashmo: Maybe they already have those timestamps ?
[21:59:27 CET] <furq> cluelessperson: https://ffmpeg.org/ffmpeg-filters.html#drawtext-1
[21:59:27 CET] <furq> one of the examples does that
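Combining the two answers into one sketch, assuming an ffmpeg built with libfreetype/fontconfig for drawtext (resolution, preset and output naming are placeholders):

    ffmpeg -f x11grab -framerate 25 -video_size 1920x1080 -i :0.0 \
        -vf "drawtext=text='%{localtime}':fontcolor=white:box=1:boxcolor=black@0.5:x=10:y=10" \
        -c:v libx264 -preset ultrafast -f segment -segment_time 3600 -strftime 1 "rec-%Y%m%d-%H%M%S.mkv"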
[22:05:43 CET] <ChocolateArmpits> When compiling with Decklink should it appear next to "Enabled indevs" ?
[22:18:01 CET] <cluelessperson> furq, doesn't work
[22:21:11 CET] <cluelessperson> Now ffmpeg is stuck and only records the FIRST frame at the start of the recording
[22:21:22 CET] <cluelessperson> minus all the extras too, no added options
[22:21:31 CET] <cluelessperson> like the buffer is stuck
[22:24:14 CET] <geri> hi, i convert a series of images to a video. how does ffmpeg find the fps?
[22:24:50 CET] <furq> it doesn't
[22:24:51 CET] <furq> you either tell it or it defaults to 25
[22:24:53 CET] <furq> cluelessperson: pastebin the command line and output
[22:25:25 CET] <geri> i can set it with -framerate 30  to 30fps?
[22:25:32 CET] <furq> yes
[22:25:57 CET] <geri> but when i watch the conversion, ffmpeg shows fps=81... why is that?
[22:26:11 CET] <furq> that's the encoding speed
[22:26:15 CET] <geri> oh
[22:27:12 CET] <geri> i have 1800 images at 30fps... wouldn't the resulting video be 60 sec?
[22:27:34 CET] <furq> yes
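So a sketch like the following (input pattern hypothetical) yields 1800 / 30 = 60 seconds of video:

    ffmpeg -framerate 30 -i img%04d.png -c:v libx264 -pix_fmt yuv420p out.mp4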
[22:28:18 CET] <cluelessperson> furq,    ffmpeg -video_size 1920x1080 -framerate 2 -f x11grab -i :0.0  output.mp4
[22:28:24 CET] <cluelessperson> stops at frame when this shows up
[22:28:26 CET] <cluelessperson> [x11grab @ 0x14ec360] Stream #0: not enough frames to estimate rate; consider increasing probesize
[22:40:11 CET] <cluelessperson> Still no success
[22:40:12 CET] <cluelessperson> :/
[22:42:27 CET] <cluelessperson> 1.  Record entire desktop,  not working because stops after 1 frame.
[22:42:37 CET] <cluelessperson> 2. overlay timestamp,  not working
[22:42:43 CET] <cluelessperson> 3.  segment,  not working
[22:43:26 CET] <Sashmo> ChocolateArmpits: nah I checked....
[22:46:16 CET] <Sashmo> but why the "start" time being added when I add them together, thats baffling me
[23:52:32 CET] <misterhat> i can't get x11grab to work w/ ffmpeg on either of my debian machines. it works fine using ffplay, but trying to output it into any sort of format hangs 2 or 5 frames in
[23:52:34 CET] <misterhat> ffmpeg version 3.2-2 Copyright (c) 2000-2016 the FFmpeg developers
[23:52:36 CET] <misterhat>   built with gcc 6.2.0 (Debian 6.2.0-10) 20161027
[23:52:49 CET] <misterhat> $ ffmpeg -f x11grab -i :0 -s 800x600 test.webm
[23:53:23 CET] <misterhat> frame=    5 fps=0.5 q=0.0 size=       0kB time=00:00:00.00 bitrate=N/A dup=0 drop=293 speed=   0x
[23:53:28 CET] <misterhat> drop= raises indefinitely
[23:53:38 CET] <c_14> tried without the scale?
[23:53:44 CET] <misterhat> yeah
[23:53:53 CET] <c_14> tried with a different format?
[23:54:06 CET] <misterhat> yeah, flv, mp4, avi
[23:54:10 CET] <misterhat> but ffplay works fine
[23:54:38 CET] <c_14> can you upload the output of `ffmpeg -v debup -f x11grab -i :0 -f null /dev/null' to a pastebin service?
[23:54:45 CET] <jkqxz> "-vsync 0"?
[23:54:46 CET] <c_14> *debug
[23:55:08 CET] <misterhat> where would that go jkqxz
[23:56:33 CET] <misterhat> c_14: https://paste.debian.net/892189/
[23:56:38 CET] <misterhat> those bottom 3 are repeated indefinitely
[23:56:51 CET] <jkqxz> Anywhere, it's a global option.
[23:57:02 CET] <misterhat> this one seems to be working though c_14
[23:57:12 CET] <misterhat> like it's not hanging
[23:57:19 CET] <misterhat> frame=   51 fps= 33 q=-0.0 Lsize=N/A time=00:00:00.16 bitrate=N/A speed=0.109x
[23:57:53 CET] <misterhat> hmm jkqxz yeah it works with webm i think
[23:58:14 CET] <misterhat> let me try it with what i need it for
[23:58:41 CET] <misterhat> ah nevermind it doesn't produce a valid video file with -vsync 0 jkqxz, but it doesn't hang
[23:58:56 CET] <jkqxz> That should stop it dropping, but you might still get the timestamp warnings and funny playback.  You probably want to set -framerate for the input.
[23:59:15 CET] <misterhat> ok
[23:59:54 CET] <misterhat> nah jkqxz it won't play
[23:59:57 CET] <misterhat> i used -framerate 25
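A sketch of jkqxz's suggestion; note that -framerate is an input option here, so it must come before -i (display name and size are placeholders):

    ffmpeg -f x11grab -framerate 25 -video_size 800x600 -i :0 -vsync 0 test.webm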
[00:00:00 CET] --- Fri Nov  4 2016

