[Ffmpeg-devel-irc] ffmpeg.log.20140514

burek burek021 at gmail.com
Thu May 15 02:05:01 CEST 2014

[02:21] <nitero> hi, are there any GUIs for ffmpeg, for linux?  i need to convert .mkv to .avi with a certain resolution. i'm not averse to learning the CLI but i'm not very experienced with it ... i'm reading the manpage for ffmpeg right now
[02:25] <c_14> There are, but I don't know any good ones. I'd just go with the cli if I were you. It's pretty easy to learn and allows you to accomplish rather complex things once you're used to it.
[02:46] <c_14> If you're just trying to convert to something specific, there's usually a wiki page with examples that you can look at.
[02:47] <nitero> okay, that sounds good, i'll look that up, thank you =)
[08:45] <maksimkaaa> Hi, I want to type a timestamp on every image of a set of 1000 frames I have, I am planning to convert the images into mp4 but I need a guide about putting a TimeStamp on every images.. is there any good doc about that?
[08:45] <maksimkaaa> **The timestamp is different for every image.
[08:54] <blippyp> maksimkaaa: do you have a table of the filenames and timestamps?
[09:13] <maksimkaaa> Hi, I want to type a timestamp on every image of a set of 1000 frames I have, I am planning to convert the images into mp4 but I need a guide about putting a different TimeStamp on every image.. is there any good doc about that?
[09:22] <blippyp> maksimkaaa: um yeah -> have you made a table with the filenames and timestamps yet?
[11:17] <Wu> c_14: thanks for the info and the link yesterday, I managed to replace ffmpeg2theora with ffmpeg, same resulting ogv
[11:40] <maksimkaaa> Hi, I have 300 frames ready to generate video using ffmpeg, I know the timestamp of every image and want to print the timestamp of every image over that image while generating the video. how is that possible with ffmpeg?
[14:09] <mrskman> Hi! Is there any way to convert videos to OGV with ffmpeg using multiple threads? I guess libtheora has no multi-thread support, but is there any good alternative?
[14:10] <klaxa|work> if you have multiple videos you can run multiple ffmpeg instances
[14:13] <mrskman> Sure, but I need to convert only single video file
[15:16] <coreb1te> mrskman: ffmpeg has a "-threads" parameter
[15:17] <coreb1te> for example, ffmpeg -i video.avi -threads 4 video.mp4
[15:18] <coreb1te> in this case it uses 4 threads to convert the video and is faster
[15:19] <relaxed> libtheora isn't multithreaded
[15:21] <coreb1te> upps.. sorry, xD
[15:22] <relaxed> I think libx264 now uses all cores by default
[15:24] <coreb1te> relaxed: yes, but I think that needs ffmpeg compiled with the option "--enable-runtime-cpudetect"
[15:26] <relaxed> No, I just tested it.
[15:26] <Mavrik> coreb1te, no it isn't
[15:26] <Mavrik> x264 always does runtime cpudetect
[15:27] <coreb1te> ok, perfect
[15:27] <Mavrik> it's a separate library that does not rely on ffmpeg and thus ffmpeg configuration parameters have no effect on it.
[15:27] <relaxed> libx264 didn't use to be multithreaded by default when used through ffmpeg.
[15:28] <relaxed> you had to pass -threads 0
[15:28] <Mavrik> yes, ffmpeg now passes threads 0 by default.
[15:29] <relaxed> that contradicts your last statement
[15:31] <Mavrik> no.
[15:31] <coreb1te> I'm using ffmpeg 2.2.2 now, and when I convert a video to .mp4 using the libx264 codec it uses all cores by default
[15:31] <Mavrik> you just didn't understand that "configuration parameters" in that context are parameters passed to "./configure", like the "--enable-runtime-cpudetect" mentioned by coreb1te
[15:31] <Mavrik> not parameters passed when running ffmpeg.
[15:31] <relaxed> right
[15:32] <relaxed> coreb1te: it's been the default for some time
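The upshot of the exchange above, as a sketch (filenames are placeholders, not from the log): recent ffmpeg passes -threads 0 to libx264 by default, which lets the encoder pick a thread count from the detected CPUs at runtime, independent of any ./configure flags.

```shell
# Equivalent to the modern default; -threads 0 means "auto-detect":
# ffmpeg -i input.avi -c:v libx264 -threads 0 output.mp4

# The "all cores" auto value roughly corresponds to the detected CPU count:
threads=$(nproc)
echo "libx264 would use about $threads threads"
```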
[16:35] <kriegerod> is there a way to look at an ffplay-ish audio visualizer without sending audio output to any audio device?
[16:35] <kriegerod> i.e. i need either to show visualizer in ffmpeg or to mute ffplay
[16:43] <Sembiance> Guides online say to use 'x264 --help' to see available presets, but there is no 'x264' command on my system.
[16:49] <c_14> kriegerod: ffplay -an ?
[16:49] <Sembiance> figured it out, I was missing 'x264-encoder' package ;)
[17:13] <Sembiance> So, using ffmpeg with x11grab, I am NOT getting a cursor, but I want one :)
[17:13] <Sembiance> Using command `ffmpeg -f x11grab -r 30 -s 1280x720 -i :0.0+1721,262 -vcodec libx264 -preset ultrafast -threads 0 -y test.mp4`  with ffmpeg 2.2.1
[17:14] <sacarasc> -draw_mouse 1
[17:14] <sacarasc> I think.
[17:14] <Sembiance> tried that :)
[17:14] <Sembiance> just tried using a 0,0 window position, worked then
[17:15] <Sembiance> so it appears to have some weird bug capturing the cursor when on my second monitor
[17:21] <Sembiance> hrm, err hrm...
[17:24] <Sembiance> oh wait, so I guess no, I didn't see a cursor after all even with 0x0 window position
[17:24] <Sembiance> hrm....
[17:31] <Sembiance> no, no cursor, no matter what I try. hrm...
[17:36] <Sembiance> well, recordmydesktop won't draw one either and trying --dummy-cursor with it seg faults it.
[17:36] <Sembiance> so maybe my xmonad config somehow isn't playing nicely with cursor recording ;)
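For reference, Sembiance's capture command with sacarasc's suggestion spelled out (a sketch only; -draw_mouse 1 is also the documented x11grab default, so a missing cursor usually points at the X setup rather than at ffmpeg):

```shell
# Command sketch, not run here (needs a live X display at :0.0):
# ffmpeg -f x11grab -draw_mouse 1 -framerate 30 -video_size 1280x720 \
#        -i :0.0+1721,262 -c:v libx264 -preset ultrafast -y test.mp4
```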
[19:29] <mrskman> https://trac.ffmpeg.org/wiki/Seeking%20with%20FFmpeg#Fastandaccurateseeking Is it true that -ss output param is no longer needed when using version >=2.1?
[19:30] <c_14> Where did you hear that?
[19:31] <mrskman> I'm guessing from changelog: - when transcoding with ffmpeg (i.e. not streamcopying), -ss is now accurate
[19:31] <mrskman>   even when used as an input option. Previous behavior can be restored with
[19:31] <mrskman>   the -noaccurate_seek option.
[19:36] <c_14> Hmm, if it's in the changelog it should be true. I also just tested it and it seems to be working.
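Concretely, the changelog entry mrskman quotes means the two forms below now behave as shown (command sketches with placeholder filenames, not run here):

```shell
# Since ffmpeg 2.1, input-side -ss is both fast AND frame-accurate when
# transcoding, so the slow output-side -ss workaround is no longer needed:
# ffmpeg -ss 00:01:30 -i input.mkv -c:v libx264 output.mp4

# The old fast-but-inaccurate behaviour can be restored explicitly:
# ffmpeg -ss 00:01:30 -noaccurate_seek -i input.mkv -c:v libx264 output.mp4
```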
[19:37] <tyler1> is there a way to make an input file's height and width both divisible by 2?
[19:42] <c_14> If you have a given height: scale="trunc(oh*a/2)*2:720" for a given width scale="1280:trunc(ow/a/2)*2"
[19:42] <tyler1> @c_14 the problem is that i don't know the height or width ahead of time
[19:43] <tyler1> if I have 147x89 source,  I need to output it as 148x90
[19:43] <tyler1> (or 148x88) or whatever
[19:44] <c_14> you can use iw and ih as variables which contain the input width and the input height respectively.
[19:45] <tyler1> could you give me an example with that?
[19:50] <c_14> scale="trunc(iw/2)*2:trunc(ih/2)*2" should do it
[19:50] <c_14> If my understanding of trunc is correct, that should round down every time.
[19:55] <c_14> Ye, it'll round every non-even number down to the nearest even number.
[19:58] <tyler1> awesome let me try that, thanks c_14
[20:10] <tyler1> c_14:  i'm getting this error: Unable to find a suitable output format for 'scale="trunc(iw/2)*2:trunc(ih/2)*2"'
[20:10] <tyler1> here's my command: ffmpeg scale="trunc(iw/2)*2:trunc(ih/2)*2" -i temp/monq%05d.png -vcodec libx264 -an -pix_fmt yuv420p -preset fast -y out/monq.gif.mp4
[20:10] <c_14> ffmpeg -vf scale[..]
[20:10] <tyler1> ah
[20:10] <c_14> And after the -i $file
[20:11] <tyler1> after -i but is it required to be before something else? or just after -i
[20:11] <c_14> After -i and before the output file name
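Putting c_14's pieces together, the command tyler1 is after looks like the sketch below (paths taken from his own example). Shell integer division truncates exactly like the filter's trunc(), so the rounding behaviour can be checked directly: every dimension is rounded DOWN to the nearest even number.

```shell
# Command sketch, not run here:
# ffmpeg -i temp/monq%05d.png -vf "scale=trunc(iw/2)*2:trunc(ih/2)*2" \
#        -c:v libx264 -an -pix_fmt yuv420p -preset fast -y out/monq.gif.mp4

# What trunc(x/2)*2 does to odd dimensions:
w=147; h=495
even_w=$(( w / 2 * 2 ))   # 147 -> 146
even_h=$(( h / 2 * 2 ))   # 495 -> 494
echo "$even_w $even_h"
```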
[20:13] <jspencer> ohai...I have a multi-channel audio question
[20:14] <klaxa> go ahead
[20:14] <jspencer> when I create a 6 channel (5.1) audio file (using amerge) channel 6 is always distorted (sounds like a low pass) but when I create a 7 channel (6.1) all channels are clean
[20:15] <jspencer> I assumed the distortion in the 6 channel is because it's the LFE channel
[20:15] <jspencer> but as none of the channels in the 7 channel file have the same distortion I'm not so sure
[20:16] <jspencer> so...what I'd really like is a 6 channel file without any distortion on that 6th channel...if possible
[20:35] <jspencer> my workaround will probably be to use 6.1 with a silent channel, I just don't understand why 5.1 has one distorted channel but 6.1 is clean throughout...
[21:52] <tyler1> hey c_14 still having issues
[21:52] <c_14> What seems to be the problem?
[21:53] <tyler1> I ran :  ffmpeg -i monq%05d.png -vf "crop=((in_w/2)*2):((in_h/2)*2)" -vcodec libx264 -an -pix_fmt yuv420p -preset fast -y ~/www/gifvid/output.mp4
[21:53] <tyler1> and still getting: height not divisible by 2 (500x495)
[21:53] <c_14> you need trunc(in_w/2)*2
[21:54] <c_14> Or in this specific case trunc(in_h/2)*2
[21:54] <tyler1> ah i did crop instead of scale
[21:54] <tyler1> I think its working now
[21:54] <c_14> If you're not truncating, you won't get an even number.
[21:55] <tyler1> this worked: -vf scale="trunc(iw/2)*2:trunc(ih/2)*2"
[21:55] <tyler1> thanks again for your help :)
[21:56] <c_14> np
[21:57] <tyler1> it is working from ffmpeg command line but not with this node package : https://www.npmjs.org/package/fluent-ffmpeg
[21:57] <tyler1> :\
[21:58] <tyler1> [Parsed_scale_0 @ 0x7fddbc800000] [Eval @ 0x7fff565d9440] Unknown function in '"trunc(iw/2)*2'
[22:00] <c_14> I don't know anything about that, and that isn't supported here; but are you using .addOption('-vf','scale="trunc(iw/2)*2:trunc(ih/2)*2"') ?
[22:01] <tyler1> I think so, I have this:  .addOptions(['-vf scale="trunc(iw/2)*2:trunc(ih/2)*2"', '-pix_fmt yuv420p', '-preset fast'])
[22:02] <tyler1> oh maybe I need to separate the -vf, let me try that
[22:04] <tyler1> c_14: this was the full error:
[22:04] <tyler1> [Parsed_scale_0 @ 0x7fea93000000] [Eval @ 0x7fff5f14b440] Unknown function in '"trunc(iw/2)*2'
[22:04] <tyler1> [Parsed_scale_0 @ 0x7fea93000000] [Eval @ 0x7fff5f14b440] Invalid chars '"' at the end of expression 'trunc(ih/2)*2"'
[22:04] <tyler1> Error when evaluating the expression 'trunc(ih/2)*2"'.
[22:04] <tyler1> Maybe the expression for out_w:'"trunc(iw/2)*2' or for out_h:'trunc(ih/2)*2"' is self-referencing.
[22:04] <tyler1> [Parsed_scale_0 @ 0x7fea93000000] Failed to configure output pad on Parsed_scale_0
[22:04] <tyler1> Error opening filters!
[22:05] <c_14> If you have an error that's more than one line long, please use a pastebin so you don't spam the channel.
[22:05] <tyler1> ah okay
[22:05] <klaxa> remove the quotes though
[22:05] <relaxed> it looks like you need .withVideoFilter('scale=\'trunc(iw/2)*2:trunc(ih/2)*2\'')
[22:05] <klaxa> >Invalid chars '"'
[22:05] <relaxed> from a quick glance
[22:07] <relaxed> that's my 2 seconds, 2 cents :)
[22:07] <tyler1> thanks all, ill try the back slashes
[22:08] <tyler1> worked!
[22:08] <tyler1> thanks c_14 klaxa relaxed
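The root cause klaxa points at is worth spelling out: on a shell command line the double quotes around the scale expression are stripped by the shell before ffmpeg ever sees them, but a wrapper like fluent-ffmpeg builds the argument list directly, so quotes typed into the option string reach ffmpeg's expression parser literally, producing the "Invalid chars '\"'" error above. The two strings (illustrative only) differ exactly by those embedded quotes:

```shell
# What tyler1's option string contained vs. what ffmpeg should actually see:
bad='scale="trunc(iw/2)*2:trunc(ih/2)*2"'   # embedded quotes reach the parser
good='scale=trunc(iw/2)*2:trunc(ih/2)*2'    # clean filter expression
case "$bad"  in *\"*) bad_has_quote=yes ;; *) bad_has_quote=no ;; esac
case "$good" in *\"*) good_has_quote=yes ;; *) good_has_quote=no ;; esac
echo "$bad_has_quote $good_has_quote"
```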
[22:18] <aarobc> So I have a subsonic server, and I'm trying to get it to use a new version of ffmpeg. Problem is, the newer version doesn't allow streaming while it's transcoding, and it seems slower too. Any ideas?
[22:38] <tyler1> one other question - how can I set the output fps (using -r) as a multiple of whatever the input is
[22:41] <c_14> setpts=x*PTS
[22:41] <c_14> Wait, no.
[22:43] <c_14> Can I ask why you would want to do that?
[22:46] <tyler1> sure, I'm converting gifs to video, but the output video ends up significantly faster than the gif, and since I can't really read the "fps" of a gif (it doesn't exist), I need to estimate it
[22:47] <c_14> In that case, setpts might be what you're looking for.
[22:47] <c_14> https://trac.ffmpeg.org/wiki/How%20to%20speed%20up%20/%20slow%20down%20a%20video
[22:48] <tyler1> ah, perfect
[22:50] <tyler1> so making the output twice as slow essentially doubles the frames and thus the file size?
[22:52] <c_14> Usually, yes.
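The wiki recipe c_14 links boils down to the sketch below (placeholder filenames): setpts=2.0*PTS doubles every presentation timestamp, i.e. halves the playback speed, and at a fixed output frame rate the encoder then has to emit roughly twice as many frames, which is why the file usually grows accordingly.

```shell
# Command sketch, not run here:
# ffmpeg -i input.gif -vf "setpts=2.0*PTS" -pix_fmt yuv420p output.mp4

# Doubling the duration at a constant output fps doubles the frame count:
frames=300
slowed=$(( frames * 2 ))
echo "$slowed"
```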
[23:53] <jedir0x> so i'm generating a video with ffmpeg and i'm finding that the fps as described by ffprobe -i (and players) is half of what i expect.
[23:53] <jedir0x> vCodecCtx->time_base.num  = 1           vCodecCtx->time_base.den	= 30;
[23:53] <jedir0x> that's how i'm setting it - is there anything else that could be causing it to do this?
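jedir0x's question goes unanswered in the log, but one classic cause of "reported fps is half of what I set" (an assumption here, not something the chat confirms) is AVCodecContext.ticks_per_frame, which is 2 for H.264: tools that derive the frame rate from the codec time base divide by that factor as well.

```shell
# Hypothetical arithmetic for the symptom above (assumes an H.264 encoder,
# where ticks_per_frame is 2): fps derived as den / (num * ticks_per_frame)
# turns an intended 30 fps into 15 fps, i.e. exactly half.
num=1; den=30; ticks_per_frame=2
derived_fps=$(( den / (num * ticks_per_frame) ))
echo "$derived_fps"
```

If that is the cause, the usual fix is to set the codec time base to 1/(fps * ticks_per_frame), or to set the stream's frame rate explicitly rather than relying on the codec time base.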
[00:00] --- Thu May 15 2014

More information about the Ffmpeg-devel-irc mailing list