[Ffmpeg-devel-irc] ffmpeg.log.20161011

burek burek021 at gmail.com
Wed Oct 12 03:05:01 EEST 2016


[01:27:55 CEST] <runawayfive> Looking for some ffmpeg related critique on this fairly in-depth streaming setup, if someone has a second. Wanting to know if there would be an easier way to hack something like this together (less parallel scripts running): http://hastebin.com/ajasofukej.lisp
[01:27:59 CEST] <runawayfive> Excuse the crude diagram.
[02:17:51 CEST] <klaxa> runawayfive: looks like a good idea, have fun implementing it?
[02:20:00 CEST] <Spring> is there no way to get ffplay to use -to rather than -t for seek? using a trim filter to set start/end points is /far/ too slow for previewing portions of videos that start 30+ minutes in
[02:21:08 CEST] <klaxa> you could use mpv and use --start 30:00 or what do you want to do?
[02:21:53 CEST] <klaxa> doesn't look like there is -to for ffplay
[02:22:23 CEST] <Spring> Yeah, I know. Is there any reason for this?
[02:22:46 CEST] <klaxa> don't think so
[02:23:46 CEST] <klaxa> just do -ss start -t duration-end
[02:24:20 CEST] <klaxa> or something... my math might be wrong
[02:24:35 CEST] <SchrodingersScat> close enough
[02:25:18 CEST] <Spring> I don't know the end if I'm just using ffmpeg + ffplay though, and there are a variety of different formats that can be validly entered for the start/end formats
[02:25:45 CEST] <Spring> even Handbrake doesn't check the start/end formats to make sure the latter is greater than the former
[02:25:53 CEST] <SchrodingersScat> I use segments sometimes, to chop up videos in x minute chunks.
[02:26:10 CEST] <klaxa> oh, you do end-start
[02:26:13 CEST] <klaxa> that's the duration
[02:26:15 CEST] <klaxa> duh silly me
[02:26:32 CEST] <klaxa> so: -ss start -t end-start
[02:27:19 CEST] <Spring> yeah but if someone enters seconds like 120 for one format and 03:05 for another it doesn't make it easy to calculate
[02:28:15 CEST] <Spring> -to is a reasonable feature to have in ffplay tbh
[02:30:07 CEST] <Spring> is there a trac ticket about it? Can't find anything
[02:34:55 CEST] <Spring> (I also can't sign up due to spam false positives each time)
[02:36:50 CEST] <SchrodingersScat> today could be the day you realize you're just an AI
[02:43:53 CEST] <Spring> is there any way to speed up ffplay's processing of the trim filter when seeking to parts of videos further in than 20 minutes?
[02:44:22 CEST] <Spring> or will it always be slow due to it processing everything before it
[03:07:04 CEST] <Spring> alternatively is there a way of keeping the original timestamps when using -ss and an end filter together? Thought it might be possible to skip directly to the start section and just use the end filter for a pseudo -to.
[04:34:52 CEST] <Spring> well that was simpler than I expected. Turns out all you need is to add -ss <start> in addition to the vtrim/atrim filter (with its own start/end times) for it to immediately trim begin at the correct section without any timecode inconsistencies
[11:01:43 CEST] <qwertynik> Note: I have a few days ( 3-4 ) of experience with using FFmpeg, and I do not fully understand the commands I mention here ( most of them are from stack overflow forums - I do not have those links bookmarked ).  I wanted to generate a 30-second video, with 16:9 aspect ratio, from a set of images and an audio with blend + zoom and pan effect.   Started this by just generating slide show of images from the video   **Slideshow of imag
[11:02:20 CEST] <qwertynik> Posted the question elaborately here
[11:06:31 CEST] <qwertynik> http://pastebin.com/20JnXkAW
[11:06:53 CEST] <qwertynik> Here is my query. Elaborately explained in this paste
[11:10:42 CEST] <ElAngelo> how can one compile ffescape?
[11:11:31 CEST] <BtbN> Whatever that is, it's not from ffmpeg.
[11:19:32 CEST] <ElAngelo> it is
[11:19:37 CEST] <ElAngelo> it's in the tools source directory
[11:20:00 CEST] <ElAngelo> /Downloads/ffmpeg-3.1.4 $ ls tools | grep ffescape
[11:20:02 CEST] <ElAngelo> ffescape.c
[11:20:16 CEST] <ElAngelo> BtbN: ^^
[11:22:26 CEST] <qwertynik> Is this live? Can anyone see my message?
[11:22:36 CEST] <BtbN> if it's properly set up, it should just be built alongside all the other tools.
[11:31:16 CEST] <ElAngelo> BtbN: it doesn't seem to be setup well
[11:31:25 CEST] <ElAngelo> cause after compiling there is no ffescape binary
[11:31:29 CEST] <ElAngelo> hence my question
[11:45:57 CEST] <ElAngelo> i have a problem with filter graph escaping
[11:46:02 CEST] <ElAngelo> i have the following filter
[11:46:29 CEST] <ElAngelo> movie=c:\opt\videos\Laborie, St. Lucia.mp4:si=0,select=gt(scene,0.4),trim=end_frame=10
[11:46:54 CEST] <ElAngelo> which in accordance to http://ffmpeg.org/ffmpeg-filters.html#Notes-on-filtergraph-escaping
[11:46:59 CEST] <ElAngelo> should become
[11:47:24 CEST] <ElAngelo> movie=c\\:\\\\opt\\\\videos\\\\Laborie\, St. Lucia.mp4\\:si=0,select=gt(scene\,0.4),trim=end_frame=10"
[11:47:28 CEST] <ElAngelo> but this doesn't work
[11:47:54 CEST] <ElAngelo> the 2 backslashes before the :si have to be removed before the command works
[11:48:02 CEST] <ElAngelo> and i don't understand why
[11:48:29 CEST] <ElAngelo> or how this could possibly work with what is documented
[11:56:40 CEST] <Spring> if it's like setting the curves filter it will need single quotes around the path
[12:11:50 CEST] <ElAngelo> Spring: you have an example of that?
[12:13:41 CEST] <BtbN> Why don't you just use /?
[12:16:30 CEST] <Spring> in my case I was adding it via a batch script, so I first converted backslashes to forward slashes, escaped the drive letter colon, then escaped any single quotes, then wrapped the whole path in single quotes. After that I further escaped single quotes with '\\\'' for further escaping reasons.
[12:18:14 CEST] <Spring> so when echo'd it would look like psfile='C\:/Users/Spring/Desktop/test.scv'
[12:19:26 CEST] <Spring> *acv rather. Escaping with ffmpeg + scripts / console can be a bit of trial and error.
[12:26:50 CEST] <shapsuk> hey guys -- I was wondering if its safe to assume I can ALWAYS work with avformat to decode/demux my movies, even when the container format doesn't require demuxing? Unless I know exactly what I need, when would you use avcodec directly?
[12:27:20 CEST] <BtbN> a container that doesn't require demuxing?
[12:28:19 CEST] <shapsuk> oh -- I just assumed some don't
[12:31:51 CEST] <shapsuk> I guess I thought that because I came across some example code that suggested specifying the pixel format, etc... and seemed to skip using avformat altogether
[12:32:10 CEST] <shapsuk> right now I'm following along with http://dranger.com/ffmpeg/tutorial01.html
[12:32:18 CEST] <shapsuk> which is a lot clearer ;)
[12:32:50 CEST] <shapsuk> ok so I'm having a problem would love some clarification on -- seems the tutorial suggests some deprecated API
[12:33:06 CEST] <shapsuk> specifically avpicture_fill
[12:33:22 CEST] <shapsuk> I think I now need to use av_image_fill_arrays ?
[12:33:53 CEST] <shapsuk> I think I understand how to map between these functions however I'm having trouble understanding what this actually does
[12:34:47 CEST] <n4zarh> is there any tutorial about swscale and copying data from avframe to byte array?
[12:35:06 CEST] <shapsuk> the link I suggested above has some stuff around that I think
[12:35:23 CEST] <shapsuk> n4zarh: http://dranger.com/ffmpeg/tutorial01.html
[12:36:39 CEST] <shapsuk> I've calculated the numOfBytes for the image in RGB24, and setup a buffer with av_malloc, so I assume the av_image_fill_arrays is supposed to populate that buffer?
[12:37:54 CEST] <shapsuk> the problem I'm having is understanding why I need to cast from AVFrame to AVPicture? Why not work directly with AVPicture?
[12:38:33 CEST] <shapsuk> as I understand it, isn't AVPicture the old approach and considered deprecated?
[12:39:51 CEST] <n4zarh> what is this tutorial supposed to help me with? I can find no swscale functions there, unless I missed something obvious
[12:40:26 CEST] <n4zarh> oh, nevermind, found something
[13:07:51 CEST] <shapsuk> n4zarh: yeah theres multiple pages to that tutorial ;) I'm fairly certain I saw what you're looking for -- but I'm really new to this framework
[13:08:05 CEST] <n4zarh> not working for me
[13:08:15 CEST] <n4zarh> it uses AVPicture which is deprecated
[13:08:33 CEST] <shapsuk> damn -- I'm having same problem
[13:08:38 CEST] <shapsuk> I also found this
[13:08:42 CEST] <shapsuk> https://blogs.gentoo.org/lu_zero/2015/10/15/deprecating-avpicture/
[13:08:51 CEST] <shapsuk> talks about avpicture and replacement API
[13:08:54 CEST] <shapsuk> MIGHT be helpeful
[13:32:44 CEST] <n4zarh> I'll try again then. Anyone have been using sws_scale function? I am not sure how to use it and how to parse data from frame to byte array afterwards
[13:33:41 CEST] <nonex86> well, i am using it
[13:34:17 CEST] <nonex86> byte array of what?
[13:34:48 CEST] <n4zarh> http://pastebin.com/htJ4rKWE
[13:35:04 CEST] <n4zarh> byte array of picture/frame data
[13:35:09 CEST] <n4zarh> all pixels, you know
[13:35:12 CEST] <nonex86> oh... source code again :D
[13:35:27 CEST] <n4zarh> fortunately just 4 lines
[13:35:33 CEST] <n4zarh> + for loop
[13:36:04 CEST] <nonex86> can you state please
[13:36:11 CEST] <nonex86> what exact problem do you have? :)
[13:36:35 CEST] <n4zarh> I get some random pixels instead of picture
[13:36:48 CEST] <BtbN> A frame _is_ a byte array. If you ignore all the other flags it carries.
[13:37:04 CEST] <n4zarh> and that means I either can't use sws_scale, I can't get data from result picture right or both
[13:37:09 CEST] <BtbN> The layout of it is defined by the pixel format you used
[13:37:20 CEST] <nonex86> guess you are sure about source yuv frame? isnt it? its correct?
[13:38:21 CEST] <nonex86> that magic number 4 in your context? what is it?
[13:38:29 CEST] <nonex86> why dont you use defines
[13:39:00 CEST] <n4zarh> I am nearly 100% sure that source is in yuv, since my code usually uses some chinese-made function which converts yuv to rgb565
[13:39:29 CEST] <n4zarh> (at least I think it's chinese, 90% of stuff I found related to those goddamn ip cameras is chinese)
[13:40:07 CEST] <nonex86> picture is AVFrame i assume?
[13:40:08 CEST] <n4zarh> 4 - well, I thought it will be needed since we have 4 bytes per single pixel (ARGB)
[13:40:15 CEST] <n4zarh> yup, both pictures are
[13:40:32 CEST] <nonex86> no, 4 is some of SWS constant
[13:40:49 CEST] <nonex86> and not related to pixel format
[13:40:57 CEST] <nonex86> the pixel format itself already implies the size
[13:41:16 CEST] <n4zarh> oh, that
[13:41:52 CEST] <nonex86> #define SWS_BICUBIC           4
[13:41:55 CEST] <nonex86> filtering mode
[13:42:00 CEST] <n4zarh> SWS_BICUBIC or something like that, changed to 4 because I had some bugs with compile (not caused by that but I forgot to change it)
[13:42:02 CEST] <nonex86> well, thats nice
[13:42:10 CEST] <nonex86> you provide some parameters
[13:42:16 CEST] <nonex86> but dont know what are they :D
[13:42:33 CEST] <n4zarh> completely. :D
[13:42:37 CEST] <nonex86> ok, initialization call looks correct
[13:42:57 CEST] <nonex86> hope input data is in YUV420p
[13:43:02 CEST] <nonex86> isnt it?
[13:43:08 CEST] <nonex86> why dont you use
[13:43:11 CEST] <nonex86> something like
[13:43:16 CEST] <nonex86> frame->pix_fmt
[13:43:19 CEST] <nonex86> for example?
[13:43:26 CEST] <n4zarh> ...oh.
[13:43:27 CEST] <nonex86> instead to hardcode pixel format?
[13:43:42 CEST] <nonex86> this will protect you from mistakes
[13:43:53 CEST] <nonex86> check the frame structure
[13:43:59 CEST] <nonex86> i am sure it has a field
[13:44:03 CEST] <nonex86> with pixel format :)
[13:45:23 CEST] <nonex86> well, if i help you this time guess you will owe me a bottle of good beer :D
[13:46:20 CEST] <n4zarh> format is 0, and since NONE is -1 and next one is YUV420P
[13:46:24 CEST] <n4zarh> I guess this is correct
[13:47:31 CEST] <nonex86> back to your code
[13:47:42 CEST] <nonex86> not related to the problems you have
[13:47:43 CEST] <nonex86> but
[13:47:49 CEST] <nonex86> you really need alpha channel?
[13:47:58 CEST] <nonex86> why do you used ARGB?
[13:48:13 CEST] <nonex86> why not AV_PIX_FMT_BGR24 for example?
[13:48:41 CEST] <n4zarh> I found out that I can use ARGB_8888, ARGB_4444 or RGB565 to init bitmap on android
[14:06:37 CEST] <mvardan> Hi guys! I am trying to restream udp stream to rtsp server.
[14:06:37 CEST] <mvardan>  ffmpeg -i udp://10.116.126.206:8080 -vcodec copy -an -f rtsp rtsp://10.116.126.206:80/live/str
[14:06:37 CEST] <mvardan> is not working
[14:08:45 CEST] <mvardan> https://drive.google.com/open?id=0B2xidSTkSMV8TG5xc21KeXI5dnM
[14:09:43 CEST] <mvardan> or there is a lot of aac errors
[14:10:36 CEST] <shapsuk> can anyone tell me how I can tell if av_image_fill_arrays as successful or not?
[14:11:11 CEST] <shapsuk> actually scrap that
[14:12:38 CEST] <shapsuk> I can see the return code is 6220800 -- but I don't know what that refers to. Unsure what my issue is.
[14:13:20 CEST] <shapsuk> hmm.. actually this seems to be the value from numBytes
[14:14:07 CEST] <shapsuk> ahh I see -- it returns the number of bytes required
[14:14:23 CEST] <shapsuk> and negative values on error
[14:14:27 CEST] <shapsuk> ignore me
[14:14:55 CEST] <DHE> this stuff is documented. either in the headers or doxygen can make pretty HTML
[14:24:48 CEST] <shapsuk> DHE: that's how I worked it out ;)
[14:24:53 CEST] <shapsuk> sorry -- I should've checked that first
[14:25:03 CEST] <shapsuk> I just assumed it was status only
[14:25:04 CEST] <shapsuk> my bad
[14:57:54 CEST] <tommy-boy1112> Hello, I'm trying to use ffmpeg for HLS live streaming. I'm using ssegment segmentation. Today I've tried adding -copyts option to keep original timestamps, but the problem is that, when I add this option -segment_time option is ignored and stream is cut at every key frame.
[14:58:01 CEST] <tommy-boy1112> Is this normal?
[14:58:21 CEST] <tommy-boy1112> I've set -segment_time 5, but now I get 0.4 second segments.
[14:58:27 CEST] <tommy-boy1112> (using ffmpeg 3.0.1)
[15:12:50 CEST] <BtbN> tommy-boy1112, why not use the hls muxer?
[15:13:01 CEST] <BtbN> Also, you should update your ffmpeg.
[15:13:32 CEST] <tommy-boy1112> BtbN: last time I've tried using HLS muxer it had problems with copying mpegts metadata.
[15:13:53 CEST] <BtbN> did you report that? did you try on the latest version?
[15:14:32 CEST] <tommy-boy1112> okay, will try newest version soon. Compiling as we speak.
[15:38:28 CEST] <pomaranc> is there any way to use hls from libavformat to get mpegts data (=not demuxed) instead of encoded frames?
[15:41:26 CEST] <DHE> not directly, but reproducing your own wouldn't be difficult.
[15:42:32 CEST] <shapsuk> hey guys -- when a function returns an int with negative values I understand these represent errors. Is there some enum or a list of constants representing these that I can refer to?
[15:45:21 CEST] <cuba_> did anyone get drawtext filter running on windows?
[15:46:27 CEST] <DHE> shapsuk: there's a bunch in libavutil/error.h for huge numbers, otherwise they're just negated errno values
[15:46:55 CEST] <nonex86> shapsuk: use av_strerror or enable debug in ffmpeg
[15:47:28 CEST] <shapsuk> I've enabled av_log_set_level(AV_LOG_VERBOSE)
[15:47:43 CEST] <nonex86> so you should get your error message in log
[15:48:33 CEST] <nonex86> and about the constants, check AVERROR_* definitions
[15:48:46 CEST] <shapsuk> got it
[15:48:47 CEST] <tommy-boy1112> BtbN: newest ffmpeg version works the same, as with older. When copyts is set, cut happens at every key frame.
[15:48:47 CEST] <shapsuk> thanks guys
[15:49:12 CEST] <shapsuk> I wasn't seeing any errors cause there aren't any lol -- but I wanted to know how to deal with them when they do occur ;)
[15:49:17 CEST] <shapsuk> anyway -- thanks again
[15:49:37 CEST] <furq> cuba_: yes
[15:49:50 CEST] <cuba_> do you have the cmdline anywhere furq?
[15:50:07 CEST] <furq> maybe?
[15:50:08 CEST] <shapsuk> another thing I've been wondering about. Does anyone here know about VideoToolbox? Specifically, if I'm using avformat to demux/decode, will it automatically use it where appropriate/supported?
[15:50:09 CEST] <furq> what isn't working
[15:50:22 CEST] <shapsuk> as far as I know its not supported at all for decoding?
[15:50:30 CEST] <furq> the only trick i remember is that i ended up putting a font file in the working directory
[15:50:37 CEST] <cuba_> I want to get drawtext working with localtime
[15:50:38 CEST] <furq> otherwise it was some kind of escaping nightmare
[15:51:03 CEST] <cuba_> ah wow maybe thats it
[15:52:05 CEST] <cuba_> lol yes
[15:52:09 CEST] <cuba_> for fuck sake
[15:52:11 CEST] <cuba_> thanks furq!
[15:59:09 CEST] <cuba_> milliseconds doesnt seem to be supported for localtime
[15:59:15 CEST] <cuba_> at least %f
[16:21:16 CEST] <mvardan> I am trying to save stream from UDP port using
[16:21:16 CEST] <mvardan>  ffmpeg -i udp://10.116.126.206:8080 -c copy -f flv out.flv
[16:21:16 CEST] <mvardan> But I see a lot of debug log and it not work (https://drive.google.com/open?id=0B2xidSTkSMV8TG5xc21KeXI5dnM )
[16:21:16 CEST] <mvardan> the UDP is streamed by (I can't change that)
[16:21:16 CEST] <mvardan>  gst-launch-1.0 v4l2src device=/dev/video0 ! video/x-raw,width=800,height=600 ! x264enc ! rtph264pay ! udpsink host=10.116.126.206 port=8080
[16:21:17 CEST] <mvardan> Can someone help?
[16:59:06 CEST] <shapsuk> hey guys quick question
[16:59:17 CEST] <shapsuk> numBytes=av_image_get_buffer_size(AV_PIX_FMT_RGB24, pCodecCtx->width, pCodecCtx->height, 32);
[16:59:17 CEST] <shapsuk> is 32 correct here for the align value?
[16:59:50 CEST] <shapsuk> I'm converting from avpicture_get_size() which didn't previously have this argument
[17:01:46 CEST] <nonex86> you are converting from deprecated call?
[17:02:13 CEST] <shapsuk> as in, I found some sample code
[17:02:18 CEST] <shapsuk> but it was using the old method
[17:02:41 CEST] <shapsuk> so I've refactored the code to use this new one, however I'm not sure if the align value should be 1, 16, 32, or something else
[17:02:44 CEST] <nonex86> open ffmpeg source code
[17:02:49 CEST] <nonex86> find avpicture_get_size
[17:02:59 CEST] <nonex86> find the way it works now
[17:03:00 CEST] <shapsuk> lol -- fair call :)
[17:03:51 CEST] <nonex86> FYI
[17:03:52 CEST] <nonex86> int avpicture_get_size(enum AVPixelFormat pix_fmt, int width, int height)
[17:03:53 CEST] <nonex86> {
[17:03:53 CEST] <nonex86>     return av_image_get_buffer_size(pix_fmt, width, height, 1);
[17:03:53 CEST] <nonex86> }
[17:03:59 CEST] <nonex86> simple, isnt it?
[17:04:54 CEST] <shapsuk> yeah found it
[17:05:41 CEST] <shapsuk> nonex86: haha
[17:06:02 CEST] <nonex86> every deprecated call usually maps to something new
[17:06:21 CEST] <shapsuk> yeah tbh that's how I worked out av_image_fill_arrays()
[17:06:29 CEST] <shapsuk> I should've done the same here
[17:06:30 CEST] <shapsuk> :)
[17:21:03 CEST] <shapsuk> anyone know whats wrong with this: sprintf(szFilename, "%s/frame%d.ppm", path, iFrame);
[17:21:03 CEST] <shapsuk> where path is a const char*
[17:21:20 CEST] <shapsuk> I'm not really a C dev, but I get a SIGABRT
[17:21:38 CEST] <shapsuk> I can print the value -- its not a bad pointer
[17:24:42 CEST] <nonex86> why do you think something wrong with this?
[17:25:01 CEST] <jkqxz> Maybe you want 'snprintf("%s/frame%d.ppm", szFilename, path, iFrame);'?
[17:25:48 CEST] <shapsuk> nonex86: actually I thought it was fine
[17:25:57 CEST] <shapsuk> but as I said, I'm getting SIGABRT
[17:26:37 CEST] <nonex86> are you sure the problem in this line?
[17:27:03 CEST] <nonex86> is szFilename buffer enough to put formatted data?
[17:27:26 CEST] <shapsuk> ahh -- good point probably not
[17:27:27 CEST] <jkqxz> Oops, szFilename sounded like a size.  Ignore what I said.
[17:27:45 CEST] <shapsuk> char szFilename[32];
[17:27:54 CEST] <shapsuk> again, was from the sample which only took a filename
[17:27:54 CEST] <nonex86> and the path size is?
[17:27:58 CEST] <shapsuk> I'm passing an entire path
[17:28:03 CEST] <shapsuk> much longer ;)
[17:28:06 CEST] <shapsuk> I'll adjust
[17:28:07 CEST] <shapsuk> cheers
[17:28:07 CEST] <nonex86> and you overrun the buffer
[17:28:08 CEST] <nonex86> so
[17:28:10 CEST] <shapsuk> yep
[17:28:11 CEST] <shapsuk> exactly
[17:28:13 CEST] <nonex86> what did you expect
[17:28:16 CEST] <shapsuk> lol
[17:28:31 CEST] <shapsuk> like I said -- was from sample -- I'm modifying as I go and C isn't my language
[17:28:36 CEST] <shapsuk> I'm learning :) fast!
[17:28:36 CEST] <maziar> how can i change mp3 quality with ffmpeg ?
[17:28:57 CEST] <nonex86> doubt using ffmpeg
[17:29:02 CEST] <nonex86> is best way to learn c
[17:30:25 CEST] <nonex86> maziar: reencode it with another qulity? :)
[17:30:30 CEST] <nonex86> *quality
[17:30:52 CEST] <maziar> nonex86 yes
[17:31:14 CEST] <jkqxz> In any case, snprintf() is the function you should be using there, not sprintf() - 'snprintf(szFilename, sizeof(szFilename), "%s/frame%d.ppm", path, iFrame);'.  (By looking at the return value you will be able to tell that it was too long.)
[17:32:10 CEST] <nonex86> maziar: then why do you ask the question the answer to which you already know - reencode it :)
[17:32:22 CEST] <furq> why would you call a char array "sz"
[17:32:28 CEST] <maziar> nonex86 i dont know how to
[17:32:35 CEST] <shapsuk> jkqxz: is there a recommended size to use for file paths?
[17:32:45 CEST] <furq> PATH_MAX
[17:32:50 CEST] <nonex86> MAX_PATH
[17:32:51 CEST] <shapsuk> ahh perfect thanks ;)
[17:33:29 CEST] <nonex86> sz means string terminated with zero, afair
[17:33:41 CEST] <nonex86> anyway, who cares ? :)
[17:33:59 CEST] <nonex86> it's Hungarian notation
[17:36:07 CEST] <shapsuk> ah right thanks
[17:37:21 CEST] <shapsuk> ok great -- that all worked! I got a 10sec .mov file to decode to PPM files on disk.
[17:37:21 CEST] <shapsuk> took a while -- would that likely be the disk overhead or the actual decoding?
[17:37:46 CEST] <shapsuk> think I might write some performance checks around it, I want to understand where my bottlenecks reside
[17:37:47 CEST] <maziar> does any one know how can i decrease a MP3 file size ?
[17:38:44 CEST] <nonex86> use a profiler instead
[17:39:19 CEST] <shapsuk> that's actually what I meant -- I'm an Cocoa/iOS dev, so I'm using Xcode -- should be able to use the profilers in there
[17:44:13 CEST] <trfl> maziar, you get lower quality every time you convert to mp3, so you should not convert mp3 to mp3. With that warning out of the way: ffmpeg -i source.mp3 -acodec libmp3lame -q:a 5 small.mp3
[17:44:37 CEST] <trfl> for higher quality and a higher file size, replace 5 with a lower number
[17:45:42 CEST] <trfl> does anyone know why vaapi decoding is not implemented in ffmpeg? when I tried it in gstreamer there was a 4x speed boost when decoding H.265
[17:51:42 CEST] <shapsuk> just out of curi
[17:51:47 CEST] <shapsuk> oops -- sorry
[17:54:23 CEST] <shapsuk> is there a way to determine whether or not the decoding is using VideoToolbox on iOS?
[17:59:35 CEST] <shapsuk> cause my CPU is hitting 100%
[17:59:47 CEST] <shapsuk> and the file is an mp4
[18:00:02 CEST] <shapsuk> takes around 30 seconds to decode the entire 2min file
[18:00:28 CEST] <shapsuk> its 1080p, 7979 kb/s bitrate
[18:01:09 CEST] <DHE> there's usually a bit in the output that says stream 0:0 -> 0:0 (codecinfo)
[18:01:46 CEST] <shapsuk> Stream #0:0(und): Video: h264, 3 reference frames (avc1 / 0x31637661), yuv420p, 1920x816 [SAR 1:1 DAR 40:17], 7979 kb/s, 23.98 fps, 23.98 tbr, 2500k tbn, 47.95 tbc (default)
[18:02:38 CEST] <DHE> nope, further down
[18:02:54 CEST] <shapsuk> hmm
[18:03:10 CEST] <shapsuk> that's the only output I'm logging
[18:03:17 CEST] <shapsuk> using av_dump_format() I think
[18:03:23 CEST] <shapsuk> what are you referring to?
[18:06:03 CEST] <shapsuk> do I need to do something myself to use VideoToolbox for the decoding?
[18:06:10 CEST] <shapsuk> or is it detected automatically
[18:06:15 CEST] <shapsuk> ?
[18:06:28 CEST] <jkqxz> trfl:  vaapi decoding is implemented in ffmpeg, and has been for ages.
[18:08:13 CEST] <shapsuk> just a thought -- in the build script I used to build ffmpeg it makes no mention of VideoToolbox -- but I do have the header/source in my project
[18:08:42 CEST] <shapsuk> should I have added a flag as a part of the build to enable this or is this handled automatically?
[18:08:56 CEST] <shapsuk> sorry if that's a stupid question -- just trying to understand
[18:09:51 CEST] <furq> it should be autodetected
[18:09:57 CEST] <shapsuk> ok
[18:10:00 CEST] <shapsuk> I thought so
[18:10:01 CEST] <DHE> shapsuk: if you're using ffmpeg itself, there's a quick table of stream mappings shown just above the main progress line
[18:10:09 CEST] <furq> but it'll just build without it if you've got a header in the wrong place or something
[18:10:12 CEST] <shapsuk> DHE -- but I'm not ;)
[18:10:18 CEST] <shapsuk> its an iOS app
[18:10:23 CEST] <furq> after you run ./configure it'll print a list of enabled features
[18:10:31 CEST] <furq> you probably want to doublecheck that videotoolbox is listed in there
[18:10:50 CEST] <shapsuk> when I try and do a color conversion on the mp4 I get this
[18:10:51 CEST] <shapsuk> [swscaler @ 0x10347c000] No accelerated colorspace conversion found from yuv420p to rgb24.
[18:11:08 CEST] <shapsuk> and if I remove the color-conversion code I don't get any output related to this at all
[18:11:13 CEST] <shapsuk> not sure if that indicates anything
[18:11:19 CEST] <shapsuk> cpu is still 100% during decoding
[18:11:20 CEST] <shapsuk> lol
[18:19:03 CEST] <shapsuk> would it be safe to assume videotoolbox is compiled in if I have the headers and source after building?
[18:30:24 CEST] <shapsuk> so I decided to try and rebuild in case -- and added the config flag
[18:30:35 CEST] <shapsuk> and I'm getting this which is a good sign now
[18:30:35 CEST] <shapsuk> Enabled hwaccels:
[18:30:36 CEST] <shapsuk> h263_videotoolbox	  h264_videotoolbox	    mpeg1_videotoolbox	      mpeg2_videotoolbox	mpeg4_videotoolbox
[18:30:49 CEST] <shapsuk> so once that's built, I'll try again
[18:49:06 CEST] <trfl> jkqxz: really? that's great news, but this table is suggesting otherwise: https://trac.ffmpeg.org/wiki/HWAccelIntro#FFmpegimplementations
[18:51:36 CEST] <jkqxz> It's telling you that there is no standalone decoder; you use the hwaccel on the normal decoder.
[18:59:05 CEST] <trfl> oh, that's neat! I'll build a vaapi-enabled ffmpeg and try it out, thanks :)
[19:51:34 CEST] <shapsuk> any chance someone could help with this: http://pasteboard.co/dKZOGts3H.x-portable-pixmap
[19:52:02 CEST] <shapsuk> I'm using AV_PIX_FMT_RGB24
[19:52:31 CEST] <shapsuk> and outputting to PPM
[19:52:31 CEST] <shapsuk> following along with a tutorial
[19:52:35 CEST] <shapsuk> but as you can see, the output isn't right
[19:52:49 CEST] <shapsuk> the colours are ok I think, but not sure what's going on with the rest of the image
[19:52:56 CEST] <shapsuk> any advice would be apprecited
[19:54:33 CEST] <shapsuk> perhaps I should try a different image format, would be happy to use JPG, but unsure how to go about that
[20:21:47 CEST] <Sashmo> Anyone know how to ignore this error? [eac3 @ 0x7ffe1f003a00] incomplete frame
[20:22:09 CEST] <Sashmo> once I get it, then every frame after that has errors i.e. [eac3 @ 0x7ffe1f003a00] frame sync error
[20:26:32 CEST] <shapsuk> hmm.. I'm trying to use av_videotoolbox_alloc_context() but the function doesn't appear to be recognised... but I have VideoToolbox.h in my project
[20:38:55 CEST] <pgorley> shapsuk: what's the error message?
[20:39:09 CEST] <shapsuk> no error message unfortunately
[20:39:20 CEST] <pgorley> not even at compile or link time?
[20:42:46 CEST] <shapsuk> no
[20:42:52 CEST] <shapsuk> oh sorry
[20:42:59 CEST] <shapsuk> thought you were asking about the output issue
[20:43:15 CEST] <shapsuk> the issue with av_videotoolbox_alloc_context() is that it doesn't exist
[20:44:42 CEST] <shapsuk> the declaration of function 'av_videotoolbox_alloc_context' is invalid in C99
[20:45:21 CEST] <shapsuk> and AVVideotoolboxContext says undeclared identifier
[20:45:27 CEST] <shapsuk> so -- either these are private to the lib
[20:45:37 CEST] <shapsuk> or disabled through #define somewhere
[20:46:11 CEST] <shapsuk> but I built the lib with the --enable-videotoolbox flag
[20:46:18 CEST] <shapsuk> so I would expect that I'm able to use this?
[20:46:24 CEST] <shapsuk> pgorley: ?
[20:46:31 CEST] <shapsuk> any help appreciated ;)
[20:50:34 CEST] <pgorley> i've been able to compile ffmpeg with vt support, in fact, i got it to work with h264 streams
[20:50:50 CEST] <pgorley> what version of ffmpeg are you using?
[20:59:40 CEST] <pgorley> shapsuk: vt was introduced in commit 11d923d4, which is in 2.8+
[21:00:26 CEST] <shapsuk> I'm using 3.x
[21:00:47 CEST] <shapsuk> umm 3.0 i think
[21:00:54 CEST] <pgorley> that's fine then
[21:01:16 CEST] <pgorley> are you compiling ffmpeg yourself? or using a precompiled version?
[21:01:22 CEST] <shapsuk> compiling
[21:01:42 CEST] <pgorley> what's your configure line? could you send me a pastebinb link?
[21:01:47 CEST] <pgorley> *pastebin
[21:02:34 CEST] <shapsuk> sure hang on
[21:03:24 CEST] <shapsuk> this is the entire script
[21:03:25 CEST] <shapsuk> http://pastebin.com/MnvR1YSD
[21:03:37 CEST] <shapsuk> got it from a repo on github
[21:03:42 CEST] <shapsuk> modified slightly
[21:03:54 CEST] <shapsuk> to include video toolbox and removed ARCHs I don't need
[21:04:29 CEST] <JEEB> &25
[21:07:09 CEST] <NapoleonWils0n> hi all
[21:07:25 CEST] <pgorley> shapsuk: try adding --enable-hwaccel options
[21:07:35 CEST] <NapoleonWils0n> im getting this error recording a stream and copy the audio and video codecs into an mkv
[21:07:38 CEST] <NapoleonWils0n> AMF_END_OF_OBJECT in AMF_DATA_TYPE_MIXEDARRAY
[21:07:50 CEST] <pgorley> do a ./configure --list-hwaccels to get a list of all of them
[21:07:50 CEST] <shapsuk> ok give me a minute -- will also remove the existing source so that it grabs 3.1.1 or wateva is the current latest
[21:08:05 CEST] <shapsuk> ok cool
[21:08:06 CEST] <shapsuk> one sec
[21:08:07 CEST] <NapoleonWils0n> anyone had this error and got any tips on how to resolve it
[21:08:10 CEST] <pgorley> i use 3.1.3
[21:08:29 CEST] <shapsuk> right
[21:11:30 CEST] <shapsuk> --enable-hwaccel  returns unknown option
[21:11:52 CEST] <shapsuk> oh wait its missing an 's' at the end right?
[21:11:53 CEST] <pgorley> it's --enable-hwaccel=h264_videotoolbox, for example
[21:11:58 CEST] <shapsuk> ahh ok
[21:12:05 CEST] <shapsuk> thing is
[21:12:09 CEST] <shapsuk> when I ran it earlier
[21:12:21 CEST] <pgorley> like i said, do a configure --list-hwaccels to see which ones are available
[21:12:24 CEST] <shapsuk> I noticed while it was running that all of those hwaccels were added
[21:12:33 CEST] <shapsuk> all the videotoolbox ones
[21:12:45 CEST] <shapsuk> I can try again anyway with the latest version though
[21:13:28 CEST] <pgorley> in configure's output you have all those xxx_videotoolbox hwaccels under the enabled hwaccels section?
[21:13:55 CEST] <shapsuk> yes
[21:14:44 CEST] <shapsuk> http://pastebin.com/t0s4923C
[21:14:45 CEST] <pgorley> and you still get an undefined reference to av_videotoolbox_alloc_context?
[21:14:52 CEST] <shapsuk> I did yes
[21:15:02 CEST] <shapsuk> but I will try and rebuild with 3.x now
[21:15:04 CEST] <shapsuk> and see
[21:15:53 CEST] <shapsuk> I'll also add --enable-hwaccel=h264_videotoolbox --enable-hwaccel=h263_videotoolbox --enable-hwaccel=mpeg1_videotoolbox --enable-hwaccel=mpeg4_videotoolbox
[21:16:40 CEST] <pgorley> ah, i see why i needed those switches, we have a --disable-everything at the top of our makefile for ffmpeg
[21:17:18 CEST] <shapsuk> ah ok
[21:17:22 CEST] <JEEB> that is a general thing some people tend to do. and then they get all kinds of issues when they're missing various stuff :)
[21:17:26 CEST] <shapsuk> so for me, its not relevant lol
[21:18:10 CEST] <pgorley> nope lol
[21:18:12 CEST] <shapsuk> ok its building now
[21:18:23 CEST] <shapsuk> I've just kept --enable-videotoolbox
[21:18:37 CEST] <pgorley> should be autodetected though, looking at the configure script
[21:18:59 CEST] <shapsuk> yeah and so now I get this during the build output:
[21:19:05 CEST] <shapsuk> Enabled hwaccels: h263_videotoolbox h264_videotoolbox mpeg1_videotoolbox mpeg2_videotoolbox mpeg4_videotoolbox
[21:19:09 CEST] <shapsuk> which is right
[21:19:16 CEST] <shapsuk> that's all the video toolbox ones
[21:19:27 CEST] <shapsuk> but no different to before yet
[21:19:29 CEST] <pgorley> that's good, how about when you build your application
[21:19:46 CEST] <shapsuk> huh?
[21:20:09 CEST] <pgorley> you're just compiling ffmpeg, or you want to use it with some of your code?
[21:20:20 CEST] <shapsuk> I want to use it yes
[21:20:42 CEST] <shapsuk> so basically I'm compiling into .h/.a
[21:20:47 CEST] <shapsuk> then added it to my Xcode project
[21:20:54 CEST] <pgorley> so you have some piece of code that calls av_videotoolbox_alloc_context?
[21:21:08 CEST] <shapsuk> I then have a .c source file where I have a function
[21:21:21 CEST] <shapsuk> which is currently able to decode a file
[21:21:30 CEST] <shapsuk> the output is weird but that's another issue
[21:21:36 CEST] <shapsuk> it does read the frames
[21:21:44 CEST] <shapsuk> but its using a LOT of CPU
[21:21:51 CEST] <shapsuk> and its an mp4 file
[21:22:00 CEST] <shapsuk> so I figured its not using the hw
[21:22:06 CEST] <shapsuk> I'm running on an iPhone 7 Plus
[21:22:12 CEST] <pgorley> OH
[21:22:20 CEST] <shapsuk> so I wouldn't expect the CPU to be full on
[21:22:20 CEST] <pgorley> videotoolbox is iOS 8+
[21:22:21 CEST] <shapsuk> right?
[21:22:27 CEST] <shapsuk> yes, running iOS 10
[21:22:32 CEST] <shapsuk> I knew that
[21:22:40 CEST] <pgorley> oh, misread that as iOS 7 lol
[21:22:42 CEST] <pgorley> sorry
[21:22:44 CEST] <shapsuk> lol
[21:22:48 CEST] <shapsuk> nw
[21:23:33 CEST] <pgorley> does ffmpeg build ok? ie: is it only when building your project that it doesn't work?
[21:23:58 CEST] <shapsuk> yeah the libs build fine
[21:24:07 CEST] <shapsuk> and even accessing the decoders seems fine
[21:24:40 CEST] <shapsuk> it even generates the videotoolbox.h
[21:24:45 CEST] <shapsuk> but no associated .a file
[21:24:50 CEST] <shapsuk> however
[21:25:08 CEST] <shapsuk> either way its not even seeing the function
[21:25:17 CEST] <shapsuk> hmmm.. just had a thought
[21:25:19 CEST] <shapsuk> one sec
[21:25:50 CEST] <furq> does it show up in `nm -g libavcodec.a`
[21:26:01 CEST] <shapsuk> shit!
[21:26:02 CEST] <shapsuk> sorry
[21:26:08 CEST] <shapsuk> I'm a complete idiot!
[21:26:16 CEST] <pgorley> haha what is it?
[21:26:18 CEST] <furq> were you using the wrong libraries
[21:26:22 CEST] <shapsuk> what would be the most obvious and stupid thing I could forget??
[21:26:27 CEST] <shapsuk> furq: no
[21:26:28 CEST] <shapsuk> lol
[21:26:30 CEST] <pgorley> the linker flags?
[21:26:34 CEST] <shapsuk> no
[21:26:36 CEST] <shapsuk> simpler!
[21:26:43 CEST] <pgorley> #include
[21:26:45 CEST] <shapsuk> #include "videotoolbox.h"
[21:26:45 CEST] <pgorley> ?
[21:26:46 CEST] <furq> #include "videotoolbox.h"
[21:26:47 CEST] <shapsuk> yep!
[21:26:49 CEST] <pgorley> haha
[21:26:51 CEST] <shapsuk> unbelievable !
[21:26:56 CEST] <shapsuk> I feel ridiculous!
[21:26:57 CEST] <furq> nice
[21:27:04 CEST] <pgorley> happens to the best of us ;)
[21:27:08 CEST] <furq> we've all done it
[21:27:11 CEST] <shapsuk> I thought I'd checked that
[21:27:17 CEST] <shapsuk> well -- at least now I've upgraded haha
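[Editor's note: for later readers, the "undefined reference" above was the classic missing-header case. A minimal sketch of the fix, assuming a standard installed-FFmpeg include layout (an in-tree build would use the quoted, source-relative path the chat shows instead):

```c
/* Without this header the compiler only sees an implicit declaration of
 * av_videotoolbox_alloc_context(), so the call cannot resolve properly.
 * The <libavcodec/...> path assumes a normal `make install` layout. */
#include <libavcodec/avcodec.h>
#include <libavcodec/videotoolbox.h>  /* declares av_videotoolbox_alloc_context() */

/* Later, after avcodec_alloc_context3(), one can allocate the
 * VideoToolbox-specific context the linker was complaining about: */
static AVVideotoolboxContext *alloc_vt_ctx(void)
{
    return av_videotoolbox_alloc_context();
}
```

This is a sketch against the external libavcodec headers, not a standalone program; it only compiles where FFmpeg development files are installed.]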
[21:28:01 CEST] <shapsuk> ok well have you had experience using the video toolbox stuff anyway?
[21:28:07 CEST] <shapsuk> I have some questions I'd love answers to
[21:28:13 CEST] <shapsuk> just to be sure I'm on the right track
[21:28:19 CEST] <shapsuk> especially now I can actually compile ;)
[21:28:44 CEST] <pgorley> i've only got it working with h264 though
[21:28:58 CEST] <pgorley> still need to get it working for h263 and mpeg4 in my case
[21:29:04 CEST] <pgorley> keep getting operation not permitted
[21:29:15 CEST] <shapsuk> well for now I'm just interested in H264
[21:29:23 CEST] <shapsuk> so that would be a great help :)
[21:29:32 CEST] <shapsuk> my initial question is -- currently I'm using AVCodecContext
[21:29:48 CEST] <shapsuk> and I was wondering if I just need to swap that out for AVVideoToolboxContext?
[21:30:08 CEST] <shapsuk> and I guess I need to use AV_PIX_FMT_VIDEOTOOLBOX somewhere
[21:30:14 CEST] <pgorley> nah, i still use AVCodecContext
[21:30:18 CEST] <shapsuk> oh right
[21:30:26 CEST] <shapsuk> so how do you get it to actually use VideoToolbox?
[21:30:28 CEST] <pgorley> i've basically followed the ffmpeg_videotoolbox.c example
[21:30:37 CEST] <shapsuk> oh I didn't see that
[21:30:45 CEST] <shapsuk> I was looking for an example
[21:30:48 CEST] <shapsuk> awesome
[21:31:10 CEST] <pgorley> you have to return AV_PIX_FMT_VIDEOTOOLBOX on AVCodecContext->get_format
[21:31:51 CEST] <shapsuk> https://github.com/FFmpeg/FFmpeg/blob/master/ffmpeg_videotoolbox.c
[21:31:55 CEST] <shapsuk> is that right?
[21:31:55 CEST] <pgorley> yep
[21:31:57 CEST] <shapsuk> cool
[21:32:15 CEST] <shapsuk> AVCodecContext->get_format -- I read that somewhere but didn't quite understand
[21:32:18 CEST] <shapsuk> ok will compare my code to this example
[21:32:20 CEST] <shapsuk> and see what I can do
[21:32:23 CEST] <shapsuk> thanks heaps
[21:32:30 CEST] <shapsuk> sorry for wasting your time earlier -- ugh!
[21:32:34 CEST] <pgorley> there's a bunch of stuff you can ignore in that file, depending on what you want to do
[21:32:39 CEST] <pgorley> haha, no worries
[21:32:56 CEST] <pgorley> i was answering while my code was compiling ;)
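[Editor's note: a minimal sketch of the approach pgorley describes, modeled loosely on FFmpeg's ffmpeg_videotoolbox.c; the function names below are illustrative, not taken from that file:

```c
#include <libavcodec/avcodec.h>
#include <libavcodec/videotoolbox.h>

/* get_format callback: when the decoder offers AV_PIX_FMT_VIDEOTOOLBOX,
 * choose it and attach a default VideoToolbox hwaccel context; otherwise
 * fall back to the first (software) format the decoder proposed. */
static enum AVPixelFormat pick_videotoolbox(AVCodecContext *avctx,
                                            const enum AVPixelFormat *fmts)
{
    for (const enum AVPixelFormat *p = fmts; *p != AV_PIX_FMT_NONE; p++) {
        if (*p == AV_PIX_FMT_VIDEOTOOLBOX) {
            /* av_videotoolbox_default_init() returns 0 on success;
             * on failure we fall through to software decoding. */
            if (av_videotoolbox_default_init(avctx) >= 0)
                return AV_PIX_FMT_VIDEOTOOLBOX;
        }
    }
    return fmts[0];  /* software fallback */
}

/* Wire the callback up before avcodec_open2(): */
void enable_videotoolbox(AVCodecContext *avctx)
{
    avctx->get_format = pick_videotoolbox;
}
```

With this in place, decoded frames whose format is AV_PIX_FMT_VIDEOTOOLBOX carry a CVPixelBufferRef in frame->data[3] rather than raw planes. Again a sketch against external libavcodec headers, not a standalone program.]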
[21:33:37 CEST] <shapsuk> nice
[22:06:23 CEST] <campeterson> Howdy! I'm trying to start, record, then stop, two webcams at the same time, and having the outputs go to different files. I've read up on -map but can't seem to get it right.
[22:06:38 CEST] <campeterson>   ffmpeg -f avfoundation -i "0:0" -i "1:0" -map 0:v -map 0:a out1.avi -map 1:v -map 1:a out2.avi
[22:08:59 CEST] <furq> campeterson: -f avfoundation -i "0:0" out1.avi -f avfoundation -i "1:0" out2.avi
[22:09:33 CEST] <furq> also i assume you're passing some codec params as well because otherwise that'll reencode to mpeg4
[22:10:13 CEST] <campeterson> Yes, I removed the codec params to simplify the question
[22:10:28 CEST] <campeterson> let me give that a try.
[22:10:35 CEST] <campeterson> Thanks @furq
[22:11:08 CEST] <furq> actually you still need -map for that
[22:11:25 CEST] <furq> -f avfoundation -i "0:0" out1.avi -f avfoundation -i "1:0" -map 1 out2.avi
[22:18:57 CEST] <campeterson> @furq that works. thank you.
[22:19:25 CEST] <campeterson> I'm still seeing other issues that appear to be related to my parameters
[22:19:52 CEST] <campeterson> but it accomplishes the goal of multiple inputs -> multiple outputs
[00:00:00 CEST] --- Wed Oct 12 2016