[Ffmpeg-devel-irc] ffmpeg.log.20161025

burek burek021 at gmail.com
Wed Oct 26 03:05:01 EEST 2016


[00:11:27 CEST] <Phi> Can someone confirm the avcodec.h line 102
[00:12:27 CEST] <Phi> https://git.ffmpeg.org/gitweb/ffmpeg.git/blob_plain/HEAD:/libavcodec/avcodec.h
[00:12:48 CEST] <Phi> In the scenario you're decoding from one, stripping pictures out for display, then re-encoding
[00:15:16 CEST] <Mavrik> Confirm what exactly?
[00:15:23 CEST] <Phi> For decoding, call avcodec_receive_frame
[00:15:28 CEST] <Mavrik> (Your link doesn't include line numbers, I suggest linking doxygen next time.)
[00:15:31 CEST] <Phi>  For encoding, call avcodec_receive_packet
[00:15:36 CEST] <Phi> both are receive
[00:16:02 CEST] <Phi> do I then still have to call av_write_frame to output to MP4?
[00:16:14 CEST] <Phi> or is that handled by the encoder?
[00:17:03 CEST] <Mavrik> You have to write your data to the muxer yourself.
[00:17:09 CEST] <Mavrik> This is a codec api.
[00:17:14 CEST] <Mavrik> Which is separate from muxers :)
[00:17:57 CEST] <Phi> rats
[00:18:01 CEST] <Phi> I'm suing
[00:18:27 CEST] <Phi> ...wait, what?
[00:18:40 CEST] <Phi> you receive a packet from the encoder, then you write a frame?
[00:22:57 CEST] <DHE> you send AVFrame's to the encoder, and receive AVPacket's from it
[00:23:57 CEST] <Phi> urgh
[00:24:02 CEST] <Phi> the thing is I have RTSP involved
[00:24:46 CEST] <Phi> so does av_receive_frame from RTSP's FormatContext return a H264 frame or NALs to decode to H264 then decode to YUV420P then encode to a new output H264 then pass that into the MP4
[00:32:05 CEST] <Phi> there's like 8 different formats involved XD
[00:34:12 CEST] <Phi> any idea DHE?
[00:37:16 CEST] <DHE> an AVFrame is uncompressed (give or take colourspace ratios). avcodec_receive_frame gives an AVFrame as output. therefore it's a decoder
[00:39:08 CEST] <Phi> so the output is a H264...
[00:39:42 CEST] <DHE> *thunk*
[00:39:46 CEST] <DHE> I said uncompressed
[00:40:05 CEST] <Phi> if it makes you feel better, I'm new to video
[00:40:21 CEST] <Phi> prior to this project I hadn't even worked with any rendering code
[00:40:59 CEST] <Phi> so the avcodec_receive_frame is a YUV420P AVFrame?
[00:41:52 CEST] <Phi> hm, I just worked out I'm decoding in two different ways
[00:42:28 CEST] <Phi> but yeah, is it a YUV420P AVFrame?
[00:45:08 CEST] <Phi> sempai
[00:47:03 CEST] <klaxa> if your video was YUV420P then it should decode to a YUV420P AVFrame
[00:47:48 CEST] <DHE> it most likely is. there's a "format" field in the AVFrame which should be equal to AV_PIX_FMT_YUV420P
[00:58:31 CEST] <Phi> I shall observe
[00:59:13 CEST] <Phi> so what's the method for converting YUV420P to BGR24?
[00:59:20 CEST] <Phi> I was using the decode2
[00:59:28 CEST] <Phi> *avcodec_decode_video2
[01:00:42 CEST] <Phi> pfft
[01:00:43 CEST] <Phi> XD
[01:00:57 CEST] <Phi> so I was decoding from YUV420P to YUV420P
[01:01:49 CEST] <DHE> avcodec_decode_video2 is the old API, combining avcodec_send_packet and receive_frame in one call
[01:02:13 CEST] <Phi> what's confusing is avcodec_receive_frame receives to a AVPacket
[01:02:33 CEST] <Phi> *av_read_frame
[01:03:10 CEST] <DHE> ah, that's not avcodec. that's part of avformat. it reads a packet from an AVFormatContext
[01:03:21 CEST] <Phi> hmm
[01:03:47 CEST] <Phi> well atm I need an image to RGB24, and the frame to be passed to a H264 AVFormat encoder
[01:04:52 CEST] <douglasbay> how can ffmpeg create private data in the adaptation field of a mpegts? I can create an asset with an adaptation field but the private data is 0
[01:05:39 CEST] <Phi> so I go av_read_frame to a packet, then pass it to avcodec_decode_video2 and get a YUV420P AVFrame (or AVPicture), then pass the YUV picture to a sws_scale and convert, and after that pass the packet to av_write_frame
[01:06:32 CEST] <Phi> it looks like they scrapped got_picture
[01:16:57 CEST] <Phi> I'm puzzled about avcodec_send_frame, it expects an AVCodecCtx but those are deprecated
[01:20:12 CEST] <DHE> no, only some fields in it are
[01:20:57 CEST] <Phi> AVStream->codec is marked as deprecated, what else should I use?
[01:22:03 CEST] <klaxa> codec_par i think
[01:22:44 CEST] <klaxa> *codecpar
[01:25:29 CEST] <Phi> avcodec_send_packet wants a AVCodecCtx
[01:26:23 CEST] <Phi> *avcodec_send_frame
[01:26:53 CEST] <klaxa> Phi: what i do (and this doesn't go all the way down to decoding actually) is: https://github.com/klaxa/mkvserver_mk2/blob/master/server.c#L200
[01:27:02 CEST] <klaxa> not sure if that helps
[01:28:32 CEST] <Phi> errr
[01:28:40 CEST] <Phi> do you encode?
[01:28:43 CEST] <klaxa> no
[01:28:49 CEST] <klaxa> i only do (de)muxing
[01:28:59 CEST] <Phi> hm
[01:29:00 CEST] <klaxa> but i allocate the codec context
[01:29:21 CEST] <Phi> I'll keep an eye on it
[01:29:43 CEST] <klaxa> i took this from doc/examples/remuxing.c and tried to remove the deprecation warnings
[01:30:01 CEST] <Phi> I'm getting an error -1 when I'm using avformat_write_header
[01:30:03 CEST] <klaxa> by looking at other code that already used the new codecpar api
[01:30:09 CEST] <Phi> so I may need your code
[01:32:29 CEST] <Phi> the thing is, I could keep a copy of the pointer to the codec context
[01:32:34 CEST] <Phi> but it does seem kinda pointless
[01:36:33 CEST] <Phi> DHE, if AVCodecCtx is not used why is AVStream::codec deprecated?
[01:36:55 CEST] <Phi> ...why are my messages on this chat always a bit off with what I'm actually trying to say.
[01:37:10 CEST] <Phi> if AVCodecCtx is used why is AVStream::codec deprecated?
[01:37:29 CEST] <Phi> I changed the chat background colour to red
[01:37:33 CEST] <Phi> maybe that'll fix up some things
[01:42:49 CEST] <klaxa> why is it indeed
[01:43:06 CEST] <klaxa> i would guess performance
[01:43:44 CEST] <klaxa> seeing that you now have to explicitly allocate the codec context with the help of codecpar, i would assume that before this change the codec was allocated even when it eventually wasn't needed
[01:44:06 CEST] <Phi> I don't think it was allocated
[01:44:39 CEST] <Phi> avformat_new_stream doesn't take any codec parameters, or even options
[01:44:39 CEST] <klaxa> oh? well i don't know that
[01:44:39 CEST] <klaxa> *then
[01:44:41 CEST] <Phi> just a raw AVFormatContext and AVCodec
[01:45:01 CEST] <Phi> I think they're not quite sure what they're doing, tbh
[01:45:21 CEST] <Phi> probably a mislaid deprecation warning somewhere
[01:45:56 CEST] <DHE> ffmpeg is many layers. the codec layer is just that - the codec. how you store it on disk is none of its concern as long as it gets back AVPacket's split up properly
[01:46:21 CEST] <DHE> streams are an avformat thing where a bunch of AVPacket's in sequence go to a single decoder
[01:47:08 CEST] <Phi> in this case, I need the encoder
[01:47:19 CEST] <Phi> hm
[01:47:31 CEST] <Phi> can I just pass avformat_write_frame and pass the YUV420P stuff?
[01:47:46 CEST] <DHE> no, it wants AVPackets
[01:47:49 CEST] <Phi> the issue is up until now (and the reason I upgraded to v12) was the output MP4 always had the source H264 profile
[01:48:22 CEST] <Phi> the other reason was that I spent so long trying to fix the output MP4 profile that v12 was released :p
[01:51:31 CEST] <Phi> okay, so I have an AVPacket from an RTSP FormatContext's av_read_frame
[01:51:44 CEST] <Phi> I also have an AVFrame from that AVPacket, presumably in YUV420P
[01:52:44 CEST] <Phi> ...so how does that work
[01:53:39 CEST] <DHE> .. well what do you want to do with them?
[01:57:44 CEST] <Phi> I have RTSP FormatContext, and I want an output RGB24 image, and a MP4 FormatContext with a different H264 profile
[02:00:50 CEST] <Phi> if I have multiple YUV420Ps in one AVPacket, I just need one RGB image
[02:02:41 CEST] <Phi> does that help Mr. DHE
[02:10:29 CEST] <Phi> klaxa: Do you have any encoding examples of v12?
[02:10:39 CEST] <Phi> I've been through the git and they're lacking
[02:11:47 CEST] <klaxa> v12 being what?
[02:12:12 CEST] <klaxa> and probably not, the project i linked was the first project i wrote with libav*
[02:12:21 CEST] <klaxa> and still is
[02:21:56 CEST] <Phi> v12 is the more recent one
[02:22:01 CEST] <Phi> came out this Oct
[02:22:09 CEST] <Phi> ...well, beta1 did
[02:22:54 CEST] <Phi> yeah, both did, 2016-10-02
[02:27:29 CEST] <klaxa> the more recent what?
[02:30:35 CEST] <DHE> Phi: the only way I know of to convert between formats is either swscale or a filter (which will just invoke swscale)
[02:32:43 CEST] <llogan> Phi: what's v12?
[02:32:59 CEST] <Phi> Libav 12 "Not Enough Trocadero"
[02:33:05 CEST] <Phi> https://libav.org/download/
[02:33:10 CEST] <llogan> why are you asking FFmpeg about Libav stuff?
[02:35:20 CEST] <Phi> because I get libav DLLs that ffmpeg C++ examples use
[02:35:40 CEST] <llogan> Why don't you use the FFmpeg libraries?
[02:37:09 CEST] <Phi> would there be any difference in usage?
[02:37:36 CEST] <llogan> Maybe you are (understandably) confused about the name? libav* is the collective name of the FFmpeg libraries. Libav is a fork of FFmpeg, and they conveniently decided to name themselves using a name FFmpeg historically used.
[02:37:36 CEST] <klaxa> yes
[02:38:37 CEST] <Phi> no, not at all :p
[02:38:46 CEST] <Phi> I shouldn't confuse libav with libav or libav*
[02:39:19 CEST] <furq> i assume that was sarcasm, but just to be on the safe side
[02:39:24 CEST] <furq> do not use anything from libav.org
[02:40:27 CEST] <llogan> if you do want to use the fork, i suggest asking for help at #libav
[02:42:14 CEST] <llogan> otherwise, just use the FFmpeg libraries instead
[02:42:28 CEST] <Phi> so which would you recommend?
[02:42:37 CEST] <llogan> Obviously I'm biased.
[02:42:42 CEST] <furq> it's hard to guess which one #ffmpeg would recommend
[02:42:45 CEST] <llogan> since I'm in #ffmpeg
[02:43:00 CEST] <Phi> yes, but you must have reasoning for your bias
[02:43:09 CEST] <Phi> we're all logical men here *snork
[02:43:28 CEST] <furq> is libav actively developed any more
[02:43:41 CEST] <llogan> I don't know what c++ examples you are referring to, but if they are using ffmpeg then there is little reason to use something else
[02:43:43 CEST] <furq> it seemed to run out of steam a good while ago
[02:43:47 CEST] <Phi> their last release was the 2nd, so it looks like it
[02:43:55 CEST] <furq> also yeah if you're looking at ffmpeg's examples then obviously use ffmpeg's libs
[02:44:28 CEST] <Phi> actually since one is a fork of the other then the examples don't really clarify things
[02:44:46 CEST] <furq> it's a fork from like 2011 or so
[02:44:54 CEST] <furq> there's been a lot of api changes since then
[02:44:58 CEST] <Phi> why was it forked?
[02:45:10 CEST] <furq> because some people got mad at michael niedermayer for some reason
[02:45:21 CEST] <furq> the same reason any fork happens except with names changed
[02:46:04 CEST] <Phi> never met the bloke
[02:46:14 CEST] <Phi> guess that's another thing to add to my time travel list
[02:47:43 CEST] <Phi> is there a difference in license?
[02:48:33 CEST] <furq> i don't think that would be possible
[02:49:24 CEST] <DHE> can't just change the license of someone else's code when forking it
[02:49:43 CEST] <DHE> which includes when "someone else" is a whole team of people
[02:51:02 CEST] <furq> you can do it if you have their permission, but somehow i don't think that was on offer
[02:52:07 CEST] <furq> afaik the only reason libav ever had any relevance is that one of the forkers was also the debian ffmpeg maintainer, and he went ahead and decided that ffmpeg was dead and replaced it with libav for two releases
[02:52:29 CEST] <furq> which is why we will eternally have ubuntu 12 users in here asking about it
[03:02:34 CEST] <klaxa> what i found to be the worst offense was that the shipped binary was even broken! https://files.klaxa.eu/2014-08-16-002239_1280x800_scrot.png
[03:03:19 CEST] <klaxa> (it was a puny laptop and i didn't have the time to compile ffmpeg)
[03:03:38 CEST] <klaxa> literally took more than an hour back then instead of a few minutes now
[03:13:03 CEST] <Phi> weeeelp
[03:13:20 CEST] <Phi> the ffmpeg libraries result in my dll not working
[03:13:51 CEST] <Phi> if only the program that used it provided error messages instead of "this dll no longer exists on this temporal plane" I'd be able to debug
[03:16:03 CEST] <Phi> multidimensional irrational calculus is not my forté
[03:19:47 CEST] <Phi> well, the probability engine gives me answer 42
[03:20:00 CEST] <Phi> which is... toolchain is GCC, not MSVC
[03:20:21 CEST] <Phi> time to grab the source I guess
[03:22:56 CEST] <Phi> well, I guess i can play some games on the clock while it downloads/makes
[05:08:49 CEST] <Phi> well, recompile finished, time to see if it works
[05:12:57 CEST] <t4nk748> hey, when i using -map 0:0 -map 0:1 -map 1:0 , how can I say the second audio stream is the default ? so that when i play the movie it choose the second audio automatically
[05:19:22 CEST] <Phi> "external symbol _AcquireCredentialsHandleA is unresolved"
[05:19:49 CEST] <Phi> looks like Secur32.dll
[05:22:26 CEST] <Phi> linking Secur32.lib fixed it
[05:23:50 CEST] <Phi> now my dll works, yay
[05:30:12 CEST] <furq> t4nk748: -disposition:a:0 default
[05:30:15 CEST] <furq> er
[05:30:21 CEST] <furq> -disposition:a:1 default
[05:44:28 CEST] <t4nk748> furq: thank you :)
[05:49:22 CEST] <t4nk748> furq: i did it now but it didn't work! http://pastebin.com/mAGKAw4D
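For what it's worth, the recipe in the ffmpeg documentation also clears the default flag on the first audio stream, which may be why the flag alone appeared not to work. A sketch with hypothetical filenames, using the stream maps from the question:

```shell
# Map video, first audio, second audio; mark the second audio default
# and clear the flag on the first so players don't keep picking it.
ffmpeg -i movie.mkv -i extra.mka \
    -map 0:0 -map 0:1 -map 1:0 -c copy \
    -disposition:a:0 0 -disposition:a:1 default \
    out.mkv
```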
[06:06:17 CEST] <Phi> alright folks, actual bug
[06:06:19 CEST] <Phi> http://pastie.org/private/jt8mbywhjmkzjmfakou93g
[06:06:38 CEST] <Phi> avcodec_open2 fails with error -542398533: Generic error in an external library
[06:41:06 CEST] <Phi> broken ffmpeg default settings detected
[06:47:58 CEST] <Karun> Hello. Is there any way to determine the format of a video file? I used an ffmpeg Java wrapper and I got "mov,mp4,m4a,3gp,3g2,mj2;" for an mp4 file.
[06:49:36 CEST] <furq> those are all more or less the same format
[06:50:49 CEST] <Karun> furq: Ah. Okay. So there's no way to get mp4 as the format? :)
[06:52:23 CEST] <furq> http://vpaste.net/COooW
[06:52:45 CEST] <Karun> furq: Awesome! Thank you so much.
[06:52:49 CEST] <furq> http://www.ftyps.com/
[06:53:16 CEST] <furq> i don't know how reliable the tag is though
[06:53:44 CEST] <furq> i imagine qt vs isom is a safe bet
[06:54:37 CEST] <Karun> The java wrapper makes a key value map that has this -  (major_brand,mp42)
[06:55:07 CEST] <Karun> keyvalue bag*
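The major_brand tag Karun's wrapper is surfacing can be read directly with ffprobe (filename hypothetical; brand reliability is as furq cautions above):

```shell
# Print the MP4 "major brand" ffprobe exposes as a format tag.
ffprobe -v error -show_entries format_tags=major_brand \
    -of default=noprint_wrappers=1:nokey=1 input.mp4
# typical values: isom, mp42, qt (catalogued at ftyps.com)
```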
[08:54:24 CEST] <babadoc> hey guys
[08:54:38 CEST] <babadoc> i have a question
[08:54:48 CEST] <babadoc> if ffmpeg says a video is " Stream #0:0[0x100]: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p"
[08:55:12 CEST] <babadoc> is that the .mp4 mime type?
[08:55:19 CEST] <babadoc> i cant figure it out
[08:55:38 CEST] <babadoc> is the mime type .h264
[08:55:54 CEST] <furq> that's the video codec
[08:56:08 CEST] <babadoc> okay, so how can i find the extension that the video is supposed to be using
[08:56:17 CEST] <babadoc> i ran ffmpeg -i msnamedfile.webm
[08:56:21 CEST] <babadoc> and it gave me the codec
[08:57:31 CEST] <furq> Input #0, mov,mp4,m4a,3gp,3g2,mj2, from 'test.mp4':
[08:57:51 CEST] <babadoc> uh
[08:58:27 CEST] <babadoc> where can i find that
[08:58:34 CEST] <babadoc> ffmpeg -i should show it correct
[08:58:50 CEST] <furq> yes
[08:58:54 CEST] <furq> although you should be using ffprobe for this
[08:59:00 CEST] <babadoc> okay
[09:00:43 CEST] <furq> http://vpaste.net/cdn0K
[09:01:42 CEST] <babadoc> http://pastebin.com/06Ww9p56
[09:01:47 CEST] <babadoc> okay, so let me see
[09:01:54 CEST] <furq> Input #0, mpegts, from 'Oct24_2016_22_40_34.webm':
[09:02:04 CEST] <furq> so the extension should be .ts
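The container check furq is doing can be scripted with ffprobe alone, which prints just the demuxer name(s):

```shell
# Detect the real container of a (possibly misnamed) file.
ffprobe -v error -show_entries format=format_name \
    -of default=noprint_wrappers=1:nokey=1 Oct24_2016_22_40_34.webm
# a misnamed MPEG-TS file prints: mpegts
```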
[09:05:49 CEST] <babadoc> .ts?
[09:05:52 CEST] <babadoc> what the hell is that
[09:06:07 CEST] <beauty> ls
[09:07:10 CEST] <beauty> how to know whether a function is thread safe???
[09:07:19 CEST] <beauty> for example avcodec_decode_video2
[09:12:53 CEST] <babadoc> alright thanks furq
[09:12:58 CEST] <babadoc> you really have been a big help
[09:13:05 CEST] <babadoc> :)
[09:14:39 CEST] <babadoc> aw man furq
[09:14:43 CEST] <babadoc> i have another problem
[09:15:11 CEST] <babadoc> 76.91.0.160:8080
[09:15:20 CEST] <babadoc> if you go there you can see what the problem is..
[09:16:00 CEST] <babadoc> i tried putting the .ts file into a html video tag and it doesn't load the video, yet it still shows the correct time
[09:16:03 CEST] <babadoc> its really weird
[09:17:06 CEST] <babadoc> it works if i encode the video into mkv though.
[09:17:13 CEST] <babadoc> i can watch it in a video tag
[09:18:06 CEST] <furq> ts isn't supported in browsers
[09:18:12 CEST] <furq> neither is mkv, so i don't know how that's working
[09:18:23 CEST] <furq> remux it to mp4
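The remux furq suggests is a straight stream copy (filenames hypothetical):

```shell
# Remux MPEG-TS (H.264/AAC) into MP4 without re-encoding.
# -movflags +faststart puts the index at the front so the video
# can start playing before the whole file downloads.
ffmpeg -i input.ts -c copy -movflags +faststart output.mp4
# older builds may also need: -bsf:a aac_adtstoasc
```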
[09:18:31 CEST] <nonex86> beauty: read the documentation
[09:18:52 CEST] <nonex86> beauty: usually it warns about thread safety
[09:18:55 CEST] <babadoc> you want me to show you what i mean
[09:18:57 CEST] <babadoc> here
[09:19:00 CEST] <babadoc> let me host the mkv file
[09:19:16 CEST] <babadoc> wait
[09:19:21 CEST] <babadoc> are mpeg files supported in browser?
[09:19:26 CEST] <furq> no
[09:19:39 CEST] <furq> only mp4 is supported in video tags in every browser
[09:19:48 CEST] <furq> if mkv works i assume it's using a plugin
[09:19:53 CEST] <babadoc> i dont have any
[09:19:59 CEST] <babadoc> here
[09:20:02 CEST] <babadoc> i can try hosting the mkv file
[09:20:04 CEST] <babadoc> tell me if it loads
[09:21:07 CEST] <babadoc> look, the link is http://192.168.0.5:8080/last25minutes.mkv
[09:21:12 CEST] <babadoc> shit
[09:21:14 CEST] <babadoc> dont click that
[09:21:24 CEST] <furq> i don't think it would really matter if i did
[09:21:27 CEST] <babadoc> 76.91.0.160:8080 go here and play it
[09:21:28 CEST] <babadoc> yeah
[09:21:30 CEST] <babadoc> i know its private
[09:22:18 CEST] <furq> yeah that just downloads
[09:22:21 CEST] <babadoc> wait
[09:22:34 CEST] <babadoc> going to just the base ip?
[09:22:38 CEST] <babadoc> with the port
[09:22:40 CEST] <babadoc> 76.91.0.160:8080
[09:22:43 CEST] <babadoc> that downloads?
[09:22:51 CEST] <furq> no
[09:22:54 CEST] <furq> that just doesn't do anything
[09:22:59 CEST] <babadoc> ...
[09:23:20 CEST] <babadoc> is it a blank page
[09:23:27 CEST] <furq> it has two broken video tags
[09:23:31 CEST] <babadoc> two
[09:23:32 CEST] <babadoc> okay
[09:23:39 CEST] <ritsuka> just remux it to mp4…
[09:23:43 CEST] <furq> ^
[09:24:05 CEST] <babadoc> for me it has only 1 broken
[09:24:10 CEST] <ritsuka> h.264 -> mp4, vpwhatever -> webm
[09:24:15 CEST] <babadoc> https://i.gyazo.com/e331aba8b25ce4c167fbf37ab84a0c93.png
[09:24:20 CEST] <babadoc> look! it only has 1 broken
[09:24:24 CEST] <babadoc> the mkv is working and i have no plugins
[09:24:35 CEST] <ritsuka> yes it could work on your browser, but not on the 99% of the other browsers
[09:24:48 CEST] <furq> the only interesting thing i could have got out of that image is what browser and os you're using, and you cropped that out
[09:25:02 CEST] <babadoc> well
[09:25:06 CEST] <babadoc> im using os x yosemite
[09:25:07 CEST] <babadoc> and chrome
[09:25:10 CEST] <babadoc> the most recent version
[09:25:22 CEST] <babadoc> here "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/53.0.2785.143 Safari/537.36"
[09:25:26 CEST] <babadoc> thats mine
[09:26:00 CEST] <ritsuka> chrome probably uses the same demuxer for mkv and webm
[09:26:06 CEST] <babadoc> hm. interesting
[09:26:15 CEST] <babadoc> the only thing i care about is watching it on my end
[09:26:16 CEST] <ritsuka> but it won't work in safari, firefox, etc…
[09:26:20 CEST] <furq> https://www.youtube.com/watch?v=27jvjVkpCUA&t=6m18s
[09:26:22 CEST] <furq> also this is more like it
[09:26:23 CEST] <babadoc> because int he end im uploading it to youtube
[09:26:37 CEST] <furq> if you're uploading it to youtube it makes literally no difference what container you use
[09:26:41 CEST] <babadoc> yeah i kow
[09:26:42 CEST] <babadoc> know
[09:26:49 CEST] <babadoc> but i want to watch it on my end before i upload it to youtube
[09:26:59 CEST] <babadoc> without downloading the 5gb file
[09:27:22 CEST] <babadoc> which was accomplished with the mkv file that i copied
[09:28:06 CEST] <babadoc> but i have to run ffmpeg -t 00:25:00 -i  -c copy -bsf:a aac_adtstoasc first25minutes.mkv
[09:28:10 CEST] <babadoc> with the webm video
[09:28:16 CEST] <babadoc> and that runs very fast
[09:28:21 CEST] <babadoc> like in a couple of seconds
[09:28:34 CEST] <babadoc> but i dont want to copy it into a new file
[09:28:36 CEST] <ritsuka> your original file is a .ts one, not webm :P
[09:28:40 CEST] <babadoc> i know
[09:28:45 CEST] <babadoc> i screwed up the filenames
[09:28:54 CEST] <furq> i'm pretty sure i already recommended sshfs for this
[09:28:55 CEST] <babadoc> that is the problem
[09:29:03 CEST] <babadoc> what is sshfs
[09:29:08 CEST] <furq> and by "i'm pretty sure" i mean i already checked the logs and i did
[09:29:29 CEST] <babadoc> um
[09:29:42 CEST] <babadoc> i have no recollection of this
[09:30:02 CEST] <furq> you can mount your home directory on your server as a filesystem on your home computer
[09:30:05 CEST] <furq> over ssh
[09:30:18 CEST] <babadoc> okay
[09:30:22 CEST] <furq> and then just play the files in a proper video player
[09:30:38 CEST] <babadoc> i see
[09:30:43 CEST] <babadoc> i will try to use this
[09:30:49 CEST] <furq> https://github.com/osxfuse/osxfuse/wiki/SSHFS
[09:30:53 CEST] <furq> i guess you want to read that
[09:32:57 CEST] <babadoc> okay, i will try this
[09:49:08 CEST] <babadoc> thanks furq, once again
[09:49:11 CEST] <babadoc> you saved the day
[09:49:26 CEST] <babadoc> i can't thank you enough
[10:07:02 CEST] <gour> morning
[10:08:45 CEST] <gour> i've ~TB of captured hi8 video footage stored as *.DV (http://pastebin.com/K1tPjkkV) and wonder what would be the best way to convert to some lossless format for archival purposes and still being able to use it as source material for video projects to be played on today's machines?
[10:09:19 CEST] <furq> probably ffv1 or lossless x264
[10:10:05 CEST] <furq> everything else either compresses much worse, is too obscure, or decodes too slowly
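The two options could look like this (filenames hypothetical; the slice/CRC flags are commonly suggested archival settings, not something stated in this discussion; DV audio is PCM, so `-c:a copy` into Matroska works):

```shell
# FFV1 version 3 in Matroska, a common archival setup.
ffmpeg -i input.dv -c:v ffv1 -level 3 -slices 4 -slicecrc 1 -c:a copy ffv1.mkv

# Lossless H.264 alternative (-qp 0), generally faster to decode.
ffmpeg -i input.dv -c:v libx264 -qp 0 -preset veryslow -c:a copy x264.mkv
```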
[10:11:27 CEST] <gour> furq: what will be estimated increase of space when converting to ffv1/x264? what resolution is realistic to get from such sources?
[10:12:39 CEST] <furq> what's wrong with the format it's in now
[10:12:51 CEST] <furq> or are you planning on capturing it again
[10:14:06 CEST] <furq> actually nvm dv is digital isn't it
[10:14:15 CEST] <furq> there's a clue in the name
[10:15:09 CEST] <gour> no, i'd just convert from the current *.dv format into something better, if possible
[10:15:37 CEST] <gour> the paste shows output of ffmpeg -i file.dv
[10:15:49 CEST] <furq> the size increase depends on the source really
[10:16:10 CEST] <gour> current DV is ~12G/hr
[10:16:13 CEST] <furq> if it's already 28mbit at 576p then it shouldn't be that bad
[10:17:01 CEST] <furq> 576p 4:2:0 rawvideo is only 118mbit
[10:18:11 CEST] <gour> which format you would recommend then?
[10:18:39 CEST] <furq> x264 is probably more widely compatible
[10:18:45 CEST] <gour> thanks
[10:18:58 CEST] <jya> how can I configure ffplay to use the Apple Audio Toolbox decoder?
[10:19:36 CEST] <furq> gour: https://en.wikipedia.org/wiki/FFV1#Video_archiving
[10:19:51 CEST] <furq> i think any h264 decoder can decode lossless h264, but i could be wrong
[10:20:34 CEST] <furq> i'd probably pick whichever one runs faster and compresses better
[10:20:35 CEST] <gour> furq: after converting to x264, is it realistic to produce output suitable for HD resolutions? in the past i was just creating DVDs (PAL res)
[10:21:10 CEST] <furq> for a given definition of suitable
[10:22:14 CEST] <furq> it's obviously not going to look great
[10:28:01 CEST] <gour> even HD (1280x720) would be bad?
[10:28:38 CEST] <pomaranc_> furq: decoder does not care if the input is lossless
[10:30:14 CEST] <furq> i've seen some hd releases which are upscaled from sd sources
[10:30:23 CEST] <furq> they can look ok with enough filtering but it's never going to look as good
[10:31:08 CEST] <gour> furq: i see...well, the main thing is that i preserve material the best i can
[10:31:09 CEST] <furq> i wouldn't really know how you'd go about filtering it other than a lanczos/spline scale and then sharpening
[10:31:13 CEST] <furq> there's doubtless much fancier stuff you can do
[10:33:59 CEST] <gour> furq: e.g.?
[11:02:57 CEST] <bencoh> hqx? :]
[12:07:59 CEST] <pomaranc> can ffmpeg handle 12-bit h265 input?
[13:42:40 CEST] <MINIMAN10000> gah!
[13:43:11 CEST] <MINIMAN10000> ffmpeg -r 60 -i bugs.webm out.webm did make the video play at 60 fps but did not modify the audio playback rate
[13:50:22 CEST] <MINIMAN10000> why is changing audio playback rate so bad? why can't it just take any arbitrary audio playback rate
[14:02:03 CEST] <MINIMAN10000> what the heck
[14:02:18 CEST] <MINIMAN10000> ffmpeg -i final.flac -i bugs.webm -r 60 out.webm should be forcing the frame rate of the output file to 60 fps
[14:02:39 CEST] <MINIMAN10000> i guess what i need is my input framerate forced then?
[14:09:23 CEST] <MINIMAN10000> I ended up having to use audacity since I don't know of any sane way of handling it with ffmpeg which is disappointing :| hopefully it's just me not knowing a setting.
[14:25:12 CEST] <nonex86> MINIMAN10000: honestly speaking cant understand, what do you want
[14:26:28 CEST] <DHE> I think to speed up the video, both framerate and audio pitch
[14:26:32 CEST] <MINIMAN10000> so the video was 25 fps. I wanted it to playback at 60 fps and have the audio play at that rate as well
[14:26:43 CEST] <MINIMAN10000> DHE: I wanted pitch unchanged
[14:26:52 CEST] <c_14> setpts atempo
[14:27:17 CEST] <DHE> still confused
[14:27:22 CEST] <c_14> https://trac.ffmpeg.org/wiki/How%20to%20speed%20up%20/%20slow%20down%20a%20video
[14:27:22 CEST] <MINIMAN10000> not only could I not figure out how to use atempo, you have do weird chaining because it only accepts 0.5 to 2
[14:28:36 CEST] <c_14> atempo=2,atempo=1.2
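Putting c_14's pieces together for the 25 -> 60 fps case: the factor is 60/25 = 2.4, setpts handles the video, and atempo handles the audio pitch-corrected, chained because each instance only accepts 0.5 to 2.0 (2 * 1.2 = 2.4). A sketch:

```shell
# Speed both video and audio up 2.4x in one pass.
ffmpeg -i bugs.webm \
    -filter_complex "[0:v]setpts=PTS/2.4[v];[0:a]atempo=2,atempo=1.2[a]" \
    -map "[v]" -map "[a]" out.webm
```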
[14:30:31 CEST] <MINIMAN10000> I'll try out of curiosity but the inability to use any float value is more than enough reason to avoid doing it in ffmpeg.
[14:31:27 CEST] <c_14> you can use floats
[14:31:33 CEST] <MINIMAN10000> any float
[14:31:45 CEST] <MINIMAN10000> it only accepts was it 0.5 to  2
[14:31:47 CEST] <c_14> write a patch modifying the atempo filter
[14:31:54 CEST] <MINIMAN10000> sounds fair enough
[14:32:09 CEST] <c_14> It's a deficiency in the used algorithm afair
[14:32:33 CEST] <MINIMAN10000> I mean it sounds like you could have it be recursive or something
[14:33:06 CEST] <c_14> iterative, but yes
[14:33:42 CEST] <MINIMAN10000> Bugs bunny 60 fps http://puu.sh/rV1Bn/0bf2df626c.webm Is what I was making. The quality sounds better than the audacity tempo http://puu.sh/rV0TS/99b35f4f94.webm
[14:35:19 CEST] <MINIMAN10000> nah nvm they sound the same
[14:35:40 CEST] <MINIMAN10000> they sound weird when he says nighty night. but I assume that's just how it is.
[14:37:44 CEST] <MINIMAN10000> It was a slow motion scene so I figured there were more frames drawn so I wanted to see the scene played back at 60 fps to sorta get an idea of what an old cartoon would look like at 60 fps
[14:42:35 CEST] <nonex86> MINIMAN10000: use smooth video project
[14:42:59 CEST] <nonex86> MINIMAN10000: or any other motion estimation software then
[14:43:46 CEST] <nonex86> *motion interpolation
[14:44:36 CEST] <JEEB> FFmpeg does have nowadays motion estimation
[14:44:41 CEST] <JEEB> I mean, interpolation
[14:44:53 CEST] <JEEB> please look at the docs regarding that stuff
[14:45:45 CEST] <nonex86> JEEB: real interpolation or just frame blending?
[14:47:55 CEST] <MINIMAN10000> Huh I didn't know that. First time I just frame interpolation was SVP ( Smooth video project ) but the free version is highly restricted in what it can do.
[14:48:01 CEST] <MINIMAN10000> First time I saw*
[14:48:20 CEST] <bencoh> you could have a look at avisynth as well
[14:48:33 CEST] <MINIMAN10000> think i ran into a video trying to find SVP lol
[14:49:02 CEST] <nonex86> some posts on the net advise using ffmpeg + a vapoursynth plugin
[14:49:08 CEST] <MINIMAN10000> Like the more I think about it the cooler frame interpolation on anime seems.
[14:49:14 CEST] <nonex86> so i guess no native support in ffmpeg exists
[14:49:18 CEST] <MINIMAN10000> ah
[14:49:29 CEST] <nonex86> at least i cant find any reference
[14:49:37 CEST] <JEEB> nonex86: actual motion interpolation
[14:50:02 CEST] <nonex86> JEEB: interesting, but i cant find any reference to it :(
[14:50:50 CEST] <nonex86> found this post https://ubuntuforums.org/showthread.php?t=2304500 dated 2015 nov
[14:52:34 CEST] <bencoh> nonex86: https://ffmpeg.org/ffmpeg-filters.html#minterpolate
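A sketch of the minterpolate usage bencoh links, for the 25 -> 60 fps cartoon case above (the filter landed in FFmpeg master in August 2016, so as nonex86 finds below it is not in 3.1.x releases):

```shell
# Motion-compensated interpolation up to 60 fps.
ffmpeg -i bugs.webm -vf "minterpolate=fps=60:mi_mode=mci" out.webm
```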
[14:52:51 CEST] <nonex86> bencoh: thanks!
[14:56:27 CEST] <nonex86> well, i checked my ffmpeg build, it doesn't contain the minterpolate filter and i can't find any reference in the configure script...
[14:59:44 CEST] <bencoh> nonex86: Date:Tue Aug 23 17:50:35 2016 +0530 avfilter: added motion estimation and interpolation filters
[15:00:01 CEST] <bencoh> dunno which version you're using but it's still quite "young" :)
[15:20:14 CEST] <nonex86> bencoh: i see... well, my version is 3.1.4, release
[15:46:10 CEST] <mad_ady> Hello all! I'm trying to build ffmpeg from source for an embedded device running ubuntu.
[15:46:56 CEST] <mad_ady> I have the dependencies and I'm building it with "debuild -b -uc -us"
[15:47:31 CEST] <mad_ady> the build process takes a while but eventually fails on some tests in ffprobe_xml
[15:47:39 CEST] <mad_ady> make[2]: Target 'check' not remade because of errors.
[15:47:53 CEST] <mad_ady> How do I build it "the debian way" skipping checks?
[16:30:14 CEST] <nwoki> hi all. Is there a way to specify which of two webcams to use when streaming (via cmd line).
[16:30:33 CEST] <nwoki> as of now I get a dialog which requires me to specify which webcam to use
[16:40:54 CEST] <ChocolateArmpits> nwoki: Have you looked at the directshow page? https://trac.ffmpeg.org/wiki/DirectShow
[17:29:40 CEST] <bencoh> 9
[17:29:40 CEST] <bencoh> woops
[17:30:37 CEST] <ozette> 8
[17:31:05 CEST] <SchrodingersScat> 7
[17:32:22 CEST] <tontonth> 6.1
[17:34:23 CEST] <ozette> bencoh: you iniated a countdown chain, it simply can not be stopped.
[17:35:12 CEST] <nonex86> Chuck Norris can do anything... even stop countdown chain
[17:35:35 CEST] <ozette> oh noes
[18:39:29 CEST] <Phi> mook
[18:40:03 CEST] <Phi> I'm getting "broken ffmpeg default settings detected" with libx264
[18:40:10 CEST] <Phi> any idea how to fix?
[18:40:22 CEST] <Phi> in the C++
[18:40:33 CEST] <Phi> not the cmdl
[18:54:37 CEST] <BtbN> don't use some old ffmpeg version.
[19:34:49 CEST] <zeryx> can I do something like ffmpeg -i something.mp4 -codec copy something.mp4?
[19:35:19 CEST] <furq> if you mean overwriting the input file then no
[19:35:32 CEST] <zeryx> essentially mutating it yeah
[19:35:41 CEST] <zeryx> I don't want to create a new file between stages
[19:36:42 CEST] <furq> that would just delete the input file
[19:36:52 CEST] <furq> there are container-specific tools that can make limited inplace changes
[19:36:58 CEST] <furq> if you want to edit metadata or something
[19:41:44 CEST] <zeryx> ah kk
[19:53:20 CEST] <thebombzen> so, I've got a high-FPS video I'm trying to slow down without re-interlacing the frames. but FFmpeg doesn't want to respect -vsync 0 for some reason.
[19:54:10 CEST] <thebombzen> if I run ffmpeg -i input.mov -map 0:v -vf 'setpts=PTS*12.0' -c:v ffv1 output.nut
[19:54:17 CEST] <ATField> Hello! If I have an original videofile, and a supposedly losslessly cut fragment from that videofile, can I check somehow whether or not that fragment is really a lossless cut (a cut without reencoding, any quality loss, etc)?
[19:55:10 CEST] <thebombzen> the original video starts at slightly under 240 fps (i.e. 239.84) and it re-encodes it to 240 fps. so I tried using -vsync 0 but that actually changed nothing.
[19:56:23 CEST] <thebombzen> not only did -vsync 0 not change the output fps compared to without it, it also didn't change the output tbr or tbc. am I missing something here?
[19:56:52 CEST] <thebombzen> also, why is it outputting 240 fps instead of the input, which is 239.84?
[19:57:41 CEST] <ChocolateArmpits> ATField: Use PSNR or compare waveforms by eye in an NLE
[19:59:12 CEST] <ChocolateArmpits> thebombzen: Did you try setting the rate manually via -r ?
[19:59:50 CEST] <thebombzen> ChocolateArmpits: I don't want to do that because I can't figure out the exact input framerate. 239.84 is weird.
[20:00:13 CEST] <thebombzen> I know for example that 29.97 is actually 30000/1001, but 239.84 isn't a multiple of 1/1001.
[20:00:35 CEST] <JEEB> it seems to be 24000/1001 * 10
[20:01:00 CEST] <JEEB> well not exactly it seems
[20:01:14 CEST] <JEEB> >>> 240000 /1001
[20:01:15 CEST] <JEEB> 239.76023976023976
[20:01:15 CEST] <thebombzen> exactly.
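A quick sanity check of the fractions in play here (the 361440/1507 value is the avg_frame_rate that comes up a bit further on; all numbers are computed, not assumed):

```shell
# Evaluate the candidate frame-rate fractions to see which one matches
# the observed "239.84" rate, and what a 12x slowdown of it would give.
awk 'BEGIN {
    printf "240000/1001      = %.4f\n", 240000/1001       # NTSC-style 240p rate
    printf "361440/1507      = %.4f\n", 361440/1507       # the observed rate
    printf "(361440/1507)/12 = %.4f\n", 361440/1507/12    # after 12x slowdown
}'
```

So 239.84 is indeed not 240000/1001 (which is 239.7602), confirming thebombzen's point that guessing a value for -r risks being wrong.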
[20:01:29 CEST] <thebombzen> I don't want to use -r unless it's exact because that would drop frames.
[20:01:51 CEST] <thebombzen> which is why I want to just take the timestamps and multiply them by 12, and get something that is "slightly less than 20 fps"
[20:02:14 CEST] <thebombzen> the problem is using -vf 'setpts=PTS*12' combined with -vsync 0 still forces it to 240 fps and I don't know why, given that I'm using -vsync 0.
[20:03:16 CEST] <thebombzen> I feel that I should be able to slow the video down by an integer multiple without any sort of forced framerate choice, using just setpts. but I don't know exactly how to achieve that given that -vsync 0 doesn't like me.
[20:03:57 CEST] <geri> how accurate is recording with the following setting? ffmpeg -f gdigrab -framerate 30 -i desktop out.mpg
[20:04:12 CEST] <geri> is it likely that the framerate changes?
[20:05:28 CEST] <thebombzen> okay so I used ffprobe to give me more info. apparently the r_frame_rate is 240/1 but the avg_frame_rate is 361440/1507
[20:05:47 CEST] <thebombzen> which is approximately... 239.84. Now, why would those be different? what is the actual difference between those?
[20:07:00 CEST] <ATField> ChocolateArmpits: Thanks. Is there a way to do that through ffmpeg commands? And if not, which NLE program would you recommend for that?
[20:07:35 CEST] <Mad7Scientist> I have this thought -- if you take an HD h264/AVC video and compare it to the same video at 1/4 the pixel count encoded at MPEG2 at the same bit rate, if the viewer is far enough away to not see the resolution difference will the mpeg2 version look better?
[20:07:39 CEST] <ATField> ChocolateArmpits: Also, is extracting the same frame from both files for comparison a viable option? Or it would mess things up \ be too convoluted?
[20:07:39 CEST] <Maxz> hi, I chose ismv as the container for storing a live video. I chose it because I can read it while it is still unfinished. My biggest issue now is that it takes a lot of seek operations to read the file using "ffprobe". For example: a 3.1G ismv file takes "17406 seeks" (ffprobe -v 9 -loglevel 99 file.ismv). My question is: is there a way to accelerate this while keeping ismv?
[20:08:33 CEST] <Mad7Scientist> geri, Windows I assume?
[20:08:47 CEST] <Chocola2> ATField: Well ideally you would want to cut the part in the original video to match the start and duration. Are you sure it starts with a keyframe? As for your next question, this is quite possible, but be sure to use the same player. You can then compare the images using the psnr filter in ffmpeg
[20:09:02 CEST] <geri> Mad7Scientist: Windows 10
[20:09:36 CEST] <ChocolateArmpits> ATField: Be sure to have several image samples for comparison so you are more sure about this
[20:09:39 CEST] <thebombzen> ATField: theoretically you could convert both to raw frames in the same pixel format and use diff. but I think it'd probably pretty inefficient.
[20:09:57 CEST] <thebombzen> well they should already be the same pixel format. otherwise it's not lossless.
[20:10:31 CEST] <ChocolateArmpits> thebombzen: depending on the format converted images could include additional time-based metadata which can invalidate results unless you know the addresses the mismatch can happen at
[20:11:24 CEST] <geri> Mad7Scientist: is there a way to get a unix/utc timestamp in milliseconds, which tells me when the frame has been recorded?
[20:13:10 CEST] <ChocolateArmpits> geri: are you sure there is timecode data or that timestamps match real time ?
[20:14:08 CEST] <thebombzen> so what is the actual difference between r_frame_rate and avg_frame_rate?
[20:14:20 CEST] <thebombzen> does it mean the framerate of the input video is non-constant if these overlap?
[20:14:32 CEST] <geri> ChocolateArmpits: i would like to know the real time (e.g. utc) timestamp for a frame recorded... is that possible?
[20:14:34 CEST] <thebombzen> if these are not equal*
[20:14:56 CEST] <ChocolateArmpits> thebombzen: https://gist.github.com/yohhoy/50ea5fe868a2b3695e19
[20:16:46 CEST] <ChocolateArmpits> geri: As I said, it depends on either of those factors, if none are present or if they do not pertain to real world time then that might be impossible, unless there's some ancillary data written too somehow, which would be application specific
[20:17:12 CEST] <geri> ChocolateArmpits: could i change the source ffmpeg?
[20:17:21 CEST] <ChocolateArmpits> For what?
[20:17:34 CEST] <geri> ChocolateArmpits: to save the timestamp at which a frame has been recorded
[20:17:44 CEST] <ChocolateArmpits> There's a setpts filter
[20:18:04 CEST] <ChocolateArmpits> and a timeclock option to use in the equation
[20:18:22 CEST] <thebombzen> ChocolateArmpits: that being said, multiplying the PTS by 12 should divide the TBR by 12, but even with -vsync 0 we still end up with 240 fps/tbr. What am I missing?
[20:18:30 CEST] <thebombzen> why does -vsync 0 do nothing?
[20:18:31 CEST] <geri> setpts filter will be called each frame or how should i understand it?
[20:18:55 CEST] <ChocolateArmpits> geri: it sets timestamps for each video frame
[20:18:58 CEST] <ChocolateArmpits> https://ffmpeg.org/ffmpeg-filters.html#setpts_002c-asetpts
[20:19:29 CEST] <ChocolateArmpits> Read the documentation
[20:19:40 CEST] <geri> ok
[20:19:46 CEST] <ChocolateArmpits> I have tried wallclock some time ago but it lagged really bad on Windows
[20:20:00 CEST] <ChocolateArmpits> so see for yourself further
[20:20:01 CEST] <ATField> Okay, thanks! And just to make sure, ffmpeg -ss [..] -i IN.mkv -t [..] -c copy OUT.mkv and ffmpeg -i IN.mkv -ss [..] -to [..] -c copy OUT.mkv are guaranteed to make lossless cuts, right?
[20:20:03 CEST] <ATField> Also, is there a way to preserve timestamps (so they dont get reset, see: https://trac.ffmpeg.org/wiki/Seeking) without using the Output seeking order and having to wait?
[20:20:03 CEST] <geri> ChocolateArmpits: where does the lag come from?
[20:20:32 CEST] <ChocolateArmpits> geri: I have no idea, and I haven't tested this with any other system or any newer build
[20:20:37 CEST] <thebombzen> ATField: generally do ffmpeg -ss [..] -copyts -i in.mkv
[20:20:40 CEST] <ChocolateArmpits> investigate yourself to see how it works for you
[20:21:15 CEST] <thebombzen> and yea ATField it will make lossless cuts but they won't be accurate.
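A sketch of the stream-copy cut being discussed; the filename, start time, and duration are placeholders. With -c copy the cut can only begin at a keyframe, which is why the result is lossless but not frame-accurate:

```shell
# Lossless (no re-encode) cut: seek before the input, keep original
# timestamps with -copyts, and copy all streams untouched.
ffmpeg -ss 00:01:00 -copyts -i in.mkv -t 00:00:30 -c copy cut.mkv
```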
[20:21:21 CEST] <geri> ChocolateArmpits: i wrote a screen recording app for windows... i try to see if ffmpeg is better and how it works internally....
[20:21:53 CEST] <thebombzen> geri: if you want to screenrecord on windows I recommend OBS. It does have the downside of being a GUI but it's still FOSS and works great.
[20:22:09 CEST] <ATField> thebombzen: They ended up being accurate two times in a row, which is what made me suspect that they're true lossless cuts. I guess I just got lucky and both cuts started at a proper keyframe.
[20:22:13 CEST] <geri> thebombzen: i only need a console application...
[20:22:34 CEST] <thebombzen> then I'd recommend the Directshow capture device
[20:22:56 CEST] <ChocolateArmpits> thebombzen: avg_frame_rate isn't in the timestamps, r_frame_rate is
[20:23:06 CEST] <ChocolateArmpits> so it's using those with 240/1 rate
[20:23:09 CEST] <thebombzen> geri: see: https://trac.ffmpeg.org/wiki/Capture/Desktop#Windows
[20:23:13 CEST] <geri> thebombzen: have you used OBS?
[20:23:28 CEST] <geri> thebombzen: i dont have direct show filter installed...
[20:23:53 CEST] <thebombzen> geri: which is why when you click the link I provided, you realize that the "directshow filter" is actually a link to where you can install it.
[20:24:58 CEST] <thebombzen> ChocolateArmpits: then, if I use -vf setpts=PTS*12 -vsync 0, why does it still have 240 fps/tbr? Why doesn't it decrease to 20?
[20:25:31 CEST] <thebombzen> and geri yes I have used OBS. I wouldn't say it "works great" if I had never used it.
[20:26:04 CEST] <ChocolateArmpits> thebombzen: Are you sure you are transcoding the input ?
[20:26:57 CEST] <ChocolateArmpits> Also
[20:26:59 CEST] <ChocolateArmpits> >Each frame is passed with its timestamp from the demuxer to the muxer.
[20:27:10 CEST] <ChocolateArmpits> I guess vsync may override setpts
[20:27:22 CEST] <ChocolateArmpits> I don't know why would you use the two
[20:27:38 CEST] <ChocolateArmpits> unless it was -vsync 1 or 2
[20:29:53 CEST] <thebombzen> ChocolateArmpits: because I want the muxer to determine the tbr based on the timestamps of the frames it receives.
[20:30:45 CEST] <thebombzen> the input stream has the frames separated by 1/240 of a second. the output will have them separated by 1/20 of a second. the muxer should see that and determine "20 tbr"
[20:32:52 CEST] <ChocolateArmpits> You're not making sense. -vsync 0 is not needed when you have that setpts set
[20:33:30 CEST] <ChocolateArmpits> -vsync 0 passes timestamps from demuxer to the muxer, what I'm guessing is overwriting the setpts timestamps so you end with 240/1 again
[20:33:49 CEST] <ChocolateArmpits> muxing happens after encoding that happens after filtering
[20:35:02 CEST] <JEEB> btw, the least insane timestamp mangling in ffmpeg.c seemed to be `-vsync passthrough -copyts`
[20:35:21 CEST] <JEEB> although the ways of ffmpeg.c are always dark and treacherous
[20:50:52 CEST] <thebombzen> ChocolateArmpits: although, the results are the same with and without -vsync 0. that's what's bugging me.
[20:51:33 CEST] <ChocolateArmpits> thebombzen: then why not use -r 20 ?
[20:51:39 CEST] <ChocolateArmpits> without all others
[20:52:01 CEST] <thebombzen> well using -r 20 without setpts won't slow the video down by 12x. it'll just drop frames.
[20:52:21 CEST] <ChocolateArmpits> as an input option
[20:52:39 CEST] <thebombzen> huh. lemme see what that does.
[20:54:37 CEST] <thebombzen> yea that does the trick. weird.
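A sketch of the fix that worked here, with the filenames taken from the earlier command in this discussion. The key point is that -r placed *before* -i overrides the input frame rate, so timestamps are regenerated at 20 fps (a 12x slowdown of ~240 fps) instead of frames being dropped or duplicated as an output-side -r would do:

```shell
# -r as an INPUT option: reinterpret the ~240 fps source as 20 fps,
# slowing playback by 12x without touching the frame data.
ffmpeg -r 20 -i input.mov -map 0:v -c:v ffv1 output.nut
```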
[21:00:54 CEST] <zeryx> is there a faster way to cat images together forming a video file?
[21:01:01 CEST] <zeryx> I'm doing image operations on each frame, then catting them back together
[21:01:07 CEST] <furq> faster than what
[21:01:43 CEST] <zeryx> right now I'm using ffmpeg -r 25 -f concat -i /path/to/catfile.cat catted_video.mp4
[21:01:52 CEST] <zeryx> where r is actually just whatever the input video's fps was
[21:02:04 CEST] <Maxz> zeryx: https://trac.ffmpeg.org/wiki/Create%20a%20video%20slideshow%20from%20images
[21:02:06 CEST] <zeryx> can I offload work to the gpu for encoding?
[21:02:18 CEST] <furq> ffmpeg -framerate 25 -i img%03d.png out.mp4
[21:02:24 CEST] <furq> assuming your images are numbered in sequence
[21:02:32 CEST] <zeryx> yeah cat file, should be identical right
[21:02:33 CEST] <ChocolateArmpits> zeryx: look up h264_nvenc
[21:02:52 CEST] <Maxz> h264_qsv too
[21:02:57 CEST] <furq> there's no need to create a concat file
[21:03:04 CEST] <zeryx> there is, long story
[21:03:05 CEST] <furq> i would also suspect the concat demuxer is slower than image2
[21:03:12 CEST] <zeryx> wait what
[21:03:16 CEST] <zeryx> the demuxers are different?
[21:03:21 CEST] <zeryx> shouldn't they be totally identical?
[21:03:34 CEST] <ChocolateArmpits> what demuxers?
[21:03:52 CEST] <zeryx> concat vs. image2 (assuming image2 is the ffmpeg -framerate 25 -i img%03d.png out.mp4 guy)
[21:04:16 CEST] <furq> well you're currently using image2 for every image and then concat
[21:04:29 CEST] <furq> i don't think image2 uses the concat demuxer for multiple images
[21:04:31 CEST] <furq> i could be wrong though
[21:04:58 CEST] <ChocolateArmpits> best to separate those two in your head. concat is used to merge multiple files, image2 for image sequences
[21:05:36 CEST] <ChocolateArmpits> Read documentation on formats https://www.ffmpeg.org/ffmpeg-formats.html
[21:05:36 CEST] <furq> encoding an image sequence is merging multiple files
[21:05:39 CEST] <zeryx> like, that's not really how the documentation explains it?
[21:06:47 CEST] <zeryx> so using the image2 demuxer is faster than using the concat file like what I've been using?
[21:07:13 CEST] <furq> it's definitely faster in the sense that you don't need to create a concat file
[21:07:22 CEST] <zeryx> thats instantaneous though
[21:07:31 CEST] <zeryx> like ~2 ms
[21:07:34 CEST] <furq> i would imagine it runs faster but i don't actually know that
[21:07:38 CEST] <zeryx> vs. 5 min to split & then re-encode a video
[21:07:43 CEST] <zeryx> a 6 min video*
[21:07:47 CEST] <zeryx> on a really powerful machine
[21:07:53 CEST] <furq> and if it is then it's probably unnoticeable if you're encoding
[21:07:54 CEST] <zeryx> trying to optimise this
[21:08:03 CEST] <furq> but concat seems like a weirdly roundabout way to do it
[21:08:11 CEST] <zeryx> by re-encode I mean literally concating images together at a certain fps to create a video file
[21:08:19 CEST] <zeryx> the word concat might not mean the same thing
[21:08:32 CEST] <zeryx> concatenating images to generate a video file is what I'm doing
[21:08:45 CEST] <furq> well yeah that's what image2 does
[21:09:01 CEST] <zeryx> so that's the go-to approach? I haven't seen much documentation suggesting that method online
[21:09:13 CEST] <furq> i've never seen anyone doing it another way
[21:09:21 CEST] <zeryx> ./shrug
[21:10:05 CEST] <furq> it saves a command if nothing else
[21:25:04 CEST] <zeryx> furq I think you're right
[21:25:07 CEST] <zeryx> the fps seems to be higher
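The image2 approach furq describes, as a sketch; the img%03d.png pattern and 25 fps rate are placeholders matching the earlier example. -pix_fmt yuv420p is an optional addition for player compatibility, not something from the discussion:

```shell
# Encode a numbered image sequence directly via the image2 demuxer --
# no concat file needed, ffmpeg globs the pattern itself.
ffmpeg -framerate 25 -i img%03d.png -c:v libx264 -pix_fmt yuv420p out.mp4
```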
[21:25:34 CEST] <zeryx> one more question, if I set the output of an image sampling operation to jpg, is it possible to set the compression value?
[21:25:47 CEST] <furq> -q:v
[21:25:57 CEST] <furq> 2 is best quality, 31 is worst
[21:26:16 CEST] <zeryx> interesting, so would 2 be equivalent to a png file (no compression at all)?
[21:26:43 CEST] <furq> max quality is still lossy
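Going the other direction (video to JPEG frames), a sketch of the -q:v option just mentioned; input.mp4 is a placeholder:

```shell
# Extract frames as JPEG. -q:v sets MJPEG encoder quality:
# 2 = best, 31 = worst. Even 2 is lossy, unlike PNG.
ffmpeg -i input.mp4 -q:v 2 frame%04d.jpg
```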
[22:21:43 CEST] <boxk> Hi, do you know how to avoid the audio/video not sync when cutting a slice of a file? I noticed for some files I export I have a perfect match of the audio/video frames but with some they have a difference of 1 which makes the video unsynced.
[22:23:46 CEST] <Phi> "<BtbN> don't use some old ffmpeg version."
[22:24:04 CEST] <Phi> I literally downloaded it from ffmpeg.org and make'd it yesterday
[22:24:16 CEST] <Phi> away with your sass
[22:24:36 CEST] <relaxed> which version did you download?
[22:25:54 CEST] <zeryx> when extracting audio from a file, is it possible to make audio type generic?
[22:26:07 CEST] <zeryx> IE can I just extract the audio without any kind of conversions so I can re-attach it later?
[22:26:08 CEST] <Phi> relaxed: whatever was on the git.ffmpeg.org
[22:27:19 CEST] <relaxed> zeryx: yes
[22:27:39 CEST] <zeryx> how? can I not denote a file extension when extracting?
[22:27:48 CEST] <zeryx> right now I'm extracting as "file.aac"
[22:28:16 CEST] <relaxed> ffmpeg -i input -map 0:a -c:a copy output.aac
[22:28:39 CEST] <zeryx> but will that be aac audio container output or whatever the original output is?
[22:29:11 CEST] <c_14> zeryx: -c copy -f matroska (I don't know of any audio codec that doesn't go in matroska)
[22:29:27 CEST] <c_14> Assuming the audio stream isn't necessarily aac
[22:29:41 CEST] <c_14> The aac format (should) reject non-aac
[22:30:43 CEST] <zeryx> c_14: can you be more specific?
[22:30:57 CEST] <c_14> About the command?
[22:31:06 CEST] <zeryx> yeah, where does that bit fit in?
[22:31:09 CEST] <c_14> ffmpeg -i input -map 0:a -c:a copy output.mkv
[22:31:35 CEST] <c_14> You can also use the .mka extension to clue that the container only contains audio
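Putting c_14's suggestion together as one sketch (input filename is a placeholder):

```shell
# Extract all audio streams without re-encoding. Matroska (.mka for
# audio-only) accepts nearly any codec, so the stream is copied bit-exact
# and can be re-attached later.
ffmpeg -i input.mp4 -map 0:a -c:a copy audio.mka
```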
[22:34:02 CEST] <zeryx> :+1:
[22:34:04 CEST] <zeryx> nice
[23:03:51 CEST] <Maxz> 182
[23:05:40 CEST] <debkad> hello, i'm currently trying to download a .ts stream from a .m3u8 url. the problem is sometimes my connection goes off or disconnects; when i try to resume, the download starts from the beginning, and if i stop ffmpeg the resulting .ts file is unreadable (moov atom ...). how can i resume from where ffmpeg hung?
[23:07:00 CEST] Last message repeated 1 time(s).
[23:09:32 CEST] <debkad> if ffmpeg is not the right tool for that please tell me, i will use something else
[23:11:00 CEST] <debkad> ffmpeg has been hanging at 43 min for about an hour
[23:12:14 CEST] <debkad> !ping
[23:17:11 CEST] <Phi> we hear you debkad
[23:17:20 CEST] <Phi> that doesn't necessarily mean we have a clue :p
[23:17:27 CEST] <debkad> :(
[23:18:49 CEST] <Phi> you will probably have to pass a start time parameter
[23:18:58 CEST] <Phi> not sure how you would go about that
[23:20:10 CEST] <debkad> Phi: thank you, i will give that a shot; i will copy the .ts file first and hope it doesn't end up unreadable
[23:21:15 CEST] <refrijerator> Hi, my mic is such that if I record my voice, I need options "arecord -f S16_LE -d 5 out.wav" (Recording WAVE 'out.wav' : Signed 16 bit Little Endian, Rate 8000 Hz, Mono). Else my voice sounds all chipped and unclear. I want to record a desktop video, I try "ffmpeg -f alsa -ac 1 -i hw:0 -ar 8000 -f x11grab -video_size 1200x750 -r 30 -i :0.0+50,0 -strict -2 output.mp4" and I get error "Option sample_rate not found." (If I don't s
[23:21:52 CEST] <refrijerator> *garbage*
[23:22:38 CEST] <refrijerator> I found option -ar in the manual, but it does not exist? how then does it know I asked for option sample_rate?
[23:24:09 CEST] <klaxa> either put it before -i hw:0 to use it as an input option for hw:0 or put it after -i :0.0+50,0 to use it as an output option
[23:24:21 CEST] <klaxa> not sure which one is appropriate here
[23:24:51 CEST] <furq> debkad: the moov atom is specific to mp4
[23:25:09 CEST] <furq> ts is specifically supposed to work if it's truncated
[23:25:20 CEST] <ATField> I am having an audio/video sync problem, and I think it's because 1. the original's frame rate (MediaInfo shows "Frame rate: 93.750 fps (512 spf)") doesn't get passed on to the converted output file, and 2. the original audio stream that's 1:39:18.621 long becomes 1:41:05.237 after conversion. How can I fix this / what parameters should I use? The -ar parameter seems to be about sampling rate (kHz, 48.0 on both files), not frame rate.
[23:25:48 CEST] <furq> ATField: pastebin the command line and full output
[23:25:56 CEST] <debkad> furq: ah yes thank you, i used a specific parameter in the first try, that makes sense
[23:26:35 CEST] <debkad> now i use only -c copy
[23:26:58 CEST] <furq> https://www.ffmpeg.org/ffmpeg-protocols.html#http
[23:27:04 CEST] <furq> you can try some of the reconnect options in there
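A hedged sketch of the HTTP reconnect options from that page; the URL is a placeholder and the exact combination needed depends on the server (reconnect_at_eof in particular can loop forever on a genuinely finished stream):

```shell
# Retry dropped HTTP connections instead of giving up. These are
# protocol options, so they go before the input they apply to.
ffmpeg -reconnect 1 -reconnect_streamed 1 -reconnect_at_eof 1 \
       -reconnect_delay_max 10 \
       -i "http://example.com/stream.m3u8" -c copy out.ts
```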
[23:27:27 CEST] <ATField> furq: Just to make sure: should I post the command line logs of the conversion from original to output, the MediaInfo data, or both?
[23:27:36 CEST] <debkad> furq: that was for me?
[23:27:37 CEST] <furq> the first one
[23:27:40 CEST] <furq> yes
[23:27:48 CEST] <debkad> oh thank you so much
[23:27:58 CEST] <ATField> Ok, Ill go reconvert it to make the logs.
[23:28:09 CEST] <refrijerator> klaxa: you saved me, thank you very much
[23:28:31 CEST] <furq> ATField: if you don't have them to hand then the command line and mediainfo might be enough
[23:29:00 CEST] <furq> or ffprobe -show_streams output
[23:29:09 CEST] <klaxa> :)
[23:30:00 CEST] <llogan> refrijerator: use -sample_rate, not -ar for ALSA input option (possibly doesn't make a difference though)
[23:30:25 CEST] <debkad> furq: i think i must use those three reconnect_* options?
[23:30:49 CEST] <furq> probably
[23:30:55 CEST] <furq> i've not needed them in a while
[23:30:58 CEST] <refrijerator> llogan: thanks
[23:31:26 CEST] <NapoleonWils0n> hi all
[23:31:38 CEST] <llogan> refrijerator: and -channels instead of -ac (see "man ffmpeg-devices")
[23:31:43 CEST] <furq> i wonder if reconnect_streamed being off by default means it'll start from the beginning
[23:32:27 CEST] <NapoleonWils0n> i have a script that auto restart when the stream cuts out
[23:32:31 CEST] <NapoleonWils0n> https://github.com/NapoleonWils0n/kodi-playercorefactory/blob/master/bash-scripts/rip-record
[23:32:35 CEST] <CFS-MP3b> Is it possible for ffmpeg to segment on demand? By this I mean - instead of segmenting by segment length, segment when an external event happens, such as a signal being sent
[23:33:07 CEST] <NapoleonWils0n> are auto bitstream filters enabled in latest version on git
[23:33:13 CEST] <llogan> refrijerator: ...and -framerate instead of -r for x11grab.
[23:33:33 CEST] <llogan> and you don't need "-strict -2"
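Putting llogan's corrections together, a sketch of refrijerator's command with the device-specific option names (-channels, -sample_rate, -framerate) placed before their respective inputs, and -strict -2 dropped:

```shell
# ALSA mic at 8 kHz mono + X11 screen grab at 30 fps.
# Device options must precede the -i they configure.
ffmpeg -f alsa -channels 1 -sample_rate 8000 -i hw:0 \
       -f x11grab -framerate 30 -video_size 1200x750 -i :0.0+50,0 \
       output.mp4
```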
[23:33:35 CEST] <NapoleonWils0n> is it possible to enable auto bit stream filters using any config options
[23:33:54 CEST] <debkad> furq: i will work on those options, that really helpful, thanks again
[23:34:23 CEST] <refrijerator> llogan: strict 2 was needed, else audio did not work (I don't know aac, sthg like this), I run ffmpeg version 2.8.6
[23:34:32 CEST] <llogan> your ffmpeg is old
[23:35:09 CEST] <refrijerator> it is in the stable branch in my distribution, gentoo
[23:35:31 CEST] <llogan> 2.8.6 that was old when old was young
[23:35:54 CEST] <refrijerator> ok, I will keep that in mind, but seems to do what I wanted for now
[23:36:03 CEST] <debkad> my ffmpeg ( compiled from git if i remember) is ffmpeg version N-81995-gd790e48
[23:36:34 CEST] <debkad> is that ok?
[23:36:37 CEST] <NapoleonWils0n> i see automatic bitstream filtering in the changelog on github
[23:37:16 CEST] <NapoleonWils0n> im running 3.1.5 at the moment, is it a config option or am i just running an old version
[23:37:54 CEST] <furq> the latest release usually doesn't have all the changes from git
[23:38:13 CEST] <NapoleonWils0n> cheers furq
[23:38:29 CEST] <NapoleonWils0n> so will have to wait for the changes to flow downstream
[23:38:35 CEST] <furq> that change is probably part of the 3.2 branch
[23:38:47 CEST] <furq> you could just use git head
[23:38:50 CEST] <NapoleonWils0n> right ill keep an eye out
[23:40:03 CEST] <NapoleonWils0n> stupid question but what would be the timeframe for 3.2.0 "release"
[23:40:06 CEST] <Substring> hihi ! i'm trying to capture video from stdin as well as the sound output with ffmpeg through ALSA. my audio device has no capture feature, i'd just like to record what i hear on the speakers. Is that possible ?
[23:40:21 CEST] <NapoleonWils0n> Sustring yes it is possible
[23:40:35 CEST] <NapoleonWils0n> i have done it myself i dig out a link to the code i used
[23:40:45 CEST] <Substring> NapoleonWils0n, oh thanks :)
[23:41:59 CEST] <NapoleonWils0n> just trying to find it
[23:42:47 CEST] <NapoleonWils0n> https://github.com/NapoleonWils0n/cerberus/blob/master/ffmpeg/ffmpeg-screen-recording-mic-alsa-loopback.txt
[23:43:02 CEST] <NapoleonWils0n> you create a alsa loopback device
[23:43:21 CEST] <Substring> but it has no capture device
[23:43:29 CEST] <Substring> (it's on a raspberry pi)
[23:43:59 CEST] <NapoleonWils0n> i think you also need the ~/.asoundrc
[23:44:14 CEST] <NapoleonWils0n> with the alsaloop back device
[23:44:56 CEST] <Substring> oh so the .asoundrc is making the magic?
[23:44:58 CEST] <debkad> Substring: the video is inside your machine?
[23:45:24 CEST] <Substring> debkad, it's not even a video, it's the display of the pi
[23:45:27 CEST] <NapoleonWils0n> Substring you need to create the alsa loopback device with code like this
[23:45:29 CEST] <NapoleonWils0n> sudo modprobe snd-aloop pcm_substreams=1
[23:46:05 CEST] <debkad> Substring: you want to capture the whole screen including the sound?
[23:46:13 CEST] <NapoleonWils0n> im using filter complex to fix an issue where the audio was only coming out the left speaker
[23:46:19 CEST] <NapoleonWils0n> so you probably dont need that
[23:46:33 CEST] <Substring> debkad, yes i am
[23:46:36 CEST] <NapoleonWils0n> you can capture both your mic and any audio that is playing
[23:46:53 CEST] <debkad> ok i think the loop idea from NapoleonWils0n worth a try
[23:46:58 CEST] <NapoleonWils0n> what you need to do is set the application to use the alsa loopback device
[23:47:16 CEST] <NapoleonWils0n> so in vlc for example you need to change the output audio device to alsa loopback
[23:47:31 CEST] <NapoleonWils0n> note you wont be able to hear the audio playing but it will record
[23:47:41 CEST] <Substring> well, i'd better make the loopback device the default output then
[23:47:47 CEST] <Substring> oh !
[23:47:50 CEST] <Substring> well
[23:47:54 CEST] <debkad> o_o
[23:47:58 CEST] <Substring> that's just to test if I can do some screen casting
[23:48:19 CEST] <NapoleonWils0n> i have used that technique on several screencasts for youtube
[23:48:36 CEST] <debkad> sounds very logical
[23:49:00 CEST] <debkad> NapoleonWils0n: i guess it is a kind of a virtual device?
[23:49:10 CEST] <Substring> debkad, i'm not using any software where you can choose which device the sound outputs to
[23:49:22 CEST] <NapoleonWils0n> debkad yes exactly its a virtual audio device
[23:49:55 CEST] <NapoleonWils0n> Substring you can set the loopback device to be the default temporarily while you record
[23:49:56 CEST] <debkad> Substring: yeah i understand that, the virtual loop device can capture the sound stream
[23:50:16 CEST] <NapoleonWils0n> then you can switch back when you're finished recording
[23:50:37 CEST] <ATField> furq: Here are: 1. the notes http://pastebin.com/k9LLcLan , 2. the mediainfo data: http://pastebin.com/S8ZFKFcS , http://pastebin.com/DypjaQRG. The command used for creating OUTPUT.ogg was a generic "ffmpeg -i INPUT.dts -c:a libvorbis -q:a 4 OUTPUT.ogg".
[23:51:18 CEST] <Substring> That's some great help you gave me here :) One more question : i'd like the video to be encoded with h264_omx and not libx264 ... what is the command line ?
[23:51:37 CEST] <NapoleonWils0n> i havent used h264_omx before
[23:52:02 CEST] <Substring> well, then on a general matter : how do you define the encoder ?
[23:52:10 CEST] <llogan> -c:v <encoder name>
[23:52:35 CEST] <NapoleonWils0n> script is here:
[23:52:37 CEST] <NapoleonWils0n> https://github.com/NapoleonWils0n/cerberus/blob/master/ffmpeg/ffmpeg-screen-recording-mic-alsa-loopback.txt
[23:52:44 CEST] <NapoleonWils0n> -c:v libx264 -preset veryfast -crf 18 -profile:v high
[23:52:52 CEST] <NapoleonWils0n> -c:a libfdk_aac -ac 2 -ar 44100 -b:a 128k
[23:53:02 CEST] <debkad> Substring: you can use ffmpeg -formats to see the available formats
[23:53:22 CEST] <NapoleonWils0n> you dont need the filter complex in that code that was just a fix for my laptop
[23:53:41 CEST] <NapoleonWils0n> steps are create the alsa loopback device
[23:54:00 CEST] <NapoleonWils0n> use asoundrc to set it as your default audio device
[23:54:03 CEST] <debkad> -codecs* my bad
[23:54:09 CEST] <NapoleonWils0n> then record with ffmpeg
[23:54:17 CEST] <NapoleonWils0n> -f alsa -ac 2 -ar 44100 -i hw:0,0 \
[23:54:23 CEST] <NapoleonWils0n> -f alsa -ac 2 -ar 44100 -i loopout \
[23:54:53 CEST] <llogan> -f alsa -channels 2 -sample_rate 44100
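The loopback flow described above, as a sketch. The device names (hw:0,0 for the mic, the "loopout" PCM) come from NapoleonWils0n's script and ~/.asoundrc and will vary per machine; mixing the two inputs with amix is one option, not something the original script required:

```shell
# 1. Create the virtual ALSA loopback card (point the playing
#    application's audio output at it, e.g. via ~/.asoundrc).
sudo modprobe snd-aloop pcm_substreams=1

# 2. Capture mic + loopback and mix them into one audio track.
ffmpeg -f alsa -channels 2 -sample_rate 44100 -i hw:0,0 \
       -f alsa -channels 2 -sample_rate 44100 -i loopout \
       -filter_complex "amix=inputs=2" \
       -c:a aac -b:a 128k mixed.m4a
```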
[23:54:56 CEST] <NapoleonWils0n> obviously you need to work out the address of the mic
[23:55:45 CEST] <Substring> h264_omx is working fine (just the compression is awful ... nevermind)
[23:56:36 CEST] <NapoleonWils0n> Substring i have some bash scripts for ffmpeg here:
[23:56:38 CEST] <NapoleonWils0n> https://github.com/NapoleonWils0n/kodi-playercorefactory
[23:56:50 CEST] <NapoleonWils0n> and videos:
[23:56:51 CEST] <Substring> my ffmpeg is not compiled with the non free option
[23:56:52 CEST] <NapoleonWils0n> https://www.youtube.com/channel/UCriRR_CzOny-akXyk1R-oDQ
[23:57:05 CEST] <NapoleonWils0n> i always use the non-free
[23:57:23 CEST] <NapoleonWils0n> pain otherwise
[23:57:27 CEST] <Substring> libfdk_aac is part of the non free
[23:57:48 CEST] <NapoleonWils0n> yes i found it usually better quality
[23:57:58 CEST] <Substring> libaac isn't bad either
[23:58:45 CEST] <llogan> support for libfaac has been removed from FFmpeg
[23:59:39 CEST] <Substring> aac
[23:59:49 CEST] <Substring> just the aac codec
[00:00:00 CEST] --- Wed Oct 26 2016

