[Ffmpeg-devel-irc] ffmpeg.log.20191108

burek burek at teamnet.rs
Sat Nov 9 03:05:02 EET 2019


[00:02:55 CET] <Atlenohen> DHE: # make ffmpeg_g V=1
[00:02:55 CET] <Atlenohen> Makefile:163: /tests/Makefile: No such file or directory
[00:02:55 CET] <Atlenohen> make: *** No rule to make target '/tests/Makefile'.  Stop.
[01:01:56 CET] <iconoclasthero> is there some way i can compile ffmpeg with codec2 so that it will be a separate binary from the one installed?
[01:02:19 CET] <iconoclasthero> ffmpeg version 3.4.6-0ubuntu0.18.04.1 Copyright (c) 2000-2019 the FFmpeg developers / built with gcc 7 (Ubuntu 7.3.0-16ubuntu3) /configuration: --prefix=/usr --extra-version=0ubuntu0.18.04.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --enable-gpl --disable-stripping --enable-avresample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libass --enable-libbluray
[01:02:19 CET] <iconoclasthero> --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librubberband --enable-librsvg --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-
[01:02:19 CET] <iconoclasthero> libtheora --enable-libtwolame --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libopencv --enable-libx264 --enable-shared
[01:02:49 CET] <iconoclasthero> i can't remember if i compiled that myself or not.  if opus isn't standard, then I did.
[01:03:18 CET] <pink_mist> nothing is standard. ask your packager what their standard is.
[01:06:04 CET] <iconoclasthero> probably not since it was built/installed in april and i wasn't doing stuff with it then.
[01:06:31 CET] <iconoclasthero> so that brings me back to how to build it so i can test codec2 to see if it's worth going any further.
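A minimal sketch of one way to do what iconoclasthero asks: build from source into a private prefix and run the binary straight from the build tree, leaving the packaged /usr/bin/ffmpeg untouched. The paths below are illustrative, and this assumes a source checkout plus the codec2 development headers:

    cd ~/src/ffmpeg
    ./configure --prefix="$HOME/ffmpeg-codec2" --enable-libcodec2
    make -j"$(nproc)"
    ./ffmpeg -i input.wav -c:a libcodec2 output.c2   # run in place; no `make install` needed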
[01:30:39 CET] <Atlenohen> DHE: Do you mean verbose build log?
[01:31:32 CET] <Atlenohen> there is no ffmpeg_g, it doesn't work.
[03:00:24 CET] <montana> which is better: 720p or 1080i?
[03:04:40 CET] <DHE> Depends. I rather like 720p because for TV it usually means I get 60fps without any processing required.
[03:05:44 CET] <montana> you cannot get 60fps with "i" ?
[03:06:06 CET] <DHE> you have to deinterlace it, but yes
[03:06:15 CET] <DHE> I'd rather just get the good image straight up
[03:08:06 CET] <montana> does using interlacing really halve the file size?
[03:08:53 CET] <DHE> I am not qualified to answer such questions. I just know that 1080i at 30fps is roughly the same number of pixels per second as 720p at 60fps
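A quick back-of-the-envelope check of that claim: 1080i at 30 frames/s carries 1920 × 1080 × 30 ≈ 62.2 Mpixels/s (delivered as 60 fields/s of 1920 × 540), while 720p at 60 frames/s carries 1280 × 720 × 60 ≈ 55.3 Mpixels/s, so the two differ by only about 12%.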
[05:31:51 CET] <Reinhilde> montana: you can, if the field rate is 120
[09:46:30 CET] <snooky> Hi. how powerful is ffmpeg? what is possible with ffmpeg? More specifically: can you build real livestreams with ffmpeg which you can edit dynamically in real time?
[12:45:11 CET] <DHE> snooky: yes, sorta. it's not a timeline editor, but you can use the libraries to build on
[12:45:42 CET] <DHE> but you can totally pull your webcam, rescale it and livestream to youtube or something in a single app
[12:46:25 CET] <snooky> Oh. OK. but I mean live editing, editing while it's live. I'll give an example.
[12:48:14 CET] <snooky> https://nopaste.linux-dev.org/?1275933
[12:49:10 CET] <snooky> There I take a video and an audio stream and send them to a v4l2 device and then to alsa ... ok .. everything works .. on the other side I put the two back together and create an rtsp stream .. everything works fine,
[12:49:55 CET] <snooky> but would it be possible to add fades to the stream now? as on live tv, for example, showing a moving message or something in the middle of a video
[12:52:04 CET] <DHE> while there's a fade filter, it takes a timestamp for when it should happen, which makes it unsuitable for live streaming. but you're not REALLY live streaming, you're streaming a pre-rendered file, so you could do it...
[12:52:19 CET] <DHE> https://ffmpeg.org/ffmpeg-filters.html#fade
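A hedged illustration of DHE's point: fade takes an absolute start time (st, in seconds) and a duration (d), which only works when the timeline is known in advance, e.g. when streaming a pre-rendered file. The filenames and RTMP URL here are placeholders:

    ffmpeg -re -i prerendered.mp4 \
        -vf "fade=t=out:st=58:d=2,fade=t=in:st=60:d=2" \
        -c:v libx264 -c:a aac -f flv rtmp://example.com/live/stream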
[14:39:30 CET] <mlok> Is there a difference in terms of CPU/GPU usage when generating fragmented MP4 vs TS chunks?
[14:40:13 CET] <mlok> I mean in regards to the benefits of generating CMAF and transcoding
[14:51:02 CET] <DHE> video encoding is video encoding. there are slightly different NAL encoding mechanisms but they're trivial
[14:54:54 CET] <Atlenohen> Hello
[14:55:25 CET] <mlok> DHE: Otherwise in terms of processing there is not much difference?
[14:55:56 CET] <Atlenohen> I'm trying to figure out the buffering portion of ffmpeg, which is being integrated into another project via the API. Like, is it possible to queue frames/packets without having to manage any "delaying" or anything?
[14:56:29 CET] <Atlenohen> And when the dumping is stopped, it wouldn't kill the buffer of frames that ffmpeg has queued?
[14:58:31 CET] <DHE> Atlenohen: codecs do internal buffering for the purposes of motion compensation look-ahead and bitrate planning. you can usually configure them not to, like libx264 has av_dict_set(&dict, "tune", "zerolatency", 0);
[14:58:54 CET] <DHE> in which case you can call avcodec_send_frame and immediately receive it encoded with avcodec_receive_packet
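A minimal C sketch of what DHE describes, i.e. opening libx264 with the zerolatency tune so that a frame sent in can be received encoded immediately. The function name, dimensions, and omitted error handling are all illustrative:

    #include <libavcodec/avcodec.h>
    #include <libavutil/dict.h>

    static AVCodecContext *open_zerolatency_x264(int w, int h)
    {
        const AVCodec *codec = avcodec_find_encoder_by_name("libx264");
        AVCodecContext *enc  = avcodec_alloc_context3(codec);

        enc->width     = w;
        enc->height    = h;
        enc->pix_fmt   = AV_PIX_FMT_YUV420P;
        enc->time_base = (AVRational){1, 30};

        AVDictionary *opts = NULL;
        av_dict_set(&opts, "tune", "zerolatency", 0); /* disable look-ahead buffering */
        avcodec_open2(enc, codec, &opts);
        av_dict_free(&opts);
        return enc;
    }

With that setup, an avcodec_send_frame() followed directly by avcodec_receive_packet() should yield the encoded packet without the usual multi-frame delay.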
[15:02:04 CET] <Atlenohen> Oh, right, actually it's the other way around: I do want to buffer them, I just don't want the buffer killed when the program stops recording; internally it shouldn't kill the whole thing.
[15:02:31 CET] <Atlenohen> I see "av_codec_frame_free" and various context_free thingies
[15:03:32 CET] <Atlenohen> In fact I'm also looking to see if there's a way to throttle the speed of ffmpeg's encoding, like a limit on how many frames per second it will encode, with the rest simply queued to some buffer or to a file on storage.
[15:04:25 CET] <Atlenohen> So after that portion is complete, ffmpeg should still run internally, finishing up the queue; not sure if this is done automatically. I don't want to discard any buffers or queues.
[15:04:33 CET] <DHE> well ffmpeg just does as it's told. if you're using the API, just capture frames at a fixed rate. if you're using hardware capture, that usually just happens on its own anyway
[15:04:53 CET] <DHE> avcodec_send_frame(codecContext, NULL); // to signal end of stream.
[15:05:35 CET] <Atlenohen> Right, so that would signal the end, and I want to ... well, I'd kinda want to figure out how full the buffers/queues are, to estimate when to shut it down completely.
[15:05:42 CET] <DHE> then collect all your frames from the output until it says it's hit EOF, and you're done
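A minimal C sketch of that drain sequence, assuming an already-opened encoder `enc`, a muxer context `mux`, and a stream index `st_index` (all illustrative names; error handling trimmed):

    AVPacket *pkt = av_packet_alloc();
    avcodec_send_frame(enc, NULL);            /* signal end of stream */
    for (;;) {
        int ret = avcodec_receive_packet(enc, pkt);
        if (ret == AVERROR_EOF)
            break;                            /* encoder fully drained */
        if (ret < 0)
            break;                            /* real error; handle properly */
        pkt->stream_index = st_index;
        av_interleaved_write_frame(mux, pkt); /* writes and unrefs the packet */
    }
    av_packet_free(&pkt);

Timestamps are assumed to already be in the stream time base; otherwise rescale with av_packet_rescale_ts() before writing.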
[15:06:36 CET] <Atlenohen> x264 may be used, but primarily it's FFV1 and other lossless codecs that are involved
[15:07:32 CET] <DHE> the concept remains the same. ffv1 is listed as doing this sort of buffering but I suspect not to the degree that x264 does it
[15:09:08 CET] <Atlenohen> Right, I'll worry about codec-specific stuff later then.
[15:09:54 CET] <DHE> the point is the API for interacting with codecs is unified. from your standpoint the only issue should be selecting a codec and selecting any codec-specific options (like x264's zerolatency mode, etc)
[15:12:45 CET] <Atlenohen> I read up on "avio_closep"; one of the lines says "the buffer is automatically flushed" ... right now the program does this right away after freeing the context; that probably isn't optimal, right?
[15:14:05 CET] <Atlenohen> I need to read up on the docs; they should be okay, right? I'm doing this against the 4.1.3 version but the final build will use 4.2.1
[15:36:44 CET] <Atlenohen> Currently it runs av_write_trailer before any signaling (nonexistent, actually) and before flushing and freeing the context; those flushes may not actually cover the frames' data themselves, IMO.
[15:37:55 CET] <Atlenohen> I'm doing this totally from scratch, learning the ffmpeg API. I really appreciate all the support; I'll try to dig into it as much as I can, but it's probably going to take me like a couple of weeks (not that I'd be hard at it every day)
[15:42:02 CET] <DHE> allocate your context, set the output file handle (if not done as part of context creation already), set up the stream information and AVCodecParameters, call avformat_write_header, do your encoding and call av_write_frame() (or the interleaved version, av_interleaved_write_frame()) over and over, call av_write_trailer, close file, free contexts
[15:42:53 CET] <DHE> There's a field, AVFormatContext->pb which is the AVIO object used for file IO, or whatever method you're using (HTTP, etc)
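Condensing DHE's outline into a C sketch (error handling and the AVFMT_NOFILE check omitted; `enc` is an already-opened encoder context and "out.mkv" is a placeholder output name):

    #include <libavformat/avformat.h>

    AVFormatContext *mux = NULL;
    avformat_alloc_output_context2(&mux, NULL, NULL, "out.mkv");
    AVStream *st = avformat_new_stream(mux, NULL);
    avcodec_parameters_from_context(st->codecpar, enc); /* set up AVCodecParameters */
    st->time_base = enc->time_base;
    avio_open(&mux->pb, "out.mkv", AVIO_FLAG_WRITE);    /* the ->pb AVIO object DHE mentions */
    avformat_write_header(mux, NULL);
    /* ... encoding loop: av_interleaved_write_frame(mux, pkt) per packet ... */
    av_write_trailer(mux);
    avio_closep(&mux->pb);                              /* close the file handle */
    avformat_free_context(mux);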
[15:46:55 CET] <Atlenohen> It's file IO all right, I saw that, yeah, and it looks like currently there's also some WritePacket / ReceivePacket stuff used (it is based on an older API as well). That's not needed anymore?
[15:48:39 CET] <DHE> ffmpeg is multiple libraries. the container (mp4, mkv, etc) muxer/demuxer layer is largely independent of the codec (encoding, decoding) layer
[15:54:36 CET] <Atlenohen> Oh, so it's possible to touch the codec layer, but it's not standard. Is there a way to tell which layer is which in code, or do the API docs separate these things?
[16:11:51 CET] <Atlenohen> So if I follow your statement, I should call avio_closep after av_frame_free but before av_free_context
[16:13:33 CET] <DHE> you call av_write_trailer to indicate you're done with the file, then you're good to close the file handle and free everything
[16:14:23 CET] <DHE> generally functions are prefixed with their family: avformat_*, avcodec_*, etc. though there are some old functions that are just av_*, it should be obvious which family they belong to
[16:14:23 CET] <Atlenohen> But the trailer goes after the signal, like you mentioned before: avcodec_send_frame(codecContext, NULL);
[16:14:59 CET] <Atlenohen> Oh thanks for that.
[16:15:20 CET] <DHE> that's at the codec layer. then you call avcodec_receive_packet() until you hit EOF, av_write_frame() (badly named function!) all of that into your file, and then you're done with the encoding step and can flush the avformat layer
[16:21:25 CET] <Atlenohen> I guess the av_* ones are more common/generic, so av_write_frame should be like an avcodec or avformat write frame? Also I see there's av_interleaved_write_frame used, but this is a capture; frames in advance may not exist (unless queued, unless we throttle it on purpose, but no idea about that now)
[16:21:53 CET] <Atlenohen> And there's no audio, just video.
[16:32:52 CET] <Atlenohen> The last thing makes it sound as if it buffers packets in memory and then writes everything to file?
[16:35:56 CET] <Atlenohen> But it's going to take time for me to just understand the flow, not only the meanings; even if I spent the whole day reading what every function does, I still wouldn't understand the whole thing.
[16:37:20 CET] <Atlenohen> So where you said "do your encoding" you meant the avcodec_receive_packet() stuff, I'm guessing.
[16:53:22 CET] <bviktor> hi, how do i convert an audio file ={.mp3, .mp4, .ogg, .flac} into an opus encoded file in .caf format? all i get is "[caf @ 000000000255c0a0] unsupported codec Could not write header for output file #0 (incorrect codec parameters ?): Invalid data found when processing input"
[16:53:55 CET] <Hello71> it would appear that you cannot.
[16:54:26 CET] <bviktor> why is it here then? https://www.ffmpeg.org/doxygen/4.0/caf_8c_source.html
[16:55:05 CET] <Hello71> do you have the right version
[16:55:14 CET] <Hello71> https://github.com/FFmpeg/FFmpeg/commit/b4093e60c51af493a6dad7819264ef769736227f
[16:55:27 CET] <bviktor> i've tried 4.2.1 and nightly
[16:55:34 CET] <bviktor> both win32 and win64...
[16:55:35 CET] <Hello71> so you have the wrong version, and also "muxing codec currently unsupported"
[16:55:45 CET] <Hello71> I think
[16:55:50 CET] <Hello71> hm
[16:56:03 CET] <bviktor> i don't understand, the commit you linked is from 2017
[16:56:09 CET] <Hello71> what is your full command line
[16:56:19 CET] <Hello71> I was looking at the wrong file
[16:56:22 CET] <bviktor> oh i had like a million command lines already
[16:56:45 CET] <bviktor> but example: ffmpeg -i input.mp3 -c:a copy output.caf
[16:56:57 CET] <Hello71> that doesn't make it opus though
[16:56:59 CET] <Hello71> but sure
[16:57:06 CET] <bviktor> yeah, i tried first just putting stuff into caf
[16:57:11 CET] <Hello71> works fine for me
[16:57:15 CET] <pagios> Hello, how can i exit ffmpeg if no stream is detected as input? by default it hangs
[16:57:33 CET] <Hello71> do you have some non-mp3 mp3 file
[16:57:35 CET] <bviktor> so putting mp3 into .caf worked for me too. but putting opus into caf didn't
[16:57:46 CET] <Hello71> "what is your full command line"
[16:58:00 CET] <Hello71> bviktor: "what is your full command line"
[16:58:03 CET] <Hello71> pagios: "what is your full command line"
[16:58:28 CET] <bviktor> ffmpeg -i input.opus -acodec copy output.caf
[16:58:33 CET] <pagios> Hello71, ffmpeg -i rtmp:/?... -vf .. -vb .. -f hls ..
[16:59:40 CET] <Hello71> iirc with rtmp it is not possible to know in general
[16:59:58 CET] <Hello71> you could put a timeout but it could be that the stream is very slow
[17:00:02 CET] <Hello71> check the manual
[17:00:20 CET] <Hello71> bviktor: try ffmpeg -i input -vn -c:a libopus output.caf
[17:00:30 CET] <Hello71> could be wrong about rtmp though
[17:00:50 CET] <pagios> Hello71, how can i put a timeout?
[17:00:59 CET] <Hello71> pagios: check the manual, idk
[17:01:08 CET] <bviktor> Hello71 that unfortunately gives the same error
[17:01:11 CET] <Hello71> or google "ffmpeg rtmp timeout"
[17:01:17 CET] <Hello71> bviktor: what is your whole output then
[17:02:05 CET] <bviktor> https://paste.fedoraproject.org/paste/OmopUniooxISggfBHC82Bg/raw
[17:02:29 CET] <Hello71> "Copyright (c) 2000-2017" pretty sure that's your problem
[17:02:43 CET] <Hello71> idk if "85588" is a good number but 2017 doesn't seem right.
[17:03:22 CET] <bviktor> dude it's the nightly
[17:03:42 CET] <Hello71> from... when?
[17:03:44 CET] <bviktor> they apparently forgot to update the copyright year, that happens all the time tbh
[17:03:44 CET] <Hello71> 2017?
[17:04:01 CET] <bviktor> from 10 minutes ago
[17:04:09 CET] <Hello71> that's... not how it works.
[17:04:21 CET] <bviktor> https://ffmpeg.zeranoe.com/builds/
[17:04:25 CET] <Hello71> the year is in the source code. it's not like the builder changes it
[17:04:26 CET] <Hello71> https://github.com/FFmpeg/FFmpeg/commit/6108805
[17:04:33 CET] <Hello71> you have this version.
[17:05:15 CET] <bviktor> fml
[17:05:37 CET] <bviktor> there's an ffmpeg lying around in Program Files installed by... whatever, overriding my local ffmpeg lol
[17:07:06 CET] <bviktor> can't believe i wasted this much of my and your time with that lol
[17:07:07 CET] <bviktor> ty
[17:09:00 CET] <aleksandrdvorkin> hi
[17:09:43 CET] <aleksandrdvorkin> the git version of Kodi complains about VAAPI requested but not found
[17:09:52 CET] <Hello71> so...
[17:10:06 CET] <Hello71> what do you want us to do about it
[17:10:49 CET] <Hello71> lol
[17:10:56 CET] <aleksandrdvorkin> acknowledged
[17:11:12 CET] <aleksandrdvorkin> the make says to report to #ffmpeg
[17:11:27 CET] <pagios> -i rtmp:// -timeout 2  is not exiting..
[17:11:35 CET] <pagios> ps -ef | grep ffmpeg shows it hanging in memory
[17:11:35 CET] <Hello71> pagios: put it on the other side.
[17:11:42 CET] <pagios> ok
[17:12:04 CET] <aleksandrdvorkin> i seem to have installed the vaapi libraries
[17:12:07 CET] <Hello71> aleksandrdvorkin: advise you to ask #kodi.
[17:12:15 CET] <aleksandrdvorkin> ok
[17:13:44 CET] <Hello71> where is everyone today? DHE: too boring?
[17:14:57 CET] <pagios> stderr: [rtmp @ 0x559c53e7e8a0] Cannot open connection tcp://....1935?listen&listen_timeout=2000 rtmp://...:1935/mediaengine/testblah: Address already in use
[17:15:05 CET] <pagios> Hello71,
[17:15:17 CET] <Hello71> sure, seems plausible
[17:16:12 CET] <pagios> it always gives an error if i do ffmpeg -timeout 2 -i rtmp://...
[17:17:47 CET] <Hello71> what is your full command line.... and also the full output
[17:23:33 CET] <pagios> ffmpeg -timeout 2 -i rtmp://127.0.0.1:5555 -vf scale=640:480  -vb 128k -f hls index.m3u8
[17:23:37 CET] <pagios> Hello71,
[17:24:03 CET] <Hello71> so... what is your m3u8? and what is the output?
[17:24:35 CET] <pagios> stderr: [rtmp @ 0x559c53e7e8a0] Cannot open connection  ...
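A hedged aside on what is probably going on here: for ffmpeg's native RTMP protocol, the timeout option implies listen mode, so ffmpeg tries to bind the port itself, which would explain both the listen&listen_timeout URL in the error above and the "Address already in use" message. For a plain client-side connect/read timeout, the generic rw_timeout option (in microseconds) is the usual knob; the URL below is a placeholder:

    ffmpeg -rw_timeout 5000000 -i rtmp://127.0.0.1:1935/app/stream \
        -vf scale=640:480 -b:v 128k -f hls index.m3u8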
[17:29:53 CET] <pagios> meh
[19:00:04 CET] <orev> i'm curious what the "standard" crf for av1 encoding is?  in the wiki, 23 is mentioned as the default for h264, and 28 for h265.  the av1 page gives 30 as an example, but doesn't give any other guidelines like the h264/h265 pages do
[20:54:27 CET] <CounterPillow> orev: looks like it defaults to 32, which is then mapped to AOM's actual cq-level setting
[20:56:06 CET] <CounterPillow> source: line 576 of libavcodec/libaomenc.c
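So, absent firmer guidance, values around that default are a reasonable starting point. An illustrative constant-quality invocation (for libaom-av1, -b:v 0 is needed so that -crf acts as pure constant quality; filenames are placeholders):

    ffmpeg -i input.mp4 -c:v libaom-av1 -crf 30 -b:v 0 output.mkv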
[21:40:55 CET] <dissected> is there a way to tell ffmpeg to quit recording after detecting x duplicate frames ?
[21:41:38 CET] <Reinhilde> if there is, I've never found it nor needed to use it
[21:46:46 CET] <`St0ner> how do i edit the batch file at https://stackoverflow.com/questions/38477974/quickly-check-the-integrity-of-video-files-inside-a-directory-with-ffmpeg-with-w to only output if there is an error? basically i want to delete all corrupted videos
[21:47:27 CET] <Hello71> batch files suck.
[21:47:47 CET] <dissected> ah I have a lead in the freezedetect filter
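For the record, freezedetect only reports frozen spans (as frame metadata and log messages); it does not stop the process by itself, so a wrapper watching ffmpeg's output and killing it would still be needed. An illustrative detection-only run, with the filter's noise (n) and duration (d) options set explicitly and a placeholder input name:

    ffmpeg -i input.mp4 -vf "freezedetect=n=-60dB:d=2,metadata=mode=print" -an -f null -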
[21:50:50 CET] <`St0ner> open to other scripts usable in windows
[21:55:49 CET] <angular_mike> how can I add a fast zoom effect on a video as it is playing?
[21:56:04 CET] <angular_mike> zoompan seems to turn it into a single frame
[21:58:39 CET] <durandal_1707> angular_mike: change the fps and duration options, don't just use the defaults
[22:27:25 CET] <angular_mike> durandal_1707: what should those be set to?
[22:27:37 CET] <angular_mike> why can't ffmpeg determine them from video file?
[23:06:32 CET] <durandal_1707> angular_mike: the filter by default zooms still images
[23:06:55 CET] <durandal_1707> use d=1
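Putting durandal_1707's advice together, one hedged recipe for a continuous zoom-in on a regular video: d=1 emits one output frame per input frame (instead of treating each input frame as a still image), while the zoom increment, limit, and fps below are illustrative:

    ffmpeg -i input.mp4 \
        -vf "zoompan=z='min(zoom+0.0015,1.5)':d=1:x='iw/2-(iw/zoom/2)':y='ih/2-(ih/zoom/2)':fps=30" \
        output.mp4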
[00:00:00 CET] --- Sat Nov  9 2019

