[Ffmpeg-devel-irc] ffmpeg.log.20180225

burek burek021 at gmail.com
Mon Feb 26 03:05:01 EET 2018


[00:21:48 CET] <utack> is the hdr peak detection for tone mapping also possible in ffmpeg?
[00:25:20 CET] <JEEB> I don't think peak detection is in there
[00:25:29 CET] <JEEB> koda implemented the tone mapping as it was quite some time ago
[00:26:00 CET] <JEEB> (mpv had peak detection then as well I think, but looking at the options it doesn't seem to have one for that)
[00:26:07 CET] <JEEB> it has an option to define the peak, though
[00:26:16 CET] <utack> yeah ok, i could not find any reference to it
[00:26:34 CET] <utack> i guess it does take the peak defined in the input file, or does that get lost in my filter chain?
[00:27:28 CET] <utack> does "crop" do anything bad, drop any info? "crop=3840:2076:0:42,zscale=w=1280:h=692,zscale=transfer=linear,tonemap=mobius,zscale=transfer=bt709:matrix=709:primaries=709,format=yuv420p10"
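The chain utack pastes above, spelled out as a full command (a sketch only: `in.mkv`/`out.mkv` are hypothetical filenames, the encoder choice is an assumption, and the `npl` nominal-peak value may need tuning for the source, tying into the "option to define the peak" point):

```shell
# Crop, downscale, linearize, tone-map HDR highlights with mobius,
# then convert back to BT.709 for SDR output.
ffmpeg -i in.mkv -vf "crop=3840:2076:0:42,\
zscale=w=1280:h=692,\
zscale=transfer=linear:npl=1000,\
tonemap=mobius,\
zscale=transfer=bt709:matrix=709:primaries=709,\
format=yuv420p10" -c:v libx265 out.mkv
```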
[00:28:20 CET] <JEEB> I wouldn't be surprised if a lot of filters would drop the side data, and I'm not even sure if the tonemap filter reads that mastering screen side data
[00:29:08 CET] <JEEB> http://git.videolan.org/?p=ffmpeg.git;a=blob;f=libavfilter/vf_tonemap.c;h=10308bdb16cf28fc3944755b5d5b9b6edfaae59c;hb=HEAD#l117
[00:29:23 CET] <utack> i have to admit i still have no idea what HDR actually tries to do, or why it is supposed to be any good. is there any kind of "hdr for dummies" somewhere that explains what it tries to solve?
[00:29:26 CET] <JEEB> so it will first try to get side data of type AV_FRAME_DATA_CONTENT_LIGHT_LEVEL
[00:29:39 CET] <JEEB> 2390-2 is pretty good as a primer
[00:29:41 CET] <JEEB> I think?
[00:29:54 CET] <JEEB> https://www.itu.int/dms_pub/itu-r/opb/rep/R-REP-BT.2390-2-2017-PDF-E.pdf
[00:30:14 CET] <JEEB> it begins with stuff like "Introduction and design goals for HDR television"
[00:30:15 CET] <utack> cool thanks
[00:30:19 CET] <JEEB> and "common misconceptions on HDR"
[00:31:02 CET] <JEEB> so yea, the tonemap filter will first look at side data of that type, and then take a look if the frame has the mastering side data
[00:31:30 CET] <JEEB> the problem is I don't know if lavfi filter chains make any attempt at passing the side data through
[00:32:25 CET] <utack> i'll just pipe to mpv with and without crop and see if it changes anything at all
[00:33:10 CET] <JEEB> that won't keep the side data anyways if there was any
[00:33:16 CET] <JEEB> as it's side data of the internal AVFrame stuff
[00:33:26 CET] <JEEB> so sharing that between two apps doesn't exactly work :)
[00:34:34 CET] <shastenm> Hey, I got an issue I am trying to solve.  I am trying to render on Kdenlive, and I get this nasty response.  It leaves me this message, "Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead,"  but I don't know what to do.
[00:35:07 CET] <JEEB> there's a function that handles that IIRC
[00:35:23 CET] <shastenm> From what I have researched online, it's an ffmpeg bug, but I need a work around.
[00:36:58 CET] <shastenm> Does anyone have any ideas?
[00:37:05 CET] <JEEB> not a bug, if you want to share stuff from an encoder to a muxer when adding a stream
[00:37:09 CET] <JEEB> avcodec_parameters_from_context
[00:37:13 CET] <JEEB> you have this
[00:38:28 CET] <JEEB> https://ffmpeg.org/doxygen/trunk/group__lavc__core.html#ga0c7058f764778615e7978a1821ab3cfe
[00:39:04 CET] <JEEB> it's even utilized in the examples now as far as I can tell
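A minimal sketch of the call JEEB points at, assuming an already-configured encoder context and an output format context (the function and variable names `setup_stream`, `oc`, `enc_ctx` are hypothetical; `avcodec_parameters_from_context()` is the documented replacement for writing to the deprecated AVStream.codec):

```c
#include <libavcodec/avcodec.h>
#include <libavformat/avformat.h>

/* Add an output stream and copy the encoder's parameters into
 * AVStream.codecpar instead of the deprecated AVStream.codec. */
static int setup_stream(AVFormatContext *oc, AVCodecContext *enc_ctx)
{
    AVStream *st = avformat_new_stream(oc, NULL);
    if (!st)
        return AVERROR(ENOMEM);

    int ret = avcodec_parameters_from_context(st->codecpar, enc_ctx);
    if (ret < 0)
        return ret;

    st->time_base = enc_ctx->time_base;
    return 0;
}
```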
[00:41:47 CET] <shastenm> I can see that it can be worked around, but I am still a little lost.
[00:42:02 CET] <JEEB> it's not worked around, it is how the API is to be used now
[00:42:50 CET] <JEEB> anyways, if you are just a confused user of an application using FFmpeg's libraries then just go poke the authors of that application
[00:43:23 CET] <JEEB> if they are using a recent FFmpeg while not using the APIs right that is their problem and they can ask for help here if they require such
[00:43:40 CET] <JEEB> although on linux you have the problem that distributions are distributing whatever they want
[00:43:51 CET] <JEEB> also I have no idea if that message is an actual error
[00:43:58 CET] <JEEB> your actual failure might be somewhere else
[00:44:09 CET] <shastenm> You won't hear any arguments from me there.
[00:44:59 CET] <shastenm> According to the Arch Wiki, it's a soft error, but nothing will render just the same.
[00:45:33 CET] <JEEB> then it's something else and so far this thing has been zero per cent about FFmpeg
[00:45:45 CET] <JEEB> but rather an application using the FFmpeg libraries' API
[00:45:53 CET] <JEEB> so please go to the support for that application
[00:46:17 CET] <JEEB> if they need support in using the API according to the latest requirements (which do not change often, btw) they are free to ask for help here
[00:46:39 CET] <shastenm> Let me see what I can find from Kdenlive then.
[00:47:53 CET] <JEEB> you'll probably want to give them more information about the specific point of failure than you gave to me :P
[00:50:01 CET] <shastenm> It's hard because nothing shows up in the dmesg or other outputs to point me into a direction.
[00:52:52 CET] <JEEB> feel free to ask them for how the application logs more
[00:52:55 CET] <JEEB> :P
[00:54:12 CET] <shastenm> Yeah, I will do that.
[01:04:43 CET] <JEEB> gagandeep: during configuration of your FFmpeg you set a --prefix (default is /usr/local) and you do make/make install which installs the headers and libraries so that they are usable, and installs relevant pkg-config files
[01:05:29 CET] <JEEB> so to utilize, say, libavutil you need to set PKG_CONFIG_PATH to <your prefix>/lib/pkgconfig and run `pkg-config --libs --cflags libavutil`
[01:05:36 CET] <JEEB> that will give you the flags needed to include and link
[01:05:57 CET] <JEEB> the stuff in the FFmpeg source tree is not meant for usage as-is
[01:06:06 CET] <JEEB> I usually have my own prefix under home
[01:06:22 CET] <JEEB>  /home/jeeb/ownapps/ffmpeg_root etc
[01:06:53 CET] <gagandeep> yeah, i mean looking in the files the includes were not making any sense from relative directory point of view
[01:07:44 CET] <gagandeep> ok, let me try this path config, and if i run into any trouble i'll ask you again, thanks
[01:08:30 CET] <JEEB> set the prefix (or use the default) when configuring FFmpeg, and properly install FFmpeg into that prefix. then utilize PKG_CONFIG_PATH to append your prefix's lib/pkgconfig directory into the search path
[01:08:49 CET] <JEEB> after that when that PKG_CONFIG_PATH variable is set you should be able to just call pkg-config --cflags libavutil for example
[01:09:04 CET] <JEEB> to get the required compiler flags for the includes to work
[01:09:17 CET] <JEEB> a lot of libraries have pkg-config files and it's pretty much the standard with how stuff works
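The steps above put together, assuming a prefix of `/home/jeeb/ownapps/ffmpeg_root` as in the earlier example and a hypothetical `myapp.c`:

```shell
# Point pkg-config at the prefix FFmpeg was installed into,
# then ask it for the flags needed to compile and link against libavutil.
export PKG_CONFIG_PATH="$HOME/ownapps/ffmpeg_root/lib/pkgconfig"
pkg-config --cflags libavutil   # include paths
pkg-config --libs libavutil     # linker flags

# Use both when building your own program:
cc myapp.c $(pkg-config --cflags --libs libavutil) -o myapp
```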
[01:10:29 CET] <gagandeep> this is my first time using libraries and packages where compiler flags are needed, so again, thanks
[02:27:18 CET] <acDC> i want to make a stream (http ts format) from a portion of my computer screen, then use that as a source for p2p broadcast. there are so many instructions and i find it all confusing
[02:27:55 CET] <acDC> using windows 7 pro 32 bit
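Nobody answered acDC here, but a starting point on Windows would be ffmpeg's `gdigrab` screen-capture input; this is a sketch under assumptions (the capture region, port, and encoder settings are placeholders; `-listen 1` makes ffmpeg itself serve the TS over HTTP):

```shell
# Grab a 1280x720 region of the desktop at offset (0,0), encode it,
# and serve it as an MPEG-TS stream over HTTP on port 8080.
ffmpeg -f gdigrab -framerate 30 -offset_x 0 -offset_y 0 ^
       -video_size 1280x720 -i desktop ^
       -c:v libx264 -preset veryfast -tune zerolatency ^
       -f mpegts -listen 1 http://0.0.0.0:8080
```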
[03:19:46 CET] <gagandeep> JEEB: i followed the compilation guide on ffmpeg which was using the ./configure method to compile from the sources
[03:49:36 CET] <Kennedy> how's everyone doing?
[10:36:39 CET] <blap> crosscompiling ffmpeg: build ffmpeg libavdevice/sdl2.c fatal error: SDL.h: "No such file or directory"
[10:37:05 CET] <blap> but file exists /usr/include/SDL2/SDL.h
[10:37:48 CET] <blap> maybe $(SRC_PATH) is wrong
[10:41:02 CET] <blap> ahh my DESTDIR was /usr/local and the includes are in /usr
[10:51:21 CET] <blap> no that didn't fix
[11:04:05 CET] <blap> fixed it
[12:10:44 CET] <blap> hm when i copy the built FFmpeg tree to the target device and run make install, it begins rebuilding
[13:35:42 CET] <DHE> blap: timestamps were probably re-written. that's what make does
[13:36:32 CET] <blap> it was the scp.. i needed "scp -r -p" to preserve mtime
[13:36:34 CET] <blap> ty DHE
[13:37:52 CET] <blap> weird that i get flickering with mpv+opengl but not ffplay+opengl on the pi3 with kms(vc4) overlay
[13:38:23 CET] <JEEB> mpv utilizes opengl much more heavily
[13:38:54 CET] <blap> yeah i saw it was setting-up a shader to do yuv conversion for e.g.  i think
[13:38:58 CET] <sfan5> on the rpi you should be preferring gles anyway
[13:39:26 CET] <blap> ok ty. i see that is set-able
[14:03:25 CET] <shtomik> Guys, tell me, please, what's wrong with my sound? https://www.youtube.com/watch?v=4oe4Ub2fBp4, What am I doing wrong in transcoding audio? Thanks!
[14:03:53 CET] <JEEB> use pastebin or gist to just post your code
[14:06:09 CET] <shtomik> JEEB: Hello, JEEB, 1 min, thanks!
[14:11:05 CET] <shtomik> JEEB: https://pastebin.com/XM6ZXJJP
[14:44:04 CET] <shtomik> JEEB: Are u here? I'm trying to play with filter_spec, but there is no result yet ;(
[16:35:15 CET] <shtomik> Somebody is here ? ;(
[18:31:34 CET] <durandal_1707> shtomik: what? fixed your code?
[18:33:57 CET] <shtomik> durandal_1707: No, but I think I know what's the problem... All is good with sound if I don't capture picture... Problem with nb_samples or with audio sync
[18:37:02 CET] <shtomik> durandal_1707: But I don't know yet how to fix it
[18:41:14 CET] <shtomik> durandal_1707: I add av_buffersink_set_frame_size(), but audio doesn't sync with picture
[18:43:17 CET] <durandal_1707> shtomik: perhaps its encoding video, and thats slow...
[18:45:05 CET] <shtomik> durandal_1707: preset: ultrafast, hm
[18:56:22 CET] <shtomik> durandal_1707: something with pkt duration or nb samples... oh ;(
[19:04:27 CET] <durandal_1707> unlikely
[19:11:15 CET] <shtomik> durandal_1707: If I record only sound, and don't set frame size I have some issue...
[19:12:15 CET] <shtomik> durandal_1707: What do u think about it?
[19:43:59 CET] <durandal_1707> code missing/wrong
[19:49:59 CET] <shtomik> ??
[20:01:29 CET] <fexJ> hello, how can I transcode multicast h264 to multicast mpeg2 with audio ac3
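This question went unanswered in the log; a sketch of such a transcode might look like the following (the multicast addresses, ports, bitrates, and TTL are all placeholders):

```shell
# Read an H.264 multicast TS, re-encode to MPEG-2 video and AC-3 audio,
# and send the result to another multicast group as MPEG-TS.
ffmpeg -i udp://239.1.1.1:1234 \
       -c:v mpeg2video -b:v 8M \
       -c:a ac3 -b:a 384k \
       -f mpegts "udp://239.1.1.2:1234?ttl=4"
```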
[22:26:09 CET] <kms_> hi
[22:26:41 CET] <kms_> how to apply fade videofilter to output framerate
[22:26:41 CET] <shtomik> hi
[22:26:47 CET] <kms_> not input
[22:27:47 CET] <shtomik> Sorry, I don't know
[22:28:04 CET] <kms_> i try to create slideshow, and i need fade out last photo
[22:30:54 CET] <shtomik> https://ffmpeg.org/ffmpeg-filters.html#fade
[22:30:56 CET] <shtomik> ?
[22:46:00 CET] <kms_> it applied to input fps
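The fade filter times itself against the frames it actually receives, so when the output frame rate differs from the input, one way around kms_'s problem is to insert an `fps` filter before `fade`, making the fade compute against the output rate. A sketch for fading out the last photo of a slideshow (filename, duration, and rates are hypothetical):

```shell
# Convert the still to 25 fps first, then start a 1-second fade-out
# at t=4s of the 5-second clip, computed at the output frame rate.
ffmpeg -loop 1 -i last_photo.jpg -t 5 \
       -vf "fps=25,fade=t=out:st=4:d=1" \
       -c:v libx264 -pix_fmt yuv420p out.mp4
```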
[00:00:00 CET] --- Mon Feb 26 2018


More information about the Ffmpeg-devel-irc mailing list