[Ffmpeg-devel-irc] ffmpeg.log.20171105

burek burek021 at gmail.com
Mon Nov 6 03:05:01 EET 2017


[00:31:35 CET] <kepstin> it wouldn't be possible in general on any codec that does prediction
[00:31:45 CET] <kepstin> so you could do it on, say, mjpeg... :/
[00:42:11 CET] <Cracki> no, you could patch up prediction too
[00:43:19 CET] <Cracki> might not be cheaper than a reencode
[00:43:27 CET] <Cracki> then again, might be...
[01:03:01 CET] <plexigras> im getting this error `ERROR: gnutls not found using pkg-config`
[01:03:48 CET] <DHE> you need to install the '-dev' packages for any software you want to compile with. so maybe try installing gnutls-dev or such
[01:04:41 CET] <plexigras> oh ok
[01:05:24 CET] <plexigras> what is that package on arch
[01:05:54 CET] <utack> ah who knows man...check an aur pkgbuild for ffmpeg git maybe and see what it wants to install before building
[01:06:07 CET] <utack> https://aur.archlinux.org/cgit/aur.git/plain/PKGBUILD?h=ffmpeg-git
[01:06:13 CET] <utack> a ton of stuff
[01:06:32 CET] <utack> but makedepends are pretty sane
[01:09:04 CET] <plexigras> thanks i will just go and reinstall all of them maybe that helps
[01:31:22 CET] <plexigras> nope i still get the same error
[01:34:36 CET] <furq> plexigras: check config.log
[01:34:41 CET] <furq> there'll be an actual error at the end
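A hedged sketch of that check (the ffbuild/ path is used by newer source trees; older ones keep config.log at the top level), plus the earlier Arch question: the GnuTLS headers normally ship in the regular gnutls package there, with pkgconf providing pkg-config:

    grep -i -B2 -A10 gnutls ffbuild/config.log    # or just config.log on older trees
    sudo pacman -S --needed gnutls pkgconf        # Arch has no separate -dev split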
[03:03:14 CET] <zash> Hey. I'm trying to turn a pbm stream into a video but so far the only success I had was by turning each frame into a png first, but it'd be nice if I could use pbm_pipe. command and output: https://pastebin.com/s2snpJsV
[03:05:29 CET] <c_14> what is /tmp/mtrace.ppm?
[03:07:53 CET] <zash> output of a script in netpbm P1 format, 1024x1024 px
[03:08:28 CET] <c_14> Is it a file? A named pipe? Unix socket?
[03:08:34 CET] <zash> A file
[03:09:16 CET] <zash> I've tried to pipe from the script to ffmpeg with the same result
[03:10:06 CET] <Cracki> so what's the problem? looks like it's producing output at 30 Mbit/s
[03:10:30 CET] <zash> I get a 0 second long file
[03:10:36 CET] <Cracki> single frame?
[03:11:49 CET] <zash> Duration: 00:00:00.04, start: 0.000000, bitrate: 736 kb/s  Stream #0:0: Video: vp9 (Profile 1), gbrp(pc, gbr/unknown/unknown, progressive), 1024x1024, SAR 1:1 DAR 1:1, 25 fps, 25 tbr, 1k tbn, 1k tbc (default)
[03:12:11 CET] <Cracki> sounds like a single frame
[03:12:30 CET] <Cracki> pbm pipe is perhaps expecting different input
[03:12:32 CET] <zash> Some of the variations I've tried have aborted with "there's no output stream" or somesuch
[03:12:43 CET] <Cracki> what does that file look like
[03:12:54 CET] <Cracki> is pbm a video format, or a still image format?
[03:13:06 CET] <zash> An image format
[03:13:16 CET] <Cracki> then ffmpeg will read one frame
[03:13:46 CET] <Cracki> you have the option of piping out raw bitmap data
[03:14:02 CET] <Cracki> ffmpeg will split that into frames. but you'll have to tell it resolution and depth explicitly
[03:14:24 CET] <c_14> zash: works fine for me
[03:14:29 CET] <c_14> do you have output I can test with?
[03:14:46 CET] <zash> It's basically "P1\n1024 1024\n" followed by space-separated lines of "1" or "0"
[03:14:49 CET] <Cracki> I'm trying to find anything on this pbm_pipe, but the formats docs online don't mention it at all
[03:15:34 CET] <c_14> Cracki: alias for image2pipe
[03:15:48 CET] <Cracki> https://ffmpeg.org/ffmpeg-formats.html
[03:15:48 CET] <c_14> which probably forces format detection
[03:16:00 CET] <c_14> just because it's not in the docs doesn't mean it doesn't exist
[03:16:25 CET] <c_14> I don't think image2pipe is in the docs at all
[03:16:25 CET] <Cracki> it needs to be documented
[03:16:39 CET] <Cracki> don't expect users to guess or read the source code
[03:18:56 CET] <geri> hi
[03:19:16 CET] <zash> Heh, this all started with someone noticing pbm_pipe in `ffmpeg -formats|grep -i pbm`
[03:19:35 CET] <furq> looking good zash
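For reference, a hedged sketch of the two routes discussed here; the script name, frame rate and VP9 output are placeholders, and whether the parser splits plain (P1) images cleanly may depend on the build, which is presumably why the rawvideo fallback comes up later:

    # concatenated PBM stills on stdin, split by the pbm_pipe demuxer
    your_script | ffmpeg -f pbm_pipe -framerate 25 -i - -pix_fmt yuv420p -c:v libvpx-vp9 out.webm

    # rawvideo fallback: assumes the script can emit packed 1-bit pixels with no headers
    your_script | ffmpeg -f rawvideo -pixel_format monow -video_size 1024x1024 -framerate 25 -i - -pix_fmt yuv420p -c:v libvpx-vp9 out.webm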
[03:19:41 CET] <geri> which codec can take a bgra images and save as a video?
[03:19:50 CET] <furq> geri: loads of them
[03:20:01 CET] <furq> if you want it lossless then probably ffv1
[03:20:23 CET] <geri> ok
[03:20:33 CET] <geri> furq: i use ffv1
[03:20:35 CET] <geri> opencv
[03:27:49 CET] <qmr> Any suggestions on rotating video 180 degrees with minimal generation loss?
[03:28:41 CET] <Cracki> that's not lossy
[03:28:52 CET] <Cracki> recompressing will probably be
[03:29:04 CET] <Cracki> or set some transform matrix in the container to display it rotated
[03:37:17 CET] <qmr> I'm not going to archive uncompressed 1080P so yes it will be lossy, need to recompress
[03:37:20 CET] <qmr> I thought that was implied
[03:37:28 CET] <qmr> this fucking camera doesn't have video flip wtf
[03:39:10 CET] <zash> Do video codecs/containers commonly have something like the flag in some image formats that says that it's rotated?
[03:39:19 CET] <furq> yes they do
[03:39:24 CET] <furq> mp4 definitely does
[03:39:34 CET] <furq> qmr: -c copy -metadata:s:v:0 rotate=180
[03:39:38 CET] <furq> if your player supports it
[03:39:59 CET] <furq> or rotate=0 if your camera already set the flag and now the video is upside-down, which you can check with ffprobe
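Putting furq's suggestion into a full command, with a hedged re-encode variant for players that ignore the flag (file names and the CRF value are just placeholders; hflip plus vflip amounts to a 180-degree rotation):

    # lossless: only rewrite the container-level rotation flag
    ffmpeg -i in.mp4 -c copy -metadata:s:v:0 rotate=180 out.mp4

    # re-encode if the player ignores the flag
    ffmpeg -i in.mp4 -vf "hflip,vflip" -c:v libx264 -crf 18 -c:a copy out.mp4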
[03:41:58 CET] <geri> furq: do you have an example how to write a video using lib ffmpeg?
[03:42:08 CET] <furq> not other than the ones that are part of the docs
[03:42:32 CET] <furq> https://www.ffmpeg.org/doxygen/trunk/examples.html
[03:45:17 CET] <zash> Let's try this rawvideo thing furq suggested
[03:45:19 CET] <qmr> furq:  You are a gentleman and a scholar
[03:45:37 CET] <qmr> I don't get why there is no flip in this camera!!!  other generations have it
[03:48:52 CET] <geri> furq: https://gist.github.com/Arnold1/0369ce50f72b74a52bdf84d7534af242
[03:49:03 CET] <geri> sorry: https://gist.github.com/yohhoy/52b31522dbb751e5296e
[04:12:34 CET] <geri> i try to use cmake with ffmpeg. FIND_PACKAGE( FFMPEG REQUIRED ) gives me: CMake Error at Example/CMakeLists.txt:8 (FIND_PACKAGE):   By not providing "FindFFMPEG.cmake" in CMAKE_MODULE_PATH this project has   asked CMake to find a package configuration file provided by "FFMPEG", but   CMake did not find one.
[04:12:42 CET] <geri> but i installed ffmpeg
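For context: FFmpeg installs pkg-config .pc files rather than a CMake package config, so FIND_PACKAGE(FFMPEG) only works if the project ships its own FindFFMPEG.cmake; the usual alternative is CMake's FindPkgConfig module (pkg_check_modules). A quick hedged check that pkg-config can actually see the installed libraries (the library list is just an example):

    pkg-config --cflags --libs libavformat libavcodec libavutil libswscale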
[05:27:20 CET] <_Vi> How do I make FFmpeg wait for incoming data starting to flow in as long as needed, not bailing out with "[sdp] Could not find codec parameters for stream 0" after 10 seconds?
[05:28:34 CET] <_Vi> Something like -analyzeduration 1000000000000 also doesn't help.
[05:45:06 CET] <furq> _Vi: you need to set -probesize as well iirc
[06:02:24 CET] <_Vi> furq, Still bails out.
[06:02:57 CET] <_Vi> It primarily bails out not because of analysing, but because of a timeout. Yet the -timeout option is not accepted.
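For reference, a hedged sketch of the probesize/analyzeduration combination being suggested (the SDP file name and output are placeholders; -protocol_whitelist is usually needed when reading an .sdp file, and this only stretches the analysis window, so any demuxer-level read timeout is a separate issue, as noted above):

    ffmpeg -protocol_whitelist file,udp,rtp -probesize 100M -analyzeduration 100M -i session.sdp -c copy out.mkv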
[12:42:02 CET] <voltagex> hi, is anyone able to compile libbluray into ffmpeg with --enable-static? It's failing to find (compile) libbluray, see http://sprunge.us/QEGJ
[12:44:00 CET] <JEEB> IIRC bug reports were made to that library among others
[12:44:26 CET] <JEEB> their pkg-config file doesn't contain Libs.private=-dl
[12:44:39 CET] <JEEB> which is where dlopen and such would be gotten
[12:44:53 CET] <JEEB> the static library format is archaic which is why these dependencies are not in the library itself
[12:45:29 CET] <JEEB> this started popping out after we fixed dependency handling in the configure script, and started trusting the pkg-config pc files' contents more
[12:45:44 CET] <JEEB> (basically before "-ldl" got added globally from some other place)
[12:45:49 CET] <voltagex> ah, was googling and coming up empty
[12:45:54 CET] <voltagex> should I report another bug?
[12:46:12 CET] <JEEB> I don't think so, libbluray static breakage is known and I think already communicated to libbluray
[12:46:27 CET] <JEEB> --extra-libs="-ldl" I think is the way to work around it for now
[12:46:31 CET] <JEEB> -34
[12:46:51 CET] <voltagex> --extra-libs="-ldl" on ffmpeg's configure line?
[12:47:21 CET] <JEEB> yes
[12:47:29 CET] <JEEB> or you can edit the libbluray .pc file if you want
[12:47:31 CET] <JEEB> to have the proper flag
[12:47:45 CET] <voltagex> ah yep, that's fixed it, thanks
[12:48:02 CET] <JEEB> oh yay
[12:48:05 CET] <JEEB> libbluray fixed their jack
[12:48:05 CET] <JEEB> http://git.videolan.org/?p=libbluray.git;a=blobdiff;f=src/libbluray.pc.in;h=d6a34b1efc185200d2137aaf040f6b01839668c6;hp=2fb51bb584e2073536fd7e7e0d7440f000746130;hb=bd887f4e9d8e81f2656fe0d0494bf20af852a23c;hpb=b5216adb9af57e3089260132d41d5254de90da68
[12:48:15 CET] <JEEB> so I guess the next release will have this fixed :P
[12:48:47 CET] <voltagex> yeah, in theory ffmpeg just needs to update their git modules file
[12:48:48 CET] <JEEB> also seems like they're using Requires.private instead of Libs.private, but I guess that will work?
[12:49:01 CET] <JEEB> voltagex: we don't have submodules to any dependencies
[12:49:16 CET] <JEEB> it's 100% up to the user to select which version to build against
[12:49:19 CET] <voltagex> oh right, was getting confused with libudfread
[12:49:20 CET] <voltagex> sorry
[12:49:42 CET] <JEEB> but yea, great to know that libbluray fixed their jack
[12:49:47 CET] <voltagex> I pulled the latest libbluray from git, did I miss something?
[12:49:57 CET] <JEEB> check the pc file in your prefix
[12:50:02 CET] <JEEB> /your/prefix/lib/pkgconfig
[12:50:10 CET] <JEEB> should have libbluray.pc
[12:50:17 CET] <JEEB> check its contents
[12:50:57 CET] <voltagex> Libs.private: -ldl
[12:51:04 CET] <voltagex> but I still had that issue :/
[12:51:46 CET] <JEEB> oh right, did you have pkg-config configured for static?
[12:51:55 CET] <JEEB> --pkg-config-flags="--static"
[12:51:58 CET] <JEEB> I think it was like that
[12:52:04 CET] <JEEB> it adds --static to the pkg-config line
[12:52:29 CET] <JEEB> because if you look at the last check that fails for libbluray that one doesn't have -ldl there
[12:53:10 CET] <voltagex> what needs that --pkg-config-flags parameter added? ffmpeg?
[12:54:18 CET] <JEEB> ffmpeg's configure line since you're building against static libraries
[12:54:31 CET] <JEEB> otherwise the private things don't get added by pkg-config
[12:54:47 CET] <JEEB> since shared libraries have those embedded, unlike the archaic format for static libraries
[12:59:51 CET] <voltagex> yep, that also works
[13:00:23 CET] <JEEB> yeh, that's the one thing you always had to do when dealing with static libraries
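Putting the pieces together, a hedged sketch of a static-friendly configure line (the prefix path is a placeholder; --extra-libs="-ldl" is only needed as a workaround while the installed libbluray.pc lacks the private -ldl):

    PKG_CONFIG_PATH=/your/prefix/lib/pkgconfig ./configure \
        --pkg-config-flags="--static" \
        --enable-libbluray \
        --extra-libs="-ldl"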
[13:01:13 CET] <voltagex> guess that's what I get for using helper scripts until now :P
[13:01:41 CET] <JEEB> well due to the dependency handling being more of a mess before it might have accidentally worked
[13:01:54 CET] <JEEB> "we added this and this commonly used library by accident because X,Y,Z"
[13:02:25 CET] <JEEB> I think the funniest is when you have something built in C++
[13:02:42 CET] <JEEB> and the projects say that they can't find out which C++ standard library they were linked against
[13:03:00 CET] <JEEB> so they cannot properly fill up their pc file ;)
[14:41:26 CET] <_Vi> Does FFmpeg support "a=group:BUNDLE" in SDPs? (draft-ietf-mmusic-sdp-bundle-negotiation-39) to reuse the same RTP port for multiple streams?
[17:30:07 CET] <geri> hi does ffmpeg support GPU transcoding?
[17:30:12 CET] <JEEB> yes
[17:30:19 CET] <JEEB> with various APIs
[17:31:07 CET] <pomaranc> geri: I'm using nvidia nvenc/nvdec chips without any major problems
[17:31:29 CET] <geri> how much speedup do u notice?
[17:31:45 CET] <JEEB> I don't think it's much faster than x264's ultrafast
[17:31:47 CET] <JEEB> for AVC
[17:31:49 CET] <pomaranc> I'm using it for live, so there is no speedup
[17:32:10 CET] <geri> live?
[17:32:14 CET] <JEEB> the main point seems to be that the "cloud" providers somehow make you pay more for an OK CPU than for a GPU. which doesn't make any sense
[17:32:20 CET] <pomaranc> geri: live streaming
[17:32:25 CET] <JEEB> so even if you lose a crapload of quality per bits used
[17:32:25 CET] <redrabbit> same nvenc works very good for me
[17:32:32 CET] <redrabbit> for live tv
[17:32:41 CET] <geri> pomaranc: i plan live too (screen record)
[17:32:46 CET] <JEEB> it somehow ends up being cheaper, compared to something
[17:33:08 CET] <JEEB> I do understand local live streaming of course since there you want to use the main CPU power for the game or whatever
[17:33:21 CET] <pomaranc> geri: well instead of using all your cpu it will use like a fraction of a single core :)
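A minimal hedged example of the sort of NVENC transcode being described, assuming a build where h264_nvenc shows up in ffmpeg -encoders (file names, preset and bitrate are placeholders):

    ffmpeg -i input.mkv -c:v h264_nvenc -preset hq -b:v 6M -c:a copy output.mp4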
[17:33:56 CET] <redrabbit> i use nvenc on my main rig for iptv because it's cpu-cheap and doesn't heat up anything, uses no power, also the stream starts faster
[17:34:09 CET] <redrabbit> sure cpu looks a bit better
[17:34:19 CET] <JEEB> the stream starts faster part just looks like you didn't configure x264 at all
[17:34:42 CET] <redrabbit> well i configured x264 for quality
[17:34:50 CET] <redrabbit> but its only useful when bw is low
[17:35:02 CET] <JEEB> well you are complaining exactly about that!
[17:35:04 CET] <pomaranc> quality means more delay
[17:35:06 CET] <JEEB> for eff's sake
[17:35:14 CET] <redrabbit> no complaints !
[17:35:28 CET] <JEEB> and I'm trying to tell you that people have found it to be better than hw encoders even at something like super/ultrafast level of speeds
[17:35:38 CET] <JEEB> and if you need faster starting that's goddamn configurable
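As an illustration of that configurability, a hedged low-latency libx264 sketch (file names, bitrate and GOP length are placeholders; a shorter GOP mainly helps joining clients start faster):

    ffmpeg -i input.mkv -c:v libx264 -preset superfast -tune zerolatency -g 50 -b:v 4M -c:a copy -f mpegts out.ts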
[17:35:45 CET] <redrabbit> sure
[17:35:45 CET] <JEEB> the fact that the limited control on nvidia's part makes it the default
[17:35:50 CET] <JEEB> does not make it better (!)
[17:35:57 CET] <redrabbit> im just saying its working good
[17:36:04 CET] <redrabbit> the nvenc stuff
[17:36:12 CET] <JEEB> hw encoding can be good for your use case but your way of putting it is incorrect
[17:36:15 CET] <redrabbit> worth using it if its available
[17:36:21 CET] <redrabbit> probably
[17:36:35 CET] <JEEB> it is worth using IFF you look into the things and find out it is what makes most sense
[17:36:43 CET] <JEEB> that IFF is "if and only if"
[17:36:51 CET] <JEEB> because hw encoding is not magical
[17:36:56 CET] <JEEB> it has its good points and drawbacks
[17:37:12 CET] <redrabbit> also i use the hevc hw encoding mostly
[17:37:34 CET] <JEEB> have you actually checked if it's better than the AVC alternatives?
[17:37:41 CET] <redrabbit> its a bit better than avc
[17:37:46 CET] <JEEB> because a newer format does not mean the encoder is better
[17:38:00 CET] <redrabbit> comparing nvenc avc vs hevc
[17:38:02 CET] <JEEB> and I'm not limiting it to just nvenc where the encoders can be random quality
[17:38:06 CET] <JEEB> yea, that's what I guessed
[17:38:31 CET] <JEEB> anyways, always when you speak in such things mention the things you have looked into and why you are using things. the context. the context often is quite important
[17:38:34 CET] <JEEB> people often forget that
[17:38:46 CET] <geri> pomaranc: sweet
[17:38:58 CET] <redrabbit> context is personal pc with iptv service
[17:39:12 CET] <JEEB> no, also useful is how much you actually compared before you just decided to use hw encoding
[17:39:18 CET] <redrabbit> sometimes its used by someone else in the house and i dont want to get taxed on my cpu
[17:39:25 CET] <JEEB> OK, THAT MAKES SENSE
[17:39:37 CET] <JEEB> are you kind of getting what I'm trying to note here?
[17:39:44 CET] <redrabbit> right
[17:40:10 CET] <geri> pomaranc: do you have some c++ code samples?
[17:40:34 CET] <JEEB> it's the same thing as "cloud providers are weird and currently price their GPU units cheaper than units with proper CPUs - and I'm OK with the crappy compression"
[17:40:55 CET] <JEEB> although to be fully honest I'd at that point probably rent a dedicated server :D
[17:40:55 CET] <pomaranc> geri: nope, I use the ffmpeg binary
[17:41:11 CET] <JEEB> esp. if it's a 24/7 thing
[17:41:16 CET] <pomaranc> geri: and C > C++ anyway!
[17:41:31 CET] <JEEB> geri: there are various examples under doc/examples
[17:41:40 CET] <JEEB> they are in C but you should be able to utilize ssame things from C++
[17:41:52 CET] <JEEB> just remember to extern "C" {} the headers
[17:42:41 CET] <geri> thats fine as long as i find them
[17:43:40 CET] <JEEB> they should be rather simple to find as I noted their location in the FFmpeg source tree :)
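A hedged way to build one of those examples against installed FFmpeg libraries (muxing.c is the one that writes out a media file; the exact pkg-config library list can vary between versions):

    gcc doc/examples/muxing.c -o muxing \
        $(pkg-config --cflags --libs libavformat libavcodec libswscale libswresample libavutil) -lm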
[17:48:47 CET] <c3r1c3-Win> Rent? JEEB, I thought you would just build it. LOL.
[17:49:17 CET] <JEEB> depends
[17:49:30 CET] <JEEB> sometimes it's cheaper to rent a box than put it colocated
[19:22:54 CET] <utack> question about getting something from HDR to SDR with tonemapping. does this command line sound right or can the initial crop and scaling do something to throw off tonemapping? https://paste.ubuntu.com/25897176/  A quick insight would be appreciated
[19:23:13 CET] <utack> using hable with mpv on the input video looks a lot different for some reason
[19:26:12 CET] <JEEB> yea, you better check out the exact filter chain that generates
[19:26:38 CET] <JEEB> also haasn has been busy changing/improving the tone mapping related stuff as of late so the results might be different, but I would *always* check things first
[19:26:58 CET] <JEEB> also I'm not sure if avfilter keeps the mastering screen etc metadata
[19:27:04 CET] <utack> ok thanks. colors look muted and dark, with or without crop first.
[19:27:18 CET] <JEEB> yea, just check with -v debug or so to see if any filters get auto-inserted
[19:27:21 CET] <utack> in mpv they look too bright and vibrant, so the opposite
[19:27:30 CET] <utack> i will thanks
[19:27:40 CET] <JEEB> the second thing is if the mastering screen metadata gets through libavfilter, which I would guess it doesn't
[19:28:23 CET] <utack> not cropping changes nothing on how the output looks, with "-vf zscale=transfer=linear,tonemap=hable,zscale=transfer=bt709:matrix=709:w=1920:h=1080,format=yuv420p" instead
[19:29:21 CET] <JEEB> yea, I would check if anything gets added
[19:29:28 CET] <JEEB> -v debug or similar should do that
[19:29:37 CET] <utack> i will, yep
[19:30:00 CET] <JEEB> if not, it could just be the lack of the screen metadata that mpv is most likely capable of using
[19:30:14 CET] <JEEB> and/or changes in how the tone mapping works since koda made the filter
[19:30:25 CET] <JEEB> haasn has been hard at work looking at various things
[19:31:48 CET] <utack> it is quite possible that the version in the ffmpeg git build i use is closer to the "truth" than the mpv result
[19:32:20 CET] <JEEB> not sure, I know most things are actually over-vibrated
[19:32:38 CET] <JEEB> also things such as lack of mastering screen metadata etc
[19:32:59 CET] <JEEB> I haven't had time to look into the tonemap filter, but I know it did match what mpv did earlier
[19:33:04 CET] <JEEB> since koda based it on it
[19:33:22 CET] <JEEB> this lovely chap: https://www.youtube.com/watch?v=AEfCKaclafw
[19:36:51 CET] <utack> video unavailable
[19:37:07 CET] <utack> but by this time next year, it will hopefully be unavailable in AV1 :D
[19:37:35 CET] <JEEB> oh, it got removed?
[19:37:39 CET] <JEEB> funky
[19:38:02 CET] <JEEB> seems so
[19:38:06 CET] <JEEB> wonder what happened
[19:38:23 CET] <JEEB> he did a shorter version of it at VDD, but the recordings of that have really low volume
[21:48:08 CET] <hanna> utack: can you send the source file?
[21:48:15 CET] <hanna> (one frame is enough)
[21:48:18 CET] <hanna> I'll try libplacebo for comparison
[21:52:53 CET] <utack> thank you hanna
[22:04:41 CET] <hanna> utack: https://0x0.st/szTu.jpg libplacebo result
[22:04:52 CET] <hanna> converted from BT.2020+PQ to sRGB
[22:06:21 CET] <utack> that looks exactly as i was hoping
[22:06:30 CET] <utack> have you tried my commandline vf to see if it works as well?
[22:06:36 CET] <hanna> haven't
[22:06:44 CET] <hanna> (I doubt it would look different for me than for you)
[22:06:58 CET] <utack> that would be really nice if you could try that, if it works or if i messed up or there is some bug
[22:07:12 CET] <utack> "-vf zscale=transfer=linear,tonemap=hable,zscale=transfer=bt709:matrix=709:w=1920:h=1080,format=yuv420p" was the one i tried
[22:09:49 CET] <hanna> utack: sounds like you wanted primaries=bt709 as well
[22:09:55 CET] <hanna> in your second zscale
[22:10:00 CET] <hanna> (or perhaps your first)
[22:10:12 CET] <utack> good idea
[22:10:35 CET] <hanna> still looks somewhat dimmer etc.
[22:10:50 CET] <hanna> might be because of lack of appropriate metadata
[22:11:06 CET] <hanna> anyway the libplacebo algorithm is a bit different than the one in ffmpeg; it takes into account both the peak brightness and the scene average brightness
[22:11:09 CET] <utack> but it was copied in the file i sent you, right?
[22:11:27 CET] <hanna> it was
[22:11:33 CET] <hanna> but that doesn't necessarily mean the ffmpeg filter uses it
[22:12:26 CET] <JEEB> lavfi IIRC kills the mastering screen stuff
[22:12:40 CET] <JEEB> not sure how much passes through otherwise of these filters
[22:12:49 CET] <JEEB> lavfi seems to be really liking killing off AVFrame metadata
[22:13:08 CET] <utack> yeah turns out i was an idiot, primaries solved it
[22:13:09 CET] <hanna> utack: also, my libplacebo result completely ignores the metadata
[22:13:11 CET] <utack> thank you hanna
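For reference, the chain from above with the primaries added to the second zscale, as suggested (input/output names and the x264 settings are placeholders; behaviour of the tonemap filter may differ between versions, as discussed earlier):

    ffmpeg -i hdr_input.mkv \
        -vf "zscale=transfer=linear,tonemap=hable,zscale=transfer=bt709:matrix=709:primaries=709:w=1920:h=1080,format=yuv420p" \
        -c:v libx264 -crf 18 -c:a copy sdr_output.mkv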
[22:13:23 CET] <JEEB> :)
[22:13:32 CET] <hanna> I was throwing the metadata away (which is useless anyway) and doing scene detection on the frame itself
[22:14:30 CET] <hanna> per-scene metadata would probably solve this
[22:14:43 CET] <hanna> but the blu-ray spec mandated some completely useless garbage instead
[22:14:59 CET] <JEEB> it's just that they've not finished debating which version of dynamic metadata they want
[22:15:21 CET] <JEEB> https://kuroko.fushizen.eu/screenshots/smpte/st2094_structure.png
[22:15:21 CET] <hanna> MaxCLL and MaxFALL are actually the closest thing to being useful but with the way most HDR content is mastered, all you get from MaxFALL is that this contains some stupidly bright scenes
[22:15:32 CET] <hanna> doesn't help you at all for 99% of the scenes in the movie
[22:15:35 CET] <JEEB> currently they've got these four shits
[22:15:48 CET] <JEEB> 1 is dolby vision, 4 is samsung's HDR10+
[22:17:37 CET] <JEEB> so far calculating some sort of stuff that you've been discussing is probably the least insane thing (with some lookahead)
[22:17:46 CET] <JEEB> since that's how it seems to be done with dolby vision's mastering tools
[22:17:53 CET] <JEEB> "pre-pass that takes a fuckload of time"
[22:21:05 CET] <hanna> use libplacebo
[22:21:07 CET] <hanna> it's fast ;)
[23:08:22 CET] <geri> hanna: whats libplacebo?
[23:08:40 CET] <hanna> https://github.com/haasn/libplacebo
[23:08:52 CET] <hanna> GPU shader stuff; can be used for offline processing
[23:09:05 CET] <hanna> unfortunately no libavfilter yet
[23:09:46 CET] <geri> what is it good for?
[23:10:07 CET] <hanna> colorspace conversions, sampling/scaling, debanding, dithering etc.
[23:10:11 CET] <hanna> more planned
[23:10:30 CET] <hanna> but it does all of this on the GPU in realtime (since it was developed for video playback)
[23:11:36 CET] <geri> i look for a video codec to create a video which runs on GPU
[23:14:26 CET] <JEEB> there were some GPGPU video formats, but those were not meant for high compression
[23:14:46 CET] <JEEB> since the amount of threading you have to be able to do makes it kind of incompatible with high compression
[23:14:55 CET] <JEEB> in other words, while video filtering etc is something GPUs can do
[23:15:01 CET] <JEEB> encoding video is not one of those things
[23:15:12 CET] <JEEB> which is why all companies stopped mentioning GPGPU for encoding
[23:15:24 CET] <JEEB> instead all vendors put standalone separate hardware on their chips
[23:15:29 CET] <JEEB> ASICs
[23:15:38 CET] <JEEB> black boxes where you can feed textures and get encoded video
[23:15:57 CET] <utack> who decided that this needs to be part of the gpu in the first place?
[23:16:04 CET] <utack> might it just as well have been part of a CPU?
[23:16:19 CET] <JEEB> well it is with intel
[23:16:26 CET] <JEEB> and I think AMD will be doing the same with the APUs
[23:16:38 CET] <JEEB> of course with intel CPUs it's part of the iGPU package, yea
[23:16:56 CET] <JEEB> utack: I think it's mostly for rendering perf
[23:17:03 CET] <JEEB> so you can export a d3d or opengl texture
[23:17:04 CET] <utack> is there one ARM-like "instance" making a blueprint for these things or does everyone cook one up themselves?
[23:17:06 CET] <JEEB> or something compatible with one
[23:17:23 CET] <JEEB> I think intel/amd/nvidia all have their own ASIC designs
[23:17:30 CET] <utack> interesting
[23:17:37 CET] <utack> are there comparisons which one sucks the most?
[23:17:43 CET] <hanna> utack: something something VRAM
[23:18:00 CET] <hanna> they all suck
[23:18:12 CET] <hanna> actually the problem is that they're too high level
[23:18:15 CET] <JEEB> utack: there were but I am not aware of any modern ones that are proper
[23:18:18 CET] <hanna> it's like you feed them frames with metadata and they shit out textures
[23:18:20 CET] <hanna> at whatever bit depth they feel like
[23:18:21 CET] <utack> ok
[23:18:24 CET] <hanna> and in whatever colorspace they feel like
[23:18:32 CET] <hanna> and don't expect them to parse the headers correctly either
[23:18:44 CET] <JEEB> that's for decoding
[23:18:48 CET] <JEEB> I think this was about encoding
[23:18:52 CET] <hanna> there's some stuff you can do (e.g. NVDEC, CUDA) to get a bit more control over the process
[23:18:54 CET] <hanna> oh
[23:19:01 CET] <hanna> sorry my bad
[23:19:02 CET] <JEEB> np
[23:19:11 CET] <hanna> GPU encoding is just.. no
[23:19:19 CET] <JEEB> well, lossless 4:4:4 encoding in nvidia is nice
[23:19:24 CET] <JEEB> since it's real fast from screen capture
[23:19:34 CET] <JEEB> still uses a fuckload of space
[23:19:37 CET] <JEEB> but at least it's lossless
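A hedged sketch of that kind of lossless NVENC screen capture on X11 (grab size, frame rate and output are placeholders; preset names have changed across NVENC SDK/FFmpeg versions, and 4:4:4 support depends on the GPU):

    ffmpeg -f x11grab -video_size 1920x1080 -framerate 60 -i :0.0 -c:v h264_nvenc -preset lossless -pix_fmt yuv444p capture.mkv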
[23:23:34 CET] <hanna> yeah I guess that's better than trying to do raw screen capture
[23:24:12 CET] <Mavrik> Especially since it's easier on the GPU when capturing 3D stuff
[23:24:27 CET] <Mavrik> Which is the most common usecase these days I guess
[23:25:10 CET] <thebombzen> ffmpeg has a magicyuv encoder though, which is convenient
[23:25:19 CET] <thebombzen> was added recently
[23:25:47 CET] <JEEB> also someone posted some AVX2 UtVideo asm
[23:25:51 CET] <thebombzen> alternatively utvideo provides good screencap speed as does x264 ultrafast (superfast if you can handle it)
[23:25:52 CET] <JEEB> should check it out
[23:26:13 CET] <JEEB> thebombzen: the magicyuv developer is "muh secret sauce" retard so I dislike the format
[23:26:24 CET] <JEEB> Ut Video guy has made open source plugins for all major multimedia frameworks
[23:26:30 CET] <JEEB> which is why I have respect for the dude
[23:26:39 CET] <JEEB> and he doesn't think of his shit as super special bullshit
[23:26:59 CET] <Mavrik> For screen capture I'd try to avoid using CPU though. Reencode later :)
[23:26:59 CET] <utack> (the amd asics do not seem that terrible actually i have to say, steam link is surprisingly usable at zerolatency 30mbit encode for full hd 60fps)
[23:27:17 CET] <JEEB> Mavrik: yea
[23:27:29 CET] <JEEB> either GPU encoding if it lets you do lossless, or something really lightweight
[23:27:33 CET] <JEEB> like Ut Video or so
[23:36:33 CET] <thebombzen> kmsgrab + vaapi tho
[23:37:01 CET] <JEEB> if it is lossless compatible that's OK for me too
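A sketch along the lines of the documented kmsgrab example, mapped into VAAPI for encoding (output name and scaling are placeholders; kmsgrab typically needs CAP_SYS_ADMIN, and whether the VAAPI encoder can do lossless depends on the driver):

    ffmpeg -f kmsgrab -i - -vf 'hwmap=derive_device=vaapi,scale_vaapi=w=1920:h=1080:format=nv12' -c:v h264_vaapi output.mp4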
[23:48:14 CET] <rjp421> how much space will i need to git clone and build?
[23:50:13 CET] <JEEB> my git clone is about 400MiB
[23:50:35 CET] <JEEB> and then the armv7 build I have is about 60 megabytes
[23:56:13 CET] <rjp421> ty
[00:00:00 CET] --- Mon Nov  6 2017

