[Ffmpeg-devel-irc] ffmpeg.log.20170503

burek burek021 at gmail.com
Thu May 4 03:05:01 EEST 2017


[00:37:53 CEST] <kilobyte_> good evening everyone. Does anybody have an idea how to record a game which runs at 50 fps? I tried this, but it doesn't record all the frames, the replay is just a little jerky: ffmpeg -probesize 10M -rtbufsize 1500M -f dshow -i audio="virtual-audio-capturer" -acodec pcm_s16le -f gdigrab -framerate 50 -i desktop -vcodec libx264 -qp 0 -threads 0 -crf 0 -preset ultrafast -tune zerolatency output.mkv
[00:38:25 CEST] <furq> don't use gdigrab
[00:38:37 CEST] <furq> and also don't use -tune zerolatency if you're recording to a file
[00:38:43 CEST] <furq> generally don't use it ever
[00:39:21 CEST] <kilobyte_> I also have UScreenCapture, no joy with this either
[00:39:23 CEST] <james9999> the ffmpeg doc suggests using gdigrab
[00:39:32 CEST] <furq> where does it do that
[00:40:16 CEST] <james9999> https://trac.ffmpeg.org/wiki/Capture/Desktop
[00:40:33 CEST] <furq> that's the wiki, not the docs
[00:40:58 CEST] <furq> it also suggests using dshow and then says "gdigrab also exists" as an afterthought
[00:41:10 CEST] <furq> which seems about right because gdigrab is inherently slow
[00:41:40 CEST] <furq> but yeah if you're capturing from a game under windows then OBS or something like that will probably do a better job
[00:41:45 CEST] <james9999> It also mentions it here without any caution or proviso: https://www.ffmpeg.org/ffmpeg-devices.html
[00:41:53 CEST] <BtbN> I wonder how hard a desktop duplication capture source would be
[00:42:49 CEST] <furq> well yeah
[00:42:58 CEST] <furq> https://www.ffmpeg.org/ffmpeg-codecs.html#libmp3lame-1
[00:43:04 CEST] <furq> that doesn't bother to mention that mp3 is shit
[00:43:23 CEST] <furq> if you had warnings for every potential problem then the docs would be twice as dense as they are already
[00:43:41 CEST] <james9999> well all I can say is when I was googling for desktop capture it's one of the first things I tried after reading about it in the documentation
[00:43:49 CEST] <furq> well yeah it works
[00:44:02 CEST] <furq> but if you can get a reliable 50fps capture of a game with it then you're very lucky
[00:44:26 CEST] <furq> it's adequate enough for screencasts and stuff where there's very little movement
[00:44:58 CEST] <james9999> so advice on getting a game capture at 50fps goes where? the wiki?
[00:45:05 CEST] <furq> probably
[00:45:30 CEST] <james9999> ok
[00:45:33 CEST] <furq> ideally the OBS wiki because that's actually designed for doing this
[00:46:18 CEST] <james9999> maybe. but screencasting is the reason I googled for ffmpeg in the first place
[00:46:25 CEST] <james9999> so it's not exactly an uncommon use case
[00:50:45 CEST] <BtbN> You want OBS for that
[00:51:34 CEST] <james9999> BtbN: ironically that was the last program I downloaded.
[00:51:57 CEST] <james9999> although with that pkt_size hack I did get streaming to work sort of with ffmpeg
[02:57:02 CEST] <james9999> https://trac.ffmpeg.org/wiki/CompilationGuide/MSVC
[02:57:02 CEST] <james9999> i'm trying to wrap my head around the idea of a c99-to-c89 converter. o_0
[04:01:18 CEST] <utack> who makes the docs on https://www.ffmpeg.org/ffmpeg-all.html ?
[04:01:41 CEST] <utack> it seems like zscale "tin" also accepts smpte2084, just like the output option does
[04:01:46 CEST] <utack> that is not documented
[05:03:00 CEST] <c_14> utack: anyone who writes a patch and sends it to the mailing list
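
A hedged illustration of what utack observes above: zscale's "tin" (input transfer) shorthand appears to take the same names as the output "t" option, smpte2084 included. The filenames, resolution and encoder settings below are placeholders, not anything from the log.

    # Sketch: scale an HDR10 (SMPTE ST 2084 / PQ) input while tagging the transfer explicitly on both sides.
    ffmpeg -i hdr_in.mkv \
           -vf "zscale=w=1920:h=1080:tin=smpte2084:t=smpte2084" \
           -c:v libx265 -crf 20 -c:a copy hdr_out.mkv
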
[05:43:19 CEST] <james999> the last project mailing list I looked at, every post was BUG or PATCH
[05:43:25 CEST] <james999> is that typical?
[05:43:51 CEST] <c_14> what else would be on it?
[05:44:04 CEST] <james999> idk. discussion about what features to add or support?
[05:44:29 CEST] <c_14> happens rarely, or on irc
[05:44:41 CEST] <furq> non-contributors having interminable conversations about the future of the project
[05:44:41 CEST] <james999> also several threads were just one person replying to themselves
[05:45:03 CEST] <furq> or just bikeshedding in general
[05:45:31 CEST] <furq> oh and also people signing their name and having a hilarious and clever quote afterwards
[05:45:48 CEST] <furq> extra points if you've set up your mail client to randomly select a quote from a big list you compiled in 1993
[05:46:37 CEST] <furq> and then...footnotes
[05:46:38 CEST] <james999> TIL Parkinson's Law of Triviality
[05:47:26 CEST] <utack> c_14 allright
[05:48:53 CEST] <james999> furq: lol at some point I stopped doing that because it got lame
[05:49:01 CEST] <james999> or i just stopped caring, either way
[05:49:13 CEST] <furq> it was always lame
[05:50:05 CEST] <furq> i hope you mean the quotes and not the footnotes
[05:50:06 CEST] <c_14> I'd totally throw a fortune here, but I don't want to install it so just imagine I did.
[05:50:15 CEST] <furq> the footnotes transcend lameness
[05:50:29 CEST] <furq> when you see a mailing list post with more than 10 footnotes...that's when you know this post is Worth Reading
[05:51:04 CEST] <furq> especially when you see it go from [3] to [5] and realise that [4] is inside [3]
[05:51:18 CEST] <james999> wow, wikipedia has an entire article discussing mailing list posting etiquette: https://en.wikipedia.org/wiki/Posting_style
[05:51:42 CEST] <james999> also yes I mean quotes, Idk how the hell you would justify footnotes in email
[05:51:50 CEST] <c_14> The number of people who get annoyed by bad posting style is non-negligible
[05:51:53 CEST] <furq> i'm going to link that to some people who have extremely strong feelings about top posting and watch them get furious
[05:51:55 CEST] <c_14> james999: I use them for links
[05:52:08 CEST] <furq> using them for links is the only acceptable use for footnotes
[05:52:24 CEST] <furq> i've seen many posts where the length of the footnotes exceeded the length of the post
[05:52:32 CEST] <james999> ah ok. i sort of do that on facebook sometimes, putting the links at the end of a post instead of inline
[06:05:50 CEST] <james999> on a completely different subject
[06:05:58 CEST] <james999> I'm reading about the ffmpeg drama. pretty intense
[08:22:35 CEST] <JC_Yang> do I need to use a lavf muxer to extract raw avc streams from a program stream? Do raw streams require a muxer?
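
For what it's worth, lavf does ship a raw H.264 muxer, so the usual way to pull the elementary stream out of a program stream is a stream copy into it; a minimal sketch (input and output names are placeholders):

    # Copy the first video stream out of the program stream as a raw Annex B H.264 file.
    ffmpeg -i input.mpg -map 0:v:0 -c:v copy -f h264 video.264
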
[10:02:30 CEST] <kerio> is decoding h264 a deterministic thing
[10:02:46 CEST] <kerio> or will different decoders output different things
[10:04:01 CEST] <JC_Yang> that depends
[10:05:02 CEST] <JC_Yang> the decoders have flexibility to postprocess the decoded video
[10:05:19 CEST] <JC_Yang> e.g
[10:05:23 CEST] <kerio> yeah i guess
[10:05:43 CEST] <kerio> there's multiple different raw videos that map to the same h264 encoding for obvious reasons
[10:08:40 CEST] <JC_Yang> IIRC, there are other cases where the standard allows some flexibility, while the decoder should maintain the major semantics of the bitstream.
[10:11:49 CEST] <jkqxz> The actual decoding of H.264 is completely deterministic - there is exactly one correct output for a given stream.  (This is a requirement because you can refer to previous frames without accumulating error.)
[10:12:37 CEST] <jkqxz> Decoders are allowed to postprocess the output to show something different to the user, but they must still store the correct output for use as reference.
[10:16:38 CEST] <jkqxz> (There are some funny edge cases to that: you don't need to actually generate the correct output for frames which aren't referred to (e.g. you can ignore or replace the deblocking filter) - this is generally only an optimisation used to do less work in the decoder, though.)
[10:17:18 CEST] <kerio> oh i see what you mean
[10:17:45 CEST] <kerio> for all intents and purposes, the output that other frames refer to *is* the correct output
[10:27:25 CEST] <termos> is it normal that right after running avcodec_open2 on my libx264 codec I get a time_base in AVCodecContext that's 0/2?
[12:29:09 CEST] <alfal> Hi everyone! Anyone here with ffserver experience? Is it possible to have a single jpeg in the stream and then replace the jpeg at different times? For example a stream that always shows the latest picture taken. I'm very thankful!
[12:47:03 CEST] <durandal_170> alfal: no
[13:05:55 CEST] <faLUCE> hello. In order to demux a http mpegts stream, I tried the "demuxing_decoding.c" example, even though it says that it works on files (and doesn't say anything about http streams). Anyway, it receives the stream and prints that the frames are decoded, but the resulting output file is not readable. What's wrong? Should I allocate two demuxers, instead of only one, or is it enough to call avformat_open_input(&fmt_ctx, "http://stream_url.ts", NULL, NULL) ?
[13:49:16 CEST] <JEEB> faLUCE: I have no idea what you're expecting but as you can see a single open like that should get you the MPEG-TS demuxed
[14:03:38 CEST] <ss23> Hi, I'm wanting to create a filter that operates on two input streams (two versions of the same original content) and will only pass frames through to output when the frame(s) selected are an I frame of one input, and a B frame of the other. So, compare frame 0 from both streams, and only pass it through if it's an I frame in video stream 1, and also a B frame in video stream 2
[14:04:21 CEST] <ss23> I'm thinking I could use eq somehow, but it only operates on a single frame at a time, so I can't figure out how to make an eq for each stream but only pass frames through if both eq's worked
[14:35:35 CEST] <faLUCE> JEEB: then I should see the raw output file, but the player (ffplay) says it's corrupted... in addition, the demuxed packets don't have the same size as the encoded packets: the sizes are similar, but not the same
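
If the unreadable file here is the raw video that demuxing_decoding.c writes, that is expected: the example dumps headerless decoded frames, so the player has to be told the pixel format and size explicitly. A sketch, assuming yuv420p at 1280x720 (both placeholders):

    # Raw frames carry no container metadata; describe them to ffplay by hand.
    ffplay -f rawvideo -pixel_format yuv420p -video_size 1280x720 video_out.raw
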
[14:38:02 CEST] <Nacht> Anyone I can bother with some questions regarding H264 ?
[14:38:11 CEST] <faLUCE> Nacht: just ask
[14:38:26 CEST] <Nacht> What's the relationship between NALU: SPS and IDR ?
[14:39:02 CEST] <hiihiii> hello
[14:40:07 CEST] <hiihiii> I have made some gif files but discovered that they had the wrong dimensions (larger)
[14:41:06 CEST] <hiihiii> I now want to take those gif files and convert them to the right dimensions but I'm concerned that they'll be "re-encoded"
[14:42:02 CEST] <hiihiii> I've made them with palettegen/paletteuse
[14:43:54 CEST] <BtbN> you can't resize stuff without re-encoding
[14:44:10 CEST] <hiihiii> yes I know that
[14:44:46 CEST] <Nacht> 2nd question I also have, how do you identify if a PID is for audio or video ?
[14:45:10 CEST] <Nacht> based on the headers
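
One hedged way to check this without reading the PMT by hand: ffprobe prints each transport-stream stream's PID (the "id" field) next to its codec_type. The input name below is a placeholder:

    # List PIDs (id=0x...), stream types and codecs for a transport stream.
    ffprobe -v error -show_entries stream=index,id,codec_type,codec_name input.ts
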
[14:45:28 CEST] <hiihiii> I just don't want the colors to degrade a 2nd time.
[14:45:43 CEST] <BtbN> then encode from the original material again
[14:46:05 CEST] <hiihiii> I wish I had the original inputs
[14:46:26 CEST] <hiihiii> they were large in size, so I deleted them
[14:47:03 CEST] <hiihiii> I wouldn't be concerned if I had used the generic palette
[14:47:58 CEST] <hiihiii> because the colors wouldn't be altered by quantization
[14:49:42 CEST] <hiihiii> since I used a custom palette, I believe some loss in color is inevitable
[14:50:12 CEST] <BtbN> converting gif to gif should be lossless, if all involved parts are smart enough
[14:50:50 CEST] <hiihiii> even if I generate a palette again, it will not be the same as the one generated from the source input
[14:50:55 CEST] <BtbN> but specially with scaling, which might create new intermediate colors, you can easily see a quality hit
[14:51:18 CEST] <hiihiii> well I'm scaling down
[14:51:19 CEST] <BtbN> There are only 255 colors already, no need to generate a palette
[14:51:49 CEST] <hiihiii> umm I don't know
[14:52:12 CEST] <hiihiii> that would be the case for a gif with a generic palette
[14:52:36 CEST] <BtbN> hm?
[14:52:46 CEST] <BtbN> A gif is paletted colors, 255 arbitrary colors.
[14:53:12 CEST] <BtbN> So when converting gif to gif, there is no point in making a new palette, there are no new colors going to show up from anywhere
[14:53:55 CEST] <hiihiii> but for a gif made with a custom palette, re-encoding will revert back to using a generic palette
[14:54:10 CEST] <BtbN> there is no generic gif palette.
[14:54:23 CEST] <BtbN> Every gif has its color palette embedded
[14:54:32 CEST] <hiihiii> and that will swap any special colors with the preset ones
[14:54:46 CEST] <hiihiii> but ffmpeg has one
[14:55:02 CEST] <BtbN> there are no preset ones. It will figure out the best palette from the input
[14:55:18 CEST] <hiihiii> http://blog.pkh.me/img/ffgif/default-palette.png
[14:55:30 CEST] <hiihiii> I'm afraid you're wrong
[14:55:52 CEST] <hiihiii> then what would be the use of palettegen and paletteuse filters ???
[14:56:09 CEST] <BtbN> to investigate a whole video for the best overall palette
[14:56:23 CEST] <BtbN> instead of a single frame
[14:56:48 CEST] <hiihiii> in other words create a palette of colors to get better results
[14:57:02 CEST] <hiihiii> I've made simple tests
[14:57:21 CEST] <BtbN> I'm not sure what you are even trying to argue about. You don't seem to have a choice anyway
[14:57:27 CEST] <hiihiii> of the same input and the difference is clear
[14:57:52 CEST] <hiihiii> .
[14:58:20 CEST] <BtbN> scaling the gif with the same palette should be ok
[14:58:42 CEST] <BtbN> And running palettegen on the gif, I'd expect it to give you the original palette again
[15:00:22 CEST] <hiihiii> maybe...
[15:00:23 CEST] <hiihiii> I came here to ask whether or not I can tell ffmpeg not to look for any other palette, because it already exists inside the input file
[15:01:00 CEST] <hiihiii> if not then my only choice is to use palettegen
[15:01:05 CEST] <hiihiii> again
[15:02:00 CEST] <BtbN> ffmpeg will decode the gif, and it will just be a normal RGB or YUV stream of frames then
[15:02:16 CEST] <BtbN> Which just happens to only have the same 255 colors all over
[15:02:46 CEST] <BtbN> the problem I see is that scaling it will most likely result in more/different colors
[15:03:03 CEST] <BtbN> so generate the palette without scaling, and use it for the scaled version
[15:03:18 CEST] <BtbN> and maybe try around a bit, might also look better if you generate a new one post-scaling
[15:04:15 CEST] <hiihiii> okay thx that sounds good
[15:08:53 CEST] <BtbN> Or just get the original material again...
[15:09:16 CEST] <BtbN> If the original is huge, I'd imagine a gif of it to be even larger
[15:12:09 CEST] <hiihiii> the original was a single file
[15:12:35 CEST] <hiihiii> I trimmed multiple sections from it to gif
[15:12:45 CEST] <hiihiii> anyway
[15:13:02 CEST] <hiihiii> I tried doing what was suggested
[15:14:42 CEST] <hiihiii> the result looks significantly worse. the palettes generated even have a large chunk of black at the bottom suggesting that there are some unused colors
[15:16:37 CEST] <BtbN> some part of the chain is not smart enough for that special case then
[15:16:42 CEST] <hiihiii> heck even without resizing the image there's some degradation in color
[15:17:15 CEST] <BtbN> Is it using yuv as intermediate?
[15:19:07 CEST] <hiihiii> I haven't specified the color format so it must be that of the source input or the ffmpeg default for gif
[15:19:20 CEST] <hiihiii> the input is yuv420
[15:19:56 CEST] <BtbN> try forcing it to use rgb
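
A sketch of the two-pass approach BtbN describes: derive the palette from the existing gif without scaling, then reuse it for the downscaled encode. The 480-pixel width and filenames are placeholders, and the format=rgb24 step is only the "force rgb" idea from above, not a guaranteed fix:

    # Pass 1: build a palette from the gif itself (no scaling).
    ffmpeg -i big.gif -vf palettegen palette.png
    # Pass 2: scale down in RGB and map the result back onto that palette.
    ffmpeg -i big.gif -i palette.png \
           -lavfi "[0:v]format=rgb24,scale=480:-1:flags=lanczos[x];[x][1:v]paletteuse" small.gif
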
[17:02:32 CEST] <retroj> does ffmpeg have any sort of volume automation filter, whereby one could define when in an audio stream to raise and lower the volume?
[17:06:51 CEST] <DHE> might dynaudnorm be in the ballpark?
[18:24:50 CEST] <durandal_170> retroj: loudnorm also
[19:09:38 CEST] <retroj> DHE, durandal_170: thank you
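
For the original "raise and lower at defined times" request, a hedged alternative to the normalizers is the plain volume filter with timeline editing; the times and gain below are placeholders:

    # Drop the audio to 30% between 10 s and 20 s and leave everything else untouched.
    ffmpeg -i in.wav -af "volume=enable='between(t,10,20)':volume=0.3" out.wav
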
[19:18:01 CEST] <b0bby__> What's the best way to compile ffmpeg from their github repository then test it using a pre-written c program? Step by step instructions would help. I've tried this before by myself but I always get gcc compiler errors when trying to test the libraries. Thanks for any help.
[19:19:19 CEST] <DHE> like most linux apps from source, ./configure, make, and optionally make install. run configure --help to see what options are available. if you plan to install, consider using the --prefix option and related parameters
[19:20:08 CEST] <james9999> i always get compiler errors when trying to compile things. so i gave up trying to compile things from websites. ^_^
[19:21:18 CEST] <b0bby__> Ok im going to recompile all the stuff from scratch. Does anyone have some test code I could use?
[19:27:48 CEST] <DHE> there's a couple of sample apps in doc/examples if you really want to test the C interface
[19:27:48 CEST] <DHE> of course, the ffmpeg CLI tool is already a C user of libav
[19:28:31 CEST] <b0bby__> what does --prefix do?
[19:28:59 CEST] <retroj> b0bby__: installation path.  normally prefix is /usr/local or /usr
[19:29:29 CEST] <DHE> some users like /opt/$packagename to keep per-application installs apart, but then you have issues with $PATH and such
[19:30:55 CEST] <b0bby__> ok I created the make file and set the prefix to a separate install directory
[19:31:00 CEST] <b0bby__> Compiling now
[19:33:51 CEST] <b0bby__> running make install
[19:34:41 CEST] <b0bby__> ok now I have a folder that looks like: bin  include  lib  share
[19:42:42 CEST] <DHE> so ffmpeg should be in the bin directory, man pages in the share directory, and so on and so forth
[19:48:23 CEST] <b0bby__> ok I tried to compile ffmpeg.c and got the errors: https://pastebin.com/Uwu45DtJ
[19:59:23 CEST] <b0bby__> Anyone here?
[20:09:19 CEST] <b0bby__> hello?
[20:14:12 CEST] <tdr> b0bby__, your source is missing header files. libavutil/internal.h should be there when it unpacks but isn't
[20:19:08 CEST] <b0bby__> libavutil/internal.h is there: https://gist.github.com/2432ec892159cf00ac0eca14c9a8f2bc
[20:19:42 CEST] <tdr> ok line 12 of your paste
[20:20:07 CEST] <tdr> it's either not there or some env is off during your build
[20:21:30 CEST] <b0bby__> tdr: what?
[20:22:11 CEST] <tdr> b0bby__, just saying it can't find your header, so it's either not there, or it's looking in the wrong place, or you're building it weird. how are you building it?
[20:23:00 CEST] <b0bby__> gcc ffmpeg.c
[20:23:13 CEST] <b0bby__> hold on, I'll print the full folder layout
[20:23:26 CEST] <tdr> omg really?
[20:23:35 CEST] <tdr> have you ever compiled from source before?
[20:24:20 CEST] <tdr> b0bby__, which OS platform are you compiling on?
[20:29:02 CEST] <b0bby__> tdr: linux
[20:29:14 CEST] <b0bby__> Here is a directory printout https://gist.github.com/5efcd8479df9224aa28ac62be1187014
[20:38:34 CEST] <tdr> b0bby__, you shouldn't be running gcc on it like that directly though. run ./configure --help (to look at the switches), then ./configure, then make and optionally make install
[20:40:49 CEST] <tdr> b0bby__, you may need to download a bunch of packages with header files (-dev / -devel) for your distro too, it depends what you already have and which switches you want to enable.
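
A minimal sketch of the flow DHE and tdr describe, with a placeholder out-of-tree prefix; extra --enable-* switches depend on what the program needs:

    # From the top of the ffmpeg source tree.
    ./configure --prefix="$HOME/ffmpeg_build"
    make -j"$(nproc)"
    make install
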
[20:43:50 CEST] <kilobyte_> hey all, yesterday I asked, let me try again: how do I record the Windows desktop at about 50-60 fps? I tried this: ffmpeg -probesize 10M -f dshow -rtbufsize 1500M -video_size 1920x1080 -rtbufsize 702000k -framerate 50 -i video="screen-capture-recorder":audio="virtual-audio-capturer" -vcodec libx264 -qp 0 -threads 0 -crf 0 -preset ultrafast -acodec pcm_s16le -tune zerolatency output.mkv but it doesn't seem to be recording all 50 frames, just abou
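
A hedged cleanup of that command along the lines of the advice given the previous evening: a single -rtbufsize, no -tune zerolatency when recording to a file, and -qp 0 on its own for lossless x264. The device names are the ones from the log; whether the machine actually sustains 50 fps is a separate question:

    # Sketch: 50 fps desktop capture through the DirectShow screen/audio capture devices.
    ffmpeg -f dshow -rtbufsize 1500M -video_size 1920x1080 -framerate 50 \
           -i video="screen-capture-recorder":audio="virtual-audio-capturer" \
           -c:v libx264 -preset ultrafast -qp 0 -c:a pcm_s16le output.mkv
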
[20:46:32 CEST] <b0bby__> Ok, is there an easier way to do video converting or minor editing? ffmpeg is kicking up a fight and unless someone can help me step by step I need another way.
[20:46:48 CEST] <tdr> b0bby__, most distros have a binary for it you can download
[20:47:19 CEST] <tdr> b0bby__, even the latest versions. It's common enough to build the "newest version" that there are guides for building it for most distros too.
[20:47:23 CEST] <furq> didn't you just say you built it
[20:48:14 CEST] <dystopia_> he said "converting" not "compiling", which I guess means encoding
[20:48:31 CEST] <furq> 18:33:51 ( b0bby__) running make install
[20:48:31 CEST] <furq> 18:34:41 ( b0bby__) ok now I have a folder that looks like: bin  include  lib  share
[20:48:59 CEST] <tdr> furq, the errors from the pastebin were from compiling, not an install phase
[20:49:04 CEST] <dystopia_> ahh i joined after that, never mind then
[20:49:33 CEST] <b0bby__> Tell me what to do and I'll do it. Anything to get this working
[20:49:34 CEST] <tdr> b0bby__, which distro of linux are you on?
[20:49:38 CEST] <b0bby__> mint
[20:49:54 CEST] <tdr> b0bby__, sudo apt-get install ffmpeg
[20:49:54 CEST] <furq> b0bby__: what are you actually trying to do
[20:49:58 CEST] <furq> you already compiled ffmpeg
[20:50:20 CEST] <tdr> furq, when i saw, he was running gcc on ffmpeg.c
[20:50:32 CEST] <furq> yeah that's the bit i don't understand
[20:50:51 CEST] <tdr> if it compiled before that prob just made a mess
[20:51:00 CEST] <b0bby__> I want to develop a C/C++ program that has some video conversion/editing functionality
[20:51:21 CEST] <tdr> b0bby__, ok did the apt-get command I gave you pull in ffmpeg?
[20:52:18 CEST] <b0bby__> tdr yes
[20:52:27 CEST] <furq> b0bby__: can you build the stuff in doc/examples
[20:52:51 CEST] <furq> look at the makefile in there
[20:53:45 CEST] <b0bby__> what should I link it to
[20:54:34 CEST] <furq> https://github.com/FFmpeg/FFmpeg/blob/master/doc/examples/Makefile
[20:56:41 CEST] <b0bby__> furq: how can I tell make where my libs are installed
[20:57:49 CEST] <b0bby__> furq never mind
[21:00:09 CEST] <b0bby__> here are the errors from the compile with make decode_video.c: https://pastebin.com/t7DLTnqg
[21:00:46 CEST] <furq> did you do the thing it tells you to do
[21:07:06 CEST] <james9999> Perhaps you should add the directory containing `libavdevice.pc'
[21:07:06 CEST] <james9999> to the PKG_CONFIG_PATH environment variable
[21:08:38 CEST] <b0bby__> I got to go
[21:08:53 CEST] <b0bby__> I'll try to come back later to get this fixed
[21:09:09 CEST] <furq> that boy ain't right
[21:58:00 CEST] <reuf> hello - best resource to see the overall abilities of ffmpeg?
[21:58:07 CEST] <reuf> i mean best book, tutorial
[22:00:03 CEST] <james9999> i don't think anything like that exists reuf
[22:00:28 CEST] <james9999> of course you're welcome to learn all about how ffmpeg works, then write the tutorial yourself and host it yourself.
[22:01:20 CEST] <reuf> james9999, thanks
[22:01:51 CEST] <james9999> well i was laying on the sarcasm a bit heavy there but np, glad to provide helpful-if-slightly-snarky answers
[22:05:49 CEST] <llogan> reuf: there is a book, but it is not free and not from the FFmpeg developers: http://ffmpeg.tv/
[22:06:11 CEST] <llogan> otherwise see the FFmpeg Wiki http://trac.ffmpeg.org/
[22:06:18 CEST] <llogan> and of course the documentation
[22:07:06 CEST] <reuf> llogan, thanks
[22:14:26 CEST] <b0bby__> Hey I compiled ffmpeg and I'm trying to compile one of the examples (decode_video.c) with "gcc decode_video.c -L"/home/kitt/Documents/c programs/ffmpeg/ffmpeg build/install/lib"  -lswscale -lavutil -lavformat -lavcodec". However I get the error: https://gist.github.com/5352e462633ad9b7410180102d9a8d37. Here is also a directory layout: https://gist.github.com/anonymous/497a407b6c15824fe9ea09a46dbe7d5b
[22:14:35 CEST] <b0bby__> How do I fix this?
[22:23:19 CEST] <b0bby__> ??
[22:26:52 CEST] <faLUCE> hey, I found this option:  AVFormatContext->max_analyze_duration for minimizing the delay when playing a stream... do you know which is the corresponding option for ffplay ?
[22:30:08 CEST] <faLUCE> I tried "-analyzeduration 100000" (0.1 seconds), but it did not improve.... (It improved with libav, though)
[22:31:35 CEST] <furq> you need to set -probesize as well
[22:31:58 CEST] <faLUCE> furq: do you mean that I have to set both?
[22:32:14 CEST] <furq> that is normally what "as well" means
[22:32:43 CEST] <faLUCE> furq: but I did not set both with libav, and it minimized the delay. Why do I have to set both with ffplay?
[22:32:52 CEST] <furq> shrug
[22:33:39 CEST] <faLUCE> furq: sorry for my poor english: does "shrug" mean, in this case, "I don't know" ?
[22:36:33 CEST] <james9999> faLUCE: either that or the world is about to tumble off of Atlas's shoulders
[22:36:37 CEST] <DHE> yes. a shrug is an action done with the shoulders
[22:37:00 CEST] <faLUCE> ok, thanks
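
For reference, a sketch of setting both knobs on ffplay as furq suggests; the numbers and stream URL are placeholders (probesize cannot go below 32 bytes), and -fflags nobuffer is an extra, optional latency knob not mentioned above:

    # Start playback of a live MPEG-TS stream with minimal probing.
    ffplay -probesize 32768 -analyzeduration 100000 -fflags nobuffer http://stream_url.ts
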
[22:45:25 CEST] <vbgunz> hello fellas, I wrote a script a long time ago and it works great with a single monitor, but having a strange multi monitor setup, it gets crazy to just get coordinates or the name of a single screen. other than :0 for -i, is there a way to put the name of the screen I'm trying to capture?
[22:47:50 CEST] <b0bby__> Hey I compiled ffmpeg and I'm trying to compile one of the examples (decode_video.c) with "gcc decode_video.c -L"/home/kitt/Documents/c programs/ffmpeg/ffmpeg build/install/lib"  -lswscale -lavutil -lavformat -lavcodec". However I get the error: https://gist.github.com/5352e462633ad9b7410180102d9a8d37. Here is also a directory layout: https://gist.github.com/anonymous/497a407b6c15824fe9ea09a46dbe7d5b
[22:47:50 CEST] <b0bby__> How do I fix this?
[22:51:43 CEST] <jkqxz> b0bby__:  You have the libraries in the wrong order, and you're missing at least m, dl and z.  Just use pkg-config.
[22:53:25 CEST] <b0bby__> jkqxz: how do I use pkg-config
[22:56:18 CEST] <jkqxz> Which libraries are you actually using?  libavcodec and libavutil?  Put the output of "pkg-config --cflags --libs --static libavutil libavcodec" at the end of your command line.  (May also need to set PKG_CONFIG_PATH if you've installed in a funny place.)
[22:58:26 CEST] <b0bby__> https://gist.github.com/anonymous/ce3c0fc2a7dbe73c9936ab5ae85748e0
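
Putting jkqxz's suggestion together as a sketch, assuming the libraries were installed under a placeholder prefix and that decode_video.c only needs libavcodec and libavutil:

    # Point pkg-config at the local install's .pc files, then let it supply include paths, libs and link order.
    export PKG_CONFIG_PATH="$HOME/ffmpeg_build/lib/pkgconfig"
    gcc decode_video.c -o decode_video \
        $(pkg-config --cflags --libs --static libavcodec libavutil)
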
[23:07:38 CEST] <vbgunz> I'm sorry, is there a way to tell ffmpeg to record a particular monitor? I have several in very wild and different orientations and sizes. can I just say record all of screen 1, 2 or 3? my $DISPLAY is always just :0, there is no 0.1, 0.2.
[23:09:01 CEST] <BtbN> it's all one big rectangular screen
[23:09:08 CEST] <BtbN> just need to give the right coordinates
[23:12:30 CEST] <vbgunz> damn, that makes it crazy. is there a way to see what the coordinates already are? through trial and error, 20 minutes later I found that 0.0+0,420 is what I needed for the main screen. damn if that's true
[23:12:47 CEST] <BtbN> xrandr should show them I believe?
[23:13:04 CEST] <vbgunz> I tried xrandr --query
[23:13:27 CEST] <vbgunz> it shows my monitors but not their positions
[23:14:21 CEST] <vbgunz> I mean, I can look deeper, maybe I'll find what I'm looking for in xrandr
[23:14:38 CEST] <BtbN> It should print something like 1920x1080+1280+487 for each monitor
[23:14:53 CEST] <BtbN> there you have its width/height and x/y offset, that's all you need for ffmpeg
[23:15:17 CEST] <vbgunz> I wish, that would solve everything, I'm trying and --verbose is just nuts atm
[23:15:34 CEST] <BtbN> xrandr prints that just fine for me, no idea what you are doing with it
[23:17:42 CEST] <vbgunz> for me, xrandr prints 1920x1080+0+0 ... for the main horizontal screen. 1080x1920+1920+0 for the second vertical screen. but to get the first monitor right, I can't use just -i :0 ... these huge black bars surround the top, left and right sides of the video. I need to use 0.0+0,420 and it took me 20 minutes to find that out
[23:18:17 CEST] <vbgunz> I'm not finding clues to get that correct the first time
[23:19:05 CEST] <vbgunz> ffmpeg version 3.1.7, Fedora 25
[23:19:50 CEST] <BtbN> if xrandr does not print the correct layout, something is wrong
[23:22:56 CEST] <vbgunz> tell me about it, everything is fine with a single monitor. having more than 1 enabled, it gets tough
[23:25:59 CEST] <furq> if you have different vertical resolutions then the smaller ones will be centred relative to the biggest one
[23:27:29 CEST] <furq> i'm guessing you have one 1920x1080 and one 1080x1920
[23:27:33 CEST] <vbgunz> 420+1080+420=1920, makes sense
[23:28:54 CEST] <vbgunz> I know the tv I'm connected to is in a wild position atm, when it is enabled the overall area gets much larger and will probably make capturing the main monitor hell
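
A sketch of capturing a single monitor using the geometry xrandr reports (the 1920x1080+0+0 / 1080x1920+1920+0 layout quoted above); the framerate and output names are placeholders, and it assumes the layout xrandr prints matches the real framebuffer:

    # Each connected output prints WxH+X+Y; feed the same numbers to x11grab.
    xrandr | grep ' connected'
    # Main 1920x1080 monitor at offset +0,0:
    ffmpeg -f x11grab -video_size 1920x1080 -framerate 30 -i :0.0+0,0 main.mkv
    # Vertical 1080x1920 monitor at offset +1920,0:
    ffmpeg -f x11grab -video_size 1080x1920 -framerate 30 -i :0.0+1920,0 second.mkv
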
[23:38:40 CEST] <cryptodechange> What is the difference between deadzone 21,11 (default) and deadzone 6,6 (grain)?
[23:39:35 CEST] <furq> nothing
[23:40:25 CEST] <furq> the deadzone settings do nothing if trellis is enabled
[23:41:18 CEST] <cryptodechange> TIL
[23:42:14 CEST] <cryptodechange> I noticed the qcomp is increased too
[23:42:16 CEST] <cryptodechange> from 0.6 to 0.8
[23:57:35 CEST] <vbgunz> out of nowhere :0+0,420 stopped working perfectly after spending time figuring it out and it has to be :0 again... monitor setup didn't change at all, still 2 monitors :(
[23:59:05 CEST] <vbgunz> the only thing I did try was recording the second monitor and successfully doing so at 1080x1920 ... it's the only thing I did differently between all the recordings
[00:00:00 CEST] --- Thu May  4 2017

