[Ffmpeg-devel-irc] ffmpeg.log.20190125

burek burek021 at gmail.com
Sat Jan 26 03:05:02 EET 2019


[02:11:14 CET] <Hello71> does ffserver still compile? is it likely that it will stop compiling?
[09:00:37 CET] <BazookaJoe> Does anyone know where to find lgpl pre-compiled dlls of ffmpeg?
[09:01:35 CET] <JEEB> nope, although thankfully compiling FFmpeg with either mingw-w64 or new enough MSVC isn't too hard
[09:02:28 CET] <JEEB> (I think we still support MSVC 2013+ as long as latest updates are applied)
[09:05:05 CET] <BazookaJoe> I tried using media-autobuild_suite, but couldn't find way to compile tagged version
[09:05:26 CET] <BazookaJoe> it just builds master
[09:05:53 CET] <JEEB> usually those things build a whole crapload of dependencies, you should generally decide on which libraries you want based on which encoders you need
[09:06:04 CET] <JEEB> since 99% of all decoders are built-in
[09:07:33 CET] <BazookaJoe> Are there 'dependencies' I should be aware of? I mean, if I check out some tag of ffmpeg and make sure the script will not check out master again, can I assume there will be no dependency errors?
[09:10:05 CET] <JEEB> if it has built already then you have the basics, such as "some shell to run the configure script and make to run the makefiles with the compiler that configure script sets/gets configured with"
[09:10:20 CET] <JEEB> and then a C compiler and nasm for x86(_64)
[09:10:42 CET] <JEEB> you can look at the console output of running configure during a build
[09:11:11 CET] <JEEB> and then when you want to build a binary that you actually want to utilize, you add --disable-autodetect and specifically add --enable-* for what you exactly need
[09:12:49 CET] <BazookaJoe> (thanks for the help!)
[09:12:54 CET] <JEEB> generally dependencies are stuff like encoder libraries (although some specifically require --enable-gpl and/or --enable-version3)
[09:13:16 CET] <BazookaJoe> so is the 'core' of ffmpeg just that one build of ffmpeg, without all the mess around it?
[09:14:05 CET] <JEEB> --disable-autodetect disables the enablement of most libraries, except some that are considered "system" things. not sure which were such, but the list is really short
[09:15:09 CET] <furq> i can't imagine you need that for windows
[09:15:23 CET] <JEEB> yes you do, if you care about what you're depending on
[09:15:38 CET] <furq> well you're not going to accidentally pull in any gpl libs
[09:15:56 CET] <JEEB> I generally think of disable-autodetect as a good way of not enabling random things I didn't request :P
[09:16:13 CET] <furq> if you're trying to strip stuff out beyond that then sure
[09:17:20 CET] <JEEB> I just think disable-autodetect is a good default measure if you plan on distributing your binaries :P since everything can always be enabled
[09:18:02 CET] <JEEB> I think the most common things I enable are zlib, the windows schannel TLS, and the dxva2 and d3d11va hwaccels
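A minimal configure invocation matching the flags JEEB mentions might look like this (a sketch only; check option names against `./configure --help` for your FFmpeg version):

```shell
# Sketch: explicit build configuration instead of autodetection.
# With --disable-autodetect, anything not listed here stays disabled.
./configure --disable-autodetect \
            --enable-zlib \
            --enable-schannel \
            --enable-dxva2 \
            --enable-d3d11va
```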
[11:07:05 CET] <egrouse> so i have a script that randomly plays videos and sends them to a stream, but at the end of each video, when it cycles to the next one, it disconnects for like half a second until the next process starts. is there any way i can just send a black image during this time, so there's no 'downtime' in between?
[11:07:26 CET] <egrouse> i thought perhaps if i use the image as source 1 (background) and then brought in the 'stream' as source 2 then when the stream is down it would fallback to the image
[11:07:30 CET] <egrouse> but i dont know if thats logical?
[11:07:42 CET] <egrouse> /if what i want is possible
[11:08:10 CET] <furq> there's not really a good way to do that with ffmpeg unless you have an ordered list of all the videos in advance
[11:08:45 CET] <furq> and even then it's annoying because they all need to be the same format, dimensions etc
[11:12:22 CET] <egrouse> furq, yeah i thought that may be the case - concat filter?
[11:12:30 CET] <egrouse> i did look at that, but it threw issues due to conflicting formats
[11:12:45 CET] <egrouse> i can get a list easy enough; suppose i could just pre process everything to be the same in advance..
[11:12:53 CET] <egrouse> hmm
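If the clips really are pre-processed to identical codec, dimensions, etc. as discussed above, the concat demuxer can treat them as one continuous input with no reconnect between files. A sketch (clip names and the streaming command are placeholders):

```shell
# Build a concat playlist; every listed file must share codec,
# resolution, and so on, or the demuxer will misbehave.
cat > playlist.txt <<'EOF'
file 'clip1.mp4'
file 'clip2.mp4'
EOF

# Then stream the whole list as one uninterrupted input, e.g.:
#   ffmpeg -re -f concat -safe 0 -i playlist.txt -c copy -f flv rtmp://example/live
```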
[13:51:27 CET] <egrouse> if my video is already in x264 and my audio is already in aac can i use copy automatically and otherwise reencode? or will i have to check it first, and then run a different command dependent on the result
[13:51:39 CET] <egrouse> i.e. can it fallback to reencoding if copy would not result in 264/aac
[13:56:55 CET] <furq> yeah but ffmpeg won't do it for you
[13:56:57 CET] <BtbN> *h264
[13:56:58 CET] <furq> it's easy enough to automate with ffprobe
[13:58:17 CET] <furq> [ $(ffprobe -v error -select_streams v:0 -show_entries stream=codec_name -of default=nk=1:nw=1 foo.mkv) == h264 ]
[14:05:16 CET] <egrouse> furq, thank you - looks to be exactly what i need
[14:05:27 CET] <egrouse> not seen ffprobe before, seems handy
[14:05:44 CET] <furq> http://vpaste.net/Dk1EP
[14:05:49 CET] <furq> you might want to steal that for your bashrc
[14:05:51 CET] <furq> i use it a lot
[14:06:51 CET] <egrouse> perfect - thanks so much :)
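furq's ffprobe check can be wrapped into a small copy-or-reencode decision. A sketch with invented helper names (`pick_vcodec`/`pick_acodec`); the ffprobe invocation is the one quoted above:

```shell
# Map a probed codec name to an encoder choice: stream-copy when the
# input is already h264/aac, otherwise re-encode.
pick_vcodec() { [ "$1" = "h264" ] && echo copy || echo libx264; }
pick_acodec() { [ "$1" = "aac" ]  && echo copy || echo aac; }

# Real usage would probe first (as furq showed), e.g.:
#   v=$(ffprobe -v error -select_streams v:0 \
#         -show_entries stream=codec_name -of default=nk=1:nw=1 in.mp4)
#   a=$(ffprobe -v error -select_streams a:0 \
#         -show_entries stream=codec_name -of default=nk=1:nw=1 in.mp4)
#   ffmpeg -i in.mp4 -c:v "$(pick_vcodec "$v")" -c:a "$(pick_acodec "$a")" out.mp4
pick_vcodec h264   # prints "copy"
pick_acodec mp3    # prints "aac"
```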
[14:53:42 CET] <bendikrb> Hi everybody! I'm making a video clip where I've put a semi-transparent black bar at the bottom of the frame. so far, so good. But now I want to animate the height of this bar (the bottom part of it should stay in place, and the top should animate upwards to its final height)
[14:54:04 CET] <bendikrb> any ideas on how I could do that anyone? ;)
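One possible approach (an untested sketch, not from the channel): generate the bar with the `color` source and slide it up from below the frame with `overlay`, whose x/y expressions can use the time variable `t` and are re-evaluated per frame. The frame size, bar height, and speed here are placeholder numbers:

```shell
# Sketch: a 120px semi-transparent black bar whose top edge rises over
# ~2s while its bottom edge stays at the bottom of a 1280x720 frame.
# The overlay is clipped at the frame edge, so starting at y=H (just
# below the frame) and rising to y=H-120 animates the visible height.
ffmpeg -i in.mp4 -filter_complex \
  "color=c=black@0.5:s=1280x120,format=yuva420p[bar]; \
   [0:v][bar]overlay=x=0:y='H-min(t*60,120)':shortest=1:eval=frame" \
  out.mp4
```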
[15:54:34 CET] <atbd> hi, how do you handle mpegts rollover when using the ffmpeg api? For now I get timestamp and DTS errors and nothing is written after that
[15:55:32 CET] <JEEB> in input lavf in theory should be making your timestamps constantly rising, but it seems to bork after the first wrap-around
[15:55:42 CET] <JEEB> in output you just keep rising the values and the muxer *should* do the right thing
[15:56:52 CET] <DHE> but the demuxer will return the raw bits. so it's on you to add (1UL<<33) to all DTS values every time they wrap. (See also pts_wrap_bits in AVStream (?))
[15:57:07 CET] <DHE> also correct the PTS with the rule PTS >= DTS
[15:57:09 CET] <JEEB> DHE: uhh
[15:57:15 CET] <JEEB> lavf has wrap-around code
[15:57:30 CET] <JEEB> so the demuxer will return the original timestamp yes
[15:57:34 CET] <DHE> JEEB: it breaks when it comes back around to its original values
[15:57:37 CET] <JEEB> but lavf utils will handle it *once*
[15:57:43 CET] <DHE> yep. for mpegts that's 26.5 hours
[15:57:45 CET] <JEEB> yes, I said that exactly just now
[15:57:59 CET] <JEEB> > but it seems to bork after the first wrap-around
[15:58:53 CET] <DHE> yeah. I actually did write a transcoder app for 24/7 handling of mpegts which handles all of this
[15:58:54 CET] <JEEB> so technically no, the demuxer (for an API user) will not be returning raw bits (as in, that is not the spec of it). It will start doing it *after* the first time it happens, but that is a bug, so I'd like it being noted like that
[15:59:06 CET] <JEEB> I only herped a derp at your wording :P
[15:59:31 CET] <JEEB> it's likely there's an option in lavf to disable the lavf utility from happening, but not sure :P
[15:59:37 CET] <Mavrik> mpegts isn't quite linear anyway and ffmpeg will bork on it
[15:59:43 CET] <Mavrik> so manual fixing tends to be required :)
[15:59:49 CET] <JEEB> quite likely
[15:59:52 CET] <Mavrik> (assuming that its a live stream)
[15:59:55 CET] <DHE> av_dict_set_int(&dict, "correct_ts_overflow", 0, 0); av_dict_set(&dict, "avoid_negative_ts", "disabled", 0); // this is what I'm doing
[15:59:57 CET] <JEEB> (but there are some really stable streams)
[16:01:10 CET] <analogical> how do I extract only the audio from a file without any re-encoding?
[16:01:23 CET] <JEEB> DHE: for the record I'm not at all trying to ridicule your experience or anything, just noting that telling someone that the demuxer returns original timestamps as-is by default is kind of... not true
[16:01:30 CET] <JEEB> you will not get the original bits *by default*
[16:01:47 CET] <JEEB> and yea, the ts_overflow thing probably
[16:01:51 CET] <DHE> yeah that's probably my fault for using this app so long with these settings I've forgotten the default behaviour
[16:02:16 CET] <JEEB> avoid_negative_ts is mostly used in muxing I think
[16:02:21 CET] <JEEB> (movenc) etc
[16:03:00 CET] <DHE> maybe it's not required. I just took every setting I could find that might tamper with the source inputs
[16:03:13 CET] <JEEB> nah, it's a good one
[16:04:27 CET] <JEEB> DHE: and yea, as an API user I 100% understand stopping lavf from poking the timestamps :D
[16:04:36 CET] <JEEB> esp. since mpeg-ts handling tends to be abrupt
[16:05:00 CET] <JEEB> (I have one patch for that around from Japan which I should post on ML)
[16:30:40 CET] <atbd> JEEB DHE: thanks! I will try those settings + check correct pts/dts as you said. If you have examples of this i could take a look ^^'
[16:33:35 CET] <DHE> I don't think I can share large code segments on this one. I have a function aptly named fix_timestamps(AVPacket*) which will ensure the video stream has a rule-abiding consistent DTS, and that all other streams are as close to its values as possible. then PTS >= DTS to settle on the PTS wrapping position.
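The wrap correction DHE describes can be sketched in C. This is not his code; `UnwrapState`/`unwrap_dts` are invented names, and it assumes a stream whose DTS only ever jumps backwards by more than half the wrap range at an actual 33-bit wrap:

```c
#include <stdint.h>

/* MPEG-TS timestamps are 33-bit, so they wrap every 2^33 ticks
 * (~26.5 hours at 90 kHz). */
#define TS_WRAP (INT64_C(1) << 33)

typedef struct {
    int64_t offset;    /* accumulated wrap offset, multiple of TS_WRAP */
    int64_t last_dts;  /* previous unwrapped DTS, or -1 before the first packet */
} UnwrapState;

/* Turn a raw 33-bit DTS into a monotonically rising value by adding
 * another TS_WRAP each time the counter wraps around. */
static int64_t unwrap_dts(UnwrapState *s, int64_t raw_dts)
{
    int64_t dts = raw_dts + s->offset;
    if (s->last_dts >= 0 && dts < s->last_dts - TS_WRAP / 2) {
        s->offset += TS_WRAP;   /* the 33-bit counter wrapped */
        dts       += TS_WRAP;
    }
    s->last_dts = dts;
    return dts;
}
```

The PTS would then be corrected afterwards so that PTS >= DTS always holds, as described above.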
[16:37:04 CET] <atbd> okay i will make a function like this to check input AVPackets
[16:37:06 CET] <atbd> thank you
[17:12:00 CET] <sn00ker> hi all
[17:16:59 CET] <sn00ker> https://nopaste.linux-dev.org/?1191882
[17:17:05 CET] <sn00ker> can everyone help me?
[17:17:33 CET] <pink_mist> no; I certainly can't
[17:17:46 CET] <pink_mist> so everyone can't
[17:18:24 CET] <sn00ker> sry
[17:18:27 CET] <sn00ker> anyone :)
[17:18:34 CET] <egrouse> sn00ker, this is a rather insane coincidence but im pretty much attempting to do exactly the same thing
[17:18:41 CET] <egrouse> you've got it further than me, though
[17:18:46 CET] <egrouse> im still just accepting the rtmp drops
[17:20:08 CET] <sn00ker> that's interesting :) then we can develop together :) everything runs perfectly for me. via another script I can even connect my own "software" with obs, which I can then put live into the stream
[17:20:48 CET] <sn00ker> I only have the problem that the sound is not in sync. may I ask which rtmp server you use? I had some worries at the beginning but now it works perfectly
[17:22:46 CET] <egrouse> wow we are doing the same thing
[17:22:47 CET] <egrouse> haha
[17:22:58 CET] <egrouse> i am using nginx-rtmp
[17:24:02 CET] <sn00ker> OK. I found "mistserver" after a long search. nice graphical interface; especially when operating on a server where not everyone should play around in the bash, it's very great and runs very stably
[17:26:23 CET] <sn00ker> I have to say though that my problem occurs only with a virtual sound card. with a real sound card everything works wonderfully, even if the sound card is real and the video is a virtual interface. so it has to be the audio loop
[17:40:23 CET] <brimstone> i'm having problems with ffmpeg burning in subtitles. I'm not getting any errors, it seems like ffmpeg is reading the subtitles file, but the output doesn't have the subtitles burned into it. https://pastebin.com/8R3Epjzj
[18:22:07 CET] <kepstin> brimstone: the issue is due to the seeking, the subtitles filter doesn't see the original timestamps with that command so they're not lined up right
[18:23:14 CET] <kepstin> brimstone: try this command: `ffmpeg -copyts -ss 00:00:59 -to 00:01:04 -i input.mp4 -vf "subtitles=f=input.srt:charenc=latin1" -y test.mp4`
[18:24:11 CET] <kepstin> although note this might run into other issues, since the output timestamps won't start at 0; you might be able to fix that by adding additional filters to edit the timestamps.
[18:24:35 CET] <kepstin> sticking `,setpts=PTS-STARTPTS` on the end of your video filter, and adding `-af asetpts=PTS-STARTPTS`
[18:26:22 CET] <kepstin> i kinda wish the subtitles filter had a timestamp offset option
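Putting kepstin's pieces together, the full command might look like this (same filenames as above; an untested combination of his suggestions):

```shell
# Seek with -copyts so the subtitles filter sees the original timestamps,
# then shift both streams back to start at 0 with setpts/asetpts.
ffmpeg -copyts -ss 00:00:59 -to 00:01:04 -i input.mp4 \
       -vf "subtitles=f=input.srt:charenc=latin1,setpts=PTS-STARTPTS" \
       -af "asetpts=PTS-STARTPTS" -y test.mp4
```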
[19:03:48 CET] <brimstone> kepstin: nice! thanks
[19:03:56 CET] <brimstone> seems like -copyts helps, they actually show up now
[19:04:20 CET] <brimstone> i don't need any audio, so i can ignore the -af, right?
[19:05:10 CET] <kepstin> Well, if you don't want audio you should turn it off, either use -an or the map options
[19:05:54 CET] <brimstone> ah, right
[19:43:31 CET] <tempaccount1987> Hey guys, this may be the wrong channel, but in c# the only way I can run ffmpeg is through the cli (starting the process and passing arguments). I can't get ffmpeg nightly to work; anyone have any suggestions on how I can get around passing to the CLI?
[19:44:34 CET] <JEEB> see doc/examples and the doxygen for API usage?
[19:45:39 CET] <tempaccount1987> @JEEB are you speaking to me?
[19:45:54 CET] <pink_mist> yes he is
[19:46:03 CET] <tempaccount1987> The issue is I can get it to work, it's just the speed is killing me.
[19:46:10 CET] <tempaccount1987> Is there any other options for C#?
[19:46:51 CET] <tempaccount1987> Sorry I should have been more specific, I have the stand alone build working through CLI.
[19:47:14 CET] <tempaccount1987> I'm just looking for C# libraries or other ways I can speed up Av1 encoding / decoding.
[19:48:25 CET] <kepstin> av1 is just gonna be slow
[19:48:29 CET] <kepstin> no way around that
[19:51:19 CET] <tempaccount1987> Daaarn.
[19:51:42 CET] <tempaccount1987> I'm doing research into moving from webp to Av1
[19:51:58 CET] <tempaccount1987> Should I just suggest that the current state it will be slower to encode/decode.
[19:52:33 CET] <JEEB> decoding is fast enough with dav1d
[19:52:35 CET] <JEEB> but encoding, yes
[19:52:36 CET] <JEEB> it's slow
[19:52:59 CET] <tempaccount1987> I've been looking at Dav1d, what is offered?
[19:53:03 CET] <tempaccount1987> c++ libs?
[19:53:06 CET] <JEEB> C
[19:53:11 CET] <JEEB> also FFmpeg has a wrapper for it
[19:53:11 CET] <tempaccount1987> oof
[19:53:20 CET] <tempaccount1987> Would it be faster~
[19:53:20 CET] <JEEB> so you can just use it through FFmpeg's libavcodec
[19:53:27 CET] <JEEB> faster than libaom's decoder? yes
[19:53:35 CET] <tempaccount1987> I'm currently using libaom for ffmpeg
[19:53:38 CET] <JEEB> http://www.jbkempf.com/blog/post/2018/dav1d-toward-the-first-release
[19:54:03 CET] <JEEB> libaom is the reference implementation, dav1d is a project sponsored by AOM to make a *fast* decoder
[19:55:08 CET] <tempaccount1987> I really appreciate all your help, a few last questions, do you guys have any metrics on decoding av1 to raw pixel data RGB?
[19:55:28 CET] <JEEB> most video is not RGB so that would also add a YCbCr->RGB conversion there
[19:55:40 CET] <JEEB> which you probably didn't mean to imply :P
[19:56:16 CET] <JEEB> also that blog post has some performance numbers for dav1d
[19:56:25 CET] <JEEB> not all of them are percentages
[19:57:59 CET] <JEEB> so yea, playing AV1 is possible but encoding is a bit more uhhh
[19:59:10 CET] <tempaccount1987> Building out Dav1d will still only offer cli?
[19:59:27 CET] <JEEB> uhh
[19:59:33 CET] <Hello71> createprocess is not *that* slow
[19:59:38 CET] <Hello71> even though windows sucks
[19:59:55 CET] <JEEB> dav1d will have a test cli application as well as a library, but you generally use it through the library :P
[19:59:56 CET] <tempaccount1987> This is for images,
[20:00:02 CET] <JEEB> like FFmpeg does if you enable libdav1d
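For reference, dav1d is a meson/ninja project and FFmpeg wraps it behind a configure switch. A sketch (check the VideoLAN project page for the current build instructions):

```shell
# Build the dav1d library itself (meson-based):
git clone https://code.videolan.org/videolan/dav1d.git
cd dav1d && meson setup build && ninja -C build

# Then, in the FFmpeg source tree, use it through libavcodec:
#   ./configure --enable-libdav1d
```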
[20:00:20 CET] <tempaccount1987> I should have stated it's for images not video
[20:00:33 CET] <JEEB> at least you only have a single image to encode then :P
[20:00:52 CET] <tempaccount1987> _The only issue is starting those CLI processes :P_
[20:01:06 CET] <JEEB> then use the API?
[20:01:17 CET] <JEEB> C# has an OK C interface if I recall correctly
[20:08:37 CET] <BtbN> The C# C interface has a few major annoyances
[20:08:49 CET] <BtbN> Like, it doesn't have proper types that match up with C types correctly
[20:09:23 CET] <BtbN> And as C# code can be either run as 32 or 64 bit depending on the platform, that makes for some "fun" workarounds
[20:09:52 CET] <BtbN> If the C api uses only types with a fixed size, so, uint32_t and the like, it's very straight forward though
[20:10:45 CET] <BtbN> Or you end up using an IntPtr for a lot of non-pointer-variables, just because the size happens to match up
[20:19:21 CET] Action: meredydd waves to durandal_1707 
[20:20:46 CET] <meredydd> Thanks for your help last week. It put me on the right track to work out that what I really wanted was a "delay" filter to match the existing "adelay". I now have it working - patch incoming :)
[20:38:54 CET] <sn00ker> can no one help me?
[23:18:28 CET] <SpeakerToMeat> Hello all
[23:18:49 CET] <SpeakerToMeat> I have an audio file I'm remuxing untranscoded from a wav into an mxf op1a.
[23:19:32 CET] <SpeakerToMeat> My question is, the original file has 1 stream with 2 channels (stereo). on the mxf I would like them to be on two streams, first channel (left) on the first stream, second (right) on the second... how do I achieve that? map?
[23:28:04 CET] <furq> SpeakerToMeat: -map_channel
[23:31:58 CET] <SpeakerToMeat> Hmm it's not including the audio... : ffmpeg -i video.mxf -i audio.wav -map 0:v -map_channel 1.0.0 -map_channel 1.0.1 -codec copy output.mxf
[23:32:06 CET] <SpeakerToMeat> Is producing only a video stream (0:0)
[23:32:55 CET] <SpeakerToMeat> Though it IS listed in the inputs: Stream #1:0: Audio: pcm_s16le ([1][0][0][0] / 0x0001), 48000 Hz, stereo, s16, 1536 kb/s
[23:34:15 CET] <SpeakerToMeat> sigh for now I'll pre split the file and do it that way
[23:39:38 CET] <SpeakerToMeat> Until I learn what my mistake is
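An alternative worth trying (a sketch, not verified against the mxf muxer's constraints): split the stereo pair with the channelsplit filter instead of -map_channel. Filter output cannot be stream-copied, but re-encoding s16 PCM with pcm_s16le is lossless:

```shell
# Split input 1's stereo audio into two mono streams [L] and [R],
# keep the video untouched, and write each channel as its own stream.
ffmpeg -i video.mxf -i audio.wav \
       -filter_complex "[1:a]channelsplit=channel_layout=stereo[L][R]" \
       -map 0:v -c:v copy \
       -map "[L]" -map "[R]" -c:a pcm_s16le output.mxf
```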
[00:00:00 CET] --- Sat Jan 26 2019

