[Ffmpeg-devel-irc] ffmpeg.log.20180417

burek burek021 at gmail.com
Wed Apr 18 03:05:02 EEST 2018


[00:00:52 CEST] <ChocolateArmpits> visual studio compile environment has tools to compile the idl file
[00:01:06 CEST] <Anonrate> I've got the whole 2017 suite.
[00:01:15 CEST] <ChocolateArmpits> precisely midl
[00:03:13 CEST] <Anonrate> Anything special I need, like a flag or anything?
[00:05:40 CEST] <ChocolateArmpits> Anonrate, nah you need the VS2015 x64 Native Tools Command Prompt loaded or whatever it's called for vs2017
[00:05:52 CEST] <ChocolateArmpits> then you'll be able to call midl
[00:06:06 CEST] <Anonrate> Already got it open.  :)  ran midl DeckLinkAPI.idl and no header.
[00:06:19 CEST] <ChocolateArmpits> it must've generated other files first, no?
[00:06:34 CEST] <Anonrate> Of course it did.  It would be VS  if it didn't.
[00:07:05 CEST] <ChocolateArmpits> what extension do the new files have ?  I don't quite remember, but there's surely one more step involved
[00:07:37 CEST] <Anonrate> There is a new file called DeckLinkAPI.tlb
[00:08:23 CEST] <ChocolateArmpits> Anonrate, run Tlbimp on that
[00:08:41 CEST] <ChocolateArmpits> check out the options too https://docs.microsoft.com/en-us/dotnet/framework/tools/tlbimp-exe-type-library-importer
[00:11:28 CEST] <Anonrate> It's broken...  I hate windows.
[00:11:47 CEST] <ChocolateArmpits> you sure set the options correctly?
[00:12:07 CEST] <Anonrate> I mean the command prompt.  Lol.  The buffer looks like it has cancer..
[00:13:05 CEST] <Anonrate> It would probably help if I used the correct extension..
[00:15:34 CEST] <Anonrate> I needed to use the flag /h DeckLinkAPI.h to output the header.
[00:15:47 CEST] <ChocolateArmpits> with midl ?
[00:15:54 CEST] <Anonrate> Yes.
[00:15:59 CEST] <ChocolateArmpits> there ya go
[00:16:17 CEST] <Anonrate> Lets hope it works.  :)  Thank you for the help.  <3
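For reference, the working midl invocation described above comes out roughly as follows (run from the VS2017 x64 Native Tools Command Prompt; only the file names mentioned in the conversation are used, everything else is an assumption):

    midl /h DeckLinkAPI.h DeckLinkAPI.idl

midl's /h switch names the generated C/C++ header; as reported above, running midl without it only left a DeckLinkAPI.tlb type library behind.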
[00:19:37 CEST] <Anonrate> Now it needs a source file..
[00:55:18 CEST] <Anonrate> For "-lndi" does a file called "libndi.lib" have to exist?  Or can it be something else?
[00:58:04 CEST] <Anonrate> Because included with the NewTek SDK is a lib called "Processing.NDI.Lib.x64.lib", so I just used the flag extra-lib="${PATH_TO_NDI_LIB}"
[01:02:19 CEST] <Anonrate> I'm also getting a lot of "warning: previous multiple common of `--gnu_lto_v1'" and "warning: previous common is here"
[01:55:52 CEST] <Skullclown> Trying to compile ffmpeg for android with --enable-libx264 on debian w/ libx264-dev installed but it's not working out. I'm getting "ERROR: libx264 not found". Opening the log shows "ld: error: cannot find -lx264". Any thoughts?
[01:57:00 CEST] <iive> did you install it in a system path? Sometimes /usr/local/lib/ is not in ld.so.conf
[02:00:09 CEST] <Skullclown> iive: debian stretch package
[02:01:34 CEST] <iive> Skullclown, do you get this error at runtime or when compiling?
[02:03:26 CEST] <Skullclown> iive: when compiling ffmpeg
[02:04:41 CEST] <iive> do you have pkg-config?
[02:05:19 CEST] <iive> Debian uses non-standard library directories, and these can only be found if pkg-config reports them.
[02:15:38 CEST] <Anonrate> Has anyone compiled ffmpeg with support for ndi_newtek?
[02:18:43 CEST] <Anonrate> I'm trying to compile ffmpeg with libndi_newtek support and I keep getting -lndi not found.  The thing is, there is no libndi.dll (Using mingw32-w64), there is a Processing.NDI.Lib.x64.dll (And a .lib of that as well), but that's it.  Would I maybe have to rename one of those to libndi.*?
[02:20:19 CEST] <Anonrate> I've added the path where the lib and dll were located (not at the same time) via --extra-ldflags=-L${PATH...}. I've also tried adding --extra-libs=${PATH..}/Processing.NDI.Lib.x64.lib (and tried the same thing for the dll).
[02:20:41 CEST] <Anonrate> Wait..  I think I fixed it by renaming it..
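A hedged sketch of the rename-based fix that seems to have worked here (the SDK path, directory layout, and the configure options beyond --enable-libndi_newtek are assumptions, not taken from the conversation):

    # give the NewTek import library the name that -lndi looks for
    cp "${NDI_SDK}/Lib/x64/Processing.NDI.Lib.x64.lib" "${NDI_SDK}/Lib/x64/libndi.lib"
    ./configure --enable-nonfree --enable-libndi_newtek \
        --extra-cflags="-I${NDI_SDK}/Include" \
        --extra-ldflags="-L${NDI_SDK}/Lib/x64"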
[02:21:36 CEST] <Skullclown> iive Got a minute by chance?
[02:23:04 CEST] <iive> Skullclown, go ahead. i'll be leaving soon, but i do have 1 minute
[02:33:21 CEST] <iive> cu
[03:05:13 CEST] <Anonrate> After compiling and installing ffmpeg, when I run it, no output is given whatsoever..  I'm using mingw32-w64
[04:02:46 CEST] <Anonrate> Say I wanted to make a script that runs the main ./configure for me (so I don't have to keep retyping it), what would be the best approach?  A bash script or another configure script?
[04:05:06 CEST] <Anonrate> I'm probably asking that in the wrong channel..
[04:07:29 CEST] <DHE> umm... well I just have a custom shell script that is really just the configure line I normally use (3rd party libs and all). is that what you mean?
[04:11:43 CEST] <Anonrate> I almost hit enter, and it would have hurt your head trying to understand what I said..  I made it longer than it had to be.  I'll try to make it simpler (I'm terrible at explaining things)
[04:13:16 CEST] <Anonrate> When you need to disable something, let's say x264, you would have to go into your script and take out the --enable-libx264. That would be a lot of work, so instead I was thinking of having a variable called FF_WITH_X264=disable and then using it like this: --${FF_WITH_X264}-libx264
[04:14:46 CEST] <Anonrate> I'm very new to shell scripting and makefiles.  They are very intimidating when I see one, and they kind of intrigue me at the same time.
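A minimal sketch of the enable/disable variable idea described above (feature names other than libx264 are just examples):

    #!/bin/sh
    # flip these between "enable" and "disable" instead of editing the configure line
    FF_WITH_X264=enable
    FF_WITH_NVENC=disable
    ./configure \
        --${FF_WITH_X264}-libx264 \
        --${FF_WITH_NVENC}-nvenc \
        "$@"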
[05:40:31 CEST] <Swervz> Hey does anyone know why when I use ffmpeg -i filename.mkv -vcodec copy -acodec copy 1.mp4 to remux a mkv into a mp4 it changes the framerate from constant to variable?
[05:40:57 CEST] <Swervz> the mkv has a constant framerate but the remuxed video has a variable one
[10:14:38 CEST] <Skullclown> I'm trying to compile ffmpeg with libx264 for android with some filters enabled but it's not working out. I've been trying for the past half day, I'd almost say I just want to pay someone to do it to get it over with. :| Anyone around for a few questions?
[10:19:35 CEST] <Mavrik> Hmm, you certainly can ask questions.
[10:19:52 CEST] <Mavrik> People probably won't have time for personal consulting, but I did manage to compile it for Android with x264
[10:20:10 CEST] <Mavrik> Suggestion - use a "standalone NDK toolchain" and point configure script to that version of GCC
[10:20:21 CEST] <Skullclown> Mavrik: I did that
[10:20:40 CEST] <Skullclown> I got to the point where I can compile ffmpeg and libx264 separately, but I can't compile ffmpeg with --enable-libx264
[10:21:27 CEST] <Mavrik> And what's the error in config.log?
[10:21:56 CEST] <Skullclown> If I do --enable-libx264 and include -I../x264 in my flags, compilation ends with an error x264.a (not the real name, paraphrasing atm while I'm booting my laptop & connecting to SSH) about log2f not existing.
[10:22:07 CEST] <Skullclown> I also tried adding -lm to my flags, but without luck
[10:25:16 CEST] <Skullclown> Mavrik: x264/libx264.a error: undefined reference to 'log2f'
[10:56:56 CEST] <Skullclown> Willing to pay a beer for anyone who can help me get this compiled :P
[10:58:50 CEST] <Mavrik> We'd love to help you but you're really not providing usable information.
[10:59:00 CEST] <Mavrik> Where are your configure, compilation logs?
[11:01:17 CEST] <Skullclown> Mavrik: https://pastebin.com/baQndc5R
[11:01:56 CEST] <Mavrik> First of all, are you sure you want to compile ffmpeg with clang? :)
[11:02:16 CEST] <Skullclown> No, but NDK 16 wouldn't get compiled with gcc :(
[11:02:26 CEST] <Skullclown> Sorry, I phrased that wrong.
[11:02:37 CEST] <Skullclown> ffmpeg wouldn't get compiled using NDK16 + gcc
[11:02:37 CEST] <Mavrik> Second of all, did you set Android API level when creating the standalone toolchain?
[11:02:53 CEST] <Skullclown> I did, I've tried 16, 18 and 19
[11:02:53 CEST] <Mavrik> Perhaps you need to target newer API level.
[11:03:01 CEST] <Mavrik> And the issue seems to be with libx264.a compilation already
[11:03:25 CEST] <Skullclown> Mavrik: that was my latest guess. But all I can give is guesses, compilation isn't my usual thing.
[11:05:28 CEST] <Mavrik> Well, try to rebuild with newer NDK platform first
[11:05:42 CEST] <Skullclown> NDK 16 is latest
[11:05:43 CEST] <Mavrik> Since if you're building for Android 1.5 it might not have everything you need.
[11:05:52 CEST] <Mavrik> Not the NDK package version.
[11:06:00 CEST] <Mavrik> But the version of the operating system you're trying to link against.
[11:06:17 CEST] <Mavrik> If you want your code to run on Android 1.5 you're linking against different libs than if you're trying to run on Android 6.0
[11:06:24 CEST] <Mavrik> And 1.5 might not have log2f at all ;)
[11:06:53 CEST] <Mavrik> That's usually the core issue, if you don't check you can easily end up compiling for Android 1.0 or something dumb like that.
[11:07:08 CEST] <Mavrik> And do use GCC for ffmpeg. ;P
[11:07:38 CEST] <Mavrik> Since ffmpeg might also not properly configure Clang when compiling for these platforms.
[11:08:18 CEST] <Skullclown> Mavrik: alright I'll use gcc. OS version is Android API level, or am I misunderstanding?
[11:08:53 CEST] <Mavrik> Yes, I think it's called API level or platform level.
[11:08:58 CEST] <Mavrik> Try to use something a bit saner like API 15
[11:09:18 CEST] <Mavrik> And remember that you need to rebuild both always ;)
[11:09:47 CEST] <Skullclown> Mavrik: I used 16, 18 and 19 before. Why would 15 be saner?
[11:10:04 CEST] <Mavrik> It wouldn't, I thought you might want to use something older.
[11:10:16 CEST] <Mavrik> Any of those should be more than good enough for ffmpeg.
[11:10:46 CEST] <Skullclown> Now I'm a little confused, then why was this relevant?
[11:12:07 CEST] <Mavrik> Because I didn't know which API level you pointed your ffmpeg compilation at.
[11:12:13 CEST] <Mavrik> Not psychic.
[11:12:35 CEST] <Skullclown> But I've told you which one I used before
[11:14:37 CEST] <Mavrik> 1.) You didn't. 2.) Why do you think debating this helps you?
[11:14:52 CEST] <Mavrik> (Again, NDK version isn't quite the same as the API level of the platform you're linking against.)
[11:15:40 CEST] <Skullclown> Mavrik: https://i.imgur.com/2wigxWs.png
[11:16:47 CEST] <Skullclown> 2) Because if I'm assumed to be lying, I don't expect anyone would be willing to help properly
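An untested sketch of the setup being recommended in this exchange (NDK r16's make_standalone_toolchain.py; the paths, API level, and exact configure flags are assumptions, and x264 has to be built with the same toolchain before ffmpeg is configured):

    "$NDK/build/tools/make_standalone_toolchain.py" --arch arm --api 19 \
        --install-dir "$HOME/android-toolchain"
    export PATH="$HOME/android-toolchain/bin:$PATH"
    ./configure --enable-cross-compile --target-os=android --arch=arm \
        --cross-prefix=arm-linux-androideabi- --enable-libx264 \
        --extra-cflags="-I../x264" --extra-ldflags="-L../x264"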
[12:25:23 CEST] <alone-y> hello, is it possible to split a file not by time but by size?
[12:42:12 CEST] <alone-y> .
[12:42:37 CEST] <durandal_1707> no
[12:43:01 CEST] <alone-y> thank you durandal_1707
[14:09:39 CEST] <alone-y> anyone know why "-f segment -segment_time" increases the file size so much while capturing the screen?
[14:11:40 CEST] <Skullclown> Tried again: ffmpeg finally compiled for android w/ libx264 enabled but now whenever I use the executable on android it exits with error code 132, which means "illegal instruction" according to Google. Any ideas?
[14:13:01 CEST] <Skullclown> I imagine I must've set a wrong target platform or API, but I've tried changing what I have (I'm just guessing at this point) but no luck
[14:32:26 CEST] <svetlanak> Hi, I'm reading a file with ffmpeg and the audio is out of sync. It plays correctly in Chrome and VLC. Checking the file with MediaInfo shows a duration_firstframe field with the value 335ms on the audio channel only. It appears to be the sync gap.
[15:59:55 CEST] <jrun> what is this? ERROR: nvenc requested, but not all dependencies are satisfied: ffnvcodec
[16:00:08 CEST] <jrun> i'm getting that trying to install on gentoo from git
[16:02:41 CEST] <BtbN> you need the headers from ffnvcodec for nvenc/dec
[16:02:55 CEST] <BtbN> http://git.videolan.org/?p=ffmpeg/nv-codec-headers.git;a=summary
[16:04:48 CEST] <jrun> BtbN: would this do?  https://github.com/lu-zero/nvidia-video-codec
[16:04:52 CEST] <jrun> is it the same thing?
[16:04:57 CEST] <BtbN> no
[16:10:00 CEST] <jrun> hmm, ebuild for gentoo pulls this as the dependency for nvenc:  https://developer.nvidia.com/nvidia-video-codec-sdk
[16:11:02 CEST] <jrun> probably not the same?
[16:13:44 CEST] <jrun> which ends up installing this on the system: /usr/include/nvEncodeAPI.h
[17:23:50 CEST] <pgorley> hi, i'm using the remuxing.c example to save an rtp stream, but i'm getting this warning: Timestamps are unset in a packet for stream 1. This is deprecated and will stop working in the future. Fix your code to set the timestamps properly
[17:24:06 CEST] <pgorley> how do I correctly set the timestamps for my file?
[17:26:31 CEST] <pgorley> in libavformat/mux.c, it sets the pts to st->internal->priv_pts->val, but i don't have access to the internal field
[17:49:22 CEST] <BtbN> jrun, the ebuild is wrong then.
[17:49:40 CEST] <BtbN> That header was never required; the headers used to be bundled with ffmpeg and have now moved to the repo mentioned above.
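For anyone hitting the same "ffnvcodec" configure error, the headers BtbN linked install roughly like this (the clone URL is derived from the gitweb link above; default prefix is /usr/local, which configure then finds through pkg-config):

    git clone https://git.videolan.org/git/ffmpeg/nv-codec-headers.git
    cd nv-codec-headers && sudo make install
    cd ../ffmpeg && ./configure --enable-nvenc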
[19:55:53 CEST] <SpeakerToMeat> Hello all
[19:56:14 CEST] <SpeakerToMeat> Question, is there any way to use ff to play two image sequences in an over/under format?
[19:58:28 CEST] <durandal_1707> SpeakerToMeat: with mpv
[20:02:30 CEST] <SpeakerToMeat> durandal_1707: How do I feed it the two file sets?
[20:02:52 CEST] <furq> SpeakerToMeat: -i foo%d.png -i bar%d.png -vf hstack -f nut -c:v rawvideo - | mpv -
[20:02:59 CEST] <furq> or vstack, rather
[20:03:28 CEST] <SpeakerToMeat> Nod, thanks
[20:03:48 CEST] <SpeakerToMeat> I was looking at the stereo3d filter, but of course it doesn't apply since these are separate files.
[20:06:15 CEST] <SpeakerToMeat> hmmm
[20:06:32 CEST] <SpeakerToMeat> "Simple filtergraph 'vstack' was expected to have exactly 1 input and 1 output. However, it had >1 input(s) and 1 output(s)"
[20:06:43 CEST] <furq> oh yeah sorry
[20:06:46 CEST] <furq> -lavfi vstack
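Putting furq's corrected suggestion together, the full pipeline would be roughly the following (image sequence names are placeholders):

    ffmpeg -i top_%d.png -i bottom_%d.png -lavfi vstack -f nut -c:v rawvideo - | mpv -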
[20:11:27 CEST] <SpeakerToMeat> I should try to fix and port a cuda jpeg2000 decoder someday
[20:13:59 CEST] <SpeakerToMeat> I wonder why use mpv, can't ffplay use filters like stack? or multiple inputs, filters map?
[20:15:46 CEST] <ChocolateArmpits> maybe if ffplay supports lavfi filter input with multiple movie filters
[20:17:40 CEST] <furq> mpv supports all the libavfilter stuff
[20:17:46 CEST] <furq> what it doesn't handle well is multiple inputs iirc
[20:17:47 CEST] <SpeakerToMeat> ChocolateArmpits: one of my cats would love you.
[20:18:10 CEST] <furq> it has some limited support for that but it's not as flexible or obvious as piping from ffmpeg
[20:18:36 CEST] <SpeakerToMeat> nod
[20:20:42 CEST] <durandal_1707> furq: lies
[20:20:42 CEST] <Aerroon> i wonder what would give better image quality
[20:20:52 CEST] <Aerroon> vs render time and size
[20:20:56 CEST] <durandal_1707> furq: mpv supports lavfi-complex perfectly
[20:21:00 CEST] <Aerroon> x264 or hevc_nvenc
[20:21:01 CEST] <furq> yeah i said that
[20:21:13 CEST] <durandal_1707> ffplay does not support lavfi-complex at all
[20:21:22 CEST] <Aerroon> sorry, better image quality at the same size*
[20:22:03 CEST] <SpeakerToMeat> durandal_1707: He's just covering himself in case I come back to rant irrationally
[20:22:34 CEST] <furq> actually i misread the question
[20:22:41 CEST] <furq> use mpv because ffplay sucks
[20:29:53 CEST] <jonascj> I have a wmv video at 25fps. Is there a way to extract frame 0, 1, 25, 26, 50, 51, 75, 76 etc. without extracting all frames and discarding?
[20:30:49 CEST] <jonascj> What I want to do is reduce the FPS to make a time-lapse montage/video and I want to sum 2 neighbor frames to reduce noise (that is why I need frame 0,1,25,26 etc.)
[20:32:23 CEST] <kepstin> jonascj: that's unlikely to be a pattern that matches keyframes, so yes - you're going to have to decode all the frames and discard extra ones
[20:33:35 CEST] <jonascj> It's not crucial that it is frame 0,1,25,26 etc., what about just "almost certainly neighbor frames, roughly 1 sec apart"?
[20:34:03 CEST] <jonascj> well maybe the fastest in terms of learning new ffmpeg switches will be to just extract everything and discard
[20:34:22 CEST] <jonascj> Only issue is the number of files, maybe I can run a purge job in parallel deleting frames as they are generated :P
[20:35:32 CEST] <kepstin> jonascj: the easiest way to do this in the ffmpeg command line tool is to use the "select" video filter with an expression that matches the frames you want to keep
[20:35:56 CEST] <jonascj> kepstin: cheers, I'll look at the select filter
[20:38:04 CEST] <furq> jonascj: select=lt(mod(n\,25)\,2),tblend
[20:39:53 CEST] <jonascj> furq: a modulo operation is just the thing, so that says (n % 25) < 2, which will give me two frames for each 25 frames, neat
[20:51:47 CEST] <kerio> output rawvideo with muxer rawvideo into a script or something
[20:57:14 CEST] <jonascj> kerio: I was thinking along those lines as well, redirecting output to a script, but I think the "select" filter will suffice
[20:57:43 CEST] <kerio> how do you take the mean of nearby frames, though?
[20:59:38 CEST] <jonascj> kerio: If I just get the frames (as image files) I'll just make a second pass with ImageMagick or similar
[21:36:19 CEST] <furq> jonascj: tblend will blend every pair of frames, that's why i put it there
[21:36:24 CEST] <furq> !filter tblend @jonascj
[21:36:24 CEST] <nfobot> jonascj: http://ffmpeg.org/ffmpeg-filters.html#blend_002c-tblend
[21:37:20 CEST] <jonascj> furq: ah, so that was already a part of your select-filter line
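An untested sketch of the whole idea (keep two neighbouring frames out of every 25, blend each pair with tblend, then drop the unblended partner; the second select and the output pattern are assumptions):

    ffmpeg -i input.wmv \
        -vf "select='lt(mod(n\,25)\,2)',tblend=all_mode=average,select='eq(mod(n\,2)\,1)'" \
        -vsync vfr timelapse_%04d.png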
[21:44:18 CEST] <MASM> Hello, I'm trying to decode data from a listening socket with ffmpeg: ffmpeg -listener 1 -i tcp://127.0.0.1:1212     when I run this I don't see data coming into ffmpeg, but when I run a simple TCP server, I see raw data coming in
[21:44:51 CEST] <MASM> the incoming audio format is g726
[21:46:00 CEST] <BtbN> I never heard of -listener, where is this documented?
[21:47:06 CEST] <MASM> BtbN: sorry it is "-listen"
[21:47:18 CEST] <MASM> BtbN: ffmpeg -listen 1 -i tcp://127.0.0.1:1212
[21:47:52 CEST] <BtbN> -listen 1 is for the http server
[21:48:05 CEST] <c_14> you want ?listen afaik
[21:48:09 CEST] <c_14> at the end of the uri
[21:50:51 CEST] <MASM> I have a DVR that points to an IP and port, and with ffmpeg I successfully decode the video to HLS, but I can't get the audio from it. My ffmpeg example for the video:     ffmpeg -listen 1 -i tcp://192.168.0.11:1212 -f hls -g2 -segment_list_type hls -hls_allow_cache 0 -segment_list_flags +live -hls_time 5 output.m3u8
[21:53:27 CEST] <MASM> with this command I convert the incoming h264 video to HLS, but the incoming audio is g726
[21:54:03 CEST] <MASM> and I have been trying a lot of things to convert the incoming data (hex) to an audio format
[21:54:24 CEST] <c_14> if the video works, I see no reason the audio shouldn't
[21:55:31 CEST] <MASM> fflogger: thanks
[21:58:59 CEST] <MASM> c_14: the problem is that ffmpeg doesn't detect the format of the audio
[22:00:54 CEST] <c_14> try forcing it with -c:a g726 (as an input option)
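Building on c_14's suggestion, a possible (untested) invocation for a raw G.726 feed; the -f g726 raw demuxer and its sample_rate/code_size options are assumptions about the build in use (check ffmpeg -h demuxer=g726), and the values and listen address must match what the DVR actually sends:

    ffmpeg -listen 1 -f g726 -sample_rate 8000 -code_size 4 -i tcp://192.168.0.11:1212 \
        -c:a aac -f hls -hls_time 5 audio.m3u8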
[22:58:59 CEST] <jpabq> When building ffmpeg, is there a way to get it to show the actual compile/link commands being used?
[22:59:30 CEST] <c_14> make V=1 iirc
[23:00:36 CEST] <jpabq> c_14: THANK YOU!    I have been searching for that for a while.   I don't know why I didn't ask here earlier.
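For the record, the verbose-build trick mentioned here, optionally with a copy kept for later inspection:

    make V=1                        # prints the full compile/link command lines
    make V=1 2>&1 | tee build.log   # same, but also saved to build.log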
[00:00:00 CEST] --- Wed Apr 18 2018

