[Ffmpeg-devel-irc] ffmpeg.log.20171016

burek burek021 at gmail.com
Tue Oct 17 03:05:01 EEST 2017


[01:05:52 CEST] <Pandela> Does FFmpeg support directshow output?
[01:35:05 CEST] <g0n> Pandela yes
[01:36:08 CEST] <Pandela> g0n: As in like a directshow device readable by 3rd party apps? Could you show me an example?
[01:38:45 CEST] <Cracki_> dshow display is probably what they meant
[01:38:58 CEST] <Cracki_> not a dshow source you can pull from
[01:42:36 CEST] <Pandela> Cracki_: Exactly, similar to how manycam or splitcam has their own. Or even this https://github.com/CatxFish/obs-virtual-cam
[01:42:49 CEST] <Cracki_> hmmm
[01:43:19 CEST] <Cracki_> interesting application
[01:43:28 CEST] <Cracki_> how does such a virtual device get its data fed?
[01:44:45 CEST] <Pandela> https://en.wikipedia.org/wiki/GraphEdit
[01:44:45 CEST] <Pandela> Thats a good question, i dont know the actual process but you can usually do it manually with graphedit
[01:46:46 CEST] <Cracki_> surely that OBS stuff can be adapted
[01:47:21 CEST] <Cracki_> I figure these sources are created and destroyed by whatever provides the data, and live only so long
[01:48:25 CEST] <Pandela> I mean OBS uses ffmpeg anyway, i feel like it could
[01:49:13 CEST] <Pandela> Who wouldnt want to use ffmpeg output as a virtual webcam
[01:49:19 CEST] <Pandela> Wonder what the dev channel would say lol
[06:39:39 CEST] <andjarnic> Hello all. I discovered the wonderful ability to concatenate multiple videos into one, and using the -c copy option to avoid rendering..just muxing. Question though.. Given that it is not rendering but simply putting one file after another, thus just writing the files out to one combined file, why is it so slow?
[06:39:53 CEST] <andjarnic> It is doing like 5fps.. on my quad core system with 64GB RAM and SSD drives.
[06:40:28 CEST] <andjarnic> Seems like it should take a matter of minutes or less to combine files together into one.
[07:32:31 CEST] <furq> andjarnic: pastebin the command
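For reference, stream-copy concatenation is normally I/O-bound, so 5 fps usually means ffmpeg is silently re-encoding (e.g. `-c copy` missing or placed before `-i`). A typical concat-demuxer invocation, sketched with hypothetical filenames, looks like:

```shell
# list.txt names the parts in playback order
printf "file '%s'\n" part1.mp4 part2.mp4 > list.txt

# -c copy remuxes packets without decoding, so speed tracks the disk, not the CPU
ffmpeg -f concat -safe 0 -i list.txt -c copy joined.mp4
```

Note that `-c copy` must come after the `-i` option to apply to the output.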
[09:33:53 CEST] <g0n> hi
[09:34:56 CEST] <g0n> I have a series of packets coming from an application with fields: dts, pts, data, duration.
[09:35:26 CEST] <g0n> These packets are of variable size.
[09:36:32 CEST] <aem34i> hey folks - Is there an option to set (or a patch) for ffmpeg to auto-connect to my jack ports (e.g. Ardour Jack ports)? (I'm using Jack2 and I don't want to use qjackctl's patchbay)
[09:36:47 CEST] <g0n> Can someone point me in the right direction for encoding this into a file?
[09:40:26 CEST] <blap> g0n: maybe output redirection will help, using ">"
[09:43:27 CEST] <g0n> Im not using the command-line, but the C interface. is that something usable in c?
[09:53:31 CEST] <blap> no
[09:58:26 CEST] <g0n> Thats what I thought. thanks anyway.
[10:00:03 CEST] <g0n> From what i know I think i have to somehow put this in avpackets, from that encode avframes and then build the video
[10:00:24 CEST] <JEEB> decode you mean :)
[10:00:29 CEST] <JEEB> but yes, you need a demuxer
[10:00:51 CEST] <JEEB> or if you already have the packets in a data structure you need to create an AVPacket
[10:00:58 CEST] <JEEB> and then set the fields accordingly with the buffer
[10:01:20 CEST] <JEEB> then that can be utilized to get decoded things as AVFrames with the normal AVPacket-ingesting decoding APIs :)
[10:01:32 CEST] <JEEB> then you can encode that with encoders that ingest AVFrames
[10:01:57 CEST] <JEEB> g0n: do note that if you're already getting raw data then you might want to just skip to generating AVFrames
[10:02:15 CEST] <Nacht> Anyone knows which configure option I need to enable to use dshow ?
[10:02:26 CEST] <g0n> Nice explanation
[10:02:53 CEST] <g0n> How would i go about skipping the avframes part?
[10:02:54 CEST] <JEEB> Nacht: on windows with a new enough SDK/mingw-w64 it should just get enabled
[10:03:10 CEST] <JEEB> g0n: you generate an AVFrame and set fields accordingly as well
[10:03:11 CEST] <g0n> Im almost sure it's raw data
[10:03:24 CEST] <JEEB> and raw data as in already decoded stuff
[10:04:40 CEST] <g0n> Good question...
[10:04:55 CEST] <JEEB> if you are for example getting raw packets of H.264 NAL units
[10:04:58 CEST] <JEEB> then that is no longer raw
[10:05:14 CEST] <JEEB> as that needs a H.264 decoder to be made into raw pictures
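The packet-wrapping JEEB describes can be sketched against the libavcodec API. This is a minimal sketch, assuming the application hands over an av_malloc'd buffer (with AV_INPUT_BUFFER_PADDING_SIZE padding) plus pts/dts/duration, and an already-opened decoder context; error handling and codec setup are omitted, and `feed_packet` is a hypothetical helper name.

```c
#include <libavcodec/avcodec.h>

/* Wrap one application packet in an AVPacket, set the timing fields,
 * and hand it to the decoder (which will then produce AVFrames via
 * avcodec_receive_frame). */
static int feed_packet(AVCodecContext *dec, uint8_t *buf, int size,
                       int64_t pts, int64_t dts, int64_t duration)
{
    AVPacket *pkt = av_packet_alloc();
    if (!pkt)
        return AVERROR(ENOMEM);

    /* Takes ownership of buf; buf must be av_malloc'd and padded. */
    int ret = av_packet_from_data(pkt, buf, size);
    if (ret < 0) {
        av_packet_free(&pkt);
        return ret;
    }
    pkt->pts      = pts;
    pkt->dts      = dts;
    pkt->duration = duration;

    ret = avcodec_send_packet(dec, pkt);
    av_packet_free(&pkt);
    return ret;
}
```

If the data is already raw (decoded) pictures, skip this entirely and fill an AVFrame instead, as JEEB suggests.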
[10:06:00 CEST] <g0n> Well, i guess Ill try the first approach, with avframe, and see where that goes...
[10:06:39 CEST] <aem34i> Can someone plz answer my question?
[10:07:04 CEST] <g0n> The size of a 77 min video is about 60 GB (1280x720)
[10:11:18 CEST] <g0n> Thanks for pointing me in the right direction and correcting my terms, that will sure help!!!
[10:13:42 CEST] <JEEB> aem34i: not many people using JACK input module here unfortunately
[10:13:56 CEST] <aem34i> JEEB: :(
[11:54:56 CEST] <bove> Should I be worried about RTP: missed n packets when using an rtsp input? Does that mean the frames where lost. And is there anything I can do to prevent it? https://pastebin.com/F6VSQzn6
[11:57:45 CEST] <BtbN> It's UDP. If you're on a shitty connection, there is going to be loss.
[11:59:11 CEST] <bove> BtbN: I tried ?rtsp_flags=prefer_tcp, but it didn't help much
[12:01:49 CEST] <BtbN> No idea how rtsp is implemented, but even when using tcp, if it can't keep up, it's going to drop stuff
[12:03:08 CEST] <bove> I'm thinking maybe there's a way to create a buffer between my rtsp input and http output?
[12:04:03 CEST] <BtbN> The sender is dropping stuff.
[12:05:06 CEST] <BtbN> It's a real-time protocol after all, as the name implies
[12:15:36 CEST] <bove> I see. I was hoping to get smoother playback by converting to http before piping it across the internet, but just piping the rtsp stream gives smoother playback
[12:30:12 CEST] <AlienPenguin> hi all, i have a question regarding the ffmpeg windows binary packages... when i use those in a VC 2010 project of mine i get mixed results: debug builds are fine, but release builds won't start because i get the following error: impossible to find entry point ?fastFree@cv@@YAXPAX@Z in avcodec-57.dll
[12:30:25 CEST] <AlienPenguin> Can anyone point me in the right direction?
[12:30:47 CEST] <JEEB> zeranoe is making those unofficial binaries
[12:32:08 CEST] <furq> are you using opencv
[12:32:35 CEST] <furq> the zeranoe builds are built with mingw so i wouldn't expect them to play nicely with a C++ MSVC project
[12:32:52 CEST] <AlienPenguin> furq, yes my project already linked opencv before i started adding ffmpeg
[12:33:28 CEST] <AlienPenguin> furq, so you think i have no chance with VC? there is no doc about compiling ffmpeg with VC, only using mingw
[12:33:41 CEST] <furq> https://trac.ffmpeg.org/wiki/CompilationGuide/MSVC
[12:33:46 CEST] <furq> idk if it's strictly necessary though
[12:33:55 CEST] <furq> i've not touched C++ enough to know what the actual problem is
[12:34:17 CEST] <JEEB> mingw-w64 itself should be OK, but indeed opencv could be a problem
[12:34:26 CEST] <JEEB> esp. since static linking is not utilized
[12:34:31 CEST] <JEEB> mingw-w64 + MSVC + static linking is a PITA
[12:35:02 CEST] <JEEB> but yes, you can compile with MSVC but support for C99 features was added in an update to VC2013
[12:35:10 CEST] <JEEB> so 2010 is awfully outdated in that sense
[12:35:23 CEST] <JEEB> it would probably be simpler to build with a modern mingw-w64 without opencv on FFmpeg's side
[12:35:33 CEST] <JEEB> that way it wouldn't try to get things from the FFmpeg DLL
[12:35:36 CEST] <AlienPenguin> why should opencv be a problem? do you think the zeranoe builds are using it? the strange thing is that when building in debug mode everything works fine
[12:35:47 CEST] <JEEB> probably because your VC library is different then
[12:35:50 CEST] <furq> well that missing symbol is from opencv
[12:36:03 CEST] <furq> at least according to google
[12:38:46 CEST] <JEEB> http://ffmpeg.zeranoe.com/builds/readme/win64/static/ffmpeg-20171014-0655810-win64-static-readme.txt
[12:38:50 CEST] <JEEB> what Zeranoe links
[12:39:09 CEST] <JEEB> or actually this http://ffmpeg.zeranoe.com/builds/readme/win64/shared/ffmpeg-20171014-0655810-win64-shared-readme.txt
[12:39:21 CEST] <JEEB> he has vid.stab there which might be using opencv?
[12:40:27 CEST] <AlienPenguin> ok, i think i have to try and compile that by myself then...
[12:40:39 CEST] <AlienPenguin> however it's a PITA that i am stuck with VS2010
[12:41:04 CEST] <furq> i assume you can build ffmpeg with mingw as long as you don't link any C++ libs
[12:41:36 CEST] <furq> which is still annoying but nowhere near as annoying as building it with msvc
[12:51:54 CEST] <JEEB> AlienPenguin: yea just build shared with mingw-w64 and link like with zeranoe's
[12:59:46 CEST] <AlienPenguin> JEEB, i currently do not have a mingw toolchain, however the app is still 32bits, so i guess mingw64 is a no-go
[13:08:07 CEST] <furq> mingw-w64 will build for i686
[13:08:27 CEST] <furq> the name is confusing
[13:08:43 CEST] <furq> the original mingw couldn't build for amd64, so mingw-w64 is a fork that can
[13:09:15 CEST] <furq> and because original mingw was completely stagnant, everyone uses -w64 now
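For a 32-bit build, the cross-prefix is what selects the architecture. A hedged configure sketch, assuming an i686-w64-mingw32 cross toolchain (as shipped by Debian or MSYS2) is installed:

```shell
# Build 32-bit shared FFmpeg DLLs with mingw-w64
./configure --target-os=mingw32 --arch=x86 \
    --cross-prefix=i686-w64-mingw32- \
    --enable-shared --disable-static
```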
[14:30:31 CEST] <SolarAquarion> ERROR: libflite not found
[14:30:38 CEST] <SolarAquarion> somethings going on
[14:31:56 CEST] <JEEB> do you need that one?
[14:32:02 CEST] <JEEB> I mean, only enable stuff that you need
[14:32:20 CEST] <JEEB> a basic `./configure` will already give you 99% of all decoders and thus you only need to care about encoders
[14:32:37 CEST] <JEEB> specific encoders that you might need and which have a "lib" prefix
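In other words, start from the default configuration and add external libraries one at a time; this is a sketch with example encoder libs, not a recommended set:

```shell
# Everything needed for decoding is already in the default build:
./configure

# Then pull in only the external encoders you actually use
# (exact flag names are listed by ./configure --help):
./configure --enable-gpl --enable-libx264 --enable-libmp3lame
```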
[15:00:35 CEST] <rabbe> anyone tried using SimpleWebRTC using only one master to stream his video to the other 'slaves'?
[15:09:16 CEST] <Nacht> I only ever tried with a modified omxplayer to have a master-slave setup
[15:09:27 CEST] <Nacht> But that was due to the fact I was doing in on RPi's
[15:14:58 CEST] <rabbe> using rtsp to send the video?
[15:15:31 CEST] <rabbe> with an rtsp server?
[15:16:40 CEST] <Nacht> Nah, they all shared the same VOD and played that.
[15:17:08 CEST] <Nacht> Now I'm just using multicast
[15:17:19 CEST] <Nacht> Which works decent, but not perfect.
[15:17:52 CEST] <Nacht> If the video buffers don't really line up, you still get some sync issues. But on the Rpi you don't really have much variation in players.
[15:17:58 CEST] <Nacht> As in, you're stuck with OMXplayer
[15:18:13 CEST] <rabbe> k.. i need to stream realtime video
[15:18:53 CEST] <rabbe> and i would like to record and be able to watch in slow motion also... :) impossible?
[15:20:00 CEST] <Nacht> Record what ?
[15:20:14 CEST] <rabbe> the master's video
[15:21:23 CEST] <Nacht> I'm not really sure what you're trying to do here. Start with explaining what you're trying to make/achieve
[15:27:42 CEST] <rabbe> camera on a robot arm. video signal captured with capture card. mobile devices like android or ipad tablet that should watch the realtime stream. functionality to start and stop recording and the mobile devices should be able to watch the recordings also, in slow motion if they want
[15:28:33 CEST] <rabbe> preferably in a web environment, but using VLC would be ok also
[15:29:16 CEST] <rabbe> i've tried flv over rtmp, but i cannot get the same nice latency in VLC as by using ffplay -fflags nobuffer
[15:30:09 CEST] <Nacht> How many fps does the camera give off, and how many does the capture card support
[15:30:47 CEST] <rabbe> i read that rtsp would give better latency when played with VLC, but i couldn't get v4l2rtspserver to work because of the v4l2loopback thing that i couldn't get working
[15:31:16 CEST] <rabbe> it's a gopro hero5. capture card supports 1080p at 60fps which the camera can provide
[15:33:23 CEST] <Nacht> RTMP has the advantage that you can show it on websites.
[15:33:35 CEST] <Nacht> RTSP also, but you need an activeX component
[15:33:43 CEST] <rabbe> without plugins?
[15:33:51 CEST] <Nacht> RTMP would need Flash
[15:34:07 CEST] <rabbe> yeah..
[15:34:07 CEST] <Nacht> thing is, for low latency streams, you're still stuck using old tech
[15:34:22 CEST] <Nacht> HTTP streams create latency, sadly
[15:34:54 CEST] <rabbe> but webrtc is supposed to be the sh*t nowadays, right?
[15:35:06 CEST] <Nacht> Hence, when you're using hardware products with built-in web features which allow you to see the actual video, you're looking at an RTMP stream
[15:35:55 CEST] <rabbe> or RTSP?
[15:38:15 CEST] <rabbe> i figured that an IP camera is not built for twisting and turning in strange ways, which was why i decided to go with an action camera..
[15:39:15 CEST] <robswain[m]> The Stockholm subway uses axis cameras for monitoring everything. They use gstreamer and rtsp/rtp I believe
[15:39:38 CEST] <robswain[m]> Depends how low latency you need
[15:40:07 CEST] <robswain[m]> And also what reliability you need
[15:40:21 CEST] <rabbe> initially i was gonna use the udp stream available through the gopro's own wifi, but i didn't think it provided production quality because of the jitter
[15:40:42 CEST] <rabbe> okay
[15:40:55 CEST] <robswain[m]> I would guess streamers use rtmp over tcp because of a combination of legacy and reliability
[15:41:25 CEST] <Nacht> Mostly, in industrial areas, you go for industrial cameras
[15:41:30 CEST] <Nacht> Such as machine vision cameras
[15:41:49 CEST] <Nacht> Axis (CCTV) cameras can work as well, but they are usually not that sturdy, though a lot cheaper
[15:41:51 CEST] <robswain[m]> Reliability being perhaps more important for broadcast than end to end latency
[15:42:03 CEST] <rabbe> yeah, but the ones i checked had too low resolution
[15:45:14 CEST] <rabbe> anyways.. right now i have the following options: flv over RTMP with nginx, giving latency and longtime instability using VLC. trying to get v4l2rtspserver working so i can test v4l2rtspserver with VLC, or trying to customize SimpleWebRTC
[15:45:43 CEST] <rabbe> using VLC i guess it will not be that user friendly for an operator to get the recorded videos
[15:45:51 CEST] <rabbe> so web is preferred
[15:47:35 CEST] <rabbe> i got recording working with nginx+rtmp.. and it should work equally with recording an rtsp stream.. using webrtc i guess i would try RecordRTC... hopefully i can record on the sending side to avoid lag
[15:48:15 CEST] <Nacht> Define low
[15:48:15 CEST] <Nacht> I wouldn't really advise using consumer products for industrial purposes, as they usually don't last long
[15:49:19 CEST] <blap> glory glory to the lord. i have keycolor transparency working in X11
[15:49:45 CEST] <rabbe> the solution is built so that any HDMI camera or usb camera should work
[15:50:24 CEST] <rabbe> i liked the nice support of accessories provided by the gopro.. didn't have to think too much about constructions for the robot arm
[15:57:29 CEST] <rabbe> so.. back to webrtc.. how would i let only one client of the page provide the stream to all other clients? should i check the IP? and will i always need to open the browser to start the stream?
[15:58:26 CEST] <rabbe> i will start the stream from the same computer that hosts the page
[16:22:12 CEST] <SolarAquarion> JEEB: I compared the files I have to deb Dev files and they're the same
[16:41:10 CEST] <Zeranoe_> Does FFmpeg have a native decoder for Speex, or does it rely on the external library?
[16:41:46 CEST] <Zeranoe_> Followup question: are there legitimate reasons to use Speex today?
[16:43:26 CEST] <JEEB> very few to be honest, but maybe someone has some legacy system somewhere making speex?
[16:43:57 CEST] <furq> there's no native decoder
[16:44:08 CEST] <furq> and yeah there's no reason to use it or celt over opus for new applications
[16:51:01 CEST] <Zeranoe_> How about openh264? Is it somehow better than x264?
[16:51:31 CEST] <JEEB> it might be faster on ARM in some very specific use cases, generally it is used because of its license and how cisco distributes binaries that you don't have to pay MPEG-LA for
[16:51:39 CEST] <DHE> no. it's actually quite bad. but it does have the blessings of the licensing people, so there's that
[16:51:43 CEST] <bencoh> Zeranoe_: far from being complete
[16:51:52 CEST] <JEEB> doesn't apply to binaries you distribute of course, only cisco
[16:54:14 CEST] <Zeranoe_> Are there any on here that should be added back? https://ffmpeg.zeranoe.com/forum/viewtopic.php?f=2&t=5164#p12626
[17:03:15 CEST] <furq> i would probably keep vorbis
[17:04:08 CEST] <furq> and i personally build with modplug and gme, although i don't think i've ever actually used them
[17:04:54 CEST] <furq> openh264 is probably pointless because you'd probably want to build ffmpeg yourself anyway if you need it for the license
[17:05:47 CEST] <furq> also i know gme uses cmake (and not even very well) so i understand wanting rid of that
[17:05:57 CEST] <bencoh> it's "useful" only in that it isn't gpl
[17:06:12 CEST] <bencoh> (and doesn't taint)
[17:06:15 CEST] <Zeranoe_> furq: I'll remove x265 then
[17:06:21 CEST] <furq> lol
[17:06:26 CEST] <bencoh> ?
[17:06:32 CEST] <Zeranoe_> cmake
[17:06:37 CEST] <furq> one cmake is better than two
[17:06:41 CEST] <Zeranoe_> lots of libs are
[17:06:55 CEST] <furq> i just remember having nightmares trying to get gme to actually build a shared lib
[17:06:56 CEST] <Zeranoe_> it's what the cool kids are doing
[17:07:05 CEST] <furq> the documented flag for it didn't work
[17:07:09 CEST] <bencoh> """cool"""
[17:07:17 CEST] <SolarAquarion> i have a issue with flite
[17:07:24 CEST] <furq> SolarAquarion: do you actually need flite
[17:07:42 CEST] <SolarAquarion> furq, it's needed in the pkgbuild i use
[17:07:54 CEST] <furq> so remove it?
[17:07:55 CEST] <Zeranoe_> furq: I'll add Vorbis back; GME I can't figure out a use for, or modplug
[17:08:01 CEST] <SolarAquarion> but i guess i should remove the switch
[17:08:34 CEST] <furq> Zeranoe_: they're probably more useful for audio players using the libs
[17:08:53 CEST] <furq> i can imagine doing some kind of batch conversion with them though
[17:09:17 CEST] <Fyr> guys, what does pseudocolor in FFMPEG do?
[17:11:14 CEST] <Zeranoe_> "Alter frame colors in video with pseudocolors."
[17:11:34 CEST] <Fyr> ok, what is pseudocolors here?
[17:11:37 CEST] <brendan_> Hi there. I am using the zeranoe build to transcode using h264_qsv. When I transcode an HD file with ffmpeg -y -stats -i inputfile.mp4 -c:v h264_qsv -preset fast output.mp4, the transcode uses 1000kbps. If I use libx264, then it keeps a reasonable bitrate of > 5000kbps. I don't necessarily know the input file's bitrate to start with and was hoping that it would keep a reasonable bitrate. Any ideas why?
[17:12:27 CEST] <brendan_> Unless I specify bitrate, the h264_qsv sticks to 1000kbps
[17:12:31 CEST] <Zeranoe_> brendan_: What are your reasons to use QSV?
[17:12:54 CEST] <brendan_> I am trying to transcode quicker (i.e. use GPU
[17:13:32 CEST] <brendan_> I see this line in the console: Stream #0:0(und): Video: h264 (h264_qsv) (avc1 / 0x31637661), nv12, 1920x1080 [SAR 1:1 DAR 16:9], q=2-31, 1000 kb/s, 25 fps, 12800 tbn, 25 tbc (default)
[17:14:43 CEST] <brendan_> I want to use nvenc when I can, but default to qsv when I only have an intel i5/i7 processor
[17:17:53 CEST] <Zeranoe_> brendan_: It's probably the rate control, which varies by device
[17:18:57 CEST] <brendan_> I tried adding -crf 23... same result... this is what I tried: .\ffmpeg -y -stats -i input.mp4 -c:v h264_qsv -crf 23 -preset fast output.mp4
[17:19:16 CEST] <brendan_> Is there another way to set this? New to me
[17:19:50 CEST] <Zeranoe_> brendan_: what if you use -b:v 1M?
[17:20:08 CEST] <Zeranoe_> 5M rather
[17:20:54 CEST] <brendan_> That works 100%
[17:21:12 CEST] <brendan_> I am trying to work around now knowing the bitrate of the input file
[17:21:46 CEST] <brendan_> so my input file could be any file type or resolution... which I am trying to get into a common format, and then downscale to the lower resolutions
[17:24:19 CEST] <brendan_> My fallback plan is to determine the width / height of the source file, and then to force bitrates... but I was hoping not to have to do this.
[17:24:31 CEST] <Zeranoe_> brendan_: it should be noted that qsv is pretty fragile, so be warned
[17:25:02 CEST] <brendan_> ok, well I can get around this issue if qsv has its moments. Just hoping I am not making a mistake
[17:25:02 CEST] <Zeranoe_> If you need a robust encoder, don't use it
[17:25:06 CEST] <brendan_> ok
[17:25:16 CEST] <brendan_> I assume an NVidia card and nvenc would be a lot better>
[17:25:19 CEST] <brendan_> ?
[17:25:19 CEST] <AlienPenguin> Zeranoe_, hi, i was trying to use your latest build in a project of mine (which uses opencv 3.20) with VS 2010, however when running in release mode the application won't start because of the error: Impossible to find the entry point ?fastFree@cv@@YAXPAX@Z in the dll avcodec-57.dll
[17:25:37 CEST] <Zeranoe_> brendan_: Does it work with quality? -q
[17:25:40 CEST] <AlienPenguin> when i run all in debug mode it works flawlessly
[17:25:44 CEST] <brendan_> let me check quick
[17:25:53 CEST] <AlienPenguin> do you have any insight?
[17:26:50 CEST] <Zeranoe_> AlienPenguin: Hm, my builds don't have an opencv in them, are you sure it's due to my dll?
[17:27:37 CEST] <AlienPenguin> well, i did not have any other ffmpeg library on my system and if i remove ffmpeg support the application start just fine both in release and debug
[17:28:32 CEST] <AlienPenguin> besides, as i said, this happens only if i run the application in "release" mode, when running in debug the app works just fine
[17:30:00 CEST] <Zeranoe_> AlienPenguin: It can find the dll, right?
[17:30:23 CEST] <AlienPenguin> Zeranoe_, yes, if i move the dll away then the app won't start saying it is missing the dll
[17:30:42 CEST] <Zeranoe_> AlienPenguin: I'd need a minimal code example to reproduce it
[17:31:48 CEST] <brendan_> Zeranoe_: I can't find documentation on -q... do you have a value I could use?
[17:32:07 CEST] <AlienPenguin> i have to see if i can find a snippet, unfortunately now it is quite integrated in my app code...
[17:34:54 CEST] <Zeranoe_> brendan_: I forget the range, just try -q 30
[17:36:22 CEST] <brendan_> .\ffmpeg -y -stats -i input.mp4 -c:v h264_qsv -q 30 -t 0:01:00 output.mp4 give me "More than one of: { constant qscale, lookahead, VCM } requested, only one of them can be used at a time." and "Error initializing output stream 0:0 -- Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height"
[17:38:02 CEST] <Zeranoe_> brendan_: ffmpeg -y -stats -i input.mp4 -c:v h264_qsv -q 30 -look_ahead 0 -t 0:01:00 output.mp4
[17:38:30 CEST] <arinov> why is the stream fps always 8 and the sound so shitty and laggy?
[17:38:38 CEST] <arinov> whatever i do
[17:38:41 CEST] <arinov> 320x240
[17:38:43 CEST] <arinov> the same
[17:38:50 CEST] <Zeranoe_> arinov: You forgot pretty much all the info we need
[17:39:14 CEST] <brendan_> zeranoe_: ok that still gives me 1000kb/s: Stream #0:0(und): Video: h264 (h264_qsv) (avc1 / 0x31637661), nv12, 1920x1080 [SAR 1:1 DAR 16:9], q=2-31, 1000 kb/s, 25 fps, 12800 tbn, 25 tbc (default)
[17:39:35 CEST] <brendan_> wait I lie
[17:39:56 CEST] <brendan_> that is the powershell output, but if I check the file it is a bitrate of 1885kbps
[17:40:14 CEST] <brendan_> still very low for 1080p
[17:40:15 CEST] <arinov> ffmpeg -f v4l2 -video_size 320x240 -framerate 30 -r 10 -i /dev/video0 \
[17:40:17 CEST] <arinov>     -f alsa -i pulse -ac 2 -vcodec libx264 -vf hue=s=0 -pix_fmt yuv420p -preset ultrafast -g 20 -b:v 2500k \
[17:40:19 CEST] <arinov>     -acodec libmp3lame -ar 44100 -ab 128k -threads 0 -bufsize 512k \
[17:40:21 CEST] <arinov> -f flv "rtmp://a.rtmp.youtube.com/live2/$STREAM_KEY flashver=FMLE/3.0\20(compatible;\20FMSc/1.0)"
[17:41:46 CEST] <arinov> Zeranoe_:
[17:42:03 CEST] <Zeranoe_> brendan_: Honestly I don't know if constant quality will work here. Maybe someone else here knows, but I always use -b:v myself
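The qsv encoders don't understand libx264's -crf, which is why it was silently ignored. Two hedged sketches with hypothetical filenames: an explicit target bitrate (the portable option), and the constant-quality (ICQ) mode that the qsv encoders expose as -global_quality on hardware that supports it:

```shell
# Explicit average bitrate - works across QSV generations:
ffmpeg -y -i input.mp4 -c:v h264_qsv -preset fast -b:v 5M output.mp4

# Constant-quality (ICQ) mode, where the hardware supports it;
# look-ahead must be off or it conflicts with constant qscale:
ffmpeg -y -i input.mp4 -c:v h264_qsv -global_quality 25 -look_ahead 0 output.mp4
```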
[17:42:42 CEST] <arinov> maybe i should stream 10x10?
[17:42:43 CEST] <Zeranoe_> arinov: Is this from a webcam?
[17:42:47 CEST] <arinov> yep
[17:42:48 CEST] <brendan_> Thanks Zeranoe. I think my bigger issue was trying to figure out if I was making an error in its usage. I will use -b:v
[17:42:55 CEST] <alexp> hey guys. I'm trying to fix an issue with Yadif where AVCHD videos from cameras do not get properly deinterlaced unless you manually specify the frame rate prior to input. I'm looking in vf_yadif.c but I'm not sure where I should be looking to override the frame rate detection logic
[17:43:08 CEST] <brendan_> Zeranoe... thanks for the help!
[17:43:14 CEST] <Zeranoe_> arinov: How does ffplay work with it?
[17:43:19 CEST] <arinov> but the sound lags too, and that comes from the default audio device
[17:43:37 CEST] <arinov> i need to find out, wait
[17:43:39 CEST] <Zeranoe_> arinov: no encoding, just see if you can get decent results with ffplay
[17:43:52 CEST] <arinov> perfect
[17:44:04 CEST] <arinov> fast and smooth
[17:44:19 CEST] <arinov> Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 320x240, 36864 kb/s, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
[17:44:29 CEST] <Zeranoe_> arinov: Are you getting buffer overflow errors when encoding?
[17:44:56 CEST] <arinov> ffplay or stream?
[17:45:10 CEST] <arinov> [flv @ 0x90f6c0] Failed to update header with correct duration.rate=2189.1kbits/s
[17:45:11 CEST] <Zeranoe_> arinov: the stream
[17:45:12 CEST] <arinov> [flv @ 0x90f6c0] Failed to update header with correct filesize.
[17:45:14 CEST] <arinov> ffplay says
[17:46:02 CEST] <arinov> nope
[17:46:08 CEST] <arinov> it seems nothing critial
[17:46:16 CEST] <arinov> just a couple of yellow strings
[17:46:26 CEST] <Zeranoe_> what does ffmpeg -f v4l2 -video_size 320x240 -framerate 30 -r 10 -i /dev/video0 -f alsa -i pulse -ac 2 -vcodec libx264 -vf hue=s=0 -pix_fmt yuv420p -preset ultrafast -g 20 -b:v 2500k -acodec libmp3lame -ar 44100 -ab 128k -threads 0 -bufsize 512k -f matroska | ffplay -
[17:46:29 CEST] <arinov> frame=  512 fps=8.5 q=2.0 size=   13275kB time=00:00:49.90 bitrate=2179.4kbits/s
[17:46:52 CEST] <Zeranoe_> what does ffmpeg -f v4l2 -video_size 320x240 -framerate 30 -r 10 -i /dev/video0 -f alsa -i pulse -ac 2 -vcodec libx264 -vf hue=s=0 -pix_fmt yuv420p -preset ultrafast -g 20 -b:v 2500k -acodec libmp3lame -ar 44100 -ab 128k -threads 0 -bufsize 512k -f matroska - | ffplay -
[17:47:30 CEST] <arinov> oh hate weechat and its strings
[17:47:34 CEST] <arinov> a second please
[17:47:41 CEST] Action: arinov weechat is shit
[17:49:11 CEST] <arinov> http://pastebin.centos.org/377796/
[17:49:15 CEST] <arinov> full output
[17:50:10 CEST] <arinov> sound is laggy there
[17:50:26 CEST] <arinov> tr-tr-tr-tr-tr-tr-tr-
[17:50:43 CEST] <arinov> but the video is pretty fast
[17:52:08 CEST] <arinov> oh now the sound died
[17:52:40 CEST] <arinov> video is good, the sound lags, lags, lags and died
[17:53:17 CEST] <arinov> 120 kbits/s
[17:53:21 CEST] <arinov> the best result
[17:53:23 CEST] <alexpigment> arinov: just to rule things out, have you tried keeping the audio at 48khz in case the conversion from 48k to 44.1 is causing the problem?
[17:54:05 CEST] <arinov> the better way is?
[17:54:17 CEST] <arinov> just stream 48?
[17:54:19 CEST] <alexpigment> -ar 48000 (or don't define it at all)
[17:54:23 CEST] <alexpigment> just as a test
[17:54:32 CEST] <arinov> ok
[17:55:13 CEST] <arinov> 2-5 seconds clear sound, then again tr-tr-tr-tr-tr-tr
[17:55:32 CEST] <arinov> Stream #0:1: Audio: mp3, 48000 Hz, stereo, s16p, 128 kb/s (default)
[17:55:39 CEST] <alexpigment> arinov: Ok, I just figured I'd throw that suggestion out there
[17:55:43 CEST] <arinov> it seems the sound problems here
[17:56:54 CEST] <arinov> 2-5 seconds of clear sound with 800kbits/s and then it starting lagg with 100-120 kbits/s
[17:57:29 CEST] <alexpigment> and I assume you've already raised the thread_queue_size as it suggested?
[17:58:33 CEST] <alexpigment> if not, try -thread_queue_size 512
[17:58:46 CEST] <alexpigment> or even -thread_queue_size 1024
[17:58:51 CEST] <demonicsweaters> hey all, I'm hoping somebody will have an idea here. What I want to do is take 2 MP3s, one is a complete song, the other is just a short talking clip, I want to mix them together, but when the voice is playing, I want the music level to drop abit to allow the talk to be louder, then after the talking stops, the music comes back up and the song completes
[17:59:35 CEST] <blap> a compressor/limiter would do that demonicsweaters
[17:59:52 CEST] <Zeranoe_> I don't think FFmpeg can do that in one pass
[18:00:21 CEST] <blap> pretty sure you can do it actually, mix two sources, then run through a compressor filter
[18:00:26 CEST] <arinov> the same
[18:00:37 CEST] <demonicsweaters> hmm, maybe a sidechain to create a ducking effect
[18:00:38 CEST] <alexpigment> blap: assuming that the voiceover is significantly louder, yes
[18:00:55 CEST] <arinov> maybe i should run from mp3 to ogg?
[18:01:10 CEST] <alexpigment> arinov: any tests are good ones
[18:01:11 CEST] <demonicsweaters> is sidechain compression possible in ffmpeg?
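It is: ffmpeg ships a sidechaincompress filter, and its documentation sketches exactly this ducking graph. A hedged one-pass sketch (filenames and threshold/ratio values are illustrative): the voice is split, one copy drives the compressor on the music, the other is mixed back on top.

```shell
ffmpeg -i music.mp3 -i voice.mp3 -filter_complex \
  "[1:a]asplit=2[sc][mix]; \
   [0:a][sc]sidechaincompress=threshold=0.05:ratio=8:attack=50:release=500[ducked]; \
   [ducked][mix]amix=inputs=2:duration=first" \
  out.mp3
```

Tune threshold/ratio/release to taste; release controls how quickly the music comes back up after the voice stops.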
[18:01:13 CEST] <arinov> or is this Intel, the God of throttling?
[18:01:45 CEST] <alexpigment> arinov: try -c:a copy first . if it's a problem there, you can rule out encoding altogether
[18:02:50 CEST] <arinov> hmm
[18:03:04 CEST] <arinov> it seems the sound from webcam lags
[18:03:14 CEST] <arinov> i should check any other sources
[18:04:59 CEST] <arinov> is there a way to set pulse audio input device to ffmpeg?
[18:05:11 CEST] <arinov> i have several inputs here
[18:07:46 CEST] <arinov> oh it can be changed by the system sound preferences
[18:30:00 CEST] <Johnjay> idea: a filter in addition to segment that just takes a number and splits the file into chunks
[18:30:37 CEST] <Johnjay> as in ffmpeg -i file.wav -f divide 3 output.wav => 3 files of equal length adding up to output
[18:31:06 CEST] <alexpigment> Johnjay: do you have a use case in mind?
[18:31:39 CEST] <alexpigment> it seems odd to me that your goal is ever to just get a specific number of segments
[18:31:51 CEST] <alexpigment> more than likely, you'd need a specific file size per segment
[18:32:07 CEST] <alexpigment> (or specific length)
[18:32:43 CEST] <Johnjay> alexpigment: I use an mp3 player to listen to things while i work out. But the talks I download are typically 2-3 hours in length
[18:33:05 CEST] <Johnjay> So I have to split them down into segments. And so I have to calculate exact time with segment filter
[18:33:18 CEST] <alexpigment> Johnjay: right, so the number of segments seems like an indirect way to split on length
[18:33:47 CEST] <Johnjay> Right. but if I see something is say 2 hours 30 min, then I can just tell it to divide in half
[18:34:05 CEST] <Johnjay> instead of trying to compute exact amount for segment time in seconds
[18:34:22 CEST] <Johnjay> if segment time took HH:MM:SS that would be another possibility
[18:35:31 CEST] <alexpigment> Johnjay: well, it still sounds very niche to me, but others may feel differently
[18:35:49 CEST] <Johnjay> I'm trying that out right now, maybe it does accept that format
[18:36:00 CEST] <Johnjay> damn i guess it does
[18:38:03 CEST] <alexpigment> well there you go :)
[18:38:23 CEST] <alexpigment> I actually thought specifying the HH:MM:SS was the thing you were trying to avoid initially
[18:38:31 CEST] <Johnjay> it seemed to split as 10:20 even though I told it 10:10
[18:38:53 CEST] <Johnjay> No I just assumed it wouldn't work because the -t or -to option wouldn't take HH:MM:SS before
[18:39:10 CEST] <relaxed> Johnjay: are you using the segment muxer?
[18:39:13 CEST] <Johnjay> yes
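For the two-equal-halves case the arithmetic is one division, which the segment muxer can then consume directly. A hedged sketch (talk.mp3 is a hypothetical filename):

```shell
total=$(( 2 * 3600 + 30 * 60 ))   # a 2h30m talk is 9000 seconds
half=$(( total / 2 ))             # 4500 s, i.e. 01:15:00 per piece
echo "$half"
# e.g.: ffmpeg -i talk.mp3 -f segment -segment_time "$half" -c copy part%02d.mp3
```

-segment_time also accepts HH:MM:SS directly, as found above; with -c copy the cuts land on frame boundaries, so the requested and actual split points can differ slightly (10:10 coming out as 10:20).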
[18:47:31 CEST] <Johnjay> honestly this thing is kind of weirdly designed
[18:47:41 CEST] <Johnjay> it's just a headset, no interface, and has 4GB of storage
[18:48:04 CEST] <Johnjay> but it's useless because it doesn't respect folders or let you browse them
[18:48:13 CEST] <Johnjay> just play next and play last
[18:48:43 CEST] <Johnjay> so why have 4GB of storage if you can't have playlists? Just to have 4GB of music on random play?
[18:55:02 CEST] <arinov> the sound is good without video
[18:55:11 CEST] <arinov> no lags, nothing
[18:59:31 CEST] <arinov> ok i solved the problem
[18:59:38 CEST] <BytesBacon> Is there a way to take the input video, crop out the letterboxing and change to 720p while keeping the aspect ratio? Seems anyway I run video filter I can't get it encode right, should be 1280x536 with a 2.40:1 ratio, but I keep getting 1280x720, with the same aspect ratio which changes it to 1728x720.
[18:59:42 CEST] <arinov> the resolution and other parameters
[19:00:04 CEST] <arinov> the only way to stream with webcam without sound lags - dont tweak the video
[19:00:13 CEST] <arinov> any video options will cause lags
[19:00:49 CEST] <Johnjay> BytesBacon: What do you mean keep the same aspect ratio?
[19:02:11 CEST] <arinov> -f v4l2 -i /dev/video0
[19:02:31 CEST] <BytesBacon> I'm trying to scale from 1080p that has letterboxing, so I want to remove the letterboxing and go to 720p, which ended up being 1280x536. Seems I'm not getting this -vf command right. Or do I need to encode once and then encode a 2nd time to crop out the letterboxing.
[19:04:08 CEST] <Johnjay> er so of the 1080 height pixels 720 are the image and 360 are black boxing?
[19:05:24 CEST] <BytesBacon> Yep, so when I re-encode just the 1080p, I do -vf crop=1920:804:0:128 to remove the bars. Which works well, it's just now going from 1080p to 720p, that I'm missing something.
[19:06:59 CEST] <BytesBacon> Thought -vf scale=1280:-1:force_original_aspect_ratio=decrease and using the crop before that would work, but it doesn't seem to make a 1280x536 with 2.40:1
[19:11:19 CEST] <Johnjay> so the -vf crop goes to 0,128 and then takes a rectangle of 1920 width and 804 height?
[19:13:43 CEST] <Johnjay> I'm confused because 804+128+128 = 1060, not 1080
[19:17:30 CEST] <Johnjay> idk maybe furq knows what to do
[19:23:46 CEST] <c_14> BytesBacon: get rid of the force_original_aspect_ratio maybe
[19:23:55 CEST] <c_14> Oh, eh
[19:23:59 CEST] <c_14> Are you specifying -vf twice?
[19:24:01 CEST] <c_14> don't do that
[19:24:05 CEST] <c_14> -vf crop,scale
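A sketch of c_14's "crop, then scale in one -vf" advice. Note that for a centered 804-pixel-high crop out of 1080, the y offset works out to 138, not the 128 used above (which is what Johnjay's arithmetic below also hints at). File names are placeholders.

```shell
# Centered crop offset for a 1920x804 window out of 1920x1080:
y=$(( (1080 - 804) / 2 ))
echo "$y"   # 138

# One filter chain, crop before scale (in.mkv/out.mkv assumed):
#   ffmpeg -i in.mkv -vf "crop=1920:804:0:138,scale=1280:-2" out.mkv
# scale=1280:-2 keeps the aspect ratio and rounds the height to an even
# value (1280*804/1920 = 536), which most encoders require.
```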
[19:24:06 CEST] <oddjob> hello, im recording a m3u8 stream and gets skips
[19:24:14 CEST] <oddjob> only "odd" thing I see is  for reading
[19:24:15 CEST] <oddjob> skipping 1 segments ahead, expired from playlists
[19:24:25 CEST] <oddjob> or maybe thats normal?
[19:24:28 CEST] <c_14> You're encoding too slowly
[19:24:33 CEST] <c_14> are you using -re ?
[19:24:35 CEST] <c_14> (hint, don't)
[19:24:49 CEST] <oddjob> no re-encoding at all
[19:25:08 CEST] <c_14> Well, the problem's that you're writing too slowly
[19:25:14 CEST] <c_14> What's the command you're using?
[19:26:06 CEST] <oddjob> ffmpeg -i "stream" out.mkv
[19:26:37 CEST] <oddjob> or -re -reconnect 1 -reconnect_at_eof 1 -reconnect_streamed 1 -reconnect_delay_max 4294 -i "$stream" $qmap -c copy -flags -global_header -hls_flags delete_segments -hls_time 20 -hls_list_size 60 -hls_wrap 0 -use_localtime 1 out.m3u8
[19:26:46 CEST] <c_14> get rid of the -re
[19:26:50 CEST] <c_14> like I said
[19:26:52 CEST] <oddjob> ok
[19:26:57 CEST] <oddjob> are you sure its a write problem?
[19:27:04 CEST] <oddjob> cause other streams give me no problems
[19:27:09 CEST] <oddjob> only from this stream
[19:27:26 CEST] <c_14> 99%
[19:27:49 CEST] <c_14> What that message means is that the client is reading the stream too slowly
[19:27:53 CEST] <c_14> And is lagging behind
[19:28:02 CEST] <c_14> And then source segments time out and ffmpeg has to skip them
[19:28:07 CEST] <c_14> which causes the skips
[19:28:22 CEST] <c_14> I mean, it could also be the network connection
[19:28:24 CEST] <c_14> If that's too slow
[19:28:40 CEST] <oddjob> probably not
[19:28:42 CEST] <oddjob> akademi
[19:28:46 CEST] <oddjob> 1ms away
[19:29:13 CEST] <oddjob> well ill try without re
[19:33:23 CEST] <oddjob> yes it did work
[19:33:25 CEST] <oddjob> thank god
[19:33:29 CEST] <oddjob> thanks a lot c_14 !
[19:41:46 CEST] <BytesBacon> c_14, thanks I'll give that a try.
[19:43:05 CEST] <BytesBacon> Okay so that works. But is there a way to find the letterboxing dimensions so that I don't have to know what the height is?
[19:44:07 CEST] <c_14> cropdetect
[19:44:09 CEST] <c_14> that's 2 pass though
[19:45:58 CEST] <c_14> (i.e. detect in one run, use the values in the next)
[19:51:17 CEST] <BytesBacon> Well, could I use that and set the -t switch so it's only a small output video. Would it give the needed data on cropping? Like the start point location? (x, y, and w, h)
[19:52:11 CEST] <c_14> yeah, though also use -f null. You don't need to actually create an output video, you just need to process the input
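The two-pass cropdetect workflow c_14 describes, sketched with assumed file names and a 60-second sample:

```shell
# Pass 1: analyze a minute of input; -f null means nothing is written:
ffmpeg -t 60 -i in.mkv -vf cropdetect=24:2:0 -f null -
# The log prints suggested values, e.g.:
#   [Parsed_cropdetect_0 @ ...] ... crop=1920:804:0:138

# Pass 2: feed the suggested expression back in:
ffmpeg -i in.mkv -vf "crop=1920:804:0:138,scale=1280:-2" out.mkv
```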
[19:53:23 CEST] <BytesBacon> sweet! Thank you c_14. Glad to know I can use the tool to do the hard work for me, vs me trying to find out what the end results need to be by doing the math myself.
[19:58:01 CEST] <BytesBacon> That's odd, using the cropdetect but it's getting a height of 528 and not 536. Any idea why c_14?
[19:58:21 CEST] <c_14> rounding to the nearest multiple of 16
[19:58:31 CEST] <c_14> set round=1
[19:58:52 CEST] <BytesBacon> okay, thank you
[19:59:52 CEST] <BytesBacon> It's still getting 528, with cropdetect=24:1:0
[20:00:42 CEST] <c_14> try round=0 maybe?
[20:00:45 CEST] <c_14> otherwise adjust limit
[20:00:54 CEST] <c_14> part of the bars might be too gray
[20:01:26 CEST] <BytesBacon> Ooo okay, it could be, I'll play around with it, thank you.
[20:21:04 CEST] <doublya> Pulled from ffmpeg master, and now I'm getting undefined reference to 'dlsym' 'dlopen' 'dlclose'  in alsa/src/dlmisc.c during ffmpeg configure
[20:21:34 CEST] <JEEB> yes, there recently has been a flurry of findings that the pkg-config files of things fail at life
[20:21:49 CEST] <JEEB> although wait that is alsa...
[20:22:06 CEST] <JEEB> is that building alsa or trying to link FFmpeg against alsa?
[20:22:18 CEST] <JEEB> because I'm surprised if it's showing you the C file
[20:22:22 CEST] <JEEB> of ALSA
[20:22:26 CEST] <JEEB> in the error
[20:25:03 CEST] <doublya> I've built alsa from src and I've provided the pkg-config path to ffmpeg configure
[20:25:42 CEST] <JEEB> ok, so if it's during the FFmpeg compilation then it's a case of the pkg-config file sucking
[20:25:51 CEST] <JEEB> can you check pkg-config --libs alsa or whatever its name is :)
[20:26:00 CEST] <JEEB> maybe with the static parameter
[20:26:15 CEST] <JEEB> (which adds the Libs.private stuff)
[20:28:04 CEST] <doublya> I pulled from ffmpeg to get cuda9 support. is there a commit that provides cuda9 support that doesn't have broken pkg-config?
[20:28:44 CEST] <JEEB> what I want to find out is if this is an ALSA bug or FFmpeg bug
[20:28:49 CEST] <JEEB> so please then just post your ALSA pkg-config file
[20:28:51 CEST] <JEEB> thank you
[20:29:46 CEST] <JEEB> because recently we removed additional flags and actually started trusting the pkg-config files (.pc files)
[20:30:01 CEST] <doublya> pkg-config --modversion alsa returns the correct version, and --libs alsa works as well
[20:30:11 CEST] <JEEB> can you really not fucking read?
[20:30:22 CEST] <JEEB> Please post the pc file's contents
[20:30:32 CEST] <JEEB> and tell me if you're trying to do static or shared linking
[20:30:38 CEST] <JEEB> the pc file in a pastebin or so, thank you
[20:30:41 CEST] <JEEB> and link here
[20:30:46 CEST] <doublya> static, standby
[20:30:47 CEST] <JEEB> don't paste onto the channel
[20:31:01 CEST] <JEEB> ok, yes. most issues are with static because most libraries don't make their pc files correctly
[20:31:08 CEST] <JEEB> let's see how the pc file looks like
[20:34:06 CEST] <doublya> https://pastebin.com/7jFZBX0x
[20:34:11 CEST] <JEEB> cheers
[20:34:20 CEST] <JEEB> woah
[20:34:24 CEST] <JEEB> it actually has that stuff now :)
[20:34:50 CEST] <JEEB> doublya: then most likely you were missing --pkg-config-flags="--static"
[20:34:55 CEST] <JEEB> in FFmpeg configure
[20:35:13 CEST] <JEEB> (make sure to clear your build root)
[20:35:23 CEST] <JEEB> if that still fails then pastebin the failure please
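What JEEB is asking for, as commands (library name `alsa` and the configure flag are from the conversation above; paths are system-dependent):

```shell
# Libs.private is only emitted with --static; for ALSA it should carry -ldl
# because the library dlopen()s its plugins:
pkg-config --static --libs alsa

# Make FFmpeg's configure pass --static to every pkg-config invocation:
#   ./configure --pkg-config-flags="--static" ...
# (clear the build tree first so cached configure results are discarded)
```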
[20:36:04 CEST] <doublya> I have pkg_config_flags="--static". The only thing I've done since this WAS working, was pull from ffmpeg master.
[20:36:07 CEST] <Johnjay> pc files are bogus? huh?
[20:36:36 CEST] <JEEB> can you please post the full log then of the compilation
[20:36:56 CEST] <JEEB> doublya: or wait, post make V=1
[20:37:00 CEST] <JEEB> which shows the actual commands used
[20:37:27 CEST] <JEEB> we might be missing the alsa pkg-config flags somewhere, but I want to check if that's true first
[20:37:36 CEST] <JEEB> Johnjay: yea a lot of them lack stuff to link things statically
[20:37:53 CEST] <JEEB> before we were manually adding various flags but now we're trusting the pkg-config files more
[20:38:05 CEST] <JEEB> which led to some things borking because the pkg-config files were... borked
[20:38:11 CEST] <JEEB> but some things of course can be our bugs
[20:38:35 CEST] <Johnjay> hrm, I see
[20:38:57 CEST] <JEEB> usually related to things like C++ stdlib
[20:39:01 CEST] <JEEB> which almost nothing puts in there
[20:39:06 CEST] <JEEB> or the threading or math library
[20:39:20 CEST] <JEEB> if you build the library shared those are found as they're linked against
[20:39:28 CEST] <JEEB> but with a static AR archive that's not the case
[20:39:40 CEST] <JEEB> you have to then define those dependencies once again when linking the "next level"
[20:39:59 CEST] <Johnjay> AR is the .a files right?
[20:40:03 CEST] <JEEB> ya
[20:40:07 CEST] <Johnjay> ok
[20:40:14 CEST] <JEEB> basically if you do `ldd SOME_SO_FILE.so`
[20:40:20 CEST] <JEEB> you will get a list of dependencies to shared libraries
[20:40:29 CEST] <JEEB> you don't have that with AR archives
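A sketch of the shared-vs-static difference JEEB describes; paths and the Libs.private line are illustrative, not taken from a real alsa.pc:

```shell
# A shared library records its own dependencies:
#   ldd /usr/lib/libasound.so.2
#   => libdl.so.2, libpthread.so.0, libm.so.6, ...
# A static AR archive is just a bag of object files with no dependency list:
#   ar t /usr/lib/libasound.a
# so the final link must name those libraries again (-ldl -lpthread -lm),
# which is exactly what Libs.private in a well-made .pc file provides, e.g.:
#   Libs.private: -lm -ldl -lpthread -lrt
```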
[20:40:59 CEST] <Johnjay> compilation is such a big messy task
[20:41:04 CEST] <Johnjay> i'm amazed i got ffmpeg to compile under msys2 in windows
[20:42:17 CEST] <JEEB> I've gotten used to both *nix and windows build systems and I more or less understand how things work together and what some things can cause so I don't really consider it messy at this point.
[20:42:40 CEST] <Johnjay> sort of like Han and chewy on star wars
[20:43:09 CEST] <doublya> configure test that fails for alsa https://pastebin.com/QRZnsJ6r
[20:43:28 CEST] <doublya> I changed some path names as they are confidential
[20:43:34 CEST] <JEEB> yea, that's OK
[20:44:00 CEST] <JEEB> and you say this is with the static flag for pkg-config?
[20:44:05 CEST] <JEEB> let me check that configure check
[20:44:34 CEST] <JEEB> also in case you have ALSA on your system
[20:44:45 CEST] <JEEB> can you make sure that the --static --libs thing gives you -ldl ?
[20:45:05 CEST] <doublya> yes. I have a similar problem with other depends. I have to add some LDFLAGS to configure like -lm and -lpthread
[20:45:14 CEST] <doublya> ok
[20:45:26 CEST] <relaxed> doublya: add to ffmpeg's configure, --extra-libs="-ldl"
[20:45:42 CEST] <JEEB> relaxed: I'm trying to trouble-shoot it here because his pc file seems to contain -ldl
[20:46:02 CEST] <JEEB> and I'm trying to double-check if he is actually getting the proper (the one he wants) pc file used
[20:46:08 CEST] <JEEB> no need for extra-libs or extra-ldflags
[20:46:20 CEST] <JEEB> as in, there shouldn't be a need for them
[20:47:43 CEST] <doublya> JEEB: I'll check the pkg-config output in a sec.  I just did a configure with -ldl in extra libs and I'm not getting that alsa error anymore
[20:47:54 CEST] <JEEB> well d'uh
[20:48:04 CEST] <JEEB> "guess why"
[20:48:16 CEST] <JEEB> you forcibly added it there
[20:48:23 CEST] <JEEB> thus of course the check now passes
[20:48:28 CEST] <JEEB> hmm
[20:48:30 CEST] <JEEB> oh fun
[20:48:35 CEST] <JEEB> pkg-config isn't even being used for ALSA
[20:48:41 CEST] <JEEB> that explains
[20:49:10 CEST] <relaxed> \o/
[20:49:40 CEST] <JEEB> "-lasound should be enough for everyone!"
[20:49:42 CEST] <JEEB> said someone years ago
[20:50:12 CEST] <JEEB> doublya: ok it doesn't matter if you were using the right pc file or not, since the darn thing doesn't even try to find out alsa with pkg-config
[20:50:19 CEST] <JEEB> I might whip a patch up for that
[20:50:54 CEST] <JEEB> also so darn rare for something to set its Libs.private correctly
[20:51:02 CEST] <JEEB> also makes me go all sentimental
[20:51:50 CEST] <JEEB> doublya: if I push a branch somewhere soon would you be ready to test it out?
[20:52:00 CEST] <doublya> I'm not knowledgeable enough with pkg-config to appreciate it haha.
[20:52:00 CEST] <doublya> yes
[20:52:17 CEST] <JEEB> ok
[20:53:26 CEST] <doublya> thanks
[20:56:50 CEST] <JEEB> just grabbed some hot tea because I'm not exactly feeling well today
[20:58:38 CEST] <doublya> sorry to hear. I had a cold last week.
[21:00:04 CEST] <blap> tea is the best
[21:04:10 CEST] <Mavrik> JEEB, I think ffmpeg needs more cmake to fix things ;P
[21:04:39 CEST] <JEEB> lol
[21:04:47 CEST] <JEEB> might as well go meson
[21:09:31 CEST] <JEEB> fun, my VM decided not to come back to life
[21:09:36 CEST] Action: JEEB hard-reboots
[21:16:03 CEST] <JEEB> ok, that did it :)
[21:19:09 CEST] <hanetzer> yello o/
[21:19:17 CEST] <JEEB> ok, double checked that Fedora has the pc file as well
[21:21:58 CEST] <hanetzer> a question have I, how does the ffmpeg build process check for pthreads? because, I've installed a mingw-w64 toolchain with gcc configured with "Thread model: posix" and the mingw-w64 runtime installed with the thread libraries enabled, but its not being found during my cross-emerge.
[21:22:25 CEST] <JEEB> hanetzer: checking ffbuild/config.log is what you'll probably want to do :)
[21:22:37 CEST] <JEEB> also it's highly likely that the internal pthreads implementation of posix threads is utilized
[21:22:42 CEST] <JEEB> which wraps around Windows threads
[21:25:36 CEST] <JEEB> alright, moment of truth :)
[21:27:07 CEST] <JEEB> doublya: seems like I got something working :)
[21:27:16 CEST] <JEEB> let me commit and have you try it out
[21:27:33 CEST] <JEEB> because this seems positive in config.log :) http://up-cat.net/p/35bfb8c7
[21:29:05 CEST] <JEEB> doublya: https://github.com/jeeb/ffmpeg/tree/alsa_pkgconfig_addition
[21:29:17 CEST] <JEEB> or you can specifically cherry-pick https://github.com/jeeb/ffmpeg/commit/71b92aa9f92c89c8c7f5552d0caddb9aaadc6142
[21:36:20 CEST] <Hink> how do i make a near-lossless 60fps .gif from a lossless 60fps video file?
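Hink's GIF question goes unanswered in the log; the usual approach is the two-pass palette method. One caveat: GIF frame delays are stored in centiseconds, so exact 60 fps is not representable — 50 fps (delay 2) is the practical ceiling. File names assumed.

```shell
# Pass 1: build an optimized 256-colour palette from the source:
ffmpeg -i in.mkv -vf "fps=50,palettegen" palette.png
# Pass 2: dither the video through that palette:
ffmpeg -i in.mkv -i palette.png -filter_complex "fps=50,paletteuse" out.gif
```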
[21:38:48 CEST] <hanetzer> well, theoretically it should have found it, if its looking for -pthread as a linker flag.
[21:39:40 CEST] <JEEB> hanetzer: ffbuild/config.log has the answer
[21:39:45 CEST] <JEEB> in your build root
[21:39:57 CEST] <JEEB> it contains everything that the configure script did
[21:40:12 CEST] <hanetzer> JEEB: doesn't seem to, doing a bit of searching against the string 'error:' and 'pthread'
[21:41:05 CEST] <hanetzer> but then again, the ffmpeg 'configure' script is some homegrown thingus I have very little idea how to parse ;P
[21:41:28 CEST] <hanetzer> http://bpaste.net/show/490903f961fa
[21:41:49 CEST] <JEEB> oh
[21:41:58 CEST] <JEEB> you did enable-pthreads
[21:42:13 CEST] <JEEB> if you leave it out it will default to the internal win32 thread wrapper
[21:42:28 CEST] <hanetzer> but I have pthreads in my mingw-w64 toolchain.
[21:42:36 CEST] <JEEB> (you can specifically pick it with --enable-w32threads
[21:42:49 CEST] <JEEB> hanetzer: generally the internal wrapper was found to be faster than the mingw-w64 thing
[21:42:55 CEST] <JEEB> you can test it yourself if you want
[21:43:13 CEST] <hanetzer> JEEB: are you talking about libpthreadGC2 or the one with the mingw runtime?
[21:43:21 CEST] <JEEB> libpthreadGC2 is the old one
[21:43:28 CEST] <JEEB> mingw-w64 has a newer one AFAIK
[21:43:33 CEST] <hanetzer> yep, which I have.
[21:43:50 CEST] <JEEB> yes, and that one the last time people benched was slower than the wrapper inside FFmpeg (or x264)
[21:43:58 CEST] <JEEB> but feel free to bench for yourself
[21:44:37 CEST] <hanetzer> the main issue here is, if you can tell, I'm using gentoo and I'd have to have the upstream ebuild changed to not use pthreads on windows builds :)
[21:44:58 CEST] <hanetzer> I'm just wondering why it is not finding my existing version of pthreads at all.
[21:45:24 CEST] <JEEB> I see at least one pthread test succeeding
[21:46:23 CEST] <JEEB> http://up-cat.net/p/db06c9dc
[21:46:34 CEST] <JEEB> and no errors around that
[21:47:15 CEST] <hanetzer> which is why I'm here asking questions. everything seems to be alright.
[21:49:51 CEST] <JEEB> poked -devel since I can't see the reason there
[21:50:08 CEST] <hanetzer> should I hop in there?
[21:50:15 CEST] <JEEB> you can if you want to
[21:50:35 CEST] <JEEB> also darn, doublya timed out :P
[21:50:48 CEST] <JEEB> anyone else doing alsa static here to check my pkg-config fix :P
[21:51:58 CEST] <JEEB> although I already got an OK from jamrial so I guess I'll just do the git send-email
[22:12:34 CEST] <koyglreg> What's the reason for TV having the 16-235 range while computers have 0-256?
[22:12:53 CEST] <JEEB> you mean 0-255
[22:13:03 CEST] <koyglreg> JEEB: yes
[22:13:13 CEST] <JEEB> and if you want the nitty gritty details it's 235 for Y and 240 for CbCr IIRC
[22:13:28 CEST] <JEEB> anyways, it's because broadcast needs its integer based filtering and it needs some space to overflow
[22:14:01 CEST] <JEEB> anyways, 99.9% of all YCbCr content you will find is limited range
[22:14:04 CEST] <koyglreg> JEEB, what's integer based filtering?
[22:15:07 CEST] <JEEB> I'm not going to go there :D
[22:15:31 CEST] <JEEB> but basically, they need the overhead and that's why limited range YCbCr is a thing
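The ranges under discussion, made concrete: limited ("TV") range maps 8-bit luma onto 16-235 (chroma onto 16-240), full ("PC") range uses 0-255. A quick check of the luma mapping, plus the ffmpeg filter option for converting between them (file names assumed):

```shell
# Full-range luma Y mapped into limited range: Y' = 16 + 219*Y/255
y_full=255
y_lim=$(( 16 + 219 * y_full / 255 ))
echo "$y_lim"   # 235 -- full-range white lands on limited-range white

# Range conversion with the scale filter:
#   ffmpeg -i limited.mkv -vf "scale=in_range=limited:out_range=full" full.mkv
```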
[22:17:31 CEST] <koyglreg> ok.  i've been learning the difference between RGB and YUV.  I almost always use -pix_fmt yuv420p for encoding videos, such as with libx264.  now, you said 99.9% of all YCbCr content you will find is limited range.  "yuv" in ffmpeg-speak means YCbCr, right?
[22:17:36 CEST] Action: Mavrik goes to google integer based filtering.
[22:17:51 CEST] <JEEB> koyglreg: no, the other way
[22:17:59 CEST] <JEEB> there is no digital YUV
[22:18:05 CEST] <JEEB> yet people keep calling YCbCr YUV
[22:18:10 CEST] <JEEB> like with "yuv420p"
[22:18:14 CEST] <koyglreg> JEEB: okay
[22:18:20 CEST] <JEEB> so if we were going FFmpeg-speak it would be YUV :P
[22:18:33 CEST] <koyglreg> right, so are you saying yuv420p is NOT YUV?
[22:18:45 CEST] <JEEB> yes, since YUV as far as I remember is an analog thing
[22:19:03 CEST] <furq> The scope of the terms Y2UV, YUV, YCbCr, YPbPr, etc., is sometimes ambiguous and overlapping. Historically, the terms YUV and Y2UV were used for a specific analog encoding of color information in television systems, while YCbCr was used for digital encoding of color information suited for video and still-image compression and transmission such as MPEG and JPEG.
[22:19:10 CEST] <furq> thanks wikipedia
[22:19:30 CEST] <JEEB> it's yes
[22:19:38 CEST] <koyglreg> furq: that's some of what i've read already.  so yeah, i got the impression there was some overlap
[22:19:56 CEST] <JEEB> yes, colloquially YUV is used for YCbCr, just like in "yuv420p"
[22:20:22 CEST] <koyglreg> so yuv420p is YCbCr?
[22:20:25 CEST] <JEEB> yes
[22:20:37 CEST] <JEEB> Y is luminance, C for chroma and b/r for blue/red
[22:21:00 CEST] <furq> what's P
[22:21:04 CEST] <koyglreg> so when you say almost all YCbCr content is limited range, do you mean from TV?  or do you mean ALL digital videos, such as found on youtube?
[22:21:07 CEST] <furq> oh, phase
[22:21:15 CEST] <JEEB> koyglreg: pretty much everything
[22:21:31 CEST] <JEEB> koyglreg: the only exception that's widely used is the FRAPS screen capture video format
[22:21:39 CEST] <JEEB> and JPEG I guess?
[22:21:44 CEST] <JEEB> which isn't a video format really
[22:21:52 CEST] <koyglreg> I didn't realize that.  For instance, I thought the "black" in my encoded videos was pure 0 black.
[22:21:56 CEST] <Mavrik> Isn't JPEG also slightly modified YUV? :)
[22:22:10 CEST] <JEEB> Mavrik: it's the BT.601 colorspace hardcoded, yes
[22:22:12 CEST] <JEEB> IIRC
[22:22:23 CEST] <JEEB> although I think JPEG has o9k parameters to define the colorspace
[22:22:28 CEST] <Mavrik> Although, I might be wrong, you can encode RGB JPEG as well
[22:22:29 CEST] <JEEB> that they kind of got right, I think?
[22:22:32 CEST] <JEEB> yes
[22:22:33 CEST] <Mavrik> Or did I mix something up?
[22:22:35 CEST] <doublya> JEEB: Internet went down right when you send your link. I'll test in a bit
[22:22:50 CEST] <JEEB> doublya: I already sent it to the mailing list but feel free to test :)
[22:23:13 CEST] <koyglreg> Shouldn't we use RGB for internet/computer videos, so we can use the full range?
[22:23:28 CEST] <furq> rgb doesn't compress as well
[22:23:29 CEST] <JEEB> koyglreg: you still get compression benefits from separating colors from the luminance
[22:23:35 CEST] <JEEB> RGB has information duplicated
[22:23:52 CEST] <koyglreg> okay, but can you have full-range YUV?
[22:23:55 CEST] <furq> you can
[22:23:59 CEST] <JEEB> theoretically, yes
[22:24:04 CEST] <JEEB> as I said, FRAPS does it for example
[22:24:21 CEST] <JEEB> but really, any content that you have around and want to play in most players has to be limited range if YCbCr
[22:24:21 CEST] <furq> x264/x265 will encode full range
[22:24:28 CEST] <furq> but i wouldn't trust any player to play it back properly
[22:24:37 CEST] <JEEB> the sad point of it, yes :P
[22:24:39 CEST] <furq> certainly not ones in web browsers
[22:24:54 CEST] <koyglreg> okay... so obviously limited-range YUV digital video reigns
[22:24:58 CEST] <kerio> how does RGB have duplicated info
[22:25:07 CEST] <kerio> it's still three coordinates
[22:25:14 CEST] <koyglreg> but why?  it's not broadcast, it's digital
[22:25:19 CEST] <furq> koyglreg: legacy
[22:25:32 CEST] <BtbN> Also because it's smaller for not a lot of visual loss
[22:25:35 CEST] <furq> never underestimate the power of inertia
[22:25:45 CEST] <kerio> koyglreg: if backwards compatibility wasn't a thing we'd be using av1+opus probably
[22:25:46 CEST] <JEEB> not really, kierank whacked everyone's head when they tried to say it was legacy reasons from the analog domain
[22:26:02 CEST] <JEEB> it's just due to the fixed point / integer filtering stuff
[22:26:12 CEST] <JEEB> so you have to have overhead in such broadcast systems :P
[22:26:15 CEST] <koyglreg> JEEB: which I still don't understand
[22:26:18 CEST] <JEEB> (until those get converted to floating point
[22:26:38 CEST] <koyglreg> JEEB: when you say overhead, do you mean captions and stuff like that?
[22:26:41 CEST] <furq> how is that not a legacy issue
[22:26:54 CEST] <JEEB> well yes, but people usually mean legacy as in analogue
[22:26:56 CEST] <JEEB> but sure
[22:26:57 CEST] <JEEB> :)
[22:27:14 CEST] <JEEB> koyglreg: nope
[22:27:38 CEST] <JEEB> just the fact that when such non-floating point filters do their stuff they can overflow the range, which would otherwise not be pretty if there was no overhead
[22:27:49 CEST] <JEEB> kerio: all of the things IIRC contain luminance
[22:27:50 CEST] <furq> i mean legacy as in "people are using old shit and people who make new shit don't want to put up with complaints that it doesn't work with people's old shit"
[22:27:56 CEST] <furq> and so the cycle continues forever
[22:28:00 CEST] <JEEB> ok
[22:28:14 CEST] <koyglreg> makes sense
[22:28:17 CEST] <kerio> JEEB: and all the components in YCbCr contain red
[22:28:31 CEST] <Cracki_> rgb is three times intensity. if you encode yuv, you have better separation between dimensions, can compress better.
[22:28:42 CEST] <JEEB> (´4@)
[22:28:47 CEST] <Cracki_> ¯\_(Ä)_/¯
[22:28:59 CEST] <kerio> U(  )W
[22:29:12 CEST] <koyglreg> can someone point me to a link that explains what fixed-point filters are?
[22:29:20 CEST] <Cracki_> fixed point math
[22:29:34 CEST] <Cracki_> as opposed to floating point
[22:29:46 CEST] <Cracki_> fixed point is basically integer with shifts
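A tiny illustration of "integer with shifts": represent fractional values as integers scaled by 256 (Q8), multiply, and shift back. Intermediates in such filters can exceed the nominal range, which is the headroom argument made above.

```shell
# Q8 fixed point: real value * 256, stored as an integer.
a=$(( 3 * 256 / 2 ))        # 1.5  -> 384
b=$(( 256 / 4 ))            # 0.25 -> 64
prod=$(( (a * b) >> 8 ))    # 1.5 * 0.25 = 0.375 -> 96 in Q8
echo $(( prod * 1000 / 256 ))   # 375, i.e. 0.375
```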
[22:29:49 CEST] <koyglreg> okay, a link to explain integer based filtering
[22:29:54 CEST] <Cracki_> so faster on dumber hw
[22:29:54 CEST] <koyglreg> I don't have any idea what that is
[22:30:02 CEST] <Cracki_> any signal filter
[22:30:13 CEST] <koyglreg> well, what's a signal filter?
[22:30:19 CEST] <Cracki_> smoothing for example
[22:30:21 CEST] <Cracki_> or deblocking
[22:30:30 CEST] <JEEB> woah Cracki_ - you are compassionate :)
[22:30:32 CEST] <Cracki_> or denoising, or a convolution, or a DCT
[22:30:41 CEST] <koyglreg> hmm, well i know what filters are in general
[22:30:41 CEST] <Cracki_> huh?
[22:30:47 CEST] <Cracki_> am I feeding a troll or what do you mean
[22:30:50 CEST] <JEEB> (I would have just given up explaining)
[22:30:55 CEST] <koyglreg> but what does "signal" filter mean?
[22:31:01 CEST] <JEEB> nah, just someone who has no idea about things
[22:31:08 CEST] <Cracki_> i see now
[22:31:36 CEST] <koyglreg> JEEB: you are correct - i'm not a troll; just someone with a skull full of mush
[22:31:44 CEST] <koyglreg> teach me, o learned ones, the ways of ffmpeg
[22:31:56 CEST] <JEEB> also lol at the "YUV" article on wikipedo. copypasta NV12 to RGB thing from Android
[22:31:57 CEST] <Cracki_> you go zig, then you go zag
[22:32:09 CEST] <JEEB> I wonder if it's BT.601 or BT.709
[22:32:25 CEST] <Cracki_> does it speak about OETF/EOTF/gamma?
[22:32:26 CEST] <koyglreg> So tell me what a "signal filter" is
[22:33:04 CEST] <JEEB> Cracki_: way higher level stuff that is :) it's the colorspace conversion stuff as far as I can tell
[22:33:12 CEST] <Cracki_> the matrix
[22:33:53 CEST] <Cracki_> well, wp:yuv looks not too bad
[22:34:12 CEST] <JEEB> aww, colorspace.h in libavutil doesn't use the same calculations so I can't just check the numbers :)
[22:34:24 CEST] Action: JEEB added BT.709 there for BD subtitles
[22:34:43 CEST] <JEEB> (and yes, it's a yet another place where FFmpeg has the colormatrix stuff)
[22:35:20 CEST] <Cracki_> why don't we just use 32 bit floats for everything
[22:35:21 CEST] <koyglreg> ok, a different question.  how common is interlaced x264?
[22:35:29 CEST] <Cracki_> 23 bit mantissa, good enough for everything
[22:35:54 CEST] <Cracki_> omfg kill everyone who does interlaced these days, and everyone they know and love
[22:36:00 CEST] <JEEB> koyglreg: you mean interlaced AVC/H.264? x264 is an encoder
[22:36:07 CEST] <JEEB> (that produces AVC/H.264)
[22:36:16 CEST] <JEEB> in broadcast interlacism is still in high produce
[22:36:23 CEST] <furq> yeah most broadcast 1080i is h264
[22:36:24 CEST] <JEEB> out of broadcast it is finally lowering down
[22:36:37 CEST] <Cracki_> broadcast is over
[22:36:41 CEST] <JEEB> web video generally tends to be progressive
[22:36:49 CEST] <Cracki_> just dump to 50/60p
[22:36:53 CEST] <JEEB> and blu-rays generally tend to be progressive as well
[22:36:57 CEST] <Cracki_> *bump
[22:37:05 CEST] <koyglreg> JEEB: hmm, yes.  that is, using ffmpeg with libx264 to produce AVC/H.264.
[22:37:05 CEST] <furq> i don't think i've seen any interlaced bluray
[22:37:25 CEST] <furq> i sure have seen m2v bluray though
[22:37:27 CEST] <furq> that's a lot of fun
[22:37:55 CEST] <furq> actually the bluray i'm thinking of is 1080i, so yeah
[22:37:57 CEST] <koyglreg> I have a bunch of home-recorded DVDs - low bitrate, not great quality.  I've thought of encoding to interlaced h264/mp4.  thoughts?
[22:38:08 CEST] <furq> you're better off just deinterlacing them
[22:38:29 CEST] <koyglreg> furq: would ivtc be even better?
[22:38:36 CEST] <furq> only if they're telecined
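The two options furq distinguishes, sketched with assumed file names: plain deinterlacing for true interlaced material, and inverse telecine for film content that was 3:2-pulled-down:

```shell
# Deinterlace genuinely interlaced material:
ffmpeg -i in.vob -vf yadif out.mp4

# Inverse telecine: match fields back into frames, deinterlace only what
# fieldmatch still flags as combed, then drop the duplicated frame:
ffmpeg -i in.vob -vf "fieldmatch,yadif=deint=interlaced,decimate" out.mp4
```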
[22:38:45 CEST] <Cracki_> the good thing about interlacing: you see when someone doesn't know their shit
[22:38:48 CEST] <koyglreg> furq: they're movies recorded from tv
[22:38:49 CEST] <JEEB> furq: PAL stuff tends to be at least flagged interlaced since you can't officially do 25p in blu-ray :)
[22:38:56 CEST] <furq> lol what
[22:39:10 CEST] <JEEB> fake-interlaced in x264 is for that :D
[22:39:13 CEST] <koyglreg> these are NTSC DVDs with movies
[22:39:18 CEST] <JEEB> so that it gets flagged as 25i
[22:39:23 CEST] <JEEB> but actual coded content is 25p
[22:39:30 CEST] <furq> that is incredibly stupid
[22:39:36 CEST] <furq> i thought we'd left that shit behind with dvd
[22:39:42 CEST] <JEEB> blame whomever made up the BD spec
[22:39:59 CEST] <furq> koyglreg: by movies do you mean like recorded off the tv
[22:40:03 CEST] <furq> or home movies
[22:40:05 CEST] <JEEB> although I think the same thing goes for 30/1.001
[22:40:12 CEST] <koyglreg> furq: yes, feature films
[22:40:19 CEST] <furq> i mean
[22:40:21 CEST] <JEEB> you can't encode it progressive so you have to do fake-interlaced
[22:40:28 CEST] <furq> if you pirated them once then you might as well just pirate them again, but properly this time
[22:40:55 CEST] <koyglreg> furq: these are my recordings
[22:41:55 CEST] <furq> are you saying that makes it not piracy or that that gives some kind of sentimental value to these low-bitrate dvds
[22:42:15 CEST] <koyglreg> furq: not saying anything
[22:42:21 CEST] <furq> fair enough
[22:42:45 CEST] <furq> well yeah if it was me i would probably throw them out unless i couldn't find a better quality copy online
[22:43:02 CEST] <furq> of course i do not advocate piracy
[22:43:06 CEST] <koyglreg> now if these have MOSTLY telecined content on them (movies), but segments (such as intro) that are "video", how could I ivtc?
[22:43:07 CEST] <furq> (aside) i do. it's brilliant
[22:43:17 CEST] <JEEB> koyglreg: manually
[22:43:20 CEST] <furq> koyglreg: with difficulty
[22:43:40 CEST] <JEEB> you'd have to filter it in pieces
[22:43:47 CEST] <furq> generally what i do with that is i manually find and select the frame ranges in vapoursynth
[22:43:52 CEST] <JEEB> yup
[22:44:21 CEST] <furq> at least in theory. i've never actually bothered completing such an annoying operation
[22:44:32 CEST] <JEEB> I did quite some time ago
[22:44:40 CEST] <koyglreg> hmm ok
[22:44:41 CEST] <JEEB> back then AvsPmod and avisynth were the tools :D
[22:44:53 CEST] <JEEB> thankfully I only had the credits sequence of a TV show
[22:45:10 CEST] <JEEB> so as long as I knew the point where those two separate I could just blast a good deinterlacer on the other side
[22:45:14 CEST] <JEEB> and normal IVTC on the other
[22:45:18 CEST] <furq> i only have one dvd that needs doing but it has a radial wipe from telecined to interlaced
[22:45:28 CEST] <furq> and i really couldn't be bothered figuring out how i wanted to deal with that shit
[22:45:45 CEST] <furq> it's in the intro as well so it's on every episode
[22:45:48 CEST] <JEEB> I remember there was some nice blend filter for fades with combs
[22:46:18 CEST] <JEEB> interlaced editing on a telecined show is the best, isn't it :)
[22:46:46 CEST] <furq> you know it
[22:47:24 CEST] <furq> i normally have it lucky with pal
[22:47:45 CEST] <furq> the worst thing i commonly deal with is 50i titles and credits around a 25p show
[22:47:55 CEST] <furq> which is incredibly simple by comparison
[22:47:59 CEST] <JEEB> yea
[22:48:34 CEST] <JEEB> doublya: any luck with the patch?
[22:51:41 CEST] <koyglreg> let's say i have a bunch of ripped dvds in .mpg form and i don't want to re-encode them.  lots of them need very simple filtering like cropping or changing the aspect ratio.  can i put options like cropping/stretching in the video metadata for something?  i mean, i doubt i can do it for .mpg, but maybe .mkv?  or is there a video player that can remember settings for every video?
[22:52:36 CEST] <furq> you can do both of those in mkv
[22:52:52 CEST] <furq> player support for AR in the container is pretty much universal
[22:52:54 CEST] <koyglreg> for instance, using MPC-HC with madVR, i can play .avs scripts.  is this a bad solution?
[22:52:58 CEST] <furq> idk about cropping but i assume it's less common
[22:53:16 CEST] <JEEB> the cropping fields are unfortunately badly specified as they were originally meant to override the decoder cropping stuff
[22:53:37 CEST] <JEEB> aspect ratio is just fine with matroska
[22:54:07 CEST] <furq> iirc i tested cropping in mkv and it worked in mpv
[22:54:14 CEST] <furq> so if you just want it to work there, you're fine
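The aspect-ratio case above can be done with a plain remux — a sketch with hypothetical file names, assuming ffmpeg is on the PATH:

```shell
# Generate a one-second MPEG-PS sample to stand in for a ripped DVD (hypothetical names):
ffmpeg -v error -f lavfi -i testsrc2=duration=1:size=320x240:rate=25 \
    -c:v mpeg2video -f mpeg in.mpg
# Remux to Matroska and override the display aspect ratio;
# -c copy means the video stream is not re-encoded, only the container changes:
ffmpeg -v error -i in.mpg -c copy -aspect 16:9 out.mkv
```

mkvpropedit (from mkvtoolnix) can alternatively change the display dimensions of an existing .mkv in place, without rewriting the file at all.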
[22:54:32 CEST] <koyglreg> what about playing .avs scripts in MPC-HC or another video player?
[22:54:46 CEST] <furq> you can use vapoursynth scripts with mpv
[22:55:11 CEST] <furq> maybe avs on windows, idk
[22:55:34 CEST] <furq> ffmpeg will take avs as input but idk if mpv using a lavf build with avs support will do it
[22:55:59 CEST] <koyglreg> not familiar with vapoursynth, but i get the idea.  would you say playing scripts like that is a good solution for "permanently saving" those settings (beyond just ar) without re-encoding?
[22:56:22 CEST] <furq> i wouldn't call it good but i can't really think of anything better
[22:56:34 CEST] <furq> and VS is more or less the same thing as avs, just crossplatform
[22:56:46 CEST] <koyglreg> furq: ah, i see
[22:57:15 CEST] <koyglreg> are there media servers (like Plex) that support .avs?
[22:57:27 CEST] <furq> no idea
[22:57:29 CEST] <furq> i'd be surprised
[22:57:37 CEST] <koyglreg> ok
[22:57:54 CEST] <koyglreg> would you say vlc is the best overall media player for windows?
[22:58:00 CEST] <furq> i would certainly not say that
[22:58:50 CEST] <furq> mpv is probably the best, mpc-hc is all right
[22:59:20 CEST] <koyglreg> how is mpv superior in your view?
[22:59:44 CEST] <Fenrirthviti> mpc-hc is my go-to.
[22:59:47 CEST] <Fenrirthviti> for windows.
[23:00:08 CEST] <Fenrirthviti> but mpv isn't an abandoned project, so it has that going for it.
[23:00:56 CEST] <koyglreg> Fenrirthviti, didn't know mpc-hc was abandoned
[23:01:11 CEST] <furq> development stopped a few months ago
[23:01:42 CEST] <furq> https://mpc-hc.org/2017/07/16/1.7.13-released-and-farewell/
[23:02:11 CEST] <Fenrirthviti> They basically handed the project to the wild. It was sad :(
[23:02:24 CEST] <furq> there are some commits on the repo in august
[23:02:27 CEST] <furq> so maybe someone did step up
[23:02:52 CEST] <furq> but yeah, afaik most of the mpc-hc developers moved over to mpv
[23:03:06 CEST] <furq> as i'm sure other people in here can tell you about in more detail
[23:03:36 CEST] <koyglreg> does mpv support .avs?
[23:04:00 CEST] <JEEB> nah, most mpc-hc developers just moved on to real life (and with DShow not being too interesting, I don't blame 'em)
[23:04:03 CEST] <furq> i assume it does if it was built with ffmpeg with --enable-avisynth
[23:04:06 CEST] <furq> but i've never tried it
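For the mpv route, a VapourSynth script can apply the crop at playback time instead of baking it in — a sketch with hypothetical file names, assuming mpv was built with VapourSynth support:

```shell
# Write a minimal VapourSynth crop script; mpv's vapoursynth filter injects
# the source clip into the script as the global variable "video_in".
cat > crop.vpy <<'EOF'
import vapoursynth as vs
core = vs.core
clip = video_in
# Trim 8 pixels from each side (example values):
clip = core.std.Crop(clip, left=8, right=8, top=0, bottom=0)
clip.set_output()
EOF
# Play the untouched source file through the script (needs a real video, so
# left commented here):
# mpv --vf=vapoursynth=crop.vpy episode.mkv
```

One script per title (or a shared one) effectively "saves" the filter settings without re-encoding anything.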
[23:19:02 CEST] <koyglreg> hmm, the mpg captions (from tv) are lost when it goes through avisynth...
[23:29:23 CEST] <hanetzer> yark.
[23:30:35 CEST] <hanetzer> Are there any plans to move ffmpeg's build system to something standardish? for example, autotools/cmake/meson?
[23:31:03 CEST] Action: Mavrik smiles.
[23:31:43 CEST] <JEEB> hanetzer: nope. someone would have to go through the work of implementing all those checks
[23:54:37 CEST] Action: hanetzer volunteers as tribute
[00:00:00 CEST] --- Tue Oct 17 2017