burek021 at gmail.com
Mon Sep 4 03:05:01 EEST 2017
[01:30:16 CEST] <BtbN> alexpigment, https://github.com/BtbN/FFmpeg this should "work". But it's a hack, and not a proper fix. With the current state of the ffmpeg APIs, I don't see an appropriate fix.
[01:38:23 CEST] <commanderkeen`> with filtergraphs should i create one for video and a separate one for audio?
[01:38:36 CEST] <c_14> you can only have one filtergraph
[01:39:29 CEST] <commanderkeen`> so have a single filtergraph that takes in the decoded audio and video
[01:39:54 CEST] <JEEB> if you are using the APIs you can have as many as you want
[01:39:55 CEST] <commanderkeen`> so that would mean to have a video sink and audio sink on the filter graph?
[01:40:14 CEST] <JEEB> you have the buffersrc and the sink
[01:40:26 CEST] <JEEB> as the start/end points
[01:40:33 CEST] <JEEB> for each of the filtering chains you're handling
[01:41:15 CEST] <commanderkeen`> i think i understand
[01:41:56 CEST] <commanderkeen`> i think i will try to have separate filtergraphs and encode then mux the streams together
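For CLI users, the same idea JEEB describes — a buffersrc and a sink bounding each filtering chain — can live inside one `-filter_complex` graph with an independent chain per stream type. A minimal sketch; the filter choices and file names are illustrative assumptions, not from the channel:

```shell
#!/bin/sh
# One -filter_complex graph holding two independent chains:
#   [0:v] -> scale  -> [v]   (video chain)
#   [0:a] -> volume -> [a]   (audio chain)
# Each chain starts at an input stream (its source) and ends at a label (its sink).
GRAPH='[0:v]scale=1280:-2[v];[0:a]volume=1.0[a]'
# Run only when ffmpeg and the sample input actually exist.
if command -v ffmpeg >/dev/null && [ -f input.mkv ]; then
  ffmpeg -i input.mkv -filter_complex "$GRAPH" -map '[v]' -map '[a]' output.mkv
fi
```

`-map '[v]'` and `-map '[a]'` then route each chain's sink label into the output, which is what makes a single graph serve both stream types.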
[03:11:21 CEST] <alexpigment> BtbN: I keep thinking you're "done" with looking into this, and you keep surprising me :)
[03:11:51 CEST] <alexpigment> I'll be hanging out with the wife for the rest of the night, but I'll make a note to check this out tomorrow
[03:12:11 CEST] <alexpigment> Your work is greatly appreciated - "hack" or not
[10:36:38 CEST] <thebombzen> I'm looking for somewhat sane settings for denoising anime with vaguedenoiser
[10:36:43 CEST] <thebombzen> the default does not look good
[10:37:03 CEST] <thebombzen> also, the manpage does not list the default settings, which is frustrating
[10:56:49 CEST] <durandal_1707> thebombzen: ffmpeg -h filter=vaguedenoiser
[10:57:02 CEST] <thebombzen> I just poked around vaguedenoiser.c to find them
[10:57:12 CEST] <thebombzen> I'm about to send a patch that adds them to doc/filters.texi
[11:02:39 CEST] <thebombzen> durandal_1707: alright, send the patch. But that being said, do you have suggestions for good values? the default ones do not work well
[11:03:34 CEST] <durandal_1707> depends on noise type
[11:15:52 CEST] <thebombzen> hm, this is hard, I'll stick with hqdn3d
[11:17:27 CEST] <durandal_1707> thebombzen: hqdn3d is for poor people
[11:23:52 CEST] <thebombzen> durandal_1707: well it works and it's fast and it doesn't require me to configure it /shrug
[11:24:34 CEST] <durandal_1707> thebombzen: it blurs and learn to ask google
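As durandal_1707 points out, `ffmpeg -h filter=vaguedenoiser` prints the filter's options and their defaults. A hedged starting point for experimentation; the values below are illustrative assumptions to tune per source, not recommendations from the channel:

```shell
#!/bin/sh
# All values are illustrative starting points; the real options and their
# defaults are printed by: ffmpeg -h filter=vaguedenoiser
FILTER='vaguedenoiser=threshold=3:method=soft:nsteps=5:percent=85'
# Run only when ffmpeg and the sample input actually exist.
if command -v ffmpeg >/dev/null && [ -f anime.mkv ]; then
  ffmpeg -i anime.mkv -vf "$FILTER" -c:a copy denoised.mkv
fi
```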
[11:29:01 CEST] <ozgurk> hi, how can i encode lpcm for digital preservation in matroska? i use -c:v ffv1 -level 3 -g 1 -slicecrc 1 -slices 16 for video, but what should i use for lpcm for digital preservation? thanks
[11:37:39 CEST] <ritsuka> you can leave it as pcm, or use flac or another lossless codec
[11:53:27 CEST] <ozgurk> thanks ritsuka, but if i want to use lpcm, which pcm format should i use? also which endianness? does it make sense for pcm?
[11:54:48 CEST] <ozgurk> i want it to pass the conformance checking of mediaconch - http://www.preforma-project.eu/mediaconch.html
[11:58:05 CEST] <ritsuka> whatever you prefer, mkv supports either endianness and signed/float pcm
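Putting ritsuka's advice together with the FFV1 flags from the question, a preservation-style command might look like the sketch below. `pcm_s24le` and the file names are assumptions; pick the PCM codec that matches the source's actual bit depth and endianness:

```shell
#!/bin/sh
# FFV1 version 3 (lossless) for video, uncompressed PCM for audio, in Matroska.
# pcm_s24le is an assumption: use e.g. pcm_s16le for 16-bit little-endian sources.
VOPTS='-c:v ffv1 -level 3 -g 1 -slicecrc 1 -slices 16'
AOPTS='-c:a pcm_s24le'
# Run only when ffmpeg and the sample input actually exist.
if command -v ffmpeg >/dev/null && [ -f master.mov ]; then
  ffmpeg -i master.mov $VOPTS $AOPTS preserved.mkv
fi
```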
[12:00:02 CEST] <AliD> Hello again.
[12:00:42 CEST] <AliD> I created standalone ndk toolchain for compiling.
[12:01:41 CEST] <AliD> Now I want to compile mp3lame sources to libraries and use them to compile ffmpeg as executable binary for Android.
[12:02:39 CEST] <AliD> I tried the methods I found online, but they all fail for me.
[12:02:46 CEST] <AliD> Any ideas?
[12:15:59 CEST] <AliD1> Any ideas?
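The usual flow is to build lame against the standalone toolchain first, then point FFmpeg's configure at the same toolchain and install prefix. A sketch only: the toolchain path, target triple, and source directory names below are all assumptions to adjust for your NDK layout:

```shell
#!/bin/sh
# All paths and the target triple are assumptions; adjust to your toolchain.
TOOLCHAIN=$HOME/android-toolchain        # made with make_standalone_toolchain.py
PREFIX=$TOOLCHAIN/sysroot/usr
TRIPLE=arm-linux-androideabi

# 1) lame as a static library, installed into the toolchain sysroot
if [ -d lame-3.100 ]; then
  (cd lame-3.100 && \
   ./configure --host="$TRIPLE" --prefix="$PREFIX" --disable-shared \
               CC="$TOOLCHAIN/bin/$TRIPLE-clang" && \
   make && make install)
fi

# 2) FFmpeg cross-compiled against it
if [ -d ffmpeg ]; then
  (cd ffmpeg && \
   ./configure --target-os=android --arch=arm --enable-cross-compile \
               --cross-prefix="$TOOLCHAIN/bin/$TRIPLE-" --enable-libmp3lame \
               --extra-cflags="-I$PREFIX/include" --extra-ldflags="-L$PREFIX/lib" && \
   make)
fi
```

The key point is that both configures must see the same prefix, otherwise FFmpeg's libmp3lame check fails even though lame built fine.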
[13:59:16 CEST] <blap> "my car broke! why?"
[15:10:03 CEST] <doslas> Nacht
[15:10:11 CEST] <doslas> Hi
[15:10:44 CEST] <doslas> everything works with your script
[15:11:44 CEST] <doslas> Thank you for helping me
[15:32:48 CEST] <BullHorn> hello, i need help. i used to have .bat files that included all the commands needed to take an .mkv source in d:\video, convert it to .mp4 and place it in f:\video, all by typing 'convert 1 2' (1 being the source in d:\video and 2 being the output for f:\video)
[15:33:06 CEST] <BullHorn> but somehow they disappeared (maybe antivirus or malwarebytes deleted it, i don't even know)
[15:35:30 CEST] <durandal_1707> backup backup backup
[15:36:17 CEST] <BullHorn> yeah i do backup most things
[15:36:48 CEST] <BullHorn> this is the one thing i didnt - and i only had it like in the root of cmd where i didnt have to cd.. anywhere or without adding it to the PATH ;x
[15:47:50 CEST] <realies> any ideas what could http://www.videosmaller.com/ be using to squash videos without visual degradation?
[15:49:43 CEST] <realies> getting some -70 to -80% filesize on certain files
[15:51:01 CEST] <BtbN> any transcoder with non-shit settings
[15:51:41 CEST] <BtbN> "Reduce size of MP4 videos captured with your Android or iPhone." It's not hard to reduce the high bitrate of phones hardware encoders.
[15:51:47 CEST] <ritsuka> it uses ffmpeg with quite standard settings
[15:52:09 CEST] <BtbN> it seems highly dubious though
[15:52:21 CEST] <BtbN> Why would they offer a service that is highly CPU intensive and thus expensive for free?
[15:52:45 CEST] <realies> BtbN, it works fine though, no idea at all
[15:52:53 CEST] <realies> maybe they need some huge amounts of video for something?
[15:52:57 CEST] <BtbN> But they now have your video.
[15:53:29 CEST] <realies> ritsuka, how did you know?
[15:53:45 CEST] <ritsuka> I used it to convert a small video
[15:53:52 CEST] <realies> oh, metadata?
[15:53:53 CEST] <ritsuka> and inspected the result
[15:54:22 CEST] <realies> can you read or guess the parameters from the inspection?
[15:54:29 CEST] <realies> (trying to reproduce)
[15:54:42 CEST] <BullHorn> i kinda remember it
[15:54:43 CEST] <BullHorn> it was like
[15:54:54 CEST] <ritsuka> it's just x264 with a super high rf
[15:55:04 CEST] <BullHorn> ffmpeg -i D:\%1.mkv -c copy -copyts F:\%2.mp4
[15:55:05 CEST] <BullHorn> or something
[15:55:09 CEST] <BullHorn> im not sure about the syntax though
[15:56:33 CEST] <BullHorn> yes thats it lol
[15:56:35 CEST] <BullHorn> it works :D
[15:56:46 CEST] <realies> ritsuka, crf?
[15:57:13 CEST] <BullHorn> %1 and onwards allows you to input names, so i do 'convert myinputvideohere shortoutputnamelol' and it works
[15:57:50 CEST] <BullHorn> ok but now its missing a thing - my video has 2 audio tracks, -c copy -copyts only copies the first track
[15:57:56 CEST] <BullHorn> whats the thing to make it copy all audio tracks
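By default ffmpeg selects only one stream per type; adding `-map 0` selects every stream of the first input, which is what lets both audio tracks survive the copy. A POSIX sketch of the lost batch file (on Windows the same flags go into convert.bat with %1/%2; the drive paths here are assumptions):

```shell
#!/bin/sh
# usage: convert.sh NAME OUTNAME
# reads /d/video/NAME.mkv, writes /f/video/OUTNAME.mp4 (paths are assumptions)
IN="/d/video/${1:-input}.mkv"
OUT="/f/video/${2:-output}.mp4"
# -map 0 keeps every stream of input 0: video plus all audio tracks
if command -v ffmpeg >/dev/null && [ -f "$IN" ]; then
  ffmpeg -i "$IN" -map 0 -c copy -copyts "$OUT"
fi
```

Note that MP4 rejects some codecs Matroska allows, so a full stream copy can still fail on, say, certain subtitle tracks.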
[16:01:09 CEST] <BullHorn> ty guys
[16:02:34 CEST] <realies> ritsuka, this seems like a cool read about it http://williamyaps.blogspot.bg/2017/01/ffmpeg-encoding-h264-decrease-size.html
[16:03:06 CEST] <BtbN> it's 100% impossible to do it without quality loss though
[16:03:14 CEST] <BtbN> even if you encode to a higher bitrate, it will degrade
[16:04:17 CEST] <realies> BtbN, talking about the subjective perception
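The "super high rf" ritsuka mentions is x264's CRF rate control: higher CRF means smaller files at lower quality, and phone recordings carry so much spare bitrate that a mid-range CRF often looks subjectively unchanged. A sketch; the CRF value and file names are illustrative assumptions:

```shell
#!/bin/sh
# Re-encode with CRF rate control; 26 is an illustrative middle value
# (lower CRF = bigger file and higher quality; ~18 is near transparent).
CRF=26
# Run only when ffmpeg and the sample input actually exist.
if command -v ffmpeg >/dev/null && [ -f phone.mp4 ]; then
  ffmpeg -i phone.mp4 -c:v libx264 -crf "$CRF" -preset medium -c:a copy smaller.mp4
fi
```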
[16:32:59 CEST] <graphitemaster> actually, you sure they're using their CPU for that
[16:33:11 CEST] <graphitemaster> like, their hardware
[16:33:27 CEST] <graphitemaster> maybe they just compiled ffmpeg with emscripten and it's doing all the encoding in your browser :P
[16:33:52 CEST] <graphitemaster> ah no, you have to upload the video :(
[16:38:02 CEST] <bencoh> :]
[16:38:32 CEST] <bencoh> I wonder how it'd perform though
[16:39:02 CEST] <BtbN> poorly
[16:39:09 CEST] <graphitemaster> with webassembly, and how they now have simd instructions and webworkers which are sort of like threads
[16:39:10 CEST] <BtbN> It's mostly fast on PCs because of handwritten assembly
[16:39:17 CEST] <graphitemaster> I imagine you can get pretty damn close
[16:39:26 CEST] <BtbN> no, you wouldn't even get remotely close
[16:39:44 CEST] <graphitemaster> well, close to native speed for the reference C code, not the handwritten assembly in ffmpeg
[16:39:54 CEST] <bencoh> s/ffmpeg/x264/ in that case
[16:40:00 CEST] <BtbN> Build ffmpeg and x264 without yasm, and then halve the speed you get from that, and you might have something to hope for
[16:40:44 CEST] <graphitemaster> I imagine it would be more like ffmpeg/x264 without yasm at two thirds the speed
[16:42:01 CEST] <graphitemaster> what is more interesting is how they already have proposals for web APIs for encoding video in the works anyways
[16:42:22 CEST] <graphitemaster> so you can just ignore emscripten compiled ffmpeg/x264 and use the webapi which will prolly just be whatever the platform has
[16:42:33 CEST] <graphitemaster> which will likely be optimized (and potentially hardware accelerated, e.g nvenc)
[18:05:39 CEST] <ZexaronS> hello
[18:05:50 CEST] <ZexaronS> any benefit for ffmpeg to use C++ ?
[18:06:07 CEST] <ZexaronS> perf
[18:15:17 CEST] <JEEB> you don't magically get perf out of writing stuff in C++ instead of C
[18:15:17 CEST] <ZexaronS> i'm looking at some of the nvenc vs x265 discussions, and the people who know these sectors say that GPU encoders are proprietary ASICs with far fewer options than x264, so why don't they make an ASIC that runs x264/x265 on the GPU then ?
[18:19:26 CEST] <ZexaronS> what's the point of making a totally different encoder, just make a x264/x265 version in HW , no ?
[18:19:36 CEST] <furq> good luck with that
[18:20:46 CEST] <ZexaronS> Once I found the details, the whole thing is a bit meh: the feature set is small, and it's not even the same encoder, so it's not really GPU accelerated, it's something else entirely
[18:21:09 CEST] <JEEB> well, you'd have to design the hardware to do all those things. which is why hw encoders don't start off of some sw encoder
[18:21:31 CEST] <JEEB> they generally start building their hw off of some FPGA design and then move it to an immutable ASIC in the end
[18:21:36 CEST] <ZexaronS> It's not helping x265 by just having a GPU, and secondly, the VCE/NVENC is an engine on the GPU, does it even use the whole GPU or just the ASIC part ?
[18:22:00 CEST] <JEEB> the ASIC in general handles all the actual video coding since the GPU is not good at what you really require for image compression
[18:22:13 CEST] <JEEB> the GPU part can be used for stuff that happens with raw video like scaling or deinterlacing
[18:22:22 CEST] <JEEB> as in, normal image processing
[18:22:52 CEST] <JEEB> for a video format to be actually implementable on the GPU itself you'd have to build the video format from the ground up to be massively threadable
[18:23:00 CEST] <JEEB> ATi did that once in around 2007-8
[18:23:13 CEST] <JEEB> let's just say that the format could not give comparable compression
[18:23:49 CEST] <ZexaronS> Indeed the marketing monikers are a load of bunk, it's an ASIC inside the GPU, and the GPU only helps a little with some filters and basic stuff, hardly GPU driver, I guess the term "accelerated" is the lawyer-sneaky pick to mean it's only "helping a little bit", not actually running it
[18:24:00 CEST] <JEEB> because generally the more you thread something in a way that doesn't require too much synchronization (= having to stop the thing)
[18:24:09 CEST] <ZexaronS> GPU driven*
[18:24:15 CEST] <JEEB> you cannot utilize as much data that would help you compress the thing
[18:24:21 CEST] <JEEB> thus the end result is that you get specialized ASICs
[18:24:26 CEST] <JEEB> that are just doing the job at speed X
[18:25:23 CEST] <furq> i don't think i've ever seen nvidia refer to nvenc as "gpu accelerated"
[18:25:33 CEST] <JEEB> also with x265 it would help if people cared about that encoder more :P it just lacks a proper community around it like x264 had.
[18:25:39 CEST] <JEEB> (or libvpx)
[18:25:44 CEST] <furq> but then it's rare to see nvidia refer to nvenc at all
[18:25:47 CEST] <ZexaronS> Then what we need is a separate ASIC card that is comparable to a GPU in horsepower, then that would be proper HW driven, this is more like a tack-on right now, like a supercharger on a car, it's not really a different engine, right ?
[18:26:00 CEST] <furq> it is a completely different engine
[18:26:06 CEST] <furq> it's just not a very good one compared to a cpu
[18:26:38 CEST] <JEEB> well, ASIC is a system made for specific thing so it's not really a general purpose computational unit :P
[18:26:56 CEST] <JEEB> you can think of it as a black box that you feed things and get things out of it
[18:27:02 CEST] <ZexaronS> Well, I meant, we build a dedicated ASIC video encoder card that's as big as a GPU and built from the ground up to be better than anything on any CPU
[18:27:10 CEST] <furq> who's "we"
[18:27:41 CEST] <JEEB> I would still see actual effort put into SW encoders as much more bang for the buck thing
[18:28:00 CEST] <furq> there are already cards/boxes that do that
[18:28:07 CEST] <furq> but they're not really for consumer use
[18:28:23 CEST] <JEEB> sure, but they're not really generally better except when the SW alternatives suck dong
[18:28:24 CEST] <furq> and also i assume they're not better than x264
[18:28:29 CEST] <ZexaronS> Well make it for consumer use then, what's the problem
[18:28:33 CEST] <furq> the price
[18:28:54 CEST] <ZexaronS> Well if people want so much profit out of it then, there's your problem
[18:29:00 CEST] <furq> nvenc is just a nice thing to tack onto a gpu so that people can stream themselves playing league of mobas
[18:29:20 CEST] <JEEB> ZexaronS: hardware design and manufacture costs a lot. if you're a niche thing, even more
[18:29:22 CEST] <furq> it's not really intended for anything more than that
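A typical example of that use case is fixed-bitrate hardware H.264 pushed over RTMP. The file name and URL below are placeholders, and `h264_nvenc` requires an NVIDIA GPU plus an FFmpeg build with nvenc enabled:

```shell
#!/bin/sh
# Fixed-bitrate hardware H.264 for live streaming; the RTMP URL is a placeholder.
RATE=6M
# Run only when ffmpeg and the sample input actually exist.
if command -v ffmpeg >/dev/null && [ -f gameplay.mkv ]; then
  ffmpeg -re -i gameplay.mkv -c:v h264_nvenc -b:v "$RATE" -maxrate "$RATE" \
    -c:a aac -b:a 160k -f flv rtmp://live.example.com/app/streamkey
fi
```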
[18:29:37 CEST] <ZexaronS> If the chinese or the taiwanese want to be losers and can't do anything then if I get rich I'll do a better job :p
[18:29:53 CEST] <JEEB> so in general we're not limited by the CPUs anyways at this point
[18:29:59 CEST] <JEEB> the encoders could be massively optimized to be honest :P
[18:30:00 CEST] <ZexaronS> I would do it with a "break even" policy, no profit needed
[18:30:11 CEST] <ZexaronS> So the price wouldn't be that high
[18:30:49 CEST] <ZexaronS> JEEB: I guess x265 people are spinning wheels, why the hell is the thing proprietary
[18:30:58 CEST] <ZexaronS> afaik ... maybe it's not
[18:31:03 CEST] <JEEB> it's not proprietary, fully open source
[18:31:11 CEST] <JEEB> the problem is the lack of community around it
[18:31:13 CEST] <JEEB> I tried in '13
[18:31:19 CEST] <JEEB> but in the end it's a one-company show
[18:31:29 CEST] <JEEB> x264 was much more community driven
[18:32:18 CEST] <JEEB> generally the more use cases you try to accommodate on some level, the better the result gets in the end, as long as one use case doesn't destroy the capabilities for another
[18:32:42 CEST] <JEEB> Google's libvpx (the VPx series reference software) encoder is another prime example of a single vendor thing
[18:32:54 CEST] <JEEB> it doesn't help how great your video/audio format is if the implementations are wee-wee
[18:33:09 CEST] <ZexaronS> The masses out there are just glued to their web streaming screens and have completely forgotten about offline video. John Carmack complained that everyone just wants to watch streaming video and talked about how crappy the video is; watching proper VR in 4K is simply not going to look good over web streaming because sites automatically recode, and imagine HDR and WCG on top, we will be talking in gigabytes as the standard
[18:33:34 CEST] <JEEB> (granted, x265 is still one of the best HEVC encoders - which of course just tells about the quality of the competition)
[18:34:01 CEST] <redrabbit> x265 is available under the terms of the open source GNU GPL 2 license. Source code is available on https://bitbucket.org/multicoreware/x265.
[18:34:04 CEST] <redrabbit> x265 is also available under a commercial license to companies who wish to distribute x265 without the copyleft restrictions that the GPL v2 license imposes. For commercial licensing questions, please contact our licensing team (license @ x265.com)
[18:34:21 CEST] <furq> that's true of x264 as well
[18:34:34 CEST] <atomnuker> but x264 only offered lgpl and not commerical
[18:34:50 CEST] <ZexaronS> I may be mistaken, I was thinking of the H265 licensing thing
[18:35:01 CEST] <atomnuker> so companies who wanted to modify it still had no choice but to release the modifications
[18:35:29 CEST] <JEEB> atomnuker: x264 LLC's secondary license as far as i know is commercial (But you still have to present changes to libx264 to x264 LLC)
[18:35:38 CEST] <JEEB> so it's GPL and commercial
[18:35:47 CEST] <JEEB> they don't have to give those changes out outside of x264 LLC
[18:36:12 CEST] <JEEB> which basically means people on #x264dev , who then every now and then poke people if some of those changes look sane
[18:36:21 CEST] <ZexaronS> I think I had a discussion on that also, these big international standards groups simply write some blueprint but don't build anything, so much money going to CERN but they can't make one video codec, sort of MEH
[18:36:34 CEST] <furq> why would cern make a video codec
[18:36:53 CEST] <JEEB> well the standard body's idea is to create the format and reference implementations that they can test things with :P
[18:37:18 CEST] <JEEB> the HEVC reference is called "HM"
[18:37:23 CEST] <JEEB> it has both decoder and encoder
[18:37:28 CEST] <ZexaronS> There's thousands of scientists working on so many theories and making things that never really do much outside the labs, but can't make one video codec :(
[18:38:06 CEST] <JEEB> redrabbit: MCW got the license to the x265 name by giving the library out as open source in addition to their corporate license. which is kind of nice.
[18:38:12 CEST] <Mavrik> JEEB, what do pro shops use to encode HEVC?
[18:38:17 CEST] <Mavrik> e.g. broadcasters and stuff?
[18:38:28 CEST] <JEEB> broadcasters at least in 2014 had expensive hw stuff that was awful
[18:38:42 CEST] <JEEB> I checked the .jp test broadcasts back in the day
[18:38:46 CEST] <JEEB> 40 megabits and looked like ass
[18:38:51 CEST] <redrabbit> good AVC > bad HEVC
[18:39:01 CEST] <JEEB> that's the thing about implementations :P
[18:39:10 CEST] <furq> b-but...but i read hevc is 50% more efficient!
[18:39:34 CEST] <ZexaronS> the ISO/IEC standards thing makes up a standard that supposedly governments will use in library/national archives and cable broadcast etc ... right, wrong lol, they make up a standard without the tool to make a video that's compatible with the standard
[18:39:34 CEST] <furq> that's why i reencoded all my x264 rips with nvenc in 1-pass mode at half the bitrate
[18:39:38 CEST] <JEEB> Mavrik: nowadays I've seen a lot of x265 for blu-ray related things although I bet sony etc have their own proprietary solutions around
[18:39:51 CEST] <JEEB> ZexaronS: I've noted the HM reference software X times by now
[18:39:52 CEST] <Mavrik> mhm, I always wondered what the content providers used
[18:39:52 CEST] <redrabbit> furq: sounds neat. lol
[18:40:14 CEST] <JEEB> Mavrik: amazon and netflix either use x265 or that crappily named SW thing
[18:40:16 CEST] <furq> i have genuinely had to tell people not to do that
[18:40:24 CEST] <Mavrik> For H.264 I've only mostly met x264 and the mainconcept encoder
[18:40:24 CEST] <furq> so i can only imagine there are people out there who have done it without asking first
[18:40:30 CEST] <Mavrik> For H.265 I had no idea
[18:40:43 CEST] <ZexaronS> JEEB: Reference encoder is most probably not used in practice so much, is it?
[18:40:51 CEST] <JEEB> ZexaronS: it's often used as the base
[18:41:00 CEST] <ZexaronS> They're not updating regularly imo
[18:41:08 CEST] <Mavrik> How active is libx265 development anyway?
[18:41:24 CEST] <ZexaronS> Not supporting new CPU instructions, etc, I'd assume
[18:41:31 CEST] <furq> ZexaronS: how does that not constitute people using the standard
[18:41:33 CEST] <JEEB> MCW is selling it so libx265 is relatively active
[18:41:48 CEST] <JEEB> ZexaronS: the idea of the reference software is not to create a fully optimized solution
[18:41:54 CEST] <furq> the standard is the bitstream
[18:42:05 CEST] <JEEB> a lot of the optimizations aren't about just hitting it with more SIMD
[18:42:09 CEST] <furq> quite a few people are using h264 bitstreams
[18:42:16 CEST] <furq> as many as 100 people are using it these days
[18:42:24 CEST] <ZexaronS> JEEB: Yes, I mean, I disagree with that idea, I'd skip the reference and focus on fully optimized solution
[18:42:28 CEST] <JEEB> because encoder optimization is mostly about "where can we skip X,Y,Z to make this faster but not lose too much quality against the perfect way of doing it"
[18:42:46 CEST] <ZexaronS> oh then that's another story
[18:42:59 CEST] <ZexaronS> but hey I never meant that kind of optimizations, those are cheats
[18:43:06 CEST] <JEEB> reference encoder generally needs to prove things
[18:43:11 CEST] <Mavrik> Best optimizations are cheats :P
[18:43:26 CEST] <furq> cheating kicks ass
[18:43:29 CEST] <ZexaronS> optimizing without breaking off the accuracy/quality, not the fake optimization
[18:43:30 CEST] <JEEB> so that a certain way of coding improves compression by X% etc with a speed loss of Y% without skipping
[18:43:51 CEST] <furq> if you think that's "fake optimisation" then you're going to fucking hate x264
[18:43:55 CEST] <JEEB> yea :D
[18:44:13 CEST] <JEEB> the whole point of non-reference encoders is getting as close as possible to the reference result without doing the full computations
[18:44:26 CEST] <ZexaronS> well, facepalm
[18:44:41 CEST] <JEEB> and there's a fuckload of research on these things
[18:45:11 CEST] <JEEB> also the HEVC reference implementation is getting updated rather well :P
[18:45:11 CEST] <JEEB> https://hevc.hhi.fraunhofer.de/trac/hevc/browser
[18:45:16 CEST] <JEEB> last update was 6 days ago I see
[18:45:29 CEST] <JEEB> and they have releases every couple of months
[18:45:55 CEST] <ZexaronS> So pretty much every video out there is most probably not really 100% up to par with the standard, it's more like 95%-ish ?
[18:46:07 CEST] <JEEB> no, the standard doesn't define how you encode something
[18:46:28 CEST] <JEEB> the standard defines the way of decoding something
[18:46:49 CEST] <JEEB> as long as what you output fits that specification, it's valid HEVC or AVC or MPEG-4 Part 2 or MPEG-2 Video
[18:47:30 CEST] <JEEB> which is why the hardware encoder manufacturers first of all just do the minimal amount of work to get their encoder thing to output valid <next format in line> with their previous gen encoder
[18:47:34 CEST] <ZexaronS> Well then it's funny how the standard "H264" specifies the computations, but then nobody uses them, so it's like ... everything's off, in practice, so what's the point of a glossy standard sitting in ISO/IEC offices, "see boys and girls this is H265" ... that statement doesn't really hold true for most videos out there then
[18:48:02 CEST] <JEEB> no, the specification is for the video format. you don't have to use all of the bells and whistles in it to create a valid file
[18:48:11 CEST] <furq> the reference encoder isn't part of the standard
[18:48:13 CEST] <JEEB> I'd recommend you actually go read the specification since that's freely available
[18:48:17 CEST] <ZexaronS> Oh that yes I get the file metadata thing
[18:48:25 CEST] <furq> the standard defines the bitstream format
[18:48:35 CEST] <JEEB> and the features you use have to be decode'able as per the standard
[18:48:36 CEST] <furq> the reference encoder is just some tool that creates a compliant bitstream
[18:48:43 CEST] <Mavrik> Yeah, it's like standard defining the letters... but not how you put them together into poetry :P
[18:48:57 CEST] <furq> it's as correct or otherwise as any other encoder
[18:49:37 CEST] <JEEB> and having the specification *is* important because otherwise you end up with reference implementation bugs = standard behavior
[18:49:44 CEST] <ZexaronS> furq, reference encoder isn't part of the standard ... I guess I need a second facepalm, but who invented this way of doing things, never realized
[18:49:50 CEST] <JEEB> which is what both VP8,9 had
[18:50:11 CEST] <Mavrik> Yuck. How does libvpx/VP9 compare to libx265 these days?
[18:50:12 CEST] <furq> creating video encoders is quite difficult
[18:50:22 CEST] <furq> i've heard it can take up to 8 hours of work
[18:50:39 CEST] <JEEB> Mavrik: it's still not on par methinks. which is why a certain ex-googler is selling an encoder b2b
[18:50:40 CEST] <ZexaronS> furq: It seems to me that the people who invented these weren't computer people, weren't part of the 90s, and never played any video games; these are some random mathematicians and physicists put together from various EU countries imo
[18:50:49 CEST] <furq> if you insisted that no video standard be released without an encoder of comparable quality to x264 then we'd still be using h.262
[18:50:51 CEST] <ZexaronS> it's like they're on another planet
[18:51:02 CEST] <Mavrik> JEEB, libvpx not on par with libx265 or the other way?
[18:51:09 CEST] <JEEB> yea, that way
[18:51:14 CEST] <JEEB> IIRC the psychovisuals are even worse
[18:51:27 CEST] <furq> don't forget the rate control and the multithreading
[18:51:34 CEST] <Mavrik> blah
[18:51:34 CEST] <JEEB> yea, that's just icing on the cake
[18:51:49 CEST] <Mavrik> Does HEVC demand player support for 4:4:4 yet? Or is 4:2:0 still the only one mandatory?
[18:52:04 CEST] <JEEB> HEVC spec only specifies profiles
[18:52:14 CEST] <JEEB> and most media formats then specify formats X,y
[18:52:18 CEST] <JEEB> *profiles
[18:52:33 CEST] <JEEB> so yes, effectively right now 4:2:0 is what you get :P
[18:52:35 CEST] <Mavrik> Hmm, HEVC Main is 4:2:0 only it seems
[18:52:36 CEST] <redrabbit> mpeg2 is still widely used for satellite tv
[18:52:39 CEST] <redrabbit> to my horror
[18:52:56 CEST] <ZexaronS> Well, when WCG and HDR hit, all this codecs/players need some big update anyway
[18:52:57 CEST] <JEEB> Mavrik: at least main 10 is now a generally supported thing
[18:52:59 CEST] <Mavrik> redrabbit, yeah, my clients encode boatload of MPEG2/MP2 from satellites to H.264 for broadcast
[18:53:02 CEST] <redrabbit> i believe AVC still has some lifespan ahead of it
[18:53:14 CEST] <Mavrik> mhm, still 4:2:0 only though it seems
[18:53:20 CEST] <JEEB> yes, 4:2:0 only
[18:53:24 CEST] <JEEB> with main/main 10
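Targeting the Main 10 profile JEEB mentions means asking libx265 for a 10-bit 4:2:0 pixel format. A sketch; it assumes a libx265 build with 10-bit support, and the CRF value and file names are illustrative:

```shell
#!/bin/sh
# Main 10 = 10-bit 4:2:0; requires a libx265 build with 10-bit support.
PIXFMT=yuv420p10le
# Run only when ffmpeg and the sample input actually exist.
if command -v ffmpeg >/dev/null && [ -f input.mkv ]; then
  ffmpeg -i input.mkv -c:v libx265 -profile:v main10 -pix_fmt "$PIXFMT" \
    -crf 22 -c:a copy output.mkv
fi
```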
[18:53:30 CEST] <Mavrik> With HEVC/VP9 mess, HDR standard mess... the new codec world looks kinda worse than just keeping to H.264 :P
[18:53:44 CEST] <JEEB> it's not like you can ignore the HDR mess with H.264
[18:53:49 CEST] <JEEB> it's an additional metadata mess
[18:53:55 CEST] <JEEB> (and colorspace conversion mess)
[18:54:22 CEST] <redrabbit> too bad you cant use VP9 in a .TS
[18:54:26 CEST] <JEEB> you can I think
[18:54:32 CEST] <JEEB> not 100% sure, though
[18:54:39 CEST] <JEEB> I remember opus went through standardization
[18:54:41 CEST] <ZexaronS> Yeah this is the thing, none of these codecs are actually built for archival/enthusiast consumers like us, it's either for the corporate web streaming noblemen or the anointed broadcast royalty, but nothing for us peasants
[18:54:42 CEST] <Mavrik> Hmm, is HDR H.264 a thing in the wild?
[18:54:44 CEST] <furq> is any large broadcaster using h264 for sd yet
[18:54:51 CEST] <furq> i know none of the ones here are
[18:54:57 CEST] <Mavrik> ZexaronS, yeah, but it still hits consumers
[18:54:57 CEST] <redrabbit> well with my software (dvbviewer) there is no way to use either VP9 or OPUS with a .TS
[18:55:08 CEST] <Mavrik> ZexaronS, e.g. how do I know my TV will play HDR YouTube and Netflix and BBC?
[18:55:14 CEST] <redrabbit> it does rely on ffmpeg but there must be internal limitations
[18:55:23 CEST] <JEEB> redrabbit: at least opus got implemented some time ago
[18:55:25 CEST] <Mavrik> (Getting a TV that can DISPLAY HDR is hard enough with widespread lying)
[18:55:37 CEST] <JEEB> redrabbit: so you can both mux and demux opus from/to MPEG-TS
[18:55:46 CEST] <redrabbit> yep opus should work with TS but it didnt, so i use AAC
[18:55:56 CEST] <JEEB> that's not an FFmpeg problem then
[18:55:59 CEST] <furq> vp9 works in ffmpeg if nothing else
[18:56:05 CEST] <redrabbit> it probably dvbviewer
[18:56:09 CEST] <redrabbit> yeah
[18:56:15 CEST] <JEEB> most likely dvbviewer demuxes by itself
[18:56:19 CEST] <JEEB> because lavf mpegts is awful
[18:56:27 CEST] <JEEB> and that demuxing logic has no support for opus
[18:57:10 CEST] <redrabbit> interesting, i had no idea why
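FFmpeg's own MPEG-TS muxer accepts Opus, as JEEB says, so a round trip through ffmpeg itself works even when a third-party demuxer like dvbviewer's does not. A sketch with assumed file names:

```shell
#!/bin/sh
# Mux Opus audio into MPEG-TS using FFmpeg's own muxer.
ACODEC=libopus
# Run only when ffmpeg and the sample input actually exist.
if command -v ffmpeg >/dev/null && [ -f input.mkv ]; then
  ffmpeg -i input.mkv -c:v copy -c:a "$ACODEC" -b:a 128k -f mpegts output.ts
fi
```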
[18:57:13 CEST] <JEEB> Mavrik: even if it gave better results it's not being used because of lack of ASIC support
[18:57:31 CEST] <redrabbit> how does AAC compare to OPUS
[18:57:41 CEST] <Mavrik> mhm, can't seem to track down what Netflix / Amazon use
[18:57:41 CEST] <redrabbit> around 128k / stereo
[18:57:47 CEST] <Mavrik> I'm guessing HEVC / AAC ?
[18:57:54 CEST] <Mavrik> HEVC / DD I mean
[18:58:12 CEST] <furq> h264 and dd for 1080p, hevc and dd for 4k
[18:58:13 CEST] <JEEB> amazon and netflix use HEVC for 10bit due to the ASIC thing, although 10bit VP9 is becoming a thing too
[18:58:13 CEST] <furq> iirc
[18:58:30 CEST] <JEEB> f.ex. my TV can play 10bit VP9 (profile 9.2 or whatever)
[18:58:31 CEST] <redrabbit> i use AVC / AAC or HEVC / AAC but usually i just copy the audio instead of transcoding it if there is bw available
[18:58:38 CEST] <Mavrik> Yeah, although Sony just said that my TV won't support HDR YouTube -_-
[18:58:39 CEST] <furq> only itunes uses aac for their downloads afaik
[18:58:40 CEST] <Mavrik> Thankyou mediatek
[18:58:53 CEST] <JEEB> Mavrik: not like I'd trust the tone mapping in hardware anyways, lol
[18:58:57 CEST] <Mavrik> :P
[18:59:01 CEST] <Mavrik> It's pretty tho :P
[18:59:01 CEST] <JEEB> that's probably the most FFFUUU thing in HDR playback
[18:59:13 CEST] <Mavrik> I'll probably just buy shieldTV at some point
[18:59:15 CEST] <JEEB> because there is no standardized way of tone mapping
[18:59:23 CEST] <Mavrik> Although the AndroidTV on my Sony keeps up surprisingly well
[18:59:26 CEST] <JEEB> so everything can say they do it correctly
[18:59:41 CEST] <JEEB> Mavrik: hmm, I wonder if it would be worth it to try out mpv-android on that thing :D
[18:59:48 CEST] <Mavrik> Well, my TVs default settings are so color saturated that nothing looks real anyway :P
[19:00:07 CEST] <JEEB> although the interface currently sucks for lean-back things
[19:00:17 CEST] <JEEB> and the store won't likely let you install the app due to that
[19:00:20 CEST] <Mavrik> I'm getting like 90% CPU usage just transcoding DTS to DD on-the-fly
[19:00:23 CEST] <Mavrik> So I'd say no :P
[19:00:35 CEST] <JEEB> mediacodec would get used for hw decoding
[19:00:45 CEST] <JEEB> then you get the decoded YCbCr
[19:00:53 CEST] <JEEB> and shove it into an opengl pipe line
[19:00:53 CEST] <JEEB> :)
[19:01:03 CEST] <ZexaronS> And youtube is just so funny with their 4K 3D videos, where 4K is for the whole 3D video, so it still looks like crap ... these are the early years, it's still pretty much useless
[19:01:04 CEST] <JEEB> slower than dumb overlay of course
[19:11:16 CEST] <ZexaronS> Some successor to MPEG2TS has to come, with HDR,WCG, archival features which will be good for various govts too so their interest makes it alive
[19:11:42 CEST] <ZexaronS> and no interlaced/weird frame rates, and no subsampling stuff, full blown quality
[19:11:43 CEST] <atomnuker> ZexaronS: mpegts sucks and anyone using it sucks and everything that follows it would suck
[19:11:59 CEST] <furq> check this out: mpeg4ts
[19:13:36 CEST] <ZexaronS> actually I meant that the format is updated but has the capability to be used as transport or storage, without having to be tagged with "TS" on the outside
[19:14:03 CEST] <BtbN> what?
[19:14:06 CEST] <ZexaronS> and the codec used internally would be made for hiqh quality, not streaming
[19:14:20 CEST] <BtbN> the container and the codec are entirely independend
[19:14:31 CEST] <ZexaronS> the streaming moniker used by companies implies that the actual quality has been sacrificed
[19:14:33 CEST] <BtbN> put something lossless in there and you have your high quality
[19:14:47 CEST] <ZexaronS> you can stream anything without any loss if you really want to
[19:15:09 CEST] <BtbN> If everyone has 50MBit/s for a 1080p stream, sure
[19:16:11 CEST] <ZexaronS> Well but I don't expect the .ts format to support the newer HDR/WCG codecs, that's what I meant, a new version of the MPEG format which would be one big format without the profile clutter
[19:16:29 CEST] <atomnuker> matroska supports everything and you can stream it and you have archival features likes crc
[19:16:46 CEST] <ZexaronS> crc? jeez, SHA-1 at least
[19:16:56 CEST] <JEEB> you do understand that the colorspace metadata in the video format as opposed to the container format?
[19:17:04 CEST] <JEEB> *is in
[19:17:32 CEST] <BtbN> mpeg-ts has no profiles
[19:17:42 CEST] <BtbN> you seem to be confused about what a container and a codec is.
[19:18:11 CEST] <atomnuker> ZexaronS: seems like it supports sha with 160bit keys
[19:19:02 CEST] <BtbN> sha has no keys oO
[19:19:07 CEST] <atomnuker> though only for extradata
[19:19:25 CEST] <ZexaronS> I'm talking about both, but first I'm talking about that the MPEG container (whatever is the latest) would get a new modern version, and it would be a profileless one just like MPEG-TS, but it would have all kinds of bells and whistles for super-high-quality and archival purposes as well as a transport feature, and you wouldn't need separate "SuperMPEG" and "SuperMPEG-TS" containers
[19:19:49 CEST] <BtbN> the container is not involved with the quality in any way
[19:20:06 CEST] <ZexaronS> It would have features that i can't even think of right now
[19:20:15 CEST] <BtbN> must be very important features then
[19:20:40 CEST] <furq> so uh
[19:20:53 CEST] <furq> you're saying that there should be a separate container to mpegts so that you don't need a separate container to mpegts
[19:21:36 CEST] <ZexaronS> The container itself could have some things that would verify the video stream, so if you analyze the file you wouldn't need to pipe the whole video through; the encoder would put it in a special place, and that special place has to be supported by the container
[19:22:06 CEST] <ZexaronS> it could also have redundancy
[19:22:39 CEST] <ZexaronS> for the important metadata
[19:23:12 CEST] <ritsuka> I hope that with all these HDR formats people will finally start to tag their files, 99% of the video out there has no colorspace metadata, so one player displays it one way and another guesses the wrong colorspace... so fun
[19:23:24 CEST] <BtbN> fec is applied on top of the container.
[19:23:33 CEST] <ZexaronS> MediaInfo dev doesn't want to report bitrate in AAC for MP4 files because it would take full analysis of the whole video I think ... not sure if MP4 but I can find the thread
[19:24:03 CEST] <BtbN> "the bitrate" is a rather useless information anyway, as it's mostly variable
[19:24:07 CEST] <ritsuka> nope, it's very easy to calculate the bitrate in mp4
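(A minimal sketch of what ritsuka means: an MP4 already stores per-sample sizes in the `stsz` box and the track duration in `mdhd`, so the average bitrate falls out of simple arithmetic without decoding anything. The numbers below are made up for illustration.)

```python
# Hypothetical values, as if read from an MP4's metadata boxes:
total_sample_bytes = 3_193_772   # sum of all audio sample sizes (stsz box)
duration_seconds = 180.0         # mdhd duration divided by mdhd timescale

# Average bitrate in kbit/s: 8 bits per byte, divided by the duration.
avg_bitrate_kbps = total_sample_bytes * 8 / duration_seconds / 1000
print(round(avg_bitrate_kbps))   # ~142 kbit/s
```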
[19:24:21 CEST] <ZexaronS> The new container would have a requirement to put all that metadata from the video/audio streams into the container's metadata, so apps like MediaInfo would have much less trouble
[19:24:35 CEST] <BtbN> Sounds pointless to me
[19:24:36 CEST] <relaxed> more containers!
[19:24:40 CEST] <BtbN> containers already store tons of metadata
[19:24:41 CEST] <ZexaronS> It's not MP4 then, it's probably MKV i think then
[19:26:22 CEST] <ZexaronS> Here it is: https://sourceforge.net/p/mediainfo/feature-requests/154/
[19:26:44 CEST] <ZexaronS> https://sourceforge.net/p/mediainfo/feature-requests/154/#f91d
[19:27:06 CEST] <ZexaronS> ... > Currently, I didn't find a solution without having to parse the whole file (which may be long, too long)
[19:27:21 CEST] <ZexaronS> And last in 2016 still no solution
[19:27:32 CEST] <BtbN> and why would that be so gravely important?
[19:27:41 CEST] <BtbN> The bitrate is not overly relevant for anything
[19:27:58 CEST] <BtbN> you can already write arbitrary metadata into most if not all containers
[19:28:30 CEST] <ZexaronS> For me audio is very important, I have many times spent extra time/effort to get best audio and extract and add it to the video, many times I picked 720p over 1080p except keeping the best audio
[19:28:54 CEST] <BtbN> and the bitrate being only an estimate affects the presence of audio in what way exactly?
[19:29:16 CEST] <ZexaronS> But see, this hacky way, I want it to be properly supported, so it's clean, fast, runs great and stable, solid for 20-30 years, the goal is archival
[19:29:25 CEST] <BtbN> But supported why?
[19:29:28 CEST] <BtbN> Absolutely nothing needs it
[19:29:35 CEST] <ZexaronS> I need it
[19:29:41 CEST] <BtbN> why?
[19:29:59 CEST] <ZexaronS> This may be a small thing, but this is only one I could think of right now
[19:30:17 CEST] <ZexaronS> Because I'm not relying on the corporate web streaming services, I'm dumping everything offline
[19:30:37 CEST] <ZexaronS> Youtube already started their censor bot
[19:30:52 CEST] <ZexaronS> a week ago
[19:31:25 CEST] <ZexaronS> Google has none of my interests in mind, so, screw them, taking things in my own hands
[19:31:56 CEST] <BtbN> are you drunk or something?
[19:33:15 CEST] <ZexaronS> Ehm, it was on drudge just a few days ago https://twitter.com/DRUDGE_REPORT/status/903786180482920448
[19:33:28 CEST] <ZexaronS> https://www.thesun.co.uk/tech/4372177/youtube-accused-of-censorship-over-controversial-new-bid-to-limit-access-to-videos/
[19:34:35 CEST] <ZexaronS> That's what the whole AI is for, not for anything good
[20:25:28 CEST] <alexpigment> BtbN: I checked out your change for interlaced nvenc
[20:25:52 CEST] <alexpigment> MediaInfo reports the frame rate as 29.97 now, but the interlacing is visually not correct
[20:25:59 CEST] <alexpigment> I believe the frame alignment is wrong
[20:26:04 CEST] <BtbN> it looks perfect to me in WMP
[20:26:11 CEST] <BtbN> exactly like in the input file
[20:26:20 CEST] <alexpigment> I just copied nvenc.c and nvenc.h
[20:26:24 CEST] <alexpigment> Were there changes elsewhere?
[20:26:35 CEST] <BtbN> yes
[20:26:47 CEST] <alexpigment> Ok, that's probably the problem
[20:27:01 CEST] <BtbN> it shouldn't even compile if you just copied those two
[20:27:58 CEST] <alexpigment> I didn't see any errors; I'm using a build script (Windows Build Helpers on linux), so maybe it works a little different
[20:28:10 CEST] <alexpigment> at any rate, I'll try and download your whole source and go from there
[20:28:59 CEST] <BtbN> https://btbn.de/public/out_test5.mp4
[20:29:25 CEST] <alexpigment> yep, that looks perfect
[20:29:53 CEST] <BtbN> I suppose it might compile. But should error when you try to encode something interlaced.
[20:30:09 CEST] <alexpigment> But actually, it's still reporting 59.94fps in MediaInfo
[20:30:17 CEST] <alexpigment> not sure if that's a problem for compatibility elsewhere
[20:30:52 CEST] <BtbN> MediaInfo is plain wrong there
[20:30:55 CEST] <alexpigment> I guess I'll have to break out my collection of Blu-ray players :)
[20:31:06 CEST] <BtbN> it seems like it doesn't often have to deal with field mode interlaced content
[20:31:07 CEST] <alexpigment> I have about 20 different models to test with
[20:31:21 CEST] <BtbN> the in-bitstream "fps" is per picture
[20:31:32 CEST] <BtbN> and with field mode interlaced encoding, there is two pictures per frame
[20:32:02 CEST] <BtbN> So it being 60 fps is correct
[20:32:05 CEST] <alexpigment> Is field mode just "not MBAFF"?
[20:32:20 CEST] <BtbN> MBAFF is one interleaved picture per frame
[20:32:21 CEST] <alexpigment> or is there another nuance to field mode that is different
[20:32:27 CEST] <BtbN> field mode is two separate pictures, in separate slices
[20:32:59 CEST] <BtbN> so the picture-rate is doubled with field mode
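(The rate arithmetic BtbN is describing, as a small sketch: with field-mode (PAFF) coding each frame becomes two field pictures, so the in-bitstream picture rate doubles while the frame rate is unchanged. Values are the NTSC rates from this discussion.)

```python
from fractions import Fraction

frame_rate = Fraction(30000, 1001)   # 29.97 frames per second

# Field mode codes each frame as two separate field pictures,
# so the picture rate in the bitstream is double the frame rate.
picture_rate = frame_rate * 2
print(picture_rate)        # 60000/1001, i.e. ~59.94 pictures/s
print(float(frame_rate))   # the actual frame rate stays ~29.97
```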
[20:33:09 CEST] <alexpigment> Here's my question:
[20:33:47 CEST] <BtbN> keep in mind that this solution is a hack. It working depends entirely on the container timestamps and the muxer being able to make up proper intermediate timestamps.
[20:33:49 CEST] <alexpigment> If I have an AVCHD video from a camera that reports "Scan type, store method: separate fields", is that the same thing as "field mode" to you?
[20:33:55 CEST] <BtbN> As I'm just sending out every second frame without a timestamp
[20:34:03 CEST] <BtbN> Yes
[20:34:16 CEST] <BtbN> every second packet
[20:34:22 CEST] <alexpigment> OK, I have lots of those videos that report their frame rate as 29.97 in mediainfo
[20:34:33 CEST] <BtbN> They violate the spec then
[20:34:41 CEST] <BtbN> The sample you gave me also violates it
[20:34:43 CEST] <alexpigment> So really all I'm saying is that the two are not mutually exclusive in mediainfo
[20:35:14 CEST] <alexpigment> I suppose so. It's just the standard for most AVCHD recording modes on consumer/prosumer cameras to be 29.97 and field mode
[20:35:20 CEST] <alexpigment> As reported in MediaInfo
[20:35:28 CEST] <BtbN> The FPS is still 29.97
[20:35:38 CEST] <BtbN> the value in the bitstream just plain does not refer to the fps
[20:36:23 CEST] <alexpigment> Right. I'm really just giving you a different perspective to your assumption that MediaInfo is usually wrong for field mode interlaced content
[20:36:59 CEST] <alexpigment> Again, the sample you provided a second ago looks good in WMP. I just have to do testing on a bunch of players to make sure the discrepancy of the FPS shown in MediaInfo has no impact on hardware compatibility
[20:37:04 CEST] <BtbN> I am positive that the test output has the fps and timebase correctly set
[20:37:20 CEST] <BtbN> nvenc even bothers to fix the framerate
[20:37:22 CEST] <alexpigment> I'm not doubting you on this
[20:37:31 CEST] <BtbN> it's not 2997/100. It's 30000/1001
[20:38:04 CEST] <alexpigment> Yeah, 2997/100 is a weird one for sure :)
[20:38:29 CEST] <BtbN> It's some encoder having bad rounding
[20:40:20 CEST] <alexpigment> Out of curiosity, do you plan on submitting your 'hack' to master?
[20:40:43 CEST] <BtbN> no
[20:40:46 CEST] <ZexaronS> alexpigment: did you try ffprobe to see if it's different to mediainfo ?
[20:41:47 CEST] <ZexaronS> Also if you can provide details I'd put a ticket to mediainfo if you don't want to take time/effort
[20:41:50 CEST] <alexpigment> ZexaronS: not yet. there was also a discrepancy between the two. However, I did get ffprobe to read 59.94 through some combination of demuxing and remuxing the other day. I'm not sure the same thing would happen after BtbN's most recent change
[20:42:13 CEST] <ZexaronS> change of what?
[20:42:18 CEST] <BtbN> if there was raw h264 somewhere in that chain, the framerate is messed up anyway
[20:42:23 CEST] <ZexaronS> I wasn't tuned in this discussion
[20:42:45 CEST] <alexpigment> ZexaronS: this file that BtbN created is the file in question: https://btbn.de/public/out_test5.mp4
[20:43:27 CEST] <alexpigment> I can't 100% say that MediaInfo is "wrong" here, but they may be pulling the frame rate from a different method than other players
[20:43:58 CEST] <BtbN> the framerate is clearly not that. So it's wrong.
[20:44:01 CEST] <alexpigment> Hence why I need to test and make sure that certain hardware doesn't also read in that method and try to deinterlace based on the wrong frame rate
[20:44:16 CEST] <BtbN> The framerate is also 100% irrelevant.
[20:44:22 CEST] <BtbN> The timestamps are what matters.
[20:44:47 CEST] <alexpigment> BtbN: I'm not doubting you. I don't care about what's right/wrong. I care about universal compatibility. I can't go to LG and say "hey fix this Blu-ray player you made in 2010 and sold to millions of people"
[20:46:00 CEST] <alexpigment> And while I trust your assumption, I also know how hardware generally works. They have bugs for things that they don't commonly deal with
[20:48:20 CEST] <JEEB> just stop guessing things and actually try the output on the thing you think might be failing with it
[20:48:38 CEST] <JEEB> there is no such thing as universal compatibility anyways, you have to put a limit somewhere on how dumb something is
[20:49:44 CEST] <JEEB> so if you care about some hardware and think it's gonna fail, just test it instead of trying to make decisions based on information presented to you by chinese whispers that you do not understand what they are based upon
[20:50:29 CEST] <alexpigment> JEEB: did you miss the part where I said I'm going to test on hardware?
[20:50:50 CEST] <alexpigment> Unlike you guys, I *do* have a test bank of around 20 players
[20:51:02 CEST] <JEEB> yes because your last two lines or so were talking of what I commented upon
[20:51:21 CEST] <JEEB> and yes, look down upon people. great way of making people motivated to help you.
[20:51:25 CEST] <alexpigment> Ok, well *I'm going to test hardware compatibility*
[20:51:34 CEST] <JEEB> you don't know what we work with or how we test things
[20:51:50 CEST] <alexpigment> You're right. I'm making a bad assumption
[20:51:58 CEST] <alexpigment> I apologize
[20:52:14 CEST] <alexpigment> You just came at me as if I was just giving idle speculation
[20:52:24 CEST] <JEEB> well you base on random numbers output from mediainfo mostly
[20:52:30 CEST] <JEEB> which I know can be misleading
[20:52:39 CEST] <alexpigment> I agree with that
[20:52:41 CEST] <JEEB> the only way to know what exactly comes out of mediainfo is to read the source
[20:52:52 CEST] <JEEB> since it doesn't always tell you what something actually means
[20:52:52 CEST] <alexpigment> But if MediaInfo sees it wrong, there's a chance that some hardware can play it wrong
[20:52:57 CEST] <alexpigment> So I need to test on hardware
[20:53:03 CEST] <alexpigment> Which I will be doing
[20:53:58 CEST] <alexpigment> If there's an issue, I'll report back. Otherwise, I believe BtbN's hack will work perfectly fine on those players
[21:02:51 CEST] <alexpigment> BtbN: I'm not sure if this is any clue here, but if you take your file out_test5.mp4 and then transcode it to progressive via yadif=1, it becomes 120fps
[21:03:22 CEST] <durandal_1707> thats normal
[21:03:29 CEST] <alexpigment> Handbrake also doesn't deinterlace it properly; I think they use yadif as well
[21:03:46 CEST] <alexpigment> durandal_1707: how so?
[21:04:02 CEST] <durandal_1707> it outputs fields
[21:04:07 CEST] <alexpigment> right
[21:04:22 CEST] <alexpigment> there should be 59.94 fields that get turned into 59.94 frames
[21:04:30 CEST] <alexpigment> (per second, of course)
[21:05:38 CEST] <durandal_1707> what's the fps of the original video?
[21:06:07 CEST] <alexpigment> Well, that's the question of interest here
[21:06:15 CEST] <alexpigment> It's supposed to be 29.97 interlaced
[21:06:42 CEST] <alexpigment> MediaInfo reports it as 59.94fps interlaced, but BtbN thinks that MediaInfo is just reporting it wrong (which may be very true)
[21:06:53 CEST] <alexpigment> But Yadif is also apparently reading it wrong if that's true
[21:06:54 CEST] <durandal_1707> but whats reported by ffmpeg
[21:07:12 CEST] <Nosomy> yadif=0 deinterlaces at the "original" video fps
[21:07:23 CEST] <Nosomy> yadif=1 doubles it
[21:07:30 CEST] <alexpigment> Right
[21:07:38 CEST] <alexpigment> 59.94fps interlaced doesn't exist though
[21:07:49 CEST] <alexpigment> Or at least, it's not a standard for video
[21:08:20 CEST] <alexpigment> ffmpeg does report it as 59.94fps though
[21:08:33 CEST] <Nosomy> 60000/1001 fps exists for interlaced, but is uncommon
[21:08:51 CEST] <alexpigment> It's not a standard though, to my knowledge
[21:09:04 CEST] <JEEB> why not check how the ffmpeg demuxer / decoder that you're using handles it and if it's really outputting 120 Hz progressive out (because the deinterlacer will double the time base but the actual timestamps of the output don't need to be on each possible picture of that)
[21:09:51 CEST] <alexpigment> Pardon my ignorance, but I'm not entirely sure how to test for that
[21:10:01 CEST] <alexpigment> I'm using yadif as my roundabout way to test that
[21:10:18 CEST] <JEEB> ffprobe -show_frames -of json input.file (and then pipe into less or a file for browsing)
[21:10:34 CEST] <JEEB> also there was a parameter for ffprobe to limit the thing to the video track
[21:11:13 CEST] <JEEB> -of json is just so it can be parsed more easily by various things, but as long as you get the timestamps of the decoded frames *somehow* that should be enough
[21:11:52 CEST] <JEEB> -select_streams v:0 it seems :P
[21:12:07 CEST] <JEEB> picks the first video track only (if the file has audio etc)
[21:13:08 CEST] <alexpigment> that spits out a lot of data
[21:13:22 CEST] <alexpigment> I'm not sure how to gather the info you're talking about based on what I'm seeing
[21:13:43 CEST] <JEEB> that's why I told you to either pipe it to less or output to file
[21:13:52 CEST] <JEEB> first ~four frames should be enough
[21:13:57 CEST] <JEEB> there might be a parameter to limit it to that
[21:14:03 CEST] <alexpigment> yeah, I've got it in a text file
[21:14:12 CEST] <alexpigment> I guess just derive it from the time stamps?
[21:14:20 CEST] <JEEB> post the first five or so frames onto a pastebin or so and link here
[21:14:34 CEST] <JEEB> I *think* show_frames should actually decode the stuff
[21:14:46 CEST] <alexpigment> https://pastebin.com/8eqV5EAg
[21:15:54 CEST] <JEEB> ok, so +400 per picture
[21:16:05 CEST] <JEEB> now do -show_streams which should show the time_base
[21:16:14 CEST] <JEEB> (you can derive it from the _time things but yunno)
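(What JEEB is asking for can be done mechanically: parse the `-of json` output of `ffprobe -select_streams v:0 -show_frames` and look at the PTS step between consecutive decoded pictures. The JSON below is an illustrative stand-in with the field names ffprobe used at the time, not the actual pastebin contents.)

```python
import json

# Made-up sample in the shape of ffprobe's -show_frames -of json output.
sample = '''
{"frames": [
  {"media_type": "video", "pkt_pts": 0,    "pkt_pts_time": "0.000000"},
  {"media_type": "video", "pkt_pts": 400,  "pkt_pts_time": "0.033367"},
  {"media_type": "video", "pkt_pts": 800,  "pkt_pts_time": "0.066733"},
  {"media_type": "video", "pkt_pts": 1200, "pkt_pts_time": "0.100100"}
]}
'''

frames = json.loads(sample)["frames"]
pts = [f["pkt_pts"] for f in frames]
# Collect the distinct PTS deltas; a constant step means a steady rate.
deltas = {b - a for a, b in zip(pts, pts[1:])}
print(deltas)  # a constant +400 step per decoded picture
```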
[21:16:40 CEST] <BtbN> It's entirely possible ffmpeg just does not support field mode interlacing. It's a hack after all.
[21:16:50 CEST] <BtbN> Nobody ever did that before to my knowledge.
[21:17:06 CEST] <BtbN> So I wouldn't be surprised if it thinks the two fields are actual frames.
[21:17:09 CEST] <alexpigment> https://pastebin.com/EJZfXhVc
[21:17:11 CEST] <JEEB> uhh, I'm pretty sure FFmpeg's AVC decoder should be tested with that, but I don't know after that :)
[21:17:15 CEST] <JEEB> as in, yadif could have issues
[21:17:28 CEST] <BtbN> JEEB, for decoding, yes. But not producing it.
[21:17:42 CEST] <BtbN> As in, every AVPacket is expected to contain a frame. Not a field.
[21:17:59 CEST] <JEEB> well yea, that definition isn't exactly true
[21:18:17 CEST] <JEEB> but as you have seen there's cracks here and there
[21:18:31 CEST] <alexpigment> fwiw, here are some AVCHD samples from cameras: https://www.shedworx.com/shedworx-samples
[21:18:35 CEST] <alexpigment> i.e. the second video
[21:18:42 CEST] <BtbN> The hack is to just split the bitstream on the slice boundaries.
[21:18:44 CEST] <alexpigment> that's probably a good point of reference
[21:18:48 CEST] <BtbN> And output two packets per input frame
[21:18:50 CEST] <JEEB> anyways, 1/11988 is the time_base for that stream 0
[21:19:20 CEST] <JEEB> given that the PTS goes up 400
[21:19:24 CEST] <JEEB> per picture
[21:19:31 CEST] <JEEB> and we expect that to be static
[21:19:42 CEST] <JEEB> >>> 11988/400
[21:19:43 CEST] <JEEB> 29.97
[21:19:53 CEST] <JEEB> so yea, the decoders are outputting that correctly :P
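(JEEB's derivation above in one place: with a stream time_base of 1/11988 and a constant +400 PTS step, the frame rate is the reciprocal of seconds-per-picture.)

```python
from fractions import Fraction

time_base = Fraction(1, 11988)  # stream time_base from -show_streams
pts_step = 400                  # constant PTS increment per picture

# Seconds per picture = pts_step * time_base; fps is the reciprocal.
fps = 1 / (pts_step * time_base)
print(float(fps))  # 29.97
```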
[21:19:56 CEST] <Nosomy> >Canon HG10, NTSC 60i mode
[21:20:07 CEST] <Nosomy> exotic.
[21:20:52 CEST] <alexpigment> Nosomy: those sexy fences ;)
[21:21:33 CEST] <JEEB> alexpigment: anyways, looking at what the FFmpeg decoder gives out the time base is 1/11988, but the pictures are output with proper timestamps to give you 30000/1001 or so per second
[21:22:59 CEST] <alexpigment> JEEB: does that seem concerning for compatibility, then?
[21:23:13 CEST] <alexpigment> I'm comparing other non-MBAFF streams in yadif and they come out as 59.94fps
[21:23:34 CEST] <JEEB> the time base is a bit weird, but nope? are you sure yadif is actually pushing out double the pictures?
[21:23:39 CEST] <alexpigment> yes
[21:23:41 CEST] <alexpigment> yadif=1
[21:23:44 CEST] <JEEB> ...
[21:23:48 CEST] <JEEB> that's not what I meant
[21:23:58 CEST] <alexpigment> and then the output file is ~120fps and then doesn't work in WMP
[21:24:20 CEST] <alexpigment> presumably because I'm on Windows 7 and it doesn't handle 120fps correctly via DXVA2
[21:24:32 CEST] <JEEB> there can be various reasons why it doesn't work in WMP, to be honest. so I would like you to output a few frames of that thing and check the things we just did with the input
[21:24:47 CEST] <alexpigment> K
[21:24:50 CEST] <JEEB> that way we can see if yadif did a dumb thing or not
[21:24:55 CEST] <JEEB> instead of playing the guessing game
[21:25:42 CEST] <durandal_1707> settb filter
[21:26:19 CEST] <BtbN> Can I require that to be used from within an encoder?
[21:26:34 CEST] <alexpigment> deinterlaced: https://pastebin.com/CuArWGic
[21:26:53 CEST] <JEEB> can we get a -show_streams as well?
[21:26:59 CEST] <JEEB> which shows the time base
[21:27:07 CEST] <alexpigment> https://pastebin.com/qHYMcn7u
[21:27:12 CEST] <JEEB> danke
[21:27:21 CEST] <BtbN> durandal_1707, settb can't increase the ticks_per_frame it seems
[21:27:34 CEST] <JEEB> ok, so the time base got duplicated as expected from yadif 1001/240000
[21:27:41 CEST] <JEEB> now for the actual time stamps
[21:28:06 CEST] <durandal_1707> BtbN: whats that?
[21:28:26 CEST] <JEEB> or wait no, actual time_base is 1/120000
[21:28:26 CEST] <BtbN> it's what determines the fps
[21:28:36 CEST] <JEEB> fuck, I always end up grabbing the codec_time_base first :P
[21:29:20 CEST] <JEEB> ok, yea. seems like yadif did a dumb
[21:29:46 CEST] <durandal_1707> what dumb?
[21:29:48 CEST] <JEEB> the decoded time stamps were properly 30000/1001 for two-fields-in-one packet (most likely the AVC decoder does that)
[21:30:20 CEST] <JEEB> durandal_1707: it got interlaced "frames" at 30000/1001 but with a weird time_base
[21:30:31 CEST] <BtbN> the original file is already misdetected by ffmpeg as 60fps though, while it actually is only 30 but interlaced
[21:30:32 CEST] <JEEB> so the end result seems to have been 120000/1001
[21:30:51 CEST] <JEEB> BtbN: "misdetection" doesn't really matter since the decoder seems to output correct timestamps
[21:31:05 CEST] <JEEB> we got +400 for each pts
[21:31:18 CEST] <JEEB> and the time base was 1/11988
[21:31:24 CEST] <BtbN> it's probably messed up container data, due to my hack to create that file
[21:32:03 CEST] <JEEB> possibly, but the timestamps seem correct
[21:32:08 CEST] <JEEB> so either ffmpeg.c or yadif did a dumb :)
[21:32:23 CEST] <JEEB> decoding wise the AVC decoder did the right thing at least
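(The mismatch being diagnosed here, sketched numerically and assuming, as BtbN suggests, that the hacked container labels the field rate as the frame rate: yadif in frame-per-field mode doubles whatever rate it is handed, so the correct 29.97 input would give 59.94p, while the mislabeled 59.94 "frame rate" produces the observed 120000/1001.)

```python
from fractions import Fraction

decoded_rate = Fraction(30000, 1001)    # what the decoder actually emits
container_rate = Fraction(60000, 1001)  # what the hacked container claims

# yadif mode=1 (frame per field) doubles the rate it is told:
print(decoded_rate * 2)    # correct input -> 60000/1001 (59.94p)
print(container_rate * 2)  # bogus input  -> 120000/1001, the observed "120fps"
```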
[21:33:28 CEST] <BtbN> I wish there was a way to modify the timebase from an encoder
[21:34:08 CEST] <JEEB> AVPackets having their own time_base would be good
[21:34:37 CEST] <JEEB> then you could init the muxer with that
[00:00:00 CEST] --- Mon Sep 4 2017