[Ffmpeg-devel-irc] ffmpeg-devel.log.20161128

burek burek021 at gmail.com
Tue Nov 29 03:05:03 EET 2016


[00:04:47 CET] <philipl> Timothy_Gu: you have admin rights on the github project? You could set up the integrations
[00:05:05 CET] <nevcairiel> we have fate for regular testing :p
[00:06:21 CET] <Timothy_Gu> philipl: I'm not sure... let me check
[00:06:53 CET] <Timothy_Gu> philipl: nope, I don't have admin rights, only write privilege
[00:07:49 CET] <philipl> Well, then michaelni has to do it or give one of us rights
[00:08:33 CET] <michaelni> what do i need to do ?
[00:08:34 CET] <Timothy_Gu> philipl: did you mean to say something in http://lists.ffmpeg.org/pipermail/ffmpeg-devel/2016-November/203499.html?
[00:08:47 CET] <philipl> urp
[00:09:20 CET] <philipl> man. I wrote a bunch of stuff :-(
[00:10:00 CET] <philipl> michaelni: https://scan.coverity.com/travis_ci
[00:10:35 CET] <philipl> You need to follow the instructions there down to step 4 yourself
[00:10:52 CET] <nevcairiel> can you sanely rate-limit travis integration on github? like michael said on the ML, we can't do it after every commit
[00:10:55 CET] <philipl> I'm not sure how other people get added to the project in travis, but however that's done, then someone can take over.
[00:11:08 CET] <Timothy_Gu> basically go to https://travis-ci.org/profile/FFmpeg (logging in with your GitHub account as needed), and enable FFmpeg
[00:11:15 CET] <nevcairiel> oh right you wanted to use some branch
[00:11:21 CET] <philipl> As discussed in the doc, you have a separate branch for coverity
[00:11:25 CET] <Timothy_Gu> nevcairiel: we can specify the branch in .travis.yml
[00:11:47 CET] <philipl> You could use travis to build every commit to master as a build test, but that's a separate objective.
[00:11:55 CET] <philipl> easy to do once you've done all the work for coverity, of course.
[00:12:54 CET] <Timothy_Gu> michaelni: actually before you enable it, we probably need to update the .travis.yml file to whitelist the coverity branch first
[00:24:46 CET] <michaelni> i see no way to add a user to this travis thing
[00:26:21 CET] <philipl> Does it inherit users from the github project?
[00:26:25 CET] <philipl> Chloe, Timothy_Gu ?
[00:26:43 CET] <Timothy_Gu> Yeah it inherits the organization status from github
[00:26:49 CET] <Chloe> yes it does
[00:27:10 CET] <Chloe> you need to be in the ffmpeg org and have read access on the ffmpeg repo
[00:28:16 CET] <Timothy_Gu> Actually you need more than read or write access
[00:28:22 CET] <Chloe> you do?
[00:28:36 CET] <Timothy_Gu> Yeah. I have write access to FFmpeg repo but i can't add travis
[00:28:36 CET] <Chloe> oh right, yeah. I think you may need admin access for the repo
[00:30:30 CET] <michaelni> I've added travis to ffmpeg but I've no clue how to give anyone access to that
[00:30:51 CET] <Chloe> you need to give them admin access to the ffmpeg repo on github
[00:31:32 CET] <Timothy_Gu> Chloe: the problem is there isn't something called "admin" on github
[00:31:43 CET] <Timothy_Gu> i'm guessing they really mean "owner" but I'm not sure
[00:31:50 CET] <Chloe> (if they are to do any management on the travis settings that is), anyone with push access can trigger a build
[00:31:56 CET] <Chloe> Timothy_Gu lemme check
[00:32:32 CET] <Timothy_Gu> Triggering is different from managing
[00:32:37 CET] <Chloe> I know
[00:32:50 CET] <Chloe> Create a team called 'travis', add the FFmpeg repo, and set the permission level to 'Admin'
[00:33:11 CET] <Chloe> and then add people to ffmpeg org, before adding them to the travis team
[00:33:23 CET] <Chloe> (this is how it's set up in another org I'm in)
[00:33:33 CET] <Timothy_Gu> Chloe: you can't set permission level of a team AFAICT
[00:33:44 CET] <philipl> I think you can these days
[00:33:44 CET] <Chloe> pretty sure you can
[00:33:46 CET] <Timothy_Gu> only per member
[00:34:06 CET] <Chloe> from what I can see, it looks like you can
[00:34:17 CET] <Timothy_Gu> where?
[00:34:29 CET] <Chloe> within the teams' settings
[00:34:49 CET] <michaelni> I found an option to give admin rights on: https://github.com/orgs/FFmpeg/teams/members/repositories
[00:35:18 CET] <Timothy_Gu> Ok i think that'll work
[00:35:26 CET] <Chloe> nevcairiel we can do travis after every commit, but not coverity
[00:35:35 CET] <Timothy_Gu> this is so confusing. GH keeps changing their team settings
[00:35:55 CET] <michaelni> ok I've given all members (Timothy, Lou and me) admin for ffmpeg/ffmpeg
[00:36:20 CET] <michaelni> "this is so confusing" +1
[00:36:37 CET] <Timothy_Gu> Right, I can access Travis settings now
[00:36:45 CET] Action: michaelni happy :)
[00:36:52 CET] <Chloe> cool, good it works now
[00:37:11 CET] <Chloe> I can help with travis, but as I said, I'm not sure about docker
[00:37:12 CET] <Timothy_Gu> I guess the question now is: do we want to only use Travis for Coverity, or do we want to do what Libav does and make it another FATE client
[00:37:23 CET] <philipl> Timothy_Gu, Chloe: I'll let you guys handle getting travis going.
[00:37:29 CET] <philipl> I'm currently building the Dockerfile
[00:37:42 CET] <Chloe> Timothy_Gu there's no reason why it couldnt be another FATE client
[00:37:55 CET] <philipl> Once you've gone to the trouble of getting it working for coverity, making it run FATE will be a small amount of work
[00:38:38 CET] <Timothy_Gu> Chloe: you can try with something like https://github.com/TimothyGu/FFmpeg/commit/8330ab38679e984c5b69f8257674ce011cf8c64b
[00:38:51 CET] <Timothy_Gu> never got it working though
[00:39:40 CET] <Timothy_Gu> Okay so I guess the answer is "Coverity first, FATE later"
[00:45:51 CET] <philipl> The list of dependencies that you can't satisfy from official ubuntu packages is pretty small.
[00:46:49 CET] <Chloe> So philipl's doing the docker, I'll add it as a FATE client. Who's overseeing travis?
[00:47:22 CET] <Chloe> I mean, I can do that too, but I'd need access.
[00:47:53 CET] <Timothy_Gu> Chloe: I can do it
[00:47:59 CET] <Chloe> ok
[00:48:20 CET] <Timothy_Gu> And I don't think I have the ability to add people to the FFmpeg gh org :(
[00:48:36 CET] <Chloe> afaik only michaelni has the ability
[00:48:49 CET] <Timothy_Gu> yeah
[00:50:17 CET] <Timothy_Gu> If someone needs Travis to be enabled ping me
[01:03:06 CET] <philipl> https://github.com/philipl/FFmpeg/commit/885eff0c6502a6f3ab7500a2442bd3770247b3d3
[01:03:19 CET] <philipl> That's a build dockerfile and one that runs configure on top of it, to prove it works
[01:05:13 CET] <Chloe> philipl so which dependencies are missing?
[01:06:03 CET] <philipl> whatever maps to the commented out enable options
[01:06:15 CET] <philipl> Of course, a few of those are impossible - android, rpi, etc
[01:06:35 CET] <philipl> iec61883 and tesseract are present in ubuntu but configure failed. I haven't investigated yet.
[01:07:33 CET] <philipl> So someone should register ffmpeg with docker hub and that can be pointed to the build environment Dockerfile after it's checked in.
[01:08:40 CET] <Chloe> https://buildbot.openmpt.org/builds/auto/src/ the autotools tarballs from here work best
[01:09:53 CET] <philipl> I can create the ffmpeg org in docker hub if we want.
[01:10:55 CET] <Chloe> I dont see an issue with that, as long as other relevant people can get access too
[01:11:04 CET] <philipl> Send me docker usernames/emails and I'll add them
[01:17:24 CET] <philipl> I suspect someone with github admin access has to link to docker hub for the automatic builds to work.
[01:17:30 CET] <Timothy_Gu> philipl: timothygu/timothygu99 at gmail.com
[01:17:56 CET] <philipl> done
[01:18:18 CET] <Timothy_Gu> philipl: You can put the dockerfile in your own repo and set up the auto builds from there, no?
[01:18:27 CET] <philipl> That's what I'm going to test, at least.
[01:18:41 CET] <Timothy_Gu> There doesn't seem to be a connection between GitHub accounts and dockerhub ones
[01:19:06 CET] <Chloe> Timothy_Gu there is
[01:20:20 CET] <Chloe> https://cloud.docker.com/app/timothygu/settings -> Source providers -> click on github's socket thingy
[01:21:12 CET] <Timothy_Gu> what's Docker cloud?
[01:22:00 CET] <Timothy_Gu> I mean I already have a Docker Hub account?
[01:22:13 CET] <Chloe> oh
[01:22:20 CET] <Chloe> i think I signed up to the wrong thing
[01:23:17 CET] <Chloe> Timothy_Gu: https://hub.docker.com/account/authorized-services/ -> github maybe?
[01:23:31 CET] <Timothy_Gu> yeah we already did that
[01:24:08 CET] <philipl> https://github.com/philipl/ffmpeg-build-docker
[01:24:14 CET] <philipl> https://hub.docker.com/r/ffmpeg/build/builds/
[01:25:03 CET] <Timothy_Gu> well let's see how fast this thing runs
[01:25:33 CET] <philipl> It took a while to do locally for me. It's 900MB of crap that needs to be downloaded and a 3.5GB image at the end of the day
[01:27:34 CET] <Timothy_Gu> michaelni: can you add me or philipl to the Coverity project?
[01:27:46 CET] <Timothy_Gu> as "Admin"
[01:27:51 CET] <Timothy_Gu> in order to submit builds
[01:30:30 CET] <michaelni> Timothy_Gu, done
[01:31:27 CET] <Timothy_Gu> thx
[01:36:14 CET] <Timothy_Gu> ok I've linked the Coverity project with GitHub, so we just need to get the Travis thing working
[01:53:10 CET] <philipl> Success. Took 27 minutes apparently
[01:55:21 CET] <philipl> I'm not sure what the recommended way to run a build in docker on top of travis is. I haven't found exact examples.
[02:00:25 CET] <Timothy_Gu> philipl: https://docs.travis-ci.com/user/docker/
[02:02:02 CET] <Timothy_Gu> "BuildTime: 2016-10-26T22:01:48.986273588+00:00" heh
[02:05:27 CET] <philipl> Timothy_Gu: that doesn't discuss how to run the actual build in docker.
[02:05:47 CET] <philipl> given that travis has already checked out the source code, what's the recommended way to access it in the container?
[02:05:50 CET] <philipl> That kind of thing
[02:06:05 CET] <Timothy_Gu> -v `pwd`:/src ?
[02:06:16 CET] <philipl> probably.
[02:06:22 CET] <philipl> It's just I couldn't find a discussion of it
[02:24:30 CET] <Timothy_Gu> philipl: I don't think we can use Travis CI's nice coverity addon because we are using a container
[02:25:02 CET] <Timothy_Gu> check out https://scan.coverity.com/projects/ffmpeg/builds/new?tab=upload for instructions on downloading the cov-build tool among other things
[02:25:39 CET] <philipl> hrm,
[02:27:35 CET] <philipl> With the addon, the build is still done using the script.
[02:27:50 CET] <philipl> So you can '-v' the directory, run the build and then the output is there to be grabbed by coverity, right?
[02:28:24 CET] <philipl> don't know until you try
[02:29:11 CET] <wm4> can't you just use github's coverity integration?
[02:30:08 CET] <philipl> That is what we're discussing. The builds have to run with travis
[02:32:13 CET] <Timothy_Gu> cov-build basically puts some gcc wrappers in PATH, but because we are doing the build in the container with a separate CC etc I'm not sure if it will work
[02:33:23 CET] <philipl> ah. yeah, that won't help, although we could set it up in the container and still use the addon to upload
[02:33:28 CET] <philipl> which has value
[02:35:23 CET] <Timothy_Gu> or actually I think you might be right. cov-int directory exists in the *cov-build
[02:36:03 CET] <Timothy_Gu> sorry typed "enter" at the wrong time
[02:36:57 CET] <Timothy_Gu> i think we should be able to use it for upload, but in that case we could just as well upload it ourselves
[02:37:16 CET] <philipl> up to you. See what's easiest.
[02:37:28 CET] <Timothy_Gu> yep
[02:37:52 CET] <Timothy_Gu> sorry got homework tonight. Will try figuring something out tomorrow
[02:38:26 CET] <philipl> Sure.
[02:38:38 CET] <philipl> I'm done for the day. I found what I could in ppas and I'm updating the docker file.
[02:39:09 CET] <philipl> Everything else is tricky. Source builds are always an option and some are in ancient ppas, which reflect that these libraries are ancient and irrelevant but that's ffmpeg for you.
[02:46:44 CET] <Timothy_Gu> philipl: thanks for all the work!
[02:47:02 CET] <philipl> you too.
[04:49:22 CET] <rcombs> wm4: you mean, like, determine if a VT frame is yuv420p or nv12 (or I guess it could technically be RGB), find out what the plane offsets, strides, etc. are, and such?
[04:49:44 CET] <rcombs> if so, yes, there are reasonably usable APIs for that
[11:24:21 CET] <durandal_1707> don't kill ffserver, fix it instead
[11:30:37 CET] <JEEB> right
[11:31:19 CET] <JEEB> people are currently just saying that it should GTFO of the main repo as it's most probably the biggest offender of internal API usage etc
[11:32:02 CET] <JEEB> also at least one attempt was to make a streamer based on the public APIs and it's definitely possible (someone posted a funky matroska streamer here before)
[11:32:28 CET] <JEEB> where ffserver "works" is very limited and you should see how many times someone *thinks* they can utilize it on #ffmpeg
[11:32:37 CET] <JEEB> that's of course separate from bad API usage
[11:32:43 CET] <JEEB> but it's a mess from multiple sides
[13:41:10 CET] <BBB> so are we removing ffserver tomorrow, or should I call for a vote to remove it?
[13:46:50 CET] <jamrial> BBB: as I said in an email, I'm (still, somehow) ok with waiting until 3.3 is ready to be branched before removing it, since it will aid michaelni and/or reynaldo in making it work as a standalone program
[13:47:20 CET] <BBB> the only reason michael works on ffserver is because it's in-tree
[13:47:22 CET] <jamrial> but if you want to start a vote to choose between "remove after this voting ends" or "remove right before 3.3 is branched" then go ahead
[13:47:29 CET] <BBB> like, he doesn't work on MPV or gstreamer or chrome
[13:48:09 CET] <BBB> I just want to make sure 3.2 doesn't repeat itself
[13:48:12 CET] <BBB> that was embarrassing
[13:48:24 CET] <Compn> embarrassed for an open source project
[13:48:33 CET] <jamrial> yes, i've said as much in more than one reply to that thread
[13:49:36 CET] <BtbN> I can't think of any good reason for it to stay. Even with a few people desperately fighting for it. If they like it so much, it can be maintained out of tree.
[13:50:00 CET] <BtbN> Eventually, all the cli tools should be moved out of tree. But ffserver is the most pressing one
[13:51:03 CET] <BBB> ffmpeg can stay IMO
[13:51:09 CET] <BBB> ffplay is fun but nothing else
[13:51:15 CET] <BBB> it's useful for seeking maybe
[13:51:20 CET] <BBB> for lack of better seek tests
[13:51:27 CET] <BBB> but probably should be moved also
[13:51:36 CET] <BBB> (would also give us reasons to use real players like MPV or whatnot)
[13:51:45 CET] <BBB> (or VLC; I dont care; just not ffplay)
[13:52:05 CET] <BBB> ffmpeg.c is pretty useful for a lot of things, also because every single fate test uses it...
[13:52:46 CET] <jamrial> was going to say, ffmpeg out of tree would be more of a pain in the ass than a net gain. running fate would become twice as hard, for starters
[13:52:52 CET] <jamrial> but that's a discussion for another day
[13:53:51 CET] <BBB> totally agree
[14:04:44 CET] <iive> so you want to get ffmpeg out of ffmpeg.
[14:07:24 CET] <wm4> rcombs: getting the format once you have output is trivial
[14:07:36 CET] <wm4> rcombs: but you still need to set an output format before decoding begins (or so)
[14:07:47 CET] <wm4> rcombs: the question is how to determine the best format for that
[14:08:00 CET] <rcombs> oh
[14:11:35 CET] <rcombs> looks like I just hardcoded kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange (i.e. NV12)
[14:16:12 CET] <wm4> the problem is that this format leads to horrible performance on older hw
[14:17:04 CET] <rcombs> oh, it does?
[14:17:23 CET] <rcombs> so older hardware prefers planar and new prefers semi-planar?
[14:17:37 CET] <wm4> older hw prefers packed
[14:18:47 CET] <nevcairiel> what kind of hardware are we talking about, mobiles or PCs?
[14:19:43 CET] <rcombs> what does "packed" mean in that context?
[14:20:11 CET] <wm4> maxminis
[14:20:14 CET] <wm4> *macminis
[14:20:38 CET] <wm4> rcombs: kCVPixelFormatType_422YpCbCr8
[14:20:40 CET] <nevcairiel> i find it rather curious that the same hardware that would run on a different OS suddenly has special requirements when it runs on osx =p
[14:21:08 CET] <wm4> yeah
[14:21:23 CET] <wm4> actually I don't know if the problem is in the decoder or renderer
[14:21:27 CET] <rcombs> wew 4:2:2
[14:21:28 CET] <wm4> I never had direct access to said hw
[14:21:53 CET] <wm4> it corresponds to AV_PIX_FMT_UYVY422
[14:24:45 CET] <rcombs> source on that being preferred over NV12? (as opposed to over yuv420p)
[14:26:24 CET] <rcombs> or is that experimentally determined?
[14:26:35 CET] <wm4> experimentally
[14:29:30 CET] <rcombs> yeah, I've got no idea
[14:29:35 CET] <rcombs> looks like most software just picks one
[14:29:47 CET] <rcombs> or uses packed 4:2:2 on iOS and NV12 on macOS
[14:29:54 CET] <rcombs> erm, swap those
[14:30:04 CET] <wm4> woah
[14:30:51 CET] <rcombs> I don't see any documentation on the topic
[14:31:47 CET] <wm4> documentation seems to be absent more often than not
[14:32:10 CET] <rcombs> yup
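For context on the format-selection question above: VideoToolbox takes the requested output pixel format through the destination image buffer attributes handed to the decompression session. Below is a minimal C sketch, assuming the usual CoreFoundation/CoreVideo headers; the function name and the NV12-vs-UYVY selection policy are illustrative assumptions, not what any particular player actually ships:

    #include <CoreFoundation/CoreFoundation.h>
    #include <CoreVideo/CoreVideo.h>
    #include <libavutil/pixfmt.h>

    /* Build the destinationImageBufferAttributes dictionary for
     * VTDecompressionSessionCreate(), choosing between NV12 and packed
     * UYVY as discussed above. Caller owns the returned dictionary. */
    static CFDictionaryRef make_buffer_attributes(enum AVPixelFormat pix_fmt)
    {
        OSType cv_fmt;

        if (pix_fmt == AV_PIX_FMT_UYVY422)   /* packed; reportedly faster on older Macs */
            cv_fmt = kCVPixelFormatType_422YpCbCr8;
        else                                  /* NV12, the value hardcoded today */
            cv_fmt = kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange;

        CFNumberRef num = CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt32Type, &cv_fmt);
        CFMutableDictionaryRef attrs =
            CFDictionaryCreateMutable(kCFAllocatorDefault, 1,
                                      &kCFTypeDictionaryKeyCallBacks,
                                      &kCFTypeDictionaryValueCallBacks);
        CFDictionarySetValue(attrs, kCVPixelBufferPixelFormatTypeKey, num);
        CFRelease(num);
        return attrs;
    }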
[15:40:43 CET] <Compn> https://en.wikipedia.org/wiki/Conflict_resolution
[15:43:53 CET] <jamrial> Compn: there is no conflict. there are a few people that missed the ship sailing, noticed it only when it was about to reach its destination, and now pretend it should instead go back home
[16:10:47 CET] <MustBeLucky2> is #ffmpeg usually dead this time of morning?
[16:16:32 CET] <DHE> it's very situational on what people have experience with. ALSA recording is probably less common knowledge than file-to-file work, for example
[16:24:26 CET] <MustBeLucky2> Cool, thanks for the heads up
[16:25:53 CET] <MustBeLucky2> maybe a dev can help me find ALSA_BUFFER_SIZE_MAX? I saw a post that claimed it was in libavdevice/alsa-audio.h but this file does not exist on my device. Is this normal?
[16:28:25 CET] <atomnuker> BBB: there's no need to vote, by the time it's close to 3.3 everything ffm related would be gone from ffserver and it will be removed
[16:28:48 CET] <atomnuker> it's just that the people who care need some time to finally sort it out
[16:28:56 CET] <nevcairiel> didnt we even vote before on the previous irc meeting or something
[16:30:23 CET] <jamrial> holy shit, nicolas can be annoying
[16:32:22 CET] <nevcairiel> it's his usual deal, have no base to stand on but hide behind fancy arguments that make everyone want to quit instead of continuing to argue
[16:35:31 CET] <BBB> if he wants to play a game, that's fine
[16:35:33 CET] <BBB> let's play the game
[16:35:42 CET] <BBB> but we want him to agree (eventually) with some outcome
[16:35:48 CET] <BBB> so that we can actually move on
[16:35:48 CET] <nevcairiel> he won't
[16:35:50 CET] <nevcairiel> he is like that
[16:36:04 CET] <nevcairiel> if we vote he'll quote some pseudo-legal reason why our vote was invalid
[16:36:05 CET] <nevcairiel> done it before
[16:36:16 CET] <BBB> pff...
[16:36:25 CET] <BBB> at the end of the vote, if the result is to remove it
[16:36:27 CET] <BBB> announce it
[16:36:34 CET] <BBB> and apply patch at same time
[16:36:41 CET] <BBB> can argue for days and years, but need to move
[16:36:42 CET] <nevcairiel> we did announce it, publicly in a news entry, months ago when we decided to remove it
[16:36:44 CET] <BBB> this is soooo ridiculous
[16:36:48 CET] <nevcairiel> there isn't even room for a discussion
[16:39:33 CET] <nevcairiel> jamrial summarized it pretty well in his mail, we conceded them some time to split it into a new repo before we remove it, now some of the work towards that goal is used as leverage to actually keep it entirely, which is just malice
[17:17:02 CET] <cone-287> ffmpeg 03Michael Niedermayer 07master:bda6f2937ea1: ffmpeg_opt: Fix starttime with ffm in bitexact mode
[17:31:54 CET] <BBB> lol @ ET for atari
[17:31:57 CET] <BBB> that was pretty sweet
[17:32:43 CET] <wm4> where is the patch that makes ffserver not require the ffm parts in libavformat to access the deprecated AVCodecContext?
[17:34:26 CET] <wm4> jamrial: I'm sorry that you have to "discuss" with this "person"
[17:39:44 CET] <BBB> adding a streaming server? What's next - adding a directory browsing API?
[17:39:50 CET] <BBB> oh crap we already did that *headbang*
[17:40:03 CET] <JEEB> :D
[17:40:34 CET] <JEEB> for a while I haven't been able to take that thread seriously
[17:40:47 CET] <JEEB> just for my own sanity
[18:02:17 CET] <durandal_1707> this community is very toxic, can I add another filter?
[18:02:46 CET] <nevcairiel> can you filter the community?
[18:11:05 CET] <BBB> -vf trolls=false
[18:17:56 CET] <Chloe[m]> they submitted a turing codec patch again, saying they fixed all the issues, yet it still doesn't build on macOS .-.
[18:18:37 CET] <Chloe[m]> I would try to fix it myself but I don't really know C++
[18:19:08 CET] <nevcairiel> building on all platforms isnt exactly a requirement, tbh
[18:23:41 CET] <Chloe[m]> sure, but I'm not the only person here who uses macOS, and it would be nice for testing
[19:15:02 CET] <roxlu> hey! I'm using ffmpeg in my C++ app. When I try to decode a .mp4 file (that was created using Quicktime on Mac), I get lots and lots of errors like: https://gist.github.com/roxlu/ae6e2e4fb06992da91bc8ac36c9ddd6f 
[19:17:08 CET] <Chloe[m]> roxlu: wrong channel, #ffmpeg
[19:17:54 CET] <roxlu> Chloe[m]: ah thanks
[19:18:22 CET] <Chloe[m]> turingcodec's code is hardly intuitive though... Maybe it's that I don't know C++ but... https://github.com/bbc/turingcodec/blob/stable/turing/Global.h#L461
[19:18:38 CET] <wm4> w-wat
[19:20:30 CET] <Chloe[m]> I don't understand how this code works on linux, let alone why it doesn't work on macOS
[19:20:35 CET] <BtbN> what even is that? Just another video codec?
[19:20:46 CET] <JEEB> yet another HEVC encoder
[19:20:50 CET] <JEEB> this time from BBC
[19:21:02 CET] <BtbN> looks horrible
[19:21:10 CET] <Chloe[m]> have you seen the API?
[19:21:11 CET] <Chloe[m]> lol.
[19:21:15 CET] <BtbN> yes
[19:24:40 CET] <wm4> so the cmdline API seems not so bad suddenly
[19:30:53 CET] <kierank> research code is complex, news at 11
[19:31:49 CET] <wm4> s/research/c++/?
[19:36:54 CET] <durandal_1707> c++ is obfuscation step
[19:50:25 CET] <durandal_1707> j-b: ping
[20:20:29 CET] <cone-287> ffmpeg 03Vittorio Giovara 07master:afb84857bf53: vf_colorspace: Forbid odd dimensions
[20:55:01 CET] <Chloe[m]> atomnuker: still on track for the patch to be pushed tomorrow, right? I guess the 'DECISION' or whatever (not sure what exactly it's meant to be) by nicolas will be ignored?
[20:56:09 CET] <atomnuker> it was an arbitrary decision by me so no
[20:59:50 CET] <durandal_1707> so ffserver is gonna be lost? it was useful I think, maybe still is, but the implementation is just bad
[21:00:08 CET] <JEEB> it's not lost
[21:00:12 CET] <JEEB> even if removed
[21:00:19 CET] <JEEB> the idea was to just separate it
[21:00:34 CET] <durandal_1707> to obscure place
[21:00:35 CET] <JEEB> it will be lost if nobody updates it to be less insane
[21:05:30 CET] <cone-287> ffmpeg 03Alex Converse 07master:8899057d914a: libvpxenc: Report encoded VP9 level
[21:08:52 CET] <JEEB> durandal_1707: also the main part is that I can't think of anything that ffmpeg.c (actually tested component!) wouldn't be able to handle together with one of the following: icecast, nginx-rtmp, vlc
[21:09:19 CET] <JEEB> rtsp, udp, HLS/DASH, basic http streaming
[21:21:11 CET] <iive> the reason to move it to a separate repo is to let it rot there.
[21:21:29 CET] <nevcairiel> in contrast to it rotting for years in the main repo, how is that different? =p
[21:22:12 CET] <nevcairiel> without the attempt to move it out, no one would have cared to look at it at all
[21:44:35 CET] <iive> nevcairiel: when it is in the main repo, there is still an obligation to keep the API compatible with it, or update it when the API is changed
[21:44:57 CET] <nevcairiel> that's the main reason it needs to go, it badly violates the API in several places
[21:45:05 CET] <nevcairiel> please do inform yourself on the topic before commenting
[21:45:08 CET] <iive> and that's the reason to request its removal
[21:45:12 CET] <nevcairiel> we have enough people commenting from the sidelines
[21:46:38 CET] <iive> well, we don't seem to disagree
[21:46:42 CET] <iive> why the hostility?
[21:50:23 CET] <wm4> iive: because we discussed this over and over months or weeks ago?
[21:57:18 CET] <iive> and it is still discussed
[21:57:51 CET] <iive> what really makes me sad is how eager people are to kill things, instead of fixing them.
[21:58:54 CET] <BBB> iive: not everyone has an interest in fixing everything and the kitchen sink
[21:59:17 CET] <BBB> iive: yet a lot of people like to continue to push the boundaries of whats possible with ffmpeg and libav{codec,filter,format,*}
[21:59:31 CET] <jamrial> iive: why didn't you answer to calls for help to fix and maintain ffserver during the past few years then?
[21:59:33 CET] <BBB> iive: ffserver sits in the way of that purpose, it doesnt help it
[22:01:52 CET] <lotharkript> I have a quick question: the example doc/examples/remuxing does not work for Y4M files. The problem is that the demuxer returns the codec RAWVIDEO, but the muxer takes only WRAPPED_AVFRAME. To fix this, I could re-add the RAWVIDEO codec to yuv4mpegenc. Any comment?
[22:02:16 CET] <lotharkript> doc/examples/remuxing /tmp/test.y4m /tmp/test1.y4m   <-- this fails
[22:05:14 CET] <iive> jamrial: (de)muxers and streaming are not within my competence.
[22:11:22 CET] <iive> BBB: remove the old to give way to the new. I've heard that before.
[22:12:15 CET] <wm4> iive: if you don't know about it and won't fix it and don't even use it, you don't get to have strong feelings on it
[22:12:27 CET] <BBB> lotharkript: did you check the APIChanges to see what the recommended way is to use muxers that want WRAPPED_AVFRAME input?
[22:12:40 CET] <BBB> lotharkript: you likely need to call them in some special way and remuxing hasnt been updated for that change yet
[22:12:51 CET] <wm4> iive: "it would be nice to keep it" is not enough reason that we torture ourselves for it
[22:15:18 CET] <iive> wm4: are you quoting me? because I don't think I've said that.
[22:15:25 CET] <lotharkript> BBB: I did not see anything in the API changes explaining what to do.
[22:15:41 CET] <iive> but if there are people who are fixing it and they want to keep it, then why not.
[22:17:10 CET] <BBB> lotharkript: I believe quite literally you create an AVPacket where pkt.data is set to an AVFrame *
[22:17:41 CET] <BBB> lotharkript: since yuv4mpegpipe.c does AVFrame *frame = (AVFrame *)pkt->data; (where pkt was declared as AVPacket *pkt)
[22:18:28 CET] <BBB> lotharkript: so in the case of AV_CODEC_ID_RAWVIDEO, you probably need to decode it using the rawvideo decoder, and then input the decoded AVFrame wrapped in an AVPacket into the muxer
[22:18:30 CET] <lotharkript> BBB: I agree, but the demuxer returns an AVPacket without an AVFrame. Does it mean you need to create an AVFrame from the AVPacket and then use it in the new AVPacket?
[22:19:19 CET] <BBB> lotharkript: yes exactly, I realize it sounds a little counterintuitive that you need to decode a non-compressed packet to do remuxing
[22:19:23 CET] <BBB> lotharkript: but that's indeed what it means
[22:19:58 CET] <BBB> lotharkript: hopefully that makes sense (not conceptually, but as in you knowing what to do to get it to work)
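To make the flow BBB describes above concrete, here is a rough, hedged sketch (not from the log; error handling and frame ownership are glossed over, and the function name is made up): decode the rawvideo packet into an AVFrame, then hand the frame to the muxer as a packet whose data pointer is the frame itself, which is what yuv4mpegpipe.c casts it back from. In practice the wrapped_avframe encoder in libavcodec does this wrapping for you, which is the more robust route.

    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>

    /* remux one rawvideo packet from the y4m demuxer into a
     * WRAPPED_AVFRAME muxer; raw_dec is an opened rawvideo decoder */
    static int remux_raw_packet(AVFormatContext *out, AVCodecContext *raw_dec,
                                AVStream *ost, AVPacket *in_pkt)
    {
        AVFrame *frame = av_frame_alloc();
        int got = 0;

        /* decode the uncompressed packet into an AVFrame */
        avcodec_decode_video2(raw_dec, frame, &got, in_pkt);
        if (!got)
            return 0;

        /* wrap the frame pointer in a packet, as the muxer expects */
        AVPacket pkt;
        av_init_packet(&pkt);
        pkt.data         = (uint8_t *)frame;   /* muxer casts this back to AVFrame* */
        pkt.size         = sizeof(*frame);
        pkt.stream_index = ost->index;
        pkt.pts = pkt.dts = in_pkt->pts;

        return av_interleaved_write_frame(out, &pkt);
        /* the frame must stay valid until the muxer has consumed the packet */
    }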
[22:20:05 CET] <lotharkript> BBB: Can we not just re-add the RAWVIDEO codec support to the muxer as a secondary codec?
[22:20:25 CET] <BBB> maybe... that's not up to me to decide
[22:21:00 CET] <BBB> you could certainly propose this on the mailinglist
[22:21:10 CET] <BBB> I'm not against it (I'm not really for it either, I don't know what's best)
[22:21:47 CET] <lotharkript> it feels counterintuitive to create a decoder in the muxer to get an AVFrame before calling write_frame
[22:30:08 CET] <BBB> I totally agree it's counterintuitive
[22:30:24 CET] <BBB> I wasnt intending to defend the way it is, just trying to help you get it working with the current way it works ;)
[22:30:41 CET] <BBB> btw, guys, didnt libav drop avserver?
[22:31:27 CET] <Chloe[m]> BBB: yes
[22:31:45 CET] <BBB> look at the counts when you search on google for avserver.exe (13.6k), avconv.exe (13.4k), ffmpeg.exe (212k) or ffserver.exe (9.6k)
[22:31:56 CET] <BBB> avserver.exe appears _more_ than ffserver.exe
[22:32:28 CET] <Chloe[m]> hmm, well I can't see it in the source tree. Is it in an external repo?
[22:32:34 CET] <BBB> and avserver/avconv.exe is actually > 1.0, whereas ffserver/ffmpeg.exe is a few % at best
[22:32:41 CET] <BBB> dunno
[22:32:49 CET] <BBB> isnt it interesting that it appears more often than avconv.exe?
[22:33:00 CET] <BBB> (and more than ffserver.exe)
[22:34:59 CET] <Chloe[m]> I assume it's because of how the timing worked out with debian switching to libav and avserver still being in libav. As in, people are on old stable debian (or something) using avserver because ffserver isn't in old stable (for example).
[22:35:25 CET] <durandal_1707> an avserver may be something else
[22:35:52 CET] <Chloe[m]> ^ there's this too. disclaimer: I have no idea
[22:37:15 CET] <nevcairiel> lotharkript: you totally want to avoid using rawvideo for muxing, it's terrible for performance, intuitive or not. Using wrapped avframe is easy
[22:38:14 CET] <nevcairiel> Examples j
[22:38:31 CET] <nevcairiel> Examples are just examples, they don't support everything
[22:38:47 CET] <BBB> durandal_1707: that may be, yes... but it's interesting how close the counts for avserver and avconv are, which makes me... not doubt... but... wonder about that
[22:39:39 CET] <BBB> nevcairiel: remuxing (w/o decoding) is a special case though, then there really is no reason to use avframe over rawvideo
[22:39:50 CET] <BBB> nevcairiel: and requiring avframe adds some slight code complexity
[22:39:58 CET] <BBB> again, not picking sides, but I can see his point that its sorta strange
[22:40:04 CET] <BBB> (for this particular one use case only)
[22:42:33 CET] <nevcairiel> Remuxing y4m to y4m is not a particularly useful function
[22:44:45 CET] <nevcairiel> And supporting multiple codecs requires more than just a flag in the mixer for raw
[22:44:53 CET] <nevcairiel> Mixer*
[22:45:00 CET] <nevcairiel> Damn autocorrect
[22:45:10 CET] <nevcairiel> I need to teach it some words
[22:48:53 CET] <cone-287> ffmpeg 03Michael Niedermayer 07master:319a7c5deae5: tests/ffserver-regression.sh: Fix file truncation introduced in 508826f961caf662cadb7c253e3c0e7d75104bdd
[22:48:54 CET] <cone-287> ffmpeg 03Michael Niedermayer 07master:c8b24a685ac5: ffserver: drop FeedData, its unused
[22:48:55 CET] <cone-287> ffmpeg 03Michael Niedermayer 07master:75b436d8b682: ffserver: Remove use of AVStream as a intermediate to store parameters
[22:48:56 CET] <cone-287> ffmpeg 03Michael Niedermayer 07master:da38da459598: ffserver: Remove some deprecated API use related to codec/codecpar
[22:48:57 CET] <cone-287> ffmpeg 03Michael Niedermayer 07master:0dbee6770039: ffserver: Remove last use of AVStream size
[22:51:39 CET] <jamrial> "fix" something introduced by a commit from freaking 2007
[23:12:37 CET] <michaelni> ffserver.exe is not found much because mingw* lacks the dependencies ffserver needs so it's not built (at least not on my mingw32 or 64)
[23:14:26 CET] <nevcairiel> it needs fork I think, which windows doesn't have
[23:15:37 CET] <nevcairiel> and some signal.h thing, apparently
[23:34:55 CET] <lotharkript> nevcairiel: I understand the wrapper is easier when you have a decoding/encoding path, but when you just demux a Y4M file and want to remux it, you now have to decode the AVPacket to get the AVFrame.
[23:37:05 CET] <nevcairiel> the av_image_* apis can just fill an AVFrame with the data from the rawvideo packet
[23:37:26 CET] <nevcairiel> the key point is, supporting another codec in the muxer would require a lot of ugly ifs everywhere
[23:39:48 CET] <lotharkript> agreed, but then the problem is handled at the demuxer level, instead of by the caller. Otherwise, the caller will need to check which demuxer it will use, check if the codec is the wrapped one, and call some av_image_* API to fill an AVFrame. So you are pushing the if statements to the caller of libavformat, instead of having the muxer deal with it.
[23:41:25 CET] <nevcairiel> rawvideo should really be discontinued from all muxers for unity
[23:43:27 CET] <lotharkript> why? the codec type is still raw video. Why not remove it from the demuxer then? Should the demuxer create a wrapped AVFrame when the codec is raw?
[23:45:08 CET] <nevcairiel> the problem is that rawvideo is inherently extremely huge data that needs shuffling around, so any extra movement should be avoided. demuxing into rawvideo is ok since that's basically just the file packet copied into memory, but the other way around it's different, and riddling every muxer with support for those two formats is just silly
[23:47:30 CET] <nevcairiel> as such it being different is just the price you pay for not having your memory bandwidth requirements being tripled
[23:54:25 CET] <nevcairiel> I wonder if you could make a bsf that would wrap rawvideo into an avframe, can a bsf change the codec id? would that be sane? probably not, right
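For completeness, a hedged sketch of the av_image_* route nevcairiel mentions above: pointing an AVFrame at the rawvideo packet's bytes without copying, so the caller can then wrap the frame as in the earlier sketch. The function name and the caller-supplied format/width/height are illustrative assumptions:

    #include <libavcodec/avcodec.h>
    #include <libavutil/frame.h>
    #include <libavutil/imgutils.h>

    /* point frame->data[]/linesize[] at the raw packet payload; no pixel
     * data is copied, the frame only references pkt->data */
    static int frame_from_raw_packet(AVFrame *frame, const AVPacket *pkt,
                                     enum AVPixelFormat fmt, int w, int h)
    {
        frame->format = fmt;
        frame->width  = w;
        frame->height = h;
        frame->pts    = pkt->pts;

        return av_image_fill_arrays(frame->data, frame->linesize,
                                    pkt->data, fmt, w, h, 1);
    }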
[00:00:00 CET] --- Tue Nov 29 2016


More information about the Ffmpeg-devel-irc mailing list