[Ffmpeg-devel-irc] ffmpeg-devel.log.20180218

burek burek021 at gmail.com
Mon Feb 19 03:05:03 EET 2018


[01:36:12 CET] <Chloe> tmm1: you said there was something you were paying for? What was it
[03:12:31 CET] <atomnuker> one more reason to abandon ipv6: no isp I've seen filters any ipv6 websites
[03:12:54 CET] <atomnuker> erm abandon ipv4
[03:15:27 CET] <JEEB> interesting, kotlin/native is being marketed with a FFmpeg wrapper
[03:15:40 CET] <JEEB> I guess it's a good way of saying "we have working FFmpeg interop"?
[03:15:49 CET] <JEEB> s/FFmpeg interop/C interop/
[07:49:11 CET] <cousin_luigi> Greetings.
[10:30:02 CET] <Guest2640> What does vf_mix.c do? When I am giving 2 input images I am getting a blurred output image. What's happening at pixel level?
[10:32:53 CET] <nevcairiel> it literally mixes the images, takes the value of both inputs and adds them
[10:35:02 CET] <Guest2640> Is it adding both images pixel wise?
[10:35:42 CET] <nevcairiel> If you're in this channel, you should be able to read the code to find out :)
[10:37:55 CET] <Guest2640> sure :')
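Roughly, that mixing amounts to a per-pixel weighted average. A minimal illustrative C sketch (not the actual vf_mix.c code), assuming two equal-weight 8-bit gray inputs, which is why overlaying two different images looks like a blended/blurred result:

    #include <stdint.h>

    /* Illustrative only: with two inputs and equal weights, each output
     * pixel is the rounded average of the corresponding input pixels. */
    static void mix_two_planes(uint8_t *dst, const uint8_t *a, const uint8_t *b,
                               int w, int h, int stride)
    {
        for (int y = 0; y < h; y++)
            for (int x = 0; x < w; x++)
                dst[y * stride + x] =
                    (a[y * stride + x] + b[y * stride + x] + 1) >> 1;
    }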
[12:30:43 CET] <durandal_1707> omerjerk: so is als encoder finished?
[13:11:37 CET] <cone-192> ffmpeg 03Michael Niedermayer 07master:de675648cef7: avcodec/vp8: Check for bitstream end before vp7_fade_frame()
[13:11:37 CET] <cone-192> ffmpeg 03Michael Niedermayer 07master:01370b31aced: avcodec/jpeg2000dec: Use av_image_check_size2()
[13:11:38 CET] <cone-192> ffmpeg 03Michael Niedermayer 07master:1be49cee34eb: avcodec/h264: Increase field_poc to 64bit in ff_h264_init_poc() to detect overflows
[13:11:39 CET] <cone-192> ffmpeg 03Calvin Walton 07master:d2fc244293b5: libavfilter/vf_fps: Add more fate tests
[13:15:37 CET] <cousin_luigi> Is there an ETA for 3.5 ?
[13:19:40 CET] <jkqxz> Never.
[13:21:36 CET] <cousin_luigi> jkqxz: Next stable then?
[13:21:52 CET] <cousin_luigi> As in next major release.
[13:24:58 CET] <jkqxz> 4.0?  Dunno.  I think people would like the component registration/listing method sorted out before it happens.
[14:02:53 CET] <cousin_luigi> jkqxz: Where can I find a list of the blocking bugs? Can I filter that out on trac?
[14:06:34 CET] <jkqxz> It's not that organised, I'm afraid.
[14:08:10 CET] <JEEB> what was there left with the component registration stuff?
[14:08:24 CET] <JEEB> the thing that ffmpeg.c doesn't utilize the new stuff yet?
[14:13:33 CET] <wm4> most devs don't even use trac
[14:13:39 CET] <JEEB> yea
[14:14:46 CET] <JEEB> we just need to decide what, if anything, out of the new stuff we need to get in, and then we branch out and keep a list of crap that has to get fixed before tagging vNEXT
[14:41:00 CET] <cousin_luigi> Then I guess the answer to my question is: "not imminent".
[14:41:33 CET] <JEEB> so far it's just been when someone (f.ex. michael) decides that a release is done
[14:43:11 CET] <cousin_luigi> k, thanks
[17:23:02 CET] <Nakul> Hi,
[17:23:25 CET] <Nakul> Can anyone help me getting started into gsoc
[17:24:05 CET] <JEEB> 1) get the latest master 2) compile it 3) look into what you're interested 4) ??? 5) profit!
[17:34:02 CET] <Nakul> I have already visited that site; can you suggest anything further?
[17:34:44 CET] <jkqxz> Nakul:  Why do you want to work on ffmpeg?  Do you use it yourself?  If so, for what?
[17:36:20 CET] <Nakul> I used it just two days ago, to convert AVI videos to MP4
[17:36:53 CET] <Nakul> I want to because I am interested
[17:37:08 CET] <JEEB> well go through the steps I noted first :P
[17:37:19 CET] <JEEB> get the code, compile, start looking into what you want to work on
[17:39:46 CET] <Nakul> I want to work on :MPEG-4 ALS codec improvements
[17:42:07 CET] <durandal_1707> really?
[17:42:47 CET] <Nakul> Yup
[17:47:30 CET] <durandal_1707> Nakul: have you contacted mentors of that project?
[17:47:48 CET] <durandal_1707> nothing is instant here
[17:49:39 CET] <Nakul> I am trying to get a reply by mail, but I still haven't had any conversations with them
[17:51:33 CET] <durandal_1707> Nakul: iirc you had contacted other people too, have you read their mails?
[17:52:17 CET] <durandal_1707> one of als mentors is omerjerk iirc
[17:53:34 CET] <Nakul> I have mailed to @omerjerk . waiting for reply
[17:55:39 CET] <Nakul> No, I haven't read anyone's mail.
[17:57:38 CET] <durandal_1707> checked spam?
[18:05:39 CET] <philipl> jkqxz: going back to the mjpeg hwaccel stuff: what's the reason for storing and comparing the pix fmts? Is that because the pix fmt could change from frame to frame?
[19:01:09 CET] <Chloe> i hate email
[19:01:16 CET] <Chloe> managed to send the wrong patches again
[19:23:15 CET] <RiCON> you can "git log HEAD~N" to check which patches will get sent
[19:23:56 CET] <Chloe> well it was more that I made a one-line change and then didn't commit it before I sent
[19:24:30 CET] <RiCON> or "git format-patch" and send the patches as attachments; I think that also works with patchwork?
[19:31:34 CET] <Chloe> jkqxz: 'Video filtering with OpenCL' looks interesting, I've been wanting to learn about GPU programming for a while
[19:32:23 CET] <atomnuker> you don't like vulkan?
[19:32:54 CET] <Chloe> atomnuker: well, I dont know if I'd be able to run vulkan stuff
[19:33:22 CET] <Chloe> atomnuker: I also have no 'godlike familiarity' with it
[19:34:41 CET] <durandal_1707> atomnuker should be gsoc student and mentor at same time
[19:35:55 CET] <atomnuker> can't, graduated 2 years ago now, besides I've got a job
[19:37:25 CET] <jkqxz> Chloe:  Are you eligible?
[19:37:35 CET] <Chloe> atomnuker: I mean I guess I'd prefer vulkan over opencl but your prerequisites are rather outside of my current skillset. I could get familiar with vulkan but hardly 'godlike'
[19:37:38 CET] <Chloe> jkqxz: I should be
[19:38:04 CET] <Chloe> I'm 18 and in education, and will be until after GSoC's application period
[19:38:50 CET] <durandal_1707> you should register or something...
[19:39:11 CET] <Chloe> you cant register yet, but I'm considering doing it
[19:39:25 CET] <jkqxz> philipl:  I'm not quite sure what you're referring to.  Storing and comparing where?
[19:40:28 CET] <nevcairiel> "education" is rather vague, they require an accredited university
[19:40:50 CET] <Chloe> ah, that's a shame then. 
[19:41:27 CET] <nevcairiel> for people still in school, there is another one, i forgot its name
[19:42:07 CET] <jkqxz> Are you going to university next year?  Being accepted to go to one next year is sufficient.
[19:42:16 CET] <Chloe> nevcairiel: FFmpeg wasn't in that last year
[19:42:28 CET] <Chloe> jkqxz: no
[19:44:23 CET] <durandal_1707> why?
[19:50:55 CET] <jkqxz> It's unclear to me that GSoC is necessarily a good idea when you live in western europe.  An in-person internship with a technology company is likely to be more generally useful.
[19:51:09 CET] <jkqxz> I don't know about other places.
[19:54:07 CET] <durandal_1707> and paid?
[19:54:54 CET] <Chloe> durandal_1707: its too expensive and I dont have the entry grades
[19:55:15 CET] <durandal_1707> shame
[19:58:40 CET] <durandal_1707> Compn: see, working on FFmpeg does not get you a job
[20:01:20 CET] <Chloe> Compn: what jobs do you get from working on FFmpeg
[20:08:13 CET] <BenLubar> Is there documentation on what I need to do to write a demuxer?
[20:08:25 CET] <Chloe> BenLubar: look at the other demuxers pretty much
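For orientation, most libavformat demuxers share the same shape. A minimal sketch follows; the cmv_* names and the fixed packet size are hypothetical, only AVInputFormat and helpers such as avformat_new_stream() and av_get_packet() are the real API:

    #include "libavutil/internal.h"
    #include "avformat.h"
    #include "internal.h"

    static int cmv_probe(AVProbeData *p)
    {
        /* Illustrative: check a magic signature in p->buf and return
         * AVPROBE_SCORE_MAX on a match. */
        return 0;
    }

    static int cmv_read_header(AVFormatContext *s)
    {
        AVStream *st = avformat_new_stream(s, NULL);
        if (!st)
            return AVERROR(ENOMEM);
        st->codecpar->codec_type = AVMEDIA_TYPE_VIDEO;
        /* parse the container header from s->pb (avio_rl32() etc.)
         * and fill in codec_id, dimensions, time base, ... */
        return 0;
    }

    static int cmv_read_packet(AVFormatContext *s, AVPacket *pkt)
    {
        /* read one frame's worth of data into pkt; 1024 is a
         * placeholder for the real per-frame size */
        return av_get_packet(s->pb, pkt, 1024);
    }

    AVInputFormat ff_cmv_demuxer = {
        .name        = "cmv",
        .long_name   = NULL_IF_CONFIG_SMALL("Dwarf Fortress CMV"),
        .read_probe  = cmv_probe,
        .read_header = cmv_read_header,
        .read_packet = cmv_read_packet,
    };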
[20:08:47 CET] <atomnuker> Chloe: propose what filter you'd like to add and I'll make a project for you
[20:09:00 CET] <Chloe> atomnuker: I'm not eligible it seems
[20:09:28 CET] <atomnuker> jkqxz: yeah, no, good luck getting hired, I went through hell in 2014 trying to do that
[20:10:06 CET] <atomnuker> stuck up companies like arm want you to apply by february
[20:10:08 CET] <nevcairiel> gsoc is definitely more tailored for US culture
[20:10:21 CET] <nevcairiel> EU universities dont have the long summer break its meant to take place in
[20:10:54 CET] <atomnuker> I had from May to September here in the uk, not sure how things are elsewhere
[20:11:22 CET] <BenLubar> Basically I want to make a demuxer for CMV: http://dwarffortresswiki.org/index.php/User:Jifodus/CMV_file_format The video is pretty simple - I already have code written to convert the data to rgb24 - but the audio is weird. The format lets you play 8 different ogg/vorbis files with a maximum of 16 started per frame, so there's not a simple way to convert that to a stream.
[20:11:58 CET] <nevcairiel> there were barely 2 weeks entirely free for me in summers, after classes was exam season in the "break", and then new classes started
[20:12:21 CET] <jkqxz> atomnuker:  Uhh, yeah.  That's basically how it works; not knowing that is unfortunate, but you should have started earlier.
[20:12:36 CET] <atomnuker> I did
[20:12:40 CET] <atomnuker> in January
[20:12:40 CET] <Chloe> BenLubar: why not just output several different streams
[20:12:46 CET] <atomnuker> I got 0 replies
[20:12:59 CET] <BenLubar> Chloe: wouldn't that end up as multiple tracks in the output?
[20:13:07 CET] <Chloe> yeah what's wrong with that?
[20:13:38 CET] <Chloe> you can then use amerge to merge the streams
[20:13:43 CET] <BenLubar> I have some code here to generate a filter to combine the audio tracks, but I'm pretty sure it can't be done that way from a demuxer: https://github.com/BenLubar/cmv2mp4/blob/741a2c288ca7c2fe5315f70a9b02ec827cb09776/worker.js#L559-L601
[20:13:45 CET] <atomnuker> I applied again for arm in may, and after a few days they said to come over for an interview, one day before my math exam, I did, interview went crap, the 2 people there were slacking timewasters
[20:14:44 CET] <BenLubar> if a demuxer can create a filter graph, that would make it easy
[20:15:18 CET] <Chloe> I think some do, though others might know more
[20:20:07 CET] <Chloe> jkqxz: any recommendations for companies to look at?
[20:22:21 CET] <jkqxz> My experience of this from the other side is that lots of people apply in November/December/January time.  Some amount of immediate HR-type filtering has to happen because there is high volume and many are just entirely unsuitable, but people who do know what's going on look at it after that and then interviews happen pretty quickly and places are filled by around now.
[20:23:10 CET] <philipl> jkqxz: in your mjpegdec changes, you store the hwaccel_sw_pix_fmt and hwaccel_pix_fmt in the private context and then compare against the pix_fmt on each decode.
[20:24:06 CET] <jkqxz> Chloe:  Where are you and what do you want to do?
[20:25:10 CET] <durandal_1707> BenLubar: a demuxer's job is not to merge separate streams
[20:25:51 CET] <jkqxz> philipl:  You need to avoid calling get_format on every frame.
[20:27:15 CET] <Chloe> also nice i broke threading by const-ifying lavfi
[20:28:01 CET] <durandal_1707> huh
[20:29:00 CET] <Chloe> jkqxz: I'm in the UK at the moment, I'm not entirely sure what I want to do. I like multimedia in the sense that codecs interest me. Something with 'lower-level' programming I guess
[20:29:14 CET] <Chloe> durandal_1707: I'm sure I havent, but that's where the errors are showing up at the moment
[20:29:44 CET] <philipl> jkqxz: right. In a conventional codec, you'd do get_format at init time and not touch it again. But here, you could imagine doing it on the first frame and then never doing it again. Your logic will run it again if the format changes, which it could, of course. In practice, would a real-world mjpeg stream have frames with different formats?
[20:30:04 CET] <philipl> I'm just trying to understand the rationale.
[20:30:20 CET] <jkqxz> philipl:  Er, all codecs have that handling?
[20:30:39 CET] <jkqxz> And certainly making an MJPEG by concatenating JPEGs together could give you anything.
[20:30:57 CET] <philipl> jkqxz: vc1dec only calls get_format once at init time.
[20:31:03 CET] <philipl> just as an example
[20:31:10 CET] <philipl> Anyway, not trying to make a big deal out of it.
[20:31:16 CET] <jkqxz> ... which makes me realise it needs to check the dimensions as well.
[20:32:14 CET] <jkqxz> VC-1 requires accurate extradata so it can do that.
[20:32:27 CET] <philipl> jkqxz: and the hack to remaps yuvj to yuv. Not sure what that's needed for as I can't make anything fail if I remove it.
[20:33:08 CET] <jkqxz> Format allocation based on sw_format.
[20:33:18 CET] <jkqxz> *Frame allocation
[20:41:44 CET] <BenLubar> durandal_1707: what would a demuxer for a container format that plays audio files at specific frame numbers output, then?
[20:43:00 CET] <jkqxz> Chloe:  I don't really know.  Without going to university you are going to have a very hard time with any company which has a formalised recruitment process because you are missing a very big standard tickbox.
[20:44:12 CET] <durandal_1707> BenLubar: demuxers just split packets into sensible frames
[20:44:24 CET] <Chloe> jkqxz: I figured I'd have to just do an apprenticeshit, or find a shit job until I can go to uni as a mature student. 
[20:45:26 CET] <durandal_1707> Chloe: ask kierank to recruit you
[20:45:51 CET] <Chloe> didnt kierank go back to uni after being in work for a while just to check the box?
[20:47:47 CET] <BenLubar> durandal_1707: would I write a demuxer and two decoders, then? How would I make the audio decoder?
[20:47:48 CET] <atomnuker> no, he did x264 stuff while in uni afaik
[20:49:17 CET] <durandal_1707> BenLubar: decoders are written only if they are not already available in the codebase
[20:50:28 CET] <durandal_1707> BenLubar: do you know the specification of the audio in those files?
[20:51:25 CET] <BenLubar> durandal_1707: in this format, the audio is: an int32 count, that many names encoded as null-terminated char[50], and then an int32[200][16] where the first index is the frame number and the 16 elements are indexes into the names array of sounds to start playing on that frame, with -1 meaning no sound
[20:51:45 CET] <BenLubar> the audio files are ogg vorbis files
[20:52:15 CET] <BenLubar> but I'm pretty sure ffmpeg doesn't have an API for "play this file" as part of an audio stream because that doesn't really make any sense
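A sketch of how that audio index could be parsed, using the field sizes described above; plain stdio, host byte order, and the 256-name cap are simplifying assumptions for illustration:

    #include <stdint.h>
    #include <stdio.h>

    /* Layout as described above: a count, then count null-terminated
     * names padded to 50 bytes, then for each of 200 frames up to 16
     * indexes into the names array (-1 = no sound on that slot). */
    typedef struct CMVSoundIndex {
        int32_t nsounds;
        char    names[256][50];
        int32_t timing[200][16];
    } CMVSoundIndex;

    static int read_sound_index(FILE *f, CMVSoundIndex *idx)
    {
        if (fread(&idx->nsounds, sizeof(idx->nsounds), 1, f) != 1 ||
            idx->nsounds < 0 || idx->nsounds > 256)
            return -1;
        for (int i = 0; i < idx->nsounds; i++)
            if (fread(idx->names[i], 50, 1, f) != 1)
                return -1;
        if (fread(idx->timing, sizeof(idx->timing), 1, f) != 1)
            return -1;
        return 0;
    }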
[20:52:36 CET] <durandal_1707> huh, is that some kind of sequencer?
[20:53:00 CET] <BenLubar> it's "video" from Dwarf Fortress
[20:53:12 CET] <nevcairiel> its like a playlist, not really a muxed container format
[20:53:22 CET] <durandal_1707> or those are just streams with vorbis data?
[20:53:42 CET] <durandal_1707> the wiki mentions zlib stuff
[20:53:42 CET] <Chloe> you could treat it as a sequencer and just merge all the streams
[20:53:55 CET] <durandal_1707> nooo!
[20:54:44 CET] <BenLubar> the zlib part is for the video data
[20:55:09 CET] <BenLubar> it's basically 15 bit pixels that represent a CP437 character with a foreground and background color
[20:55:32 CET] <BenLubar> I'm confident I can write a demuxer and a video decoder for this
[20:55:40 CET] <durandal_1707> yea
[20:58:15 CET] <durandal_1707> i doubt it stores raw pixels, it stores chars with colors
[20:59:03 CET] <BenLubar> yeah it's not pixels that you'd see on a screen but it's 15 bits per tile
[20:59:33 CET] <durandal_1707> would probably need font to write decoder
[21:00:03 CET] <Chloe> you'd need to take an input tileset
[21:00:07 CET] <BenLubar> yeah, this is the one I've been using: https://benlubar.github.io/cmv2mp4/curses_800x600.png
[21:00:17 CET] <Chloe> cause you can change the DF tileset too
[21:01:33 CET] <BenLubar> how would that work, the video decoder would need an input file separate from the stream from the demuxer
[21:01:46 CET] <BenLubar> would it be a command line argument that the decoder adds?
[21:02:24 CET] <durandal_1707> BenLubar: decoder takes frames given by demuxer
[21:03:00 CET] <durandal_1707> so if they are zlib-compressed, they are decompressed by the decoder and rendered into a frame
[21:03:04 CET] <BenLubar> yeah, the video frames would just be columns+rows+byte[2][columns][rows]
[21:03:36 CET] <BenLubar> or I guess columns+rows+the remainder of the cmv file after the header
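A rough sketch of what rendering one such tile could look like inside a decoder; the 16x16-glyph RGB24 tileset layout and the "non-black pixel means glyph pixel" test are assumptions for illustration only:

    #include <stdint.h>

    /* Copy one CP437 glyph cell from a tileset image (16 glyphs per
     * row, RGB24) into an RGB24 destination, painting glyph pixels
     * with the foreground colour and the rest with the background. */
    static void draw_cell(uint8_t *dst, int dst_stride,
                          const uint8_t *tiles, int tiles_stride,
                          int tile_w, int tile_h, uint8_t ch,
                          const uint8_t fg[3], const uint8_t bg[3])
    {
        int tx = (ch % 16) * tile_w;
        int ty = (ch / 16) * tile_h;

        for (int y = 0; y < tile_h; y++) {
            for (int x = 0; x < tile_w; x++) {
                const uint8_t *src = tiles + (ty + y) * tiles_stride + (tx + x) * 3;
                uint8_t *out = dst + y * dst_stride + x * 3;
                /* crude "is this a glyph pixel" test; a real decoder
                 * would honour the tileset's transparency convention */
                const uint8_t *c = src[0] ? fg : bg;
                out[0] = c[0];
                out[1] = c[1];
                out[2] = c[2];
            }
        }
    }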
[21:09:06 CET] <philipl> jkqxz: So, if I'm reading this right, the reason why the yuvj remapping isn't an issue for nvdec is that the sw_format selection doesn't look at what the original pix_fmt is. Specifically in this case, the sw format is always NV12 for any 8bit content.
[21:37:45 CET] <durandal_1707> atomnuker: so you will teach students vulkan stuff?
[21:38:02 CET] <atomnuker> yeah
[21:38:16 CET] <atomnuker> though really its mostly just shaders
[21:38:37 CET] <Chloe> atomnuker: if no one takes your project hmu
[21:44:43 CET] <durandal_1707> hmu?
[21:45:10 CET] <RiCON> hit me up
[21:45:47 CET] <Chloe> as in, Id do it outside of gsoc if no one in gsoc wants it
[22:02:57 CET] <jamrial> ubitux: ping
[23:28:23 CET] <cone-784> ffmpeg 03Michael Niedermayer 07master:f82dd4c09b2d: avcodec/hevcdec: Check luma/chroma_log2_weight_denom
[23:28:23 CET] <cone-784> ffmpeg 03Michael Niedermayer 07master:647fa49495c3: avcodec/dirac_dwt_template: Fix Integer overflow in horizontal_compose_dd137i()
[23:54:17 CET] <philipl> BtbN: I think nvenc doesn't handle full colour range correctly. The range is currently being set from the avctx value, but I think this is not initialised meaningfully. Rather, the AVFrame carries the range.
[23:55:24 CET] <nevcairiel> technically for encoder init, the avctx range should be used; you don't have any frames at that time
[23:55:35 CET] <nevcairiel> if it changes mid-stream, you've only got yourself to blame for doing such things
[23:56:09 CET] <philipl> Except the avctx range isn't set.
[23:56:14 CET] <philipl> Who's supposed to set it?
[23:56:41 CET] <nevcairiel> whoever wants to encode things
[23:56:42 CET] <nevcairiel> ie. the user
[23:56:51 CET] <philipl> So if I'm using 'ffmpeg'...
[23:57:02 CET] <philipl> I'm talking transcode here.
[23:57:11 CET] <philipl> The decode side sets the flag on itself correctly
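For an application driving the encoder API directly, the range would be set on the context before opening it. A minimal sketch, assuming nvenc reads avctx->color_range at init as described above (whether ffmpeg.c propagates it from the decoder is exactly the open question here):

    #include <libavcodec/avcodec.h>

    /* Sketch: tell the encoder the input frames are full ("JPEG")
     * range before opening it. */
    static int open_full_range_encoder(AVCodecContext *enc, const AVCodec *codec)
    {
        enc->color_range = AVCOL_RANGE_JPEG;  /* full range; AVCOL_RANGE_MPEG = limited */
        return avcodec_open2(enc, codec, NULL);
    }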
[00:00:00 CET] --- Mon Feb 19 2018

