[Ffmpeg-devel-irc] ffmpeg-devel.log.20130304

burek burek021 at gmail.com
Tue Mar 5 02:05:03 CET 2013


[00:27] <cone-334> ffmpeg.git 03Michael Niedermayer 07release/1.0:044177215e82: update for 1.0.5
[00:41] <cone-334> ffmpeg.git 03Michael Niedermayer 07release/1.0:164628e2f7e4: rtmpproto: increase APP_MAX_LENGTH
[00:43] <ubitux> michaelni: do you mind indicating the hash when closing a ticket?
[00:43] <ubitux> not only does it give a hint to developers, but it also makes sure the user understands the problem is fixed
[00:44] <ubitux> the user might think we are closing tickets randomly because of inactivity or something (just like ubuntu does)
[00:44] <ubitux> giving him hints about where exactly it's fixed is helpful, imo
[00:50] <BBB> yay finally that works (h264 without dsputil and mpegvideo)
[00:50] <BBB> now let's try vp8 and some other things inclusive
[00:54] <BBB> ah, that required a bit more work
[00:54] <BBB> ok, works too now
[00:54] <BBB> this is nice
[00:54] <BBB> now to get theora working
[01:01] <BBB> which I guess means re-diving into hpeldsp
[01:01] <BBB> blegh
[01:01] <BBB> I hated that patchset
[01:10] <iive> so when you talk about dsputil, you do mean dsputil.c not the dsputil as dsp utilities.
[01:22] <BBB> iive: what is the difference according to you?
[01:24] <BBB> iive: look - dsputil is the set of functions assigned to function pointers in DSPContext; dsp utilities are things that do runtime cpu selection of simd functions, such as dsputil - dsputil is a dsp utility, but not every dsp utility is a dsputil
[01:24] <BBB> iive: most codecs (such as h264, vp8, etc) now each have their own, so no reason to use dsputil anymore as a dumping ground for all new runtime cpu detect functions
[01:25] <BBB> unless your name is dirac, apparently
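The per-codec pattern BBB is describing could be sketched roughly like this in C. Names here (MyCodecDSPContext, have_simd, the idct/loop_filter functions) are illustrative, not real FFmpeg API: each codec fills its own small table of function pointers once at init time, based on runtime CPU detection, instead of sharing the global DSPContext dumping ground.

```c
#include <stdint.h>

/* Hypothetical per-codec dsp context: only the functions this codec
 * actually uses, selected once at init. */
typedef struct MyCodecDSPContext {
    void (*idct)(int16_t *block);
    void (*loop_filter)(uint8_t *dst, int stride);
} MyCodecDSPContext;

static void idct_c(int16_t *block)                    { /* plain C fallback */ }
static void idct_simd(int16_t *block)                 { /* SIMD version     */ }
static void loop_filter_c(uint8_t *dst, int stride)   { /* plain C fallback */ }

static void my_codec_dsp_init(MyCodecDSPContext *dsp, int have_simd)
{
    /* default to the C implementations ... */
    dsp->idct        = idct_c;
    dsp->loop_filter = loop_filter_c;
    /* ... then override with SIMD where the CPU supports it */
    if (have_simd)
        dsp->idct = idct_simd;
}
```

The cpu-detection duplication iive worries about stays small this way: the detection itself is shared, only the per-codec pointer tables are separate.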
[01:26] <Compn> i think what we're asking is, aren't most of the functions in dsputil meant to be reused? so that one can optimize 10 codecs at once? :p
[01:27] <iive> rv34 reused some of the h264 MC ones, afair. each having its own dsp util struct means it would duplicate cpu detection code over and over again.
[01:27] <BBB> I'm sure that was the theoretical intent
[01:27] <BBB> but that's not how it works
[01:27] <BBB> most codecs are strictly incompatible
[01:28] <BBB> e.g. mpeg4 and h264 both use quarterpel mc
[01:28] <BBB> but one uses long edges, the other uses edge mirroring
[01:28] <BBB> if you don't know the difference, look it up, you'll suddenly burst out in laughter
[01:28] <BBB> then vp3 and mpeg both use halfpel mc
[01:29] <BBB> but guess what, they use a different implementation on the position where both x and y are in the halfpel position
[01:29] <BBB> so really in general, you can share only very little between codecs
[01:29] <BBB> idcts are always unique, esp. nowadays in modern bitexact codec days
[01:29] <BBB> loopfilter is so specific that yes they are unique per codec
[01:29] <saste> michaelni, +1 on git hash commit
[01:29] <BBB> so no, there's not much to be shared
[01:29] <BBB> you can share emulated_edge_mc
[01:29] <saste> btw trac is supposed to be *integrated* with git
[01:30] <BBB> that's pretty generic
[01:30] <saste> like you push a commit mentioning a ticket and the ticket is closed
[01:30] <iive> BBB: this is faulty logic.
[01:30] <BBB> I'm listening
[01:31] <saste> ubitux, what do you want me to do with volume?
[01:31] <iive> because there are codecs that use different implementations, there is nothing to be shared.
[01:31] <ubitux> saste: finish your eval patch :)
[01:31] <saste> ubitux, several approaches were considered
[01:31] <BBB> iive: I pointed out codecs that are as similar as possible
[01:31] <BBB> iive: learn your codecs
[01:31] <iive> especially the old codecs shared a lot. h264 was a paradigm shift.
[01:31] <BBB> iive: e.g. if you ask "can h264 and vp8 mc be shared", no you can't
[01:32] <BBB> they aren't even remotely similar
[01:32] <BBB> old codecs share a lot and still use dsputil
[01:32] <BBB> I'm not ever going to touch them
[01:32] <BBB> after all, chrome doesn't use them anyway
[01:32] <BBB> so I don't really care
[01:32] <iive> you know there are more than vp and h264 codecs out there...
[01:32] <BBB> sure, I just don't use them :)
[01:32] <BBB> I'm well aware of how they work fwiw
[01:33] <michaelni> saste, about integration, last i looked (a year ago or so) it required the git repo and trac to run on the same server or something like that, but i might misremember
[01:33] <BBB> iive: and more importantly, I don't want chrome to carry all that binary cruft of stuff it doesn't use and doesn't need
[01:34] <BBB> iive: chrome should be small and simple, and since it doesn't carry any "old" codec, that means that it shouldn't carry dsputil
[01:34] <BBB> thus my goal: get h264, vp3 and vp8 dsputil-free
[01:34] <BBB> (and some audio codecs)
[01:34] <BBB> I believe the audio codecs are done, and of the video ones, only vp3 is remaining
[01:37] <iive> So you just want to remove the functions that are not used by the codecs you select. makes sense.
[01:39] <BBB> that's exactly what "get rid of dsputil" means
[01:39] <BBB> since dsputil is a dumping ground of all code for all codecs
[01:39] <BBB> with virtually nothing selectable
[01:41] <ubitux> http://bits.blogs.nytimes.com/2013/02/27/scientists-uncover-invisible-motion-in-video/  for anyone bored wanting to write a filter
[01:41] <iive> ubitux: detecting breathing and pulse?
[01:42] <ubitux> detecting dead babies too
[01:42] <ubitux> it's just amplifying motion
[01:45] <saste> ubitux, someone bored... you?
[01:45] <ubitux> i'm not
[01:45] <ubitux> i'm boring, but i'm not bored
[01:46] <ubitux> it seems an army of ebur128 users recently woke up, so i've some stuff to do (and real-life stuff too)
[01:54] <iive> BBB: there are about 200 video codecs, do you think that each one of them should have its own dsputil context like h264 does?
[01:55] <ubitux> saste: do we have a native filter to adjust brightness/contrast?
[01:55] <saste> ubitux, mp=eq2
[01:56] <ubitux> yeah right, only this, not native
[01:56] <saste> it has some assembly, that's why nobody (= me or durandal) has ported it yet
[01:56] <saste> mp end is approaching
[01:56] <ubitux> yeah
[01:57] <ubitux> i'm feeling like porting it
[01:57] <ubitux> i'll try to motivate Paul first though
[01:57] <ubitux> it might work too
[01:58] <ubitux> saste: the only problem i have with eq2 is that, just like volume, it doesn't interface well with eval stuff
[01:58] <ubitux> (because of optimizations)
[01:58] <ubitux> having a av_eval_is_const() would help this kind of situation
[01:59] <ubitux> otherwise we need to re-init the filter every time
[01:59] <ubitux> or use different options for const and eval expr
[01:59] <saste> yes this is a common issue
[02:00] <saste> (same for overlay with dynamic x/y)
[02:01] <ubitux> no one wants to write av_eval_is_const()? :-°
[02:02] <Compn> BBB : yeah, i thought there would be more code shared, like h263 and mpeg were , back in the day. but i guess not with h264 and vp8 and newer codecs
[02:02] <Compn> thx for details
[02:06] <ubitux> saste: btw, don't you think we should have a "real" histogram filter?
[02:06] <iive> well mpeg1/2/4 share a lot of code, that's why mpegvideo is called mpeg :) As I said, h264 is a paradigm shift, so it starts a new family of codecs. Probably h265 shares more with h264.
[02:06] <ubitux> saste: so we could tweak r/g/b histograms (using a function for instance, or some kind of other input)
[02:07] <ubitux> http://www.mutinydesign.co.uk/resources/images/changing-eye-colour-to-blue-in-photoshop/adjusting-curves-in-photoshop.png
[02:07] <ubitux> this kind of stuff
[02:11] <saste> ubitux, how do you do that with no gui?
[02:12] <saste> and BTW lua scripting may be the solution to a lot of integration problems
[02:12] <saste> since you don't need to write C code anymore, but rely on lua scripts (which can be used to create custom control filters)
[02:13] <ubitux> saste: -vf curve=red=x*2+1:blue=x:red=x/2+5
[02:13] <ubitux> something like this... i guess :)
[02:14] <ubitux> or the "dot" method
[02:14] <ubitux> -vf curve=red=3,4;5,6;12,34:blue=..
[02:15] <ubitux> (and you interpolate some bezier curves between the dots)
[02:15] <ubitux> see the gimp curves feature
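The "dot" method ubitux sketches (control points plus interpolation) amounts to building a per-channel lookup table. A minimal sketch in C, using linear interpolation for brevity where the GIMP-style tool would use smooth bezier/spline curves; Point and build_lut are hypothetical names, not real filter code:

```c
#include <stdint.h>

/* A user-supplied control point, with x and y normalized to [0,1]. */
typedef struct Point { double x, y; } Point;

/* Build a 256-entry lookup table from n sorted control points by
 * interpolating linearly between neighbours (clamping outside the
 * first/last point). */
static void build_lut(uint8_t lut[256], const Point *pts, int n)
{
    for (int i = 0; i < 256; i++) {
        double x = i / 255.0, y;
        if (x <= pts[0].x) {
            y = pts[0].y;
        } else if (x >= pts[n - 1].x) {
            y = pts[n - 1].y;
        } else {
            int k = 0;
            while (pts[k + 1].x < x)
                k++;
            double t = (x - pts[k].x) / (pts[k + 1].x - pts[k].x);
            y = pts[k].y + t * (pts[k + 1].y - pts[k].y);
        }
        lut[i] = (uint8_t)(y * 255.0 + 0.5);
    }
}
```

Applying the curve to a frame is then a single table lookup per sample, which is why building the table once up front matters.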
[02:17] <saste> ubitux, are you planning to work on av_eval_is_const?
[02:17] <ubitux> no
[02:17] <ubitux> i'm planning nothing right now
[02:17] <ubitux> just throwing ideas and hope for someone to grab them
[02:18] <saste> ubitux, no i suggest, you propose, you implement ;-)
[02:19] Action: ubitux runs away
[02:19] <ubitux> maybe i'll do the curve stuff though
[02:19] <ubitux> that might be fun to do
[02:20] <saste> if you have a parsed expression, checking if it only contains constants should be easy enough
[02:21] <saste> then we can apply the same solution to multiple filters (overlay, volume, eq, ...)
[02:21] <ubitux> you need a white or black list of functions
[02:22] <ubitux> the safer choice is to have a list of the constant functions
[02:22] <ubitux> (so if one is not referenced, the worst case is that your code gets slow)
[02:22] <ubitux> then you need to crawl the expression and look if everything is constant
[02:22] <ubitux> but i'm extremely lazy right now so i won't do it
[02:22] <saste> ubitux, constant functions?
[02:23] <ubitux> yeah
[02:23] <ubitux> sin(12)*cos(4)*pi
[02:23] <ubitux> this is constant
[02:23] <ubitux> sin() and cos() are constant functions
[02:23] <saste> functions with constant args
[02:23] <ubitux> random() isn't for instance
[02:23] <saste> uhm ok
[02:23] <saste> this will easily get hairy
[02:23] <ubitux> also, it's risky to consider ld() and st() constant
[02:24] <saste> btw what's the status of gsoc?
[02:24] <ubitux> since it means re-using a context from a previous evaluation
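The av_eval_is_const() idea being discussed boils down to a recursive walk over the parsed expression tree with a whitelist of pure functions. A toy sketch in C; the Expr struct and expr_is_const below are illustrative stand-ins, not the real AVExpr internals:

```c
#include <stddef.h>

enum ExprType { E_VALUE, E_VAR, E_FUNC };

/* Toy expression node: a literal number, a variable (t, n, x, ...),
 * or a function call with up to two operands. */
typedef struct Expr {
    enum ExprType type;
    int is_pure_func;        /* sin/cos: yes; random/ld/st: no */
    struct Expr *child[2];
} Expr;

/* Conservative check: an expression is constant iff it is a literal,
 * or a pure function whose operands are all constant.  Anything not on
 * the whitelist is treated as non-constant, so the worst failure mode
 * is a needless re-evaluation, never a wrong cached value. */
static int expr_is_const(const Expr *e)
{
    if (!e)
        return 1;
    switch (e->type) {
    case E_VALUE: return 1;
    case E_VAR:   return 0;
    case E_FUNC:
        if (!e->is_pure_func)
            return 0;
        return expr_is_const(e->child[0]) && expr_is_const(e->child[1]);
    }
    return 0;
}
```

This matches ubitux's "safer choice": sin(12)*cos(4)*pi folds to a constant, while anything touching random(), ld() or st() does not.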
[02:24] <saste> any more proposals, or administrator volunteer?
[02:24] <ubitux> i may volunteer as a mentor, but i'm not sure of the kind of tasks to propose
[02:24] <saste> the page is more or less as i left it 4 days ago, with lots of ideas and few "mentored" tasks
[02:24] <saste> ubitux, filters, subtitles?
[02:25] <ubitux> some filters, subtitles anything should be ok
[02:25] <saste> more image formats
[02:25] <ubitux> depends on which
[02:25] <saste> telecine filters
[02:25] <ubitux> i'm not familiar with telecine TBH
[02:26] <saste> palette filters would be nice as well
[02:26] <ubitux> i've started downloading a few historical anime NTSC DVDs renowned for their crazy telecining
[02:26] <saste> like you select the palette and the filter applies it to the image
[02:26] <ubitux> i *may* be able to mentor the gif thing
[02:26] <ubitux> but i'm not yet comfortable with the format
[02:26] <saste> great
[02:27] <ubitux> with palettes there are a lot of things to do..
[02:27] <saste> i think it's better to have a few well specified tasks than lots of awesome tasks with no mentor
[02:28] <saste> (I'm not going to mentor more than one task)
[02:28] <saste> but you can propose several tasks, and only accept the one with the best candidate
[02:29] <ubitux> mmh
[02:32] <ubitux> http://caca.zoy.org/wiki/libcaca/study/introduction
[02:32] <ubitux> finally found it back.
[02:32] <ubitux> that stuff might be needed for gif mentoring
[02:33] <ubitux> and various palette stuff
[02:59] <cone-334> ffmpeg.git 03Clément BSsch 07fatal: ambiguous argument 'refs/tags/n1.0.5': unknown revision or path not in the working tree.
[02:59] <cone-334> Use '--' to separate paths from revisions
[02:59] <cone-334> refs/tags/n1.0.5:HEAD: build: disable iconv by default.
[03:23] <BBB> iive: h265 shares nothing with h264, you haven't followed its development (neither have I, but I can read wikipedia)
[03:23] <BBB> iive: neither does vp9 share anything with vp8
[03:24] <BBB> (I would know, I develop it)
[03:24] <BBB> iive: as for code sharing, sure, if their algo for mc is identical, it should use the same code; but if it isn't, it should not, and so it shouldn't be placed in a common struct (such as DSPContext)
[03:39] <cone-334> ffmpeg.git 03Clément BSsch 07master:393dcbf079e5: compat/strtod: isspace -> av_isspace.
[09:01] <joeherold> hello out there
[09:02] <joeherold> has any one time to help me out with a question about ffserver and the no-launch mode?
[09:02] <joeherold> i can not find anything in the docs
[09:06] <joeherold> hello?
[09:06] <joeherold> anyone here?
[09:08] <av500> no
[09:08] <joeherold> @av500
[09:09] <joeherold> hey, can you help me?
[09:13] <av500> you have not even asked a question
[09:14] <joeherold> yes, because i thought no one was out there. so my question: I can't find anything in the docs about no-launch mode.
[09:14] <joeherold> i want to add feeds to ffserver installed on ubuntu via apt-get without restarting ffserver
[09:15] <joeherold> there should be a way to add ffplayer instances manually in no-launch mode
[09:15] <joeherold> how is this done?
[09:16] <joeherold> so start ffserver, and for example add a 2nd live stream during runtime
[09:16] <joeherold> because changing the config file and restarting ffserver would kick current viewers off the other streams
[11:24] <TimNich> 0
[11:24] <TimNich> .
[12:12] <jojva> Hi, I'm working on h264MVC and I'm facing a problem. I have a sample video, and the first 4 VCL nal units I get are 20 - 5 - 1 - 20. 
[12:13] <jojva> 20 means VCL nal unit of the second view. They always contain P or B slices
[12:14] <jojva> So, in decoding order, how can the first nal unit 20 be decoded since it needs a reference ?
[12:18] <nevcairiel> isnt the reference the frame from the primary view?
[12:19] <nevcairiel> on blurays its split up into two streams, a normal h264 stream, and the MVC stream which just has the second view, and references into the primary view in the normal h264 stream
[12:23] <jojva> That's right, so when the first nal unit, which is 20, is decoded, an error is raised : there is no decoded ref frame at the moment
[12:24] <nevcairiel> yes, but thats why i am wondering if you shouldn't have decoded the primary view from the other stream first, which is then the reference frame
[12:25] <jojva> That is the point that is unclear to me, yes
[12:26] <nevcairiel> thats how i understood it to work, the mvc stream only encodes differences from the main view, so it mostly doesn't have references of its own
[12:26] <nevcairiel> so for decoding you need to combine the main and second view, and then decode it as one
[12:32] <jojva> Specifically, what is the order for decoding ?
[12:32] <jojva> I thought it was the order of packets arrival
[12:34] <jojva> alternating between h264 and mvc. But if it is the whole h264 stream first, then the whole mvc stream, as soon as an IDR access unit is met in the h264 stream, all refs are removed...
[12:34] <av500> no
[12:34] <av500> you have to decode them in parallel
[12:34] <jojva> lol
[12:34] <av500> if you have 2 separate stream
[12:35] Action: av500 has no idea about MVC
[12:46] <jojva> ok thanks, I was hoping I could do it without parallelization
[13:06] <nevcairiel> the way i see it, you group them based on their timestamps, and then decode the frame from the primary view, and then the frame from the second view (which references the primary view)
[13:06] <nevcairiel> so you have to take both streams and map the primary and second view frames together
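The grouping nevcairiel describes can be sketched as a merge of the two packet streams: pair packets by timestamp and always emit the base-view packet before the dependent one, since the dependent view references it. Pkt and mvc_decode_order below are illustrative names under that assumption, not real API:

```c
#include <stdint.h>
#include <stddef.h>

typedef struct Pkt { int64_t pts; } Pkt;

/* Merge a base-view stream and a dependent (MVC) stream, both sorted by
 * pts, into one decode order.  out must hold n_base + n_dep entries;
 * 0 marks a base-view packet, 1 a dependent-view packet.  On equal
 * timestamps the base view is emitted first so the dependent frame can
 * reference the already-decoded base frame. */
static size_t mvc_decode_order(const Pkt *base, size_t n_base,
                               const Pkt *dep,  size_t n_dep,
                               int *out)
{
    size_t i = 0, j = 0, k = 0;
    while (i < n_base || j < n_dep) {
        if (j >= n_dep || (i < n_base && base[i].pts <= dep[j].pts)) {
            out[k++] = 0;   /* base view */
            i++;
        } else {
            out[k++] = 1;   /* dependent view */
            j++;
        }
    }
    return k;
}
```

For two streams with matching timestamps this yields strict base/dependent alternation, which is the "decode them in parallel" ordering av500 was gesturing at.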
[13:13] <cone-714> ffmpeg.git 03Clément BSsch 07master:53ca0cb8d46f: build: [autodetect] -> [no] in iconv help.
[13:29] <michaelni> the h264 spec talks about just 1 stream with NAL units for all views that gets split into access units, and each of these access units is then split into views that get decoded IIRC ...
[13:29] <michaelni> anyone got a sample /url/link with MVC ?
[13:30] <nevcairiel> i only have some blurays at home, from which i could cut a small sample later if so desired
[13:30] <michaelni> nevcairiel, please do
[13:33] <jojva> I have samples
[13:34] <jojva> Specifically I made one sample where an object is only visible in the right stream
[13:34] <jojva> I'm gonna upload it
[13:34] <michaelni> ok thanks
[13:35] <michaelni> i assume your samples decode correctly with some/the reference decoder of course
[13:35] <jojva> didn't try
[13:37] <jojva> https://mega.co.nz/#!PZBj3TJD!FcLI5wRYFZM-Ph80zH8idUPEuJJUHe-mXvh5IKljwLQ
[13:39] <av500> mega download links are so useful
[13:40] <jojva> ironic ?
[13:41] <JEEB> hah, incognito doesn't work with it in chrome, and my firefox wouldn't do anything after it "downloaded" it
[13:42] <jojva> Which video host is better ?
[13:42] <nevcairiel> firefox needs a extension to work with mega, iirc
[13:42] <nevcairiel> worked for me in chrome
[13:43] <JEEB> yeah, they offered an extension, but didn't say it wouldn't work at all.
[13:43] <nevcairiel> its concept is rather odd, but it's still less annoying than the other hosters with wait times and shit like that
[13:43] <JEEB> yeah
[13:43] <JEEB> so I guess the file was downloaded into local storage, and then the decryption or whatever didn't work
[13:43] <nevcairiel> i dont think firefox supports local storage
[13:43] <nevcairiel> which is why it doesnt work properly
[13:44] <JEEB> afaik it does actually
[13:44] <JEEB> although I'm on nightlies anyways >_>
[13:45] <michaelni> jojva, your file has the MVC split into a separate stream like nevcairiel explained. This would need to be merged, and how to do this would probably be explained in some Blu-ray spec.
[13:46] <nevcairiel> too bad i only have an old version of the BD spec, pre-3D
[13:46] <jojva> I don't know if it can be considered as a blu-ray, the samples I make are from a Sony camera
[13:46] <nevcairiel> its AVCHD, similar concept
[13:47] <michaelni> the question is in which spec is it described and where can i find a copy ?
[13:47] <jojva> If the spec is AVCHD 3D, it's proprietary right ?
[13:48] <nevcairiel> i havent found a recent version of the BD spec which includes 3D, unless you want to pay thousands of dollars
[13:48] <jojva> lol
[13:48] <JEEB> http://www.avchd-info.org/license/index.html heh
[13:48] <JEEB> I guess this would be cheaper
[13:49] <JEEB> unless someone finds the spec from baidu or something
[13:50] <JEEB> or naturally if some version of MPEG-2 Systems has MVC muxing in it
[13:51] <jojva> How much would this blu-ray spec define things here ? Because if it's only the correct ordering of packets, I can always try different orders
[13:53] <nevcairiel> yeah its probably just a very short paragraph
[13:53] <jojva> But I also found nal units coded as type 24, which isn't specified in h264. I assume they're used to separate different views with the same timestamp (their length is 0)
[13:54] <jojva> haha, I'll avoid spending a few thousand $ for a short paragraph
[13:54] <nevcairiel> nal type 24? those are not specified for h264, even in the last version
[13:55] <jojva> yes, they're just a delimiter in my samples to indicate we're going to decode the non-base view
[13:56] <jojva> or at least this is my assumption
[13:56] <nevcairiel> i suppose the spec says they can be used for application-specific means
[13:57] <nevcairiel> which then might be documented in the BD or AVCHD spec
[13:58] <jojva> Then, ideally, I guess the mpegts demuxer would have to give a specific meaning for these unspecified nals
[14:01] <jojva> nevcairiel, where did you get the previous bd spec ?
[14:04] <JEEB> jojva, that's one of those questions you shouldn't ask ;)
[14:04] <JEEB> or expect any other answer than "the internet"
[14:07] <jojva> Because of legal issues ?
[14:13] <kierank> jojva: hello
[14:13] <jojva> Hello
[14:17] <kierank> jojva: how come you want to work on mvc?
[14:17] <kierank> don't you know 3d is dead :)
[14:18] <av500> kierank: maybe it is not 3d
[14:18] <kierank> 4d then
[14:18] <av500> maybe its front/back view
[14:18] <jojva> It is 3d
[14:19] <kierank> 3d with mixed paff/mbaff scares me
[14:20] <jojva> Well, I'm a student (master's) and with a few people we have this project. The purpose is to film an object with the 3d cam, then transform it into a 3d model
[14:20] <kierank> ok
[14:20] <kierank> are you part of that EU 3d thing?
[14:20] <jojva> I don't know what you're talking about
[14:21] <kierank> there is a 3D project from the EU that a lot of universities participate in
[14:21] <kierank> MUSCADE
[14:25] <jojva> never heard of it
[14:26] <jojva> btw, 3d is dead ? Not sure, cameras that do not require glasses will soon become affordable
[14:26] <jojva> The one I'm using (university's property) is 1400€
[14:27] <jojva> Prices will go down, and TVs will follow as well. So MVC may come back in a few years...
[14:27] <kierank> jojva: NALU type 0x18 is the dependency representation delimiter
[14:30] <kierank> i remember they standardised it in a weird place
[14:31] <av500> MVC Stereo Dependent unit: A set of NAL units that are consecutive in decoding order and contain exactly one
[14:31] <av500> non-Base view component. A dependent unit starts from a view and dependency representation delimiter NAL unit,
[14:31] <av500> VDRD_nal_unit (nal_unit_type = 24)
[14:32] <kierank> basically like an AUD
[14:48] <jojva> So, like an AUD, its only purpose is to say "we're entering a new access unit here", but with "a new stereo dependent unit" instead
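Since nal_unit_type occupies the low 5 bits of the first NAL byte (forbidden_zero_bit and nal_ref_idc take the top 3), spotting the VDRD delimiter av500 quotes (nal_unit_type = 24, i.e. 0x18) comes down to a mask. A small sketch, where is_vdrd is a hypothetical helper:

```c
#include <stdint.h>

/* Check whether a NAL unit is a view and dependency representation
 * delimiter (VDRD, nal_unit_type = 24).  nal points just past the
 * Annex-B start code, at the NAL header byte. */
static int is_vdrd(const uint8_t *nal)
{
    return (nal[0] & 0x1f) == 24;
}
```

A demuxer splitting such a stream would treat everything from a VDRD up to the next delimiter as the dependent-view part of the access unit, much as it treats an AUD for the base view.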
[14:50] <cone-714> ffmpeg.git 03Carl Eugen Hoyos 07master:2b09078a6143: Add h264chroma dependency for vp5 and vp6 decoder to configure.
[14:50] <cone-714> ffmpeg.git 03Carl Eugen Hoyos 07master:9051e297d276: Add h264qpel dependency for snow codec to configure.
[14:50] <cone-714> ffmpeg.git 03Carl Eugen Hoyos 07master:523b12affc00: Add h264chroma dependency for cavs decoder to configure.
[14:50] <cone-714> ffmpeg.git 03Michael Niedermayer 07master:03148fd1743f: buildsys: only include log2_tab per library for shared builds
[14:51] <cone-714> ffmpeg.git 03Michael Niedermayer 07master:144da7eeca9c: Merge remote-tracking branch 'cehoyos/master'
[15:47] <ubitux> BBB: would you mind testing the iconv patch from carl on ffmpeg-devel? (basically check if ./configure --enable-iconv && make works)
[16:58] <cone-714> ffmpeg.git 03Michael Niedermayer 07master:7ccc0ed6a0ce: nutdec: print error on invalid/unsupported fourcc style
[16:58] <cone-714> ffmpeg.git 03Michael Niedermayer 07master:9bb54bb68529: nutdec: more specific return codes for decode_syncpoint()
[18:09] <cone-714> ffmpeg.git 03Michael Niedermayer 07master:faa0068a8755: avformat: Make duration estimation from pts more robust
[18:46] <cone-714> ffmpeg.git 03Michael Niedermayer 07master:dae38a66ebd8: dnxhddec: return the correct number of bytes from decode_frame
[19:00] <cone-714> ffmpeg.git 03Michael Niedermayer 07master:7992bdbeb4ba: update_stream_timings: check bitrate for being in range.
[19:47] <BBB-work> michaelni, any comments on my last patches?
[21:56] <durandal_1707> michaelni: have any more comments for stereo3d filter?
[21:59] <michaelni> i was hoping saste would review it ...
[21:59] <michaelni> but does it need more comments at all ?
[21:59] <durandal_1707> it is more/less copy-paste of mp code
[22:03] <wm4> so much for mplayer<->ffmpeg incest, hahaha
[22:04] <wm4> durandal_1707: there you have your example
[22:04] <durandal_1707> of what?
[22:05] <durandal_1707> ^NIH syndrome?
[22:06] <ubitux> durandal_1707: gonna port eq2? :x
[22:07] Action: ubitux writing a curves filter
[22:07] <durandal_1707> ^ what does that do?
[22:08] <durandal_1707> ubitux: what about eq1?
[22:08] <ubitux> eq2 is a generic eq afaik
[22:09] <ubitux> you should be able to drop both mp=eq* after porting eq2
[22:09] <wm4> does libavfilter have an API yet
[22:09] <ubitux> curves filter is basically the photoshop curves thing
[22:11] <durandal_1707> ubitux: enlighten me what it does
[22:12] <durandal_1707> wm4: no, but one can be written at any time
[22:12] <ubitux> durandal_1707: do you have gimp?
[22:13] <ubitux> https://sites.google.com/site/elsamuko/gimp/get-curves/curve_orig.png
[22:13] <ubitux> → smoothly adjust color range amplitudes
[22:18] <durandal_1707> nice, ultimate goal is to assimilate all gimp features
[22:23] <ubitux> :)
[22:25] <wm4> add GIMP source code to the libavfilter dir
[22:25] <wm4> it will be converted over the years
[22:26] <durandal_1707> patchWelcoMe4
[22:28] <durandal_1707> nevcairiel: you got unsupported xtor sample
[22:29] <RobertNagy> are FF_BUFFER_HINTS_REUSABLE and FF_BUFFER_HINTS_PRESERVE implicitly also FF_BUFFER_HINTS_READABLE?
[22:30] <durandal_1707> probably not
[22:31] <RobertNagy> not sure it makes sense to me in that case
[22:31] <RobertNagy> why would you have preserve, but not readable?
[22:32] <durandal_1707> dunno, but it does not need to make any sense....
[22:33] <durandal_1707> but that flag is currently irrelevant because nothing protects buffers yet
[22:33] <RobertNagy> the preserve flag?
[22:33] <durandal_1707> READABLE
[22:34] <wm4> nobody knows how these perms work
[22:34] <durandal_1707> wrong
[22:35] <RobertNagy> I'm trying to have the decoder to directly decode into gpu (OpenCL) allocated memory
[22:35] <RobertNagy> which is why these flags would be relevant for me, but they are a bit unclear...
[22:36] <durandal_1707> anyway they are gonna be removed once refcounted frames comes in
[22:37] <RobertNagy> ic
[22:44] <durandal_1707> flags are usually used for codecs that need previous frames, usually non-intra-only codecs
[23:05] <durandal_1707> You may not use, copy, publish, share, emulate, clone, rent, lease, sell, modify, decompile, disassemble, otherwise reverse engineer, or transfer the licensed program, or any subset of the licensed program, except as provided for in this agreement.
[23:05] <durandal_1707> ^ yea right
[23:19] <RobertNagy> durandal_1707: Do non-intra-only codecs always call reget before they read from a previously decoded frame?
[23:20] <RobertNagy> should they*?
[23:21] <nevcairiel> personally i consider reget to be a bad thing to use
[23:23] <RobertNagy> nevcairiel: does that mean that a decoder can read from a previously decoded frame without reget?
[23:23] <nevcairiel> also, it always depends on how the codec works, if it always codes the difference from the last frame, that might work, but if it always codes the difference to the last intra frame, it may need to keep the full intra frame untouched somewhere
[23:23] <nevcairiel> not automatically, it needs to store it
[23:24] <RobertNagy> not quite following the last comment; can a codec read from a decoded frame without reget?
[23:24] <nevcairiel> it needs to store the frame
[23:25] <RobertNagy> what do you mean?
[23:25] <nevcairiel> i dunno how to make this any simpler. if you want to keep it, store it
[23:26] <nevcairiel> anyhow the whole process of this is changing soon, then you can just increase the reference count of the frame and it'll be stored for you
[23:26] <RobertNagy> I don't want to store it...
[23:26] <RobertNagy> I just want to know when the codec would want to read from it
[23:26] <RobertNagy> so I can map it back from gpu memory
[23:27] <RobertNagy> when appropriate
[23:27] <RobertNagy> what I want to know is whether I should always map the memory back to host memory
[23:27] <RobertNagy> or if I can always wait until reget
[23:27] <RobertNagy> _buffer
[23:27] <nevcairiel> the more complicated codecs dont use reget_buffer, only very simple ones do
[23:28] <nevcairiel> stuff like h264 or the mpeg variants have their own frame management
[23:28] <RobertNagy> I see, then I have to always make sure that whatever I return from get_buffer is always readable/writable according to the hints
[23:28] <RobertNagy> thanks for clarifying
[23:29] <nevcairiel> i would be surprised if the hints are reliable
[23:29] <RobertNagy> not even those with FF_BUFFER_HINTS_VALID?
[23:29] <RobertNagy> nvm
[23:30] <nevcairiel> anyhow, it will usually call get_buffer once, then decode the frame into it, and then store the frame internally somewhere until it's no longer needed, and at that point it'll call release_buffer
[23:30] <nevcairiel> so any buffer you give to it needs to be writable anyway
[23:31] <RobertNagy> yes, but not necessarily readable, which makes a difference performance wise
[23:31] <RobertNagy> can I use AVFrame->reference reliably to find out if the codec will read from the frame?
[23:50] <BBB-work> nobody commented on my clever trick of using times4**5 </sad>
[23:55] Action: ubitux almost did it
[00:00] --- Tue Mar  5 2013


More information about the Ffmpeg-devel-irc mailing list