[Ffmpeg-devel-irc] ffmpeg-devel.log.20170725

burek burek021 at gmail.com
Wed Jul 26 03:05:04 EEST 2017


[03:25:16 CEST] <ElementalBlack> Is there any detailed documentation on FFmpeg's API besides inline comments? I'm writing a filter and it's taking a good amount of time to figure everything out from analyzing other filters.
[03:29:50 CEST] <ElementalBlack> A flow chart would be nice too. It looks like most have an init, uninit, and filter_frame
[03:46:22 CEST] <jamrial> ElementalBlack: unfortunately no. it's hard to get people to write good tutorials or documentation, so generally the way to go is "look at an existing simple filter/demuxer/decoder, copy/paste it to a new file and work from that"
[03:46:54 CEST] <jamrial> but if you have specific questions, you can ask here and someone that knows the code might answer
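
For reference, a minimal sketch of the skeleton jamrial suggests copying from an existing filter, using the in-tree filter API of this era; the filter name "example" and its no-op behaviour are illustrative only, not an existing filter:

    /* example no-op video filter skeleton; names are illustrative */
    #include "libavutil/internal.h"
    #include "avfilter.h"
    #include "internal.h"

    static av_cold int init(AVFilterContext *ctx)
    {
        /* one-time setup: validate options, allocate private state */
        return 0;
    }

    static av_cold void uninit(AVFilterContext *ctx)
    {
        /* free whatever init() or filtering allocated */
    }

    static int filter_frame(AVFilterLink *inlink, AVFrame *in)
    {
        /* called once per input frame; process it, then pass it on */
        return ff_filter_frame(inlink->dst->outputs[0], in);
    }

    static const AVFilterPad inputs[] = {
        { .name = "default", .type = AVMEDIA_TYPE_VIDEO, .filter_frame = filter_frame },
        { NULL }
    };

    static const AVFilterPad outputs[] = {
        { .name = "default", .type = AVMEDIA_TYPE_VIDEO },
        { NULL }
    };

    AVFilter ff_vf_example = {
        .name        = "example",
        .description = NULL_IF_CONFIG_SMALL("Pass video through unchanged."),
        .init        = init,
        .uninit      = uninit,
        .inputs      = inputs,
        .outputs     = outputs,
    };
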
[04:32:06 CEST] <cone-107> ffmpeg 03Steven Liu 07master:805ce25b1d2f: avformat/hlsenc: improve hls encrypt get key file operation
[05:48:12 CEST] <ElementalBlack> Are filters run across multiple threads, or do they need to be written that way?
[05:56:02 CEST] <ElementalBlack> As in, will libavfilter queue/run/collect multiple frames to the filter?
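
That question goes unanswered in the log. As background on the API rather than anything said in the channel: libavfilter hands a filter one frame at a time through filter_frame; a filter only gets intra-frame parallelism if it opts in with AVFILTER_FLAG_SLICE_THREADS and splits the work itself, roughly as sketched below (filter_slice and ThreadData are made-up names for the example):

    /* illustrative fragment only */
    typedef struct ThreadData { AVFrame *in, *out; } ThreadData;

    static int filter_slice(AVFilterContext *ctx, void *arg, int jobnr, int nb_jobs)
    {
        ThreadData *td = arg;
        /* process the jobnr-th horizontal band of td->in into td->out */
        return 0;
    }

    /* inside filter_frame(): fan the work out over the graph's thread pool */
    ThreadData td = { .in = in, .out = out };
    ctx->internal->execute(ctx, filter_slice, &td, NULL, nb_jobs);
    /* nb_jobs is typically FFMIN(frame height, available threads) */

    /* and in the AVFilter definition: */
    .flags = AVFILTER_FLAG_SLICE_THREADS,
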
[05:59:09 CEST] <cone-107> ffmpeg 03James Almer 07master:4a654be3fb0a: avcodec/mpegvideo_enc: fix mixed declarations and code warning
[09:41:58 CEST] <cone-223> ffmpeg 03Nicolas George 07master:bbc7cfbf1e0b: lavfi/testsrc2: fix completely transparent alpha.
[14:40:51 CEST] <ldts> I am using ffplay to play back an hevc video; however, I only get a frame displayed every 10 or 12 seconds. Can anyone give me some hints on how to debug this, please? I am using http://jell.yfish.us/jellyfish-3-mbps-hd-hevc.mkv on the ffmpeg master branch (yesterday's tip).
[15:00:04 CEST] <DHE> ldts: wrong channel
[15:05:00 CEST] <ldts> DHE: um, is there a channel specific to ffplay? I am integrating HEVC V4L2 support - extending the 4 patches I posted yesterday with HEVC decoding capabilities; since I am not discarding frames I can see image artifacts being displayed. Not sure where in ffmpeg the discarding of frames is handled though, hence my question.
[15:10:25 CEST] <DHE> well this is the dev channel for people making patches for ffmpeg itself. generic playback issues would go to the main #ffmpeg channel
[15:16:18 CEST] <ldts> yep, I thought so. 
[15:17:11 CEST] <ldts> I need some help finding out where - in ffmpeg - hevc frames are being discarded and if there is an algorithm for it...
[15:18:59 CEST] <JEEB> I think the HEVC decoder might not implement that since I've seen the sw decoder pushing out non-fully-decoded pictures (as in, the sync hasn't been completed yet)
[15:19:06 CEST] <JEEB> the AVC decoder seems to implement it at least
[15:22:07 CEST] <ldts> JEEB: yeah I couldn't find it in the decoder... was wondering if it could have happened earlier? still, 10 seconds without a frame seems like I am hitting some other problem
[15:24:42 CEST] <BBB> I would generally recommend using ffmpeg instead of ffplay for debugging
[15:24:52 CEST] <ldts> when I run my v4l2_m2m hevc decoder I can see that some frames are being properly decoded but others are not...and I am wondering if I should be filtering out the input to the decoder somehow (I am running a hevc_mp4toannexb as well)
[15:24:57 CEST] <BBB> so... does ffmpeg -i file -f image2 /tmp/file-%d.png reproduce the issue?
[15:25:04 CEST] <BBB> if so, does -vsync 0 fix it?
[15:25:17 CEST] <ldts> BBB:  ah let me test. I'll come back to you on this.
[15:25:25 CEST] <BBB> if ffmpeg reproduces it but vsync 0 fixes it, it means timestamps are messed up
[15:25:37 CEST] <BBB> if not, it means the hevc decoder is internally discarding them (maybe sync code, who knows)
[15:25:57 CEST] <BBB> if ffmpeg doesn't reproduce it at all, then it's an ffplay bug and maybe you should ask marton
[15:26:16 CEST] <BBB> hopefully that's somewhat helpful
[15:27:44 CEST] <JEEB> yea, ffplay isn't the best thing for debugging :)
[15:29:32 CEST] <ldts> BBB:  JEEB: thanks, yes that helps and thanks for the advice. will retest now. 
[15:30:52 CEST] <BBB> btw I implemented v4l2 drivers (and even did some uvc work) several lives back, so to see v4l2 being used for hardware decoding (like vaapi/vdpau/dxva2/vda/etc.) is kind of strange for me, but oh well...
[15:33:20 CEST] <ldts> BBB: ah, this is for video decoding. even stranger was using an ISP for image processing in the print industry with Intel Atoms, and that was fun as well :)
[15:33:54 CEST] <BBB> crazy stuff
[15:34:03 CEST] <BBB> anyway, all fine, move along, nothing to see here ;)
[15:34:16 CEST] <bencoh> v4l2 is used for a lot of "video-related" stuff, even when it's not exactly ... "recommended"
[15:34:28 CEST] <bencoh> since there is no other "recommended" way :/
[15:34:49 CEST] <BBB> so from this, should I assume you guys do hevc decoding in v4l2 also?
[15:34:57 CEST] <BBB> hows vp9 decoding going on these chips?
[15:35:13 CEST] <BBB> (vp9 is much more interesting than hevc, nobody uses hevc)
[15:35:19 CEST] <BBB> (not on cell phones at least)
[15:35:30 CEST] <bencoh> (not yet?)
[15:35:33 CEST] <ldts> BBB:  yes, h264, mpeg4, vp8, hevc..I am not sure we have VP9 support 
[15:35:58 CEST] <BBB> bencoh: maybe - hard to say what'll happen tomorrow, right?
[15:36:08 CEST] <bencoh> :)
[15:36:11 CEST] <BBB> ;)
[15:36:52 CEST] <BBB> ldts: I'm obviously biased, but hey, if the hw supports it, try to make it work; vp9 is pretty big on cell phones, and av1 is coming too
[15:38:36 CEST] <ldts> BBB: ah, cool... the Venus hardware on Qcom's 410C and 820C does have hevc btw (its Venus v4l2 driver was recently merged into korg, so it seems a good idea to make it available to everyone via ffmpeg)
[15:39:24 CEST] <ldts> BBB:  anyway, back to trying to understand what is going on...thanks for all the advice!
[15:39:36 CEST] <BBB> qcom=snapdragon?
[15:39:45 CEST] <ldts> yes
[15:39:46 CEST] <BBB> 820 has vp9 support afaik
[15:39:54 CEST] <BBB> not sure about the 410
[15:40:08 CEST] <ldts> https://lwn.net/Articles/697956/ 
[15:40:29 CEST] <BBB> yeah I saw that from your email to the list
[15:40:46 CEST] <BBB> anyway, back to work I guess - don't forget vp9 on the 820! :)
[15:41:41 CEST] <ldts> um, the 820 doesn't have VP9 (it has VP8) ...just looking at the docs. will check the driver and if it is there I'll try to enable it for sure
[15:41:59 CEST] <BBB> the webm wiki claims it does
[15:42:01 CEST] <BBB> strange
[15:42:20 CEST] <BBB> http://wiki.webmproject.org/hardware/socs
[15:42:28 CEST] <BBB> Qualcomm	SnapDragon 820	Mobile			Yes
[15:42:32 CEST] <BBB> (yes for vp9 decoding)
[15:42:33 CEST] <ldts> yes, it is correct. there is vp9 (I looked at the wrong page)
[15:43:00 CEST] <BBB> :)
[15:43:20 CEST] <ldts> ok more work then :)
[15:43:25 CEST] <BBB> off you go :-p
[17:12:20 CEST] <ldts> BBB, just so you know, VP9 is on :)
[17:12:31 CEST] <BBB> \o/
[17:13:39 CEST] <ldts> will add that to v2 of the patchset. still stuck on hevc though... dumping the frames still shows the same issues
[17:15:29 CEST] <ldts> anyway, if you think of something else anytime I'll be all ears...
[17:17:02 CEST] <BBB> you mean from ffmpeg with -vsync 0?
[17:17:25 CEST] <BBB> ffmpeg -i file -vsync 0 -f image2 /tmp/bla%d.png shows the same issue, one frame each 10 seconds or so?
[17:17:41 CEST] <BBB> that suggests a sync issue in the hevc decoder then, so you'll have to start digging into libavcodec/hevc*
[17:21:14 CEST] <ldts> BBB:  with the software decoder, I can get the frames correctly...with the v4l2 decoder (+sync) I still get the artifacts (heavy pixelation) and bad frames as well
[17:21:31 CEST] <BBB> oh I see, so it's a hw decoding issue then
[17:21:38 CEST] <BBB> well, that's entirely up to you, isn't it? :-p
[17:21:42 CEST] <ldts> btw I do need the vsync=0 on all v4l2 decoders to be able to dump the frames
[17:21:57 CEST] <BBB> with vsync=0 it works?
[17:22:16 CEST] <ldts> BBB: sorry no, I just get the frames in memory (otherwise they won't be written)
[17:22:26 CEST] <ldts> BBB:  I do hope it is up to me...
[17:22:30 CEST] <BBB> hm...
[17:22:43 CEST] <ldts> but I am not sure why the ffmpeg driver would be different for h264 or any other format
[17:22:57 CEST] <ldts> the same ffmpeg codec should work
[17:23:05 CEST] <BBB> I think there's a terminology issue here
[17:23:13 CEST] <BBB> anyway
[17:23:26 CEST] <BBB> look, if the software decoder works, and the hardware decoder gets the same input as the software decoder
[17:23:53 CEST] <BBB> then from my perspective, ffmpeg does everything correctly, and your job becomes to make sure that each field of each output frame generated by the hw decoder module is identical to that produced by the software decoder module
[17:24:06 CEST] <ldts> fair enough, makes sense
[17:24:16 CEST] <BBB> and if you follow black box theory, it means everything after that should be identical, regardless of the inside of the black box
[17:24:28 CEST] <ldts> yeah I get it :)
[17:24:44 CEST] <BBB> I'm not familiar enough with the hardware decoding itself to make any suggestions on where to go next
[17:25:15 CEST] <BBB> but my guesses would be syncing, different data format expectations (maybe escaping/no-escaping or whatever), timestamp mismatches or not setting timestamps on output frames, etc.
[17:25:18 CEST] <ldts> BBB: maybe it is an issue in the hevc_mp4toannexb bsf... which the sw decoder doesn't use
[17:25:29 CEST] <ldts> all right
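
A minimal sketch of the field-by-field check BBB describes, assuming the hardware frames have already been copied or mapped into the same software pixel format as the reference decoder's output; frames_match is an illustrative helper written for this note, not FFmpeg API:

    #include <string.h>
    #include <libavutil/frame.h>
    #include <libavutil/imgutils.h>
    #include <libavutil/pixdesc.h>

    /* return 1 if the hw-decoded frame matches the sw-decoded reference */
    static int frames_match(const AVFrame *sw, const AVFrame *hw)
    {
        if (sw->width  != hw->width  || sw->height != hw->height ||
            sw->format != hw->format || sw->pts    != hw->pts)
            return 0;

        const AVPixFmtDescriptor *desc = av_pix_fmt_desc_get(sw->format);
        int nb_planes = av_pix_fmt_count_planes(sw->format);

        for (int p = 0; p < nb_planes; p++) {
            int h = (p == 1 || p == 2) ? AV_CEIL_RSHIFT(sw->height, desc->log2_chroma_h)
                                       : sw->height;
            int row_bytes = av_image_get_linesize(sw->format, sw->width, p);
            for (int y = 0; y < h; y++)
                if (memcmp(sw->data[p] + y * sw->linesize[p],
                           hw->data[p] + y * hw->linesize[p], row_bytes))
                    return 0;   /* pixel data differs on plane p */
        }
        return 1;
    }
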
[18:44:42 CEST] <A3G1S> Hey guys, I need a bit of help. When I run readelf on libavutil.so I get -> Shared library: [libmp3lame.so.0]
[18:44:43 CEST] <J_Darnley> atomnuker: are you around now?
[18:44:43 CEST] <J_Darnley> I've got a Q about dirac, again.
[18:45:10 CEST] <J_Darnley> Do the transforms in ffmpeg already do edge extension for the plane?
[18:45:12 CEST] <A3G1S> How can I make it show -> Shared library: [libmp3lame.so], without the .0 suffix?
[18:45:53 CEST] <J_Darnley> The 5_3 one for example does a loop over 1..width-1 in one place.
[18:48:14 CEST] <nevcairiel> A3G1S: this is the correct way to link, it should link to the library with the major version in the name, so when the library changes in an incompatible manner, you'll get a loading error instead of weird crashes
[18:48:30 CEST] <atomnuker> J_Darnley: I have no recollection of "edge extension"
[18:49:01 CEST] <J_Darnley> Oh.
[18:49:19 CEST] <A3G1S> @nevcairiel thanks for the reply, the main issue is with the Android ecosystem, where System.loadLibrary("mp3lame"); will load libmp3lame.so and not libmp3lame.so.0
[18:49:38 CEST] <A3G1S> Also, other libraries like x264, fribidi, fontconfig and libass load fine
[18:50:03 CEST] <J_Darnley> I think it was in some bizarre corner of the spec, or maybe only a white paper.  Anyway, ignore that term for now.
[18:50:48 CEST] <J_Darnley> Is that loop over 1..w-1 supposed to handle overread at the edge of the frame?
[18:50:54 CEST] <A3G1S> it's mp3lame that I am facing the issue with; isn't there a way to make libavutil simply look for libmp3lame.so and not libmp3lame.so.0?
[18:51:37 CEST] <nevcairiel> just include a symlink, that's what is usually done
[18:53:27 CEST] <A3G1S> symlinks and all that work absolutely fine in bash, but I would just like to know a way to load it without the ".0", either by editing configure, the Makefile, etc.
[18:54:07 CEST] <Mavrik> If you're building for Android... why not just link everything statically and avoid these kinds of issues?
[18:55:02 CEST] <A3G1S> That's actually a valid point Mavrik, going to try a static build now
[19:13:15 CEST] <A3G1S> @Mavrik with static linking the size of the libraries increases significantly, so I need to go back to shared builds
[19:26:40 CEST] <kepstin> A3G1S: anyways, the name used in linking is based on the 'SONAME' field stored in the library being linked, so if you want it to say 'libmp3lame.so', then you have to rebuild lame to change the SONAME.
[19:28:03 CEST] <kepstin> most libraries do not, of course, provide a way to configure this.
[19:28:14 CEST] <A3G1S> Thanks for the response @kepstin, I did try renaming it manually but that didn't work either
[19:28:44 CEST] <A3G1S> I am going to change lame's configure and build it so it doesn't automagically append .0 to the library name
[19:34:09 CEST] <DHE> usually you're supposed to keep the .0 suffix, because any future ABI breakage will bump that number, so your programs fail to load instead of simply crashing
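
For anyone hitting the same thing later: the name kepstin refers to can be inspected with readelf -d libmp3lame.so (look for the SONAME entry); libavutil's DT_NEEDED entries are copied from that field at link time. The usual non-Android fix is nevcairiel's symlink, i.e. making sure a file named exactly libmp3lame.so.0 is resolvable by the loader; for Android, rebuilding lame so the library is linked with something like -Wl,-soname,libmp3lame.so would change the recorded name, though whether lame's build system exposes that cleanly is an assumption, not something confirmed in the log.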
[21:17:40 CEST] <cone-695> ffmpeg 03Rostislav Pehlivanov 07master:24de4fddcad9: lavu/frame: add new side data type for ICC profiles
[21:17:40 CEST] <cone-695> ffmpeg 03Rostislav Pehlivanov 07master:2e08bbb28210: pngdec: decode and expose iCCP chunks as side data
[21:17:40 CEST] <cone-695> ffmpeg 03Rostislav Pehlivanov 07master:0563a5d17542: mdct15: simplify prereindexing and forward transform postrotation
[23:02:31 CEST] <ZeroWalker> is it possible to retrieve the source format parameters from SwrContext?
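
That question went unanswered here. A sketch of one way to do it, on the assumption that the generic AVOptions accessors are acceptable (SwrContext stores its configuration as AVOptions under the libswresample option names used below):

    #include <inttypes.h>
    #include <libavutil/opt.h>
    #include <libswresample/swresample.h>

    /* read back the input-side parameters from an already configured context */
    static void dump_swr_input_params(SwrContext *swr)
    {
        int64_t in_rate = 0, in_layout = 0;
        enum AVSampleFormat in_fmt = AV_SAMPLE_FMT_NONE;

        av_opt_get_int(swr, "in_sample_rate", 0, &in_rate);
        av_opt_get_sample_fmt(swr, "in_sample_fmt", 0, &in_fmt);
        av_opt_get_channel_layout(swr, "in_channel_layout", 0, &in_layout);

        av_log(NULL, AV_LOG_INFO, "in: %"PRId64" Hz, fmt %s, layout 0x%"PRIx64"\n",
               in_rate, av_get_sample_fmt_name(in_fmt), in_layout);
    }
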
[23:40:20 CEST] <cone-695> ffmpeg 03Kaustubh Raste 07master:a776cb207448: libavcodec/mips: Optimize avc idct 4x4 for msa
[23:40:21 CEST] <cone-695> ffmpeg 03Michael Niedermayer 07master:7140761481e4: avformat/oggparsecelt: Do not re-allocate os->private
[00:00:00 CEST] --- Wed Jul 26 2017

