[Ffmpeg-devel-irc] ffmpeg.log.20170919

burek burek021 at gmail.com
Wed Sep 20 03:05:01 EEST 2017


[00:01:12 CEST] <FishPencil> -vf bitplanenoise,metadata=print -f null - did the trick
[00:16:10 CEST] <FishPencil> Hm, I wonder if it's possible to filter a frame first with bitplanenoise and then feed the results into nlmeans denoise strength...
[00:20:56 CEST] <Johnjay> FishPencil: I have no idea what those things are.
[00:22:01 CEST] <Johnjay> when i ask google what it is the 3rd, 4th, and 5th hits mention ffmpeg. lol.
[00:22:29 CEST] <furq> you'd think it would be the first on account of them both being ffmpeg filters
[00:23:00 CEST] <Johnjay> The first is a tweet of a Dave Rice using it. The second is Bit Plane on wikipedia
[00:24:51 CEST] <FishPencil> Does anyone know if that's possible on a frame by frame basis? detect the amount of noise, and then denoise based on that?
[00:28:47 CEST] <durandal_170> FishPencil: not in one go
[00:29:19 CEST] <FishPencil> durandal_170: is it even possible in two if the metadata is saved to a file or piped?
[00:29:34 CEST] <durandal_170> also bitplanenoise calculates the noise of a single bit plane
[00:30:36 CEST] <durandal_170> FishPencil: no, because nlmeans has one set of options for all frames
[00:31:07 CEST] <durandal_170> except if you're gonna filter each frame separately
[00:31:36 CEST] <FishPencil> durandal_170: so maybe the best option then is to do one pass to dump all the frames' noise and take the mean?
[00:32:55 CEST] <durandal_170> maybe, but you need to first find the correlation between the two
[00:33:37 CEST] <FishPencil> durandal_170: do the bitplanes refer to the chroma and luma values? I guess I need to research what that actually means
[00:34:36 CEST] <durandal_170> FishPencil: no, bitplane is plane of bits so 8 planes for 8bpp frame
[00:36:38 CEST] <FishPencil> I really would like to have per frame detection and respective strength settings. Some scenes from the same media have vastly different amounts of noise
[00:38:22 CEST] <durandal_170> yea, some kind of iso noise measurement
[00:38:51 CEST] <FishPencil> durandal_170: is that a thing?
[00:39:00 CEST] <durandal_170> dunno
[00:39:49 CEST] <FishPencil> does ffmpeg accept patches that just tie two filters together?
[00:40:34 CEST] <DHE> you can feed the same frame to multiple filters
[00:41:56 CEST] <FishPencil> as far as I'm setting the issue is that nlmeans uses per stream strength values, not per frame
[00:42:04 CEST] <FishPencil> s/setting/seeing
[00:43:00 CEST] <FishPencil> perhaps if the denoise filter was modified to support metadata=ma.txt which contained the bitplanenoise info
[00:45:00 CEST] <durandal_170> hmm, but how will you map numbers to options?
[00:47:46 CEST] <FishPencil> well if the metadata option is present then the strength setting is used as a relative strength to the detected per-frame one
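(A rough sketch of the two-pass idea being discussed here, not a tested recipe: the filter names are real, but the strength value is an arbitrary placeholder, and mapping the measured noise to an nlmeans strength is exactly the open question above.)
    # pass 1: per-frame noise stats for bit plane 1, dumped to noise.txt
    ffmpeg -i in.mkv -vf bitplanenoise,metadata=mode=print:file=noise.txt -f null -
    # average the values in noise.txt by hand or script, then
    # pass 2: denoise the whole stream with the single chosen strength
    ffmpeg -i in.mkv -vf nlmeans=s=3.0 -c:a copy out.mkv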
[01:06:52 CEST] <hiihiii> hello
[01:07:21 CEST] <hiihiii> I'm trying to pipe ffmpeg with gzip or zip
[01:17:51 CEST] <Johnjay> TIL what a bitplane is
[01:18:42 CEST] <degenerate> hey guys, so i've had a client send me some new iphone photos, they're in this .heic format.  I figured out with MP4Box i can dump the tiles to a series of .hevc files, but i'm stuck on how to combine all these .hevc files back into a single image with ffmpeg.
[01:18:45 CEST] <degenerate> One person suggested this:
[01:18:50 CEST] <degenerate> MP4Box -dump-item 1:path=item1.hevc ~/test_001.heic
[01:18:54 CEST] <degenerate> ffmpeg -i item1.hevc -frames:v 1 -vsync vfr -q:v 1 -y -an item1.bmp
[01:19:07 CEST] <degenerate> but this only does 1 tile. how do i do a series of tiles all into 1 image at the end?
[01:19:47 CEST] <Johnjay> FishPencil: Sorry I don't understand clearly. What are you trying to do with bitplanes? tweak an existing ffmpeg setting?
[01:19:53 CEST] <Cracki> tiles, so it's a kind of mosaic?
[01:20:10 CEST] <Cracki> you're perhaps better off converting them to something else and then using imagemagick
[01:20:38 CEST] <degenerate> Cracki: To be honest i barely understand how this new format works: https://en.wikipedia.org/wiki/High_Efficiency_Image_File_Format
[01:20:51 CEST] <degenerate> Cracki: no can do, imagemagick no support: https://github.com/ImageMagick/ImageMagick/issues/507
[01:20:51 CEST] <Cracki> sounds like a single HEVC intra frame
[01:21:05 CEST] <Cracki> use ffmpeg to convert to png or something
[01:21:11 CEST] <Cracki> then use imagemagick for puzzling
[01:21:24 CEST] <Cracki> what kinda resolution is that anyway?
[01:21:32 CEST] <Johnjay> When I googled for mp4box I got gpac
[01:21:34 CEST] <Cracki> I haven't heard of an "image" format being tiles before
[01:21:38 CEST] <Johnjay> When I looked for that I got https://gpac.wp.imt.fr/ and still don't know what it is
[01:21:56 CEST] <Cracki> ignore
[01:22:06 CEST] <degenerate> yeah i just learned about MP4Box and gpac today, because they are the only thing i could find that has HEIC support so far
[01:22:06 CEST] <Cracki> did you get ffmpeg to read it?
[01:22:21 CEST] <degenerate> it seems really new and a lot of the libraries out there just don't support it
[01:22:23 CEST] <Cracki> or, well, you unpacked that thing, right?
[01:22:31 CEST] <degenerate> i wish i just had a single line command that converted it from HEIC to PNG
[01:22:35 CEST] <degenerate> that would be awesome
[01:22:52 CEST] <Cracki> *heic/heif
[01:23:00 CEST] <degenerate> so far i'm here: for i in `seq 51`; do MP4Box -dump-item $i:path=item$i.hevc IMG_5915.HEIC; done
[01:23:11 CEST] <degenerate> basically i dumped the 51 tiles from the HEIC into hevc files
[01:23:17 CEST] <degenerate> so now i've got 51 HEVC files
[01:23:18 CEST] <Cracki> nice
[01:23:35 CEST] <Cracki> you might wanna try getting ffplay to "play" these single frames
[01:23:39 CEST] <Cracki> i.e. display them
[01:23:39 CEST] <degenerate> so i guess you're right i gotta convert each one to png or something
[01:23:45 CEST] <degenerate> and then try to stitch with imagemagick
[01:23:46 CEST] <degenerate> brutal
[01:23:50 CEST] <Cracki> if that works, ffmpeg can turn them into tamer image formats
[01:23:52 CEST] <degenerate> i guess i'll write a python wrapper that does the whole thing
[01:23:53 CEST] <Cracki> aye :)
[01:24:29 CEST] <degenerate> So can you suggest a ffmpeg command to go from a single hevc to say, png or jpg?
[01:24:31 CEST] <Cracki> it MIGHT be possible with some voodoo to get an ffmpeg filter to place consecutive frames or "video files" at computed positions in a resulting frame...
[01:24:43 CEST] <Cracki> first I'd try if ffplay can deal with the tiles
[01:24:46 CEST] <Cracki> single tile
[01:25:01 CEST] <degenerate> so is this correct?
[01:25:02 CEST] <degenerate> ffmpeg -i item1.hevc -frames:v 1 -vsync vfr -q:v 1 -an test.jpg
[01:25:42 CEST] <Cracki> no clue
[01:25:43 CEST] <Cracki> try
[01:25:59 CEST] <Cracki> but I'd try with ffprobe and ffplay first
[01:26:24 CEST] <klaxa> if you want to reduce quality loss, use a lossless codec like png
[01:27:25 CEST] <Cracki> HEIF seems to have come out within the last few months
[01:27:29 CEST] <Cracki> https://photofocus.com/2017/06/09/ios-11-photos-a-new-image-format/
[01:28:12 CEST] <Cracki> it's not a smart format... hevc licensing and all
[01:30:44 CEST] <Cracki> 500px.com says it's a "still picture" profile of hevc
[01:39:02 CEST] <degenerate> Cracki: yeah, but i'm about to get slammed by users trying to upload these files, and i still need to handle it. whether or not it is a smart format, my users are dumb and just take pictures with their phones and then try to upload them
[01:39:26 CEST] <degenerate> and i've not even touched the video compression yet; they're doing something similar too: while still in a .mov container, it's using this new HEIF stuff under the hood
[01:39:52 CEST] <degenerate> those .mov's don't even play in apple's quicktime player (maybe mine's out of date, but yeah, totally silly)
[01:40:35 CEST] <degenerate> come on quicktime! YOU HAD ONE JOB. play mov's and you can't even do that anymore.
[01:42:57 CEST] <Johnjay> what users degenerate?
[01:43:01 CEST] <Johnjay> On your website?
[01:43:04 CEST] <degenerate> yup
[01:56:06 CEST] <Johnjay> So what exactly do you need, a change to ffmpeg for your site?
[01:56:17 CEST] <degenerate> ?
[01:57:59 CEST] <Johnjay> Oh nevermind I finally found your initial comment
[01:58:28 CEST] <degenerate> k :)
[02:02:57 CEST] <Cracki> you're welcome to post sample files :)
[02:03:12 CEST] <degenerate> linked in last post on this thread
[02:03:13 CEST] <degenerate> https://github.com/gpac/gpac/issues/901
[02:03:20 CEST] <degenerate> HEIC Conversion.zip
[02:03:33 CEST] <degenerate> sorry, direct link: https://github.com/gpac/gpac/files/1312599/HEIC.Conversion.zip
[02:03:44 CEST] <degenerate> but that post does describe the current state of things, and what the files are
[02:04:50 CEST] <FishPencil> Does FFmpeg/avfilter have a way to generate a 1920x1808 frame of noise?
[02:07:29 CEST] <Cracki> certainly
[02:07:50 CEST] <Cracki> there is a noise source filter
[02:08:08 CEST] <atomnuker> ffmpeg -f rawvideo -s 1920x1808 -pix_fmt yuv420p -i /dev/urandom
[02:08:14 CEST] <Cracki> or that
[02:08:20 CEST] <atomnuker> there isn't a noise source filter for video, only audio
[02:08:38 CEST] <Cracki> oh, then I thought of gstreamer :/
[02:09:37 CEST] <FishPencil> without using urandom?
[02:10:36 CEST] <Cracki> you could use the audio noise, pipe it out as binary, pipe it back in and interpret as video
[02:10:50 CEST] <Cracki> (in case your system doesn't have /dev/urandom or something similar)
[02:12:04 CEST] <c_14> you could use color to create a white or black frame and then use the noise filter to add noise
[02:12:54 CEST] <Cracki> ^
[02:13:10 CEST] <FishPencil> I like that
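(c_14's suggestion spelled out as one possible command line; the size matches the question above, and the noise strength/flags are just example values.)
    # 1-second gray source at the requested size, uniform noise at full strength, keep a single frame
    ffmpeg -f lavfi -i color=c=gray:s=1920x1808:d=1 -vf noise=alls=100:allf=t+u -frames:v 1 noise.png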
[02:13:53 CEST] <Cracki> btw, ffplay "plays" the .hevc items just fine
[02:14:20 CEST] <Cracki> and I see you got the hevc->png step done too
[02:15:04 CEST] <degenerate> yeah, i think i'm good tbh
[02:15:12 CEST] <degenerate> unless you see anything wrong with my conversion flags
[02:15:32 CEST] <degenerate> for i in `seq 51`; do ffmpeg -i item$i.hevc -frames:v 1 -vsync vfr -q:v 1 -an image$i.png; done
[02:15:47 CEST] <degenerate> thanks for your help btw
[02:15:55 CEST] <Cracki> image50 is a thumbnail i see
[02:16:08 CEST] <Cracki> I don't see a need for vsync, but if you do, it's ok
[02:16:19 CEST] <degenerate> no, i just copied this command from somewhere
[02:16:24 CEST] <Cracki> ah
[02:16:37 CEST] <Cracki> -q:v might not make sense for png either
[02:16:39 CEST] <degenerate> yeah, image50 is a thumbnail, and apparently image49 is some sort of grid thumbnail that doesn't convert
[02:16:46 CEST] <degenerate> and also image51 is EXIF metadata
[02:16:47 CEST] <Cracki> if the pngs are temp files, doesn't matter how well they were compressed
[02:16:58 CEST] <Cracki> you could even store bitmaps... if that's a tmpfs in ram
[02:17:03 CEST] <degenerate> my hope is to grab that EXIF metadata and then re-apply it to the final image i create with imagemagick after restitching the tiles
[02:17:10 CEST] <Cracki> ah, the exif data, ffmpeg or imagemagick might be able to copy that
[02:17:13 CEST] <Cracki> yeah
[02:17:39 CEST] <Cracki> item49 is empty
[02:18:02 CEST] <degenerate> yeah i don't really get what item49 is supposed to be used for TBH
[02:18:06 CEST] <Cracki> the header is even wrong, says length 24, but 16 bytes in total
[02:19:51 CEST] <Cracki> you might wanna poke the imagemagick people
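(Putting the pieces above together into one end-to-end sketch; the 48-tile count and the 8-column grid are guesses that depend on the source image, the brace expansion assumes bash, and montage is ImageMagick.)
    # dump each tile and convert it to PNG (-vsync / -q:v aren't needed for PNG output)
    for i in `seq 48`; do
      MP4Box -dump-item $i:path=item$i.hevc IMG_5915.HEIC
      ffmpeg -i item$i.hevc -frames:v 1 image$i.png
    done
    # stitch the tiles left-to-right, top-to-bottom; adjust -tile to the real grid
    montage image{1..48}.png -tile 8x -geometry +0+0 combined.png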
[02:29:39 CEST] <degenerate> Hey Cracki does ffmpeg have some way of automatically detecting "letterboxing" and cropping it out?  For example i've managed to get past the stitching point, and this was the result:
[02:29:39 CEST] <degenerate> https://www.dropbox.com/s/4jwgdy49tuotuit/combined.png?dl=0
[02:29:55 CEST] <Cracki> I *think* there might be a filter
[02:30:13 CEST] <degenerate> because of the image tiles being a standard height and width, and the image itself is not exactly a multiple of that size, you end up with this black bar on the right and bottom
[02:30:28 CEST] <Cracki> check the video filters list, I think it'll be there
[02:30:40 CEST] <degenerate> thnx
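(The filter Cracki is thinking of is most likely cropdetect; a rough sketch, with the second command left as a template to fill in:)
    # print a suggested crop rectangle for the stitched image
    ffmpeg -i combined.png -vf cropdetect=limit=24:round=2 -f null -
    # then plug the reported values back in as crop=w:h:x:y
    ffmpeg -i combined.png -vf crop=w:h:x:y cropped.png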
[04:40:49 CEST] <Johnjay> thebombzen
[04:41:00 CEST] <Johnjay> oops no auto complete
[04:44:52 CEST] <Ober> is there any way to use ffmpeg to do transitions between small clips made from a larger image?
[04:45:06 CEST] <Ober> or even gluing them together ala cat(1)
[05:15:22 CEST] <kevinn_> Hi! Does anyone have any good examples of decoding x264 buffers with libavcodec?
[05:15:35 CEST] <kevinn_> I can't seem to find any nice and simple sample
[07:36:24 CEST] <kjshdlkafhslh> Is anyone here?
[07:36:36 CEST] <blap> dunno
[07:36:45 CEST] <kjshdlkafhslh> I am but a useless nerd trying to put my gif and mp3 together in one webm
[07:36:49 CEST] <blap> btw get a droid 4 with sailfish os
[07:37:05 CEST] <kjshdlkafhslh> I have all three source files in a folder with the ffmpeg.exe
[07:38:17 CEST] <kjshdlkafhslh> I have tried this multiple times
[07:38:19 CEST] <kjshdlkafhslh> However
[07:38:28 CEST] <kjshdlkafhslh> I can only get a webm which holds a single frame
[07:38:31 CEST] <kjshdlkafhslh> I do not know why
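(An untested guess at the kind of command being attempted, with placeholder filenames: loop the gif for the length of the audio.)
    # -stream_loop -1 repeats the gif input; -shortest stops when the mp3 ends
    ffmpeg -stream_loop -1 -i in.gif -i in.mp3 -c:v libvpx -c:a libvorbis -shortest out.webm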
[09:05:10 CEST] <jeffW_> Could someone explain to me, where in this:   https://trac.ffmpeg.org/wiki/CompilationGuide/MSVC    is it explaining how to compile FFmpeg in MSVC? Other than the beginning, it seems to me like it is explaining how to compile in Linux or MSYS2.
[09:06:38 CEST] <jeffW_> Nowhere does it explain how to set up anything in MSVC, it only tells you to perform Linux commands such as:     ./configure --toolchain=msvc    make     make install
[09:11:55 CEST] <JEEB> jeffW_: yes the same build system is used but --toolchain=msvc tells the system to utilize MSVC tools
[09:12:24 CEST] <JEEB> it makes no sense to duplicate the build system
[09:12:37 CEST] <JEEB> so yes, you need msys2 or something else that provides you with the basic shell etc
[09:12:50 CEST] <JEEB> and you can call that from the visual studio development command line
[09:13:10 CEST] <jeffW_> To build 64bit you need to run this command to enable the 64bit compiler and linker:  vcvarsall.bat amd64
[09:13:37 CEST] <jeffW_> What does running vcvarsall.bat amd64  do?  How is running this .bat file going to help MSYS2?
[09:14:04 CEST] <JEEB> it adds the MSVC tools into the environment, although usually you go the other way... not that I've read that document on the wiki recently :P
[09:14:32 CEST] <JEEB> the visual studio command line link in the start menu does pretty much that I think?
[09:15:29 CEST] <jeffW_> Start menu?
[09:15:48 CEST] <jeffW_> Does pretty much what?
[09:16:32 CEST] <JEEB> what I'm trying to explain is that while I haven't read that document, the vcvarsall.bat is what those "open visual studio development command line" shortcuts in the start menu utilizes
[09:16:55 CEST] <JEEB> so "how does this help" => "it is supposed to bring the tools into your environment (PATH etc)"
[09:17:28 CEST] <JEEB> anyways, I've got work to do so I can't be hand-feeding you here >_>
[09:18:21 CEST] <jeffW_> Bye.
[09:19:08 CEST] <jeffW_> Does anyone else know how to get the information presented here:    https://trac.ffmpeg.org/wiki/CompilationGuide/MSVC   to work?
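(Roughly the flow JEEB describes above, assuming an MSYS2 install; shortcut and launcher names vary between Visual Studio and MSYS2 versions.)
    # 1. open the x64 Visual Studio developer command prompt (or run vcvarsall.bat amd64)
    # 2. from that prompt, start an MSYS2 shell that inherits the MSVC environment:
    msys2_shell.cmd -use-full-path
    # 3. inside that shell, in the ffmpeg source tree:
    which cl link          # both should resolve to the MSVC binaries
    ./configure --toolchain=msvc
    make && make install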
[09:48:38 CEST] <cart_man> Hey everyone
[09:50:07 CEST] <cart_man> I am trying to grab 2 frames a second from a video. My command is as follows:  ffmpeg.exe -y -vf fps=2/1 ./Input.asf ./conv/%03d.jpg    ...It gives me this error however: set video filters cannot be applied to input url. :(
[09:50:24 CEST] <cart_man> Is there another way to grab roughly 2 frames a second instead of all of them?
[09:58:57 CEST] <Nacht> cart_man: ffmpeg.exe -i ./Input.asf -y -vf fps=2/1 ./conv/%03d.jpg
[09:59:28 CEST] <Nacht> define the input file with -i and place the -vf after it, cause you are applying it to the output, not the input
[10:24:04 CEST] <cart_man> ok that worked... Thanks a lot!
[10:27:03 CEST] <jeffW_> Is anyone here familiar with compiling ffmpeg with MSYS2/mingw-w64 ?
[10:27:47 CEST] <jeffW_> I am trying to follow the MSVC guide on the wiki:    https://trac.ffmpeg.org/wiki/CompilationGuide/MSVC
[10:28:53 CEST] <jeffW_> When I type  which link   produces:    /c/Program Files (x86)/Microsoft Visual Studio 14.0/VC/bin/x86_amd64/link       and    which cl   produces:    /c/Program Files (x86)/Microsoft Visual Studio 14.0/VC/bin/x86_amd64/cl
[10:29:22 CEST] <jeffW_> Yet when I run:           $ ./configure --enable-asm --enable-yasm --arch=x86_64 --disable-ffserver --disable-doc --disable-ffprobe --disable-ffmpeg --enable-shared --disable-static --disable-bzlib --disable-libopenjpeg --disable-iconv --disable-zlib --prefix=/c/ffmpeg-output-shared2 --toolchain=msvc
[10:29:57 CEST] <jeffW_> I get:           cl is unable to create an executable file. If cl is a cross-compiler, use the --enable-cross-compile option. Only do this if you know what cross compiling means. C compiler test failed.  If you think configure made a mistake, make sure you are using the latest version from Git.  If the latest version fails, report the problem to the ffmpeg-user at ffmpeg.org mailing list or IRC #ffmpeg on irc.freenode.net. Include the log 
[10:32:13 CEST] <jeffW_> Here is the ffbuild/config.log :         https://pastebin.com/EBCcZTP7
[10:40:00 CEST] <BtbN> that looks like your MSVC is just broken, or your environment is set up wrong
[10:41:55 CEST] <jeffW_> :(
[10:43:19 CEST] <jeffW_> Alright, well MSVC did give me a warning about possibly needing a repair after I uninstalled the Community version
[10:43:55 CEST] <jeffW_> I'll try repairing it and report back.
[16:35:19 CEST] <voltagex> hey, anyone awake? I've got an mp4 file that appears to switch resolution part way through. Wondering if I can upload it anywhere as I'm not sure how to file a bug for it
[16:35:36 CEST] <JEEB> there were multiple places for samples if I recall correctly
[16:36:13 CEST] <Nacht> Maybe it's an adaptive bitrate vod /s :)
[16:36:36 CEST] <JEEB> well you can have multiple parameter set things and each sample can in theory point to one of the initialization data things
[16:36:53 CEST] <JEEB> and/or you just have initialization data in-band within the samples
[16:37:04 CEST] <Nacht> https://streams.videolan.org/upload/ ?
[16:37:06 CEST] <JEEB> both ways enable you to do complete changes of the video track
[16:37:14 CEST] <JEEB> yes, that's one way of doing it I think
[16:41:21 CEST] <voltagex> I think it's being grabbed by ffmpeg incorrectly in the first place. https://video.vice.com/en_us/video/motherboard-speed-daemons-los-angeles-drag-racing/58ee471250a511f57762b8fb if anyone's interested (filter chrome traffic for m3u)
[19:23:39 CEST] <jamin> I'm experiencing an odd hang when trying to open a multicast stream with ffmpeg.  With tcpdump I can see that packets are arriving, but ffmpeg does nothing
[19:24:03 CEST] <jamin> https://pastebin.com/y5zYLgEh
[19:24:31 CEST] <JEEB> you might want to specify the source(s)
[19:24:47 CEST] <JEEB> udp://MULTICAST:PORT?sources=SOURCE_IP
[19:25:15 CEST] <DHE> got an iptables rule dropping the traffic?
[19:25:27 CEST] <DHE> (I assume linux)
[19:26:07 CEST] <jamin> DHE: there is an iptables rule to drop 0 sized udp packets but not the others... and even without it ffmpeg hung at the same point
[19:26:14 CEST] <jamin> and yes, Linux
[19:26:32 CEST] <JEEB> for me it's usually the lack of source IP, because otherwise lavf won't register
[19:26:37 CEST] <JEEB> that has often been a thing for me
[19:27:36 CEST] <jamin> getaddrinfo(sources=192.168.1.238, 0): Name or service not known
[19:27:55 CEST] <jamin> nvm, typo
[19:28:12 CEST] <jamin> nah, even with sources it hangs in the same place
[19:28:38 CEST] <JEEB> also set timeout in microseconds to know if it's just thinking it's not getting data?
[19:28:53 CEST] <JEEB> https://www.ffmpeg.org/ffmpeg-all.html#udp
[19:30:01 CEST] <JEEB> jamin: also try setting localaddr to make sure it tries to bind the correct interface
[19:30:10 CEST] <JEEB> other than that, no idea
[19:30:23 CEST] <jamin> udp://@239.255.42.42:5004?sources=192.168.1.238&timeout=1000: Input/output error
[19:30:37 CEST] <JEEB> that's in microseconds so that's... really low timeout
[19:30:52 CEST] <jamin> udp://@239.255.42.42:5004?sources=192.168.1.238&timeout=100000: Input/output error
[19:31:14 CEST] <jamin> udp://@239.255.42.42:5004?sources=192.168.1.238&timeout=10000000: Input/output error
[19:31:23 CEST] <JEEB> yea, that looks alright range already
[19:31:39 CEST] <JEEB> so indeed, it doesn't listen where the thing is
[19:31:44 CEST] <JEEB> try defining localaddr
[19:32:05 CEST] <JEEB> which is the local address of the interface
[19:32:08 CEST] <JEEB> also you don't need the @
[19:36:08 CEST] <jamin> removed the iptables rule entirely, also removed the @
[19:36:12 CEST] <jamin> udp://239.255.42.42:5004?sources=192.168.1.238&timeout=10000000&localaddr=192.168.1.100: Input/output error
[19:36:37 CEST] <JEEB> rip
[19:37:00 CEST] <JEEB> at this point it would have worked for me (after I add the source)
[19:37:10 CEST] <JEEB> try strace'ing?
[19:39:39 CEST] <jamin> just a bunch of these
[19:39:40 CEST] <jamin> futex(0x5619b6fa7640, FUTEX_WAIT_BITSET_PRIVATE|FUTEX_CLOCK_REALTIME, 0, {tv_sec=1505842716, tv_nsec=804757000}, 0xffffffff) = -1 ETIMEDOUT (Connection timed out)
[19:41:39 CEST] <JEEB> ffv1 mentioned \o/ https://pbs.twimg.com/media/DKFi2QWWsAg6Ysl.jpg:orig
[21:15:40 CEST] <pgorley> can a hwaccel fail (barring catastrophic hardware meltdowns) if it was successfully created with av_hwdevice_ctx_create?
[21:16:20 CEST] <pgorley> on calls to avcodec_send_packet/avcodec_receive_frame or av_hwframe_transfer_data?
[21:17:20 CEST] <BtbN> if you send it nonsensical or broken data, sure
[21:17:25 CEST] <BtbN> Or something it plain does not support
[21:21:55 CEST] <pgorley> makes sense
[21:22:52 CEST] <pgorley> how do i make sure the hwaccel supports what i'm gonna send it? (resolution, codec, all that)
[21:24:52 CEST] <Diag> pgorley: check the wiki or try it
[21:25:23 CEST] <pgorley> i was hoping for a way to check that in code?
[21:30:18 CEST] <JEEB> pgorley: I'm not sure if there's a good way
[21:30:30 CEST] <JEEB> generally things tend to not let you send things in if they're not 8bit, 4:2:0
[21:30:47 CEST] <JEEB> after that you hope the driver doesn't BSoD on you for sending too many reference frames or whatever
[21:31:04 CEST] <JEEB> hw decoding has gotten better, but it's still kind of fragile
[21:32:03 CEST] <BtbN> If it decodes the first frame and you don't start sending it vastly different stuff or garbage, it will continue working
[21:32:24 CEST] <BtbN> so testing if it works is done by doing just that
[21:32:51 CEST] <pgorley> no other way than to start sending it frames then, alright
[21:32:54 CEST] <pgorley> thanks!
[23:42:23 CEST] <kevinn_> Hi! Does anyone have any good examples of decoding x264 buffers with libavcodec?
[23:42:24 CEST] <kevinn_> I can't seem to find any nice and simple sample
[00:00:00 CEST] --- Wed Sep 20 2017


More information about the Ffmpeg-devel-irc mailing list