[Ffmpeg-devel-irc] ffmpeg.log.20180531

burek burek021 at gmail.com
Fri Jun 1 03:05:01 EEST 2018


[00:58:16 CEST] <slavanap> Hi! I have a list of segments in a video that I have to mute, could you suggest the most effective way to do it with ffmpeg? No video re-encoding needed.
[00:58:24 CEST] <zap0> when requesting a greyscale output, how are coloured inputs converted?
[00:59:13 CEST] <slavanap> zap0, I guess it's converted to YUV, then U:=0, V:=0
[00:59:53 CEST] <zap0> slavanap, output an audio track as WAV or some dumb fast format.   edit it.   make a new video by streaming in video from the original file and audio from the edited file
[01:00:31 CEST] <zap0> slavanap, ok.. makes sense if the original is YUV..  what if it's RGB..   just a YUV conversion first internally?
[01:01:49 CEST] <slavanap> zap0, about "muting" I expected ffmpeg already has some internal filter to do it.
[01:02:27 CEST] <slavanap> zap0, RGB -> YUV, then using just Y seems the most correct from a scientific point of view.
[01:02:33 CEST] <zap0> like there is a pix_fmt  grey14 or grey16 etc..   is the internal representation of the Y floating point?  or bigger than 8 bits?
[01:02:59 CEST] <zap0> slavanap, muting:   yeah i imagine there is a filter, but it would only do 1 span.
[01:03:43 CEST] <zap0> on command line  ffmpeg -filters
[01:04:56 CEST] <zap0> ffmpeg -filters | grep volume
[01:13:51 CEST] <zap0> slavanap  https://stackoverflow.com/questions/29215197/mute-specified-sections-of-an-audio-file-using-ffmpeg
[01:14:05 CEST] <slavanap> zap0, grep for RGB_TO_Y_CCIR -- that's for getting Luma (Y) from RGB. ffmpeg supports GRAY8 & GRAY16 formats -- that's greppable too.
[01:15:02 CEST] <slavanap> I can't find out the exact placement of colorspace conversions now.
[01:15:19 CEST] <slavanap> zap0, thanks for SO question & answer!
[01:15:28 CEST] <zap0> i hope it works!
[01:17:37 CEST] <slavanap> zap0, I just believe ffmpeg does something like RGB -> YUV -> Y -> RGB for RGB input and grayscale output in RGB, preserving the colour bit depth. Moreover, some formats support grayscale natively, so some of the conversions might not be necessary
[01:23:58 CEST] <furq> slavanap: -af "volume=0:enable=between(t\, 12\, 34)+between(t\, 56\, 78)"
[01:24:21 CEST] <slavanap> Thanks
[01:24:53 CEST] <slavanap> zap0, and I believe Y -> RGB is just r=g=b=y, but notice there are different YUV formats.
[01:24:56 CEST] <slavanap> good night
[01:25:50 CEST] <zap0> :)
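A minimal sketch putting furq's filter into a full command (the file names and the 12/34/56/78 timestamps are placeholders; the video stream is stream-copied, but the audio has to be re-encoded for the filter to take effect):

    ffmpeg -i input.mp4 -c:v copy \
        -af "volume=enable='between(t,12,34)+between(t,56,78)':volume=0" \
        -c:a aac output.mp4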
[01:26:50 CEST] <voip_> hello guys
[01:26:51 CEST] <voip_> I need to take an rtmp live stream and push it out over multicast UDP without transcoding video/audio.
[01:26:51 CEST] <voip_> Please help with the right command
[03:49:37 CEST] <Hello71> how does this command work: ffmpeg -ss 3 -i input.mp4 -vf "select=gt(scene\,0.4)" -frames:v 5 -vsync vfr -vf fps=fps=1/600 out%02d.jpg
[03:50:30 CEST] <Hello71> find all frames matching scene > 0.4, take the first one every 10 minutes, and stop at 5 frames?
[03:58:12 CEST] <furq> wrong order
[03:58:44 CEST] <furq> specifying -vf twice will just override the first one
[03:59:29 CEST] <furq> if it's supposed to be -vf select,fps then that'll pretty much do what you said
[03:59:34 CEST] <furq> plus it'll skip the first 3 seconds
[03:59:53 CEST] <furq> i suspect you actually want -vf fps,select though
[04:01:01 CEST] <Hello71> I don't think that does anything
[04:01:08 CEST] <Hello71> for most videos the select would then do nothing
[04:01:17 CEST] <Hello71> anyways I decided I actually want thumbnail filter
[04:01:54 CEST] <furq> thumbnail buffers the entire selection
[04:02:08 CEST] <furq> so you're probably going to run out of memory if you ask it to buffer 10 minutes of uncompressed frames
[04:03:36 CEST] <Hello71> well I'm planning on using fps=5,thumbnail=100 or something
[04:04:12 CEST] <furq> that'd help
[04:04:18 CEST] <furq> -skip_frame nokey might help as well
[04:04:29 CEST] <furq> since scenecuts are more likely to be on keyframes
[04:07:54 CEST] <Hello71> ty
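Combining the suggestions above into one sketch (the file names are placeholders and the fps/thumbnail values are just the ones Hello71 mentioned): -skip_frame nokey makes the decoder drop non-keyframes, fps=5 thins the stream further, and thumbnail=100 then picks the most representative frame out of each batch of 100.

    ffmpeg -skip_frame nokey -i input.mp4 \
        -vf fps=5,thumbnail=100 -frames:v 5 -vsync vfr out%02d.jpg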
[04:21:23 CEST] <Hello71> also, TIL the ffmpeg jpeg encoder is terrible
[04:22:04 CEST] <Hello71> why is it so much worse than imagemagick
[04:23:25 CEST] <atomnuker> turn on huffman optimizations
[04:24:11 CEST] <furq> isn't it on by default
[04:24:19 CEST] <Hello71> oh, nvm that last part, turns out a few K is a big difference to imagemagick
[04:24:28 CEST] <Hello71> thought 74 and 80 should be about the same, apparently not
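If the build exposes the mjpeg encoder's huffman option, atomnuker's suggestion would presumably look something like this (quality value and file names are placeholders; on recent ffmpeg versions optimal huffman tables are reportedly the default anyway):

    ffmpeg -i frame.png -c:v mjpeg -q:v 3 -huffman optimal out.jpg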
[06:01:36 CEST] <nai> hi, i have an issue trying to record my screen. here is my command:
[06:01:38 CEST] <nai> ffmpeg -framerate 30 -f x11grab -i :0.0 -f mp4 -c:v libx264 -crf 0 -preset ultrafast out.mp4
[06:02:12 CEST] <nai> the resulting file plays nice in mpv and ffplay, but firefox cannot play it. instead, it gives the following error: "Video can't be played because the file is corrupt"
[06:02:25 CEST] <nai> here is the output of ffprobe on that file: https://up.nai.im/ffprobe.txt
[06:05:34 CEST] <furq> nai: add -pix_fmt yuv420p
[06:06:12 CEST] <nai> furq: doesn't solve it
[06:06:44 CEST] <kepstin> lossless always uses the 444 profile in h264, which firefox doesn't support
[06:06:59 CEST] <kepstin> you'll have to use a non-lossless mode *and* the yuv420p pix_fmt option
[06:07:32 CEST] <kepstin> if you want working lossless video and 4:4:4 sampling in firefox & chrome, you should use vp9 instead of h264.
[06:07:50 CEST] <nai> perfect answer, thanks!
[06:08:15 CEST] <nai> i was precisely wondering about the benefits of webm/vp9 over traditional formats
[06:08:29 CEST] <kepstin> (note that the libvpx-vp9 encoder is a lot slower than x264 - you might consider capturing with x264 then transcoding to vp9 if it's too slow to do vp9 live)
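A sketch of the capture-then-transcode route kepstin describes (names are placeholders; -qp 0 gives lossless x264 for the capture, -lossless 1 gives lossless VP9 for the final file):

    ffmpeg -framerate 30 -f x11grab -i :0.0 -c:v libx264 -qp 0 -preset ultrafast grab.mkv
    ffmpeg -i grab.mkv -c:v libvpx-vp9 -lossless 1 screen.webm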
[06:08:30 CEST] <atomnuker> don't use fucking mp4
[06:08:44 CEST] <atomnuker> if ffmpeg dies hard the file will be corrupt
[06:08:46 CEST] <nai> kepstin: thanks
[06:08:56 CEST] <kepstin> and, what atomnuker said ;)
[06:09:51 CEST] <nai> am i to infer that mp4 stores data at the end of files?
[06:10:57 CEST] <kepstin> mp4 stores some codec initialization data together with a variable-size index field, which means the init data can't be stored at the start of the file when initially encoding the video
[06:11:10 CEST] <kepstin> and so it's written at the end... and the video is unplayable without the init data
[06:11:42 CEST] <kepstin> mkv is a reasonable alternative. you can always remux the file later into a different container.
[06:11:51 CEST] <nai> got it, thanks
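The remux kepstin mentions is just a stream copy into the new container, e.g. (file names are placeholders):

    ffmpeg -i out.mkv -c copy out.mp4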
[06:12:30 CEST] <nai> i was initially using mkv, but ran into another problem, maybe you can enlighten me on this one: the mimetype of the resulting files was application/octet-stream instead of video/x-matroska
[06:12:47 CEST] <kepstin> that sounds like a problem with your web server configuration
[06:12:59 CEST] <nai> no, i'm talking about the mimetype returned by xdg-mime
[06:13:02 CEST] <kepstin> (but most browsers can't play mkv anyways, so it doesn't really matter)
[06:13:26 CEST] <kepstin> hmm, well, that's probably a bug there. on linux there's no mime type stored in the file, so tools just try to read file headers and guess
[06:13:29 CEST] <nai> this isn't a very important issue, it just makes xdg-open confused when trying to open the file, and ends up opening it with my browser instead of my video player
[06:14:41 CEST] <nai> yeah, my file manager (pcmanfm) uses another method to detect file types, possibly based on extensions, and therefore is able to open the correct application
[06:14:57 CEST] Action: kepstin notes that xdg-mime reports 'application/x-matroska' correctly on his system - maybe yours is just out of date or missing file type definitions?
[06:15:04 CEST] <nai> but apparently something i was doing wrong prevents xdg-mime from detecting a matroska file
[06:15:17 CEST] <nai> wait, i'll give you a test command
[06:15:48 CEST] <nai> ffmpeg -framerate 30 -f x11grab -i :0.0 -f matroska -c:v libx264 -pix_fmt yuv420p -crf 0 -preset ultrafast out.mkv
[06:16:02 CEST] <nai> xdg-mime query filetype out.mkv => application/octet-stream
[06:16:31 CEST] <kepstin> I get video/x-matroska there. You probably have out of date software or a distro packaging issue :/
[06:16:47 CEST] <nai> hm, interesting
[06:16:53 CEST] <nai> i'm on arch's bleeding edge
[06:17:11 CEST] <kepstin> so, distro packaging issue then ;)
[06:17:22 CEST] <nai> yup. alright, thanks for everything! :)
[06:17:46 CEST] <kepstin> (i'm joking, but i have had friends run into issues with arch packages a fair bit, so..?)
[06:18:14 CEST] <nai> yeah, might be. i honestly don't have the insight to know what the cause might be
[06:18:52 CEST] <kepstin> also check what 'file --mime-type out.mkv' reports
[06:19:20 CEST] <kepstin> i dunno exactly how xdg-mime works, but maybe it uses the same data as file?
[06:19:24 CEST] <nai> application/octet-stream too
[06:19:47 CEST] <nai> my "file" command comes from https://www.darwinsys.com/file/
[06:20:03 CEST] <nai> apparently it's standard among Linux distros
[06:20:11 CEST] <nai> i have version 5.33-3
[06:20:56 CEST] <kepstin> yeah, so there's something wrong with the 'magic' database on your system that 'file' uses to detect file types.
[06:24:10 CEST] <nai> alright, i want to go to the end of this. could you upload your /usr/share/file/misc/magic.mgc file? here's mine https://up.nai.im/magic.mgc
[06:24:45 CEST] <nai> my server serves that as plain text, sorry about that, wrong mime type configuration :')
[06:26:30 CEST] <nai> ah, i've found a bug report https://bugs.launchpad.net/ubuntu/+source/file/+bug/420963
[06:27:13 CEST] <nai> ...from 2012
[06:27:17 CEST] <nai> 2010*
[09:33:03 CEST] <momomo> Don't block me. One line. Looking for a Linux Sysadmin in Europe, for a great job opportunity in Stockholm city. Well paid, permanent / temporary (based on your preference). Immediate Accommodation. One crux, has to also know Elastic Search. Need to be filled immediately.
[09:33:24 CEST] <lohroc> I want to use ffmpeg to downscale from 1080p .mkv to 720p .mp4 but I'm getting the error element type mismatch 3 != 0
[09:34:57 CEST] <lohroc> also can I resize the video and change the extension at once?
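No one picked this up in the log, but a plausible one-step downscale-and-remux for what lohroc describes would be something like this (names and quality settings are placeholders; scale=-2:720 keeps the aspect ratio with an even width):

    ffmpeg -i input.mkv -vf scale=-2:720 -c:v libx264 -crf 20 -c:a aac output.mp4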
[13:00:50 CEST] <subi> why doesn't zeranoe's win build include ffserver.exe? also why couldn't the 4.0 version initialise SDL audio through WASAPI?
[13:02:06 CEST] <durandal_1707> subi: ffserver is removed in 4.0
[13:02:43 CEST] <subi> then where's the feature now? in ffmpeg.exe?
[13:03:41 CEST] <klaxa> it's in development
[13:03:58 CEST] <klaxa> i'm writing a replacement
[13:04:28 CEST] <klaxa> maybe it will even be useful
[13:07:28 CEST] <subi> is it zeranoe specific or the original ffmpeg project that makes those changes?
[13:08:48 CEST] <furq> it was removed upstream because it never worked properly and nobody maintained it
[13:11:05 CEST] <subi> then what's left is obs studio?
[13:11:44 CEST] <furq> OBS doesn't have a server, does it?
[13:11:45 CEST] <subi> something for live broadcast
[13:11:57 CEST] <furq> you can still do screen capture with ffmpeg, you just need an external streaming server
[13:12:10 CEST] <furq> nginx-rtmp is a popular one
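For example, once an external server such as nginx-rtmp is running, a screen capture could be pushed to it roughly like this (the rtmp URL is hypothetical):

    ffmpeg -f x11grab -framerate 30 -i :0.0 \
        -c:v libx264 -preset veryfast -pix_fmt yuv420p \
        -f flv rtmp://localhost/live/stream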
[13:57:51 CEST] <TarquinWJ> JEEB: think it was you who helped me yesterday... I am running this command:
[13:57:51 CEST] <TarquinWJ> ffmpeg.exe -i source.mp4 -map 0:v -c:v libvpx-vp9 -pix_fmt yuv420p -b:v 2M -vf scale=1024:576 -r 30 -f dash -format_options dash_segment_type=webm -seg_duration 3 -media_seg_name "\$Number\$.chk" -init_seg_name 0.chk output.mpd
[13:57:51 CEST] <TarquinWJ> it works, I get a dashed video. but MediaInfo says that the first fragment is "Format : MPEG-4" rather than "Format  : WebM" which I see coming from YouTube
[13:57:51 CEST] <TarquinWJ> Am I doing something wrong?
[14:01:24 CEST] <TarquinWJ> it almost feels like I have got an old version rather than the latest, which is possible, but I downloaded the latest snapshot here:
[14:01:25 CEST] <TarquinWJ> https://ffmpeg.org/releases/ffmpeg-snapshot.tar.bz2
[14:07:54 CEST] <JEEB> TarquinWJ: not sure ffmpeg.c takes options like that?
[14:08:04 CEST] <JEEB> also isn't dash_segment_type its own option?
[14:08:10 CEST] <JEEB> -dash_segment_type "webm"
[14:08:21 CEST] <JEEB> not an option for "format_options"
[14:12:14 CEST] <TarquinWJ> hah, well at least that was a simple fix :D
[14:12:38 CEST] Action: TarquinWJ going to hang head in shame now
[14:13:33 CEST] <JEEB> hint: ffmpeg -h muxer=dash
[14:13:41 CEST] <JEEB> should list all AVOptions specific for that muxer
[14:13:51 CEST] <TarquinWJ> thanks :)
[14:14:57 CEST] <TarquinWJ> always nice to realise you just misunderstood the documentation, rather than something being impossible
[14:15:15 CEST] <JEEB> also if you're building, just use the git.videolan.org git repo, much simpler to update :P
[14:15:20 CEST] <JEEB> and you know exactly the hash you're on
[14:21:39 CEST] <TarquinWJ> I see why I was confused; documentation does not show "-dash_segment_type", it shows "dash_segment_type" without the "-", so it looks like a value for the previous option
[14:22:04 CEST] <TarquinWJ> see here: https://www.ffmpeg.org/ffmpeg-all.html#dash-2
[15:09:01 CEST] <JEEB> TarquinWJ: and that is because the dash in front is specific to ffmpeg.c :)
[15:09:10 CEST] <JEEB> while those things just list the options for each module
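With JEEB's correction applied, TarquinWJ's command would presumably become:

    ffmpeg.exe -i source.mp4 -map 0:v -c:v libvpx-vp9 -pix_fmt yuv420p -b:v 2M \
        -vf scale=1024:576 -r 30 -f dash -dash_segment_type webm -seg_duration 3 \
        -media_seg_name "\$Number\$.chk" -init_seg_name 0.chk output.mpd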
[16:32:12 CEST] <aeikum> hi all, quick question about av_frame ref counting with the new decoder API. in short, is it OK to have more unrefs than refs?  specifically, avcodec_receive_frame() says it calls av_frame_unref(), but this is unbalanced on the very first call.  do i need to call av_frame_ref() first?
[16:34:03 CEST] <aeikum> i guess the intention is that the client can do av_frame_alloc(), repeat receive() with refs managed internally, then av_frame_free(), without having to bother with refcounting at all
[16:34:05 CEST] <aeikum> just wanted to confirm
[16:40:59 CEST] <anill> Hi users, can anyone explain to me how I can use ffmpeg for packetization?
[16:44:11 CEST] <anill> Can i use FFMPEG to extract the H.264 content from an RTP stream?
[16:52:39 CEST] <kepstin> aeikum: calling av_frame_unref on a frame that doesn't have any allocated buffers isn't an error, it'll just reset some of the fields in the frame structure.
[16:52:59 CEST] <aeikum> awesome, makes sense. thank you
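A minimal C sketch of the pattern aeikum describes: one frame allocated up front, reused across avcodec_receive_frame() calls (which unref it before filling it), and freed once at the end. Error handling is abbreviated and the function name is made up for illustration.

    #include <libavcodec/avcodec.h>
    #include <libavutil/frame.h>

    static int drain_decoder(AVCodecContext *dec)
    {
        AVFrame *frame = av_frame_alloc();
        int ret;
        if (!frame)
            return AVERROR(ENOMEM);
        while ((ret = avcodec_receive_frame(dec, frame)) >= 0) {
            /* ... use frame->data / frame->pts here ... */
            av_frame_unref(frame);  /* optional: receive_frame unrefs it anyway */
        }
        av_frame_free(&frame);      /* frees the frame and any buffers it still holds */
        return (ret == AVERROR(EAGAIN) || ret == AVERROR_EOF) ? 0 : ret;
    }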
[17:03:40 CEST] <anill> Can i use FFMPEG to extract the H.264 content from an RTP stream?
[17:04:13 CEST] <DHE> anill: yes, just a question of what you want to write it to. mp4 file?
[17:04:24 CEST] <DHE> or maybe clarify what you mean by packetization
[17:07:39 CEST] <anill> DHE: i mean if i have a decrypted RTP stream, i want to get the raw H.264 contents from it, or mainly look at this https://tools.ietf.org/html/rfc6184#page-16
[17:09:19 CEST] <anill> DHE: i have two requirements for writing: one is writing into a .h264 file and the other is directly into an MP4 file
[17:09:54 CEST] <DHE> okay. that's easy.
[17:10:02 CEST] <anill> great
[17:10:14 CEST] <anill> can you please explain it to me a little bit
[17:10:22 CEST] <DHE> Well I'm guessing it would be enough to do: ffmpeg -i rtp://.....   -c copy -map 0:v output.mp4 -c copy -map 0:v output.h264
[17:10:39 CEST] <DHE> will write both an .mp4 file and a raw h264 variant
[17:11:05 CEST] <DHE> actually hold on. I don't think ffmpeg recognizes that as a default format...
[17:11:31 CEST] <DHE> okay, I'm wrong. it does.
[17:11:50 CEST] <anill> can i give it a file rather than rtp://....
[17:11:58 CEST] <DHE> sure
[17:12:59 CEST] <anill> another thing: there are 3 packetization modes: Single NAL Unit Mode, Non-Interleaved Mode and Interleaved Mode
[17:13:11 CEST] <anill> as shown here https://tools.ietf.org/html/rfc6184#page-16
[17:13:52 CEST] <anill> if i run the above command what would be the packetization mode?
[17:13:54 CEST] <anill> any idea
[17:14:27 CEST] <DHE> yes I see that. I don't know the specifics here but I expect ffmpeg would output in non-interleaved mode. internally it runs in DTS order which seems to match the description of non-interleaved
[17:15:50 CEST] <zevarito> Does anyone know if h264/hls encoder uses already written .ts segments to build new ones, update m3u8 etc.? or forget about it once it has been written on disk?
[17:15:51 CEST] <anill> DHE: you seem very knowledgeable to me, can i ask you a silly question: do you have any link to documentation stating that ffmpeg runs in DTS order?
[17:20:46 CEST] <DHE> anill: it's part of the doxygen stuff for the raw APIs. the muxer needs monotonically increasing DTS order for its inputs
[17:26:29 CEST] <anill> DHE: can ffmpeg take a pcap file of a decrypted RTP stream and produce the raw H.264 file and MP4 file?
[17:26:47 CEST] <anill> i mean i wanted to know what types of file input can be given to ffmpeg
[17:32:33 CEST] <anill> DHE: u there?
[17:35:47 CEST] <DHE> it won't read a pcap file. you might be able to dump the stream in the pcap to a file and read that maybe?
[17:36:08 CEST] <DHE> maybe. I really don't know if that will work. my gut says no though
[17:36:18 CEST] <kepstin> anill: ffmpeg won't take pcap input, no. the ffmpeg rtp stuff is designed to receive network packets directly, so you'd have to have something to parse it and either send over net or use libavformat directly.
[17:37:17 CEST] <kepstin> there's probably some specialty network analysis tools that can save audio/video from rtp streams in pcap (wireshark might have something?) but it's not something the ffmpeg cli can do.
[17:38:42 CEST] <anill> kepstin: what i do is i capture the RTP stream into a file, it's a raw file, just a dump of all the RTP packets
[17:39:32 CEST] <anill> so as you said i need to use libavformat to dump all the RTP packets into a file, is that it?
[17:39:40 CEST] <kepstin> a dump of the rtp packets isn't sufficient, iirc rtp relies on external framing instead of storing packet lengths internally so you need to at least store the length with each packet.
[17:40:05 CEST] <kepstin> (i could be wrong about that, been a while since i've worked with this stuff)
[17:40:44 CEST] <anill> whats iirc?
[17:40:56 CEST] <kepstin> "if i recall correctly"
[17:41:15 CEST] <kepstin> (or "remember")
[17:42:22 CEST] <anill> do u have any idea about the packetization mode in ffmpeg?
[17:42:56 CEST] <voip_> hi guys
[17:43:02 CEST] <voip_> I need to take an rtmp live stream and push it out over multicast UDP without transcoding video/audio.
[17:43:02 CEST] <voip_> Please help with the right command
[17:43:29 CEST] <anill> kepstin: do u have any idea about the packetization mode in ffmpeg?
[17:43:48 CEST] <kepstin> anill: packetization mode of input? output? what format?
[17:44:30 CEST] <DHE> voip_: ffmpeg -i rtmp://......  -c copy -f mpegts -pkt_size 1316 -muxrate 8M -bitrate 8M udp://239.0.0.1:12345
[17:45:03 CEST] <DHE> where the source video is CBR and never exceeds ~8 megabits. substitute in the options you want like multicast IP/port and of course the rtmp source
[17:45:14 CEST] <voip_> DHE, thank you!
[17:45:15 CEST] <kepstin> voip_: although given your name, maybe you want rtp instead?
[17:45:54 CEST] <voip_> its:)))) udp is perfect :))
[17:46:29 CEST] <kepstin> there's lots of ways to put video into udp, and you didn't say which way you wanted.
[17:47:03 CEST] <anill> kepstin: although DHE helped me a lot with this, what i want is: i have a decrypted RTP stream and i want the H.264 data from it, there are packetization modes defined here https://tools.ietf.org/html/rfc6184#page-16, and i want to get output files in .h264 and .MP4
[17:47:11 CEST] <kepstin> mpeg-ts is used in iptv broadcast stuff with multicast udp, so that's fairly common at least.
[17:47:37 CEST] <voip_> DHE one more question: for monitoring and restarting ffmpeg, what program would you recommend? I googled for "monit", but didn't find a good guide
[17:48:12 CEST] <kepstin> anill: that packetization mode is being done by whatever is sending the video, not by ffmpeg... ?
[17:52:24 CEST] <anill> kepstin: are you asking me or telling me? confused
[17:55:13 CEST] <anill> kepstin: u there ?
[17:56:15 CEST] <kepstin> anill: I don't know what you're trying to ask - the information about packetization format is for the rtp stream, which is not what you're writing with ffmpeg.
[17:57:41 CEST] <anill> kepstin: You mean to say ffmpeg has nothing to do with the packetization mode, right?
[17:58:35 CEST] <kepstin> well, it has to be able to parse whatever mode you're sending, but I expect that it can handle common standard modes fine.
[18:02:24 CEST] <anill> so this command ffmpeg -i rtp://.....   -c copy -map 0:v output.mp4 -c copy -map 0:v output.h264 will write the rtp://... stream into the .h264 and MP4 formats
[18:04:48 CEST] <anill> kepstin: am i correct?
[18:05:57 CEST] <kepstin> anill: it should yeah.
[18:06:32 CEST] <kepstin> anill: that should work to capture a live (network) rtp stream that is being sent to the computer ffmpeg is running on.
[18:07:51 CEST] <anill> kepstin: ya but in my case i have the stream being captured into a file, and i also have no clue how to give a file as input to ffmpeg
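One practical detail not spelled out above: to receive a live RTP H.264 stream, the ffmpeg CLI normally needs an SDP file describing the payload rather than a bare rtp:// URL. A hedged sketch (stream.sdp and the output names are placeholders):

    ffmpeg -protocol_whitelist file,udp,rtp -i stream.sdp \
        -c copy -map 0:v output.mp4 -c copy -map 0:v output.h264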
[18:14:36 CEST] <voip_> Guys, for monitoring ffmpeg and restarting it if it stops i am going to use the Monit utility.
[18:14:36 CEST] <voip_> But I didn't find any "best practice" guide on how to configure it for ffmpeg
[18:14:36 CEST] <voip_> Or can you recommend other software?
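Monit aside, a crude alternative is a shell wrapper that simply restarts ffmpeg whenever it exits (the source and destination URLs are placeholders based on DHE's earlier example):

    #!/bin/sh
    while true; do
        ffmpeg -i rtmp://example.com/live/stream -c copy -f mpegts \
            "udp://239.0.0.1:12345?pkt_size=1316"
        sleep 2   # avoid a tight respawn loop while the source is down
    done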
[19:20:41 CEST] <wilsoniya> Does ffmpeg maintain a corpus of videos of various a/v codecs and container formats (both valid and potentially invalid/corrupted) which are used for testing?
[19:28:06 CEST] <leif_> Does anyone know what might cause ffmpeg to output the warning: "Could not update timestamps for skipped samples"?
[19:28:28 CEST] <leif_> (And the resulting file is very short. Like, less than a second of the two-minute file I was expecting.)
[19:29:01 CEST] <klaxa> wilsoniya: there is https://www.ffmpeg.org/fate.html
[19:33:05 CEST] <leif_> kepstin: Err...I didn't use the command line tool, but the C API.
[19:33:29 CEST] <kepstin> well, that's helpful to know too :)
[19:33:55 CEST] <leif_> When I use the command line, it works fine with the exact same filtergraph. Which makes me think it's a problem with seeking.
[19:34:16 CEST] <leif_> (But the thing is I get it even when I don't do any seeking.)
[19:35:50 CEST] <kepstin> kinda hard to tell what's going on without any sample code or information about the input, but a guess from the ffmpeg code is that you might have an issue with how the stream timebase is configured.
[19:38:21 CEST] <leif_> kepstin: Mmm...that might make sense. Would that be a thing I have to set manually, or a thing ffmpeg guesses at automatically?
[19:38:51 CEST] <leif_> (My codebase is very asynchronous, making it hard to make a small sample. But I'll see if I can put one together.)
[19:39:12 CEST] <kepstin> there's some steps you're required to do to make sure information is passed between different parts of the code, maybe look at some of the examples for reference?
[19:42:57 CEST] <leif_> kepstin: Hmm...I've done that. But it's possible I've missed something.
[19:43:03 CEST] <leif_> I'll try to get a small sample together.
[19:46:54 CEST] <leif_> https://gist.github.com/LeifAndersen/4fe6316aa7a35a243a161c3d9c1877c7
[19:47:14 CEST] <leif_> kepstin: Okay, that is (more or less) what I do to open a file.
[19:47:43 CEST] <kepstin> oh, fun, you're going through some kind of bindings rather than using the api directly :/
[19:48:33 CEST] <leif_> kepstin: Yes I am. Although I am the poor chump responsible for making those bindings...
[19:48:53 CEST] <kepstin> I was gonna suggest going to the person who developed the bindings for help, but, uh... :)
[19:48:54 CEST] <leif_> I can easily translate it into C proper if you'd like.
[19:49:02 CEST] <leif_> lol...
[19:51:20 CEST] <leif_> If I do translate it into C, is there any easy way to test if it's right?
[19:53:42 CEST] <kepstin> not sure what you mean. compile and run it, see if it works?
[19:54:53 CEST] <leif_> kepstin: Err...it's the 'see if it works' part that's hard. Because if I do translate it to C, then all I have is a stream of data. So I'm not sure how I can make sure that stream is 'the right stream'(tm).
[19:57:17 CEST] <kepstin> tbh, I'd suggest attempting to write code that does what you want in C, then working on your bindings afterwards so they replicate what you've got working in C
[19:58:03 CEST] <kepstin> note that ffmpeg's api is a very synchronous design, so building something async on top of it is probably gonna be difficult - and you have to be careful with threads
[20:00:54 CEST] <leif_> kepstin: Oh ya. It's all single-threaded. It just uses callbacks to sort of jump back and forth between encoding and decoding.
[20:01:34 CEST] <leif_> or at least, it only uses one OS level thread. I do use a few 'green' threads so to speak.
[20:01:40 CEST] <leif_> Anyway, thanks for the suggestion. :)
[20:03:50 CEST] <wilsoniya> klaxa: exactly what I needed! Thank you!
[22:24:52 CEST] <Zexaron> Hello
[22:26:08 CEST] <Zexaron> I'm not sure why a program that uses a specific ffmpeg version has to check which version it's using, or maybe I'm understanding that wrong; it has stuff like this: #if (LIBAVCODEC_VERSION_MICRO >= 100 && LIBAVCODEC_VERSION_INT >= AV_VERSION_INT(57, 33, 100)) ||  \
[22:26:58 CEST] <durandal_1707> mainly to check if it is Libav or FFmpeg...
[22:29:27 CEST] <Zexaron> durandal_1707_: but FFmpeg is custom-built and integrated into the program without the need for shared DLLs, it's as specific as you can get, and only the developer that includes ffmpeg support knows the details of the custom feature set he used when building those ffmpeg libs (unless that can be analyzed in the .lib)
[22:30:41 CEST] <Zexaron> This whole thing is being rewritten from top to bottom, there may be old code from 2009 still there, it's at this line: https://github.com/dolphin-emu/dolphin/blob/master/Source/Core/VideoCommon/AVIDump.cpp#L69
[22:31:07 CEST] <Zexaron> Sorry, not 2009, ffmpeg got introduced quite a bit later
[22:35:32 CEST] <Zexaron> This should explain things https://github.com/dolphin-emu/dolphin/commit/04158dfe158785b9fcb3c11278d748fb522d814d#diff-6a6d4269338abd71eaff037290b6f3d5
[22:35:42 CEST] <Zexaron> Still relevant for 4.0 ?
[22:36:42 CEST] <Zexaron> All the old stuff is being removed, 4.0 will be supported by default and preferred, there will be an override to use different versions, but those won't be supported
[22:37:01 CEST] <Zexaron> So I don't care about any other version except ffmpeg 4.0
[22:37:55 CEST] <JEEB> then just keep the stuff that builds with 4.0, the rest is just backwards compat for non-codecpar versions, which by now are quite old
[22:41:36 CEST] <Zexaron> The goal is to go straight to hw accel codecs, so this non-codecpar stuff might not be needed; if that's not working right, even if cpu codecs are allowed, the newer stuff will be used. But i don't know what codecpar even is yet
[22:42:16 CEST] <Zexaron> JEEB: the ffmpeg currently used is 3.4.2 I think, I already updated it to 4.0 and had no trouble building with existing code
[22:42:30 CEST] <Zexaron> in my dev branch
[22:42:37 CEST] <JEEB> 3.4 already had codecpar anyways. it's by now old, so all that's to support some really older stuff
[22:43:30 CEST] <Zexaron> Okay, just making sure, thanks.
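For reference, the check Zexaron quotes typically expands to something like the following sketch (the HAVE_CODECPAR macro name is illustrative, not taken from the Dolphin source). FFmpeg sets the micro version component to 100 or above while Libav keeps it below 100, which is how the two forks are told apart before gating on a minimum API version:

    #include <libavcodec/version.h>

    #if (LIBAVCODEC_VERSION_MICRO >= 100 && \
         LIBAVCODEC_VERSION_INT >= AV_VERSION_INT(57, 33, 100))
    #define HAVE_CODECPAR 1   /* the newer AVCodecParameters-based API is available */
    #endif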
[23:59:13 CEST] <gcl_5> i get some repetitive warnings so i disable them with -loglevel error, but i still want "the progress bar" -> time=00:35:24.87 bitrate=
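No one answered this in the log, but if I recall correctly, explicitly passing -stats keeps that progress line even when the log level is lowered, e.g. (file names and codecs are placeholders):

    ffmpeg -loglevel error -stats -i input.mkv -c:v libx264 output.mp4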
[00:00:00 CEST] --- Fri Jun  1 2018

