[Ffmpeg-devel-irc] ffmpeg.log.20190322

burek burek021 at gmail.com
Sat Mar 23 03:05:02 EET 2019


[00:06:15 CET] <xxxmaker> what is the command to see what libavcodec version my ffmpeg build is using
[01:31:48 CET] <pink_mist> xxxmaker: uhm, literally just running 'ffmpeg' on the commandline shows it
[01:51:34 CET] <net|> is there a flag to make the -fs 1M portion reoccurring to overwrite itself without relaunching ffmpeg ?
[01:52:37 CET] <net|> or would i have to use webm_chunk somehow
[01:53:14 CET] <net|> maybe a frame limiter would work better
[01:54:22 CET] <net|> maybe moving it to a secondary tmp file every minute or 3
[01:54:36 CET] <kepstin> net|: i'd suggest using the segment muxer
[01:55:38 CET] <kepstin> hmm, it doesn't support breaking by file size. i'm actually kind of surprised by that.
[01:55:42 CET] <kepstin> but it can do by time
[01:56:39 CET] <furq> i take it this is a live input
[01:56:45 CET] <net|> yes
[01:56:50 CET] <furq> nvm then
[01:56:55 CET] <net|> it's for streaming webm with the html video tag
[01:57:00 CET] <furq> mkvmerge can split by size but that's no use to you
[01:57:32 CET] <net|> i have autorefresh html code to make it better
[01:58:06 CET] <net|> kepstin how do you do it by time ?
[01:58:35 CET] <kepstin> for streaming webm, you should probably look into using dash
[01:59:31 CET] <net|> does DASH work with html video tag ? i tried with the hdr file but maybe my video options were out
[01:59:41 CET] <net|> even vlc wont read it
[02:01:24 CET] <net|> ffmpeg -i input.mp4 -c copy -map 0 -segment_time 00:20:00 -f segment -reset_timestamps 1 output%03d.mp4
[02:02:32 CET] <net|> i wonder if i took the %03d option out if it would overwrite itself
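A minimal sketch of the live-DASH approach kepstin points at, as an alternative to the segment command above (input URL, encoder settings, and segment length are placeholders, and the seg_duration option assumes a reasonably recent ffmpeg build):

    ffmpeg -i <live-input> -c:v libvpx-vp9 -b:v 2M -c:a libopus -f dash -use_template 1 -use_timeline 1 -window_size 5 -seg_duration 4 stream.mpd

The resulting stream.mpd is then played with a DASH-capable player such as dash.js rather than a bare html video tag.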
[04:00:22 CET] <aicpro> even with a maximum bitrate of 7000 and a CRF of 10 set I'm consistently getting artifacts like this https://my.mixtape.moe/frfvsz.png
[04:00:38 CET] <aicpro> the q value of ffmpeg's output is regularly 25-30 as well
[04:10:06 CET] <furq> that's just yuv420p
[04:10:11 CET] <furq> there's nothing you can do about that
[04:10:47 CET] <furq> other than using a different colour scheme
[04:13:52 CET] <aicpro> any suggestions?
[04:14:28 CET] <furq> not other than the one i just gave
[04:14:42 CET] <aicpro> any suggestions on another color scheme
[04:14:44 CET] <furq> you could use a different pixel format but youtube or whatever will just convert it to 4:2:0 anyway
[04:14:47 CET] <furq> oh right
[04:15:08 CET] <furq> either use a bigger font or use colours that have more luma contrast
[04:15:25 CET] <furq> white on black or vice versa will look fine
[04:17:20 CET] <aicpro> alright
[12:52:32 CET] <danieel> hello, we are feeding ffmpeg a raw h264 stream in order to make a MOV file. What are the possibilities to fix dropped frames? Can we edit the NAL before it enters ffmpeg, to say "i am twice that long?", or is there a special "no difference" NAL blob we can just insert to correct for delay ?
[12:52:51 CET] <danieel> the application uses multiple cameras, and dropping frames makes the streams asynchronous :(
[12:53:25 CET] <JEEB> just set timestamps right and then things should be OK?
[12:53:53 CET] <BtbN> raw h264 doesn't exactly have proper timestamps, even though in theory it does
[12:53:54 CET] <JEEB> it only becomes a problem if you're still passing monotonically rising timestamps even though there's been a skip in packets
[12:54:01 CET] <JEEB> BtbN: AVPackets do have timestamps
[12:54:09 CET] <JEEB> oh, he means ffmpeg.c?
[12:54:11 CET] <JEEB> perkele
[12:54:12 CET] <JEEB> :V
[12:54:14 CET] <BtbN> If the input is raw h264 though...
[12:54:36 CET] <JEEB> yes, but the camera or whatever should have timestamps (in the worst case the receipt timestamps if there absolutely is no better way)
[12:54:37 CET] <danieel> the h264 is produced by a hw encoder (nvidia) outside of ffmpeg, i can see by parsing it whether it is an I or P frame
[12:55:01 CET] <BtbN> Why are you encoding with nvenc, and then feeding it to ffmpeg?
[12:55:02 CET] <danieel> hoping there is also a timestamp or duration field in it, which i could edit
[12:55:22 CET] <JEEB> danieel: raw annex b very likely doesn't have that :P
[12:55:43 CET] <danieel> the frames come from argus, and then we inspect images (e.g. to sync the streams between each other)
[12:55:51 CET] <BtbN> h264 bitstream in theory has timestamps. They are usually not set to anything meaningful though, and the container is used for that.
[12:56:49 CET] <danieel> so better to look into crafting  a no difference P frame?
[12:57:00 CET] <BtbN> Better use a container
[12:57:12 CET] <BtbN> Proper timestamps are the only way to make this work reliably
[12:57:54 CET] <danieel> container is now handled by ffmpeg, so it does also do some implicit timestamping based on -r FPS option, right?
[13:01:17 CET] <JEEB> ahahahaha
[13:01:19 CET] <JEEB> no
[13:01:25 CET] <JEEB> that is an output option :P
[13:01:46 CET] <JEEB> and even when that one patch lands for -r to work with input, it will not fix what you need
[13:02:01 CET] <JEEB> the only way to fix your issue is to add proper timestamps to your input
[13:02:37 CET] <danieel> editing the h264 streams, or adding extra data bearing timestamps?
[13:03:36 CET] <danieel> what i am doing now is just popen( 'ffmpeg -y -r $fps -i - -vcodec copy $file' ), and feed the stream from encoder there
[13:03:42 CET] <BtbN> There have to be timestamps where that comes from
[13:03:49 CET] <BtbN> don't use raw h264, ever
[13:04:09 CET] <BtbN> It's interpreted frame by frame based on a constant framerate you set
[13:04:16 CET] <JEEB> no, it doesn't even do that yet
[13:04:20 CET] <JEEB> the patch for that is still on the ML
[13:04:21 CET] <JEEB> :P
[13:04:27 CET] <JEEB> so it effectively does nada
[13:04:56 CET] <danieel> this gives me a file which can be used though :) and was the easiest way to achieve a container
[13:05:24 CET] <JEEB> except you already found an issue with the fact that you're dropping stuff in the middle :P
[13:05:29 CET] <JEEB> and suddenly your crap falls apart
[13:05:58 CET] <JEEB> I don't care how you transfer the timestamps: either transfer them some other way and use the FFmpeg API with AVPackets having proper timestamps
[13:06:17 CET] <JEEB> or if you properly create your stream from the encoder with a container with timestamps
[13:07:00 CET] <JEEB> I hope the reason I'm saying this makes sense to you?
[13:08:44 CET] <danieel> yes, the solution is to use the full api, not pass raw data by stdin
[13:12:48 CET] <lixus> i recorded a screencast with ffmpeg without audio. now i would like to record an audio track for that video. is it possible to play back the video and record audio annotations at the same time using ffmpeg tools ?
[13:12:55 CET] <danieel> there might be a slight problem with that - the output would be a VFR (variable frame-rate) video, which would cause trouble with software that does not understand it
[13:13:14 CET] <JEEB> danieel: you can always make it stable X Hz
[13:13:16 CET] <JEEB> that's the simpler part
[13:13:17 CET] <JEEB> :P
[13:13:21 CET] <JEEB> there's an fps filter, among other things
[13:14:56 CET] <danieel> what will that do to the h264 stream that has a jump in timestamp here and there? hopefully not frame interpolation; is it just modifying the time-to-frame mapping table?
[13:15:42 CET] <BtbN> Without proper input timestamps, it will probably just rearrange the frames
[13:16:05 CET] <JEEB> danieel: the fps filter is really simple; it will either drop or duplicate
[13:16:20 CET] <JEEB> oh wait, you didn't want to re-encode, right? :P
[13:16:31 CET] <danieel> that is easy to do with uncompressed video, but we already have a bitstream :)
[13:16:39 CET] <danieel> yes exactly
[13:17:03 CET] <JEEB> "have fun with that". but at least your input is known to have timestamps so you can have your own logic to handle it how you want
[13:18:33 CET] <danieel> well the only viable solution is then to craft the zero difference frame nal blob
[13:18:50 CET] <danieel> and keep it fixed rate
[13:19:55 CET] <danieel> shame there is no easy definition of such a stuffing frame
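For reference, a minimal use of the fps filter JEEB mentions looks roughly like this (it implies a full decode and re-encode, which is exactly what danieel wants to avoid here; the frame rate and encoder are placeholders):

    ffmpeg -i input.h264 -vf fps=25 -c:v libx264 output.mp4

The filter duplicates or drops frames as needed to produce constant-frame-rate output.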
[13:26:33 CET] <xxxmaker> what is the command to see  what x264 and x265 version  my ffmpeg build is using
[13:29:05 CET] <BtbN> You need to read the documentation from the person who built it for that.
[13:32:43 CET] <xxxmaker> btbn what is the command you would use for your build
[13:32:51 CET] <xxxmaker> that you are using now
[13:32:53 CET] <BtbN> There is no command.
[13:33:17 CET] <BtbN> That's why you need to read the documentation. To find the version.
[13:33:35 CET] <xxxmaker> are you sure there is no command to find out?
[13:33:48 CET] <BtbN> x264 might print some stuff to stdout when you use it, but other than that, libx264.c has no notion of querying the version
[13:35:21 CET] <xxxmaker> well i am using 4.1.1
[13:40:57 CET] <BtbN> That doesn't mean much
[15:22:36 CET] <another> x264/5 do indeed print version information when you encode with it
[15:30:08 CET] <xxxmaker> another but is there way to find the version  by typing command?
[15:30:59 CET] <aboxer__> I'm looking for sample mpeg2 videos with CEA 708 closed captions. Anyone know where I can find such ?
[15:31:31 CET] <xxxmaker> aboxer why does it have to be 708 CC ?
[15:51:06 CET] <another> xxxmaker: ffmpeg -y -f lavfi -i testsrc=duration=1 -c:v libx264 -f matroska /dev/null -c:v libx265 -f matroska /dev/null 2>&1 | egrep 'core|version'
[15:51:30 CET] <xxxmaker> that won't work in cmd.exe
[15:53:39 CET] <ksk> install linux subsystem for windows :P
[15:56:25 CET] <another> ffmpeg -y -f lavfi -i testsrc=duration=1 -c:v libx264 -f matroska NUL -c:v libx265 -f null - 2>&1 | egrep 'core|version'
[15:58:11 CET] <xxxmaker> 'egrep' is not recognized as an internal or external command,
[15:58:11 CET] <xxxmaker> operable program or batch file.
[15:58:19 CET] <another> ah wait. windows doesn't have grep
[15:59:31 CET] <another> ffmpeg -y -f lavfi -i testsrc=duration=1 -c:v libx264 -f matroska NUL -c:v libx265 -f null - 2>&1 | findstr /r 'core|version'
[16:00:08 CET] <xxxmaker> 'version'' is not recognized as an internal or external command,
[16:00:08 CET] <xxxmaker> operable program or batch file.
[16:03:15 CET] <xxxmaker> i have a 2 hour video: what is the best "command" to create a 1 minute sample
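For the 1-minute-sample question, a stream copy of an arbitrary minute is usually the simplest approach (the seek position and filenames are placeholders; with -c copy the cut lands on the nearest keyframe, so the start point may shift slightly):

    ffmpeg -ss 00:30:00 -i input.mp4 -t 60 -c copy sample.mp4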
[16:04:51 CET] <Hello71> ' isn't a quote in cmd.exe
[16:04:59 CET] <Hello71> or I think windows in general
[16:05:54 CET] <another> well, okay. windows doing windows things
[16:06:06 CET] <another> put it in double quotes then
[16:06:54 CET] <xxxmaker> another nothing happened
[16:06:58 CET] <xxxmaker> no error nothing
[16:07:06 CET] <xxxmaker> when using double quotes
[16:10:39 CET] <Hello71> I think you missed /i
[16:11:36 CET] <another> no it's all lowercase
[16:12:55 CET] <Hello71> probably x265 not included
[16:15:12 CET] <another> maybe. but honestly i don't want to debug this anymore
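For what it's worth, the usual fix for the findstr variant is double quotes plus /i: findstr treats space-separated search strings as alternatives, so the command below is roughly equivalent to the egrep version (it assumes both libx264 and libx265 are actually compiled into the build, which Hello71 questions above):

    ffmpeg -y -f lavfi -i testsrc=duration=1 -c:v libx264 -f matroska NUL -c:v libx265 -f null - 2>&1 | findstr /i "core version"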
[16:15:35 CET] <xxxmaker> okay forget x264/x265, how do i tell what version of "aac" and "libfdk_aac" my ffmpeg build comes with
[16:16:02 CET] <another> aac is a native encoder
[16:16:12 CET] <xxxmaker> yes i know but i am sure it still has a version #
[16:16:47 CET] <DHE> I don't think the fdk version is exported, unless it's dynamically linked in which case check out the .so/.dll
[16:19:12 CET] <xxxmaker> i don't understand why this is so difficult
[16:21:47 CET] <furq> patches welcome
[16:22:17 CET] <JEEB> fdk-aac's license is not compatible with a certain other license that f.ex. x264 and x265 are under :P
[16:22:21 CET] <JEEB> and that is by design :P
[16:22:27 CET] <JEEB> from Google's side
[16:22:33 CET] <JEEB> (and fraunhofer's)
[16:23:00 CET] <JEEB> so if you make a build with fdk-aac and something that doesn't work with GPL, you cannot distribute it
[16:23:06 CET] <JEEB> that's why it requires enable-nonfree :P
[16:23:20 CET] <another> aac version = ffmpeg version
[16:23:20 CET] <xxxmaker> jeeb what does all this have to do with just wanting to find the version #
[16:23:30 CET] <JEEB> ah
[16:23:47 CET] <JEEB> yea I'm not sure if fdk-aac gives out its version :P
[16:24:02 CET] <Hello71> last time I think the answer was grep
[16:24:23 CET] <xxxmaker> another  not necessarily, 4.1.2 ffmpeg might have other codec bug fixes but no fixes for the aac codec
[16:24:29 CET] <Hello71> well then the question is why did you download a frankenstein ffmpeg
[16:24:38 CET] <JEEB> separate encoders or decoders are not versioned
[16:24:44 CET] <JEEB> the libavcodec library is versioned
[16:24:53 CET] <xxxmaker> Hello71 what is "frankenstein ffmpeg"?
[16:26:45 CET] <xxxmaker> why is this so difficult, i can find the  libopus version
[16:26:57 CET] <furq> use libopus then
[16:26:58 CET] <xxxmaker> but why can't i find "aac" or "lib-fdk" version
[16:27:30 CET] <xxxmaker> furq that's not the point
[16:27:39 CET] <DHE> "aac" version is ffmpeg's version. fdk version isn't available because it's an external application ffmpeg imports and it doesn't tell you what version it is
[16:28:16 CET] <Hello71> smells like xy problem
[16:28:17 CET] <xxxmaker> DHE:  not necessarily  4.1.2 ffmpeg  might have other codec bug fixes but no fixes for aac codec
[16:28:30 CET] <JEEB> sure, but we don't version components within a library
[16:28:35 CET] <JEEB> only the libraries
[16:29:11 CET] <xxxmaker> if nothing was done to the "aac"  why would version # change
[16:29:12 CET] <JEEB> although I'm not sure how often those get bumped in branches :P
[16:29:32 CET] <JEEB> since the bug fixes themselves generally don't touch the library versions
[16:29:51 CET] <DHE> library versions are more of an API compatibility thing, right?
[16:30:08 CET] <JEEB> well, micro version I think can be bumped in theory?
[16:30:18 CET] <DHE> and then the project as a whole gets a release with a version number like 4.1.2
[16:31:47 CET] <xxxmaker> why doesn't ffmpeg write aac-version # on  meta data
[16:31:58 CET] <JEEB> because there is no AAC encoder's version
[16:32:13 CET] <JEEB> we don't version on that level
[16:32:40 CET] <xxxmaker> jeeb what about lib_fdk, i don't see a version # in the meta data either
[16:32:47 CET] <JEEB> then it doesn't export it
[16:32:50 CET] <DHE> well that's their problem
[16:33:23 CET] <xxxmaker> can't you modify it so that it does?
[16:33:56 CET] <JEEB> then everyone would have to patch their fdk-aac the same way
[16:34:10 CET] <JEEB> if that doesn't sound like a bad idea to you then I don't know what does
[16:34:32 CET] <xxxmaker> no idea what that means
[16:34:39 CET] <JEEB> also where the flying asdf would you even put the encoder version?
[16:34:40 CET] <DHE> no, it can't be done
[16:34:57 CET] <DHE> should we disassemble the produced AAC stream, insert the metadata, and reassemble it?
[16:35:05 CET] <DHE> (again, we don't have a version number to insert anyway)
[16:35:20 CET] <xxxmaker> i see version # for lib_opus and lib_flac
[16:35:43 CET] <xxxmaker> and lib_lame
[16:37:18 CET] <DHE> So I checked the source code. There's a version function in fdk, but the version number has only had ~10 different values since 2012
[16:38:26 CET] <DHE> (there's a build date/time but that promises very little)
[16:40:45 CET] <xxxmaker> DHE: can you elaborate on "~10 different values since 2012"
[16:40:54 CET] <xxxmaker> what are some version # examples
[16:42:26 CET] <DHE> 2.2.6 in 2012 to 2.2.8 -> 2.3.0 in 2013 and now we're at 2.3.6
[16:43:27 CET] <another> libfdk recently bumped to 2.0
[16:44:49 CET] <another> also, i don't think ffmpeg supports libflac
[16:45:05 CET] <xxxmaker> another sure it does
[16:45:09 CET] <xxxmaker> DHE i see, thanks
[16:45:09 CET] <furq> it doesn't
[16:46:00 CET] <xxxmaker> ffmpeg -i audio.wav -acodec flac audio.flac
[16:46:02 CET] <xxxmaker> that works
[16:46:11 CET] <furq> that's not libflac
[16:46:19 CET] <xxxmaker> same thing
[16:46:29 CET] <furq> no it's not
[16:46:38 CET] <xxxmaker> how is it not same thing?
[16:47:38 CET] <another> it's a native encoder
[16:49:59 CET] <xxxmaker> another huh? what are you talking about
[16:50:12 CET] <another> also: where do you see the version of libopus?
[16:50:12 CET] <xxxmaker> flac was developed by somebody else not ffmpeg
[16:50:21 CET] <another> yes
[16:50:35 CET] <another> and ffmpeg has its own implementation
[16:50:43 CET] <another> based on flake i think
[16:51:12 CET] <xxxmaker> another why would ffmpeg change flac's implementation
[16:54:41 CET] <xxxmaker> another:  ffmpeg -f lavfi -i sine -acodec libopus R:\test.opus     that shows  libopus version
[16:55:44 CET] <another> not for me
[16:56:27 CET] <another> anyway, gotta go
[17:03:30 CET] <xxxmaker> jeeb so what do you think?
[17:03:59 CET] <JEEB> i don't give a flying fuck to be blunt
[17:04:31 CET] <xxxmaker> jeeb no i mean nevcairiel's actions
[17:04:32 CET] <JEEB> you can make an issue on trac about versioning components if you really want
[17:05:06 CET] <xxxmaker> [08:48] <xxxmaker> nevcairiel   but if ffplay cannot play then you can't prove that it does
[17:05:06 CET] <xxxmaker> [08:48] <@nevcairiel> i dont have to prove anything, to you or anyone
[17:06:10 CET] <JEEB> he is correct, he has no obligation whatsoever. although looking at other FFmpeg API clients like mpv it is flying obvious that the libraries give you what you need in order to implement closed caption support
[17:06:22 CET] <xxxmaker> then ffplay should play it
[17:06:46 CET] <JEEB> ffplay and ffmpeg are example api clients and they are static. thus they don't
[17:07:03 CET] <xxxmaker> no idea what you just said
[17:07:03 CET] <JEEB> if someone cares enough they can implement the support
[17:07:58 CET] <JEEB> xxxmaker: ffmpeg contains the libraries and the tools that utilize the libraries. the tools just suck but the basic building blocks for other applications like handbrake are there
[17:08:11 CET] <JEEB> does this make sense?
[17:08:32 CET] <xxxmaker> yes
[17:08:37 CET] <JEEB> if someone cares enough about the internal tools, support will be added
[17:09:07 CET] <JEEB> well not internal, just the ones delivered with FFmpeg (ffmpeg.c and ffplay.c)
[17:09:47 CET] <JEEB> heck, the libraries support dynamically adding streams and ffmpeg.c will just say it found a new stream and that it ignored it
[17:10:16 CET] <JEEB> because ffmpeg.c is an old example application :p
[17:10:26 CET] <JEEB> it is not all of FFmpeg
[17:11:12 CET] <JEEB> it is easier to add building blocks for applications than try to update ffmpeg.c to be more dynamic :p
[17:11:27 CET] <JEEB> (to use those new building blocks)
[17:14:16 CET] <xxxmaker> why does  ffmpeg -f lavfi -i sine -acodec libfdk_aac -t 30 -vbr 3 test.aac  create corrupted file?  but  ffmpeg -f lavfi -i sine -acodec libfdk_aac -t 30 -vbr 3 test.mp4   does not?
[17:15:08 CET] <xxxmaker> is this a bug?
[17:15:40 CET] <JEEB> post on the issue tracker
[17:15:46 CET] <JEEB> i have no idea
[17:15:57 CET] <xxxmaker> ffmpeg has a weird bug tracker
[17:16:26 CET] <xxxmaker> can i just post it via IRC
[17:16:49 CET] <JEEB> no
[17:49:13 CET] <pkunk> I am generating an ffmpeg command to create a mosaic of 16 input streams using the overlay filter. It works fine but ffmpeg takes 5-6 seconds to open each input stream which means the command takes 2-3 minutes just to start transcoding
[17:49:55 CET] <pkunk> Is there a way to make ffmpeg process all the -i http://path.to/mpegts URLs simultaneously instead of one at a time ?
[17:50:13 CET] <JEEB> probably not due to the design of the ffmpeg.c cli app
[17:50:27 CET] <JEEB> you can see if you want to improve that part of it
[17:50:45 CET] <JEEB> if you lack the skills and/or time, feel free to make a ticket on trac.ffmpeg.org
[17:51:56 CET] <pkunk> JEEB: ah, I've already worked with the libav* C code before so it's doable. I have no idea if ffmpeg.c itself will need drastic changes to do this
[17:52:41 CET] <pkunk> i.e. is it thread-aware enough, or is it just a matter of increasing the thread count in the input processing stage
[17:53:12 CET] <JEEB> yea, or is it just an iterative thing which in theory could be handled through its own thread pool
[17:53:24 CET] <JEEB> because probing all of the inputs doesn't have to be A->B
[17:53:35 CET] <JEEB> at least I can't think of many cases where that would have to be the case
[18:11:47 CET] <kepstin> pkunk: yeah, currently the input handling code in ffmpeg.c is all single-threaded afaik. would probably need some structural changes to fix this.
[18:18:06 CET] <JEEB> yea it depends on how much you want to thread
[18:18:16 CET] <JEEB> if it's just the initial probing, that might be possible to just handle separately?
[19:02:59 CET] <kepstin> eh, if you're doing the initial probing separately, you might as well do the complete input handling in threads, should help a lot of the people doing live network streaming use cases in general.
[19:05:15 CET] <JEEB> sure
[20:09:10 CET] <pkunk> kepstin: Thanks, will have a look at the code tomorrow
[20:31:33 CET] <faLUCE> hello all
[20:33:48 CET] <faLUCE> is there a way to tell from the aviocontext of a muxer whether the muxed packet was a keyframe?
[20:34:16 CET] <JEEB> wrong abstraction level, no
[20:35:28 CET] <faLUCE> in addition: does an audio+video stream being muxed have to start with a video keyframe?
[20:35:40 CET] <JEEB> no?
[20:35:52 CET] <JEEB> I think you've asked this already before and the answer is the same
[20:36:09 CET] <faLUCE> JEEB: I asked about an AUDIO only stream
[20:36:40 CET] <JEEB> then the keyframe part didn't make sense: audio doesn't have keyframes as such, since all formats so far (except AC-4) have every packet be a keyframe
[20:37:44 CET] <faLUCE> JEEB: I know that, but I don't understand why for an audio+video stream I don't need a video keyframe as well
[20:38:24 CET] <JEEB> often it is preferred that you start with a RAP
[20:38:28 CET] <JEEB> (random access point)
[20:38:32 CET] <JEEB> but that doesn't come out of a technical limitation
[20:38:57 CET] <JEEB> for example, people didn't do mp4 fragments that start with non-RAP because f.ex. chrome would have an open bug about that
[20:39:07 CET] <JEEB> even though MPEG-DASH specification would let them do so
[20:39:29 CET] <JEEB> then you have MPEG-TS streams where all the digital TV receivers are used to waiting for RAPs
[20:39:40 CET] <JEEB> so it doesn't make any sense for the muxer to make you wait for a RAP
[20:39:58 CET] <faLUCE> then, JEEB, when an audio+video doesn't start with a video keyframe, a player should play audio only before getting the first video key?
[20:40:12 CET] <JEEB> what a player does is up to the player's specification
[20:40:16 CET] <JEEB> it may play audio only
[20:40:24 CET] <JEEB> it may not play anything until it gets the RAP
[20:40:33 CET] <faLUCE> I see, thanks
[20:40:54 CET] <JEEB> or as was the case with various browsers, it might just explode
[20:41:10 CET] <faLUCE> :-)
[20:41:52 CET] <JEEB> of course, if a format specification says that a mux has to start from a RAP
[20:42:06 CET] <JEEB> then for that specific format the muxer might fail you if you try to feed it a non-RAP
[20:42:11 CET] <JEEB> but that is 100% format specific
[22:04:08 CET] <brimestone> When using filters (-filter_complex with multiple outputs), is exporting a PNG still an option?
[22:04:34 CET] <furq> yes
[22:05:01 CET] <brimestone> Which filter should I use?
[22:05:06 CET] <furq> uh
[22:05:18 CET] <furq> whatever filter you want to apply
[22:19:21 CET] <DHE> filtering is done on raw decoded images. then it goes to the PNG output like normal. filter_complex is only special because it lets you select arbitrary inputs and outputs, including multiple of each type from different sources, etc
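A minimal sketch of what brimestone is asking about, with one -filter_complex graph feeding both a video encode and a single PNG (the filter chain, encoder, and filenames are placeholders):

    ffmpeg -i input.mp4 -filter_complex "[0:v]scale=1280:-2,split=2[vid][img]" -map "[vid]" -c:v libx264 out.mp4 -map "[img]" -frames:v 1 still.png

The split filter duplicates the filtered stream so one branch feeds the movie output and the other the single-frame PNG output.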
[23:31:33 CET] <__raven__> hi
[23:33:59 CET] <__raven__> how do you set fixed header information about fps in an flv container?
[23:34:09 CET] <__raven__> we need to remux an rtmp stream
[00:00:00 CET] --- Sat Mar 23 2019

