[Ffmpeg-devel-irc] ffmpeg.log.20190803

burek burek021 at gmail.com
Sun Sep 15 22:22:12 EEST 2019


[01:49:36 CEST] <dastan> hello people
[01:50:07 CEST] <dastan> i am searching for a way to pass one input stream through to two different ffmpeg processes
[01:55:34 CEST] <DHE> dastan: could one ffmpeg process produce two different outputs for you instead?
[02:10:03 CEST] <dastan> DHE
[02:10:24 CEST] <dastan> i know but i need to use h264 for one process and prores for the other
[02:10:44 CEST] <dastan> and the speed of the codecs is different
[02:11:43 CEST] <DHE> with a realtime source?
[02:11:53 CEST] <dastan> yep
[02:11:59 CEST] <dastan> the input is youtube
[02:44:49 CEST] <Hello71> weren't you just asking that here
[02:50:36 CEST] <martmists> Hey everyone, I'm trying to use ffmpeg to stitch videos together. Half of them are 1080p60, half of them are 640x400 at 25fps. I seem to be unable to convert these 640x400 at 25 files to 1080p60, no matter what I do. The code for the entire script is here: https://pastebin.com/2hDmkuDy
[02:51:12 CEST] <furq> martmists: -filter:v and -vf are the same thing and you can only set it once
[02:51:24 CEST] <furq> you want -vf scale=1920:1080,fps=60
[02:51:40 CEST] <martmists> I see
[02:51:41 CEST] <furq> and ,setpts as well for the first one
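Spelled out, furq's suggestion amounts to something like the following, with file names as placeholders:

    ffmpeg -i clip_640x400.flv -vf "scale=1920:1080,fps=60" clip_1080p60.mp4

Both filters live in a single -vf chain; -vf and -filter:v are the same option, so it can only be given once per output stream, as noted above.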
[02:52:32 CEST] <martmists> now i'm getting `[NULL @ 0000027cc396dac0] Unable to find a suitable output format for 'scale=1920:1080'`
[02:53:00 CEST] <furq> you've got a stray option with no argument somewhere
[02:53:19 CEST] <stevessss> I guess I could compile an armeabi version of libffmpeg in a way that logs all library calls and then swap that into my zyplay app, which doesn't say how it gets the video url
[02:55:43 CEST] <martmists> hm, now I specify 1920:1080 and it sets it to 1920:1072...
[02:57:22 CEST] <martmists> also, the original flv files are now sped up like 10 times?
[03:01:42 CEST] <martmists> well not the original files, but the parts of the video corresponding to those files
[03:04:57 CEST] <martmists> @furq any clue?
[03:13:50 CEST] <kepstin> martmists: you added a setpts filter to speed them up...
[03:14:39 CEST] <martmists> but that only applies to the generated files, and even if I removed it the flv files were still sped up
[03:14:51 CEST] <kepstin> your final concat command has a setpts filter
[03:15:02 CEST] <martmists> removed that too
[03:16:14 CEST] <kepstin> if "ffmpeg -i input.mp4 -vf scale=1920:1080,fps=60 output.mp4" changes the video speed then there's something really weird going on.
[03:17:12 CEST] <kepstin> (possibly an issue where it's reading the input file incorrectly)
[03:17:40 CEST] <kepstin> if that's the case, i'd want to see the complete output of "ffmpeg -i input.flv" or whatever
[03:17:47 CEST] <martmists> no, the last ffmpeg command changes the video speed of the flvs
[03:18:58 CEST] <martmists> https://pastebin.com/wHvGf28Q here's one of the files
[03:26:23 CEST] <kepstin> hmm, that seems fine. what's the complete command line and output of the last ffmpeg command, as you ran it?
[03:27:45 CEST] <martmists> https://pastebin.com/Ek84ZGMC
[03:28:25 CEST] <martmists> I terminated it early to see if the file was still sped up, and it was
[03:58:05 CEST] <kepstin> hmm. my guess is that the inputs aren't all properly matching
[03:58:35 CEST] <kepstin> i note that at least the pixel formats don't match, you should probably be using "-pix_fmt yuv420p" when encoding all the parts.
[03:59:36 CEST] <kepstin> really hard to say what's up with all these separate files.
[04:04:01 CEST] <kepstin> the concat demuxer is kind of weird to work with, you really want to make sure all your inputs are perfectly matching.
[04:05:08 CEST] <kepstin> for this sort of thing, i'd normally recommend using the concat filter instead. You can probably do the whole video building in a single ffmpeg command by programmatically building a filter script, too.
[04:05:33 CEST] <martmists> How would I go about doing that?
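A concat-filter version of the final join could look roughly like this (two hypothetical parts shown; each gets normalized to the same size, rate, and pixel format inside one filtergraph, which also covers the yuv420p advice above):

    ffmpeg -i part1.mp4 -i part2.flv -filter_complex \
      "[0:v]scale=1920:1080,fps=60,format=yuv420p[v0];[1:v]scale=1920:1080,fps=60,format=yuv420p[v1];[v0][0:a][v1][1:a]concat=n=2:v=1:a=1[v][a]" \
      -map "[v]" -map "[a]" joined.mp4

The audio streams may also need an aresample or aformat step if their sample rates or channel layouts differ; the concat filter expects matching parameters per segment, the same requirement kepstin described for the concat demuxer.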
[07:05:52 CEST] <rmmh> How do I get vp8 to output yuv444p? I've tried `-vcodec libvpx -vf format=yuv444p` and `-pix_fmt yuv444p`, but they still use yuv420p. (using ffmpeg 4.1.3)
[08:10:51 CEST] <furq> rmmh: i don't think vp8 supports yuv444p
[08:10:58 CEST] <furq> the libvpx wrapper in ffmpeg definitely doesn't
[08:13:03 CEST] <rmmh> Ah. From the RFC, "VP8 works exclusively with an 8-bit YUV 4:2:0 image format." I found some commits adding yuv444p to the VP9 wrapper, and got confused.
[08:15:29 CEST] <furq> https://clbin.com/FUET1
[08:15:31 CEST] <furq> that's the quickest way to check
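One quick way to check what pixel formats an encoder wrapper accepts is to ask ffmpeg directly:

    ffmpeg -h encoder=libvpx

The "Supported pixel formats" line for the VP8 wrapper only lists 4:2:0 variants, while `ffmpeg -h encoder=libvpx-vp9` shows the yuv444p support that was added to the VP9 wrapper.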
[08:26:27 CEST] <jbaxter> Hello! I'm trying to use the ffmpeg cli to transcode/add AAC audio streams to a bunch of videos. On some videos it errors out. I've stripped out the actual transcoding arguments and I'm having trouble with a few files, just doing a copy: https://pastebin.com/TfbZJfc8
[08:26:40 CEST] <jbaxter> the error is: Could not write header for output file #0 (incorrect codec parameters ?): Invalid data found when processing input
[08:26:43 CEST] <jbaxter> any suggestions?
[08:31:21 CEST] <jbaxter> Sorry I think the more relevant error is actually this: [adts @ 0x56130eac5720] MPEG-4 AOT 0 is not allowed in ADTS
[08:31:45 CEST] <furq> what's $dest set to
[08:33:54 CEST] <jbaxter> dest=$file.aac
[08:34:00 CEST] <jbaxter> Oh dear
[08:34:05 CEST] <jbaxter> I think I see the problem
[08:34:34 CEST] <jbaxter> I might be an idiot
[08:36:19 CEST] <jbaxter> Well that's now confirmed. Thanks for the help ;)
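For anyone hitting the same error: a .aac destination selects the ADTS muxer, which can only carry plain AAC, so stream-copying the source audio into it fails at header-writing time. A minimal sketch of the job described at the start (keep the video, actually encode the audio to AAC), with the output name as a placeholder:

    ffmpeg -i "$file" -c:v copy -c:a aac -b:a 192k "${file%.*}_aac.mkv"

Writing to a container that can hold both streams avoids forcing everything through the ADTS muxer.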
[11:49:34 CEST] <JEEB> Guest48: the command line apps were made for file->file usage originally so they probably require a lot of stuff from the AVFormat/Contexts
[11:49:46 CEST] <JEEB> which requires probing
[11:49:51 CEST] <JEEB> is the probing the problem for you or what?
[11:49:56 CEST] <JEEB> why are you trying to minimize it?
[11:50:03 CEST] <JEEB> or more like
[11:50:05 CEST] <JEEB> "do you need to"
[11:50:31 CEST] <JEEB> also I hope you have posted your command line and full terminal output with -v verbose or so on a pastebin or so
[11:50:36 CEST] <JEEB> and linked it here
[11:50:42 CEST] <JEEB> to show where the actual problem is :P
[11:51:20 CEST] <JEEB> as in, yes you get messages about missing resolution/frame rate etc, but where does it actually fail
[12:00:09 CEST] <Guest48> Excuse me for the delay, @JEEB. Had to reset my pass.
[12:00:15 CEST] <Guest48> https://pastebin.com/wWxBThDs
[12:00:20 CEST] <Guest48> Basically, I have two streams starting at the same time, and it can happen that one of them does not become available until some time later, so I need to measure how long it takes before that stream actually starts delivering data and apply that delay with itsoffset so the videos run in sync again later, next to each other.
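Once the delay is known, applying it is the straightforward part; a minimal sketch, assuming a hypothetical measured offset of 2.35 seconds and placeholder file names:

    ffmpeg -i stream_a.mp4 -itsoffset 2.35 -i stream_b.mp4 -map 0 -map 1 -c copy combined.mkv

-itsoffset shifts the timestamps of the input that follows it, so the late stream is pushed back by the measured amount before the two are muxed (or filtered side by side). The hard part, as the discussion below concludes, is measuring that startup delay in the first place.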
[12:00:37 CEST] <JEEB> right
[12:00:44 CEST] <JEEB> sounds like something to use the API for, unfortunately
[12:00:54 CEST] <JEEB> also for the record
[12:01:03 CEST] <JEEB> framerate is an input-specific option; it belongs to particular modules, like image2
[12:01:08 CEST] <Guest48> @JEEB; yeah, that's becoming apparent
[12:01:29 CEST] <JEEB> -r is the option that sets the frame rate, but in general it does nothing on the input side
[12:01:31 CEST] <Guest48> @JEEB; that explains it; it could be expanded, if you ask me, but that's a matter of taste.
[12:01:48 CEST] <Guest48> @JEEB; -r; yep, figured that out too.
[12:01:57 CEST] <JEEB> video_size is also probably specific to some module
[12:02:13 CEST] <JEEB> ffmpeg.c or ffprobe.c do not differentiate between module AVOptions
[12:02:19 CEST] <JEEB> and options of ffmpeg.c setting general values
[12:02:38 CEST] <JEEB> basically what they cannot find in their options, they pass onto all the modules one at a time as AVOptions :P
[12:04:24 CEST] <JEEB> anyways, yea. sorry. what you're doing starts sounding like a case where the command line apps no longer cut it and you need to start using the API
[12:05:16 CEST] <Guest48> Yeah, I hear ya man. I'm using node.js; seems like this would suffice, wouldn't it? https://github.com/Streampunk/beamcoder
[12:05:41 CEST] <JEEB> I have no idea to be honest
[12:05:54 CEST] <JEEB> I have used FFmpeg's APIs through C, Python and Java
[12:05:56 CEST] <Guest48> Yep, that certainly seems like it should be able to do the trick. Native API bindings from node.js to FFmpeg.
[12:06:12 CEST] <JEEB> but I have never utilized other people's wrappers around it
[12:06:19 CEST] <Guest48> https://github.com/Streampunk/beamcoder/tree/master/src
[12:06:21 CEST] <JEEB> well, ffms2 I have utilized for frame-exact access, but y'know :P
[12:06:32 CEST] <JEEB> (that's a bit more higher level)
[12:06:51 CEST] <JEEB> Guest48: anyways I hope that wrapper is good
[12:06:57 CEST] <JEEB> I can't really vouch or not vouch for it
[12:07:10 CEST] <Guest48> I hope so too. We're gonna find out, I guess.
[12:07:18 CEST] <Guest48> Thanks, @JEEB!
[12:07:20 CEST] <Guest48> :)
[12:07:30 CEST] <JEEB> np
[12:07:54 CEST] <JEEB> upipe had some nice things that I think could be useful in libavformat
[12:08:05 CEST] <JEEB> like having three types of PTS in the packets
[12:08:29 CEST] <JEEB> original timestamp, sanitized timestamp (f.ex. MPEG-TS 33-bit wrap-around handled) and receipt timestamp
[12:08:59 CEST] <JEEB> right now what you get from libavformat is something between the first and the second (it's supposed to be the second but the general wrap-around code seems to bork in some cases)
[12:12:01 CEST] <Guest48> @JEEB, yeah, I will have to weigh my options and discuss here what's feasible. This explains a lot about what I was struggling with though, @JEEB, so thanks again! :)
[20:07:45 CEST] <dastan> hi people....
[20:13:02 CEST] <dastan> is it possible to download an HLS link and pass it to multiple ffmpeg processes?
[20:21:39 CEST] <ChocolateArmpits> dastan, with scripting anything is possible
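One way to script it: pull the HLS once, remux it to MPEG-TS on stdout, and fan that out to several ffmpeg readers. A sketch assuming bash (for process substitution) and placeholder URL/output names:

    ffmpeg -i "https://example.com/stream.m3u8" -c copy -f mpegts - \
      | tee >(ffmpeg -i - -c:v libx264 -preset veryfast out_h264.ts) \
            >(ffmpeg -i - -c:v prores_ks -c:a pcm_s16le out_prores.mov) > /dev/null

Note that the pipe applies backpressure, so the slowest consumer still paces the whole chain, the same limitation as doing both encodes in a single process.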
[23:49:58 CEST] <n000g> Hi, I'm trying to set the options (x y w h) for the deshake filter. is -vf deshake=10:10:680:550 correct? Because that doesn't seem to do much on my very slightly shaky video.
[23:56:56 CEST] <durandal_1707> have you read filter docs?
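For reference, deshake's positional options are indeed x:y:w:h, so deshake=10:10:680:550 is accepted; named options are clearer, and the rx/ry motion search range (default 16, maximum 64) is usually what needs raising for a visible effect. A sketch with placeholder file names:

    ffmpeg -i shaky.mp4 -vf "deshake=x=10:y=10:w=680:h=550:rx=32:ry=32" stabilized.mp4

Leaving x/y/w/h at their defaults (-1) lets the filter use the whole frame for motion search, which is often the simpler starting point.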
[00:00:00 CEST] --- Sun Aug  4 2019

