[Ffmpeg-devel-irc] ffmpeg.log.20180423

burek burek021 at gmail.com
Tue Apr 24 03:05:01 EEST 2018


[01:47:37 CEST] <postmodern> hello, i'm trying to stream to a local ffserver using ffmpeg and the stereo3d filter, but when specifying `-map "[stereo3d]" -f ffm -pix_fmt yuv420p URL` i get "stereo3d" has an unconnected output
[03:07:08 CEST] <kepstin> postmodern: that's not your whole command line. you can only use -map to specify pads that are in your filter chain
[03:07:28 CEST] <kepstin> postmodern: also, if you upgrade your ffmpeg it'll delete ffserver, which will solve this problem completely.
[03:14:55 CEST] <nicolas17> I'm doing a timelapse from a video (ffmpeg -r 600 -i in.mp4 -r 30 out.mp4) and it seems the bottleneck is decoding the input video; can that be hardware-accelerated, or is hw accel only for playback to screen?
[03:15:06 CEST] <nicolas17> (Intel GPU)
[03:22:37 CEST] <postmodern> kepstin, ah i had seen notices that ffserver was deprecated, but ubuntu/debian keeps it around.
[03:23:37 CEST] <postmodern> kepstin, i kind of need a local streaming solution where i can stream ffmpeg -vf effects to a relay server
[03:24:04 CEST] <kepstin> nicolas17: you can use intel gpu to decode, but I won't guarantee it'll actually be any faster (depends on many factors whether that's the case)
[03:24:14 CEST] <postmodern> kepstin, figured out that -override_ffserver works
[03:24:58 CEST] <kepstin> nicolas17: you're on linux?
[03:25:31 CEST] <nicolas17> I know there's cases (maybe with older hardware?) where you can decode pretty fast with the GPU, but bringing the result back to main memory to do any processing becomes the new bottleneck :P
[03:25:44 CEST] <nicolas17> yeah Linux
[03:25:59 CEST] <kepstin> nicolas17: on linux, use intel via the vaapi driver, there's some examples for both decoding and encoding here: https://trac.ffmpeg.org/wiki/Hardware/VAAPI
[03:26:15 CEST] <kepstin> (libx264 is a much better encoder if you want quality, of course)
[03:26:26 CEST] <nicolas17> ah yeah
[03:26:48 CEST] <nicolas17> I enabled vaapi on VLC and playback CPU usage (and thus power usage on a laptop) got MUCH better
[03:27:04 CEST] <kepstin> the first decode-only example on that page is basically what you're doing
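Following the decode-only example on that wiki page, the timelapse command with VAAPI decode might look like this (hedged: /dev/dri/renderD128 is the typical Intel render node, but the device path can differ per system):

```shell
# Decode on the Intel GPU via VAAPI; since no hw filters or hw encoder are
# used, frames are downloaded back to system memory for the CPU-side encode.
ffmpeg -hwaccel vaapi -hwaccel_device /dev/dri/renderD128 \
       -r 600 -i in.mp4 -r 30 out.mp4
```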
[03:28:08 CEST] <nicolas17> it's crazy how my new PC's integrated Intel GPU is faster than the nvidia card I had on my previous PC
[03:28:11 CEST] <nicolas17> x.x
[03:29:41 CEST] <nicolas17> hm another problem
[03:30:05 CEST] <nicolas17> finished on CPU-only on a different video
[03:30:09 CEST] <nicolas17> ffmpeg -r 30000*60/1001 -i input.mp4 -r 30000/1001 output.mp4
[03:30:29 CEST] <nicolas17> the output video is 60 times faster and has 60 times fewer frames, but VLC reports the same length as the original
[03:30:49 CEST] <furq> well yeah
[03:30:57 CEST] <nicolas17> as in
[03:31:11 CEST] <nicolas17> it says the video is 30 minutes long, even though after 30 seconds it freezes because there's no more frames
[03:31:55 CEST] <furq> does -r before -i with mp4 even do anything
[03:32:19 CEST] <nicolas17> oh crap, audio... I didn't even realize the output still had audio because I have my speakers muted :/ it's probably being kept unmodified so it really has 30 minutes of audio
[03:32:42 CEST] <furq> yeah the audio track will be the same length
[03:33:03 CEST] <nicolas17> yeah I forgot I even had an audio track :D
[03:41:57 CEST] <nicolas17> -an fixed the output length :D
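So the working timelapse command, with audio dropped so the container duration matches the sped-up video:

```shell
# -an drops the audio stream; otherwise the untouched 30-minute audio
# track keeps the container at the original duration.
ffmpeg -r 30000*60/1001 -i input.mp4 -an -r 30000/1001 output.mp4
```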
[04:02:24 CEST] <nicolas17> if I use -s on the output, that adds a rescale video filter to the filter graph, right?
[04:02:28 CEST] <nicolas17> where is it added, beginning or end?
[04:04:17 CEST] <nicolas17> "As an output option, this inserts the "scale" video filter to the end of the corresponding filtergraph." ok then
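Per the quoted documentation, the two spellings below should yield the same filtergraph, with the scale filter appended at the end of the output chain:

```shell
# -s WxH as an output option is shorthand for an appended scale filter
ffmpeg -i in.mp4 -s 1280x720 out.mp4
# the equivalent explicit form:
ffmpeg -i in.mp4 -vf "scale=1280:720" out.mp4
```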
[10:01:04 CEST] <nadavr> hello #ffmpeg. does anyone here have instructions on cross compiling ffmpeg? not to windows, i need to cross compile to arm. i have a whole environment set up, but i don't know how to tell the configure script to use my toolchain
[10:04:44 CEST] <JEEB> nadavr: you can look at how FATE does things
[10:04:46 CEST] <JEEB> fate.ffmpeg.org
[10:10:59 CEST] <nadavr> thanks JEEB. that's some sort of CI right? I'll try to take a look
[10:17:27 CEST] <nadavr> JEEB: unfortunately it seems FATE's logs don't say how it actually configures ffmpeg, only what the configure script spits out
[10:18:34 CEST] <nadavr> oh wow nevermind, the configuration line is right there in the main page. thank you!
[12:00:21 CEST] <alone-y> hello, my BAT file has a line with ffmpeg that is too long. can i load ffmpeg cmd options from some file?
[12:00:34 CEST] <alone-y> instead ffmpeg.exe "too long line"?
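One answer, assuming cmd.exe: ffmpeg has no general options-from-file mechanism (though `-filter_complex_script` can read a long filtergraph from a file), but a batch file accepts `^` at the end of a line as a continuation character, so a single long command can be split. A hedged sketch with placeholder options:

```shell
REM One logical ffmpeg command split across lines with ^
REM (the ^ must be the last character on the line, no trailing spaces)
ffmpeg.exe -i input.mp4 ^
    -c:v libx264 -crf 20 ^
    -c:a aac -b:a 128k ^
    output.mp4
```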
[12:02:27 CEST] <romano> I'm having trouble with the overlay filter
[12:02:31 CEST] <romano> Im using this command:
[12:02:32 CEST] <romano> ffmpeg -y -i [INPUT1] -i [INPUT2] -filter_complex \"overlay=enable='between(t,0,4)'\" -c:v prores [OUTPUT]";
[12:03:42 CEST] <romano> But the output file is shorter than the original video and the original is frozen while the overlay runs
[12:04:08 CEST] <romano> I want to overlay a short video into a larger one only in the specified frame/time
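The question went unanswered in-channel; a hedged guess at a fix: overlay's default `eof_action=repeat` holds the last frame of the shorter input, and un-reset timestamps on the second input can delay or freeze things. Resetting the overlay's PTS and passing the main input through after the overlay ends often behaves as intended (input and output names are placeholders):

```shell
# [1:v]setpts=PTS-STARTPTS makes the overlay clip start at t=0;
# eof_action=pass lets the main video continue after the overlay ends.
ffmpeg -y -i main.mp4 -i overlay.mp4 \
  -filter_complex "[1:v]setpts=PTS-STARTPTS[ov];[0:v][ov]overlay=eof_action=pass:enable='between(t,0,4)'" \
  -c:v prores output.mov
```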
[15:02:24 CEST] <TheAMM> There isn't any way to use image2pipe (or something similar) with timestamps, is there?
[15:04:15 CEST] <TheAMM> I'm parsing subprocessed stdout with Python and processing the frames a bit, but I lose timestamps when feeding the new raw frames into another ffmpeg
[15:05:26 CEST] <TheAMM> This is k with cfr but vfr gets funky if I specify a framerate
[15:38:01 CEST] <kepstin> TheAMM: you can use the setpts filter with an expression to set the timestamp based on when the frame enters the filter
[15:39:38 CEST] <TheAMM> I ended up with doing an ultrafast low-crf x264 re-encode of the clip, cleans out my mpv 1000fps files
[15:40:17 CEST] <TheAMM> Easier and faster to implement than getting the real timestamps and replicating them with setpts or other means
[15:40:29 CEST] <kepstin> oh, wait, you want to preserve the timestamps with an input?
[15:40:45 CEST] <kepstin> no way to do that other than actually mux the video into some sort of container before piping to ffmpeg
[15:41:41 CEST] <TheAMM> Yeah, I took another stab at "good gif encoding" because I realized error diffusion (dithering with sierra) screws with optimization
[15:42:11 CEST] <TheAMM> I also implemented that pixel change thresholding with Pillow in a couple of ImageChops
[15:42:22 CEST] <TheAMM> hence the image2pipe and timestamps
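The two approaches discussed above, sketched (hedged: `my_frame_processor` is a placeholder for the Python processing step, and the frame size/pixel format must match the real stream):

```shell
# 1) Preserve timestamps across the pipe: mux into a real container (nut)
#    instead of image2pipe, so each frame carries its PTS.
ffmpeg -i clip.mkv -c:v rawvideo -f nut pipe:1 \
  | my_frame_processor \
  | ffmpeg -f nut -i pipe:0 -c:v libx264 out.mkv

# 2) Or regenerate timestamps from wall-clock arrival as frames enter the
#    filtergraph (RTCTIME/RTCSTART are in microseconds, TB the timebase).
ffmpeg -f rawvideo -pix_fmt rgb24 -s 1280x720 -i pipe:0 \
  -vf "setpts='(RTCTIME-RTCSTART)/(TB*1000000)'" out.mkv
```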
[16:09:40 CEST] <mlok> -bsf:a aac_adtstoasc - are there any other better ways to use this filter or do something similar with another flag?
[16:15:51 CEST] <Mavrik> Funny question that.
[16:36:38 CEST] <mlok> Mavrik: yeah I'm trying to stop the audio from cutting out in a set top box
[16:36:52 CEST] <mlok> Mavrik: also -q:a 1 helps
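For context, the usual job of that bitstream filter is remuxing ADTS-framed AAC (e.g. from MPEG-TS or live streams) into MP4-family containers without re-encoding; a sketch:

```shell
# aac_adtstoasc rewrites ADTS framing into the AudioSpecificConfig
# extradata that mp4/mov/flv expect; only relevant when stream-copying.
ffmpeg -i input.ts -c copy -bsf:a aac_adtstoasc output.mp4
```

Recent ffmpeg versions tend to insert this filter automatically when it is required, so it mostly matters on older builds.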
[18:03:01 CEST] <tuna> BtbN: I am back from last week, you told me (i think it was you?) that I didn't need to have the AVHWFramesContext for using opengl/cuda buffers as input to nvenc... but I got an error stating the opposite... here is the source I found https://pastebin.com/q1E4rm2j seems like it is needed
[18:03:12 CEST] <tuna> source from: https://github.com/FFmpeg/FFmpeg/blob/master/libavcodec/nvenc.c
[18:03:29 CEST] <tuna> This was the error: hw_frames_ctx must be set when using GPU frames as input
[18:05:47 CEST] <BtbN> that's kinda outdated, I'll see if I can fix it in a bit
[18:07:01 CEST] <tuna> Fix it as in rewrite the code?
[18:07:18 CEST] <BtbN> yes, that can probably just be removed
[18:13:46 CEST] <tuna> If I wanted to get around that for now, without modifying ffmpeg (since I am not set up to build it), do you think I could just create a frame object with format and it'll pass the tests for now?
[18:25:53 CEST] <BtbN> you'll also need to fill in some parameters into the hw_frames_ctx
[18:26:00 CEST] <BtbN> width, height and sw_pix_fmt
[18:36:25 CEST] <tuna> ok
[18:53:13 CEST] <sudden6> hi, in the recent releases the function `av_register_all` got marked as deprecated, is there a replacement?
[18:54:19 CEST] <JEEB> you don't call anything
[18:54:30 CEST] <jdev> congrats to the whole team on the ffmpeg 4.0 release. This looks like one of the largest releases, based on the changelog features, in a long time. Where can I find the new compile/build flag options supporting the new features?
[18:54:32 CEST] <JEEB> the only registration currently needed is for avdevices, which 90%+ of all people don't utilize
[18:55:01 CEST] <JEEB> jdev: internal things generally supported out of the box, otherwise see --help of the configure script
[18:55:48 CEST] <sudden6> JEEB: thanks a lot
[18:59:19 CEST] <jdev> JEEB: thank you, reviewing https://github.com/FFmpeg/FFmpeg/blob/master/configure
[18:59:46 CEST] <JEEB> it's usually just simpler to run the configure script with the parameter :P
[19:00:01 CEST] <memo1> hi, i need to restart the ffmpeg command once the video source is lost, maybe a dropped connection. How do i do that? im trying a bash until loop, but it doesnt work
[19:00:19 CEST] <JEEB> if it's a network thing, use timeouts
[19:00:22 CEST] <jdev> of course, I just don't have the source code locally in front of me right now :)
[19:01:34 CEST] <memo1> im using stime but it only works if the source is lost as soon as i run the command; during the streaming it doesn't work. Can you help me please? which timeouts should i use?
[19:06:10 CEST] <sudden6> JEEB: the documentation in https://github.com/FFmpeg/FFmpeg/blob/n4.0/libavformat/avformat.h#L39-L40 still says to call av_register_all, is it outdated?
[19:08:44 CEST] <JEEB> yes, the docs most likely didn't get updated. feel free to make a bug report about that
[19:09:04 CEST] <BtbN> tuna, best to use a dummy hw_frames_ctx for now.
[19:09:21 CEST] <BtbN> The actual solution needs some more involved one
[19:16:19 CEST] <memo1> JEEB: can you help me please? i've been trying this for many days, i need to bring ffmpeg back up when the network is lost
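A sketch of the supervisor loop being asked about (hedged: URL, output name, and timeout value are placeholders; `-rw_timeout` takes microseconds and applies to network protocols, making ffmpeg exit on a stalled connection instead of hanging, so the shell loop can restart it):

```shell
#!/bin/bash
# Restart ffmpeg whenever it exits (dropped connection, timeout, crash).
# -rw_timeout 5000000 = give up after 5 seconds of no network I/O.
while true; do
    ffmpeg -rw_timeout 5000000 -i "rtmp://example.com/live/stream" \
           -c copy "out_$(date +%s).flv"
    echo "ffmpeg exited with status $?; restarting in 2s" >&2
    sleep 2
done
```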
[19:44:57 CEST] <tuna> BtbN: Ok, ill go with the dummy frame ctx
[20:21:48 CEST] <YokoBR> hi there
[20:22:53 CEST] <YokoBR> Dudes, I'm using this script: https://docs.peer5.com/guides/production-ready-hls-vod/ How can I use that playlist.m3u8? Will that video be available forever or is it just generated on demand?
[20:24:04 CEST] <Mavrik> Well you use it in <video> tag or open it as a player URL
[20:24:07 CEST] <YokoBR> https://gist.github.com/mrbar42/ae111731906f958b396f30906004b3fa
[20:24:25 CEST] <YokoBR> I want to use it on web
[20:24:34 CEST] <YokoBR> maybe calling the script from nginx
[20:25:27 CEST] <FishPencil> How do I get the k1 and k2 values for the lenscorrection filter? I ran the calibration tool, but my values are -4.552e-03 and 3.893e-05 which is way too small.
[20:27:28 CEST] <FishPencil> does the calibration tool need to be set with the square size option or something?
[20:31:11 CEST] <the_gamer> i got a folder with 9311 pngs numbered from 0001.png to 9311.png. why does "ffmpeg -i geblenderrendert/%04d.png -r 25 -q:v 1 rendered.mp4" only give about 1 sec of video and not the 372 seconds as i would expect?
[20:33:25 CEST] <nicolas17> maybe a missing file in the sequence? the numbers have to be consecutive
[20:34:22 CEST] <ChocolateArmpits> YokoBR, the m3u8 playlist is regenerated every time based on hls_list_size, I think. As new segments are generated, the list gets updated. Wrapping it in a simple video tag won't work, most browsers don't support hls without plugins. You need to get a dedicated JS player, like videojs or plyr with the addition of hls.js (which actually adds the hls support to the aforementioned players)
[20:34:23 CEST] <the_gamer> damn, that's it
[20:34:37 CEST] <the_gamer> nicolas17, so i have to make a new folder and rename the pictures?
[20:35:22 CEST] <tuna> BtbN: I must be doing something wrong here... y'all said last week that nvenc supports some form of rgb input (I think it was BGR0?) with ffmpeg... but the hwcontext_cuda.c file keeps giving me an error saying pixel format not supported... and it fails when it compares what I pick with the following list: AV_PIX_FMT_NV12, AV_PIX_FMT_YUV420P, AV_PIX_FMT_YUV444P, AV_PIX_FMT_P010, AV_PIX_FMT_P016, AV_PIX_FMT_YUV444P16,
[20:35:42 CEST] <YokoBR> ChocolateArmpits: I'm using videojs
[20:36:06 CEST] <YokoBR> I just want to grab my video url hosted at amazon and create a hls vod with ffmpeg
[20:36:08 CEST] <BtbN> tuna, that's for the frames it can allocate for you
[20:36:11 CEST] <ChocolateArmpits> YokoBR, the playlist and the segments are meant to be hosted on an http server, with a low enough cache duration (basically anything lower than the duration of a single segment)
[20:36:18 CEST] <ChocolateArmpits> YokoBR, you also need hlsjs
[20:36:20 CEST] <BtbN> But since you're allocating it yourself, there's no need to deal with it
[20:36:32 CEST] <tuna> So how do I stop it from making that check?
[20:36:41 CEST] <tuna> just leave it blank?
[20:36:46 CEST] <tuna> or null
[20:37:14 CEST] <ChocolateArmpits> YokoBR, just get this https://github.com/Peer5/videojs-contrib-hls.js
[20:37:18 CEST] <tuna> it happens in cuda_frames_init
[20:37:37 CEST] <tuna> Maybe I dont init the dummy frame?
[20:37:39 CEST] <ChocolateArmpits> YokoBR, it has an example player code right there too
[20:38:21 CEST] <BtbN> You only need some struct that has sw_format, width and height set
[20:38:31 CEST] <Mavrik> (All mobile browsers, Edge and Safari support HLS)
[20:38:55 CEST] <ChocolateArmpits> well that's cool because chrome and firefox don't
[20:38:55 CEST] <tuna> Ok
[20:39:26 CEST] <Mavrik> True, but your "most" kinda isn't correct ;)
[20:39:32 CEST] <Mavrik> And yeah, video.js shim works ok
[20:39:47 CEST] <ChocolateArmpits> well I didn't know about Edge
[20:40:37 CEST] <ChocolateArmpits> as for Apple products, that was part of "not most"
[20:42:34 CEST] <furq> the_gamer: just use -pattern_type glob -i "*.png"
[20:42:58 CEST] <furq> also -q:v 1 probably isn't what you want
[20:44:17 CEST] <the_gamer> furq, what do i want?
[20:44:48 CEST] <furq> i don't know
[20:44:58 CEST] <furq> i don't think -q is mapped to anything with libx264, which is the default for mp4
[20:45:11 CEST] <the_gamer> i don't want it to scale the quality of the video
[20:46:09 CEST] <furq> if you don't want to lose any quality then use -qp 0
[20:46:15 CEST] <the_gamer> how to leave the perfect quality of the pictures in the video?
[20:46:16 CEST] <furq> but you'll obviously end up with a massive video
[20:46:32 CEST] <the_gamer> yeah, of course but that's what i want because of editing
[20:46:35 CEST] <furq> also if your source is png then you probably want to use -c:v libx264rgb if you need perfect quality
[20:46:41 CEST] <furq> bear in mind both of those will reduce compatibility
[20:50:02 CEST] <the_gamer> when to use -qp 0 and when q:v 1?
[20:52:10 CEST] <furq> -q:v isn't mapped to anything with x264
[20:52:13 CEST] <furq> so you should never use that
[20:52:29 CEST] <the_gamer> what about other formats?
[20:52:41 CEST] <the_gamer> or where to read about it?
[20:53:28 CEST] <furq> the mappings for generic codec options are sort of badly documented
[21:04:36 CEST] <the_gamer> too bad :(
[21:04:44 CEST] <the_gamer> thank you for telling me about it
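Summing up the thread above, a sketch of the suggested invocation (hedged: `-framerate` is the image-demuxer input rate; `-pattern_type glob` tolerates gaps in the numbering; `libx264rgb` with `-qp 0` is lossless but produces a large, less widely playable file):

```shell
# Glob matching survives gaps in the numbering (unlike %04d, which stops
# at the first missing file). libx264rgb avoids RGB->YUV conversion loss
# and -qp 0 makes the encode lossless -- big output, fine for editing.
ffmpeg -framerate 25 -pattern_type glob -i 'geblenderrendert/*.png' \
       -c:v libx264rgb -qp 0 rendered.mp4
```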
[21:22:33 CEST] <YokoBR> ChocolateArmpits: I got that, the player side is okay. I'm just confused about that script. How it will be a "VoD" on server side
[21:22:54 CEST] <ChocolateArmpits> oh a vod, then a single playlist
[21:22:59 CEST] <ChocolateArmpits> with lots of segments
[21:32:08 CEST] <ChocolateArmpits> but you can also have a single segment with the playlist using byterange to mark each segment, though the server needs to support range-based requests
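The byterange variant corresponds to the HLS muxer's `single_file` flag; a hedged sketch for a VOD playlist (input name and segment duration are placeholders):

```shell
# One media file; the playlist addresses each segment with EXT-X-BYTERANGE,
# so the web server must honor HTTP Range requests.
# -hls_playlist_type vod keeps all segments listed and appends an ENDLIST tag.
ffmpeg -i input.mp4 -c copy -hls_time 6 \
       -hls_playlist_type vod -hls_flags single_file playlist.m3u8
```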
[21:35:43 CEST] <zevarito_> avcodec_send_packet requires a valid frames pts ?
[21:37:41 CEST] <Mavrik> yup, of course :)
[21:37:55 CEST] <Mavrik> well I guess there might be encoders that don't need it
[22:13:56 CEST] <wfbarksdale> does mp4 codec store a framerate value?
[22:14:25 CEST] <JEEB> no, it only contains timestamps and durations of each sample
[22:14:32 CEST] <JEEB> there really isn't a field like "frame rate"
[22:14:45 CEST] <JEEB> of course, you can code constant frame rate like that
[22:15:03 CEST] <JEEB> just have all of the samples have composition time stamps of previous.cts + previous.duration
[22:19:40 CEST] <wfbarksdale> thats what I thought thanks as always JEEB
[22:20:19 CEST] <alone-z> hello, is putting "spaces" after text the only method with drawtext for replacing /
[22:20:23 CEST] <alone-z> ?
[22:20:35 CEST] <ovijs> hello, i have a little problem with ffmpeg
[22:20:53 CEST] <ovijs> Unrecognized option 'crf'
[22:21:05 CEST] <ovijs> it worked at some point
[23:01:57 CEST] <tuna> avcodec_send_frame returns a -11 "invalid argument"... not very helpful, what argument is invalid?
[23:03:26 CEST] <tuna> looks like AVERROR(EINVAL): codec not opened, refcounted_frames not set, it is a decoder, or requires flush
[23:54:45 CEST] <alone-z> how to avoid the BAT file "line too long" error?
[23:56:53 CEST] <tuna> in reference to my last question about the "Invalid argument" does anything seem out of place??? https://pastebin.com/haYD7Ehq
[23:57:15 CEST] <tuna> Invalid argument comes on sending frame to encoder
[23:57:58 CEST] <JEEB> tuna: you should be OK if you follow these guidelines https://www.ffmpeg.org/doxygen/trunk/group__lavc__encdec.html
[23:58:38 CEST] <alone-z> hello jeeb, may be you know, can i send ffmpeg some file as parameters?
[23:59:00 CEST] <alone-z> not ffmpeg.exe *****, but ffmpeg.exe file_with_parameters?
[23:59:02 CEST] <JEEB> what
[23:59:08 CEST] <JEEB> isn't that just making a script
[23:59:13 CEST] <tuna> JEEB: Does this cover hardware encoding as well, with hardware input...seems to be very little documentations on the steps to take to get that to work
[23:59:49 CEST] <JEEB> in general yes, but hardware contexts and AVFrames are kind of special
[23:59:59 CEST] <JEEB> there's a few examples under doc/examples IIRC
[00:00:00 CEST] --- Tue Apr 24 2018


More information about the Ffmpeg-devel-irc mailing list