[Ffmpeg-devel-irc] ffmpeg.log.20180201

burek burek021 at gmail.com
Fri Feb 2 03:05:02 EET 2018


[00:00:46 CET] <linux50> I am stumped... I am trying to capture a rtsp stream from my foscam camera and re-stream it using ffserver. Am I allowed to paste my ffmpeg command here?
[00:01:31 CET] <furq> probably pastebin it
[00:01:40 CET] <furq> but the advice you'll get will most likely be "don't use ffserver"
[00:02:02 CET] <linux50> well... my issue is that ffmpeg is not capturing the stream to begin with
[00:02:10 CET] <linux50> so Im still behind the curve ball there
[00:02:24 CET] <BtbN> keep in mind that ffserver is dead and nobody has any useful experience with it.
[00:02:40 CET] <BtbN> So help there will be limited, mostly a bunch of people telling you not to use it. So, don't use it.
[00:02:42 CET] <furq> does it work if you output to a file instead of ffserver
[00:03:20 CET] <linux50> im not sure
[00:03:31 CET] <linux50> forgive my ignorance
[00:10:18 CET] <linux50> https://pastebin.com/GBpiaHxM
[00:10:25 CET] <linux50> can anyone please take a look
[00:11:18 CET] <linux50> on the bottom... Text is red and underlined with a message "Unknown encoder 'libx264'"
[00:11:31 CET] <furq>   configuration: --enable-libvpx
[00:11:34 CET] <furq> uhh
[00:11:58 CET] <furq> did you build this yourself
[00:12:18 CET] <furq> if not then go and yell at whoever did build it
[00:12:29 CET] <linux50> no... like a noob... i tried different commands
[00:12:32 CET] <linux50> this has never worked
[00:12:38 CET] <linux50> so... im the only one to blame
[00:12:43 CET] <furq> well yeah that build doesn't have x264
[00:12:47 CET] <linux50> this is for learning purposes
[00:12:50 CET] <furq> or very much of anything really
[00:13:05 CET] <linux50> when you say build
[00:13:10 CET] <linux50> are you referring to how it was compiled
[00:13:11 CET] <linux50> ?
[00:13:12 CET] <furq> yes
[00:14:30 CET] <linux50> im at the documentation page of ffmpeg.org
[00:14:43 CET] <linux50> is there a HowTo?
[00:14:54 CET] <furq> how to what
[00:14:54 CET] <linux50> on how to compile with the appropriate features?
[00:15:11 CET] <furq> there are compilation guides on the wiki
[00:15:12 CET] <furq> but really
[00:15:15 CET] <furq> https://www.johnvansickle.com/ffmpeg/
[00:15:18 CET] <furq> just use that
[00:16:31 CET] <linux50> thank you
[00:16:33 CET] <linux50> i will try that out
[00:17:51 CET] <linux50> one last thing... how were you able to see that it had no features?
[00:23:59 CET] <furq> where it says "configuration: --enable-libvpx"
[00:24:44 CET] <furq> http://vpaste.net/lcMNO
[00:24:47 CET] <furq> as opposed to something like this
[00:37:14 CET] <linux50> furq: http://vpaste.net/ijIHK
[00:37:49 CET] <linux50> okay
[00:37:58 CET] <linux50> i think i have resolved that issue, right?
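The check furq describes above can be reproduced from the shell; a minimal sketch, assuming an `ffmpeg` binary on the PATH:

```shell
# Inspect the build configuration printed in ffmpeg's banner; a build
# with H.264 encoding support will list --enable-libx264 here.
ffmpeg -version | grep -o -- '--enable-[a-z0-9_]*'

# Or query the encoder list directly; no output means no libx264.
ffmpeg -hide_banner -encoders | grep libx264
```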
[01:04:22 CET] <geuis> furq: was the librsvg wrapper yours?
[01:05:00 CET] <atomnuker> no, I wrote it
[01:05:14 CET] <atomnuker> bugs?
[01:06:57 CET] <geuis> cool wrapper.
[01:07:01 CET] <geuis> question about fonts
[01:07:48 CET] <geuis> the original svg includes a path to a local font file for text. shows up in other svg viewing apps but isn't being used by ffmpeg
[01:07:55 CET] <geuis> wonder if you had any insights
[01:08:56 CET] <ddubya> svg is plaintext right? You can find and replace the fonts
[01:09:27 CET] <atomnuker> geuis: never heard of such, but libcairo should handle it
[01:09:38 CET] <geuis> yup plaintext. not sure what you mean by find and replace though
[01:10:16 CET] <ddubya> replace the font with another one that you have
[01:10:33 CET] <ddubya> maybe the font substitution doesn't work for some reason
[01:11:05 CET] <geuis> the font file is local to the directory where ffmpeg is working. The svgs all reference it
[01:11:31 CET] <ddubya> oic
[01:12:23 CET] <ddubya> well if you can't install the font, perhaps you can add the directory to font search path
[01:12:36 CET] <ddubya> no idea how
[01:12:42 CET] <geuis> hmm
[01:15:45 CET] <ddubya>  or maybe use a fully qualified path in the svg
[01:15:55 CET] <ddubya> might need file:// prefix on it
[01:32:41 CET] <kepstin> geuis: what format is the external font file?
[01:35:24 CET] <geuis> ttg
[01:35:26 CET] <geuis> ttf
[01:35:52 CET] <kepstin> ttf? I suspect the most reliable way is to just install the font and use it by name then, rather than listing a file.
[01:38:24 CET] <geuis> hmm just install it to debian and reference by name?
[01:38:42 CET] <kepstin> yeah, using css font-family: "Font Name" or whatever
[01:39:09 CET] <geuis> ddubya: tried the file:// path, no luck
[01:39:13 CET] <kepstin> I don't think rsvg has any support for loading fonts from files at all
[01:39:45 CET] <geuis> what about how fontfile is referenced in drawtext?
[01:40:18 CET] <geuis> looked briefly through the docs but not sure if fontfile works outside of drawtext
[01:40:54 CET] <kepstin> drawtext is completely independent from the librsvg "decoder" (svg renderer)
[01:42:32 CET] <kepstin> it looks like librsvg internally just passes the font-family from the svg to pango, which attempts to load that from the system fonts
[01:42:41 CET] <kepstin> they explicitly don't support svg fonts
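kepstin's suggestion (install the TTF system-wide and reference it by family name) can be sketched like this, assuming a fontconfig-based system; the paths and the font name are placeholders:

```shell
# Make a local .ttf visible to fontconfig, which pango (and therefore
# librsvg's text rendering) consults for font lookup.
mkdir -p ~/.local/share/fonts
cp MyFont.ttf ~/.local/share/fonts/
fc-cache -f                    # rebuild the fontconfig cache
fc-list | grep -i myfont       # confirm the family name fontconfig sees
# Then reference the family name in the SVG instead of a file path:
#   <text style='font-family:"MyFont"'>...</text>
```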
[01:49:44 CET] <linux50> Hi Everyone... What streaming server (besides ffserver) do you recommend? Also... Why should I not use ffserver?
[02:02:02 CET] <kepstin> linux50: you shouldn't use ffserver because nobody is gonna be able to help you if you have problems with ffserver.
[02:02:45 CET] <kepstin> as for a streaming server, that depends entirely on what's gonna be playing the stream.
[02:07:14 CET] <SortaCore> copying from m3u8, is it meant to keep selecting new URLs?
[02:07:51 CET] <SortaCore> it goes from bla(n).ts to bla(n+1).ts
[02:09:53 CET] <SortaCore> on the computer doing screen recording, it keeps pausing for buffering
[03:02:55 CET] <SortaCore> correct answer: yea, it seems like new URLs make no difference
[03:54:54 CET] <karen__> You still around JEEB?
[04:22:35 CET] <doug___> hi! is there any filter to track faces in the video?
[04:23:03 CET] <thebombzen> libavfilter doesn't have any native facial recognition algorithms
[04:23:31 CET] <thebombzen> You'd have to find something more specialized to do that
[04:24:04 CET] <doug___> ok thanks thebombzen!
[04:24:33 CET] <thebombzen> doug___: it is worth noting that it has a video stabilizer plugin with vid.stab
[04:24:40 CET] <thebombzen> that might make it easier to track faces with other software
[04:24:50 CET] <linux50> hi everyone... What stream (re-broadcast) server do you recommend
[04:24:51 CET] <thebombzen> since the background will not move as much
[04:25:22 CET] <thebombzen> linux50: the most common recommendation is to use nginx as an HTTP streaming server
[04:25:27 CET] <linux50> I have my foscam ip cameras as my source using h.264 and AAC
[04:25:32 CET] <linux50> oh interesting
[04:25:33 CET] <linux50> okay
[04:25:36 CET] <linux50> ill check that out
[04:25:39 CET] <linux50> just curious
[04:25:43 CET] <linux50> why nginx
[04:25:44 CET] <linux50> ?
[04:25:50 CET] <thebombzen> it's a generic http server (like apache) that needs a module to do it
[04:26:02 CET] <linux50> can it be done in apache?
[04:26:02 CET] <thebombzen> this is for HLS or Dash. for rtmp* you'd need something else
[04:26:14 CET] <thebombzen> I believe it's easier to do it in nginx, but perhaps it can be done in apache? I don't know.
[04:26:17 CET] <doug___> thebombzen: thank you. You've been most helpful
[04:26:21 CET] <thebombzen> :)
[04:26:37 CET] <linux50> for rtmp is there a recommendation
[04:26:43 CET] <linux50> or should I re-encode it with ffmpeg?
[04:26:54 CET] <thebombzen> I'd avoid re-encoding if possible, but it might be unavoidable
[04:27:03 CET] <thebombzen> since if the bitrate is too high for streaming you can't stream it
[04:27:23 CET] <thebombzen> I have no idea what to use as an rtmp streaming server, sorry.
[04:27:54 CET] <linux50> okay
[04:28:03 CET] <linux50> ffmpeg can transcode right?
[04:28:11 CET] <thebombzen> yes, absolutely
[04:28:18 CET] <thebombzen> also upon googling apparently nginx-rtmp exists
[04:28:19 CET] <linux50> thats what "-f mpeg4" would mean?
[04:28:38 CET] <thebombzen> no, you wouldn't use "-f mpeg4" cause that would mean "mux to the mpeg4 container format" which doesn't exist
[04:28:56 CET] <linux50> what would be ideal then?
[04:28:56 CET] <thebombzen> transcoding is decoding and re-encoding
[04:29:09 CET] <thebombzen> Ideally you wouldn't be transcoding, and just streamcopy
[04:29:16 CET] <thebombzen> but upon googling I found nginx-rtmp: http://nginx-rtmp.blogspot.com/
[04:29:23 CET] <thebombzen> check it out, if you'd like to stream over rtmp
[04:29:35 CET] <thebombzen> it also does HLS and Dash though if that fits your needs better
[04:29:56 CET] <linux50> okay... So i continue to be on square 1
[04:30:33 CET] <linux50> so my objective is to put all my home cameras onto an internal webpage at home that my raspberry pi (which is connected to my TV) can present on my TV
[04:30:55 CET] <linux50> so am I going about it the wrong way?
[04:32:59 CET] <thebombzen> if you're looking for some sort of CCTV-like system, the best way to do that is probably to just not do a webpage
[04:33:23 CET] <thebombzen> have all the cameras go directly to the raspberry pi, which will then on-the-fly just tile the videos together and put it on your TV
[04:33:53 CET] <thebombzen> you don't need to do a webpage rendered in the browser if you're not interacting with the videos, just displaying them.
[04:54:45 CET] <linux50> thebombzen: can ffmpeg send the stream to nginx as well as save to an output file?
[04:54:51 CET] <thebombzen> Yes
[04:55:00 CET] <linux50> how would that format look like
[04:55:19 CET] <linux50> do I specify the file the the destination url
[04:55:19 CET] <linux50> ?
[04:55:38 CET] <linux50> do I specify the file then the destination url?
[04:55:43 CET] <linux50> sorry for the typo*
[04:56:06 CET] <kazuma_> output it as hls/m3u8 in a dir nginx monitors
[04:56:24 CET] <kazuma_> then access the m3u8 from a browser or device
[04:59:28 CET] <kazuma_> https://pastebin.com/1cM2CCga might be useful for you linux50, my script to do it in windows, the recordings are stored locally as .ts + encoded on the fly and streamed to wherever via nginx
[05:03:58 CET] <linux50> thank you
[05:04:01 CET] <linux50> i will review
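One way to record locally while publishing an HLS stream without encoding twice is ffmpeg's tee muxer; a hedged sketch, with the RTSP URL and paths as placeholders:

```shell
# Read the camera once, copy the H.264 video, and fan the result out to
# a local archive file and an HLS playlist that nginx can serve as
# static files. Segment options ride along in the slave's [] block.
ffmpeg -rtsp_transport tcp -i "rtsp://camera.local:554/stream" \
  -c:v copy -c:a aac -ar 48000 \
  -f tee -map 0:v -map 0:a \
  "archive.ts|[f=hls:hls_time=4:hls_list_size=6:hls_flags=delete_segments]/var/www/html/live/cam.m3u8"
```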
[05:04:47 CET] <linux50> what is the -ss?
[05:09:28 CET] <kazuma_> -ss in ffmpeg specifies the video start time, so in that script i get the current runtime of the active recording with a MediaInfo for-loop and set it as %duration%
[05:09:56 CET] <kazuma_> so then when i start encoding for the live stream, it starts from where the recording is at, instead of from the beginning of the recording file
[05:10:43 CET] <linux50> does that mean you have 1 giant file?
[05:10:49 CET] <linux50> with all your recording?
[05:11:08 CET] <kazuma_> yeah, i have the recording writing to disk
[05:11:30 CET] <kazuma_> then jump into it at %current_time% to stream from
[05:12:57 CET] <linux50> i wrote my own script using openRTSP that would break the recording into 5 minute increments
[05:13:18 CET] <linux50> and then organize the videos by hour then day, then month, etc...
[05:13:47 CET] <linux50> so If i needed to go back and review i could quickly move through it in 5 minute increments
[05:14:49 CET] <kazuma_> good idea
[05:15:01 CET] <linux50> but its in bash
[05:15:10 CET] <linux50> i dont mind passing it down to you guys
[05:15:14 CET] <linux50> via pastebin or something
[05:15:36 CET] <linux50> i personally didnt like it because I wanted to re-broadcast it
[05:15:40 CET] <linux50> and I dont think it does that
[05:15:53 CET] <linux50> but...
[05:16:00 CET] <linux50> if I can get the nginx to work
[05:16:17 CET] <linux50> ill have the script pull the stream from nginx (if possible) and record the video for me
[09:13:16 CET] <bloobloo> found a bug from ffmpeg? https://pastebin.com/YXksPJaC
[09:14:18 CET] <bloobloo> "-i default" works but "-i hw:3,0" doesn't, the "arecord -l" shows there is the 3,0 device
[09:16:50 CET] <bloobloo> many people were using "-f alsa" and suggested that command, i guess that works with "-f alsa" but not with "-f pulse"
[09:28:53 CET] <furq> bloobloo: what's the output of pactl list sources
[09:31:30 CET] <bloobloo> furq: https://pastebin.com/71fnTwx0
[09:38:43 CET] <furq> i guess you want -f pulse -i 1
[09:41:30 CET] <Nacht> isn't it 3.0 instead of 3,1 ?
[09:41:38 CET] <Nacht> *3,0
[09:45:30 CET] <bloobloo> furq: that indeed works
[09:45:59 CET] <bloobloo> Nacht: both should work
[09:46:08 CET] <bloobloo> if i remember correctly
[10:30:06 CET] <kazuma_> i have a video with "Chroma subsampling : 4:2:0 (Type 2)"
[10:30:43 CET] <kazuma_> when i encode it, it becomes "4:2:0", how can i maintain the (Type 2) ??
[10:32:01 CET] <JEEB> I have no idea what that means, 4:2:0 is 4:2:0
[10:32:48 CET] <JEEB> in other words, find out wtf whatever you're reading that from means with that thing
[10:33:30 CET] <kazuma_> https://pastebin.com/raw/TCuwVWXx
[10:34:30 CET] <kazuma_> it's this video JEEB, i was told when encoding it, i should keep the same settings for everything eg, 5.1, 4:2:0 type 2, 10bit, btrec2020
[10:36:07 CET] <kazuma_> i think it's something to do with hdr primaries
[10:37:13 CET] <JEEB> uhh
[10:37:29 CET] <JEEB> there is no metadata for 4:2:0 type1/2, there is no such distinction
[10:37:50 CET] <JEEB> so really, ask Mediainfo
[10:38:13 CET] <kazuma_> i might make a post later, thanks for the info
[10:38:17 CET] <JEEB> the range, primaries, transfer characteristics are what matter (and then you have the special HDR metadata which is sepatate)
[10:38:21 CET] <JEEB> *separate
[10:39:22 CET] <JEEB> kazuma_: also I don't know of two different 4:2:0 modes in HEVC, pretty sure you won't find it in https://www.itu.int/rec/T-REC-H.265-201612-I/en
[10:39:41 CET] <JEEB> so as I noted, find out WTF mediainfo means with that and if it actually means anything
[10:40:41 CET] <kazuma_> ok, will see what i can find out
[10:40:44 CET] <jkqxz> Chroma sample location, perhaps?
[10:40:53 CET] <jkqxz> That's kindof a property of the subsampling rather than anything else.
[10:47:10 CET] <JEEB> jkqxz: yea but I would expect it to output it like ffprobe then
[10:47:15 CET] <JEEB> Chroma Location: Top Left or so
[10:47:25 CET] <JEEB> instead of "4:2:0 (Type 2)"
[10:47:32 CET] <JEEB> in chroma subsampling
[10:53:58 CET] <kazuma_> ffprobe shows it as "yuv420p10le"
[10:55:58 CET] <pmjdebruijn> so that's 10bit
[10:56:45 CET] <pmjdebruijn> kazuma_: so not all codecs have a 10bit mode
[11:01:16 CET] <JEEB> kazuma_: that's just 10bit, 4:2:0. ffprobe will also show a lot of other stuff esp. with -v verbose
[11:02:19 CET] <JEEB> verbose adds the chroma location as well
[11:02:22 CET] <JEEB> `yuv420p(tv, bt709, top first, left)`
[11:02:32 CET] <JEEB> is from an interlaced thing I have on hand :P
[11:03:01 CET] <inkubot> hi all.. not sure to create a bug report so i will start here... ffprobe is putting every i-frame as IDR key_frame=1 ...
[11:03:51 CET] <JEEB> inkubot: check with current master if possible, and post a sample + link it on the trac issue tracker
[11:04:03 CET] <inkubot> anyone is using ffprobe to analyze video, with 2sec GOP and I-Frame insertion in scene changes?? before our scripts were working ok because it wasn't open gop
[11:04:12 CET] <inkubot> ok JEEB thanks
[11:04:16 CET] <inkubot> will try it
[11:05:04 CET] <JEEB> with open gop I recall reading the H.264 parser and it would only mark things as random access points if a) IDR or b) open GOP random access point SEI + I
[11:05:16 CET] <JEEB> if it's anything else it shouldn't mark it as a random access point
[11:06:12 CET] <inkubot> sorry my mistake not open gop... just an IDR every 2 seconds and i-frame insertion at scene changes inside the chunk
[11:06:27 CET] <inkubot> i will try with current
[11:06:51 CET] <JEEB> well I just noted to you the two ways I remember FFmpeg's H.264 parser notes random access points with H.264 (unless the container has that information)
[11:07:23 CET] <JEEB> because if the container has a random access point flag then that is of course trusted
[12:13:56 CET] <LiamC> Anybody come across "Invalid UE golomb code" when converting AVI to MP4 with ffmpeg?
[14:32:46 CET] <wouter> hi -- I'm having issues with concatenated video
[14:33:37 CET] <wouter> getting output like http://paste.debian.net/1008324/ on stderr
[14:33:55 CET] <wouter> and the result is a video that just freezes when it should switch to the next component
[14:34:17 CET] <wouter> (I'm doing 'ffmpeg -f concat -i foo.txt -c:v copy -c:a copy', in case that matters)
[14:34:29 CET] <wouter> although that's not the entire command line; I can get you one if you need
[14:38:13 CET] <wouter> any ideas what I might be doing wrong?
[14:43:13 CET] <mort> I'm calling av_frame_get_buffer, which allocates the frame's 3 buffers. frame->buf[{0,1,2}] exist, and claim to have a size which seems appropriate, but calling 'av_buffer_get_opaque' on them returns NULL. Am I missing something?
[14:47:07 CET] <jkqxz> mort:  What are you trying to do?  The opaque field is owned by whoever allocates it and generally shouldn't be touched by anything else.
[14:48:22 CET] <mort> jkqxz: it's allocated by a codec I'm trying to add to libavcodec (using that av_frame_get_buffer function), and used by webrtc
[14:48:53 CET] <mort> I'm trying to add a codec to chromium's webrtc implementation, and that uses ffmpeg, so I'm adding it as an ffmpeg codec
[14:49:10 CET] <mort> webrtc is using av_buffer_get_opaque and casting the result to its own VideoFrame struct
[14:49:49 CET] <jkqxz> Well, that must be assuming it has a frame it allocated itself and put into AVFrame.
[14:50:25 CET] <mort> https://cs.chromium.org/chromium/src/third_party/webrtc/modules/video_coding/codecs/h264/h264_decoder_impl.cc?type=cs&q=h264_decoder_impl.cc&sq=package:chromium&l=348
[14:50:51 CET] <mort> it's not allocating av_frame_->buf[0] itself, that's allocated by avcodec_decode_video2
[14:51:57 CET] <jkqxz> It must have allocated the frames itself if it expects that to work.
[14:52:39 CET] <mort> av_frame_ itself is just allocated by 'av_frame_.reset(av_frame_alloc())'
[14:52:40 CET] <jkqxz> That will definitely not work on any frame allocated by libav*.
[14:53:26 CET] <jkqxz> If that's coming from receive_frame(), presumably it set a get_buffer callback so that it can do the allocation?
[14:53:54 CET] <mort> right, yes, there's actually a get_buffer callback
[14:54:13 CET] <mort> I didn't think about that, that's probably why it works; thanks
[14:54:43 CET] <kepstin> wouter: i'm guessing that the videos you're concatenating were not encoded with compatible settings (and ffmpeg doesn't or isn't able to do midstream parameter updates in most containers)
[14:55:56 CET] <wouter> kepstin: I just noticed that the audio layout wasn't entirely the same; but beyond that, it should match
[14:56:26 CET] <kepstin> wouter: in this particular case, it looks like one of the later videos was encoded with a higher number of reference frames than a previous video.
[14:56:27 CET] <wouter> kepstin: I ran 'ffprobe -show_format -show_streams' on all files and compared the output; didn't find anything totally out of the ordinary
[14:56:38 CET] <wouter> oh, okay
[14:56:41 CET] <kepstin> wouter: ffprobe won't show this, no.
[14:57:13 CET] <wouter> so what I'm trying to do is add opening credits and closing credits based on a .png file to a video
[14:57:42 CET] <wouter> so I use ffmpeg to generate a 5-second video file based on the two PNG files first, and then concatenate the two together
[14:57:48 CET] <wouter> but perhaps that's not the best way to do it?
[14:58:02 CET] <kepstin> wouter: assuming the existing video was encoded with x264, you should be able to use mediainfo or so to find out what x264 settings were used, and attempt to copy those.
[14:58:11 CET] <kepstin> that might be enough to get it to work
[14:58:29 CET] <wouter> hrm, okay. Where is this "mediainfo" documented? Haven't heard of it...
[14:58:49 CET] <wouter> oh, it's a separate program, right
[14:59:04 CET] <kepstin> or you could just try using something like -preset veryslow when making your credits, so your first video uses all the x264 features
[14:59:09 CET] <kepstin> that might be enough.
[14:59:48 CET] <wouter> mm, let me try that one
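The concat-demuxer flow under discussion looks roughly like this; filenames are placeholders, and for -c copy to splice cleanly the credit clips must match the main recording's codec, resolution, pixel format, profile/level, and audio parameters:

```shell
# List the pieces in playback order, then stream-copy them together.
cat > foo.txt <<'EOF'
file 'opening.mkv'
file 'talk.mkv'
file 'closing.mkv'
EOF
ffmpeg -f concat -safe 0 -i foo.txt -c:v copy -c:a copy joined.mkv
```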
[15:16:53 CET] <saml> good morning my friends
[15:24:33 CET] <wouter> kepstin: -preset veryslow doesn't seem to be fixing it. mediainfo output on a sample of the video that we're trying to convert is http://paste.debian.net/1008324/; what parameters should I add to ffmpeg to make a file that is compatible?
[15:24:39 CET] <wouter> kepstin: current command line is:
[15:24:40 CET] <wouter> Running: 'ffmpeg' '-loglevel' 'warning' '-y' '-loop' '1' '-framerate' '25/1' '-i' '/srv/sreview/assets/fosdem2018_sponsors_bg.png' '-f' 'lavfi' '-i' 'anullsrc=channel_layout=mono' '-c:v' 'libx264' '-r:v' '25/1' '-speed' '4' '-c:a' 'aac' '-ar' '48000' '-t' '5' '-pix_fmt' 'yuv420p' '-preset' 'veryslow' '/tmp/transpaVhji/video_test-postroll.mkv'
[15:24:56 CET] <wouter> (actually, that would be "libfdk_aac", but anyway)
[15:25:29 CET] <kepstin> wouter: wrong paste? that's not the mediainfo output
[15:25:45 CET] <wouter> whoops
[15:25:49 CET] <wouter> http://paste.debian.net/1008338/ is the right one
[15:26:21 CET] <wouter> (I really want to "do it right" at some point, but don't have the time right now, FOSDEM is this weekend and it needs to work then...)
[15:26:31 CET] <saml> are you concating two same codec?
[15:27:23 CET] <kepstin> saml: same codec, but different encoder settings in a way that means the sps from the first video isn't compatible with the second video.
[15:27:28 CET] <wouter> saml: I'm trying to create a 5-second file from a .png file that is compatible with what the video team records
[15:28:12 CET] <kepstin> wouter: hmm, that video might not be from x264 then, if it's off a camera or capture card or something.
[15:28:38 CET] <wouter> yeah, it's from a blackmagic hardware encoder.
[15:28:48 CET] <wouter> sorry, you did mention that
[15:28:53 CET] <saml> one static image for 5 secs followed by whatever vid?
[15:29:08 CET] <kepstin> wouter: honestly, i'd probably just re-encode the whole thing with x264 (use fast settings if needed), since figuring out x264 settings close enough to match it would be a lot of trial and error.
[15:29:13 CET] <wouter> yes, and the same at the end, in reverse
[15:29:30 CET] <wouter> the one at start works right now, but it's the end slide that fails
[15:29:59 CET] <saml> audio is none?
[15:30:08 CET] <kepstin> wouter: oh, it's the transition from main video to end credits that fails?
[15:30:14 CET] <wouter> kepstin: yes
[15:30:24 CET] <kepstin> i thought it was front credits to main video.
[15:30:25 CET] <wouter> opening credits to main video works fine
[15:31:08 CET] <wouter> there's an example at https://video.fosdem.org/2018/J1.106/2018-02-03/video_test.mp4
[15:31:42 CET] <wouter> (sorry about the annoying audio, wasn't my fault ;)
[15:32:22 CET] <kepstin> ok, setting '-profile:v main -level 31' when encoding the end video will probably be enough
[15:32:34 CET] <kepstin> that should match the hardware encoder close enough. worth a try anyways
[15:32:43 CET] <wouter> yeah, I'll give that a go
[15:32:53 CET] <wouter> those are output options, right?
[15:32:55 CET] <kepstin> yes
[15:33:54 CET] <kepstin> if you've got some time, and particularly if you're going to be distributing these videos over the internet, you should probably still re-encode the whole thing with x264 if only to reduce the size.
[15:34:10 CET] <wouter> we're also transcoding to vp9 ATM
[15:34:21 CET] <wouter> shipping mp4 because it exists, but I really only care about the vp9
[15:34:28 CET] <wouter> and that ends up being about half size, so...
[15:34:35 CET] <kepstin> ouch. I hope you've got libvpx 1.7 for the multithreaded vp9 encoder :)
[15:34:54 CET] <wouter> we're going to be transcoding 600+ videos, I don't care about the time for a single video ;-)
[15:35:06 CET] <wouter> but I did see the recommended settings on the google website, and we're using that
[15:35:21 CET] <kepstin> re-encoding the mp4 with x264 should let you reduce the size quite a bit at similar quality - hardware encoders are not very efficient in general
[15:35:39 CET] <kepstin> it'll probably be close to the vp9 file
[15:35:44 CET] <wouter> yeah, but it would also need some testing and coding, and it's too late for that now
[15:44:13 CET] <wouter> kepstin: that failed too :-/
[15:49:43 CET] <wouter> kepstin: can I tune the reference frame settings in ffmpeg somehow? Reading the manpage, but can't find it...
[15:49:49 CET] <furq> -refs
[15:49:59 CET] <furq> or maybe -ref, i forget which one is ffmpeg and which is x264
[15:50:08 CET] <furq> check in ffmpeg -h encoder=libx264
[15:51:56 CET] <saml> i'm asking this question again. what is a good lossless codec and container?
[15:52:33 CET] <furq> depends what you need
[15:52:38 CET] <saml> -vcodec libx264  -crf 0   out.mp4    is good but I don't like .mp4 cause I cannot send stream to -i
[15:52:45 CET] <furq> use mkv then
[15:52:58 CET] <saml> what's a good video codec for mkv?
[15:53:20 CET] <furq> anything
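A sketch of the lossless-to-mkv route saml is after; input and output names are placeholders:

```shell
# -qp 0 is x264's explicit lossless mode (-crf 0 is equivalent for
# 8-bit input). Matroska also reads fine from a pipe, unlike MP4,
# whose index is normally written at the end of the file.
ffmpeg -i in.mp4 -c:v libx264 -qp 0 -preset veryslow -c:a copy out.mkv
```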
[15:54:18 CET] <wouter> kepstin: would it perhaps be possible to tell ffmpeg to re-encode the first and last videos, but not the main one?
[15:55:41 CET] <furq> isn't the main video the one that's causing issues
[15:56:12 CET] <wouter> well, yeah, but it's also the huge one
[15:56:24 CET] <wouter> the main one would be a whole talk (say an hour or so), the other two are 5 seconds each
[15:56:33 CET] <furq> well yeah you'd still have the same problem then
[15:56:35 CET] <saml> https://gist.github.com/saml/a384801a53a87bfd4a78df6a5a5308a2  i'm still hung up on this psnr.  am i doing lossless mkv wrong way?
[15:56:48 CET] <wouter> and I'm really only transcoding PNG files to x264 because otherwise it seems impossible to do that?
[15:56:48 CET] <therage3> saml: mkv container supports an obscene amount of video codecs, so choose one that serves your purpose the best.
[15:56:48 CET] <furq> there's no way to get x264 to automatically match the sps values of another stream
[15:57:01 CET] <wouter> but maybe I missed something obvious
[15:57:25 CET] <saml> sps?
[15:57:25 CET] <furq> i'm sure there's a tool that shows you all the sps/pps values but i can't find it now
[15:57:38 CET] <furq> afaik those should ideally match exactly if you want to concat two streams
[15:57:44 CET] <furq> although some decoders are probably a bit more lax about it
[16:01:08 CET] <wouter> Hrm. Is there a page with recommended settings for x264 encoding somewhere? Things like bitrates etc
[16:01:13 CET] <furq> oh duh
[16:01:14 CET] <furq> wouter: https://github.com/aizvorski/h264bitstream
[16:01:28 CET] <furq> the h264_analyze utility from that should be able to show you what's different
[16:01:35 CET] <wouter> ah, heh, okay
[16:01:36 CET] <wouter> thanks
[16:01:57 CET] <furq> i forgot there's an actual binary in that, i thought it was just a lib
[16:02:06 CET] <wouter> right
[16:07:18 CET] <wouter> furq: https://grep.be/~wouter/test.264 is the output from the hardware decoder
[16:08:35 CET] <wouter> er, I mean, what that tool produces on the output of the hardware *en*coder, obviously
[16:11:28 CET] <wouter> the postroll just doesn't have any lines saying "sps"... what does that imply?
[16:12:05 CET] <wouter> furq: https://grep.be/~wouter/post.264
[16:12:40 CET] <furq> how did you encode the postroll
[16:13:22 CET] <wouter> ffmpeg '-loglevel' 'warning' '-y' '-loop' '1' '-framerate' '25/1' '-i' 'fosdem2018_sponsors_bg.png' '-f' 'lavfi' '-i' 'anullsrc=channel_layout=mono' '-c:v' 'libx264' '-r:v' '25/1' '-speed' '4' '-c:a' 'aac' '-ar' '48000' '-t' '5' '-pix_fmt' 'yuv420p' '-profile:v' 'main' '-level' '31' 'video_test-postroll.mkv'
[16:14:13 CET] <wouter> (that's a generated command line, but...)
[16:14:35 CET] <furq> weird
[16:14:43 CET] <furq> i take it that plays by itself
[16:15:02 CET] <wouter> let me double-check that ;)
[16:15:19 CET] <wouter> yeah, it does
[16:15:25 CET] <furq> if there's actually no sps then it probably wouldn't play at all
[16:15:31 CET] <furq> and also ffmpeg shouldn't be producing anything that broken
[16:15:37 CET] <furq> so i wonder if h264_analyze just isn't very good
[16:15:42 CET] <wouter> obviously though, since it's a single thing that doesn't move...
[16:15:59 CET] <wouter> furq: I need to go away now for a while, not sure when I'll be back, but at least an hour or so
[16:16:38 CET] <wouter> furq: thanks for the help so far anyway, I'll see if I can think out of the box a bit and maybe fix it in some way
[16:53:25 CET] <yusa> Hello there, i was just trying to read PNGs at 10 fps and convert dem to a video on the fly, displayed with mpv via: "cat output/*png | ffmpeg -r 10 -i - -c:v libx264 -f flv - | mpv -" But it seems that, First: All PNGs are loaded and then transcoding is started. Thats a problem for me because actually i want to read PNGs coming from a named pipe and transcode them on the fly to video displayed by mpv. Any ideas to overcome this issue?
[16:54:58 CET] <yusa> Sorry for typos: *dem=them
[16:58:46 CET] <kepstin> yusa: by default, libx264 uses a very large buffer of frames so it can do efficient encoding. If you instead want it to output frames as soon as possible, add "-tune zerolatency" to the ffmpeg output options.
[16:59:34 CET] <kepstin> but that said, you should be able to convince mpv to play the png stream directly without needing ffmpeg in the middle
[17:03:40 CET] <yusa> that would be fine also :-D
[17:04:46 CET] <yusa> with -tune zerolatency it seems to start working roughly after frame 50
[17:05:00 CET] <kepstin> yusa: that's probably just down to pipe buffers then
[17:05:17 CET] <yusa> cat output/*png |  mpv -                  This is not working :D
[17:05:39 CET] <yusa> yes probably due to my pipe
[17:05:48 CET] <kepstin> you'd see different behaviour if you're slowly passing frames in instead of using cat to load a lot at once
[17:07:56 CET] <kepstin> yusa: try "mpv --demuxer-lavf-format=image2pipe -"
[17:09:04 CET] <kepstin> you might also need to add --demuxer=+lavf
[17:13:01 CET] <furq> failing that, if this is just for mpv then just use -c:v rawvideo -f nut in your original command
[17:13:22 CET] <yusa> Awesome man, thanks a lot! This works for me:       cat <>out_pipe.png |  mpv --demuxer-lavf-format=image2pipe -
[17:13:50 CET] <yusa> What would the lavf option mean?
[17:14:11 CET] <kepstin> yusa: mpv has a bunch of its own demuxers, or it can use libavformat (from ffmpeg) to demux
[17:14:16 CET] <kepstin> lavf is short for libavformat
[17:15:11 CET] <yusa> ok thanks
[17:18:01 CET] <yusa> No ffmpeg command needed anymore, i think i have to leave irc now :P ;-)
[17:27:53 CET] <saml> hi bloodbath
[18:17:56 CET] <zerodefect> What is the easiest/best way to do an AVFrame format conversion using the C-API? I'm using BT.601 and BT.709, so I'd like to preserve the transfer characteristics. Setting up a graph is a bit cumbersome, and it feels a bit over the top.
[18:18:28 CET] <zerodefect> I'd like to convert from YUYV422 (interleaved) to YUV422P
[18:19:09 CET] <kepstin> zerodefect: you either have to use libswscale directly, which means you have to do a lot of manual work with AVFrames, or use libavfilter, which means setting up a filter graph
[18:20:06 CET] <kepstin> the latter is *probably* the better option if you're already working with AVFrames
[18:21:16 CET] <zerodefect> Ok. I'll go down that route.
[18:21:25 CET] <zerodefect> thanks @kepstin
[18:53:31 CET] <saml> [Parsed_ssim_1 @ 0x5631b0e6f660] SSIM Y:0.874944 (9.028965) U:0.962720 (14.285209) V:0.965424 (14.612272) All:0.904654 (10.206954)
[18:53:38 CET] <saml> does that mean two videos are similar?
[18:53:46 CET] <saml> i'm not sure what's the number in parens
[19:10:02 CET] <kepstin> saml: I think the main number is the similarity (1.0 is identical), which means the second is... hmm, I have no idea.
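[Editorial note] The parenthesized values in the log line above are consistent with the same SSIM score expressed in decibels, i.e. -10 * log10(1 - ssim); a perfect score of 1.0 would be infinite dB. A small awk check of that reading (this helper is an illustration, not anything from ffmpeg):

```shell
# Convert an SSIM score to the dB form ffmpeg's ssim filter appears to
# print in parentheses: dB = -10 * log10(1 - ssim). awk's log() is the
# natural log, so divide by log(10) to get log10.
ssim_db() {
  awk -v s="$1" 'BEGIN { printf "%.6f\n", -10 * log(1 - s) / log(10) }'
}
ssim_db 0.904654   # the "All" score above; prints roughly 10.2069
```

Applied to the logged Y score 0.874944 it likewise gives roughly 9.0289, matching the (9.028965) in the output.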
[19:43:13 CET] <wouter> furq: so I'm an idiot, it actually is ffmpeg after all, and I found the exact command line that's in use, so I can copy settings
[19:43:18 CET] <wouter> sorry about confusing you ;-)
[19:45:57 CET] <wouter> turns out I needed to add -g 45
[19:46:04 CET] <wouter> not sure what that parameter does, but it fixes the issue
[19:47:03 CET] <wouter> actually, might be probesize and/or analyzeduration, too, but anyway
[19:47:22 CET] <kepstin> -g sets the gop size (keyframe interval); the default is 250, which is much larger and means more reference frames.
[19:51:54 CET] <wouter> ahh, there we go, so yeah, that's probably the reason then
[19:51:59 CET] <wouter> thanks for your help!
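[Editorial sketch] kepstin's explanation of -g can be illustrated as follows; filenames are placeholders and the command is echoed rather than run.

```shell
# -g 45 asks the encoder to emit a keyframe at least every 45 frames
# (about every 1.5 s at 30 fps), so a client joining mid-stream can
# start decoding much sooner than with the 250-frame default.
gop_cmd='ffmpeg -i in.mp4 -c:v libx264 -g 45 out.mp4'
echo "$gop_cmd"
```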
[20:02:47 CET] <saml> it's so weird. ssim filter even works on two videos with different framerates
[20:05:23 CET] <saml> if i'm changing framerate to optimize for some players,  is -r a better option than -vf framerate or -vf fps ?
[20:05:40 CET] <furq> -r is -vf fps
[20:08:13 CET] <saml> https://lists.ffmpeg.org/pipermail/ffmpeg-user/2013-July/016273.html   this person suggests they are different
[20:09:00 CET] <saml> or i'm reading wrong. it's hard to read
[20:09:32 CET] <BtbN> that looks like a confused wall of text to me
[20:09:36 CET] <furq> yeah he's confused
[20:09:44 CET] <furq> -r as an output option appends -vf fps to the end of the filterchain
[20:09:53 CET] <furq> -r as an input option is an alias for the -framerate option some demuxers have
[20:09:59 CET] <saml> yup
[20:10:17 CET] <saml> so,  for output, would you use -vf fps   or -vf framerate ?
[20:10:20 CET] <furq> fps
[20:10:25 CET] <saml> why?
[20:10:51 CET] <saml> framerate is just computationally more expensive without noticeable effect on human perception?
[20:12:05 CET] <wouter> saml: framerate is "frames per second", which abbreviates to fps?
[20:12:14 CET] <wouter> unless you speak a different version of english
[20:12:58 CET] <wouter> unless I'm missing something
[20:12:59 CET] <saml> there are two filters: fps  and framerate
[20:13:13 CET] <wouter> oh, okay, sorry
[20:13:15 CET] <saml> there's also minterpolate filter but it's way too expensive
[20:14:30 CET] <furq> framerate does a linear blend of missing frames and it generally looks terrible
[20:14:45 CET] <furq> if you prefer the way it looks for your input then by all means use that
[20:16:05 CET] <saml> ah i see. so when downsampling (going to a lower frame rate), either makes no difference
[20:33:01 CET] <kepstin> saml: I think when reducing framerate, there's some circumstances where the 'framerate' filter may still blend frames?
[20:40:29 CET] <saml> hrm i see
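[Editorial sketch] The two filters discussed above differ in how they synthesize the target rate; a side-by-side sketch with placeholder filenames, echoed rather than executed:

```shell
# fps duplicates or drops whole frames to hit the target rate;
# framerate blends neighbouring frames together (the linear blend
# furq mentions), which can look soft or ghosted.
fps_cmd='ffmpeg -i in.mp4 -vf fps=30 out.mp4'
framerate_cmd='ffmpeg -i in.mp4 -vf framerate=30 out.mp4'
printf '%s\n%s\n' "$fps_cmd" "$framerate_cmd"
```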
[20:55:50 CET] <Pandela_> Hey guys
[20:56:03 CET] <Pandela_> Is there an ffplay for android?
[21:49:04 CET] <saml> what is ffplay
[21:49:54 CET] <saml> FFplay is a very simple and portable media player using the FFmpeg libraries and the SDL library. It is mostly used as a testbed for the various FFmpeg APIs.
[21:50:06 CET] <saml> nice
[21:50:08 CET] <JEEB> it's a proof of concept level thing
[21:50:18 CET] <JEEB> never, ever think of it as a proper multimedia player
[21:50:32 CET] <JEEB> it can be useful for testing in some cases
[21:50:56 CET] <BtbN> I have used it in some "professional" situations, where I just needed a UI-less player to play a video in an infinite loop forever.
[21:50:56 CET] <saml> that's cool
[21:51:04 CET] <BtbN> No other player could do that as easily as ffplay
[21:51:14 CET] <saml> kiosk?
[21:51:20 CET] <saml> mplayer ?
[21:51:32 CET] <BtbN> mplayer just crapped on itself after a few months
[21:51:34 CET] <JEEB> I've been doing similar stuff with mpv's --loop
[21:51:35 CET] <BtbN> so ffplay it was
[21:51:44 CET] <saml> wow so uptime of months?
[21:51:58 CET] <JEEB> although I haven't usually run it for more than a full day or so
[21:52:02 CET] <BtbN> it's playing on loop for 3 years now
[21:52:11 CET] <saml> that's more stable than erlang
[21:52:13 CET] <DHE> ffplay has some interesting user interactions. for example if you click anywhere in the player, it's treated as a seek request based on the horizontal offset of the position
[21:53:13 CET] <saml> can ffplay play HLS? /me hides
[21:53:27 CET] <JEEB> anything lavf can read
[21:53:30 CET] <BtbN> it can play anything libav* can
[21:54:10 CET] <saml> so it could be a viable option for devices like chromecast?
[21:54:19 CET] <DHE> that includes hls, but last I checked it wasn't good enough to, for example, do dynamic bitrate control
[21:54:47 CET] <JEEB> yea, vlc is one of the few to actually have dynamic profile switching
[21:54:47 CET] <saml> because tv isn't touch screen, it won't seek
[21:55:57 CET] <geuis> k
[21:55:59 CET] <BtbN> chromecasts only support a very few selected formats for no good reason
[21:56:14 CET] <geuis> clear
[21:56:23 CET] <geuis> sry trying a new irc client
[21:56:50 CET] <DHE> but if you wrote an app for chromecast, you should be able to use libavformat (and only avformat) to make a generic video player, but use the hardware decoder for h264 and the codecs...
[21:57:09 CET] <DHE> I should get a chromecast...
[21:57:28 CET] <BtbN> can you even run anything on them?
[21:57:58 CET] <DHE> there's limited apps. but I think the main intention is for them to be the go-between for your phone and a TV
[21:58:09 CET] <saml> you can write a streaming server using ffmpeg  and give chromecast url of your server to stream anything
[21:58:35 CET] <DHE> you can pull up a youtube video on your phone's youtube app, select "Cast to Chrome" (or whatever it's called) and the chromecast will play it. even if the phone is shut down, it keeps playing even through the "Next up" stuff
[22:01:00 CET] <karen__> Hello JEEB , thanks for your help yesterday. I thought I'd pass on that the final workaround ended up being: ffmpeg -c:v hevc -i 1_01_R_171112194500.avi -c copy fixed.mp4 (instead of the .mkv that y'all suggested)
[22:01:42 CET] <karen__> it gave me a file that i can playback in mplayer (not vlc though) - and mplayer was able to seek the file as well -
[22:10:42 CET] <saml> -c:v in front of -i has different meaning?
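[Editorial note] saml's question goes unanswered in the log. As I understand the ffmpeg CLI, per-stream options placed before -i apply to that input, so -c:v there selects the decoder rather than an encoder. A sketch using karen__'s command (echoed, not executed):

```shell
# Before -i: -c:v hevc forces the hevc *decoder* for reading the input
# (useful when the container misidentifies the stream).
# After -i: -c copy stream-copies everything into the output.
decode_cmd='ffmpeg -c:v hevc -i 1_01_R_171112194500.avi -c copy fixed.mp4'
echo "$decode_cmd"
```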
[22:25:22 CET] <bencc> do I need to pay for license if I transcode from HEVC to VP8 on a server?
[22:29:43 CET] <kepstin> bencc: ask a lawyer? random people on irc can't really give legal advice :)
[22:30:01 CET] <kepstin> bencc: you probably don't need to pay anything for encoding to vp8, at least :)
[22:31:21 CET] <bencc> kepstin: not asking for official legal advice
[22:31:30 CET] <bencc> but people here knows about codecs :)
[22:31:48 CET] <bencc> I can't find info if transcoding from hevc requires a license
[22:32:07 CET] <kepstin> bencc: I know that several groups of companies and individual companies have patents on hevc, and while some of them have public licensing terms, not all of them do.
[22:32:24 CET] <kepstin> and those patents may or may not apply to you depending on where you are
[22:32:28 CET] <kepstin> so ask a lawyer.
[22:32:50 CET] <bencc> thanks
[23:12:21 CET] <caim> Hello ^_^ Have a noob question please :D I found out ffmpeg can copy the streams ? and redid my command to copy the audio stream with -c:a copy , is there a way to copy the video also ?  I'm using this command to transmux or w/e it's called this rtmp stream to a HLS stream
[23:13:04 CET] <caim> >>  ffmpeg -v verbose -i rtmp://ip:1935/live1/gcha -c:v libx264 -c:a copy -ac 1 -strict -2 -tune zerolatency -crf 23 -preset veryfast -profile:v baseline -maxrate 1200k -bufsize 1200k -x264opts keyint=30:min-keyint=10:no-scenecut -b:v 500k -threads 0 -g 40 -movflags +faststart -flags -global_header -hls_time 10 -hls_list_size 6 -hls_wrap 10 -start_number 1 /var/www//testgreen2.m3u8
[23:14:16 CET] <caim> so maybe i can delete all the stuff about x264opts, bufsize etc and have only -c:v copy and -c:a copy ?
[23:15:16 CET] <BtbN> just don't specify a stream, and it will apply to all.
[23:15:29 CET] <caim> hmm
[23:25:45 CET] <alexpigment> caim: judging by the "hmm", i'm guessing you may have misunderstood BtbN's suggestion
[23:25:50 CET] <arooni> question;  00 libavformat >= 58.0.102 libswscale >= 5.0.101 libavfilter >= 7.0.101 libswresample >= 3.0.100' not found) Unable to find development files for some of the required FFmpeg/Libav libraries. Git master is recommended. ;; how do i get these libraries on mac os x
[23:25:54 CET] <arooni> was trying to install mpv from source
[23:26:21 CET] <alexpigment> caim: so what he means is to just say -c copy , rather than -c:a copy (which is specifying the audio stream when you put :a)
[23:26:41 CET] <furq> caim: copying the video stream won't work very well with hls because it'll end up ignoring -hls_time
[23:26:46 CET] <caim> ah so it copies both that's cool
[23:26:50 CET] <furq> unless your input happens to have 10-second gops
[23:27:59 CET] <BtbN> i think hls actually supports segment/gop length mismatches
[23:28:25 CET] <caim> yea I'll leave it like this, hope it doesn't crash overnight with WriteN, RTMP send error 104 (5 bytes)  i read somewhere i need to reduce the settings to reach speed > 1x
[23:28:38 CET] <caim> i saw when speed dropped to 0.986x this error pops up
[23:29:04 CET] <furq> i'm pretty sure apple devices will throw a fit if the segment size doesn't match the EXTINF duration
[23:29:24 CET] <furq> so it depends how smart ffmpeg's hls muxer is about that
[23:29:52 CET] <furq> hopefully it puts the actual duration and not whatever you set hls_time to
[23:31:18 CET] <furq> it might also compare it to ext-x-targetduration as well, i forget now
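[Editorial sketch] Putting BtbN's advice together, caim's command collapses to something like the sketch below (URL and paths as in the original, echoed rather than run). All the encoder options drop away because nothing is re-encoded; per furq's caveat, segment boundaries then follow the input's keyframes rather than -hls_time exactly.

```shell
# Stream-copy transmux: RTMP in, HLS out, no re-encoding.
hls_cmd='ffmpeg -v verbose -i rtmp://ip:1935/live1/gcha -c copy -hls_time 10 -hls_list_size 6 -hls_wrap 10 -start_number 1 /var/www/testgreen2.m3u8'
echo "$hls_cmd"
```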
[23:38:23 CET] <bonk> does anybody have any idea how to combine a video and audio stream live? my ip cam gives me /video and /audio.opus and i need to combine them before i serve the content
[23:39:31 CET] <klaxa> you could try ffmpeg -i video-url -i audio-url -map 0 -map 1 [whatever output you need] but it will probably be out of sync
[23:40:17 CET] <BtbN> will be hard if not impossible to sync up perfectly
[23:40:48 CET] <bonk> oh ok
[23:40:49 CET] <bonk> thanks
[23:40:59 CET] <bonk> i'll try that
[23:41:22 CET] <BtbN> could try to pass through device timestamps, and then just hope it does something sensible with them
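[Editorial sketch] klaxa's command combined with BtbN's timestamp idea might look like the sketch below. One related input option is -use_wallclock_as_timestamps, which stamps packets with the local clock as they are read; whether that actually lines the two live inputs up is exactly the "hope it does something sensible" part. Camera URLs and the output name are placeholders; the command is echoed rather than executed.

```shell
# Read both live inputs, stamp each packet with the wall clock on
# arrival, then mux video and audio together without re-encoding.
mux_cmd='ffmpeg -use_wallclock_as_timestamps 1 -i http://cam/video -use_wallclock_as_timestamps 1 -i http://cam/audio.opus -map 0 -map 1 -c copy out.mkv'
echo "$mux_cmd"
```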
[23:44:47 CET] <caim> oh wow did it with -c copy only and this 1.9% cpu is awesome while the other one that does encoding is at  34.9%. Seems to be working fine and plays on my iphone. Thanks so much everyone!
[23:46:50 CET] <arooni> yay building from source
[23:46:56 CET] <arooni> always best for latest and greatest
[23:48:27 CET] <bonk> the map command that was sent earlier doesn't seem to actually output any audio
[23:48:38 CET] <bonk> if i listen closely there is only static
[00:00:00 CET] --- Fri Feb  2 2018