[Ffmpeg-devel-irc] ffmpeg.log.20170930

burek burek021 at gmail.com
Sun Oct 1 03:05:01 EEST 2017


[00:00:02 CEST] <cryptodechange> That was with CRF 16
[00:00:15 CEST] <cryptodechange> If I applied the same to my DBZ blurays, it would probably be 16mbps
[00:00:35 CEST] <BtbN> libx264 is pretty decent at using my 8 cores, how many cores are you throwing at it?
[00:00:50 CEST] <redrabbit> that's plenty of bitrate
[00:01:02 CEST] <BtbN> at some point more cores don't do much anymore. Can't split the work in indefinitely small chunks
[00:01:18 CEST] <cryptodechange> my target is around 8-12mbps for 1080p, preferably lower with it being anime
[00:01:45 CEST] <BtbN> So, use cbr/abr encoding then, and not crf
[00:01:49 CEST] <cryptodechange> But seeing as 1080p is usually targeted around 8-12mbps on most streaming services, I am happy just clocking it at that
[00:07:37 CEST] <cryptodechange> JEEB: what am I looking for in the logs regarding swscale?
[00:34:13 CEST] <cryptodechange> it was VLC
[00:34:14 CEST] <cryptodechange> x_x
[00:34:17 CEST] <cryptodechange> https://imgur.com/a/EUmw5
[00:34:31 CEST] <cryptodechange> no noticeable difference in color
[00:34:49 CEST] <cryptodechange> though linework seems a bit softer, some vertical artifacts
[00:36:30 CEST] <CoreX> yeah you can see it on the nose
[00:38:09 CEST] <cryptodechange> added the command line
[00:39:37 CEST] <cryptodechange> not sure how I would go about making it less soft
[00:43:28 CEST] <CoreX> i would say that is perfect as it is just a minor detail but when its playing you wont notice it
[00:49:32 CEST] <cryptodechange> like 2 layers of pixel gradient vs. 2-3
[01:10:24 CEST] <Arfed> I want to use ffmpeg to connect to an rtsp stream and restream it on localhost - is this possible now that ffserver is gone?
[01:11:43 CEST] <furq> if you want to restream to multiple clients then use nginx-rtmp
[01:14:01 CEST] <Arfed> cheers, that may do the trick
[01:17:52 CEST] <Arfed> would you know offhand, if nginx-rtmp will work properly in windows? seems to be a project that's still maturing
[01:29:26 CEST] <Arfed> is there a more straightforward program which can connect to an rtsp stream, and restream it locally? particularly, cross-platform. nginx-rtmp looks like it may not be a mature enough project
[01:30:17 CEST] <furq> i should have thought it'd work on windows
[01:30:22 CEST] <furq> there are windows builds around on github
[01:32:45 CEST] <Arfed> ya it's just, I don't trust them all too much - they're unofficial builds from random people, and could contain anything
[01:33:17 CEST] <Arfed> the project isn't mature enough to host its own builds and all, so would rather find something more mature than nginx-rtmp
[01:34:02 CEST] <JEEB> I think multiple CDNs at this point are using nginx-rtmp :D
[01:34:26 CEST] <Arfed> ya - that's the thing, it seems more developer focused than end-user focused - so makes sense for CDN's to use it
[01:34:28 CEST] <DHE> nginx in non-rtmp configurations (web and stuff) is awesome. I expect nginx-rtmp to be very good
[01:34:35 CEST] <JEEB> lol
[01:34:53 CEST] <JEEB> it has nothing to do with upstream nginx developer-wise so in that scale that comparison is a bit misguided
[01:35:15 CEST] <JEEB> Arfed: most likely the developers only care about linux, where you'd build the module yourself anyways
[01:35:22 CEST] <JEEB> or *nix in general
[01:35:24 CEST] <Arfed> ya it seems so
[01:35:42 CEST] <DHE> wonder if that linux compatibility layer in windows would be of use?
[01:35:43 CEST] <JEEB> but it seems like "maturity" is not the problem according to what I hear
[01:35:54 CEST] <JEEB> well he could just build his own nginx with the module
[01:35:57 CEST] <JEEB> for windows
[01:36:00 CEST] <JEEB> if he needs windows
[01:36:07 CEST] <Arfed> I just want to stick to duct-taping commandline based stuff together - don't want to delve into custom builds and all here
[01:36:09 CEST] <JEEB> Arfed: anyways what was your actual use case?
[01:36:19 CEST] <JEEB> your input is rtsp
[01:36:31 CEST] <JEEB> that much is clear, but what do you mean "restream it on localhost"
[01:36:33 CEST] <Arfed> I just want to restream an RTSP stream, and connect multiple ffmpeg instances to it locally
[01:36:45 CEST] <furq> tbf hosting your own builds isn't really a sign of maturity, it's just a sign of not really caring about windows
[01:36:55 CEST] <furq> it's very easy to build on *nix
[01:37:14 CEST] <JEEB> ok, try vlc then. the sout syntax is godawful but it should be able to serve rtsp on localhost and able to read your original rtsp
[01:37:17 CEST] <furq> and generally modules don't need to do any explicit work to support windows, that work is all done by nginx
[01:37:36 CEST] <JEEB> nginx-rtmp is not going to help you with rtsp-to-rtsp
[01:37:49 CEST] <furq> well i assume "restream" doesn't necessarily mean rtsp output
[01:38:03 CEST] <furq> but yeah vlc is probably an easier choice on windows
[01:38:32 CEST] <Arfed> doesn't have to be rtsp, I just want something as close to passthrough streaming as I can get, to reduce chances of compatibility issues or unreliability
[01:38:46 CEST] <Arfed> cheers though, didn't think of vlc
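JEEB's vlc suggestion above could look something like the following sketch; the sout chain is reconstructed from memory, the URLs are placeholders, and the script only prints the command it would run rather than executing it:

```shell
#!/bin/sh
# Sketch only: restream an rtsp source as rtsp on localhost with vlc.
# The source URL and output path are hypothetical; the sout syntax is
# an assumption about vlc's (admittedly godawful) stream-output chain.
CMD="cvlc rtsp://camera.example/stream --sout '#rtp{sdp=rtsp://:8554/restream}'"
# Printed, not run, since no camera or vlc install exists here:
echo "$CMD"
```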
[01:40:12 CEST] <Arfed> the end goal though, is just to have multiple ffmpeg instances accessing 1 stream - but without having multiple connections to that remote stream
[01:40:41 CEST] <Arfed> maybe ffmpeg has a way of doing this...ffserver looked like a way, before it was removed
[01:42:49 CEST] <JEEB> it hasn't been removed but it generally isn't a passthrough. as in, passthrough touches the least amount of the data as possible. that said, vlc isn't either. it'll just have more documentation and users than ffserver ever had
[01:43:20 CEST] <Arfed> i was thinking something like the way -vcodec copy works
[01:43:42 CEST] <JEEB> that's only passthrough of the demultiplexed packets
[01:43:47 CEST] <JEEB> and both ffmpeg and vlc can do that
[01:44:17 CEST] <JEEB> but there's a clear distinction between passing something through / re-transmitting
[01:44:23 CEST] <JEEB> and demultiplexing and re-multiplexing
[01:44:24 CEST] <furq> why do you need multiple ffmpeg instances
[01:44:33 CEST] <furq> i assume it's not something you can do with multiple outputs
[01:44:46 CEST] <Arfed> I want one ffmpeg instance to transcode to disk, and I want an ffplay instance to display the output at same time
[01:44:54 CEST] <furq> you can do that with one ffmpeg
[01:45:00 CEST] <JEEB> and I recommend mpv for playback
[01:45:13 CEST] <furq> -i foo -c:v libx264 bar.mp4 -c:v rawvideo -f nut - | mpv
[01:45:15 CEST] <JEEB> uses the FFmpeg libraries in the background and is minimal
[01:45:25 CEST] <Arfed> ya I've been using mpv, but it's unreliable with the stream i have - it can lose connection but doesn't try to reconnect
[01:45:41 CEST] <JEEB> and ffplay does? then it's the default it changes with rtsp
[01:45:50 CEST] <JEEB> it seems to default to tcp while libavformat defaults to udp
[01:46:07 CEST] <Arfed> I don't know if ffplay is more reliable yet
[01:46:09 CEST] <JEEB> as in, mpv by default overrides the rtsp mode
[01:46:14 CEST] <furq> you can even pipe the encoded stream to mpv/ffplay while saving to disk if you use the tee muxer
[01:46:32 CEST] <Arfed> ok, I don't follow completely but this is starting to sound like a more promising route
[01:46:44 CEST] <furq> basically you can have multiple output files from one input
[01:46:52 CEST] <furq> it means you only have to decode it once in addition to only needing to connect once
[01:47:02 CEST] <Arfed> sounds ideal
[01:47:06 CEST] <furq> e.g. -i foo -c:v libx264 out.mp4 -c:v libvpx-vp9 out.webm
[01:47:22 CEST] <furq> but in your case the second output file would be rawvideo to stdout piped to mpv's stdin
[01:47:31 CEST] <furq> i'm pretty sure that'll still work on windows, but if not it definitely works in e.g. msys2
[01:47:52 CEST] <furq> without needing any special ffmpeg/mpv binaries or whatever
[01:48:02 CEST] <Arfed> okey, nice - and that should avoid the unreliability problem mpv has (assuming that isn't also present with ffmpeg)
[01:48:55 CEST] <Arfed> ah I see, nice
[01:48:56 CEST] <Arfed> https://trac.ffmpeg.org/wiki/Creating%20multiple%20outputs
[01:49:02 CEST] <Arfed> thanks for that, looks perfect
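furq's one-ffmpeg approach (encode to disk while piping rawvideo to mpv, so the rtsp source is only connected to once) can be sketched like this; the rtsp URL and output name are placeholders, and the script prints the command rather than running it since no live stream is available here:

```shell
#!/bin/sh
# Sketch of furq's suggestion: one ffmpeg process, two outputs.
IN='rtsp://example.com/stream'   # hypothetical source
# First output: x264 encode to disk. Second output: rawvideo in a nut
# container over stdout, piped into mpv for live preview.
CMD="ffmpeg -i $IN -c:v libx264 out.mp4 -c:v rawvideo -f nut - | mpv -"
echo "$CMD"
```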
[01:50:48 CEST] <furq> bear in mind that closing the player will kill the encode
[01:50:59 CEST] <furq> if one output drops then the whole thing dies
[01:51:07 CEST] <Arfed> so the output pipe needs to already be open, and when it closes it takes down ffmpeg ya?
[01:51:12 CEST] <furq> right
[01:52:01 CEST] <Arfed> okey - will explore other output alternatives, as ya that makes it a bit tricky
[01:54:22 CEST] <Arfed> that page above says it can output to a udp stream - would that avoid the pipe problem?
[01:54:57 CEST] <Arfed> nevermind - was messing with that earlier, and I think ffmpeg needs to have a remote connection for the udp stream - so same problem
[01:55:44 CEST] <Arfed> actually...for my use case it's fine, desired even, for ffmpeg to stop when the pipe/mpv closes
[02:05:44 CEST] <furq> not sure if this is what you meant but mpv/ffplay don't already have to be open beforehand
[02:06:01 CEST] <furq> piping to mpv will spawn mpv
[02:12:30 CEST] <Johnjay> thebombzen: I got the newer release of ffmpeg installed, and it does indeed have loudnorm filter in it
[02:12:42 CEST] <Johnjay> i still have no idea what the options do, but it's a start. are the filters written in C?
[02:13:00 CEST] <Arfed> oh, I've set up mpv to listen on a named pipe, and am trying to get ffmpeg to output to that named pipe
[02:13:36 CEST] <furq> i guess that works
[02:14:35 CEST] <furq> regular piping should work in the windows shell though
[02:14:43 CEST] <Arfed> am trying to do that at the moment, but mpv doesn't like the input
[02:14:48 CEST] <Arfed> (with named pipes)
[02:14:56 CEST] <furq> does the stream have audio
[02:15:01 CEST] <Arfed> no
[02:15:19 CEST] <Arfed> even if i specify audio though, doesnt work
[02:15:21 CEST] <furq> -f yuv4mpegpipe - | mpv -
[02:16:04 CEST] <Arfed> codec not supported
[02:16:09 CEST] <Arfed> i specified rawvideo as well
[02:16:16 CEST] <furq> in mpv?
[02:16:17 CEST] <furq> weird
[02:16:44 CEST] <furq> ffmpeg -f lavfi -i yuvtestsrc -f yuv4mpegpipe - | mpv -
[02:16:46 CEST] <furq> does that work
[02:16:49 CEST] <furq> that's the exact command i just ran
[02:18:10 CEST] <Arfed> not sure how to translate that to my usage - what does the first -f specify? is that  the input format?
[02:18:27 CEST] <furq> yeah
[02:18:31 CEST] <Arfed> okey
[02:18:39 CEST] <furq> just run that command
[02:18:51 CEST] <furq> -f lavfi -i yuvtestsrc just generates a test stream
[02:19:18 CEST] <furq> if that works then just replace that with your real input and add your file output
[02:23:18 CEST] <Arfed> ok, got the test input working
[02:26:28 CEST] <Arfed> hmm, okey - getting somewhere
[02:26:33 CEST] <Arfed> have the stream working now
[02:26:36 CEST] <Arfed> cheers
[02:29:44 CEST] <thebombzen> Johnjay: you're probably looking for acompressor, not loudnorm
[02:32:59 CEST] <Johnjay> "Because of its complex settings it may take a long time to get the right feeling for this kind of effect. "
[02:33:05 CEST] <Johnjay> --ffmpeg doc on acompressor
[02:39:39 CEST] <thebombzen> true
[02:39:54 CEST] <thebombzen> either way, you're probably not looking for loudnorm
[02:40:13 CEST] <thebombzen> you already have the numbers for acompressor though
[02:47:26 CEST] <Arfed> cheers furq - I've got this going now, almost exactly as I want
[03:07:54 CEST] <Johnjay> thebombzen: I guess it would be easier if I had something concrete to go on
[03:08:10 CEST] <Johnjay> like a graph of audio waveform I can see before and after applying the filter
[03:08:27 CEST] <Johnjay> maybe I should just go look at the filter source code
[03:29:47 CEST] <Mista_D> Any way to easily automate audio track dump? e.g. -i source_file -c:a copy file.$ext; where ext=ac3/mp2/mp3/aac... automatically?
[03:30:44 CEST] <wckd> Mista_D: pretty easy to script that thing, yes.
[03:32:02 CEST] <Mista_D> wckd: I thought there was a switch in FFmpeg somewhere )
[03:33:11 CEST] <wckd> Mista_D: Oh. Maybe this is close enough? -> https://stackoverflow.com/questions/21634088/execute-ffmpeg-command-in-a-loop
[03:34:01 CEST] <furq> does it have to be the "right" container
[03:34:06 CEST] <furq> you could just put them all in .mka
[03:34:47 CEST] <furq> failing that you'll need ffprobe and some kind of lookup table, unless you really want raw ac3/aac bitstreams, which you don't
[03:35:29 CEST] <Mista_D> furq: true, does mka preserve any additional timing/metadata?
[03:35:42 CEST] <furq> should do
[03:36:47 CEST] <furq> but yeah there's no way i know of to autoselect the output container, much less the file extension
[03:37:07 CEST] <furq> it'd only be three or four if statements in sh though
[03:37:21 CEST] <furq> assuming the audio codecs you're actually likely to find
[03:40:26 CEST] <furq> http://vpaste.net/nRmOi
[03:40:31 CEST] <furq> that's the ffprobe command you'd want fwiw
[03:42:05 CEST] <Mista_D> furq: thanks a lot
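The lookup furq describes ("three or four if statements in sh" mapping the ffprobe codec name to an extension) might be sketched as follows; the codec list is an assumption covering common cases, and the ffprobe invocation in the comment is likewise an assumption, since the pasted command is no longer reachable:

```shell
#!/bin/sh
# Sketch: map an audio codec name (as ffprobe reports it) to a
# sensible container extension, falling back to Matroska audio.
ext_for_codec() {
    case "$1" in
        aac)      echo m4a ;;   # m4a keeps timing metadata, unlike raw ADTS
        ac3|eac3) echo ac3 ;;
        mp2)      echo mp2 ;;
        mp3)      echo mp3 ;;
        flac)     echo flac ;;
        *)        echo mka ;;   # catch-all: .mka holds nearly anything
    esac
}
# A real script would first ask ffprobe for the codec, e.g. (assumed):
#   codec=$(ffprobe -v error -select_streams a:0 \
#             -show_entries stream=codec_name -of csv=p=0 "$1")
ext_for_codec "${1:-ac3}"
```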
[03:56:24 CEST] <thebombzen> Johnjay: or the documentation
[03:56:50 CEST] <thebombzen> other than that I can't provide any more info
[03:56:59 CEST] <thebombzen> read ffmpeg-filters.html in the docs, you might find what you're looking for
[04:08:04 CEST] <Arfed> is there a way to start ffmpeg without a window, in windows?
[04:20:16 CEST] <Mista_D> Encoding to 10 different outputs (1080,720..QCIF) using x264. Can't saturate 32 HT Xeon, any way to speed it up and use the entire CPU?
[04:34:41 CEST] <Arfed> ffmpeg doesn't appear to support running without a window - could this please be added?
[04:37:10 CEST] <furq> ffmpeg can't really do anything about that
[04:37:39 CEST] <furq> that's a linker option iirc
[04:38:03 CEST] <furq> and idk if enabling -mwindows would break everything
[04:48:00 CEST] <Arfed> I think other software solves this by providing a wrapper executable
[04:59:59 CEST] <Spec> Heya. I'm trying to capture a udp stream with ffmpeg. I can play the stream with ffplay and a few flags, but when I try with ffmpeg I can't seem to get it to work
[05:02:02 CEST] <Spec> here are the commands i am attempting https://pastebin.com/raw/cD2dQN1X
[05:09:46 CEST] <titbang> Arfed how do you expect to exec it without a window?
[05:10:08 CEST] <Arfed> ah, what do you mean?
[05:12:20 CEST] <Arfed> I did find a way to hide the window in the end, using the nircmd utility, with: nircmd win hide title 'windowtitle'
[05:12:46 CEST] <titbang> how u know when its done?
[05:13:36 CEST] <Arfed> it's reading an rtsp stream, and encoding it to disk, and piping it to mpv at the same time. when I close mpv, that closes the pipe and closes ffmpeg automatically as well
[05:16:07 CEST] <titbang> well done
[05:16:23 CEST] <Arfed> anyways, off to bed now, spent all night figuring that out - working pretty much perfectly now, pretty happy with that - thanks for help furq/all
[05:16:28 CEST] <Arfed> ta :)
[13:30:36 CEST] <rabbe> if i want to have real time live stream of my webcam shown in the browser, what do you recommend using?
[13:31:41 CEST] <BtbN> twitch
[13:31:51 CEST] <rabbe> is that some online stuff?
[13:31:59 CEST] <rabbe> i want offline
[13:32:01 CEST] <titbang> chaturbate
[13:32:03 CEST] <titbang> they pay good
[13:33:07 CEST] <BtbN> That makes no sense. What's the point of showing your webcam in a browser, if it's not streamed somewhere?
[13:33:47 CEST] <rabbe> just to get an interface for controlling a robot
[13:33:55 CEST] <rabbe> it's a job
[13:34:20 CEST] <BtbN> And why use a browser then? Just use some normal software to view it.
[13:34:58 CEST] <rabbe> the customer has a "control tower" web page, so i figured i can provide a stream which they can display in the same page
[13:35:42 CEST] <BtbN> so setup an rtmp server somewhere, stream to it, and make it output hls or dash, and display that
[13:35:54 CEST] <BtbN> but it will take considerable resources to encode the video on the source device
[13:36:03 CEST] <BtbN> Also will have at least 10 seconds of delay, if not 20.
[13:36:25 CEST] <rabbe> hm, okay. no way to do it otherwise?
[13:36:31 CEST] <BtbN> no
[13:36:49 CEST] <rabbe> i will be using a magewell hdmi-usb capture card
[13:36:58 CEST] <BtbN> For a webcam?
[13:37:01 CEST] <BtbN> That sounds overkill
[13:37:03 CEST] <rabbe> gopro :=
[13:37:20 CEST] <rabbe> so it will be "like a webcam"
[13:37:40 CEST] <BtbN> That machine better have a powerful USB3 controller then
[13:37:47 CEST] <BtbN> and a powerful processor to handle the video
[13:38:44 CEST] <BtbN> And Browsers generally do not support low latency video playback
[13:39:08 CEST] <rabbe> i guess it's ok with minor latency
[13:39:27 CEST] <BtbN> it's at least 3 times segment size
[13:39:35 CEST] <BtbN> and usually you have 4 second segments
[13:39:40 CEST] <BtbN> so 12 seconds minimum delay just by that
[13:39:51 CEST] <rabbe> hm
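BtbN's rule of thumb above (the player buffers roughly three segments before playback starts) works out as:

```shell
#!/bin/sh
# Minimum HLS/DASH delay is roughly 3 x segment length, per the
# discussion above. Segment length in seconds is the usual default:
SEG=4
DELAY=$((3 * SEG))
echo "$DELAY seconds minimum delay"
```

Shrinking the segments shrinks the delay proportionally, at the cost of more requests and more keyframe overhead.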
[13:41:12 CEST] <Cracki> webrtc is probably the only tech in browsers that's meant for low latency video
[13:41:30 CEST] <Cracki> but this means the codecs included in the browser _are_ capable of low latency decode
[13:41:43 CEST] <Cracki> (codecs, demuxers, etc)
[13:42:15 CEST] <BtbN> webrtc uses horribly low quality codecs
[13:42:15 CEST] <Cracki> magewells are good, imho
[13:42:24 CEST] <rabbe> can i not set the segment size then?
[13:42:25 CEST] <Cracki> low quality doesn't matter if it's low latency
[13:42:38 CEST] <BtbN> and streaming webrtc to a browser is a horrible mess that basically no readily made software exists for
[13:42:43 CEST] <BtbN> except other browsers
[13:43:06 CEST] <BtbN> It's designed for video chat and conferencing. Not video streaming from a server.
[13:43:16 CEST] <Cracki> I remember this discussion. isn't there a wiki page where all that is written down for reference?
[13:43:40 CEST] <BtbN> If you want to stream video to a browser, hls or dash with h264/aac it is
[13:43:54 CEST] <Cracki> again, I'd like to mention jpeg/mjpeg for low latency (and high bandwidth)
[13:43:58 CEST] <BtbN> Or vp8 or vp9, but support for that is not nearly as universal
[13:46:06 CEST] <rabbe> hm.. mpeg dash is fragmented mp4?
[13:47:48 CEST] <rabbe> how do i set that up with ffmpeg/ffserver.. i first take the camera input and save a fragmented mp4 file to disk, then stream it to ffserver using .ffm extension and ffserver can be set up to provide .mp4 stream to clients?
[13:48:48 CEST] <rabbe> hard to make sense of the information i've found online..
[13:50:47 CEST] <rabbe> with jpeg/mjpeg method, i would just serve the latest image from the camera?
[14:05:13 CEST] <broman> can anyone please help me?? I want to use ffmpeg to capture some webcam, but I need for it to store in separate files, so I can compress it with my compression infrastructure, and then stream over web... Until now, what I can do is to capture the webcam stream to a single file...
[14:14:05 CEST] <DHE> there's a 'segment' output file format that will do automatic splitting
[14:37:53 CEST] <broman> DHE, what do you mean?
[14:40:53 CEST] <broman> DHE, got it! thank you so much!
[14:41:24 CEST] <DHE> ffmpeg [inputoptions] -i input  [outputoptions] -f segment -segment_time 60 dir/output%03d.mp4   # makes 60-second segments with starting with output000.mp4 as a filename
[14:42:22 CEST] <broman> thank you!
[14:50:43 CEST] <broman> What about stream those segment files to a rtmp? is there a way to do it?
[14:51:31 CEST] <Mavrik> broman, why not setup a streaming server which will do segmentation and conversion to HLS/DASH/RTMP for you?
[14:54:26 CEST] <broman> Mavrik, I need real time live compression... so I'm setting up an openstack cloud with a few computers... so I will grab the full hd video, transcode it on my cloud and stream it to a personal rtmp web streaming
[14:54:39 CEST] <Mavrik> and?
[14:54:57 CEST] <Mavrik> That sounds exactly like you need a streaming server :P
[14:55:27 CEST] <broman> :P
[14:55:32 CEST] <broman> yes, but scalable
[14:56:09 CEST] <broman> We don't have the bandwidth to transmit the raw file
[14:57:18 CEST] <BtbN> rabbe, don't use ffserver, it's dead.
[15:00:50 CEST] <broman> I see how to stream a single mp4 file to a rtmp... but how about those compressed segments/
[15:02:54 CEST] <BtbN> a single mp4 file cannot be streamed, unless it's already finished.
[15:03:01 CEST] <JEEB> OBJECTION!
[15:03:09 CEST] <Mavrik> unless segmented :P
[15:03:12 CEST] <JEEB> *fragmented
[15:03:19 CEST] <Mavrik> that. :)
[15:03:22 CEST] <BtbN> when it's segmented or fragmented it's a lot of files stitched together
[15:03:29 CEST] <Mavrik> Still, I'm rather wary of doing that on the encoder
[15:03:50 CEST] <BtbN> And I doubt you can stream a fragmented mp4 anywhere. Especially not to an rtmp server.
[15:03:53 CEST] <BtbN> You use rtmp for that
[15:04:06 CEST] <JEEB> I've worked with plenty of solutions although usually they want MS's fork of it :<
[15:04:12 CEST] <JEEB> "smooth streaming"
[15:04:19 CEST] <JEEB> and then they output multiple formats out
[15:04:26 CEST] <Mavrik> ew
[15:04:27 CEST] <Mavrik> :)
[15:04:33 CEST] <JEEB> but per-stream fragments do make sense IMHO
[15:04:45 CEST] <Mavrik> Btw, why would you use RTMP instead of TS-over-UDP?
[15:04:47 CEST] <JEEB> for each dump of stuff you get start DTS + duration and then fragments
[15:05:22 CEST] <BtbN> Because rtmp actually has software that supports it
[15:05:23 CEST] <JEEB> well not fragments, samples
[15:05:24 CEST] <JEEB> :V
[15:05:33 CEST] <JEEB> and yes, a lot of solutions have focused on RTMP
[15:05:34 CEST] <BtbN> and ts over udp has terrible error and congestion handling
[15:05:35 CEST] <JEEB> which is unfortunate
[15:05:53 CEST] <JEEB> because now Adobe has given up
[15:05:53 CEST] <JEEB> and we will not have anything post-AVC on FLV :P
[15:05:59 CEST] <Mavrik> BtbN, how does congestion handling work for live streams?
[15:06:05 CEST] <Mavrik> I mean, at the end you drop stuff anyway?
[15:06:14 CEST] <BtbN> but you don't drop I frames if possible
[15:06:21 CEST] <BtbN> rtmp has that intelligence to drop B frames first
[15:06:25 CEST] <BtbN> udp has not
[15:07:06 CEST] <JEEB> but yea, I guess the per-track fragments let the media server index the content more easily, which is helpful when remultiplexing on the fly
[15:07:14 CEST] <Mavrik> on encoder <--> server transport?
[15:07:19 CEST] <BtbN> I think the thing that will come after rtmp in the long term is webrtc.
[15:07:29 CEST] <BtbN> Some streaming services already use hacked up versions of it for ingest
[15:07:33 CEST] <JEEB> yea
[15:07:39 CEST] <Mavrik> yea
[15:07:41 CEST] <JEEB> at least that's based on Actual Standards
[15:07:53 CEST] <BtbN> But it's so wtf complex that you might as well use black magic instead
[15:07:53 CEST] <JEEB> so it's not like random chinese asking for hacked up HEVC-in-FLV
[15:07:59 CEST] <JEEB> BtbN: lol yea
[15:08:18 CEST] <BtbN> and if it fails, nobody can tell you why or what to do about it
[15:08:20 CEST] <BtbN> I'm not a fan
[15:08:37 CEST] <BtbN> some new light weight streaming protocol would be nice
[15:09:04 CEST] <BtbN> all services that rely on webrtc I have used so far are horribly unstable.
[15:09:18 CEST] <BtbN> rabb.it for example crashes very frequently and then needs a refresh or even browser restart to revive it
[15:09:43 CEST] <JEEB> for ingest it can even be something like HTTP POST or just raw TCP :V then you just decide on something sane that you push in. be it MPEG-TS with different programs for profiles, or fragmented ISOBMFF with separate moof and empty moov
[15:09:57 CEST] <JEEB> or matroska even now
[15:09:58 CEST] <BtbN> You do need some intelligence though
[15:10:12 CEST] <BtbN> The packet-dropping-code in rtmp does some work usually
[15:10:14 CEST] <JEEB> and then whatever gets pushed out of the media server is then whatever the clients like
[15:10:54 CEST] <BtbN> I don't want something with fragments as ingest protocol
[15:11:07 CEST] <BtbN> the lag introduced by the segmented distribution is annoying enough
[15:12:02 CEST] <JEEB> well it generally makes sense. you get separate fragments for each track/stream and with each fragment you get a start DTS+duration
[15:12:08 CEST] <JEEB> they don't have to start with a RAP either
[15:12:16 CEST] <JEEB> so you can get the latency smaller with that
[15:12:27 CEST] <JEEB> although for really low latency you need UDP anyways
[15:13:26 CEST] <broman> I'm sorry guys, the conversation developed too fast... :). I'm capturing the webcam to mp4 segment files... I need to send it to my cloud, transcode them, get them back and then stream them to a rtmp nginx that I have... this is the workflow of the 'thing' I want to accomplish
[15:13:27 CEST] <BtbN> old Twitch with rtmp to ingest, rtmp to browser, has sub 1 second overall latency
[15:13:50 CEST] <BtbN> and that was all tcp of course
[15:13:54 CEST] <JEEB> yea
[15:14:12 CEST] <BtbN> when they started with HLS it was 30 seconds +
[15:14:25 CEST] <BtbN> Which they got down to ~6 seconds now, by violating the HLS spec in their player
[15:16:04 CEST] <JEEB> I don't remember how the browsers nowadays take it when you feed them stuff not beginning with a RAP, but basically single-sample fragments as a steady stream should do it :V
[15:16:30 CEST] <JEEB> of course you could just discard the non-RAP samples
[15:16:38 CEST] <BtbN> MSE basically only takes in normal mp4
[15:16:45 CEST] <BtbN> That's what all the massive js blobs do
[15:17:01 CEST] <BtbN> they stitch together the segments in a way that makes them look like a normal mp4 file with the moov atom in front
[15:17:29 CEST] <JEEB> also the problem IIRC with browsers is that they tend to cache the http request results
[15:17:38 CEST] <JEEB> so you need shit like websockets (?) to get a continuous stream going
[15:17:50 CEST] <JEEB> without having to request more and more separate HTTP requests
[15:17:55 CEST] <BtbN> well, you can just tell it to not cache something
[15:18:08 CEST] <JEEB> think of a non-ending HTTP GET
[15:18:09 CEST] <BtbN> it clearly has to not cache the playlist
[15:18:13 CEST] <BtbN> for HLS or DASH
[15:18:17 CEST] <JEEB> no, skip playlists even :P
[15:19:11 CEST] <JEEB> IIRC if you start doing XmlHTTPRequest the browser will keep buffering the result
[15:19:16 CEST] <JEEB> so HTTP is a no-go in browsers :P
[15:19:23 CEST] <JEEB> even if you can get the data while it's still reading (?)
[15:19:28 CEST] <BtbN> that depends entirely on the Cache-Control headers
[15:19:40 CEST] <JEEB> you're thinking in multiple requests
[15:19:52 CEST] <JEEB> I'm talking about a single request to the continuous live byte stream
[15:20:10 CEST] <JEEB> which seems impossible with the browsers due to how their HTTP request stuff works :/
[15:20:19 CEST] <JEEB> thus websockets or webrtc come up
[15:20:43 CEST] <BtbN> you can stream with XmlHttpReq
[15:20:57 CEST] <BtbN> Problem is, it will keep the entire thing in RAM, until the request is done and you free it
[15:21:05 CEST] <JEEB> that is what I meant by caching
[15:21:35 CEST] <JEEB> "...if you *start* doing XmlHTTPRequest the browser will keep buffering the result..."
[15:22:27 CEST] <JEEB> but yes, this effectively kills the CDN market that we've got now with standard nginx etc
[15:22:37 CEST] <JEEB> because suddenly it's not files any more
[15:22:41 CEST] <JEEB> but that was true to rtmp as well
[15:22:51 CEST] <JEEB> actual streaming protocols are harder to just throw CDNs at
[15:25:15 CEST] <BtbN> it's also annoying that MSE takes in audio and video seperately
[15:25:37 CEST] <BtbN> so you can't just make a fragmented mp4 stream with ffmpeg, put it into a websocket, and then write it into mse in a browser
[15:28:10 CEST] <rabbe> BtbN: if ffserver is dead, what replaced it? or does that depend on what i want to do?
[15:28:33 CEST] <JEEB> it depends on what you want to do
[15:28:41 CEST] <BtbN> nobody maintains it anymore, nobody has any knowledge about it anymore.
[15:28:55 CEST] <rabbe> ok.. :)
[15:28:56 CEST] <BtbN> But some people desperately fight against its removal
[15:32:26 CEST] <rabbe> so, my best bet to get a near-realtime webcam feed into a browser is with a dash stream? i think picture quality and smoothness is more important than low latency in this case. does that change my options?
[15:32:48 CEST] <BtbN> DASH has the same delay as HLS.
[15:33:02 CEST] <BtbN> near realtime to a browser just isn't a thing
[15:33:28 CEST] <JEEB> unless you go webrtc or something funky like that, which I haven't looked at too much from the server side yet
[15:33:42 CEST] <rabbe> you said its proportional to segment size. can i just decrease the segment size?
[15:34:35 CEST] <rabbe> is 1080p at 60fps too much to hope for, btw? :)
[15:34:48 CEST] <JEEB> no
[15:34:55 CEST] <JEEB> unless you need to do transcoding on a low end box
[15:35:04 CEST] <JEEB> if you can just get H.264 from the camera that already is positive
[15:35:15 CEST] <JEEB> (unless the camera sucks and the bit stream it gives out is broken)
[15:36:43 CEST] <rabbe> transcoding computer hardware can be whatever. in the beginning we had to have the computing unit near the robot arm with the camera, but now we send the signal with WHDI to the computer having the capture card
[15:37:16 CEST] <rabbe> its a hero6 camera
[15:37:50 CEST] <rabbe> sorry, hero5
[15:45:41 CEST] <rabbe> will you still need ffmpeg for doing this or are there alternatives on that side too?
[15:47:20 CEST] <JEEB> there's always multiple things doing different things, but generally either ffmpeg or something utilizing FFmpeg's libraries is recommended
[15:47:29 CEST] <JEEB> depends on how things are accessed etc
[15:48:12 CEST] <rabbe> ok
[15:48:14 CEST] <JEEB> you have one thing grabbing your capture (be it raw or already transcoded by the capture card) and outputting to a media server that then handles the re-packaging that might be needed for different clients
[15:48:49 CEST] <JEEB> generally the second part is handled by nginx-rtmp since it's one of the open sourced solutions (even though RTMP is definitely not future proof on any level due to adobe dropping it on the floor)
[15:48:55 CEST] <JEEB> or more like, adobe dropped FLV on the floor
[15:49:24 CEST] <JEEB> (which is the A/V container that's used in RTMP)
[15:50:06 CEST] <rabbe> aha, yeah i knew that flash stuff is going away, but didn't think the realtime stuff depended on it
[15:50:21 CEST] <JEEB> a lot of CDNs utilize RTMP as the ingest
[15:50:28 CEST] <JEEB> and media servers
[15:50:44 CEST] <JEEB> so they take FLV-over-RTMP in, and then serve HLS/DASH
[15:50:49 CEST] <JEEB> nginx-rtmp is one of such
[15:51:12 CEST] <JEEB> (and happens to be open source under a fairly liberal license so it's also often used in corporate scenarios I bet)
[15:51:30 CEST] <rabbe> ok, so capturecard -> ffmpeg -> nginx-rtmp -> browser
[15:51:35 CEST] <JEEB> yup
[15:51:56 CEST] <JEEB> that way you have a separation of concerns between "this does the transcoding" and "this does the media serving"
[15:51:57 CEST] <rabbe> i'll have a go and see
[15:52:17 CEST] <rabbe> ok
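The "capturecard -> ffmpeg -> nginx-rtmp" leg agreed on above might look like the sketch below; the capture device, encoder settings, and rtmp URL are all assumptions, and the command is printed rather than run since no capture hardware exists here:

```shell
#!/bin/sh
# Sketch only: push a capture device to a local nginx-rtmp ingest as
# FLV-over-RTMP (H.264 + AAC), per the discussion above. Device path,
# preset choices and the application/stream names are hypothetical.
CMD="ffmpeg -f v4l2 -i /dev/video0 -c:v libx264 -preset veryfast -tune zerolatency -c:a aac -f flv rtmp://localhost/live/stream"
echo "$CMD"
```

nginx-rtmp would then repackage that one ingest into HLS and DASH for the browser side.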
[15:53:02 CEST] <rabbe> i guess i will need both DASH and HLS for cross-browser support?
[15:53:14 CEST] <JEEB> you essentially get both with nginx-rtmp if I recall correctly
[15:53:23 CEST] <rabbe> oh, ok
[15:53:31 CEST] <JEEB> some players prefer one, some the other
[15:54:06 CEST] <JEEB> also another way I've seen people do is just have a plain nginx and then you can just HTTP POST to it with the HLS writer
[15:54:37 CEST] <JEEB> so instead of output file name you give ffmpeg a HTTP URL
[15:54:47 CEST] <JEEB> and then it will attempt to POST/DELETE things
[15:55:16 CEST] <rabbe> i see. and that would improve performance?
[15:55:38 CEST] <JEEB> no idea, it's just another way of getting an HLS stream out :)
[15:55:50 CEST] <rabbe> k :)
[15:56:01 CEST] <JEEB> in that case you would only be getting that one thing out instead of both DASH, HLS and RTMP
[15:56:11 CEST] <JEEB> which you'd get with nginx-rtmp
[15:56:43 CEST] <rabbe> ah
[15:57:25 CEST] <JEEB> also with that manual way you would have to generate the "meta" HLS playlist yourself :)
[15:57:26 CEST] <rabbe> ok, but the ffmpeg output should be flv then?
[15:57:34 CEST] <JEEB> in case of RTMP it is FLV over RTMP, yes
[15:58:07 CEST] <JEEB> which does support H.264 video and AAC audio as a solution, so it does (still) work for a lot of use cases
[15:58:11 CEST] <Mavrik> can RTMP carry anything else?
[15:58:32 CEST] <JEEB> not that I know of, it's a thing tied so heavily to Adobe
[15:58:37 CEST] <JEEB> and Adobe has since given up on it
[16:00:08 CEST] <JEEB> for ingest I would just replace RTMP with X over HTTP, and the X is something I don't have heavy preferences over :)
[16:01:17 CEST] <Mavrik> Hmm, doesn't HTTP have more overhead?
[16:01:28 CEST] <JEEB> I don't think so after you start the POST
[16:01:44 CEST] <rabbe> when i started looking at this you needed .mp4 (H.264+AAC) for IE, Safari, Chrome, Iphone and Android, then .webm (VP8+Vorbis) for Firefox and Opera. Is DASH the equivalent of MP4 and HLS the equivalent of WebM?
[16:01:45 CEST] <JEEB> it's not like you have to a) stop the POST or b) send additional HTTP headers
[16:02:26 CEST] <JEEB> rabbe: ok, if you really need the VP8/9+vorbis/opus then rip :D
[16:02:43 CEST] <JEEB> because there really isn't a nice ingest for anything yet taking those in
[16:03:08 CEST] <JEEB> there's the WebM DASH thing, but that's for the client serving side, not ingest
[16:03:46 CEST] <JEEB> also both firefox and opera under windows and macos support H.264+AAC just fine, with linux it depends on the distribution
[16:04:19 CEST] <JEEB> basically whether the distribution ships the decoder libraries. on windows and macos the system libraries are utilized.
[16:04:26 CEST] <rabbe> i'll just stick to vanilla DASH :)
[16:05:21 CEST] <rabbe> WebRTC is only related to the browser side then?
[16:05:23 CEST] <JEEB> although I guess it's possible to do it with the HTTP POST writing way, if it actually works with the webm DASH way :P
[16:05:53 CEST] <JEEB> rabbe: the underlying specification is RTP but yes - WebRTC is something that the web guys came up with as a realtime UDP-based thing for voice/video chat
[16:06:11 CEST] <rabbe> got it. thanx
[16:09:43 CEST] <rabbe> difference between RTMP, RTP and RTC?
[16:10:19 CEST] <JEEB> RTMP: Adobe-developed proprietary protocol over TCP that covers commands and FLV-over-it
[16:10:27 CEST] <JEEB> RTP: some UDP-based protocol for streaming data
[16:10:45 CEST] <JEEB> WebRTC: limited use case specification of how to use RTP with browsers
[16:11:14 CEST] <rabbe> UDP is because web sockets are used?
[16:11:20 CEST] <JEEB> websockets are separate
[16:11:39 CEST] <JEEB> the popularity of the first one in multimedia is just that it was the first one that got really popular on the web (flash player supported it, and it did live streaming)
[16:12:05 CEST] <JEEB> the fact that user-facing solutions almost never use it anymore didn't stop people from still using it for the ingest side :)
[16:12:33 CEST] <rabbe> k :)
[16:12:46 CEST] <JEEB> I think websockets is just a two-way protocol with TCP?
[16:12:46 CEST] <robswain[m]> webrtc essentially uses SRTP/DTLS - encrypted RTP
[16:13:01 CEST] <JEEB> yea
[16:13:11 CEST] <robswain[m]> yes, websocket starts as HTTP, does an 'Upgrade' request and is thereafter tcp
[16:13:43 CEST] <rabbe> ah, ok
[16:14:16 CEST] <rabbe> so this solution is not using TCP?
[16:14:30 CEST] <JEEB> RTMP and HTTP are both TCP-based protocols
[16:14:47 CEST] <rabbe> hehe.. the mess
[16:14:52 CEST] <JEEB> just that HTTP is a single client->server request-based thing
[16:15:07 CEST] <JEEB> RTMP I think is two-way (in theory)
[16:15:14 CEST] <rabbe> ok
[16:15:24 CEST] <robswain[m]> probably the reason webrtc is a bit awkward is because it is designed to be peer to peer, though both peers could be servers. the way a webrtc session is set up requires communicating SDP between the two clients to configure streams, codecs, IP and Port, transport protocol (udp, tcp, relayed, etc)
[16:15:43 CEST] <robswain[m]> and there is no standardised way of doing that 'signalling' as part of webrtc
[16:15:50 CEST] <JEEB> yup
[16:16:08 CEST] <robswain[m]> so you have to implement custom things for your service or so
[16:16:44 CEST] <robswain[m]> the protocol stack is mostly just the modern parts of RTP communication tech
[16:18:18 CEST] <rabbe> well, i'll have a go at this. thanks for all help
[17:40:22 CEST] <Fyr> guys, what is the difference between vorbis and libvorbis?
[17:47:16 CEST] <arpu> hmm how can i get a crf ( constant rate factor) without reencode ?  ( tested -15 and -vsync crf )
[17:47:44 CEST] <Fyr> arpu, why do you need CRF?
[17:47:52 CEST] <Fyr> where from do you need a CRF?
[17:48:03 CEST] <arpu> for segmenting the stream to dash
[17:49:30 CEST] <Fyr> I still don't get your question.
[17:49:49 CEST] <furq> are you sure you don't mean cfr
[17:49:52 CEST] <furq> otherwise the question makes no sense
[17:49:56 CEST] <arpu> ohh
[17:50:00 CEST] <furq> Fyr: libvorbis is better
[17:50:09 CEST] <furq> vorbis is the internal vorbis encoder which sort of sucks
[17:50:18 CEST] <Fyr> furq, then why do FFMPEG developers keep both?
[17:50:24 CEST] <furq> shrug
[17:50:25 CEST] <arpu> Fyr,  you are right! sorry i mean CFR
[17:50:38 CEST] <arpu> constant frame rate
[17:50:44 CEST] <Fyr> it would be better if the developers marked the former obsolete and dropped it in future versions.
[17:52:51 CEST] <n000g> Hey, guys. I'm trying to scale and crop a video (about 2.35:1) to 480x272 (about 16:9), cropping off the left and right. It won't work though, because SAR and DAR are not identical, and I seemingly don't get square pixel. How do I get a square pixel 480x272 video, without a distorted image?
[17:57:56 CEST] <JEEB> you want to scale limited by either width or height (whichever makes the scaled width or height match the destination size), then if you explicitly need 480x272, add padding where required and use setsar=1
[17:58:08 CEST] <JEEB> it's possible, just requires some if/else stuff
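One possible filter chain for n000g's case (filenames are hypothetical; assumes the source is wider than 16:9, so the excess is cropped off the sides rather than padded): first undo the anamorphic stretch by scaling with the SAR, then fill 480x272 and crop.

```shell
# scale=iw*sar:ih gives square pixels; force_original_aspect_ratio=increase
# scales so both dimensions cover 480x272, then crop trims the sides
ffmpeg -i in.mkv \
  -vf "scale=iw*sar:ih,setsar=1,scale=480:272:force_original_aspect_ratio=increase,crop=480:272" \
  -c:a copy out.mkv
```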
[17:59:44 CEST] <feliwir> hey, what could cause av_read_frame to crash? FFmpeg plays 1 video file just fine, then i want to play the next file (contexts, all new) and it crashes
[18:00:46 CEST] <JEEB> to handle multiple files you will have to reset the input structures including input AVStreams etc, and make sure you're not using anything from the old input lavf context
[18:00:54 CEST] <JEEB> it should work, so actually try debugging :P
[18:01:15 CEST] <JEEB> basically you will just have to keep in mind your mapping of what input stream at which point goes where etc
[18:01:41 CEST] <n000g> Ooff.
[18:01:48 CEST] <JEEB> if you can get a small sample that shows your breakage and you think it's a bug, post it on the issue tracker or the libav-users mailing list
[18:02:12 CEST] <JEEB> then you will either get told that you're using something wrong, or it's a bug and gets fixed :P
[18:02:17 CEST] <JEEB> in either case you get a step forward
[18:06:05 CEST] <feliwir> ok, thanks JEEB
[19:27:12 CEST] <thebombzen> does zimg support the sRGB transfer function?
[19:27:31 CEST] <thebombzen> it's not in libavfilter's zscale, but that doesn't mean it doesn't support it
[20:03:39 CEST] <JEEB> thebombzen: isn't that just yer usual BT.1886?
[20:03:47 CEST] <JEEB> since sRGB is just BT.709 in many ways
[20:04:40 CEST] <JEEB> https://github.com/sekrit-twc/zimg/blob/master/src/zimg/api/zimg.h#L279
[20:06:44 CEST] <JEEB> ok, it seems to be IEC 61966-2-1
[20:06:48 CEST] <JEEB> so I guess that's it?
[20:09:43 CEST] <JEEB> but yea, seems like they're different and IEC-61966-2-1 seems to be the relevant spec
[20:16:03 CEST] <wondiws> hi, I want to convert this code that uses avpicture_fill to av_image_fill_arrays, as the former is deprecated, and the latter is what is recommended
[20:16:40 CEST] <wondiws> but it doesn't work as of yet. What do I need to fill in for the last argument of av_image_fill_arrays? int align?
[20:18:23 CEST] <Mavrik> wondiws, see what avpicture_fill does:
[20:18:35 CEST] <Mavrik> return av_image_fill_arrays(picture->data, picture->linesize, ptr, pix_fmt, width, height, 1);
[20:18:48 CEST] <wondiws> Mavrik, oh, really? :)
[20:18:54 CEST] <Mavrik> https://www.ffmpeg.org/doxygen/3.0/avpicture_8c_source.html#l00037 :)
[20:19:07 CEST] <rabbe> JEEB, i think i got the first part working.. i've set up nginx with rtmp and started streaming x11grab to the rtmp address configured in nginx.. then i start ffplay and after a while it shows the screen :)
[20:19:47 CEST] <wondiws> Mavrik, thanks, I tried 0 myself as initial guess, and now with 1 it does work :P
[20:20:30 CEST] <rabbe> is there a way to cut down the waiting time of ffplay?
[20:22:18 CEST] <teratorn> rabbe: use a real media player instead?
[20:22:54 CEST] <rabbe> so the problem is with ffplay? i thought it might be some setting
[20:22:58 CEST] <rabbe> i'll try vlc
[20:25:11 CEST] <rabbe> VLC seems to need some time also
[20:25:37 CEST] <rabbe> some initial buffering or something?
[20:27:22 CEST] <rabbe> nginx config only contains: worker_connections 1024, listen 1935, chunk_size 4096, live on, record off
[20:28:53 CEST] <furq> you can try ffplay -fflags nobuffer (iirc)
[20:29:24 CEST] <furq> there is also a buflen setting in nginx-rtmp which defaults to 1 second
[20:29:40 CEST] <furq> reducing that much further sounds like a good way to have a bad time, but you're the boss
[20:29:41 CEST] <Mavrik> Although if playback is taking long to start it's usually the stream that's problematic
[20:29:41 CEST] <Mavrik> not the buffer
[20:32:25 CEST] <rabbe> stream is generated by: ffmpeg -f x11grab -s hd1080 -i :0.0 -c:a libfdk_aac -b:a 128k -c:v libx264 -b:v 512k -f flv rtmp://localhost/live/test
[20:33:44 CEST] <Mavrik> is perhaps your A/V out of sync?
[20:33:49 CEST] <Mavrik> What's your keyframe interval?
[20:33:53 CEST] <rabbe> waiting time before it starts is ~ 13-15 seconds
[20:34:45 CEST] <rabbe> don't know about the keyframe interval.. must be some default value?
[20:35:16 CEST] <Mavrik> do -g 30 and try again
[20:35:27 CEST] <rabbe> in ffmpeg command?
[20:36:02 CEST] <furq> yeah as an output option
[20:36:19 CEST] <furq> bear in mind x264 frame threading will introduce some latency as well, but probably less than a second
[20:36:31 CEST] <furq> so unless you're going for like sub-second latency or something then you probably want to leave that on
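rabbe's command with Mavrik's -g 30 applied would look like this (a sketch; everything else is unchanged from the log):

```shell
# -g 30 forces a keyframe every 30 frames, so players can start decoding sooner
ffmpeg -f x11grab -s hd1080 -i :0.0 \
  -c:a libfdk_aac -b:a 128k \
  -c:v libx264 -g 30 -b:v 512k \
  -f flv rtmp://localhost/live/test
```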
[20:36:31 CEST] <Mavrik> Yeah, 12 seconds is way too much
[20:36:37 CEST] <JEEB> also latency != time for getting the stream open
[20:36:41 CEST] <JEEB> those are two different things
[20:36:44 CEST] <Mavrik> Also ffmpeg threading will give you latency
[20:36:52 CEST] <Mavrik> but not delay when starting the stream (e.g. zap time)
[20:36:56 CEST] <furq> right
[20:37:06 CEST] <JEEB> you can have time taken until you get your first picture but you can still be very low latency
[20:37:07 CEST] <rabbe> now waiting is about 6 seconds
[20:37:12 CEST] <Mavrik> Zap time is long usually due to long keyframe interval or due to player waiting for one of the out-of-sync streams
[20:38:23 CEST] <Mavrik> rabbe, also your set bitrate looks horribly low
[20:38:36 CEST] <furq> depends what's on the screen
[20:38:41 CEST] <rabbe> advise me :)
[20:39:15 CEST] <rabbe> screen resolution is 1920x1080
[20:39:55 CEST] <rabbe> i'd like 60 fps also.. i will connect and create the stream from an HDMI camera later
[20:40:36 CEST] <rabbe> i guess it can lag < 3-4 seconds
[20:42:11 CEST] <rabbe> bbl
[20:56:24 CEST] <k4kfh> hey guys - I need to use ffmpeg to conform a bunch of videos to the same size/fps (for use with CasparCG). I need to be able to feed in any kind of video and get a 1080p 60fps output, even if that means artificial upscaling. What kind of command would I use for this?
[20:57:06 CEST] <Mavrik> scale and fps video filters
[20:57:17 CEST] <Mavrik> -vf scale=-2:1080,fps=60
[20:57:22 CEST] <Fyr> right
[20:57:25 CEST] <Mavrik> note that you can get judder because of that
[20:57:40 CEST] <Mavrik> if your input fps doesn't map nicely to 60 fps
[20:57:42 CEST] <k4kfh> sorry, dumb question, what is judder? Also why the -2 in the scale filter?
[20:58:14 CEST] <Fyr> Mavrik, perhaps, would scale=1920:-1 be better?
[20:58:33 CEST] <Mavrik> mhm, I prefer -2 to avoid encode failures for funny videos
[20:58:44 CEST] <k4kfh> and I need the audio to copy over (shouldn't be anything fancy, most input videos will probably have stereo audio, and I just need the output format to be something fairly standard that won't be super problematic to work with, I don't know much about audio codecs)
[20:58:45 CEST] <Mavrik> k4kfh, "-1" means "choose whatever to keep aspect ratio"
[20:58:55 CEST] <Mavrik> k4kfh, "-2" means "choose whatever to keep aspect ratio and make sure it's divisible by two"
[20:59:12 CEST] <Mavrik> if you have 16:9 that will create 1920x1080 videos
[20:59:37 CEST] <Mavrik> if not, you essentially say "make height 1080 and set width to whatever won't make it stretched"
[20:59:52 CEST] <Mavrik> alternatively you can take Fyr's example and say "make width 1920 and height whatever won't make it stretched"
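The width that scale=-2:1080 picks can be sketched as integer math (even_width is a hypothetical helper, approximating ffmpeg's behaviour by rounding up to the next even number):

```shell
# width scaled to keep the aspect ratio at height 1080, bumped to even
even_width() {
  local in_w=$1 in_h=$2 out_h=$3
  local w=$(( in_w * out_h / in_h ))
  echo $(( w + (w % 2) ))
}

even_width 1280 720 1080   # 16:9 input -> 1920
even_width 640 480 1080    # 4:3 input -> 1440
```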
[21:00:27 CEST] <Mavrik> k4kfh, as for judder, this is what it looks like: https://www.youtube.com/watch?v=5SSU-s0AUH0
[21:00:32 CEST] <k4kfh> cool i will try both and see what CasparCG seems to like best. Luckily most input videos should be 16:9 anyway
[21:00:46 CEST] <Mavrik> it's caused by the fact that you can't get from 25fps to 60fps (or from 29.97 fps to 60fps) with just duplicating or dropping frames
[21:00:57 CEST] <Mavrik> if your input is 30fps / 60fps you're fine
[21:01:10 CEST] <Mavrik> if it's 25, 29.97 or VFR you'll probably get at least a bit of judder
[21:01:27 CEST] <k4kfh> this is just for a broadcast team in a school so it's not imperative that everything is absolutely perfect lol
[21:01:42 CEST] <Mavrik> ok ^^
[21:01:49 CEST] <k4kfh> basically just trying to avoid paying for Adobe Media Encoder when it looks like ffmpeg will do this just fine
[21:03:06 CEST] <k4kfh> Mavrik, so -vf scale=-2:1080,fps=60 will ensure a constant framerate of 60fps even with a VFR input? doubt that will happen but CasparCG is very finicky about framerates so I need constant framerate at all costs
[21:03:21 CEST] <Mavrik> it should yeah
[21:08:18 CEST] <k4kfh> cool, thank you both. Do you know any good GUIs for ffmpeg that would let me put in settings like that? I am comfortable mucking about in bash and whatnot but the teacher in charge is not quite as experienced so I want to make it as foolproof as possible
[21:08:53 CEST] <Mavrik> hmm, Handbrake is a pretty decent (although a bit complex) GUI
[21:09:15 CEST] <Spec> Hiya! I'm having an issue re-streaming with ffmpeg, any chance I could get some pointers? :)
[21:09:57 CEST] <k4kfh> Mavrik: I tried handbrake but couldn't get it to do a constant framerate and couldn't get it to upscale for some reason. If I put in 4K footage it would downscale to 1080p but it wouldn't ever upscale 720 to 1080 which is bad
[21:11:24 CEST] <Spec> I am able to stream this thing with ffplay and view the stream, but when I go to record with ffmpeg or restream with ffmpeg, i get an error/nothing. Here're the commands I've been using: https://pastebin.com/raw/ihLjyRSK
[21:12:41 CEST] <furq> k4kfh: bash function?
[21:13:08 CEST] <furq> upscale() { ffmpeg -i "$1" -vf scale=-2:1080,fps=60 "${1%.*}_60fps.mkv"; }
[21:13:18 CEST] <furq> except add newlines
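furq's one-liner with the newlines added (the filename logic is pulled into a small hypothetical outname helper so it can be checked on its own):

```shell
# builds the output name by stripping the extension and appending _60fps.mkv
outname() { printf '%s_60fps.mkv' "${1%.*}"; }

upscale() {
  ffmpeg -i "$1" -vf scale=-2:1080,fps=60 "$(outname "$1")"
}

outname clip.mov   # prints clip_60fps.mkv
```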
[21:13:27 CEST] <Mavrik> uh
[21:13:34 CEST] <Spec> here's a screenshot of ffplay working (albeit with some errors?) https://i.imgur.com/NWPuOB6.png
[21:13:34 CEST] <Mavrik> Spec, is MPEG-TS over RTP a thing? O.o
[21:13:49 CEST] <Spec> Mavrik: yes, gopro hero 4 does this thing
[21:14:12 CEST] <Spec> that command works whether it's udp:// or rtp:// and examples i've found online use them interchangeably
[21:14:25 CEST] <Spec> it's the ffmpeg part i can't get to work
[21:14:33 CEST] <furq> also bear in mind that -2:1080 or 1920:-2 won't give you a constant size if the inputs are different aspect ratios
[21:14:38 CEST] <Spec> right now i'm using OBS to restream the ffplay window :(
[21:14:44 CEST] <furq> so you might just want to force 1920:1080 and deal with slightly incorrect AR
[21:14:52 CEST] <furq> otherwise you'd have to crop or pad
[21:54:51 CEST] <k4kfh> anybody know how to tell ffmpeg to strip the metadata (like the album art and such) out of an audio file (specifically MP3)?
[21:56:45 CEST] <DHE> something like: ffmpeg -i input.mp3 -c:a copy -map 0:a output.mp3
[22:00:21 CEST] <furq> k4kfh: -map_metadata -1
[22:00:44 CEST] <furq> that will obviously remove all the tags as well, so if you just want rid of the album art then do what DHE said
[00:00:00 CEST] --- Sun Oct  1 2017


