[Ffmpeg-devel-irc] ffmpeg.log.20160718

burek burek021 at gmail.com
Tue Jul 19 02:05:01 CEST 2016


[00:00:08 CEST] <JEEB> if, due to any legal requirements, you need a more limited binary, it's up to you to configure it
[00:00:17 CEST] <JEEB> the configuration is available, though
[00:00:32 CEST] <JEEB> (you can enable/disable components as required)
[00:01:53 CEST] <furq> there is also an important distinction between codecs which someone issues paid licenses for, and codecs which might incorporate patented technology
[00:02:00 CEST] <furq> because the latter seems to be pretty much all codecs
[00:02:25 CEST] <egnun> Ok. o_O
[00:02:56 CEST] <furq> it's extremely unlikely you'll get sued for only using theora, or that the plaintiff has a chance of winning a case, but not impossible
[00:03:38 CEST] <JEEB> and for on2/google's formats there's mostly just google's nice whispers (they do not say they will defend you against any litigation, though, of course)
[00:03:47 CEST] <furq> http://en.swpat.org/wiki/Ogg_Theora
[00:03:51 CEST] <furq> Patent licensing group MPEG LA have made vague, unsubstantiated claims that all video formats infringe their patents:
[00:03:54 CEST] <furq> i like that sentence
[00:04:14 CEST] <furq> mpeg-la sure are a cheery bunch
[00:04:35 CEST] <JEEB> well, there's even more cheerful bunches out there
[00:04:43 CEST] <JEEB> like those actively killing HEVC
[00:05:35 CEST] <furq> i've not been keeping up with the hevc licensing saga
[00:05:45 CEST] <JEEB> MPEG-LA made a patent pool with simple licensing and of course some business people decided that they wanted a bigger pie. so now we have (at least) two patent pools and I think technicolor (?) separately
[00:06:25 CEST] <JEEB> essentially this makes HEVC be in a very funky position even if the two other things end up crawling back to MPEG-LA
[00:06:35 CEST] <furq> is hevc used anywhere yet
[00:06:46 CEST] <JEEB> in some broadcast and VOD things
[00:07:22 CEST] <__jack__> (torrents use it :3)
[00:07:28 CEST] <JEEB> a Japanese channel has been airing it from 2014, and netflix made the (totally marketing) decision to use it for 2160p (which are only available for specific DRM'd plastic boxes)
[00:07:49 CEST] <JEEB> and by now there are test/production channels with it around the world (a dozen or so I guess?)
[00:07:54 CEST] <furq> i meant other than torrents encoded with x265 veryfast at far-too-small sizes
[00:08:20 CEST] <furq> because someone read on the internet that x265 is 100% better than x264 and then immediately stopped reading any information forever
[00:08:53 CEST] <JEEB> so there is usage of it being around, but it totally seems like the licensing status is moving people looking for alternatives
[00:09:22 CEST] <furq> it would be funny if AV1 actually delivered something usable next year and hevc ended up dead in the water
[00:09:24 CEST] <JEEB> hopefully the AOM thing leads to a more proper spec+ref impl-based development model for the google things
[00:09:30 CEST] <furq> or AOM even
[00:10:14 CEST] <JEEB> because when people were yelling at how closed HEVC was I was seeing vp9 development going on pretty much behind closed doors and HEVC was being openly developed on the JCT-VC mailing list
[00:10:48 CEST] <furq> does vp9 have a spec yet
[00:10:57 CEST] <JEEB> I think they finally made a bitstream guide kind of thing
[00:11:16 CEST] <JEEB> due to going SW-only at first there's some bugs that ended up being The Way
[00:27:08 CEST] <bklang> Hi all. I'm trying to find a way to blend 5 frames in a row together. The idea is to reduce motion in a time-lapse video. I've found the tblend filter, but that only seems to blend 2 frames, which doesn't appear to be enough for me. Are there any ways to do this with the CLI, or do I need to write something with libav?
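A sketch of one answer to bklang's question: newer ffmpeg builds include a tmix filter that averages a sliding window of N frames, whereas tblend only handles pairs. The file names here are placeholders:

```shell
# Average every 5 consecutive frames to smooth motion in a time-lapse
# (timelapse_src.mp4 / blended.mp4 are hypothetical names)
ffmpeg -i timelapse_src.mp4 -vf "tmix=frames=5" blended.mp4
```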
[01:25:27 CEST] <thebombzen> the biggest problem I have with x265 is that it's too slow
[01:25:48 CEST] <thebombzen> like I'm willing to use x264 for everything still because I like my videos to encode without waiting until the day after next tuesday
[01:26:35 CEST] <thebombzen> the downside of modern video codecs is that they all encode too slowly. which I guess they didn't really care about because who wants to use HEVC other than big multinational corporations who can pay for renderfarms, n'est-ce pas?
[01:27:04 CEST] <thebombzen> I mean, an ordinary user with limited UL bandwidth couldn't /possibly/ care about encoding time
[01:27:27 CEST] <thebombzen> sorry I'm just salty that HEVC takes forever.
[01:29:44 CEST] <JEEB> also do note that in '06 x264 was slow
[01:29:51 CEST] <JEEB> CPUs were a lot slower etc
[01:30:29 CEST] <furq> well then it's a good job cpus have been getting so much faster recently
[01:31:31 CEST] <JEEB> it's not only CPUs but also the optimizations on the encoding side, which is what comes with time. also new SIMD instruction sets bring out new possibilities for optimization
[01:33:41 CEST] <JEEB> in any case, x265 as a project has its own issues (no real community that would contribute, format being under uncertain terms etc) but also it's kind of weird to compare an encoder three years after the inception of a format against an encoder that has been developed for a format that was initially finished in '03 (some profiles happened a year or two later, such as high profile which added mainly 8x8 dct)
[01:40:27 CEST] <thebombzen> I guess. H.264 is 13 years old. I guess that HEVC encoders will be as good in 10 years as H.264 encoders are nowadays
[01:40:40 CEST] <thebombzen> but it feels awkward to use 2003 technology, y'know?
[02:17:27 CEST] <egnun> Hello, it's me again.
[02:18:00 CEST] <egnun> I wanted to convert a video from .webm to .ogv
[02:18:14 CEST] <egnun> It said that the Theora encoder is missing.
[02:18:37 CEST] <limbo> install a version of ffmpeg with that encoder.
[02:18:47 CEST] <egnun> When I execute
[02:18:47 CEST] <egnun> $ ffmpeg -codecs
[02:18:47 CEST] <egnun> it says
[02:18:47 CEST] <egnun>  D.V.L. theora               Theora
[02:18:49 CEST] <limbo> or, whatever set of gstreamer plugins you need.
[02:19:16 CEST] <egnun> Well, I compiled ffmpeg myself.
[02:19:32 CEST] <egnun> But I am wondering why it didn't compile the theora encoder.
[02:20:21 CEST] <egnun> I used the following configure parameters:
[02:20:21 CEST] <egnun> --enable-gpl --enable-version3 --disable-nonfree
[02:20:43 CEST] <egnun> Could "--enable-version3" have caused the problem?
[02:21:49 CEST] <darkapex> try with --enable-libtheora
[02:21:52 CEST] <darkapex> https://trac.ffmpeg.org/wiki/TheoraVorbisEncodingGuide
[02:22:20 CEST] <darkapex> the wiki is quite good btw
[02:24:20 CEST] <furq> egnun: https://www.ffmpeg.org/general.html#Video-Codecs
[02:24:27 CEST] <furq> anything with an E requires an external library
[02:26:34 CEST] <egnun> furq: Well, there seems to be an installation of a package "libtheora".
[02:27:28 CEST] <darkapex> egnun: yes it's an external library, you'll need to install it from your package manager if it hasn't been already (but as the wiki said the native ffmpeg encoder is experimental and not advised to use for production)
[02:28:17 CEST] <egnun> So, because the native ffmpeg encoder is unstable, it uses the external one?
[02:29:07 CEST] <darkapex> I don't know which one it uses by default, but if you want to force the native one specify '-codec:a vorbis -strict experimental'
[02:29:31 CEST] <darkapex> i.e. if installing libvorbis is a problem, retry the command with this ^ and let us know if it still gives an error
[02:29:31 CEST] <furq> there is no native theora encoder
[02:29:51 CEST] <darkapex> oh shet why did i start talking about vorbis my bad
[02:30:05 CEST] <darkapex> you'll need to install libtheora
[02:30:07 CEST] <furq> well you'll want libvorbis as well if you want to encode ogv
[02:31:09 CEST] <furq> you probably also want libvpx and libopus if you want the most useful license-free codecs
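Pulling furq's suggestions together, a configure line along these lines would build in all four license-free codecs (a sketch; it assumes the corresponding development packages are installed):

```shell
# External encoders: theora/vorbis for ogv, vpx/opus for webm
./configure --enable-gpl --enable-version3 \
  --enable-libtheora --enable-libvorbis \
  --enable-libvpx --enable-libopus
```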
[03:01:05 CEST] <egnun> How can I tell pkg-config to use one specific library?
[03:01:25 CEST] <furq> huh
[03:02:17 CEST] <egnun> Ah, nvm
[03:02:21 CEST] <egnun> nevermind
[03:03:07 CEST] <egnun> I have a "libopusfile" installed, but there is no "libopus" in "/usr/lib64/pkgconfig"
[03:03:40 CEST] <egnun> There is no *opus*.pc in that directory
[03:04:13 CEST] <egnun> I just could compile ffmpeg without opus
[03:04:38 CEST] <furq> you want libopus, not libopusfile
[03:04:58 CEST] <egnun> Well, without opus encoding.
[03:08:13 CEST] <egnun> I have a libopus
[03:08:25 CEST] <egnun> But no file that is called "libopus.pc"
[03:08:33 CEST] <furq> do you have the devel package installed
[03:08:53 CEST] <furq> presumably libopus-devel but i have no idea what fedora calls it
[03:09:30 CEST] <egnun> Well, you'd think that there is one.
[03:09:33 CEST] <egnun> But there is none.^^
[03:09:43 CEST] <egnun> I've already looked for it. ^^
[03:09:49 CEST] <furq> nice
[03:10:34 CEST] <egnun> Ahh1
[03:10:36 CEST] <egnun> Ahhh!!
[03:10:44 CEST] <egnun> It's not called "libopus" but just "opus"
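The mismatch egnun hit can be checked up front with pkg-config, since ffmpeg's configure looks the library up by its .pc module name (here "opus", not "libopus"):

```shell
# Succeeds (exit 0) and prints the version if opus.pc is on the pkg-config path
pkg-config --exists opus && pkg-config --modversion opus
```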
[03:12:06 CEST] <furq> how nice of them
[03:12:24 CEST] <egnun> ^^
[03:12:29 CEST] <egnun> I should file a report.
[03:12:33 CEST] <egnun> But not now!
[03:12:41 CEST] <egnun> Now I am going to install ffmpeg!
[03:12:41 CEST] <furq> i like how fedora's package info site takes you to three different domains before giving you a file list
[03:12:55 CEST] <egnun> Which site?
[03:13:23 CEST] <furq> https://admin.fedoraproject.org/pkgdb/package/rpms/opus/
[03:14:15 CEST] <furq> oh never mind apps.fedoraproject.org has a better search
[03:14:22 CEST] <furq> it's nice of them to not bother getting that to #1 on google
[03:15:36 CEST] <egnun> ^^
[03:15:44 CEST] <egnun> Who uses google anyway?
[03:15:53 CEST] <egnun> DuckDuckGo > *
[03:16:16 CEST] <furq> why not just use bing directly
[03:16:31 CEST] <egnun> &
[03:16:34 CEST] <egnun> Yahoo.
[03:17:17 CEST] <furq> yahoo is also just bing
[03:17:51 CEST] <egnun> Ya, hoo cares?
[03:18:21 CEST] <furq> google know everything about me anyway
[03:18:29 CEST] <furq> i might as well take advantage of that fact and get better search results
[03:19:09 CEST] <egnun> Well, I get quite good results via DDG too.
[03:19:38 CEST] <egnun> And if it doesn't give me the results I am looking for, I can still use google
[03:19:41 CEST] <egnun> even via DDG.
[03:20:25 CEST] <furq> that all seems plausible
[03:20:34 CEST] <furq> but what good is irc if not for making fun of people's minor life choices
[03:21:37 CEST] <egnun> I don't hate you
[03:22:24 CEST] <furq> i feel like we've made a real breakthrough here
[03:23:28 CEST] <egnun> :D
[04:19:49 CEST] <egnun> ffmpeg seems to be working so far! \o/
[04:19:56 CEST] <egnun> Thank you all for your help!
[04:34:45 CEST] <ycon_> Hi all, I'd like to convert a .mp4 video to AVI, mpeg,mpeg4 or mkv
[04:35:51 CEST] <ycon_> I mean I want to convert from .m4v.
[04:35:52 CEST] <ycon_> But when I did ffmpeg -i video1.m4v to output.avi the quality was unreadable
[04:40:47 CEST] <egnun> So the converting process worked without problems?
[04:42:15 CEST] <egnun> Do you need to convert it to .mp4?
[04:42:15 CEST] <egnun> You should try to convert it to .ogv instead.
[06:32:26 CEST] <seanrdev> So I have some videos that get somewhat choppy during (action) playback. Now it's not a huge deal however I have a skylake 4 core processor and an nvidia 965m gtx card. I have used multiple players and I'm wondering what needs to be adjusted to stop choppyness during intense scenes in video playback.
[06:35:27 CEST] <seanrdev> oh also the video files are stored on a ssd vnand drive with read speeds at 22,587 MB/s
[08:19:15 CEST] <Betablocker> morning, morning
[08:19:32 CEST] <Betablocker> is there any REST api for ffmpeg ?
[09:24:48 CEST] <egnun> Betablocker: You can REST assured that I have no clue.
[09:59:32 CEST] <Fjorgynn> oh really?
[10:28:24 CEST] <roxlu> hey, when I use --sdp_file stream.sdp, the generated SDP file adds the string "SDP" as the first line. Why is that?
[11:57:16 CEST] <Galadriel> Hi, i tried to compile ffmpeg but there are a lot of (declared at libavcodec/avcodec.h:4763) [-Wdeprecated-declarations] warnings. Any idea?
[11:57:35 CEST] <Galadriel> this is the newest version
[12:03:08 CEST] <Galadriel> anyone
[12:03:09 CEST] <Galadriel> ?
[12:12:11 CEST] <markvandenborre> I have two video streams that I'm putting together into a pretty picture-in-picture stream
[12:12:51 CEST] <markvandenborre> I would like to add the exact (ntpdate) time into that stream, while causing as little sync or delay problems as possible
[12:19:08 CEST] <JEEB> Galadriel: internal deprecated stuff usage is not something you should care about
[12:19:16 CEST] <JEEB> unless you want to develop FFmpeg itself
[12:21:12 CEST] <Galadriel> @JEEB: thanks for your answer .. when i encode a video i also have in the log
[12:21:14 CEST] <Galadriel> [webm @ 0x2770880] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
[12:33:18 CEST] <Galadriel> JEEB: is this ok ?
[13:13:19 CEST] <moala> hey, I'm trying to decode AC3 into an LtRt format: https://sourceforge.net/projects/audiotestfiles/files/Audacity%20example%20files/AC3%20multi%20channel/5.1_de.tar.gz/download
[13:13:49 CEST] <moala> ffmpeg -i 5.1_de.ac3 -acodec libmp3lame -ab 320k -ac 2 5.1_de.mp3
[13:14:36 CEST] <moala> but what I obtain is the rear channels mixed to the front channels, not the matrixed ("ProLogicII") version
[13:14:53 CEST] <moala> any idea about how to obtain that?
[13:18:04 CEST] <BtbN> -c:a libmp3lame -b:a 320k -af aresample=matrix_encoding=dplii -ac 2
[13:24:10 CEST] <moala> BtbN, wow, thanks a lot!
[13:24:42 CEST] <BtbN> might even drop the ac, the aresample filter should output 2 channels
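For reference, BtbN's pieces assembled into one command, using moala's file names:

```shell
# Downmix 5.1 AC-3 to Pro Logic II matrixed stereo, encoded as 320k MP3
ffmpeg -i 5.1_de.ac3 -af "aresample=matrix_encoding=dplii" \
  -c:a libmp3lame -b:a 320k 5.1_de.mp3
```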
[13:25:40 CEST] <moala> yep, works
[17:11:18 CEST] <simulacr> curl, wget etc. support the proxy env vars without a protocol prefix, but ffmpeg just _ignores_ them (without even an error/warning!), wtf?
[17:28:10 CEST] <simulacr> ah 2259, nevermind
[19:14:15 CEST] <jfhbrook> I have kind of an odd question: I'm working on a mixed media server that does resizes and conversions of stills and video on-the-fly
[19:14:29 CEST] <jfhbrook> which is great because we don't know a priori what sizes of videos, etc., people will want
[19:14:45 CEST] <jfhbrook> but there are things we *do* know, like we know what format conversions our system supports
[19:14:59 CEST] <jfhbrook> and we have problems with a thundering herd when content goes live
[19:15:25 CEST] <jfhbrook> so my question, I guess, is: what kinds of pre-processing can I do to improve performance of the final resize?
[19:15:49 CEST] <jfhbrook> like, if I knew aspect ratios and formats, would it be useful to pre-convert to the right ratio and format, and then do resizing on-demand?
[19:15:53 CEST] <jfhbrook> or would that be a waste?
[19:16:09 CEST] <jfhbrook> and are there resizes I can do that would make it easier to do a further resize later?
[19:56:37 CEST] <soulshock> you can do a filter-chain to resize only once for each target resolution and create many bitrates, i.e. -i input.mp4 -filter_complex "scale=X:Y, split=3[out1][out2][out3]" -map [out1] -b:v 1000k out1.mp4 -map [out2] -b:v 500k out2.mp4 -map [out3] -b:v 250k out3.mp4
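soulshock's decode-once/encode-many idea written out in full (a sketch; the resolution, codec, and bitrates are placeholders):

```shell
# One decode and one scale feed three encodes at different bitrates
ffmpeg -i input.mp4 \
  -filter_complex "scale=1280:720,split=3[out1][out2][out3]" \
  -map "[out1]" -c:v libx264 -b:v 1000k out1.mp4 \
  -map "[out2]" -c:v libx264 -b:v 500k out2.mp4 \
  -map "[out3]" -c:v libx264 -b:v 250k out3.mp4
```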
[20:55:30 CEST] <jfhbrook> soulshock: the problem is I don't know the resizes a priori
[20:56:20 CEST] <jfhbrook> soulshock: so I can, say, convert from mp4 to webm, and I could choose some pre-resizes that are convenient for getting other sizes off, but I can't actually pre-render any of the actual resizes because they're requested on-the-fly via path params
[20:57:06 CEST] <jfhbrook> so like, /media/abc123/w_100,h_50/foo.webm
[20:57:14 CEST] <kepstin> jfhbrook: might be best to just look at your stats and just pre-render any cases that are really common
[20:57:47 CEST] <klaxa> offering users too much customization might also backfire
[20:58:11 CEST] <klaxa> imagine someone requesting /media/abc123/w_99999999999,h_99999999999/foo.webm
[20:58:29 CEST] <kepstin> if you want to reduce the work you do scaling, you should make use of the fact that all video players usually have scalers builtin, often gpu accelerated :)
[20:58:38 CEST] <jfhbrook> klaxa, that's the reality of the world, for various reasons
[20:59:10 CEST] <jfhbrook> ie, this is an existing API contract put in place because designs refuse to commit to preset dimensions
[20:59:33 CEST] <jfhbrook> and we're trying to extend the service to be able to handle mp4 and webm in addition to jpg, png and (potentially animated) gif
[21:01:02 CEST] <klaxa> in that case how about re-encoding every video to half-, quarter- and eighth-size and use the next bigger one to generate your requested output?
[21:01:17 CEST] <kepstin> one thing I can think of that might provide a useful speedup is to transcode the source file to some format that's low cpu to decode, to reduce cpu usage in final encode.
[21:01:22 CEST] <klaxa> that obviously only speeds up requests lower than half your max resolution
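klaxa's ladder of pre-scaled intermediates could itself be generated with a single decode (a sketch; the codec and quality settings are illustrative, and -2 rounds heights to even values for yuv420p):

```shell
# Half-, quarter-, and eighth-size intermediates from one pass over the source
ffmpeg -i src.mp4 -filter_complex \
  "split=3[a][b][c];[a]scale=iw/2:-2[h];[b]scale=iw/4:-2[q];[c]scale=iw/8:-2[e]" \
  -map "[h]" -c:v libx264 -crf 18 half.mp4 \
  -map "[q]" -c:v libx264 -crf 18 quarter.mp4 \
  -map "[e]" -c:v libx264 -crf 18 eighth.mp4
```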
[21:01:25 CEST] <jfhbrook> klaxa, yeah, wondering if that's a viable strategy
[21:01:51 CEST] <jfhbrook> also wondering if it's cheaper to resize while sticking to a common format, vs resizing and changing format at the same time
[21:02:01 CEST] <jfhbrook> ie, is resizing mp4 to mp4 faster than mp4 to webm?
[21:02:12 CEST] <klaxa> highly depends on codec settings
[21:02:13 CEST] <jfhbrook> given the same crop/resize transforms?
[21:02:14 CEST] <klaxa> and codecs used
[21:02:16 CEST] <jfhbrook> hm
[21:02:32 CEST] <jfhbrook> so let's say the kind of mp4s and webms that average users would upload
[21:02:34 CEST] <klaxa> you will probably want x264 and vp8
[21:02:36 CEST] <kepstin> jfhbrook: the resizing is constant based on video size, no dependencies on format at all
[21:03:02 CEST] <kepstin> jfhbrook: (well, png + gif are special, since they're changing to rgb/palette as well)
[21:03:15 CEST] <klaxa> also keep in mind that every re-encode loses information
[21:03:48 CEST] <jfhbrook> hmmmm
[21:04:23 CEST] <klaxa> i think generally speaking x264 outperforms vp8
[21:04:30 CEST] <kepstin> honestly, I'd just encode to some common presets, and serve up a "close" size to what was requested, and let the user's player scale it if needed.
[21:04:33 CEST] <klaxa> in encoding
[21:05:17 CEST] <jfhbrook> yeah kepstin that's an interesting one, because at one point in the past we *were* doing pre-cuts, but then we'd get a list of 80 cuts that someone absolutely needed and.......yeah.
[21:05:26 CEST] <kepstin> but I guess you might have some business requirements that preclude that.
[21:05:27 CEST] <jfhbrook> and that was *before* we were talking about changing format on the fly
[21:05:35 CEST] <jfhbrook> yeah kepstin
[21:08:14 CEST] <jfhbrook> hmmmmmm
[21:08:31 CEST] <jfhbrook> so it sounds like maybe for v1 trying to pre-cache crops isn't the best idea
[21:08:40 CEST] <furq> you could maybe try playing with swscale's options or use zscale for scaling
[21:08:54 CEST] <jfhbrook> unless we come up with a way for users to specify what they want to pre-render on upload
[21:08:55 CEST] <furq> swscale has a "fast_bilinear" scaling algorithm which could help
[21:08:58 CEST] <jfhbrook> which might be doable
[21:09:01 CEST] <furq> although i expect it'll degrade the quality
[21:09:12 CEST] <jfhbrook> I see
[21:09:15 CEST] <kepstin> I suspect that the biggest part of your overhead is in the video encoding itself, particularly with vp8
[21:09:18 CEST] <furq> but if you're doing webm the problem is presumably the encoding
[21:09:27 CEST] <furq> yeah what he said
[21:09:35 CEST] <kepstin> so a faster scaler will provide slight speedups, but it's really not the biggest target for optimization
[21:09:52 CEST] <jfhbrook> kepstin: oh interesting! So in this case pre-converting and then resizing later would be a win?
[21:09:57 CEST] <kepstin> in order, it's probably encoder, then decoder, then scaler
[21:10:09 CEST] <furq> no?
[21:10:10 CEST] <kepstin> no, because if you scale then you need to do the encode again
[21:10:21 CEST] <furq> you need to decode and reencode to scale
[21:10:44 CEST] <furq> it's not like jpeg where you can do it losslessly
[21:11:16 CEST] <jfhbrook> oh, I see
[21:11:29 CEST] <furq> i'm not really an expert in doing this at such a large scale, but i can't see an obvious solution other than throwing more hardware at it
[21:11:44 CEST] <jfhbrook> so in this case the suggestion is to pick codecs that are cheap to encode/decode
[21:11:59 CEST] <furq> well for the web you're pretty much stuck with h.264 and vp8/9
[21:12:06 CEST] <jfhbrook> I see
[21:12:10 CEST] <furq> i wouldn't use vpx if you don't have to though
[21:12:13 CEST] <furq> libvpx is very slow
[21:12:28 CEST] <jfhbrook> yeah, well, downstream consumers XD
[21:12:34 CEST] <jfhbrook> "Requirements"
[21:12:39 CEST] <kepstin> jfhbrook: if you have lots of io bandwidth/storage, it might make sense to transcode the source files to some format that has minimal or no loss, and is fast to decode
[21:12:42 CEST] <jfhbrook> still, I'll note this in my, well, notes
[21:12:46 CEST] <kepstin> then encode from that
[21:12:53 CEST] <jfhbrook> kepstin: oh? what kinda format would be ideal for that?
[21:13:05 CEST] <furq> rawvideo would be ideal but you're talking a huge amount of storage space
[21:13:12 CEST] <kepstin> depends on the balance of bandwidth to cpu usage you want
[21:13:22 CEST] <furq> there is some lossless yuv format which is optimised for fast decoding, but i forget which one it is now
[21:13:25 CEST] <furq> maybe utvideo?
[21:13:30 CEST] <kepstin> if you have lots of storage/bandwidth, try something like huffyuv, utvideo
[21:14:12 CEST] <kepstin> you could also try just encoding to a fairly high quality h264 via x264 with -tune fastdecode
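The two intermediate-format options kepstin and furq mention, spelled out (a sketch; file names and the crf value are illustrative, and audio is dropped here for simplicity):

```shell
# Lossless, very fast to decode, large on disk
ffmpeg -i src.mp4 -an -c:v utvideo intermediate.avi
# Near-lossless H.264 tuned for cheap decoding
ffmpeg -i src.mp4 -an -c:v libx264 -crf 10 -tune fastdecode intermediate.mp4
```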
[21:14:19 CEST] <furq> for reference, one hour of 720p yuv420p rawvideo is ~125GB
[21:14:31 CEST] <furq> so you'd need a fair old amount of storage space to get away with that
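furq's number is easy to check: yuv420p carries 12 bits (1.5 bytes) per pixel, so an hour of 1280x720 at 25 fps works out to about 124 GB:

```shell
# 1280x720 yuv420p: 1.5 bytes per pixel per frame
bytes_per_frame=$((1280 * 720 * 3 / 2))
# one hour at 25 fps
total=$((bytes_per_frame * 25 * 3600))
echo "$total"   # 124416000000 bytes, i.e. ~124 GB
```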
[21:14:34 CEST] <Mavrik> furq, huffyuv?
[21:14:57 CEST] <furq> i thought huffyuv was fast to decode by virtue of being a bit crap
[21:15:15 CEST] <kepstin> if you only do it for recently uploaded videos that you expect a lot of requests for, and throw out the temp file for older videos, a raw or fast lossless format might make sense
[21:15:19 CEST] <DHE> it's lossless though, isn't it?
[21:15:25 CEST] <furq> sure
[21:15:51 CEST] <jfhbrook> hmmmm
[21:16:01 CEST] <kepstin> furq: well, it's super simple, lossless, very fast. It's basically just a format for use when you would otherwise use raw video, but don't have enough io bandwidth :)
[21:16:03 CEST] <Mavrik> furq, yes, it is
[21:16:14 CEST] <Mavrik> But it's still order of magnitude reduction of size :)
[21:16:33 CEST] <furq> yeah but i'm pretty sure there's a more efficient lossless yuv codec optimised for fast decode
[21:16:33 CEST] <jfhbrook> so the trade-off there, in my mind, is less disk space and more bandwidth, since we're using s3
[21:16:53 CEST] <kepstin> ouch, s3? bandwidth is kinda expensive there.
[21:17:18 CEST] <furq> i hope you're not doing the whole thing on aws
[21:17:39 CEST] <kepstin> if you are doing it all on aws, and want to try this sort of 'temp raw file' thing, it might make sense to put the temp files on an instance store or storage volume instead of s3
[21:18:15 CEST] <jfhbrook> furq: yeah, s3 and ec2
[21:18:32 CEST] <furq> actually i forgot amazon have that transcoding service now
[21:18:35 CEST] <furq> is that any good
[21:18:41 CEST] <jfhbrook> I think we cache files on-disk cause chances are good there will be multiple renders of the same file
[21:18:54 CEST] <furq> i sort of assumed it would be ridiculously expensive because it's aws
[21:18:56 CEST] <jfhbrook> I read into it a bit furq and I'm far from an expert (surprise!) but it sounds *okay*
[21:19:08 CEST] <jfhbrook> yeah, I kinda don't pay attention to cost (not MY wallet oho!)
[21:19:27 CEST] <kepstin> I assume the amazon transcoding stuff is pretty much just they have a bunch of ec2 servers that do transcodes for you, hooked up into their queue and messaging stuff.
[21:19:38 CEST] <jfhbrook> yeah, it's all SNS/SQS
[21:19:44 CEST] <furq> i assume it's all hardware encoding
[21:19:52 CEST] <jfhbrook> zencoder is similar except there's an http interface for interacting with the jobs
[21:20:20 CEST] <kepstin> you can do your own hardware encoding on aws via the gpu instances (nvenc), but iirc. it's actually cheaper to do cpu encoding
[21:20:24 CEST] <furq> oh
[21:20:29 CEST] <furq> yeah apparently they're using x264
[21:20:31 CEST] <jfhbrook> yeah I was reading that somewhere
[21:20:36 CEST] <furq> that doesn't seem worthwhile then
[21:21:29 CEST] <jfhbrook> yeah, I mean, in my case I feel like we already have a bunch of media servers doing imagemagick stuff, making the same servers do ffmpeg stuff sounds sane, minus imagemagick's penchant for taking up *all* your system ram
[21:22:16 CEST] <furq> i take it you're doing more than just resizing with imagemagick
[21:22:20 CEST] <TD-Linux> I do a lot of encoding on ec2 and it works fine. encoding through EFS also works fine if you don't want s3
[21:22:23 CEST] <jfhbrook> resizes and crops, furq
[21:22:39 CEST] <TD-Linux> that said these are very slow encoding jobs
[21:22:44 CEST] <kepstin> resizes and crops shouldn't be that bad, unless you're talking really big images...
[21:23:15 CEST] <jfhbrook> that's true kepstin but we sometimes have multi-meg animated gifs we have to deal with
[21:23:23 CEST] <jfhbrook> like the kind that *should* be videos
[21:23:43 CEST] <furq> i thought the cool kids used gifsicle for that
[21:23:48 CEST] <jfhbrook> I mean
[21:23:49 CEST] Action: kepstin definitely put some image size, ram, and *run time* limits on his imagemagick processors, and if people exceed them, well, too bad :/
[21:23:51 CEST] <jfhbrook> I'm not that cool XD
[21:24:31 CEST] <kepstin> aside from that one time I set the run time limit too small, and it started affecting some images from customers we actually care about ;)
[21:25:06 CEST] <jfhbrook> heh
[21:25:40 CEST] <jfhbrook> yeah I mean, I'm just thinking we eyeball it based on usage, like trade off lack of memory/disk space for imagemagick vs available memory/disk for ffmpeg
[21:26:24 CEST] <kepstin> keep in mind that x264 can use a *lot* of ram in the slower presets, since it keeps around a lot of raw frames as reference images.
[21:26:24 CEST] <jfhbrook> speaking of, nothing I should be worried about as far as using stdio for this vs the fs, right?
[21:26:31 CEST] <jfhbrook> oh, good to know
[21:26:36 CEST] <jfhbrook> we might want to make the boxes bigger
[21:26:55 CEST] <kepstin> but if you're using faster presets (which I assume you are), that's not as much of a problem
[21:27:05 CEST] <kepstin> still, you should measure it
[21:27:13 CEST] <jfhbrook> haven't quite gotten that far yet kepstin XD still in the discovery phase
[21:27:33 CEST] <jfhbrook> so like, rn I'm detailing what bonus/challenges ffmpeg offers over amazon's transcoder or zencoder
[21:27:43 CEST] <kepstin> we've had people come in here before complaining that video encoding was causing an OOM on their little boxes :)
[21:27:46 CEST] <jfhbrook> trying to get notes on avconv too, just found a few random blog posts though---sounds like bad blood?
[21:28:11 CEST] <kepstin> avconv is just the ffmpeg tool, renamed, and with the options arbitrarily changed in incompatible ways
[21:28:36 CEST] <jfhbrook> yeah, that seems to be my memory from the last time I tried using avconv (because debian)
[21:29:33 CEST] <kepstin> ffmpeg and avconv command line tools are very much tied to the ffmpeg vs. libav versions of the libraries.
[21:29:39 CEST] <furq> debian is back in ffmpeg now
[21:29:44 CEST] <jfhbrook> nice furq
[21:29:48 CEST] <soulshock> ffmpeg -i input.mov -pass 1 -pix_fmt yuv420p -profile:v high -c:v libx264 -f mp4 NUL sets profile to "main" for some reason: [libx264 @ 00000000037c2e60] profile Main, level 4.1
[21:29:51 CEST] <furq> debian is back on ffmpeg, rather
[21:29:53 CEST] <jfhbrook> from a clueless consumer standpoint that's definitely good news
[21:29:54 CEST] <furq> or ffmpeg is back in debian
[21:30:04 CEST] <soulshock> is it impossible to use profile high in 1st pass?
[21:30:32 CEST] <kepstin> soulshock: you're working under a misunderstanding of x264's profile handling
[21:30:44 CEST] <soulshock> that's highly probable
[21:30:56 CEST] <kepstin> soulshock: x264 fills the profile field in the result based on the features actually used in the encoded video stream
[21:31:06 CEST] <kepstin> soulshock: the -profile option only sets a "maximum limit"
[21:31:27 CEST] <kepstin> soulshock: when doing a 1st pass encode, you aren't supposed to use the resulting video file, so x264 disables some stuff to make it faster
[21:31:44 CEST] <kepstin> soulshock: as a result, it might not be the same profile as the 2nd pass
[21:32:26 CEST] <soulshock> aha ok.
[21:32:39 CEST] <soulshock> and the -preset option, does that also not apply?
[21:33:13 CEST] <kepstin> you should usually use the same -preset option on first and second pass
[21:33:29 CEST] <kepstin> since it can change some settings that affect the bitrate prediction, iirc.
[21:33:56 CEST] <soulshock> ok. that was my understanding. cool
[21:34:57 CEST] <kepstin> soulshock: also take a look at the description of the "--slow-firstpass" option to x264, to see what x264 disables in the first pass by default.
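A two-pass sketch matching soulshock's command and kepstin's advice (same preset on both passes; the bitrate is a placeholder, and the pass-1 output goes to the null device, NUL on Windows or /dev/null elsewhere):

```shell
# Pass 1: analysis only, output discarded
ffmpeg -y -i input.mov -an -c:v libx264 -preset veryslow -b:v 5000k \
  -pass 1 -f mp4 /dev/null
# Pass 2: the real encode, reusing the pass-1 stats
ffmpeg -i input.mov -c:v libx264 -preset veryslow -b:v 5000k \
  -profile:v high -pass 2 output.mp4
```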
[21:36:48 CEST] <soulshock> ok
[21:37:25 CEST] <jfhbrook> thanks y'all, and I'm sure I'll be in here at a later date all "omg how u video lol" once this project gets greenlit
[21:37:42 CEST] <jfhbrook> either that or my coworker that totally knows c++ will insist on writing native bindings and I won't be on it ¯\_(ツ)_/¯
[21:38:50 CEST] <kepstin> soulshock: keep in mind that using a faster preset in x264 will also result in turning off some features, e.g. I don't think you'll ever get 'high' output with -preset ultrafast.
[21:40:45 CEST] <furq> ultrafast is always baseline isn't it
[21:40:50 CEST] <soulshock> kepstin yeah. I'm experimenting with -preset veryslow because I want to build files I can use for psnr comparison
[21:41:24 CEST] <soulshock> i.e. copy the netflix idea of per-title encoding
[21:42:56 CEST] <kepstin> hmm, ultrafast does appear to be a superset of the options x264 sets for baseline, yeah
[21:43:52 CEST] <__jack__> speaking of "netflix per title encoding", what's the technical stuff behind that commercial speech?
[21:46:51 CEST] <__jack__> is there anything more than just preset & crf, instead of constant bitrate, I mean
[21:47:19 CEST] <DHE> constant quality with -qp, variable quality with -crf, and variable bitrate with -b:v
[21:47:46 CEST] <kepstin> -qp isn't constant quality, it's constant *quantizer*
[21:47:56 CEST] <DHE> which is usually close enough
[21:48:24 CEST] <DHE> it basically means the bitrate skyrockets when the picture becomes busy
[21:48:56 CEST] <kepstin> crf is nominally constant perceptual quality, with some optimizations to lower the quantizer where the psy optimizations think extra detail isn't needed.
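The three rate-control modes DHE and kepstin are contrasting, as libx264 options in ffmpeg (values are illustrative only):

```shell
ffmpeg -i in.mp4 -c:v libx264 -qp 20 out_qp.mp4       # constant quantizer
ffmpeg -i in.mp4 -c:v libx264 -crf 23 out_crf.mp4     # constant perceptual quality
ffmpeg -i in.mp4 -c:v libx264 -b:v 2000k out_abr.mp4  # average bitrate
```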
[21:49:09 CEST] <furq> __jack__: i'm pretty sure it's only relevant if you can only use cbr
[21:50:30 CEST] <abumusa> I want to build a screen recorder using Native Client of Google Chrome; I am having a hard time building the encoder because the compiling process takes lots of time, so I am looking for ready-made code I can use
[21:50:34 CEST] <kepstin> online streaming is a tricky challenge, since you have to actually limit bitrate to fit the best quality in a particular internet connection bw.
[21:51:07 CEST] <abumusa> http://stackoverflow.com/questions/38444827/screen-capture-with-ffmpeg-and-google-native-client
[21:51:42 CEST] <BtbN> I hope a website won't ever be able to capture my screen.
[21:52:03 CEST] <kepstin> BtbN: too late for that, chrome already has builtin screen capture support for webrtc stuff
[21:52:10 CEST] <kepstin> iirc it's only exposed to extensions right now
[21:52:19 CEST] <BtbN> So not going to use Chrome, nothing to change here.
[21:52:24 CEST] <abumusa> exactly, I am building an extension
[21:52:43 CEST] <abumusa> I want to make this thing open source, I hate the fact I have to buy a screen recorder each time I want to record a screencast
[21:52:49 CEST] <kepstin> firefox does as well  :)
[21:53:29 CEST] <kepstin> abumusa: just use obs
[21:53:45 CEST] <abumusa> Compiling the code is taking too much time from me, because it cross-compiles to three different platforms, the documentation stinks, I've been compiling the code for the past week
[21:54:30 CEST] <abumusa> kepstin: I need the software to be a chrome extension, it is part of the requirements; I can not change that or use 3rd party software
[21:54:56 CEST] <abumusa> I am building a usertesting.com like website, so I need to push UX tests using chrome extensions
[21:55:18 CEST] <abumusa> so I need to use FFmpeg to encode the screen capture which Chrome offers into a webm file
[21:55:33 CEST] <abumusa> there are four extensions doing this, so I think it is not that hard to program
[21:55:50 CEST] <abumusa> but I do not have the experience to build it and could not find a freelancer to do it for me
[21:56:15 CEST] <abumusa> if this project works for me I will open source the code of it
[21:56:40 CEST] <abumusa> If any of you is interested in building this with me, I am more than happy to show what I have done so far
[21:58:14 CEST] <kepstin> but yeah, libvpx's "constant quality" mode (exposed via the "-crf" option in ffmpeg, even though it's nothing like x264's crf mode) is actually kind of interesting, since it was designed explicitly for online video streaming.
[21:58:57 CEST] <kepstin> it's used in combination with a bitrate and vbv settings, so you can say "encode the video for this bitrate, but use less bits if not needed in order to maintain at least this quality"
[22:00:04 CEST] <kepstin> I'm not sure, but I think you might be able to use x264's crf mode along with vbv settings to achieve something similar?
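[The two modes kepstin describes can be sketched like this; file names and the bitrate/VBV numbers are placeholders:]

```shell
# libvpx-vp9 "constrained quality": cap the stream at 2 Mbit/s, but spend
# fewer bits wherever CRF 30 quality is already reached
ffmpeg -i in.mp4 -c:v libvpx-vp9 -crf 30 -b:v 2M out.webm

# the rough x264 analogue: CRF quality target plus VBV rate caps
ffmpeg -i in.mp4 -c:v libx264 -crf 23 -maxrate 2M -bufsize 4M out.mp4
```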
[22:11:07 CEST] <abumusa> In case you know some who wants to work on this, please let me know
[22:41:55 CEST] <DelphiWorld> hey dudes
[22:42:01 CEST] <DelphiWorld> i have a small issue if someone could help?
[22:42:25 CEST] <DelphiWorld> ffmpeg -i http://1.1.1.1:4343/1.ts -acodec copy -vcodec copy -f flv rtmp://localhost/live/1.ts
[22:42:44 CEST] <DelphiWorld> this works perfectly except when the http source goes down, so the ffmpeg process shuts down
[22:42:55 CEST] <DelphiWorld> is there any timeout for the input so ffmpeg keep retrying?
[22:46:25 CEST] <kepstin> well, http is a tcp connection, if it disconnects, you have to restart from scratch...
[22:46:33 CEST] <kepstin> just run the command in a shell script loop?
[22:46:53 CEST] <DelphiWorld> kepstin: sad... pretty big issue, always needing to restart manually
[22:47:02 CEST] <DelphiWorld> if the source was multicast this issue wouldn't happen
[22:47:03 CEST] <furq> https://www.ffmpeg.org/ffmpeg-protocols.html#http
[22:47:16 CEST] <furq> maybe reconnect_at_eof and/or reconnect_streamed will do what you want
[22:47:33 CEST] Action: DelphiWorld reading furq suggestion
[22:49:37 CEST] <DelphiWorld> furq: where to put that reconnect_at_eof ?
[22:49:50 CEST] <furq> -reconnect_at_eof 1 -i http://...
[22:50:08 CEST] <DelphiWorld> thank furq, doing it now
[22:52:29 CEST] <DelphiWorld> ok, furq. ffmpeg -reconnect_at_eof 1 http://furq.me/chan.ts
[23:13:57 CEST] <DelphiWorld> it's up till now
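[kepstin's shell-loop suggestion combined with furq's reconnect options might look like this, using DelphiWorld's URLs as placeholders:]

```shell
#!/bin/sh
# Restart ffmpeg whenever the HTTP source drops, pausing briefly between tries.
# -reconnect_at_eof / -reconnect_streamed are options of ffmpeg's HTTP protocol
# and must appear before the -i they apply to.
while true; do
    ffmpeg -reconnect_at_eof 1 -reconnect_streamed 1 \
        -i http://1.1.1.1:4343/1.ts \
        -c:a copy -c:v copy -f flv rtmp://localhost/live/1.ts
    sleep 2
done
```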
[23:17:57 CEST] <streulma> how can I encode a http mjpeg stream to mp4 format x264 h264 codec ?
[23:17:57 CEST] <streulma> or better an rtsp h264 stream to mp4 file
[23:18:09 CEST] <streulma> mp4 stream sorry
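[A minimal sketch of both of streulma's cases; the camera URLs and file names are placeholders:]

```shell
# MJPEG over HTTP must be re-encoded, e.g. to H.264 with libx264
ffmpeg -i http://camera.local/stream.mjpg -c:v libx264 out.mp4

# An RTSP source that is already H.264 can just be remuxed, no re-encode
ffmpeg -rtsp_transport tcp -i rtsp://camera.local/stream -c:v copy -an out.mp4
```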
[23:53:24 CEST] <whomp> how can i set the duration of the audio and video to be the same, and to be the lesser of the two's current durations? i'm dealing with video files where one of the two is slightly longer than the other, and it causes playback issues on ios
[23:57:38 CEST] <whomp> in general, how am i supposed to sync my videos that play at 29.97 fps with my audio that is at 44100hz?
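[whomp's first question is usually answered with ffmpeg's -shortest option, which ends the output when the shorter of the two streams runs out; a sketch with placeholder names (note that with stream copy the cut can only land on packet boundaries, so it is approximate):]

```shell
# Trim the output to the shorter of the audio and video streams
ffmpeg -i in.mp4 -c copy -shortest out.mp4
```

[As for the second question, ffmpeg syncs streams by timestamp, so a 29.97 fps video track and 44100 Hz audio track need no special handling as long as their timestamps line up.]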
[00:00:00 CEST] --- Tue Jul 19 2016

