[Ffmpeg-devel-irc] ffmpeg.log.20171002

burek burek021 at gmail.com
Tue Oct 3 03:05:01 EEST 2017


[00:01:34 CEST] <JEEB> Durandal: probably http://git.videolan.org/?p=ffmpeg.git;a=commit;h=5a3b602acda68fe5ca09082dc753179450a97a13
[00:01:55 CEST] <JEEB> although wait that's not it
[00:08:05 CEST] <Durandal> I'm guessing that applies to the alternate implementation, not libopus
[00:09:09 CEST] <Durandal> here's my command line: ffmpeg -i in.mkv -c:a libopus -b:a 212K -mapping_family 1 -map 0:a:0 out.ogg
[00:09:41 CEST] <Durandal> the source stream looks like this on ffprobe: Stream #0:1(eng): Audio: dts (DTS), 48000 Hz, 5.1(side), fltp, 1536 kb/s (default)
[00:23:33 CEST] <JEEB> Durandal: http://git.videolan.org/?p=ffmpeg.git;a=commit;h=37941878f193a2316c514bd5ba55bfe9d2dfdfcf
[00:23:36 CEST] <JEEB> this was it then
[00:23:46 CEST] <JEEB> you specifically setting mapping family gave the hint out :P
[01:41:05 CEST] <lindylex> How can I change the color of this audio wave to green?  This is what I tried :::  ffmpeg -i hgs.m4a -filter_complex "[0:a]volume=2, showwaves=s=1280x720:mode=line,colorkey=#123456,format=yuv420p[v]" -map "[v]" -map 0:a -c:v libx264 -c:a copy -y output.mkv
[01:48:45 CEST] <lindylex> Never mind I solved it :  ffmpeg -i hgs.m4a -filter_complex "[0:a]volume=2, showwaves=s=1280x720:mode=line:colors=white,format=yuv420p[v]" -map "[v]" -map 0:a -c:v libx264 -c:a copy -y output.mkv
[09:02:16 CEST] <rabbe> for live camera streaming to browser, using ffmpeg -> FLV -> nginx, what WebRTC implementation can i use?
[09:02:31 CEST] <rabbe> (FLV over rtmp)
[09:05:04 CEST] <rabbe> do I configure nginx with for example "dash on" and something else to expose a dash stream which can then be opened by the browser running a WebRTC player on the client side?
[09:57:21 CEST] <rabbe> hm, if i stream dash or hls, i don't need a player browser side, html5 video tag will work?
[09:58:10 CEST] <JEEB> there are a few players that take things straight into a <video> tag, mostly MS Edge and Apple's thing (Edge supports DASH and HLS manifests/playlists in a video tag, the Apple thing HLS only)
[09:58:32 CEST] <JEEB> so you have to utilize dash.js or hls.js to parse the things and feed them separately to the decoders instead (it is a video tag underneath, though)
[10:01:09 CEST] <rabbe> ok, so i need dash.js and hls.js on the web page. how much code do i need for parsing?
[10:02:02 CEST] <rabbe> have an example anywhere?
[10:02:34 CEST] <JEEB> they handle it for you, just google for those libraries and the usage example
[10:02:44 CEST] <rabbe> will i get better performance using separate player?
[10:03:23 CEST] <JEEB> dunno, that would be a general browser | not a browser thing
[10:05:32 CEST] <rabbe> in nginx, i would just need dash on; dash_path /tmp/dash; to send in dash format?
[10:06:14 CEST] <JEEB> that is nginx-rtmp specific so see its configuration
[10:06:47 CEST] <JEEB> and it's probably more correct to say "offer" content in that format, since you're not pushing but the client is pulling
[10:06:57 CEST] <rabbe> right
[10:07:08 CEST] <rabbe> oh, and i need some manifest file for dash?
[10:07:21 CEST] <JEEB> it should generate that by itself if it's generating DASH
[10:07:24 CEST] <JEEB> since it's part of it
[10:07:31 CEST] <rabbe> oh
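For reference, a minimal nginx-rtmp sketch of the kind of setup being discussed; the port, application name and path are illustrative assumptions, not taken from the log:

    rtmp {
        server {
            listen 1935;
            application live {
                live on;
                record off;
                dash on;              # generate DASH segments plus the manifest
                dash_path /tmp/dash;  # <stream>.mpd is written here
            }
        }
    }

ffmpeg would then publish to rtmp://localhost/live/<stream> and nginx writes the manifest by itself, as JEEB says above.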
[10:08:00 CEST] <rabbe> so on the web page, i will just need something like this? view-source:http://dashif.org/reference/players/javascript/1.4.0/samples/getting-started-basic-embed/auto-load-single-video-src.html
[10:08:11 CEST] <JEEB> yes, or any of the newer versions :P
[10:08:23 CEST] <JEEB> you can even first test with one of the examples
[10:08:30 CEST] <JEEB> https://reference.dashif.org/dash.js/
[10:12:10 CEST] <rabbe> thanks, i'll try
[10:19:52 CEST] <rabbe> wait, was this solution only for MS Edge and Apple?
[10:20:19 CEST] <rabbe> i need a separate js player for other browsers?
[10:21:20 CEST] <JEEB> dash.js is for everything, MS Edge supports HLS and DASH manifest URL right in the video tag
[10:22:06 CEST] <rabbe> so for like chrome i need dash.js + some separate player?
[10:22:16 CEST] <JEEB> dash.js itself gives you basic playback capabilities
[10:22:24 CEST] <furq> hls.js and dash.js work with a regular video tag
[10:22:35 CEST] <furq> you can use another player on top of that if you want to but you don't need to
[10:22:36 CEST] <JEEB> and if you wanted a separate player those generally already include hls.js/dash.js
[10:22:42 CEST] <furq> and yeah, that
[10:23:10 CEST] <rabbe> for chrome i need some extra code to give the video tag the feed?
[10:23:18 CEST] <furq> yeah but only a few lines
[10:23:27 CEST] <rabbe> okay
[10:23:34 CEST] <rabbe> example anywhere?
[10:23:37 CEST] <JEEB> ...
[10:23:45 CEST] <JEEB> have you actually tried reading the docs in the dash.js repo or so?
[10:23:52 CEST] <JEEB> or do you expect being spoonfed everything?
[10:24:10 CEST] <rabbe> hehe.. just stressed, sorry
[10:24:15 CEST] <rabbe> i will check
[10:24:18 CEST] <JEEB> I mean you even linked dash.js docs *yourself* for christ's sake :)
[10:24:34 CEST] <rabbe> :)
[10:37:39 CEST] <Nacht> There are quite a few players around as well
[10:37:58 CEST] <Nacht> Theo player, JW player, Meister Player
[10:38:29 CEST] <Nacht> https://www.theoplayer.com/ https://www.jwplayer.com/ https://www.meisterplayer.com/
[11:48:41 CEST] <cart_man> Hi everyone
[11:48:51 CEST] <cart_man> I am trying to install FFMPEG from .deb packages only ! BUT I cannot seem to find and install any of the libavcodec-ffmpeg packages
[11:49:04 CEST] <cart_man> I get told that its missing libavcodec-ffmpeg56 but when I try and install it it just says it does not exist ...
[11:49:05 CEST] <cart_man> So then later on I see that Ubuntu16 has libavcodec-ffmpeg57 now but that can not install either ... tried -> " sudo apt-get install libavcodec-extra57 libavformat57 libvutil55 "  : Unable to locate packages
[11:52:15 CEST] <BtbN> You'll have to ask your distribution about that. FFmpeg does not provide deb packages, let alone binaries.
[11:54:20 CEST] <cart_man> where can I go to ask distribution for it though?
[11:54:27 CEST] <cart_man> Just thought people in this channel install it quite a lot
[12:02:26 CEST] <tittybang> https://packages.debian.org/source/jessie/libav
[12:02:27 CEST] <tittybang> ?
[12:05:17 CEST] <BtbN> That's not ffmpeg.
[12:08:38 CEST] <tittybang> https://wiki.debian.org/ffmpeg
[12:09:45 CEST] <tittybang> it's not necessarily a distribution issue; if the distributions aren't informed on how to set it up it's kind of impossible to maintain.
[12:12:22 CEST] <cart_man> I have installed it in the past using apt-get BUT I had to install a couple of packages prior to it
[12:12:31 CEST] <cart_man> and I remember it was libav packages
[12:12:37 CEST] <cart_man> tittybang ^^
[12:12:38 CEST] <furq> you really shouldn't have to do that
[12:13:07 CEST] <furq> it sounds like either your repos are messed up or you've got some custom PPA that's fucking with things
[12:16:55 CEST] <cart_man> furq : Ok so it says to run "apt-get -f install" to fix the issue ... that might be what you are referring to? Problem is I need to be able to install these packages on a machine with no internet. Has to be version locked so running apt-get -f install is not an option
[13:03:10 CEST] <rabbe> if ffmpeg streams to rtmp://localhost/live/test and nginx is configured with: live on; record off; dash on; dash_path /tmp/dash; it will create a manifest file inside /tmp/dash which the web page can reference using http://localhost/something.. or should it be rtmp://something?
[13:33:17 CEST] <c_14> http://, though you probably want to use /path to make it protocol agnostic
[13:40:16 CEST] <rabbe> i changed dash_path to /tmp/dashfolder.. i see a test.mpd being created inside that folder.. but i can't get video using source src="http://localhost/live/test.mpd"
[13:40:23 CEST] <rabbe> on the webpage
[13:42:33 CEST] <rabbe> if i used source src = "http://dash.edgesuite.net/envivio/dashpr/clear/Manifest.mpd" it worked
[14:39:09 CEST] <rabbe> should this produce a valid manifest file? ffmpeg -f x11grab -s 1920x1080 -i :0.0     -c:v libx264 -profile:v baseline -preset veryfast -tune zerolatency -pix_fmt yuv420p -x264opts crf=20:vbv-maxrate=3000:vbv-bufsize=100:intra-refresh=1:slice-max-size=1500:keyint=30:ref=1  -f flv rtmp://localhost/live/test
[14:46:13 CEST] <celyr> any idea on why this mp3: https://paste.debian.net/988681/ is not working in a car mp3 radio reader ?
[14:51:35 CEST] <Guest72430> hi I need some help I've tried to use ffmpeg to split audio from video while preserving the folder structure but I just can't get it done.
[14:51:55 CEST] <Guest72430> here is a link to a bat file I've made
[14:51:56 CEST] <Guest72430> http://www.filedropper.com/videotoaudioonlyv0021
[14:52:00 CEST] <relaxed> celyr: ffmpeg -i x.mp3 -map 0:a -c copy out.mp3
[14:52:22 CEST] <celyr> relaxed, tnx for the spoonfeed and what i'm doing ? :XD
[14:52:32 CEST] <Guest72430> I pretty much don't know what I'm doing though
[14:52:43 CEST] <relaxed> celyr: ignoring the album art
[14:52:50 CEST] <celyr> cool
[14:52:58 CEST] <Guest72430> I've googled everything and tried using the manual but I still cant get it done
[14:54:15 CEST] <relaxed> Guest72430: pastebin the script
[14:54:25 CEST] <Guest72430> k
[14:55:17 CEST] <Guest72430> https://pastebin.com/kJ7rEGdQ
[14:56:42 CEST] <Guest72430> I'm hacking it together since I don't really understand what I'm doing
[14:57:41 CEST] <Guest72430> I've got a lot of folders and subfolders
[14:59:33 CEST] <Guest72430> converting everything and then sorting it again would take hours. I really need something to convert/split audio from video while keeping the folder structure intact.
[14:59:58 CEST] <Guest72430> converting every folder would be my last resort but doing that would also take very long
[15:10:23 CEST] <Guest72430> anyone?
[15:10:27 CEST] <Nacht> eek, Windows
[15:11:16 CEST] <Guest72430> well some of us don't work as admin and have to use windows
[15:11:18 CEST] <relaxed> it looks like you're mixing windows batch and bash
[15:11:28 CEST] <Nacht> yeah
[15:11:31 CEST] <Guest72430> oh ok
[15:11:37 CEST] <Nacht> Does For even work in Windows ?
[15:11:40 CEST] <Nacht> I know of FORFILES
[15:12:26 CEST] <relaxed> "for" is valid
[15:13:08 CEST] <Guest72430> I could use Cygwin
[15:13:17 CEST] <Guest72430> if windows is a problem
[15:13:30 CEST] <Nacht> Nah, Windows shouldnt be much of a problem
[15:14:57 CEST] <Nacht> So to get it straight, you want a script that copies only the audio of all the mp4 files in your input folder, and creates audio files of it in the output folder?
[15:15:28 CEST] <Guest72430> it's ok if it's in the same folder
[15:15:40 CEST] <Guest72430> or the folder structure needs to be copied
[15:15:52 CEST] <Nacht> Are the videos all using the same audio codec ?
[15:16:06 CEST] <Guest72430> probably
[15:16:18 CEST] <Guest72430> is that important?
[15:16:31 CEST] <Guest72430> it could get mixed sometimes
[15:16:40 CEST] <Nacht> Then it's best to transcode
[15:17:13 CEST] <Guest72430> yes
[15:17:48 CEST] <Nacht> So you get something like this: ffmpeg -i <input> -vn -c:a mp3 -f mp3 <output>.mp3
[15:18:31 CEST] <Guest72430> yes the problem is I can't convert everything in subfolders with it
[15:18:40 CEST] <Guest72430> only one folder after another
[15:18:46 CEST] <Guest72430> I can do that
[15:19:28 CEST] <Guest72430> My problem is like I said keeping the folder structure
[15:20:13 CEST] <Guest72430> something like convert everything in the folder and subfolders and place it in that same folder
[15:20:19 CEST] <Nacht> So your problem lies outside the usage of ffmpeg
[15:20:33 CEST] <Guest72430> does it?
[15:20:52 CEST] <Guest72430> Im new to this
[15:21:41 CEST] <Nacht> You're using a script to parse through the folder/files. So you want to know what directory you're in, so you can give the same directory to ffmpeg
[15:22:19 CEST] <Guest72430> yes I think so
[15:22:55 CEST] <Nacht> I'm no expert in windows scripting, but I might have something. Give me a few mins to test it
[15:23:22 CEST] <Guest72430> that would be awesome
[15:25:41 CEST] <Guest72430> it doesn't have to be windows script I'll take anything
[15:26:44 CEST] <Guest72430> the reason I have the bat file is because that's what I found per google
[15:31:03 CEST] <Nacht> you don't happen to have Windows 10 and BASH installed ?
[15:36:00 CEST] <Guest72430> nope
[15:36:21 CEST] <Guest72430> but I can install cygwin
[15:36:52 CEST] <Guest72430> or use my deb hdd
[15:41:19 CEST] <dystopia> anyway to handle bink video with ffmpeg?
[15:41:49 CEST] <JEEB> bink v1 is implemented
[15:41:54 CEST] <JEEB> bink v2 is not yet
[15:41:59 CEST] <JEEB> Watches Pelcome
[15:42:17 CEST] <JEEB> thankfully you can get the official player from bink itself :)
[15:42:25 CEST] <JEEB> so you can just reverse engineer that
[15:44:05 CEST] <Nacht> Jeez, Windows scripting feels so restrictive
[15:44:18 CEST] <JEEB> powershell isn't that bad
[15:44:35 CEST] <JEEB> also in addition to batch windows for a very long time has had visual basic and JS based scripting
[15:44:45 CEST] <JEEB> but yes, batch is old and barren and full of special cases
[15:45:12 CEST] <JEEB> the one thing I really liked about PS is that it let you poke around registry as if it was a file system
[15:45:16 CEST] <JEEB> which is <3
[15:45:57 CEST] <Nacht> Guest72430: Do you have powershell ?
[15:46:34 CEST] <Guest72430> nope
[15:46:57 CEST] <Nacht> okido, relative paths it is then
[15:47:54 CEST] <Nacht> FORFILES -p "C:\mount\Test\input" /m *.mp4 /s -c "cmd /c ffmpeg -i @path -vn -c:a mp3 -f mp3 @PATH.mp3"
[15:48:10 CEST] <Nacht> doesn't even like relative paths
[15:48:23 CEST] <Guest72430> that short o.O?
[15:48:25 CEST] <Nacht> So what does this do ? FORFILES will go recursively through all your subfolders
[15:48:34 CEST] <Guest72430> dang nice
[15:48:37 CEST] <Nacht> finding all files that end with .mp4
[15:48:42 CEST] <Guest72430> you'r my here :D
[15:48:45 CEST] <Guest72430> hero
[15:49:01 CEST] <Nacht> it will then paste it to ffmpeg and it will make an mp3 in the SAME folder, but ending with .mp3
[15:49:10 CEST] <Guest72430> I'll test it out asap
[15:49:11 CEST] <Nacht> its dirty, but it does the job
[15:49:22 CEST] <Guest72430> just what I needed
[15:52:59 CEST] <sash__> Hey, I'm hitting the same issue this guy has hit: https://lists.ffmpeg.org/pipermail/ffmpeg-user/2013-March/014050.html - Basically, when transcoding to mpegts, the start time is always 1.4 seconds for no reason. Can anyone help?
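If that 1.4 second offset is the mpegts muxer's usual initial delay, the mux delay options may be worth a try — hedged, since the exact cause isn't confirmed in this log (filenames are placeholders):

    ffmpeg -i in.mp4 -c copy -muxdelay 0 -muxpreload 0 out.ts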
[16:00:59 CEST] <Guest72430> did you run it from command prompt or as a bat file?
[16:02:36 CEST] <Guest72430> lol I forgot to put ffmpeg in system32
[16:03:53 CEST] <Guest72430> nope still getting errors
[16:05:37 CEST] <Guest72430> ERROR: Invalid argument/option - '-i'.Type "FORFILES /?" for usage.
[16:10:19 CEST] <Guest72430> strange, I used /? and it should work, everything is in double quotes
[16:18:52 CEST] <Guest72430> you've tested the script right?
[16:25:40 CEST] <rabbe> how to solve this? Specified type attribute of application/dash+xml is not supported. Load of media resource http://localhost/live/test.mpd failed.
[16:25:55 CEST] <rabbe> i've added it to nginx mime.types
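For reference, a hedged sketch of the nginx HTTP side that usually goes with the dash_path used earlier (paths are assumptions); note that a plain <video> tag still needs dash.js in most browsers, as discussed this morning:

    location /live {
        alias /tmp/dashfolder;                  # /live/test.mpd -> /tmp/dashfolder/test.mpd
        types { application/dash+xml mpd; }     # MIME type for the manifest
        add_header Cache-Control no-cache;
    }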
[16:26:43 CEST] <funman> /usr/bin/ffmpeg -ar 48000 -i ird.ts
[16:26:46 CEST] <funman> Option sample_rate not found.
[16:27:11 CEST] <funman> what am I missing?
[16:28:22 CEST] <funman> -ar:0:3 (i want the option for 3rd audio track) doesn't change a thing
[16:31:57 CEST] <DHE> you don't set the sample rate for the input unless it's a headerless format that needs it specified
[16:36:38 CEST] <funman> it's a headerless format that needs it specified
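An example of what DHE means for a genuinely headerless input: raw PCM carries no header, so the format, rate and channel count have to be given before -i (the values here are illustrative assumptions):

    ffmpeg -f s16le -ar 48000 -ac 2 -i audio.raw -c:a flac out.flac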
[16:37:34 CEST] <funman> another one:
[16:37:39 CEST] <funman> /usr/bin/ffmpeg -non_pcm_mode copy -y -i ird.ts -map 0:3 -c:a:0 pcm_s32le -f s32le s302.raw
[16:37:44 CEST] <funman> Codec AVOption non_pcm_mode (Chooses what to do with NON-PCM) specified for input file #0 (ird.ts) has not been used for any stream. The most likely reason is either wrong type (e.g. a video option with no video streams) or that it is a private option of some decoder which was not actually used for any stream.
[16:40:29 CEST] <Guest72430> does anyone what I'm doing wrong here: cmd is: FORFILES -p "C:\mount\Test\input" /m *.mp4 /s -c "cmd /c ffmpeg -i @path -vn -c:a mp3 -f mp3 @PATH.mp3" error I get is: ERROR: Invalid argument/option - '-i'.Type "FORFILES /?" for usage.
[16:40:54 CEST] <Guest72430> I get that FORFILES has a problem with the ffmpeg -i command
[16:41:08 CEST] <Guest72430> I tried putting it in quotes but that didn't work
[16:41:22 CEST] <Nacht> I put it in a .bat file btw
[16:42:02 CEST] <Nacht> Did you change the path ?
[16:42:58 CEST] <Guest72430> same problem msg with the bat
[16:43:25 CEST] <Guest72430> and yes I changed to a path with the subfolders and files in it
[16:45:02 CEST] <Guest72430> to me it looks like forfiles has a problem with the ffmpeg commands
[16:45:33 CEST] <Guest72430> but if it worked for you I'm kinda stuck
[16:47:18 CEST] <Guest72430> god damn windows
[16:48:50 CEST] <Guest72430> I just found this: To include special characters in the command line, use the hexadecimal code for the character in 0xHH format (ex. 0x09 for tab). Internal CMD.exe commands should be preceded with "cmd /c".
[16:50:06 CEST] <Guest72430> maybe it's not working because ffmpeg is not a cmd.exe command>
[16:50:08 CEST] <Guest72430> ?
[16:50:33 CEST] <Guest72430> I've used and tested ffmpeg though
[16:51:04 CEST] <Nacht> It worked for me ?
[16:51:09 CEST] <Guest72430> and the error only states that it has problems with -i
[16:51:26 CEST] <Guest72430> god damn it I hate this kinda stuff
[16:51:36 CEST] <Guest72430> feel like I'm cursed
[16:53:07 CEST] <Nacht> ffmpeg is in your PATH ?
[16:54:52 CEST] <Guest72430> what do you mean with in my path?
[16:55:13 CEST] <Guest72430> I put it in win32 so I can use it in cmd.exe without path
[16:55:24 CEST] <Guest72430> I mean system32
[17:00:12 CEST] <Nacht> If you want to run a binary in windows in your cmd without having to use the entire path, you should put it in your environment path
[17:00:16 CEST] <Nacht> NOT system32
[17:02:33 CEST] <Guest72430> holy shit I found the problem I had to remove the "" around the path
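For reference, a hedged cygwin/bash equivalent of the FORFILES approach above, which also keeps the folder structure because each mp3 is written next to its source file (the input path is an illustrative assumption):

    find /cygdrive/c/mount/Test/input -name '*.mp4' \
      -exec sh -c 'ffmpeg -i "$0" -vn -c:a libmp3lame "${0%.mp4}.mp3"' {} \;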
[17:47:16 CEST] <vans163> what is the difference between baseline  and constrained baseline?
[17:47:38 CEST] <JEEB> baseline has various additional features which are not found in main/high profiles
[17:47:46 CEST] <JEEB> pretty much nothing uses or implements those
[17:47:55 CEST] <JEEB> constrained baseline is the part that is a subset of main/high
[17:48:08 CEST] <JEEB> (and it is what is meant with "baseline" in 99% of all cases)
[17:48:14 CEST] <furq> x264 baseline profile is actually CBP
[17:48:25 CEST] <furq> so there's usually no practical difference unless you're dealing with some weird hardware encoder
[17:48:34 CEST] <vans163> JEEB: ah thanks, so constrained baseline is an improvement over baseline
[17:48:37 CEST] <vans163> not worse
[17:48:46 CEST] <furq> CBP supports fewer features than baseline
[17:48:57 CEST] <vans163> oh
[17:49:01 CEST] <furq> https://en.wikipedia.org/wiki/H.264/MPEG-4_AVC#Profiles
[17:49:05 CEST] <furq> see the table
[17:49:10 CEST] <vans163> baseline is already so bad though.. this is for WebRTC
[17:49:32 CEST] <JEEB> vans163: the thing is, pretty much nothing supports those additional features
[17:49:36 CEST] <furq> right
[17:49:42 CEST] <furq> openh264 only outputs CBP afaik
[17:49:54 CEST] <vans163> JEEB: ah i got it, yes they use Openh264 decoder
[17:49:56 CEST] <JEEB> so constrained baseline is pretty much what people mean when they say "baseline"
[17:49:57 CEST] <vans163> and encoder
[17:50:11 CEST] <furq> also those additional features in baseline are all for error prevention afaik
[17:50:22 CEST] <furq> they won't make the picture any nicer
[17:51:25 CEST] <vans163> got it.  il have to play with webrtc later and try to get the browsers hardware decoder to decode main/high instead of baseline
[17:51:54 CEST] <vans163> and if software decoder is chosen, fallback to CBP so openh264 can decode it in decent speed
[17:52:13 CEST] <vans163> i noticed openh264 can decode main/high profiles, but it takes almost 2x if not more CPU usage and time
[17:52:24 CEST] <furq> i take it you can't just bump the bitrate
[17:52:36 CEST] <vans163> furq: i can
[17:52:45 CEST] <furq> baseline should look fine if you throw enough rate at it
[17:52:58 CEST] <furq> at least with x264 it will, idk about openh264
[17:53:17 CEST] <vans163> furq: any idea what the recommended rate is for 30fps 720p and 1080p?
[17:53:29 CEST] <furq> depends massively on the source
[17:53:37 CEST] <furq> you'll probably just have to go with trial and error
[17:53:39 CEST] <vans163> source is very colorful(?)
[17:54:05 CEST] <vans163> i noticed for 30fps 720p at 16mbps is much better than 8mbps
[17:54:14 CEST] <vans163> and 16->32mbps not much increase
[17:54:41 CEST] <vans163> but afaik a main/high profile  with 5-8mbps looks like the 16mbps baseline, but im not sure
[17:55:01 CEST] <furq> if that's main/high from x264 then i'd expect that to be better in general
[17:55:12 CEST] <furq> the only reason to use openh264 is that there's no licensing fee afaik
[17:55:26 CEST] <vans163> its main/high from nvenc
[17:55:32 CEST] <furq> if that's not of concern to you then you're probably better off using x264 for baseline as well
[17:55:35 CEST] <furq> oh
[17:55:50 CEST] <vans163> but i want the client to decode it which is a webrtc browser (chrome, firefox)
[17:56:11 CEST] <furq> well yeah any constrained baseline stream should work, regardless of encoder
[17:59:51 CEST] <vans163> furq, JEEB: thanks good to know what CBP is now
[18:00:00 CEST] <vans163> oh while were on the topic
[18:00:12 CEST] <vans163> theres an option for level-asymmetry-allowed 1/0
[18:00:28 CEST] <vans163> any idea what this option could mean
[18:03:29 CEST] <vans163> i see so Main and High give you Interlaced Coding and CABAC which I guess is responsible for the bandwidth savings
[18:03:50 CEST] <vans163> b slices I'm not using as they cause latency in the decoder
[18:05:15 CEST] <JEEB> cabac is the biggest thing
[18:06:00 CEST] <JEEB> b-pictures are useful for compression but yes, add latency (on the level of N frames up to 16 I think)
[18:06:28 CEST] <vans163> going from main to high seems to offer 8x8 vs 4x4 transform adaptivity
[18:06:50 CEST] <vans163> and then the next step up i guess is high444
[18:06:57 CEST] <vans163> for "perfect" color
[18:07:39 CEST] <vans163> openh264 repo does have a cabac decoder
[18:07:50 CEST] <vans163> which is maybe why i noticed it can decode main?
[18:08:45 CEST] <JEEB> yes, 8x8dct is a high profile only feature. limited use compared to cabac/b-pictures you get with main
[18:08:57 CEST] <JEEB> or well, limited results
[18:11:03 CEST] <vans163> JEEB: i see, so basically I just want CABAC as the main bitrate saving feature.  1 annoying thing I noticed at 8mbps vs 16mbps on a baseline profile was,  at 8mbps every time the iframe came (every 3~ seconds at 30 fps 100 frames) the picture blurred
[18:11:11 CEST] <vans163> at 16mbps the blur never happened
[18:11:35 CEST] <vans163> it blurred for like 10-15 frames
[18:21:03 CEST] <Johnjay> Why do I need the v=0 in this command line?
[18:21:25 CEST] <Johnjay> https://pastebin.com/pZEMAu6C
[18:21:39 CEST] <MooseMoose> Is there a way to chain -f mpegts -map p:19 to lavfi? I've got an MPTS where I'd like to be able to extract the subcc based off the program number.
[18:22:42 CEST] <furq> !filter concat @Johnjay
[18:22:42 CEST] <nfobot> Johnjay: http://ffmpeg.org/ffmpeg-filters.html#concat
[18:25:03 CEST] <Johnjay> yes it says the default of v is 1
[18:25:06 CEST] <Johnjay> but i only have audio files
[18:25:12 CEST] <Johnjay> so why is saying v=0 necessary
[18:26:41 CEST] <furq> because the filter defaults to 2
[18:26:44 CEST] <furq> or 1, even
[18:30:01 CEST] <Johnjay> hmm ok
[18:30:08 CEST] <Johnjay> the error message was totally confusing
[18:32:02 CEST] <Johnjay> Stream specifier ':a:0' in filtergraph description [0:a:0] [1:a:0] concat=n=2:a= 1 [a] matches no streams.
[18:32:14 CEST] <Johnjay> But when I put the v=1 in it goes away
[18:32:33 CEST] <Johnjay> er v=0
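A minimal form of the audio-only concat being discussed, with v=0 telling the filter there are no video streams to concatenate (filenames are placeholders):

    ffmpeg -i a.mp3 -i b.mp3 -filter_complex "[0:a:0][1:a:0]concat=n=2:v=0:a=1[a]" -map "[a]" out.mp3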
[18:35:14 CEST] <Hopper_> Hey FFMPEG, getting an error when trying to overlay text on my video stream.  https://pastebin.com/pfDCUpeb
[18:39:30 CEST] <furq> Hopper_: remove the " after hstack
[18:40:18 CEST] <Hopper_> But then there is still one at the start of drawtext=
[18:40:48 CEST] <furq> looks like you get to have fun with escaping quotes
[18:40:52 CEST] <furq> lucky~
[18:41:11 CEST] <Hopper_> I have no idea how this syntax operates...
[18:41:11 CEST] <furq> you don't actually need the ones around drawtext tbh
[18:41:18 CEST] <furq> if the whole thing is in quotes
[18:41:21 CEST] <Hopper_> But I'm using spaces.
[18:41:32 CEST] <furq> right, but the entire filterchain is quoted
[18:41:36 CEST] <Hopper_> Oh, okay.
[18:41:51 CEST] <Hopper_> I'll try that
[18:41:53 CEST] <furq> also you want to get rid of that trailing : after fontsize=24
[18:42:12 CEST] <Hopper_> Oh, that implies another filter, right?
[18:42:36 CEST] <Hopper_> Parameter*
[18:42:51 CEST] <Johnjay> correct
[18:43:00 CEST] <Johnjay> the doc makes the syntax of filtergraphs very confusing at first
[18:43:53 CEST] <Hopper_> "No such filter: 'hstack drawtext'"
[18:44:02 CEST] <furq> hstack,drawtext
[18:46:24 CEST] <Hopper_> furq: Just when I thought I was getting a hang of FFMPEG, >.<
[18:46:25 CEST] <Hopper_> https://pastebin.com/uhjWeUiB
[18:47:33 CEST] <furq> good old windows path separators
[18:48:00 CEST] <tdr> Both text and text file provided. Please provide only one  <--- shouldnt be that bad
[18:48:01 CEST] <furq> i think last time i tried to pass an absolute path to fontfile i gave up and just put it in the working directory
[18:48:20 CEST] <furq> you probably want to replace those \s with \\ or \\\ or \\\\ or...
[18:48:36 CEST] <furq> or maybe / but idk if that works
[18:51:14 CEST] <Johnjay> Hopper why are you taking input from 2 different cameras?
[18:51:44 CEST] <Hopper_> Johnjay: 2 side of my device need monitoring.
[18:51:58 CEST] <Hopper_> furq: Ya, it is in the working directory.
[18:52:05 CEST] <furq> pass a relative path then
[18:52:07 CEST] <Hopper_> I guess I should take out the path.
[18:53:50 CEST] <tdr> or at least change those backslashes that are being interpreted as escapes
[18:54:02 CEST] <tdr> (furq's suggestion)
[18:54:16 CEST] <furq> well my suggestion is just pass a relative path with no slashes at all
[18:54:22 CEST] <furq> anything else is a giant pain on windows iirc
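Putting the suggestions above together, a hedged sketch of the chain with the whole filtergraph quoted once, no trailing ':', and the font referenced by a relative path (inputs and text are placeholders):

    ffmpeg -i cam1.mp4 -i cam2.mp4 -filter_complex "[0:v][1:v]hstack,drawtext=fontfile=arial.ttf:text='label':fontsize=24:x=10:y=10" out.mp4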
[18:54:44 CEST] <tdr> furq, quote magic doesn't work on windozen?
[18:54:47 CEST] <Johnjay> pro tip: you can open a command window on windows in a given directory
[18:54:51 CEST] <Johnjay> by holding down the shift key
[18:55:07 CEST] <Johnjay> i.e. shift-right click -> Open Command window here
[18:55:30 CEST] <Johnjay> i've found windows and linux are both Pita just in different ways.
[18:56:25 CEST] <MooseMoose> git-bash makes life on windows simpler at times
[18:58:15 CEST] <Johnjay> I've had some success with msys2
[19:03:36 CEST] <Hopper_> Now I'm getting video input too full...
[19:11:19 CEST] <feliwir> hey, how do i seek to a beginning of a file (a reset basically). When i do: av_seek_frame(format_ctx, stream_idx, 0, AVSEEK_FLAG_BYTE); there is no impact
[19:11:33 CEST] <feliwir> same when i use AVSEEK_FLAG_FRAME
[19:28:43 CEST] <Johnjay> hmm
[19:28:49 CEST] <Johnjay> is there a way with ffmpeg to "pad" out a file?
[19:29:09 CEST] <Johnjay> I.e. take file A and concat copies of file B until file A is X seconds long?
[19:54:40 CEST] <Puck`> hi everyone
[19:54:57 CEST] <Puck`> is there a method to include images in a live video stream? These images would be changes every ~3 minutes.
[19:55:09 CEST] <Puck`> *changed
[20:00:58 CEST] <Hopper_> furq: Is it possible to use hstack and pad at the same time?
[20:01:37 CEST] <Hopper_> nvm, I think I found a solution.
[20:13:30 CEST] <Johnjay> well i'll be fucked
[20:13:35 CEST] <Johnjay> there is a pad filter in ffmpeg
[20:13:37 CEST] <Johnjay> apad
[20:13:49 CEST] <Johnjay> i guess i don't need this 10secsilence.mp3 file anymore
[20:14:47 CEST] <Johnjay> although i'm not sure if i can use apad with segmenting effectively
[20:15:03 CEST] <Johnjay> i'll need a way to find the last file output from the segment filter and then pad it
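A hedged example of the apad approach: apad pads the audio with silence indefinitely, and -t caps the output at the target length (duration and filenames are assumptions):

    ffmpeg -i short.mp3 -af apad -t 60 padded.mp3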
[20:46:53 CEST] <MooseMoose> Can ffmpeg transcode interlaced to progressive using nvdec and nvenc?
[20:49:32 CEST] <alexpigment> MooseMoose, I don't have a lot of experience with using nvdec (although I've used nvenc quite a bit). Does using nvdec prevent you from using -vf yadif=1 ?
[20:50:07 CEST] <alexpigment> (if yadif=0 if you want to throw away half of the temporal resolution)
[20:50:17 CEST] <alexpigment> *or yadif=0, i mean
[20:50:18 CEST] <MooseMoose> it's kind of a sneaky question, I'm using the nvidia sdk separately but ffmpeg is about the only real implementation for reference
[20:50:40 CEST] <MooseMoose> I don't see the progressive profile being used so I was curious
[20:51:27 CEST] <alexpigment> well, you might see if you can tap into cuda deinterlace
[20:55:40 CEST] <MooseMoose> I believe ffmpeg already tells nvdec to deinterlace directly
[20:56:07 CEST] <alexpigment> so what exactly is telling you the output is still interlaced?
[20:56:58 CEST] <alexpigment> anyway, this might help: https://devtalk.nvidia.com/default/topic/1023784/-resolved-ffmpeg-cuvid-nvdec-frame-drop-when-using-option-deint-2/
[20:58:05 CEST] <furq> -deint 0 is the default here
[20:58:25 CEST] <furq> nvm i got the question backwards
[20:59:16 CEST] <BtbN> That's a weird response
[20:59:24 CEST] <BtbN> Just double the input framerate manually, and you're good
[21:00:27 CEST] <alexpigment> BtbN: who are you responding to?
[21:00:35 CEST] <BtbN> that forum thing
[21:00:42 CEST] <alexpigment> oh
[21:00:45 CEST] <alexpigment> the original question is unrelated
[21:00:58 CEST] <alexpigment> i pasted it just because it showed an example of how to set up an ffmpeg command line
[21:01:19 CEST] <BtbN> you can also freely use yadif with cuvid/nvenc, if you're fine with the CPU usage it causes
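A hedged sketch of the two routes being discussed, with codecs and filenames as placeholder assumptions:

    # let cuvid deinterlace during decode, then encode with nvenc
    ffmpeg -c:v h264_cuvid -deint adaptive -i in.ts -c:v h264_nvenc out.mp4
    # or decode normally and deinterlace on the CPU with yadif (one frame per field)
    ffmpeg -i in.ts -vf yadif=1 -c:v h264_nvenc out.mp4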
[21:01:19 CEST] <alexpigment> i am vehemently opposed to dropping temporal resolution, personally, but to each his own
[21:02:39 CEST] <MooseMoose> mmm looks like ffmpeg spits out the right result. Guess I'll go mad finding out why my stuff doesn't
[21:03:25 CEST] <alexpigment> MooseMoose: just to make sure, you're not expecting the H.264 profile to change to something like "Progressive High Profile", right?
[21:03:49 CEST] <BtbN> cuvid pretty much always deinterlaces
[21:04:02 CEST] <BtbN> The "Weave" Deinterlacer is a "do nothing" deinterlacer though
[21:04:11 CEST] <MooseMoose> this is more of an adventure of "how to force nvenc to spit out progressive" with ffmpeg as a reference
[21:04:14 CEST] <alexpigment> weave is a flawed concept imho
[21:04:22 CEST] <MooseMoose> with nvidia's tragic documentation
[21:04:34 CEST] <BtbN> progressive is the default
[21:04:43 CEST] <BtbN> you need to force it really hard to make it output interlaced
[21:04:50 CEST] <BtbN> so hard ffmpeg does not fully implement it
[21:04:51 CEST] <alexpigment> again, what's making you think it's interlaced? just visual inspection or something is reporting it?
[21:05:00 CEST] <MooseMoose> nvidia's sample sdk transcoder spits out interlaced content for me
[21:05:13 CEST] <MooseMoose> mediainfo for quick reference
[21:05:14 CEST] <BtbN> "Weave" just means "Take the two fields, and put them into one picture"
[21:05:16 CEST] <BtbN> so, the default
[21:05:33 CEST] <BtbN> It's a "Do-Nothing" if the content isn't field-coded
[21:05:40 CEST] <alexpigment> BtbN: you said words, but i still read them as "flawed concept" ;)
[21:05:55 CEST] <BtbN> How else would you want to put two interlaced fields into one frame?
[21:06:02 CEST] <BtbN> On top of one another?
[21:06:10 CEST] <alexpigment> not by assuming that the two fields are from the same time stamp
[21:06:23 CEST] <alexpigment> weave assumes this
[21:06:44 CEST] <ZEEX> im using -> ffmpeg -f gdigrab -i title="theWindow" ...        to capture a desktop window...       can I use wildcards or regex for the window name ?
[21:06:48 CEST] <BtbN> They are in the same packet, so they in fact do only have one timestamp
[21:06:57 CEST] <alexpigment> anyway, this is a side conversation really. i just get very annoyed with people's disregard for properly handling interlaced content
[21:07:28 CEST] <BtbN> If you're using mbaff, what most sources do, you only have one picture in the bitstream anyway
[21:07:50 CEST] <alexpigment> BtbN: the original time stamps aren't the same. "weave" deinterlacing basically only works if nothing has changed between field a and field b
[21:08:06 CEST] <BtbN> again, Weave is not an actual deinterlacer.
[21:08:13 CEST] <alexpigment> it's an actual deinterlace method
[21:08:14 CEST] <BtbN> It's mostly a no-op
[21:08:27 CEST] <alexpigment> weave, blend, bob, etc
[21:08:40 CEST] <BtbN> The only time it does something is when the fields are coded as separate pictures. It weaves them into one picture, as everything else expects that.
[21:09:23 CEST] <alexpigment> the only safe default way to deinterlace is bob
[21:09:42 CEST] <BtbN> Weave is _NOT_ deinterlacing
[21:09:44 CEST] <alexpigment> weave can only be done if there's something that first determines that no motion has occurred between field a and field b
[21:09:47 CEST] <BtbN> it is storing two fields in one picture
[21:09:56 CEST] <BtbN> So yadif and the like can handle it, or nvenc can encode it
[21:10:00 CEST] <BtbN> encode as interlaced
[21:10:24 CEST] <alexpigment> this use of the term weave is completely new to me
[21:10:43 CEST] <alexpigment> I'll have to take your word for it, but it sounds incorrect to me
[21:10:49 CEST] <BtbN> cuvid only has 3 output modes for Interlaced Content: Weave, Bob and Adaptive
[21:10:52 CEST] <alexpigment> Weave has always been - to me - a mode of deinterlacing
[21:11:01 CEST] <BtbN> And Weave is the "Do not deinterlace" option of the 3
[21:11:16 CEST] <alexpigment> BtbN: if that is true, then someone fucked up
[21:11:33 CEST] <BtbN> No, your understanding of Interlaced content just seems to be weird
[21:11:41 CEST] <BtbN> You have two fields per frame
[21:11:53 CEST] <alexpigment> BtbN: I work with interlaced content every single day
[21:12:03 CEST] <BtbN> and the default way to transport them is weaved in a single picture, with even and odd lines representing one field
[21:12:05 CEST] <alexpigment> I deal almost exclusively with interlaced content
[21:12:36 CEST] <BtbN> hence the name of the "Deinterlacer".
[21:12:42 CEST] <alexpigment> BtbN: again, this is the first time i've ever heard of weave being used in that manner. it sounds like you're saying that "weave" and "interlaced" are equivalent
[21:13:10 CEST] <BtbN> Weaving is the process of producing a line interleaved picture of two fields...
[21:13:29 CEST] <BtbN> All the deinterlacing filters in ffmpeg operate on such pictures
[21:14:10 CEST] <alexpigment> OK, so what you're saying is that since the interlaced method stores them as such, "weave" is effectively not processing, although the result would be functionally identical to an interlaced video that has been deinterlaced using the Weave method
[21:14:23 CEST] <alexpigment> which, sure, I can see how the distinction is technical
[21:14:33 CEST] <alexpigment> but Weave is first and foremost a deinterlacing method
[21:14:51 CEST] <BtbN> If by deinterlacing you mean taking two entirely separate fields, and putting them in one picture, sure.
[21:15:10 CEST] <BtbN> But as pretty much everything already deals with line interleaved pictures, Weave is a No-Op in most cases
[21:17:06 CEST] <alexpigment> BtbN: ok, I see where you're coming from. I just think it's kinda dumb if nvdec has 3 modes, two of them are deinterlacing methods, and the third doesn't deinterlace but is named after a deinterlacing method
[21:17:25 CEST] <BtbN> It only means that it will output a line interleaved picture
[21:17:37 CEST] <BtbN> It cannot output the two fields separately
[21:18:01 CEST] <BtbN> and as the cuvid API is old, from times where the codecs only carried the fields separately, it makes sense to have that in the API like that
[21:19:18 CEST] <alexpigment> Well, lemme rephrase my original statement then: weave deinterlacing is a flawed concept and the "weave" nvdec mode is a bad choice
[21:19:31 CEST] <BtbN> Why is it a bad choice?
[21:19:45 CEST] <BtbN> If you want to not deinterlace the picture, it's your only choice.
[21:19:53 CEST] <alexpigment> no, not at all
[21:19:59 CEST] <alexpigment> well
[21:20:01 CEST] <alexpigment> ok
[21:20:07 CEST] <alexpigment> if you want to remain interlaced
[21:20:24 CEST] <alexpigment> but if you want to have progressive picture, you want to use one of the other two methods
[21:20:30 CEST] <alexpigment> which was the start of this topic
[21:20:38 CEST] <BtbN> The other two deinterlacers aren't too amazing either
[21:20:48 CEST] <BtbN> yadif generally gives better results. And for that, you also want to set cuvid to Weave
[21:21:25 CEST] <alexpigment> BtbN: it's not about the merits of each deinterlacer. It's just that true interlaced content has either 50 or 60fps of temporal resolution
[21:22:06 CEST] <alexpigment> when you weave (or leave it weaved as you're saying) and convert to progressive, you're combining data from two timestamps and halving the temporal resolution
[21:22:12 CEST] <alexpigment> combing occurs
[21:22:14 CEST] <alexpigment> looks bad
[22:22:41 CEST] <BtbN> combing only occurs if you display the weaved picture without actual deinterlacing
[21:23:24 CEST] <BtbN> The actual deinterlacer which splits the two fields again does interpolate proper timestamps
[21:23:55 CEST] <alexpigment> BtbN: again, I'm talking about when converting from interlaced to progressive
[21:24:02 CEST] <alexpigment> you're assuming that it's going to stay interlaced
[21:24:30 CEST] <alexpigment> which, yes, SOME deinterlacer in the chain needs to be doing something less silly than a weave deinterlace method
[21:24:58 CEST] <alexpigment> because weave deinterlacing is a *flawed concept*
[21:29:26 CEST] <MooseMoose> so uh, if the frame *is* deinterlaced, then for encoding it is a NV_ENC_PIC_STRUCT_FRAME?
[21:30:58 CEST] <alexpigment> that sounds right to me, but that's more in BtbN's wheelhouse
[21:32:14 CEST] <MooseMoose> actually yeah, looks like in ffmpeg if its not set to weave it's a STRUCT_FRAME. mmm
[21:33:02 CEST] <MooseMoose> anyway thanks for the help. Looks like ffmpeg has the magic, just gotta find out what I've overlooked
[21:43:52 CEST] <BtbN> MooseMoose, the logic for cuvid in ffmpeg is to detect if a frame was interlaced or not, and if interlaced, apply one of the deinterlacing methods, defaulting to Weave
[21:48:31 CEST] <MooseMoose> Yeah I saw. I figured my problem out, I had missed a setting when Initializing the encoder which was borking progressive mode and causing other insane behavior
[21:49:49 CEST] <kepstin> most of the video decoders in ffmpeg will give you weaved frames when decoding interlaced video, that's a normal and common way of storing interlaced video on computers
[21:50:15 CEST] <kepstin> calling it a "deinterlacing method" is, well, not really correct i guess
[21:50:30 CEST] <BytesBacon> Is there anything ffmpeg could do, or maybe someone knows of something else, that can take PGS subtitles and convert them to srt?
[21:50:38 CEST] <BytesBacon> That'd be cli.
[21:51:01 CEST] <kepstin> BytesBacon: pgs subtitles are pictures - you need to run ocr software on them to get text
[21:52:28 CEST] <BytesBacon> The only things I'm finding out there all require a gui.
[21:54:16 CEST] <BtbN> There is an ocr filter in ffmpeg
[21:54:25 CEST] <BtbN> but the quality of text OCR is questionable at best
[21:55:01 CEST] <BytesBacon> Well I guess once it's done I could manually review it or run something to spellcheck the srt file.
[21:55:43 CEST] <kepstin> you'd want to manually review it against the images, yeah, and you might have to redo any positioning or typesetting manually
[21:56:54 CEST] <BytesBacon> So I really need to use a gui for this for the best results it seems. Hmm, unless maybe I could use Plex to check it and edit the file manually.
[22:01:43 CEST] <Serg_Penguin> hi ! I try to encode h265 following https://trac.ffmpeg.org/wiki/Encode/H.265, ffmpeg-3.3.4 says "The encoder 'aac' is experimental but experimental codecs are not enabled, add '-strict -2' if you want to use it." I put these before all options, but to no avail. Where to put them ?
[22:02:03 CEST] <JEEB> encoder settings go after the input(s)
[22:02:19 CEST] <JEEB> also if you have the latest FFmpeg the AAC encoder is no longer experimental
[22:02:48 CEST] <JEEB> also I am not sure why the help tells you to use -strict -2, while it should be "-strict experimental" which maps internally to -2
[22:02:56 CEST] <BtbN> by now, getting that message means you really should update your way-outdated version
[22:03:40 CEST] <Serg_Penguin> 3.3.4 is old ? so what is new ? Got it at https://www.johnvansickle.com/ffmpeg/ just days ago
[22:03:55 CEST] <BtbN> 3.3 does not produce that message.
[22:04:00 CEST] <JEEB> :D
[22:04:32 CEST] <BtbN> It was either 3.2 or 3.3 that un-experimentalized the aac encoder
[22:06:50 CEST] <Serg_Penguin> thanks for order advice, it works. BTW, for me order-dependency of option is very weird, except for key-value parst like 'tar --file backup.tgz' or 'java -Xopt=value -Darg=value1'
[22:07:13 CEST] <BtbN> "order-dependency"?
[22:07:14 CEST] <Serg_Penguin> ... key-value _pairs_
[22:07:43 CEST] <Serg_Penguin> for me, in python and the GNU guidelines, options are an unordered set/dict
[22:07:53 CEST] <JEEB> yea, ffmpeg.c is one of the few where the order matters
[22:07:59 CEST] <JEEB> due to input and output options
[22:08:08 CEST] <JEEB> you can set the same option for either the demuxer or decoder
[22:08:10 CEST] <BtbN> You would have one hell of a time realising a tool like ffmpeg without re-using options like that
[22:08:12 CEST] <JEEB> or to the encoder or muxer
[22:08:21 CEST] <BtbN> You would need like 500 -codec options
[22:08:31 CEST] <Serg_Penguin> no matter how you order them, they work, except for outright positional args like 'cp' or 'mv'
[22:08:33 CEST] <BtbN> -codec_input_1, -codec_audio_input_1, ...
[22:08:34 CEST] <JEEB> nah, you'd just need to prepend -decoder -demuxer
[22:08:41 CEST] <JEEB> or encoder/muxer
[22:08:59 CEST] <kepstin> but how do you handle multiple output and input files then? :)
[22:09:03 CEST] <JEEB> although yes, duplication often happens so what BtbN says is more realistic
[22:09:54 CEST] <BtbN> Options are evaluated left to right, and always affect the next input/output following them.
[22:10:10 CEST] <BtbN> The right most value that matches something wins if multiple of the same are specified.
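A short illustration of that left-to-right rule (filenames and settings are placeholders): options before an -i apply to reading that input, options after it apply to the output file that follows them; -strict experimental is only needed on older builds, as noted above:

    ffmpeg -ss 10 -i in.mkv -c:v libx265 -crf 28 -c:a aac -strict experimental out.mkv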
[22:11:59 CEST] <Serg_Penguin> 2i7M to 6.4M at 17.33 fps, but i have maybe weeks or months of video I was thinking of compressing
[22:12:19 CEST] <Serg_Penguin> ... 217M to 6.4M
[22:13:20 CEST] <Serg_Penguin> can it compress by GPU ?
[22:13:38 CEST] <BtbN> you don't want to.
[22:13:45 CEST] <DHE> it takes a rather new nvidia GPU if you're using nvenc to support h265 compression, but it can be done
[22:14:03 CEST] <JEEB> note: no GPGPU is used
[22:14:04 CEST] <BtbN> You really do not want to use GPUs for HEVC encoding.
[22:14:21 CEST] <BtbN> Might as well just use libx264
[22:14:35 CEST] <JEEB> even the vendors selling you kool-aid solutions stopped trying to throw GPGPU at video encoding a few years ago
[22:14:38 CEST] <JEEB> as it doesn't really work
[22:14:51 CEST] <JEEB> (unless you make the format from the ground up for that type stuff)
[22:15:01 CEST] <JEEB> (which also means "goodbye high compresison")
[22:16:17 CEST] <DHE> your GPU is for when you NEED realtime compression in the face of large video (1080p and higher) or maybe if you just need to capture a stream and you're willing to throw bits at it for faster throughput. but even in that case I'd go with x264 veryfast
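For comparison, hedged examples of both routes (settings and filenames are illustrative, not recommendations from the channel):

    # hardware HEVC, needs a fairly recent NVIDIA card
    ffmpeg -i in.mkv -c:v hevc_nvenc -preset slow -b:v 3M -c:a copy out_hw.mkv
    # software H.264, usually better quality per bit
    ffmpeg -i in.mkv -c:v libx264 -preset veryfast -crf 23 -c:a copy out_sw.mkv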
[22:17:55 CEST] <teratorn> hey kids, suppose you wanted to get better economic value than xencoder (etc) for h264 transcoding... do you think rolling your own rack server crammed full of hw-accelerated capacity would be the start of a possible solution? any other ideas for saving those transcoding dollars?
[22:18:32 CEST] <BtbN> Threadripper seems like good economic value to me
[22:19:10 CEST] <BtbN> possibly the best you will get anywhere
[22:19:27 CEST] <BtbN> 16 cores gives you quite some FPS with x264
[22:19:37 CEST] <teratorn> BtbN: *nod*
[22:19:46 CEST] <teratorn> obviously i would prefer to stick with x264
[22:19:58 CEST] <alexpigment> i'm kinda curious what the i7-8700k does compared to threadripper
[22:20:04 CEST] <BtbN> poorly?
[22:20:10 CEST] <BtbN> it's only 6 cores
[22:20:14 CEST] <alexpigment> well, intel still has an edge on IPC
[22:20:17 CEST] <teratorn> but quality isn't the #1 priority - if there was a half-way decent option for hw-accel h264 I would love to know about it
[22:20:26 CEST] <alexpigment> and in some cases, they do surprisingly well even with fewer core counts
[22:20:36 CEST] <BtbN> It's almost 3 times as many cores
[22:20:39 CEST] <alexpigment> right
[22:20:55 CEST] <BtbN> if you're only transcoding a single stream, the Intel might be faster
[22:21:13 CEST] <BtbN> but the moment it's more than one stream, the Threadripper will shred the Intel
[22:21:14 CEST] <alexpigment> but if you look up a 7700k and compare it to an 1800x and similar, the 7700k is surprisingly fast
[22:21:28 CEST] <BtbN> That's because it's almost 1GHz faster
[22:21:37 CEST] <alexpigment> BtbN: yeah that's true, I didn't realize we were talking about multiple streams
[22:22:29 CEST] <alexpigment> anyway, the i7-8700k is the first Intel 6 core I remember seeing that isn't a) a ton of money and b) doesn't sacrifice too much clock speed for the core increase
[22:22:41 CEST] <jkqxz> I think optimising hardware cost ends up giving you a large number of cheap Intel boxes (like Pentium G46xx type - quick sync can do a lot per unit cost on them), but you have to ask whether you actually want to administer that sort of setup.
[22:22:42 CEST] <BtbN> You can thank AMD for that
[22:22:48 CEST] <jkqxz> Mid-range dual-Xeon (or quite possibly EPYC now, if you can actually buy it) servers are much denser and generally easier to manage.
[22:22:50 CEST] <alexpigment> BtbN: certainly :)
[22:23:10 CEST] <BtbN> Epyc will be less cost-effective than Threadripper
[22:23:14 CEST] <BtbN> as it's more expensive per core
[22:23:28 CEST] <BtbN> The 16 Core Threadripper is by far the cheapest way to get tons of cores
[22:23:43 CEST] <jkqxz> It's also much smaller than a dual-EPYC box, so you have to have more units.
[22:23:57 CEST] <BtbN> Smaller in what way?
[22:24:00 CEST] <BtbN> The die size is the same
[22:24:09 CEST] <jkqxz> Actual compute throughput.
[22:24:37 CEST] <BtbN> hm?
[22:25:01 CEST] <kepstin> yeah, epyc gives you up to 2x more cores per socket than tr, so it's denser in terms of physical rack space.
[22:25:18 CEST] <BtbN> But I doubt it's going to be as cost effective
[22:25:55 CEST] <BytesBacon> How do I go about extracting the PGS subtitles or what is the PGS format extension ffmpeg wants? Can't seem to get it to work, thanks.
[22:26:26 CEST] <jkqxz> Cost should include the overheads of dealing with it to some degree.  The cheap Intel boxes are not fun to deal with.
[22:27:20 CEST] <jkqxz> (Threadripper is probably beaten by mid-range Ryzens on that sort of metric, anyway.)
[22:28:09 CEST] <jkqxz> 3x 1600 can probably do more than a 1950X.
[22:28:44 CEST] <BtbN> but you need 3 mainboards for them, and PSUs, and RAM and everything
[22:29:52 CEST] <jkqxz> Cheap AM4 mainboards cost less than 1/3 what TR4 ones do.
[22:31:02 CEST] <teratorn> hmm so xencoder just uses x264 too like everyone...
[22:31:50 CEST] <teratorn> my boss through bandwidth to our own rack server would be problematic - but I think that is nonsense - b/w is dirt cheap
[22:31:56 CEST] <teratorn> s/through/thought/
[22:32:16 CEST] <BtbN> depends on what you send there
[22:32:21 CEST] <BtbN> raw video is enormous
[22:32:43 CEST] <teratorn> oh yeah no, it's already probably h264
[22:32:53 CEST] <teratorn> it just needs various transcoding
[22:34:38 CEST] <teratorn> we're already paying for b/w implicitly anyway, with any cloud transcoding service
[22:34:53 CEST] <teratorn> though they probably get it a little cheaper than we would
[22:36:53 CEST] <BytesBacon> Does ffmpeg need to be compiled with a switch to support extracting PGS subtitles?
[22:41:26 CEST] <c_14> afaik, it just doesn't support writing raw pgs
[22:45:33 CEST] <BytesBacon> How would I go about extracting them from the mkv container.
[22:49:34 CEST] <JEEB> a muxer was recently added http://git.videolan.org/?p=ffmpeg.git;a=commit;h=7a6bd541528b4d00b52d422d02f01d42346e68df
[22:49:54 CEST] <JEEB> (that's the libbluray dev)
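With a build containing that commit, something along these lines would presumably extract the PGS stream as-is (stream index and filenames are assumptions):

    ffmpeg -i in.mkv -map 0:s:0 -c:s copy out.sup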
[23:30:59 CEST] <Dianaxxyyzz> Hello all
[23:43:52 CEST] <rabbe> is there any other alternative than ffmpeg with v4l2 to create flv over rtmp from a capture card?
[23:44:17 CEST] <JEEB> there are multiple things but pretty much all of them will utilize FFmpeg's libraries underneath in one way or another
[23:44:38 CEST] <JEEB> (let's just say that you can do a lot of stuff with the libraries)
[23:44:43 CEST] <rabbe> seems v4l2 has some bug affecting my magewell capture card
[23:44:57 CEST] <JEEB> I recommend reporting it then
[23:45:10 CEST] <JEEB> although how simple it is to fix depends
[23:45:15 CEST] <rabbe> it was reported. https://trac.ffmpeg.org/ticket/4030
[23:45:20 CEST] <JEEB> ok, cheers
[23:45:57 CEST] <JEEB> uhh
[23:46:04 CEST] <JEEB> that issue was seemingly fixed ~2 years ago
[23:46:57 CEST] <rabbe> but not in the official v4l2 releases, right?
[23:47:37 CEST] <JEEB> uhh
[23:47:48 CEST] <JEEB> that patch was for libavdevice which is part of FFmpeg
[23:48:12 CEST] <rabbe> aha,, well, i read different threads
[23:48:25 CEST] <rabbe> so is it fixed in ffmpeg?
[23:48:53 CEST] <JEEB> that specific issue you linked to ends with "Fixed in  http://git.videolan.org/?p=ffmpeg.git;a=commitdiff;h=28f20d2ff487aa589643d8f70eaf614b78839685 "
[23:48:56 CEST] <Dianaxxyyzz> if it was fixed in libavdevice, it was fixed in ffmpeg too :)
[23:49:24 CEST] <Dianaxxyyzz> ffmpeg.exe or the linux binary is just a binary that uses the ffmpeg libraries :)
[23:49:25 CEST] <rabbe> but i still have it.. and i installed ffmpeg not long ago
[23:49:53 CEST] <JEEB> Dianaxxyyzz: granted depending on ffmpeg.c's API usage not everything fixed in the libraries (or made possible) is possible with it
[23:49:56 CEST] <furq> what ffmpeg version
[23:50:07 CEST] <Dianaxxyyzz> try git version
[23:50:41 CEST] <Dianaxxyyzz> compile yourself last version and check that
[23:50:51 CEST] <rabbe> ffmpeg version 2.4.3-1ubuntu1~trusty6 Copyright (c) 2000-2014 the FFmpeg developers
[23:50:51 CEST] <rabbe>   built on Nov 22 2014 17:07:19 with gcc 4.8 (Ubuntu 4.8.2-19ubuntu1)
[23:50:56 CEST] <JEEB> woooah
[23:50:56 CEST] <furq> lol
[23:50:57 CEST] <Dianaxxyyzz> loooooooooool
[23:50:59 CEST] <furq> yeah that's old
[23:50:59 CEST] <rabbe> didn't sound that recent :)
[23:51:02 CEST] <Dianaxxyyzz> antique
[23:51:06 CEST] <Dianaxxyyzz> :)
[23:51:07 CEST] <rabbe> wtf
[23:51:14 CEST] <furq> i don't even know what ubuntu version that is
[23:51:19 CEST] <furq> 15.10?
[23:51:19 CEST] <Dianaxxyyzz> an ancient version :)
[23:51:20 CEST] <JEEB> the 4.8.2 gives it away
[23:51:21 CEST] <JEEB> 14.04
[23:51:27 CEST] <furq> did 14.04 even have ffmpeg
[23:51:28 CEST] <JEEB> and the trusty
[23:51:35 CEST] <furq> i thought it was gone until 15.10
[23:51:40 CEST] <JEEB> probably a PPA? no idea :P
[23:51:41 CEST] <Dianaxxyyzz> :)) compile it man, compile the latest version
[23:51:44 CEST] <furq> maybe
[23:51:49 CEST] <rabbe> kinda new to this linux world.. but i think i just ran apt-get or something
[23:52:01 CEST] <rabbe> sheesh
[23:52:10 CEST] <furq> ubuntu isn't a rolling distro
[23:52:12 CEST] <alexpigment> apt get install really old ffmpeg ;)
[23:52:14 CEST] <Dianaxxyyzz> https://trac.ffmpeg.org/wiki/CompilationGuide/Ubuntu
[23:52:18 CEST] <rabbe> hehe
[23:52:20 CEST] <Dianaxxyyzz> this always works well
[23:52:27 CEST] <furq> whatever package versions were released at the time the distro was released are what you get forever
[23:52:30 CEST] <furq> other than security fixes
[23:52:36 CEST] <furq> https://www.johnvansickle.com/ffmpeg/
[23:52:44 CEST] <furq> just grab one of these if you don't want to upgrade your distro
[23:53:00 CEST] <furq> 14.04 is pretty old though so you should maybe do that
[23:53:05 CEST] <Dianaxxyyzz> install with apt-get everything they say in this guide https://trac.ffmpeg.org/wiki/CompilationGuide/Ubuntu , then just compile ffmpeg
[23:53:15 CEST] <furq> don't bother compiling it, just get those static builds
[23:53:22 CEST] <furq> you only need to compile it if you need non-free packages
[23:53:39 CEST] <JEEB> yea, if the static versions work and you trust their source then it's fine
[23:53:52 CEST] Action: JEEB tries to stay neutral regarding them
[23:54:02 CEST] <rabbe> who's johnvansickle ? :)
[23:54:05 CEST] <furq> relaxed is probably one of those dang russian malware guys i've heard so much about
[23:54:14 CEST] <JEEB> :D
[23:54:17 CEST] <alexpigment> JEEB: i see your eyelid twitching as you hold back your opinion ;)
[23:54:45 CEST] <Dianaxxyyzz> :)
[23:54:46 CEST] <JEEB> alexpigment: not that sort of evening and to be honest walking a newbie through compilation esp. now that nasm is required for x264 can be a PITA
[23:54:52 CEST] <JEEB> as in, a new enough nasm
[23:54:56 CEST] <JEEB> FFmpeg itself is still OK
[23:55:12 CEST] <Dianaxxyyzz> any of you use MaruOs ?
[23:55:13 CEST] <JEEB> also funny how world goes 'round with nasm/yasm
[23:55:16 CEST] <alexpigment> fair enough
[23:55:33 CEST] <furq> i'm amazed they're still making big enough changes to x264 to require that sort of change
[23:55:37 CEST] <furq> is that for avx2 or some shit
[23:55:40 CEST] <JEEB> AVX512
[23:55:44 CEST] <furq> close enough
[23:55:46 CEST] <JEEB> :)
[23:55:56 CEST] <JEEB> also yasm hasn't been developed for ages now :/
[23:56:05 CEST] <JEEB> they did the usual death call of "we'll totally rewrite this in C++"
[23:56:10 CEST] <furq> nice
[23:56:25 CEST] <JEEB> (it also didn't help that nasm changed its license, which was the initial reason for yasm to spawn at all)
[23:56:39 CEST] <JEEB> so suddenly yasm's raison d'etre went poof
[23:57:21 CEST] <Dianaxxyyzz> does  Fabrice Bellard still develop ffmpeg?
[23:57:29 CEST] <furq> not for a long time
[23:57:44 CEST] <Dianaxxyyzz> so since 2002 or so he has not added any code?
[23:58:01 CEST] <furq> i think he's focused on qemu stuff atm
[23:58:02 CEST] <JEEB> you can check the history of the git repo for when he last committed things
[23:58:04 CEST] <JEEB> and yea
[23:58:11 CEST] <JEEB> he's got other pet projects :)
[23:58:17 CEST] <furq> he's a busy man
[23:58:22 CEST] <JEEB> the rest of us can be the cesspool on the internet
[23:58:26 CEST] <Dianaxxyyzz> yes, that's why I was asking you, cause you know better
[23:58:39 CEST] <furq> he's the kind of guy who makes you wonder wtf you've done with your life
[23:58:55 CEST] <JEEB> I had that when I got in touch with the x264 guys circa 2008
[23:59:14 CEST] <JEEB> "this guy is 1-2 years younger than I am and I can't habla this"
[23:59:17 CEST] <furq> https://bellard.org/
[23:59:20 CEST] <furq> he's far too busy for css
[23:59:23 CEST] <Dianaxxyyzz> i think Fabrice Bellard is a genius
[23:59:27 CEST] <Dianaxxyyzz> is he?
[23:59:33 CEST] <furq> i would have thought so
[00:00:00 CEST] --- Tue Oct  3 2017

