[Ffmpeg-devel-irc] ffmpeg.log.20140207

burek burek021 at gmail.com
Sat Feb 8 02:05:01 CET 2014


[02:37] <iaaah> Hi, i am trying to do a standard-ish (as close to source) conversion to mpeg. the only settings im using are `-vcodec libx264 -qscale 18`; which gives me a VBV error. In this case the source is a .mov format. How can I set a standard/default VBV? Should I use a different codec? Thanks for your help.
[06:14] <znf> Hi. Is there a "simple" way (ie: without involving external scripts) to overlay a short video over a longer video, while "stretching" the shorter one to extend to the duration of the long one?
[06:20] <cjhmdm> hello, does anyone know when the trac site will be functional again? it's been giving a 503 error for the past day or so
[07:56] <Waster> https://trac.ffmpeg.org/ Service Temporarily Unavailable
[08:53] Last message repeated 1 time(s).
[08:53] <relaxed> Waster: the devs know
[08:53] <Waster> thanks
[10:15] <Waster> still down I need docs for tee ASAP..8)
[10:44] <relaxed> Waster: http://webcache.googleusercontent.com/search?q=cache:8EYClTmiO8YJ:trac.ffmpeg.org/wiki/Creating%2520multiple%2520outputs+&cd=1&hl=en&ct=clnk&gl=us
[10:46] <Waster> relaxed: argh, thanks again..)))
[11:03] <relaxed> Waster: you're welcome
[11:33] <Samus_Aran> is there any way to record the X11 desktop with FFmpeg and maintain correct colours?
[11:35] <Samus_Aran> I've tried libx264 and huffyuv.  libx264 alters the colours badly.  huffyuv is closer to the original colours, but pixelates badly.
[11:35] <Samus_Aran> is there any way to convert the colourspace of the rawvideo for the desktop to something libx264 can use?
[11:37] <JEEB> 1) use RGB as output (libx264 has a separate 'codec' name for the RGB version) 2) if RGB is not possible, then make sure you use the same conversion matrix when converting to and from YCbCr
[11:37] <JEEB> no encoder alters colors at all, unless you are using something with lossy coding
[11:38] <JEEB> encoders just encode what they are fed
[11:38] <JEEB> ffmpeg's log should tell you what pix_fmt the input was and what pix_fmt the output was, among other things
[11:39] <Samus_Aran> Stream #0:0: Video: rawvideo (BGR[0] / 0x524742), bgr0, 640x480, 245760 kb/s, 25 tbr, 1000k tb
[11:46] <Samus_Aran> JEEB: can you explain what you mean by "use the same conversion matrix when converting to and from YCbCr"?  using -codec:v libx264rgb I get videos I can't play in mplayer or vlc
[11:46] <JEEB> then both of those are just too old
[11:46] <JEEB> and just contain a too old libavcodec to deal with 4:4:4 H.264
[11:47] <Samus_Aran> how can I convert it to something playable on older players?
[11:47] <Samus_Aran> it encodes the video fine, or seems to
[11:48] <Samus_Aran> profile High 4:4:4 Predictive, level 3.0, 4:4:4 8-bit
[11:48] <JEEB> and I mean exactly what I said, for example when you use the "normal" libx264 'codec', the default pix_fmt for that is 4:2:0 YCbCr, so your RGB gets converted to that in a specific way (usually with the BT.601 color matrix, welcome to swscale), and for the colors to (more or less) match (you've already lost information with the chroma subsampling), you will have to convert that YCbCr the same way back to RGB when playing it back
[11:49] <JEEB> for 640x480 resolution most things should use the BT.601 color matrix if not specified in the bit stream (although many players just ignore what's in the bit stream and instead always guess by the resolution)
[11:50] <JEEB> the chroma subsampling and the fact that YCbCr is generally kept in limited range instead of full range leads to some color degradation, and you really can't do much about that.
[11:51] <JEEB> and if you want to encode for end users, then you will almost have to use 4:2:0 YCbCr for compatibility. When you are capturing I recommend using RGB.
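A sketch of the capture JEEB recommends here: grab the X11 desktop as RGB and keep it lossless with the RGB variant of the x264 encoder. The display name, region size, and frame rate below are placeholder assumptions, not values from the log.

```shell
# Hypothetical capture command: x11grab input, lossless RGB H.264 output.
# Display (:0.0), size, and frame rate are assumptions; adjust to taste.
ffmpeg -f x11grab -video_size 1280x720 -framerate 25 -i :0.0 \
    -c:v libx264rgb -qp 0 -preset ultrafast capture.mkv
```

`-qp 0` makes 8-bit x264 bit-exact lossless, and `-preset ultrafast` keeps CPU load low while recording.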
[11:52] <JEEB> and pretty much everything that's newer than, say, august or so 2011 should support 4:4:4 in libavcodec
[11:52] <JEEB> so if something fails at that, it means you've got really old stuff there :P
[11:52] <JEEB> or the player just fails at using it
[11:53] <Samus_Aran> my ffmpeg is version git-2013-11-16-d7ebeba, libavcodec     55. 43.100 / 55. 43.100
[11:53] <JEEB> well naturally
[11:53] <JEEB> your ffmpeg can encode it just fine and most probably is new enough
[11:54] <Samus_Aran> is there no way to convert bgr0 losslessly to yuv420p?
[11:54] <JEEB> no
[11:54] <JEEB> unless you resize to 2x
[11:54] <Samus_Aran> that would be fine
[11:54] <JEEB> with nearest neighbor
[11:54] <JEEB> although to be 100% honest
[11:54] <JEEB> since you are converting from full range RGB to limited range YCbCr
[11:54] <JEEB> you _will_ have some loss
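The full-range to limited-range loss JEEB mentions can be sketched numerically. The formula is the standard BT.601 limited-range luma mapping, Y' = 16 + 219 * (0.299 R + 0.587 G + 0.114 B) / 255; the helper name below is made up for illustration.

```shell
# bt601_luma (made-up name): limited-range BT.601 luma from full-range
# 8-bit RGB, rounded to the nearest integer.
bt601_luma() {
  awk -v r="$1" -v g="$2" -v b="$3" \
    'BEGIN { printf "%d\n", 16 + 219 * (0.299*r + 0.587*g + 0.114*b) / 255 + 0.5 }'
}
bt601_luma 255 255 255   # full-range white lands on 235, not 255
bt601_luma 0 0 0         # black lands on 16, not 0
```

256 input levels get squeezed into 219 output levels, so a round trip back to full-range RGB cannot be bit-exact.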
[11:55] <Samus_Aran> I just want to record my screen without the colours looking unusably awful
[11:55] <JEEB> if you want to record then RGB is just fine for that
[11:55] <JEEB> you only need to convert to 4:2:0 YCbCr for end user encoding
[11:55] <JEEB> as in, something you are going to distribute
[11:55] <Samus_Aran> if I can't share the video, there's not much point to recording it?
[11:56] <JEEB> it's better to keep the original than keeping a degraded version, no? And then you can make a version of it that you can share
[11:56] <JEEB> that way no matter how you fuck up with creating the distributed thing, you can always redo it
[11:57] <sspiff> Samus_Aran: record RGB, keep that file, but for distribution scale to 2x NN, convert to yuv420 and rescale to 1x, and you've got a high quality original you might use eventually when people start having players that support 4:4:4 H264 and a lower quality version you can share right now?
[11:58] <JEEB> sspiff, uhh
[11:58] <JEEB> that last part
[11:58] <JEEB> > rescale to 1x
[11:58] <JEEB> that will pretty much derp it up :P
[11:58] <sspiff> Ah right
[11:58] <sspiff> sorry
[11:58] <JEEB> so you scale to 2x NN in RGB, then convert to 4:2:0 YCbCr
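JEEB's corrected recipe could be sketched like this; the file names and CRF value are assumptions, not from the log.

```shell
# Upscale 2x with nearest neighbor while still in RGB, then convert to
# 4:2:0 for a distributable file; the chroma subsampling then only
# discards the duplicated samples.
ffmpeg -i capture.mkv \
    -vf "scale=iw*2:ih*2:flags=neighbor,format=yuv420p" \
    -c:v libx264 -crf 18 share.mp4
```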
[11:59] <JEEB> also to be honest a whole lot of libavcodec-based solutions already support 4:4:4/RGB H.264
[11:59] <JEEB> the support was added around summer of 2011 after all
[11:59] <JEEB> stuff like VLC did fix it rather late IIRC, though?
[12:00] <sspiff> JEEB: wouldn't rescaling back to 1x only derp up a few pixels? They'd get the b&c of one of their neighbours, but most of the time that should be similar enough, no? Or would that just give us exactly the same as we got without rescaling at all?
[12:00] <JEEB> also it's still not 100% lossless because you're converting from full range (0-255) RGB to limited range (16-235 luma/16-240 chroma) YCbCr
[12:01] <JEEB> sspiff, it would lead to exactly the same thing as when you just convert the original RGB picture to YCbCr
[12:01] <JEEB> not exactly, but it pretty much is the same thing
[12:01] <sspiff> alright
[12:01] <JEEB> you are resizing 2x in order to not be affected by chroma subsampling, if you downscale that you start getting affected by it
[12:02] <JEEB> Samus_Aran, build mpv with mpv-build to gain something you can check that RGB H.264 in
[12:02] <znf> record as qtrle on a 1TB hdd, have fun!
[12:02] <JEEB> https://github.com/mpv-player/mpv-build
[12:02] <JEEB> since I assume you are on linux because you most probably have an old VLC too
[12:02] <JEEB> from the repositories
[12:06] <Samus_Aran> znf: I've not heard of qtrle, I assume something raw/lossless?
[12:07] <znf> Samus_Aran, yeah, will produce huge files though
[12:07] <znf> will most likely not be suitable for distribution
[12:07] <JEEB> that's exactly the same as if using ffvhuff or lossless RGB H.264
[12:08] <Samus_Aran> I tried playing it with ffplay, and it is closer than either of the other methods, but isn't identical colours to what's actually on the screen
[12:08] <Samus_Aran> putting the video over the real window it's obvious
[12:08] <JEEB> ugh
[12:08] <JEEB> first of all
[12:08] <znf> qtrle should be identical to what you see on screen
[12:09] <JEEB> how can lossless and lossless not be exactly the same, understand where the goddamn differences are coming from and tell ffmpeg not to do that
[12:09] <JEEB> it probably 'looks better' because you're not converting away from RGB
[12:09] <Samus_Aran> JEEB: what lossless?  libx264 isn't lossless
[12:09] <JEEB> IT IS
[12:09] <JEEB> -crf 0
[12:09] <JEEB> or -q:v 0
[12:09] <JEEB> everything else is not lossless
[12:09] <Samus_Aran> that isn't lossless, it's "close to lossless" at least according to the docs
[12:09] <JEEB> no
[12:09] <JEEB> then you're reading the docs wrong
[12:10] <JEEB> crf 0 with 8bit x264 most motherfucking definitely is bit-exactly lossless
[12:10] <JEEB> if you have a 9 or 10bit x264
[12:10] <relaxed> you mean -qp 0 ?
[12:10] <relaxed> or does -q:v 0 work too ?
[12:10] <JEEB> yes, I have not ever set QPs manually in ffmpeg since crf is generally the preferred way
[12:10] <JEEB> -q should but the fuck I know
[12:10] <Samus_Aran> JEEB: why are you so angry at me when I've done nothing but ask questions?  sheesh.  I'm no video expert, that's why I'm here.
[12:10] <JEEB> anyways, yes -- if your x264 is built as 9bit or 10bit then yes, the CRF range is changed
[12:11] <JEEB> but I'm pretty goddamn sure you don't have that
[12:11] <Samus_Aran> I have an ffmpeg that is 10 bit, and one that is 8 bit
[12:11] <JEEB> so as long as you use crf zero or quantizer zero with x264 (first with 8bit only of course)
[12:11] <JEEB> then it is LOSSLESS
[12:11] <Samus_Aran> okay?
[12:12] <JEEB> as I have been trying to tell you for a while now, any difference otherwise comes from some possible color space conversion before your input is fed into the encoder
[12:12] <Samus_Aran> so how can I make it the same as what's on my screen in 4:4:4?
[12:12] <relaxed> can you pastebin what you're doing?
[12:12] <JEEB> your input is RGB so you will want to make sure the output is RGB as well, and that swscale is not going in there somewhere unneededly
[12:13] <JEEB> x264 has its own 'codec' name for that
[12:13] <JEEB> otherwise use -pix_fmt after -i
[12:13] <znf> Samus_Aran, what are you using to feed that into ffmpeg?
[12:13] <znf> I mean the video stream
[12:13] <JEEB> and if it STILL differs, then something wonky is going on
[12:14] <znf> Samus_Aran, have you tried saving your output as a simple .png image and see if that image differs from what's on your screen?
[12:14] <JEEB> just... just please learn to differentiate when something is due to color space conversions before feeding your input into an encoder
[12:14] <JEEB> and when the actual encoder is doing something
[12:14] <znf> ffmpeg -i (...) out.png
[12:16] <Samus_Aran> relaxed: http://pastie.org/pastes/8708304/text
[12:16] <JEEB> and if there still is a difference even when ffmpeg is not doing any color space conversion, then the problem is somewhere else. Either in the way you're inputting stuff into ffmpeg, or in the way you are checking that output.
[12:17] <JEEB> you shouldn't have to set pix_fmt with libx264rgb, just btw
[12:17] <Samus_Aran> znf: not today, but when I've used PNGs other times they were exact
[12:17] <JEEB> it should be RGB by default
[12:18] <Samus_Aran> I have -pix_fmt bgr24 in there because it would complain at me otherwise, but still use it.
[12:18] <JEEB> also the swscaler warning is kind of weird
[12:18] <JEEB> can you post the output with -v debug before -i
[12:19] <Samus_Aran> sure, just a moment
[12:21] <Samus_Aran> znf: I output to PNG and it was pixel-exact
[12:21] <JEEB> well, that at least tells you that your input is OK
[12:26] <Samus_Aran> JEEB: http://pastie.org/pastes/8708326/text
[12:26] <JEEB> thank you
[12:26] <Samus_Aran> the colours are fading toward white slightly
[12:26] <Samus_Aran> or losing some saturation
[12:27] <Samus_Aran> less vibrant, more pastel.
[12:27] <JEEB> well the only thing that's poking the picture there is the swscale scaler
[12:28] <JEEB> if the x264 line contains "qp=0" that means it's lossless
[12:29] <JEEB> what are you using to preview that thing?
[12:30] <Samus_Aran> ffplay
[12:30] <JEEB> what happens if you output PNGs from that H.264 stream?
[12:30] <JEEB> with ffmpeg
[12:30] <JEEB> let's just say that ffplay has plenty of places it can go wrong
[12:32] <Samus_Aran> that is pixel-exact
[12:32] <JEEB> ok, that means that the H.264 stream is pixel-exact
[12:32] <Samus_Aran> pngs from the mkv
[12:32] <JEEB> so it was ffplay herping a derp
[12:33] <Samus_Aran> my ffplay may be older, I have too many versions of ffmpeg installed for different things
[12:33] <Samus_Aran> ffplay version git-2013-11-16-d7ebeba
[12:33] <Samus_Aran> it's newish
[12:33] <JEEB> well ffplay is in general something you should not trust too much
[12:33] <JEEB> it's a proof-of-concept kind of hackjob
[12:35] <squ> how to list time length of many files
[12:35] <squ> of many .m4a files
[12:35] <JEEB> anyway, an RGB<->RGB conversion in swscale is /generally/ safe, but swscale is swscale, there be dragons
[12:38] <Samus_Aran> squ: perhaps parsing the output of ffmpeg -i file, or parsing the output of ffprobe
[12:39] <squ> what is the difference between the two?
[12:40] <JEEB> ffprobe is made for metadata etc. grabbing
[12:40] <JEEB> so you'll probably want to follow its usage manual
[12:40] <Samus_Aran> I had an issue with ffprobe where some files would report no duration, but ffmpeg listed the correct durations
[12:41] <JEEB> in theory it should do the same things, but it might need some extra settings since you can pick rather well if it will touch the actual streams in the container at all
[12:43] <Samus_Aran> squ: I've used something like this in one of my Bash scripts: ffmpeg -i Screen_Capture.mkv 2>&1 | awk -F'(: |, )' '/^  Duration: / {print $2}'
[12:43] <Samus_Aran> 00:00:05.10
[12:48] <relaxed> for seconds: ffprobe -show_format -i input.mkv 2>&1| awk -F= '/^dur/ { print $2 }'
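For the original "many .m4a files" question, relaxed's one-liner could be wrapped in a loop. ffprobe is assumed to be on PATH, and the `[ -e ... ]` test skips the unexpanded pattern when no files match.

```shell
# Print "<file>: <seconds>" for every .m4a in the current directory.
for f in *.m4a; do
  [ -e "$f" ] || continue
  printf '%s: ' "$f"
  ffprobe -show_format -i "$f" 2>&1 | awk -F= '/^dur/ { print $2 }'
done
```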
[12:49] <Samus_Aran> JEEB: thanks for your help
[12:50] <JEEB> no problem :)
[12:51] <squ> Samus_Aran: thank you
[12:54] <Samus_Aran> squ: welcome
[13:08] <squ> Samus_Aran: http://vpaste.net/Qsuxc
[13:25] <Samus_Aran> squ: looks fine.  might want to use ffmpeg if you're not using ffprobe's options to output stream info
[13:27] <Samus_Aran> such as ffprobe -show_streams
[13:27] <Samus_Aran> goodnight, thanks again for the help, screen capturing is going well.
[14:06] <Bname> Hello friends, I was referred here after asking a rather specific question and was told you guys would be knowledgeable in this area.
[14:07] <Mavrik> maybe.
[14:07] <webchat21323> hello, i want to add scrolling text to a video. the text should only appear on the lower half of the video and scroll upwards. once a line reaches the middle of the screen it should disappear (or better fade away). How can I do that? what is the exact drawtext command ? thanks!
[14:07] <Bname> I wish to re-stream live, streaming content from another website to one of my servers to any user who wishes to watch. Could somebody point me in the right direction?
[14:08] <Mavrik> hmm
[14:08] <Mavrik> how much do you know about basics of video / video streaming?
[14:08] <Bname> I'd have to say very little.
[14:09] <Mavrik> Bname, ok, first find out which protocol the source uses
[14:09] <Mavrik> then find out which streams and which formats the source has :)
[14:09] <Mavrik> we can continue then ;)
[14:09] <Bname> Alright, I'll try. Thanks Mavrik
[14:10] <Mavrik> when you know that it'll be easier to talk :)
[14:32] <webchat21323> Can anyone give me some advice?
[14:49] <webchat21323> How can i add scrolling text to a video that's only shown on the bottom half?
[14:54] <webchat21323> At the moment I'm using ffmpeg -loop 1 -i "Pic.png" -i aud.mp3 -c:v libx264 -c:a aac -strict -2 -shortest -r 24 -vf drawtext="fontfile=/windows/fonts/calibri.ttf:textfile='text.txt':fontsize=24:fontcolor=white:y=h-10*t-40" test.mp4 . How do I make the text disappear after it reaches the middle of the screen?
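Nobody answered this in the log, but one hedged approach: with the motion y = h - 10*t - 40, the text crosses the vertical middle at t = (h/2 - 40)/10, so for a known frame height the drawtext `enable` option can stop drawing at that time. The threshold below assumes a 720-pixel-tall video (t = (360 - 40)/10 = 32); a gradual fade would need drawtext's alpha expression, which only newer builds support.

```shell
# Sketch, assuming a 720-px-tall video: text scrolls upward and stops
# being drawn once it passes the middle (t = 32 s for this height).
ffmpeg -loop 1 -i Pic.png -i aud.mp3 -shortest -r 24 \
    -vf "drawtext=fontfile=/windows/fonts/calibri.ttf:textfile=text.txt:\
fontsize=24:fontcolor=white:y=h-10*t-40:enable='lt(t,32)'" \
    -c:v libx264 -c:a aac -strict -2 test.mp4
```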
[16:36] <coalado> how can I extract the m4a/aac audio track from a mp4 video file?
[16:37] <coalado> just extract the audiostream, direct copy -  no conversion
[16:41] <Mavrik> coalado, use "copy" as a codec
[16:42] <Mavrik> ffmpeg -i <file> -vn -codec:a copy out.m4a
[16:46] <coalado> thanks
[17:15] <yamyam> does anybody know if it is possible to compile ffmpeg so that certain arguments are applied automatically at launch? for example -af aresample=async=1000
[17:29] <RenatoCRON> yamyam, can't you just use a bash or alias ?
[17:30] <yamyam> well thats the prob, another bin is calling ffmpeg and building the argument chain, so I can't inject arguments directly… thats the reason why I try to compile the filter as auto-start on launch
[17:31] <yamyam> I hacked around how to trick the other bin but they escape my strings and so I wind up trying to compile the wanted functions as auto active :-(
[17:31] <RenatoCRON> yamyam, ok.. it's possible, but I guess you'll have to implement the default yourself in the code
[17:32] <RenatoCRON> yamyam, I assume it's linux, maybe you can adjust $PATH and create a fake ffmpeg script that arranges the arguments and passes them to the real bin
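RenatoCRON's $PATH idea could be sketched like this. The logic is written as a POSIX shell function for clarity; in practice it would be the body of a script named `ffmpeg` in a directory that precedes the real binary in $PATH. `REAL_FFMPEG` is a made-up knob, and the splice assumes the last argument is the output file, which real command lines don't guarantee.

```shell
# "Fake ffmpeg" shim: rebuild the argument list with the extra -af
# option spliced in front of the last argument (assumed output file),
# then hand everything to the real binary.
ffmpeg_shim() {
  real=${REAL_FFMPEG:-/usr/local/bin/ffmpeg}   # assumed location
  n=$#
  i=0
  for a in "$@"; do             # iterate over the original arguments
    i=$((i + 1))
    if [ "$i" -eq "$n" ]; then  # just before the final arg, inject the filter
      set -- "$@" -af aresample=async=1000
    fi
    set -- "$@" "$a"            # append originals after the old list
  done
  shift "$n"                    # drop the old list, keep the rebuilt one
  "$real" "$@"
}
```

Setting `REAL_FFMPEG=echo` shows the rewritten command line without running anything.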
[17:32] <yamyam> :-) that's exactly what I didn't want to hear (but feared to happen) haha :-)
[17:33] <yamyam> Thanks Renato I tried that already… but this killed the other bin from the streaming server and screwed their argument chain
[17:34] <yamyam> I think I have to go the hard way and write the function… :-(
[17:35] <RenatoCRON> yamyam, the thing that you say that scrwed up, is another ffmpeg or ffXXXX ?
[17:36] <yamyam> it's flussonic streaming server solution and they call ffmpeg from a compiled erlang bin… and they did nice string escaping, which is why I can't inject arguments
[17:37] <yamyam> it's about the good old sound sync issue on low DVB input and they don't want to implement my solution so I have to do it on my own and recompile ffmpeg
[17:38] <RenatoCRON> I guess you know what you're doing, so you didn't remove the previous entries from $PATH
[17:38] <RenatoCRON> yamyam, also, maybe #ffmpeg-dev can help you more
[17:38] <RenatoCRON> sorry, #ffmpeg-devel *
[17:39] <yamyam> that's a good idea, wanted to check here first if somebody knows a solution I haven't thought about yet. I will ask them, thanks a lot Renato
[17:43] <relaxed> yamyam: can't you use ffmpeg directly?
[17:44] <relaxed> you could turn on those options by default and recompile ffmpeg, but that would be horrible
[17:46] <yamyam> the problem is that if i go with ffmpeg only, the playout server is not accepting my input… they may use metadata or something to check if ffmpeg is called by myself or flussonic
[17:47] <yamyam> ok do you know how to turn them on? i searched in the configure file but found no solution - finally I started working in the filter.c file to try to start them
[17:47] <relaxed> You would have to edit the source and recompile ffmpeg
[17:47] <yamyam> ok :-( I'm already there… just was hoping to get it done more easily
[17:48] <yamyam> but thanks a lot for sharing your brain for my issue :-)
[17:48] <relaxed> can't you peek at how it's running ffmpeg with `ps` and mimic it?
[17:51] <yamyam> i tried to strace how they do it, but these guys compile ffmpeg into an erlang bin and redirect all output away… I tried to decompile the erlang bin but wound up at encrypted source :-(
[17:52] <yamyam> my only solution is to recompile ffmpeg to force the erlang bin to call my "modded" ffmpeg with my additional arguments i guess
[18:32] <w-wright> Hello, I'm just wondering but if I have compiled ffmpeg from source and I want to add something else afterwards. Will it wipe out my previous config?
[18:33] <relaxed> what config? you'll have to compile a new binary.
[18:34] <w-wright> when I do ./configure --enable etc
[18:34] <w-wright> in source
[18:35] <relaxed> yes, you'll have to --enable everything again
[18:35] <w-wright> Ok thanks for your time!
[18:35] <relaxed> you're welcome
[19:28] <Wa4> can ffmpeg do a fade out of an audio file? for example the last 10 seconds
[19:28] <klaxa> http://www.ffmpeg.org/ffmpeg-filters.html#afade
[19:29] <Wa4> cool
[19:29] <Wa4> and now the superduper question:
[19:30] <Wa4> I have 200 mp3s in a folder, I want to automate to cut the first 40 seconds of each file and make the last 10 seconds of each file do a fade out
[19:30] <Wa4> and delete the original full mp3s
[19:30] <klaxa> write a bash script that reads the duration
[19:31] <Wa4> ffmpeg accepts this?    ffmpeg -i *.mp3
[19:38] <RenatoCRON> Wa4, from what I know, with JPG, yes, so mp3 may work in a similar way
[19:39] <RenatoCRON> Wa4, try "%d"
[19:39] <RenatoCRON> ffmpeg -f [???] -i %*.mp3
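klaxa's duration-reading bash script could be sketched like this. ffmpeg and ffprobe are assumed to be on PATH, the output names are made up, and the originals should only be deleted after checking the results. Note that after `-ss 40` the output timeline restarts at zero, so the fade must start at duration - 40 - 10 seconds.

```shell
# Fade-out start for a trimmed file: (total - 40 s trim) - 10 s fade.
fade_start() {
  awk -v d="$1" 'BEGIN { print d - 50 }'
}

for f in *.mp3; do
  [ -e "$f" ] || continue   # skip the unexpanded pattern if no files
  d=$(ffprobe -show_format -i "$f" 2>&1 | awk -F= '/^dur/ { print $2 }')
  ffmpeg -ss 40 -i "$f" -af "afade=t=out:st=$(fade_start "$d"):d=10" "faded_$f"
done
# Only delete the originals after checking the faded_*.mp3 files.
```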
[19:45] <three_le1s> hello
[19:45] <three_le1s> I was wondering if i could offload some of the work onto the gpu
[19:48] <RenatoCRON> quote "x264's dev IRC has been testing OpenCL lookahead for months. You can join the IRC, request for the patch, compile a binary, and test or use on your own risk."
[19:49] <RenatoCRON> three_le1s, http://forum.doom9.org/showthread.php?t=164960
[19:49] <RenatoCRON> theholyduck, #ffmpeg-devel may have more info
[19:52] <three_le1s> RenatoCRON: thank you :D
[19:54] <RenatoCRON> three_le1s, you are welcome. if you plan to encode x264, you should also look at https://developer.nvidia.com/nvidia-video-codec-sdk
[19:55] <RenatoCRON> there's people reporting very fast results with that.
[19:57] <ChocolateArmpits> SD MOV files seem to report two different SARs. Is there any way to prioritize one of them for the scale video filter ?
[20:00] <theholyduck> RenatoCRON, uguu
[20:00] <theholyduck> i wish people could tab complete in last-spoken order
[20:02] <theholyduck> http://www.youtube.com/watch?v=uOOOTqqI18A if people are interested in why gpu acceleration of encoding is hard, and what opencl lookahead is
[20:02] <theholyduck> its a pretty nice talk
[20:03] <pk__> can we encode RGB directly to h264 without converting to YUV?
[20:03] <pk__> not asking whether it can be done by ffmpeg..i am asking in general can it be done specification/algo wise?
[20:04] <theholyduck> h264 is yuv only as far as i know
[20:04] <RenatoCRON> theholyduck, cool, i'll save it in my favs. but I guess for a while I'll not use it, because i'm using -preset ultrafast and i'm on AWS until the cost justifies a colocation
[20:05] <theholyduck> pk__, but, ffmpeg can take rgb as input and output yuv h264. its not like you need a separate conversion stage
[20:05] <theholyduck> first
[20:05] <RenatoCRON> pk__ I think theholyduck is right
[20:05] <pk__> but there is conversion taking place right?
[20:05] <theholyduck> pk__, yuv to rgb yeah
[20:05] <theholyduck> er
[20:06] <theholyduck> rgb to yuv
[20:06] <theholyduck> pk__, there are rgb video formats, but h264 is not one of them
[20:06] <theholyduck> rgb is just too wasteful
[20:06] <theholyduck> compared to yuv
[20:06] <pk__> i read somewhere that direct rgb is also possible as if we are doing yuv 4:4:4
[20:06] <theholyduck> yuv 4:4:4 is still yuv
[20:07] <theholyduck> rgb is an entirelly different way of storing color data
[20:08] <theholyduck> yeah, h264 supports yuv 4:0:0 4:2:0 4:2:2 and 4:4:4
[20:08] <pk__> yes but what happens if say..we take a rgb file and feed to encoder as if it were yuv 4:4:4
[20:08] <pk__> dont you think it will encode and decode fine?
[20:08] <theholyduck> pk__, the encoder would do the conversion
[20:08] <theholyduck> to make it yuv
[20:08] <pk__> oh come on
[20:09] <ChocolateArmpits> It would get interpreted to horrible results
[20:09] <theholyduck> pk__, rgb is fundamentally completely different from yuv
[20:09] <theholyduck> if you could get the encoder to parse rgb as yuv, then, it would just break horribly
[20:09] <pk__> okay :(
[20:10] <theholyduck> pk__, rgb is very inefficient, cause it stores the luminosity in all 3 channels
[20:10] <pk__> yes i know
[20:10] <theholyduck> yuv doesnt work like that at all. and so, no way in hell would you be able to parse it correctly
[20:10] <theholyduck> pk__, colorspace conversions are automatic and relatively painless
[20:11] <ChocolateArmpits> How can you call it inefficient when efficiency isn't in the mind of the format?
[20:11] <theholyduck> ChocolateArmpits, well, thats why actual consumer video formats dont use rgb
[20:11] <theholyduck> or dont support it :P
[20:11] <theholyduck> yuv is just a much much more efficient way to go about things
[20:11] <pk__> i have 720p RGB data which i want to pass through a 100Mbps link with the least computation
[20:12] <theholyduck> pk__, take a look at the ffmpeg huffyuv encoder
[20:12] <theholyduck> it might produce bitrates small enough for you. depending
[20:12] <theholyduck> and it supports rgb
[20:12] <ChocolateArmpits> Well I just think judging rgb on basis of efficiency isn't -correct- because efficiency isn't in the mind of rgb
[20:12] <theholyduck> or was it ffv1?
[20:13] <theholyduck> one of the speciality ffmpeg encoders
[20:13] <theholyduck> does rgb
[20:13] <pk__> huffyuv supports rgb? ;)
[20:13] <pk__> you mean it converts?
[20:13] <ChocolateArmpits> intra frames only
[20:13] <ChocolateArmpits> then
[20:14] <theholyduck> pk__, well, its not called huffyuv in ffmpeg
[20:14] <pk__> how much bandwidth can i compress it to (any idea?) if i do intra only?
[20:14] <theholyduck> it has a different name
[20:14] <theholyduck> pk__, try it out?
[20:14] <ChocolateArmpits> As much as you are ready to give bits
[20:15] <pk__> actually i can't try unless i know the expected result...it is a totally different architecture and i will have to port the encoder to it
[20:16] <pk__> ChocolateArmpits: you mean even with intra only we can compress to very low bandwidth?
[20:16] <ChocolateArmpits> I mean with very low latency
[20:16] <ChocolateArmpits> If you want bandwidth savings you'll have to use inter frame coding too
[20:17] <ChocolateArmpits> Which will add to the potential latency
[20:17] <pk__> i just want it to make it through 100Mbit, thats it
[20:17] <ChocolateArmpits> Are you familiar with intra and inter frame concepts?
[20:17] <theholyduck> hmmi
[20:17] <pk__> ChocolateArmpits: yes
[20:17] <ChocolateArmpits> ok
[20:17] <theholyduck> i could have sworn ffv1 or the ffmpeg version of the basic lossless huffman video encoder
[20:18] <theholyduck> supported rgb
[20:18] <theholyduck> but i cant seem to find any backup info on it
[20:18] <theholyduck> pk__, why is it so important to preserve the RGB?
[20:18] <pk__> not important
[20:18] <pk__> thing is i will need to create a hard chip for this
[20:18] <pk__> and i want the least exercise for me :)
[20:19] <pk__> after the simplese algo is found i will write the hardware design
[20:20] <pk__> huffyuv is lossless ..doesn't that mean it will save me very little in terms of bandwidth?
[20:21] <theholyduck> pk__, sure, but its also a very simple codec
[20:21] <theholyduck> pk__, but, since it doesnt support rgb after all
[20:21] <theholyduck> you could go with whatever
[20:22] <pk__> okay i will convert rgb to yuv..but after that how much do you think huffyuv will compress the stream to?
[20:22] <theholyduck> pk__, well, if you are willing to use yuv, theres a bunch of other video formats with more options for compression
[20:22] <theholyduck> huffyuv's virtue is that its fast and simple and gives a bit of gain
[20:23] <theholyduck> pk__, i dont remember how much, been a while
[20:23] <theholyduck> try it out yourself with ffmpeg
[20:23] <theholyduck> see what numbers you get on your content
[20:23] <pk__> right
[20:23] <pk__> i can try it first using ffmpeg and then port later if it satisfies
[20:24] <pk__> is there some simple lossy intra only algo also?
[20:25] <pk__> is (m)jpeg easy?
[20:25] <theholyduck> relatively, i guess
[20:25] <ChocolateArmpits> mjpeg is intra and a stack of jpegs
[20:30] <pk__> okay for testing huffyuv i have this video mp4(1080p h264 + mp4a audio)...how to dump a 10 second raw YUV 420 out of it and then pass it through huffyuv?
[20:30] <pk__> using ffmpeg
[20:31] <pk__> can you please tell me a command line?
[20:32] <theholyduck> ffmpeg is pretty easy to use and decently documented these days
[20:32] <theholyduck> do some googling :P
[20:32] <theholyduck> and trial and error? :P
[20:32] <pk__> okay
[20:32] <pk__> you are right
[20:33] <pk__> my computer is pretty slow..do you think it will be possible to process only 10 seconds or so?
[20:33] <pk__> just yes or no? rest i will find on google
[20:34] <theholyduck> pk__, sure
[20:34] <theholyduck> theres a command-line option to restrict the number of frames it uses
[20:34] <theholyduck> and i think one to set the endpoint in time
[20:34] <theholyduck> as well
[20:35] <pk__> okay let me try
[20:35] <pk__> should i convert first to a raw 4:2:0 file and then to huffyuv, or can i go directly to huffyuv?
[20:37] <theholyduck> just go directly
[20:37] <theholyduck> ffmpeg can handle any colorspace conversion
[20:38] <pk__> you were saying that it is not called huffyuv in ffmpeg, then what is it called?
[20:38] <theholyduck> ffhuff or something
[20:38] <theholyduck> should be some documentation on what flag to use somewhere
[20:39] <theholyduck> i have to run away now
[20:39] <theholyduck> some work needs doing
[20:40] <pk__> okay thank you
[20:40] <pk__> so much for your help
[20:45] <llogan> libx264rgb
[20:47] <pk__> llogan: what is libx264rgb?
[20:47] <three_le1s> RenatoCRON: x264 already has it merged
[20:50] <ChocolateArmpits> Does anyone know if 1st pass encoding takes encoding settings into account ?
[20:55] <zennist> Hi guys
[20:56] <zennist> I want to convert videos for best compatibility/smallest size; used for html5
[20:56] <zennist> anyone has any advice?
[20:56] <ChocolateArmpits> compatibility with what exactly?
[20:56] <llogan> pk__: ffmpeg -h encoder=libx264rgb
[20:56] <zennist> to be used as html5 video
[20:57] <llogan> ChocolateArmpits: which encoding settings? what encoder?
[20:58] <ChocolateArmpits> llogan, x264, bitrate, maxrate, bufsize, default x264 presets
[20:58] <pk__> llogan: that will convert rgb to h264?
[20:58] <llogan> pk__: Supported pixel formats: bgr24 rgb24 (i didn't really read the wall o' text...so maybe i'm out of context)
[20:59] <pk__> llogan: yes you certainly are :)
[20:59] <zennist> :P anyone? settings for html5 video that gives me smallest size
[21:01] <llogan> zennist: depends on the format you want to use
[21:01] <llogan> (and the browsers too)
[21:01] <zennist> llogan: mp4 would be
[21:02] <ChocolateArmpits> llogan, what does console output have to do with what 1st pass does?
[21:02] <llogan> i want to see all of your options and your ffmpeg version so i can give you an accurate answer
[21:03] <DeadSix27> anyone knows a backup page for http://trac.ffmpeg.org/ ?
[21:03] <DeadSix27> since: Service Temporarily Unavailable
[21:03] <ChocolateArmpits> DeadSix27, try google cache
[21:03] <DeadSix27> ah right
[21:03] <llogan> zennist: what are your input formats?
[21:04] <llogan> hopefully trac will be back today or tomorrow
[21:04] <zennist> llogan: mhh, a bunch of mp4's I think. And also mov's.
[21:04] <llogan> maybe you don't have to re-encode then
[21:06] <zennist> llogan: :'( but they are too big..and are exceeding my website storage limit
[21:06] <llogan> zennist: a typical command can look like: ffmpeg -i input -c:v libx264 -crf 23 -preset medium -movflags +faststart -c:a libfdk_aac -vbr 5 output.mp4 (probably fine for progressive download)
[21:07] <llogan> use the highest -crf value that gives you an acceptable quality. use the slowest -preset that you have patience for.
[21:08] <rcombs> anyone know if there's a way to access the current AVPacket from AVCodec.encode_sub?
[21:08] <llogan> zennist: crf range is a log scale of 0-51. 0 is lossless. 18 is roughly "visually lossless" (depending on input), 23 is default.
[21:08] <zennist> llogan: crf...full name? Sorry for being a noob
[21:08] <llogan> constant rate factorb. the b is for bargain.
[21:08] <llogan> constant rate factor
[21:08] <rcombs> erm, wrong channel
[21:10] <ChocolateArmpits> llogan, I don't have access to my workstation currently but the command line for the 1st pass would go like this:
[21:10] <ChocolateArmpits> %ffmpeg% -y -i input.avi -vf "[in]yadif=0:-1:0[yadif];[yadif]hqdn3d=4:4:6:6[denoise];[denoise]scale=out_h*dar:576:0[out]" -sws_flags bicubic -pass 1 -c:v libx264 -preset slow -b:v 430k -maxrate 680k -bufsize 200k -an -pix_fmt yuv420p output.mp4
[21:11] <ChocolateArmpits> probably NUL instead of output.mp4
[21:13] <ChocolateArmpits> oh forgot -profile:v main
[21:14] <llogan> ChocolateArmpits: "fastfirstpass" is a default setting, IIRC, unless you use slow-firstpass via x264-params or -x264opts (or maybe disable it with -fastfirstpass 0 libx264 private option)
[21:15] <llogan> ChocolateArmpits: http://mewiki.project357.com/wiki/X264_Settings#slow-firstpass
[21:16] <llogan> shows a list of settings that -pass 1 will apply
[21:16] <ChocolateArmpits> The option describes placebo preset as enabling slowfirstpass too
[21:17] <llogan> yes, but placebo is a joke. that's why it's called placebo to placate the "i wanna teh best encodning settingz" users.
[21:17] <ChocolateArmpits> Can it be believed that the first pass doesn't take into account bitrate settings ?
[21:18] <ChocolateArmpits> Or even the profile if trellis is off ?
[21:20] <llogan> ChocolateArmpits: you can use format=yuv420p within your filtergraph instead of -pix_fmt yuv420p so you can have more control as to when it is used
[21:20] <llogan> same with sws_flags (see "flags" option for scale filter)
[21:20] <pk__> ffmpeg -i d:\videos\ganjababe.mp4 -vcodec huffyuv -vframes 1 out.huff   this is not working ..how do i dump raw huffyuv?
[21:21] <pk__> it says Unable to find a suitable output format for 'out.huff'
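Two things are going wrong in that command: FFmpeg's huffyuv-family encoders are named `huffyuv` and `ffvhuff` (the FFmpeg variant theholyduck half-remembers), and the output needs a container that can be guessed from the extension, which `.huff` is not. A sketch that also covers the earlier "10 seconds" question, with file names assumed:

```shell
# Take 10 s of video, encode with FFmpeg's huffyuv variant, and wrap it
# in Matroska so the muxer can be chosen from the file extension.
ffmpeg -i input.mp4 -t 10 -an -c:v ffvhuff out.mkv
```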
[21:21] <ChocolateArmpits> llogan, I believe pixel format transforms are done after applying video filters
[21:22] <llogan> that's my guess if you use -pix_fmt, but you can also use the format filter instead so you can tell it when to do that
[21:23] <ChocolateArmpits> I've had that previously but decided to cut down on filters used
[21:23] <ChocolateArmpits> It would've gone to the very end either way
[21:24] <ChocolateArmpits> my inputs are 4:2:0 and 4:1:1 most times
[21:26] <llogan> some DV inputs?
[21:26] <ChocolateArmpits> DVCPRO and DVCAM for the time being
[21:29] <ChocolateArmpits> back to the topic llogan, so do you know if the first pass takes the bitrate settings into account when making its decisions, or does it not do that and only apply the filters at most?
[21:29] <ChocolateArmpits> Because the settings listed in the mewiki seem more like to speed up the process
[21:30] <ChocolateArmpits> potentially by not using settings the first pass won't be making decisions on at all
[21:30] <llogan> sorry, but i don't quite understand your question.
[21:30] <ChocolateArmpits> what does firstpass do?
[21:31] <llogan> http://mewiki.project357.com/wiki/X264_Settings#pass
[21:32] <ChocolateArmpits> >contains information about every input frame
[21:33] <ChocolateArmpits> So I presume it doesn't render the video and then write information, but rather only looks at the input ?
[21:34] <ChocolateArmpits> So having encoding settings and video filter information would prove useless and only add to the encoding time ?
[21:34] <ChocolateArmpits> If all the first pass does is write information about the input
[21:34] <llogan> no. keep everything the same. just change to -pass 2 and add your audio info on the second pass.
[21:35] <ChocolateArmpits> Are you saying those settings matter for the firstpass ?
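llogan's "keep everything the same, just change to -pass 2" advice, condensed into a sketch. The options are abbreviated from ChocolateArmpits's command line, and NUL is the Windows null device (use /dev/null elsewhere).

```shell
# Pass 1: no audio, discard the output, keep the stats file.
ffmpeg -y -i input.avi -c:v libx264 -preset slow -b:v 430k -maxrate 680k \
    -bufsize 200k -pass 1 -an -f mp4 NUL
# Pass 2: identical video settings, now with audio and a real output.
ffmpeg -i input.avi -c:v libx264 -preset slow -b:v 430k -maxrate 680k \
    -bufsize 200k -pass 2 -c:a aac -strict -2 output.mp4
```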
[00:00] --- Sat Feb  8 2014

