[Ffmpeg-devel-irc] ffmpeg.log.20121011

burek burek021 at gmail.com
Fri Oct 12 02:05:01 CEST 2012


[00:13] <pfifo> I am using ffmpeg and dvdauthor to make some video disks. Im having a problem with the final dvd's aspect ratio. The first video, a short 5 minute cartoon is 4:3, however the main feature is 16:9. When I play the dvd, the first clip plays fine, but the main feature is stretched to fill the entire 4:3 space.
[00:21] Action: pfifo better pick up a dvd-rw at wal-mart
[00:30] <lake> i have video source @ 640x480, with ffmpeg cropping borders, the size is 624x464. I want to transfer it to a dvd. should these dimensions cause issues?
[00:31] <lake> will i need to scale it up to 704x480?
[00:34] <whiplash> i'm trying to figure out how to write a vfw codec
[00:34] <whiplash> do you ffmpeg devs think that the vfw portion of the ffmpeg source is a good example?
[00:35] <pfifo> lake, yes it will create issues
[00:36] <lake> pfifo: can i scale it to 640x480 and it be okay?
[00:36] <pfifo> lake, im not sure, theres a list of resolutions that are allowed, you have to crop/scale/letterbox to what you need
[00:39] <pfifo> i think 640x480 is svcd
[00:46] <lake> pfifo: thanks. i'll try it at 704x480
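For lake's question, a hedged sketch: ffmpeg's `-target` presets pick DVD-legal parameters, so one option is to let it handle the frame-size question (`in.avi` is a placeholder input):

```shell
# Sketch, not a verified recipe: -target ntsc-dvd selects a DVD-compliant
# codec, 720x480 frame size, GOP structure and bitrate caps. Add your own
# scale/pad filter first if you want to control how the cropped 624x464
# picture is fitted into the DVD frame.
ffmpeg -i in.avi -target ntsc-dvd -aspect 4:3 dvd.mpg
```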
[00:49] <efs071> Hi...
[00:49] <efs071> how are you?
[00:49] <efs071> I have a problem with ffmpeg
[00:50] <efs071> I want to add head and tail to a mov file
[00:50] <whiplash> how do I compile ffmpeg into a dll file?
[00:50] <efs071> I use "-ss -00:01:0" and this adds a head (repeats the first frame)
[00:51] <efs071> How do I do this to append frames after the last frame?
[01:14] <efs071> How can I append frames to the video? These frames are a copy of the last frame..
[01:14] <efs071> Thanks...
[01:17] <llogan> efs071: see the "concat" protocol in ffmpeg and http://ffmpeg.org/faq.html#How-can-I-join-video-files_003f
[01:18] <burek> whiplash, why?
[01:18] <burek> what's the point?
[01:22] <efs071> Ok... Thanks... but is it possible without concatenating with another video? I can add a copy of the first frame with "ffmpeg -i movie.mov -ss -00:01:0 output.mov"
[01:24] <burek> efs071 you're better off using some video editor or something
[01:25] <efs071> I need a ffmpeg command
[01:26] <whiplash> burek: so it can be used as a dshow codec
[01:26] <whiplash> and just as an exercise
[01:27] <burek> whiplash, did you check zeranoe's builds?
[01:27] <whiplash> yes, that's where i got the source from
[01:28] <efs071> If I have a video movie1.mov and I create a movie2.mov from the last frame of movie1, how can I create movie2.mov with the exact codec and params to concat them? And does concat only work with mpeg videos?
[01:28] <whiplash> i'm using the october 9th revision
[01:28] <whiplash> f3f35f7 and i'm building under cygwin for i686 on a windows 7 x64 box
[01:29] <burek> efs071, what's the purpose for such thing?
[01:29] <whiplash> this is what i invoke .configure with: ./configure --target-os=mingw32 --extra-cflags=-mno-cygwin --extra-libs=-mno-cygwin --enable-cross-compile --cross-prefix=i686-pc-mingw32- --arch=i686 --disable-doc
[01:29] <whiplash> as far as i can tell, this page: http://ffmpeg.org/platform.html#toc-Crosscompilation-for-Windows-under-Cygwin is wrong
[01:30] <whiplash> --extra-cflags=-mno-cygwin isn't part of cygwin's gcc anymore
[01:31] <efs071> I need to expand the video with a head and a tail. I can create the head (repeat the first frame) with: "ffmpeg -i input.mov -ss 00:01:0 output.mov". But I don't know how to append a tail to the video (repeat the last frame for, e.g., 1 minute)
[01:31] <whiplash> hmm, i never noticed these external libraries on zeranoe's page before, maybe there's some that cygwin ports doesn't cover
[01:33] <llogan> efs071: if you want to use ffmpeg you will have to create a cat-able movie of the frame with the duration you want it to appear.
[01:33] <llogan> there is no "frame hold" option as far as i know
[01:34] <llogan> as in: ffmpeg -loop 1 -r ntsc -i input.png -t 5 -c:v mpeg2video -q:v 2 output.mpg
[01:34] <llogan> or similar
[01:34] <llogan> then continue as shown in the faq
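llogan's two pieces can be combined into a sketch like the following (the seek time and file names are placeholders; `<time-of-last-frame>` must be filled in by hand):

```shell
# Sketch of the "frame hold" workaround: grab the last frame as a still,
# then loop that still into a 60-second clip that can be concatenated
# as described in the FAQ.
ffmpeg -ss <time-of-last-frame> -i input.mov -vframes 1 last.png
ffmpeg -loop 1 -r ntsc -i last.png -t 60 -c:v mpeg2video -q:v 2 tail.mpg
```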
[01:37] <llogan> whiplash: if you believe it is wrong then consider submitting a bug report (although I don't know shit about cygwin).
[01:39] <efs071> Ok
[01:40] <efs071> But concat only work with mpeg?
[01:40] <llogan> i don't know.
[01:41] <efs071> And how can I create a video from frames but in another video format?
[01:41] <efs071> Because I need two videos with the same format to concat
[01:42] <llogan> ffmpeg -i input.mp4 -vframes 1 -ss <the time of the frame you want> output.foo
[01:42] <burek> efs071, also take a look at http://ffmpeg.org/trac/ffmpeg/wiki/How%20to%20concatenate%20(join%2C%20merge)%20media%20files
[01:42] <efs071> Thanks a lot!
[01:43] <llogan> i still haven't figured out what the deal is with -q:v 1 with mpeg*. carl always says to avoid it, but doesn't 'splain.
[01:43] <whiplash> well ok, but how about building ffmpeg as a dll?
[01:44] <whiplash> do i need to install mingw?
[01:44] <llogan> whiplash: maybe Zeranoe will know.
[02:17] <Zeranoe> whiplash: What is your question:?
[02:18] <whiplash> compiling ffmpeg as a dll
[02:19] <Zeranoe> the whole thing as one .dll?
[02:19] <Zeranoe> whiplash: Why not just use each dll like avcodec-54.dll for example
[02:20] <whiplash> well i gotta admit i only kinda know what i'm doing here
[02:20] <whiplash> but i want to use ffmpeg as a dshow codec
[02:20] <whiplash> so don't i need an ffmpeg.dll to use regsvr32 on?
[02:20] <Zeranoe> Try downloading one of my builds, a shared one, and use the .dll files in the /bin dir
[02:22] <whiplash> hmm... which one should i be registering? all of them?
[02:22] <whiplash> wait, let me rephrase that
[02:22] <whiplash> which should i point fdshow-tryouts to?
[02:23] <Zeranoe> whiplash: IM sent.
[05:25] <mishehu> greets.  I'm having a little difficulty here.  I have a file containing raw speex packets, and I need to transcode them to pcm_s16le.  I'm doing `ffmpeg -acodec libspeex -ar 16000 -ac 1 -i raw_input_file -acodec pcm_s16le -ac 1 -ar 16000 -fmt wav outputfile.wav` and it complains "rawinputfile: Invalid data found when processing input"
[05:26] <mishehu> so I think it's looking for a speex container on the stream, which doesn't exist.
[07:42] <iam8up> i have a bunch of mpeg4 videos i want to combine, at the bottom of the man page it says: ffmpeg -i test1.avi -i test2.avi -vcodec copy -acodec copy -vcodec copy -acodec copy test12.avi -newvideo -newaudio
[07:42] <iam8up> first question - do i need to have the -vcodec and -acodec in there twice?
[07:43] <iam8up> second question - i want to input a lot of files, but my syntax doesn't seem to be working (i believe it's saying no input) http://pastebin.com/qM29TAkh
[07:47] <ubitux> ffmpeg 0.6? oO you are 6 release far away you know? :)
[07:47] <ubitux> we are in ffmpeg 1.0 currently (0.7, 0.8, 0.9, 0.10, 0.11)
[07:48] <ubitux> i don't remember how -new* behave, and they are removed now
[07:48] <ubitux> "combining" videos is quite vague btw
[07:49] <ubitux> it can mean a lot of different things
[07:55] <iam8up> it's what centos/rpmforge have in their repos...
[07:55] <iam8up> i mean concatenate, one video after the other (they're named by date so i want to show 20121001.mp4 then 20121002.mp4 etc)
[08:01] <ubitux> you can use the -vf concat added in the latest version
[08:01] <ubitux> or you could use an intermediate concatenable container like mpeg
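The intermediate-container route ubitux mentions might look roughly like this (filenames taken from iam8up's description; quality settings are placeholders):

```shell
# Sketch: transcode each piece to MPEG-PS, which can be joined losslessly
# with the concat protocol, then encode the joined stream once.
ffmpeg -i 20121001.mp4 -q:v 2 part1.mpg
ffmpeg -i 20121002.mp4 -q:v 2 part2.mpg
ffmpeg -i "concat:part1.mpg|part2.mpg" -q:v 2 joined.mp4
```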
[08:01] <iam8up> ok, i'll just get the new version then
[08:01] <iam8up> thanks =)
[10:53] <danisan> hi all, I've a simple question: ffmpeg library is under LGPL v 2.1 or LGPL v 3?
[10:53] <wedekind> Hello, I have tried to install ffmpeg on an ubuntu server and followed the instructions from https://ffmpeg.org/trac/ffmpeg/wiki/UbuntuCompilationGuide.
[10:54] <Tjoppen> danisan: read LICENSE
[10:54] <Tjoppen> but 2.1 pretty much
[10:58] <wedekind> checking the compilation I see that ffmpeg still uses the "old" libvpx 0.9.5 - instead of the newly installed libvpx 1.1.0. This is because libvpx-dev is not created and the compiler uses the old version (configure fails)
[11:03] <Mavrik> danisan: it depends on what pieces you compile in
[11:04] <Mavrik> danisan: most of it is LGPL, some libraries and outside encoders are GPL, some are even non-free
[11:04] <Mavrik> danisan: you get library license displayed at the end of "./configure" call
[11:04] <Mavrik> default compilation is LGPL though
[11:16] <danisan> Mavrik: the ./configure reports: "LGPL version 2.1 or later", so the question is: what does the "or later" mean?
[11:26] <retardant> hey, what would the easiest way to generate a video with a timer on it be? i am using Subtitle() with avisynth then rendering it with ffmpeg at the moment
[11:27] <retardant> but i just wrote a python script to generate a billion Subtitle() lines in an avs
[11:27] <retardant> seems kinda ghetto
[11:36] <divVerent> ubitux: https://github.com/divVerent/ffstuff you may find this interesting... obviously lacks documentation, though
[11:36] <divVerent> basically, shell based filter graph editing... nasty, but works
[11:36] <ubitux> heh sorry yesterday i didn't have time to look at the subtitles thing, certainly for this week end
[11:36] <divVerent> sure
[11:36] <ubitux> i'll have a look :)
[11:37] <ubitux> retardant: maybe you can use the timecode feature in drawtext
[11:37] <ubitux> but that's a smpte timecode, so you have some frame rate constraints and such
[11:38] <ubitux> retardant: you can also use the current time in drawtext (aka not pts)
[11:38] <retardant> ok
[11:38] <ubitux> and you can also generate a subtitle stream yourself and burn it somehow
[11:39] <ubitux> a pts2time could be nice in the filter
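ubitux's drawtext suggestions might look like this as command lines (the font path is a placeholder, and the exact option names depend on the ffmpeg build):

```shell
# Sketch: burn an SMPTE timecode starting at 00:00:00:00, advancing at
# the stated frame rate, onto the video with the drawtext filter.
ffmpeg -i in.mp4 \
  -vf "drawtext=fontfile=/path/to/font.ttf:timecode='00\:00\:00\:00':rate=30" \
  out.mp4
```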
[11:39] <divVerent> inb4: making a pts.srt file that contains 120 minutes of srt elements, one per millisecond ;)
[11:39] <divVerent> with timecodes as text
[11:40] <divVerent> actually... it wouldn't be THAT insane to have such a thing... would probably be a fun way to see bugs in players
[11:47] <divVerent> mplayer has some... ISSUES dealing with such an insane srt file
[11:50] <retardant> i've been using this movie i made to find bugs in my rendering code
[11:50] <retardant> since there is some time bug
[11:58] <theholyduck> i should dig up that
[11:58] <theholyduck> MOST AWESOME ASS FILE OF ALL
[11:58] <theholyduck> i used to have it lying around atleast
[11:58] <theholyduck> a .ass for rendering ultra fancy karaoke effects and other stuff onto a video
[11:59] <theholyduck> would actually render with libass, just not anywhere near realtime.
[12:04] <divVerent> theholyduck: I think I have that file (or such a file) somewhere
[12:04] <divVerent> it does a dissolve effect by rendering every character 9001 times with different clip rectangles, and moving the parts around
[12:09] <theholyduck> divVerent, something like that yeah
[12:17] <retardant> how do i tell if a bit of h264 is constant or variable
[12:17] <retardant>   Duration: 00:00:15.04, start: 0.000000, bitrate: 5899 kb/s
[12:17] <retardant>     Stream #0.0(eng): Video: h264 (Baseline) (avc1 / 0x31637661), yuv420p, 1280x720 [SAR 1:1 DAR 16:9], 5721 kb/s, 30 fps, 30 tbr, 30k tbn, 60 tbc
[12:17] <retardant> i think my ffprobe is probably 900 years old
[12:17] <divVerent> there is not really such a thing as CBR H.264... or rather
[12:18] <divVerent> there is, but nobody uses it, what people tend to use where CBR is required is some buffering algorithm (keyword: VBV)
[12:18] <divVerent> which is locally VBR, but has a "globally constrained" bitrate
[12:18] <divVerent> and not sure if there are tools to detect if a file fulfills that
[12:19] <JEEBsv> I think you can get complete CBR, if the frame size is set to something static. then there's the VBV-constrained VBR with random padding that makes it CBR.
[12:19] <divVerent> sure you can :P
[12:19] <retardant> trying to hunt down a bug in my app where sometimes a user sets an in and out point to trim the video and it's miscalculating it by up to 45 seconds sometimes
[12:19] <retardant> heh which is a ridiculous margin of error
[12:19] <divVerent> but nobody really does, because it's a quality loss and most H.264-using playback devices (and standards) have published buffering requirements
[12:20] <JEEBsv> yeah
[12:20] <JEEBsv> except for some dumb broadcast places IIRC
[12:20] <JEEBsv> where you have to have nal-hrd + CBR padding
[12:20] <divVerent> sounds stupid
[12:20] <retardant> this is just for kids at a museum
[12:20] <JEEBsv> then you have some very limited encoder/decoder components used for web video etc. that use locked frame sizes
[12:20] <JEEBsv> and yes, both of these use cases are mostly dumb by now
[12:21] <JEEBsv> retardant: you are cutting by file sizes?
[12:21] <divVerent> retardant: why is that an issue in your app? do you estimate the position of a frame by just calculating file length * pos / total time length?
[12:21] <divVerent> that can go wrong quite a lot, even with the typical VBV-using H.264 encoding mode
[12:22] <divVerent> simply because having a temporarily lower bitrate is always allowed in the VBV model
[12:22] <retardant> am using avisynth to put 2 prerendered bits of h264 onto the start and the end of some h264 coming out of a camera and then overlaying some png images over their video to give it a kind of fun effect (so to speak)
[12:22] <retardant> but they get to select what they want to keep from waht they recorded
[12:22] <divVerent> note how x264 help screen doesn't even show in the examples how you would actually do CBR ;)
[12:23] <retardant> and rendering it all with ffmpeg
[12:23] <retardant> wondering if there is a better plan othre than avisynth but in my research it looked the best idea
[12:24] <divVerent> where does bitrate come into the equation there?
[12:24] <retardant> sorry i didn't originally clarify i was interested in VFR
[12:25] <divVerent> BTW, even CBR MP3 is typically not exactly CBR... silent frames are typically encoded a lot shorter even in CBR mode
[12:25] <retardant> but you guys were saying something interesting so
[12:25] <divVerent> oh, frame, not bit rate...
[12:25] <divVerent> now I see why it irritates avisynth - it has no timecode support ;)
[12:25] <divVerent> with many formats, e.g. mp4, it's absolutely not easy to verify that a file is really VFR
[12:26] <divVerent> because it only stores a denominator for the time base in the header
[12:26] <divVerent> so if you record 23.98fps NTSC, you actually have a 24000 as denominator, and the timecodes go up by 1001 each frame
[12:26] <divVerent> so to properly check if a given MP4 file is really CFR, you have to actually read every single frame's timecode
[12:26] <divVerent> to see if it is a constant amount after the previous one
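The per-frame check divVerent describes can be sketched in a few lines of shell (the `is_cfr` function name is made up for illustration; feed it one timestamp per line, e.g. dumped from the file with a tool such as ffprobe):

```shell
# is_cfr: read one per-frame timestamp per line on stdin; exit 0 if the
# frame-to-frame delta is constant (CFR), non-zero otherwise (VFR).
is_cfr() {
    awk 'NR==2 {d=$1-p} NR>2 && ($1-p)!=d {exit 1} {p=$1}'
}

# 23.98 fps NTSC in a 24000-per-second time base: ticks advance by 1001.
printf '%s\n' 0 1001 2002 3003 | is_cfr && echo CFR || echo VFR   # CFR
printf '%s\n' 0 1001 3003 4004 | is_cfr && echo CFR || echo VFR   # VFR (gap)
```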
[12:27] <divVerent> but, fear not, there is a solution
[12:27] <divVerent> you can extract timecodes to a file, then work with avisynth, and when calling e.g. x264 to encode again, you can make it import the saved timecodes back
[12:27] <divVerent> and possibly you will have to process the timecodes file by a script to e.g. put your stuff you added to the beginning
[12:28] <divVerent> --tcfile-in and --tcfile-out are the x264 options that can help you there
[12:28] <retardant> ok cool
[12:28] <divVerent> and also IIRC there is some matching avisynth support for this
[12:29] <divVerent> the file is a text file, so you should be able to do all necessary processing from your program
[12:29] <retardant> yeah i saw a plugin that generates a .ffindex file that i think does that
[12:29] <retardant> is there a notation to get ffmpeg to work like ffprobe
[12:29] <divVerent> but the thing is, you really have to assume that your input is VFR, you can't really limit it to CFR - VFR input is way too common already
[12:30] <divVerent> e.g. when using inverse telecine filters, but scenecuts happen very often or some parts aren't even telecined
[12:30] <retardant> yeah the camera spits out vfr
[12:30] <retardant> avisynth has a cry
[12:30] <divVerent> hehe
[12:30] <divVerent> why would a CAMERA output vfr... ;)
[12:30] <retardant> i think it might be the source of my cropping problem
[12:30] <retardant> well avisynth occasionally falls over saying the framerates don't match
[12:30] <retardant> but nothing in terms of the software is changing
[12:30] <divVerent> weird
[12:31] <divVerent> BTW, I wonder if you can do what you want with ffmpeg directly
[12:31] <divVerent> there is quite a bunch of filters, you can also of course overlay the image with a png image too
[12:31] <retardant> i need to overlay graphics at certain time intervals and bookend the video with 2 bits of prerendered video and of course crop the video, which is straightforward
[12:31] <divVerent> and ffmpeg is properly VFR aware (tends to need -vsync vfr, though)
[12:31] <retardant> ok
[12:32] <divVerent> not sure if you can do the overlay easily, but I'd bet you can somehow
[12:32] <retardant> i'd rather reduce the moving parts, also then i could move away from windows for the host
[12:32] <divVerent> there is a filter that can - based on timecode - accept and reject frames
[12:32] <divVerent> this MAY help, but not sure
[12:32] <retardant> seems to be a pretty sparse programming field... doing video stuff
[12:34] <divVerent> BTW, I wonder if what you are overlaying is something like possible with DVD subtitles
[12:34] <divVerent> then you could just use that format, and use "spumux" to adjust the start/end times of the overlays with it easily
[12:35] <divVerent> but even if not, I wonder if a short RGBA rawvideo you can quickly encode from a few empty frames and your overlay frames, with custom time codes, would do
[12:36] <retardant> i was looking at direct show stuff
[12:36] <retardant> because i already use it to talk to the camera
[12:36] <retardant> since ffmpeg couldn't get the right pin for the 264 stream
[12:36] <divVerent> but of course, the other way to do it would be avisynth + timecode files
[12:42] <cheeseduck> http://i.imgur.com/o1upg.png <- This is a close-up of a black-and-white video clip which appears to have weird greenish spots. It's extremely annoying and I think it must have something to do with how it was digitized.
[12:42] <cheeseduck> Because the original is completely black and white.
[12:46] <retardant> gah all my tests are to the frame accurate
[12:47] <retardant> i need to log more data i guess
[12:57] <kenanb> hi folks, is it possible to create 60fps video in h264
[12:57] <kenanb> it somehow creates the video in 25 fps  even though i enter -r 60
[12:58] <kenanb>  ffmpeg -i %04d.png -i ../foo/bar.mp3 -c:v libx264 -preset veryslow -crf 18 -r 60 -c:a copy baz.mp4
[12:59] <kenanb> this is the command i used, pardon me for pasting here if that was wrong, i pasted because it was nearly a one-liner
[12:59] <kenanb> so this somehow  outputs 25fps video, any ideas?
[13:04] <kenanb> is it possible that 1600x1200 res and 60fps was a bad idea? :)
[13:15] <kenanb> anybody there
[13:23] <durandal11707> kenanb: i think you need to specify rate for input and not for output
[13:24] <durandal11707> img2 demuxer default framerate is 25
[13:25] <kenanb> durandal11707: wups, i didn't even know such difference exists, so how should i do that
[13:25] <durandal11707> put -framerate 60 as first arguments and see if it helps
[13:26] <kenanb> will do so, thank you so much durandal11707
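durandal11707's fix applied to kenanb's original command would look roughly like this (a sketch: `-framerate` is an *input* option for the image2 demuxer, so it must come before the `-i`; `-r 60` alone only affects the output side):

```shell
# Input frame rate set on the image sequence itself, overriding the
# image2 demuxer's default of 25 fps.
ffmpeg -framerate 60 -i %04d.png -i ../foo/bar.mp3 \
       -c:v libx264 -preset veryslow -crf 18 -c:a copy baz.mp4
```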
[13:27] <cheeseduck> http://i.imgur.com/o1upg.png <- This is a close-up of a black-and-white video clip which appears to have weird greenish spots. It's extremely annoying and I think it must have something to do with how it was digitized. Because the original is completely black and white.
[13:27] <cheeseduck> Any idea what's causing this?
[13:29] <durandal11707> cheeseduck: you have cmd line that produces this?
[13:30] <cheeseduck> durandal11707: No...
[13:30] <cheeseduck> I didn't even make the video.
[13:30] <cheeseduck> Just thought that people in here have video knowledge.
[13:45] <zap0> cheeseduck, 16 bit RGB 565   is sometimes misinterpreted
[13:46] <zap0> and results in higher than average green-ness
[13:47] <cheeseduck> Higher than average?
[13:47] <cheeseduck> There is supposed to be ZERO greenness.
[13:50] <zap0> average, as in... does a 5-bit level of blue correspond to the same VALUE as 5 bits of green (which is not full green) in an rgb565 encoding, or is it supposed to correspond to the same meaning of "full", thereby being 6 bits of green?
[13:50] <zap0> then there is the issue of colour temperature being part of the weighting.
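One concrete way a green cast can creep in, illustrating zap0's point (a hedged example of the mechanism, not a diagnosis of cheeseduck's clip): the 6-bit green channel has a different full-scale value than the 5-bit red/blue, so expanding to 8 bits by plain shifting, instead of refilling the low bits, leaves green brighter than red/blue on bright greys:

```shell
# expand_naive: plain left shift (5-bit max 31 -> 248, 6-bit max 63 -> 252)
expand_naive() { echo "$(( $1 << 3 )) $(( $2 << 2 )) $(( $3 << 3 ))"; }
# expand_replicated: refill low bits from high bits so full scale maps to 255
expand_replicated() {
    echo "$(( ($1 << 3) | ($1 >> 2) )) $(( ($2 << 2) | ($2 >> 4) )) $(( ($3 << 3) | ($3 >> 2) ))"
}

# White stored as RGB565 is (31, 63, 31):
expand_naive 31 63 31        # prints: 248 252 248  -- green is brightest
expand_replicated 31 63 31   # prints: 255 255 255  -- neutral
```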
[13:53] <cheeseduck> Ugh.
[13:54] <cheeseduck> I wish there were a quick button to switch between black-and-white and colour.
[13:54] <zap0> cheeseduck, you've used a colour sensor to take this image?    what colour standard does it adhere to?
[13:54] <cheeseduck> It is very distracting when it's got "some" colours.
[13:54] <cheeseduck> zap0: I took a screenshot with my media player.
[13:54] <zap0> what colour standard does <media player> adhere to?
[13:55] <cheeseduck> No idea.
[13:55] <cheeseduck> Media Player Classic Home Cinema.
[13:55] <zap0> what format is the recording in?    is it a MP4?
[13:55] <zap0> h264?
[13:55] <zap0> what codec?
[13:56] <cheeseduck> Video: Xvid 720x560 25fps 886kbps [Video 0]
[13:56] <cheeseduck> Audio: MP3 48000Hz stereo 128kbps [Audio 1]
[13:57] <zap0> ok,  so the underlying colour format is likely: yuv 4:2:0      which means it is rather trivial for a filter to generate a black & white image from
[13:57] <zap0> you should be able to find a media player with a "black & white" filter/fx.
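Instead of hunting for a player filter, the same thing could be done once with ffmpeg itself (a sketch; filenames are placeholders): the hue filter can zero out saturation, forcing true black and white.

```shell
# Drop all chroma (saturation = 0), copy the audio through untouched.
ffmpeg -i clip.avi -vf hue=s=0 -c:a copy bw.avi
```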
[14:01] <kenanb> i have 250 pngs, i want to use the images 2,4,6... (so with framestep 2). how should i define a framestep in ffmpeg for the input?
[14:10] <kenanb> ah, i guess it is -vf "framestep 2"
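In filter syntax the step is given with an `=`, so combined with the input-rate fix the whole thing might look like this (a sketch using the filenames from kenanb's earlier command):

```shell
# Keep every 2nd frame of a 60 fps image sequence, producing 30 fps output.
ffmpeg -framerate 60 -i %04d.png -vf framestep=2 -r 30 \
       -c:v libx264 -preset veryslow -crf 18 out.mp4
```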
[14:19] <kenanb> hmm, the output is bigger than 60fps version in size
[14:19] <kenanb> that is odd
[14:22] <relaxed> kenanb: so you only want a video with 125 frames?
[14:23] <kenanb> relaxed: exactly
[14:23] <relaxed> ffmpeg by default reads at 25fps and outputs 25fps
[14:24] <relaxed> so that would give you 5 seconds of video- what does your output look like?
[14:25] <kenanb> relaxed: i rendered the images to be 60 fps, and there are 250 images in total, but 60 fps seems too high for playback at this resolution, so i want to make it 30 fps and use 1 frame for every 2 images rendered
[14:26] <kenanb> relaxed: the result seems as i expected (including the delay caused by my cpu's inability to render 1600x1200 at 60 fps :D) but the file size is somehow bigger than the 60fps version
[14:27] <relaxed> do you really need the output framesize that large?
[14:28] <kenanb> stupid client expectations :)
[14:29] <relaxed> okay, just checking :)
[14:29] <relaxed> some people see pie in the sky and have to chase it
[14:29] <kenanb> they really need to see the delay on their own machines to realize they can't use that framesize indeed
[14:29] <divVerent> is there any way to use ffmpeg filter chains with ffplay?
[14:30] <divVerent> e.g. to render DVD subtitles
[14:30] <relaxed> divVerent: I think so, ubitux would know.
[14:30] <kenanb> relaxed: so is the -vf "framestep 2" option right?
[14:30] <divVerent> I noticed that -vf "[0:0] [0:3] overlay" does not work
[14:31] <relaxed> kenanb: yes, I believe so.
[14:32] <relaxed> kenanb: though you may have to script this.
[14:32] <kenanb> because it seems right when i check video properties they seem ok, then i look at file sizes, two files are encoded with the same "-i %04d.png -i bar.mp3 -c:v libx264 -preset veryslow -crf 18 -c:a copy baz.mp4", but 60fps one is 4 mb while 30 fps one is 5mb
[14:33] <kenanb> which seems strange to me in encoding using CRF
[14:33] <kenanb> but i am an ueber-newbie in encoding, my logic is probably faulty
[14:35] <relaxed> is there a lot of motion in the video?
[14:35] <kenanb> yes
[14:36] <divVerent> it then may simply detect the motion badly at the lower fps
[14:36] <kenanb> ah, do you think because of the constant motion the 60 fps one is encoded better?
[14:36] <kenanb> i see
[14:36] <divVerent> this may happen, yes
[14:36] <relaxed> I know -crf behaves differently based on the framerate
[14:37] <divVerent> that too
[14:37] <kenanb> I never knew encoding is such a deep field
[14:37] <kenanb> respect!
[14:37] <divVerent> the other thing is, the higher fps are, the less "important" a single frame's content is
[14:37] <divVerent> which crf may also take into account (not sure if it does, but it'd make very much sense)
[14:38] <divVerent> this basically tells you: you can't say the videos have same quality just because you used the same crf value
[14:38] <divVerent> compare them visually
[14:38] <divVerent> and compare THAT to the file size ;)
[14:38] <kenanb> anyway, it doesn't really matter because 1600x1200 doesn't playback smoothly even in 30 fps, so I'll just let them see the 60 fps version and hate it
[14:39] <kenanb> :D
[14:39] <divVerent> the thing is, crf is basically black magic
[14:40] <divVerent> it is an attempt to define the "requested" visual quality to the encoder, and it does "its best" to get that done the best way possible
[14:40] <divVerent> it is not a constant quantizer, but does all sorts of stuff to improve efficiency
[14:41] <divVerent> quality still may vary between different videos, but the general intention is to make it roughly the same for all, whatever "perceived quality" is ;)
[14:41] <ubitux> < divVerent> is there any way to use ffmpeg filter chains with ffplay? // ffplay -f lavfi -i <filtergraph>
[14:41] <kenanb> i see, i guess professionals go the two-pass route to achieve what they want in the optimum file sizes and quality, right?
[14:41] <divVerent> ubitux: okay... how do I get a given stream from an input file
[14:41] <divVerent> kenanb: depends ;)
[14:41] <ubitux> divVerent: you can't use the filter complex thing, so movie and amovie source filters
[14:41] <divVerent> crf is quite comparable to the two-pass route regarding bitrate/quality ratio
[14:41] <divVerent> but 2pass allows you to actually set the target size in advance
[14:42] <divVerent> ah, I see now, movie takes a si= stream index
[14:43] <divVerent> or not, as stream_index is deprecated
[14:43] <ubitux> it's a bit tricky sometimes but it's quite useful
[14:43] <divVerent> so what is one supposed to use now?
[14:43] <ubitux> mmh? i don't think si= is deprecated
[14:44] <divVerent> "Specifies the index of the video stream to read. If the value is -1, the best suited video stream will be automatically selected. Default value is "-1". Deprecated. If the filter is called "amovie", it will select audio instead of video."
[14:44] <divVerent> says the manpage
[14:44] <ubitux> ah it's because of libav i believe
[14:44] <ubitux> they gave no alternative
[14:44] <ubitux> don't worry about that.
[14:44] <divVerent> haha, so libav can just deprecate a ffmpeg feature without replacement?
[14:45] <divVerent> what kind of nonsense is that
[14:45] <divVerent> and I mean, deprecate even in ffmpeg
[14:45] <ubitux> it's "deprecated", or more correctly useless, in the conversion tool
[14:45] <ubitux> avconv for them
[14:45] <divVerent> AH, now I see, wait
[14:45] <divVerent> there IS an alternative
[14:45] <divVerent> you can use streams= too
[14:45] <divVerent> which uses stream specifiers
[14:45] <ubitux> huh?
[14:45] <divVerent> so stream_index indeed is deprecated
[14:46] <ubitux> oh ok
[14:46] <ubitux> the streams specifiers
[14:46] <divVerent> movie=foo.avi:streams=1+2
[14:46] <divVerent> would read streams 1 and 2
[14:46] <ubitux> right ok, i was wrongly bad mouthing then
[14:46] <ubitux> my bad :)
[14:46] <divVerent> hehe, you'll find many other good reasons ;)
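Putting ubitux's pointer and the movie-source discussion together, an ffplay invocation might look like this (a sketch; `foo.avi` is a placeholder, and `dv` is the movie source's "best video stream" specifier):

```shell
# ffplay takes no -filter_complex, but a lavfi input graph built on the
# movie source filter achieves the same effect: play foo.avi through a
# filter chain (here, desaturated as an example).
ffplay -f lavfi "movie=foo.avi:streams=dv, hue=s=0"
```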
[14:47] <divVerent> the part I like least about libav are the borderline copyright violations of ffmpeg
[14:48] <divVerent> http://blog.pkh.me/p/13-the-ffmpeg-libav-situation.html - basically, the taking over of commits (token-wise unchanged, only reformatting) while not attributing the original author
[14:49] <crashd> hi all, I have a QT (prores 422 HQ) with 1 'stream' or track of audio that has 14 discrete channels within it. Is there a one-line ffmpeg solution that will take the 14 channels, split them out and turn them into 14 discrete 'streams' or tracks, keeping the video stream as is?
[14:55] <relaxed> crashd: look at channelsplit in the man page
[14:55] <ubitux> divVerent: yeah, i wrote that article, i'm quite aware of the problem ;)
[14:56] <divVerent> ah, that's you :P
[14:57] Action: relaxed forks ubitux 
[14:57] <divVerent> I am waiting for them to do that on a somewhat bigger commit
[14:57] <divVerent> which "likely" constitutes copyright
[14:57] <divVerent> and then libav being mentioned on the ffmpeg hall of shame ;)
[14:57] <ubitux> our HoS page is unmaintained :(
[14:57] <divVerent> and down
[14:58] <divVerent> we need more such pages...
[14:59] <divVerent> piratebay legal threats got no updates either for a while, was quite entertaining while it lasted
[14:59] <crashd> relaxed: thanks, im not using the 1.0 release so it wasn't coming up in any man pages. i'll get it upgraded. thanks again!
[15:19] <crashd> relaxed: does channel_layout support non-standard audio channel layouts, eg: not 5.1, 7.1, stereo etc..?
[15:25] <relaxed> crashd: I'm not sure. Give it a whirl.
[15:37] <crashd> ah, looks like we can only use pre-defined 'channel_layouts' which only seem to go up to 8 total chans; would need to create a patched version to support some arbitrary 14 track layout. thanks anyway relaxed!
[15:42] <relaxed> crashd: sox may be able to do it
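For a layout channelsplit does know, the usage relaxed points to looks roughly like this (a sketch with stereo; crashd's 14-channel case exceeds the named layouts, and the filenames are placeholders):

```shell
# Split one stereo track into two mono streams while copying the video.
ffmpeg -i in.mov -filter_complex \
  "[0:a]channelsplit=channel_layout=stereo[left][right]" \
  -map 0:v -c:v copy -map "[left]" -map "[right]" out.mov
```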
[16:40] <Spideru> burek: I have one more question about ffmpeg ---> ffserver <--- ffplay and is about ffm. Is there a way to use - for example - RTMPS over ffm?
[16:40] <burek> just a sec
[16:40] <Spideru> from ffmpeg to ffserver
[16:40] <Spideru> sure
[16:41] <Spideru> (now is working like a charm with pcm :) )
[18:20] <amstan> any idea why android refuses to accept my video? i made it with: ffmpeg -r 12 -i %06d.png -sameq -vcodec libx264 -preset fast -crf 25 simulator.mp4
[18:21] <amstan> they're about 1700 pngs made at 1024x768
[18:21] <amstan> it works fine in vlc and dragonplayer(kde default media thingy)
[18:30] <burek> amstan, why -sameq
[18:31] <burek> btw, try with global headers
[18:31] <burek> and qt-faststart tool
[18:31] <burek> also read about -profile
[18:31] <burek> it might help
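Folding burek's hints into amstan's command might give something like this (a sketch: `-sameq` dropped since it conflicts with `-crf`; many Android decoders of that era want Baseline profile plus yuv420p, and qt-faststart moves the moov atom up front):

```shell
ffmpeg -r 12 -i %06d.png -c:v libx264 -profile:v baseline -pix_fmt yuv420p \
       -preset fast -crf 25 simulator.mp4
qt-faststart simulator.mp4 simulator-streamable.mp4
```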
[18:44] <stooge3> I have interlaced video tapes from a Sony Digital8 camcorder.  Scenes with panning show fringing.  How can I best deinterlace the videos and put them on DVD?
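A hedged one-liner for stooge3's question, combining the yadif deinterlacer with the DVD target preset (the input filename is a placeholder, and whether to deinterlace at all for a DVD target is itself debatable, since DVDs can carry interlaced video):

```shell
ffmpeg -i tape.avi -vf yadif -target ntsc-dvd out.mpg
```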
[18:47] <relaxed> burek: -sameq was removed in git, so one day it will trouble us no more :)
[18:47] <burek> thats the great news :)
[18:48] <relaxed> of course we're talking years, haha
[18:48] <burek> oh..
[18:48] <burek> :D
[18:50] <relaxed> The interweb is full of examples using it, so keep your trigger finger ready.
[18:51] <relaxed> Hmm, maybe it could be backported to the earlier branches.
[18:52] <burek> that would be great if possible
[18:52] <burek> but I believe that, as soon as we provide official deb packages somehow, people will just update
[18:52] <burek> and that's it
[18:53] <burek> which reminds me :)
[18:53] <burek> ping ubitux :)
[18:53] <burek> priv? :)
[18:54] <ubitux> mmh?
[18:54] <burek> can I ask you something in PM
[18:54] <ubitux> just ask in pm...
[18:54] <ubitux> :p
[19:02] <iam8up> i'm trying to concatenate mpeg4 files but i'm not having much luck - http://pastebin.com/qJPvLSs5
[19:02] <iam8up> i also tried it with -f mpg4 which didn't change the outcome
[19:03] <Diogo> hi
[19:03] <Diogo> anyone works with mediaroom IPTV plataform?
[19:03] <Diogo> this is possible to stream consume live from ffserver?
[19:03] <relaxed> iam8up: use MP4Box's -cat
[19:05] <relaxed> iam8up: MP4Box -cat 1.mp4 -cat 2.mp4 -new combined.mp4
[19:06] <iam8up> is there any way to make the input easier?  i've got dozens of files
[19:06] <iam8up> say -cat *.mp4  or accept from - with ls|sort -n
[19:07] <burek> iam8up, you can't just cat *.mp4
[19:07] <burek> it will concatenate file headers too
[19:07] <burek> and render it unusable
[19:09] <burek> i mean, some players might recognize it and play it without problem
[19:09] <burek> but others can fail, so..
[19:09] <burek> the correct way is to re-encode all your videos and produce one big output
[19:09] <iam8up> ok i gotcha, just looking for a solution other than typing -cat 1.mp4 -cat 2.mp4 all the way up to 30 or 50 or whatever every month
[19:10] <burek> man MP4Box :)
[19:10] <burek> see if there is an option to provide a file list from a file
[19:10] <iam8up> just doing svn now
[19:10] <iam8up> need to get the packages, no binary for centos
[19:28] <iam8up> there is not, and you can only have 20 -cat per command
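iam8up's batch problem can be sketched in shell: build the `-cat` arguments from a numerically sorted listing instead of typing them by hand. MP4Box itself is not invoked below; the directory and file names are invented placeholders:

```shell
# Demo in a throwaway directory with empty stand-in files.
dir=$(mktemp -d)
cd "$dir"
touch 1.mp4 2.mp4 10.mp4          # stand-ins for the real clips
args=""
for f in $(ls | sort -n); do      # sort -n keeps 2.mp4 before 10.mp4
    args="$args -cat $f"
done
echo "MP4Box$args -new combined.mp4"
```

Given the 20-`-cat` limit mentioned above, a real run over dozens of files would split the list into chunks and concatenate the intermediate outputs.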
[19:33] <iam8up> relaxed, thanks a bunch for the tip
[19:35] <iam8up> very odd result, it opens up 5 windows in VLC?
[19:42] <relaxed> try ffplay
[19:47] <iam8up> i don't have any machine with linux and a monitor
[19:47] <iam8up> i used the win32 build of mp4box and it made the video just fine, something is wrong in my command
[19:48] <relaxed> yeah, I've never heard of that happening.
[20:45] <DelphiWorld> hey everyone
[20:45] Action: DelphiWorld yells at burek
[20:45] <DelphiWorld> please, how do I build a Shoutcast-compatible output stream using ffserver?
[20:45] <DelphiWorld> what format should i choose
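A hedged sketch of what DelphiWorld may be after: an ffserver.conf `<Stream>` block serving plain MP3 over HTTP, which most Shoutcast-style clients will play. The feed name, mount point, bitrate, and sample rate are assumptions, and ffserver does not emit real ICY/Shoutcast metadata:

```
<Stream radio.mp3>
Feed feed1.ffm
Format mp3
AudioCodec libmp3lame
AudioBitRate 128
AudioChannels 2
AudioSampleRate 44100
NoVideo
</Stream>
```

Listeners would then point their player at http://host:8090/radio.mp3 (assuming the default ffserver port).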
[21:00] <DelphiWorld> hi Sast
[21:00] <DelphiWorld> can you look at some bug in ffm?
[21:01] <DelphiWorld> or michaelni
[21:08] <llogan> DelphiWorld: if you have a bug, search the bug reports. if it is not listed, then submit a bug report.
[21:09] <DelphiWorld> llogan: #ffmpeg-devel ;-)
[21:17] <saschagehlich> hey, i just compiled rtmpdump v2.4 from git, but it still gives me "WARNING: HandShake: Type mismatch: client sent 6, server answered 9"
[21:26] <llogan> saschagehlich: without context it is hard for people to get an idea of what you're doing
[21:27] <saschagehlich> llogan: okay
[21:27] <saschagehlich> https://gist.github.com/3874907 this should be enough context?
[21:27] <llogan> meaning you need to show your commands or whatever so people can easily duplicate the issue
[21:28] <llogan> hm, i assumed you were using librtmp via ffmpeg.
[21:28] <llogan> this isn't really a support channel for rtmpdump, but someone here may know more
[21:29] <DelphiWorld> llogan: me or saschagehlich ?
[21:29] <saschagehlich> is there an rtmpdump support channel? I thought it's developed by the ffmpeg team since it's hosted at git.ffmpeg.org
[21:29] <saschagehlich> but okay, I'll try to compile the latest ffmpeg on my machine, give me a sec
[21:30] <saschagehlich> what would be the respective ffmpeg command?
[21:32] <llogan> you can try the mailing list: https://lists.mplayerhq.hu/mailman/listinfo/rtmpdump
[21:33] <llogan> or this forum: http://stream-recorder.com/forum/rtmpdump-f54.html
[21:45] <saschagehlich> okay, ffmpeg compilation seems to be going well until a certain point when it totally breaks https://raw.github.com/gist/4bd047b2bb68afce408c/028013307aba221c378e501a7c39e80e4e51e938/gistfile1.txt (scroll down)
[21:46] <llogan> saschagehlich: what was your ./configure?
[21:47] <saschagehlich> llogan: https://gist.github.com/a250be7a3bb2866452c5
[21:48] <llogan> this is ffmpeg from git?
[21:49] <saschagehlich> yup
[21:49] <saschagehlich> master branch
[21:49] <saschagehlich> commit 313b40e
[21:51] <llogan> works for me.
[21:52] <saschagehlich> any tips?
[21:53] <saschagehlich> i mean it looks like gcc is totally broken here, it sees syntax errors where there are none
[21:55] <llogan> ppc?
[21:56] <saschagehlich> nah intel
[21:57] <saschagehlich> checked out v0.6.1 - works
[21:57] <llogan> duh. i forgot about your configure...it shows arch
[21:58] <llogan> sounds like a regression to me and should be reported
[21:58] <saschagehlich> I'm checking out the 1.0 release… if that one works, I'm gonna report it immediately
[21:59] <llogan> is this os x?
[22:00] <saschagehlich> yup
[22:00] <saschagehlich> mountain lion
[22:00] <saschagehlich> 10.8
[22:00] <llogan> i'll have access to one of those later today. maybe i'll try to duplicate the issue.
[22:02] <saschagehlich> alright
[22:02] <saschagehlich> 1.0 release: same issue… trying 0.9
[22:04] <DelphiWorld> DUUUUUUUUDES
[22:04] <DelphiWorld> i am happy now llogan
[22:04] <DelphiWorld> llogan: let me pb my new file
[22:05] <llogan> DelphiWorld: glad you figured it out.
[22:05] <DelphiWorld> llogan: lovely, let me share
[22:05] <llogan> also looks like burek made a ffserver guide: https://ffmpeg.org/trac/ffmpeg/wiki/Streaming%20media%20with%20ffserver
[22:06] <llogan> saschagehlich: a git-bisect would do better to pinpoint the commit that introduced the issue
[22:06] <saschagehlich> llogan: never tried git-bisect, but will do now
[22:07] <DelphiWorld> llogan: http://dpaste.de/yB0Gd/
[22:07] <DelphiWorld> saschagehlich: tell me what you're trying to do
[22:07] <saschagehlich> DelphiWorld: I'm trying to find a bug in ffmpeg that leads to compilation issues on os x mountain lion
[22:08] <DelphiWorld> saschagehlich: ah, out of my knowledge
[22:08] <saschagehlich> :D been using git for 3 years now, but that's something I never touched
[22:08] <DelphiWorld> ubitux: http://dpaste.de/yB0Gd/
[22:11] <DelphiWorld> llogan: thought ?
[22:11] <llogan> i know nothing of ffserver
[22:16] <DelphiWorld> llogan: just read and compare what I changed :-P
[22:16] <DelphiWorld> llogan: my issue is solved
[22:24] <saschagehlich> omfg git bisect is awesome :O
[22:31] <saschagehlich> and that's the point where committing in small steps becomes awful...
[00:00] --- Fri Oct 12 2012


More information about the Ffmpeg-devel-irc mailing list