[Ffmpeg-devel-irc] ffmpeg.log.20140419

burek burek021 at gmail.com
Sun Apr 20 02:05:01 CEST 2014


[01:44] <llogan> blippyp: what distro?
[01:46] <llogan> jonascj: that 0.8.10 is probably not from FFmpeg, but from a fork
[01:53] <Jim3> is FFmpeg's experimental AAC encoder safe from all the license BS?  it doesn't seem to be excluded when you don't have the --non-free option
[01:58] <JEEB> Jim3, it's yer usual LGPL
[01:58] <JEEB> so yes, it is exempt from the software license side of licensing problems
[02:12] <iive> Jim3: the non-free option actually enables everything, despite the license. the result however cannot be distributed.
[02:12] <Jim3> yea, I was just checking out the experimental AAC encoder specifically for distribution
[02:12] <Jim3> (also I appreciate the answers, thank you)
[03:48] <suzaru> can ffmpeg be used as an interactive encoder?
[03:50] <suzaru> i guess is there a way to interact with the process, like to tell it to stop the transcode at a particular timestamp in the file
[03:51] <klaxa> you can tell ffmpeg to encode a specific segment of a video
[03:51] <klaxa> but i don't think you can do that interactively other than stopping it
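For reference, cutting a specific segment non-interactively looks roughly like this (a sketch; the -ss start time and -t duration are placeholder values):

    ffmpeg -ss 00:01:30 -t 00:00:20 -i input.mp4 -c copy segment.mp4

With -c copy nothing is re-encoded, so the cut snaps to nearby keyframes; drop -c copy for a frame-accurate cut at the cost of a re-encode.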
[06:48] <damon> If I create an app that calls ffmpeg through command line/shell scripting, what do I need to do legally?
[06:50] <suzaru> send your licensing dues to me
[06:50] <damon> ... Seriously.  Anyone?
[07:01] <jonascj> damon: look at the ffmpeg license. It should state all you need. (I do not know whether it is apache, gpl, bsd or etc. licensed)
[07:02] <jonascj> damon: according to the license page this is the license they use: http://www.gnu.org/licenses/old-licenses/lgpl-2.1.html
[07:02] <jonascj> so just do as it says
[07:19] <damon> @jonascj thanks for the link.  I've read through that, but I'm confused about whether including their executable and calling it through shell scripting is the same as using their library.  I want to give them as much credit as possible; I'm more curious whether I need to make my application open source or not.  It will be a free program so I would like to keep the legal costs down, if not free as
[07:19] <damon> possible.  Here's the part I'm concerned with "When a program is linked with a library, whether statically or using a shared library, the combination of the two is legally speaking a combined work, a derivative of the original library. The ordinary General Public License therefore permits such linking only if the entire combination fits its criteria of freedom. The Lesser General Public License
[07:19] <damon> permits more lax criteria for linking other code with the library."  Also shortly followed after that: "For example, on rare occasions, there may be a special need to encourage the widest possible use of a certain library, so that it becomes a de-facto standard. To achieve this, non-free programs must be allowed to use the library. A more frequent case is that a free library does the same job
[07:19] <damon> as widely used non-free libraries. In this case, there is little to gain by limiting the free library to free software only, so we use the Lesser General Public License." which since FFMPEG is a standard in media handling could I possibly charge for my software, given that it becomes worthwhile?
[07:31] <mmint> invoking a command line is not the same as linking to a library
[07:31] <mmint> If your program just executes "ffmpeg ...." in the system's shell, you have not linked to the library
[07:31] <jonascj> mmint: but if he distributes the ffmpeg binary alongside his application
[07:32] <mmint> Yes, distributing ffmpeg is still controlled by the license
[07:32] <jonascj> mmint: but he could require his users to install ffmpeg and make it available on their path.
[07:33] <mmint> That's probably easiest, if the users are comfortable with installing software
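To make the distinction concrete, an application in this situation typically does nothing more than spawn the ffmpeg binary as an external process, along the lines of the hypothetical wrapper below (filenames and options are placeholders); no FFmpeg library code is compiled or linked into the application itself:

    #!/bin/sh
    # convert.sh - hand the whole job to whatever ffmpeg is on the user's PATH
    ffmpeg -i "$1" -c:v libx264 -crf 23 -c:a copy "$2"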
[07:38] <jonascj> Do I explicitly need to enable video4linux2 when building ffmpeg?
[07:41] <mmint> yes. It's disabled by default.
[07:41] <mmint> --enable-libv4l2
[07:44] <jonascj> mmint: would you think it was enabled by linux package maintainers? (it is linux after all, v4l2 would make sense to have)
[07:46] <jonascj> because it seems to be available in the ffmpeg binaries I've tried around on different linux distros. I am building ffmpeg for the first time now with some streaming in mind and I need v4l2, so good to know I should enable it manually :)
[07:47] <mmint> Actually, I don't see it in the configuration of the binary distributed with debian, but I remember using it with a v4l2 device before
[07:47] <jonascj> hmm
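For reference, pulling in the libv4l2 wrapper at build time looks roughly like this (a sketch; it assumes the libv4l development package, e.g. libv4l-dev on Debian/Ubuntu, is already installed). Note that the native video4linux2 input device used by "-f video4linux2" is normally built automatically when the kernel V4L2 headers are present; --enable-libv4l2 only adds the optional libv4l conversion library on top of that:

    ./configure --enable-libv4l2
    make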
[08:17] <jonascj> I have compiled ffmpeg without --enable-libv4l2 and when I do "/tmp/ffmpeg -v verbose -r 10 -s 320x240 -pix_fmt yuvj420p -f video4linux2 -i /dev/video0 http://wuhtzu.dk:8090/webcam.ffm"[mjpeg @ 0x2e62a50] Specified pixel format yuyv422 is invalid or not supported"
[08:17] <jonascj> * when I do "/tmp/ffmpeg -v verbose -r 10 -s 320x240 -pix_fmt yuvj420p -f video4linux2 -i /dev/video0 http://wuhtzu.dk:8090/webcam.ffm
[08:18] <jonascj> I get this error: "[mjpeg @ 0x2e62a50] Specified pixel format yuyv422 is invalid or not supported". Is that because video4linux2 support isn't there or is it something else?
[08:18] <jonascj> sorry for the horrible paste containing a newline (without the newline I think it would have been okay to paste it here rather than pastebinning it)
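Before fighting with -pix_fmt it can help to ask the device what it actually offers; the v4l2 input device has a listing option for exactly this (a sketch, using the same /dev/video0 as above):

    ffmpeg -f video4linux2 -list_formats all -i /dev/video0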
[08:48] <blippyp> llogan: arch
[09:27] <blippyp> jonascj: run ffmpeg -pix_fmts - do you see yuvj420p in the list (it's on mine)? if you do then you're capable of the format - I would guess it's that you're lacking v4l2, also you likely don't need to manually set it (I don't, and I can record without any problems from multiple cameras). But each situation may be different...
[09:30] <jonascj> this is the output of the pix_fmts command containing *yuv*: http://pastebin.com/2V7kXu6q
[09:30] <jonascj> So could this really be me lacking v4l2? The error isn't very descriptive if that is the case
[09:32] <blippyp> I'm no expert, but I would say yes - clearly your command will not work if you do not have v4l2 enabled - that it may be giving you an unhelpful error isn't impossible... just re-compile with v4l2 enabled, you clearly want it - you will need it, without it your testing is just a waste of time...
[09:34] <jonascj> blippyp: my "problem" is I am trying to crosscompile it for an arm device (raspberry pi) and for my toolchain --enable-libv4l2 does not work out of the box. I have to find the v4l2 library somewhere or something similar
[09:34] <blippyp> ah
[09:35] <jonascj> I just find it funny that guides all over the internet for webcam streaming and linux (even raspberrypi) say "ffmpeg -f video4linux2..." and none of them mention the need for --enable-libvideo4linux2. Someone in here said that it was disabled by default.
[09:36] <blippyp> it is - you have to include it (don't know if it makes a difference, but on my compiles I did today - I was using --enable-libv4l2) - maybe that's your problem???
[09:37] <blippyp> I've never seen a reference for libvideo4linux2 (but I'm no expert on this)
[09:38] <jonascj> reference? For the video4linux2 library, or for the "-f v4l2"?
[09:38] <blippyp> --enable-video4linux2
[09:38] <blippyp> I've always seen it as --enable-v4l2
[09:39] <jonascj> blippyp: oh yeah, if I said --enable-libvideo4linux2 I wasn't thinking. It is --enable-libv4l2 and that is what I am trying to make work :)
[09:41] <blippyp> k - then unfortunately I got nothing for ya then...  :( - I'd google specifically for issues related to v4l2 and rasp-pi - someone else must have had the same issue as you...
[09:41] <jonascj> blippyp: yeah, I'll go to #raspberrypi and see if anyone has the same problem
[09:42] <jonascj> (not that I've not already been there :P)
[09:44] <blippyp> re-posting my question from earlier in hopes that someone may be able to help:
[09:44] <blippyp> I have a long pre-typed reference to my problem: But in hopes of saving you from a long read, I believe the root of my problem is the WARNING: library configuration mismatch (but do not know how to solve this): http://pastebin.com/3nDCiq3k
[09:44] <blippyp> I am attempting to compile ffmpeg with frei0r enabled. Everything seems fine during the install, but I receive a segmentation fault when I try to use it and I get the above error when I run ffmpeg by itself. At the top of the pastebin is the exact ./configure parameters I used. I did not run make install, running ffmpeg directly from my git folder. Does anyone have any advice for a lost soul? Do you want me to paste the rest of the details 
[09:46] <blippyp> llogan requested which distro I was using, so I'll re-post that as well: Arch - I updated it with pacman today.
[09:49] <tm512> is there a way to make yuv420 look less bad?
[09:50] <tm512> when the input is rgb
[09:51] <blippyp> I assume you're referring to the colors? - YUV is 12 bit, whereas rgb is 24 - you can't squeeze money from a rock...  :(
[09:52] <tm512> well the colors end up looking washed out in some cases
[09:52] <blippyp> I would check to see if the source can use anything better than yuv420 - I think that would be your only option....
[09:53] <tm512> the video will eventually need to be uploaded to youtube
[09:53] <blippyp> yeah, people won't like that on youtube - can you post an example of what you mean by 'washed-out'
[09:54] <blippyp> take a screenshot and put it on imgur.com or something...
[09:54] <blippyp> or if you've already uploaded something to youtube, a link to that would be fine (perhaps with a time to go to for a reference)
[09:55] <tm512> I'll do a screen cap
[09:55] <blippyp> that's cool
[09:57] <tm512> http://i.imgur.com/hyZLhia.png
[09:58] <tm512> compare to http://i.imgur.com/SyByNE3.png
[09:59] <blippyp> I don't see any noticeable difference???
[09:59] <tm512> and that is "lossless"
[09:59] <tm512> the colors are a lot less saturated
[10:00] <blippyp> if saturation is what's bothering you - try this:
[10:00] <blippyp> once you've loaded the video - run a filter like this:
[10:00] <blippyp> -vf "format=rgb24, hue=s=2"
[10:00] <blippyp> change the value of s to change the saturation levels
[10:01] <blippyp> maybe that will solve what's bothering you
[10:01] <blippyp> for the difference in saturation in your screenshots - I'd say a value of 1 (or lower?) might be all you want
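Put together as one command, the suggestion above would look something like this (a sketch; the filenames are placeholders and s=1.3 is just an arbitrary starting value to adjust by eye):

    ffmpeg -i input.mkv -vf "format=rgb24,hue=s=1.3" -c:v libx264 -crf 17 -c:a copy saturated.mkv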
[10:01] <jonascj> blippyp: when you said you compiled with --enable-libv4l2, what system did you do that on? I've just cloned ffmpeg via git and configured with only one option (--enable-libv4l2) and the configuration does not succeed. It says libv4l2 was not found.
[10:02] <blippyp> jonascj: I did the exact same thing today, and it worked fine for me...???
[10:02] <jonascj> blippyp: * and this is my trying to compile it on my laptop with linux (just for testing, the pixelformat error is the same for me whether I use my laptop to feed or the raspberrypi)
[10:02] <blippyp> jonascj: I already had the latest ffmpeg installed from my repository though - maybe that made a difference...
[10:03] <tm512> it's less of a problem in stuff that has more color and motion, I dunno
[10:04] <blippyp> jonascj: let me re-state that - it didn't work fine for me (if you see my previous question above) - but I didn't receive any errors during my compile or .configure stages... so I assume it worked fine (I think I'm having other issues)
[10:04] <tm512> I'd rather not throw away too much color information to begin with
[10:05] <tm512> and I was wondering if the conversion itself was something that could be tweaked
[10:06] <blippyp> tm512: you're using a format that's 12 bit (4096 colors) as opposed to 24 bit (about 16.7 million colors) - you either need to change your pixel format while capturing or put up with the loss in colors...
[10:06] <blippyp> I don't know if you can 'tweak' the conversion - either way you're still putting up with a huge loss of colors
[10:07] <blippyp> I thought you were capturing from a device (like a camera), but if you're capturing your screen, then why not use rgb24 to begin with?
[10:07] <jonascj> I hate these things. I must be doing something really wrong. All those guides on the forums and blogs for webcam streaming with ffmpeg/ffserver would mention v4l2 if it was a problem. It must work out of the box for most linux distros etc. It is kind of conflicting with the fact that it is disabled by default when building ffmpeg....
[10:08] <blippyp> if it's a speed issue, use a lossless codec (or near lossless, like libx264 with a crf of 0 and a preset of ultrafast) - this will record fast - you will likely want to re-encode it afterwards, but the video will look great.
[10:08] <tm512> well no point if it just has to go to 12 bit and there's no way I can stop it
[10:08] <blippyp> jonascj: what distro are you using?
[10:08] <blippyp> tm512: how are you capturing it? why do you have to use yuv420?
[10:08] <tm512> I'm using x264 with -qp 0 and -preset ultrafast
[10:09] <tm512> I'm using x11grab
[10:09] <blippyp> then why are you using yuv420p?
[10:10] <tm512> the destination is youtube, can I even use anything less lossy?
[10:10] <jonascj> blippyp: they are all debian based. I have Raspbian (debian ported for raspberrypi) and ubuntu desktop 12.04
[10:10] <blippyp> youtube won't destroy a video like that - something else is wrong
[10:11] <blippyp> tm512 - show me your exact command for capturing
[10:11] <tm512> it's part of a script
[10:11] <blippyp> I'm not very familiar with debian - which debian are you using? squeeze, wheezy, etc...
[10:12] <tm512> ffmpeg -y -rtbufsize 1500M -f alsa -ac 2 -ar 44100 -i pulse -acodec flac "$1".flac -f x11grab -s "${w}x${h}" -i :0.0+"${x},${y}" -an -vcodec libx264 -preset ultrafast -qp 0 -vsync cfr -r 30 -threads 0 -pix_fmt yuv420p "$1".mp4
[10:12] <blippyp> remove the -pix_fmt yuv420p from your command
[10:13] <blippyp> re-test it an upload it to youtube - it will likely look fine
[10:13] <jonascj> blippyp: I am probably losing my internet connectivity soon. So I'm not quitting, just being disconnected :)
[10:13] <blippyp> if this causes some other issue for you - let me know what's bothering you about removing that
[10:13] <blippyp> jonascj: alright
[10:14] <jonascj> blippyp: on ubuntu it turned out that I needed to install libv4l-dev. Now I can configure with that option. I am excited to see if that fixes the original pixel format error or if it is something else :)
[10:14] <tm512> heh, mplayer just converts it down to 420
[10:15] <blippyp> jonascj: yeah, I checked my packages in arch, but I couldn't find anything specific to v4l2 installed for me - not sure where I'm getting it from
[10:15] <tm512> it still looks less colorful
[10:15] <tm512> but it is 444p now
[10:16] <blippyp> tm512 - that's good, now you're using 24 bit
[10:16] <blippyp> do you still want more colors?
[10:16] <jonascj> blippyp: I have arch on my desktop at home, I will try it there also if I do not succeed here on debian-derivatives :)
[10:16] <blippyp> you can use a different format still
[10:17] <blippyp> try setting it to yuv422p16le (32 bit)
[10:17] <blippyp> oops
[10:18] <blippyp> nope, that was right - or yuv444p16le (for 48 bit)
[10:18] <tm512> it doesn't look much better at all
[10:18] <blippyp> or yuva444p16le for 64 bit (but that may just add the ability for alpha channels - you might not notice a difference between this and 48 bit - not sure
[10:18] <blippyp> in arch it works as is
[10:19] <tm512> how do I set it to rgb to check?
[10:19] <blippyp> jonascj: in arch it works as is - you won't need to do anything for it - get the latest from the AUR - I've been using it for weeks, it's great - but even the one in the default repository will also capture using v4l2 without any modifications required
[10:20] <blippyp> tm512: you can convert it if you want: ffmpeg -i video.mkv -vf "format=rgb24" -c:v libx264 -crf 0 -preset ultrafast -c:a copy rgb.mkv
[10:21] <tm512> but it's already lost the color data then
[10:21] <blippyp> tm512, but I doubt you'll notice any difference - in fact it should technically be worse - they're both already using the same bit count for colors
[10:21] <jonascj> blippyp: If i were a linux package maintainer I would enable v4l2. video4LINUX2 sounds like a thing you want when you are on a linux os... So it sounds like either the arch package maintainer or the arch AUR is more sensible than the debian/ubuntu package maintainer...
[10:22] <blippyp> yes, like I said you will need to change to a better pixel format (like I mentioned earlier) and re-capture - It will capture more colors for you
[10:22] <tm512> it looks the same with 444p
[10:23] <blippyp> jonascj: yes - that's exactly it - and one of the biggest reasons people use arch - I just wish they would also include frei0r - I'm in the same boat as you - only I want a different filter enabled, but I can't get it working either...
[10:23] <blippyp> tm512 - don't know what to tell you - you're using different settings than what I use for encoding - maybe your codec is downgrading it
[10:24] <blippyp> tm512: change your encoding settings to only use -c:v libx264 -crf 0 -preset ultrafast
[10:24] <blippyp> tm512: this will keep a near exact duplicate of what you're encoding while recording it very fast - but your final file size will be quite large
[10:25] <tm512> I'm already recording it lossless
[10:25] <tm512> a possible explanation is just that the colors in my terminal just don't translate well
[10:26] <blippyp> tm512: when you're finished recording it - you might want to re-encode it to get the file size down, alternately you can increase the crf value to something higher (anything past 17 will be noticeable by the human eye - I always use 17) - point being, increase the crf value until you can't record properly at the same time (gets choppy, etc.. system slows down)... 0 will be very fast and will slow down your computer very little while capturing
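The re-encode pass being described might look roughly like this (a sketch; capture.mp4 is a placeholder name, and crf 17 is the "visually lossless" ballpark mentioned above):

    ffmpeg -i capture.mp4 -c:v libx264 -preset medium -crf 17 smaller.mp4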
[10:26] <tm512> I can't record lossy in real time
[10:27] <tm512> afterwards I encode it to a webm file
[10:27] <blippyp> why do that?
[10:27] <blippyp> youtube will do that anyways?
[10:28] <blippyp> you should always give youtube the best quality video you can
[10:29] <tm512> I'm not gonna upload lossless though
[10:29] <blippyp> you're re-encoding the video too much - by the time youtube is done with it your 'near lossless' compression you started out with is now riddled with artifacts from being converted over and over again from non-lossless codecs (fyi - even at the 'lossless' settings for x264, it's still not lossless)
[10:29] <tm512> I'm taking the output and reencoding it once before uploading
[10:30] <blippyp> maybe I'm wrong, but I'd take your initial encoding - re-encode with a crf of 16 or 17 and send that to youtube - I wouldn't go to webm, I'd leave it as a 264
[10:30] <tm512> I want lossless to begin with, then keep it lossless before a final high quality encoding prior to uploading
[10:31] <blippyp> tm512: ya, you clearly have the 'right way' of thinking about it - I just question whether switching to webm is a good move - ultimately it might not matter, but 264 is a better encoder in my experience
[10:32] <tm512> what kinds of bitrates do I want to shoot for with a 720p video?
[10:33] <blippyp> either way, I'm not sure where your color bleeding is coming from - the yuv444p16le is 48 bits - that's a shit-ton of colors - if you can encode at that format without it killing your machine, you should be getting near perfect video...
[10:33] <blippyp> tm512: depends on the length of the video
[10:33] <tm512> don't see how it really depends on that
[10:34] <tm512> I have bandwidth
[10:34] <tm512> and disk space
[10:34] <blippyp> one second
[10:35] <blippyp> https://support.google.com/youtube/answer/1722171?hl=en is this what you're looking for?
[10:36] <blippyp> youtube will 'downgrade' your video to these settings as far as I know - so if you encode the same or better, then youtube won't destroy your video any further than these settings
[10:37] <blippyp> your video never looks the same on youtube as what you create - it's a fact we all have to live with....
[10:40] <tm512> the final encode of the video I did was at about 2500kbps, I'll definitely have to bump that
[10:40] <blippyp> yeah if you're using 720p you want to double that
[10:40] <blippyp> this is likely why you're running into problems
[10:40] <tm512> no the original "lossless" has bad colors
[10:40] <tm512> not in all cases
[10:41] <tm512> just some things look washed out
[10:41] <blippyp> like I said, x264 isn't 'lossless' though
[10:41] <tm512> would yuv420p 2500 look like yuv444p 5000?
[10:42] <tm512> do I have to use higher bitrates to get the same quality now?
[10:42] <blippyp> youtube will destroy those colors for you as well, the same way - use a TRUE lossless codec like huffyuv - make a (very short) clip and upload that lossless encoding to youtube - I'm betting they will destroy it the same way
[10:43] <blippyp> not sure - there's a lot more colors in yuv444p - but like I said, youtube will downgrade your video to 5000 anyway, encode it at yuv444p - this will give you 48 bit colors - what you end up with in your encoding should be quite similar to what you see on youtube at that point...
[10:43] <tm512> you aren't understanding, the colors are bad straight away
[10:43] <blippyp> if you increase your bitrate even further - and get a better quality - youtube will still destroy the video, so there's really no point in aiming higher
[10:44] <blippyp> yes, but that's probably because of your low bitrate right now
[10:44] <tm512> it's -qp 0
[10:44] <blippyp> the encoder is probably trying to compensate
[10:44] <blippyp> try it
[10:45] <blippyp> setting qp to 0 likely has little effect when you're bottlenecking the bitrate
[10:45] <tm512> I'm not setting the bitrate
[10:46] <blippyp> oh - I thought you said you were (I didn't go look at your command again)
[10:46] <tm512> I'm setting it in the encode before uploading
[10:47] <tm512> I need to get to sleep, I'm assuming that it's just some colors having problems fitting in YUV space
[10:47] <tm512> and the conversion makes them look washed out
[10:47] <blippyp> that would be my guess
[10:47] <blippyp> good luck - hope you figure it out - but I think you're right
[10:48] <tm512> I'll start recording in a true lossless format though
[10:48] <blippyp> wait - if you have a second...
[10:48] <tm512> yea?
[10:48] <blippyp> let me find a link for you...
[10:50] <blippyp> https://www.youtube.com/watch?v=ciIHfUOn8Xw
[10:50] <blippyp> I think that's the right one
[10:51] <blippyp> at some point in that video he talks about the video colors needing to be modified...
[10:51] <tm512> alright, putting it in my watch later, thanks
[10:51] <tm512> good night
[10:51] <blippyp> this might be related to what you're complaining about - something about the colorspace being 'off' during the recordings - It's a common thing I've seen mentioned in many places - I found the exact details for it somewhere but I don't remember...
[10:52] <blippyp> alright night - good luck
[13:45] <wilco> Would anyone know why when I transpose a video purely to rotate, the output mov isn't the same duration and usually shorter than the original?
[13:51] <wilco> okay, thanks
[13:53] <wilco> http://pastebin.com/8Tn8LPb6
[13:54] <wilco> sorry, I didn't put the output in there.. hold
[13:54] <klaxa> can you somehow get ffmpeg's output?
[13:54] <klaxa> kk
[13:59] <wilco> http://pastebin.com/bWVGdSsz
[13:59] <klaxa> you are using avconv, see #libav for support
[14:00] <wilco> oh..
[14:01] <wilco> Thanks klaxa
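For comparison, the same rotation done with ffmpeg itself would be along these lines (a sketch; transpose=1 rotates 90° clockwise, and the filenames are placeholders):

    ffmpeg -i input.mov -vf transpose=1 -c:a copy rotated.mov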
[16:37] <Guest25204> hello i have a question. im piping a bunch of png's to ffmpeg and its processing fine until near the end of the pipe. at frame/png 55 it stops... nothing happens... and i have to close my binary writer to get it to continue then i get a pipe input/output error.. and it finishes off... why doesnt the pipe close gracefully?  http://pastebin.com/tiD5pBUw
[16:39] <Guest25204> code snippet: http://pastebin.com/KnFTHh6U
[16:40] <Guest25204> driving me nuts... if i dont use pipe and test using the cmd line , it works fine with a image%d.png for input... but when piping it seems to hang. like its waiting for some buffer to be dealt with. but im already reading the output stream and error stream.
[16:50] <klaxa> well... that is how it's supposed to work
[16:51] <klaxa> how is ffmpeg supposed to know there are no more images coming through the pipe unless you close the pipe?
[16:52] <Guest25204> klaxa exactly... so ive sent all 58 images... it gave error stream feedback that it processed 55
[16:52] <Guest25204> klaxa, then it hangs... no feedback for 56, 57, and 58
[16:53] <Guest25204> klaxa, so i close the writer... and it throws a pipe input/output error.... then it processed 56, 57, 58.
[16:53] <Guest25204> klaxa, how can i make it process everything i sent to it before closing my writing stream?
[16:54] <Mavrik> due to how encoders work your frames will be buffered
[16:54] <Mavrik> when you close the output it'll flush them out
[16:54] <Guest25204> ok
[16:54] <Guest25204> how do i detect it is finished?
[16:54] <klaxa> closing the stream should not raise an I/O error though
[16:54] <Guest25204> so i can process more output?
[16:54] <Mavrik> 3 frames of delay is normal
[16:54] <klaxa> maybe java or C# or whatever you are using doesn't close the pipe properly
[16:55] <Guest25204> is there no graceful stream close method to avoid the pipe error?
[16:55] <Guest25204> c#.net
[16:56] <Guest25204> if 3 frames buffered is normal, and i close my writing stream.... i need to detect it finishing... is there a logical way by looking at the state of the output stream async event? or must i parse the error stream for some text like "video:" ?
[16:57] <Guest25204> it doesnt report "finished" or anything like that in the error feedback stream
[16:58] <klaxa> the process should return 0 or something. maybe there is something like Thread.join() or similar?
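The same behaviour is easy to reproduce from a shell, which makes it clearer that the last few buffered frames only come out once stdin is closed and that ffmpeg then exits normally (a sketch; the frame names and vp8 settings are placeholders):

    cat frame_*.png | ffmpeg -f image2pipe -c:v png -i - -c:v libvpx -b:v 1M out.webm
    echo $?    # 0 once ffmpeg has flushed its delayed frames and finished the file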
[17:01] <Guest25204> hmmm i guess i can ignore the pipe error
[17:02] <Guest25204> i have a 2nd question...
[17:03] <Guest25204> question: during the output stream i am reading the video bytes into memory as ffmpeg shares them to the stream... then when its process exits i know the output stream is done too.... i write those bytes to a file with the appropriate extension for my format which is webm.
[17:04] <Guest25204> question continued: but... when i play the file, it's small and blank in chrome. so i dont know what happened during its encoding or my byte streaming to a memorystream... however when i run it via command line using the same png files as input to the video creation, the output webm is perfect
[17:06] <Guest25204> snippet: http://pastebin.com/wKQmNzwt
[17:27] <Guest25204> will bbl
[19:54] <Guest25204> anyone around?
[19:55] <Guest25204> http://pastebin.com/wKQmNzwt ... when i write this stream out to a file for a blah.webm file the webm fails to play or show any pixel data. but if i don't use the piped output and just write the output to a file directly the webm is fine
[19:59] <Guest25204> the file.writeallbytes method in c# ends up with a file about 19kb smaller. the video files are about 2.5mb with a 19kb diff. i dont know how to see if the webm is proper or missing something maybe?
[20:30] <Guest25204> its like something not being written... the output says video:2558kB , but the actual file when i dump the memorystream is 2540kb.  the command line direct file save is also 2558kb.. hmmm
[20:30] <Guest25204> like the outputstream that im reading into a memory stream isnt complete for some reason
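For comparison, capturing the muxed output through a pipe from a shell looks like this; the output format has to be stated explicitly because it cannot be guessed from a pipe, and a muxer writing to a non-seekable pipe may omit seek/index data it would otherwise go back and write, which can account for a slightly smaller, harder-to-play file (a sketch):

    cat frame_*.png | ffmpeg -f image2pipe -c:v png -i - -c:v libvpx -f webm - > piped.webm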
[20:53] <Guest25204> anyone available to help this is driving me nuts?
[21:34] <galex-713> Hello
[21:36] <galex-713> I got a wma file from some mms stream saved with mplayer, but I want to cut some part of this stream and ffmpeg just outputs a file that, when read by mplayer, is said to have no sound. Could this be related to the fact that when I read the original file, at the beginning, mplayer says it is at time 108:06:25?
[21:36] <galex-713> *the fact that
[21:42] <c_14> Nah, that's normal. That time is the complete length of the stream so far. What you can do is give us your entire ffmpeg command and complete console output.
[21:42] <galex-713> The complete length of the stream since when? (I paste the output)
[21:44] <galex-713> https://paste.galex-713.eu/?35d5740ea607b9de#CO3aLn1Qigom82P2kv+/Ji0qR+2qnw0C366M7opQR3Q=
[21:45] <galex-713> c_14: ^
[21:45] <c_14> Usually length of the stream since it started streaming.
[21:46] <galex-713> Ah ok
[21:47] <c_14> avconv is part of libav, see #libav for support
[21:47] <galex-713> oops
[21:47] <galex-713> Wait, I try with ffmpeg
[21:48] <c_14> I'm pretty sure I know what the problem is though.
[21:48] <galex-713> Ok, it works with ffmpeg
[21:48] <galex-713> \o/
[21:48] <c_14> Oh, ok.
[21:59] <Corin> Does anyone know how I could stream a video so that other people could watch?
[22:02] <blippyp_> corin: ffserver
[22:03] <Corin> blippyp_: Could you be more specific? I'm genuinely a moron.
[22:04] <blippyp_> you'd have to read the man page or go to the ffmpeg website - go to the documentation link at the top of the page - click on ffserver. It's been forever since I've done it - but it does it
[22:05] <Corin> Oh wow.
[22:05] <Corin> This seems complicated.
[22:05] <blippyp_> yeah, it is at first - but it's simple once you figure it out
[22:05] <blippyp_> if you know how to use ffmpeg, then ffserver isn't much different
[22:06] <blippyp_> I am taking a quick look to see if I can't come up with a basic example for you - but I'm not promising anything - like I said, it's been a while and I'm also busy doing other things
[22:07] <Corin> Yeah, I should be able to figure it out. It'll just take a bit.
[22:07] <Corin> Thanks.
[22:08] <blippyp_> no problem - good luck, it's kind of cool once you get it
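Just to show the shape of an ffserver setup, a minimal sketch might look like the following (the port, feed name, stream name and sizes are all placeholder values, and a real config will need tuning):

    # ffserver.conf
    HTTPPort 8090
    HTTPBindAddress 0.0.0.0
    MaxClients 10

    <Feed feed1.ffm>
      File /tmp/feed1.ffm
      FileMaxSize 5M
    </Feed>

    <Stream webcam.mjpeg>
      Feed feed1.ffm
      Format mpjpeg
      VideoFrameRate 10
      VideoSize 320x240
      NoAudio
    </Stream>

    # start the server, then push a source into the feed:
    ffserver -f ffserver.conf &
    ffmpeg -f video4linux2 -i /dev/video0 http://localhost:8090/feed1.ffm
    # viewers then open http://<host>:8090/webcam.mjpeg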
[22:10] <galex-713> Still with my file begining at 108:06:25: is there a way to directely say a begining time at ffmpeg like 108:26:57, or should I be forced to do recalculation of absolute time separatly?
[22:12] <galex-713> c_14?
[22:13] <blippyp_> galex-713: not really sure I'm understanding your problem: You have a video, that you want to take a 'chunk' out of the middle of - is that what you're trying to do?
[22:14] <galex-713> blippyp_: I still want to cut a piece of the audio file beginning at 108:06:25, but I would like to not have to do calculations separately to convert 108:06:25 to a time relative to the real beginning of the file
[22:15] <c_14> The easiest thing would be to remux the file and then do whatever you want with it. That'll fix the times. You should, however, be able to set the absolute time if you add the time you want to what the file says it starts at. Relative times should work regardless. Note that this is a big should and ymmv since I don't actually know what the ffmpeg source does for time detection.
[22:15] <Corin> blippyp_: It should work for Windows just the same as UNIX-like systems, though, right?
[22:15] <Corin> I wanna try to help a friend stream, too.
[22:15] <blippyp_> ffmpeg -s 108:06:00 -i inputfile.mp4 -s 25 -t (length of time you want to record) -vn -a:c (whatever codec you want) -sn output.wav (or whatever)
[22:16] <blippyp_> yeah, as far as I know - ffmpeg works the same in each OS
[22:16] <galex-713> ok, thanks
[22:16] <blippyp_> but I've only ever used it with linux
[22:17] <blippyp_> you run into some differences - Things like if I want to record my desktop - I use x11grab - but with windows you use something else i think. Plus it's different for using a device like say your camera I believe - but the basic commands/syntax are the same I believe...
[22:18] <blippyp_> windows uses directx I believe (forget what it's called specifically by ffmpeg)
[22:19] <blippyp_> galex-713 - notice how I have two -ss in my example (although I left out a second s in the second one - sorry - it should be -ss not -s)
[22:20] <galex-713> blippyp_: ah ok
[22:20] <blippyp_> this is because the first -ss isn't accurate... So I set it to a time frame close to where you wanted it, but not exactly - then I specified the rest of the time required in the second -ss after the input
[22:20] <galex-713> I was searching for -s option and didnt understand ^^
[22:21] <blippyp_> this will allow it to seek to where you want accurately - I'm looking at your url you gave earlier now - you're missing that in your parameters....
[22:21] <galex-713> but using -ss 108:06:00 ffmpeg says Invalid duration specification for ss: 108:06:00
[22:21] <blippyp_> yeah you specified a beginning time at 108 hours into the video
[22:21] <galex-713> yeah
[22:22] <galex-713> So I *have* to either do calculations to know whats the time from the begining of the file or remux file?
[22:22] <blippyp_> don't remux the file
[22:22] <blippyp_> not calculations required
[22:22] <galex-713> ?
[22:22] <blippyp_> you just need to know when to start and how long you want to record
[22:23] <galex-713> Yeah
[22:23] <blippyp_> the only calculation you need is the length of the recording you want
[22:23] <blippyp_> so if I wanted to record a video that began at 25 minutes and record for 3 minutes:
[22:23] <galex-713> -ss 25:00 [&] -t 3:00
[22:23] <galex-713> I know
[22:24] <blippyp_> ffmpeg -ss 00:24:30 -i input.mp4 -ss 30 -t 00:03:00 -c copy out.mp4
[22:24] <galex-713> Ah also: the original file isnt a video but an audio filee
[22:24] <galex-713> *file
[22:24] <blippyp_> that won't matter
[22:24] <galex-713> Yeah
[22:24] <galex-713> Just to correct what you said ^^
[22:25] <blippyp_> correct what?
[22:25] <galex-713> Oh, nothing, you said a video, I misread as your video
[22:25] <blippyp_> you mean because my example is an mp4 instead of mp3 or something...
[22:25] <blippyp_> no difference
[22:25] <galex-713> yeah
[22:25] <blippyp_> replace .mp4 with .mp3
[22:26] <galex-713> (actually its wma)
[22:26] <blippyp_> either way - doesn't matter...
[22:26] <blippyp_> what's m2o ?
[22:26] <blippyp_> it doesn't have an extension?
[22:27] <galex-713> No because it was written by a script and I dont want to seek the correct extension for every webradio
[22:27] <blippyp_> ffmpeg might not know what that is... (it might read it from the header or something, but seriously - it might not understand what your input format is)
[22:27] <galex-713> So I let mplayer find the right format alone
[22:27] <galex-713> blippyp_: for the first cut I made it worked
[22:28] <blippyp_> cool - might not be an issue then - but ffmpeg has the ability to explicitly specify the format for a reason - just bringing it up - glad it's working
[22:28] <galex-713> So, I mean: the time I have is 108:26:57, when I said I have to make calculations, its about calculating the time from the beginning of the file
[22:28] <blippyp_> that time has to be wrong
[22:28] <blippyp_> that's a HUGE file
[22:28] <galex-713> I cant just do -ss 108:26:57
[22:29] <blippyp_> 108 hours 26 minutes and 57 seconds?
[22:29] <galex-713> The file begin at 108:06:25
[22:29] <blippyp_> I think ffmpeg tops at 99 hours - but I haven't tested it
[22:29] <blippyp_> the format is 00:00:00.000
[22:29] <galex-713> When I begin to read the file mplayer says 108:06:25
[22:29] <blippyp_> seriously - 108 HOURS into the file
[22:29] <galex-713> So the time I get is 108:06:25+the relative time since begin
[22:29] <galex-713> blippyp_: no
[22:30] <blippyp_> where did you get this thing
[22:30] <galex-713> The file doesnt get 108 hours
[22:30] <blippyp_> the timing in the file is messed up
[22:30] <galex-713> But at the begin mplayer says its 108:26:25
[22:30] <blippyp_> somethings wrong with it
[22:30] <blippyp_> ignore that time
[22:30] <galex-713> Its an audio stream
[22:30] <blippyp_> it's screwed up
[22:31] <blippyp_> use a different player and see if it will correctly show you the time
[22:31] <galex-713> Like c_14 said, its probably counting from the start of streaming
[22:31] <galex-713> blippyp_: with vlc it says the same thing
[22:31] <blippyp_> okay, but even that is weird - a 108 hour streaming session?
[22:31] <blippyp_> try re-encoding the entire file into a different format
[22:31] <galex-713> Its a radio
[22:31] <galex-713> Its still recording
[22:32] <galex-713> *Im still recording
[22:32] <blippyp_> then pull from the new recording
[22:32] <blippyp_> you're pulling from a file that's being written to?
[22:32] <galex-713> yeah
[22:32] <galex-713> Its a *radio*
[22:32] <blippyp_> then you'll have to do the math I think
[22:32] <galex-713> Ok
[22:32] <blippyp_> write down the time as soon as it starts
[22:32] <galex-713> Already done ^^
[22:33] <blippyp_> then find the time you want to begin - subtract the start time from it
[22:33] <galex-713> 108:06:25
[22:33] <blippyp_> what time do you want to begin recording, and how long do you want to record?
[22:33] <galex-713> (I earlier said 108:26:25, that was wrong)
[22:34] <galex-713> from 108:26:57 to 108:29:54 according mplayer
[22:36] <blippyp_> using your url you provided above: your command should be something like: ffmpeg -ss 00:06:00 -i m2o -ss 57 -t 177 m2o-1.wma
[22:36] <galex-713> So if ffmpeg cant calculate the time according to what the audio stream says, I think I should do 108:26:57 - 108:06:25 for the beginning and 108:29:54 - 108:26:57 for the duration
[22:36] <blippyp_> oopa left out the codec, but you get the idea
[22:36] <galex-713> oopa?
[22:36] <blippyp_> oops
[22:36] <galex-713> ah ok
[22:36] <galex-713> yeah
[22:37] <galex-713> But since I probably will cut many pieces, I think I should automate that…
[22:37] <blippyp_> yeah, I would
[22:37] <blippyp_> I'd just create a file with the start time and the length of time you want - then subtract the initial begin time from each start time
[22:38] <blippyp_> if you're in linux it should be simple to do with awk I think
[22:38] <blippyp_> and xargs
[22:38] <galex-713> *GNU/Linux
[22:38] <galex-713> I never found time to learn awk :/
[22:38] <blippyp_> it's simple enough - just google it - you'll find a simple example
[22:38] <blippyp_> I have to look it up each time myself
[22:39] <galex-713> You mean google for awk usage or google for some awk example to do time subtraction?
[22:40] <blippyp_> awk should be capable of both i think
[22:40] <galex-713> What?
[22:40] <blippyp_> google it
[22:40] <galex-713> I asked what was you meaning by google *it*, whats it?
[22:40] <blippyp_> awk
[22:41] <galex-713> awk usage or specific awk example for time subtraction?
[22:41] <galex-713> Ah
[22:41] <blippyp_> both
[22:41] <galex-713> ok…
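One way the awk idea could look, assuming a hypothetical cutlist.txt holding one absolute start and end time per line as reported by mplayer (the filenames, the recorded start time and the output naming are all placeholders):

    # cutlist.txt contains lines like:  108:26:57 108:29:54
    start=108:06:25
    awk -v s="$start" '
      function sec(t,  a) { split(t, a, ":"); return a[1]*3600 + a[2]*60 + a[3] }
      { printf "ffmpeg -ss %d -t %d -i m2o -c copy cut%02d.wma\n", sec($1)-sec(s), sec($2)-sec($1), NR }
    ' cutlist.txt | sh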
[22:50] <tachoknight> hi all, i am creating a 30 second movie from an image using: ffmpeg -loop 1 -r 23.976 -i test.jpg -t 00:00:30 test.mp4
[22:51] <tachoknight> and was wondering if there was a way to add a fade to black over it at a particular duration
[22:52] <blippyp_> tachoknight man ffmpeg-filters search for fade I believe
[22:52] <tachoknight> i was thinking id have to create separate images at different levels, create sub movies, then stitch those together
[22:52] <tachoknight> oh, thanks blippyp_
[22:52] <blippyp_> no problem
[00:00] --- Sun Apr 20 2014

