[Ffmpeg-devel-irc] ffmpeg.log.20120922

burek burek021 at gmail.com
Sun Sep 23 02:05:01 CEST 2012


[17:49] <vivienschilis> hi
[17:50] <vivienschilis> I have a video where the video stream has start_time > 0
[17:50] <vivienschilis> and an audio stream start_time = 0
[17:50] <vivienschilis> it plays well on html5 player but not on flash players
[17:50] <vivienschilis> because of this delay
[17:51] <vivienschilis> I was wondering how I can make the video stream start at 0
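
[editor's note: the question goes unanswered in the log. One hedged approach, assuming re-encoding is acceptable: the setpts/asetpts filters rebase each stream's timestamps to start at zero. The filenames are hypothetical.]

    # rebase video and audio timestamps so both streams start at 0;
    # filters cannot be combined with -c copy, so this re-encodes
    ffmpeg -i in.mp4 -vf setpts=PTS-STARTPTS -af asetpts=PTS-STARTPTS out.mp4
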
[19:25] <Steve132_> Hi, does ffmpeg support output-level parallelism?  I want to output to two files at once, but use multiple threads to do it
[19:27] <Mavrik> each output encoder will run in parallel.
[20:05] <Steve132_> Mavrik, are you sure about that?
[20:08] <Mavrik> what kind of question is that?
[20:11] <Steve132_> Well, I'm testing it right now and it doesn't seem to work
[20:12] <Steve132_> When I have the process manager up, only one thread appears to be in use
[20:12] <Steve132_> with this
[20:12] <Steve132_> ffmpeg -threads 2 -y -i ~/Downloads/logo_corrected.avi out1.avi out2.avi
[20:13] <Steve132_> To try it again, to see if I actually get two threads, I made out2.avi into a named pipe in order to enforce an artificial delay
[20:13] <Steve132_> because out2 could not be written until it is read
[20:13] <Steve132_> out1.avi is never written
[20:13] <Steve132_> or even opened
[20:14] <Steve132_> if they were on two threads, shouldn't out1 have been processed regardless of the write delay on out2 ?
[20:16] <Steve132_> Mavrik: Got any ideas?
[20:34] <burek> Steve132_ it does it using one thread
[20:34] <burek> what Mavrik probably wanted to say is that both outputs are in sync
[20:34] <Steve132_> ah
[20:34] <burek> written out in parallel
[20:35] <burek> I also think outputs should be written in separate threads, because it's more natural that way I guess
[20:35] <Steve132_> So, basically, my application for this is that I want to archive everything that comes in a stream
[20:35] <Steve132_> but have a preview
[20:35] <Steve132_> window
[20:35] <Steve132_> so my attempt to do that was going to be to output to a file
[20:36] <Steve132_> but also output raw video to a named pipe
[20:36] <Steve132_> and run my 'preview' application , reading from that named pipe as fast as possible and updating the preview window
[20:36] <burek> maybe you could read this: http://ffmpeg.org/trac/ffmpeg/wiki/Creating%20multiple%20outputs
[20:37] <Steve132_> Yeah I read that
[20:38] <Steve132_> So, correct me if I'm wrong
[20:38] <Steve132_> but
[20:38] <Steve132_> ffmpeg -f v4l2 -i /dev/video0 -vcodec libx264 -f mpegts - | \
[20:38] <Steve132_> 	ffmpeg -f mpegts -i - \
[20:38] <Steve132_> 		-c copy -f mpegts udp://1.2.3.4:5678 \
[20:38] <Steve132_> 		-c copy -f mpegts local.ts
[20:39] <Steve132_> what that does is read raw video from the webcam, encode it to h264 in an MPEG transport stream
[20:39] <Steve132_> then, pipe the transport stream to a new instance of ffmpeg that mirrors the stream to a socket and a local file
[20:40] <burek> that's the case when you want the same format for all outputs
[20:40] <Steve132_> right
[20:40] <burek> that's why -c copy
[20:40] <Steve132_> So, based on that, I did
[20:40] <burek> so you can remux to a file format, to a stream format, to a screen, whatever
[20:40] <burek> without re-encoding
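
[editor's note: a minimal sketch of the remux pattern burek describes, assuming an already-encoded input whose codecs both target containers accept; filenames are hypothetical:]

    # one encoded input written to two different containers, no re-encoding
    # (assumes codecs both containers support, e.g. h264/aac)
    ffmpeg -i input.ts -c copy archive.mkv -c copy -f flv stream.flv
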
[20:41] <Steve132_> ffmpeg -f v4l2 -vcodec mjpeg -i /dev/video0 -vcodec copy local.avi -vcodec raw named_pipe
[20:42] <Steve132_> which, as I understand it, captures mjpeg frames from the webcam, outputs them to the avi file without re-encoding, then decodes each frame and writes it to the named pipe
[20:42] <burek> yes
[20:42] <Steve132_> right
[20:43] <burek> -vcodec rawvideo of course
[20:43] <Steve132_> yes, sorry
[20:43] <Steve132_> So, the problem is that I'm worried that if the application doesn't read the named pipe immediately (for whatever reason)
[20:44] <Steve132_> then the whole pipeline will block
[20:44] <burek> yes it will
[20:44] <Steve132_> and my output to the avi file will fail
[20:44] <Steve132_> or be slow
[20:44] <burek> well, it'll skip
[20:44] <burek> I guess
[20:44] <burek> but, you can do something like this
[20:44] <burek> to prevent that
[20:45] <burek> ffmpeg -f v4l2 -vcodec mjpeg -i /dev/video0 -vcodec copy local.avi -vcodec rawvideo -f mpegts udp://localhost:1234
[20:45] <burek> and start another ffmpeg/ffplay
[20:45] <burek> to watch from port 1234
[20:45] <Steve132_> I don't understand
[20:45] <Steve132_> how would that help?
[20:45] <burek> you send the raw output to the local udp port
[20:45] <burek> and your application/viewer/whatever reads that port
[20:45] <burek> and if it doesn't read it fast enough
[20:46] <burek> only that app/viewer/whatever will drop frames
[20:46] <Steve132_> and the UDP packets get simply dropped if they aren't read
[20:46] <Steve132_> nice
[20:46] <burek> but not the avi file
[20:46] <Steve132_> I like it
[20:46] <Steve132_> I was thinking something like that but I wasn't really sure how to do it
[20:46] <Steve132_> how do I start an ffmpeg to start from port 1234?
[20:47] <burek> you might use a buffering process instead of a named pipe, like the "tee" command
[20:47] <burek> but I'm not sure whether it buffers things
[20:47] <burek> or just blocks
[20:47] <Steve132_> I like your idea the best
[20:47] <Steve132_> also it means if my preview app crashes the stream won't interrupt
[20:47] <Steve132_> which is awesome
[20:47] <burek> ffmpeg -f mpegts -vcodec rawvideo -i udp://@:1234 ...
[20:47] <burek> yes
[20:48] <burek> that's why I use it in my use cases :)
[20:48] <Steve132_> very cool
[20:48] <Steve132_> and you can run that in another thread
[20:49] <burek> sure
[20:51] <Steve132_> thanks!
[20:51] <Steve132_> thats awesome
[20:51] <burek> :beer: :)
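
[editor's note: assembling the pieces of this exchange into one sketch, as burek suggests; the device name and port are hypothetical:]

    # archiver: stream-copy the camera's MJPEG to disk, and also decode it
    # to raw video wrapped in MPEG-TS on a local UDP port
    ffmpeg -f v4l2 -vcodec mjpeg -i /dev/video0 \
           -vcodec copy local.avi \
           -vcodec rawvideo -f mpegts udp://localhost:1234

    # previewer: a separate process reads the port; if it stalls or crashes,
    # only it misses packets -- the archive keeps writing
    ffplay udp://@:1234
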
[20:52] <Steve132_> Are there any bandwidth considerations?  Are there bandwidth limits over localhost in the kernel?  There shouldn't be, but I'm just curious
[20:52] <burek> that's your last concern :)
[20:52] <burek> localhost is just a loopback interface, nothing more
[20:52] <burek> so it practically works at the speed of RAM
[20:53] <Steve132_> Second question, is there any reason I shouldn't/couldn't write the client myself?
[20:53] <Steve132_> If its just a rawvideo stream over the socket, couldn't I just open() the port with UDP and read the data?
[20:53] <Steve132_> with sockets?
[20:53] <burek> you could recvfrom()
[20:53] <burek> of course
[20:53] <burek> if you know how to interpret rawvideo buffers
[20:53] <Steve132_> is there anything special in that stream?
[20:54] <Steve132_> other than just the data?
[20:54] <burek> well, it's structured for sure
[20:54] <burek> you need to search the docs for what exactly "rawvideo" is
[20:54] <burek> what's its structure, etc
[20:54] <burek> but you can capture udp port yourself, why not
[20:54] <burek> or use vlc, like: vlc udp://@:1234
[20:55] <burek> ffplay also: ffplay udp://@:1234
[20:55] <burek> etc
[20:55] <Steve132_> I was going to use something like -pix_fmt rgb24 on the stream
[20:55] <Steve132_> and then the stream is just width*height*3 bytes per frame, right?
[20:58] <burek> hm.. well, you can always write a test :)
[20:58] <burek> and check what it is exactly :)
[20:58] <Steve132_> yeah
[20:58] <Steve132_> ok
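
[editor's note: the arithmetic is right for bare rawvideo: with -pix_fmt rgb24 a frame is exactly width*height*3 bytes. One caveat the exchange glosses over: on the udp:// transport above the frames are wrapped in MPEG-TS, so a hand-rolled recvfrom() client would first have to demux 188-byte TS packets. A simpler sketch is to let a second ffmpeg depacketize and feed fixed-size frames to the viewer over a pipe; my_preview_app is hypothetical:]

    # strip the MPEG-TS wrapping and emit bare rgb24 frames on stdout;
    # at 640x480 the reader consumes exactly 640*480*3 = 921600 bytes/frame
    ffmpeg -f mpegts -i udp://@:1234 -f rawvideo -pix_fmt rgb24 -s 640x480 - \
        | my_preview_app
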
[21:03] <Steve132_> last question: what happens on multiple input streams?
[21:03] <Steve132_> like, I have two streams specified with -i
[21:03] <Steve132_> what happens to them?
[21:04] <Steve132_> normally
[21:04] <burek> I don't understand the question
[21:06] <Steve132_> Well, ok
[21:06] <Steve132_> so my application is a stereoscopic webcam
[21:06] <Steve132_> I have two webcams
[21:06] <Steve132_> and I want to continually save their output
[21:06] <Steve132_> now I could just do what we just said twice
[21:06] <Steve132_> and have 3 threads
[21:06] <Steve132_> one for each cam archiving to its own file and broadcasting
[21:07] <Steve132_> and the client reading both servers, but
[21:07] <burek> you have /dev/video0 and /dev/video1 (for each cam a separate video device) ?
[21:07] <Steve132_> it seems like ffmpeg could synch them (yes)
[21:07] <Steve132_> It seems like if ffmpeg could sync the streams somehow it would be ideal
[21:08] <Steve132_> so I was wondering if it would be possible to have one instance of ffmpeg read both cams
[21:08] <burek> why not just: ffmpeg -f v4l2 -i /dev/video0 -f v4l2 -i /dev/video1 -map 0 -map 1 -vcodec copy ...
[21:08] <Steve132_> what would that do?
[21:08] <Steve132_> I don't understand the 'map' command
[21:09] <burek> well, it reads both video devices, joins 2 video streams inside one container (format) and that's it
[21:10] <burek> -map 0 means all streams from input 0, and -map 1 means the same just for input 1
[21:10] <Steve132_> so your example would put both the streams from input 0 and input 1 into the same output file?
[21:10] <Steve132_> same output container?
[21:11] <burek> yes
[21:12] <burek> so you have "1 file" to deal with, with both videos in it
[21:12] <burek> isn't that just cool :)
[21:12] <Steve132_> yes...but what happens if you try to play that with say, vlc?
[21:12] <burek> it will play both videos
[21:12] <burek> (will open another video window for stream 1)
[21:12] <Steve132_> I see
[21:12] <Steve132_> which container formats handle multiple synced streams like that best?
[21:15] <burek> mkv, flv, mp4, mpegts
[21:15] <burek> I don't know, try it with ffmpeg
[21:15] <burek> if some format does not support it
[21:15] <burek> it will complain
[21:15] <Steve132_> ok
[21:15] <Steve132_> cool
[21:16] <Steve132_> thanks
[21:16] <Steve132_> I'm going to try it
[21:17] <burek> ok
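
[editor's note: a sketch of the two-camera command being attempted, using the -map syntax burek describes and the devices from the command below; it assumes a current FFmpeg build (the attempt that follows fails on an old Libav package) and mkv as discussed above:]

    # both MJPEG camera streams, stream-copied into one container;
    # -map 0:v and -map 1:v select the video stream of each input
    ffmpeg -f v4l2 -vcodec mjpeg -s 1280x720 -r 30 -i /dev/video1 \
           -f v4l2 -vcodec mjpeg -s 1280x720 -r 30 -i /dev/video2 \
           -map 0:v -map 1:v -vcodec copy output.mkv
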
[21:32] <Steve132_> It gives me the error "Number of stream maps must match number of output streams"
[21:32] <burek> pastebin
[21:32] <Steve132_> ffmpeg -y -f video4linux2 -vcodec mjpeg -s 1280x720 -r 30 -i /dev/video1 -f video4linux2 -vcodec mjpeg -s 1280x720 -r 30 -i /dev/video2 -map 0 -map 1 -vcodec copy -an -ar 44100 output.avi
[21:34] <Steve132_> http://pastebin.com/zp1jrET6
[21:36] <burek> first of all you are using avtools, not ffmpeg
[21:36] <burek> which means it's OLD
[21:37] <burek> in that version I'm not sure if -map is working at all
[21:37] <burek> try -map 0.0 instead of -map 0
[21:37] <burek> and -map 1.0 instead of -map 1
[21:38] <burek> or -map 0:0 and -map 1:0
[21:38] <burek> something like that
[21:39] <Steve132_> avtools?
[21:39] <Steve132_> No, I'm not
[21:40] <burek> yes you are
[21:40] <burek> look what it says after Copyright...
[21:40] <burek> it should say FFmpeg developers
[21:40] <burek> if it was ffmpeg
[21:42] <Steve132_> http://pastebin.com/UxRgrtNz   ok, well, fair enough, you'd know better than me
[21:42] <Steve132_> it's what comes in universe for ubuntu 12.04
[21:43] <JEEB> ubuntu has a libav package atm, afaik there is no ffmpeg package
[21:43] <burek> they are actually violating ffmpeg's copyright
[21:43] <ubitux> burek: you should add a feature to fflogger
[21:43] <ubitux> burek: something like !fork explaining to the user he is not using ffmpeg
[21:43] <ubitux> ;)
[21:44] <francogrex> Hi, I am trying to make ogg files to load into my android along with the other present ringtones, but it's not working: ffmpeg -ss 33.5 -i file.flv  -ab 320k -acodec libvorbis -t 34 out.ogg the file plays on my computer but not on the android device!
[21:44] <burek> because they are distributing modified ffmpeg (and they state that it is "deprecated", which it is not), plus they are claiming the copyright over it
[21:44] <Steve132_> JEEB: you sure?  I installed ffmpeg with sudo apt-get install ffmpeg
[21:44] <burek> so it's pretty much a criminal activity
[21:44] <burek> ubitux, definitely :)
[21:44] <ubitux> Steve132_: yes they are also using the ffmpeg project name for their propaganda ;)
[21:44] <Steve132_> okk
[21:44] <Steve132_> well, politics aside
[21:45] <burek> it's not just politics, it's the fact you are not using ffmpeg and your -map option doesn't work as it should
[21:45] <JEEB> Steve132_, libav's elenril basically rewrote quite a few bits of ffmpeg (the tool), and it was renamed to avconv. The ffmpeg tool was in libav for a short while, and ubuntu has a package from that time. The ffmpeg (tool) in libav basically was left to bit rot, not updated. ffmpeg (project) then merged elenril's changes into its ffmpeg (tool)
[21:45] <JEEB> yeah, you'd have to use the avconv tool to get more up-to-date way things work :)
[21:46] <ubitux> or use ffmpeg from the FFmpeg project
[21:46] <JEEB> yup
[21:46] <Steve132_> burek: so, ok, it doesn't work right (I tried both of those options)
[21:46] <burek> JEEB, well, you have to agree, they are claiming the copyright over that tool (ffmpeg) with this line "ffmpeg version 0.8.3-4:0.8.3-0ubuntu0.12.04.1, Copyright (c) 2000-2012 the Libav developers"
[21:46] <burek> so it's one more lie
[21:46] <burek> and somebody should sue those people
[21:47] <JEEB> I have no idea about that, for me it just looks like you are trying to find just yet another point to poke them with. That said, I Am Not A Lawyer. I think every other fork has its own copyrights on the binaries, but loldunno.
[21:48] <ubitux> burek: you should really write that page explaining the user a little what's happening and that we can't support the fork
[21:48] <burek> JEEB don't act ignorant
[21:48] <JEEB> also, the fact it got deprecated is wholly true. The message is very ambiguous and not nice and all that, but within the libav project it was deprecated. That's what you do to stuff that will be removed during the next release
[21:48] <burek> one day you are saying they are "just using old ffmpeg"
[21:48] <burek> and now it is not important
[21:48] <burek> what's next?
[21:48] <ubitux> burek: you could stop wasting your time trying to debate that every day and lose your sanity :)
[21:48] <burek> let them do what they do, right, spit all over ffmpeg
[21:48] <burek> that's ok, right? :)
[21:49] <JEEB> I don't think anyone's spitting anything by now
[21:49] <burek> well, it's not ok
[21:49] <burek> and stop defending them
[21:49] <JEEB> at least I don't see anything like that
[21:49] <JEEB> I'm not
[21:49] <JEEB> :|
[21:49] <JEEB> it's just sad to see people still herping a derp here
[21:49] <burek> I think every other fork has its own copyrights on the binaries
[21:49] <burek> yes, but, why does it say "ffmpeg"
[21:49] <burek> ?
[21:49] <burek> if it's a fork?
[21:50] <JEEB> because the tool's name is ffmpeg?
[21:50] <JEEB> yes, that shit is ambiguous
[21:50] <ubitux> JEEB: well TBH if the debian-like distro could stop lying to the users that wouldn't happen :p
[21:50] <burek> if the tool's name is ffmpeg, why do they claim a copyright on it?
[21:50] <ubitux> tool, and package name.
[21:50] <Steve132_> so what should I do/try?
[21:50] <JEEB> Steve132_, check the avconv binary first
[21:50] <JEEB> yes, the package name is bad
[21:50] <JEEB> or just test one of those ffmpeg builds provided
[21:51] <Steve132_> what do you mean 'check it'
[21:51] <Steve132_> how do I check it?
[21:51] <ubitux> try it
[21:51] <ubitux> download & run it :p
[21:51] <JEEB> basically replace "ffmpeg" in your command line with avconv
[21:51] <ubitux> Steve132_: note that if you have issues with avconv, or the ffmpeg program from libav, we can't really support that
[21:51] <JEEB> if that doesn't work, you can then try the ffmpeg static build ubitux linked
[21:51] <Steve132_> ok
[21:51] <JEEB> yes, there's #libav for libav issues
[21:51] <JEEB> :)
[21:53] <Steve132_> something crashes and the stream doesn't quite work
[21:53] <burek> it's no wonder it crashes
[21:54] <burek> uninstall all that with apt-get remove --purge ffmpeg
[21:54] <burek> apt-get remove --purge 'libav*'
[21:54] <burek> and compile your ffmpeg from git
[21:54] <burek> and we should really provide a deb package for stupid debian/ubuntu
[21:54] <burek> because they make life hell for a lot of people because of this
[21:54] <ubitux> deb-multimedia?
[21:55] <ubitux> (http://www.deb-multimedia.org/)
[21:55] <JEEB> yes, and you're not making it any better in case that's a desktop user
[21:55] <JEEB> because many multimedia players etc. link to the libav* libraries provided
[21:55] <JEEB> and if you're going to purge that...
[21:55] <JEEB> although I'm not sure if that will exactly do that
[21:55] <JEEB> maybe it will just purge that tool
[21:56] <JEEB> hate on libav all you want, but seriously please do tell me that kind of purging isn't going to just break or uninstall o9k packages for people
[21:56] <relaxed> it will because they depend on libav*
[21:56] <burek> JEEB no need to copy Daemon404's talk, I can read devel chan myself
[21:56] <cbreak> you can just use mplayer
[21:56] <Steve132_> I think what I'll probably just do is take the risk that the left and right streams get out of sync and copy them separately
[21:56] <JEEB> burek, I am not copying him
[21:56] <cbreak> it statically links its own libav :)
[21:57] <JEEB> I just told him elsewhere you were at it again and he responded at the same time as I did
[21:57] <burek> ok, Steve132_, forget what I said
[21:57] <burek> read all that docs
[21:57] <burek> learn how to properly install ffmpeg
[21:57] <burek> and then return here
[21:57] <burek> until then, go to libav irc chan
[21:57] <burek> and ask for support there.. sorry..
[21:58] <Steve132_> No offence, but that's kinda silly.  You helped me out a ton, which I appreciate
[21:58] <burek> well I know, but we hit the wall and now there's a problem
[21:58] <burek> I don't know what libav uses instead of -map
[21:58] <burek> so you have to ask there..
[21:59] <Steve132_> Do you think the left and right stream sync issues will be enough to cause a problem?  They do have timestamps on all the frames, timestamps that I'm pretty sure persist in the avi format
[21:59] <Steve132_> so I can always merge them later, right?
[22:00] <JEEB> the only problem with avi is that you can only have a constant time difference between frames, and that b-frames in avi suck
[22:00] <Steve132_> ok
[22:00] <Steve132_> what's a better container format for streaming webcam then?
[22:00] <ubitux> flv?
[22:01] <JEEB> mkv, flv, mpeg-ts depending on your needs
[22:01] <Steve132_> ok?
[22:01] <JEEB> also there's the fragments feature in mp4
[22:01] <cbreak> motion jpeg? :D
[22:01] <Steve132_> my needs are "I need to dump raw mjpeg streams quickly to disk"
[22:01] <Steve132_> "and play them back correctly later"
[22:01] <JEEB> mkv or avi
[22:01] <Steve132_> so I need a container format that has timestamps and mjpeg frames
[22:01] <JEEB> mkv is probably more resistant to timestamp differences
[22:01] <JEEB> mkv
[22:02] <Steve132_> ok
[22:02] <cbreak> can't you just dump the stream?
[22:02] <Steve132_> cbreak: That's what I'm doing
[22:02] <Steve132_> except 1) I want to be able to play it back later, 2) I need a real-time preview
[22:02] <Steve132_> I was thinking about trying something like this: http://stackoverflow.com/questions/7581665/multiple-video-sources-combined-into-one
[22:03] <cbreak> maybe tee
[22:03] <Steve132_> also my webcam doesn't support reading directly from /dev/video*
[22:03] <cbreak> the webcams I've used had an HTTP interface
[22:03] <cbreak> I was able to just download with curl or so
[22:04] <cbreak> (but in the end I used ffmpeg to real time decode)
[22:04] <Steve132_> So, in that stackoverflow example, it uses ffmpeg with pad and overlay to mux two streams into one side-by-side
[22:04] <Steve132_> that would be really useful for me, but I can't seem to figure out how to do it with two webcam streams
[22:05] <Steve132_> In the filter, it uses 'movie=small0.avi' as the way to get the second stream
[22:06] <Steve132_> but obviously I can't do that because my stream is a secondary input to ffmpeg
[22:06] <Steve132_> What would the filter syntax be to do that?
[22:07] <JEEB> unfortunately I have no real experience with various filters
[22:10] <relaxed> Steve132_: There are examples in the man page: movie=/dev/video0:f=video4linux2
[22:11] <relaxed> paired with -i /dev/video1
[22:12] <Steve132_> relaxed: would I have to specify ALL the params that way?
[22:12] <Steve132_> like
[22:12] <Steve132_> r=30
[22:13] <Steve132_> vcodec=mjpeg
[22:13] <Steve132_> s=640x480
[22:20] <ubitux> you can use -filter_complex instead
[22:21] <ubitux> Steve132_: the movie=... is part of the filtergraph string
[22:21] <Steve132_> right
[22:21] <ubitux> that's why you need to put the option in it as well
[22:21] <ubitux> and that's the current syntax
[22:21] <ubitux> anyway, you can avoid that using the -filter_complex option
[22:22] <ubitux> look for examples on http://ffmpeg.org/ffmpeg.html
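
[editor's note: a hedged sketch of the -filter_complex route being discussed, modeled on the working command ubitux posts later in this log; devices are hypothetical and both cameras are assumed to deliver the same frame size:]

    # pad the first camera to double width, then place the second camera
    # at x = its own width ("w"), producing a side-by-side composite
    ffmpeg -f v4l2 -i /dev/video0 -f v4l2 -i /dev/video1 \
           -filter_complex '[0:v] pad=iw*2:ih [a]; [a][1:v] overlay=w' \
           -y out.mkv
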
[22:23] <Steve132_> yeah I got that
[22:23] <ubitux> for real time preview unfortunately,
[22:23] <ubitux> ffplay doesn't support -filter_complex
[22:23] <Steve132_> http://pastebin.com/Mf57BMBB
[22:23] <ubitux> and you need to use a "lavfi" device with a filtergraph with everything in it
[22:24] <Steve132_> I used that command but it crashes
[22:24] <Steve132_> opens one stream and not the other
[22:24] <ubitux> oh?
[22:24] <ubitux> can i see the full output?
[22:24] <ubitux> and eventually a backtrace with gdb --args ./ffmpeg ... ?
[22:25] <ubitux> btw, you can't use -vcodec copy
[22:25] <Steve132_> yeah
[22:25] <Steve132_> I got that
[22:25] <Steve132_> I changed it
[22:25] <Steve132_> and it didn't crash
[22:25] <Steve132_> which makes sense
[22:25] <ubitux> mmh if there is a crash it's not normal
[22:25] <Steve132_> but now it says "not enough inputs to 'overlay'"
[22:26] <ubitux> yes you only send the padded output to overlay
[22:27] <Steve132_> I thought the way it worked was that unnamed inputs get appended if they are unallocated?
[22:27] <ubitux> try something like '[0:v] pad=iw*2:ih [p]; [1:v][p]overlay=w'
[22:28] <ubitux> or just use a comma
[22:28] <Steve132_> comma?
[22:28] <ubitux> pad=iw*2:ih[src], [src]overlay=w may work
[22:28] <Steve132_> wait, isn't that what I have?
[22:28] <ubitux> no you have a semi colon
[22:28] <Steve132_> what's the difference?
[22:29] <ubitux> a comma separates filters within a chain
[22:29] <ubitux> a semicolon starts a new one
[22:29] <ubitux> [inX] a,b,c,d [outX]; [inY] x,y,z [outY];
[22:30] <ubitux> here you have two inputs inX and inY
[22:30] <ubitux> and two outputs outX and outY
[22:30] <ubitux> input inX is filtered with 4 successive filters
[22:30] <ubitux> and 3 filters for inY
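
[editor's note: a tiny self-contained illustration of that comma/semicolon syntax; the filenames are hypothetical:]

    # commas join filters inside one chain; a semicolon starts a new chain:
    # split the input, shrink and flip one copy, overlay it on the original
    ffmpeg -i in.mkv -filter_complex \
        '[0:v] split [x][y]; [x] scale=iw/2:ih/2,hflip [a]; [y][a] overlay' \
        out.mkv
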
[22:31] <Steve132_> so,  I tried
[22:31] <Steve132_> 'pad=iw*2:ih[src], [src]overlay=w'
[22:31] <ubitux> try '[0:v] pad=iw*2:ih [p]; [1:v][p]overlay=w' first
[22:31] <Steve132_> and it said 'not enough inputs to overlay'
[22:32] <Steve132_> Output pad "default" for the filter "src" of type "buffer" not connected to any destination
[22:32] <ubitux> and 'pad=iw*2:ih, [1:v]overlay=w' ?
[22:32] <ubitux> Steve132_: that messages means you need to map the output
[22:33] <Steve132_> ok?
[22:33] <Steve132_> how do I do that
[22:33] <ubitux> append [out] at the end of the filtergraph
[22:33] <ubitux> and use -map '[out]'
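
[editor's note: combining the two fixes ubitux just gave (feed both inputs to the graph, and label plus map its output), a sketch with hypothetical devices:]

    # label the filtergraph output [out] and map it explicitly
    ffmpeg -f v4l2 -i /dev/video0 -f v4l2 -i /dev/video1 \
           -filter_complex '[0:v] pad=iw*2:ih [p]; [p][1:v] overlay=w [out]' \
           -map '[out]' out.mkv
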
[22:34] <Guest1787> hello
[22:34] <Steve132_> http://pastebin.com/ny8n9G6N
[22:35] <ubitux> upgrade
[22:35] <ubitux> i can't help you with that deprecated and broken version
[22:35] <ubitux> sorry
[22:35] <Guest1787> ok
[22:35] <ubitux> or at least try the static binary
[22:35] <Steve132_> ok
[22:35] <Steve132_> no problem
[22:35] <ubitux> just download the static version
[22:35] <Steve132_> thank you at least for the intuition
[22:35] <ubitux> and ./ffmpeg it
[22:35] <Guest1787> lol
[22:35] <Steve132_> Where is the static version again?
[22:36] <ubitux> it's a standalone binary, easy to test
[22:36] <ubitux> you'll work on fixing your packages later if that works
[22:36] <Guest1787> ok
[22:38] <Steve132_> ok, static binary works
[22:38] <ubitux> great
[22:38] <Steve132_> as in, its able to run simple filters
[22:38] <Steve132_> let me try the command
[22:38] <Steve132_> which one is best to try?
[22:38] <ubitux> what do you mean?
[22:38] <Steve132_> you gave me like 3 filters to try for SBS
[22:39] <Steve132_> now that I have an ffmpeg that works
[22:39] <Steve132_> which one should I try?
[22:39] <Guest1787> cool
[22:39] <ubitux> Steve132_: what's SBS?
[22:39] <Steve132_> overlay Side-By-Side
[22:39] <ubitux> well, do you have one working or not yet?
[22:40] <Steve132_> one of the commands you told me to try? no
[22:42] <ubitux> give me a few minutes, going to look at it
[22:43] <Steve132_> sweet, my original filter works
[22:45] <Steve132_> cool
[22:45] <ubitux> ./ffmpeg -f v4l2 -i /dev/video0 -i ~/samples/Carly\ Rae\ Jepsen\ Call\ Me\ Maybe.webm -filter_complex '[0:v] scale=320:160,pad=iw*2:ih [a]; [1:v] scale=320:160 [b]; [a][b]overlay=w' -y out.flv
[22:45] <ubitux> this kind of works for me
[22:45] <ubitux> i see me on the left (webcam) and carly rae on the right singing
[22:46] <ubitux> (i don't have two webcams sorry)
[22:47] <Steve132_> Yeah I got it to work with 'pad=iw*2:ih[p];[p]overlay=w'
[22:47] <ubitux> so basically i get input 0 (the webcam) rescaled and then padded; input 1 (the video) rescaled; and then i overlay filtered input 1 (b) on filtered input 0 (a)
[22:47] <Steve132_> Although in order to do that I can't copy the hardware stream
[22:47] <Steve132_> because I have to do the image ops inline and then transcode
[22:47] <ubitux> yes
[22:47] <Steve132_> which sucks...I care more about synchronization
[22:48] <ubitux> if you use the filtering, you have to re-encode
[22:48] <Steve132_> yeah
[22:48] <ubitux> you can't do anything about that
[22:48] <Steve132_> however, its super useful to know HOW to do it, because now I can do it offline
[22:48] <Steve132_> awesome
[22:48] <Steve132_> What I'm going to do instead, i think
[22:48] <Steve132_> is do what burek was suggesting earlier
[22:48] <Steve132_> just encode both streams to the same container
[22:48] <Steve132_> synchronized
[22:49] <Steve132_> Is there a compelling reason I shouldn't use the static ffmpeg?
[22:51] <Steve132_> It works and I'd rather just use it
[22:51] <Steve132_> in the short term for my deadline
[22:53] <Steve132_> thank you for all the help btw
[22:54] <ubitux> reason not to use the static ffmpeg mmmh..
[22:54] <ubitux> i'd say you might expect some performance issues (that needs to be verified)
[22:55] <WeThePeople> is ffmpeg able to convert webm videos?
[22:55] <ubitux> and it could fail at some points because of the embedded glibc (kernel features mismatch etc)
[22:55] <ubitux> WeThePeople: sure
[22:55] <ubitux> Steve132_: and of course, it's not packaged.
[22:56] <Steve132_> so what should I do to get the right ffmpeg in my system?
[22:56] <ubitux> mmh you have various solutions
[22:56] <ubitux> you're using ubuntu, right?
[22:56] <Steve132_> yeah
[22:56] <ubitux> that's kind of a problem if you want to install it
[22:56] <ubitux> because it will likely be a problem with all the users apps depending on the fork
[22:56] <Steve132_> also, why would there be performance issues?
[22:57] <ubitux> OTOH, you can build it yourself, but that won't provide much benefit over the static version
[22:57] <WeThePeople> ubitux, it says unable to find a suitable file extension to convert to when i type 'ffmpeg -i video.webm video.avi' <<<is this correct?
[22:57] <ubitux> Steve132_: cause the binary is huge, and your glibc might be better than the one embedded maybe
[22:57] <Steve132_> ok
[22:57] <ubitux> WeThePeople: yes that's correct, please pastebin the full output
[22:58] <ubitux> Steve132_: huge binaries could cause some cpu caching issues
[22:58] <Steve132_> Yeah makes sense
[22:59] <Steve132_> On the other hand, dynamic branches to dlls can also cause cpu caching issues
[22:59] <Steve132_> seems like 6 of one, half a dozen of the other unless someone actually benchmarked it
[22:59] <ubitux> keep in mind the static build doesn't only mean you have the libav* libs into the binary
[22:59] <ubitux> you also have all the deps around
[22:59] <ubitux> zlib, glibc etc
[23:00] <ubitux> which makes the binary even bigger
[23:01] <Steve132_> right
[23:14] <Steve132_> so, using that static binary
[23:14] <Steve132_> it says it cannot input alsa
[23:14] <Steve132_> how does ffmpeg import sound?
[23:14] <Steve132_> or, alternately, is that static binary built with alsa support?
[23:32] <ubitux> Steve132_: maybe not, dunno
[23:33] <ubitux> you might want to build it yourself then
[23:33] <ubitux> maybe burek could add it to the static build, but it might cause issues with the kernel and stuff
[23:33] <Steve132_> *sigh*
[23:34] <Steve132_> how can I check that?
[23:34] <ubitux> check what?
[23:34] <ubitux> if the static build has alsa?
[23:34] <Steve132_> according to the docs, ffmpeg --list-indevs is supposed to output all configured devices
[23:34] <Steve132_> but it doesn't do anything
[23:34] <Steve132_> oh nvm
[23:34] <Steve132_> that's a configure option
[23:36] <ubitux> i don't remember the option to list indevs mmmh
[23:37] <ubitux> saste_: ping?
[23:37] <Steve132_> so, if I was to try to build this myself
[23:37] <Steve132_> what should I do?
[23:38] <saste_> Steve132_, configure --list-indevs
[23:38] <ubitux> Steve132_: starts here: https://ffmpeg.org/download.html
[23:38] <ubitux> saste_: at runtime
[23:38] <ubitux> like -formats
[23:38] <saste_> why do users always freely reinterpret the documentation?
[23:38] <ubitux> oh, -formats does it..
[23:38] <saste_> yes -formats that's it
[23:38] <ubitux> Steve132_: ./ffmpeg -formats | grep alsa
[23:38] <ubitux>  DE alsa            ALSA audio output
[23:38] <saste_> in/out devices are treated like muxers/demuxers
[23:38] <ubitux> should return something like this if supported
[23:39] <Steve132_> nope
[23:40] <ubitux> Steve132_: https://ffmpeg.org/trac/ffmpeg/wiki/CompilationGuide
[23:40] <ubitux> you can check this out
[23:40] <ubitux> might be quite a pain if you're willing to build it with all the libs
[23:40] <ubitux> you can certainly use the libs from the packages most of the time though
[23:41] <ubitux> if they are recent enough
[23:41] <Steve132_> so there's no reason to do the first apt-get remove?
[23:41] <ubitux> if you don't plan to install it on your system
[23:41] <ubitux> you shouldn't uninstall anything
[23:42] <ubitux> at least that's not necessary
[23:42] <Steve132_> ok... I don't plan on installing it, no
[23:42] <ubitux> and i personally wouldn't recommend installing it
[23:42] <ubitux> you can just ./configure --enable-gpl and add a few other configure switches like mmh --enable-libx264
[23:43] <ubitux> as long as it's not external libraries, everything should be autodetected
[23:43] <ubitux> you'll likely need some -dev packages for the libs and stuff
[23:43] <ubitux> then "make" and here you go.
[23:44] <ubitux> make -jXXX and ccache will be your friends if you're going to attempt multiple compilations
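
[editor's note: a condensed sketch of that build recipe; the package names are assumptions for Ubuntu 12.04 and may vary:]

    # assumed build deps; libasound2-dev provides the ALSA indev missing above
    sudo apt-get install build-essential yasm git libasound2-dev
    git clone git://source.ffmpeg.org/ffmpeg.git
    cd ffmpeg
    ./configure --enable-gpl      # external libs need extra --enable-* switches
    make -j4
    ./ffmpeg -formats | grep alsa # confirm the alsa device was autodetected
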
[23:46] <Steve132_> If i want to put this binary on another machine
[23:46] <Steve132_> should I do enable static?
[23:47] <Steve132_> so I can just copy it over?
[23:47] <Steve132_> I'm not on the production machine but I don't want to deal with dependencies there either
[23:50] <ubitux> depends on what is "another machine"
[23:51] <ubitux> my guess is that you will have a lot of troubles
[23:57] <Steve132_> "another machine" is another linux box with the same OS installed
[23:57] <ubitux> and same libraries version?
[23:57] <ubitux> libraries, kernel, ..
[23:57] <ubitux> cpu?
[00:00] --- Sun Sep 23 2012

