[Ffmpeg-devel-irc] ffmpeg.log.20120327

burek burek021 at gmail.com
Wed Mar 28 02:05:01 CEST 2012


[01:40] <mfwitten> I don't know what this means: Incompatible sample format 's16' for codec 'aac', auto-selecting format 'flt'
[01:40] <mfwitten> Isn't ffmpeg supposed to, you know, CONVERT? :-P
[01:42] <saste> yes, but different codecs support different sample formats, so ffmpeg is converting to a format which can be handled by aac
[01:42] <saste> or maybe it's just a limitation in the implementation, can't say
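
For context: each encoder advertises the sample formats it accepts, and ffmpeg inserts the conversion on its own. The conversion can also be requested explicitly with -sample_fmt; a minimal sketch, assuming a hypothetical input.wav (the native aac encoder of this era also required -strict experimental):

    ffmpeg -i input.wav -strict experimental -acodec aac -sample_fmt flt output.m4a
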
[01:49] <mfwitten> Well, anyway, I get `Error while opening encoder for output stream #0:1 - maybe incorrect parameters such as bit_rate, rate, width or height'; there is a problem with the audio stream, but I don't know what; I get this error even when I don't specify parameters.
[01:49] <mfwitten> Am I to assume that there are no defaults and it's complaining about that?
[01:49] <saste> uhm, you usually get some contextual error which explains the problem in detail
[01:50] <saste> try to post a pastebin
[04:20] <darkstarbyte> How would one start making an interface for ffmpeg in C?
[04:24] <darkstarbyte> maybe I should ask the development channel
[04:45] <_klk_> darkstarbyte:  ffmpeg is in C...
[04:45] <_klk_> do you mean a UI?
[04:48] <darkstarbyte> yeah
[04:50] <darkstarbyte> I would like to make one of my own
[04:53] <_klk_> darkstarbyte:  hmm, sorry, I don't know much about UI programming in C.  There are existing UI wrappers around ffmpeg though - mplayer/mencoder and Handbrake, I believe
[04:54] <darkstarbyte> oh
[06:45] <grepper> darkstarbyte:  #ffmpeg-devel is for ffmpeg development only, not what you want
[07:29] <darkstarbyte> yeah
[07:29] <darkstarbyte> I just thought they would know
[08:23] <Wntrvnm> Hello.  Is it possible to specify a maximum frame rate when batch-converting videos, without touching the framerate of ones that have a lower source framerate?
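
One approach is to probe each file first and only pass -r when the source exceeds the cap, so slower sources are left alone. A sketch, assuming a recent ffprobe with -show_entries and hypothetical *.avi inputs:

    MAX=30
    for f in *.avi; do
        # r_frame_rate comes back as a fraction such as 30000/1001
        rate=$(ffprobe -v error -select_streams v:0 \
            -show_entries stream=r_frame_rate -of default=noprint_wrappers=1:nokey=1 "$f")
        fps=$(echo "scale=3; $rate" | bc)
        opts=""
        if [ "$(echo "$fps > $MAX" | bc)" -eq 1 ]; then opts="-r $MAX"; fi
        ffmpeg -i "$f" $opts -vcodec libx264 "converted_$f"
    done
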
[08:42] <ikrabbe> if the ffplay output prints SDL_OpenAudio, I assume ffplay tries to use sdl as the outdev (which I disabled during configure)
[08:42] <ikrabbe> how can I stop that?
[08:43] <ikrabbe> or, as a first work-around, how can I set alsa as the audio output when playing a video with audio
[08:45] <ikrabbe> also: using ./configure ... --disable-outdev=sdl still shows sdl in ./configure --list-outdevs, while the ./configure output itself now only shows alsa.
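
Worth noting: ffplay is itself built on SDL, independently of the sdl output device in libavdevice, so disabling that outdev does not change ffplay's output path. One commonly suggested workaround is to pick the audio backend at the SDL level through its environment variable; a sketch, with a hypothetical input.mkv:

    SDL_AUDIODRIVER=alsa ffplay input.mkv
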
[10:03] <ikrabbe> http://paste.pocoo.org/show/571923/
[10:14] <hellop> I need to stream a live event over two cellphone modems.   Can anyone think of a way to split an incoming Video/Audio stream (RTP, UDP, RTSP, RTMP, etc..), send each half over a different network connection, then reconstruct and play the stream?
[10:25] <ikrabbe> hellop: this will be very tricky to synchronize
[10:26] <ikrabbe> I don't even know if there is a transport format that is able to synchronize with another transport stream
[10:27] <hellop> Have to do something custom, I guess.
[10:31] <hellop> Wonder what some possible strategies are....
[10:32] <hellop> You could record the source stream to a file.  Then upload the file later, each half over a different connection.
[10:33] <hellop> But, then we have to wait until the end of the event..
[10:35] <hellop> Can ffmpeg read/write to a named pipe?  What format should I use?  Netcat/udp, a file format?  If I can get that working, then I can put something in the middle to split the data and recombine it.
[10:58] <grepper> hellop: should be able to read from a fifo
[11:00] <grepper> you may need to do something like (for example):  ffmpeg -pix_fmt yuv420p -f yuv4mpegpipe -i mypipe ...
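
Spelled out end to end, a sketch with a hypothetical pipe path and input file; note that yuv4mpegpipe carries video only, and that the writer blocks until a reader opens the fifo:

    mkfifo /tmp/video.y4m
    ffmpeg -i input.mp4 -pix_fmt yuv420p -f yuv4mpegpipe /tmp/video.y4m &
    ffmpeg -f yuv4mpegpipe -i /tmp/video.y4m -vcodec libx264 out.mp4
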
[11:03] <malinens> *** glibc detected *** /usr/bin/ffmpeg: invalid fastbin entry (free): 0x0000000000d5c4c0 *** (ffmpeg 0.10) I get this while trying to start the transcoding process - sometimes I get this error and sometimes I don't. What should I do about it?
[13:43] <hellop> grepper: thanks, I'm looking into that, it looks promising.  What about somehow splitting and/or combining a stream?
[13:44] <hellop> Could I split into top half and bottom half, or only re-stream every other frame?
[14:07] <hellop> Maybe use crop and overlay?
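
Splitting into a top and a bottom half can be done with two crop passes; a sketch with hypothetical filenames, run as two commands since older builds balk at two -vf strings in one invocation (as noted later in this log):

    ffmpeg -i input.ts -vf "crop=iw:ih/2:0:0"    -vcodec libx264 top.mp4
    ffmpeg -i input.ts -vf "crop=iw:ih/2:0:ih/2" -vcodec libx264 bottom.mp4
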
[16:34] <hellop> Is there a way to save two different output files using avfilter split?  From the docs: Pass on the input video to two outputs. Both outputs are identical to the input video.
[16:35] <hellop> I've been hacking for some time and can't figure out the syntax to save...  It seems like any split must have an overlay.
[17:59] <globus> Does anyone know how to use libav* to do RTP streaming of x264-encoded NAL units? I want to use it only for streaming; the encoding is done using the x264 API. Thanks
[18:10] <ckb> hey guys, I'm encoding a 400k/s 360px width video from a source of 6.2k/s and an aspect ratio of 16:9. When I encode the video, ffmpeg is reporting a 180:10 DAR, why is this?
[18:11] <ckb> 180:10 in the encoded video**
[18:12] <hellop> I think I found a bug.  I can't seem to crop then pad unless I pad greater than the video before cropping.  So I can't do this: "crop=iw:ih/2:0:0, pad=iw:ih*2:0:0"  But I can do this: "crop=iw:ih/2:0:0, pad=iw:ih*2+2:0:0"
[18:14] <iive> hellop: put a scale between crop and pad
[18:15] <iive> screenshot may also do the same.
[18:15] <iive> oops, sorry, wrong channel.
[18:18] <ckb> would it help if someone saw ffmpeg's output?
[18:20] <iive> btw, does it crash, or does it just undo the cropping?
[18:20] <hellop> iive, thanks, I have this now: "crop=iw:ih/2:0:0, scale=iw:ih, pad=iw:ih*2:0:0"
[18:20] <hellop> So, I guess that resets the height/width for the input on the next filter, eh?
[18:27] <ckb> no one? :(
[18:29] <iive> hellop: crop could be done via stride (linesize) arithmetic, using the same image and no full frame copy. The problem is that pad allocates a bigger image and puts the "input" into the center, so it does the exact opposite of crop.
[18:30] <iive> there must be an option to explicitly clear the added bands.
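
There is one: pad accepts the fill colour as a fifth argument, so the added bands can be cleared explicitly, e.g. (sketch):

    pad=iw:ih*2:0:0:black
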
[18:31] <iive> ckb: the 18:1 ratio looks wrong and could be a bug; put the text log on some pastebin site.
[18:31] <ckb> iive: ok
[18:31] <iive> i won't hunt the bug, but somebody else might. I want to see the options you used too.
[18:34] <ckb> http://pastie.org/3679100
[18:34] <ckb> iive: the options are the first 2 lines.. then re-executed
[18:39] <iive> 180/101, not 180/10
[18:39] <iive> your pixel aspect ratio is 1:1
[18:39] <ckb> hm?
[18:39] <iive> the display aspect ratio depends on resolution
[18:40] <iive> and you change it
[18:40] <ckb> So it's not supposed to stay consistent?
[18:40] <iive> your par is 1:1
[18:40] <iive> it is ok
[18:42] <ckb> thank you
[18:42] <ckb> so 16:9 is dependent on the resolution...
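
The arithmetic, for the record: DAR = SAR x (width/height). With square pixels (SAR 1:1) and the 360-wide frame from the pastebin (presumably 360x202), DAR = 360/202 = 180/101, which is 16:9 up to rounding; exact 16:9 at width 360 would need a height of 202.5.
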
[18:49] <hellop> iive, since you're here, can you comment on using -vf split?  Is there a way to write the two "outputs" to disk?  Or do I always have to use overlay to combine them back into one output before I can write to disk?
[18:54] <hellop> If you look in the docs for split it says: "Pass on the input video to two outputs."  But using the example code always gives an error about "the filter out not connected to any source".  I tried a bunch of different guesses.
[18:58] <iive> hellop: it should be possible, but i'm not familiar with the syntax
[19:01] <iive> in the past ffmpeg was able to encode to multiple different codecs from the same input image
[19:02] <hellop> yes, I can do that just by specifying new options after the first output file on the command line.  I just can't get the two streams out of split.  I can only merge them, then output.
[19:03] <hellop> Also, I can't run two different -vf strings in one command line.
[19:14] <iive> hellop: i'm sure saste would know, hope he answers :)
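
For completeness: newer ffmpeg releases grew a -filter_complex option that makes exactly this possible, mapping each pad of split to its own output file; a sketch with hypothetical filenames:

    ffmpeg -i input.mp4 -filter_complex "split [a][b]" \
        -map "[a]" -vcodec libx264 out1.mp4 \
        -map "[b]" -vcodec libx264 out2.mp4
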
[19:14] <iive> btw, you can file the crop+pad issue as a bug in the bugtracker; it is still a bug.
[19:17] <saste> hellop: what's the bug?
[19:23] <globus> Does anyone know how to use libav* to do RTP streaming of x264-encoded NAL units? I want to use it only for streaming; the encoding is done using the x264 API. Thanks
[20:46] <madlag> Hi !
[20:47] <madlag> I am getting a strange hang using ffmpeg + x264 on very long videos created using a pipe for audio and for video inputs.
[20:47] <madlag> Before trying to reproduce it, I wanted to know if somebody had the same issue.
[20:48] <madlag> (it's fully deterministic)
[20:49] <madlag> The version is 0.10
[20:50] <madlag> The process just doesn't exit. But the file itself is ok.
[21:04] <hellop> I get "Error initializing filter 'movie' with args 'udp://127.0.0.1:10000'"; is it possible to specify a live stream as the movie parameter of a filter chain?
[21:04] <hellop> the command: ffmpeg -re -f mpegts -i udp://127.0.0.1:10000 -vf "movie=udp://127.0.0.1:10000 [logo];[in][logo] overlay=0:main_h/2 [out]" -vcodec libx264 -vprofile default -vprofile baseline -f rtsp rtsp://127.0.0.1:1935/live/myStream
[21:10] <hellop> Docs say "can be a stream accessed by some protocol"
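
One plausible cause of the error: inside a filter description, the ':' characters of the URL collide with the filtergraph's own separators, so they need backslash escaping. A sketch of the same command with the colons escaped (an assumption about this build's parser, not a verified fix):

    ffmpeg -re -f mpegts -i udp://127.0.0.1:10000 \
        -vf "movie=udp\://127.0.0.1\:10000 [logo];[in][logo] overlay=0:main_h/2 [out]" \
        -vcodec libx264 -vprofile baseline -f rtsp rtsp://127.0.0.1:1935/live/myStream
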
[21:24] <sudo_dirk> hi. can anybody help me with a command line issue? i want to use ffmpeg for an open source project...
[21:25] <sudo_dirk> i want to start ffmpeg from inside python and want to extract a single jpeg from a video file.
[21:25] <sudo_dirk> i found: ffmpeg -vframes 1 -ss 0.5 -i cimg0868.avi pipe:1
[21:26] <sudo_dirk> and get: "Unable to find a suitable output format for 'pipe:1'"
[21:26] <sudo_dirk> i want to catch the output from stdout then
[21:30] <hellop> Try naming the pipe: cimg0868.pipe.avi so ffmpeg will recognize the format.
[21:30] <sacarasc> Or use -f
[21:34] <sudo_dirk> sacarasc: what format do I need after -f? jpg or jpeg is not valid.
[21:35] <sacarasc> Anything that's listed in ffmpeg -formats as being encodable should work.
[21:38] <sudo_dirk> thanks -f null worked fine ;-)
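
Worth noting: the null muxer discards the frames entirely, so nothing useful reaches stdout. To get actual JPEG bytes on pipe:1, the mjpeg muxer is one option, since a single raw MJPEG frame is a valid JPEG file; a sketch using the filename from above, with stdout then captured by the Python subprocess:

    ffmpeg -ss 0.5 -i cimg0868.avi -vframes 1 -f mjpeg pipe:1
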
[22:11] <hellop> Ok guys.  Let's brainstorm here.  How can I combine two live streams into one?  I can merge one live stream and one file like this: ffmpeg -re -f mpegts -i udp://127.0.0.1:10000 -vf "movie=samplebottom.mp4 [logo];[in][logo] overlay=0:main_h/2 [out]" -vcodec libx264 -vprofile default -vprofile baseline -f rtsp rtsp://127.0.0.1:1935/live/myStream
[22:11] <hellop> What are my options?  Try to read the second udp stream to a named pipe, then open that in the filter chain?
[22:13] <hellop> I tried ffmpeg -i samplebottom.mpg namedpipe.mp4..  But ffmpeg just hangs trying to read it..  any ideas?
[22:33] <hellop> I can do all sorts of stuff now.  I can send a UDP stream to another computer.  I can receive it and send it to a named pipe.  I can read the named pipe and send via RTSP and play it.  But how do I get it into the filter chain?
[23:20] <hellop> I find it weird that the filter chain can read in a live stream from a hardware source like a camera, but can't read a network stream.  Can I simulate a hardware source using a live stream?
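
A sketch of the fifo route for the second live stream, with hypothetical addresses and pipe path. Two caveats: a fifo writer blocks until something opens the read end, which is why the earlier namedpipe.mp4 attempt appeared to hang (and mp4 cannot go to a pipe anyway, as its muxer needs seekable output, hence mpegts here); and the movie source is assumed to accept the f/format_name option as in later documentation:

    mkfifo /tmp/second.ts
    ffmpeg -f mpegts -i udp://127.0.0.1:10001 -acodec copy -vcodec copy -f mpegts /tmp/second.ts &
    ffmpeg -re -f mpegts -i udp://127.0.0.1:10000 \
        -vf "movie=/tmp/second.ts:f=mpegts [logo];[in][logo] overlay=0:main_h/2 [out]" \
        -vcodec libx264 -f rtsp rtsp://127.0.0.1:1935/live/myStream
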
[00:00] --- Wed Mar 28 2012

