[Ffmpeg-devel-irc] ffmpeg.log.20131121

burek burek021 at gmail.com
Fri Nov 22 02:05:01 CET 2013


[00:30] <grkblood> I'm trying to figure out a way to pipe a live stream to ffmpeg for transcoding, but I'm failing pretty badly. This is what I have so far (the stream in the command is from NPR). How can I make this work? wget http://nprdmp.ic.llnwd.net/stream/nprdmp_live01_mp3 | ffmpeg -i pipe:0 -f ogg -acodec libspeex pipe:1 | mplayer - -cache 1000
[00:31] <relaxed> mplayer can play that stream directly.
[00:31] <grkblood> right, but this is just an example
[00:31] <grkblood> the core of what I'm trying to do is for something else
[00:32] <grkblood> I got it anyways. wget -O - http://nprdmp.ic.llnwd.net/stream/nprdmp_live01_mp3 | ffmpeg -i pipe:0 -f ogg pipe:1 | mplayer - -cache 1000
[00:33] <relaxed> why are you involving wget?
[00:34] <grkblood> to pipe to ffmpeg
[00:34] <grkblood> is there another way to pipe a live stream into stdin of ffmpeg?
[00:34] <relaxed> ffmpeg should be able to decode the stream as well.
[00:35] <grkblood> well, just piping in the link to the stream doesn't work for me
[00:36] <relaxed> ffmpeg -i 'http://nprdmp.ic.llnwd.net/stream/nprdmp_live01_mp3' ...
[00:37] <grkblood> yeah, that doesn't work for me
[00:38] <grkblood> did you test that?
[00:38] <relaxed> this works here:  ffmpeg -i 'http://nprdmp.ic.llnwd.net/stream/nprdmp_live01_mp3' -c:a libvorbis -f ogg -y /dev/null
[00:39] <relaxed> upgrade ffmpeg if it doesn't for you.
[00:39] <relaxed> http://johnvansickle.com/ffmpeg/
[00:39] <grkblood> I'm on Ubuntu and using the packaged version with avconv right now
[00:40] <grkblood> I'll try it in a bit on a VM
[00:41] <grkblood> thanks
[00:41] <relaxed> works with my avconv too
[00:41] <grkblood> Unrecognized option 'c:a'
[00:41] <grkblood> Failed to set value 'libvorbis' for option 'c:a'
[00:42] <grkblood> that's what I get
[00:42] <relaxed> use -acodec instead
[00:45] <grkblood> yeah, that seems to record, but when I try to pipe it to mplayer instead with ... -acodec libvorbis -f ogg pipe:1 | mplayer - -cache 1000 I start having issues again
[00:46] <grkblood> something with piping and streams that ffmpeg doesn't seem to like
[00:47] <grkblood>   Stream #0.0 -> #0.0
[00:47] <grkblood> Press ctrl-c to stop encoding
[00:47] <grkblood> [mp3 @ 0x142f520] Header missing
[00:47] <grkblood> Error while decoding stream #0.0
[00:47] <grkblood> Cache fill:  4.90% (50143 bytes)   size=      49kB time=17.31 bitrate=  23.2kbit
[01:29] <relaxed> grkblood: ... -f ogg pipe:1 2>/dev/null | mplayer ...
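A minimal sketch of the working pipeline discussed above, combining relaxed's direct HTTP input with the stderr redirect (the URL, the libvorbis/ogg encode and the mplayer cache size are all taken from the commands in the log):

    # ffmpeg fetches the live MP3 stream over HTTP itself (no wget needed),
    # transcodes it to Ogg/Vorbis on stdout, and sends its own log output
    # (stderr) to /dev/null so mplayer only sees Ogg data on the pipe.
    ffmpeg -i 'http://nprdmp.ic.llnwd.net/stream/nprdmp_live01_mp3' \
           -acodec libvorbis -f ogg pipe:1 2>/dev/null | mplayer - -cache 1000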
[01:31] <grkblood> thanks
[06:04] <qasd_> Hi. How do I set the video device (webcam) for grabbing?
[06:04] <qasd_> *in Windows
[06:06] <qasd_> for example: ffmpeg -i ??? out.avi
[06:08] <qasd_> what could I write instead of the (???)?
[06:25] <qasd_> why doesn't the list-indevs option work in ffmpeg on Windows?
[06:35] <relaxed> qasd_: https://trac.ffmpeg.org/wiki/How%20to%20capture%20a%20webcam%20input
[07:07] <qasd_> relaxed: thank you
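For reference, a sketch of the dshow capture workflow described on the linked wiki page; "Integrated Camera" is only a placeholder device name, the real name comes from the listing step:

    # list the DirectShow capture devices available on this Windows machine
    ffmpeg -list_devices true -f dshow -i dummy

    # grab from the webcam using the exact name reported above (placeholder here)
    ffmpeg -f dshow -i video="Integrated Camera" out.avi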
[08:56] <sspiff> can someone tell me what parsers in ffmpeg are, and where they fit in the pipeline of formats, codecs and filters?
[11:31] <Valdiralita> hey, can I convert a video to an mp3 and add an album cover without calling ffmpeg twice?
[11:34] <plepere> I'm currently playing videos with ffplay, using ffplay -i <video path>
[11:35] <plepere> but the player doesn't stop when the video is done. Is there an option to close the player once the decoding is done?
[11:35] <ubitux> -autoexit
[11:35] <ubitux> (it's useful to seek back after the end)
[11:36] <ubitux> Valdiralita: yes
[11:37] <plepere> thanks
[11:38] <plepere> works like a charm
[11:38] <plepere> well, the player part at least
[11:42] <Valdiralita> ubitux, so, how can I achieve this? I tried this, but it just creates a normal mp3 file: ffmpeg -i "1.mp4" -i "1.jpg" -b:a 320K -vn -n -map 0 -map 1 "out.mp3"
[11:43] <ubitux> and it didn't work?
[11:43] <Valdiralita> no
[11:43] <ubitux> (what's -n?)
[11:44] <ubitux> ah, no overwrite k
[11:44] <Valdiralita> afaik it's to overwrite an existing file
[11:44] <Valdiralita> doesn't matter now :p
[11:44] <ubitux> it's the opposite effect
[11:44] <ubitux> -y force overwrite
[11:45] <Valdiralita> http://hastebin.com/lukofehima.hs
[11:45] <Valdiralita> here you go
[11:48] <Valdiralita> the out.mp3 file is a regular mp3 file without a cover
[11:48] <relaxed> won't -vn block the jpg?
[11:49] <Valdiralita> yeah, I want to block the video from the mp4 and get an image instead
[11:50] <relaxed> ffmpeg -i "1.mp4" -i "1.jpg" -b:a 320K -map 0:a -map 1:v "out.mp3"
[11:51] <Valdiralita> yeah, that worked. thank you!
[11:51] <relaxed> ffmpeg treats the jpg as a video stream too, so -vn was blocking it.
[11:52] <relaxed> I wish -*n would go away
[11:53] <Valdiralita> okay, so this is just setting the inputs to the corresponding video and audio streams
[11:54] <relaxed> 0:a = audio from first input; 1:v = video from second input
[11:54] <Valdiralita> ok, just what I thought. thanks
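A sketch of the working command plus the ID3 tagging that some players need before they display the embedded cover; only the mapping comes from relaxed's command above, the -id3v2_version and -metadata:s:v lines are additions not taken from the log:

    ffmpeg -i 1.mp4 -i 1.jpg -map 0:a -map 1:v -b:a 320k \
           -id3v2_version 3 \
           -metadata:s:v title="Album cover" \
           -metadata:s:v comment="Cover (front)" \
           out.mp3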
[12:02] <Romain___> hi, I'm trying to create a swf file from an (animated) gif; the problem is that ffmpeg doesn't seem to support transparency. Would someone have an idea?
[12:02] <Romain___> sorry for my English..
[12:05] <Romain___> I've tried to use gif2swf instead, but it's very buggy..
[12:19] <Eduard_Munteanu> Hi. Any chance I could control ffplay through a FIFO / pipe, to implement an automated player? I just need to load and seek into files.
[12:26] <relaxed> Eduard_Munteanu: you can with mplayer
[12:27] <Eduard_Munteanu> Hm, I was trying to avoid mplayer. Unless...
[12:27] <Eduard_Munteanu> Can mplayer sync time to the system clock?
[12:28] <Eduard_Munteanu> I sort of implemented it myself but I'd rather not go there again.
[12:30] <Eduard_Munteanu> Oh...
[12:31] <Eduard_Munteanu> I think I can just concat my streams into a loop and run ffplay on it.
[13:47] <surviv0r> guys, I have one quick question: can I chain muxers, for example tee and segment?
[13:52] <surviv0r> I'm generating an x264 mp4 and HLS segments with the exact same params with this command: http://pastebin.com/nPTHRpRU. Can I chain them to save some CPU power?
[14:09] <saste> surviv0r, tee
[14:13] <surviv0r> saste, tee seems not to accept options with no value, like -flags -global_header; check out the command and output here: http://pastebin.com/5WQEbgeJ
[14:16] <efyx> Hi, I just updated my ffmpeg and I get a bunch of "ffmpeg/libavcodec/avcodec.h:1040:16: warning: 'destruct' is deprecated" and "warning: 'priv' is deprecated"
[14:16] <efyx> Is this normal?
[14:19] <saste> surviv0r, wrong syntax
[14:19] <saste> flags:global_header this is wrong
[14:20] <surviv0r> what would be the right syntax? flags=-global_header ?
[14:23] <saste> surviv0r, -flags:v +global_header ... /tmp/currentTest2/720/DC0017400000.mp4|[bsfs/v=dump_extra:f=segment:segment_time=10:segment_list=/tmp/currentTest2/720/playlist.m3u8:segment_format=mpegts]/tmp/currentTest2/720/stream%04d.ts"
[14:25] <saste> see also http://ffmpeg.org/ffmpeg-bitstream-filters.html#dump_005fextra
[14:25] <saste> note that it will increase overhead, since both the global headers and the per-keyframe headers will be kept
[14:26] <saste> I don't think there is a way to avoid it
[14:45] <surviv0r> well now here's the output after fixing the syntax http://pastebin.com/CJPCvQhV
[14:45] <surviv0r> and after trying to fix the error message here's what happened http://pastebin.com/qbHz9guh
[14:56] <Eduard_Munteanu> Is there an efficient transport for ffserver "broadcasting" to localhost?
[14:57] <MrPingouin> hello world
[14:57] <Eduard_Munteanu> I wonder if I can use ffserver to decode videos and submit them to a player.
[14:58] <ryannathans> Eduard_Munteanu: streaming?
[14:59] <MrPingouin> does anyone know if there's an easy way to handle padding when using the drawtext video filter + frame number?
[14:59] <Eduard_Munteanu> ryannathans: not really, I'm writing an application to control and play files in a loop. I was wondering if I can control ffserver and wire its output into ffplay.
[14:59] <MrPingouin> for now it outputs 0, 1, 2, etc.; I would need 00000, 00001, etc.
[15:00] <MrPingouin> I guess it might be possible by writing an expression, but I was looking for some kind of "%05d" syntax
[15:12] <saste> surviv0r, -flags +global_header, and remove the -bsf:a option
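Putting saste's corrections together, a rough sketch of the tee invocation (the output paths and segment options are the ones quoted above; the input name, -map 0 and the codec settings are placeholders standing in for whatever the original pastebin command used):

    # one encode, two outputs: a plain mp4 plus dump_extra-filtered HLS segments
    ffmpeg -i input.mp4 -map 0 -c:v libx264 -c:a aac -flags +global_header -f tee \
      "/tmp/currentTest2/720/DC0017400000.mp4|[bsfs/v=dump_extra:f=segment:segment_time=10:segment_list=/tmp/currentTest2/720/playlist.m3u8:segment_format=mpegts]/tmp/currentTest2/720/stream%04d.ts"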
[18:46] <Bella> hi
[18:46] <Guest78329> I'm trying to use VLC as a source for ffmpeg over a local UDP connection and I get: [mpeg4 @ 0292b080] get_buffer() failed (-22 00000000)
[18:47] <Guest78329> [mpeg4 @ 0292b080] releasing zombie picture
[18:58] <mikepack> Question: I'm looking to concatenate 2 mp4 videos, and I'm wondering if it's possible to join the two without having to re-transcode them while creating the output (joined) mp4. Any help is appreciated!
[19:01] <Guest78329> use mp4join
[19:03] <therube> mikepack:  https://trac.ffmpeg.org/wiki/How%20to%20concatenate%20%28join,%20merge%29%20media%20files
[19:05] <mikepack> therube: thanks, but I think that reencodes the output
[19:06] <mikepack> Guest78329: thanks! I'm looking for something like that that's usable programmatically (from the command line)
[19:06] <therube> not necessarily
[19:09] <Guest78329> mikepack use mp4box
[19:10] <mikepack> therube: is there a way to not reencode it?
[19:10] <mikepack> Guest78329: thanks!
[19:10] <Guest78329> nope
[19:20] <fonso> Hey there, I have a question about FFMPEG installation on Amazon EC2 server. Is there any tutorial to do this?
[19:21] <Guest78329> no
[19:25] <fonso> Well thanks.
[19:26] <Guest78329> what do you want to do ?
[19:27] <fonso> Install ffmpeg on an Amazon AWS instance, but my Linux knowledge is very limited...
[19:28] <therube> mikepack:  http://pastebin.com/W6MbmzrQ
[19:28] <klaxa> fonso, usually you can just install it through the distro's package manager
[19:28] <klaxa> alternatively you can download a static build
[19:29] <klaxa> static builds only depend on the kernel and the libc, I think
[19:29] <fonso> Thanks a lot.
[19:29] <Guest78329> http://ffmpeg.org/trac/ffmpeg/wiki/CentosCompilationGuide
[19:30] <mikepack> therube: I think that's reencoding the file, though I'm not entirely sure. Looks like it from line 49
[19:32] <therube> mikepack: no, it is not. So long as the files' "parameters" are the same, it need not re-encode; it can simply join them, without re-encoding, in a few seconds
[19:34] <Guest78329> if the sizes are the same
[19:38] <mikepack> therube: Thanks. Is there a good way to verify it's not reencoding?
[19:42] <therube> mikepack: use the same command line but remove the '-c copy', so you'll have  ffmpeg -f concat -i flist out.mp4  and you'll see that it does re-encode
[19:44] <mikepack> therube: I'll play around with that, thanks! I appreciate your help
[19:45] <therube> note "(copy)" vs "(mpeg4 -> libx264)":  Stream #0:1 -> #0:0 (copy)  vs  Stream #0:1 -> #0:0 (mpeg4 -> libx264)
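For reference, a sketch of the concat-demuxer join therube is describing; the part file names are placeholders, the list format is the one the concat demuxer expects:

    # flist -- one "file" directive per input, in playback order
    file 'part1.mp4'
    file 'part2.mp4'

    # join without re-encoding: stream-copy both inputs into the output
    ffmpeg -f concat -i flist -c copy out.mp4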
[19:57] <BeWilled> Hello, I have one question please. Is there a program to detect the resolution, bitrate, codec, etc. of a video?
[19:58] <sacarasc> ffprobe can do it, so can mediainfo.
[20:00] <BeWilled> thanks
[20:10] <Guest78329> BeWilled: you can use ffmpeg for that
[20:10] <Guest78329> with only the file as a parameter
[20:11] <BeWilled> Guest78329: in a machine-readable format?
[20:12] <sacarasc> ffprobe is the part of ffmpeg that does that, it can output to multiple different formats. As can mediainfo.
[20:12] <Guest78329> I just use Java to split out the info
[20:12] <Guest78329> it's easy
[20:13] <Guest78329> no need for ffprobe
[20:13] <sacarasc> That's silly.
[20:13] <Guest78329> nothing is silly when you can do it easily
[20:14] <sacarasc> You could do it more easily other ways, though, so it is silly.
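A sketch of the ffprobe call sacarasc is referring to; the JSON writer is one of several output formats ffprobe supports, and input.mp4 is a placeholder:

    # print container and per-stream info (codec, resolution, bitrate, ...) as JSON
    ffprobe -v error -show_format -show_streams -of json input.mp4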
[20:22] <Bella11911> hi
[20:24] <Bella11911> when I use ffmpeg with a UDP source I get: get_buffer() failed
[20:26] <jezeniel> guys, how can I add two or more filters in ffmpeg? I am using -af afade=t=in:ss=0:d=2;afade=t=out:st=35:d=2 OUTPUT
[20:27] <jezeniel> btw I am using -af afade=t=in:ss=0:d=2,afade=t=out:st=35:d=2 OUTPUT (comma, not semicolon), but the second filter doesn't work, only the fade in
[20:28] <Bella11911> https://trac.ffmpeg.org/wiki/FilteringGuide
[20:30] <jezeniel> Bella11911: I've read that, but the audio part (afade, specifically, is what I am trying to use) did not mention how to use both fade in and fade out in one command..
[20:32] <Bella11911>   ffmpeg -i INPUT -af afade=t=in:st=50:d=4,afade=t=out:st=125:d=5 OUTPUT
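Applied to jezeniel's own timings (a 2-second fade in from the start and a 2-second fade out at 35 seconds), that looks roughly like this; quoting the filter string keeps the shell from touching the comma:

    ffmpeg -i INPUT -af "afade=t=in:st=0:d=2,afade=t=out:st=35:d=2" OUTPUT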
[21:01] <fonso> bye!
[21:07] <Bella11911> getting a "releasing zombie picture" error with a UDP source, any help?
[21:27] <Bella11911> http://pastie.org/8499245#1
[21:28] <Bella11911> sometimes after a while it works and sometimes not
[21:28] <norbert_> not sure which channel to ask this in, but would you say "lame -V0 -q0 --vbr-new" is better or "lame --preset cbr 320", if I can pick either?
[21:29] <norbert_> it's about encoding, so I decided to ask here :P
[21:36] <Plorkyeran> V0 is normally equal in quality to CBR 320 at a nontrivially smaller size
[21:37] <norbert_> ok, I'll pick the V0 line then :)
[21:42] <llogan> Bella11911: that's not the complete output
[22:03] <Bella11911> how's that not complete ??
[23:08] <Rager> hi
[23:08] <Rager> I'm trying to use libffmpeg from JNI, and I don't really know where to start
[23:09] <Rager> my end goal is to transcode arbitrary videos (probably H.263 or H.264, based on what Android creates) into x264-encoded mp4s
[23:09] <JEEB> unfortunately libffmpeg is not something provided by ffmpeg
[23:09] <Rager> oh, bother
[23:09] <JEEB> ffmpeg contains various libraries, but none of them are called libffmpeg :)
[23:09] <Rager> libffmpeg.so was built from the ffmpeg sources
[23:10] <JEEB> yes, it's probably a library that uses the ffmpeg libraries underneath
[23:10] <JEEB> but you can't exactly ask here how to use something we have no idea of :)
[23:10] <Rager> it's a library file I literally compiled *from the ffmpeg source*
[23:10] <Rager> like
[23:11] <JEEB> uhh
[23:11] <Rager> the actual official git repos for ffmpeg and the required libs for the various formats
[23:11] <JEEB> you do not get a libffmpeg.so or .a from ffmpeg
[23:12] <JEEB> you get libavcodec, libavformat, libswscale, libswresample, libavresample, libavdevice, libpostproc...
[23:12] <JEEB> if I remembered them all :P
[23:12] <JEEB> if you want help on the libraries ffmpeg provides, then this is the right place
[23:13] <Rager> I guess to figure out what to ask you
[23:13] <Rager> I need to figure out how to relate to you what this lib file I'm working with is
[23:14] <JEEB> ok
[23:15] <JEEB> what you seem to be talking of
[23:15] <JEEB> is
[23:15] <JEEB> https://github.com/lince/libffmpeg-c
[23:15] <Rager> no
[23:15] <JEEB> that creates a libffmpeg.so
[23:15] <Rager> that is not what I am talking about: this is - https://github.com/bubbanat/AndroidFFmpeg
[23:15] <Rager> the code that gets compiled into the "libffmpeg.so" file in question is from these repos: https://github.com/bubbanat/AndroidFFmpeg/blob/master/.gitmodules
[23:16] <Rager> this script is what actually builds it: https://github.com/bubbanat/AndroidFFmpeg/blob/master/FFmpegLibrary/jni/build_android.sh
[23:17] <Rager> ah, it's just a lib created by this script that has the aforementioned libraries in it, I believe
[23:17] <Rager> (the ones you said that you get)
[23:18] <JEEB> yes, it's a wrapper library around quite a few libraries it seems
[23:18] <Rager> what would I use to tear apart the binary to see the symbol table?
[23:19] <Rager> sorry, that was silly - installing a tool, now
[23:19] <JEEB> have you already looked at the example they have in their repo?
[23:20] <Rager> I've been reading and trying to understand, yes
[23:20] <Rager> also been trying to understand ffmpeg's sample code for decoding a video stream
[23:21] <Rager> but that one doesn't explain how to deal with container formats
[23:21] <JEEB> I just used the current avformat/avcodec APIs some time ago
[23:21] <JEEB> for the first time
[23:21] Action: JEEB usually codes within lavc/lavf
[23:21] <JEEB> didn't seem too hard :)
[23:22] <JEEB> http://git.videolan.org/?p=ffmpeg.git;a=blob;f=doc/examples/demuxing_decoding.c;h=325c7b8cda76017f59593745c80750a7b181e221;hb=HEAD
[23:22] <JEEB> this one is a relatively good example
[23:23] <Rager> I guess part of the problem is that I don't understand the workflows at hand, here
[23:24] <JEEB> it's pretty straightforward to get started
[23:24] <JEEB> 1) av_register_all()
[23:24] <JEEB> registers all formats etc.
[23:25] <JEEB> 2) create a AVFormatContext pointer with avformat_alloc_context()
[23:25] <JEEB> or well, create the actual thing with the function
[23:25] <JEEB> and have a pointer to it
[23:26] <JEEB> 3) if you don't want to do any special I/O, just open a file with avformat_open_input()
[23:27] <JEEB> 4) do basic checks on the input with avformat_find_stream_info()
[23:27] <Rager> going through that code you linked and trying to trace out the logic of the steps
[23:28] <JEEB> and at the point of avformat_find_stream_info() you should now have a basic list of the streams in the file
[23:30] <Rager> ok
[23:31] <Rager> at that point, I have an input file, an output file, and stream information on the input file
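A minimal C sketch of the four steps JEEB just listed, stopping at the point where the stream list is available (no decoding, no output file yet); error handling is trimmed and the media-type printout is only illustrative. AVStream->codec was the field to use in 2013-era FFmpeg; newer releases replace it with codecpar:

    #include <stdio.h>
    #include <libavformat/avformat.h>
    #include <libavutil/avutil.h>

    int main(int argc, char **argv)
    {
        AVFormatContext *fmt_ctx = NULL;
        unsigned int i;

        if (argc < 2) {
            fprintf(stderr, "usage: %s <input>\n", argv[0]);
            return 1;
        }

        av_register_all();                  /* 1) register all formats/codecs */
        fmt_ctx = avformat_alloc_context(); /* 2) allocate the format context */

        /* 3) open the input; no custom I/O, so plain avformat_open_input() */
        if (avformat_open_input(&fmt_ctx, argv[1], NULL, NULL) < 0) {
            fprintf(stderr, "could not open %s\n", argv[1]);
            return 1;   /* a failed open already freed the context */
        }

        /* 4) probe the input so the stream list gets filled in */
        if (avformat_find_stream_info(fmt_ctx, NULL) < 0) {
            fprintf(stderr, "could not find stream info\n");
            avformat_close_input(&fmt_ctx);
            return 1;
        }

        /* fmt_ctx->streams[] now describes every stream in the file */
        for (i = 0; i < fmt_ctx->nb_streams; i++) {
            const char *type =
                av_get_media_type_string(fmt_ctx->streams[i]->codec->codec_type);
            printf("stream %u: %s\n", i, type ? type : "unknown");
        }

        avformat_close_input(&fmt_ctx);
        return 0;
    }

An output file would get its own AVFormatContext set up separately, as JEEB notes a bit further down.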
[23:33] <sirEgghead> I have some mkv files that contain subtitles.  I would like to select these subtitles when converting with ffmpeg.  At https://trac.ffmpeg.org/wiki/How%20to%20burn%20subtitles%20into%20the%20video it says to use the "-filter_complex" option.  I followed the directions exactly and I get "ffmpeg: failed to set value '[0:v][0:s]overlay[v]' for option 'filter_complex'" in return when trying it out.
[23:36] <Rager> I'll keep reading, JEEB
[23:36] <sirEgghead> http://pastebin.com/wRTEfXU3 for the full input and output for my problem.
[23:39] <JEEB> Rager, if it's files and you don't need any custom IO for them that you'd need to wrap around, then using libavformat as-is for the file interaction should be fine :)
[23:39] <JEEB> one lavf context for input, another for output
[23:47] <saste> sirEgghead, ffmpeg version N-32076-g4ca6a15, Copyright (c) 2000-2011 -> a two-year-old ffmpeg
[23:49] <sirEgghead> saste, that might do it.  Lol.
[23:49] <sirEgghead> saste, sounds like the last time I compiled it.
[23:50] <sirEgghead> saste, would you know if my command was good for the subtitles?
[23:50] <sirEgghead> Would at least like to know I'm not just spinning my wheels.  :)
[23:50] <saste> i hope so
[23:50] <saste> you'll need to update anyway
[23:51] <sirEgghead> Alrighty.  Let me do that and I'll get back with you.  Thank you very much.
[23:53] <sirEgghead> I haven't updated it in so long for a couple of reasons.  1. Because it's been doing everything I need it to just fine, until now.  2. I don't remember all the options I used to compile it.  :P
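For the record, a sketch of the two burn-in recipes on that wiki page; the overlay form is for image-based subtitle tracks (dvdsub/pgssub), the subtitles-filter form is for text subtitles, and the codec options and file names here are just placeholders:

    # image-based subtitles: overlay subtitle stream 0:s onto the video
    ffmpeg -i input.mkv -filter_complex "[0:v][0:s]overlay[v]" \
           -map "[v]" -map 0:a -c:v libx264 -c:a copy output.mp4

    # text subtitles (srt/ass): render them with the subtitles filter
    ffmpeg -i input.mkv -vf subtitles=input.mkv -c:v libx264 -c:a copy output.mp4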
[00:00] --- Fri Nov 22 2013

