[Ffmpeg-devel-irc] ffmpeg.log.20130801

burek burek021 at gmail.com
Fri Aug 2 02:05:01 CEST 2013


[00:16] <intracube> hi, where can I get up to date ffmpeg/x264 .ffpreset files from?
[00:16] <sacarasc> You generally don't need them. You can use -preset veryslow (for example) to use the x264 preset directly.
[00:17] <intracube> sacarasc: ah, thanks - I didn't realise they were built in now
[00:18] <intracube> had to remove the old ones from $HOME/.ffmpeg dir. now it seems to be encoding fine. thanks
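For reference, the built-in presets sacarasc mentions are used like this (file names and the CRF value are placeholders, not from the log):

```shell
# x264 presets now ship inside libx264 itself; no .ffpreset files are needed
ffmpeg -i input.mov -c:v libx264 -preset veryslow -crf 18 output.mp4
```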
[00:43] <gulf> hello!, is possible to use ffmpeg to convert a video file from any format to xdcam hd422 mxf file ?
[00:43] <durandal_1707> xdcam allows 422?
[00:45] <gulf> i think so
[00:46] <gulf> I have found an 422 sample in http://www.opencubetech.com/page47/
[00:47] <durandal_1707> mxf muxer writes OP1a files
[00:47] <durandal_1707> so it should be supported, assuming you put into mxf what xdcam expects
[00:48] <durandal_1707> it accepts audio as 24 bit pcm but each channel must be in separate track
[00:52] <durandal_1707> also it seems to be picky on bit rate stored in mxf
[00:56] <gulf> I am a bit of a newbie with these video formats and transcoding, so I am a bit lost. What codec should I use with ffmpeg to convert a video file to an XDCAM MXF file?
[00:56] <durandal_1707> mpeg2video
[00:57] <durandal_1707> that is the only codec supported by xdcam afaik
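A commonly circulated recipe for an XDCAM HD422-style MXF looks like the sketch below; the exact bitrate and buffer values vary between sources and are illustrative only, and note durandal_1707's caveat that each audio channel must sit in its own track (not handled here):

```shell
# sketch: 50 Mb/s CBR MPEG-2 4:2:2 video plus 24-bit PCM audio in an MXF (OP1a)
ffmpeg -i input.mov \
    -c:v mpeg2video -pix_fmt yuv422p \
    -b:v 50M -minrate 50M -maxrate 50M -bufsize 36M \
    -c:a pcm_s24le -ar 48000 \
    output.mxf
```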
[01:53] <BubbaP> Hi, does anyone know how to install the stereo3d filter for ubuntu 13.04?  It doesn't seem to be built-in to ffmpeg
[02:48] <Plorkyeran> BubbaP: the "ffmpeg" package for ubuntu is actually libav, which does not have the stereo3d filter
[02:49] <Plorkyeran> so you will need to install real ffmpeg
[02:49] <Plorkyeran> I assume there's a PPA for it
[05:51] <iceweasel> are there any ways to reduce the lag in ffserver?
[05:56] <iceweasel>  ___________________
[05:56] <iceweasel> < Is anybody there? >
[05:56] <iceweasel>  -------------------
[05:56] <iceweasel>         \   ^__^
[05:56] <iceweasel>          \  (oo)\_______
[05:56] <iceweasel>             (__)\       )\/\
[05:56] <iceweasel>                 ||----w |
[05:56] <iceweasel>                 ||     ||
[05:58] <iceweasel>  ________________________________________
[05:58] <iceweasel> / __ __ | \/ | ___ ___ | |\/| |/ _ \ / _ \
[05:58] <iceweasel> | \ | | | | (_) | (_) | |_| |_|\___/     |
[05:58] <iceweasel> \ \___/                                  /
[05:58] <iceweasel>  ----------------------------------------
[05:58] <iceweasel>         \   ^__^
[05:58] <iceweasel>          \  (oo)\_______
[05:58] <iceweasel>             (__)\       )\/\
[05:58] <iceweasel>                 ||----w |
[05:58] <iceweasel>                 ||     ||
[05:58] <sacarasc> Stop spamming.
[05:58] <iceweasel> sorry
[10:26] <luc4> Hi! When muxing using callbacks I noticed that ffmpeg needs to seek back on data to change it. Is there any way to avoid this? Maybe there are some containers that do not need ffmpeg to seek?
[10:59] <Mavrik> luc4, TS containers for example.
[11:01] <luc4> Mavrik: ah you're right, but can I place h264 into mpeg 2 for instance?
[11:02] <Mavrik> you can place H.264 into MPEG2-TS container yes
[11:02] <luc4> Mavrik: thanks!
[11:05] <spreeuw> is it possible to increase input buffer on streams played with ffplay?
[11:06] <spreeuw> like mplayers -cache options
[11:10] <nlight> what does it mean if AVCodecContext->pix_fmt is 0?
[11:11] <nlight> when decoding
[11:11] <nlight> means that the demuxer can't figure out the format from the headers?
[11:16] <luc4> Mavrik: do you know off the top of your head what I have to pass to av_guess_format to get mpeg2 ts container?
[11:17] <Mavrik> luc4, mpegts
[11:17] <Mavrik> nlight, probably the probe size was not big enough to probe pix format
[11:17] <Mavrik> nlight, or the input is broken
[11:17] <Mavrik> nlight, I suggest developing with debug version of ffmpeg so you can step through with gdb and see for yourself whats going on
[11:18] <luc4> Mavrik: thanks!
[11:18] <Mavrik> luc4, also make sure you don't have global headers flag set on H264
[11:19] <luc4> Mavrik: it seems I'll have to modify the h264 stream as ffmpeg is complaining... it will take a long time: "[mpegts @ 0x982bfe0] H.264 bitstream malformed, no startcode found, use the h264_mp4toannexb bitstream filter (-bsf h264_mp4toannexb)"
[11:20] <Mavrik> hence my last statement :P
[11:20] <Mavrik> if you don't set global header flag, H.264 will insert PPS/SPS data into stream instead of dumping it into priv_data for one-time output
[11:24] <luc4> Mavrik: I'll see how to do that, thanks!!
[11:32] <luc4> Mavrik: sorry for the stupid questions, I'm not an av expert, but do you mean that I have to place the annexb format into the mpegts instead of the avcc?
[11:33] <Mavrik> avcc
[11:33] <Mavrik> ?
[11:33] <Mavrik> luc4, but yes
[11:33] <Mavrik> H.264 stream in MPEG2-TS must be in AnnexB format
[11:33] <Mavrik> since MPEG2-TS doesn't have global headers
[11:34] <luc4> and that is why ffmpeg does not need to seek, right?
[11:37] <luc4> avcc: http://aviadr1.blogspot.it/2010/05/h264-extradata-partially-explained-for.html, I suppose ffmpeg calls it length prefixed mode.
[11:39] <luc4> Mavrik: works perfectly that way, thanks!
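The remux luc4 describes has a command-line equivalent, mirroring the hint in the error message quoted above (file names are placeholders):

```shell
# convert length-prefixed (AVCC) H.264 to Annex B while remuxing into MPEG-TS
ffmpeg -i input.mp4 -c copy -bsf h264_mp4toannexb -f mpegts output.ts
```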
[11:40] <Mavrik> luc4, the seeking is connected to mp4 container
[11:40] <Mavrik> since ffmpeg puts MP4 index at the beginning, and the index can only be written after file is done
[11:41] <JEEB> well, no
[11:41] <luc4> Mavrik: sure, I perfectly understand that.  But my problem is that I can't seek, so I was hoping there was some other solution.
[11:41] <JEEB> the index first is put in the end
[11:42] <JEEB> and then there's an option that then decides whether or not to seek and put it in the beginning
[11:42] <JEEB> :P
[11:42] <Mavrik> JEEB, there's some seeking going on even if you don't put the faststart flag
[11:42] <JEEB> if it was indeed mp4 that was the problem
[11:42] <Mavrik> no idea why really, didn't check the source
[11:42] <JEEB> o_O
[11:42] <JEEB> what the flying...
[11:42] <Mavrik> could also be fixed later :P
[11:43] <JEEB> mp4 in general shouldn't need seeking around unless the index was then moved to the beginning
[11:43] <Mavrik> I did have some annoying problems with that on pre-1.0
[11:43] <luc4> the only problem is that with the mpeg2ts I just created, seeking during playback is not working... is this correct?
[11:43] <Mavrik> luc4, hmm, it should be possible to seek mpeg2ts with most players
[11:43] <Mavrik> unless you're livestreaming it
[11:44] <luc4> I'm trying with vlc but it is showing some errors.
[11:44] <luc4> no, I'm playing locally.
[11:46] <luc4> JEEB: when muxing to mp4 I get ffmpeg requests to seek many times. Shouldn't this happen?
[11:47] <luc4> JEEB: the resulting mp4 is anyway perfect. If I do not implement seek, it lacks some information and it does not play with vlc.
[11:47] <JEEB> make sure you have the flag off regarding moving the index, as well as make sure ffmpeg doesn't think you're writing into a seekable thing
[11:48] <Mavrik> yuck
[11:48] <Mavrik> http://ffmpeg.org/doxygen/trunk/movenc_8c_source.html
[11:49] <Mavrik> search for avio_seek
[11:50] <JEEB> lol
[11:50] <luc4> yes, I see it needs to seek to write the moov etc...
[11:50] <luc4> that is exactly what is missing if I do not implement the seek callback
[11:53] <luc4> JEEB: those things you remarked are specified when allocing the context? There is a flag in there I see.
[11:57] <luc4> JEEB: ah yes sorry, found it.
[12:03] <luc4> JEEB: If I set seekable = 0 I get: [mp4 @ 0xa7bafe0] muxer does not support non seekable output
[12:04] <JEEB> luc4, then you'd have to use some other muxer I guess
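One such alternative, assuming a build new enough to support these movflags, is fragmented MP4, which never needs to seek back because an empty moov is written up front and each fragment is self-contained:

```shell
# fragmented MP4: empty moov at the start, then self-contained moof fragments,
# so the muxer can write to a pipe without ever seeking
ffmpeg -i input.ts -c copy -movflags frag_keyframe+empty_moov -f mp4 pipe:1 > out.mp4
```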
[12:05] <luc4> JEEB: I'm not bound to mp4, I can change. But I tried mpegts, which does not require seeking while muxing, but when seeking during playback I get many errors. Anything else that comes to your mind?
[12:05] <luc4> JEEB: I need to mux h264.
[12:06] <luc4> JEEB: also, my h264 stream has vfr.
[12:08] <JEEB> luc4, make sure you're passing SPS and PPS before every seeking point with mpeg-ts
[12:12] <luc4> JEEB: sorry, by "seeking point" you mean I-frame?
[12:12] <JEEB> a Random Access Point
[12:14] <elkng> I have some video file and these are its parameters http://sprunge.us/gHfO (codec and stuff). Why is it that wherever I seek forward or backward in the player it looks this way for a few seconds: http://image.bayimg.com/f6cd8a540b1d5c0dc17561430c747aad1eadfaf5.jpg ? What is all that garbage: is it a player problem, or was the file encoded that way?
[12:33] <luc4> JEEB: in fact, I'm not passing SPS and PPS to ffmpeg. How do I pass those to ffmpeg? Do I have to create a separate AVPacket or do I have to prepend them to the I-frame data passed in the AVPacket?
[12:35] <Paranoialmaniac> put them on extradata
[12:51] <spidey_> where would I find the option to enable a specific rtmp authentication?
[12:59] <parshap> is there a way to determine an output format based on an input's format?
[13:01] <parshap> i.e., i want my output to be the same format as my input; the normal output format auto-detection doesn't work for me because i'm piping, so there is no file name extension to inspect
[13:01] <nlight> void av_dump_format(AVFormatContext *ic, int index, const char *url, int is_output)
[13:02] <nlight> is second parameter stream index?
[13:03] <parshap> nlight: using the command line utility, not lib
[13:03] <ItsMeLenny> how do i go about not copying audio?
[13:03] <ItsMeLenny> in terms of:  avconv -i ./Untitled01.avi -c:v copy -c:a none output.avi
[13:03] <ItsMeLenny> (except none doesnt work)
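The standard way to drop audio entirely is `-an`, which disables audio recording; the same flag exists in both ffmpeg and avconv:

```shell
# -an removes the audio stream instead of copying or re-encoding it
avconv -i ./Untitled01.avi -c:v copy -an output.avi
```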
[13:04] <nlight> parshap, i was asking for myself but ok ;p
[13:04] <parshap> nlight: hehe nm then :)
[13:08] <parshap> when using -codec copy does output format even matter?
[13:09] <ItsMeLenny> no it doesn't, but it will affect the audio if you only specify video to copy
[13:09] <ItsMeLenny> i figured out what i wanted btw
[13:09] <nlight> ok I'm really bummed, what do I need to do to have AVCodecContext->pix_fmt properly initialized?
[13:09] <nlight> whatever file I open it's always 0
[13:11] <ItsMeLenny> i think i missed what you're working on in the first place
[13:11] <ItsMeLenny> ok, now i need a command to convert video into lossless h.264 with 4:2:2 chroma
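One plausible answer to the lossless 4:2:2 question, assuming an x264 build with High 4:2:2 profile support (file names are placeholders):

```shell
# -qp 0 makes x264 lossless; yuv422p keeps 4:2:2 chroma
ffmpeg -i input.mov -c:v libx264 -qp 0 -pix_fmt yuv422p output.mkv
```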
[13:15] <parshap> Anyone know how I can get this to simply output the file back to me? `ffmpeg -i ./audio.bin -acodec copy -`. I get "Unable to find suitable output format"
[13:21] <spidey_> I see in libavformat.a authmod=adobe and authmod=llnw, but the options to use them don't seem apparent
[13:44] <nlight> 0 is a valid format..... really? I mean.. really?
[13:44] <nlight> i spent an hour on this lol
[13:45] <parshap> nlight: what are you doing exactly? seems maybe related to what i'm trying to do.
[13:46] <nlight> i'm decoding a video and kept getting pix_fmt == 0 thinking it meant AV_PIX_FMT_NONE
[13:46] <nlight> but it actually means AV_PIX_FMT_YUV420P
[13:46] <nlight> which is beyond retarded but yae
[13:46] <nlight> yea*
[13:46] <parshap> hah
[14:22] <nlight> should I read from AVFrame->data or AVFrame->data[0]?
[14:35] <nlight> to answer my own question -> data[0]
[14:52] <ItsMeLenny> how would i pass a list of images to make a video, where i specify each image's name?
[15:03] <nlight> ItsMeLenny, http://en.wikibooks.org/wiki/FFMPEG_An_Intermediate_Guide/image_sequence
[15:04] <ItsMeLenny> nlight, i have the image sequence as a video working perfectly, however i want to specify more images that don't fit the sequential file names, and double up on some images in between
[15:05] <ItsMeLenny> so i basically want to create a kind of xsheet image list that puts the images in the order i choose into a video
[15:05] <nlight> no idea, sorry
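The concat demuxer (added to ffmpeg after the wikibook page linked above was written) can do this: list each frame by name, repeat lines to double frames up, and give each a duration. File names here are placeholders, and on some versions the last entry should be repeated so its duration is honoured:

```shell
cat > xsheet.txt <<'EOF'
file 'walk1.png'
duration 0.04
file 'walk2.png'
duration 0.04
file 'walk1.png'
duration 0.04
EOF
ffmpeg -f concat -i xsheet.txt -c:v libx264 out.mp4
```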
[16:19] <w2008usr> Hi to all
[16:21] <w2008usr> I need to read an H264 stream and stream it in MJPEG format
[16:22] <w2008usr> I understand that ffmpeg is able to perform this transcoding
[16:24] <w2008usr> I did some experiments, but I always get the error message "Protocol not found" if I try to stream out as rmpt or rtp
[16:25] <w2008usr> Maybe in command line like follow:
[16:25] <w2008usr> ffmpeg -i 'rtsp://source/encoder1' [... a lot of options ...] -f flv 'rtmp://server/live/cam0'
[16:25] <w2008usr> keyword "server" stands for an external component that I miss?
[16:27] <w2008usr> Any suggestion?
[16:29] <w2008usr> ( I've googled a lot, and also read documentations, but I'm unable to solve the problem )
[16:37] <kotfic> Hi,  I'm trying to encode an m4a file as an mp3 but i'm getting an error with libswresimple.so.0 https://gist.github.com/kotfic/6131991
[16:39] <b4u> any idea why vp8 encoded through ffmpeg is about 15% slower than vpxenc?
[16:39] <b4u> same lib version
[16:39] <zap0> b4u, using   -threads 99
[16:39] <zap0> or something like it!?!
[16:39] <b4u> using -threads 6
[16:39] <b4u> on both
[16:40] <b4u> same settings
[16:40] <b4u> and same input y4m, for fairness
[16:40] <zap0> one likely compiled with SSE optimizations etc.
[16:40] <b4u> they're the same
[16:41] <b4u> downloaded libvpx, compiled, did make install
[16:41] <zap0> 15% is a typical code optimization amount.
[16:41] <b4u> that installed vpxenc, and also let me compile ffmpeg with libvpx support
[16:41] <zap0> what thread priority is set when it runs?
[16:42] <b4u> how do I tell?
[16:42] <zap0> -threads 6 on a 4core can be detrimental if threads are not deliberately core-assigned.
[16:42] <zap0> what CPU?   do you have 6 "real" cores?
[16:42] <w2008usr> ( No answers for my question? )
[16:43] <b4u> it's a server
[16:43] <b4u> 12 real cores, hyperthreaded to 24
[16:43] <zap0> and the workload on the server is much the same when encoding both?
[16:43] <b4u> there's nothing running on it other than the encodings, so yeah
[16:44] <b4u> w2008usr: what was yours?
[16:44] <w2008usr> I want to transcode an h264 stream to an MJPEG stream
[16:44] <zap0> do you see a fairly even distribution of workload over all 6 cores during encoding?
[16:44] <b4u> zap0: to be honest neither vpxenc nor ffmpeg use 100% CPU on any of the cores
[16:45] <zap0> so you are storage limited?
[16:45] <b4u> but I see that vpxenc is using around 420% in top and ffmpeg is using around 280%
[16:46] <w2008usr> Experimenting , I noticed that any attempt to stream-out in "rtp", "rmtp", "rstp" stops ffmpeg with error message "Protocol not found"
[16:47] <b4u> is it listed in ffmpeg -protocols?
[16:48] <b4u> if not, perhaps for some reason your build has it disabled
[16:48] <b4u> as it should be enabled by default
[16:48] <saste> w2008usr, !pb
[16:48] <w2008usr> I'll check "-protocols" and let you know
[16:50] <b4u> zap0: I think you're right btw
[16:50] <b4u> about hardware compilation
[16:50] <b4u> I checked ffmpeg compile line (I didn't build it) and someone compiled with --disable-yasm...
[16:50] <b4u> recompiling now
[16:50] <w2008usr> rtp is listed
[16:50] <b4u> rtmp should be
[16:51] <zap0> b4u ;)
[16:51] <w2008usr> (I'm going to paste in pastebin or similar)
[16:52] <w2008usr> rtmp is present too
[16:52] <b4u> w2008usr: in that case do what saste says and pastebin your whole command and output
[16:52] <w2008usr> Maybe I wrote the wrong protocol?
[16:52] <w2008usr> ( I'm going to paste... "please hold the line"  )
[16:54] <b4u> zap0: still not really sure why it doesn't max the cores I give it though, x264 does
[16:54] <zap0> cause it's too efficient!
[16:54] <zap0> maybe you need faster storage
[16:54] <b4u> it's SSDs :<
[16:54] <zap0> or perhaps this means you could increase your quality settings
[16:55] <zap0> is it RAID?
[16:55] <b4u> this is a pretty decent production server... 2 CPUs, RAID10 SSDs (yay no trim), 100GB+ of RAM, etc.
[16:55] <Mavrik> are the CPUs NUMA?
[16:55] <zap0> sounds like a dream machine!
[16:56] <zap0> i wish i had one :(
[16:56] <b4u> Mavrik: is that something you can tell if I give you the model number?
[16:56] <b4u> not sure of the answer
[16:56] <Mavrik> not really
[16:56] <zap0> mobo model number ?
[16:56] <Mavrik> but that usually explains non-100% usage on multisocket machines
[16:57] <b4u> either way, hoping I'll see a speed boost with yasm enabled...
[16:57] <zap0> b4u, he makes a good point.  you should try locking the process's threads to a single CPU.
[16:58] <zap0> that might improve cache issues
[16:58] <b4u> ah can't do that, I am testing baremetal now but ultimately this will be rolled out virtualised
[16:58] <b4u> so won't have that kind of control
[16:58] <w2008usr> I'm back
[16:59] <zap0> if it's going to be virtualized, 15% might turn into 50
[16:59] <Mavrik> ugh yeah
[16:59] <w2008usr> ( ops... please wait )
[16:59] <w2008usr> ( I'm sorry )
[16:59] <Mavrik> don't expect perfect utilization on virtualized servers
[16:59] <b4u> of course
[16:59] <b4u> seeing way better utilisation by ffmpeg now with yasm
[17:00] <b4u> I will stab whoever compiled it
[17:00] <w2008usr> Here is it:
[17:00] <w2008usr> http://pastebin.com/7eSU4Hbe
[17:01] <b4u> it now beat vpxenc by quite some margin :)
[17:01] <w2008usr> ( mistake... rmtp instead of rtmp...  the error is now "failed to connect socket")
[17:01] <b4u> is this linux?
[17:01] <w2008usr> No
[17:01] <w2008usr> Windows
[17:01] <b4u> it's your output which is failing
[17:02] <w2008usr> "16:55:16,65>" is the prompt
[17:02] <b4u> but, I am not familiar with streaming so
[17:02] <b4u> maybe someone else can explain why
[17:02] <w2008usr> Oh
[17:03] <w2008usr> ( hope someone can explain )
[17:05] <b4u> it is obviously retrieving your input fine because it has codec, dimensions etc.
[17:05] <w2008usr> Yes, read works
[17:06] <saste> b4u, your rtmp url is not complete, check docs
[17:06] <w2008usr> I've seen the stream with ffplay
[17:06] <b4u> saste: not mine
[17:06] <b4u> w2008usr: I think maybe your output should be rtp:// not rtmp:// for that ip:port format?
[17:07] <w2008usr> I try
[17:07] <w2008usr> it is streaming something
[17:08] <w2008usr> maybe rtmp need some other external component/server to work?
[17:08] <b4u> well as saste said I think your rtmp URL is incorrect
[17:09] <w2008usr> rtp doesn't work: "Unsupported RTP version packet received" (ffprobe)
[17:09] <b4u> rtmp is supposed to be like: rtmp://127.0.0.1:1234/destinationresource
[17:10] <b4u> can try -f rtp with that instead of -f flv
[17:10] <w2008usr> ( now ffmpeg works - is streaming - but ffprobe says "Unable to receive RTP payload type 106 without an SDP file describing it")
[17:11] <w2008usr> You said it right
[17:11] <w2008usr> But player can't read it
[17:12] <b4u> see this
[17:12] <b4u> http://trac.ffmpeg.org/wiki/StreamingGuide
[17:12] <w2008usr> There is an option to list codec?
[17:12] <b4u> ffmpeg -i input -f rtsp -rtsp_transport tcp rtsp://localhost:8888/live.sdp
[17:20] <w2008usr> The ffprobe says "rstp://127.0.0.1:1234/live.dsp: Protocol not foundsq=    0B f=0/0"
[17:23] <w2008usr> here is ffmpeg / ffprobe: http://pastebin.com/Eq0EaC9x
[17:25] <w2008usr> However, can you confirm that - apart from my mistakes - ffmpeg can take an H264 stream as input and stream it out as MJPEG?
[17:28] <w2008usr> ( I think so, but I'm not sure, and I have to decide whether to continue this way or rewrite a lot of an existing system's code )
[17:28] <b4u> ffmpeg can convert pretty much anything to anything :P
[17:29] <b4u> but of course depending on the quality it may not be realtime
[17:30] <b4u> did you read the page I linked?
[17:31] <b4u> also
[17:31] <w2008usr> http://trac.ffmpeg.org/wiki/StreamingGuide ?
[17:31] <b4u> you typoed in ffprobe
[17:31] <b4u> you did .dsp
[17:31] <b4u> instead of .sdp
[17:31] <w2008usr> Bookmarked a couple of hours before I entered here..
[17:31] <b4u> try ffprobe with correct url
[17:32] <w2008usr> you're right
[17:32] <w2008usr> I try
[17:33] <w2008usr> ( My neurons are in need of rest )
[17:34] <w2008usr> "Protocol not found" again
[17:35] <b4u> what if you try to play direct?
[17:35] <b4u> the play command would be this - ffplay -rtsp_flags listen rtsp://localhost:1234/live.sdp
[17:36] <w2008usr> Protocol not found
[17:37] <b4u> weird
[17:37] <b4u> I am not sure sorry, I am not that familiar with streaming files
[17:38] <w2008usr> Maybe both command lines are wrong: "rstp://" instead of "rtsp://"
[17:38] <w2008usr> You do not have to be sorry, I'm grateful for helping me :-)
[17:43] <w2008usr> Ok, I'm going
[17:43] <w2008usr> Thank you
[17:43] <w2008usr> Bye!
[19:41] <yousef> i just posted a question regarding libffmpeg on the forum at http://ffmpeg.gusari.org/viewtopic.php?f=11&t=1027 ... if anyone has the time, please take a look. otherwise, i'll wait for an answer (hopefully) on the forum.
[19:43] <durandal_1707> yousef: but the ffmpeg project has nothing to do with the libffmpeg thing
[19:43] <yousef> it does not? please excuse my ignorance.
[19:44] <JEEBsv> there is no such library as libffmpeg provided by ffmpeg
[19:44] <yousef> no libavcodec, libavformat, etc?
[19:45] <JEEBsv> yes, libavcodec, libavformat, libavutil etc. are ffmpeg libraries
[19:45] <JEEBsv> but there is no such thing as libffmpeg
[19:50] <yousef> my apologies. by libffmpeg i was referring to the collection of ffmpeg libraries.
[19:50] <yousef> no libffmpeg.so
[19:50] <yousef> not*
[19:52] <yousef> i just edited my post and corrected this.
[20:57] <RobertNagy> is avfilter slice threading supported?
[21:09] <durandal_1707> RobertNagy: yes
[21:10] <durandal_1707> you can see which filters have slice threading enabled with: ffmpeg -h filters
[21:10] <durandal_1707> *without '-h'
[21:11] <durandal_1707> RobertNagy: you are interested in specific filter or?
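The listing durandal_1707 refers to looks like this; on recent builds the flags column printed before each filter name marks slice-threading support with an 'S' (the exact legend varies by version and is printed at the top of the output):

```shell
# list all filters together with their threading/timeline capability flags
ffmpeg -filters
```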
[21:34] <Martijnvdc> hello.
[21:34] <Martijnvdc> i decode a webcam stream, and then encode it using VP8. The webcam stream has a yuyv422 pixel format, which the VP8 encoder says it doesn't support.
[21:34] <Martijnvdc> How could i change the pixel format, so that i can encode the stream using VP8, please?
[21:34] <durandal_1707> what you use to encode?
[21:34] <Martijnvdc> i'm writing in C, by the way
[21:35] <Martijnvdc> VP8
[21:35] <durandal_1707> there is code that gives list of encoder supported pixel formats
[21:35] <Martijnvdc> i read a webcam stream from /dev/video0, and decode that stream
[21:35] <Martijnvdc> would there be no way to convert the pixel format?
[21:36] <durandal_1707> you can convert with swscale
[21:36] <durandal_1707> or using format filter within libavfilter
[21:38] <Datalink-M> I'm trying to connect a desktop stream to someone else's server over RTMP, I haven't yet figured out the needed settings for the RTMP portion
[21:40] <Datalink-M> User name, stream name and password were provided, as well as the needed URLs
[21:43] <Datalink-M> Honestly I should come back when I get home, I don't have the stream info right now
[21:51] <Martijnvdc> durandal_1707: thanks a lot! :)
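The format-filter route durandal_1707 suggests has a command-line equivalent, which may serve as a reference even when doing the same in C (the device path and output names are assumptions about the setup):

```shell
# let the format filter convert the webcam's yuyv422 frames to yuv420p for VP8
ffmpeg -f v4l2 -i /dev/video0 -vf format=yuv420p -c:v libvpx out.webm
```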
[22:20] <spidey_> where would I find the options to enable rtmp authentication?
[22:27] <brad_c6> Is it possible to use a std::vector<uint8_t> as the input buffer for decoding audio? It gets to the decode function and says the mp3 header is missing (http://pastebin.com/ZFzrdv4S) Thank You
[22:35] <durandal_1707> spidey_: there is documentation
[22:36] <spidey_> sorry, didn't see it in there
[22:37] <Datalink-M> I've been looking for days and I couldn't find smtp authentication either
[22:37] <durandal_1707> Datalink-M: smtp?
[22:37] <spidey_> think it was was supposed to be rtmp
[22:38] <Datalink-M> Rtmp in my case
[22:38] <durandal_1707> there is everything in documentation for protocols
[22:38] <PatNarciso> Hey fellas.  I'm interested in making a 24/7 stream.  There is plenty of documentation on that.  From time to time within that stream, I'd like to overlay another stream.  There is plenty of documentation related to overlays -- but I'd like to "switch" the overlay on and off as needed, without restarting the ffmpeg stream in progress.
[22:38] <Datalink-M> i repeat, I've been searching it for days without luck
[22:39] <PatNarciso> So - my question is -- how would you fellas go about doing this?  Is there a term for what I'm trying to do?  As of right now, I was thinking about making a script/application that would manage a series of ffmpeg pipes.  When it was time to "switch" on the overlay, the application could direct the source pipe through an additional ffmpeg process in charge of the overlay, and when it was time to stop, would direct the pipe traffic back to
[22:39] <PatNarciso>  the ffserver process.  Think this is possible?
[22:39] <durandal_1707> Datalink-M: read documentation i linked
[22:39] <kevint_> I'm trying to modify the behavior of incoming raw images (over a pipe) while encoding to h264 - I'm looking for the source file that sets the PTS of these incoming images. Where can I find it?
[22:41] <Datalink-M> durandal_1707: I've been hunting through that for days... this is the third time I've said I have been looking and couldn't find it, telling me to RTFM is redundant to the fact I can't find it in the manual
[22:41] <durandal_1707> Datalink-M: what you can not find?
[22:41] <Datalink-M> Rtmp authentication
[22:41] <durandal_1707> did you find the rtmp protocol documentation?
[22:41] <Datalink-M> Yes
[22:41] <PatNarciso> RTMP Doc: http://www.ffmpeg.org/ffmpeg-protocols.html#rtmp
[22:42] <Datalink-M> Yes... I've tried commands to the point I'm past what should work
[22:43] <durandal_1707> what ffmpeg version are you using?
[22:43] <durandal_1707> what you actually tried?
[22:45] <Datalink-M> 0.8.6 armhf raspbian, user at pass, as well as a few variations of app= parameters, I'd have to check when I get home for more details
[22:46] <Datalink-M> Rtmp://user@pass:host/app/stream
[22:47] <Datalink-M> Basically a lot of stuff for the flv string
[22:47] <Datalink-M> Meh, I'm gonna come back when I'm home
[22:48] <Mavrik> PatNarciso, basically, write your own transcoder; ffmpeg doesn't really support what you want
[22:48] <Mavrik> it's not meant as a permanent video streamer, it's a file transcoder first and foremost
[22:49] <PatNarciso> Mavrik: right, and it does a great job at that.  and broadcasting -- both pregenerated and live content.
[22:50] <PatNarciso> It also does a great job, via filters, of allowing a person to manipulate video (or audio).
[22:51] <PatNarciso> I'd imagine someone must have made a "switcher" powered by ffmpeg.  I figured that this would be the place to ask before attempting to create my own.
[23:15] <kevint_> I'm trying to alter the PTS of incoming raw rgba images in the source (from pipe, encoding into h264) - what source file does this?
[23:15] <kevint_> I'm swimming through hundreds of source files and just cannot find it
[23:31] <parshap> Anyone know how I can get this to simply output the file back to me? `ffmpeg -i ./audio.bin -acodec copy -`. I get "Unable to find suitable output format".
[23:32] <PatNarciso> define the output format you'd like returned, and then you'll be good to go.
[23:33] <klaxa> if... if you want to get it raw without transcoding, just "cat audio.bin"
[23:34] <parshap> PatNarciso: the problem is that my audio.bin could be mp3, could be ogg... i don't want to "hard-code" an output format, I just want to keep it the same as the input
[23:34] <parshap> klaxa: in reality I am using `-metadata` to sets some metadata on the file]
[23:34] <klaxa> then you have to specify a container too
[23:35] <parshap> klaxa: i want to keep the same container as the input source
[23:35] <parshap> klaxa: i.e., if audio.bin was an mp3, use mp3. if it was ogg, use ogg.
[23:35] <klaxa> shouldn't be too hard with a little scripting
[23:35] <klaxa> yeah
[23:35] <klaxa> but you will have to add -f <container>
[23:35] <parshap> klaxa: :(
[23:35] <klaxa> and you can't use just "-" but use "pipe:" or "pipe:1"
[23:35] <parshap> klaxa: so i guess i could parse the format out of the ffmpeg output
[23:36] <klaxa> mhh yes that would be one way
[23:36] <parshap> klaxa: and then run ffmpeg again passing -f with the format i got from the first run
[23:36] <klaxa> yep
[23:36] <parshap> klaxa: i'd like to avoid having to read the file twice though, any ideas?
[23:36] <klaxa> or you just check the file for the extension and assume it's the right one
[23:36] <PatNarciso> for parsing the output, see ffprobe, and the json format is my favorite.
[23:37] <parshap> PatNarciso: awesome! thanks for the tip - it looked like it was going to be a pain to get the format from ffmpeg output
[23:38] <klaxa> for file in *; ffmpeg -i "$i" -metadata author="someone awesome" -c copy -f "$(echo $i | sed s/.*\.//)" pipe:
[23:38] <parshap> which there was a way to avoid running two procsses though - wish i could do something like `-f copy` like i can `-c copy`
[23:38] <klaxa> uh... damn :P
[23:38] <PatNarciso> parshap, no problem.  I spent too many hours parsing ffmpeg output before I learned that.
[23:38] <klaxa> for i# in *; ffmpeg -i "$i" -metadata author="someone awesome" -c copy -f "$(echo $i | sed s/.*\.//)" pipe:
[23:38] <klaxa> without the #
[23:38] <klaxa> stupid keyboard
[23:38] <klaxa> argh that's still no valid bash
[23:38] <parshap> *wish - not which
[23:39] <klaxa> for i in *; do ffmpeg -i "$i" -metadata author="someone awesome" -c copy -f "$(echo $i | sed s/.*\.//)" pipe: | <something else>; done
[23:39] <klaxa> now it should be valid bash
[23:39] <parshap> klaxa: i don't have filenames, dealing with just raw buffers of data
[23:39] <klaxa> ah
[23:39] <klaxa> well then ffprobe is your best bet, but you'll still have to read the file twice
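A sketch of the two-process approach discussed above, assuming an ffprobe new enough to support `-show_entries` (when ffprobe reports a list such as "mov,mp4,m4a,3gp,3g2,mj2", the first entry is taken):

```shell
# pass 1: ask ffprobe for the container name; pass 2: remux with the same -f
fmt=$(ffprobe -v error -show_entries format=format_name \
        -of default=noprint_wrappers=1:nokey=1 audio.bin)
ffmpeg -i audio.bin -metadata author="someone" -c copy -f "${fmt%%,*}" pipe:1
```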
[23:39] <parshap> cool, thanks for the help!
[23:39] <PatNarciso> klaxa, whats the best format to use when running a bunch of ffmpeg processes tied together with pipes?
[23:40] <klaxa> in how far?
[23:40] <parshap> out of curiosity, is there a technical reason why something like `-f copy` doesn't exist?
[23:40] <klaxa> hmm... good question
[23:41] <klaxa> PatNarciso, can you explain the usecase a little bit further?
[23:41] <klaxa> in general i would think passing data through pipes is best done in raw data since it requires neither compression nor decompression
[23:42] <PatNarciso> sure.  ffmpeg -i whatever.mpg - | ffmpeg -i - blah.mp4
[23:42] <klaxa> why... would you do that?
[23:42] <PatNarciso> obviously in that example, there is no need for a pipe
[23:42] <klaxa> ah yeah...
[23:42] <klaxa> you want multiple outputs i guess?
[23:43] <PatNarciso> I'm trying to cook up the elements needed to create a switcher (or router?) powered by ffmpeg.
[23:43] <klaxa> if so there is no need for pipes at all
[23:43] <klaxa> afaik
[23:43] <PatNarciso> or in other words: create a live stream, empowering a person to switch feeds at will.
[23:44] <klaxa> in how far livestream?
[23:45] <klaxa> i mean if you write stuff to a pipe it's blocking
[23:45] <klaxa> if you write stuff to a file it's not live
[23:46] <PatNarciso> I'm thinking newsroom application. 4 cameras. and a dude choosing the camera to be on the single output.
[23:47] <PatNarciso> good point about the blocking.  fifo may not be the best idea.
[23:48] <klaxa> you would need a decoder that decodes the stream and only sends it out over fifo if requested
[23:49] <klaxa> but mind you, the output has still to be compliant with container/codec specs, i.e. correct header/trailer
[23:52] <PatNarciso> right.  somewhere in my notes I found a person who was removing that, using... I think it was an mpeg-ts stream.
[23:54] <klaxa> maybe so, not too familiar with that
[23:54] <klaxa> i've done it once for mp3 but that's it
[23:54] <Mavrik> PatNarciso, how different are those formats?
[23:55] <klaxa> shouldn't one be able to put just everything into one matroska container and selectively play one video stream?
[23:55] <PatNarciso> I recall trying to concat two streams before, and the mpeg header of the second file came along and upset the ffmpeg process.  I think the doc's have an example of how to get around that one.
[23:56] <Mavrik> PatNarciso, MPEG-TS may be switched since it's a streaming format
[23:56] <Mavrik> just don't expect your decoders to survive a resolution switch or a timestamp jump
[23:56] <PatNarciso> Mavrik, so, for example: the source of camera1 and camera2 could be anything.  but I have no problem running additional ffmpeg processes to convert those streams to "whatever" format is needed.
[23:56] <PatNarciso> right.
[23:57] <Mavrik> so yeah, package your streams into MPEG-TS container
[23:57] <Mavrik> make sure they're in same color space, video resolution and formats
[23:57] <Mavrik> and that could be okish
[23:57] <mark4o> and I-frame only
[23:57] <PatNarciso> it's my hopes that, I can chain a few ffmpeg processes together -- and a -switch on the final process be told to ignore the timestamp (and generate a new one?)
[23:57] <Mavrik> mark4o, nah.
[23:58] <Mavrik> PatNarciso, MPEG-TS can be concatenated on TS packet boundaries
[23:58] <Mavrik> even though, putting all of that in a horrible chain of pipes will probably work like crap
[23:58] <Mavrik> considering a specific usecase a more custom solution using libav would probably be way more performant and stable
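A minimal sketch of the TS-splicing idea under Mavrik's constraints (identical resolution, pixel format and codec parameters; source names are placeholders):

```shell
# encode both cameras with matching parameters into MPEG-TS
ffmpeg -i cam1_source -s 1280x720 -pix_fmt yuv420p -c:v libx264 -f mpegts cam1.ts
ffmpeg -i cam2_source -s 1280x720 -pix_fmt yuv420p -c:v libx264 -f mpegts cam2.ts
# MPEG-TS has no global header, so segments can simply be byte-concatenated
cat cam1.ts cam2.ts > switched.ts
```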
[00:00] --- Fri Aug  2 2013


More information about the Ffmpeg-devel-irc mailing list