[Ffmpeg-devel-irc] ffmpeg.log.20151006

burek burek021 at gmail.com
Wed Oct 7 02:05:01 CEST 2015

[03:21:42 CEST] <rsgm2> So I may have found a bug, or my file is corrupted. When I try to concat these 3 video files, it outputs a video with too short a duration and way too high a bitrate
[03:21:57 CEST] <rsgm2> Here are the files, plus one with a lot more detailed info on it
[03:22:02 CEST] <rsgm2> https://mega.nz/#F!xsoGWIBA!1EQskrUQ1_09LenwhPawgw
[03:22:48 CEST] <rsgm2> Could someone tell me if I am using it the wrong way, if the audio really is corrupted, or if it's a bug in ffmpeg?
[07:42:48 CEST] <voxadam> In the case of streaming MPEG-DASH, is the bytestream truly streamed in a classic sense, or does the client simply download and render the chunks in order?
[08:12:29 CEST] <TD-Linux> voxadam, well that's kind of a hard question. it can be truly streamed, but generally the client throttles by doing HTTP range requests to only grab a chunk of the file at a time
[08:12:41 CEST] <TD-Linux> (or separate files if the file is chunked)
[08:20:44 CEST] <waressearcher2> if I want to convert *.ac3 to *.wav, is this the right command: "ffmpeg -i in.ac3 -ac 2 -ar 44100 -ab 320k -y -f wav out.wav" ? I mean, should I use those "-ac 2 -ar 44100 -ab 320k" ?
[08:23:25 CEST] <voxadam> TD-Linux: Would you say that segmented protocols like DASH and HLS are taking over from RTMP and RTSP, or does it depend on the use case? Are there still cases where streaming using RTSP is preferred?
[08:23:50 CEST] <voxadam> I'm particularly interested in static (non-live) streaming.
[08:24:31 CEST] <voxadam> For static streams is there still a reason to use RTSP/RTMP?
[08:26:25 CEST] <TD-Linux> voxadam, no, all RTMP gets you is dependency on flash
[08:27:00 CEST] <voxadam> Is there a benefit to streaming content via RTP encapsulated RTSP along with RTCP?
[08:27:33 CEST] <voxadam> So would it be fair to say that the classic protocols are falling by the wayside?
[08:27:38 CEST] <TD-Linux> RTSP is something totally different
[08:27:47 CEST] <TD-Linux> WebRTC uses RTP
[08:27:55 CEST] <voxadam> I meant RTCP.
[08:28:00 CEST] <voxadam> Initialism hell.
[08:28:10 CEST] <TD-Linux> no you mean RTSP I think
[08:28:51 CEST] <voxadam> RTCP
[08:28:54 CEST] <voxadam> :/
[08:29:27 CEST] <TD-Linux> WebRTC uses both RTP and RTCP. RTCP is the feedback channel for congestion control etc
[08:29:36 CEST] <voxadam> Is there any reason DASH or HLS can't be used in a VOD situation? Something like XBMC/Kodi, Plex, etcetera.
[08:29:43 CEST] <TD-Linux> RTSP is a way to initiate RTP streams using SDP
[08:29:49 CEST] <TD-Linux> RTSP is replaced with JSEP in WebRTC
[08:30:07 CEST] <TD-Linux> voxadam, no, they are a good choice for that
[08:30:09 CEST] <voxadam> What benefit does WebRTC have over other protocols? What is the use case?
[08:31:15 CEST] <TD-Linux> WebRTC uses RTP over SCTP over UDP, which gives it flexibility for e.g. dropping packets, which makes it good for realtime
[08:31:27 CEST] <TD-Linux> when you need low latency video, for example videoconferencing
[08:31:45 CEST] <TD-Linux> though you can totally use it for high latency if you want.
[08:32:00 CEST] <voxadam> Interesting.
[08:32:37 CEST] <voxadam> It seems that time hasn't made video streaming much easier.
[08:32:39 CEST] <voxadam> :)
[08:33:37 CEST] <TD-Linux> well you can just do <video src="file"> and BAM, video streaming :)
[08:34:23 CEST] <voxadam> I also just learned about the existence of HTSP, which is used by Kodi, Myth, and probably others.
[08:35:29 CEST] <TD-Linux> much of DASH exists just to work around the limitations of the .mp4 container. because they HAD to use it
[08:40:17 CEST] <TD-Linux> or limitations of CDNs, like not supporting HTTP range requests
[08:42:02 CEST] <JEEB> DASH actually has a mode for that, methinks. I'd rather phrase it so that HLS and DASH were made because companies got lazy with real streaming protocols and noticed that we have a large http serving infrastructure that could be (ab)used
[08:42:29 CEST] <waressearcher2> voxadam: are you that PilzAdam ?
[08:42:45 CEST] <TD-Linux> JEEB, yeah DASH can operate either with a full file or with the file fragmented into individual files
[08:43:09 CEST] <JEEB> since you only need a http daemon that doesn't have to know jack shit about the stream that's being published
[08:43:23 CEST] <TD-Linux> and to be fair, HTTP itself is totally fine for this, DASH is just overly complex and tries to meet tons of needs
[08:43:54 CEST] <waressearcher2> are you all talking about that youtube streaming html5 feature ?
[08:44:05 CEST] <JEEB> the standard it uses
[08:44:16 CEST] <JEEB> iso/iec mpeg DASH
[08:44:24 CEST] <waressearcher2> are you all talking about that standard that youtube streaming html5 feature used ?
[08:44:27 CEST] <JEEB> whatever the number was
[08:44:38 CEST] <TD-Linux> part 14?
[08:44:41 CEST] <TD-Linux> or is that fmp4
[08:44:59 CEST] <waressearcher2> I should try later
[08:45:16 CEST] <TD-Linux> waressearcher2, or you could read
[08:46:18 CEST] <voxadam> waressearcher2: Nope.
[08:46:24 CEST] <voxadam> I'm just me.
[08:46:37 CEST] <TD-Linux> but yeah you can just stick a .mp4 or .webm with index on a web server. web browser reads index, then starts playing and can seek with http range. no dash or anything needed.
[08:47:02 CEST] <voxadam> I wasn't aware that DASH supported a non-segmented method.
[08:47:19 CEST] <TD-Linux> voxadam, it does, then it just specifies byte ranges inside one file.
[08:47:32 CEST] <voxadam> Interesting.
[08:47:46 CEST] <TD-Linux> you know, exactly the same information the file's header already has :)
[08:47:51 CEST] <voxadam> Can it still be served using a "dumb" HTTP server?
[08:47:54 CEST] <waressearcher2> TD-Linux: I mean my initial question
[08:48:09 CEST] <voxadam> Is seaking supported?
[08:48:10 CEST] <TD-Linux> voxadam, yup
[08:48:13 CEST] <voxadam> seekign.
[08:48:17 CEST] <voxadam> er...
[08:48:20 CEST] <TD-Linux> with http you can request byte ranges of a file
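TD-Linux's point about HTTP byte ranges can be sketched like this. The snippet below only prints the raw request a client would send; the host and path are placeholders, not anything from the log:

```shell
# The raw request a client sends for bytes 1000-1999 of a file.
# Host and path are placeholders. A server that supports ranges replies
# "206 Partial Content" with a Content-Range header; a "dumb" server
# that doesn't replies 200 with the whole file.
range_request=$(printf 'GET /video.mp4 HTTP/1.1\r\nHost: example.com\r\nRange: bytes=1000-1999\r\n')
printf '%s\n' "$range_request"
```

This is exactly the mechanism a DASH client relies on when it fetches one chunk of a single non-segmented file at a time.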
[08:48:21 CEST] <flux> jeeb, it's not only about there being a CDN ready.. if it were more complicated they would probably need to pay CDNs more money :)
[08:48:41 CEST] <TD-Linux> in fact I believe youtube uses the one-file method
[08:49:01 CEST] <flux> does youtube use dash for live streaming as well? in that case it probably does not.
[08:49:32 CEST] <TD-Linux> I'm not sure, I've not caught a youtube live stream recently to look at
[08:49:51 CEST] <flux> but it sounds reasonable that youtube would use a single file method, they probably have their own CDNs that work with whole videos etc.
[08:49:56 CEST] <voxadam> I've read about nginx-rtmp, which can be used to stream DASH and HLS from an RTMP input. What benefits would it provide?
[08:50:14 CEST] <TD-Linux> none
[08:50:29 CEST] <voxadam> Odd.
[08:50:58 CEST] <TD-Linux> unless you want to use flash player
[08:51:03 CEST] <TD-Linux> which I consider an antifeature :)
[08:51:40 CEST] <voxadam> Is it possible to create a DASH stream from an MP4 or MKV file on the fly so it's not necessary to store multiple files on disk?
[08:51:47 CEST] <voxadam> fsck flash.
[08:53:05 CEST] <TD-Linux> kind of, there's a technicality
[08:53:33 CEST] <voxadam> So, if someone wanted to create a modern non-live VOD solution (e.g. Kodi, Plex, Emby) what would be the ideal protocol or protocol stack?
[08:53:57 CEST] <TD-Linux> either plain HTTP, or DASH, or something DASH-like
[08:54:09 CEST] <TD-Linux> also keep in mind that DASH is implemented in Javascript in web browsers usually
[08:54:27 CEST] <flux> DASH supports automatic bandwidth scaling, so if you want that then probably DASH over HTTP
[08:54:47 CEST] <voxadam> Is DASH going to win over HLS in a non-Apple portion of the universe?
[08:55:12 CEST] <TD-Linux> yes because it's royalty-free*
[08:55:28 CEST] <TD-Linux> *pending MPEG-LA hijinks
[08:55:29 CEST] <voxadam> HLS is patent encumbered?
[08:55:33 CEST] <TD-Linux> yes
[08:55:47 CEST] <voxadam> More so than just MPEG?
[08:58:28 CEST] <waressearcher2> what is hijinks
[08:58:45 CEST] <TD-Linux> well with different patents and different licensing
[08:59:24 CEST] <TD-Linux> waressearcher2, MPEG DASH was specifically designed to be royalty-free. But MPEG-LA (unrelated to MPEG) has announced they are starting a patent pool for it
[08:59:25 CEST] <voxadam> hijinks (n) boisterous
[08:59:28 CEST] <voxadam> fun
[09:01:48 CEST] <voxadam> TD-Linux: Good to know.
[09:05:13 CEST] <waressearcher2> if I want to convert *.ac3 to *.wav, is this the right command: "ffmpeg -i in.ac3 -ac 2 -ar 44100 -ab 320k -y -f wav out.wav" ? I mean, should I use those "-ac 2 -ar 44100 -ab 320k" ?
[09:16:08 CEST] <voxadam> ,
[09:18:51 CEST] <c_14> wav is pcm, and as such you can't set the bitrate (at least not with -ab)
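c_14's answer follows from arithmetic: PCM's bitrate is fully determined by sample rate, bit depth, and channel count, so `-ab` has nothing to adjust. A small sketch (the simplified command in the comment is an assumption about what the asker wants, relying on ffmpeg's default pcm_s16le encoder for wav):

```shell
# PCM bitrate = sample_rate * bits_per_sample * channels; it is fixed by
# the format, which is why -ab has no effect on a wav output.
sample_rate=44100
bits_per_sample=16   # ffmpeg's default pcm_s16le for wav
channels=2
pcm_bps=$((sample_rate * bits_per_sample * channels))
echo "$pcm_bps bps"
# The conversion itself only needs:
#   ffmpeg -i in.ac3 -ac 2 -ar 44100 out.wav
```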
[09:19:29 CEST] <voxadam> Totally unrelated to anything: are FFmpeg and libav still separate forks?
[09:19:35 CEST] <c_14> yes
[09:20:15 CEST] <voxadam> That's unfortunate.
[09:24:47 CEST] <waressearcher2> what is libav ? is it a clone of ffmpeg ? why clone ffmpeg ?
[09:25:24 CEST] <waressearcher2> c_14: from what version was it forked ?
[09:25:30 CEST] <waressearcher2> !pork
[09:25:58 CEST] <c_14> That link explains some of it.
[09:26:31 CEST] <c_14> It was a while ago. Can't remember exactly when.
[09:27:43 CEST] <bluespark> how can i add 2 or more .srt subtitles into mkv or mp4 ... I know how to add one only, but cannot figure out how to add 2, for example one is lang=eng and the other is lang=ita   hmm?
[09:28:14 CEST] <c_14> ffmpeg -i video -i sub0 -i sub1 -map 0 -map 1 -map 2 -c copy out.mkv
[09:28:27 CEST] <c_14> map is your friend
[09:29:32 CEST] <bluespark> c_14: uff, such a fast response... I tried everything except -map ... thank you man
[09:36:51 CEST] <bluespark> c_14: hehehe works nicely, but I have to figure out more details....
[09:36:57 CEST] <bluespark> good day
[10:42:12 CEST] <neaj> Hi all .. I want to build ffmpeg using `./configure --with-stage1-ldflags="-Wl,-rpath=somepath..."`, but ffmpeg's configure script doesn't support this ..
[10:42:27 CEST] <neaj> is there a way to get ffmpeg compiled to link to libs in a specific location?
[10:43:39 CEST] <c_14> set LDFLAGS ?
[10:44:50 CEST] <neaj> c_14: does that result in a binary that works? giving it a try now ...
[11:03:12 CEST] <neaj> fails with 'gcc is unable to create an executable file.'
[11:03:33 CEST] <c_14> What were your ldflags?
[11:05:41 CEST] <c_14> Also, output of config.log? (first and last 50 or so lines usually enough)
[11:05:43 CEST] <neaj> export LDFLAGS="-rpath=/home/john/zope/engage/zeocluster/ffmpeg/parts/ffmpeg-build/lib -rpath=/home/john/zope/engage/zeocluster/ffmpeg/parts/x264-build/lib"
[11:06:48 CEST] <neaj> config.log reflects that in LDFLAGS='-rpath=/home/john/zope/engage/zeocluster/ffmpeg/parts/ffmpeg-build/lib -rpath=/home/john/zope/engage/zeocluster/ffmpeg/parts/x264-build/lib -L/home/john/zope/engage/zeocluster/ffmpeg/parts/x264-build/lib -L/home/john/zope/engage/zeocluster/ffmpeg/parts/ogg-build/lib -L/home/john/zope/engage/zeocluster/ffmpeg/parts/theora-build/lib -L/home/john/zope/engage/zeocluster/ffmpeg/parts/lame-build/lib -L/home/j
[11:07:35 CEST] <neaj> ah, here it is ...
[11:07:36 CEST] <neaj> gcc: error: unrecognized command line option '-rpath=/home/john/zope/engage/zeocluster/ffmpeg/parts/ffmpeg-build/lib'
[11:07:41 CEST] <c_14> Wouldn't it be LDFLAGS="-Wl,-rpath=" ?
[11:07:51 CEST] <neaj> rpath is getting passed to gcc
[11:07:55 CEST] <neaj> YOU'RE RIGHT :-)
[11:08:04 CEST] Action: neaj edits ... 
[11:08:44 CEST] <blue112> Hello. I have a problem with LIBAV where audiocodec->sample_rate is equal to 0. What could cause that?
[11:16:30 CEST] <blue112> And channels have 22050 O_o
[11:16:38 CEST] <blue112> (which sounds like a sample rate)
[11:21:30 CEST] <neaj> LDFLAGS=-Wl,-rpath=${buildout:directory}/parts/ffmpeg-build/lib -Wl,-rpath=${buildout:directory}/parts/x264-build/lib
[11:22:30 CEST] Last message repeated 1 time(s).
[11:22:30 CEST] <neaj> works perfectly. c_14, thank you sir/madam :-)
[11:23:05 CEST] <neaj> c_14: heh, I happen to be in Bucharest next week for the first time
[11:25:17 CEST] <neaj> (sorry, please ignore the 'buildout:directory' bits)
[11:26:50 CEST] <blue112> I think there's a bug in LIBAV. All the information returned by avcodec_open2 is wrong
[11:47:17 CEST] <blue112> What could cause codecCtx->channels to be equal to 48000 and codecCtx->sample_rate = 0 ?
[11:51:44 CEST] <irock> hey guys, do you know of any work related to integrate ffmpeg with ms' wmf codecs and apple's vda framework?
[11:54:48 CEST] <iive> blue112: looks like ABI breakage. aka having headers of one version and library of another version.
[11:56:25 CEST] <blue112> I'm using the headers from ffmpeg-20150923-git-91b668a-win32-dev and the dll from ffmpeg-20150923-git-91b668a-win32-shared
[11:56:57 CEST] <blue112> So they should be the same, shouldn't they? I do agree that this would totally explain my problem.
[11:58:13 CEST] <blue112> iive, how do I check a DLL's exact version?
[11:58:52 CEST] <blue112> Got it, it's exactly like the version.h header.
[11:59:13 CEST] <iive> oh, windows...
[11:59:33 CEST] <blue112> iive, I do agree on that too :D But I got to make it work on windows, now it works on linux :)
[12:00:43 CEST] <iive> do you have any idea what compiler was used to create them? Another possible issue is having different structure padding/packing.
[12:01:47 CEST] <blue112> Good point. I've been downloading them from here:
[12:01:52 CEST] <blue112> http://ffmpeg.zeranoe.com/builds/
[12:03:10 CEST] <blue112> Ok, it was using GCC 4.9.3
[12:03:17 CEST] <blue112> And I'm using VS2013...
[12:03:32 CEST] <blue112> Is there a way to make the struct "compatible" :D ?
[12:03:41 CEST] <blue112> cc iive
[12:05:06 CEST] <iive> i think that by default gcc pads structures, so check if vs2013 has some matching options
[13:05:19 CEST] <Ghabry> Hey I want to convert images to a video slideshow. I use the following command line:
[13:05:21 CEST] <Ghabry> ffmpeg -framerate "1/10" -i Folie%1d.png -c:v libx264 -vf "fps=25,format=yuv420p" out.mp4
[13:05:28 CEST] <Ghabry> the 1/10 results in 10 seconds per image. But this eats gigantic amounts of RAM (more than 4 GB) and I get the warning "[output stream 0:0 @ 00000000005979a0] 100 buffers queued in output stream 0:0". Is it somehow possible to tell ffmpeg to flush to disk more often?
[13:52:25 CEST] <Ghabry> okay, solved my problem... executed it on one of our servers that has 64 GB RAM.....
[14:09:41 CEST] <kprimdal> I'm trying to figure out the zoompan function. I have a video of, let's say, 30 seconds, and at second 10 I want to zoom in so that the input is twice as big but the output size stays the same. Could anyone give me an example of how that could be done? Just to play with it I tried this command, where the input is 12 seconds, but it only gives some flicker, like every 100 ms: http://pastebin.com/uU3z4cve
[17:24:57 CEST] <deroad> Hi! i have a few questions about how should i use the C api when i want to use multiple decoders for the same codec
[17:27:24 CEST] <deroad> i'm writing an app which takes 2 or more audio streams to decode.
[17:28:19 CEST] <deroad> these streams are in speex or nellymoser format
[17:29:16 CEST] <Evermore> I'm using the ffmpeg API and libx264 to encode a video and mux it into .mp4. The video plays great in VLC but doesn't open in Windows Media Player. Am I overlooking something obvious? If I use ffmpeg -vcodec copy to remux it, it plays fine
[17:30:22 CEST] <deroad> what i was wondering is how i should use ffmpeg's multithreaded lib to use less cpu, since with 6 streams it starts missing some frames
[17:33:13 CEST] <Evermore> I was wondering if maybe .mp4 has a trailer that I'm not writing, but I triple-checked that I'm calling av_write_trailer before I close the IO context
[17:33:29 CEST] <Evermore> I also flush x264 out before writing the trailer
[17:36:59 CEST] <deroad> have you checked if ffprobe gives you any error? i had a similar problem by not starting the file with a keyframe
[17:42:04 CEST] <Evermore> deroad: ffprobe didn't give any error but I solved it through careful websearching
[17:42:27 CEST] <Evermore> Instead of searching for "ffmpeg" which gives thousands of ffmpeg.exe results, I did "libav windows media player" and got this https://ffmpeg.org/pipermail/libav-user/2014-July/007182.html
[17:42:39 CEST] <Evermore> I wasn't calling avcodec_open2 at all, so the global header flag was not set
[17:43:27 CEST] <Evermore> I'm ashamed to admit that ffmpeg is still voodoo programming for me; I add bits until it works and then take away bits until it doesn't, and somehow I was able to encode a video without opening the encoder... I should probably close it at some point too
[17:43:43 CEST] <Evermore> Also I don't understand why I have an h264 codec from ffmpeg, when I'm already using x264
[17:46:21 CEST] <deroad> i see
[18:07:45 CEST] <wingless> Is it possible to specify a mux rate while building mp4 files? My goal is to have a file where every frame has an absolutely fixed size.
[18:09:09 CEST] <Evermore> All frames uniformly the same size?
[18:09:28 CEST] <wingless> Evermore: Yes. I want to be able to read/change frames from outside with a simple script.
[18:09:37 CEST] <Evermore> You want to change them, too?
[18:10:15 CEST] <ChocolateArmpits> What container are you using ? Do you have any container requirements ?
[18:10:22 CEST] <Evermore> That's a cool idea but most lossy compression formats compress the image by however much they can, which is a somewhat unpredictable size
[18:10:50 CEST] <Evermore> Also keyframes are gonna be bigger
[18:14:30 CEST] <Evermore> Brings to mind PVRTC, which actually is a fixed size but has pretty bad quality. It's for compressing textures onto GPUs of phones
[18:16:35 CEST] <ChocolateArmpits> Evermore: MXF OP1a has a relatively simple structure, so what he wants could be accomplished if he uses that. I wrote a partial header decoder in PowerShell mixed with C#, even though I had no previous experience in video file parsing
[18:17:01 CEST] <ChocolateArmpits> I don't know the structures of other popular containers well enough to say if it would be as easy
[18:17:37 CEST] <Evermore> never heard of mxf, looking it up now
[18:17:38 CEST] <ChocolateArmpits> using something else
[18:17:52 CEST] <ChocolateArmpits> Material-Exchange-Format
[18:18:46 CEST] <wingless> ChocolateArmpits: Evermore: mp4
[18:19:04 CEST] <wingless> mpeg-ts is fine too
[18:20:15 CEST] <popara> Guys, can someone tell me what the OCR filter does exactly in the new FFmpeg git?
[18:38:45 CEST] <ChocolateArmpits> popara: Maybe it's there to help read hardcoded subs ?
[18:39:09 CEST] <popara> i saw the code, it has some kind of blacklist
[18:39:12 CEST] <popara> not sure what it does though
[18:52:41 CEST] <lindylex> I am trying to slice and adjust the video brightness and this is not working:  ffmpeg-git-20151004-64bit-static/ffmpeg -i MVI_5949.MOV -ss 00:00:44.0 -t 00:00:10.0 -vf curves=preset=lighter -c:a copy -y d1.mov
[18:55:00 CEST] <ChocolateArmpits> lindylex: You may want to use the mplayer filter, I think it has dedicated brightness, contrast and gamma settings
[18:56:33 CEST] <ChocolateArmpits> lindylex: Here's a blog post explaining it's use https://wjwoodrow.wordpress.com/2013/11/22/adjusting-video-contrast-brightness-saturation-and-color-balance-with-ffmpeg/
[18:56:35 CEST] <ChocolateArmpits> its*
[18:56:40 CEST] <lindylex> Thanks I was about to ask
[18:57:28 CEST] <ChocolateArmpits> popara: I found ocr entry in the filters page https://www.ffmpeg.org/ffmpeg-filters.html#ocr
[18:57:31 CEST] <lindylex> I am using a static build.  I think I tried this and it said something about the mp filter
[18:57:44 CEST] <lindylex> This >>   [AVFilterGraph @ 0x4ec0840] No such filter: 'mp'
[18:58:41 CEST] <ChocolateArmpits> lindylex: oh sorry ! https://www.ffmpeg.org/ffmpeg-filters.html#eq
[18:58:47 CEST] <ChocolateArmpits> it's eq now
[18:59:05 CEST] <lindylex> This is what I tried  /home/share/del/ffmpeg-git-20151004-64bit-static/ffmpeg -i MVI_5949.MOV -ss 00:00:44.0 -t 00:00:10.0 -vf mp=eq2=1:1.3:0.1 -c:a copy -y d1.mov && mplayer -loop 0 d1.mov
[18:59:14 CEST] <lindylex> It does not work.
[18:59:53 CEST] <lindylex> Ok let me build the string again.
[19:02:13 CEST] <popara> blacklist   Set character blacklist.
[19:02:27 CEST] <popara> What this does exactly, it doesn't say?
[19:05:27 CEST] <lindylex>  ChocolateArmpits: Thanks ffmpeg -i MVI_5949.MOV -ss 00:00:44.0 -t 00:00:10.0 -vf eq=1:.3:1 -c:a copy -y d1.mov
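lindylex's working command uses the eq filter's positional arguments (contrast, brightness, saturation, in that order). Named options say the same thing more readably; the sketch below only assembles the equivalent command line (file names are taken from the log, and actually running it would need an ffmpeg build with the eq filter):

```shell
# Same adjustment as "eq=1:.3:1", spelled with named eq options.
# This only builds and prints the command; it does not run ffmpeg.
filter="eq=contrast=1.0:brightness=0.3:saturation=1.0"
cmd="ffmpeg -i MVI_5949.MOV -ss 00:00:44.0 -t 00:00:10.0 -vf $filter -c:a copy -y d1.mov"
echo "$cmd"
```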
[19:06:36 CEST] <ChocolateArmpits> popara: I'm not familiar with the filter, but I guess blacklist sets characters which should not be output, while the whitelist restricts the output to the specified characters only
[19:18:04 CEST] <lindylex> How do you copy both the video and audio codecs?  c:v:a  like this?
[19:18:46 CEST] <Evermore> I usually do -vcodec copy
[19:18:55 CEST] <Evermore> so probably that and -acodec copy
[19:18:57 CEST] <lindylex> ok, I thought that was the old way
[19:19:09 CEST] <lindylex> That is the way I have done it in the past.
[19:19:13 CEST] <Evermore> probably is
[19:19:17 CEST] <Evermore> I'm terrible at learning syntax
[19:19:38 CEST] <lindylex> I think they have moved on to -c:a means copy audio.
[19:19:47 CEST] <lindylex> -c:v mean copy video
[19:20:08 CEST] <Evermore> why
[19:20:11 CEST] <lindylex> I was wondering if they are separated.
[19:20:28 CEST] <lindylex> if I want to copy the audio and video
[19:20:41 CEST] <lindylex> such ass -c:a -c:v
[19:20:44 CEST] <lindylex> as
[19:21:13 CEST] <klaxa> you can do -c copy
[19:21:17 CEST] <klaxa> that will copy all streams
[19:21:22 CEST] <klaxa> unless otherwise specified
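klaxa's rule ("copy all streams unless otherwise specified") in command form. The file names are placeholders, and the snippet only prints the two command lines rather than running ffmpeg:

```shell
# "-c copy" stream-copies every mapped stream; a later, more specific
# per-stream option overrides it for just that stream.
copy_all="ffmpeg -i in.mkv -c copy out.mkv"
# Copy video and subtitles untouched, but re-encode only the audio:
copy_except_audio="ffmpeg -i in.mkv -c copy -c:a aac out.mkv"
echo "$copy_all"
echo "$copy_except_audio"
```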
[19:25:09 CEST] <lindylex> klaxa: thanks
[21:06:28 CEST] <svanheulen> anyone know how to add album art to an ogg vorbis file with ffmpeg?
[23:14:36 CEST] <alesan> hello... I have a script that processes a bunch of images and spits them out ready to be encoded into a video... what is the best way to feed ffmpeg a couple of tens of thousands of images? ideally uncompressed, so I do not have to do an extra unnecessary encode and decode for each
[23:19:27 CEST] <waressearcher2> alesan: cat {0001..2000}.png | ffmpeg -f image2pipe -vcodec png -r 25 -i - -vcodec mpeg4 -vb 1500k -y -f avi 1.avi
[23:19:54 CEST] <alesan> I do not have 'n' image files
[23:20:05 CEST] <alesan> I have a script that generates image frames
[23:20:25 CEST] <alesan> I mean, I could always have the script write out the files first, of course
[23:20:27 CEST] <c_14> alesan: image2pipe
[23:20:45 CEST] <alesan> but it does not seem the smartest solution
[23:20:51 CEST] <c_14> Why?
[23:21:00 CEST] <alesan> c_14, I will look into that
[23:21:01 CEST] <alesan> well
[23:21:08 CEST] <c_14> You could also use yuv4mpegpipe
[23:21:16 CEST] <c_14> But then you'd have to convert colorspace (probably).
[23:21:22 CEST] <alesan> because I do not know in advance how many images I am going to have, so I will need to write all images first
[23:21:27 CEST] <alesan> and then do the encoding
[23:21:34 CEST] <alesan> so I will not be able to use parallelism
[23:21:36 CEST] <c_14> That's what the pipe is for.
[23:21:42 CEST] <alesan> and I will also need a huge disk space
[23:21:53 CEST] <c_14> Not for image2pipe
[23:21:55 CEST] <alesan> a pipe is something that I would very much like
[23:22:14 CEST] <alesan> c_14, I was referring to the kind advice given by waressearcher2
[23:22:18 CEST] <c_14> ah
[23:22:25 CEST] <alesan> :) sorry for the confusion
[23:22:40 CEST] <BtbN> Just replace cat with whatever outputs your images?
[23:22:57 CEST] <alesan> well I would like to preserve colorspace, but my program has to do RGB compositing operations anyway...
[23:23:20 CEST] <alesan> BtbN, ohhh I see sorry I just focused on the initial command
[23:23:35 CEST] <alesan> that is in fact what also c_14 said I believe
[23:24:04 CEST] <alesan> can I give raw RGB to ffmpeg as input? or is it always better to go through a real format
[23:24:27 CEST] <alesan> I'd like to avoid a compressed format for the sake of avoiding unnecessary encodings and decodings
[23:26:14 CEST] <Evermore> There is a rawvideo format that lets you pass RGB frames to ffmpeg through a pipe
[23:26:26 CEST] <Evermore> I only tested it on a very large uncompressed file though
[23:26:57 CEST] <BtbN> I don't think image2pipe would work with raw rgb. How would it know when a frame ends?
[23:27:19 CEST] <alesan> BtbN, maybe I can specify frame size on the ffmpeg command line
[23:27:29 CEST] <alesan> and then it's only a matter of counting the bytes
[23:27:42 CEST] <Evermore> The thing I was using calculated frame size from width, height, and pixel depth, and knowing they were all  uncompressed
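Evermore's frame-size calculation is the whole trick: raw rgb24 has no frame delimiters, so once ffmpeg knows the geometry and pixel format it splits the pipe purely by counting bytes. A sketch of the math, with the corresponding rawvideo invocation in a comment (the generator name is a placeholder for alesan's script):

```shell
# One raw rgb24 frame is width * height * 3 bytes; ffmpeg finds frame
# boundaries in the pipe by counting exactly that many bytes per frame.
width=1920
height=1080
bytes_per_frame=$((width * height * 3))
echo "$bytes_per_frame bytes per frame"
# Piping a generator straight into ffmpeg would look like:
#   ./generate_frames | ffmpeg -f rawvideo -pixel_format rgb24 \
#       -video_size 1920x1080 -framerate 25 -i - -c:v libx264 out.mkv
```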
[23:27:44 CEST] <alesan> but maybe I should use something like uncompressed tif or pnm
[23:28:24 CEST] <Evermore> Unfortunately Cygwin seems to be dropping my reverse-search history periodically so I can't dig up the exact command
[23:28:31 CEST] <alesan> I will also need to resample the size of the image, should ffmpeg perform this?
[23:28:50 CEST] <alesan> or the script that generates the input to ffmpeg
[23:28:54 CEST] <Evermore> Yeah I think it can do that, maybe as a "filter"
[23:29:11 CEST] <Evermore> The frames will probably be run through swscale internally, which can do resizing as part of the intake process
[23:29:21 CEST] <alesan> am I correct in thinking ffmpeg has very advanced scaling algorithms? for max quality
[23:29:42 CEST] <Plorkyeran> no
[23:29:48 CEST] <alesan> no?
[23:29:50 CEST] <Plorkyeran> it has fast scalers
[23:29:54 CEST] <Evermore> It has lanczos, doesn't it?
[23:29:57 CEST] <Plorkyeran> it does not have unusually high quality scalers
[23:30:14 CEST] <alesan> I will need to reduce the size from 12Mpixel to 1080p or similar
[23:30:35 CEST] <waressearcher2> alesan: why don't you have the script generate 1 image on disk, then feed that file into ffmpeg and after that remove the image file? that way you don't need huge disk space
[23:31:03 CEST] <alesan> waressearcher2, well a pipe would do that without using any disk space...
[23:31:31 CEST] <alesan> in any case waressearcher2 how would you tell ffmpeg to get the same image over and over again while keeping the encoding context?
[23:31:39 CEST] <alesan> is there an option for that maybe?
[23:31:40 CEST] <waressearcher2> alesan: { while :; do if [ -f image_file ]; then cat image_file; fi; sleep 1; done }  | ffmpeg -f image2pipe
[23:32:01 CEST] <alesan> well
[23:50:02 CEST] <waressearcher2> I actually did something similar a few months ago. I had a script that generated thousands of images, something like 8000x6000, and I calculated it would take many GB of space, but I made it generate one image at a time and then feed it to ffmpeg. it took many hours, but at least it took just a few MB of space as a whole. the disadvantage is that if there is any error I will not have the actual images, so I would have to generate them over again, but it worked
[23:50:32 CEST] <alesan> yes I can live with that
[23:56:00 CEST] <BBk> Hi
[23:56:24 CEST] <BBk> Do you think that is possible to make a multichannel audio stream with FFmpeg ?
[23:56:58 CEST] <c_14> yes
[23:56:58 CEST] <BBk> by using multiple audio inputs from DirectShow and outputting them as a single mpeg-ts output
[23:57:08 CEST] <c_14> yes
[23:57:14 CEST] <c_14> https://trac.ffmpeg.org/wiki/AudioChannelManipulation
[23:58:19 CEST] <BBk> they need to not be mixed into a single stereo signal
[23:59:04 CEST] <BBk> the idea is to have a "real" multichannel audio mpeg-ts by multiplexing one video input and several audio inputs, to have multi-language streaming
[23:59:38 CEST] <BBk> I know that an mpeg-ts stream can handle that, but I can't find a solution to multiplex the inputs
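BBk's goal is the same `-map` exercise as the subtitle example earlier in the log: map one video stream and each audio input into a single MPEG-TS mux. A hypothetical sketch; the DirectShow device names, file names, and choice of mp2 audio are placeholders, and the snippet only assembles and prints the command line:

```shell
# One video input plus two DirectShow audio inputs, each kept as its own
# stream (not mixed) and tagged with a language, muxed into one MPEG-TS.
# Device/file names are placeholders; this does not run ffmpeg.
cmd="ffmpeg -i video_source -f dshow -i audio=\"Mic (English)\" -f dshow -i audio=\"Mic (French)\" -map 0:v -map 1:a -map 2:a -metadata:s:a:0 language=eng -metadata:s:a:1 language=fra -c:v copy -c:a mp2 -f mpegts out.ts"
echo "$cmd"
```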
[00:00:00 CEST] --- Wed Oct  7 2015

More information about the Ffmpeg-devel-irc mailing list