[Ffmpeg-devel-irc] ffmpeg.log.20140304

burek burek021 at gmail.com
Wed Mar 5 02:05:01 CET 2014


[00:02] <Mavrik> you'll have to comply to that part's license of course.
[00:02] <Mavrik> but pretty certainly the answer is "no"
[00:02] <Mavrik> at least not without releasing most of your source as LGPL
[00:03] <Mavrik> grepwood, read up on LGPL/GPL license.
[00:04] <grepwood> Mavrik, my source code is GPLv3
[00:04] <Mavrik> ah
[00:04] <Mavrik> that should make it easier :)
[00:04] <grepwood> I'm asking because in all honesty I don't feel motivated enough to read a legalese text wall, I want to focus on writing code
[00:05] <grepwood> hopefully that is not a deadly sin :)
[00:06] <Mavrik> well
[00:06] <Mavrik> honestly if you can't read an A4 paper of rules...
[00:06] <Mavrik> or a wiki article.
[00:07] <Mavrik> again: 1.) Check license of ffmpeg source file you want to use
[00:07] <Mavrik> 2.) adhere to it
[00:07] <grepwood> alright
[13:21] <martin___> hi
[13:21] <martin___> i'm trying to build a localhost rtsp stream on windows
[13:22] <martin___> i've googled around and i've noticed that ffserver is only for linux
[13:22] <martin___> any help?
[16:05] <dvnl> hi Everyone! I would like to compile ffmpeg (with reduced functionality) "by hand", so I need to know which .c and .h files are needed. I need this because I'd like to compile it for Xilinx's Microblaze processor. Now I'm compiling on Windows with MSys. make gives me output in which only .o object files are shown, like CC libavcodec/dvdec.o. My question would be: how do I know which sources are used for the building process? I'd be very happy if
[16:10] <AndrzejL> Hi guys I need to convert a video to a html5 / mp4 standard - any advice / commands I could copy off?
[16:11] <AndrzejL> I know only that it needs to be html5 / mp4 compliant ;D - I know nothing bout codecs plus the video has no sound (on purpose) so I need to skip audio conversion ;D
[16:14] <AndrzejL> would this be correct? ffmpeg -i {input}.mov -vcodec h264 -acodec aac -strict -2 {output}.mp4
[16:14] <AndrzejL> I will use avi as input file instead of mov tho
[16:16] <JEEB> you'll probably want to add a few settings :P
[16:16] <JEEB> also with newer ffmpeg you can use -c:v and -c:a instead of vcodec and acodec
[16:16] <JEEB> also -strict -2 is better written as -strict experimental
[16:16] <JEEB> more readable, I have no idea why the error tells you to use -2 and not the text variant
[16:17] <JEEB> also if you have no audio I don't think you need an audio stream unless some client only reads files with video and audio
[16:18] <AndrzejL> JEEB: actually I dont want to add anything :P it worked :P
[16:18] <AndrzejL> hehe
[16:18] <JEEB> but that is not proper
[16:19] <JEEB> let me write you a proper line
[16:19] <AndrzejL> I was lucky - I need html5 / mp5 video for my piwigo video plugin :P and video that I have just created using this command worked perfect :D
[16:19] <AndrzejL> mp4*
[16:19] <JEEB> then you're really lucky in quite a few ways
[16:19] <JEEB> ffmpeg -i {input} -an -c:v libx264 -crf AA -preset BBB -maxrate CCC -bufsize DDD -movflags faststart {output}
[16:19] <JEEB> an disables audio
[16:20] <JEEB> crf sets the rate factor, by default you get 23
[16:20] <AndrzejL> thank You JEEB I will paste both into my treasure command line notepad :D
[16:20] <JEEB> lower rate factor = more rate used, "higher quality", higher rate factor = less rate used, "lower quality"
[16:20] <JEEB> basically find the highest rate factor that still looks good enough for you
[16:20] <JEEB> -preset is the speed VS compression option
[16:20] <JEEB> http://mewiki.project357.com/wiki/X264_Settings#preset
[16:21] <JEEB> ^ list of presets
[16:21] <JEEB> then we have maxrate and bufsize
[16:21] <JEEB> if you are doing something over a limited bandwidth (and "HTML5" already hints that you are)
[16:21] <JEEB> you NEED them
[16:22] <JEEB> bufsize is the amount of bits/time buffered, and maxrate is the maximum average bit rate used over bufsize
[16:22] <JEEB> so for example if you've got a 1mbps connection being the minimum you want to serve
[16:22] <JEEB> you set maxrate to that, minus possible audio and then some extra for the container overhead
[16:23] <JEEB> so when you have no audio, let's say 975K
[16:23] <JEEB> (or k, not sure which it was with ffmpeg)
[16:23] <JEEB> then you set the buffer size depending on what your usual clients do, or how you can set them
[16:23] <JEEB> usually bufsize is not smaller than "one second" (1*maxrate)
[16:24] <JEEB> and then you have -movflags faststart, which then remuxes the file at the end to have the index in the beginning
[16:24] <JEEB> and that's it :)
[16:24] <JEEB> there are various extra limitations you might have to add in case of the clients being limited
[16:24] <JEEB> but I have no clue what kind of clients you're wanting to serve
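Putting the pieces together, a concrete version of JEEB's template might look like the line below; the CRF/preset values and the 975k maxrate with a one-second buffer are just the illustrative numbers from this discussion, not recommendations:

    ffmpeg -i input.avi -an -c:v libx264 -crf 23 -preset medium \
        -maxrate 975k -bufsize 975k -movflags faststart output.mp4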
[16:25] <AndrzejL> JEEB: :) https://andrzejl.no-ip.org:30303/images/index.php?/category/27 > tls 1.2 compatible browser and ignore the self signed cert - check out the little gadget I bought myself today ;D
[16:26] <AndrzejL> actually bought it 3 weeks ago on ebay but it arrived today ;D
[16:27] <JEEB> and basically the extra limitations would be mostly towards the H.264 stream (limiting the profile and level, working around stupid quirks of some decoders if you happen to hit such), or having to make a quiet audio track
[16:27] <JEEB> depends on what kind of clients you end up serving, as I said
[16:27] <AndrzejL> Mostly linux + some nice browser like firefox and chromium ;)
[16:27] <JEEB> but libx264 with crf, preset and VBV is the basics (VBV being the bandwidth limitation system that gets enabled with maxrate/bufsize)
[16:27] <JEEB> ok
[16:27] <JEEB> then you should generally be OK :P
[16:28] <JEEB> without extra limitations
[16:28] <AndrzejL> :D
[16:28] <JEEB> although does firefox OOB support H.264?
[16:28] <JEEB> on lunix
[16:28] <JEEB> I know they had the gstreamer thingy
[16:28] <AndrzejL> Just checked all my browsers - video works I am happy ;D
[16:28] <JEEB> but I have no idea if it's enabled anywhere
[16:29] <AndrzejL> so now I will keep those commands and I can convert any video I will make and just throw them into my piwigo... this is awesome ;)
[16:41] <simpleirc1> hello
[16:44] <simpleirc1> i was not able to find my subtitle in my ts file
[16:46] <relaxed> where was the last place you saw it?
[16:58] Action: NucWin_ bangs head on the wall to see if it helps build ffmpeg + libfdk_aac for win64
[16:58] <JEEB> huh
[16:58] <JEEB> it should be rather straightforward
[16:59] <NucWin_> yeh i was hoping so
[16:59] <NucWin_> currently trying to cross compile from arch -> win64
[17:00] <JEEB> for autoconf-based stuff (fdk-aac) it's something like this ./configure --host=${P64B} --prefix=/mingw/${P64B}
[17:00] <JEEB> where P64B is the prefix (x86_64-w64-mingw32 usually for mingw-w64's win64)
[17:00] <JEEB> and forget the prefix I set, just set your own
[17:01] <NucWin_> think i might have been a little cocky when i tried to use the switches from Zeranoe build
[17:01] <NucWin_> configure switches*
[17:02] <JEEB> then with ffmpeg set PKG_CONFIG_PATH=/your/win64/prefix/lib/pkgconfig ./configure --enable-nonfree --enable-libfdk-aac --cross-prefix=${P64B}- --arch=x86_64 --target-os=mingw32
[17:02] <JEEB> and possibly add extra-cflags and ldflags to include /your/win64/prefix/lib and /your/win64/prefix/include
[17:02] <JEEB> in search paths
[17:03] <JEEB> that's really all you would need to do
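A rough end-to-end sketch of the sequence JEEB describes, with an invented install prefix (note the configure flag is spelled --enable-libfdk-aac):

    P64B=x86_64-w64-mingw32
    PREFIX=$HOME/win64              # illustrative prefix, pick your own
    # cross-build fdk-aac (autoconf-based)
    ./configure --host=$P64B --prefix=$PREFIX && make && make install
    # then point ffmpeg's configure at it
    PKG_CONFIG_PATH=$PREFIX/lib/pkgconfig ./configure \
        --enable-nonfree --enable-libfdk-aac \
        --cross-prefix=$P64B- --arch=x86_64 --target-os=mingw32 \
        --extra-cflags=-I$PREFIX/include --extra-ldflags=-L$PREFIX/lib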
[17:04] <Zeranoe> NucWin_: Whats the problem?
[17:05] <NucWin_> think im the noob problem
[17:05] <NucWin_> trying to copy your build switches without having the dependencies i think
[17:06] <JEEB> just don't do that :P don't try to start with everything and the kitchen sink
[17:06] <JEEB> because in most cases you really don't need them
[17:07] <ssspiff> is it possible to change the entropy coding algorithm for an AVC video without actually going all the way through the decode & reencode pipeline?
[17:07] <NucWin_> my aim is to transcode x264 + ac-3/dts to x265 + aac
[17:07] <JEEB> there's nothing pre-done for that
[17:07] <JEEB> NucWin_, just make sure you use preset placebo and 16 refs and bframes with libx265
[17:08] <JEEB> otherwise it really isn't better than x264
[17:08] <JEEB> expect ~10x the time used for encoding compared to libx264 at preset placebo
[17:08] <JEEB> oh, and tune ssim because otherwise you'll lose AQ
[17:09] <Zeranoe> NucWin_: Then you only need libx264 and fdk-aac. just focus on those, and dont use placebo
[17:09] <JEEB> Zeranoe, he wants to use libx265
[17:09] <JEEB> not libx264
[17:09] <JEEB> and libx265 is utterly craptastic if you don't max it out
[17:09] <Zeranoe> Also x265
[17:10] <JEEB> I mean, why use libx265 if you don't expect a better result than with libx264 :P
[17:10] <JEEB> because it will be slower in any case
[17:10] <Zeranoe> its slow to start with... I cant imagine placebo
[17:10] <JEEB> dunno, 4.40fps to 0.4fps with 720p, 8bit encoding
[17:10] <JEEB> only ~10x slower, so for some that might be bearable
[17:10] <JEEB> but if it's just useless _and_ slow without maxing it out
[17:11] <JEEB> I don't see any reason to use it
[17:11] <JEEB> :P
[17:11] <JEEB> thus if he wants to use it, the only viable option is to max it out
[17:11] <Zeranoe> Doesn't it have its sights set on real time?
[17:12] <JEEB> Zeranoe, whatever it has its sights on doesn't mean it's anything else but kool aid right now
[17:13] <Zeranoe> lol
[17:13] <JEEB> I mean, it can be realtime with high-end hardware and the faster presets
[17:13] <JEEB> but that doesn't mean you want to use it for anything else but "LOL I'M USING HEVC"
[17:13] <JEEB> aka PR/promotion
[17:14] <JEEB> if you want to actually get somewhat better results per the same bit rate compared to libx264, there's not much else to do but set tune to SSIM, max the preset and refs+bframes
[17:14] <NucWin_> i had a play with x265 last night using Zeranoe's build (worked great btw, thanks for the build) and i managed to get the quality quite good using crf=25
[17:14] <NucWin_> downside is the copied over DTS is now twice the size of the video
[17:14] <JEEB> well the fact still stands that you aren't getting better compression with libx265 with such settings :P
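For reference, the "maxed out" libx265 invocation JEEB is describing would look roughly like this; crf=25 is just NucWin_'s value from above, and the audio is simply copied:

    ffmpeg -i input.mkv -c:v libx265 -preset placebo -tune ssim \
        -x265-params ref=16:bframes=16:crf=25 -c:a copy output.mkv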
[17:14] <Zeranoe> They seem to be really active
[17:15] <Zeranoe> with their source, so I'll try and keep up with them with the builds.
[17:15] <JEEB> yes, MCW's kindergarten teacher (steve) really has a lot of work to do to keep his chinese and indian devs on a leash
[17:15] <JEEB> expect build failures at random times, and random regressions
[17:15] <JEEB> they don't seem to have a goddamn buildbot on their office even :P
[17:16] <JEEB> s/on/at/
[17:16] <JEEB> and as sad as libx265 is under MCW, it's still better than the proprietary encoders
[17:16] <JEEB> which is really, really sad
[17:16] <Zeranoe> Off topic, but why hasn't x264 ever made a release version
[17:17] <JEEB> basically the "stable" tags can be taken as "releases" if you wish
[17:17] <JEEB> generally though, x264 is very stable
[17:17] <Zeranoe> I've never had an issue with it
[17:18] <JEEB> you can wait for a week or two after a push and see if any quickfixes get pushed
[17:18] <JEEB> if not
[17:18] <JEEB> then it's all right
[17:18] <NucWin_> which is the best aac encoder that is included with the GPL version (i might just try with that to save this headache)
[17:18] <Zeranoe> NucWin_: The internal
[17:18] <JEEB> the internal one
[17:19] <groundnuty> hey, ffmpeg version 2.0.2 here. Question: I have files: psp-0.000000_0.png, psp-0.005000_0.png,   psp-0.010000_0.png,  psp-0.015000_0.png, psp-0.020000_0.png...
[17:19] <JEEB> vo-aacenc is a fucking trainwreck and I fully understand why Google suddenly licensed fdk-aac after like a single release :D
[17:19] <groundnuty> what pattern to -i should I use to generate an animation?
[17:20] <Zeranoe> If only you could sell binaries with fdk inside... it would be GPL
[17:22] <JEEB> well, at least there's the LongestThread on the trac
[17:22] <JEEB> about improving ffaac
[17:25] <Zeranoe> I feel like it's pretty good. "Acceptable"
[17:26] <JEEB> it is
[17:26] <JEEB> just don't try to use it at too low bit rates
[17:26] <JEEB> around 2010-11 it was giving out random artifacts at 192kbps
[17:26] <Zeranoe> For mono?
[17:27] <JEEB> stereo
[17:27] <JEEB> but that was a bug and squashed back then
[17:48] <NucWin_> well ive given up trying to build on archlinux, installed the libfdk-aac package and it still says libfdk_aac not found :(
[17:49] <NucWin_> using internal aac is working ok though, getting a nice 11fps
[17:49] <Zeranoe> NucWin_: You need to cross compile
[17:50] <Zeranoe> every lib you want to use in your Windows Ffmpeg needs to be cross compiled too
[17:50] <NucWin_> suspected that would be the problem
[17:50] <Zeranoe> Also, I use debian and highly recommend it
[17:51] <NucWin_> never been a big fan of debian, seems their packages are normally quite out of date
[17:51] <NucWin_> just love how arch is bleeding edge
[17:51] <NucWin_> i might try building in MSVC tomorrow
[17:51] <NucWin_> but now i need to go get ready for pancake day \o/
[17:52] <relaxed> debian sid is fairly current
[17:52] <Zeranoe> If you use stable yes, I use testing with xfce (because screw gnome)
[17:53] <relaxed> I've been running sid for over 7 years now and I can't imagine using anything else.
[17:53] <Zeranoe> ^^
[17:54] <Zeranoe> I don't go with unstable, but testing. Point is, you have options with Debian
[17:58] <NucWin_> yay bouncer is back
[18:02] <NucWin> awww it crashed about 40% through transcoding
[18:04] <Zeranoe> What command
[18:08] <NucWin> ffmpeg -i "\\filesrv\share\test (720p BluRay).mkv" -strict -2 -c:v libx265 -c:a aac -b:a 640k -x265-params crf=25 "\\filesrv\share\test (720p BluRay HEVC+AAC).mkv"
[18:08] <NucWin> asking for all kinds of trouble lol
[18:09] <relaxed> I don't think you can mux hevc in anything right now using ffmpeg.
[18:10] <NucWin> It worked fine when I copied the audio rather than converting it
[18:10] <NucWin> could have been the lan or overclocked cpu :P
[18:11] <relaxed> hmm, I'm mistaken
[18:13] <relaxed> NucWin: [aac @ 0x3ffada0] Too many bits per frame requested
[18:13] <relaxed> lower your bitrate
[18:19] <NucWin> didn't see any errors like that, it just popped up the Windows "ffmpeg has crashed" dialog
[18:19] <NucWin> what is the optimal bitrate for aac?
[18:20] <Zeranoe> NucWin: 64k per channel (generally), see https://trac.ffmpeg.org/wiki/GuidelinesHighQualityAudio and http://trac.ffmpeg.org/wiki/AACEncodingGuide
[18:22] <NucWin> thanks
[18:23] <NucWin> running again @ 384k ;)
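That rule of thumb works out to about 6 x 64k = 384k for 5.1 audio, which is where NucWin lands; with the internal encoder of that era the line would be something like:

    ffmpeg -i input.mkv -c:v copy -c:a aac -strict experimental -b:a 384k output.mkv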
[18:48] <vmBenLubar> The command `ffmpeg -ss 0.3 -i ak_0001.wav -t 0.2 py.wav` is giving me a 0.25 second file as output. Is there a way I can force ffmpeg to be more precise with durations?
[18:59] <relaxed> vmBenLubar: have you tried sox?
[19:01] <vmBenLubar> relaxed: sox?
[19:11] <vmBenLubar> relaxed: ah, works like a charm. thanks.
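For the record, two ways to get a sample-accurate cut here: ffmpeg seeks precisely (though slowly) when -ss is given as an output option after -i, and sox's trim effect takes start and duration directly. Both lines are sketches:

    ffmpeg -i ak_0001.wav -ss 0.3 -t 0.2 py.wav
    sox ak_0001.wav py.wav trim 0.3 0.2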
[19:38] <omin> Hi guys! Does anybody know how to make an infinite input loop from a video file for ffmpeg?
[19:39] <omin> I have read a lot but without success =(
[19:41] <omin> -f lavfi -re -i "movie=somefile.mov:loop=0" does not work
[19:45] <jangle> at some point your movie file will have to stop looping
[19:45] <jangle> so, how many loops do you want your movie file to contain?
[19:57] <simpleirc1> i think omin asked for infinite, u can use the concat command and read from a named pipe giving the list of files
[19:58] <simpleirc1> on that named pipe you can write using the yes command infinitely
[20:00] <jangle> well there is no such thing as a video file that is infinitely long.  There is such a thing as a video player that will play your file from the beginning, or a video file that has a fixed number of repeated sections appended at the end of it
[20:00] <jangle> *play your video file over from the beginning once it reaches the end
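A sketch of simpleirc1's named-pipe idea: yes repeats a concat-demuxer entry forever and ffmpeg reads the list from the pipe. Untested; whether the concat demuxer consumes the list incrementally may depend on the ffmpeg version, newer builds may also want -safe 0 for absolute paths, and the UDP target is only an example:

    mkfifo list.fifo
    yes "file '/path/to/somefile.mov'" > list.fifo &
    ffmpeg -re -f concat -i list.fifo -c copy -f mpegts udp://127.0.0.1:1234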
[20:10] <Zeranoe> Why might ffplay display a different aspect when playing a video if -t 00:00:00.01 is used?
[20:14] <radagast> hello
[20:14] <radagast> can someone help me with ffmpeg
[20:14] <radagast> ?
[20:14] <radagast> i have a little problem
[20:15] <llogan> groundnuty: see third example http://ffmpeg.org/ffmpeg-formats.html#Examples-3
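The example llogan points at is the glob pattern, which sidesteps groundnuty's irregular numeric suffixes; roughly:

    ffmpeg -framerate 25 -pattern_type glob -i 'psp-*.png' -c:v libx264 -pix_fmt yuv420p out.mp4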
[20:16] <llogan> radagast: don't ask to ask. just ask your question.
[20:17] <radagast> ohh sorry
[20:19] <radagast> i have 60 files in .h264 format taken out of a ZMODO nvr, a security cam recorder. i have problems viewing them; the only way is with vlc with the command vlc force --demux h264. now i have to unify all the files into a single avi file with ffmpeg, but it's the first time i work with ffmpeg
[20:20] <radagast> is it possible can someone give me a help on how to unify the files in h.264 format?
[20:20] <Jack64> what do you mean by unify?
[20:21] <Jack64> concatenate?
[20:21] <radagast> create a unique avi file with 60  other small files
[20:21] <radagast> yes concatenate
[20:21] <radagast> sorry for my bad english
[20:21] <Jack64> ok
[20:22] <Jack64> are you using windows or linux?
[20:22] <radagast> linux
[20:22] <Jack64> me 2, I do that often. here's how I do it:
[20:22] <Jack64> echo "file '/tmp/ram/overlaid0.mp4'" >> /tmp/ram/config
[20:22] <luc4> llogan: ping
[20:23] <llogan> radagast: see https://trac.ffmpeg.org/wiki/How%20to%20concatenate%20%28join,%20merge%29%20media%20files
[20:23] <llogan> luc4: pong
[20:23] <Jack64> echo "file 'path/to/file1'" >> queue.txt
[20:23] <Jack64> echo "file 'path/to/file2'" >> queue.txt
[20:23] <Jack64> ffmpeg -y -f concat -i queue.txt -c copy out.avi
[20:23] <luc4> llogan: don't know if you remember my issue, anyway I tested ffmpeg 2.1.4 but still the same on mac os.
[20:24] <Jack64> radagast get it?
[20:24] <llogan> luc4: but did you test with a recent, daily build from tessus: http://ffmpeg.org/download.html#MacOSXBuilds
[20:24] <radagast> is it a problem if the files are in h264 format?? i ask you that because vlc and other programs don't open the file, i have to force the h264 demux or something like that
[20:24] <radagast> i try now
[20:25] <Jack64> no problem for me here, so if you're having problems maybe it's the separate files that need to be fixed
[20:25] <Jack64> but try that see if it works for you
[20:26] <luc4> llogan: nope, I'll try that as well. Thanks.
[20:26] <Jack64> (also I usually do it in a RAMDisk, because it's almost instantaneous, if you have a lot of RAM I encourage you to do it too)
[20:27] <jangle> radagast:  if vlc won't play your individual h264 files, then something is wrong with them.  If you've built ffmpeg yourself, and if the supplied ffplay program can't play your h264 files, then something is wrong with the files
[20:28] <jangle> *or the bitstream is in a format that vlc doesn't yet support, and ffplay may give you more information about that
[20:28] <Jack64> jangle you know anything about -filter_complex ?
[20:29] <jangle> Jack64: nope
[20:29] <luc4> llogan: latest seems to be built on Feb 24 2014 14:56:08. Same output for me...
[20:30] <Jack64> crap.. I'm building a PHP front-end for ffmpeg to help me with some videos I usually process and I need to apply multiple filters at the same time to some videos, but the syntax is a bit crazy ...
[20:31] <radagast> Unknown input format: 'concat'
[20:32] <radagast> ffmpeg -y -f concat -i queue.txt -c copy out.avi
[20:33] <luc4> llogan: http://paste.kde.org/pnrpokncd
[20:34] <luc4> llogan: I'm also testing on linux
[20:34] <jangle> Jack64:  google suggests that you pass multiple filters via comma separated sections of a string, like -vf "filter1=adsfasdf, filter2=asdfadfasdf"
[20:35] <radagast> llogan, http://pastebin.com/TxPJnguq
[20:35] <llogan> radagast: you're using a fake ffmpeg from a fork
[20:35] <llogan> get the real ffmpeg
[20:36] <llogan> or compile http://trac.ffmpeg.org/wiki/UbuntuCompilationGuide
[20:36] <Jack64> jangle: I know, I've searched a bit through but it's hard because the filters I have to pass are a bit crazy, for example, in some videos I have to split them in 3 parts, then add a 3 second image overlay to the first split + a scorebar with current game score (soccer vids), then split 2 will have a different scorebar (just 1 filter), then split 3 will have even another score bar + fade out at the end. (get the gist of it?)
[20:37] <radagast> ok i try this thanks llogan I really thank you
[20:37] <Jack64> someone should update that ubuntu compilation guide just to add make -j4 to all makes
[20:37] <Jack64> makes it a lot faster
[20:37] <Jack64> or even -j8
[20:38] <jangle> Jack64: can you draw a tree with what you want done?
[20:38] <llogan> Jack64: you can use trim/atrim or maybe http://ffmpeg.org/ffmpeg-filters.html#Timeline-editing
[20:41] <llogan> luc4: i meant a daily snapshot, not a release http://www.evermeet.cx/ffmpeg/snapshots/
[20:41] <llogan> forget releases exist
[20:41] <luc4> llogan: ok, sorry
[20:42] <luc4> llogan: on linux 64 bit ok even the one from 24 feb.
[20:43] <luc4> llogan: I'm testing the snapshot.
[20:44] <Jack64> jangle It's complicated. What I'm doing is for a friend who records his bro's indoor soccer matches and then uploads them to this server, and he just points out the goals, and then I use PHP to dynamically generate the scripts that make the score bars and splash-screens in the beginning of each half and fade-outs... so it's hard to explain what it does without actually giving you the code
[20:44] <jangle> Jack64:  Well you say you are breaking the video out into 3 parts
[20:44] <Jack64> after processing each video it then concatenates everything into a full match (with all the scorebars and stuff)
[20:45] <jangle> do you mean, temporally? or do you mean the container you're receiving has 3 video tracks in it
[20:45] <Jack64> that's the example for a video with 2 goals
[20:45] <Jack64> temporally
[20:45] <Jack64> because it has to split the video by goals (because of the different score-bar overlays)
[20:46] <Jack64> if a given input video has no goals, it just gets the score at that current time and overlays scorebar and adds a fade out at the end
[20:46] <jangle> ok, well maybe you can't do it all on one line, you need to end up splitting the video up into all the pieces you need, and feed the individual pieces to individual ffmpeg instances set up to add the score, and then one at the end once you have all the pieces to get the fadeout
[20:47] <Jack64> yea that's what I'm doing now, it's working pretty well, but it takes like 2h for 1h video
[20:47] <Jack64> because of the multiple passes
[20:47] <Jack64> but I think I can still optimize it further
[20:47] <Jack64> and make it faster
[20:48] <relaxed> luc4: "[mp4 @ 0x7fa3138a4e00] Frame rate very high for a muxer not efficiently supporting it." and "dup=3449"
[20:48] <jangle> well if you break out the video into parts, as long as you wait for all parts to get their individual filters applied then concat them at the end, then you can run the individual parts in parallel
[20:48] <luc4> relaxed: yes
[20:49] <relaxed> luc4: use matroska
[20:49] <luc4> relaxed: same with mov and mkv
[20:49] <Jack64> jangle yea that's what I thought, but that would be ideal if I had multiple worker nodes (i.e. more computers) so I could send the video parts to the workers and have them process and return the processed video parts to the main server and concat them... it'd be awesome
[20:49] <luc4> relaxed: but give me a second, yesterday's build may be ok...
[20:50] <jangle> or... just multiple threads?
[20:50] <relaxed> luc4: did you try "-vsync 2" or lowering the output framerate?
[20:50] <Jack64> ffmpeg already uses almost 100% of all cpus
[20:50] <jangle> ok.
[20:51] <Jack64> if I'm using multiple ffmpeg instances they'll slow each other down
[20:51] <llogan> relaxed: so far, AFAICT, that message appears only on the OS X machine, but not the Linux one. I'm waiting for him to test ffmpeg from git master.
[20:52] <luc4> relaxed: mkv was different anyway, just the message MB rate (324000000) > level limit (2073600).
[20:52] <Jack64> do you think it's worth open sourcing it?
[20:52] <Jack64> or is it too specific a use case?
[20:53] <Jack64> I suppose you could use it for any sport..
[20:53] <Jack64> if you make your own vids..
[20:56] <luc4> relaxed, llogan: no warning now. The small sample seems ok with the latest build.
[20:56] <jangle> jack64: so do you already have a command line form that gets you a filter applied only to a part of a video file?
[20:56] <Jack64> yes
[20:57] <llogan> luc4: this is why it is always recommended to try the latest builds first
[20:57] <Jack64> the line itself is generated in PHP depending on the input video's number of goals, first half or second, etc
[20:57] <Jack64> well actually it generates a whole script (because it's several ffmpeg lines) per input video
[20:58] <Jack64> in real terms, my friend just went from spending about 3h on Power Director doing this, to spending 5 minutes just telling the program how many goals in each video and which team scored and clicking convert lol
[20:59] <luc4> llogan: don't know if this is related but transcoding to a mov I get this: http://paste.kde.org/pgmskzgwl.
[20:59] <Jack64> and for someone who does this twice a week it's pretty cool ;)
[20:59] <jangle> jack64: so it looks like you want to make a number of chains based on how many goals there were
[20:59] <Jack64> yep
[21:00] <jangle> jack64: then use each of those chains as a source for a chain that simply concatenates them all
[21:00] <luc4> llogan: the result can be played by vlc anyway. But I don't know whether something is still wrong with my file or just not entirely ok.
[21:01] <radagast> llogan, the video concat works but the out.avi is too fast and freezes every 2 seconds
[21:01] <jangle> the http://trac.ffmpeg.org/wiki/FilteringGuide grid example suggests how individual chains can be named and then used as sources for the final chain
[21:01] <Jack64> that's what I'm doing now. for example video 1 has 2 goals. first split gets me beginning to goal 1 (split1b) and  goal 1 to end of video (split1e).
[21:01] <Jack64> then I use split1e to get video from goal 1 to goal 2
[21:01] <Jack64> and goal 2 to end
[21:03] <jangle> do you need to specify split1e as its own section?
[21:03] <Jack64> here's the actual line generated for a 2 video file
[21:03] <Jack64> ffmpeg -y -i /tmp/ram/$infile -t $intime0 -c copy $workingdir/smallfile0b.mp4 -ss $intime0 -c copy $workingdir/smallfile0e.mp4 </dev/null >/dev/null 2>/var/log/ffmpeg.log
[21:03] <Jack64> 2 goal file sorry
[21:03] <Jack64> ffmpeg -y -i $workingdir/smallfile0e.mp4 -t $intime1 -c copy $workingdir/smallfile1b.mp4 -ss $intime1 -c copy $workingdir/smallfile1e.mp4 </dev/null >/dev/null 2>/var/log/ffmpeg.log
[21:04] <Jack64> those 2 lines output 4 videos from 1 input video
[21:04] <jangle> so instead [inputvid] scoreboard filter for time from begin to goal 1[goal1] ; [inputvid] sb filter for goal 2 [goal2] ; [inputvid] fadeout filter and final sb filter[final] ; [goal1][goal2][final] concat
[21:05] <Jack64> YES that would be awesome
[21:05] <Jack64> just one pass in the input video and it's done
[21:05] <Jack64> doing it all at the same time
[21:06] <jangle> well I doubt it's truly one pass, it may help if the input video is seekable, otherwise it must be played through to find the target times?  but that is an assumption
[21:06] <Jack64> still even if it's not, it would be a great time improvement I'm sure..
[21:06] <jangle> the other assumption i made is that concat accepts input in that way.  I don't have the command line example to do the fadeout or the overlay
[21:06] <jangle> or for a specific time
[21:07] <jangle> so if you have those, try it out on a short file and see
[21:07] <llogan> radagast: are all of your inputs the same format, frame size, frame rate, etc?
[21:09] <Jack64> jangle: so what happens is when PHP is generating the scripts, it loads the goal times for each video and team names and all data from a SQL database and also looks for the exact millisecond of the keyframe in each video (through ffprobe), so you say a goal is at min 1 sec 23, it goes and looks for a keyframe in that second and splits it exactly there
[21:09] <radagast> llogan, yes all the same, if i open a file with vlc it gives me the same graphical and timing errors; the only solution i have found to view the single files is the vlc force --demux h264 recfile_-140227-150418-150509-61U30300.264 command
[21:10] <llogan> what does ffmpeg say about a single file?
[21:10] <radagast> i try now
[21:12] <Jack64> jangle: here's an example ffmpeg line for applying 0-0 overlay:
[21:12] <Jack64> ffmpeg -y -i $workingdir/smallfile0b.mp4 -acodec copy -vcodec libx264 -crf 20 -vf "movie="$logo0" [watermark]; [in][watermark] overlay=0:0 [out]" $workingdir/overlaid0.mp4 </dev/null >/dev/null 2>/var/log/ffmpeg.$
[21:12] <llogan> radagast: and which concat method did you use? demuxer, filter, or protocol?
[21:12] <jhoffing> Hello! I have a question about the libfaac codec for encoding audio files into an AAC format. By default, libfaac hard-codes the MPEG version to MPEG-4. You can see the hard-coded line at line 115 in libfaac.c. I need to create an audio file that is wrapped in an MPEG-2 container (encoded with Version MPEG-2). Is there a place where I can ask for a feature request to make the mpeg version an additional option in ffmpeg? Related SO here
[21:12] <jhoffing> http://stackoverflow.com/questions/15133393/ffmpegs-aac-libfaac-codec-options and here http://stackoverflow.com/questions/21741058/convert-mp3-to-aac-with-mpeg-2-container-ffmpeg
[21:13] <Jack64> jangle: it gets the smallfile0b (beginning to goal 1) and applies $logo0 (which is the scorebar for 0-0) in that video
[21:13] <jangle> Jack64: and how do you apply the filter to a specified time range?
[21:13] <bparker> why does libavformat have an arbitrary limit of 1024 bytes for a URL/filename ?
[21:14] <radagast> llogan, SMPlayer gives me Error Streaming Media message and vlc work for 5 seconds with fast video and then gives me the same problem
[21:14] <bparker> I have a long URL that is hitting this limit and causes the stream to fail
[21:15] <radagast> llogan, i dont know i used the command you wrote me on this channel
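Two things worth noting for radagast's case: raw Annex B elementary streams can simply be concatenated byte-wise, and a raw .264 file carries no timing information, so ffmpeg assumes 25 fps unless told otherwise, which is the usual cause of the "too fast" playback reported above. A sketch, assuming the NVR recorded at 15 fps (older builds may want -r 15 in place of -framerate 15):

    cat recfile_*.264 > all.264
    ffmpeg -framerate 15 -f h264 -i all.264 -c copy out.mp4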
[21:15] <Jack64> jangle: no need for a time range, because smallfile0b is already the result of the split [ input file beginning to goal 1 ]
[21:16] <jhoffing> Also....is this the right forum for asking about ffmpeg's libfaac codec?
[21:16] <bparker> browsers support much longer URLs than 1024 bytes also
[21:18] <axelri> I'm trying to create a video from a series of screenshots (from an Android phone). I'm not able to take the screenshots at a regular interval, so I haven't found a trivial way to feed them to ffmpeg and get an accurate video out. My current solution is to make a short video file for every screenshot, then put them all together with concat. This approach gets the right result, but i find it far too slow for my application, even when trying the fast presets for x264. Any ideas on how to speed it up?
[21:19] <Zeranoe> axelri: -crf 0 and encode after its saved?
[21:20] <Zeranoe> axelri: Might want to look at: https://trac.ffmpeg.org/wiki/StreamingGuide
[21:20] <jangle> Jack64: right, but... if you want to do it all at once, then we are assuming that there is a way to apply a filter to a subset of the video
[21:20] <relaxed> jhoffing: what are you actually trying to do? why do you need this?
[21:20] <bparker> jhoffing: MPEG TS/PS containers support mpeg4 audio, you either leave it at that or transcode to mpeg2audio
[21:21] <bparker> faac on gstreamer will output both mpeg2 and mpeg4 audio
[21:21] <bparker> not sure about ffmpeg though
[21:22] <bparker> but you don't want to insert mpeg4 audio and mark it as version2, just use one or the other
[21:22] <axelri> Zeranoe: You mean output every intermediate video file losslessly and then encode them in the concatenation process?
[21:23] <relaxed> -profile mpeg2_aac_low might do it
[21:24] <Jack64> jangle: http://ffmpeg.org/ffmpeg-filters.html#Filtergraph-description Check out 2. Filtering Introduction the flow chart suggests it's possible
[21:25] <axelri> Zeranoe: Also thanks for the link. I hoping to avoid having to write my own C code for this.
[21:26] <Jack64> jangle: it suggests it's possible to apply two filters to the same video.. not a subset of it
[21:26] <Zeranoe> axelri: Huh? it talks about low latency encoding
[21:26] <Jack64> jangle: so maybe splitting it is the only approach possible
[21:28] <axelri> Zeranoe: I just meant that it would be nice to use a scripting language and the ffmpeg binaries, and not have to go all the way down to the C libraries in order to do an efficient enough implementation =)
[21:28] <jangle> Jack64: right below that is a timeline editing option
[21:29] <jhoffing> relaxed: bparker Essentially, I'm trying to pump a 'preroll' audio buffer in front of a livestream, and its MPEG version has to match exactly that of the livestream's
[21:30] <Jack64> jangle: indeed it is, 5. so I guess it's possible, I just have to study how to implement it with the PHP I wrote
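A sketch of what that could look like with timeline editing, applying the per-segment score bars and the final fade in a single pass; the file names, overlay images and times are all invented for illustration:

    ffmpeg -i half1.mp4 -i bar_0-0.png -i bar_1-0.png -filter_complex \
      "[0:v][1:v]overlay=0:0:enable='lt(t,83)'[tmp]; \
       [tmp][2:v]overlay=0:0:enable='gte(t,83)',fade=t=out:st=1795:d=5[v]" \
      -map '[v]' -map 0:a? -c:v libx264 -crf 20 -c:a copy out.mp4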
[21:31] <Mavrik> axelri, well if you're lazy and don't want to use any of the half-sensible options
[21:31] <Mavrik> use MediaCodec.
[21:31] <jhoffing> Also...I'm still learning about ADTS headers and container formats. Is an MPEG Version the same thing as an audio container that an audio file is wrapped in?
[21:36] <axelri> Mavrik: We looked into that option and determined it wasn't such a good idea. Our application must work on pre-4.0 Android devices, and we got the impression that MediaCodec is a pain to work with there. Another demand was that we shouldn't need root access, which further seemed to complicate in-phone video processing.
[21:36] <Mavrik> what does root access have to do with anything?
[21:36] <Mavrik> axelri, I'm just telling you your options, if you're too lazy to properly use ffmpeg's libav libraries and bind them together in C to get a proper solution
[21:36] <Mavrik> you might as well use the lazy solution and go with MediaCodec
[21:36] <Mavrik> otherwise use C, make a library that will handle the encoding and expose a method call that will pass images to C code which will encode it
[21:37] <jangle> well you end up building the chains in text strings
[21:37] <jangle> the questions are, is there really a concat filter that will behave in the filter_chain syntax the way that you need
[21:41] <axelri> Mavrik: Sorry if I came across as lazy, ignorant or whatever, I'm just new in the area and don't really know how to solve this properly. My impression was that in earlier versions of Android, you can't encode arbitrary data streams with MediaCodec but just images from the camera. Without root access taking screenshots is a pain, therefore the irregular interval. I've also had no luck finding good tutorials in how to encode video with the libav libraries, therefore I was hoping to avoid it if possible
[21:43] <Mavrik> axelri, yeah, sorry :) Anyway, MediaCodec is available from Android 4.1+ and is a shim for hardware acceleration on Android
[21:43] <Mavrik> it's by far the best way to do video encoding if you can deal with its limitations
[21:43] <Mavrik> before that you can use ffmpeg, yes, but trying to script it up or doing stuff in other languages will kill your performance and give you headaches because you'll have to write bridging code
[21:43] <Mavrik> there are examples on how to do encoding and muxing in ffmpeg source code under doc dir
[21:43] <Mavrik> I suggest you do a simple demo on PC first with that code
[21:44] <Mavrik> and then compile it for android...
[21:44] <Mavrik> the best way you can do that is to write the whole encoding code in C and just expose a method that will pass frames from Android/Java to your C encoding library
[21:44] <Mavrik> you can do it the other ways but it's gonna be much slower and clunkier...
[21:44] <Mavrik> Video encoding is still pretty much sport for people who know C :)
[21:46] <axelri> Mavrik: Yeah, I wish we could just drop support pre jelly bean =) I noticed the doc examples, but I think I'm a bit too new in the media world to understand all of it properly. But I guess there's no denying that when you want top performance scripting isn't really an option
[21:47] <axelri> Still, we decided to do the image processing on the server-side, so that does leave us a lot of options at least
[21:52] <axelri> I'm going to try to fiddle around with the encoding options in my current script, but if that doesn't work out I'll focus on getting dirty with the C libraries. Thanks for the good advice anyway! I haven't used ffmpeg for such a long time but it seems to me that it has a really great community =)
[21:53] <Mavrik> :(
[21:53] <Mavrik> :)
[21:53] <geekgm> hi
[21:54] <geekgm> is there a setting to BURN IN subtitles and change colour of the text?
[21:55] <llogan> geekgm: you can use ass. you can format the color of the text within the ass file.
[21:55] <Mavrik> that's a totally unfortunate name for a format tho.
[21:55] <axelri> Boy was I confused about libav and ffmpeg at first though. Took me a while to figure out that ffmpeg in apt actually wasn't ffmpeg.
[21:56] <llogan> geekgm: you can use aegisub if you want a GUI to edit your ass file.
[21:56] <llogan> then burn it as shown in: https://trac.ffmpeg.org/wiki/How%20to%20burn%20subtitles%20into%20the%20video
[21:57] <geekgm> txs. I will look into that
[21:57] <llogan> also see https://trac.ffmpeg.org/wiki/x264EncodingGuide since you will have to re-encode
[21:58] <llogan> geekgm: you can also stream copy the audio with "-codec:a copy" so it does not get re-encoded if you're not wanting to change the audio format http://ffmpeg.org/ffmpeg.html#Stream-copy
[21:58] <geekgm> I tried using Handbrake but it always burns the subtitles in black.
[22:01] <llogan> i forgot that srt can also change color...but now he's gone
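Put together, the burn-in llogan describes comes down to something like this, assuming an ffmpeg built with libass and the styled subtitles in subs.ass:

    ffmpeg -i input.mp4 -vf ass=subs.ass -c:v libx264 -crf 20 -codec:a copy output.mp4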
[22:03] <Jack64> hey guys are you aware of any HTML5 way to provide subtitles to all platforms (Android, iOS, desktop) ?
[22:03] <Jack64> HTML5 video + subs I mean
[22:04] <Jack64> or do we have to burn them in to ensure it works everywhere the video plays?
[22:06] <JodaZ> well, in theory you can overlay them, and then fullscreen the div containing both the video and the subtitles; but there are potentially platform and performance problems with that
[22:09] <JodaZ> Jack64, http://fiddle.jshell.net/sandro_paganotti/TcpX5/show/light/ see if this works on your target devices
[22:10] <Jack64> it doesn't work in my browser
[22:10] <Jack64> Chrome on ubuntu
[22:11] <Jack64> I'm using something called captionator
[22:11] <Jack64> works everywhere except android
[22:11] <Jack64> which is a bummer -.-
[22:11] <Jack64> so close to full compatibility
[22:11] <JodaZ> what do you mean doesn't work, what does it do when you click go fullscreen
[22:12] <Jack64> in desktop browsers it overlays the subs with javascript
[22:13] <JodaZ> ah, hmm, the captionator doesn't work for me in fullscreen
[22:13] <Jack64> in iOS it uses webVTT to feed the Apple CoreMedia player
[22:13] <Jack64> it works everywhere in Chrome (not Firefox) if that's what you're using now
[22:13] <Jack64> should've specified sorry xD
[22:14] <Jack64> but the problem is Android doesn't work with webVTT
[22:14] <Jack64> actually I'm not sure there's any implemented way to overlay subs on HTML5 video in Android browsers
[22:14] <Jack64> I think I've only seen it done in native apps
[22:15] <JodaZ> well, manually overlaying, question is if captionator tries to use webVTT and not the normal overlay method it uses in chrome on the desktop (or does it use webvtt there too?)
[22:16] <Jack64> I have it working in my own webserver and the videos only have webVTT subs and it works in chrome
[22:17] <Jack64> so with the same source file it works everywhere
[22:17] <Jack64> except Android
[22:17] <Jack64> lol
[22:18] <Jack64> oh yea I forgot, it's even pickier
[22:18] <Jack64> it works on android
[22:18] <Jack64> you can see the subs
[22:18] <Jack64> just not in fullscreen
[22:19] <JodaZ> hmm, well, you need to fullscreen differently
[22:20] <JodaZ> if you fullscreen the video using the video's own fullscreen mechanics and there is no webVTT support nothing will show obviously... but i think this captionator has a different fullscreen method, maybe that works
[22:23] <Jack64> how?
[22:23] <Jack64> and how would I change it for android users only?
[22:23] <Jack64> I'd have to read the headers
[22:23] <Jack64> no?
[22:24] <Jack64> or does that method work for all devices?
[22:28] <JodaZ> Jack64, i don't know, you'd have to ask captionator developers for assistance with that
[22:28] <JodaZ> Jack64, video.js might interest you, its an alternative solution to captionator
[22:29] <Jack64> I use them both :D
[22:29] <Jack64> I use video.js to server videos that have burnt in subs and captionator otherwise
[22:29] <Jack64> serve*
[22:30] <JodaZ> video.js supposedly supports webvtt subtitles too
[22:30] <JodaZ> https://github.com/videojs/video.js/blob/master/docs/guides/tracks.md
[22:31] <Jack64> I tried to use it but it only worked on desktop browsers
[22:32] <JodaZ> again, that's probably to do with the way you fullscreened, you'd use the video.js fullscreen element
[22:34] <Jack64> yes, I didn't override the fullscreen
[22:34] <Jack64> I'll have to study how that's done
[22:36] <jangle> so I'm trying to decode h264
[22:37] <jangle> I've found the avcodec.c example and am wondering why the example isn't set up to decode the test video when h264 is selected?
[22:38] <jangle> for recall, the example generates an in memory video and then encodes it in either h264 or mpeg1video.  if you picked mpeg1video, then the program will not only dump the encoded video to disk, but it will also decode the video and dump each frame to disk
[22:39] <Jack64> does that mean you get like a png or something of every frame?
[22:39] <jangle> this does not happen when h264 is selected, only the h264 annex b file is generated.  When I modified the example code to decode the file when using the h264 decoder, the decoder isn't happy
[22:39] <jangle> in this case a pgm, but yes
[22:39] <Jack64> I never played with C libs
[22:40] <Jack64> that's libavformat right?
[22:40] <jangle> all I need (i think) is libavcodec
[22:40] <Jack64> right
[22:40] <Jack64> you should only need libavcodec if you're just encoding/decoding
[22:41] <jangle> I already have the encoded video in memory when I get to this point, but I don't know how to set up the decoder datastructures by hand, because I'm not starting with a container file that the easy library calls can use to pick apart the data and set everything up
[22:41] <jangle> so I could use some help
[22:42] <Jack64> so when you use mpeg1video it doesn't load it the same way?
[22:44] <Jack64> you should follow the example code flow for case mpeg1video and see how it differs from h264 in the loading
[22:45] <Jack64> or maybe it's not decoding the stream because that function isn't called
[22:51] <jangle> so
[22:52] <jangle> the example code only decodes the video if you chose to use the mpeg1 encoder
[22:52] <jangle> so it sets up a decoder with the id of mpeg1video,
[22:52] <jangle> and if you did indeed choose the mpeg1 codepath, it issues a video decode call using the aforementioned decoder
[22:52] <jangle> if you chose the h264 path, the test video is encoded and dumped to disk, and not decoded
[22:53] <jangle> if I change the code to setup an h264 decoder, and then add the code that makes the decode call when choosing the h264 path, the decoder complains
[22:53] <Jack64> hmm
[22:54] <Jack64> try different C++ code
[22:54] <jangle> the only reason I can't officially call this a bug is because the provisions for decoding the video in h264 are added by me.  I assume they've been left out because there is more to it that they decided not to go into
[22:54] <jangle> if I found some then I would have :-P
[22:54] <Jack64> here's what seems to be a nice tutorial
[22:54] <Jack64> http://blog.tomaka17.com/2012/03/libavcodeclibavformat-tutorial/
[22:54] <Jack64> http://lmgtfy.com/?q=libavcodec+decode+h264+and+dump+frames+example
[22:55] <jangle> :-P
[22:55] <jangle> I've found several of the 1st example
[22:55] <jangle> and the problem is that I'm not starting with a container file
[22:55] <jangle> I'm starting with encoded nals already, and need to progress from there.
[22:56] <Jack64> I love lmgtfy eheh :D
[22:56] <Jack64> hmm ok
[22:57] <JodaZ> encoded nals ? where you got those from?
[22:57] <Jack64> so maybe you need to decode those nals into the the proper structure and then you'll be at the first example's starting point right?
[22:57] <Jack64> JodaZ: you can get them from mkvextract for example
[22:58] <Jack64> if you extract mkv video track you can do it in .nal
[22:58] <Jack64> or .264 which is the same
[22:59] <Jack64> I assume the code the example uses generates the input in memory and goes with nal formate
[22:59] <Jack64> format*
[23:04] <jangle> I have my own rtsp/rtp client
[23:04] <jangle> JodaZ:
[23:04] <Jack64> so your input is like a webcam?
[23:04] <Jack64> or you have a server too?
[23:05] <jangle> Jack64: anything that spews h264 over rtp and uses rtsp as the control protocol
[23:06] <jangle> and for your last question, the code example just dumps the nals to disk in the proper format
[23:06] <jangle> I can generate my own nals, but you'd think that the library would be able to easily decode the results of its own encode
[23:07] <jangle> and since the h264 annex b format is just a lightly packed file containing all of the encoded data, and that ffplay can play this file, that the encode step and decode step should be mirror images of each other
[23:07] <jangle> I invite anyone to try it, its a two line change to avcodec.c
[23:07] <Jack64> it makes sense that it would be so
[23:08] <Jack64> you're doing it in linux?
[23:08] <jangle> Jack64: have you figured out how to apply your filter to a section of video?
[23:08] <jangle> osx
[23:08] <jangle> but all c/c++, not objective c or wrapped up in whatever
[23:09] <Jack64> I've bookmarked the ffmpeg docu page but not working on that right now (since what I have is working, even though it takes a while)
[23:10] <JodaZ> jangle, you can't use the ffmpeg commandline tool to just do whatever you want to do with your nalstream?
[23:10] <Jack64> I'll optimize later, I think I'll get an even better boost if I write a worker handler that detects my PC and my friend's PC on the same LAN and sends them the various input files (distributed workers)
[23:11] <jangle> JodaZ: I'm going to be displaying the frames on a canvas in real time.  So I don't know, does that seem doable?
[23:12] <JodaZ> jangle, why not just ffplay them then?
[23:12] <jangle> JodaZ:  Because I don't know how to get the live bitstream into ffplay
[23:13] <JodaZ> jangle, pipe?
[23:13] <jangle> and does it accept raw nals?
[23:14] <JodaZ> jangle, probably; why not just do some experiments, i'd start by having ffmpeg convert them and see if that works
[23:14] <jangle> ffmpeg can convert them, so I'm at the next step
[23:15] <JodaZ> i strongly assume if ffmpeg can convert them, that ffplay can play them
[23:15] <jangle> I can generate a snapshot of the webcam, and output the h264 nals in annex b format that ffmpeg can convert
[23:15] <JodaZ> so play them
[23:16] <JodaZ> snapshot?
[23:16] <jangle> I need to transition to displaying frames in real time
[23:16] <jangle> ffplay opens up a container file
[23:16] <jangle> and plays its contents
[23:16] <jangle> or optionally an rtsp stream maybe I haven't looked
[23:16] <jangle> I have my own rtsp/rtp code
[23:16] <jangle> and can get the nals
[23:16] <jangle> and now I need to decode the nals and display frames on something in real time
[23:17] <JodaZ> sure, so your rtsp/rtp code outputs them to a file?
[23:17] <jangle> no
[23:17] <jangle> only when I ask it to, to demonstrate that I have recovered the encoded data correctly
[23:17] <JodaZ> well, just push them out to stdout and pipe that into ffplay
[23:19] <jangle> ffplay doesn't read from stdin
[23:19] <JodaZ> sure it can
[23:20] <jangle> I'm looking at the code, but it requires that a filename be passed as an argument
[23:20] <JodaZ> yes, with - as a special filename meaning stdin
[23:25] <jangle> well I guess that worked.  while this may work for now, I still need access to the individual frames
[23:25] <jangle> Or I guess I should say that this works because I'm catting an h264 file on the command line into ffplay
[23:26] <JodaZ> well, you should be able to rewrite your rtsp client to output on stdout, alternatively you could use a unix pipe file
[23:28] <JodaZ> jangle, for getting individual frames you can use the image2 output of ffmpeg
[23:28] <JodaZ> http://www.ffmpeg.org/ffmpeg-formats.html#image2-1
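Sketches of both suggestions, with rtspclient standing in for jangle's own (hypothetical) program writing Annex B NALs to stdout:

    # play the live elementary stream ("-" means read stdin)
    rtspclient | ffplay -f h264 -
    # or dump decoded frames as images via the image2 muxer
    rtspclient | ffmpeg -f h264 -i - -f image2 frame-%05d.pgm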
[23:29] <jangle> It already dumps other things to stdout, and I've already failed to figure out how to use pipes for this exact purpose before, so I'd rather not.  Also, I need to do things with the individual frames.  Tricking ffplay into thinking it's reading an h264 annex b file while in reality it's being "streamed in" seems like a hack
[23:29] <jangle> cool.  but in memory.  thanks for the link though
[23:29] <JodaZ> piping is not a hack
[23:30] <JodaZ> and it is "in memory"
[23:30] <jangle> sorry unclear, the image2 call dumps the image frames to disk
[23:30] <JodaZ> regarding it dumping other things to stdout, that's for example avoided in ffmpeg; it outputs messages only to stderr
[23:31] <JodaZ> well, either you want individual frames or you want a video stream, you currently have a video stream, you can do everything you want with it, but since you don't say what exactly you do want to do with it i can only guess
[23:32] <JodaZ> play it with ffplay, pipe it to a file for storage, transmute it with ffmpeg
[23:32] <jangle> once I have the frames I can do any number of things at the same time
[23:32] <jangle> I can display them on a canvas, I can record snapshots of events I care about, or make picture in picture displays, or use opencv to do interesting things
[23:32] <jangle> I need the frames so I can do any of them.
[23:33] <JodaZ> jangle, well, maybe use ffmpeg to output them as a yuv stream then
[23:33] <jangle> and at the moment, I need to control displaying of frames at the right time, because ffplay seems to play my bitstream too fast. I understand that timing information may not be stored in the bitstream, so that is yet another thing I need to handle
[23:33] <jangle> i have to decode them first!
[23:34] <JodaZ> how lucky you are that ffmpeg is good at decoding things then
[23:34] <jangle> yeS!!!!!1
[23:34] <jangle> but in memory!!!!!!
[23:34] <JodaZ> sure
[23:34] <Jack64> I wanna play with opencv too -.-
[23:35] <jangle> am I the only person in the history of the world, that wants to use the libav* libraries to decode in memory encoded data?
[23:35] <jangle> using a statically linked version of the libraries?
[23:35] <jangle> or a dynamically linked version, but not by running a second binary?
[23:36] <JodaZ> jangle, maybe the only non-programmer
[23:36] <jhoffing> *whoosh*
[23:36] <jangle> Nice one
[23:36] <jangle> explain to me, how I've failed to understand my problem
[23:37] <Jack64> if you can do it by running a second binary (which is probably the easiest way), why not?
[23:38] <alexa> How to amplify sound using ffmpeg?
[23:38] <jangle> because using the library to pretend to be giving it a file and instead giving it a streamed version of the file feels like a hack
[23:39] <alexa> I used to extract audio with audacity and to apply amplify effect, then to save as audio, then to add into original video using avidemux.
[23:39] <Jack64> jangle, ah so your quarrel is with unix piping you feel it's a hack eheh
[23:39] <JodaZ> alexa, if you put that question as is into google, the third result will tell you
[23:39] <jangle> no, I've used it before for exactly this purpose and I couldn't get it to work
[23:40] <jangle> and I'm writing cross platform code and I don't need to use pipes
[23:40] <jangle> I only need to use pipes because apparently programmers never need to use anything else
[23:40] <Jack64> alexa, http://lmgtfy.com/?q=How+to+amplify+sound+using+ffmpeg%3F
[23:42] <Jack64> well I actually agree with you in that using libavcodec to do it should be more straightforward
[23:42] <jangle> right?  except we are now the only two who think so
[23:42] <Jack64> especially because of live sources
[23:43] <Jack64> I assume it'd be easier to manipulate live sources if they are decoded without an external binary
[23:43] <Jack64> easier/better performance
[23:43] <jangle> there is nothing complicated about the h264 annex b stream, and any code that stands up reading the file stands up a decoder to display the frames.  I just need the library to accept the encoded data from a buffer in memory, instead of having it do that for me by reading it in from a file
[23:44] <alexa> but what if I need to do it inside flv video
[23:44] <alexa> ?
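The usual answer, and presumably what the search result suggests, is the volume filter: re-encode only the audio while stream-copying the video. The factor 2.0 (about +6 dB) is just an example, and depending on the build you may need to pick the audio encoder explicitly (e.g. -c:a libmp3lame for flv):

    ffmpeg -i input.flv -c:v copy -af volume=2.0 output.flv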
[23:45] <Jack64> jangle, so your in memory data is nal and it expects an mkv file or it can deal with nal but it has to be a file?
[23:45] <jangle> no
[23:45] <jangle> every example i've found
[23:46] <jangle> is code that opens a file to start decoding and encoding
[23:46] <jangle> so in those steps of opening the files and picking out the different media streams
[23:46] <jangle> the libraries are built to make it easy to do that, to autodiscover whats in the file, what kind of encoded media are in each stream
[23:46] <jangle> and then let you do things with each stream.
[23:47] <Jack64> but it has to be a file?
[23:47] <jangle> since I'm showing up with nals in memory (and since the h264 annex b is essentially the nals lightly wrapped) and since ffplay can open this file and play the contents onto a canvas, I figured it would be
[23:49] <JodaZ> http://ffmpeg.org/doxygen/trunk/api-example_8c-source.html
[23:49] <JodaZ> seems the data is coming from fread(), so you could just replace that with your own in memory data source
[23:49] <jangle> the examples describe working with files.  otherwise you have to set up the decoder yourself.  In the case of h264, my assumption is that if your annex b stream can be read by ffplay, since there isn't anything more to it than that, the same routines that are used to set up the decoder for data read from the file can be used to decode data passed in from a buffer
[23:51] <JodaZ> jangle, the example i linked is working with a file, but since you are calling yourself a programmer, i assume you should be able to replace that fread call with something that gets the data from your "memory" instead
[23:55] <jangle> JodaZ: the other example I found, which is similar, is this http://ffmpeg.org/doxygen/trunk/avcodec_8c-example.html
[23:56] <JodaZ> its the same
[23:56] <jangle> i changed the decoder in the linked file to be h264, and attempted to do what it does with mpeg1video with h264 instead.  It didn't work and I've posted about it.
[23:56] <JodaZ> well, that thing is doing a whole bunch of stuff before that
[23:56] <jangle> oh really?
[23:56] <jangle> like what mister programmer?
[00:00] --- Wed Mar  5 2014

