[Ffmpeg-devel-irc] ffmpeg.log.20141023

burek burek021 at gmail.com
Fri Oct 24 02:05:01 CEST 2014


[01:19] <bencc> I'm cropping and transcoding mkv video to mp4 but ms movie maker can't play the video, only the sound
[01:19] <bencc> http://dpaste.com/08TEMR9
[01:19] <bencc> am I using unusual parameters?
[01:31] <bencc> llogan: http://dpaste.com/27D8K5W
[01:36] <llogan> bencc: i don't know if ms movie can decode lossless
[01:37] <iive> well, easy to find out. try a bigger number for quantizer :)
[01:37] <llogan> try 18
[01:37] <llogan> or a better format for editing
[01:37] <bencc> llogan: 18 what?
[01:37] <llogan> -crf 18
[01:37] <bencc> what's a better format for editing?
[01:38] <llogan> ut video probably
[01:38] <bencc> ut?
[01:38] <llogan> ut video
[01:38] <bencc> what is ut video?
[01:40] <llogan> a lossless codec
[01:41] <bencc> llogan: is it supported by default on windows?
[01:41] <bencc> what is the file extension?
[01:41] <PovAddict> codecs don't have file extensions
[01:42] <bencc> ok. I see I need something like ffmpeg -i input -codec:v utvideo -codec:a pcm_s16le output.avi
[01:42] <bencc> for utvideo inside avi container
[01:42] <PovAddict> do you know the difference between a codec and a file format / container?
[01:42] <bencc> does windows support it by default?
[01:42] <bencc> PovAddict: yes. I was wrong
[01:44] <bencc> does windows media player support ut video by default?
[01:44] <PovAddict> I never heard of ut before, so I doubt Windows supports it
[01:44] <iive> bencc: first, try the -crf , then try other codecs.
[01:45] <bencc> iive: I'm trying it now
[01:45] <iive> you can use -t 1:0 to encode just one minute
[01:45] <iive> you don't need to finish the whole video for the test.
[01:46] <llogan> bencc: ms movie *may* support it if you install ut video first, then restart ms movie.
[01:46] <llogan> http://umezawa.dyndns.info/archive/utvideo/?C=M;O=D
[01:46] <bencc> llogan: I need to send the video to someone who works with microsoft movie maker
[01:46] <llogan> then they can install it
[01:46] <bencc> he won't be able to install additional codecs
[01:47] <PovAddict> is ut video encoding fast enough to do in real time?
[01:47] <iive> llogan: is there lossless vc1?
[01:47] <llogan> i don't know
[01:48] <llogan> i've used ut video in and out of premiere a few times
[01:49] <llogan> when i felt too lazy to use a frameserver
[01:49] <llogan> since i hate AME
[02:23] <bencc> crf 18 works. thanks
[02:26] <iive> crf 18 is equivalent to mpeg2 quant=2 . smaller quant gives better quality and bigger size
[02:33] <bencc> iive: microsoft movie maker couldn't handle crf 0
[02:33] <bencc> I'll try crf 1
[02:55] <dektec> Hello, I'm trying to output a video stream and an audio stream to separate FIFO files, that is, the video stream to one file, the audio stream to another fifo file, but it's not working
[02:56] <dektec> any ideas, please?
[03:02] <dektec> Ok, here's the pastebin showing input and outputs: http://www.pastebin.com/k8sDFhKt
[03:03] <dektec> i'm trying to pipe the video stream to fifo1.m2v, and the audio stream to afifo.mp2, this doesn't work, I can't pipe anything. However, when used individually, only one pipe, it works as expected.
[03:03] <dektec> even when one output is a fifo file, and the other output is a regular file, it works, but not when both outputs are set to fifo files
[03:03] <dektec> any ideas? I think I'm lacking in linux here
[03:05] <sacarasc> Why are you going for mpeg2?
[03:06] <dektec> it's required for the hardware I'm working on
[03:06] <sacarasc> What is going wrong?
[03:07] <dektec> when trying to ffplay one of the fifos, nothing happens, it's like the fifos were empty
[03:07] <sacarasc> Which one?
[03:08] <dektec> any of the two
[03:08] <dektec> but when I set the output for 1 fifo, and 1 regular file, everything works fine. I can ffplay the fifo just fine
[03:09] <dektec> Is it actually possible in linux to output to 2 different fifo files simultaneously?
[03:13] <PovAddict> you can't read from just one
[03:13] <PovAddict> because ffmpeg will be writing to both simultaneously, if you read from one, you get just a chunk of the video and then ffmpeg is blocked writing to the other, with nobody reading
[03:14] <PovAddict> try ffplay fifo.m2v, and simultaneously, cat afifo.mp2 > /dev/null
[03:15] <dektec> would that be ffplay fifo.m2v && cat afifo.mp2 > /dev/null ?
[03:15] <dektec> trying it now
[03:15] <PovAddict> no, that would wait until ffplay quits before running cat
[03:15] <PovAddict> do it in two consoles
[03:16] <PovAddict> so they are simultaneous
[03:17] <dektec> wow that worked!
[03:18] <PovAddict> great
[03:18] <PovAddict> you could also try two ffplays :)
[03:18] <dektec> amazing, but I thought that ffmpeg just wrote the fifo and didn't care if someone is reading it?
[03:19] <PovAddict> the write blocks if nobody is reading, that's an operating system thing
[03:19] <PovAddict> where would the data go otherwise? dropped?
[03:19] <PovAddict> stored in an infinite-size buffer?
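PovAddict's point is pure operating-system behaviour, independent of ffmpeg: opening a FIFO for writing blocks until some reader opens the other end. A minimal Python sketch of that rendezvous (file name hypothetical):

```python
import os
import tempfile
import threading
import time

def demo_fifo_blocking():
    """Show that opening a FIFO for writing blocks until a reader appears."""
    d = tempfile.mkdtemp()
    path = os.path.join(d, "demo.fifo")
    os.mkfifo(path)

    progress = []

    def writer():
        # open() for write-only blocks here until a reader opens the FIFO
        with open(path, "wb") as f:
            progress.append("opened")
            f.write(b"payload")

    t = threading.Thread(target=writer)
    t.start()
    time.sleep(0.2)
    blocked_before_reader = (progress == [])  # writer still stuck in open()

    with open(path, "rb") as f:  # attaching a reader unblocks the writer
        data = f.read()
    t.join()
    os.unlink(path)
    os.rmdir(d)
    return blocked_before_reader, data
```

This is why ffmpeg, writing two FIFOs, stalls unless both are being drained: one blocked `write()` stops the whole muxing loop.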
[03:19] <llogan> dektec: now get rid of your -tune and -preset since those do nothing with mpeg2video
[03:20] <llogan> why did you add -r 23? your input is 24000/1001
[03:27] <dektec> hehe, thnaks
[03:27] <dektec> *thanks
[03:28] <dektec> i need to set -r 23, otherwise video freezes for some reason
[03:28] <dektec> i'm streaming to a modulator hardware for digital tv
[03:58] <PovAddict> I had to move my DVD menu from the VMGM to the titleset, and now I can set it to entry="root", so the 'menu' button on my player remote goes to the menu \o/
[04:23] <onyx> hey guys, is it possible to crop a 4:3 livestream to 16:9 while recording it?
[04:23] <PovAddict> crop top and bottom you mean?
[04:23] <onyx> yes
[04:23] <PovAddict> sure is
[04:24] <onyx> so it goes from 4:3 to 16:9 format
[04:24] <PovAddict> do you already have a command for basic recording?
[04:24] <onyx> yes one sec
[04:24] <onyx> this is what Im using http://pastebin.com/195KgWvQ
[04:29] <groxx> Can someone help me figure out why my crop filter seems to be ignored?
[04:30] <PovAddict> onyx: sorry, got distracted by other channels :)
[04:30] <onyx> PovAddict: you see my command?
[04:30] <PovAddict> yes
[04:31] <PovAddict> try -vf crop=out_h=in_w/16*9
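The arithmetic behind that crop expression is easy to sanity-check; a small sketch (helper name hypothetical — note that for 4:2:0 pixel formats the crop height should also stay even):

```python
def crop_height_16_9(in_w, in_h):
    # out_h = in_w / 16 * 9: the height a 16:9 frame would have at this
    # width; clamp to the source height and keep it even for yuv420p.
    out_h = in_w * 9 // 16
    out_h -= out_h % 2
    return min(out_h, in_h)
```

So a 640x480 (4:3) source would be cropped to 640x360; the crop filter centers the crop window by default when x/y are not given.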
[04:31] <onyx> ok
[04:38] <groxx> Some more info perhaps: I'm building ffmpeg for Android, from the ffmpeg-android github repo.  crop is in the list of filters, but despite passing `crop=168:168:0:21` nothing gets cropped.  is there any way I can debug this further?  on older versions I would see [crop <etc>] <crop details> in the output, but there's nothing like that now.
[04:44] <onyx> PovAddict: where in the command should I add that part?
[04:51] <PovAddict> anywhere between the input and output
[04:51] <onyx> ok
[04:51] <PovAddict> ffmpeg -i http://serverip:1935/livestream -acodec copy -vcodec copy -t 5:00 -vf crop=blah /pathto/videoname-$(date +\%Y-\%m-\%d).mp4
[04:54] <onyx> PovAddict: didnt work
[04:54] <onyx> no error or anything, it doesnt do anything
[04:55] <PovAddict> it saves the video unmodified?
[04:55] <onyx> no it doesnt do anything at all
[04:55] <PovAddict> could be a problem with the server? try removing the -vf stuff again
[04:57] <onyx> mmm... weird, now it's just hanging there
[04:57] <onyx> wait a min
[04:57] <onyx> this makes it not show anything, right? > /dev/null 2>&1
[04:57] <PovAddict> lol yup
[04:58] <onyx> lol
[04:58] <onyx> ok now trying without that -vf part
[05:00] <Akagi201> Why do some ffmpeg functions have a number after the name, like avcodec_open2, while I can't find an avcodec_open function?
[05:14] <rcombs> Akagi201: that happens when the API changed, so a new function name (with a 2 or 3 suffix) was added, and then later the previous version was deprecated and removed
[05:16] <Akagi201> rcombs: This trick is not very elegant
[05:17] <rcombs> Akagi201: it's really the only way to update a C API while maintaining backwards compatibility for some period
[05:30] <onyx> PovAddict: im getting this error Filtergraph 'crop=out_h=in_w/16*9' was defined for video output stream 0:0 but codec copy was selected.
[05:30] <onyx> Filtering and streamcopy cannot be used together.
[05:30] <PovAddict> ah yes
[05:30] <ghostwagon> is it at all possible to get hold of the rtmpdump v2.5 source?
[05:30] <PovAddict> you're saving the stream unmodified
[05:30] <PovAddict> if you want to do any filtering, you need to reencode it
[05:30] <onyx> oh booo
[05:31] <PovAddict> you can do it as you download, if your CPU is fast enough
[05:31] <onyx> is a single core
[05:31] <onyx> with little ram too
[07:08] <t355u5> Does anybody know what configure option is required to have LPCM support in ffmpeg?
[08:16] <ruby_on_tails> how can i totally remove the audio codec from a video
[08:16] <ruby_on_tails> my video doesnt have any audio and something is wrong with the audio codec definition due to a problem while transcoding it elsewhere
[08:16] <ruby_on_tails> i want to get rid of any audio info to make the video work, it fails in many players
[10:45] <xata> Hello
[10:46] <xata> I have video data to be demuxed in an array, can i use lavf to do this? Or is it file-only?
[10:52] <ubitux> what does that mean?
[10:52] <ubitux> you can use a rawvideo input
[10:53] <ubitux> look at doc/examples/demuxing_decoding.c
[10:53] <ubitux> ignore the decoding part
[10:53] <ubitux> force the input format as "rawvideo"
[10:53] <ubitux> set the pixel format and dimensions
[10:53] <ubitux> and that's it
[10:55] <ubitux> basically, the same as if you were doing  ffmpeg -f rawvideo -pixel_format rgb24 -video_size 320x240 -i "pipe/fifo/whatever" -f null -
[11:00] <xata> ubitux: well imagine i have an mpeg2 file in an array. I want the video to be decoded in hardware using an external library, but first i have to separate the video from the audio and other data. How do i do that - is there something like this in the examples?
[11:04] <ubitux> "a mpeg2-file in array"?
[11:05] <termos> is it possible to set the Decoded Picture Buffer (DPB) in FFmpeg somehow?
[11:11] <xata> ubitux: yup. an mpeg2-encoded file fully loaded into memory (an array of 8-bit ints)
[11:13] <ubitux> xata: o_o
[11:13] <ubitux> xata: well then you might need to redefine the avio layer
[11:14] <ubitux> look at doc/examples/avio_reading.c
[11:17] <xata> ubitux: thanks!
[13:11] <termos> Is there a good way to add salt & pepper noise to ffmpeg output? The noise filter only seems to support other types of noise
[13:46] <Alina-malina> hmmm is it possible to do a transition effect with ffmpeg? for example i have 1 video playing and i want to stop it and replace it with another one immediately, while it is playing? for example i play a movie, it is streaming, then i hit a hotkey and it does a soft transition to another movie, without stopping the process?
[13:47] <Alina-malina> and for example if i hit another hotkey it makes a transition to the other movie at the exact video time?
[13:47] <klaxa|work> no, i don't think ffmpeg is able to do that
[13:49] <Alina-malina> hmmm
[13:49] <Alina-malina> what can i use then for that?
[13:49] <Alina-malina> :-/
[14:51] <termos> how can I set analyzeduration using the C api?
[14:56] <c_14> Pretty sure it's max_analyze_duration2 in AVFormatContext
[15:05] <termos> hm yes, that was easy :) it says no direct access though
[15:07] <termos> av_opt_set_int(inout_context->priv_data, "analyzeduration", 10 * 100000, 0) i guess?
[15:10] <c_14> try it, I'm not that great with the api
[15:18] <termos> hm okey, analyzeduration is not really doing what I wanted it seems.
[15:18] <termos> I wanted to increase the input buffer, increasing the delay of the streaming but making it more robust with regards to unstable/laggy input streams
[15:20] <c_14> most input protocols have a dedicated buffer option you can set
[15:25] <termos> the input is an rtmp stream, do you know how to set its buffer?
[15:26] <c_14> rtmp_buffer ?
[15:37] <termos> ok, I could set it but it doesn't look like it's being honored
[15:39] <termos> I see it's by default 3000ms = 3 seconds, setting it to 13000 should be 13 seconds delay but there seems to be no difference
[15:39] <c_14> eh, you could try adding a buffer filter, but other than that, I'm out of ideas
[15:50] <termos> okey thanks :) I was hoping to get the frames buffered before the decoding+filter graph
[16:11] <Tatou> hi
[16:12] <seasc> hi
[16:12] <Tatou> So, i have problem for burn subtitles into video
[16:12] <Tatou> '-'
[16:14] <Tatou> i can't configure libass
[16:14] <Tatou> :p
[16:14] <seasc> try ass
[16:14] <seasc> :p
[16:14] <Tatou> okay
[16:14] <Tatou> xD
[16:14] <seasc> or if it fails as well, try copy
[16:17] <seasc> i want to add another audio stream to a video (which already has audio, i want both). When i just add '-i audiofile.aac' while encoding, it completes successfully but doesn't include the additional audio stream
[16:17] <c_14> -map 0 -map 1
[16:17] <Hello71> i feel like there should be -map 0,1
[16:18] <seasc> i have that for the main file anyway
[16:18] <c_14> ffmpeg will only copy one video and one audio stream by default
[16:18] <c_14> If you want more, you'll have to tell it that.
[16:18] <seasc> well... -map 0:v and -map 0:4
[16:18] <seasc>  ffmpeg -v quiet -i "demo-1080-movie-weird-subtext.mkv"  -i "demo-anime.id1.aac"     -strict -2  -map 0:v -c:v libx264   -ac 2 -c:a aac   -map 0:4   -metadata description='Encoded by VHS (Video Handler Script 1.0.5), using ffmpeg 2.1.5' -f mp4 "demo-1080-movie-weird-subtext.0.mp4"
[16:19] <c_14> add -map 1
[16:19] <seasc> just after demo-*aac ?
[16:19] <seasc> or anywhere?
[16:19] <c_14> eh anywhere after demo-anime and before movie-weird-subtext should be fine
[16:21] <seasc> weird-subtext is ahead already, did you notice? Because the other way around it failed due to 'misaligned' streams
[16:21] <c_14> eh, the output one
[16:21] <seasc> just after demo*aac then... thank you
[16:22] <seasc> ah ok
[16:22] <seasc> thank you
[16:22] <c_14> Didn't notice that the string I chose wasn't unique.
[16:36] <seasc> Now it fails: http://pastebin.com/pDjGwuCQ
[16:38] <c_14> You want -map 1 not -map 0:1
[16:38] <seasc> Ahh
[16:38] <seasc> used to 0.* :P
[16:38] <c_14> -map 1 maps all streams from the second file (index 1), -map 0:1 maps the second stream (index 1) from the first input file (index 0)
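c_14's indexing rule can be modelled with a toy resolver (function and stream labels are hypothetical; real ffmpeg map specifiers also support stream-type qualifiers such as 0:a or 0:v):

```python
def select_streams(map_spec, inputs):
    """Toy model of ffmpeg's -map indexing.
    inputs: one list of stream labels per input file.
    'N'   -> every stream of input file N
    'N:M' -> only stream M of input file N"""
    parts = map_spec.split(":")
    streams = inputs[int(parts[0])]
    if len(parts) == 1:
        return list(streams)
    return [streams[int(parts[1])]]
```

For example, with files = [["video", "audio_en"], ["audio_jp", "audio_de"]], "-map 1" would take both streams of the second file, while "-map 0:1" takes only "audio_en".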
[16:39] <seasc> Ahh i see, thank you very much
[16:53] <seasc> weird, it fails when i add more than one audiofile to be included.. "-i INPUT -i F1 -map 1 -i F2 -map 2"
[16:53] <c_14> the maps go after the last input file
[16:53] <c_14> and before the output file you want it to go to
[16:53] <seasc> ok thank you
[16:54] <seasc> i assume filter_complex as well?
[16:55] <c_14> yep
[17:33] <seasc> when i use my webcam as an overlay on the screen recording, i use this: guide_complex="'[0:v:0] scale=320:-1 [a] ; [1:v:0][a]overlay'".... what do i need to change when i want (file&file) to make my 2nd file the overlay and the first one the 'ground'? (currently the 2nd is 'ground' and the 1st is overlay)
[17:34] <c_14> put the [a] in front of the [1:v:0]
[17:35] <kepstin-laptop> the order of inputs to the overlay filter is what does that; the first input is the bottom, the second input is the top.
[17:36] <seasc> dang, so no number switching in: [0:v:0] ?
[17:36] <kepstin-laptop> hmm? no, you just change [1:v:0][a]overlay to [a][1:v:0]overlay
[17:36] <seasc> @ kepstin-laptop no it's not, it's the second as background (bottom) and the first as overlay (top)
[17:39] <kepstin-laptop> seasc: mind pasting your entire command somewhere? the order that the -i are in the command line is important
[17:39] <kepstin-laptop> [0:v:0] matches the first -i, and [1:v:0] matches the second -i
[17:42] <seasc> currently fixing that part within the script that does generate that part of the command
[17:49] <delet> libavformat/http.c:906:1: error: control reaches end of non-void function [-Werror=return-type]
[17:49] <delet>  }
[17:49] <delet>  ^
[17:49] <delet> cc1: some warnings being treated as errors
[17:49] <delet> make: *** [libavformat/http.o] Error 1
[17:49] <delet> why?
[17:54] <seasc> kepstin-laptop, c_14, http://pastebin.com/Gq2B36vS
[17:56] <kepstin-laptop> seasc: do you want the first input or the second input to be the background in the overlay? And what resolution do you want the final video?
[17:56] <kepstin-laptop> right now you're rescaling the file "demo-1080-movie-weird-subtext.mkv" to 320×nnn
[17:57] <seasc> i want the 'anime' as overlay
[17:57] <seasc> original resolution
[17:57] <seasc> well.. the overlay as 320xnnn
[17:57] <kepstin-laptop> ok, you have a mess of -map options that should mostly be deleted there
[17:58] <Diogo> hi, is it possible to generate a mpeg ts with this structure..
[17:58] <Diogo> ffmpeg -y -i rod.mp4 -force_key_frames 2,2,3,3,4,4,4,4,4,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,9 -codec:v libx264 -r 25 -g 25  -codec:a libfdk_aac -f stream_segment -segment_times 2,2,3,3,4,4,4,4,4,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,5,9 -segment_format mpegts
[17:58] <Diogo> stream%d.ts
[17:58] <Diogo> 2 seconds 2 seconds 3 seconds 3 seconds .....
[17:58] <Diogo> and ends with 9 seconds?
[17:58] <seasc> kepstin-laptop, they are the audio streams from the first input (weird subtext)
[17:58] <kepstin-laptop> seasc: if you want to rescale the second input to 320xwhatever, then the input to the scale filter should be [1:v:0] - the first number is the input number; 0 is first input, 1 is second input, and so on.
[18:03] <kepstin-laptop> so, to use the 1080-movie file as the background and add a 320x copy of demo-anime on top, you'd want something in your filter-complex like "[1:v:0]scale=320:-1[a];[0:v:0][a]overlay"
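The scale=320:-1 part of that graph computes the missing dimension from the input aspect ratio; a sketch of the arithmetic (helper name hypothetical — the real filter also offers -2 instead of -1 to force an even result, which most encoders require):

```python
def scaled_height(in_w, in_h, out_w=320):
    # scale=out_w:-1 keeps the input aspect ratio:
    # -1 means "derive this dimension from the other one"
    return round(in_h * out_w / in_w)
```

So a 1920x1080 demo-anime input would come out as a 320x180 picture-in-picture overlay.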
[18:04] <seasc> ahahah ok... i was changing the wrong value... thank you kepstin-laptop i'll try this
[18:04] <kepstin-laptop> seasc: or you could just switch the order of the -i options and use your original filter chain
[18:05] <kepstin-laptop> (but that would mean changing all your map commands around)
[18:05] <seasc> right, thats why i prefer to make another complex filter string :)
[18:06] <kepstin-laptop> you have a bunch of map commands there which are either referring to video tracks or are ambiguous, you might want to check them all and see which are needed
[18:06] <xata> What version of SDL is needed to build ffplay? 1.2 or 2?
[18:06] <kepstin-laptop> xata: my copy's built with 1.2; i dunno if it supports 2.
[18:07] <seasc> kepstin-laptop, which map command would be ambiguous? the 0:4 or the 0:v?
[18:08] <kepstin-laptop> seasc: well, the 0:v refers to a video track, which is unnecessary with the type of filter chain being used
[18:08] <kepstin-laptop> the -map 1 near the start of the command could be ambiguous
[18:08] <kepstin-laptop> but it doesn't actually seem to be causing issues at the moment
[18:09] <seasc> the 0:v is used (generated) as soon the main input file has a video stream which is to keep.
[18:09] <xata> kepstin-laptop: thanks
[18:10] <kepstin-laptop> seasc: but your filter chain is taking 0:v and 1:v explicitly as inputs, and outputing a completely new video track
[18:10] <kepstin-laptop> so using -map 0:v there is a bit confusing
[18:12] <seasc> kepstin-laptop, yes, but as i said, its a script generating the command, and the -filter_complex 'filter' is only added if another input to be added is provided... it is not taking it 'doubled', yet....
[18:12] <kepstin-laptop> I'm not sure it's entirely wrong; depending on where it's being applied it might actually be selecting the output of the filter chain, which would have been selected by default anyways
[18:14] <seasc> kepstin-laptop, i'll do a few 'test-runs' and see what happens, if it fails you've already pointed out a likely (?) possibility... Thank you again!
[18:15] <seasc> kepstin-laptop, oh one last thing... now i have the overlay audio as 'default' audio, the main file's audiostream has id 3
[18:15] <kepstin-laptop> but pretty much the only times you have to use map are if a) you want to include multiple tracks of the same type in the output, or b) the wrong track is selected
[18:16] <seasc> or if you have diffrent inputs
[18:16] <kepstin-laptop> well, you don't need them unless the wrong track is selected
[18:19] <kepstin-laptop> fun, you have 3 different audio tracks from those two files; 0:4, 1:2, and 1:3
[18:19] <seasc> yeah the anime has 2 streams, eng jap
[18:19] <kepstin-laptop> the default selected would be the first one, from the movie file
[18:19] <seasc> its the english from the overlay
[18:20] <seasc> anime
[18:20] <kepstin-laptop> oh, it's selected 1:2 by default?
[18:20] <kepstin-laptop> that might be because of the "-map 1" near the start of your command line
[18:20] <seasc> they're just sample files... i take them as reference when trying out somethign new.. like this
[18:21] <seasc> but there's a -map 0:4 afterwards
[18:21] <seasc> so that 0:* is weaker than the '0' (here 1)
[18:21] <kepstin-laptop> seasc: the "-map 1" selects all the audio tracks from the anime file; then the "-map 0:4" adds a third audio track from the movie
[18:21] <kepstin-laptop> the output file will have 3 audio tracks
[18:22] <seasc> i'm not irritated that it has 3 audio tracks
[18:22] <seasc> i'm irritated that the one played by default is the one from the anime, which is overlayed
[18:22] <kepstin-laptop> the order of the -map commands results in the order of the tracks
[18:22] <seasc> i see
[18:23] Action: seasc is digging in code
[18:23] <kepstin-laptop> so  if you want the audio track 0:4 before 1:2 and 1:3, then do the -map 0:4 before the -map 1
[18:25] <seasc> kepstin-laptop, figured already, as i've changed the code now accordingly... hope its working as expected now :)
[18:26] <seasc> GREAT! kepstin-laptop <3
[18:28] <kepstin-laptop> seasc: you probably want to use -c:a copy rather than doing a lossy transcode from ac3 to ac3; unless you are trying to do a surround to stereo downmix?
[18:28] <seasc> thats up to the script user
[18:29] <seasc> but you're right, copy would have been faster too
[18:35] <Guest97882> hello, please help. compiling ffmpeg on Windows 7 i got an "ERROR: opus not found" ... the file opus.pc is needed... can anyone explain please where to get it?
[18:50] <Guest97882> Package opus was not found in the pkg-config search path. Perhaps you should add the directory containing `opus.pc' to the PKG_CONFIG_PATH environment variable
[18:50] <Guest97882> I have the opus libs but there are no opus.pc files
[19:36] <bofh> Hello! Where can I read more about conditional expressions in filters, like 'if(between(...))'?
[19:40] <VJ> Hi... I can get the container name from AVInputFormat->iformat->name which is a const char*. But is there an enumeration similar the ones we have for AVCodecID? Something like AVContainerID maybe? Disclaimer: This is my first post to this IRC and a new ffmpeg user.
[19:44] <PovAddict> http://upload.wikimedia.org/wikipedia/commons/thumb/9/9a/TV-PG_icon.svg/300px-TV-PG_icon.svg.png
[19:44] <PovAddict> I want to extract the timestamps at which that thing appears on the video, what do you suggest?
[19:45] <PovAddict> I don't mind writing my own program, I just don't know where to start with finding a pattern in an image
[19:45] <PovAddict> and doing it quickly, since I have to inspect every frame
[19:46] <LexRussia> hey
[19:46] <LexRussia> tell me please, is ffmpeg 2 equivalent to libav 10?
[19:46] <LexRussia> or 11...
[19:46] <PovAddict> both are evolving on their own, so they are diverging in what they can do
[19:46] <LexRussia> shit
[19:48] <PovAddict> for starters, let's say I write a program that can tell if the TV/PG logo is in a frame, how do I run it on every frame of the video easily? afaik decoding via the ffmpeg libraries is quite involved :/
[19:48] <PovAddict> are there python bindings at least?
[20:10] <VJ> Could someone point me in the right direction please? Is there an enumeration for container types similar to the way there is one for the codecs i.e something similar to AVCodecID? I couldn't find one so far after searching a lot...
[20:11] <seasc> VJ, all i know is ffmpeg -decoders // ffmpeg -encoders
[20:12] <seasc> other than that... a container CONTAINs the streams; only a few containers, such as flv or wma, limit which codecs they can carry...
[20:12] <VJ> Thanks seasc, that gives me a list of the codecs. ffmpeg -formats gives me the containers
[20:13] <kepstin-laptop> VJ: well, the container types are "formats"; in the api, you can iterate through the list with av_iformat_next or av_oformat_next
[20:14] <VJ> For example, I am trying to determine programatically if the stream I get is a transport stream container or a mp4 container by using avformat_find_stream_info()
[20:14] <JEEB> they're not integers as far as I know, but strings. so you have to do string comparisons I would say
[20:15] <VJ> Thanks kepstin-laptop and JEEB.
[20:15] <PovAddict> I think I'll use opencv for my task
[20:15] <VJ> JEEB, I think you are right, I see this - AVInputFormat Struct Reference const char *name and const char *long_name
[20:15] <JEEB> you should use the name
[20:15] <VJ> So, I guess, the only way then is to do a string search to see if I have a matching format.
[20:15] <JEEB> the long_name is a generic description
[20:16] <VJ> Hopefully the names don't change much with ffmpeg version changes... so for example mpegts hopefully doesn't become MPEG-TS in the future.
[20:17] <JEEB> nah
[20:17] <VJ> Thanks for your inputs JEEB. I just wanted to make sure I didn't miss some enumeration. Thx guys!
[20:17] <kepstin-laptop> the short name is almost certainly gonna stay the same, yeah
[20:17] <JEEB> yup
[20:18] <JEEB> so many people already depend on them so it's not really simple to change them
[20:18] <VJ> gotcha. good point!
[20:56] <BlackDream> Hello i have one question
[20:56] <BlackDream> In some streams i see that the audio codec has a 48000 Hz sample rate but it's playable in an flv container
[20:56] <BlackDream> in some other streams i get an error saying sample rate 48000 is not supported in the FLV container
[20:57] <BlackDream> how is that possible :/
[21:16] <SH_> Hello?
[22:21] <seasc> I want a movie with 2 video overlays(bottom-left & bottom-right), but i only get one...  http://pastebin.com/nQ6gH1s7
[22:26] <c_14> You can only have 1 filter complex per output
[22:26] <c_14> name the implicit output pad of each filtergraph, combine the filtergraphs with a ; and map both output pads
[22:27] <c_14> or eh, sec
[22:27] <c_14> name an output pad for the first overlay filter, and use that pad as the input for the second overlay filter
[22:27] <c_14> then map the output of the second overlay filter
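c_14's recipe — two overlay filters chained through a named pad — can be sketched as a graph builder (the pad name [tmp] and the bottom-left/bottom-right positions are hypothetical; [0:v] is the background, [1:v] and [2:v] the two overlays):

```python
def two_overlay_graph(main="0:v", left="1:v", right="2:v"):
    # The first overlay writes its result to the named pad [tmp];
    # the second overlay then reads [tmp] as its background, so a
    # single -filter_complex produces one video with both overlays.
    return (f"[{main}][{left}]overlay=x=0:y=H-h[tmp];"
            f"[tmp][{right}]overlay=x=W-w:y=H-h")
```

The resulting string would be passed as a single -filter_complex argument; only the final (unnamed) output pad needs to be mapped to the output file.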
[22:54] <seasc> i'm not getting it... i'd rather encode "the video" 2 times - making this (even just the first non-working suggestion) generated by code gets insane anyway ;) -- done is better than perfect
[22:54] <seasc> c_14, thank you
[23:10] <adam_89> Hi guys! Could you help me please? I'm trying to resize a video with this command: ./ffmpeg -y   -i test/100\ Folk\ Celsius\ -\ Lányok\ szívében\ lakom-BrpwcTnSh6Q.mp4 -filter "scale=w=800:h=600"  frames.avi, I get "Cannot connect video filter to audio input" error. It works fine with "-an" option, but i want the new video to have audio.  Here you
[23:10] <adam_89> can see the whole ffmpeg output: http://pastebin.com/UuRPUc26 Thank you in advance
[23:11] <c_14> use -vf instead of -filter
[23:11] <c_14> Or use -filter:v
[23:14] <adam_89> thank you :)
[00:00] --- Fri Oct 24 2014

