[Ffmpeg-devel-irc] ffmpeg.log.20141211

burek burek021 at gmail.com
Fri Dec 12 02:05:01 CET 2014


[01:30] <t4nk843> hello! does anybody know how to control jpeg quality of the frames extracted from video by means of C++?
[01:31] <t4nk843> the quality parameter of AVFrame doesn't work for me
[01:46] <roadfish> trying to get video reformatted for those old BlackBerry torches.
[01:46] <roadfish> A webpage gave an ffmpeg command line that included "-aspect 4:3", which was supposed to fit the screen.
[01:47] <roadfish> But then gave a second line with "-aspect 16:9" and said it would "letter box" the video, with 16x9 dimensions.
[01:48] <roadfish> So I try the video on my Linux box and, yes, it _is_ 16x9.
[01:48] <roadfish> And the first video shows a distorted 4x3.
[01:48] <roadfish> But then I put both videos onto the BlackBerry and _both_ are distorted.
[01:48] <roadfish> Don't seem to be getting this promised letterboxing.
[01:49] <roadfish> Also, on the Linux box, I didn't actually _see_ any letterboxing on the video. That is, there are no black bands above and below the video itself. Rather, the video is just in a 16x9 format.
[01:50] <c_14> Just don't add -aspect ?
[01:51] <roadfish> hmm? Are you asking if I added "-aspect". Yes, I did.
[01:51] <c_14> I'm telling you to try not adding it.
[01:51] <roadfish> Or are you asking if I did something more on the command line? I guess so.
[01:51] <roadfish> Ok. Will try now.
[01:54] <roadfish> no go.
[01:54] <roadfish> c_14: no go.
[01:54] <roadfish> you are correct in that dropping "-aspect".
[01:55] <roadfish> ok, will do now
[01:55] <t4nk843> ;-/
[01:55] <roadfish> anyway, you are correct in that dropping "-aspect" did generate 16x9 on the Linux box. But I still got the distorted 4x3 on the BlackBerry.
[01:56] <roadfish> I was thinking there might be something in ffmpeg to _explicitly_ add _hard_ black-bands above-and-below the image.
[01:56] <roadfish> ...
[01:56] <roadfish> Anyway, I will now paste up the ffmpeg command to show what happened.
[01:56] <c_14> oh
[01:56] <c_14> pad
[01:56] <roadfish> by the way, this is a YouTube video.
[01:57] <c_14> If you want hard black borders, just pad.
[01:57] <roadfish> ok, so I want to do something called "pad". Yeah, I'm thinking this will make BlackBerry obey.
[01:57] <roadfish> awesome. will now look into this command flag
[01:57] <c_14> https://ffmpeg.org/ffmpeg-filters.html#pad
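A minimal sketch of the pad approach c_14 points to (the 480x360 target and filenames are illustrative, not from the discussion):

    ffmpeg -i input.mp4 -vf "scale=480:-2,pad=480:360:(ow-iw)/2:(oh-ih)/2" -c:a copy output.mp4

scale fits the 16x9 picture to the target width and pad centres it on a 4:3 canvas, so the black bars are burned into the frame and the player cannot stretch them away.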
[02:00] <david__> I'd like to overlay a video over an image, and get the image to change following a 'schedule' (for example, use 1.png from 0:00 to 2:39, 2.png from 2:39 to 4:05 ...). Is it possible?
[02:04] <c_14> Make each of the images into a video of the length you want, concat them and use that as the bottom of the overlay.
[02:04] <c_14> Only way I can think of.
[02:38] <david__> hmm. and if I wanted to do that live? (slides and a camera feed on top of it)
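One hedged way to realize c_14's suggestion is the concat demuxer (file names and durations are illustrative; the last image is repeated because of a known concat-demuxer quirk with trailing durations):

    # slides.txt
    file '1.png'
    duration 159
    file '2.png'
    duration 86
    file '2.png'

    ffmpeg -f concat -i slides.txt -i camera.mp4 -filter_complex "[1:v]scale=320:-2[cam];[0:v][cam]overlay=W-w-10:H-h-10:shortest=1" out.mp4

For a live feed the same overlay graph applies, but the bottom input would have to come from a pre-rendered slide video or a capture device rather than the concat list.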
[03:58] <AlexLikerock> hi
[03:59] <AlexLikerock> how to access my stream
[03:59] <AlexLikerock> ?
[07:02] <blackyboy> I'm using HandBrake. Only the mkv format is showing in HandBrake; is it possible to get all formats in the list by installing ffmpeg?
[07:26] <k_sze[work]> Does ffmpeg support nvenc yet?
[07:59] <cbsrobot_> k_sze[work]: it seems - since tonight http://lists.ffmpeg.org/pipermail/ffmpeg-cvslog/2014-December/084146.html
[08:24] <k_sze[work]> cbsrobot_: wow
[08:24] <k_sze[work]> I'm lucky, heh.
[08:24] <k_sze[work]> I wonder if it works on Windows as well.
[08:25] <cbsrobot_> k_sze[work]: read the few commits following this one
[08:26] <cbsrobot_> I think there was an issue on cygwin
[08:27] <k_sze[work]> Well, I won't be using cygwin. :D
[08:31] <k_sze[work]> Although I doubt zeranoe's semi-official builds have nvenc included.
[08:32] <k_sze[work]> including nvenc would not be GPL, I suppose?
[08:37] <cbsrobot_> the license is not gpl, so for now you have to build it with --enable-nonfree
[09:23] <k_sze[work]> It's a little strange how the NVENC SDK requires Quadro K4000 and up.
[09:23] <k_sze[work]> I thought NVENC is supported even on K2000.
[09:27] <k_sze[work]> And the ultrafast preset for libx264 doesn't seem to be actually faster.
[09:29] <fffan> hi, anybody?
[09:30] <pzich> your best bet is to explain the problem, and pastebin the details
[09:33] <k_sze[work]> ok, so ultrafast is faster, but it's still not fast enough somehow
[09:33] <k_sze[work]> It's only doing 10 fps with 1080p input on a Core i5.
[09:34] <k_sze[work]> I need it to be over 30 fps. :/
[09:34] <pzich> sounds like you need at least two more machines
[09:34] <k_sze[work]> pzich: that won't work.
[09:34] <k_sze[work]> eventually the input will come in real time from a camera.
[09:35] <pierre_> is there any way to decode an audio stream to the s16 sample format in ffmpeg since 2.4.x? setting AVCodecContext.sample_fmt doesn't help; why?
[09:36] <fffan> hi, pzich, I have a problem
[09:38] <fffan> when I av_read_frame and avcodec_decode_video2 the frames, I find that the audio timestamp is always smaller than the video timestamp; it is non-monotonic
[09:38] <fffan> how can I get monotonic timestamps?
[10:39] <Elirips> Hello. I'm using a command like this 'ffmpeg -i rtsp://root:axis@192.168.0.114:554/axis-media/media.amp -c:v h264 -strict -2 -f hls -s 352x288 -segment_time 10 -hls_time 10 -hls_list_size 2 -hls_wrap 4 -start_number 1 C:\xampp\htdocs\video\stream.m3u8' to create an http-live stream from an rtsp stream. This works fine, it generates segments with a 10-second duration.
[10:39] <Elirips> But if I now change the input source to one that delivers only 5 frames per second, the resulting segments have a duration of 50 seconds - it seems the hls_time / segment_time is ignored. Why? What am I doing wrong? Thanks for any hints
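The hls muxer can only cut a segment at a keyframe, so with a low-fps source the cuts can land far apart; 50 s at 5 fps is exactly 250 frames, which matches a common default keyframe interval. A hedged sketch of forcing keyframes every 10 s so hls_time can be honoured (values illustrative):

    ffmpeg -i rtsp://<camera-url> -c:v libx264 -g 50 -keyint_min 50 -sc_threshold 0 -f hls -hls_time 10 -hls_list_size 2 -hls_wrap 4 stream.m3u8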
[12:04] <pierre_> hi guys, i got a weird video https://www.sendspace.com/file/9tiqg3, can you tell me what is wrong with the aac decoder, i am using ffmpeg-2.4.3 and mplayer on linux, the aac decoder seems to have a problem decoding frames, but i don't know why
[12:12] <devffm> i want to convert rtsp to motion jpeg. how do i configure avserver?
[12:14] <devffm> how do i configure ffserver?
[12:19] <devffm> hi armada
[12:20] <devffm> do u know about how to configure ffserver?
[12:20] <azk> https://www.ffmpeg.org/ffserver.html#Stream-examples
[12:21] <devffm> thank azk
[12:22] <azk> np
[12:25] <devffm> azk: the following ffserver configuration doesn't work for me
[12:26] <devffm> http://pastebin.com/3s3T3Yxq
[12:26] <devffm> pls check above url
[12:27] <azk> What does your ffmpeg line look like?
[12:29] <devffm> avconv -i iptest.mp4 http://localhost:8090/feed1.ffm
[12:33] <azk> logging the output to a pastebin wouldn't be a bad idea, I'm a bit busy at work but I'll try to take a look at it later.
[12:33] <devffm> ok
[12:38] <MindSpark> hi, I have a few mp4's with aac audio. How do I extract audio in aac format? Why does ffmpeg need to go through reencoding?
[12:56] <jarainf> MindSpark, you don't need to; you can use -c:a copy
[12:57] <jarainf> If I recall correctly you should even be able to simply rename it to m4a
[12:58] <jarainf> You'd have a (probably) huge overhead because of the video but it should play back just fine, hrhr.
[13:06] <MindSpark> jarainf, -acodec copy worked just fine :) thanks!
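For reference, the stream-copy extraction jarainf describes looks roughly like this (filenames hypothetical):

    ffmpeg -i input.mp4 -vn -acodec copy output.m4a

-vn drops the video and -acodec copy (equivalently -c:a copy) remuxes the AAC stream without re-encoding.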
[13:12] <DrSlony> Hi, I just installed ffmpeg 2.5, how do I de-shake/stabilize video?
[13:19] <DrSlony> how does one use ffmpeg's vid.stab?
[13:20] <DrSlony> hey, there's this thing called Google and it answered my question http://public.hronopik.de/vid.stab/features.php?lang=en
[13:45] <DrSlony> hmm
[13:45] <DrSlony> ffplay foo.mp4 -vf deshake
[13:45] <DrSlony> works
[13:45] <DrSlony> and
[13:46] <DrSlony> ffplay foo.mp4 -vf crop=400:300
[13:46] <DrSlony> works too
[13:46] <DrSlony> but how do i combine the two?
[13:46] <DrSlony> ffplay foo.mp4 -vf deshake;crop=400:300
[13:46] <DrSlony> that does not work
[13:47] <DrSlony> ah, comma
[13:48] <tkraaienest> Stream #0:0[0x1e0]: Video: mpeg2video (Main), yuv420p(tv), 720x576 [SAR 64:45 DAR 16:9], max. 7500 kb/s, 25 fps, 25 tbr, 90k tbn, 50 tbc
[13:48] <tkraaienest> -vf "crop=708:576:6:0,scale=iw*sar:ih,crop=min(iw\,ih*(16/9)):ow/(16/9),scale=-1:404,setsar=sar=1/1"
[13:48] <tkraaienest> is there a way to combine the two scales?
[13:48] <tkraaienest> I believe 2 scales would lower quality?
[13:55] <EuaD> i encoded a video using ffmpeg and h265 because i need very high compression but great quality for my website. i can't play it. how can i play back the file to check its quality?
[13:55] <BtbN> Webbrowsers won't be able to play h265 for quite a while
[13:56] <BtbN> People still have trouble because their CPUs are too slow for h264...
[13:58] <EuaD> BtbN, im making my own webpage using video.js or the like
[13:58] <BtbN> So?
[13:58] <EuaD> so im asking how to play back my h265 to check it
[13:58] <BtbN> you don't
[13:59] <EuaD> ffmpeg can't play back h265?  even though it was used to encode it?
[13:59] <BtbN> Even if video.js implements an h265 decoder in JavaScript, you'd probably need 8 cores to decode it.
[14:01] <EuaD> hmm, ok. i was under the impression that html5 could decode x265?
[14:01] <EuaD> is it the part10 that is tough to decode on a website?
[14:02] <EuaD> BtbN, so you're saying that an 8 core cpu is required to play back this encoded video? http://gyazo.com/39ba6f1ee8dafbc92ea8e7c06ca85a59
[14:03] <BtbN> Well, with an optimized C decoder, it still fully uses a quad core to decode in realtime
[14:03] <EuaD> my goal is to get a smaller file size for my videos
[14:04] <BtbN> If you'd implement it in JavaScript, it would be a lot slower. Probably even way more than 8 cores
[14:04] <BtbN> Browsers still struggle with h264 and vp9. And you expect h265?
[14:04] <EuaD> is that for 10bit or 8bit?
[14:05] <EuaD> i had no idea that browsers struggle with h264
[14:05] <EuaD> youtube's doing it everyday
[14:05] <BtbN> no, they do WebM
[14:05] <BtbN> and only 30 fps, because 60 fps used to be too much until recently
[14:05] <EuaD> ah ok, so that's what i should look into then
[14:06] <EuaD> it's for a 720p video podcast that i want to put up
[14:06] <BtbN> WebM with vp8 is not as optimal as h264
[14:06] <EuaD> 720p at 30
[14:06] <BtbN> You basically have to do both if you want to support all browsers
[14:07] <EuaD> both being webm and flash?
[14:07] <BtbN> flash isn't a video codec.
[14:07] <EuaD> or webm and h264
[14:08] <EuaD> ok, i misspoke.
[14:08] <EuaD> as i said my goal is to get the smallest possible video i can for my website
[14:08] <DrSlony> and for 0 people to view it
[14:08] <EuaD> youtube is doing it now
[14:09] <EuaD> im having a hard time following what you guys are telling me
[14:09] <DrSlony> then use youtube
[14:09] <EuaD> DrSlony, i can't because of the way they hide the raw video file url, they require OAuth2 or some shit
[14:10] <EuaD> i would need some custom coded wordpress plugin for my website
[14:10] <DrSlony> probably because which video file you get depends on which device is used to view it
[14:11] <EuaD> that makes sense
[14:11] <DrSlony> webm or vp8 or h264 or whatever
[14:12] <EuaD> so i guess i just need to find some coder that will make a wordpress plugin for my youtube videos then. this is all new to me, my goal is to host my video podcast on my website and the wordpress plugins i've tried aren't working with just the regular youtube url
[14:13] <EuaD> ok, thanks for your help guys.  but can you please tell me how i can play back my x265 part10 encoded video just for shits and giggles
[14:13] <DrSlony> why dont you just embed your video?
[14:13] <DrSlony> wild guess, try ffplay
[14:13] <EuaD> DrSlony, it's the way the podcast plugins work for auto creation of the rss feed.
[14:14] <BtbN> I'd be surprised if there are not at least 10 plugins for wordpress to embed youtube videos.
[14:14] <DrSlony> same here
[14:15] <DrSlony> maybe you used the smallest plugin written in html6? ;]
[14:15] <hans_s> EuaD: do you want your podcasts to be downloadable? I can't think of any other reason why youtube shouldn't work
[14:16] <EuaD> here's the wordpress podcast plugin i am using now which doesn't work with the youtube url. http://gyazo.com/23f5dc676b39815aa940fda9462c7d7d
[14:16] <EuaD> hans_s, i don't mind if someone downloads my video podcast if that's what you're asking
[14:17] <EuaD> i believe the auto creation of the rss feed requires the "Audio File" to be a hard link to the actual video file so it can read the mimetype, i think....
[14:17] <DrSlony> you're trying to upload a youtube url as an audio file...
[14:17] <EuaD> DrSlony, LOL, that's where my dilemma is......
[14:18] <DrSlony> oh dear
[14:18] <EuaD> DrSlony, hence why i wanted to host my own x265 video file
[14:18] <hans_s> do you want to create a podcast feed that's explicitly compatible with podcast apps and itunes?
[14:18] <EuaD> hans_s, yes
[14:18] <EuaD> that's what my current podcast plugin does
[14:19] <EuaD> i realize all this could be solved with custom code but i am no coder
[14:19] <DrSlony> ok, good luck, im not taking further part in this :)
[14:19] <hans_s> which means that you will have to host your videos yourself, as I doubt that podcast apps will accept youtube urls
[14:20] <EuaD> DrSlony, thanks for the help. i used ffplay and it worked to play back my x265 video. what's flipping awesome is it's only 256 MB in size, whereas its equivalent x264 video is 700 MB
[14:20] <EuaD> hans_s, hence why i asked my original question
[14:20] <DrSlony> http://vdo.me/youtube-rss-feed-for-you-channel/
[14:20] <EuaD> full circle   :)
[14:21] <hans_s> yes, but no browser supports h265 playback; the only format that every modern browser accepts is h264
[14:22] <DrSlony> i very highly doubt the 256MB h265 version is equivalent in quality to the 700MB h264 version
[14:22] <DrSlony> and dont say "they sound identical"
[14:22] <hans_s> so, encode your videos with h264/aac in an mp4 container
[14:22] <EuaD> DrSlony, hmm, i just tried http://www.youtube.com/rss/user/linuxtechandgaming/videos.rss  in gpodder and it doesn't work. thanks for trying though
[14:23] <hans_s> if you don't want to host the video files yourself, upload them to youtube and tell your viewers to use whatever youtube app they want to view your videos
[14:24] <EuaD> DrSlony, it sure looks the same to me. http://gyazo.com/7883ec2e6510068757456f7fa437bb13
[14:24] <DrSlony> EuaD there is no user youtube account called linuxtechandgaming
[14:25] <DrSlony> and if there was i'd send a complaint because technologically clueless people shouldn't be recording podcasts about linux tech and spreading nonsense
[14:26] <EuaD> DrSlony, AHHH, ok. my youtube channel is linuxtechandgaming
[14:26] <DrSlony> but yeah, i was supposed to not take further part precisely to keep myself from expressing my infallible opinion :]
[14:26] <EuaD> DrSlony, but my channel name is https://www.youtube.com/channel/UCe3BDMwiiSrfx66PbqXH6IQ
[14:26] <EuaD> DrSlony, i appreciate your help. :)
[14:26] <EuaD> DrSlony, that worked for my video rss feed. THANKS
[14:28] <EuaD> DrSlony, "technologically clieless",  ouch.......
[14:28] <EuaD> think that was a little much. i am not a coder, so i don't understand what the podcast plugin was doing and/or what was required.
[14:29] <EuaD> i didn't realize that podcast rss feeds required a raw link to the real file
[14:29] <EuaD> so i thank everyone for their help. you've solved an issue for me. i can use the youtube video rss url on my website. thanks again
[14:37] <tkraaienest> believe I got it
[14:37] <tkraaienest> -vf "crop=708:576:6:0,scale=max(iw*sar\,oh*((iw*sar)/ih)):404,crop=min(iw\,ih*(16/9)):ow/(16/9),setsar=sar=1/1" -sws_flags lanczos
[14:37] <tkraaienest> this looks right?
[14:38] <tkraaienest> scratch that
[14:38] <tkraaienest> completely stretched
[14:39] <ubitux> you can use force_original_aspect_ratio with scale filter
[14:39] <tkraaienest> even after crop?
[14:39] <ubitux> and flags option
[14:39] <ubitux> force_original_aspect_ratio is used to define a "box"
[14:40] <ubitux> like, scale=320:240:force_original_aspect_ratio=decrease will fit the image inside this size
[14:40] <ubitux> and add your scaling flags in it
[14:40] <ubitux> like, scale=320:240:force_original_aspect_ratio=decrease:flags=+lanczos
[14:41] <DrSlony> ubitux im testing various deshaking parameters but using ffplay on 1080p is impossible. is there a switch to get it to process and buffer eg. 60 frames before playing them?
[14:41] <DrSlony> (using ffplay with real-time deshaking on 1080p, that is)
[14:41] <tkraaienest> does it consider DAR and SAR?
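Putting ubitux's suggestion together with pad, a sketch of fitting into a fixed box and filling the rest with black (box size and filenames illustrative):

    ffmpeg -i in.mp4 -vf "scale=320:240:force_original_aspect_ratio=decrease:flags=lanczos,pad=320:240:(ow-iw)/2:(oh-ih)/2" out.mp4

scale keeps the aspect ratio while shrinking into the 320x240 box, and pad centres the result on the full-size canvas.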
[14:50] <EuaD> DrSlony, it appears like you're technically clueless because you don't know how to process and buffer
[14:50] <EuaD> ;)
[14:53] <DrSlony> touche
[15:03] <Kriss_> hello! Does anybody know how to extract high quality jpeg from video by means of ffmpeg with c++? all the examples on the web give me bad quality :(
[15:20] <Kriss_> anybody? :)
[15:46] <klaxa|work> Kriss_: what do you have so far?
[15:48] <Kriss_> klaxa|work: for now I get bad quality in the extracted frame while it is fine in the video ... even the bmp I receive is different than in the video ... what could be the reason?
[15:48] <Kriss_> klaxa|work: settings like "quality" don't work at all
[15:50] <wyatt8740> would bitrate matter...?
[15:51] <wyatt8740> or is there some setting like CRF?
[15:56] <Kriss_> well, I don't set bitrate by myself ... should I? even in image?
[15:59] <elliotd123> getting the following error, anyone know what it means?: Application provided invalid, non monotonically increasing dts to muxer in stream 0: 72758271 >= 72758271
[16:30] <Kriss_> could it be that pixelformat affects the quality?
[16:46] <Kriss_> is this the ffmpeg channel?
[16:48] <iive> and yes, you are recommended to set at least the quantizer, the defaults are not always sane.
[16:49] <iive> (usually big quantizer means bad quality).
[16:50] <iive> mjpeg is a series of jpegs, so bitrate could be applied to it.
[16:55] <Kriss_> actually I don't use the command line, I'm trying to get frames as images programmatically by means of c++ and the ffmpeg libs
[16:56] <Kriss_> can I show code I use?
[16:58] <klaxa|work> yes please
[16:58] <klaxa|work> that would help i think
[16:58] <orough> Hi, does anyone have a one-liner for extracting high quality keyframes? My video is 25fps and 60 seconds long. This is what I have so far http://pastebin.com/jQ7YYPY5
[16:59] <orough> The line I pasted works, but is low quality and only produces 49 frames for a 1:36 min. video. I need a lot more.
[17:02] <tkraaienest> what's the best way to deinterlace and keep the framerate? -vf "idet,yadif=0:-1:1" seems kinda choppy. I did try -vf "idet,yadif=1:-1:1,fps=fps=25" and the result looks the same.
[17:02] <Kriss_> here is an example of jpeg saving I use
[17:02] <Kriss_> https://trac.ffmpeg.org/ticket/3090?cversion=2&cnum_hist=2
[17:04] <Kriss_> if you can, please help me affect the quality in the example, or please explain why I can't do this
[17:08] <iive> Kriss_: try to set quantizer to value of 2 (e.g. vqscale=2 )
[17:10] <Kriss_> iive: excuse me, but where exactly should I place it in the code?
[17:10] <iive> tkraaienest: can you figure out why it looks choppy?
[17:10] <Kriss_> iive: it is a parameter of what?
[17:11] <tkraaienest> it drops too many frames?
[17:11] <iive> Kriss_: to be honest i'm not quite sure what is the proper way to set that. using dictionary options or avctx directly.
[17:11] <iive> tkraaienest: yadif=send_frames shouldn't drop any frame...
[17:12] <tkraaienest> idet show this:
[17:12] <tkraaienest> [Parsed_idet_1 @ 0x6a47ec14360] Repeated Fields: Neither: 86087 Top:   306 Bottom:   391
[17:12] <tkraaienest> [Parsed_idet_1 @ 0x6a47ec14360] Single frame detection: TFF: 73578 BFF:    37 Progressive:  5185 Undetermined:  7984
[17:12] <tkraaienest> [Parsed_idet_1 @ 0x6a47ec14360] Multi frame detection: TFF: 82069 BFF:    38 Progressive:  4628 Undetermined:    49
[17:13] <troulouliou_dev> hi, how does seeking work? does it seek in the container to just before the first frame of any kind at that time?
[17:13] <Kriss_> iive: I hope we are not talking about the command line, right? )
[17:20] <iive> Kriss_: try to set avctx->qscale = 2; before opening the encoder. open2 also takes settings as a 3rd parameter, but at least the example doesn't use it)
[17:23] <iive> or take a look at ffmpeg-source/doc/examples/decoding_encoding.c
[17:23] <Filarius> Hi, I'm trying to send bitmaps to ffmpeg with a TByteStream and an anonymous pipe. I made an app that is good at sending video files this way (file > stream > pipe > ffmpeg) and I can't find the right way to replace the file with a number of generated TBitmaps
[17:24] <iive> the function video_encode_example seems to set avctx->bitrate before open2()
[17:30] <Kriss_> iive: hm ... is this decoding_encoding.c available on the web? in the ones I found there is no such string as "avctx->bitrate"
[17:32] <iive> avctx is short for avcontext...
[17:33] <iive> git http should show you the file
[17:34] <Kriss_> iive: I use Windows :)
[17:34] <shevy> you poor man
[17:35] <Kriss_> oh come on ...
[17:35] <shevy> :)
[17:44] <tkraaienest> well, I made it -vf "idet,yadif=1:-1:1,fps=fps=30000/1001", and it looks good. Not exactly what I wanted, but it'll have to do even though it's a very slow encode.
[17:44] <wyatt8740> what does 'yadif=2' do?
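For reference, yadif's first argument selects the mode; a sketch of the difference discussed above (filenames hypothetical):

    ffmpeg -i interlaced.ts -vf yadif=0 -c:a copy same_rate.mp4    # one output frame per input frame, frame rate preserved
    ffmpeg -i interlaced.ts -vf yadif=1 -c:a copy double_rate.mp4  # one output frame per field, frame rate doubled

Modes 2 and 3 are the same pair but skip the spatial interlacing check.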
[17:49] <iive> Kriss_: web git should work with internet explorer too :P
[17:50] <iive> just browse ffmpeg website or find ffmpeg mirror on github
[17:51] <wyatt8740> kriss_: or install cygwin if you want to do it the way of *nix users
[17:51] <Kriss_> iive: I found it thank you)
[17:52] <Kriss_> iive: anyway let me ask you did you mean AVCodecContext? or exactly AVContext?
[17:57] <Filarius> Hello, I'm looking for help with bitmaps and input via pipe http://pastebin.com/aCtht4gD
[17:57] <Filarius> Can't find what's wrong here
[17:58] <iive> the one returned by avcodec_alloc_context<n>
[18:01] <Kriss_> iive: it has a parameter brd_scale ... is there any range for its values?
[18:01] <iive> this is probably bitrate distortion scale
[18:03] <Kriss_> strange, but setting any parameter to any value doesn't affect anything)) it's really funny
[18:03] <Kriss_> I'm doing something wrong
[18:07] <iive> my info seems to be quite outdated.
[18:08] <Kriss_> commenting out all the manual settings of AVCodecContext gives the same result ... what are they for in this case? strange
[18:09] <iive> qscale has moved into RcOverride. you allocate an array and point c->rc_override at it (and set rc_override_count)
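For the programmatic case, here is a minimal sketch of forcing a fixed low quantizer on the MJPEG encoder, assuming current libavcodec identifiers (the 2014-era flag was spelled CODEC_FLAG_QSCALE); on the command line the equivalent is -q:v 2:

    #include <libavcodec/avcodec.h>
    #include <libavutil/avutil.h>

    /* Open an MJPEG encoder with a constant quantizer of 2 (1 = best, 31 = worst).
     * Width, height and time base are placeholders. */
    static AVCodecContext *open_mjpeg_encoder(int width, int height)
    {
        const AVCodec *codec = avcodec_find_encoder(AV_CODEC_ID_MJPEG);
        AVCodecContext *ctx  = avcodec_alloc_context3(codec);

        ctx->width     = width;
        ctx->height    = height;
        ctx->time_base = (AVRational){1, 25};
        ctx->pix_fmt   = AV_PIX_FMT_YUVJ420P;       /* full-range YUV expected by JPEG */

        ctx->flags         |= AV_CODEC_FLAG_QSCALE;  /* use a fixed quantizer, not a bitrate target */
        ctx->global_quality = FF_QP2LAMBDA * 2;      /* qscale 2, expressed in lambda units */

        if (avcodec_open2(ctx, codec, NULL) < 0) {
            avcodec_free_context(&ctx);
            return NULL;
        }
        return ctx;
    }
    /* Also set frame->quality = FF_QP2LAMBDA * 2 on each AVFrame before encoding. */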
[18:10] <Al3x4nd3r> Hi, anyone have news about playlist loop with ffmpeg?
[19:14] <DrSlony> Can I ask for transcode help here? I want to try out vid.stab, I have transcode-1.1.7 and ffmpeg-2.2.10 installed. When I run "transcode -i 10s.mp4 -J stabilize -y null,null -o dummy" I get a failed detect in infinite frames, as if it wasn't reading the input file at all. When I try "transcode -i 10s.mp4 -x ffmpeg,null -J stabilize -y null,null -o dummy" I get "Format 0x0 not supported".
[19:14] <DrSlony> http://dpaste.com/0SBFMCQ
[19:15] <DrSlony> I tried ffmpeg-2.5 and got the same error, but downgraded back to 2.2.10 as anything higher breaks other packages on gentoo.
[19:18] <DrSlony> 10s.mp4 here http://filebin.net/q2d8pjq8gk
[19:21] <DrSlony> looks like transcode has been dead for 3 years
[19:23] <pzich> I don't know anything about transcode, but from your log it seems like it's failing to read your input
[19:23] <DrSlony> yes, though im using the same command i used years ago when it did work
[19:23] <pzich> on an MP4?
[19:23] <DrSlony> yes
[19:24] <DrSlony> though then i also used a different command which used mplayer instead of ffmpeg, and now i dont use mplayer so i only tried the ffmpeg version
[19:27] <pzich> these lines from your log suggest the failure had to do with opening the file (prior to ffmpeg's involvement): http://dpaste.com/2Y9EQ2Z
[19:27] <DrSlony> i can try converting it to some other format, what do you suggest?
[19:28] <pzich> I've never used it before, but have you tried the stabilization in ffmpeg itself? http://adaptivesamples.com/2014/05/30/camera-stabilisation-with-ffmpeg/
[19:28] <pzich> and the details for the filter: http://ffmpeg.org/ffmpeg-filters.html#deshake
[19:28] <Kriss_> hell, why was it made so hard to understand? ))
[19:29] <Kriss_> RcOverride is described but used nowhere))) nowhere but the example
[19:29] <DrSlony> pzich yes i have, its not good
[19:29] <pzich> ah, shame
[19:30] <DrSlony> comment 1 seems worth investigating
[19:30] <pzich> http://ffmpeg.org/ffmpeg-filters.html#vidstabdetect-1
[19:31] <pzich> looks like you may need to build with those enabled, or find one compiled
[20:00] <DrSlony> pzich i cloned ffmpeg from git and get "ERROR: vidstab not found using pkg-config". Full error: http://dpaste.com/04BAAR5 http://filebin.net/ok41zoio9b/config.log
[20:19] <DrSlony> anyone?
[20:21] <Kriss_> full error description is at the end of config.log
[20:22] <DrSlony> Kriss_ there is no vidstab.pc in the cloned folder
[20:26] <Kriss_> seems like this is additional software and it should either be compiled separately or be included already compiled; in your case something is wrong and questions should be addressed to whoever distributes it
[20:29] <DrSlony> http://ffmpeg.org/ffmpeg-filters.html#vidstabdetect-1 made it sound like it should come with ffmpeg. I cloned https://github.com/georgmartius/vid.stab and now compiled vidstab.pc
[20:31] <DrSlony> looks like that worked
[20:39] <DrSlony> compilation succeeded
[20:42] <DrSlony> but when i run ffmpeg it can't find the library
[20:42] <DrSlony> $HOME/programs/ffmpeg/ffmpeg
[20:42] <DrSlony> /home/drslony/programs/ffmpeg/ffmpeg: error while loading shared libraries: libvidstab.so.1.0: cannot open shared object file: No such file or directory
[20:42] <DrSlony> is there some environment variable i need to set?
[20:43] <JEEB> did you install in a local prefix or in a global search path?
[20:43] <DrSlony> local
[20:43] <JEEB> ok, then yes
[20:44] <JEEB> and I don't mean /usr/local btw, which is generally contained in the global search path
[20:44] <DrSlony> i try to keep everything hand-compiled in $HOME/programs/
[20:44] <JEEB> ok
[20:44] <JEEB> then you have to use LD_LIBRARY_PATH when running the application
[20:44] <JEEB> LD_LIBRARY_PATH=/herp/derp/lib/ ffmpeg
[20:44] <DrSlony> yay! works
[20:45] <JEEB> shared binaries are kind of meh like that on linux, which is why you generally end up building static libraries when doing user-specific binaries
[20:45] <DrSlony> speaking of which, how do i build static libs?
[20:45] <JEEB> depends on the project
[20:46] <JEEB> vidstab seems to have built itself shared
[20:46] <JEEB> see its configure's --help
[20:46] <DrSlony> this one for example, is there some ./configure command?
[20:46] <DrSlony> ok
[20:46] <JEEB> usually there's stuff like --disable-shared --enable-static
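A sketch of the local-prefix workflow discussed here (paths illustrative; note that vid.stab builds with CMake rather than autotools, so its static switch is a CMake option):

    # build and install the library into the local prefix
    cmake . -DCMAKE_INSTALL_PREFIX=$HOME/programs -DBUILD_SHARED_LIBS=OFF && make && make install
    # let ffmpeg's configure find vidstab.pc in that prefix
    PKG_CONFIG_PATH=$HOME/programs/lib/pkgconfig ./configure --prefix=$HOME/programs --enable-gpl --enable-libvidstab
    # with a shared build instead, run ffmpeg as JEEB showed:
    LD_LIBRARY_PATH=$HOME/programs/lib $HOME/programs/ffmpeg/ffmpeg -i in.mp4 out.mp4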
[20:46] <DrSlony> -preset is deprecated in git?
[20:46] <JEEB> no
[20:46] <JEEB> -preset is libx264 specific
[20:46] <JEEB> so if you don't have libx264 linked in, ffmpeg doesn't have that option
[20:46] <DrSlony> ah
[20:47] <JEEB> or well, I guess these days libx265 has it too
[20:47] <DrSlony> recompiling ffmpeg here we gooo :)
[20:47] <JEEB> but you get the general gist
[20:47] <DrSlony> yes, thank you
[20:56] <DrSlony> in case anyone wants to try, here's my progress so far:
[20:56] <DrSlony> http://dpaste.com/3KV2RP8
[21:06] <DrSlony> pzich Kriss_ JEEB vidstab works great!
[21:07] <Kriss_> cool :)
[21:08] <pzich> woohoo!
[21:08] <Kriss_> so, experts of ffmpeg ... does anyone know how to use RcOverride?
[21:08] <Kriss_> or point me to someone who knows))
[21:17] <DrSlony> complete instructions:
[21:17] <DrSlony> How to compile ffmpeg with libvidstab support (vid.stab) and deshake video: http://dpaste.com/26PQQ0F
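For reference, the usual two-pass vid.stab invocation (as documented for the vidstabdetect/vidstabtransform filters) looks roughly like this, with illustrative filenames and option values:

    ffmpeg -i shaky.mp4 -vf vidstabdetect=shakiness=5:result=transforms.trf -f null -
    ffmpeg -i shaky.mp4 -vf vidstabtransform=input=transforms.trf:smoothing=30,unsharp=5:5:0.8:3:3:0.4 stabilized.mp4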
[21:18] <pzich> nice
[21:22] <Filarius> I'm trying to send generated bitmaps to ffmpeg. http://pastebin.com/aCtht4gD With a copy of a file sent to the input pipe it works. I guess I'm missing something
[21:23] <pzich> I think you also want to specify a pixel format for the output video, probably yuv420p
[21:24] <Filarius> I don't really need to, ffmpeg is okay without tuning
[21:25] <Filarius> it use yuv420p
[21:25] <pzich> according to your paste it's using yuv444p
[21:26] <Filarius> oh, right, i read the wrong line
[21:26] <pzich> also, I don't know too much about your program up above, but I don't think you want to close FPipeIn until after you run the ffmpeg command
[21:28] <pzich> to debug, can you just write your bmps to one big file and then `cat imagestream | ffmpeg ...`?
[21:28] <Filarius> my program is fine sending files via FPipeIn (.avi and .ts tests are ok), now I'm looking for a way to send bitmaps
[21:28] <Filarius> i`m on Windows
[21:28] <pzich> oh, yes
[21:29] <pzich> well, however you'd do that on windows
[21:30] <pzich> I'm also seeing `-f image2pipe -vcodec bmp -s 300x300` being suggested instead of your rawvideo/bgr24
[21:30] <pzich> you may not need to specify the size, but I've not used it specifically with bmp before
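A hedged sketch of that image2pipe variant, with the generated BMPs written back-to-back to stdin (rate and output settings illustrative):

    ffmpeg -f image2pipe -vcodec bmp -framerate 25 -i - -c:v libx264 -pix_fmt yuv420p out.mp4

Each BMP carries its own header and dimensions, so unlike the rawvideo/bgr24 approach no -s or -pixel_format is needed on the input.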
[21:32] <Filarius> tested, http://pastebin.com/ELfJ2Rg6
[21:33] <pzich> well you didn't pipe any input
[21:35] <DrSlony> thank you again
[21:37] <Filarius> this one with file to pipe http://pastebin.com/pDQw4Tvt
[21:37] <Filarius> and it okay
[21:37] <wyatt8740> I love pipes
[21:38] <pzich> pipes are great
[21:38] <wyatt8740> I capture video while previewing with ffmpeg and ffplay using pipes
[21:38] <wyatt8740> and tee
[21:38] <wyatt8740> ffmpeg <input stuff> -f matroska - | tee outfile.mkv | ffplay -
[21:39] <wyatt8740> just today copied a star wars VHS like that
[21:40] <pzich> ah, so your input is also real time?
[21:40] <wyatt8740> it's a video capture device
[21:40] <wyatt8740> his?
[21:40] <wyatt8740> looks realtime to me
[21:41] <pzich> most of my encoding is much faster or slower than realtime, so piping to ffplay wouldn't work too well
[21:42] <wyatt8740> I had to use x264 with -preset veryfast
[21:42] <wyatt8740> and then shrink it later
[21:43] <wyatt8740> I guess at the point you're using veryfast rawvideo would be almost as good
[21:43] <wyatt8740> I think ffmpeg slows down to meet ffplay's speed
[21:44] <wyatt8740> or at least shoves stuff into the buffer
[21:44] <wyatt8740> Can't be certain though. I mainly encode x264 and so it's rare that I use a codec that's very much faster than the framerate of the video I'm encoding
[21:45] <wyatt8740> On LAN connections I also use pipes with wget or curl to stream movies off of a server into ffplay or mplayer
[21:54] <Filarius> okay, now I get what looks like only one frame in a 6-second video http://pastebin.com/cNJ2qnPR  buffer size is 27005400
[21:55] <Filarius> *pipe buffer
[21:56] <Filarius> the only change i made was setting the starting position of the TStream to zero
[22:14] <mattlea> Hi all, does anyone know how to get the next frame when writing an audio filter?
[22:14] <mattlea> i.e. when filter_frame is called, I need the next frame also to process.
[22:18] <pzich> I've not done anything with filters, but it may be that you have to instead store the last frame and process previous and current instead of current and next
[22:21] <mattlea> pzich: I was thinking something similar, but I'm not sure how to send two frames to ff_filter_frame (which is what is usually called at the end of filter_frame).  I'll keep looking at other filters to see if one does it differently.
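A rough sketch of the cache-the-previous-frame pattern pzich describes, inside a filter's filter_frame() callback; MyFilterContext and process_pair() are hypothetical, internal libavfilter details are simplified, and EOF handling (flushing the last cached frame) is omitted:

    static int filter_frame(AVFilterLink *inlink, AVFrame *cur)
    {
        AVFilterContext *avctx = inlink->dst;
        MyFilterContext *s     = avctx->priv;           /* hypothetical private context holding s->prev */
        AVFilterLink *outlink  = avctx->outputs[0];
        int ret = 0;

        if (s->prev) {
            process_pair(s, s->prev, cur);              /* hypothetical: uses both consecutive frames */
            ret = ff_filter_frame(outlink, s->prev);    /* pass the older frame downstream */
        }
        s->prev = cur;                                  /* keep the newest frame until its successor arrives */
        return ret;
    }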
[22:24] <Filarius> yay, it works, I just set the input frame rate too... but the picture somehow looks like it's moving to the right... maybe there is additional data saved to the stream, not only raw pixels
[00:00] --- Fri Dec 12 2014


More information about the Ffmpeg-devel-irc mailing list