[Ffmpeg-devel-irc] ffmpeg.log.20131020

burek burek021 at gmail.com
Mon Oct 21 02:05:01 CEST 2013


[01:13] <kriskropd> does ffmpeg have some built-in hotkey to pause a running encode process? if not, is it safe to send 'kill -STOP pid' and 'kill -CONT pid' to pause and resume a running ffmpeg process?
[01:17] <zybil_> hi+
[04:58] <ferrouswheel> Hi all - anyone have any tips on how to automate conversion of old codecs to newer ones? I have some old videos that don't play on my RPi. Ideally I'd scan all my media, identify what files are Div3, and convert to something newer.
[05:17] <sacarasc> If you can, you'd be better off starting with the source material, as Div3 is pretty poor, and you'll only lose quality encoding from it.
[05:19] <sacarasc> But, if you can script a little bash, you could use ffprobe to see what codec it's using.
[05:19] <ferrouswheel> Yeah - it's pretty bad, but there are videos that would be hard to track down again.
[05:19] <ferrouswheel> Thanks :-)
[05:19] <ferrouswheel> I'd started investigating ffprobe
[05:20] <ferrouswheel> Using xargs and grep to find out just how many Div3 I have. Hopefully not many ;-)
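
A minimal C sketch of the same detection step done through libavformat instead of grepping ffprobe output, in case the scan ends up living in a small tool rather than a shell one-liner; the 2013-era API (AVStream.codec rather than codecpar) is assumed, and "Div3" corresponds to AV_CODEC_ID_MSMPEG4V3:

    /* div3check.c -- print the video codec of a file and exit 0 if it is
     * "Div3" (MS MPEG-4 v3), so it can be driven from find/xargs in a script.
     * 2013-era API.  Build (assumption):
     *   gcc div3check.c -o div3check -lavformat -lavcodec -lavutil
     */
    #include <stdio.h>
    #include <libavformat/avformat.h>

    int main(int argc, char **argv)
    {
        AVFormatContext *fmt = NULL;
        unsigned int i;
        int found = 0;

        if (argc < 2) {
            fprintf(stderr, "usage: %s <file>\n", argv[0]);
            return 2;
        }
        av_register_all();                     /* still required with 2013-era libavformat */
        if (avformat_open_input(&fmt, argv[1], NULL, NULL) < 0)
            return 2;
        if (avformat_find_stream_info(fmt, NULL) < 0) {
            avformat_close_input(&fmt);
            return 2;
        }
        for (i = 0; i < fmt->nb_streams; i++) {
            AVCodecContext *c = fmt->streams[i]->codec;   /* ->codec was current in 2013 */
            if (c->codec_type == AVMEDIA_TYPE_VIDEO) {
                printf("%s: %s\n", argv[1], avcodec_get_name(c->codec_id));
                if (c->codec_id == AV_CODEC_ID_MSMPEG4V3)  /* the "DIV3" FourCC */
                    found = 1;
            }
        }
        avformat_close_input(&fmt);
        return found ? 0 : 1;
    }

Files that exit with status 0 would then be the ones handed to ffmpeg for re-encoding.
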
[07:40] <rudolfovich> I'm trying to implement real-time video streaming via udp.
[07:40] <rudolfovich> At the moment, I have implemented a simple communication protocol where each coded video frame is wrapped into a NAL unit and divided into pieces no larger than the MTU.
[07:40] <rudolfovich> This works fine in a LAN, but when I test over the internet, I get a lot of undelivered packets and am forced to drop the whole frame.
[07:40] <rudolfovich> It is obvious that putting an entire frame into one NAL is a bad idea.
[07:40] <rudolfovich> But I do not know which part of the encoder output is best to wrap into a NAL to minimize loss.
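
One common approach is to have the encoder emit several slice NAL units per frame (x264 can cap slice size with --slice-max-size at roughly the MTU) and then send each NAL in its own datagram, so a lost packet costs one slice rather than the whole frame. A minimal sketch of splitting an Annex-B encoded packet at its start codes, assuming Annex-B output such as libx264 produces through libavcodec; the callback type and names are placeholders:

    #include <stddef.h>
    #include <stdint.h>

    /* Call cb() once per NAL unit found in an Annex-B buffer (start codes
     * 00 00 01 or 00 00 00 01), e.g. the data of an AVPacket from libx264. */
    typedef void (*nal_cb)(const uint8_t *nal, size_t len, void *opaque);

    static void split_annexb(const uint8_t *buf, size_t size, nal_cb cb, void *opaque)
    {
        size_t i = 0, nal_start = 0;
        int in_nal = 0;

        while (i + 2 < size) {
            if (buf[i] == 0 && buf[i + 1] == 0 && buf[i + 2] == 1) {
                /* a 4-byte start code begins one zero byte earlier */
                size_t sc_start = (i > 0 && buf[i - 1] == 0) ? i - 1 : i;
                if (in_nal)
                    cb(buf + nal_start, sc_start - nal_start, opaque);  /* previous NAL */
                nal_start = i + 3;     /* payload of the next NAL starts here */
                in_nal = 1;
                i += 3;
            } else {
                i++;
            }
        }
        if (in_nal && size > nal_start)
            cb(buf + nal_start, size - nal_start, opaque);              /* last NAL */
    }

Each NAL handed to the callback can then go in its own UDP datagram (or be fragmented RTP-style if a single slice still exceeds the MTU).
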
[13:38] <Ulfalizer> sorry if this is a bit tangential. i'm thinking of adding a video recording feature to an emulator i'm working on. it would need to take the raw audio and video streams and encode and stream them to a file on disk. easy switching of encoding formats would be a plus. would libavcodec be a good match for that? any possibly simpler alternatives? i have very little experience when it comes to video.
[13:39] <JEEB> libavcodec and libavformat probably are the simplest way to provide multiple alternatives
[13:39] <JEEB> libavcodec for the actual audio/video streams being encoded, and then libavformat to write those streams into a container
[13:39] <Ulfalizer> how about gstreamer? or is it more tailored to streaming over a network and the like and hence overkill?
[13:40] <JEEB> gstreamer really isn't about streaming, it's meant to be a multimedia framework like DirectShow on Windows
[13:41] <JEEB> and it's generally a mess; using the libraries directly usually ends up being a better idea
[13:42] <Ulfalizer> ok, i'll look into libavcodec and libavformat. thanks!
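
A skeleton of the split JEEB describes, assuming the 2013-era API (avcodec_encode_video2, AVStream.codec): libavcodec encodes the frames, libavformat picks a muxer from the file name and writes the packets. The codec choice, resolution, frame rate and the dummy gray frames are placeholders for whatever the emulator renders, and most error handling is omitted:

    /* mux_sketch.c -- libavcodec encodes, libavformat writes the container.
     * Build (assumption): gcc mux_sketch.c -o mux_sketch -lavformat -lavcodec -lavutil
     */
    #include <string.h>
    #include <libavformat/avformat.h>
    #include <libavutil/imgutils.h>
    #include <libavutil/mathematics.h>

    int main(void)
    {
        const char *filename = "out.mkv";       /* placeholder; extension picks the muxer */
        const int W = 320, H = 240, FPS = 60;   /* placeholder emulator output size/rate */
        AVFormatContext *oc = NULL;
        AVStream *st;
        AVCodec *codec;
        AVCodecContext *c;
        AVFrame *frame;
        int i, got_packet;

        av_register_all();
        avformat_alloc_output_context2(&oc, NULL, NULL, filename);

        codec = avcodec_find_encoder(AV_CODEC_ID_MPEG4);  /* arbitrary; swap for any encoder */
        st = avformat_new_stream(oc, codec);
        c  = st->codec;                                   /* pre-codecpar API */
        c->width     = W;
        c->height    = H;
        c->pix_fmt   = AV_PIX_FMT_YUV420P;
        c->time_base = (AVRational){1, FPS};
        c->bit_rate  = 1000000;
        if (oc->oformat->flags & AVFMT_GLOBALHEADER)
            c->flags |= CODEC_FLAG_GLOBAL_HEADER;
        avcodec_open2(c, codec, NULL);

        avio_open(&oc->pb, filename, AVIO_FLAG_WRITE);
        avformat_write_header(oc, NULL);

        frame = avcodec_alloc_frame();                    /* av_frame_alloc() in newer libs */
        frame->format = c->pix_fmt;
        frame->width  = W;
        frame->height = H;
        av_image_alloc(frame->data, frame->linesize, W, H, c->pix_fmt, 32);

        for (i = 0; i < 5 * FPS; i++) {                   /* 5 seconds of dummy frames */
            AVPacket pkt;
            av_init_packet(&pkt);
            pkt.data = NULL;                              /* let the encoder allocate it */
            pkt.size = 0;

            /* the emulator's (converted) picture would be copied into the planes here */
            memset(frame->data[0], i & 0xff, frame->linesize[0] * H);     /* Y */
            memset(frame->data[1], 128,      frame->linesize[1] * H / 2); /* U */
            memset(frame->data[2], 128,      frame->linesize[2] * H / 2); /* V */
            frame->pts = i;                               /* in c->time_base units */

            if (avcodec_encode_video2(c, &pkt, frame, &got_packet) == 0 && got_packet) {
                pkt.stream_index = st->index;
                if (pkt.pts != AV_NOPTS_VALUE)
                    pkt.pts = av_rescale_q(pkt.pts, c->time_base, st->time_base);
                if (pkt.dts != AV_NOPTS_VALUE)
                    pkt.dts = av_rescale_q(pkt.dts, c->time_base, st->time_base);
                av_interleaved_write_frame(oc, &pkt);     /* takes ownership of the data */
            }
        }
        /* a real program would also flush the encoder here (pass frame = NULL) */

        av_write_trailer(oc);
        avcodec_close(c);
        avio_close(oc->pb);
        avformat_free_context(oc);
        return 0;
    }

Switching container or codec is then mostly a matter of changing the file name and the avcodec_find_encoder() call.
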
[13:43] <Ulfalizer> from some quick reading up this might be a politically loaded question, but how do the ffmpeg and libav versions of libavcodec compare? :)
[13:43] <Ulfalizer> which one is more commonly used? pros and cons?
[13:44] <JEEB> well, ffmpeg tends to merge more stuff in, while libav generally merges only things deemed important enough, or when someone pokes them about it
[13:44] <JEEB> generally the usage should be similar enough, except for some derpy cases
[13:45] <Ulfalizer> switching from one to the other should be relatively painless?
[13:46] <JEEB> in general, yes
[13:47] <JEEB> when you start adding support for various versions of things and such it tends to get a bit more painful
[13:47] <Ulfalizer> ok, i'll try the ffmpeg version first
[19:06] <zennist> Hi, does anyone know about problems with decoding AAC audio streams?
[19:06] <zennist> I've got a code snippet that decodes almost all formats correctly using libavcodec, but it fails on AAC streams and instead gives back noise
[19:09] <zennist> But the command line ffmpeg can still convert the files correctly; so I'm thinking it's a problem in my own code, but I don't know what's wrong
[19:09] <zennist> Here's the code http://pastebin.com/VfkU1kVi
[19:11] <JEEB> zennist, could it be that you're expecting 16 bit integer pcm or something?
[19:11] <JEEB> didn't check the code, but that's the most common error
[19:11] <zennist> Yes, I'm playing the audio with libao, so I guess that's 16bit pcm
[19:12] <zennist> JEEB: does the decoded frame not contain data as 16bit pcm?
[19:12] <JEEB> with lossy formats it's generally float
[19:13] <JEEB> only lossless formats will have integer audio
[19:13] <JEEB> you can use libavresample or libswresample for conversion for output
[19:13] <zennist> JEEB: okay... so I have to use a resampler to convert it to PCM, correct?
[19:14] <JEEB> it's still pcm :P
[19:14] <JEEB> it's just 32bit float
[19:14] <JEEB> not 16 bit integer or whatever you're expecting
[19:15] <JEEB> so yes, use one of the resampling libraries to make the output be what you need
[19:15] <zennist> JEEB: cheers! I'll make some changes and see whether it works out
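
Before wiring up the resampler, it can help to confirm what the decoder is actually handing back. A minimal sketch, assuming 'frame' is the AVFrame filled by the decode call; with ffmpeg's AAC decoder this typically reports a float format such as "fltp" rather than "s16":

    #include <stdio.h>
    #include <libavutil/frame.h>      /* AVFrame (via libavcodec/avcodec.h on older trees) */
    #include <libavutil/samplefmt.h>

    /* Print the sample format the decoder produced for this frame. */
    static void print_sample_fmt(const AVFrame *frame)
    {
        enum AVSampleFormat fmt = frame->format;
        printf("decoded sample format: %s (%s)\n",
               av_get_sample_fmt_name(fmt),
               av_sample_fmt_is_planar(fmt) ? "planar" : "packed");
    }
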
[20:19] <Ulfalizer> can avcodec_encode_video() do format conversions? e.g. for encoding ARGB frames to mpeg, which seems to use YUV.
[20:20] <JEEB> generally no
[20:20] <JEEB> you would have to use libswscale for that, or create your own conversion routines
[20:21] <Ulfalizer> ok
[20:27] <Ulfalizer> what's the deal with AVFrame.data being an array of uint8_t* by the way? is it more common for video to use planar rather than packed representations?
[20:29] <Ulfalizer> and how would something like ARGB appear there (if it ever does)? will it always be split?
[22:01] <zennist> Just added resampling to my program; and swr_convert is returning 0 all the time. Any ideas?
[22:02] <zennist> I put src_sample_fmt as AV_SAMPLE_FMT_FLTP and dest_sample_fmt as AV_SAMPLE_FMT_S32
[22:07] <JEEB> Ulfalizer, yes -- esp. with YCbCr video it's more often planar than packed
[22:07] <JEEB> Ulfalizer, also I think you should be able to handle both packed and planar ARGB
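
A minimal sketch of the libswscale step for Ulfalizer's ARGB case, which also illustrates the data[]/linesize[] layout just described: packed ARGB occupies only data[0], while planar YUV420P fills data[0], data[1] and data[2]. The function name and the bilinear scaler are arbitrary choices:

    #include <stdint.h>
    #include <libswscale/swscale.h>

    /* Convert one packed ARGB picture (a single plane, 4 bytes per pixel) into
     * planar YUV420P, which is what most MPEG-family encoders expect. */
    static int argb_to_yuv420p(const uint8_t *argb, int w, int h,
                               uint8_t *dst_data[4], int dst_linesize[4])
    {
        const uint8_t *src_data[4] = { argb, NULL, NULL, NULL };  /* packed: one plane */
        int src_linesize[4]        = { 4 * w, 0, 0, 0 };          /* bytes per row */
        struct SwsContext *sws;

        sws = sws_getContext(w, h, AV_PIX_FMT_ARGB,     /* source: packed */
                             w, h, AV_PIX_FMT_YUV420P,  /* destination: planar */
                             SWS_BILINEAR, NULL, NULL, NULL);
        if (!sws)
            return -1;
        sws_scale(sws, src_data, src_linesize, 0, h, dst_data, dst_linesize);
        sws_freeContext(sws);
        return 0;
    }

dst_data/dst_linesize would typically be the data/linesize arrays of an AVFrame whose planes were allocated with av_image_alloc(); a real program would also create the SwsContext once and reuse it rather than rebuilding it per frame.
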
[22:08] <JEEB> zennist, unfortunately I can't say much more than look at the format you're getting from the decoder and stare at the doxygen :s
[22:11] <Fresh> hello, would this be the correct place to ask a noob usage question?
[22:13] <Fresh> I am trying to create an http video stream of my desktop so I can view it on my tv, but i cannot figure out how to get the segmenter to integrate with ffmpeg.  If someone can tell me the missing link, I would appreciate it
[22:14] <Fresh> I have downloaded a precompiled segmenter, with these expected parameters. USAGE: segmenter  -i input-MPEG-TS-file  -d seconds-per-segment  [-o segment-file-prefix]  -x output-playlist-m3u8  [-p http-prefix]  [-w live-segments]  [--watch-for-kill-file]  [--strict-segment-duration]
[22:15] <Fresh> Here is my ffmpeg input ffmpeg -f dshow  -i video="UScreenCapture" -r 10  -vcodec mpeg4 -q 12 -f mpegts <OUTPUT HERE>
[22:15] <Fresh> Here is the segmenter input segmenter -i <INPUT HERE> -d 10 -x test.m3u8
[22:15] <Fresh> how can I end up streaming the stuff to http://localhost:8080/ instead of having it on the local disk ?
[22:15] <Fresh> thank you to anyone who can maybe clear up the missing step.
[22:15] <Fresh> I have searched google for quite a while now
[22:16] <zennist> JEEB: mhh, the frame I got from the decoder does indicate the format is still AV_SAMPLE_FMT_FLT though..
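
A minimal sketch of the conversion JEEB suggested, taking the input format from the decoded frame itself rather than hard-coding FLTP (which matters here, since the decoder reports FLT). Converting straight to packed S16 for libao is an assumption, the names outside libswresample are placeholders, and a real program would create the SwrContext once instead of per frame:

    #include <libswresample/swresample.h>
    #include <libavcodec/avcodec.h>
    #include <libavutil/channel_layout.h>
    #include <libavutil/samplefmt.h>

    /* Convert one decoded frame to packed signed 16-bit PCM.  'dec' is the
     * AVCodecContext the frame came from; 'out' must hold at least
     * max_out_samples * channels * 2 bytes.  Returns bytes written, or < 0. */
    static int frame_to_s16(AVCodecContext *dec, AVFrame *frame,
                            uint8_t *out, int max_out_samples)
    {
        int channels = dec->channels;
        int64_t layout = dec->channel_layout ? dec->channel_layout
                                             : av_get_default_channel_layout(channels);
        SwrContext *swr;
        int n;

        swr = swr_alloc_set_opts(NULL,
                                 /* output: packed 16-bit at the same rate/layout */
                                 layout, AV_SAMPLE_FMT_S16, dec->sample_rate,
                                 /* input: whatever the decoder actually produced */
                                 layout, (enum AVSampleFormat)frame->format, dec->sample_rate,
                                 0, NULL);
        if (!swr || swr_init(swr) < 0)
            return -1;

        /* extended_data has one pointer per channel for planar input (fltp) and a
         * single pointer for packed input (flt), so it is correct in both cases */
        n = swr_convert(swr, &out, max_out_samples,
                        (const uint8_t **)frame->extended_data, frame->nb_samples);
        swr_free(&swr);
        if (n < 0)
            return -1;
        return n * channels * av_get_bytes_per_sample(AV_SAMPLE_FMT_S16);
    }
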
[22:38] <viric> Hi!
[22:39] <viric> I've a video reported as:     Stream #0:0[0x1e0]: Video: mpeg2video (Main), yuv420p, 720x576 [SAR 64:45 DAR 16:9], 25 fps, 25 tbr, 90k tbn, 50 tbc
[22:39] <viric> But *it is interlaced*.
[22:39] <Mavrik> and?
[22:39] <viric> I want yadif to provide 50fps
[22:39] <viric> yadif=1 doesn't work with it. I think because it's marked as non-interlaced...
[22:40] <viric> yadif alone deinterlaces fine, though.
[22:40] <viric> (it's a direct recording from DVB)
[22:41] <viric> Do you have any idea how I can tell ffmpeg that the image comes from two fields?
[22:41] <viric> (for what I understand, it's interlaced video encoded as progressive)
[22:41] <viric> Is that a common thing? Do I understand it right?
[22:42] <viric> ah, damn it. only PART of the video is interlaced... I guess the broadcaster messed it all up
[22:44] <Mavrik> hmm, I mean your question is weird
[22:44] <Mavrik> if you're deinterlacing the video, you're merging the fields
[22:44] <Mavrik> at least that's the primary use case
[22:44] <viric> yes
[22:45] <viric> but if I have 25fps interlaced video, that means 50 fields per second. yadif can give me either 25fps or 50fps progressive, right?
[22:46] <viric> in this case, the video media is '25fps progressive', but I see in the pictures that it is interlaced. So it means that the broadcaster sent interlaced video as progressive.
[22:46] <viric> Or am I completely wrong? Where does ffprobe say whether it is interlaced or not? :)
[22:48] <cbsrobot> did you add -r 50 ?
[22:48] <viric> hm no
[22:52] <viric> maybe I'm just checking things badly.
[22:53] <viric> In this video, I see parts of the picture interlaced, and parts clearly not interlaced.
[22:53] <Mavrik> there's no real reason why the WHOLE video should be interlaced :)
[22:53] <viric> hehe
[22:53] <viric> but there should be some flags in the file, whether it has one or two fields, no?
[22:54] <Fresh> I was wondering if anyone could point me in the direction of how to setup a http video stream
[22:54] <viric> Fresh: ffserver comes with examples
[22:54] <Fresh> i will take a look over there now, thank you
[22:55] <viric> Hm, in fact I don't remember a program ever noticing that a file is interlaced and then automatically deinterlacing it.
[22:55] <viric> Mavrik: your last sentence has very valuable information, it seems
[22:57] <Mavrik> viric, I'm not entirely sure about MPEG-2
[22:57] <Mavrik> but in H.264 you can't know if a frame is interlaced without actually decoding it
[22:59] <Fresh> viric, is it impossible under windows?  It seems like ffserver is only for linux, yet ffmpeg has been able to act as a udp,rdp, etc server
[22:59] <viric> Mavrik: but once decoding it... is it clear?
[22:59] <Mavrik> well
[22:59] <Mavrik> not really :)
[23:00] <viric> ha
[23:00] <viric> what a mess
[23:01] <viric> Btw, when I shoot video and keep only 25fps (instead of the '50fps interlaced'), it seems to me that the movement is much less smooth.
[23:02] <viric> (when I pan the camera horizontally, for example)
[23:02] <viric> does this mean that I should keep 50fps, that I should not move the camera so fast, or something else?
[23:02] <viric> I don't see 50fps videos very widespread... so I guess it's the movement thing. I move too fast.
[23:05] <Mavrik> viric, if you want to keep smoothness, leave it interlaced
[23:05] <Mavrik> anything else you do will be a mess
[23:05] <viric> aha...
[23:05] <viric> interesting
[23:06] <viric> how do *encoders* know that it's interlaced?
[23:07] <viric> Imagine I want to reencode from mp4 to webm, both interlaced. If players don't know about it being interlaced, how do encoders know?
[23:12] <viric> I imagine encoders care whether the input is interlaced or not
[23:14] <Mavrik> hmm, good question
[23:17] <viric> I'm afraid vp8 doesn't support "interlaced mode"
[23:18] <viric> but I guess I have to tell the encoder
[23:20] <viric> the x264 program allows the interlaced mode
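
For encoders that do support it (x264, mpeg2video), the signalling goes through libavcodec in two places; a minimal sketch with 2013-era flag names, where the top-field-first choice is an assumption about the source (the ffmpeg CLI equivalent is roughly -flags +ildct+ilme):

    #include <libavcodec/avcodec.h>

    /* Tell an encoder that the material is interlaced: per-stream coding flags
     * on the codec context, plus per-picture field flags on each frame. */
    static void mark_interlaced(AVCodecContext *enc, AVFrame *picture, int top_field_first)
    {
        enc->flags |= CODEC_FLAG_INTERLACED_DCT | CODEC_FLAG_INTERLACED_ME;

        picture->interlaced_frame = 1;
        picture->top_field_first  = top_field_first;
    }
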
[23:26] <viric> Mavrik: maybe the 'il' filter?
[23:27] <viric> hm no
[23:33] <Ulfalizer> JEEB: okay
[23:44] <viric> hm, neither VP8 nor Theora supports interlaced video.
[23:45] <viric> Isn't there any free codec I can use for interlaced video?
[00:00] --- Mon Oct 21 2013

