[Ffmpeg-devel-irc] ffmpeg.log.20150216

burek burek021 at gmail.com
Tue Feb 17 02:05:01 CET 2015


[01:26] <matham> I have been using the logic of av_image_fill_pointers to determine if the user passing plane buffers missed a plane for the format, as well as the required plane size.
[01:27] <matham> However, the second plane is optional for PAL formats, so unless I explicitly make an exception for PAL, my code would return an error if the user didn't supply that plane.
[01:28] <matham> Is PAL the only format whose other planes are optional? Or do I have to make an exception for other formats?
[04:03] <nell> hi
[04:38] <wad> Hi guys. Over the past month, I've captured the contents of 34 mini DV tapes (home videos) to my hard drive. They are .avi files, 720x480.
[04:38] <wad> Each file is about 13 GB in size.
[04:38] <wad> I'm running Ubuntu Linux.
[04:38] <wad> I'm now attempting to cut them up into smaller files, and transcode them into a less bulky file format.
[04:39] <wad> I've tried using a few programs: HandBrake, Pitivi, WinFF, OpenShot.
[04:39] <wad> All of them except for OpenShot choke on the input files. They lock up or crash.
[04:40] <wad> OpenShot seems to be working, so far. My question is: What format should I export the files as?
[04:40] <wad> I'd like to be able to view them on PCs, mobile devices, and my old hacked XBox.
[04:41] <wad> I think the framerate is 24.97 fps.
[04:41] <wad> OpenShot gives me tons of export options, but I don't know which one to choose.
[04:41] <wad> Any guidance would be welcome.
[07:20] <YaMoonSun> How do I submit suggestions to the developers? It seems .iso files aren't being mapped, and I'd like to be able to make a proper rip without having to extract an .mkv first.
[07:35] <relaxed> YaMoonSun: dvd iso?
[07:35] <YaMoonSun> Yeah, it was extracted using dvd decrypter
[07:38] <relaxed> ffmpeg can't read DVDs, just have dvd decrypter dump the title to a vob
[07:38] <YaMoonSun> I dunno if it can, but will attempt.
[07:39] <relaxed> or mount the DVD and use the vob as input for ffmpeg
[07:39] <relaxed> (assuming it's the main title)
[07:54] <YaMoonSun> Fingers crossed, thanks for the input
[08:07] <YaMoonSun> So I did 'ffmpeg -i "Source.mp4" -c:v copy -c:a copy -ss 00:00:13 "Output.mp4"' and now the music plays before the video appears in VLC, what do?
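Note: with stream copy, -ss applied on the output side can only start the video at a keyframe, so the audio often begins before the first displayable video frame. A minimal sketch (same file names assumed) that re-encodes the video so the cut point is frame-accurate while still copying the audio:

    ffmpeg -ss 00:00:13 -i "Source.mp4" -c:v libx264 -c:a copy "Output.mp4"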
[08:07] Action: YaMoonSun runs to get something to eat
[08:08] <YaMoonSun> Also, do I need to use .mkv .ogg and .webm when supplying data to Archive.org?
[08:08] <YaMoonSun> Implying, do the codecs need to be as free as the content?
[08:33] <YaMoonSun> Could it be because I checked the WebOptimization box for the first .mp4 and don't know how to manually optimize for web?
[09:53] <YaMoonSun> I'm trying to take one part of a webm and put it into another webm, but my file keeps appearing corrupted. No video.
[09:53] <YaMoonSun> http://pastebin.com/ayzRuqxB
[09:58] <relaxed> No video using vlc?
[10:00] <YaMoonSun> Yeah, not unless I do -c:v libvpx -b:v 1M
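Note: joining WebM pieces with plain stream copy tends to fail unless both parts share identical codec parameters, which is why re-encoding with libvpx makes it play. A sketch of cutting a piece and joining it to another file with the concat demuxer, re-encoding to keep the parameters consistent (file names hypothetical):

    ffmpeg -i first.webm -ss 00:00:10 -to 00:00:20 -c:v libvpx -b:v 1M -c:a libvorbis part.webm
    printf "file 'part.webm'\nfile 'second.webm'\n" > list.txt
    ffmpeg -f concat -i list.txt -c:v libvpx -b:v 1M -c:a libvorbis joined.webm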
[10:31] <YaMoonSun> Why would a site tell me my .gif that I made using ffmpeg from a webm contains malicious content? =/
[10:38] <BtbN> "a site"?
[10:40] <YaMoonSun> Worked fine on imgur, but 4chan refused it.
[11:06] <yaronj> Hi, I was looking around to find the reason for the warning while running with the compiled FFmpeg:
[11:06] <yaronj> "No accelerated colorspace conversion found from yuv420p to rgb24"
[11:06] <yaronj> and found it's related to the GPU not being used.
[11:06] <yaronj> did the following with no difference:
[11:06] <yaronj> 1) compiled the FFmpeg with x264
[11:06] <yaronj> 2) added to the configure flags: --enable-neon
[11:06] <yaronj> this warning appears when calling:
[11:06] <yaronj> sws_getContext with PIX_FMT_RGB24 where the source is YUV420P
[11:06] <yaronj> compiled for iOS arm64
[11:22] <jpluijmers> Hi guys, I have been banging my head against this problem for days now and am at a complete loss. I have posted on several forums but nobody seems to know a definitive answer.
[11:22] <jpluijmers> My command is: ffmpeg -y -i bunny-source.mp4 -i wm.png -filter_complex "[1:v]scale=iw*0.3:-1[watermark];[0:v][watermark]overlay=10:main_h-overlay_h-10[outv]" -map "[outv]" -map 0:a marked.mp4
[11:22] <jpluijmers> the output is: http://pastebin.com/Jn0krS7W
[11:23] <jpluijmers> I am trying to get the watermark complex filter to scale percentage based on the width of the first input source.
[11:23] <jpluijmers> But I can't seem to get a valid (non-self-referencing) token within the complex filter.
[11:24] <jpluijmers> Any input at all would be greatly appreciated.
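Note: in that filter graph, iw inside [1:v]scale=... is the watermark PNG's own width, not the video's, which is why the percentage never tracks the first input. Newer ffmpeg builds include a scale2ref filter that lets the watermark be scaled against the main video's dimensions; a sketch assuming such a build (main_w is the main input's width, ow/dar keeps the watermark's own aspect ratio):

    ffmpeg -y -i bunny-source.mp4 -i wm.png \
      -filter_complex "[1:v][0:v]scale2ref=w=main_w*0.3:h=ow/dar[wm][base];[base][wm]overlay=10:main_h-overlay_h-10[outv]" \
      -map "[outv]" -map 0:a marked.mp4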
[12:13] <BtbN> Is ffmpeg capable of recovering a broken mp4? (Live recording, app crashed before writing the moov atom)
[12:29] <Mavrik> BtbN, I doubt anything is really capable of recovering an MP4 without an index.
[12:56] <synthecypher> Is there a way to skip files that haven't changed?
[13:07] <BtbN> Mavrik, there are tools that can do it, but I'd like to do it with ffmpeg
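Note: ffmpeg itself has no moov-rebuilding mode; the approach usually mentioned is the third-party untrunc tool, which reconstructs the missing index by using an intact recording from the same source as a template (file names hypothetical):

    untrunc working-reference.mp4 broken-recording.mp4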
[13:42] <termos> I have some issues linking against my newly compiled FFmpeg: undefined reference to `lzma_stream_decoder' etc. Is this a known problem? I have libtiff installed
[13:42] <BtbN> Is your ffmpeg a static library?
[13:43] <Mavrik> termos, it works just like you intended
[13:43] <Mavrik> you just don't understand how static linking works
[13:43] <Mavrik> (you have to explicitly link objects for dependent libraries yourself, LD won't do that for you)
[13:49] <termos> you're right it's static, libav*.a files in my lib folder. I used the same configure arguments as before though, has something changed?
[13:50] <BtbN> Nope, static libs don't have a dependency system. You have to take care of that manually.
[13:57] <termos> I'd prefer to dynamically link against FFmpeg though
[13:58] <Mavrik> then link dynamically. You'll still have to properly link to all dependencies.
[13:58] <Mavrik> Aren't you using CMake or any proper build system that does that for you?
[14:00] <BtbN> Well, with dynamic linking the dll/so has dependencies
[14:00] <termos> I'm using CMake where I specify include and lib directories, then I have a list of target_link_libraries with -lavcodec etc.
[14:08] <termos> Okay, I fixed the linking problem by installing liblzma-dev and adding "-llzma" to my link flags. I guess this is a new dependency for FFmpeg that I didn't know about
[14:09] <termos> Last time I compiled the library was about six months ago
[14:14] <Mavrik> termos, well cmake should resolve all the -l's required for a dependency
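Note: for a static libav* install, pkg-config can expand the full dependency chain so extra flags like -llzma don't have to be tracked by hand; a sketch, assuming the .pc files from the FFmpeg install are on PKG_CONFIG_PATH:

    pkg-config --static --libs libavformat libavcodec libavutil
    # prints the -l flags for the libraries plus their private dependencies (e.g. -llzma -lz -lm ...)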
[16:48] <nslinux> hi
[16:48] <nslinux> hello guys
[16:49] <nslinux> I got this error when I try to compile ffmpeg
[16:49] <nslinux> can anyone help please
[16:49] <nslinux> ERROR: libnut not found  If you think configure made a mistake, make sure you are using the latest version from Git.  If the latest version fails, report the problem to the ffmpeg-user at ffmpeg.org mailing list or IRC #ffmpeg on irc.freenode.net. Include the log file "config.log" produced by configure as this will help solve the problem.
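Note: configure only raises "libnut not found" when --enable-libnut was requested, so the usual fix is either to install the libnut development files or to rerun configure without that flag; a sketch (flags hypothetical, the point is simply to omit --enable-libnut):

    ./configure --prefix=/usr/local --enable-gpl    # same flags as before, minus --enable-libnut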
[17:26] <Mavrik> hrmf, on OS X VDA doesn't work... at all
[17:26] <Mavrik> that's not cool
[17:26] <nslinux> bad irc room
[17:27] <ac_slater_> alright guys. I was here a few days ago. I have a custom encoder that gives me h264 data (I and P frames), but no timestamps. Any advice when generating PTS?
[17:29] <ac_slater_> it doesn't even give me NAL boundaries, sadly
[17:31] <Mavrik> well, that sucks :)
[17:31] <ac_slater_> yea it's worst case
[17:36] <Mavrik> well
[17:36] <Mavrik> Short answer: you can't.
[17:37] <ac_slater_> Can't? I can rig up detecting when a 'frame' is complete, I guess. Assume I can. Assume I have a group of NALs that represent a frame.
[17:38] <Mavrik> You're asking the same things you asked a few days ago.
[17:38] <Mavrik> Since you can't know how the frame is ordered without the encoder telling you that.
[17:39] <Mavrik> There's no way for you to extrapolate since NALs don't have timing info.
[17:40] <ac_slater_> Mavrik: right. The new info I have is, I can always get data in decode order. I can, in fact, read() until EAGAIN, find the NAL boundaries (in order), and get a 'frame'
[17:44] <Mavrik> that still doesn't help you if you have B-frames.
[17:44] <ac_slater_> I don't. I thought I did, but I only have I and P
[17:45] <Mavrik> well then it's simple
[17:45] <Mavrik> you have FPS
[17:45] <Mavrik> and you know when you get a new frame
[17:45] <Mavrik> so just increment PTS and DTS with appropriate difference
[17:45] <Mavrik> everytime you get a frame
[17:45] <ac_slater_> can they be incremented at the same rate? DTS and PTS that is?
[18:36] <sg90> Hi, I'm trying to record a video and split it into segments every 30 seconds using the segment muxer. I am having issues forcing a new keyframe on the first frame on the new segment. Has anyone encountered this issue before?
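Note: the segment muxer can only split at keyframes, so the usual pattern is to force a keyframe at each segment boundary with a force_key_frames expression; a sketch with hypothetical file names:

    ffmpeg -i input.mp4 -c:v libx264 -c:a copy -force_key_frames "expr:gte(t,n_forced*30)" \
        -f segment -segment_time 30 -reset_timestamps 1 out%03d.mp4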
[18:44] <sleep> hi, is there a way to use the gdigrab module by using a pid, or something else unique, to find the window to record?
[18:45] <sleep> I don't quite get why someone thought grabbing by title only is a good idea, tbh
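Note: gdigrab only accepts desktop or title=... as its input; there is no pid option, so the usual workaround is to resolve the window's current title first and pass that. A sketch with a hypothetical title:

    ffmpeg -f gdigrab -framerate 30 -i title="Untitled - Notepad" out.mkv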
[18:45] <d00f_> does ffmpeg have a method to encode a low resolution high framerate video stream with a high resolution low framerate stream into a single high resolution output stream?
[18:59] <ac_slater_> Mavrik: thanks for the help, I was able to get something working. It's kinda crazy
[19:00] <Mavrik> ac_slater_, yes, they can be incremented at the same rate and probably should be the same if you have no B frames
[19:00] <ac_slater_> Mavrik: that's what I thought. Thanks mate
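Note: the same idea at the command line, for comparison: when a raw Annex-B H.264 elementary stream carries no timestamps, a constant input frame rate can be assumed and ffmpeg then assigns equally spaced PTS/DTS while muxing (file name and rate hypothetical):

    ffmpeg -framerate 30 -i capture.h264 -c copy out.mp4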
[21:07] <sybariten> evening
[21:07] <sybariten> does an ffmpeg from 2012 typically not have -vf ?
[21:07] <sybariten> trying to do -vf scale=iw/2:-1
[21:07] <sybariten> FFmpeg version 0.6.5, Copyright (c) 2000-2010 the FFmpeg developers      built on Jan 29 2012 17:52:15 with gcc 4.4.5 20110214 (Red Hat 4.4.5-6)
[21:08] <JEEB> ahaha
[21:08] <JEEB> that's old, congratulations
[21:08] <JEEB> like pre-fork stuff
[21:08] <sybariten> ok
[21:56] <d00f_> Hello - does anyone know of any resources that would help me live transcode a low resolution video stream and a high resolution stream together? The framerates of the two are different
[21:57] <c_14> What do you mean, "transcode together"?
[21:57] <d00f_> "interleave" i think
[21:57] <c_14> frame 'a' frame 'b' frame 'a' frame 'b' ?
[21:58] <d00f_> yes, but the framerates are different as well
[21:58] <d00f_> so it would be abbbbbbabbbbbbabbbbbbabbbbbb
[21:59] <d00f_> it's a high resolution camera with two feeds
[22:00] <c_14> try the interleave filter, not sure if that'll work out of the box though
[22:00] <ChocolateArmpits> Do they have to be different in the end?
[22:02] <d00f_> no, they don't have to be different in the end. I am trying to cut down transmission size downstream
[22:03] <d00f_> It might look cool too in the end
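Note: the interleave filter picks frames from its inputs purely by timestamp and does not rescale, so both inputs would need matching dimensions first (and, as noted above, it may not cope with mismatched rates out of the box). A sketch under those assumptions, with hypothetical file names:

    ffmpeg -i small_highfps.mp4 -i big_lowfps.mp4 \
        -filter_complex "[0:v]scale=1920:1080[up];[up][1:v]interleave[out]" \
        -map "[out]" -c:v libx264 merged.mp4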
[22:34] <voip_> Hello guys
[22:35] <voip_> what does this mean: Circular buffer overrun. To avoid, increase fifo_size URL option. To survive in such case, use overrun_nonfatal option
[22:35] <voip_> ?
[22:38] <rcombs> seems pretty self-explanatory to me
[22:41] <voip_> rcombs, how do I fix it?
[22:41] <voip_> increase fifo_size URL option
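Note: fifo_size and overrun_nonfatal are options on the UDP input URL itself (fifo_size is counted in 188-byte packets); a sketch with a hypothetical multicast address:

    ffmpeg -i "udp://239.1.1.1:1234?fifo_size=1000000&overrun_nonfatal=1" -c copy dump.ts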
[22:54] <ubitux> sybariten: 0.6.x is probably the worst release ever made
[22:54] <ubitux> why the hell would you use that?
[22:57] <JEEB> > red hat 4.4.5
[22:57] <JEEB> seems like either centos or rhel
[22:57] <JEEB> aka fucking awesomely old shit
[22:57] <JEEB> (thankfully compilation isn't too hard in general)
[00:00] --- Tue Feb 17 2015

