[Ffmpeg-devel-irc] ffmpeg.log.20141119

burek burek021 at gmail.com
Thu Nov 20 02:05:01 CET 2014


[00:03] <Phlarp> When I try to concat two .ts files the resulting output has horrible compression artifacts, what should I do differently?
[00:03] <c_14> What are you doing?
[00:04] <c_14> Note, if you want to concat ts files you can usually just cat *ts > new.ts
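Both approaches mentioned above can be sketched as follows; `input1.ts`, `input2.ts`, and `joined.ts` are placeholder names. Note that compression artifacts after a concat usually mean the files were re-encoded; `-c copy` avoids that entirely:

```shell
# MPEG-TS can usually be concatenated at the byte level:
cat input1.ts input2.ts > joined.ts

# If that produces timestamp glitches, the concat demuxer rewrites
# timestamps while still stream-copying (no re-encode, no new artifacts):
printf "file 'input1.ts'\nfile 'input2.ts'\n" > list.txt
ffmpeg -f concat -i list.txt -c copy joined.ts
```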
[02:12] <danomite-> I'm getting some delay when I'm starting from an rtsp stream, is it possible to control the load buffer
[02:46] <troy_s> Need a little help. Using the FFMPEG API, is it possible to chain formats together using the AVInputFormat.next pointer?
[02:47] <troy_s> (Looking to use avformat_open_input but only on a limited subset of the file types that FFMPEG supports.)
[02:52] <troy_s> (Or is it more prudent to register only the subset of codecs?)
[04:15] <ningu> is there an easy way to split a video file into two parts with ffmpeg? so if I have a 2 hours video make two files, 1 hour each
[04:16] <ningu> I'd like to be able to say: split into two chunks, rather than specify how long each chunk will be
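ningu's question never gets answered in the log; a minimal sketch of a halving split with stream copy (the file name and the 7200 s duration are assumptions) could look like:

```shell
# Find the total duration in seconds:
ffprobe -v error -show_entries format=duration -of csv=p=0 input.mp4
# Assuming it printed 7200, cut at the 3600 s midpoint.
# -c copy can only split on keyframes, so the cut point may shift slightly:
ffmpeg -i input.mp4 -t 3600 -c copy part1.mp4
ffmpeg -ss 3600 -i input.mp4 -c copy part2.mp4
```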
[04:26] <danomite-> I'm trying to transcode a live stream but there seems to be a 30 second startup delay, how can I reduce it?
[04:50] <Techstar> is there a good tutorial on how to use FFMPEG?
[05:18] <relaxed> Techstar: http://ffmpeg.org/ffmpeg.html   https://trac.ffmpeg.org/wiki/Encode/H.264
[10:25] <Pazns> hey people
[10:28] <Pazns> I have a problem with ffmpeg and libopus here. I have an .opus file (found on the internet) with 1 audio and 2 video streams (cover arts), but when I try to do the same kind of thing myself, just with stream mapping, ffmpeg throws an error.
[10:29] <Pazns> Can libopus in ffmpeg handle adding video streams to .opus files?
[10:33] <Pazns> http://pastebin.com/raw.php?i=zVyQ12G0
[10:37] <relaxed> Pazns: I think you want .webm
[10:38] <Pazns> But how does the original file exist at all ?
[10:38] <Pazns> Seems legit enough for my music player to handle it properly and display the cover art from the video stream.
[10:39] <relaxed> ffmpeg's output says it's ogg
[10:39] <Pazns> aww, didn't see that
[10:40] <Pazns> it's obvious now :|
[10:41] <Pazns> Well, thanks, then.
[11:31] <Pazns> Hm, it's me again.
[11:32] <Pazns> http://pastebin.com/raw.php?i=w12XVYHs Doesn't work better than previously. What am I missing this time ?
[11:32] <Pazns> The documentation is not very clear.
[11:40] <relaxed> try ffmpeg -i input -map 0:a -c copy -f ogg test.opus
[11:41] <Pazns> But then the video stream (holding a cover art) is not there.
[11:42] <relaxed> humor me
[11:44] <Pazns> Well, tried it and there is just one (audio) stream. Like requested in the command.
[11:45] <relaxed> ok, I just wanted to see if the opus stream was the issue.
[11:58] <c_14> ogg only likes theora video
[11:59] <c_14> ie, you'll have to use -c:v libtheora
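A sketch of that suggestion, keeping the audio untouched and re-encoding the cover-art streams to theora (file names are placeholders):

```shell
# Copy the opus audio, convert the video/cover streams to theora:
ffmpeg -i input.opus -map 0 -c:a copy -c:v libtheora -f ogg output.ogg
```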
[11:59] <Pazns> Then how does the original file have those two video streams in the first place?
[12:00] <c_14> Probably the muxer producing incorrect output.
[12:01] <c_14> Unless the ogg spec was expanded and that hasn't been included in FFmpeg yet.
[12:04] <c_14> It could always be that FFmpeg just never implemented that part of the spec/feature. I just know that theora works.
[12:04] <Pazns> It could be a weird variety of ogm  ?
[12:06] <Pazns> Anyway, now I just need to figure out how to dump this into a theora stream without getting just a black screen x)
[12:06] <Pazns> Thanks for the help.
[12:07] <c_14> Mhm, it seems you can use FLAC's METADATA_BLOCK_PICTURE field as a vorbis comment.
[12:10] <Pazns> And a base64 encoded image, if I understood.
[12:11] <Pazns> Just a metadata with this name and the b64 value will do the trick ?
[12:12] <c_14> No, it needs some extra info as well.
[12:12] <c_14> There seems to be a bash script here that you can look at: https://github.com/acabal/scripts/blob/master/ogg-cover-art
[12:13] <Pazns> A bash script is a nope for me, I'm using ffmpeg windows build.
[12:14] <c_14> Sure, but you can see what you have to add to the picture etc.
[12:15] <c_14> You could always open a feature request on trac for FLAC/ogg/opus cover art support.
[12:18] <Pazns> In fact I found one old trac issue about that, I guess.
[12:18] <Pazns> https://trac.ffmpeg.org/ticket/2655
[12:18] <Pazns> i'm not sure if this is really related to my case
[12:19] <c_14> >vorbiscomment: Add support for embedded cover art
[12:19] <c_14> It seems relevant, let me look at the change.
[12:20] <c_14> Mhm, it looks like it's only in the ogg demuxer.
[12:23] <Pazns> The feature is "read-only", then ?
[12:23] <c_14> Currently, yes.
[12:25] <Pazns> Then I'd need to play with metadata tags. Oh well, no cover art for me and more free time!
[12:28] <hefest> hello
[12:29] <hefest> i'm converting a bunch of flv video files to mp4 but the audio is only 4kb/s. any idea how to improve that?
[12:29] <c_14> -b:a 9001k
[12:30] <c_14> https://trac.ffmpeg.org/wiki/Encode/AAC
[12:32] <hefest> c_14: 13 kb/s :(
[12:32] <hefest> c_14: i have the same video transcoded by zencoder and it has 47kb/s
[12:32] <c_14> Pastebin your current commandline and output please.
[12:33] <hefest> c_14: http://pastebin.com/THPA6xJS
[12:34] <c_14> Output?
[12:34] <hefest> c_14: of ffmpeg -i ?
[12:34] <c_14> Of the command you just pasted.
[12:36] <hefest> c_14: http://pastebin.com/be5PDfyP
[12:36] <hefest> c_14: first one is the original flv file, second one is the file i got transcoded by zencoder, aka that's the quality i want, and the third is the result i'm getting.
[12:39] <c_14> Try with -b:a 47k -profile:a aac_he_v2
[12:39] <c_14> And -ac 2
[12:40] <c_14> Or with aac_he if you want to keep it mono
[12:40] <c_14> But then you should use -b:a 48k
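Put together, the suggested commands would look roughly like this (`in.flv`/`out.mp4` are placeholders; as it turns out further down, the HE profiles need a libfdk_aac-enabled build):

```shell
# Stereo HE-AAC v2 at 47 kb/s:
ffmpeg -i in.flv -c:v libx264 -c:a libfdk_aac -profile:a aac_he_v2 -ac 2 -b:a 47k out.mp4
# Mono HE-AAC at 48 kb/s:
ffmpeg -i in.flv -c:v libx264 -c:a libfdk_aac -profile:a aac_he -ac 1 -b:a 48k out.mp4
```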
[12:59] <hefest> c_14: i got invalid profile for aac_he
[12:59] <hefest> and for aac_he_v2 too
[13:00] <c_14> Aah, eeh, it seems only libfdk_aac supports those profiles...
[13:01] <hefest> Unknown encoder 'libfdk_aac'
[13:01] <c_14> Yep, you'd have to compile from source to get that one.
[13:01] <c_14> Though I'm not sure why the default aac encoder is ignoring the bitrate you give it.
[13:02] <c_14> Did you ever try with just -b:a 47k ?
[13:02] <hefest> i got 24kb/s with this: ffmpeg -i ET2GYGBFW4.flv -q:v 0 -profile:v baseline -r 25 -vcodec libx264 -ac 2 -b:a 47k test.mp4
[13:03] <c_14> Well, it's almost twice as high as it was earlier...
[13:03] <c_14> Let's try 49?
[13:03] <c_14> 49k that is
[13:03] <hefest> but that's for stereo, i'd like mono
[13:03] <hefest> i think it's twice as high because -ac 2
[13:04] <c_14> Aah, didn't see that.
[13:04] <c_14> When you get rid of it it's back at 13k?
[13:04] <hefest> ok, with mono and 49k it's 13kb/s
[13:04] <hefest> yeah :(
[13:06] <c_14> try with -c:a aac -strict -2 -b:a 47k
[13:06] <hefest> Too many bits per frame requested
[13:08] <c_14> I'd try compiling with libfdk_aac and then try with the aac_he profile.
[13:08] <c_14> I'm not sure what's going wrong.
[13:12] <c_14> You can always encode the audio with a different encoder and then just mux it back in with ffmpeg.
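One way to do that last suggestion, assuming the standalone `fdkaac` frontend is installed (the exact flags here are an assumption worth double-checking, and stream copy of the video only works if the flv already holds h264):

```shell
# 1. Extract the audio to WAV:
ffmpeg -i in.flv -vn audio.wav
# 2. Encode with the external encoder (profile 29 = HE-AAC v2):
fdkaac -p 29 -b 48000 -o audio.m4a audio.wav
# 3. Mux the new audio back with the original video:
ffmpeg -i in.flv -i audio.m4a -map 0:v -map 1:a -c copy out.mp4
```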
[13:47] <aabuild> Hi
[13:47] <aabuild> I have question...
[13:48] <c_14> 42
[13:48] <aabuild> what is the easiest way to build a 32-bit Windows version of ffmpeg?
[13:48] <aabuild> I just want to add libfdk-aac and faac to the zeranoe build
[13:49] <c_14> I've never tried any, but there are https://trac.ffmpeg.org/wiki/CompilationGuide/MinGW https://trac.ffmpeg.org/wiki/CompilationGuide/CrossCompilingForWindows https://ffmpeg.org/platform.html#Windows and https://trac.ffmpeg.org/wiki/CompilationGuide/MSVC
[13:49] <BtbN> msvc builds are slow though
[13:49] <BtbN> and using libraries is a nightmare
[13:49] <relaxed> plus, I think zeranoe has a guide in his forum
[13:50] <aabuild> Thanks... but I tried on msys/mingw... that was so terrible... some external modules error out or can't be found..
[13:50] <aabuild> especially fontconfig...
[13:51] <aabuild> and librtmp...
[13:51] <aabuild> too much trouble when configuring or building
[13:53] <aabuild> Is there no way to get an all-modules-enabled ffmpeg build without building the external modules myself?
[13:53] <c_14> aabuild: http://ffmpeg.zeranoe.com/forum/viewforum.php?f=19&sid=c1f8fb1fd41ecffa605dc83975d4c259 <- there's a few more topics on zeranoes wiki here about building
[13:55] <bobdobbs> I've got an m4v file. I'd like to convert it to both webm and ogg. However, all of my attempts yield files with low quality.
[13:55] <bobdobbs> How can I convert, but keep quality?
[13:56] <c_14> https://trac.ffmpeg.org/wiki/Encode/VP8
[13:56] <c_14> https://trac.ffmpeg.org/wiki/TheoraVorbisEncodingGuide
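Condensed from those two guides, quality-targeted encodes look roughly like this (`input.m4v` and the quality values are placeholders to tune):

```shell
# VP8/Vorbis webm; lower -crf = higher quality:
ffmpeg -i input.m4v -c:v libvpx -crf 10 -b:v 1M -c:a libvorbis output.webm
# Theora/Vorbis ogg; -q:v and -q:a go up to 10 (higher = better):
ffmpeg -i input.m4v -c:v libtheora -q:v 7 -c:a libvorbis -q:a 5 output.ogv
```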
[13:57] <bobdobbs> c_14: this was my last attempt. But I was really just taking parameters from pages on the net.
[13:57] <bobdobbs> https://pastee.org/c6njk
[13:58] <aabuild> Does this method work on mingw on Windows, or only on linux (ubuntu... etc..)? - https://github.com/qyot27/mpv/blob/extra-new/DOCS/crosscompile-mingw-tedious.txt
[13:58] <bobdobbs> c_14: I'm checking out that encoding guide now...
[13:58] <c_14> bobdobbs: first, you're not using ffmpeg from FFmpeg, you're using libav
[13:58] <c_14> Either build from source, grab the static build or if you're on debian grab ffmpeg from unstable
[13:59] <bobdobbs> oh dang. I just installed it from the default repos
[13:59] <c_14> Or ask in #libav for help
[14:02] <c_14> Though the gusari builds haven't been updated since july, so if you're going to go static use the johnvansickle builds
[14:03] <bobdobbs> I think I might actually retire for the night, and come back to the task in the morning
[14:03] <bobdobbs> still... this evening I have learnt something!
[14:04] <bobdobbs> not all ffmpeg is ffmpeg!
[14:04] Action: bobdobbs shakes fist at ubuntu
[14:05] <c_14> ffmpeg from FFmpeg is slowly being merged back into debian, but it'll take a while before it goes stable and linking programs against it will still be a pain
[14:05] <bobdobbs> hmmm
[14:18] <aabuild> are there many differences in encoding options between ffmpeg and avconv?
[16:08] <relaxed> aabuild: not really
[18:10] <compstomp> Hi all, I have a question about getting ffmpeg to compile into a binary with libraries baked in. I have successfully gotten ffmpeg to build with shared libraries (after having to add a path to /etc/ld.so.conf.d/ and then ldconfig-ing) but I would rather not have to make changes to the ld.so cache. Here is the input for my build: http://pastebin.com/jtknCqWg
[18:10] <compstomp> How might I set the configure options to include the libs in the executable? Thanks!
[18:11] <PovAddict> I have a video at 352x576 [SAR 24:11 DAR 4:3], how do I change the DAR to 16:9 without actually scaling, ie. leaving the resolution at 352x576?
[18:12] <kepstin-laptop> PovAddict: use the 'setsar' or 'setdar' filter
[18:12] <PovAddict> would -aspect work? (I *just* found it in the manpage)
[18:12] <kepstin-laptop> Not sure, it probably does the same thing as the setdar filter.
[18:13] <kepstin-laptop> (except it looks like you can use -aspect with -codec copy; that doesn't work with the filter)
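A sketch of both variants (file names assumed):

```shell
# -aspect works together with stream copy (no re-encode):
ffmpeg -i in.mpg -aspect 16:9 -c copy out.mpg
# The filter form forces a video re-encode, since filters and
# -codec copy are mutually exclusive for the filtered stream:
ffmpeg -i in.mpg -vf setdar=16/9 -c:a copy out.mpg
```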
[20:46] <sgtpepper> hey guys... is ffmpeg supposed to store a cookie that is set via header on an initial request, and replay it on subsequent requests?
[20:46] <sgtpepper> the source says yes
[20:47] <sgtpepper> google says no
[20:47] <sgtpepper> I'm talking about 2.3.4
[20:47] <c_14> Trust the source?
[20:47] <sgtpepper> c_14: well... its not working :S
[20:47] <sgtpepper> I call http://example/playlist.m3u8
[20:47] <sgtpepper> it sets a cookie
[20:48] <sgtpepper> playlist.m3u8 asks for chunlist.m3u8
[20:48] <sgtpepper> and when ffmpeg issues a GET
[20:48] <sgtpepper> the cookie is not there anymore
[20:49] <c_14> There was a patch about something with HTTP and auth that was applied on the 15th.
[20:49] <c_14> Might want to try with git.
[20:50] <sgtpepper> let me try the last precompiled binary
[20:53] <sgtpepper> c_14: nope
[20:53] <sgtpepper> doesn't work
[20:54] <c_14> Where in the source did you find it saying yes?
[20:56] <c_14> source/doku/wherever
[21:47] <Phlarp> http://pastebin.com/GTSVVg67
[21:47] <Phlarp> So I have this command
[21:48] <Phlarp> pretty simple, should just concat these three files
[21:48] <Phlarp> but it's failing
[21:48] <Phlarp> throwing an error on an audio stream-- but all three inputs have audio.
[21:49] <c_14> Can you ffprobe all 3 files and pastebin the result?
[21:49] <c_14> also, can you get rid of the double quotes around [v] and [a]
[21:51] <llogan> also the trailing : after a=1. a pastebin of the console output would help too
[21:54] <Phlarp> http://pastebin.com/EKp2mKCq
[21:54] <Phlarp> so it looks like i don't have audio on the first and last chunk
[21:55] <c_14> So it would seem.
[21:56] <Phlarp> http://pastebin.com/mFd0V0tm
[21:57] <c_14> Eh, no wonder you don't have any audio?
[21:57] <c_14> You used -an
[21:58] <llogan> you can use "-f lavfi -i aevalsrc=0|0:s=48000" to make a silent audio source if that's what you want (or add it to your filtergraph).
[21:59] <Phlarp> How would I pass through the audio stream the original file has
[21:59] <llogan> you can include it as an other input and reference it in your filtergraph
[22:00] <llogan> or remux the video and original audio into a new file
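Both options can be sketched like this (file names are placeholders):

```shell
# Add a silent stereo 48 kHz track as a second, generated input:
ffmpeg -i video_only.mp4 -f lavfi -i "aevalsrc=0|0:s=48000" \
       -c:v copy -c:a aac -strict -2 -shortest out.mp4
# Or remux the processed video with the original file's audio:
ffmpeg -i processed.mp4 -i original.mp4 -map 0:v -map 1:a -c copy out.mp4
```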
[22:17] <Phlarp> Got it working! thanks much for sending me down the right path.
[22:44] <zoch> Anyone here have experience with trimming a video stream as it comes in? (Being recorded)
[22:45] <c_14> "trimming" ?
[22:46] <zoch> Well what I want to do is have a video stream recording from a webcam, but erase anything older than 30 seconds unless you manually hit a record button. That way if you see something you can press record and save from 30 seconds ago until you're done, but otherwise it erases anything older than 30s so you don't fill up your hdd
[22:47] <kepstin-laptop> hmm, so basically recording to a circular buffer
[22:47] <c_14> You can probably accomplish something with the segment muxer
[22:47] <kepstin-laptop> you could do something like that by recording to e.g. 10s segments, and deleting old segments
[22:47] <c_14> You just need something that will save the segments when you "press record"
[22:47] <zoch> @kepstin, but what happens if you, say, press record at 5 seconds? would you have to combine the 5 second clip and the 10 second clip and grab the newest 10 seconds?
[22:48] <kepstin-laptop> zoch: the only thing pressing 'record' does is stop deleting old segments
[22:49] <kepstin-laptop> so the history will only be approximately 30 seconds long, it'll actually be "at least 30, but no more than 30 + the length of one segment"
[22:49] <zoch> hmm
[22:50] <zoch> so would you ever end up with more or less than 30 seconds before you hit record? (trying to wrap my head around it)
[22:50] <kepstin-laptop> you will usually end up with slightly more; never less (assuming you implement the deletion correctly)
[22:50] <zoch> I think that would work actually, now should I be researching circular buffer? or segment muxer like c_14 said
[22:51] <kepstin-laptop> segment muxer would be how to do it, yeah
[22:51] <zoch> cool cool. That was less complicated than I thought :)
[22:53] <kepstin-laptop> I think the segment muxer can run in two modes; one where it keeps all segments, one where it automatically deletes them; you probably want to have the segment muxer keep all segments, and have the external tool which handles the record button do the deletion or archiving of segments.
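A rough sketch of the scheme discussed above (device path, codec, segment length, and file naming are all assumptions):

```shell
# Record the webcam into numbered 10 s MPEG-TS segments:
ffmpeg -f v4l2 -i /dev/video0 -c:v libx264 -preset veryfast -g 50 \
       -f segment -segment_time 10 -reset_timestamps 1 buf_%05d.ts
# Separate cleanup loop: until "record" is pressed, keep only the
# newest four segments (>= 30 s of history with 10 s segments):
while sleep 5; do
    ls -1 buf_*.ts | head -n -4 | xargs -r rm --
done
```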
[22:53] <zoch> yeah that was what I was thinking
[22:54] <zoch> Do you know if, when running ffmpeg on something other than linux (android using external libraries), any features would be limited?
[22:54] <kepstin-laptop> I dunno if ffmpeg can capture directly from the cameras on android devices, i don't think so?
[22:55] <kepstin-laptop> they're not v4l, you have to go through some funky android interface. and you'd probably want to use a hardware video encoder on the device rather than software encoding in ffmpeg
[22:55] <kepstin-laptop> (you could still use ffmpeg to do segmenting in that case, with -codec copy)
[22:55] <zoch> well I believe it can be used like a webcam is used on a computer now. android wraps it in a video stream
[22:56] <zoch> does codec copy just copy a file in realtime as it is made?
[22:56] <kepstin-laptop> zoch: no; codec copy means "don't re-encode the video and audio"
[22:56] <zoch> oh
[22:56] <kepstin-laptop> so if the android camera interface gives you h264, you could pipe that to ffmpeg and have ffmpeg write it to split files
[22:57] <zoch> then yeah i would want that
[22:57] <zoch> exactly
[22:57] <zoch> cool thanks a lot man. Searching google was not very helpful :D
[22:58] <TheDracle> zoch, Are you using the native android mediarecorder?
[22:59] <TheDracle> The Android native camera interface gives you raw YUV frames.
[22:59] <zoch> I have not decided yet, I wanted to use the xamarin framework so I could use ffmpeg on ios and android, but I have not gotten into specifics yet, just making sure i can do it first
[22:59] <zoch> @thedracle, would that help me?
[22:59] <kepstin-laptop> if you have sufficient ram on the device, you could get more complicated by storing the 30s buffer in ram and only writing to disk when you hit 'record'; saves writes on the (generally low quality) flash on the phone :)
[23:00] <TheDracle> zoch, Well, MediaRecorder can create MP4 files directly on disk using the underlying hardware encoder of the system.
[23:00] <TheDracle> Yeah, and encoded video is a problem for circular buffering in this way, because you need to make sure you sync your buffer up on an I-frame.
[23:01] <TheDracle> So you don't cut off the stream with a bunch of P-frames that reference a non-existent I-frame.
[23:01] <zoch> @Dracle, yeah but I do not believe I can do that 30 second segmenting with MediaRecorder (could be wrong)
[23:01] <TheDracle> The right format for what you're trying to do would be something like MPEGTS, and setting a really short GOP size.
[23:01] <zoch> Interesting
[23:01] <TheDracle> MPEGTS is made for streaming, and you can easily truncate it in the way you're indicating.
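As a sketch, a short-GOP MPEG-TS encode of the kind TheDracle describes (input name and GOP size are assumptions):

```shell
# An I-frame every 15 frames keeps valid truncation points close together:
ffmpeg -i input -c:v libx264 -g 15 -f mpegts buffer.ts
```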
[23:02] <TheDracle> I'd use libav.
[23:02] <zoch> and thats for android right?
[23:02] <zoch> I will look into it, Gotta go to work, thanks guys!
[23:02] <kepstin-laptop> with smartphones tho, you *really* want to use the hardware encoder if you can, just due to battery life/heat issues :/
[23:02] <TheDracle> It's just a C library.
[23:03] <TheDracle> You can use the Android NDK to build native C stuff.
[23:03] <TheDracle> zoch, If you want, I've done a lot of programming with libAV and Android, let me know if you have questions.
[00:00] --- Thu Nov 20 2014

