[Ffmpeg-devel-irc] ffmpeg.log.20140814

burek burek021 at gmail.com
Fri Aug 15 02:05:01 CEST 2014


[00:17] <th0rne> If I have an m4a file with some bitrate x and want to convert it to ogg, is there some bitrate c*x I should aim for to get a reasonable balance between not too much extra lossiness and not too much larger a file? What would that factor c be? Or am I approaching this wrong?
[00:19] <sacarasc> You shouldn't reencode at all. You'll always lose something.
[00:30] <th0rne> But assume I have an m4a, need something other than m4a, and have no way around that.
[00:30] <th0rne> Some (additional) loss is acceptable to me.
[00:33] <sacarasc> There is no formula for doing it.
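A minimal sketch of one way to do such a conversion, using libvorbis's quality scale rather than picking a target bitrate (file names are placeholders; -q:a 5 lands at roughly 160 kb/s VBR):

    # Re-encode to Ogg Vorbis with a quality target instead of a fixed bitrate
    ffmpeg -i input.m4a -c:a libvorbis -q:a 5 output.ogg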
[00:57] <freeroute> hi, I'm just trying to flip an mp4 upside down but I end up getting errors - http://kpaste.net/9fea11 - I'm really noob at all of this, so could someone please help?
[01:00] <c_14> First of all, you're not flipping the mp4 upside down, you're rotating it counterclockwise. Second you're using Libav not FFmpeg; either use FFmpeg or ask #libav for support.
[01:02] <freeroute> c_14: should it be "ffmpeg -i <myvid> -vf transpose=0 <myvid-flipped>" ?
[01:02] <freeroute> also, how do I get ffmpeg on Ubuntu 12.04?
[01:02] <c_14> That or compile it yourself.
[01:03] <c_14> freeroute: Either use the hflip or vflip filter.
[01:03] <c_14> I never know which one stands for which.
[01:03] <c_14> I think it's horizontally across the horizon, so maybe hflip first?
[01:06] <freeroute> hmm, it seems to be giving me the same kind of errors. Seems that the format is not supported. I'll try downloading those ffmpeg stuff
[01:28] <freeroute> c_14: I downloaded the ffmpeg.static.64bit.latest.tar.gz and extracted it into the folder where I have all the videos
[01:28] <freeroute> somehow it doesn't appear to add the +x to it
[01:28] <freeroute> no matter what I do, chmod 777 ffmpeg or chmod +x ffmpeg
[01:28] <freeroute> is this normal?
[01:28] <c_14> Can you pastebin the output of ls -l ffmpeg ?
[01:29] <c_14> And the error output as well.
[01:30] <freeroute> http://kpaste.net/3c1ae02b
[01:30] <freeroute> not getting any error output when I do chmod
[01:30] <sacarasc> Is the partition mounted with exec?
[01:32] <freeroute> /dev/sda2 on /media/Windows7_OS type fuseblk (rw,nosuid,nodev,allow_other,default_permissions,blksize=4096)
[01:32] <freeroute> does this mean it's not?
[01:36] <c_14> try something along the lines of `mount -o remount,exec /dev/sda2 /media/Windows7_OS'
[01:38] <freeroute> c_14: I just did this command from my home Downloads dir - "./ffmpeg -i /media/Windows7_OS/films/galaxy/20140814_003236.mp4 -vf "vflip" /media/Windows7_OS/films/galaxy/20140814_003236_fl.mp4" and it worked
[01:38] <freeroute> but I'll keep that in mind next time I get weird permission problems
[01:38] <freeroute> I have still lots to learn
[01:41] <freeroute> for example, since I don't know how to bulk process the flipped videos I am doing this manually one by one
[01:41] <freeroute> I'm sure the command has this "for i=1" or something, but I haven't come to that yet
[01:41] <c_14> ffmpeg doesn't have that internally, but any decent shell should
[01:46] <freeroute> damn wifi
[01:47] <freeroute> c_14: yeah, I haven't had time to learn the shell yet properly, which is a shame really because I'm trying to get on the CLI as much as possible
[01:48] <sacarasc> for i in *.mp4; do stuff with "$i"; done
[01:54] <freeroute> sacarasc: so how do I say, for every modified file just add "_fl" to it before the extension so that the processed files look like filename_fl.mp4 ?
[01:55] <sacarasc> That's where I fail.
[01:55] <c_14> for i in *.mp4; do stuff with "$i" and name it "${i#.mp4}_fl.mp4
[01:55] <c_14> something like that
[01:55] <c_14> eh, with the trailing "; done of course
[01:56] <c_14> Eh '%' instead of '#'
[01:58] <c_14> `bash variable mangling' being the golden google search-phrase
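A sketch of the loop c_14 is describing, with the '%' fix applied (assumes all inputs are .mp4 files in the current directory):

    # Flip each video vertically and append _fl before the extension
    for i in *.mp4; do
        ffmpeg -i "$i" -vf vflip "${i%.mp4}_fl.mp4"
    done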
[02:01] <starPause> i'm able to cut a 30 second (or less) chunk out of a movie, but i want to split an entire movie into several 30 second (or less) chunks with a single ffmpeg command. is that possible or do i need to write a script that calls several ffmpeg commands?
[02:02] <c_14> the segment muxer is your friend
[02:02] <c_14> (probably)
[02:02] <freeroute> c_14: I will study that command and that search term
[02:02] <freeroute> one day :p
[02:03] <c_14> starPause: `-f segment -segment_time 30' should do it
[02:03] <c_14> Read the man page for more fun tidbits.
[02:03] <starPause> c_14: thanks for the help will check that
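A sketch of the segment invocation c_14 suggests (file names and the stream-copy choice are assumptions):

    # Split input.mp4 into roughly 30-second chunks without re-encoding;
    # with -c copy the cuts can only land on keyframes, so chunk lengths are approximate
    ffmpeg -i input.mp4 -c copy -f segment -segment_time 30 -reset_timestamps 1 chunk_%03d.mp4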
[02:34] <sanjose_kid__> i am trying to add packet-loss concealment capability to ffmpeg's RTP stack
[02:34] <sanjose_kid__> i am using the RTP networking stack for low-latency video-conferencing
[02:34] <sanjose_kid__> By packet loss concealment, I mean that whatever the sender sends, the receiver needs to receive, independent of network conditions.
[02:34] <sanjose_kid__> does anyone know how I can do this?
[02:44] <sanjose_kid__> https://www.ffmpeg.org/ffmpeg-protocols.html
[02:44] <sanjose_kid__> this doc doesn't seem to include packet-loss concealment
[03:00] <icecube45> hello! I'm trying to stream my webcam via ffmpeg, however, I get the error: Cannot find a proper format for codec 'none' (id 0), pixel format 'none' (id -1)
[03:00] <icecube45> Assertion *codec_id != AV_CODEC_ID_NONE failed at /build/buildd/ffmpeg-1.2.6/libavdevice/v4l2.c:868
[03:27] <kode54> for %i in (*.mp4) do ffmpeg -i "%i" -c:a copy -vn "%i.audio.m4a"
[03:28] <kode54> oops
[03:28] <kode54> didn't realize I was scrolled up
[03:37] <freeroute> kode54: your mistake is my learning vehicle :p
[03:37] <freeroute> I'll study this command as well
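A rough bash equivalent of kode54's Windows batch one-liner, stripping the .mp4 extension before appending the new suffix:

    # Extract the audio track of every mp4 without re-encoding it
    for i in *.mp4; do ffmpeg -i "$i" -vn -c:a copy "${i%.mp4}.audio.m4a"; done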
[03:49] <kode54> interesting test file
[03:49] <kode54> http://www.hydrogenaud.io/forums/index.php?showtopic=106570&hl=
[03:49] <kode54> wmal2pcm.exe manages to decode 4 samples from the packet within, disregarding that the ASF header specifies both a duration and a pre-roll of 3 seconds
[03:50] <kode54> the stream itself contains a single packet of WMAL data
[03:50] <kode54> the decoder eats the first 2 bytes as a stream reset
[03:50] <kode54> then eats the other 13KB or so, outputting nothing
[04:55] <Zhenya> Hi Everybody. I am piping a mp4 file and always get Error setting option pix_fmt to value -1.
[04:55] <Zhenya> I am trying to put the -pix_fmt option in front of '-i inputfile', but it doesn't seem to work
[04:56] <Zhenya> By the way, the ffmpeg version I was using is the latest one from git repo
[04:59] <Zhenya> And the mp4 video clip is from a Samsung tablet.
[05:01] <Zhenya> The command I use: $ cat b2.mp4 | ./ffmpeg -pix_fmt yuva422p -i pipe:0  a.mp4
[05:02] <Zhenya> The video is at: https://www.dropbox.com/s/oz0e51ggkbpgj0f/b2.mp4
[05:05] <relaxed> Zhenya: why would you be piping it?
[05:06] <Zhenya> As the video will be a live stream
[05:06] <Zhenya> The video will be in record when I have to parse it
[05:07] <relaxed> pastebin the output of ffmpeg -i b2.mp4
[05:09] <Zhenya> relaxed: here it is http://pastebin.com/8hNdLeQ2
[05:09] <Zhenya> relaxed: unspecified pixel format...  How can I specify one?
[05:34] <relaxed> that's not what I asked for.
[05:47] <Zhenya> relaxed: sorry, 1 second
[05:48] <Zhenya> relaxed: http://pastebin.com/a07HXfJ1
[05:55] <relaxed> did you try? --> cat b2.mp4 | ./ffmpeg -i pipe:0  a.mp4
[05:56] <Zhenya> Yeah, that's exactly what I run
[05:57] <relaxed> no, you had -pix_fmt yuva422p in there earlier
[05:58] <Zhenya> I added that option when the original command didn't work..
[05:58] <Zhenya> see http://pastebin.com/8hNdLeQ2
[05:58] <Zhenya> relaxed: see http://pastebin.com/8hNdLeQ2
[05:59] <relaxed> cat b2.mp4 | ./ffmpeg -pix_fmt yuv420p -i pipe:0  a.mp4
[05:59] <relaxed> the output also says, "Consider increasing the value for the 'analyzeduration' and 'probesize' options"
[06:00] <Zhenya> This is mine: http://pastebin.com/vCwXQJVF
[06:00] <Zhenya> Option pixel_format not found.
[06:05] <Zhenya> Do you know what's the difference between isomavc1 and isom3gp4?
[06:08] <relaxed> cat b2.mp4 | ./ffmpeg -probesize 10M -analyzeduration 10M -i pipe:0 a.mp4
[06:10] <Zhenya> let me try
[06:11] <Zhenya> same error.. http://pastebin.com/98Z2etAL
[06:12] <Zhenya> you can see in the output: minor_version   : 0, I have another file with minor_version   : 1. which can be piped
[06:15] <relaxed> you should really build it without --disable-yasm; with yasm disabled, encoding will be much slower
[06:16] <relaxed> build the latest release and see if it still happens
[06:16] <Zhenya> ok, got it. I will recompile it later.
[06:16] <Zhenya> let me do a git pull
[06:17] <Zhenya> relaxed: are you a ffmpeg developer?
[06:19] <relaxed> no
[06:19] <Zhenya> relaxed: I am recompiling
[06:25] <Zhenya> relaxed: no, it doesn't work. I think something is wrong with the file format
[06:27] <relaxed> it's because the moov atom, or index, isn't in the front
[06:27] <Zhenya> yeah
[06:28] <Zhenya> I was trying to specify them in parameters
[06:28] <Zhenya> via parameters
[06:28] <relaxed> it works if I pipe it after running it through mp4box
[06:29] <Zhenya> -pix_fmt doesn't work for the input stream
[06:30] <Zhenya> what is mp4box?
[06:30] <relaxed> Zhenya: you need the app to record the video in the flv container or something else that is pipeable
[06:30] <Zhenya> what's your command in shell?
[06:30] <Zhenya> is avi pipeable?
[06:31] <Zhenya> I am actually building an android app and want to apply subtitles in real time, otherwise it may take too long
[06:32] <Zhenya> so I'm piping the file to ffmpeg while it's being recorded, to apply the subtitles
[06:33] <Zhenya> Do you know any fast/quick way to apply subtitles?
[06:33] <Zhenya> hopefully by just merging the two files into one ball (mp4 file and subtitle file)
[06:36] <relaxed> hardsubs or soft?
[06:37] <Zhenya> i am using ffmpeg subtitle
[06:37] <Zhenya> which is really slow
[06:37] <Zhenya> to burn subtitle over video
[06:37] <Zhenya> if soft is ok?
[06:38] <Zhenya> Now I am doing it hard
[06:38] <Zhenya> hard subtitle
[06:38] <relaxed> libass might work, but I'm not sure how you would do that on the fly
[06:39] <Zhenya> my plan is to use pipe feature to split file into pieces, then apply them one by one, then merge them
[06:39] <Zhenya> for example 10s a file
[06:40] <Zhenya> something like: tail -f b.mp4 | ffmpeg -ss x:x:x:2.2.2 -i pipe:0
[06:46] <Zhenya> -ss so quick
[06:47] <Zhenya> If i can do the pipe....
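A sketch of the two subtitle routes relaxed asks about, with placeholder file names; the soft route is the "merge two files into one ball" approach Zhenya describes:

    # Soft subs: mux an .srt into the mp4 as a subtitle track, stream-copying audio and video (fast, no quality loss)
    ffmpeg -i input.mp4 -i subs.srt -c:v copy -c:a copy -c:s mov_text output.mp4

    # Hard subs: burn the subtitles into the picture, which forces a video re-encode (slow)
    ffmpeg -i input.mp4 -vf subtitles=subs.srt output_hardsub.mp4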
[09:11] <termos> what exactly do I need so that moov atom is written at the beginning of the file? I keep getting "moov atom not found"
[09:12] <relaxed> termos: it probably means the index in your mp4 is absent, which happens when you have an incomplete file
[09:13] <termos> I thought there would be a workaround for that, I tried with -movflags +faststart but no success
[09:15] <relaxed> are you interrupting the encode? why is it incomplete?
[09:19] <termos> yes i'm interrupting it, I want to make sure that the file is playable even if it's interrupted
[09:20] <relaxed> try using "killall -INT ffmpeg" instead of ctrl+c or "q"
[09:22] <relaxed> I think with the other ways you're stopping ffmpeg before it has a chance to use -movflags +faststart
[09:29] <termos> hmm I see
[09:30] <termos> I can't really control how the program ends; it's just that I'm transcoding really big files and I want to make sure that the file is playable even if the program crashes
[09:30] <termos> without running it through a second pass or some qt-faststart utility
[09:31] <termos> similar to how it works if I write flv to file, then it can end at any time
[09:55] <relaxed> Is there some reason you need mp4? If not, go with matroska
[10:19] <termos> not really any reason, mkv seems to work a lot better
[10:38] <Zhenya> relaxed: Do you know how to set a subtitle stream to be on by default?
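One possible answer, assuming the first subtitle stream in the file is the one to enable:

    # Mark the first subtitle stream as default so players turn it on automatically
    ffmpeg -i input.mp4 -c copy -disposition:s:0 default output.mp4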
[10:40] <Mavrik> termos, the moov atom cannot be generated before the full file (or segment) is written
[10:40] <Mavrik> so what you're asking cannot be done
[10:40] <Mavrik> do qt-faststart as a second pass.
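A sketch of that second pass on a file whose encode has already finished (file names are placeholders); the qt-faststart tool shipped with ffmpeg does the same moov relocation:

    # Remux only, no re-encode: the moov atom is moved to the front of the file
    ffmpeg -i finished.mp4 -c copy -movflags +faststart faststart.mp4
    # or equivalently
    qt-faststart finished.mp4 faststart.mp4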
[11:10] <ghospich> Anyone faced with this issue? https://trac.ffmpeg.org/ticket/3852#ticket
[11:17] <ubitux> ghospich: http://trac.ffmpeg.org/ticket/2067
[11:26] <ghospich> thanks, i'll try
[13:41] <baidoc2> I'm trying to get the frames of a video frame by frame, it works fine when I write the frames as files, but now I want to redirect the frames to a pipe
[13:41] <baidoc2> now how can I know when a frame is complete, to step to the next frame?
[13:41] <baidoc2> because atm what i see is a bytestream
[13:53] <Mavrik> baidoc2, well you know the frame size
[13:54] <baidoc2> how can I know it?
[13:55] <Mavrik> width * height * bits_per_pixel / 8 bytes per frame
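A sketch of what that looks like when piping raw frames; the pixel format, dimensions and the consumer program are assumptions:

    # Pipe raw rgb24 frames to stdout: each frame on the pipe is exactly width*height*3 bytes
    # (e.g. 1920*1080*3 = 6220800 bytes per frame); your_program is a hypothetical consumer
    ffmpeg -i input.mp4 -f rawvideo -pix_fmt rgb24 pipe:1 | your_program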
[16:11] <Nopik> hi, I'm trying to change the pixel format, I'm doing ffplay -i file.mpg -pixel_format rgba, but it says "Option pixel_format not found.", I tried several approaches, none of which seemed to work, any ideas how I should format the argument?
[16:13] <Keshl> I use "-pix_fmt", oÉo.
[16:15] <Nopik> hm, and when i tried that on ffmpeg, it worked first time. Well, 'worked'. No error thrown, but the resulting file is of wrong format
[16:57] <sfan5> !pc Nopik
[16:58] <cbenhagen> hi, i am trying to decode a raw frame from a phantom flex 4k camera. i am somehow unable to set the correct codec_tag on the commandline. rawvideo (BIT[0] / 0x544942) is what i need (http://git.videolan.org/?p=ffmpeg.git;a=commit;h=97bb0076c5bb1b30a33b911f8b92ff1c11b7ffb5) but rawvideo ([186]GB[16] / 0x104247BA) is what gets set.  how do i set this codec_tag?
[17:02] <cbenhagen> http://pastebin.com/vyhgAHYC
[17:04] <cbenhagen> this is what i get from the same frame in cine format. eg with a cine header http://pastebin.com/udvm0Qu9
[17:07] <cbenhagen> actually that was the output with cine header. sorry for the confusion: http://pastebin.com/L2d7s91V
[17:24] <Phlarp> Can anyone assist me in consolidating these commands together? I feel like I am (unnecessarily) encoding things multiple times.  http://pastebin.com/2k9ykEze
[17:34] <sfan5> Phlarp: without figuring everything out you could "concat" the commands
[17:34] <sfan5> like this:
[17:36] <sfan5> ffmpeg <args> -c:v rawvideo -c:a pcm_s16le -f matroska - | ffmpeg -f matroska -i - <more args> final.mp4
[17:37] <c_14> Phlarp: just take all those filterchains and append them to each other making one nice large complex filtergraph
[17:39] <Phlarp> c_14: How do I do this? I understand the idea, but I'm having trouble with the syntax, nothing I try seems to work
[17:42] <c_14> I don't feel like doing all the writing so I'm going to be shortening a lot: -filter_complex '[0:v][1:v]overlay[..][o1];[o1]drawtext[..][o2];[o2][1:v]overlay[..][o3];[0:a]amix[..][a]' -map '[a]' -map '[o3]' finished.mp4
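A filled-in toy version of that pattern, not Phlarp's actual commands; the inputs, coordinates, text and font path are placeholders:

    # Two filterchains joined into one graph: overlay a logo, then draw text on the result
    ffmpeg -i main.mp4 -i logo.png -filter_complex \
      "[0:v][1:v]overlay=10:10[v1];[v1]drawtext=fontfile=/path/to/font.ttf:text='hello':x=20:y=20[vout]" \
      -map "[vout]" -map 0:a out.mp4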
[17:46] <j53> hello, can someone help me
[17:47] <j53> im trying to encode 1080i 25fps source video to 720p at 50fps
[17:47] <j53> i have two issues, if i add cropping to my command line it works, but the video is 25fps
[17:48] <j53> if i remove cropping from the command line, it is 50fps, but there is no crop
[17:48] <j53> ok
[17:50] <j53> http://pastebin.com/gQTUfgqB this is my script im working on
[17:50] <j53> i will run it on a test video for console output, just a sec
[17:50] <Phlarp> c_14: I got them strung together. Thank you for the example, this makes much more sense now.
[17:51] <c_14> j53: you can't have more than one -vf in a single command line, put both filters in the same -vf and separate with commas
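A sketch of the combined filter for j53's 1080i-to-720p case; the crop numbers are placeholders and the yadif deinterlace is an assumption, since the original script isn't reproduced here:

    # One -vf, filters comma-separated: deinterlace to 50 fps (one frame per field), crop, scale to 720p
    ffmpeg -i input.ts -vf "yadif=1,crop=1920:1040:0:20,scale=-2:720" output.mp4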
[18:43] <allengates> what's the best audio noise reduction library you guys know of?
[18:55] <Keshl> I'm not sure what it's called, but whichever one Audacity uses is _really_ good, provided you have roughly 15 seconds of nothing but ambient noise.
[19:03] <allengates> ok, I will try.
[19:05] <rp__> Hi, anybody knows how to record live streams with current timestamps (as metadata) and then seek stream by passing timestamp as start position?
[19:05] <rp__> I'm trying to set up streaming server with archive playback functionality in it.
[19:08] <rp__> According to ffserver docs (https://www.ffmpeg.org/ffserver.html) it should be easy configurable but I've been stuck on this for a while. ("ffserver is a streaming server for both audio and video. It supports several live feeds, streaming from files and time shifting on live feeds. You can seek to positions in the past on each live feed, provided you specify a big enough feed storage.")
[19:19] <tomjscott> Is there licensing information covering the distribution and use of the pre-built ffplay.exe app? I only see licensing regarding compiling the source into another app.
[20:02] <Jaxan> I have a question about memory management. If I understand correctly, when decoding, the AVFrame has a buffer. This buffer is allocated by calling avcodec_decode_video2, but AVCodec owns it and destroys it, for example on avcodec_close. Right?
[20:02] <Jaxan> And when AVCodecContext.refcounted_frames is set, I am the owner. Why was AVCodec designed to be the owner? performance?
[20:16] <Jaxan> on a sidenote, is there any high-level documentation about these kinds of things? I am particularly interested to see a list of 'open/close' or 'alloc/free' pairs, or some other lifetime-related documentation
[20:45] <fajung> is there a way to add a cover image to an output.mp4 ?
[22:05] <Nopik> what should be the syntax to read file, change its pixel format to bgra, run that through filter, then output mpeg? I'm using ffmpeg -i file.mov -vcodec rawvideo -pix_fmt bgra -vf nopik out.avi -> works fine, produces bgra avi uncompressed. But the moment I change out.avi to out.mpg ffmpeg can't encode mpg anymore, getting me lots of buffer underflow errors
[22:06] <Nopik> it works fine if i drop pix_fmt, but in such case the filter does receive yuv420
[22:06] <smo_> is it possible to do for example
[22:07] <smo_> ffmpeg -i http://mirrorblender.top-ix.org/peach/bigbuckbunny_movies/big_buck_bunny_720p_stereo.avi -sn -c:v libx264 -c:a libvorbis -f matroska bbunny.mp4
[22:07] <smo_> then start vlc immediately to read the bbunny.mp4 AND get the original video length
[22:07] <smo_> when i do things like that my players always report NAN:NAN or 00:00 as length
[22:07] Last message repeated 1 time(s).
[22:07] <Nopik> ah, nevermind, I've been also using -vcodec rawvideo, when I dropped it and I dropped -pix_fmt it worked, since my filter already negotiated rgba
[22:07] <smo_> so no seek nothing :(
[22:07] <sfan5> smo_: why are you putting matroska into a .mp4 file?
[22:08] <smo_> normally i use libopus for audio, not vorbis
[22:08] <sfan5> that's not what I mean
[22:08] <sfan5> you are putting data using the matroska file format into an .mp4
[22:09] <smo_> what  must i use
[22:09] <sfan5> either put mp4 into mp4
[22:09] <sfan5> or use an mkv container
[22:09] <sfan5> just omit the -f option
[22:09] <sfan5> ffmpeg can auto-select the format
[22:10] <smo_> will fix my NAN:NAN problem ?
[22:10] <sfan5> probably not
[22:38] <Nopik> how to play 2 streams simultaneously with single ffplay? e.g. I have in1.mpg and in2.mpg I would like to play them synchronously with single command. any visualization will do, e.g. 2 ffplay windows, or one window bigger, whereas those 2 videos are next to each other
[22:49] <c_14> overlay filter
[22:49] <Nopik> yeah, got it, thanks
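One possible single-ffplay approach via the lavfi device, assuming both inputs have the same dimensions:

    # Pad the first movie to double width, then overlay the second in the right half
    ffplay -f lavfi "movie=in1.mpg,pad=iw*2:ih[bg];movie=in2.mpg[ov];[bg][ov]overlay=main_w/2:0"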
[22:50] <Nopik> can i force mpg/avi encoder to use something else than yuv420p? it seems to ignore all my tries to do so
[22:50] <c_14> pixel formats usually depend on the codec not the container
[22:54] <nr08> Quick question on audio conversion. I am running the following command: "ffmpeg.exe -i audio.mp3 -acodec pcm_s16le audio.wav". The initial audio.mp3 file is roughly 10 MB with a 128 kb/s bitrate. The output does not change the bitrate to the expected 16 kb/s. Is there something I am missing? NOTE: I have also attempted adding "-ab 16k" and "-b:a 16k"
[22:54] <nr08> " options and this does not solve the issue either.
[22:55] <Nopik> c_14: thanks
[22:58] <c_14> nr08: Going by my understanding of pcm, the bitrate should be 689kb/s
[22:58] <c_14> Assuming a sampling rate of 44100 Hz
[22:59] <nr08> Executing "ffmpeg.exe -formats" displays the "pcm_s16le" format as PCM signed 16-bit little-endian. That's specified within the specific codec I am using to encode the new file.
[23:01] <c_14> PCM 16 bits per sample times 44100 samples per second
[23:01] <c_14> Though it could be 15 bits per sample with one bit being reserved for the sign
[23:02] <c_14> In both cases you should get around 600+ kbps
[23:04] <nr08> In an attempt to create the desired output (8000 Hz, 1 channel, 16 kb/s), I also tried "ffmpeg.exe -i audio.mp3 -ac 1 -ar 8000 -ab 16k audio.wav" and the bitrate did not change.
[23:05] <nr08> If this helps any, I am attempting to convert any mp3 files to a .wav format supported by asterisk. The specification in my previous message is what asterisk wants, in a PCM format.
[23:06] <c_14> Like I just said, the bitrate of pcm audio depends directly and _only_ on the sampling rate and bit depth. For 16kbps audio with a bit depth of 16 bits per sample, you need a sampling rate of about 1024 Hz.
[23:10] <nr08> I see. Thank you for your patience.
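For reference, uncompressed PCM bitrate is simply sample rate x bit depth x channels, so the 8000 Hz mono 16-bit wav that asterisk usually wants works out to 128 kb/s, not 16 kb/s:

    # 8000 Hz * 16 bits * 1 channel = 128 kb/s
    ffmpeg -i audio.mp3 -ar 8000 -ac 1 -c:a pcm_s16le audio.wav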
[23:54] <ghospich> Yay, finally found nice workaround for subtitles in fast seek mode!
[23:55] <ghospich> -vf setpts=PTS+60/TB,subtitles=sub.srt,setpts=PTS-STARTPTS
[23:57] <ghospich> Crazy, but working. The workaround provided in #2067 has some quality issues for me, but this one works nicely.
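A sketch of how that chain fits into a full fast-seek command; the 60-second seek point and file names are assumptions:

    # -ss before -i seeks fast but resets timestamps to 0, so the chain shifts them forward,
    # renders the subtitles at the right times, then shifts them back
    ffmpeg -ss 60 -i input.mp4 -vf "setpts=PTS+60/TB,subtitles=sub.srt,setpts=PTS-STARTPTS" out.mp4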
[00:00] --- Fri Aug 15 2014

