[Ffmpeg-devel-irc] ffmpeg.log.20150120

burek burek021 at gmail.com
Wed Jan 21 02:05:01 CET 2015


[03:58] <greves_> can anyone help me out here? I'm trying to add cover art with ffmpeg via pydub (a python wrapper for ffmpeg)
[03:59] <greves_> i'm having trouble with the -map command(s?) and if i need some other metadata argument or something?
[04:02] <c_14> https://ffmpeg.org/ffmpeg-formats.html#mp3
[04:02] <c_14> That's for mp3
[04:04] <greves__> yeah i was looking at that already, let me paste the command pydub is giving to ffmpeg
[04:07] <greves__> ah i think i'm missing -metadata:s:v, what is that?
[04:08] <c_14> It writes id3 tags. Probably so that the image is detected as cover art
[04:12] <greves__> so what is the difference between -metadata and -metadata:s:v ?
[04:12] <c_14> -metadata is global metadata, -metadata:s:v is stream metadata for the video stream
[04:15] <greves__> http://pastebin.com/tqW8fwbf
[04:16] <c_14> wait
[04:16] <c_14> what
[04:16] <greves__> thats what pydub is sending to the command line
[04:16] <greves__> it all gets joined together in one string
[04:16] <c_14> You only have 2 inputs, why are you mapping the 3rd and 4th input?
[04:17] <greves__> oh i thought it was argument position
[04:17] <greves__> so i should do map 0 1 ?
[04:17] <c_14> yes
[04:18] <greves__> no, still nothing
[04:18] <c_14> what
[04:19] <c_14> Why do you have id3v2_version 3 and 4 ?
[04:19] <c_14> Is that even valid?
[04:19] <greves__> oh right that's a pydub thing, let me fix that
[04:20] <c_14> and the (Front) is being lost
[04:21] <greves__> http://pastebin.com/zqYzXSDE
[04:21] <greves__> oh do i actually need (Front)?
[04:21] <c_14> You might?
[04:22] <greves__> ok, i just tried that it doesn't work either
[04:23] <greves__> actually since i started doing the -i cover.jpg -c copy -map x -map y, my mp3 output is blank, 0 bytes
[04:24] <c_14> Is there any ffmpeg output?
[04:24] <greves__> hmm i'm not sure how to access that since it's wrapped in pydub...
[04:25] <greves__> maybe let me try running the command direct from terminal
[04:30] <greves__> wow this is so hard lol i can't even type the command
[04:33] <greves__> i think ffmpeg thinks my image is an audio stream
[04:35] <greves__> http://pastebin.com/sJh0F5Sv
[04:35] <greves__> does anything there help?
[04:36] <c_14> >[mp3 @ 0000000002c20900] Invalid audio stream. Exactly one MP3 audio stream is required.
[04:37] <c_14> Something's b0rked
[04:37] <c_14> oh
[04:37] <c_14> You can't put pcm into mp3
[04:37] <c_14> You'll need to encode it
[04:37] <c_14> derp
[04:37] <c_14> change -c copy to -c:a libmp3lame
[04:38] <greves__> wow
[04:38] <greves__> how did you do that
[04:38] <greves__> it's like magic
[04:39] <greves__> thanks a lot :)
[04:40] <c_14> npnp
[04:40] <greves__> how would i have learned that from the docs?
[04:40] <greves__> like, which section?
[04:41] <greves__> wow my other computer is going haywire
[04:42] <c_14> That error message
[04:43] <c_14> Then looking at the bottom near output0 stream 0 and noticing that it's pcm and not mp3
[04:43] <c_14> The Could not write header is usually also indicative.
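For reference, c_14's pointers combine into a command along these lines, following the mp3 muxer documentation linked above (filenames are placeholders; this is a command fragment, not a runnable script):

```shell
# Input 0 is the audio, input 1 the cover image; -map selects both.
# pcm can't be copied into mp3, so the audio is encoded with libmp3lame.
ffmpeg -i input.wav -i cover.jpg \
  -map 0 -map 1 \
  -c:a libmp3lame -c:v copy \
  -id3v2_version 3 \
  -metadata:s:v title="Album cover" \
  -metadata:s:v comment="Cover (front)" \
  output.mp3
```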
[07:08] <k_sze> Is there a particular (video) container file format that supports simultaneous reading and writing (as long as all reading and writing are sequential)?
[07:14] <BtbN> should work for mpegts and flv
[07:14] <BtbN> depends a lot on the player though
[07:14] <BtbN> if it just scans the length once at startup, it will stop playing once it reaches that point
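A quick way to try this behaviour (paths and the player check are illustrative command fragments): write a growing MPEG-TS file in one process and start a player on it before the write finishes.

```shell
# Remux into a growing .ts file in real time, in the background...
ffmpeg -re -i input.mp4 -c copy output.ts &
# ...and read it while it is still being written:
ffplay output.ts
```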
[07:36] <tmh1999> Hi guys, I hope someone could help me :D. I want to rescale/resize a video. So as far as I understand, the procedure is : 1. Open video file, read video stream. 2. Read packet from that stream. 3. Decode the packet into frames. Resize those frames. 4. Encode those frames into packets. 5. Write those packets to new file.
[07:36] <tmh1999> Is that correct? Thanks.
[07:37] <BtbN> tmh1999, you want to do that with the ffmpeg cli tool, or by using libav*?
[07:38] <tmh1999> BtbN : I am using ffmpeg API, none of them.
[07:38] <tmh1999> oh
[07:38] <tmh1999> sorry
[07:38] <tmh1999> libav*
[07:38] <tmh1999> misunderstanding your question
[07:41] <tmh1999> as far as I know I use libavcodec, libavformat, libavswscale
[07:41] <BtbN> basically you use libavformat to access the container, use libavcodec to decode it, scale it with libavfilter, encode it with libavcodec, and then mux it again with libavformat.
[07:41] <BtbN> Ah, yes. libswscale for scaling. Not avfilter. But yes, you got the idea.
[07:42] <BtbN> might still need libavfilter for stuff like deinterlacing. At least that's where i think that stuff is.
[07:42] <tmh1999> So, my procedure above is right?
[07:42] <BtbN> I'd personally just use the ffmpeg cli tool.
[07:43] <tmh1999> wow, that's new, deinterlacing?
[07:43] <BtbN> If the source is interlaced, that would be a good idea, yes
[07:43] <tmh1999> TBH I am new to both ffmpeg and video processing in general.
[07:43] <BtbN> Or you need to use interlacing-aware scaling
[07:44] <BtbN> With the ffmpeg cli tool, scaling a video is one of the simpler tasks.
[07:44] <BtbN> Doing that manually via libav* requires quite a lot of work.
[07:45] <tmh1999> Actually, I want to call the ffmpeg/libav* API in my threads. If I use the ffmpeg CLI like system("ffmpeg ..."); that would be a separate process
[07:48] <tmh1999> OK I will consider your opinion.
[07:48] <tmh1999> Thanks :)
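For comparison, the CLI route BtbN suggests collapses the whole open/decode/scale/encode/mux pipeline into one command (filenames are placeholders; drop yadif for progressive sources — a command fragment, shown for illustration):

```shell
# Demux, decode, deinterlace + scale, re-encode, and remux in one step.
ffmpeg -i input.mp4 -vf "yadif,scale=1280:720" -c:v libx264 -c:a copy output.mp4
```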
[08:43] <Elirips> Hello everyone. I have an app that creates one frame per second. On Startup it creates an ffmpeg process and then pipes those frames to ffmpeg, instructing ffmpeg to create a rtmp stream with 5 fps from the single frame.
[08:44] <Elirips> I use ffmpeg like this: 'ffmpeg.exe -framerate 1 -i - -an -c:v libx264 -preset fast -profile baseline -pix_fmt yuv420p -x264opts keyint=5 -r 5 -s 260x208 -f flv rtmp://<some-ip>/streams/name'
[08:44] <Elirips> I'm wondering: What would happen if my input-source to ffmpeg does not provide 1 fps reliable? For example, if the input source has a delay, and does not provide a frame for 3 seconds. will ffmpeg then just write out 15 frames of the last image passed to ffmpeg?
[08:45] <Elirips> or will ffmpeg 'do nothing' and wait for the next frame, as I told it that it will receive 1fps?
[08:53] <sfan5> Elirips: iirc ffmpeg waits for input
[08:53] <Elirips> sfan5: mmhh. this is bad for me :P
[08:54] <Elirips> Is it possible to instruct ffmpeg something like 'if you do not get the framerate reported on the command line, "pad" with the last frame'?
[08:54] <sfan5> Elirips: you could have a different thread that passes frames to ffmpeg
[08:55] <sfan5> and that thread would then write the same frame if there isn't a new one
[08:55] <Elirips> sfan5: you're right, that would maybe be the way to go
[08:55] <Elirips> But I would need to implement quite some timing stuff
[08:55] <sfan5> actually ffmpeg does not get any "frame rate", ffmpeg just tries to read as many frames as possible and encodes/copies/whatever-you-instructed-it-to-do them
[08:56] <Elirips> but what is then the effect of the parameter '-framerate 1' for the input?
[08:57] <sfan5> that declares how many frames per second the input has
[08:57] <sfan5> that is needed for playback timing
[08:58] <Elirips> I see. I will add some tests to see what's going on
[09:00] <sfan5> Elirips: maybe my explanation was confusing; ffmpeg doesn't try to read n frames per second, it reads as many frames per second as it manages to process
[09:01] <Elirips> sfan5: and if I specify -r 5 on the output, it will try to make an output of 5fps. So, in my case, where I send the frames using a pipe, I could omit the -framerate 1 parameter, no?
[09:01] <sfan5> depends
[09:01] <sfan5> is the input in a container?
[09:01] <Elirips> the input are single bitmaps
[09:02] <sfan5> i don't know what ffmpeg defaults to in that case
[09:02] <Elirips> anyway thx for the hints. I'll try to add a text displaying the current frame / timestamp on the created video, then I can start testing more
[09:02] <sfan5>     Stream #0:0: Video: bmp, bgr24, 454x454, 25 tbr, 25 tbn, 25 tbc
[09:02] <sfan5> seems like bitmaps have no framerate at all
[09:03] <Elirips> Where did you find that?
[09:03] <sfan5> which makes sense considering that they are static images
[09:03] <sfan5> ffmpeg -i somebitmap.bmp
[09:05] <Elirips> Hm, what would happen if I transcode one single bitmap into an infinite rtmp-stream? Well I can test that on my own
[09:05] <Elirips> I regret that I was too lazy to implement the pipe to read from ffmpegs stderr to my app :P
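For the pipe setup discussed above, a sketch of the full invocation (the cat loop stands in for the real frame producer, and the original's elided RTMP address is kept as-is; a command fragment, not runnable on its own):

```shell
# image2pipe reads concatenated images from stdin; -framerate 1 declares
# the input rate, and -r 5 duplicates frames to reach the 5 fps output.
cat frames/*.bmp | ffmpeg -f image2pipe -framerate 1 -i - \
  -an -c:v libx264 -preset fast -pix_fmt yuv420p -r 5 \
  -s 260x208 -f flv rtmp://<some-ip>/streams/name
```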
[09:07] <Elirips> If only ffmpeg could tell me what I'm doing wrong with my -vf
[09:14] <Zombie> Does anyone know how to use de-interlacing filters to de-interlace video the way VLC does?
[09:14] <sfan5> Zombie: what is "the way vlc does"?
[09:15] <BtbN> vlc has its own deinterlacing code. It doesn't use ffmpeg.
[09:16] <BtbN> It has a lot of different deinterlacing algorithms.
[09:16] <sfan5> last time i looked at man ffmpeg-filter i found a lot of ways to deinterlace video
[09:16] <BtbN> yes, ffmpeg itself also has deinterlacing support
[09:16] <BtbN> but vlc doesn't use that.
[09:20] <Zombie> I'm trying to de-interlace a batch of VHS Tapes.
[09:51] <Zombie> Hello?
[10:04] <relaxed> Zombie: are you using the yadif filter?
[10:04] <Zombie> I have done one video that way.
[10:04] <Zombie> Is that optimal?
[10:04] <relaxed> yes, how did the output look?
[10:05] <Zombie> I tried it once, and there was an improvement.
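relaxed's yadif suggestion as a complete command for a captured tape (filenames and the CRF value are placeholders; a command fragment for illustration):

```shell
# Default yadif emits one frame per input frame; yadif=1 would instead
# emit one frame per field, doubling the frame rate.
ffmpeg -i tape_capture.mkv -vf yadif -c:v libx264 -crf 18 -c:a copy tape_deint.mkv
```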
[10:38] <galba> hi, is there a simple command to unshake (stabilize) a video?
[10:42] <BtbN> ffmpeg -i input.mp4 -filter:v deshake -c:a copy -c:v libx264 ..... output.mp4
[10:49] <galba> thanks, I just found vidstab, is it different?
[10:52] <ubitux> it's way better
[10:52] <galba> ok :)
[10:54] <galba> do I have to use "-threads n" to use n processors or is it automatic?
[10:55] <ubitux> it's automatic
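The vidstab route mentioned above is a two-pass process and needs an ffmpeg build with --enable-libvidstab (filenames are placeholders; command fragments):

```shell
# Pass 1: analyze camera motion, writing transforms.trf (the default name).
ffmpeg -i input.mp4 -vf vidstabdetect -f null -
# Pass 2: apply the recorded transforms and encode the stabilized result.
ffmpeg -i input.mp4 -vf vidstabtransform -c:v libx264 -c:a copy output.mp4
```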
[11:56] <TonyIenciu> hello all, I'm trying to decode audio stream provided by internet radio stations, but I seem to get memory leak when reading the stream with av_read_frame. I'm also calling av_free_packet after each call... can someone help me, please?
[11:58] <Mavrik> valgrind can help you a lot more than we can.
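A typical valgrind invocation for hunting such a leak (the program name and stream URL are placeholders; a command fragment):

```shell
# --leak-check=full reports each definite leak with a full backtrace,
# which usually points straight at the missing av_free_packet call.
valgrind --leak-check=full ./radio_decoder http://example.com/stream
```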
[12:30] <Ludowicus> Hello! Could someone tell me where I can find FFmpeg with x264 10bit included?
[12:50] <relaxed> Ludowicus: http://johnvansickle.com/ffmpeg/
[12:50] <Ludowicus> relaxed, Thank you!
[12:51] <relaxed> you're welcome
[13:02] <Wader8> hello
[13:03] <Wader8> MP4 metadata doesn't work well from this wiki http://wiki.multimedia.cx/index.php?title=FFmpeg_Metadata
[13:03] <Wader8> most of the documentation is obsolete
[13:03] <Wader8> only title and genre worked
[13:04] <Wader8> -metadata title="my title" -metadata author="my author" -metadata TIT3="my subtitle" -metadata comment="my comment" -metadata genre="my genre" -metadata comment="my comments" -metadata year="2014" -metadata year="2015" -metadata description="description" -metadata episode_id="episode id"
[13:05] <Wader8> pretty much any ffmpeg documentation found on the web older than 3 years is useless
[13:06] <Wader8> okay date="2014" works too
[13:06] <Wader8> not year=
[13:07] <Wader8> this is on Windows
[13:07] <Wader8> comments also work
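Summarizing the keys Wader8 reports working for MP4 here (title, genre, date, comment), as one command with placeholder values (a command fragment):

```shell
# -c copy rewrites container metadata without re-encoding the streams.
ffmpeg -i input.mp4 -c copy \
  -metadata title="my title" -metadata genre="my genre" \
  -metadata date="2014" -metadata comment="my comment" \
  output.mp4
```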
[13:24] <theholyduck> Wader8, i found it sort of depressing how far the -sameq fallacy spread
[13:25] <theholyduck> i mean, the fact they had to resort to removing it due to all the people being wrong on the internet
[13:34] <Wader8> well most programmers aren't known for good naming :p
[13:34] <Wader8> I find it often very confusing, not pointing at this case but in general, usually programmers out of colleges just weird me out
[13:35] <Wader8> you're not supposed to mix advanced/misc stuff with basic abbreviations
[13:36] <Wader8> I don't know who the heck came up with sameq, but glad it's removed :p
[13:38] <Wader8> and I already had a big rant on ffmpeg documentation on a forum and what I think about that, a pity
[13:39] <dv_> hi, I have a question about vc-1 . I keep reading about repeated frames in it. is this a vc-1 specific feature? can in vc-1 one input frame produce N output frames (that is, the same output frame repeated N times)?
[13:40] <Wader8> it's really a semi-cult, where you have the industry folks who study ffmpeg for a job, of course they're going to understand all the commands, and other paid software that uses ffmpeg as the backend; it seems like they want to intentionally make poor documentation
[13:41] <Wader8> for example, I find linux namings extremely weird
[13:42] <Wader8> folders with names "abt" ebt afv  dda wsa wsdd sda sa q sys dsa odo sos rwd wa  BIN BIN BIN BIN BIN
[13:45] <Wader8> probably wouldn't be a problem if I devoted 5 years of my life trying to adapt to linux GUI and commands, anything's possible
[13:46] <Wader8> im ranting again! oh well
[14:20] <visiot> Solver: how does ffmpeg get input frames in a filtergraph?
[16:53] <Elirips> Anyone got a hint how to use -vf on windows? I'm stuck on the problem described here: http://stackoverflow.com/questions/28035800/ffmpeg-could-not-load-font-c-cannot-escape-semi-colon
[16:56] <Elirips> Also if I add c:\windows\fonts to the path, it fails to open
[16:58] <Elirips> ah, just copy arial.ttf to the same dir as ffmpeg
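An alternative to copying the font next to ffmpeg.exe is escaping the drive-letter colon inside the filter string (general form; Windows shell quoting may vary, and the text value is just an example — a command fragment):

```shell
# The colon after C must be escaped so the filter parser doesn't treat
# it as an option separator within the drawtext argument list.
ffmpeg -i input.mp4 \
  -vf "drawtext=fontfile='C\:/Windows/Fonts/arial.ttf':text='hello'" \
  output.mp4
```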
[20:01] <baran> hey, can somebody tell me values of CodecContext->level for MPEG-2 format?
[20:01] <baran> are they determined by ffmpeg, or are they purely set by a user?
[23:15] <Fraifrai> hello
[23:16] <Fraifrai> I use ffmpeg to modify videos , for example to merge separated audio and videos files
[23:16] <Fraifrai> or to extract parts
[23:16] <Fraifrai> but my problem is new and I found no help on forums
[23:16] <Fraifrai> I have a 1920x1152 video which I would like to reduce to 1080 by STRIPPING 36 pixels top and bottom
[23:17] <Fraifrai> without reencoding nor rescaling
[23:17] <Fraifrai> is that possible with ffmpeg?
[23:17] <Fraifrai> sorry for my poor English, I'm French
[23:17] <Fraifrai> any help appreciated
[23:18] <JEEB> <Fraifrai> without reencoding nor rescaling <- Impossible
[23:19] <Fraifrai> you mean that I'm forced to lose quality?
[23:19] <Fraifrai> or may be there is an adapted filter i do not know
[23:19] <JEEB> no, but you are forced to re-encode. you can encode with a lossless encoder and there will be no loss (and the file will be huge), or you encode with enough rate to make sure it looks good enough
[23:20] <JEEB> but yes, re-encoding is inevitable
[23:22] <Fraifrai> ok
[23:22] <Fraifrai> which options would you use to convert from 1920x1152 to 1920x1080 deleting the 36 top and bottom lines?
[23:22] <Fraifrai> cropping?
[23:23] <Fraifrai> with keep_aspect?
[23:24] <Fraifrai> http://ffmpeg.org/ffmpeg-filters.html#crop
[23:24] <Fraifrai> using Crop the input video central square ?
[23:24] <Fraifrai> no
[23:24] <Fraifrai> Crop the central input area with size 100x100:
[23:25] <Fraifrai> crop=1920:1080
[23:25] <Fraifrai> ?
[23:26] <foonix> crop=1920:1080:0:36
[23:28] <Fraifrai> it's running
[23:28] <Fraifrai> thx a lot
[23:28] <Fraifrai> it seems to work
[23:28] <Fraifrai> Thanks a lot
[23:39] <Fraifrai> hum
[23:39] <Fraifrai> it seems that it changes encoders http://pastebin.com/DyxTx617
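foonix's crop as a full command, with the encoders chosen explicitly so the output codecs don't silently fall back to ffmpeg's defaults (filenames and the CRF value are placeholders; a command fragment):

```shell
# crop=w:h:x:y -- keep a 1920x1080 window starting 36 pixels from the top,
# removing 36 lines each from top and bottom (36 + 1080 + 36 = 1152).
ffmpeg -i input.mkv -vf crop=1920:1080:0:36 -c:v libx264 -crf 18 -c:a copy output.mkv
```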
[00:00] --- Wed Jan 21 2015

