[Ffmpeg-devel-irc] ffmpeg.log.20151113

burek burek021 at gmail.com
Sat Nov 14 02:05:01 CET 2015


[00:41:37 CET] <donguston> guys, getting something ultra strange: when streaming a video file over rtmp, the encoding output says the time is going faster than actual time
[00:41:59 CET] <donguston> i assume it buffers the output somewhere and streams it rather than encoding and streaming in real time?
[00:50:56 CET] <donguston> c_14, idk how pastebin will make a difference - the "time" field on the output is going up 2 seconds for every 1 second of time
[00:51:09 CET] <c_14> Just show me your commandline
[00:51:12 CET] <donguston> but the stream is working fine - obviously ffmpeg's time field will reach the end of the video file before the stream ends
[00:51:16 CET] <c_14> Are you using -re ?
[00:51:28 CET] <donguston> ffmpeg -i "video.mp4" -vf yadif=0:-1:0 -s 1280x720 -vcodec libx264 -b:v 2200K  -acodec libmp3lame -ab 128k -ar 44100 -preset veryfast  -f flv rtmp://removed/bbc1/britpol-stream
[00:51:42 CET] <c_14> add -re as an input option
[00:51:47 CET] <c_14> (before -i)
[00:56:09 CET] <donguston> ok thanks
[00:56:22 CET] <donguston> what happens without it though c_14 ? will the stream actually show the entire video?
[00:56:37 CET] <c_14> Yes, ffmpeg defaults to outputting video as fast as it can.
[00:57:02 CET] <c_14> -re forces it to read video in "realtime"
[00:59:01 CET] <donguston> oh ok thanks, the stream already started but I'll use -re in the future
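For reference, a sketch of the command above with -re added as an input option; the file name and rtmp URL are just the placeholders from the paste:

    ffmpeg -re -i "video.mp4" -vf yadif=0:-1:0 -s 1280x720 -vcodec libx264 -b:v 2200K \
        -acodec libmp3lame -ab 128k -ar 44100 -preset veryfast -f flv rtmp://removed/bbc1/britpol-stream

With -re before -i, ffmpeg reads the input at its native frame rate, so the reported time advances roughly in step with wall-clock time instead of running ahead of it.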
[01:25:01 CET] <bencc> anYc: when using avcut, the start of the video is pixelated
[01:25:07 CET] <bencc> are you familiar with this issue?
[01:55:53 CET] <AndrewMock> Does libavcodec have full support for Atmos?
[03:48:16 CET] <carlitux> hi, I want to overlay a video with an image, but at 24 frames per second... any way to set the between expression to handle this?
[03:49:54 CET] <carlitux> or if you can send me the link to where the documentation for between is so I can read it... I couldn't find it
[04:12:31 CET] <Admin__> hey guys.. i have an Nvidia card that's capable of encoding h265... what param can i use to push it to one of the GPUs?
[04:12:43 CET] <Admin__> i tried the -gpu option but it says it only works with nvenc, not hevc
[04:12:52 CET] <Admin__> but... my card supports it.. so i am trying to figure out what to do
[04:17:45 CET] <Admin__> anyone ?
[05:01:04 CET] <AndrewMock> Does libavcodec have full support for Atmos?
[05:17:19 CET] <RazWelle1> Hey, anyone ever heard of loss of bitdepth or lossy decoding on mobile for h264?
[05:17:27 CET] <RazWelle1> Like, in the hardware?
[05:17:31 CET] <RazWelle1> Or otherwise
[09:28:24 CET] <anYc> bencc: it might be related to remaining issues I have with one of my own videos. If you can reproduce the issue with an available video, I'm interested in analyzing the problem.
[13:21:02 CET] <devshark> hello
[13:21:23 CET] <devshark> anyone around willing to help me out a bit? :)
[13:23:21 CET] <devshark> ... or just point me to a decent and working prebuilt binary for Android that's newer than 2.4.2 because it doesn't seem to support the -filter_complex that i need, that would be great.
[16:09:42 CET] <JonG> If I perform "ffmpeg -i input.mp4", the result is an exit code == 1 and the message: "At least one output file must be specified". I am just using this to programmatically check the validity of a file before moving on to other tasks. I don't really want to ignore the exit code, because I imagine there are other reasons the exit code will be 1, other than this one error. Ideally, I would like ffmpeg to return success if the file is valid and error if not. How can I achieve this? Thanks.
[16:11:44 CET] <furq> JonG: use ffprobe
[16:13:20 CET] <JonG> furq: Thanks, I did consider that, but I haven't included ffprobe in my android application (yet), whereas I have included ffmpeg. That is my back up plan though.
[16:16:59 CET] <furq> you could use ffmpeg -i input.mp4 -f null -
[16:17:02 CET] <furq> that'll decode the whole file though
[16:17:27 CET] <c_14> add -t 0
[16:17:51 CET] <furq> yeah that works
[16:18:19 CET] <JonG> ah, nice idea, thanks, I'll try that
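Combining the two suggestions, a quick validity check might look like this (input.mp4 is a placeholder); note that with -t 0 it only verifies that ffmpeg can open and probe the file, it does not decode every frame:

    ffmpeg -i input.mp4 -t 0 -f null -
    echo $?    # 0 if the file could be opened, non-zero otherwise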
[16:26:08 CET] <tripkip> At ArgusTV we are having trouble getting a live stream of satellite MPEG-TS data transcoded in and onto  an RTSP stream. I have uploaded an example here: http://ul.to/4ym3a03b
[16:26:57 CET] <furq> Estimated download time: ~ 2 hours 11 Minutes and 14 seconds
[16:26:59 CET] <furq> good luck with that
[16:28:04 CET] <tripkip> Turning the mpeg-ts into an mp4 package while keeping audio and video also fails
[16:28:21 CET] <tripkip> It works, but the timings are completely messed up
[16:36:55 CET] <newdagger> hi?
[16:37:27 CET] <newdagger> is there anyone?
[16:37:42 CET] <newdagger> I need help
[16:39:16 CET] <DHE> be specific
[16:39:53 CET] <newdagger> hi
[16:39:54 CET] <newdagger> okay
[16:40:17 CET] <newdagger> I use ffmpeg to grab my display under linux
[16:40:39 CET] <newdagger> I use -f x11grab
[16:40:43 CET] <newdagger> libvpx
[16:40:47 CET] <newdagger> to webm
[16:41:02 CET] <newdagger> I start the process in background
[16:41:28 CET] <newdagger> and want it to finish when I close the gnome session
[16:41:43 CET] <newdagger> but
[16:41:59 CET] <newdagger> there's a problem
[16:42:06 CET] <newdagger> if I kill the process
[16:42:20 CET] <newdagger> the video.webm is corrupt
[16:42:35 CET] <newdagger> I cannot scrub
[16:42:59 CET] <newdagger> is there a way to send a signal like q
[16:43:02 CET] <newdagger> ?
[16:56:25 CET] <c_14> SIGINT or SIGTERM
[17:05:12 CET] <newdagger> how does it work?
[17:05:35 CET] <furq> sigint is the same as sending ^C
[17:06:23 CET] <c_14> kill -INT pid or kill -TERM pid or pkill -INT ffmpeg or pkill -TERM ffmpeg
[17:07:37 CET] <newdagger> thank you
[17:09:10 CET] <newdagger> but I think that kills ffmpeg and the encoding is stopped
[17:09:26 CET] <c_14> It should terminate gracefully.
[17:11:34 CET] <newdagger> okay
[17:11:39 CET] <newdagger> thank you very much
[17:11:48 CET] <newdagger> cu
[17:11:49 CET] <c_14> (as long as you don't send it twice in a row)
[17:11:57 CET] <newdagger> twice in a row?
[17:12:04 CET] <newdagger> what does it mean?
[17:12:20 CET] <c_14> if you were to send SIGINT or SIGTERM, and then send it again before the process closed
[17:12:43 CET] <c_14> Sending it once makes it terminate gracefully, but if you send it again it will just exit
[17:13:18 CET] <newdagger> ahh
[17:13:20 CET] <newdagger> okay
[17:13:22 CET] <newdagger> I see
[17:13:25 CET] <newdagger> thanks again
[17:13:31 CET] <newdagger> bye
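A minimal sketch of the workflow described above, with the display name, capture size, and file name as assumptions:

    ffmpeg -f x11grab -video_size 1280x720 -i :0.0 -c:v libvpx recording.webm &
    FFPID=$!
    # ... later, e.g. when the gnome session ends:
    kill -INT "$FFPID"    # SIGINT, like Ctrl-C/q: ffmpeg finishes writing the file cleanly
    wait "$FFPID"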
[20:02:00 CET] <shin2> OH there is a channel for this massive craziness
[20:03:57 CET] <JEEB> uh-huh
[20:07:59 CET] <shin2> holy crap someone is alive in this channel.. by any chance do you know opencv as well?
[20:08:48 CET] <Primer> crickets...
[20:09:17 CET] <durandal_1707> what you want to do?
[20:10:28 CET] <shin2> it's related to engineering and cross-platform work with a processor I'm going to light on fire
[20:10:54 CET] <furq> could you be less specific
[20:12:41 CET] <shin2> Link.txt is dumping /usr/local/lib when I don't have LD_LIBRARY_PATH set to anything. I need it to stop searching that location for libraries
[20:12:49 CET] <shin2> in highgui module.
[20:13:06 CET] <shin2> I'm building this... wonder... piece of art... for a MIPS processor.
[20:13:33 CET] <Primer> surely ld.so.conf has it
[20:13:45 CET] <Primer> or, on modern systems, ld.so.conf.d
[20:14:06 CET] <shin2> in my case would there be a way to tell cmake to not look there for a while?
[20:14:09 CET] <Primer> otherwise it's compiled in
[20:14:23 CET] <Primer> err, I thought you meant at runtime
[20:14:31 CET] <Primer> compile time is a whole other story :)
[20:15:24 CET] <Primer> But what do I know...I'm only here to try to coax someone into revealing the secrets of vlc (yes, I know this is #ffmpeg)
[20:15:39 CET] <Primer> But I found that ffplay can't do what I want
[20:15:44 CET] <shin2> Well, funny thing is the opencv channel has idlers but they don't talk
[20:15:51 CET] <furq> the secret of vlc is that it's actually not very good
[20:16:03 CET] <shin2> And since those lovely people decided to use "ffmpeg" I figured some people in here might know something
[20:16:17 CET] <furq> why not set LD_LIBRARY_PATH
[20:16:19 CET] <exploreshaifali> Hello! I want to use ffmpeg to compress videos using python, what is the best way?
[20:16:31 CET] <furq> it should take precedence over the standard directories
[20:16:50 CET] <shin2> yeah but what does CMake do for priorities?
[20:17:13 CET] <furq> shrug
[20:17:18 CET] <exploreshaifali> is it like using python's subprocess module?
[20:17:22 CET] <furq> the only piece of cmake advice i can give is "try to avoid cmake at all costs"
[20:17:26 CET] <furq> and it seems you've not managed that
[20:17:44 CET] <shin2> Well... it's not that I haven't managed that, it's opencv that hasn't.
[20:17:52 CET] <furq> and you by proxy
[20:18:01 CET] <shin2> proxies...
[20:18:04 CET] <shin2> i feel like a scapegoat
[20:19:25 CET] <c_14> exploreshaifali: using subprocess is going to be your easiest bet, yes
[20:19:44 CET] <exploreshaifali> c_14, and what about performance?
[20:19:56 CET] <exploreshaifali> c_14, basically I want to know best way to do so
[20:20:15 CET] <Primer> furq: Honestly I'd much rather use a pure command line application to do what I'm trying to do, but it seems ffplay doesn't support -filter_complex
[20:20:28 CET] <Primer> I'm more than willing to keep trying if you tell me otherwise :)
[20:20:36 CET] <c_14> exploreshaifali: The python part of the code is most likely not going to be performance relevant.
[20:20:52 CET] <c_14> Primer: ffmpeg -i video -f sdl - -f alsa default (cough, cough, cough)
[20:21:03 CET] <c_14> Primer: what are you trying to do?
[20:21:25 CET] <Primer> I have an 8 camera DVR. I just want to display their videos in a grid locally
[20:21:38 CET] <exploreshaifali> okay, c_14 Thanks! :)
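The Python side would just spawn ffmpeg as a child process (e.g. with subprocess.call and an argument list), so the compression settings live entirely in the ffmpeg command it runs. A hypothetical example, with codec, CRF, and file names chosen purely for illustration:

    ffmpeg -i input.mp4 -c:v libx264 -crf 28 -preset veryfast -c:a copy output_small.mp4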
[20:21:55 CET] <Primer> The device uses rtsp over tcp
[20:22:39 CET] <c_14> Either use that really nasty hack there, or pipe ffmpeg output to ffplay (or some other video player)
[20:23:24 CET] <Primer> I also have a script where I can re-stream this using ffmpeg: ffmpeg --rtsp_transport tcp -i "rtsp://url..." --rtsp_transport tcp -i "rtsp://url..." ... -filter_complex etc...
[20:23:40 CET] <Primer> AHA!
[20:24:01 CET] <Primer> So yeah, I completely forgot about piping ffmpeg to ffplay
[20:24:39 CET] <Primer> Thing is, I want a quick and dirty script for windows (wife runs windows). I presume such piping works in windows?
[20:24:57 CET] <c_14> eeeeeh
[20:25:02 CET] <c_14> iirc piping yes, named pipes no
[20:25:26 CET] <Primer> so ffmpeg --bla | ffplay, in a .bat should work?
[20:25:39 CET] <c_14> should
[20:25:46 CET] <Primer> lovely
[20:25:47 CET] <Primer> thanks
[20:27:09 CET] <Primer> For streaming, I included -v:c, -a:c, etc. Is there a less CPU intensive manner?
[20:27:16 CET] <Primer> for piping, that is
[20:27:26 CET] <c_14> I hope you're using -c:v and -c:a
[20:27:37 CET] <c_14> -v:c should be an error...
[20:27:43 CET] <c_14> Anyways
[20:27:49 CET] <Primer> my bad, that was from memory
[20:28:00 CET] <Primer> I've been trying to get the perfect thing in both vlc and ffmpeg
[20:28:05 CET] <c_14> use -c:v rawvideo -c:a pcm_s<bitdepth>le
[20:28:31 CET] <c_14> bitdepth is either 16, 24, or 32 pick the one closest to but above your input bitdepth
[20:28:38 CET] <Primer> the source is 16
[20:28:44 CET] <c_14> then just use 16
[20:35:32 CET] <Primer> well, here's to hoping I can do this in cygwin, because windows .bat is shit
[20:46:38 CET] <JEEB> you can most probably use whatever you want
[20:46:47 CET] <JEEB> ffmpeg is just a command line program, after all
[21:33:51 CET] <Primer> just can't get this piping to work...not sure if it's windows or what
[21:34:17 CET] <Primer> but I'm using cygwin, so that might be a factor
[21:34:19 CET] <JEEB> piping itself should work, named pipes are something Completely Different on windows, though
[21:34:36 CET] <Primer> trying a simple test case now
[21:38:58 CET] <Primer> /cygdrive/c/ffmpeg/bin/ffplay -i d:\\Temp\\mouse.mov <-- this works
[21:40:12 CET] <JEEB> yeah, because unixy shells probably take \ as something else than cmd.exe :)
[21:40:15 CET] <Primer> /cygdrive/c/ffmpeg/bin/ffmpeg -i d:\\Temp\\mouse.mov -f rawvideo - | /cygdrive/c/ffmpeg/bin/ffplay - <-- this does not
[21:40:23 CET] <JEEB> uhh
[21:40:31 CET] <JEEB> oh, -f rawvideo
[21:40:45 CET] <Primer> pipe:: Invalid data found when processing inputKB sq=    0B f=0/0
[21:40:52 CET] <JEEB> what container does it pick on the output side?
[21:40:56 CET] <Primer> NFI
[21:41:12 CET] <JEEB> pastebin ffmpeg's output? :P
[21:41:15 CET] <JEEB> and link it here
[21:41:16 CET] <Primer> err, perhaps that acronym is not as well known as I think it is
[21:41:27 CET] <JEEB> oh it is
[21:41:51 CET] <JEEB> and I took it as an offence but I'm still telling you that I might be ready to parse your shitty ffmpeg command line's terminal output
[21:41:58 CET] <Primer> http://pastie.org/10555460
[21:42:02 CET] <JEEB> thanks
[21:42:20 CET] <JEEB> oh right, rawvideo is just that
[21:42:29 CET] <JEEB> ahaha, no fucking wonder then
[21:42:42 CET] <JEEB> ffplay would have no idea what is going on
[21:42:52 CET] <JEEB> -c:v rawvideo -f nut
[21:42:59 CET] <Primer> trying
[21:42:59 CET] <JEEB> try this instead of -f rawvideo
[21:43:24 CET] <Primer> excellent, that worked
[21:43:25 CET] <JEEB> this outputs raw video (and probably audio in some form or another) in NUT
[21:43:45 CET] <ChocolateArmpits> isn't nut deprecated ?
[21:43:51 CET] <JEEB> not as far as I know
[21:44:03 CET] <JEEB> although the only use case for it is literally passing raw video-audio
[21:44:04 CET] <JEEB> :P
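Putting the pieces together, the pipe being described looks roughly like this (mouse.mov stands in for whatever input is being played):

    ffmpeg -i mouse.mov -c:v rawvideo -c:a pcm_s16le -f nut - | ffplay -

rawvideo plus pcm avoids any re-encoding in the pipe, and the NUT container carries the stream parameters so ffplay knows what it is being fed.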
[21:44:16 CET] <Primer> now for the more complicated part, which is to run the -filter_complex stuff I made to create a 3x2 grid of 5 security cameras
[21:44:56 CET] <Primer> The one thing I'm not sure about is how to get a single audio stream from one of the source streams
[21:45:13 CET] <JEEB> -map INPUT_NUMBER:a:AUDIO_STREAM_NUMBER
[21:45:18 CET] <JEEB> both numbers start from zero
[21:45:33 CET] <JEEB> see the output of ffmpeg -i file regarding that
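For example, to keep the filtered video plus the first audio stream of the third input (the [out] label and the stream numbers here are hypothetical):

    -map "[out]" -map 2:a:0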
[21:46:01 CET] <Primer> yeah, I've used that before
[21:46:16 CET] <Primer> I had been trying to get this working using vlc
[21:46:29 CET] <Primer> I had it mostly working, but the audio part just didn't work
[21:56:20 CET] <Primer> I had this working when I had a capture card using v4l2://, but now replacing the -i's with the rtsp streams...I get nothing
[21:56:52 CET] <Primer> http://pastie.org/10555494
[21:57:22 CET] <Primer> The filter_complex stuff is quite a black box for me
[21:57:31 CET] <Primer> it was done by trial and error
[21:57:40 CET] <Primer> In the version that worked, that is
[21:58:41 CET] <Primer> http://pastie.org/10555495 <-- the version that worked
[23:34:10 CET] <Primer> ok, I kinda have it working!
[23:36:58 CET] <Primer> The "background" keeps changing color very quickly
[23:37:07 CET] <Primer> and the frame rate is rather slow, compared to vlc
[23:37:15 CET] <Primer> but audio works
[23:37:21 CET] <Primer> that never did in vlc
[23:58:51 CET] <Primer> yeah, it seems that ffmpeg doesn't handle doing a mosaic as well as vlc
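Since the pasted command lines aren't reproduced in the log, here is a generic sketch of the kind of 3x2 mosaic of five rtsp cameras being described, with URLs, sizes, and stream choices as placeholders. Using a color source rather than nullsrc as the base canvas should also avoid the flickering background mentioned above, since nullsrc leaves the frame data undefined:

    ffmpeg -rtsp_transport tcp -i rtsp://cam1 -rtsp_transport tcp -i rtsp://cam2 \
           -rtsp_transport tcp -i rtsp://cam3 -rtsp_transport tcp -i rtsp://cam4 \
           -rtsp_transport tcp -i rtsp://cam5 \
        -filter_complex "color=c=black:s=1920x720 [base]; \
            [0:v] scale=640:360 [c0]; [1:v] scale=640:360 [c1]; [2:v] scale=640:360 [c2]; \
            [3:v] scale=640:360 [c3]; [4:v] scale=640:360 [c4]; \
            [base][c0] overlay=0:0 [t1]; [t1][c1] overlay=640:0 [t2]; [t2][c2] overlay=1280:0 [t3]; \
            [t3][c3] overlay=0:360 [t4]; [t4][c4] overlay=640:360 [out]" \
        -map "[out]" -map 0:a:0 -c:v rawvideo -c:a pcm_s16le -f nut - | ffplay -

Here the audio comes from the first camera only; the overlay chain places each scaled 640x360 feed into its cell of the 1920x720 canvas, leaving the sixth cell black.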
[00:00:00 CET] --- Sat Nov 14 2015

