[Ffmpeg-devel-irc] ffmpeg.log.20160225

burek burek021 at gmail.com
Fri Feb 26 02:05:01 CET 2016

[00:02:50 CET] <DANtheBEASTman> https://upload.teknik.io/jcQTV.webm this is with -b:v 2400k
[00:22:49 CET] <yann||work> well, looking at my various search results, I can't even understand what the level of support of VDPAU support is (or of any hwaccel, for that matter) - could someone please shed some light ?
[00:46:57 CET] <DHE> vdpau is for playback as I understand it. and it will be limited by your card's capabilities. mine does 1080p H264 pretty well using mpv or mplayer
[01:31:17 CET] <t4nk524> Hello
[01:31:29 CET] <t4nk524> I'm currently trying to build ffmpeg
[01:32:00 CET] <t4nk524> I'm using ./configure --disable-all , then enabling all the protocols and filters that I need
[01:32:32 CET] <t4nk524> ./configure --disable-all --enable-protocol=https
[01:32:52 CET] <t4nk524> however, it seems that --disable-all trumps all the enable flags
[01:41:42 CET] <FFMPEGMASTER> How may I help you
[01:43:55 CET] <t4nk524> Hello
[01:44:12 CET] <t4nk524> So, I'm trying to build the latest FFMpeg 3.0 manually
[01:44:54 CET] <t4nk524> when I run ./configure --disable-all --enable-protocol=https --enable-openssl
[01:45:25 CET] <t4nk524> the output for the enabled protocols does not include the https protocol
[01:45:30 CET] <FFMPEGMASTER> what is your environment?
[01:45:39 CET] <t4nk524> it seems that --disable-all trumps all the enable flags
[01:46:02 CET] <t4nk524> Mac OS 10.10.5 (Yosemite)
[01:46:42 CET] <c_14> You might also have to enable the http and tls protocols
[01:46:58 CET] <c_14> Though those should get pulled in iirc
[01:47:46 CET] <t4nk524> Let me try that "./configure --disable-all  --enable-protocol=https,http,tls --enable-openssl"
[01:48:17 CET] <t4nk524> The output for the ./configure command is empty for "Enabled protocols:"
[01:49:50 CET] <t4nk524> This ./configure command worked in ffmpeg 2.5.3, and seems to be broken from >2.6
[01:50:01 CET] <t4nk524> Has the behaviour of --disable-all changed?
[01:50:09 CET] <c_14> you need to enable-avformat
[01:51:22 CET] <c_14> hmm, that only enables http and tcp though. not tls/https
[01:53:14 CET] <t4nk524> mmm, I see that as well.
[01:53:22 CET] <c_14> Right, enable tls_openssl instead of tls
[01:58:47 CET] <t4nk524> Great, let me give that a shot
[02:02:38 CET] <t4nk524> Woooo, that worked! I'm going to enable that flag in my actual project now.
[02:02:40 CET] <t4nk524> thanks so much
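For the archive, the configure line that finally worked, pieced together from this exchange (flag names as they exist in the FFmpeg 3.0 configure; with --disable-all, avformat and every protocol in the https chain must be re-enabled explicitly, and the OpenSSL-backed TLS entry is named tls_openssl):

```shell
# Minimal FFmpeg 3.0 build with only HTTPS input support (sketch).
# --disable-all turns everything off, so re-enable avformat plus each
# protocol in the chain: https -> tls_openssl -> tcp (http for redirects).
./configure --disable-all \
    --enable-avformat \
    --enable-openssl \
    --enable-protocol=https \
    --enable-protocol=http \
    --enable-protocol=tcp \
    --enable-protocol=tls_openssl
```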
[02:03:18 CET] <t4nk524> One more question unrelated
[02:03:28 CET] <t4nk524> Does FFmpeg support range header requests natively?
[02:05:03 CET] <c_14> You mean HTTP byteranges, afaik yes.
[02:05:25 CET] <t4nk524> Yes, http byteranges.
[02:06:26 CET] <t4nk524> So when opening a stream using "avformat_open_input(...)", how do I insert the byte ranges into the header
[02:07:23 CET] <c_14> the offset and end_offset options
[02:07:28 CET] <c_14> to the http demuxer
[02:11:00 CET] <t4nk524> great, would you be able to point me to a resource i could read more about byterange requests in ffmpeg?
[02:11:26 CET] <c_14> https://ffmpeg.org/ffmpeg-protocols.html#http
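The offset/end_offset options c_14 mentions are options of the http protocol; from the C API they would go into the AVDictionary passed to avformat_open_input(). From the command line, a sketch (URL and byte values are placeholders):

```shell
# Ask the http protocol to request only bytes 1048576..2097151 of the
# remote file (values are illustrative) and copy that slice out.
ffmpeg -offset 1048576 -end_offset 2097152 \
    -i "https://example.com/video.mp4" -c copy clip.mp4
```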
[02:13:38 CET] <t4nk524> and you've been very helpful today, is there a way to easily contact you in the future?
[02:13:54 CET] <ethe> here, most likely.
[02:14:04 CET] <c_14> I usually hang around here, feel free to highlight me.
[02:19:16 CET] <t4nk524> great, see you around then
[08:04:26 CET] <maziar> hi, is there anybody out there?
[08:06:03 CET] <squ> yes
[09:50:13 CET] <mattf000> I am using ffmpeg + rtmp + nginx + flowplayer(flash) in an application where I am trying to display video captured by a usb3 capture card then streamed over wifi to android smart glasses. Problem is the device is locked down to 4.0.4. It does appear that hw decoding is taking place but my problem is just a bit of latency.
[09:52:49 CET] <mattf000> Here is my ffmpeg command: http://pastebin.com/8bH0Reiv
[09:52:59 CET] <mattf000> Can anyone advise on how to lower the latency a bit?
[09:55:32 CET] <furq> mattf000: do you get unacceptable latency when playing back the stream with ffplay -fflags nobuffer
[09:55:45 CET] <furq> if not then it's probably being introduced by the player
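furq's suggested baseline test, spelled out (stream URL is a placeholder):

```shell
# Play the stream with ffplay's input buffering disabled, to measure how
# much latency the capture/encode/server side introduces by itself.
ffplay -fflags nobuffer -i rtmp://server/live/stream
```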
[09:56:40 CET] <mattf000> when i stream to a computer the latency seems to be under 100ms .. but when i stream to the android device sometimes it gets bad
[09:57:21 CET] <mattf000> the android is really my only use-case though, so streaming to a computer is fairly useless, but it does tell me i'm hitting some limit of the device?
[09:58:14 CET] <mattf000> as far as i can tell it's using the gpu to decode correctly ... i need the player to throw away frames to keep up with timestamps
[09:58:19 CET] <mattf000> or something like that
[09:59:34 CET] <furq> players normally buffer network streams by a few seconds to prevent dropouts
[09:59:50 CET] <furq> you probably just need to find some way to disable that
[10:00:00 CET] <furq> dropouts shouldn't be an issue if it's streaming over a lan
[10:01:46 CET] <mattf000> I've got the client set to a bufferTime of 0. http://flash.flowplayer.org/documentation/configuration/clips.html
[10:04:25 CET] <mattf000> any other ideas?
[10:08:10 CET] <mattf000> should i look into a different codec maybe?
[10:13:21 CET] <furq> if you're using rtmp then you've got no other choice
[10:13:29 CET] <furq> i doubt it would make a positive difference anyway
[10:15:26 CET] <furq> i take it you have to use a web player?
[10:19:07 CET] <furq> i also take it you've changed the default nginx-rtmp settings, in particular buflen
[10:19:26 CET] <furq> that might be overriding the player's settings
[10:19:31 CET] <mattf000> yea i've set buflen but it also gets set by the client config
[10:20:06 CET] <furq> have you tested with vlc for android or something similar
[10:20:19 CET] <furq> or a computer on wifi
[10:20:27 CET] <mattf000> i don't have to use a web player .. i only went with flowplayer flash because at Android 4.0.4 my options are limited
[10:20:40 CET] <mattf000> yes.. both actually
[10:22:04 CET] <mattf000> the computer does well.. vlc on android isn't as reliable as the flash player
[10:22:48 CET] <mattf000> Any suggestion on what I should be using for a chunk size?
[10:23:14 CET] <mattf000> really haven't noticed a difference whether i use 128 or 4096
[10:23:27 CET] <furq> no idea, i've never had to tune for latency
[10:23:54 CET] <furq> it wouldn't make a dent in the horrible inherent latency of hls
[10:25:11 CET] <mattf000> how's the latency with MPEG-DASH?
[10:25:17 CET] <mattf000> haven't looked yet
[10:28:45 CET] <furq> i've never tried, but i would have thought it has the same problem as hls
[10:29:19 CET] <furq> chunks aren't written out until a gop ends, so latency is inherently at least the length of one gop
[10:29:54 CET] <furq> in my experience it's often much worse, but apparently you can tune that to some degree
[10:30:23 CET] <furq> it sounds totally useless for your use case anyway
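furq's GOP argument in numbers: a chunk cannot be published before its GOP closes, so the latency floor is the keyframe interval divided by the frame rate. A quick check with illustrative values (x264's default keyint of 250 frames):

```shell
# Minimum chunked-streaming (HLS/DASH) latency = GOP frames / fps.
gop=250   # x264 default keyframe interval (illustrative)
fps=25
min_latency=$(( gop / fps ))
echo "at least ${min_latency}s of inherent latency"
```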
[10:31:42 CET] <mattf000> yea i'm just looking for anything that can drop the latency a bit more .. or have the client catch up better if it falls behind from network fluctuations
[15:10:48 CET] <petec> Hi, I'm looking for a way to reset timestamps on an rtsp stream from a webcam. It's H.264 / pcm_alaw, vcodec copy, acodec aac, recording it as mp4. PTS drifts between audio and video, with the result that ffprobe displays a negative start time in the final file. I want to just throw away the timestamps on the incoming video and audio streams and get ffmpeg to create new ones based on its own time of packet reception. Any way to do this? TIA
[15:11:48 CET] <petec> Did a full write up of the problem with logs at http://ffmpeg.gusari.org/viewtopic.php?f=11&t=2699
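One option petec could try (not confirmed in this thread, but it does exactly what he describes: discard incoming timestamps and stamp packets with ffmpeg's own reception time) is the libavformat option use_wallclock_as_timestamps; a sketch with a placeholder URL:

```shell
# Re-stamp incoming RTSP packets with wall-clock reception time instead of
# the camera's (drifting) timestamps; video copied, audio transcoded to AAC.
ffmpeg -use_wallclock_as_timestamps 1 -i rtsp://camera/stream \
    -c:v copy -c:a aac recording.mp4
```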
[15:12:17 CET] <termos> I'm getting a huge spike of packets in the start of transcoding, I have a suspicion that these are the packets used by the analyzeduration. Is there a way to flush this buffer and remove the huge spike of packets in the beginning?
[16:00:19 CET] <||JD||> hey guys, is it possible to access ffmpeg from php without root/admin permissions? I mean, a library I can install myself in the user folder
[16:17:04 CET] <humbledbysymfon1> Hi! Anyone doing html audio livestreaming with ffmpeg?
[16:18:55 CET] <zyme> Didn't know you could do that, but it makes me wonder,
[16:19:49 CET] <zyme> Can you stream to chromecast natively with ffmpeg or would you need a gateway app as a connector?
[16:20:11 CET] <humbledbysymfon1> No idea about Chromecast. But HTML livestreaming: It works, actually pretty well, I get a delay of about 7-25s. Only thing I haven't figured out is what to use at the other end.
[16:20:44 CET] <humbledbysymfon1> I don't want to use the native browser support but a webplayer. I thought someone might have some experience with that...
[16:23:31 CET] <zyme> VLC?
[16:23:32 CET] <Mavrik> zyme, you need a gateway app
[16:24:12 CET] <zyme> It's even got an iOS port these days.
[16:24:18 CET] <humbledbysymfon1> Zyme, thanks, vlc would work, but it needs to work in any browser.
[16:25:18 CET] <zyme> Mavrik: know of any?
[16:25:58 CET] <zyme> humbledbysymfon1: can't you just use html5?
[16:27:24 CET] <kepstin> I found when testing it a while back that most browsers with html5 audio support can actually play an mp3 icecast stream reasonably well (although you have no control over buffering)
[16:28:42 CET] <zyme> Flash, though its always having a critical exploit fixed, is supported by most browsers.
[16:29:19 CET] <zyme> Otherwise... Silverlight?
[16:29:28 CET] <humbledbysymfon1> kepstin: exactly, i was hoping to find a javascript player that I could configure in terms of buffering and thus reducing latency. And I was hoping to increase stability.
[16:29:33 CET] <zyme> Maybe QuickTime
[16:30:11 CET] <kepstin> humbledbysymfon1: javascript player won't help; they only do custom ui controls. all of the actual media path is done by the browser.
[16:30:31 CET] <kepstin> (although if you're using MSE with a custom streaming protocol, you could maybe bypass that a bit...)
[16:30:41 CET] <zyme> Javascript usually is what loads flash, and often has a check and fallback to html5 if not present, etc.
[16:31:03 CET] <humbledbysymfon1> kepstin: What's mse?
[16:31:46 CET] <kepstin> "media source extensions"; a way to feed media into the browser's decoder/playback engine in chunks. It's how javascript/html5 players implement hls and dash.
[16:32:19 CET] <humbledbysymfon1> Can you recommend any javascript player?
[16:33:30 CET] <kepstin> for this particular use case? haven't really heard of anything.
[16:34:47 CET] <retard> kepstin: browsers will shit themselves when there's inline metadata changes though
[16:35:31 CET] <kepstin> hmm, I only tested it with a single continuous stream, tbh.
[16:36:38 CET] <||JD||> Guys, is there a ffmpeg library I can install myself in a shared hosting plan?
[16:36:40 CET] <retard> the metadata changes are multiplexed in the stream
[16:36:48 CET] <retard> ||JD||: there are static builds?
[16:37:20 CET] <||JD||> can you link them please?
[16:37:45 CET] <J_Darnley> ffmpeg.org/download
[16:38:02 CET] <retard> http://johnvansickle.com/ffmpeg/
[16:38:23 CET] <retard> http://ffmpeg.zeranoe.com/builds/ for windows
[16:38:32 CET] <||JD||> thank you
[16:50:10 CET] <humbledbysymfon1> kepstin: thanks! I need it for multiple streams and multiple users, but the streams can go to multiple instances of the javascript player
[17:27:04 CET] <lroe> I'm trying to use hls.js to view an m3u8 stream.  I'm looking at the docs and I ended up with: http://paste.debian.net/hidden/14e247cd/
[17:27:39 CET] <lroe> using some web developer tools chrome errors saying "Uncaught ReferenceError: Hls is not defined"
[17:31:57 CET] <DHE> the documentation references dist/hls.js
[17:39:47 CET] <zyme> I wonder if IIS loads ffmpeg and i open a page which uses javascript to stream to itself, and the chromecast extension is running in chrome, i could make an on-page/TV interface for changing ffmpeg and relaunching it
[17:40:31 CET] <zyme> The javascript app auto-retrying to connect during the restart of ffmpeg streaming..
[17:41:15 CET] <zyme> Since the chrome extention mirrors the open page onto the chromecast...
[17:43:14 CET] <zyme> The same iis site url could be open in a different url as long as it's the same application pool, for a web-based controller page...
[17:45:04 CET] <zyme> actually I wouldn't need the javascript-based web player, I have a HEVC extension in chrome for native support.
[17:47:00 CET] <zyme> Might need the occasional open url on pc and click the chromecast extension button, but otherwise quick dirty custom streaming of what I want to my chromecast.
[18:08:48 CET] <EmleyMoor> I'm looking for a way to encode my videos in a "YouTube streamable" format, so as to speed up processing time. What do I need to do to achieve this?
[18:12:31 CET] <kepstin> EmleyMoor: pick codecs and settings compatible with https://support.google.com/youtube/answer/1722171 , use mp4 container, use "-movflags faststart"
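kepstin's advice as one concrete command (libx264/AAC and the CRF value are just a reasonable combination from the linked list, not the only one YouTube accepts):

```shell
# H.264 + AAC in MP4, with the moov atom moved to the front of the file
# (-movflags faststart) so processing can begin before the transfer ends.
ffmpeg -i input.mkv -c:v libx264 -crf 20 -c:a aac -b:a 192k \
    -movflags +faststart output.mp4
```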
[19:30:28 CET] <lroe> hallelujah I finally got rtsp->hls working using vlc, nginx, hls.js
[19:30:49 CET] <lroe> but now when viewing the stream in a browser it seems to freeze randomly
[19:31:00 CET] <lroe> and I have to refresh the page to get it started again
[20:06:51 CET] <tEtra> this: https://www.pastery.net/aesgby/
[20:07:05 CET] <tEtra> is a bash script to rotoscope a video
[20:07:44 CET] <tEtra> it works well to create the individual images, but putting them back together is driving me crazy
[20:09:18 CET] <J_Darnley> I don't know what to say other than "use something that will work with a video stream"
[20:10:15 CET] <tEtra> https://www.pastery.net/jkwywv/ is the console output of running script
[20:10:35 CET] <tEtra> J_Darnley: are you saying that to me?
[20:10:49 CET] <tEtra> if so, what does that mean?
[20:11:00 CET] <J_Darnley> yes.
[20:11:04 CET] <tEtra> I know absolutely zip about this stuff
[20:11:21 CET] <J_Darnley> something that does not require hundreds of separate files
[20:11:34 CET] <tEtra> I have modified the script extensively to get it to work as well as it does
[20:12:01 CET] <tEtra> why is that an issue for anything other than storage?
[20:12:15 CET] <J_Darnley> because you said "but putting them back togeher is driving me crazy"
[20:12:30 CET] <J_Darnley> Perhaps you should be more specific.
[20:12:39 CET] <tEtra> yes, my bad
[20:12:50 CET] <tEtra> please have a look at the output paste
[20:13:11 CET] <J_Darnley> yeah, what about it?
[20:13:17 CET] <tEtra> I think I am not encoding/decoding correctly
[20:13:35 CET] <tEtra> input is a cell phone video
[20:13:56 CET] <tEtra> No pixel format specified, yuvj444p for H.264 encoding chosen.
[20:13:57 CET] <tEtra> Use -pix_fmt yuv420p for compatibility with outdated media players.
[20:14:17 CET] <tEtra> I get this error, but if I try the suggested format, it barfs too
[20:14:58 CET] <tEtra> let me ask a different way - how would you do it?
[20:15:25 CET] <J_Darnley> I would start by finding out what "rotoscope" means
[20:15:38 CET] <J_Darnley> Then I would look for a better tool to do whatever that is
[20:15:59 CET] <J_Darnley> if I couldn't find one I would probably use png files
[20:16:44 CET] <J_Darnley> then I would see whether "autotrace" can output something other than svg
[20:16:49 CET] <J_Darnley> then I would make sure that it outputs uniformly sized images
[20:17:04 CET] <J_Darnley> otherwise I might ask "convert" to do that.
[20:17:49 CET] <J_Darnley> failing that I would try a scale filter in the final encoding command
[20:18:02 CET] <tEtra> re: png: autotrace cannot input that format
[20:18:10 CET] <tEtra> I was trying that initially
[20:18:36 CET] <J_Darnley> yet it supports jpeg?!
[20:21:04 CET] <tEtra> that test was before installing an additional libmagickcore-6.q16-2-extra package
[20:21:15 CET] <tEtra> maybe it will work now
[20:21:20 CET] <tEtra> I'll try that
[20:21:38 CET] <tEtra> thanks, J_Darnley!
[20:33:39 CET] <svip> I have been using the Capture/ALSA guide to record sound device output while it still plays through the speakers, but to no avail.
[20:33:49 CET] <svip> The files ffmpeg produces are silent.
[20:34:03 CET] <svip> https://trac.ffmpeg.org/wiki/Capture/ALSA << Last section.
[20:34:43 CET] <c_14> You set it up just like in that section?
[20:34:49 CET] <svip> Yes.
[20:35:13 CET] <c_14> Is the application you're trying to record using the correct output device?
[20:35:18 CET] <svip> Well, I do have two extra lines in the file: pcm.pulse { type pulse } and ctl.pulse { type pulse }
[20:35:44 CET] <svip> c_14: Well, I am pretty certain it is, as it is coming out through the speakers and I can control its volume through alsamixer.
[20:35:50 CET] <c_14> What are you trying to record?
[20:36:20 CET] <svip> I am trying to record snippets of Spotify tracks.
[20:38:26 CET] <c_14> Could it be that the application you're trying to record is using pulse?
[20:38:41 CET] <svip> It is plausible.
[20:38:50 CET] <c_14> try checking with pactl or something
[20:38:55 CET] <svip> Pulse definitely seems to be somewhere.
[20:39:09 CET] <ethe> I heard JACK is good for these things
[20:41:48 CET] <svip> c_14: Yes, pactl list clients lists it as one of its clients.
[20:42:13 CET] <c_14> Then you're going to either have to make the program use alsa or record over pulse.
[20:42:24 CET] <svip> How do I record over pulse?
[20:43:40 CET] <c_14> You have to insert some pulse modules afaik
[20:44:09 CET] <c_14> https://dl.c-14.de/t/pulse_record_audio.svg <- so that it looks something like this
[20:44:37 CET] <svip> Aha.
[20:44:49 CET] <svip> Are the modules called that?
[20:45:14 CET] <c_14> I believe so
[20:49:58 CET] <svip> Hmm pactl won't let me load any modules.
[20:50:28 CET] <svip> Someone suggested -f alsa -i pulse
[20:50:33 CET] <svip> But that doesn't seem to work either.
[20:52:41 CET] <c_14> pactl load-module module-null-sink <- that doesn't work?
[20:53:11 CET] <svip> c_14: 24
[20:53:12 CET] <svip> It said.
[20:53:32 CET] <svip> Apparently, I got the names wrong.
[20:54:38 CET] <debianuser> svip: When recording from pulse you need to start ffmpeg recording from pulse default source, e.g. `-f alsa -i pulse` and then in `pavucontrol` "Playback" tab find your running ffmpeg and select its source to "Monitor of whatever card you're playing to".
[20:57:08 CET] <debianuser> Ah, sorry, that should be "Recording" tab, not playback, as ffmpeg records sound :)
[20:57:33 CET] <svip> debianuser: !
[20:57:38 CET] <svip> Great success.
[20:58:27 CET] <debianuser> Great! \o/
[20:58:37 CET] <c_14> debianuser: You wouldn't happen to want to write the Capture/Pulse page on trac would you? (I don't have pulse...)
[21:02:25 CET] <debianuser> c_14: I don't actually have it either. :) I just don't need its features, so I use just alsa. But the tutorial itself is the same: start `ffmpeg -f alsa -i pulse ...`, run `pavucontrol`, on "Recording" tab switch ffmpeg source to "Monitor of ..." your card. Example: http://osvideo.constantvzw.org/screencast-ffmpeg-with-sound/ Also see https://unix.stackexchange.com/a/162822
[21:04:51 CET] <debianuser> (The last https://unix.stackexchange.com/a/162822 is another option, using pulse backend with `-f pulse -i ...monitor ...`, you can see exact names in `pactl list`. But I'd still suggest `-f alsa -i pulse` way, it's easier to explain :) )
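debianuser's two recipes side by side (the monitor source name is machine-specific; list the real ones with pactl list short sources):

```shell
# Option 1: record through the ALSA "pulse" plugin, then in pavucontrol's
# "Recording" tab point the ffmpeg stream at "Monitor of <your card>".
ffmpeg -f alsa -i pulse -ac 2 capture.wav

# Option 2: name the monitor source directly (device name is illustrative).
ffmpeg -f pulse -i alsa_output.pci-0000_00_1b.0.analog-stereo.monitor capture.wav
```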
[21:05:51 CET] <c_14> Ah well, I'll just try and remember that then.
[22:53:30 CET] <julius> hi
[22:53:51 CET] <julius> can i ask a not strictly ffmpeg question here, more about space usage of x264?
[22:54:06 CET] <J_Darnley> sure
[22:54:33 CET] <J_Darnley> we don't stay strictly on topic
[22:55:19 CET] <julius> says here: http://stackoverflow.com/questions/701991/h-264-file-size-for-1-hr-of-hd-video  that h264 needs several gb per hour of recording
[22:55:51 CET] <J_Darnley> depends on the content and what sort of quality you expect.
[22:56:15 CET] <sfan5> or rather the encoding settings
[22:56:37 CET] <julius> i would like to record my plants growing over 6 weeks, of course i dont need every second. what about a picture every 5 minutes at 1080p - is there a formula to calculate that or approximate?
[22:57:06 CET] <sfan5> it's probably easier to just use jpeg for that
[22:57:06 CET] <J_Darnley> filesize = bitrate * length
[22:57:31 CET] <julius> ~12600 pictures
[22:57:41 CET] <J_Darnley> I would expect that to be fairly easy to compress
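J_Darnley's formula worked through with illustrative numbers, a 2 Mbit/s encode of a one-hour video:

```shell
# filesize = bitrate * length; divide by 8 to convert bits to bytes.
bitrate=2000000       # bits per second (assumed)
length=3600           # seconds
bytes=$(( bitrate / 8 * length ))
echo "${bytes} bytes" # 900000000 bytes, i.e. 900 MB for one hour
```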
[22:57:49 CET] <julius> is there an "easy" way to get those jpegs later into a video?
[22:57:59 CET] <J_Darnley> ffmpeg?
[22:58:19 CET] <J_Darnley> wait, are the plants indoor or outdoor?
[22:58:25 CET] <J_Darnley> *outdoor
[22:58:26 CET] <julius> indoor
[22:58:44 CET] <J_Darnley> good, no wind
[22:58:55 CET] <J_Darnley> little motion
[22:59:03 CET] <julius> the light changes, and the plants a little
[22:59:04 CET] <J_Darnley> pretty easy to compress then
[22:59:37 CET] <J_Darnley> sure, it isn't completely static
[23:00:11 CET] <J_Darnley> but you aren't cycling the lights on and off for every other photo, ar you?
[23:00:42 CET] <J_Darnley> The basic rule about video encoding is that the less that changes between frames, the easier it is to compress.
[23:01:01 CET] <julius> ok
[23:01:09 CET] <julius> no, just daylight changing
[23:01:24 CET] <julius> and later on i can just create a video with something like this? ffmpeg -r 25 -qscale 2 -i %05d.morph.jpg output.mp4
[23:01:50 CET] <J_Darnley> Yes (almost)
[23:02:04 CET] <sfan5> https://trac.ffmpeg.org/wiki/Create%20a%20video%20slideshow%20from%20images
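A corrected form of julius's command, per the slideshow wiki page sfan5 linked (filenames and rate are from julius's example; -framerate is placed before -i so it applies to the image input, and -pix_fmt yuv420p is added for player compatibility):

```shell
# Turn numbered JPEGs (00001.morph.jpg, 00002.morph.jpg, ...) into a
# 25 fps H.264 video.
ffmpeg -framerate 25 -i %05d.morph.jpg \
    -c:v libx264 -pix_fmt yuv420p output.mp4
```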
[23:02:11 CET] <julius> ah
[23:04:23 CET] <J_Darnley> Saving each image separately is a good idea if you want to check on them by viewing an image.
[23:04:33 CET] <J_Darnley> If you don't need to view the frames as they are being recorded then you might consider making the video directly.
[23:05:01 CET] <J_Darnley> but I guess how easy that is depends on how you capture them.
[23:05:16 CET] <julius> got a rpi2, with a usb webcam
[23:05:49 CET] <julius> will write out to a usb stick or usb harddrive
[23:06:10 CET] <julius> how do you record a video every x seconds?
[23:06:37 CET] <TD-Linux> motion is a nice program that will grab a .jpg from your webcam every x seconds
[23:06:47 CET] <furq> -framerate 1/300
[23:06:52 CET] <jkqxz> There are probably enough things which can go wrong that you'd be better off making images purely so that the intermediate state is verifiable, given the time that would be required for a second attempt.
[23:06:54 CET] <TD-Linux> or that.
[23:06:56 CET] <furq> i would probably just take pictures though
[23:07:19 CET] <furq> running ffmpeg for six weeks sounds unreliable
[23:07:51 CET] <firewated> if the power goes out or something, you could end up with a corrupted video file that's difficult to recover data from, better to stick with individual images
[23:07:52 CET] <jkqxz> Making the video later is easy, and you get more than one attempt to get the settings right with the whole input.
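A sketch of the one-image-per-interval capture the channel is converging on (device path and interval are assumptions); because every frame is a standalone file, a crash or power loss costs at most one frame:

```shell
# Grab one JPEG from the webcam every 5 minutes, with timestamped filenames.
while true; do
    ffmpeg -f v4l2 -i /dev/video0 -frames:v 1 \
        "frame_$(date +%Y%m%d_%H%M%S).jpg"
    sleep 300
done
```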
[23:08:06 CET] <julius> hm, ok makes sense
[23:08:25 CET] <julius> excellent info here guys, thank you
[23:08:44 CET] <furq> it's bad enough when i spend an hour encoding something and then realise i've used the wrong settings
[23:08:56 CET] <furq> i'd be pretty annoyed if i had to spend another six weeks retrying
[23:10:11 CET] <J_Darnley> these people all make good points
[23:12:07 CET] <julius> again, not a real ffmpeg question, but maybe you have observed this. i tried 3 webcams so far (1 very cheap) but none of them did output like 30fps. all were like ~10fps. tested with cheese and another program, guvcview or something. is the linux video camera support so bad?
[23:12:52 CET] <J_Darnley> Make sure there is plenty of light on the subject
[23:12:54 CET] <jkqxz> Are you somewhere with low light?
[23:13:06 CET] <jkqxz> All webcams drop framerate down a lot in low light.
[23:13:39 CET] <julius> hm, daylight
[23:13:54 CET] <julius> "normal working" hours
[23:14:09 CET] <J_Darnley> Maybe its just shit on the RPi2
[23:14:12 CET] <TD-Linux> uvc web cams generally work the same as on any other os. did you see better framerate on a different os?
[23:14:13 CET] <firewated> could it be that the raspberry pi can't handle more than 10fps at the settings you're running at?
[23:14:32 CET] <jkqxz> What is your input format, then?  You can't fit 720p raw video over USB high speed, so it has to be MJPEG or a smaller image.
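jkqxz's bandwidth point checks out numerically, assuming uncompressed YUYV at 2 bytes per pixel:

```shell
# Raw YUYV 1280x720 at 30 fps, in Mbit/s, vs USB 2.0's 480 Mbit/s
# signaling rate (of which far less is usable for isochronous transfers).
width=1280; height=720; fps=30
mbit=$(( width * height * 2 * 8 * fps / 1000000 ))
echo "${mbit} Mbit/s of raw video"
```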
[23:15:33 CET] <julius> tested on a normal laptop
[23:15:50 CET] <julius> can only test on linux, currently
[23:16:54 CET] <firewated> tried lower resolution or bitrate?
[23:17:32 CET] <julius> yes
[23:17:50 CET] <julius> cheese had some options to go through, even at 640x480....around 15fps
[23:18:16 CET] <julius> so this is not a normal thing, one of the cams should have produced 30fps?
[23:19:41 CET] <jkqxz> Assuming you don't have really cheap webcams or a really weak machine, yes.
[23:20:52 CET] <julius> the laptop is old, but not that old. core i5
[23:21:17 CET] <TD-Linux> julius, is this the recorded video or the live view that you're seeing this btw?
[23:21:49 CET] <kepstin> julius: a lot of usb webcams can only do higher framerates/resolutions when using mjpeg - try "-input_format mjpeg" on it
[23:21:52 CET] <kepstin> maybe?
[23:23:12 CET] <jkqxz> Lower resolutions will generally fix that anyway.  The constraint is really the USB bandwidth.
[23:26:43 CET] <jkqxz> Looking at "uvcdynctrl -f" on this rubbishy builtin laptop camera, I get 30fps up to 848x480 in YUYV, or 11fps 1280x720 in YUYV, or 30fps 1280x720 in MJPEG.
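Putting kepstin's and jkqxz's observations together: forcing the compressed format keeps the full rate at 720p within USB 2.0 bandwidth. A capture sketch matching jkqxz's numbers (adjust device and sizes to what uvcdynctrl -f reports):

```shell
# Request MJPEG from the camera so 1280x720@30 fits over USB 2.0,
# and store the JPEG frames without re-encoding.
ffmpeg -f v4l2 -input_format mjpeg -video_size 1280x720 -framerate 30 \
    -i /dev/video0 -c:v copy capture.mkv
```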
[23:27:33 CET] <julius> TD-Linux, live...my hand movements before the lens were very sluggish
[23:27:45 CET] <julius> i was using my laptop for testing, it got usb2
[23:28:04 CET] <TD-Linux> julius, hmm. I get 60fps with cheese and my laptop webcam. it could be a terrible webcam, or potentially a really busted video driver
[23:28:25 CET] <julius> jkqxz, i get that listing too
[23:28:31 CET] <TD-Linux> er 30fps
[23:28:42 CET] <julius> jkqxz, doesnt mean it records at that rate..does it?
[23:29:31 CET] <julius> the built in laptop shows my hand very nicely. smooth
[23:29:37 CET] <julius> let me look for one of the cameras
[23:30:43 CET] <jkqxz> uvcdynctrl just dumps what the V4L2 UVC driver has worked out.  The camera really will do that unless something is very broken.
[23:40:24 CET] <julius> uvcdynctrl -d /dev/video1 -f says that 30fps at 1920x1080 with MJPG should be possible, but guvcview -f mjpg -d /dev/video1 only gives me like 10fps. the laptop webcam can do 30 at these light conditions
[23:42:03 CET] <psycho_> This isn't strictly an ffmpeg question, but I figure there is probably overlap between mplayer and ffmpeg.  I'm trying and failing to convert a gmp4 file.
[23:42:26 CET] <psycho_> This is the console output.
[23:42:27 CET] <psycho_> http://pastebin.com/NECc8UVy
[23:43:00 CET] <psycho_> I think I ran into this issue a year ago, and forgot to document the solution, but needed a different version of either mplayer or mencoder.
[23:44:50 CET] <jkqxz> julius:  Ouch.  That has always worked correctly in my experience, so I'm afraid I have no idea.
[23:45:47 CET] <julius> ok thanks
[23:45:52 CET] <julius> jkqxz, what webcam do you use?
[23:49:24 CET] <jkqxz> I've used quite a few different ones.  (Logitech C920 is still the best, despite being some years old now.)
[23:49:38 CET] <julius> yeah, read about that one
[23:49:43 CET] <julius> and quite expensive for my project
[23:49:56 CET] <julius> what else did you use?
[23:53:50 CET] <jkqxz> Not for a while, so I don't remember precisely.  A lot of cheap ones, which tended to be unmemorable.  The Microsoft ones were OK, though I remember them liking to make up weird colours.  The expensive Logitechs were generally the best, though the C930 was a small step backwards from the C920.
[23:57:02 CET] <Interrogator> is there any script which can let ffmpeg download Youtube-videos instead of youtube-dl ?
[23:57:55 CET] <firewated> you can get http links from youtube-dl which ffmpeg can use as inputs
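firewated's suggestion as a one-liner (youtube-dl -g prints the resolved media URL without downloading; the video id is a placeholder):

```shell
# Resolve the direct stream URL with youtube-dl and feed it to ffmpeg.
ffmpeg -i "$(youtube-dl -g 'https://www.youtube.com/watch?v=XXXXXXXXXXX')" \
    -c copy video.mp4
```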
[23:58:25 CET] <jkqxz> Integrated ones built in to screens were consistently worse than plausible discrete cameras.
[23:59:48 CET] <julius> jkqxz, yep
[00:00:00 CET] --- Fri Feb 26 2016
