[Ffmpeg-devel-irc] ffmpeg.log.20160229

burek burek021 at gmail.com
Tue Mar 1 02:05:01 CET 2016


[02:31:29 CET] <salviaD> how do I convert yuv4:2:2 to yuv4:2:0 ?
[02:36:50 CET] <c_14> -pix_fmt yuv420p
[02:36:55 CET] <J_Darnley> by using swscale
[02:37:10 CET] <J_Darnley> if you have a specific case, please be specific
[02:38:00 CET] <J_Darnley> but when using ffmpeg c_14's suggestion works
[02:47:33 CET] <salviaD> I am converting videos for a specific piece of hardware that cannot play 4:2:2, only 4:2:0... So I need ffmpeg to leave 4:2:0 files alone, but convert 4:2:2 to 4:2:0
[02:48:14 CET] <J_Darnley> then c_14 gave the right answer
[02:50:03 CET] <salviaD> thank you
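For reference, a minimal sketch of the conversion discussed above (filenames are hypothetical): -pix_fmt yuv420p asks swscale for 4:2:0 output, so 4:2:2 input gets converted while input that is already 4:2:0 needs no chroma conversion (the video is still re-encoded either way).

  ffmpeg -i input.mp4 -c:v libx264 -pix_fmt yuv420p -c:a copy output.mp4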
[03:38:49 CET] <yongyung> Do you guys know a tool that can search for frame duplication in a video file at the pixel level?
[03:39:11 CET] <yongyung> Because using ffmpeg to generate .png files is... slow, to say the least
[03:52:47 CET] <kepstin> yongyung: ffmpeg's "decimate" filter can detect identical (or similar, depending on thresholds) frames and remove them
[03:53:13 CET] <kepstin> but not sure what you mean "on a pixel level"
[03:55:11 CET] <yongyung> kepstin: I mean that the program shouldn't just check the presentation timestamps in the file but should actually analyze the frames, which apparently "decimate" does - but I don't really want to remove them, I want to know how many of them there are and where they are
[04:05:51 CET] <yongyung> I don't think it's a problem in general though, it's just that after running fine for ~60s, at the start of a high-fidelity scene OBS duplicates some frames, usually with the pattern ...nnnndnndnnnn... (n=new frame, d=duplicated frame), it doesn't seem too bad, but one can definitely see it and it's kinda annoying
[04:06:24 CET] <yongyung> whooops wrong channel
[04:06:56 CET] <kepstin> weird. obs is a live streaming thing right? I wonder if that's an issue with its bitrate adaptation, or if the encoder fails to hit the timelimit or something
[04:07:09 CET] <kepstin> is it dropping frames that are replaced with dups?
[04:33:16 CET] <yongyung> kepstin: OBS is a live streaming thing yeah, but you can do local recordings as well. Considering lossless has no bit rate limit I don't think so. It might be some problem with the capture method
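One possible way to count and locate duplicates without removing anything, not suggested above but a sketch assuming mpdecimate's usual debug output (one keep/drop line per frame, including its pts): run the filter into the null muxer and grep the log.

  ffmpeg -i input.mp4 -vf mpdecimate -loglevel debug -an -f null - 2>&1 | grep 'drop pts'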
[07:13:58 CET] <yongyung> I'm trying to extract every frame out of a file that doesn't seem to be exactly 60 FPS (presentation times vary slightly) with "ffmpeg -i in.mp4 frame%04d.png", but it gives me frequent "past duration too large" messages - which I find odd, since .png files don't have a frame rate. Is ffmpeg trying to match exactly 60 FPS, and can I somehow disable that behavior?
[08:18:20 CET] <ubitux> yongyung: -vsync vfr maybe
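Applied to the command above, that would look something like the following; -vsync vfr writes each decoded frame with its own timestamp instead of duplicating/dropping frames to hit a constant rate, which should also silence the "past duration" warnings.

  ffmpeg -i in.mp4 -vsync vfr frame%04d.png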
[12:12:47 CET] <Gp1> Hey, I got a question: when I open a video sent over multicast IP, an IGMP join request is sent. I'm told I need to resend the join request every minute to keep the multicast registration active; how can I do that?
[12:37:15 CET] <jkqxz> Gp1:  What doesn't work?  Your OS should already maintain its membership correctly with suitable responses to group membership queries.
[13:15:00 CET] <fidothe> I'm trying to get ffmpeg to listen for an RTMP stream so I can stream RTMP from a camera and get an mp4 out, using `ffmpeg -listen -i rtmp...` (paste is http://pastebin.com/mpxEpshe) It feels like I'm just leaving something important out...
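For comparison, the fuller form the same user posts later in this log (18:09) gives -listen an explicit value and keeps it before the input, roughly:

  ffmpeg -listen 1 -i rtmp://localhost:6001/live/test -c copy out.mp4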
[15:28:03 CET] <Carlrobertoh> Hi
[15:28:14 CET] <Carlrobertoh> I'm using the ffmpeg libraries to capture packets. My question is: how can I send them to a server for streaming?
[15:29:08 CET] <c_14> rtp/http/udp/tcp/pigeons
[15:30:16 CET] <Carlrobertoh> I want to use the rtsp protocol. Right now I've managed to capture the screen, decode it, encode it and simply write it into a file
[15:30:19 CET] <Carlrobertoh> but i want to stream it
[15:30:22 CET] <Carlrobertoh> not into a file
[15:30:49 CET] <c_14> Use the rtsp muxer/protocol to stream it to the server
[15:32:37 CET] <Carlrobertoh> i can't find any documentation for it
[15:33:29 CET] <Carlrobertoh> like how do I open an rtsp stream, how do I send the data, and so on
[15:35:30 CET] <c_14> afaik same as for file output
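A rough CLI illustration of that idea, with a hypothetical X11 screen-capture input and server address, and assuming an RTSP server that accepts published (ANNOUNCE/RECORD) streams; in the libraries the analogous step is opening the output context with the "rtsp" muxer and an rtsp:// URL instead of a filename.

  ffmpeg -f x11grab -framerate 25 -video_size 1280x720 -i :0.0 -c:v libx264 -preset veryfast -f rtsp rtsp://example.com:8554/screen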
[15:56:25 CET] <odinho> I wonder if anyone knows how to possibly work around this (old) bug, where A/V goes out of sync if the output is slow. https://trac.ffmpeg.org/ticket/2445
[16:15:20 CET] <zyme> If you're using a 3rd-party player, I swear I've seen audio sync correctors in the config that you could adjust manually as needed, but it's not elegant
[16:18:39 CET] <antiPoP> Hi, when scaling a video, does the -preset switch matter?
[16:19:39 CET] <furq> why wouldn't it
[16:20:12 CET] <furq> if you're asking if -preset will change the resampling method then no
[16:20:23 CET] <furq> they're completely unrelated
[16:23:14 CET] <antiPoP> but I mean, when using just a scale filter, the video is decoded and encoded, so the preset should have an impact on the output, right?
[16:23:27 CET] <J_Darnley> Yes.  It controls the encoder
[16:23:31 CET] <furq> what he said
[16:25:12 CET] <antiPoP> ok, thanks
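In other words, in a command like this sketch (filenames hypothetical), scale controls the resizing while -preset only controls how much effort libx264 spends encoding the scaled result:

  ffmpeg -i input.mp4 -vf scale=1280:-2 -c:v libx264 -preset slow -crf 20 -c:a copy output.mp4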
[16:32:58 CET] <k13nox> Hi, I use ffmpeg to build the final recording of a live meeting. People can join or leave the meeting at any time. I need to change the layout: for example, when the first person joins with a video stream I need it full screen in the recording, and when the second joins I need to split it in two.
[16:33:11 CET] <k13nox> It's hard to build the right recording and handle all the possibilities. Is ffmpeg the right direction?
[16:37:03 CET] <J_Darnley> I have no idea how easy dynamically changing layout like that is using ffmpeg's filters
[16:37:13 CET] <J_Darnley> (I would guess that it is hard)
[16:40:44 CET] <k13nox> Yes :/ I'll go spam filter_complex :D
[16:42:19 CET] <k13nox> And cut all the recordings to match the number of people.
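Nothing in the discussion above prescribes this, but for the static two-person case a filter_complex sketch could look like the following (hypothetical inputs of equal length; hstack only needs both inputs at the same height):

  ffmpeg -i person1.mp4 -i person2.mp4 -filter_complex "[0:v]scale=-2:720[l];[1:v]scale=-2:720[r];[l][r]hstack[v]" -map "[v]" -c:v libx264 out.mp4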
[16:42:46 CET] <Padde_> Currently I am trying to realize mobile streaming for our application. We are trying different solutions here and we also analyzed some other companies who are doing the same. There we saw that some of them provide their mobile livestreams via base64-encoded pictures. Besides that there is an extra m3u9 audio playlist to provide the audio stream. Does anyone have experience with that kind of implementation?
[16:42:46 CET] <Padde_> I am not able to sync the images to my audio files and I have no clue how to start. Images are already generated on the fly, as well as the audio playlist. Any help would be appreciated
[16:43:47 CET] <J_Darnley> oh god!  base64 encoding the output?  what a way to waste bandwidth
[16:45:43 CET] <J_Darnley> Just send multiplexed audio and video stream like a normal person
[16:46:42 CET] <furq> wtf
[16:46:46 CET] <furq> why would you ever do that
[16:47:18 CET] <furq> also i hope m3u9 wasn't a typo
[16:47:37 CET] <furq> presumably that's an m3u which uses utf-9
[16:48:33 CET] <tp__> like utf-9 even exists
[16:49:49 CET] <Padde_> because of fucking ios! we have to be able to provide the mobile stream in "windowed" mode - and yeah, i meant m3u8
[16:49:56 CET] <furq> tp__: https://tools.ietf.org/html/rfc4042
[16:50:02 CET] <EmleyMoor> Gosh, it's Herman Hollerith's birthday
[16:50:13 CET] <tp__> "1. april" lol
[16:50:18 CET] <tp__> its not serious
[16:50:24 CET] <J_Darnley> Sure it is
[16:50:34 CET] <J_Darnley> It is *deadly serious*
[16:50:34 CET] <furq> nothing gets past you does it
[16:50:40 CET] <Padde_> if we directly use our hls outputs, we can only play the video fullscreen. native app is not possible, because of the market policies ;)
[16:50:54 CET] <J_Darnley> What?
[16:51:05 CET] <J_Darnley> The player controls how shit is shown
[16:51:07 CET] <tp__> nope, but apparently past you, if you really think he meant m3u9
[16:51:35 CET] <J_Darnley> if some player can't do what you want, throw it in the trash
[16:51:40 CET] <Padde_> @J_Darnley pls show me how you can handle it on safari, iphone - they do not respect the inline options
[16:51:40 CET] <furq> i was hoping that the app he was looking at had an m3u8 and an m3u9 playlist
[16:51:56 CET] <J_Darnley> Padde_: remove iOS, install Windows
[16:52:03 CET] <tp__> hehe
[16:52:09 CET] <Padde_> ... what a solution - we got around 20k users ... tell them
[16:52:47 CET] <furq> i have no experience with iOS development but i'm fairly confident that "base64 encode the video" is not the solution to any problem
[16:53:23 CET] <Padde_> it is, for playing the video in windowed mode. i analyzed 5 other communities who are doing the same as we do. and they all do it that way
[16:53:29 CET] <furq> and if it is then iOS is even worse than i thought
[16:53:37 CET] <Padde_> i agree to that ;)
[16:53:44 CET] <firewated> Padde_: youtube apps play half screen, surely they're not doing what you describe
[16:54:22 CET] <Padde_> apps are native - there you can do whatever you implement. but we can only provide a web-based solution
[16:55:29 CET] <firewated> didn't realise you had a web solution, my bad
[16:55:32 CET] <furq> so an hls stream in a video tag opens the native player?
[16:55:47 CET] <Padde_> yep
[16:56:21 CET] <Padde_> for android / windoof you can simply do smth like "webkit-playsinline="1"" <- but iphone will not recognize this option - ipad does!
[16:56:46 CET] <furq> looks like no hls.js for iOS either
[16:56:54 CET] <furq> i guess safari on iOS doesn't support MSE
[16:57:45 CET] <furq> and obviously iOS doesn't support DASH because that would make things too easy
[16:57:52 CET] <Padde_> maybe, I don't know. but since all competitors are doing it the way I described, I think there is no "good" solution
[16:58:06 CET] <furq> how do youtube do livestreams on iOS
[16:58:10 CET] <Padde_> app
[16:58:11 CET] <Padde_> :)
[16:58:59 CET] <Padde_> we really would provide our custom app, but there is no way to get into the app stores with adult content
[17:00:01 CET] <furq> is this vod or live
[17:00:19 CET] <Padde_> live
[17:00:43 CET] <furq> well i'm out of ideas
[17:00:52 CET] <Padde_> yeah, we too :D damn it
[17:01:00 CET] <furq> i guess you could ask apple for support
[17:01:06 CET] <Padde_> :D
[17:01:23 CET] <furq> that joke would work better if you could see me burst out laughing as i typed it
[17:01:44 CET] <Padde_> it is funny because we really did! they told us to make an app ... then we told them what we do and they hung up^^
[17:02:24 CET] <furq> i'm going to go ahead and assume that the reason the native player is no good is because it would prevent you from typing messages to the sexy girls
[17:03:05 CET] <Padde_> exactly ... that is what the customers are paying for. so we can not provide the livestream in fullscreen mode, or they get mad
[17:04:43 CET] <Padde_> however, thanks for your input! We will keep trying ...
[17:05:06 CET] <firewated> you can make an app that is a web browser, maybe have the home page as your website, then you can display videos however you want
[17:06:36 CET] <furq> firewated: apple don't allow adult content on the app store
[17:06:57 CET] <firewated> here's an example https://itunes.apple.com/us/app/merlin-webm-player-browser/id1059331577?mt=8
[17:07:18 CET] <furq> only apps where you can buy one million itchy & scratchy dollars for $10,000 of your parents' real money are allowed
[17:08:20 CET] <mcmic> Hello, I used ffmpeg first to record my screen into an mp4 file and then to convert that to a webm file. The video looks weird in Firefox, as if some part was being drawn on top of the rest, zoomed in. Is there any ffmpeg option I can use to fix this problem?
[17:09:06 CET] <mcmic> (The video plays fine in VLC. It's uploading to http://mcmic.haxx.es/speedrun/supertuxsub2330.webm; the upload is slow, but I already see the problem with the part that's uploaded if I load it in Firefox)
[17:12:04 CET] <firewated> mcmic: could it be a firefox issue? do other webms work ok? if they do, maybe you can compare the differences between them and your webm
[17:12:39 CET] <mcmic> firewated: It could be of course. Not sure where to find example webms
[17:15:18 CET] <zeryx> I have a couple of wrappers that split a video into frames (on fps), and another that recombines them, where can I find a comprehensive list of options for this operation?
[17:15:43 CET] <zeryx> The only ones I'm aware of are FPS & file extension, but stuff like quality & bitrate are missing
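A sketch of both directions with quality controls, assuming JPEG frames (-q:v is the JPEG quality scale, lower is better) and an x264 rebuild; filenames and frame rates are placeholders:

  ffmpeg -i input.mp4 -vf fps=25 -q:v 2 frame%05d.jpg
  ffmpeg -framerate 25 -i frame%05d.jpg -c:v libx264 -crf 18 -pix_fmt yuv420p rebuilt.mp4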
[17:18:35 CET] <firewated> mcmic: https://boards.4chan.org/wsg/thread/950507/cute-animals-thread
[17:19:48 CET] <mcmic> firewated: http://video.webmfiles.org/big-buck-bunny_trailer.webm works fine
[17:21:43 CET] <firewated> maybe you could compare the two using ffmpeg
[17:22:20 CET] <mcmic> firewated: could it be a framerate thing? How do I display this kind of info?
[17:22:49 CET] <firewated> i just do ffmpeg -i yourvideo.webm
[17:24:58 CET] <firewated> by default i think ffmpeg encodes to vp9 whereas the webm you posted is vp8
[17:54:28 CET] <mcmic> firewated: http://base-n.de/webm/VP9%20Sample.html both videos look fine
[17:54:44 CET] <mcmic> firewated: does http://mcmic.haxx.es/speedrun/supertuxsub2330.webm play correctly in your browser?
[17:56:12 CET] <mcmic> firewated: OK, someone on #firefox pointed me to https://bugzil.la/1190939 and indeed the video in the ticket has the same problem
[17:56:40 CET] <firewated> the second one plays ok in chrome but seeking doesn't seem to work too well
[17:56:45 CET] <mcmic> So they say it's related to «4:4:4 chroma subsampling (profile 1)»
[17:57:01 CET] <mcmic> firewated: maybe the easiest thing for me is to use vp8 instead
[17:57:32 CET] <mcmic> firewated: how do I ask ffmpeg for vp8?
[17:58:16 CET] <firewated> -c:v vp8 but I'm pretty sure you can change the chroma subsampling thing too
[17:58:17 CET] <c_14> -c:v libvpx
[17:58:36 CET] <c_14> You could also use 420
[17:58:43 CET] <c_14> -pix_fmt yuv420p
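Putting the two suggestions together, a possible re-encode (filenames hypothetical; -b:v sets the rate cap libvpx expects alongside -crf):

  ffmpeg -i in.mp4 -c:v libvpx -crf 10 -b:v 2M -pix_fmt yuv420p -c:a libvorbis out.webm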
[18:09:06 CET] <fidothe> I'm trying to capture an RTMP feed sent by a camera. On my friend's mac (with the latest nightly) the following command works fine: `ffmpeg -v debug -listen 1 -i rtmp://localhost:6001/live/test -c copy test.mp4`, but on a Ubuntu box with 2.8.5 it fails with an RTMP_connect0 error. Output is in this paste: http://pastebin.com/ReagVEFp Any suggestions on
[18:09:06 CET] <fidothe> what's wrong?
[18:58:28 CET] <Gp1> Hey, I got a question: when I open a video sent over multicast IP, an IGMP join request is sent. I'm told I need to resend the join request every minute to keep the multicast registration active; how can I do that?
[19:19:15 CET] <jkqxz> Gp1:  What doesn't work?  Your OS should already maintain its membership correctly with suitable responses to group membership queries.
[19:35:04 CET] <Gp1> jkqxz, I am developing against a SAP server that sends an SDP with the video multicast, and I need to maintain both the SAP multicast and the video multicast... is that something that is taken care of at the OS level?
[19:35:56 CET] <Gp1> It is for an internal but complex WAN network with a lot of router hops
[19:37:00 CET] <DHE> jkqxz: that assumes something is sending queries. IGMP-snooping switches are passive-only by default
[19:38:00 CET] <DHE> Gp1: you need an IGMP querier on the network. typically a router fulfills this job. if no such router, you might be able to get a switch or PC to do the job. the switch is preferable. I know JunOS can do this
[19:39:43 CET] <Gp1> What do you mean? the network is built with standard switches... my question is: who is supposed to maintain the open multicast registration?
[19:40:11 CET] <dustobub> hey. I'm looking to discuss some concepts with a video codec / playback engineer. Does anyone have any recommendations on where to connect with someone? I was thinking here might be a good start.
[19:41:05 CET] <DHE> Gp1: the switches will time out your IGMP membership after a few minutes. it's the responsibility of a router to periodically send IGMP queries and all still-interested parties refresh their membership
[19:42:05 CET] <DHE> substituting the querier with something else will be required, but choosing what to use is important
[19:45:54 CET] <Gp1> I am a little new to this so please forgive me if I am wrong, but when I ask to receive a multicast, I am "subscribing" at my 1st-hop switch (if I don't have a router) and it is supposed to subscribe further; I can understand that will time out. The querier you mention is a 3rd party that is supposed to check if my multicast is still required? If so, who is supposed to answer it?
[19:48:23 CET] <J_Darnley> dustobub: ffmpeg has a consulting page if you want to pay someone
[19:49:06 CET] <J_Darnley> perhaps someone will answer your questions
[19:49:15 CET] <dustobub> J_Darnley, thanks. I just noticed that. I'll try to ping baptiste.coudurier at gmail.com
[19:49:44 CET] <J_Darnley> you can also try just asking here, #ffmpeg-devel, ffmpeg-user mailing list, ffmpeg-devel mailing list
[19:53:50 CET] <DHE> Gp1: you need 1 device on a vlan, preferably the device most likely to receive all traffic, to send periodic queries. Like I said, normally a router. I use a Juniper switch.
[19:54:15 CET] <DHE> it will send queries, all devices will send their list of active groups and all switches will have their liveliness refreshed
[19:55:27 CET] <Gp1> I believe my network is based on cisco switches, which I can only assume support this functionality too? But who answers the query?
[20:06:54 CET] <Fyr> guys, I can't send anything to #help channel.
[20:07:10 CET] <Fyr> what's going on there?
[20:07:15 CET] <Gp1> DHE, please help:)
[20:08:01 CET] <J_Darnley> What help channel?
[20:08:16 CET] <Fyr> '#freenode'
[20:08:40 CET] <Fyr> I realized that it's now a voiced channel.
[20:08:53 CET] <Fyr> how do people get help there without a voice?
[20:09:09 CET] <DHE> Gp1: all multicast receivers answer the queries
[20:09:41 CET] <J_Darnley> No idea.  Last time I just went there and said things
[20:10:35 CET] <DHE> Gp1: not a cisco expert, there is apparently a "ip igmp snooping querier" command you can put on a vlan interface to make it do querying. set this up on the right vlan on whatever switch is closest to the systems that most receive multicast
[20:10:37 CET] <Gp1> Ok, my computer is a multicast receiver; is windows answering the query since it knows the socket is still open, or is there something that needs to be implemented here?
[20:10:56 CET] <DHE> windows will answer it. should just work
[20:11:49 CET] <Gp1> Oh ok, I thought it might be something left for the program (in this case ffmpeg) to implement
[20:12:18 CET] <DHE> nope. I just use "ffmpeg -i udp://239.0.1.2:3456 ..." and it works fine. (linux though)
[20:12:37 CET] <Gp1> Ok... I'll check the network then...
[20:12:44 CET] <Gp1> thank you for your help
[20:13:23 CET] <DHE> so pass that advice on to your network engineers (as a guideline, not a rule, I am not a CCSA or anything) and hopefully that will solve it
[20:13:43 CET] <Gp1> sure will
[20:14:19 CET] <T4ng10r> hi
[20:14:36 CET] <T4ng10r> has anyone used drawtext with pts and gmtime?
[20:14:36 CET] <Gp1> will check on it, I was told that my program that uses ffmpeg doesn't send an IGMP join after the initial join; I assume it's the querier's fault, but I'll check
[20:14:43 CET] <Gp1> Have a good day
[20:15:08 CET] <DHE> most likely you have none. wireshark can tell you, just use filter "igmp"
[20:15:15 CET] <T4ng10r> [Parsed_drawtext_0 @ 0x3b28120] Both text and text file provided. Please provide only one
[20:15:15 CET] <T4ng10r> [AVFilterGraph @ 0x3b28840] Error initializing filter 'drawtext' with args 'fontfile=Verdana.ttf:expansion=normal: text=%{pts\:gmtime\:0\:2015-11-25 20\\:15\\:24}: r=30: x=(w-tw)/2: y=h-(2*lh): fontcolor=white:box=1:boxcolor=0x000000@1'
[20:15:32 CET] <T4ng10r> with a custom-built ffmpeg I receive something like this
[20:15:56 CET] <Gp1> kk
[20:16:02 CET] <Gp1> thanks
[20:16:17 CET] <J_Darnley> "Both text and text file provided. Please provide only one"
[20:16:20 CET] <J_Darnley> RTFE!
[20:16:38 CET] <T4ng10r> sorry - I'm not that skillful
[20:16:44 CET] <T4ng10r> where do I provide textfile?
[20:17:07 CET] <J_Darnley> actually, it looks like you don't
[20:17:09 CET] <J_Darnley> sorry
[20:17:19 CET] <T4ng10r> I understand that maybe too many escape chars may fool the parsing mechanism
[20:17:39 CET] <T4ng10r> I tried to understand this from the code - but I failed
[20:18:45 CET] <T4ng10r> I want to insert my time as a reference point - but ... perhaps the format is incorrect or there aren't enough '\' escapes
[20:19:21 CET] <J_Darnley> try removing the spaces after the colons
[20:19:59 CET] <J_Darnley> and try removing the colons from the text you want drawn
[20:20:14 CET] <T4ng10r> after I used 4 backslashes it worked
[20:20:27 CET] <T4ng10r> but without the time changing
[20:21:34 CET] <T4ng10r> the time is read from ffprobe and inserted into the drawtext filter
[20:25:21 CET] <T4ng10r>         const char *timefmt = argc >= 3 ? argv[2] : "%Y-%m-%d %H:%M:%S";
[20:25:36 CET] <T4ng10r> this suggests that ffmpeg should accept this time format
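One way around the colon-escaping problem, not confirmed in the discussion above but a sketch: pass the reference time to the pts/gmtime expansion as a Unix-epoch offset (a plain number) rather than as a formatted date string; 1448482524 is 2015-11-25 20:15:24 UTC.

  ffmpeg -i in.mp4 -vf "drawtext=fontfile=Verdana.ttf:text='%{pts\:gmtime\:1448482524}':x=(w-tw)/2:y=h-(2*lh):fontcolor=white:box=1:boxcolor=0x000000@1" out.mp4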
[20:47:56 CET] <T4ng10r> how does enable work in drawtext?
[20:48:56 CET] <T4ng10r> I'm trying drawtext=enable=betwen(t,1,5) and it complains
[20:49:06 CET] <T4ng10r> Unknown function in 'betwen(t,1,5)'
[20:49:18 CET] <T4ng10r> so question is - what are acceptable functions
[20:49:54 CET] <kepstin> T4ng10r: https://www.ffmpeg.org/ffmpeg-utils.html#Expression-Evaluation
[20:50:10 CET] <kepstin> it will probably work if you spell "between" properly.
[20:50:42 CET] <T4ng10r> such a shame - between
[20:51:13 CET] <T4ng10r> yes, thank you
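For reference, the working form, with the expression quoted so the filtergraph parser doesn't split on the commas:

  ffmpeg -i in.mp4 -vf "drawtext=fontfile=Verdana.ttf:text='hello':enable='between(t,1,5)'" out.mp4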
[21:25:24 CET] <zeryx> can you create a concat file for images?
[21:25:37 CET] <J_Darnley> probably
[21:25:54 CET] <J_Darnley> but the image2 demuxer might be easier
[21:26:22 CET] <J_Darnley> I argue that an image is video with only 1 frame
[21:26:57 CET] <zeryx> yea
[21:27:12 CET] <zeryx> I'm just not sure how the binaries handle it
[21:27:22 CET] <zeryx> I'd like to duplicate my video concat wrapper
[21:28:48 CET] <J_Darnley> I say go ahead and test it with arbitrary images and see what happens
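Two hedged sketches, with hypothetical file names: the image2 route with a numbered sequence, and a concat-demuxer list that gives each image an explicit duration.

  ffmpeg -framerate 30 -i img%04d.png -c:v libx264 -pix_fmt yuv420p out.mp4

  # images.txt for the concat demuxer
  file 'img0001.png'
  duration 0.0333
  file 'img0002.png'
  duration 0.0333

  ffmpeg -f concat -i images.txt -vsync vfr -c:v libx264 -pix_fmt yuv420p out.mp4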
[21:59:22 CET] <GigaRoc> can someone help me compile ffmpeg with libmfx
[21:59:55 CET] <GigaRoc> I'm building ffmpeg/libmfx with mingw64 --toolchain=msvc
[22:00:30 CET] <GigaRoc> but ./configure fails saying it can't find stdc++.lib
[22:04:43 CET] <J_Darnley> If you're building with mingw, why are you telling configure that you want to use msvc?
[22:13:57 CET] <GigaRoc> i'm using mingw for the bash command
[22:14:22 CET] <GigaRoc> if i don't compile with libmfx, it wroks
[22:14:29 CET] <GigaRoc> s/wroks/works
[22:14:52 CET] <J_Darnley> Then kindly post the config.log file (as you were instructed to)
[22:19:00 CET] <derekprestegard> I'm doing some PSNR tests for x264. If I have a difference of .165 dB, what does that equate to in terms of percentage? I'm seeing lots of stuff on Google about converting dB to %, but it's mostly relating to audio. Does that apply directly to PSNR or is there a special formula?
[22:19:23 CET] <GigaRoc> oh sorry, http://pastebin.com/aVHKxVfJ
[22:19:31 CET] <J_Darnley> percentage of what?
[22:20:05 CET] <derekprestegard> J_Darnley: like this encode is x% worse than this other encode
[22:20:14 CET] <derekprestegard> is that a logical thing to say?
[22:20:30 CET] <derekprestegard> or rather "this encode is x% more impaired relative to the source than this other encode"
[22:20:33 CET] <J_Darnley> Oh for crying out loud!  Useless rubbish that is msvc
[22:22:05 CET] <GigaRoc> I agree, but for what i'm doing, i need ffmpeg libs to work with VS2013
[22:23:11 CET] <J_Darnley> oh maybe I should be blaming that useless shell
[22:24:43 CET] <J_Darnley> sorry I have no suggestions on how to fix your problem
[22:24:56 CET] <J_Darnley> derekprestegard: yes that does make some sense
[22:25:15 CET] <J_Darnley> but PSNR is a comparison with the input video
[22:25:32 CET] <J_Darnley> with infinite being identical
[22:25:34 CET] <derekprestegard> right
[22:25:48 CET] <J_Darnley> how it goes from there I don't remember
[22:26:00 CET] <derekprestegard> I see
[22:26:30 CET] <derekprestegard> I'm trying to evaluate (objectively with PSNR and SSIM, and subjectively with visual analysis) the impact of switching from -preset slow to -preset medium or fast in x264 for a streaming service
[22:28:25 CET] <J_Darnley> Try reading some of the old MSU codec comparisons
[22:28:33 CET] <J_Darnley> maybe they will help
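For the dB-to-percentage question above: PSNR = 10*log10(MAX^2/MSE), so a PSNR gap of d dB corresponds to an MSE ratio of 10^(d/10); for d = 0.165 that is 10^0.0165 ≈ 1.039, i.e. roughly 3.9% higher mean squared error for the lower-scoring encode. Whether that difference is visible is the separate, subjective question.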
[22:42:41 CET] <utack> do you know if sources for this android app have been published somewhere? https://play.google.com/store/apps/details?id=com.silentlexx.ffmpeggui&hl=de
[22:51:18 CET] <klaxa> utack: I can't find licensing information on the Play Store page, the dev's site, or in the app
[22:51:50 CET] <utack> exactly, that is my problem with it
[22:51:57 CET] <klaxa> i'm also no expert in licensing laws and stuff, but i think he might be obligated to release source code depending on the version of ffmpeg he used
[22:52:14 CET] <klaxa> *the version of ffmpeg he configured, built and packed
[22:52:30 CET] <klaxa> you can send him an email though
[22:52:50 CET] <utack> already did, about a full day ago, but i wanted to ask in case someone here made that app
[22:57:55 CET] <J_Darnley> like with most license violators they will probably ignore you
[23:19:42 CET] <llogan> rarely, an "accidental" violator will find their ticket on trac and attempt to fix it
[23:28:05 CET] <utack> J_Darnley in this case can't you ask google to take it down?
[23:31:10 CET] <J_Darnley> I could if they are violating the license on my code
[23:41:49 CET] <mattf000> is there any reason why using intra-refresh would increase the latency of a realtime stream?
[00:00:00 CET] --- Tue Mar  1 2016


More information about the Ffmpeg-devel-irc mailing list