[Ffmpeg-devel-irc] ffmpeg.log.20180214

burek burek021 at gmail.com
Thu Feb 15 03:05:01 EET 2018


[00:00:22 CET] <JEEB> right...
[00:01:15 CET] <JEEB> SortaCore: this is the commit that added forcing the MOV/ISOBMFF time base like that http://git.videolan.org/?p=ffmpeg.git;a=commit;h=b02493e47668e66757b72a7163476e590edfea3a
[00:03:50 CET] <SortaCore> *hmm intensifies*
[00:04:29 CET] <JEEB> anyways, it shouldn't be detrimental to you unless you're planning on making long enough clips that (I think) 64 bit integers wouldn't be enough
[00:04:48 CET] <JEEB> not nice that it doesn't give you the time base that you want, of course
[00:05:25 CET] <JEEB> granted, the next year someone had to add an AVOption to tell the darn thing to just use a timescale that's being set
[00:05:57 CET] <klaxa> mifritscher3: i think you'll need to find a way to tell whatever java library you are using to do what you want
[00:06:09 CET] <JEEB> oh, and of course it's a global option for all video tracks in the mux
[00:06:30 CET] <klaxa> like always read as many frames as possible and only process the last one read? i think that's what you want
[00:06:37 CET] <JEEB> SortaCore: set the AVOption video_track_timescale to the time scale you want to force it to have
[00:06:40 CET] <klaxa> not sure if it's just as easy as that though
[00:06:53 CET] <JEEB> and it will override that automated addition of precision
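
A minimal CLI sketch of the option JEEB describes (file names are placeholders; an API user would pass the same mov/mp4 muxer option, e.g. via the options dictionary given to avformat_write_header()):

    ffmpeg -i input.mp4 -c copy -video_track_timescale 90000 output.mp4
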
[00:08:23 CET] <JEEB> I have a hunch that now that AVStream.time_base is being utilized as the base time base, an API user would actually tell you if a "variable FPS compatible" timescale is needed
[00:10:58 CET] <mifritscher3> klaxa: this library is only a tiny wrapper - it has no own threads or something
[00:11:51 CET] <klaxa> you don't need threads to detect if a read is blocking?
[00:12:14 CET] <mifritscher3> it does not automatically read
[00:12:16 CET] <klaxa> or am i misunderstanding what you are trying to say?
[00:12:57 CET] <mifritscher3> it just sets up the library to start, but there is no actual reading
[00:13:20 CET] <mifritscher3> and as I said, while the process has enough CPU resources the memory usage is steady
[00:15:03 CET] <klaxa> if it is steady (staying the same, not changing) where is the problem? you said if the process is paused for a few seconds the memory usage goes up
[00:15:38 CET] <klaxa> if you cannot modify the library i think you are out of luck then
[00:17:43 CET] <mifritscher3> only in certain error cases (low CPU resources, paused for dumping, some kind of problem with the source) does it eat memory
[00:17:49 CET] <SortaCore> I don't mind that it uses it, JEEB, I was just wondering if it was a good idea to replicate that in my code
[00:18:16 CET] <JEEB> you're making your own ISOBMFF or MOV muxer?
[00:18:19 CET] <SortaCore> it sets mux_timebase and stream time_base I think
[00:18:29 CET] <SortaCore> no, just using it in my ffmpeg-accessing dll
[00:18:45 CET] <JEEB> then you're just setting the time base for the AVStream, which then gets calculated like that in movenc
[00:18:53 CET] <JEEB> the timescale value does not get fed anywhere back
[00:18:55 CET] <mifritscher3> klaxa: the problem is that it eats memory if there are problems ;-)
[00:18:57 CET] <JEEB> (to your AVStream time base)
[00:19:03 CET] <SortaCore> it will transcode from rtsp to mp4, but the pts/dts timing was frameNum++, which wasn't ideal
[00:19:07 CET] <JEEB> so no, you as an FFmpeg API customer do not have to do anything
[00:19:11 CET] <mifritscher3> I can modify the library
[00:19:17 CET] <SortaCore> ok, good to know
[00:19:26 CET] <klaxa> yes probably because of frames being buffered
[00:19:28 CET] <JEEB> you just stumbled upon a random hack
[00:19:39 CET] <SortaCore> I was monitoring every time time_base is set
[00:19:45 CET] <JEEB> but it's not
[00:19:47 CET] <JEEB> it's the other way
[00:19:51 CET] <JEEB> time base is used
[00:20:05 CET] <mifritscher3> klaxa: but why are they buffered only if there are problems and not in normal operation (where I don't even request the frames from ffmpeg)?
[00:20:07 CET] <JEEB> timescale is what gets written for that track within MOV/ISOBMFF, yes
[00:20:38 CET] <JEEB> but it doesn't go the other way, it only goes AVStream->timebase -> MOV/ISOBMFF track's timescale
[00:21:22 CET] <JEEB> SortaCore: and yes, with constant frame rate content the delta for DTS is always +frame_duration_in_timescale
[00:21:23 CET] <mifritscher3> klaxa: I just tried something - I can even reproduce the problem with the plain ffmpeg binary
[00:21:26 CET] <klaxa> well if i had my magic crystal ball i could look right into your source-code and tell you, but alas it is in repairs
[00:21:38 CET] <JEEB> CTS offsets can be nonzero if you have B-frames
[00:22:14 CET] <JEEB> (CTS is pretty much what the rest of the world calls PTS)
[00:22:25 CET] <JEEB> (MOV/ISOBMFF had to be special like that)
[00:22:39 CET] <mifritscher2> klaxa: the high level code can be found under https://github.com/bytedeco/javacv/issues/846 . for the other files, just a moment
[00:22:49 CET] <SortaCore> I get Video: h264 (h264_qsv) (avc1 / 0x31637661), nv12, 704x480, q=2-31, 1000 kb/s, 6.25 fps, 12800 tbn, 6.25 tbc
[00:23:23 CET] <SortaCore> will RTSP src have a decently-timed dts/pts?
[00:23:33 CET] <JEEB> I have no idea
[00:23:51 CET] <SortaCore> same
[00:24:01 CET] <JEEB> also do note that things where you don't get nice PTS with the frame reordering can give you problems
[00:24:25 CET] <mifritscher2> klaxa: the other 2 interesting files: https://github.com/bytedeco/javacv/blob/master/src/main/java/org/bytedeco/javacv/FFmpegFrameGrabber.java and https://github.com/bytedeco/javacv/blob/master/src/main/java/org/bytedeco/javacv/FrameGrabber.java
[00:24:34 CET] <JEEB> not necessarily, and it should be obvious as hell
[00:25:17 CET] <klaxa> mifritscher2: how did you test it with ffmpeg?
[00:25:18 CET] <SortaCore> so is dts/pts = FrameNum++ ok, or how would I convert the rtsp src dts/pts to output?
[00:25:42 CET] <JEEB> the PTS is what you have to be careful about due to possible b-frames
[00:25:52 CET] <JEEB> DTS and duration are a no-brainer if the input is constant frame rate
[00:26:12 CET] <JEEB> also what sort of timestamps do you get from the input, if any?
[00:26:44 CET] <SortaCore> I'm not sure how to answer that, where should I look?
[00:27:01 CET] <JEEB> the values you get from AVPackets?
[00:27:04 CET] <klaxa> wait... so are you calling FFmpegFrameGrabber.grabFrame() in your "high-level" code?
[00:27:07 CET] <JEEB> unless you have your own demuxing
[00:27:13 CET] <klaxa> because then you *are* reading the stream
[00:27:25 CET] <mifritscher2> klaxa: ffmpeg  -f dshow -i video="Integrated Webcam" null  - while running it has a steady 24 MB, after dumping it (in the task manager) it goes up to steady 34 MB.
[00:27:29 CET] <SortaCore> I'll chuck a debug func on to output them and look
[00:27:36 CET] <SortaCore> actually, there's debug_ts
[00:27:43 CET] <JEEB> yes, ffmpeg.c has -debug_ts
[00:27:58 CET] <JEEB> which conveniently shows you timestamps at various spots in the darn thing
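
A hedged sketch of the flag JEEB mentions; the RTSP URL is a placeholder, -f null just discards the output, and the timestamp trace goes to stderr:

    ffmpeg -debug_ts -i rtsp://camera.example/stream -c copy -f null - 2> timestamps.log
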
[00:28:13 CET] <mifritscher2> klaxa: I disabled the grabFrame call (dummy=true)
[00:28:42 CET] <mifritscher2> (it makes no difference if I enable the call)
[00:30:23 CET] <mifritscher2> sorry, wrong command line - meant ffmpeg  -f dshow -i video="Integrated Webcam" c:\temp\o.avi
[00:30:24 CET] <SortaCore> https://hastebin.com/jirewuceyu.hs
[00:31:17 CET] <JEEB> ok, seems like you're getting timestamps?
[00:31:26 CET] <JEEB> pkt_pts/dts
[00:31:45 CET] <SortaCore> yep, some initial ones negative for some reason
[00:32:31 CET] <JEEB> not really
[00:33:05 CET] <JEEB> the off/off_time stuff is by FFmpeg
[00:33:13 CET] <buu> So uh
[00:33:13 CET] <klaxa> hmmm... no idea tbh then, it doesn't happen when using just a plain file?
[00:33:17 CET] <JEEB> as you can see by the demuxer+ffmpeg start of the line
[00:33:20 CET] <buu> Should I be able to see 'soft' subs using ffplay?
[00:33:25 CET] <SortaCore> why does ffmpeg do that?
[00:33:31 CET] <buu> Do I need some option to enable subs?
[00:33:47 CET] <JEEB> SortaCore: it moved the initial packet to timestamp zero
[00:34:19 CET] <SortaCore> why is dts on first packet higher than second packet?
[00:35:05 CET] <mifritscher2> klaxa: happens on a normal file as well
[00:35:48 CET] <buu> should I check decoder status or something?
[00:36:21 CET] <klaxa> buu: i think you need the subtitles video filter (-vf subs=somefile)
[00:36:32 CET] <klaxa> where somefile can be the same as your input file if it has subtitles embedded
[00:36:54 CET] <buu> klaxa: !!!
[00:37:11 CET] <klaxa> mifritscher2: and you are sure it's not the dump that is using up the additional memory?
[00:37:47 CET] <klaxa> uh -vf subtitles=somefile of course
[00:37:56 CET] <klaxa> or -vf ass=somefile if it is .ass subs
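
A sketch of what klaxa describes, assuming the build has the subtitles filter (libass) enabled and the embedded subs are text-based; the file name is a placeholder:

    ffplay -vf subtitles=movie.mkv movie.mkv
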
[00:38:25 CET] <buu> klaxa: That option made ffplay really unhappy
[00:38:34 CET] <mifritscher2> klaxa: fairly - as I "abuse" the dump only as a way to provoke it. I have the same problem sometimes if there are severe problems with the source (WLAN...)
[00:38:36 CET] <buu> By which I mean I got like 2 seconds of audio and no video
[00:39:48 CET] <buu> Yeah, I get no audio or video but ffplay thinks it is running
[00:39:51 CET] <klaxa> mifritscher2: maybe that's a bug then? i have no windows to reproduce, but some devs do, maybe filing a bugreport with steps to reproduce will shed light on this
[00:39:53 CET] <mifritscher2> but with the native ffmpeg process it doesn't go up as much, so I'm not sure if it has the same cause
[00:40:49 CET] <mifritscher2> I think I'll try to throw some debugger / memory profiler at it tomorrow
[00:41:01 CET] <JEEB> SortaCore: it awfully looks as if an offset was applied to the further packets since they all are +14400 for both PTS and DTS afterwards
[00:41:37 CET] <buu> Augh.
[00:42:24 CET] <JEEB> SortaCore: unless you see a similar jump back in your API usage
[00:42:33 CET] <JEEB> ffmpeg.c can do a whole whackload of weird crap
[00:44:06 CET] <buu> klaxa: using -vf means I don't get any video whatsoever
[00:44:17 CET] <JEEB> SortaCore: try with -copyts before input, which disables some of the weirdness ffmpeg.c does
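
For instance (a sketch only; the RTSP URL is again a placeholder), combining the two flags to look at the unshifted demuxer timestamps:

    ffmpeg -copyts -debug_ts -i rtsp://camera.example/stream -c copy -f null -
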
[00:44:29 CET] <klaxa> buu: maybe try a different video player then?
[00:44:36 CET] <buu> klaxa: Are there any other good ones?
[00:44:42 CET] <klaxa> mpv is fairly good
[00:44:54 CET] <buu> Note that vlc will play the subs fine, it's just.. slow. and bad.
[00:45:12 CET] <klaxa> mpv is fast and good
[00:46:16 CET] <SortaCore> mpv reminds me of those people who leave their desk at exactly 5:00:00.000
[00:46:23 CET] <SortaCore> as soon as it's done it's out
[00:47:39 CET] <furq> doesn't vlc do the same thing by default
[00:47:43 CET] <furq> you can configure them both not to anyway
[00:48:56 CET] <SortaCore> no, vlc sticks around
[00:49:19 CET] <SortaCore> mpv closes the entire app, even if you started it without a file
[00:50:00 CET] <buu> hurray
[00:50:19 CET] <buu> ffplay not exiting at the end of the file was a surprise the first couple of times I tried it lol
[00:51:37 CET] <SortaCore> https://hastebin.com/ojozuxucey.hs JEEB
[00:52:34 CET] <mifritscher2> good night - I'll try to provoke the problem in other ways tomorrow. thank you klaxa!
[00:52:40 CET] <SortaCore> first packet dts is still higher than the next few packets
[00:52:51 CET] <SortaCore> maybe it's something to do with H264 frames
[00:53:46 CET] <klaxa> o/
[03:18:30 CET] <marcurling> Hello, where can I find key shortcuts that can be used in FFMPEG while encoding, please?
[03:41:12 CET] <JoyMarkson> Hi there,anyone could help me with code adding 2 watermarks at once?
[03:55:46 CET] <puppy431> JoyMarkson is it an artefact/bug or would you like to do it?
[04:14:04 CET] <puppy431> Hello, where can I find key shortcuts that can be used in FFMPEG while encoding, please?
[04:16:46 CET] <JoyMarkson> this mirc
[04:16:53 CET] <JoyMarkson> is like dead you wont get anything here
[04:17:03 CET] <JoyMarkson> i imagine it's like 99% of those sitting here are bots lol :D
[04:17:10 CET] <JoyMarkson> or watching nba :D
[04:18:18 CET] <puppy431> About your problem, do you know you can have several inputs?
[05:32:16 CET] <SortaCore> h264_nvenc doesn't flush properly
[05:32:39 CET] <SortaCore> hm
[05:34:47 CET] <SortaCore> possibly because I flush the decoder first, and they're both using the same hwaccel
[11:44:18 CET] <rav> Hi, I am using libavformat to mux audio and video streams to a webm file. When I open the audio codec using avcodec_open2(), it returns -12. What does this value mean?  My audio stream is in opus codec. On a side not, is Opus in Experimental mode in ffmpeg? Because unless I set AVCodecContext.strict_std_compliance to FF_COMPLIANCE_EXPERIMENTAL, I get a warning saying Opus is an experimental codec in ffmpeg.
[11:44:40 CET] <rav> *On a side note ...
[11:48:58 CET] <sfan5> ffmpeg has both an internal opus encoder and support for libopus
[11:49:03 CET] <sfan5> the internal encoder is experimental
[11:49:10 CET] <BtbN> the opus encoder in ffmpeg is experimental, yes. And you should use libopus right now.
[11:49:23 CET] <sfan5> also -12 (assuming it's an errno) means out of memory
[11:56:22 CET] <rav> Why is an out of memory error thrown when I have just opened the codec and have not written any data?
[12:03:14 CET] <rav> And, actually I have configured the FFmpeg build with "--enable-libopus" option in my "./configure". So, ideally it should use libopus only, right? Is there any other way to make FFmpeg use libopus programmatically?
[14:08:16 CET] <kepstin> rav: to select the libopus encoder specifically, look it up by name ("libopus")
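
On the command line the equivalent explicit selection would be something like this (file names and bitrate are placeholders), assuming the build was configured with --enable-libopus:

    ffmpeg -i input.wav -c:a libopus -b:a 96k output.webm
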
[14:29:49 CET] <Fyr> does anyone know why PGS subtitles use different framerate? it goes normal in the beginning of the movie, but in the end there is a desync of several seconds.
[14:30:42 CET] <Fyr> the movie is 24000/1001 fps, the SUP file is possibly 24 fps.
[14:31:02 CET] <Fyr> the timestamps of bitmaps in the SUP file are normal.
[14:31:24 CET] <Fyr> however, in the end of the file I see a desync.
[15:42:42 CET] <saml_> what's good quality metric? is vmaf the only thing?
[15:45:28 CET] <colde> Well, there is also PSNR and SSIM
[15:46:11 CET] <saml_> psnr and ssim are frame by frame and i need to compare original and encoding with different framerate, scale, .. etc
[15:46:17 CET] <saml_> basically i want magic
[15:46:27 CET] <saml_> can you write one magic
[15:47:07 CET] <saml_> i'm thinking.  play the video and screen capture at the same interval. then i can compare am i right
[15:47:36 CET] <saml_> is there a way to play video and capture screen without screen (headless)?
[15:52:36 CET] <DHE> ffmpeg can output to, say, png files
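
A sketch of that approach, dumping frames from both files at a normalized rate and size so they can be compared pair by pair (file names, rate and size are assumptions):

    ffmpeg -i original.mp4 -vf "fps=1,scale=1280:720" orig_%04d.png
    ffmpeg -i encoded.mp4  -vf "fps=1,scale=1280:720" enc_%04d.png
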
[16:45:23 CET] <gh0st3d> Hey everyone. Struggling a bit to get two videos to concatenate. As far as I'm aware I matched up their resolutions & framerates and I get an error saying unable to find suitable output format. Here's the output from the command: https://pastebin.com/PbMVwfah
[16:46:03 CET] <gh0st3d> It also says "-filter_complex: command not found" at the end, but I'm thinking it's actually something wrong with the files? the 2nd clip that I'm trying to add is a clip that's generated by ffmpeg
[16:54:26 CET] <iive> gh0st3d, you get the "command not found" because the bash tries to execute them as separate commands
[16:54:46 CET] <iive> meaning the "\" doesn't seem to work as intended.
[16:58:15 CET] <gh0st3d> ah! I think I got it. I removed that and put it all in one line. I had done that yesterday but I think that was before I had all the variables matching. Thank you!
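
For reference, a hedged sketch of a two-clip concat filtergraph split across lines with bash continuations (nothing may follow the backslashes); input names are placeholders and both clips are assumed to share resolution, frame rate and an audio stream:

    ffmpeg -i clip1.mp4 -i clip2.mp4 \
        -filter_complex "[0:v][0:a][1:v][1:a]concat=n=2:v=1:a=1[v][a]" \
        -map "[v]" -map "[a]" output.mp4
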
[19:21:33 CET] <Fyr> guys, is there a way to add a delay to subtitles?
[19:24:10 CET] <CoreX> -itsoffset
[19:24:32 CET] <durandal_1707> via ffmpeg? no
[19:24:44 CET] <Fyr> CoreX, I need to add a delay to two subtitles tracks.
[19:26:00 CET] <relaxed> Fyr: if it's ass subs you can edit the timing
[19:26:07 CET] <Fyr> relaxed, PGS
[19:26:47 CET] <relaxed> -itsoffset may work, never tried it with PGS
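
One hedged way to try it, feeding the same file twice so only the second input's subtitle streams pick up the offset (file name and delay are placeholders; untested with PGS, as relaxed says):

    ffmpeg -i movie.mkv -itsoffset 2.0 -i movie.mkv -map 0:v -map 0:a -map 1:s -c copy delayed.mkv
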
[20:02:24 CET] <ladders> does webm resolution have to be some multiple of 2
[20:03:24 CET] <ladders> im encoding webm clips from longer h264 files and -vf scale=300:-1 isnt
[20:04:32 CET] <SortaCore> if that's the case, -2 will do it
[20:05:34 CET] <ladders> i dont care what height the output webm is, but i need the widths to be consistent regardless of the input aspect ratio
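
A sketch of the -2 form SortaCore mentions, which keeps the requested width and rounds the height to the nearest even value (encoder choice and file names are assumptions):

    ffmpeg -i input.mp4 -vf scale=300:-2 -c:v libvpx-vp9 -crf 33 -b:v 0 output.webm
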
[20:31:50 CET] <Cracki> cheers. trying to use qsv encode on a processor that I *know* can do it (also with ffmpeg), but something's screwy... getting 'No device available for encoder (device type qsv for codec h264_qsv).'
[20:31:53 CET] <Cracki> any idea what's up?
[20:32:31 CET] <Cracki> (it's an optimus hardware setup, lenovo t440s, with force-enabled nvidia discrete gpu for ffmpeg processes)
[20:33:30 CET] <Cracki> (h264_nvenc works, hevc_nvenc seems unsupported by this hw...)
[20:34:58 CET] <Cracki> (the GT730M is supposed to have hevc nvenc support...)
[20:37:02 CET] <Cracki> (or not...)
[20:38:04 CET] <JEEB> it is definitely not
[20:38:20 CET] <Cracki> any option to use the intel or nvidia GPUs to *assist*?
[20:38:56 CET] <kepstin> not usefully.
[20:39:20 CET] <JEEB> if you had a full video filtering chain on the GPU it could be useful
[20:39:28 CET] <JEEB> but encoding, no
[20:39:53 CET] <JEEB> that's why everyone and their dog stopped doing the GPGPU kool-aid
[20:39:57 CET] <JEEB> and moved to just integrating ASICs
[20:41:21 CET] <Cracki> ic
[20:41:32 CET] <sim590> I want to use some pulseaudio microphone with this command: http://paste.debian.net/1010294/
[20:41:43 CET] <Cracki> well, motion estimation (optical flow) on gpu is kinda faster than cpu-only
[20:41:51 CET] <sim590> I try to pass hw:1 for $audio_device, but it doesn't recognize it
[20:42:11 CET] <sim590> My audio devices are http://paste.debian.net/1010295/
[20:42:22 CET] <sim590> I want to use the blue snowflake
[20:42:30 CET] <sim590> it should be card #1... :/
[20:42:31 CET] <JEEB> Cracki: not in the context of video encoding, but as a completely separate workthing probably yes
[20:42:40 CET] <JEEB> GPUs love it when you can just throw hundreds of threads at it
[20:42:51 CET] <JEEB> that usually isn't the case with encoding
[20:42:59 CET] <sim590> The error message: http://paste.debian.net/1010296/
[20:43:19 CET] <klaxa> i think you want -f pulse -i alsa_input.usb-Blue_Blue_Snowflake-00.analog-mono
[20:43:29 CET] <klaxa> also would be useful to see what you even called and not only the output
[20:49:20 CET] <sim590> klaxa: I called the script
[20:49:30 CET] <SortaCore> *phone rings in script's office*
[20:49:34 CET] <sim590> klaxa: the first url
[20:51:06 CET] <klaxa> well try to change $audio_device's value from "hw:1" (which is alsa) to "alsa_input.usb-Blue_Blue_Snowflake-00.analog-mono" (which is pulse)
[20:51:18 CET] <sim590> klaxa: thanks. That's the right audio device.
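
For reference, the PulseAudio source names can be listed with pactl and then passed straight to -f pulse (a sketch; pactl availability and the 5-second test recording are assumptions):

    pactl list short sources
    ffmpeg -f pulse -i alsa_input.usb-Blue_Blue_Snowflake-00.analog-mono -t 5 test.wav
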
[20:52:18 CET] <sim590> I'm trying to stream on youtube.
[20:52:54 CET] <sim590> I try to set the output file as rtmp://a.rtmp.youtube.com/live2${MY_YT_KEY}
[20:53:18 CET] <sim590> But, the youtube page doesn't seem to receive my stream.
[20:53:23 CET] <furq> sim590: just -f pulse -i 6 should work
[20:54:04 CET] <furq> also youtube is probably bailing out because you're passing it a dual audio stream
[20:54:17 CET] <furq> i'm guessing you wanted to mix those together or something
[20:58:00 CET] <sim590> furq: Yeah
[20:58:41 CET] <furq> add -af amerge or -af amix
[20:58:54 CET] <furq> amix will mix them, amerge will put the first input on the left channel and the second on the right
[20:59:22 CET] <furq> also i forget if you still need to explicitly add -f flv for rtmp outputs
[20:59:25 CET] <sim590> furq: nice! Thanks! That's what I want
[20:59:25 CET] <furq> but it couldn't hurt
[20:59:31 CET] <sim590> Yeah
[20:59:50 CET] <furq> also yeah you probably want to get rid of -ac before the pulse devices
[21:00:10 CET] <furq> you need two mono inputs for amerge
[21:02:54 CET] <sim590> furq: do you know what "Simple filtergraph 'amix' was expected to have exactly 1 input and 1 output" means? I did http://paste.debian.net/1010300/
[21:03:07 CET] <furq> oh right yeah
[21:03:12 CET] <furq> -filter_complex amix
[21:03:28 CET] <sim590> nice!
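
A hedged sketch of the explicit-label form of what furq suggests, mapping the screen grab plus the mixed audio (pulse source names and encoder settings are placeholders):

    ffmpeg -f x11grab -framerate 25 -video_size 1920x1080 -i :0.0 \
           -f pulse -i MIC_SOURCE -f pulse -i MONITOR_SOURCE \
           -filter_complex "[1:a][2:a]amix=inputs=2[a]" \
           -map 0:v -map "[a]" -c:v libx264 -preset veryfast -c:a aac -f flv output.flv
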
[21:03:42 CET] <furq> wait
[21:03:44 CET] <furq> -qp 0?
[21:04:01 CET] <furq> i can't imagine youtube will want to deal with that lol
[21:04:19 CET] <furq> i assume they have some kind of upper rate limit
[21:05:44 CET] <sim590> I'm having input/output error
[21:05:48 CET] <sim590> With youtube
[21:06:07 CET] <sim590> command: ffmpeg -f x11grab -framerate 25 -s 1920x1080 -i :0.0+0,0 -f pulse -i alsa_input.usb-Blue_Blue_Snowflake-00.analog-mono -f pulse -i alsa_output.pci-0000_00_1f.3.analog-stereo.monitor -filter_complex amix -map 0 -map 1 -map 2 -c:v libx264 -preset ultrafast -c:a aac -f flv -y rtmp://a.rtmp.youtube.com/live2${MYKEY}
[21:06:26 CET] <furq> that all looks ok to me
[21:06:32 CET] <furq> does it work if you output to a file
[21:06:34 CET] <sim590> I'm setting the key according to the format that seems to be used here: https://obsproject.com/forum/threads/how-to-stream-on-linux.4594/
[21:06:39 CET] <sim590> Yes
[21:06:49 CET] <furq> shrug
[21:07:01 CET] <furq> youtube is a bit mercurial
[21:07:14 CET] <sim590> hehe
[21:07:20 CET] <ladders> ok, it looks like the discrepancy in output res for png and webm is due to some weirdness in scale=foo:-1/-2
[21:07:39 CET] <sim590> furq: May be I don't supply the URL in the good manner
[21:07:39 CET] <ladders> if i specify the exact scale=256x150 or whatever the png and webm res come out right
[21:07:49 CET] <furq> maybe quote the url
[21:07:58 CET] <sim590> I do
[21:08:07 CET] <furq> oh so you do
[21:08:08 CET] <ladders> now i have to change my perl and spend another 3 weeks of cputime re-encoding :)
[21:09:47 CET] <sim590> furq: It needed some /
[21:09:51 CET] <sim590> https://gist.github.com/olasd/9841772
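
So the output spec that worked appears to be the app name, a slash, then the key, together with an explicit -f flv (a sketch; the input file, key and encoder settings are placeholders):

    ffmpeg -re -i input.mp4 -c:v libx264 -c:a aac -f flv "rtmp://a.rtmp.youtube.com/live2/${STREAM_KEY}"
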
[21:10:15 CET] <furq> lol this script again
[21:11:55 CET] <furq> oh wow the one on the obs forums is even worse
[21:21:57 CET] <ladders> thanks
[21:24:47 CET] <sim590> I would like to dynamically select a region of my screen to stream without having to stop the stream. I guess I can't do that with ffmpeg, can I?
[21:27:26 CET] <DHE> if you're using x11grab, it can be made to follow the mouse cursor.
[21:30:26 CET] <sim590> DHE: How do you do that? :p
[21:31:01 CET] <DHE> sim590: https://ffmpeg.org/ffmpeg-devices.html#x11grab
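
A sketch of the option from that page (size and output name are placeholders); per the linked docs, 'centered' keeps the pointer in the middle of the grabbed region, while a number makes the region move only when the pointer gets within that many pixels of its edge:

    ffmpeg -f x11grab -follow_mouse centered -framerate 25 -video_size 1280x720 -i :0.0 output.mkv
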
[21:36:26 CET] <sim590> DHE: thanks!
[22:04:50 CET] <sim590> DHE: I see that the cursor will be followed centered no matter what I set: either -follow_mouse centered or -follow_mouse 1000, for example. And it behaves weirdly when you reach near the border of two screens. Is there a way to make it so that it switches only if the cursor switches screen?
[22:05:08 CET] <sim590> and to switch completely, not half of both screens.
[22:06:51 CET] <sim590> I guess I could setup something more elaborate with a kind of a sink waiting for stream which would be permanently connected to the stream, but could output a default blank screen if nothing is thrown in and then I use another command to stream to that sink.
[22:06:53 CET] <sim590> Is it doable?
[22:08:42 CET] <sim590> I'm in a context where I don't want to break stream connection with the remote stream (youtube for instance).
[22:11:39 CET] <sim590> Could I use some /dev/ device as input file for audio and video, then with another command I could write to those /dev/ device files?
[22:57:53 CET] <ladders> ok i figured out the odd resolution, i didnt take SAR into account
[22:58:14 CET] <ladders> so that seems to be what it was, it's working now, thanks for the software
[00:00:00 CET] --- Thu Feb 15 2018

