[Ffmpeg-devel-irc] ffmpeg.log.20190122

burek burek021 at gmail.com
Wed Jan 23 03:05:02 EET 2019


[01:56:52 CET] <cluelessperson> is there a way to have ffmpeg display directly to a display device without a window?
[03:05:31 CET] <DHE> cluelessperson: if it's supported, it'll be ffplay not ffmpeg. but even then I'd recommend a better dedicated player app like mpv
[03:36:01 CET] <cluelessperson> DHE: for example, with gstreamer, I'm able to display video pretty directly to the display.
[03:36:36 CET] <cluelessperson> or an image
[03:39:14 CET] <furq> mpv has drm output
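For reference, mpv's DRM output mentioned here can be tried from a bare console. This is an untested sketch: the connector name and the video group requirement are assumptions that vary per machine.

```shell
# Hypothetical sketch: play straight to the display via DRM/KMS from a TTY,
# with no X11/Wayland session running. Requires write access to the DRM
# device (typically membership in the "video" group). The connector name
# (HDMI-A-1) is an assumption; mpv can list connectors with --drm-connector=help.
mpv --vo=drm --drm-connector=HDMI-A-1 video.mp4
```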
[03:40:59 CET] <cluelessperson> drm?
[03:41:07 CET] <DHE> linux hardware acceleration
[03:41:42 CET] <cluelessperson> I'm looking for an easy way to display content programmatically
[03:42:00 CET] <pink_mist> link to libmpv
[03:42:37 CET] <kepstin> do you actually want to output directly to hardware, or do you just want a window without any window borders? Or are you looking to have a video area embedded in a gui app?
[03:43:23 CET] <cluelessperson> kepstin: honestly, all the above.
[03:43:42 CET] <cluelessperson> thinking about several applications.
[03:45:24 CET] <furq> yeah it sounds like you want libmpv
[05:10:05 CET] <b0bby__> hello
[05:10:26 CET] <b0bby__> So I have a bunch of files from 0-24.flv
[05:12:08 CET] <b0bby__> I need to live stream these (using the -re option) to an rtmp stream
[05:12:31 CET] <b0bby__> is there a way of doing that?
[05:13:47 CET] <friendofafriend> Tried a playlist, b0bby__?
[05:14:14 CET] <b0bby__> friendofafriend: in ffmpeg? I never knew about such a thing
[05:14:41 CET] <b0bby__> friendofafriend: how would I make a playlist
[05:21:02 CET] <friendofafriend> b0bby__: You'd use the concat demuxer!  You can read about it here.  https://trac.ffmpeg.org/wiki/Concatenate
[05:21:36 CET] <friendofafriend> Basically just a list of:  file '/the/absolute/path/to/0.flv'
[05:21:47 CET] <friendofafriend> (And so on.)
[05:21:49 CET] <b0bby__> friendofafriend: ok
[05:21:57 CET] <b0bby__> and I can use this with re?
[05:23:10 CET] <friendofafriend> Yep, that should work fine with -re .
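The playlist approach described above could look like this. The paths and the RTMP URL are placeholders, not values from the conversation.

```shell
# Build a concat-demuxer playlist for 0.flv .. 24.flv (paths are placeholders).
for i in $(seq 0 24); do
  printf "file '/videos/%d.flv'\n" "$i"
done > playlist.txt

# Then stream it in real time (shown for reference; the RTMP URL is a
# placeholder). Adding -stream_loop -1 before -i loops the playlist forever.
#   ffmpeg -re -f concat -safe 0 -i playlist.txt -c copy -f flv rtmp://example.com/live/key
```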
[05:24:18 CET] <friendofafriend> Alternatively, you could just cat your files into a FIFO, and give ffmpeg the FIFO as input.
[05:25:01 CET] <b0bby__> friendofafriend: I'm interested, how would you do that?
[05:25:23 CET] <b0bby__> friendofafriend: thanks for the help so far btw
[05:25:33 CET] <friendofafriend> You're more than welcome, b0bby__.  Anytime.
[05:25:52 CET] <b0bby__> friendofafriend: how would I go about the fifo thing
[05:26:06 CET] <friendofafriend> You'd mkfifo ./foo ; cat 0.flv 1.flv 2.flv (etc.) > ./foo.
[05:26:19 CET] <friendofafriend> And then you'd execute ffmpeg with foo as the input.
[05:26:32 CET] <friendofafriend> Maybe call it foo.flv to make things more obvious.
[05:27:41 CET] <friendofafriend> So even if ffmpeg didn't have concat, you could do it another way.  But concat is a lot cleaner, and all in one command.
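The FIFO mechanics can be demonstrated with plain shell. The files here are placeholder text, not real FLV data; with real files, the reader on the pipe would be ffmpeg itself.

```shell
# Demonstrate the FIFO trick with placeholder data. The writer runs in the
# background because writes to a FIFO block until a reader opens the pipe.
printf 'part1' > 0.flv
printf 'part2' > 1.flv
mkfifo foo.flv
cat 0.flv 1.flv > foo.flv &
cat foo.flv > combined.flv   # in real use this reader is: ffmpeg -re -i foo.flv ...
wait
```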
[05:30:38 CET] <b0bby__> cool
[05:30:41 CET] <tdr> flv files don't have some kind of header or similar you'd be shoving into the fifo mid-"stream" when the second file hits?
[05:31:17 CET] <b0bby__> tdr: that's what I thought
[05:31:28 CET] <tdr> or would the ffmpeg capturing from the fifo know how to handle it?
[05:31:34 CET] <b0bby__> also, how would you create a playlist loop
[05:31:34 CET] <friendofafriend> ffmpeg tends to cope well with just shoving things into FIFO, but try it out and see if it works for you.
[05:32:44 CET] <friendofafriend> That's a -stream_loop -1
[11:13:35 CET] <sn00ker> hi all
[11:14:44 CET] <sn00ker> is it possible to send a video to a /dev/video0 (v4l2loopback) with ffmpeg twice at the same time?
[11:15:14 CET] <BtbN> You mean to two separate loopbacks?
[11:15:22 CET] <sn00ker> no
[11:15:33 CET] <sn00ker> two ffmpeg to one v4l loopback
[11:16:07 CET] <BtbN> That does not sound like it would produce something sensible.
[11:16:21 CET] <BtbN> What do you want to achieve with that?
[11:16:40 CET] <sn00ker> moment. i must translate with google... one moment please
[11:17:40 CET] <sn00ker> I have a v4l2loopback. I send its input on to an rtmp server. but as soon as I stop sending input, ffmpeg also stops sending to the rtmp server, because there are no frames anymore
[11:18:57 CET] <sn00ker> now I thought: I could permanently send a transparent image to the loopback in a loop, so the stream stays online without interruptions. and when I then have a picture/video to send, I send that to the loopback with a second ffmpeg command as well
[11:19:08 CET] <BtbN> That doesn't work, no.
[11:19:34 CET] <sn00ker> hhmm
[11:19:35 CET] <BtbN> You'll have to write your own client based on the libav* libs that dynamically injects a static image when the input stream goes down
[11:21:16 CET] <sn00ker> you mean as soon as I stop sending anything, a script jumps in and sends a picture
[11:22:08 CET] <BtbN> That can't be scripted. You need to write it yourself.
[11:24:22 CET] <sn00ker> OK. So I'm looking for a ready solution
[11:24:38 CET] <BtbN> There's nothing like that I'd be aware of.
[11:25:58 CET] <sn00ker> how do all tv servers do that? they also mix the streams digitally and so on....
[11:26:16 CET] <furq> well they don't use ffmpeg
[11:26:17 CET] <BtbN> They use highly complex and expensive custom-built software
[11:26:24 CET] <BtbN> (And Hardware)
[11:26:28 CET] <furq> it doesn't need to be highly complex, but yeah
[11:26:39 CET] <BtbN> Well, tv broadcast stuff usually is
[11:26:51 CET] <BtbN> Combining dozens of sources, smooth transitions between them and all that
[11:27:25 CET] <sn00ker> yes, a lot of sources are mixed
[11:28:05 CET] <BtbN> Maybe you are looking for OBS? Hard to tell from your explanation.
[11:28:50 CET] <sn00ker> but for that I need X or VNC
[11:29:03 CET] <sn00ker> it's a root server, not a local machine with a monitor
[11:29:23 CET] <BtbN> You'll need to roll your own then
[11:30:44 CET] <sn00ker> obs over vnc?
[11:31:02 CET] <BtbN> hm?
[11:31:19 CET] <BtbN> There is no readily made solution for what you plan to do.
[11:34:00 CET] <sn00ker> my english is not really good
[11:34:05 CET] <sn00ker> and google translate.....
[11:34:12 CET] <sn00ker> i have found this
[11:34:13 CET] <sn00ker> https://github.com/mpromonet/v4l2rtspserver
[18:41:27 CET] <Jan-> hihi
[18:41:38 CET] <Jan-> does ffmpeg have the ability to convert HEIC files shot on iphones to say jpegs?
[18:41:51 CET] <Jan-> I tried ffmpeg -i foo.heic foo.jpg but it complains "moov atom not found" among other things.
[18:43:04 CET] <JEEB> not yet, because it's a mess to figure out how that fits the FFmpeg APIs. the HEIF stuff (which I think .heic is) is basically multiple separate HEVC images put into an mp4 file, except it's a nonstandard one and all that jazz
[18:43:13 CET] <JEEB> I think someone made a basic patch for the early iOS HEIF Files
[18:43:16 CET] <Jan-> bleh
[18:43:26 CET] <JEEB> but I don't think it stitches the parts together
[18:43:45 CET] <Jan-> the problem is that these files tend to come from apple users who barely understand what a file is
[18:43:47 CET] <JEEB> the best part is that HEVC actually has a feature of having separate coded blocks
[18:43:49 CET] <Jan-> and don't understand what grief they're creating
[18:44:01 CET] <JEEB> and the HEIF people decided that nope, they won't use that
[18:44:18 CET] <JEEB> instead they will encode 100% separate "frames" which the device will then have to handle
[18:44:42 CET] <JEEB> anyways, let me see if I can find the patch
[18:45:00 CET] <JEEB> https://ffmpeg.org/pipermail/ffmpeg-devel/2017-August/215003.html
[18:45:01 CET] <JEEB> there
[18:47:38 CET] <c0re`> hey guys, I'm trying to output a stream via the icecast protocol through ffmpeg.
[18:47:53 CET] <c0re`> the thing is my mountpoint is literally "/"
[18:48:04 CET] <c0re`> in return I get "no mountpoint (path) specified"
[18:49:00 CET] <c0re`> https://ffmpeg.org/doxygen/2.5/icecast_8c_source.html
[18:49:41 CET] <c0re`> lines 158 to 162 look to me like a check that accepts any mountpoint name except "/"
[18:50:56 CET] <c0re`> am I wrong here, or can you confirm that ffmpeg's icecast implementation won't let me stream to a mountpoint that's exactly "/"?
[18:51:13 CET] <JEEB> that's what it seems to be, and that limitation seems to have been ever since e3dc2c86fc4178b100484c54f12c88705cdf6724
[18:51:18 CET] <JEEB> which is the commit that added the protocol
[18:51:26 CET] <c0re`> oof
[18:51:43 CET] <JEEB> if you think that limitation is incorrect you can prod trac or ffmpeg-devel mailing list
[18:52:22 CET] <c0re`> it's either that or pinging the dev of the radio-suite that I'm running to allow me to change the livestreaming mountpoint name
[18:52:40 CET] <c0re`> anyway, thanks for the quick answer!
[18:52:55 CET] <JEEB> np
[20:33:02 CET] <c0re`> well.. here goes my very first mailing-list contribution ever ^^, JEEB
[20:38:53 CET] <JEEB> c0re`: cheers. since you linked version 2.5's stuff I /bet/ someone will ask if it does the same in current master (which it does unless I'm already behind git master magically)
[21:34:23 CET] Action: Hello71 hacks JEEB's git connection
[21:53:27 CET] <sn00ker> hi all
[21:53:57 CET] <sn00ker> how can I send video to /dev/video and audio to /dev/audio ?
[21:54:38 CET] <friendofafriend> sn00ker: What do you want to do?
[21:55:45 CET] <sn00ker> split a video file and send the video to /dev/video (v4l) and the audio to (I'm searching for one now) a virtual audio device
[21:56:13 CET] <sn00ker> I have created a "switcher", now I need the audio and video themselves
[22:00:30 CET] <friendofafriend> I think what you're describing is something like v4l2loopback.  Have you seen that?
[22:01:09 CET] <sn00ker> friendofafriend, yes.. I have a v4l2loopback device /dev/video running, but now I need another one for audio
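A sketch of the kind of command being discussed here. This is untested: the device names, the pixel format, and the requirement that the v4l2loopback and snd-aloop kernel modules are loaded are all assumptions, not something confirmed in the conversation.

```shell
# Hypothetical: demux one file, sending the video stream to a v4l2loopback
# device and the audio stream to an ALSA loopback (snd-aloop) device.
# /dev/video0 and hw:Loopback,0,0 are assumed device names for this sketch.
ffmpeg -re -i input.mp4 \
       -map 0:v -pix_fmt yuv420p -f v4l2 /dev/video0 \
       -map 0:a -f alsa hw:Loopback,0,0
```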
[00:00:00 CET] --- Wed Jan 23 2019

