[Ffmpeg-devel-irc] ffmpeg.log.20170925

burek burek021 at gmail.com
Tue Sep 26 03:05:01 EEST 2017


[00:02:30 CEST] <wondiws> hi, I got this code from someone else that uses the ffmpeg libraries to open video streams. I usually use http:// streams, but is it possible to use video4linux as well? It looks to me like the avformat_open_input function opens the stream. Is there a prefix or URL that I need in order to use v4l?
[00:02:37 CEST] <wondiws> just entering /dev/video0 didn't work for me
[00:06:43 CEST] <JEEB> v4l2 is supported if that module was built into libavformat
[00:07:26 CEST] <JEEB> ok, no - it's in libavdevice
[00:08:22 CEST] <JEEB> http://git.videolan.org/?p=ffmpeg.git;a=blob;f=libavdevice/v4l2.c;h=f087badf5ca74440ad50396fd36f647c6a614133;hb=HEAD#l1121
[00:21:38 CEST] <blap> i'd like a head-tracking and zooming filter that will just capture the head and shoulders of a lecturer, if one is present, and then a full-res view of whatever slide is shown
[00:32:33 CEST] <wondiws> JEEB, do I need to do something like v4l2://dev/video0?
[00:35:13 CEST] <sshowcat> yo! trying to join a local, live video stream with a live remote rtp audio stream. are there any techniques for synchronizing the two? they have a 5-15 second mismatch on each attempt
[01:31:51 CEST] <iive> wondiws: `man ffmpeg-all` look for "-f v4l2"; -f sets the format, but it is also used to select the input/output module, (de)muxer, etc.
[01:32:16 CEST] <wondiws> iive, I'm not using a command line program, I'm using C code
[01:32:45 CEST] <iive> then you might want to look in libavdevice
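
In C, the usual pattern is to register libavdevice and force the input format rather than using a URL prefix. A minimal, untested sketch, assuming an ffmpeg build with the v4l2 input device compiled in:

    /* Open a V4L2 capture device with libavdevice. The "filename"
     * is just the device node; no URL prefix is needed once the
     * input format is forced explicitly. */
    #include <libavdevice/avdevice.h>
    #include <libavformat/avformat.h>

    int open_v4l2(AVFormatContext **ctx)
    {
        avdevice_register_all();  /* registers v4l2 among other devices */
        const AVInputFormat *v4l2 = av_find_input_format("video4linux2");
        if (!v4l2)
            return AVERROR_DEMUXER_NOT_FOUND;  /* built without v4l2 */
        return avformat_open_input(ctx, "/dev/video0", v4l2, NULL);
    }
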
[07:49:12 CEST] <doslas1> hi
[07:50:48 CEST] <doslas1> how do i live stream audio with a photo on youtube?
[07:55:27 CEST] <doslas1> https://gist.github.com/imdario/c1176e770c1570a07d06
[13:28:08 CEST] <doslas> hi
[13:29:26 CEST] <doslas> how do i live stream an audio file with a photo on youtube?
[13:35:14 CEST] <doslas> durandal_1707, ran
[14:02:16 CEST] <doslas> ubitux,
[14:10:55 CEST] <blap> i dunno
[14:11:19 CEST] <Nacht> Anyone have any idea why the following happens? We have a few TS files, which we concat together. Playing the total.ts works fine, but when I transmux it to MP4, the file gets screwed up. In some parts the audio runs but the video is static; in other parts the video runs but there's no audio.
[14:16:56 CEST] <doslas> ??
[14:19:26 CEST] <BtbN> mpegts is a lot more tolerant to sudden format changes in the middle
[14:19:28 CEST] <BtbN> mp4 is not
[14:21:00 CEST] <flux> hmm, but the iso base media format does at least support multiple sample descriptions? is that not sufficient?
[14:21:45 CEST] <BtbN> it supports what?
[14:22:30 CEST] <ritsuka> yes it does, but some players and muxers don't
[14:26:11 CEST] <BtbN> You can't suddenly change major encoding parameters of a stream in the middle of the file. It only has one extradata header
[14:29:22 CEST] <ritsuka> no, isobmff and mov support multiple sample descriptions (and extradata)
[14:30:45 CEST] <BtbN> If it does, I doubt any muxer seriously implements it. Let alone players
[14:31:10 CEST] <ritsuka> QuickTime does :P
[14:47:04 CEST] <JEEB> libavformat now supports multiple sample descriptions thanks to koda I think
[14:48:27 CEST] <BtbN> Question is if it's intelligent enough to make use of it when fed a bunch of randomly concatenated mpegts files
[14:48:38 CEST] <BtbN> And what players, or YouTube, will do with it
[14:52:42 CEST] <JEEB> yea, I think it was on the demuxing side, totally not sure about mux
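
For reference, a typical transmux command for this case looks like the line below. It can only work when the concatenated parts share one set of encoding parameters, and the aac_adtstoasc bitstream filter (needed when the audio is AAC in ADTS) is an assumption about Nacht's streams:

    ffmpeg -i total.ts -c copy -bsf:a aac_adtstoasc total.mp4
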
[16:56:21 CEST] <doslas> ????????????????
[16:57:05 CEST] <durandal_1707> doslas: do you have an actual question?
[16:59:17 CEST] <doslas> Already
[16:59:57 CEST] <doslas> I asked a while ago and waited so long.
[17:00:45 CEST] <doslas> ok
[17:00:51 CEST] <doslas> <doslas> how do i live stream an audio file with a photo on youtube?
[17:01:27 CEST] <doslas> <durandal_1707>
[17:02:58 CEST] <durandal_1707> never done that
[17:03:35 CEST] <durandal_1707> what do you need? full help or just a command?
[17:04:07 CEST] <doslas> full help plz
[17:04:46 CEST] <doslas> How do I broadcast an audio file live
[17:05:00 CEST] <doslas> on youtube
[17:07:24 CEST] <doslas> *I use translator!
[17:08:01 CEST] <doslas> Often
[17:09:22 CEST] <doslas> durandal_1707, You with me?
[17:10:15 CEST] <Nacht> ffmpeg -re -i images.png -i audio.mp3 -c:v libx264 -f flv rtmp://a.rtmp.youtube.com/live2/<KEY>
[17:11:25 CEST] <doslas> Nacht, My dear friend
[17:12:14 CEST] <doslas> Thank you so much for helping me out.
[17:12:23 CEST] <Nacht> oeh forgot the framerate. -framerate 24
[17:12:44 CEST] <Nacht> ffmpeg -re -framerate 25 -i images.png -i audio.mp3 -c:v libx264 -f flv rtmp://a.rtmp.youtube.com/live2/<KEY>
[17:13:36 CEST] <doslas> Thanks :)
[17:14:12 CEST] <doslas> I'll try now
[17:14:56 CEST] <Nacht> To explain what it does: 'framerate' sets your input framerate. 're' is needed for streaming, otherwise it will go too fast. 'c:v libx264' encodes the video with h264. '-f flv rtmp://...' is where it will be output to
[17:15:12 CEST] <Nacht> Be sure to update the <KEY> with what Youtube gives you
[17:17:01 CEST] <doslas> OK
[17:19:17 CEST] <doslas> loop??
[17:19:55 CEST] <doslas> Re-broadcast automatically?
[17:20:24 CEST] <doslas> I need that, too.
[17:20:52 CEST] <doslas> I don't want the broadcast to end.
[17:21:35 CEST] <Nacht> Then it's perhaps easier to first create a single file, and just stream that
[17:22:14 CEST] <Nacht> so first:
[17:22:22 CEST] <Nacht> ffmpeg -re -framerate 25 -i images.png -i audio.mp3 -c:v libx264 -f mp4 stream_file.mp4
[17:22:23 CEST] <Nacht> then
[17:22:55 CEST] <Nacht> ffmpeg -stream_loop -1 -re -i stream_file.mp4 -c copy -f flv rtmp://a.rtmp.youtube.com/live2/<KEY>
[17:23:54 CEST] <doslas> Looks like you forgot the picture.
[17:24:10 CEST] <doslas>  -i images.png
[17:24:15 CEST] <doslas> 
[17:24:56 CEST] <doslas> ffmpeg -re -stream_loop -1 -framerate 25 -i images.png -i audio.mp3 -c:v libx264 -f flv "$YOUTUBE_URL/$KEY/$STREAMNAME"
[17:25:07 CEST] <doslas> OK?
[17:25:14 CEST] <Nacht> no, you already integrated it in the first one
[17:25:31 CEST] <Nacht> ffmpeg -re -framerate 25 -i images.png -i audio.mp3 -c:v libx264 -f mp4 stream_file.mp4 || Creates an .mp4 file with audio AND video.
[17:25:46 CEST] <Nacht> ffmpeg -stream_loop -1 -re -i stream_file.mp4 -c copy -f flv rtmp://a.rtmp.youtube.com/live2/<KEY> || Streams that video to youtube
[17:26:17 CEST] <Nacht> That way, you only transcode once, and after that you just transmux. Less CPU needed
[17:26:18 CEST] <doslas> oeh
[17:27:22 CEST] <doslas> Yes, I understand - you mean I put the two things together?
[17:29:09 CEST] <doslas> Could you make me a script like last time?
[17:31:46 CEST] <doslas> OK
[17:31:53 CEST] <doslas> this
[17:32:02 CEST] <doslas> ffmpeg -re -framerate 25 -i images.png -i audio.mp3 -c:v libx264 -f mp4 stream_file.mp4
[17:32:23 CEST] <doslas> Creates an .mp4 file with audio AND video
[17:32:37 CEST] <furq> uh
[17:32:41 CEST] <furq> don't use -re in that command
[17:33:00 CEST] <doslas> ffmpeg -stream_loop -1 -re -i stream_file.mp4 -c copy -f flv rtmp://a.rtmp.youtube.com/live2/<KEY>
[17:33:05 CEST] <doslas> this
[17:33:12 CEST] <doslas> in script?
[17:50:47 CEST] <doslas> furq, -re would make it take much longer. Wouldn't it?
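
Putting the thread's advice together, a sketch of the two-step script being described, with furq's correction applied (no -re while creating the file). The -loop 1, -shortest, -pix_fmt yuv420p and -c:a aac flags are additions not present in the original commands, included so the single image keeps repeating and the output stays playable:

    #!/bin/sh
    # Step 1: encode once. No -re here: -re throttles reading to
    # realtime, which only makes the encode slower for no benefit.
    ffmpeg -loop 1 -framerate 25 -i images.png -i audio.mp3 \
           -c:v libx264 -pix_fmt yuv420p -c:a aac -shortest \
           -f mp4 stream_file.mp4

    # Step 2: stream the finished file to YouTube in an endless loop,
    # remuxing only (-c copy), so almost no CPU is needed.
    ffmpeg -stream_loop -1 -re -i stream_file.mp4 -c copy \
           -f flv rtmp://a.rtmp.youtube.com/live2/<KEY>
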
[18:31:04 CEST] <jrun> i have a big file that mpv plays back all black. it was recorded on a pixel phone. how do i upload a small bit for you guys to see?
[18:33:04 CEST] <relaxed> jrun: can you try ffplay?
[18:36:07 CEST] <blap> helping google customers is definitely not on my todo list
[18:36:24 CEST] <jrun> ffplay actually plays it back.
[18:36:45 CEST] <jrun> except saying this: [swscaler @ 0x7f44740a5390] deprecated pixel format used, make sure you did set range correctly
[18:37:14 CEST] <jrun> with massive cpu spike
[18:37:34 CEST] <alexpigment> does the pixel phone record at 10-bit 4:2:2?
[18:37:53 CEST] <alexpigment> sorry, i meant to say 10-bit or 4:2:2
[18:38:21 CEST] <jrun> https://gist.github.com/484dd5da8b45117cd91df765e8a51517
[18:38:40 CEST] <alexpigment> ah, yuvj420p
[18:38:51 CEST] <alexpigment> i suspect that's the problem
[18:39:42 CEST] <relaxed> jrun: then I would try a different --vo with mpv, and if it still happens, ask in #mpv
[19:05:09 CEST] <vans163> does ordering matter for an h264 decoder? if i'm slicing the frame into udp packets of size ~1320
[19:06:09 CEST] <vans163> also when slicing like this, is losing 1% of any packet okay?  or is losing a key packet like the first sps going to forever break the stream?
[19:06:18 CEST] <vans163> 1% of total packets**
[19:09:04 CEST] <BtbN> re-send them (the SPS/PPS) with every I frame if you are streaming via UDP.
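
On the ffmpeg side, BtbN's advice can be approximated with the dump_extra bitstream filter, which re-inserts the extradata (SPS/PPS for H.264) into keyframe packets. A sketch, assuming an mpegts-over-UDP output; input.mp4 and the destination address are placeholders, and pkt_size=1316 keeps each datagram at seven 188-byte TS packets, close to the ~1320 bytes mentioned above:

    ffmpeg -i input.mp4 -c:v libx264 -bsf:v dump_extra \
           -f mpegts 'udp://192.168.0.2:5000?pkt_size=1316'
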
[19:15:48 CEST] <doslas> Too many packets buffered for output stream 0:1
[19:16:23 CEST] <doslas> It doesn't work
[19:18:36 CEST] <doslas> furq,
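
That error usually means packets of one stream are piling up in the muxer's queue while it waits for the other stream; here a single-frame PNG video plausibly stalls while the audio keeps arriving, which the -loop 1/-shortest flags in the sketch above are meant to avoid. If it persists, a commonly suggested workaround is to raise the queue limit:

    ffmpeg ... -max_muxing_queue_size 1024 -f flv rtmp://a.rtmp.youtube.com/live2/<KEY>
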
[19:30:15 CEST] <nohop> I decided to use the nginx-rtmp route, and not try to reinvent the wheel with my own server (as per my questions last week :) ). Now, is there a preferred way of creating an hls stream to nginx using ffmpeg's api, or should I just pipe my raw data into ffmpeg's standard input? (feels wrong :) )
[19:30:43 CEST] <BtbN> "hls stream to nginx"?
[19:30:53 CEST] <BtbN> It's not something you deliberately send somewhere.
[19:31:02 CEST] <BtbN> You use rtmp for the ingest, as the name might suggest.
[19:31:06 CEST] <kepstin> nohop: if you're using the nginx-rtmp module, then you send rtmp to nginx
[19:31:21 CEST] <kepstin> if you're using hls, then use the hls muxer to save to disk, then serve with any http server.
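
kepstin's second option in sketch form: let ffmpeg's hls muxer write the playlist and segments into a directory that any http server can serve. The path and the segment options here are illustrative, not taken from the conversation:

    ffmpeg -i input -c:v libx264 -c:a aac \
           -f hls -hls_time 4 -hls_list_size 6 -hls_flags delete_segments \
           /var/www/html/live/stream.m3u8
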
[19:36:18 CEST] <nohop> hmmm
[19:37:15 CEST] <nohop> it's an application that creates a video stream; it's not to be saved on disk, since it's a live video stream. I'm using nginx's rtmp module with the "hls on;" option.
[19:38:02 CEST] <nohop> as a test, i can stream to nginx with ffmpeg, using a file on disk. But in our application I just want to give my raw video frames to ffmpeg, and have ffmpeg stream it to the streaming server
[19:39:04 CEST] <nohop> BtbN: since nginx has the rtmp module with hls support, yes, i am deliberately sending it to the streaming server
[19:39:42 CEST] <BtbN> that's not how HLS works. You cannot direct it somewhere.
[19:39:46 CEST] <BtbN> You serve it from a http server.
[19:39:58 CEST] <BtbN> That's what rtmp is used for.
[19:40:00 CEST] <furq> nohop: ./foo | ffmpeg -f rawvideo -i - -c:v libx264 -f rtmp rtmp://nginx:1935/bar
[19:40:15 CEST] <furq> plus whatever other options you need for rawvideo. probably -s and -r
[19:40:28 CEST] <nohop> yeah, that's what I was wondering
[19:40:28 CEST] <furq> and er
[19:40:29 CEST] <furq> -f flv
[19:40:42 CEST] <nohop> do i just pipe, or is there a 'neater' option that uses the API
[19:40:42 CEST] <BtbN> What even produces raw video frames that you want to take in that way?
[19:40:50 CEST] <BtbN> what?
[19:40:54 CEST] <nohop> but, ok, i'll just pipe raw data up its butt :)
[19:41:04 CEST] <furq> you can do it with the api if you want
[19:42:01 CEST] <nohop> BtbN: our application collects video streams from multiple cameras, does some processing and combining, and sticks that in a raw streaming output buffer
[19:42:23 CEST] <nohop> furq: yay, I was hoping that was the case :) I couldn't find examples...
[19:44:12 CEST] <kepstin> I'm sure you've found some "encoding with libavcodec" examples, right? The only differences will be that you use the flv muxer and use a url starting with rtmp: as the output
[19:45:48 CEST] <nohop> Yeah, i'm already encoding to disk using other muxers. Just not sure how to create the output context for an rtmp stream, i guess :)
[19:47:07 CEST] <nohop> wait, do i actually just use the url as the filename, and set format_name to "flv" !?
[19:49:38 CEST] <nohop> as arguments to avformat_alloc_output_context2(), that is
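
This is indeed how it works, as nohop confirms below: with avformat_alloc_output_context2() the URL simply takes the place of a filename, and "flv" selects the muxer. A minimal, untested sketch, with stream creation and error handling abbreviated:

    /* Open an FLV-over-RTMP output with the libavformat API. */
    AVFormatContext *oc = NULL;
    const char *url = "rtmp://nginx:1935/bar";

    avformat_alloc_output_context2(&oc, NULL, "flv", url);
    if (!oc)
        return AVERROR(ENOMEM);

    /* ... avformat_new_stream() and codec parameters go here ... */

    if (!(oc->oformat->flags & AVFMT_NOFILE))
        avio_open2(&oc->pb, url, AVIO_FLAG_WRITE, NULL, NULL);

    avformat_write_header(oc, NULL);  /* connects and writes the FLV header */
    /* ... av_interleaved_write_frame() per packet, av_write_trailer() at EOF */
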
[19:53:57 CEST] <doslas> I want live stream with stereomix?
[19:55:24 CEST] <blap> declarative statements do not use question marks
[20:31:10 CEST] <redrabbit> :)
[20:38:45 CEST] <doslas> blap, thanks
[20:40:00 CEST] <doslas> redrabbit, What are you laughing at?
[20:41:20 CEST] <doslas> How about answering my question instead, to help me?
[20:41:30 CEST] <nohop> Holy balls, that actually worked. Sorry for all the questions... It's always unbelievable when things are simpler than they seem :)
[20:43:03 CEST] <doslas> I want live stream with stereomix
[21:47:54 CEST] <redrabbit> i want chocolate
[22:04:23 CEST] <rjp421> anyone with a pi and pi-cam, and updated packages+kernel+ffmpeg-git:  can you please test piping raspivid to ffprobe or ffmpeg?  confirm whether or not ffmpeg crashes
[22:36:07 CEST] <dan3wik> I'm trying to transcode a websockets JSmpeg stream from a server, but I can't seem to work out how to do it, or if it is even possible.
[23:29:52 CEST] <BytesBacon> Does reading from one drive and writing to another drive increase the speed of ffmpeg with ultraslow preset settings?
[23:30:19 CEST] <JEEB> not unless IO is your bottleneck
[23:34:43 CEST] <BytesBacon> Okay, I think right now my bottleneck is CPU, even with 6 cores. I'm going to have to start looking at dual CPU boards I think.
[23:38:01 CEST] <thebombzen> what is recommended for good headphone listening? is bs2b better or worse than earwax?
[23:42:59 CEST] <JEEB> BytesBacon: remember that with dual CPU you will get into the NUMA problem.
[23:44:22 CEST] <BytesBacon> JEEB, thanks for the heads up, I'll have to read into that.
[23:45:49 CEST] <BytesBacon> Is that only with x265 tho? Right now I'm just using libx264.
[23:48:42 CEST] <BytesBacon> JEEB, thanks, I see now, x264 doesn't support the non-uniform memory from the dual CPUs. Hmm
[23:57:47 CEST] <alexpigment> BytesBacon: what is your current CPU?
[23:58:10 CEST] <alexpigment> not everything can be processed in parallel, and sometimes fast clock speeds are what you need
[23:58:24 CEST] <alexpigment> the new i7-8700 seems like a really good balance of both
[23:58:41 CEST] <alexpigment> 8700k, rather
[23:59:20 CEST] <BytesBacon> It's older AMD Phenom II x6 1055T at 2.8GHz.
[23:59:32 CEST] <alexpigment> ah, yes, you need better IPC than that chip can provide
[23:59:57 CEST] <alexpigment> Ryzen is a big step up in terms of IPC, but Intel is still king of that
[00:00:00 CEST] --- Tue Sep 26 2017


More information about the Ffmpeg-devel-irc mailing list