[Ffmpeg-devel-irc] ffmpeg.log.20160913

burek burek021 at gmail.com
Wed Sep 14 03:05:01 EEST 2016


[00:05:24 CEST] <qxt> can ffmpeg deal with YUV420 colorspace?
[00:05:47 CEST] <furq> it would be pretty useless if it couldn't
[00:06:14 CEST] <qxt> lol yeah would think so =)
[00:18:47 CEST] <qxt> Seem to have some issues trying to pipe VapourSynth into ffmpeg. Wrote this http://paste.debian.net/819786/ and then ffmpeg gives me "pipe:: Invalid data found when processing input"
[00:19:14 CEST] <qxt> this is the cmd that beefs out http://paste.debian.net/819818/
[00:19:41 CEST] <qxt> Trying to pipe in sound into my video
[00:19:41 CEST] <furq> you've not specified an output file for vspipe
[00:19:53 CEST] <furq> add - before |
[00:20:48 CEST] <qxt> omg... kick me in the nuts plz! Thanks furq!
[00:21:13 CEST] <furq> i would but then i'd have to kick myself in the nuts a bunch of times
[00:21:19 CEST] <furq> and i don't have the leverage for that
[00:21:23 CEST] <qxt> haha
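The fix furq describes — giving vspipe an explicit output of `-` so it writes to stdout — looks roughly like this (script and file names are placeholders):

```shell
# vspipe writes Y4M to stdout ("-"); ffmpeg reads it on stdin and muxes
# in the audio track. script.vpy and audio.flac are example names.
vspipe --y4m script.vpy - | ffmpeg -i - -i audio.flac -c:v libx264 -c:a flac out.mkv
```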
[00:22:32 CEST] <durandal_170> qxt: what you use vspipe for?
[00:22:55 CEST] <qxt> doing some magic to get sound into my video. I hope
[00:22:59 CEST] <furq> he just pasted the script he's using
[00:23:22 CEST] <qxt> lol 2.7 fps on a dual Xeon 2630
[00:23:23 CEST] <furq> unsurprisingly it looks like qtgmc
[00:23:31 CEST] <qxt> yeah using that too
[00:23:37 CEST] <qxt> a port of qtgmc
[00:24:37 CEST] <qxt> holywu ported all the old avisynth scripts to vapoursynth a while back.
[00:24:54 CEST] <furq> yeah that's what i use vapoursynth for
[00:25:07 CEST] <furq> avisynth is faster but also really likes crashing after 8 hours
[00:26:16 CEST] <qxt> I had to quit using avisynth because of all the crashes. BTW how does my filter chain look? Not sure if the order is a good one
[00:26:32 CEST] <furq> that's a more advanced script than i've ever needed
[00:27:04 CEST] <qxt> its kinda crazy but it is doing some real magic on these old video8 tapes I have.
[00:27:29 CEST] <furq> http://vpaste.net/xFyM7
[00:27:36 CEST] <furq> that's about as advanced as i get
[00:28:36 CEST] <qxt> give my script a try if you have some old noisy vhs video8 w/e video
[00:29:43 CEST] <qxt> Really love that in vapoursynth the scripting is in python.
[00:37:12 CEST] <durandal_170> furq: how fast is that script?
[00:39:25 CEST] <furq> mine?
[00:39:51 CEST] <furq> not very, but much faster than using nnedi+hqdn3d
[00:40:05 CEST] <furq> and it gives much nicer results than yadif+hqdn3d
[00:40:34 CEST] <furq> that's much more a critique of the deinterlacers than of hqdn3d
[00:41:41 CEST] <furq> if lavfi nnedi had frame-based multithreading i'd consider using that instead
[00:41:48 CEST] <furq> but it's just much too slow atm
[00:42:37 CEST] <durandal_170> isn't nnedi just for anime?
[00:42:42 CEST] <furq> no
[00:42:49 CEST] <JEEB> wut
[00:42:56 CEST] <furq> you said that about nlmeans as well
[00:43:11 CEST] <furq> i have no need for anything that is for animes
[00:43:23 CEST] <JEEB> well, you need usable IVTC filters, no?
[00:43:34 CEST] <JEEB> and most animoo needs IVTC if interlacism is involved
[00:43:49 CEST] <furq> s/for/just for/
[00:44:27 CEST] <JEEB> and then another thing originally made for animoo is that filter that did QTGMC and interpolation to match up 60/1.001 Hz field-based titles and 24/1.001Hz actual content
[00:44:47 CEST] <JEEB> which also sounds useful in normal shows where some imbecile decided to do credits in 60 fields per second :)
[00:45:37 CEST] <qxt> this is ridiculous... only getting 3 fps. Have like 20 tapes to clean up =(
[00:45:54 CEST] <furq> you could consider using a faster qtgmc preset
[00:46:13 CEST] <furq> that sounds too slow for that to be the bottleneck though
[00:48:06 CEST] <qxt> Think it might be the AutoAdjust. Using 20+- frames for temporal stuff. Not sure if VapourSynth is loving it.
[00:48:30 CEST] <qxt> but wow does it do a good job.
[00:56:00 CEST] <qxt> yup commented out the AutoAdjust and now getting 11 fps... really cooking now!
[03:13:34 CEST] <TotalPower> How do I use AVStream.codecpar to pass codec parameters to muxers?
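For the codecpar question above, a minimal sketch of the usual pattern — copying an already-opened encoder context into a new stream's codecpar before calling avformat_write_header() — might look like this (`enc_ctx` is an assumed, already-configured encoder context, not from the log):

```c
/* Sketch: filling AVStream.codecpar when muxing, assuming an encoder
 * context enc_ctx has already been opened with avcodec_open2(). */
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>

static int add_stream(AVFormatContext *oc, AVCodecContext *enc_ctx)
{
    AVStream *st = avformat_new_stream(oc, NULL);
    if (!st)
        return AVERROR(ENOMEM);

    /* The muxer reads codecpar instead of the deprecated AVStream.codec,
     * so copy the encoder's parameters into it. */
    int ret = avcodec_parameters_from_context(st->codecpar, enc_ctx);
    if (ret < 0)
        return ret;

    st->time_base = enc_ctx->time_base;
    return 0;
}
```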
[06:00:15 CEST] <Kadigan> Hey. I know this may spark controversy (as most "what is best" questions do), but I'll ask anyway: what audio codec would I want to choose for most flexibility with MKV containers? I have a bunch of video files encoded with AC3 audio, which has licensing issues (most notably isn't supported in VLC on iOS), and I was wondering what I could use instead w/o needing to recode it to something else down the road.
[06:01:21 CEST] <Kadigan> I'm well aware that I cannot recreate information lost in the original compression, and I'm not looking on the basis of "what would be best quality" (since quality can only be as good as I already have, or worse).
[06:02:35 CEST] <Kadigan> Obviously I'm interested in choosing from audio codecs supported by ffmpeg.
[06:11:23 CEST] <Kadigan> Ah, well. I guess I'll have to come back later for the question (can't leave PC running today).
[07:09:35 CEST] <arkady_> Hey folks. In case someone here isn't afk..   I'm using ffmpeg to save a live stream to a mp4 file. While the stream is going, and ffmpeg is encoding the output, I can't play/access the file. I can play it after the stream/encoding is done. Is there any way to force ffmpeg to write the file every few minutes/seconds, so I could play it as it's being written?
[07:10:51 CEST] <arkady_> Also, I don't mean break it up into chunks. I'd like the file to stay one large file of the entire stream. I just want to be able to play it while it's recording.
[07:13:12 CEST] <arkady_> Found the answer here. http://stackoverflow.com/questions/6353519/is-it-possible-to-play-an-output-video-file-from-an-encoder-as-its-being-encode
[07:13:17 CEST] <arkady_> Good night everyone
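One common approach to arkady_'s problem, sketched here with a placeholder stream URL, is to write fragmented MP4, which keeps the single output file playable while it is still being written:

```shell
# frag_keyframe+empty_moov writes the index incrementally as fragments,
# so the file can be opened mid-recording. $STREAM_URL is a placeholder.
ffmpeg -i "$STREAM_URL" -c copy \
       -movflags +frag_keyframe+empty_moov \
       recording.mp4
```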
[08:02:38 CEST] <ferdna> how do i have ffmpeg restart on error?
[08:02:54 CEST] <ferdna> i need to run it as a service and always feed... even if camera breaks
[08:28:10 CEST] <squ> while (1) {}
[08:38:10 CEST] <ferdna> squ, does that really work?
[08:38:18 CEST] <squ> what
[08:38:25 CEST] <ferdna> the while (1)
[08:38:38 CEST] <squ> while (1) always works
[08:41:56 CEST] <ferdna> squ, i was looking at this:
[08:41:57 CEST] <ferdna> http://stackoverflow.com/questions/25773860/shell-script-how-to-restart-a-process-with-pipe-if-it-dies
[08:42:26 CEST] <squ> I don't care what you look at :)
[08:43:54 CEST] <ferdna> you dont have to be mean either
[08:43:57 CEST] <ferdna> =(
[08:43:59 CEST] <ferdna> good night
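A minimal supervision loop in the spirit of squ's `while (1)` — camera and destination URLs are placeholders:

```shell
#!/bin/sh
# Restart ffmpeg whenever it exits (error, camera drop, etc.),
# sleeping briefly so a dead camera doesn't cause a busy loop.
while true; do
    ffmpeg -i rtsp://camera.local/stream -c copy -f flv rtmp://server/live/key
    sleep 2
done
```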
[10:17:46 CEST] <t4nk049> good morning
[10:18:06 CEST] <t4nk049> can any one help with fluid stream
[10:18:08 CEST] <t4nk049> ???
[10:20:17 CEST] <t4nk049> no one can help??? i have a stream that seems like a ghost... can any one help??!!!
[12:19:36 CEST] <ozette> is there any significant difference between 'MP4 v2' and 'MP4 Base Media v1' which would cause the first not to play in some media players?
[12:40:20 CEST] <ozette> does anyone have a mp4 v2 of let's say 1Mb? any idea how or where I could get it otherwise?
[13:27:13 CEST] <furq> ozette: ffmpeg -f lavfi -i smptebars=d=60 -f ipod v2.mp4
[13:28:47 CEST] <furq> ffmpeg lists them both as "mpeg-4 part 14" (i.e. mp4v2) but mediainfo flags them differently
[13:28:57 CEST] <furq> -f mp4 and -f ipod, that is
[13:34:06 CEST] <ozette> furq: wonderful, thanks
[13:53:31 CEST] <nonex86> when i used libx264 (compiled into ffmpeg) for encoding, does bit_rate field in codec context matters?
[13:54:25 CEST] <ozette> can i cut a video without re-encoding the output?
[13:55:35 CEST] <ozette> I have a mp4 v2 file. Tried ffmpeg -i Potter.mp4 -ss 00:00:00.000 -t 00:00:59.000 -c copy smaller.mp4
[13:56:12 CEST] <ozette> I'd like smaller.mp4 to be mp4 v2 as well, but instead it's 'mp4 base media v1'
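Stream-copy cutting as ozette is attempting does work; one hedged refinement is putting `-ss` before `-i` (fast input-side seeking), keeping in mind that with `-c copy` the cut can only land on keyframes:

```shell
# Cut the first 59 seconds without re-encoding. No transcode happens,
# so the cut points snap to the nearest keyframes.
ffmpeg -ss 00:00:00 -i Potter.mp4 -t 59 -c copy smaller.mp4
```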
[13:58:31 CEST] <flux> ozette, does it matter, though?
[14:00:46 CEST] <ozette> Well, my web application has trouble streaming a mp4 v2 video, and I'm not sure if it's the type of mp4, the media player or the file size that's causing the network error.
[14:01:02 CEST] <flux> I doubt it very much it's because of that
[14:01:07 CEST] <BtbN> mp4 cannot be streamed.
[14:01:10 CEST] <ozette> The original file is about 3.3Gb and has  duration of 3 hours
[14:01:37 CEST] <ozette> BtbN: oh?
[14:01:46 CEST] <flux> there's probably some other differences. for example the first file might be 'mpeg4 fast start', ie. have its header in the beginning. those could be streamed, right?
[14:02:02 CEST] <BtbN> It can be played while downloading
[14:02:08 CEST] <BtbN> if that fits your definition of streaming
[14:02:17 CEST] <BtbN> but moving the header to the front is a post-processing operation
[14:02:20 CEST] <ozette> ok, yea that's what I'm doing now, I thought that was the same as streaming
[14:02:28 CEST] <ozette> but it's a very very big file
[14:02:52 CEST] <BtbN> Without the faststart header, it can't even be played without having the full file, as the header is at the very end.
[14:03:17 CEST] <flux> ffmpeg supports fast start, though. the movenc option is called "faststart" :)
[14:03:20 CEST] <ozette> smaller files, e.g. 16Mb play, but they're all mp4 base media v1 or Apple Itunes Video etc.
[14:03:42 CEST] <BtbN> very possible that a huge mp4 file just doesn't fit into memory, and your player is trying to cache it entirely
[14:03:46 CEST] <ozette> so I'm trying to find out if a mp4 v2 of a smaller size (13Mb) will play
[14:04:07 CEST] <ozette> hmm
[14:04:46 CEST] <flux> -movflags faststart is perhaps a good first try. if it doesn't work, then I guess the only option is to use a DASH fragmenter and hope that it works.
[14:05:04 CEST] <ozette> this is frustrating, because my application normally segments any incoming video format, but this workstation has a limited ffmpeg
[14:06:34 CEST] <ozette> so..
[14:06:53 CEST] <ozette> ffmpeg -movflags faststart Potter.mp4 ?
[14:12:51 CEST] <flux> put the -movflags faststart before the -c in the line you pasted?
[14:17:05 CEST] <ozette> what's it supposed to do? prevent re-encoding?
[14:17:10 CEST] <ozette> i can't find documentation on movflags
[14:17:34 CEST] <flux> it rewrites the file at the end so that the headers go to the beginning
[14:17:38 CEST] <flux> disclaimer: I've never used it
[14:17:47 CEST] <ozette> is it new?
[14:17:58 CEST] <flux> no
[14:18:14 CEST] <flux> I guess something like -movflags help documents it
[14:18:20 CEST] <flux> but the documentation is: "Run a second pass to put the index (moov atom) at the beginning of the file"
[14:19:04 CEST] <flux> if not, you can see the options in libavformat/movenc.c
[14:20:08 CEST] <JEEB> faststart basically, after the file has been finished, goes through the file once again and writes the index in the beginning
[14:20:44 CEST] <JEEB> so it cannot be used for streaming unlike movie fragments
[14:21:03 CEST] <JEEB> if it's static files that are you are sharing through the internet and they can be written to a file system then faststart works
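The remux JEEB describes, as a stream-copy sketch:

```shell
# The second pass moves the moov atom (index) to the front so playback
# can start before the whole file has downloaded. No re-encoding.
ffmpeg -i in.mp4 -c copy -movflags +faststart out.mp4
```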
[14:23:05 CEST] <ozette> hmm
[14:24:25 CEST] <ozette> let's say I have this line on the client in a html video tag: <source src="foo.mp4" type="video/mp4">
[14:24:37 CEST] <ozette> and foo is a video file of 3.3Gb
[14:24:38 CEST] <furq> yeah you definitely want faststart for that
[14:24:56 CEST] <ozette> and foo is not segmented
[14:25:18 CEST] <ozette> is that even practical?
[14:25:24 CEST] <furq> sure
[14:25:40 CEST] <furq> without faststart you'd need to download the whole file before it starts playing
[14:25:46 CEST] <furq> i'm guessing your player is just running out of cache
[14:26:05 CEST] <ozette> so, it can be played while downloading?
[14:26:09 CEST] <furq> yes
[14:26:20 CEST] <ozette> aha ..
[14:27:05 CEST] <ozette> ok, that sounds like what i'm looking for.. wil test
[14:31:16 CEST] <ozette> but JEEB, what you said sounds contradicting
[14:32:34 CEST] <furq> technically it can't be used for streaming since the player will just download the whole file
[14:33:11 CEST] <furq> i'm pretty sure firefox can seek with range requests but i have no idea if it evicts anything from the cache
[14:33:17 CEST] <furq> which might be an issue with a 3.3GB file
[14:33:51 CEST] <furq> basically it'll work but there are better solutions
[14:34:03 CEST] <ozette> it's ok, it's good as a fall back
[14:34:08 CEST] <ozette> because my better solution is hls
[14:34:18 CEST] <ozette> but i need to find something else for now for this other machine
[14:34:37 CEST] <furq> hls is supported pretty much everywhere with hls.js
[14:34:40 CEST] <ozette> so if it will enable a browser to start playing a file before it's fully downloaded, i'll be happy
[14:35:12 CEST] <ozette> yea, but my backend makes use of ffmpeg
[14:35:25 CEST] <furq> oh right nvm
[14:35:30 CEST] <furq> i take it you can't install a better ffmpeg on there
[14:36:04 CEST] <ozette> i made many ffmpeg builds for that machine
[14:36:23 CEST] <ozette> but to no avail, everything seemed to 'work' except for creating hls playlists  :(
[14:36:44 CEST] <furq> did you try the static builds
[14:36:52 CEST] <ozette> yes
[14:37:38 CEST] <furq> that's a really odd thing to be broken
[14:37:46 CEST] <ozette> used a toolchain for that particular machine as well, but it ends up with SIGILL, so i decided for now to use a fallback until i can figure out why
[14:39:05 CEST] <ozette> it's odd indeed, other functions don't result in a SIGILL
[14:39:58 CEST] <furq> sounds like you need to break out strace or something
[14:41:04 CEST] <DHE> SIGILL isn't something strace can help you with. gdb might
[14:43:28 CEST] <ozette> I figured, but I haven't figured out a way to debug the program yet
[14:44:09 CEST] <ozette> it's a cross compiled program, and I couldnt emulate the target env, i tried qemu
[14:44:19 CEST] <ozette> anyway
[14:45:25 CEST] <ozette> about faststart, i did: ffmpeg -i foo.mp4 -movflags faststart bar.mp4
[14:45:32 CEST] <furq> you need -c copy
[14:45:37 CEST] <ozette> oh
[14:45:41 CEST] <ozette> that was quick
[14:46:12 CEST] <ozette> thanks, it's doing something :-)
[14:58:34 CEST] <sunny26> Hi I am seeing a very weird thing with ffmpeg, The demuxer is giving data with macro block artefacts !
[14:58:58 CEST] <sunny26> I have also asked the same question on the ffmpeg libav-user list: http://libav-users.943685.n4.nabble.com/Libav-user-Demuxer-not-giving-proper-data-to-decoder-td4662628.html
[14:59:09 CEST] <sunny26> Please someone help
[15:07:55 CEST] <LigH> Hello.
[15:08:50 CEST] <LigH> There was a request in a German speaking video forum that ffmpeg is unable to handle E-AC3 audio tracks from a Blu-ray disc, containing a 5.1 AC3 core but additional E-AC3 channels.
[15:09:11 CEST] <LigH> He had to extract the core using "eac3to -core".
[15:09:38 CEST] <LigH> Is this an issue planned to be supported?
[15:50:15 CEST] <ozette> well, even with -movflags faststart the media still fails to play
[15:50:53 CEST] <ozette> any way to prevent re-encoding from mp4 v2 to mp4 base media v1?
[15:52:43 CEST] <flux> ozette, mp4 base media v1/v2 are container formats, not actual media encodings. I don't think ffmpeg is able to produce v2.
[15:54:07 CEST] <flux> v2 basically requires a box called "progressive download information" to be present, perhaps that's what your application requires
[15:54:10 CEST] <LigH> But it's not Microsoft MP4-v2 VfW codec?
[15:54:25 CEST] <LigH> That would be incompatible to MPEG-4 Part 2 (ASP).
[15:54:52 CEST] <flux> (well v2 requires other things but that sounds interesting regarding your application)
[15:55:04 CEST] <flux> ozette, does your streaming solution explicitly say it requires mp4 base media v2?
[16:02:24 CEST] <LigH> Added some details and an attachment to https://trac.ffmpeg.org/ticket/3595
[16:10:41 CEST] <ozette> flux: no it doesn't
[16:10:52 CEST] <furq> what player is this
[16:10:58 CEST] <ozette> videojs
[16:11:22 CEST] <ozette> it fails to play this mp4 v2 of 3.3Gb
[16:11:38 CEST] <ozette> over the internet
[16:12:02 CEST] <ozette> but smaller mp4 v1's play fine over the internet
[16:12:05 CEST] <furq> what exactly is telling you that it's v1 or v2
[16:12:11 CEST] <ozette> file
[16:12:32 CEST] <ozette> file foo.mp4
[16:13:00 CEST] <furq> i wouldn't necessarily trust file
[16:14:14 CEST] <furq> i'm pretty sure the only difference will be that one has ftyp=isom and one has ftyp=iso2
[16:18:19 CEST] <ozette> hmm very possible
[16:18:29 CEST] <furq> yeah
[16:18:34 CEST] <flux> technically v2 requires a few boxes v1 doesn't
[16:18:50 CEST] <furq> adding `-brand mp42` will get file to flag it as version 2
[16:18:50 CEST] <flux> why would someone create a v2 unless they have provided those boxes?
[16:18:59 CEST] <furq> i'm pretty confident it'll make no difference
[16:19:14 CEST] <furq> well
[16:19:20 CEST] <flux> a better indicator of the difference between those files would be MP4Box -diso
[16:19:29 CEST] <furq> it'll definitely make no difference with ffmpeg because that's literally just changing the ftyp
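The `-brand` remux furq mentions, as a sketch — it only rewrites the major brand in the ftyp box, nothing else:

```shell
# Stream copy with a forced major brand; `file` should then report
# the output as mp4 v2, without any other change to the container.
ffmpeg -i in.mp4 -c copy -brand mp42 out.mp4
```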
[16:19:30 CEST] <sunny26> Demuxer is giving distorted data to decoder ! http://libav-users.943685.n4.nabble.com/Libav-user-Demuxer-not-giving-proper-data-to-decoder-td4662628.html
[16:19:38 CEST] <sunny26> Please help
[16:19:56 CEST] <flux> (buut after ffmpeg gets hold of the file you would need to check the differences out manually, diff is not going to cut it)
[16:20:07 CEST] <furq> i'd be really shocked if this is what's causing it to break with video.js
[16:20:25 CEST] <furq> you definitely don't need mp4v2 to view large mp4s in a browser
[16:20:54 CEST] <furq> "this" being an actual mp4v2, rather than just one with forced ftyp
[16:24:11 CEST] <ozette> the mp4v1 play in videojs, the mp4v2 doesn't
[16:24:44 CEST] <ozette> not sure if it's the size or the type
[16:24:47 CEST] <ozette> videojs says this:
[16:24:50 CEST] <ritsuka> what browser are you using btw? did you try with a different one?
[16:25:00 CEST] <ozette> The media could not be loaded, either because the server or network failed or because the format is not supported
[16:25:25 CEST] <furq> does the video url work in the browser
[16:25:29 CEST] <ozette> a pretty old chrome, but tried the newest as well, and tried firefox
[16:25:49 CEST] <ozette> yes
[16:28:36 CEST] <ozette> this browser can play the mp4v2, if i drop it directly in the browser
[16:29:11 CEST] <furq> i thought you said v1 was broken
[16:29:55 CEST] <ozette> yea, i already thought you were thinking that, but it's v2 that's broken
[16:31:32 CEST] <ozette> in <video> with videojs: http://imgur.com/a/zAZ11
[16:32:08 CEST] <ozette> seems to really be a src error, and not network
[16:32:40 CEST] <furq> well there is no MEDIA_VIDEOJS_SUCKS in that enum, so it's probably falling back to source not supported
[16:32:57 CEST] <ozette> lol
[16:33:29 CEST] <furq> this is why i just use hls.js
[16:33:39 CEST] <ozette> i use both videojs and hls.js
[16:34:47 CEST] <ozette> i grabbed videojs because of the css and plugins
[16:35:24 CEST] <ozette> controls for playback speed for example is nice
[16:35:40 CEST] <furq> shrug
[16:35:51 CEST] <ozette> :D
[16:35:52 CEST] <furq> if it works in your browser but not in video.js then there can only be one thing to blame
[16:54:03 CEST] <ozette> i wonder if i'll have more luck if it was 480p instead of 720p
[16:57:20 CEST] <ozette> hmm
[16:57:56 CEST] <ozette> i found some github issue that says chrome won't load certain formats if the 'area' is too small
[16:58:28 CEST] <ozette> my video tag contains width="640px" height="267px" and the video is 720p
[16:58:47 CEST] <ozette> could that even be a problem?
[17:03:40 CEST] <ozette> nevermind, changing the aspect ratio to 16:9, still same error
[17:33:13 CEST] <ses1984> hey, i'm not that knowledgeable about all the correct terminology so i was hoping a human could help me figure out what to search for... i was wondering if ffmpeg would work for my use case
[17:33:20 CEST] <ses1984> there are two live video feeds i want to consume, one is over rtsp and the other is something like... asf over http? not sure ... i was wondering if i could get the stream, apply some filters, and then pipe it to something else like a gui video player
[17:33:26 CEST] <ses1984> i'm not sure if ffmpeg is the right tool for the job, or there is some other tool i need to use instead of, or in addition to, ffmpeg
[17:51:04 CEST] <kerio> hi all, does anyone know of a way to colorize a grayscale video with ffmpeg filters?
[17:51:30 CEST] <kerio> i'd like to use something like matplotlib's "plasma" colormap http://matplotlib.org/_images/colormaps_reference_00.png
[17:55:07 CEST] <durandal_170> kerio: convert to pal8 and edit palette?
[17:55:26 CEST] <kerio> wtf is pal8 ;o
[17:56:51 CEST] <durandal_170> kerio: palette 8bit depth
[17:57:04 CEST] <kerio> yes that's pretty much what i'd want
[17:57:13 CEST] <kerio> how would i set the palette?
[17:59:31 CEST] <durandal_170> kerio: with lut3d?
[18:00:53 CEST] <kerio> doesn't lut3d/lutyuv require a formula?
[18:02:55 CEST] <durandal_170> kerio: lut3d doesn't
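A hedged sketch of the lut3d route durandal_170 suggests — plasma.cube is an assumed 3D LUT file you would generate yourself (e.g. exported from matplotlib's colormap), not something shipped with ffmpeg:

```shell
# Colorize grayscale input by mapping it through a 3D LUT.
# plasma.cube is hypothetical; lut3d reads Iridas .cube files among others.
ffmpeg -i gray.mp4 -vf "format=rgb24,lut3d=file=plasma.cube" color.mp4
```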
[18:07:35 CEST] <kerio> meh, i'll just do the colorization in the thing that also generates the video
[18:07:49 CEST] <kerio> and i'll just output raw rgb24 instead of raw gray
[18:15:40 CEST] <vans163> Hello. Is there a way to output to unix socket?
[18:16:51 CEST] <vans163> basically I am piping in a stream of PPM files into 1 unix socket and would like it to output to the same or another
[18:16:59 CEST] <vans163> I currently have it working to pipe in then output to file
[18:17:27 CEST] <DHE> unix:/tmp/file.sock works as a "filename"
[18:19:30 CEST] <kerio> vans163: wouldn't a fifo also work?
[18:20:02 CEST] <vans163> kerio: can you explain? DHE: let me try that, i used that  [NULL @ 0x7c67a0] Unable to find a suitable output format for 'unix://tmp/ffmpeg_unix_out' unix://tmp/ffmpeg_unix_out: Invalid argument
[18:20:26 CEST] <kerio> vans163: mkfifo /tmp/whatever
[18:20:37 CEST] <kerio> and then you can open it for reading and for writing
[18:20:46 CEST] <kerio> and it's like a pipe except it has a path
[18:20:52 CEST] <kerio> (you can only open it once for each side tho)
[18:21:35 CEST] <vans163> kerio: ah its like a weaker unix socket
[18:21:43 CEST] <vans163> kerio: i rather just have 1 2way unix socket open
[18:22:15 CEST] <vans163> kerio: also i alreayd have all the code to read/write to unix socket
[18:22:27 CEST] <DHE> I don't think you can two-way a unix socket...
[18:22:36 CEST] <vans163> DHE: You can write to it and read from it
[18:22:43 CEST] <kerio> yeah but i don't think ffmpeg can do that
[18:22:55 CEST] <kerio> i mean, anything is possible with enough socat
[18:22:55 CEST] <kerio> but
[18:23:03 CEST] <vans163> kerio: for input it can either create the unix socket or connect to an open one
[18:23:06 CEST] <kerio> vans163: can you explain what you're doing a bit better?
[18:23:13 CEST] <vans163> kerio: it should be able to do that for the output
[18:23:25 CEST] <kerio> because opening a unix socket and connecting to one are two very different things
[18:23:32 CEST] <DHE> can you use a pre-opened file descriptor?
[18:23:38 CEST] <vans163> i want to convert a stream of images into a video without anything touching the disk
[18:23:46 CEST] <kerio> stream of images from where?
[18:23:49 CEST] <vans163> from PPMs
[18:23:51 CEST] <DHE> named pipe or several sockets will do it
[18:23:56 CEST] <vans163> from binary
[18:24:05 CEST] <kerio> how's the binary separating the PPMs?
[18:24:31 CEST] <vans163> no separation, one after the other. ffmpeg understands the PPM file headers. it works right now but writes the output to disk
[18:24:37 CEST] <vans163> i rather it write the output to a unix socket
[18:24:42 CEST] <vans163> or i guess named pipe
[18:24:59 CEST] <kerio> so make it output to the named pipe
[18:25:07 CEST] <kerio> and read from it on the ffmpeg side
[18:25:22 CEST] <vans163> kerio: yea guess il try that next, its just i have code in erlang that sends and reads from the UNIX socket, now I need to code up using named pipes
[18:25:42 CEST] <kerio> oh do you also want to get output back on the same socket?
[18:25:49 CEST] <vans163> yea exactly, thats the simplest
[18:26:00 CEST] <kerio> ...that's not the simplest at all
[18:26:23 CEST] <kerio> use popen(3)
[18:26:46 CEST] <vans163> so erlang listens on a unix_socket, ffmpeg connects to it.  Then anything that gets sent to that unix socket from erlang, ffmpeg considers input, and anything ffmpeg sends back to it, erlang considers output
[18:28:10 CEST] <vans163> let me test something, as i was only making ffmpeg listen, now i want to try to make it connect to recv input
[18:28:13 CEST] <kerio> vans163: can't you just use spawn() or whatever
[18:28:31 CEST] <vans163> kerio: im not sure what spawn() has to do with anything
[18:29:23 CEST] <kerio> oh hm that's just for erlang stuff right
[18:29:34 CEST] <kerio> surely erlang has a way to run a process with pipes for stdin and stdout
[18:31:11 CEST] <vans163> kerio: yea it does, if not a 3rd party lib would (erlang just got unix socket support recently)
[18:32:55 CEST] <kerio> (also i wouldn't use ppm, i would use raw data if it's even slightly simpler)
[18:34:23 CEST] <vans163> kerio: I can use raw data, but im not sure how to frame it then. Like to tell ffmpeg the width/height+stride
[18:34:28 CEST] <vans163> and the pixel format
[18:35:03 CEST] <vans163> i can pull the pixels directly out the vram :P
[18:35:04 CEST] <kerio> -f rawvideo -framerate foo -size fooxbar -pixel_format foo
[18:35:33 CEST] <vans163> kerio: ah will note this
[18:35:43 CEST] <kerio> where the pixel format is like
[18:35:46 CEST] <kerio> rgb24
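Spelled out with the actual option names (the real spellings are `-video_size` and `-pixel_format`); dimensions, pixel format, and the FIFO path here are examples:

```shell
# Read raw RGB frames from a named pipe; rawvideo needs the geometry,
# pixel format, and rate up front since the raw data carries no headers.
mkfifo /tmp/frames.fifo
ffmpeg -f rawvideo -framerate 30 -video_size 1280x720 -pixel_format rgb24 \
       -i /tmp/frames.fifo -c:v libx264 -preset ultrafast out.mkv
```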
[18:36:27 CEST] <kerio> anyway i'd still rather use stdin and stdout or two FIFOs or something like that
[18:36:40 CEST] <kerio> it's probably much easier if you can spawn ffmpeg right from your program
[18:36:56 CEST] <vans163> kerio: yea the problem is stdin and out is so annoying to use with erlang. Il have to write my own C NIF wrapper to spawn ffmpeg proc and pipe into it
[18:37:41 CEST] <kerio> vans163: if you really want to use your single socket thing, you can use socat
[18:38:31 CEST] <kerio> ...ok ok actually
[18:39:00 CEST] <kerio> can't you just pass the unix socket as an already opened file descriptor
[18:39:09 CEST] <kerio> and use /dev/fd/n as input and output?
[18:40:34 CEST] <vans163> kerio: right now what works is erlang listens on a unix_socket, accepts FFMPEG -i unix unix://tmp/unix_socket.  now when i send from Erlang to that socket the PPMs, FFMPEG starts making video
[18:41:12 CEST] <kerio> i'd use two fifos, honestly
[18:41:14 CEST] <vans163> but the output if i use unix://tmp/unix_socket, ffmpeg gives that error.   [NULL @ 0x12a38c0] Unable to find a suitable output format for 'unix://tmp/unix_socket'
[18:41:34 CEST] <vans163> if ffmpeg can accept input from unix socket, I assume it can output to one?
[18:42:11 CEST] <kerio> the issue is using the same one i believe
[18:43:14 CEST] <vans163> kerio: let me try a diff one sec
[18:43:34 CEST] <kerio> but a different one would just be two fifos except more complicated :<
[18:45:48 CEST] <vans163> same error
[18:46:32 CEST] <kerio> vans163: have you considered using the ffmpeg library instead of the ffmpeg binary?
[18:46:43 CEST] <kerio> you can't be the first one that ever used ffmpeg in erlang
[18:47:08 CEST] <vans163> kerio: afaik its not a erlang problem
[18:47:18 CEST] <vans163> ffmpeg does not wanna output to the unix socket / connect to it
[18:47:24 CEST] <vans163> maybe i need to pass an extra arg
[18:47:35 CEST] <vans163> but erlang looks super simple to work with pipes. im gonna try that now
[18:47:41 CEST] <vans163> (fifo)
[18:47:42 CEST] <flux> vans163, have you tried telling the output format?
[18:47:49 CEST] <vans163> here is my command sec
[18:48:07 CEST] <vans163> ffmpeg -framerate 30 -i unix://tmp/ffmpeg_output -c:v libx264 -qp 0 -preset ultrafast -r 30 -pix_fmt yuv420p unix:/tmp/ffmpeg_output1
[18:48:12 CEST] <flux> vans163, if ffmpeg supports unix sockets, then I imagine the problem is it that it cannot know what encoding to use for that output. (ie. with .mp4 it would be easy)
[18:48:38 CEST] <vans163> flux: ! bingo thats probably it.  because replace that end ffmpeg_output1 with out.mp4 and it works great
[18:49:10 CEST] <flux> vans163, how about putting -f mp4 before the last argument
[18:49:42 CEST] <vans163> flux: trying
[18:50:10 CEST] <flux> vans163, I guess you're aware that you won't end up with a streamable output this way?
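Putting flux's suggestion together as a sketch — with no recognizable extension the output muxer must be forced with `-f`, and a streamable container (mpegts here, rather than plain mp4) avoids the non-seekable-output problem; socket paths are placeholders:

```shell
# Force the muxer with -f since a socket path has no useful extension.
# mpegts, unlike plain mp4, can be written to a non-seekable socket.
ffmpeg -framerate 30 -i unix:/tmp/ffmpeg_in.sock \
       -c:v libx264 -preset ultrafast -tune zerolatency \
       -f mpegts unix:/tmp/ffmpeg_out.sock
```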
[18:52:03 CEST] <vans163> flux: i think i realize something is awry.  It needs to send the output back to the input socket.
[18:52:22 CEST] <vans163> flux: and why would the output not be streamable?
[18:52:38 CEST] <vans163> flux: if it connects again to the listen socket, itl make 2 connections
[18:52:40 CEST] <flux> vans163, because the mpeg4 files it creates are not streamable
[18:52:48 CEST] <kerio> something something movflags
[18:52:48 CEST] <flux> it has the frame contents, but not the framing itself
[18:53:04 CEST] <flux> I don't think movflags help here, unless there's one to do segmented mpeg4
[18:53:16 CEST] <vans163> flux: I want to use it with broadway  https://github.com/mbebenita/Broadway
[18:53:43 CEST] <vans163> on the readme at the bottom before API it has the ffmpeg command to use
[18:53:50 CEST] <kerio> flux: i thought faststart was specifically to stream mp4s
[18:53:51 CEST] <flux> I guess it's ok then, because Broadways says it needs to load the whole file before showing it
[18:54:12 CEST] <vans163> flux: broadway has a frame by frame mode tho, which is what I wanna use to stream
[18:54:20 CEST] <flux> kerio, I understand faststart needs to use random access to write the header afterwards, or specifically just rewrite the whole file?
[18:54:29 CEST] <relaxed> faststart needs a second pass to move the index to the beginning
[18:54:38 CEST] <kerio> :(
[18:55:04 CEST] <vans163> is there any guides for the correct params to use for zero latency streaming?
[18:55:09 CEST] <vans163> (utilizing x264)
[18:55:35 CEST] <flux> zero latency? that's a tricky one..
[18:55:35 CEST] <vans163> what i pasted i copied from the ffmpeg wiki, zero latency section. only difference is theirs had .mkv as the output format
[18:55:42 CEST] <kerio> "zero"
[18:55:48 CEST] <flux> well mkv is a VERY big difference here :-)
[18:56:02 CEST] <kerio> -preset veryfast -tune zerolatency is what i use when i feel like streaming stuff to twitch
[18:56:23 CEST] <flux> though there are versions of mp4 that support streaming, mkv does it more "out of the box" I understand
[18:57:10 CEST] <kerio> vans163: HOOOOOOOOLD on
[18:57:17 CEST] <kerio> why do you want to use a javascript x264 decoder
[18:57:31 CEST] <DHE> javascript format adapter is one thing...
[18:57:42 CEST] <vans163> flux: from the little i understand, something called NAL frames are inserted into these low latency streams,which allow you to render each frame 1 by 1 almost, removing the ability to sedek
[18:57:45 CEST] <vans163> sedek/seek
[18:58:20 CEST] <vans163> kerio: Because I want it to work in the browser :P, only option is that or making a Chrome/Firefox app and using real decoders
[18:58:28 CEST] <vans163> *app not extension
[18:58:38 CEST] <kerio> can't you just use <video> or whatever
[18:58:58 CEST] <vans163> kerio: that has too much latency and it fails with these zero latency streams because it cant understand them
[19:02:42 CEST] <kerio> meh i don't get it
[19:03:12 CEST] <vans163> kerio: Look for example at SPICE or other remote access protocols.
[19:03:30 CEST] <vans163> kerio: Everything drills down to using x264 as its the most effective for computation resources and bandwith
[19:03:45 CEST] <vans163> kerio:  The license fee is negligible
[19:03:47 CEST] <kerio> *because everyone and their mom has a hardware-accelerated x264 decoder
[19:03:50 CEST] <vans163> yup
[19:03:52 CEST] <vans163> it just works
[19:04:33 CEST] <vans163> SPICE used a combination of custom coded algo for diffmapping then LZ4 compression. and its just not the best
[19:04:51 CEST] <vans163> spiking bandwidth almost to 130mbps just to watch a youtube movie
[19:05:00 CEST] <vans163> (to transmit all the pixels)
[19:05:22 CEST] <vans163> x264 in that situation would take 50mbps maximum and thats at 1080p 60fps (not actual results, I was just told this)
[19:05:26 CEST] <vans163> hope to test it soon
[19:06:13 CEST] <vans163> also SPICE produces flickers and artifacts due to their buggy diffmapping
[19:06:16 CEST] <kerio> well i mean
[19:06:29 CEST] <kerio> people stream games in 1080p60 at barely 5mbps
[19:06:38 CEST] <vans163> kerio: yea but what is the latency
[19:06:50 CEST] <vans163> kerio: this is close to zerodelay im talking about
[19:07:00 CEST] <vans163> kerio: so much less compression is used
[19:07:29 CEST] <vans163> kerio: for streaming even 1second latency is okay, I think 100-300ms is standard
[19:08:14 CEST] <vans163> zerolatency is like.. 5-20ms to encode, 5-25ms to decode
[19:08:31 CEST] <kerio> well this is for twitch.tv
[19:08:49 CEST] <kerio> so ffmpeg/libx264 with -tune zerolatency and -preset veryfast
[19:09:06 CEST] <kerio> ...and then it ends up having like a 30 second delay because of twitch's cdn and HLS segmentation and stuff
[19:10:48 CEST] <furq> yeah you shouldn't use zerolatency for streaming to twitch
[19:10:48 CEST] <vans163> kerio: wow lol, that is certainly bad on the part of twitch.  Is there any way to benchmark how long it took to encode a frame?
[19:10:52 CEST] <furq> you're just losing a ton of efficiency
[19:13:18 CEST] <DHE> kerio: encoding delay can be longer with lookahead enabled. I've had 2-3 second delays, depending on your keyframe interval
[19:13:36 CEST] <DHE> zerolatency stops this, but also means bitrate estimation without 2-pass is kinda bad
[19:13:55 CEST] <kerio> so what would be the best way to do what vans163 is doing?
[19:14:06 CEST] <kerio> ie splurge out on the bitrate but deliver the fastest image possible
[19:16:05 CEST] <furq> cue everyone searching for that x264 developer blog post about vbv which isn't there any more
[19:17:06 CEST] <furq> https://web.archive.org/web/20150507012544/http://x264dev.multimedia.cx/archives/249
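For reference, the archived x264dev post above describes capping the VBV buffer to roughly one frame's worth of bits. A hedged sketch of that approach as ffmpeg options (all bitrate numbers are placeholders, not values from this discussion; bufsize ≈ maxrate / fps for a single-frame VBV at 60fps):

```shell
# Low-latency encode sketch, assuming a 4 Mbps target at 60 fps.
# -tune zerolatency disables lookahead and frame-threading delay;
# intra-refresh spreads keyframe bits across frames to flatten bitrate spikes.
ffmpeg -i input \
  -c:v libx264 -preset veryfast -tune zerolatency \
  -b:v 4000k -maxrate 4000k -bufsize 66k \
  -x264-params intra-refresh=1 \
  -f mpegts udp://127.0.0.1:1234
```

As furq notes below, this trades away a lot of compression efficiency, so it only makes sense when end-to-end latency actually matters (not for CDN-buffered services like twitch).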
[19:21:59 CEST] <vans163> furq: nice read
[19:25:33 CEST] <vans163> How would you read from fifo using ffmpeg.  when  i do -i /tmp/fifo it says invalid input
[19:25:56 CEST] <vans163> the fifo just got created and had nothing written to it
[19:27:59 CEST] <kerio> furq: ok how do i do that with ffmpeg
[19:28:02 CEST] <kerio> :>
[19:29:07 CEST] <vans163> going to try the stdin / out approach and fight through the erlang annoyances
[19:29:32 CEST] <furq> kerio: see comment 8
[19:29:41 CEST] <furq> those are x264 options but they're not that different
[19:29:58 CEST] <furq> also obviously don't actually do this if you're just streaming to twitch
[19:31:57 CEST] <kerio> don't tell me what to do :v
[19:35:08 CEST] <kerio> i wonder if OBS can do it
[19:36:32 CEST] <TD-Linux> vans163, you might want to consider WebRTC instead which can do low-latency playback
[19:37:10 CEST] <TD-Linux> in browser JS playback will work of course, but with WebRTC you get other benefits like using UDP so that you drop frames instead of build up a buffer
[19:37:39 CEST] <kerio> why do browsers need a separate reimplementation of everything
[19:37:44 CEST] <kerio> isn't that just rtmp
[19:37:58 CEST] <kepstin> webrtc is just rtp, albeit encrypted with dtls
[19:38:08 CEST] <furq> i asked the w3c for an official response to that question
[19:38:11 CEST] <TD-Linux> well, srtp with the key exchange over dtls
[19:38:12 CEST] <furq> here is the response: "fuck you buddy"
[19:38:16 CEST] <TD-Linux> kerio, it's not rtmp, rtmp is tcp
[19:38:19 CEST] <furq> i hope that answers all your questions
[19:38:35 CEST] <kerio> TD-Linux: fair enough
[19:38:47 CEST] <furq> also the guy punched me in the face after he said it so imagine that happening to you as well
[19:38:52 CEST] <kerio> kee
[19:38:55 CEST] <kerio> *kek
[19:38:56 CEST] <TD-Linux> :^)
[19:39:35 CEST] <TD-Linux> webrtc is actually directly interoperable with rtp if you handle the encryption
[19:40:49 CEST] <kepstin> most of the webrtc annoyances just have to do with the javascript api to get it up and running in the browser
[19:43:53 CEST] <vans163> TD-Linux: WebRTC is hell to get working, you need STUN/NAT traversal/etc.  Afaik TCP should do okay, at this stage I am only considering local network use. But I would use QUIC over WebRTC and tell everyone not using Chrome 53 that their performance would be degraded by 30%
[19:44:13 CEST] <vans163> *Meaning I would use QUIC instead of WebRTC :P
[19:44:21 CEST] <TD-Linux> wat
[19:44:33 CEST] <vans163> QUIC is a UDP protocol that works in chrome. oops caps
[19:44:52 CEST] <vans163> its much better than RTP and anything WebRTC offers because it has built in error correction
[19:45:01 CEST] <vans163> also much simpler to write a server for
[19:45:03 CEST] <kerio> does quicktime support periodic intrarefresh?
[19:45:21 CEST] <TD-Linux> umm RTP also as fec
[19:45:24 CEST] <TD-Linux> *has
[19:46:08 CEST] <vans163> TD-Linux: ah well then it comes down to simplicity of implementation I guess. Also QUIC is much faster at establishing the initial connection if that matters. Also pure UDP means it doesnt drop connection if you change towers.
[19:46:43 CEST] <TD-Linux> I mean, I agree it's better than TCP, I just don't think it's better than RTP.
[19:47:33 CEST] <vans163> TD-Linux: I think it is, otherwise google would not serve all their video over it, they would use WebRTC instead right?
[19:47:45 CEST] <TD-Linux> except they do use webrtc?
[19:48:12 CEST] <vans163> TD-Linux: on youtube?
[19:48:19 CEST] <TD-Linux> no, but youtube isn't low latency
[19:48:44 CEST] <vans163> TD-Linux: WebRTC they use for hangouts, but the quality is crap and latency is bad :P
[19:49:09 CEST] <vans163> TD-Linux:  Also afaik RTP uses TCP and UDP, so it will interrupt if you change cellphone towers.
[19:49:40 CEST] <TD-Linux> nope, RTP is UDP only.
[19:50:14 CEST] <TD-Linux> (well unless you use TURN TCP as a fallback)
[19:50:53 CEST] <vans163> TD-Linux: Yea nevermind I thought RTCP implied TCP lol
[19:51:31 CEST] <vans163> TD-Linux: Still if you look at QUIC it just looks like a well thought upgrade, and does not impose the limitations WebRTC does, like forcing you to use STUN
[19:52:07 CEST] <vans163> TD-Linux: plus you can use QUIC to serve your whole website (http2) and websockets
[19:52:26 CEST] <vans163> WebRTC you can only use for WebRTC media streams
[19:54:07 CEST] <TD-Linux> no, webrtc has data channels. but anyway have fun with your project :)
[19:55:13 CEST] <kepstin> iirc, webrtc datachannels are actually based on sctp, supporting a couple of reliable/non-reliable packet or stream based modes.
[20:12:59 CEST] <vans163> damn.. I believe im stuck.. is there anyway to output .mp4 to stdout?
[20:13:07 CEST] <vans163> reading says the mp4 muxer requires backtracking
[20:13:11 CEST] <vans163> in the file
[20:13:27 CEST] <vans163> anyway to have it keep a cache perhaps?
[20:13:51 CEST] <vans163> because outputting to a file.mp4 I see the file growing as I pipe in more input
[20:15:16 CEST] <furq> you probably want fragmented mp4 for streaming
[20:15:22 CEST] <furq> or a proper streaming format like mpegts
[20:15:48 CEST] <furq> `-f mp4 -movflags frag_keyframe+empty_moov` for fragmented mp4, `-f mpegts` for ts
[20:16:59 CEST] <ChocolateArmpits> mp4 isn't fit for piping
[20:17:02 CEST] <ChocolateArmpits> you need mpegst
[20:17:08 CEST] <ChocolateArmpits> mpegts
[20:18:25 CEST] <furq> oh actually i guess you can just use -f ismv for fragmented mp4
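Putting furq's fragmented-mp4 suggestion together as one pipeline, a sketch (input, encoder settings, and the downstream consumer are all placeholders):

```shell
# Sketch: pipe fragmented MP4 to stdout. frag_keyframe starts a new
# fragment at each keyframe; empty_moov writes the moov box up front,
# so the muxer never has to seek back in the (unseekable) pipe.
ffmpeg -i input -c:v libx264 -an \
  -f mp4 -movflags frag_keyframe+empty_moov+default_base_moof \
  pipe:1 | your_consumer
```

As JEEB points out below, this standard fragmented ISOBMFF output (via `-movflags`) is preferable to `-f ismv`, which is Microsoft's nonstandard Smooth Streaming fork.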
[20:19:39 CEST] <vans163> furq, ChocolateArmpits: -f ismv  works.  whats the difference between that and mpegts?
[20:20:01 CEST] <furq> ismv will probably be less hassle to get working in browsers
[20:20:58 CEST] <ChocolateArmpits> ismv is for smoothstream, it does produce a fragmented stream but stream readers may not read it properly or not have support for it
[20:21:03 CEST] <JEEB> uhh, browsers use ismv even less than fragmented isobmff :P
[20:21:14 CEST] <furq> do they use it less than mpegts
[20:21:23 CEST] <JEEB> about the same amount
[20:21:30 CEST] <JEEB> because it's a fork of isobmff
[20:21:38 CEST] <JEEB> and the timestamps are different for fragments
[20:21:55 CEST] <JEEB> of course I guess you can ignore some of the stuff from the MSS-specific boxes
[20:22:03 CEST] <JEEB> because the mov demuxer in lavf does that too
[20:22:11 CEST] <vans163> Some users are reporting you can use a command like this (to support Broadway.js)  ffmpeg -i "http://10.33.0.68/channels/preview/1/flv:500k" -an -vcodec libx264 -pass 1 -coder 0 -bf 0 -flags -loop -wpredp 0 -vb 500k /tmp/p.h264
[20:22:21 CEST] <flux> hmm, that -f ismv seems pretty nice, hadn't noticed it's so easy
[20:22:44 CEST] <JEEB> just note that ismv is a nonstandard fork of ISOBMFF from MS
[20:22:53 CEST] <JEEB> it's not proper fragmented ISOBMFF
[20:22:56 CEST] <flux> the overhead seems minimal
[20:22:56 CEST] <vans163> then this particular user was reading from the output file every 50ms, which imo is inefficient
[20:23:01 CEST] <flux> jeeb, :(
[20:23:40 CEST] <flux> would proper fragmented ISOBMFF be difficult to implement?
[20:23:45 CEST] <furq> vans163: wtf is that using -pass 1 for
[20:23:56 CEST] <JEEB> uhh, the movenc muxer should more or less be able to do it :P
[20:24:04 CEST] <JEEB> you just have to set th flags for it
[20:24:26 CEST] <JEEB> the guy who implemented smooth streaming uses lavf for fragmented ISOBMFF after all these days
[20:27:26 CEST] <vans163> furq: no idea :P
[20:27:45 CEST] <kerio> can't we just use matroska for everything
[20:28:00 CEST] <furq> only if you're willing to put up with vp9
[20:28:09 CEST] <kerio> y :c
[20:28:11 CEST] <JEEB> timestamps aren't proper in matroska, so in a way I'd rather have fragmented ISOBMFF than matroska.
[20:28:32 CEST] <TD-Linux> if you have any brilliant ideas to fix that btw, post on CELLAR
[20:28:35 CEST] <JEEB> matroska almost had proper timestamps :<
[20:28:46 CEST] <JEEB> then mosu said "no, not adding new shit"
[20:28:47 CEST] <kerio> what's a proper timestamp?
[20:28:50 CEST] <JEEB> and then there was no lights
[20:28:58 CEST] <TD-Linux> JEEB, except we are now, matroska v5 can be perfect
[20:29:09 CEST] <JEEB> kerio: go look at the limitations of the timebase in current matroska
[20:29:25 CEST] <JEEB> TD-Linux: cool
[20:29:40 CEST] <JEEB> I thought they were still doing the "specifying current matroska shit" thing
[20:29:41 CEST] <JEEB> :D
[20:29:53 CEST] <JEEB> instead of looking for fixes
[20:30:48 CEST] <JEEB> and just for the record, I don't hate matroska as you can see by me writing up https://lists.matroska.org/pipermail/matroska-devel/2013-September/004567.html
[20:33:57 CEST] <flux> -frag_duration 1000000 indeed creates a nice fragmented file. no idea if it's standard or not, though :)
[20:34:27 CEST] <TD-Linux> JEEB, no, they are making two specs, one that specifies v1-4 "in the wild" and one that does v5
[20:34:33 CEST] <JEEB> TD-Linux: oh
[20:34:40 CEST] <TD-Linux> https://www.ietf.org/mail-archive/web/cellar/current/maillist.html
[20:34:47 CEST] <JEEB> flux: if it's not -f ismv it should be standard :P
[20:34:58 CEST] <flux> I shall be using that for all my create-large-files-with-chance-of-interruption-needs in the future :)
[20:35:03 CEST] <JEEB> -f ismv enables all sorts of hacks
[20:35:20 CEST] <JEEB> or well, microsoft smooth streaming things
[20:35:31 CEST] <TD-Linux> see "rationale numbers for timestamps" thread. there isn't a clear solution yet. some people had ideas of ways to make it play acceptably in a v4 player
[20:35:38 CEST] <furq> can't ms just stick to wmv and other formats i don't have to encounter
[20:36:18 CEST] <flux> cool, I can even truncate that file with dd and it plays without worries ;)
[20:36:31 CEST] <vans163> using -f ismv  the filesize is constant
[20:36:58 CEST] <flux> at least it complains at the end that stream 1, offset 0x1460b63: partial file
[20:37:02 CEST] <vans163> like its not growing as i pipe in more input
[20:37:13 CEST] <kerio> i can't get latencies below 2 seconds with obs :(
[20:38:10 CEST] <vans163> I see it print to stdout like  frame=   67 fps=1.8 q=0.0 size=       1kB time=00:00:02.20 bitrate=   2.8kbits/s speed=0.0595x
[20:38:21 CEST] <vans163> making me think it is doing something
[20:39:13 CEST] <kerio> i wonder if nginx-rtmp is also the issue here
[20:39:18 CEST] <kerio> ...or if maybe the ssh tunneling is
[20:39:23 CEST] <vans163> confirmed sanity check, changing it back to mp4, the file grows as input comes
[20:40:12 CEST] <JEEB> do not use ismv unless you are using a media server that takes live input in that format
[20:41:07 CEST] <vans163> the input is ppm stream (which can be adjusted to raw input) just for simplicity testing with PPMs now
[20:45:56 CEST] <CFS-MP3b> is it possible for ffmpeg to listen for rtsp connections? I need to send a stream to Wowza, but with wowza pulling it instead of ffmpeg pushing it... I guess with ffserver out of commission that's no longer a possibility?
[20:57:48 CEST] <ChocolateArmpits> CFS-MP3b: You can try using VLC as a server
[20:58:44 CEST] <tdr> or mediatomb, theres a bunch of things easy to setup to stream from
[21:37:17 CEST] <xjrn> ahh yes the old "button vanishes when a chifon barrier is placed upon it"... story
[21:44:38 CEST] <mosb3rg> question about mapping
[21:44:39 CEST] <mosb3rg> -map 0:p:4
[21:44:48 CEST] <mosb3rg> this allows the program stream i want, but further more, i need to select the streams directly
[21:45:16 CEST] <mosb3rg> i only want 2 of the contained streams inside the program, and its including the timed_id3 which seems to be creating a problem with the dump
[21:45:40 CEST] <mosb3rg> so how do i specifically choose program 4 then video and audio specific streams.
[21:46:00 CEST] <xjrn> 0.3 ish?
[21:46:34 CEST] <xjrn> -map ?:p -map ?:v -map ?:a
[21:46:44 CEST] <mosb3rg> oh..
[21:46:53 CEST] <mosb3rg> so you do -map 4:p initially ?
[21:46:57 CEST] <mosb3rg> and this states program stream 4
[21:47:00 CEST] <mosb3rg> ?
[21:47:01 CEST] <xjrn> if its supposed to be first
[21:47:25 CEST] <mosb3rg> so i can write the map command 3 times back to back to back ?
[21:47:27 CEST] <xjrn> ?:p is all streams before the others
[21:47:39 CEST] <xjrn> in the example
[21:47:47 CEST] <mosb3rg> oki hang on let me copy something so my example makes sense to me
[21:47:53 CEST] <xjrn> if any then first
[21:48:06 CEST] <mosb3rg>  Program 4
[21:48:06 CEST] <mosb3rg>     Metadata:
[21:48:06 CEST] <mosb3rg>       variant_bitrate : 5200000
[21:48:06 CEST] <mosb3rg>     Stream #0:16: Video: h264 (High) ([27][0][0][0] / 0x001B), yuv420p, 1280x720 [SAR 1:1 DAR 16:9], Closed Captions, 59.94 fps, 59.94 tbr, 90k tbn, 119.88 tbc
[21:48:06 CEST] <mosb3rg>     Metadata:
[21:48:08 CEST] <mosb3rg>       variant_bitrate : 5200000
[21:48:10 CEST] <mosb3rg>     Stream #0:17: Audio: aac (LC) ([15][0][0][0] / 0x000F), 48000 Hz, stereo, fltp, 116 kb/s
[21:48:12 CEST] <mosb3rg>     Metadata:
[21:48:14 CEST] <mosb3rg>       variant_bitrate : 5200000
[21:48:16 CEST] <mosb3rg>     Stream #0:18: Unknown: none ([134][0][0][0] / 0x0086)
[21:48:18 CEST] <mosb3rg>     Metadata:
[21:48:20 CEST] <mosb3rg>       variant_bitrate : 5200000
[21:48:22 CEST] <mosb3rg>     Stream #0:19: Data: timed_id3 (ID3  / 0x20334449)
[21:48:24 CEST] <mosb3rg>     Metadata:
[21:48:28 CEST] <mosb3rg>       variant_bitrate : 5200000
[21:48:30 CEST] <mosb3rg> so this is the program stream #4
[21:48:32 CEST] <mosb3rg> i need 16 and 17 only.
[21:48:38 CEST] <mosb3rg> i want nothing else included to see if thats why the glitching is occurring because 10gbit lines are being used.
[21:49:00 CEST] <mosb3rg> so how would the map command look if i just wanted those 2 streams from that program stream #4
[21:49:38 CEST] <xjrn> -map 0.17 -map 0.16
[21:49:59 CEST] <mosb3rg> audio goes ahead of video ?
[21:50:09 CEST] <xjrn> -map ?:d -map 16:v
[21:50:38 CEST] <xjrn> oh sorry i was looking at id3
[21:50:43 CEST] <furq> -map 0:16 -map 0:17
[21:50:55 CEST] <furq> also don't paste that much text into irc
[21:51:03 CEST] <mosb3rg> k
[21:51:04 CEST] <xjrn> -c copy -map ?:v -map ?:a
[21:52:30 CEST] <mosb3rg> thanks for the assistance guys
[21:52:47 CEST] <mosb3rg> No trailing CRLF found in HTTP header, does this error leave me something to be concerned about, considering the glitching issue, or is this nothing ?
[21:52:52 CEST] <furq> it's nothing
[21:53:39 CEST] <furq> you can use -v error if you want to make it shut up
[21:53:59 CEST] <mosb3rg> i think this fixed the problem :D
[21:54:30 CEST] <mosb3rg> i really appreciate it fellas, thanks again.
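The mapping fix furq gave above, assembled into one command, as a sketch (the input URL and UDP destination are placeholders, not from this log):

```shell
# Sketch: select only video stream #0:16 and audio stream #0:17 from
# program 4, dropping the unknown and timed_id3 streams, without re-encoding.
ffmpeg -i http://example.com/playlist.m3u8 \
  -map 0:16 -map 0:17 -c copy \
  -f mpegts udp://127.0.0.1:15000
```

Explicit `-map 0:16 -map 0:17` overrides ffmpeg's default stream selection, so nothing outside those two streams reaches the output.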
[21:54:39 CEST] <furq> if you have a very slow terminal (e.g. cmd.exe) and ffmpeg is spamming hundreds of errors then it can actually slow down encoding
[21:54:42 CEST] <soulshock1> is it possible to output h264 on a self-hosted stream with ffmpeg? i.e. ffmpeg acts as the streaming server
[21:54:58 CEST] <furq> soulshock1: not really
[21:55:14 CEST] <furq> you can serve a video stream but only to one listener and it'll die when you disconnect
[21:55:28 CEST] <furq> so technically yes but actually no
[21:55:29 CEST] <soulshock1> ok
[21:55:40 CEST] <soulshock1> yeah one listener would not be enough for my idea. cheers
[21:56:02 CEST] <furq> use something like nginx-rtmp
[21:57:07 CEST] <soulshock1> ok
[22:31:43 CEST] <mosb3rg> oki guys next issue is, related to that glitching still i believe.
[22:31:47 CEST] <mosb3rg> http://pastebin.com/xKkjvacr
[22:32:16 CEST] <mosb3rg> im getting these decoding errors when passing it through to my 10gbit server.. i dont see the decoding errors when its being dumped and relayed over udp using loopback ip internally
[22:32:51 CEST] <mosb3rg> sending into nimble streamer just fine, but the output i watch from the 10gbit line shows these decoding issues. but im doing direct copy from the map. what might i be doing incorrectly here, because this glitching isnt good.
[22:34:41 CEST] <mosb3rg> i dumped to a local .ts file for 3-4 minutes direct from the original source, and its glitch free.
[22:41:17 CEST] <DHE> mosb3rg: you might want to use the 'bitrate=10000000' parameter (for 10 megabit content) on the output UDP command. see if that helps.
[22:41:24 CEST] <DHE> it's a new feature, so you may need a very up-to-date ffmpeg for this to work
[22:42:03 CEST] <mosb3rg> hmm
[22:42:04 CEST] <DHE> wait, are you outputting to UDP over the network?
[22:42:10 CEST] <mosb3rg> internal
[22:42:16 CEST] <DHE> oh, okay
[22:42:17 CEST] <mosb3rg> i run the command on the same box as nimble
[22:42:30 CEST] <mosb3rg> so i send out at @ 127.0.0.1:15000
[22:42:37 CEST] <mosb3rg> but im getting minor glitching
[22:42:44 CEST] <DHE> that's meant for if you use UDP to deliver over the network in the same way...
[22:43:08 CEST] <mosb3rg> perhaps im confused but im delivering to nimble as UDP
[22:43:23 CEST] <mosb3rg> is that command not needed in this situation ?
[22:43:50 CEST] <DHE> if it's over a NIC using UDP, that option is strongly recommended.
[22:44:01 CEST] <mosb3rg> ok could you give me an example of the command syntax
[22:44:02 CEST] <mosb3rg> ?
[22:44:10 CEST] <mosb3rg> like where does it go, and how does it look
[22:45:34 CEST] <DHE> ffmpeg -i $SOURCE -c libx264 ... -f mpegts udp://192.168.0.6:15000?bitrate=10000000
[22:46:11 CEST] <mosb3rg> ahh
[22:46:28 CEST] <DHE> this is one version, and it's the one I use.
[22:46:33 CEST] <mosb3rg> i was using
[22:46:35 CEST] <mosb3rg> "udp://127.0.0.1:15001?pkt_size=1316"
[22:46:42 CEST] <mosb3rg> per wowza recommendations before, but using nimble now
[22:46:51 CEST] <mosb3rg> so that could legit be why its acting like that
[22:46:52 CEST] <DHE> oh, do both. use & then...
[22:47:01 CEST] <mosb3rg> ok sec
[22:47:02 CEST] <DHE> but you have to be careful the shell doesn't interpret it
[22:47:14 CEST] <DHE> ffmpeg -i $SOURCE -c libx264 ... -f mpegts udp://192.168.0.6:15000?bitrate=10000000\&pkt_size=1316
[22:49:07 CEST] <mosb3rg> oki great give a moment lets see
[22:49:50 CEST] <mosb3rg> is it also possible the pkt size im after is causing a problem or is that a reasonable packet size
[22:51:12 CEST] <mosb3rg> av_interleaved_write_frame(): Cannot allocate memory
[22:51:12 CEST] <mosb3rg> Error writing trailer of udp://127.0.0.1:15001?bitrate=10000000\&pkt_size=1316: Cannot allocate memoryframe=  598 fps=0.0 q=-1.0 Lsize=    5388kB time=00:00:09.92 bitrate=4446.1kbits/s speed=  94x
[22:51:12 CEST] <mosb3rg> video:4849kB audio:159kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 7.573402%
[22:51:21 CEST] <mosb3rg> @ DHE
[22:51:31 CEST] <mosb3rg> was that \ an accident ?
[22:52:36 CEST] <mosb3rg> tried without also, same error cannot allocate memory
[22:53:31 CEST] <mosb3rg> so i ran with just the bitrate command, thats whats causing the error
[22:54:46 CEST] <DHE> on unix the & is a special shell character
[22:55:33 CEST] <DHE> https://www.ffmpeg.org/ffmpeg-all.html#udp  pretty sure that's right...
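An alternative to backslash-escaping the `&` DHE mentions is to quote the whole URL; a sketch combining both options from the discussion (the bitrate and pkt_size values follow the conversation above and may need tuning for the receiver):

```shell
# Sketch: quote the URL so the shell does not interpret '?' or '&'.
# bitrate paces UDP sends; pkt_size=1316 is 7 x 188-byte TS packets.
ffmpeg -i "$SOURCE" -c copy -f mpegts \
  "udp://127.0.0.1:15001?bitrate=10000000&pkt_size=1316"
```

As mosb3rg finds later in the log, some receivers (nimble here) reject particular option combinations, so dropping `pkt_size` is worth trying if the muxer errors out.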
[22:56:27 CEST] <mosb3rg> im running linux
[22:56:38 CEST] <mosb3rg> and i have it inside ""
[22:56:54 CEST] <mosb3rg> but either way without combining them its still giving me the memory error :\
[22:58:51 CEST] <mosb3rg> does the video have to be processed as encode ?
[22:59:00 CEST] <mosb3rg> or is it ok to use copy since its not needed changes
[23:00:59 CEST] <mosb3rg> also.. i just assumed udp delivery was best for copy function material with mpegts as the intended end game.
[23:01:17 CEST] <mosb3rg> i just felt like why waste my time with shitty rtmp without packets being sent in a single signal right
[23:12:44 CEST] <mosb3rg> either way you made the error clear for me so thank you again
[23:18:49 CEST] <mosb3rg> without packet size noted that trailing error goes away
[23:23:55 CEST] <mosb3rg> for anyone reading, my answer was to completely remove options
[23:24:11 CEST] <mosb3rg> i was noting a pkt size that wasnt helping, it works for wowza, but nimble doesnt like it.
[00:00:00 CEST] --- Wed Sep 14 2016

