[Ffmpeg-devel-irc] ffmpeg.log.20161203

burek burek021 at gmail.com
Sun Dec 4 03:05:02 EET 2016


[00:37:00 CET] <courrier> youtube-dling some TV show gets stuck on this ffmpeg error when muxing video and audio: "Malformed AAC bitstream detected" http://paste.debian.net/900255/
[00:37:05 CET] <courrier> Have you got any clue?
[00:37:30 CET] <courrier> I tried the advised filter without success, still the same error
[01:00:25 CET] <fling> Is vbr aac good in terms of compatibility?
[01:29:54 CET] <llamapixel> Is there a way to convert one ffmpeg frame into a vector svg or eps?
[01:30:24 CET] <klaxa> not really
[01:30:27 CET] <llamapixel> I want to take a git / gource visualization and make a vector from one day of it
[01:30:36 CET] <llamapixel> http://hastebin.com/vesusiruda.tex
[01:31:10 CET] <llamapixel> Is there anything in, say, ImageMagick that does pixel data to vector? OCR?
[01:32:15 CET] <klaxa> there are some tools
[01:32:21 CET] <klaxa> i think inkscape can do it?
[01:32:32 CET] <klaxa> but i wouldn't hope for too good results
[01:33:02 CET] <klaxa> there is some commercial software probably that does a better job, but i doubt it's as good as you want :P
[01:33:25 CET] <klaxa> this approach also seems quite backwards
[01:33:45 CET] <llamapixel> I don't need imperial accuracy, should I ask gource to export the data at that stage?
[01:34:33 CET] <klaxa> i would think so, you are taking git history, going to great lengths to generate a bitmap and then converting it back to vector information
[01:35:03 CET] <llamapixel> I want to be able to see a video first before making a decision on the frame
[01:35:04 CET] <klaxa> somewhere between reading the history and generating the bitmap the vector image probably already exists
[01:36:04 CET] <llamapixel> ok thanks I will dig into it that way mate
[01:37:29 CET] <llamapixel> One note on the result: it can be used to make unique structures by finding the tips of the vectors to form primitive shapes, which would help make traversable areas in game content. https://usercontent.irccloud-cdn.com/file/Mz9HKVzs/Screenshot%202016-12-03%2011.35.49.png
[01:38:22 CET] <llamapixel> how many repos are out there at the moment in late 2016?
[01:38:54 CET] <klaxa> at least 10
[01:39:01 CET] <llamapixel> cheeky monkey
[01:39:06 CET] <klaxa> ;)
[01:39:11 CET] <klaxa> what kind of answer do you expect?
[01:39:18 CET] <klaxa> i would guess millions upon millions
[01:39:22 CET] <klaxa> but i could be wrong
[01:39:45 CET] <klaxa> i don't think anyone keeps track of every repo in existence
[01:39:55 CET] <klaxa> you can probably get some number off github
[01:40:14 CET] <klaxa> and that would give you a good chunk of many open-source repos at least
[01:40:27 CET] <llamapixel> It was rhetorical and should have made you think about content like this becoming playable content like that.
[01:41:16 CET] <llamapixel> One serious question however: they would all have a single similar node point or inherit some similarity, right?
[01:42:34 CET] <llamapixel> anyway I will dig for that vector or data element mate, thanks again.
[01:44:42 CET] <klaxa> certainly an interesting idea to use git history to generate game maps
[01:47:32 CET] <llamapixel> Years ago I was populating a dungeon from your hard drive, with directories and files replacing potions, monsters etc.
[01:47:45 CET] <llamapixel> this is an extension of that, without the need to probe the drive contents
[01:57:41 CET] <llamapixel> For inception I will use the git / gource GitHub repo for tests; I was going to use git's own git repo ;P
[05:31:08 CET] <llamapixel> klaxa: seems like potrace might help me down that path of image to vector
[05:31:57 CET] <klaxa> ok, i personally would probably have tried to process the raster images myself, but i don't know what and how you are coding, hope it works out for you :)
[05:33:14 CET] <llamapixel> cheers mate I will still keep that idea and component on a list of todo
[07:47:27 CET] <frustratedmacuse> hi guys
[07:47:56 CET] <frustratedmacuse> is there anyone who can teach me how to use this ffmpeg thing
[07:48:16 CET] <frustratedmacuse> I am trying to do something but it seems there is no new tutorial about ffmpeg
[07:48:56 CET] <frustratedmacuse> I have a .mov file and an .ass file which I want to be hardsubbed
[07:49:01 CET] <frustratedmacuse> How can I do that
[07:49:03 CET] <frustratedmacuse> ??
[07:49:13 CET] <c_14> https://trac.ffmpeg.org/wiki/HowToBurnSubtitlesIntoVideo
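A minimal sketch of the kind of command that wiki page describes, assuming hypothetical file names input.mov and subs.ass and an ffmpeg built with libass:

    # burn the ASS subtitles into the picture; audio is re-encoded to AAC for the mp4 container
    ffmpeg -i input.mov -vf "ass=subs.ass" -c:v libx264 -crf 18 -c:a aac output.mp4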
[07:49:38 CET] <CoJaBo> ..why would anyone hardsub anything :/
[07:49:54 CET] <c_14> Compatibility
[07:49:56 CET] <c_14> usually
[07:50:55 CET] <frustratedmacuse> it says no such filter ass
[07:51:05 CET] <frustratedmacuse> I was trying that but it doesn't let me
[07:51:11 CET] <c_14> Your version of ffmpeg wasn't built against libass
[07:51:34 CET] <frustratedmacuse> so what should I do c_14
[07:51:51 CET] <c_14> Where'd you get your ffmpeg from?
[07:51:57 CET] <frustratedmacuse> brew
[07:52:09 CET] <c_14> brew install ffmpeg --with-libass or something
[07:52:20 CET] <frustratedmacuse> let me try
[07:56:13 CET] <fling> I have a video with bad audio in a container and I have to replace the audio with a better quality one.
[07:56:26 CET] <fling> How to properly count the offset to use for the new audio?
[07:56:46 CET] <fling> Isn't there a magical audio-comparing filter or something?
[07:57:05 CET] <c_14> not really, you can check the waveforms and try and find the offset between the peaks
[07:58:07 CET] <fling> Which app to use for this?
[07:59:03 CET] <c_14> plenty of options, ffmpeg has some filters that can do waveforms. You can try audacity. Not sure which would be best
[07:59:06 CET] <c_14> I usually just go by ear
[07:59:56 CET] <fling> by ear?
[08:00:45 CET] <c_14> play the video with the better audio in a video player and adjust the synch until it synchs
[08:00:59 CET] <c_14> That usually only works if you have something in the video to synch against though
[08:01:06 CET] <fling> Is mp4 with multiple h264 and aac streams fine in compatibility terms? Or should I leave one video and one audio per file?
[08:01:18 CET] <c_14> should be fine
[08:01:53 CET] <fling> Great!
[08:03:11 CET] <fling> c_14: Or I could put old audio into right channel and new audio to the left channel and compare them :>
[08:03:22 CET] <fling> But how will I shift? hmm hmmm
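One way to try a guessed offset while muxing, assuming hypothetical names video.mp4 and better_audio.wav and an offset of 0.5 seconds found by ear:

    # -itsoffset delays the input that follows it; video is stream-copied, new audio re-encoded
    ffmpeg -i video.mp4 -itsoffset 0.5 -i better_audio.wav -map 0:v -map 1:a -c:v copy -c:a aac output.mp4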
[08:10:31 CET] <frustratedmacuse> c_14
[08:10:42 CET] <frustratedmacuse> I've tried but I couldn't compile libass
[08:11:06 CET] <frustratedmacuse> do you know any other program that can do the job?
[08:11:27 CET] <frustratedmacuse> I need the .ass subtitle hardsubbed onto a .mov movie; the output can be mp4 or avi
[08:11:44 CET] <frustratedmacuse> what do you think guys, need help :)
[08:12:54 CET] <c_14> No clue of any other programs for that.
[08:12:58 CET] <c_14> Why doesn't libass compile?
[08:14:24 CET] <frustratedmacuse> I think I need libass dev first
[08:14:34 CET] <c_14> shouldn't brew do that?
[08:14:50 CET] <frustratedmacuse> when I try it says failed to download
[08:14:56 CET] <frustratedmacuse> I am new to mac tho
[08:15:02 CET] <frustratedmacuse> maybe I blew it somewhere
[08:16:04 CET] <c_14> No clue, don't actually have a mac.
[08:16:33 CET] <c_14> You could try the osx static builds http://evermeet.cx/ffmpeg/
[09:32:28 CET] <thebombzen> what does -vf hysteresis do? the docs are very unclear
[09:40:40 CET] <durandal_170> thebombzen: it is for creating masks
[09:40:47 CET] <thebombzen> masks?
[09:41:02 CET] <thebombzen> I mean I know what masks are. but how does it do that? how am I supposed to use it?
[09:41:08 CET] <laserbled> Hi. I have a situation. I am trying to stream my desktop (linux) over lan to another pc and play it with gstreamer (because I already have an application written with it). Now when I use this command: ffmpeg -video_size 1366x768 -framerate 25 -f x11grab -i :0.0 -f mpegts udp://192.168.1.11:53515 it works and I am able to stream with a delay of around 2-3 sec.
[09:41:23 CET] <durandal_170> yes, for use with maskedmerge
[09:41:41 CET] <durandal_170> and or blend filter
[09:42:15 CET] <thebombzen> maybe if the docs had an example
[09:42:26 CET] <laserbled> but when I try to do the above with this command ffmpeg -video_size 1366x768 -framerate 25 -f x11grab -i :0.0 -vcodec libx264 -preset ultrafast -tune zerolatency -f h264 udp://@192.168.1.12:53515 I get an internal error
[09:42:36 CET] <thebombzen> because I don't really know what sort of input is appropriate and what sort of thing is GIGO (Garbage In/Garbage Out)
[09:42:45 CET] <laserbled> could you let me know what I am doing wrong
[09:42:54 CET] <c_14> laserbled: tried without the @?
[09:42:57 CET] <c_14> otherwise
[09:42:59 CET] <c_14> #pb laserbled
[09:43:54 CET] <thebombzen> yes that. because laserbled you say "I get an error" but you don't say whether the ffmpeg command gives you the error or whether the gstreamer on the other side gives the error
[09:43:55 CET] <laserbled> ok. let me check and be back with console out.
[09:44:21 CET] <thebombzen> cause you're muxing the H.264 stream as raw h264 which is kind of suspicious. why not use -f mpegts?
[09:44:49 CET] <thebombzen> suspicious as in, there's generally very little reason to use a raw h264 stream
[09:45:03 CET] <thebombzen> and I suspect that gstreamer doesn't like it
[09:45:26 CET] <thebombzen> either way
[09:45:38 CET] <thebombzen> I feel that some of the filters are extremely well-documented (e.g. overlay)
[09:45:46 CET] <thebombzen> mostly those are the ones that lots of people will use
[09:46:08 CET] <thebombzen> but other, specialized filters (hysteresis) are poorly documented and it's frustrating.
[09:46:25 CET] <thebombzen> I don't have an immediate reason to use it, I just like to be familiar with libavfilter
[09:46:25 CET] <c_14> The more people use them the more likely they are to submit patches to the documentation.
[09:49:51 CET] <durandal_170> there is example on doom9 forum
[09:59:21 CET] <laserbled> using UDP  : http://pastebin.com/gBnBtypy
[09:59:26 CET] <laserbled> using tcp http://pastebin.com/Xgh4ASHj
[10:01:00 CET] <c_14> like thebombzen said, try -f mpegts or something instead of -f h264
[10:01:30 CET] <laserbled> thebombzen: -f mpegts works. I tried that but has a delay of 2 - 3 sec
[10:01:42 CET] <laserbled> Any suggestion to reduce that?
[10:02:29 CET] <laserbled> Since it was over lan I was expecting less than 1 sec delay.
[10:02:33 CET] <llamapixel> Is there an editor designed to manage the command / layout that I should be looking at?
[10:02:35 CET] <c_14> mpegts shouldn't really add any latency, so removing that won't help
[10:02:40 CET] <c_14> The rest is probably lost in buffers somewhere
[10:04:00 CET] <c_14> You can adjust the udp buffer_size (default is 64K)
[10:04:13 CET] <c_14> You'll also want to adjust the udp buffer_size on the receiving end
[10:04:45 CET] <laserbled> ok cool. let me try that. thanks c_14 .
[10:04:59 CET] <c_14> you can try -fflags nobuffer
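Putting those suggestions together, a sketch of a lower-latency sender and a test receiver; the address, port and buffer size are just the values from earlier in the log:

    # sender: x11grab -> x264 zerolatency -> mpegts over UDP with an explicit socket buffer
    ffmpeg -f x11grab -video_size 1366x768 -framerate 25 -i :0.0 \
           -c:v libx264 -preset ultrafast -tune zerolatency \
           -f mpegts "udp://192.168.1.11:53515?buffer_size=65536"
    # receiver side (ffplay shown for testing): match the buffer size and skip internal buffering
    ffplay -fflags nobuffer "udp://192.168.1.11:53515?buffer_size=65536"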
[10:27:21 CET] <laserbled> one more query
[10:27:55 CET] <laserbled> if ffplay can stream it, does that mean gstreamer can also play it ?
[10:28:16 CET] <laserbled> because I tried a similar command and it played with ffplay
[10:28:20 CET] <c_14> no
[10:28:34 CET] <c_14> gstreamer uses its own magic for all that
[10:28:38 CET] <c_14> completely different codebase
[10:29:34 CET] <laserbled> ah true. I meant the validity of the stream. I was worried the stream was corrupt for some reason
[10:29:55 CET] <c_14> ffplay playing it doesn't necessarily mean the stream is valid
[10:29:59 CET] <c_14> It just means it's not completely broken
[10:30:19 CET] <c_14> (libav* tends to accept common broken files that are "easy" to fix)
[10:30:34 CET] <laserbled> hmm
[10:31:36 CET] <c_14> But ffmpeg shouldn't produce such broken files. If it does, it's a bug.
[10:31:47 CET] <c_14> (anywhere I said "file" you can replace with "stream")
[10:37:26 CET] <thebombzen> out of curiosity, I noticed ffmpeg autodetects if it's writing matroska to a seekable file
[10:37:56 CET] <thebombzen> is there a way to disable this? I sometimes like streamable mkvs so I just use -f matroska - >output.mkv
[10:38:24 CET] <c_14> Don't think so
[10:38:25 CET] <thebombzen> this allows me to play the file while it's encoding as long as speed > 1x, which it often is
[10:38:36 CET] <c_14> You can do that with a normal mkv as well
[10:38:53 CET] <thebombzen> no, I get different behavior. might be a bug tho
[10:39:16 CET] <c_14> it used to work
[10:39:19 CET] <c_14> might be the crc patches
[10:39:21 CET] <thebombzen> I know it did
[10:39:21 CET] <c_14> eeeh
[10:39:23 CET] <c_14> lemme see
[10:39:25 CET] <thebombzen> it doesn't anymore
[10:39:32 CET] <thebombzen> lemme send you a pasta
[10:40:35 CET] <c_14> yeah, I reproduced
[10:40:38 CET] <c_14> something about EBML size
[10:41:27 CET] <c_14> or duplicate element
[10:42:28 CET] <c_14> -live 1
[10:43:07 CET] <c_14> ^will make it "streamable"
[10:43:32 CET] <c_14> Though I'm interested in when that changed
[10:43:48 CET] <thebombzen> yea duplicate element
[10:44:16 CET] <thebombzen> http://pastebin.com/KBDD7pHn
[10:45:05 CET] <c_14> yea, -live 1 is what you're looking for
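For reference, a sketch of both workarounds mentioned here, with placeholder file names:

    # explicit streamable ("live") matroska: the muxer never seeks back to fill in sizes
    ffmpeg -i input.mkv -c copy -f matroska -live 1 output.mkv
    # or keep forcing non-seekable output by writing to a pipe
    ffmpeg -i input.mkv -c copy -f matroska - > output.mkv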
[10:45:12 CET] <c_14> though I might bisect later to find out why it's happening
[10:45:25 CET] <thebombzen> yea. I don't actually know C otherwise I'd poke around myself
[10:49:25 CET] <thebombzen> another question - about the minterpolate filter
[10:49:49 CET] <thebombzen> the default settings work mostly fine. is there another algorithm that works better (even if it's more expensive) or are the default settings the "best"?
[10:50:50 CET] <fling> wow -vf hqdn3d rocks!
[10:51:02 CET] <fling> Should I use it for noisy webcam all the time?
[10:51:23 CET] <furq> probably
[10:51:24 CET] <fling> with the same x264 settings the bitrate decreased from 29Mbps to 20Mbps
[10:51:32 CET] <furq> there are other denoisers you can use
[10:51:52 CET] <fling> The resulting video is a bit blurry but the original is really blinking with the colored snow
[10:52:04 CET] <furq> you might want to use more aggressive settings
[10:52:06 CET] <furq> 20mbit seems pretty high
[10:52:12 CET] <fling> So even blurry the image looks much better
[10:52:39 CET] <fling> furq: it is 20 because I used -crf 18 and -preset ultrafast just for testing the video filter
[10:52:45 CET] <furq> oh
[10:53:00 CET] <furq> well yeah you can increase the strength with -vf hqdn3d=5
[10:53:05 CET] <furq> 4 is the default
[10:53:13 CET] <fling> So I encoded the source mjpeg twice. first without the filter and second with the filter, now comparing
[10:53:22 CET] <fling> Should I increase it?
[10:53:35 CET] <furq> up to you
[10:53:49 CET] <furq> you can decrease it as well if it's removing too much detail
[10:53:58 CET] <fling> Is it possible to mostly filter with smaller radius or something?
[10:54:07 CET] <raven_> tried 'ffmpeg -i input.mp4 --ss 790 -c copy -t 187 output.mp4' and got error "Unrecognized option '--ss'" "Error splitting the arguments list: Option not found".
[10:54:14 CET] <fling> The idea is to just reduce the blinking snow :P
[10:54:22 CET] <fling> raven_: it is -ss not --ss
[10:54:33 CET] <furq> https://ffmpeg.org/ffmpeg-filters.html#hqdn3d-1
[10:54:55 CET] <furq> for grain you can try lowering spatial and increasing temporal
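As a concrete sketch of that suggestion (the numbers are only illustrative), hqdn3d takes luma_spatial:chroma_spatial:luma_tmp:chroma_tmp:

    # weaker spatial smoothing, stronger temporal smoothing, aimed at flickering grain
    ffmpeg -i input.mkv -vf hqdn3d=2:1:8:6 -c:v libx264 -crf 18 -c:a copy output.mkv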
[10:55:00 CET] <raven_> fling: thanks
[10:55:28 CET] <furq> there are plenty of other denoise filters you can try as well
[10:55:33 CET] <furq> hqdn3d is pretty well liked though
[10:55:44 CET] <fling> raven_: yw
[10:56:13 CET] <fling> furq: now I only see noise on the hard edges, the image got more blurry, and the bitrate dropped to 18Mbps
[10:56:30 CET] <fling> I need to research these filters.
[10:56:54 CET] <fling> I never get around to denoising and re-encoding all this mjpeg
[10:57:27 CET] <fling> 1. there is some audio delay that needs to be fixed and 2. I don't want to ruin the picture quality with re-encoding :P
[10:57:34 CET] <fling> As I don't trust my eyes
[10:58:24 CET] <thebombzen> ummmm
[10:58:25 CET] <furq> you've also got atadenoise, dctdnoiz, nlmeans, owdenoise, pp, removegrain, vaguedenoise and probably some others i'm forgetting about
[10:58:45 CET] <thebombzen> so ffmpeg-filters.html lists "vsmbc" as an option for minterpolate
[10:59:11 CET] <thebombzen> but I'm getting: [Parsed_minterpolate_0 @ 0x1891720] Option 'vsmbc' not found
[10:59:18 CET] <furq> i've only ever used hqdn3d and nlmeans
[10:59:21 CET] <fling> furq: I like removegrain name
[10:59:38 CET] <furq> apparently that's spatial only so it might not be great
[10:59:40 CET] <beauty> who can give me an audio resample module to learn from?
[10:59:41 CET] <furq> no harm in trying though
[10:59:45 CET] <fling> I could try using nlmeans for removing only color noise
[11:00:06 CET] <c_14> thebombzen: vsbmc
[11:00:07 CET] <fling> spatial?
[11:00:08 CET] <c_14> looks like a typo
[11:00:45 CET] <thebombzen> c_14: that worked, yea the docs have a typo
[11:01:05 CET] <furq> spatial removes noise within a frame, temporal removes noise across multiple frames
[11:01:21 CET] <furq> temporal will do a better job with flickering etc because it can average multiple frames
[11:01:53 CET] <thebombzen> so the issue with noise is it's extra data that obscures the actual data
[11:02:12 CET] <beauty> who can give me an audio resample module to learn from?
[11:02:14 CET] <thebombzen> denoising is attempting to recover the signal behind it. if you use the previous or next frames then you have more to work with
[11:02:21 CET] <thebombzen> beauty: asking again won't help.
[11:02:31 CET] <furq> hqdn3d does both, hence 3d (x, y, time)
[11:02:31 CET] <beauty> sorry
[11:02:40 CET] <furq> a lot of them do both
[11:03:02 CET] <thebombzen> beauty: we didn't answer your question (or rather, I didn't) cause I don't really know what you're asking
[11:03:09 CET] <thebombzen> it will help to have a more specific question.
[11:03:39 CET] <beauty> thebombzen: sorry
[11:03:58 CET] <beauty> I want to use swr_convert to resample audio.
[11:04:12 CET] <beauty> but the code I wrote has a problem
[11:04:16 CET] <thebombzen> well, I can't help you there. I have not used swresample.
[11:04:47 CET] <beauty> so I want to know who can give me an audio resample module example.
[11:04:53 CET] <fling> which x264 profile to use if I'm only going to use the video with ffmpeg/mpv?
[11:05:07 CET] <fling> wow nlmeans is so slow!
[11:05:22 CET] <c_14> beauty: resampling_audio.c
[11:05:24 CET] <thebombzen> fling: depends on how long you want it to take to encode. if you don't mind it taking a while then use -profile:v slower
[11:05:27 CET] <furq> beauty: https://www.ffmpeg.org/doxygen/trunk/resampling_audio_8c-example.html
[11:05:33 CET] <c_14> ^that
[11:05:49 CET] <thebombzen> if you don't like waiting or need it faster use medium or fast.
[11:05:56 CET] <fling> thebombzen: not preset but profile
[11:06:03 CET] <thebombzen> oh profile? right
[11:06:04 CET] <thebombzen> duh
[11:06:10 CET] <thebombzen> I leave that alone.
[11:06:14 CET] <furq> don't set the profile unless you need to
[11:06:16 CET] <fling> the default is profile High 4:2:2, level 4.0, 4:2:2 8-bit
[11:06:17 CET] <beauty> furq: I have read it, but I can't get it to work.
[11:06:37 CET] <thebombzen> fling: by default x264 will use high or high10 or whatever variant of "high" is necessary
[11:06:45 CET] <furq> a few devices won't play back 4:2:2 or high profile, but if you're not targeting those devices then it doesn't matter
[11:06:49 CET] <beauty> my program gets a "__memcpy_ssse3_back" segmentation fault.
[11:07:08 CET] <thebombzen> fling: there's generally no reason to set the profile unless you want to force it downward. usually that's because some devices can't play the higher stuff.
[11:07:46 CET] <thebombzen> if you only are playing it on avcodec-based players on a general system, then you don't need to touch that option.
[11:08:35 CET] <thebombzen> all the variants of high are essentially based on the pixel format.
[11:09:06 CET] <fling> I thought of setting it to 4.2
[11:09:27 CET] <furq> that won't do anything
[11:09:27 CET] <thebombzen> that's the level.
[11:09:31 CET] <thebombzen> you also don't need to touch that.
[11:09:36 CET] <thebombzen> I'm not even sure how or why
[11:10:01 CET] <furq> there is a use case for setting level but it's very limited
[11:10:45 CET] <thebombzen> in general, with x264, you just need to use preset and crf, if all you're doing is feeding it to ffmpeg, mpv, or a website that feeds it to ffmpeg (e.g. YouTube).
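That is, something along these lines, with placeholder file names and a preset/crf chosen purely as an example:

    ffmpeg -i input.mkv -c:v libx264 -preset slow -crf 18 -c:a copy output.mkv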
[11:11:39 CET] <fling> ok, thanks.
[11:12:09 CET] <fling> But why not set a higher profile anyway? :P
[11:13:23 CET] <furq> because you don't need to
[11:13:35 CET] <furq> ffmpeg will set the correct profile automatically based on your encoding parameters
[11:13:50 CET] <c_14> x264 will set the lowest profile based on the features you're using
[11:13:50 CET] <furq> you only ever need to set a lower profile than the one which is autoselected
[11:14:17 CET] <furq> e.g. if you use -preset veryslow -profile baseline then it'll disable bframes, cabac etc
[11:14:21 CET] <fling> ahh this is why!
[11:14:52 CET] <fling> Thanks for the explanation!
[11:16:58 CET] <fling> I'm getting only 0.2 fps with 1600x1200 with -vf nlmeans :<
[11:17:08 CET] <furq> nice
[11:17:30 CET] <furq> i didn't even know nlmeans was in lavfi, it must be pretty new
[11:17:36 CET] <furq> i've only ever used the gpu-accelerated one in vapoursynth
[11:18:28 CET] <fling> The resulting video still looks noisy. Looks like the noise just lost colors but it is mostly still there
[11:18:44 CET] <fling> I more like hqdn3d
[11:19:11 CET] <fling> Can't I use my radeon to accelerate something?
[11:19:21 CET] <furq> not with ffmpeg
[11:19:31 CET] <thebombzen> you should if you compiled with --enable-opencl, right?
[11:19:47 CET] <fling> hmm hmmm
[11:19:54 CET] <furq> it doesn't mention acceleration in the filter docs
[11:20:00 CET] <thebombzen> only certain filters support opencl though.
[11:20:03 CET] <furq> yeah
[11:20:29 CET] <furq> vapoursynth has an accelerated nlmeans but that's a hell of a rabbit hole to go down
[11:20:47 CET] <thebombzen> afaik only deshake and unsharp support opencl in libavfilter
[11:20:54 CET] <fling> furq: but it does not look good!
[11:21:01 CET] <furq> stick with hqdn3d then
[11:23:21 CET] <fling> ok
[11:25:27 CET] <fling> I'm about to craft/determine the proper filtering parameters for the noisy mjpeg somehow
[11:27:29 CET] <durandal_170> try vaguedenoiser and atadenoise
[11:28:56 CET] <thebombzen> c_14: should I submit a patch about nonseekable matroskas?
[11:29:03 CET] <thebombzen> is that a regression or a deliberate change?
[11:29:14 CET] <thebombzen> not a patch* I mean a trac report.
[11:34:51 CET] <fling> thebombzen: you'd better submit a patch.
[11:35:25 CET] <thebombzen> um, I don't know how to use C beyond the basics
[11:35:41 CET] <thebombzen> the issue is I only want to submit a bug report if it isn't intended behavior
[11:35:47 CET] <thebombzen> and c_14 said they'd look into it
[11:37:56 CET] <fling> Another question is how to prevent creation_time from being converted to Zulu time?
[11:44:42 CET] <fling> durandal_170: hqdn3d looks the best
[11:46:16 CET] <durandal_170> fling: just an opinion, did you try only the defaults?
[11:47:37 CET] <fling> durandal_170: only defaults for now
[12:06:22 CET] <ngomes> hi. are there any options to split a video file by size?
[12:08:42 CET] <ngomes> i see the manual talks about the -fs parameter. i can split off 1 part, but how do i split the rest?
[12:11:18 CET] <ngomes> it outputs the time, i could parse it and use the -ss option; asking if there's a faster way to do it
[12:21:12 CET] <fling> I'm interested in this too. I would like to split output file each hour or each GB
[12:24:03 CET] <ngomes> fling, i've googled and found no satisfying answer. looks like splitting by size needs to be done manually
[12:24:19 CET] <ngomes> or by making a script
[12:25:38 CET] <ngomes> the desired answer would be for ffmpeg to do it automatically
[12:26:13 CET] <furq> one per hour is easy
[12:26:21 CET] <furq> there's no way to do it by size afaik
[12:27:02 CET] <fling> furq: how to?
[12:27:12 CET] <furq> https://www.ffmpeg.org/ffmpeg-formats.html#segment_002c-stream_005fsegment_002c-ssegment
[12:27:19 CET] <fling> Will it be possible to seamlessly concat afterwards?
[12:27:23 CET] <fling> thanks
[12:27:42 CET] <furq> it should work if you split to mpegts
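A sketch of the segment-muxer approach with hourly splits and a later lossless concat, using hypothetical file names:

    # split into one .ts file per hour without re-encoding
    ffmpeg -i input.mkv -c copy -f segment -segment_time 3600 -reset_timestamps 1 out%03d.ts
    # seamless concat afterwards; list.txt contains lines like: file 'out000.ts'
    ffmpeg -f concat -i list.txt -c copy joined.ts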
[12:28:03 CET] <ngomes> fling, ffmpeg -i inputfile.ext -vcodec copy -acodec copy -ss 00:00:00 -t 01:00:00 outfile_1hour.ext
[12:29:20 CET] <ngomes> fling, ffmpeg -i inputfile.ext -vcodec copy -acodec copy -ss 01:00:00 -t 01:00:00 outfile_1hour_2.ext
[12:29:25 CET] <ngomes> and so on ..
[12:29:26 CET] <furq> yeah don't do that
[12:33:07 CET] <ngomes> well thanks anyway. bye
[12:57:55 CET] <beauty> who can give me an audio resample program?
[13:09:43 CET] <beauty> Is SwrContext thread safe?
[15:47:49 CET] <laserbled> hi. Can you please explain the -q:v 0 -g 25 parameters in ffmpeg? I was playing around with gdigrab to capture the desktop and when I added this the quality of the stream improved drastically. I read it has to do with keyframes
[15:48:22 CET] <laserbled> trying to figure out which way i have to change the values of 0 and 25 to get even better quality and less delay
[16:09:49 CET] <DHE> -q:v 0 means codec-level lossless, and -g 25 means a keyframe every 25 frames (at most)
[16:10:16 CET] <JEEB> that is, if zero is lossless in the format you're trying to use
[16:10:35 CET] <DHE> I tend to assume h264 codec unless told otherwise
[16:11:05 CET] <DHE> especially in realtime environments because h265 seems unlikely
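For comparison, a hedged sketch of the same idea with libx264, where quality is usually steered with -crf (or -qp 0 for true lossless) rather than -q:v, and -g still caps the keyframe interval; the capture source and address are only placeholders:

    ffmpeg -f gdigrab -framerate 25 -i desktop \
           -c:v libx264 -preset ultrafast -tune zerolatency -crf 18 -g 25 \
           -f mpegts udp://192.168.1.11:53515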
[16:39:17 CET] <fling> Isn't there a how-to on determining the proper parameters for hqdn3d?
[16:47:00 CET] <fling> Is there a proper way of saving encoding parameters/commands/whatever to the target file? Or could I just put them to metadata?
[16:47:48 CET] <fling> Or I could just attach the capturing/encoding/muxing script as side data.
[17:01:54 CET] <fling> Is it possible to mute sound here and there for a second or so with -c:a copy?
[17:06:18 CET] <JEEB> no
[17:23:20 CET] <microchip_> fling: copy means "copy the audio/video without touching it". Else it wouldn't be called copy. So filters have no effect on stream copy
[17:33:37 CET] <kerio> it might be possible to losslessly mute compressed audio
[17:33:37 CET] <kerio> but
[17:33:51 CET] <kerio> i wonder how much would transpire
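If re-encoding the audio is acceptable, a sketch of selective muting with the volume filter's timeline option; the time range and file names are just examples:

    # mute the audio between t=10s and t=11s, leave the video untouched
    ffmpeg -i input.mp4 -c:v copy -af "volume=enable='between(t,10,11)':volume=0" -c:a aac output.mp4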
[17:36:58 CET] <abumusa> Hi, where can I find developers to hire for a freelance job?
[17:41:20 CET] <laserbled> DHE: JEEB thanks. I am trying to use mpegts as h264 has issues
[17:44:20 CET] <durandal_170> abumusa: what job?
[17:44:56 CET] <abumusa> durandal_170: I wrote a screen recorder for google chrome, I want someone to finish it
[17:45:50 CET] <kerio> laserbled: like, raw h264?
[17:45:52 CET] <kerio> no kidding
[17:46:17 CET] <kerio> i still don't know how to deal with raw bitstreams of arbitrary codecs tbh :\
[17:54:38 CET] <abumusa> durandal_170: are you interested?
[17:55:38 CET] <durandal_170> abumusa: busy with work
[17:55:48 CET] <abumusa> durandal_170: thanks
[17:57:48 CET] <DHE> laserbled: that makes no sense. mpegts is a file format, h264 is a codec
[17:58:00 CET] <DHE> you put h264 into mpegts
[18:23:50 CET] <kerio> DHE: i assumed he was outputting raw h264
[21:16:17 CET] <Zeranoe> c++
[21:17:43 CET] <JEEB> rust
[21:20:28 CET] <furq> standard ml
[21:28:51 CET] <fritsch> haskell
[21:53:56 CET] <nyuszika7h> c_14, the thing you suggested earlier for multiple snapshots does not work without repeating the -i after each -ss, it just generates one thumbnail and then gets stuck at 00:00:00 forever
[21:56:42 CET] <nyuszika7h> http://dpaste.com/08ZZCF9
[22:10:21 CET] <c_14> nyuszika7h: it works; it just takes forever
[22:10:29 CET] <nyuszika7h> oh
[22:10:46 CET] <nyuszika7h> well I'll just keep manually doing it then, I only need three thumbnails anyway
[22:10:54 CET] <c_14> Just used the command on a 1:09 (1 hour, 9 minute) movie and it took 9 minutes
[22:13:22 CET] <c_14> queuing 3 commands with just the initial seek and 1 frame is probably faster
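That is, roughly this pattern once per thumbnail, with placeholder timestamps and file names:

    # fast input seek, then grab a single frame
    ffmpeg -ss 00:10:00 -i input.mkv -frames:v 1 thumb1.png
    ffmpeg -ss 00:40:00 -i input.mkv -frames:v 1 thumb2.png
    ffmpeg -ss 01:05:00 -i input.mkv -frames:v 1 thumb3.png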
[22:14:13 CET] <iive> are you talking about x265?
[22:14:45 CET] <c_14> Nah, seeking in a video to get multiple thumbnails at specific points
[22:19:41 CET] <iive> :O
[23:14:18 CET] <c_14> thebombzen: seems like it was a deliberate change, though I don't see why they decided to write a void element instead of writing the element and then going back and overwriting it (other than that it was easier)
[23:40:17 CET] <mantas322> Hi guys
[23:40:26 CET] <mantas322> I'm trying to convert an .avi to an mp4
[23:40:42 CET] <mantas322> it's a 4K-res, 2 GB avi video
[23:41:10 CET] <mantas322> any advice as to why the default conversion settings fail?
[23:41:55 CET] <mantas322> my command is as simple as C:\ffmpeg\bin\ffmpeg.exe -i C:\vr\RedHorizon.avi C:\vr\RH.mp4
[23:42:11 CET] <c_14> and the console output?
[23:42:20 CET] <c_14> Also, if you just want to remux add -c copy
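That is, assuming the codecs inside the AVI happen to be mp4-compatible, a remux would look like:

    ffmpeg.exe -i C:\vr\RedHorizon.avi -c copy C:\vr\RH.mp4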
[23:42:30 CET] <mantas322> I'm reconverting it. please wait for it to finish
[23:42:38 CET] <mantas322> and i will post an imgur of the final result
[23:42:48 CET] <mantas322> still scrolling through frame by frame atm
[23:44:08 CET] <mantas322> c_14 http://i.imgur.com/H0irYmp.png
[23:44:20 CET] <mantas322> is that a success?
[23:44:29 CET] <c_14> yep
[23:44:36 CET] <mantas322> maybe it worked and I just can't view it for some reason
[23:44:45 CET] <mantas322> I'm attempting to create a 360 VR video
[23:45:05 CET] <c_14> What are you using to try and play it?
[23:45:13 CET] <mantas322> and I'm too cheap to afford Adobe Super Studio, or whatever it's called.
[23:45:21 CET] <mantas322> default Windows stuff
[23:45:26 CET] <mantas322> audio plays, no video
[23:47:04 CET] <c_14> Can you try playing with something like mpv or vlc?
[00:00:00 CET] --- Sun Dec  4 2016

