[Ffmpeg-devel-irc] ffmpeg.log.20160219

burek burek021 at gmail.com
Sat Feb 20 02:05:01 CET 2016


[00:02:00 CET] <wintershade> hmm let me try...
[00:03:08 CET] <wintershade> kepstin: okay, I tried ffprobe -show_format file.m4v |grep duration |sed s:duration=::g
[00:03:20 CET] <wintershade> and it got me the integer (well, float, but that's okay)
[00:03:35 CET] <wintershade> but it also printed a wall of text before that in the output
[00:03:48 CET] <wintershade> starting with ffmpeg version 2.6.8... etc.
[00:04:24 CET] <kepstin> sure, that wall of text is on stderr. There's some options to hide the banner, or you can just do 2>/dev/null on the ffprobe command
[00:09:31 CET] <C0nundrum> Is there a way to customize the message the rtmpt sends ?
[00:15:27 CET] <wintershade> kepstin: great! that worked, thanks
[00:15:35 CET] <wintershade> kepstin: I'll try tinkering around that one now
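kepstin's approach can also be done entirely inside ffprobe, which selects the field and hides the banner itself. A sketch, with a synthetic sine-tone file standing in for file.m4v:

```shell
# Generate a short stand-in input, then print only its duration in seconds.
ffmpeg -v error -f lavfi -i "sine=duration=1" -c:a pcm_s16le clip.wav
ffprobe -v error -show_entries format=duration \
        -of default=noprint_wrappers=1:nokey=1 clip.wav
```

-v error keeps the banner and version text off stderr, so no 2>/dev/null is needed.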
[06:36:46 CET] <CoJaBo> ..so here's yet another stupid problem: I have a camera that has the second line burned out. Is there a sane way to edit that out?
[06:44:10 CET] <furq> crop?
[06:50:30 CET] <CoJaBo> furq: Cropping would leave me with a really weird-sized video
[06:57:44 CET] <furq> weird how
[07:00:32 CET] <furq> i guess you could try delogo but idk how well that will work on a 1px high line
[07:46:56 CET] <CoJaBo> furq: ..how do I delogo?
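For reference, furq's delogo idea would look roughly like this: delogo interpolates a rectangle from the pixels around it, so it can paper over a defective scanline near the top of the frame. Its coordinates are literal numbers (no expressions) and the rectangle must not touch the frame edges; a synthetic 64x48 test pattern stands in for the camera footage here:

```shell
# Interpolate over a 2-pixel band starting at the second scanline.
ffmpeg -v error -f lavfi -i testsrc=duration=0.2:size=64x48:rate=5 \
       -vf "delogo=x=1:y=1:w=62:h=2" -f null -
```

How well this hides a 1-pixel burned-out line is an open question, as furq says; the result needs visual checking.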
[07:47:28 CET] <CoJaBo> An additional question came up; is there something like yuv4mpegpipe that supports YUVJ?
[07:55:44 CET] <relaxed> CoJaBo: you could do -c:v rawvideo -pix_fmt yuvj420p
[07:56:17 CET] <relaxed> look at "ffmpeg -pix_fmts | grep j" for other options
[07:59:19 CET] <CoJaBo> How do I read the rawvideo format?
[08:01:33 CET] <relaxed> you can try, -pix_fmt yuvj420p -f yuv4mpegpipe
[08:01:45 CET] <relaxed> but I'm not sure if it's supported
[08:04:17 CET] <CoJaBo> relaxed: That's the problem; it complains about it not being supported
[08:04:23 CET] <CoJaBo> but... why?
[08:06:53 CET] <relaxed> if it's not supported and you need yuvj420p, then use -c:v rawvideo -pix_fmt yuvj420p and output it in matroska
[08:07:41 CET] <CoJaBo> I want to actually pipe it tho, to some other tool (and then back to ffmpeg)
[08:07:44 CET] <relaxed> libx264 supports yuvj420p
[08:08:48 CET] <relaxed> you can pipe matroska or nut
[08:09:16 CET] <relaxed> ffmpeg -i input -c:v rawvideo -pix_fmt yuvj420p -f matroska - | ffmpeg -i - ...
[08:09:42 CET] <CoJaBo> what is nut?
[08:09:58 CET] <relaxed> a container, like .mkv or .mp4
[08:10:54 CET] <relaxed> why do you need to pipe from ffmpeg to ffmpeg? what are you doing?
[08:11:16 CET] <CoJaBo> I want to edit it in the middle somehow
[08:13:27 CET] <CoJaBo> (I often need to do bizarre things, like duplicating the second scanline of every second frame, un-fishbowling the clip, or doing a skew effect to the image)
[08:23:44 CET] <relaxed> what the tool in the middle expects is the info you need
[08:24:33 CET] <relaxed> can you send raw yuv?
[08:24:47 CET] <CoJaBo> That's what I want to do, yes
[08:25:07 CET] <CoJaBo> is nut the best format to use for that?
[08:25:24 CET] <CoJaBo> I'd previously used yuv4mpegpipe, but it appears to no longer be supported..
[08:25:31 CET] <relaxed> -pix_fmt yuvj420p -f rawvideo -
[08:27:23 CET] <relaxed> yuv4mpegpipe with yuvj420p?
[08:28:37 CET] <CoJaBo> yuv4mpegpipe doesn't support yuvj420p
[08:28:49 CET] <CoJaBo> I don't know why; might be a bug?
[08:30:00 CET] <relaxed> I mean you said it worked in the past. What doesn't work now?
[08:30:38 CET] <CoJaBo> It worked before because my other camera shot yuv420p
[08:32:35 CET] <relaxed> oh, then use -pix_fmt yuv420p -f yuv4mpegpipe -
[08:32:47 CET] <CoJaBo> I want to minimize the number of color format conversions
[08:33:57 CET] <CoJaBo> I'm hoping to edit directly in yuvj420p; tho I might have to convert to RGB, in which case I'm actually probably better off using nut/rawvideo anyway.. gah..
[08:35:06 CET] <relaxed> like I said, if whatever you're using in the middle supports raw yuv, then -pix_fmt yuvj420p -f rawvideo -
[08:36:07 CET] <CoJaBo> rawvideo alone loses the headers
[08:36:12 CET] <relaxed> indeed
[08:36:31 CET] <CoJaBo> which I kinda need lol
[08:38:06 CET] <relaxed> lossless h264 in matroska
[08:54:21 CET] <CoJaBo> relaxed: ..that is not at all easy to work with lol
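A sketch of the pipe relaxed describes: wrapping the raw frames in a container (nut here) keeps the size and pixel-format headers that a bare rawvideo stream loses. testsrc stands in for the real footage:

```shell
# First ffmpeg emits rawvideo-in-nut on stdout; second reads it from stdin.
ffmpeg -v error -f lavfi -i testsrc=duration=0.2:size=64x48:rate=5 \
       -c:v rawvideo -pix_fmt yuvj420p -f nut - \
  | ffmpeg -v error -i - -f null -
```

Any tool sitting between the two processes has to understand the container framing; for a tool that only eats headerless planes, -f rawvideo plus passing the width/height out of band is the alternative.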
[10:02:51 CET] <dystopia_> is there an easy way to cat vp6 flash video?
[10:03:00 CET] <dystopia_> without encoding to another format first
[10:20:00 CET] <claw> dystopia_: mkvmerge
[10:20:32 CET] <claw> oh sorry you are talking about vp6
[10:21:07 CET] <claw> nevermind
[10:36:14 CET] <dystopia_> mkvmerge worked claw
[10:36:18 CET] <dystopia_> thanks for the suggestion
[10:36:39 CET] <claw> well, we both learned something
[10:36:49 CET] <dystopia_> :)
[10:41:18 CET] <dystopia_> ahh it didn't work :(
[10:41:31 CET] <dystopia_> i was testing with files i had already encoded out, when i tried the source files it failed
[13:14:15 CET] <t4nk692> Hello, I need some help. Does anybody know where I can update my yasm/nasm or somehow make my ffmpeg installation work? This is the log: Jonathans-MacBook-Pro:~ jonathan$ /Users/jonathan/Downloads/ffmpeg-3.0/configure ; exit; yasm/nasm not found or too old. Use --disable-yasm for a crippled build.  If you think configure made a mistake, make sure you are using the latest version from Git.  If the latest version fails, report the prob
[13:17:58 CET] <J_Darnley> Build yasm or nasm from source?
[13:18:08 CET] <J_Darnley> http://yasm.tortall.net/
[13:18:56 CET] <J_Darnley> and I think nasm moved here http://www.nasm.us/
[13:22:16 CET] <ethe> t4nk692 you can get it from homebrew
[13:22:38 CET] <ethe> (or macports, I believe, although I've never used it)
[13:23:01 CET] <t4nk692> Right, I was through the yasm-site before during my search for a solution. I don't quite get how to perform a build from source. But I'll give it another shot.
[13:23:16 CET] <t4nk692> Trying the nasm site, downloading now.
[13:23:26 CET] <t4nk692> and will check homebrew
[13:23:28 CET] <cowai> My input is "720x576 [SAR 16:15 DAR 4:3]", how should I scale that to get the best possible square image?
[13:23:33 CET] <ethe> homebrew will be 100x easier
[13:23:52 CET] <cowai> right now I am using bwdif=0:0:0,scale=768x576
[13:23:57 CET] <ethe> you dont have to build from source, it generally already has binary packages, but if not, it automates building from source.
[13:24:09 CET] <TekniQue> cowai: square image or square pixels?
[13:24:14 CET] <cowai> pixels
[13:24:18 CET] <cowai> sorry :P
[13:24:18 CET] <TekniQue> 768x576
[13:24:33 CET] <TekniQue> gives you a 4:3 square pixel image
[13:24:44 CET] <cowai> that is what I am using currently.
[13:24:52 CET] <cowai> is that the most correct one?
[13:24:55 CET] <TekniQue> yes
[13:25:29 CET] <cowai> even when deinterlacing?
[13:25:38 CET] <ethe> t4nk692: and if you're just looking to get an ffmpeg release then you can get that from homebrew as well (using brew install ffmpeg), and if you want the latest git from within homebrew: brew install --HEAD ffmpeg
[13:25:41 CET] <TekniQue> yes
[13:26:13 CET] <cowai> so the image that I get by using -filter:v "bwdif=0:0:0,scale=768x576" is the best I can get with ffmpeg?
[13:27:01 CET] <cowai> I have been suggested to try to scale it up to twice the size and then use unsharp mask to try to make it better. Is that worthwhile you think?
[13:27:15 CET] <TekniQue> what's bwdif?
[13:27:25 CET] <cowai> ffmpeg's newest deinterlacer :)
[13:27:31 CET] <t4nk692> alright <ethe> I'm going into that now. Thank you!
[13:27:35 CET] <cowai> committed by durandal_1707 yesterday
[13:27:44 CET] <cowai> its a mix of yadif and w3fdif
[13:27:51 CET] <TekniQue> ok
[13:28:03 CET] <cowai> I have tried it with nice results
[13:28:23 CET] <cowai> more "balanced" for my use case which is a live stream with lots of different content.
[13:29:01 CET] <cowai> https://ffmpeg.org/ffmpeg-filters.html#bwdif
[13:29:02 CET] <TekniQue> but for scaling, yes it may give a smoother image to do it that way
[13:29:48 CET] <TekniQue> but even just applying the unsharp mask to it as is may produce a more pleasant image
[13:29:55 CET] <cowai> my output will be a hls stream, with different qualities. Today I have 192p, 360p and 576p. The source is a decklink card with SD-SDI
[13:30:04 CET] <TekniQue> I'd have to see a screenshot to be sure of what's going on
[13:30:21 CET] <TekniQue> normally I would let the player worry about scaling the picture
[13:30:33 CET] <TekniQue> unles you're using a codec that requires square pixels
[13:30:53 CET] <cowai> I think HLS requires it.
[13:31:18 CET] <TekniQue> nope
[13:31:50 CET] <cowai> I am also playing hls with hls.js and flashls.
[13:31:51 CET] <TekniQue> source: I've been using HLS since 2009
[13:32:13 CET] <cowai> I think when I didn't scale it, some of the different players made the image look squeezed
[13:32:30 CET] <Mavrik> But that has nothing to do with interlace
[13:32:34 CET] <Mavrik> That's wrong SAR
[13:32:38 CET] <Mavrik> Or players not listening to it.
[13:34:27 CET] <cowai> Mavrik:  We were talking about SAR
[13:35:00 CET] <Mavrik> Anyway, I've been broadcasting HLS to a bunch of clients and never had a requirement for square pixels.
[13:35:11 CET] <Mavrik> AR switch is another issue.
[13:44:33 CET] <cowai> Thanks. Should I let the other levels/qualities be the same SAR as the original when downscaling?
[13:54:45 CET] <TekniQue> yes
[13:55:16 CET] <cowai> do I just do scale=-1:360  after deinterlacing then?
[13:55:23 CET] <cowai> for a height of 360?
[13:55:33 CET] <cowai> and the SAR will be like it was?
[13:55:48 CET] <TekniQue> no I'd scale down on horizontal and vertical
[13:56:13 CET] <TekniQue> what I usually do is make the smaller pictures 1:1 PAR
[13:56:28 CET] <TekniQue> err, SAR
[13:56:49 CET] <J_Darnley> PAR=SAR
[13:57:18 CET] <TekniQue> yes, I'm used to talking about pixel aspect ratio and not sample aspect ratio
[13:57:46 CET] <TekniQue> for the same thing
[14:00:49 CET] <cowai> I must admit I am not sure what the difference is :/
[14:02:00 CET] <J_Darnley> There isn't one as far as you are concerned.
[14:02:04 CET] <J_Darnley> samples are pixels
[14:02:08 CET] <J_Darnley> pixels are samples
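The numbers in this exchange check out: SAR 16:15 means each stored sample is 16/15 as wide as it is tall, so the square-pixel width of a 720x576 frame is 720 * 16 / 15 = 768. A sketch on a synthetic input (setsar=1 flags the scaled output as square-pixel):

```shell
# The arithmetic behind 768x576:
echo $((720 * 16 / 15))   # 768
# The corresponding filter chain; testsrc stands in for the PAL source.
ffmpeg -v error -f lavfi -i testsrc=duration=0.2:size=720x576:rate=5 \
       -vf "setsar=16/15,scale=768:576,setsar=1" -f null -
```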
[14:03:31 CET] <cowai> Are there any new formats that will ever use a non-1:1 PAR? Isn't this just for legacy CRT monitor stuff?
[14:04:04 CET] <J_Darnley> Blurays probably still do in some of their dark corners
[14:04:26 CET] <cowai> :P
[14:04:27 CET] <J_Darnley> Broadcast might still do anamorphic 1080p
[14:04:37 CET] <J_Darnley> DVDs of course
[14:04:43 CET] <cowai> yes,
[14:05:12 CET] <cowai> But as the world moves towards HD anamorphic will die right?
[14:05:12 CET] <J_Darnley> It wouldn't surprise me if some "cheap" cameras do
[14:05:42 CET] <J_Darnley> I doubt it.
[14:05:49 CET] <J_Darnley> Video formats will support it
[14:05:49 CET] <cowai> I am trying to find the reason it was created in the first place.
[14:06:01 CET] <J_Darnley> So someone will still make videos that have it.
[14:06:15 CET] <J_Darnley> Ah
[14:06:32 CET] <J_Darnley> That probably goes back to early cinema cameras
[14:07:06 CET] <J_Darnley> One of the film formats used a lens which produced this kind of distortion
[14:07:13 CET] <cowai> I mean who in their right mind created "1440x1080" HDV
[14:07:37 CET] <J_Darnley> People without enough bandwidth or processing power for full hd
[14:07:39 CET] <cowai> I would rather have 720p forever than crazy 1440x1080
[14:08:04 CET] <Mavrik> Pretty much all SD broadcast TV has non 1:1 SAR
[14:08:53 CET] <J_Darnley> I would rather we stayed at SD.  No need for anything bigger.
[14:09:29 CET] <cowai> I work at an SD TV channel and the amount of work we spend worrying about fields is crazy in this HD age.
[14:09:43 CET] <J_Darnley> Now interlacing is another matter
[14:09:47 CET] <cowai> yes
[14:09:54 CET] <Mavrik> Interlacing must be taken outside and shot.
[14:10:07 CET] <J_Darnley> Anyone who produces an interlaced video should be publically executed
[14:10:32 CET] <cowai> Agreed
[14:10:34 CET] <Mavrik> Even though, people love that sweet 50i
[14:10:41 CET] <dv_> isnt interlacing effectively gone in HEVC?
[14:10:47 CET] <dv_> IIRC it is just present as a marker
[14:10:53 CET] <dv_> but no MBAFF or anything like that
[14:11:02 CET] <J_Darnley> That's not the problem
[14:11:09 CET] <J_Darnley> The content is the problem
[14:11:44 CET] <J_Darnley> Interlaced content remains interlaced no matter what encoding features are present.
[14:15:15 CET] <cowai> my tv provider at home scales 720p50 to 1080i before reencoding it to a lower bitrate. And then the receiver deinterlaces this to 1080p.
[14:15:39 CET] <cowai> it doesn't get much worse than that.
[14:15:56 CET] <J_Darnley> my condolences
[14:21:03 CET] <IntelRNG> It sounds awful.
[16:41:24 CET] <eightfold> hi there
[16:42:52 CET] <eightfold> i have two questions. i use -ss and -t when remuxing to move segments of files into a new container.
[16:43:15 CET] <eightfold> is there a way to set -t to "the end of the file", without specifying the exact length of the file?
[16:43:28 CET] <J_Darnley> Yes, don't give -t
[16:43:40 CET] <eightfold> J_Darnley: oh, thanks! :)
[16:44:44 CET] <eightfold> my second question. it would be simpler to calculate time if i could put the position of the -t in "time into the file" instead of "time from -ss"
[16:45:02 CET] <eightfold> is that possible? i've tried "-to", but it seems to work the same way as -t
[16:45:31 CET] <J_Darnley> Are you using -t or -to as an input or output option?
[16:46:56 CET] <eightfold> J_Darnley: hmm, i don't know, i thought -to replaced -t
[16:47:19 CET] <J_Darnley> Do you put it before the input file or before the output file?
[16:49:14 CET] <eightfold> J_Darnley: after input and before output
[16:49:44 CET] <J_Darnley> Try moving -to to before the input file.  It might work as you expect then.
[16:51:10 CET] <eightfold> J_Darnley: ok, i'll try. thanks!
[16:58:57 CET] <monkkey> Hi, is anyone there?
[17:00:17 CET] <eightfold> J_Darnley: when doing these kind of moves, should i add any option for better syncing or so? currently looks like this:
[17:00:18 CET] <eightfold> ffmpeg -ss 00:00:00.000 -i input.mp4 -t 00:39:01.500 -c copy output.mp4
[17:00:45 CET] <eightfold> i'm running ffmpeg 2.8.4
[17:01:29 CET] <J_Darnley> It can be a little imprecise when copying but I don't know what options may help fix/prevent that.
[17:01:46 CET] <J_Darnley> You can try looking at the various sync options.
[17:03:06 CET] <monkkey> Please, if someone can help me... I do it: ./opencvvehicletrack | ffmpeg -f rawvideo -pix_fmt gray -i - out.avi (opencvvehicletrack generate raw video) and then I get the next result: https://www.youtube.com/watch?v=HqpAv_oi3u4
[17:03:26 CET] <eightfold> J_Darnley: seems to be ok with preciseness
[17:03:50 CET] <J_Darnley> monkkey: you have a size or stride problem
[17:04:04 CET] <monkkey> stride?
[17:04:15 CET] <J_Darnley> aka pitch or linesize
[17:04:32 CET] <J_Darnley> (the number of bytes to move to the next line)
[17:05:33 CET] <monkkey> I send to the pipe raw image (mat of opencv)
[17:05:49 CET] <monkkey> and these mat are in the same size of the output
[17:06:13 CET] <monkkey> if I change the sizes, I get an error
[17:07:38 CET] <J_Darnley> Well, I'm surprised that the exact command you've shown doesn't just return an error.
[17:07:50 CET] <J_Darnley> You've not set the frame size at all
[17:09:07 CET] <monkkey> I'm sorry I delete it to simplify the question, here complete:
[17:09:34 CET] <monkkey> ./opencvvehicletrack | ffmpeg -f rawvideo -pix_fmt gray -s 720x480 -i - -an out.avi
[17:10:07 CET] <monkkey> Do you say that I should focus on size issues?
[17:10:29 CET] <J_Darnley> Yes.
[17:11:06 CET] <J_Darnley> It would appear that the output video is not exactly what it should be.
[17:11:44 CET] <J_Darnley> Your other tool should be writing exactly 345600 bytes per frame
[17:11:44 CET] <monkkey> ahhh I didn't know that!
[17:12:37 CET] <monkkey> Thanks!!
[17:12:45 CET] <J_Darnley> How many bytes did you think were in a greyscale 720x480 frame?
[17:13:30 CET] <monkkey> 345600 :D
[17:14:21 CET] <monkkey> I didn't figure out counting the mat size
[17:15:37 CET] <eightfold> J_Darnley: i tried:
[17:15:43 CET] <eightfold> ffmpeg -ss 00:21:07.000 -to 00:38:22.000 -i input.mp4 -c copy output.mp4
[17:16:04 CET] <eightfold> but got: "Option to (record or transcode stop time) cannot be applied to input file"...
[17:16:13 CET] <J_Darnley> Ah
[17:16:30 CET] <J_Darnley> I guess you can't do that then
[17:21:22 CET] <eightfold> J_Darnley: a good idea for the future i think. i have never come across a situation where i want "time from -ss". often times i think most people look up a start time and an end time in an existing file. right now i use https://www.timeanddate.com/date/timeduration.html to calculate the time between...
[17:22:35 CET] <J_Darnley> You can specify both those values in seconds if it help you at all.
[17:24:04 CET] <J_Darnley> I must say that I often want to do only a short bit.  I just give -t 10 (or whatever) in that case.
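A sketch of the arithmetic J_Darnley describes: with an input-side -ss, -t counts from the seek point, so cutting between absolute times T1 and T2 means -ss T1 and -t (T2 - T1). A synthetic 10-second WAV stands in for eightfold's MP4:

```shell
# Keep the 2 s .. 5 s span: -ss 2 seeks, -t 3 (= 5 - 2) limits the copy.
ffmpeg -v error -f lavfi -i "sine=duration=10" -c:a pcm_s16le full.wav
ffmpeg -v error -ss 2 -i full.wav -t 3 -c copy cut.wav
```

(Releases newer than the 2.8 discussed here later gained -to as an input option, which removes the subtraction.)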
[17:29:08 CET] <monkkey> the size is ok, I forget that I write to disk the output, and get an avi with the correct size
[17:29:27 CET] <monkkey> Maybe the problem is how ffmpeg takes the rawvideo?
[17:31:34 CET] <monkkey> after hours, the "scroll" is vertical and horizontal...
[17:32:02 CET] <guillaumekh> Hi all, looking for help :) Trimming audio files with -ss/-to generates a "click" sound at both ends of the file (see screenshot : https://www.dropbox.com/s/5wydz3zsspyzuj5/Screenshot%202016-02-19%2017.22.10.png?dl=0). Anyone familiar with this and ways to circumvent that ? The file is a 44kHz 16bit stereo PCM stream encoded in AAC LC 192kbps.
[17:32:51 CET] <kepstin> guillaumekh: what's the full command? are you transcoding or using -c:a copy?
[17:33:19 CET] <monkkey> Should I send between image and image a kind of separator byte?
[17:33:22 CET] <kepstin> I would expect something like that when trimming with copy, simply because of how lossy audio codecs work.
[17:33:42 CET] <kepstin> monkkey: nope, raw input to ffmpeg is the raw binary bytes of each frame, right after each other.
[17:34:06 CET] <kepstin> it uses the input size and pixel format options to determine how many bytes to read.
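Concretely, for monkkey's 720x480 gray stream the frame size kepstin describes is width * height bytes (one byte per pixel, single plane), and ffmpeg slices stdin into chunks of exactly that size:

```shell
# Bytes per frame for 8-bit gray 720x480:
echo $((720 * 480))   # 345600
# Feed exactly one all-zero frame to show the framing works.
head -c 345600 /dev/zero \
  | ffmpeg -v error -f rawvideo -pix_fmt gray -s 720x480 -i - -f null -
```

If the producer writes even one stray byte per frame (a header, a newline), every subsequent frame shifts, which produces exactly the diagonal "scroll" seen in the linked video.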
[17:37:51 CET] <cjbo> Hi! I'm thinking about a use of ffmpeg and would like to know about feasibility and hints. I want to split a video into two streams, one for left eye, another for right eye. Left eye would be 4x4 filtered stream of the original, where some squares are black, Right eye the complementary image.
[17:38:21 CET] <J_Darnley> Sounds horrible for compression!
[17:38:37 CET] <J_Darnley> At least make them macroblock sized.
[17:39:10 CET] <cjbo> When viewed with a 3d viewer like bino and anaglyph glasses, the viewer must integrate both images in his brain to be able to view it. I want it to help my son's amblyopia, hope having explained myself enough
[17:39:46 CET] <J_Darnley> As for doing that, I can't think of a simple way.  Perhaps abusing a lut filter.
[17:45:29 CET] <monkkey> yes, sorry my writing is horrible :(
[17:46:15 CET] <monkkey> I do: matriz.height * matriz.width... and OK
[17:46:38 CET] <monkkey> but when I count the bytes of the matrix there is one extra byte
[18:09:57 CET] <podman> alright, i figured out how to install the nvidia drivers on an EC2 instance running ubuntu and got ffmpeg installed with nvenc. w00t
[18:11:39 CET] <colde> Remember to make an AMI so you dont have to battle it again :)
[18:11:52 CET] <podman> colde: yeah, actually about to do that!
[18:12:22 CET] <podman> and then i use packer to install everything else on that base AMI and create a new AMI
[18:12:24 CET] <podman> fun times
[18:15:25 CET] <cjbo> a simpler question: I have a 4x4 checker board image file, I want to use it as a mask to an input video. Output video is visible only in white squares, black squares remain black....
[18:15:57 CET] <J_Darnley> Try the blend filter for that.
[18:16:19 CET] <cjbo> J_Darnley: ok, thanks
[18:16:50 CET] <J_Darnley> I still suggest bigger squares
[18:16:59 CET] <jas99> Hi thr
[18:17:46 CET] <J_Darnley> Unless you aren't planning on encoding the output with a typical codec.
[18:18:50 CET] <jas99> Can I add misc data in any container
[18:19:05 CET] <cjbo> J_Darnley: Well, I don't mind having quite large files. I want them as some sort of eye exercise for lazy eye treatment.
[18:19:19 CET] <J_Darnley> Typically no.  As metadata, often.
[18:19:43 CET] <jas99> hey thx for reply J
[18:20:51 CET] <jas99> Not worked with metadata but can I have my own keys in key-value pairs in metadata??
[18:21:33 CET] <jas99> I guess muxing will be required
[18:23:54 CET] <jas99> Hi guys
[18:24:14 CET] <jas99> Anybody thr
[18:24:16 CET] <J_Darnley> In formats which support it
[18:24:59 CET] <jas99> oh really cool, does mkv one of them?
[18:25:05 CET] <J_Darnley> I believe so.
[18:25:14 CET] <jas99> nice thx mate
[18:25:39 CET] <furq> -metadata foo=bar
[18:26:06 CET] <jas99> fruq: oh cool thx very much mate
[18:26:16 CET] <jas99> *furq
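A sketch of furq's flag against a Matroska output, which stores arbitrary key=value tags; "mykey" is an arbitrary example key and the input is a synthetic half-second tone:

```shell
ffmpeg -v error -f lavfi -i "sine=duration=0.5" -c:a pcm_s16le \
       -metadata mykey=myvalue tagged.mkv
# Read the tags back (Matroska may report the key upper-cased):
ffprobe -v error -show_entries format_tags -of default=noprint_wrappers=1 tagged.mkv
```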
[18:27:05 CET] <jas99> I was googling for another thing can I add presentation time PTS on frame manually?
[18:27:24 CET] <jas99> i am using c++
[18:28:05 CET] <J_Darnley> You can control many things through the API
[18:28:31 CET] <furq> can you control the thoughts of inferior species
[18:28:32 CET] <J_Darnley> If you want to change timestamps then do it for whatever data structures you have.
[18:28:41 CET] <furq> like "hu-mans"
[18:29:03 CET] <jas99> but can't init the codec without setting framerate
[18:29:12 CET] <jas99> lol
[18:29:21 CET] <jas99> i second that
[18:29:54 CET] <jas99> btw where are you guys from>
[18:30:00 CET] <J_Darnley> .be
[18:30:01 CET] <jas99> ?
[18:31:10 CET] <jas99> .be??
[18:31:58 CET] <J_Darnley> it borders .fr, .nl, .de
[18:32:16 CET] <furq> it's in .eu
[18:32:31 CET] <DHE> some IRC clients have a /country command for easy reference to this kind of thing
[18:32:35 CET] <furq> and also in .world
[18:32:36 CET] <jas99> belgium?
[18:32:55 CET] <J_Darnley> Congrats you triggered my highlighting
[18:33:30 CET] <furq> do you get highlighted by belgium
[18:33:38 CET] <J_Darnley> Yes
[18:33:42 CET] <furq> that sounds like an innuendo
[18:34:07 CET] <J_Darnley> belgian too I think
[18:34:37 CET] <furq> my hostmask says i'm from belgium but i'm not really. it's an elaborate plot
[18:34:42 CET] <furq> nobody will ever know where i'm from
[18:34:59 CET] <J_Darnley> omg criminal scum
[18:35:08 CET] <furq> that joke would work better in the channel where i mention it all the time
[18:35:10 CET] <jas99> I love bitcoin
[18:35:16 CET] <jas99> just random thought
[18:35:38 CET] <jas99> i feel like i'm speaking into a jar
[18:36:07 CET] <jas99> except it talks back, often with useful info
[18:39:12 CET] <jas99> so do i need to set framerate??
[18:39:37 CET] <J_Darnley> Usually no, the timing is done from timestamps
[18:40:24 CET] <jas99> but my codec won't initiate and asks to set framerate
[18:41:05 CET] <jas99> using ffv1 from api
[18:42:23 CET] <J_Darnley> Should I state the obvious then?  Set the framerate.
[18:44:45 CET] <jas99> sorry for being dumb :( :(
[18:45:00 CET] <jas99> inferior species
[19:08:46 CET] <cjbo> J_Darnley: I'm doing it drawing black boxes with drawbox... can i specify relative sizes to match videos of any size?
[19:11:52 CET] <J_Darnley> Check the docs to see if it supports expressions.
[19:13:19 CET] <cjbo> https://ffmpeg.org/ffmpeg-filters.html#drawbox it mentions 'expressions' in the parameters, does that mean yes?
[19:14:26 CET] <J_Darnley> yes
[19:15:47 CET] <cjbo> ...so i think i'm able to put something like 'width/4' or similar? I beg for a hint or words to search....
[19:16:43 CET] <J_Darnley> The link you just gave lists several variables you can use yourself
[19:16:56 CET] <J_Darnley> in_w and in_h
[19:18:37 CET] <cjbo> J_Darnley: thx, i think i'm too old for this sh... ;-)
[19:20:55 CET] <cjbo> I get only the last specified box...  ffmpeg -i v1.mp4 -vf drawbox=0:0:160:80:color=black:t=max -vf drawbox=321:0:160:80:color=black:t=max -q:v 0 salida.mkv
[19:21:41 CET] <J_Darnley> the next vf overrides the previous.
[19:21:51 CET] <J_Darnley> join the two chains with a comma
[19:22:04 CET] <cjbo> great!
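Putting J_Darnley's two pointers together, a comma-joined chain with relative (iw/ih) expressions looks roughly like this on a synthetic input; note that filled boxes are spelled t=fill in current ffmpeg (the t=max above is the older spelling):

```shell
# Two filled black boxes, each a quarter of the frame in each dimension,
# drawn in a single -vf chain so neither overrides the other.
ffmpeg -v error -f lavfi -i testsrc=duration=0.2:size=640x480:rate=5 \
       -vf "drawbox=x=0:y=0:w=iw/4:h=ih/4:color=black:t=fill,drawbox=x=iw/2:y=0:w=iw/4:h=ih/4:color=black:t=fill" \
       -f null -
```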
[19:52:44 CET] <rajkosto> yo why avcodec_decode_video2 return "invalid data found when processing input", im giving it one NAL(i tried both without 0x00000001 prefix and with) per invocation
[19:52:51 CET] <rajkosto> h264 decoding
[19:56:19 CET] <cjbo> J_Darnley: http://picpaste.com/captura-L8fjaDum.png
[20:03:33 CET] <cjbo> J_Darnley: http://picpaste.com/captura2-W4vvu7eL.png
[20:10:49 CET] <J_Darnley> Colourful
[20:11:35 CET] <cjbo> J_Darnley: that's for the anaglyph... i'll try it on a normal 3D television
[20:30:24 CET] <rajkosto> playing the NAL frames with 0x00000001 prefix dumped to a .264 file works with mplayer
[23:32:03 CET] <harej> I am trying to convert a file from MP4 to WebM. (Yes, compressed to compressed is not ideal, but it's what I have.) The source video is 720p. Despite setting high bitrate settings and setting the crf to 0 (!) I get these weird color splotch artifacts in the converted product. What setting should I change to avoid these? Are they unavoidable?
[23:32:17 CET] <harej> These are the settings I am using: -cpu-used 4 -threads 16 -crf 0 -b:v 7M -c:a libvorbis
[23:37:11 CET] <furq> -b:v 7M is overriding -crf
[23:37:37 CET] <furq> you can use both with vpx, the -b:v value is the bitrate cap
[23:37:58 CET] <furq> if you want a lossless encode then use -lossless 1
[23:41:01 CET] <harej> And does it make sense to use lossless if my source material is compressed? Will I be rid of the annoying color splotches?
[23:41:16 CET] <furq> well that's what people normally expect when they do -crf 0
[23:41:29 CET] <furq> i've never done a lossless vpx encode so i couldn't tell you
[23:41:48 CET] <harej> And are there other settings you'd recommend?
[23:41:59 CET] <furq> i'd just get rid of -b:v 7M for now
[23:42:16 CET] <furq> that's probably not going to work well in conjunction with -crf 0
[23:42:35 CET] <furq> also make sure you're using vp9 and not vp8 (-c:v libvpx-vp9)
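A sketch of the encode furq is steering toward: with libvpx-vp9, -b:v 0 together with -crf selects pure constant-quality mode, and -lossless 1 would replace both for a lossless encode. testsrc stands in for harej's 720p MP4, and the ffmpeg build must include libvpx:

```shell
ffmpeg -v error -f lavfi -i testsrc=duration=0.2:size=64x48:rate=5 \
       -c:v libvpx-vp9 -crf 30 -b:v 0 -an out.webm
```

Lower -crf values mean higher quality, and the resolution is preserved unless a scale filter is given, so 720p stays 720p by default.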
[23:59:25 CET] <harej> furq, so I did the conversion and now I am free of the artifacts, but I find that the resulting output is not as high-quality as the source
[23:59:34 CET] <harej> what can i do to preserve the 720p resolution?
[00:00:00 CET] --- Sat Feb 20 2016


More information about the Ffmpeg-devel-irc mailing list