[Ffmpeg-devel-irc] ffmpeg.log.20141110

burek burek021 at gmail.com
Tue Nov 11 02:05:01 CET 2014


[00:13] <AndyS90> you still here c_14 ?
[00:15] <c_14> ye
[00:16] <AndyS90> if you're interested in what you helped me with earlier, or i guess if anyone ever asks the same question - i put the code up on GitHub; https://github.com/andrew-s/roku_audio_encoder
[00:18] <c_14> Just a nitpick, it's FFmpeg, and ffprobe is all lowercase.
[00:19] <c_14> (for the readme)
[00:19] <AndyS90> :P updated
[00:20] <AndyS90> i guess it's a fairly niche thing i would imagine anyway
[00:20] <AndyS90> needing to add an AAC track to all your files i mean
[00:21] <c_14> I personally haven't had anybody with that issue besides you so far.
[00:21] <AndyS90> lol, fair enough
[00:21] Action: blippyp looks around the room to see if anyone is there...
[00:22] <blippyp> oops - wrong channel...  ;)
[00:52] <scoofy> hi. how can I determine the number of frames in a video?
[00:53] <scoofy> (other than, multiplying FPS by length in seconds)
[00:59] <sypher01> hello all ... quick question, i'm trying to convert subtitles from one format (ASS) to another (SRT) ..
[01:00] <sypher01> i'm pretty sure the ASS is ok, but when converting to SRT it retains stuff that i don't really want ... was wondering if there's a way to handle it. (example, tags with { } )
[01:40] <jfmcarreira> heyy guys
[01:42] <blippyp> hey frost
[01:49] <jfmcarreira> how can i get the raw frame data information after using av_read_packet and av_decode?
[02:04] <c_14> scoofy: ffprobe -show_streams -count_frames <- the nb_frames var
[02:08] <scoofy> c_14: thanks
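For reference, a fuller form of that invocation might look like the sketch below (input.mp4 is a placeholder); with -count_frames the counted total ends up in nb_read_frames, while nb_frames comes from container metadata when the muxer stored it:

    ffprobe -v error -select_streams v:0 -count_frames \
            -show_entries stream=nb_frames,nb_read_frames \
            -of default=noprint_wrappers=1 input.mp4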
[02:09] <kahrl> does anybody know a decent tutorial for recording audio generated by applications with ffmpeg and alsa and without pulse? (As a bonus, recording mic input and audio output at the same time)
[02:10] <kahrl> I tried a few things with snd_aloop but without luck, and at the same time the .asoundrc syntax is confusing me
[02:13] <c_14> I don't know of any tutorials, but I can give you some pointers.
[02:13] <kahrl> as a C programmer, I like pointers too
[02:14] <c_14> lel
[02:14] <c_14> https://trac.ffmpeg.org/wiki/Capture/ALSA <- this covers the basics as well as mic input
[02:15] <kahrl> yeah I saw that page
[02:15] <c_14> Applications should be as easy as modprobe snd-aloop pcm_substreams=1 and recording from hw:Loopback,1,0 and outputting to hw:Loopback,0,0
[02:16] <kahrl> thanks, let me try that
[02:17] <c_14> asoundrc should be pcm.!default hw:Loopback,0,0 (probably)
[02:21] <kahrl> c_14: just like that, without quotes and braces?
[02:21] <kahrl> because then it gives me an Input/output error when I try to record
[02:22] <kahrl> c_14: http://sprunge.us/WYhX
[02:23] <c_14> let me test over here
[02:25] <blippyp> anything - just newer - even if it doesn't completely work, just something that shows 'good code practice' etc... interesting design, but not so old and irrelevant...
[02:26] <blippyp> I really need to stop doing that - sorry guys
[02:28] <kahrl> c_14: ah, if I use pcm.!default "hw:Loopback,0,0" it works
[02:28] <kahrl> well, almost
[02:28] <kahrl> I got that far before already: it records the audio from the application but I can't hear it while recording
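Putting the pieces of this exchange together, the record-only setup looks roughly like this (a sketch; the quoted device string is the form kahrl found to work, and the substream numbering follows snd-aloop's defaults):

    # load the loopback card with a single substream
    modprobe snd-aloop pcm_substreams=1

    # ~/.asoundrc: make applications play into the loopback
    pcm.!default "hw:Loopback,0,0"

    # record what they play from the other end of the loop
    ffmpeg -f alsa -i hw:Loopback,1,0 capture.wav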
[02:29] <c_14> If you want to hear the audio while you're recording it, you'll either need to listen to the ffmpeg output (so you'll have a delay) or set up dmix
[02:30] <c_14> Wait, no
[02:30] <c_14> Not dmix, you need to duplicate the streams somewhere
[02:30] <c_14> ehh
[02:31] <c_14> This works, somehow.
[02:31] <c_14> Ye, with routing
[02:31] <c_14> https://raw.githubusercontent.com/vehk/dotfiles/master/asoundrc/asoundrc_multi <- look at the pcm.multi part
[02:32] <c_14> Have one of the slaves be the hw:Loopback,0,0 and the other the output device you want to listen on.
[02:33] <kahrl> what's a slave? a device where audio is routed to?
[02:33] <c_14> pretty much
[02:33] <kahrl> cool
[02:34] <c_14> computer lingo tends to have a lot of masters, slaves, servers, children, parents etc etc.
[02:35] <c_14> It's always great when you're talking with people about a process killing children and then people walk by.
[02:36] <kahrl> heh
[02:41] <kahrl> what does the ttable stuff mean?
[02:42] <kahrl> and the bindings?
[02:44] <c_14> The ttable duplicates the stereo input channels
[02:45] <c_14> Ie turns the 2 channels into 4 so the route can split the 4 into 2x2
[02:45] <c_14> The bindings tell the route which channels go where
[02:46] <c_14> The only part you should need to change are the slaves
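A sketch of what that multi/route pair tends to look like with the slaves filled in as described above; "hw:0,0" as the real output card and the pcm name "duplicate" are assumptions:

    pcm.multi {
        type multi
        slaves.a.pcm "hw:Loopback,0,0"    # capture side of the loopback
        slaves.a.channels 2
        slaves.b.pcm "hw:0,0"             # real output device (assumption)
        slaves.b.channels 2
        # bindings: which of the 4 routed channels goes to which slave channel
        bindings.0.slave a
        bindings.0.channel 0
        bindings.1.slave a
        bindings.1.channel 1
        bindings.2.slave b
        bindings.2.channel 0
        bindings.3.slave b
        bindings.3.channel 1
    }

    pcm.duplicate {
        type route
        slave.pcm "multi"
        slave.channels 4
        # ttable: copy each stereo input channel to both slaves (2 -> 4)
        ttable.0.0 1
        ttable.1.1 1
        ttable.0.2 1
        ttable.1.3 1
    }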
[02:50] <kahrl> now I get something about "channels count not available" whenever I use aplay
[02:51] <kahrl> or, actually
[02:51] <kahrl> if I route to "hw:0,0" I get "Device or resource busy", if I route to "hw:0,3" I get "channels count not available"
[02:52] <c_14> Can you pastebin your asoundrc?
[02:52] <kahrl> sure, one sec
[02:53] <kahrl> c_14: http://sprunge.us/eETe
[03:02] <c_14> Right
[03:03] <c_14> >pcm.!default { type plug slave.pcm "multi" }
[03:03] <c_14> replace your pcm.!default with that
[03:04] <kahrl> hrm
[03:04] <c_14> And you _might_ need to replace the slaves identifiers with similar constructs ie slaves.a.pcm "loopin"; where loopin is pcm.loopin { type plug slave.pcm "hw:Loopback,0,0" }
[03:05] <kahrl> still "Device or resource busy" if I route to "hw:0,0"
[03:05] <kahrl> ah, okay
[03:05] <kahrl> why is this so much more complicated than recording without immediate playback :P
[03:05] <c_14> Because ALSA is rather eh complex.
[03:07] <c_14> Yeah, I got it working on my end.
[03:07] <kahrl> congrats :)  I didn't...
[03:07] <c_14> Want me to pastebin my asoundrc?
[03:08] <kahrl> with "hw:0,0" still "Device or resource busy", with "hw:0,3" an assertion error
[03:08] <kahrl> that would be great
[03:09] <c_14> http://ix.io/f6Q
[03:09] <c_14> Then record with ffmpeg -f alsa -i loopout
[03:09] <c_14> While audio is playing
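The ix.io paste is no longer reachable; based on what is described above, the remaining glue looks roughly like this (a sketch; loopin and loopout are the names used in this conversation, and "duplicate" refers to the route pcm sketched earlier):

    # plug wrappers so sample format/rate conversion happens automatically
    pcm.loopin  { type plug slave.pcm "hw:Loopback,0,0" }
    pcm.loopout { type plug slave.pcm "hw:Loopback,1,0" }

    # send all default playback through the duplicating chain
    pcm.!default { type plug slave.pcm "duplicate" }

    # record while still hearing the audio on the real card
    ffmpeg -f alsa -i loopout capture.wav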
[03:13] <kahrl> I replaced Headset with the name of my card --> "Device or resource busy"
[03:14] <kahrl> why? nothing else is playing audio at the same time...
[03:14] <c_14> fuser -v /dev/snd/* ?
[03:15] <kahrl> returns nothing
[03:16] <c_14> what's your current asoundrc?
[03:17] <kahrl> http://sprunge.us/agSb
[03:18] <kahrl> MID is the HDA Intel MID card
[03:18] <c_14> can you pastebin aplay -L ?
[03:19] <kahrl> http://sprunge.us/RTYY
[03:20] <kahrl> if I try HDMI instead of MID I get: aplay: main:722: audio open error: No such file or directory
[03:21] <kahrl> (the same file plays perfectly fine when there is no .asoundrc)
[03:22] <c_14> Maybe use aplay -l to get the direct hardware device numbers?
[03:22] <c_14> If that doesn't work, you might want to ask #alsa for this port
[03:23] <kahrl> http://sprunge.us/gYWW <-- aplay -l and arecord -l
[03:26] <c_14> try hw:0,0 or hw:0 instead and switch the headset pcm to a type plug with pcm.slave as that hw:0 thing, like with loopin and loopout
[03:27] <kahrl> no luck
[03:28] <kahrl> whenever hw:0,0 is involved at any point, the device is busy
[03:28] <c_14> Ask #alsa ... or sett up a dmix in front of hw:0,0
[03:28] <c_14> *set
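A sketch of what "a dmix in front of hw:0,0" could look like; the ipc_key value is arbitrary, and the idea would be to point the multi slave at "dmixed" instead of "hw:0,0":

    pcm.dmixed {
        type dmix
        ipc_key 2048        # any unique integer
        slave.pcm "hw:0,0"
    }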
[03:28] <klaxa> *ehem* as expected of alsa
[03:30] <kahrl> well thanks anyway for that effort!
[03:30] <kahrl> *looks at clock* wow, it's been 1.5 hours since I joined
[03:31] <kahrl> if you could lend me your soundcard too :P
[03:31] <c_14> I kinda need that, sorry. :)
[03:40] <Hello71> hw: devices are by definition not multiplexed
[03:40] <kahrl> Hello71: is that a problem in my configuration?
[06:29] <Zombie> I'm having issues with encoding from a v4l compliant Capture Card.
[06:31] <Zombie> It seems any AVIs or mp4's I generate play a green screen on my Raspberry Pi based xbmc device, or play video with no audio under mplayer.
[07:57] <harovali> hi, say I have: ffmpeg -i seq.mp4 -vf "movie=output.mkv[inner];[in][inner]overlay=main_w-overlay_w-10:main_h-overlay_h-10[out]" -map "[out]" output.mkv  Why doesn't the -map "[out]" work? I get this error "Output with label 'out' does not exist in any defined filter graph, or was already used elsewhere."
[07:59] <harovali> (in this case, if I don't write the -map, the command works, but I'm trying to add the map because I want to add an audio track as another -map)
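Labeled filtergraph outputs are only visible to -map when the graph is given with -filter_complex rather than -vf, which is what the error is complaining about. A hedged sketch of that variant, with inner.mkv standing in for the overlaid movie and music.aac for the extra audio track (both names are placeholders):

    ffmpeg -i seq.mp4 -i music.aac \
           -filter_complex "movie=inner.mkv[inner];[0:v][inner]overlay=main_w-overlay_w-10:main_h-overlay_h-10[out]" \
           -map "[out]" -map 1:a output.mkv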
[08:23] <Zombie> It seems any AVIs or mp4's I generate play a green screen on my Raspberry Pi based xbmc device, or play video with no audio under mplayer.
[11:38] <gcl5cp> what is the best (smaller size, good quality) speech encoder? http://en.wikipedia.org/wiki/List_of_codecs#Voice
[11:42] <relaxed> gcl5cp: probably libopus
[11:42] <relaxed> http://www.opus-codec.org/
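A sketch of a speech-oriented encode with ffmpeg's libopus wrapper; the filenames and the 24 kb/s figure are just starting-point assumptions:

    # mono at ~24 kb/s is usually plenty for plain speech
    ffmpeg -i speech.wav -ac 1 -c:a libopus -b:a 24k -application voip speech.opus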
[11:51] <sky2> i'm really at my wits' end trying to figure out why i can't join other channels on IRC
[11:52] <relaxed> sky2: #hexchat is where you need to ask about this.
[11:53] <sky2> ok relaxed
[11:54] <gcl5cp> thanks relaxed, i will test it
[12:16] <sky2> i was only able to join #ffmpeg with sky2
[13:39] <anshul_mahe> I have added a feature to ffmpeg; I want it enabled only when I say --enable-feature=foo, but its codec file is compiled even if I don't say --enable-feature=foo
[13:53] <TiberiusReilly> good morning ffmpeg
[13:53] <TiberiusReilly> I have a problem when using filter_complex on windows to concatenate some files
[13:54] <TiberiusReilly> I have the error "Error while filtering"
[13:55] <TiberiusReilly> The command I am running is as follows: http://hastebin.com/tihabagute.vhdl
[14:16] <TiberiusReilly> The problem was that the ffmpeg build I was using was from March last year
[15:19] <ChampS_> hey guys
[15:19] <ChampS_> I'm programming a GUI for ADRone and receiving a UDP stream
[15:20] <ChampS_> I want to put this stream into ffmpeg and display it in my JavaFX window
[15:20] <ChampS_> but the delay is 13s
[16:42] <BlackDream> Hello, i have 2 TS videos that i want to combine into one. Why doesn't this work? ffmpeg -i "concat:1.ts|t.ts" -codec copy -f mpegts /tmp/combined.ts
[16:42] <BlackDream> it only adds the first ts
[16:42] <Lac3rat3d> can someone explain these options for libmp3lame: -q:a 5 -ac 2 -ar 44100
[16:44] <relaxed> vbr setting 5, channels 2, audio rate 44100
[16:45] <relaxed> `man lame` to read about the vbr ranges
[16:46] <Lac3rat3d> thanks :)
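Put together into a full command (a sketch; the filenames are placeholders, and -q:a 5 lands roughly in the 120-150 kb/s VBR range):

    ffmpeg -i input.wav -c:a libmp3lame -q:a 5 -ac 2 -ar 44100 output.mp3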
[16:49] <blippyp> blackdream - try ffmpeg -i 1.ts -i t.ts -vf "concat" -c:v copy combined.ts
[16:50] <BlackDream> Filtergraph 'concat' was defined for video output stream 0:0 but codec copy was selected.
[16:50] <BlackDream> Filtering and streamcopy cannot be used together.
[16:50] <blippyp> you should also be able to just cat the files together as well, like: cat 1.ts t.ts > combined.ts - but this probably isn't a great solution. someone with a little more know-how might be able to tell you why.
[16:51] <BlackDream> Yes but i want to combine mp4 movies into one without creating a new file
[16:51] <blippyp> but if you plan on just re-encoding them again later, it probably doesn't hurt
[16:51] <blippyp> you have to
[16:51] <BlackDream> https://trac.ffmpeg.org/wiki/How%20to%20concatenate%20(join,%20merge)%20media%20files
[16:51] <blippyp> what you want to try to avoid though is re-encoding it
[16:51] <BlackDream> but here it is using concat without creating new file
[16:52] <blippyp> checking out link
[16:52] <relaxed> BlackDream: or mp4 use `mp4box -cat 1.mp4 -cat 2.mp4 -new combined.mp4`
[16:53] <blippyp> ya - I don't see a 'join' command... or anything like that - the fastest you're going to get is by remusing it.
[16:53] <relaxed> for, not or
[16:53] <blippyp> remuxing*
[16:53] <BlackDream> relaxed yes, that's fine. However, why doesn't the above concat command work? It successfully starts 1.ts but completely ignores t.ts, as if i didn't write it at all
[16:54] <blippyp> I don't think you used the command properly - do it like I showed you
[16:55] <BlackDream> this one:  ffmpeg -i 1.ts -i 2.ts -vf "concat" -c:v copy combined.ts
[16:55] <blippyp> yes
[16:55] <blippyp> that should work
[16:55] <BlackDream> Yes i get
[16:55] <BlackDream> Filtergraph 'concat' was defined for video output stream 0:0 but codec copy was selected.
[16:55] <BlackDream> Filtering and streamcopy cannot be used together.
[16:55] <blippyp> oh yeah - I forgot that
[16:56] <blippyp> then use a bitstream filter on them
[16:56] <blippyp> hold on I'm trying to find it
[16:56] <blippyp> https://www.ffmpeg.org/ffmpeg-bitstream-filters.html
[16:56] <relaxed> oh, try ffmpeg -i concat:1.ts\|2.ts -c copy output.ts
[16:57] <relaxed> I thought you said mp4
[16:57] <BlackDream> relaxed yes i will use it with mp4 later.
[16:58] <blippyp> with mp4 you have to use the bitstream filter - the link I showed you last
[16:58] <BlackDream> Ah guys, i think i know what the issue is with the above command and concat. It only adds the 1.ts, because when ffmpeg tries to combine the 2.ts it says:
[16:58] <BlackDream> [mpegts @ 0x427d760] New video stream 0:2 at pos:2052660 and DTS:1.4s
[16:58] <BlackDream> [mpegts @ 0x427d760] New audio stream 0:3 at pos:2109248 and DTS:1.45778s
[16:58] <BlackDream> [mpegts @ 0x427d760] PES packet size mismatch
[16:58] <relaxed> unless it errors out you're fine
[16:59] <BlackDream> yes, they are in yellow
[16:59] <relaxed> those are just inormative
[16:59] <relaxed> informative*
[16:59] <BlackDream> for some reason the output file is larger than the two input ts files together, but it only plays the first ts
[17:00] <BlackDream> the first TS is 8 seconds and then it stops
[17:01] <relaxed> tsmuxer might be a better tool or this job
[17:01] <BlackDream> anyway, with mp4 the above command will work? i will try now
[17:01] <relaxed> for (my f key is flakey)
[17:02] <blippyp> blackdream - just cat them together like I said - I've done it lots
[17:03] <BlackDream> it is ok to cat 2 mp4 files together?
[17:03] <blippyp> but like I said - I usually re-encode them again later
[17:03] <relaxed> no
[17:03] <blippyp> no - you have to use the bitstream filter like I said
[17:03] <BlackDream> ok i see
[17:03] <blippyp> h253_mp4toannexb
[17:03] <BlackDream> w8
[17:03] <blippyp> oops -  h264_mp4toannexb
[17:04] <blippyp> it's fast though
[17:04] <relaxed> and with .ts it may work but you're destroying time stamps
[17:04] <blippyp> yeah it's 'wonky' which is why I re-encode them
[17:04] <blippyp> I would never try to distribute the 'cat/joined' versions
[17:04] <blippyp> it works because .ts is like a 'raw' format or something - can't remember the details
[17:05] <blippyp> also works with mpg's iirc...
[17:05] <BlackDream> blippyp you suggest this one: ffmpeg -i 1.ts -i 2.ts -vf "concat" -c:v copy -bsf:v h264_mp4toannexb combined.ts
[17:05] <BlackDream> ?
[17:06] <blippyp> ya - something like that - you're on the right track now
[17:06] <BlackDream> Filtergraph 'concat' was defined for video output stream 0:0 but codec copy was selected.
[17:06] <BlackDream> Filtering and streamcopy cannot be used together.
[17:06] <BlackDream> again
[17:06] <blippyp> change -c:v copy to -codec copy
[17:06] <Zombie> It seems any AVIs or mp4's I generate play a green screen on my Raspberry Pi based xbmc device, or play video with no audio under mplayer.
[17:06] <relaxed> use my command with that bitstream filter
[17:06] <blippyp> sorry - you have to convert them first
[17:06] <blippyp> don't concat them
[17:07] <blippyp> ffmpeg -i 1.ts -codec copy -bsf:v h264_mp4toannexb out1.ts
[17:07] <blippyp> ffmpeg -i 2.ts -codec copy -bsf:v h264_mp4toannexb out2.ts
[17:07] <Zombie> Other conversions to avi and mp4 have had anomalies like slowed or sped up playback.
[17:07] <blippyp> then cat out1.ts out2.ts>combined.ts
[17:08] <blippyp> then bitstream them back to mp4 format
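Written out for the mp4 case BlackDream is actually after, the three steps look roughly like this (a sketch; the filenames are placeholders, and aac_adtstoasc is only needed when the audio is AAC):

    # 1. remux each mp4 to MPEG-TS with Annex B H.264
    ffmpeg -i 1.mp4 -c copy -bsf:v h264_mp4toannexb 1.ts
    ffmpeg -i 2.mp4 -c copy -bsf:v h264_mp4toannexb 2.ts

    # 2. concatenate the transport streams byte-wise
    cat 1.ts 2.ts > combined.ts

    # 3. remux the joined stream back into an mp4 container
    ffmpeg -i combined.ts -c copy -bsf:a aac_adtstoasc combined.mp4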
[17:08] <BlackDream> i can't use this filter because the container is the same for both
[17:08] <blippyp> I use to do this all the time with the video from my hauppauge pvr
[17:08] <BlackDream> i get an error
[17:08] <relaxed> ffmpeg -i concat:1.ts\|2.ts -c copy -bsf:v h264_mp4toannexb output.mp4
[17:08] <blippyp> stop concatenating the
[17:08] <blippyp> the* them
[17:09] <blippyp> first you convert them
[17:09] <blippyp> then cat them together
[17:09] <blippyp> then convert the joined file back to mp4
[17:09] <DelphiWorld> blol blippyp
[17:09] <relaxed> why not do it all at once?
[17:09] <blippyp> because it won't let you
[17:09] <blippyp> at least I've never found a way
[17:10] <blippyp> it's fast - it basically just copies the files
[17:10] <relaxed> yes it will, my command doesn't use the filter
[17:10] <blippyp> oh - sorry -wasn't paying attention...
[17:10] <blippyp> cool - I'll have to try that
[17:10] <BlackDream> relaxed, that didn't combine the ts; i ran the video and again i only see the first one
[17:11] <relaxed> blippyp: it uses the concat protocol- confusing, I know
[17:12] <relaxed> BlackDream: hmm, can you pastebin the command and output
[17:12] <blippyp> ya - thanks - I will definitely try that next time
[17:12] <relaxed> which *should* work with mpeg containers
[17:12] <BlackDream> ok, hold on for a minute. Forget the TS files. I will use mp4 for now, as that was my job from the start. So i have 1.mp4 and 2.mp4; i have downloaded some sample files
[17:13] <BlackDream> I tried your command changing the ts to mp4 and again i was able to see only the first one
[17:13] <BlackDream> also ffmpeg stopped when frame=  166 fps= 30 q=-1.0 Lsize=     377kB time=00:00:05.56 bitrate= 554.5kbits/s
[17:13] <BlackDream> video:315kB audio:56kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.595011%
[17:13] <BlackDream> at 5 seconds, but both mp4 files are 10 seconds
[17:13] <relaxed> fucking pastebin
[17:13] <BlackDream> ok ok okok
[17:14] <relaxed> you can't spam random, cherry picked output and expect us to know what's going on
[17:14] <BlackDream> http://pastebin.com/2TwSH3B7
[17:16] <relaxed> BlackDream: The concat protocol doesn't work with mp4s. Use the mp4box command I gave you earlier
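The wiki page linked above also covers the concat demuxer, which does handle mp4 as long as both files share the same codecs and encoding parameters; a sketch of that route:

    # files.txt contains one line per input:
    #   file '1.mp4'
    #   file '2.mp4'
    ffmpeg -f concat -i files.txt -c copy combined.mp4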
[17:17] <BlackDream> ok wait
[17:17] <blippyp> gotta go - you need that bitstream filter though - mp4's need them when joining/splitting, that mp4box that relaxed is suggesting is likely doing exactly the same thing
[17:17] <blippyp> good luck
[17:29] <Lac3rat3d> i have an avi that has audio delayed by a second or two, can anyone help me with a command to fix it?
[17:32] <relaxed> Lac3rat3d: ffmpeg -i input.avi -itsoffset -2 -i input.avi -map 0:v -map 1:a -c copy output.avi
[17:33] <Lac3rat3d> thanks
[17:33] <relaxed> wrong :(
[17:33] <Lac3rat3d> ??
[17:33] <Lac3rat3d> too many -i input :)
[17:33] <Lac3rat3d> ffmpeg -i input.avi -itsoffset -2 -c:v copy -c:a copy output.avi
[17:33] <Lac3rat3d> ?
[17:33] <relaxed> no, it's right :)  You have to use the same input twice
[17:33] <Lac3rat3d> oh
[17:34] <Lac3rat3d> -2 == delay audio by 2 seconds?
[17:34] <relaxed> we hope
[17:34] <Lac3rat3d> ... o_O
[17:34] Action: relaxed is having pints and giving tech support
[17:35] <Lac3rat3d> ok
[17:35] <Lac3rat3d> i'll try it out. thanks.
[17:40] <relaxed> BlackDream: did that work out for you?
[17:41] <TiberiusReilly> :D
[17:41] <TiberiusReilly> Pardon me, wrong window
[17:41] <relaxed> Lac3rat3d: you need the -map(s) too
[17:41] <Lac3rat3d> ya, i see what you did there :)
[17:42] <Lac3rat3d> video from file one, audio from file 2 (with delay)
[17:42] <Lac3rat3d> i'll test it out in a bit
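For the record, -itsoffset shifts the timestamps of whichever input follows it, so a negative value pulls that stream earlier and a positive value pushes it later; a sketch of both directions (the 2-second figure is just the value discussed):

    # audio arrives ~2 s late: pull the audio 2 s earlier
    ffmpeg -i input.avi -itsoffset -2 -i input.avi -map 0:v -map 1:a -c copy output.avi

    # audio runs ~2 s ahead: push it later instead
    ffmpeg -i input.avi -itsoffset 2 -i input.avi -map 0:v -map 1:a -c copy output.avi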
[18:35] <ac_slater_> hey all. I'm interested in taking H264 nal units and creating an MPEGTS format context via libavformat. Has anyone done this? Or should I ask on the mailing list
[18:42] <tolmark12> When I export to mp4, the first second of the video is low resolution. Is there a flag that will prevent this? I assume it is low res to allow fast start for internet streaming.
[18:42] <tolmark12> Command / output : https://gist.github.com/Tolmark12/817a44ce048079355c8f
[18:42] <Lac3rat3d> relaxed: works, thanks :)
[18:43] <jfmcarreira> heyy guys
[18:43] <jfmcarreira> how is data organized inside an AVFrame?
[18:45] <tolmark12> http://shots.delorum.com/client/view/Screen%20Shot%202014-11-10%20at%2010.44.35%20AM.png (first second on left, after a few seconds on the right)
[18:47] <rcombs> tolmark12: pastebin your entire ffmpeg invocation and its output
[18:48] <tolmark12> https://gist.github.com/Tolmark12/817a44ce048079355c8f
[18:49] <rcombs> uh, 550kbps is pretty low
[18:49] <tolmark12> what's a better bitrate?
[18:50] <rcombs> if you care more about quality than bitrate, try using x264's CRF-mode
[18:52] <tolmark12> bitrate is important since it will be sent over the internet, but maybe I just need to bump up the quality
[18:53] <rcombs> often CRF-mode with the `maxrate` parameter is a good choice, then
[18:54] <rcombs> or, if average bitrate is absolutely essential (hint: it's probably not), 2-pass encoding
[18:55] <tolmark12> cool, that's helpful
[18:57] <tolmark12> do you have a suggestion of a good maxrate value I should start testing at?
[18:58] <rcombs> depends on your intended audience; 1Mbps might be a good place to start if -b:v 550k looked largely OK
[18:59] <tolmark12> cool, thanks for the help!
[18:59] <rcombs> welcome!
[18:59] <rcombs> oh, also, if speed doesn't matter, set -preset veryslow
[18:59] <tolmark12> ok
[19:00] <rcombs> and if you want it to playback on mobile devices and such, you may want to set an H.264 profile and/or level (see the x264 docs)
[19:00] <Zombie>  I seem to get more reliable results from Libreoffice once I run over the file with the ODF Integrator Converter, rather than simply opening the file directly in Lo 4.3
[19:00] <tolmark12> ohhh, I was wondering about that, thanks
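A sketch pulling those suggestions into one command; the crf, maxrate, profile and level values are assumptions to tune for the target audience, and -c:a copy keeps whatever audio is already there (assuming it is mp4-compatible):

    ffmpeg -i input.mov -c:v libx264 -preset veryslow \
           -crf 23 -maxrate 1M -bufsize 2M \
           -profile:v main -level 3.1 \
           -movflags +faststart -c:a copy output.mp4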
[19:01] <Zombie> On the topic of this channel.
[19:02] <Zombie> I'd like some advice on digitizing VHS tapes of family movies.
[19:02] <rcombs> ack ack ack ack ack analog
[19:05] <Zombie> My goal is to burn them to DVD and make them available via our Raspberry Pi based devices.
[19:06] <tolmark12> @rcombs thanks again (!) that did the trick, looks great
[19:07] <rcombs> huzzah!
[19:36] <c_14> Zombie: if you want to burn them so that they can be read by a dvd player, just use -target dvd (or pal-dvd or ntsc-dvd depending).
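For NTSC material that would be something like the following sketch (the filenames are placeholders):

    ffmpeg -i capture.avi -target ntsc-dvd dvd.mpg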
[19:37] <Zombie> I'm having trouble encoding them.
[19:39] <Zombie> ffmpeg -i tv:// -tv driver=v4l2:device=/dev/video0:input=1:norm=ntsc:adevice=/dev/audio2 Mom_2nd_attempt.mp4  Here is the command
[19:40] <c_14> What's the problem?
[19:40] <Zombie> It seems any AVIs or mp4's I generate play a green screen on my Raspberry Pi based xbmc device, or play video with no audio under mplayer.
[19:41] <c_14> ffprobe $file
[19:41] <c_14> And pastebin it
[19:54] <Zombie> http://pastebin.com/YHtsfJ5g
[19:56] <c_14> try adding -pix_fmt yuv420p
[19:57] <c_14> Not sure why mplayer isn't playing the audio though. Unless, it's been built to dislike mp3s...
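A hedged sketch of a v4l2 capture command along those lines; the device path, the ALSA card number, and the composite-input channel are assumptions carried over from the original attempt and will likely need adjusting:

    ffmpeg -f v4l2 -standard ntsc -channel 1 -i /dev/video0 \
           -f alsa -i hw:2 \
           -c:v libx264 -preset fast -pix_fmt yuv420p \
           -c:a libmp3lame -q:a 4 \
           capture.mp4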
[20:02] <Zombie> I think that has more to do in this case with my Capture Equipment.
[20:02] <Zombie> This is a VCR Connected to the card with Composite Cables.
[20:02] <c_14> So? As long as ffmpeg doesn't throw errors, the output should be playable.
[20:06] <Zombie> What would the ideal method for capturing from this VCR be?
[20:07] <Zombie> I've made many attempts with mencoder, vlc and ffmpeg
[20:07] <c_14> tbh, whatever works
[20:11] <Zombie> Nothing I have done so far has truly worked to my complete satisfaction.
[21:48] <pomaranc> Hi, I have an mpegts (mpeg2+ac3) video with a 10s hole in the audio... transcoding to mpegts (h264+aac) removes that hole (thus, the sound is out of sync) - what can I do to prevent this?
[21:49] <Mavrik> probably use copyts parameter?
[21:49] <pomaranc> only thing that works is -copyts, but I can't use that(other problems)
[21:49] <pomaranc> yep...
[21:49] <pomaranc> isn't there any other way?
[21:50] <Mavrik> what other way would you want?
[21:50] <pomaranc> other than copyts
[21:50] <Mavrik> either you leave the same timestamps, which will keep the hole
[21:50] <Mavrik> or you generate new ones, which won't
[22:32] <LanDi> how can I cut a part of some video with ffmpeg ?
[22:32] <LanDi> can anyone link me ?
[22:36] <LanDi> got it
[22:36] <LanDi> :P
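For the archive, a common way to cut a piece out without re-encoding (a sketch; the times and filenames are placeholders, and with -c copy the cut snaps to the nearest keyframe):

    # grab 60 seconds starting near the 1-minute mark
    ffmpeg -ss 00:01:00 -i input.mp4 -t 60 -c copy cut.mp4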
[00:00] --- Tue Nov 11 2014

