[Ffmpeg-devel-irc] ffmpeg.log.20141009

burek burek021 at gmail.com
Fri Oct 10 02:05:01 CEST 2014


[00:21] <soreau> Using avconv from avi to mp4 with -c:v libx264 -c:a copy, the audio is out of sync a few hundred milliseconds. Is there some way to convert with guaranteed audio sync?
[01:17] <jverce> Hi there. Is there a way I can start an ffserver session from within a Java app, but without calling the external command?
[04:02] <Akagi201> I am converting an RTSP stream to an RTMP push stream, and I got this error.
[04:02] <Akagi201> https://gist.github.com/anonymous/25e87f381ebced3983e1
[04:02] <Akagi201> Do I miss some filter settings?
[04:03] <Akagi201> I use ffmpeg command, it works.
[04:03] <Akagi201> ffmpeg -i rtsp://192.168.1.210/ch1/main -acodec copy -vcodec copy -f flv "rtmp://xxxx"
[04:12] <Akagi201> I found the problem: with ffmpeg -i rtsp://xxxx I got Stream #0:2: Data: none. How can I detect it and avoid copying it to the output stream?
[06:04] <kunterbunt> I am having trouble with the rotation of FFmpeg-transcoded videos that are being uploaded from iPads. The original file displays right-side up regardless of what orientation the iPad was in while capturing the video. The transcoded videos and the screenshots, however, are not displayed right-side up. If the video is captured in portrait, then the encoded videos and thumbnail are rotated -90 degrees, while the landscape
[06:04] <kunterbunt> videos are not adjusted at all (meaning the upside-down videos stay upside-down). I am using a simple wrapper around the FFmpeg command line tool. Could this be an issue with the command I'm passing to FFmpeg? Do I need to do something special to preserve the rotation metadata? Here is a pastebin of the command I'm using, with variables: http://pastebin.com/GJawt9PS. I'm new to the world of transcoding so any tips would be
[06:04] <kunterbunt> appreciated.
[07:57] <Schnabeltierchen> I've been searching the forum/Google for some time but didn't find a proper solution. Is there an easy option for ffmpeg to keep all subtitles/all audio tracks etc., and only re-encode the video to h264/AVC? The forum said converting to h264 will mess up the subtitles and the speed.
[09:47] <c_14> Schnabeltierchen: -c copy -c:v libx264 -map 0
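c_14's one-liner expands into a full command along these lines (file names and the CRF value are placeholders, not from the log):

```shell
# Keep every stream of the first input (-map 0), stream-copy all of
# them by default (-c copy), then override only the video codec.
ffmpeg -i input.mkv -map 0 -c copy -c:v libx264 -crf 20 output.mkv
```

Later, more specific options override earlier ones, which is why -c:v libx264 can follow -c copy.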
[10:04] <polysics> hello
[10:05] <polysics> I have been trying to rotate an image, then overlaying it on the video
[10:05] <polysics> but the area "left" by the rotated image stays black no matter what
[10:05] <polysics> the color= option seems to work, i.e. I can make it red
[10:06] <polysics> but c=none does not, nor does c=red@0
[10:06] <polysics> or similar
[10:17] <polysics> another, related, issue
[10:18] <polysics> I tried to fade in and out the overlay
[10:18] <polysics> but I had to create a loop using image2
[10:18] <polysics> the issue is that the processing never stops :D
[10:18] <polysics> if I stop the processing, the video is generated correctly
[13:00] <N3sh> hello :)
[13:00] <N3sh> I have a question concerning setting the volume of an audio overlay
[13:01] <N3sh> I am basically merging 2 audios and I would like to be able to change the volume of the second one (background music)
[13:03] <N3sh> here is the pastebin for it: http://pastebin.com/ayubta1t
[13:03] <N3sh> thanks in advance, I think it's an easy command but I really couldn't find anyone doing something like that, without video stream.
[13:08] <c_14> You sure you want amerge and not amix?
[13:10] <kunterbunt> I am having trouble with the rotation of FFmpeg-transcoded videos that are being uploaded from iPads. The original file displays right-side up regardless of what orientation the iPad was in while capturing the video. The transcoded videos and the screenshots, however, are not displayed right-side up. If the video is captured in portrait, then the encoded videos and thumbnail are rotated -90 degrees, while the landscape
[13:10] <kunterbunt> videos are not adjusted at all (meaning the upside-down videos stay upside-down). I am using a simple wrapper around the FFmpeg command line tool. Could this be an issue with the command I'm passing to FFmpeg? Do I need to do something special to preserve the rotation metadata? Here is a pastebin of the command I'm using, with variables: http://pastebin.com/GJawt9PS. I'm new to the world of transcoding so any tips would be
[13:10] <kunterbunt> appreciated.
[13:11] <Mavrik> hmm, what happened to the aconvert filter
[13:12] <kunterbunt> Mavrik: whats an aconvert filter?
[13:12] <Mavrik> converts audio sample format and channel layout
[13:13] <kunterbunt> should that affect rotation?
[13:14] <Mavrik> oh sorry
[13:14] <Mavrik> I wasn't answering your question, that was mine :P
[13:15] <Mavrik> kunterbunt, you're having issues with the MP4/MOV rotation flag
[13:15] <kunterbunt> no, that's my bad. The IRC world doesn't revolve around me
[13:16] <kunterbunt> Mavrik: that sounds promising. I looked through the ffmpeg list of commands though and I didn't see anything like that
[13:16] <c_14> kunterbunt: what is @raw_options ?
[13:16] <Mavrik> kunterbunt, yeah, because it wasn't really supported at all when I checked it the last time
[13:16] <Mavrik> kunterbunt, if you ffprobe it you'll see it as "rotation: <something>" flag
[13:17] <Mavrik> but ffmpeg (at least as of 2.1, didn't check newer) was not able to read it or act on it
[13:17] <Mavrik> and mobile devices don't actually rotate video when you record, they just add that flag to tell the player to rotate it
[13:17] <Mavrik> and if you transcode with ffmpeg it'll clobber it
[13:17] <kunterbunt> Mavrik: I'm a first-timer, how do I ffprobe?
[13:17] <c_14> It can read and set it, it just doesn't act on it (mainly because no one could decide what the default action should be).
[13:17] <c_14> ffmpeg usually copies the flag though.
[13:17] <c_14> kunterbunt: ffprobe file
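A sketch of how reading that flag might look with ffprobe's query options (the file name is a placeholder):

```shell
# Print only the rotate tag of the first video stream, if present.
ffprobe -v error -select_streams v:0 \
        -show_entries stream_tags=rotate \
        -of default=noprint_wrappers=1:nokey=1 input.mp4
```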
[13:18] <kunterbunt> c_14: k, one sec, I'll tell you what the @raw_options are
[13:19] <kunterbunt> Mavrik: so I'm not the first person to have trouble with this then? It's like a common gripe with ffmpeg?
[13:19] <Mavrik> it's just a corner case if you're encoding videos from mobile devices
[13:23] <kunterbunt> c_14: here are the @raw_options - they're being converted from this into ffmpeg flags: http://pastebin.com/nN4cgBqL
[13:25] <c_14> What does ':enlarge => true' do?
[13:26] <kunterbunt> so if ffmpeg usually copies the flag then there should be no difference in how the original file is displayed versus the transcoded files. Is that true?
[13:26] <kunterbunt> c_14: one sec
[13:26] <Mavrik> kunterbunt, it doesn't copy the flag
[13:27] <N3sh> C_14: hey there, sorry I just saw your reply. Mmm what is the difference?
[13:28] <N3sh> I just need to get these two tracks and put one over the other
[13:28] <N3sh> is amix better?
[13:29] <c_14> amix will mix the audio tracks together, amerge will put the audio in new channels
[13:29] <c_14> i.e. if you have 2 stereo files, amix will give you stereo output while amerge will give you 4.0 surround output
[13:29] <kunterbunt> c_14: actually, that was a mistake {:enlarge => true} is being used somewhere else not in the FFmpeg command
[13:30] <N3sh> I think I might have a mono file + stereo file
[13:30] <N3sh> will amix return a stereo file?
[13:31] <c_14> N3sh: should, might be better to duplicate the mono stream into a stereo stream and then mix though.
[13:31] <kunterbunt> Mavrik: is there a work-around for that?
[13:31] <N3sh> ok, c_14. How do I do the amix with the audio setting?
[13:32] <c_14> Mavrik: I just tested and ffmpeg copies the rotate metadata by default.
[13:35] <c_14> N3sh: you can probably just take the 2nd command you pasted and replace the amerge and the pan filters with a simple amix, might want/need to turn the mono stream into a stereo stream though
[13:36] <N3sh> let's see, thanks c_14, I'll let you know
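c_14's suggestion might look roughly like this as a full command (file names and the volume value are placeholders; aformat is used here to bring the mono input up to stereo so the channel layouts match):

```shell
# Mix a mono voice track with stereo background music into one
# stereo output, ending when the shorter input ends.
ffmpeg -i voice.wav -i music.mp3 -filter_complex \
  "[0:a]aformat=channel_layouts=stereo[v]; \
   [1:a]volume=0.3[m]; \
   [v][m]amix=inputs=2:duration=shortest" out.mp3
```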
[13:40] <kunterbunt> c_14: what ffmpeg command did you use? can you see something in my command that might be interfering with the default behavior?
[13:41] <c_14> I checked, but couldn't see anything. I just used `ffmpeg -i out.mkv out.mp4'
[13:41] <c_14> Can you pastebin the output of your command?
[13:48] <N3sh> c_14: thanks a lot! It worked.
[13:48] <N3sh> I got another question. Is it possible to mix mp3 and wav files?
[13:49] <N3sh> at the moment libmp3lame crashes if I use an mp3 instead of a wav
[13:50] <kunterbunt> c_14: k, I'm adding some ffprobes to my code to see what I can find.
[13:51] <c_14> N3sh: probably depends on the implementation. imo it should work though
[13:51] <N3sh> ok
[13:52] <N3sh> actually, it's not crashing for that reason... weird
[13:53] <N3sh> one last question: how do I stop the audio file when the first track ends?
[13:53] <N3sh> making the second one stop/fade out
[13:53] <c_14> amix=duration=shortest
[13:54] <N3sh> exactly that?
[13:54] <N3sh> ok
[13:54] <N3sh> I'll try, thanks :)
[13:57] <N3sh> c_14, I tried this: http://pastebin.com/hAFPZAKp
[13:57] <N3sh> it failed saying it couldn't find a proper output
[13:58] <c_14> -filter_complex "[0:a]volume=0.9[a1]; [1:a]volume=0.11250[a2]; [a1][a2]amix=duration=shortest,pan=stereo:c0<c0+c2:c1<c1+c3[out]"
[13:58] <c_14> And I'm pretty sure you don't need that pan
[14:00] <wodim> kek
[14:10] <N3sh> c_14: the pan=stereo?
[14:11] <c_14> yeah
[14:35] <N3sh> Thanks a lot C_14, you are amazing! :)
[14:35] <N3sh> It worked flawlessly and it's awesome.
[14:35] <N3sh> Does ffmpeg also handle fade out?
[14:36] <N3sh> since cutting the background can sound bad, how about fading it when the track ends?
[14:36] <c_14> Use the afade filter
[14:38] <N3sh> inside the filter_complex ""?
[14:40] <c_14> right in front of [out]
[14:41] <N3sh> like this? -> c3, afade=t=in:ss=0:d=15 [out]"
[14:42] <c_14> What does your entire filter_complex look like right now?
[14:44] <N3sh> http://pastebin.com/Ue0sKw8y
[14:45] <c_14> yeah, then that would be fine. that would fade in though, not out
[14:51] <kunterbunt> c_14: I'm definitely losing the rotate flag during encoding. Here's my input file: http://pastebin.com/ZkT1GwJB and here's my output file: http://pastebin.com/3vBMJam5
[14:52] <N3sh> oh true
[14:52] <c_14> Right, that's probably because you're using ffmpeg version 1.0.9
[14:53] <N3sh> so it's c3, afade=t=out:ss=0:d=15 [out]"
[14:53] <c_14> N3sh: you need to adjust ss and d
[14:53] <c_14> kunterbunt: try adding -map_metadata 0
[14:53] <N3sh> do I keep the 'in'?
[14:53] <c_14> nah, keep the out
[14:53] <kunterbunt> c_14: can I just stick that on the end?
[14:54] <N3sh> alright, so I need to get the length of the file and then have it like ss={length-5}:d=5
[14:54] <c_14> kunterbunt: yep
[14:54] <c_14> N3sh: yep
[14:55] <N3sh> is there an easy way to find that?
[14:55] <c_14> not really
[14:55] <c_14> you'll have to do it programmatically with ffprobe
[14:55] <N3sh> :(
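One way to script this, assuming a POSIX shell and awk (the duration shown is an example value, and the 5-second fade length is a placeholder):

```shell
# ffprobe prints the container duration in plain seconds:
#   ffprobe -v error -show_entries format=duration -of csv=p=0 input.mp3
dur=180.5   # example value as returned by ffprobe (placeholder)
st=$(awk -v d="$dur" 'BEGIN{printf "%g", d - 5}')
echo "afade=t=out:st=${st}:d=5"   # -> afade=t=out:st=175.5:d=5
```

The printed filter string can then be spliced into the -af or -filter_complex chain.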
[14:56] <kunterbunt> c_14: Also, I'm a freelancer working on someone else's project which is still in development. They're launching in November. It seems like a bad idea to start with a setup that's already out-of-date. Would you recommend updating to the newer version of ffmpeg? How much of a difference is there?
[14:56] <c_14> limitation in the filter api
[14:56] <c_14> kunterbunt: I'd recommend updating to a newer version of ffmpeg. There's been a lot of additions and fixes.
[14:57] <kunterbunt> c_14: ok, good to know
[15:05] <kunterbunt> c_14: no luck with adding -map_metadata 0. What does the 0 do in that flag?
[15:08] <N3sh> C_14, I don't have ffprobe :/
[15:08] <N3sh> ops
[15:08] <N3sh> I do have it :P
[15:08] <kunterbunt> c_14: in the documentation I found the -metadata[:metadata_specifier] flag. I'm stashing the rotate variable from before encoding, so maybe I can just throw it back in here.
[15:17] <c_14> ye
[15:17] <c_14> not sure why the map isn't working though
[15:17] <c_14> Maybe because it's only copying the global by default.
[15:18] <c_14> you could also try -map_metadata:s:v 0:v
[15:18] <c_14> That should copy the video stream metadata
[15:18] <c_14> afaik
[15:27] <kunterbunt> c_14: ok, I went with overriding the metadata and it seems to have worked, but my videos still aren't displaying properly. The output metadata now has the correct number stored in the rotate flag, but it looks slightly different, like it's in all caps. Not sure if that kind of thing matters to video players. Do you mind taking a look? input: http://pastebin.com/Yiiraqtn output: http://pastebin.com/AFPFDy9G
[15:32] <c_14> The caps is probably just the format requiring metadata fields to be in caps. The problem might be that the player doesn't check ogv for rotate metadata.
[15:32] <c_14> What player are you checking with?
[15:37] <kunterbunt> c_14: html5
[15:37] <c_14> Does ffplay play it correctly?
[15:39] <N3sh> C_14, I am getting this: Argument '|' provided as input filename, but 'test\final_test\output_audio_5.mp3' was already specified.
[15:40] <kunterbunt> c_14: I'm not sure, I'd need to install it on my local machine to find out.
[15:40] <N3sh> any idea? I am trying to run this: http://pastebin.com/ZYs8UjR8
[15:40] <c_14> ffprobe doesn't need -i
[15:41] <c_14> also, wait
[15:41] <c_14> Does windows even have pipes...
[15:41] <N3sh> ohhhh
[15:41] <c_14> I think windows only has named pipes?
[15:41] <N3sh> correct :/
[15:41] <N3sh> it worked on my prompt because it is mingw
[15:41] <N3sh> daymn
[15:41] <c_14> kunterbunt: you might just want to physically rotate the video and delete the metadata
[15:42] <kunterbunt> c_14: -map_metadata:s:v 0:v > invalid metadata type
[15:43] <kunterbunt> c_14: yeah, I suggested that to the guy paying me and he didn't seem keen on it because it will be displayed on multiple platforms
[15:43] <c_14> If the video is rotated physically, it'll look the same on every platform.
[15:44] <c_14> Eh, when I say physically I don't mean the screen, I mean use the rotate or transpose filter.
[15:44] <kunterbunt> c_14: what do you mean by rotated physically?
[15:44] <kunterbunt> c_14: are those more ffmpeg options?
[15:45] <c_14> filters
[15:45] <c_14> you'll have to do that programmatically though.
[15:45] <c_14> check the rotate metadata, rotate the video
[15:46] <kunterbunt> c_14: doing it programmatically is fine - I'm good with the code, I've just never worked with video before
[15:47] <N3sh> C_14: I checked the fade-out. It fades out everything, not only my second track
[15:47] <N3sh> can I specify that?
[15:47] <c_14> Just ffprobe the file, grab the rotation (either grep or use something like ffprobe's JSON output) and then either convert that to radians and feed it to the rotate filter or use a switch-case and use transpose
[15:47] <c_14> N3sh: move the fade filter right after the corresponding volume filter
[15:48] <kunterbunt> c_14: yeah, I already have the rotation saved, I just need to figure out how the rotate filter works
[15:48] <c_14> It takes radians not degrees.
[15:49] <c_14> Other than that, you feed it an angle and it rotates the video.
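A sketch of the degrees-to-radians step c_14 describes (the file names are placeholders; clearing the tag afterwards with -metadata:s:v rotate=0 avoids double rotation in players that honor the flag):

```shell
deg=90   # value previously read from the rotate metadata tag
# awk has no pi constant; atan2(0, -1) evaluates to pi.
rad=$(awk -v d="$deg" 'BEGIN{printf "%.6f", d * atan2(0, -1) / 180}')
echo "rotate=${rad}"   # -> rotate=1.570796, for use with -vf
# e.g.: ffmpeg -i input.mp4 -vf "rotate=${rad}" -metadata:s:v rotate=0 output.mp4
```

For exact 90-degree steps, transpose=1 (clockwise) or transpose=2 (counter-clockwise) also swaps the frame dimensions, which rotate does not.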
[15:54] <N3sh> c_14: http://pastebin.com/U4uq35yP  this works, but the fade starts after a few seconds and not when I specified it (at 15 sec)
[15:55] <N3sh> (at 26s)
[15:56] <c_14> Ok, 2 things
[15:56] <N3sh> check this
[15:56] <N3sh> http://pastebin.com/WyNTqxnA
[15:56] <N3sh> that was the wrong version (forgot a comma)
[15:56] <c_14> Ok, in that case 1 thing
[15:56] <c_14> Do you really need the volume filters if you aren't changing the volume?
[15:56] <N3sh> that is a test, in my application I will change it
[15:56] <c_14> kk
[15:57] <N3sh> I have a very loud background and I want to test the fade-out
[15:57] <N3sh> thanks for pointing that out :)
[15:57] <c_14> Hmm, can't see anything wrong with the command.
[15:57] <c_14> Have you tried it without the pan?
[16:00] <N3sh> nope
[16:01] <N3sh> same :(
[16:04] <c_14> eh
[16:04] <c_14> switch -ss to -st
[16:04] <c_14> You're not counting in seconds, you're counting in samples.
[16:05] <c_14> eh
[16:05] <c_14> ss to st
[16:07] <N3sh> ooooh
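Putting c_14's correction together, the filtergraph would become something like the following (file names, volumes, and times are placeholders drawn from the discussion, not verified against the actual pastebins):

```shell
# st= is afade's start time in seconds; ss= counts samples instead,
# which is why the fade appeared to start at the wrong moment.
ffmpeg -i track.mp3 -i music.mp3 -filter_complex \
  "[0:a]volume=0.9[a1]; \
   [1:a]volume=0.1125,afade=t=out:st=26:d=5[a2]; \
   [a1][a2]amix=duration=shortest" out.mp3
```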
[16:09] <EXetoC> how do I output to http? my web server does receive a POST request, after which ffmpeg outputs "av_interleaved_write_frame(): Broken pipe" and then exits. sending to stdout instead does work
[16:09] <EXetoC> though maybe it's better to use rtmp
[16:09] <N3sh> Thanks a lot c_14
[16:10] <N3sh> you are like a star in the dark and obscure world of ffmpeg
[16:10] <N3sh> thanks again! :) :)
[16:10] <c_14> np
[17:15] <nasojlsu> can anyone off the top of their head give a good analyze command for an MPEG-2 PS file
[17:16] <Fjorgynn> nope
[17:16] <Fjorgynn> :D
[17:17] <nasojlsu> quick and honest. good deal. google it is then
[17:22] <Mavrik> analyze command? in what way?
[17:22] <nasojlsu> i have a video file that wants to fail on transcoding
[17:22] <nasojlsu> i was wondering why
[17:23] <nasojlsu> ffmpeg -i inputfile someoutputfile
[17:23] <c_14> _wants_ to?
[17:23] <nasojlsu> returned "vbv buffer size not set"
[17:24] <c_14> Did you set -bufsize ?
[17:24] <kepstin-laptop> nasojlsu: if you could pastebin the actual command you're running and the complete output of ffmpeg, that would be helpful...
[17:24] <nasojlsu> the transcoder is a commercial product that only says "failed"
[17:24] <c_14> Aren't you using ffmpeg?
[17:24] <Mavrik> ffmpeg is a transcoder, not a file analyzer really :/
[17:25] <Mavrik> check out Elecard StreamEye or somethin
[17:25] <nasojlsu> I use ffmpeg a lot for other things
[17:27] <nasojlsu> I was just hoping I could use some super command for analyzing...
[17:28] <kepstin-laptop> if all you want to do is see if ffmpeg can decode the file without errors, something like "ffmpeg -i inputfile -f null -" would do that.
[17:29] <nasojlsu> ah thanks.
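A slightly stricter variant of kepstin-laptop's check keeps only error messages, which makes a clean run obvious at a glance:

```shell
# Decode everything, write nothing; stderr then holds only real errors.
ffmpeg -v error -i inputfile -f null - 2> decode_errors.log
```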
[17:29] <kepstin-laptop> other than that, you're gonna have to contact the support for your unnamed commercial product and complain about their useless error messages.
[17:30] <nasojlsu> it now tells me "encoder did not produce proper pts, making some up."
[17:30] <nasojlsu> rhozet carbon coder is the name
[17:31] <nasojlsu> or I think it is now Harmonic Carbon Coder
[17:35] <nasojlsu> so what is the relationship between vbv and pts
[17:38] <Mavrik> practically none.
[17:39] <Mavrik> PTS are timestamps that tell player when to display frame or play audio
[17:39] <Mavrik> and VBV are parameters you give to x264 encoder to tell it to keep to certain bitrate
[17:47] <ChocolateArmpits> Is there any difference between "prores" and "prores_aw" encoders ?
[17:48] <anshul_mahe> is there any document to add your own codec in ffmpeg source code
[17:48] <c_14> https://ffmpeg.org/developer.html#New-codecs-or-formats-checklist
[17:51] <kunterbunt> c_14: the filters worked! Thank you very much for your help!
[17:51] <anshul_mahe> c_14: thanks
[18:02] <anshul_mahe> what is FourCC? How would I know whether my codec has one or not?
[18:05] <Mavrik> uh
[18:05] <Mavrik> http://www.fourcc.org/codecs.php
[18:05] <Mavrik> first hit on google.
[19:06] <lenarhoyt> why is it impossible to navigate by frame through some videos?
[19:06] <lenarhoyt> in VLC for example
[19:12] <illuminated> is there a way to limit the fps of a transcode to the frame rate the transcode is outputting
[19:14] <illuminated> in other words instead of it transcoding at 100 fps have it transcode @ like 23.97 fps
[19:14] <kepstin-laptop> illuminated: yeah, the -re option should do that.
[19:15] <kepstin-laptop> (it's an input option, causes the input file to be read at approximately "realtime" speed)
[19:16] <EXetoC> ffmpeg is kinda neat. can't it fry me some bacon though?
[19:16] <illuminated> i wish this opencaster from rtmpworld allowed me to customize the parameters it uses tho.
[19:22] <DelphiWorld> yo
[19:22] <DelphiWorld> is it possible to insert an image in an m4a file?
[19:23] <DelphiWorld> ffmpeg -re -i audio.m4a -i artwork.png -acodec copy out.m4a
[19:23] <DelphiWorld> it stays at frame=0 without progress
[19:23] <ChocolateArmpits> Then it wouldn't be m4a after all no ?
[19:24] <DelphiWorld> ChocolateArmpits: why? m4a supports artwork
[19:24] <ChocolateArmpits> So is there something you're not finding ?
[19:24] <DelphiWorld> could be
[19:24] <DelphiWorld> lol!
[19:24] <DelphiWorld> file is out but codec is h264... :)
[19:25] <kepstin-laptop> hmm, I dunno if ffmpeg supports embedding artwork properly in m4a files, you might be better off using dedicated audio file tagging tools
[19:25] <ChocolateArmpits> Because the image wasn't attached as part of metadata
[19:25] <ChocolateArmpits> At least that's what I would think
[19:25] <DelphiWorld> ChocolateArmpits: not sure
[19:25] <ChocolateArmpits> Does the image-turned-video have a bitrate?
[19:25] <DelphiWorld> i have no clue
[19:25] <DelphiWorld> ChocolateArmpits: I didn't understand your question
[19:26] <ChocolateArmpits> Well, the m4a now contains a video track and an audio track
[19:26] <ChocolateArmpits> Right?
[19:26] <DelphiWorld> right
[19:26] <kepstin-laptop> well, the only difference between m4a and m4v/mp4 is the file extension
[19:26] <DelphiWorld> kepstin-laptop: that's what I would think :P
[19:28] <DelphiWorld> ChocolateArmpits: wanna try playing it on your side to see if you get anything?
[19:28] <DelphiWorld> ChocolateArmpits: lol, I'm doing it blindly
[19:28] <DelphiWorld> I even use my PC without a screen :P
[19:28] <ChocolateArmpits> I would say that's not safe
[19:28] <DelphiWorld> ChocolateArmpits: why not safe?
[19:29] <DelphiWorld> ChocolateArmpits: i can't see the img cause...
[19:29] Action: DelphiWorld is blind
[19:29] <ChocolateArmpits> Literally?
[19:29] <DelphiWorld> ChocolateArmpits: no eyes at all
[19:30] <ChocolateArmpits> This is an interesting case scenario
[19:30] <DelphiWorld> ChocolateArmpits: all my life is TTS-based
[19:36] <DelphiWorld> ChocolateArmpits: lol, my cousin confirmed it works
[19:37] <ChocolateArmpits> It may be that it plays back as a video rather than as embedded artwork metadata
[19:37] <DelphiWorld> ChocolateArmpits: yeah, i dont know what's the diference
[19:38] <ChocolateArmpits> Well, artwork as metadata embeds the image within the file without changing it, while artwork as a video converts the image to a compatible format (h264 in this case) and adds as many frames as the audio lasts
[19:39] <ChocolateArmpits> Based on what I understand
[19:39] <DelphiWorld> ChocolateArmpits: ...
[19:39] <DelphiWorld> ChocolateArmpits: you explained it
[19:39] <DelphiWorld> so i must do it as an artwork
[19:39] <ChocolateArmpits> The first method of course is more space-conserving
[19:41] <ChocolateArmpits> Artwork as video, of course, can be shared more easily, as the recipient will see the artwork right away; it can be posted on YouTube and so on. If you only intend to share music without posting to YouTube or video websites, artwork as metadata is the better option
[19:41] <DelphiWorld> ChocolateArmpits: i plan to shoutcast it
[20:05] <anshul_mahe> how do I compile ffmpeg in verbose mode? It does not show what commands it is using
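No one answered this in the log; for what it's worth, FFmpeg's build system hides the full command lines by default, and passing V=1 to make shows them:

```shell
make V=1   # print each complete compiler/linker command line
```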
[20:07] <Chocola2> DelphiWorld, you can use MP4box to embed an image as cover art
[20:07] <Chocola2> my nick got messed up
[20:07] <ChocolateArmpits> finally
[20:08] <DelphiWorld> HEHEHE ChocolateArmpits
[20:08] <ChocolateArmpits> DelphiWorld, Do you need help with that?
[20:08] <ChocolateArmpits> I've tried it myself and it works just fine
[20:08] <DelphiWorld> ChocolateArmpits: give me just the git repo
[20:11] <ChocolateArmpits> I don't exactly know how this works, but I don't think the authors behind it have one; they have an svn
[20:12] <ChocolateArmpits> Also, the image embedded with mp4box is shown in all media players that I've tried
[20:13] <ChocolateArmpits> Well maybe except for MPC-HC
[20:18] <DelphiWorld> ChocolateArmpits: give svn lol
[20:18] <ChocolateArmpits> svn co svn://svn.code.sf.net/p/gpac/code/trunk/gpac gpac
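The cover-art embedding ChocolateArmpits describes would, if I recall MP4Box's tag syntax correctly, look something like this (file names are placeholders):

```shell
# Attach the image as iTunes-style cover metadata, in place,
# without touching the audio track.
MP4Box -itags cover=artwork.png audio.m4a
```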
[20:20] <voip_> hello guys, please check the log: http://pastebin.com/zCrViwNZ
[20:20] <voip_> ffmpeg frequently stops
[20:22] <J_Darnley> To people familiar with ffmpeg's expression evaluator, how can I do this "or(lt(t,5),gt(t,214))" when there is no or function?
[20:22] <BtbN> You mean, why does this work, or how do you get it to work?
[20:23] <BtbN> Because i have no idea what you'd expect that function to do.
[20:23] <BtbN> randomly choose one?
[20:23] <tliu> Hi there, does anyone know if writing the data field in AVPackets received from successive calls to avcodec_encode_video2 to a file is enough to produce a valid MPEG-1 file? The examples also write a few extra bytes at the end of the file, but are there any headers or anything that need to be written, or will those be included in the first packet?
[20:24] <BtbN> tliu, avcodec gives you raw bitstream, no containers. There's libavformat for containers.
[20:24] <J_Darnley> BtbN: lt returns 1 when the condition is true...
[20:25] <J_Darnley> how can I or two 1s together?
[20:25] <J_Darnley> like with || in C or just about any other language
[20:25] <BtbN> add them?
[20:25] <tliu> BtbN: does ffmpeg do anything special when you set the output file to be an http address?  or does it just dump the raw bitstream into the address?
[20:25] <tliu> i'm trying to send my bitstream through a websocket, for reference
[20:26] <BtbN> i don't think you can set a http url as target.
[20:26] <tliu> ffmpeg -s 640x480 -f video4linux2 -i /dev/video0 -f mpeg1video \
[20:26] <tliu> -b 800k -r 30 http://example.com:8082/yourpassword/640/480/
[20:26] <tliu> this works :)
[20:26] <tliu> i'll go dig into ffmpeg.c i guess
[20:27] <J_Darnley> thanks, that seems to work
[20:27] <BtbN> Well, and what does it do? POST the entire stream?
[20:27] <tliu> the server just dumps whatever it gets into a websocket
[20:28] <BtbN> into a what?
[20:28] <DelphiWorld> ChocolateArmpits: i'm checking it out
[20:29] <DelphiWorld> oh ChocolateArmpits, is it a GUI or a shell command line tool?
[20:31] <voip_> guys, can you check the log: http://pastebin.com/zCrViwNZ pls, ffmpeg frequently stops
[20:31] <tliu> a websocket, the server just sends data as it gets it from ffmpeg into websocket frames passed to a client that's connected to it
[20:32] <tliu> i guess i'll just try to get my app to write to a file first
[20:32] <BtbN> So they re-invented tcp on top of http, what the hell
[20:32] <BtbN> or more like udp on top of http on top of tcp.
[20:32] <tliu> haha yeah it's kinda gross and disgusting
[20:32] <DelphiWorld> lol
[20:33] <tliu> this is a pretty good read on it http://lucumr.pocoo.org/2012/9/24/websockets-101/
[20:33] <voip_> tliu, BtbN: did you guys write regarding my problem?
[20:33] <tliu> and this is an example of what i'm trying to do, http://phoboslab.org/log/2013/09/html5-live-video-streaming-via-websockets
[20:33] <BtbN> tliu, you want nginx-rtmp.
[20:34] <tliu> I'm looking for like 100ms delay between server and client
[20:34] <BtbN> That's not possible with hls.
[20:34] <BtbN> you have segment size * 3 minimum delay.
[20:34] <tliu> that's what i thought.
[20:34] <tliu> :(
[20:35] <BtbN> so with 5 second segments you have 15 seconds delay due to buffering
[20:36] <BtbN> If you want zero-latency, go for rtmp.
[20:36] <BtbN> nginx-rtmp can do that, too. (As the name might suggest)
[20:38] <tliu> reading on rtmp a bit, sec
[20:39] <DelphiWorld> rtmpmpmpmpmp:P
[20:39] <tliu> right now we're sending mjpeg basically
[20:39] <tliu> bunch of jpegs into an img tag
[20:39] <tliu> it's gross
[20:39] <tliu> and the filesize is huge
[21:00] <ChocolateArmpits> DelphiWorld: Mp4box as part of GPAC is a command line application
[21:00] <DelphiWorld> but do I need to compile all of gpac?
[21:00] <DelphiWorld> or only mp4box
[21:01] <ChocolateArmpits> only mp4box
[21:02] <DelphiWorld> cool
[21:02] <DelphiWorld> ChocolateArmpits: poof... pretty huge project!
[21:20] <ChocolateArmpits> DelphiWorld: How is it going there ?
[21:21] <DelphiWorld> ChocolateArmpits: building going on
[21:33] <DelphiWorld> ChocolateArmpits: ok built
[21:43] <ChocolateArmpits> DelphiWorld: Do you need a command line example ?
[21:46] <Cosworth> hello
[21:46] <Cosworth> is there anyone alive?
[22:13] <DelphiWorld> ChocolateArmpits: they even support iOS
[22:28] <polysics> hello!
[22:29] <polysics> when I rotate an image to overlay, it looks like the area left empty can't be made transparent, is that correct?
[23:26] <Zeranoe> Does FFmpeg support decoding XESC?
[23:48] <voip_> Hello guys, please help: http://pastebin.com/zPYUXqt1
[23:59] <voip_> any help ?
[00:00] --- Fri Oct 10 2014

