[FFmpeg-user] Large file sizes on recording.

Leonard Bogard leonard at kcfchurch.org
Thu Feb 27 22:12:49 CET 2014


On Thu, Feb 27, 2014 at 2:52 AM, Bouke (VideoToolShed)
<bouke at videotoolshed.com> wrote:
> ----- Original Message ----- From: "Leonard Bogard" <leonard at kcfchurch.org>
> To: "FFmpeg user questions" <ffmpeg-user at ffmpeg.org>
> Sent: Wednesday, February 26, 2014 6:10 PM
> Subject: [FFmpeg-user] Large file sizes on recording.
>
>
>> I'm using a bash script on a Linux (Ubuntu) machine to record an HLS
>> live stream to a remote Wowza server for live services.  The services
>> usually run for about 1.5 hours.  I'm using a Decklink SDI video input
>> card.  I'm splitting the output from bmdcapture (using "tee") into two
>> separate ffmpeg instances, the first for HLS and the second for recording
>> to an mp4 file.  For a while I've been recording at 720x400 at 30 fps
>> using libx264 with "-vcodec libx264 -preset medium -vb 1000k -vprofile
>> baseline -level 3.1 -pix_fmt yuv420p", which generally gives me a file
>> size of around 800 MB on average.
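For context, the split is roughly this shape; the FIFO paths and the
bmdcapture card/mode numbers below are placeholders, not the exact script:

mkfifo /tmp/hls.fifo /tmp/rec.fifo
# bmdcapture writes one nut stream to stdout; tee duplicates it into both FIFOs
bmdcapture -C 0 -m 9 -F nut -f pipe:1 | tee /tmp/hls.fifo > /tmp/rec.fifo &
# one ffmpeg per FIFO: one segments for HLS, the other records the mp4 archive
ffmpeg -i /tmp/hls.fifo -c:v libx264 -f hls /var/www/live/stream.m3u8 &
ffmpeg -i /tmp/rec.fifo -c:v libx264 -preset medium service.mp4 &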
>>
>> Of course, this is not HD, and we have all this expensive HD equipment
>> that gives us HD, but I'm not recording in HD with these parameters.
>>
>> Naturally, my boss wants HD and I want to give it to him.  The problem
>> is, with the new parameters that keep the CPU utilization to about
>> 90-110% (as low as I could get it on the 4-core processor last night
>> while playing around with the parameters), the file size is going to be
>> somewhere around 1.7 terabytes.  Here's the recording part of the bash
>> script:
>
>
> This is very, very large! Sounds like uncompressed, not H.264...
> Something is definitely wrong; are you sure you don't mean 1.7 gigabytes?
> (That would make more sense...)
>

Ack, I was reading the output wrong; it is in the gigabytes, not
terabytes.  I guess that's what I get for working on it at night while
sleepy.
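For what it's worth, the back-of-the-envelope number now makes sense: 1.7 GB
over a 1.5-hour (5400-second) service is about 1.7 * 8000 / 5400, or roughly
2.5 Mbit/s average, which sounds plausible for CRF 22 at 1280x720.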

>> /home/suser/bin/ffmpeg -nostats -itsoffset -0.2 -i - \
>>  \
>>  -copytb -1 \
>>  -async -1 $OUTFILEAUDIO \
>>  -vcodec libx264 -preset veryfast -crf 22 -vprofile baseline -level 4.1 -pix_fmt yuv420p \
>>  -flags +ilme+ildct \
>>  -y $OUTFILE \
>>  < "$fifoFILEname" 2>> liveFILE.log & ffmpegFILEpid=$!
>>
>> The current output from this line is 1280x720 @59.95 fps.
>
>
> Are you sure you want interlaced video? (I ask since you don't de-interlace
> and have a funny FPS.)
>

Typo: 59.94, not 59.95.  I put "-flags +ilme+ildct" in there because I
was seeing some horizontal lines during movement when I played the video
back, and I read somewhere that the horizontal lines come from issues
certain devices/monitors have with interlaced video.  The lines came back
this morning, so I removed the flags since they were no longer helping,
and then the horizontal lines went away.  Baffling.
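If those lines really are interlacing combing, I suppose the cleaner fix is
to check whether the capture is actually interlaced and, if it is,
de-interlace it before encoding instead of flagging the encode as interlaced.
A rough, untested check (filename made up):

ffmpeg -i sample.mp4 -vf idet -frames:v 500 -an -f null - 2>&1 | grep -i parsed
# if idet reports mostly interlaced frames, a yadif de-interlace before the
# libx264 encode (e.g. -vf yadif) would be the usual alternative to
# -flags +ilme+ildct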


>> I'm thinking that I'm probably not getting it right since I've seen
>> cameras
>> able to record HD to a much smaller file on the fly from a live source.
>>
>> If not, is there some post-processing I can do on the file to make it much
>> smaller?  And if there is, should I record to a different format first?
>
>
> Generally, if you want the best quality / data rate, you need to set your
> preset as slow as possible.
> (In real life, 'slow' is good enough for rock 'n roll in most cases.)
> If you can live with a low-res file for live, and need to publish the
> archive later, you could capture the source to MPEG-2 next to the H.264.
> That has extremely low CPU cost, with a trade-off of larger file sizes, but
> you can re-compress it later.
> (Or, use a 'pro' codec as an in-between.)
>
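If I follow the MPEG-2 idea, the capture leg might look something like this;
the output name and quality value are just guesses on my part:

ffmpeg -i - -c:v mpeg2video -q:v 2 -c:a copy -y archive.mkv < "$fifoFILEname"
# mpeg2video at a fixed quantizer is cheap on the CPU; the .mkv is only an
# intermediate and would get re-compressed to H.264 after the service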

Every time I use the slow preset, it hogs ~250% (and sometimes ~320%,
depending on the video feed) CPU time, which chokes out the HLS instance
and causes the live stream to chop.  I figured it would be safe to use
"veryfast" for the recording and then apply further compression when the
recording is complete.
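The post-service pass I have in mind would be roughly this (the output name
is just an example):

ffmpeg -i "$OUTFILE" -c:v libx264 -preset slow -crf 22 -pix_fmt yuv420p \
 -c:a copy archive_small.mp4
# with no live stream to feed, the slow preset can take all the CPU it wants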

> hth,
> Bouke
>
>>
>> Thanks in advance for any advice.

