[FFmpeg-user] AVStream time_base not getting set
Zach Jacobs
zachjacobs at gmail.com
Fri Jun 28 15:28:37 CEST 2013
Hello,
I am attempting to open an AVFormatContext for writing to a file, but I do not
think my AVStream time_base is getting set correctly when calling
avformat_write_header.
Here is what I am doing:
// Allocate the output format context for the target path.
AVFormatContext* oF = (AVFormatContext*)outputFile.ToPointer();
code = avformat_alloc_output_context2(&oF, null, null, outputPath);
outputFile = new IntPtr(oF);
if (code < 0)
    throw new Exception("EncodingError: Could not open path " + outputPath);

// Open the output file for writing.
code = avio_open(&oF->pb, outputPath, AVIO_FLAG_WRITE);
if (code < 0)
{
    string avError = EncoderUtilities.GetFFmpegErrorString(code);
    throw new Exception("EncodingError: Couldn't connect to stream " +
        outputPath + ": " + avError);
}

// Find and configure the H.264 encoder.
AVCodec* vidCod = avcodec_find_encoder(AV_CODEC_ID_H264);
if (vidCod == null)
    throw new Exception("FFmpeg Encoder Error: Codec Not Found");

AVCodecContext* vidCodCon = avcodec_alloc_context3(vidCod);
vidCodCon->bit_rate = encoderInfo.Settings.OutputVideoBitrate / 8;
vidCodCon->bit_rate_tolerance = encoderInfo.Settings.OutputVideoBitrate / 8;
vidCodCon->width = encoderInfo.Settings.OutputVideoWidth;
vidCodCon->height = encoderInfo.Settings.OutputVideoHeight;
vidCodCon->gop_size = encoderInfo.Settings.KeyFrameInterval;
vidCodCon->max_b_frames = 0;
vidCodCon->pix_fmt = AV_PIX_FMT_YUV420P;
vidCodCon->flags |= CODEC_FLAG_GLOBAL_HEADER;
vidCodCon->time_base.num = 1;
vidCodCon->time_base.den = TIMEBASE_DENOMINATOR; // 30000
vidCodCon->codec_type = AVMEDIA_TYPE_VIDEO;
vidCodCon->codec_id = AV_CODEC_ID_H264;

// libx264 private options.
av_opt_set(vidCodCon->priv_data, "preset", "superfast", 0);
av_opt_set(vidCodCon->priv_data, "profile", "main", 0x0001); // AV_OPT_SEARCH_CHILDREN
av_opt_set(vidCodCon->priv_data, "crf", "22", 0);

// Rate-control settings.
vidCodCon->rc_min_rate = bitrate / 8;
vidCodCon->rc_max_rate = bitrate / 8;
vidCodCon->rc_buffer_size = bitrate / 8;

if (avcodec_open2(vidCodCon, vidCod, null) < 0)
    throw new Exception("FFmpeg Encoder Error: could not open video codec");

// Create the output stream and copy the opened codec context into it,
// including the encoder's extradata (SPS/PPS from the global header).
AVStream* vidStr = avformat_new_stream(oF, vidCod);
if (vidStr == null)
    throw new Exception("Encoder Error: could not allocate video stream");

vidStr->codec = (AVCodecContext*)av_malloc((uint)sizeof(AVCodecContext));
EncoderUtilities.MemoryCopy(vidStr->codec, vidCodCon, (uint)sizeof(AVCodecContext));
vidStr->codec->extradata = (byte*)av_malloc((uint)vidStr->codec->extradata_size);
EncoderUtilities.MemoryCopy(vidStr->codec->extradata, vidCodCon->extradata,
    (uint)vidStr->codec->extradata_size);

code = avformat_write_header(oF, null);
I can verify that vidStr->codec has the correct time_base, and no functions
return <0 error codes. However, after I call avformat_write_header(),
vidStr->time_base does not get updated, even though the API documentation
indicates it should be set there:
/**
 * This is the fundamental unit of time (in seconds) in terms
 * of which frame timestamps are represented.
 *
 * decoding: set by libavformat
 * encoding: set by libavformat in avformat_write_header. The muxer may use the
 * user-provided value of @ref AVCodecContext.time_base "codec->time_base"
 * as a hint.
 */
AVRational time_base;
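For comparison, here is a minimal sketch of what I assumed I could also do:
pre-set the stream time_base myself as a hint before writing the header, then
read back whatever the muxer actually chose. This is an assumption on my part,
not code from the snippet above:

// Assumption: the stream time_base can be pre-set as a hint; the muxer is
// still free to replace it inside avformat_write_header().
vidStr->time_base.num = 1;
vidStr->time_base.den = TIMEBASE_DENOMINATOR; // 30000, same as vidCodCon->time_base
code = avformat_write_header(oF, null);
// After this call vidStr->time_base holds the time base the muxer actually
// chose (e.g. 1/90000 for MPEG-TS), which may differ from the hint.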
When I run my program, libx264 reports a "non-strictly-monotonic PTS" error
while encoding.
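My guess is that the timestamps I hand to the encoder/muxer end up in the
wrong time base. For reference, a minimal sketch of the timestamp handling I
believe is expected; frameIndex, ptsStep, and frame are placeholder names and
this is not my exact frame loop:

// Sketch (assumption): timestamp frames in vidCodCon->time_base units,
// then rescale each packet to the stream time_base before muxing.
AVPacket pkt = new AVPacket();
av_init_packet(&pkt);
pkt.data = null;
pkt.size = 0;
int gotPacket = 0;
frame->pts = frameIndex * ptsStep; // ptsStep: frame duration in 1/30000 units, e.g. 1001 for 29.97 fps
frameIndex++;
code = avcodec_encode_video2(vidCodCon, &pkt, frame, &gotPacket);
if (code >= 0 && gotPacket != 0)
{
    // Rescale from the codec time_base (1/30000) to whatever time_base the
    // muxer chose for the stream in avformat_write_header().
    pkt.pts = av_rescale_q(pkt.pts, vidCodCon->time_base, vidStr->time_base);
    pkt.dts = av_rescale_q(pkt.dts, vidCodCon->time_base, vidStr->time_base);
    pkt.duration = (int)av_rescale_q(pkt.duration, vidCodCon->time_base, vidStr->time_base);
    pkt.stream_index = vidStr->index;
    code = av_interleaved_write_frame(oF, &pkt);
}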
Any ideas on what could be going wrong? Am I setting everything up
correctly? Thank you!
Zach