[Libav-user] MUX Synchronize dts/pts/basetime does it mandatory?

kckang kckang at skycom.ne.kr
Tue Jan 23 09:19:51 EET 2018


I add the second stream, but I can't make it synchronize. The code looks like this:


//ADD Stream2/////////////////////////////////////////
    for (i = 0; i < ifmt_ctx2->nb_streams; i++) {
        AVStream *out_stream2;
        AVStream *in_stream = ifmt_ctx2->streams[i];
        AVCodecParameters *in_codecpar = in_stream->codecpar;
        cout << __LINE__ << ":______________" << in_filename << endl;

        // Skip anything that is not audio, video or subtitle.
        if (in_codecpar->codec_type != AVMEDIA_TYPE_AUDIO &&
            in_codecpar->codec_type != AVMEDIA_TYPE_VIDEO &&
            in_codecpar->codec_type != AVMEDIA_TYPE_SUBTITLE) {
            stream_mapping2[i] = -1;   // was stream_mapping[i]; this mapping belongs to input 2
            continue;
        }

        stream_mapping2[i] = stream_index++;

        // Create the matching output stream and copy the codec parameters.
        out_stream2 = avformat_new_stream(ofmt_ctx, NULL);
        if (!out_stream2) {
            fprintf(stderr, "Failed allocating output stream\n");
            ret = AVERROR_UNKNOWN;
            goto end;
        }

        ret = avcodec_parameters_copy(out_stream2->codecpar, in_codecpar);
        if (ret < 0) {
            fprintf(stderr, "Failed to copy codec parameters\n");
            goto end;
        }
        out_stream2->codecpar->codec_tag = 0;
    }
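
Maybe the problem is that the packet timestamps are still in the input streams' time_base when I write them, so for now I plan to try rescaling every packet into the output stream's time_base before writing, roughly like this sketch (stream copy assumed; the helper name write_copied_packet is just for illustration, not from my real code):

#include <libavformat/avformat.h>

/* Rescale one packet from the input stream's time_base to the output
 * stream's time_base before handing it to the muxer. */
static int write_copied_packet(AVFormatContext *ofmt_ctx,
                               AVStream *in_stream, AVStream *out_stream,
                               AVPacket *pkt, int out_stream_index)
{
    pkt->stream_index = out_stream_index;

    /* Converts pts, dts and duration in one call. */
    av_packet_rescale_ts(pkt, in_stream->time_base, out_stream->time_base);
    pkt->pos = -1;

    /* The muxer interleaves packets from both inputs by dts. */
    return av_interleaved_write_frame(ofmt_ctx, pkt);
}

As far as I understand, av_interleaved_write_frame() only orders the packets correctly when every packet already carries a monotonically increasing dts in the output time_base; it does not generate timestamps by itself.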

// this code does not work:

    dts = av_gettime() / 1000;              // wall clock in milliseconds
    dts = dts * 25;
    printf("DTS:%" PRId64 "\n", dts);       // "%l" is not a valid format; needs <inttypes.h>
    dts = av_gettime();                     // overwrites the value above: now microseconds
    int duration = 20;                      // default duration
    if (m_prevAudioDts > 0LL) {
        duration = dts - m_prevAudioDts;    // this difference is in microseconds
    }

    m_prevAudioDts = dts;
    m_currAudioDts += duration;
    pkt.duration = duration;                // must be in the output stream's time_base units

By "this one" I mean the pts/dts handling....
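
What I now suspect is a unit mismatch: av_gettime() returns microseconds, but pkt.dts and pkt.duration have to be expressed in the output stream's time_base. Since the audio is 8000 Hz AAC, it is probably simpler to derive the timestamps from the running sample count instead of the wall clock, roughly like this sketch (the names audio_out_stream, samples_per_frame and next_audio_pts are just for illustration):

#include <libavformat/avformat.h>
#include <libavutil/mathematics.h>

/* Timestamp one audio packet from a running sample counter instead of
 * av_gettime(). Assumes AAC with 1024 samples per frame at 8000 Hz. */
static void stamp_audio_packet(AVPacket *pkt, AVStream *audio_out_stream,
                               int64_t *next_audio_pts)
{
    const int sample_rate       = 8000;
    const int samples_per_frame = 1024;               /* AAC frame size */
    AVRational samples_tb       = { 1, sample_rate }; /* 1/8000 time base */

    /* pts/dts counted in samples, then rescaled to the muxer's time base. */
    pkt->pts = av_rescale_q(*next_audio_pts, samples_tb,
                            audio_out_stream->time_base);
    pkt->dts = pkt->pts;
    pkt->duration = av_rescale_q(samples_per_frame, samples_tb,
                                 audio_out_stream->time_base);

    *next_audio_pts += samples_per_frame;
}

The video side would work the same way with a 1/25 time base (one tick per frame at 25 fps), so both streams end up in the output's time_base and the muxer can interleave them.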

-----Original Message-----
From: Libav-user [mailto:libav-user-bounces at ffmpeg.org] On Behalf Of Corey Taylor
Sent: Tuesday, January 23, 2018 12:00 PM
To: This list is about using libavcodec, libavformat, libavutil, libavdevice and libavfilter.
Subject: Re: [Libav-user] MUX Synchronize dts/pts/basetime does it mandatory?

On Mon, Jan 22, 2018 at 7:17 PM, kckang <kckang at skycom.ne.kr> wrote:
> The mux combines 2 streams: audio is stream 1, video is stream 0.
> When I combine the two types of packets, I just write them to the output context without adjusting anything, but the result is half the expected size.
> Frankly, I am quite confused, because I want this to be recognized and constructed automatically. The video is recognized as 25 fps (by ffprobe; the file is raw .264 reconstructed from RTP payloads and plays fine in ffplay) and the audio packets come from an AAC file, also recognized with an 8000 Hz sample rate. How can I make this happen automatically by default, or is there an easier way?

What do you mean by "automatically construct this one"?

What is "this one"?

corey
_______________________________________________
Libav-user mailing list
Libav-user at ffmpeg.org
http://ffmpeg.org/mailman/listinfo/libav-user
