[FFmpeg-user] FFmpeg and Android Raw audio encoding

Andre Esteves apostas.life at gmail.com
Tue Oct 16 18:29:35 CEST 2012


Hi all,

I've been able to stream video-only through FFmpeg, and now I'm trying to do
the same but with audio only.

The server accepts the RTSP connection and receives the packets, but the
output file is silent even though it has a duration.

Code:

> static void write_audio_frame(uint16_t *frame_audio)
> {
>     AVPacket pkt;
>     int got_packet, ret = 0;
>     AVFrame *frame = avcodec_alloc_frame();
>     int bufferSize = av_samples_get_buffer_size(0, c->channels,
>                                                 audio_input_frame_size,
>                                                 c->sample_fmt, 0);
>
>     av_init_packet(&pkt);
>     pkt.data = NULL;   /* let the encoder allocate the packet data */
>     pkt.size = 0;
>
>     frame->nb_samples = audio_input_frame_size;
>     ret = avcodec_fill_audio_frame(frame, c->channels, c->sample_fmt,
>                                    (uint8_t *)frame_audio, bufferSize, 0);
>     if (ret < 0)
>         __android_log_print(ANDROID_LOG_INFO, "AudioNative",
>                             "Could not avcodec_fill_audio_frame. -> %d\n", ret);
>
>     /* frame->pts is interpreted in the codec time base */
>     frame->pts = 130 * 44 * audio_st->codec->frame_number;
>     __android_log_print(ANDROID_LOG_INFO, "AudioNative",
>                         "Frame PTS -> %lld\n", (long long)frame->pts);
>
>     ret = avcodec_encode_audio2(c, &pkt, frame, &got_packet);
>     if (ret < 0)
>         __android_log_print(ANDROID_LOG_INFO, "AudioNative",
>                             "Could not encode frame. -> %d\n", ret);
>
>     /* the encoder may buffer samples and not emit a packet on every call */
>     if (got_packet) {
>         pkt.flags |= AV_PKT_FLAG_KEY;
>         pkt.stream_index = audio_st->index;
>
>         /* Write the compressed frame to the media file;
>          * av_interleaved_write_frame() takes ownership of the packet data. */
>         if (av_interleaved_write_frame(oc, &pkt) != 0) {
>             __android_log_print(ANDROID_LOG_INFO, "AudioNative",
>                                 "Error while writing audio frame\n");
>             exit(1);
>         }
>     } else {
>         av_free_packet(&pkt);
>     }
>     av_free(frame);
> }
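
One thing I am not sure about: av_interleaved_write_frame() expects the packet
timestamps in the stream's time base, while avcodec_encode_audio2() fills them
in the codec's time base, so maybe a rescale is needed just before writing. A
minimal sketch, reusing the same c, audio_st and pkt as in the function above:

> /* Sketch only: convert packet timing from the codec time base to the
>  * stream time base before av_interleaved_write_frame(). */
> if (pkt.pts != AV_NOPTS_VALUE)
>     pkt.pts = av_rescale_q(pkt.pts, c->time_base, audio_st->time_base);
> if (pkt.dts != AV_NOPTS_VALUE)
>     pkt.dts = av_rescale_q(pkt.dts, c->time_base, audio_st->time_base);
> pkt.duration = av_rescale_q(pkt.duration, c->time_base, audio_st->time_base);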


By the way, the "uint16_t *frame_audio" is a byte array coming from Android
Java, passed through JNI.
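
Roughly, the JNI side looks like this (class and method names here are just
placeholders, not the real ones; it pins the Java byte[] and hands it to the
write_audio_frame() above as interleaved 16-bit PCM):

> #include <jni.h>
> #include <stdint.h>
>
> JNIEXPORT void JNICALL
> Java_com_example_AudioNative_writeAudioFrame(JNIEnv *env, jobject thiz,
>                                              jbyteArray pcm)
> {
>     jbyte *bytes = (*env)->GetByteArrayElements(env, pcm, NULL);
>     if (bytes == NULL)
>         return; /* allocation failed, exception already pending */
>
>     /* interpret the byte[] as interleaved 16-bit PCM samples */
>     write_audio_frame((uint16_t *)bytes);
>
>     /* JNI_ABORT: the buffer was only read, nothing to copy back */
>     (*env)->ReleaseByteArrayElements(env, pcm, bytes, JNI_ABORT);
> }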

Thanks in advance.

Best Regards,
-- 
Andre Esteves
Mobile Software Developer


