ijkPlayer Source Code Guide 6: The Android Hardware Decoding Flow

The previous two articles covered software decoding and its rendering path; in this one we trace ijk's hardware decoding flow on Android.

1. H264

This is not meant to be a full H264 tutorial; it only introduces the pieces this article relies on.

The H264 bitstream

An H264 bitstream comes in two framings, AVCC and Annex-B, and different container formats tend to use different ones: mp4/flv/mkv usually carry AVCC, while TS streams use Annex-B.

They differ in the following ways:

1. NALUs are delimited differently. In AVCC each NALU is prefixed with its length, as in [length] [NALU] [length] [NALU], while in Annex-B NALUs are separated by a start code, usually 0x000001 or 0x00000001.

2. The sps/pps are stored differently. In Annex-B the sps/pps can simply appear as NALUs at the head of the file or stream, whereas AVCC needs a dedicated place to store them.

On Android, the MediaCodec hardware decoder only accepts Annex-B, so AVCC input has to be converted to Annex-B to stay compatible. FFmpeg stores the container's sps/pps in AVCodecParameters->extradata. With Annex-B the sps/pps arrive directly inside the AVPackets and reach the decoder that way; with AVCC we have to parse the sps/pps out of the extradata ourselves, wrap them as Annex-B NALUs, and hand them to the decoder before decoding can start. Let's look at how the ijk source handles both problems.

Converting AVCC sps/pps into Annex-B NALUs

This requires knowing the layout of the sps/pps inside the AVCC extradata. The convert_sps_pps function parses exactly this layout, in the following steps:

1. Parse the number of bytes used for the NALU length field (i.e. the NALU delimiter); it is needed later during conversion.

2. Parse the sps count and the pps count, and iterate over each.

3. Prepend the Annex-B start code 0x00000001 to every sps and pps, then copy the actual sps/pps payload into the output buffer.

Byte 1: version (usually 0x01)
Byte 2: avc profile (same as byte 2 of the first sps)
Byte 3: avc compatibility (same as byte 3 of the first sps)
Byte 4: avc level (same as byte 4 of the first sps)
Byte 5, high 6 bits: reserved, all 1s
Byte 5, low 2 bits: NALU length field size minus 1; usually 3, i.e. the stream uses 3+1=4 bytes for each NALU length
Byte 6, high 3 bits: reserved, all 1s
Byte 6, low 5 bits: number of SPS NALUs, usually 1
From byte 7 on: one or more SPS entries
 SPS layout: [16-bit SPS length][SPS NALU data]

After the SPS data:
Byte 1: number of PPS NALUs, usually 1
From byte 2 on: one or more PPS entries, each [16-bit PPS length][PPS NALU data]

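As a sanity check, the header fields above can be read with a few lines of C. This is only an illustrative sketch; the bytes in demo_avcc are hand-made, not taken from a real stream:

```c
#include <stdint.h>
#include <stddef.h>

/* Minimal reader for the avcC header layout described above.
 * Fills in the NALU length-field size and the SPS count;
 * assumes a well-formed buffer of at least 7 bytes. */
static int read_avcc_header(const uint8_t *buf, size_t size,
                            size_t *nal_len_size, unsigned *sps_count)
{
    if (size < 7 || buf[0] != 1)        /* byte 1: version must be 1 */
        return -1;
    /* byte 5, low 2 bits: NALU length field size minus one */
    *nal_len_size = (buf[4] & 0x03) + 1;
    /* byte 6, low 5 bits: number of SPS NALUs */
    *sps_count = buf[5] & 0x1f;
    return 0;
}

/* A hand-crafted avcC blob: version 1, profile 0x64 (High), compat 0x00,
 * level 0x28, 4-byte NALU lengths, 1 SPS of length 2, then 1 PPS of length 2. */
static const uint8_t demo_avcc[] = {
    0x01, 0x64, 0x00, 0x28, 0xff, 0xe1,
    0x00, 0x02, 0x67, 0x64,        /* [16-bit len=2][SPS data] */
    0x01, 0x00, 0x02, 0x68, 0xee   /* 1 PPS, [16-bit len=2][PPS data] */
};
```

Calling read_avcc_header on demo_avcc yields a 4-byte length field and one SPS, matching what convert_sps_pps extracts below.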
static int convert_sps_pps( const uint8_t *p_buf, size_t i_buf_size,
                            uint8_t *p_out_buf, size_t i_out_buf_size,
                            size_t *p_sps_pps_size, size_t *p_nal_size)
{

    // int i_profile;
    uint32_t i_data_size = i_buf_size, i_nal_size, i_sps_pps_size = 0;
    unsigned int i_loop_end;


    /* */
    if( i_data_size < 7 )
    {
        ALOGE( "Input Metadata too small" );
        return -1;
    }



    /* Read infos in first 6 bytes */
    // i_profile    = (p_buf[1] << 16) | (p_buf[2] << 8) | p_buf[3];
    if (p_nal_size)
        *p_nal_size  = (p_buf[4] & 0x03) + 1;
    p_buf       += 5;
    i_data_size -= 5;
    // loop twice: first pass handles sps, second handles pps
    for ( unsigned int j = 0; j < 2; j++ )
    {
        /* First time is SPS, Second is PPS */
        if( i_data_size < 1 )
        {

            ALOGE( "PPS too small after processing SPS/PPS %u",
                    i_data_size );
            return -1;
        }
        // the sps count is the low 5 bits of byte 6, hence the 0x1f mask
        // the pps count occupies the whole first byte once only the pps data remains
        i_loop_end = p_buf[0] & (j == 0 ? 0x1f : 0xff);
        // skip the count byte
        p_buf++; i_data_size--;



        for ( unsigned int i = 0; i < i_loop_end; i++)
        {
            if( i_data_size < 2 )
            {
                ALOGE( "SPS is too small %u", i_data_size );
                return -1;
            }
            // the first 16 bits are the sps/pps length
            i_nal_size = (p_buf[0] << 8) | p_buf[1];
            // skip the length field
            p_buf += 2;
            i_data_size -= 2;




            if( i_data_size < i_nal_size )
            {
                ALOGE( "SPS size does not match NAL specified size %u",
                        i_data_size );
                return -1;
            }
            if( i_sps_pps_size + 4 + i_nal_size > i_out_buf_size )
            {
                ALOGE( "Output SPS/PPS buffer too small" );
                return -1;
            }
            // prepend the Annex-B start code
            p_out_buf[i_sps_pps_size++] = 0;
            p_out_buf[i_sps_pps_size++] = 0;
            p_out_buf[i_sps_pps_size++] = 0;
            p_out_buf[i_sps_pps_size++] = 1;


            memcpy( p_out_buf + i_sps_pps_size, p_buf, i_nal_size );
            i_sps_pps_size += i_nal_size;


            p_buf += i_nal_size;
            i_data_size -= i_nal_size;
        }
    }



    *p_sps_pps_size = i_sps_pps_size;


    return 0;
}
Converting AVCC NALUs to Annex-B NALUs

A quick tip first: in FFmpeg, a video AVPacket may contain more than one NALU, so the conversion must cover all of them. convert_h264_to_annexb below is that conversion function; with the background above it is fairly easy to follow:

1. The input data is [length] [NALU] [length] [NALU].

2. Using the previously obtained p_nal_size (the byte width of the NALU length field), read each NALU's size and replace the length-prefix delimiter with the start code 0x00000001 (or 0x000001 for a 3-byte field).

3. Copy the actual NAL payload according to that size, then check whether the input has been fully consumed; if not, more NALUs remain, so repeat the steps above.

static void convert_h264_to_annexb( uint8_t *p_buf, size_t i_len,
                                    size_t i_nal_size,
                                    H264ConvertState *state )
{

    if( i_nal_size < 3 || i_nal_size > 4 )
        return;




    /* This only works for NAL sizes 3-4 */
    while( i_len > 0 )
    {
        if( state->nal_pos < i_nal_size ) {
            unsigned int i;
            // read the NAL size, rewriting the first 4 bytes to 0x00000001 (or the first 3 bytes to 0x000001)
            for( i = 0; state->nal_pos < i_nal_size && i < i_len; i++, state->nal_pos++ ) {
                state->nal_len = (state->nal_len << 8) | p_buf[i];
                p_buf[i] = 0;
            }
            if( state->nal_pos < i_nal_size )
                return;
            p_buf[i - 1] = 1;
            p_buf += i;
            i_len -= i;
        }
        if( state->nal_len > INT_MAX )
            return;
        if( state->nal_len > i_len )
        {

            state->nal_len -= i_len;
            return;
        }
        else
        {
            // advance past this NAL and read the next one
            p_buf += state->nal_len;
            i_len -= state->nal_len;
            state->nal_len = 0;
            state->nal_pos = 0;
        }
    }
}
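If the whole packet is available in one buffer (so no H264ConvertState carry-over is needed) and the length field is 4 bytes, the same in-place rewrite can be sketched more simply. This is a simplified illustration, not ijk code, and demo_pkt is a hand-made example:

```c
#include <stdint.h>
#include <stddef.h>

/* Single-shot AVCC -> Annex-B conversion: rewrites each 4-byte length
 * prefix into the 0x00000001 start code, in place. Assumes the whole
 * packet is present and the NALU length field is 4 bytes wide. */
static void avcc_to_annexb(uint8_t *buf, size_t len)
{
    size_t pos = 0;
    while (pos + 4 <= len) {
        size_t nal_len = ((size_t)buf[pos] << 24) | ((size_t)buf[pos + 1] << 16)
                       | ((size_t)buf[pos + 2] << 8) | buf[pos + 3];
        buf[pos] = buf[pos + 1] = buf[pos + 2] = 0;  /* start code 00 00 00 01 */
        buf[pos + 3] = 1;
        pos += 4 + nal_len;                          /* jump to the next NALU */
    }
}

/* Two NALUs: [len=2][0x65 0xaa] followed by [len=1][0x41] */
static uint8_t demo_pkt[] = {
    0x00, 0x00, 0x00, 0x02, 0x65, 0xaa,
    0x00, 0x00, 0x00, 0x01, 0x41
};
```

The real convert_h264_to_annexb above additionally carries state across calls because a NALU's length prefix can straddle two reads, and it supports 3-byte length fields.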

Now let's step into the hardware decoding source.

2. Hardware Decoding

static IJKFF_Pipenode *func_open_video_decoder(IJKFF_Pipeline *pipeline, FFPlayer *ffp)
{
    IJKFF_Pipeline_Opaque *opaque = pipeline->opaque;
    IJKFF_Pipenode        *node = NULL;
    //prefer the hardware decoder; if the stream format doesn't qualify, or hardware decoding is not enabled, fall back to the software decoder
    if (ffp->mediacodec_all_videos || ffp->mediacodec_avc || ffp->mediacodec_hevc || ffp->mediacodec_mpeg2)
        node = ffpipenode_create_video_decoder_from_android_mediacodec(ffp, pipeline, opaque->weak_vout);
    if (!node) {
        node = ffpipenode_create_video_decoder_from_ffplay(ffp);
    }






    return node;
}
Obtaining the video codec information

We'll trace the code with H.264 as the example. By the time the video decoder is opened, an FFmpeg decoder has already been created and initialized; from it we can obtain the video's codec format and parameters such as profile and level, and codecpar->extradata gives us the sps/pps.

IJKFF_Pipenode *ffpipenode_create_video_decoder_from_android_mediacodec(FFPlayer *ffp, IJKFF_Pipeline *pipeline, SDL_Vout *vout)
{
    ALOGD("ffpipenode_create_video_decoder_from_android_mediacodec()\n");
    if (SDL_Android_GetApiLevel() < IJK_API_16_JELLY_BEAN)
        return NULL;


    if (!ffp || !ffp->is)
        return NULL;


    IJKFF_Pipenode *node = ffpipenode_alloc(sizeof(IJKFF_Pipenode_Opaque));
    if (!node)
        return node;


    VideoState            *is     = ffp->is;
    IJKFF_Pipenode_Opaque *opaque = node->opaque;
    JNIEnv                *env    = NULL;
    int                    ret    = 0;
    jobject                jsurface = NULL;


    node->func_destroy  = func_destroy;
    if (ffp->mediacodec_sync) {
        node->func_run_sync = func_run_sync_loop;
    } else {
        node->func_run_sync = func_run_sync;
    }

    node->func_flush    = func_flush;
    opaque->pipeline    = pipeline;
    opaque->ffp         = ffp;
    //the software decoder that has already been opened
    opaque->decoder     = &is->viddec;
    opaque->weak_vout   = vout;
    // holds a copy of the codec parameters
    opaque->codecpar = avcodec_parameters_alloc();
    if (!opaque->codecpar)
        goto fail;



    ret = avcodec_parameters_from_context(opaque->codecpar, opaque->decoder->avctx);
    if (ret)
        goto fail;
    //look up the software decoder's id
    //fill in the opaque->mcc info
    switch (opaque->codecpar->codec_id) {
    case AV_CODEC_ID_H264:
        if (!ffp->mediacodec_avc && !ffp->mediacodec_all_videos) {
            ALOGE("%s: MediaCodec: AVC/H264 is disabled. codec_id:%d \n", __func__, opaque->codecpar->codec_id);
            goto fail;

        }


        //find a profile the hardware decoder supports
        //BASELINE, CONSTRAINED_BASELINE, MAIN, EXTENDED and HIGH are supported
        switch (opaque->codecpar->profile) {
            ........
        }
        // set the mime type
        strcpy(opaque->mcc.mime_type, SDL_AMIME_VIDEO_AVC);
        opaque->mcc.profile = opaque->codecpar->profile;
        opaque->mcc.level   = opaque->codecpar->level;
        break;
        // the h265 decoder
    case AV_CODEC_ID_HEVC:
        if (!ffp->mediacodec_hevc && !ffp->mediacodec_all_videos) {
            ALOGE("%s: MediaCodec/HEVC is disabled. codec_id:%d \n", __func__, opaque->codecpar->codec_id);
            goto fail;
        }

        strcpy(opaque->mcc.mime_type, SDL_AMIME_VIDEO_HEVC);
        opaque->mcc.profile = opaque->codecpar->profile;
        opaque->mcc.level   = opaque->codecpar->level;
        break;
    case AV_CODEC_ID_MPEG2VIDEO:
        if (!ffp->mediacodec_mpeg2 && !ffp->mediacodec_all_videos) {
            ALOGE("%s: MediaCodec/MPEG2VIDEO is disabled. codec_id:%d \n", __func__, opaque->codecpar->codec_id);
            goto fail;
        }
        strcpy(opaque->mcc.mime_type, SDL_AMIME_VIDEO_MPEG2VIDEO);
        opaque->mcc.profile = opaque->codecpar->profile;
        opaque->mcc.level   = opaque->codecpar->level;
        break;
    case AV_CODEC_ID_MPEG4:
        if (!ffp->mediacodec_mpeg4 && !ffp->mediacodec_all_videos) {
            ALOGE("%s: MediaCodec/MPEG4 is disabled. codec_id:%d \n", __func__, opaque->codecpar->codec_id);
            goto fail;
        }
        if ((opaque->codecpar->codec_tag & 0x0000FFFF) == 0x00005844) {
            ALOGE("%s: divx is not supported \n", __func__);
            goto fail;
        }
        strcpy(opaque->mcc.mime_type, SDL_AMIME_VIDEO_MPEG4);
        opaque->mcc.profile = opaque->codecpar->profile >= 0 ? opaque->codecpar->profile : 0;
        opaque->mcc.level   = opaque->codecpar->level >= 0 ? opaque->codecpar->level : 1;
        break;

    default:
        ALOGE("%s:create: not H264 or H265/HEVC, codec_id:%d \n", __func__, opaque->codecpar->codec_id);
        goto fail;
    }

    if (JNI_OK != SDL_JNI_SetupThreadEnv(&env)) {
        ALOGE("%s:create: SetupThreadEnv failed\n", __func__);
        goto fail;
    }


    opaque->acodec_mutex                      = SDL_CreateMutex();
    opaque->acodec_cond                       = SDL_CreateCond();
    opaque->acodec_first_dequeue_output_mutex = SDL_CreateMutex();
    opaque->acodec_first_dequeue_output_cond  = SDL_CreateCond();
    opaque->any_input_mutex                   = SDL_CreateMutex();
    opaque->any_input_cond                    = SDL_CreateCond();
  
    //create the MediaFormat
    ret = recreate_format_l(env, node);
    if (ret) {
        ALOGE("amc: recreate_format_l failed\n");
        goto fail;
    }
    //call back into the Java layer to choose a hardware decoder
    //eventually one concrete decoder is selected
    if (!ffpipeline_select_mediacodec_l(pipeline, &opaque->mcc) || !opaque->mcc.codec_name[0]) {
        ALOGE("amc: no suitable codec\n");
        goto fail;
    }
}

Creating the MediaFormat

When creating the MediaFormat, the code checks whether the sps data is in AVCC form, i.e. opaque->codecpar->extradata[0] == 1: in the AVCC sps/pps layout above the first byte is 1, while in Annex-B it is 0 (the start code begins with a 0x00 byte). If the condition holds, the data is converted and stored into the csd-0 buffer, which is passed to MediaCodec later during configuration.
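The detection condition described above can be expressed as a tiny helper (a sketch for illustration, not ijk code):

```c
#include <stdint.h>
#include <stddef.h>

/* Mirrors the check described above: AVCC extradata begins with the
 * version byte 0x01, while Annex-B extradata begins with the 0x00
 * leading byte of a start code. */
static int is_avcc_extradata(const uint8_t *extradata, size_t size)
{
    return size > 0 && extradata[0] == 1;
}
```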

// create the Java-layer MediaFormat
static int recreate_format_l(JNIEnv *env, IJKFF_Pipenode *node)
{

    IJKFF_Pipenode_Opaque *opaque         = node->opaque;
    FFPlayer              *ffp            = opaque->ffp;
    int                    rotate_degrees = 0;




    ALOGI("AMediaFormat: %s, %dx%d\n", opaque->mcc.mime_type, opaque->codecpar->width, opaque->codecpar->height);
    SDL_AMediaFormat_deleteP(&opaque->output_aformat);
    //build a MediaFormat via reflection: MediaFormat.createVideoFormat("video/avc", 1280, 720)
    if (ffp->sdkVersion >= 21) {
        opaque->input_aformat = SDL_AMediaFormatNative_createVideoFormat(opaque->mcc.mime_type,
                                                                         opaque->codecpar->width,
                                                                         opaque->codecpar->height);
    }else{
        // create the Java-layer MediaFormat
        opaque->input_aformat = SDL_AMediaFormatJava_createVideoFormat(env, opaque->mcc.mime_type,
                                                                       opaque->codecpar->width,
                                                                       opaque->codecpar->height);
    }

		
    if (opaque->codecpar->extradata && opaque->codecpar->extradata_size > 0) {
        //configure the sps/pps info for h264 and h265
        if ((opaque->codecpar->codec_id == AV_CODEC_ID_H264 && opaque->codecpar->extradata[0] == 1)
            || (opaque->codecpar->codec_id == AV_CODEC_ID_HEVC && opaque->codecpar->extradata_size > 3
                && (opaque->codecpar->extradata[0] == 1 || opaque->codecpar->extradata[1] == 1))) {
#if AMC_USE_AVBITSTREAM_FILTER
            if (opaque->codecpar->codec_id == AV_CODEC_ID_H264) {
                opaque->bsfc = av_bitstream_filter_init("h264_mp4toannexb");
                if (!opaque->bsfc) {
                    ALOGE("Cannot open the h264_mp4toannexb BSF!\n");
                    goto fail;
                }
            } else {
                opaque->bsfc = av_bitstream_filter_init("hevc_mp4toannexb");
                if (!opaque->bsfc) {
                    ALOGE("Cannot open the hevc_mp4toannexb BSF!\n");
                    goto fail;
                }
            }

            opaque->orig_extradata_size = opaque->codecpar->extradata_size;
            opaque->orig_extradata = (uint8_t*) av_mallocz(opaque->codecpar->extradata_size + FF_INPUT_BUFFER_PADDING_SIZE);
            if (!opaque->orig_extradata) {
                goto fail;
            }
            memcpy(opaque->orig_extradata, opaque->codecpar->extradata, opaque->codecpar->extradata_size);
            for(int i = 0; i < opaque->codecpar->extradata_size; i+=4) {
                ALOGE("csd-0[%d]: %02x%02x%02x%02x\n", opaque->codecpar->extradata_size, (int)opaque->codecpar->extradata[i+0], (int)opaque->codecpar->extradata[i+1], (int)opaque->codecpar->extradata[i+2], (int)opaque->codecpar->extradata[i+3]);
            }
            SDL_AMediaFormat_setBuffer(opaque->input_aformat, "csd-0", opaque->codecpar->extradata, opaque->codecpar->extradata_size);
#else
            size_t   sps_pps_size   = 0;
            size_t   convert_size   = opaque->codecpar->extradata_size + 20;
            uint8_t *convert_buffer = (uint8_t *)calloc(1, convert_size);
            if (!convert_buffer) {
                ALOGE("%s:sps_pps_buffer: alloc failed\n", __func__);
                goto fail;
            }
            if (opaque->codecpar->codec_id == AV_CODEC_ID_H264) {
                if (0 != convert_sps_pps(opaque->codecpar->extradata, opaque->codecpar->extradata_size,
                                         convert_buffer, convert_size,
                                         &sps_pps_size, &opaque->nal_size)) {
                    ALOGE("%s:convert_sps_pps: failed\n", __func__);
                    goto fail;
                }
            } else {
                if (0 != convert_hevc_nal_units(opaque->codecpar->extradata, opaque->codecpar->extradata_size,
                                                convert_buffer, convert_size,
                                                &sps_pps_size, &opaque->nal_size)) {
                    ALOGE("%s:convert_hevc_nal_units: failed\n", __func__);
                    goto fail;
                }
            }
            //MediaFormat.setByteBuffer
            SDL_AMediaFormat_setBuffer(opaque->input_aformat, "csd-0", convert_buffer, sps_pps_size);
            for(int i = 0; i < sps_pps_size; i+=4) {
                ALOGE("csd-0[%d]: %02x%02x%02x%02x\n", (int)sps_pps_size, (int)convert_buffer[i+0], (int)convert_buffer[i+1], (int)convert_buffer[i+2], (int)convert_buffer[i+3]);
            }
            free(convert_buffer);
#endif
        } else if (opaque->codecpar->codec_id == AV_CODEC_ID_MPEG4) {
            size_t esds_dec_dscr_type_length = opaque->codecpar->extradata_size + 0x18;
            size_t esds_es_dscr_type_length = esds_dec_dscr_type_length + 0x08;
            size_t esds_size = esds_es_dscr_type_length + 0x05;
            uint8_t *convert_buffer = (uint8_t *)calloc(1, esds_size);
            restore_mpeg4_esds(opaque->codecpar, opaque->codecpar->extradata, opaque->codecpar->extradata_size, esds_es_dscr_type_length, esds_dec_dscr_type_length, convert_buffer);
            SDL_AMediaFormat_setBuffer(opaque->input_aformat, "csd-0", convert_buffer, esds_size);
            free(convert_buffer);
        } else {
            // Codec specific data
            // SDL_AMediaFormat_setBuffer(opaque->aformat, "csd-0", opaque->codecpar->extradata, opaque->codecpar->extradata_size);
            ALOGE("csd-0: naked\n");
        }
    } else {
        ALOGE("no buffer(%d)\n", opaque->codecpar->extradata_size);
    }

    rotate_degrees = ffp_get_video_rotate_degrees(ffp);
    //configure the rotation angle
    if (ffp->mediacodec_auto_rotate &&
        rotate_degrees != 0 &&
        SDL_Android_GetApiLevel() >= IJK_API_21_LOLLIPOP) {
        ALOGI("amc: rotate in decoder: %d\n", rotate_degrees);
        opaque->frame_rotate_degrees = rotate_degrees;
        SDL_AMediaFormat_setInt32(opaque->input_aformat, "rotation-degrees", rotate_degrees);
        ffp_notify_msg2(ffp, FFP_MSG_VIDEO_ROTATION_CHANGED, 0);
    } else {
        ALOGI("amc: rotate notify: %d\n", rotate_degrees);
        ffp_notify_msg2(ffp, FFP_MSG_VIDEO_ROTATION_CHANGED, rotate_degrees);
    }



    return 0;
fail:
    return -1;
}
Selecting a suitable MediaCodec

Once the codec format is known, the resulting mime type is passed back to the Java layer, which selects a decoder and returns the chosen decoder's name to the native layer.

bool ffpipeline_select_mediacodec_l(IJKFF_Pipeline* pipeline, ijkmp_mediacodecinfo_context *mcc)
{
    ALOGD("%s\n", __func__);
    if (!check_ffpipeline(pipeline, __func__))
        return false;


    if (!mcc || !pipeline->opaque->mediacodec_select_callback)
        return false;


    return pipeline->opaque->mediacodec_select_callback(pipeline->opaque->mediacodec_select_callback_opaque, mcc);
}




static bool mediacodec_select_callback(void *opaque, ijkmp_mediacodecinfo_context *mcc)
{
    JNIEnv *env = NULL;
    jobject weak_this = (jobject) opaque;
    const char *found_codec_name = NULL;


    if (JNI_OK != SDL_JNI_SetupThreadEnv(&env)) {
        ALOGE("%s: SetupThreadEnv failed\n", __func__);
        return -1;
    }

    //call back into the Java layer to pick a usable decoder name
    found_codec_name = J4AC_IjkMediaPlayer__onSelectCodec__withCString__asCBuffer(env, weak_this, mcc->mime_type, mcc->profile, mcc->level, mcc->codec_name, sizeof(mcc->codec_name));
    if (J4A_ExceptionCheck__catchAll(env) || !found_codec_name) {
        ALOGE("%s: onSelectCodec failed\n", __func__);
        goto fail;
    }

fail:
    return found_codec_name;
}


Creating the MediaCodec

Here the MediaCodec is created from the selected codec name.

static int reconfigure_codec_l(JNIEnv *env, IJKFF_Pipenode *node, jobject new_surface)
{
    IJKFF_Pipenode_Opaque *opaque   = node->opaque;
    int                    ret      = 0;
    sdl_amedia_status_t    amc_ret  = 0;
    jobject                prev_jsurface = NULL;

    prev_jsurface = opaque->jsurface;
    if (new_surface) {
        opaque->jsurface = (*env)->NewGlobalRef(env, new_surface);
        if (J4A_ExceptionCheck__catchAll(env) || !opaque->jsurface)
            goto fail;
    } else {
        opaque->jsurface = NULL;
    }

    SDL_JNI_DeleteGlobalRefP(env, &prev_jsurface);

    if (!opaque->acodec) {
        opaque->acodec = create_codec_l(env, node);
        if (!opaque->acodec) {
            ALOGE("%s:open_video_decoder: create_codec failed\n", __func__);
            ret = -1;
            goto fail;
        }
    }
}

static SDL_AMediaCodec *create_codec_l(JNIEnv *env, IJKFF_Pipenode *node)
{
    IJKFF_Pipenode_Opaque        *opaque   = node->opaque;
    ijkmp_mediacodecinfo_context *mcc      = &opaque->mcc;
    SDL_AMediaCodec              *acodec   = NULL;


    if (opaque->jsurface == NULL) {
        // we don't need real codec if we don't have a surface
        acodec = SDL_AMediaCodecDummy_create();
    } else {
        //create the MediaCodec
        if (opaque->ffp->sdkVersion >= 21) {
            acodec = SDL_AMediaCodec_native_create(mcc->codec_name);
        } else {
            acodec = SDL_AMediaCodecJava_createByCodecName(env, mcc->codec_name);
        }


        if (acodec) {
            strncpy(opaque->acodec_name, mcc->codec_name, sizeof(opaque->acodec_name) / sizeof(*opaque->acodec_name));
            opaque->acodec_name[sizeof(opaque->acodec_name) / sizeof(*opaque->acodec_name) - 1] = 0;
        }
    }

#if 0
    if (!acodec)
        acodec = SDL_AMediaCodecJava_createDecoderByType(env, mcc->mime_type);
#endif


    if (acodec) {
        // QUIRK: always recreate MediaCodec for reconfigure
        opaque->quirk_reconfigure_with_new_codec = true;
        /*-
        if (0 == strncasecmp(mcc->codec_name, "OMX.TI.DUCATI1.", 15)) {
            opaque->quirk_reconfigure_with_new_codec = true;
        }

        */
        /* delaying output makes it possible to correct frame order, hopefully */
        if (0 == strncasecmp(mcc->codec_name, "OMX.TI.DUCATI1.", 15)) {
            /* this is the only acceptable value on Nexus S */
            opaque->n_buf_out = 1;
            ALOGD("using buffered output for %s", mcc->codec_name);
        }
    }
    //set width/height according to the rotation angle
    if (opaque->frame_rotate_degrees == 90 || opaque->frame_rotate_degrees == 270) {
        opaque->frame_width  = opaque->codecpar->height;
        opaque->frame_height = opaque->codecpar->width;
    } else {
        opaque->frame_width  = opaque->codecpar->width;
        opaque->frame_height = opaque->codecpar->height;
    }

    return acodec;
}
Configuring and starting the MediaCodec

After the MediaCodec is created it gets configured, mainly with the MediaFormat built earlier and the target Surface to render to. Once configured, decoded frames can be displayed on that Surface.

static int reconfigure_codec_l(JNIEnv *env, IJKFF_Pipenode *node, jobject new_surface)
{
    IJKFF_Pipenode_Opaque *opaque   = node->opaque;
    int                    ret      = 0;
    sdl_amedia_status_t    amc_ret  = 0;
    jobject                prev_jsurface = NULL;

    prev_jsurface = opaque->jsurface;
    if (new_surface) {
        opaque->jsurface = (*env)->NewGlobalRef(env, new_surface);
        if (J4A_ExceptionCheck__catchAll(env) || !opaque->jsurface)
            goto fail;
    } else {
        opaque->jsurface = NULL;
    }

    SDL_JNI_DeleteGlobalRefP(env, &prev_jsurface);

    if (!opaque->acodec) {
        opaque->acodec = create_codec_l(env, node);
        if (!opaque->acodec) {
            ALOGE("%s:open_video_decoder: create_codec failed\n", __func__);
            ret = -1;
            goto fail;
        }
    }

    //if the MediaCodec has already been used, it needs to be reset
    if (SDL_AMediaCodec_isConfigured(opaque->acodec)) {
        if (opaque->acodec) {
            if (SDL_AMediaCodec_isStarted(opaque->acodec)) {
                SDL_VoutAndroid_invalidateAllBuffers(opaque->weak_vout);
                SDL_AMediaCodec_stop(opaque->acodec);
            }
            if (opaque->quirk_reconfigure_with_new_codec) {
                ALOGI("quirk: reconfigure with new codec");
                SDL_AMediaCodec_decreaseReferenceP(&opaque->acodec);
                SDL_VoutAndroid_setAMediaCodec(opaque->weak_vout, NULL);


                opaque->acodec = create_codec_l(env, node);
                if (!opaque->acodec) {
                    ALOGE("%s:open_video_decoder: create_codec failed\n", __func__);
                    ret = -1;
                    goto fail;
                }
            }
        }


        assert(opaque->weak_vout);
    }
    //configure the MediaCodec
    //the MediaFormat created earlier is applied here
    amc_ret = SDL_AMediaCodec_configure_surface(env, opaque->acodec, opaque->input_aformat, opaque->jsurface, NULL, 0);
    if (amc_ret != SDL_AMEDIA_OK) {
        ALOGE("%s:configure_surface: failed\n", __func__);
        ret = -1;
        goto fail;
    }


    //start the MediaCodec
    amc_ret = SDL_AMediaCodec_start(opaque->acodec);
    if (amc_ret != SDL_AMEDIA_OK) {
        ALOGE("%s:SDL_AMediaCodec_start: failed\n", __func__);
        ret = -1;
        goto fail;
    }

    opaque->acodec_first_dequeue_output_request = true;
    ALOGI("%s:new acodec: %p\n", __func__, opaque->acodec);
    SDL_VoutAndroid_setAMediaCodec(opaque->weak_vout, opaque->acodec);


    ret = 0;
fail:
    return ret;
}






static sdl_amedia_status_t SDL_AMediaCodecJava_configure_surface(
    JNIEnv*env,
    SDL_AMediaCodec* acodec,
    const SDL_AMediaFormat* aformat,
    jobject android_surface,
    SDL_AMediaCrypto *crypto,
    uint32_t flags)
{
    SDLTRACE("%s", __func__);
    //configure the MediaCodec: this binds the Surface, so decoded frames can be displayed directly on it
    SDL_AMediaCodec_Opaque *opaque = (SDL_AMediaCodec_Opaque *)acodec->opaque;
    jobject android_media_format = SDL_AMediaFormatJava_getObject(env, aformat);
    jobject android_media_codec  = SDL_AMediaCodecJava_getObject(env, acodec);
    ALOGE("configure acodec:%p format:%p: surface:%p", android_media_codec, android_media_format, android_surface);
    J4AC_MediaCodec__configure(env, android_media_codec, android_media_format, android_surface, crypto, flags);
    if (J4A_ExceptionCheck__catchAll(env)) {
        return SDL_AMEDIA_ERROR_UNKNOWN;
    }

    opaque->is_input_buffer_valid = true;
    return SDL_AMEDIA_OK;
}


Starting the decoding thread

The decoding thread's entry function is func_run_sync, which was set up earlier during the MediaCodec pipenode creation. Before decoding begins, it first starts the thread that feeds encoded data (i.e. NALUs) into the codec.

static int video_thread(void *arg)
{
    FFPlayer *ffp = (FFPlayer *)arg;
    int       ret = 0;


    if (ffp->node_vdec) {
        ret = ffpipenode_run_sync(ffp->node_vdec);
    }
    return ret;
}





static int func_run_sync(IJKFF_Pipenode *node)
{
    JNIEnv                *env      = NULL;
    IJKFF_Pipenode_Opaque *opaque   = node->opaque;
    FFPlayer              *ffp      = opaque->ffp;
    VideoState            *is       = ffp->is;
    Decoder               *d        = &is->viddec;
    PacketQueue           *q        = d->queue;
    int                    ret      = 0;
    int                    dequeue_count = 0;
    AVFrame               *frame    = NULL;
    int                    got_frame = 0;
    AVRational             tb         = is->video_st->time_base;
    AVRational             frame_rate = av_guess_frame_rate(is->ic, is->video_st, NULL);
    double                 duration;
    double                 pts;
    //if there is no hardware decoder, take the software decoding path
    if (!opaque->acodec) {
        return ffp_video_thread(ffp);
    }



    frame = av_frame_alloc();
    if (!frame)
        goto fail;

    // start the thread that feeds encoded data
    opaque->enqueue_thread = SDL_CreateThreadEx(&opaque->_enqueue_thread, enqueue_thread_func, node, "amediacodec_input_thread");
    if (!opaque->enqueue_thread) {
        ALOGE("%s: SDL_CreateThreadEx failed\n", __func__);
        ret = -1;
        goto fail;
    }
}
Starting the input thread

The input thread works as follows:

1. Read an AVPacket from the queue.

2. Convert the AVCC-framed NALUs in the AVPacket into Annex-B NALUs.

3. Call dequeueInputBuffer to obtain a writable MediaCodec buffer.

4. Write the NALU data into the buffer.

5. Call queueInputBuffer to feed the data to the decoder.

The code has a packet_pending field meaning the current AVPacket still contains unread data. At first I assumed a dequeued buffer might be too small to hold the whole AVPacket in one write, which would require saving the write position so the next round could continue; in testing, though, I never hit that case. If it did happen, that bookkeeping would indeed be needed. What I did hit was dequeueInputBuffer returning a negative value: the dequeued AVPacket cannot be consumed this round and has to be kept for the next one. When that happens a fake frame is inserted; the code doesn't make its purpose obvious, and playback still worked after I removed that logic, so my guess is it helps audio/video sync by avoiding too large a gap between two frames.
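The pkt_temp/packet_pending bookkeeping boils down to advancing a cursor by copy_size until the packet drains. A minimal model of that logic, with a hypothetical Packet struct standing in for the AVPacket fields used here:

```c
#include <stdint.h>
#include <stddef.h>

/* Hypothetical stand-in for the d->pkt_temp / d->packet_pending pair. */
typedef struct {
    const uint8_t *data;
    int            size;
    int            pending;   /* 1 while unconsumed bytes remain */
} Packet;

/* Account for copy_size bytes having been written into a codec input
 * buffer, mirroring the logic at the end of feed_input_buffer.
 * A negative copy_size (the fake-frame path) simply drops the packet. */
static void consume(Packet *p, int copy_size)
{
    if (copy_size < 0) {
        p->pending = 0;
        return;
    }
    p->data += copy_size;     /* advance the cursor */
    p->size -= copy_size;
    if (p->size <= 0)         /* fully written: clear the pending flag */
        p->pending = 0;
}
```

With this model, pending stays 1 after a partial write of 4 out of 10 bytes and drops to 0 once the remaining 6 are written, which is exactly the condition feed_input_buffer uses to decide whether to fetch a new AVPacket.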

Now that the decoder has input data to decode, let's go back and look at how the decoded frames are obtained.

static int enqueue_thread_func(void *arg)
{
    JNIEnv                *env      = NULL;

    IJKFF_Pipenode        *node     = arg;
    IJKFF_Pipenode_Opaque *opaque   = node->opaque;
    FFPlayer              *ffp      = opaque->ffp;
    VideoState            *is       = ffp->is;
    Decoder               *d        = &is->viddec;
    PacketQueue           *q        = d->queue;
    int                    ret      = -1;
    int                    dequeue_count = 0;


    if (JNI_OK != SDL_JNI_SetupThreadEnv(&env)) {
        ALOGE("%s: SetupThreadEnv failed\n", __func__);
        goto fail;
    }


    while (!q->abort_request && !opaque->abort) {
        ret = feed_input_buffer(env, node, AMC_INPUT_TIMEOUT_US, &dequeue_count);
        if (ret != 0) {
            goto fail;
        }
    }



    ret = 0;
fail:
    SDL_AMediaCodecFake_abort(opaque->acodec);
    ALOGI("MediaCodec: %s: exit: %d", __func__, ret);
    return ret;
}




static int feed_input_buffer(JNIEnv *env, IJKFF_Pipenode *node, int64_t timeUs, int *enqueue_count)
{
    IJKFF_Pipenode_Opaque *opaque   = node->opaque;
    FFPlayer              *ffp      = opaque->ffp;
    IJKFF_Pipeline        *pipeline = opaque->pipeline;
    VideoState            *is       = ffp->is;
    Decoder               *d        = &is->viddec;
    PacketQueue           *q        = d->queue;
    sdl_amedia_status_t    amc_ret  = 0;
    int                    ret      = 0;
    ssize_t  input_buffer_index = 0;
    ssize_t  copy_size          = 0;
    int64_t  time_stamp         = 0;
    uint32_t queue_flags        = 0;


    if (enqueue_count)
        *enqueue_count = 0;




    if (d->queue->abort_request) {
        ret = 0;
        goto fail;
    }
    // packet_pending indicates the current packet still has unconsumed data; if it is 0 we enter the if and read the next packet
    // if it is nonzero and no seek has happened, the previous packet is reused
    // in testing, however, an AVPacket was never left partially written; the only case observed was dequeueInputBuffer returning a negative value,
    // which skips this queueInputBuffer and makes the next iteration reuse this AVPacket
    if (!d->packet_pending || d->queue->serial != d->pkt_serial) {


        H264ConvertState convert_state = {0, 0};
        AVPacket pkt;
        // read a video AVPacket
        do {
            if (d->queue->nb_packets == 0)
                SDL_CondSignal(d->empty_queue_cond);
            //take a video packet from the packet queue
            if (ffp_packet_queue_get_or_buffering(ffp, d->queue, &pkt, &d->pkt_serial, &d->finished) < 0) {
                ret = -1;
                goto fail;
            }
            // if it's a flush pkt, or the codec has a pending internal flush request, call flush in the Java layer
            if (ffp_is_flush_packet(&pkt) || opaque->acodec_flush_request) {
                // request flush before lock, or never get mutex
                opaque->acodec_flush_request = true;
                SDL_LockMutex(opaque->acodec_mutex);
                if (SDL_AMediaCodec_isStarted(opaque->acodec)) {
                    if (opaque->input_packet_count > 0) {
                        // flush empty queue cause error on OMX.SEC.AVC.Decoder (Nexus S)
                        SDL_VoutAndroid_invalidateAllBuffers(opaque->weak_vout);
                        SDL_AMediaCodec_flush(opaque->acodec);
                        opaque->input_packet_count = 0;
                    }
                    // If codec is configured in synchronous mode, codec will resume automatically
                    // SDL_AMediaCodec_start(opaque->acodec);
                }
                opaque->acodec_flush_request = false;
                SDL_CondSignal(opaque->acodec_cond);
                SDL_UnlockMutex(opaque->acodec_mutex);
                d->finished = 0;
                d->next_pts = d->start_pts;
                d->next_pts_tb = d->start_pts_tb;
            }
        } while (ffp_is_flush_packet(&pkt) || d->queue->serial != d->pkt_serial);
        av_packet_split_side_data(&pkt);
        av_packet_unref(&d->pkt);
        d->pkt_temp = d->pkt = pkt;
        d->packet_pending = 1;
        
        if (opaque->codecpar->codec_id == AV_CODEC_ID_H264 || opaque->codecpar->codec_id == AV_CODEC_ID_HEVC) {
            convert_h264_to_annexb(d->pkt_temp.data, d->pkt_temp.size, opaque->nal_size, &convert_state);
            int64_t time_stamp = d->pkt_temp.pts;
            if (!time_stamp && d->pkt_temp.dts)
                time_stamp = d->pkt_temp.dts;
            if (time_stamp > 0) {
                time_stamp = av_rescale_q(time_stamp, is->video_st->time_base, AV_TIME_BASE_Q);
            } else {
                time_stamp = 0;
            }
        }
    }



        //get an input buffer: decoder.dequeueInputBuffer
        input_buffer_index = SDL_AMediaCodec_dequeueInputBuffer(opaque->acodec, timeUs);
        if (input_buffer_index < 0) {
            if (SDL_AMediaCodec_isInputBuffersValid(opaque->acodec)) {
                // timeout
                ret = 0;
                goto fail;
            } else {
                // enqueue fake frame
                queue_flags |= AMEDIACODEC__BUFFER_FLAG_FAKE_FRAME;
                copy_size    = d->pkt_temp.size;
            }
        } else {
            SDL_AMediaCodecFake_flushFakeFrames(opaque->acodec);
            //fetch the buffer via getInputBuffers and copy the data in
            copy_size = SDL_AMediaCodec_writeInputData(opaque->acodec, input_buffer_index,
                                                       d->pkt_temp.data, d->pkt_temp.size);
            ALOGE("%s: copy_size=%d pkt_temp.size=%d", __func__, (int)copy_size, d->pkt_temp.size);

            if (!copy_size) {
                ALOGE("%s: SDL_AMediaCodec_getInputBuffer failed\n", __func__);
                ret = -1;
                goto fail;
            }
        }

        time_stamp = d->pkt_temp.pts;
        if (time_stamp == AV_NOPTS_VALUE && d->pkt_temp.dts != AV_NOPTS_VALUE)
            time_stamp = d->pkt_temp.dts;
        if (time_stamp >= 0) {
            time_stamp = av_rescale_q(time_stamp, is->video_st->time_base, AV_TIME_BASE_Q);
        } else {
            time_stamp = 0;
        }
        // ALOGE("queueInputBuffer, %lld\n", time_stamp);
        // Submit the filled buffer to the decoder: decoder.queueInputBuffer
        amc_ret = SDL_AMediaCodec_queueInputBuffer(opaque->acodec, input_buffer_index, 0, copy_size, time_stamp, queue_flags);
        if (amc_ret != SDL_AMEDIA_OK) {
            ALOGE("%s: SDL_AMediaCodec_queueInputBuffer failed\n", __func__);
            ret = -1;
            goto fail;
        }
        // ALOGE("%s: queue %d/%d", __func__, (int)copy_size, (int)input_buffer_size);
        opaque->input_packet_count++;
        if (enqueue_count)
            ++*enqueue_count;
    }

    if (copy_size < 0) {
        d->packet_pending = 0;
    } else {
        d->pkt_temp.dts =
        d->pkt_temp.pts = AV_NOPTS_VALUE;
        if (d->pkt_temp.data) {
            d->pkt_temp.data += copy_size;
            d->pkt_temp.size -= copy_size;
            if (d->pkt_temp.size <= 0)
                d->packet_pending = 0;
        } else {
            // FIXME: detect if decode finished
            // if (!got_frame) {
                d->packet_pending = 0;
                d->finished = d->pkt_serial;
            // }
        }
    }

fail:
    return ret;
}
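The per-packet rewrite that feed_input_buffer relies on (convert_h264_to_annexb above) can be sketched in a few lines. This is a hypothetical, minimal version that only handles the common case of a 4-byte AVCC length field, where each length prefix can be overwritten in place with the equally long start code 00 00 00 01; the real ijkplayer function also keeps conversion state across packets:

```c
#include <stddef.h>
#include <stdint.h>

/* Hypothetical minimal AVCC -> Annex-B rewrite: only valid when nal_size == 4,
 * because the 4-byte length prefix and the 4-byte start code have equal size. */
static void avcc_to_annexb_inplace(uint8_t *buf, size_t size, size_t nal_size)
{
    size_t pos = 0;
    if (nal_size != 4)
        return; /* 1/2/3-byte length fields cannot be rewritten in place */
    while (pos + nal_size <= size) {
        /* read the big-endian NALU length */
        uint32_t len = ((uint32_t)buf[pos]     << 24) |
                       ((uint32_t)buf[pos + 1] << 16) |
                       ((uint32_t)buf[pos + 2] <<  8) |
                        (uint32_t)buf[pos + 3];
        /* overwrite the length field with the start code 00 00 00 01 */
        buf[pos] = buf[pos + 1] = buf[pos + 2] = 0x00;
        buf[pos + 3] = 0x01;
        pos += nal_size + len;
    }
}
```

After this rewrite the packet is in the Annex-B framing that MediaCodec expects.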
Obtaining the decoded frame

First, get familiar with the MediaCodec API.

static int func_run_sync(IJKFF_Pipenode *node)
{
    JNIEnv                *env      = NULL;

    IJKFF_Pipenode_Opaque *opaque   = node->opaque;
    FFPlayer              *ffp      = opaque->ffp;
    VideoState            *is       = ffp->is;
    Decoder               *d        = &is->viddec;
    PacketQueue           *q        = d->queue;
    int                    ret      = 0;
    int                    dequeue_count = 0;
    AVFrame               *frame    = NULL;
    int                    got_frame = 0;
    AVRational             tb         = is->video_st->time_base;
    AVRational             frame_rate = av_guess_frame_rate(is->ic, is->video_st, NULL);
    double                 duration;
    double                 pts;
    // Fall back to software decoding when no hardware codec is available
    if (!opaque->acodec) {
        return ffp_video_thread(ffp);
    }


    if (JNI_OK != SDL_JNI_SetupThreadEnv(&env)) {
        ALOGE("%s: SetupThreadEnv failed\n", __func__);
        return -1;
    }


    frame = av_frame_alloc();
    if (!frame)
        goto fail;
    // Start the thread that feeds encoded input data to MediaCodec
    opaque->enqueue_thread = SDL_CreateThreadEx(&opaque->_enqueue_thread, enqueue_thread_func, node, "amediacodec_input_thread");
    while (!q->abort_request) {
        int64_t timeUs = opaque->acodec_first_dequeue_output_request ? 0 : AMC_OUTPUT_TIMEOUT_US;
        got_frame = 0;
        ret = drain_output_buffer(env, node, timeUs, &dequeue_count, frame, &got_frame);
        if (opaque->acodec_first_dequeue_output_request) {
            SDL_LockMutex(opaque->acodec_first_dequeue_output_mutex);
            opaque->acodec_first_dequeue_output_request = false;
            SDL_CondSignal(opaque->acodec_first_dequeue_output_cond);
            SDL_UnlockMutex(opaque->acodec_first_dequeue_output_mutex);
        }
        if (ret != 0) {
            ret = -1;
            if (got_frame && frame->opaque)
                SDL_VoutAndroid_releaseBufferProxyP(opaque->weak_vout, (SDL_AMediaCodecBufferProxy **)&frame->opaque, false);
            goto fail;

        }


        if (got_frame) {
            duration = (frame_rate.num && frame_rate.den ? av_q2d((AVRational){frame_rate.den, frame_rate.num}) : 0);
            pts = (frame->pts == AV_NOPTS_VALUE) ? NAN : frame->pts * av_q2d(tb);
    
            // Add the frame to the picture queue
            ret = ffp_queue_picture(ffp, frame, pts, duration, av_frame_get_pkt_pos(frame), is->viddec.pkt_serial);
            if (ret) {
                if (frame->opaque)
                    SDL_VoutAndroid_releaseBufferProxyP(opaque->weak_vout, (SDL_AMediaCodecBufferProxy **)&frame->opaque, false);
            }
            av_frame_unref(frame);
        }
    }
fail:
    av_frame_free(&frame);
    // (remaining thread/codec teardown elided)
    return ret;
}

Once a buffer is dequeued from MediaCodec, the frame is filled in: its width, height, and format are set, and an SDL_AMediaCodecBufferProxy containing the buffer index is created and attached through frame->opaque. That proxy is what makes audio/video synchronization possible. I initially wondered how the display time of hardware-decoded frames could be controlled at all, and this is exactly the key.
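Both timestamp conversions in this flow (packet pts to microseconds before queueInputBuffer, and presentationTimeUs back to the stream time base in amc_fill_frame) are plain rational rescaling via av_rescale_q. A dependency-free sketch of that arithmetic (hypothetical helper; the real FFmpeg function adds rounding modes and overflow handling):

```c
#include <stdint.h>

/* Hypothetical stand-in for av_rescale_q(v, src, dst): convert a tick count
 * from one rational time base to another. Truncating, no overflow guard. */
typedef struct { int num; int den; } rational;

static int64_t rescale_q(int64_t v, rational src, rational dst)
{
    /* v * (src.num / src.den) seconds, expressed in units of dst */
    return v * src.num * dst.den / ((int64_t)src.den * dst.num);
}
```

For a 90 kHz stream, pts 900000 in time base 1/90000 rescales to 10000000 in AV_TIME_BASE_Q (1/1000000), i.e. 10 seconds.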

static int drain_output_buffer_l(JNIEnv *env, IJKFF_Pipenode *node, int64_t timeUs, int *dequeue_count, AVFrame *frame, int *got_frame)
{
    IJKFF_Pipenode_Opaque *opaque   = node->opaque;


    FFPlayer              *ffp      = opaque->ffp;
    int                    ret      = 0;
    SDL_AMediaCodecBufferInfo bufferInfo;
    ssize_t                   output_buffer_index = 0;


    if (dequeue_count)
        *dequeue_count = 0;





    if (JNI_OK != SDL_JNI_SetupThreadEnv(&env)) {
        ALOGE("%s:create: SetupThreadEnv failed\n", __func__);
        goto fail;
    }


    // Dequeue a decoded output buffer
    output_buffer_index = SDL_AMediaCodecFake_dequeueOutputBuffer(opaque->acodec, &bufferInfo, timeUs);
    if (output_buffer_index == AMEDIACODEC__INFO_OUTPUT_BUFFERS_CHANGED) {
        ALOGI("AMEDIACODEC__INFO_OUTPUT_BUFFERS_CHANGED\n");
        // continue;
    } else if (output_buffer_index == AMEDIACODEC__INFO_OUTPUT_FORMAT_CHANGED) {
        ALOGI("AMEDIACODEC__INFO_OUTPUT_FORMAT_CHANGED\n");
        SDL_AMediaFormat_deleteP(&opaque->output_aformat);
        opaque->output_aformat = SDL_AMediaCodec_getOutputFormat(opaque->acodec);
        // continue;
    } else if (output_buffer_index == AMEDIACODEC__INFO_TRY_AGAIN_LATER) {
        AMCTRACE("AMEDIACODEC__INFO_TRY_AGAIN_LATER\n");
        // continue;
    } else if (output_buffer_index < 0) {
        SDL_LockMutex(opaque->any_input_mutex);
        SDL_CondWaitTimeout(opaque->any_input_cond, opaque->any_input_mutex, 1000);
        SDL_UnlockMutex(opaque->any_input_mutex);


        goto done;
    } else if (output_buffer_index >= 0) {
        ffp->stat.vdps = SDL_SpeedSamplerAdd(&opaque->sampler, FFP_SHOW_VDPS_MEDIACODEC, "vdps[MediaCodec]");



        if (dequeue_count)
            ++*dequeue_count;


        // Nexus S workaround path
        if (opaque->n_buf_out) {
            // (buffered output handling omitted)
        } else {
            // Fill the frame; the key piece is output_buffer_index
            ret = amc_fill_frame(node, frame, got_frame, output_buffer_index, SDL_AMediaCodec_getSerial(opaque->acodec), &bufferInfo);
        }


    }

done:
    if (opaque->decoder->queue->abort_request)
        ret = -1;
    else
        ret = 0;
fail:
    return ret;
}



static int amc_fill_frame(
    IJKFF_Pipenode            *node,
    AVFrame                   *frame,
    int                       *got_frame,
    int                        output_buffer_index,
    int                        acodec_serial,
    SDL_AMediaCodecBufferInfo *buffer_info)
{
    IJKFF_Pipenode_Opaque *opaque     = node->opaque;
    FFPlayer              *ffp        = opaque->ffp;
    VideoState            *is         = ffp->is;
    // Hold on to the output buffer index via a proxy
    frame->opaque = SDL_VoutAndroid_obtainBufferProxy(opaque->weak_vout, acodec_serial, output_buffer_index, buffer_info);
    if (!frame->opaque)
        goto fail;



    frame->width  = opaque->frame_width;
    frame->height = opaque->frame_height;
    frame->format = IJK_AV_PIX_FMT__ANDROID_MEDIACODEC;
    frame->sample_aspect_ratio = opaque->codecpar->sample_aspect_ratio;
    frame->pts    = av_rescale_q(buffer_info->presentationTimeUs, AV_TIME_BASE_Q, is->video_st->time_base);
    if (frame->pts < 0)
        frame->pts = AV_NOPTS_VALUE;
    // ALOGE("%s: %f", __func__, (float)frame->pts);

    *got_frame = 1;
    return 0;
fail:
    *got_frame = 0;
    return -1;
}
Adding the frame to the queue

The logic here is the same as in the software-decoding render path, so I will not repeat the shared code and only show the parts that differ.

SDL_VoutFillFrameYUVOverlay eventually calls into the func_fill_frame function below. As in the software rendering flow, the structure ultimately consumed is the SDL_VoutOverlay; here its format is set to SDL_FCC__AMC and the SDL_AMediaCodecBufferProxy for this frame is saved into it.

// Create the SDL_VoutOverlay
SDL_VoutOverlay *SDL_VoutAMediaCodec_CreateOverlay(int width, int height, SDL_Vout *vout)
{

    SDLTRACE("SDL_VoutAMediaCodec_CreateOverlay(w=%d, h=%d, fmt=_AMC vout=%p)\n",
        width, height, vout);
    SDL_VoutOverlay *overlay = SDL_VoutOverlay_CreateInternal(sizeof(SDL_VoutOverlay_Opaque));
    if (!overlay) {
        ALOGE("overlay allocation failed");
        return NULL;
    }






    SDL_VoutOverlay_Opaque *opaque = overlay->opaque;
    opaque->mutex         = SDL_CreateMutex();
    opaque->vout          = vout;
    opaque->acodec        = NULL;
    opaque->buffer_proxy  = NULL;


    overlay->opaque_class = &g_vout_overlay_amediacodec_class;
    overlay->format       = SDL_FCC__AMC;
    overlay->pitches      = opaque->pitches;
    overlay->pixels       = opaque->pixels;
    overlay->w            = width;
    overlay->h            = height;
    overlay->is_private   = 1;

    overlay->free_l       = overlay_free_l;
    overlay->lock         = overlay_lock;
    overlay->unlock       = overlay_unlock;
    overlay->unref        = overlay_unref;
    overlay->func_fill_frame = func_fill_frame;


    if (!opaque->mutex) {
        ALOGE("SDL_CreateMutex failed");
        goto fail;
    }


    return overlay;


fail:
    overlay_free_l(overlay);
    return NULL;
}


static int func_fill_frame(SDL_VoutOverlay *overlay, const AVFrame *frame)
{
    assert(frame->format == IJK_AV_PIX_FMT__ANDROID_MEDIACODEC);

    SDL_VoutOverlay_Opaque *opaque = overlay->opaque;




    if (!check_object(overlay, __func__))
        return -1;


    if (opaque->buffer_proxy)
        SDL_VoutAndroid_releaseBufferProxyP(opaque->vout, (SDL_AMediaCodecBufferProxy **)&opaque->buffer_proxy, false);


    opaque->acodec       = SDL_VoutAndroid_peekAMediaCodec(opaque->vout);
    // TODO: ref-count buffer_proxy?
    opaque->buffer_proxy = (SDL_AMediaCodecBufferProxy *)frame->opaque;


    overlay->opaque_class = &g_vout_overlay_amediacodec_class;
    overlay->format     = SDL_FCC__AMC;
    overlay->planes     = 1;
    overlay->pixels[0]  = NULL;
    overlay->pixels[1]  = NULL;
    overlay->pitches[0] = 0;
    overlay->pitches[1] = 0;
    overlay->is_private = 1;

    overlay->w = (int)frame->width;
    overlay->h = (int)frame->height;
    return 0;
}
Rendering with hardware decoding

Rendering goes through the same entry point as the software path, but takes a different branch. As mentioned above, for hardware rendering the overlay->format has been set to SDL_FCC__AMC.

static int func_display_overlay_l(SDL_Vout *vout, SDL_VoutOverlay *overlay)
{
    SDL_Vout_Opaque *opaque = vout->opaque;
    ANativeWindow *native_window = opaque->native_window;


    if (!native_window) {
        if (!opaque->null_native_window_warned) {
            opaque->null_native_window_warned = 1;
            ALOGW("func_display_overlay_l: NULL native_window");
        }
        return -1;
    } else {
        opaque->null_native_window_warned = 0;
    }



    if (!overlay) {
        ALOGE("func_display_overlay_l: NULL overlay");
        return -1;
    }

    if (overlay->w <= 0 || overlay->h <= 0) {
        ALOGE("func_display_overlay_l: invalid overlay dimensions(%d, %d)", overlay->w, overlay->h);
        return -1;
    }
    switch(overlay->format) {
    case SDL_FCC__AMC: {
        // only ANativeWindow support
        IJK_EGL_terminate(opaque->egl);
        // render == true draws the frame onto the Surface
        return SDL_VoutOverlayAMediaCodec_releaseFrame_l(overlay, NULL, true);
    }
    default:
        break;
    }
    return -1;
}

Here the buffer obtained from the decoder is released, which renders it onto the Surface. Why does releasing render? Because the call ends up in public final void releaseOutputBuffer(int index, boolean render): when the second argument is true, the buffer is first sent to the Surface, which is what gives us control over exactly when each frame is displayed.


/**
 * If you are done with a buffer, use this call to return the buffer to the codec
 * or to render it on the output surface. If you configured the codec with an
 * output surface, setting {@code render} to {@code true} will first send the buffer
 * to that output surface. The surface will release the buffer back to the codec once
 * it is no longer used/displayed.
 *
 * Once an output buffer is released to the codec, it MUST NOT
 * be used until it is later retrieved by {@link #getOutputBuffer} in response
 * to a {@link #dequeueOutputBuffer} return value or a
 * {@link Callback#onOutputBufferAvailable} callback.
 *
 * @param index The index of a client-owned output buffer previously returned
 *              from a call to {@link #dequeueOutputBuffer}.
 * @param render If a valid surface was specified when configuring the codec,
 *               passing true renders this output buffer to the surface.
 * @throws IllegalStateException if not in the Executing state.
 * @throws MediaCodec.CodecException upon codec error.
 */
public final void releaseOutputBuffer(int index, boolean render) {
    // (framework implementation)
}



int  SDL_VoutOverlayAMediaCodec_releaseFrame_l(SDL_VoutOverlay *overlay, SDL_AMediaCodec *acodec, bool render)
{
    if (!check_object(overlay, __func__))
        return -1;

    SDL_VoutOverlay_Opaque *opaque = overlay->opaque;
    return SDL_VoutAndroid_releaseBufferProxyP_l(opaque->vout, &opaque->buffer_proxy, render);
}


int SDL_VoutAndroid_releaseBufferProxyP_l(SDL_Vout *vout, SDL_AMediaCodecBufferProxy **proxy, bool render)
{
    int ret = 0;



    if (!proxy)
        return 0;


    ret = SDL_VoutAndroid_releaseBufferProxy_l(vout, *proxy, render);
    *proxy = NULL;
    return ret;
}



static int SDL_VoutAndroid_releaseBufferProxy_l(SDL_Vout *vout, SDL_AMediaCodecBufferProxy *proxy, bool render)
{
    SDL_Vout_Opaque *opaque = vout->opaque;




    if (!proxy)
        return 0;


    ISDL_Array__push_back(&opaque->overlay_pool, proxy);
    // Check whether the proxy still belongs to the current serial, i.e. whether a seek happened
    if (!SDL_AMediaCodec_isSameSerial(opaque->acodec, proxy->acodec_serial)) {
        ALOGW("%s: [%d] ???????? proxy %d: vout: %d idx: %d render: %s fake: %s",
            __func__,
            proxy->buffer_id,
            proxy->acodec_serial,
            SDL_AMediaCodec_getSerial(opaque->acodec),
            proxy->buffer_index, 
            render ? "true" : "false",
            (proxy->buffer_info.flags & AMEDIACODEC__BUFFER_FLAG_FAKE_FRAME) ? "YES" : "NO");
        return 0;
    }


    // Skip the release when buffer_index is invalid or this is a fake frame
    if (proxy->buffer_index < 0) {
        ALOGE("%s: [%d] invalid AMediaCodec buffer index %d\n", __func__, proxy->buffer_id, proxy->buffer_index);
        return 0;
    } else if (proxy->buffer_info.flags & AMEDIACODEC__BUFFER_FLAG_FAKE_FRAME) {
        proxy->buffer_index = -1;
        return 0;
    }
    // Call releaseOutputBuffer in the Java layer
    sdl_amedia_status_t amc_ret = SDL_AMediaCodec_releaseOutputBuffer(opaque->acodec, proxy->buffer_index, render);    
    if (amc_ret != SDL_AMEDIA_OK) {
        ALOGW("%s: [%d] !!!!!!!! proxy %d: vout: %d idx: %d render: %s, fake: %s",
            __func__,
            proxy->buffer_id,
            proxy->acodec_serial,
            SDL_AMediaCodec_getSerial(opaque->acodec),
            proxy->buffer_index, 
            render ? "true" : "false",
            (proxy->buffer_info.flags & AMEDIACODEC__BUFFER_FLAG_FAKE_FRAME) ? "YES" : "NO");
        proxy->buffer_index = -1;
        return -1;
    }
    proxy->buffer_index = -1;

    return 0;
}
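The guards above can be condensed into a single predicate. This is a hypothetical sketch (not an ijkplayer API) of when a proxy is actually allowed to reach releaseOutputBuffer:

```c
/* Hypothetical condensation of the guards in SDL_VoutAndroid_releaseBufferProxy_l:
 * a proxy may be handed to releaseOutputBuffer(..., render) only when it belongs
 * to the current codec serial (no seek/flush since it was dequeued), still holds
 * a valid buffer index, and is not a fake frame. */
typedef struct {
    int acodec_serial;  /* serial at the time the buffer was dequeued */
    int buffer_index;   /* MediaCodec output buffer index, -1 once released */
    int is_fake;        /* AMEDIACODEC__BUFFER_FLAG_FAKE_FRAME */
} buffer_proxy;

static int proxy_can_release(const buffer_proxy *p, int current_serial)
{
    if (p->acodec_serial != current_serial)
        return 0;  /* stale: a seek/flush bumped the serial */
    if (p->buffer_index < 0)
        return 0;  /* already released */
    if (p->is_fake)
        return 0;  /* never entered the real codec */
    return 1;
}
```

This is why a frame decoded before a seek is silently dropped instead of flashing on screen afterwards.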

At this point we have also walked through the hardware decoding and rendering flow on Android. A quick summary:

1. Read the video codec parameters, pick a suitable MediaCodec, configure and start the decoder; at this stage the AVCC sps/pps data also needs to be converted into Annex-B NALUs.

2. Create the NALU input thread, which keeps pulling AVPackets, converts their payload into Annex-B NALUs, dequeues a writable MediaCodec buffer, writes the NALU into it, and then submits the buffer to MediaCodec.

3. In the video decoding thread, keep dequeuing readable buffers from MediaCodec, store the buffer index in an AVFrame, and push the frame onto the video frame queue to await rendering.

4. In the render thread, keep taking frames off the queue to render; for hardware decoding, rendering means releasing the buffer held by the AVFrame: calling releaseOutputBuffer draws it onto the Surface bound to the MediaCodec.

The above is my personal understanding; corrections and alternative views are welcome.

References:

Android与IOS端h264硬解关键流程梳理 – 简书 (jianshu.com)

H.264 媒体流 AnnexB 和 AVCC 格式分析 及 FFmpeg 解析mp4的H.264码流方法-腾讯云开发者社区-腾讯云 (tencent.com)
