Prerequisite knowledge: the decoding flow :star:

Official documentation

  • Decoder state machine
    decoder-fsm

  • AVCodec decoding flow
    decode-process

Project structure

Related source repository: Hello-AVCodec

  • The file layout of this repository does not match what CMakeLists.txt expects, so building it as-is produces errors.
Directory structure:

```text
├─main
│ ├─cpp
│ │ ├─bounds_checking_function
│ │ │ ├─include
│ │ │ └─src
│ │ ├─common
│ │ │ └─dfx
│ │ ├─napi_wrapper
│ │ │ └─Sample
│ │ ├─sample_framework
│ │ │ └─test
│ │ │ └─unittest
│ │ │ └─video_test
│ │ │ └─video_test
│ │ │ ├─capbilities
│ │ │ │ ├─capbility_node
│ │ │ │ │ ├─data_producer
│ │ │ │ │ │ ├─bitstream_reader
│ │ │ │ │ │ ├─data_producer_base
│ │ │ │ │ │ ├─demuxer
│ │ │ │ │ │ └─raw_data_reader
│ │ │ │ │ └─video_codec
│ │ │ │ │ ├─video_codec
│ │ │ │ │ ├─video_decoder
│ │ │ │ │ └─video_encoder
│ │ │ │ ├─sample_buffer_queue
│ │ │ │ └─sample_window_manager
│ │ │ │ └─window_wrapper
│ │ │ │ └─include
│ │ │ ├─common
│ │ │ │ └─include
│ │ │ ├─demo
│ │ │ └─sample
│ │ │ ├─base
│ │ │ ├─decoder
│ │ │ ├─encoder
│ │ │ ├─helper
│ │ │ │ └─include
│ │ │ └─yuv_viewer
│ │ ├─types
│ │ │ └─libavcodecdemo
│ │ └─xcomponent_window_wrapper
│ ├─ets
│ │ ├─avcodecdemoability
│ │ ├─avcodecdemobackupability
│ │ └─pages
│ └─resources
│ ├─base
│ │ ├─element
│ │ ├─media
│ │ └─profile
│ └─dark
│ └─element
├─mock
├─ohosTest
│ └─ets
│ └─test
└─test
```
  • src/main/cpp/CMakeLists.txt:

```cmake
    # clone video codec demo code
    if(NOT EXISTS ${SAMPLE_FRAMEWORK_PATH})
    message(STATUS "sample_framework directory does not exist, do clone code")
    execute_process(COMMAND git config core.sparsecheckout true)
    execute_process(COMMAND git clone --no-checkout --filter=blob:none --sparse https://gitee.com/westyao/multimedia_av_codec.git ${SAMPLE_FRAMEWORK_PATH})
    file(WRITE ${SAMPLE_FRAMEWORK_PATH}/.git/info/sparse-checkout "test/unittest/video_test/video_test/\n")
    execute_process(COMMAND git -C ${SAMPLE_FRAMEWORK_PATH} checkout dev_vcd)
    endif()
```

    When the sample_framework directory does not exist, the code above sparse-clones the test code under test/unittest/video_test/video_test/ from the dev_vcd branch of the remote repository.


UI layer

  • src/main/ets/pages: UI pages
    • NavigationPage.ets: lays out two buttons in a Column
    • PlayerPage.ets: calls the NAPI functions implemented in C++ (declared in cpp/types/libavcodecdemo/index.d.ts; how the TS declarations map onto the C++ code is not something I have looked into)
      • Play(surfaceId, uri): cpp/napi_wrapper/napi_init.cpp
      • OnSurfaceCreated(surfaceId): cpp/xcomponent_window_wrapper/xcomponent_window_wrapper.cpp

Native layer: function call-chain analysis

1. Native-layer interface: the Play function

Play implements the playback logic asynchronously and ultimately returns a Promise that the JS layer can await.

```cpp
ret = napi_create_async_work(
    env,
    nullptr,       // no specific async resource object bound to the work
    work_name,     // work name (for debugging/logging)
    ExecutePlay,   // callback that actually performs the playback
    CompletePlay,  // callback run after playback finishes (settles the JS Promise)
    asyncData,     // context data passed through to the callbacks
    &asyncWork
);
```

!!! tip "Legend"
    - Green nodes are AVCodec Kit library functions.
    - Blue nodes illustrate inheritance relationships between classes.
    - VideoDecoderAPI{10/11}Buffer stands for VideoDecoderAPI10Buffer and VideoDecoderAPI11Buffer.
    - XYZ(x_y_z.cpp) means that XYZ is defined in the file x_y_z.cpp.

Call-chain diagram 1

In the diagram below, by the time ExecutePlay is called, SampleInfo (sample_info.h) has already been fixed as

```cpp
SampleInfo sampleInfo = {
.inputFilePath = asyncData->uri.c_str(),
.dataProducerInfo = {
.demuxerSourceType = isUri ? DEMUXER_SOURCE_TYPE_URI : DEMUXER_SOURCE_TYPE_FILE,
},
.codecConsumerType = CODEC_CONSUMER_TYPE_DECODER_RENDER_OUTPUT,
.surfaceId = asyncData->surfaceId
};
```

At this point the types constructed later are already determined:

  • DataProducer: Demuxer
  • DataConsumer: DECODER_RENDER_OUTPUT
  • SampleType: VIDEO_SAMPLE (default)
  • CodecType: VIDEO_HW_DECODER (default)
  • codecRunMode: API11_SURFACE (default)
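
In other words, every field that Play does not set falls back to its in-struct default. A rough sketch of the relevant slice of SampleInfo follows; the field names echo the list above, but the exact types, enumerators and defaults are assumptions about sample_info.h rather than a copy of it (only the 0b10 encoder bit is taken from the selection code quoted later on this page):

```cpp
#include <cstdint>
#include <string>

// Illustrative sketch only; the real definitions live in sample_info.h.
enum SampleType : uint8_t { VIDEO_SAMPLE, YUV_VIEWER };
enum CodecType : uint8_t { VIDEO_HW_DECODER = 0b00, VIDEO_HW_ENCODER = 0b10 };
enum CodecRunMode : uint8_t { API10_BUFFER, API10_SURFACE, API11_BUFFER, API11_SURFACE };

struct SampleInfo {
    SampleType sampleType = VIDEO_SAMPLE;        // default: video sample
    CodecType codecType = VIDEO_HW_DECODER;      // default: hardware video decoder
    CodecRunMode codecRunMode = API11_SURFACE;   // default: API 11 + surface output
    std::string inputFilePath;
    // ... dataProducerInfo, codecConsumerType, surfaceId, etc.
};
```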
```mermaid
graph TD
A["ExecutePlay(napi_init.cpp)"] -->|passes the SampleInfo| B["RunSample(sample_helper.cpp)"]
B --- |creates a Sample according to SampleType| C["SampleFactory::CreateSample(sample_base.cpp)"]
C --- |SampleType=VIDEO_SAMPLE| D("VideoSampleBase (2 subclasses)")
C --> |SampleType=YUV_VIEWER|E("YuvViewer")
D --> F["VSB.Init"]
D --> P["VSB.Start"]
D --> Q["SampleBase.WaitForSampleDone"]
E -->R1["YuvViewer.Init"]
E -->R2["YuvViewer.Start"]
E -->Q["SampleBase.WaitForSampleDone"]
```
  • CreateSample(SampleInfo) returns one of two kinds of sample depending on SampleInfo.SampleType: VIDEO_SAMPLE or YUV_VIEWER. VIDEO_SAMPLE itself comes in two flavors, VideoEncoderSample and VideoDecoderSample, selected by the CodecType field (here VideoDecoderSample); a hedged sketch of this factory follows the inheritance diagram below.

    • Inheritance relationship between the samples

```mermaid
      graph TD
      A["SampleBase(sample_base.cpp)"] -->B["VideoSampleBase(video_sample_base.cpp)"]
      B --> C["VideoEncoderSample"]
      B --> D["VideoDecoderSample"]
      A --> E["YuvViewer"]
      classDef inherit fill:#2196F3,stroke:#1976D2,stroke-width:2px;
      class A,B,C,D,E inherit
```
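
Based on that branching, a hedged reconstruction of the factory might look like the following. This is not a verbatim copy of sample_base.cpp; only the 0b10 encoder-mask test is quoted from the source shown later on this page:

```cpp
// Sketch of the factory dispatch; types are the demo's own classes.
std::shared_ptr<SampleBase> SampleFactory::CreateSample(const SampleInfo &info) {
    if (info.sampleType == YUV_VIEWER) {
        return std::static_pointer_cast<SampleBase>(std::make_shared<YuvViewer>());
    }
    // VIDEO_SAMPLE: pick encoder vs. decoder from the codecType field.
    return (info.codecType & 0b10) ?   // 0b10: video encoder mask
        std::static_pointer_cast<SampleBase>(std::make_shared<VideoEncoderSample>()) :
        std::static_pointer_cast<SampleBase>(std::make_shared<VideoDecoderSample>());
}
```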

Call-chain diagram 2: VSB.Init

```mermaid
graph TD
F["VSB.Init"]
F -->|creates and initializes the dataProducer according to sampleInfo.dataProducerInfo.dataProducerType| G("DataProducerFactory::CreateDataProducer(3 kinds)")
G -->|a Demuxer is created here| H["dataProducer->Init"]
F --> I["VSB/VDS.CreateWindow"]
I -->|VDS creates different windows depending on codecRunMode and codecConsumerType| Z["WindowManager::GetInstance().CreateWindowWrapper"]
F --> J["VSB.InitVideoCodec"]
F -->|initializes context->input/outputBufferQueue| O["VXS.CreateSampleBufferQueue"]
```

```mermaid
graph TD
J["VSB.InitVideoCodec"] -->|passes codecType and codecRunMode| P("VideoCodecFactory::CreateVideoCodec")
P ---|chooses the call based on codecType and codecRunMode, returns VXAPI(10/11)(Surface/Buffer)| P1["CreateVX"]
subgraph InitCalls [Called from InitVideoCodec]
P1 --- |creates the encoder/decoder instance| J1["VX->Create"]
P1 --- |configures the encoder/decoder| J2["VX->Configure"]
P1 --- |only for surface output| J3["VX->DealWithSurface"]
P1 --- |registers the callback function set OH_AVCodecCallback| J4["VXAPI{10/11}->SetCallback"]
P1 --- |prepares for data processing| J5["VX->Prepare"]
J1 --- |called after codecName is obtained via GetCodecName| K1["OH_VX_CreateByName"]
J2 --- |called after the OH_AVFormat has been filled in| K2["OH_VX_Configure"]
J3 --- K3["OH_VX_SetSurface"]
J4 --- |API10| K41["OH_VX_SetCallback"]
J4 --- |API11| K42["OH_VX_RegisterCallback"]
J5 --- K5["OH_VX_Prepare"]
end


classDef libfunc fill:#4CAF50,stroke:#388E3C,stroke-width:2px;

class K1 libfunc
class K2 libfunc
class K3 libfunc
class K41 libfunc
class K42 libfunc
class K5 libfunc
```

This continues from VSB.Init in call-chain diagram 1, where:

  • VSB: VideoSampleBase
    • holds a SampleContext member storing sampleInfo, windowWrapper, videoCodec, inputBufferQueue, outputBufferQueue
    • holds a dataProducer member
  • VES: VideoEncoderSample
  • VDS: VideoDecoderSample
  • VXS: VES or VDS
  • VCB: VideoCodecBase
  • VX: VideoEncoder or VideoDecoder
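
For the decoder path, the green AVCodec Kit calls in the diagram line up with the C API roughly as follows. This is a condensed, hand-written sketch of the API 11 surface path; error handling is dropped and the format keys are the standard ones, which may differ from what the demo actually sets:

```cpp
#include <multimedia/player_framework/native_avcodec_videodecoder.h>
#include <multimedia/player_framework/native_avformat.h>

// Sketch of the Create -> Configure -> SetSurface -> RegisterCallback -> Prepare sequence.
OH_AVCodec *CreateAndPrepareDecoder(const char *codecName, OHNativeWindow *window,
                                    int32_t width, int32_t height,
                                    OH_AVCodecCallback callback, void *userData) {
    OH_AVCodec *decoder = OH_VideoDecoder_CreateByName(codecName);   // VX->Create

    OH_AVFormat *format = OH_AVFormat_Create();                      // VX->Configure
    OH_AVFormat_SetIntValue(format, OH_MD_KEY_WIDTH, width);
    OH_AVFormat_SetIntValue(format, OH_MD_KEY_HEIGHT, height);
    OH_VideoDecoder_Configure(decoder, format);
    OH_AVFormat_Destroy(format);

    OH_VideoDecoder_SetSurface(decoder, window);                     // VX->DealWithSurface
    OH_VideoDecoder_RegisterCallback(decoder, callback, userData);   // API 11 callback set
    OH_VideoDecoder_Prepare(decoder);                                // VX->Prepare
    return decoder;
}
```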

Inheritance of DataProducer (left) and VideoCodec (right); in the newer code both derive from the CapbilityNodeBase class:

```mermaid
graph TD
B["DataProducerBase(data_producer_base.cpp)"]
B --> C["BitStreamReader"]
B --> D["Demuxer"]
B --> E["RawdataReader"]
F["VideoCodecBase(video_codec_base.cpp)"]
F --> G["VideoDecoder"]
F --> H["VideoEncoder"]
G --- I["VideoDecoderAPI{10/11}"]
I --- J["VideoDecoderAPI{10/11}Buffer"]
I --- K["VideoDecoderAPI{10/11}Surface"]
classDef inherit fill:#2196F3,stroke:#1976D2,stroke-width:2px;
class B,C,D,E,F,G,H,I,J,K inherit
```

!!! note "Note"
    The inheritance below VideoEncoder in the diagram above mirrors that of VideoDecoder.

Call-chain diagram 3: VSB.Start

This continues from VSB.Start in call-chain diagram 1.

```mermaid
graph TD
A["VSB.Start"] --> B["VX->Start"]
A -->|updates the picture info| C["VX->UpdatePictureInfo"]
A -->|starts the input/output threads| D["VXS.StartLoop"]
B --- E["OH_Video{En/De}coder_Start"]
C --- F["GetFormat"]
F --- F1["OH_VideoDecoder_GetOutputDescription"]
D --- |decoding| G1["VDS.StartLoop"]
D --- |encoding| G2["VES.StartLoop"]
G1 --- G11["VDS.InputThread"]
G1 --- G12["VDS.OutputThread"]
G2 --- G21["VES.{Surface/Buffer}InputThread"]
G2 --- G22["VES.OutputThread"]



classDef libfunc fill:#4CAF50,stroke:#388E3C,stroke-width:2px;
class E libfunc
class F1 libfunc
```
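
The two green calls above correspond to starting the codec and querying the negotiated output format. A minimal sketch of that pair of calls, assuming the standard OH_MD_KEY_* keys (the demo may read additional fields):

```cpp
#include <multimedia/player_framework/native_avcodec_videodecoder.h>
#include <multimedia/player_framework/native_avformat.h>

// Sketch of VX->Start followed by UpdatePictureInfo: start the decoder, then read
// the output description to learn the negotiated picture size.
void StartAndReadFormat(OH_AVCodec *decoder, int32_t &width, int32_t &height) {
    OH_VideoDecoder_Start(decoder);
    OH_AVFormat *desc = OH_VideoDecoder_GetOutputDescription(decoder);
    OH_AVFormat_GetIntValue(desc, OH_MD_KEY_WIDTH, &width);
    OH_AVFormat_GetIntValue(desc, OH_MD_KEY_HEIGHT, &height);
    OH_AVFormat_Destroy(desc);   // the caller owns the returned format
}
```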
  • 1. VideoDecoderSample::InputThread (a combined code sketch of both threads follows the second diagram below):
```mermaid
graph TD

A["VDS::InputThread"] --- B["AVCodecTrace::TraceBegin"]
A --- |enters the input loop (arrows inside show execution order)| C1["inputBufferQueue->DequeueBuffer"]
subgraph LoopBody [Loop body]
C1 --> C2["dataProducer->ReadSample"]
C2 --> C3["ThreadSleep(sample_utils.cpp)"]
C3 --> C4["VDAPI{10/11}->PushInput"]
C4 --> C1["inputBufferQueue->DequeueBuffer"]
C4 --- Z["OH_VideoDecoder_PushInputBuffer"]
end
A --- D["VSB.PushEosFrame"]
B --- B1["OH_HiTrace_StartAsyncTrace"]
D --- D1["VDAPI{10/11}->PushInput"]
D1 --- D2["OH_VideoDecoder_PushInputBuffer"]


classDef libfunc fill:#4CAF50,stroke:#388E3C,stroke-width:2px;
class Z libfunc
class B1 libfunc
class D2 libfunc
```
  • 2. VideoDecoderSample::OutputThread:
```mermaid
graph TD

A["VDS::OutputThread"] --- |进入输出轮转(内部为执行顺序)| C1["outputBufferQueue->DequeueBuffer"]
subgraph 循环体
C1 --> C2["DumpOutput"]
C2 --> C3["ThreadSleep(sample_utils.cpp)"]
C3 --> C4["VXAPI{10/11}{Surface/Buffer}->FreeOutput"]
C4 --> C1

end
C4 ---|API11Surface| Z41["OH_VD_RenderOutputBuffer"]
C4 ---|API11Buffer| Z42["OH_VD_FreeOutputBuffer"]
C4 ---|API10Surface| Z43["OH_VD_RenderOutputData (deprecated)"]
C4 ---|API10Buffer| Z44["OH_VD_FreeOutputData (deprecated)"]
A --- B["AVCodecTrace::TraceEnd"]
A --- D["AVCodecTrace::CounterTrace"]
A --- E["SampleBase.NotifySampleDone"]
B --- B1["OH_HiTrace_FinishAsyncTrace"]
D --- D1["OH_HiTrace_CountTrace"]

classDef libfunc fill:#4CAF50,stroke:#388E3C,stroke-width:2px;
class Z libfunc
class B1 libfunc
class D1 libfunc
class Z41 libfunc
class Z42 libfunc
class Z43 libfunc
class Z44 libfunc
```
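
Stripped of the queueing and tracing, the two worker threads drive the decoder roughly as follows. This is a simplified sketch of the API 11 path; the helper functions and their parameters are illustrative, not the demo's exact methods:

```cpp
#include <multimedia/player_framework/native_avcodec_videodecoder.h>
#include <multimedia/player_framework/native_avbuffer.h>

// Input side: after the data producer has filled the buffer, set its attributes and
// hand it back to the codec with OH_VideoDecoder_PushInputBuffer (VDAPI11->PushInput).
void PushOneInput(OH_AVCodec *decoder, uint32_t index, OH_AVBuffer *buffer,
                  int32_t size, int64_t pts, bool eos) {
    OH_AVCodecBufferAttr attr{};
    attr.pts = pts;
    attr.size = size;
    attr.offset = 0;
    attr.flags = eos ? AVCODEC_BUFFER_FLAGS_EOS : AVCODEC_BUFFER_FLAGS_NONE;
    OH_AVBuffer_SetBufferAttr(buffer, &attr);
    OH_VideoDecoder_PushInputBuffer(decoder, index);
}

// Output side: on the API 11 surface path, rendering and releasing are a single call;
// on the buffer path the frame is simply returned to the codec.
void ReleaseOneOutput(OH_AVCodec *decoder, uint32_t index, bool renderToSurface) {
    if (renderToSurface) {
        OH_VideoDecoder_RenderOutputBuffer(decoder, index);  // render + free (API 11 surface)
    } else {
        OH_VideoDecoder_FreeOutputBuffer(decoder, index);    // free only (API 11 buffer)
    }
}
```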

YuvViewer call chain

Its core job is to render raw YUV data held in memory into a visible image through a graphics window.

```mermaid
graph TD
A["YuvViewer"] --- B["YuvViewer.Init"]
A --- C["YuvViewer.Start"]
A --- D["YuvViewer.WaitForSampleDone"]
B --- E["DataProducerFactory::CreateDataProducer(3 kinds)"]
E --- F["dataProducer.Init"]
B --- G["CreateWindow"]
C --- H["InputThread"]
```
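
Inside InputThread, "rendering" a YUV frame amounts to requesting a buffer from the OHNativeWindow, copying the frame data in, and flushing it. A heavily simplified sketch follows, assuming the buffer geometry/format was configured beforehand (for example via OH_NativeWindow_NativeWindowHandleOpt with SET_BUFFER_GEOMETRY); real code must map the buffer handle and honor its stride:

```cpp
#include <native_window/external_window.h>
#include <cstring>

// Sketch: copy one raw frame into a window-provided buffer and flush it for display.
bool RenderYuvFrame(OHNativeWindow *window, const uint8_t *frame, size_t frameSize) {
    OHNativeWindowBuffer *buffer = nullptr;
    int fenceFd = -1;
    if (OH_NativeWindow_NativeWindowRequestBuffer(window, &buffer, &fenceFd) != 0) {
        return false;
    }
    BufferHandle *handle = OH_NativeWindow_GetBufferHandleFromNative(buffer);
    // Simplification: real code maps the handle and copies row by row respecting stride.
    std::memcpy(handle->virAddr, frame, frameSize);
    Region region{};   // empty region: flush the whole buffer
    return OH_NativeWindow_NativeWindowFlushBuffer(window, buffer, fenceFd, region) == 0;
}
```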

Syntax notes

!!! info "static_pointer_cast"
    The code that selects the video sample is as follows:

```cpp
sample = (info.codecType & 0b10) ? // 0b10: Video encoder mask
std::static_pointer_cast<SampleBase>(std::make_shared<VideoEncoderSample>()) :
std::static_pointer_cast<SampleBase>(std::make_shared<VideoDecoderSample>());
```
`static_pointer_cast` converts only between smart pointers (such as shared_ptr) and is meant for conversions known to be legal at compile time, including up-casts and down-casts within an inheritance hierarchy as well as compatible non-inheritance conversions. For down-casts, however, the programmer must ensure that the target type matches the actual object type.
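
A minimal stand-alone illustration of the up-cast/down-cast distinction, unrelated to the demo's own classes:

```cpp
#include <memory>

struct Base { virtual ~Base() = default; };
struct Derived : Base { int value = 42; };

int main() {
    auto derived = std::make_shared<Derived>();
    // Up-cast: always safe; it would also happen implicitly.
    std::shared_ptr<Base> base = std::static_pointer_cast<Base>(derived);
    // Down-cast: no runtime check; the caller must know the object really is a Derived.
    // If the dynamic type were uncertain, std::dynamic_pointer_cast would return
    // nullptr on mismatch instead of producing undefined behavior.
    auto back = std::static_pointer_cast<Derived>(base);
    return back->value == 42 ? 0 : 1;
}
```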

2. Native-layer interface: OnSurfaceCreated

  • OnSurfaceCreated binds the Surface to a native window (NativeWindow) and registers that window's wrapper class (WindowWrapper) with the window manager (WindowManager), so that subsequent graphics and UI operations can be carried out on this window.
    The NAPI is defined in cpp/napi_wrapper/sample/sample_napi_wrapper.cpp.
    The parsing function is SampleNapiWrapper::OnSurfaceCreated(napi_env env, napi_callback_info info), which ultimately calls OHOS::MediaAVCodec::Sample::XComponentWindowWrapper::OnSurfaceCreated(surfaceId).

  • OnSurfaceCreated call chain

```mermaid
graph TD
A["OnSurfaceCreated"] ---|传入surfaceId| B["XComponentWindowWrapper::OnSurfaceCreated"]
B ---|根据surfaceId创建对应的nativeWindow实例| C["OH_NativeWindow_CreateNativeWindowFromSurfaceId"]
B --- |注册窗口包装类|D["WindowManager::GetInstance().RegisterWindowWrapper"]

classDef libfunc fill:#4CAF50,stroke:#388E3C,stroke-width:2px;
class C libfunc
```
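
Condensed into code, the chain above looks roughly like this. Only OH_NativeWindow_CreateNativeWindowFromSurfaceId is a system API here; WindowWrapper/WindowManager are the demo's own classes, and parsing the id from a string is an assumption about how the value arrives from ArkTS:

```cpp
#include <native_window/external_window.h>
#include <string>

// Sketch of XComponentWindowWrapper::OnSurfaceCreated; error handling is minimal.
int32_t OnSurfaceCreatedSketch(const std::string &surfaceIdStr) {
    uint64_t surfaceId = std::stoull(surfaceIdStr);   // the id arrives as a string from ArkTS
    OHNativeWindow *nativeWindow = nullptr;
    int32_t ret = OH_NativeWindow_CreateNativeWindowFromSurfaceId(surfaceId, &nativeWindow);
    if (ret != 0 || nativeWindow == nullptr) {
        return ret;
    }
    // Wrap the window and register it so a later Play() call can render into it, e.g.:
    // WindowManager::GetInstance().RegisterWindowWrapper(...);  // demo class, illustrative
    return 0;
}
```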

WindowWrapper inheritance

```mermaid
graph TD
A["WindowWrapper"] --- B["NativeImageWindowWrapper"]
A --- C["RosenWindowWrapper"]
A --- D["XComponentWindowWrapper"]
classDef inherit fill:#2196F3,stroke:#1976D2,stroke-width:2px;
class A,B,C,D inherit
```