[Repost] MediaCodec Example 05: RTMP from 0 to 1 in Practice


What RTMP Is

  • RTMP stands for Real-Time Messaging Protocol.
  • It is a data transport protocol that defines how packets are encapsulated.
  • It transports data over TCP.

Why RTMP

  • In theory we could encapsulate the data with any protocol, as long as a matching de-encapsulation step exists on the other end.
  • RTMP already has mature tooling around it; for example, RTMPDump helps the client send data, handling connection setup and reconnection, queueing, packet encapsulation, and memory allocation/release.
  • Running over TCP guarantees reliable delivery, even though strict reliability is not a hard requirement for video transport.
  • RTMPS is available when the data needs transport security.
  • RTMFP is available as a more efficient UDP-based alternative to TCP.

In short: it is a transport protocol that offers plenty of generic, well-iterated capabilities. We choose it to save effort.

RTMP Packet Format

Video packet format

(figure missing from the source page)

Audio packet format

(figure missing from the source page)

rtmpdump library address

rtmpDump

RTMP in Practice

We capture frames from the local camera and audio from the microphone, encode them with MediaCodec to produce H.264 (and AAC) data, then use the interfaces provided by rtmpdump to connect to the server and send RTMP packets, which finally show up in a player.
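
At a glance, the pipeline looks like this:

Camera (NV21) ──► CameraEncoder (MediaCodec, H.264) ──┐
                                                      ├──► LinkedBlockingQueue&lt;RTMPPackage&gt; ──► PackageSender ──► JNI ──► librtmp ──► RTMP server ──► player
Mic (PCM) ───────► AudioEncoder (MediaCodec, AAC) ────┘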

1. Camera capture

Flow

  1. Open the camera.
  2. Set the camera's preview Surface.
  3. Add a preview callback.
  4. Start the preview.
  5. Hand each preview buffer from the callback to the codec for encoding.

public class LocalSurfaceView extends SurfaceView implements SurfaceHolder.Callback, Camera.PreviewCallback {
    private Camera mCamera;
    private Camera.Size size;
    byte[] buffer;
    private OnPreview onPreviewCallback;

    public LocalSurfaceView(Context context) {
        super(context);
    }

    public LocalSurfaceView(Context context, AttributeSet attrs) {
        super(context, attrs);
        getHolder().addCallback(this);

    }

    public void addOnPreviewCallback(OnPreview callback){
        this.onPreviewCallback = callback;
    }

    @Override
    public void onPreviewFrame(byte[] data, Camera camera) {
        if (onPreviewCallback != null){
            onPreviewCallback.onPreviewFrame(data,size);
        }

        mCamera.addCallbackBuffer(data);
    }

    @Override
    public void surfaceCreated(@NonNull SurfaceHolder surfaceHolder) {
        startPreview();
    }

    @Override
    public void surfaceChanged(@NonNull SurfaceHolder surfaceHolder, int i, int i1, int i2) {

    }

    @Override
    public void surfaceDestroyed(@NonNull SurfaceHolder surfaceHolder) {

    }

    private void startPreview() {
        mCamera = Camera.open(Camera.CameraInfo.CAMERA_FACING_BACK);
        size = mCamera.getParameters().getPreviewSize();
        try {
            mCamera.setPreviewDisplay(getHolder());
        } catch (IOException e) {
            e.printStackTrace();
        }
        // NV21 preview buffer: width * height * 3 / 2 bytes (see the notes below)
        buffer = new byte[size.width * size.height * 3/2];
        mCamera.addCallbackBuffer(buffer);

        mCamera.setDisplayOrientation(90);
        // use the with-buffer variant so addCallbackBuffer() actually recycles our buffer
        mCamera.setPreviewCallbackWithBuffer(this);
        mCamera.startPreview();
    }

    public void removePreviewCallback() {
        this.onPreviewCallback = null;
    }

    interface OnPreview{
        void onPreviewFrame(byte[] data, Camera.Size size);
    }
}

Notes

buffer = new byte[size.width * size.height * 3/2];

The camera outputs NV21, a YUV 4:2:0 format. Its size is: Y = width * height, U = width * height / 4, V = width * height / 4, so YUV = width * height * 3/2 in total. At 1280x720, for example, that is 921600 + 230400 + 230400 = 1382400 bytes.

mCamera.setDisplayOrientation(90);

The back camera sensor is mounted in landscape. This call only affects how the preview is drawn in the local SurfaceView; it does not change the orientation of the transmitted data.

2. Video encoding

H264Encoder.java

This is a base class that provides the standard encoding loop; subclasses can override it to define their own encoder parameters. Encoder configuration generally specifies the frame rate, the I-frame interval, the bit rate, and the input data format.

public class H264Encoder extends Thread{
    protected int width;
    protected int height;
    protected long startTime = 0;
    protected boolean isLiving = false;
    protected long lastKeyFrameTime ;
    protected MediaCodec mediaCodec;
    protected LinkedBlockingQueue<RTMPPackage> queue;
    protected MediaCodec.BufferInfo info = new MediaCodec.BufferInfo();

    public H264Encoder(LinkedBlockingQueue<RTMPPackage> queue) {
        this.width = 640;
        this.height = 1920;
        this.queue = queue;
    }

    // frame rate, I-frame interval, bit rate, data format
    protected void configEncodeCodec(){
        MediaFormat mediaFormat = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, width, height);
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE , 20);
        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 30);
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, width * height);
        // the input data format here is provided by a Surface
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT , MediaCodecInfo.CodecCapabilities.COLOR_FormatSurface);

        try{
            mediaCodec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
            mediaCodec.configure(mediaFormat , null , null , MediaCodec.CONFIGURE_FLAG_ENCODE);
        }catch (Exception e){
            e.printStackTrace();
        }
    }

    @Override
    public void run() {
        super.run();
        isLiving = true;
        if (mediaCodec == null){
            configEncodeCodec();
        }
        mediaCodec.start();

        while (isLiving) {
            input();
            output();
        }
        Log.d("hch", "live stopped");
    }

    protected void input() {

    }

    protected void output(){
        // force a sync (I) frame if none has been requested for 2 seconds
        if (System.currentTimeMillis() - lastKeyFrameTime > 2000){
            Bundle b = new Bundle();
            b.putInt(MediaCodec.PARAMETER_KEY_REQUEST_SYNC_FRAME , 0);
            mediaCodec.setParameters(b);
            lastKeyFrameTime = System.currentTimeMillis();
        }

        try {
            int outIndex = mediaCodec.dequeueOutputBuffer(info, 10000);
            if (outIndex >= 0) {
                if (startTime == 0){
                    startTime = info.presentationTimeUs / 1000;
                }
                // allocate only once we know info.size describes this buffer
                byte[] bytes = new byte[info.size];
                ByteBuffer byteBuffer = mediaCodec.getOutputBuffer(outIndex);
                byteBuffer.get(bytes);
                queue.offer(new RTMPPackage(bytes, info.presentationTimeUs / 1000 - startTime).setType(RTMPPackage.TYPE_VIDEO));
                mediaCodec.releaseOutputBuffer(outIndex, false);
            }
        } catch (Exception e) {
            e.printStackTrace();
        }
    }


    public void stopLive() {
        isLiving = false;
        // stop() must come before release(); calling into a released codec throws
        mediaCodec.stop();
        mediaCodec.release();
    }
}
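
RTMPPackage appears throughout the code but is never listed in the article. Below is a minimal sketch consistent with how it is used here: a byte buffer plus a relative timestamp, a chainable setType, and TYPE_* constants that mirror the native TYPE_VIDEO / TYPE_AUDIO / TYPE_HEADER values. The exact field layout is an assumption.

public class RTMPPackage {
    // mirror the constants on the native side (0 / 1 / 2)
    public static final int TYPE_VIDEO = 0;
    public static final int TYPE_AUDIO = 1;
    public static final int TYPE_HEADER = 2;

    private final byte[] buffer;
    private final long tms; // relative timestamp in ms
    private int type;

    public RTMPPackage(byte[] buffer, long tms) {
        this.buffer = buffer;
        this.tms = tms;
    }

    public RTMPPackage setType(int type) {
        this.type = type;
        return this; // chainable, as the encoders use it
    }

    public byte[] getBuffer() { return buffer; }
    public long getTms() { return tms; }
    public int getType() { return type; }
}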

CameraEncoder.java

It extends H264Encoder and handles the input data and the encoder parameters.

public class CameraEncoder extends H264Encoder{

    private LinkedBlockingQueue<byte[]> inputQueue = new LinkedBlockingQueue<byte[]>();
    byte[] yuv;

    public CameraEncoder(LinkedBlockingQueue<RTMPPackage> queue, int width, int height) {
        super(queue);
        this.width = width;
        this.height = height;
        yuv = new byte[width * height * 3 / 2];
        Log.d("hch" , "size = width "+width +" height " + height);
    }

    public void input(byte[] data) {
        if (isLiving){
            Log.d("hch" , "start input data to queue:  " + data.length);
            inputQueue.offer(data);
        }
    }

    protected void configEncodeCodec(){
        // width/height are swapped: the frame is rotated to portrait before encoding
        MediaFormat mediaFormat = MediaFormat.createVideoFormat(MediaFormat.MIMETYPE_VIDEO_AVC, height, width);
        mediaFormat.setInteger(MediaFormat.KEY_FRAME_RATE , 20);
        mediaFormat.setInteger(MediaFormat.KEY_I_FRAME_INTERVAL, 30);
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, width * height);
        // raw YUV420 input (we feed NV12 buffers converted from the camera's NV21)
        mediaFormat.setInteger(MediaFormat.KEY_COLOR_FORMAT , MediaCodecInfo.CodecCapabilities.COLOR_FormatYUV420Flexible);

        try{
            mediaCodec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_VIDEO_AVC);
            mediaCodec.configure(mediaFormat , null , null , MediaCodec.CONFIGURE_FLAG_ENCODE);
        }catch (Exception e){
            e.printStackTrace();
        }


    }

    @Override
    protected void input() {
        // NV21 straight from the camera can't be fed to the codec directly,
        // or the picture comes out corrupted: convert to NV12 and rotate first
        byte[] buffer = inputQueue.poll();
        if (buffer != null && buffer.length > 0){
            Log.d("hch" , "input camera buffer len = " + buffer.length);
            byte[] nv12 = YUVUtil.nv21toNV12(buffer);
            YUVUtil.portraitData2Raw(nv12, yuv, width, height);

            int inputIndex = mediaCodec.dequeueInputBuffer(1000);
            if (inputIndex >= 0){
                ByteBuffer byteBuffer = mediaCodec.getInputBuffer(inputIndex);
                byteBuffer.put(yuv , 0 , yuv.length);
                mediaCodec.queueInputBuffer(inputIndex , 0 , yuv.length , System.nanoTime()/1000 , 0);
            }
        }
    }
}
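
YUVUtil is referenced above but not listed in the article. Here is a minimal sketch of the two helpers as they are used here, assuming nv21toNV12 swaps the interleaved chroma bytes and portraitData2Raw rotates the NV12 frame 90 degrees clockwise into the portrait buffer; this is one common implementation, not necessarily the author's.

public class YUVUtil {

    // NV21 (Y plane + interleaved V/U) -> NV12 (Y plane + interleaved U/V):
    // copy the Y plane, then swap each V/U pair.
    public static byte[] nv21toNV12(byte[] nv21) {
        byte[] nv12 = new byte[nv21.length];
        int ySize = nv21.length * 2 / 3;
        System.arraycopy(nv21, 0, nv12, 0, ySize);
        for (int i = ySize; i < nv21.length - 1; i += 2) {
            nv12[i] = nv21[i + 1];     // U
            nv12[i + 1] = nv21[i];     // V
        }
        return nv12;
    }

    // Rotate an NV12 frame 90 degrees clockwise: landscape width x height in,
    // portrait height x width out, so the encoded video is upright.
    public static void portraitData2Raw(byte[] src, byte[] dst, int width, int height) {
        int ySize = width * height;
        int k = 0;
        // Y plane: walk each source column bottom-up
        for (int x = 0; x < width; x++) {
            for (int y = height - 1; y >= 0; y--) {
                dst[k++] = src[y * width + x];
            }
        }
        // chroma plane: same walk over interleaved U/V pairs
        int uvHeight = height / 2;
        for (int x = 0; x < width; x += 2) {
            for (int y = uvHeight - 1; y >= 0; y--) {
                dst[k++] = src[ySize + y * width + x];      // U
                dst[k++] = src[ySize + y * width + x + 1];  // V
            }
        }
    }
}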

3. Audio encoding

public class AudioEncoder extends Thread {
    private final LinkedBlockingQueue<RTMPPackage> queue;
    MediaCodec mediaCodec;
    private AudioRecord audioRecord;
    int minBufferSize;
    long startTime = 0;

    private boolean isLiving = false;


    public AudioEncoder(LinkedBlockingQueue<RTMPPackage> queue) {
        this.queue = queue;
        configEncodeCodec();
    }

    @SuppressLint("MissingPermission")
    private void configEncodeCodec() {

        // capture parameters: 44.1 kHz sample rate; bilibili only supports mono
        MediaFormat mediaFormat = MediaFormat.createAudioFormat(MediaFormat.MIMETYPE_AUDIO_AAC, 44100, 1);
        // AAC profile key: tied to the quality of the audio being transported. Higher-quality
        // audio needs a higher profile, and main is a common choice.
        // Too low a profile can lead to stuttering audio.
        mediaFormat.setInteger(MediaFormat.KEY_AAC_PROFILE, MediaCodecInfo.CodecProfileLevel.AACObjectMain);
        // bit rate: 128k or 64k
        mediaFormat.setInteger(MediaFormat.KEY_BIT_RATE, 128_000);

        try {
            mediaCodec = MediaCodec.createEncoderByType(MediaFormat.MIMETYPE_AUDIO_AAC);
            mediaCodec.configure(mediaFormat, null, null, MediaCodec.CONFIGURE_FLAG_ENCODE);

            // the three audio quantization parameters: 1. sample size (bits per sample),
            // 2. sample rate (samples per second), 3. channel count (mono or stereo)
            minBufferSize = AudioRecord.getMinBufferSize(44100, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT);
            audioRecord = new AudioRecord(MediaRecorder.AudioSource.MIC, 44100, AudioFormat.CHANNEL_IN_MONO,
                    AudioFormat.ENCODING_PCM_16BIT, minBufferSize);
        }catch (Exception e){
            e.printStackTrace();
        }
    }


    @Override
    public void run() {
        super.run();
        isLiving = true;
        mediaCodec.start();
        audioRecord.startRecording();
        MediaCodec.BufferInfo bufferInfo = new MediaCodec.BufferInfo();

        // tell the server we're ready for audio: this is the encoding info audio streaming needs.
        // {0x12, 0x08} is the AAC AudioSpecificConfig: object type 2 (AAC LC),
        // frequency index 4 (44100 Hz), channel config 1 (mono)
        byte[] audioDecoderSpecificInfo = {0x12,0x08};
        RTMPPackage rtmpPackageHeader = new RTMPPackage(audioDecoderSpecificInfo,0).setType(RTMPPackage.TYPE_HEADER);
        queue.offer(rtmpPackageHeader);
        byte[] buffer = new byte[minBufferSize];

        while (isLiving){
            // read PCM data
            int len = audioRecord.read(buffer , 0 , buffer.length);
            if (len <= 0){
                continue;
            }

            Log.d("hch" , "read audio buffer len = " + len);

            int inputIndex = mediaCodec.dequeueInputBuffer(1000);
            if (inputIndex >= 0){
                ByteBuffer byteBuffer = mediaCodec.getInputBuffer(inputIndex);
                byteBuffer.put(buffer , 0 , len);
                mediaCodec.queueInputBuffer(inputIndex , 0 , len , System.nanoTime()/1000 , 0);
            }

            int outputIndex = mediaCodec.dequeueOutputBuffer(bufferInfo , 1000);
            // One block of input does not necessarily come out as one block of output.
            // For example, in the encoding pipeline a B-frame passes through the source and
            // compound coders into the transmission buffer, and is only emitted once the
            // following P-frame has been obtained.
            while (outputIndex >= 0 && isLiving){
                ByteBuffer byteBuffer = mediaCodec.getOutputBuffer(outputIndex);
                byte[] encodeAAC = new byte[bufferInfo.size];
                byteBuffer.get(encodeAAC);

                // Use relative timestamps: the server expects them, and the RTMP packet
                // definition carries a flag saying whether the timestamp is relative.
                if (startTime == 0){
                    startTime = bufferInfo.presentationTimeUs / 1000;
                }
                RTMPPackage rtmpPackage = new RTMPPackage(encodeAAC , bufferInfo.presentationTimeUs / 1000 - startTime).setType(RTMPPackage.TYPE_AUDIO);
                queue.offer(rtmpPackage);
                mediaCodec.releaseOutputBuffer(outputIndex ,false);

                // one frame of PCM input may encode into several AAC frames
                outputIndex = mediaCodec.dequeueOutputBuffer(bufferInfo, 0);
            }
            }
        }
    }

    public void stopLive() {
        isLiving = false;
        mediaCodec.stop();
        audioRecord.stop();
        mediaCodec.release();
        audioRecord.release();
    }
}
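
As an aside, the two header bytes need not stay magic numbers. A small hedged helper (name and placement hypothetical) that derives them, yielding the same {0x12, 0x08} for object type 2, frequency index 4, 1 channel. Note that {0x12, 0x08} declares AAC LC even though KEY_AAC_PROFILE above requests AACObjectMain; if playback misbehaves, aligning the two is worth checking.

// Builds the 2-byte AAC AudioSpecificConfig: 5 bits object type,
// 4 bits sampling-frequency index, 4 bits channel config, 3 bits padding.
public static byte[] makeAudioSpecificConfig(int objectType, int freqIndex, int channels) {
    byte[] asc = new byte[2];
    asc[0] = (byte) ((objectType << 3) | (freqIndex >> 1));
    asc[1] = (byte) (((freqIndex & 1) << 7) | (channels << 3));
    return asc;
}
// makeAudioSpecificConfig(2, 4, 1) -> {0x12, 0x08}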

4. Sending the data

PackageSender.java

public class PackageSender extends Thread{
    LinkedBlockingQueue<RTMPPackage> queue;
    private boolean isLiving = false;

    PackageSender(LinkedBlockingQueue<RTMPPackage> queue ){
        this.queue = queue;
        setName("package-sender");
    }

    @Override
    public void run() {
        super.run();
        isLiving = true;
        try{
            while (isLiving){
                // poll with a timeout so the loop doesn't spin on an empty queue
                RTMPPackage rtmpPackage = queue.poll(200, java.util.concurrent.TimeUnit.MILLISECONDS);
                if (rtmpPackage != null) {
                    sendData(rtmpPackage.getBuffer(), rtmpPackage.getBuffer().length, rtmpPackage.getTms(),
                            rtmpPackage.getType());
                }
            }
        }catch (Exception e){
            e.printStackTrace();
        }
    }


    public native int sendData(byte[] data , int len , long timestamp , int type);

    public void stopLive() {
        isLiving = false;
        queue.clear();
    }
}

5. RTMP initialization and connection

extern "C"
JNIEXPORT jboolean JNICALL
Java_com_hch_bilibili_MainActivity_connectRtmp(JNIEnv *env, jobject thiz, jstring url_) {
    const char * url = env->GetStringUTFChars(url_, 0);
    int ret;
    // allocate and initialize the RTMP session
    rtmp = RTMP_Alloc();
    RTMP_Init(rtmp);
    rtmp->Link.timeout = 10;
    ret = RTMP_SetupURL(rtmp, (char*)url);
    if (ret == TRUE) {
        LOGI("RTMP_SetupURL");
    }
    // mark the session as a publishing (write) session before connecting
    RTMP_EnableWrite(rtmp);
    LOGI("RTMP_EnableWrite");

    ret = RTMP_Connect(rtmp, 0);
    if (ret == TRUE) {
        LOGI("RTMP_Connect ");
    }
    ret = RTMP_ConnectStream(rtmp, 0);
    if (ret == TRUE) {
        LOGI("connect success");
    }
    }
    env->ReleaseStringUTFChars(url_, url);
    return ret;
}
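
On the Java side this function maps to a native declaration in com.hch.bilibili.MainActivity (per the JNI symbol name). Here is a hedged sketch of how the pieces might be wired together; the library name, the activity internals, and the RTMP URL are assumptions, not from the article:

public class MainActivity extends AppCompatActivity {
    static {
        System.loadLibrary("native-lib"); // hypothetical CMake target name
    }

    // matches Java_com_hch_bilibili_MainActivity_connectRtmp above
    public native boolean connectRtmp(String url);

    private void startLive(LocalSurfaceView surfaceView, String url) {
        LinkedBlockingQueue<RTMPPackage> queue = new LinkedBlockingQueue<>();
        // width/height should match the camera's preview size
        CameraEncoder videoEncoder = new CameraEncoder(queue, 640, 480);
        AudioEncoder audioEncoder = new AudioEncoder(queue);
        PackageSender sender = new PackageSender(queue);

        // every camera preview frame goes to the video encoder
        surfaceView.addOnPreviewCallback((data, size) -> videoEncoder.input(data));

        // connect off the main thread, then start the pipeline
        new Thread(() -> {
            if (connectRtmp(url)) { // e.g. "rtmp://your.server/live/streamKey" (placeholder)
                videoEncoder.start();
                audioEncoder.start();
                sender.start();
            }
        }).start();
    }
}

Shutdown would be symmetric: call stopLive() on each component and removePreviewCallback() on the view.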

6. SPS/PPS packaging

#include <jni.h>
#include <string>
extern "C"
{
    #include  "librtmp/rtmp.h"
}
#include <android/log.h>
#define LOGI(...) __android_log_print(ANDROID_LOG_INFO,"hch===>",__VA_ARGS__)
#define LOGE(...) __android_log_print(ANDROID_LOG_ERROR,"hch===>",__VA_ARGS__)

typedef struct {
    int16_t sps_len;
    int16_t pps_len;
    int8_t *sps;
    int8_t *pps;
}LiveSpspps;

LiveSpspps * spspps = nullptr;
int ppsFlagLen = 4;
int spsFLagLen = 4;
RTMP * rtmp;

const int  TYPE_VIDEO = 0;
const int  TYPE_AUDIO = 1;
const int  TYPE_HEADER = 2;

RTMPPacket *createSPSPPSPacket();
RTMPPacket *createVideoPacket(int type, int8_t *bytes, int len, long timems);
int sendPacket(RTMPPacket *packet);
int sendVideo(int8_t *bytes, int len, long timems);

int sendAudio(jbyte *bytes, jint len, jlong timestamp, jint i);

// returns the index of the SPS NALU header byte (0x67); only the 0x00000001 or 0x000001 start code is skipped
int16_t spsStartIndex(int8_t *bytes){
    int start = 0;
    if ((bytes[start] == 0x00 && bytes[start+1] == 0x00 && bytes[start+2] == 0x00 && bytes[start+3] == 0x01 &&
         (bytes[start+4] & 0x1f) == 7)){
        spsFLagLen = start = 4;
    }else if ((bytes[start] == 0x00 && bytes[start+1] == 0x00 && bytes[start+2] == 0x01&&
                   (bytes[start+3] & 0x1f) == 7) ){
        spsFLagLen = start = 3;
    }
    return start;
}

// returns the index of the PPS NALU header byte (0x68), skipping the 0x00000001 or 0x000001 start code; -1 if not found
int ppsStartIndex(int spsStartIndex , int8_t *bytes , int len){
    int start = -1; // -1 signals "PPS not found" to the caller
    for (int i = spsStartIndex; i + 5 < len; i++) {
        if ((bytes[i] == 0x00 && bytes[i+1] == 0x00 && bytes[i+2] == 0x00 && bytes[i+3] == 0x01 &&
             (bytes[i+4] & 0x1f) == 8)){
            ppsFlagLen = 4;
            start = i + ppsFlagLen;
        }else if ((bytes[i] == 0x00 && bytes[i+1] == 0x00 && bytes[i+2] == 0x01&&
                   (bytes[i+3] & 0x1f) == 8) ){
            ppsFlagLen = 3;
            start = i + ppsFlagLen;
        }
    }
    return start;
}

int prepareVideo(int8_t *bytes , int len){
    if (spspps == nullptr || spspps->pps == nullptr || spspps->sps == nullptr){
        // calloc so sps/pps start out null; malloc would leave them uninitialized
        spspps = (LiveSpspps *)calloc(1 , sizeof(LiveSpspps));
        // find sps
        int spsStart = spsStartIndex(bytes);
        if (spsStart != 0){
            LOGI("find sps success spsStart=%d" , spsStart);
            int ppsStart = ppsStartIndex(spsStart , bytes , len);
            if (ppsStart == -1){
                LOGI("find pps failed");
                return FALSE;
            }
            spspps->sps_len = ppsStart - spsStart - ppsFlagLen;
            spspps->sps = static_cast<int8_t *>(malloc(spspps->sps_len));
            memcpy(spspps->sps , bytes + spsStart , spspps->sps_len);

            spspps->pps_len = len - ppsStart;
            spspps->pps = static_cast<int8_t *>(malloc(spspps->pps_len));
            memcpy(spspps->pps , bytes + ppsStart , spspps->pps_len);

            LOGI("sps pps ready -----> spsStart=%d , ppsStart=%d , spsFlagLen=%d , ppsFLagLen=%d , spsLen=%d , "
                 "ppsLen=%d" , spsStart,
                 ppsStart,spsFLagLen,ppsFlagLen,spspps->sps_len,spspps->pps_len);
            return TRUE;
        }else{
            LOGE("find sps failed");
            return FALSE;
        }
    }else{
        LOGI("spspps is ready");
        return TRUE; // already cached; the original fell off the end here (undefined behavior)
    }
}
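
For reference, the codec-config buffer that MediaCodec emits (BUFFER_FLAG_CODEC_CONFIG) and that prepareVideo parses typically has the shape below. The payload bytes in this example are placeholders; only the start codes and the NALU headers 0x67/0x68 matter to the parser.

// fabricated example buffer (payload bytes are placeholders)
int8_t config[] = {
        0x00, 0x00, 0x00, 0x01, 0x67, 0x42, 0x00, 0x1f, // SPS: start code + NALU type 7
        0x00, 0x00, 0x00, 0x01, 0x68, 0x3c, 0x70        // PPS: start code + NALU type 8
};
prepareVideo(config, sizeof(config)); // caches sps = {67 42 00 1f}, pps = {68 3c 70}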

RTMPPacket *createSPSPPSPacket() {
    // 16 = the fixed overhead RTMP/FLV defines for the sequence-header body around sps and pps
    int spsppsPacketLen = 16 + spspps->sps_len + spspps->pps_len;

    RTMPPacket *p = (RTMPPacket *)malloc(sizeof (RTMPPacket));
    RTMPPacket_Alloc(p , spsppsPacketLen);
    int i = 0;
    p->m_body[i++] = 0x17; // keyframe (1) + AVC codec (7)
    p->m_body[i++] = 0x00; // 0 = AVC sequence header
    p->m_body[i++] = 0x00; // composition time, 3 bytes
    p->m_body[i++] = 0x00;
    p->m_body[i++] = 0x00;
    p->m_body[i++] = 0x01; // configurationVersion
    p->m_body[i++] = spspps->sps[1]; // profile (e.g. baseline)
    p->m_body[i++] = spspps->sps[2]; // profile compatibility
    p->m_body[i++] = spspps->sps[3]; // level (bit-rate tier)
    p->m_body[i++] = 0xff; // NALU length field size: 4 bytes
    p->m_body[i++] = 0xe1; // number of SPS: 1
    p->m_body[i++] = (spspps->sps_len >>8) & 0xff; // high byte of sps length (the shift keeps int type; & 0xff narrows it to a byte)
    p->m_body[i++] = spspps->sps_len  & 0xff; // low byte of sps length
    memcpy(&p->m_body[i] , spspps->sps , spspps->sps_len); // copy sps content
    i = i + spspps->sps_len;
    p->m_body[i++ ] = 0x01; // number of PPS: 1
    p->m_body[i++] = (spspps->pps_len >>8) & 0xff; // high byte of pps length
    p->m_body[i++] = spspps->pps_len  & 0xff; // low byte of pps length
    memcpy(&p->m_body[i] , spspps->pps , spspps->pps_len); // copy pps content
    // packet metadata
    p->m_packetType = RTMP_PACKET_TYPE_VIDEO;
    p->m_nBodySize = spsppsPacketLen;
    // the channel number is our own choice
    p->m_nChannel = 0x04;
    // relative timestamp
    p->m_hasAbsTimestamp = 0;
    p->m_nTimeStamp = 0;
    // header size hint for the server
    p->m_headerType = RTMP_PACKET_SIZE_LARGE;
    p->m_nInfoField2 = rtmp->m_stream_id;
    return p;
}

7. I/B/P frame packaging

// bytes includes the start code, e.g. 00 00 00 01 65 xx xx ...
RTMPPacket *createVideoPacket(int type , int8_t *bytes , int len , long timems) {
    // start codes are normally all the same length, so spsFLagLen is reused here;
    // the body must drop the start code but keep the NALU header (frame type) byte
    bytes += spsFLagLen;
    len -= spsFLagLen;
    // 9 = the video packet header length defined by the RTMP/FLV spec
    int iFramePacketLen = 9 + len;
    RTMPPacket* p = (RTMPPacket*)malloc(sizeof(RTMPPacket));
    RTMPPacket_Alloc(p , iFramePacketLen);
    int i = 0;
    //I frame
    if (type == 0x05){
        p->m_body[i++] = 0x17;
    }else{
        // p , b frame.
        p->m_body[i++] = 0x27;
    }

    p->m_body[i++] = 0x01; // 1 = AVC NALU
    p->m_body[i++] = 0x00; // composition time, 3 bytes
    p->m_body[i++] = 0x00;
    p->m_body[i++] = 0x00;

    // 4-byte big-endian NALU length, matching the 0xff length-size field in the sequence header
    p->m_body[i++] = (len >>24) & 0xff;
    p->m_body[i++] = (len >>16) & 0xff;
    p->m_body[i++] = (len >>8) & 0xff;
    p->m_body[i++] = len & 0xff;
    memcpy(&p->m_body[i] , bytes , len);

    // packet metadata
    p->m_packetType = RTMP_PACKET_TYPE_VIDEO;
    p->m_nBodySize = iFramePacketLen;
    // the channel number is our own choice
    p->m_nChannel = 0x04;
    // relative timestamp
    p->m_hasAbsTimestamp = 0;
    // the frame's timestamp is the relative timestamp produced at encode time
    p->m_nTimeStamp = timems;
    // header size hint for the server
    p->m_headerType = RTMP_PACKET_SIZE_LARGE;
    p->m_nInfoField2 = rtmp->m_stream_id;
    return p;
}

8. Audio packaging

RTMPPacket *createAudioPacket(int8_t *bytes, jint len, jlong timestamp, int packetType) {
    // 2 bytes of audio tag header in front of the payload
    int bodySize = len + 2;
    RTMPPacket *p = (RTMPPacket *)malloc(sizeof (RTMPPacket));
    RTMPPacket_Alloc(p , bodySize);
    int i = 0;
    p->m_body[i++] = 0xAF; // 0xA = AAC; 0xF = 44.1 kHz / 16-bit / stereo flags (fixed for AAC)
    // is this the audio header (AudioSpecificConfig) packet?
    if (packetType == TYPE_HEADER){
        p->m_body[i++] = 0x00; // AAC sequence header
    }else{
        p->m_body[i++] = 0x01; // AAC raw data
    }
    memcpy(&p->m_body[i] , bytes , len);

    // packet metadata
    p->m_packetType = RTMP_PACKET_TYPE_AUDIO;
    p->m_nBodySize = bodySize;
    // the channel number is our own choice
    p->m_nChannel = 0x02;
    // relative timestamp
    p->m_hasAbsTimestamp = 0;
    // the timestamp is the relative timestamp produced at encode time
    p->m_nTimeStamp = timestamp;
    // header size hint for the server
    p->m_headerType = RTMP_PACKET_SIZE_LARGE;
    p->m_nInfoField2 = rtmp->m_stream_id;
    LOGI("createAudioPacket success");
    return p;
}
 

9. Sending the video packets

extern "C"
JNIEXPORT jint JNICALL
Java_com_hch_bilibili_PackageSender_sendData(JNIEnv *env, jobject thiz, jbyteArray data, jint len, jlong timestamp,
                                             jint type) {
    LOGI("start send package , type = %d" , type);
    int ret;
    jbyte * bytes = env->GetByteArrayElements(data ,NULL);
    switch (type) {
        case TYPE_VIDEO:
            ret = sendVideo(bytes , len , timestamp);
            break;
        case TYPE_AUDIO:
        case TYPE_HEADER:
            ret = sendAudio(bytes, len, timestamp, type);
            break;
    }

    env->ReleaseByteArrayElements(data , bytes , 0);
    return ret;
}

/**
 *
 * @param bytes  one frame of data
 * @param len    length of the frame
 * @param timems relative encode timestamp of the frame in ms; for I frames it comes
 *               from the encoder, for the sps/pps config frame it can be 0
 * @return
 */
int sendVideo(int8_t *bytes , int len , long timems) {
    int8_t type = bytes[4] & 0x1f; // NALU type, assuming a 4-byte start code
    int r = 0;
    // sps , pps config frame: just cache it
    if (type == 0x7){
        int result = prepareVideo(bytes , len);
        if (result == TRUE){
            LOGI("prepare sps , pps success");
        }else{
            LOGI("prepare sps , pps failed");
        }
        // the config frame itself must not be sent as a regular video packet
        return result;
    }else if (type == 0x5){
        // I frame found: send the cached sps/pps packet first
        RTMPPacket * packet = createSPSPPSPacket();
        r = sendPacket(packet);
        if (r == TRUE){
            LOGI("send sps , pps success");
        }else{
            LOGE("send sps , pps failed");
        }

    }
    //i , b , p frame
    RTMPPacket* iPacket = createVideoPacket(type , bytes , len , timems);
    r = sendPacket(iPacket);
    if (r == TRUE){
        if (type == 0x05){
            LOGI("send I frame success");
        }else{
            LOGI("send other frame success");
        }
    }else{
        LOGE("send video frame failed");
    }

    return r;
}

10. Sending the audio packets

int sendAudio(jbyte *bytes, jint len, jlong timestamp, int packetType) {
    int r;
    RTMPPacket * packet = createAudioPacket(bytes, len, timestamp, packetType);
    r = sendPacket(packet);
    return r;
}
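
Both send paths call sendPacket, which the article never lists. A minimal sketch using librtmp's RTMP_SendPacket; it also frees the packet, since the create* helpers malloc it and nothing else releases it:

int sendPacket(RTMPPacket *packet) {
    int r = RTMP_SendPacket(rtmp, packet, 1); // 1 = let librtmp queue the packet
    RTMPPacket_Free(packet); // releases m_body allocated by RTMPPacket_Alloc
    free(packet);            // releases the struct itself
    return r;
}
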
Author: 八道
Link: https://juejin.cn/post/7133390400089227271
Source: 稀土掘金 (Juejin)
Copyright belongs to the author. For commercial reposts, please contact the author for authorization; for non-commercial reposts, please credit the source.