FFmpeg does not decode h264 stream

Source: https://www.devze.com 2023-03-06 10:34
I am trying to decode an H.264 stream from an RTSP server and render it on an iPhone. I found some libraries and read some articles about it.

The libraries are from Dropcam's iPhone code, called RTSPClient and DecoderWrapper.

But I cannot decode the frame data with DecoderWrapper, which uses FFmpeg.

Here is my code.

VideoViewer.m

- (void)didReceiveFrame:(NSData*)frameData presentationTime:(NSDate*)presentationTime
{
    [VideoDecoder staticInitialize];
    mConverter = [[VideoDecoder alloc] initWithCodec:kVCT_H264 colorSpace:kVCS_RGBA32 width:320 height:240 privateData:nil];


    [mConverter decodeFrame:frameData];

    if ([mConverter isFrameReady]) {
        UIImage *imageData =[mConverter getDecodedFrame];
        if (imageData) {
            [mVideoView setImage:imageData];
            NSLog(@"decoded!");
        }
    }
}

VideoDecoder.m

- (id)initWithCodec:(enum VideoCodecType)codecType 
         colorSpace:(enum VideoColorSpace)colorSpace 
              width:(int)width 
             height:(int)height 
        privateData:(NSData*)privateData {
    if(self = [super init]) {

        codec = avcodec_find_decoder(CODEC_ID_H264);
        codecCtx = avcodec_alloc_context();

        // Note: for H.264 RTSP streams, the width and height are usually not specified (width and height are 0).  
        // These fields will become filled in once the first frame is decoded and the SPS is processed.
        codecCtx->width = width;
        codecCtx->height = height;

        codecCtx->extradata = av_malloc([privateData length]);
        codecCtx->extradata_size = [privateData length];
        [privateData getBytes:codecCtx->extradata length:codecCtx->extradata_size];
        codecCtx->pix_fmt = PIX_FMT_RGBA;
#ifdef SHOW_DEBUG_MV
        codecCtx->debug_mv = 0xFF;
#endif

        srcFrame = avcodec_alloc_frame();
        dstFrame = avcodec_alloc_frame();

        int res = avcodec_open(codecCtx, codec);
        if (res < 0)
        {
            NSLog(@"Failed to initialize decoder");
        }

    }

    return self;    
}

- (void)decodeFrame:(NSData*)frameData {


    AVPacket packet = {0};
    packet.data = (uint8_t*)[frameData bytes];
    packet.size = [frameData length];

    int frameFinished=0;
    NSLog(@"Packet size===>%d",packet.size);
    // Is this a packet from the video stream?
    if(packet.stream_index==0)
    {
        int res = avcodec_decode_video2(codecCtx, srcFrame, &frameFinished, &packet);
        NSLog(@"Res value===>%d",res);
        NSLog(@"frame data===>%d",(int)srcFrame->data);
        if (res < 0)
        {
            NSLog(@"Failed to decode frame");
        }
    }
    else 
    {
        NSLog(@"No video stream found");
    }


    // Need to delay initializing the output buffers because we don't know the dimensions until we decode the first frame.
    if (!outputInit) {
        if (codecCtx->width > 0 && codecCtx->height > 0) {
#ifdef _DEBUG
            NSLog(@"Initializing decoder with frame size of: %dx%d", codecCtx->width, codecCtx->height);
#endif

            outputBufLen = avpicture_get_size(PIX_FMT_RGBA, codecCtx->width, codecCtx->height);
            outputBuf = av_malloc(outputBufLen);

            avpicture_fill((AVPicture*)dstFrame, outputBuf, PIX_FMT_RGBA, codecCtx->width, codecCtx->height);

            convertCtx = sws_getContext(codecCtx->width, codecCtx->height, codecCtx->pix_fmt,  codecCtx->width, 
                                        codecCtx->height, PIX_FMT_RGBA, SWS_FAST_BILINEAR, NULL, NULL, NULL); 

            outputInit = YES;
            frameFinished=1;
        }
        else {
            NSLog(@"Could not get video output dimensions");
        }
    }

    if (frameFinished)
        frameReady = YES;

}

The console output is as follows.

2011-05-16 20:16:04.223 RTSPTest1[41226:207] Packet size===>359
[h264 @ 0x5815c00] no frame!
2011-05-16 20:16:04.223 RTSPTest1[41226:207] Res value===>-1
2011-05-16 20:16:04.224 RTSPTest1[41226:207] frame data===>101791200
2011-05-16 20:16:04.224 RTSPTest1[41226:207] Failed to decode frame
2011-05-16 20:16:04.225 RTSPTest1[41226:207] decoded!
2011-05-16 20:16:04.226 RTSPTest1[41226:207] Packet size===>424
[h264 @ 0x5017c00] no frame!
2011-05-16 20:16:04.226 RTSPTest1[41226:207] Res value===>-1
2011-05-16 20:16:04.227 RTSPTest1[41226:207] frame data===>81002704
2011-05-16 20:16:04.227 RTSPTest1[41226:207] Failed to decode frame
2011-05-16 20:16:04.228 RTSPTest1[41226:207] decoded!
2011-05-16 20:16:04.229 RTSPTest1[41226:207] Packet size===>424
[h264 @ 0x581d000] no frame!
2011-05-16 20:16:04.229 RTSPTest1[41226:207] Res value===>-1
2011-05-16 20:16:04.230 RTSPTest1[41226:207] frame data===>101791616
2011-05-16 20:16:04.230 RTSPTest1[41226:207] Failed to decode frame
2011-05-16 20:16:04.231 RTSPTest1[41226:207] decoded!
. . . .  .

But the simulator shows nothing.

What's wrong with my code?

Help me solve this problem.

Thanks for your answers.


I've had a similar problem with H.264 and FFmpeg. My problem was that some devices do not send the sequence parameter set (SPS) and picture parameter set (PPS) with every frame, so I needed to modify my frame data slightly.

Maybe this post will help: FFmpeg can't decode H264 stream/frame data

