FFmpeg does not decode the H.264 stream

Time: 2011-05-16 13:33:52

Tags: iphone ffmpeg video-streaming h.264 rtsp

I am trying to decode an H.264 stream from an RTSP server and render it on the iPhone.

I found some libraries and read some articles.

The libraries come from DropCam for iPhone and are called RTSPClient and DecoderWrapper.

But I cannot decode the frame data with DecoderWrapper, which uses FFmpeg.

Here is my code.

VideoViewer.m

- (void)didReceiveFrame:(NSData*)frameData presentationTime:(NSDate*)presentationTime
{
    [VideoDecoder staticInitialize];
    mConverter = [[VideoDecoder alloc] initWithCodec:kVCT_H264 colorSpace:kVCS_RGBA32 width:320 height:240 privateData:nil];


    [mConverter decodeFrame:frameData];

    if ([mConverter isFrameReady]) {
        UIImage *imageData =[mConverter getDecodedFrame];
        if (imageData) {
            [mVideoView setImage:imageData];
            NSLog(@"decoded!");
        }
    }
}

---VideoDecoder.m---
- (id)initWithCodec:(enum VideoCodecType)codecType 
         colorSpace:(enum VideoColorSpace)colorSpace 
              width:(int)width 
             height:(int)height 
        privateData:(NSData*)privateData {
    if(self = [super init]) {

        codec = avcodec_find_decoder(CODEC_ID_H264);
        codecCtx = avcodec_alloc_context();

        // Note: for H.264 RTSP streams, the width and height are usually not specified (width and height are 0).  
        // These fields will become filled in once the first frame is decoded and the SPS is processed.
        codecCtx->width = width;
        codecCtx->height = height;

        codecCtx->extradata = av_malloc([privateData length]);
        codecCtx->extradata_size = [privateData length];
        [privateData getBytes:codecCtx->extradata length:codecCtx->extradata_size];
        codecCtx->pix_fmt = PIX_FMT_RGBA;
#ifdef SHOW_DEBUG_MV
        codecCtx->debug_mv = 0xFF;
#endif

        srcFrame = avcodec_alloc_frame();
        dstFrame = avcodec_alloc_frame();

        int res = avcodec_open(codecCtx, codec);
        if (res < 0)
        {
            NSLog(@"Failed to initialize decoder");
        }

    }

    return self;    
}

- (void)decodeFrame:(NSData*)frameData {


    AVPacket packet = {0};
    packet.data = (uint8_t*)[frameData bytes];
    packet.size = [frameData length];

    int frameFinished=0;
    NSLog(@"Packet size===>%d",packet.size);
    // Is this a packet from the video stream?
    if(packet.stream_index==0)
    {
        int res = avcodec_decode_video2(codecCtx, srcFrame, &frameFinished, &packet);
        NSLog(@"Res value===>%d",res);
        NSLog(@"frame data===>%d",(int)srcFrame->data);
        if (res < 0)
        {
            NSLog(@"Failed to decode frame");
        }
    }
    else 
    {
        NSLog(@"No video stream found");
    }


    // Need to delay initializing the output buffers because we don't know the dimensions until we decode the first frame.
    if (!outputInit) {
        if (codecCtx->width > 0 && codecCtx->height > 0) {
#ifdef _DEBUG
            NSLog(@"Initializing decoder with frame size of: %dx%d", codecCtx->width, codecCtx->height);
#endif

            outputBufLen = avpicture_get_size(PIX_FMT_RGBA, codecCtx->width, codecCtx->height);
            outputBuf = av_malloc(outputBufLen);

            avpicture_fill((AVPicture*)dstFrame, outputBuf, PIX_FMT_RGBA, codecCtx->width, codecCtx->height);

            convertCtx = sws_getContext(codecCtx->width, codecCtx->height, codecCtx->pix_fmt,  codecCtx->width, 
                                        codecCtx->height, PIX_FMT_RGBA, SWS_FAST_BILINEAR, NULL, NULL, NULL); 

            outputInit = YES;
            frameFinished=1;
        }
        else {
            NSLog(@"Could not get video output dimensions");
        }
    }

    if (frameFinished)
        frameReady = YES;

}

The console output looks like this:

2011-05-16 20:16:04.223 RTSPTest1[41226:207] Packet size===>359
[h264 @ 0x5815c00] no frame!
2011-05-16 20:16:04.223 RTSPTest1[41226:207] Res value===>-1
2011-05-16 20:16:04.224 RTSPTest1[41226:207] frame data===>101791200
2011-05-16 20:16:04.224 RTSPTest1[41226:207] Failed to decode frame
2011-05-16 20:16:04.225 RTSPTest1[41226:207] decoded!
2011-05-16 20:16:04.226 RTSPTest1[41226:207] Packet size===>424
[h264 @ 0x5017c00] no frame!
2011-05-16 20:16:04.226 RTSPTest1[41226:207] Res value===>-1
2011-05-16 20:16:04.227 RTSPTest1[41226:207] frame data===>81002704
2011-05-16 20:16:04.227 RTSPTest1[41226:207] Failed to decode frame
2011-05-16 20:16:04.228 RTSPTest1[41226:207] decoded!
2011-05-16 20:16:04.229 RTSPTest1[41226:207] Packet size===>424
[h264 @ 0x581d000] no frame!
2011-05-16 20:16:04.229 RTSPTest1[41226:207] Res value===>-1
2011-05-16 20:16:04.230 RTSPTest1[41226:207] frame data===>101791616
2011-05-16 20:16:04.230 RTSPTest1[41226:207] Failed to decode frame
2011-05-16 20:16:04.231 RTSPTest1[41226:207] decoded!
. . . .  .

But the simulator shows nothing.

What is wrong with my code?

Please help me solve this problem.

Thanks for your answers.

1 Answer:

Answer (score: 3):

I ran into a similar problem with H.264 and FFmpeg. My problem was that some devices do not send the Sequence Parameter Set (SPS) and Picture Parameter Set (PPS) with every frame, so I needed to modify my frame data a bit.

Maybe this post will help: FFmpeg can't decode H264 stream/frame data
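
For reference, here is a minimal sketch of what "modifying the frame data" can look like. It assumes the RTSP client hands you raw NAL units (without Annex-B start codes) and that the SPS/PPS have been base64-decoded from the SDP's sprop-parameter-sets; the helper name and the spsData/ppsData parameters are hypothetical, not part of RTSPClient or DecoderWrapper.

    #import <Foundation/Foundation.h>

    // Wraps one NAL unit in an Annex-B start code and, for IDR (keyframe) slices,
    // prepends the SPS and PPS so the decoder always sees its parameter sets.
    static NSData *AnnexBFrameData(NSData *nalUnit, NSData *spsData, NSData *ppsData)
    {
        static const uint8_t startCode[4] = {0x00, 0x00, 0x00, 0x01};
        NSMutableData *out = [NSMutableData data];

        // The NAL unit type is the low 5 bits of the first payload byte; 5 == IDR slice.
        const uint8_t *bytes = [nalUnit bytes];
        BOOL isIDR = ([nalUnit length] > 0) && ((bytes[0] & 0x1F) == 5);

        if (isIDR) {
            [out appendBytes:startCode length:4];
            [out appendData:spsData];
            [out appendBytes:startCode length:4];
            [out appendData:ppsData];
        }

        [out appendBytes:startCode length:4];
        [out appendData:nalUnit];
        return out;
    }

In didReceiveFrame: one would then pass the wrapped data to decodeFrame: instead of the raw payload. Whether this is needed depends on whether your RTSP client already reassembles fragmented NAL units and adds start codes; the "no frame!" log above is typical when the decoder never sees start codes or SPS/PPS.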