How do I correctly get the video output from didOutputSampleBuffer?

Date: 2015-01-15 06:44:23

Tags: ios objective-c xcode avcapturesession

I am trying to get every frame from the front camera, following this page: https://developer.apple.com/library/ios/qa/qa1702/_index.html

I have also pasted my code here:

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    self.captureSession = [[AVCaptureSession alloc] init];
    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;
    NSAssert([self checkDeviceAuthorizationStatus], @"authorization failed");

    self.sessionQueue = dispatch_queue_create(SESSION_QUEUE_LABEL, DISPATCH_QUEUE_SERIAL);
    dispatch_async(self.sessionQueue, ^{
        NSAssert([self findCamera:YES], @"get camera failed");
        NSAssert([self attachCameraToCaptureSession], @"get input failed");
        NSAssert([self setupVideoOutput], @"get output failed");
    });
}


- (BOOL) findCamera : (BOOL)useFrontCamera {
    AVCaptureDevice *camera = nil;
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];

    for (AVCaptureDevice *device in devices) {
        if (useFrontCamera && AVCaptureDevicePositionFront == [device position]) {
            camera = device;
        } else if (!useFrontCamera && AVCaptureDevicePositionBack == [device position]) {
            camera = device;
        }
    }

    if (nil != camera) {
        if ([camera lockForConfiguration:nil]) {
            [camera setActiveVideoMinFrameDuration:CMTimeMake(1, 10)];
            [camera setActiveVideoMaxFrameDuration:CMTimeMake(1, 30)];
            [camera unlockForConfiguration];
        }
        self.camera = camera;
    }
    return (nil != self.camera);
}

- (BOOL) attachCameraToCaptureSession {
    NSAssert(nil != self.camera, @"no camera");
    NSAssert(nil != self.captureSession, @"no session");

    self.cameraInput = nil;
    NSError *error = nil;
    self.cameraInput = [AVCaptureDeviceInput deviceInputWithDevice:self.camera error:&error];

    if (nil != error) {
        NSLog(@"attach camera to session error: %@", error);
        return false;
    }

    if ([self.captureSession canAddInput:self.cameraInput]) {
        [self.captureSession addInput:self.cameraInput];
    } else {
        return false;
    }

    return true;
}

- (BOOL)setupVideoOutput {
    self.videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    self.captureQueue = dispatch_queue_create(CAPTURE_QUEUE_LABEL, DISPATCH_QUEUE_SERIAL);
    [self.videoOutput setSampleBufferDelegate:self queue:self.captureQueue];
    self.videoOutput.alwaysDiscardsLateVideoFrames = NO;
    self.videoOutput.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt: kCVPixelFormatType_32BGRA ] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    if ([self.captureSession canAddOutput:self.videoOutput]) {
        [self.captureSession addOutput:self.videoOutput];
        return true;
    }
    return false;
}

Then I try to get the frame in the didOutputSampleBuffer delegate method, but the UIImage is always nil.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    if (captureOutput == self.videoOutput) {
        NSLog(@"ok");
        dispatch_async(self.sessionQueue, ^{
            if (sampleBuffer) {
                UIImage *image = [ViewController imageFromSampleBuffer:sampleBuffer];
                NSLog(@"%@", image);
            }
        });
    } else {
        NSLog(@"not ok");
    }
}

The imageFromSampleBuffer: method is the same as the one from the link I pasted above.
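For reference, the helper from that page looks roughly like this (paraphrased here as a sketch, not copied verbatim; I made it a class method because that is how I call it):

// Creates a UIImage from the pixel data in a CMSampleBuffer (roughly as in QA1702).
+ (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // Get the pixel buffer and lock its base address so the pixels can be read.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer, 0);

    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
    size_t width = CVPixelBufferGetWidth(imageBuffer);
    size_t height = CVPixelBufferGetHeight(imageBuffer);

    // Build a BGRA bitmap context directly on top of the buffer's memory.
    // If the buffer is no longer valid, CGBitmapContextCreate returns NULL.
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(baseAddress, width, height, 8, bytesPerRow,
        colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
    CGImageRef quartzImage = CGBitmapContextCreateImage(context);

    CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
    CGContextRelease(context);
    CGColorSpaceRelease(colorSpace);

    UIImage *image = [UIImage imageWithCGImage:quartzImage];
    CGImageRelease(quartzImage);
    return image;
}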

In addition, imageFromSampleBuffer: always produces this error:

CGBitmapContextCreateImage: invalid context 0x0. This is a serious error. This application, or a library it uses, is using an invalid context and is thereby contributing to an overall degradation of system stability and reliability. This notice is a courtesy: please fix this problem. It will become a fatal error in an upcoming update.

Can anyone tell me why? Thanks!

My phone is an iPhone 5s running iOS 8.1.

2 Answers:

Answer 0 (score: 4)

You can't hand the sampleBuffer off to another queue with dispatch_async the way you do: by the time imageFromSampleBuffer: runs on that queue, the buffer has already been released back to the capture system. You have to choose between two approaches:

  1. Use the buffer in the same queue (thread) as in the example you referenced.
  2. Retain (or copy) it for further use. A good example can be found here: https://developer.apple.com/library/ios/samplecode/RosyWriter/Introduction/Intro.html (look at the captureOutput:... method in RosyWriterVideoProcessor.m). A sketch of both options follows below.
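A minimal sketch of both options, assuming the queue, output, and helper names from your question (self.captureQueue, self.sessionQueue, imageFromSampleBuffer:):

// Option 1: use the sample buffer synchronously.
// The delegate callback already runs on self.captureQueue, so no dispatch is needed.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    UIImage *image = [ViewController imageFromSampleBuffer:sampleBuffer];
    NSLog(@"got image: %@", image);
}

// Option 2: retain the buffer before handing it to another queue,
// and release it there once you are done with it.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CFRetain(sampleBuffer);                      // keep the buffer alive past this callback
    dispatch_async(self.sessionQueue, ^{
        UIImage *image = [ViewController imageFromSampleBuffer:sampleBuffer];
        NSLog(@"got image: %@", image);
        CFRelease(sampleBuffer);                 // balance the CFRetain above
    });
}

Keep in mind that the capture pipeline reuses sample buffers from a small pool, so release any retained buffer promptly or the output may start dropping frames.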

Answer 1 (score: 0)

Use this method to grab frames from a video URL at one-second intervals... hope this will help you.

- (void)generateThumbImage:(NSURL *)url
{
    // NSURL *url = [NSURL fileURLWithPath:filepath];

    AVAsset *asset = [AVAsset assetWithURL:url];
    AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
    imageGenerator.appliesPreferredTrackTransform = YES;

    // arrayImages is assumed to be an NSMutableArray ivar/property declared elsewhere.
    for (int i = 0; i < 5; i++)
    {
        // Grab one frame per second for the first five seconds.
        CMTime time = CMTimeMakeWithSeconds(i, 30);
        CGImageRef imageRef = [imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL];
        UIImage *thumbnail = [UIImage imageWithCGImage:imageRef];
        CGImageRelease(imageRef);  // CGImageRef won't be released by ARC

        [arrayImages addObject:thumbnail];
    }
    _imageView1.image = [arrayImages objectAtIndex:0];
    _imageView2.image = [arrayImages objectAtIndex:1];
    _imageView3.image = [arrayImages objectAtIndex:2];
    _imageView4.image = [arrayImages objectAtIndex:3];
    _imageView5.image = [arrayImages objectAtIndex:4];
    NSLog(@"Image array: %@", arrayImages);
}
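For example, you could call it with a local file URL (the bundled file name here is just a placeholder):

NSURL *videoURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"sample" ofType:@"mov"]];
[self generateThumbImage:videoURL];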