Is there a way to improve the time between shots with AVCaptureStillImageOutput?

Asked: 2010-12-13 23:36:27

Tags: iphone objective-c ios avfoundation

I am currently using the following code to shoot a series of pictures:

- (void)shootSeries:(int)photos {
    if (photos == 0) {
        [self mergeImages];
        return;
    }
    [output captureStillImageAsynchronouslyFromConnection:connection completionHandler:
        ^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            NSLog(@"Shot picture %d.", 7 - photos); // assumes a series of 7 shots
            // Kick off the next capture before processing this frame.
            [self shootSeries:(photos - 1)];

            CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(imageDataSampleBuffer);

            // Copy the raw pixel data out of the buffer while it is locked.
            CVPixelBufferLockBaseAddress(pixelBuffer, 0);
            size_t dataSize = CVPixelBufferGetDataSize(pixelBuffer);
            CFDataRef data = CFDataCreate(NULL, (const UInt8 *)CVPixelBufferGetBaseAddress(pixelBuffer), dataSize);
            CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

            CGDataProviderRef dataProvider = CGDataProviderCreateWithCFData(data);
            CFRelease(data);

            // Wrap the BGRA pixel data in a CGImage.
            CGImageRef image = CGImageCreate(CVPixelBufferGetWidth(pixelBuffer),
                                             CVPixelBufferGetHeight(pixelBuffer),
                                             8, 32,
                                             CVPixelBufferGetBytesPerRow(pixelBuffer),
                                             colorspace,
                                             kCGImageAlphaNoneSkipFirst | kCGBitmapByteOrder32Little,
                                             dataProvider, NULL, true, kCGRenderingIntentDefault);
            CFRelease(dataProvider);

            // shotPictures must be created with kCFTypeArrayCallBacks so the
            // append retains the image; then releasing our reference is safe.
            CFArrayAppendValue(shotPictures, image);
            CFRelease(image);
        }];
}

While this works well, it is quite slow. How do apps like ClearCam manage to shoot pictures faster than this, and how can I do the same?

1 Answer:

Answer 0 (score: 0)

After capturing each image, store the sample buffer in a CFArray, and once you are done with all of the photos, then convert them into images (or CGImageRefs).
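A minimal sketch of that suggestion, restructured from the question's code: only retain the buffer inside the completion handler and defer all the Core Graphics work until the burst is finished. The ivar `sampleBuffers` (a `CFMutableArrayRef` created with `kCFTypeArrayCallBacks` so appends retain) and the helper `-convertBuffersToImages` are hypothetical names not in the original; `output` and `connection` are the ones from the question.

    - (void)shootSeriesDeferred:(int)photos {
        if (photos == 0) {
            // All shots taken; now do the expensive per-frame conversion.
            [self convertBuffersToImages];
            return;
        }
        [output captureStillImageAsynchronouslyFromConnection:connection
            completionHandler:^(CMSampleBufferRef buf, NSError *error) {
                if (buf != NULL) {
                    // The array retains the buffer (kCFTypeArrayCallBacks),
                    // so no copying or CGImage creation happens here.
                    CFArrayAppendValue(sampleBuffers, buf);
                }
                // Start the next capture immediately.
                [self shootSeriesDeferred:(photos - 1)];
            }];
    }

One caveat worth testing on a real device: capture sample buffers come from a limited pool, so retaining many of them at once may stall the capture pipeline for long bursts; copying just the pixel bytes (as the original code does) trades speed for lower memory pressure.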