Accessing the iPhone camera image in (near) real time

Asked: 2011-02-18 07:56:19

Tags: iphone cocoa-touch core-graphics

I am trying to read the (average) RGB value of the center pixels of the iPhone camera, and this should happen in (near) real time. So I open a UIImagePickerController and use a timer that takes a picture every x seconds. The picture is processed in a separate thread so that computing the RGB values does not block the app. I have tried several ways of accessing the RGB/pixel values of the captured image, but they all share the same problem: they are too slow and make the camera view lag. I tried the following method:

- (UIColor *)getAverageColorOfImage:(UIImage*)image {
    int pixelCount = kDetectorSize * kDetectorSize;
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();

    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * kDetectorSize;
    NSUInteger bitsPerComponent = 8;
    unsigned char *rawData = malloc(pixelCount * bytesPerPixel);
    CGContextRef context = CGBitmapContextCreate(rawData, kDetectorSize, kDetectorSize, bitsPerComponent, bytesPerRow, colorSpace, kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);
    CGContextSetInterpolationQuality(context, kCGInterpolationNone);

    NSLog(@"Drawing image");
    CGContextDrawImage(context, CGRectMake(0, 0, kDetectorSize, kDetectorSize), [image CGImage]);
    NSLog(@"Image drawn");

    CGContextRelease(context);

    // rawData contains the image data in the RGBA8888 pixel format. Alpha values are ignored.
    int byteIndex = 0;
    CGFloat red = 0.0;
    CGFloat green = 0.0;
    CGFloat blue = 0.0;

    for (int i = 0; i < pixelCount; ++i) {
        red   += rawData[byteIndex];
        green += rawData[byteIndex + 1];
        blue  += rawData[byteIndex + 2];
        byteIndex += bytesPerPixel;
    }

    free(rawData);

    return [UIColor colorWithRed:red/pixelCount/255.0 green:green/pixelCount/255.0 blue:blue/pixelCount/255.0 alpha:1.0];
}

kDetectorSize is set to 6, so the processed image is 6x6 pixels; the image passed in as a parameter has likewise been cropped to 6x6 pixels beforehand. The slow part is CGContextDrawImage, which takes about 500-600 ms on my iPhone 4. I tried some alternatives:

UIGraphicsPushContext(context); 
[image drawAtPoint:CGPointMake(0.0, 0.0)];
UIGraphicsPopContext();

UIGraphicsPushContext(context); 
[image drawInRect:CGRectMake(0.0, 0.0, kDetectorSize, kDetectorSize)];
UIGraphicsPopContext();

Both approaches are just as slow as the one above. The image size has no significant effect (I would say none at all). Does anyone know a faster way to access the RGB values?
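A minimal sketch of one possible alternative (not from the original post): read the bytes backing the cropped CGImage directly via its data provider instead of redrawing into a bitmap context. This assumes the image data really is 32-bit RGBA, which would need to be verified with CGImageGetBitsPerPixel and CGImageGetAlphaInfo first:

// Sketch: read the image's backing store directly via its data provider,
// skipping the bitmap-context redraw. Assumes 32-bit RGBA data; check
// CGImageGetBitsPerPixel / CGImageGetAlphaInfo before relying on this.
CGImageRef cgImage = [croppedImage CGImage];
CFDataRef pixelData = CGDataProviderCopyData(CGImageGetDataProvider(cgImage));
const UInt8 *bytes = CFDataGetBytePtr(pixelData);
size_t bytesPerRow = CGImageGetBytesPerRow(cgImage);
size_t bytesPerPixel = CGImageGetBitsPerPixel(cgImage) / 8;

CGFloat red = 0, green = 0, blue = 0;
for (size_t y = 0; y < kDetectorSize; ++y) {
    for (size_t x = 0; x < kDetectorSize; ++x) {
        const UInt8 *px = bytes + y * bytesPerRow + x * bytesPerPixel;
        red   += px[0];
        green += px[1];
        blue  += px[2];
    }
}
CFRelease(pixelData);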

It would also be fine if the thread simply did not make the camera view lag. I start my thread like this:

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    [NSThread detachNewThreadSelector:@selector(pickColorFromImage:) toTarget:self withObject:image]; 
}

- (void)pickColorFromImage:(UIImage *)image {
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];   
    [NSThread setThreadPriority:0.0];

    [...cropping the image...]
    UIColor *averageColor = [self getAverageColorOfImage:croppedImage];

    [self performSelectorOnMainThread:@selector(applyPickedColor:) withObject:averageColor waitUntilDone:NO];  

    [pool release];
}
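The cropping step above is elided in the post; a hypothetical version using CGImageCreateWithImageInRect (the actual code is not shown) could look like this:

// Hypothetical cropping step: extract a kDetectorSize x kDetectorSize
// region around the image center.
CGImageRef cgImage = [image CGImage];
CGRect cropRect = CGRectMake(CGImageGetWidth(cgImage) / 2 - kDetectorSize / 2,
                             CGImageGetHeight(cgImage) / 2 - kDetectorSize / 2,
                             kDetectorSize, kDetectorSize);
CGImageRef croppedRef = CGImageCreateWithImageInRect(cgImage, cropRect);
UIImage *croppedImage = [UIImage imageWithCGImage:croppedRef];
CGImageRelease(croppedRef);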

Thanks for your help!

1 Answer:

Answer 0 (score: 4):

You are approaching this the wrong way: Apple provides a class that does exactly what you want, with no need to mess around with timers and UIImagePickers. AVCaptureSession and related classes give you real-time access to the camera's raw pixel data.
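For illustration, a minimal sketch of that approach (not from the original answer; the preset choice, the BGRA format, and the center-block averaging carried over from the question are assumptions):

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

#define kDetectorSize 6 // as in the question

@interface ColorGrabber : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, retain) AVCaptureSession *session;
- (void)start;
@end

@implementation ColorGrabber
@synthesize session;

- (void)start {
    self.session = [[[AVCaptureSession alloc] init] autorelease];
    self.session.sessionPreset = AVCaptureSessionPresetLow; // small frames suffice for averaging

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    [self.session addInput:[AVCaptureDeviceInput deviceInputWithDevice:camera error:NULL]];

    AVCaptureVideoDataOutput *output = [[[AVCaptureVideoDataOutput alloc] init] autorelease];
    // Request BGRA so the bytes can be read without a conversion step.
    output.videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA]
                                                       forKey:(id)kCVPixelBufferPixelFormatTypeKey];
    dispatch_queue_t queue = dispatch_queue_create("colorGrabberQueue", NULL);
    [output setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);
    [self.session addOutput:output];

    [self.session startRunning];
}

// Called off the main thread for every captured frame.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    unsigned char *base = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
    size_t left = CVPixelBufferGetWidth(pixelBuffer) / 2 - kDetectorSize / 2;
    size_t top  = CVPixelBufferGetHeight(pixelBuffer) / 2 - kDetectorSize / 2;

    // Average a kDetectorSize x kDetectorSize block around the frame center.
    CGFloat red = 0, green = 0, blue = 0;
    for (size_t y = top; y < top + kDetectorSize; ++y) {
        for (size_t x = left; x < left + kDetectorSize; ++x) {
            unsigned char *px = base + y * bytesPerRow + x * 4;
            blue  += px[0]; // BGRA byte order
            green += px[1];
            red   += px[2];
        }
    }
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    CGFloat n = kDetectorSize * kDetectorSize * 255.0;
    UIColor *averageColor = [UIColor colorWithRed:red/n green:green/n blue:blue/n alpha:1.0];
    [self performSelectorOnMainThread:@selector(applyPickedColor:)
                           withObject:averageColor
                        waitUntilDone:NO];
}

- (void)applyPickedColor:(UIColor *)color {
    // Update the UI here, as in the question's applyPickedColor:.
}
@end

Because the delegate already runs on its own dispatch queue and hands you raw bytes, both the slow CGContextDrawImage call and the per-shot NSThread become unnecessary.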

See the documentation for more information:

http://developer.apple.com/library/ios/#documentation/AVFoundation/Reference/AVCaptureSession_Class/Reference/Reference.html#//apple_ref/doc/uid/TP40009521
