How do I render OpenGL ES to an external screen using the VGA output adapter?

Asked: 2011-01-21 19:06:02

Tags: iphone objective-c xcode ipad opengl-es

I've been developing a 3D app for the iPad and iPhone and would like to be able to render it to an external screen. From what I understand, you have to do something like the following code to set it up (see Sunsetlakesoftware.com):

if ([[UIScreen screens] count] > 1)
{
    // External screen attached
}
else
{
    // Only local screen present
}

UIScreen *externalScreen = [[UIScreen screens] objectAtIndex:1]; // second screen is the external display
CGRect externalBounds = [externalScreen bounds];
externalWindow = [[UIWindow alloc] initWithFrame:externalBounds];

UIView *backgroundView = [[UIView alloc] initWithFrame:externalBounds];
backgroundView.backgroundColor = [UIColor whiteColor];

[externalWindow addSubview:backgroundView];

[backgroundView release];

externalWindow.screen = externalScreen;
[externalWindow makeKeyAndVisible];

However, I'm not sure what changes need to be made to an OpenGL project. Does anyone know what you have to do to get this working with the default OpenGL ES project that Xcode creates for the iPad or iPhone?

2 Answers:

Answer 0 (score: 2)

All you need to do to render OpenGL ES content on an external display is create a UIView backed by a CAEAGLLayer and add it as a subview of the backgroundView above, or take an existing view like that and move it so it becomes a subview of backgroundView.

In fact, you can drop backgroundView if you want and just place your OpenGL-hosting view directly on the externalWindow UIWindow instance. That window is attached to the UIScreen instance that represents the external display, so anything placed on it will show up on that display. This includes OpenGL ES content.
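
As a concrete illustration, here is a minimal sketch of that approach (not from the original answer). It assumes glView is an existing CAEAGLLayer-backed OpenGL ES view and externalWindow is a retained UIWindow ivar; both names are illustrative:

if ([[UIScreen screens] count] > 1)
{
    UIScreen *externalScreen = [[UIScreen screens] objectAtIndex:1];
    CGRect externalBounds = [externalScreen bounds];

    externalWindow = [[UIWindow alloc] initWithFrame:externalBounds];
    externalWindow.screen = externalScreen;

    // Reparent the OpenGL ES hosting view onto the external window
    glView.frame = externalBounds;
    [externalWindow addSubview:glView];
    [externalWindow makeKeyAndVisible];
}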

There do appear to be issues with certain kinds of OpenGL ES content, as you can see in the experimental support I've tried to add to my Molecules application. If you look at the source code there, I attempt to migrate my rendering view to the external display, but it never appears. I've done the same with other OpenGL ES applications and their content rendered fine, so I believe there may be a problem with the depth buffer on the external display. I'm still working on tracking that down.

Answer 1 (score: 1)

I've figured out how to get any OpenGL ES content rendered onto an external display. It's actually really straightforward: you just copy your render buffer into a UIImage, then display that UIImage on the external screen's view. The code to take a snapshot of the render buffer is below:

- (UIImage*)snapshot:(UIView*)eaglview
{
// Get the size of the backing CAEAGLLayer
GLint backingWidth, backingHeight;
// Bind the colour renderbuffer that backs the CAEAGLLayer (called colorRenderbuffer in the
// standard Xcode OpenGL ES template); binding the framebuffer object here instead is a common
// reason the size query below returns the wrong values
glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderbuffer);
glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);

NSInteger x = 0, y = 0, width = backingWidth, height = backingHeight;
NSInteger dataLength = width * height * 4;
GLubyte *data = (GLubyte*)malloc(dataLength * sizeof(GLubyte));

// Read pixel data from the framebuffer
glPixelStorei(GL_PACK_ALIGNMENT, 4);
glReadPixels(x, y, width, height, GL_RGBA, GL_UNSIGNED_BYTE, data);

// Create a CGImage with the pixel data
// If your OpenGL ES content is opaque, use kCGImageAlphaNoneSkipLast to ignore the alpha channel
// otherwise, use kCGImageAlphaPremultipliedLast
CGDataProviderRef ref = CGDataProviderCreateWithData(NULL, data, dataLength, NULL);
CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
CGImageRef iref = CGImageCreate(width, height, 8, 32, width * 4, colorspace, kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast,
                                ref, NULL, true, kCGRenderingIntentDefault);

// OpenGL ES measures data in PIXELS
// Create a graphics context with the target size measured in POINTS
NSInteger widthInPoints, heightInPoints;
if (NULL != UIGraphicsBeginImageContextWithOptions) {
    // On iOS 4 and later, use UIGraphicsBeginImageContextWithOptions to take the scale into consideration
    // Set the scale parameter to your OpenGL ES view's contentScaleFactor
    // so that you get a high-resolution snapshot when its value is greater than 1.0
    CGFloat scale = eaglview.contentScaleFactor;
    widthInPoints = width / scale;
    heightInPoints = height / scale;
    UIGraphicsBeginImageContextWithOptions(CGSizeMake(widthInPoints, heightInPoints), NO, scale);
}
else {
    // On iOS prior to 4, fall back to use UIGraphicsBeginImageContext
    widthInPoints = width;
    heightInPoints = height;
    UIGraphicsBeginImageContext(CGSizeMake(widthInPoints, heightInPoints));
}

CGContextRef cgcontext = UIGraphicsGetCurrentContext();

// UIKit coordinate system is upside down to GL/Quartz coordinate system
// Flip the CGImage by rendering it to the flipped bitmap context
// The size of the destination area is measured in POINTS
CGContextSetBlendMode(cgcontext, kCGBlendModeCopy);
CGContextDrawImage(cgcontext, CGRectMake(0.0, 0.0, widthInPoints, heightInPoints), iref);

// Retrieve the UIImage from the current context
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();

UIGraphicsEndImageContext();

// Clean up
free(data);
CFRelease(ref);
CFRelease(colorspace);
CGImageRelease(iref);

return image;
}

For some reason, though, I was never able to get glGetRenderbufferParameterivOES to return the correct backing width and height, so I had to use my own function to calculate them. Just drop this into your rendering implementation and then use a timer to push the result to the external screen. If anyone can make any improvements to this method, please let me know.
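
To fill in that last step, here is a rough sketch of driving the external screen from a timer (not from the original answer). It assumes externalImageView is a UIImageView that has already been added to externalWindow, the window attached to the external UIScreen, and that glView is the OpenGL ES view being mirrored; both names are illustrative:

- (void)startMirroringToExternalScreen
{
    // Refresh the external display roughly 30 times per second
    [NSTimer scheduledTimerWithTimeInterval:(1.0 / 30.0)
                                     target:self
                                   selector:@selector(updateExternalScreen)
                                   userInfo:nil
                                    repeats:YES];
}

- (void)updateExternalScreen
{
    // Grab the latest frame from the GL view and show it on the external display
    externalImageView.image = [self snapshot:glView];
}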