Is there a way to capture a screenshot of a view containing an AVPlayerLayer?

Asked: 2019-10-07 13:48:47

Tags: ios objective-c uiimage avfoundation avplayerlayer

I am currently building a video collage app. I have a view that contains an AVPlayerLayer as a sublayer, and I need to take a screenshot of that view. When I try to do so, I do get a screenshot, but the AVPlayerLayer (where the video is playing) does not appear in it; it is just black. On the Simulator it works perfectly and the layer shows up, but on a real device it is just a black screen.

I have tried every solution I could find on Stack Overflow and in Apple's developer documentation, but nothing has worked.

Some of the solutions I have tried:

swift: How to take screenshot of AVPlayerLayer()

Screenshot for AVPlayer and Video

https://developer.apple.com/documentation/avfoundation/avcapturevideopreviewlayer

As you can see in my code, it should capture an image of the view, but it does not work for the AVPlayerLayer.

- (UIImage *)imageFromView:(UIView *)view
{
    // Use ...WithOptions with scale 0 so the capture matches the device's screen scale.
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0);

    [view drawViewHierarchyInRect:view.bounds afterScreenUpdates:YES];

    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSString *fileExtension = @"png";
    NSData *data;
    BOOL isOutputJPG = NO;

    if (isOutputJPG) {
        data = UIImageJPEGRepresentation(image, 0.5);
        fileExtension = @"jpg";
    } else {
        data = UIImagePNGRepresentation(image);
    }

    UIImage *rasterizedView = [UIImage imageWithData:data];
    return rasterizedView;
}

// in the view controller

UIImage *image =  [self imageFromView:recordingView];

I am getting a bit desperate now, because nothing seems to work for AVPlayerLayer. When I inspect the image generated on a real device, it shows me only the view; on the Simulator it works as expected.
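A common explanation for the black frame is that on a real device the AVPlayerLayer content is composited outside the normal view rendering path, so snapshot APIs never see it. One workaround is to pull the currently displayed frame yourself through an AVPlayerItemVideoOutput. A sketch, assuming `self.player` is the AVPlayer driving the layer:

```objectivec
// Attach the output once, when the AVPlayerItem is set up.
NSDictionary *attrs = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVPlayerItemVideoOutput *videoOutput =
    [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:attrs];
[self.player.currentItem addOutput:videoOutput];

// Later, when taking the screenshot, grab the frame currently on screen.
CMTime now = self.player.currentTime;
if ([videoOutput hasNewPixelBufferForItemTime:now]) {
    CVPixelBufferRef buffer =
        [videoOutput copyPixelBufferForItemTime:now itemTimeForDisplay:NULL];
    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:buffer];
    UIImage *frame = [UIImage imageWithCIImage:ciImage];
    CVPixelBufferRelease(buffer);   // copy... returns an owned buffer
    // Composite `frame` with the snapshot of the rest of the view hierarchy.
}
```

This captures only the video frame, so you would still draw the surrounding views with `drawViewHierarchyInRect:` and composite the two images.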

1 Answer:

Answer 0 (score: 0)

There are many ways to achieve what you want. I have found that using AVAssetImageGenerator always works reliably.

- (NSImage *)getImageFromAsset:(AVAsset *)myAsset width:(int)theWidth height:(int)theHeight {

    Float64 durationSeconds = CMTimeGetSeconds(myAsset.duration);

    /// Change the frametimetoget section to your specific needs ///
    CMTime frametimetoget;
    if (durationSeconds <= 20) {
        frametimetoget = CMTimeMakeWithSeconds(durationSeconds / 2, 600);
    } else {
        frametimetoget = CMTimeMakeWithSeconds(10, 600);
    }

    AVAssetImageGenerator *imageGenerator = [AVAssetImageGenerator assetImageGeneratorWithAsset:myAsset];
    imageGenerator.appliesPreferredTrackTransform = YES;
    imageGenerator.maximumSize = CGSizeMake(theWidth, theHeight);
    imageGenerator.apertureMode = AVAssetImageGeneratorApertureModeEncodedPixels;

    /// NSError not handled in this example; you would have to add code ///
    NSError *error = nil;
    CMTime actualTime;
    CGImageRef frameImage = [imageGenerator copyCGImageAtTime:frametimetoget actualTime:&actualTime error:&error];

    Float64 myImageWidth = CGImageGetWidth(frameImage);
    Float64 myImageHeight = CGImageGetHeight(frameImage);
    Float64 ratio = myImageWidth / theWidth;
    NSSize imageSize;
    imageSize.width = myImageWidth / ratio;
    imageSize.height = myImageHeight / ratio;

    /// You may choose to use CGImage and skip below
    /// Swap out NSImage (macOS) for the iOS equivalent
    NSImage *thumbNail = [[NSImage alloc] initWithCGImage:frameImage size:imageSize];

    /// CGImageRelease is a must to avoid memory leaks
    CGImageRelease(frameImage);
    return thumbNail;
}
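On iOS the generator output can be wrapped in a UIImage directly. A minimal sketch of the same approach (where `videoURL` is a placeholder for your asset's URL):

```objectivec
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

- (UIImage *)thumbnailForURL:(NSURL *)videoURL {
    AVAsset *asset = [AVAsset assetWithURL:videoURL];
    AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
    generator.appliesPreferredTrackTransform = YES;  // honor the track's rotation

    NSError *error = nil;
    CMTime actualTime;
    CGImageRef cgImage = [generator copyCGImageAtTime:CMTimeMakeWithSeconds(1, 600)
                                           actualTime:&actualTime
                                                error:&error];
    if (!cgImage) {
        return nil;  // inspect `error` here
    }

    UIImage *thumbnail = [UIImage imageWithCGImage:cgImage];
    CGImageRelease(cgImage);  // copy... returns an owned image
    return thumbnail;
}
```

Note that this captures a frame from the asset at a given time, not the exact frame currently on screen; for a live screenshot you would request `player.currentTime` instead of a fixed time.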
