Exporting a cropped video - different results for the front/back cameras

Time: 2015-09-08 10:59:13

Tags: ios core-graphics avfoundation avassetwriter avmutablecomposition

I'm trying to take a video captured with the camera and export it as a square. I'm testing on an iPad Air, which has both a front and a back camera.

When I record with the back camera, everything works well: the video is cropped exactly the way I want. Unfortunately, when I try to export a video recorded with the front camera, it comes out wrong.

The translation component of the transform seems to be wrong, because I end up with a large black bar at the bottom of the video. Does anyone know what I'm doing wrong?

Note: I'm testing on iOS 9. I'm not sure whether that could be the source of the problem.

- (AVComposition *)trimmedAndCroppedVideoComposition
{
    AVMutableComposition *composition = [AVMutableComposition composition];

    AVURLAsset *sourceAsset = [[AVURLAsset alloc] initWithURL:self.media.videoURL
                                                      options:@{AVURLAssetPreferPreciseDurationAndTimingKey: @(YES)}];
    CMTimeRange timeRange = self.media.trimmedVideoRange;
    [composition insertTimeRange:timeRange ofAsset:sourceAsset atTime:kCMTimeZero error:nil];

    AVAssetTrack *track = [[sourceAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    AVMutableCompositionTrack *compositionTrack = [[composition tracksWithMediaType:AVMediaTypeVideo] firstObject];

    CGSize videoSize = CGSizeApplyAffineTransform(track.naturalSize, track.preferredTransform);
    videoSize = CGSizeMake(fabs(videoSize.width), fabs(videoSize.height));

    CGFloat fillScale = MAX(self.renderSize.width / videoSize.width,
                            self.renderSize.height / videoSize.height);

    CGAffineTransform orientationTransform = track.preferredTransform;
    if (orientationTransform.tx == videoSize.width || orientationTransform.tx == videoSize.height) {
        orientationTransform.tx = self.renderSize.width;
    }

    if (orientationTransform.ty == videoSize.width || orientationTransform.ty == videoSize.height) {
        orientationTransform.ty = self.renderSize.width;
    }

    CGAffineTransform t1 = CGAffineTransformScale(CGAffineTransformIdentity, fillScale, fillScale);
    CGAffineTransform t2 = CGAffineTransformConcat(t1, orientationTransform);
    CGRect cropRect = CGRectMake(0, 0.5, 1, 0.5);
    CGAffineTransform t3 = CGAffineTransformConcat(t2, CGAffineTransformMakeTranslation
                                                   (-cropRect.origin.x * videoSize.width * fillScale,
                                                    -cropRect.origin.y * videoSize.height * fillScale));
    compositionTrack.preferredTransform = t3;
    return [composition copy];
}

- (void)_exportVideo:(void (^)(void))completion
{   
    // Trimmed and cropped Asset
    AVComposition *trimmedAsset = [self trimmedAndCroppedVideoComposition];

    // Input clip
    AVAssetTrack *clipVideoTrack = [[trimmedAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];

    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, trimmedAsset.duration);

    // Apply transform
    AVMutableVideoCompositionLayerInstruction* transformer = [AVMutableVideoCompositionLayerInstruction
                                                              videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];
    CGAffineTransform finalTransform = clipVideoTrack.preferredTransform;
    [transformer setTransform:finalTransform atTime:kCMTimeZero];
    instruction.layerInstructions = [NSArray arrayWithObject:transformer];

    // Make it square
    AVMutableVideoComposition* videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.renderSize = CGSizeMake(self.renderSize.width,
                                             self.renderSize.height);
    videoComposition.frameDuration = CMTimeMake(1, 30);
    videoComposition.instructions = [NSArray arrayWithObject: instruction];

    // Export
    self.exporter = [[AVAssetExportSession alloc] initWithAsset:trimmedAsset presetName:AVAssetExportPresetMediumQuality];
    self.exporter.videoComposition = videoComposition;
    self.exporter.outputURL = [NSURL fileURLWithPath:outputPath];
    self.exporter.outputFileType = AVFileTypeQuickTimeMovie;

    [self.exporter exportAsynchronouslyWithCompletionHandler:^(void){
        ...
    }];
}

0 Answers:

There are no answers yet.