iOS: Audio missing from the exported video

Date: 2014-11-24 03:03:17

Tags: ios iphone xcode video-processing avassetexportsession

I am trying to export a recorded video, and the export itself succeeds, but the audio is missing from the final exported video. So I searched around and added the audio code below:

if ([[videoAsset tracksWithMediaType:AVMediaTypeAudio] count] > 0)
{
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                         atTime:kCMTimeZero error:nil];
}

But after adding the code above, I can no longer save the video. I get this error:

    session.status 4 error Error Domain=AVFoundationErrorDomain Code=-11841 "Operation Stopped" UserInfo=0x17027e140 {NSLocalizedDescription=Operation Stopped, NSLocalizedFailureReason=The video could not be composed.}

- (void)exportDidFinish:(AVAssetExportSession *)session {
    NSLog(@"session.status %ld error %@", session.status, session.error);
}

Below is the code I use to export the video. Any idea how I can export the video together with its audio? Thanks!

- (void)getVideoOutput {
    exportInProgress = YES;
    NSLog(@"videoOutputFileUrl %@", videoOutputFileUrl);
    AVAsset *videoAsset = [AVAsset assetWithURL:videoOutputFileUrl];
    NSLog(@"videoAsset %@", videoAsset);

    // 1 - Early exit if there's no video file selected
    if (!videoAsset) {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error"
                                                        message:@"Please Load a Video Asset First"
                                                       delegate:nil
                                              cancelButtonTitle:@"OK"
                                              otherButtonTitles:nil];
        [alert show];
        return;
    }

    // 2 - Create AVMutableComposition object. This object will hold your AVMutableCompositionTrack instances.
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];

    // 3 - Video track
    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                        ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                         atTime:kCMTimeZero error:nil];

    /* getting an error AVAssetExportSessionStatusFailed
    if ([[videoAsset tracksWithMediaType:AVMediaTypeAudio] count] > 0)
    {
        [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                            ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                             atTime:kCMTimeZero error:nil];
    } */

    // 3.1 - Create AVMutableVideoCompositionInstruction
    AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);

    // 3.2 - Create an AVMutableVideoCompositionLayerInstruction for the video track and fix the orientation.
    AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    AVAssetTrack *videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
    BOOL isVideoAssetPortrait_ = NO;
    CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
    if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
        videoAssetOrientation_ = UIImageOrientationRight;
        isVideoAssetPortrait_ = YES;
    }
    if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
        videoAssetOrientation_ = UIImageOrientationLeft;
        isVideoAssetPortrait_ = YES;
    }
    if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
        videoAssetOrientation_ = UIImageOrientationUp;
    }
    if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
        videoAssetOrientation_ = UIImageOrientationDown;
    }
    [videolayerInstruction setTransform:videoAssetTrack.preferredTransform atTime:kCMTimeZero];
    [videolayerInstruction setOpacity:0.0 atTime:videoAsset.duration];

    // 3.3 - Add instructions
    mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction, nil];

    AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];

    CGSize naturalSize;
    if (isVideoAssetPortrait_) {
        naturalSize = CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width);
    } else {
        naturalSize = videoAssetTrack.naturalSize;
    }

    float renderWidth = naturalSize.width;
    float renderHeight = naturalSize.height;
    mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
    mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
    mainCompositionInst.frameDuration = CMTimeMake(1, 30);

    int totalSeconds = (int)CMTimeGetSeconds(videoAsset.duration);
    [self applyVideoEffectsToComposition:mainCompositionInst size:naturalSize videoDuration:totalSeconds];

    // 4 - Get path
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:
                            [NSString stringWithFormat:@"FinalVideo-%d.mov", arc4random() % 1000]];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];

    // 5 - Create exporter
    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                      presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = url;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    exporter.videoComposition = mainCompositionInst;

    [exporter exportAsynchronouslyWithCompletionHandler:^{
        //dispatch_async(dispatch_get_main_queue(), ^{
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            [self exportDidFinish:exporter];
        });
    }];
}

1 Answer:

Answer 0 (score: 3)

I'm not sure whether it will help, but here is how I do it in my project:

  1. Prepare the final composition

    AVMutableComposition *composition = [[AVMutableComposition alloc] init];
    
  2. Prepare the video track

    AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    
  3. Prepare the audio track

    AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    
  4. Insert the video data from the asset into the video track

    AVAssetTrack *video = [[asset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:video atTime:kCMTimeZero error:&error];
    
  5. Insert the audio data from the asset into the audio track

    AVAssetTrack *audio = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration) ofTrack:audio atTime:kCMTimeZero error:&error];
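Note that `firstObject` returns nil when the asset has no audio track at all, and passing a nil track to `insertTimeRange:ofTrack:atTime:error:` raises an exception. A defensive version of the insertion above could look like this (a sketch using the same `asset` and `audioTrack` variables; the log messages are illustrative):

    AVAssetTrack *audio = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    if (audio) {
        NSError *error = nil;
        BOOL ok = [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                                      ofTrack:audio
                                       atTime:kCMTimeZero
                                        error:&error];
        if (!ok) {
            // The insertion itself can fail; don't swallow the error with error:nil
            NSLog(@"Could not insert audio: %@", error);
        }
    } else {
        NSLog(@"Asset has no audio track -- check that audio was actually recorded");
    }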
    
  6. Then you can add instructions to process your video and/or audio data
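For instance, a minimal pass-through video composition covering the whole timeline could be sketched as follows (assuming the `composition` and `videoTrack` objects created in the steps above; only needed if you actually transform the video, and it affects video frames only, never the audio):

    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);

    AVMutableVideoCompositionLayerInstruction *layerInstruction =
        [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    instruction.layerInstructions = @[layerInstruction];

    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.instructions = @[instruction];
    videoComposition.renderSize = videoTrack.naturalSize;
    videoComposition.frameDuration = CMTimeMake(1, 30); // 30 fps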

  7. Finally, you should be able to export with:

    AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetMediumQuality];
    [exporter exportAsynchronouslyWithCompletionHandler:^{ /* code when the export is complete */ }];
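The completion handler is also the right place to find out why an export failed. A sketch of a fuller handler (assuming `outputURL` points to a file that does not exist yet):

    exporter.outputURL = outputURL;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            switch (exporter.status) {
                case AVAssetExportSessionStatusCompleted:
                    NSLog(@"Export finished: %@", exporter.outputURL);
                    break;
                case AVAssetExportSessionStatusFailed:
                    // e.g. code -11841 indicates an invalid video composition
                    NSLog(@"Export failed: %@", exporter.error);
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"Export cancelled");
                    break;
                default:
                    break;
            }
        });
    }];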
    
  8. Also, check that you actually recorded audio. The first time the camera is used, iOS should ask for permission to use the microphone. If you allowed it, check your device settings.

    Alternatively, you can retrieve the raw asset via the Window > Devices window in Xcode: select your device and export its data to your computer. Then locate the recorded asset and open it with VLC. Press Cmd+I to inspect the streams and check whether both an audio and a video track are present.
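To verify in code whether the app is allowed to record audio at all, `AVAudioSession` (iOS 7 and later) can be queried; a sketch, with illustrative log messages:

    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    [audioSession requestRecordPermission:^(BOOL granted) {
        if (granted) {
            NSLog(@"Microphone access granted -- recordings should contain audio");
        } else {
            NSLog(@"Microphone access denied -- recorded videos will have no audio track");
        }
    }];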