Video exported with AVMutableComposition and AVAssetExportSession does not overlap the audio and video

Asked: 2014-06-30 23:06:05

Tags: ios avfoundation

I'm trying to add a 1s audio track ("dummy_recording.m4a") to the beginning of my 3s video, but the result I get is a video that is 6s long. It starts with the recording playing over a black background, then shows only the black background, and only at the end does the video appear. What am I doing wrong here? I just want the audio to overlap my video from the start.

-(void) addAudioToFileAtPath:(NSString *) filePath toPath:(NSString *)outFilePath completion:( void ( ^ ) () )completion
{
    NSString *audioFilePath = [[NSBundle mainBundle] pathForResource:@"dummy_recording"
                                                     ofType:@"m4a"];
    NSDictionary *audioInfoDictionary = @{@"audioFilePath": audioFilePath, @"audioDuration": [NSNumber numberWithFloat:1.0]};
    NSArray *audioInfoArray = @[audioInfoDictionary];

    NSError * error = nil;

    AVMutableComposition * composition = [AVMutableComposition composition];


    AVURLAsset * videoAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:filePath] options:nil];

    AVAssetTrack * videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                            preferredTrackID: kCMPersistentTrackID_Invalid];

    NSLog(@"videoAsset.duration... value: %lld, timescale: %d, seconds: %lld", videoAsset.duration.value, videoAsset.duration.timescale, videoAsset.duration.value / videoAsset.duration.timescale);

    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:videoAssetTrack atTime:kCMTimeZero
                                 error:&error];

    CMTime audioStartTime = kCMTimeZero;
    for (NSDictionary * audioInfo in audioInfoArray)
    {
        NSString * pathString = [audioInfo objectForKey:@"audioFilePath"];
        AVURLAsset * urlAsset = [AVURLAsset URLAssetWithURL:[NSURL fileURLWithPath:pathString] options:nil];

        AVAssetTrack * audioAssetTrack = [[urlAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        AVMutableCompositionTrack *compositionAudioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                preferredTrackID: kCMPersistentTrackID_Invalid];

        NSLog(@"urlAsset.duration... value: %lld, timescale: %d, seconds: %lld", urlAsset.duration.value, urlAsset.duration.timescale, urlAsset.duration.value / urlAsset.duration.timescale);

        [compositionAudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero,urlAsset.duration) ofTrack:audioAssetTrack atTime:audioStartTime error:&error];

        audioStartTime = CMTimeAdd(audioStartTime, CMTimeMake((int) (([[audioInfo objectForKey:@"audioDuration"] floatValue] * RECORDING_FPS) + 0.5), RECORDING_FPS));
    }

    NSLog(@"composition.duration... value: %lld, timescale: %d, seconds: %lld", composition.duration.value, composition.duration.timescale, composition.duration.value / composition.duration.timescale);

    AVAssetExportSession* assetExport = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];

    assetExport.outputFileType = AVFileTypeMPEG4;
    assetExport.outputURL = [NSURL fileURLWithPath:outFilePath];

    [assetExport exportAsynchronouslyWithCompletionHandler:
     ^(void ) {
         switch (assetExport.status)
         {
             case AVAssetExportSessionStatusCompleted:
                 //                export complete
                 NSLog(@"Export Complete");
                 completion();
                 break;
             case AVAssetExportSessionStatusFailed:
                 NSLog(@"Export Failed");
                 NSLog(@"ExportSessionError: %@", [assetExport.error localizedDescription]);
                 //                export error (see exportSession.error)
                 break;
             case AVAssetExportSessionStatusCancelled:
                 NSLog(@"Export Cancelled");
                 NSLog(@"ExportSessionError: %@", [assetExport.error localizedDescription]);
                 //                export cancelled
                 break;
         }
     }];    
}

Here are the results of the duration-logging statements. The composition is 3s long, which is what I want, but the export still comes out 6s long:

videoAsset.duration... value: 1840, timescale: 600, seconds: 3
urlAsset.duration... value: 87040, timescale: 44100, seconds: 1
composition.duration... value: 1840, timescale: 600, seconds: 3
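
One hedged thing to try before restructuring anything (the first answer below sets the same property): pin the export session's window to the composition's own duration, so the output cannot run past the 3s that was composed:

    // Sketch: clamp the export to exactly the composition's length.
    assetExport.timeRange = CMTimeRangeMake(kCMTimeZero, composition.duration);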

I created the 3s video file from a still image. Here is the code:

NSString *documentsFolder = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)
                             objectAtIndex:0];
NSString *path = [documentsFolder stringByAppendingPathComponent:@"test_video.mp4"];

DDLogInfo(@"path: %@", path);

NSFileManager *fileManager = [NSFileManager defaultManager];

NSError *removeItemError;
BOOL success = [fileManager removeItemAtPath:path error:&removeItemError];
if (success) {
    NSLog(@"removed file");
}
else
{
    NSLog(@"Could not delete file -:%@ ",[removeItemError localizedDescription]);
}

NSString *path2 = [documentsFolder stringByAppendingPathComponent:@"test_video_with_audio.mp4"];

DDLogInfo(@"path2: %@", path);

NSError *removeItemError2;
BOOL success2 = [fileManager removeItemAtPath:path2 error:&removeItemError2];
if (success2) {
    NSLog(@"removed file");
}
else
{
    NSLog(@"Could not delete file -:%@ ",[removeItemError2 localizedDescription]);
}

//1. Wire the writer.

NSError *error = nil;
AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                              [NSURL fileURLWithPath:path] fileType:AVFileTypeMPEG4
                                                          error:&error];
NSParameterAssert(videoWriter);

self.videoWriter = videoWriter;

NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               AVVideoCodecH264, AVVideoCodecKey,
                               [NSNumber numberWithInt:640], AVVideoWidthKey,
                               [NSNumber numberWithInt:640], AVVideoHeightKey,
                               nil];
AVAssetWriterInput* writerInput = [AVAssetWriterInput
                                    assetWriterInputWithMediaType:AVMediaTypeVideo
                                    outputSettings:videoSettings]; //retain should be removed if ARC

NSParameterAssert(writerInput);
NSParameterAssert([videoWriter canAddInput:writerInput]);
[videoWriter addInput:writerInput];


NSMutableDictionary *attributes = [[NSMutableDictionary alloc] init];
[attributes setObject:[NSNumber numberWithUnsignedInt:kCVPixelFormatType_32ARGB] forKey:(NSString*)kCVPixelBufferPixelFormatTypeKey];
[attributes setObject:[NSNumber numberWithUnsignedInt:640] forKey:(NSString*)kCVPixelBufferWidthKey];
[attributes setObject:[NSNumber numberWithUnsignedInt:640] forKey:(NSString*)kCVPixelBufferHeightKey];

AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                 assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                 sourcePixelBufferAttributes:attributes];

//2. Start a session

[videoWriter startWriting];
[videoWriter startSessionAtSourceTime:kCMTimeZero]; //use kCMTimeZero if unsure

UIImage *image = [UIImage imageNamed:@"dummy_square.jpg"];
CGImageRef cgImage = image.CGImage;

//3. Write some samples
//CVPixelBufferRef pixelBuffer = [self newPixelBufferFromCGImage:cgImage];
CVPixelBufferRef pixelBuffer = [self pixelBufferFromCGImage:cgImage];

BOOL result = [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:CMTimeMakeWithSeconds(3.0, RECORDING_FPS)];
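// (This appends a single frame, at t = 3.0s on the RECORDING_FPS timescale, into a session that started at kCMTimeZero.)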

if (result == NO)
    NSLog(@"failed to append buffer");
else
    NSLog(@"appended buffer!");

if(pixelBuffer)
{
    CVBufferRelease(pixelBuffer);
}

//4. Finish the session
[writerInput markAsFinished];
//[videoWriter endSessionAtSourceTime:…]; //optional: can call finishWriting without specifying an endTime

self.library = [[ALAssetsLibrary alloc] init];
__weak ALAssetsLibrary *lib = self.library;

[videoWriter finishWritingWithCompletionHandler:^{

    [self addAudioToFileAtPath:path toPath:path2 completion:^{

        NSString *albumName = @"Test Album";

        NSURL *pathUrl = [NSURL fileURLWithPath:path2]; // file URL, not initWithString:

        [lib addAssetsGroupAlbumWithName:albumName resultBlock:^(ALAssetsGroup *group) {

            ///checks if group previously created
            if(group == nil){

                //enumerate albums
                [lib enumerateGroupsWithTypes:ALAssetsGroupAlbum
                                   usingBlock:^(ALAssetsGroup *g, BOOL *stop)
                 {
                     //if the album is equal to our album
                     if ([[g valueForProperty:ALAssetsGroupPropertyName] isEqualToString:albumName]) {

                         [lib writeVideoAtPathToSavedPhotosAlbum:pathUrl completionBlock:^(NSURL *assetURL, NSError *error) {
                             //then get the image asseturl
                             [lib assetForURL:assetURL
                                  resultBlock:^(ALAsset *asset) {
                                      //put it into our album
                                      [g addAsset:asset];
                                  } failureBlock:^(NSError *error) {

                                  }];
                         }];

                     }
                 }failureBlock:^(NSError *error){

                 }];

            }else{
                [lib writeVideoAtPathToSavedPhotosAlbum:pathUrl completionBlock:^(NSURL *assetURL, NSError *error) {
                    //then get the image asseturl
                    [lib assetForURL:assetURL
                         resultBlock:^(ALAsset *asset) {
                             //put it into our album
                             [group addAsset:asset];
                         } failureBlock:^(NSError *error) {

                         }];
                }];
            }

        } failureBlock:^(NSError *error) {

        }];
    }];
}];
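
For comparison, a common pattern for a fixed-length still-image clip is to append the same frame at both the start and the end of the range, then close the session explicitly. This is only a sketch reusing adaptor, writerInput, and videoWriter from above (readiness checks omitted), not a confirmed fix for the 6s problem:

    CMTime endTime = CMTimeMakeWithSeconds(3.0, RECORDING_FPS);
    [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:kCMTimeZero]; // image visible from t = 0
    [adaptor appendPixelBuffer:pixelBuffer withPresentationTime:endTime];     // and held until t = 3s
    [videoWriter endSessionAtSourceTime:endTime]; // make the track length explicit
    [writerInput markAsFinished];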

2 Answers:

Answer 0 (score: 2)

Here I'll show you how to convert an array of images into a video and add music to it:

    NSError *error = nil;
    NSFileManager *fileMgr = [NSFileManager defaultManager];
    NSString *documentsDirectory12 = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
    NSString *videoOutputPath = [documentsDirectory12 stringByAppendingPathComponent:@"test_output.mp4"];
    if ([fileMgr removeItemAtPath:videoOutputPath error:&error] != YES)
        NSLog(@"Unable to delete file: %@", [error localizedDescription]);


    CGSize imageSize = CGSizeMake(480, 320);
    NSUInteger fps = 1;

    NSLog(@"Start building video from defined frames.");

    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                                  [NSURL fileURLWithPath:videoOutputPath] fileType:AVFileTypeQuickTimeMovie
                                                              error:&error];
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:imageSize.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:imageSize.height], AVVideoHeightKey,
                                   nil];

    AVAssetWriterInput* videoWriterInput = [AVAssetWriterInput
                                            assetWriterInputWithMediaType:AVMediaTypeVideo
                                            outputSettings:videoSettings];


    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                                     sourcePixelBufferAttributes:nil];

    NSParameterAssert(videoWriterInput);
    NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
    videoWriterInput.expectsMediaDataInRealTime = YES;
    [videoWriter addInput:videoWriterInput];

    //Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    CVPixelBufferRef buffer = NULL;

    //convert uiimage to CGImage.
    int frameCount = 0;
    //NSLog(@"fps :%f",(60.0-finalSec)/(float)[photoAlbumImages count]);
    float numberOfSecondsPerFrame =60.0/(float)[photoAlbumImages count];
    //NSLog(@"total fps :%f",numberOfSecondsPerFrame);
    float frameDuration = fps * numberOfSecondsPerFrame;
    NSLog(@"frame duration :%f",frameDuration);

    //for(VideoFrame * frm in imageArray)
    NSLog(@"**************************************************");
    for(UIImage * img12 in photoAlbumImages)
    {
        //UIImage * img = frm._imageFrame;
        buffer = [self pixelBufferFromCGImage:[img12 CGImage]];

        BOOL append_ok = NO;
        int j = 0;
        while (!append_ok && j < 30) {
            if (adaptor.assetWriterInput.readyForMoreMediaData)  {
                //print out status:
                NSLog(@"Processing video frame (%d,%d)",frameCount,[photoAlbumImages count]);

                CMTime frameTime12 = CMTimeMake((int64_t)(frameCount*frameDuration), fps); // cast: CMTimeMake takes an int64 value
               // NSLog(@"%@",frameTime12);
                NSLog(@"seconds = %f", CMTimeGetSeconds(frameTime12));
                append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime12];
                CVPixelBufferRelease(buffer);
                if(!append_ok){
                    NSError *error = videoWriter.error;
                    if(error!=nil) {
                        NSLog(@"Unresolved error %@,%@.", error, [error userInfo]);
                    }
                }
            }
            else {
                printf("adaptor not ready %d, %d\n", frameCount, j);
                [NSThread sleepForTimeInterval:0.3];
            }
            j++;
        }
        if (!append_ok) {
            printf("error appending image %d times %d\n, with error.", frameCount, j);
        }
        frameCount++;
    }
    NSLog(@"**************************************************");

    //Finish the session:
    [videoWriterInput markAsFinished];
    [videoWriter finishWritingWithCompletionHandler:^(){
        NSLog (@"finished writing");
    }];
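    // Note: finishWritingWithCompletionHandler returns immediately; strictly, the
    // muxing code below should be kicked off from inside that completion block.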

    NSLog(@"Write Ended");



    ////////////////////////////////////////////////////////////////////////////
    //////////////  OK now add an audio file to move file  /////////////////////
    AVMutableComposition* mixComposition = [AVMutableComposition composition];

    NSString *bundleDirectory = [[NSBundle mainBundle] bundlePath];
    // audio input file... (pathURL and addAudioAsset1 are presumably set elsewhere when the user has picked a custom audio file)
    AVURLAsset* audioAsset;
    if(pathURL==NULL){
    NSString *audio_inputFilePath = [bundleDirectory stringByAppendingPathComponent:@"30secs.mp3"];
    NSURL    *audio_inputFileUrl = [NSURL fileURLWithPath:audio_inputFilePath];
    audioAsset = [[AVURLAsset alloc]initWithURL:audio_inputFileUrl options:nil];
    }
    // this is the video file that was just written above, full path to file is in --> videoOutputPath
    NSURL    *video_inputFileUrl = [NSURL fileURLWithPath:videoOutputPath];

    // create the final video output file as MOV file - may need to be MP4, but this works so far...
    NSString *str=[NSString stringWithFormat:@"project1/imgtovid%@.mov",[self getCurrentDateTimeAsNSString]];
    NSString *outputFilePath = [documentsDirectory12 stringByAppendingPathComponent:str];
//     NSString* webStringURL = [outputFilePath stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
    NSURL    *outputFileUrl = [NSURL fileURLWithPath:outputFilePath];
    NSURL    *outputFileUrl1;
    if(outputFileUrl!=nil){
        NSString* webStringURL = [outputFilePath stringByAddingPercentEscapesUsingEncoding:NSUTF8StringEncoding];
        outputFileUrl1 = [NSURL URLWithString:webStringURL];
        [self.project1Array addObject:[NSString stringWithFormat:@"file://localhost%@",outputFileUrl1]];
        [positionArray addObject:[NSString stringWithFormat:@"file://localhost%@",outputFileUrl1]];

    }

    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];

    CMTime nextClipStartTime = kCMTimeZero;

    AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:video_inputFileUrl options:nil];
//    NSLog(@"duration - %f",videoAsset.duration);

    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero,videoAsset.duration);
    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];


    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    AVMutableCompositionTrack *b_compositionAudioTrack12 = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    if(pathURL==NULL){
        [b_compositionAudioTrack12 insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:nextClipStartTime error:nil];
    }else{
    [b_compositionAudioTrack12 insertTimeRange:audio_timeRange ofTrack:[[addAudioAsset1 tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:nextClipStartTime error:nil];
    }


    AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    _assetExport.outputFileType = AVFileTypeMPEG4; // same value as @"public.mpeg-4"
    _assetExport.outputURL = outputFileUrl;
    NSLog(@"duration = %f", CMTimeGetSeconds(videoAsset.duration));
    _assetExport.timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);

    [_assetExport exportAsynchronouslyWithCompletionHandler:^{
        // export finished; check _assetExport.status and _assetExport.error here
    }];
}

And here is how to make a pixel buffer from an image:

- (CVPixelBufferRef) pixelBufferFromCGImage: (CGImageRef)photimage {

    CGSize size = CGSizeMake(480, 320);

    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;

    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault,
                                          size.width,
                                          size.height,
                                          kCVPixelFormatType_32ARGB,
                                          (__bridge CFDictionaryRef) options,
                                          &pxbuffer);
    if (status != kCVReturnSuccess){
        NSLog(@"Failed to create pixel buffer");
        return NULL; // don't fall through and lock a NULL buffer
    }

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width,
                                                 size.height, 8, 4*size.width, rgbColorSpace,
                                                 kCGImageAlphaPremultipliedFirst); // or kCGImageAlphaNoneSkipFirst
    CGContextDrawImage(context, CGRectMake(0, 0, size.width,
                                           size.height), photimage);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);
    return pxbuffer;
}
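
A minimal usage sketch for this helper (the image name is hypothetical); note that the buffer comes back with a +1 retain count from CVPixelBufferCreate, so the caller must release it, exactly as the frame loop above does:

    UIImage *img = [UIImage imageNamed:@"some_frame.jpg"]; // hypothetical asset
    CVPixelBufferRef buf = [self pixelBufferFromCGImage:img.CGImage];
    if (buf) {
        BOOL ok = [adaptor appendPixelBuffer:buf withPresentationTime:kCMTimeZero];
        if (!ok) NSLog(@"append failed: %@", videoWriter.error);
        CVPixelBufferRelease(buf); // balances the Create inside the helper
    }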

Feel free to reach out if you need more help.

Answer 1 (score: 1)

I had the same requirement of making a movie from images and sounds. I used this method to combine the movie and audio.

-(void)CompileFilesToMakeMovie
{
    AVMutableComposition* mixComposition = [AVMutableComposition composition];


    //AUDIO FILE PATH
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                         NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString* path = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"final_%@.m4a",appDelegate.storyName]];
    NSURL*    audio_inputFileUrl = [NSURL fileURLWithPath:path];



    //VIDEO FILE PATH
    NSString* path1 = [documentsDirectory stringByAppendingPathComponent:@"temp.mp4"];
    NSURL*    video_inputFileUrl = [NSURL fileURLWithPath:path1];

    //FINAL VIDEO PATH
    NSString* path2 = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"outputFile_%@.mp4",appDelegate.storyName]];
    NSURL*    outputFileUrl = [NSURL fileURLWithPath:path2];
    NSLog(@"%@",path2);
    if ([[NSFileManager defaultManager] fileExistsAtPath:path2])
        [[NSFileManager defaultManager] removeItemAtPath:path2 error:nil];



    CMTime nextClipStartTime = kCMTimeZero;

    AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:video_inputFileUrl options:nil];



    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero,videoAsset.duration);
    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    //nextClipStartTime = CMTimeAdd(nextClipStartTime, video_timeRange.duration);

    AVURLAsset* audioAsset = [[AVURLAsset alloc]initWithURL:audio_inputFileUrl options:nil];
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
    AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:CMTimeMake(2,1) error:nil];

    AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPreset640x480];
    _assetExport.outputFileType = @"com.apple.quicktime-movie";
    _assetExport.outputURL = outputFileUrl;

    [_assetExport exportAsynchronouslyWithCompletionHandler:
     ^(void ) {

         //VIDEO COMPLETED
     }
     ];
}
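
One detail to note relative to the question: this answer inserts the audio at CMTimeMake(2, 1), i.e. 2 seconds into the composition. To have the audio overlap the video from the very beginning, pass kCMTimeZero as the atTime: argument instead.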

Sorry, the code is a bit messy, but I think it will work. Let me know if you need me to explain anything.

Edit: the movie-making code

#pragma mark - Movie Making Code
-(void)makeMovie{
    //making movie and audio code ll be here
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                         NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString* path = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"temp.mp4"]];
    NSLog(@"%@",path);
    if ([[NSFileManager defaultManager] fileExistsAtPath:path])
        [[NSFileManager defaultManager] removeItemAtPath:path error:nil];


    CGSize sz = CGSizeMake(480, 320);
    [self writeImageAsMovie:appDelegate.imageArray toPath:path size:sz];

}


- (void)writeImageAsMovie:(NSMutableArray *)image toPath:(NSString*)path size:(CGSize)size
{
    //last screen adding

    UIImage *tempImg=[UIImage imageNamed:@"powered_by.png"];
    [image addObject:tempImg];
    //last screen adding end



    NSError *error = nil;
    AVAssetWriter *videoWriter = [[AVAssetWriter alloc] initWithURL:
                                  [NSURL fileURLWithPath:path] fileType:AVFileTypeQuickTimeMovie
                                                              error:&error];
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                                   nil];
    AVAssetWriterInput* writerInput = [AVAssetWriterInput
                                       assetWriterInputWithMediaType:AVMediaTypeVideo
                                       outputSettings:videoSettings];

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                     sourcePixelBufferAttributes:nil];

    NSParameterAssert(writerInput);
    NSParameterAssert([videoWriter canAddInput:writerInput]);
    [videoWriter addInput:writerInput];
    writerInput.expectsMediaDataInRealTime = YES;
    //Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    //Write samples:
    /* CVPixelBufferRef buffer = [self pixelBufferFromCGImage:image.CGImage size:size];
     [adaptor appendPixelBuffer:buffer withPresentationTime:kCMTimeZero];
     [adaptor appendPixelBuffer:buffer withPresentationTime:CMTimeMake(duration-1, 2)];
     */
    float nxttime=0;
    float time=0;
    CVPixelBufferRef buffer=nil;
    @autoreleasepool {
        for(int i=0;i<image.count;i++)
        {

            NSLog(@"%d",image.count);
            if([writerInput isReadyForMoreMediaData])
            {
                if(i!=0 && i!=4)
                {
                    NSLog(@"AUDIO DURATION:%@",appDelegate.audioDuration);
                    nxttime=nxttime+time;
                    time=([[appDelegate.audioDuration objectAtIndex:i-1] floatValue])+1;

                }
                else if(i==0)
                {
                    nxttime=0;
                    time=3;
                }
                else if(i==4)
                {
                    nxttime=nxttime+5;


                }

                buffer = [self pixelBufferFromCGImage:[[image objectAtIndex:i] CGImage] size:size];
                CMTime frameTime = CMTimeMake(time, 1 );
                CMTime lastTime=CMTimeMake(nxttime, 1);
                CMTime presentTime=CMTimeAdd(lastTime, frameTime);
               // NSLog(@"%d: ft:%@ Lt:%@ PT:%@",i,frameTime,lastTime,presentTime);
                [adaptor appendPixelBuffer:buffer withPresentationTime:lastTime];
            }
            else
            {
                // Writer not ready yet: retry the same frame. (This busy-waits;
                // requestMediaDataWhenReadyOnQueue:usingBlock: is the cleaner way.)
                i=i-1;
            }

        }
    }
    //Finish the session:
    [writerInput markAsFinished];
    [self latestCombineVoices];
    [videoWriter finishWriting];


}
- (CVPixelBufferRef) pixelBufferFromCGImage:(CGImageRef)image size:(CGSize)size
{
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;
    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width,
                                          size.height, kCVPixelFormatType_32ARGB, (__bridge CFDictionaryRef)options,
                                          &pxbuffer);
    (void)status; // silence the unused-variable warning in release builds
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width,
                                                 size.height, 8, 4*size.width, rgbColorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    NSParameterAssert(context);

    // CGContextTranslateCTM(context, 0, CGImageGetHeight(image));
    //CGContextScaleCTM(context, 1.0, -1.0);//Flip vertically to account for different origin

    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                           CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}

In my case, for example, there are img1, img2, img3 and sound1, sound2, sound3, so I make a movie in which img1 is displayed for sound1's duration, and so on.
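
Reduced to a sketch, the timing idea is: each image's presentation time is the running sum of the preceding sound durations. Here images is an array of UIImage and audioDurations holds one NSNumber of seconds per image (both names assumed), using the pixelBufferFromCGImage:size: helper from above:

    CMTime presentTime = kCMTimeZero;
    for (NSUInteger i = 0; i < images.count; i++) {
        CVPixelBufferRef buf = [self pixelBufferFromCGImage:[images[i] CGImage] size:size];
        // Image i appears where sound i starts...
        [adaptor appendPixelBuffer:buf withPresentationTime:presentTime];
        CVPixelBufferRelease(buf);
        // ...and stays up for sound i's duration.
        presentTime = CMTimeAdd(presentTime, CMTimeMakeWithSeconds([audioDurations[i] doubleValue], 600));
    }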
