iOS AVFoundation audio/video out of sync

Posted: 2015-03-13 22:17:52

Tags: ios objective-c avfoundation avplayer audio-video-sync

Question:

On every playback, the audio ends up somewhere between 1 and 2 seconds behind the video.


Setup:

Assets are loaded as AVURLAssets from media streams.

To build the composition I use AVMutableCompositions and AVMutableCompositionTracks with asymmetric timescales. Both the audio and the video are streamed to the device. The audio timescale is 44100; the video timescale is 600.

Playback is done with AVPlayer.
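
As an illustrative aside that is not part of the original post: CMTime carries its own timescale, so a single composition can legitimately mix a 600 and a 44100 timescale. A minimal sketch using standard CoreMedia calls:

#import <CoreMedia/CoreMedia.h>

// Illustration only: the same 10-second duration expressed at the video
// timescale (600) and converted to the audio timescale (44100).
static void LogTimescaleExample(void)
{
    CMTime videoDuration = CMTimeMake(6000, 600);   // 10 s at timescale 600
    CMTime audioDuration = CMTimeConvertScale(videoDuration, 44100,
                                              kCMTimeRoundingMethod_Default);
    CMTimeShow(videoDuration);   // prints the duration at timescale 600
    CMTimeShow(audioDuration);   // prints the same duration at timescale 44100
}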


Attempted solutions:

  • Using videoAssetTrack.timeRange for [composition insertTimeRange]
  • Using CMTimeRangeMake(kCMTimeZero, videoAssetTrack.duration);
  • Using CMTimeRangeMake(kCMTimeZero, videoAssetTrack.timeRange.duration);

Code:

+(AVMutableComposition*)overlayAudio:(AVURLAsset*)audioAsset
                          withVideo:(AVURLAsset*)videoAsset
{
    AVMutableComposition* mixComposition = [AVMutableComposition composition];

    AVAssetTrack* audioTrack = [self getTrackFromAsset:audioAsset withMediaType:AVMediaTypeAudio];
    AVAssetTrack* videoTrack = [self getTrackFromAsset:videoAsset withMediaType:AVMediaTypeVideo];
    CMTime duration = videoTrack.timeRange.duration;

    AVMutableCompositionTrack* audioComposition = [self composeTrack:audioTrack withComposition:mixComposition andDuration:duration andMedia:AVMediaTypeAudio];
    AVMutableCompositionTrack* videoComposition = [self composeTrack:videoTrack withComposition:mixComposition andDuration:duration andMedia:AVMediaTypeVideo];
    [self makeAssertionAgainstAudio:audioComposition andVideo:videoComposition];
    return mixComposition;
}

+(AVAssetTrack*)getTrackFromAsset:(AVURLAsset*)asset withMediaType:(NSString*)mediaType
{
    return [[asset tracksWithMediaType:mediaType] objectAtIndex:0];
}

+(AVAssetExportSession*)configureExportSessionWithAsset:(AVMutableComposition*)composition toUrl:(NSURL*)url
{
    AVAssetExportSession* exportSession = [[AVAssetExportSession alloc] initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
    exportSession.outputFileType = @"com.apple.quicktime-movie";
    exportSession.outputURL = url;
    exportSession.shouldOptimizeForNetworkUse = YES;

    return exportSession;
}

-(IBAction)playVideo
{
    [avPlayer pause];
    avPlayerItem = [AVPlayerItem playerItemWithAsset:mixComposition];
    avPlayer = [[AVPlayer alloc]initWithPlayerItem:avPlayerItem];

    avPlayerLayer =[AVPlayerLayer playerLayerWithPlayer:avPlayer];
    [avPlayerLayer setFrame:CGRectMake(0, 0, 305, 283)];
    [avPlayerLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [playerView.layer addSublayer:avPlayerLayer];

    [avPlayer seekToTime:kCMTimeZero];
    [avPlayer play];
}
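
The composeTrack:withComposition:andDuration:andMedia: and makeAssertionAgainstAudio:andVideo: helpers referenced above are not shown in the post. As a point of reference only, here is a minimal sketch of what such a helper might look like; the body is an assumption, not the author's actual implementation. It also speaks to the aside below: insertTimeRange:ofTrack:atTime:error: is the AVMutableCompositionTrack API for copying a segment of a source track into a composition, which is why it shows up in composition-building code.

// Hypothetical sketch of the missing helper: add a composition track of the
// requested media type and copy the source track into it starting at time zero.
+(AVMutableCompositionTrack*)composeTrack:(AVAssetTrack*)assetTrack
                          withComposition:(AVMutableComposition*)composition
                              andDuration:(CMTime)duration
                                 andMedia:(NSString*)mediaType
{
    AVMutableCompositionTrack* compositionTrack =
        [composition addMutableTrackWithMediaType:mediaType
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    NSError* error = nil;
    // The range could also be built from assetTrack.timeRange, as listed under
    // "Attempted solutions" above.
    [compositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, duration)
                              ofTrack:assetTrack
                               atTime:kCMTimeZero
                                error:&error];
    if (error) {
        NSLog(@"insertTimeRange failed: %@", error);
    }
    return compositionTrack;
}

In overlayAudio:withVideo: above, a helper like this is called once for the audio track and once for the video track, both with the video track's duration.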

Notes:

I am not very familiar with the AVFoundation framework, and it is entirely possible that I am simply misusing the snippets I have provided. (For instance, why "insertTimeRange" for a composition?)

I can provide any other information needed to work this out, including debug asset track property values, network telemetry, streaming information, and so on.

1 answer:

Answer 0 (score: 1):

If the delay is consistent, then a latency is being imposed in order to sample the audio correctly. Apple's guides are generally easier to read than the accompanying books, and the note below deals specifically with latency.

https://developer.apple.com/library/ios/technotes/tn2258/_index.html

The programming guides will spell out the why and the what in detail.
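
If the measured offset really is constant, one possible workaround at the composition level, sketched here as an assumption rather than anything the answer prescribes, is to shift one track's insertion point. The helper name, the offsetSeconds parameter, and the choice to delay the video until it meets the late audio are all illustrative; variable names mirror the question's code.

#import <AVFoundation/AVFoundation.h>

// Hypothetical compensation: insert the video track at a shifted start time so
// it lines up with audio that arrives late. The offset must be measured from
// actual playback; it is not a value given in the answer.
+(void)insertVideoTrack:(AVAssetTrack*)videoTrack
        intoComposition:(AVMutableComposition*)mixComposition
             withOffset:(Float64)offsetSeconds
{
    CMTime offset = CMTimeMakeWithSeconds(offsetSeconds, 600);
    AVMutableCompositionTrack* videoCompositionTrack =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
    NSError* error = nil;
    [videoCompositionTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoTrack.timeRange.duration)
                                   ofTrack:videoTrack
                                    atTime:offset
                                     error:&error];
    if (error) {
        NSLog(@"insertTimeRange failed: %@", error);
    }
}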