Converting a CMSampleBuffer to an AVAudioPCMBuffer to get live audio frequencies

Asked: 2019-07-17 19:59:50

Tags: ios swift audio-streaming

I'm trying to read frequency values from the CMSampleBuffer delivered to the captureOutput callback of an AVCaptureAudioDataOutputSampleBufferDelegate.

The idea is to create an AVAudioPCMBuffer so that I can then read its:

floatChannelData

But I'm not sure how to pass the sample buffer to it, or how to fill its data.
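For context, the conversion has to happen inside the delegate callback, where the CMSampleBuffer arrives. This is a minimal sketch of that surrounding delegate (the class name is illustrative, not from the original post):

```swift
import AVFoundation

// Illustrative delegate: each audio chunk from the capture session arrives
// here as a CMSampleBuffer; the conversion code would run inside this method.
class AudioCaptureHandler: NSObject, AVCaptureAudioDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // `sampleBuffer` is the CMSampleBuffer to convert into an AVAudioPCMBuffer.
    }
}
```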

1 Answer:

Answer 0 (score: 0)

Something along these lines should help:

// Read the stream's format so the AVAudioFormat matches the incoming samples.
var asbd = CMSampleBufferGetFormatDescription(sampleBuffer)!.audioStreamBasicDescription!
var audioBufferList = AudioBufferList()
var blockBuffer: CMBlockBuffer?

// Expose the sample buffer's audio as an AudioBufferList backed by blockBuffer.
CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(
    sampleBuffer,
    bufferListSizeNeededOut: nil,
    bufferListOut: &audioBufferList,
    bufferListSize: MemoryLayout<AudioBufferList>.size,
    blockBufferAllocator: nil,
    blockBufferMemoryAllocator: nil,
    flags: 0,
    blockBufferOut: &blockBuffer
)

let mBuffers = audioBufferList.mBuffers
// Assumes 32-bit float samples; check asbd.mBitsPerChannel / mFormatFlags if unsure.
let frameLength = AVAudioFrameCount(Int(mBuffers.mDataByteSize) / MemoryLayout<Float>.size)
let pcmBuffer = AVAudioPCMBuffer(pcmFormat: AVAudioFormat(streamDescription: &asbd)!,
                                 frameCapacity: frameLength)!
pcmBuffer.frameLength = frameLength
// Point the PCM buffer's audio buffer list at the data retained by blockBuffer.
// Note: blockBuffer must stay alive for as long as pcmBuffer's data is read.
pcmBuffer.mutableAudioBufferList.pointee.mBuffers = mBuffers
pcmBuffer.mutableAudioBufferList.pointee.mNumberBuffers = 1

This appears to create a valid AVAudioPCMBuffer from the capture session's output. But for my use case it had the wrong frame length, so some further buffering was needed.
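Once the AVAudioPCMBuffer is valid, its floatChannelData can be read directly. As a rough illustration (not part of the original answer), here is a crude pitch estimate from zero crossings; a real frequency analysis would use an FFT, e.g. via Accelerate's vDSP:

```swift
import AVFoundation

// Illustrative helper: a rough fundamental-frequency estimate for a roughly
// periodic signal, counting zero crossings in the first channel's samples.
func estimateFrequency(of buffer: AVAudioPCMBuffer) -> Double? {
    guard let channelData = buffer.floatChannelData else { return nil }
    let samples = channelData[0]
    let n = Int(buffer.frameLength)
    guard n > 1 else { return nil }

    var crossings = 0
    for i in 1..<n where (samples[i - 1] < 0) != (samples[i] < 0) {
        crossings += 1
    }
    // One full cycle of a sine wave produces two zero crossings.
    return Double(crossings) / 2.0 * buffer.format.sampleRate / Double(n)
}
```

Zero-crossing counting only works for clean, near-sinusoidal input; for noisy live audio, an FFT over a windowed chunk is the usual approach.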