Playing audio from the microphone

Asked: 2018-01-09 08:53:35

Tags: ios iphone swift avfoundation

Goal: stream audio/video from one device to another.

Problem: I managed to get both audio and video across, but the audio won't play on the other end.

Details:

I created an app that transmits A/V data from one device to another over the network. To keep this question short, I'll just show where I'm stuck. I managed to listen to the output delegate, where I extract the audio information, convert it to Data, and pass it to a delegate I created.

func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
    // VIDEO | code excluded for simplicity of this question as this part works

    // AUDIO | only deliver the frames if you are allowed to
    if self.produceAudioFrames == true {
        // process the audio buffer
        let _audioFrame = self.audioFromSampleBuffer(sampleBuffer)
        // process in async
        DispatchQueue.main.async {
            // pass the audio frame to the delegate
            self.delegate?.audioFrame(data: _audioFrame)
        }
    }
}

Helper function to convert the SampleBuffer (not my code and I can't find the source; I know I found it on SO):

func audioFromSampleBuffer(_ sampleBuffer: CMSampleBuffer) -> Data {

    var audioBufferList = AudioBufferList()
    var data = Data()
    var blockBuffer : CMBlockBuffer?

    CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(sampleBuffer, 
                                                            nil,
                                                            &audioBufferList, 
                                                            MemoryLayout<AudioBufferList>.size, 
                                                            nil, 
                                                            nil, 
                                                            0, 
                                                            &blockBuffer)

    let buffers = UnsafeBufferPointer<AudioBuffer>(start: &audioBufferList.mBuffers, 
                                                   count: Int(audioBufferList.mNumberBuffers))
    for audioBuffer in buffers {
        let frame = audioBuffer.mData?.assumingMemoryBound(to: UInt8.self)
        data.append(frame!, count: Int(audioBuffer.mDataByteSize))
    }
    // debug
    // print("audio buffer count: \(buffers.count)") // this prints 2048
    // give the raw data back to the caller
    return data
}
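One thing worth noting: the helper above ships only the raw bytes and drops the stream format, so the client has to hard-code 44100 Hz mono Float32 (as `bytesToAudioBuffer` further down does). A small sketch of a wire format that carries the sample rate and channel count alongside the PCM bytes; the function names and layout here are my own invention, not from the post (on the host, the real values can be read via `CMSampleBufferGetFormatDescription` / `CMAudioFormatDescriptionGetStreamBasicDescription`):

```swift
import Foundation

// Hypothetical wire format: 8 bytes sample rate (Double bit pattern, little
// endian), 4 bytes channel count (UInt32, little endian), then the PCM bytes.
func packAudioPayload(sampleRate: Double, channels: UInt32, pcm: Data) -> Data {
    var out = Data()
    withUnsafeBytes(of: sampleRate.bitPattern.littleEndian) { out.append(contentsOf: $0) }
    withUnsafeBytes(of: channels.littleEndian) { out.append(contentsOf: $0) }
    out.append(pcm)
    return out
}

func unpackAudioPayload(_ data: Data) -> (sampleRate: Double, channels: UInt32, pcm: Data)? {
    guard data.count >= 12 else { return nil }
    let rateBits = data.subdata(in: 0..<8).withUnsafeBytes { $0.load(as: UInt64.self) }
    let chanBits = data.subdata(in: 8..<12).withUnsafeBytes { $0.load(as: UInt32.self) }
    return (Double(bitPattern: UInt64(littleEndian: rateBits)),
            UInt32(littleEndian: chanBits),
            data.subdata(in: 12..<data.count))
}
```

The host would then send `packAudioPayload(...)` instead of the bare bytes, and the client could build its AVAudioFormat from the unpacked values instead of assuming them.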

Note: before sending it over the network, I convert the Data returned from the helper func with: let payload = Array(data)
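The Data to [UInt8] round trip itself is lossless, which can be sanity-checked in isolation (a trivial sketch of my own, not from the post):

```swift
import Foundation

// Sanity check: converting Data to [UInt8] and back preserves every byte,
// so the array conversion step is not where the audio gets corrupted.
let original = Data([0x00, 0x7F, 0xFF, 0x10])
let payload = Array(original)   // what the host sends
let restored = Data(payload)    // what the client rebuilds
assert(restored == original)
```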

That's the host side.

On the client side I receive the payload as [UInt8], and this is where I'm stuck. I have tried a lot of things, but none of them work.

func processIncomingAudioPayloadFromFrame(_ ID: String, _ _Data: [UInt8]) {
    let readableData = Data(bytes: _Data) // back from array to the data before we sent it over the network.
    print(readableData.count) // still 2048 even after receiving from the network, so I am guessing the data is still intact

    let x = self.bytesToAudioBuffer(_Data) // option two convert into a AVAudioPCMBuffer
    print(x) // prints | <AVAudioPCMBuffer@0x600000201e80: 2048/2048 bytes> | I am guessing it works

    // option one | play using AVAudioPlayer
    do {
        let player = try AVAudioPlayer(data: readableData)
        try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
        try AVAudioSession.sharedInstance().setActive(true)
        player.prepareToPlay()
        player.play()
        print(player.volume) // doing this to see if this is reached
    }catch{
        print(error) // gets error | Error Domain=NSOSStatusErrorDomain Code=1954115647 "(null)"
    }
}

Here is the helper function that converts [UInt8] into an AVAudioPCMBuffer:

func bytesToAudioBuffer(_ buf: [UInt8]) -> AVAudioPCMBuffer {
    // format assumption! make this part of your protocol?
    let fmt = AVAudioFormat(commonFormat: .pcmFormatFloat32, sampleRate: 44100, 
                            channels: 1, interleaved: true)
    let frameLength = UInt32(buf.count) / fmt.streamDescription.pointee.mBytesPerFrame

    let audioBuffer = AVAudioPCMBuffer(pcmFormat: fmt, frameCapacity: frameLength)
    audioBuffer.frameLength = frameLength

    let dstLeft = audioBuffer.floatChannelData![0]
    // for stereo
    // let dstRight = audioBuffer.floatChannelData![1]

    buf.withUnsafeBufferPointer {
        let src = UnsafeRawPointer($0.baseAddress!)
            .bindMemory(to: Float.self, capacity: Int(frameLength))
        dstLeft.initialize(from: src, count: Int(frameLength))
    }
    return audioBuffer
}
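As an aside, NSOSStatusErrorDomain code 1954115647 is the FourCC 'typ?' (kAudioFileUnsupportedFileTypeError): AVAudioPlayer expects a recognizable audio file format (MP3, WAV, CAF, ...), so it cannot play headerless raw PCM. Buffers like the one bytesToAudioBuffer produces are normally scheduled onto an AVAudioPlayerNode instead. A minimal sketch of that approach; the PCMPlayer class is my own illustration, not from the post, and it assumes a deinterleaved "standard" format, which for the mono case carries the same Float32 samples as the interleaved format above:

```swift
import AVFoundation

// Sketch: play incoming AVAudioPCMBuffers with AVAudioEngine + AVAudioPlayerNode.
final class PCMPlayer {
    private let engine = AVAudioEngine()
    private let player = AVAudioPlayerNode()

    // The format must match the buffers you schedule,
    // e.g. AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1).
    init(format: AVAudioFormat) {
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)
    }

    func start() throws {
        try engine.start()
        player.play()
    }

    // Call this for every frame that arrives from the network; scheduled
    // buffers are queued and played back to back.
    func enqueue(_ buffer: AVAudioPCMBuffer) {
        player.scheduleBuffer(buffer, completionHandler: nil)
    }
}
```

Usage on the client would be roughly: create one PCMPlayer with the same AVAudioFormat used when building the buffers, call start() once, then call enqueue(_:) from processIncomingAudioPayloadFromFrame for each converted buffer.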

Questions:

  1. Is it possible to play the audio directly from the [UInt8] payload?
  2. How can I play the AVAudioPCMBuffer payload using AVAudioEngine? Is that even possible?
  3. In general, how do I play the audio on the client side?

Footnote: the comments in the code should give you some hints about the output I expect. Also, I don't want to save to a file or anything file-related; I only want to amplify the microphone for live listening, and I have no interest in saving the data.

1 answer:

Answer 0 (score: 0)

I used the same code to play an audio file during a carrier call.

Please try it and let me know the result:

Objective-C code:

NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:self.bgAudioFileName
                                                          ofType:@"mp3"];
NSURL *fileURL = [[NSURL alloc] initFileURLWithPath:soundFilePath];

myAudioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:nil];
myAudioPlayer.numberOfLoops = -1;

NSError *sessionError = nil;

// Change the default output audio route
AVAudioSession *audioSession = [AVAudioSession sharedInstance];

// get your audio session somehow
[audioSession setCategory:AVAudioSessionCategoryMultiRoute error:&sessionError];

BOOL success = [audioSession overrideOutputAudioPort:AVAudioSessionPortOverrideNone
                                               error:&sessionError];
[audioSession setActive:YES error:&sessionError];
if (!success) {
    NSLog(@"error doing outputaudioportoverride - %@", [sessionError localizedDescription]);
}
[myAudioPlayer setVolume:1.0f];
[myAudioPlayer play];

Swift version:

let soundFilePath: String? = Bundle.main.path(forResource: bgAudioFileName, ofType: "mp3")
let fileURL = URL(fileURLWithPath: soundFilePath ?? "")
myAudioPlayer = try? AVAudioPlayer(contentsOf: fileURL)
myAudioPlayer.numberOfLoops = -1

// Change the default output audio route
let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(AVAudioSessionCategoryMultiRoute)
    try audioSession.overrideOutputAudioPort(.none)
    try audioSession.setActive(true)
} catch {
    print("error doing outputaudioportoverride - \(error.localizedDescription)")
}
myAudioPlayer.volume = 1.0
myAudioPlayer.play()