Recording audio with Audio Units, segmenting the file every X seconds

Time: 2015-01-11 23:43:21

Tags: ios audio segments remoteio

I've been at this for a few days now. I'm not very familiar with the Audio Unit layer of the framework. Can someone point me to a complete example of how to let a user record while writing out a file every x seconds? For example: the user presses record; every 10 seconds I want to write a file, so at second 11 it starts writing the next file, and at second 21 the same thing again. Thus, when 25 seconds of audio have been recorded, it produces 3 different files.

I tried this with AVCapture, but it produced clicks and pops at the file boundaries. From what I've read, this is due to the milliseconds lost between stopping one write and starting the next. I also tried Audio Queue Services, but given the app I'm working on I need full control over the audio layer, so I decided to go with Audio Unit.

1 answer:

Answer 0: (score: 0)

I think I'm getting closer... or still quite lost. I ended up using The Amazing Audio Engine (TAAE). I'm now looking at AEAudioReceiver, and my callback code looks like the following. I believe the logic is right, but I don't think it's implemented correctly.

The task at hand: record ~5-second segments in AAC format.

The attempt: use the AEAudioReceiver callback and store the AudioBufferLists in a circular buffer. Track the number of seconds of audio received in the recorder class; once it passes the 5-second mark (it may go slightly over, but not past 6 seconds), call an Objective-C method that writes the file using AEAudioFileWriter.

The result: it didn't work. The recording sounds slowed down and distorted; I can hear some of the recording, so I know some data is there, but it's as if I'm losing a large amount of it. I'm not even sure how to debug this (I'll keep trying, but I'm fairly lost at the moment).

The other item is the conversion to AAC. Should I first write the file in PCM format and then convert it to AAC, or can I convert each audio segment to AAC directly?

Thanks a lot for any help!

----- Circular buffer initialization -----

//trying to hold 5 seconds of audio; how do I know what length I need if I don't know the frame size yet? And is that even the right question to ask?
TPCircularBufferInit(&_buffer, 1024 * 256);  

----- AEAudioReceiver callback -----

static void receiverCallback(__unsafe_unretained MyAudioRecorder *THIS,
                         __unsafe_unretained AEAudioController *audioController,
                         void *source,
                         const AudioTimeStamp *time,
                         UInt32 frames,
                         AudioBufferList *audio) {
//store the audio into the buffer
TPCircularBufferCopyAudioBufferList(&THIS->_buffer, audio, time, kTPCircularBufferCopyAll, NULL);

//accumulate the number of seconds of audio received so far
THIS.numberOfSecondInCurrentRecording += AEConvertFramesToSeconds(THIS.audioController, frames);

//once we cross the next 5-second boundary, write that segment to a file
//(note the parentheses: boundaries fall at 5 * (segment + 1), i.e. 5 s, 10 s, 15 s...)
if (THIS.numberOfSecondInCurrentRecording >= 5.0 * (THIS->_currentSegment + 1)) {

    NSLog(@"Segment %d is full, writing file", THIS->_currentSegment);
    [THIS writeBufferToFileHandler];

   //segment tracking variables
    THIS->_numberOfReceiverLoop = 0;
    THIS.lastTimeStamp = nil;
    THIS->_currentSegment += 1;
} else {
    THIS->_numberOfReceiverLoop += 1;
}

// Remember the timestamp of the first buffer in the current segment.
// Caution: 'time' points to caller-owned memory, so the struct should really
// be copied by value rather than kept as a pointer.
if (!THIS.lastTimeStamp) {
    THIS.lastTimeStamp = (AudioTimeStamp *)time;
}
}

---- Write to file (method in MyAudioRecorderClass) ----

- (void)writeBufferToFileHandler {

    NSString *documentsFolder = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES)
                                 objectAtIndex:0];

    NSString *filePath = [documentsFolder stringByAppendingPathComponent:
                          [NSString stringWithFormat:@"Segment_%d.aiff", _currentSegment]];

    NSError *error = nil;

    // Set up the file writer. (Open question: should the buffer be converted
    // to AAC first, or should the file be saved and then converted? And how?)
    AEAudioFileWriter *writeFile = [[AEAudioFileWriter alloc] initWithAudioDescription:_audioController.inputAudioDescription];
    [writeFile beginWritingToFileAtPath:filePath fileType:kAudioFileAIFFType error:&error];

    if (error) {
        NSLog(@"Error initializing the file: %@", error);
        return;
    }

    int i = 1;
    // Drain every AudioBufferList currently in the circular buffer. I retrieve
    // them relative to _lastTimeStamp, but passing NULL behaved the same way.
    while (1) {

        //NSLog(@"Processing buffer list for segment [%d] and buffer index [%d]", _currentSegment, i);
        AudioBufferList *nextBuffer = TPCircularBufferNextBufferList(&_buffer, _lastTimeStamp);

        // When the buffer runs out, we are done writing; exit the loop and
        // close the file. (Check for NULL *before* touching the buffer list.)
        if (!nextBuffer) {
            NSLog(@"Ran out of frames, there were [%d] AudioBufferLists", i - 1);
            break;
        }

        // AEAudioFileWriterAddAudio takes a length in *frames*, so divide the
        // byte count by the frame size; the original sizeof(...mDataByteSize)
        // only evaluated to sizeof(UInt32), i.e. 4.
        UInt32 lengthInFrames = nextBuffer->mBuffers[0].mDataByteSize
                                / _audioController.inputAudioDescription.mBytesPerFrame;
        OSStatus status = AEAudioFileWriterAddAudio(writeFile, nextBuffer, lengthInFrames);
        if (status) {
            NSLog(@"Writing error: %d", (int)status);
        }

        //consume/clear this buffer list and move on
        TPCircularBufferConsumeNextBufferList(&_buffer);
        i += 1;
    }

    //close the file and hope it worked
    [writeFile finishWriting];
}

----- Audio controller AudioStreamBasicDescription -----

//interleaved16BitStereoAudioDescription
AudioStreamBasicDescription audioDescription;
memset(&audioDescription, 0, sizeof(audioDescription));
audioDescription.mFormatID          = kAudioFormatLinearPCM;
audioDescription.mFormatFlags       = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked | kAudioFormatFlagsNativeEndian;
audioDescription.mChannelsPerFrame  = 2;
audioDescription.mBytesPerPacket    = sizeof(SInt16)*audioDescription.mChannelsPerFrame;
audioDescription.mFramesPerPacket   = 1;
audioDescription.mBytesPerFrame     = sizeof(SInt16)*audioDescription.mChannelsPerFrame;
audioDescription.mBitsPerChannel    = 8 * sizeof(SInt16);
audioDescription.mSampleRate        = 44100.0;