Can you play audio directly from a CMSampleBuffer?

Question

I'm capturing microphone audio during an ARSession and want to pass it to another VC and play it back after the capture has happened, while the app is still running (and the audio is still in memory).

The audio is currently captured as individual CMSampleBuffers, accessed via the didOutputAudioSampleBuffer ARSessionDelegate method.
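For context, the capture side is set up roughly like this (storedBuffers is just an illustrative in-memory store, not the exact code):

import UIKit
import ARKit

class CaptureViewController: UIViewController, ARSessionDelegate {

    let session = ARSession()
    var storedBuffers: [CMSampleBuffer] = []   // illustrative in-memory store

    override func viewDidLoad() {
        super.viewDidLoad()
        session.delegate = self

        // Audio buffers are only delivered if the configuration asks for them.
        let configuration = ARWorldTrackingConfiguration()
        configuration.providesAudioData = true
        session.run(configuration)
    }

    // Called once per captured audio buffer during the ARSession.
    func session(_ session: ARSession, didOutputAudioSampleBuffer audioSampleBuffer: CMSampleBuffer) {
        storedBuffers.append(audioSampleBuffer)
    }
}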

I've worked with audio files and AVAudioPlayer before, but CMSampleBuffer is new to me.

Is there a way to take the raw buffers as-is and play them back? If so, which classes can do this? Or do they need to be rendered/converted to some other format or a file first?

Here is the format description of the data in the buffers:

mediaType:'soun' 
    mediaSubType:'lpcm' 
    mediaSpecific: {
        ASBD: {
            mSampleRate: 44100.000000 
            mFormatID: 'lpcm' 
            mFormatFlags: 0xc 
            mBytesPerPacket: 2 
            mFramesPerPacket: 1 
            mBytesPerFrame: 2 
            mChannelsPerFrame: 1 
            mBitsPerChannel: 16     } 
        cookie: {(null)} 
        ACL: {Mono}
        FormatList Array: {
            Index: 0 
            ChannelLayoutTag: 0x640001 
            ASBD: {
            mSampleRate: 44100.000000 
            mFormatID: 'lpcm' 
            mFormatFlags: 0xc 
            mBytesPerPacket: 2 
            mFramesPerPacket: 1 
            mBytesPerFrame: 2 
            mChannelsPerFrame: 1 
            mBitsPerChannel: 16     }} 
    } 
    extensions: {(null)}

Any guidance is appreciated, since Apple's documentation on this isn't clear and the related questions on SO tend to deal with live audio streaming rather than capture and later playback.

Answer

It seems the answer is no: you can't simply save the raw buffers and play them back; they first need to be converted to a more persistent format.

The main way to do this is to use AVAssetWriter to save the buffer data to an audio file, which can then be played back later with AVAudioPlayer.
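A minimal sketch of that approach, assuming the captured buffers have been collected in a storedBuffers array (hypothetical) and are written out once capture has finished:

import AVFoundation

// Write the captured LPCM sample buffers to an .m4a (AAC) file for later playback.
func writeBuffers(_ storedBuffers: [CMSampleBuffer], to outputURL: URL) throws {
    let writer = try AVAssetWriter(outputURL: outputURL, fileType: .m4a)

    let settings: [String: Any] = [
        AVFormatIDKey: kAudioFormatMPEG4AAC,
        AVSampleRateKey: 44_100,
        AVNumberOfChannelsKey: 1
    ]
    let input = AVAssetWriterInput(mediaType: .audio, outputSettings: settings)
    input.expectsMediaDataInRealTime = false
    writer.add(input)

    writer.startWriting()
    // Start the writer's timeline at the first buffer's presentation time.
    if let first = storedBuffers.first {
        writer.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(first))
    }

    for buffer in storedBuffers {
        // In production code, drive this from requestMediaDataWhenReady(on:using:) instead of polling.
        while !input.isReadyForMoreMediaData { usleep(1_000) }
        input.append(buffer)
    }

    input.markAsFinished()
    writer.finishWriting {
        // outputURL can now be handed to AVAudioPlayer(contentsOf:).
    }
}

Once finishWriting completes, the file can be loaded with AVAudioPlayer(contentsOf: outputURL) and played as usual.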


Alternatively, you can pass the microphone input straight to an audio engine while recording, with minimal latency:

let audioEngine = AVAudioEngine()
...
// Route the microphone input straight to the output for live monitoring.
self.audioEngine.connect(self.audioEngine.inputNode, to: self.audioEngine.mainMixerNode, format: nil)
try self.audioEngine.start()

If working with the sample buffers themselves is important, roughly speaking it can be done by converting them to PCM buffers:

import AVFoundation

extension AVAudioPCMBuffer {
    static func create(from sampleBuffer: CMSampleBuffer) -> AVAudioPCMBuffer? {

        // Pull the sample rate and channel count out of the buffer's format description.
        guard let description: CMFormatDescription = CMSampleBufferGetFormatDescription(sampleBuffer),
              let sampleRate: Float64 = description.audioStreamBasicDescription?.mSampleRate,
              let channelsPerFrame: UInt32 = description.audioStreamBasicDescription?.mChannelsPerFrame
        else { return nil }

        guard let blockBuffer: CMBlockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer) else {
            return nil
        }

        let samplesCount = CMSampleBufferGetNumSamples(sampleBuffer)

        // Target format: non-interleaved Float32, mono, at the source sample rate.
        guard let audioFormat = AVAudioFormat(commonFormat: .pcmFormatFloat32,
                                              sampleRate: sampleRate,
                                              channels: 1,
                                              interleaved: false),
              let buffer = AVAudioPCMBuffer(pcmFormat: audioFormat,
                                            frameCapacity: AVAudioFrameCount(samplesCount))
        else { return nil }
        buffer.frameLength = buffer.frameCapacity

        // Get a pointer to the raw Int16 samples held in the block buffer.
        var dataPointer: UnsafeMutablePointer<Int8>?
        CMBlockBufferGetDataPointer(blockBuffer,
                                    atOffset: 0,
                                    lengthAtOffsetOut: nil,
                                    totalLengthOut: nil,
                                    dataPointerOut: &dataPointer)

        guard var channel: UnsafeMutablePointer<Float> = buffer.floatChannelData?[0],
              let data = dataPointer else { return nil }

        var data16 = UnsafeRawPointer(data).assumingMemoryBound(to: Int16.self)

        // Convert each Int16 sample to Float32 in [-1, 1], keeping only the first channel.
        for _ in 0..<samplesCount {
            channel.pointee = Float32(data16.pointee) / Float32(Int16.max)
            channel += 1
            data16 += Int(channelsPerFrame)
        }

        return buffer
    }
}
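Note that the Float32(data16.pointee) / Float32(Int16.max) scaling assumes the packed 16-bit signed-integer mono LPCM shown in the question's format description (mFormatFlags 0xc is kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked); buffers in another format would need a different conversion.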


class BufferPlayer {

    let audioEngine = AVAudioEngine()
    let player = AVAudioPlayerNode()

    deinit {
        self.audioEngine.stop()
    }

    init(withBuffer: CMSampleBuffer) {

        self.audioEngine.attach(self.player)

        // Connect the player node to the mixer using the format of the converted buffer.
        self.audioEngine.connect(self.player,
                                 to: self.audioEngine.mainMixerNode,
                                 format: AVAudioPCMBuffer.create(from: withBuffer)!.format)

        _ = try? audioEngine.start()
    }

    func playEnqueue(buffer: CMSampleBuffer) {
        guard let bufferPCM = AVAudioPCMBuffer.create(from: buffer) else { return }

        self.player.scheduleBuffer(bufferPCM, completionHandler: nil)
        if !self.player.isPlaying { self.player.play() }
    }
}
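A minimal usage sketch, assuming the captured buffers are kept in a storedBuffers array (hypothetical) and playback happens in the second VC once capture has finished:

// Configure the player with the first buffer's format, then enqueue everything.
guard let first = storedBuffers.first else { return }

let bufferPlayer = BufferPlayer(withBuffer: first)
for buffer in storedBuffers {
    bufferPlayer.playEnqueue(buffer: buffer)
}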
