Downsampling audio with AudioKit

How to downsample audio with AudioKit

I have some existing code which uses AVAudioEngine to take input from the microphone, downsample it, and write it to an AVAudioFile.

internal func setupNodeChain() {
    guard let audioEngine = audioEngine else { return } // Fatal error?
    
    let engineInputNode = audioEngine.inputNode
    
    let bus = 0
    let engineInputNodeFormat = engineInputNode.outputFormat(forBus: bus)
    
    // This attempts to downsample the audio from the microphone
    let downSampleMixerNode = AVAudioMixerNode()
    let mixerOutputFormat = AVAudioFormat(standardFormatWithSampleRate: 8000, channels: 1)
    
    // Input -> (volume) -> down sample -> (volume) -> Output
    
    let inputVolumeMixerNode = AVAudioMixerNode()
    inputVolumeMixerNode.volume = Float(10 * microphoneVolume)
    
    audioEngine.attach(inputVolumeMixerNode)
    audioEngine.attach(downSampleMixerNode)
    
    self.downSampleMixerNode = downSampleMixerNode
    self.inputVolumeMixerNode = inputVolumeMixerNode
    
    let silenceNode = AVAudioMixerNode()
    silenceNode.outputVolume = 0
    
    self.silenceNode = silenceNode
    
    audioEngine.connect(engineInputNode, to: inputVolumeMixerNode, format: engineInputNodeFormat)
    audioEngine.connect(inputVolumeMixerNode, to: downSampleMixerNode, format: engineInputNodeFormat)
    
    // Try and stop the microphone audio from going through to the speaker
    audioEngine.attach(silenceNode)
    audioEngine.connect(downSampleMixerNode, to: silenceNode, format: mixerOutputFormat)
    audioEngine.connect(silenceNode, to: audioEngine.outputNode, format: mixerOutputFormat)

    downSampleMixerNode.installTap(onBus: bus, bufferSize: 1024 * 16, format: mixerOutputFormat) { (buffer: AVAudioPCMBuffer, time: AVAudioTime) in
        guard let tap = self.audioTap else { return }
        // Write the buffer to the AVAudioFile
        tap.drip(buffer: buffer, time: time)
    }
}

Most of the time this works, but I've been looking at replacing it with AudioKit. The problem I've run into is that I can't work out how to create a mechanism which downsamples the audio between the microphone and the recorder.

AKSettings.enableEchoCancellation = true
AKSettings.allowAirPlay = true
AKSettings.useBluetooth = true

do {
    try AKSettings.setSession(category: .playAndRecord, with: [.allowBluetoothA2DP])
    
    AKSettings.defaultToSpeaker = true
    
    let audioFile = try self.makeAudioFile(named: "Recording")
    
    let mixerOutputFormat = AVAudioFormat(standardFormatWithSampleRate: 8000, channels: 1)!

    let microphone = AKMicrophone()
    let microphoneBooster = AKBooster(microphone)
    microphoneBooster.gain = 0
    
    let recorder = try AKNodeRecorder(node: microphoneBooster)
    //recorder.recordFormat = mixerOutputFormat
    
    let silence = AKMixer(microphoneBooster)
    silence.volume = 0
    
    self.microphone = microphone
    self.microphoneBooster = microphoneBooster
    self.recorder = recorder
    self.silence = silence
    
    AKManager.output = silence
    
    log(debug: "Start")
    try AKManager.start()
    
    log(debug: "Record")
    try recorder.record()

    DispatchQueue.main.async {
        self.state = .recording
        self.plot?.node = microphone
        self.callButton.setImage(#imageLiteral(resourceName: "EndCall"), for: [])
    }
} catch let error {
    log(error: "Failed to establish play and record session: \(error)")
}

So, the question is: how would I create a "downsample" node/workflow which connects the microphone to a node using the "default" format, and then connects that node to the next node in the chain using the desired AVAudioFormat?

Microphone -> downsampler (default format)

Downsampler -> next node (target format) -> recorder

Solution

Essentially, I had to create my own "tap" to pull the data out.

First, I have a "converter". This basically takes the audio from another mixer (via the "tap"), converts it to the target format, and writes it to the audio file.

class TapConverter: NodeTapperDelegate {
    
    let audioConfig: AudioConfig
    
    internal var inputFormat: AVAudioFormat?
    internal var converter: AVAudioConverter?
    
    var onError: ((Error) -> Void)?
    
    init(audioConfig: AudioConfig) {
        self.audioConfig = audioConfig
    }
    
    func open(format: AVAudioFormat) throws {
        inputFormat = format
        converter = AVAudioConverter(from: format, to: audioConfig.audioFormat)
    }
    
    func drip(buffer: AVAudioPCMBuffer, time: AVAudioTime) {
        guard let converter = converter else {
            return
        }
        guard let inputFormat = inputFormat else {
            return
        }
        
        // Scale the output capacity by the ratio of the sample rates
        let inputSampleRate = inputFormat.sampleRate
        let sampleRateRatio = inputSampleRate / audioConfig.audioFormat.sampleRate
        let capacity = Int(Double(buffer.frameCapacity) / sampleRateRatio)
        
        let bufferPCM16 = AVAudioPCMBuffer(pcmFormat: audioConfig.audioFormat, frameCapacity: AVAudioFrameCount(capacity))!
        var error: NSError? = nil

        converter.convert(to: bufferPCM16, error: &error) { inNumPackets, outStatus in
            outStatus.pointee = AVAudioConverterInputStatus.haveData
            return buffer
        }
        if let error = error {
            log(error: "Failed to convert buffer: \(error)")
            onError?(error)
        } else {
            let audioFile = audioConfig.audioFile
            do {
                log(debug: "Write buffer")
                try audioFile.write(from: bufferPCM16)
            } catch let error {
                log(error: "Failed to write buffer to audio file: \(error)")
                onError?(error)
            }
        }
    }
    
    func close() {
        converter = nil
        inputFormat = nil
        // Should the audio file be closed here?
    }
}

AudioConfig is just a basic placeholder which holds the audioFile being written to (which must already have been created) and the target AVAudioFormat.

struct AudioConfig {
    let url: URL
    let audioFile: AVAudioFile
    let audioFormat: AVAudioFormat
}

Creating one might look something like...

let settings: [String: Any] = [
    AVFormatIDKey: NSNumber(value: kAudioFormatMPEG4AAC),
    AVSampleRateKey: NSNumber(value: 8000),
    AVNumberOfChannelsKey: NSNumber(value: 1),
    AVEncoderBitRatePerChannelKey: NSNumber(value: 16),
    AVEncoderAudioQualityKey: NSNumber(value: AVAudioQuality.min.rawValue)
]
let audioFile = try AVAudioFile(forWriting: sourceURL, settings: settings)

let audioConfig = AudioConfig(url: sourceURL, audioFile: audioFile, audioFormat: audioFormat)
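The audioFormat passed to AudioConfig above isn't shown in the original snippet. One plausible way to build it, assuming the converter target is 8 kHz, mono, 16-bit integer PCM (an assumption on my part, matching the 8000 Hz sample rate used elsewhere), would be:

```swift
import AVFoundation

// Hypothetical construction of the converter's target format.
// 8 kHz / mono / Int16 / interleaved is an assumption; the original
// post never shows how audioFormat was actually created.
let audioFormat = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                sampleRate: 8000,
                                channels: 1,
                                interleaved: true)!
```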

From there, I needed a way to tap a node (get its data) and pass it to my converter. For that I used something like...

import Foundation
import AudioKit

protocol NodeTapperDelegate: class {
    func open(format: AVAudioFormat) throws
    func drip(buffer: AVAudioPCMBuffer, time: AVAudioTime)
    func close()
}

class NodeTapper: NSObject {
    // MARK: - Properties
    
    // The node we record from
    private(set) var node: AKNode?
    
    /// True if we are recording.
    @objc private(set) dynamic var isTapping = false
    
    /// The bus to install the recording tap on. Default is 0.
    private var bus: Int = 0
    
    /// Used for fixing recordings being truncated
    private var recordBufferDuration: Double = 16_384 / AKSettings.sampleRate
    
    weak var delegate: NodeTapperDelegate?
    
    // MARK: - Initialization
    
    /// Initialize the node recorder
    ///
    /// Recording buffer size is defaulted to be AKSettings.bufferLength
    /// You can set a different value by setting an AKSettings.recordingBufferLength
    ///
    /// - Parameters:
    ///   - node: Node to record from
    ///   - bus: Integer index of the bus to use
    ///
    @objc init(node: AKNode? = AKManager.output, bus: Int = 0) throws {
        self.bus = bus
        self.node = node
    }
    
    // MARK: - Methods
    
    /// Start recording
    @objc func start() throws {
        if isTapping == true {
            return
        }
        
        guard let node = node else {
            return
        }
        
        guard let delegate = delegate else {
            return
        }
        
        let bufferLength: AVAudioFrameCount = AKSettings.recordingBufferLength.samplesCount
        isTapping = true
        
        // Note: if you install a tap on a bus that already has a tap it will crash your application.
        let nodeFormat = node.avAudioNode.outputFormat(forBus: 0)
        try delegate.open(format: nodeFormat)

        // Note: format should be nil as per the documentation for installTap:
        // "If non-nil, attempts to apply this as the format of the specified output bus. This should
        // only be done when attaching to an output bus which is not connected to another node"
        // In most cases AudioKit nodes will be attached to something else.
        node.avAudioUnitOrNode.installTap(onBus: bus,
                                          bufferSize: bufferLength,
                                          format: nil, // Might need to be the input node's format :/
                                          block: process(buffer:time:))
    }
    
    private func process(buffer: AVAudioPCMBuffer, time: AVAudioTime) {
        guard let sink = delegate else { return }
        sink.drip(buffer: buffer,time: time)
    }
    
    /// Stop recording
    @objc func stop() {
        if isTapping == false {
            return
        }
        
        isTapping = false
        
        if AKSettings.fixTruncatedRecordings {
            //  delay before stopping so the recording is not truncated.
            let delay = UInt32(recordBufferDuration * 1_000_000)
            usleep(delay)
        }
        node?.avAudioUnitOrNode.removeTap(onBus: bus)
        delegate?.close()
    }
}

And then, somehow, tying it all together:

let microphone = AKMicrophone()
microphone?.volume = 10 * volume

let monoToStereo = AKStereoFieldLimiter(microphone, amount: 1)
let microphoneMixer = AKMixer(monoToStereo)

// This is where we're converting the audio from
// the microphone and dripping it into the audio file
let converter = TapConverter(audioConfig: audioConfig)
// handleError is basically just a func in this case
converter.onError = handleError
// Here we tap the mixer/node and output to the converter
let tapper = try NodeTapper(node: microphoneMixer)
tapper.delegate = converter

// Silence the output from the microphone, so it's not
// fed back into the microphone
let silence = AKMixer(microphoneMixer)
silence.volume = 0

self.microphoneMixer = microphoneMixer
self.converter = converter
self.tapper = tapper
self.microphone = microphone
self.silence = silence

AKManager.output = silence

log(debug: "Start")
try AKManager.start()

log(debug: "Record")
try tapper.start()

Most of this was pieced together from fragments of different ideas in various posts around the web. Is it the best approach? I don't know, but it does what I need it to do.
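For completeness, tearing the chain down should presumably mirror the start-up sequence; the method names below match the classes above, but the ordering is my assumption rather than something shown in the original post:

```swift
// Hypothetical teardown, mirroring the start sequence above.
// Stop the tap first so no more buffers are dripped into the
// converter (this also calls converter.close() via the delegate),
// then stop the underlying engine.
tapper.stop()
try AKManager.stop()
```

Stopping the tap before the engine avoids the converter receiving buffers after its AVAudioConverter has been discarded.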
