Tuesday, 30 September 2014

AVFoundation audio monitoring - Installing a Tap on a Bus in Swift

Having had disappointing results with AVAudioPlayerNode in Swift so far, I haven't had a chance to return and continue bashing my head against a wall to get an answer. Today, though, I had a quick chance to try something else I'd been wanting to do, and it should certainly be within the scope of what Apple communicated in the WWDC info on the new AVFoundation classes.

If I can't quite work out how to synthesise audio just yet, it will be interesting to see whether Swift is real-time enough to do some audio monitoring and processing. So I wanted to quickly knock up a few lines of code to see if I could read the audio input and get a rough, averaged 'energy'/volume as a quick test.

It seemed pretty simple and the code worked without too much trouble, so here it is....

WARNING - turn your speaker off before doing this!!

I've just linked the input through a delay to the output to generate a signal to monitor.

import AVFoundation

// Setup engine and node instances
var engine = AVAudioEngine()
var delay = AVAudioUnitDelay()
var input = engine.inputNode
var output = engine.outputNode
var format = input.inputFormatForBus(0)
var error:NSError?

// Attach FX nodes to engine
engine.attachNode(delay)

// Connect nodes
engine.connect(input, to: delay, format: format)
engine.connect(delay, to: output, format: format)

// Start engine
engine.startAndReturnError(&error)

// Number of samples to average per tap callback - the tap
// supplies its own buffer, so no AVAudioPCMBuffer is needed here
let length = 256

input.installTapOnBus(0, bufferSize: UInt32(length), format: input.outputFormatForBus(0), block:
    {
        (buffer, time) in
        
        // Average the absolute sample values - summing the raw
        // (signed) samples would mostly cancel out to zero
        var sum: Float = 0
        let frames = min(length, Int(buffer.frameLength))
        
        for (var i = 0; i < frames; i++)
        {
            sum += abs(buffer.floatChannelData.memory[i]) * 10_000
        }
        
        println(NSString(format: "%.0f", sum / Float(frames)))
})
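A slightly more principled measure than a scaled average of absolute values is RMS (root mean square), which is what level meters usually show. A minimal sketch, with a hypothetical `rmsLevel` helper (not part of AVFoundation) that you could call on the tap's samples:

```swift
import Foundation

// Hypothetical helper - root mean square of a slice of samples,
// returned in the same 0...1 range as the raw sample values
func rmsLevel(samples: [Float]) -> Float
{
    var sumOfSquares: Float = 0
    for sample in samples
    {
        sumOfSquares += sample * sample
    }
    return sqrt(sumOfSquares / Float(samples.count))
}
```

For a steady full-scale square wave like `[0.5, -0.5, 0.5, -0.5]` this gives 0.5, whereas the signed sum would give zero.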


while (true)
{
    NSThread.sleepForTimeInterval(1)
}

Remember, if you do this in a playground, the looping in the playground is not going to run in real time.
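If you do want the tap to keep firing in a playground, the XCPlayground framework of the time had a call to let asynchronous code keep running; a sketch, assuming Xcode 6's XCPlayground module:

```swift
import XCPlayground

// Let the playground keep executing so the tap callbacks keep
// arriving (Xcode 6-era API; replaced in later Xcode versions)
XCPSetExecutionShouldContinueIndefinitely()
```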

hmmm, maybe there is just something really simple that is not working with the generation and I need to go back to it....


Update.... As I just bashed this together this morning it got the needed result, but I didn't really like that I needed to link the audio through a delay to the output to get it working. With a bit more time looking at this, it's easy to do with just the input connected to a mixer, as follows:

var mixer = engine.mainMixerNode
var input = engine.inputNode
var format = input.inputFormatForBus(0)
var error:NSError?

// Connect nodes

engine.connect(input, to: mixer, format: format)

mixer.outputVolume = 0

And the rest of the code is the same. The important part here is that the mixer's output volume needs to be zero, otherwise it creates the same feedback loop as before.
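Putting the update together, the whole setup without the delay looks roughly like this (same era APIs as above; the tap body is unchanged from the first listing):

```swift
import AVFoundation

var engine = AVAudioEngine()
var input = engine.inputNode
var mixer = engine.mainMixerNode
var format = input.inputFormatForBus(0)
var error: NSError?

// The input only needs to feed the mixer for the tap to see data
engine.connect(input, to: mixer, format: format)

// Silence the mixer so the mic isn't fed straight back out
// of the speaker
mixer.outputVolume = 0

engine.startAndReturnError(&error)

// ... install the tap on the input node as before ...
```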
