Thursday, 11 September 2014

AVFoundation Audio with Swift using AVAudioPlayerNode

Having been blocked using AudioUnit callbacks in Swift in my previous exploration, I decided to take a different direction and had another look at the WWDC video, presentation and transcript covering the new AVFoundation changes for audio. Unfortunately it's all described in terms of Objective-C, but one of the interesting points is the description of AVAudioPlayerNode and its scheduleBuffer function.

func scheduleBuffer(_ buffer: AVAudioPCMBuffer!,
             atTime when: AVAudioTime!,
            options options: AVAudioPlayerNodeBufferOptions,
  completionHandler completionHandler: AVAudioNodeCompletionHandler!)


My first thought was: great, there's a callback to indicate that the buffer has played out, so it can be re-filled, and it's in Swift, which means we can work around the previous callback problems. So I knocked up another playground to test this.

The scheduleBuffer call allows options to be set to .Loops, .Interrupts, .InterruptsAtLoop or nil. Check out the WWDC material, which explains this with some diagrams and an ADSR-style example.
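
For illustration, a minimal sketch of how I read the options (my own example, not from the WWDC material): a buffer scheduled with .Loops repeats indefinitely, and a later buffer scheduled with .InterruptsAtLoop takes over at the next loop point rather than immediately.

// assuming bufferA and bufferB are already-filled AVAudioPCMBuffers
player.scheduleBuffer(bufferA, atTime: nil, options: .Loops, completionHandler: nil)

// ...sometime later: bufferB replaces bufferA, but only when bufferA
// next reaches its loop point (with .Interrupts it would cut in at once)
player.scheduleBuffer(bufferB, atTime: nil, options: .InterruptsAtLoop, completionHandler: nil)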

Taking baby steps, I thought I'd start by filling a buffer with a simple sine wave and then playing that out as a continuous loop. The buffer needs to be an AVAudioPCMBuffer. If you take a look at Bob Burns' post on Gene de Lisa's blog, he's trying something similar. My code looks like this:

import Cocoa
import AVFoundation

let twopi:Float = 2.0 * 3.14159

var freq:Float = 440.00
var sampleRate:Float = 44100.00

var engine = AVAudioEngine()
var player:AVAudioPlayerNode = AVAudioPlayerNode()
var mixer = engine.mainMixerNode

var buffer = AVAudioPCMBuffer(PCMFormat: player.outputFormatForBus(0),frameCapacity:100)

var length = 100
buffer.frameLength = AVAudioFrameCount(length)

// fill up the buffer with some samples
// (NB: float samples are nominally -1.0...1.0, so a gain of 10.0 will clip)
for (var i=0; i<length; i++)
{
    var val:Float = 10.0 * sin(Float(i)*twopi*freq/sampleRate)
    buffer.floatChannelData.memory[i] = val
}

engine.attachNode(player)
engine.connect(player,to:mixer,format:mixer.outputFormatForBus(0))

var error:NSErrorPointer = nil
engine.startAndReturnError(error)

player.scheduleBuffer(buffer,atTime:nil,options:.Loops,completionHandler:nil)
player.play()

// keep playground running
import XCPlayground
XCPSetExecutionShouldContinueIndefinitely(continueIndefinitely:true)

Fantastic, I got some audio out. It sounded like a tone, but with some glitchy artefacts, which I expect are because the buffer doesn't contain a whole number of cycles, so the waveform jumps at the loop point. After trying this and googling a bit I found Thomas Royal had also tried something similar. At least I'm getting some sound out now.
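
For reference, a back-of-envelope fix worth trying (my own numbers, not from any of those posts): make the buffer hold a whole number of cycles so the loop point is seamless. With the 100-frame buffer above at 44.1 kHz, one cycle per buffer means 44100 / 100 = 441 Hz, near enough to A440 for a test.

// fit exactly one cycle into the 100-frame buffer so the loop is seamless
let fittedFreq: Float = sampleRate / Float(length)   // 441.0 Hz

for (var i = 0; i < length; i++)
{
    // keep the gain within ±1.0, as float samples clip beyond that
    buffer.floatChannelData.memory[i] = 0.5 * sin(Float(i) * twopi * fittedFreq / sampleRate)
}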

So, taking this further, I thought that rather than making the sine fit a cycle I could simply set the completionHandler callback to get an indication of when to play the next buffer chunk, and then I'd be away generating whatever I liked. [Just as a note, my assumption was that options could be set to nil or .InterruptsAtLoop and effectively we'd be creating audio double-buffering, so that samples could be created during buffer playout and there would be no wait between getting the completion handler and setting the next buffer.]

The empty completion handler looks like this:

func handler(buffer:AVAudioPCMBuffer!,time:AVAudioTime!) -> Void
{
}

I then tried setting as follows:

player.scheduleBuffer(buffer,atTime:nil,options:.InterruptsAtLoop,completionHandler:handler)

And got this 'helpful' error:

[screenshot of the compiler error]
Hmmm. I tried taking this out of the playground, and I tried a number of different ideas. None worked. Damn! I googled a lot on this and on completion handlers generally and didn't get any results. Shame.

That avenue blocked (hopefully only for now), undeterred I thought I'd give this another go. Changing approach again: if I'm not getting a callback, maybe I can just create a thread and stuff buffers into the player. I could get cleverer later on and use the atTime parameter (assuming that works), scheduling buffers with some consideration for timing. Indeed, doing that might be a nice way to ensure the timing alignment of various players stays in sync. But I'm getting ahead of myself now.
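
A sketch of what that timed scheduling might look like (untested guesswork on my part; the hop size and start time are assumptions):

// schedule a few buffers at explicit sample times via AVAudioTime
let hop = AVAudioFramePosition(length)   // one buffer's worth of frames
for n in 0..<4
{
    let when = AVAudioTime(sampleTime: AVAudioFramePosition(n) * hop,
                           atRate: Double(sampleRate))
    player.scheduleBuffer(buffer, atTime: when, options: nil, completionHandler: nil)
}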

The revised fragment looks like this:

let queue = NSOperationQueue()

queue.addOperationWithBlock({

    var j:Int = 0

    while (true)
    {
        // refill the (still playing) buffer with the next chunk of sine wave
        for (var i=0; i<length; i++)
        {
            var val:Float = 5.0 * sin(Float(j)*twopi*freq/sampleRate)
            buffer.floatChannelData.memory[i] = val
            j++
        }

        player.scheduleBuffer(buffer,atTime:nil,options:.InterruptsAtLoop,completionHandler:nil)

        NSThread.sleepForTimeInterval(0.1)
    }
})


This proved problematic in the playground, because the playground tried to render the loop filling on every cycle, which took longer than the playback itself. So I first tried to move this part of the code into a Framework to import (unsuccessfully; something I'll come back to later, as it's going to be key to using Playgrounds effectively) and then just into a normal console application:

//
//  main.swift
//  Audio
//
//  Created by hondrou on 11/09/2014.
//  Copyright (c) 2014 hondrou. All rights reserved.
//

import Foundation
import AVFoundation

let twopi:Float = 2.0 * 3.14159

var freq:Float = 440.00
var sampleRate:Float = 44100.00

var engine = AVAudioEngine()
var player:AVAudioPlayerNode = AVAudioPlayerNode()
var mixer = engine.mainMixerNode


var length = 4000

var buffer = AVAudioPCMBuffer(PCMFormat: player.outputFormatForBus(0),frameCapacity:AVAudioFrameCount(length))

buffer.frameLength = AVAudioFrameCount(length)


engine.attachNode(player)
engine.connect(player,to:mixer,format:mixer.outputFormatForBus(0))

var error:NSErrorPointer = nil
engine.startAndReturnError(error)


let queue = NSOperationQueue()

queue.addOperationWithBlock({

    var j:Int = 0

    while (true)
    {
        // refill the (still playing) buffer with the next chunk of sine wave
        for (var i=0; i<length; i++)
        {
            var val:Float = 5.0 * sin(Float(j)*twopi*freq/sampleRate)
            buffer.floatChannelData.memory[i] = val
            j++
        }

        player.scheduleBuffer(buffer,atTime:nil,options:.InterruptsAtLoop,completionHandler:nil)

        NSThread.sleepForTimeInterval(0.1)
    }
})

player.play()


while (true)
{
    NSThread.sleepForTimeInterval(1)

    //freq += 10   // later: nudge the frequency to prove it isn't a fixed tone
}

Rather than the playground keep-alive at the end, I'm keeping the main thread alive with a simple loop (which I'll use later to adjust the frequency and check that this is not just playing a single stuck tone).

This played back OK and I got audio, but those funny glitches were still there. So I played around with the sleep interval and the size of the buffer, with varying results, none of them nice, and then decided to go to bed! Stumped and not too happy about it.
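
One suspicion to test when I'm back (a guess, untested): the loop above keeps re-filling the same AVAudioPCMBuffer that the player may still be reading from. Alternating between two buffers, so one is being written while the other plays, might avoid that. A sketch along those lines:

// hypothetical double-buffering: write one buffer while the other plays
let format = player.outputFormatForBus(0)
let bufferA = AVAudioPCMBuffer(PCMFormat: format, frameCapacity: AVAudioFrameCount(length))
let bufferB = AVAudioPCMBuffer(PCMFormat: format, frameCapacity: AVAudioFrameCount(length))
let buffers = [bufferA, bufferB]

var j = 0
var which = 0

while (true)
{
    let next = buffers[which]
    next.frameLength = AVAudioFrameCount(length)

    // fill the idle buffer with the next chunk of the sine wave
    for (var i = 0; i < length; i++)
    {
        next.floatChannelData.memory[i] = 0.5 * sin(Float(j) * twopi * freq / sampleRate)
        j++
    }

    player.scheduleBuffer(next, atTime: nil, options: nil, completionHandler: nil)
    which = 1 - which
    NSThread.sleepForTimeInterval(0.1)
}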

Hmmmm, not all is lost yet, as I have some other ideas, but I'm away for the next few days on a biz trip so will have to try this later on. If anyone has any good comments/suggestions before then I'd be most grateful. I'm hoping that Swift is man enough for the job; C# can certainly cope with this kind of relatively simple synthesis, and that runs in the CLR.



Update

Doh! That'll teach me for late-night coding. I finally found the problem with the completion handler and have just posted another blog entry.
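
For anyone hitting the same wall: AVAudioNodeCompletionHandler is declared as a plain () -> Void closure, so the handler shouldn't take the buffer and time parameters I'd guessed at above. Something along these lines compiles:

// the completion handler takes no arguments at all
func handler() -> Void
{
    println("buffer played out")
}

player.scheduleBuffer(buffer, atTime: nil, options: .InterruptsAtLoop, completionHandler: handler)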

1 comment:

  1. Ahhh, the completion handler. I've been reading this evening from the hotel and it's all about closures. A good thing to get skilled up on, so worth a post or two when I get back. It means we'll be able to delve into async responses like HTTP calls as well.