Swift audio synthesis, processing, & analysis platform for iOS, macOS and tvOS




AudioKit is an audio synthesis, processing, and analysis platform for iOS, macOS (including Catalyst), and tvOS.

Installation
To add AudioKit to your Xcode project, select File -> Swift Packages -> Add Package Dependency. Enter https://github.com/AudioKit/AudioKit for the URL. You can define which version range you want, which branch to use, or even which exact commit you would like to use.
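If you manage dependencies in a Package.swift manifest instead, the equivalent declaration might look like the following sketch (the package name, platforms, and version range are illustrative, not prescribed by AudioKit):

```swift
// swift-tools-version:5.3
import PackageDescription

let package = Package(
    name: "MyAudioApp",
    platforms: [.iOS(.v13), .macOS(.v10_15), .tvOS(.v13)],
    dependencies: [
        // Pin to a version range, a branch, or an exact commit as needed.
        .package(url: "https://github.com/AudioKit/AudioKit", from: "5.0.0")
    ],
    targets: [
        .target(name: "MyAudioApp", dependencies: ["AudioKit"])
    ]
)
```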

Documentation
All AudioKit documentation is generated from this repository. The documentation that appears in the docs folder generates the AudioKit.io web site. API documentation appears on the GitHub wiki.

Examples
The AudioKit Cookbook contains many recipes for simple uses of AudioKit components.
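As a taste of what such a recipe looks like, here is a minimal playback sketch against the AudioKit 5 API (hedged: `Oscillator` shipped with AudioKit 5 but moved to the SoundpipeAudioKit package in later 5.x releases, so the import may differ for your version):

```swift
import AudioKit

// Route a default oscillator to the output and start the engine.
let engine = AudioEngine()
let oscillator = Oscillator()      // defaults to a 440 Hz tone
engine.output = oscillator

do {
    try engine.start()             // starts the underlying AVAudioEngine
    oscillator.start()             // begin producing samples
} catch {
    print("AudioKit engine failed to start: \(error)")
}
```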

Getting help

  1. Post your problem to StackOverflow with the #AudioKit hashtag.

  2. Once you are sure the problem is not in your implementation but in AudioKit itself, you can open a GitHub issue.

  3. If you, your team, or your company is using AudioKit, please consider sponsoring Aure on GitHub Sponsors.

  • Xcode 12 + AudioKit 4.11 Problems

    Hi. I just wanted to ask if there is planned support for v4 of AudioKit in Xcode 12, or if there is documentation for the v5 beta, which works in Xcode 12.

    I cannot compile my app in Xcode 12 and hence cannot prepare it for iOS 14, which is now already released. I have gone into detail in this Stack Overflow post: https://stackoverflow.com/questions/63860545/implementing-microphone-analysis-with-audiokit-v5

    Thank you.

    opened by vojtabohm 82
  • Add OSC instrument communication

    We have full MIDI support, but no OSC support yet. There seem to be a few options listed on Charles Martin's blog:


    but so far it looks like his Metatone might be best


    and it comes with this example


    If you are an OSC wiz and want to knock this out, that would be great. Otherwise, you can just comment that you would also like to have this feature, and that will motivate us to prioritize it higher.
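    For anyone picking this up: an OSC message is just a null-terminated address pattern plus type-tagged, big-endian arguments, each padded to a 4-byte boundary. A minimal hand-rolled encoder (a hypothetical helper for illustration, not an AudioKit API) looks roughly like:

```swift
import Foundation

/// Pads data to a 4-byte boundary with zero bytes, as OSC requires.
func padded(_ data: Data) -> Data {
    var d = data
    while d.count % 4 != 0 { d.append(0) }
    return d
}

/// Encodes an OSC message with a single float argument, e.g. "/1/fader1" 0.5.
func oscMessage(address: String, value: Float) -> Data {
    var message = padded(Data((address + "\0").utf8))   // address pattern
    message.append(padded(Data((",f\0").utf8)))         // type tag string
    var bigEndian = value.bitPattern.bigEndian          // OSC floats are big-endian
    message.append(Data(bytes: &bigEndian, count: 4))
    return message
}
```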

    opened by aure 43
  • Ableton Link Integration

    AudioKit and Ableton Link

    Just writing down a quick list of the methods that Ableton Link's 'LinkHut' example app calls to execute a tempo update. Since Ableton Link is meant to keep different music apps in tempo/sync, updating the AKSequencer/tempo backend would probably be enough to integrate Link with AudioKit.



    • Download Ableton Link / LinkHut from the [LinkKit releases tab](https://github.com/Ableton/LinkKit/releases)
    • LinkHut is in the examples folder inside the release zip

    Here are the steps taken in LinkHut from Ableton. Below is a brief breakdown of how to set up Ableton Link with an audio engine and update the tempo.

    All steps described below are based on the Ableton Link LinkHut example app.

    The process of updating the tempo (when another user sets a new tempo):

    • First audio engine and Ableton Link objects are setup in ViewController -> viewDidLoad

      • _audioEngine = [[AudioEngine alloc] initWithTempo:_bpm];
      • This is where AVAudioSession shared instance and callbacks get setup
      • ABLLinkSetSessionTempoCallback(_audioEngine.linkRef, onSessionTempoChanged, (__bridge void *)self);
        • This is where the Ableton Link tempo change gets passed along to a user callback to update the UI
      • _linkSettings = [ABLLinkSettingsViewController instance:_audioEngine.linkRef];
      • [_audioEngine setQuantum:_quanta];
    • Ableton Link will receive the tempo change from the other user (over the network for ex)

    • Ableton Link will send an event to the AVAudioSession audio engine route

    • User code has a callback that gets called when route event happens

    • See 'AudioEngine.m' in the Ableton LinkHut app example:

      • callback in AudioEngine.m -> audioCallback
      • This is set up in the 'setupAudioEngine' method that gets called during init:
      • This is where _linkData object is setup to connect to shared instance of AVAudioSession properties
      • 'AURenderCallbackStruct ioRemoteInput;'
        • gets called to setup i/o callback using 'audioCallback' function
        • audioCallback is defined to check for a tempo update and save the timestamp and new tempo
        • See 'static OSStatus audioCallback' method in AudioEngine.m
        • This is where the tempo actually gets updated when we receive a tempo event from Ableton Link
        • If the new tempo is valid, it passes the new tempo event along to another callback that informs the UI to display the new tempo to the user
        • This happens in 'ABLLinkSetSessionTempoCallback' defined in the ViewController.swift
        • This callback is setup in ViewController.swift -> viewDidLoad -> ABLLinkSetSessionTempoCallback
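    The wiring above can be condensed into a sketch (hedged: `AudioEngine` here is LinkHut's own Objective-C wrapper, the `bpm`/`quanta`/`sequencer` names are assumptions, and the ABLLinkSetSessionTempoCallback call is simplified from the real C signature):

```swift
// 1. In viewDidLoad: create the audio engine with an initial tempo.
let audioEngine = AudioEngine(tempo: bpm)

// 2. Register the session-tempo callback so that when another peer
//    changes the tempo, both the UI and AudioKit's sequencer follow.
ABLLinkSetSessionTempoCallback(audioEngine.linkRef, { newTempo in
    // 3. Push the Link tempo into the AKSequencer backend and the UI.
    sequencer.setTempo(newTempo)
}, nil)

// 4. Set the quantum (bar length) used for beat alignment, as LinkHut does.
audioEngine.setQuantum(quanta)
```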
    opened by JoshuaBThompson 36
  • Notes not stopping properly in MIDI sequencer

    Something strange has started happening with the Sequencer. I'm not sure exactly when, but recently.

    The "stop playing notes" functionality has essentially stopped working. I have confirmed that my code is calling the stop method on each track, but the notes continue sustaining, which is a real problem in my app.

    Here's what the code looks like that I execute in direct response to the user pressing the "stop" button:

        public func stop() {
            if let sequencer = sequencer {
                internalPlaybackState = .stopped
                for track in sequencer.tracks {
                    track.stop()  // notes keep sustaining despite this call
                }
            }
        }
    In that code, "sequencer" is just an AudioKit Sequencer. The "internalPlaybackState" is something related to my app that should have no bearing here.

    I'd be happy to help you reproduce it. All the source code that I'm using for playback is in the SongSprout Swift Package located here: https://github.com/dunesailer/SongSprout

    The main playback class is Orchestrion.
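    Until the cause is found, one blunt workaround sketch (hedged: the `sampler` name and the note-off loop are hypothetical and not from SongSprout; AKMIDISampler exposes a throwing stop(noteNumber:channel:) in AudioKit 4) is to flush every possible note after stopping the tracks:

```swift
// After calling stop() on each track, send an explicit note-off for
// all 128 MIDI note numbers so no notes are left sustaining.
for note: MIDINoteNumber in 0...127 {
    try? sampler.stop(noteNumber: note, channel: 0)
}
```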

    opened by btfranklin 33
  • AKMicrophone init crash in objective C

    I am trying to use AKMicrophone in an Objective-C sample app like this, and it crashes on the init step for all pod versions above 4.5.1. iOS version is 12.1.2, iPhone 7 Plus, Xcode version 10.1. It works fine in the Swift sample app MicrophoneAnalysis, so I suspect maybe a missing @objc somewhere? @property (strong, nonatomic) AKMicrophone *mic; Then in viewDidLoad: _mic = [[AKMicrophone alloc] init]; // Crashes here.
    I am also getting the following error in the console: 2019-01-15 12:56:30.809227-0500 VoiceWavePoc[2856:1473174] [avas] AVAudioSessionPortImpl.mm:56:ValidateRequiredFields: Unknown selected data source for Port Receiver (type: Receiver)

    opened by KozDan 32
  • Setting Input/Output for external device

    When connecting an external I/O device, it would be better to have a setting so the user can select which channels they prefer. For example, with an Apogee Quartet, which has 4 analog inputs and 6 analog outputs, you could select specific I/O channels for recording and monitoring: inputs 1 & 2 for stereo recording, or just input 1 for mono recording.

    In the current version, tested with a Quartet, using the default input 1 still works, but if your audio comes in via other channels, nothing is recorded.

    A settings view with a real-time chart UI for measuring input/output volume would also be great.
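    AudioKit doesn't expose per-channel selection directly, but on iOS you can at least inspect and choose the hardware input via the standard AVAudioSession API (a sketch; the Quartet's actual port names and channel counts will differ per device):

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()

// List the available input ports (e.g. a USB interface like the Quartet)
// and pick one as the preferred input.
if let inputs = session.availableInputs {
    for port in inputs {
        print(port.portName, port.channels?.count ?? 0, "channels")
    }
    if let usb = inputs.first(where: { $0.portType == .usbAudio }) {
        try? session.setPreferredInput(usb)
    }
}
```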


    enhancement tips and tricks 
    opened by WebberLai 31
  • AKMicrophone crash init in 4.5.5 version

    AudioKit version 4.5.5: crash in this code:

        AKSettings.audioInputEnabled = true
        mic = AKMicrophone()
        tracker = AKFrequencyTracker(mic)
        silence = AKBooster(tracker, gain: 0)

    _AVAE_Check: required condition is false: [AVAudioIONodeImpl.mm:911:SetOutputFormat: (format.sampleRate == hwFormat.sampleRate)] *** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: format.sampleRate == hwFormat.sampleRate'

    Test devices: iPhone 8 and simulators. AudioKit.engine.inputNode.inputFormat(forBus: 0).sampleRate periodically returns 44100.0 or 48000.0 in the application.

    On the iPhone SE, AKMicrophone() works stably with no crash.
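    The 44100 vs. 48000 readings suggest AKSettings and the hardware disagree about the sample rate. A commonly suggested workaround sketch (AudioKit 4-era API) is to align AKSettings with the session's actual rate before building the chain:

```swift
import AVFoundation

// Match AudioKit's expected sample rate to the hardware's current rate
// so that format.sampleRate == hwFormat.sampleRate holds at init time.
AKSettings.audioInputEnabled = true
AKSettings.sampleRate = AVAudioSession.sharedInstance().sampleRate

let mic = AKMicrophone()
let tracker = AKFrequencyTracker(mic)
let silence = AKBooster(tracker, gain: 0)
AudioKit.output = silence
try AudioKit.start()
```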

    opened by extnous 30
  • AKOfflineRenderNode doesn't render any sound.

    I used AKOfflineRenderNode to process recorded voice. The idea is to record the voice and save it to a file, then apply some effects to that file and save the combined voice + effects to a file.

    I get a successful result on my iPhone 6s running iOS 11.0.3, but on other devices running iOS 11+ (for example, the last-generation iPad on 11.0.3) I always get a "silent" file of the same size, about 60 KB. I attached one for reference (rec.zip). The issue is also 100% reproducible on the iOS 11 simulator.

    Here is my setup (I removed the effects setup since offline render doesn't work even on a clean recording):

        fileprivate func setupAudioKit() {
            AKSettings.enableLogging = true
            AKSettings.bufferLength = .medium
            AKSettings.numberOfChannels = 2
            AKSettings.sampleRate = 44100
            AKSettings.defaultToSpeaker = true
            do {
                try AKSettings.setSession(category: .playAndRecord, with: .allowBluetoothA2DP)
            } catch {
                AKLog("Could not set session category")
            }
            mic = AKMicrophone()
            micMixer = AKMixer(mic)
            recorder = try? AKNodeRecorder(node: micMixer)
            if let file = recorder.audioFile {
                player = try? AKAudioPlayer(file: file)
                player.looping = true
            }
            playerMixer = AKMixer(player)
            // Effects setup
            offlineRenderer = AKOfflineRenderNode(playerMixer)
            AudioKit.output = offlineRenderer
        }

    Here is the export method:

        fileprivate func render() {
            offlineRenderer.internalRenderEnabled = false
            player.schedule(from: 0, to: player.duration, avTime: nil)
            let renderURL = URL(fileURLWithPath: FileHelper.documentsDirectory() + "rec.m4a")
            let sampleTimeZero = AVAudioTime(sampleTime: 0, atRate: AudioKit.format.sampleRate)
            player.play(at: sampleTimeZero)
            do {
                try offlineRenderer.renderToURL(renderURL, seconds: player.duration)
            } catch {
                AKLog("Offline render failed: \(error)")
            }
            offlineRenderer.internalRenderEnabled = true
        }
    opened by ygoncharov-ideas 30
  • Swift 5 support

    I'm sure you're all working on this, just making the issue to add some visibility for others who are wondering as well.

    Xcode 10.2 with Swift 5 support was just released today. Hoping Swift 5 support is coming soon to AudioKit!

    opened by colinhumber 29
  • Audiobus 'hwFormat' crash

    I'm integrating Audiobus into my AudioKit-based app and have followed the instructions here, but when I call Audiobus.start(), the app continues running for about a second, then I get some error output, followed by a crash:

    2017-03-22 23:25:45.620918 Vulse[20369:7115604] [central] 54: ERROR: [0x16e167000] >avae> AVAudioIONodeImpl.mm:365: _GetHWFormat: required condition is false: hwFormat framerate: 23
    2017-03-22 23:26:03.917219 Vulse[20369:7115604] *** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'required condition is false: hwFormat'

    So I got the FilterEffects example up and running to see if the issue was just in my app. (First I had to change the input variable in viewDidLoad to a property, because it was deiniting too early and causing a crash.) But after fixing that, I unfortunately got the same error as above.

    Interestingly, for the second or two before the app crashes, everything seems to work properly: the app takes audio input from audiobus and I can hear it output with the reverb applied. Also (at least in my app) if I remove the Audiobus.start(), but still try to instantiate an AKStereoInput, I get a similar message: SetOutputFormat: required condition is false: format.sampleRate == hwFormat.sampleRate

    Do you get the same crash when trying this yourself, with the latest AudioKit release? I'm using Audiobus 2.3.3, on an iPhone 6s running iOS 10.2.1.

    opened by jconst 29
  • AKPlayer.load() on a file crashes after upgrading to 4.10 (from 4.9.4)

    AudioKit.AKFaderAudioUnit input format must match output format

    2020-06-02 12:15:08.494929+0530 [avae] AVAEInternal.h:109 [AVAudioEngineGraph.mm:4170:UpdateGraphAfterReconfig: (AUGraphParser::InitializeActiveNodesInOutputChain(ThisGraph, kOutputChainFullTraversal, *conn.srcNode, isChainActive)): error -10868
    2020-06-02 12:15:08.497253+0530 *** Terminating app due to uncaught exception 'com.apple.coreaudio.avfaudio', reason: 'error -10868'

    opened by ghost 28
  • Introduce a callback queue on a tap

    Fixes #2814, but it feels clumsy, as we have two synchronisation mechanisms: locks and a queue.

    We should probably rewrite the tap to work on one dispatch queue and remove all the locks.
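    A single-queue tap might be sketched like this (`node` and `process` are placeholders; `installTap(onBus:bufferSize:format:block:)` and `removeTap(onBus:)` are the standard AVAudioNode APIs):

```swift
import AVFoundation

// One serial queue carries every tap callback, so consumers need no locks
// to coordinate buffer handling with start/stop.
let tapQueue = DispatchQueue(label: "io.audiokit.tap")

node.installTap(onBus: 0, bufferSize: 4096, format: nil) { buffer, time in
    tapQueue.async {
        process(buffer, time)    // placeholder consumer
    }
}

// Teardown goes through the same queue, ordering it after pending buffers.
tapQueue.async {
    node.removeTap(onBus: 0)
}
```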

    opened by jcavar 2
  • TSAN data race

    macOS Version(s) Used to Build

    macOS 13 Ventura

    Xcode Version(s)

    Xcode 14



    To reproduce, run tests with TSAN enabled


    Crash Logs, Screenshots or Other Attachments (if applicable)

    No response

    opened by wtholliday 7
  • Add a test for start and pause with empty mixer

    • This currently fails so it is skipped
    • Add a reference to bug report

    I don't have a clear idea yet of how to fix this. I was thinking about possibly maintaining separate state within AudioEngine, but we don't have a reference to it from Mixer. Adding an associated object to AVAudioEngine would probably work, but I am not sure it is worth it. It is not that difficult to work around at the call site.
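    The skipped test might read roughly as follows (XCTSkip is standard XCTest; the class name and the AudioEngine/Mixer setup in the comment follow AudioKit 5 and are illustrative):

```swift
import XCTest

final class EngineTests: XCTestCase {
    func testStartPauseWithEmptyMixer() throws {
        // Currently fails, so skip; see the referenced bug report.
        throw XCTSkip("start/pause with an empty Mixer misbehaves")

        // Intended body, for when the bug is fixed:
        // let engine = AudioEngine()
        // engine.output = Mixer()
        // try engine.start()
        // engine.pause()
    }
}
```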

    opened by jcavar 0
  • Approach to disable input bus


    AUAudioUnitBus has an isEnabled property. If it is set to false, pullBlock will not be called on that input bus, which saves processing time.

    This works very well in Apple's Audio Units (e.g. AVAudioMixerNode), and depending on where the bus sits in the chain, you might see significant performance improvements, since none of the upstream nodes will be called.

    Unfortunately this is not implemented in AudioKit units (e.g. DryWetMixer).

    Proposed Solution

    This check would need to happen here and here I believe.

    I am not sure how to approach this, since the property lives on AUAudioUnitBusArray -> AUAudioUnitBus, which is not available within the DSP.

    Describe Alternatives You've Considered

    I guess one approach would be to duplicate these properties on the DSP, but that seems inelegant.

    Additional Context

    Please note that this is different from bypassing an effect, which just bypasses processing in one node but doesn't prevent further signal propagation.
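    A sketch of the duplicated-flag alternative (all names hypothetical; the real DryWetMixer DSP is C++, and only `AUAudioUnitBus.isEnabled` on the host side is the actual API):

```swift
// Host side: the standard AUAudioUnitBus API.
// audioUnit.inputBusses[1].isEnabled = false

// DSP side: mirror the flags so process() can skip disabled buses
// entirely, which also skips every upstream pull for that bus.
final class DryWetMixerDSPSketch {
    var inputBusEnabled: [Bool] = [true, true]   // duplicated from the host

    func process() {
        for (bus, enabled) in inputBusEnabled.enumerated() where enabled {
            pullInput(bus)      // placeholder for the AU input block
        }
        // ...mix whatever was pulled...
    }

    private func pullInput(_ bus: Int) { /* ... */ }
}
```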

    opened by jcavar 0
  • (Mostly) Eliminate AVAudioEngine


    AVAudioEngine is:

    • relatively buggy (still not handling dynamic connections well)
    • opaque (we don't have access to its code)
    • annoying (mapping AK's connections onto AVAudioEngine is tricky)
    • hard to use (terrible error messages)

    Proposed Solution

    Mostly replace AVAudioEngine usage in AK with our own code. AVAudioEngine would still be used to handle input and output, but we would evaluate our audio graph within a single AU. Thus the AVAudioEngine graph would always be Input -> AK AU -> Output.

    • Instead of the recursive pull-based style of typical AUs, sort the graph and call the AUs in sequence with dummy input blocks (this makes for simpler stack traces and easier profiling).
    • Write everything in Swift (AK needs to run in Playgrounds; make use of @noAllocations and @noLocks)
    • Host AUs (actually the same AUs as before!)
    • Don't bother trying to share buffers and update in-place (this seems to be of marginal benefit on modern hardware)
    • Preserve almost exactly the same API
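    The sorted, non-recursive evaluation described above can be sketched as follows (the Node/render types are hypothetical simplifications of real AU render blocks):

```swift
// Evaluate the audio graph inside a single AU: topologically sort the
// nodes once, then call each render block in sequence per render cycle.
final class GraphEvaluator {
    struct Node {
        let id: Int
        let inputs: [Int]                     // ids of upstream nodes
        let render: ([Float]) -> [Float]      // simplified render block
    }

    private var ordered: [Node] = []

    init(nodes: [Node]) {
        // Depth-first topological sort so every node runs after its inputs.
        var visited = Set<Int>()
        let byID = Dictionary(uniqueKeysWithValues: nodes.map { ($0.id, $0) })
        func visit(_ node: Node) {
            guard visited.insert(node.id).inserted else { return }
            node.inputs.compactMap { byID[$0] }.forEach(visit)
            ordered.append(node)
        }
        nodes.forEach(visit)
    }

    func renderCycle() -> [Int: [Float]] {
        var outputs: [Int: [Float]] = [:]
        for node in ordered {
            // Gather upstream buffers as this node's (dummy) input.
            let input = node.inputs.flatMap { outputs[$0] ?? [] }
            outputs[node.id] = node.render(input)
        }
        return outputs
    }
}
```

    Calling the blocks in a flat loop like this gives the simpler stack traces and easier profiling mentioned above.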

    Describe Alternatives You've Considered

    Continue to file bugs with Apple and hope they fix them. This is the status quo. It's not good enough.

    Additional Context

    The devil's in the details, and this could prove to be too much work after we investigate further.

    opened by wtholliday 0
  • Solution for #298


    Re discussion in https://github.com/AudioKit/AudioKit/issues/298

    From Logic, if you save your EXS24 instrument into the Samples/ folder in your Xcode project and also pick "include audio data", the saved EXS24 instrument will contain the correct relative path and will be able to load!

    Proposed Solution

    Since this is something a few people have run into, it might be worth adding something in the docs or FAQ about it?

    Describe Alternatives You've Considered

    This seems like the best thing

    Additional Context

    Happy to write up the FAQ post if you like

    opened by charliewilliams 1