Simplest MIDI Swift library

Overview

WebMIDIKit: Simplest Swift MIDI library

Want to learn audio synthesis, sound design, and how to make cool sounds in an afternoon? Check out Syntorial!

About

What's MIDI

MIDI is a standard governing music software and music device interconnectivity. It lets you make music by sending data between applications and devices.

What's WebMIDI

WebMIDI is a browser API standard that brings MIDI to the web. It is minimal: it only describes MIDI port selection, receiving data from input ports, and sending data to output ports. WebMIDI is currently implemented in Chrome and Opera. Note that WebMIDI is relatively low level, since messages are still represented as sequences of UInt8s (bytes/octets).
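For example, a note-on message is just three bytes: a status byte, a note number, and a velocity. In Swift that's nothing more than:

/// note on, channel 1: status 0x90, middle C (note 60), velocity 127 (max)
let noteOn: [UInt8] = [0x90, 0x3C, 0x7F]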

What's WebMIDIKit

WebMIDIKit is an implementation of the WebMIDI API for macOS/iOS. On these platforms, the native framework for working with MIDI is CoreMIDI. CoreMIDI is old and its API is entirely in C ( 💩 ). Using it involves a lot of void pointer casting ( 💩 ^9.329) and other unspeakable things. Furthermore, some of the APIs didn't quite survive the transition to Swift and are essentially unusable from it (MIDIPacketList APIs, I'm looking at you).

CoreMIDI is also extremely verbose and error prone. Selecting an input port and receiving data from it takes roughly 80 lines of convoluted Swift code. WebMIDIKit lets you do the same thing in one.

WebMIDIKit is a part of the AudioKit project and will eventually replace AudioKit's MIDI implementation.

Also note that WebMIDIKit adds some APIs which aren't a part of the WebMIDI standard. These are marked as non-standard in the code base.

Usage

Installation

Use Swift Package Manager. Add the following .package entry to your dependencies.

.package(url: "https://github.com/adamnemecek/webmidikit", from: "1.0.0")
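If you prefer to see a complete manifest, a minimal Package.swift sketch in the current manifest format might look like the following (the tools version, platform requirement, and target names are assumptions; adjust them to your project and to whatever the library's releases actually support):

// swift-tools-version:5.5
import PackageDescription

let package = Package(
    name: "MyMIDIApp",
    platforms: [.macOS(.v10_15)],
    dependencies: [
        .package(url: "https://github.com/adamnemecek/WebMIDIKit.git", from: "1.0.0")
    ],
    targets: [
        .executableTarget(name: "MyMIDIApp", dependencies: ["WebMIDIKit"])
    ]
)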

Check out the sample project.

Selecting an input port and receiving MIDI messages from it

import WebMIDIKit

/// represents the MIDI session
let midi: MIDIAccess = MIDIAccess()

/// prints all MIDI inputs available to the console and asks the user which port they want to select
let inputPort: MIDIInput? = midi.inputs.prompt()

/// Receiving MIDI events 
/// set the input port's onMIDIMessage callback which gets called when the port receives MIDI packets
inputPort?.onMIDIMessage = { (list: UnsafePointer<MIDIPacketList>) in
    for packet in list {
        print("received \(packet)")
    }
}

Selecting an output port and sending MIDI packets to it

/// select an output port
let outputPort: MIDIOutput? = midi.outputs.prompt()

/// send messages to it
outputPort.map {

	/// send a note on message
	/// the bytes are in the normal MIDI message format (https://www.midi.org/specifications/item/table-1-summary-of-midi-message)
	/// i.e. you have to send two events, a note-on event and a note-off event, to play a single note
	/// the format is as follows:
	/// byte0 = message type (0x90 = note on, 0x80 = note off)
	/// byte1 = the note played (0x60 = 96, a high C; see http://www.midimountain.com/midi/midi_note_numbers.html)
	/// byte2 = velocity (how loud the note should be; 127 (= 0x7f) is max, 0 is min)

	let noteOn: [UInt8] = [0x90, 0x60, 0x7f]
	let noteOff: [UInt8] = [0x80, 0x60, 0]

	/// send the note on event
	$0.send(noteOn)

	/// send a note off message 1000 ms (1 second) later
	$0.send(noteOff, offset: 1000)

	/// in WebMIDIKit, you can also chain these
	$0.send(noteOn)
	  .send(noteOff, offset: 1000)
}
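Because offsets are relative to now, you can schedule a short phrase up front. A minimal sketch (the note numbers are ordinary MIDI note values, chosen here just for illustration): each note of a C major arpeggio starts 500 ms after the previous one and lasts 400 ms.

outputPort?.send([0x90, 0x3C, 0x7f])                /// C on now
           .send([0x80, 0x3C, 0x00], offset: 400)   /// C off
           .send([0x90, 0x40, 0x7f], offset: 500)   /// E on
           .send([0x80, 0x40, 0x00], offset: 900)   /// E off
           .send([0x90, 0x43, 0x7f], offset: 1000)  /// G on
           .send([0x80, 0x43, 0x00], offset: 1400)  /// G off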

If the output port you want to select has a corresponding input port, you can also do

let outputPort2: MIDIOutput? = inputPort.flatMap { midi.output(for: $0) }

Similarly, you can find the input port corresponding to an output port.

let inputPort2: MIDIInput? = outputPort.flatMap { midi.input(for: $0) }
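Putting the two together, here is a small loopback sketch (it assumes the port you pick actually has a matching counterpart, and uses only the calls shown above):

/// whatever we send to the output should arrive on its paired input
if let input = midi.inputs.prompt(),
   let output = midi.output(for: input) {

    input.onMIDIMessage = { (list: UnsafePointer<MIDIPacketList>) in
        for packet in list {
            print("looped back \(packet)")
        }
    }

    /// middle C on, then off half a second later
    output.send([0x90, 0x3C, 0x7f])
          .send([0x80, 0x3C, 0x00], offset: 500)
}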

Looping over ports

Port maps are dictionary-like collections of MIDIInputs or MIDIOutputs, indexed by the port's id. As a result, you cannot index into them like you would into an array (the reason being that endpoints can be added and removed at any time, so you cannot refer to them by a positional index).

for (id, port) in midi.inputs {
	print(id, port)
}
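Since the maps iterate as (id, port) pairs, the usual Sequence operations work on them as well. A sketch (the device name is made up, and only the Sequence conformance implied by the for-in loop above is assumed):

/// find the first input whose name mentions a particular device
let keyboard: MIDIInput? = midi.inputs.first { $0.1.name.contains("Keystation") }?.1

/// or collect every output whose endpoint is currently connected
let connectedOutputs = midi.outputs.filter { $0.1.state == .connected }.map { $0.1 }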

Creating virtual ports

To create virtual input and output ports, use the createVirtualMIDIInput and createVirtualMIDIOutput functions respectively.

let virtualInput = midi.createVirtualMIDIInput(name: "Virtual input")

let virtualOutput = midi.createVirtualMIDIOutput(name: "Virtual output") { (list: UnsafePointer<MIDIPacketList>) in
    /// handle the incoming MIDIPacketLists here
}


Documentation

MIDIAccess

Represents the MIDI session. See spec.

class MIDIAccess {
	/// collections of MIDIInputs and MIDIOutputs currently connected to the computer
	var inputs: MIDIInputMap { get }
	var outputs: MIDIOutputMap { get }

	/// will be called if a port changes either connection state or port state
	var onStateChange: ((MIDIPort) -> ())? = nil { get set }

	init()
	
	/// given an output, tries to find the corresponding input port
	func input(for port: MIDIOutput) -> MIDIInput?
	
	/// given an input, tries to find the corresponding output port
	/// if you send data to the output port returned, the corresponding input port
	/// will receive it (assuming the `onMIDIMessage` is set)
	func output(for port: MIDIInput) -> MIDIOutput?
}
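For example, a short sketch that logs hot-plugging via onStateChange (it only touches properties documented below on MIDIPort):

midi.onStateChange = { (port: MIDIPort) in
    print("\(port.type) '\(port.name)': state = \(port.state), connection = \(port.connection)")
}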

MIDIPort

See spec. Represents the base class of MIDIInput and MIDIOutput.

Note that you don't construct MIDIPorts or their subclasses yourself; you only get them from the MIDIAccess object. Also note that you only ever deal with subclasses of MIDIPort (MIDIInput or MIDIOutput), never MIDIPort itself.

class MIDIPort {

	var id: Int { get }
	var manufacturer: String { get }

	var name: String { get }

	/// .input (for MIDIInput) or .output (for MIDIOutput)
	var type: MIDIPortType { get }

	var version: Int { get }

	/// .connected | .disconnected,
	/// indicates if the port's endpoint is connected or not
	var state: MIDIPortDeviceState { get }

	/// .open | .closed
	var connection: MIDIPortConnectionState { get }

	/// opens the port; called implicitly when MIDIInput's onMIDIMessage is set or MIDIOutput's send is called
	func open()

	/// closes the port
	func close()
}

MIDIInput

Allows for receiving data sent to the port.

See spec.

class MIDIInput: MIDIPort {
	/// set this and it will get called when the port receives messages.
	var onMIDIMessage: ((UnsafePointer<MIDIPacketList>) -> ())? = nil { get set }
}

MIDIOutput

See spec.

class MIDIOutput: MIDIPort {

	/// sends data to the port; note that unlike the WebMIDI API,
	/// the last parameter specifies an offset from now at which the event should be scheduled
	/// (as opposed to an absolute timestamp); the unit is still milliseconds.
	/// Note that WebMIDIKit currently doesn't support sending multiple packets in a single call; to send multiple packets, you need one call per packet.
	func send<S: Sequence>(_ data: S, offset: Timestamp = 0) -> MIDIOutput where S.Iterator.Element == UInt8
	
	/// clears all scheduled but not yet delivered MIDI events
	func clear()
}
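As a sketch of the one-call-per-packet rule (reusing the outputPort from the usage section above; the note numbers are just examples), here is a C major triad sent as individual packets, followed by clear() to cancel anything still pending:

/// C, E, G
let triad: [UInt8] = [0x3C, 0x40, 0x43]

for note in triad {
    outputPort?.send([0x90, note, 0x7f])                /// note on now
               .send([0x80, note, 0x00], offset: 1000)  /// note off in one second
}

/// changed your mind? drop every scheduled event that hasn't been delivered yet
outputPort?.clear()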
Comments
  • Can't use package with SPM

    Can't use package with SPM

    Hey Adam,

    I'm trying to test your package in a minimal setup. I added it as required in my Package.swift:

    // swift-tools-version:5.0
    // The swift-tools-version declares the minimum version of Swift required to build this package.
    
    import PackageDescription
    
    let package = Package(
        name: "WebMidiKitTest",
        dependencies: [
            // Dependencies declare other packages that this package depends on.
            // .package(url: /* package url */, from: "1.0.0"),
            .package(url: "https://github.com/adamnemecek/WebMIDIKit.git", majorVersion: 1)
        ],
        targets: [
            // Targets are the basic building blocks of a package. A target can define a module or a test suite.
            // Targets can depend on other targets in this package, and on products in packages which this package depends on.
            .target(
                name: "WebMidiKitTest",
                dependencies: ["WebMIDIKit"]),
            .testTarget(
                name: "WebMidiKitTestTests",
                dependencies: ["WebMidiKitTest"]),
        ]
    )
    

    But when I run swift build I get the following error message:

    /Users/nik/proj/IDEs/Xcode/WebMidiKitTest: error: manifest parse error(s):
    /Users/nik/proj/IDEs/Xcode/WebMidiKitTest/Package.swift:11:10: error: ambiguous reference to member 'package'
            .package(url: "https://github.com/adamnemecek/WebMIDIKit.git", majorVersion: 1)
    
    

    I tried changing majorVersion: 1 to from: "1.0.0", which seems to at least conform to the function signature, but this gave the following error:

    Updating https://github.com/adamnemecek/WebMIDIKit.git
    error: dependency graph is unresolvable; found these conflicting requirements:
    
    Dependencies:
        https://github.com/adamnemecek/WebMIDIKit.git @ 1.0.0..<2.0.0
    

    Can you help me out here? Really depend on your library for a generative music project, thanks :)

    opened by nkleemann 8
  • Conform to new SPM syntax

    Conform to new SPM syntax

    The Package.swift format used here has been deprecated since Swift 4. The package can now be installed with Xcode 11 by selecting "branch: master" during the install process.

    Since I did not look into the tag/version problem, I can give no update on that.

    opened by nkleemann 7
  • PacketlistAPI from swift

    PacketlistAPI from swift

    Hello Adam, I was looking for a Swift MIDI solution and found your code. I agree that the PacketList API cannot be used from Swift right now, but I found a simple way to work around it using some small inlined C functions. With these functions you can use the CoreMIDI PacketList API from Swift without restrictions. Feel free to check out the repository jrmadev/SwiftMIDI

    I took a look at your packet iterating extension:

    extension MIDIPacketList: Sequence {
        public typealias Element = MIDIEvent
    
        public func makeIterator() -> AnyIterator<Element> {
    
           // the packet struct is copied here:
            var p: MIDIPacket = packet
            var i = (0..<numPackets).makeIterator()
    
            return AnyIterator {
                defer {
    
               // the packet struct is copied here:
                    p = MIDIPacketNext(&p).pointee
                }
    
                return i.next().map { _ in .init(packet: &p) }
            }
        }
    }
    

    The iterated packets are copied from the original list.

    In MIDIServices.h MIDIPacketNext is documented like this:

    /*!
     @function MIDIPacketNext
    
     @abstract Advances a MIDIPacket pointer to the MIDIPacket which
          immediately follows it in memory if it is part of a MIDIPacketList.
    
     @param pkt A pointer to a MIDIPacket in a MIDIPacketList.
    
     @result The subsequent packet in the MIDIPacketList.
    */
    

    When you assign the packet struct to a variable (structs are value types in Swift), it is no longer part of a packet list. As a side effect you lose bytes in case the data is longer than 256 bytes. MIDIPacketNext calculates pointer offsets, i.e. when calling MIDIPacketNext(&p) the returned pointer will eventually point into memory outside of the current packet.

    In my repository I have a small code sample showing exactly this. IterateOverCopiedPackets

    Greetings, Jan

    opened by jrmaxdev 6
  • When an Endpoint is gone you can't ask for kMIDIPropertyUniqueID, it is gone

    When an Endpoint is gone you can't ask for kMIDIPropertyUniqueID, it is gone

    When a MIDIEndpoint is gone (physically disconnected, or the software hosting a virtual port has been torn down), you can't ask for kMIDIPropertyUniqueID; it is gone already, and there is no ID available any more.

    internal final class MIDIEndpoint: Codable {
        ...
        final var id: Int {
            self[int: kMIDIPropertyUniqueID]
        }
        ...
    }
    

    This causes the API to crash any time CoreMIDI is asked for the ID when it cannot answer. That is a conceptual ERROR. This is the reason why the endpoint ID, once evaluated, should be stored in your local PortMap/List. When a notification arrives telling you the endpoint is gone, it carries the ID that caused it, but asking CoreMIDI again will crash. Instead, you take the notified endpoint ID, iterate through your PortMap, and clean up the list so it represents the proper local state of all available CoreMIDI devices.

    opened by designerfuzzi 5
  • Swift package - missing manifest specification

    Swift package - missing manifest specification

    encountered on Xcode 12.5 and Xcode 13.1

    the manifest is missing a Swift tools version specification; consider prepending to the manifest '// swift-tools-version:5.4.0' to specify the current Swift toolchain version as the lowest Swift version supported by the project; if such a specification already exists, consider moving it to the top of the manifest, or prepending it with '//' to help Swift Package Manager find it

    opened by qalvapps 5
  • Improve Xcode 12+ compatibility

    Improve Xcode 12+ compatibility

    Closes #31

    When trying to "Add Package" from e.g. an Xcode macOS app project, this package does resolve, but SwiftPM complains about the tools-version not being set.

    Updating this to (with whitespace, which is new ⚠️) 5.6 works; but 5.5 may be good enough, too.

    This increases the minimum required Swift version, though. Not sure if you're ok with this @adamnemecek?

    opened by DivineDominion 4
  • All messages are the same

    All messages are the same

    All MIDI Messages received are the same, even though they are not.

    Message: 0x0000700003bc6050

    Code:

    func midiTest() {
        inputPort?.onMIDIMessage = { (list: UnsafePointer<MIDIPacketList>) in
            for packet in list {
                print("received \(packet)")
            }
    		
        }	
    }
    

    with CoreMIDI and WebMIDIKit imports (and, of course, Foundation).

    opened by BlockArchitech 3
  • MIDIPacketNext update & minor code refactor/tweaks

    MIDIPacketNext update & minor code refactor/tweaks

    While I updated the MIDIPacketNext code, I had to update a few other things to get the library to compile.

    I upped the Package.swift tools version to 5.3, which allows using the platforms parameter; this was needed because MIDIClientCreateWithBlock and MIDIInputPortCreateWithBlock require macOS 10.11 minimum.

    It's worth noting that as-is, this is a macOS-only library and will not build on other platforms such as iOS because it uses a couple API calls that are macOS-only: AudioGetCurrentHostTime and AudioConvertNanosToHostTime.

    Cheers.

    opened by orchetect 2
  • onMessageReceived seems to

    onMessageReceived seems to "leak/grow" MidiBuffer or never empties.

    The problem is that this ends up behaving like a leak. I have to admit that it comes up while WkWebKit is handling WebMidi with the help of MIDIDriver.swift and MIDIMessageHandler.swift, but measuring with Instruments I was able to see that memory allocation keeps growing before, in, and after the block assigned to onMIDIMessage is called. App memory increases with every message received.

    That's the point where I must admit I will start from scratch, likely ending up with a new API that does it the classic way.

    opened by designerfuzzi 1
  • Capacitor Plugin

    Capacitor Plugin

    I have a Capacitor app, and have been looking for a iOS MIDI solution. This seems promising, I am wondering if anyone has had experience using this as a Capacitor Plugin?

    PS... I ran into a few issues trying to build this (and the demo project) in Xcode 12.1.

    opened by joshstovall 1
  • this line is useless and worse it enforces crashing

    this line is useless and worse it enforces crashing

    https://github.com/adamnemecek/WebMIDIKit/blob/60a99ca8d6f82ac39fb6fc4618cad3fd05aa9a96/Sources/WebMIDIKit/MIDIPortMap.swift#L70

    Any device, virtual or hardware, can disconnect at any time: a cable is lost, a virtual receiver app crashes, or it is simply turned off. Which means testing whether something that is not connected is connected of course makes no sense. The enclosing function removes (actually just releases) the entry from the MIDIPortMap for inputs or outputs, and we remove it for a reason. Let's make an example: we turn off a device, CoreMIDI invalidates its entry and sends the removal notification, the notification is received and calls back to remove the entry from the local MIDIPortMap. Now we try to test whether something that cannot be connected is connected, right after receiving the notification saying exactly that.. lots of overthinking in this API

    opened by designerfuzzi 0
  • Failed to Run the Simplest Demo

    Failed to Run the Simplest Demo

    Hi, I'm new to Swift. I'm trying to get the MIDI input port list with:

    import WebMIDIKit

    let midi: MIDIAccess = MIDIAccess()

    let inputPort: MIDIInput? = midi.inputs.prompt()

    The build succeeded, but I get an error:

    [screenshot of the error omitted]

    How can I get it to run correctly?

    opened by Espresso1027 0
  • Removing a Midi Device produces a crash

    Removing a Midi Device produces a crash

    The problem is here:

    Assertion failed: file WebMIDIKit/MIDIPortMap.swift, line 72

    The code:

    assert(port.state == .connected)

    Is this assertion necessary?

    opened by raduvarga 8
  • Looking to use in iOS and starting with CoreAudio macOS migration

    Looking to use in iOS and starting with CoreAudio macOS migration

    AudioGetCurrentHostTime() and AudioConvertNanosToHostTime(UInt64($0 * 1000000)) don't exist on iOS so we need to use https://developer.apple.com/library/archive/qa/qa1643/_index.html

    Any idea how we should go about using the CAHostTimeBase Obj-C class in order to get this done?

    opened by cdbattags 4