HaishinKit (formerly lf)
- Camera and Microphone streaming library via RTMP, HLS for iOS, macOS, tvOS.
- For Issues, if you understand Japanese, please write in Japanese!
 
Sponsored with 💖 by Stream: Enterprise Grade APIs for Feeds & Chat. Try the iOS Chat tutorial.
Communication
- If you need help with making live streaming requests using HaishinKit, use a GitHub issue with the Bug report template.
  - The trace level log is very useful. Please set `Logboard.with(HaishinKitIdentifier).level = .trace` (see the snippet after this list).
  - If you don't use an issue template, I will immediately close your issue without comment.
- If you'd like to discuss a feature request, use a GitHub issue with the Feature request template.
- If you want e-mail based communication without a GitHub issue:
  - The consulting fee is $50 per incident. I am able to respond within a few days.
 
 - If you want to contribute, submit a pull request!
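For reference, a minimal sketch of turning on trace logging before reporting an issue, assuming the Logboard dependency that HaishinKit ships with; call it once at app startup:

```swift
import HaishinKit
import Logboard

// Raise HaishinKit's log level so protocol-level details show up in the console.
Logboard.with(HaishinKitIdentifier).level = .trace
```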
 
Features
RTMP
- Authentication
- Publish and Recording (H264/AAC)
- Playback (Beta)
- Adaptive bitrate streaming
  - Handling (see also #126)
  - Automatic drop frames
- Action Message Format
  - AMF0
  - AMF3
- SharedObject
- RTMPS
  - Native (RTMP over SSL/TLS)
  - Tunneled (RTMPT over SSL/TLS) (Technical Preview)
- RTMPT (Technical Preview)
- ReplayKit Live as a Broadcast Upload Extension (Technical Preview)
HLS
- HTTPService
 - HLS Publish
 
Rendering
| - | HKView | GLHKView | MTHKView | 
|---|---|---|---|
| Engine | AVCaptureVideoPreviewLayer | OpenGL ES | Metal | 
| Publish | ○ | ○ | ◯ | 
| Playback | × | ○ | ◯ | 
| VisualEffect | × | ○ | ◯ | 
| Condition | Stable | Stable | Beta | 
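The views in the table above are interchangeable. As an illustration, here is a minimal sketch that renders with the Metal-backed MTHKView instead of HKView; it assumes an existing rtmpStream and a host view, mirroring the HKView usage in the RTMP Usage section below:

```swift
import AVFoundation
import HaishinKit

// MTHKView uses Metal for rendering and supports playback and visual effects.
let metalView = MTHKView(frame: view.bounds)
metalView.videoGravity = AVLayerVideoGravity.resizeAspectFill
metalView.attachStream(rtmpStream)
view.addSubview(metalView)
```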
Others
- Support tvOS 10.2+ (Technical Preview)
  - tvOS can't publish camera and microphone; the playback feature is available.
- Hardware acceleration for H264 video encoding, AAC audio encoding
- Support "Allow app extension API only" option
- Support GPUImage framework (~> 0.5.12)
- Objective-C Bridging
Requirements
| - | iOS | OSX | tvOS | Xcode | Swift | CocoaPods | Carthage | 
|---|---|---|---|---|---|---|---|
| 1.1.0+ | 9.0+ | 10.11+ | 10.2+ | 12.0+ | 5.0+ | 1.5.0+ | 0.29.0+ | 
| 1.0.0+ | 8.0+ | 10.11+ | 10.2+ | 11.0+ | 5.0+ | 1.5.0+ | 0.29.0+ | 
| 0.11.0+ | 8.0+ | 10.11+ | 10.2+ | 10.0+ | 5.0 | 1.5.0+ | 0.29.0+ | 
Cocoa Keys
Please add the following keys to your Info.plist.
iOS 10.0+
- NSMicrophoneUsageDescription
 - NSCameraUsageDescription
 
macOS 10.14+
- NSMicrophoneUsageDescription
 - NSCameraUsageDescription
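These keys back the runtime permission prompts. A minimal sketch of requesting access with plain AVFoundation (not HaishinKit-specific) before attaching devices to a stream:

```swift
import AVFoundation

// Ask for camera and microphone access up front; attach devices only when granted.
AVCaptureDevice.requestAccess(for: .video) { granted in
    print("Camera access granted: \(granted)")
}
AVCaptureDevice.requestAccess(for: .audio) { granted in
    print("Microphone access granted: \(granted)")
}
```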
 
Installation
*Please set up your project for Swift 5.3.*
CocoaPods
```rb
source 'https://github.com/CocoaPods/Specs.git'
use_frameworks!

def import_pods
    pod 'HaishinKit', '~> 1.1.5'
end

target 'Your Target' do
    platform :ios, '9.0'
    import_pods
end
```
Carthage
```
github "shogo4405/HaishinKit.swift" ~> 1.1.5
```
Swift Package Manager
https://github.com/shogo4405/HaishinKit.swift
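For reference, a minimal Package.swift sketch; the package name "YourApp" and the 1.1.5 lower bound are assumptions chosen to match the CocoaPods/Carthage examples above:

```swift
// swift-tools-version:5.3
import PackageDescription

let package = Package(
    name: "YourApp",
    dependencies: [
        // HaishinKit resolved via Swift Package Manager.
        .package(name: "HaishinKit", url: "https://github.com/shogo4405/HaishinKit.swift", from: "1.1.5")
    ],
    targets: [
        .target(name: "YourApp", dependencies: ["HaishinKit"])
    ]
)
```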
License
BSD-3-Clause
Donation
Paypal
Bitcoin
3FnjC3CmwFLTzNY5WPNz4LjTo1uxGNozUR
Prerequisites
Make sure you set up and activate your AVAudioSession.
```swift
import AVFoundation
let session = AVAudioSession.sharedInstance()
do {
    // https://stackoverflow.com/questions/51010390/avaudiosession-setcategory-swift-4-2-ios-12-play-sound-on-silent
    if #available(iOS 10.0, *) {
        try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker, .allowBluetooth])
    } else {
        session.perform(NSSelectorFromString("setCategory:withOptions:error:"), with: AVAudioSession.Category.playAndRecord, with: [
            AVAudioSession.CategoryOptions.allowBluetooth,
            AVAudioSession.CategoryOptions.defaultToSpeaker]
        )
        try session.setMode(.default)
    }
    try session.setActive(true)
} catch {
    print(error)
}
```
RTMP Usage
Real Time Messaging Protocol (RTMP).
```swift
let rtmpConnection = RTMPConnection()
let rtmpStream = RTMPStream(connection: rtmpConnection)
rtmpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio)) { error in
    // print(error)
}
rtmpStream.attachCamera(DeviceUtil.device(withPosition: .back)) { error in
    // print(error)
}
let hkView = HKView(frame: view.bounds)
hkView.videoGravity = AVLayerVideoGravity.resizeAspectFill
hkView.attachStream(rtmpStream)
// add ViewController#view
view.addSubview(hkView)
rtmpConnection.connect("rtmp://localhost/appName/instanceName")
rtmpStream.publish("streamName")
// if you want to record a stream.
// rtmpStream.publish("streamName", type: .localRecord)
```
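In practice you usually wait until the connection reports success before publishing. A sketch of that pattern using the library's event listener API, assuming this code lives in a view controller that owns rtmpConnection and rtmpStream; the handler name rtmpStatusHandler is illustrative:

```swift
rtmpConnection.addEventListener(.rtmpStatus, selector: #selector(rtmpStatusHandler), observer: self)
rtmpConnection.connect("rtmp://localhost/appName/instanceName")

@objc
private func rtmpStatusHandler(_ notification: Notification) {
    let e = Event.from(notification)
    guard let data = e.data as? ASObject, let code = data["code"] as? String else {
        return
    }
    switch code {
    case RTMPConnection.Code.connectSuccess.rawValue:
        // The connection is ready; start publishing.
        rtmpStream.publish("streamName")
    default:
        break
    }
}
```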
RTMP URL Format
- rtmp://server-ip-address[:port]/application/[appInstance]/[prefix:[path1[/path2/]]]streamName
  - The [] mark means the part is optional.
  ```swift
  rtmpConnection.connect("rtmp://server-ip-address[:port]/application/[appInstance]")
  rtmpStream.publish("[prefix:[path1[/path2/]]]streamName")
  ```
- rtmp://localhost/live/streamName
  ```swift
  rtmpConnection.connect("rtmp://localhost/live")
  rtmpStream.publish("streamName")
  ```
Settings
```swift
var rtmpStream = RTMPStream(connection: rtmpConnection)
rtmpStream.captureSettings = [
    .fps: 30, // FPS
    .sessionPreset: AVCaptureSession.Preset.medium, // input video width/height
    // .isVideoMirrored: false,
    // .continuousAutofocus: false, // use camera autofocus mode
    // .continuousExposure: false, //  use camera exposure mode
    // .preferredVideoStabilizationMode: AVCaptureVideoStabilizationMode.auto
]
rtmpStream.audioSettings = [
    .muted: false, // mute audio
    .bitrate: 32 * 1000,
]
rtmpStream.videoSettings = [
    .width: 640, // video output width
    .height: 360, // video output height
    .bitrate: 160 * 1000, // video output bitrate
    .profileLevel: kVTProfileLevel_H264_Baseline_3_1, // H264 Profile require "import VideoToolbox"
    .maxKeyFrameIntervalDuration: 2, // key frame / sec
]
// "0" means the same as the input source
rtmpStream.recorderSettings = [
    AVMediaType.audio: [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 0,
        AVNumberOfChannelsKey: 0,
        // AVEncoderBitRateKey: 128000,
    ],
    AVMediaType.video: [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoHeightKey: 0,
        AVVideoWidthKey: 0,
        /*
        AVVideoCompressionPropertiesKey: [
            AVVideoMaxKeyFrameIntervalDurationKey: 2,
            AVVideoProfileLevelKey: AVVideoProfileLevelH264Baseline30,
            AVVideoAverageBitRateKey: 512000
        ]
        */
    ],
]
// set the 2nd argument to false
rtmpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio), automaticallyConfiguresApplicationAudioSession: false)
```
Authentication
```swift
var rtmpConnection = RTMPConnection()
rtmpConnection.connect("rtmp://username:password@localhost/appName/instanceName")
```
Screen Capture
```swift
// iOS
rtmpStream.attachScreen(ScreenCaptureSession(shared: UIApplication.shared))
// macOS
rtmpStream.attachScreen(AVCaptureScreenInput(displayID: CGMainDisplayID()))
```
HTTP Usage
HTTP Live Streaming (HLS). Your iPhone/Mac becomes an IP camera. Basic snippet. You can view the stream at http://ip.address:8080/hello/playlist.m3u8.
```swift
var httpStream = HTTPStream()
httpStream.attachCamera(DeviceUtil.device(withPosition: .back))
httpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio))
httpStream.publish("hello")

var hkView = HKView(frame: view.bounds)
hkView.attachStream(httpStream)

var httpService = HLSService(domain: "", type: "_http._tcp", name: "HaishinKit", port: 8080)
httpService.startRunning()
httpService.addHTTPStream(httpStream)

// add ViewController#view
view.addSubview(hkView)
```
FAQ
How can I run example project?
```sh
git clone https://github.com/shogo4405/HaishinKit.swift.git
cd HaishinKit.swift
carthage bootstrap --use-xcframeworks
open HaishinKit.xcodeproj
```
Reference
- Adobe’s Real Time Messaging Protocol
 - Action Message Format -- AMF 0
 - Action Message Format -- AMF 3
 - Video File Format Specification Version 10
 - Adobe Flash Video File Format Specification Version 10.1
 