Camera and Microphone streaming library via RTMP, HLS for iOS, macOS, tvOS.

Overview

HaishinKit


  • Camera and Microphone streaming library via RTMP, HLS for iOS, macOS, tvOS.
  • For issues: if you understand Japanese, please write in Japanese!

Sponsored with 💖 by
Stream Chat
Enterprise Grade APIs for Feeds & Chat. Try the iOS Chat tutorial 💬

Communication

  • If you need help making live-streaming requests with HaishinKit, use a GitHub issue with the Bug report template.
    • The trace-level log is very useful. Please set Logboard.with(HaishinKitIdentifier).level = .trace (see the snippet after this list).
    • If you don't use an issue template, I will immediately close your issue without comment.
  • If you'd like to discuss a feature request, use a GitHub issue with the Feature request template.
  • If you want e-mail based communication without a GitHub issue:
    • The consulting fee is $50 per incident. Please allow a few days for a response.
  • If you want to contribute, submit a pull request!
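
For reference, enabling the trace log mentioned above is a one-liner; set it before connecting so the handshake is captured (Logboard ships as a HaishinKit dependency):

import HaishinKit
import Logboard

// Print verbose HaishinKit logs to the console while reproducing the problem.
Logboard.with(HaishinKitIdentifier).level = .trace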

Features

RTMP

  • Authentication
  • Publish and Recording (H264/AAC)
  • Playback (Beta)
  • Adaptive bitrate streaming
    • Handling is left to the application (see also #126 and the sketch after this list)
    • Automatic frame dropping
  • Action Message Format
    • AMF0
    • AMF3
  • SharedObject
  • RTMPS
    • Native (RTMP over SSL/TLS)
    • Tunneled (RTMPT over SSL/TLS) (Technical Preview)
  • RTMPT (Technical Preview)
  • ReplayKit Live as a Broadcast Upload Extension (Technical Preview)
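
Bitrate handling during a publish is left to the application (see #126). One simple approach, sketched below, is to raise or lower the encoder bitrate at runtime. onBandwidthChange(isInsufficient:) and currentBitrate are hypothetical names you would wire to your own congestion detection, and the videoSettings subscript is an assumption based on recent HaishinKit versions:

// A minimal sketch of manual adaptive bitrate, assuming `rtmpStream` is a live RTMPStream.
// `currentBitrate` and `onBandwidthChange(isInsufficient:)` are hypothetical; HaishinKit does not define them.
var currentBitrate = 160 * 1000

func onBandwidthChange(isInsufficient: Bool) {
    // Halve the bitrate under congestion, otherwise step back up gradually.
    currentBitrate = isInsufficient
        ? max(currentBitrate / 2, 100_000)
        : min(currentBitrate + 64_000, 1_000_000)
    rtmpStream.videoSettings[.bitrate] = currentBitrate
}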

HLS

  • HTTPService
  • HLS Publish

Rendering

             HKView                      MTHKView
Engine       AVCaptureVideoPreviewLayer  Metal
Publish      ○                           ○
Playback     ×                           ○
VisualEffect ×                           ○
Condition    Stable                      Stable

Others

  • Support tvOS 10.2+ (Technical Preview)
    • tvOS can't publish camera and microphone streams; only the playback feature is available.
  • Hardware acceleration for H264 video encoding and AAC audio encoding
  • Support "Allow app extension API only" option
  • Support GPUImage framework (~> 0.5.12)
  • Objective-C Bridging

Requirements

Version  iOS   OSX     tvOS   Xcode  Swift
1.2.0+   9.0+  10.11+  10.2+  13.0+  5.5+
1.1.0+   9.0+  10.11+  10.2+  12.0+  5.0+
1.0.0+   8.0+  10.11+  10.2+  11.0+  5.0+

Cocoa Keys

Please add the following keys to your Info.plist, as shown in the example below.

iOS 10.0+

  • NSMicrophoneUsageDescription
  • NSCameraUsageDescription

macOS 10.14+

  • NSMicrophoneUsageDescription
  • NSCameraUsageDescription
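
For example, the raw Info.plist entries look like this (the usage strings below are placeholders; write your own):

<key>NSCameraUsageDescription</key>
<string>This app uses the camera to capture live video for streaming.</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app uses the microphone to capture audio for streaming.</string>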

Installation

Please set up your project with Swift 5.5.

CocoaPods

source 'https://github.com/CocoaPods/Specs.git'
use_frameworks!

def import_pods
    pod 'HaishinKit', '~> 1.2.2'
end

target 'Your Target' do
    platform :ios, '9.0'
    import_pods
end

Carthage

github "shogo4405/HaishinKit.swift" ~> 1.2.2

Swift Package Manager

https://github.com/shogo4405/HaishinKit.swift
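
If you declare the dependency in a Package.swift instead of through Xcode's UI, a minimal manifest looks roughly like this (the version and the product name "HaishinKit" are assumptions; adjust to your setup):

// swift-tools-version:5.5
import PackageDescription

let package = Package(
    name: "YourApp",
    dependencies: [
        // Assumed version; pin to whichever release you actually want.
        .package(url: "https://github.com/shogo4405/HaishinKit.swift", from: "1.2.2"),
    ],
    targets: [
        .executableTarget(name: "YourApp", dependencies: ["HaishinKit"]),
    ]
)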

License

BSD-3-Clause

Donation

PayPal

Bitcoin

3FnjC3CmwFLTzNY5WPNz4LjTo1uxGNozUR

Prerequisites

Make sure you set up and activate your AVAudioSession.

import AVFoundation
let session = AVAudioSession.sharedInstance()
do {
    // https://stackoverflow.com/questions/51010390/avaudiosession-setcategory-swift-4-2-ios-12-play-sound-on-silent
    if #available(iOS 10.0, *) {
        try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker, .allowBluetooth])
    } else {
        session.perform(NSSelectorFromString("setCategory:withOptions:error:"), with: AVAudioSession.Category.playAndRecord, with: [
            AVAudioSession.CategoryOptions.allowBluetooth,
            AVAudioSession.CategoryOptions.defaultToSpeaker]
        )
        try session.setMode(.default)
    }
    try session.setActive(true)
} catch {
    print(error)
}

RTMP Usage

Real Time Messaging Protocol (RTMP).

let rtmpConnection = RTMPConnection()
let rtmpStream = RTMPStream(connection: rtmpConnection)
rtmpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio)) { error in
    // print(error)
}
rtmpStream.attachCamera(DeviceUtil.device(withPosition: .back)) { error in
    // print(error)
}

let hkView = HKView(frame: view.bounds)
hkView.videoGravity = AVLayerVideoGravity.resizeAspectFill
hkView.attachStream(rtmpStream)

// add ViewController#view
view.addSubview(hkView)

rtmpConnection.connect("rtmp://localhost/appName/instanceName")
rtmpStream.publish("streamName")
// if you want to record a stream.
// rtmpStream.publish("streamName", type: .localRecord)
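
In practice you usually wait for the connection to succeed before calling publish. A common pattern looks like the sketch below (event and code names follow recent HaishinKit versions; treat it as an assumption and check your installed release):

rtmpConnection.addEventListener(.rtmpStatus, selector: #selector(rtmpStatusHandler), observer: self)
rtmpConnection.connect("rtmp://localhost/appName/instanceName")

@objc
private func rtmpStatusHandler(_ notification: Notification) {
    let e = Event.from(notification)
    guard let data = e.data as? ASObject, let code = data["code"] as? String else {
        return
    }
    switch code {
    case RTMPConnection.Code.connectSuccess.rawValue:
        // Publish only after NetConnection.Connect.Success arrives.
        rtmpStream.publish("streamName")
    default:
        break
    }
}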

RTMP URL Format

  • rtmp://server-ip-address[:port]/application/[appInstance]/[prefix:[path1[/path2/]]]streamName
    • The bracketed [] parts are optional.
    rtmpConnection.connect("rtmp://server-ip-address[:port]/application/[appInstance]")
    rtmpStream.publish("[prefix:[path1[/path2/]]]streamName")
    
  • rtmp://localhost/live/streamName
    rtmpConnection.connect("rtmp://localhost/live")
    rtmpStream.publish("streamName")
    

Settings

var rtmpStream = RTMPStream(connection: rtmpConnection)

rtmpStream.captureSettings = [
    .fps: 30, // FPS
    .sessionPreset: AVCaptureSession.Preset.medium, // input video width/height
    // .isVideoMirrored: false,
    // .continuousAutofocus: false, // use camera autofocus mode
    // .continuousExposure: false, //  use camera exposure mode
    // .preferredVideoStabilizationMode: AVCaptureVideoStabilizationMode.auto
]
rtmpStream.audioSettings = [
    .muted: false, // mute audio
    .bitrate: 32 * 1000,
]
rtmpStream.videoSettings = [
    .width: 640, // video output width
    .height: 360, // video output height
    .bitrate: 160 * 1000, // video output bitrate
    .profileLevel: kVTProfileLevel_H264_Baseline_3_1, // H264 Profile require "import VideoToolbox"
    .maxKeyFrameIntervalDuration: 2, // keyframe interval in seconds
]
// "0" means the same of input
rtmpStream.recorderSettings = [
    AVMediaType.audio: [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 0,
        AVNumberOfChannelsKey: 0,
        // AVEncoderBitRateKey: 128000,
    ],
    AVMediaType.video: [
        AVVideoCodecKey: AVVideoCodecH264,
        AVVideoHeightKey: 0,
        AVVideoWidthKey: 0,
        /*
        AVVideoCompressionPropertiesKey: [
            AVVideoMaxKeyFrameIntervalDurationKey: 2,
            AVVideoProfileLevelKey: AVVideoProfileLevelH264Baseline30,
            AVVideoAverageBitRateKey: 512000
        ]
        */
    ],
]

// Pass false as the 2nd argument so the capture session does not automatically configure the application's AVAudioSession.
rtmpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio), automaticallyConfiguresApplicationAudioSession: false)

Authentication

var rtmpConnection = RTMPConnection()
rtmpConnection.connect("rtmp://username:password@localhost/appName/instanceName")

Screen Capture

// iOS
rtmpStream.attachScreen(ScreenCaptureSession(shared: UIApplication.shared))
// macOS
rtmpStream.attachScreen(AVCaptureScreenInput(displayID: CGMainDisplayID()))

HTTP Usage

HTTP Live Streaming (HLS). Your iPhone/Mac becomes an IP camera. A basic snippet follows; you can then view the stream at http://ip.address:8080/hello/playlist.m3u8

var httpStream = HTTPStream()
httpStream.attachCamera(DeviceUtil.device(withPosition: .back))
httpStream.attachAudio(AVCaptureDevice.default(for: AVMediaType.audio))
httpStream.publish("hello")

var hkView = HKView(frame: view.bounds)
hkView.attachStream(httpStream)

var httpService = HLSService(domain: "", type: "_http._tcp", name: "HaishinKit", port: 8080)
httpService.startRunning()
httpService.addHTTPStream(httpStream)

// add ViewController#view
view.addSubview(hkView)
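
To check the result from another device on the same network, any HLS-capable player will do; here is a minimal AVPlayer example (replace ip.address with the device's actual IP address):

import AVFoundation

// Plays back the HLS stream published above.
let url = URL(string: "http://ip.address:8080/hello/playlist.m3u8")!
let player = AVPlayer(url: url)
player.play()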

FAQ

How can I run the example project?

git clone https://github.com/shogo4405/HaishinKit.swift.git
cd HaishinKit.swift

carthage bootstrap --use-xcframeworks

open HaishinKit.xcodeproj

Reference

Comments
  • Switching Cameras - Delay mirroring image

    Switching Cameras - Delay mirroring image

    Sometimes when I switch the camera using:

            let position: AVCaptureDevicePosition = currentPosition == .Back ? .Front : .Back
            rtmpStream!.attachCamera(DeviceUtil.deviceWithPosition(position))
            currentPosition = position
    

    the image is shown in the wrong position (as mirror) and after a time (about 30 secs) it becomes normal.

    Did you find this issue? I'm using lf (0.4.1)

    bug 
    opened by migueliOS 18
  • Crash on NetSocket.swift

    Crash on NetSocket.swift

    Hello,

    Describe the bug Crashlytics is reporting a crash on NetSocket.swift line 103 (NetSocket.doOutputProcess(_:maxLength:)).

    To Reproduce I can't reproduce it, but it occurred 10 times in 1 week in production (not many users).

    Smartphone (please complete the following information): Examples of impacted devices:

    • 12.4.1 (16G102) / iPhone 7
    • 12.3.1 (16F203) / iPhone 6s
    • 12.4.2 (16G114) / iPad Air
    • 11.4.1 (15G77) / iPhone 6

    Additional context Here is the logs from Crashlytics :

    Crashed: com.haishinkit.HaishinKit.NetSocket.output
    0  CoreFoundation                 0x1d8cdd3e0 CFHash + 372
    1  CoreFoundation                 0x1d8d70780 CFBasicHashGetCountOfKey + 204
    2  CoreFoundation                 0x1d8cde8bc CFSetContainsValue + 116
    3  CoreFoundation                 0x1d8cd6e18 CFRunLoopRemoveSource + 164
    4  CFNetwork                      0x1d93ebc98 SocketStream::write(__CFWriteStream*, unsigned char const*, long, CFStreamError*) + 592
    5  CoreFoundation                 0x1d8cec0c0 CFWriteStreamWrite + 300
    6  HaishinKit                     0x10138dba4 NetSocket.doOutputProcess(_:maxLength:) + 103 (NetSocket.swift:103)
    7  HaishinKit                     0x10138d98c closure #1 in NetSocket.doOutput(data:locked:) + 49 (NetSocket.swift:49)
    8  HaishinKit                     0x1013355d8 thunk for @escaping @callee_guaranteed () -> () (<compiler-generated>)
    9  libdispatch.dylib              0x1d8788a38 _dispatch_call_block_and_release + 24
    10 libdispatch.dylib              0x1d87897d4 _dispatch_client_callout + 16
    11 libdispatch.dylib              0x1d8732320 _dispatch_lane_serial_drain$VARIANT$mp + 592
    12 libdispatch.dylib              0x1d8732e3c _dispatch_lane_invoke$VARIANT$mp + 428
    13 libdispatch.dylib              0x1d873b4a8 _dispatch_workloop_worker_thread + 596
    14 libsystem_pthread.dylib        0x1d8969114 _pthread_wqthread + 304
    15 libsystem_pthread.dylib        0x1d896bcd4 start_wqthread + 4
    
    crash_info_entry_1
    *** CFHash() called with NULL ***
    

    Do you have any idea how to fix it? Thanks!

    opened by Goule 17
  • Video autorotation

    Video autorotation

    Video is automatically rotated, but in player its always scaled up to full screen - in landscape this is correct behavior but after rotation to portrait I believe black area should appear on left and right so portrait video is showing in full size scaled down to fit size. Could you please correct me if I understand autorotation wrong? Thanks.

    opened by akovalov 16
  • Choppy audio after updating to iOS 13

    Choppy audio after updating to iOS 13

    Describe the bug I'm running the example iOS app in iOS 13 and streaming with Mux (tried it with Wowza as well) and am having issues with the audio being incredibly choppy. Using different bitrates is not fixing the issue. This was not an issue in iOS 12.

    To Reproduce Steps to reproduce the behavior:

    1. Launch HaishinKit example on device running iOS13
    2. Start stream
    3. Listen to stream playback, it is choppy

    Expected behavior Clear/smooth audio

    Desktop (please complete the following information):

    • OS: MacOS
    • XCode 11.2.1

    Smartphone (please complete the following information):

    • Device: iPhone X
    • OS: iOS 13.3

    Additional context This problem was not occurring in iOS12.

    opened by hraza-simublade 15
  • AWS Media Elements Live

    AWS Media Elements Live

    Hi @shogo4405

    I am trying to get setup with AWS Media Live.

    I have a RTMP url here:

    rtmp: rtmp://52.6.106.57:1935/app key: testing

    And I have an endpoint here:

    m3u8: https://cf98fa7b2ee4450e.mediapackage.us-east-1.amazonaws.com/out/v1/94943f7bdb5d45ae85ecd327928cc302/index.m3u8

    I have tested with (OBS) open broadcast software and it works and will start displaying on the .m3u8 file in Safari I am trying to debug what the issue might be when transferring over to this framework I know I am probably missing something in AWS but not sure.

    The output I get from the log is:

    [Error] [com.haishinkit.HaishinKit] [RTMPMessage.swift:320] payload > AMF0Serializer{data: 236 bytes,position: 140,reference: HaishinKit.AMFReference}
    

    I am connecting like this.

    rtmpStream.publish("testing")
    rtmpConnection.connect("rtmp://52.206.199.219:1935/app")
    

    I will leave the urls open if you could suggest any possible issues that would be great

    Thanks

    bug 
    opened by samueleastdev 14
  • Can't connect to Periscope.

    Can't connect to Periscope.

    hello, i'm trying to use this lib on my iphone 6S but i keep getting this error "inSourceFormat > nil" of the AACEncoder.swift . what should i do?

    opened by RamzyChatti90 14
  • RTMP Server URL is failing at handshake on iOS device.

    RTMP Server URL is failing at handshake on iOS device.

    Describe the bug I am trying to connect to RTMP server URL of mux: rtmp://global-live.mux.com:5222/app Doc link: https://docs.mux.com/docs/live-streaming

    The connect request is getting timed out and failing at handshake.

    XCode version: 11.5 iOS device version: 13.5.1

    I even checked in your Example- iOS by setting uri to some rtmp URL in your Preference.swift it doesn't work. eg. You can check by replacing with the above URL: rtmp://global-live.mux.com:5222/app

    To Reproduce Steps to reproduce the behavior:

    1. Go to Preference.swift
    2. Replace uri with rtmp://global-live.mux.com:5222/app
    3. Handshake failed.

    Expected behavior The handshake should succeed.

    Smartphone (please complete the following information):

    • Device: iPhone 11 pro
    • OS: 13.5

    Additional context Add any other context about the problem here.

    opened by SidPack 13
  • Stream is always portrait in Broadcast Upload Extension

    Stream is always portrait in Broadcast Upload Extension

    Stream is always portrait in Broadcast Upload Extension

    I used the exact copy of Examples/iOS/Screencast

    When I enter games that are landscape I expect the output to be landscape as well. But it's not.

    I tried

    broadcaster.stream.syncOrientation = true
    // And
    broadcaster.stream.orientation = .landscapeLeft
    

    both does not work

    question 
    opened by arslan2012 13
  • How to Local Record Docs?

    How to Local Record Docs?

    Hi @shogo4405

    Do you have a working version of local recording?

    I have followed the docs in the Readme.md but cannot seem to get a working version.

    I have posted about the didFinishWriting method not being called in another issue just wanted to know if its worth still trying to find a solution for this or if it is best to wait for an update?

    Any info would be helpful.

    Thanks

    opened by samueleastdev 13
  • App crash getting converter in AudioCodec

    App crash getting converter in AudioCodec

    Describe the bug

    My app crash because of converter being nil when trying to get it in a specific configuration.

    For my app, I need to use a playAndRecord category and a videoChat mode. I need to do it because, alongside my RTMP audio & video stream, I have WebRTC connections to make an audio call.

    So I improve voice isolation & stuff using videoChat mode.

    Now I have different results considering if I change something in HaishinKit's code.

    • If I change nothing, it just crashes
    • If I set defaultChannels property in AudioCodec to 2, it works without crashing, but when I download the video+audio I sent to my backend, I only have sound in my left ear.
    • If I set defaultChannels property in AudioCodec to 1, then it works without crashing, but sound is now in mono, and I'd like to be able to support true stereo since my app works use external mics.

    To Reproduce

    I just launch my RTMP connection & stream and the app crashes.

    Expected behavior

    It should work without crashing.

    Version

    latest

    Smartphone info.

    iPhone 13 mini. Latest version of iOS

    Additional context

    It seems this code works fine in production on iOS 15 but this version of my app used a version of HaishinKit from December 2021. I updated it last week to be prepared for iOS 16 and now I have this crash.

    Screenshots

    No response

    Relevant log output

    No response

    duplicate wontfix 
    opened by CedricEugeni 12
  • Infinite listen loop

    Infinite listen loop

    I'm having problems connecting with a rtmp server implementing this protocol (I don't have access to its implementation: http://wwwimages.adobe.com/content/dam/Adobe/en/devnet/rtmp/pdf/rtmp_specification_1.0.pdf

    Execution enters an infinite loop (the listen function) after the handshake. The problem comes in the last listen call:

    if (position < bytes.count) {
                listen(bytes: Array(bytes[position..<bytes.count]))
            }
    

    The current chunk is not null, and position is always being set to 0.

    Any idea?

    LOG:

    2017-04-03 18:56:28.195 [Info] [AACEncoder.swift:81] inSourceFormat > nil 2017-04-03 18:56:28.204 [Info] [VideoIOComponent.swift:32] fps > (30.0, __C.CMTime(value: 100, timescale: 3000, flags: __C.CMTimeFlags(rawValue: 1), epoch: 0)) 2017-04-03 18:56:28.212 [Info] [VideoIOComponent.swift:32] fps > (30.0, __C.CMTime(value: 100, timescale: 3000, flags: __C.CMTimeFlags(rawValue: 1), epoch: 0)) 2017-04-03 18:56:39.602 [Error] [RTMPMessage.swift:318] payload > AMF0Serializer{bytes:[2, 0, 11, 111, 110, 70, 67, 80, 117, 98, 108, 105, 115, 104],position:14,reference:lf.AMFReference} 2017-04-03 18:56:39.787 [Error] [RTMPMessage.swift:59] create > 0 2017-04-03 18:56:39.791 [Error] [RTMPChunk.swift:249] bytes > [0, 0, 0, 0, 0, 1, 3, 0, 0, 0, 0, 0, 196, 20, 1, 0, 0, 0, 2, 0, 8, 111, 110, 83, 116, 97, 116, 117, 115, 0, 0, 0, 0, 0, 0, 0, 0, 0, 5, 3, 0, 5, 108, 101, 118, 101, 108, 2, 0, 6, 115, 116, 97, 116, 117, 115, 0, 4, 99, 111, 100, 101, 2, 0, 23, 78, 101, 116, 83, 116, 114, 101, 97, 109, 46, 80, 117, 98, 108, 105, 115, 104, 46, 83, 116, 97, 114, 116, 0, 11, 100, 101, 115, 99, 114, 105, 112, 116, 105, 111, 110, 2, 0, 43, 78, 84, 70, 105, 77, 68, 65, 119, 90, 106, 70, 108, 77, 68, 81, 52, 77, 84, 73, 48, 89, 49, 57, 102, 77, 81, 32, 105, 115, 32, 110, 111, 119, 32, 112, 117, 98, 108, 105, 115, 104, 101, 195, 100, 0, 7, 100, 101, 116, 97, 105, 108, 115, 2, 0, 26, 78, 84, 70, 105, 77, 68, 65, 119, 90, 106, 70, 108, 77, 68, 81, 52, 77, 84, 73, 48, 89, 49, 57, 102, 77, 81, 0, 8, 99, 108, 105, 101, 110, 116, 105, 100, 2, 0, 13, 76, 97, 118, 102, 53, 55, 46, 53, 55, 46, 49, 48, 48, 0, 0, 9]

    opened by jbiscarri 12
  • Improved performance PiP mode for the MultiCamCaptureSetting.

    Improved performance PiP mode for the MultiCamCaptureSetting.

    Description & motivation

    • Improved performance PiP mode for the MultiCamCaptureSetting.

    Type of change

    • [x] Bug fix (non-breaking change which fixes an issue)

    Screenshots:

    Before / After screenshot comparison (スクリーンショット 2023-01-01 23 38 54 / 23 37 38).

    • iPadAir 5th + iPadOS16.2
    opened by shogo4405 0
  • failed assertion with 'Texture Descriptor Validation' on macOS 13.0

    failed assertion with 'Texture Descriptor Validation' on macOS 13.0

    Describe the bug

    Hello,

    Since the macOS Ventura update we are unable to run any stream due to a Metal assertion:

    -[MTLDebugDevice newTextureWithDescriptor:iosurface:plane:]:2403: failed assertion `Texture Descriptor Validation
    IOSurface textures must use MTLStorageModeShared
    

    I find this on the developers forums : https://developer.apple.com/forums/thread/710843

    It looks like a bug with macOS Ventura 13.0, but if anyone has a solution, I'm interested!

    To Reproduce

    1. Run the "Exemple iOS+SwiftUI" exemple with "My Mac (designed for iPad)"
    2. Accept permissions
    3. Restart and see the assertion

    Expected behavior

    Working on Silicon device with (Designed for iPad)

    Version

    • MacOS 13.0
    • HaishinKit.swift 1.0

    Smartphone info.

    No response

    Additional context

    No response

    Screenshots

    Screenshot 2022-11-05 at 10 51 45

    Relevant log output

    No response

    opened by floriangbh 4
  • Crash at resize method call of Circular Buffer

    Crash at resize method call of Circular Buffer

    Describe the bug

    Getting app crashed while streaming

    To Reproduce

    random crash

    Expected behavior

    It shouldn't crash

    Version

    v: 1.3.0

    Smartphone info.

    iPhone SE (2nd Gen) iOS: 16.0.0

    Additional context

    Log from Crashlytics (same trace as under Relevant log output below).

    Screenshots

    No response

    Relevant log output

    Crashed: com.haishinkit.HaishinKit.NetSocket.output
    0  Foundation                     0x41cacc Data.InlineSlice.replaceSubrange(:with:count:) + 224
    1  Foundation                     0x4239a4 Data.Representation.replaceSubrange(:with:count:) + 672
    2  HaishinKit                     0x62d04 NetSocket.CircularBuffer.resize(_:) + 964 (<compiler-generated>:964)
    3  HaishinKit                     0x616e0 closure #2 in NetSocket.doOutput(data:locked:) + 104
    4  HaishinKit                     0x1ac5c thunk for @escaping @callee_guaranteed () -> () + 28 (<compiler-generated>:28)
    5  libdispatch.dylib              0x24b4 _dispatch_call_block_and_release + 32
    6  libdispatch.dylib              0x3fdc _dispatch_client_callout + 20
    7  libdispatch.dylib              0xb694 _dispatch_lane_serial_drain + 672
    8  libdispatch.dylib              0xc1e0 _dispatch_lane_invoke + 384
    9  libdispatch.dylib              0x16e10 _dispatch_workloop_worker_thread + 652
    10 libsystem_pthread.dylib        0xdf8 _pthread_wqthread + 288
    11 libsystem_pthread.dylib        0xb98 start_wqthread + 8
    
    opened by ParvinderjitSF 5
  • When I set pause and resume, mute will not work

    When I set pause and resume, mute will not work

    Steps to reproduce the behavior:

    1. Set audioSettings muted "true"
    2. Set rtmpStream.paused "true"
    3. Set rtmpStream.paused "false"
    4. Client will receive sound but the audioSettings muted still "true"
    • Device: iPhoneX
    • OS: iOS15.5
    bug 
    opened by jexwang 0
  • 1080p capture stutters on iPhone 12 Pro/Max

    1080p capture stutters on iPhone 12 Pro/Max

    Describe the bug The 1080p capture stutters on the iPhone 12 Pro/Max. Stuttering affects preview, local recording, and stream.

    To Reproduce Steps to reproduce the behavior:

    1. Clone iOS Example
    2. Change rtmpStream.captureSettings.sessionPreset to be AVCaptureSession.Preset.hd1920x1080
    3. Run project on iPhone
    4. Switch back/front camera until issue occurs.

    Expected behavior Preview, local recording, and stream should be smooth. No stuttering

    Screenshots Please see attached video with reproduction steps and the stutter: https://user-images.githubusercontent.com/67028273/106406758-23917180-63ef-11eb-936e-a7dfb66c179e.MP4

    Smartphone

    • Device: iPhone 12 Pro and iPhone 12 Pro Max
    • OS: iOS14 (newest)
    • Version: HaishinKit.swift Master

    Additional context Does not happen on iPhone 12 (non Pro/Max). Or any other iPhone devices I have tried.

    Very strange: sometimes zooming in will solve the issue. But will re-occur after zoom out.

    bug 
    opened by CubitSpeed 1
  • Audio Interruption causes AVSync disparity

    Audio Interruption causes AVSync disparity

    Audio interruptions cause AV sync issues. Because video frames continue to be sent while audio frames are not sent, remote playback now has disparity between audio and video.

    To Reproduce Launch the sample app, call the phone and let it ring, then observe broadcast AV sync issues after it recovers.

    Expected behavior Timestamps provided in the samplebuffers inform the encoder of how to synchronize buffers

    bug 
    opened by ryango 2
Releases(1.4.1)
  • 1.4.1(Dec 30, 2022)

    What's Changed

    • Fix spm compile error 1.4.0 by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1112

    Full Changelog: https://github.com/shogo4405/HaishinKit.swift/compare/1.4.0...1.4.1

  • 1.4.0(Dec 27, 2022)

    Supports two camera video sources: a picture-in-picture display that overlays the secondary camera on the primary camera, and a split-screen display that arranges the cameras horizontally or vertically. Please try it and report any issues. Thank you.

    What's Changed

    • Suppress warnings SwiftLint by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1065
    • Rename AudioCodec sub classes. by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1067
    • Bump fastlane from 2.210.0 to 2.210.1 by @dependabot in https://github.com/shogo4405/HaishinKit.swift/pull/1068
    • Bump sqlite3 from 1.5.0 to 1.5.2 by @dependabot in https://github.com/shogo4405/HaishinKit.swift/pull/1073
    • Fix: session is reinitialized every frame by @leo150 in https://github.com/shogo4405/HaishinKit.swift/pull/1079
    • Add hasAudio, hasVideo options. by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1081
    • Bump fastlane from 2.210.1 to 2.211.0 by @dependabot in https://github.com/shogo4405/HaishinKit.swift/pull/1086
    • Bring back scaling mode setting by @leo150 in https://github.com/shogo4405/HaishinKit.swift/pull/1087
    • Remove DeviceUtil#device method. by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1094
    • NSError -> Error by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1096
    • Invert hasAudio and hasVideo values by @leo150 in https://github.com/shogo4405/HaishinKit.swift/pull/1098
    • [Technical preview] Initial support an AVCaptureMultiCamSession. by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1097
    • Support multi camera capture on macOS. by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1100
    • Rename AVIOUnit to IOUnit. by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1101
    • Support background audio stream for a framework layer. by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1104
    • Fix: add audio sample rate by @daveisfera in https://github.com/shogo4405/HaishinKit.swift/pull/1106
    • Redesign AVCaptureSesion capture options. by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1107
    • Support isMultitaskingCameraAccessEnabled. by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1109

    New Contributors

    • @daveisfera made their first contribution in https://github.com/shogo4405/HaishinKit.swift/pull/1106

    Full Changelog: https://github.com/shogo4405/HaishinKit.swift/compare/1.3.0...1.4.0

  • 1.3.0(Sep 17, 2022)

    Related issues

    • https://github.com/shogo4405/HaishinKit.swift/issues?q=is%3Aclosed+milestone%3A1.3.0

    Migration Guide

    What's Changed

    • Remove HKPictureInPictureController by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1033
    • Remove RTMPStreamDelegate default implement. by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1034
    • Support local recording without publish. by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1035
    • Bump fastlane from 2.207.0 to 2.208.0 by @dependabot in https://github.com/shogo4405/HaishinKit.swift/pull/1038
    • add spm build check by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1040
    • Remove stop recording on stop publishing by @Goule in https://github.com/shogo4405/HaishinKit.swift/pull/1041
    • Update VideoCodec.swift Queue Name by @CubitSpeed in https://github.com/shogo4405/HaishinKit.swift/pull/1043
    • add PiPHKView macOS. by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1045
    • Advanced B-Frame compatibility by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1046
    • Bump fastlane from 2.208.0 to 2.209.0 by @dependabot in https://github.com/shogo4405/HaishinKit.swift/pull/1047
    • fix #869 Local record audio desynchronization on camera switch by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1048
    • Feature: Expose key frame reordering as an option by @allan-o3h in https://github.com/shogo4405/HaishinKit.swift/pull/1050
    • Bump fastlane from 2.209.0 to 2.209.1 by @dependabot in https://github.com/shogo4405/HaishinKit.swift/pull/1053
    • Bump up Logboard to 2.3.0 by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1061
    • Bump to support version up iOS11, tvOS11. by @shogo4405 in https://github.com/shogo4405/HaishinKit.swift/pull/1063

    New Contributors

    • @Goule made their first contribution in https://github.com/shogo4405/HaishinKit.swift/pull/1041
    • @CubitSpeed made their first contribution in https://github.com/shogo4405/HaishinKit.swift/pull/1043
    • @allan-o3h made their first contribution in https://github.com/shogo4405/HaishinKit.swift/pull/1050

    Full Changelog: https://github.com/shogo4405/HaishinKit.swift/compare/1.2.7...1.3.0

  • 1.2.7(Jul 14, 2022)

  • 1.2.6(Jul 10, 2022)

  • 1.2.5(Jul 5, 2022)

  • 1.2.4(Jun 25, 2022)

  • 1.0.7(Mar 15, 2020)
