LaiFeng iOS Live Kit: H.264 and AAC hardware encoding, GPUImage beauty filters, RTMP transport, frame dropping on weak networks, and dynamic bitrate switching.


LFLiveKit


LFLiveKit is an open-source RTMP streaming SDK for iOS.

Features

  • Background recording
  • Portrait and landscape recording
  • Beauty face filters via GPUImage
  • H.264 + AAC hardware encoding
  • Frame dropping on weak networks
  • Dynamic bitrate switching
  • Audio configuration
  • Video configuration
  • RTMP transport
  • Camera position switching (front/back)
  • Audio muting
  • Send buffer support
  • Watermark support
  • Swift support
  • Video-only or audio-only streaming
  • External video/audio input (screen recording or peripherals)
  • FLV packaging and sending

Requirements

- iOS 7.0+
- Xcode 7.3+

Installation

CocoaPods

# To integrate LFLiveKit into your Xcode project using CocoaPods, specify it in your Podfile:

source 'https://github.com/CocoaPods/Specs.git'
platform :ios, '7.0'
pod 'LFLiveKit'

# Then, run the following command:
$ pod install

Carthage

1. Add `github "LaiFengiOS/LFLiveKit"` to your Cartfile.
2. Run `carthage update --platform ios` and add the framework to your project.
3. Import `<LFLiveKit/LFLiveKit.h>`.

Manually

1. Download all the files in the `LFLiveKit` subdirectory.
2. Add the source files to your Xcode project.
3. Link with required frameworks:
    * UIKit
    * Foundation
    * AVFoundation
    * VideoToolbox
    * AudioToolbox
    * libz
    * libstdc++

Usage example

Objective-C

- (LFLiveSession*)session {
	if (!_session) {
	    _session = [[LFLiveSession alloc] initWithAudioConfiguration:[LFLiveAudioConfiguration defaultConfiguration] videoConfiguration:[LFLiveVideoConfiguration defaultConfiguration]];
	    _session.preView = self;
	    _session.delegate = self;
	}
	return _session;
}

- (void)startLive {	
	LFLiveStreamInfo *streamInfo = [LFLiveStreamInfo new];
	streamInfo.url = @"your server rtmp url";
	[self.session startLive:streamInfo];
}

- (void)stopLive {
	[self.session stopLive];
}

// MARK: - LFLiveSessionDelegate callbacks
- (void)liveSession:(nullable LFLiveSession *)session liveStateDidChange:(LFLiveState)state;
- (void)liveSession:(nullable LFLiveSession *)session debugInfo:(nullable LFLiveDebug *)debugInfo;
- (void)liveSession:(nullable LFLiveSession *)session errorCode:(LFLiveSocketErrorCode)errorCode;

Swift

// Import LFLiveKit in [ProjectName]-Bridging-Header.h
#import <LFLiveKit/LFLiveKit.h>

//MARK: - Getters and Setters
lazy var session: LFLiveSession = {
	let audioConfiguration = LFLiveAudioConfiguration.defaultConfiguration()
	let videoConfiguration = LFLiveVideoConfiguration.defaultConfigurationForQuality(LFLiveVideoQuality.Low3, landscape: false)
	let session = LFLiveSession(audioConfiguration: audioConfiguration, videoConfiguration: videoConfiguration)
	    
	session?.delegate = self
	session?.preView = self.view
	return session!
}()

//MARK: - Event
func startLive() -> Void { 
	let stream = LFLiveStreamInfo()
	stream.url = "your server rtmp url";
	session.startLive(stream)
}

func stopLive() -> Void {
	session.stopLive()
}

//MARK: - Callback
func liveSession(session: LFLiveSession?, debugInfo: LFLiveDebug?) 
func liveSession(session: LFLiveSession?, errorCode: LFLiveSocketErrorCode)
func liveSession(session: LFLiveSession?, liveStateDidChange state: LFLiveState)
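A minimal sketch of implementing these callbacks (Swift 3 naming; `LiveViewController` is a placeholder, and the LFLiveState case names and the `currentBandwidth` property are assumptions to verify against LFLiveSession.h and LFLiveDebug.h):

extension LiveViewController: LFLiveSessionDelegate {

    func liveSession(_ session: LFLiveSession?, liveStateDidChange state: LFLiveState) {
        switch state {
        case .ready:   print("Ready to stream")
        case .pending: print("Connecting...")
        case .start:   print("Streaming")
        case .stop:    print("Stopped")
        case .error:   print("Connection error")
        case .refresh: print("Reconnecting")
        }
    }

    func liveSession(_ session: LFLiveSession?, debugInfo: LFLiveDebug?) {
        // Periodic upload statistics; handy for an on-screen debug overlay.
        print("upload: \(debugInfo?.currentBandwidth ?? 0) bytes/s")
    }

    func liveSession(_ session: LFLiveSession?, errorCode: LFLiveSocketErrorCode) {
        print("socket error: \(errorCode.rawValue)")
    }
}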

Release History

* 2.0.0
    * CHANGE: bug fixes; support live streaming on iOS 7.
* 2.2.4.3
    * CHANGE: bug fixes; support Swift import.
* 2.5
    * CHANGE: bug fixes; support bitcode.

License

LFLiveKit is released under the MIT license. See LICENSE for details.

Comments
  • How do I add a watermark?

    - (void)setWaterMark {
        GPUImageAlphaBlendFilter *blendFilter = [[GPUImageAlphaBlendFilter alloc] init];
        blendFilter.mix = 1.0;
        NSDate *startTime = [NSDate date];

        UILabel *timeLabel = [[UILabel alloc] initWithFrame:CGRectMake(0.0, 0.0, 240.0f, 320.0f)];
        timeLabel.font = [UIFont systemFontOfSize:17.0f];
        timeLabel.text = @"Time: 0.0 s";
        timeLabel.textAlignment = NSTextAlignmentCenter;
        timeLabel.backgroundColor = [UIColor clearColor];
        timeLabel.textColor = [UIColor whiteColor];

        GPUImageUIElement *uiElementInput = [[GPUImageUIElement alloc] initWithView:timeLabel];

        [_filter addTarget:blendFilter];
        [uiElementInput addTarget:blendFilter];
        [blendFilter addTarget:_gpuImageView];

        __unsafe_unretained GPUImageUIElement *weakUIElementInput = uiElementInput;
        __weak typeof(self) _self = self;
        [_filter setFrameProcessingCompletionBlock:^(GPUImageOutput *filter, CMTime frameTime) {
            timeLabel.text = [NSString stringWithFormat:@"Time: %f s", -[startTime timeIntervalSinceNow]];
            [weakUIElementInput update];
            [_self processVideo:filter];
        }];
        [_videoCamera addTarget:_gpuImageView];
    }

    This is the code I use to add the watermark, but it crashes at [weakUIElementInput update].
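    A likely cause: nothing retains uiElementInput after setWaterMark returns, and the __unsafe_unretained pointer then dangles, so the frame-processing block calls update on a deallocated object. The usual fix is to keep a strong reference for the stream's lifetime; a minimal Swift sketch of that shape (assuming GPUImage's Objective-C classes bridge as usual; the class and method names here are invented for illustration):

    import UIKit
    import GPUImage

    final class WatermarkController {
        // Strong references keep the GPUImage nodes alive for the stream's lifetime.
        private let blendFilter = GPUImageAlphaBlendFilter()
        private var uiElementInput: GPUImageUIElement!

        func attachWatermark(_ overlay: UIView, to filter: GPUImageOutput, target: GPUImageView) {
            blendFilter.mix = 1.0
            uiElementInput = GPUImageUIElement(view: overlay)
            filter.addTarget(blendFilter)
            uiElementInput.addTarget(blendFilter)
            blendFilter.addTarget(target)
            filter.frameProcessingCompletionBlock = { [weak self] _, _ in
                self?.uiElementInput.update()   // safe: self retains the element
            }
        }
    }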

    opened by XFoxer 14
  • 1.8.0 demo: after stopping the live stream, it still auto-connects to the server and reports a connected state

    Also, if the server is not running, tapping "start live" and then tapping it again crashes in LFStreamRtmpSocket.m:

    - (void)_stop{
        if(self.delegate && [self.delegate respondsToSelector:@selector(socketStatus:status:)]){
            [self.delegate socketStatus:self status:LFLiveStop];
        }
        if(_rtmp != NULL){
            PILI_RTMP_Close(_rtmp, &_error);
            PILI_RTMP_Free(_rtmp);
            _rtmp = NULL;
        }
    } 
    
    opened by twenty-zp 12
  • Accept other video input

    Great library! I wonder if you could make it more flexible by letting users specify the video input: for example, making LFVideoCapture a protocol and letting LFLiveSession accept any LFVideoCapture implementation. The implementation could call - (void)captureOutput:(nullable LFVideoCapture*)capture pixelBuffer:(nullable CVImageBufferRef)pixelBuffer; to supply the video input.

    Thanks!
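    For illustration, a hypothetical sketch of the protocol being proposed (the names VideoCapturing, onPixelBuffer, and ScreenCapture are invented here, not part of LFLiveKit's API):

    import CoreVideo

    // Hypothetical: a pluggable capture abstraction LFLiveSession could accept.
    protocol VideoCapturing: AnyObject {
        var onPixelBuffer: ((CVPixelBuffer) -> Void)? { get set }
        func start()
        func stop()
    }

    // Any producer (camera, ReplayKit, a file) could conform and feed frames
    // to the session the way captureOutput:pixelBuffer: does today.
    final class ScreenCapture: VideoCapturing {
        var onPixelBuffer: ((CVPixelBuffer) -> Void)?
        func start() { /* begin producing CVPixelBuffers */ }
        func stop() { }
    }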

    opened by ltebean 10
  • pop controller crash

    class PublishLiveViewController: UIViewController {
        
        @IBOutlet weak var livePathInput: UITextField!
        
        lazy var session: LFLiveSession = {
            let audioConfiguration = LFLiveAudioConfiguration.default()
            let videoConfiguration = LFLiveVideoConfiguration.defaultConfiguration(for: .high3)
            let session = LFLiveSession(audioConfiguration: audioConfiguration, videoConfiguration: videoConfiguration)
            session?.captureDevicePosition = .back
            session?.running = true
            session?.delegate = self
            session?.preView = self.view
            return session!
        }()
    
        
        
        override func viewDidLoad() {
            super.viewDidLoad()
    
            
        }
        
        override func viewDidDisappear(_ animated: Bool) {
            super.viewDidDisappear(animated)
            self.stopLive()
        }
        func startLive() -> Void {
            let stream = LFLiveStreamInfo()
            stream.url = livePathInput.text;
            livePathInput.isHidden = true
            session.preView = view
            session.startLive(stream)
        }
        
        func stopLive() -> Void {
            session.preView = nil
            livePathInput.isHidden = false
            session.stopLive()
        }
        
        @IBAction func publishLive(_ sender: UIButton) {
            
            startLive()
        }
        
        @IBAction func stopPublishLive(_ sender: UIButton) {
            
            stopLive()
        }
        
        @IBAction func switchCamera(_ sender: UIButton) {
            
            session.captureDevicePosition = session.captureDevicePosition == .back ? .front : .back;
        }
    
    }
    

    The crash is in LFAudioCapture.m:

    - (void)dealloc {
        [[NSNotificationCenter defaultCenter] removeObserver:self];
    
         dispatch_async(self.taskQueue, ^{
            if (self.componetInstance) { // Crash code
                self.isRunning = NO;
                AudioOutputUnitStop(self.componetInstance);
                AudioComponentInstanceDispose(self.componetInstance);
                self.componetInstance = nil;
                self.component = nil;
            }
       });
    }
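    A hedged note: dealloc dispatches asynchronously onto taskQueue and captures self, so the block can run after deallocation has begun, which matches this crash. Until that is fixed in the library, a caller-side workaround consistent with the code above is to stop capture before the controller goes away, so dealloc has nothing left to tear down:

    override func viewDidDisappear(_ animated: Bool) {
        super.viewDidDisappear(animated)
        stopLive()
        session.running = false   // stop audio/video capture before the session deallocates
    }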
    
    opened by 0x1306a94 9
  • LFLiveKit fails to compile when ReactiveCocoa is also added via CocoaPods

    As the title says: when my project's Podfile includes both ReactiveCocoa and LFLiveKit, LFLiveKit fails to compile.

    My Podfile looks like this:

    source 'https://github.com/CocoaPods/Specs.git'
    platform :ios, "8.0"

    use_frameworks!

    target 'test' do
      pod 'LFLiveKit'
      pod 'ReactiveCocoa'
    end

    opened by lyandy 9
  • High latency (6-8 seconds)

    In my tests under normal use, a change in the source video takes 6 seconds, sometimes more than 8, to show up on the playback side. What causes this latency, and are there parameters I can tune?

    My test setup: source device on iOS 8.4.1 with LFLiveKit 2.6; media server: Nginx + rtmp-module; player: VLC. Everything is on the same router's network, with nginx and VLC on the same PC. In Xcode 8.1's output I see an upload speed around 100 KB/s, so bandwidth does not look like the bottleneck.

    2017-02-27 13:34:58.599 EasyLiveVideo[1235:145634] debugInfo uploadSpeed: 113.2 KB/s
    2017-02-27 13:34:59.621 EasyLiveVideo[1235:145634] debugInfo uploadSpeed: 103.3 KB/s
    2017-02-27 13:35:00.678 EasyLiveVideo[1235:145634] debugInfo uploadSpeed: 107.8 KB/s
    2017-02-27 13:35:01.687 EasyLiveVideo[1235:145634] debugInfo uploadSpeed: 86.5 KB/s
    2017-02-27 13:35:02.708 EasyLiveVideo[1235:145634] debugInfo uploadSpeed: 56.7 KB/s
    2017-02-27 13:35:03.206 EasyLiveVideo[1235:145634] Increase bitrate 969200
    2017-02-27 13:35:03.730 EasyLiveVideo[1235:145634] debugInfo uploadSpeed: 62.7 KB/s
    2017-02-27 13:35:04.764 EasyLiveVideo[1235:145634] debugInfo uploadSpeed: 36.9 KB/s
    2017-02-27 13:35:05.768 EasyLiveVideo[1235:145634] debugInfo uploadSpeed: 60.1 KB/s

    These are the parameters I use:

    - (LFLiveSession *)session {
        if (!_session) {
            @try {
                LFLiveAudioConfiguration *audioConfiguration = [LFLiveAudioConfiguration new];
                audioConfiguration.numberOfChannels = 2;
                audioConfiguration.audioBitrate = LFLiveAudioBitRate_32Kbps;
                audioConfiguration.audioSampleRate = LFLiveAudioSampleRate_16000Hz;

                LFLiveVideoConfiguration *videoConfiguration = [LFLiveVideoConfiguration defaultConfigurationForQuality:LFLiveVideoQuality_High1];
                videoConfiguration.videoSize = CGSizeMake(960, 540);
                videoConfiguration.videoBitRate = 800*1024;
                videoConfiguration.videoMaxBitRate = 1200*1024;
                videoConfiguration.videoMinBitRate = 400*1024;
                videoConfiguration.videoFrameRate = 10;
                videoConfiguration.videoMaxFrameRate = 15;
                videoConfiguration.videoMinFrameRate = 5;
                videoConfiguration.videoMaxKeyframeInterval = 20;
                videoConfiguration.outputImageOrientation = [[UIApplication sharedApplication] statusBarOrientation];
                videoConfiguration.sessionPreset = LFCaptureSessionPreset540x960;

                _session = [[LFLiveSession alloc] initWithAudioConfiguration:audioConfiguration videoConfiguration:videoConfiguration];
                _session.adaptiveBitrate = YES;
                _session.reconnectInterval = 8;
                _session.reconnectCount = 6;

                _session.saveLocalVideo = YES;
                NSString *cachesDir = [NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) objectAtIndex:0];
                NSString *pathToMovie = [cachesDir stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.mp4", [ELVView dataAndTimeNow]]];
                unlink([pathToMovie UTF8String]); // If a file already exists, AVAssetWriter won't record new frames, so delete the old movie
                _session.saveLocalVideoPath = [NSURL fileURLWithPath:pathToMovie];

                _session.showDebugInfo = YES;

                // Set the delegate
                _session.delegate = self;
                _session.running = YES;
                _session.preView = self;
                _session.captureDevicePosition = AVCaptureDevicePositionBack;
            } @catch (NSException *exception) {
            } @finally {
            }
        }
        return _session;
    }

    By the way, this library is excellent. Thanks!
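    One hedged observation: with videoFrameRate = 10 and videoMaxKeyframeInterval = 20, a keyframe is produced only every two seconds, and VLC's default network caching adds several more seconds, so most of the 6-8 s is likely player-side buffering rather than uplink delay. A sketch of settings worth experimenting with (the same LFLiveVideoConfiguration properties used above, Swift 3 naming):

    let video = LFLiveVideoConfiguration.defaultConfiguration(for: .high1)
    video?.videoFrameRate = 15
    video?.videoMaxKeyframeInterval = 15   // roughly a 1 s GOP at 15 fps, so players sync sooner
    // On the player side, lower VLC's network caching (e.g. --network-caching=1000).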

    opened by dungeonsnd 8
  • Is there a way to access each video frame, and is it possible to overlay objects onto frames?

    I am new to this; this is an iOS question.

    I wonder if there is a way to access the image of each video frame. There is a UIImage *currentImage property in LFLiveSession, but the LFLiveSession delegate callback only fires when the live state changes.

    If I can access each video frame, I would like to overlay objects onto it. Is this possible?

    Thanks
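    The feature list above mentions external video/audio input, which suggests one hedged route: create the session with LFLiveKit's external-capture type (see the captureType variant of LFLiveSession's initializer), run your own AVCaptureSession, draw overlays into each CVPixelBuffer, and push the result. A sketch, assuming LFLiveSession's pushVideo(_:) external-input method and a hypothetical drawOverlay(on:) helper:

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        drawOverlay(on: pixelBuffer)      // your own Core Graphics / Metal compositing
        session.pushVideo(pixelBuffer)    // hand the composited frame to LFLiveKit
    }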

    opened by lzzhangar 7
  • How to crop and keep aspect ratio?

    I want to crop while keeping the aspect ratio, in Swift 3.1.

    let audioConfiguration = LFLiveAudioConfiguration.defaultConfiguration(for: .medium)
    let videoConfiguration = LFLiveVideoConfiguration.defaultConfiguration(for: .medium3, outputImageOrientation: .portrait)
    videoConfiguration?.videoSize = CGSize(width: 720, height: 360)
    videoConfiguration?.videoSizeRespectingAspectRatio = false
    
    let session = LFLiveSession(audioConfiguration: audioConfiguration, videoConfiguration: videoConfiguration)!
    session.delegate = self
    session.captureDevicePosition = .back
    session.preView = self.cameraView
    

    But I can't crop while keeping the aspect ratio.

    The following issues did not solve it for me:

    • https://github.com/LaiFengiOS/LFLiveKit/issues/86
    • https://github.com/LaiFengiOS/LFLiveKit/issues/166

    Please teach me 🙏

    opened by roana0229 7
  • ERROR: PILI_RTMP_ReadPacket, failed to read PILI_RTMP packet header

    Hey,

    I started testing LFLiveKit with the example code and I get this error: ERROR: PILI_RTMP_ReadPacket, failed to read PILI_RTMP packet header

    What's PILI RTMP?

    I would like to stream to YouTube, I use this code to start streaming:

    let stream = LFLiveStreamInfo()
    stream.url = "rtmp://a.rtmp.youtube.com/live2"
    stream.streamId = "xxxx-xxxx-xxxx-xxxx"
    session.startLive(stream)
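    Two hedged notes: PILI_RTMP comes from pili-librtmp, the RTMP client library that LFLiveKit vendors for transport (visible under Vendor/pili-librtmp.framework). And YouTube's ingest usually expects the stream key as the last path component of the URL rather than a separate field, so it may be worth trying:

    let stream = LFLiveStreamInfo()
    // Assumption: append the key to the ingest URL instead of setting streamId.
    stream.url = "rtmp://a.rtmp.youtube.com/live2/xxxx-xxxx-xxxx-xxxx"
    session.startLive(stream)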
    

    Configuration: Xcode 8 Swift 3 iOS 10

    Thanks,

    Remi.

    opened by remi1212 7
  • Can't archive due to bitcode error

    When trying to Archive my project which uses LFLiveKit, I get the following error:

    ld: bitcode bundle could not be generated because '/path-to-project/Pods/LFLiveKit/Vendor/pili-librtmp.framework/pili-librtmp(amf.o)' was built without full bitcode. All object files and libraries for bitcode must be generated from Xcode Archive or Install build for architecture armv7
    clang: error: linker command failed with exit code 1 (use -v to see invocation)
    

    It doesn't work whether I set Enable Bitcode to YES or NO.

    opened by dmdque 7
  • Streaming fails in a 4G environment

    When capturing and streaming over 4G, the console prints "handleRouteChange reason is The category of the session object changed." and "handleRouteChange reason is The output route was overridden by the app." No data appears to be pushed out.

    opened by BossKing 7
  • Where can I add or enable face detection in this project?

    Hello, I'm using this project to implement a live streaming app and it works great. But I need face detection so I can apply special filters for users. If I want to add AVCaptureMetadataOutput to support face detection, where is the easiest place to hook it in? Or what would you suggest?
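    For reference, a minimal sketch of the AVCaptureMetadataOutput face-detection setup mentioned above; where to attach it inside LFLiveKit is the open question, so the AVCaptureSession here is assumed to be reachable (for example by patching LFVideoCapture):

    import AVFoundation

    final class FaceDetector: NSObject, AVCaptureMetadataOutputObjectsDelegate {
        func attach(to captureSession: AVCaptureSession) {
            let output = AVCaptureMetadataOutput()
            guard captureSession.canAddOutput(output) else { return }
            captureSession.addOutput(output)
            output.setMetadataObjectsDelegate(self, queue: .main)
            // Only ask for face metadata when the device supports it.
            if output.availableMetadataObjectTypes.contains(.face) {
                output.metadataObjectTypes = [.face]
            }
        }

        func metadataOutput(_ output: AVCaptureMetadataOutput,
                            didOutput metadataObjects: [AVMetadataObject],
                            from connection: AVCaptureConnection) {
            let faces = metadataObjects.compactMap { $0 as? AVMetadataFaceObject }
            // faces[i].bounds gives the face rectangle for driving filters.
            _ = faces
        }
    }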

    opened by kikokuo 2
  • App crashes when streaming to more than 3 RTMP or RTMPS URLs

    When trying to stream to multiple RTMP URLs, the app crashes in the C rtmp file at the line SSL_load_error_strings();

    B2B(32583,0x16dfc7000) malloc: Non-aligned pointer 0x2823e0f00 being freed (2)
    B2B(32583,0x16dfc7000) malloc: *** set a breakpoint in malloc_error_break to debug

    opened by vignesh2340 0
  • Incorporating AVMultiCamPiP: Capturing from Multiple Cameras

    How can I simultaneously record the output from the front and back cameras into a single movie file with a multi-camera capture session, and stream it over RTMP? https://developer.apple.com/documentation/avfoundation/cameras_and_media_capture/avmulticampip_capturing_from_multiple_cameras

    opened by pundirmr 0