
Overview

Next Level

NextLevel is a Swift camera system designed for easy integration, customized media capture, and image streaming in iOS. Integration can optionally leverage AVFoundation or ARKit.


Features
🎬 "Vine-like" video clip recording and editing
🖼 photo capture (raw, jpeg, and video frame)
👆 customizable gestural interaction and interface
💠 ARKit integration (beta)
📷 dual, wide angle, telephoto, & true depth support
🐢 adjustable frame rate on supported hardware (i.e., fast/slow motion capture)
🎢 depth data capture support & portrait effects matte support
🔍 video zoom
white balance, focus, and exposure adjustment
🔦 flash and torch support
👯 mirroring support
low light boost
🕶 smooth auto-focus
configurable encoding and compression settings
🛠 simple media capture and editing API
🌀 extensible API for image processing and CV
🐈 animated GIF creator
😎 face, QR code, and barcode recognition
🐦 Swift 5

Need a different version of Swift?

  • 5.0 - Target your Podfile to the latest release or master
  • 4.2 - Target your Podfile to the swift4.2 branch

Quick Start

0.16.3" # Carthage github "nextlevel/NextLevel" ~> 0.16.3 # Swift PM let package = Package( dependencies: [ .Package(url: "https://github.com/nextlevel/NextLevel", majorVersion: 0) ] ) ">
# CocoaPods
pod "NextLevel", "~> 0.16.3"

# Carthage
github "nextlevel/NextLevel" ~> 0.16.3

# Swift PM
let package = Package(
    dependencies: [
        .package(url: "https://github.com/nextlevel/NextLevel", from: "0.16.3")
    ]
)

Alternatively, drop the NextLevel source files or project file into your Xcode project.

Important Configuration Note for ARKit and True Depth

The ARKit and True Depth Camera software features are enabled with the inclusion of the Swift compiler flags USE_ARKIT and USE_TRUE_DEPTH, respectively.

Apple will reject apps that link against ARKit or the True Depth Camera API and do not use them.

If you use CocoaPods, you can include -D USE_ARKIT or -D USE_TRUE_DEPTH with the following Podfile addition, or add them to your Xcode build settings.

  post_install do |installer|
    installer.pods_project.targets.each do |target|
      # setup NextLevel for ARKit use
      if target.name == 'NextLevel'
        target.build_configurations.each do |config|
          config.build_settings['OTHER_SWIFT_FLAGS'] = ['$(inherited)', '-DUSE_ARKIT']
        end
      end
    end
  end

Overview

Before starting, ensure that permission keys have been added to your app's Info.plist.

<key>NSCameraUsageDescription</key>
    <string>Allowing access to the camera lets you take photos and videos.</string>
<key>NSMicrophoneUsageDescription</key>
    <string>Allowing access to the microphone lets you record audio.</string>

Recording Video Clips

Import the library.

import NextLevel

Set up the camera preview.

let screenBounds = UIScreen.main.bounds
self.previewView = UIView(frame: screenBounds)
if let previewView = self.previewView {
    previewView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    previewView.backgroundColor = UIColor.black
    NextLevel.shared.previewLayer.frame = previewView.bounds
    previewView.layer.addSublayer(NextLevel.shared.previewLayer)
    self.view.addSubview(previewView)
}

Configure the capture session.

override func viewDidLoad() {
    super.viewDidLoad()

    NextLevel.shared.delegate = self
    NextLevel.shared.deviceDelegate = self
    NextLevel.shared.videoDelegate = self
    NextLevel.shared.photoDelegate = self

    // modify .videoConfiguration, .audioConfiguration, and .photoConfiguration properties
    // compression, resolution, and maximum recording time options are available
    NextLevel.shared.videoConfiguration.maximumCaptureDuration = CMTimeMakeWithSeconds(5, preferredTimescale: 600)
    NextLevel.shared.audioConfiguration.bitRate = 44000
}
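
For fast or slow motion capture on supported hardware, the capture frame rate can also be adjusted during configuration. A minimal sketch (the 120 fps value is illustrative; frameRate is the property referenced in the discussions later in this document):

// request 120 fps capture for later slow motion playback;
// this only takes effect on hardware that supports high frame rate formats
NextLevel.shared.frameRate = 120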

Start and stop the session when appropriate. These methods create a new session instance for NextLevel.shared.session when called.

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    NextLevel.shared.start()
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    NextLevel.shared.stop()
}

Video record/pause.

// record
NextLevel.shared.record()

// pause
NextLevel.shared.pause()
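
Still photos can be captured in a similar fashion. A minimal sketch, assuming the shared instance is switched into photo capture mode first (captureMode and capturePhoto() are the calls referenced in the discussions later in this document); results are delivered through the photo delegate:

// switch the capture mode and trigger a still photo;
// results arrive via the NextLevelPhotoDelegate callbacks
NextLevel.shared.captureMode = .photo
NextLevel.shared.capturePhoto()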

Editing Recorded Clips

Editing and finalizing the recorded session.

if let session = NextLevel.shared.session {

    //..

    // undo
    session.removeLastClip()

    // various editing operations can be done using the NextLevelSession methods

    // export
    session.mergeClips(usingPreset: AVAssetExportPresetHighestQuality, completionHandler: { (url: URL?, error: Error?) in
        if let _ = url {
            //
        } else if let _ = error {
            //
        }
     })

    //..

}
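
Once merged, the exported file URL can be handled with standard framework calls outside of NextLevel, for example saving it to the Photos library with PhotoKit. A sketch, assuming NSPhotoLibraryAddUsageDescription has also been added to Info.plist:

import Photos

// save a merged clip to the user's photo library
func saveClip(at url: URL) {
    PHPhotoLibrary.shared().performChanges({
        PHAssetChangeRequest.creationRequestForAssetFromVideo(atFileURL: url)
    }) { saved, error in
        if let error = error {
            print("failed to save clip: \(error)")
        }
    }
}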

Videos can also be processed using NextLevelSessionExporter, a media transcoding library in Swift.

Custom Buffer Rendering

NextLevel was designed for sample buffer analysis and custom modification in real time, alongside a rich set of camera features.

Note that modifications performed on a buffer and provided back to NextLevel may affect the frame rate.

Enable custom rendering.

NextLevel.shared.isVideoCustomContextRenderingEnabled = true

An optional hook allows reading the sample buffer for analysis.

extension CameraViewController: NextLevelVideoDelegate {

    // ...

    // video frame processing
    public func nextLevel(_ nextLevel: NextLevel, willProcessRawVideoSampleBuffer sampleBuffer: CMSampleBuffer) {
        // use the sampleBuffer parameter in your system for continual analysis
    }
}

Another optional hook provides buffers for modification via the imageBuffer parameter. This is also the recommended place to provide the buffer back to NextLevel for recording.

extension CameraViewController: NextLevelVideoDelegate {

    // ...

    // enabled by isVideoCustomContextRenderingEnabled
    public func nextLevel(_ nextLevel: NextLevel, renderToCustomContextWithImageBuffer imageBuffer: CVPixelBuffer, onQueue queue: DispatchQueue) {
        // provide the frame back to NextLevel for recording
        if let frame = self._availableFrameBuffer {
            nextLevel.videoCustomContextImageBuffer = frame
        }
    }
}

NextLevel will check this property when writing buffers to a destination file. This works for both video and photos with capturePhotoFromVideo.

nextLevel.videoCustomContextImageBuffer = modifiedFrame
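
For example, a Core Image filter can be applied inside the render hook shown above before handing the frame back. A minimal sketch of such a hook body (the sepia filter and the in-place render are illustrative choices, not NextLevel API; a production implementation would typically render into a separate buffer from a CVPixelBufferPool):

import CoreImage

// shared Core Image objects; creating them per frame would be costly
let ciContext = CIContext()
let sepiaFilter = CIFilter(name: "CISepiaTone")

public func nextLevel(_ nextLevel: NextLevel, renderToCustomContextWithImageBuffer imageBuffer: CVPixelBuffer, onQueue queue: DispatchQueue) {
    // wrap the capture buffer in a CIImage and apply the filter
    let source = CIImage(cvPixelBuffer: imageBuffer)
    sepiaFilter?.setValue(source, forKey: kCIInputImageKey)
    guard let filtered = sepiaFilter?.outputImage else { return }

    // render the filtered image back into the buffer, then hand it to NextLevel
    ciContext.render(filtered, to: imageBuffer)
    nextLevel.videoCustomContextImageBuffer = imageBuffer
}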

About

NextLevel was initially a weekend project that has now grown into an open community of camera platform enthusiasts. The software provides foundational components for managing media recording, camera interface customization, gestural interaction customization, and image streaming on iOS. The same capabilities can also be found in apps such as Snapchat, Instagram, and Vine.

The goal is to continue to provide a good foundation for quick integration (enabling projects to be taken to the next level), allowing focus to be placed on the functionality that matters most, whether that's realtime image processing, computer vision methods, augmented reality, or computational photography.

ARKit

NextLevel provides components for capturing ARKit video and photo. This enables a variety of new camera features while leveraging the existing recording capabilities and media management of NextLevel.

If you are trying to capture frames from SceneKit for ARKit recording, check out the examples project.

Documentation

You can find the docs here. Documentation is generated with jazzy and hosted on GitHub-Pages.

Stickers

If you found this project to be helpful, check out the Next Level stickers.

Project

NextLevel is a community – contributions and discussions are welcome!

  • Feature idea? Open an issue.
  • Found a bug? Open an issue.
  • Need help or have a question? Use Stack Overflow with the tag 'nextlevel'.
  • Want to contribute? Submit a pull request.

Related Projects

Resources

License

NextLevel is available under the MIT license, see the LICENSE file for more information.

Comments
  • Videos are getting cut short


    I've seen this in our implementation and in the sample app. When we record a single video clip, it seems like the last fraction of a second of the clip gets cut off. You can see this in the project's demo app as well.

    To repro in the demo project, hold the capture button to record a clip and just before letting go of the button, mark the clip with an audio cue. You'll notice during playback, the audio cue is not present.

    bug 
    opened by bennysghost 19
  • Low Quality Video Capture


@piemonte Hey, the previous issue #34 was closed, so I thought it necessary to bring this up again since I failed to fix it.

For whatever reason, despite setting the videoConfiguration to high quality, I still get low-quality output video when I play it back using your other library, Player.

I have these settings in my viewDidLoad, so I would appreciate further hints as to what could possibly cause the bad video. I'm using the latest version of this library.

    self.nextLevel.delegate = self
    self.nextLevel.videoDelegate = self
    self.nextLevel.photoDelegate = self
    self.nextLevel.deviceDelegate = self
    self.nextLevel.captureMode = .video
    self.nextLevel.focusMode = .autoFocus
    self.nextLevel.devicePosition = .front
    self.nextLevel.deviceOrientation = .portrait
    self.nextLevel.videoStabilizationMode = .auto
    self.nextLevel.videoConfiguration.bitRate = 2000000
    self.nextLevel.videoConfiguration.preset = AVCaptureSessionPresetHigh
    self.nextLevel.videoConfiguration.transform = CGAffineTransform(scaleX: -1, y: 1)
    bug 
    opened by otymartin 13
  • Video Quality


    Hi Piemonte,

    We have below configuration for video recording:

    1. Bit rate : 2000000
    2. AVAssetExportPresetHighestQuality

but after recording we are getting a pixelated video file. Can you please help?

    opened by monish152 12
  • Preview orientation encounters issues


    I'm having issues with the rotation of the preview layer, it's happening super slowly and with a lot of stuttering.

    My set up looks something like this:

    self.nextLevel.delegate = self
    self.nextLevel.videoDelegate = self
    self.nextLevel.captureMode = .videoWithoutAudio
    self.nextLevel.automaticallyUpdatesDeviceOrientation = true
    self.nextLevel.videoConfiguration.preset = .hd1920x1080
    

    I have the preview set up like this:

    /// The recording preview which contains the next level recorder
    private lazy var previewView: UIView = {
        let view = UIView(frame: UIScreen.main.bounds)
        view.backgroundColor = .black
        self.nextLevel.previewLayer.frame = view.bounds
        view.layer.addSublayer(self.nextLevel.previewLayer)
        return view
    }()
    
    // In view did load
    self.previewView.frame = self.view.bounds
    self.previewView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    self.view.insertSubview(self.previewView, at: 0)
    

    Here is a GIF of the slow rotation:

[GIF attachment]

    Any ideas what could be going on?

    bug 
    opened by luispadron 10
  • Xcode 9.0 issues


    Xcode 9.0

    I encounter type issues (about 75 of them):

'AVMediaType' (aka 'NSString') is not implicitly convertible to 'String!'; did you mean to use 'as' to explicitly convert?
    

In:

    extension AVCaptureDeviceInput {
        
        /// Returns the capture device input for the desired media type and capture session, otherwise nil.
        ///
        /// - Parameters:
        ///   - mediaType: Specified media type. (i.e. AVMediaTypeVideo, AVMediaTypeAudio, etc.)
        ///   - captureSession: Capture session for which to query
        /// - Returns: Desired capture device input for the associated media type, otherwise nil
        public class func deviceInput(withMediaType mediaType: AVMediaType, captureSession: AVCaptureSession) -> AVCaptureDeviceInput? {
            if let inputs = captureSession.inputs as? [AVCaptureDeviceInput] {
                for deviceInput in inputs {
                    if deviceInput.device.hasMediaType(mediaType) {
                        return deviceInput
                    }
                }
            }
            return nil
        }
        
    }
    
    opened by cybercent 10
  • How to apply a CIFilter to preview and video?


    I've looked into it, but everything I find suggests taking the sample buffer and converting to a UIImage, which doesn't seem right, particularly when it would require updating the AVPreview layer each frame.

    I'm currently looking at the VideoDelegate methods:

    public func nextLevel(_ nextLevel: NextLevel, willProcessRawVideoSampleBuffer sampleBuffer: CMSampleBuffer) {}

    public func nextLevel(_ nextLevel: NextLevel, renderToCustomContextWithImageBuffer imageBuffer: CVPixelBuffer, onQueue queue: DispatchQueue) {}

    Any tips would be appreciated.

    question 
    opened by jjaybrown 9
  • [FEATURE] Supporting the AVPortraitEffectsMatte


    Motivation

Early this year Apple announced a lot of features related to AVFoundation, and one of them is the AVPortraitEffectsMatte.

    This new class is an auxiliary image used to separate foreground from background with high resolution for depth enabled photos.

I think this would be a great addition to NextLevel, so I put some time and effort into this and came up with this pull request.

    Changes

    New flag variable on NextLevelPhotoConfiguration

    /// Enables portrait effects matte output for the photo
    public var isPortraitEffectsMatteEnabled: Bool = false
    

    Enabling the AVPortraitEffectsMatte if available and desired on capturePhoto()

    ...
        if #available(iOS 12.0, *) {
            if photoOutput.isPortraitEffectsMatteDeliverySupported {
                photoOutput.isPortraitEffectsMatteDeliveryEnabled = self.photoConfiguration.isPortraitEffectsMatteEnabled
            }
        } 
    ...
    

    New Protocol method on NextLevelPhotoDelegate

The nextLevelDidFinishProcessingPhoto method should be used to get an AVCapturePhoto, which can carry metadata for AVDepthData or AVPortraitEffectsMatte. In the future it could become more useful for new AVFoundation capabilities that Apple introduces.

    @available(iOS 11.0, *)
    func nextLevelDidFinishProcessingPhoto(_ nextLevel: NextLevel, photo: AVCapturePhoto)
    

    New Protocol called NextLevelPortraitEffectsMatteDelegate

    // MARK: - NextLevelPortraitEffectsMatteDelegate
    
    /// Portrait Effects Matte delegate, provides portrait effects matte updates
    public protocol NextLevelPortraitEffectsMatteDelegate: AnyObject {
        
        @available(iOS 12.0, *)
        func portraitEffectsMatteOutput(_ nextLevel: NextLevel, didOutput portraitEffectsMatte: AVPortraitEffectsMatte)
        
    }
    

    Using the newer method of AVCapturePhotoCaptureDelegate

        @available(iOS 11.0, *)
        public func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
            DispatchQueue.main.async {
                self.photoDelegate?.nextLevelDidFinishProcessingPhoto(self, photo: photo)
            }
            
            // MARK: Portrait Effects Matte Output
            if #available(iOS 12.0, *){
                if let portraitEffectsMatte = photo.portraitEffectsMatte {
                    DispatchQueue.main.async {
                        self.portraitEffectsMatteDelegate?.portraitEffectsMatteOutput(self, didOutput: portraitEffectsMatte)
                    }
                }
            }
        }
    

    Considerations

Is there any way that I can keep contributing to NextLevel? I loved implementing this feature in the library, and I will be using NextLevel for my photo app @fade-it in upcoming releases.

    Thank You! Gus

    opened by giogus 8
  • Reconfigure on each focus and exposure change


Hey there, awesome library, but can we make it even better? I was wondering about a few things that might or might not be problems:

    • Does this device.isSubjectAreaChangeMonitoringEnabled = true have to change on each focusEnded and exposureEnded call? https://github.com/NextLevel/NextLevel/blob/9cad174578d1986d2cecc6c230de79666764bebe/Sources/NextLevel.swift#L1663-L1691

    • Another thing I've found that I can't explain: why is device.isAdjustingFocus always false here? https://github.com/NextLevel/NextLevel/blob/9cad174578d1986d2cecc6c230de79666764bebe/Sources/NextLevel.swift#L1611-L1624

    • And finally: Why is focusAtAdjustedPointOfInterest being called instead of focusExposeAndAdjustWhiteBalance in the function above?

    opened by NikolayGenov 8
  • Xcode 12 compiler error


Has anyone encountered this issue when running in Xcode 12?

Looks like an iOS 14 SDK issue, but not 100% sure. [screenshot]

I could find availableVideoCVPixelFormatTypes in the Objective-C header file, but somehow it is not visible in Swift. [screenshot]

And here is Apple's documentation for availableVideoPixelFormatTypes, which is nonobjc, but I still cannot find it in the Swift header:

    https://developer.apple.com/documentation/avfoundation/avcapturevideodataoutput/2926555-availablevideopixelformattypes

    NextLevel version: 0.16.3

Update: checked in Xcode 11.7, here is the definition for that property. [screenshot]

    bug 
    opened by cincas 7
  • Enabling/Disabling Audio


    Hey there,

    Is there any way microphone input can be disabled and then re-enabled using this framework? I'm trying to create haptic feedback when the user taps on a button on the screen when the user is not recording. As playing haptics is disabled when an AVCaptureSession is active with an audio input, I'm thinking of disabling audio input while the user is not recording, and then enabling it just before they start recording.

    Thanks for the help!

    -William

    bug enhancement 
    opened by darnfish 7
  • Slow Motion Example?


Hi guys, really awesome project! I want to achieve slow motion video recording in the app I am working on. I was able to "stretch out" the video with AVFoundation alone, but the quality suffered because the frame rate was unchanged while the timescale was decreased. I saw that NextLevel supports slow motion recording on supported devices, so I looked through the documentation and only found the bitRate and frameRate properties. How should I use them to achieve slow motion video recording? Thanks!

    question 
    opened by tonymuu 7
  • NextLevel ARKit not recording


I am integrating NextLevel into my project and I am using ARKit. Taking photoFromVideo works; however, when I try to record, the video sample buffer is not updated and no clips actually record. didStartClipInSession is called, but the clip is never recorded/completed. This is the same issue in the example app.

    opened by sparkdeveloping 1
  • pixelBuffer is nil in the captured photo


    Hi,

    Thanks for the great framework you made. I am trying to capture photos and I am using the following settings:

    NextLevel.shared.delegate = self
    NextLevel.shared.deviceDelegate = self
    NextLevel.shared.videoDelegate = self
    NextLevel.shared.photoDelegate = self
    NextLevel.shared.devicePosition = .back
    NextLevel.shared.captureMode = .photo
    NextLevel.shared.videoConfiguration.preset = .high
    NextLevel.shared.automaticallyUpdatesDeviceOrientation = true

I successfully capture a photo by calling NextLevel.shared.capturePhoto(), but in didFinishProcessingPhoto photo: AVCapturePhoto, photo.pixelBuffer is nil.

    Can you please help me with this?

    opened by bluepixeltech 1
  • about shouldOptimizeForNetworkUse setting


Is there any place to set shouldOptimizeForNetworkUse? It is set to true in the source code, which causes writer.finishWriting to take a long time when I record a long video.

    opened by JJson 0
  • Orientation issue


If the video is recorded in landscape and the mode changes to portrait mid-recording, the video gets zoomed in; the transition is not smooth.

        NextLevel.shared.delegate = self
        NextLevel.shared.deviceDelegate = self
        NextLevel.shared.videoDelegate = self
        NextLevel.shared.captureMode = .video
        NextLevel.shared.deviceOrientation = .portrait
        NextLevel.shared.automaticallyUpdatesDeviceOrientation = true
        NextLevel.shared.flipCaptureDevicePosition()
        NextLevel.shared.audioConfiguration.bitRate = 44000
        NextLevel.shared.devicePosition = .back
    
    opened by dhvl1729 2
  • Occlusion : recording not working - object not occluded


iOS 14, latest ARKit version, using the "nextlevel" example app.

Occlusion is switched on with:

    arConfig.frameSemantics.insert(ARConfiguration.FrameSemantics.personSegmentationWithDepth)

Occlusion works fine in the live AR view, but in the recorded video it does not.

    opened by cortinas 0
Releases (0.16.1)
  • 0.16.1(Dec 4, 2019)

    • addresses an issue where updateVideoOutputSettings was called after adding an input instead of when initially setting up the video output, causing frame drops and delays
  • 0.9.0(Nov 14, 2017)

The ARKit mode is in beta and is not compiled by default; it requires the Swift compiler flag USE_ARKIT. Apple will reject apps that link ARKit and do not use it. To try it out, set up the AppDelegate to load the MixedRealityViewController class and include the Xcode build settings flag.

If you use CocoaPods, you can include -DUSE_ARKIT with the following Podfile addition, or add it to your Xcode build settings.

      post_install do |installer|
        installer.pods_project.targets.each do |target|
          # setup NextLevel for ARKit use
          if target.name == 'NextLevel'
            target.build_configurations.each do |config|
              config.build_settings['OTHER_SWIFT_FLAGS'] = '-DUSE_ARKIT'
            end
          end
        end
      end
    