⬆️ Rad Media Capture in Swift

Overview

NextLevel is a Swift camera system designed for easy integration, customized media capture, and image streaming in iOS. Integration can optionally leverage AVFoundation or ARKit.

Features
🎬 “Vine-like” video clip recording and editing
🖼 photo capture (raw, jpeg, and video frame)
👆 customizable gestural interaction and interface
💠 ARKit integration (beta)
📷 dual, wide angle, telephoto, & true depth support
🐢 adjustable frame rate on supported hardware (i.e., fast/slow motion capture)
🎢 depth data capture support & portrait effects matte support
🔍 video zoom
white balance, focus, and exposure adjustment
🔦 flash and torch support
👯 mirroring support
low light boost
🕶 smooth auto-focus
configurable encoding and compression settings
🛠 simple media capture and editing API
🌀 extensible API for image processing and CV
🐈 animated GIF creator
😎 face, QR code, and barcode recognition
🐦 Swift 5

Need a different version of Swift?

  • 5.0 - Target your Podfile to the latest release or master
  • 4.2 - Target your Podfile to the swift4.2 branch
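For example, a Podfile entry pinned to the swift4.2 branch might look like this (a sketch using standard CocoaPods :git/:branch options):

pod "NextLevel", :git => "https://github.com/NextLevel/NextLevel", :branch => "swift4.2"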

Quick Start

# CocoaPods
pod "NextLevel", "~> 0.16.3"

# Carthage
github "nextlevel/NextLevel" ~> 0.16.3

# Swift PM
let package = Package(
    dependencies: [
        .package(url: "https://github.com/nextlevel/NextLevel", from: "0.16.3")
    ]
)

Alternatively, drop the NextLevel source files or project file into your Xcode project.

Important Configuration Note for ARKit and True Depth

ARKit and the True Depth Camera software features are enabled with the inclusion of the Swift compiler flags USE_ARKIT and USE_TRUE_DEPTH, respectively.

Apple will reject apps that link against ARKit or the True Depth Camera API and do not use them.

If you use CocoaPods, you can include -D USE_ARKIT or -D USE_TRUE_DEPTH with the following Podfile addition, or by adding the flag to your Xcode build settings.

  post_install do |installer|
    installer.pods_project.targets.each do |target|
      # setup NextLevel for ARKit use
      if target.name == 'NextLevel'
        target.build_configurations.each do |config|
          config.build_settings['OTHER_SWIFT_FLAGS'] = ['$(inherited)', '-DUSE_ARKIT']
        end
      end
    end
  end
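To enable both ARKit and True Depth at once, the same hook can pass both flags on the OTHER_SWIFT_FLAGS line (a variant of the snippet above):

  config.build_settings['OTHER_SWIFT_FLAGS'] = ['$(inherited)', '-DUSE_ARKIT', '-DUSE_TRUE_DEPTH']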

Overview

Before starting, ensure that permission keys have been added to your app's Info.plist.

<key>NSCameraUsageDescription</key>
    <string>Allowing access to the camera lets you take photos and videos.</string>
<key>NSMicrophoneUsageDescription</key>
    <string>Allowing access to the microphone lets you record audio.</string>
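These keys declare usage; access must also be granted at runtime. A minimal sketch using plain AVFoundation (not a NextLevel-specific API), starting capture once both permissions are granted:

import AVFoundation

// Request camera access, then microphone access, before starting the session.
AVCaptureDevice.requestAccess(for: .video) { videoGranted in
    guard videoGranted else { return }
    AVCaptureDevice.requestAccess(for: .audio) { audioGranted in
        guard audioGranted else { return }
        DispatchQueue.main.async {
            NextLevel.shared.start()
        }
    }
}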

Recording Video Clips

Import the library.

import NextLevel

Set up the camera preview.

let screenBounds = UIScreen.main.bounds
self.previewView = UIView(frame: screenBounds)
if let previewView = self.previewView {
    previewView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    previewView.backgroundColor = UIColor.black
    NextLevel.shared.previewLayer.frame = previewView.bounds
    previewView.layer.addSublayer(NextLevel.shared.previewLayer)
    self.view.addSubview(previewView)
}

Configure the capture session.

override func viewDidLoad() {
    super.viewDidLoad()

    NextLevel.shared.delegate = self
    NextLevel.shared.deviceDelegate = self
    NextLevel.shared.videoDelegate = self
    NextLevel.shared.photoDelegate = self

    // modify .videoConfiguration, .audioConfiguration, .photoConfiguration properties
    // Compression, resolution, and maximum recording time options are available
    NextLevel.shared.videoConfiguration.maximumCaptureDuration = CMTimeMakeWithSeconds(5, preferredTimescale: 600)
    NextLevel.shared.audioConfiguration.bitRate = 44000
}
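Other configuration properties shown elsewhere in this document can be set the same way, for example (values are illustrative only):

NextLevel.shared.captureMode = .video
NextLevel.shared.videoConfiguration.preset = .hd1920x1080
NextLevel.shared.videoConfiguration.bitRate = 2_000_000
NextLevel.shared.videoStabilizationMode = .auto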

Start and stop the session when appropriate. These methods create a new session instance for NextLevel.shared.session when called.

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    NextLevel.shared.start()
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    NextLevel.shared.stop()
}

Video record/pause.

// record
NextLevel.shared.record()

// pause
NextLevel.shared.pause()
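A common way to drive these for “Vine-like” clips is a long-press gesture that records while the finger is down. A minimal sketch (the gesture handler below is illustrative wiring, not part of the NextLevel API):

import UIKit

// Attach to the record button, e.g.:
// button.addGestureRecognizer(UILongPressGestureRecognizer(target: self, action: #selector(handleLongPress(_:))))
@objc func handleLongPress(_ gesture: UILongPressGestureRecognizer) {
    switch gesture.state {
    case .began:
        NextLevel.shared.record()   // finger down: start or resume a clip
    case .ended, .cancelled, .failed:
        NextLevel.shared.pause()    // finger up: end the clip
    default:
        break
    }
}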

Editing Recorded Clips

Editing and finalizing the recorded session.

if let session = NextLevel.shared.session {

    //..

    // undo
    session.removeLastClip()

    // various editing operations can be done using the NextLevelSession methods

    // export
    session.mergeClips(usingPreset: AVAssetExportPresetHighestQuality, completionHandler: { (url: URL?, error: Error?) in
        if let _ = url {
            //
        } else if let _ = error {
            //
        }
    })

    //..

}

Videos can also be processed using NextLevelSessionExporter, a media transcoding library in Swift.
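Alternatively, if you prefer to stay within AVFoundation, the merged file can be transcoded with a plain AVAssetExportSession. A minimal sketch (this is not the NextLevelSessionExporter API; the transcode function is illustrative):

import AVFoundation

func transcode(fileAt url: URL, to outputURL: URL) {
    let asset = AVAsset(url: url)
    guard let export = AVAssetExportSession(asset: asset, presetName: AVAssetExportPresetMediumQuality) else { return }
    export.outputURL = outputURL
    export.outputFileType = .mp4
    export.exportAsynchronously {
        if export.status == .completed {
            // the transcoded file is now at outputURL
        }
    }
}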

Custom Buffer Rendering

NextLevel was designed for sample buffer analysis and custom modification in real time, alongside a rich set of camera features.

Note that modifications performed on a buffer and provided back to NextLevel may affect the frame rate.

Enable custom rendering.

NextLevel.shared.isVideoCustomContextRenderingEnabled = true

An optional hook allows reading the sample buffer for analysis.

extension CameraViewController: NextLevelVideoDelegate {

    // ...

    // video frame processing
    public func nextLevel(_ nextLevel: NextLevel, willProcessRawVideoSampleBuffer sampleBuffer: CMSampleBuffer) {
        // Use the sampleBuffer parameter in your system for continual analysis
    }
}

Another optional hook provides an imageBuffer for modification. This is also the recommended place to provide the buffer back to NextLevel for recording.

extension CameraViewController: NextLevelVideoDelegate {

    // ...

    // enabled by isVideoCustomContextRenderingEnabled
    public func nextLevel(_ nextLevel: NextLevel, renderToCustomContextWithImageBuffer imageBuffer: CVPixelBuffer, onQueue queue: DispatchQueue) {
        // provide the frame back to NextLevel for recording
        if let frame = self._availableFrameBuffer {
            nextLevel.videoCustomContextImageBuffer = frame
        }
    }
}

NextLevel checks this property when writing buffers to a destination file. This works both for video and for photos captured with capturePhotoFromVideo.

nextLevel.videoCustomContextImageBuffer = modifiedFrame
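For example, the render hook can run each frame through a Core Image filter before handing it back. A minimal sketch (CIPhotoEffectNoir is just an illustrative filter, and rendering in place assumes the pixel buffer is writable):

import CoreImage

let ciContext = CIContext() // create once; making a context per frame is expensive

public func nextLevel(_ nextLevel: NextLevel, renderToCustomContextWithImageBuffer imageBuffer: CVPixelBuffer, onQueue queue: DispatchQueue) {
    let source = CIImage(cvPixelBuffer: imageBuffer)
    guard let filter = CIFilter(name: "CIPhotoEffectNoir") else { return } // illustrative filter
    filter.setValue(source, forKey: kCIInputImageKey)
    guard let filtered = filter.outputImage else { return }

    // Render the filtered image back into the buffer and hand it to NextLevel for recording.
    ciContext.render(filtered, to: imageBuffer)
    nextLevel.videoCustomContextImageBuffer = imageBuffer
}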

About

NextLevel was initially a weekend project that has since grown into an open community of camera platform enthusiasts. The software provides foundational components for managing media recording, camera interface customization, gestural interaction customization, and image streaming on iOS. The same capabilities can also be found in apps such as Snapchat, Instagram, and Vine.

The goal is to continue to provide a good foundation for quick integration (enabling projects to be taken to the next level), allowing focus to be placed on the functionality that matters most, whether it's real-time image processing, computer vision methods, augmented reality, or computational photography.

ARKit

NextLevel provides components for capturing ARKit video and photos. This enables a variety of new camera features while leveraging the existing recording capabilities and media management of NextLevel.

If you are trying to capture frames from SceneKit for ARKit recording, check out the examples project.

Documentation

Documentation is generated with jazzy and hosted on GitHub Pages.

Stickers

If you found this project to be helpful, check out the Next Level stickers.

Project

NextLevel is a community – contributions and discussions are welcome!

  • Feature idea? Open an issue.
  • Found a bug? Open an issue.
  • Need help? Use Stack Overflow with the tag 'nextlevel'.
  • Questions? Use Stack Overflow with the tag 'nextlevel'.
  • Want to contribute? Submit a pull request.

License

NextLevel is available under the MIT license, see the LICENSE file for more information.

Comments
  • Videos are getting cut short

    I've seen this in our implementation and in the sample app. When we record a single video clip, it seems like the last fraction of a second of the clip gets cut off. You can see this in the project's demo app as well.

    To repro in the demo project, hold the capture button to record a clip and, just before letting go of the button, mark the clip with an audio cue. You'll notice during playback that the audio cue is not present.

    bug 
    opened by bennysghost 19
  • Low Quality Video Capture

    @piemonte Hey, the previous issue #34 was closed, so I thought it necessary to bring this up again since I failed to fix it.

    For whatever reason, despite setting the videoConfiguration to high quality, I still get low-quality output video when I play it back using your other library, Player.

    I have these settings in my viewDidLoad, so I would appreciate further hints as to what could possibly cause the bad video. I'm using the latest version of this library.

    self.nextLevel.delegate = self
    self.nextLevel.videoDelegate = self
    self.nextLevel.photoDelegate = self
    self.nextLevel.deviceDelegate = self
    self.nextLevel.captureMode = .video
    self.nextLevel.focusMode = .autoFocus
    self.nextLevel.devicePosition = .front
    self.nextLevel.deviceOrientation = .portrait
    self.nextLevel.videoStabilizationMode = .auto
    self.nextLevel.videoConfiguration.bitRate = 2000000
    self.nextLevel.videoConfiguration.preset = AVCaptureSessionPresetHigh
    self.nextLevel.videoConfiguration.transform = CGAffineTransform(scaleX: -1, y: 1)
    bug 
    opened by otymartin 13
  • Video Quality

    Hi Piemonte,

    We have below configuration for video recording:

    1. Bit rate : 2000000
    2. AVAssetExportPresetHighestQuality

    but after recording we are getting a pixelated video file. Can you please help?

    opened by monish152 12
  • Preview orientation encounters issues

    I'm having issues with the rotation of the preview layer, it's happening super slowly and with a lot of stuttering.

    My setup looks something like this:

    self.nextLevel.delegate = self
    self.nextLevel.videoDelegate = self
    self.nextLevel.captureMode = .videoWithoutAudio
    self.nextLevel.automaticallyUpdatesDeviceOrientation = true
    self.nextLevel.videoConfiguration.preset = .hd1920x1080
    

    I have the preview set up like this:

    /// The recording preview which contains the next level recorder
    private lazy var previewView: UIView = {
        let view = UIView(frame: UIScreen.main.bounds)
        view.backgroundColor = .black
        self.nextLevel.previewLayer.frame = view.bounds
        view.layer.addSublayer(self.nextLevel.previewLayer)
        return view
    }()
    
    // In view did load
    self.previewView.frame = self.view.bounds
    self.previewView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
    self.view.insertSubview(self.previewView, at: 0)
    

    [GIF: slow, stuttering rotation of the preview layer]

    Any ideas what could be going on?

    bug 
    opened by luispadron 10
  • Xcode 9.0 issues

    Xcode 9.0

    I encountered type issues (about 75 of them):

    'AVMediaType' (aka 'NSString') is not implicitly convertible to 'String!'; did you mean to use 'as' to explicitly convert?
    

    In:

    extension AVCaptureDeviceInput {
        
        /// Returns the capture device input for the desired media type and capture session, otherwise nil.
        ///
        /// - Parameters:
        ///   - mediaType: Specified media type. (i.e. AVMediaTypeVideo, AVMediaTypeAudio, etc.)
        ///   - captureSession: Capture session for which to query
        /// - Returns: Desired capture device input for the associated media type, otherwise nil
        public class func deviceInput(withMediaType mediaType: AVMediaType, captureSession: AVCaptureSession) -> AVCaptureDeviceInput? {
            if let inputs = captureSession.inputs as? [AVCaptureDeviceInput] {
                for deviceInput in inputs {
                    if deviceInput.device.hasMediaType(mediaType) {
                        return deviceInput
                    }
                }
            }
            return nil
        }
        
    }
    
    opened by cybercent 10
  • How to apply a CIFilter to preview and video?

    I've looked into it, but everything I find suggests taking the sample buffer and converting to a UIImage, which doesn't seem right, particularly when it would require updating the AVPreview layer each frame.

    I'm currently looking at the VideoDelegate methods:

    public func nextLevel(_ nextLevel: NextLevel, willProcessRawVideoSampleBuffer sampleBuffer: CMSampleBuffer) {}

    public func nextLevel(_ nextLevel: NextLevel, renderToCustomContextWithImageBuffer imageBuffer: CVPixelBuffer, onQueue queue: DispatchQueue) {}

    Any tips would be appreciated.

    question 
    opened by jjaybrown 9
  • [FEATURE] Supporting the AVPortraitEffectsMatte

    Motivation

    Early this year, Apple announced a lot of features related to AVFoundation, and one of them is AVPortraitEffectsMatte.

    This new class is an auxiliary image used to separate the foreground from the background at high resolution for depth-enabled photos.

    I think this would be a great addition to NextLevel, so I put some time and effort into this and came up with this pull request.

    Changes

    New flag variable on NextLevelPhotoConfiguration

    /// Enables portrait effects matte output for the photo
    public var isPortraitEffectsMatteEnabled: Bool = false
    

    Enabling the AVPortraitEffectsMatte if available and desired on capturePhoto()

    ...
        if #available(iOS 12.0, *) {
            if photoOutput.isPortraitEffectsMatteDeliverySupported {
                photoOutput.isPortraitEffectsMatteDeliveryEnabled = self.photoConfiguration.isPortraitEffectsMatteEnabled
            }
        }
    ...
    

    New Protocol method on NextLevelPhotoDelegate

    The nextLevelDidFinishProcessingPhoto method should be used to get an AVCapturePhoto that can carry metadata for AVDepthData or AVPortraitEffectsMatte. In the future it could become more useful for new AVFoundation capabilities that Apple introduces.

    @available(iOS 11.0, *)
    func nextLevelDidFinishProcessingPhoto(_ nextLevel: NextLevel, photo: AVCapturePhoto)
    

    New Protocol called NextLevelPortraitEffectsMatteDelegate

    // MARK: - NextLevelPortraitEffectsMatteDelegate
    
    /// Portrait Effects Matte delegate, provides portrait effects matte updates
    public protocol NextLevelPortraitEffectsMatteDelegate: AnyObject {
        
        @available(iOS 12.0, *)
        func portraitEffectsMatteOutput(_ nextLevel: NextLevel, didOutput portraitEffectsMatte: AVPortraitEffectsMatte)
        
    }
    

    Using the newer method of AVCapturePhotoCaptureDelegate

        @available(iOS 11.0, *)
        public func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
            DispatchQueue.main.async {
                self.photoDelegate?.nextLevelDidFinishProcessingPhoto(self, photo: photo)
            }
            
            // MARK: Portrait Effects Matte Output
            if #available(iOS 12.0, *) {
                if let portraitEffectsMatte = photo.portraitEffectsMatte {
                    DispatchQueue.main.async {
                        self.portraitEffectsMatteDelegate?.portraitEffectsMatteOutput(self, didOutput: portraitEffectsMatte)
                    }
                }
            }
        }
    

    Considerations

    Is there any way that I can keep contributing to NextLevel? I loved implementing this feature in the library, and I will be using NextLevel for my photo app @fade-it in upcoming releases.

    Thank You! Gus

    opened by giogus 8
  • Reconfigure on each focus and exposure change

    Hey there, awesome library, but can we make it even better? I was wondering about a few things that might or might not be problems:

    • Does this device.isSubjectAreaChangeMonitoringEnabled = true have to change on each focusEnded and exposureEnded call? https://github.com/NextLevel/NextLevel/blob/9cad174578d1986d2cecc6c230de79666764bebe/Sources/NextLevel.swift#L1663-L1691

    • Another thing I've found that I can't explain: why is device.isAdjustingFocus always false in here? https://github.com/NextLevel/NextLevel/blob/9cad174578d1986d2cecc6c230de79666764bebe/Sources/NextLevel.swift#L1611-L1624

    • And finally: Why is focusAtAdjustedPointOfInterest being called instead of focusExposeAndAdjustWhiteBalance in the function above?

    opened by NikolayGenov 8
  • Xcode 12 compiler error

    Has anyone encountered this issue when running in Xcode 12?

    Looks like an iOS 14 SDK issue, but not 100% sure.

    I could find availableVideoCVPixelFormatTypes in the Objective-C header file, but somehow it is not visible in Swift.

    And here is Apple's documentation for availableVideoPixelFormatTypes, which is nonobjc, but I still cannot find it in the Swift header:

    https://developer.apple.com/documentation/avfoundation/avcapturevideodataoutput/2926555-availablevideopixelformattypes

    NextLevel version: 0.16.3

    Update: checked in Xcode 11.7, where the definition for that property is present.

    bug 
    opened by cincas 7
  • Enabling/Disabling Audio

    Hey there,

    Is there any way microphone input can be disabled and then re-enabled using this framework? I'm trying to create haptic feedback when the user taps on a button on the screen when the user is not recording. As playing haptics is disabled when an AVCaptureSession is active with an audio input, I'm thinking of disabling audio input while the user is not recording, and then enabling it just before they start recording.

    Thanks for the help!

    -William

    bug enhancement 
    opened by darnfish 7
  • Slow Motion Example?

    Hi guys, really awesome project! I want to achieve slow motion video recording in the app I am working on. I was able to "stretch out" the video with AVFoundation alone, but the quality suffered because the frame rate was unchanged while the timescale was decreased. I saw that NextLevel supports slow motion recording on supported devices, so I looked into the documentation and only found the bitRate and frameRate properties. How should I use them to achieve slow motion video recording? Thanks!

    question 
    opened by tonymuu 7
  • NextLevel ARKit not recording

    I am integrating NextLevel into my project and I am using ARKit. Taking a photo from video works; however, when I try to record, the video sample buffer is not updated and no clips actually record. didStartClipInSession is called; however, the clip is never recorded/completed. The same issue occurs in the example app.

    opened by sparkdeveloping 1
  • pixelBuffer is nil in the captured photo

    Hi,

    Thanks for the great framework you made. I am trying to capture photos and I am using the following settings:

    NextLevel.shared.delegate = self
    NextLevel.shared.deviceDelegate = self
    NextLevel.shared.videoDelegate = self
    NextLevel.shared.photoDelegate = self
    NextLevel.shared.devicePosition = .back
    NextLevel.shared.captureMode = .photo
    NextLevel.shared.videoConfiguration.preset = .high
    NextLevel.shared.automaticallyUpdatesDeviceOrientation = true

    I successfully capture a photo by calling NextLevel.shared.capturePhoto(), but in didFinishProcessingPhoto photo: AVCapturePhoto, photo.pixelBuffer is nil.

    Can you please help me with this?

    opened by bluepixeltech 1
  • about shouldOptimizeForNetworkUse setting

    Is there any place to set shouldOptimizeForNetworkUse? It is set to true in the source code, which causes writer.finishWriting to take a long time when I record a long video.

    opened by JJson 0
  • Orientation issue

    If the video is in landscape and the orientation changes to portrait mid-recording, the video is zoomed. It's not smooth.

        NextLevel.shared.delegate = self
        NextLevel.shared.deviceDelegate = self
        NextLevel.shared.videoDelegate = self
        NextLevel.shared.captureMode = .video
        NextLevel.shared.deviceOrientation = .portrait
        NextLevel.shared.automaticallyUpdatesDeviceOrientation = true
        NextLevel.shared.flipCaptureDevicePosition()
        NextLevel.shared.audioConfiguration.bitRate = 44000
        NextLevel.shared.devicePosition = .back
    
    opened by dhvl1729 2
  • Occlusion : recording not working - object not occluded

    iOS 14, latest ARKit version, using the NextLevel example app.

    Occlusion is switched on with:

    arConfig.frameSemantics.insert(ARConfiguration.FrameSemantics.personSegmentationWithDepth)

    Occlusion works fine in the live AR view, but in the recorded video it does not.

    opened by cortinas 0
Releases
  • 0.16.1 (Dec 4, 2019)

    • addresses an issue where updateVideoOutputSettings was called after adding an input instead of when initially setting up the video output, causing frame drops and delays
  • 0.9.0 (Nov 14, 2017)

    The ARKit mode is in beta, so it is not compiled by default; it requires the Swift compiler flag USE_ARKIT. Apple will reject apps that link ARKit and do not use it. To try it out, set up the AppDelegate to load the MixedRealityViewController class and include the Xcode build settings flag.

    If you use CocoaPods, you can include -DUSE_ARKIT with the following Podfile addition or by adding it to your Xcode build settings.

      installer.pods_project.targets.each do |target|
        # setup NextLevel for ARKit use
        if target.name == 'NextLevel'
          target.build_configurations.each do |config|
            config.build_settings['OTHER_SWIFT_FLAGS'] = '-DUSE_ARKIT'
          end
        end
      end
    