Code examples for Depth APIs in iOS

Overview

iOS-Depth-Sampler


Code examples of Depth APIs in iOS

Requirement

Use a device that has a dual camera (e.g. iPhone 8 Plus) or a TrueDepth camera (e.g. iPhone X)

How to build

Open iOS-Depth-Sampler.xcworkspace with Xcode 10 and build it!

It can NOT run on the Simulator, because it uses Metal.

Contents

Real-time Depth

Depth visualization in real time using AVFoundation.
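
Setting up such a stream boils down to attaching an AVCaptureDepthDataOutput to a capture session. A minimal sketch (not the repo's exact code), assuming a depth-capable back camera:

    import AVFoundation

    // A sketch: add a depth output to a capture session so AVFoundation
    // delivers an AVDepthData alongside each video frame.
    let session = AVCaptureSession()
    session.beginConfiguration()
    session.sessionPreset = .photo

    // .builtInTrueDepthCamera (front) also works on Face ID devices.
    if let device = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back),
       let input = try? AVCaptureDeviceInput(device: device),
       session.canAddInput(input) {
        session.addInput(input)
    }

    let depthOutput = AVCaptureDepthDataOutput()
    depthOutput.isFilteringEnabled = true // hole-filled, temporally smoothed depth
    if session.canAddOutput(depthOutput) {
        session.addOutput(depthOutput)
    }
    session.commitConfiguration()
    // Call depthOutput.setDelegate(_:callbackQueue:) to receive each AVDepthData.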

Real-time Depth Mask

Blending a background image with a mask created from depth.
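
One way to do this kind of blend is with Core Image, assuming the depth map has already been converted into a grayscale mask image (a sketch only; the repo itself renders via Metal):

    import CoreImage

    // A sketch: composite the camera image over a background image, using a
    // grayscale mask derived from the depth map (white = keep camera pixel).
    func blend(camera: CIImage, background: CIImage, depthMask: CIImage) -> CIImage? {
        guard let filter = CIFilter(name: "CIBlendWithMask") else { return nil }
        filter.setValue(camera, forKey: kCIInputImageKey)
        filter.setValue(background, forKey: kCIInputBackgroundImageKey)
        filter.setValue(depthMask, forKey: kCIInputMaskImageKey)
        return filter.outputImage
    }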

Depth from Camera Roll

Depth visualization from pictures in the camera roll.

Please try this after taking a picture with the Camera app in PORTRAIT mode.
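
Reading the depth back out of such a photo goes through ImageIO's auxiliary-data API; Portrait photos store disparity, which AVDepthData can convert to depth. A sketch, where depthData(from:) is a hypothetical helper:

    import ImageIO
    import AVFoundation

    // A sketch: extract the auxiliary disparity data embedded in a
    // Portrait-mode photo at the given file URL.
    func depthData(from url: URL) -> AVDepthData? {
        guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
              let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
                  source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any]
        else { return nil }
        return try? AVDepthData(fromDictionaryRepresentation: info)
    }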

Portrait Matte

Background removal demo using the Portrait Effects Matte.

Please try this after taking a picture of a HUMAN in PORTRAIT mode.

Available in iOS 12 or later.
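
Reading the matte works much like reading depth, via the iOS 12 auxiliary-data type. A sketch, where portraitMatte(from:) is a hypothetical helper:

    import ImageIO
    import AVFoundation

    // A sketch: read the Portrait Effects Matte (iOS 12+) embedded in a
    // Portrait-mode photo of a person.
    func portraitMatte(from url: URL) -> AVPortraitEffectsMatte? {
        guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
              let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
                  source, 0, kCGImageAuxiliaryDataTypePortraitEffectsMatte) as? [AnyHashable: Any]
        else { return nil }
        return try? AVPortraitEffectsMatte(fromDictionaryRepresentation: info)
    }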

ARKit Depth

Depth visualization on ARKit. Depth on ARKit is available only when using ARFaceTrackingConfiguration.
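
A sketch of the relevant hook (not the repo's view controller): with ARFaceTrackingConfiguration running, each ARFrame exposes TrueDepth data through capturedDepthData, which is nil on frames without a fresh depth sample:

    import ARKit

    // A sketch: receive per-frame TrueDepth data from a face-tracking session.
    class DepthSessionDelegate: NSObject, ARSessionDelegate {
        func session(_ session: ARSession, didUpdate frame: ARFrame) {
            // Depth arrives at a lower rate than video frames, so this is
            // often nil in between depth samples.
            guard let depth = frame.capturedDepthData else { return }
            let depthMap: CVPixelBuffer = depth.depthDataMap
            // Convert depthMap to a grayscale texture for visualization here.
            _ = depthMap
        }
    }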

2D image in 3D space

A demo to render a 2D image in 3D space.

AR occlusion

[WIP] An occlusion sample on ARKit using depth.

Author

Shuichi Tsutsumi

Freelance iOS programmer from Japan.

Support via PayPal
Comments
  • AVCaptureDevice+Extension selectDepthFormat parses empty availableFormats which results in fatalError

    Hi,

    Thanks for this great repo. I've had a play around with the samples and it seems only the front camera is working properly. The device I am using is an iPhone 11 and my Xcode version is 12.2.

    When I try to run Real-time Depth and Real-time Depth Mask, I run into a fatal error in AVCaptureDevice+Extension line 39.

    I've tried to have a look at it myself, but since I'm relatively new to Swift I've been unable to resolve the problem. availableFormats is always empty, which results in the guard failing. I'm not sure why, since the format field is always populated.

    Am I doing something wrong?

    opened by E-Mac27 1
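
    For anyone hitting this, a defensive variant of the format selection (hypothetical code, not the repo's extension) would return nil instead of crashing when no Float32 depth format is available:

        import AVFoundation

        // Hypothetical defensive rewrite: return nil instead of hitting a
        // fatalError when the active format exposes no Float32 depth formats.
        extension AVCaptureDevice {
            var bestDepthFormat: AVCaptureDevice.Format? {
                let depthFormats = activeFormat.supportedDepthDataFormats.filter {
                    CMFormatDescriptionGetMediaSubType($0.formatDescription)
                        == kCVPixelFormatType_DepthFloat32
                }
                // Pick the widest (highest-resolution) depth format, if any.
                return depthFormats.max {
                    CMVideoFormatDescriptionGetDimensions($0.formatDescription).width
                        < CMVideoFormatDescriptionGetDimensions($1.formatDescription).width
                }
            }
        }
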
  • 4 compilation issues on Xcode 10.0, Mojave

    I cloned master @106813e0860f381f1821baddf76b9cdeebfcc490, opened the .xcworkspace, changed the signing, and then tried to compile, which resulted in four compilation errors:

    /Users/jon/git/iOS-Depth-Sampler/iOS-Depth-Sampler/Renderer/MetalRenderer.swift:71:39: 
    Value of type 'MTLDrawable' has no member 'texture'
    
    /Users/jon/git/iOS-Depth-Sampler/iOS-Depth-Sampler/Samples/Realtime-Depth/RealtimeDepthViewController.swift:38:54:
     Value of type 'MTLDrawable' has no member 'layer'
    
    /Users/jon/git/iOS-Depth-Sampler/iOS-Depth-Sampler/Samples/ARKit/ARKitDepthViewController.swift:34:28: 
    Value of type 'ARSCNView?' has no member 'device'
    
    /Users/jon/git/iOS-Depth-Sampler/iOS-Depth-Sampler/Samples/ARKit/ARKitDepthViewController.swift:39:54:
     Value of type 'MTLDrawable' has no member 'layer'
    
    opened by jondwillis 1
  • convert depth16 to depth32 before grayPixelData

    Regarding the place in the grayPixelData method where the index was halved: I suspected this was because the 16-bit depth format was being scanned as 32-bit. I confirmed that by running the converting(toDepthDataType: kCVPixelFormatType_DepthFloat32) method and reversing the z-axis, the index can be used as-is, so I've submitted this pull request.

    opened by tatsuya-ogawa 0
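
    The conversion the PR describes is a one-liner on AVDepthData; a sketch, with toFloat32 as a hypothetical helper:

        import AVFoundation

        // Sketch of the conversion the PR describes: normalize the depth map
        // to 32-bit floats before indexing its pixel data, so each element is
        // exactly one Float.
        func toFloat32(_ depthData: AVDepthData) -> CVPixelBuffer {
            let depth32 = depthData.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32)
            return depth32.depthDataMap
        }
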
  • Can ARKit distinguish real world face from picture face?

    I am using ARKit to detect faces with depthData.

    However, I am not able to distinguish whether this face is a real world person or a picture placed in front of the device camera.

    opened by malhobayyeb 2
  • Adding example to rectify lens distortion in depth images

    Hi there!

    Thank you very much for creating this open source repo; I plan to use it for future iOS machine learning projects! In the WWDC 2017 talk, Apple discusses that the depth output is geometrically distorted to align with images produced by the camera. They mention that to get precise, true depth measurements, you need to correct for lens distortion. The talk says that a reference implementation for correcting lens distortion is included as a comment in the AVCameraCalibrationData.h file.

    It would be great if you could add an example view controller that a user taps to see a rectified depth image, enabling developers to work with precise, true depth measurements. I've attached the reference implementation from AVCameraCalibrationData.h for ease of reference; if anyone can add this, it would be amazing!

    The following reference implementation illustrates how to use the lensDistortionLookupTable,
    inverseLensDistortionLookupTable, and lensDistortionCenter properties to find points in the
    lens-distorted or undistorted (rectilinear, corrected) space. If you have a distorted image
    (such as a photo taken by a camera) and want to find a particular point in a corresponding
    undistorted image, you would call the sample method below using the
    inverseLensDistortionLookupTable. If you have an undistorted (aka distortion-corrected) image
    and want to find a point in the distorted image's space, you would call the sample method
    below using the lensDistortionLookupTable.

    To apply distortion correction to an image, you'd begin with an empty destination buffer and
    iterate through it row by row, calling the sample implementation below for each point in the
    output image, passing the lensDistortionLookupTable to find the corresponding value in the
    distorted image, and write it to your output buffer. Please note that the "point",
    "opticalCenter", and "imageSize" parameters below must be in the same coordinate system,
    i.e. both at full resolution, or both scaled to a different resolution but with the same
    aspect ratio.

    The reference function below returns floating-point x and y values. If you wish to match the
    results with actual pixels in a bitmap, you should either round to the nearest integer value
    or interpolate from surrounding integer positions (i.e. bilinear interpolation from the 4
    surrounding pixels).
     
    - (CGPoint)lensDistortionPointForPoint:(CGPoint)point
                               lookupTable:(NSData *)lookupTable
                   distortionOpticalCenter:(CGPoint)opticalCenter
                                 imageSize:(CGSize)imageSize
    {
        // The lookup table holds the relative radial magnification for n linearly spaced radii.
        // The first position corresponds to radius = 0
        // The last position corresponds to the largest radius found in the image.
     
        // Determine the maximum radius.
        float delta_ocx_max = MAX( opticalCenter.x, imageSize.width  - opticalCenter.x );
        float delta_ocy_max = MAX( opticalCenter.y, imageSize.height - opticalCenter.y );
        float r_max = sqrtf( delta_ocx_max * delta_ocx_max + delta_ocy_max * delta_ocy_max );
     
        // Determine the vector from the optical center to the given point.
        float v_point_x = point.x - opticalCenter.x;
        float v_point_y = point.y - opticalCenter.y;
     
        // Determine the radius of the given point.
        float r_point = sqrtf( v_point_x * v_point_x + v_point_y * v_point_y );
     
        // Look up the relative radial magnification to apply in the provided lookup table
        float magnification;
        const float *lookupTableValues = lookupTable.bytes;
        NSUInteger lookupTableCount = lookupTable.length / sizeof(float);
     
        if ( r_point < r_max ) {
            // Linear interpolation
            float val   = r_point * ( lookupTableCount - 1 ) / r_max;
            int   idx   = (int)val;
            float frac  = val - idx;
     
            float mag_1 = lookupTableValues[idx];
            float mag_2 = lookupTableValues[idx + 1];
     
            magnification = ( 1.0f - frac ) * mag_1 + frac * mag_2;
        }
        else {
            magnification = lookupTableValues[lookupTableCount - 1];
        }
     
        // Apply radial magnification
        float new_v_point_x = v_point_x + magnification * v_point_x;
        float new_v_point_y = v_point_y + magnification * v_point_y;
     
        // Construct output
        return CGPointMake( opticalCenter.x + new_v_point_x, opticalCenter.y + new_v_point_y );
    }
    
    opened by interactivetech 5
  • Sample of creating a custom depth image and adding it to an image

    Hi, I was wondering if you could add a sample that loads a grayscale JPG image and adds it as the depth map of another image that does not already have depth data?

    opened by dredenba 1
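
    For reference, the writing half of this is possible with ImageIO's auxiliary-data API. The sketch below (a hypothetical helper, not in the repo) embeds an existing AVDepthData into a JPEG; constructing the AVDepthData itself from an arbitrary grayscale JPEG is the harder part and is omitted here:

        import ImageIO
        import AVFoundation
        import MobileCoreServices

        // Hypothetical sketch: write `image` to a JPEG at `url`, attaching
        // `depth` as auxiliary depth data.
        func write(image: CGImage, depth: AVDepthData, to url: URL) {
            guard let dest = CGImageDestinationCreateWithURL(url as CFURL, kUTTypeJPEG, 1, nil)
            else { return }
            CGImageDestinationAddImage(dest, image, nil)

            var auxType: NSString?
            if let auxInfo = depth.dictionaryRepresentation(forAuxiliaryDataType: &auxType),
               let type = auxType {
                CGImageDestinationAddAuxiliaryDataInfo(dest, type as CFString, auxInfo as CFDictionary)
            }
            CGImageDestinationFinalize(dest)
        }
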
Owner
Shuichi Tsutsumi
Freelance iOS Programmer
An iOS Framework Capture & record ARKit videos 📹, photos 🌄, Live Photos 🎇, and GIFs 🎆.

An iOS Framework that enables developers to capture videos 📹, photos 🌄, Live Photos 🎇, and GIFs 🎆 with ARKit content.

Ahmed Bekhit 1.5k Dec 24, 2022
Sample iOS AR app that demonstrates how to capture the texture of a user's face in realtime.

Sample iOS AR app that demonstrates how to capture the texture of a user's face in realtime. This texture can be used to create a simple textured 3D face model.

Matt Bierner 58 Dec 14, 2022
iOS example app to generate point clouds in ARKit using scene depth

Visualizing a Point Cloud Using Scene Depth Place points in the real world using the scene's depth data to visualize the shape of the physical environment.

Isak Diaz 20 Oct 31, 2022
PlacenoteSDK Sample app in native iOS using ARKit, written primarily in Swift

Placenote SDK for iOS Placenote SDK lets you easily build cloud-based Augmented Reality (AR) apps that pin digital content to locations in the real world.

Placenote 93 Nov 15, 2022
This is a sample AR project written in Swift for iOS devices

ARSample This is a sample AR project written in Swift for iOS devices. While I was learning the ARKit framework, I defined this project and

Kamyar Sehati 2 Jun 27, 2022
An iOS app 📱 that detects the image and plays video on top of it, just like the Harry Potter movies

AR-Magic-Image This is an iOS application 📱 inspired by the Harry Potter movies. It tracks the image and plays the assigned video on top of the image u

Vatsal Patel 4 Dec 17, 2022
Furniture E-Commerce Augmented Reality(AR) app in iOS powered by ARKit

HomeMax-iOS Furniture E-Commerce Augmented Reality(AR) app in iOS powered by ARKit and SceneKit. Inspired by IKEA place app. Description Experience on

Ikmal Azman 5 Oct 14, 2022
ARVideoPortal - A Minimal iOS AR app to display 360 / video in sphere space

AR Video Portal A minimal iOS AR app to display 360 / video in sphere space. Xco

Yasuhito Nagatomo 6 Jun 3, 2022
ARInRoomISS - A minimal iOS AR app that displays the International Space Station (ISS) in the room

A minimal iOS AR app to display the International Space Station (ISS) in the room.

Yasuhito Nagatomo 3 Feb 3, 2022
AREarthObservatory - A minimal iOS AR app that visualizes time-series changes in the global environment based on data from NASA satellites

A minimal iOS AR app that visualizes time-series changes in the global environment based on data from NASA satellites

Yasuhito Nagatomo 3 Aug 11, 2022
A minimal iOS AR app with the Comic Postprocess Effect

AR Comic Effect A minimal iOS AR app with the Comic Postprocess Effect. Xcode 13.2.1 Target: iOS / iPadOS 15.0+ SwiftUI, ARKit, RealityKit 2, Core Ima

Yasuhito Nagatomo 14 Jun 22, 2022
AR Ruler - A simple iOS app made using ARKit and SceneKit

A simple iOS app made using ARKit and SceneKit.Which can try to simplify little things in your life such as measuring stuff.

Dishant Nagpal 5 Aug 31, 2022
Develop simple and fun Augmented Reality (AR) iOS apps

AR-Dice Simple and fun to use iOS app made to make Augmented reality (AR) be in

Dishant Nagpal 1 Feb 23, 2022
A minimal iOS AR app that displays a wave animation using RealityKit2 Geometry Modifier

AR Simple Sea A minimal iOS AR app that displays a wave animation using RealityKit2 Geometry Modifier. Xcode 13.3 Target: iOS / iPadOS 15.0+ SwiftUI,

Yasuhito Nagatomo 15 Dec 29, 2022
A sample collection of basic functions of Apple's AR framework for iOS.

RealityKit-Sampler RealityKitSampler is a sample collection of basic functions of RealityKit, Apple's AR framework for iOS. How to build 1, Download o

MLBoy 74 Dec 21, 2022
Unofficial Google ARCore Swift Package for iOS

Google didn't want to make a Swift package for ARCore… so let's do it instead.

Max Cobb 4 Jul 15, 2022
A minimal iOS AR app that displays virtual objects at specific geographical locations, in an AR scene.

AR Simple GeoLocation A minimal iOS AR, Augmented Reality, app that displays virtual objects at specific geographical locations, in an AR scene. With t

Yasuhito Nagatomo 23 Dec 16, 2022
Smart Online Shopping iOS App with Augmented Reality (AR) and simple Social Media features using SwiftUI and Google Firebase Cloud Services

Table of contents App Demo How to Run Context Content How it's written Inspiration App Demo AR.online.shopping.iOS.demo.mp4 How to Run First make sure

Ashkan Goharfar 13 Nov 1, 2022
Using ARKit and LiDAR to save depth data and export point cloud, based on WWDC20-10611 sample code

Save iOS ARFrame and Point Cloud This project improves the usability of the sample code from WWDC20 session 10611: Explore ARKit 4. Note that the samp

null 4 Dec 22, 2022
Visualize your dividend growth. DivRise tracks dividend prices of your stocks, gives you in-depth information about dividend paying stocks like the next dividend date and allows you to log your monthly dividend income.

DivRise DivRise is an iOS app written in Pure SwiftUI that tracks dividend prices of your stocks, gives you in-depth information about dividend paying stocks like the next dividend date and allows you to log your monthly dividend income.

Kevin Li 78 Oct 17, 2022