Overview

Bluetoothed ARKit 2.0 with ARWorldMap!

Since Apple introduced ARKit 2, we have been working steadily on shared-AR experiences. Our goal is to improve the utility of mobile apps through AR.

This demo, created using ARKit 2:

  • Creates geo-localized AR experiences on top of ARWorldMap
  • Detects objects and images
  • Marks specific objects and creates 3D renders in the point cloud
  • Shares information locally over BLE (Bluetooth Low Energy)

Features in this demo:

  • Image tracking
  • Save and load maps
  • Detect objects
  • Environmental texturing

Prerequisites

Before we dive into the implementation details, let's take a look at the prerequisites for this demo.

  • Xcode 10 (beta or above)
  • iOS 12 (beta or above)
  • A physical iPhone 6s or newer

Image recognition and tracking

"A picture is worth a thousand words." Words are fine, but ARKit 2 turns a photo into thousands of stories.

Among the new developer features introduced at WWDC 2018, image detection and tracking is one of the coolest. Imagine being able to attach contextual information to, or even replace, any static image you see.

Image detection was introduced in ARKit 1.5, but the feature was fairly limited at the time. With this release, you can build much richer AR experiences. Take a look at the demo below:


  • Class : AVImageDetection.swift
let configuration = ARImageTrackingConfiguration()
// The reference image's physicalWidth is given in meters.
let warrior = ARReferenceImage(UIImage(named: "DD")!.cgImage!,
                               orientation: .up,
                               physicalWidth: 0.90)
configuration.trackingImages = [warrior]
configuration.maximumNumberOfTrackedImages = 1
sceneView.session.run(configuration)

You can perform a custom action once an image is detected. In this demo, we overlay a GIF in place of the detected image.

func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    let node = SCNNode()
    if let imageAnchor = anchor as? ARImageAnchor {
        // Overlay a plane matching the detected image's physical size.
        let plane = SCNPlane(width: imageAnchor.referenceImage.physicalSize.width,
                             height: imageAnchor.referenceImage.physicalSize.height)
        plane.firstMaterial?.diffuse.contents = UIColor(white: 1, alpha: 0.8)
        let material = SCNMaterial()
        // viewObj is the view that plays the replacement GIF.
        material.diffuse.contents = viewObj
        plane.materials = [material]
        let planeNode = SCNNode(geometry: plane)
        planeNode.eulerAngles.x = -.pi / 2
        node.addChildNode(planeNode)
    } else {
        // For non-image anchors, add the sphere scene only once.
        if isFirstTime {
            isFirstTime = false
        } else {
            return node
        }
        let plane = SCNPlane(width: 5, height: 5)
        plane.firstMaterial?.diffuse.contents = UIColor(white: 1, alpha: 1)
        let planeNode = SCNNode(geometry: plane)
        planeNode.eulerAngles.x = .pi
        let shipScene = SCNScene(named: "art.scnassets/Sphere.scn")!
        let shipNode = shipScene.rootNode.childNodes.first!
        shipNode.position = SCNVector3Zero
        shipNode.position.z = 0.15
        planeNode.addChildNode(shipNode)
        node.addChildNode(planeNode)
    }
    return node
}

Save and load maps

ARKit 2 comes with the revolutionary ARWorldMap, which enables persistent and multiuser AR experiences. In simpler words, ARWorldMap not only lets you render AR experiences and place objects, it also builds awareness of your user's physical space and makes it available to your app. This means you can detect and anchor real-world features in your iOS app.

You can then use these anchored features to place virtual content (funny GIFs, anyone?) in your application.

We are going to build a shared AR experience along these lines. Let's dive into the technical implementation.

First, we get the current world map from a user's iPhone using .getCurrentWorldMap(). We save this session's spatial awareness and initial anchors, which we are going to share with another iPhone user.

  • Class : AVSharingWorldMapVC.swift
sceneView.session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap
        else { print("Error: \(error!.localizedDescription)"); return }
        guard let data = try? NSKeyedArchiver.archivedData(withRootObject: map, requiringSecureCoding: true)
        else { fatalError("can't encode map") }
        self.multipeerSession.sendToAllPeers(data)
}
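
Because this section is about saving and loading maps, the same archived data can also be written to disk and restored in a later session. Here is a minimal sketch of that flow, assumed to live in the same view controller that owns sceneView; the file name and helper names are illustrative, not part of the demo project:

// Hypothetical location for the serialized world map in the app's Documents directory.
var saveURL: URL {
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
    return documents.appendingPathComponent("worldMap.arexperience")
}

// Save: archive the ARWorldMap and write it to disk.
func save(map: ARWorldMap) throws {
    let data = try NSKeyedArchiver.archivedData(withRootObject: map, requiringSecureCoding: true)
    try data.write(to: saveURL, options: [.atomic])
}

// Load: unarchive the map and run a new session with it as the initial world map.
func loadMapAndRun() throws {
    let data = try Data(contentsOf: saveURL)
    guard let worldMap = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self, from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = worldMap
    sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}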

Once you have the session and world-map anchors from the first iPhone, the app uses the Multipeer Connectivity framework to push that information over a peer-to-peer network to the second iPhone.
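
The multipeerSession.sendToAllPeers(_:) call above comes from a small wrapper around Multipeer Connectivity's MCSession. The wrapper is not shown in full here; a minimal sketch of the sending side, with assumed class and property names, could look like this:

import MultipeerConnectivity
import UIKit

class MultipeerSession: NSObject {
    // Assumed names; the real wrapper also handles advertising, browsing, and delegates.
    let myPeerID = MCPeerID(displayName: UIDevice.current.name)
    lazy var session = MCSession(peer: myPeerID, securityIdentity: nil, encryptionPreference: .required)

    func sendToAllPeers(_ data: Data) {
        guard !session.connectedPeers.isEmpty else { return }
        do {
            // .reliable guarantees ordered, lossless delivery of the archived world map.
            try session.send(data, toPeers: session.connectedPeers, with: .reliable)
        } catch {
            print("Error sending data to peers: \(error.localizedDescription)")
        }
    }
}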

The code below shows how the second iPhone receives the session data sent over Multipeer Connectivity. The receiving device gets the data through a receive handler; once it arrives, we unarchive the world map and run it in the AR session.

func receivedData(_ data: Data, from peer: MCPeerID) {
    // Unarchive the ARWorldMap that was sent over Multipeer Connectivity.
    if let unarchived = try? NSKeyedUnarchiver.unarchivedObject(ofClasses: [ARWorldMap.self], from: data),
       let worldMap = unarchived as? ARWorldMap {
        // Run the session with the received world map.
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        configuration.initialWorldMap = worldMap
        sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
        mapProvider = peer
    }
}

If you want more hands-on experience with multiuser AR, Apple has a demo project just for that. You can download the demo here.

Object Detection

The new ARKit gives you the ability to scan 3D objects in the real world, creating a map of the scanned object that can be loaded when the object comes into view in the camera. Similar to ARWorldMap, ARKit produces an ARReferenceObject that can be saved and loaded during another session.


  • Class : AVReadARObjectVC.swift
let configuration = ARWorldTrackingConfiguration()

// Use ARReferenceObject(archiveURL:) when the ARReferenceObject is stored in the local Documents directory.
configuration.detectionObjects = Set([try ARReferenceObject(archiveURL: objectURL!)])

// Use ARReferenceObject.referenceObjects(inGroupNamed:bundle:) when the reference objects are stored in an asset catalog.
// Pick one of the two assignments; the second one overwrites the first.
configuration.detectionObjects = ARReferenceObject.referenceObjects(inGroupNamed: "", bundle: .main)!

sceneView.session.run(configuration)
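
A scanned ARReferenceObject can also be written to disk so it can be detected again in a later session. A minimal sketch, with an illustrative file name:

// Export a scanned reference object to the Documents directory.
func saveScannedObject(_ object: ARReferenceObject) throws {
    let documents = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
    let url = documents.appendingPathComponent("scannedObject.arobject")
    // previewImage is optional; pass nil to skip the thumbnail.
    try object.export(to: url, previewImage: nil)
}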

When an object is detected, this delegate method is called:

   func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? 

You can perform your custom action inside this delegate method, for example as shown below.
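
A minimal sketch that checks for an ARObjectAnchor and attaches a simple marker node at the detected object's center (the sphere size and color are arbitrary choices, not the demo's actual content):

func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    // Only react to detected reference objects.
    guard let objectAnchor = anchor as? ARObjectAnchor else { return nil }

    // Place a small red sphere at the center of the detected object.
    let center = objectAnchor.referenceObject.center
    let sphere = SCNSphere(radius: 0.01)
    sphere.firstMaterial?.diffuse.contents = UIColor.red
    let markerNode = SCNNode(geometry: sphere)
    markerNode.position = SCNVector3(center.x, center.y, center.z)

    let node = SCNNode()
    node.addChildNode(markerNode)
    return node
}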

Environment Texturing

In previous versions of ARKit, 3D objects placed in the real world couldn't gather much information about the world around them. This left objects looking unrealistic and out of place. Now, with environment texturing, objects can reflect the world around them, giving them a greater sense of realism and place. When the user scans the scene, ARKit records and maps the environment onto a cube map. This cube map is then applied to the 3D object, allowing it to reflect back the environment around it.

What's even cooler is that Apple uses machine learning to generate the parts of the cube map that can't be recorded by the camera, such as overhead lights or other aspects of the scene. This means that even if a user isn't able to scan the entire scene, the object will still look as if it exists in that space, because it can reflect objects that aren't even directly in the scene.

To enable environment texturing, we simply set the configuration's environmentTexturing property to .automatic.
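
A minimal sketch of enabling it when configuring the session (using the same sceneView as in the earlier snippets):

let configuration = ARWorldTrackingConfiguration()
// Ask ARKit to generate environment probe textures automatically (iOS 12+).
configuration.environmentTexturing = .automatic
sceneView.session.run(configuration)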

Apple has created a project that can be used to scan 3D objects; it can be downloaded here.

Inspired

This demo is based on Apple's ARKit 2 sample demos.
