FaceTrigger

Easily use ARKit to detect facial gestures.

Introduction

FaceTrigger is a simple-to-use class that hides the details of using ARKit's ARSCNView to recognize facial gestures via ARFaceAnchor.BlendShapeLocation.

Simply create an instance of FaceTrigger and register yourself as its delegate. As a FaceTrigger delegate, your class will know when face gestures occur. All delegate functions are optional.

FaceTrigger recognizes the following gestures:

  • Smile
  • Blink
  • Wink Right
  • Wink Left
  • Brow Down
  • Brow Up
  • Squint
  • Cheek Puff
  • Mouth Pucker
  • Jaw Open
  • Jaw Left
  • Jaw Right

Additional gestures can be added to the project by implementing a new class that conforms to FaceTriggerEvaluatorProtocol. Submit a PR!
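For illustration, here is a hypothetical evaluator for a "Tongue Out" gesture. The exact requirements of FaceTriggerEvaluatorProtocol are defined in the FaceTrigger source, so this sketch assumes an evaluate method that receives ARKit's blend shape values and forwards events to the delegate; the onTongueOut callbacks used here are also hypothetical and would need to be added to FaceTriggerDelegate.

import ARKit

class TongueOutEvaluator: FaceTriggerEvaluatorProtocol {

  private var oldValue = false
  private let threshold: Float

  init(threshold: Float) {
    self.threshold = threshold
  }

  func evaluate(_ blendShapes: [ARFaceAnchor.BlendShapeLocation: NSNumber], forDelegate delegate: FaceTriggerDelegate) {

    // .tongueOut is a standard ARKit blend shape (ARKit 2 and later).
    guard let value = blendShapes[.tongueOut]?.floatValue else { return }

    let newValue = value >= threshold
    if newValue != oldValue {
      // These delegate methods are hypothetical; a matching
      // onTongueOut/onTongueOutDidChange pair would need to be
      // added to FaceTriggerDelegate.
      delegate.onTongueOutDidChange?(tongueOut: newValue)
      if newValue {
        delegate.onTongueOut?()
      }
    }
    oldValue = newValue
  }
}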

Demo

For a first-hand experience, run the FaceTriggerExample application on a supported device.

FaceTrigger requires a device that supports ARKit face tracking such as an iPhone X, XS, or XR.
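If you want to guard against unsupported hardware at runtime, ARKit exposes a capability check. This snippet is illustrative and not part of FaceTrigger itself:

import ARKit

override func viewDidAppear(_ animated: Bool) {
  super.viewDidAppear(animated)

  // Face tracking requires a TrueDepth camera; bail out early on
  // unsupported hardware.
  guard ARFaceTrackingConfiguration.isSupported else {
    print("Face tracking is not supported on this device.")
    return
  }

  faceTrigger = FaceTrigger(hostView: view, delegate: self)
  faceTrigger?.start()
}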

Download Full Video Demo

Demo Still

Installation

CocoaPods

For general information about using CocoaPods, please read: Using CocoaPods

Add FaceTrigger to your Podfile.

pod 'FaceTrigger'

Then install it.

$ pod install

Manual

Drag the FaceTrigger.xcodeproj file into your project.

or

Copy the .swift files from the FaceTrigger folder into your project: FaceTrigger/FaceTrigger/*.swift

Usage

TL;DR example

import UIKit
import FaceTrigger

class MyViewController: UIViewController {

  // A UIView in your layout that will host the camera preview.
  @IBOutlet weak var previewContainer: UIView!

  var faceTrigger: FaceTrigger?

  override func viewDidAppear(_ animated: Bool) {
      super.viewDidAppear(animated)

      faceTrigger = FaceTrigger(hostView: previewContainer, delegate: self)
      faceTrigger?.start()
  }
}

extension MyViewController: FaceTriggerDelegate {
  func onSmile() {
      print("smile")
  }
}

Initialization

Import FaceTrigger.

import FaceTrigger

Create an instance variable of type FaceTrigger in your view controller.

class MyViewController: UIViewController {

  var faceTrigger: FaceTrigger?

In viewDidAppear, create a new instance of FaceTrigger and assign it to your controller's instance variable.

override func viewDidAppear(_ animated: Bool) {
  super.viewDidAppear(animated)

  faceTrigger = FaceTrigger(hostView: view, delegate: self)
  faceTrigger?.start()
}

Note that you must store the FaceTrigger in an instance variable on your controller. You cannot create it locally. This will NOT work: let faceTrigger = FaceTrigger(hostView: view, delegate: self). The ARSCNViewDelegate functions won't trigger, most likely because a local instance is deallocated as soon as the method returns, leaving nothing alive to receive the callbacks.
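To make the pitfall concrete, here is the same setup with the broken local variant shown commented out:

class MyViewController: UIViewController {

  // Correct: a stored property keeps the FaceTrigger alive as long as
  // the view controller exists.
  var faceTrigger: FaceTrigger?

  override func viewDidAppear(_ animated: Bool) {
    super.viewDidAppear(animated)

    // Wrong: this local instance is released when the method returns,
    // so no delegate callbacks will ever arrive.
    // let faceTrigger = FaceTrigger(hostView: view, delegate: self)

    faceTrigger = FaceTrigger(hostView: view, delegate: self)
    faceTrigger?.start()
  }
}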

FaceTrigger(hostView: view, delegate: self)

hostView

This is the view that will contain the ARSCNView, which displays the video stream of the user's face.

You do not need to display this view. If you do not want to show it, just pass your view controller's view object as the hostView and set the hidePreview attribute to true. The default is false, which displays the video stream.

If you do want to show the video stream of the user's face, pass any UIView in your UI as the hostView and do not set the hidePreview attribute (or explicitly set it to false).

delegate

The delegate is an object conforming to the FaceTriggerDelegate protocol; it will receive callbacks when facial gestures are detected. In a simple case this can just be your view controller. For example:

extension MyViewController: FaceTriggerDelegate {

    func onSmile() {
        print("smile")
    }
}

Responding to facial gestures

The FaceTriggerDelegate protocol defines several functions that will be called when a gesture is recognized. All FaceTriggerDelegate protocol functions are optional - your application only needs to implement the gestures that you care about.

See the FaceTriggerDelegate protocol for all available delegate functions.

For example, to detect when the user smiles, your delegate may implement the onSmile function.

func onSmile() {
  print("smile")
}

Each gesture also has an "onChange" variant (for example, onSmileDidChange), which gives you extra control when you need to know when the user stops performing the gesture. For example:

func onSmileDidChange(smiling: Bool) {
  print("onSmileDidChange \(smiling)")
}

In the example above, the smiling parameter will be true when the user begins smiling, and false when the user stops smiling. Note that when smiling is true, the onSmile function will also be run if you have implemented it.
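For instance, you might use the change callback to keep a piece of UI in sync with the gesture state. In this sketch, statusLabel is a hypothetical UILabel in your layout:

func onSmileDidChange(smiling: Bool) {
  // Reflect the current gesture state in the UI.
  // statusLabel is a hypothetical UILabel in your layout.
  statusLabel.text = smiling ? "Smiling" : "Not smiling"
}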

@objc public protocol FaceTriggerDelegate: ARSCNViewDelegate {

  @objc optional func onSmile()
  @objc optional func onSmileDidChange(smiling: Bool)

  @objc optional func onBlink()
  @objc optional func onBlinkDidChange(blinking: Bool)

  @objc optional func onBlinkLeft()
  @objc optional func onBlinkLeftDidChange(blinkingLeft: Bool)

  @objc optional func onBlinkRight()
  @objc optional func onBlinkRightDidChange(blinkingRight: Bool)

  @objc optional func onCheekPuff()
  @objc optional func onCheekPuffDidChange(cheekPuffing: Bool)

  @objc optional func onMouthPucker()
  @objc optional func onMouthPuckerDidChange(mouthPuckering: Bool)

  @objc optional func onJawOpen()
  @objc optional func onJawOpenDidChange(jawOpening: Bool)

  @objc optional func onJawLeft()
  @objc optional func onJawLeftDidChange(jawLefting: Bool)

  @objc optional func onJawRight()
  @objc optional func onJawRightDidChange(jawRighting: Bool)

  @objc optional func onBrowDown()
  @objc optional func onBrowDownDidChange(browDown: Bool)

  @objc optional func onBrowUp()
  @objc optional func onBrowUpDidChange(browUp: Bool)

  @objc optional func onSquint()
  @objc optional func onSquintDidChange(squinting: Bool)
}

Additional options

You may set additional options on the FaceTrigger object to modify its behavior. These must be set prior to calling .start().

Gesture thresholds

Each gesture is triggered when the user performs the gesture to a certain degree, on a scale from 0.0 to 1.0. For example, a small smile may only have a value of 0.2, and if the smileThreshold is 0.8, the onSmile() function will not be called until the user smiles "harder".

The threshold for each gesture has been set to a default value. You may find that you want to increase or decrease these. You can do so by setting the corresponding threshold on the FaceTrigger object prior to calling .start().

For example, to require the user to smile very hard before the onSmile() function will be called, increase the smileThreshold attribute to 0.99 (up from the default value of 0.7).

faceTrigger = FaceTrigger(hostView: previewContainer, delegate: self)
faceTrigger?.smileThreshold = 0.99
faceTrigger?.start()

The default threshold values for each gesture can be found in FaceTrigger.swift.

public var smileThreshold: Float = 0.7
public var blinkThreshold: Float = 0.8
public var browDownThreshold: Float = 0.25
public var browUpThreshold: Float = 0.95
public var cheekPuffThreshold: Float = 0.2
public var mouthPuckerThreshold: Float = 0.7
public var jawOpenThreshold: Float = 0.9
public var jawLeftThreshold: Float = 0.3
public var jawRightThreshold: Float = 0.3
public var squintThreshold: Float = 0.8

Hide Preview

Set the hidePreview attribute to true if your application does not need to show the video stream of the user's face. Your application will still track the user's face, and your FaceTriggerDelegate will still receive calls when a gesture is performed even if the video stream is not shown to the user.

faceTrigger = FaceTrigger(hostView: previewContainer, delegate: self)
faceTrigger?.hidePreview = true
faceTrigger?.start()

License

The MIT License (MIT)

Copyright (c) 2018 Michael Peterson, Blinkloop
