WWDC22 demo: Scanning data with the camera

Overview

Scanning Data with the Camera in SwiftUI

WWDC22 brings brilliant Live Text data-scanning tools that let users scan text and codes with the camera, similar to the Live Text interface in the Camera app on iOS and iPadOS. In this article, I will focus on the new API, DataScannerViewController, and share my experience of embedding this UIKit API in SwiftUI code. The following photo shows today's demo.

Provide a reason for using the camera

Because this demo app can only run on a physical device, and accessing the user's camera requires permission, you should provide a clear statement of why you need camera access.

You provide the reason for using the camera in the Xcode project configuration. Add the NSCameraUsageDescription key to the target's Information Property List in Xcode.

The following steps come from Apple's official documentation: Scanning data with the camera.

  • In the project editor, select the target and click Info.
  • Under Custom iOS Target Properties, click the Plus button in any row.
  • From the pop-up menu in the Key column, choose Privacy - Camera Usage Description.
  • In the Value column, enter the reason, such as "Your camera is used to scan text and codes."
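If you prefer editing the raw Info.plist source instead of using the project editor, the entry added by these steps might look like the following (the description string is just an example; use your own wording):

```xml
<key>NSCameraUsageDescription</key>
<string>Your camera is used to scan text and codes.</string>
```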

Creating a Main View

In your ContentView.swift file, add the following code:

VStack {
    Text(scanResults)
        .padding()
    Button {
        // Enable Scan Document Action
    } label: {
        Text("Tap to Scan Documents")
            .foregroundColor(.white)
            .frame(width: 300, height: 50)
            .background(Color.blue)
            .cornerRadius(10)
    }
}

Here, scanResults is a String variable that holds the camera's scan result and is used to show what the camera sees during scanning.

@State private var scanResults: String = ""

The button is used to present the scanning view. When someone taps it, the device should be ready to scan data. However, not all devices support this feature, and even on a supported device, scanning may fail when the user denies camera permission.

In this case, I will provide an alert view that shows a message when the device is not capable of scanning data.

@State private var showDeviceNotCapacityAlert = false

The code above declares a variable that controls whether the alert view is shown: when showDeviceNotCapacityAlert is true, the alert appears. Add the following code after the VStack:

.alert("Scanner Unavailable", isPresented: $showDeviceNotCapacityAlert, actions: {})

Finally, when the device is ready to scan data, we need to present a scanning view. As with the code above, add the following to your ContentView.swift file:

@State private var showCameraScannerView = false
var body: some View {
    VStack {
        ...
    }
    .sheet(isPresented: $showCameraScannerView) {
        // Present the scanning view
    }
    ...
}

All that's left now is this: when we tap the button on a device that cannot scan data, the alert view should appear. We use isDeviceCapacity to check whether the device can use this feature.

@State private var isDeviceCapacity = false

Now add the following code inside the Button action:

if isDeviceCapacity {
    self.showCameraScannerView = true
} else {
    self.showDeviceNotCapacityAlert = true
}

Create a Camera Scanner View

Create a new Swift file, name it CameraScanner.swift, and add the following code:

struct CameraScanner: View {
    @Environment(\.presentationMode) var presentationMode
    var body: some View {
        NavigationView {
            Text("Scanning View")
                .toolbar {
                    ToolbarItem(placement: .navigationBarLeading) {
                        Button {
                            self.presentationMode.wrappedValue.dismiss()
                        } label: {
                            Text("Cancel")
                        }
                    }
                }
                .interactiveDismissDisabled(true)
        }
    }
}

Handle when the scanner becomes unavailable

When the app opens, we need to check whether the scanner is available. Simply add the following code so the check runs when the view appears:

var body: some View {
    VStack {
        ...
    }
    .onAppear {
        isDeviceCapacity = (DataScannerViewController.isSupported && DataScannerViewController.isAvailable)
    }
    ...
}

Only if this check of both values returns true can we open the scanning view.
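As a small refinement, the two checks can be wrapped in a computed property. A minimal sketch (the name scannerAvailable is my own choice, not part of the API):

```swift
// Convenience property combining both availability checks.
var scannerAvailable: Bool {
    DataScannerViewController.isSupported &&
    DataScannerViewController.isAvailable
}
```

With this in place, the .onAppear body becomes `isDeviceCapacity = scannerAvailable`.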

Create a data scanner view controller

To use a view controller in a SwiftUI view, we first need UIViewControllerRepresentable to wrap the UIKit view controller so that it can be used inside SwiftUI. Create a new Swift file named CameraScannerViewController.swift and add the following code:

import SwiftUI
import UIKit
import VisionKit
struct CameraScannerViewController: UIViewControllerRepresentable {
    func makeUIViewController(context: Context) -> DataScannerViewController {
        let viewController = DataScannerViewController(
            recognizedDataTypes: [.text()],
            qualityLevel: .fast,
            recognizesMultipleItems: false,
            isHighFrameRateTrackingEnabled: false,
            isHighlightingEnabled: true
        )
        return viewController
    }
    func updateUIViewController(_ viewController: DataScannerViewController, context: Context) {}
}

The code above returns a view controller that provides the interface for scanning items in the live video. In this article I focus only on scanning text, so recognizedDataTypes contains only .text().
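DataScannerViewController isn't limited to text. If you later want to scan machine-readable codes as well, the recognizedDataTypes set can be extended; a sketch under the assumption that you care about QR and EAN-13 codes (the symbology list here is just an illustration):

```swift
// Recognize both text and selected barcode symbologies.
let viewController = DataScannerViewController(
    recognizedDataTypes: [
        .text(),
        .barcode(symbologies: [.qr, .ean13])
    ],
    qualityLevel: .fast,
    recognizesMultipleItems: false,
    isHighFrameRateTrackingEnabled: false,
    isHighlightingEnabled: true
)
```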

Handle Delegate protocol

After creating the view controller and before we present it, set its delegate to an object in this app that handles the DataScannerViewControllerDelegate protocol callbacks.

In UIKit it's easy to write the following code:

viewController.delegate = self

Luckily, handling DataScannerViewControllerDelegate in SwiftUI is also very convenient.

SwiftUI's coordinators are designed to act as delegates for UIKit view controllers. Remember, "delegates" are objects that respond to events that occur elsewhere. For example, UIKit lets us attach a delegate object to its text field view, and that delegate is notified when the user types anything, presses return, and so on. This means UIKit developers can modify the way a text field behaves without creating a custom text field type of their own. So add the following code inside CameraScannerViewController:

func makeCoordinator() -> Coordinator {
    Coordinator(self)
}
class Coordinator: NSObject, DataScannerViewControllerDelegate {
    var parent: CameraScannerViewController
    init(_ parent: CameraScannerViewController) {
        self.parent = parent
    }
}

Now we can use similar code inside makeUIViewController, just above the return statement:

func makeUIViewController(context: Context) -> DataScannerViewController {
    ...
    viewController.delegate = context.coordinator
    return viewController
}

Begin data scanning

It's time to start the data scanning. Once the user allows unrestricted access to the camera, you can begin scanning for items that appear in the live video by invoking the startScanning() method. In this case, when the scanning view is presented, we need it to start scanning, so we need a value that tells the scanning view to scan. Add the following code to CameraScannerViewController:

@Binding var startScanning: Bool

When startScanning is set to true, we need to update the view controller and start scanning. Add the following code inside updateUIViewController:

func updateUIViewController(_ viewController: DataScannerViewController, context: Context) {
    if startScanning {
        try? viewController.startScanning()
    } else {
        viewController.stopScanning()
    }
}
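Note that startScanning() can throw, for example when camera access is no longer available, which is why the code above uses try?. A sketch that surfaces the error instead of discarding it could look like this (the print statement is just a placeholder for real error handling):

```swift
func updateUIViewController(_ viewController: DataScannerViewController, context: Context) {
    if startScanning {
        do {
            try viewController.startScanning()
        } catch {
            // Scanning could not start, e.g. the camera became unavailable.
            print("Failed to start scanning: \(error)")
        }
    } else {
        viewController.stopScanning()
    }
}
```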

Respond when we tap an item

When we tap a recognized item in the live video, the view controller invokes the dataScanner(_:didTapOn:) delegate method and passes the recognized item. Implement this method to take an action depending on the item the user taps, and use the associated values of the RecognizedItem enum to get details about the item, such as its bounds.

In this case, to handle taps on text the camera recognized, implement the dataScanner(_:didTapOn:) method to show the result on screen. Add the following code inside the Coordinator class:

class Coordinator: NSObject, DataScannerViewControllerDelegate {
    ...
    func dataScanner(_ dataScanner: DataScannerViewController, didTapOn item: RecognizedItem) {
        switch item {
        case .text(let text):
            parent.scanResult = text.transcript
        default:
            break
        }
    }
}

And add a Binding property inside CameraScannerViewController:

@Binding var scanResult: String

Scan Now

It's time to update the CameraScanner.swift file. Simply add the following code:

@Binding var startScanning: Bool
@Binding var scanResult: String

And change the Text("Scanning View") to the following code:

var body: some View {
    NavigationView {
        CameraScannerViewController(startScanning: $startScanning, scanResult: $scanResult)
        ...
    }
}

Finally, add the following code to ContentView:

struct ContentView: View {
    ...
    @State private var scanResults: String = ""
    var body: some View {
        VStack {
            ...
        }
        .sheet(isPresented: $showCameraScannerView) {
            CameraScanner(startScanning: $showCameraScannerView, scanResult: $scanResults)
        }
        ...
    }
}

scanResults is used to pass the value between views. Once the camera scans something, scanResults is updated, which in turn updates the Text view. Now run this project and enjoy yourself 😀.

Next Step

WWDC not only brings Live Text in Camera, but also simplifies how developers can enable Live Text Interactions With Images. If you are interested in how to enable Live Text Interactions With Images, please check my new article: WWDC22: Enabling Live Text Interactions With Images in SwiftUI. Please also check Live Text Interaction demo.
