Tracking Geographic Locations in AR

Track specific geographic areas of interest and render them in an AR experience.

Overview

In this sample app, the user marks spots on a map or camera feed to create a collection of anchors they view in augmented reality (AR). By rendering those anchors as virtual content in an AR view, the user can see a nearby anchor through the camera feed, move to its physical location, and continue to move to any subsequent anchors in the collection. If a virtual anchor that the user is moving toward isn't visible in the camera feed, the user can refer to its pin in the map view and advance until the virtual anchor becomes visible.

Geotracking configuration (ARGeoTrackingConfiguration) combines GPS, the device's compass, and world-tracking features in AR to track specific geographic locations. By giving ARKit a latitude and longitude (and optionally, altitude), the sample app declares interest in a specific location on the map.

During a geotracking session, ARKit marks this location in the form of a location anchor (ARGeoAnchor) and continually refines its position in the camera feed as the user moves about. ARKit provides the location anchor's coordinates with respect to the scene, which allows the app to render virtual content at its real-world location or trigger other interactions.
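For example, a minimal sketch of declaring interest in a location might look like the following. The coordinate values are placeholders, and arView is assumed to be an ARView running a geotracking session:

// Placeholder coordinate; substitute a real point of interest.
let coordinate = CLLocationCoordinate2D(latitude: 37.7956, longitude: -122.3934)

// Create a location anchor at ground level (or pass altitude: to be explicit).
let geoAnchor = ARGeoAnchor(coordinate: coordinate)

// Add the anchor so ARKit begins tracking the location.
arView.session.add(anchor: geoAnchor)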

For example, when the user approaches a location anchor, an app may reveal a virtual signpost that explains a historic event that occurred there. Or, to form a street route, an app could render virtual content at a series of connected location anchors.

Figure: An AR app shown in two views. The upper view displays a camera feed of a busy city intersection, where a series of floating blue buoys forms a path that leads the user to turn right. The lower view shows a top-down map of the same scene; dots on the map correspond to the buoys in the camera feed.

  • Note: ARKit supports geotracking only with the device's rear camera.

Configure the Sample Code Project

The sample app demonstrates geotracking coaching, which requires iOS 15. The Xcode project defines a deployment target of iOS 15.

Geotracking requires a device with an A12 Bionic chip or later, and cellular (GPS) capability. Set the project's run destination to a device; ARKit doesn't support iOS Simulator.

Ensure Device Support

The sample app checks whether a device supports geotracking at the application entry point, AppDelegate.swift:

if !ARGeoTrackingConfiguration.isSupported {
    let storyboard = UIStoryboard(name: "Main", bundle: nil)
    window?.rootViewController = storyboard.instantiateViewController(withIdentifier: "unsupportedDeviceMessage")
}

If the device doesn't support geotracking, the sample project stops. Optionally, an app can present an error message and continue the session with limited capability, without geotracking.
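The sample stops here, but a rough sketch of such a fallback might look like this; showLimitedModeBanner() is a hypothetical helper, and the fallback configuration choice is an assumption rather than part of the sample:

if ARGeoTrackingConfiguration.isSupported {
    arView.session.run(ARGeoTrackingConfiguration())
} else {
    // Hypothetical fallback: explain the limitation, then run plain world tracking.
    showLimitedModeBanner()
    arView.session.run(ARWorldTrackingConfiguration())
}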

Display an AR View and Map View

The sample project renders location anchors using an ARView. To reinforce the correspondence between geographic locations and positions in the session's local space, the sample project also displays a map view (MKMapView) that marks the anchors from a top-down perspective. The app displays both views simultaneously by using a stack view (UIStackView) with the camera feed on top. See the sample's View Controller Scene within the project's Main.storyboard.
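The sample defines this layout in the storyboard; a rough programmatic equivalent, assuming arView and mapView already exist in the view controller, might be:

// Camera feed on top, map below, each taking half the screen.
let stackView = UIStackView(arrangedSubviews: [arView, mapView])
stackView.axis = .vertical
stackView.distribution = .fillEqually
stackView.frame = view.bounds
stackView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
view.addSubview(stackView)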

Check Availability and Run a Session

To place location anchors with precision, geotracking requires a better understanding of the user’s geographic location than is possible with GPS alone. Based on a particular GPS coordinate, ARKit downloads batches of imagery that depict the physical environment in that area and assist the session with determining the user’s precise geographic location.

This localization imagery captures the view mostly from public streets and routes accessible by car. As a result, geotracking doesn't support areas that are gated or accessible only to pedestrians.

Because localization imagery depicts specific regions on the map, geotracking only supports areas where Apple has collected localization imagery in advance. Before starting a session, the sample project checks whether geotracking supports the user's location by calling checkAvailability(completionHandler:).

ARGeoTrackingConfiguration.checkAvailability { (available, error) in
    if !available {
        let errorDescription = error?.localizedDescription ?? ""
        let recommendation = "Please try again in an area where geotracking is supported."
        let restartSession = UIAlertAction(title: "Restart Session", style: .default) { (_) in
            self.restartSession()
        }
        self.alertUser(withTitle: "Geotracking unavailable",
                       message: "\(errorDescription)\n\(recommendation)",
                       actions: [restartSession])
    }
}

ARKit requires a network connection to download localization imagery. checkAvailability(completionHandler:) reports false if a network connection is unavailable. If geotracking is available, the sample project runs a session.

let geoTrackingConfig = ARGeoTrackingConfiguration()
geoTrackingConfig.planeDetection = [.horizontal]
arView.session.run(geoTrackingConfig, options: .removeExistingAnchors)

Coach the User for Geotracking Status

To begin a geotracking session, the framework moves through several geotracking states. At any point, the session can require action from the user to progress to the next state. To instruct the user on what to do, the sample project uses an ARCoachingOverlayView with the .geoTracking goal.

func setupCoachingOverlay() {
    coachingOverlay.delegate = self
    // Associate the overlay with the session so it can observe tracking state.
    coachingOverlay.session = arView.session
    arView.addSubview(coachingOverlay)
    coachingOverlay.goal = .geoTracking
}

Instruct the User Based on Geotracking State

After the app localizes and begins a geotracking session, the sample app monitors the geotracking state and instructs the user by presenting text with a label.

self.trackingStateLabel.text = text
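The session delivers these updates through the ARSessionObserver callback for geotracking status changes. A condensed sketch of how the label text might be derived (the sample's exact wording differs):

func session(_ session: ARSession, didChange geoTrackingStatus: ARGeoTrackingStatus) {
    let text: String
    switch geoTrackingStatus.state {
    case .initializing: text = "Initializing geotracking."
    case .localizing: text = "Point the camera at nearby buildings to localize."
    case .localized: text = "Localized."
    case .notAvailable: text = "Geotracking isn't available here."
    @unknown default: text = ""
    }
    self.trackingStateLabel.text = text
}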

As the user moves along a street, the framework continues to download localization imagery as needed to maintain a precise understanding of the user's position in the world. If the .geoDataNotLoaded status reason occurs after the session localizes, it may indicate a network issue; if it persists, the app can ask the user to check the internet connection.

While the session runs, the status reason notAvailableAtLocation occurs if the user crosses into an area where ARKit lacks geotracking support. To enable the session to continue, the sample project presents text to guide the user back to a supported area.

case .notAvailableAtLocation: return "Geotracking is unavailable here. Please return to your previous location to continue"
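The sample maps each status reason to user-facing text in a similar switch. A partial sketch over several ARGeoTrackingStatus.Reason cases, with illustrative wording:

func text(for reason: ARGeoTrackingStatus.Reason) -> String {
    switch reason {
    case .none: return ""
    case .worldTrackingUnstable: return "Slow down."
    case .geoDataNotLoaded: return "Downloading localization imagery. Check your network connection if this persists."
    case .devicePointedTooLow: return "Raise the device to look at nearby buildings."
    case .notAvailableAtLocation: return "Geotracking is unavailable here. Please return to your previous location to continue."
    default: return "Initializing geotracking."
    }
}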

Coach the User as the Session Runs

A geotracking session maps geographic coordinates to ARKit's world-tracking local space, which requires basic world-tracking support. If environmental circumstances impair the device's world-tracking condition, the geotracking coaching overlay alerts the user and displays instructions to resolve the problem.

For example, if the user travels too quickly, the device's camera feed may not contain enough of the visual features ARKit requires to model the environment. In this case:

  1. The framework sets the world-tracking state to limited.
  2. The geotracking session observes the world-tracking status change and sets the geoTrackingStatus reason to worldTrackingUnstable.
  3. The coaching overlay activates and displays the text "Slow down."

The sample app disables the user interface until the user responds to the coaching.

func coachingOverlayViewWillActivate(_ coachingOverlayView: ARCoachingOverlayView) {
    mapView.isUserInteractionEnabled = false
    undoButton.isEnabled = false
    hideUIForCoaching(true)
}

ARKit dismisses the coaching overlay when the tracking status improves, and the app reenables the user interface.

func coachingOverlayViewDidDeactivate(_ coachingOverlayView: ARCoachingOverlayView) {
    mapView.isUserInteractionEnabled = true
    undoButton.isEnabled = true
    hideUIForCoaching(false)
}

Create an Anchor When the User Taps the Map

The sample project acquires a geographic coordinate (CLLocationCoordinate2D) from the map view at the screen location where the user tapped.

func handleTapOnMapView(_ sender: UITapGestureRecognizer) {
    let point = sender.location(in: mapView)
    let location = mapView.convert(point, toCoordinateFrom: mapView)

With the user's latitude and longitude, the sample project creates a location anchor.

geoAnchor = ARGeoAnchor(coordinate: location)

Because the map view returns a 2D coordinate with no altitude, the sample calls init(coordinate:), which defaults the location anchor's altitude to ground level.

To begin tracking the anchor, the sample project adds it to the session.

arView.session.add(anchor: geoAnchor)

The sample project listens for the location anchor in session(didAdd:) and visualizes it in AR by adding a placemark entity to the scene.

func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
    for geoAnchor in anchors.compactMap({ $0 as? ARGeoAnchor }) {
        // Effect a spatial-based delay to avoid blocking the main thread.
        DispatchQueue.main.asyncAfter(deadline: .now() + (distanceFromDevice(geoAnchor.coordinate) / 10)) {
            // Add an AR placemark visualization for the geo anchor.
            self.arView.scene.addAnchor(Entity.placemarkEntity(for: geoAnchor))
        }
    }
}

To establish visual correspondence in the map view, the sample project adds an MKOverlay that represents the anchor on the map.

let anchorIndicator = AnchorIndicator(center: geoAnchor.coordinate)
self.mapView.addOverlay(anchorIndicator)
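For the circle to draw, the map view's delegate must supply a renderer for the overlay. Assuming AnchorIndicator is a small MKCircle subclass, as its usage suggests, the renderer might look like:

func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
    if let anchorOverlay = overlay as? AnchorIndicator {
        // Draw each anchor as a small filled circle on the map.
        let renderer = MKCircleRenderer(circle: anchorOverlay)
        renderer.fillColor = .systemBlue
        return renderer
    }
    return MKOverlayRenderer(overlay: overlay)
}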

Create an Anchor When the User Taps the AR View

When the user taps the camera feed, the sample project casts a ray at the screen-tap location to determine its intersection with a real-world surface.

if let result = arView.raycast(from: point, allowing: .estimatedPlane, alignment: .any).first {

The ray-cast result's translation describes the intersection's position in ARKit's local coordinate space. To convert that point to a geographic location, the sample project calls the session-provided utility getGeoLocation(forPoint:completionHandler:).

arView.session.getGeoLocation(forPoint: worldPosition) { (location, altitude, error) in

Then, the sample project creates a location anchor with the result. Because the result includes altitude, the sample project calls the init(coordinate:altitude:) anchor initializer.
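Putting these pieces together, a condensed sketch of the whole tap-to-anchor flow follows. Extracting the position from the transform's fourth column stands in for the translation convenience the sample defines on the transform type:

func handleTapOnARView(_ sender: UITapGestureRecognizer) {
    let point = sender.location(in: arView)

    // Intersect the tap with an estimated real-world plane.
    guard let result = arView.raycast(from: point, allowing: .estimatedPlane, alignment: .any).first else { return }

    // The intersection point, in ARKit's local coordinate space.
    let column = result.worldTransform.columns.3
    let worldPosition = SIMD3<Float>(column.x, column.y, column.z)

    // Convert the local-space point to a geographic coordinate and altitude.
    arView.session.getGeoLocation(forPoint: worldPosition) { (location, altitude, error) in
        guard error == nil else { return }
        let geoAnchor = ARGeoAnchor(coordinate: location, altitude: altitude)
        self.arView.session.add(anchor: geoAnchor)
    }
}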

Assess Geotracking Accuracy

To ensure the best possible user experience, an app must monitor and react to geotracking accuracy. When possible, the sample project displays the accuracy as part of its state messaging to the user. The session populates the accuracy property of its geoTrackingStatus when the state is .localized.

if geoTrackingStatus.state == .localized {
    text += "Accuracy: \(geoTrackingStatus.accuracy.description)"
}

If geotracking is off by a small distance, such as when accuracy is .low, an app can render location anchors using an asset that's forgiving of imprecision. For example, the sample app renders a location anchor as a large ball several meters in the air rather than an arrow pointing to a specific real-world surface.
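A sketch of that kind of accuracy-driven choice follows; preciseEntity() and coarseEntity() are hypothetical helpers standing in for the sample's actual assets:

func placemark(for accuracy: ARGeoTrackingStatus.Accuracy) -> Entity {
    switch accuracy {
    case .high:
        return preciseEntity()  // hypothetical: an arrow pinned to a surface
    case .medium, .low, .undetermined:
        return coarseEntity()   // hypothetical: a large floating ball
    @unknown default:
        return coarseEntity()
    }
}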

Center the Map as the User Moves

The sample project uses updates from Core Location to center the user in the map view. When the user moves around, Core Location notifies the delegate of any updates in geographic position. The sample project monitors this event by implementing the relevant callback.

func locationManager(_ manager: CLLocationManager, didUpdateLocations locations: [CLLocation]) {
    guard let location = locations.last else { return }

When the user's position changes, the sample project pans the map to center the user.

let camera = MKMapCamera(lookingAtCenter: location.coordinate,
                         fromDistance: CLLocationDistance(250),
                         pitch: 0,
                         heading: mapView.camera.heading)
mapView.setCamera(camera, animated: false)