


Introduction to Augmented Reality on iOS

This page is intended as an introduction to Augmented Reality on iOS - covering frameworks, best practices, resources, and more. Apple has created a wonderful augmented reality platform, but with the technology and frameworks evolving so quickly, things can get a bit confusing. ARKit, RealityKit, SceneKit, and Metal are often brought up when discussing an augmented reality application - sometimes in tandem with one another, sometimes individually - and this document aims to provide clarity on how they relate.


Examples and Media

First, let's look at what is currently on the market and at previews of some of the most cutting-edge augmented reality experiences.

  • Reality-UI - a framework for building user interfaces in Augmented Reality
  • ARKit-CoreLocation framework
  • CoreML-ARKit - an example project using machine learning and ARKit to display labels above objects

Comparison: ARKit, SceneKit, RealityKit - What's the difference?

Apple’s ARKit platform was originally built on top of SceneKit - a 3D graphics framework developed for mobile games. This gave Apple a solid foundation: it could leverage SceneKit’s rendering technology to create an early platform for augmented reality applications. RealityKit was introduced after ARKit and SceneKit as a way to simplify the workflow of creating an augmented reality application. RealityKit offers many features out of the box, such as gesture handling, collisions, and real-world lighting effects applied to objects.

ARKit (iOS 11)

ARKit captures device data that is used to render objects realistically - it combines device motion tracking, camera scene capture, and advanced scene processing. SceneKit offers a way to render AR content using the data ARKit provides.
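
As a rough sketch of that pairing (not taken from the original project), ARKit's world tracking can drive an ARSCNView while SceneKit draws the geometry:

  import ARKit
  import SceneKit

  // ARSCNView is the SceneKit-backed AR view: ARKit tracks the device and camera,
  // while SceneKit renders the node graph on top of the camera feed.
  let sceneView = ARSCNView(frame: .zero)
  sceneView.session.run(ARWorldTrackingConfiguration())

  // A 10 cm SceneKit box placed half a meter in front of the session's starting position.
  let boxNode = SCNNode(geometry: SCNBox(width: 0.1, height: 0.1, length: 0.1, chamferRadius: 0))
  boxNode.position = SCNVector3(x: 0, y: 0, z: -0.5)
  sceneView.scene.rootNode.addChildNode(boxNode)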

SceneKit (iOS 8)

SceneKit was originally developed for mobile games. In an augmented reality application, SceneKit uses the data from ARKit to render objects realistically. SceneKit provides a way to drill down and manage these objects, but many of its features were not designed with augmented reality applications in mind. This is where RealityKit comes in.

RealityKit (iOS 13)

RealityKit was created to simplify building augmented reality applications. Out of the box, RealityKit provides many of the features that would traditionally require manual calculation and rendering work.

Image: RealityKit mapping a virtual object onto a table



ARKit

ARKit is an augmented reality framework for iOS, available on both iPhone and iPad. ARKit lets developers place digital objects in the real world by blending the live camera feed with virtual objects on screen, allowing users to interact with those objects in real space.

It does this by using the camera on iOS devices to create a map of the area, detecting things like tabletops, floors, and the location of the device in physical space using CoreMotion data. No calibration is required from the user.

ARKit can run on most modern iPhones and iPads, relying on a renderer such as SceneKit or RealityKit to draw the objects. ARKit also makes it possible to integrate with third-party tools such as Unity and Unreal Engine to use their rendering capabilities.
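
A minimal sketch of that setup (assuming a RealityKit ARView; an ARSCNView session is configured the same way - the PlaneLogger type is just illustrative) asks ARKit to track the world and report flat surfaces:

  import ARKit
  import RealityKit

  // Configure world tracking with horizontal and vertical plane detection.
  let configuration = ARWorldTrackingConfiguration()
  configuration.planeDetection = [.horizontal, .vertical]

  let arView = ARView(frame: .zero)
  arView.session.run(configuration)

  // An ARSessionDelegate can observe the planes ARKit finds (keep a strong reference to it).
  final class PlaneLogger: NSObject, ARSessionDelegate {
      func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
          for plane in anchors.compactMap({ $0 as? ARPlaneAnchor }) {
              print("Detected a \(plane.alignment == .horizontal ? "horizontal" : "vertical") plane")
          }
      }
  }

  let logger = PlaneLogger()
  arView.session.delegate = logger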

ARKit Features




RealityKit

The RealityKit framework was built from the ground up specifically for augmented reality with photo-realistic rendering, camera effects, animations, physics, and more. With native Swift APIs, ARKit integration, incredibly realistic physics-based rendering, transform and skeletal animations, spatial audio, and rigid body physics, RealityKit makes AR development faster and easier than ever before.

RealityKit Features

  • Leverages data from ARKit for realistic rendering of augmented reality experiences
  • Gesture handling (tap to move objects; see the sketch after this list)
  • Animations (start or stop multiple animations on different objects)
  • Shaders (allow for complex rendering of objects)
  • Physics (object movement and collisions)
  • Lighting and shadow estimation
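
A small sketch of the gesture and animation features, assuming an arView and a modelEntity loaded as in the Quick Start section below:

  import RealityKit

  // Gestures: give the entity collision shapes, then let ARView attach
  // translation, rotation, and scale gesture recognizers to it.
  modelEntity.generateCollisionShapes(recursive: true)
  arView.installGestures([.translation, .rotation, .scale], for: modelEntity)

  // Animations: play every animation bundled in the loaded .usdz, looping indefinitely.
  for animation in modelEntity.availableAnimations {
      modelEntity.playAnimation(animation.repeat())
  }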

Materials

  • SimpleMaterial
  • UnlitMaterial
  • OcclusionMaterial
  • VideoMaterial
  • PhysicallyBasedMaterial
  • CustomMaterial
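
A short sketch applying two of these material types to generated meshes (the colors and sizes are arbitrary):

  import RealityKit
  import UIKit

  // SimpleMaterial: a basic physically based material with color and metallic/roughness controls.
  let metal = SimpleMaterial(color: .gray, isMetallic: true)

  // UnlitMaterial: ignores scene lighting entirely - useful for flat, UI-like content.
  let flat = UnlitMaterial(color: .systemBlue)

  // Apply the materials to procedurally generated meshes.
  let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1), materials: [metal])
  let cube = ModelEntity(mesh: .generateBox(size: 0.1), materials: [flat])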

Quick Start - RealityKit Implementation

Quickly add a .usdz file to an ARView using Entity.loadModel

  import RealityKit

  // Initialize an ARView programmatically
  let arView = ARView(frame: .zero)

  // Or connect one via IBOutlet
  @IBOutlet weak var arView: ARView!

  // Add a .usdz object to the ARView
  let anchor = AnchorEntity()
  let modelEntity = try! Entity.loadModel(named: "name_of_usdz_file")
  anchor.addChild(modelEntity)
  arView.scene.addAnchor(anchor)

Note: Make sure to add the Privacy - Camera Usage Description key (NSCameraUsageDescription) to your app's Info.plist.

Common Components in an Augmented Reality Application


  • ARView

    • A view that displays an augmented reality experience that incorporates content from RealityKit.
    • This is what allows us to see 3D models placed into the environment through the camera view.
  • Scene

    • A container that holds the collection of entities rendered by an AR view.
    • You don’t create a Scene instance directly. Instead, you get the one and only scene associated with a view from the scene property of an ARView instance.
    • To add content to the view’s scene, you first create and add one or more AnchorEntity instances to the scene’s anchors collection.
  • AnchorEntity

    • Anchors tell RealityKit how to pin virtual content to real world objects, like flat surfaces or images. You then add a hierarchy of other Entity instances to each anchor to indicate the geometry and behaviors that RealityKit should render at a given anchor point.
  • Entity

    • An element of a RealityKit scene to which you attach components that provide appearance and behavior characteristics for the entity.
    • You create and configure entities to embody objects that you want to place in the real world in an AR app. You do this by adding Entity instances to the Scene instance associated with an ARView.
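
A brief sketch of that entity/component model - a collision component attached to an entity, which is then anchored into an ARView's scene:

  import RealityKit

  // Entities get appearance and behavior from components.
  // Here a collision component is attached so the entity can participate in physics and gestures.
  let entity = ModelEntity(mesh: .generateBox(size: 0.2))
  entity.components[CollisionComponent.self] = CollisionComponent(
      shapes: [.generateBox(size: [0.2, 0.2, 0.2])]
  )

  // The hierarchy described above: ARView -> Scene -> AnchorEntity -> Entity.
  let arView = ARView(frame: .zero)
  let anchor = AnchorEntity()
  anchor.addChild(entity)
  arView.scene.addAnchor(anchor)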

Useful Snippets

 // Create an entity from a local .usdz file (requires import RealityKit and an arView in scope)

 func addEntity() {
     let anchor = AnchorEntity()
     let modelEntity = try! Entity.loadModel(named: "Local_Filename_Here")

     anchor.addChild(modelEntity)
     arView.scene.addAnchor(anchor)
 }

 // Create a cube entity with a texture loaded from a remote image URL

 func createCubeWithRemoteTexture(remoteURL: URL) {
     // First create a local temporary file URL to store the image from the remote URL.
     let fileURL = FileManager.default.temporaryDirectory.appendingPathComponent(UUID().uuidString)
     do {
         // Download the contents of remoteURL as Data. Use a URLSession if you want to do this asynchronously.
         let data = try Data(contentsOf: remoteURL)

         // Write the image Data to the file URL.
         try data.write(to: fileURL)

         // Create a TextureResource by loading the contents of the file URL.
         let texture = try TextureResource.load(contentsOf: fileURL)
         var material = SimpleMaterial()
         material.baseColor = MaterialColorParameter.texture(texture)

         let entity = ModelEntity(mesh: .generateBox(size: 0.5), materials: [material])
         let anchor = AnchorEntity(.plane(.any, classification: .any, minimumBounds: .zero))

         entity.generateCollisionShapes(recursive: true)
         arView.installGestures(for: entity)

         anchor.addChild(entity)
         arView.scene.addAnchor(anchor)
     } catch {
         print(error.localizedDescription)
     }
 }



// Save an ARWorldMap to a URL (requires import ARKit)

func writeWorldMap(_ worldMap: ARWorldMap, to url: URL) throws {
    let data = try NSKeyedArchiver.archivedData(withRootObject: worldMap, requiringSecureCoding: true)
    try data.write(to: url)
}

// Load an ARWorldMap from a URL

func loadWorldMap(from url: URL) throws -> ARWorldMap {
    let mapData = try Data(contentsOf: url)
    guard let worldMap = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self, from: mapData)
        else { throw ARError(.invalidWorldMap) }
    return worldMap
}
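
The two helpers above assume you already have an ARWorldMap in hand. A minimal sketch of obtaining one from a running session and saving it (the saveCurrentWorldMap name and worldMapURL parameter are illustrative, not part of the original project):

  import ARKit
  import RealityKit

  func saveCurrentWorldMap(from arView: ARView, to worldMapURL: URL) {
      // Ask the running session for a snapshot of its current world map.
      arView.session.getCurrentWorldMap { worldMap, error in
          guard let worldMap = worldMap else {
              print("Could not get current world map: \(error?.localizedDescription ?? "unknown error")")
              return
          }
          do {
              try writeWorldMap(worldMap, to: worldMapURL)
          } catch {
              print(error.localizedDescription)
          }
      }
  }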

Media:

  • Apple Measure App
  • Lowes Envision App
  • Face Tracking
  • Veep - Augmented Reality Social Network: https://youtu.be/HDHnbd8QHGU

Resources:

Apple Tools


https://developer.apple.com/augmented-reality/tools/

  • Reality Composer - found within Xcode's Open Developer Tool menu

  • Reality Converter - downloadable from Apple's website; allows for quick conversion of common 3D media formats (such as .fbx and .obj) into .usdz for ARKit compatibility

  • Object Capture - https://developer.apple.com/augmented-reality/object-capture/ - turn photos from your iPhone or iPad into high-quality 3D models optimized for AR using the Object Capture API on macOS Monterey. Object Capture uses photogrammetry to turn a series of pictures taken on an iPhone or iPad into USDZ files that can be viewed in AR.
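
The Object Capture API lives in RealityKit on macOS 12 and later. A minimal command-line sketch (the folder and output paths are placeholders):

  import Foundation
  import RealityKit  // PhotogrammetrySession requires macOS 12 (Monterey) or later

  // A folder of photos captured on an iPhone or iPad, and the .usdz file to produce.
  let imagesFolder = URL(fileURLWithPath: "/path/to/captured/images", isDirectory: true)
  let outputModel = URL(fileURLWithPath: "/path/to/output/model.usdz")

  let session = try PhotogrammetrySession(input: imagesFolder)

  // Log results as they arrive.
  Task {
      for try await output in session.outputs {
          switch output {
          case .requestComplete(let request, _):
              print("Finished: \(request)")
          case .requestError(let request, let error):
              print("Failed \(request): \(error)")
          default:
              break
          }
      }
  }

  // Reconstruct a medium-detail model from the photos.
  try session.process(requests: [.modelFile(url: outputModel, detail: .medium)])

  // Keep the command-line tool alive while the session works.
  RunLoop.main.run()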

Useful Links (Apple.com)

Apple Example Projects:

Other Example Projects & Frameworks:

  • ARKit Sampler - a sample implementation of many ARKit features.
  • RealityUI - a framework for building user interfaces in AR

3D Model Marketplaces

Augmented Reality iOS Timeline

  • 2012 Vuforia Engine (3rd party AR framework)
  • 2017 ARKit Introduction
  • 2018 ARKit 2
  • 2019 ARKit 3
  • 2019 RealityKit
  • 2020 ARKit 4
  • 2021 RealityKit 2
  • 2021 ARKit 5

Sources:

Conclusion / Future of AR Apps on iOS

  • RealityKit + SwiftUI + Apple Silicon = Apple Glasses

With such a strong augmented reality platform, it certainly seems Apple is preparing for the launch of Apple Glasses. These tools and frameworks have been quietly maturing in the background while the major bugs, flaws, and quirks get worked out. Apple Glasses could spark a consumer revolution similar to the first iPhone, so developers and anyone involved in mobile app development should pay attention to this space.
