ARKit Official Documentation (original text) - juniverse1103/ARKitStudy GitHub Wiki
ARKit
Integrate iOS device camera and motion features to produce augmented reality experiences in your app or game.
Overview
Augmented Reality (AR) describes user experiences that add 2D or 3D elements to the live view from a device's camera in a way that makes those elements appear to inhabit the real world. ARKit combines device motion tracking, camera scene capture, advanced scene processing, and display conveniences to simplify the task of building an AR experience. You can use these technologies to create many kinds of AR experiences using either the back camera or front camera of an iOS device.
Augmented Reality with the Back Camera
The most common kinds of AR experience display a view from an iOS device's back-facing camera, augmented by other visual content, giving the user a new way to see and interact with the world around them.
ARWorldTrackingConfiguration provides this kind of experience: ARKit maps and tracks the real-world space the user inhabits, and matches it with a coordinate space for you to place virtual content. World tracking also offers features to make AR experiences more immersive, such as recognizing objects and images in the user's environment and responding to real-world lighting conditions.
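The world-tracking features described above (surface detection, image recognition, light estimation) are opt-in properties of the configuration. A minimal sketch, assuming `sceneView` is an `ARSCNView` already set up elsewhere (e.g. in a storyboard) and that an AR resource group named "AR Resources" exists in the asset catalog:

```swift
import ARKit

func startWorldTracking(on sceneView: ARSCNView) {
    let configuration = ARWorldTrackingConfiguration()
    // Detect horizontal and vertical real-world surfaces.
    configuration.planeDetection = [.horizontal, .vertical]
    // Respond to real-world lighting conditions.
    configuration.isLightEstimationEnabled = true
    // Recognize known images bundled in an AR resource group
    // ("AR Resources" is a hypothetical asset-catalog group name).
    configuration.detectionImages =
        ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                         bundle: nil)
    sceneView.session.run(configuration)
}
```

Each feature you enable adds processing cost, so it is common to enable only what the experience needs.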
Note
You can display a 3D object in the user's real-world environment without building a custom AR experience. In iOS 12, the system provides an AR view for 3D objects when you use QLPreviewController with USDZ files in an app, or use Safari or WebKit with USDZ files in web content.
Augmented Reality with the Front Camera
On iPhone X, ARFaceTrackingConfiguration uses the front-facing TrueDepth camera to provide real-time information about the pose and expression of the user's face for you to use in rendering virtual content. For example, you might show the user's face in a camera view and provide realistic virtual masks. You can also omit the camera view and use ARKit facial expression data to animate virtual characters, as seen in the Animoji app for iMessage.
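As a sketch of how that expression data arrives, the session delegate receives `ARFaceAnchor` updates whose blend-shape coefficients describe individual facial movements. This is a minimal example, not a complete renderer; `.jawOpen` is just one of many blend shapes:

```swift
import ARKit

class FaceTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            // Blend-shape coefficients range from 0 (neutral) to 1 (maximum).
            if let jawOpen = faceAnchor.blendShapes[.jawOpen]?.floatValue {
                print("jawOpen:", jawOpen)
            }
        }
    }
}
```

You could feed these coefficients to a rendered character instead of printing them, which is essentially how Animoji-style animation works.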
Topics
First Steps
- Verifying Device Support and User Permission
  Make sure your app can use ARKit and respects user privacy.
- class ARSession
  A shared object that manages the device camera and motion processing needed for augmented reality experiences.
  - "Shared" here means a single session instance that your code and ARKit's views all work with, not a shared library built from relocatable object files.
- class ARConfiguration
  The abstract base class for AR session configurations.
  - Abstract class: a class that declares one or more abstract methods, i.e. methods without an implementation. Abstract classes cannot be instantiated; subclasses must provide implementations for the abstract methods.
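The "Verifying Device Support and User Permission" step above can be sketched as follows. This is one reasonable approach, not the only one: ARKit prompts for camera permission itself when a session starts, but checking up front lets you show your own explanatory UI first.

```swift
import ARKit
import AVFoundation

func prepareARSession(completion: @escaping (Bool) -> Void) {
    // Not every device supports every configuration; older devices
    // may lack world tracking entirely.
    guard ARWorldTrackingConfiguration.isSupported else {
        completion(false)
        return
    }
    // ARKit uses the camera, so camera authorization applies.
    switch AVCaptureDevice.authorizationStatus(for: .video) {
    case .authorized:
        completion(true)
    case .notDetermined:
        AVCaptureDevice.requestAccess(for: .video, completionHandler: completion)
    default:
        completion(false) // denied or restricted
    }
}
```

Remember that the app's Info.plist must also include an `NSCameraUsageDescription` string explaining why the camera is needed.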
Display
- class ARSCNView
  A view for displaying AR experiences that augment the camera view with 3D SceneKit content.
- class ARSKView
  A view for displaying AR experiences that augment the camera view with 2D SpriteKit content.
- Displaying an AR Experience with Metal
  Build a custom AR view by rendering camera images and using position-tracking information to display overlay content.
World Tracking
Create AR experiences that allow a user to explore virtual content in the world around them with a device's back-facing camera.
- Building Your First AR Experience
  Create an app that runs an AR session and uses plane detection to place 3D content using SceneKit.
- Understanding World Tracking in ARKit
  Discover supporting concepts, features, and best practices for building great AR experiences.
- class ARWorldTrackingConfiguration
  A configuration that uses the back-facing camera, tracks a device's orientation and position, and detects real-world surfaces and known images or objects.
- class ARPlaneAnchor
  Information about the position and orientation of a real-world flat surface detected in a world-tracking AR session.
- class AREnvironmentProbeAnchor
  An object that provides environmental lighting information for a specific area of space in a world-tracking AR session.
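The plane-detection flow listed above can be sketched as follows: when world tracking detects a surface, ARKit adds an `ARPlaneAnchor` to the session, and the `ARSCNView` delegate attaches SceneKit content to the corresponding node. A minimal visualizer, assuming the class is assigned as the view's delegate:

```swift
import ARKit
import SceneKit
import UIKit

class PlaneVisualizer: NSObject, ARSCNViewDelegate {
    // Called once for each newly detected anchor.
    func renderer(_ renderer: SCNSceneRenderer,
                  didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let planeAnchor = anchor as? ARPlaneAnchor else { return }
        // Model the detected surface as a thin translucent box
        // matching the anchor's estimated extent.
        let plane = SCNBox(width: CGFloat(planeAnchor.extent.x),
                           height: 0.001,
                           length: CGFloat(planeAnchor.extent.z),
                           chamferRadius: 0)
        plane.firstMaterial?.diffuse.contents =
            UIColor.blue.withAlphaComponent(0.3)
        let planeNode = SCNNode(geometry: plane)
        planeNode.simdPosition = planeAnchor.center
        node.addChildNode(planeNode)
    }
}
```

A production app would also implement `renderer(_:didUpdate:for:)` to resize the visualization as ARKit refines its surface estimate.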
User Experience
Build compelling, intuitive AR experiences by following these examples and Human Interface Guidelines > Augmented Reality.
- Managing Session Lifecycle and Tracking Quality
  Make your AR experience more robust by providing clear feedback, recovering from interruptions, and resuming previous sessions.
- Handling 3D Interaction and UI Controls in Augmented Reality
  Follow best practices for visual feedback, gesture interactions, and realistic rendering in AR experiences.
- SwiftShot: Creating a Game for Augmented Reality
  See how Apple built the featured demo for WWDC18, and get tips for making your own multiplayer games using ARKit, SceneKit, and Swift.
AR World Sharing and Persistence
- Creating a Multiuser AR Experience
  Transmit ARKit world-mapping data between nearby devices with the MultipeerConnectivity framework to create a shared basis for AR experiences.
- Creating a Persistent AR Experience
  Save and load ARKit world-mapping data to allow users to return to previous AR experiences in the same real-world environment.
- class ARWorldMap
  The space-mapping state and set of anchors from a world-tracking AR session.
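The persistence workflow above can be sketched in two halves: capture and archive the session's `ARWorldMap`, then later hand it back via `initialWorldMap` so anchors reappear in the same real-world positions. The file URL is a hypothetical location chosen by the caller:

```swift
import ARKit

func saveWorldMap(from session: ARSession, to url: URL) {
    // getCurrentWorldMap succeeds only once the session has
    // adequately mapped the surroundings.
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else { return }
        if let data = try? NSKeyedArchiver.archivedData(
                withRootObject: map, requiringSecureCoding: true) {
            try? data.write(to: url)
        }
    }
}

func restoreWorldMap(into session: ARSession, from url: URL) {
    guard let data = try? Data(contentsOf: url),
          let map = try? NSKeyedUnarchiver.unarchivedObject(
              ofClass: ARWorldMap.self, from: data)
    else { return }
    let configuration = ARWorldTrackingConfiguration()
    // Relocalize against the saved map instead of starting fresh.
    configuration.initialWorldMap = map
    session.run(configuration,
                options: [.resetTracking, .removeExistingAnchors])
}
```

Sharing the same archived data over a network (e.g. MultipeerConnectivity) is what turns this into a multiuser experience.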
Image Detection and Tracking
Use known 2D images in the user's environment to enhance a world-tracking AR session.
- Recognizing Images in an AR Experience
  Detect known 2D images in the user's environment, and use their positions to place AR content.
- class ARReferenceImage
  An image to be recognized in the real-world environment during a world-tracking AR session.
- class ARImageAnchor
  Information about the position and orientation of an image detected in a world-tracking AR session.
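Reference images are usually bundled in an asset catalog, but they can also be built in code, which is useful for images downloaded at runtime. A minimal sketch, where `cgImage`, the physical width (in meters, which ARKit needs to judge scale), and the name "poster" are assumptions supplied by the caller:

```swift
import ARKit

func detectImage(_ cgImage: CGImage, widthInMeters: CGFloat,
                 in session: ARSession) {
    let reference = ARReferenceImage(cgImage,
                                     orientation: .up,
                                     physicalWidth: widthInMeters)
    reference.name = "poster" // hypothetical label for later lookup
    let configuration = ARWorldTrackingConfiguration()
    configuration.detectionImages = [reference]
    session.run(configuration)
    // ARKit adds an ARImageAnchor to the session when the image
    // is recognized in the camera feed.
}
```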
Object Detection
Use known 3D objects in the user's environment to enhance a world-tracking AR session.
- Scanning and Detecting 3D Objects
  Record spatial features of real-world objects, then use the results to find those objects in the user's environment and trigger AR content.
- class ARReferenceObject
  A 3D object to be recognized in the real-world environment during a world-tracking AR session.
- class ARObjectAnchor
  Information about the position and orientation of a real-world 3D object detected in a world-tracking AR session.
- class ARObjectScanningConfiguration
  A configuration that uses the back-facing camera to collect high-fidelity spatial data for use in scanning 3D objects for later detection.
Hit Testing and Real-World Positions
- class ARHitTestResult
  Information about a real-world surface found by examining a point in the device camera view of an AR session.
- class ARAnchor
  A real-world position and orientation that can be used for placing objects in an AR scene.
- protocol ARAnchorCopying
  Support for custom ARAnchor subclasses.
- protocol ARTrackable
  A real-world object in a scene for which ARKit tracks changes to position and orientation.
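The hit-testing classes above typically combine as follows: a 2D screen point is projected into the scene, the first `ARHitTestResult` provides a real-world transform, and an `ARAnchor` is added there. A sketch, assuming `sceneView` is an `ARSCNView` and the point comes from something like a `UITapGestureRecognizer`:

```swift
import ARKit

func placeAnchor(at point: CGPoint, in sceneView: ARSCNView) {
    // Prefer detected planes, but fall back to an estimated plane.
    let results = sceneView.hitTest(point,
                                    types: [.existingPlaneUsingExtent,
                                            .estimatedHorizontalPlane])
    guard let result = results.first else { return }
    // worldTransform encodes the real-world position and orientation.
    let anchor = ARAnchor(transform: result.worldTransform)
    sceneView.session.add(anchor: anchor)
    // Supply geometry for the anchor in the ARSCNViewDelegate's
    // renderer(_:nodeFor:) or renderer(_:didAdd:for:) callback.
}
```

Anchoring content this way, rather than positioning nodes directly, lets ARKit keep the content registered to the real world as tracking improves.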
Camera and Scene Details
- class ARFrame
  A video image, with position-tracking information, captured as part of an AR session.
- class ARCamera
  Information about the camera position and imaging characteristics for a captured video frame in an AR session.
- class ARLightEstimate
  Estimated scene lighting information associated with a captured video frame in an AR session.
Face Tracking
Use the TrueDepth camera on iPhone X to create AR experiences that respond to the user's face and facial expressions.
- Creating Face-Based AR Experiences
  Use the information provided by a face-tracking AR session to place and animate 3D content.
- class ARFaceTrackingConfiguration
  A configuration that tracks the movement and expressions of the user's face with the TrueDepth camera.
- class ARFaceAnchor
  Information about the pose, topology, and expression of a face detected in a face-tracking AR session.
- class ARDirectionalLightEstimate
  Estimated environmental lighting information associated with a captured video frame in a face-tracking AR session.
Specialized Configurations
- class AROrientationTrackingConfiguration
  A configuration that uses the back-facing camera and tracks only a device's orientation.
- class ARImageTrackingConfiguration
  A configuration that uses the back-facing camera to detect and track known images.
Related Technologies
- Creating an Immersive AR Experience with Audio
  Use sound effects and environmental sound layers to create an engaging AR experience.
- Using Vision in Real Time with ARKit
  Manage Vision resources for efficient execution of a Core ML image classifier, and use SpriteKit to display image classifier output in AR.