Updated: September 24, 2021.
TL;DR
Google ARCore allows you to build apps for Android and iOS. With Apple ARKit you can build apps for iOS; with Apple RealityKit – for iOS and macOS. And the good old PTC Vuforia was designed to create apps for Android, iOS and the Universal Windows Platform.
A crucial peculiarity of Vuforia is that it uses ARCore/ARKit technology if the hardware it's running on supports it; otherwise it falls back on its own AR technology and engine, known as a software solution without dependent hardware.
When developing for Android OEM smartphones, you may encounter an unpleasant issue: devices from different manufacturers need sensor calibration in order to deliver the same AR experience. Luckily, Apple gadgets have no such drawback, because all the sensors used there were calibrated under identical conditions.
Let me put first things first.
Google ARCore 1.27
ARCore was released in March 2018. It is based on three fundamental concepts: Motion Tracking, Environmental Understanding and Light Estimation. ARCore allows a supported mobile device to track its position and orientation relative to the world in six degrees of freedom (6DoF) using a special technique called Concurrent Odometry and Mapping (COM). COM helps detect the size and location of horizontal, vertical and angled tracked surfaces. Motion Tracking works robustly thanks to optical data coming from an RGB camera at 60 fps, combined with inertial data coming from the gyroscope and accelerometer at 1000 fps, and depth data coming from a ToF sensor at 60 fps. Surely, ARKit, Vuforia and other AR libraries operate in much the same way.
When you move your phone through the real environment, ARCore tracks the surrounding space to understand where the smartphone is relative to the world coordinates. At the tracking stage, ARCore "sows" so-called feature points. These feature points are visible to the RGB camera, and ARCore uses them to compute the phone's change in location. The visual data is then combined with measurements from the IMU (Inertial Measurement Unit) to estimate the position and orientation of the ArCamera over time. If a phone isn't equipped with a ToF sensor, ARCore looks for clusters of feature points that appear to lie on horizontal, vertical or angled surfaces and makes these surfaces available to your app as planes (this technique is called Plane Detection). After the detection process you can use these planes to place 3D objects in your scene. Virtual geometry with assigned shaders is rendered by ARCore's companion – Sceneform, which supports Filament, a real-time Physically Based Rendering (a.k.a. PBR) engine.
Notwithstanding the above, the Sceneform repository has now been archived and it is no longer actively maintained by Google. The last released version was Sceneform 1.17.1. That may sound strange, but an ARCore team member said that "there's no direct replacement for the Sceneform library", and ARCore developers are free to use any 3D game library with Android AR apps (see the video from Google I/O'21, time 06:20).
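Regardless of which rendering library you pick, Plane Detection itself is configured at the ARCore Session level. Here's a minimal Kotlin sketch, assuming camera permission is already granted and the per-frame call happens in your own render loop (the helper names are mine, not part of the SDK):

import android.content.Context
import com.google.ar.core.Config
import com.google.ar.core.Frame
import com.google.ar.core.Plane
import com.google.ar.core.Session
import com.google.ar.core.TrackingState

// Create a Session that searches for horizontal and vertical planes.
fun createPlaneTrackingSession(context: Context): Session {
    val session = Session(context)
    val config = Config(session)
    config.planeFindingMode = Config.PlaneFindingMode.HORIZONTAL_AND_VERTICAL
    session.configure(config)
    return session
}

// Call once per frame (after session.update()) to collect the planes ARCore is tracking.
fun trackedPlanes(frame: Frame): List<Plane> =
    frame.getUpdatedTrackables(Plane::class.java)
        .filter { it.trackingState == TrackingState.TRACKING }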
ARCore's environmental understanding lets you place 3D objects with correct depth occlusion in a way that realistically integrates with the real world. For example, you can place a virtual cup of coffee on the table using Depth hit-testing and ArAnchors.
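Here's what that "cup of coffee" placement could look like in code – a hedged sketch where the tap coordinates come from your own touch listener and rendering of the model is left to whatever engine you use (the helper name is an assumption):

import com.google.ar.core.Anchor
import com.google.ar.core.Frame
import com.google.ar.core.Plane

// Hit-test the tap against detected planes and pin the first valid hit with an Anchor.
fun anchorFromTap(frame: Frame, tapX: Float, tapY: Float): Anchor? {
    val hit = frame.hitTest(tapX, tapY).firstOrNull { result ->
        val trackable = result.trackable
        trackable is Plane && trackable.isPoseInPolygon(result.hitPose)
    }
    return hit?.createAnchor()   // attach your virtual cup of coffee to this anchor
}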
ARCore can also estimate the lighting parameters of the real environment and provide you with the average intensity and color correction of a given camera image. This data lets you light your virtual scene under the same conditions as the environment around you, considerably increasing the sense of realism.
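In the default AMBIENT_INTENSITY mode those values can be read per frame roughly like this (applying them to your materials is up to your renderer; the helper name is mine):

import com.google.ar.core.Frame
import com.google.ar.core.LightEstimate

// Returns the average pixel intensity plus the 4-component color correction values,
// or null while ARCore hasn't produced a valid estimate yet.
fun readLightEstimate(frame: Frame): Pair<Float, FloatArray>? {
    val estimate: LightEstimate = frame.lightEstimate
    if (estimate.state != LightEstimate.State.VALID) return null
    val colorCorrection = FloatArray(4)
    estimate.getColorCorrection(colorCorrection, 0)
    return estimate.pixelIntensity to colorCorrection
}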
The current version of ARCore has such significant APIs as the Raw Depth API and Full Depth API, Lighting Estimation, Augmented Faces, Augmented Images, Instant Placement, Debugging Tools, 365-day Cloud Anchors, Recording and Playback, and Multiplayer support. The main advantage of ARCore in Android Studio over ARKit in Xcode is the Android Emulator, which allows you to run and debug AR apps on a virtual device.
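Most of the features just listed are plain switches on the session's Config. A hedged sketch of turning a few of them on (the helper name is mine; device checks are reduced to the depth test):

import com.google.ar.core.Config
import com.google.ar.core.Session

// Enables several optional ARCore features on an already-created Session.
fun enableOptionalFeatures(session: Session) {
    val config = Config(session)
    // Depth API – only on devices whose hardware supports it.
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
    }
    // HDR Lighting Estimation, Instant Placement and Cloud Anchors.
    config.lightEstimationMode = Config.LightEstimationMode.ENVIRONMENTAL_HDR
    config.instantPlacementMode = Config.InstantPlacementMode.LOCAL_Y_UP
    config.cloudAnchorMode = Config.CloudAnchorMode.ENABLED
    session.configure(config)
}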
This table presents the difference between the Raw Depth API and the Full Depth API:
|          | Raw Depth API  | Full Depth API |
|----------|----------------|----------------|
| Accuracy | Awesome        | Bad            |
| Coverage | Not all pixels | All pixels     |
| Distance | 0.5 to 5.0 m   | 0 to 8.0 m     |
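In code the two flavors are just different acquisition calls on the same Frame (ARCore 1.24+, with depthMode enabled in the Config). A minimal sketch – both calls throw NotYetAvailableException during the first frames, and the returned images must be closed after use:

import android.media.Image
import com.google.ar.core.Frame
import com.google.ar.core.exceptions.NotYetAvailableException

// Full Depth API: a smoothed depth map with a value for every pixel.
fun fullDepthOrNull(frame: Frame): Image? = try {
    frame.acquireDepthImage()
} catch (e: NotYetAvailableException) {
    null   // depth data isn't ready yet
}

// Raw Depth API: sparser but more accurate samples plus a confidence image.
fun rawDepthOrNull(frame: Frame): Pair<Image, Image>? = try {
    frame.acquireRawDepthImage() to frame.acquireRawDepthConfidenceImage()
} catch (e: NotYetAvailableException) {
    null
}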
ARCore is older than ARKit. Do you remember Project Tango, released in 2014? Roughly speaking, ARCore is just a rewritten Tango SDK. But the wise acquisitions of FlyBy Media, Faceshift, MetaIO, Camerai and Vrvana helped Apple not only catch up but significantly overtake Google. I suppose that's good for the AR industry.
The latest version of ARCore supports OpenGL ES acceleration and integrates with Unity, Unreal and Web applications. At the moment the most powerful and energy-efficient chipsets for AR experiences on the Android platform are the Snapdragon 888 Plus (5 nm), Exynos 2100 (5 nm) and Kirin 9000 (5 nm) – now Google and Huawei are almost friends again.
ARCore price: FREE.
| ARCore PROs                       | ARCore CONs                  |
|-----------------------------------|------------------------------|
| iToF and Depth API support        | No Body Tracking support     |
| Quick Plane Detection             | Cloud Anchors hosted online  |
| Long-distance accuracy            | Lack of rendering engines    |
| ARCore Emulator in Android Studio | Poor developer documentation |
| High-quality Lighting API         | No external camera support   |
| A lot of supported devices        | Poor Google Glass API        |
Here's an ARCore (plus Sceneform) code snippet written in Kotlin:
import com.google.ar.core.Anchor
import com.google.ar.sceneform.AnchorNode
import com.google.ar.sceneform.math.Vector3
import com.google.ar.sceneform.rendering.Renderable
import com.google.ar.sceneform.ux.ArFragment
import com.google.ar.sceneform.ux.TransformableNode

private fun addNodeToScene(fragment: ArFragment,
                           anchor: Anchor,
                           renderable: Renderable) {
    // Wrap the ARCore anchor in an AnchorNode and attach it to the Sceneform scene.
    val anchorNode = AnchorNode(anchor)
    anchorNode.setParent(fragment.arSceneView.scene)
    // A TransformableNode lets the user move, rotate and scale the model with gestures.
    val modelNode = TransformableNode(fragment.transformationSystem)
    modelNode.setParent(anchorNode)
    modelNode.renderable = renderable
    modelNode.localPosition = Vector3(0.0f, 0.0f, -3.0f)
    modelNode.select()
}
Apple ARKit 5.0
ARKit was released in June 2017. Like its competitors, ARKit uses a special technique for tracking, but its name is Visual Inertial Odometry. VIO is used to very accurately track the world around your device, and it is quite similar to the COM found in ARCore. There are also three similar fundamental concepts in ARKit: World Tracking, Scene Understanding (which includes four stages: Plane Detection, Ray-Casting, Light Estimation and Scene Reconstruction), and Rendering with the great help of ARKit's companions – the SceneKit framework, which has actually been Apple's 3D game engine since 2012; the RealityKit framework, specially made for AR and written in Swift from scratch (released in 2019); and the SpriteKit framework with its 2D engine (since 2013).
VIO fuses RGB sensor data at 60 fps with Core Motion data (IMU) at 1000 fps and with LiDAR data. It should be noted that, due to a very high energy impact (an enormous burden on the CPU and GPU), your iPhone's battery will be drained pretty quickly. The same can be said about Android devices.
ARKit has a handful of useful approaches for robust tracking and accurate measurements. Among its arsenal you can find easy-to-use functionality for saving and retrieving ARWorldMaps. A world map is an indispensable "portal" for Persistent and Multiuser AR experiences, allowing you to come back to the same environment filled with the same chosen 3D content just as it was at the moment your app became inactive. Support for simultaneous front and back camera capture, as well as support for collaborative sessions, is also great.
There is good news for gamers: up to six people are able to play the same AR game simultaneously, thanks to the MultipeerConnectivity framework.