- Apple ARKit 3 now allows tracking and occlusion of the human body
- Many new features and improvements
- Apple ARKit 3 paves the way for 2020 glasses
As part of its annual developer conference, WWDC 2019, Apple announced ARKit 3, the new version of its augmented reality framework. Launched in 2017, ARKit lets developers build AR applications for iOS, and this new version introduces several major advances.
Apple ARKit 3 now allows tracking and occlusion of the human body
First, this third iteration of the framework supports body tracking and people occlusion, so augmented reality objects can now be placed in front of or behind people in a realistic way. This is a massive improvement for ARKit, which until now only let iPhones place virtual objects on flat surfaces using computer vision.
It is this same technology that now allows ARKit 3 to understand where people stand in the space visible to the camera, so the system can place objects in front of or behind them depending on their distance from the device. Augmented reality therefore becomes much more believable.
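As a rough illustration, people occlusion in ARKit 3 is enabled through the frame semantics of a world-tracking session. The snippet below is a minimal sketch, not a complete app; it assumes a device that supports the feature (A12-class chips and later).

```swift
import ARKit

// Minimal sketch: enabling ARKit 3's people occlusion on a world-tracking
// session. Availability must be checked, since older devices lack support.
let configuration = ARWorldTrackingConfiguration()

if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    // Segment people in the camera feed and estimate their depth, so the
    // renderer can draw virtual content in front of or behind them.
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}

let session = ARSession()
session.run(configuration)
```

With `.personSegmentationWithDepth` set, the renderer receives a per-frame segmentation and depth estimate for people, which is what makes distance-aware occlusion possible.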
Better still, thanks to body tracking, users can now interact with virtual objects directly through gestures. ARKit 3 tracks a virtual "copy" of the person filmed by the iPhone, making it possible, for example, to reproduce that person's body movements on an avatar for video games or other types of applications.
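The avatar scenario described above maps onto ARKit 3's body-tracking configuration: the session delivers a body anchor whose skeleton joints can drive a rig. This is a hedged sketch of that flow; the avatar itself and how its joints are applied are left as assumptions.

```swift
import ARKit

// Sketch of ARKit 3 body tracking: run an ARBodyTrackingConfiguration and
// read joint transforms from the ARBodyAnchor delivered to the delegate.
final class BodyTrackingDelegate: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let bodyAnchor as ARBodyAnchor in anchors {
            // Each skeleton joint (here, the head) could drive the matching
            // joint of an avatar rig — the mapping is application-specific.
            if let headTransform = bodyAnchor.skeleton.modelTransform(for: .head) {
                _ = headTransform // apply to the avatar's head node here
            }
        }
    }
}

// Body tracking requires an A12-class device or later.
if ARBodyTrackingConfiguration.isSupported {
    let session = ARSession()
    let delegate = BodyTrackingDelegate()
    session.delegate = delegate
    session.run(ARBodyTrackingConfiguration())
}
```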
Many new features and improvements
In addition to these new features, ARKit 3 will allow simultaneous capture from the iPhone's front and back cameras, motion capture, faster reference image loading, automatic image size detection, better visual consistency, and improved 3D object detection.
The Quick Look app will also gain augmented reality video recording, audio support, multiple-model support, and augmented reality payment via Apple Pay. It will also allow native iOS applications such as Safari, Messages, and Mail to quickly view 3D objects to scale in augmented reality.
Multi-face tracking will now be supported, and it will be possible to launch collaborative sessions. Up to 100 images can be detected. The framework will also support HDR environment textures. Finally, Apple unveiled the AR Coaching user interface for guiding users through augmented reality setup.
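Several of the session-level features listed above correspond to flags on the world-tracking configuration. The fragment below is a sketch of how they might be set; the specific values (such as the number of tracked images) are illustrative assumptions.

```swift
import ARKit

// Sketch of new ARKit 3 session options on a world-tracking configuration.
let config = ARWorldTrackingConfiguration()

// Collaborative sessions: share anchors with nearby devices.
config.isCollaborationEnabled = true

// Track multiple reference images at once (4 here is an arbitrary example).
config.maximumNumberOfTrackedImages = 4

// Generate HDR environment textures for more realistic reflections.
config.wantsHDREnvironmentTextures = true
```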
Apple ARKit 3 paves the way for 2020 glasses
While it is no longer really a secret that Apple is making augmented reality glasses, few could have suspected that Apple's research into augmented reality was so advanced. The Californian firm has bought many startups dedicated to augmented reality in recent years without revealing its projects, and the veil is finally lifting with Apple ARKit 3.
This new version of the framework is therefore one of Apple's biggest surprises in years, perhaps its biggest innovation since the iPhone. At the very least, it is now clear that the role of ARKit on iOS is to pave the way for the future Apple Glass glasses and their OS, which will likely be announced in a little over a year.
Occlusion and body tracking will be essential features for augmented reality glasses, since the user will wear them at all times. If virtual objects do not overlap correctly with the people the wearer passes, the realism of the experience will be totally ruined. Apple is thus one step ahead of competitors such as Facebook and Microsoft; however, we do not know what cards they are hiding up their sleeves...