Augmented Reality has gone to the next level with ARKit 3, which was unveiled at Apple's Worldwide Developers Conference, and Tech Guide got the chance to experience it first-hand.
The first of the two main new features is motion capture, which lets developers integrate human movement into their apps simply by filming someone making those movements.
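For developers, motion capture surfaces as a body-tracking session whose anchors carry a skeleton of joint transforms. The sketch below is illustrative rather than a complete app, and it only runs on supported hardware:

```swift
import ARKit

// Sketch: running an ARKit 3 body-tracking session (device-only; not a full app).
class BodyTrackingController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Body tracking requires recent hardware, so check support first.
        guard ARBodyTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARBodyTrackingConfiguration())
    }

    // Each ARBodyAnchor carries a skeleton of joint transforms that an app
    // can map onto its own rigged 3D character.
    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let body as ARBodyAnchor in anchors {
            let rootTransform = body.skeleton.modelTransform(for: .root)
            _ = rootTransform // drive an animated model from joint transforms here
        }
    }
}
```

The delegate callback fires per frame, so an app mirrors the filmed person's pose in real time by re-posing its character from the latest joint transforms.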
The other new feature is People Occlusion, which allows AR content to appear in front of or behind people, enabling a more immersive AR experience.
In Swift Strike, a game we played, we could not only see a large bowling ball in front of us but also interact with it.

Behind us, a set of pins and other virtual objects remained visible at the same time.
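Under the hood, People Occlusion is a frame-semantics option on the session configuration. A minimal sketch, assuming a standard world-tracking setup (the `arView` reference is a placeholder for the app's AR view, and device support must be checked at runtime):

```swift
import ARKit

// Sketch: enabling People Occlusion on a world-tracking session.
let configuration = ARWorldTrackingConfiguration()
if ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) {
    // With depth-based person segmentation, virtual content is drawn
    // in front of or behind people depending on their distance.
    configuration.frameSemantics.insert(.personSegmentationWithDepth)
}
// arView.session.run(configuration)  // arView: the app's ARView/ARSCNView
```

This is why the bowling pins could sit behind us while the ball stayed in front: the renderer composites virtual objects around people per pixel rather than always drawing them on top.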
ARKit 3 also supports simultaneous use of the front and back cameras, with the front camera able to track up to three faces at the same time.
This also makes it easier to join a shared AR experience.
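In code, these capabilities hang off the face-tracking configuration. A hedged sketch, assuming ARKit 3's face-tracking APIs on supported hardware:

```swift
import ARKit

// Sketch: tracking multiple faces with the front camera.
let faceConfig = ARFaceTrackingConfiguration()

// ARKit 3 raises the ceiling to three simultaneously tracked faces
// on supported devices; query the limit rather than hard-coding it.
faceConfig.maximumNumberOfTrackedFaces =
    ARFaceTrackingConfiguration.supportedNumberOfTrackedFaces

// Simultaneous front- and back-camera use is opted into where supported,
// letting face tracking run alongside world tracking.
if ARFaceTrackingConfiguration.supportsWorldTracking {
    faceConfig.isWorldTrackingEnabled = true
}
// session.run(faceConfig)
```

Each tracked face arrives as its own `ARFaceAnchor`, so an app can apply effects to several people in frame at once.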
Another useful tool for creating AR apps is Apple's RealityKit, which offers photorealistic rendering, environment mapping and support for camera effects like noise and motion blur.

RealityKit also includes animation, physics and spatial audio.
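RealityKit's high-level scene API keeps this simple: entities are anchored into an `ARView` and the framework handles rendering, environment mapping and camera-matched effects automatically. A minimal sketch (the box entity here is a stand-in for real app content):

```swift
import RealityKit

// Sketch: anchoring a simple RealityKit entity to a horizontal surface.
let arView = ARView(frame: .zero)

// A procedurally generated box with a physically based material;
// RealityKit lights and environment-maps it to match the camera feed.
let box = ModelEntity(
    mesh: .generateBox(size: 0.1),
    materials: [SimpleMaterial(color: .white, isMetallic: true)]
)

let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(box)
arView.scene.addAnchor(anchor)
```

From there, the same entity can be given physics, animation or spatial audio without leaving RealityKit.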
Reality Composer is a powerful new app for iOS, iPadOS and Mac which lets developers create impressive AR experiences without needing to know anything about 3D modelling or rendering.
Creators can place, move and rotate AR objects to build their experience, which can then be integrated into an app in Xcode or exported to AR Quick Look.
* Stephen Fenech travelled to San Jose as a guest of Apple.