by Vignesh | 09 July, 2018

ARKit: a new iOS framework

At WWDC17, Apple announced ARKit, a new iOS framework that "allows you to easily create unparalleled augmented reality experiences for iPhone and iPad".


Augmented reality is simply the ability to place virtual elements into the real world digitally and to interact with them as if they were actually present. ARKit powers experiences that add 2D or 3D elements to the live view from a device's camera in a way that makes those elements appear to inhabit the real world. It combines device motion tracking, camera scene capture, advanced scene processing, and display conveniences to simplify the task of building an AR experience, and you can use these technologies to create many kinds of AR experiences with either the back or the front camera of an iOS device.
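As a very small sketch of what this looks like in code, the snippet below starts a world-tracking session with the native Swift API (in a real app the session would usually belong to a view or your own renderer):

```swift
import ARKit

// A minimal sketch: one ARSession plus one configuration is all it
// takes to start world tracking (camera + motion-sensor fusion).
func startARSession() -> ARSession {
    let session = ARSession()
    let configuration = ARWorldTrackingConfiguration()
    session.run(configuration)
    return session
}
```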


Augmented reality is getting a lot of buzz right now because of Apple's new framework. One of the first, and probably the most famous, apps to show us the power of AR was Pokémon Go. Building apps with the same level of interactivity isn't easy, and that's why I think ARKit will make the difference: by bringing native AR support to iOS, it makes AR far more accessible to developers.


ARKit estimates lighting using the camera sensor, analyses what the camera sees to find horizontal planes such as tables and floors, and can place and track objects on anchor points. You can render 3D objects with Metal or SceneKit, or with third-party tools such as Unity and Unreal Engine. ARKit does all of this with great performance, and it's well documented.
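A minimal sketch of these features with SceneKit rendering, assuming a plain UIKit view controller: ARSCNView draws the camera feed plus your SceneKit content, while the configuration flags opt in to plane detection and light estimation.

```swift
import UIKit
import ARKit

// Sketch: ARSCNView renders the camera feed and SceneKit content;
// the configuration enables horizontal-plane detection and lighting
// estimation on top of basic world tracking.
class ARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = .horizontal
        configuration.isLightEstimationEnabled = true
        sceneView.session.run(configuration)
    }
}
```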


Augmented Reality with the Back Camera:

The most common kind of AR experience displays the view from an iOS device's back-facing camera, augmented by other visual content, giving the user a new way to see and interact with the world around them. ARKit maps and tracks the real-world space the user inhabits and matches it with a coordinate space in which you place virtual content. World tracking also offers features that make AR experiences more immersive, such as recognising objects and images in the user's environment and responding to real-world lighting conditions.
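Image recognition, for instance, is just another option on the world-tracking configuration (added in ARKit 1.5); the asset-catalog group name "AR Resources" below is an assumption for illustration:

```swift
import ARKit

// Sketch: recognising known images during world tracking (ARKit 1.5+).
// "AR Resources" is an assumed asset-catalog group holding reference
// images that you supply yourself.
let configuration = ARWorldTrackingConfiguration()
if let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                          bundle: nil) {
    configuration.detectionImages = referenceImages
}
```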


Augmented Reality with the Front Camera:

On iPhone X, ARKit uses the front-facing TrueDepth camera to provide real-time information about the pose and expression of the user's face, which you can use to render virtual content. For example, you might show the user's face in a camera view and apply realistic virtual masks. You can also omit the camera view entirely and use ARKit's facial expression data to animate virtual characters, as seen in the Animoji app for iMessage.
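A rough sketch of reading that expression data with the native API; the FaceTracker class and the smile read-out are illustrative, not a prescribed pattern:

```swift
import ARKit

// Sketch: reading facial-expression data from the TrueDepth camera.
// Blend-shape coefficients (0...1) describe expressions such as a
// smile, which you could map onto a virtual character's rig.
class FaceTracker: NSObject, ARSessionDelegate {
    func start(on session: ARSession) {
        guard ARFaceTrackingConfiguration.isSupported else { return }  // TrueDepth devices only
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            let smile = faceAnchor.blendShapes[.mouthSmileLeft]?.floatValue ?? 0
            print("Smile amount: \(smile)")
        }
    }
}
```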


Useful ARKit components (as exposed in the Unity ARKit plugin):

Physical camera feed:

Place this component on the physical camera object. It grabs the textures needed for rendering the video, sets them on the material used for blitting to the backbuffer, and sets up the command buffer that does the actual blit.
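The native-ARKit analogue of this component is reading the captured camera image off each ARFrame (ARSCNView performs this blit for you automatically); a sketch:

```swift
import ARKit

// Sketch: each ARFrame carries the camera image as a CVPixelBuffer.
// A custom renderer would upload it as a texture and draw it as the
// background before compositing virtual content on top.
class CameraFeedReader: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let pixelBuffer: CVPixelBuffer = frame.capturedImage
        _ = pixelBuffer  // hand this to Metal (or your renderer of choice)
    }
}
```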


Virtual camera manager:

Place this component on a GameObject in the scene that references the virtual camera you intend to control via ARKit. It positions and rotates the camera, and provides it with the correct projection matrix, based on updates from ARKit. This component also contains the code that initializes an ARKit session.
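In native terms, the same job amounts to applying the per-frame camera pose and projection matrix that ARKit provides; the viewport size and clipping planes in this sketch are assumed values:

```swift
import UIKit
import ARKit

// Sketch: per frame, ARKit supplies the camera pose and a projection
// matrix that your virtual camera should adopt.
class VirtualCameraSync: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let pose: simd_float4x4 = frame.camera.transform  // position + rotation
        let projection = frame.camera.projectionMatrix(for: .landscapeRight,
                                                       viewportSize: CGSize(width: 1920, height: 1080),
                                                       zNear: 0.001,
                                                       zFar: 1000)
        _ = (pose, projection)  // apply both to your virtual camera here
    }
}
```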


Plane anchor GameObjects:

For each plane anchor ARKit detects, this component generates a GameObject instantiated from a referenced prefab and positioned, scaled, and rotated according to the detected plane. As the plane anchor is updated or removed, so is the corresponding GameObject.
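The native counterpart hangs off ARSCNViewDelegate: ARKit creates a node per anchor, and you attach plane geometry to it and resize it as the anchor is refined. A sketch:

```swift
import ARKit
import SceneKit

// Sketch: create a node per detected plane and keep it in sync as
// ARKit refines the anchor's center and extent.
class PlaneVisualizer: NSObject, ARSCNViewDelegate {
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor else { return }
        let geometry = SCNPlane(width: CGFloat(plane.extent.x),
                                height: CGFloat(plane.extent.z))
        let planeNode = SCNNode(geometry: geometry)
        planeNode.eulerAngles.x = -.pi / 2  // SCNPlane is vertical by default
        planeNode.position = SCNVector3(plane.center.x, 0, plane.center.z)
        node.addChildNode(planeNode)
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let plane = anchor as? ARPlaneAnchor,
              let planeNode = node.childNodes.first,
              let geometry = planeNode.geometry as? SCNPlane else { return }
        geometry.width = CGFloat(plane.extent.x)
        geometry.height = CGFloat(plane.extent.z)
        planeNode.position = SCNVector3(plane.center.x, 0, plane.center.z)
    }
}
```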


Point cloud visualizer:

This component references a particle system prefab, a maximum number of particles, and a size per particle, which it uses to visualize the point cloud as particles in space.
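Natively, the underlying data comes from ARFrame's raw feature points; a sketch of reading them (spawning the actual particles is left out):

```swift
import ARKit

// Sketch: ARKit exposes the raw feature points it is tracking; a
// visualizer would emit one particle per world-space position.
class PointCloudReader: NSObject, ARSessionDelegate {
    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let cloud = frame.rawFeaturePoints else { return }
        for point in cloud.points {      // [simd_float3], world coordinates
            _ = point                    // place a particle at `point` here
        }
    }
}
```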


Hit test:

This component references the root transform of a GameObject in the scene and performs an ARKit hit test against the scene wherever the user touches the screen. When the hit succeeds (against the hit-test result types enumerated in the script), it moves the referenced GameObject to that hit point.
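A sketch of the same flow in native ARKit, where `objectNode` stands in for the referenced GameObject; the two result types chosen here are just one reasonable combination:

```swift
import ARKit
import SceneKit

// Sketch: project a screen touch into the AR scene and move a node
// to the first hit point on a detected (or estimated) plane.
func handleTap(at screenPoint: CGPoint, in sceneView: ARSCNView, moving objectNode: SCNNode) {
    let results = sceneView.hitTest(screenPoint,
                                    types: [.existingPlaneUsingExtent, .estimatedHorizontalPlane])
    if let hit = results.first {
        let t = hit.worldTransform.columns.3   // translation column of the hit transform
        objectNode.position = SCNVector3(t.x, t.y, t.z)
    }
}
```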


Light estimation:

When added to a light in the scene, this component scales that light's intensity to match the estimated lighting of the real scene being viewed.
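A native sketch of the same idea, driving a SceneKit light from ARKit's ambient-intensity estimate; `sceneLight` is an assumed reference to a light in your scene:

```swift
import ARKit
import SceneKit

// Sketch: scale a SceneKit light from ARKit's per-frame ambient
// intensity estimate (expressed in lumens, where 1000 is "neutral").
class LightEstimator: NSObject, ARSessionDelegate {
    var sceneLight: SCNLight?

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let estimate = frame.lightEstimate else { return }
        sceneLight?.intensity = estimate.ambientIntensity
    }
}
```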

