Tracking

From Kudan AR Engine

General Tracking

What kinds of tracking does KudanAR provide?

KudanAR currently provides 2D image (marker) tracking and Arbitrary Tracking. Full SLAM is under development and will be released at a later date.

Can I mix different kinds of tracking?

Yes. Kudan's tracker is designed to switch tracking modes on the fly, so you can start a scene in marker mode and switch between Marker Tracking and Arbitrary Tracking as many times as you like.

How can I share content across nodes?

The iOS SDK contains an ARMultiTrackableNode class that automatically reparents itself to whichever of its associated markers is detected. On Android or the Unity plugin, you would have to manually write a function that changes the parent of a node when a marker is detected.
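On those platforms, the manual reparenting logic might look like the following minimal sketch. The Node class and the detection flow here are hypothetical stand-ins, not part of the Kudan API; they only illustrate the idea of moving one shared content node between marker nodes so it is only ever attached to the most recently detected marker.

```java
import java.util.ArrayList;
import java.util.List;

// Minimal stand-in for a scene-graph node; NOT the Kudan API.
class Node {
    private Node parent;
    private final List<Node> children = new ArrayList<>();

    // Detach from the current parent and attach to the new one, so the
    // same content node is only ever under one marker at a time.
    void setParent(Node newParent) {
        if (parent != null) {
            parent.children.remove(this);
        }
        parent = newParent;
        if (newParent != null) {
            newParent.children.add(this);
        }
    }

    Node getParent() { return parent; }
    List<Node> getChildren() { return children; }
}

class MultiMarkerDemo {
    public static void main(String[] args) {
        Node markerA = new Node();
        Node markerB = new Node();
        Node content = new Node();

        // Pretend marker A was detected first, then marker B.
        content.setParent(markerA);
        content.setParent(markerB);

        // The content now lives under marker B only.
        System.out.println(content.getParent() == markerB);  // true
        System.out.println(markerA.getChildren().isEmpty()); // true
    }
}
```

In a real app you would call the equivalent of setParent from your marker-detected callback, passing the node of the marker that was just found.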

Marker Tracking

What makes a good marker?

A good marker has lots of high-contrast corners distributed fairly evenly across the image. The pattern should be non-repetitive, and the marker's aspect ratio shouldn't be too extreme. For more information, see our page What makes a good marker?

How many different markers can I detect at once?

There is no limit on the number of markers you can detect at the same time.

Does your image tracker work with bad markers?

The Kudan image tracker can work with bad markers, but tracking quality may be degraded. There are several controls available in the native SDKs to fine-tune the tracking parameters to better suit poor markers.

My marker jitters / doesn't detect very well. What can I do?

First, try to improve the quality of your marker. If that isn't an option, you can adjust the tracker parameters for that marker: if the marker fails to detect, reduce the detection threshold; if tracking is unstable, adjust the number of features tracked and enable pose filtering.
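Pose filtering is essentially low-pass filtering of the reported pose over time. As a concept illustration only (this is a generic technique sketch, not Kudan's actual filter or API), here is an exponential moving-average filter applied to a tracked position; a smaller alpha gives a steadier but more sluggish pose.

```java
// Illustrative exponential low-pass filter for a tracked position.
// Generic technique sketch; not part of the Kudan SDK.
class PoseFilter {
    private final double alpha;  // 0 < alpha <= 1; smaller = smoother
    private double[] filtered;   // last smoothed position (x, y, z)

    PoseFilter(double alpha) {
        this.alpha = alpha;
    }

    // Blend each new raw measurement with the previous smoothed value.
    double[] update(double[] raw) {
        if (filtered == null) {
            filtered = raw.clone();  // first sample initialises the filter
        } else {
            for (int i = 0; i < filtered.length; i++) {
                filtered[i] = alpha * raw[i] + (1 - alpha) * filtered[i];
            }
        }
        return filtered.clone();
    }
}
```

With alpha = 0.2, a one-frame jitter spike of 1 unit in the raw pose is damped to 0.2 units in the smoothed output, at the cost of the pose lagging slightly behind fast genuine motion.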

ArbiTrack

What is ArbiTrack?

Arbitrary Tracking, ArbiTrack for short, is KudanAR's own SLAM-based tracking solution for tracking arbitrary parts of the camera image. This can be the entire camera image, including any new parts that come into view, or constrained to a narrower area such as a person. It initialises instantly without requiring different viewpoints, and is especially suited for tracking desk surfaces or floors in 6 degrees of freedom.

How does it work?

Without revealing too many technical details, ArbiTrack tracks feature points in the environment to determine where the device is in relation to its immediate surroundings. This allows the tracker to correctly place and track objects in the real world.

SLAM

What is SLAM?

SLAM stands for Simultaneous Localisation and Mapping. The camera draws a map of its surroundings as it moves around and can calculate its own position in the world by comparing what it sees to the map it has created.
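As a much-simplified illustration of the localisation half, suppose the map stores landmark positions in world space and the camera observes each landmark's position relative to itself. The camera's own position can then be recovered by comparing the two. This toy sketch assumes pure 2D translation with no rotation or noise; real SLAM solves a far harder joint mapping-and-localisation problem.

```java
class ToyLocalisation {
    // Given world-space landmark positions (the "map") and the same
    // landmarks as observed relative to the camera, estimate the
    // camera's position by averaging the per-landmark differences.
    static double[] locate(double[][] map, double[][] observed) {
        double x = 0, y = 0;
        for (int i = 0; i < map.length; i++) {
            // camera = landmark(world) - landmark(relative to camera)
            x += map[i][0] - observed[i][0];
            y += map[i][1] - observed[i][1];
        }
        return new double[]{x / map.length, y / map.length};
    }

    public static void main(String[] args) {
        double[][] map = {{5, 5}, {9, 1}};
        // A camera at (2, 1) sees these landmarks at (3, 4) and (7, 0).
        double[][] seen = {{3, 4}, {7, 0}};
        double[] cam = locate(map, seen);
        System.out.println(cam[0] + ", " + cam[1]); // 2.0, 1.0
    }
}
```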

What is the difference between Arbitrary Tracking and SLAM?

Arbitrary Tracking initialises instantly, without having to move the device to different viewpoints. It also works especially well on planar scenes such as floors, walls, or distant landscapes (which appear effectively flat at a distance). SLAM is better suited to mapping out environments, which it can recognise even after tracking has been lost.