Marker Basics iOS

From Kudan AR Engine

This tutorial will take you through the basics of getting started with KudanAR, leading on from our previous tutorial which described how to integrate our framework into your iOS project. If you have not already set up a project with an ARCameraViewController, we suggest you check it out before going any further.

This tutorial utilises bundled assets so ensure that you have imported the correct assets into your project.

For this sample we have used:

  • Marker : spaceMarker.jpg
  • Image : eyebrow.png
  • Video : waves.mp4
  • Alpha video : kaboom.mp4
  • Model : ben.armodel
  • Model Texture : bigBenTexture.png

All of which can be downloaded here.

Setting Up an Image Trackable

To create an Image Trackable, you first need an image to track. There are no restrictions on which image format you use, as long as it is supported natively by the device. For information about what makes a good marker, please read our page on What makes a good marker?

If your augmentation appears to twitch or shake, it is likely that the marker being used is of poor quality.

// Initialise image trackable
ARImageTrackable *imageTrackable = [[ARImageTrackable alloc] initWithImage:[UIImage imageNamed:@"spaceMarker.jpg"] name:@"space"];
// Get instance of image tracker manager
ARImageTrackerManager *trackerManager = [ARImageTrackerManager getInstance];
[trackerManager initialise];
// Add image trackable to image tracker manager
[trackerManager addTrackable:imageTrackable];
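In the KudanAR samples, this setup code typically lives in the setupContent method of an ARCameraViewController subclass, which the framework calls once the view has loaded. A minimal sketch (the class name MyViewController and the umbrella header path are our own assumptions; adjust them to your project):

#import <KudanAR/KudanAR.h>

@interface MyViewController : ARCameraViewController
@end

@implementation MyViewController

// Called by ARCameraViewController once the AR view is ready
- (void)setupContent
{
    // Initialise image trackable from a bundled marker image
    ARImageTrackable *imageTrackable = [[ARImageTrackable alloc] initWithImage:[UIImage imageNamed:@"spaceMarker.jpg"] name:@"space"];

    // Register it with the image tracker manager
    ARImageTrackerManager *trackerManager = [ARImageTrackerManager getInstance];
    [trackerManager initialise];
    [trackerManager addTrackable:imageTrackable];
}

@end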

Adding content to an Image Trackable

To add content to an Image Trackable, you need to wrap the content in the corresponding ARNode and add that node to the trackable’s world (the 3D space surrounding the marker). Kudan provides four different ARNode subclasses for this purpose:

  • ARImageNode
  • ARVideoNode
  • ARAlphaVideoNode
  • ARModelNode

Note: When adding any AR content to your application you should consider adding it on the background thread. This will help prevent the camera feed from stalling.
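One way to do this is with Grand Central Dispatch, deferring node creation to a background queue. A sketch, assuming imageTrackable has already been created and added to the tracker manager as shown above:

// Build the node off the main thread so the camera feed keeps running
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    ARImageNode *imageNode = [[ARImageNode alloc] initWithBundledFile:@"eyebrow.png"];
    [imageTrackable.world addChild:imageNode];
});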

Image Nodes

Images are displayed using the ARImageNode class. These are initialised with an image. This image can use any format that is supported by the device’s operating system. On iOS, supported formats include:

  • .jpg
  • .png
  • .bmp
  • .gif
  • .ico
  • .cur
  • .xbm
  • .tif

// Initialise image node
ARImageNode *imageNode = [[ARImageNode alloc] initWithBundledFile:@"eyebrow.png"];
// Add image node to the trackable's world
[imageTrackable.world addChild:imageNode];

Video Nodes

Videos are displayed using the ARVideoNode class. Video nodes are initialised with a video file, which can be in any format supported natively by the device. On iOS, supported formats include:

  • H.264 video, up to 1.5 Mbps, 640 by 480 pixels, 30 frames per second, Low-Complexity version of the H.264 Baseline Profile with AAC-LC audio up to 160 Kbps, 48 kHz, stereo audio in .m4v, .mp4, and .mov file formats
  • H.264 video, up to 768 Kbps, 320 by 240 pixels, 30 frames per second, Baseline Profile up to Level 1.3 with AAC-LC audio up to 160 Kbps, 48 kHz, stereo audio in .m4v, .mp4, and .mov file formats
  • MPEG-4 video, up to 2.5 Mbps, 640 by 480 pixels, 30 frames per second, Simple Profile with AAC-LC audio up to 160 Kbps, 48 kHz, stereo audio in .m4v, .mp4, and .mov file formats

// Initialise video node
ARVideoNode *videoNode = [[ARVideoNode alloc] initWithBundledFile:@"waves.mp4"];
// Add video node to the trackable's world
[imageTrackable.world addChild:videoNode];

Alpha Video Nodes

Alpha videos are videos with a transparency channel and can be created through our Toolkit using a set of transparent PNGs. Alpha videos are displayed using the ARAlphaVideoNode class. They are initialised in the same way as a video node.

// Initialise alpha video node
ARAlphaVideoNode *alphaVideoNode = [[ARAlphaVideoNode alloc] initWithBundledFile:@"kaboom.mp4"];
// Add alpha video node to the trackable's world
[imageTrackable.world addChild:alphaVideoNode];

Model Nodes

Models are displayed using the ARModelNode class. The model is imported using the ARModelImporter class. If you have correctly mapped your texture to your model, you only need to set the lighting values of each mesh node, as your texture is applied during the import of the model. Otherwise, a material can be applied to the model’s individual mesh nodes. This can be a colour material, a texture material, or a light material.

For more information on using 3D models with Kudan please check out our Wiki page on 3D Models.

Note: If you do not add lighting to your ARLightMaterial your material will show up as black.

// Import model
ARModelImporter *importer = [[ARModelImporter alloc] initWithBundled:@"ben.armodel"];
ARModelNode *modelNode = [importer getNode];

// Apply ambient light to model mesh nodes
for (ARMeshNode *meshNode in modelNode.meshNodes) {
    ARLightMaterial *material = (ARLightMaterial *)meshNode.material;
    material.ambient.value = [ARVector3 vectorWithValuesX:0.8 y:0.8 z:0.8];
}

// Add model node to the trackable's world
[imageTrackable.world addChild:modelNode];


If the image/video/model you wish to add to the marker isn’t the same size as the marker, you may wish to scale your ARNode. Provided your video/image has the same aspect ratio as your trackable, you can divide the trackable’s width (or height) by the node’s to get the correct scale factor. This value can then be used to scale your nodes. You can scale with a uniform value, or by scaling the X, Y and Z axes separately. This tutorial will use the uniform method.

// Find the scale ratio
float scale = (float)imageTrackable.width / node.texture.width;
// Apply it to your ARNode
[node scaleByUniform:scale];
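For the per-axis method, ARNode exposes a three-argument scaling call (the method name below matches the KudanAR API as we understand it; check it against your SDK version):

// Alternatively, scale each axis separately — here halving the node's
// height while leaving its width and depth unchanged
[node scaleByX:1.0 y:0.5 z:1.0];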

Content visibility

Each node has a boolean value which can be set to determine whether or not the node is displayed. This is useful when you have multiple nodes attached to a marker and you do not wish to display them all at once. This can be set using:

// Hide ARNode
[node setVisible:NO];