3D Models

From Kudan AR Engine

Introduction

KudanAR is capable of loading complicated 3D models from a variety of different formats.

Supported 3D formats

KudanAR supports the following formats:

  • OBJ (*.obj)
  • COLLADA (*.dae)
  • FBX (*.fbx)

Most modelling software can work with these formats, but Maya is recommended for the best results with the framework. If you require a format other than those supported, try importing it into 3D modelling software and exporting it as one of the supported formats.

OBJ is incapable of representing scenegraphs and has no support for animations. FBX is the most robust format.

Supported model features

Some of the supported model formats can represent the much more complex 3D content used in offline rendering. Models for AR apps should therefore be limited to a subset of features.

Meshes

Meshes should be made of polygons. Non-polygon meshes will be converted by the importer. Each mesh should have a single material. The number of unique meshes in a scene is a primary factor in determining performance. Meshes exceeding 65535 vertices will be split into separate meshes. For good performance, try to keep the number of meshes rendered at any given time below 50.
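As a sketch of the 65535-vertex limit described above (the class and method names here are hypothetical, not part of the KudanAR API), the number of sub-meshes a large mesh will be split into is just a ceiling division:

```java
public class MeshBudget {
    // Maximum vertices the importer allows in a single mesh (per the limit above).
    static final int MAX_VERTICES_PER_MESH = 65535;

    // Number of sub-meshes a mesh with vertexCount vertices is split into
    // (ceiling division, so 65536 vertices already needs two meshes).
    static int subMeshCount(int vertexCount) {
        return (vertexCount + MAX_VERTICES_PER_MESH - 1) / MAX_VERTICES_PER_MESH;
    }
}
```

A detailed character of around 200,000 vertices would therefore count as four meshes towards the 50-mesh budget.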

Nodes

Complex scenegraphs are supported. It is helpful to have unique names for each node in the scene but this isn't compulsory.

Bones

Skinned meshes are supported. There is a hard limit of 64 bones per mesh on iOS and 32 on Android but you can exceed this by splitting the mesh into multiple meshes. The model converter supports meshes where any one vertex can be influenced by any number of bones, but only the 4 highest weighted bones will be used for rendering. You should ensure that your model looks correct under these circumstances, although it is uncommon for this to be problematic.
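To illustrate the four-influence limit described above, here is a minimal sketch (not KudanAR converter code; the names are hypothetical) of keeping only a vertex's four highest bone weights and renormalising them so they still sum to 1:

```java
import java.util.Arrays;

public class BoneWeights {
    // Keep only the 4 highest-weighted bone influences for a vertex and
    // renormalise them, mirroring what happens at render time.
    static double[] topFourNormalised(double[] weights) {
        double[] sorted = weights.clone();
        Arrays.sort(sorted);                          // ascending order
        int keep = Math.min(4, sorted.length);
        double[] top = new double[keep];
        double sum = 0;
        for (int i = 0; i < keep; i++) {
            top[i] = sorted[sorted.length - 1 - i];   // take the largest first
            sum += top[i];
        }
        for (int i = 0; i < keep; i++) {
            top[i] /= sum;                            // renormalise to sum to 1
        }
        return top;
    }
}
```

A vertex weighted across six bones loses its two smallest influences, which is why checking the model visually is worthwhile when many bones overlap.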

Blend shapes / morph targets

The renderer supports blend shapes (also known as morph targets). These allow you to deform a mesh and are particularly suited to our camera texture mapping feature.

Because rendering happens in real time on mobile devices, there are limitations on blend shapes. You can only mix between two different shapes at any one time. For example, you cannot mix between an angry face blend shape and a happy face blend shape, because there are three shapes involved: angry, happy, and the regular face. But a single shape with 10 in-between shapes is perfectly fine, because you're only ever mixing between two shapes at once.
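The two-shape restriction amounts to a simple linear interpolation between the vertex positions of the current shape and the target shape. A sketch (hypothetical names, not SDK code):

```java
public class BlendShapeMix {
    // Linear mix between two shapes' vertex positions, with influence t in [0, 1]:
    // t = 0 gives shapeA exactly, t = 1 gives shapeB exactly.
    static float[] mix(float[] shapeA, float[] shapeB, float t) {
        float[] out = new float[shapeA.length];
        for (int i = 0; i < shapeA.length; i++) {
            out[i] = shapeA[i] * (1 - t) + shapeB[i] * t;
        }
        return out;
    }
}
```

A chain of in-between shapes works because each step of the animation only ever evaluates one such mix.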

Animations

Animations are imported. The following properties are keyable:

  • Node transformation.
  • Node visibility.
  • Blend shape influence.

Currently animation takes are not supported so different animation cycles need to be split into separate files. Support will be added soon.

Animations don't need to be baked and only the channels that actually changed need to be keyed.

On Android, bone animations are supported, but only when using an ARLightMaterial. Other materials or textures will not correctly display a bone animation.

KudanAR's model format

The KudanAR framework only works with our ARModel format. This format is designed to produce a small file size and fast loading. ARModel files can be created using the Kudan AR Toolkit. (N.B. If you are using your model on Android, you will need to change the extension of the exported file from .armodel to .jet, or include .armodel in a noCompress block in your build.gradle.) Textures are bundled with the app separately.

ARModel files can be loaded via the ARModelImporter class:

Objective-C

ARModelImporter *importer = [[ARModelImporter alloc] initWithBundled:@"test.armodel"];
ARModelNode *myModelNode = [importer getNode];

Java

ARModelImporter importer = new ARModelImporter();
importer.loadFromAsset("test.jet");
ARModelNode myModelNode = (ARModelNode)importer.getNode();

The resulting ARModelNode can then be added to other nodes in order to be rendered.

Textures

Textures are limited to a maximum size of 2048x2048px and 8-bit color-depth. The dimensions must be a power of two.
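A quick validity check for these constraints might look like the following (a hypothetical helper, not part of the SDK):

```java
public class TextureCheck {
    // A positive integer is a power of two iff it has exactly one bit set.
    static boolean isPowerOfTwo(int n) {
        return n > 0 && (n & (n - 1)) == 0;
    }

    // Valid texture dimensions: power-of-two on both axes, at most 2048px each.
    static boolean isValidTextureSize(int width, int height) {
        return isPowerOfTwo(width) && isPowerOfTwo(height)
                && width <= 2048 && height <= 2048;
    }
}
```

So 1024x512 is fine, while 4096x2048 (too large) and 1000x1024 (not a power of two) are not.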

Android

On Android, the file extensions of model assets must be changed to ".jet" to avoid file compression.
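Alternatively, you can keep the .armodel extension and exclude it from compression in your build.gradle. A sketch, assuming the classic Android Gradle plugin's aaptOptions block:

```groovy
android {
    aaptOptions {
        // Leave .armodel assets uncompressed so the engine can read them directly
        noCompress 'armodel'
    }
}
```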


Materials

Light Materials

ARLightMaterial is an implementation of Phong shading. These materials are made up of a collection of ARMaterialProperties, which are currently only available in our native iOS SDK.


ARMaterialProperties

Each property uses either an ARVector value, which stores a uniform value across the material and is multiplied by a float factor, or an ARTexture, which applies a coloured texture or a lighting map to the model.

Light Materials contain the following properties on iOS:

colour - The base colour of the material; can be set using a vector representing a solid colour, or a texture.

blendTexture - An alternative texture that the colour texture can be blended with.

ambient - The ambient lighting applied to the material; set using a vector for uniform lighting or an ambient map passed as a texture.

diffuse - The diffuse lighting applied to the material; set using a vector for uniform diffuse lighting or a diffuse map passed as a texture.

reflection - Uses an ARMaterialPropertyReflection, which takes a cube map instead of an ARTexture. Controls the reflectivity of the material and what the object reflects.

normal - The normal map for the material, set using a texture.


Other Properties

shininess - A float representing how much specular light is reflected.

alpha - A float used to set the transparency of the material. The default is 1, which is opaque.

blendParameter - A float used to blend the colour texture with the blend texture. The default is 0, which shows only the colour texture; 1 shows only the blend texture.

refractivityRatio - A float representing the ratio of refractive indices between two mediums.

maxLights - The maximum number of lights that can affect the material.

perPixelShader - A boolean controlling whether lighting calculations occur per-vertex or per-pixel.
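The blendParameter behaviour amounts to a per-channel linear interpolation between the colour texture and the blend texture. A sketch (hypothetical helper, not SDK code):

```java
public class MaterialBlend {
    // Per-channel mix of the colour texture with the blend texture, controlled
    // by blendParameter t: 0 shows the colour texture, 1 shows the blend texture.
    static float blendChannel(float colour, float blend, float t) {
        return colour * (1 - t) + blend * t;
    }
}
```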


Java

ARLightMaterial material = new ARLightMaterial();
material.setTexture(texture2D);
...
material.setShininess(shininess);


Objective-C

ARLightMaterial *material = [ARLightMaterial new];
material.colour.texture = colourTexture;
...
material.shininess = shininess;

Colour Material

ARColourMaterial represents a material of a single solid colour.

Java

Vector3f rgb = new Vector3f(r,g,b);
ARColourMaterial colourMaterial = new ARColourMaterial(rgb);


Objective-C

ARColourMaterial *colourMaterial = [[ARColourMaterial alloc] initWithRed:r green:g blue:b];

Texture material

ARTextureMaterial creates a material from an image used as a texture map. Its transparency can be explicitly controlled, and it is the material behind ARImageNode. Lighting cannot be applied to an ARTextureMaterial.

Objective-C

ARTexture *texture = [[ARTexture alloc] initWithUIImage:image];
ARTextureMaterial *textureMaterial = [[ARTextureMaterial alloc] initWithTexture:texture];

Occlusion Material

AROcclusionMaterial allows you to occlude certain parts of the scene. This is useful for causing parts of the camera texture to occlude your CG content, for instance, creating a hole in the marker.

Java

AROcclusionMaterial occlusionMaterial = new AROcclusionMaterial();


Objective-C

AROcclusionMaterial *material = [[AROcclusionMaterial alloc] init];

Camera Texture Material

ARExtractedCameraTexture isn't a material type but rather a texture that can be used with other materials. It takes part of the live camera texture and makes it usable for rendering. This is useful for deforming the camera image and making it come alive, while remaining realistic because it perfectly matches the lighting and noise of the real-world image.

Java

ARExtractedCameraTexture cameraTexture = new ARExtractedCameraTexture(512,512);
ARNode sourceNode = new ARNode();
cameraTexture.setSrcNode(sourceNode);
cameraTexture.setSrcWidth(width);
cameraTexture.setSrcHeight(height);

Objective-C

ARExtractedCameraTexture *cameraTexture = [[ARExtractedCameraTexture alloc] initWithWidth:512 height:512];
ARNode *sourceNode = [ARNode nodeWithName:@"camera texture target"];
cameraTexture.srcNode = sourceNode;
cameraTexture.srcWidth = width;
cameraTexture.srcHeight = height;

Working with materials

Materials are created programmatically and must be assigned to meshes accordingly. Meshes in the scene are represented by ARMeshNode, which has a material property designating the material it is rendered with.

All mesh nodes in a 3D model are grouped together in the ARModelNode's meshNodes property. Here's an example of creating a texture material and assigning it to a mesh:

Objective-C

ARTextureMaterial *textureMaterial = [ARTextureMaterial new];
ARMeshNode *textureMeshNode = [modelNode findMeshNode:@"texture"];
textureMeshNode.material = textureMaterial;

Java

ARTexture2D texture2D = new ARTexture2D();
texture2D.loadFromAsset("texture.png");
ARTextureMaterial material = new ARTextureMaterial(texture2D);
for(ARMeshNode meshNode : importer.getMeshNodes()){
     meshNode.setMaterial(material);
}

Materials can be used by multiple meshes.

Life size models

In order to get your 3D model to appear life size whilst using Markerless, you will need to know the real life dimensions of your object.

The units in our engine are arbitrary, much like you would have within a 3D modelling application.

By default, KudanAR sets the camera height to 120 units. We generally think of this as 120cm, but you can interpret the unit however you like; you simply need to be consistent in what you choose the unit to be, and then consistently apply that scale to the 3D models displayed by KudanAR.

So let's go with 1 unit as 1cm; if you create a 3D box that is 120x120x120 units, it should appear as a cube that reaches the height of the camera. If you have a cube that is 5 units on each axis, but its real size is 130cm, the calculation you would have to do is:

Real Size / Unit Size = Uniform Scale Value

130/5 = 26
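The calculation above can be written as a tiny helper (hypothetical, not part of the KudanAR API):

```java
public class LifeSizeScale {
    // Uniform scale factor to apply to a model so it appears life size:
    // its real-world size divided by its size in scene units.
    static double uniformScale(double realSize, double unitSize) {
        return realSize / unitSize;
    }
}
```

Applying the resulting factor of 26 uniformly to the 5-unit cube makes it render at 130 units, i.e. 130cm under the 1 unit = 1cm convention.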

Optimisation

While the renderer is capable of handling highly detailed 3D models exceeding 1 million triangles, there are certain things you can do to improve performance, especially on lower power devices.

  • Reduce the number of triangles. 100,000 is a good target to aim for.
  • Reduce the number of different materials, combining any meshes with identical materials. Instead of having lots of different textures, compact them into an atlas and share it across meshes.
  • Bones are more efficient than blend shapes. If your deformation can be achieved with bones, that is the better approach.
  • Try using normal maps with much simpler geometry. This can significantly reduce the model file size as well.
  • Try to share materials. If the same image appears in multiple locations throughout the scene, that image should be loaded only once. ARMaterial instances can be shared by multiple meshes.

Future features

These features are in development and will be made public in the near future.

  • Custom shader support
  • Real-time shadows
  • Mixing of blend shapes
  • Post-processing effects