ARP

ARP Toolkit

ARP Toolkit Overview

The ARP Toolkit is an extensible, integrated suite of tools for creating 3D avatars that support body and face animation in real-time applications. The requirement for the toolkit arose from UEA projects such as synthesised deaf signing and video face tracking, where extra parameters and functionality were needed that could not be implemented in existing avatar creation software. It also overcomes many of the problems encountered when exporting complete avatars from these third-party applications, such as rewriting plug-in exporters when new versions of the applications are released, errors in the vertex blending weights that require manual correction, and conversion of non-standard skeleton hierarchies.

ARP Toolkit Outputs

The output of the toolkit is a single binary file (ARP) containing a textured avatar mesh linked to a skeleton that can be animated in real time with frames of bone rotation data. Each frame can also contain morph target data for face animation. The output file is highly optimised for rendering, for which a C++ static library using OpenGL is provided. The library has interfaces for loading, animating, and rendering the avatar and is intended for integration into other applications, as it provides no rendering window of its own. A Win32 rendering DLL built on this library is also available; it creates one or more OpenGL Win32 windows, each containing a single avatar, and is useful for applications that do not have their own OpenGL window. These libraries are also being ported to Java and to the Mac. Avatars can also be exported in the Alias FBX format for import into other 3D applications such as Discreet 3ds Max, Alias Maya, and Alias MotionBuilder. Future work includes import and export in the Collada and Shockwave W3D formats.
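
As a rough illustration of how an application might drive the load, animate, and render interfaces described above, the C++ sketch below uses placeholder names (ArpAvatar, loadFromFile, applyFrame, render and the frame structures); these are not the library's actual API, and only the intended flow of per-frame bone rotation and morph data is shown.

```cpp
#include <string>
#include <vector>

// Placeholder types standing in for the library's real ones (assumed).
struct BoneRotation { float x, y, z, w; };      // one quaternion per bone

struct AnimationFrame {
    std::vector<BoneRotation> boneRotations;    // frame of bone rotation data
    std::vector<float>        morphWeights;     // optional face morph-target weights
};

class ArpAvatar {
public:
    // Stubs: in practice these would be supplied by the ARP static library.
    bool loadFromFile(const std::string& path) { (void)path; return true; }
    void applyFrame(const AnimationFrame& f)   { (void)f; }
    void render() const                        {}
};

// Drive one animation: apply each frame's bone and morph data, then draw.
// The caller is assumed to own the OpenGL context, since the library
// deliberately provides no rendering window of its own.
void playAnimation(ArpAvatar& avatar, const std::vector<AnimationFrame>& frames) {
    for (const AnimationFrame& frame : frames) {
        avatar.applyFrame(frame);
        avatar.render();
    }
}
```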

The Toolkit consists of a number of plug-ins, each targeted at a particular area of avatar development, with a user interface that displays only those controls required for that particular task.

Plug-ins

Skeleton toolkit

This provides tools for creating skeletons of any configuration and complexity for fitting to a particular mesh. Each bone in the skeleton has a configurable envelope that will capture vertices local to the bone for deforming the mesh. Where envelopes overlap at joints, vertex weightings are blended between bones to provide good deformation. Skeletons can be saved and reused on other avatar meshes.
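
As an illustration of the envelope idea (not the toolkit's own implementation), the following C++ sketch assumes each bone's envelope is a simple capsule of a given radius around the bone segment: vertices inside a single envelope are bound to that bone, and where envelopes overlap the weights are blended in proportion to proximity and normalised.

```cpp
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

// Distance from point p to the bone segment [a, b].
static float distanceToSegment(const Vec3& p, const Vec3& a, const Vec3& b) {
    Vec3 ab{b.x - a.x, b.y - a.y, b.z - a.z};
    Vec3 ap{p.x - a.x, p.y - a.y, p.z - a.z};
    float len2 = ab.x * ab.x + ab.y * ab.y + ab.z * ab.z;
    float t = len2 > 0.0f ? (ap.x * ab.x + ap.y * ab.y + ap.z * ab.z) / len2 : 0.0f;
    t = t < 0.0f ? 0.0f : (t > 1.0f ? 1.0f : t);
    Vec3 c{a.x + t * ab.x - p.x, a.y + t * ab.y - p.y, a.z + t * ab.z - p.z};
    return std::sqrt(c.x * c.x + c.y * c.y + c.z * c.z);
}

struct Bone { Vec3 head, tail; float envelopeRadius; };

// Returns one weight per bone for a single vertex, normalised to sum to 1
// whenever the vertex falls inside at least one envelope.
std::vector<float> captureWeights(const Vec3& vertex, const std::vector<Bone>& bones) {
    std::vector<float> weights(bones.size(), 0.0f);
    float total = 0.0f;
    for (size_t i = 0; i < bones.size(); ++i) {
        float d = distanceToSegment(vertex, bones[i].head, bones[i].tail);
        if (d < bones[i].envelopeRadius) {
            weights[i] = 1.0f - d / bones[i].envelopeRadius; // closer = stronger
            total += weights[i];
        }
    }
    if (total > 0.0f)
        for (float& w : weights) w /= total; // blend where envelopes overlap
    return weights;
}
```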

Vertex weights toolkit

Vertex weightings for deformation can be modified using a simple 'mouse painting' technique. The toolkit also provides automatic methods for transferring weightings from an existing mesh to a lower-resolution mesh, such as the level-of-detail meshes for a given avatar.
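
One simple way to perform such an automatic transfer, shown here purely as a sketch since the toolkit's actual method is not described, is to give each vertex of the lower-resolution mesh the weights of its nearest neighbour in the already-weighted source mesh.

```cpp
#include <limits>
#include <vector>

struct Vec3 { float x, y, z; };

static float dist2(const Vec3& a, const Vec3& b) {
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

// srcWeights[i] holds the per-bone weights of source vertex i; each vertex of
// the destination mesh copies the weights of its nearest source vertex.
std::vector<std::vector<float>> transferWeights(
        const std::vector<Vec3>& srcVerts,
        const std::vector<std::vector<float>>& srcWeights,
        const std::vector<Vec3>& dstVerts) {
    std::vector<std::vector<float>> dstWeights(dstVerts.size());
    for (size_t d = 0; d < dstVerts.size(); ++d) {
        size_t best = 0;
        float bestDist = std::numeric_limits<float>::max();
        for (size_t s = 0; s < srcVerts.size(); ++s) {
            float d2 = dist2(dstVerts[d], srcVerts[s]);
            if (d2 < bestDist) { bestDist = d2; best = s; }
        }
        dstWeights[d] = srcWeights[best]; // copy nearest neighbour's weights
    }
    return dstWeights;
}
```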

Morph toolkit

Facial animation for speech and deaf signing is achieved through morph targets: selections of vertices on the face that describe a controlled deformation of part of the face. This toolkit provides simple controls for creating these, and for combining primitive deformations to make more complex facial movements. The morph targets can be blended and controlled by software in real time to produce facial animation.
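
The blending itself can be pictured as adding weighted per-vertex offsets to the neutral face. The C++ sketch below illustrates this with placeholder structures (MorphTarget, applyMorphs) that are not the toolkit's actual API.

```cpp
#include <vector>

struct Vec3 { float x, y, z; };

struct MorphTarget {
    std::vector<int>  vertexIndices; // only the affected face vertices
    std::vector<Vec3> offsets;       // displacement per affected vertex
};

// Blend the active morph targets onto the neutral face mesh.
void applyMorphs(const std::vector<Vec3>& neutral,
                 const std::vector<MorphTarget>& targets,
                 const std::vector<float>& weights,   // one weight per target
                 std::vector<Vec3>& out) {
    out = neutral;                                    // start from the neutral face
    for (size_t t = 0; t < targets.size(); ++t) {
        if (weights[t] == 0.0f) continue;
        const MorphTarget& m = targets[t];
        for (size_t k = 0; k < m.vertexIndices.size(); ++k) {
            Vec3& v = out[m.vertexIndices[k]];
            v.x += weights[t] * m.offsets[k].x;       // targets blend additively
            v.y += weights[t] * m.offsets[k].y;
            v.z += weights[t] * m.offsets[k].z;
        }
    }
}
```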

Feature point toolkit

For deaf signing, about 380 points must be defined on the surface of each avatar mesh, providing targets for the synthetic animation engine, AnimGen. This is achieved by an automatic process that discovers the points using ray-casting techniques from the bones of the skeleton.
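
As a sketch of the underlying idea (the toolkit's exact rays and point definitions are not shown here), a ray cast from a point on a bone in a chosen direction can be intersected with the mesh triangles, and the nearest hit taken as the surface feature point.

```cpp
#include <cmath>
#include <optional>
#include <vector>

struct Vec3 { float x, y, z; };

static Vec3 sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static Vec3 cross(Vec3 a, Vec3 b) {
    return {a.y * b.z - a.z * b.y, a.z * b.x - a.x * b.z, a.x * b.y - a.y * b.x};
}
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Moller-Trumbore ray/triangle intersection; returns distance along the ray.
static std::optional<float> rayTriangle(Vec3 orig, Vec3 dir, Vec3 v0, Vec3 v1, Vec3 v2) {
    const float eps = 1e-6f;
    Vec3 e1 = sub(v1, v0), e2 = sub(v2, v0);
    Vec3 h = cross(dir, e2);
    float a = dot(e1, h);
    if (std::fabs(a) < eps) return std::nullopt;       // ray parallel to triangle
    float f = 1.0f / a;
    Vec3 s = sub(orig, v0);
    float u = f * dot(s, h);
    if (u < 0.0f || u > 1.0f) return std::nullopt;
    Vec3 q = cross(s, e1);
    float v = f * dot(dir, q);
    if (v < 0.0f || u + v > 1.0f) return std::nullopt;
    float t = f * dot(e2, q);
    return t > eps ? std::optional<float>(t) : std::nullopt;
}

// Cast from a bone position along 'dir' and return the closest mesh hit.
std::optional<Vec3> findFeaturePoint(Vec3 bonePos, Vec3 dir,
                                     const std::vector<Vec3>& verts,
                                     const std::vector<int>& triIndices) {
    std::optional<float> best;
    for (size_t i = 0; i + 2 < triIndices.size(); i += 3) {
        auto t = rayTriangle(bonePos, dir, verts[triIndices[i]],
                             verts[triIndices[i + 1]], verts[triIndices[i + 2]]);
        if (t && (!best || *t < *best)) best = t;
    }
    if (!best) return std::nullopt;
    return Vec3{bonePos.x + *best * dir.x, bonePos.y + *best * dir.y,
                bonePos.z + *best * dir.z};
}
```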

AnimGen toolkit

AnimGen, the synthetic signing engine for deaf signing, requires a number of configuration files for each avatar containing skeleton-specific data to achieve correct arm movement and hand shapes. This toolkit provides means for creating these files and for rapid adjustment of hand shapes in real time.

Level of detail toolkit

This is used to assemble meshes of different complexity for the same avatar into a single avatar file, and to set the camera distances at which each mesh is used.
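
A minimal sketch of the run-time side of this, assuming each mesh is stored with the camera distance at which it is still used (field names are illustrative, not the avatar file's actual layout):

```cpp
#include <vector>

struct LodLevel {
    int   meshIndex;      // which mesh in the avatar file
    float switchDistance; // use this mesh while camera distance <= switchDistance
};

// Levels are assumed sorted by increasing switchDistance (most detailed first);
// the renderer picks the most detailed mesh whose threshold has not been exceeded.
int selectLod(const std::vector<LodLevel>& levels, float cameraDistance) {
    if (levels.empty()) return -1;
    for (const LodLevel& level : levels)
        if (cameraDistance <= level.switchDistance)
            return level.meshIndex;
    return levels.back().meshIndex; // beyond the last threshold: coarsest mesh
}
```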

Materials toolkit

Material properties and textures can be assigned to avatars with this toolkit.
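
For illustration, the sketch below shows how such material properties and textures map onto classic OpenGL state, OpenGL being the rendering API used by the library; the Material structure itself is illustrative rather than the toolkit's own data layout.

```cpp
#include <GL/gl.h>

struct Material {
    GLfloat ambient[4];
    GLfloat diffuse[4];
    GLfloat specular[4];
    GLfloat shininess;
    GLuint  textureId;   // 0 if the material has no texture
};

// Apply a material's colour properties and texture before drawing its faces.
void applyMaterial(const Material& m) {
    glMaterialfv(GL_FRONT, GL_AMBIENT,  m.ambient);
    glMaterialfv(GL_FRONT, GL_DIFFUSE,  m.diffuse);
    glMaterialfv(GL_FRONT, GL_SPECULAR, m.specular);
    glMaterialf(GL_FRONT, GL_SHININESS, m.shininess);
    if (m.textureId != 0) {
        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, m.textureId);  // texture assigned to this material
    } else {
        glDisable(GL_TEXTURE_2D);
    }
}
```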

Video toolkit

QuickTime movies of animations can be created for demonstration purposes, or with chroma-key backgrounds for insertion into other movies.

Billboard toolkit

Shockwave toolkit

This toolkit processes avatars in preparation for use in the Shockwave player, including methods for real-time morphing for facial animation, which is not natively supported in the Shockwave format. This capability has been developed at UEA.

