The ARIA-VALUSPA Platform in Unity3D

As part of the demonstrators built for ARIA-VALUSPA’s final review to the European Commission, we developed the capability to run the ARIA-VALUSPA Platform (AVP) in Unity3D. This allows you to create and place AVP characters in any game environment, and to use appearances and backgrounds from huge existing libraries. Here are two example videos of how you could use this:

Unity is an industry-standard engine for developing 3D environments, from Unity Technologies. The most recently released figures, from 2015, suggested 4.5 million registered developers with 1 million monthly active users. While there are other well-known engines, such as Unreal, Unity is considered the de facto standard, especially amongst academics and indie developers, due to its low cost and significant community support. It also accounts for some 47% of all mobile game development. As a cross-platform development environment for 3D work, Unity is currently peerless. It supports native builds for: iOS, Android, Tizen, Windows, Universal Windows Platform, Mac, Linux, WebGL, PlayStation 4, PlayStation Vita, Xbox One, Wii U, 3DS, Oculus Rift, Google Cardboard, Steam VR, PlayStation VR, Gear VR, Windows Mixed Reality, Daydream, Android TV, Samsung Smart TV, tvOS, Nintendo Switch, Fire OS, Facebook Gameroom, Apple ARKit, Google ARCore, and Vuforia. Unity allows developers to create AAA-standard environments and interactions: from large, complex 3D environments with casts of thousands, to the simplest 2D games. Unity includes built-in pipeline support for a range of interaction modalities, including AR and VR.


One relevant feature is the facility to create, animate, and give life to virtual characters; however, the realism of these characters is limited by the standard methods and tools currently available. GRETA, by allowing remote manipulation of character animation (including lip syncing), allows far more complex agents to be delivered into the Unity system. A key challenge was integrating GRETA’s animation system with Unity’s own built-in Mecanim system. Doing this allows characters to be animated ‘normally’ using well-established methods, and then to have these animations overridden by GRETA when necessary. The separation of FAP (facial animation parameters) and BAP (body animation parameters) makes this comparatively straightforward, allowing choices to be made about how to animate bodies while letting GRETA handle facial animation. For gestures, it is possible to choose between GRETA and Mecanim control, should developers wish to implement their own gestures based on incoming FML data. This is particularly useful when the character is ‘doing’ something and we still wish to interact with it. Currently the platform only handles one specific skeleton structure (those generated by Autodesk’s character creator), though the intention is to release an updated version that will allow bone mapping, to handle any “humanoid” bone setup. This is in line with Unity’s current Mecanim model.
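To make that division of labour concrete, below is a minimal sketch of how GRETA-driven facial values can override Mecanim on the Unity side. The script and its names (GretaFacialOverride, SetWeight, the blend slider) are hypothetical illustrations rather than the shipped AVP code: it assumes the character’s face is rigged with blend shapes, and simply re-applies GRETA’s values in LateUpdate, after the Animator has evaluated, so the body keeps its Mecanim animation while the face follows GRETA.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch: applies facial weights received from GRETA on top of a
// Mecanim-animated character. Body animation is left entirely to the Animator;
// the face is overridden in LateUpdate, which runs after the Animator has
// evaluated for the current frame.
public class GretaFacialOverride : MonoBehaviour
{
    public SkinnedMeshRenderer face;          // mesh carrying the facial blend shapes
    [Range(0f, 1f)] public float blend = 1f;  // 0 = pure Mecanim face, 1 = pure GRETA face

    // Latest facial values pushed in by the networking layer,
    // keyed by blend-shape name, with values normalised to 0..1.
    private readonly Dictionary<string, float> gretaWeights = new Dictionary<string, float>();

    // Called by whatever component receives FAP frames from GRETA.
    public void SetWeight(string blendShapeName, float normalisedValue)
    {
        gretaWeights[blendShapeName] = Mathf.Clamp01(normalisedValue);
    }

    void LateUpdate()
    {
        var mesh = face.sharedMesh;
        foreach (var kv in gretaWeights)
        {
            int index = mesh.GetBlendShapeIndex(kv.Key);
            if (index < 0) continue; // blend shape not present on this character

            // Unity blend-shape weights run 0..100; mix GRETA's value with
            // whatever Mecanim (or an idle face clip) left on the mesh.
            float current = face.GetBlendShapeWeight(index);
            float target = kv.Value * 100f;
            face.SetBlendShapeWeight(index, Mathf.Lerp(current, target, blend));
        }
    }
}
```

One advantage of this layering is that the same character prefab can still play ordinary Mecanim clips when GRETA is not connected; the override only takes effect when facial values are being pushed in.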

The pipeline is as follows: Unity picks up FML from whatever source (e.g. AVP). An instance of GRETA (either on the same machine or over a network) runs a series of Thrift servers. Unity passes the FML to GRETA and receives FAP/BAP and audio in return. These are then used to drive the character’s animation and audio.
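As a rough illustration of that round trip, the sketch below shows what the Unity side of such a loop might look like. The service interface shown (IGretaService with its RequestAnimation call, the GretaResult type, the 16 kHz mono PCM audio, and the 25 fps frame rate) is a hypothetical stand-in for GRETA’s actual Thrift-generated client, used only to make the data flow readable; it is not the API shipped with AVP.

```csharp
using System.Collections;
using UnityEngine;

// Hypothetical stand-in for the Thrift-generated GRETA client. The real
// interface is defined by GRETA's Thrift IDL; only the shape of the data
// flow (FML in, FAP/BAP + audio out) is taken from the pipeline above.
public interface IGretaService
{
    GretaResult RequestAnimation(string fml);
}

public class GretaResult
{
    public float[][] fapFrames;   // one facial-parameter frame per animation tick
    public float[][] bapFrames;   // one body-parameter frame per animation tick
    public float[] audioSamples;  // decoded speech audio (assumed 16 kHz mono PCM)
}

public class GretaPipeline : MonoBehaviour
{
    public GretaFacialOverride facialOverride; // from the earlier sketch
    public AudioSource voice;

    private IGretaService greta; // supplied by the networking layer at start-up

    // Called whenever a new FML document arrives (e.g. from AVP's dialogue manager).
    public void Speak(string fml)
    {
        StartCoroutine(RunTurn(fml));
    }

    private IEnumerator RunTurn(string fml)
    {
        // 1. Send the FML to GRETA and receive FAP/BAP and audio in return.
        GretaResult result = greta.RequestAnimation(fml);
        yield return null;

        // 2. Play the returned speech audio.
        var clip = AudioClip.Create("greta-speech", result.audioSamples.Length, 1, 16000, false);
        clip.SetData(result.audioSamples, 0);
        voice.PlayOneShot(clip);

        // 3. Step through the FAP frames at the animation rate (25 fps here),
        //    handing each one to the facial override; BAP frames would be
        //    applied to the skeleton in the same loop.
        foreach (var frame in result.fapFrames)
        {
            // ...map frame values onto blend-shape weights via facialOverride...
            yield return new WaitForSeconds(1f / 25f);
        }
    }
}
```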

Support for Unity3D is scheduled for the 3.0.1 release of AVP, due by the end of April 2018.
