VIRTUAnimator

ANIMATING IN 3D SPACE

One of the greatest benefits of VR is seeing a scene with real depth. While this greatly improves immersion in games, it also has practical uses elsewhere. One such application is animation.
By working in 3D space with tracked motion controllers, a model can be manipulated live, almost like posing a very cooperative person, allowing even non-animators to create animated scenes with ease.

DESIGN GOALS

The end goal for the app is to let a user animate an entire scene from start to finish within VR. A wide variety of objects will be animatable, including characters (humans, robots, animals, etc.), props (chairs, cars, weapons, etc.), scene objects (lights, cameras, particle emitters, etc.), and more. Using a per-object timeline, each object can be animated independently of the others, with keyframe events allowing one object to be ‘pinned’ to another, such as a gun to a hand or a hat to a head.
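
As a rough sketch of how a per-object timeline with pin events could be structured (these class and field names are illustrative assumptions, not the app's actual data model):

    from dataclasses import dataclass, field
    from typing import List, Optional, Tuple

    Vec3 = Tuple[float, float, float]
    Quat = Tuple[float, float, float, float]  # x, y, z, w

    @dataclass
    class Keyframe:
        time: float        # seconds into this object's own timeline
        position: Vec3
        rotation: Quat

    @dataclass
    class PinEvent:
        start: float       # while active, the object follows its parent object,
        end: float         # e.g. a gun pinned to "RightHand"
        parent: str

    @dataclass
    class ObjectTimeline:
        object_name: str
        keyframes: List[Keyframe] = field(default_factory=list)
        pins: List[PinEvent] = field(default_factory=list)

        def active_pin(self, time: float) -> Optional[PinEvent]:
            # A pinned object ignores its own keyframes and follows its parent instead.
            for pin in self.pins:
                if pin.start <= time <= pin.end:
                    return pin
            return None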

Support for importing custom content (via Steam Workshop) will let users share content with each other. Simple content such as props should ideally load automatically, while complex content such as characters may require external tools to set up. Cameras will have a ‘Director’ mode that captures position and rotation in real time rather than via keyframes, allowing footage to be captured more accurately and deliberately during animation playback.
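
A minimal sketch of the idea behind ‘Director’ mode: rather than authoring keyframes, the camera's position and rotation are sampled every frame during playback (the names below are hypothetical and for illustration only):

    from dataclasses import dataclass
    from typing import List, Tuple

    @dataclass
    class CameraPose:
        time: float
        position: Tuple[float, float, float]
        rotation: Tuple[float, float, float, float]

    class DirectorTrack:
        """Real-time camera capture: sample the pose every frame instead of keyframing."""

        def __init__(self) -> None:
            self.poses: List[CameraPose] = []

        def record(self, time: float, position, rotation) -> None:
            # Called once per rendered frame while the animation plays back.
            self.poses.append(CameraPose(time, position, rotation))

        def pose_at(self, time: float) -> CameraPose:
            # Replay the capture: return the last recorded pose at or before `time`.
            earlier = [p for p in self.poses if p.time <= time]
            return earlier[-1] if earlier else self.poses[0]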

Animations will be exportable as either an animation file or a video file. Animation files can be shared with others, allowing them to re-capture or edit scenes as they’d like. Video files will be exported as either an image sequence or an MP4 file, with footage captured from either the headset or animated in-app ‘cameras’.
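
For the image-sequence route, stitching numbered frames into an MP4 is typically a small ffmpeg job; the sketch below is an assumption about how that could be wired up (the paths, frame rate, and use of ffmpeg are placeholders, not necessarily how the app will export):

    import subprocess

    def export_mp4(frame_pattern: str = "capture/frame_%04d.png",
                   output: str = "capture/out.mp4",
                   fps: int = 60) -> None:
        # Stitch a numbered PNG sequence into an H.264 MP4 via ffmpeg.
        subprocess.run([
            "ffmpeg",
            "-framerate", str(fps),
            "-i", frame_pattern,
            "-c:v", "libx264",
            "-pix_fmt", "yuv420p",   # widely compatible pixel format
            output,
        ], check=True)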

THE APP SO FAR

Currently, the app uses an IK-based model, so each joint does not need to be rotated manually. IK targets are shown as semi-transparent red circles which the model’s limbs follow. This allows for easy animation through traditional controls (first-person camera, click and drag, mobile touch screen), but is limited to models that have been explicitly set up for it.
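
The post doesn’t specify which IK solver is used, but as an illustration of the general idea, a minimal FABRIK-style chain solver that pulls a limb’s joints toward a target point (such as one of the red circles) could look like this (a sketch only, not the app’s implementation):

    import numpy as np

    def fabrik(joints, target, iterations=10, tolerance=1e-3):
        """Pull a joint chain toward a target while preserving bone lengths (FABRIK)."""
        joints = [np.asarray(j, dtype=float) for j in joints]
        lengths = [np.linalg.norm(joints[i + 1] - joints[i]) for i in range(len(joints) - 1)]
        root = joints[0].copy()
        target = np.asarray(target, dtype=float)

        for _ in range(iterations):
            # Backward pass: pin the tip to the target and work back toward the root.
            joints[-1] = target.copy()
            for i in range(len(joints) - 2, -1, -1):
                direction = joints[i] - joints[i + 1]
                direction /= np.linalg.norm(direction)
                joints[i] = joints[i + 1] + direction * lengths[i]

            # Forward pass: pin the root back in place and work out toward the tip.
            joints[0] = root.copy()
            for i in range(len(joints) - 1):
                direction = joints[i + 1] - joints[i]
                direction /= np.linalg.norm(direction)
                joints[i + 1] = joints[i] + direction * lengths[i]

            if np.linalg.norm(joints[-1] - target) < tolerance:
                break
        return joints
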
Per-controller control has been implemented, allowing objects to be manipulated independently with either hand. The ‘selected’ object moves and rotates directly relative to the controller grabbing it. This now works generically with props and other objects, keeping behavior consistent across object types.
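
A rough sketch of that grab behavior, assuming object and controller transforms are available as 4x4 world matrices (the function names and accessors are made up for illustration): the object’s offset from the controller is stored once at grab time, then reapplied every frame so it follows the controller exactly.

    import numpy as np

    def on_grab(controller_world: np.ndarray, object_world: np.ndarray) -> np.ndarray:
        # On grab: store the object's transform relative to the controller (4x4 matrices).
        return np.linalg.inv(controller_world) @ object_world

    def while_grabbed(controller_world: np.ndarray, grab_offset: np.ndarray) -> np.ndarray:
        # Every frame: reapply the stored offset so the object follows the controller 1:1.
        return controller_world @ grab_offset

    # Usage per hand, so each controller can grab its own object independently:
    #   offset = on_grab(controller.world_matrix(), obj.world_matrix())         # on grip press
    #   obj.set_world_matrix(while_grabbed(controller.world_matrix(), offset))  # each frame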

Progress is updated on the main blog, so check it out if you’re interested!

CONCEPT DRAFTS

[Concept draft images: ControllerGrab01, SpawnController, KeyframeController, ObjectProperties]