
Becoming a Butterfly

Step III: VR Integration

Warning: this part gets a bit technical!


Once the materials and the model were ready (or at least ready enough to test), we brought everything into Unity to put it all together and add the VR functionality needed to control the avatar with a combination of our headset and hand controllers. To get started, we used the Oculus OVRCameraRig prefab, with the intention of adding Steam/OpenVR support later on.

The first step involved some initial calibration of the avatar: orienting and scaling it so that it would “fit” players of different heights. In other words, when a person starts playing the game, the player’s head (i.e. the active camera) should sit where we’d expect the monarch avatar’s head to be, and, when landed, the avatar’s virtual feet should touch the same ground plane as the player’s actual feet, which helps increase the feeling of immersion.

Avatar mockup showing proper “fit”

In the full app/demo, player height will be measured in an opening “calibration” scene, and the avatar will be scaled appropriately via scripting when the game begins, similar to what you see in many other VR apps. But to make prototyping easy (and avoid loading scenes/transitions for the time being), we’re just entering player height manually for me and the other early testers, along with arm length, which we need in order to determine how far the avatar’s wings should extend for the flight mechanics to work properly.
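To give a rough idea of what that scaling step could look like in code, here’s a minimal sketch. The component and field names (like designHeadHeight) are placeholders for illustration, not our actual scripts:

```csharp
using UnityEngine;

// Minimal sketch of height calibration: uniformly scale the avatar so its
// head lines up with the player's tracked head height. All names here are
// illustrative placeholders.
public class AvatarHeightCalibration : MonoBehaviour
{
    [SerializeField] Transform avatarRoot;          // root of the monarch avatar
    [SerializeField] Transform centerEyeAnchor;     // head/camera anchor from the OVRCameraRig
    [SerializeField] float designHeadHeight = 1.7f; // head height the avatar was modeled for, in meters

    public void Calibrate()
    {
        // Tracked head height above the floor (assumes a floor-level tracking origin).
        float playerHeadHeight = centerEyeAnchor.localPosition.y;

        // Uniform scale so the avatar's head sits at the player's head while
        // its feet stay on the same ground plane as the player's.
        avatarRoot.localScale = Vector3.one * (playerHeadHeight / designHeadHeight);
    }
}
```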

Integrating some basic controller movement was the next step: the player can move the avatar on the ground plane with the left thumbpad/joystick and rotate with the opposite one. We’ve enabled OVR’s positional tracking, so players can also move the avatar by physically moving/rotating around their play area, but the thumbpads make basic movement quite a bit easier for players without large play areas.
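Here’s roughly what that locomotion scheme looks like with OVRInput (speeds and field names are placeholders; by default the primary thumbstick maps to the left controller):

```csharp
using UnityEngine;

// Rough sketch of the thumbstick scheme described above: left stick moves
// the avatar on the ground plane, right stick rotates the rig. Speeds and
// field names are placeholders.
public class GroundLocomotion : MonoBehaviour
{
    [SerializeField] Transform rig;          // OVRCameraRig root
    [SerializeField] float moveSpeed = 1.5f; // meters per second
    [SerializeField] float turnSpeed = 60f;  // degrees per second

    void Update()
    {
        // Left thumbstick: translate on the XZ (ground) plane.
        Vector2 move = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick);
        rig.Translate(new Vector3(move.x, 0f, move.y) * moveSpeed * Time.deltaTime, Space.Self);

        // Right thumbstick: rotate around the vertical (Y) axis.
        Vector2 turn = OVRInput.Get(OVRInput.Axis2D.SecondaryThumbstick);
        rig.Rotate(0f, turn.x * turnSpeed * Time.deltaTime, 0f);
    }
}
```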

One trick here is that the avatar body and the camera need to be moved and rotated independently. We obviously want the avatar body to follow the head/camera, but we want to make sure we can control precisely how it does this. So, for example, if the player looks down in the game (rotating the camera around the red X axis in Unity), the avatar body shouldn’t simply apply the same rotation. That would result in the avatar body rotating into the air behind the camera rig in an unrealistic way. 

The Unity transform, storing an object’s position, rotation and scale.

Instead, we set this up so that the avatar body follows the camera, using the avatar’s pivot point that we added in Blender. Rotation is trickier. Basically, we want the avatar to orient toward the same forward vector as the player, but just as you can turn your head without turning your body, we want the player to be able to look around in the virtual space without necessarily turning the avatar body. As noted above, we also want to prevent the avatar body from rotating in an unrealistic way. Finally, we want the player to be able to select whether they’d prefer to use the orientation/forward vector of the camera or one of the hand controllers to control the rotation of the body. 

For this prototype, we settled on using only the rotation around the green Y axis from the selected source (controller or headset), while retaining the avatar’s rotation around the X and Z axes. Basically, this means that the avatar can rotate around in a circle, keeping the body parallel to the ground plane, but it can’t rotate forward/backward or to the sides. This works well enough for the purposes of this early prototype, but it’s worth noting that it has a significant limitation: it only works on a ground layer that is relatively flat. If we put our avatar on a steep slope it will break, as the avatar won’t be able to rotate its body to keep it parallel to the sloped surface. An issue for another day. 
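In code, the follow-and-yaw-only logic boils down to something like this sketch (field names and the pivot offset are placeholders, not our actual scripts):

```csharp
using UnityEngine;

// Sketch of the body-follow logic: the avatar body tracks the selected
// source (headset or a hand controller) in position, but only copies its
// rotation around the Y axis, staying parallel to the ground plane.
public class BodyFollow : MonoBehaviour
{
    [SerializeField] Transform source;    // centerEyeAnchor or a hand anchor, player-selectable
    [SerializeField] Transform body;      // avatar body, using the pivot set up in Blender
    [SerializeField] Vector3 pivotOffset; // source-to-pivot offset, tuned per avatar

    void LateUpdate()
    {
        // Take only the yaw (green Y axis) from the source.
        float yaw = source.eulerAngles.y;
        Quaternion yawOnly = Quaternion.Euler(0f, yaw, 0f);

        // Follow the source's position via the pivot offset, rotated by yaw
        // only, so looking down doesn't drag the body into the air.
        body.position = source.position + yawOnly * pivotOffset;

        // Keep the body's own X and Z rotation; replace just the Y rotation.
        Vector3 euler = body.eulerAngles;
        body.rotation = Quaternion.Euler(euler.x, yaw, euler.z);
    }
}
```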

Once the avatar could move around, it was time to start on animations. For this project, this includes both “FK” and “IK” animations, or forward kinematics and inverse kinematics. FK animations are the standard, and work more or less the same in Unity as they do in claymation. If you want a humanoid clay avatar to raise its right hand in a stop motion video, you take a series of images beginning with the base/resting pose. For each subsequent image, you move the right arm and hand just slightly until it’s reached the desired position. Then play the images in sequence to see a video of the clay avatar raising its right hand. 

This works much the same with digital avatars in Unity: you begin with a starting pose, and record the movement and/or rotation of the avatar’s right arm bones over a set number of frames until the right hand has reached the desired end position. The total number of frames and the playback speed (frames/sec) then determine the length of the animation clip which can be played during gameplay and even blended with other animation clips. So, for example, a character can walk and raise its right arm at the same time. 
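In Unity, playing and blending FK clips like this typically goes through an Animator. Here’s an illustrative sketch; the state names (“Walk”, “RaiseArm”) are hypothetical, standing in for states in an Animator Controller:

```csharp
using UnityEngine;

// Illustrative sketch of playing two FK clips at once via Animator layers.
// The state and layer setup are hypothetical, not our actual controller.
public class FkBlendExample : MonoBehaviour
{
    [SerializeField] Animator animator;

    void Start()
    {
        // Play the walk cycle on the base layer (layer 0)...
        animator.CrossFade("Walk", 0.2f, 0);

        // ...and blend an arm-raise clip on a second layer, so the
        // character walks and raises its right arm at the same time.
        animator.SetLayerWeight(1, 1f);
        animator.CrossFade("RaiseArm", 0.2f, 1);
    }
}
```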

Some nice things about these animations are that they’re versatile, relatively easy to set up, and computationally cheap (i.e. very low impact on performance). For this prototype, we’ve set up FK animations for the proboscis, the wings (when the avatar is grounded), and the default leg positions, all of which you can see in the clip at the beginning. I’ll also add FK animations to the antennae once we start working on approximating the perceptual world of the butterfly (with the antennae allowing the player to detect airborne chemical signatures like pheromones or the smell of nectar).

But they don’t work for everything. For this prototype, we want the player to control the wings with the controllers: when active, the virtual right wing should follow the player’s right hand controller, so that if the player makes a flapping motion (moving the right arm up and down parallel to the body), the right wing flaps in the game along the same trajectory. This is a job for IK animations, which take a starting position and a desired target (end position) and automatically calculate the best way to reorient the avatar’s bones to reach it.

For this prototype, we’re using the Final IK toolkit to handle this, which we highly recommend: it performs well and makes the process about as easy as it could be. We have IK animations set up to control the wings so that, when active, the target for the right wing is a child of the right hand controller, extended just beyond where the wingtip sits when the player’s arm is stretched straight out to the side (otherwise the wing would curl in on itself). We’re also using a physics-simulation package, the Boing Kit, to make the avatar wings move more like real butterfly wings: more like a flexible textile than a rigid mesh, as you can see in the gif below.

Avatar wing using Final IK and Boing Kit
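For the curious, the wing-target setup described above reduces to something like this sketch. The offset value and the choice of Final IK’s CCD solver are assumptions for illustration; only the parent-the-target-to-the-controller idea is the actual approach:

```csharp
using UnityEngine;
using RootMotion.FinalIK; // Final IK's namespace; CCDIK is one of its solvers

// Sketch of the wing IK target setup: the target lives as a child of the
// right hand anchor, pushed outward past the wingtip so the wing doesn't
// curl in on itself. The offset and solver choice are assumptions.
public class WingIkSetup : MonoBehaviour
{
    [SerializeField] Transform rightHandAnchor; // from the OVRCameraRig
    [SerializeField] CCDIK rightWingIk;         // IK chain running along the wing bones
    [SerializeField] float tipOffset = 0.3f;    // extends just beyond the wingtip, in meters

    void Start()
    {
        // Parent the target to the controller so flapping the arm drags the wing.
        Transform target = new GameObject("RightWingTarget").transform;
        target.SetParent(rightHandAnchor, false);
        target.localPosition = Vector3.right * tipOffset;

        rightWingIk.solver.target = target;
    }
}
```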

Finally, we’re also using IK to animate the avatar’s four main legs (yes, butterflies are insects and have six legs, but in adult monarchs the two front legs are smaller and generally curled up close to the body), which gives us more natural-looking walking/turning behavior.

That’s all for now—this post turned out quite a bit longer than expected. Thanks for making it all the way through! 
