Using the hand tracking API developed for the Presence Platform

We introduced hand tracking with our latest update for Myst on the Meta Quest Platform, titled "Hands & More." We're so excited to finally let people play Myst on Quest without physical controllers! In this post, we'll discuss the evolution and iteration of implementing hand tracking in Myst, and in particular, adding support for it in Unreal Engine 4.27.2.
Hannah Gamiel is the Director of Development at Cyan, the studio behind the original Myst games, and helped develop the new Myst (2020), which includes VR support. Originally coming from a purely technical background, she now helps lead production on all titles and runs Cyan's business and technical efforts. She's worked on titles like Myst (2020), The Witness, Braid, Anniversary Edition, Obduction, Firmament (coming soon!), and more.
Design stage and considerations
Navigation design for hand tracking
Imagine indicating where you'd like to go. You probably thought of pointing, right? That's why we chose to use a "pointing" method for movement in Myst.
When you're in teleport mode, pointing indicates where you want to go, and the teleport ring appears at your destination. When you release the point (by extending the rest of your fingers, or simply pulling your pointer finger back into your palm), the teleport is executed.
When you're in smooth motion mode, pointing with your designated free-motion hand (which can be configured in our controls settings, but is the left hand by default) will begin to smoothly move you in the direction you're pointing.
When testing movement by pointing, we found that hand tracking can sometimes be unreliable for the index and middle fingers when they're occluded by the rest of the hand. The system isn't sure whether those fingers are fully pointing or fully curled into the hand. We added a bit of a "fudge" factor to the code to allow for a more stable start and continuation of movement on this front, which we'll discuss later when we cover changes to the out-of-the-box hand tracking support in Unreal Engine.
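One common way to implement this kind of "fudge" factor is hysteresis: require a confidently extended finger to start the movement gesture, but a much weaker signal to keep it going. The sketch below is our own illustration of that idea (the class name and threshold values are assumptions, not Myst's actual code), assuming the tracking layer gives us a 0..1 extension value per finger.

```cpp
#include <cassert>

// Hypothetical sketch: hysteresis for the "pointing" gesture.
// `indexExtension` is assumed to be 0 (fully curled) .. 1 (fully extended),
// derived from tracked joint rotations. We require a HIGH threshold to
// *enter* the pointing state but only a LOW threshold to *stay* in it,
// so noisy readings near the boundary don't toggle movement on and off.
class PointGestureFilter {
public:
    // Returns the debounced pointing state for this frame.
    bool Update(float indexExtension) {
        if (bPointing) {
            // Already pointing: only drop out below the low threshold.
            if (indexExtension < kReleaseThreshold) bPointing = false;
        } else {
            // Not pointing yet: require a confidently extended finger.
            if (indexExtension > kEngageThreshold) bPointing = true;
        }
        return bPointing;
    }

private:
    static constexpr float kEngageThreshold  = 0.85f; // illustrative values
    static constexpr float kReleaseThreshold = 0.55f;
    bool bPointing = false;
};
```

With this shape, an ambiguous reading (say 0.7, where the tracker can't tell "pointing" from "curled") simply preserves whatever state the gesture was already in, which is exactly the stability the occlusion problem calls for.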
The "point" method doesn't work for every kind of navigation, though. For rotation, we initially tied the pointing gesture to wrist rotation: comparing the player's wrist against the camera's forward vector determined the direction of the turn (and how large the turn was). We tried this first because it seemed intuitive to keep the "pointing" gesture consistent across all movement modes.
Complications emerged in comfort testing. In playtests, most players pointed forward with their palms facing the ground, as one might do when pointing at something outside the game as well. Rotating your wrist left and right (around its up axis) while your palm faces the floor is a challenge, and you have a very limited range of motion, especially when turning away from your chest.
The problem is the same even if you ask a player to point at an object in front of them with their palm facing inward. You can bend your wrist toward your body quite a bit, but you won't get the same range of motion bending your wrist away from your body.
How did we solve this? We ended up assigning rotation to a "thumbs-up" gesture instead of a finger-pointing gesture.
Imagine giving a thumbs-up. Now turn your wrist left and right. Notice that while there isn't much range of motion, it's still fairly comfortable and consistent to indicate either "left" or "right" with your thumb in this pose.
This is what we settled on for turning in hand tracking mode. Although pointing with your thumb may not seem like the most intuitive way to turn, it ultimately proved to be the most comfortable and consistent way to do it.
With snap turning, rotating your wrist to the left or right from the thumb-up position triggers a single snap turn. You must then return your hand to the "middle" (straight-up) position to reset the snap, as well as wait out a very short cooldown, before another snap turn can begin.
With smooth turning, rotating your wrist while in the thumb-up position will smoothly rotate you left or right once you leave the "dead zone" that prevents a turn from happening until you cross its threshold.
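The snap-turn behavior described above can be sketched as a tiny state machine. This is an illustrative reconstruction under stated assumptions, not Myst's actual tuning: we assume the input is the signed yaw angle (in degrees) between the thumb-up hand and the camera's forward vector, and we omit the time-based cooldown for brevity.

```cpp
#include <cassert>
#include <cmath>

// Illustrative snap-turn state machine. A turn fires once when the wrist
// leaves the dead zone; the hand must then return near the "middle"
// position before another snap can fire. Threshold values are assumptions.
class SnapTurner {
public:
    // Returns -1 for a snap left, +1 for a snap right, 0 for no turn.
    int Update(float wristYawDeg) {
        if (!bArmed) {
            // Waiting for the hand to come back to the middle position.
            if (std::fabs(wristYawDeg) < kRearmZoneDeg) bArmed = true;
            return 0;
        }
        if (wristYawDeg > kDeadZoneDeg)  { bArmed = false; return +1; }
        if (wristYawDeg < -kDeadZoneDeg) { bArmed = false; return -1; }
        return 0;
    }

private:
    static constexpr float kDeadZoneDeg  = 25.0f; // must rotate past this to turn
    static constexpr float kRearmZoneDeg = 10.0f; // must return inside this to reset
    bool bArmed = true;
};
```

Smooth turning would use the same dead-zone check but, instead of firing a one-shot event, feed the yaw past the threshold into a per-frame rotation rate.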
Handling conflicts between movement and object interactions
Of course, a finger point is far too broad a gesture to assume it's only ever used for navigation. People will make the same pointing gesture to press buttons or interact with other things in the world, out of habit or their own expectations. It would be very annoying to walk up to (but not quite reach) a button, point your finger to press it, and then suddenly (and undesirably) lurch toward it in-game (or inadvertently start a teleport)!
The way we prevent movement while a player is interacting with something is by blocking any movement gesture from triggering when the "move" gesture occurs within a certain range of an interactable object. This range was tweaked several times to hit a "sweet spot" based on playtesting.
We found that this sweet spot is about 25 cm from the world-space position of the index finger bone. Myst is full of interactable objects of varying sizes (everything from tiny buttons to extra-large levers) placed in both open spaces and narrow corridors, so it took quite a bit of testing to settle on that number. We initially tried 60 cm (about two feet), but that prevented movement when players still needed to get closer to something. Similarly, anything less than 25 cm caused unwanted player movement when players were trying to grab or touch an object.
One of our main test areas was the generator room on Myst Island, where you make your way through a narrow entryway and are immediately greeted by a panel full of buttons. When the interaction-test radius was too large, players couldn't move through the entryway and toward the panel because it kept detecting buttons within range of their index finger.
However, 25 cm is simply what worked for Myst specifically. Other games may need to tweak this number if they implement something similar, keeping their own criteria in mind.
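The proximity check itself reduces to a simple distance test against nearby interactables. The sketch below is a minimal, engine-free illustration (names and the brute-force scan are our assumptions; a real UE implementation would more likely use an overlap query against the physics scene), keeping the 25 cm figure from the text.

```cpp
#include <cassert>
#include <cmath>

// Minimal sketch: suppress movement gestures when the index finger bone's
// world-space position is within 25 cm of any interactable object.
struct Vec3 { float x, y, z; };

static float Distance(const Vec3& a, const Vec3& b) {
    const float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// 25 cm is what worked for Myst's mix of tiny buttons and large levers;
// other games will likely need to re-tune this against their own content.
constexpr float kInteractionSuppressRadius = 0.25f; // meters

bool ShouldSuppressMovement(const Vec3& indexFingerPos,
                            const Vec3* interactables, int count) {
    for (int i = 0; i < count; ++i) {
        if (Distance(indexFingerPos, interactables[i]) < kInteractionSuppressRadius)
            return true; // too close to something grabbable/pressable
    }
    return false;
}
```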
Designing object interactions for hand tracking
At the moment, all the grabbable interactions in Myst are designed to work with hand tracking: turning valves, opening doors, pressing buttons, flipping book pages, and so on.
These interactions piggyback on what we had already built for Myst with Touch controllers. There, pressing the grip button automatically blends the mesh representation of your in-game hand into a "grabbed" pose, either curling your hand into a fist (if it's empty) or grasping an object. With hand tracking, we've added code that makes an educated guess as to when you've curled your fingers enough to "grab" something, and then kicks off the same logic just mentioned.
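That "educated guess" might look something like the following sketch (our own illustration; the thresholds and finger-count heuristic are assumptions, not Cyan's shipped code), again assuming a per-finger 0..1 curl value from the tracking layer.

```cpp
#include <cassert>

// Illustrative grab heuristic: if enough fingers are curled past a
// threshold, treat the hand as grabbing and kick off the same grab logic
// used for the Touch controllers' grip button.
// fingerCurls: {thumb, index, middle, ring, pinky}, each 0 (open) .. 1 (curled).
bool IsGrabbing(const float fingerCurls[5]) {
    constexpr float kCurlThreshold = 0.6f; // per-finger "curled enough"
    constexpr int   kFingersNeeded = 3;    // the thumb may stay outside a fist
    int curled = 0;
    for (int i = 0; i < 5; ++i) {
        if (fingerCurls[i] > kCurlThreshold) ++curled;
    }
    return curled >= kFingersNeeded;
}
```

Requiring only a majority of fingers (rather than all five) gives the same tolerance for occluded, poorly tracked fingers that the movement gesture needed.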
For example, when you use hand tracking and hover your hand over something grabbable, your hand turns orange (this is also exactly what happens when you're not using hand tracking in Myst VR). When you grab an interactable object by starting to curl your fingers into a fist, an orange ball replaces your hand mesh and represents where the hand is attached to the object.
The reason we took this approach, rather than creating custom poses that conform your hands to these objects, or letting your hands and fingers appear to physically interact with parts of them, is that we wanted the interactions to be on par with what we offer on the Touch controller side of things.
Pressing buttons works differently, however. No grabbing is needed, since buttons aren't grabbable objects; instead, we let you simply press a button using capsule collisions generated between each finger joint on the poseable hand mesh. You can do all kinds of weird and fun things because of this, like using just your pinky, or your ring-finger knuckle, to interact with every button in the game, if you really want to.
This implementation differs slightly from how the Touch controllers interact with in-game buttons, in that we typically expect players to use the grip button on their controller to put the hand into a "finger pointing" pose for a precise in-game button press on their end. With hand tracking there's obviously more flexibility in the pose you can make with your hand, so there are many more ways to press buttons with the same level of accuracy.
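Geometrically, a capsule between two finger joints is just a line segment with a radius, so a button press reduces to a point-to-segment distance test. The sketch below illustrates that math standalone (names and the ~1 cm radius are our assumptions; in UE this would be handled by capsule collision components and the physics engine rather than hand-rolled tests).

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>

// Illustrative per-joint capsule test: each finger segment between two
// joints is a capsule of small radius; a button counts as "pressed" when
// its contact point falls within that radius of the segment.
struct V3 { float x, y, z; };

static V3 Sub(V3 a, V3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static float Dot(V3 a, V3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Distance from point p to the segment [a, b].
static float PointSegmentDistance(V3 p, V3 a, V3 b) {
    V3 ab = Sub(b, a), ap = Sub(p, a);
    float t = std::clamp(Dot(ap, ab) / Dot(ab, ab), 0.0f, 1.0f);
    V3 closest = {a.x + t * ab.x, a.y + t * ab.y, a.z + t * ab.z};
    V3 d = Sub(p, closest);
    return std::sqrt(Dot(d, d));
}

// True if the finger-segment capsule overlaps the button's contact point.
bool FingerTouchesButton(V3 jointA, V3 jointB, V3 buttonPoint,
                         float capsuleRadius = 0.01f /* ~1 cm */) {
    return PointSegmentDistance(buttonPoint, jointA, jointB) <= capsuleRadius;
}
```

Because every joint pair on the hand gets the same test, any part of any finger (pinky tip, ring-finger knuckle, and so on) can trigger a press, which is exactly the flexibility described above.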
To interact with menus, we ended up using the same interaction model that Meta uses for the Quest platform: a two-finger pinch between the thumb and forefinger, on either hand. This can be used to open the in-game menu and to interact with items in it. It makes no sense to reinvent the wheel here when players are already taught to do this in OS-level menus the first time they enable hand tracking on Quest!
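At its simplest, a pinch can be detected as the thumb tip and index tip coming within a small distance of each other. The sketch below shows that reduction (our illustration only; the Quest runtime exposes its own per-finger pinch state, which a shipping game would normally read instead, and the ~1.5 cm threshold is an assumption).

```cpp
#include <cassert>
#include <cmath>

// Illustrative pinch check: thumb tip and index tip within ~1.5 cm.
struct Point { float x, y, z; };

bool IsPinching(const Point& thumbTip, const Point& indexTip,
                float pinchDistance = 0.015f /* meters */) {
    const float dx = thumbTip.x - indexTip.x;
    const float dy = thumbTip.y - indexTip.y;
    const float dz = thumbTip.z - indexTip.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz) < pinchDistance;
}
```

A real implementation would typically also debounce this (as with the pointing gesture earlier) so a pinch held at the threshold doesn't flicker between open and closed.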
Communicating all of this to the player
Because hand tracking isn't as common on Quest as Touch controllers, and because some people may be playing Myst for the first time (or even playing their first VR game!), we tried to be considerate in how all of this hand tracking information is communicated to the player. We made sure to include another version of the controller diagram that is specifically tailored to describing hand tracking interactions (shown when hand tracking is enabled in Myst), and to show the player specialized notifications that tell them exactly how to move around with their hands.
Additionally, we thought it was necessary to remind players how to have a smooth hand tracking experience once it's enabled. The player is notified in Myst's menu that hand tracking stability is much better if they make sure they're in a well-lit room and keep their hands within their field of view.
Meta also tells players that these conditions are essential to a well-tracked hand tracking environment, but we understand that some players may jump into a game without having first parsed Meta's notifications about this, so we chose to remind people in case they missed it.