XR Hand Tracking: Game Concepts
Projects: Drag/Drop Boardgame, Fantasy Action
Role: Developer/Designer
Platforms: Meta Quest 2
Technologies: Unity
Featured APIs: Oculus Integration Toolkit (v49)
Obviously, the runaway leader for XR use is games. Since the majority of VR games are controller-based, I wanted to play around with a few concepts that exclusively use hand tracking. Here are the one-line goals of both concepts in this video:
Concept 1: “Create a modular board game where the pieces can be placed, selected, rotated and deleted - all using natural hand gestures”
Concept 2: “Create an action game where the user's individual fingers and gestures are the weapons that fight off an enemy”
Concept 1 is a physics-based board game concept. It heavily uses custom hand poses - allowing a user to pick up a game piece and have their fingers conform naturally to it. Rotation of objects is handled by gesture detection (in this case, swiping left and right to rotate), and hand pose detection allows for things like performing a “thumbs down” to delete. Hot Wheels-style cars can be dropped onto the roads, which use a simple node system for navigation (see the sketch below). A VR mode uses the Meta SDK’s new navigation methods to snap and move around the environment that was created.
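To give a rough idea of what that road node system might look like, here's a minimal C# sketch. The class name `RoadNodeFollower` and the Inspector-assigned waypoint list are illustrative assumptions, not the actual project code: each car just steers toward its current node and advances along the chain when it gets close.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative sketch: a toy car follows an ordered list of road nodes.
// Nodes are plain Transforms placed along each road segment.
public class RoadNodeFollower : MonoBehaviour
{
    [SerializeField] private List<Transform> nodes = new List<Transform>();
    [SerializeField] private float speed = 1.5f;          // metres per second
    [SerializeField] private float arriveRadius = 0.05f;  // how close counts as "reached"

    private int currentNode;

    private void Update()
    {
        if (nodes.Count == 0) return;

        Transform target = nodes[currentNode];
        Vector3 toTarget = target.position - transform.position;

        // Face the node and move toward it at a constant speed.
        if (toTarget.sqrMagnitude > 0.0001f)
            transform.rotation = Quaternion.LookRotation(toTarget.normalized, Vector3.up);
        transform.position = Vector3.MoveTowards(
            transform.position, target.position, speed * Time.deltaTime);

        // Advance to the next node (looping) once we're close enough.
        if (toTarget.magnitude <= arriveRadius)
            currentNode = (currentNode + 1) % nodes.Count;
    }
}
```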
Concept 2 is my stab at a fantasy-style action game, using a player’s hands to “harass” the enemy. Individual colliders are placed on the fingers and palms, allowing interactions like hitting, tripping and even picking up a creature. I tested two different views - a top-down “dollhouse” mode and a first-person mode (which easily lends itself to a Punch-Out style of game).
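For a sense of how those finger and palm colliders could be wired up, here's a hedged sketch. The `HandColliderRig` name and the Inspector-assigned bone Transforms are assumptions for illustration (the fingertip and palm bones would come from whatever the hand-tracking rig exposes), not the project's actual setup:

```csharp
using UnityEngine;

// Illustrative sketch: attach small kinematic physics colliders to tracked
// hand bones (fingertips and palm) so the creature can be hit, tripped,
// or scooped up by the player's bare hands.
public class HandColliderRig : MonoBehaviour
{
    [SerializeField] private Transform[] fingerTips;  // assigned from the hand-tracking skeleton
    [SerializeField] private Transform palm;
    [SerializeField] private float fingerRadius = 0.012f;
    [SerializeField] private float palmRadius = 0.04f;

    private void Start()
    {
        foreach (Transform tip in fingerTips)
            AddCollider(tip, fingerRadius);
        AddCollider(palm, palmRadius);
    }

    private static void AddCollider(Transform bone, float radius)
    {
        // A kinematic rigidbody plus collider lets the tracked hand push
        // dynamic objects around without physics pushing the hand back.
        SphereCollider col = bone.gameObject.AddComponent<SphereCollider>();
        col.radius = radius;

        Rigidbody rb = bone.gameObject.AddComponent<Rigidbody>();
        rb.isKinematic = true;
    }
}
```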
Tech Credits: This was all done in Unity (code is C#). Motion capture data is provided by Mixamo (Adobe), with some keyframe cleanup in Unity. And this is all running on a Meta Quest 2, using the Oculus Integration Toolkit v49.