XR Hand Tracking: Retail Concepts
Project: Shoe purchase previewer
Role: Developer/Designer
Platforms: Meta Quest 2
Technologies: Unity, macOS
Featured APIs: Oculus Integration Toolkit (v49), Apple Object Capture
Coming from years in the ad agency space, I often think about XR for retail, events, and various out-of-home experiences. Dedicated hardware in those spaces is tough, so I focused on bringing real-world items home to a user through photogrammetry, thinking of ways they could experience the look and feel of an item without actually being able to touch it.
As a series of tech prototypes for 2023, I decided to do a deep dive on XR headset hand tracking. Here's the one-line goal of the retail-focused concept:
"Using natural gestures, let a user interact with photorealistic looking shoes and showcase what these shoes might actually look like in a variety of athletic situations."
This concept involves athletic shoes: some very clean Nike shoes (with scans provided by Apple) and some less-than-clean used shoes (gratefully provided by my wife). The photogrammetry is done about as cheaply as you can: a simple backdrop, several soft lights to minimize shadows, and a macOS command-line tool written in Swift. Luckily, shoes are one of those items that lend themselves very well to photogrammetry.
To visualize how the shoes would look during various real-world activities, I used a human model with motion capture data and set up a large animation controller in Unity. I removed the original shoes from the model, and the app dynamically replaces them with whatever shoe the user has selected. For animation selection, I set up curved UI surfaces (in VR, these can be easier to read than a simple flat UI).
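A simplified sketch of how that shoe swap and activity selection might be wired up is below. The foot anchors, prefab fields, and Animator trigger names are placeholders of mine, not the project's actual code:

```csharp
using UnityEngine;

// Illustrative sketch: attach the selected photogrammetry shoes to the mocap model's
// feet and play the chosen Mixamo clip via an Animator trigger. Names are placeholders.
public class ShoePreviewController : MonoBehaviour
{
    [SerializeField] private Animator mocapAnimator;    // Animator driving the Mixamo clips
    [SerializeField] private Transform leftFootAnchor;  // empty transforms where the original shoes were removed
    [SerializeField] private Transform rightFootAnchor;

    private GameObject currentLeft, currentRight;

    // Called when the user selects a shoe (e.g. from a pinch gesture as sketched earlier).
    public void ShowShoe(GameObject leftShoePrefab, GameObject rightShoePrefab)
    {
        if (currentLeft != null) Destroy(currentLeft);
        if (currentRight != null) Destroy(currentRight);

        // Parent the scanned models to the foot anchors so they follow the animation.
        currentLeft = Instantiate(leftShoePrefab, leftFootAnchor, false);
        currentRight = Instantiate(rightShoePrefab, rightFootAnchor, false);
    }

    // Called from the curved UI when the user picks an activity (e.g. a "Run" trigger).
    public void PlayActivity(string animatorTrigger)
    {
        mocapAnimator.SetTrigger(animatorTrigger);
    }
}
```

Parenting the scanned shoes to anchors on the foot bones is what lets one animation controller drive every shoe variant without re-rigging anything.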
Tech Credits: This was all done in Unity 2021.3 LTS (code is C#). The models were shot on an iPhone 14 Pro and converted to 3D USDZ models on macOS (via a Swift command-line app using the Object Capture API), with some minor cleanup in Blender. The motion capture data is pulled from Mixamo (Adobe), with some small keyframe tweaking in Unity. And this is all running on a Meta Quest 2, using the Oculus Integration Toolkit v49.