AR Hand Interface for 3D Manipulation ・ NASA JPL

Summer 2018

Designed and prototyped in the OpsLab at NASA's Jet Propulsion Laboratory for ProtoSpace, an AR CAD collaboration tool.

Combining Leap Motion hand tracking with a HoloLens to explore bimanual interaction patterns.


Previously, if a ProtoSpace user wanted to move an AR CAD object precisely along a single axis, they had to pre-select the "X-Axis only" (etc.) tool by air-tapping through several menu levels.

My redesign lets the dominant hand simply grab the object to begin the move, while the nondominant hand's orientation constrains the motion to a single axis.

World axes anchored to the nondominant hand indicate, visually and audibly, when the hand has snapped into alignment with an axis.

Rapidly reorienting the hand is much faster than invoking a menu, and the gesture semantically matches the constraint it invokes.
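
A minimal sketch of that snapping logic, assuming plain numpy vectors and an illustrative 15° snap threshold rather than the Unity/Leap Motion code actually used in the prototype: the nondominant hand's pointing direction selects the nearest world axis, and the grabbing hand's displacement is projected onto it.

```python
import numpy as np

WORLD_AXES = np.eye(3)           # world X, Y, Z unit vectors
SNAP_ANGLE_DEG = 15.0            # illustrative snap threshold, not the prototype's value

def snapped_axis(nondominant_dir):
    """Return the world axis the nondominant hand has snapped to, or None."""
    d = nondominant_dir / np.linalg.norm(nondominant_dir)
    cosines = np.abs(WORLD_AXES @ d)          # |cos| so pointing either way along an axis counts
    best = int(np.argmax(cosines))
    angle = np.degrees(np.arccos(np.clip(cosines[best], 0.0, 1.0)))
    return WORLD_AXES[best] if angle <= SNAP_ANGLE_DEG else None

def constrained_move(grab_delta, axis):
    """Project the grabbing hand's displacement onto the snapped axis."""
    if axis is None:
        return grab_delta                     # no snap: move freely
    return axis * float(np.dot(grab_delta, axis))

# Example: a nondominant hand roughly aligned with world X constrains the drag to X.
axis = snapped_axis(np.array([0.97, 0.20, 0.10]))
print(constrained_move(np.array([0.05, 0.30, 0.02]), axis))   # ≈ [0.05, 0, 0]
```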

Without a visual representation of my hand in AR space, I often missed grabbing objects whenever hand tracking was slightly miscalibrated. Rather than superimposing a whole hand model, I anchored two "cursor" dots at my index and thumb tips, and I came to embody and identify them as me.


I explored other bimanual spatial UI interactions, like selection behavior [on the right].

You can immerse a finger in an object to point at it, but any confirmation gesture with that same hand risks displacing the selection point. Instead, the selected object is shown at the pinch point of the other hand, and pinching that hand confirms the selection.
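
A hedged sketch of that point/confirm split, assuming axis-aligned bounding boxes and made-up per-frame hand data (names like `update_selection` are illustrative, not from ProtoSpace): one hand's immersed fingertip picks the candidate, and only the other hand's pinch commits it.

```python
import numpy as np

def hovered_object(index_tip, objects):
    """Return the first object whose bounds contain the immersed fingertip."""
    for obj in objects:
        lo, hi = obj["bounds"]                # axis-aligned min/max corners
        if np.all(index_tip >= lo) and np.all(index_tip <= hi):
            return obj
    return None

def update_selection(index_tip, other_hand_pinching, objects, state):
    """Point with one hand; confirm with a pinch of the other hand."""
    candidate = hovered_object(index_tip, objects)
    state["preview"] = candidate              # rendered at the other hand's pinch point
    if candidate is not None and other_hand_pinching:
        state["selected"] = candidate         # confirming never disturbs the pointing hand
    return state

# Example frame: fingertip inside a 10 cm cube, other hand pinching -> selection commits.
objects = [{"name": "bracket", "bounds": (np.zeros(3), np.full(3, 0.1))}]
print(update_selection(np.array([0.05, 0.05, 0.05]), True, objects, {}))
```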

A key challenge was prototyping this future spatial UI (which assumes an expansive field of view) on present hardware with a limited FOV. For a hand that normally has relevant UI anchored to it but has moved outside the visible field, I considered ways the system could hold that UI at the FOV edge until the hand reenters [see top middle].
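
One way to sketch that behavior, treating the FOV as a simple cone and using illustrative names like `hold_at_fov_edge` that are not from the actual prototype: if the hand's direction falls outside the visible cone, the UI anchor is clamped to the cone's boundary in the hand's direction.

```python
import numpy as np

def hold_at_fov_edge(anchor_dir_cam, half_fov_deg=15.0):
    """Clamp a camera-space direction onto the visible cone if the hand has left the FOV.

    anchor_dir_cam: unit vector from the headset toward the hand anchor,
    in camera space with +Z forward. half_fov_deg is an assumed cone half-angle.
    """
    d = anchor_dir_cam / np.linalg.norm(anchor_dir_cam)
    forward = np.array([0.0, 0.0, 1.0])
    angle = np.degrees(np.arccos(np.clip(np.dot(d, forward), -1.0, 1.0)))
    if angle <= half_fov_deg:
        return d                              # hand is visible: keep UI anchored to it
    # Otherwise park the UI on the FOV boundary, in the hand's direction.
    lateral = d - forward * np.dot(d, forward)
    norm = np.linalg.norm(lateral)
    lateral = lateral / norm if norm > 1e-6 else np.array([1.0, 0.0, 0.0])
    r = np.radians(half_fov_deg)
    return forward * np.cos(r) + lateral * np.sin(r)
```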

I was interested in how UI iconography could morph into the tool UI itself, anchoring to the hand, rather than merely initiating that tool mode.

Thus the icon’s visual signification matches the resulting spatial UI form, as they are one and the same object.