VR Visual Programming UI
Exploring affordances for VR UI elements, implemented with particle systems
Handles react to finger proximity, communicating grabbable size
Apple Watch Haptics in VR
Investigating displaced haptics and sensory fusion
Unity collision → OSC → iPhone → Apple Watch haptic pulse
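The Unity → phone hop in this pipeline could be sketched as a minimal OSC 1.0 message, assuming plain OSC over UDP; the address `/haptic/pulse` is illustrative, not necessarily the prototype's actual address:

```python
def osc_message(address: str) -> bytes:
    """Build a minimal OSC 1.0 message with no arguments.

    Per the OSC spec, strings are null-terminated and padded to a
    4-byte boundary; a message is the address pattern followed by a
    type tag string beginning with ','.
    """
    def padded(s: str) -> bytes:
        b = s.encode("ascii") + b"\x00"
        return b + b"\x00" * (-len(b) % 4)

    return padded(address) + padded(",")


# e.g. send on collision: socket.sendto(osc_message("/haptic/pulse"), (phone_ip, 9000))
```

The phone-side app then relays the event to the paired Apple Watch, which plays the haptic.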
The ~150 ms collision-to-haptics latency is preempted by projecting the collision trigger forward along the finger's velocity vector
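The preemption step might look like the following sketch: rather than firing when the finger actually touches a surface, project the finger's position forward along its velocity by the measured pipeline latency and fire when the projected point crosses the surface plane (function and parameter names are illustrative):

```python
HAPTIC_LATENCY = 0.150  # seconds; measured collision-to-haptics delay

def should_trigger_haptic(position, velocity, plane_point, plane_normal,
                          latency=HAPTIC_LATENCY):
    """Trigger when the finger, projected `latency` seconds ahead along
    its velocity, has crossed a plane (point + outward normal)."""
    projected = [p + v * latency for p, v in zip(position, velocity)]
    # Signed distance from the projected point to the plane; <= 0 means crossed.
    to_plane = [p - q for p, q in zip(projected, plane_point)]
    return sum(a * b for a, b in zip(to_plane, plane_normal)) <= 0
```

With a finger 10 cm above a surface and moving toward it at 1 m/s, the projected trigger fires ~150 ms before actual contact, so the relayed pulse lands roughly at the moment of touch.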
The particles’ physical dynamics prime and influence haptic perception
ARKit WorldSpace to ScreenSpace
Physics Transform Prototype
Placing information within the environment is essential, but I’ve been dissatisfied with how disconnected world-space panels feel in phone AR apps.
Apple’s Measure app behaves admirably: a world-space label smoothly interpolates and reorients into screen space. However, once snapped to screen space, there is no visual or physical indication of the panel’s source, as if it has lost all connection to its anchor.
I created a dynamic physical system where twin forces are constantly pulling the panel to its anchor and to the phone.
By tuning the forces so that mild screen-space snapping wins when the phone nears the panel, and the pull back to the anchor ramps up as the phone departs, the panel becomes a reactive physical object: it subtly communicates its tendency to return to its anchor, establishing its spatial context even while near screen space.
Now that translational forces are implemented, I’ll next work on subtle rotational forces to align the panel more closely with the screen’s normal. I’ll allow it some slack so that its rubberbanding communicates its orientation at its anchor.
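The twin-force system described above could be sketched in one dimension as two damped springs whose weights shift with phone-to-anchor distance; the constants (spring stiffness, damping, 1 m ramp) are illustrative, not the prototype's tuned values:

```python
from dataclasses import dataclass

@dataclass
class PanelSim:
    """1-D sketch: a panel pulled toward both its world anchor and the phone."""
    position: float
    velocity: float = 0.0

    def step(self, anchor: float, phone: float, dt: float) -> None:
        # Near the anchor, the screen spring dominates (mild screen-space
        # snapping); as the phone departs, weight ramps back to the anchor.
        distance = abs(phone - anchor)
        anchor_weight = min(distance / 1.0, 1.0)  # ramps 0 -> 1 over 1 m
        screen_weight = 1.0 - anchor_weight
        stiffness, damping = 40.0, 8.0
        force = stiffness * (anchor_weight * (anchor - self.position)
                             + screen_weight * (phone - self.position))
        force -= damping * self.velocity
        # Semi-implicit Euler integration keeps the spring stable.
        self.velocity += force * dt
        self.position += self.velocity * dt
```

With the phone far away the panel settles at its anchor; with the phone nearby it settles at a weighted point biased toward the phone, so it always leans back toward its spatial source.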
I used Kite Compositor to build a fully interactive visualization of Pittsburgh's vehicle crash data, enabling exploration across many different comparisons.
My process documentation is available here.