VR Machine Vision Interaction: HCI and Design Systems

The hands-on gesture control turns the user’s smartphone camera into a real-time gesture sensor. We developed a computer vision algorithm that uses the smartphone camera and LED light (torch) to detect and track a left and right marker in real time across all lighting conditions.
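
To make the tracking approach concrete, here is a minimal sketch of the bright-marker idea, assuming the torch makes the two hand markers appear as the brightest blobs in the camera frame and that OpenCV is available. The threshold value, blob filtering, and left/right labeling by x position are illustrative assumptions, not the project’s actual implementation.

```python
# Minimal sketch: find the two brightest blobs in a frame and label
# them left/right by horizontal position. Threshold is an assumption.
import cv2

def track_markers(frame_bgr, brightness_thresh=220):
    """Return ((x, y) left, (x, y) right) marker centroids, or None."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, brightness_thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if len(contours) < 2:
        return None
    # Keep the two largest bright blobs and compute their centroids.
    blobs = sorted(contours, key=cv2.contourArea, reverse=True)[:2]
    centroids = []
    for c in blobs:
        m = cv2.moments(c)
        if m["m00"] == 0:
            return None
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    # Leftmost centroid is the left marker.
    left, right = sorted(centroids, key=lambda p: p[0])
    return left, right
```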

Tap (or click) intensity. Measuring the positional delta along the Z axis enables the algorithm to detect fast Z movements in real time. A Z movement that accelerates in the positive direction and then suddenly slows down registers as a click or ‘tap’. If the virtual hand reticle is hovering over a button or object during this state, the object is tapped or clicked.
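
A minimal sketch of such a detector, assuming per-frame Z positional deltas are available from the tracker above. The velocity thresholds, window size, and class name are hypothetical, chosen to illustrate the accelerate-then-decelerate rule.

```python
# Sketch: watch Z-axis velocity over a short window and report a tap
# when a fast positive push is followed by a sudden deceleration.
from collections import deque

class TapDetector:
    def __init__(self, push_thresh=0.8, decel_ratio=0.3, window=5):
        self.push_thresh = push_thresh   # velocity considered a "fast" push
        self.decel_ratio = decel_ratio   # how sharply the push must slow down
        self.velocities = deque(maxlen=window)

    def update(self, z_delta, dt):
        """Feed one frame's Z positional delta; returns the peak push
        velocity when a tap is detected, else None."""
        self.velocities.append(z_delta / dt)
        if len(self.velocities) < self.velocities.maxlen:
            return None
        peak = max(self.velocities)
        current = self.velocities[-1]
        # Tap = a fast positive push that has suddenly slowed down.
        if peak > self.push_thresh and current < peak * self.decel_ratio:
            self.velocities.clear()
            return peak
        return None
```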

How fast the user taps, i.e., how large the Z delta is within a given time frame, determines the intensity of the tap. For example, when hitting a virtual button or drum, the user can vary the intensity of the tap by striking with higher acceleration, increasing both the intensity and the loudness of the drum hit.
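
Continuing the sketch, the peak push velocity returned by the detector can be normalized into a tap intensity that drives, say, drum-sample volume. The velocity range here is an illustrative assumption.

```python
# Sketch: map peak push velocity to a normalized 0..1 tap intensity.
def tap_intensity(peak_velocity, v_min=0.8, v_max=3.0):
    t = (peak_velocity - v_min) / (v_max - v_min)
    return max(0.0, min(1.0, t))

# Usage idea: harder (faster) taps yield louder drum hits, e.g.
# volume = tap_intensity(peak); drum.play(volume=volume)
```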

Upwards Gazing and Abstraction. There are useful positions within the virtual environment that may lie outside the immediate line of sight, including the spaces above and below the user. Research suggests that upward deviations of the eyes activate creative activity in the brain, encouraging us to reach beyond nearby time and space into the infinite and eternal. This physiological hardwiring can be anchored to menu systems that require movement into unrelated or abstracted information that does not directly correspond to what is currently in front of the user.

  • Rotate/Orbit. The drag gesture can be used to rotate an object in 3D space, such as a dial, when performed on top of that virtual object. It can also be used to orbit around virtual orbit ‘anchors’, similar to orbiting around an object in 3D modeling software, using one hand (see the sketch below).
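
A minimal sketch of the drag-to-orbit mapping, assuming 2D drag deltas from the tracked marker are converted into yaw/pitch of a camera orbiting an anchor point. The function name and sensitivity value are hypothetical.

```python
# Sketch: convert drag deltas into spherical yaw/pitch around an anchor.
import math

def orbit_camera(anchor, radius, yaw, pitch, drag_dx, drag_dy,
                 sensitivity=0.005):
    """Update yaw/pitch from a drag delta; return new camera position."""
    yaw += drag_dx * sensitivity
    pitch += drag_dy * sensitivity
    # Clamp pitch just short of the poles to avoid gimbal flip.
    pitch = max(-math.pi / 2 + 0.01, min(math.pi / 2 - 0.01, pitch))
    x = anchor[0] + radius * math.cos(pitch) * math.sin(yaw)
    y = anchor[1] + radius * math.sin(pitch)
    z = anchor[2] + radius * math.cos(pitch) * math.cos(yaw)
    return (x, y, z), yaw, pitch
```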

Z-Depth translation with physiological synchronization via ‘push’ acceleration. Taking components of the previous iteration, we simplified the trigger movement to mimic the push of a virtual button. However, instead of forward movement being triggered after the button push, forward movement was tied directly to the hand’s acceleration as the button was being pushed. The button was, in essence, fake, acting only as a visual indicator for the user; pressing it fired no corresponding event in the experience. It was the forward hand acceleration that directly drove the virtual camera’s acceleration in Z-depth. Accompanied by footstep sound effects, this modality proved successful, easy to understand, and physiologically appropriate for the user.
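
A minimal sketch of this modality, assuming the hand’s forward (Z) acceleration is available each frame. The scale and damping constants are illustrative, not values from the project.

```python
# Sketch: the camera's forward velocity follows the hand's forward
# acceleration during the push; the "button" itself fires no event.
class PushLocomotion:
    def __init__(self, scale=0.5, damping=3.0):
        self.scale = scale       # hand acceleration -> camera acceleration
        self.damping = damping   # braking once the push stops
        self.velocity = 0.0

    def update(self, hand_z_accel, dt):
        """Advance camera velocity for one frame; return the Z
        displacement to apply to the virtual camera."""
        push = max(0.0, hand_z_accel) * self.scale  # only forward pushes propel
        self.velocity += (push - self.damping * self.velocity) * dt
        return self.velocity * dt
```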

Posted on Mar 14, 2023
Milan Baić