Spatial Keypad with Eye and Hand Tracking

Prototyped another variation of the previous keypad concept, this time with eye tracking. The Quest Pro didn't track Ring/Pinky pinches accurately, so I used both hands and relied only on the Index and Middle pinches to decode a key's signs.

Pinch Invocation Map

1) Left Index -> Top Left -> "A"

2) Left Middle -> Bottom Left -> "1"

3) Right Index -> Top Right -> "B"

4) Right Middle -> Bottom Right -> "C"

As you can see, the left hand decodes the signs on the left side of a key, and the right hand decodes the signs on the right side. The mapping feels natural, and there are no significant issues with detecting the Index/Middle pinches. The typing experience is comfortable and fairly quick, with minimal physical strain: you use your eyes to select and your hands to 'click', either resting on a table or on your lap.
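To make the decoding idea concrete, here's a minimal sketch in Python. It is purely illustrative, not the prototype's actual code: the `Key` structure, sign order, and pinch names are assumptions used to show how a gaze-selected key plus one of the four pinches resolves to a single sign.

```python
from dataclasses import dataclass

# Each pinch maps to one corner of the key you're gazing at:
# (hand, finger) -> corner index into the key's four signs.
PINCH_TO_CORNER = {
    ("left", "index"): 0,    # top-left
    ("right", "index"): 1,   # top-right
    ("left", "middle"): 2,   # bottom-left
    ("right", "middle"): 3,  # bottom-right
}

@dataclass
class Key:
    # Signs stored as [top-left, top-right, bottom-left, bottom-right],
    # e.g. ["A", "B", "1", "C"] for the key described above.
    signs: list

def decode(gazed_key: Key, hand: str, finger: str) -> str:
    """Return the sign selected by a pinch while gazing at a key."""
    corner = PINCH_TO_CORNER[(hand, finger)]
    return gazed_key.signs[corner]

# Example: gazing at the key with A/B on top and 1/C on the bottom,
# then pinching Left Middle -> bottom-left -> "1".
key = Key(signs=["A", "B", "1", "C"])
print(decode(key, "left", "middle"))  # "1"
```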

There's room for improvement in the layout. One could arrange the most frequently used letters near the keyboard's center to minimize eye movement. This would be especially beneficial for professionals who prioritize speed and efficiency, and for them, learning a new layout would pay off.

However, a layout with a sequential order of signs is more practical for everyday users. Since most people are familiar with alphabetical order and basic counting, that familiarity helps them locate the desired letter easily.

The sorted order of the signs on the keypad lets users apply a binary-search strategy to find a specific sign. This method has O(log n) complexity, which is more efficient than the O(n) scan an unsorted sequence requires. Therefore, a layout with a sorted sequence of signs is more effective for the majority of users.
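To make the complexity argument concrete, here's a tiny Python sketch of binary search over an alphabetically sorted sign sequence. Again, this only illustrates the O(log n) point; it is not code from the prototype.

```python
from typing import Sequence

def locate(signs: Sequence[str], target: str) -> int:
    """Find the index of `target` in a sorted sign sequence via binary search."""
    lo, hi = 0, len(signs) - 1
    steps = 0
    while lo <= hi:
        mid = (lo + hi) // 2
        steps += 1
        if signs[mid] == target:
            print(f"found {target!r} in {steps} steps")
            return mid
        if signs[mid] < target:
            lo = mid + 1   # target is in the upper half
        else:
            hi = mid - 1   # target is in the lower half
    return -1

alphabet = [chr(c) for c in range(ord("A"), ord("Z") + 1)]
locate(alphabet, "S")  # 5 comparisons for 26 letters vs. up to 26 for a linear scan
```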

By the way, the point above is one of the reasons it makes little sense to use QWERTY for spatial inputs like this: 1) you can't leverage your muscle memory because of the lack of haptic feedback and the different hand posture; 2) from a common-sense perspective, it's an arbitrary, unsorted sequence.

If you have a Quest Pro, you can try out this prototype yourself on GitHub.

I'm eager to hear your feedback!

Stay tuned

Dribbble | GitHub | Twitter | Instagram

