Gesture Control
Leap Motion Development for testing
When in doubt, explore
Tech-center Research
Leap Motion
Unity
Ekoo
Unity Demo
Ekoo Test
Download to run
To run the Ekoo gesture control demo, you need to set up an UltraLeap controller.
Remember my case study Ekoo? We envisioned that in the future, whether in MR environments or real ones, game production teams could interact with 3D space more intuitively through gesture control.

However, due to time constraints we only had concepts, so our professor Maggie asked whether we could evaluate the usability of our ideas.

So I chose Leap Motion as the controller and developed our idea into a Unity demo to test it.
YEAR
2022
PROJECT TYPE
Academic
OTHER CONTRIBUTOR
N/A
01 The Problem
While collecting pain points from Ekoo users, we uncovered an intriguing issue.

Many people find manipulating 3D models with a mouse and keyboard imprecise and complex: they struggle to map mouse movement accurately onto rotation and scaling values. Shortcuts add further friction, especially when working online, because keyboard commands in modeling software overlap with those in video-conferencing tools (for example, the spacebar acts as a scaling command in one and a mute/unmute toggle in the other). Consequently, we anticipate that users will increasingly want to control 3D space through gestures.

However, designing gesture interactions that feel natural, minimize learning cost, and align with users' mental models is a real challenge.
02 Research Goal
How can we design gestures that are both intuitive and in harmony with users' mental models?

We aim to develop a gesture interaction system that feels natural and intuitive, giving users a more straightforward way to control 3D models that matches their expectations.
03 Gesture Design
Zoom In & Out
Undo
Confirm
Rotate & Pan
Play & Pause
Accelerate
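To give a sense of how this gesture set maps onto operations in the demo, here is a minimal, hypothetical dispatcher sketch. The enum values mirror the gesture names above; the handler fields and the Dispatch method are placeholder names of my own, not the project's actual code.

```csharp
using System;
using UnityEngine;

// Illustrative sketch only: one way the six gestures could be routed to
// handlers inside the Unity demo. Names are placeholders.
public enum EkooGesture { ZoomInOut, Undo, Confirm, RotatePan, PlayPause, Accelerate }

public class GestureDispatcher
{
    public Action OnUndo, OnConfirm, OnPlayPause, OnAccelerate;
    public Action<float> OnZoom;           // signed zoom amount
    public Action<Vector3> OnRotatePan;    // hand movement delta

    public void Dispatch(EkooGesture gesture, float zoomAmount = 0f, Vector3 delta = default(Vector3))
    {
        switch (gesture)
        {
            case EkooGesture.ZoomInOut:  OnZoom?.Invoke(zoomAmount);   break;
            case EkooGesture.Undo:       OnUndo?.Invoke();             break;
            case EkooGesture.Confirm:    OnConfirm?.Invoke();          break;
            case EkooGesture.RotatePan:  OnRotatePan?.Invoke(delta);   break;
            case EkooGesture.PlayPause:  OnPlayPause?.Invoke();        break;
            case EkooGesture.Accelerate: OnAccelerate?.Invoke();       break;
        }
    }
}
```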
04 Demo Development
This was my first time developing gesture controls, so writing the code was challenging for me. Through this project I learned to implement the mapping calculations and perform precise joint-level checks, which gave me much finer control over user interactions. A simplified sketch of the fist-clench check follows, and a sketch of the zoom and rotation mapping appears after the captions below.
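As an example of the joint-level checks, this is a simplified sketch of the fist-clench detection captioned below. It assumes the Leap Motion C# API's Hand.GrabStrength, Hand.Fingers, and Finger.IsExtended members as I recall them; the 0.8 threshold is an illustrative value rather than the exact one used in the demo.

```csharp
using Leap;

// Simplified sketch of the fist-clench check. GrabStrength runs from 0
// (open palm) to 1 (closed fist); the joint-level part verifies that no
// finger is extended. Threshold values here are illustrative.
public static class FistDetection
{
    public static bool IsFistClenched(Hand hand)
    {
        bool strongGrab = hand.GrabStrength > 0.8f;

        int extendedFingers = 0;
        foreach (Finger finger in hand.Fingers)
        {
            if (finger.IsExtended)
                extendedFingers++;
        }

        return strongGrab && extendedFingers == 0;
    }
}
```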
Compare the last and current palm positions to map the change onto zoom interactions
Calculate the rotation direction using the dot product
Check palm and finger flexion values to detect fist clenching
Calculate grip force, pinch force, and flexion/extension
Finger, palm, and fist settings
Set up demo scene
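Below is a sketch of the position-delta and dot-product mappings from the first two captions. It assumes the palm position has already been converted into a Unity world-space Vector3 by the hand-tracking provider; the speed constants and the OnPalmMoved entry point are placeholders, not the demo's actual values.

```csharp
using UnityEngine;

// Sketch of the zoom and rotation-direction mappings. OnPalmMoved is a
// placeholder entry point meant to be called each frame with the tracked
// palm position (already in Unity world space).
public class GestureMapping : MonoBehaviour
{
    public Transform target;        // the 3D model being manipulated
    public float zoomSpeed = 2f;
    public float rotateSpeed = 90f; // degrees per metre of sideways hand motion

    private Vector3 lastPalmPosition;
    private bool hasLastPosition;

    public void OnPalmMoved(Vector3 currentPalmPosition)
    {
        if (!hasLastPosition)
        {
            lastPalmPosition = currentPalmPosition;
            hasLastPosition = true;
            return;
        }

        Vector3 delta = currentPalmPosition - lastPalmPosition;

        // Zoom: the component of the movement along the camera's forward
        // axis (palm pushed away or pulled back) scales the model.
        float towardCamera = Vector3.Dot(delta, Camera.main.transform.forward);
        target.localScale *= 1f + towardCamera * zoomSpeed;

        // Rotation direction: the sign of the dot product with the camera's
        // right axis decides which way the model turns.
        float sideways = Vector3.Dot(delta, Camera.main.transform.right);
        target.Rotate(Vector3.up, -sideways * rotateSpeed, Space.World);

        lastPalmPosition = currentPalmPosition;
    }
}
```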
What's next
Next, I will run usability tests, collect more data, and summarize the results to refine our interaction concept. At the moment there is not enough data to justify any changes.