
Embodiment Interaction
AXM-E7008, Spring 2025
Playtest and Iteration:
After presenting the project on the final presentation day and during the open studio exhibition
week, we realised that the project still had some usability issues and fundamental bugs.
For example, the hand gesture for summoning and holding the hammer was unreliable: the
detection threshold for the closed-fist gesture was too sensitive, so the hammer kept
appearing and disappearing, which hurt the player experience. After fixing that issue, a new
one emerged: it became difficult to get rid of the hammer once the player was done using it.
We considered multiple approaches to solve this problem:
1. Making the hammer an interactable object that is always present in the environment, so
players could pick it up or put it down whenever they wanted.
2. Letting players stow the hammer in an imaginary or visible belt area around the waist,
where a collider would make it disappear or hold it in place.
In the end, we solved the issue with a custom gesture: when the palm starts to open, the
hammer disappears.
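A common way to stop a gesture-driven object from flickering in and out is hysteresis: use a stricter threshold to summon the object than to dismiss it. The sketch below is illustrative Python, not our actual Unity/C# code; the threshold values and the normalized `curl` input (0 = open palm, 1 = closed fist) are assumptions for the example.

```python
class HammerGesture:
    """Debounce a fist/open-palm gesture with two thresholds (hysteresis).

    `curl` is a normalized finger-curl value: 0.0 = fully open palm,
    1.0 = fully closed fist. Separate summon/dismiss thresholds prevent
    the hammer from flickering when curl hovers near a single cutoff.
    """

    SUMMON_THRESHOLD = 0.8   # fist must be nearly closed to summon
    DISMISS_THRESHOLD = 0.3  # palm must clearly open to dismiss

    def __init__(self):
        self.hammer_visible = False

    def update(self, curl: float) -> bool:
        if not self.hammer_visible and curl >= self.SUMMON_THRESHOLD:
            self.hammer_visible = True
        elif self.hammer_visible and curl <= self.DISMISS_THRESHOLD:
            self.hammer_visible = False
        return self.hammer_visible
```

Because the summon and dismiss cutoffs do not overlap, small frame-to-frame noise in the tracked hand pose can no longer toggle the hammer on and off.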
We also ran internal playtests with five people, which yielded further insights. The
interaction for placing and building towers was not intuitive. Previously, players had to
buy a tower, move it to its destination, and then give a left-hand thumbs-up to confirm the
location; only after that could they bring out the hammer and start building. Most players
found this confusing, especially the thumbs-up confirmation step. We improved the
interaction so that players simply move the tower wherever they want and confirm its
position by striking it with the hammer. This made much more sense from a design and
gameplay point of view and noticeably improved the player experience.
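The revised placement flow can be summarised as a small state machine: a bought tower follows the player's hand as a movable ghost, the first hammer strike commits its position, and further strikes complete construction. This Python sketch is illustrative only; the state names and the number of strikes are assumptions, not our actual Unity implementation.

```python
from enum import Enum, auto

class TowerState(Enum):
    GHOST = auto()   # bought; follows the player's hand, position not final
    PLACED = auto()  # position confirmed by the first hammer strike
    BUILT = auto()   # construction finished by further hammer strikes

class Tower:
    STRIKES_TO_BUILD = 3  # assumed strike count to finish building

    def __init__(self):
        self.state = TowerState.GHOST
        self.strikes = 0

    def on_hammer_strike(self):
        if self.state is TowerState.GHOST:
            # The first strike both confirms the position and starts building,
            # removing the separate thumbs-up confirmation step.
            self.state = TowerState.PLACED
        if self.state is TowerState.PLACED:
            self.strikes += 1
            if self.strikes >= self.STRIKES_TO_BUILD:
                self.state = TowerState.BUILT
```

Folding confirmation into the first strike is what removed the confusing extra gesture: the hammer is now the single tool for both committing and building a tower.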
We also added a simple tutorial to improve the onboarding experience, as we had initially
found it challenging to communicate which actions should be performed with which hand. The
improved tutorial now advances only when the player completes the current task. After
implementing the tutorial, users gave positive feedback, noting that the mechanics felt
more intuitive to learn.
Conclusion
Our exploration of novel embodied interaction techniques has been both educational and
enjoyable. While we had prior experience with the Unity engine and VR environments,
venturing into mixed reality presented a significant new challenge with a substantial
initial learning curve, and each stage of development raised its own unique challenges and
considerations. Nevertheless, we are satisfied with our current outcomes. The scope of this
project was ambitious, as our goal was to develop a fully functional tower defense game in
a mixed reality setting; this challenge proved highly educational, and we are proud to have
achieved a working implementation.
We hold high expectations for this project and view our current progress not as a conclusion, but
rather as the beginning of our journey in mixed reality design and development. The process of