Embodiment Interaction
AXM-E7008, Spring 2025
Shamit Ahmed, Yu-Jung Liu
Bark No More - A mixed-reality tower defense game
Itch.io Download Link:
https://shamitahmed.itch.io/barknomore?secret=2tpLE88Md0vgJB4bUDHo2YkPrI
Description
Bark No More is a mixed reality (MR) tower defense game that reimagines your everyday
surroundings as an interactive battlefield. By leveraging spatial mapping and depth sensing, the
game blends virtual elements with the physical world, allowing players to place defense towers
beside real-world objects like desks, tables, and chairs. While full environmental recognition is
not yet implemented, the project is designed with this future functionality in mind, aiming to
create dynamic, adaptive gameplay environments unique to each player's space.
The core gameplay of Bark No More centers on intuitive, embodied interaction. Players use
natural hand gestures to collect energy dropped by defeated enemies or to hammer down and
upgrade towers in real time. These physical motions replace traditional controller-based inputs,
creating a seamless bridge between the player’s body and the game world. This approach
enhances engagement, making the experience not just visual, but physical.
Departing from conventional tower defense mechanics that rely on mouse clicks and static
screens, Bark No More leverages MR to create a deeply immersive and strategic experience.
Players must physically move, observe, and react within their real-world environment to fend off
waves of invading dogs. With precise hand tracking enabled by the Meta Quest headset, the
game offers a level of interactivity and immersion that screen-bound tower defense games
cannot match.
Reflection
Reflecting on the project, we are quite satisfied with the outcome despite some shortcomings.
We managed to integrate most of the features we wanted in this proof of concept. From the
beginning, developing the project's novel embodied interaction techniques has been both
challenging and interesting.
Throughout the development process, we came to realize that when working under a tight
timeframe, prioritizing tasks by importance becomes crucial, especially for a project with such
complex features. Having a clear idea of what we wanted in our MVP helped us focus and use
our limited time wisely. It kept us on track and ensured the core experience came through, even
with the project's experimental nature.
Since mixed-reality hand tracking and gesture recognition are still new and evolving, it was
especially difficult to find both relevant prior work and tutorials or documentation on how to
actually implement the features. We focused our efforts on the key interaction features that
aligned with our project goals, such as using hand tracking to move buildings and interact with
virtual buttons and coins, and recognizing specific gestures to cast fireball spells and toggle the
hammer. These elements were central to showcasing the potential of our concept.
Playtest and Iteration
After presenting the project on the final presentation day and during the open studio exhibition
week, we realized that the project still had some usability issues and fundamental bugs.
For example, the hand gesture used to bring out the hammer was recognized inconsistently:
the threshold for detecting the closed-fist gesture was too sensitive, so the hammer kept
appearing and disappearing, hurting the player experience. After fixing that issue, the next
problem was that it became difficult to get rid of the hammer once the player was done using it.
We considered multiple approaches to this problem:
1. Making the hammer an interactable object that is always present in the environment, so
players can pick it up or put it down whenever they want.
2. Letting players stow the hammer in an imaginary or visible belt area around the waist, where
a collider would make it disappear or hold it in place.
In the end, we solved the issue with a custom gesture: detecting when the palm starts to open
makes the hammer disappear, as sketched below.
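Below is a minimal sketch of how such a toggle can be stabilized in Unity, assuming the
hand-tracking layer exposes a normalized fist strength (0 = open palm, 1 = closed fist). The
GetFistStrength() hook, the thresholds, and the hold time are illustrative assumptions, not our
exact implementation; the point is the hysteresis pattern: separate thresholds for summoning
and dismissing the hammer, plus a short hold time, so tracking jitter near a single threshold
cannot make the hammer flicker.

using UnityEngine;

// Hysteresis-based hammer toggle: the fist must close firmly to summon
// the hammer, but the palm must open well past that point to dismiss it.
public class HammerToggle : MonoBehaviour
{
    [SerializeField] private GameObject hammer;
    [SerializeField] private float closeThreshold = 0.85f; // summon above this
    [SerializeField] private float openThreshold = 0.35f;  // dismiss below this
    [SerializeField] private float holdTime = 0.15f;       // ignore single-frame glitches

    private float gestureTimer;
    private bool hammerOut;

    private void Start() => hammer.SetActive(false);

    private void Update()
    {
        float fist = GetFistStrength(); // hypothetical tracking hook, 0..1

        // While the hammer is out, only a clearly open palm dismisses it;
        // while it is away, only a firm fist summons it.
        bool wantsHammer = hammerOut
            ? fist > openThreshold
            : fist > closeThreshold;

        if (wantsHammer != hammerOut)
        {
            // Require the new state to persist briefly before switching.
            gestureTimer += Time.deltaTime;
            if (gestureTimer >= holdTime)
            {
                hammerOut = wantsHammer;
                hammer.SetActive(hammerOut);
                gestureTimer = 0f;
            }
        }
        else
        {
            gestureTimer = 0f;
        }
    }

    // Placeholder: in practice this would come from the headset's
    // hand-tracking data (e.g. average finger curl across the hand).
    private float GetFistStrength() => 0f;
}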
We also conducted internal playtests with five people and gathered more insightful findings.
The interaction of placing and building the towers was not intuitive. Previously, players had to
buy a tower, move it to the desired location, and give a left-hand thumbs-up to confirm the
position; only after that could they bring out the hammer and start building. This confused most
players, especially the thumbs-up confirmation step. We improved the interaction by letting
players move the tower wherever they want and confirm its position simply by striking it with
the hammer, as sketched below. This made much more sense from a design and gameplay
perspective and noticeably improved the player experience.
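To illustrate the revised flow, here is a minimal sketch of a placement component, assuming
the tower being carried has a trigger collider and the hammer head is tagged "Hammer". The
component name, the tag, and the construction hook are hypothetical, not our project's actual
code.

using UnityEngine;

// A tower under placement follows the player's hand until the hammer
// strikes it; the strike both confirms the position and starts the
// build, replacing the earlier thumbs-up confirmation step.
public class TowerPlacement : MonoBehaviour
{
    private bool placed;

    private void OnTriggerEnter(Collider other)
    {
        // React only to the first hammer hit.
        if (placed || !other.CompareTag("Hammer")) return;

        placed = true;              // lock the tower at its current spot
        transform.SetParent(null);  // detach it from the grabbing hand
        Debug.Log("Tower position confirmed; construction begins.");
        // ...start the build animation / enable the tower's behaviour here
    }
}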
We also added a simple tutorial to improve the onboarding experience, as we initially found it
challenging to communicate which actions should be performed with which hand. The improved
tutorial now progresses only when the player completes the current task. After
implementing the tutorial, users gave positive feedback, noting that the mechanics felt more
intuitive to learn.
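A task-gated tutorial of this kind reduces to a small queue of steps, as in the sketch below. The
step identifiers and the CompleteTask() entry point that gameplay systems would call are
illustrative assumptions, not the project's actual API.

using System;
using System.Collections.Generic;
using UnityEngine;

// The tutorial advances only when the player performs the action the
// current step asks for; out-of-order actions are simply ignored.
public class TutorialSequence : MonoBehaviour
{
    // Illustrative step identifiers, one per taught mechanic.
    private readonly Queue<string> steps = new Queue<string>(new[]
    {
        "GrabTower",     // move a tower with the hand
        "HammerStrike",  // confirm placement by hammering
        "CollectEnergy", // grab energy dropped by enemies
        "CastFireball"   // perform the fireball gesture
    });

    // UI listens to this to display the instruction for the current step.
    public event Action<string> StepChanged;

    private void Start()
    {
        if (steps.Count > 0) StepChanged?.Invoke(steps.Peek());
    }

    // Gameplay systems report completed player actions here.
    public void CompleteTask(string task)
    {
        if (steps.Count == 0 || steps.Peek() != task) return;
        steps.Dequeue();
        if (steps.Count > 0) StepChanged?.Invoke(steps.Peek());
        else Debug.Log("Tutorial complete.");
    }
}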
Conclusion
Our exploration of novel embodied interaction techniques has been both educational and
enjoyable. While we possessed previous experience with the Unity engine and VR
environments, venturing into mixed reality presented a significant new challenge, accompanied
by a substantial initial learning curve. Each stage of development presented unique challenges
and considerations. Nevertheless, we are pleased with our current outcomes. The scope of this
project was substantial, as our goal was to develop a fully
functional tower defense game within a mixed reality setting. This challenge proved to be highly
educational, and we are proud to have achieved a working implementation.
We hold high expectations for this project and view our current progress not as a conclusion, but
rather as the beginning of our journey in mixed reality design and development. The process of
designing embodied interactions, alongside the guidance of our supervisor and the insights
shared by fellow students, has been both rewarding and motivating. We intend to continue
enhancing the game by expanding its content and refining its features.
Looking ahead, we envision two potential directions for this project: one path involves preparing
the game for release on the Meta Store, while the other focuses on deepening our exploration of
embodied interaction and spatial awareness in mixed-reality applications, with the aim of
submitting our work to conferences related to human-computer interaction and extended reality.
Gameplay Screenshots
Gameplay Video
https://youtu.be/-lfP-QiiHZw