At NuVuX, we’re passionate about empowering students to imagine bold solutions for their communities. Recently, we had the opportunity to support a group of students at the Ali Ghandour Center, a community-based leadership and civic engagement program at the International College in Beirut, Lebanon, on a project that seamlessly blends advanced technology with a deep commitment to community. The International College in Beirut became a NuVuX partner last year, and its students have shown ingenuity in treating pressing issues within their own communities as a source of ideas. Alongside other partner organizations, these students developed an inspiring vision that demonstrates the power of creativity and collaboration to shape the future. What follows is the students’ own narrative of their remarkable project.
Article by Jawad Dheini (Student at International College in Beirut, Lebanon) on behalf of the Deafeat team
Deafeat began with a clear goal: to bridge the communication gap between deaf and hearing individuals using accessible, real-time technology. The idea was born from a team member’s personal experience growing up with deaf family members and witnessing the challenges firsthand. That perspective, combined with three years in the Ali Ghandour Center’s program and its focus on community-based design, taught us to build for real community impact.
From the start, we knew we wanted something efficient, fast, and meaningful in real-world use, especially in countries where accessibility tools are scarce or nonexistent. The core of the project is Deafeat AI, a system that detects hand gestures and translates them into letters and words, which are combined into sentences and finally rendered as speech. It also works in reverse, transcribing spoken language into text. One of the biggest technical challenges was achieving this on a Raspberry Pi with limited processing power. Camera integration was a major hurdle: maintaining a reliable frame rate while capturing and analyzing hand signs in real time pushed us to experiment with different libraries, optimize frame processing, and deal with lighting inconsistencies.

Throughout the process, we kept the NuVuX team updated and received useful suggestions, such as how to combine our static sign model with the dynamic gesture one, which helped us tackle one of the toughest parts of the project: integrating both models into a unified inference pipeline. This required managing inputs, implementing temporal logic for gesture tracking, and fine-tuning geometric preprocessing to improve detection accuracy.
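For readers curious about the mechanics, a minimal sketch of what such a unified pipeline can look like follows. It is illustrative rather than the team’s actual code: it assumes MediaPipe Hands for landmark extraction, and the window size, motion threshold, and the two stub models are placeholders.

```python
# Sketch of a unified static/dynamic inference loop (assumptions: MediaPipe
# Hands for landmarks; WINDOW, MOTION_THRESH, and the stub models are
# illustrative stand-ins, not Deafeat's real values or classifiers).
import collections
import cv2
import mediapipe as mp
import numpy as np

WINDOW = 15           # frames fed to the dynamic gesture model (assumed)
MOTION_THRESH = 0.02  # mean landmark displacement that flags a moving hand

class _StubModel:
    """Placeholder for the trained classifiers; swap in real models."""
    def __init__(self, name):
        self.name = name
    def predict(self, features):
        return f"<{self.name} prediction>"

static_model = _StubModel("static sign")       # per-frame letter classifier
dynamic_model = _StubModel("dynamic gesture")  # windowed gesture classifier

hands = mp.solutions.hands.Hands(max_num_hands=1, min_detection_confidence=0.6)
history = collections.deque(maxlen=WINDOW)

def preprocess(landmarks):
    """Geometric normalization: translate landmarks to the wrist and scale
    by hand size, so predictions don't depend on where the hand sits."""
    pts = np.array([[lm.x, lm.y, lm.z] for lm in landmarks], dtype=np.float32)
    pts -= pts[0]                   # landmark 0 (wrist) becomes the origin
    scale = np.linalg.norm(pts[9])  # wrist-to-middle-MCP as the unit length
    return pts / (scale + 1e-6)

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
    if not result.multi_hand_landmarks:
        history.clear()             # a gap in detection resets the window
        continue
    feats = preprocess(result.multi_hand_landmarks[0].landmark)
    history.append(feats)

    # Temporal routing: only a moving hand goes to the dynamic model.
    if len(history) == WINDOW:
        motion = np.mean(np.abs(np.diff(np.stack(history), axis=0)))
        if motion > MOTION_THRESH:
            print(dynamic_model.predict(np.stack(history)))
        else:
            print(static_model.predict(feats))
cap.release()
```

Routing on detected motion keeps the cheap per-frame classifier doing most of the work, which is exactly the kind of trade-off a Raspberry Pi’s limited processing power forces.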
Training the model presented its own set of problems: we had limited data and limited time. To work around this, we built a lightweight training menu that quickly captures a few examples of a sign, automatically adjusts and augments them, and feeds them back to the model. That speed made it possible to offer a custom-training feature that lets users personalize their device.
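A simplified sketch of that capture-and-augment idea is shown below. The landmark shape, augmentation ranges, and the scikit-learn nearest-neighbor classifier are assumptions chosen for illustration, not the team’s actual pipeline.

```python
# Sketch: turn a handful of captured landmark samples per sign into a usable
# training set via small geometric perturbations, then refit a lightweight
# classifier. All parameters here are illustrative assumptions.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def augment(sample, n_variants=20, rng=None):
    """Expand one (21, 3) landmark sample into many by rotating about the
    z-axis, rescaling, and adding Gaussian jitter."""
    if rng is None:
        rng = np.random.default_rng()
    variants = []
    for _ in range(n_variants):
        angle = rng.uniform(-0.2, 0.2)   # roughly +/- 11 degrees
        c, s = np.cos(angle), np.sin(angle)
        rot = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]], dtype=np.float32)
        v = sample @ rot.T
        v = v * rng.uniform(0.9, 1.1)            # uniform scale
        v = v + rng.normal(0.0, 0.01, v.shape)   # positional jitter
        variants.append(v)
    return variants

def build_dataset(captures):
    """captures maps each sign label to a short list of (21, 3) landmark
    arrays recorded from the training menu."""
    X, y = [], []
    for label, samples in captures.items():
        for s in samples:
            for v in [s, *augment(s)]:
                X.append(v.ravel())
                y.append(label)
    return np.array(X), np.array(y)

# A few captures per sign become hundreds of training rows in seconds.
captures = {
    "A": [np.random.rand(21, 3).astype(np.float32) for _ in range(3)],
    "B": [np.random.rand(21, 3).astype(np.float32) for _ in range(3)],
}
X, y = build_dataset(captures)
model = KNeighborsClassifier(n_neighbors=3).fit(X, y)
```

Because the augmentation happens at capture time, a handful of examples is enough to teach the device a new sign on the spot, which is what makes a user-facing custom-training feature practical.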
With the help of the IC STEAM Lab, we built the physical device, and once we had a working prototype, we tested it at the Lebanese School for the Blind and the Deaf (LSBD) in Baabda, Lebanon. The feedback we received was encouraging and helped shape our next steps.
Looking ahead, we are developing a standalone mobile app, exploring hardware redesigns for better portability, and working toward supporting additional sign languages. We are also in early discussions around deployment in schools and public spaces.
This project was our AGC graduation deliverable and marked the culmination of everything we’ve learned over the past three years. Our team members received a “Social Entrepreneurship” award in recognition of Deafeat’s impact.
We’ve validated the concept. Now we’re focused on scaling the system and refining it for daily use. To see Deafeat in action, you can watch our 5-minute final presentation video here: https://youtu.be/4bGaQb2MfcU
You can also follow our progress on Instagram: @DeafeatUniversal