Inspiration
Rhythm games are reactive: they follow pre-made tracks. We wanted to flip that. Instead of the player adapting to the music, what if the music adapts to the player? Beat Shooter turns physical movement into both the controller and the composer.
What it does
- Beat Shooter is an AI-powered first-person rhythm shooter that generates music in real time based on player motion.
- An ESP32 equipped with an IMU captures pitch, roll, and acceleration data as the player moves.
- Movement data is streamed to a FastAPI backend, where rhythm timing and shot accuracy are calculated.
- Using ElevenLabs' generative audio capabilities, the system dynamically creates beat-driven soundscapes from live prompts based on gameplay intensity and performance.
- The result is a fully embodied rhythm experience: your motion shapes the music, and your aim locks into the beat.
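The "shot accuracy against rhythm timing" idea can be sketched in a few lines of Python. This is an illustrative sketch, not the project's actual code: the function name, the fixed-BPM beat grid, and the 150 ms timing window are all assumptions.

```python
# Hypothetical sketch: score a shot by its distance to the nearest
# beat on a fixed-BPM grid. BPM and window size are assumed values.

def shot_accuracy(shot_time_s: float, bpm: float = 120.0,
                  window_s: float = 0.15) -> float:
    """Return 1.0 for a perfectly on-beat shot, falling linearly to
    0.0 at the edge of the timing window."""
    beat_period = 60.0 / bpm            # seconds between beats
    offset = shot_time_s % beat_period  # time since the previous beat
    # Distance to the *nearest* beat (previous or next).
    distance = min(offset, beat_period - offset)
    return max(0.0, 1.0 - distance / window_s)

print(shot_accuracy(2.0))   # exactly on a beat at 120 BPM -> 1.0
print(shot_accuracy(2.07))  # 70 ms late -> partial credit
```

The real backend would feed this from timestamped Bluetooth packets rather than wall-clock seconds.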
How we built it
- Beat Shooter runs on an ESP32 for low-latency motion capture via an I²C IMU. Sensor data is transmitted over Bluetooth to a FastAPI server that handles timing logic, hit detection, and intensity mapping.
- Gameplay state is converted into dynamic music prompts that are sent to ElevenLabs for real-time audio generation, producing adaptive tracks that respond to player performance.
- A lightweight frontend renders visual feedback, combo streaks, and accuracy metrics, creating a seamless loop between physical input, AI music generation, and gameplay output.
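The intensity-to-prompt mapping in that pipeline might look like the following sketch. Everything here is an assumption for illustration (the thresholds, the prompt wording, and the idea of deriving intensity from acceleration magnitude); in the real system these functions would sit inside a FastAPI route handler, with the resulting prompt forwarded to ElevenLabs.

```python
# Hypothetical sketch of the backend's intensity mapping. The scaling
# constant and prompt strings are assumptions, not the project's code.

def motion_intensity(accel_g: float) -> float:
    """Clamp acceleration magnitude (in g) to a 0..1 intensity score."""
    return min(abs(accel_g) / 3.0, 1.0)

def intensity_to_prompt(intensity: float) -> str:
    """Map a 0..1 intensity score to a generative-audio text prompt."""
    if intensity > 0.7:
        return "fast aggressive electronic beat, 140 BPM"
    if intensity > 0.3:
        return "driving mid-tempo synth groove, 120 BPM"
    return "calm ambient pad with a soft pulse, 90 BPM"
```

Discretizing intensity into a few prompt tiers, rather than interpolating continuously, helps keep the generated music stylistically stable between updates.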
Challenges we ran into
- Hardware issues: the IMU's accelerometer and gyroscope initially could not send signals to the ESP32 properly.
- Real-time latency: synchronizing physical motion, beat timing, and AI-generated audio required precise buffering and timestamp alignment.
- Sensor noise and drift: we implemented filtering and smoothing to maintain stable motion tracking.
- Dynamic music adaptation: we designed prompt-engineering strategies to ensure musical consistency while still allowing responsive variation.
- Event overload: rapid motion created duplicate triggers, which we solved with edge detection and cooldown logic.
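The smoothing and duplicate-trigger fixes can be illustrated together in a short sketch: an exponential moving average tames sensor noise, and a rising-edge check plus cooldown turns a noisy threshold crossing into a single event. The class name, thresholds, and smoothing factor are assumptions, not the project's actual values.

```python
# Hedged sketch: EMA smoothing + edge detection + cooldown, the
# pattern described above. All constants are illustrative assumptions.

class MotionTrigger:
    def __init__(self, threshold: float = 1.5, cooldown_s: float = 0.2,
                 alpha: float = 0.3):
        self.threshold = threshold
        self.cooldown_s = cooldown_s
        self.alpha = alpha        # EMA smoothing factor (1.0 = no smoothing)
        self.smoothed = 0.0
        self.last_fire = -1e9     # time of the last accepted trigger
        self.was_above = False

    def update(self, accel: float, now_s: float) -> bool:
        """Feed one sample; return True only on a fresh rising edge
        that falls outside the cooldown window."""
        self.smoothed = self.alpha * accel + (1 - self.alpha) * self.smoothed
        above = self.smoothed > self.threshold
        fired = (above and not self.was_above
                 and now_s - self.last_fire >= self.cooldown_s)
        if fired:
            self.last_fire = now_s
        self.was_above = above
        return fired
```

With `alpha=1.0` (smoothing disabled) a burst of crossings inside the cooldown window collapses to one trigger, while a crossing after the window fires again.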
Accomplishments that we're proud of
- Achieved low-latency motion-to-beat synchronization suitable for live play.
- Integrated generative AI audio directly into gameplay using ElevenLabs.
- Built a full-stack hardware-to-AI pipeline combining embedded systems, real-time processing, and adaptive sound generation.
- Delivered a playable demo that transforms movement into music and music into gameplay.
- Designed and built the console, the controller, and the game ourselves.
What we learned
Building Beat Shooter taught us that real-time systems are less about raw speed and more about synchronization. Aligning motion data, beat timing, and AI-generated audio required careful buffering, filtering, and event control to maintain a seamless experience.
We also learned that generative AI is most powerful when it reacts to human behavior — not just inputs, but intensity, rhythm, and intent. When hardware, AI, and sound are tightly integrated, interaction feels less like control and more like collaboration.
What's next for Beat Shooter
Next, we aim to increase mechanical difficulty by introducing depth-based input, enabling users to control aiming via the gun’s distance from the display. This allows precise targeting of on-screen regions through forward/backward displacement rather than lateral movement alone.
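One way to sketch that depth-based mapping is to quantize the gun-to-screen distance into a handful of on-screen depth bands. The range limits and band count below are invented for illustration only; the final design would calibrate them per setup.

```python
# Hypothetical sketch of the proposed depth-based aiming: map the
# gun's distance from the display onto one of several screen bands.
# The 0.3 m / 1.2 m range and band count are assumed values.

def depth_to_band(distance_m: float, near_m: float = 0.3,
                  far_m: float = 1.2, bands: int = 4) -> int:
    """Return a band index in 0..bands-1 for a gun-to-screen distance."""
    # Normalize into 0..1, clamping out-of-range sensor readings.
    t = (distance_m - near_m) / (far_m - near_m)
    t = min(max(t, 0.0), 1.0)
    # Scale to a band index; clamp so t == 1.0 stays in the last band.
    return min(int(t * bands), bands - 1)
```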
We plan to expand difficulty scaling using machine learning to adapt to individual player rhythm patterns in real time.
Long term, we envision Beat Shooter as a modular platform — supporting multiplayer sync, wearable motion tracking, and deeper spatial awareness through computer vision.
Built With
- adafruit-circuitpython
- c
- elevenlabs
- embedded-systems
- esp32
- fastapi
- geminiapi
- imu
- pcb
- python
- react
- solidworks
- typescript

