Music can influence emotions, but what if it could respond to them in real time? Combining neuroscience with AI, we set out to create a system that generates personalized playlists based on brain activity.
The problem with current recommendation systems is that they ignore your mood, relying instead on listening history and genre. This often creates a feedback loop that narrows your listening preferences rather than introducing you to new artists and musicians.
It also reinforces a common misconception that music is simply what determines your mood. But what if it were the other way around? What if we could deliberately use music to amplify positive emotions and counteract negative ones?
Introducing Mind Beats, a hardware-and-software system that uses a Brain-Computer Interface (BCI) to capture EEG signals and processes them with deep learning models, specifically convolutional neural networks. These let us estimate a person's emotional state and build a highly personalized playlist that matches the user's current mood.
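To make the idea concrete, here is a minimal sketch of turning raw EEG into a mood estimate. It is not our trained CNN; it uses simple, well-known heuristics (beta/alpha ratio for arousal, frontal alpha asymmetry for valence) over FFT band powers, with the Muse 2's 256 Hz sampling rate assumed. Band edges and the emotion mapping are illustrative assumptions.

```python
import numpy as np

FS = 256  # assumed Muse 2 sampling rate (Hz)
BANDS = {"theta": (4, 8), "alpha": (8, 13), "beta": (13, 30)}

def band_powers(signal, fs=FS):
    """Mean spectral power per EEG band from a simple FFT periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    return {name: psd[(freqs >= lo) & (freqs < hi)].mean()
            for name, (lo, hi) in BANDS.items()}

def estimate_emotion(left, right, fs=FS):
    """Toy valence/arousal estimate from left/right frontal channels.
    Arousal ~ beta/alpha ratio; valence ~ frontal alpha asymmetry.
    These heuristics stand in for the trained CNN in the real system."""
    lp, rp = band_powers(left, fs), band_powers(right, fs)
    arousal = (lp["beta"] + rp["beta"]) / (lp["alpha"] + rp["alpha"])
    valence = np.log(rp["alpha"]) - np.log(lp["alpha"])
    return {"arousal": arousal, "valence": valence}

# Example: a 10 Hz (alpha-band) sine on both channels gives zero asymmetry.
t = np.arange(0, 2, 1.0 / FS)
calm = np.sin(2 * np.pi * 10 * t)
mood = estimate_emotion(calm, calm)
```

In the full pipeline, the CNN consumes windows of these signals directly rather than hand-picked features, but the band-power view is a useful mental model of what the network learns to exploit.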
Our current prototype uses the Muse 2, a consumer EEG headband available at retail, to capture signals from the brain. In the future, we hope to embed these sensors directly in headphones, which could then detect the signals seamlessly and automatically play the top recommended song.
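Picking the "top recommended song" can be sketched as a nearest-neighbor lookup: tag each track with a mood coordinate and choose the one closest to the detected state. The song titles and their (valence, arousal) tags below are hypothetical placeholders, not our actual catalog.

```python
import math

# Hypothetical mini-library: each song tagged with (valence, arousal) in [-1, 1].
SONGS = {
    "Weightless": (0.6, -0.7),   # calm, positive
    "Uptown Funk": (0.9, 0.8),   # energetic, positive
    "Mad World": (-0.8, -0.5),   # subdued, negative
}

def top_song(mood, library=SONGS):
    """Return the song whose mood tags are nearest (Euclidean) to `mood`."""
    return min(library, key=lambda title: math.dist(library[title], mood))

# A happy, high-energy reading should surface the upbeat track.
pick = top_song((0.8, 0.9))
```

The real system would rank a full streaming catalog rather than three hard-coded entries, but the selection principle is the same.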
One of the biggest challenges was accurately interpreting EEG signals and mapping them to emotional states, given our lack of experience with neuroscience. We spent a great deal of time reading and researching before we could interpret the signals confidently.
We successfully developed a working prototype that demonstrates the feasibility of emotion-based music curation. Our model detects emotional states with high accuracy and adapts playlists dynamically.
We plan to integrate our system with widely used platforms like Spotify and YouTube, reimagining their recommendation engines to redefine how music is discovered and what listening could become.