Inspiration
We built OpenBy to solve a universal problem: decision paralysis in volatile markets. Whether buying tech products or trading stocks, consumers are bombarded with noise—price fluctuations, varied review sentiments, and social media hype. We wanted to move beyond simple price tracking to answer the real question: Is now the right time to buy? Our goal was to fuse AI predictions with probabilistic insights to help users buy smarter, not sooner.
What It Does
OpenBy is an intelligent decision engine that forecasts future price movements and calculates the probability of hitting your target price. It aggregates three distinct layers of data into a single, actionable "Buy Score":
- Market Data: Historical price trends and volatility.
- Social Sentiment: Virality and engagement metrics from social platforms.
- News Context: Sentiment analysis of relevant news cycles.
The result is a dynamic dashboard where users can see predicted price horizons, risk levels, and a clear "Go/No-Go" signal for their purchase or investment.
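The fusion of the three layers into one score can be sketched as a weighted combination of normalized signals. The weights and the Go/No-Go threshold below are illustrative assumptions, not OpenBy's actual tuned values:

```python
# Sketch: fuse three signal layers into a single "Buy Score".
# Weights and threshold are assumed for illustration.

def buy_score(market: float, sentiment: float, news: float) -> float:
    """Each input is a signal already normalized to [0, 1]."""
    weights = {"market": 0.5, "sentiment": 0.3, "news": 0.2}  # assumed weights
    return (weights["market"] * market
            + weights["sentiment"] * sentiment
            + weights["news"] * news)

def signal(score: float, threshold: float = 0.6) -> str:
    """Map a score to the dashboard's Go/No-Go label (threshold assumed)."""
    return "Go" if score >= threshold else "No-Go"

score = buy_score(market=0.8, sentiment=0.5, news=0.4)
print(round(score, 2), signal(score))  # 0.63 Go
```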
How We Built It
We engineered a multi-stage pipeline to turn raw noise into a clear signal:
- Data Aggregation: We built scrapers and connectors to pull historical price data, social engagement metrics, Google Trends volume, and news sentiment streams.
- AI Layer: We trained a custom model using PyTorch to predict expected log-returns, weighting the input from our multi-signal index.
- Probabilistic Engine: Instead of a simple point prediction, we modeled uncertainty using non-linear variance growth. This allows us to output the probability of a price crossing a specific threshold (e.g., "75% chance this drops below $100 in 3 days").
- Visualization: We used Matplotlib and Plotly to render dynamic confidence intervals, price horizons, and heatmaps that non-technical users can understand at a glance.
Tech Stack: Next.js, Python, FastAPI, pandas, numpy, scikit-learn, PyTorch, Matplotlib/Plotly.
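The probabilistic step above can be sketched as a Gaussian log-price model whose standard deviation grows as a power of the horizon (variance proportional to t^(2H)), which then yields the probability of crossing a price threshold. The drift, volatility, and growth exponent here are illustrative assumptions, not our fitted parameters:

```python
# Sketch of the probabilistic engine: convert a predicted log-return drift
# plus non-linear variance growth into a threshold-crossing probability.
# mu, sigma, and the growth exponent h are assumed values for illustration.
import math

def prob_below(price0: float, threshold: float, t_days: float,
               mu: float, sigma: float, h: float = 0.7) -> float:
    """P(price at horizon t < threshold) under a Gaussian log-price model
    with mean mu*t and standard deviation sigma * t**h
    (h > 0.5 means faster-than-Brownian uncertainty growth)."""
    mean = mu * t_days
    std = sigma * t_days ** h
    z = (math.log(threshold / price0) - mean) / std
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF

# e.g. the chance a $110 item drops below $100 within 3 days,
# given a slightly negative predicted drift
p = prob_below(price0=110, threshold=100, t_days=3, mu=-0.01, sigma=0.05)
print(f"{p:.0%} chance of dropping below $100 in 3 days")
```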
Challenges We Ran Into
- Data Normalization: Merging data with vastly different scales (e.g., tweet volume vs. stock price) and handling missing timestamps required rigorous preprocessing.
- Modeling Uncertainty: Predicting rare, high-impact events (black swans) is difficult. We had to carefully tune our long-term variance models to avoid overconfidence.
- Signal Fusion: Combining numerical data (price) with textual/semantic data (news/sentiment) into a single scalar index was a complex balancing act.
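The normalization challenge above boils down to aligning series sampled on different clocks and rescaling them so that tweet counts and dollar prices become comparable. A minimal pandas sketch, with column names and sample values invented for illustration:

```python
# Sketch: align mixed-scale signals on a shared daily grid, fill gaps,
# and z-score each column. Column names and data are illustrative.
import pandas as pd

def normalize_signals(frames: dict[str, pd.Series]) -> pd.DataFrame:
    df = pd.DataFrame(frames)
    df = df.resample("D").mean()        # shared daily grid
    df = df.ffill()                     # carry last observation over gaps
    return (df - df.mean()) / df.std()  # z-score per column

idx = pd.date_range("2024-01-01", periods=5, freq="D")
signals = {
    "price": pd.Series([110.0, 112.0, 111.0, 115.0, 114.0], index=idx),
    "tweets": pd.Series([1200, 3400, 800, 5000, 2600], index=idx),
}
print(normalize_signals(signals).round(2))
```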
Accomplishments We're Proud Of
Built With
- fastapi
- claude
- matplotlib
- next.js
- numpy
- pandas
- python
- pytorch
- typescript