Inspiration

The main inspiration for this project was an accumulation of X (formerly Twitter) posts I read in the days leading up to the hackathon (https://x.com/harrris0n/status/2014197314571952167?s=46), watching companies focused on human simulation raise millions of dollars (Artificial Societies & Simile.ai), combined with real-time data networks built from publicly available sources such as Waze, flight, and maritime data.

What it does

At its core, Sequoia is a map-based interface that lets users understand human movement within a city-specific view. It aggregates signals like Waze traffic reports, Google Popular Times data, live aircraft tracking data, maritime vessel data, and public transportation networks into a single web-based view. On top of that, we layered human simulations: an orchestrator spawns sub-agents that interact with the real-world data, letting the user watch simulated humans move through the city.
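The orchestrator-and-sub-agents pattern can be sketched roughly as below. This is a minimal, hypothetical illustration: the real system drives agent decisions through the Anthropic SDK, whereas this sketch replaces that with simple deterministic movement toward a goal, and all class and field names here are made up for the example.

```python
import random
from dataclasses import dataclass


@dataclass
class SimAgent:
    # Hypothetical sub-agent: a simulated human with a position and a destination.
    lat: float
    lon: float
    goal: tuple

    def step(self):
        # Move a fraction of the remaining distance toward the goal each tick.
        self.lat += (self.goal[0] - self.lat) * 0.1
        self.lon += (self.goal[1] - self.lon) * 0.1


class Orchestrator:
    """Spawns sub-agents around a city center and advances them in ticks."""

    def __init__(self, city_center, n_agents=5, seed=42):
        rng = random.Random(seed)
        lat, lon = city_center
        self.agents = [
            SimAgent(
                lat + rng.uniform(-0.01, 0.01),
                lon + rng.uniform(-0.01, 0.01),
                goal=(lat + rng.uniform(-0.01, 0.01),
                      lon + rng.uniform(-0.01, 0.01)),
            )
            for _ in range(n_agents)
        ]

    def tick(self):
        # One simulation step; returns positions for the map layer to render.
        for agent in self.agents:
            agent.step()
        return [(a.lat, a.lon) for a in self.agents]
```

In the real app, each tick's positions would be pushed to the MapLibre layer so the simulated agents render alongside the live data feeds.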

How we built it

react-globe.gl + Three.js for the spinning-globe landing page; MapLibre GL for city-specific mapping; Next.js 16 with the App Router to reach server-side functions and call APIs without client-side rate limiting; WebSockets to stream real-time data such as maritime positions; the Anthropic SDK for agentic human simulations; Python CSV parsers and use-case-specific scrapers; and plenty of REST API calls for the remaining functions.
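As one example of the Python CSV-parsing side of the stack, a feed parser can be a few lines of stdlib code. This is a hedged sketch: the sample columns below are invented to resemble a maritime position export, and real AIS/vessel feeds will have different field names.

```python
import csv
import io

# Hypothetical sample resembling a maritime CSV export; real feeds differ.
SAMPLE = """mmsi,lat,lon,speed_knots
366999712,37.8044,-122.2712,12.4
367112890,37.7955,-122.3937,0.0
"""


def parse_vessels(text):
    """Parse a maritime CSV dump into typed records for the map layer."""
    reader = csv.DictReader(io.StringIO(text))
    vessels = []
    for row in reader:
        vessels.append({
            "mmsi": row["mmsi"],
            "position": (float(row["lat"]), float(row["lon"])),
            # Treat anything under ~0.5 knots as stationary for rendering.
            "moving": float(row["speed_knots"]) > 0.5,
        })
    return vessels
```

Each source got its own small parser like this, so a malformed feed only breaks one layer of the map instead of the whole pipeline.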

Challenges we ran into

One of the things I ran into was rate limiting on specific APIs. Originally I was using OpenSky to get real-time air traffic data but was limited to only 1 request per 10 seconds, and I quickly found out that once you send too many requests you get put on a really long ban, so I had to pivot to Airplanes.live. Another challenge was the implementation of the globe, which was tricky, but I was able to overcome it with some YouTube videos and public GitHub repos.

Accomplishments that we're proud of

One of the things that I am most proud of is having a constant stream of data flowing into the system at any given time. With 10 different data sources, each with its own rate limits, data shapes, and protocols, building out each pipeline individually let me focus on the sources one by one and still end up with a single polished product.
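One pattern that makes ten differently-shaped feeds manageable is giving each source a tiny adapter that maps its payload into one shared record shape before it hits the map. This is an illustrative sketch with invented field names, not the project's actual schema.

```python
# Each source gets a small adapter that converts its native payload into
# one common record shape the map layer understands. Field names are
# illustrative, not the real feeds' schemas.

def from_flight(p):
    return {"kind": "aircraft", "id": p["hex"], "lat": p["lat"], "lon": p["lon"]}


def from_vessel(p):
    return {"kind": "vessel", "id": str(p["mmsi"]),
            "lat": p["latitude"], "lon": p["longitude"]}


ADAPTERS = {
    "flights": from_flight,
    "maritime": from_vessel,
    # ...one adapter per source
}


def normalize(source, payloads):
    """Convert a batch of raw payloads from `source` into common records."""
    adapt = ADAPTERS[source]
    return [adapt(p) for p in payloads]
```

With this shape, adding an eleventh source means writing one adapter function instead of touching the rendering code.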

What we learned

One of my goals for this hackathon was to explore other ways to collect and aggregate data for free. My main languages are Python and C++, so I work with low-latency systems and data pipelines a lot, but building data scrapers in other languages was quite enjoyable.

What's next for Sequoia

I don't know, I guess you will have to stay tuned and find out.
