Inspiration

As international university students in Australia, one of the biggest challenges we face every day is figuring out what to eat for each meal. Cooking for yourself while balancing studies, part-time work, and life in a new country is time-consuming and sometimes stressful. A significant portion of our time goes into deciding what to cook, grocery shopping, preparing meals, and cleaning up afterwards.

More often than not, the mouth-watering recipes we see on social media call for ingredients that aren't commonly found in our sparsely stocked dorm pantries, and buying niche ingredients for a single recipe simply isn't practical or sustainable. Many students, like us, end up wasting both food and money, or defaulting to the same few meals repeatedly.

If only there were a way to turn the ingredients we already have into quick, practical meal ideas. This challenge inspired us to create WhatsLeft, an app that helps students and people of any age discover recipes using what’s already in their pantry, reducing food waste and making everyday cooking easier, faster, and more enjoyable.

What it does

Welcome to WhatsLeft, the ultimate solution to life’s daily dilemma: what should I eat? WhatsLeft helps students and busy individuals make the most of the ingredients already in their pantry, turning them into quick and practical meal ideas while reducing food waste and unnecessary grocery trips.

Unlike traditional recipe apps or meal planners, WhatsLeft uses smart AI-powered tools to track your pantry, suggest recipes based on what you already have, and manage grocery lists seamlessly, all in one convenient app. By combining automation, image recognition, and smart meal planning, WhatsLeft makes cooking effortless, efficient, and sustainable.

Key Features:

1. Smart Recipe Generator: Generate recipes based on the ingredients currently available in your pantry, helping you quickly decide what to cook without buying anything extra.

2. Pantry Scanner: Easily update your pantry by scanning ingredients with the built-in scanner. The app uses image recognition trained on grocery store item datasets to automatically identify items and add them to your pantry inventory.

3. Cookbook: Browse the app’s catalogue of recipes and save your favorites to cook later. The cookbook also includes recipes that call for ingredients not currently in your pantry, letting you explore new meal ideas; missing ingredients can be added directly to your grocery list.

4. Smart Grocery List: Manage your grocery list within the app. When an item is checked off the list after purchase, it is automatically added to your pantry inventory, keeping your pantry up to date.
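The check-off behaviour in the Smart Grocery List can be sketched in a few lines of Swift. This is a hypothetical illustration of the flow, not the actual WhatsLeft source; the type and method names are our own.

```swift
// Illustrative sketch: checked grocery items flow into the pantry.
// Names (GroceryItem, PantryStore, absorbChecked) are hypothetical.
struct GroceryItem {
    let name: String
    var isChecked: Bool
}

struct PantryStore {
    private(set) var items: Set<String> = []

    /// Moves every checked grocery item into the pantry and
    /// returns the remaining (unchecked) grocery list.
    mutating func absorbChecked(from list: [GroceryItem]) -> [GroceryItem] {
        for item in list where item.isChecked {
            items.insert(item.name.lowercased())   // normalize for lookups
        }
        return list.filter { !$0.isChecked }
    }
}
```

Checking "Milk" off a list and calling `absorbChecked` would leave "milk" in `PantryStore.items` while unchecked items stay on the grocery list.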

How we built it

In building WhatsLeft, our design philosophy focused on making meal planning intuitive, approachable, and even enjoyable, especially for international students living away from home. We wanted the app to feel like a personal assistant for your pantry, helping you explore recipes, track ingredients, and manage groceries effortlessly. During the design phase, we used Figma to prototype wireframes, experimenting with clean layouts, easy navigation, and visually clear feedback, so we always knew exactly what to build in Swift via Xcode.

Frontend

WhatsLeft’s interface is built entirely with SwiftUI and Swift, enabling declarative UI development with real-time reactive updates throughout the app. We implemented the MVVM (Model-View-ViewModel) architecture using @StateObject, @ObservedObject, and @Published properties, ensuring a clean separation of concerns between the UI and business logic. To handle asynchronous data flow and reactive state management, we leveraged Swift’s Combine framework, keeping the app responsive and smooth.
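The MVVM shape described above can be sketched as follows. This is a minimal, hypothetical example of the pattern, not the actual WhatsLeft source; the `PantryViewModel` and `PantryView` names are our own.

```swift
import SwiftUI
import Combine

// Illustrative MVVM sketch: the view model publishes state, the view
// observes it. Type names here are hypothetical, not WhatsLeft's real code.
struct Ingredient: Identifiable {
    let id = UUID()
    let name: String
}

final class PantryViewModel: ObservableObject {
    @Published var ingredients: [Ingredient] = []   // view re-renders on change

    func add(_ name: String) {
        ingredients.append(Ingredient(name: name))
    }
}

struct PantryView: View {
    @StateObject private var viewModel = PantryViewModel()   // view owns the VM

    var body: some View {
        List(viewModel.ingredients) { Text($0.name) }
            .toolbar {
                Button("Add") { viewModel.add("Milk") }
            }
    }
}
```

Because `ingredients` is `@Published` and the view holds the model via `@StateObject`, SwiftUI re-renders the list automatically whenever an ingredient is added; Combine drives that update pipeline under the hood.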

Machine Learning & Computer Vision

To make pantry scanning seamless, we trained a custom Core ML image classification model on the GroceryStoreDataset, which includes 5,125 images across 81 supermarket categories. The model is integrated with Apple’s Vision framework for on-device recognition through VNCoreMLRequest. Through iterative training, dataset filtering, and augmentation in Create ML, we achieved 76% validation accuracy. When the model’s confidence falls below 60%, the app falls back to manual ingredient entry, ensuring reliability.
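The confidence gate can be illustrated as a pure function. In the app this logic would live in the VNCoreMLRequest completion handler, where Vision hands back classification observations; here it is reduced to plain Swift (with a hypothetical `Classification` type standing in for Vision's observation values) so the thresholding is easy to follow.

```swift
// Illustrative sketch of the 60% confidence fallback described above.
// `Classification` is a stand-in for Vision's VNClassificationObservation.
struct Classification {
    let label: String
    let confidence: Float   // 0.0 ... 1.0, as Vision reports it
}

/// Returns the top label if it clears the threshold, or nil to signal
/// that the app should fall back to manual ingredient entry.
func acceptedLabel(from observations: [Classification],
                   threshold: Float = 0.6) -> String? {
    guard let top = observations.max(by: { $0.confidence < $1.confidence }),
          top.confidence >= threshold else {
        return nil   // low confidence: prompt the user to type the ingredient
    }
    return top.label
}
```

A classification of ("Milk", 0.82) would be accepted, while a top score of 0.41, or an empty result, would route the user to manual entry.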

API Integration

To fetch real, actionable recipes based on available ingredients, we integrated TheMealDB REST API. Network calls are handled with Swift’s async/await concurrency, and responses are parsed with custom Codable models for robust JSON decoding. This allows WhatsLeft to suggest relevant recipes dynamically, keeping the app practical and useful for everyday cooking.
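A minimal sketch of this path might look like the following. The JSON keys (`meals`, `idMeal`, `strMeal`, `strMealThumb`) follow TheMealDB's published response format; the type and function names are our own illustration, not the actual WhatsLeft source.

```swift
import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking   // URLSession on Linux
#endif

// Codable models mirroring TheMealDB's filter endpoint response.
struct MealResponse: Codable {
    let meals: [Meal]?   // TheMealDB returns null when nothing matches
}

struct Meal: Codable {
    let idMeal: String
    let strMeal: String
    let strMealThumb: String?
}

/// Decodes a TheMealDB response body into a list of meals.
func decodeMeals(from data: Data) throws -> [Meal] {
    try JSONDecoder().decode(MealResponse.self, from: data).meals ?? []
}

/// Fetches meals containing an ingredient, using async/await concurrency.
func fetchMeals(ingredient: String) async throws -> [Meal] {
    let url = URL(string:
        "https://www.themealdb.com/api/json/v1/1/filter.php?i=\(ingredient)")!
    let (data, _) = try await URLSession.shared.data(from: url)
    return try decodeMeals(from: data)
}
```

Separating `decodeMeals` from the network call keeps the Codable decoding testable without hitting the API.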

Challenges we ran into

Going into our first hackathon, we had many doubts and insecurities, as we knew our competitors had much more experience than we did. We quickly realized that developing a fully functioning app came with a steep learning curve we had to overcome.

The first few challenges we came across involved the pantry scanner. The camera couldn't detect an object even when it was clearly in view, and identification accuracy was poor: at one point the scanner recognized a jug of milk as diapers. The low accuracy stemmed from insufficient training data, so we trained a custom image classification model on grocery store datasets from the internet and achieved 76% accuracy.

The final issue we encountered was with the display of recipes. Some recipes include optional ingredients, but the app was treating them as required. As a result, if an optional ingredient was missing from the pantry, the recipe would not appear in the recipe tab, even though it should have.
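One way to address this, sketched below with illustrative names (not the actual WhatsLeft source), is to filter recipes on required ingredients only, so a missing optional ingredient no longer hides a recipe.

```swift
// Hypothetical sketch: a recipe is shown when every *required* ingredient
// is in the pantry; optional ingredients are ignored by the filter.
struct RecipeIngredient {
    let name: String
    let isOptional: Bool
}

struct Recipe {
    let title: String
    let ingredients: [RecipeIngredient]
}

/// Returns the recipes the user can cook with the current pantry.
func cookableRecipes(_ recipes: [Recipe], pantry: Set<String>) -> [Recipe] {
    recipes.filter { recipe in
        recipe.ingredients
            .filter { !$0.isOptional }                        // required only
            .allSatisfy { pantry.contains($0.name.lowercased()) }
    }
}
```

With this filter, an omelette recipe that lists chives as optional still appears when the pantry only contains eggs.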

Accomplishments that we're proud of

Turning a daily frustration into a working app at our very first hackathon. We developed WhatsLeft to tackle a problem we face every day as international students: figuring out what to cook with limited pantry ingredients. The app can generate meaningful, practical recipes from what’s already in your pantry, and our AI-powered pantry scanner accurately identifies grocery items using image recognition trained on real-world datasets. In a world full of generic recipe apps, we’re proud that our idea is unique, student-focused, and fully functional, solving a challenge we personally experience as international university students living away from home. This is a product we would genuinely use ourselves, and we can see it having a tangible impact in reducing food waste, saving money, and simplifying daily cooking.

Our UI and design create a friendly, approachable environment. From the clean layout to intuitive features like the cookbook, pantry scanner, and smart grocery list, WhatsLeft makes meal planning feel easy and even enjoyable. Students can browse recipes, discover new meals, and add missing ingredients to their grocery list effortlessly. We focused on a design that encourages exploration and confidence, helping users feel in control of their cooking and pantry management.

On a team level, we’re proud of how we collaborated. For our very first hackathon, our small team navigated the full stack, from brainstorming the concept to integrating the back-end AI with the front-end UI. We faced challenges, experimented with solutions, and celebrated small wins together. Seeing WhatsLeft come to life in real time, especially as we tested recipes and scanned pantry items ourselves, has been both exciting and rewarding. This project reflects not only our technical skills and creativity but also the teamwork, problem-solving, and dedication we developed as first-time hackathon participants.

What we learned

Our first hackathon was a crash course in both technical skills and teamwork, and we came away with lessons in several key areas:

Learning new technologies: For many of us, this was our first time coding with Swift and SwiftUI in Xcode. We had to quickly grasp the basics of declarative UI development, reactive state management with Combine, and the MVVM architecture. At the same time, we learned how to design interfaces in Figma and translate them into a functional SwiftUI layout, bridging the gap between design and code.

Building smart, data-driven features: We gained hands-on experience connecting APIs to applications, fetching live data to make our app practical and dynamic. We also experimented with machine learning by creating a custom model using the GroceryStoreDataset, which allowed the app to recognize ingredients and, together with the recipe API, suggest meals intelligently. These experiences gave us insight into how AI and databases can power real-world applications.

Collaboration and adaptability: Working in a small team under tight deadlines forced us to combine our individual skills while being open to learning new ones. From coding, debugging, and testing, to integrating APIs and training ML models, every member stepped out of their comfort zone to contribute meaningfully.

Passion and perseverance: Above all, we learned that passion drives results. Late nights debugging the pantry scanner, refining the UI, and testing recipes were made possible because we believed in WhatsLeft and the difference it could make for students’ daily lives.

Through this experience, we not only learned new technical skills but also gained confidence in tackling complex problems, building fully functional apps from scratch, and collaborating effectively as a team.

What's next for WhatsLeft

We have ambitions to fully integrate the recipe API behind WhatsLeft's suggestions. While we have experimented with it, our initial implementation still has bugs and limitations. With full integration, the app could generate a much broader range of recipes, giving users more options and a more flexible cooking experience.

Another key improvement is adding user preference learning. We envision the app tracking which types of cuisine a user usually cooks, such as Western, Asian, or vegetarian recipes, and adapting suggestions accordingly. We could implement this with machine learning models that cluster user preferences and predict which recipes they are more likely to choose. Over time, WhatsLeft would suggest recipes based not only on pantry contents but also on user preferences, making the app more personalized.

Built With

swift, swiftui, combine, core-ml, create-ml, vision, xcode, figma, themealdb-api
