Gallery:
- An overview of grocerEasy
- How a user can add their items to a shopping list
- How the user can select an appointment
- The homepage for the grocerEasy application
- Angle 1 of our Technology for Easy Retail Interface (TERI). Here you can see the RFID scanner, which detects the tag attached to the sweets.
- The buzzer sounds when the tag is close. This replicates what would happen if a user was approaching items on their shopping list.
- Some screenshots of our code
- The schema of our database
- Our B2B business model
- How to transform a shop for a grocerEasy session
Inspiration
The theme for our hackathon was "smart cities." While developing ideas, we decided to create grocerEasy, a technical initiative that would initially help people with visual impairments shop and could later be extended to people with other disabilities. After discussing our idea with a carer at Hightown Housing Association, we learned that the main problems a visually impaired person faces while shopping are navigating the store and identifying the products they have selected. We therefore focused on creating a shopping initiative where navigation and product selection were the main aspects we took into consideration.
What it does
We partner with high street supermarkets, who set aside one hour a day for grocerEasy shoppers.
Customers first use our app to build a shopping list of the products they need, then book an appointment to shop via the grocerEasy desktop app. When the user enters the shop, they are given a device named TERI, which alerts them when they are near a product on their shopping list. The shop implements a one-way navigation system using textured flooring to keep the user moving in a specific direction and prevent collisions with objects in the shop. For future implementation, we will create a text-to-speech navigation tool that helps the user navigate through the shop by telling them which aisle they are currently in and which they are moving towards.
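A minimal sketch of the data behind the flow described above: a shopping list plus a booked appointment, serialised as JSON. The field names and values here are our own illustration, not the app's actual schema.

```python
import json

# Hypothetical grocerEasy session: a shopping list plus a booked
# in-store appointment (all names and values are illustrative).
session = {
    "shopper": "example_user",
    "shopping_list": ["oat milk", "wholemeal bread", "sweets"],
    "appointment": {
        "store": "High Street Supermarket",
        "slot": "2024-01-15T09:00",  # the store's daily grocerEasy hour
    },
}

# The prototype keeps its data in JSON files, so the session must
# survive a round trip through JSON serialisation.
encoded = json.dumps(session, indent=2)
decoded = json.loads(encoded)
print(decoded["shopping_list"])  # ['oat milk', 'wholemeal bread', 'sweets']
```

In the desktop app, a structure like this would be written out when the appointment is booked and read back by TERI's matching logic at the store.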
How we built it
We built our shopping list user interface in Qt Designer, converted it into Python code, and then added functionality to the buttons in Python. We stored the data about products, shopping lists and appointments in JSON files. To build TERI, we used an agile methodology, building the system with Arduinos and the C programming language. Our first prototype 'detected' an item using a tilt sensor; the second prototype improved on this with an RFID scanner, which detects an RFID tag attached to an item.
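The detection logic on the real device runs in C on the Arduino; the sketch below is a Python simulation of the same idea, comparing a scanned RFID tag ID against the shopping list and deciding whether to sound the buzzer. The tag IDs and the tag-to-product mapping are invented for illustration.

```python
# Hypothetical mapping from RFID tag IDs to products; on the real
# device these IDs come from the RFID scanner attached to the Arduino.
TAG_TO_PRODUCT = {
    "04A1B2C3": "sweets",
    "04D4E5F6": "oat milk",
}

def should_buzz(tag_id, shopping_list):
    """Return True if the scanned tag belongs to a product on the list."""
    product = TAG_TO_PRODUCT.get(tag_id)
    return product is not None and product in shopping_list

shopping_list = ["sweets", "bread"]
print(should_buzz("04A1B2C3", shopping_list))  # listed product -> True
print(should_buzz("04D4E5F6", shopping_list))  # product not on list -> False
```

On the Arduino, the equivalent check runs in the main loop each time the scanner reports a tag, driving the buzzer pin on a match.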
Challenges we ran into
We had to think very carefully about how best to design our systems for visually impaired users; to help us, we conducted interviews with people who had worked with visually impaired people. It was surprisingly difficult to add images with Qt Designer. We initially found it challenging to work out how best to store our data in a normalised way. We also found it very difficult to set up the emergency button on our Arduino to begin with, and some components, such as the RFID scanner, did not have much documentation.
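One way to picture the normalisation challenge: store each product once under an ID and have shopping lists reference those IDs, rather than duplicating product details in every list. The structures below are an illustrative sketch, not our actual JSON files.

```python
# Products stored once, keyed by ID (illustrative data).
products = {
    "p1": {"name": "sweets", "price": 1.20},
    "p2": {"name": "oat milk", "price": 1.80},
}

# Shopping lists reference product IDs instead of repeating details,
# so a price or name change only has to be made in one place.
shopping_lists = {
    "example_user": ["p1", "p2"],
}

def resolve(user):
    """Expand a user's list of product IDs into product names."""
    return [products[pid]["name"] for pid in shopping_lists[user]]

print(resolve("example_user"))  # ['sweets', 'oat milk']
```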
Accomplishments that we're proud of
We are proud of designing a prototype user experience in PyQt and optimising the design for accessibility. We are also proud of building a proof-of-concept embedded-systems prototype, using an Arduino Uno and an RFID scanner, that detects the presence of an RFID-labelled product, and of creating a 3D mockup of the layout of a store using grocerEasy.
What we learned
We gained an understanding of the challenges faced by those with visual impairments as they navigate day-to-day life and grocery shopping. In building a solution, we learned about the principles of good, accessible application design, including colour and font schemes. We learned how to create an optimised and efficient database schema. We also learned about new technologies such as the RFID scanner, and the APIs required to work with them.
What's next for grocerEasy
For future implementation, we are planning to integrate audio-guide functionality so that the user knows which products they have scanned. We will also introduce turn-by-turn guidance that tells the user which shopping aisle they are in and which aisles they are walking towards, to aid navigation. We will also implement a call-for-assistance button that sends a message across the network to a staff member in the shop, so the customer can get help when they need it.