How might we use augmented reality to help people make better decisions at the grocery store?
Would it have changed my behavior this morning to know how much sugar was inside that sticky pecan bun? Maybe not. Sometimes we want what we want, but sometimes we can be nudged.
What if I could see nutrition and allergy information overlaid on food before buying and eating it? This article is about building that augmented reality future using the phone camera in your hand to influence and simplify the hundreds of decisions made every time we shop for food.
Prototyping the AR glasses future with today’s phones
My book, SuperSight, explains how AR glasses provide decision support and guidance, like a coach sitting on your shoulder throughout the day. In the same way that we’ve come to rely on GPS, these glasses will offer the navigational equivalent for food, work, and DIY projects, and even conversational guidance.
Most people aren’t wearing smart glasses yet, but we can use smartphones to prototype this inevitable future. Apps with computer vision help us identify plants, land virtual IKEA couches in our living rooms, and add bunny ears to our selfies. Snapchat announced last week at the Augmented World Expo that 250 million people use AR on their phones every day. Let’s put this technology to use for something more important than barfing rainbows!
Modern phones can also “read” food packaging, then show you information that helps you make a better choice. This micro-decision support may be one of the best uses of AR. With a little Candy Crush-style gamification, making pro-health choices might actually be fun.
Families face a bewildering minefield of decisions every week about what to buy and eat. I met Ruchi S. Gupta, MD, a rock star in the food allergy world who directs a research group at Northwestern. She explained that choices about food are multilayered and complex. They feel inconsequential in the moment, but their consequences are enormous. The food we buy shapes how we snack, how much time we spend preparing meals, and whether we decide to order out. It affects our focus, how we feel, and how we connect. Food is the second-largest household budget item behind housing, which we don’t reconsider daily. Nudging consumer behavior around food could help people make changes that improve both their moods and their waistlines.
Ruchi encouraged me to experiment with how augmented reality might be helpful. I worked with one of her graduate students, an AR designer, and a programmer to prototype a new service called Better Choice. We tested it at a Whole Foods Market in Brookline, MA.
The augmented reality filter decorates the package with data.
Here’s the experience: You pick up a box of granola. The phone recognizes the package in your hand, looks up its nutritional data, compares it with your profile, and summarizes the key information with five colorful icons representing how well it matches: allergens, nutrition, financial value, customer ratings, and a sustainability metric. When you tap the “Better Choice” button, it visually swaps what you are holding for the best product in that food category, based on what you’ve indicated in your profile (more fiber, no shellfish, nut allergy, etc.).
To make the interface simple and persuasive, we select the most compelling points of comparison and reasons to take our advice. When you’re holding a package of granola cereal, it displays a gluten-free, locally produced, highly rated alternative made in Vermont and available a couple of shelves down in the store.
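For readers curious how the five-icon comparison might work under the hood, here’s a minimal sketch in Python. The field names, thresholds, and product data are invented for illustration; they aren’t taken from our actual prototype.

```python
# Hypothetical sketch of the scoring step: compare one product's data
# against a shopper profile and produce a pass/fail status for each of
# the five comparison icons. All names and numbers are illustrative.

PROFILE = {
    "allergens_to_avoid": {"peanuts", "shellfish"},
    "max_price_per_serving": 0.75,
    "min_rating": 4.0,
}

def score_product(product: dict, profile: dict) -> dict:
    """Return a status for each icon: allergens, nutrition, value, ratings, sustainability."""
    return {
        "allergens": not (set(product["allergens"]) & profile["allergens_to_avoid"]),
        "nutrition": product["added_sugar_g"] <= 10,  # a crude stand-in for a real nutrition score
        "value": product["price_per_serving"] <= profile["max_price_per_serving"],
        "ratings": product["rating"] >= profile["min_rating"],
        "sustainability": product["locally_produced"],
    }

granola = {
    "allergens": ["tree nuts"],
    "added_sugar_g": 8,
    "price_per_serving": 0.60,
    "rating": 4.4,
    "locally_produced": True,
}

icons = score_product(granola, PROFILE)
```

In the app, each boolean would drive the color of one overlay icon; the same scores, computed for every product in the category, would also pick the swap-in suggestion.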
Glanceability is the killer app
Aggregating information about millions of consumer packaged goods (their allergens, nutritional data, prices, and availability) is essential to drive this experience. My friend David Goodtree at FoodMap is working on this big data-fusion problem. But for our prototype, we focused on the shopper experience: how to synthesize and express data in a glanceable and actionable way.
There are a million websites, blogs, YouTube channels, and dense food labels that offer information to families. But when you’re standing in the aisle at the market, blocking the way for other shoppers, with an impatient toddler, you need advice fast or not at all. Even reading food labels can be impractical while shopping. Our goal is to summarize trustworthy information to help shoppers make informed decisions quickly.
The need for personalization
My mom is gluten-free; my daughter is vegan; my wife is pescatarian; I’m looking for low-carb, low-salt, high-protein foods; and friends we entertain keep kosher. Other families have even more fine-grained filters. This web of requirements is complicated for humans to track while shopping, but easy for algorithms.
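That web of requirements boils down to simple set logic, which is why it’s so easy for software. Here’s an illustrative sketch, assuming each family member’s diet is reduced to a set of ingredient tags to avoid; the diets and tags are made-up examples, not data from the prototype.

```python
# Illustrative sketch: check one product against every family member's
# dietary restrictions at once. Each diet is modeled as a set of
# ingredient tags that member must avoid.

FAMILY = {
    "mom": {"gluten"},                      # gluten-free
    "daughter": {"meat", "dairy", "eggs"},  # vegan
    "me": {"high-sodium"},                  # watching salt
}

def who_can_eat(product_tags: set, family: dict) -> set:
    """Return the family members whose restrictions the product satisfies."""
    return {member for member, avoid in family.items()
            if not (product_tags & avoid)}

# A wheat cereal contains gluten, so mom is ruled out:
ok_for = who_can_eat({"gluten"}, FAMILY)
```

One set intersection per person, and the algorithm tracks the whole household effortlessly, something no shopper can do in their head across a cart full of products.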
Tradeoffs and multi-channel experiences
Not a huge surprise: it’s expensive to buy local. Better Choice reveals the most sustainable option, but this is typically the more expensive one, unless you buy in bulk.
Might people be interested in bulk purchases or a subscription, given enough information to feel confident in such a choice? Who wants to haul dog food or a big bag of flour, rice, or other heavy packages, especially when the bulk value equation is more attractive?
A more sophisticated form of computer vision uses scene understanding to perform the inverse of augmented reality: it can recognize and discreetly remove objects. In cluttered environments like a store, this diminished reality technique may be more valuable than augmented reality.
For example, we might remove all the items from your field of view that don’t fit your BetterChoice profile so that anything remaining on the shelf represents a decent match.
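Stripping out the vision and rendering, the selection logic behind that diminished-reality filter is just the profile check run across a whole shelf. A minimal sketch, with hypothetical product names and a simple allergen-only profile:

```python
# Minimal sketch of the diminished-reality filter: given everything
# recognized on a shelf, keep only the items that pass the shopper's
# profile. In a real system the hidden items would be visually painted
# out of the camera feed; here we just compute which ones survive.

def visible_items(shelf: list, avoid: set) -> list:
    """Return the names of products whose allergen tags avoid the shopper's no-go set."""
    return [item["name"] for item in shelf
            if not (set(item["allergens"]) & avoid)]

shelf = [
    {"name": "Peanut Granola", "allergens": ["peanuts"]},
    {"name": "Maple Oat Granola", "allergens": []},
    {"name": "Shrimp Crackers", "allergens": ["shellfish"]},
]

remaining = visible_items(shelf, avoid={"peanuts", "shellfish"})
```

Everything not in `remaining` would be rendered out of view, so whatever is still visible on the shelf is a decent match.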
After we built our prototype, we went to Whole Foods and got people’s feedback.
Here’s what we learned.
- People are overwhelmed while shopping and want guidance.
- Allergens were the most urgent issue.
- People thought they would use such a tool, and that they would have certain things delivered if they were confident it was financially smart to buy in bulk.
- It was important for people to know that product information came from a trusted source, not a paid promotion.
- The idea of getting a free sample to persuade you to try something new was intriguing.
Our next step is to expand the product categories, improve the BetterChoice algorithm, then deploy a broader test across more stores and geographies.
Which product brands would benefit most from shopper guidance in AR?
Products with the best data, the best nutrition, customer ratings, and financial value; products that align with people’s interests where it’s hard to see that alignment today; and new brands. Because we’re promoting products based on their inherent attributes rather than brand recognition, the big brands may have the most to lose.
I wonder whether such a tool would differentiate the shopping experience enough for people to choose one grocery store over another.