36 Hours, 4 Apps, and One Crazy Idea ⚡️

It’s 4 AM. Our eyes are burning, the coffee machine is making a weird noise, and our main Kafka consumer just crashed for the third time.
"Maybe we should just mock the data," someone suggests.
It’s a tempting suggestion. We have 4 hours until submission. But we didn't come to GlobeStrat'25 to build a slide deck. We came to build something real.
Last weekend, our team took home 1st place in the Map in Motion track at ASU's GlobeStrat hackathon. We didn't just build an app; we built an entire connected ecosystem for the retail experience—from a customer's shopping list to the manager's inventory dashboard.
Here's the technical deep dive into how we pulled it off (and why we almost didn't).
The Problem: Retail is Disconnected 🧩
The "Map in Motion" challenge asked participants to rethink interactions within a store.
Most "smart retail" solutions are just isolated gimmicks—a QR code scanner here, a digital map there. But they don't talk to each other. If I pick up the last gluten-free bread, the inventory system doesn't know until I check out. If a manager sees a spill, they have to radio a janitor rather than flagging it on a digital twin.
We realized the real opportunity wasn't another app; it was synchronization.
The Solution: A Connected Ecosystem 🕸️
We architected a four-part system where data flows seamlessly between stakeholders.
1. The Nervous System: Apache Kafka
Instead of building a monolithic backend, we used Kafka to handle real-time state. Every action—picking up an item, restocking a shelf, flagging an issue—is an event. This allowed our four independent apps to stay perfectly in sync without tight coupling.
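To make that concrete, here's roughly what one of our event producers looked like. This is a minimal sketch assuming the kafka-python client; the topic name and event fields are illustrative, not our exact schema.

```python
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Every app emits small, plain events; the other apps just subscribe
# to the topics they care about.
producer.send("store-events", {
    "type": "ITEM_PICKED",
    "sku": "DOR-NACHO-12OZ",
    "shelf": "A4",
    "qty_delta": -1,
})
producer.flush()
```

The scanner, the customer app, and the dashboard are all just consumers of the same topics, which is what kept them in sync without knowing about each other.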
2. The Eyes: Employee CV Scanner (OpenCV + Gemini)
This was my favorite part to build. We wanted employees to check inventory just by pointing their phone at a shelf.
We ran into a huge issue: on-device object detection models were too limited, but sending full images to the cloud was too slow.
The Fix: We built a hybrid pipeline. We use OpenCV on the device to detect changes and regions of interest, and then pipe those frames to Google's Gemini 2.5 Flash API for detailed identification.
# The "Aha!" moment: Sending only the relevant crop to Gemini
def identify_product(frame, bbox):
# Crop to the bounding box to reduce token usage and latency
cropped = crop_frame(frame, bbox)
# Analyze with Gemini
response = model.generate_content([
"Identify this product SKU and condition. Return JSON.",
cropped
])
return parse_json(response.text)
It felt like magic when it finally worked. You point the camera, and boom—"Spicy Nacho Doritos, 12oz, Low Stock."
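For the curious, the OpenCV half of the pipeline is basic frame differencing. Here's a stripped-down sketch; the blur kernel, threshold, and minimum area are illustrative, not our tuned values.

```python
import cv2

def regions_of_interest(prev_gray, frame, min_area=2000):
    # Diff the current frame against the previous one to find changed shelf regions
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (21, 21), 0)
    delta = cv2.absdiff(prev_gray, gray)
    thresh = cv2.threshold(delta, 25, 255, cv2.THRESH_BINARY)[1]
    thresh = cv2.dilate(thresh, None, iterations=2)
    contours, _ = cv2.findContours(thresh, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    # Keep only regions big enough to plausibly be a product; return their bounding boxes
    boxes = [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) >= min_area]
    return gray, boxes
```

Only those boxes ever get sent to identify_product, which is what kept latency and API costs sane.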
3. The Guide: 3D Customer Navigation
For the customer app (built with React Native/Expo), we didn't want a flat map, so we rendered the store in 3D and ran a pathfinding algorithm to route customers through the aisles based on their shopping list.
If you have "Milk" and "Cookies" on your list, it doesn't just show you where they are; it draws the optimal path to get both.
4. The Brain: Manager Dashboard
Finally, the dashboard aggregates everything. It’s a React web app that visualizes the store's "health."

We implemented a "heatmap" feature that shows where customers are congregating. This isn't just cool to look at; it helps managers open new checkout lanes before the lines get long.
The "Oh No" Moments 😅
Hackathons are never smooth.
Hour 12: Referencing localhost works great until you try to demo on four different devices on a university Wi-Fi network. We lost about 2 hours setting up ngrok tunnels and hardcoding IP addresses.
Hour 28: Our 3D map engine was consuming so much memory it crashed the Expo Go app on older iPhones. We had to drastically simplify the mesh complexity of our store model.
Why We Won 🏆
I think we won for three reasons:
- Scope Ambition: We didn't build one app; we built four. It was risky, but it showed we understood the system, not just the user.
- Real AI Utility: We didn't just slap a chatbot on it. We used AI for computer vision tasks that were previously impossible to do cheaply.
- The Live Demo: When presentation time came, we didn't use a video. We pulled out our phones and did it live. The judges saw the inventory count update on the dashboard the second we scanned the item. That real-time factor sold it.
What's Next?
We're open-sourcing the CV module because we think the hybrid OpenCV/LLM approach has legs beyond just retail.
If you're interested in the code or want to chat about hackathon strategies, hit me up on Twitter/X.
Now, I'm going to finally get that sleep. 😴