📲 Scanning the Real World: How We Built a 3D Factory in Your Pocket
Date: 2023
Project: Incarnate – AI-powered 3D model scanning on iOS
Tagline: From camera roll to digital twin — no lidar required
3D content was everywhere — in AR try-ons, e-commerce product pages, virtual showrooms.
But creating 3D models? That was still stuck in the stone age (ironically).
Studios needed lidar rigs and photogrammetry setups — expensive, fragile, and totally out of reach
Outsourcing meant waiting days or weeks per model and shelling out thousands
Existing tools were clunky, confusing, or just not made for everyday people
Worst of all?
Some of the most in-demand categories, like food and botanicals, were nearly impossible to model manually at all.
We didn’t need another 3D editor. We needed an AI-powered scanner that just worked — like magic.
I wanted to make something so seamless that it didn’t feel like scanning at all.
Just point your iPhone camera at a product, follow a few intuitive steps, and — boom — you’ve got a photorealistic, ready-to-use 3D model.
It had to work for:
A ring 💍 and a wardrobe 🚪
1 product or 50 per day
Experts and first-timers
And ideally... someone scanning a donut 🍩 in the middle of a busy kitchen
Enter Incarnate — a mobile-first, AI-native 3D scanning app built for real-world merchants, not just 3D pros.
We didn’t just ship an app. We built a pipeline that turned everyday objects into usable 3D assets — in minutes, not days.
Real-time guidance overlays lead users around the object
Flow adapts to object size and complexity
No sliders or jargon — just human-friendly prompts like “slow down” or “get the top”
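Under the hood, that coaching layer boils down to mapping a few live capture signals to plain-language prompts. Here's a minimal Swift sketch of the idea; the metric names and thresholds are illustrative assumptions, not the production logic:

```swift
import Foundation

/// Hypothetical guidance engine: maps a few live capture metrics to the
/// human-friendly prompts shown over the camera feed. Metrics, thresholds,
/// and prompt copy are illustrative assumptions.
enum ScanPrompt: String {
    case slowDown = "Slow down"
    case moveCloser = "Move a little closer"
    case captureTop = "Get the top"
    case keepGoing = "Looking good, keep circling"
}

struct CaptureMetrics {
    var angularSpeed: Double      // rad/s from device motion
    var subjectFillRatio: Double  // 0...1, how much of the frame the object fills
    var topCoverage: Double       // 0...1, how much of the top hemisphere is captured
}

func prompt(for metrics: CaptureMetrics) -> ScanPrompt {
    if metrics.angularSpeed > 1.2 { return .slowDown }        // moving too fast blurs frames
    if metrics.subjectFillRatio < 0.25 { return .moveCloser } // object too small in frame
    if metrics.topCoverage < 0.5 { return .captureTop }       // top hemisphere still missing
    return .keepGoing
}
```

Keeping the decision logic this small is what lets the same flow adapt to a ring or a wardrobe without ever exposing a setting.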
Scans are lightly processed on-device
Raw data is compressed and sent securely to the cloud (sketched below)
AI reconstructs the 3D mesh + textures in the backend
Shiny and transparent surfaces are detected and flagged to avoid noise
Optimized for solid, rigid items like food, electronics, and homeware
Final models are downloadable in industry-standard formats
Scans are saved, versioned, and can be enhanced later
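On the client side, the hand-off is deliberately simple: downscale and compress the captured frames, then stream them to the reconstruction backend. A minimal Swift sketch of that step follows; the endpoint, payload shape, and auth are hypothetical, and mesh generation itself happens server-side:

```swift
import UIKit

/// Sketch of the on-device "light processing" + upload step.
/// The endpoint and payload layout are assumptions for illustration;
/// reconstruction, texturing, and shiny-surface handling run in the cloud.
struct ScanUploader {
    let baseURL = URL(string: "https://api.example.com/v1/scans")! // hypothetical endpoint

    /// Downscale and JPEG-compress a captured frame before upload.
    func compress(_ frame: UIImage, maxDimension: CGFloat = 1920) -> Data? {
        let scale = min(1, maxDimension / max(frame.size.width, frame.size.height))
        let targetSize = CGSize(width: frame.size.width * scale,
                                height: frame.size.height * scale)
        let resized = UIGraphicsImageRenderer(size: targetSize).image { _ in
            frame.draw(in: CGRect(origin: .zero, size: targetSize))
        }
        return resized.jpegData(compressionQuality: 0.8)
    }

    /// Upload compressed frames one by one; the backend reassembles them
    /// into a reconstruction job keyed by scanID.
    func upload(frames: [UIImage], scanID: UUID) async throws {
        for (index, frame) in frames.enumerated() {
            guard let body = compress(frame) else { continue }
            var request = URLRequest(
                url: baseURL.appendingPathComponent("\(scanID.uuidString)/frames/\(index)")
            )
            request.httpMethod = "PUT"
            request.setValue("image/jpeg", forHTTPHeaderField: "Content-Type")
            _ = try await URLSession.shared.upload(for: request, from: body)
        }
    }
}
```

Everything heavy stays in the cloud, which is exactly what keeps the on-device step light enough to run between scans in a busy kitchen.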
Incarnate completely flipped the 3D content game for merchants:
| Metric | Before | After |
| --- | --- | --- |
| ⏱️ Turnaround Time | 5–7 days | Under 1 hour |
| 💸 Cost per Model | ₹2,000–₹10,000+ | Nearly free |
| 🍩 Food Scanning | Not feasible | Worked beautifully |
| 🧠 Operator Workflow | Manual, tedious | Scan 50+ SKUs a day, no sweat |
And for categories like food and botanicals, where traditional modeling hit a wall?
Incarnate made it feel easy.
Designing for one scan is easy — designing for 50 a day is where the real product work lies
Shiny surfaces are enemy #1 — the UX had to gently steer users away from trouble
Food was an unlikely hero: it delivered the most value precisely because it couldn’t be modeled manually anyway
The simpler the flow looks, the more technically complex it probably is
Success = making users feel like they just did something magical
The goal wasn’t to teach users 3D. It was to make them forget they ever needed to learn it.
iOS (Swift) – for camera, scanning UX, and capture logic
Python – backend orchestration + 3D reconstruction
Cloud ML Pipelines – for mesh generation + texturing
Mixpanel – tracked scan success, failure, and drop-off (sketched below)
Metabase – internal dashboards for usage and category insights
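The funnel instrumentation is a thin wrapper around standard Mixpanel event tracking. A rough Swift sketch, assuming the Mixpanel Swift SDK; the event and property names are placeholders, not the actual schema:

```swift
import Mixpanel

/// Rough sketch of scan-funnel instrumentation. Assumes the Mixpanel SDK
/// was initialized at app launch; event and property names are hypothetical.
func trackScanOutcome(category: String, frameCount: Int, succeeded: Bool) {
    Mixpanel.mainInstance().track(
        event: succeeded ? "scan_completed" : "scan_failed",
        properties: [
            "category": category,      // e.g. "food", "jewelry", "furniture"
            "frame_count": frameCount  // helps correlate drop-off with scan length
        ]
    )
}
```

Those events fed the Metabase dashboards used to spot failure points and category-level friction.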
🎯 Designed the end-to-end flow — from phone camera to downloadable 3D model
🔧 Built human-friendly UX for both tiny (jewelry) and massive (furniture) objects
📊 Instrumented analytics to track failure points and surface hidden friction
🧪 Partnered with 3D engineers to refine mesh + texture outputs
📦 Prioritized repeatable, industrial workflows over one-off “wow” moments
🚧 Identified and de-prioritized tricky categories (fashion, organics) to stay focused
Incarnate turned 3D modeling from a studio problem into a camera feature.
Scan a donut, a drone, or a dinner plate — with just your phone — and walk away with a usable 3D model.
No lidar. No studios. No headaches.
Just scan, upload, and watch it come alive.
And when your users say, “Wait... I just did that?” —
that’s when you know it’s working.