Project: AN-Prez – Spatial Collaboration in Mixed Reality
Date: 2022
Subtitle: Because PowerPoint was never meant for 3D
How I took MR collaboration from ₹4L to ₹21K per headset
Let’s face it — remote collaboration was... fine.
Until it wasn’t.
Presentations were 2D.
Product demos lacked context.
Brainstorming meant sharing screens while juggling Notion, Zoom, and a prayer.
Mixed Reality should have been the answer. But MR tools were either:
Built for solo exploration, not team flow
Clunky or overengineered
Or stuck behind ₹4L HoloLens price tags
We didn’t need more slides.
We needed a war room — spatial, persistent, alive.
Think: JARVIS, but for meetings.
We had already built AN-Prez, a multiplayer Mixed Reality app for immersive collaboration — live avatars, sticky notes, 3D models, and slides — all in the same shared space.
But it was locked to Microsoft HoloLens.
Which meant it was basically locked to... nobody.
₹4,00,000 per headset isn’t how you scale team collaboration.
So I took on a challenge that, at the time, felt almost mythical:
Migrate AN-Prez from HoloLens to Oculus Quest. Solo.
No docs. No guides. Just code, grit, and way too many build errors.
This wasn’t a one-line platform switch. It was an ecosystem jump.
| Area | HoloLens (UWP) | Oculus Quest (Android) |
|---|---|---|
| OS | Windows | Android |
| Input | Gaze & air-tap (MRTK) | Controllers (Oculus SDK) |
| Build System | UWP + Visual Studio | Gradle + Android Studio |
| Docs | Sparse, but exists | Just vibes and GitHub issues |
I had to rethink interactions from the ground up — not just port code.
Gaze-based UX? Doesn’t work with controllers.
Air-taps? Replaced with precise joystick + hand tracking.
Even slide transitions needed reanimation.
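To make the gaze-to-controller translation concrete, here's a minimal sketch using the Oculus Integration's OVRInput API: a ray cast from the controller pose replaces head gaze, and a trigger press replaces the air-tap. The class name and the `OnSelected` message are illustrative assumptions, not the actual AN-Prez code, and it presumes an OVRCameraRig/OVRManager in the scene.

```csharp
using UnityEngine;

// Illustrative sketch: replacing HoloLens head-gaze + air-tap with a
// controller ray + trigger press via OVRInput (Oculus Integration).
// Names here are hypothetical, not from the AN-Prez codebase.
public class ControllerPointer : MonoBehaviour
{
    [SerializeField] private Transform trackingSpace; // OVRCameraRig's TrackingSpace
    [SerializeField] private float maxRayDistance = 10f;

    void Update()
    {
        // Controller pose is reported relative to the tracking space.
        Vector3 localPos = OVRInput.GetLocalControllerPosition(OVRInput.Controller.RTouch);
        Quaternion localRot = OVRInput.GetLocalControllerRotation(OVRInput.Controller.RTouch);

        Vector3 origin = trackingSpace.TransformPoint(localPos);
        Vector3 direction = trackingSpace.rotation * localRot * Vector3.forward;

        if (Physics.Raycast(origin, direction, out RaycastHit hit, maxRayDistance))
        {
            // A trigger press stands in for the HoloLens air-tap.
            if (OVRInput.GetDown(OVRInput.Button.PrimaryIndexTrigger, OVRInput.Controller.RTouch))
            {
                hit.collider.SendMessage("OnSelected", SendMessageOptions.DontRequireReceiver);
            }
        }
    }
}
```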
AN-Prez wasn’t just "PowerPoint in VR" — it was a multiplayer sandbox where teams could ideate, present, and explore ideas together, in real-time.
👥 Virtual Meeting Rooms
→ Join as live avatars, synced with spatial audio
→ See who’s talking, where they’re looking, and what they’re interacting with
📊 Dual Presentation Mode
→ Flip seamlessly between 3D models and slide decks
→ Control everything via Oculus controllers
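A rough sketch of what that controller-driven deck control can look like, again via OVRInput. The `SlideDeck` helper, the button mapping, and the flick thresholds are all assumptions for illustration, not AN-Prez's shipped bindings.

```csharp
using UnityEngine;

// Hypothetical sketch: a thumbstick flick advances/rewinds the deck,
// and the A button toggles between slides and the 3D model.
public class DeckController : MonoBehaviour
{
    [SerializeField] private SlideDeck deck;       // assumed helper, defined below
    [SerializeField] private GameObject modelView; // the 3D model shown in "model mode"

    private bool stickCentered = true; // debounce so one flick = one slide

    void Update()
    {
        float x = OVRInput.Get(OVRInput.Axis2D.PrimaryThumbstick).x;

        if (stickCentered && Mathf.Abs(x) > 0.7f)
        {
            if (x > 0) deck.Next(); else deck.Previous();
            stickCentered = false;
        }
        else if (Mathf.Abs(x) < 0.2f)
        {
            stickCentered = true; // stick returned to center; allow the next flick
        }

        // Button.One (A) flips between the deck and the 3D model.
        if (OVRInput.GetDown(OVRInput.Button.One))
        {
            bool showModel = !modelView.activeSelf;
            modelView.SetActive(showModel);
            deck.gameObject.SetActive(!showModel);
        }
    }
}

// Minimal assumed deck: slide 0 starts active in the scene.
public class SlideDeck : MonoBehaviour
{
    [SerializeField] private GameObject[] slides;
    private int index;

    public void Next()     => Show(Mathf.Min(index + 1, slides.Length - 1));
    public void Previous() => Show(Mathf.Max(index - 1, 0));

    private void Show(int i)
    {
        slides[index].SetActive(false);
        index = i;
        slides[index].SetActive(true);
    }
}
```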
🧠 Sticky Note Brainstorming
→ Toss, group, write, and drag notes in spatial infinity
→ Infinite canvas. Real collaboration. No tabs.
🧪 Lightweight Multiplayer Networking
→ Photon Engine handled real-time sync
→ Built logic for avatar states, shared inputs, and persistent rooms
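For flavor, a minimal PUN 2 sketch of the join-a-room-then-spawn-an-avatar pattern described above. The room name, player cap, and prefab name are placeholders, not AN-Prez's actual configuration.

```csharp
using Photon.Pun;
using Photon.Realtime;
using UnityEngine;

// Illustrative Photon (PUN 2) room flow: connect, join or create a
// named room, then network-instantiate this player's avatar.
public class RoomLauncher : MonoBehaviourPunCallbacks
{
    void Start()
    {
        PhotonNetwork.ConnectUsingSettings(); // uses the PhotonServerSettings asset
    }

    public override void OnConnectedToMaster()
    {
        // Named rooms: everyone lands in the same shared space.
        PhotonNetwork.JoinOrCreateRoom(
            "DemoRoom", // placeholder name
            new RoomOptions { MaxPlayers = 8 },
            TypedLobby.Default);
    }

    public override void OnJoinedRoom()
    {
        // The prefab (placed under a Resources folder) carries a PhotonView,
        // so its state replicates to every participant.
        PhotonNetwork.Instantiate("Avatar", Vector3.zero, Quaternion.identity);
    }
}

// Avatar state sync: stream the pose each serialization tick.
public class AvatarSync : MonoBehaviourPun, IPunObservable
{
    public void OnPhotonSerializeView(PhotonStream stream, PhotonMessageInfo info)
    {
        if (stream.IsWriting)
        {
            stream.SendNext(transform.position);
            stream.SendNext(transform.rotation);
        }
        else
        {
            transform.position = (Vector3)stream.ReceiveNext();
            transform.rotation = (Quaternion)stream.ReceiveNext();
        }
    }
}
```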
🎮 Cross-Platform Rewrite
→ Rebuilt all input logic from scratch
→ Re-architected build system for Android
→ Performance-tuned every frame for Quest’s mobile GPU
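To give a sense of what "tuning for a mobile GPU" means in practice, here's a sketch of typical Quest-side knobs in Unity plus the Oculus Integration. The specific values are assumptions for illustration, not the settings AN-Prez shipped with.

```csharp
using UnityEngine;
using UnityEngine.XR;

// Illustrative mobile-GPU tuning for Quest; values are examples only.
public class QuestPerformanceTuner : MonoBehaviour
{
    void Start()
    {
        // Render eye buffers slightly below native resolution to win back GPU time.
        XRSettings.eyeTextureResolutionScale = 0.9f;

        // Real-time shadows are brutal on tiled mobile GPUs; bake lighting instead.
        QualitySettings.shadows = ShadowQuality.Disable;

        // 4x MSAA is comparatively cheap on tiled GPUs and hides aliasing in VR.
        QualitySettings.antiAliasing = 4;

        // Fixed foveated rendering: shade the lens periphery at lower detail
        // (OVRManager ships with the Oculus Integration package).
        OVRManager.fixedFoveatedRenderingLevel = OVRManager.FixedFoveatedRenderingLevel.High;
    }
}
```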
| Before (HoloLens) | After (Oculus Quest) |
|---|---|
| ₹4,00,000/headset | ₹21,000/headset |
| Limited to demos & POCs | Usable by actual teams |
| Gaze-only, Windows-only | Controller-native, Android-ready |
| Solo device use | Multi-user, real-world collaboration |
And the best part?
It started getting adopted by design, sales, and innovation teams in Fortune 500 orgs — for everything from product reviews to spatial brainstorming offsites.
I wasn’t just porting a product.
I was:
Rewriting its input language from gaze to controller
Rebuilding its foundation across two operating systems
Debugging through guesswork and logs, because documentation was... let’s say optimistic
Optimizing for a completely different hardware constraint
Proving it could be done — and opening MR up to 10x more users
| Stack Area | Tools Used |
|---|---|
| Core Engine | Unity |
| HoloLens UX Layer | MRTK (Mixed Reality Toolkit) |
| Oculus Integration | Oculus SDK (Android) |
| Multiplayer Sync | Photon Engine |
| Build Systems | Gradle, UWP |
| 3D Assets | Blender, Maya |
| Codebase | C# |
Porting XR platforms is game design + engineering + archaeology.
There’s no Stack Overflow tag for “this has never been done.”
Accessibility isn’t just pricing — it’s usability, portability, and performance.
The real win? Not the code. It was turning a shiny MR demo into something teams could actually use.
We took spatial collaboration from proof of concept to plug-and-play.
From ₹4L exclusivity → ₹21K accessibility.
From solo headset demos → real-time team brainstorms.
It was hard.
It was fun.
It was the future — made usable.