Vision-RF-Based Mobile Localization for Augmented Reality Indoor Wayfinding

Microsoft Garage Internship: Summer 2019

Indoor Localization and Wayfinding for Microsoft Employees

I spent summer 2019 at Microsoft Garage, Microsoft's official center for experimental projects, building a cm-scale-accurate vision- and RF-based localization and augmented-reality navigation system under mentorship from the HoloLens team and Microsoft Research.

Video: initial results from the visual-inertial odometry mobile tracking system, later incorporated into a particle filter that fuses Microsoft's internal campus maps API with noisy BLE beacon RSSI range estimates to mitigate tracking drift and poor initial vision pose estimates (derived from, e.g., office signage, QR codes, and the Azure Spatial Anchors cloud keypoint storage SDK).
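For readers unfamiliar with this style of sensor fusion, here is a minimal sketch of the idea: VIO displacement drives the particle filter's predict step, and BLE-RSSI-derived ranges to beacons at known positions drive the reweight step. Everything below (the 2D state, the noise constants, the log-distance path-loss conversion, and the beacon coordinates) is an illustrative assumption rather than the internal Microsoft implementation, and the campus-map constraint (e.g., discarding particles that pass through walls) is omitted.

```python
# Minimal 2D particle-filter sketch: VIO displacement drives the predict step,
# and noisy BLE-RSSI-derived ranges to beacons at known positions drive the
# reweight step. All constants, names, and models here are illustrative
# assumptions, not the internal Microsoft system.
import numpy as np

NUM_PARTICLES = 500
ODOM_NOISE = 0.05        # assumed jitter (m) added per VIO displacement
RSSI_RANGE_SIGMA = 1.5   # assumed std dev (m) of BLE range estimates

rng = np.random.default_rng(0)
particles = rng.uniform(low=[0.0, 0.0], high=[30.0, 20.0],
                        size=(NUM_PARTICLES, 2))  # spread over a 30x20 m floor
weights = np.full(NUM_PARTICLES, 1.0 / NUM_PARTICLES)

def rssi_to_range(rssi_dbm, tx_power_dbm=-59.0, path_loss_exp=2.0):
    """Rough RSSI-to-distance conversion via a log-distance path-loss model."""
    return 10.0 ** ((tx_power_dbm - rssi_dbm) / (10.0 * path_loss_exp))

def predict(particles, vio_delta):
    """Shift every particle by the VIO-reported displacement plus noise."""
    return particles + vio_delta + rng.normal(scale=ODOM_NOISE,
                                              size=particles.shape)

def update(particles, weights, beacon_xy, measured_range):
    """Reweight particles by how well they explain one BLE range estimate."""
    predicted = np.linalg.norm(particles - beacon_xy, axis=1)
    likelihood = np.exp(-0.5 * ((predicted - measured_range)
                                / RSSI_RANGE_SIGMA) ** 2)
    weights = weights * likelihood + 1e-300      # guard against total collapse
    return weights / weights.sum()

def resample(particles, weights):
    """Resample to concentrate particles in high-probability regions."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# One fusion step: VIO reports ~1 m of motion in x, and a beacon surveyed at
# (5, 5) is heard at -72 dBm (about 4.5 m away under the assumed model).
particles = predict(particles, vio_delta=np.array([1.0, 0.0]))
weights = update(particles, weights,
                 beacon_xy=np.array([5.0, 5.0]),
                 measured_range=rssi_to_range(-72.0))
particles, weights = resample(particles, weights)
print("fused position estimate:", np.average(particles, weights=weights, axis=0))
```

In practice the same predict/update loop runs continuously: every VIO frame advances the particles, and each new beacon packet (or visual fix) pulls the distribution back toward the true position, which is what keeps long-term drift bounded.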

Presented the final product and beta deployment to C-suite leadership at the end of the summer and handed the project off to FTEs for further development (final product videos with mobile AR rendering are confidential / internal to Microsoft).

Path estimated by the visual-inertial odometry module, with visual pose initialization from QR codes / office signage. BLE beacon locations are indicated by blue markers.
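As a rough illustration of how a printed QR code can bootstrap the pose, the sketch below detects a QR code with OpenCV and recovers the camera pose from its four corners via solvePnP. The code side length, camera intrinsics, and image path are made-up placeholders; the production pipeline (office signage recognition and Azure Spatial Anchors) remains internal to Microsoft.

```python
# Sketch of visual pose initialization from a QR code of known physical size.
# All numeric values and file names are placeholder assumptions.
import cv2
import numpy as np

QR_SIDE_M = 0.15  # assumed printed QR code side length in meters
K = np.array([[1000.0, 0.0, 640.0],   # assumed pinhole intrinsics
              [0.0, 1000.0, 360.0],
              [0.0, 0.0, 1.0]])
DIST = np.zeros(5)                    # assume negligible lens distortion

# The QR code's four corners in its own planar frame (z = 0).
object_corners = np.array([[0.0, 0.0, 0.0],
                           [QR_SIDE_M, 0.0, 0.0],
                           [QR_SIDE_M, QR_SIDE_M, 0.0],
                           [0.0, QR_SIDE_M, 0.0]], dtype=np.float32)

def pose_from_qr(frame_bgr):
    """Detect a QR code and return (rvec, tvec): camera pose relative to it."""
    detector = cv2.QRCodeDetector()
    data, image_corners, _ = detector.detectAndDecode(frame_bgr)
    if image_corners is None:
        return None  # no QR code in view; VIO keeps running on its own
    ok, rvec, tvec = cv2.solvePnP(
        object_corners,
        image_corners.reshape(-1, 2).astype(np.float32),
        K, DIST)
    if not ok:
        return None
    # The decoded string `data` could be used to look up the sign's surveyed
    # map location, anchoring the VIO trajectory in the building frame.
    return rvec, tvec

if __name__ == "__main__":
    frame = cv2.imread("hallway_sign.jpg")  # placeholder image path
    if frame is not None:
        print(pose_from_qr(frame))
```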

Me and my co-interns!
