Motorola Mirage – Remote Augmented Reality

Sound familiar? This highly experimental collaboration with Motorola Labs preceded the recent smartphone release of the same name by nearly a decade. In 2000-2001, a server hundreds of miles away at Motorola Labs augmented, in real time, the view of a user wearing a head-mounted display and a head-mounted mini-camera. The user's position and orientation in the room were inferred remotely from markers in the environment, and an updated 3D virtual scene was generated, returned, and composited into the user's view. In 2001 this roundtrip of transport and processing benchmarked at under 1/20th of a second. In other words, at a time when the average web page took 8 seconds to load (http://en.wikipedia.org/wiki/Network_performance), our app was remotely augmenting reality with a latency barely perceptible to humans. The project pushed the existing technologies to their limits: Windows, networking, compression, and multi-threaded media transport and processing. I generally prefer a Unix-like environment for software development, but when hardware driver support and availability demanded that Windows be the target OS for our Augmented Reality platform, I made the transition without going into Linux withdrawal.
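For the curious, here is a rough sketch of what such a roundtrip looks like, written as a small self-contained Python example rather than the original Windows C++ code. The host, port, frame size, length-prefixed framing, and echo-style stub server are all illustrative assumptions; in the real system the remote server inferred the camera pose from markers and rendered the 3D scene before sending it back. The point of the sketch is only the shape of the exchange and how the roundtrip latency gets measured.

    # Illustrative sketch only -- not the original Motorola Labs code.
    # A stub "augmentation server" echoes each frame back; the client ships a
    # captured frame, waits for the rendered overlay, composites it locally,
    # and times the whole exchange against the < 1/20 s target.
    import socket
    import struct
    import threading
    import time

    FRAME_BYTES = 320 * 240 * 3      # assumed camera resolution (placeholder)
    HOST, PORT = "127.0.0.1", 9000   # stand-in for the remote server address

    def recv_exact(sock, n):
        """Read exactly n bytes from a blocking socket."""
        buf = b""
        while len(buf) < n:
            chunk = sock.recv(n - len(buf))
            if not chunk:
                raise ConnectionError("peer closed connection")
            buf += chunk
        return buf

    def stub_augmentation_server():
        """Placeholder server: receive a frame, 'render' an overlay, send it back."""
        with socket.socket() as srv:
            srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
            srv.bind((HOST, PORT))
            srv.listen(1)
            conn, _ = srv.accept()
            with conn:
                while True:
                    try:
                        (size,) = struct.unpack("!I", recv_exact(conn, 4))
                    except ConnectionError:
                        break
                    frame = recv_exact(conn, size)
                    # Real system: infer camera pose from markers, render the 3D scene.
                    overlay = frame  # echoed bytes stand in for the rendered overlay
                    conn.sendall(struct.pack("!I", len(overlay)) + overlay)

    threading.Thread(target=stub_augmentation_server, daemon=True).start()
    time.sleep(0.2)  # give the stub server a moment to start listening

    with socket.create_connection((HOST, PORT)) as cli:
        frame = bytes(FRAME_BYTES)        # stand-in for a captured camera frame
        start = time.perf_counter()
        cli.sendall(struct.pack("!I", len(frame)) + frame)
        (size,) = struct.unpack("!I", recv_exact(cli, 4))
        overlay = recv_exact(cli, size)
        composited = overlay              # stand-in for blending the overlay into the live view
        print(f"roundtrip: {time.perf_counter() - start:.3f} s (project target: < 0.05 s)")

Over localhost this completes in a few milliseconds; the interesting engineering in the real project was keeping the same loop under 50 ms with compression, pose inference, and 3D rendering happening hundreds of miles away.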
