HTC Vive and Northwestern MSR - An Oral History

Former Master of Science in Robotics students look back at the role the HTC Vive virtual reality system played in their student work at Northwestern.

The HTC Vive system is a commercially available platform normally used for virtual reality, yet in the last few years, it's played a major role in multiple student projects within Northwestern's Master of Science in Robotics (MSR) program.

To better understand the system and the impact it's had within MSR, we spoke with four people who worked directly with HTC Vive:

  • Tanay Choudhary (MSR '16), Robotics Software Engineer at Vecna Robotics
  • Adam Pollack (MSR '17), Robotics Engineer at Knightscope
  • Lauren Hutson (MSR '18), Robotics Software Engineer at Diligent Robotics
  • Jarvis Schultz, Former Associate Director of Northwestern's MSR Program

Their four perspectives were quite informative, and together they form an oral history of the HTC Vive and of how the MSR program evolved alongside the system's technology.


The HTC Vive system consists of two Lighthouse base stations and a variety of trackable components, including the Vive Headset, Vive Controllers, and the Vive Tracker. The base stations emit timed sweeps of infrared light; photodiodes on each trackable object detect those sweeps, and the detection timing determines the object's position and orientation in space. The system allows virtual reality content to render the trackable objects in a virtual world.
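The timing-to-angle step at the heart of this scheme can be sketched in a few lines. This is a simplified illustration, not HTC's actual firmware, assuming a base-station rotor spinning at 60 Hz (the figure commonly reported for first-generation Lighthouse stations) and a sync flash that marks the start of each sweep:

```python
import math

ROTOR_HZ = 60.0  # assumed rotor speed: one full sweep revolution per 1/60 s

def sweep_angle(sync_time_s: float, hit_time_s: float) -> float:
    """Angle of a photodiode relative to the base station, derived from
    the delay between the sync flash and the moment the sweeping laser
    crosses the diode."""
    dt = hit_time_s - sync_time_s
    # A full revolution (2*pi radians) takes 1/ROTOR_HZ seconds.
    return 2.0 * math.pi * ROTOR_HZ * dt

# A hit 1/240 s after the sync flash is a quarter revolution: 90 degrees.
print(round(math.degrees(sweep_angle(0.0, 1.0 / 240.0)), 6))  # 90.0
```

Two such angles per station (one horizontal sweep, one vertical) are enough to define a ray from the station toward the sensor.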


When the HTC Vive Virtual Reality (VR) system first came out, it created some interest from the robotics community because it seemed like a promising, low-cost system for tracking arbitrary objects in space, which we often want to do in robotics. There were papers published on the system, news articles, and roboticists trying to become developers for HTC/Valve to get access to the technology. The device includes software tools that allow developers to accurately track the pose of the headset and controllers, but we wanted something that could leverage the technology without requiring all of their software and controllers.

Tanay Choudhary was one of the first people to build custom hardware that used the Lighthouses of the HTC Vive but performed all of the tracking calculations with his own hardware and software. The idea behind his Airbot project was that you could slap a few very cheap sensors onto an object, import a lightweight software library, and automatically have access to highly accurate tracking information.


The broad goal of the Airbot project was to make a robot capable of moving around on a nearly frictionless surface such as an air-hockey table. Most moving robots use wheels or legs, but both would slip on a frictionless surface, so we had to come up with something unconventional.

Once I was able to make the robot move around on the air-hockey table, I needed a way to track exactly where the robot was on the table and how it was oriented, since we can't really call it a mobile robot if it can't localize itself. This would be crucial if we wanted the robot to travel from point A to point B on the table on its own.

HTC Vive's Lighthouse tracking system was perfect for this, since it enables indoor localization with sub-millimeter accuracy. However, since the system had been unveiled quite recently and was not open source, no one really knew how it worked under the hood. I essentially had to reverse engineer the system by putting the sensors on my robot, observing the different kinds of pulses emitted by the base stations, and programming an algorithm to triangulate the positions of the sensors.


After Tanay's project, HTC released the Vive Tracker hardware. This solved part of the problem: the Tracker was a small, lightweight, and relatively inexpensive piece of hardware that could be bolted onto an object to get accurate tracking information about that object's pose. The issue was that you were still tied to the software system the Vive needed, which is pretty complicated.

Inspired by an open-source effort to reverse engineer the communication protocols of the Vive Tracker, Adam Pollack set out to create a software library and ROS package that allowed you to get the pose of a Vive Tracker but with very minimal software dependencies. This way you could connect the tracker to a Linux machine with low computational power, such as a Raspberry Pi, and have access to the pose information.


The goal of my final project was to determine the pose of the Vive Tracker in space for use with a quadrotor. When mounted on the quadrotor, the Vive Tracker would enable the vehicle to know its position in space and could then accept commands to travel to a particular absolute location. I utilized an open source library to interface with the Vive Tracker and then wrote custom code to determine the location based on data being received from the Lighthouse base stations.


Since Adam's project was completed, the MSR program has been able to leverage the Vive Tracker capabilities in a number of projects, including Lauren Hutson's custom VR glove and a multi-year project involving development and control of a fleet of mobile manipulators.


The availability of information regarding the low-level technology that enables the Vive to function, as well as the massive development community, were vital in helping me develop a strategy for completing the project.


Tanay's project laid the foundations for a more advanced custom solution that is now used regularly in research and courses at Northwestern.


Everything I've done in robotics is in some way built upon solutions other people came up with and shared, so I'm really glad some of my work at MSR is proving useful to others.