bit.lighthouse

Virtual Reality (VR) is the technology that immerses humans in a virtual world. For decades, VR devices have been rolled out to the market and have brought joy to people with their ability to take users anywhere they want. Today, VR has proven to be the best choice for such immersive experiences; however, the limited modes of user interaction have always stood in the way of deeper immersion. To make the experience even better, we knew we needed to do something. This is where the story began.

At bit.studio, we have carried out various VR-related research projects to improve user experiences and push the technology to its limits. bit.lighthouse, designed to work with the HTC Vive® VR headset, is one of them. We aim to produce a tool that enhances VR immersion by bringing rich user interaction into the virtual world.

Among the flagship VR headsets on the market, the HTC Vive® has a killer feature called "Room-Scale Tracking", made possible by a technology named Lighthouse. Lighthouse allows a VR user to move freely within a cuboid space. It works by sweeping light beams across the entire room to aid the position tracking of the headset, hence the name Lighthouse: each base station emits a synchronization flash followed by horizontal and vertical laser sweeps, and the position and pose of the headset are computed from the timing at which each light sensor sees those beams. What's surprising is that HTC allows developers to utilize this technology for free!
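To make the timing calculation concrete, here is a minimal sketch of its first half: converting one sync/sweep timestamp pair into a sweep angle. This is not HTC's implementation; the 60 Hz rotor speed matches what the community documented for the original base stations, and the assumption that the beam points straight out of the base station half a rotation after the sync flash is a simplification made for this sketch.

```python
# A minimal sketch, not HTC's implementation. Assumes a 60 Hz rotor
# (one full rotation per 1/60 s) and that the beam points straight out
# of the base station half a rotation after the sync flash.
ROTATION_PERIOD_US = 1_000_000.0 / 60.0  # one rotor turn, in microseconds

def sweep_angle_deg(sync_us: float, sweep_us: float) -> float:
    """Angle (in degrees) at which the sweeping beam hit the sensor,
    measured from the base station's forward direction."""
    dt = sweep_us - sync_us                    # time from sync flash to laser hit
    return (dt / ROTATION_PERIOD_US) * 360.0 - 180.0
```

Doing this once for the horizontal sweep and once for the vertical sweep gives two angles per sensor, which is the raw input for the pose calculation.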

The bit.lighthouse project began near the end of 2016, before the Vive Tracker® was announced. At the time, many developers were trying to reverse-engineer Lighthouse's protocol and publishing their work to the open-source community. Wanting to make use of Lighthouse as well, we joined the community and put our effort into researching the technology. We then designed hardware to fit our needs and ended up with our 1st prototype.

The 1st prototype was a floppy piece of hardware tangled with sensors and wires. A PU foam board served as a mount for light sensors installed facing up from the board. At this stage, we used a LAN cable to feed all the collected data to a computer for further processing. Our software engineer built an algorithm to decode the data into something useful, i.e. pose and position, with the help of the well-known open-source computer vision library OpenCV. The processed data was then displayed in a 3D environment built with Unreal® Engine.
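For the pose itself, a base station can be treated as a pinhole camera: the two sweep angles per sensor act like image coordinates, and with the sensors' known positions on the board, a perspective-n-point solver recovers rotation and translation. Below is a hedged Python sketch of that step, not our actual code; the sensor layout, function names, and the unit camera model are illustrative assumptions.

```python
# A hedged sketch of the pose-recovery step, not our production code.
# Assumes each sensor's sweep timings have already been turned into
# horizontal/vertical angles, and that SENSOR_LAYOUT holds the sensors'
# known 3D positions on the foam board (meters, hypothetical values).
import numpy as np
import cv2

SENSOR_LAYOUT = np.array([
    [0.00, 0.00, 0.0],
    [0.10, 0.00, 0.0],
    [0.10, 0.10, 0.0],
    [0.00, 0.10, 0.0],
], dtype=np.float64)

def estimate_pose(angles):
    """angles: (N, 2) horizontal/vertical sweep angles in radians, one
    row per sensor, in the same order as SENSOR_LAYOUT."""
    # Project each angle pair onto a virtual image plane one unit in
    # front of the base station, which then acts as a pinhole camera.
    image_points = np.tan(np.asarray(angles, dtype=np.float64))
    camera_matrix = np.eye(3)   # unit focal length, no principal-point offset
    dist_coeffs = np.zeros(5)   # laser sweeps have no lens distortion
    ok, rvec, tvec = cv2.solvePnP(SENSOR_LAYOUT, image_points,
                                  camera_matrix, dist_coeffs)
    return (rvec, tvec) if ok else None
```

The returned rotation and translation vectors describe the board relative to the base station, which is what gets forwarded to the 3D scene.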

After countless trials and errors on the 1st prototype, we were ready to move on and build more rigid hardware. For our 2nd prototype, we designed a custom PCB that holds all the electronic modules in a more compact space. Powered by an ARM Cortex processor, we achieved centimeter-level precision using only the data available from the Lighthouse base stations. A rechargeable battery was also added to the module, and we started streaming the collected data over Wi-Fi instead of a wired connection for better mobility.
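As a rough illustration of the streaming side, the PC-side receiver could look like the sketch below. The UDP port and the packet layout are assumptions made for this sketch, not the project's actual protocol.

```python
# A rough sketch of a PC-side receiver for the Wi-Fi stream. The port
# and packet layout (sensor id plus sync/sweep timestamps as packed
# little-endian integers) are assumptions for illustration only.
import socket
import struct

PACKET = struct.Struct("<BII")  # u8 sensor id, u32 sync us, u32 sweep us

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 9000))    # hypothetical port

while True:
    data, _addr = sock.recvfrom(64)
    if len(data) < PACKET.size:
        continue                # ignore malformed datagrams
    sensor_id, sync_us, sweep_us = PACKET.unpack(data[:PACKET.size])
    # hand (sensor_id, sync_us, sweep_us) to the angle/pose pipeline
```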

This was a project we were very proud of, until the announcement of the Vive Tracker®. And so the story goes…