WIP: HTC Vive Lighthouse Beacon, #OpenVR Calibration, TS3633 for Position-Aware Devices

Like many projects, this is a tale of knowing HOW, WHEN, and WHERE to start. I'm never quite sure.

REPEAT

HOW does one position a device?

I'd often thought we could use a camera to find your device and radio the positions around, or potentially use the radio signal strength directly to calculate a position. None of this seemed cheap, though setting up a camera system might be fun. The lighting conditions for computer vision systems are always a bit too specific for my needs, I'm not sure how I'd calibrate one quickly, and converting the camera data into a 3D position feels like an art form entirely in and of itself.

The possibility of easy position-aware devices seemed graspable upon seeing Trammell's ongoing research into the Vive Lighthouse positioning system over at @nycresistor. He'd been decoding the IR signal and building hardware from basic IR photodetectors with some amplifying circuitry. His explanation was great, and I loved seeing the phenomena splayed out on the scope. There is even a provision to send data from the lighthouse one bit at a time:

Omnidirectional Optical Transmitter Frame
Omnidirectional Optical Transmitter Frame is a data structure that base stations (lighthouses) broadcast to all tracked objects. Each sync pulse contains one bit of data of the frame.
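To make that concrete, here is a minimal decoding sketch as I read the community reverse engineering (not Trammell's exact code): a sync pulse's width encodes skip/data/axis bits in roughly 10.4 µs steps above a ~62.5 µs base, and the data bits from successive sync pulses are what get collected into the OOTX frame.

#include <stdint.h>

struct sync_bits { bool skip; bool data; bool axis; };

// Decode one sync pulse, assuming its width was already measured in µs.
// Bucket thresholds are approximate; tune against your own scope traces.
static bool decode_sync(uint32_t width_us, sync_bits *out) {
    if (width_us < 55 || width_us > 140) return false;  // not a sync pulse
    uint32_t code = (width_us - 57) / 10;  // 0..7: skip*4 + data*2 + axis
    out->skip = (code >> 2) & 1;  // this station skips the coming sweep
    out->data = (code >> 1) & 1;  // one OOTX frame bit rides on each pulse
    out->axis = code & 1;         // which rotor (horizontal/vertical) sweeps next
    return true;
}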

NOW

I wanted to play. A great talk by Alan Yates at SuperCon and DIY Position Tracking Using HTC Vive's Lighthouse were all I needed to get excited. This seemed about right for my current mix of free time, reward needed, and effort. I compiled the Digi-Key shopping cart with a new plugin from 1-click BOM.

WHERE

This was starting to look easy. I asked around to see if a few local heroes were interested in the collection of parts and the possibility of boards to pull them all together. Just moments before pulling the trigger on the order, I found that Triad Semi had already pulled all this together into a tiny package and breakout board. Technology is really starting to move fast when you can get a custom ASIC within a couple months of the first reverse engineering. The TS3633-CM1 will be the giant on whose shoulders I stand. I'm no stranger to the ease of Teensy/Arduino for connecting sensors to pretty much everything, so I grabbed a spare and an LED matrix I'd been wanting to experiment with.

TESTING TIME

HARDWARE

• 2x TS3633-CM1: they come in packages of ten. This starts to make sense once you realize how hard it is to consistently hit one sensor from both lighthouses all the time. I'll have to address this in the next testcase.
• 1x Teensy: an LC, 3.0, 3.1, or any newer one should work just fine. Just be mindful of your PPM pin mappings (code); see the capture sketch after this list.
• NeoPixel LEDs: any RGB or addressable form of LED could be made to work. I chose this 4×8 matrix I'd been wanting to test for another project. The pinouts were less straightforward than most LED strips, and it is almost too bright. We'll fix that in code.
• Battery: anything with USB micro or 5 VDC will work. Be creative. I might try to power this from a couple of CR2032s next.
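Here's roughly how I'd wire that up on the Teensy (pin numbers are my assumptions, not the mapping from the linked code): watch each TS3633-CM1 envelope output with a pin-change interrupt, time the pulses with micros(), and hand the widths to a decoder like the one sketched above.

#include <Arduino.h>

// Assumed pins for the two TS3633-CM1 envelope (E) outputs; adjust to taste.
const uint8_t SENSOR_PINS[2] = {5, 6};
volatile uint32_t pulseStart[2];
volatile uint32_t pulseWidth[2];  // latest pulse width in µs, per sensor

void edge(uint8_t i) {
  // The module's envelope output is (assumed) active-low while light is seen.
  if (digitalRead(SENSOR_PINS[i]) == LOW) pulseStart[i] = micros();
  else pulseWidth[i] = micros() - pulseStart[i];
}
void edge0() { edge(0); }
void edge1() { edge(1); }

void setup() {
  for (uint8_t i = 0; i < 2; i++) pinMode(SENSOR_PINS[i], INPUT);
  attachInterrupt(digitalPinToInterrupt(SENSOR_PINS[0]), edge0, CHANGE);
  attachInterrupt(digitalPinToInterrupt(SENSOR_PINS[1]), edge1, CHANGE);
}

void loop() { /* classify pulseWidth[] into sync pulses and sweep hits here */ }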

CODE

The calibration for the HTC Vive lighthouses is stored in SteamVR when you set up your playspace. For now, I had to install Microsoft Visual Studio 2012 and download the latest examples from OpenVR. The solution file pretty much came right up after several wrong turns down a CMake hole. I was able to debug my way to the data within the evening.
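For anyone who wants to skip the debugger, here's the shape of a minimal query (a sketch, not my exact build; it assumes SteamVR is running and you're linking against openvr_api): ask OpenVR for every device pose and print the ones that are TrackingReference devices, i.e. the base stations.

#include <cstdint>
#include <cstdio>
#include <openvr.h>

int main() {
    vr::EVRInitError err = vr::VRInitError_None;
    vr::IVRSystem *sys = vr::VR_Init(&err, vr::VRApplication_Background);
    if (err != vr::VRInitError_None) return 1;

    vr::TrackedDevicePose_t poses[vr::k_unMaxTrackedDeviceCount];
    sys->GetDeviceToAbsoluteTrackingPose(vr::TrackingUniverseStanding, 0.0f,
                                         poses, vr::k_unMaxTrackedDeviceCount);
    for (uint32_t i = 0; i < vr::k_unMaxTrackedDeviceCount; i++) {
        if (sys->GetTrackedDeviceClass(i) != vr::TrackedDeviceClass_TrackingReference)
            continue;  // only base stations (lighthouses)
        const vr::HmdMatrix34_t &m = poses[i].mDeviceToAbsoluteTracking;
        // Columns 0-2 are the rotation, column 3 the position in meters --
        // the same numbers that show up in the lightsource structs below.
        for (int r = 0; r < 3; r++)
            printf("% f % f % f | % f\n", m.m[r][0], m.m[r][1], m.m[r][2], m.m[r][3]);
    }
    vr::VR_Shutdown();
    return 0;
}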

Watched lighthouse calibration

I've added a bit more logging into a build here: CODE. I'll continue grinding on this with my minimal Windows coding experience. The fun bit here was seeing the transform matrices for all the HTC devices in my system. I think this part will get very easy in the months to come. It looks like a lot of these positions can be figured out on the device with multiple sensors and some fixed geometries.
static lightsource lightsource1 = {
    // 3x3 rotation matrix (row-major)
    { 0.779680, 0.024346, -0.625704,
      -0.178665, 0.966355, -0.185031,
      0.600148, 0.256057, 0.757798},
    // position in meters
    { 1.560562, 2.315096, 1.997607}
};

static lightsource lightsource2 = {
    // 3x3 rotation matrix (row-major)
    { -0.574520, 0.059408, 0.816332,
      0.234097, 0.967626, 0.094335,
      -0.784299, 0.245298, -0.569827},
    // position in meters
    { -2.016330, 2.295804, -1.532343}
};
For now, I'm happy testing with hardcoded lighthouse positioning values and mapping my space with light to see how repeatable this process is.
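
To turn sweep angles into a point with those hardcoded values, I'm using the usual two-ray recipe (a sketch under my assumptions about conventions: angles measured from each station's optical axis, rot row-major, positions in meters): build a world-space ray per lighthouse, then take the midpoint of the shortest segment between the two rays.

#include <math.h>

typedef struct { float rot[9]; float pos[3]; } lightsource;  // as above

// World-space ray direction from one station's horizontal/vertical sweep
// angles (radians). The -Z "looking direction" is an assumed convention.
static void ray_from_angles(const lightsource *ls, float h, float v, float dir[3]) {
    float local[3] = { tanf(h), tanf(v), -1.0f };
    for (int r = 0; r < 3; r++)
        dir[r] = ls->rot[r*3+0]*local[0] + ls->rot[r*3+1]*local[1] + ls->rot[r*3+2]*local[2];
}

// Midpoint of the shortest segment between rays p1+t*d1 and p2+s*d2.
static void closest_point(const float p1[3], const float d1[3],
                          const float p2[3], const float d2[3], float out[3]) {
    float w[3] = { p1[0]-p2[0], p1[1]-p2[1], p1[2]-p2[2] };
    float a = d1[0]*d1[0] + d1[1]*d1[1] + d1[2]*d1[2];
    float b = d1[0]*d2[0] + d1[1]*d2[1] + d1[2]*d2[2];
    float c = d2[0]*d2[0] + d2[1]*d2[1] + d2[2]*d2[2];
    float d = d1[0]*w[0] + d1[1]*w[1] + d1[2]*w[2];
    float e = d2[0]*w[0] + d2[1]*w[1] + d2[2]*w[2];
    float denom = a*c - b*b;          // near zero when the rays are parallel
    float t = (b*e - c*d) / denom;
    float s = (a*e - b*d) / denom;
    for (int i = 0; i < 3; i++)
        out[i] = 0.5f * ((p1[i] + t*d1[i]) + (p2[i] + s*d2[i]));
}

The leftover gap between the two rays at that midpoint is also a handy sanity check on how repeatable the data really is.
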
I started with a fork of Trammell's Teensy code to log all the possible angles and positions. I ran tests around the house with my serial cable. I couldn't tell exactly how good any of this data was. It seemed very sensitive, but I needed to see the device's impression of the space and wanted to go wireless.
<3 positioning
I'd love to hear about any of your ideas or experiments with lighthouse tracking.
I used a few libraries for fun and profit (a combined sketch follows this list):
  • FastLED: get your rainbow color mapping, color palette switching, and brightness control.
  • ewma: helps smooth the color between positions while the sensors are occluded.
  • elapsedMillis: a slick way to fade out when the device is not able to find the lighthouses.
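
Tied together, the loop looks something like this (a toy sketch with assumed pins and a hypothetical readPositionX() standing in for the triangulation step; the smoothing is hand-rolled where the ewma library would go):

#include <FastLED.h>
#include <elapsedMillis.h>

#define DATA_PIN  2       // assumed matrix data pin
#define NUM_LEDS  32      // the 4x8 matrix

CRGB leds[NUM_LEDS];
elapsedMillis sinceLastFix;  // ms since both lighthouses were last seen
float smoothX = 0.5f;        // smoothed position, 0..1

bool readPositionX(float *x) {  // hypothetical stub: wire the triangulated
  (void)x; return false;        // X coordinate (normalized 0..1) in here
}

void setup() {
  FastLED.addLeds<NEOPIXEL, DATA_PIN>(leds, NUM_LEDS);
  FastLED.setBrightness(48);  // tame the almost-too-bright matrix
}

void loop() {
  float x;
  if (readPositionX(&x)) {
    smoothX += 0.2f * (x - smoothX);  // EWMA rides through brief occlusions
    sinceLastFix = 0;
  }
  uint32_t t = sinceLastFix;
  uint8_t v = (t >= 2000) ? 0 : 255 - (t * 255) / 2000;  // 2 s fade-out
  fill_solid(leds, NUM_LEDS, CHSV(uint8_t(smoothX * 255), 255, v));
  FastLED.show();
}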

NEXT

Next up, I really need a better shape. The current device and much of the code were developed to get orientation and accuracy. I want to build a new version that is more easily tracked; combining readings from multiple sensors arranged as more of a ball or a spherical tetrahedron might have some advantages.

Are there some interesting ramifications to being tracked in a space all the time? Definitely. I can imagine a ton of fun and useful reasons your devices might like to know where they are.

Might you like to know when you're being tracked by one of these systems? Perhaps the Vive lighthouses aren't a big deal or very prevalent yet, but if this whole #openvr thing takes off, we could find universal positioning systems employed everywhere.

Maybe there is an application here for the blind? Kids? Pets? Something really creepy. I'm inspired by new forms of sensing and the possibility of networking these devices back to the VR system for custom controllers, low-cost motion capture, and more immersion in the digital world.
Left-to-Right : HSB Gradient
height detail view
HeadSpace

 
