Like many projects, this is a tale of knowing HOW, WHEN, and WHERE to start. I'm never quite sure.
HOW does one position a device?
I'd often thought we could use a camera to find your device and radio the positions around. Potentially, we could use radio signal strength directly to calculate a position. None of this seemed cheap, though setting up a camera system might be fun. The lighting conditions for computer vision systems are always a bit too specific for my needs, I'm not sure how I'd calibrate it quickly, and converting the camera data into a 3D position feels like an art form in and of itself.
This possibility of easy position-aware devices seemed graspable upon seeing Trammell's ongoing research into the Vive Lighthouse positioning system over at @nycresistor. He'd been decoding the IR signal and building hardware around basic IR photodetectors with some amplifying circuitry. His was a great explanation, and I loved seeing the phenomenon splayed out on the scope. There is even a provision to send data from the lighthouse one bit at a time.
This was starting to look easy. I asked around whether a few local heroes were interested in a collection of parts and the possibility of boards to pull them all together. Just moments before pulling the trigger on the order, I found that Triad Semi had already pulled all this together into a tiny package and breakout board. Technology is really starting to move fast when you can get a custom ASIC within a couple of months of the first reverse-engineering efforts. The TS3633-CM1 will be the new giant on whose shoulders I'll stand. I'm no stranger to the ease of Teensy/Arduino for connecting sensors to pretty much everything, so I grabbed a spare and an LED matrix I'd been wanting to experiment with.
2x TS3633-CM1 : they come in packages of ten. This starts to make sense when you realize how hard it is to consistently hit one sensor from both lighthouses all the time. I'll have to address this in the next test case.
1x Teensy : an LC, 3.0, 3.1, or any newer one should work just fine. Just be mindful of your PPM pin mappings (code).
neopixel LEDs : any RGB, addressable form of LED could be made to work. I chose this 4×8 matrix I'd been wanting to test for another project. The pinouts were less straightforward than most LED strips, and it is almost too bright. We'll fix that in code.
battery : anything with USB micro or 5 VDC will work. Be creative. I might try to power this from a couple of CR2032s next.
The calibration for the HTC Vive lighthouses is stored in SteamVR when you set up your play space. For now, I had to install Microsoft Visual Studio 2012 and download the latest examples from OpenVR. The solution file pretty much came right up after several wrong turns down a CMake hole. I was able to debug my way to the data within the evening.
I've added a bit more logging into a build here : CODE . I'll continue grinding on this with my minimal Windows coding experience. The fun bit here was seeing the transform matrices for all the HTC devices in my system. I think this part will get very easy in the months to come. It looks like a lot of these positions can be figured out on the device with multiple sensors and some fixed geometries.
For now I’m happy testing with hardcoded lighthouse positioning values and mapping my space with light to see how repeatable this process is.
I started with a fork of Trammell's Teensy code to log all the possible angles and positions. I ran tests around the house with my serial cable. I couldn't tell exactly how good any of the data was. It seemed very sensitive, but I needed to see the device's impression of the space and wanted to go wireless.
I used a few libraries for fun and profit:
fastled : get your rainbow color mapping, color palette switching, and brightness control.
ewma : helps blend color between positions while the sensors are occluded.
elapsedMillis : a slick way to fade out when the device is not able to find the lighthouses.
Next up, I really need a better shape. The current device and much of the code were developed to get orientation and accuracy. I want to build a new version that is more easily tracked. I think combining sensor readings might have some advantages, perhaps in more of a ball or spherical-tetrahedron shape.
Are there some interesting ramifications of being tracked in a space all the time? Definitely. I can imagine a ton of fun and useful reasons your devices might like to know where they are.
Might you like to know when you're being tracked by one of these systems? Perhaps the Vive lighthouses aren't a big deal or very prevalent yet, but in the future, if this whole #openvr thing takes off, we could find universal positioning systems employed everywhere.
Maybe there is an application here for the blind? Kids? Pets? Something really creepy? I'm inspired by new forms of sensing and the possibility of networking these devices back to the VR system for custom controllers, low-cost motion capture, and more immersion in the digital world.