The PS Move API tracking test is a 6DOF (positional and rotational) tracking implementation using a PS Move controller and a PS Eye camera on a PC, rendering a holodeck-like room to simulate head tracking.

Motivation

Since the Oculus Rift lacks positional tracking, the goal was to implement a positional tracking solution that anybody can use, at the lowest possible cost. That means using widely available off-the-shelf components that work out of the box, with no technical skills (e.g. soldering) required.

A PS Eye + PS Move bundle seemed to be the perfect candidate: it can be bought for less than $70 (PlayStation Move Essentials Pack), which makes it a very cheap 6DOF tracking solution. Most 9DOF trackers cost at least $100, only provide 3DOF rotational tracking, and sometimes require soldering skills.

For indie developers, there is also the added benefit of being able to use it directly on a PS3 console.

Implementation

It works on a Linux machine (Debian/unstable) using a modified version of the PS Move API for tracking and Allegro/OpenGL for rendering.
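
For reference, here is a minimal sketch of this kind of tracking loop, written against the stock PS Move API C interface (the test itself runs a modified version of the library, so details differ):

    /* Minimal PS Move tracking loop (stock PS Move API C interface). */
    #include <stdio.h>
    #include "psmove.h"
    #include "psmove_tracker.h"

    int main(void)
    {
        PSMove *move = psmove_connect();
        if (!move) {
            fprintf(stderr, "No PS Move connected\n");
            return 1;
        }

        PSMoveTracker *tracker = psmove_tracker_new();

        /* Blink the sphere until the tracker has locked on to it */
        while (psmove_tracker_enable(tracker, move) != Tracker_CALIBRATED);

        for (;;) {
            psmove_tracker_update_image(tracker); /* grab a PS Eye frame */
            psmove_tracker_update(tracker, move); /* find the sphere in it */

            float x, y, radius;
            psmove_tracker_get_position(tracker, move, &x, &y, &radius);
            /* x/y are in camera pixels; the apparent radius of the
             * sphere encodes the distance, hence a full 3D position */
            printf("x=%8.2f y=%8.2f radius=%8.2f\n", x, y, radius);
        }

        psmove_tracker_free(tracker);
        psmove_disconnect(move);
        return 0;
    }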

The master's thesis of the PS Move API's author, which covers sensor fusion, is available on the project's website. It gives interesting information about the capabilities of the system:
  • Test machine: Intel Core 2 Duo 2.53 GHz (single-threaded)
  • Controller tracking: 68 frames/s
  • Inertial sensor retrieval: 87 updates/s

  • Test machine: 1.40 GHz Linux machine
  • Total end-to-end latency: 68 ms (± 3 ms)

Problems


Jitter and rotational drift

The calibration doesn't seem to work correctly, since there is a lot of rotational drift even when the controller isn't moving. In the first part of the video, where orientation capture is disabled, the position is tracked with good precision and accuracy, and with low latency.

The rotational drift is a big problem for now, but I think it should be possible to get rid of it. The author of the PS Move API posted a showcase video where there doesn't seem to be any problem with orientation, although drift would probably be less visible there since the PS Move is in his hand.
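
For context, orientation capture is toggled per controller in the PS Move API. A sketch of the relevant calls (names from the upstream C API), including the manual reset that can at least re-zero the accumulated drift:

    /* Sketch: orientation capture is what exhibits the drift; the
     * first part of the video runs with it disabled. */
    #include <stdio.h>
    #include "psmove.h"

    void read_orientation(PSMove *move)
    {
        /* Normally done once at startup */
        psmove_enable_orientation(move, PSMove_True);

        while (psmove_poll(move)) {}  /* drain pending sensor readings */

        float w, x, y, z;
        psmove_get_orientation(move, &w, &x, &y, &z);  /* fused quaternion */
        printf("w=%.3f x=%.3f y=%.3f z=%.3f\n", w, x, y, z);

        /* The accumulated drift can at least be re-zeroed manually: */
        psmove_reset_orientation(move);
    }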

Continuous rotational drift

I think the jittering problems I've been experiencing are purely on the software side. I've tested different gain values for the Madgwick fusion algorithm: the jitter was basically gone, but it produced a continuous drift, always in the same direction. I'll try the previous version of the algorithm, which seemed to take this problem into account.

Someone posted a comment on the video about using the "betaDef" parameter to get rid of the drift; this could be an interesting thing to test.
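
For what it's worth, in Madgwick's reference C implementation (MadgwickAHRS.c/.h, which the PS Move API's fusion appears to be based on), betaDef (0.1f) is only the compile-time default of the global gain `beta`, so the gain can also be changed at runtime. A sketch of such a test (the 0.035 value is an arbitrary example, not a known-good setting):

    /* Sketch: tuning the Madgwick gain at runtime (names from
     * Madgwick's reference MadgwickAHRS.c/.h implementation). */
    #include "MadgwickAHRS.h"

    void fusion_step(float gx, float gy, float gz,
                     float ax, float ay, float az,
                     float mx, float my, float mz)
    {
        /* Higher beta: trusts accelerometer/magnetometer more,
         *   which corrects drift faster but adds jitter.
         * Lower beta: trusts the gyroscope more,
         *   which smooths jitter but lets drift accumulate. */
        beta = 0.035f;  /* arbitrary example; betaDef is 0.1f */

        MadgwickAHRSupdate(gx, gy, gz, ax, ay, az, mx, my, mz);

        /* The fused orientation is in the globals q0 (w), q1, q2, q3. */
    }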

Video capture

The video doesn't do justice to what it looks like in reality; the old Canon camera used for filming doesn't seem able to handle the 60 Hz rendering correctly.

Future work

After unsuccessfully trying to correct the erratic rotational tracking and the rotational drift, it seems the PS Move is not a suitable candidate for rotational tracking for now. Since the Oculus Rift already provides such tracking, and considering the positional tracking was quite convincing, the best approach in the future seems to be to combine both.

Naive route

The naive route would be to strap the PS Move to the headset, but this has several disadvantages:
  • added weight;
  • line-of-sight constraints;
  • finding a suitable location for the PS Move (top, side, looking up, front?);
  • fixing the PS Move securely so that it doesn't move;
  • a Bluetooth adapter is needed to control the PS Move lighting;
  • the PS Move needs charging, i.e. it's not pick & play.

Some tests were made combining the PS Move API and the OpenHMD library to add positional tracking, but unfortunately there seems to be an incompatibility between the two libraries. They both use HIDAPI, but in different ways, so modifications to one of them would be required to get a usable framework.
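
For the record, the intended combination would look roughly like the sketch below (Rift orientation from OpenHMD, position from the PS Move tracker); this single program is exactly what the HIDAPI conflict currently blocks:

    /* Sketch: Rift orientation (OpenHMD) + PS Move position (PS Move
     * API). Blocked for now by the HIDAPI incompatibility. */
    #include "psmove.h"
    #include "psmove_tracker.h"
    #include <openhmd.h>

    int main(void)
    {
        ohmd_context *ctx = ohmd_ctx_create();
        ohmd_ctx_probe(ctx);
        ohmd_device *hmd = ohmd_list_open_device(ctx, 0);

        PSMove *move = psmove_connect();
        PSMoveTracker *tracker = psmove_tracker_new();
        while (psmove_tracker_enable(tracker, move) != Tracker_CALIBRATED);

        for (;;) {
            ohmd_ctx_update(ctx);

            float quat[4];  /* head orientation from the Rift */
            ohmd_device_getf(hmd, OHMD_ROTATION_QUAT, quat);

            psmove_tracker_update_image(tracker);
            psmove_tracker_update(tracker, move);

            float x, y, radius;  /* head position from the sphere */
            psmove_tracker_get_position(tracker, move, &x, &y, &radius);

            /* feed quat + (x, y, radius-derived depth) to the renderer */
        }

        psmove_tracker_free(tracker);
        psmove_disconnect(move);
        ohmd_ctx_destroy(ctx);
        return 0;
    }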

Best route


Reflective tape

Another, less complex, solution can probably be used. Instead of using the PS Move, some reflective tape could be stuck on the Oculus Rift; an additional light in the room, directed towards the user, could make the markers visible enough for the PS Eye to track them.

http://en.wikipedia.org/wiki/Retroreflective_sheeting

There are some benefits to this:
  • lower cost;
  • a less complex stack (no Bluetooth, no PS Move charging, no library incompatibility);
  • no additional weight;
  • obvious locations for the markers (the 6 faces of the headset);
  • fewer line-of-sight problems if markers are also placed on the straps;
  • no insecure fixing of the markers;
  • a less weird look (no Mickey Mouse ears).

So now the goal is to find some suitable reflective tape and try to implement this in a demo, like the OculusWorldDemo, whose source is available and which supports Linux.

It should also be kept in mind that not all colors work correctly for tracking with the PS Eye. From Thomas Perl's master's thesis:

"With the PS Eye camera, we experienced visual tracking problems caused by motion blur with green colors (figure 5.9). While other colors such as magenta (red and blue channels at full intensity) were tracked without problems even in situations with fast movements, green does not work so well with this camera. For this reason, green is to be avoided with the current hue-based tracking algorithm when using the PS Eye camera."

Magenta, blue or red then seem to be good candidates for the reflective tape color. Maybe a combination of those could be useful, to easily recognize the location of each marker.
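
As a starting point, hue-based detection of such a marker is straightforward with OpenCV (which the PS Move API tracker already builds on). A sketch using the old C API; the hue range below is a rough guess for magenta (OpenCV hue runs 0-179) and would need tuning against the actual tape:

    /* Sketch: hue-based magenta marker detection on the PS Eye feed. */
    #include <stdio.h>
    #include <opencv/cv.h>
    #include <opencv/highgui.h>

    int main(void)
    {
        CvCapture *capture = cvCaptureFromCAM(0);  /* the PS Eye */
        IplImage *hsv = NULL, *mask = NULL;

        for (;;) {
            IplImage *frame = cvQueryFrame(capture);
            if (!frame) break;

            if (!hsv) {
                hsv  = cvCreateImage(cvGetSize(frame), IPL_DEPTH_8U, 3);
                mask = cvCreateImage(cvGetSize(frame), IPL_DEPTH_8U, 1);
            }

            cvCvtColor(frame, hsv, CV_BGR2HSV);
            /* keep only saturated, bright, magenta-ish pixels;
             * the range is a guess and needs tuning */
            cvInRangeS(hsv, cvScalar(140, 100, 100, 0),
                            cvScalar(170, 255, 255, 0), mask);

            CvMoments m;
            cvMoments(mask, &m, 1);
            if (m.m00 > 0) {
                printf("marker at (%.1f, %.1f)\n",
                       m.m10 / m.m00, m.m01 / m.m00);  /* centroid */
            }
        }
        return 0;
    }

With distinct hue ranges per marker, the same loop would also directly tell which marker is which.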

The unknown for now is whether reflective tape works well in daylight conditions, as it is normally intended for nighttime use. I think that with additional lighting directed towards the user it could work, but that needs to be tested.

Candidates for reflective tape:

The PS Move API's tracking algorithms would also need to be adapted to the shape of these markers.

Preliminary tests could also be made with standard tape or paper/cardboard attached to the Rift. For limited-FOV testing, a simple sticky note could be used, with the advantage that it's probably easy for an algorithm to detect a yellow square on a black background (the Oculus Rift itself).

Further refinements could provide a clip-on piece of plastic with markers on it, with branches running past the sides of the Rift, like the temple arms of eyeglasses. This would leave the Rift unaltered (no sticky glue on it) and would allow for more line of sight (almost 360°). An STL file could be created for easy construction, as well as stickers distributed with it, with carved spots showing where to stick them at the appropriate locations on the Rift, DIY style.

Reflective balls

Reflective balls have commonly been used for tracking for a long time (head tracking on the CrystalEye 3D glasses, for example). They are generally used with IR cameras, but I guess they could also be used with standard cameras and some additional lighting.


Ping-pong balls

Balls of the same size as the PS Move sphere could probably also be used in the same way; a benefit would be that no modification to the PS Move API would be needed. Ping-pong balls could be a good candidate; I'd need to measure them.

They could be put on each side of the Rift, like Mickey Mouse ears. The FOV would be quite large, certainly better than with retroreflective tape.
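
The reason the ball diameter matters is that the tracker derives distance from the apparent radius of the sphere in the image, via a simple pinhole-camera relation. A sketch (the focal length value is a made-up placeholder, not a measured PS Eye constant):

    /* Sketch: distance from apparent sphere radius (pinhole model).
     * FOCAL_LENGTH_PX is a hypothetical placeholder; the real value
     * would come from calibrating the PS Eye. */
    #define FOCAL_LENGTH_PX 540.0f

    float distance_from_radius(float real_radius_cm, float pixel_radius)
    {
        /* distance = focal_length_px * real_radius / pixel_radius */
        return FOCAL_LENGTH_PX * real_radius_cm / pixel_radius;
    }

A ball with the same diameter as the PS Move sphere keeps this mapping identical, which is why the PS Move API could be used unmodified.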

Relevant forum threads

http://www.mtbs3d.com/phpBB/viewtopic.php?f=140&t=15994

External links

PS Move API: http://thp.io/2010/psmove/
Allegro: http://alleg.sourceforge.net/