Detailed write-ups of each aspect of Lucid can be found on the project blog.
Participants use the Lucid app while sleeping to try to gain clues to the in-game challenges. Challenges range from simple geometry to complex musical puzzles.
My involvement in this project was creating the controls participants use to play Lucid. Within the Immersion Vision Theatre (a dome environment), participants wear a headset with an iPhone attached; its accelerometer and gyroscope data replicate their natural head motion in-game. A second iPhone, again supplying accelerometer and gyroscope data, handles player movement. I created a small Node.js app that receives this data over WebSockets and forwards it via OSC to the Unity3D engine Lucid is built on.