I needed to build a demonstration using Crossbar.io with ThreeJS and wanted to use an Arduino and a Raspberry Pi to connect the sensors. The WAMP router was in place, so I just needed to make a visual system and get the sensors hooked up. The Arduino is more flexible for handling analog sensor data, but the RPI does have ports for digital sensors, so I wanted to try using the Arduino to convert the analog sensor data and then send it to the Raspberry Pi.
Live Demo ThreeJS Visualizations
Setting Up The Raspberry Pi
Passing input from the Raspberry Pi via NodeJS was the first step. I used a button first so I could calibrate sending and receiving data from the RPI to the WAMP server. Using a tactile button let me gauge whether there was a large lag time and also register the hit from the Crossbar server. It was also a good chance to test the digital sensors with the RPI.
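In case it helps, here is a minimal sketch of the button test's core logic. The GPIO pin, topic name, and event shape are all assumptions on my part; the actual Autobahn|JS session and `onoff` GPIO wiring appear only in comments, since those depend on the router and hardware setup.

```javascript
// Build the message published to the Crossbar router for each press.
// The field names here are made up for illustration.
function buttonEvent(pin, pressedAt) {
  return { sensor: 'button', pin: pin, ts: pressedAt };
}

// Measure round-trip lag: time from the GPIO press to the echo coming
// back from the router on our own subscription.
function roundTripMs(sentTs, echoTs) {
  return echoTs - sentTs;
}

// With real hardware this would look roughly like (not run here):
//   const button = new Gpio(17, 'in', 'rising');   // from 'onoff'
//   button.watch(() =>
//     session.publish('demo.sensors', [buttonEvent(17, Date.now())]));
//   // subscribe to the same topic and log roundTripMs(ev.ts, Date.now())
```

Publishing a timestamp with each press and subscribing to the same topic makes the lag measurable without any extra tooling.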
Testing Arduino Sensor Output
It seemed a lot easier to get a wide range of signals using the Arduino. I wasn't sure yet if I was going to use sensors on the RPI too, so I tried the digital sensors on both first as a test. I wanted to get a good baseline for the audio sensor as well, so I used the oscilloscope to see how strong both the analog and digital signals would be. The analog signal turned out to be a lot less responsive than I expected, though still better than the digital sensor, which is more of a threshold meter.
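To compare the two signal types downstream, the readings can be normalized into the same 0–1 range. This is a rough sketch assuming a 10-bit Arduino ADC (raw values 0–1023); the actual scaling in the demo may differ.

```javascript
// Map a raw 10-bit analog reading (0-1023, assumed range) to 0-1.
function normalizeAnalog(raw) {
  return Math.min(Math.max(raw, 0), 1023) / 1023;
}

// The digital sensor acts as a threshold meter: either tripped or not.
function normalizeDigital(raw) {
  return raw ? 1 : 0;
}
```

Normalizing both on the sender side keeps the visualization code from caring which kind of sensor produced a value.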
Raspberry Pi Sensors
I ended up starting with just the two sensors on the RPI in order to make calibrating the visuals less complicated. I made a little rig with the sensors so we could set up multiple banks in different locations. Each bank was going to be tracked visually, but I'm just using one sensor block to start with.
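A simple way to keep multiple banks organized is a small registry mapping each bank to a topic and a stage position. The ids, topic names, and positions below are made-up examples, not the ones from the demo.

```javascript
// One entry per sensor bank; position is where its shapes spawn on stage.
const banks = [
  { id: 'bank-1', topic: 'demo.sensors.bank1', position: { x: -1, z: 0 } },
];

// Register another bank without mutating the existing list.
function addBank(list, id, topic, position) {
  return list.concat([{ id, topic, position }]);
}
```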
Calibrating the Sensors
The first ThreeJS layout was somewhat basic, but I wanted to use a shape that was easy to track. I set it up so that the position, color, size, transparency, and number of objects were used to represent the sensor data on the screen. The image shows some of the raw data on the left. It also shows the range finder data, which is not represented in the visuals. At this point the thresholds are way too low and the scene is too dense; the audio has to be really loud before it causes visual spikes on the screen.
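The mapping from a normalized sensor level to the visual parameters could be sketched like this; the ranges are illustrative assumptions, not the calibrated values from the demo.

```javascript
// Derive the object parameters from a normalized sensor level (0-1).
// All ranges here are example values, not the demo's calibration.
function objectParams(level) {
  return {
    size: 0.5 + level * 2.0,          // louder input -> bigger shape
    opacity: 0.2 + level * 0.8,       // quiet input stays mostly transparent
    count: Math.round(1 + level * 9), // how many objects to spawn
    y: level * 10                     // vertical position in the scene
  };
}
```

Keeping the mapping in one pure function makes recalibrating the thresholds a matter of tweaking constants rather than digging through the scene code.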
Data Visualization Overload
At this point the 3D shapes are displaying properly, but they aren't very evenly dispersed. The shapes appear and then gradually fade. The screen was pretty full at this point, but at least the shapes aren't going off screen anymore. I'm still working out the color and transparency, but I added more fog so it has an appearance similar to floating over a city. The camera also moves gradually over the top of the scene. I plan on using the position as a parameter once more sensor banks are in place.
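The gradual fade can be done by decaying each shape's opacity every frame and dropping shapes once they are fully transparent; the decay rate here is an arbitrary example value.

```javascript
// Per-frame fade: lower every shape's opacity, then discard invisible ones.
// In the real scene each entry would also carry its ThreeJS mesh.
function fadeStep(shapes, decay) {
  return shapes
    .map(s => ({ ...s, opacity: s.opacity - decay }))
    .filter(s => s.opacity > 0);
}
```

Pruning fully faded shapes each frame also keeps the scene from accumulating objects when the sensors are busy.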
The completed scene looks relatively clean, and a user can interact with the screen by activating the sensors directly. I hooked up two boxes at two different locations and split them vertically, one on each side of the stage. I wanted it to look interesting even if the viewer wasn't directly triggering the sensors.