I hope everyone had a great Thanksgiving! We’re back from break, starting winter quarter, and ready to hit the ground running with our project’s production.
Yesterday we discussed some issues we found with our interface. Projecting the interface on the floor while controlling some parts via gestures and others via touch seemed to create a disconnect, which we want to avoid, since a key goal of ours is for the installation to be highly immersive. We talked about two possible solutions:
- Keeping the projected floor interface, but incorporating more touch. The blocks in the center circle might be activated by pressure pads, and the middle rings might be activated by double-tapping the foot or holding the foot in one place for a few seconds, rather than by our previously planned gestures. The outer space would be activated just by moving through it, as usual.
- Projecting the interface onto a screen, with the projector behind the screen to reduce distortion and the Kinect behind the user rather than in front of or above them. The entire installation would then be gesture controlled, while still letting the user touch the screen, and special attention would be paid to the rest of the installation’s environment to keep it immersive.
We also talked about other technologies we could incorporate to take some of the load off the Kinect, including Arduino boards and infrared sensors. We figure that the fewer things the Kinect has to sense, the easier the installation will be to use.
We’ve got a bunch of upcoming goals for the next week:
- Thursday evening, a few of us will install the projector and Kinect in one of the Golisano labs so we can be sure we know how everything hooks up, and to practice safely hanging the equipment overhead in case we decide to keep the interface projected on the floor.
- Tuesday, we’ll be doing some heavy usability testing. Matt and Mekan will create some very simple demos that we can control with gestures, and we’ll try them projected on the floor and on a wall, along with several other options. From there, we’ll have other people play with the demos and gather their feedback as well. This will help us determine which direction to take, depending on what feels most natural, comfortable, and intuitive.
- Emma, Bogdan, Theo, and I will be doing some heavy design work next week, so that we can pick our final design direction and start working cohesively. Bogdan will also be looking at options for our sound visualizer.
- Brad will be taking a closer look at our options for music. We can create sounds ourselves, but we may also be able to generate sound dynamically through code, so this will take some experimentation to see what works and sounds best.
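To illustrate what generating sound dynamically through code could look like, here is a minimal sketch in Python using only the standard library. This is purely hypothetical and not our chosen toolchain; it just synthesizes a sine-wave tone sample by sample and writes it to a WAV file, the kind of building block a parameter (say, from gesture data) could later drive.

```python
import math
import struct
import wave

SAMPLE_RATE = 44100  # samples per second (CD quality)

def sine_tone(freq_hz, duration_s, amplitude=0.5):
    """Generate mono 16-bit integer samples for a sine wave."""
    n_samples = int(SAMPLE_RATE * duration_s)
    samples = []
    for i in range(n_samples):
        value = amplitude * math.sin(2 * math.pi * freq_hz * i / SAMPLE_RATE)
        samples.append(int(value * 32767))  # scale to signed 16-bit range
    return samples

def write_wav(path, samples):
    """Write mono 16-bit PCM samples to a WAV file."""
    with wave.open(path, "wb") as wav:
        wav.setnchannels(1)   # mono
        wav.setsampwidth(2)   # 2 bytes = 16-bit samples
        wav.setframerate(SAMPLE_RATE)
        wav.writeframes(struct.pack("<%dh" % len(samples), *samples))

# A 440 Hz tone for half a second; the frequency here is an arbitrary example.
write_wav("tone.wav", sine_tone(440.0, 0.5))
```

In a real installation the tone parameters would come from sensor input in real time rather than being written to a file, but the sample-generation idea is the same.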