Emma created this mini documentary of our project – get an overview of how the project works and check out footage from our first presentation on February 27th!
This past Thursday, our main goal was to see what our project might look like projected – we had aimed to do it a bit sooner, but our equipment was unfortunately uncooperative. We got our hands on a short-throw projector, and while we’ll need to replace it for the final version of the installation since the top left of the image is blurry, it was great to finally get an accurate idea of the potential scale that we’re dealing with. Our personal usability testing got bumped up to this coming Tuesday, and after the holiday break we’ll be letting other people give our demos a shot!
Besides work with the projector, we’ve made some more design progress. Theo put together his own concept of how our interface might look:
We plan to combine some ideas from this with some of Bogdan’s concepts, and then see how it comes out. We’re particularly liking the little stars all along the edges of the interface. If we decide to project on the floor in the final version, they’ll change color and follow users around; if we decide to project on the wall, users will be able to brush through them to make them change.
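Nothing here is locked in yet, but a rough ActionScript sketch of the floor-mode star behavior helps show what we mean: each star eases toward a nearby user and shifts color with proximity. The user position passed into update() is a stand-in for whatever tracking data we end up pulling from the Kinect.

```actionscript
package {
    import flash.display.Sprite;

    // Sketch only: one edge star for the floor-mode idea. The owner calls
    // update() each frame with the nearest tracked user's position, which
    // stands in for real Kinect tracking data we don't have wired up yet.
    public class EdgeStar extends Sprite {
        private var _homeX:Number, _homeY:Number; // resting spot on the border

        public function EdgeStar(homeX:Number, homeY:Number) {
            _homeX = homeX; _homeY = homeY;
            x = homeX; y = homeY;
            draw(0xFFFFFF);
        }

        public function update(userX:Number, userY:Number):void {
            var dx:Number = userX - x, dy:Number = userY - y;
            var dist:Number = Math.sqrt(dx * dx + dy * dy);
            if (dist < 200) {
                x += dx * 0.05;                        // ease toward the user
                y += dy * 0.05;
                draw(dist < 80 ? 0xFF66CC : 0x66CCFF); // shift color as they near
            } else {
                x += (_homeX - x) * 0.02;              // drift back home
                y += (_homeY - y) * 0.02;
                draw(0xFFFFFF);
            }
        }

        private function draw(color:uint):void {
            graphics.clear();
            graphics.beginFill(color);
            graphics.drawCircle(0, 0, 3);
            graphics.endFill();
        }
    }
}
```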
Brad also put together some sample audio so we can get an idea of how our project will sound in use. Give it a listen here!
I hope everyone had a great Thanksgiving! We’re back from break, starting winter quarter, and ready to hit the ground running with our project’s production.
Yesterday we discussed some issues we found with our interface. Having the interface projected on the floor, with some parts controlled via gestures and others via touch, seemed to create a disconnect, which we want to avoid since one of our core goals is for the installation to be highly immersive. We talked about two different solutions:
- Keeping the projected floor interface, but incorporating more touch. The blocks in the center circle might be activated by pressure pads, and the middle rings might be activated by double-tapping a foot or holding it in one place for a few seconds, rather than by our previously planned gestures (see the dwell-detection sketch after this list). The outer space would be activated just by moving through it, as originally planned.
- Projecting the interface onto a screen, with the projector behind the screen to reduce distortion and the Kinect behind the user rather than in front of or above them. The entire installation would then be gesture controlled, but built so the user could still touch the screen, and special attention would be paid to the rest of the installation's environment to keep it immersive.
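To get a feel for the first option, here's a minimal sketch of the foot-dwell idea: a ring segment counts as activated once a tracked foot holds still inside a small radius for a couple of seconds. The foot coordinates (and both thresholds) are placeholders until we have real Kinect skeleton data to test against.

```actionscript
package {
    import flash.utils.getTimer;

    // Sketch of dwell detection for the floor interface (option one above).
    // A segment fires when the foot holds still inside it for DWELL_MS.
    public class DwellDetector {
        private static const RADIUS:Number = 40;  // px of allowed wobble (placeholder)
        private static const DWELL_MS:int = 2000; // hold time before firing (placeholder)

        private var _anchorX:Number = NaN, _anchorY:Number = NaN;
        private var _since:int = 0;
        private var _fired:Boolean = false;

        // Call once per frame with the tracked foot position (hypothetical feed).
        public function update(footX:Number, footY:Number):Boolean {
            var dx:Number = footX - _anchorX, dy:Number = footY - _anchorY;
            if (isNaN(_anchorX) || dx * dx + dy * dy > RADIUS * RADIUS) {
                // Foot moved: restart the dwell timer at the new spot.
                _anchorX = footX; _anchorY = footY;
                _since = getTimer();
                _fired = false;
                return false;
            }
            if (!_fired && getTimer() - _since >= DWELL_MS) {
                _fired = true; // fire once per dwell
                return true;
            }
            return false;
        }
    }
}
```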
We also talked about other technologies we could incorporate to take some of the load off the Kinect, including Arduino boards and infrared sensors. We figure the fewer things we force the Kinect to sense, the more reliable and easier to use the installation will be.
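For reference, Flash can't talk to a serial port directly, so the usual Arduino-to-Flash pattern is a small serial-to-TCP proxy on the host machine, with the Flash side just reading lines off a socket. Here's a hedged sketch of that Flash side; the port number and the pad:<id>:<state> message format are our own placeholders, not anything we've built yet.

```actionscript
package {
    import flash.display.Sprite;
    import flash.events.ProgressEvent;
    import flash.net.Socket;

    // Sketch: reading pressure-pad events from an Arduino via a serial-to-TCP
    // proxy. The port number and "pad:<id>:<state>" format are assumptions.
    public class PadListener extends Sprite {
        private var _socket:Socket = new Socket();
        private var _buffer:String = "";

        public function PadListener() {
            _socket.addEventListener(ProgressEvent.SOCKET_DATA, onData);
            _socket.connect("127.0.0.1", 5331); // hypothetical proxy port
        }

        private function onData(e:ProgressEvent):void {
            _buffer += _socket.readUTFBytes(_socket.bytesAvailable);
            var lines:Array = _buffer.split("\n");
            _buffer = String(lines.pop()); // keep any partial trailing line
            for each (var line:String in lines) {
                var parts:Array = line.split(":"); // e.g. "pad:3:down"
                if (parts[0] == "pad") {
                    trace("pad", parts[1], parts[2]);
                    // a real version would trigger the matching bass block here
                }
            }
        }
    }
}
```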
We’ve got a bunch of upcoming goals for the next week:
- Thursday evening, a few of us will be installing the projector and Kinect in one of the Golisano labs so we can be sure we know how everything hooks up, and to practice safely hanging the equipment top-down in case we keep the floor projection.
- Tuesday, we’ll be doing some heavy usability testing – Matt and Mekan will be creating some very simple gesture-controlled demos, and we’ll try them projected on the floor and on the wall, along with several other configurations. From there, we’ll move on to having other people play with the demos and gathering their feedback as well. This will help us ultimately determine which direction we’re going to take, based on what feels most natural, comfortable, and intuitive.
- Emma, Bogdan, Theo, and I will be doing some heavy design work for next week, so that we can pick our final design direction and start working cohesively. Bogdan will also be taking a look at options for our sound visualizer.
- Brad will be taking a closer look at our options for music. We can create sounds ahead of time, but we may also be able to generate sound dynamically through code, so this will take some experimenting to see what works and sounds best (a rough sketch of the dynamic approach follows this list).
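For context on that second option: Flash's Sound class can synthesize audio on the fly by filling sample buffers from a SampleDataEvent handler. A minimal sketch that plays a plain sine tone; the hard-coded frequency would eventually be driven by the interface.

```actionscript
package {
    import flash.display.Sprite;
    import flash.events.SampleDataEvent;
    import flash.media.Sound;

    // Sketch of dynamic audio in Flash: synthesize a sine tone sample-by-sample.
    // Flash expects 44.1 kHz stereo float samples from the SAMPLE_DATA handler.
    public class ToneDemo extends Sprite {
        private var _sound:Sound = new Sound();
        private var _phase:Number = 0;
        private var _freq:Number = 440; // placeholder; the interface would set this

        public function ToneDemo() {
            _sound.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleData);
            _sound.play();
        }

        private function onSampleData(e:SampleDataEvent):void {
            var step:Number = _freq * 2 * Math.PI / 44100;
            for (var i:int = 0; i < 8192; i++) {
                var sample:Number = 0.25 * Math.sin(_phase); // keep the volume sane
                e.data.writeFloat(sample); // left channel
                e.data.writeFloat(sample); // right channel
                _phase += step;
            }
        }
    }
}
```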
We just finished up our final presentation for the end of the quarter, and we have a bunch of awesome design and development progress to share.
Matt and Mekan have gotten the Kinect receiving data and tracking gestures, which is an integral part of how our installation is going to work.
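We won't walk through their actual code here, but gesture detection generally boils down to watching joint positions change frame to frame. As an illustration only, here's a sketch of a simple horizontal-swipe check, with the hand positions standing in for whatever the Kinect pipeline actually delivers.

```actionscript
package {
    import flash.utils.getTimer;

    // Sketch only: detect a fast horizontal hand swipe from per-frame positions.
    // The handX values are a stand-in for the real Kinect joint feed, and both
    // thresholds are guesses to tune against actual tracking data.
    public class SwipeDetector {
        private static const MIN_DISTANCE:Number = 0.4; // meters of travel
        private static const MAX_TIME_MS:int = 500;     // must happen quickly

        private var _startX:Number = NaN;
        private var _startTime:int = 0;

        // Returns "left", "right", or null. Call once per skeleton frame.
        public function update(handX:Number):String {
            var now:int = getTimer();
            if (isNaN(_startX) || now - _startTime > MAX_TIME_MS) {
                _startX = handX; // window expired; restart from here
                _startTime = now;
                return null;
            }
            var travel:Number = handX - _startX;
            if (Math.abs(travel) >= MIN_DISTANCE) {
                _startX = NaN;   // reset so one swipe fires once
                return travel > 0 ? "right" : "left";
            }
            return null;
        }
    }
}
```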
Bogdan, Emma, Brad, and Theo have also been working on some more design concepts.
I worked on creating our research and process document. You can download the PDF to take a look by clicking here. The documents noted at the end of the PDF as being on the CD are the ones included in this blog post.
The designers will be working over Thanksgiving break to create additional design comps. When we come back from the break, we’ll pick and choose what we like best, nailing down our final design direction by week two of winter quarter so we can start learning how to create the dynamic assets in Stage3D. The developers will be working on a Flash demo of the interface and getting our Kinect and projector mounted and working (we need to hang this equipment top-down from the ceiling, so a lot of time and care will be involved!). Around week six we hope to start putting design and development together, so that by weeks nine and ten we’ll have only bug fixes left.
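Since the dynamic assets are headed for Stage3D, part of that learning curve is just getting a rendering context on screen. Here's a sketch of the bare-bones bootstrap: no real drawing yet, just clearing and presenting each frame.

```actionscript
package {
    import flash.display.Sprite;
    import flash.display3D.Context3D;
    import flash.events.Event;

    // Sketch: minimal Stage3D bootstrap. Request a Context3D, configure the
    // back buffer, and clear/present each frame. Assumes this is the document
    // class, so the stage reference is available in the constructor.
    public class Stage3DBootstrap extends Sprite {
        private var _context:Context3D;

        public function Stage3DBootstrap() {
            stage.stage3Ds[0].addEventListener(Event.CONTEXT3D_CREATE, onContext);
            stage.stage3Ds[0].requestContext3D();
        }

        private function onContext(e:Event):void {
            _context = stage.stage3Ds[0].context3D;
            _context.configureBackBuffer(stage.stageWidth, stage.stageHeight, 2, true);
            addEventListener(Event.ENTER_FRAME, render);
        }

        private function render(e:Event):void {
            _context.clear(0.05, 0.05, 0.1); // dark background
            // ...vertex buffers, shaders, and draw calls will live here...
            _context.present();
        }
    }
}
```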
In the meantime, we’ll also be looking into materials to use for the blocks that will make the center of our projected interface work. We’ve got a couple of different options for materials – the two we’re currently playing with are strong glass and foam core – so we’ll be studying costs and how well each material stands up to abuse, since this project will need to take a fairly heavy beating across about six hours of use at Imagine RIT.
We’ll see you after Thanksgiving!
We’re currently working on nailing down a look to “skin” our music generator with.
These are some sketches and color tests that I did, working on ideas for our central circle:
A mockup of Emma’s experimenting with shapes:
And these are demos that Bogdan created for how our outer rings might look. The first two are experiments with how the ring itself will look and with the color scheme, and the third is closer to what we’re leaning towards for the ring’s final look:
Hey everyone! Welcome to Synapse’s blog. We’ve got eight weeks’ worth of material to hit you with, so bear with us!
A quick summary for the tl;dr among you: we’re a group of seven New Media (five Design & Imaging and two Interactive Development) seniors at Rochester Institute of Technology, and this is the progress blog for our team project, which basically functions as a capstone for our four years of education here at RIT. We have twenty weeks to create the project, and we’re in the middle of week eight. While the project is due in February, we’ll be presenting to the public in May at Imagine RIT.
Week one, they put us together and basically said “okay, GO.” We pretty quickly agreed that we wanted to create something interactive and immersive, and that we wanted to try using the Kinect to do it. This spawned brainstorming on a MASSIVE level, and we narrowed things down to two ideas: an interactive three-dimensional world, or an immersive music generator and visualizer. We opted for the second idea.
The idea is that people will be able to interact with a grid of notes to create music, using both gestures and static blocks (which we’ll build), all read by the Kinect. Theo put together a sketch of the basics:
We brainstormed some user interfaces:
And we put together some mood boards:
We’ve gotten some great feedback so far. Namely, our professor Adam suggested that we gear the project more towards our target persona, an 18-25-year-old engineering major, and add some hidden levels of functionality so that our project can be satisfying to engineering minds but still be simple enough for kids to play with, too. Matt put together this mockup of our interface idea as it currently stands:
The center “sun” holds our bass notes, which will be determined by solid blocks placed within the circle. The middle rings hold the middle notes, determined by projected “blobs” along the rings that the user will be able to control with simple gestures. The outer particles will be our higher (“tinkly”) notes, and will respond to people simply walking through the projection. Besides these floor projections, we’re also planning a wall-projected visualizer to complete the immersive experience.
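One way to picture the layout in code is as a radial mapping: how far a block or a person is from the center of the projection decides which layer of the instrument they're touching. A quick sketch, with all the radii as placeholder numbers until the physical layout is measured.

```actionscript
package {
    // Sketch: map a position in the floor projection to its interface layer.
    // The radii are placeholders until the physical layout is measured.
    public class ZoneMap {
        private static const SUN_RADIUS:Number = 150;   // bass blocks
        private static const RINGS_RADIUS:Number = 400; // gesture-driven blobs

        public static function layerAt(x:Number, y:Number,
                                       centerX:Number, centerY:Number):String {
            var dx:Number = x - centerX, dy:Number = y - centerY;
            var dist:Number = Math.sqrt(dx * dx + dy * dy);
            if (dist <= SUN_RADIUS)   return "bass";    // solid blocks
            if (dist <= RINGS_RADIUS) return "middle";  // ring blobs
            return "particles";                         // tinkly outer notes
        }
    }
}
```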
Currently, developers extraordinaire Matt and Mekan are putting together a Flash demo of how our project will work along with Brad, who’s working on the actual musical notes that our installation will produce. Theo, Emma, Bogdan, and I are working on some design mockups.
If you’ve made it this far, congratulations explorer! All our following posts will be a lot easier to digest. Unless, of course, you’re allergic to awesomeness.