We just finished another big team meeting, and made two major breakthroughs this week – Matt got Flash responding to the Kinect, and Brad & Mekan got Flash and Ableton communicating.
Here’s our rough plan for the next couple weeks:
- Our project will be presented on February 17th
- For January 31st, the final Flash mockup will be ready to go, all UI states will be designed and ready for code, and our promotional video will be storyboarded.
- Brad and Mekan will have a Flash music sequencer working by February 3rd.
- All Kinect gestures will be finalized by February 10th, so that Emma and I can work them into our instructional posters.
- Instructional posters will be sent to the printer February 15th.
Amidst these deadlines, I will also be working on gathering materials for competition submissions. It’s crunch time!
Last week, we got a chance to play with the projector screen we’ll be using in our final exhibit! As mentioned, we decided that we wanted to move the installation to the wall, but we also wanted to eliminate shadow interference. We’re projecting from behind this massive screen, so users can stand in front and play without casting shadows on the image.
The screen was too massive for me to back up far enough to get a full shot – once on its stand, it’s about ten feet tall and twelve feet wide, awesome for multiple users at once. We’ll have to aim our projection so that everyone can reach the top; a simple fix is to add some stars along the top of the screen that aren’t functional but keep the illusion of immersion, without putting anything out of reach for shorter people (or, well, anyone really, since no one’s ten feet tall).
We’ve named our installation Astrocyte, after some pretty excellent star-shaped cells in the brain and spinal cord. Emma and I have begun working on branding for the project, with a logo to be completed by this weekend.
We just recently presented our progress to our professors Adam and Nancy and the rest of the team project class. You can click here to view a PDF of the presentation’s slides.
The main talking points of our presentation were the usability test I posted about last week, our Flash demo, our current UI design progress, and our work with Ableton to create and access our project’s music.
You can click here to download the SWF file of the Flash demo Matt created. Essentially, this is a black-and-white wireframe version of our project’s underlying structure that our usability test subjects played with. The rings and stars respond to mouse click events, imitating the effect that Kinect gestures would have in our final product.
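For a sense of the demo’s interaction model, here’s a rough sketch of the click-to-toggle behavior described above – written in Python rather than ActionScript, with class and method names I’ve made up for illustration, not taken from Matt’s actual code:

```python
# Hypothetical model of the wireframe demo's click handling: clicking an
# empty position adds a point (star), clicking an existing point removes
# it -- a mouse stand-in for the Kinect gestures in the final product.

class Wireframe:
    def __init__(self):
        self.points = set()  # (x, y) positions of placed points

    def on_click(self, x, y):
        pos = (x, y)
        if pos in self.points:
            self.points.remove(pos)   # second click deletes the point
        else:
            self.points.add(pos)      # first click adds a point

demo = Wireframe()
demo.on_click(3, 5)   # adds a point at (3, 5)
demo.on_click(3, 5)   # clicking it again removes it
```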
This is the most up-to-date version of our After Effects UI mockup, which Theo and Bo have been working on, that we also showed during our presentation. The music in the animation is just a placeholder to demonstrate some of the interface’s visualizer effects.
After our presentation, we managed to make awesome progress on our installation’s actual setup, plus some more UI design work and additional progress on our music. I’ll be putting all of this together in another post soon, so be sure to check back!
We’ve just gotten back from Christmas break, and we got off to a running start with our first formal usability test today. You can check out all the details by reading the usability report PDF, but here’s an overview of what we were looking for and what we got out of it.
Our goals were to:
- Decide whether to project the installation on the floor or wall.
- Assess the most natural gestures for adding, moving, and deleting points on the musical grid.
We had five students who had never encountered our project before come in and try a black-and-white projected wireframe. We told each of them that the outer space of the project held the higher notes, the inner rings the middle notes, and the center circle the bass; that they could place notes as desired within each section; and that an invisible metronome would activate them. They weren’t allowed to watch each other participate. We asked each participant to show us what gestures they would instinctively use to add, move, and delete points, and whether they thought the project would be better with more direction, or with less direction to allow for experimentation.
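To make the grid-and-metronome concept concrete, here’s a minimal sketch in Python – the region names and the shape of the data are my own placeholders, not our actual implementation:

```python
# Hypothetical sketch of the note grid: three concentric regions, each
# mapped to a pitch register; a metronome steps through beats and
# triggers whatever notes have been placed on the current beat.

REGISTER = {"outer": "high", "ring": "middle", "center": "bass"}  # assumed names

class Grid:
    def __init__(self, beats=8):
        self.beats = beats
        self.notes = []  # (beat, region) pairs placed by the user

    def place(self, beat, region):
        if region in REGISTER:
            self.notes.append((beat % self.beats, region))

    def tick(self, beat):
        """Return the registers that sound on this metronome beat."""
        return [REGISTER[r] for b, r in self.notes if b == beat % self.beats]

grid = Grid()
grid.place(0, "outer")    # a high note on beat 0
grid.place(0, "center")   # a bass note on beat 0
grid.place(4, "ring")     # a middle note on beat 4
```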
Before we even started the test, we ended up deciding to go with projecting the installation on the wall. Because the wall projection won’t need to accommodate feet, it can be smaller, eliminating an extra projector. We also won’t have to deal with hanging expensive equipment upside down, and can eliminate potential shadow issues by projecting the installation from behind a screen.
After the test, we figured out the following:
- We’ll be using a swipe gesture to delete points, and a “press-and-hold” gesture to select and move points. Each participant used a different gesture to add points, so we will need to decide which makes the most sense to us – we’ll likely go with a simple pointing gesture.
- We noticed that participants used their shadows to point to the sections they wanted. We want to incorporate a sort of cursor that will indicate where the users’ hands are while using the installation, since ideally there will be no shadow interference.
- Our participants unanimously agreed that they felt the installation would be best if users got just a small tutorial ahead of time but were allowed to mostly figure it out for themselves. We’ll be aiming to keep our instructional posters minimal.
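As a sketch of how the gesture findings above might translate into code – the gesture names and the move-the-last-note behavior here are placeholder assumptions, not our real Kinect pipeline – a simple dispatcher could route each recognized gesture to a sequencer action:

```python
# Hypothetical gesture-to-action dispatch reflecting the test results:
# pointing adds a note, press-and-hold selects and moves one, and a
# swipe deletes the note at the given position.

def handle_gesture(gesture, notes, position):
    """Mutate the note list according to the recognized gesture."""
    if gesture == "point":
        notes.append(position)            # add a note where the user points
    elif gesture == "press_and_hold":
        if notes:
            notes[-1] = position          # move the selected (last) note
    elif gesture == "swipe":
        if position in notes:
            notes.remove(position)        # swipe away an existing note
    return notes

notes = []
handle_gesture("point", notes, (1, 2))
handle_gesture("press_and_hold", notes, (3, 4))  # moves it to (3, 4)
handle_gesture("swipe", notes, (3, 4))           # deletes it
```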
Tomorrow, we’ll be discussing some of our milestones for the rest of the quarter.