Today’s the Big Day!

This is it! Astrocyte’s first presentation is today (we’ll be presenting again at Imagine RIT in May). If you’re in the area, stop by RIT and check it out! We’ll be in building 78 (also known as the Louise M. Slaughter building, or CIMS) from 1-3 PM in room 2240.

If you can’t make it, I’ll be posting photos and updates throughout the day on our Facebook page and Twitter feed.

Within a couple of weeks, we’ll have a finished mini documentary about the project, so keep your eyes peeled!

We’re Coming Down to the Wire!

We’ve had lots going on in the past week and a half!

As far as the actual installation goes, Theo and Bogdan’s design is done, and our Flash prototype is complete and runs on the computer with mouse clicks. Additionally, Matt and Theo got it set up on the projector with the Kinect, so we can replace those mouse clicks with gestures. Our main focus in the next couple of days will be getting this to run smoothly; while it works, there are a few issues with choppiness and multiple users.

Brad and Mekan have gotten eight notes’ worth of music working with the interface, and are now working to add another eight for an extra level of detail. One of our biggest roadblocks was that Ableton was unable to connect with Flash via RIT’s network, but we obtained a router that gets us around the restriction, allowing us to move full steam ahead on campus.

Emma and I have been working on shooting, editing, and compositing both the project’s trailer and the behind-the-scenes mini documentary. The trailer will be ready for the presentation on Monday, while the mini documentary will be ready with the final deliverables and documentation next Thursday.

If you’re in the area, stop by the CIMS building on the RIT campus Monday, February 27th from 1-3 PM to see all the completed team projects!

Project Branding & Final UI Mockup

Astrocyte Imagine RIT Poster

Astrocyte has been branded, and here’s our poster for Imagine RIT! Our instructional gesture posters will be done in a similar visual style.

Additionally, we have a just-about-final UI design. We’ve jumped into the process of pushing the assets over to Flash and getting our mockup skinned.

You’ll notice some changes from the last one, especially with the notes along the rings. For ease of use and programming, each block is now its own note, rather than having to define a note for every single point around the rings as it was set up before.
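To illustrate the simplification, here’s a rough sketch in Python (the names are hypothetical; our actual build is in Flash): each ring holds a fixed list of blocks, and each block carries at most one note, instead of mapping a note to every point around the ring.

```python
# Hypothetical sketch of the simplified note model: one note per block.
# Previously, every angular point on a ring would have needed its own
# note mapping; here a ring is just an ordered list of blocks.

class Block:
    """A single segment of a ring; holds at most one note."""
    def __init__(self, note=None):
        self.note = note  # e.g. a MIDI-style note number, or None if empty

class Ring:
    """A ring divided into a fixed number of blocks."""
    def __init__(self, num_blocks):
        self.blocks = [Block() for _ in range(num_blocks)]

    def place_note(self, index, note):
        self.blocks[index].note = note

    def active_notes(self):
        """Return (block index, note) pairs for every filled block."""
        return [(i, b.note) for i, b in enumerate(self.blocks)
                if b.note is not None]

ring = Ring(8)
ring.place_note(2, 60)  # place middle C on the third block
ring.place_note(5, 64)
print(ring.active_notes())  # [(2, 60), (5, 64)]
```

The upside is that placing, moving, or deleting a note only ever touches one block, which keeps both the interaction and the code simple.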

This afternoon, we’re beginning shooting the longer of our two project videos that we plan to complete along with the rest of the project during week eleven.

New Project Milestones

We just finished another big team meeting, and made two major breakthroughs this week – Matt got Flash responding to the Kinect, and Brad & Mekan got Flash and Ableton communicating.
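For the curious, here’s a rough sketch of the kind of local-network bridge that could carry note events between two programs. Everything in it (the port, the plain-text message format, and Python rather than our actual tools) is an assumption for illustration, not our actual Flash/Ableton setup:

```python
# Hypothetical local-network bridge sketch: the port and message format
# here are assumptions for illustration; the real Flash <-> Ableton
# connection may use a different transport entirely.
import socket

BRIDGE_PORT = 9000  # assumed port

def encode_note_event(ring, block, note):
    """Pack a note event as a simple comma-separated ASCII message."""
    return f"{ring},{block},{note}".encode("ascii")

def send_note_event(sock, ring, block, note, host="127.0.0.1"):
    """Fire a note event at the bridge over UDP."""
    sock.sendto(encode_note_event(ring, block, note), (host, BRIDGE_PORT))

# Usage (requires a listener on the other end):
#   sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
#   send_note_event(sock, ring=1, block=2, note=60)
```

The general idea is just that the interface fires tiny messages and the music software listens for them, which is why a cooperative local network (or our own router) matters so much.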

Here’s our rough plan for the next couple weeks:

  • Our project will be presented on February 17th.
  • For January 31st, the final Flash mockup will be ready to go, all UI states will be designed and ready for code, and our promotional video will be storyboarded.
  • Brad and Mekan will have a Flash music sequencer working by February 3rd.
  • All Kinect gestures will be finalized by February 10th, so that Emma and I can work them into our instructional posters.
  • Instructional posters will be sent to the printer by February 15th.

Amidst these deadlines, I will also be working on gathering materials for competition submissions. It’s crunch time!

Projection Screen

Last week, we got a chance to play with the projector screen we’ll be using in our final exhibit! As mentioned, we decided that we wanted to move the installation to the wall, but we also wanted to eliminate shadow interference. We’re projecting from behind this massive screen, so users can stand in front and play without any problems.

The screen was too massive for me to back up enough to get a full shot – once on its stand, it’s about ten feet tall and twelve feet wide, awesome for multiple users at once. We’ll have to be sure to aim our projection so that everyone can reach the top. A simple fix will be adding some stars along the top of the screen that aren’t necessarily functional, but will keep the illusion of immersion without making the project impossible for shorter people (or, well, anyone really, since no one’s ten feet tall) to use.

We’ve named our installation Astrocyte, after some pretty excellent star-shaped cells in the brain and spinal cord. Emma and I have begun working on branding for the project, with a logo to be completed by this weekend.

Week Five Presentation

We just recently presented our progress to our professors Adam and Nancy and the rest of the team project class. You can click here to view a PDF of the presentation’s slides.

The main talking points of our presentation were the usability test that I posted about last week, our Flash demo, our current UI design progress, and working with Ableton to create and access our project’s music.

You can click here to download the SWF file of the Flash demo Matt created. Essentially, this is a black-and-white wireframe version of our project’s underlying structure that our usability test subjects played with. The rings and stars respond to mouse click events, imitating the effect that Kinect gestures would have in our final product.
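Since the wireframe stands in mouse clicks for the eventual Kinect gestures, the underlying idea is an input layer where either source maps to the same interface actions. Here’s a rough sketch of that idea in Python (all names are hypothetical; the real demo is built in Flash):

```python
# Hypothetical input-abstraction sketch: mouse events and Kinect
# gestures both resolve to the same interface actions, so the demo
# logic doesn't care which input device is in use.

def handle_add(x, y):
    return f"add point at ({x}, {y})"

def handle_move(x, y):
    return f"move point to ({x}, {y})"

def handle_delete(x, y):
    return f"delete point at ({x}, {y})"

ACTIONS = {"add": handle_add, "move": handle_move, "delete": handle_delete}

# Each input source just translates its raw events into action names.
MOUSE_EVENTS = {"click": "add", "drag": "move", "double_click": "delete"}
KINECT_GESTURES = {"point": "add", "press_and_hold": "move", "swipe": "delete"}

def dispatch(mapping, event, x, y):
    """Route a raw input event to the shared interface action."""
    return ACTIONS[mapping[event]](x, y)

print(dispatch(MOUSE_EVENTS, "click", 3, 4))     # add point at (3, 4)
print(dispatch(KINECT_GESTURES, "swipe", 3, 4))  # delete point at (3, 4)
```

Structuring it this way is what lets us test the interaction now, on a laptop, and swap in gestures later without rebuilding the demo.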

This is the most up-to-date version of our After Effects UI mockup, which Theo and Bo have been working on, that we also showed during our presentation. The music in the animation is just a placeholder to demonstrate some of the interface’s visualizer effects.

After our presentation, we managed to make awesome progress on our installation’s actual setup, plus some more UI design work and additional progress on our music. I’ll be putting all of this together in another post soon, so be sure to check back!

Usability Testing

We’ve just gotten back from Christmas break, and we got off to a running start with our first formal usability test today. You can check out all the details by reading the usability report PDF, but here’s an overview of what we were looking for and what we got out of it.

Our goals were to:

  • Decide whether to project the installation on the floor or wall.
  • Assess the most natural gestures for adding, moving, and deleting points on the musical grid.

We had five students who had never had contact with our project before come in and check out a black-and-white projected wireframe. We told each of them that the outer space of the project held the higher notes, the inner rings the middle notes, and the center circle the bass, and that they could place notes as desired within each section, where an invisible metronome would activate them. They weren’t allowed to watch each other participate. We asked each participant to show us what gestures they would instinctively use to add, move, and delete points, and whether they thought the project would be better with more directions or with fewer, to allow for experimentation.
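The “invisible metronome” is essentially a step sequencer sweeping around the rings. Here’s a minimal sketch of the idea in Python (the step count and names are hypothetical, not our actual implementation):

```python
# Hypothetical step-sequencer sketch of the "invisible metronome":
# the playhead sweeps around a ring and triggers whichever notes have
# been placed on the step it passes, wrapping around each lap.

def run_metronome(placed_notes, num_ticks, num_steps=8):
    """placed_notes: dict mapping step index -> note.
    Returns (tick, note) pairs for every step the playhead hits."""
    triggered = []
    for tick in range(num_ticks):
        step = tick % num_steps  # wrap around the ring
        if step in placed_notes:
            triggered.append((tick, placed_notes[step]))
    return triggered

# Users place notes on steps 0 and 3; the metronome hits them each lap.
placed = {0: "bass", 3: "mid"}
print(run_metronome(placed, 16))
# [(0, 'bass'), (3, 'mid'), (8, 'bass'), (11, 'mid')]
```

The appeal of this model for an installation is that users never have to “play” anything in time; they just arrange notes, and the sweep keeps everything rhythmically aligned.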

Before we even started the test, we ended up deciding to go with projecting the installation on the wall. Because the wall projection won’t need to accommodate feet, it can be smaller, eliminating an extra projector. We also won’t have to deal with hanging expensive equipment upside down, and can eliminate potential shadow issues by projecting the installation from behind a screen.

After the test, we figured out the following:

  • We’ll be using a swipe gesture to delete points, and a “press-and-hold” gesture to select and move points. Each participant used a different gesture to add, so we will need to decide which makes the most sense to us – we’ll likely go with a simple pointing gesture.
  • We noticed that participants used their shadows to point to the sections they wanted. We want to incorporate a sort of cursor that will indicate where the users’ hands are while using the installation, since ideally there will be no shadow interference.
  • Our participants unanimously agreed that they felt the installation would be best if users got just a small tutorial ahead of time but were allowed to mostly figure it out for themselves. We’ll be aiming to keep our instructional posters minimal.

Tomorrow, we’ll be discussing some of our milestones for the rest of the quarter.

Projector Progress & More

This past Thursday, our main goal was to see what our project might look like projected – we had aimed to do it a bit sooner, but our equipment was unfortunately uncooperative. We got our hands on a short-throw projector, and while we’ll need to replace it for the final version of the installation since the top left of the image is blurry, it was great to finally get an accurate idea of the potential scale that we’re dealing with. Our personal usability testing got bumped up to this coming Tuesday, and after the holiday break we’ll be letting other people give our demos a shot!

Besides work with the projector, we’ve made some more design progress. Theo put together his own concept of how our interface might look:

We plan to combine some ideas from this with some of Bogdan’s concepts, and then see how it comes out. We’re particularly liking the little stars all along the edges of the interface. If we decide to project on the floor in the final version, they’ll change color and follow users around; if we decide to project on the wall, users will be able to brush through them to make them change.

Brad also put together some sample audio so we can get an idea of how our project will sound in use. Give it a listen here!

Winter Quarter

I hope everyone had a great Thanksgiving! We’re back from break, starting winter quarter, and ready to hit the ground running with our project’s production.

Yesterday we discussed some issues we found with our interface. It seemed that projecting the interface on the floor, with some parts controlled via gestures and others via touch, would create a sort of disconnect, which we want to avoid, since one of our key goals is for the installation to be highly immersive. We talked about two different solutions:

  • Keeping the projected floor interface, but incorporating more touch. The blocks in the center circle might be activated by pressure pads, and the middle rings might be activated by double-tapping the foot or holding the foot in one place for a few seconds, rather than by our previously planned gestures. The outer space would be activated just by moving through it, as usual.
  • Projecting the interface on to a screen, with the projector behind the screen to reduce distortion and the Kinect behind the user, rather than in front of or above. The entire installation would then be gesture controlled, but made so the user could still touch the screen, and special attention would be paid to the rest of the installation’s environment to keep the immersive aspect.

We also talked about some other technologies we could incorporate to take some of the brunt of the work off the Kinect, including working with Arduino boards and infrared sensors. We figure that the fewer things we need to force it to sense, the easier the installation will be to use.

We’ve got a bunch of upcoming goals for the next week:

  • Thursday evening, a few of us will be installing the projector and Kinect in one of the Golisano labs so we can be sure we know how everything hooks up, and to practice safely hanging the equipment top-down in case we decide to keep the project projected on the floor.
  • Tuesday, we’ll be doing some heavy usability testing – Matt and Mekan will be creating some very simple demos that we can control with gestures, and we’ll be trying the gestures projected on the floor and the wall as well as several other options. From there, we’ll move on to having other people play with the demos and getting their feedback as well. This will help us ultimately determine which direction we’re going to take, depending on what feels most natural, comfortable, and intuitive.
  • Emma, Bogdan, Theo, and I will be doing some heavy design work for next week, so that we can pick out our final design direction and start working cohesively. Bogdan will also be taking a look at options for our sound visualizer.
  • Brad will be taking a closer look at our options for music. We can create sounds, but we may also be able to generate sound dynamically through code, so this will take some experimenting with to see what works and sounds the best.