UIST 2011 greatest hits
I went to UIST 2011 in Santa Barbara and presented our research on CrowdForge.
Here's a sample of some of the great work that was presented this year, in the 3 research areas that interest me most: crowdsourcing/human computation, mobile physical computing, and music.
Crowdsourcing and human computation
There was notable work in both workflow-oriented approaches (Jabberwocky, CrowdForge, PlateMate) and synchronous collaborative approaches (Crowds in two seconds, Collabode, Real-time crowd control).
PlateMate
- Presents a workflow-based crowdsourcing system for nutritional analysis of food photographs.
- Implemented in Django/Python, same stack as CrowdForge.
- The authors found out about CrowdForge 90% of the way into their research.
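The general pattern behind these workflow-based systems can be sketched in a few lines of Python. This is my own toy illustration (stub "workers" and made-up stage names, not PlateMate's actual code): each stage fans the same question out to several workers, merges their answers, and feeds the result to the next stage.

```python
from collections import Counter

def merge_votes(answers):
    """Majority vote over worker answers (ties broken arbitrarily)."""
    return Counter(answers).most_common(1)[0][0]

def run_stage(task, workers, merge=merge_votes):
    """Ask every worker the same question, then merge their answers."""
    return merge([worker(task) for worker in workers])

# Stub "workers" standing in for real MTurk HITs.
taggers = [lambda photo: ("salad", "bread"),
           lambda photo: ("salad", "bread"),
           lambda photo: ("salad",)]
measurers = [lambda item: 150, lambda item: 180, lambda item: 150]

foods = run_stage("photo.jpg", taggers)               # stage 1: tag the foods
grams = {f: run_stage(f, measurers) for f in foods}   # stage 2: measure portions
```

The merge step between stages is what makes noisy individual answers usable downstream.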
Real-time crowd control
- Nice research from Jeff Bigham's group.
- Compares different strategies for merging input from multiple users.
- Reminded me of my Twitter Mindstorms project.
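Two of the obvious merging strategies are easy to sketch. This is my own toy version (not the paper's implementation): averaging works for continuous control like steering, while voting works for discrete commands.

```python
from collections import Counter

def merge_average(inputs):
    """Continuous control (e.g. steering angles): blend everyone's input."""
    return sum(inputs) / len(inputs)

def merge_vote(inputs):
    """Discrete commands (e.g. 'left'/'right'): act on the majority."""
    return Counter(inputs).most_common(1)[0][0]
```

Averaging dampens individual noise but can cancel opposing inputs to zero; voting avoids that at the cost of discarding minority input entirely.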
Crowds in two seconds
- Great research from MIT crowdsourcing folks (Michael Bernstein, Rob Miller)
- Retainer: keep workers "on tap", ready to work, by paying them $0.30/hour to wait.
- Rapid refinement: a crowd algorithm that narrows the search space to accelerate results.
- Cool application: take a movie and crowdsource its best moment.
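The rapid refinement idea can be sketched roughly like this (my own simplification, not the paper's algorithm): repeatedly split the candidate pool into regions, let every on-retainer worker pick a region, and recurse into the most popular one.

```python
from collections import Counter

def rapid_refinement(candidates, workers, k=2):
    """Iteratively split the pool into k regions, vote, and recurse on the
    most popular region until a single candidate remains."""
    pool = list(candidates)
    while len(pool) > 1:
        size = max(1, len(pool) // k)
        regions = [pool[i:i + size] for i in range(0, len(pool), size)]
        votes = Counter(worker(regions) for worker in workers)
        pool = regions[votes.most_common(1)[0][0]]
    return pool[0]

# Stub workers: each can spot which region holds the best movie frame.
frames = ["f1", "f2", "best", "f4", "f5", "f6"]
workers = [lambda regions: next(i for i, r in enumerate(regions)
                                if "best" in r)] * 3
```

Because workers only compare regions rather than individual items, each round is fast, which is what makes the approach viable at interactive speeds.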
The Jabberwocky programming environment
- Really ambitious MTurk-like project.
- Combines different worker types from different spheres (social groups, paid workers, machines).
- Full runtime stack (Dog high-level language, ManReduce low-level language, Dormouse runtime).
- Great potential in mixing different spheres of workers, but a rather intimidating project... scope creep!
More in the Jabberwocky paper.
Real-time collaborative coding in a web IDE
- More awesome work from MIT (Max Goldman, Greg Little, Rob Miller)
- Web-based collaborative Java editor in the spirit of EtherPad.
- Main problem: collaborators leave code in a semi-working state, so when should edits sync?
- Idea: automatic error-aware integration. Auto-sync code when it compiles (or when tests pass).
More on the Collabode site.
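The error-aware integration idea is simple enough to sketch. Collabode does this for Java; here I use Python's built-in compile() as a stand-in compiler (my own toy, not their code): a collaborator's buffer only replaces the shared copy once it compiles cleanly.

```python
def try_sync(shared, buffer):
    """Return (new_shared, synced): adopt the buffer only if it compiles."""
    try:
        compile(buffer, "<buffer>", "exec")
        return buffer, True
    except SyntaxError:
        return shared, False

shared = "def f():\n    return 1\n"
broken = "def f(:\n"                  # mid-edit, doesn't compile -> held back
fixed = "def f():\n    return 2\n"    # compiles -> synced to everyone

shared, ok1 = try_sync(shared, broken)
shared, ok2 = try_sync(shared, fixed)
```

Gating on compilation means collaborators never see each other's half-typed edits, at the cost of some lag between keystroke and sync.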
Mobile physical computing
Interesting work in mobile virtual and projected UI spaces.
PocketTouch
- Great work from Microsoft Research that lets you use your phone through fabric (while it's in your pocket).
- Since in-pocket orientation is unclear, there's an orientation-setting gesture (I'm not sold on this).
- Unfortunately, typical front jeans pockets work rather poorly.
Multi-user Interaction with Handheld Projectors
- Most projector-based systems require a fixed installation; this one is fully mobile.
- The system produces a visible projection (for the image) and an invisible projection (for tracking data).
- Camera tracks location of nearby projections
- Cool applications: virtual boxing, content transfer, 3D model viewing.
To get a better sense of the project, take a look at this video.
Imaginary Phone
- Lets you control a mobile phone without taking it out of your pocket.
- Maps iPhone controls onto the palm of your hand, similar to the Guidonian hand.
- Touchscreen progression: fingers replaced styli; next step, the palm replaces the phone (a bit of a stretch).
OmniTouch
- Shoulder-mounted depth camera and projector.
- Works on all sorts of surfaces; heavy math for tracking and projection.
- Tracks "hover" (a notion I hadn't seen before) and "touch" states.
Music
I had the pleasure of hearing Ge Wang speak about some of his older projects, including Ocarina, Leaf Trombone, the Stanford Mobile Phone Orchestra, and others. Notably, I hadn't really seen the ChucK language in action, and I'd be interested in seeing whether the Web Audio API could support this sort of thing. Browser-based ChucK, anyone?
The last piece of research I really enjoyed was "onNote: playing printed music scores as a musical instrument". The idea is to use OMR techniques for markerless tracking of sheet music, using the score itself for positioning and a finger for pointing. The other neat application was supporting compositional remixing by literally cutting up sheet music and splicing it back together.
This was my first UIST, and though I'm unlikely to have new research to submit for UIST 2012, I'm seriously considering going anyway, just to stay on top of the great work that this vibrant community generates.