Each day of NIME was a completely new experience. Paper sessions ran from Tuesday through Thursday, and each day was divided into smaller sessions with distinct focus areas, such as
- Motion and Gesture,
- Collaborative Music Making,
- Robotic Systems,
- Tangible Interaction / Interfaces,
- and plenty more.
I was particularly interested in the Motion and Gesture session, as its applications to music therapy are more than promising. For those of you who haven’t heard of the company SoundTree, head over to the page on their SoundBeam product. In short, SoundBeam is a system that translates movements into electronic music (soundtree.com). It removes much of the complication of composing and refining a musical piece.
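To make that idea concrete, here’s a toy sketch of the kind of mapping such systems perform: a sensor reading (say, a hand’s position in a beam) gets quantized onto a musical scale, so any movement produces a consonant note. This is my own illustration, not SoundBeam’s actual algorithm, and the scale and numbers are made up.

```python
# Toy movement-to-music mapping: quantize a normalized sensor reading
# (0.0-1.0) onto a pentatonic scale so every movement lands on a
# consonant note. Illustrative only -- not SoundBeam's algorithm.

C_MAJOR_PENTATONIC = [60, 62, 64, 67, 69, 72, 74, 76]  # MIDI note numbers

def position_to_note(position: float) -> int:
    """Map a normalized position in the beam to a note in the scale."""
    position = min(max(position, 0.0), 1.0)  # clamp out-of-range readings
    index = int(position * (len(C_MAJOR_PENTATONIC) - 1))
    return C_MAJOR_PENTATONIC[index]

# Simulated readings from a hand sweeping through the beam.
for reading in (0.05, 0.25, 0.5, 0.75, 0.95):
    print(f"{reading:.2f} -> MIDI note {position_to_note(reading)}")
```

Because the mapping snaps to a scale, a user can’t play a “wrong” note, which is exactly what makes these systems so forgiving in a therapy setting.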
But how can we improve these capabilities in mobile applications? Cost is often the primary reason both everyday users and those with special needs cannot access “dedicated” systems. As Stu Favilla and Sonja Pedell stated, mobile apps make electronic music creation tools much more accessible while still preserving a high enough level of complexity to avoid patronizing users (see their paper on NIMEs and Dementia).
Luke Dahl’s paper on acceleration and improved motion sensing is a good starting point.
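For a sense of what the raw problem looks like: phone accelerometers are noisy, so even a simple gesture detector has to smooth the signal before thresholding it. Below is a generic baseline of my own (a moving average plus a rising-edge threshold), not the method from Dahl’s paper; the window size, threshold, and sample values are all made up.

```python
# Baseline gesture-onset detection from noisy accelerometer magnitudes:
# smooth with a short moving average, then report rising edges past a
# threshold. A generic sketch, not the method from Dahl's paper.
from collections import deque

def detect_onsets(samples, window=4, threshold=1.5):
    """Yield indices where the smoothed magnitude first exceeds threshold."""
    recent = deque(maxlen=window)   # sliding window of recent samples
    above = False                   # are we currently inside a gesture?
    for i, magnitude in enumerate(samples):
        recent.append(magnitude)
        smoothed = sum(recent) / len(recent)
        if smoothed > threshold and not above:
            above = True
            yield i                 # rising edge: a new gesture onset
        elif smoothed <= threshold:
            above = False

# Simulated |acceleration| stream (in g) containing two sharp "hits".
stream = [1.0, 1.0, 1.1, 2.8, 3.0, 1.2, 1.0, 1.0, 2.6, 2.9, 1.1]
print(list(detect_onsets(stream)))  # -> [4, 9], one onset per hit
```

The appeal of work like Dahl’s is in improving on exactly this kind of baseline: getting lower latency and fewer false triggers out of the same cheap sensors that ship in every phone.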