Don’t just let music move you; move to make music.
We've created a smart technology that lets your instrument transform movement, orientation, and momentum into audio and visual effects. It's essentially wearable technology for musicians, dancers, digital interactive and performance artists, and more. Bringing a new technology to the world is, as you know, a massive undertaking that requires a vast community to stand behind the idea and make it a reality. Please check us out on Kickstarter and consider supporting us: kck.st/1m9M54y
If you think this project can change the way we make music, please consider helping us further: tell people you know about it, encourage them to support us, and post our link on Facebook and elsewhere.
DJs, classically trained musicians, and experimental pioneers have all done incredible things with eMersion. Check out what these musicians have done with our gesture-control system all over the world:
Efforts to extend the traditions of musical performance with the endless capacities of interactive computing go back to the 1960s and Gordon Mumma's CyberSonics. Since then, computing and real-time processing have evolved at an ever-accelerating pace, challenging the boundaries of the real world and what we perceive to be magical. In a world of "big data," our community often faces having to write custom programs to handle an unpredictable number of controller data streams, process them, and route them to musical parameters. This task becomes even more complex with wireless "swarm sensing" interfaces like eMersion.
For this project, MaxMSP/Jitter was used to create a series of stand-alone apps that visualize and route wireless "swarm sensor" data from the eMersion system.
The software client emulates the layout of software most electronic musicians already know: the Digital Audio Workstation (DAW). The problem it solves: how does one handle an unpredictable number of controller data streams, and an equally unpredictable number of musical parameters to route those streams to, without having to design or modify code from scratch?
Answer: the Digital Data Workstation (DDW).
Like Pro Tools, Logic, or GarageBand, the DDW lets musicians create "tracks." The user then selects from a list of available controller inputs and can assign one control input to many tracks, or different inputs to different tracks. Each track's data stream can be processed independently (filtering, beat tracking, threshold detection, rescaling, etc.). Finally, each track is assigned an output from an available list. Output types include MIDI, OpenSoundControl, UDP, DMX (light control), and emulated keyboard and mouse events (to control virtually any application on your computer). You can also create custom inputs and outputs to populate these lists using eMersion Tools (Max abstractions that extend the capacities of the eMersion software).
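As a rough illustration, the input → process → output pipeline that a track implements might look like the following Python sketch. The actual DDW is built in MaxMSP, and every name here (`Track`, `rescale`, `threshold`, the source strings) is hypothetical, not eMersion's API:

```python
from dataclasses import dataclass, field
from typing import Callable, List

# A processing stage transforms one incoming sample (a float) into another.
Processor = Callable[[float], float]

def rescale(in_lo: float, in_hi: float,
            out_lo: float, out_hi: float) -> Processor:
    """Map values from [in_lo, in_hi] to [out_lo, out_hi]."""
    def _proc(x: float) -> float:
        t = (x - in_lo) / (in_hi - in_lo)
        return out_lo + t * (out_hi - out_lo)
    return _proc

def threshold(level: float) -> Processor:
    """Gate: emit 1.0 once the input reaches the level, else 0.0."""
    return lambda x: 1.0 if x >= level else 0.0

@dataclass
class Track:
    """One DDW-style track: a named input, a processing chain, an output."""
    source: str                               # e.g. "sensor-1/tilt"
    chain: List[Processor] = field(default_factory=list)
    output: Callable[[float], None] = print   # stand-in for a MIDI/OSC sender

    def feed(self, value: float) -> None:
        for proc in self.chain:
            value = proc(value)
        self.output(value)

# One control input may drive many tracks, each processed differently:
results: List[float] = []
track_a = Track("sensor-1/tilt", [rescale(0, 1, 0, 127)], results.append)
track_b = Track("sensor-1/tilt", [threshold(0.5)], results.append)
for t in (track_a, track_b):
    t.feed(0.75)
# results now holds the MIDI-range value (95.25) and the gate value (1.0)
```

The same sensor value fans out to a continuous MIDI-style control on one track and a trigger on another, which is the one-input-to-many-tracks routing described above.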
Finally, the "session" configuration can be saved and recalled later, enabling quick changes from one musical work or section to another.
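A session of this kind could be captured as simply as a JSON document recording each track's input, processing parameters, and output. This is a hypothetical sketch of such a format, not eMersion's actual session file:

```python
import json

# Hypothetical session: each track records its source, processing chain,
# and output so the whole configuration can be recalled later.
session = {
    "name": "duet-section-2",
    "tracks": [
        {"source": "sensor-1/tilt",
         "process": [{"type": "rescale", "args": [0, 1, 0, 127]}],
         "output": {"type": "midi", "channel": 1, "cc": 74}},
        {"source": "sensor-2/accel-x",
         "process": [{"type": "threshold", "args": [0.5]}],
         "output": {"type": "osc", "address": "/kick/trigger"}},
    ],
}

# Save the session to disk...
with open("session.json", "w") as f:
    json.dump(session, f, indent=2)

# ...and recall it later for a quick change between pieces or sections.
with open("session.json") as f:
    restored = json.load(f)

assert restored == session
```

Because the routing is pure data, switching sections mid-performance is just loading a different file rather than rewiring a patch.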
Thom Jordan of Georgia Tech writes:
"One of the many exciting possibilities of the use of eMersion is to be able to conserve and redirect creative energy that otherwise would be wasted in the absurdity of having to simultaneously perform low level tech troubleshooting while trying to reach a higher level of artistry. It would be a pleasure to have my tools and materials be responsive enough to consistently handle the dynamic interplay and radical changing of ideas that often accompanies a semiotic mindset while performing/composing. I'm refactoring and expanding the code base now for my particular composing world towards achieving these ends, and am thrilled to begin thinking of the possibilities of interfacing with this expanding world through wearable controllers and effective mapping schemes.
3 Cheers for Wearable Technology !"