gestural synthesis

Video: YouTube: Drancing accelerometer music with Wiimotes: 3D variable frequency oscillators + amplitude variation + triggered "air drum" samples


Demonstrates combined oscillator frequency variation, amplitude variation, and triggered drum sample ("Drumming by Dancing") modes, along with Drancel RGB monitor visuals projected onto the "Drancer" performer.
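The three modes demonstrated in the video can be sketched as simple signal mappings. This is an illustrative sketch only; the function names, scaling constants, and thresholds are assumptions for the example, not the actual DranceWare implementation:

```python
# Hypothetical sketch of the three Drancing mapping modes shown in the video:
# oscillator frequency variation, amplitude variation, and threshold-triggered
# "air drum" samples. Inputs are conditioned acceleration magnitudes in g.

def frequency_mode(accel_g, base_hz=220.0, span_hz=440.0):
    """Map acceleration magnitude to a variable oscillator frequency."""
    return base_hz + span_hz * min(abs(accel_g), 1.0)

def amplitude_mode(accel_g, max_amp=1.0):
    """Map acceleration magnitude to oscillator amplitude (clipped to 1 g)."""
    return max_amp * min(abs(accel_g), 1.0)

def drum_trigger(accel_g, threshold_g=1.5):
    """Fire a drum sample when acceleration exceeds a strike threshold."""
    return abs(accel_g) > threshold_g

# Example: a sharp ~2 g wrist flick triggers the drum and pushes the
# oscillator to its frequency ceiling (base + span = 660 Hz here).
print(frequency_mode(2.0))  # 660.0
print(drum_trigger(2.0))    # True
```

A gentle sway would instead modulate frequency and amplitude smoothly without ever crossing the drum threshold.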

For this demonstration, two hand-held Wii™ Remotes were used with a MacBook Pro. (The original Drancing accelerometer suit, in use since 1997, used 5 triaxial accelerometers in a "body star" pattern.)

Visit the Drancel channel on Vimeo for 4K videos of the Drancel body motion light synthesis system, and buy quality framed prints of high-resolution macro photos of Drancel images on the Webel SmugMug photo channel!

Terminology: A Drancer is an "acceleration avatar" of the Drancing accelerometer music and visuals system

Because Dr Darren's Drancing accelerometer music and visuals synthesis system employs acceleration specifically, it has a special kind of "acceleration avatar" called a Drancer. Typically, one Drancing performer (also known as a Drancer) wears 5 triaxial accelerometers in a "body star" pattern (reminiscent of Da Vinci's Vitruvian Man), which map to 3D Drancel RGB virtual synthesis atoms in the Drancer's acceleration avatar.

However, unlike many motion capture animation systems (such as those used in big budget films), where the emphasis is on spatial location, the Drancing system focuses "organically" on acceleration signals. So although the Drancer avatar can be represented in a physical space to remind us of the body star configuration, the mapped positions and perturbations (driven by conditioned accelerometer signals) of each virtual Drancel RGB in synthesised Drancing visuals are not intended to precisely reflect spatial positions.
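One way to picture this signal-to-visual mapping is as a colour perturbation per atom. The following is purely illustrative (the function, gain, and axis-to-channel assignment are assumptions, not the actual DranceWare mapping): each conditioned triaxial signal perturbs the colour of its Drancel RGB atom, with the three axes driving the red, green, and blue channels independently:

```python
# Hypothetical sketch: a conditioned triaxial accelerometer signal perturbs
# the colour of one virtual Drancel RGB atom (x -> R, y -> G, z -> B).

def perturb_rgb(base_rgb, accel_xyz, gain=0.5):
    """Return the atom's colour perturbed by acceleration, with each
    channel clamped to the valid [0, 1] range."""
    clamp = lambda v: max(0.0, min(1.0, v))
    return tuple(clamp(c + gain * a) for c, a in zip(base_rgb, accel_xyz))

# A burst of x-axis acceleration pushes a grey atom toward red:
print(perturb_rgb((0.2, 0.2, 0.2), (1.0, 0.0, 0.0)))  # approx (0.7, 0.2, 0.2)
```

Note that nothing here depends on where the sensor is in space, only on how it is being accelerated, which matches the "organic" emphasis described above.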

So why does Drancing embrace acceleration alone in this way? Because acceleration, unlike velocity or position, does not need a reference system. This means the Drancing system is in principle (and by design) usable anywhere, anytime: over a very large region, while riding a bike, exercising, dancing, or jogging, even underwater or in space.
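The "no reference system" point can be illustrated numerically (a toy sketch, not Drancing code): the magnitude of a triaxial acceleration vector is unchanged by any rotation of the sensor, so the same gesture yields the same control signal no matter how the performer is oriented in space:

```python
import math

def magnitude(ax, ay, az):
    """Orientation-independent acceleration magnitude from a triaxial sensor."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def rotate_z(ax, ay, az, theta):
    """The same physical acceleration, read in a rotated sensor frame."""
    c, s = math.cos(theta), math.sin(theta)
    return (c * ax - s * ay, s * ax + c * ay, az)

a = (0.3, 0.4, 1.2)                    # some gesture, in g
b = rotate_z(*a, math.radians(37.0))   # same gesture, sensor turned 37 degrees
print(abs(magnitude(*a) - magnitude(*b)) < 1e-12)  # True: frame-free signal
```

Position- or velocity-based capture, by contrast, must be integrated or measured against an external frame (cameras, beacons), which is exactly what Drancing avoids.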

UML2 signal processing models

The content or the technology discussed here is HISTORICAL or ARCHIVAL

I present here, as adapted Unified Modeling Language™ (UML™) structure diagrams, a simplified model of the hardware, data acquisition, and JSyn audio synthesis circuits of an earlier version of DranceWare for Drancing.


Dr Darren of Webel is a Pure Data (PD) enthusiast. A PureData-based audio synthesis version of the DranceWare software for the Drancing accelerometer music "air instrument" uses GEM to synthesise rich visuals in real time from accelerometer signals (such as Wii™ Remote signals), synchronised with the accelerometer music. The PLAY music project "sound circles" animated logo is synthesised using PureData/GEM, as described in this mini-tutorial.
