AudioPixel
[http://audiopixel.com AudioPixel.com]

[http://vimeo.com/67036173 Control Tower video with some screenshots and pre-viz]
== The Rundown ==
AudioPixel is written in Java. It started as a project for mutant vehicle lighting and has grown into a platform that processes many kinds of input and output data.
The software provides an abstraction layer in which any unit (DMX, MIDI, UDP, etc.) can declare its existence and be handled as a mapped node in the 3D pixel-mapped universe.
We run 'clips', which can be custom scripts, algorithms, or any other type of content, and which centrally drive the behavior of all hardware. Control is mapped in a 3D universe in which any type of hardware can be added and intelligently synchronized, so it can be driven by any type of input.
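As a rough sketch of the idea (not AudioPixel's actual code; all class and method names here are hypothetical), fixtures of any protocol register as nodes at 3D positions, and a clip is just a function that yields a color for a position at a point in time:

```java
import java.util.ArrayList;
import java.util.List;

// A clip yields a packed 0xRRGGBB color for a 3D position at a given time.
interface Clip {
    int colorAt(double x, double y, double z, double timeSec);
}

// A fixture of any protocol (DMX, MIDI, UDP, ...) registers as a node
// at a position in the mapped universe.
class PixelNode {
    final double x, y, z;
    PixelNode(double x, double y, double z) { this.x = x; this.y = y; this.z = z; }
}

class Universe {
    private final List<PixelNode> nodes = new ArrayList<>();
    void add(PixelNode n) { nodes.add(n); }

    // One frame: evaluate the clip at every mapped node, regardless of
    // what protocol each node's hardware speaks.
    int[] render(Clip clip, double timeSec) {
        int[] frame = new int[nodes.size()];
        for (int i = 0; i < nodes.size(); i++) {
            PixelNode n = nodes.get(i);
            frame[i] = clip.colorAt(n.x, n.y, n.z, timeSec);
        }
        return frame;
    }
}
```

Because clips only see positions, the same clip can drive a chandelier, an LED strip, or a mover without knowing which is which.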
During live stage shows we use a MIDI DJ controller to quickly activate and blend effects, generally without touching the GUI. For example, when controlling DMX movers, the animation color data is interpreted as a rotation of the fixture's color wheel, while the same data is converted into proper RGB values for the LED units.
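A minimal sketch of that per-fixture interpretation, assuming a fixture adapter interface and an imaginary 4-slot color wheel (none of these names are AudioPixel's real API):

```java
// Each fixture type translates the same packed 0xRRGGBB animation color
// into its own channel values.
interface Fixture {
    int[] channels(int rgb);
}

// LED pixel: the color maps directly to R, G, B channel values.
class RgbLed implements Fixture {
    public int[] channels(int rgb) {
        return new int[] { (rgb >> 16) & 0xFF, (rgb >> 8) & 0xFF, rgb & 0xFF };
    }
}

// DMX mover: approximate the color by snapping to the nearest slot on a
// fixed color wheel and sending that slot's DMX value on the wheel channel.
class DmxMover implements Fixture {
    // (DMX value, packed color) pairs for an imaginary 4-slot wheel.
    private static final int[][] WHEEL = {
        {0, 0xFFFFFF}, {32, 0xFF0000}, {64, 0x00FF00}, {96, 0x0000FF}
    };

    public int[] channels(int rgb) {
        int best = 0;
        long bestDist = Long.MAX_VALUE;
        for (int[] slot : WHEEL) {
            long d = dist(rgb, slot[1]);
            if (d < bestDist) { bestDist = d; best = slot[0]; }
        }
        return new int[] { best };
    }

    // Squared distance between two packed colors in RGB space.
    private static long dist(int a, int b) {
        long dr = ((a >> 16) & 0xFF) - ((b >> 16) & 0xFF);
        long dg = ((a >> 8) & 0xFF) - ((b >> 8) & 0xFF);
        long db = (a & 0xFF) - (b & 0xFF);
        return dr * dr + dg * dg + db * db;
    }
}
```

The animation engine stays hardware-agnostic; only the adapter knows whether "red" means channel values (255, 0, 0) or a wheel rotation.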
We can set up playlists and timelines to run through libraries of programmed scenarios. For a custom club installation we built banks of effects to switch between, e.g. Mellow, Dancefloor-Focus, and so on; the system runs through presets automatically if nobody is actively using the software. Last year we ran the [http://audiopixel.com/2012/nexus-burning-man-2012 Nexus Burning Man 2012] stage all week long, almost entirely in an audio-reactive auto mode: zero crashes, and it never looked stale. It's easy to hook up any controller or sensor and define additional behaviors for new hardware as needed.
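The operator-idle fallback described above could be sketched like this, with all names and timings illustrative rather than taken from the actual software:

```java
// Steps through a bank of presets automatically, but only while the
// operator has been idle longer than a threshold. Hypothetical sketch.
class AutoMode {
    private final String[] presets;
    private final long idleThresholdMs;
    private final long presetDurationMs;
    private long lastInputMs = 0;
    private long lastAdvanceMs = 0;
    private int index = 0;

    AutoMode(String[] presets, long idleThresholdMs, long presetDurationMs) {
        this.presets = presets;
        this.idleThresholdMs = idleThresholdMs;
        this.presetDurationMs = presetDurationMs;
    }

    // Called whenever a controller, sensor, or GUI event arrives.
    void onOperatorInput(long nowMs) { lastInputMs = nowMs; }

    // Called once per frame; advances to the next preset only while the
    // operator is idle and the current preset has played long enough.
    String tick(long nowMs) {
        boolean idle = nowMs - lastInputMs > idleThresholdMs;
        if (idle && nowMs - lastAdvanceMs > presetDurationMs) {
            index = (index + 1) % presets.length;
            lastAdvanceMs = nowMs;
        }
        return presets[index];
    }
}
```

Any operator input resets the idle timer, so manual control always wins and the playlist resumes on its own once the booth goes quiet.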
We're happy to adapt the software to talk to as much of the tower as possible, through published events or otherwise.
== Interaction Scenario ==
Coming soon
Revision as of 21:44, 14 July 2013