
2024 Stage Hack
Dance Diffusion
Pt 2, “conflict”, of Diffusion_Dance, a mix of generative computational art and live performance that @misentropic and @christopher_t_linder developed as a hackathon project with their team of 4 at MIT a few weekends ago. In addition to the dancers, whose movement, depth, and visual aesthetic are tracked, a team of 2 sound designers and 1 visual designer manipulates and essentially live-mixes the visual and sound variables of the piece. Mini clip of the tech team at the end of this bit!

“diffusion_dance is a live audiovisual performance built in 48 hours for Boston Tech Poetics Stagehack 2024. The narrative arc speaks to the relationship between humanity and technology, combining generative AI visuals, reactive audio, and the freedom of dance. The dancers constantly watch how their movements are interpreted by the diffusion model and by multiple Kinect depth/position sensors. This live feedback allows them to improvise in dialogue with our tools, embodying the process of integration between human and artifice.”

Check out the full video on YouTube
Lumen Flow
2023: Various Locations
Our first generative project, Lumen Flow uses a camera to pick up movement as people walk by. The tracked motion is translated into particles that wave and flow across the wall, creating an interactive visual effect.
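The installation's actual code isn't shown here, but the motion-to-particles idea can be sketched roughly: compare consecutive camera frames, spawn particles where pixels changed, then let them drift across the wall. This is a minimal illustration with simple frame differencing standing in for whatever tracking Lumen Flow really uses; the function names, thresholds, and velocity model are all hypothetical.

```python
import numpy as np

def spawn_particles(prev_frame, frame, threshold=30, max_new=50, rng=None):
    """Spawn particles where pixels changed between two grayscale frames.

    Frame differencing is a stand-in for the installation's real tracking.
    Each particle is a row [x, y, vx, vy].
    """
    rng = rng or np.random.default_rng(0)
    diff = np.abs(frame.astype(int) - prev_frame.astype(int))
    ys, xs = np.nonzero(diff > threshold)  # pixels that moved
    if len(xs) == 0:
        return np.empty((0, 4))
    idx = rng.choice(len(xs), size=min(max_new, len(xs)), replace=False)
    # Drift rightward across the wall with a little random jitter.
    vel = rng.normal([1.0, 0.0], 0.3, size=(len(idx), 2))
    return np.column_stack([xs[idx], ys[idx], vel])

def step(particles, width, height, damping=0.99):
    """Advect particles one tick and drop any that leave the wall."""
    particles = particles.copy()
    particles[:, :2] += particles[:, 2:]   # move by velocity
    particles[:, 2:] *= damping            # slow down gradually
    inside = ((particles[:, 0] >= 0) & (particles[:, 0] < width) &
              (particles[:, 1] >= 0) & (particles[:, 1] < height))
    return particles[inside]

# Synthetic "camera" frames: a bright blob appears, so particles spawn there.
h, w = 120, 160
prev_frame = np.zeros((h, w), np.uint8)
frame = prev_frame.copy()
frame[40:60, 30:50] = 255  # region where someone "walked by"
p = spawn_particles(prev_frame, frame)
for _ in range(10):
    p = step(p, w, h)
print(len(p), "particles flowing")
```

In a live setting the synthetic frames would be replaced by grayscale webcam captures, and the particle positions would be rendered to the projected wall each tick.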

