Updated: Apr 15, 2021
In this sketch, I iterate on my previous version of the Interactive Light.
Special thanks to Serge, Erica, and Peggy
For this iteration, I removed the particles and focused on the Microsoft Kinect's ability to drive the interaction. I mapped data points from the user's hands, which in turn drive the tracing behavior. I also removed the light aspect and, using a blend mode, provided a color palette that is visually enticing and fun to draw with.
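As an illustration of how a blend mode combines colors, here is a minimal Python sketch of the common "screen" blend formula. The piece doesn't state which blend operator it uses, so treat this as an assumed example rather than the actual setup:

```python
def screen_blend(base, layer):
    """'Screen' blend: result = 1 - (1 - a) * (1 - b) per channel.
    It only ever brightens, which suits glowing, trace-like drawing."""
    return tuple(1 - (1 - a) * (1 - b) for a, b in zip(base, layer))

# A dim red trace layered over a dim blue background brightens both.
combined = screen_blend((0.2, 0.0, 0.5), (0.6, 0.1, 0.0))
```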
The interactive component relies on the Microsoft Kinect (set to player index), which identifies the figure and lets me start mapping out the points for the hands.
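Once the player is identified, the hand channels have to be remapped into screen space before they can draw anything. Skeleton data is often reported in a normalized range, so a sketch of that remapping might look like the following; the coordinate range and resolution here are assumptions, not values from the actual network:

```python
def map_range(v, in_lo, in_hi, out_lo, out_hi):
    """Linearly remap v from [in_lo, in_hi] to [out_lo, out_hi]."""
    t = (v - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

def hand_to_screen(hand_x, hand_y, width=1280, height=720):
    """Assume the sensor reports hand positions in roughly [-1, 1]
    on both axes; flip y so raising a hand moves the point up."""
    sx = map_range(hand_x, -1.0, 1.0, 0, width)
    sy = map_range(hand_y, -1.0, 1.0, height, 0)
    return sx, sy
```

A hand held at the sensor's center would land at the middle of the canvas.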
The audio-reactive element shows up in the color inside the silhouette. I set up a generative visualizer that is revealed within the figure, and the visualizer dances or morphs based on the audio.
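One simple way to make a visualizer react to audio is to take the loudness of the incoming buffer and drive a color parameter with it. This is a hypothetical sketch of that idea, not the actual visualizer: the `rms` and `level_to_hue` names and the hue ranges are my own assumptions:

```python
import math

def rms(samples):
    """Root-mean-square level of an audio buffer (0..1 for normalized audio)."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def level_to_hue(level, base_hue=0.6, swing=0.3):
    """Shift the palette's hue by the audio level, so louder
    passages push the silhouette's color away from the base hue."""
    return (base_hue + swing * level) % 1.0
```

Per frame, the current buffer's `rms` would be fed into `level_to_hue`, and the result used to color whatever geometry is masked inside the figure.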
I applied a network similar to the Interactive Light piece, but instead of tracking light, the tracing follows the user's hands. To make this happen, I used several math nodes that pinpoint the hands' movement, creating a tracking point from which the lines are drawn.
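Raw skeleton data tends to jitter, so a tracking point usually gets smoothed before lines are drawn from it. As a stand-in for the math/filter nodes in the network, here is a minimal exponential-smoothing sketch; the class name and smoothing factor are assumptions for illustration:

```python
class TracePoint:
    """Exponentially smoothed tracking point for one hand.
    Lines would be drawn between successive update() results."""

    def __init__(self, alpha=0.3):
        self.alpha = alpha  # 0..1; lower = smoother but laggier line
        self.pos = None

    def update(self, x, y):
        """Blend the new raw sample toward the running position."""
        if self.pos is None:
            self.pos = (x, y)
        else:
            px, py = self.pos
            self.pos = (px + self.alpha * (x - px),
                        py + self.alpha * (y - py))
        return self.pos
```

Feeding each frame's raw hand coordinates through `update()` yields a steadier path, which keeps the drawn lines from scribbling with sensor noise.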
This experience is COVID-friendly in that only one user can interact with it at a time. However, I noticed that groups were unafraid to participate with each other. Reflecting on this, I realize that a two-user experience may work better.
I found it fascinating that most users stick with very symmetrical movements; their arms generally mirror each other to create the shapes drawn on the screen. I also found the users' movement interesting, as they used their whole bodies to paint pictures.