The hologram project challenges Joselit’s ideas of the surface by augmenting a physical space. In most augmented or mixed reality experiences, a device such as a phone places digital objects within the physical space, viewed through the phone’s lens. This project pushes the concept further by bringing the digital object into the physical space itself as a projected sculpture.
The Holo Leap dancing girl hologram is another iteration of the previous two holograms. The key difference is that a Leap Motion optical sensor is used instead of the Microsoft Kinect.
The Leap Motion is an optical hand-tracking controller that captures the movement of the hands, allowing for nearly seamless interactivity with digital devices. For this experience, I paired Touch Designer with the Leap Motion to control the movement of the dancing girl. Participants could hover their hands over the sensor to trigger various dance moves and music.
The character is Michelle from Adobe Mixamo, an online platform offering pre-rigged characters and animations for film and video games. Using a pre-rigged character made her easier to program, and the Mixamo library includes a range of pre-made animations available for download. The character was placed in Unity with proper texture and lighting, and the animations were recorded as MP4 files. The hologram was then mapped in Adobe After Effects. After rendering, a specific value was assigned to each hand movement read by the Leap Motion: when a participant’s hand moved into a given position, it triggered that value and the corresponding dance played. When no one was interacting, an idle animation played instead.
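The trigger logic described above can be sketched in a few lines of Python (the scripting language Touch Designer itself uses). This is a hypothetical illustration, not the project’s actual code: the clip names and height thresholds are placeholders, and the real build read the Leap Motion values inside Touch Designer.

```python
# Sketch of the Leap Motion trigger logic: the palm position reported by
# the sensor is bucketed into zones, and each zone maps to a pre-recorded
# dance clip. All names and thresholds below are illustrative.

DANCE_CLIPS = ["dance_a", "dance_b", "dance_c"]  # hypothetical clip names
IDLE_CLIP = "idle"

def pick_animation(palm_height_mm):
    """Return the clip to play for a palm height (None = no hand seen)."""
    if palm_height_mm is None:
        return IDLE_CLIP          # no participant: loop the idle animation
    if palm_height_mm < 100:      # low hover
        return DANCE_CLIPS[0]
    if palm_height_mm < 200:      # mid hover
        return DANCE_CLIPS[1]
    return DANCE_CLIPS[2]         # high hover
```

The idle fallback is what produces the "waiting" behavior discussed below: whenever the sensor reports no hand, the character returns to her idle loop.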
This gives the illusion that she is waiting for someone to interact with her. It also challenges Joselit’s idea of the surface in that the object literally sits above the screen, in physical space, and the viewer can hold an interactive dialogue with her through hand movements. As with the Interactive Wall, this makes users more aware of their hand movements as they interact with the character.
In this iteration, the project uses the Microsoft Kinect to capture the movements of an actor, so the hologram mimics the actor’s movement in real time. The concept was to build a holographic AI system; due to engineering limitations, however, it became a performance piece rather than a true AI system. The viewer can talk to and interact with the hologram through a behind-the-scenes controller, receiving real-time responses delivered by the actor. This ties back to dialogic aesthetics in that viewers, as a group, have an opportunity to discuss the advancements and limitations of this sort of technology.
The design process is similar to the previous iteration. Using the Interactive Hologram project as a road map, a model is imported into Unity with a holographic shader that gives the illusion of a hologram. The orientation and mapping are exactly the same. The main difference is that actions are not pre-programmed: using the Kinect’s built-in skeletal tracking (via the Kinect v2 Examples with MS-SDK and Nuitrack SDK asset), the system tracks the actor’s movement in real time.
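The real-time mimicry amounts to a retargeting loop: each frame, the tracked joint data from the actor is copied onto the avatar’s rig. A minimal Python sketch of that idea, assuming the tracker delivers a per-frame dictionary of joint rotations (in the actual project this happens inside Unity via the Kinect asset; joint names here are placeholders):

```python
# Sketch of a per-frame retargeting step: copy each confidently tracked
# joint from the actor's skeleton onto the avatar, leaving untracked
# joints at their previous pose. Joint names are illustrative.

TRACKED_JOINTS = ["spine", "neck", "left_arm", "right_arm"]

def retarget(kinect_frame, avatar_pose):
    """Copy tracked joint rotations from the actor onto the avatar pose."""
    for joint in TRACKED_JOINTS:
        if joint in kinect_frame:           # joint was tracked this frame
            avatar_pose[joint] = kinect_frame[joint]
    return avatar_pose
```

Keeping the previous pose for untracked joints is what lets the hologram stay stable when the Kinect momentarily loses part of the actor’s body.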
“Pay no attention to that man behind the curtain”
-Wizard of Oz (1939)
For the setup, all of the hardware was hidden behind a wall. A computer with the Kinect attached was set up behind the scenes, with the actor present. The Kinect tracks the actor, allowing real-time responses, and Unity renders the hologram on a connected monitor. Through the laptop’s built-in camera the actor can see her own movement, and a microphone system carries her voice. On the other side of the wall, a webcam relays the audience, giving her the ability to see and interact with anyone in real time. In short, the actor controls the hologram.
Using a black cart, a TV was laid flat with the prism attached on top for optimal projection of the holographic map. A webcam was placed below it as the “eyes” of the hologram, and below the webcam sat a wireless speaker through which the actor communicated with the audience.
With everything set up, it was time for the performance. The purpose was to give the illusion of a complex AI system. The audience gathered and could walk around the hologram in a full 360 degrees.
For my part, I began a conversation with the hologram and received real-time responses, then encouraged the audience to talk to it and ask it questions. The audience was reluctant to do so.
Interactive Hologram Prototype
The concept is to generate a three-dimensional hologram with a simple control system (i.e., a press system) in which users interact with the hologram through pre-programmed animations. A 3D character is given a set of animations triggered by On Press cues from a keyboard. While music plays, the participant can control the hologram with an external wireless keyboard.
For this prototype, Unity was used to program the 3D character. A character was chosen through CG Trader, and a holographic shader was applied. The controller was programmed in Unity C#: an On Press trigger was coded for certain keys on a wireless keyboard, each of which triggers a dance move performed by the 3D character. The dance moves were generated with Mixamo, and each set of animations was imported into Unity through a network of nodes. In Unity, four duplicate characters were mapped strategically so that, when displayed, they mirrored each other.
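The On Press dispatch reduces to a key-to-animation lookup. A hedged sketch of that logic, written in Python for readability (in the actual project this lives in a Unity C# update loop; the key bindings and clip names below are placeholders, not the project’s real ones):

```python
# Sketch of the On Press controller: each key on the wireless keyboard
# is bound to one Mixamo dance clip; unmapped keys do nothing.
# Bindings and clip names are illustrative.

KEY_TO_DANCE = {"1": "dance_one", "2": "dance_two", "3": "dance_three"}

def on_key_press(key, play):
    """Look up the clip bound to `key` and play it; return the clip name."""
    clip = KEY_TO_DANCE.get(key)
    if clip is not None:
        play(clip)                 # hand the clip to the animation system
    return clip
```

Ignoring unmapped keys keeps stray key presses from interrupting whatever animation is currently playing.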
To create the projection, a prism needed to be built. It was made from found clear hard plastic, cut into trapezoidal shapes and connected with glue and tape. A screen is laid flat displaying the four mirrored characters, and the prism is placed on top to create the hologram.
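For anyone sizing the trapezoidal panels, the geometry is easy to compute if the faces are tilted at 45 degrees, the usual angle for this kind of “Pepper’s ghost” pyramid. The 45-degree tilt is an assumption about this build, not something the project specifies:

```python
import math

# Back-of-the-envelope geometry for one trapezoid panel of a pyramid
# whose faces are tilted 45 degrees from the screen (assumed angle).
# At 45 degrees, the vertical rise of a face equals its horizontal
# inset, so the slant height is that inset stretched by sqrt(2).

def panel_slant_height(bottom_w, top_w):
    """Slant height of a 45-degree trapezoid panel, in the widths' units."""
    inset = (bottom_w - top_w) / 2   # horizontal inset of the face
    return inset * math.sqrt(2)      # 45-degree tilt: multiply by sqrt(2)
```

For example, a panel 12 cm wide at the base and 2 cm wide at the top needs a slant height of about 7.1 cm.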
Finally, with the wireless keyboard and live music, the hologram dances according to the participant’s desires.
Maya, Touch Designer, Leap Motion