Interactive Motion Capture System For The Performative Arts - SICMAP
On stage, the performer manipulates the space as if it were something tangible. From gestures in space, sound emerges, is transformed, and is spatialized. Behind the performer, large video projections of moving textures are associated with real-time gestures, in opposition to or in fusion with the audible and the visual.
Across and beyond disciplines, between music and dance, gesture and movement, art and technology, it is amid these domains that my artistic practice is situated. The will to move beyond a disciplinary practice of classical percussion and its gestural training, and to place the body at the centre of my creative process, has led me toward a new hybrid performative practice.
These works are realized using SICMAP (Système Interactif de Captation du Mouvement en Art Performatif – Interactive Motion Capture System For The Performative Arts), a motion capture system based on computer vision. Developed since 2012, SICMAP results from my conceptualization of a gesture-sound space in harmony with my gestures as a performer and with my body. SICMAP comprises a gesture recognition application (Kinect Kreative Interface), a data management and mapping program (built in Max), a sound synthesis program (based on Csound), and a video mapping and projection application (VPT). These technological components correspond to the six components of my artistic practice: electroacoustic sound, gesture, video, physical space, and technological space, with the body as the sixth and unifying entity.
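The data management and mapping stage connects gesture data to synthesis parameters. As a minimal sketch of that idea (not the actual Max patch), the following Python function performs the kind of linear range mapping that Max's [scale] object provides; the hand-height and cutoff ranges are hypothetical values chosen only for illustration.

```python
def scale(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly map value from [in_lo, in_hi] to [out_lo, out_hi],
    analogous to Max's [scale] object."""
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

# Hypothetical mapping: right-hand height in metres -> filter cutoff in Hz.
cutoff = scale(1.2, 0.0, 2.0, 100.0, 8000.0)  # ~4840.0
```

In a real patch such a mapping could also be exponential or table-driven; the point is only that continuous gesture data must be rescaled into the ranges that the synthesis program expects.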
SICMAP's gesture recognition relies on the motion detection application Kinect Kreative Interface. The application was developed in collaboration with Patrick St-Denis – I acted in the capacity of composer and sound engineer – in the IACT laboratories directed by Jean Piché. The programming is based on Kinect sensors and the OpenNI, NITE, and SensorKinect libraries. The application is a graphical interface that facilitates the design of a three-dimensional performance space. The interface makes it possible to define interactive zones and to associate them with different parts of the performer's body. The application outputs its data over the OSC protocol, so that performative gestures can be linked to a sonic or visual creation environment.
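The two core operations described above – testing whether a tracked body part has entered a defined zone, and emitting the result as an OSC message – can be sketched as follows. This is a simplified illustration, not the Kinect Kreative Interface's actual code: the zone coordinates, joint position, and OSC address are hypothetical, and the encoder handles only float arguments per the OSC 1.0 binary format.

```python
import struct

def _pad(b: bytes) -> bytes:
    """Null-terminate and pad to a 4-byte boundary, as OSC 1.0 requires."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message: address pattern, type tags, float32 arguments."""
    type_tags = "," + "f" * len(args)
    msg = _pad(address.encode("ascii")) + _pad(type_tags.encode("ascii"))
    for a in args:
        msg += struct.pack(">f", float(a))  # big-endian float32
    return msg

def in_zone(joint, zone):
    """True if a 3-D joint position lies inside an axis-aligned zone box."""
    (lo, hi) = zone
    return all(l <= v <= h for v, l, h in zip(joint, lo, hi))

# Hypothetical data; in SICMAP the zones are defined through the graphical interface.
zone = ((0.0, 1.0, 1.5), (0.5, 1.5, 2.2))   # (min corner, max corner), metres
right_hand = (0.3, 1.2, 1.8)                 # tracked joint position

if in_zone(right_hand, zone):
    packet = osc_message("/sicmap/zone1/right_hand", 1.0)  # illustrative address
```

A receiving environment such as Max or Csound would then decode such packets and route the zone-activation values to sound or video parameters.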