Computer improvisation, metrical structures and phrasing
This presentation will address questions of rhythm in the design of a computer improvisation software named ImproteK. We are working on a musician-machine interaction system that learns in real time from human performers and generates improvisations in their style. The improvisation kernel is based on sequence modeling and statistical learning derived from earlier work done at IRCAM on the OMax software. The new issue addressed here is taking an underlying metrical structure into account: the system must therefore be aware of beat positions. We will show examples of music generated by the system in two different contexts: zither music from Madagascar and jazz improvisation by keyboard player Bernard Lubat. In both cases we will discuss the reactions of the musician listening to these virtual improvisations, particularly from a rhythmic point of view. We will point out problems of phrasing and microtiming that arise from the distance between what the musician had in mind and what he actually played, a distance that is sometimes amplified by the system's recombination process.
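To make the beat-awareness idea concrete, here is a minimal illustrative sketch, not the actual ImproteK or OMax kernel: it shows one way a memory of performed events could be indexed by their position within the bar and recombined so that generated material always lands on a compatible beat. All names (`BeatAwareMemory`, `learn`, `generate`, `continuity`) are hypothetical and introduced only for this example.

```python
# Hypothetical sketch: a beat-indexed memory of performed events,
# recombined so that generated material respects the metrical grid.
import random
from collections import defaultdict

class BeatAwareMemory:
    def __init__(self, beats_per_bar=4):
        self.beats_per_bar = beats_per_bar
        self.by_beat = defaultdict(list)  # metrical position -> event indices
        self.events = []                  # (beat_in_bar, symbol) in performance order

    def learn(self, beat_in_bar, symbol):
        """Store one performed event together with its beat position."""
        beat = beat_in_bar % self.beats_per_bar
        self.by_beat[beat].append(len(self.events))
        self.events.append((beat, symbol))

    def generate(self, n_beats, continuity=0.7):
        """Recombine the memory: mostly follow the original sequence,
        occasionally jump to another event recorded on the same beat."""
        if not self.events:
            return []
        out, idx = [], random.randrange(len(self.events))
        for step in range(n_beats):
            beat = step % self.beats_per_bar
            candidates = self.by_beat.get(beat, [])
            # Jump when the current event does not fit the beat,
            # or with probability (1 - continuity) to vary the output.
            if idx >= len(self.events) or self.events[idx][0] != beat or \
               (candidates and random.random() > continuity):
                idx = random.choice(candidates) if candidates else idx % len(self.events)
            out.append(self.events[idx][1])
            idx += 1  # default: continue along the original performance
        return out

# Usage: learn a short performance, then generate two bars of recombined material.
memory = BeatAwareMemory(beats_per_bar=4)
for i, note in enumerate(["C", "E", "G", "B", "D", "F", "A", "C"]):
    memory.learn(beat_in_bar=i, symbol=note)
print(memory.generate(8))
```

The point of the sketch is only the constraint it encodes: recombination jumps are restricted to events recorded at the same metrical position, which is one simple way of keeping generated phrases aligned with the beat.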
presentation slides available here