The discourse on interaction in live electronic music is often vague and inaccurate. Interaction – often confused with reaction – means above all reciprocity: there can be no interaction unless all interacting parts within a system are able to perceive each other’s actions and to act both in response to them and according to their own agenda. In human-computer music systems this would mean that not only the performer but also the computer should be able to “act” – not merely re-act. This idealistic vision of sonic human-computer reciprocity lies at the centre of our project. By incorporating machine intelligence into compositions for acoustic instruments and electronics, we seek to establish a reciprocal interaction between the musician as a cognising subject and the computer as a cognising object.
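To make the distinction between reaction and interaction concrete, consider the following minimal Python sketch. It is purely illustrative: the class names, the MIDI-pitch representation, and the “agenda register” are our hypothetical assumptions for this example, not a description of the project’s actual system. It contrasts a purely reactive mapping with an agent that perceives the performer while also pursuing a goal of its own.

```python
import random
from typing import Optional


class ReactiveMapper:
    """Re-action only: every output is a fixed function of the input."""

    def respond(self, pitch: int) -> int:
        # A fixed mapping, e.g. echo the performer a fifth higher.
        return pitch + 7


class InteractiveAgent:
    """Perceives the performer *and* acts on its own agenda -
    here, a hypothetical preferred register it keeps pulling toward."""

    def __init__(self, agenda_pitch: int = 72) -> None:
        self.agenda_pitch = agenda_pitch        # the agent's own goal
        self.last_heard: Optional[int] = None   # memory of the performer

    def perceive(self, pitch: int) -> None:
        # Update internal state from the performer's action.
        self.last_heard = pitch

    def act(self) -> int:
        # The agent can act even with no input: its agenda alone suffices.
        if self.last_heard is None:
            return self.agenda_pitch
        # Otherwise blend response and agenda, varying the balance.
        w = random.random()
        return round(w * self.last_heard + (1 - w) * self.agenda_pitch)


if __name__ == "__main__":
    mapper, agent = ReactiveMapper(), InteractiveAgent()
    for performer_pitch in (60, 64, 67):
        agent.perceive(performer_pitch)
        print(f"performer {performer_pitch}: "
              f"reactive -> {mapper.respond(performer_pitch)}, "
              f"interactive -> {agent.act()}")
```

The essential point of the sketch is that the agent’s output is never a pure function of the last input: silence the performer entirely and the agent still acts, which is precisely what distinguishes reciprocal interaction from mere reaction.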