The idea behind this type of analysis was to identify abrupt changes in my movements, in both direction and speed, represented by the steepness of the curve. If movements are seen as a mirror of the player’s experience of the music, as suggested by the previously mentioned theories on embodied music cognition, a change in movement can be expected to be linked to a change in that experience, and therefore to indicate a new musical section.
Results showed that the movement patterns were generally consistent with the sections previously identified through the musical analysis: movement started from a resting position, increased towards the middle of a section, reaching a higher speed and a greater distance from the origin of the axes, and then returned to the initial position towards the end of the section. The same U-shaped pattern was usually repeated in the echoes, which, contrary to expectations, sometimes showed an even higher intensity of movement. It was also noticed that movement seemed to increase in correspondence with faster notes, and that it often followed the direction of the melodic line.
With the goal of acquiring additional information on the movements I make whilst playing, the video of the performance was analysed using the “Tracker” software. This is a video analysis and modelling tool built on the Open Source Physics (OSP) Java framework, which allows, amongst other features, for manual and automated object tracking with position, velocity and acceleration overlays and data.
Having identified a point on my face (Point A) which showed the greatest and most varied amount of movement, this was chosen as the object to be tracked throughout the full video, and graphs were obtained representing its horizontal and vertical movement over time. These are shown in Figure 5, respectively on the y-axis of Diagram 1 (horizontal) and Diagram 2 (vertical), with time (s) always on the x-axis and the origin of the axes corresponding to the position of Point A in the resting position (sitting position before playing).
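The “steepness of the curve” analysis described above can be sketched computationally. As a minimal illustration, assuming the tracked positions of the point are available as time, horizontal and vertical columns (as Tracker can export in a data table), speed can be estimated with finite differences, and abrupt changes flagged where its rate of change exceeds a threshold. The synthetic data and the threshold value below are illustrative assumptions, not the actual performance data:

```python
import numpy as np

def movement_speed(t, x, y):
    """Finite-difference speed of a tracked point from position samples.

    t, x, y: 1-D arrays of time (s) and horizontal/vertical position.
    """
    t, x, y = map(np.asarray, (t, x, y))
    dx = np.gradient(x, t)   # horizontal velocity
    dy = np.gradient(y, t)   # vertical velocity
    return np.hypot(dx, dy)  # scalar speed at each sample

def abrupt_changes(t, x, y, threshold):
    """Indices where the speed curve is steepest, i.e. where the
    magnitude of its rate of change exceeds a chosen threshold."""
    speed = movement_speed(t, x, y)
    accel = np.gradient(speed, np.asarray(t))
    return np.flatnonzero(np.abs(accel) > threshold)

# Synthetic example: a point at rest, then moving steadily from t = 5 s.
t = np.linspace(0.0, 10.0, 101)
x = np.where(t < 5.0, 0.0, t - 5.0)
y = np.zeros_like(t)
print(abrupt_changes(t, x, y, threshold=2.0))  # indices around the onset of movement
```

In this sketch a section boundary candidate would appear as a cluster of flagged indices; on real tracking data the position series would normally be smoothed first, since frame-to-frame tracking noise is amplified by differentiation.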