In the motion map, an interactive machine learning technique called CPPN-NEAT (Compositional Pattern-Producing Networks evolved with NeuroEvolution of Augmenting Topologies) (/ref Picbreeder) is used to generate visualisations and sonifications of human movement. In the demo application, these visualisations/sonifications are used to distinguish between different motions.
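To make the idea concrete, here is a minimal sketch of how a CPPN can map a movement sample to an output value. This is a hypothetical illustration, not the demo's actual implementation: the topology is fixed and the weights are random, whereas NEAT would evolve both; the input features (joint coordinates and time) and the single scalar output are also assumptions.

```python
import math
import random

# Activation functions drawn from the mixed set that makes CPPNs
# "pattern-producing" (periodic, symmetric, saturating, ...).
ACTIVATIONS = [math.sin, math.tanh, lambda x: math.exp(-x * x), abs]

def make_cppn(n_inputs, n_hidden, seed=0):
    """Build a tiny single-hidden-layer CPPN with random weights.

    In CPPN-NEAT the topology and weights would be evolved; here they
    are fixed and randomly initialised purely for illustration.
    """
    rng = random.Random(seed)
    hidden = [(rng.choice(ACTIVATIONS),
               [rng.uniform(-1.0, 1.0) for _ in range(n_inputs)])
              for _ in range(n_hidden)]
    out_weights = [rng.uniform(-1.0, 1.0) for _ in range(n_hidden)]

    def cppn(inputs):
        # Each hidden node applies its own activation to a weighted sum.
        h = [act(sum(w * x for w, x in zip(ws, inputs)))
             for act, ws in hidden]
        # Squash the output so it can drive e.g. a pixel intensity or pitch.
        return math.tanh(sum(w * v for w, v in zip(out_weights, h)))

    return cppn

# Map a movement sample (hypothetical features: joint x, joint y, time t)
# to a value in [-1, 1] that could drive a visual or sonic parameter:
cppn = make_cppn(n_inputs=3, n_hidden=5, seed=42)
intensity = cppn([0.2, -0.5, 0.9])
```

Sweeping such a network over the coordinates of a motion trace yields the kind of smooth, structured pattern that distinguishes one motion from another.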
Examples of three different visualisations, one for each of three different motions: