MotionMachine

Description: MotionMachine is a C++ software toolkit for rapid prototyping of motion feature extraction and motion-based interaction design. It encapsulates the complexity of motion capture data processing in an intuitive, easy-to-use set of APIs, coupled with the openFrameworks environment for visualisation. MotionMachine is a new framework designed for “sense-making”, i.e. enabling the exploration of motion-related data so as to develop new kinds of analysis pipelines and/or interactive applications.
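As a minimal illustration of the kind of motion feature such a toolkit is meant to prototype (this is not the MotionMachine API itself), the sketch below computes a joint's speed from a track of 3-D positions; the function name, array shapes, and frame rate are assumptions made for the example.

```python
import numpy as np

def joint_speed(positions, fps):
    """Speed (units/s) of one joint from a (T, 3) array of 3-D positions,
    a basic motion feature of the kind prototyped in such toolkits."""
    deltas = np.diff(positions, axis=0)    # frame-to-frame displacement
    dist = np.linalg.norm(deltas, axis=1)  # Euclidean distance per frame
    return dist * fps                      # convert to units per second

# A joint moving 0.01 m along x every frame at 100 fps -> 1.0 m/s
track = np.array([[0.01 * t, 0.0, 0.0] for t in range(5)])
print(joint_speed(track, fps=100))  # [1. 1. 1. 1.]
```

Features of this kind, computed per frame, can then be visualised alongside the motion data or fed into an interactive application.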

Researchers who want to use this toolkit should cite the following paper:

J. Tilmanne and N. d’Alessandro, "MotionMachine: A new framework for motion capture signal feature prototyping", in Proc. 2015 European Signal Processing Conference (EUSIPCO 2015), Nice, France, 31 August – 4 September 2015.

The MotionMachine toolkit is available for download at the following link:


MyoWebToolkit

Description: MyoWebToolkit is a tool for studying the forearm electromyogram (EMG) in relation to hand movements, with the aim of reverse engineering the whole process, i.e. estimating hand movements from the EMG. Most approaches treat this as a traditional pattern recognition problem, fitting a linear or non-linear classifier to signal features given annotated output. We do not consider this a true solution, both because it lacks a theoretical grounding in the specific problem and because it yields discrete labels whereas the problem is of an analogue nature. We have therefore attempted a solution that a) derives 15 muscle activation levels from the 8-channel EMG of the Myo device; b) converts the 15 muscle activation levels into muscle forces; and c) applies these forces to a bone model in WebGL 3D governed by a physics engine. These three pillars constitute the backbone of real-time emulation of hand movements in an analogue form. MyoWebToolkit is written entirely in JavaScript and PHP so that emulation runs on the client side (web browser) using the GPU. A complementary tool for recording Myo data, written in Node.js, sends data from every client to the server in order to build a database.
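The first pillar above, deriving muscle activation levels from the EMG, can be sketched as rectification and smoothing of the raw signal followed by a channel-to-muscle mapping. The sketch below is a simplified illustration and not the toolkit's code: the `mixing` matrix stands in for the anatomy-derived mapping, and the window length and array shapes are assumptions.

```python
import numpy as np

def emg_envelope(emg, win=5):
    """Rectify and smooth raw EMG of shape (T, 8) into an amplitude
    envelope per channel (full-wave rectification + moving average)."""
    rect = np.abs(emg)
    kernel = np.ones(win) / win
    return np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode="same"), 0, rect)

def activations(envelope, mixing):
    """Map an 8-channel envelope to 15 muscle activation levels in [0, 1].
    `mixing` is a hypothetical (15, 8) channel-to-muscle weight matrix;
    the real toolkit derives this mapping from forearm anatomy."""
    act = envelope @ mixing.T
    return np.clip(act / max(act.max(), 1e-9), 0.0, 1.0)

# Fake 8-channel recording, 100 frames, mapped to 15 muscles
emg = np.abs(np.sin(np.linspace(0, 20, 800))).reshape(100, 8)
act = activations(emg_envelope(emg), np.full((15, 8), 1.0 / 8.0))
print(act.shape)  # (100, 15)
```

In the toolkit, activation levels like these would then be converted into forces and applied to the WebGL bone model by the physics engine.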

Researchers who want to use this tool should cite the following paper:

D. Ververidis, S. Karavarsamis, S. Nikolopoulos, and I. Kompatsiaris, "Pottery gestures style comparison by exploiting MYO sensor and forearm anatomy", in Proc. 3rd International Workshop on Movement and Computing (MOCO’16), Thessaloniki, Greece, 5-6 July 2016.

Central web page (code, installation instructions, demo link, description, issues):

Text to Song synthesis API

Description: The Text to Song module was developed to augment the learning experience in traditional singing by automatically synthesizing singing voices from lyrics and notes. To aid the development of singing voice synthesis solutions for new languages and singing styles, we have created a collection of libraries and an API that wrap the proposed algorithms. Developers can replace the NLP and F0 generation modules to accommodate new languages and singing styles.
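A replaceable F0 generation module of the kind described above could, in its simplest form, turn a note sequence into a per-frame pitch contour. The sketch below illustrates this with the standard equal-temperament MIDI-to-Hz conversion; the function names and frame rate are assumptions for the example and not part of the Acapela API.

```python
def midi_to_hz(note):
    """Equal-temperament pitch: MIDI note 69 (A4) = 440 Hz."""
    return 440.0 * 2.0 ** ((note - 69) / 12.0)

def f0_contour(notes, frame_rate=100):
    """Flat per-frame F0 track from (midi_note, seconds) pairs, i.e. the
    kind of output an F0 generation module hands to the synthesizer."""
    contour = []
    for note, dur in notes:
        contour.extend([midi_to_hz(note)] * round(dur * frame_rate))
    return contour

melody = [(69, 0.5), (71, 0.5)]  # A4 then B4, half a second each
f0 = f0_contour(melody)
print(len(f0), f0[0])  # 100 440.0
```

A production module would additionally model vibrato, portamento, and style-dependent pitch behaviour, which is precisely what the replaceable-module design makes it possible to swap in per language and singing style.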

Researchers who want to use this tool should cite the following paper:

M. Cotescu, "Optimal Unit Stitching in a Unit Selection Singing Synthesis System", in Proc. Interspeech 2016, pp. 1255-1259, 2016.

The software can be downloaded on request from Acapela's servers: -- The libraries are available free of charge for research and non-commercial use. Researchers and developers interested in using our TTS library should write to us via Acapela's contact form, including "i-Treasures" in the subject of the message.

Pedagogical Planner

Description: The Pedagogical Planner (PP) is a web-based tool to support the design and planning of educational interventions in the field of Intangible Cultural Heritage.

Researchers who want to use this tool should cite the following paper:

F. Pozzi, A. Ceregini, F. M. Dagnino, M. Ott and M. Tavella, "Closing the 'learning design lifecycle' with the Pedagogical Planner", special issue of the European Journal of Open, Distance and E-Learning (EURODL), pp. 103-116, December 2016.

The tool is available at the following link: