From a parent coordinating movements to help a child learn to walk, to a violinist practising a concerto, humans rely on physical interaction to learn from each other and from the environment. Building on a strongly multidisciplinary foundation and an integrated approach, CONBOTS proposes a paradigm shift that aims to augment handwriting and music learning through robotics, by creating a physically interacting robotic platform that connects humans to facilitate the learning of complex sensorimotor tasks.
The newly designed platform will combine four enabling technologies: i) compact robotic haptic devices that gently interact with the upper limbs; ii) an interactive controller yielding physical communication, integrating differential Game Theory (GT) with an algorithm that identifies the partner's control; iii) a bi-directional user interface encompassing AR-based, application-driven serious games and a set of wearable sensors and instrumented objects; iv) machine learning algorithms for tailoring learning exercises to the user's physical, emotional, and mental state.
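To make the game-theoretic controller idea concrete, the following is a minimal sketch, not the project's actual controller: a scalar two-player linear-quadratic differential game in which each agent applies linear state feedback and the Nash-equilibrium gains are found by iterating two coupled algebraic Riccati equations. The dynamics, cost weights, and function name are illustrative assumptions.

```python
# Illustrative sketch (not the CONBOTS controller): a scalar two-player
# linear-quadratic differential game. Each player i applies feedback
# u_i = -k_i * x; the Nash gains solve two coupled scalar algebraic
# Riccati equations, found here by damped fixed-point iteration.

def nash_gains(a, b1, b2, q1, q2, r1, r2, iters=5000, damping=0.5):
    """Shared state:  x' = a*x + b1*u1 + b2*u2.
    Costs:           J_i = integral of (q_i*x**2 + r_i*u_i**2) dt.
    Returns the Nash feedback gains (k1, k2) and the closed-loop pole."""
    p1 = p2 = 1.0                       # value-function coefficients V_i = p_i*x**2
    for _ in range(iters):
        k1 = b1 * p1 / r1               # player 1's best-response gain
        k2 = b2 * p2 / r2               # player 2's best-response gain
        acl = a - b1 * k1 - b2 * k2     # closed-loop pole (must stay < 0)
        # Coupled scalar Riccati equations: 0 = q_i + r_i*k_i**2 + 2*p_i*acl
        p1 += damping * (-(q1 + r1 * k1 ** 2) / (2 * acl) - p1)
        p2 += damping * (-(q2 + r2 * k2 ** 2) / (2 * acl) - p2)
    k1 = b1 * p1 / r1
    k2 = b2 * p2 / r2
    return k1, k2, a - b1 * k1 - b2 * k2

# Example: an unstable plant (a = 0.5) jointly stabilised by two players.
k1, k2, pole = nash_gains(a=0.5, b1=1.0, b2=0.8, q1=2.0, q2=1.0, r1=1.0, r2=1.0)
```

At the fixed point, each player's gain is a best response to the other's, so neither can lower its own cost by unilaterally changing its feedback; the closed-loop pole is strictly negative, i.e. the pair stabilises the shared task.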
CONBOTS builds on recent neuroscientific findings showing that physical interaction benefits partners performing motor tasks together: the human central nervous system can infer a partner's motor control and use it to improve task performance and motor learning. This insight will be implemented with innovative robotic technology, wearable sensors, and machine learning techniques to give rise to novel human-human and human-robot interaction paradigms, applied in two different learning contexts: i) training graphomotor skills in children learning handwriting; ii) augmenting learning performance in beginner musicians.
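The idea of inferring a partner's control can be illustrated with a simplified sketch, under the assumption that the partner applies noisy one-dimensional linear feedback; the function name and simulated data below are hypothetical, not the project's identification algorithm.

```python
import random

# Hedged sketch (hypothetical names, simplified 1-D setting): if a partner
# applies linear feedback u = -k * x plus noise, their gain k can be
# recovered from observed (state, partner-force) pairs by least squares.

def identify_partner_gain(states, partner_inputs):
    """Least-squares estimate of k in u ~ -k * x."""
    num = -sum(x * u for x, u in zip(states, partner_inputs))
    den = sum(x * x for x in states)
    return num / den

# Simulate a partner with true gain k = 2.0 and small measurement noise.
random.seed(0)
xs = [random.uniform(-1.0, 1.0) for _ in range(200)]
us = [-2.0 * x + random.uniform(-0.05, 0.05) for x in xs]
k_hat = identify_partner_gain(xs, us)   # close to the true gain 2.0
```

Once the partner's gain is estimated, an agent (human or robot) can anticipate the partner's contribution to the shared dynamics and adapt its own control accordingly.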
Using its neuroscience-driven, unifying approach to motor learning and physical communication, CONBOTS will expand the impact and application of robotics to the education industry.