
© Second Hands

Human-robot interaction

We focus on scenarios where robots take the role of assistants, trainers, or collaborators in physical manipulation tasks. We use machine-learning algorithms to observe and model human behavior, and we develop robot implementations that replicate the task in a human-like manner or detect human intention.
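One common way to model human motion for robot replication is to learn a dynamical system from recorded demonstrations. The sketch below is a hypothetical, minimal illustration (not the lab's actual method): it fits a linear dynamical system x_dot = A(x - x_target) to synthetic "demonstration" data by least squares, then rolls the learned system forward to generate a reaching motion. All variable names and numbers are illustrative assumptions.

```python
import numpy as np

# Hypothetical illustration: fit a linear dynamical system
# x_dot = A (x - x_target) to (state, velocity) pairs recorded from
# human reaching demonstrations, then replay it as robot motion.

rng = np.random.default_rng(0)
x_target = np.array([0.5, 0.3])          # assumed reaching target

# Synthetic stand-in for demonstration data: states with velocities
# generated by a stable ground-truth system, plus sensor noise.
X = rng.uniform(-1.0, 1.0, size=(200, 2))
A_true = np.array([[-2.0, 0.0],
                   [0.0, -1.5]])
X_dot = (X - x_target) @ A_true.T + 0.01 * rng.standard_normal((200, 2))

# Least-squares estimate of A from the demonstrations.
B, *_ = np.linalg.lstsq(X - x_target, X_dot, rcond=None)
A_hat = B.T

# Integrate the learned system forward from a new start point.
x = np.array([-0.8, 0.9])
dt = 0.01
for _ in range(500):
    x = x + dt * (A_hat @ (x - x_target))
# x has converged close to x_target
```

Because the estimated system inherits the stable (negative-eigenvalue) structure of the demonstrations, the generated motion converges to the target from start points never seen in the data.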

Shared Control Through EMG

The range of motion available to someone wearing a prosthetic device is constrained by the device's limited control and functionality. We are developing control interfaces that decode the user's intention from their muscular activity (electromyography, EMG).

We develop approaches and applications that:
  • describe the dynamic behavior of the hand and fingers during reach-to-grasp motions
  • extract informative features from muscle activity to decode the user's intention
  • combine dynamical systems with the decoded intention to give the device a natural, human-like behavior
  • realize a shared control scheme between the user and the wearable robotic device
  • apply these results to upper-limb prosthetic devices as well as teleoperated robotic systems
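The shared-control idea in the list above can be sketched in a few lines: an autonomous dynamical system drives the hand toward a grasp target, while a scalar "intention" decoded from EMG gates how strongly that system acts. This is a hypothetical toy, not the lab's decoder; the linear mapping from RMS muscle activity to intention, and all thresholds and gains, are invented for illustration.

```python
import numpy as np

def ds_velocity(x, x_target, gain=2.0):
    """Autonomous reaching dynamics: velocity pointing at the target."""
    return -gain * (x - x_target)

def decode_intention(emg_rms, rest=0.05, max_act=0.8):
    """Toy intention decoder: map RMS muscle activity to [0, 1].
    (A real decoder would be learned from labeled EMG data.)"""
    return float(np.clip((emg_rms - rest) / (max_act - rest), 0.0, 1.0))

def shared_control_step(x, x_target, emg_rms, dt=0.01):
    """Shared control: the user's decoded intention scales the
    autonomous motion, so the hand only moves when the user engages."""
    alpha = decode_intention(emg_rms)
    return x + dt * alpha * ds_velocity(x, x_target)

# Simulate the user gradually engaging their muscles.
x = np.array([0.0, 0.0])
x_target = np.array([0.4, 0.2])
for emg in np.linspace(0.1, 0.7, 400):
    x = shared_control_step(x, x_target, emg)
# x has moved most of the way to x_target
```

The design point is the blending: at rest (intention 0) the device stays put, and at full activation the dynamical system takes over completely, with a continuum of assistance in between.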
