Recent progress in physical Human-Robot Interaction (pHRI) research has shown in principle that humans and robots can actively and safely share a common workspace. The fundamental breakthrough that enabled these results was the human-centered design of robot mechanics and control, which made it possible to limit potential injuries due to unintentional contacts. Previous projects, in particular the PHRIENDS project in which part of the consortium was involved, provided remarkable results in this direction and constitute the foundation for this proposal.
Inspired by these results, SAPHARI will pursue a fundamental paradigm shift in robot development by placing the human at the centre of the entire design. The project will take a significant step further along the human-centered roadmap by addressing all essential aspects of safe, intuitive physical interaction between humans and complex, human-like robotic systems in a strongly interconnected manner.
While encompassing safety issues based on biomechanical analysis, human-friendly hardware design, and interaction control strategies, the project will develop and validate key perceptive and cognitive components that enable robots to track, understand, and predict human motions in a weakly structured, dynamic environment in real time.
We will equip robots with the capability to react to human actions, or even take the initiative to interact, in a situation-dependent manner, relying on sensor-based decisions and background knowledge.
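As a purely illustrative sketch of such sensor-based reactive decisions (not the project's actual implementation), the fragment below extrapolates a tracked human position with a constant-velocity model and maps the predicted human-robot separation to a velocity scaling factor. All function names, thresholds, and the prediction horizon are hypothetical.

```python
import numpy as np

def predict_human_position(p_now, p_prev, dt, horizon):
    """Constant-velocity extrapolation of a tracked human keypoint;
    p_now and p_prev are 3D positions (m) from consecutive perception
    frames separated by dt seconds."""
    velocity = (p_now - p_prev) / dt
    return p_now + velocity * horizon

def speed_scaling(robot_tcp, p_now, p_prev, dt, horizon=0.5,
                  d_stop=0.3, d_slow=1.0):
    """Map the predicted human-robot separation to a velocity scaling
    factor in [0, 1]: stop below d_stop, full speed above d_slow,
    linear in between (all thresholds are illustrative)."""
    p_pred = predict_human_position(p_now, p_prev, dt, horizon)
    d = np.linalg.norm(p_pred - robot_tcp)
    if d <= d_stop:
        return 0.0
    if d >= d_slow:
        return 1.0
    return (d - d_stop) / (d_slow - d_stop)
```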
Apart from developing the necessary capabilities for interactive autonomy, we will also tightly incorporate human safety at the cognitive level. This will enable the robots to react to, or physically interact with, humans in a safe and autonomous way. Keeping in mind the paradigm of “design for safety and control for performance”, research will be pursued in several areas, starting with the fundamental injury mechanisms of humans cooperating with robots. The analysis will first be carried out for stiff robots and then extended to variable stiffness actuation systems in terms of safety, energy, and load sustainability. Biomechanical knowledge and biologically motivated variable compliance actuators will be used to design bimanual manipulation systems whose design characteristics and performance properties are close to those of humans. Real-time task and motion planning of such complex systems requires new concepts, including a tight coupling of control and planning, leading to new reactive action-generation behaviours.
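To make the “design for safety and control for performance” idea concrete, the following minimal sketch shows a joint-space impedance law whose stiffness can be lowered when a human is nearby and raised again for accurate tracking. It assumes unit joint inertia and model-based gravity compensation; none of the names or gains come from the project itself.

```python
import numpy as np

def impedance_torque(q, dq, q_des, stiffness, gravity):
    """Joint-space impedance law  tau = K (q_des - q) - D dq + g(q).
    Damping is set near-critically as D = 2*sqrt(K), assuming unit
    joint inertia for simplicity.  Lowering the per-joint stiffness
    vector K when a human is close softens the physical response;
    raising it restores tracking performance."""
    K = np.asarray(stiffness, dtype=float)
    D = 2.0 * np.sqrt(K)          # illustrative damping design
    return K * (q_des - q) - D * dq + np.asarray(gravity, dtype=float)
```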
Safe operation will be enforced in mobile manipulation scenarios with large workspaces through the smart fusion of proprioceptive and exteroceptive sensory information, sensor-based task planning, recognition and learning of human gestures and motions, and task-oriented programming, including the configuration and programming of safety measures.
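On the proprioceptive side, one widely used scheme that such a fusion could build on is a generalized-momentum residual for detecting unexpected contacts. The sketch below is only an assumed illustration, with the dynamics terms M(q), C(q, dq), and g(q) presumed to come from an external robot model.

```python
import numpy as np

class MomentumResidual:
    """Collision detection from proprioceptive signals only (joint
    velocities and commanded torques), via a generalized-momentum
    residual that approximates the external joint torque.  The model
    terms M(q), C(q, dq) and g(q) are assumed to be supplied by an
    external dynamics library."""

    def __init__(self, n_joints, gain=25.0):
        self.K = gain * np.ones(n_joints)   # observer gain (illustrative)
        self.integral = np.zeros(n_joints)
        self.r = np.zeros(n_joints)
        self.p0 = None

    def update(self, M, C, g, dq, tau_cmd, dt):
        p = M @ dq                          # generalized momentum M(q) dq
        if self.p0 is None:
            self.p0 = p.copy()
        self.integral += (tau_cmd + C.T @ dq - g + self.r) * dt
        self.r = self.K * (p - self.p0 - self.integral)
        return self.r
```

A contact would then be flagged whenever a residual component exceeds a joint-specific threshold, at which point exteroceptive information (e.g., camera-based human localisation) could help distinguish intended from unintended contact and trigger the appropriate reaction.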
Finally, self-explaining interaction and communication frameworks will be developed to enhance system usability and to make the multimodal communication between human and robot seamless.
The project focuses on two industrial use cases that explicitly involve deliberate physical interaction between a human and a robot co-worker, as well as on professional service scenarios in hospitals, in which medical staff and an assisting robot interact closely during daily work. These prototypical applications will pave the way towards new and emerging markets, not only in industry and professional services, but possibly also in household robots, advanced prostheses and rehabilitation devices, teleoperation, and robotic surgery. More generally, the results of this project are expected to strongly impact all applications where interactive robots can assist humans and relieve them of dangerous or routine tasks.