ICRA 2018 Workshop
[Accepted] Full-day Workshop on Ergonomic Physical Human-Robot Collaboration
Recently, one of the main focuses in robotics has been to enable robots to collaborate effectively and safely with humans in industrial, medical, and household settings. Several fundamental and important aspects of human-robot collaboration have already been addressed in recent workshops in the robotics community. However, one important aspect that has not yet been sufficiently addressed is the ergonomics of the human's working conditions during physical human-robot collaboration.
Ergonomic working conditions are extremely important in tasks that require manipulation of heavy objects, repetitive body movements, and uncomfortable body poses. Working in such conditions may lead to excessive strain and injuries of human co-workers and, consequently, have negative socioeconomic impacts. To avoid this, robots should be able to recognize improper working conditions and then adapt their own behavior to improve the working conditions of their human co-workers. Such methods should prevent the above-mentioned work-related stress and injuries, and maintain the health and productivity of human workers. This is especially important in industry, but can also be applied to service robotics and rehabilitation robotics.
To achieve this, we need to improve and combine several crucial elements of existing fields of robotics. The robot must know the status of the human body, the limitations of human motor control, and the human response to challenging tasks. For this purpose, the robot needs accurate dynamic models of humans, which can be used, for example, to understand how the human joints should be reconfigured to provide less stressful conditions. In addition, the robot must be able to monitor the motion and other states of the human to know when and how to assist; various sensory devices are therefore necessary, either on the robot side or worn by the human. Finally, the robot should possess an appropriate control framework and learning capabilities so that it can control and adapt its own behavior to help the human co-worker achieve more ergonomic working conditions.
The goal of the workshop is to (1) bring together top experts in human-robot collaboration control/learning, human modelling/monitoring, and ergonomics, (2) discuss the state of the art, and (3) lay down promising future research directions that will lead to ergonomic human-robot collaboration.
The questions we want to address:
- How can we improve existing models of the human and effectively apply them to achieve ergonomic human-robot collaboration?
- What sensory and feedback systems are required?
- Are the existing control methods sufficient to enable the robot to facilitate the ergonomic working conditions of the human, and what control methods are most suitable?
- How can robot learning methods be utilized to this end?
- What are the challenges in achieving the proposed goal in different applications (e.g. industrial human-robot collaboration, wearable robots, service robots, etc.)?
Interests
Physical Human-Robot Collaboration, Ergonomics, Human Modelling, Physical Interaction Control, Adaptation and Learning, Industrial Robots, Exoskeleton Robots, Service Robots, Wearable Sensors, Feedback Devices, Shared Control
Conclusions
Based on the interactions during the workshop and the round-table discussion, we came to the following conclusions:
- We need to clearly define the measures for ergonomics in human-robot collaboration.
- The measures should have a mathematical formulation (cost function) that the robot controller can use to optimise the human co-worker’s states.
- Possible measures of ergonomics include overloading joint torques, muscle fatigue, safety maps, etc. However, we do not yet know whether these measures are sufficient to ensure ergonomic conditions and prevent work-related injuries. We may need to involve medical doctors to validate the selected measures.
- Since human dynamical models are difficult to validate, they may be incorrectly identified. If the robot uses an incorrect human model, it might cause more harm than good.
- Choosing the right complexity for the human model is crucial, since important degrees of freedom might otherwise be neglected.
- Learning algorithms may help the robot to overcome some of the above-mentioned issues.
- Extensive feedback systems may sometimes be necessary to provide the robot with enough measured information.
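To make the second conclusion concrete, here is a minimal sketch (not from the workshop itself; all function names, joint groupings, and numbers are illustrative assumptions) of how an "overloading joint torque" measure could be written as a cost function that a robot controller might minimise over candidate co-manipulation poses:

```python
# Hypothetical sketch: an ergonomic cost based on per-joint torque
# utilisation, as discussed in the conclusions above. The joint names,
# torque limits, and candidate poses are invented for illustration only.

def ergonomic_cost(joint_torques, torque_limits, weights=None):
    """Weighted sum of squared joint-torque utilisation ratios.

    joint_torques: estimated human joint torques (Nm) for a candidate pose.
    torque_limits: assumed per-joint torque capacities (Nm).
    weights: optional per-joint weighting, e.g. to penalise the lumbar
             joints more heavily than the elbow.
    """
    if weights is None:
        weights = [1.0] * len(joint_torques)
    return sum(w * (tau / limit) ** 2
               for w, tau, limit in zip(weights, joint_torques, torque_limits))

# The robot could then prefer the candidate handover pose with the
# lowest cost for the human co-worker:
candidates = {
    "pose_a": [30.0, 12.0, 5.0],   # e.g. lumbar, shoulder, elbow torques (Nm)
    "pose_b": [18.0, 20.0, 6.0],
}
limits = [150.0, 60.0, 40.0]       # assumed joint torque capacities (Nm)
best = min(candidates, key=lambda p: ergonomic_cost(candidates[p], limits))
```

In a real system the torques would come from an identified human dynamic model and online measurements, which is exactly where the validation and model-complexity caveats from the conclusions above apply.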
The following papers will be presented in the poster session of the workshop.
- Exoskeleton Design for Assisting Standing Posture Transitions through Modeling and Control of Body COM Motions, by Diego Felipe Paez Granados, Hideki Kadone, and Kenji Suzuki
- Improved Human-Robot Interaction: A manipulability based approach, by Sugeeth Gopinathan, Pouya Mohammadi, and Jochen J. Steil
- Robot Co-worker for Abrasive Blasting: Lessons Learnt in Worker Posture Estimation, by Marc G. Carmichael, Dikai Liu, Antony Tran, Richardo Khonasty, and Stefano Aldini
- Real-time Robot-assisted Ergonomics, by A. Shafti, A. Ataka, B. Urbistondo Lazpita, A. Shiva, H. A. Wurdemann, and K. Althoefer
- Planning to grasp and position an object for forceful human-robot collaboration, by Luis F. C. Figueredo, Lipeng Chen, and Mehmet R. Dogar
Accepted papers require that at least one of the authors register for the workshop.
Submission deadline for extended abstracts: 1st February 2018 (extended to 11th February 2018)
Notification of acceptance: 1st March 2018
The workshop will be held on Monday, 21 May 2018.
|08.45 - 09.00||Introduction by the organizers|
|09.00 - 09.30||Talk by Prof. Dana Kulić|
|09.30 - 10.00||Talk by Dr. Eiichi Yoshida|
|10.00 - 10.30||Talk by Prof. Gentiane Venture|
|10.30 - 11.30||Coffee Break & Poster session|
|11.30 - 12.00||Talk by Prof. Neville Hogan|
|12.00 - 12.30||Talk by Dr. Arash Ajoudani (presented by Dr. Luka Peternel)|
|12.30 - 13.30||Lunch|
|13.30 - 14.00||Talk by Prof. Jens Kober|
|14.00 - 14.30||Talk by Prof. Heni Ben Amor|
|14.30 - 15.00||Talk by Prof. Jan Babič|
|15.00 - 15.30||Coffee Break|
|15.30 - 16.00||Talk by Prof. Sami Haddadin|
|16.00 - 17.00||Round Table Discussions|
Luka Peternel, Post Doc
Italian Institute of Technology, Italy
Wansoo Kim, Post Doc
Italian Institute of Technology, Italy
Jens Kober, Assistant Professor
TU Delft, Netherlands
Jens Kober is an assistant professor at TU Delft, Netherlands. He worked as a postdoctoral scholar jointly at the CoR-Lab, Bielefeld University, Germany, and at the Honda Research Institute Europe, Germany. He graduated in 2012 with a PhD degree in engineering from TU Darmstadt. For his research he received the annually awarded Georges Giralt PhD Award for the best PhD thesis in robotics in Europe. From 2007 to 2012 he worked with Jan Peters, first as a master's student and subsequently as a PhD student, at the Robot Learning Lab in the department of Bernhard Schölkopf, Max Planck Institute for Intelligent Systems (formerly part of the MPI for Biological Cybernetics). He has been a visiting research student at the Advanced Telecommunication Research (ATR) Center, Japan, and an intern at Disney Research Pittsburgh, USA. His research interests include robotics, machine learning, and control.
Heni Ben Amor, Assistant Professor
Arizona State University, USA
Heni Ben Amor is an Assistant Professor at Arizona State University, where he leads the ASU Interactive Robotics Laboratory. Prior to that, he was a Research Scientist at the Institute for Robotics and Intelligent Machines at Georgia Tech in Atlanta. Heni studied computer science at the University of Koblenz-Landau (Germany) and earned a Ph.D. in robotics from the Technical University Freiberg and Osaka University in 2010, where he worked with Hiroshi Ishiguro and Minoru Asada. Before moving to the US, Heni was a postdoctoral scholar at the Technical University Darmstadt working with Jan Peters. Heni's research focuses on artificial intelligence, machine learning, human-robot interaction, robot vision, and automatic motor skill acquisition. He received the highly competitive Daimler-and-Benz Fellowship as well as several best paper awards at major robotics and AI conferences. He has also served on the program committees of various AI and robotics conferences such as AAAI, IJCAI, IROS, and ICRA.
Eiichi Yoshida, Senior Research Scientist
National Institute of Advanced Industrial Science and Technology, Japan
Eiichi Yoshida received his B.E., M.E., and Dr. Eng. degrees from the Department of Precision Machinery Engineering, School of Engineering, the University of Tokyo, in 1990, 1993, and 1996, respectively. During 1990-1991 he was at the Department of Microtechnique, Swiss Federal Institute of Technology in Lausanne (EPFL). In April 1996 he joined the former Mechanical Engineering Laboratory (MEL). From 2001 he was a Senior Research Scientist in the Distributed System Design Research Group and, from February 2004, in the Autonomous Behavior Control Research Group of the Intelligent Systems Institute, AIST. Until July 2009 he was Co-Director of the AIST/IS-CNRS/STIC Joint Japanese-French Robotics Laboratory (JRL) (LRV, Paris, and LAAS-CNRS, Toulouse, France). The laboratory became the CNRS-AIST JRL (Joint Robotics Laboratory), UMI3218/CRT, in December 2008, and he has served as its Co-Director since April 2009.
Neville Hogan, Professor
Massachusetts Institute of Technology, USA
Neville Hogan is currently the Sun Jae Professor of Mechanical Engineering at Massachusetts Institute of Technology.
Dr. Arash Ajoudani (presented by Dr. Luka Peternel)
Italian Institute of Technology, Italy
"Towards ergonomic control of human-robot co-manipulation"
The talk will present a control approach to human-robot co-manipulation that accounts for human ergonomics and physical fatigue. To achieve an adaptive and context-aware robot behaviour in physical interaction with the human and the environment, we will first present a control framework that includes a hybrid interaction controller and a multi-modal human-robot interface. The robot then uses this lower-level control framework in conjunction with the proposed higher-level methods that estimate and anticipate the human states that contribute to ergonomics and physical fatigue. In this direction, real-time measurement systems and dynamical models are used to monitor the human online while he/she collaboratively performs tasks with the robot. The robot then uses the proposed methods to control its own behaviour in a way that offloads the excessive effort of the human and ensures ergonomic working conditions.
Prof. Dana Kulić
University of Waterloo, Canada
"Human Motion Measurement and Analysis outside the Lab"
The human body is capable of a wide range of agile, dexterous, and complex movement. Improved understanding and modeling of human movement can be leveraged to teach robots to perform tasks, allow robots to safely and intuitively interact with humans, and to provide appropriate assistance to restore and facilitate movement. Human motion measurement and analysis is a challenging problem, due to issues such as sensor and measurement system limitations, high dimensionality, and large spatial and temporal variability of movement. In this talk we will describe recent work developing techniques for automated human motion measurement and analysis. We will overview techniques for motion measurement, segmentation, individualized model learning and analysis, with a focus on applications to robot imitation learning, rehabilitation and human-robot interaction.
Dr. Eiichi Yoshida
National Institute of Advanced Industrial Science and Technology, Japan
"Ergonomic Evaluating and Designing Assistive Devices through Human Motion Replication"
We present an integrated approach for ergonomic evaluation and design of assistive devices by replicating human motion using a humanoid robot and a digital human model. The first axis is the development of a method for humanoid robot control that can reproduce various human behaviors, so that a humanoid robot can be used as an evaluator of products such as assistive devices. This allows estimating the mechanical supportive effects of assistive devices in a quantitative manner, which is difficult with human measurement. We also introduce applications of this research to the standardization of wearable lumbar-support assistive devices. Another research direction is to develop a system for human-centered product evaluation and design through understanding humans' motion principles, by using a digital human that can model its shape, musculo-skeletal structure, and motions, as well as interactions with devices.
Prof. Gentiane Venture
Tokyo University of Agriculture and Technology, Japan
"Scaling the body"
Models of the human body play a key role in human motion science. In particular, dynamics relates movement to the forces required to achieve it, and also relates the body to the environment through interaction forces. Measuring these data is not always trivial. In the past decade we have developed solutions for the computation of the dynamic quantities and developed individual models. In this presentation I will review the state of the art and our latest advances in this area, and show some examples of applications.
Prof. Neville Hogan
Massachusetts Institute of Technology, USA
"So Good yet So Bad: Surprising Limitations of Human Motor Control"
Despite slower actuation, communication and computation, humans exhibit dexterity and agility far exceeding modern robots. At the same time, we exhibit surprising limitations, which may influence the ergonomics of physical collaboration between robots and humans. This presentation will review some of these limitations. Recent studies showed that moving slowly is hard for humans, both in unconstrained and kinematically-constrained actions. Natural human movement exhibits a velocity-curvature relation (the so-called “two-thirds power law”). We recently showed that subjects interacting physically with a robot had substantial difficulty deviating from this natural trajectory. It is often assumed that minimizing muscular effort is an important aspect of skillful human behavior. However, recent results show that humans do not adopt minimum-effort strategies, even when they do not compromise task performance. Instead, subjects prioritize predictability. Maximizing predictability in human-robot physical collaboration may enable reduced effort and hence less vulnerability to fatigue, an important ergonomic consideration.
Prof. Jan Babič
Jožef Stefan Institute, Slovenia
"SPEXOR: Spinal Exoskeletal Robot for Low Back Pain Prevention and Vocational Reintegration"
The objective of SPEXOR is to address low back pain, one of the most pressing health problems of modern society, by creating a body of scientific and technological knowledge in the multidisciplinary areas of biomechanics, robotics, and computer science that will lead to technologies for low back pain prevention. In the talk I will give an overview of the current state of the art of SPEXOR achieved in the first two years of the project. After introducing the rationale, I will walk you through the topics, which include the biomechanics of low back pain; development of musculoskeletal stress monitoring for assessment of neuromuscular trunk functions; modeling and optimization of the interaction of a spinal exoskeleton with the human body; electromechanical design and development of the spinal exoskeleton and its control; and finally the end-user evaluation of the functional effects, usability, and satisfaction.
Prof. Jens Kober
TU Delft, Netherlands
"Robots Learning in Interaction with Humans"
In scenarios with physical human-robot interaction, many of the humans involved will not be roboticists. Hence, we need very intuitive programming and learning methods that allow them to adapt the robot's behavior to new tasks and their preferences. This setting also implies the need for very efficient and safe learning methods. In this talk we will present our recent advances in learning from humans and in interaction with humans. We will present approaches to imitation learning for complex force-interaction tasks, and to speeding up reinforcement learning by including human corrective advice.
Prof. Heni Ben Amor
Arizona State University, USA
"Machine Learning for Human-Robot Interaction and Predictive Biomechanics"
Modern robotics technology has the potential to change millions of lives for the better. However, for this vision to become a reality, formalisms and methodologies are needed that allow robots to generate safe actions that seamlessly blend with those of the human user. In this talk, I will present a machine learning methodology for extracting human-robot interaction skills from example demonstrations. In addition, I will present ongoing work on how to incorporate predictive biomechanics so as to ensure healthy human postures while also reducing the risk of injuries and musculoskeletal disorders.
Prof. Sami HaddadinUniversity of Hannover, Germany
The title and abstract will be announced soon.
The following IEEE-RAS Technical Committees have given their full support to the proposed workshop:
- IEEE RAS TC on Human Movement Understanding.
Co-chairs: Dr. Emel Demircan, Prof. Dana Kulić, Prof. Denny Oetomo, and Prof. Mitsu Hayashibe
- IEEE RAS TC on Human Robot Interaction and Coordination.
Co-chairs: Prof. Filippo Cavallo, Prof. David Feil-Seifer, and Prof. Yoshio Matsumoto
- IEEE RAS TC on Wearable Robotics.
Co-chairs: Prof. Samer Mohammed, Prof. Juan C. Moreno, Prof. Thomas Sugar, and Prof. Yasuhisa Hasegawa
- IEEE RAS TC on Humanoid Robotics.
Co-chairs: Dr. Katsu Yamane, Prof. Aude Billard, Prof. Tomomichi Sugihara
- IEEE RAS TC on Rehabilitation and Assistive Robotics.
Co-chairs: Prof. Machiel Van Der Loos, Prof. Takanori Shibata, and Prof. Stefano Mazzoleni
- IEEE RAS TC on Robot Learning.
Co-chairs: Prof. Ross Knepper, Prof. Jens Kober, and Prof. Wataru Takano