Robotic Constrained Imitation Learning for the Peg Transfer Task in Fundamentals of Laparoscopic Surgery

ICRA 2024

  • Kento Kawaharazuka
  • Kei Okada
  • Masayuki Inaba
  • JSK Robotics Laboratory, The University of Tokyo, Japan

In this study, we present an implementation strategy for a robot that performs the peg transfer task in Fundamentals of Laparoscopic Surgery (FLS) via imitation learning, aimed at the development of an autonomous robot for laparoscopic surgery. Robotic laparoscopic surgery presents two main challenges: (1) the need to manipulate forceps using ports established on the body surface as fulcrums, and (2) the difficulty of perceiving depth when working with a monocular camera that displays its images on a monitor. In particular, for issue (2), most prior research has assumed the availability of depth images or a model of the target to be manipulated. In this study, we instead achieve more accurate imitation learning with only monocular images by extracting motion constraints from a single exemplary motion of a skilled operator, collecting data based on these constraints, and conducting imitation learning on the collected data. We implemented the overall system using two Franka Emika Panda robot arms and validated its effectiveness.


Fundamentals of Laparoscopic Surgery for Robots

For robotic peg transfer tasks in Fundamentals of Laparoscopic Surgery (FLS) via imitation learning, we address two main problems: (1) the forceps are constrained by the laparoscopic ports, and (2) only RGB information can be obtained from the monocular endoscope.


The Setup for Robotic Peg Transfer Tasks in Fundamentals of Laparoscopic Surgery

Two Franka Emika Panda robot arms are teleoperated with Touch haptic devices (3D Systems Corp.). Maryland dissectors, attached via a hand adapter and a parallel gripper, transfer a rubber object on a peg board.


Constrained Inverse Kinematics for Laparoscopic Surgery

The configuration for controlling the forceps under the constraint imposed by a laparoscopic port. The left figure shows the geometric model of the robot arms, and the right figure shows the forceps kinematics with a virtual linear (prismatic) joint extending from the tip of the hand along the long axis of the forceps.
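To make the port constraint concrete, the following is a minimal geometric sketch in Python, not the paper's actual implementation; the variable names, the fixed shaft length, and the reference axis are assumptions. It computes a hand pose whose forceps shaft passes through the port point (the fulcrum), with the insertion depth acting as the virtual prismatic joint:

import numpy as np

def rcm_constrained_pose(tip_target, port, shaft_length):
    # tip_target: desired 3D position of the forceps tip.
    # port: 3D position of the laparoscopic port (acts as the fulcrum).
    # shaft_length: distance from the hand flange to the forceps tip.
    # Assumes the tip target does not coincide with the port.
    axis = tip_target - port
    depth = np.linalg.norm(axis)           # virtual prismatic joint value
    z = axis / depth                       # unit shaft direction

    # Place the hand behind the tip along the shaft so the shaft
    # passes through the port and the tip lands on tip_target.
    hand_pos = tip_target - shaft_length * z

    # Build a consistent orientation with z as the shaft axis.
    ref = np.array([0.0, 0.0, 1.0])
    if abs(np.dot(ref, z)) > 0.99:         # avoid degeneracy near parallel
        ref = np.array([1.0, 0.0, 0.0])
    x = np.cross(ref, z); x /= np.linalg.norm(x)
    y = np.cross(z, x)
    rot = np.column_stack([x, y, z])       # hand orientation (world frame)
    return hand_pos, rot, depth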


Constrained Data Collection and Imitation Learning

In order to generate robot motions by imitation learning, we collect human teaching data using haptic devices. However, it is challenging to manipulate the forceps successfully because depth information is difficult to obtain. Therefore, we extract constraints on the movement of the forceps from the trajectory of a single slow and accurate exemplary demonstration, and incorporate them to improve the quality of the teaching data and the accuracy of imitation learning. The procedure is as follows (steps (2) and (3) are sketched after this list):

  (1) Describe a phase transition condition.
  (2) Extract the motion constraints pertaining to each phase from the single exemplary demonstration.
  (3) Collect data by human teaching with force feedback based on the constraints.
  (4) Execute imitation learning on the collected data.

Note that this method is not limited to the FLS task; it can be applied to any task by changing the phase transition condition and the motion constraint extraction.
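A minimal sketch of steps (2) and (3), assuming the constraints take the form of per-phase minimum/maximum bounds on the tip's z-coordinate and that force feedback is a virtual spring at the bounds (the phase labels, bound form, and stiffness gain are our assumptions, not the paper's exact formulation):

import numpy as np

def extract_z_bounds(demo_z, demo_phase):
    # Step (2): per-phase min/max z bounds from one exemplary demonstration.
    # demo_z: tip z-coordinates over time; demo_phase: same-length phase labels.
    bounds = {}
    for phase in np.unique(demo_phase):
        z = demo_z[demo_phase == phase]
        bounds[phase] = (z.min(), z.max())
    return bounds

def constrain_command(z_cmd, phase, bounds, stiffness=50.0):
    # Step (3): clamp the commanded z inside the bounds of the current phase
    # and return the virtual-spring force rendered on the haptic device.
    z_min, z_max = bounds[phase]
    z_clamped = float(np.clip(z_cmd, z_min, z_max))
    force_z = stiffness * (z_clamped - z_cmd)  # zero while inside the bounds
    return z_clamped, force_z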


Experiments

One Exemplary Demonstration of Peg Transfer
Constrained Data Collection vs. Normal Data Collection

The average variance of the collected trajectories with constrained data collection is 5.65 for the left arm and 2.74 for the right arm, whereas with normal data collection it is 6.56 for the left arm and 4.78 for the right arm. Imposing minimum and maximum constraints on the z-directional movement enables stable data collection.
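For reference, such an average variance can be computed as the mean over time steps of the across-demonstration variance of the recorded trajectories; the sketch below reflects our reading of the metric, not necessarily the paper's exact definition:

import numpy as np

def average_trajectory_variance(trajectories):
    # trajectories: (num_demos, num_steps, dims), e.g. tip positions from
    # each teaching trial, resampled to a common length beforehand.
    var_per_step = np.asarray(trajectories).var(axis=0)  # (num_steps, dims)
    return float(var_per_step.mean())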

Imitation Learning Result with Constrained Data Collection
Comparison of Success Rates
Comparison of the Trajectories in the LSTM's Latent Space
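The network details are not given on this page; as a hedged sketch, an LSTM-based imitation policy over monocular RGB images and robot state might look like the following (the CNN encoder, layer sizes, and input/output dimensions are assumptions). The latent sequence returned here is the kind of LSTM latent-space trajectory compared in the figure above:

import torch
import torch.nn as nn

class LSTMImitationPolicy(nn.Module):
    # Sketch: monocular RGB image + robot state -> next command, with an
    # LSTM whose hidden trajectory can be visualized as a latent space.
    def __init__(self, state_dim=8, action_dim=8, hidden_dim=128):
        super().__init__()
        self.encoder = nn.Sequential(      # small CNN for the RGB image
            nn.Conv2d(3, 16, 5, stride=2), nn.ReLU(),
            nn.Conv2d(16, 32, 5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.lstm = nn.LSTM(32 + state_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, action_dim)

    def forward(self, images, states, hidden=None):
        # images: (B, T, 3, H, W); states: (B, T, state_dim)
        B, T = images.shape[:2]
        feats = self.encoder(images.flatten(0, 1)).view(B, T, -1)
        latent, hidden = self.lstm(torch.cat([feats, states], dim=-1), hidden)
        return self.head(latent), latent, hidden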

Bibtex

@inproceedings{kawaharazuka2024fls,
  author={K. Kawaharazuka and K. Okada and M. Inaba},
  title={{Robotic Constrained Imitation Learning for the Peg Transfer Task in Fundamentals of Laparoscopic Surgery}},
  booktitle={2024 IEEE International Conference on Robotics and Automation (ICRA)},
  year={2024},
}

Contact

If you have any questions, please feel free to contact Kento Kawaharazuka.