Authors: Amirshirzad, Negin; Sunal, Begüm; Bebek, Özkan; Öztop, Erhan
Date available: 2023-05-15
Date of issue: 2021
ISBN: 978-166541873-7
ISSN: 2161-8070
Handle: http://hdl.handle.net/10679/8261
DOI: https://doi.org/10.1109/CASE49439.2021.9551415
Abstract: This paper focuses on a learning from demonstration approach for autonomous medical suturing. A conditional neural network is used to learn and generate suturing primitive trajectories conditioned on desired context points. Using our designed GUI, a user can plan and select suture insertion points. Given an insertion point, our model generates joint trajectories in real time that satisfy this condition. The generated trajectories, combined with a kinematic feedback loop, were used to drive an 11-DOF robotic system, which showed a satisfying ability to learn and perform suturing primitives autonomously from only a few demonstrations of the movements.
Language: eng
Access: restrictedAccess
Title: Learning medical suturing primitives for autonomous suturing
Type: conferenceObject
Year: 2021
Pages: 256-261
WOS ID: 000878693200034
Scopus ID: 2-s2.0-85117054261