Computer Science
Permanent URI for this collection: https://hdl.handle.net/10679/43
Browsing by Institution Author "BEBEK, Özkan"
Now showing 1 - 4 of 4
Conference Object (Publication, metadata only)
Adaptive inverse kinematics of a 9-DOF surgical robot for effective manipulation (IEEE, 2019)
Authors: Sunal, Begüm; Öztop, Erhan; Bebek, Özkan
Departments: Computer Science; Mechanical Engineering
In a robotic-assisted surgical system, fine and precise movement is essential. However, when the user wishes to cover a wider area during tele-operation, a configuration designed for precise motion may restrict the user and/or slow down the system's operation. This paper proposes a kinematics-based method, applicable to redundant manipulators, that allows the user to make both fast and precise movements and reduces the burden on the user. In the proposed method, different kinematic configurations are selected automatically in real time to adjust the speed of the robot's end-effector according to the velocity of the haptic device. The method is tested on a 9-degree-of-freedom (DOF) system realized by attaching a 3-DOF servo-driven surgical instrument to a 6-DOF manipulator through a custom interface. Its validity is shown with experiments requiring dexterous manipulation on the 9-DOF system. The results indicate that adopting the proposed method in actual operations can reduce surgery time and the surgeon's effort, thereby helping to reduce the risk of tissue deformation and other complications for the patient.

Conference Object (Publication, metadata only)
Adaptive shared control with human intention estimation for human agent collaboration (IEEE, 2022)
Authors: Amirshirzad, Negin; Uğur, E.; Bebek, Özkan; Öztop, Erhan
Departments: Computer Science; Mechanical Engineering
In this paper, an adaptive shared control framework for human-agent collaboration is introduced. In this framework, the agent predicts the human intention with a confidence factor that also serves as the control-blending parameter used to combine the human and agent control commands that drive a robot or manipulator. While a given task is performed, the blending parameter is dynamically updated as a result of the interplay between human and agent control. In a scenario where additional trajectories need to be taught to the agent, either new human demonstrations can be generated and given to the learning system, or the shared control system itself can be used to generate the new demonstrations. The simulation study conducted in this work shows that the latter approach is more beneficial: it improves collaboration between the human and the agent by decreasing the human effort and increasing the compatibility of the human and agent control commands.
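As a rough illustration of the confidence-based blending described in the shared-control entry above, the sketch below combines human and agent commands with a single blending parameter and nudges that parameter according to how well the two commands agree. The linear blending rule, the cosine-similarity agreement measure, and the update rate are illustrative assumptions, not the formulation from the paper.

```python
import numpy as np

def blend_commands(u_human, u_agent, confidence):
    """Blend human and agent commands with a confidence factor in [0, 1].

    confidence = 1.0 means full agent control, 0.0 means full human control.
    This linear blending rule is assumed for illustration only.
    """
    c = np.clip(confidence, 0.0, 1.0)
    return c * np.asarray(u_agent, float) + (1.0 - c) * np.asarray(u_human, float)

def update_confidence(confidence, u_human, u_agent, rate=0.05):
    """Raise confidence when the two commands agree, lower it otherwise.

    Agreement is measured with cosine similarity; the update rate is a
    made-up tuning parameter.
    """
    h = np.asarray(u_human, float)
    a = np.asarray(u_agent, float)
    denom = np.linalg.norm(h) * np.linalg.norm(a)
    agreement = float(h @ a / denom) if denom > 1e-9 else 0.0
    return float(np.clip(confidence + rate * agreement, 0.0, 1.0))

# Example: a 2-D velocity command on a planar task
confidence = 0.5
u_h = [0.10, 0.02]   # human command (e.g., from a haptic device)
u_a = [0.08, 0.00]   # agent command (e.g., from a learned policy)
u = blend_commands(u_h, u_a, confidence)
confidence = update_confidence(confidence, u_h, u_a)
print(u, confidence)
```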
Conference Object (Publication, metadata only)
Fast and efficient terrain-aware motion planning for exploration rovers (IEEE, 2021)
Authors: Uğur, Deniz; Bebek, Özkan
Departments: Mechanical Engineering
This paper presents a fast, energy-efficient, and computationally cheap traversal solution for sloped terrain. Grid-based search algorithms require high computational power and take a long time because almost every point on the map is visited. An approach that does not depend on a global map yet can still navigate toward the target is presented as a new solution. A cost map for motion planning is formed in real time from depth-field and color-image data. The proposed motion-planning algorithm, named SAFARI, uses four cost layers to evaluate its surroundings efficiently. To reduce computational overhead, only selected features are evaluated, which increases the rate of the rover's motion-planning cycle. SAFARI has been tested against path-planning alternatives and validated in simulations and field tests. The concept is expected to be useful in space applications and cave-exploration tasks.

Conference Object (Publication, metadata only)
Learning medical suturing primitives for autonomous suturing (IEEE, 2021)
Authors: Amirshirzad, Negin; Sunal, Begüm; Bebek, Özkan; Öztop, Erhan
Departments: Computer Science; Mechanical Engineering
This paper focuses on a learning-from-demonstration approach to autonomous medical suturing. A conditional neural network is used to learn and generate suturing-primitive trajectories conditioned on desired context points. Using the designed GUI, a user can plan and select suturing insertion points. Given an insertion point, the model generates joint trajectories in real time that satisfy this condition. The generated trajectories, combined with a kinematic feedback loop, were used to drive an 11-DOF robotic system, which showed a satisfactory ability to learn and perform suturing primitives autonomously from only a few demonstrations of the movements.
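As a rough sketch of the learning-from-demonstration idea in the last entry, the code below fits a small conditional model that maps an insertion point and a time phase to joint angles, then generates a full trajectory for a new insertion point. The plain MLP, the synthetic demonstrations, the 3-joint setup, and the use of PyTorch are all assumptions for illustration; they stand in for the paper's conditional neural network, 11-DOF robot, and kinematic feedback loop, which are not reproduced here.

```python
import math
import torch
import torch.nn as nn

# Toy setup: 3 joints, trajectories of 50 time steps, insertion point in 2-D.
N_JOINTS, T = 3, 50

def synthetic_demo(insertion_point):
    """Fabricate a smooth joint trajectory for a given insertion point."""
    phase = torch.linspace(0.0, 1.0, T).unsqueeze(1)          # (T, 1) time phase
    context = torch.tensor(insertion_point).repeat(T, 1)       # (T, 2) insertion point
    base = torch.sin(math.pi * phase + context.sum(dim=1, keepdim=True))
    joints = base.repeat(1, N_JOINTS) * torch.tensor([1.0, 0.5, 0.25])
    return phase, context, joints

# A handful of made-up demonstrations at different insertion points.
demos = [synthetic_demo(p) for p in ([0.1, 0.2], [0.3, 0.1], [0.2, 0.4])]
X = torch.cat([torch.cat([ph, ctx], dim=1) for ph, ctx, _ in demos])   # (3T, 3)
Y = torch.cat([q for _, _, q in demos])                                # (3T, 3)

# Small MLP: (phase, insertion_x, insertion_y) -> joint angles.
model = nn.Sequential(nn.Linear(3, 64), nn.ReLU(),
                      nn.Linear(64, 64), nn.ReLU(),
                      nn.Linear(64, N_JOINTS))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for _ in range(2000):                     # quick toy training loop
    opt.zero_grad()
    loss = loss_fn(model(X), Y)
    loss.backward()
    opt.step()

# Generate a full joint trajectory for a new insertion point.
new_point = torch.tensor([0.25, 0.25])
phase = torch.linspace(0.0, 1.0, T).unsqueeze(1)
query = torch.cat([phase, new_point.repeat(T, 1)], dim=1)
with torch.no_grad():
    trajectory = model(query)             # (T, N_JOINTS) joint angles over time
print(trajectory.shape)
```

In this toy setting the insertion point plays the role of the context point: the same trained model produces a different full trajectory for each queried insertion point, which is the behaviour the entry above describes for its suturing primitives.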