
Abstract

This review examines the dichotomy between automatic and autonomous behaviors in surgical robots, maps the possible levels of autonomy of these robots, and describes the primary enabling technologies driving research in this field. It is organized into five main sections, covering increasing levels of autonomy. At level 0, where the bulk of commercial platforms sit, the robot has no decision autonomy. At level 1, the robot can provide cognitive and physical assistance to the surgeon, while at level 2, it can autonomously perform a surgical task. Level 3 brings conditional autonomy, enabling the robot to plan a task and update that plan during execution. Finally, robots at level 4 can plan and execute a sequence of surgical tasks autonomously.
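The five-level taxonomy above can be captured as a small lookup table. This is a minimal sketch, not code from the review; the level names (`NO_AUTONOMY`, `ASSISTANCE`, etc.) are our shorthand, since the review identifies the levels numerically:

```python
from enum import IntEnum

class AutonomyLevel(IntEnum):
    """Levels of autonomy for surgical robots (names are illustrative)."""
    NO_AUTONOMY = 0           # no decision autonomy (bulk of commercial platforms)
    ASSISTANCE = 1            # cognitive and physical assistance to the surgeon
    TASK_AUTONOMY = 2         # autonomously performs a single surgical task
    CONDITIONAL_AUTONOMY = 3  # plans a task and updates the plan during execution
    HIGH_AUTONOMY = 4         # plans and executes a sequence of surgical tasks

DESCRIPTIONS = {
    AutonomyLevel.NO_AUTONOMY: "robot has no decision autonomy",
    AutonomyLevel.ASSISTANCE: "robot provides cognitive and physical assistance",
    AutonomyLevel.TASK_AUTONOMY: "robot autonomously performs a surgical task",
    AutonomyLevel.CONDITIONAL_AUTONOMY: "robot plans a task and updates the plan during execution",
    AutonomyLevel.HIGH_AUTONOMY: "robot plans and executes a sequence of surgical tasks",
}

def describe(level: int) -> str:
    """Map a numeric level (0-4) to its one-line description."""
    return DESCRIPTIONS[AutonomyLevel(level)]
```

As an `IntEnum`, levels compare and order naturally (`AutonomyLevel.ASSISTANCE < AutonomyLevel.TASK_AUTONOMY`), mirroring the graded structure of the taxonomy.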

doi: 10.1146/annurev-control-062420-090543 · Published online May 3, 2021

Literature Cited

  1. 1. 
    Yang GZ, Cambias J, Cleary K, Daimler E, Drake J et al. 2017. Medical robotics—regulatory, ethical, and legal considerations for increasing levels of autonomy. Sci. Robot. 2:eaam8638
    [Google Scholar]
  2. 2. 
    SAE Int 2018. Taxonomy and definitions for terms related to driving automation systems for on-road motor vehicles Stand. J3016_201806, SAE Int Warrendale, PA:
  3. 3. 
    Simaan N, Yasin RM, Wang L 2018. Medical technologies and challenges of robot-assisted minimally invasive intervention and diagnostics. Annu. Rev. Control Robot. Auton. Syst. 1:465–90
    [Google Scholar]
  4. 4. 
    Hoeckelmann M, Rudas IJ, Fiorini P, Kirchner F, Haidegger T 2015. Current capabilities and development potential in surgical robotics. Int. J. Adv. Robot. Syst. 12: https://doi.org/10.5772/60133
    [Crossref] [Google Scholar]
  5. 5. 
    Nelson BJ, Kaliakatsos IK, Abbott JJ 2010. Microrobots for minimally invasive medicine. Annu. Rev. Biomed. Eng. 12:55–85
    [Google Scholar]
  6. 6. 
    Vitiello V, Lee S-L, Cundy TP, Yang G-Z 2013. Emerging robotic platforms for minimally invasive surgery. IEEE Rev. Biomed. Eng. 6:111–26
    [Google Scholar]
  7. 7. 
    Bergeles C, Yang GZ. 2014. From passive tool holders to microsurgeons: safer, smaller, smarter surgical robots. IEEE Trans. Biomed. Eng. 61:1565–76
    [Google Scholar]
  8. 8. 
    Dahroug B, Tamadazte B, Tavernier L, Weber S, Andreff N 2018. Review on otological robotic systems: toward micro-robot assisted cholesteatoma surgery. IEEE Rev. Biomed. Eng. 11:125–42
    [Google Scholar]
  9. 9. 
    Smith JA, Jivraj J, Wong R, Yang V 2016. 30 years of neurosurgical robots: review and trends for manipulators and associated navigational systems. Ann. Biomed. Eng. 44:836–46
    [Google Scholar]
  10. 10. 
    Faria C, Erlhagen W, Rito M, De Momi E, Ferrigno G, Bicho E 2015. Review of robotic technology for stereotactic neurosurgery. IEEE Rev. Biomed. Eng. 8:125–37
    [Google Scholar]
  11. 11. 
    Pugin F, Bucher P, Morel P 2011. History of robotic surgery: from AESOP® and ZEUS® to da Vinci®. J. Visc. Surg. 148:e3–8
    [Google Scholar]
  12. 12. 
    Yeung BPM, Gourlay T. 2012. A technical review of flexible endoscopic multitasking platforms. Int. J. Surg. 10:345–54
    [Google Scholar]
  13. 13. 
    Groenhuis V, Siepel FJ, Veltman J, van Zandwijk JK, Stramigioli S 2018. Stormram 4: an MR safe robotic system for breast biopsy. Ann. Biomed. Eng. 46:1686–96
    [Google Scholar]
  14. 14. 
    Taylor RH, Menciassi A, Fichtinger G, Fiorini P, Dario P 2016. Medical robotics and computer-integrated surgery. Springer Handbook of Robotics B Siciliano, O Khatib 1657–84 Cham, Switz: Springer
    [Google Scholar]
  15. 15. 
    McKenna SJ, Charif HN, Frank T 2005. Towards video understanding of laparoscopic surgery: instrument tracking. Proceedings of Image and Vision Computing New Zealand 20052–6 Dunedin, N.Z: Univ. Otago
    [Google Scholar]
  16. 16. 
    Hannaford B, Rosen J, Friedman DW, King H, Roan P et al. 2013. Raven-II: an open platform for surgical robotics research. IEEE Trans. Biomed. Eng. 60:954–59
    [Google Scholar]
  17. 17. 
    Voros S, Long JA, Cinquin P 2007. Automatic detection of instruments in laparoscopic images: a first step towards high-level command of robotic endoscopic holders. Int. J. Robot. Res. 26:1173–90
    [Google Scholar]
  18. 18. 
    Wolf R, Duchateau J, Cinquin P, Voros S 2011. 3D tracking of laparoscopic instruments using statistical and geometric modeling. Medical Image Computing and Computer-Assisted Intervention – MICCAI 2011 G Fichtinger, A Martel, T Peters 203–10 Berlin: Springer
    [Google Scholar]
  19. 19. 
    Garcia-Peraza-Herrera LC, Li W, Fidon L, Gruijthuijsen C, Devreker A et al. 2017. ToolNet: holistically-nested real-time segmentation of robotic surgical tools. 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems5717–22 Piscataway, NJ: IEEE
    [Google Scholar]
  20. 20. 
    Allan M, Ourselin S, Thompson S, Hawkes DJ, Kelly J, Stoyanov D 2013. Toward detection and localization of instruments in minimally invasive surgery. IEEE Trans. Biomed. Eng. 60:1050–58
    [Google Scholar]
  21. 21. 
    Reiter A, Allen PK, Zhao T 2012. Feature classification for tracking articulated surgical tools. Medical Image Computing and Computer-Assisted Intervention – MICCAI 2012 N Ayache, H Delingette, P Golland, K Mori 592–600 Berlin: Springer
    [Google Scholar]
  22. 22. 
    Colleoni E, Moccia S, Du X, De Momi E, Stoyanov D 2019. Deep learning based robotic tool detection and articulation estimation with spatio-temporal layers. IEEE Robot. Autom. Lett. 4:2714–21
    [Google Scholar]
  23. 23. 
    Bouget D, Benenson R, Omran M, Riffaud L, Schiele B, Jannin P 2015. Detecting surgical tools by modelling local appearance and global shape. IEEE Trans. Med. Imaging 34:2603–17
    [Google Scholar]
  24. 24. 
    Atkins MS, Tien G, Khan RS, Meneghetti A, Zheng B 2013. What do surgeons see: capturing and synchronizing eye gaze for surgery applications. Surg. Innov. 20:241–48
    [Google Scholar]
  25. 25. 
    García-Mato D, Lasso A, Szulewski A, Pascau J, Fichtinger G 2017. 3D gaze tracking based on eye and head pose tracking. Proceedings of the 10th Hamlyn Symposium on Medical Robotics87–88 London: Imp. Coll. Lond.
    [Google Scholar]
  26. 26. 
    Rahman R, Wood ME, Qian L, Price CL, Johnson AA, Osgood GM 2020. Head-mounted display use in surgery: a systematic review. Surg. Innov. 27:88–100
    [Google Scholar]
  27. 27. 
    Tong I, Mohareri O, Tatasurya S, Hennessey C, Salcudean S 2015. A retrofit eye gaze tracker for the da Vinci and its integration in task execution using the da Vinci Research Kit. 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems2043–50 Piscataway, NJ: IEEE
    [Google Scholar]
  28. 28. 
    Yip HM, Navarro-Alarcon D, Liu YH 2016. Development of an eye-gaze controlled interface for surgical manipulators using eye-tracking glasses. 2016 IEEE International Conference on Robotics and Biomimetics1900–5 Piscataway, NJ: IEEE
    [Google Scholar]
  29. 29. 
    Konstantinova J, Jiang A, Althoefer K, Dasgupta P, Nanayakkara T 2014. Implementation of tactile sensing for palpation in robot-assisted minimally invasive surgery: a review. IEEE Sens. J. 14:2490–501
    [Google Scholar]
  30. 30. 
    Black DG, Hosseinabadi AHH, Salcudean SE 2020. 6-DOF force sensing for the master tool manipulator of the da Vinci Surgical System. IEEE Robot. Autom. Lett. 5:2264–71
    [Google Scholar]
  31. 31. 
    Piqué F, Boushaki MN, Brancadoro M, De Momi E, Menciassi A 2019. Dynamic modeling of the da Vinci Research Kit arm for the estimation of interaction wrench. 2019 International Symposium on Medical Robotics Piscataway, NJ: IEEE https://doi.org/10.1109/ISMR.2019.8710210
    [Crossref] [Google Scholar]
  32. 32. 
    Marban A, Srinivasan V, Samek W, Fernández J, Casals A 2019. A recurrent convolutional neural network approach for sensorless force estimation in robotic surgery. Biomed. Signal Process. Control 50:134–50
    [Google Scholar]
  33. 33. 
    Wang Y, Gondokaryono R, Munawar A, Fischer GS 2019. A convex optimization-based dynamic model identification package for the da Vinci Research Kit. IEEE Robot. Autom. Lett. 4:3657–64
    [Google Scholar]
  34. 34. 
    Sang H, Yun J, Monfaredi R, Wilson E, Fooladi H, Cleary K 2017. External force estimation and implementation in robotically assisted minimally invasive surgery. Int. J. Med. Robot. Comput. Assist. Surg. 13:e1824
    [Google Scholar]
  35. 35. 
    Krupa A, Gangloff J, Doignon C, De Mathelin MF, Morel G et al. 2003. Autonomous 3-D positioning of surgical instruments in robotized laparoscopic surgery using visual servoing. IEEE Trans. Robot. Autom. 19:842–53
    [Google Scholar]
  36. 36. 
    Suárez C, Acha B, Serrano C, Parra C, Gómez T 2009. VirSSPA – a virtual reality tool for surgical planning workflow. Int. J. Comput. Assist. Radiol. Surg. 4:133–39
    [Google Scholar]
  37. 37. 
    Das N, Yip MC. 2020. Forward kinematics kernel for improved proxy collision checking. IEEE Robot. Autom. Lett. 5:2349–56
    [Google Scholar]
  38. 38. 
    Sys G, Eykens H, Lenaerts G, Shumelinsky F, Robbrecht C, Poffyn B 2017. Accuracy assessment of surgical planning and three-dimensional-printed patient-specific guides for orthopaedic osteotomies. Proc. Inst. Mech. Eng. H 231:499–508
    [Google Scholar]
  39. 39. 
    Lee SL, Lerotic M, Vitiello V, Giannarou S, Kwok KW et al. 2010. From medical images to minimally invasive intervention: computer assistance for robotic surgery. Comput. Med. Imaging Graph. 34:33–45
    [Google Scholar]
  40. 40. 
    Roberts DW, Strohbehn JW, Hatch JF, Murray W, Kettenberger H 1986. A frameless stereotaxic integration of computerized tomographic imaging and the operating microscope. J. Neurosurg. 65:545–49
    [Google Scholar]
  41. 41. 
    Mochizuki Y, Hosaka A, Kamiuchi H, Nie JX, Masamune K et al. 2016. New simple image overlay system using a tablet PC for pinpoint identification of the appropriate site for anastomosis in peripheral arterial reconstruction. Surg. Today 46:1387–93
    [Google Scholar]
  42. 42. 
    Kong SH, Haouchine N, Soares R, Klymchenko A, Andreiuk B et al. 2017. Robust augmented reality registration method for localization of solid organs' tumors using CT-derived virtual biomechanical model and fluorescent fiducials. Surg. Endosc. 31:2863–71
    [Google Scholar]
  43. 43. 
    Samei G, Tsang K, Kesch C, Lobo J, Hor S et al. 2020. A partial augmented reality system with live ultrasound and registered preoperative MRI for guiding robot-assisted radical prostatectomy. Med. Image Anal. 60:101588
    [Google Scholar]
  44. 44. 
    Katić D, Wekerle AL, Görtler J, Spengler P, Bodenstedt S et al. 2013. Context-aware augmented reality in laparoscopic surgery. Comput. Med. Imaging Graph. 37:174–82
    [Google Scholar]
  45. 45. 
    Qian L, Deguet A, Kazanzides P 2018. ARssist: augmented reality on a head-mounted display for the first assistant in robotic surgery. Healthc. Technol. Lett. 5:194–200
    [Google Scholar]
  46. 46. 
    Sgarbura O, Vasilescu C. 2010. The decisive role of the patient-side surgeon in robotic surgery. Surg. Endosc. 24:3149–55
    [Google Scholar]
  47. 47. 
    Williams MA, McVeigh J, Handa AI, Lee R 2020. Augmented reality in surgical training: a systematic review. Postgrad. Med. J. 96:537–42
    [Google Scholar]
  48. 48. 
    Wang S, Parsons M, Stone-McLean J, Rogers P, Boyd S et al. 2017. Augmented reality as a telemedicine platform for remote procedural training. Sensors 17:2294
    [Google Scholar]
  49. 49. 
    Bowthorpe M, Tavakoli M, Becher H, Howe R 2013. Smith predictor based control in teleoperated image-guided beating-heart surgery. 2013 IEEE International Conference on Robotics and Automation5825–30 Piscataway, NJ: IEEE
    [Google Scholar]
  50. 50. 
    Wood NA, Schwartzman D, Passineau MJ, Moraca RJ, Zenati MA, Riviere CN 2018. Beating-heart registration for organ-mounted robots. Int. J. Med. Robot. Comput. Assist. Surg. 14:e1905
    [Google Scholar]
  51. 51. 
    Ruszkowski A, Mohareri O, Lichtenstein S, Cook R, Salcudean S 2015. On the feasibility of heart motion compensation on the daVinci® surgical robot for coronary artery bypass surgery: implementation and user studies. 2015 IEEE International Conference on Robotics and Automation4432–39 Piscataway, NJ: IEEE
    [Google Scholar]
  52. 52. 
    Bowyer SA, Davies BL, Rodriguez y Baena F 2014. Active constraints/virtual fixtures: a survey. IEEE Trans. Robot. 30:138–57
    [Google Scholar]
  53. 53. 
    Al Nooryani A, Aboushokka W 2018. Rotate-on-retract procedural automation for robotic-assisted percutaneous coronary intervention: first clinical experience. Case Rep. Cardiol. 2018:6086034
    [Google Scholar]
  54. 54. 
    Naghibi H, Hoitzing WB, Stramigioli S, Abayazid M 2018. A flexible endoscopic sensing module for force haptic feedback integration. 2018 9th Cairo International Biomedical Engineering Conference158–61 Piscataway, NJ: IEEE
    [Google Scholar]
  55. 55. 
    Hodgson S, Tavakoli M, Lelevé A, Tu Pham M 2014. High-fidelity sliding mode control of a pneumatic haptic teleoperation system. Adv. Robot. 28:659–71
    [Google Scholar]
  56. 56. 
    Ogawa K, Ohnishi K, Ibrahim Y 2018. Development of flexible haptic forceps based on the electro-hydraulic transmission system. IEEE Trans. Ind. Inform. 14:5256–67
    [Google Scholar]
  57. 57. 
    Molinero MB, Dagnino G, Liu J, Chi W, Abdelaziz MEMK et al. 2019. Haptic guidance for robot-assisted endovascular procedures: implementation and evaluation on surgical simulator. 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems5398–403 Piscataway, NJ: IEEE
    [Google Scholar]
  58. 58. 
    Moccia R, Selvaggio M, Villani L, Siciliano B, Ficuciello F 2019. Vision-based virtual fixtures generation for robotic-assisted polyp dissection procedures. 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems7934–39 Piscataway, NJ: IEEE
    [Google Scholar]
  59. 59. 
    Van Der Meijden OAJ, Schijven MP 2009. The value of haptic feedback in conventional and robot-assisted minimal invasive surgery and virtual reality training: a current review. Surg. Endosc. 23:1180–90
    [Google Scholar]
  60. 60. 
    Spinelli A, David G, Gidaro S, Carvello M, Sacchi M et al. 2018. First experience in colorectal surgery with a new robotic platform with haptic feedback. Colorectal Dis 20:228–35
    [Google Scholar]
  61. 61. 
    Slawinski PR, Taddese AZ, Musto KB, Obstein KL, Valdastri P 2017. Autonomous retroflexion of a magnetic flexible endoscope. IEEE Robot. Autom. Lett. 2:1352–59
    [Google Scholar]
  62. 62. 
    Haro BB, Zappella L, Vidal R 2012. Surgical gesture classification from video data. Medical Image Computing and Computer-Assisted Intervention – MICCAI 2012 N Ayache, H Delingette, P Golland, K Mori 34–41 Berlin: Springer
    [Google Scholar]
  63. 63. 
    Van Amsterdam B, Nakawala H, Momi ED, Stoyanov D 2019. Weakly supervised recognition of surgical gestures. 2019 International Conference on Robotics and Automation9565–71 Piscataway, NJ: IEEE
    [Google Scholar]
  64. 64. 
    Loukas C, Georgiou E. 2013. Surgical workflow analysis with Gaussian mixture multivariate autoregressive (GMMAR) models: a simulation study. Comput. Aided Surg. 18:47–62
    [Google Scholar]
  65. 65. 
    Beyl T, Nicolai P, Comparetti MD, Raczkowsky J, De Momi E, Wörn H 2016. Time-of-flight-assisted Kinect camera-based people detection for intuitive human robot cooperation in the surgical operating room. Int. J. Comput. Assist. Radiol. Surg. 11:1329–45
    [Google Scholar]
  66. 66. 
    Ahmidi N, Tao L, Sefati S, Gao Y, Lea C et al. 2017. A dataset and benchmarks for segmentation and recognition of gestures in robotic surgery. IEEE Trans. Biomed. Eng. 64:2025–41
    [Google Scholar]
  67. 67. 
    DiPietro R, Ahmidi N, Malpani A, Waldram M, Lee GI et al. 2019. Segmenting and classifying activities in robot-assisted surgery with recurrent neural networks. Int. J. Comput. Assist. Radiol. Surg. 14:2005–20
    [Google Scholar]
  68. 68. 
    Nageotte F, Zanne P, Doignon C, De Mathelin M 2009. Stitching planning in laparoscopic surgery: towards robot-assisted suturing. Int. J. Robot. Res. 28:1303–21
    [Google Scholar]
  69. 69. 
    Jackson RC, Desai V, Castillo JP, Çavuşoğlu MC 2016. Needle-tissue interaction force state estimation for robotic surgical suturing. 2016 IEEE/RSJ International Conference on Intelligent Robots and Systems3659–64 Piscataway, NJ: IEEE
    [Google Scholar]
  70. 70. 
    Zhong F, Wang Y, Wang Z, Liu YH 2019. Dual-arm robotic needle insertion with active tissue deformation for autonomous suturing. IEEE Robot. Autom. Lett. 4:2669–76
    [Google Scholar]
  71. 71. 
    Pedram SA, Ferguson P, Ma J, Dutson E, Rosen J 2017. Autonomous suturing via surgical robot: an algorithm for optimal selection of needle diameter, shape, and path. 2017 IEEE International Conference on Robotics and Automation2391–98 Piscataway, NJ: IEEE
    [Google Scholar]
  72. 72. 
    Staub C, Osa T, Knoll A, Bauernschmitt R 2010. Automation of tissue piercing using circular needles and vision guidance for computer aided laparoscopic surgery. 2010 IEEE International Conference on Robotics and Automation4585–90 Piscataway, NJ: IEEE
    [Google Scholar]
  73. 73. 
    Watanabe K, Kanno T, Ito K, Kawashima K 2018. Single-master dual-slave surgical robot with automated relay of suture needle. IEEE Trans. Ind. Electron. 65:6343–51
    [Google Scholar]
  74. 74. 
    Schulman J, Tayson-Frederick M. 2013. A case study of trajectory transfer through non-rigid registration for a simplified suturing scenario. 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems4111–17 Piscataway, NJ: IEEE
    [Google Scholar]
  75. 75. 
    Sen S, Garg A, Gealy DV, McKinley S, Jen Y, Goldberg K 2016. Automating multi-throw multilateral surgical suturing with a mechanical needle guide and sequential convex optimization. 2016 IEEE International Conference on Robotics and Automation4178–85 Piscataway, NJ: IEEE
    [Google Scholar]
  76. 76. 
    Chow DL, Newman W. 2013. Improved knot-tying methods for autonomous robot surgery. 2013 IEEE International Conference on Automation Science and Engineering461–65 Piscataway, NJ: IEEE
    [Google Scholar]
  77. 77. 
    Mayer H, Gomez F, Wierstra D, Nagy I, Knoll A, Schmidhuber J 2008. A system for robotic heart surgery that learns to tie knots using recurrent neural networks. Adv. Robot. 22:1521–37
    [Google Scholar]
  78. 78. 
    Knoll A, Mayer H, Staub C, Bauernschmitt R 2012. Selective automation and skill transfer in medical robotics: a demonstration on surgical knot-tying. Int. J. Med. Robot. Comput. Assist. Surg. 8:384–97
    [Google Scholar]
  79. 79. 
    Chow DL, Newman W. 2015. Trajectory optimization of robotic suturing. 2015 IEEE International Conference on Technologies for Practical Robot Applications Piscataway, NJ: IEEE https://doi.org/10.1109/TePRA.2015.7219672
    [Crossref] [Google Scholar]
  80. 80. 
    Leonard S, Wu KL, Kim Y, Krieger A, Kim PC 2014. Smart Tissue Anastomosis Robot (STAR): a vision-guided robotics system for laparoscopic suturing. IEEE Trans. Biomed. Eng. 61:1305–17
    [Google Scholar]
  81. 81. 
    Krieger A, Opfermann J, Kim PCW 2017. Development and feasibility of a robotic laparoscopic clipping tool for wound closure and anastomosis. J. Med. Devices 12:011005
    [Google Scholar]
  82. 82. 
    Jansen R, Hauser K, Chentanez N, Van Der Stappen F, Goldberg K 2009. Surgical retraction of nonuniform deformable layers of tissue: 2D robot grasping and path planning. 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems4092–97 Piscataway, NJ: IEEE
    [Google Scholar]
  83. 83. 
    Patil S, Alterovitz R. 2010. Toward automated tissue retraction in robot-assisted surgery. 2010 IEEE International Conference on Robotics and Automation2088–94 Piscataway, NJ: IEEE
    [Google Scholar]
  84. 84. 
    Elek R, Nagy TD, Nagy D, Garamvölgyi T, Takács B et al. 2017. Towards surgical subtask automation—blunt dissection. 2017 IEEE 21st International Conference on Intelligent Engineering Systems253–57 Piscataway, NJ: IEEE
    [Google Scholar]
  85. 85. 
    Nagy TD, Takacs M, Rudas IJ, Haidegger T 2018. Surgical subtask automation—soft tissue retraction. 2018 IEEE 16th World Symposium on Applied Machine Intelligence and Informatics55–60 Piscataway, NJ: IEEE
    [Google Scholar]
  86. 86. 
    Trejos AL, Jayender J, Perri MT, Naish MD, Patel RV, Malthaner RA 2009. Robot-assisted tactile sensing for minimally invasive tumor localization. Int. J. Robot. Res. 28:1118–33
    [Google Scholar]
  87. 87. 
    Back J, Dasgupta P, Seneviratne L, Althoefer K, Liu H 2015. Feasibility study- novel optical soft tactile array sensing for minimally invasive surgery. 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems1528–33 Piscataway, NJ: IEEE
    [Google Scholar]
  88. 88. 
    McKinley S, Garg A, Sen S, Kapadia R, Murali A et al. 2015. A single-use haptic palpation probe for locating subcutaneous blood vessels in robot-assisted minimally invasive surgery. 2015 IEEE International Conference on Automation Science and Engineering1151–58 Piscataway, NJ: IEEE
    [Google Scholar]
  89. 89. 
    Campisano F, Ozel S, Ramakrishnan A, Dwivedi A, Gkotsis N et al. 2017. Towards a soft robotic skin for autonomous tissue palpation. 2015 IEEE International Conference on Automation Science and Engineering6150–55 Piscataway, NJ: IEEE
    [Google Scholar]
  90. 90. 
    Bajo A, Simaan N. 2016. Hybrid motion/force control of multi-backbone continuum robots. Int. J. Robot. Res. 35:422–34
    [Google Scholar]
  91. 91. 
    Ayvali E, Ansari A, Wang L, Simaan N, Choset H 2017. Utility-guided palpation for locating tissue abnormalities. IEEE Robot. Autom. Lett. 2:864–71
    [Google Scholar]
  92. 92. 
    Nichols KA, Okamura AM. 2015. Methods to segment hard inclusions in soft tissue during autonomous robotic palpation. IEEE Trans. Robot. 31:344–54
    [Google Scholar]
  93. 93. 
    Ayvali E, Srivatsan RA, Wang L, Roy R, Simaan N, Choset H 2016. Using Bayesian optimization to guide probing of a flexible environment for simultaneous registration and stiffness mapping. 2016 IEEE International Conference on Robotics and Automation931–36 Piscataway, NJ: IEEE
    [Google Scholar]
  94. 94. 
    Chalasani P, Wang L, Roy R, Simaan N, Taylor RH, Kobilarov M 2016. Concurrent nonparametric estimation of organ geometry and tissue stiffness using continuous adaptive palpation. 2016 IEEE International Conference on Robotics and Automation4164–71 Piscataway, NJ: IEEE
    [Google Scholar]
  95. 95. 
    Constanciel E, N'Djin WA, Bessiere F, Chavrier F, Grinberg D et al. 2013. Design and evaluation of a transesophageal HIFU probe for ultrasound-guided cardiac ablation: simulation of a HIFU mini-maze procedure and preliminary ex vivo trials. IEEE Trans. Ultrason. Ferroelectr. Freq. Control 60:1868–83
    [Google Scholar]
  96. 96. 
    Wang H, Kang W, Carrigan T, Bishop A, Rosenthal N et al. 2011. In vivo intracardiac optical coherence tomography imaging through percutaneous access: toward image-guided radio-frequency ablation. J. Biomed. Opt. 16:110505
    [Google Scholar]
  97. 97. 
    Yang L, Wen R, Qin J, Chui CK, Lim KB, Chang SKY 2010. A robotic system for overlapping radiofrequency ablation in large tumor treatment. IEEE/ASME Trans. Mechatron. 15:887–97
    [Google Scholar]
  98. 98. 
    Su B, Tang J, Liao H 2015. Automatic laser ablation control algorithm for an novel endoscopic laser ablation end effector for precision neurosurgery. 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems4362–67 Piscataway, NJ: IEEE
    [Google Scholar]
  99. 99. 
    Sarli N, Del Giudice G, De S, Dietrich MS, Herrell SD, Simaan N 2018. Preliminary porcine in vivo evaluation of a telerobotic system for transurethral bladder tumor resection and surveillance. J. Endourol. 32:516–22
    [Google Scholar]
  100. 100. 
    Alambeigi F, Wang Z, Liu YH, Taylor RH, Armand M 2018. Toward semi-autonomous cryoablation of kidney tumors via model-independent deformable tissue manipulation technique. Ann. Biomed. Eng. 46:1650–62
    [Google Scholar]
  101. 101. 
    Taktak S, Jones P, Haq A, Rai BP, Somani BK 2018. Aquablation: a novel and minimally invasive surgery for benign prostate enlargement. Ther. Adv. Urol. 10:183–88
    [Google Scholar]
  102. 102. 
    Martin JW, Slawinski PR, Scaglioni B, Norton JC, Valdastri P, Obstein KL 2019. Assistive autonomy in colonoscopy: propulsion of a magnetic flexible endoscope. Gastrointest. Endosc. 89:AB76–77
    [Google Scholar]
  103. 103. 
    Okamura AM, Simone C, O'Leary MD 2004. Force modeling for needle insertion into soft tissue. IEEE Trans. Biomed. Eng. 51:1707–16
    [Google Scholar]
  104. 104. 
    Osa T, Abawi CF, Sugita N, Chikuda H, Sugita S et al. 2014. Autonomous penetration detection for bone cutting tool using demonstration-based learning. 2014 IEEE International Conference on Robotics and Automation290–96 Piscataway, NJ: IEEE
    [Google Scholar]
  105. 105. 
    Yip MC, Lowe DG, Salcudean SE, Rohling RN, Nguan CY 2012. Tissue tracking and registration for image-guided surgery. IEEE Trans. Med. Imaging 31:2169–82
    [Google Scholar]
  106. 106. 
    Peterlík I, Courtecuisse H, Rohling R, Abolmaesumi P, Nguan C et al. 2018. Fast elastic registration of soft tissues under large deformations. Med. Image Anal. 45:24–40
    [Google Scholar]
  107. 107. 
    Navarro-Alarcon D, Yip HM, Wang Z, Liu YH, Zhong F et al. 2016. Automatic 3-D manipulation of soft objects by robotic arms with an adaptive deformation model. IEEE Trans. Robot. 32:429–41
    [Google Scholar]
  108. 108. 
    Alambeigi F, Wang Z, Hegeman R, Liu YH, Armand M 2019. Autonomous data-driven manipulation of unknown anisotropic deformable tissues using unmodelled continuum manipulators. IEEE Robot. Autom. Lett. 4:254–61
    [Google Scholar]
  109. 109. 
    Pappone C, Ciconte G, Vicedomini G, Mangual JO, Li W et al. 2018. Clinical outcome of electrophysiologically guided ablation for nonparoxysmal atrial fibrillation using a novel real-time 3-dimensional mapping technique. Circ. Arrhythm. Electrophysiol. 11:e005904
    [Google Scholar]
  110. 110. 
    Decker R, Shademan A, Opfermann J, Leonard S, Kim PCW, Krieger A 2015. Performance evaluation and clinical applications of 3D plenoptic cameras. Next-Generation Robotics II; and Machine Intelligence and Bio-inspired Computation: Theory and Applications IX pap. 94940B Bellingham, WA: Soc. Photo-Opt. Instrum. Eng.
    [Google Scholar]
  111. 111. 
    Shademan A, Decker RS, Opfermann J, Leonard S, Kim PC, Krieger A 2016. Plenoptic cameras in surgical robotics: calibration, registration, and evaluation. 2016 IEEE International Conference on Robotics and Automation708–14 Piscataway, NJ: IEEE
    [Google Scholar]
  112. 112. 
    Bloch E, Thurin B, Keane P, Nousias S, Bergeles C, Ourselin S 2018. Retinal fundus imaging with a plenoptic sensor. Ophthalmic Technologies XXVIII pap. 1047429 Bellingham, WA: Soc. Photo-Opt. Instrum. Eng.
    [Google Scholar]
  113. 113. 
    Clancy NT, Jones G, Maier-Hein L, Elson DS, Stoyanov D 2020. Surgical spectral imaging. Med. Image Anal. 63:101699
    [Google Scholar]
  114. 114. 
    Yu L, Hao L, Meiqiong T, Jiaoqi H, Wei L et al. 2019. The medical application of terahertz technology in noninvasive detection of cells and tissues: opportunities and challenges. RSC Adv 9:9354–63
    [Google Scholar]
  115. 115. 
    Speidel S, Kroehnert A, Bodenstedt S, Kenngott H, Müller-Stich B, Dillmann R 2015. Image-based tracking of the suturing needle during laparoscopic interventions. Medical Imaging 2015: Image-Guided Procedures, Robotic Interventions, and Modeling RJ Webster, ZR Yaniv, pap. 94150B Bellingham, WA: Soc. Photo-Opt. Instrum. Eng.
    [Google Scholar]
  116. 116. 
    Gu Y, Hu Y, Zhang L, Yang J, Yang GZ 2018. Cross-scene suture thread parsing for robot assisted anastomosis based on joint feature learning. 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems769–76 Piscataway, NJ: IEEE
    [Google Scholar]
  117. 117. 
    Jackson RC, Yuan R, Chow DL, Newman WS, Çavuşoğlu MC 2018. Real-time visual tracking of dynamic surgical suture threads. IEEE Trans. Autom. Sci. Eng. 15:1078–90
    [Google Scholar]
  118. 118. 
    Padoy N, Hager GD. 2011. 3D thread tracking for robotic assistance in tele-surgery. 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems2102–7 Piscataway, NJ: IEEE
    [Google Scholar]
  119. 119. 
    D'Ettorre C, Dwyer G, Du X, Chadebecq F, Vasconcelos F et al. 2018. Automated pick-up of suturing needles for robotic surgical assistance. 2018 IEEE International Conference on Robotics and Automation1370–77 Piscataway, NJ: IEEE
    [Google Scholar]
  120. 120. 
    Beigi P, Rohling R, Salcudean T, Lessoway VA, Ng GC 2017. Detection of an invisible needle in ultrasound using a probabilistic SVM and time-domain features. Ultrasonics 78:18–22
    [Google Scholar]
  121. 121. 
    Mathiassen K, Dall'Alba D, Muradore R, Fiorini P, Elle OJ 2017. Robust real-time needle tracking in 2-D ultrasound images using statistical filtering. IEEE Trans. Control Syst. Technol. 25:966–78
    [Google Scholar]
  122. 122. 
    Zhong F, Liu Y. 2018. Image-based 3D pose reconstruction of surgical needle for robot-assisted laparoscopic suturing. Chin. J. Electron. 27:476–82
    [Google Scholar]
  123. 123. 
    Abayazid M, Roesthuis RJ, Reilink R, Misra S 2013. Integrating deflection models and image feedback for real-time flexible needle steering. IEEE Trans. Robot. 29:542–53
    [Google Scholar]
  124. 124. 
    Vrooijink GJ, Abayazid M, Patil S, Alterovitz R, Misra S 2014. Needle path planning and steering in a three-dimensional nonstatic environment using two-dimensional ultrasound images. Int. J. Robot. Res. 33:1361–74
    [Google Scholar]
  125. 125. 
    Patel NA, van Katwijk T, Gang LI, Moreira P, Shang W et al. 2015. Closed-loop asymmetric-tip needle steering under continuous intraoperative MRI guidance. 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society4869–74 Piscataway, NJ: IEEE
    [Google Scholar]
  126. 126. 
    Moreira P, Boskma KJ, Misra S 2017. Towards MRI-guided flexible needle steering using fiber Bragg grating-based tip tracking. 2017 IEEE International Conference on Robotics and Automation4849–54 Piscataway, NJ: IEEE
    [Google Scholar]
  127. 127. 
    Shahriari N, Georgiadis JR, Oudkerk M, Misra S 2018. Hybrid control algorithm for flexible needle steering: demonstration in phantom and human cadaver. PLOS ONE 13:e0210052
    [Google Scholar]
  128. 128. 
    Fu M, Kuntz A, Webster RJ, Alterovitz R 2018. Safe motion planning for steerable needles using cost maps automatically extracted from pulmonary images. 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems4942–49 Piscataway, NJ: IEEE
    [Google Scholar]
  129. 129. 
    Fagogenis G, Mencattelli M, Machaidze Z, Rosa B, Price K et al. 2019. Autonomous robotic intracardiac catheter navigation using haptic vision. Sci. Robot. 4:eaaw1977
    [Google Scholar]
  130. 130. 
    Saeidi H, Le HND, Opfermann JD, Leonard S, Kim A et al. 2019. Autonomous laparoscopic robotic suturing with a novel actuated suturing tool and 3D endoscope. 2019 International Conference on Robotics and Automation1541–47 Piscataway, NJ: IEEE
    [Google Scholar]
  131. Havaei M, Davy A, Warde-Farley D, Biard A, Courville A et al. 2017. Brain tumor segmentation with Deep Neural Networks. Med. Image Anal. 35:18–31
  132. Hu P, Wu F, Peng J, Liang P, Kong D 2016. Automatic 3D liver segmentation based on deep learning and globally optimized surface evolution. Phys. Med. Biol. 61:8676–98
  133. Qiu W, Yuan J, Ukwatta E, Sun Y, Rajchl M, Fenster A 2014. Prostate segmentation: an efficient convex optimization approach with axial symmetry using 3-D TRUS and MR images. IEEE Trans. Med. Imaging 33:947–60
  134. Li X, Chen H, Qi X, Dou Q, Fu CW, Heng PA 2018. H-DenseUNet: hybrid densely connected UNet for liver and tumor segmentation from CT volumes. IEEE Trans. Med. Imaging 37:2663–74
  135. Brainlab 2020. iPlan RT Planning Software: comprehensive package for premium radiosurgery. Brainlab. https://www.brainlab.com/radiosurgery-products/iplan-rt-treatment-planning-software
  136. Opfermann JD, Leonard S, Decker RS, Uebele NA, Bayne CE et al. 2017. Semi-autonomous electrosurgery for tumor resection using a multi-degree of freedom electrosurgical tool and visual servoing. In 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 3653–60. Piscataway, NJ: IEEE
  137. Nichols KA, Okamura AM 2013. Autonomous robotic palpation: machine learning techniques to identify hard inclusions in soft tissues. In 2013 IEEE International Conference on Robotics and Automation, pp. 4384–89. Piscataway, NJ: IEEE
  138. McKinley S, Garg A, Sen S, Gealy DV, McKinley JP et al. 2016. An interchangeable surgical instrument system with application to supervised automation of multilateral tumor resection. In 2016 IEEE International Conference on Automation Science and Engineering, pp. 821–26. Piscataway, NJ: IEEE
  139. US Dep. Transport. 2020. Ensuring American leadership in automated vehicle technologies: automated vehicles 4.0. Rep., Natl. Sci. Technol. Counc. and US Dep. Transport., Washington, DC
  140. US Dep. Def. 2017. Toward a next-generation trauma care capability: Foundational Research for Autonomous, Unmanned, and Robotics Development of Medical Technologies (FORwARD) award. Fund. Oppor. W81XWH-17-MSISRP-FOR, US Dep. Def., Washington, DC
  141. O'Sullivan S, Nevejans N, Allen C, Blyth A, Leonard S et al. 2019. Legal, regulatory, and ethical frameworks for development of standards in artificial intelligence (AI) and autonomous robotic surgery. Int. J. Med. Robot. Comput. Assist. Surg. 15:e1968
  142. Eur. Parliam. 2017. European Parliament resolution of 16 February 2017 with recommendations to the Commission on Civil Law Rules on Robotics. Resol. P8_TA(2017)0051, Eur. Parliam., Strasbourg, Fr.
  143. Eur. Parliam. 2019. Comprehensive European industrial policy on artificial intelligence and robotics. Resol. P8_TA(2019)0081, Eur. Parliam., Strasbourg, Fr.
  144. O'Sullivan S, Leonard S, Holzinger A, Allen C, Battaglia F et al. 2020. Operational framework and training standard requirements for AI-empowered robotic surgery. Int. J. Med. Robot. Comput. Assist. Surg. 16: https://doi.org/10.1002/rcs.2020
  145. Jamjoom AAB, Jamjoom AMA, Marcus HJ 2020. Exploring public opinion about liability and responsibility in surgical robotics. Nat. Mach. Intell. 2:194–96
  146. Shah R, Nagaraja S 2019. Privacy with surgical robotics: challenges in applying contextual privacy theory. arXiv:1909.01862 [cs.CR]
  147. Haidegger T 2019. Autonomy for surgical robots: concepts and paradigms. IEEE Trans. Med. Robot. Bionics 1:65–76
  148. Datteri E 2013. Predicting the long-term effects of human-robot interaction: a reflection on responsibility in medical robotics. Sci. Eng. Ethics 19:139–60
  149. Stahl BC, Coeckelbergh M 2016. Ethics of healthcare robotics: towards responsible research and innovation. Robot. Auton. Syst. 86:152–61
  • Article Type: Review Article