MP14-05 KNOWLEDGE-BASED ACTIVITY RECOGNITION DURING ROBOT-ASSISTED SURGERY: BABY STEPS TOWARDS AUTONOMOUS SURGERY

Ashirwad Chowriappa, Mohamed Sharif, Syed Johar Raza, and Khurshid Guru

Journal of Urology, Technology & Instruments: Surgical Education & Skills Assessment II, 1 Apr 2014. https://doi.org/10.1016/j.juro.2014.02.633

INTRODUCTION AND OBJECTIVES: Advances in automation are well established in both the automobile and aviation industries, and automated surgical systems are the next frontier for Robot-Assisted Surgery (RAS). Surgical step recognition is the key maneuver needed to initiate automation during RAS. A novel methodology for the automated detection of surgical activities from intraoperative sequences during RAS is presented.

METHODS: Activity sequences during RAS were classified and labeled by expert robotic surgeons based on tool- and tissue-related criteria (e.g., cautery, camera motion, tissue dissection). An approach to surgical activity recognition capable of identifying known activities and distinguishing them from unknown ones was developed and implemented. Because these activities are spatio-temporally correlated, a procedure-centered description is used to extract eight perceptually characteristic features that capture the three-dimensional structure of the surgical activities.

RESULTS: The method was validated using 96 expert-annotated samples of various surgical activities, extracted from 6 Robot-Assisted Radical Prostatectomies (RARP). High-dimensional (512 × n) feature vectors were extracted frame-wise from the activity sequences in RARP.
These sequences are warped non-linearly in the time domain using Dynamic Time Warping (DTW) to find the closest matching sub-sequence within the surgical procedure. Similar activities tend to cluster within the same region of a multidimensional space whose axes are the perceptual properties. Closeness between high-dimensional sequences was measured with the Manhattan distance (L1 norm). The results demonstrate that actions of the same activity generate a lower score than activities of a different nature (Figure).

CONCLUSIONS: Automated activity sequence recognition during RAS paves the way for surgeon-independent procedures.

© 2014. Journal of Urology, Volume 191, Issue 4S, April 2014, pages e168–e169.
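The DTW sub-sequence matching described in the Results can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: it assumes frame-wise feature vectors represented as lists of floats (the abstract's actual features are 512-dimensional), uses the Manhattan (L1) norm as the local cost as stated, and uses an open-begin/open-end DTW variant so a short activity query can align to any contiguous sub-sequence of a longer procedure recording.

```python
def l1(a, b):
    """Manhattan (L1) distance between two feature vectors."""
    return sum(abs(x - y) for x, y in zip(a, b))

def dtw_subsequence(query, reference):
    """Score the best non-linear alignment of `query` against any
    contiguous sub-sequence of `reference` (open-begin, open-end DTW).
    Lower scores indicate more similar activity sequences."""
    n, m = len(query), len(reference)
    INF = float("inf")
    # cost[i][j]: best cost of aligning query[:i] with a sub-sequence
    # of reference ending at frame j-1.
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0] = [0.0] * (m + 1)  # the query may start anywhere in reference
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = l1(query[i - 1], reference[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # stretch query frame
                                 cost[i][j - 1],      # skip reference frame
                                 cost[i - 1][j - 1])  # one-to-one match
    # the query may end anywhere in reference: best value in the last row
    return min(cost[n][1:])
```

Under this scheme, a query drawn from the same activity as some portion of the procedure yields a low score, while an unrelated activity yields a higher one, matching the score behavior reported in the abstract.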