Pupillometry in telerobotic surgery: A comparative evaluation of algorithms for cognitive effort estimation

Authors

  • Paola Ruiz Puentes, Johns Hopkins University
  • Roger D. Soberanis-Mukul, Johns Hopkins University
  • Ayberk Acar, Vanderbilt University
  • Iris Gupta, Johns Hopkins University
  • Joyraj Bhowmick, Johns Hopkins University
  • Yizhou Li, Vanderbilt University
  • Ahmed Ghazi, University of Rochester Medical Center
  • Peter Kazanzides, Johns Hopkins University
  • Jie Ying Wu, Vanderbilt University
  • Mathias Unberath, Johns Hopkins University

DOI:

https://doi.org/10.54844/mr.2023.0420

Abstract

Background: Eye-gaze tracking and pupillometry are emerging topics in telerobotic surgery, as they are expected to enable novel gaze-based interaction paradigms and provide insight into the user's cognitive load (CL). Further, the successful integration of CL estimation into telerobotic systems is thought to catalyze the development of new human-computer interfaces for personalized assistance and training. However, this field is in its infancy, and identifying reliable gaze- and pupil-tracking solutions for robotic surgery remains an area of ongoing research and high uncertainty.

Methods: Considering the potential benefits of pupillometry-based CL assessment in telerobotic surgery, we seek to better understand the possibilities and limitations of contemporary pupillometry-based cognitive effort estimation algorithms in this setting. To this end, we conducted a user study on the da Vinci Research Kit (dVRK) comprising two experiments in which participants performed a series of N-Back tests, either while (i) idling or (ii) performing a peg transfer task. We then compared four contemporary CL estimation methods based on direct analysis of pupil diameter in the spatial and frequency domains.

Results: We find that while some methods can detect the presence of cognitive effort in simple scenarios (e.g., when the user is not performing any manual task), they fail to differentiate between levels of CL. Similarly, when the manual peg transfer task is added, the reliability of all models is compromised, highlighting the need for more robust methods that incorporate factors complementing the pupil diameter information.

Conclusion: Our results offer a quantitative perspective on the limitations of current solutions and highlight the need for designs tailored to the telerobotic surgery environment.
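The specific four methods compared in the study are not reproduced here, but the general shape of a pupillometry-based CL analysis in the spatial and frequency domains can be sketched. The function names, the 100 ms smoothing window, and the 0.5 Hz cutoff below are illustrative assumptions, not the study's actual implementation; established frequency-domain indices such as the Index of Pupillary Activity typically use wavelet analysis rather than the simple FFT power ratio shown here.

```python
import numpy as np

def preprocess_pupil(diam, fs, blink_value=0.0):
    """Fill blink gaps (samples at or below `blink_value`) by linear
    interpolation, then smooth with a ~100 ms moving average.
    Illustrative preprocessing, not the study's pipeline."""
    d = np.asarray(diam, dtype=float)
    idx = np.arange(len(d))
    valid = d > blink_value
    d = np.interp(idx, idx[valid], d[valid])   # bridge blink gaps linearly
    n = max(1, int(0.1 * fs))                  # ~100 ms window at sampling rate fs
    return np.convolve(d, np.ones(n) / n, mode="same")

def spatial_cl_score(diam, baseline):
    """Spatial-domain proxy: mean task-evoked dilation relative to a
    resting baseline recording."""
    return float(np.mean(diam) - np.mean(baseline))

def frequency_cl_score(diam, fs, low_cut=0.5):
    """Frequency-domain proxy: fraction of pupil-signal power above
    `low_cut` Hz, a crude stand-in for oscillation-based indices."""
    d = np.asarray(diam, dtype=float) - np.mean(diam)
    freqs = np.fft.rfftfreq(len(d), d=1.0 / fs)
    power = np.abs(np.fft.rfft(d)) ** 2
    lo = power[(freqs > 0) & (freqs <= low_cut)].sum()
    hi = power[freqs > low_cut].sum()
    return float(hi / (lo + hi + 1e-12))       # in [0, 1]
```

In a study like the one described, scores of this kind would be computed per N-Back condition and compared across difficulty levels; the paper's finding is precisely that such pupil-diameter-only measures lose discriminative power once a concurrent manual task is introduced.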


Published

2023-08-31

Section

Original Articles
