Mining surgery phase-related sequential rules from vertebroplasty simulation traces
Abstract
In this paper, we present an algorithm for extracting perceptual-gestural rules from heterogeneous, multi-source traces. The challenge we address is twofold: 1) representing traces such that they coherently render all aspects of this multimodal knowledge; 2) ensuring that key tutoring services can be produced on top of the represented traces. In the spirit of the automatic knowledge acquisition paradigm proposed in the literature, we implemented PhARules, a modified version of an existing algorithm, CMRules, for mining surgery phase-aware sequential rules from simulated surgery traces. We demonstrate the efficiency of our algorithm, as well as its performance limits, on traces of vertebroplasty simulations recorded in TELEOS, an Intelligent Tutoring System dedicated to percutaneous orthopedic surgery.