Volume-based Graphics and Haptics Rendering Algorithms for Immersive Surgical Simulation

2015-10-25, by Silvio Rizzi
This work presents the research and development involved in solving essential problems in the emerging field of surgical simulation. It focuses on a haptics-based Augmented Reality surgical simulation platform known as ImmersiveTouch®, which implements technologies patented by the Board of Trustees of the University of Illinois. Through the nine chapters of this thesis, a gradual transition to new paradigms in surgical simulation is developed: starting with methods to create patient-specific 3D models for training and pre-operative planning; continuing with the development of a voxel-based haptics algorithm, its performance evaluation, and extensions for multipoint collision detection; followed by the introduction of combined graphics and haptics techniques for simulating bone-removal procedures; and culminating in the successful implementation of surgical simulation modules on the ImmersiveTouch® platform. Multiple validation experiments are also presented, in which some of the contributions of this thesis are applied in simulation modules evaluated in surgical training scenarios, with promising and encouraging outcomes. Multi-disciplinary collaboration is one of the highlights of this work, with scientifically sound results published in prestigious peer-reviewed engineering and medical journals and conferences.