Frameless Navigation
Navigation in Neurosurgery
Frame-based image-guided navigation:
Before the procedure, the patient’s head is scanned with MRI. After this first scan, the frame needed for the operation is applied shortly before surgery. The frame fixes the head so that the surgeon can insert an endoscope or needle into the brain with high accuracy. Once the frame is in place, the patient is scanned again with CT, and both datasets are fused to make it possible to plan the concrete procedural steps. To compute the transformation between the different datasets, fiducial localization plates are mounted on the frame. These plates make it possible to track the frame and movements of the head and to adjust the datasets accordingly. Usually these trackers are used in combination with a stereoscopic camera that emits an infrared signal. The signal is reflected by the fiducial plates, and from each reflection the position of a marker sphere is calculated. In order to calculate the exact position of the body part on which the markers are mounted, the computer needs at least three marker spheres.
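The last step, recovering the rigid transformation between image space and tracker space from at least three corresponding marker positions, can be sketched as a least-squares fit (the Kabsch algorithm). This is a minimal illustration under invented marker coordinates, not the algorithm of any particular navigation system:

```python
import numpy as np

def rigid_registration(markers_image, markers_tracker):
    """Least-squares rigid transform (rotation R, translation t) mapping
    image-space marker positions onto tracker-space positions.
    Requires at least three non-collinear markers."""
    p = np.asarray(markers_image, dtype=float)
    q = np.asarray(markers_tracker, dtype=float)
    p_c, q_c = p.mean(axis=0), q.mean(axis=0)
    H = (p - p_c).T @ (q - q_c)             # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = q_c - R @ p_c
    return R, t

# three markers in image space and their tracked positions (here a pure shift)
image_pts = [[0, 0, 0], [10, 0, 0], [0, 10, 0]]
tracked_pts = [[5, 2, 1], [15, 2, 1], [5, 12, 1]]
R, t = rigid_registration(image_pts, tracked_pts)
```

Any further point from the preoperative images can then be mapped into tracker space as `R @ point + t`.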
This procedure to plan and execute neurosurgery is called image-guided neurosurgery and has clear advantages:
- Lower risk of complications due to the minimally invasive nature of the procedure, since the skull opening can be small
- Due to the extensive preoperative scanning and planning, the surgery is completed with higher accuracy
- The surgeon can operate with less pressure and higher confidence because of the preoperative calculations
- Brain functions are better preserved during the operation
Problems with such a frame are poor visibility and low flexibility during the surgery.
Frame-based image guidance is used in many different procedures such as neuro-endoscopy, biopsy, and radiosurgery [1].
Video of positioning a biopsy frame [2]
Frameless image-guided biopsy
As with frame-based biopsies, this procedure requires preoperative medical imaging to register the brain’s surface and localize the region to be biopsied. Afterwards, fiducials are placed on the head to make it possible to localize and track the position of the head in real time during the operation. To avoid large movements by the patient, the head has to be immobilized. In contrast to frame-based biopsies, however, the head is not fixed with a large frame; instead, a three-point Mayfield clamp is used to stabilize it.
Comparing the results of frame-based and frameless biopsies performed on real patients, frame-based biopsies delivered a definitive diagnosis in 95.2 % of cases, whereas frameless biopsies reached a definitive diagnosis in 89.4 % of cases.
On the other hand, 20.6 % of frame-based biopsies had complications during the procedure, compared with 19.4 % of frameless procedures.
This shows that, at this point, frame-based biopsies have a higher diagnostic accuracy but also a slightly higher complication rate than frameless biopsies [3].
Fusion of different imaging techniques for tumor location [4]
Frameless image-guided radiosurgery
Radiosurgery demands high accuracy and real-time adjustment to movements in the brain. This accuracy can be achieved by combining preoperative medical imaging with real-time tracking via infrared fiducials. Before the patient’s head is scanned with CT and MRI, the non-invasive fixation has to be fitted to the patient. Mayfield clamps are widely used to fixate the head, not just in radiosurgery but also in frameless biopsies. Once the fixation is in place, the infrared fiducials are mounted on the head. This allows real-time localization and monitoring of the head and the tumour so that the irradiation of the brain can be adjusted precisely. The preoperative CT and MRI scans are fused together and form the basis of the procedural plan.
During the operation, x-ray images are taken to position the head exactly on the operating table. The x-ray images are compared with the fused CT and MRI datasets, and the correct position on the table is calculated. The radiation is then delivered along the previously planned path by a surgical robot [5].
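The position correction derived from this comparison can be illustrated with a small example: if the registration between the planned pose (from the fused CT/MRI data) and the pose observed in the x-ray images is expressed as a rigid transform (R, t), then applying the inverse transform gives the correction that moves the head back onto the planned position. The numbers below are invented for illustration:

```python
import numpy as np

# Hypothetical registration result: the observed head pose differs from
# the planned pose by a rigid transform (R, t), here a pure offset in mm.
R = np.eye(3)                     # no residual rotation in this toy case
t = np.array([4.0, -2.5, 1.0])    # observed positional offset

# The correction is the inverse transform.
R_corr = R.T
t_corr = -R.T @ t

planned_point = np.zeros(3)                     # a planned target point
observed = R @ planned_point + t                # where it actually appears
corrected = R_corr @ observed + t_corr          # back at the planned position
```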
The CyberKnife robot has six degrees of freedom and can reach 1200 different positions from which to deliver radiation, allowing it to irradiate a high percentage of the cancerous tissue [6].
Set-up for radiosurgery [7]
Different Fiducials and their accuracy
Fiducials are used to compute and adjust for movements of the patient during operations where high accuracy is needed. Before the introduction of fiducials into operating rooms, small tattoos, called skin markers, were aligned before the operation to position the patient correctly. The skin markers were aligned with the help of lasers and yielded a general positional accuracy of 1 cm.
The infrared fiducials that are common in frame-based neurosurgery are mounted on the frame and reflect an infrared signal emitted from a stereoscopic camera.
Electromagnetic fiducial tracking systems are gaining importance in surgery because of their smaller size and higher accuracy. This localization and monitoring technique consists of implanting electromagnetic fiducials into the patient. The electromagnetic signal received from the different implants makes it possible to calculate the exact position of the patient. This simple, frameless tracking system is rapidly set up, but the implanted electromagnetic emitters are comparatively large, and electromagnetic interference can distort the signal and make the measurements unusable.
A recent alternative is to use radioactive fiducials instead of electromagnetic ones. Like electromagnetic fiducial tracking, this allows the 3D position of the body part to be tracked in real time, based on the radiation emitted by the implants. These implants are smaller than electromagnetic implants and allow for a less invasive placement of the fiducials. In addition to their small size, the implants are visible in MRI scans and are not exposed to the risk of interference. The sensors are not fixed in place but can be moved to track the required angle between the radioactive source and a specific axis, and can rotate to increase the received signal strength. In general, three sensors are enough to calculate the exact position, but a fourth sensor is often added for better error control. This tracking system detects and calculates the position of an implant with an accuracy of 1 mm.
A disadvantage is the risk of interference with other radioactive sources in the body. Therefore, often only one implant is placed to minimize this risk.
At the moment, this system is mainly used in the treatment of prostate cancer, but researchers are trying to expand the field of usage of such a tracking system [8].
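The position calculation from several sensors can be sketched as a classic trilateration problem: each measured source-to-sensor distance defines a sphere, and subtracting the sphere equations pairwise yields a small linear least-squares system. The sensor layout and function name below are illustrative, not taken from the cited system:

```python
import numpy as np

def locate_implant(sensor_positions, distances):
    """Estimate the 3D position of an implant from its distances to fixed
    sensors. Subtracting the first sphere equation |x - s_i|^2 = d_i^2 from
    the others linearizes the problem; four non-coplanar sensors give a
    unique solution plus redundancy for error control."""
    s = np.asarray(sensor_positions, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = 2.0 * (s[1:] - s[0])
    b = (d[0] ** 2 - d[1:] ** 2) + np.sum(s[1:] ** 2, axis=1) - np.sum(s[0] ** 2)
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# four sensors (mm) around a known test position, distances computed exactly
sensors = [[0, 0, 0], [100, 0, 0], [0, 100, 0], [0, 0, 100]]
true_pos = np.array([20.0, 30.0, 40.0])
d = [np.linalg.norm(true_pos - np.asarray(s)) for s in sensors]
estimate = locate_implant(sensors, d)
```

With noisy real measurements, the residual of the least-squares fit is what the redundant fourth sensor contributes for error control.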
Optical skin marker for positioning [9] (left) and fiducial implant (right) [10]
Surgical Simulators
Surgical Training
Surgical simulators are computer software (and hardware) developed to simulate surgical procedures. These technologies are employed for the training of surgeons and other medical professionals without the need to perform the procedure on patients. In earlier times, surgeons started performing surgeries without any significant experience, and the implications and results of the surgery were not well known. With the advancement of technology, and especially of surgical simulators, surgeons are now able to perform typical procedures in these systems and train for a period of time on virtual patients before moving on to animals or even real patients. Even though the field is still young, surgical simulators are used to develop several skills, such as eye–hand coordination and the differentiation of dimensionality. The latter matters because in many procedures the surgeon depends solely on a screen as a viewpoint; hence, one must be able to perform 3D movements with a 2D screen as the main visual feedback [11][12].
One of the medical procedures on which surgeons are trained is laparoscopic surgery. Even in real life, the surgeon has to perform this surgery through cameras and other instruments that are completely inside the patient. Hence, the simulations are very advanced and model the real situation very accurately. The surgical simulator provides the surgeon with haptic feedback, force feedback, collision detection, and realistic movements of the organs as the surgeon performs the surgery [11][13].
Surgical Simulation Example (up), Surgical Simulation Point-of-view (down) [14][15]
Surgical Rehearsal Platforms
Besides generic surgical simulators, there are surgical simulators developed as rehearsal platforms, where the surgeon can draw on data from the actual patient. These technologies use CT or MRI images to render 3D volumes of the patient’s anatomy, effectively bringing the real patient virtually into the surgical simulator. This is especially helpful for very complicated cases, where the surgeon is not sure how to proceed with the surgery. It is therefore vital for the surgeon and the patient to have a platform on which to rehearse before performing the actual surgery. As discussed in [16], the usage of simulator technology, especially as surgical rehearsal platforms for microsurgery, is still in its very first steps. However, very promising platforms are being developed, such as the “Selman Surgical Rehearsal Platform” for use in intracranial micro-vascular procedures.
Surgical Simulator (Surgical Theater) [17]
Simulator Parts
Depending on the procedure for which the simulator will be employed, there are different specifications and parts. However, several specifications are generic to surgical simulators. Starting with graphics and view: the user should have a clear view of the virtual patient (the surgical site). After the volume is rendered, the results can be displayed on a head-mounted device (such as a VR headset), on a hand-held device such as an iPad (motion-sensitive and point-of-view specific), or on a normal monitor. Another very important, mostly software-based component handles organ movement and deformation: the software should depict organ movements and deformations realistically, while enforcing limits that prevent physically impossible deformations. A further important part is how the surgeon performs the surgery and moves the instruments; depending on the implementation, this may be done through real devices that are augmented in the software, bare hands, joysticks, or even screen movements and touch pads. Lastly, there should be a feedback mechanism so that the surgeon can feel the procedure and evaluate the force of the movements. The implementation of haptic and force feedback, through which the surgeon feels pressure and other physical factors throughout the procedure, is vital for the success of the simulator [18][19][20].
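Haptic feedback of the kind described above is commonly realized with a penalty-based spring–damper contact model: once the virtual tool penetrates the tissue surface, a restoring force proportional to penetration depth and approach speed is sent to the device. The following is a minimal sketch with made-up gains, not the model of any specific simulator:

```python
import numpy as np

def contact_force(tool_pos, tool_vel, surface_point, surface_normal,
                  stiffness=500.0, damping=5.0):
    """Penalty-based haptic force: when the virtual tool penetrates the
    tissue surface, push back along the surface normal proportionally to
    penetration depth (spring) and approach speed (damper).
    Gains are illustrative, not tuned for any real device."""
    n = np.asarray(surface_normal, dtype=float)
    n = n / np.linalg.norm(n)
    depth = np.dot(np.asarray(surface_point) - np.asarray(tool_pos), n)
    if depth <= 0.0:                 # tool above the surface: no contact
        return np.zeros(3)
    v_n = np.dot(tool_vel, n)        # velocity component along the normal
    return (stiffness * depth - damping * v_n) * n   # resists penetration
```

In a real simulator this force would be recomputed at the haptic update rate (typically around 1 kHz) and sent to the device each cycle.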
Simulator Examples
According to Jeffrey Ponsky, “A surgeon or endoscopist should be able to work with instruments in an environment that feels just like that of working on a real patient. There should not be the feeling of a game or artificial environment”. To meet these expectations, many advanced technologies have been developed that try to realize this vision. One platform specific to minimally invasive spine surgeries is the Spine Mentor from 3D Systems [21].
Spine Mentor from 3D Systems [22]
Another very successful company in this area is TouchSurgery, which has implemented many procedures in its app, ranging from dentistry to orthopedics and neurosurgery. Some examples of brain surgeries are “Pterional Craniotomy” and “Acute Trauma Craniotomy”. Even though these simulations exist only as apps, they have proven very helpful for surgeons [23].
TouchSurgery [24]
Lastly, the CAE LapVR is another example of a very successful simulator in the area of laparoscopic surgery. The company provides an immersive, risk-free laparoscopic training environment [19].
Laparoscopic surgery simulator from CAE LapVR [25]
Just beyond the horizon ...
The Horizon 2020 program of the European Commission is the largest research funding program of the European Union to date. With almost 80 billion Euro, the goal is to promote cutting edge scientific research and technological development both in the public and the private sector. Several projects related to the technological aspect of brain surgery are part of Horizon 2020 [26].
EDEN2020
One of the ongoing projects financed by the Horizon 2020 program is EDEN2020 (Enhanced Delivery Ecosystem for Neurosurgery in 2020). Several groups are working on different subprojects with the common goal of creating a framework for the planning and execution of minimally invasive brain surgeries. While the initial focus is on brain cancer therapy, other applications are envisaged. Several surgical needs will be met by the system. The main goal is the development of steerable catheters that are operated by a robot (Neuromate) and can be used for an extended period of time. Other milestones include monitoring the deformation/shift of the brain during the operation with improved accuracy and update rate, and predicting drug diffusion to allow more precise planning and accurate targeting of the medication [27].
In Munich, the focus is on image acquisition, more precisely on ultrasound (US). While ultrasound can provide valuable information on soft tissue, the quality and usability of US images are influenced by several factors. In practice, acoustic attenuation, shadowing, or reverberation can heavily influence the image. In the case of intraoperative US, where the patient is monitored over a prolonged span of time, another problem arises: reproducibility. To circumvent this problem, the trajectory of the US probe on the surface can be computed beforehand and executed by a robotic arm to minimize the deviation from that trajectory. Using this approach, the optimal acoustic window can be hit and the need for rescanning is reduced. Furthermore, if the acoustic window is small and constricted, the robotic arm can scan the area more accurately and optimally. The planning process requires precise scanning of the patient in order to calculate the trajectory. In practice, intraoperative ultrasound (iUS) can be used to detect brainshift during the procedure as well as to precisely localize the catheters and monitor their trajectory [28][29].
A robotic ultrasound arm with a test dummy [28]
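Minimizing the deviation from the planned trajectory implies being able to measure it; one simple quality metric is the maximum distance of the executed probe positions from the planned path, treated as a polyline. This is an illustrative sketch with invented coordinates, not part of the cited planning pipeline:

```python
import numpy as np

def max_deviation(planned, executed):
    """Maximum distance (same units as the input, e.g. mm) of executed
    probe positions from the planned trajectory, treated as a polyline."""
    planned = np.asarray(planned, dtype=float)
    executed = np.asarray(executed, dtype=float)

    def point_to_segment(p, a, b):
        ab = b - a
        t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
        return np.linalg.norm(p - (a + t * ab))  # distance to closest point

    return max(
        min(point_to_segment(p, planned[i], planned[i + 1])
            for i in range(len(planned) - 1))
        for p in executed
    )

plan = [[0, 0, 0], [10, 0, 0]]                       # straight planned path
actual = [[1, 0.2, 0], [5, -0.5, 0], [9, 0.1, 0]]    # logged probe positions
deviation = max_deviation(plan, actual)
```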
Another area of research within the EDEN2020 project is the development of needles that cause less damage to surrounding tissue and can be steered to the designated area more precisely. A new, biologically inspired approach is a flexible needle that consists of four segments which can be moved independently, instead of a traditional needle that is inserted at a steady pace. During insertion of this needle prototype, in each iteration one segment is pushed forward while the other three parts are slightly pulled back. This motion scheme reduces the strain on the surrounding tissue and reduces its disruption. In a clinical setting, these are two parameters that influence the success of a procedure, since disruption and torsion of the tissue reduce the accuracy of the needle insertion. Due to the flexible nature of the needle, sensitive areas can be avoided during surgery and the risk of buckling is reduced [30][31].
Model of the flexible segmented needle, and how the directional movement is acquired [30].
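The reciprocating motion scheme can be illustrated with a toy kinematic sketch (pure bookkeeping of segment positions, no tissue mechanics): in each step one segment advances while the other three are pulled back slightly, so the needle as a whole still gains ground over a full cycle. The step sizes are invented:

```python
def reciprocate(cycles, push=1.0, pull=0.2):
    """Toy model of the four-segment reciprocating insertion: per step,
    one segment is pushed forward by `push` while the other three are
    retracted by `pull`. Net advance per full cycle is push - 3 * pull."""
    segments = [0.0, 0.0, 0.0, 0.0]   # axial position of each segment
    for step in range(4 * cycles):
        active = step % 4             # each segment takes a turn advancing
        for i in range(4):
            if i == active:
                segments[i] += push
            else:
                segments[i] -= pull
    return segments

positions = reciprocate(5)  # net gain of 0.4 per cycle with these step sizes
```

The point of the scheme is that most of the sliding happens between segments rather than between the whole needle and the tissue, which is what reduces tissue strain.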
The "Neuromate", a stereotactic robot delivered by Renishaw, which will hold the steerable catheter during surgery [32].
The consortium's vision for a modern operating theatre. One can see the stereotactic robot, which holds the steerable catheter. The position of this catheter is monitored by intraoperative ultrasound [33].
SMart weArable Robotic Teleoperated Surgery (SMARTSurg)
The project aims to create a new platform for robot-assisted minimally invasive surgery (RAMIS), including the appropriate hardware and software based on current technology. The main objectives are the development of:
- precise surgical instruments
- wearable hand exoskeleton with haptic feedback to control the surgical instruments
- smart glasses for augmented reality 3D visualisation of the operation target
The system focuses on the usability of the appliances, with a short training time and intuitive operation of the surgical instruments, as well as on enhanced precision, safety, and a reduction of surgery time. The new technology shall enable a wider application of keyhole surgery and thus improve the patient's outcome [34][35].
Robot-Assisted Flexible Needle Steering for Targeted Delivery of Magnetic Agents (ROBOTAR)
The ROBOTAR project focuses on the steering of needles in order to prevent common problems that arise with the use of rigid needles, such as the deformation of tissue and the deviation of the needle from its trajectory. The idea is to steer the needle magnetically under ultrasound guidance. The main challenges to be overcome in the course of the project are:
- 3D models describing the needle in motion need to be generated
- the needle needs to be controlled in a real-time setting
- the magnetic agents need to be tracked via US
Ultrasound has no adverse effects on health, provides high frame rates, and is less cost-intensive than other methods. In the long run, the invasiveness of surgeries could be reduced, which would benefit the patient [36][37].
References
- Mezger U., Jendrewski C., Bartels M.: „Navigation in surgery“ Springer Verlag 2013, DOI: 10.1007/s00423-013-1059-4
- “UTSW stereotactic surgery 3 placement of frame on patient MR.wmv”, https://www.youtube.com/watch?v=PsL9B7ftbG4 [Accessed: 16-June-2017]
- Lu Y., Yeung C.: „Comparative Effectiveness of Frame-based, Frameless and Intraoperative MRI Guided Brain Biopsy Techniques“ https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4450019/ [Accessed: 16-June-2017]
- http://ars.els-cdn.com/content/image/1-s2.0-S092523121630649X-gr1.jpg [Accessed: 16-June-2017]
- Breneman J. C., Steinmetz R., Smith A., Lamba M., Warnick R. E.: „Frameless Image-Guided Intracranial Stereotactic Radiosurgery: Clinical Outcomes For Brain Metastases“, Elsevier, 2009, DOI: 10.1016/j.ijrobp.2008.11.015
- Voynov G., Heron D.E., Burton S., Grandis J., Quinn A., Ferris R.: “Frameless Stereotactic Radiosurgery for Recurrent Head and Neck Carcinoma”, Adenine Press , 2005, ISSN 1533-0346
- https://jkma.org/ArticleImage/0119JKMA/jkma-51-630-g001-l.jpg [Accessed: 16-June-2017]
- Shchory T., Schifter D., Lichtman R., Neustadter D., Corn, B. W.: „Tracking Accuracy Of A Real-Time Fiducial Tracking System For Patient Positioning And Monitoring In Radiation Therapy”, Elsevier, 2010, DOI: 10.1016/j.ijrobp.2010.01.067
- http://ackermancancercenter.com/blog/what-are-fiducial-markers-and-why-do-i-need-them [Accessed: 16-June-2017]
- https://openi.nlm.nih.gov/imgs/512/54/3357627/PMC3357627_RRP2012-197364.001.png [Accessed: 16-June-2017]
- Anderik, K. (2013). "Computer and videogames: a British phenomenon around the world", White Paper, vol. 2, issue 2, pp. 70 – 78.
- Baer R., "Television and Gaming Apparatus: The new era of entertainment", PLOS, 1990.
- Bradley H., "Can video games be used to predict or improve laparoscopic skills?" Journal of Endourology, vol. 19, issue 3, pp. 372 – 376, 2005.
- Surgical Simulator (example) - Image [Online] Available: http://healthysimulation.com/wp-content/uploads/2014/07/RealSpine.jpg [Accessed: 13-June-2017]
- Surgical Simulator (point of view) - Image [Online] Available: http://a5.mzstatic.com/us/r30/Purple/v4/ef/d5/84/efd5847b-1acf-ed08-0abf-b53bcd026428/screen800x500.jpeg [Accessed: 13-June-2017]
- Bambakidis NC., Selman WR., Sloan AE., “Surgical rehearsal platform: potential uses in microsurgery”, Neurosurgery, vol. 73, issue. 1, October 2013.
- Surgical Theater, "Surgical Theater Press Reel", Surgical Theater, video, July 2016. [Online] Available: https://www.youtube.com/watch?v=43XnJbaWXmk&feature=youtu.be [Accessed: 13-June-2017]
- Adrienne Erin, “Surgical Simulation Training: Is Virtual Reality The Future Of Surgical Training?”, elearningindustry, October 2015. [Online] Available: https://elearningindustry.com/surgical-simulation-training-virtual-reality-future-surgical-training [Accessed: 13-June-2017]
- CAE LapVR, “Immersive, Risk-Free Laparoscopic Training Environment”, CAE LapVR. [Online] Available: https://caehealthcare.com/surgical-simulation/lapvr [Accessed: 13-June-2017]
- 3D Systems, “Simulation-based training for da vinci® surgery”, 3D Systems, 2017. [Online] Available: http://simbionix.com/simulators/robotix-mentor/da-vinci-surgery/ [Accessed: 13-June-2017]
- 3D Systems, “Spine Mentor™”, 3D Systems, 2017. [Online] Available: http://simbionix.com/simulators/spine-mentor/ [Accessed: 13-June-2017]
- Spine Mentor - Image [Online] Available: http://simbionix.com/wp-content/uploads/2017/06/main-banner-1.jpg [Accessed: 13-June-2017]
- TouchSurgery, “Touch Surgery Simulations”, TouchSurgery, 2017. [Online] Available: https://www.touchsurgery.com/simulation/ [Accessed: 13-June-2017]
- Touch Surgery, "Touch Surgery - Intro", Touch Surgery, video, August 2016. [Online] Available: https://www.youtube.com/watch?v=iGm9qoJBcAM&feature=youtu.be [Accessed: 13-June-2017]
- CAE Healthcare, "LapVR Interventional Simulator by CAE Healthcare", CAE Healthcare, video, 2013. [Online] Available: https://vimeo.com/52960812 [Accessed: 13-June-2017]
- “What is Horizon 2020? - Horizon 2020 - European Commission,” Horizon 2020. [Online]. Available: /programmes/horizon2020/en/what-horizon-2020. [Accessed: 18-Jun-2017].
- “European Commission : CORDIS : Projects & Results Service : Enhanced Delivery Ecosystem for Neurosurgery in 2020.” [Online]. Available: http://cordis.europa.eu/project/rcn/200392_en.html. [Accessed: 18-Jun-2017].
- R. Göbl, S. Virga, J. Rackerseder, B. Frisch, N. Navab, and C. Hennersperger, “Acoustic window planning for ultrasound acquisition,” International Journal of Computer Assisted Radiology and Surgery, vol. 12, no. 6, pp. 993–1001, Jun. 2017.
- M. Riva et al., “3D intra-operative ultrasound and MR image guidance: pursuing an ultrasound-based management of brainshift to enhance neuronavigation,” Int J CARS, pp. 1–15, Apr. 2017.
- A. Leibinger, M. J. Oldfield, and F. R. y Baena, “Minimally disruptive needle insertion: a biologically inspired solution,” Interface Focus, vol. 6, no. 3, p. 20150107, Jun. 2016.
- “Wasp-inspired robotic needle moves closer to surgery.” [Online]. Available: http://www3.imperial.ac.uk/newsandeventspggrp/imperialcollege/newssummary/news_25-11-2015-10-26-12. [Accessed: 18-Jun-2017].
- [Online]. Available: http://www.renishaw.com/media/img/gen/76961bff8d94412884bbfc3156893991.jpg. [Accessed: 19-Jun-2017].
- “SystemOverview.jpg.” [Online]. Available: http://campar.in.tum.de/twiki/pub/Internal/ProjectEDEN2020/SystemOverview.jpg. [Accessed: 19-Jun-2017].
- “European Commission : CORDIS : Projects & Results Service : SMart weArable Robotic Teleoperated Surgery.” [Online]. Available: http://cordis.europa.eu/project/rcn/207027_en.html. [Accessed: 19-Jun-2017].
- “SMARTsurg - At a Glance.” [Online]. Available: http://www.smartsurg-project.eu/overview/at-a-glance. [Accessed: 19-Jun-2017].
- “European Commission : CORDIS : Projects & Results Service : Robot-Assisted Flexible Needle Steering for Targeted Delivery of Magnetic Agents.” [Online]. Available: http://cordis.europa.eu/project/rcn/193575_en.html. [Accessed: 19-Jun-2017].
- “Ultrasound-guided control of micro-sized agents - Surgical Robotics Lab.” [Online]. Available: http://www.surgicalroboticslab.nl/projects/ultrasound-guided-control-micro-sized-agents/. [Accessed: 19-Jun-2017].