Implementation of the PLUS open-source toolkit for translational research of ultrasound-guided intervention systems
Release 0.2

Andras Lasso, Tamas Heffter, Csaba Pinter, Tamas Ungi, and Gabor Fichtinger
August 15, 2012
Laboratory for Percutaneous Surgery, School of Computing, Queen’s University, Kingston, ON, Canada

Abstract

This document describes the design of the PLUS (Public software Library for Ultrasound) open-source toolkit. The toolkit provides a basic infrastructure for implementing ultrasound-guided intervention systems. Functionalities include collection of synchronized ultrasound and position data from a wide variety of hardware devices, spatial and temporal calibration, volume reconstruction, live streaming to end-user applications, and recording to and replay from file. Source code, documentation, tutorials, and application examples are available with a BSD-type license at the project website: www.assembla.com/spaces/plus.

Contents

1. Introduction
2. Data representation
3. Data acquisition
4. Volume reconstruction
5. Live data streaming
6. Implementation
7. Results
8. Conclusion
9. Acknowledgments
10. References

1. Introduction

Ultrasound (US) has great potential in guiding medical interventions, as it is capable of acquiring real-time images, has low cost and small size, and does not use ionizing radiation. Positioning of interventional tools can often be successfully completed by simple free-hand manipulation of the ultrasound transducer and surgical tools. However, there is a wide range of challenging procedures that require continuous tracking of the pose (position and orientation) of the images and surgical tools. Research and development of position-tracked (also known as navigated) ultrasound-guided intervention systems requires advanced engineering infrastructure, which has not been available in the public domain, and single project-based solutions have not proven to be suitable as reusable platforms.

Gobbi, Boisvert, et al. developed the open-source SynchroGrab software library [1] for the acquisition, synchronization, and volume reconstruction of tracked ultrasound data. SynchroGrab was based on the Visualization Toolkit (VTK, http://www.vtk.org) open-source library and worked with a couple of ultrasound and tracking devices. Later, Pace et al. [2] extended the library with electrocardiogram gating to allow reconstruction of time sequences of volumetric images. SynchroGrab was a very valuable contribution to this field and served as a good example with its architecture and its use of open communication protocols and toolkits. However, the library lacked some essential tools, such as spatial calibration, easy-to-use and documented diagnostic and configuration tools, samples, and tests. The library is not being developed anymore, but several classes are still maintained as part of the Atamai Image Guided Surgery toolkit (https://github.com/dgobbi/AIGS).

The Image-Guided Surgery Toolkit (IGSTK, http://www.igstk.org/) is a generic, open-source, component-based framework for the implementation of image-guided surgery applications. It supports pose tracking using a number of hardware devices and contains some registration functions and a very valuable application infrastructure. However, it currently does not seem to offer essential US guidance features such as tracked image acquisition, temporal and spatial calibration, and volume reconstruction.

Stradwin is a software application developed by Treece et al. [3] at the Medical Imaging group at the Department of Engineering, University of Cambridge, UK, for freehand 3D ultrasound calibration, acquisition, measurement, and visualization. Stradwin has many useful US guidance related features, but the software is designed for performing only a few specific functions, not for generic tool guidance. Another limitation is that its source code is not publicly available. Therefore the software is not usable as a research platform.

The Medical UltraSound Imaging and Intervention Collaboration (MUSiiC, https://musiic.lcsr.jhu.edu/Research/MUSiiC_Toolkit, [4]) research lab at Johns Hopkins University developed a toolkit for acquiring and processing tracked ultrasound data. MUSiiC addresses many of the common and more advanced problems of US-guided interventions, such as calibration, volume reconstruction, and elastography. The reported characteristics of the toolkit – modular, networked, open interfaces, open-source, real-time, parallel, hardware-agnostic – are very promising. However, the toolkit has yet to be released to the research community as an open-source package.
There are a few other frameworks for prototyping research systems, which support acquisition of real-time tracking and image data. Examples include the Computer-Integrated Surgical Systems and Technology (CISST, https://trac.lcsr.jhu.edu/cisst, [5]) libraries developed at Johns Hopkins University.

The library provides a nice generic framework, many computational algorithms, and interfaces for several hardware components. However, its component-based framework, which is essential for setting up a complete system, relies on a library (ZeroC ICE) that has a restrictive, non-BSD license. Also, the focus of the CISST libraries is not specific to US imaging, and therefore their setup and calibration for US guidance applications does not seem to be straightforward.

The goal of our work is to provide a simple-to-use, freely available open-source toolkit to facilitate rapid prototyping of US-guided intervention systems for translational clinical research. The toolkit shall offer access to various US imaging and pose tracking tools using a single, hardware-independent interface. The package shall also include all the software, hardware, and documentation for commonly needed calibration, data processing, visualization, and transfer operations. For the implementation of the toolkit we reuse several concepts and algorithms from the SynchroGrab library, but substantially re-engineer and extend it to fulfill all the listed requirements.

2. Data representation

Defining common notations and formats for basic data structures is required for seamless interoperability between different groups that use, maintain, and extend the tracked ultrasound system. The basic information types that have to be represented are image, pose, and time.

2.1. Image

The PLUS toolkit uses the image representation defined in the generic image processing libraries that the toolkit is based on. However, there is an additional property – US image orientation – that has to be specified for each US image slice. In PLUS the US image orientation refers to the spatial relationship between the image slice axes and the transducer’s principal axes. The image orientation is defined by a two-letter acronym; the first and second letters specify the directions corresponding to the +x and +y directions in the image coordinate system, respectively. The following directions are defined:

• Marked (M): pointing towards the marked side of the transducer.
• Unmarked (U): pointing towards the unmarked side of the transducer.
• Far (F): normal to the transducer surface, pointing away from the transducer (towards the increasing depth direction).
• Near (N): normal to the transducer surface, pointing towards the transducer.

Coordinate system axis definitions for commonly used probes are shown in Figure 1. If the direction of the image +x axis is the same as the marked direction on the transducer and the image +y axis direction is the same as the transducer far direction, then the image orientation is MF. If the image is flipped horizontally the orientation becomes UF, and if it is flipped vertically the orientation becomes MN.
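To make the orientation convention concrete, the following is a minimal C++ sketch showing how a two-letter orientation code changes under horizontal and vertical image flips. The type and function names here are illustrative only and are not taken from the PLUS API.

// Illustrative sketch only: the names below are hypothetical and not part of PLUS.
// First letter: direction of the image +x axis (M = marked, U = unmarked).
// Second letter: direction of the image +y axis (F = far, N = near).
#include <iostream>
#include <string>

std::string FlipOrientation(const std::string& orientation,
                            bool flipHorizontal, bool flipVertical)
{
  if (orientation.size() != 2)
  {
    return "??"; // not a valid two-letter orientation code
  }
  char x = orientation[0];
  char y = orientation[1];
  if (flipHorizontal)
  {
    x = (x == 'M') ? 'U' : 'M'; // marked <-> unmarked
  }
  if (flipVertical)
  {
    y = (y == 'F') ? 'N' : 'F'; // far <-> near
  }
  return std::string() + x + y;
}

int main()
{
  // Starting from an MF image: a horizontal flip yields UF, a vertical flip yields MN,
  // matching the description above.
  std::cout << FlipOrientation("MF", true, false) << std::endl;  // prints UF
  std::cout << FlipOrientation("MF", false, true) << std::endl;  // prints MN
  return 0;
}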
