An improved computational-simulation system for interactive medical imaging has been invented. The system displays high-resolution, three-dimensional-appearing images of anatomical objects based on data acquired by such techniques as computed tomography (CT) and magnetic-resonance imaging (MRI). The system enables users to manipulate the data to obtain a variety of views — for example, to display cross sections in specified planes or to rotate images about specified axes. Relative to prior such systems, this system offers enhanced capabilities for synthesizing images of surgical cuts and for collaboration by users at multiple, remote computing sites.
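The view manipulations described above, such as rotating an image about a specified axis, reduce to standard 3-D transforms. As an illustrative sketch only (not the system's own code), Rodrigues' rotation formula rotates points about an arbitrary axis:

```python
import numpy as np

def rotate_about_axis(points, axis, angle):
    """Rotate an (N, 3) array of points about a specified axis (Rodrigues'
    rotation formula). Illustrative of 'rotate images about specified axes';
    the actual system's transform code is not published."""
    k = np.asarray(axis, float)
    k /= np.linalg.norm(k)                      # unit rotation axis
    p = np.atleast_2d(np.asarray(points, float))  # shape (N, 3)
    c, s = np.cos(angle), np.sin(angle)
    # p*cos(a) + (k x p)*sin(a) + k*(k . p)*(1 - cos(a))
    return p * c + np.cross(k, p) * s + np.outer(p @ k, k) * (1 - c)
```

For example, rotating the point (1, 0, 0) by 90 degrees about the z-axis yields (0, 1, 0).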

This Interactive Medical-Imaging System can be implemented in hardware and/or software.
The system (see figure) includes a database, a reconstruction unit, and a virtual collaborative clinic (VCC). The database contains data from an MRI, CT, or other scan. Within minutes of the scan, the reconstruction unit can process the data into high-resolution, stereoscopic images. The reconstruction unit includes a subunit that generates polygonal meshes to represent surfaces of anatomical objects. Whereas prior medical-imaging systems generated such meshes by means of an algorithm known in the art as the marching-cubes algorithm, this system utilizes an improved algorithm that reduces the computational burden of rendering surfaces at high resolution. In particular, the improved algorithm can reduce the number of polygons drastically (by as much as 98 percent in some cases) without loss of topographical features and without introducing the spurious tears and holes that can occur in marching-cubes applications.
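The article does not disclose the improved algorithm itself, but the general idea of polygon reduction can be sketched with a generic technique such as vertex clustering: vertices falling in the same grid cell are merged, and triangles that degenerate are discarded. This is a minimal illustration of mesh decimation, not the invention's method:

```python
import numpy as np

def simplify_by_vertex_clustering(vertices, triangles, cell_size):
    """Generic mesh-decimation sketch: merge vertices that share a grid
    cell, then drop triangles whose corners collapsed together. The
    system's actual (superior) algorithm is not published."""
    vertices = np.asarray(vertices, float)
    # Quantize each vertex to an integer grid cell.
    cells = np.floor(vertices / cell_size).astype(int)
    # One representative vertex per occupied cell (the cell's mean position).
    keys, inverse = np.unique(cells, axis=0, return_inverse=True)
    inverse = inverse.ravel()
    new_vertices = np.zeros((len(keys), 3))
    counts = np.zeros(len(keys))
    for i, k in enumerate(inverse):
        new_vertices[k] += vertices[i]
        counts[k] += 1
    new_vertices /= counts[:, None]
    # Remap triangle indices; discard degenerate triangles.
    remapped = inverse[np.asarray(triangles)]
    keep = ((remapped[:, 0] != remapped[:, 1]) &
            (remapped[:, 1] != remapped[:, 2]) &
            (remapped[:, 0] != remapped[:, 2]))
    return new_vertices, remapped[keep]
```

Coarser cells yield fewer polygons; unlike the invention's algorithm, this naive approach does not guarantee preservation of fine surface features.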

The reconstruction unit includes a visualization module, which processes data from the mesh generator and the database into images for display. An animation module makes it possible to generate sequences of images that show how anatomical objects change over time. A user-input module enables a user to manipulate images and control other functions by means of a mouse, trackball, touchpad, or other standard input device. Yet another module is the cyberscalpel, which, as its name suggests, enables the system to simulate cutting of an anatomical object displayed via the visualization module.
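A simulated cut like the cyberscalpel's can be reduced, in its simplest form, to classifying mesh geometry against a cutting plane. The sketch below (a hypothetical simplification, not the module's actual behavior) keeps only triangles lying entirely on one side of the plane; a real implementation would also re-triangulate the cut boundary and cap the exposed cross section:

```python
import numpy as np

def cut_with_plane(vertices, triangles, plane_point, plane_normal):
    """Crude 'scalpel' sketch: discard every triangle with any vertex on
    the negative side of the cut plane. Hypothetical illustration only."""
    n = np.asarray(plane_normal, float)
    n /= np.linalg.norm(n)
    # Signed distance of each vertex from the plane.
    side = (np.asarray(vertices, float) - np.asarray(plane_point, float)) @ n
    tri = np.asarray(triangles)
    keep = (side[tri] >= 0).all(axis=1)
    return tri[keep]
```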

The VCC is an extension of the reconstruction unit. The VCC can include counterpart components that reside on separate computers at multiple remote locations, enabling users at those locations to interact with the same simulated anatomical objects in real time. The data structure of the system provides for sharing, among the geographically dispersed computers, of a number of variables and of computational models that represent anatomical objects at various levels of image resolution. Data are multicast among the computers on the basis of the data structure, such that by means of the multicast data and a provision for dynamic selection among the computational models, the same images can be displayed on all the computers. For example, all users can observe a simulated surgical cut performed by one of the users. Thus, surgeons at multiple locations can collaborate in planning complicated surgery in advance, using realistic displays.
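The article specifies multicast of shared variables and model-selection state but not a wire format. As a hedged sketch, one could serialize each state change (object identifier, selected level of detail, current transform) and send it over a UDP multicast group; the group address, port, and field names below are assumptions for illustration:

```python
import json
import socket
import struct

# Hypothetical multicast group and port; not specified in the article.
MCAST_GRP, MCAST_PORT = "239.0.0.1", 5004

def encode_update(object_id, lod, transform):
    """Serialize one shared-state change: which anatomical model changed,
    which level-of-detail model is selected, and its transform."""
    msg = {"id": object_id, "lod": lod, "transform": transform}
    return json.dumps(msg).encode("utf-8")

def decode_update(payload):
    """Inverse of encode_update, run at every receiving site."""
    return json.loads(payload.decode("utf-8"))

def make_sender():
    """UDP socket configured for multicast with a small TTL."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL,
                 struct.pack("b", 1))
    return s

# A site would broadcast a cut or rotation to all collaborators with:
#   make_sender().sendto(encode_update("femur", 2, [...]),
#                        (MCAST_GRP, MCAST_PORT))
```

Each receiving site applies the decoded update to its local copy of the model, so every display stays synchronized with the site where the action was performed.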

This work was done by Muriel D. Ross, Ian A. Twombly, and Steven Senger of Ames Research Center.

Inquiries concerning rights for the commercial use of this invention should be addressed to the Patent Counsel, Ames Research Center, (650) 604-5104. Refer to ARC-14441.


This article first appeared in the December 2003 issue of NASA Tech Briefs Magazine.
