Abstract
Segmentation of moving and deformable structures in medical image processing has become an increasingly important research field in recent years. Fast, high-resolution image acquisition methods such as Magnetic Resonance (MR) imaging produce very detailed cross-sectional images of the human body. Segmentation of anatomically relevant objects is a subsequent operation, performed in order to visualise and/or measure shapes and motions of interest. This task is usually carried out by clinicians and other experts; the high demand on expert time and the inter- and intra-observer variability create a clinical and scientific need to automate the process.

This thesis presents a novel approach for segmenting and tracking anatomical objects in 2D medical image sequences, which enables quantitative studies of relevant structures even in the presence of distortions introduced by the image formation process.
The underlying premise of the work presented is that robust and precise image segmentation requires a priori knowledge of both the image formation process and the objects to be detected. Such knowledge is often vague or uncertain and may only be acquired from experts in natural-language terms.
By combining active contours with fuzzy logic, a novel contour segmentation method is developed that is capable of exploiting uncertain knowledge in both syntactical and linguistic terms. Unlike other approaches, the contour description is fully integrated into the segmentation process, with the additional advantage that many existing image processing operators can also be integrated smoothly into the fuzzy framework.
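As a rough illustration of such a combination (not taken from the thesis itself, and using standard snake notation rather than the author's), one may picture a classical active contour energy whose external term is replaced by a potential derived from fuzzy membership values:

$$
E[\mathbf{C}] \;=\; \int_0^1 \frac{1}{2}\Bigl(\alpha\,\lvert\mathbf{C}'(s)\rvert^2 + \beta\,\lvert\mathbf{C}''(s)\rvert^2\Bigr)\,ds \;+\; \lambda \int_0^1 \bigl(1 - \mu_{\mathrm{boundary}}(\mathbf{C}(s))\bigr)\,ds
$$

where $\mu_{\mathrm{boundary}}(\mathbf{x}) \in [0,1]$ would be a membership value produced by fuzzy rules over local image features (e.g. edge strength, intensity), so that the contour is attracted to points the rule base judges likely to lie on an object boundary. The symbols $\alpha$, $\beta$, $\lambda$ and $\mu_{\mathrm{boundary}}$ are illustrative only and not notation from the thesis.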
Specific applications addressed are the motion analysis of carpal bones in MR image sequences and the analysis of the vocal tract in X-ray image sequences. Traditional solutions for both applications have been developed, and the new framework is fully validated against these solutions as well as on synthetic image material.
| Date of Award | Oct 2001 |
|---|---|
| Original language | English |