Nalan Karunanayake. Artificial life for segmentation of breast ultrasound deformities. Doctoral degree (Engineering and Technology), Thammasat University, 2022.
Artificial life for segmentation of breast ultrasound deformities
Abstract:
Segmenting and delineating tumor boundaries in breast ultrasound (US) images is a crucial challenge in computer-assisted medical imaging, particularly for breast cancer screening and diagnostic programs. Imaging artifacts such as speckle noise, low contrast, and low signal-to-noise ratio can blur the tissue boundary between a lesion and the surrounding background tissue. In addition, variability in the output characteristics of US devices and potential operator error further complicate precise tumor delineation, even for highly qualified radiologists. Despite these challenges, ultrasound screening remains the most reliable, effective, and cost-efficient method for early breast cancer detection worldwide. As the number of cases continues to grow, computerized automated systems are proving to be valuable tools for helping radiologists improve screening accuracy. As a solution, in this work we propose a novel segmentation algorithm that combines artificial intelligence (AI), image fusion, and artificial life (AL). The proposed tracing agents (TA) are synthetic organisms with a short memory and the ability to communicate with their peers. The agents live in a fused image produced from a conventional ultrasound image and an elastography image. The agents find strong US feature edges and close the gaps between broken edges to create a closed US boundary.
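The fusion-and-tracing idea described above can be sketched roughly as follows. This is a minimal illustration, not the thesis implementation: the alpha-blend fusion, the `TracingAgent` class, its 8-neighborhood step rule, and the `memory_len` parameter are all assumptions introduced here for clarity (the thesis builds a fusion mask and offset trajectories rather than a plain pixel blend).

```python
import numpy as np

def fuse_images(us_img, elasto_img, alpha=0.5):
    """Weighted blend of a B-mode US image and a strain elastography
    image (illustrative assumption; the thesis derives a fusion mask)."""
    return alpha * us_img + (1.0 - alpha) * elasto_img

class TracingAgent:
    """Toy agent with a short memory of visited pixels: at each step it
    moves to the 8-neighbor with the strongest edge response that it has
    not recently visited, so it follows a ridge without backtracking."""

    def __init__(self, edge_map, start, memory_len=8):
        self.edge_map = edge_map
        self.pos = start
        self.memory = [start]          # short memory of recent positions
        self.memory_len = memory_len

    def step(self):
        r, c = self.pos
        h, w = self.edge_map.shape
        best, best_pos = -1.0, None
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                if dr == 0 and dc == 0:
                    continue
                nr, nc = r + dr, c + dc
                if 0 <= nr < h and 0 <= nc < w and (nr, nc) not in self.memory:
                    if self.edge_map[nr, nc] > best:
                        best, best_pos = self.edge_map[nr, nc], (nr, nc)
        if best_pos is not None:       # stay put if every neighbor is remembered
            self.pos = best_pos
            self.memory.append(best_pos)
            self.memory = self.memory[-self.memory_len:]
        return self.pos
```

On a synthetic edge map with a single horizontal ridge, an agent started at one end walks along the ridge pixel by pixel; the short memory is what prevents it from oscillating back over edges it has already traced.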
The novel contributions of the proposed algorithm are: 1) creating a fused image from US and strain elastography images; 2) generating the agent trajectories by offsetting the generated fusion mask; 3) a new artificial life model in which the TAs move along the trajectories according to a set of prescribed rules; 4) training the algorithm exclusively on synthetic data and evaluating its performance on real data; 5) verifying the segmentation ability of the proposed model on US images classified according to the complexity of the lesion shape and the edge-map contour features; and 6) producing multiple solutions that are not overly diverse, allowing majority-rule analysis to determine a single final contour, which yields a more robust and accurate result and improves the overall performance and reliability of the segmentation process. The proposed AL algorithm was rigorously evaluated against five cutting-edge families of segmentation models: Deep Learning, Active Contours, Level Sets, Superpixel, and Edge Linking algorithms. In these experiments, the AL algorithm was benchmarked against 16 state-of-the-art ultrasound segmentation techniques, demonstrating superior performance compared to 85 contemporary and traditional ultrasound segmentation methods. The experimental evaluation was conducted on a dataset of 395 breast ultrasound lesion images obtained from the online resources http://onlinemedicalimages.com and https://www.ultrasoundcases.info/. To train the agents effectively, Genetic Algorithms (GA) were employed as an optimization technique, fine-tuning the algorithm for improved segmentation performance. Our proposed algorithm was rigorously tested on challenging ultrasound cases involving complex shapes, boundary leakage, and significant edge noise.
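Contribution 6 above, combining several candidate contours by majority rule, can be illustrated with a pixelwise vote over binary segmentation masks. The function name `majority_contour` and the mask encoding are assumptions for illustration; the thesis may operate on contour curves rather than filled masks.

```python
import numpy as np

def majority_contour(masks):
    """Pixelwise majority vote over candidate segmentations.

    `masks` is a list of equally shaped binary arrays (one per agent
    solution); a pixel belongs to the final region when more than half
    of the candidates include it. Illustrative sketch, not thesis code."""
    stack = np.stack(masks).astype(int)
    votes = stack.sum(axis=0)
    return (votes * 2 > len(masks)).astype(int)
```

Because the candidate solutions are, by design, not overly diverse, outlier contours are suppressed by the vote while the consensus boundary survives.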
We intentionally excluded simple, well-defined lesions from the study in order to focus on the most difficult cases. The results demonstrate that our proposed artificial life (AL) algorithm outperforms the reference methods in accurately delineating lesion boundaries on highly complex ultrasound images. To showcase the effectiveness of our algorithm, we have provided a video demonstration accessible through the following link: http://shorturl.at/fiFW4
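The GA-based fine-tuning mentioned above could look like the following minimal sketch. Everything here is an assumption for illustration: the `genetic_tune` name, the encoding of agent parameters as real vectors in [0, 1], and the truncation-selection / one-point-crossover / uniform-mutation choices; the thesis's actual fitness function scores segmentation quality on synthetic data.

```python
import random

def genetic_tune(fitness, n_params, pop_size=20, generations=30,
                 mutation_rate=0.2, seed=0):
    """Minimal GA maximizing `fitness` over parameter vectors in [0, 1].

    Hypothetical stand-in for tuning agent rule parameters (e.g., memory
    length, step weights) on synthetic training images."""
    rng = random.Random(seed)
    pop = [[rng.random() for _ in range(n_params)] for _ in range(pop_size)]
    for _ in range(generations):
        scored = sorted(pop, key=fitness, reverse=True)
        parents = scored[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n_params) if n_params > 1 else 0
            child = a[:cut] + b[cut:]              # one-point crossover
            if rng.random() < mutation_rate:       # uniform mutation of one gene
                child[rng.randrange(n_params)] = rng.random()
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)
```

Keeping the top half of each generation (elitism) guarantees the best fitness never decreases across generations, which matters when each fitness evaluation is an expensive segmentation run.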