SwePub

Results list for the search "WFRF:(Bozaba Engin)"

Search: WFRF:(Bozaba Engin)

  • Results 1-4 of 4
1.
  • Cayir, Sercan, et al. (author)
  • MITNET : a novel dataset and a two-stage deep learning approach for mitosis recognition in whole slide images of breast cancer tissue
  • 2022
  • In: Neural Computing & Applications. - Springer London Ltd. - ISSN 0941-0643, E-ISSN 1433-3058; 34:20, pp. 17837-17851
  • Journal article (peer-reviewed), abstract:
    • Mitosis assessment of breast cancer has strong prognostic importance and is visually evaluated by pathologists. The inter- and intra-observer variability of this assessment is high. In this paper, a two-stage deep learning approach, named MITNET, is applied to automatically detect nuclei and classify mitoses in whole slide images (WSI) of breast cancer. Moreover, this paper introduces two new datasets. The first dataset is used to detect nuclei in the WSIs and contains 139,124 annotated nuclei in 1749 patches extracted from 115 WSIs of breast cancer tissue; the second dataset consists of 4908 mitotic and 4908 non-mitotic cell image samples extracted from 214 WSIs and is used for mitosis classification. The created datasets are used to train the MITNET network, which consists of two deep learning architectures, called MITNET-det and MITNET-rec, to isolate nuclei and identify mitoses in WSIs, respectively. In the MITNET-det architecture, CSPDarknet and a Path Aggregation Network (PANet) are used to extract features from nucleus images and to fuse them, respectively, and then a You Only Look Once detection strategy (Scaled-YOLOv4) is employed to detect nuclei at three different scales. In the classification part, the detected isolated nucleus images are passed through the proposed MITNET-rec deep learning architecture to identify mitoses in the WSIs. Various deep learning classifiers and the proposed classifier are trained on publicly available mitosis datasets (MIDOG and ATYPIA) and then validated on our created dataset. The results verify that deep learning-based classifiers trained on MIDOG and ATYPIA have difficulty recognizing mitoses on our dataset, which shows that the created mitosis dataset has unique features and characteristics.
Besides this, the proposed classifier significantly outperforms the state-of-the-art classifiers, achieving a 68.7% F1-score and a 49.0% F1-score on the MIDOG and the created mitosis datasets, respectively. Moreover, the experimental results reveal that the overall proposed MITNET framework detects nuclei in WSIs with high detection rates and recognizes mitotic cells with a high F1-score, which improves the accuracy of pathologists' decisions.
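The two-stage detect-then-classify flow described in the abstract can be sketched as follows. This is a minimal control-flow illustration only: the detector and classifier here are hypothetical stand-ins for the MITNET-det and MITNET-rec roles, not the actual models.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Detection:
    x: int       # top-left corner of the detected nucleus crop
    y: int
    size: int    # square crop side length

def two_stage_mitosis_count(
    wsi_patches: List[object],
    detect_nuclei: Callable[[object], List[Detection]],  # stage 1: nucleus detector (MITNET-det role)
    is_mitotic: Callable[[object, Detection], bool],     # stage 2: mitosis classifier (MITNET-rec role)
) -> int:
    """Count mitotic figures: detect candidate nuclei in each patch,
    then classify every detection as mitotic or not."""
    count = 0
    for patch in wsi_patches:
        for det in detect_nuclei(patch):
            if is_mitotic(patch, det):
                count += 1
    return count

# Toy stand-ins that only demonstrate the control flow
fake_detect = lambda patch: [Detection(0, 0, 64), Detection(64, 64, 64)]
fake_classify = lambda patch, det: det.x == 0  # pretend top-left detections are mitotic
print(two_stage_mitosis_count(["patch0", "patch1"], fake_detect, fake_classify))  # 2
```

Separating detection from recognition this way lets each stage be trained on its own dataset, mirroring the paper's use of a nucleus-detection dataset for stage one and a mitosis/non-mitosis dataset for stage two.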
2.
  • Çayır, Sercan, et al. (author)
  • Patch-based approaches to whole slide histologic grading of breast cancer using convolutional neural networks
  • 2023
  • In: Diagnostic Biomedical Signal and Image Processing Applications with Deep Learning Methods. - Elsevier. - ISBN 9780323961295, ISBN 9780323996815; pp. 103-118
  • Book chapter (peer-reviewed), abstract:
    • In early-stage breast cancer, the Nottingham Histologic Grading (NHG) is a strong prognostic factor. It is made up of nuclear pleomorphism, tubular formation, and mitotic count evaluation. Major grade disagreement is low (1.5%), but inter-observer agreement in grading among pathologists is moderate. Grading errors or inconsistencies caused by a variety of factors may jeopardize patient care and overall survival. It has been demonstrated that assessment of the NHG on Whole Slide Images (WSI), which are digitized images of histopathologic slides, is comparable to light microscopy. Because AI-based breast cancer grading is a new area of pathology, there are inherent difficulties in training AI models. We mitigate the high computational cost associated with the dimensions of WSIs by using a patch-based approach, and we mitigate the problems associated with the availability of training data by carefully annotating and labeling these patches. This chapter describes a fully automated computer-aided patch-based system that employs deep learning (DL) methods. Nuclear pleomorphism, tubular formation, and mitotic count are all graded using the proposed method. In addition, to train and test the DL methods in the proposed approach, we created in-house datasets for pleomorphism, tubule detection, nuclei detection, and mitosis detection, which consist of 23,283, 10,117, 2,993, and 9,816 annotated patches, respectively, extracted from WSIs of breast tissue with varying hematoxylin and eosin stains. These WSIs were obtained from a variety of patients who had been diagnosed with invasive ductal carcinoma. Four different difficult tasks are solved using the proposed computer-aided DL patch-based system: semantic segmentation is used for tubular formation, object detection is used for nuclei detection, and image classification is used for mitotic count and nuclear pleomorphism.
To obtain the results, we fine-tuned DL architectures pre-trained on ImageNet, such as a U-Net with an EfficientNet backbone, Scaled-YOLOv4, DenseNet-161, and VGG-11, on our dataset for the tubule segmentation, nuclei detection, and mitosis and nuclear pleomorphism classification tasks. We demonstrate that data augmentation is critical for improving the accuracy of patch-based DL models, which serve as the foundation of our WSI grading system. The proposed method resulted in reproducible histologic scores with F1-scores of 94%, 94.1%, and 50.7% for nuclear pleomorphism classification, tubule formation segmentation, and mitosis classification, respectively. The experimental results presented in this chapter show promise for clinical translation of the DL algorithms described. Using the proposed approach to perform histological grading of WSIs will reduce the subjectivity associated with pathologist-assigned grades. © 2023 Elsevier Inc. All rights reserved.
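A patch-based system like the one above must first tile the gigapixel WSI into patches. A minimal sketch of computing non-overlapping patch coordinates, with border patches shifted back inside the slide; the patch size and stride here are illustrative, not taken from the chapter:

```python
def patch_grid(width, height, patch=512, stride=512):
    """Top-left coordinates of patches tiling a WSI of the given size.
    Patches that would run past the right/bottom border are shifted
    back so every patch lies fully inside the slide."""
    xs = list(range(0, max(width - patch, 0) + 1, stride))
    ys = list(range(0, max(height - patch, 0) + 1, stride))
    if xs[-1] + patch < width:
        xs.append(width - patch)   # extra column flush with the right edge
    if ys[-1] + patch < height:
        ys.append(height - patch)  # extra row flush with the bottom edge
    return [(x, y) for y in ys for x in xs]

coords = patch_grid(1200, 1000, patch=512, stride=512)
# 1200 wide -> x in {0, 512, 688}; 1000 tall -> y in {0, 488}
print(len(coords))  # 6
```

Each coordinate pair can then be passed to a WSI reader to crop the patch for annotation, training, or inference; a smaller stride than the patch size would yield the overlapping patches often used at test time.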
3.
4.
  • Tekin, Eren, et al. (author)
  • Tubule-U-Net : a novel dataset and deep learning-based tubule segmentation framework in whole slide images of breast cancer
  • 2023
  • In: Scientific Reports. - Nature Portfolio. - ISSN 2045-2322; 13:1
  • Journal article (peer-reviewed), abstract:
    • The tubule index is a vital prognostic measure in breast cancer tumor grading and is visually evaluated by pathologists. In this paper, a computer-aided patch-based deep learning tubule segmentation framework, named Tubule-U-Net, is developed and proposed to segment tubules in Whole Slide Images (WSI) of breast cancer. Moreover, this paper presents a new tubule segmentation dataset consisting of 30,820 polygonally annotated tubules in 8225 patches. The Tubule-U-Net framework first uses a patch enhancement technique such as reflection or mirror padding and then employs an asymmetric encoder-decoder semantic segmentation model. The encoder is built from various deep learning architectures such as EfficientNetB3, ResNet34, and DenseNet161, whereas the decoder is similar to U-Net. Thus, three different models are obtained: EfficientNetB3-U-Net, ResNet34-U-Net, and DenseNet161-U-Net. The proposed framework with these three models, as well as the U-Net, U-Net++, and Trans-U-Net segmentation methods, is trained on the created dataset and tested on five different WSIs. The experimental results demonstrate that the proposed framework with the EfficientNetB3 model, trained on patches obtained using reflection padding and tested on overlapping patches, provides the best segmentation results on the test data and achieves Dice, recall, and specificity scores of 95.33%, 93.74%, and 90.02%, respectively. © 2023, The Author(s).
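The Dice score reported above is the standard overlap measure for binary segmentation masks. A minimal sketch of its computation, assuming masks are given as flat 0/1 lists rather than the image tensors a real pipeline would use:

```python
def dice_score(pred, truth):
    """Dice coefficient for two binary masks: 2*|A ∩ B| / (|A| + |B|)."""
    assert len(pred) == len(truth), "masks must have the same number of pixels"
    intersection = sum(p * t for p, t in zip(pred, truth))  # pixels positive in both
    total = sum(pred) + sum(truth)
    return 1.0 if total == 0 else 2.0 * intersection / total  # both empty -> perfect match

pred  = [1, 1, 0, 1, 0, 0]   # predicted tubule mask (flattened)
truth = [1, 0, 0, 1, 1, 0]   # ground-truth annotation (flattened)
print(round(dice_score(pred, truth), 3))  # 0.667
```

Dice equals the F1-score of the positive pixel class, which is why segmentation papers report it alongside recall and specificity.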