SwePub

Visualization of convolutional neural network class activations in automated oral cancer detection for interpretation of malignancy associated changes

Koriakina, Nadezhda, 1991- (author)
Uppsala University, Division of Visual Information and Interaction, Image Analysis and Human-Computer Interaction, MIDA
Sladoje, Natasa (author)
Uppsala University, Division of Visual Information and Interaction, Image Analysis and Human-Computer Interaction
Bengtsson, Ewert, 1948- (author)
Uppsala University, Division of Visual Information and Interaction, Image Analysis and Human-Computer Interaction, Automatic Control
Ramqvist, Eva Darai (author)
Pathology and Cytology, Karolinska Institute, Stockholm, Sweden
Hirsch, Jan M. (author)
Uppsala University, Oral and Maxillofacial Surgery
Runow Stark, Christina (author)
Public Dental Service, Södersjukhuset, Stockholm, Sweden
Lindblad, Joakim (author)
Uppsala University, Image Analysis and Human-Computer Interaction, Division of Visual Information and Interaction
2019
English.
In: 3rd NEUBIAS Conference, Luxembourg, 2-8 February 2019.
  • Conference paper (other academic/artistic)
Abstract
Introduction: Cancer of the oral cavity is one of the most common malignancies in the world, and the incidence of oral cavity and oropharyngeal cancer is increasing among young people. Notably, the oral cavity is relatively easy to access for routine screening tests that could decrease the incidence of oral cancer. Automated, deep-learning-based computer-aided methods show promise for detecting subtle precancerous changes at a very early stage, even when visual examination is less effective. Although the biological nature of these malignancy-associated changes is not fully understood, consistent morphological and textural changes within a cell dataset could shed light on the premalignant state. In this study, we aim to increase understanding of this phenomenon by exploring and visualizing which parts of cell images are considered most important when trained deep convolutional neural networks (DCNNs) are used to differentiate cytological images into normal and abnormal classes.

Materials and methods: Cell samples are collected with a brush at areas of interest in the oral cavity and stained according to standard PAP procedures. Digital images of the slides are acquired with a 0.32 micron pixel size in greyscale format (570 nm bandpass filter). Cell nuclei are manually selected in the images and a small region is cropped around each nucleus, resulting in images of 80x80 pixels. No medical knowledge is used when choosing the cells; they are randomly selected from the slide, and ground truth for the learning process is provided only at the patient level, not at the cell level. Overall, 10,274 images of cell nuclei and their surrounding regions are used to train state-of-the-art DCNNs to distinguish between cells from healthy persons and cells from persons with precancerous lesions. Data augmentation through 90-degree rotations and mirroring is applied to the datasets. Different approaches to class activation mapping and related methods are used to determine which image regions and feature maps are responsible for the relevant class differentiation.

Results and Discussion: The best-performing of the evaluated deep learning architectures reaches a per-cell classification accuracy above 80% on the studied material. Visualizing the class activation maps confirms our expectation that the network learns to focus on specific, relevant parts of the sample regions. We compare and evaluate our findings on the detected discriminative regions against the subjective judgements of a trained cytotechnologist. We believe that this effort to better understand the decision criteria used by machine and human increases understanding of malignancy-associated changes and improves the robustness and reliability of the automated malignancy detection procedure.
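The abstract describes the pipeline at a high level (80x80 greyscale nucleus crops, 90-degree rotation and mirroring augmentation, class activation mapping) without naming an architecture or a specific CAM variant. The sketch below is a minimal illustration under those assumptions, not the authors' implementation: it uses PyTorch, a small placeholder CNN, and a Grad-CAM-style weighting of the last convolutional feature maps; the layer sizes and the class index chosen for "abnormal" are invented for the example.

# Minimal sketch (assumed PyTorch, toy architecture): a two-class CNN for
# 80x80 greyscale nucleus crops and a Grad-CAM style class activation map.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SmallNucleusCNN(nn.Module):
    """Toy normal/abnormal classifier for 80x80 single-channel crops."""

    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 40x40
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 20x20
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),                   # 20x20
        )
        self.pool = nn.AdaptiveAvgPool2d(1)
        self.classifier = nn.Linear(64, 2)  # two classes: normal / abnormal

    def forward(self, x):
        fmap = self.features(x)                               # B x 64 x 20 x 20
        logits = self.classifier(self.pool(fmap).flatten(1))  # B x 2
        return logits, fmap


def grad_cam(model, image, target_class):
    """Grad-CAM heat map for one 1x80x80 crop; returns a 20x20 map in [0, 1]."""
    model.eval()
    image = image.unsqueeze(0)
    logits, fmap = model(image)
    fmap.retain_grad()                                   # keep gradients of the feature maps
    logits[0, target_class].backward()
    weights = fmap.grad.mean(dim=(2, 3), keepdim=True)   # per-channel importance
    cam = F.relu((weights * fmap).sum(dim=1)).squeeze(0) # weighted sum over channels
    return cam / (cam.max() + 1e-8)


if __name__ == "__main__":
    # Dihedral augmentation as in the abstract: 90-degree rotations and mirroring.
    crop = torch.rand(1, 80, 80)  # placeholder for one nucleus crop
    augmented = [torch.rot90(m, k, dims=(1, 2))
                 for m in (crop, torch.flip(crop, dims=(2,)))
                 for k in range(4)]

    model = SmallNucleusCNN()                             # untrained; illustration only
    heat_map = grad_cam(model, crop, target_class=1)      # 1 = "abnormal" (assumed index)
    print(len(augmented), heat_map.shape)

Upsampling the 20x20 map to 80x80 and overlaying it on the crop highlights the regions the network relied on, which is the kind of visualization the abstract compares against the judgement of a trained cytotechnologist.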

Subject headings

NATURAL SCIENCES  -- Computer and Information Sciences -- Other Computer and Information Science (hsv//eng)
ENGINEERING AND TECHNOLOGY  -- Medical Engineering -- Medical Image Processing (hsv//eng)

Keyword

Oral cancer
saliency methods
deep convolutional neural networks
Computerized Image Processing

Publication and Content Type

vet (content type: other academic/artistic)
kon (publication type: conference paper)
