Authors: Babak Ehteshami Bejnordi, Guido Zuidhof, Maschenka Balkenhol, Meyke Hermsen, Peter Bult, Bram van Ginneken, Nico Karssemeijer, Geert Litjens, Jeroen van der Laak
ArXiv: 1705.03678
Abstract URL: http://arxiv.org/abs/1705.03678v1
Automated classification of histopathological whole-slide images (WSI) of
breast tissue requires analysis at very high resolutions with a large
contextual area. In this paper, we present context-aware stacked convolutional
neural networks (CNN) for classification of breast WSIs into normal/benign,
ductal carcinoma in situ (DCIS), and invasive ductal carcinoma (IDC). We first
train a CNN using high pixel resolution patches to capture cellular level
information. The feature responses generated by this model are then fed as
input to a second CNN, stacked on top of the first. Training of this stacked
architecture with large input patches enables learning of fine-grained
(cellular) details and global interdependence of tissue structures. Our system
is trained and evaluated on a dataset containing 221 WSIs of H&E stained breast
tissue specimens. The system achieves an AUC of 0.962 for the binary
classification of non-malignant and malignant slides and obtains a three class
accuracy of 81.3% for classification of WSIs into normal/benign, DCIS, and IDC,
demonstrating its potential for routine diagnostics.
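The stacked design described above — a first CNN extracting fine-grained (cellular) features from high-resolution patches, whose feature responses are then fed to a second CNN that aggregates wider spatial context — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the kernel sizes, channel counts, pooling stride, and random weights here are all illustrative assumptions.

```python
import numpy as np

def conv2d(x, k):
    # valid 2D cross-correlation of a single-channel image with kernel k
    kh, kw = k.shape
    H, W = x.shape
    out = np.zeros((H - kh + 1, W - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(x[i:i + kh, j:j + kw] * k)
    return out

def relu(x):
    return np.maximum(x, 0.0)

def stage1(patch, kernels):
    # fine-grained stage: convolve the high-resolution patch with each kernel
    return np.stack([relu(conv2d(patch, k)) for k in kernels])

def downsample(fmaps, s=2):
    # stride-s max pooling widens the effective receptive field of stage 2
    C, H, W = fmaps.shape
    H2, W2 = H // s, W // s
    return fmaps[:, :H2 * s, :W2 * s].reshape(C, H2, s, W2, s).max(axis=(2, 4))

def stage2(fmaps, kernels):
    # context stage: convolve pooled stage-1 responses, summed over channels
    pooled = downsample(fmaps)
    return np.stack([relu(sum(conv2d(c, k) for c in pooled)) for k in kernels])

rng = np.random.default_rng(0)
patch = rng.standard_normal((32, 32))          # toy "high-resolution" patch
k1 = rng.standard_normal((4, 3, 3)) * 0.1      # 4 fine-scale kernels
k2 = rng.standard_normal((8, 3, 3)) * 0.1      # 8 context-scale kernels

fine = stage1(patch, k1)                        # shape (4, 30, 30)
context = stage2(fine, k2)                      # shape (8, 13, 13)
```

Each output pixel of `stage2` depends on a much larger region of the input patch than a single `stage1` response, which is the mechanism by which the stacked architecture combines cellular detail with global tissue context.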