Carlos Beltran-Perez, Hua-Liang Wei and Adrian Rubio-Solis. Generalized Multiscale RBF Networks and the DCT for Breast Cancer Detection. International Journal of Automation and Computing, vol. 17, no. 1, pp. 55-70, 2020. DOI: 10.1007/s11633-019-1210-y

Generalized Multiscale RBF Networks and the DCT for Breast Cancer Detection

The use of multiscale generalized radial basis function (MSRBF) neural networks for image feature extraction and for medical image analysis and classification is proposed for the first time in this work. MSRBF networks have a simple and flexible architecture that has been used successfully in forecasting and in model structure detection of input-output nonlinear systems. Here, MSRBF networks are part of an integrated computer-aided diagnosis (CAD) framework for breast cancer detection that comprises three stages: an input-output model is obtained from the image, high-level image features are then extracted from the model, and a classification module predicts breast cancer. In the first stage, the image data are rendered as a multiple-input single-output (MISO) system. To improve the characterisation, the nonlinear autoregressive with exogenous inputs (NARX) model is introduced to rearrange the available input-output data in a nonlinear way. The forward regression orthogonal least squares (FROLS) algorithm then exploits this arrangement by treating the system as a model structure detection problem and finding the output-layer weights of the NARX-MSRBF network. In the second stage, once the network model is available, features are extracted by stimulating the input to produce output signals, which are compressed by the discrete cosine transform (DCT). In the third stage, the extracted features are passed to a clustering algorithm for classification, integrating a CAD system for breast cancer detection. To test the method's performance, three well-known public image repositories were used: the mini-MIAS and the DDSM for mammography, and BreaKHis for histopathology images. A comparison was also made between different database partitions to understand the effect of mammogram breast density on performance, since this factor has received little attention in the literature. Classification results show that the new CAD method reached an accuracy of 93.5% on the mini-Mammographic Image Analysis Society (mini-MIAS) database, 93.99% on the Digital Database for Screening Mammography (DDSM) and 86.7% on BreaKHis. We found that MSRBF networks are able to build tailored and precise image models and, combined with the DCT, to extract high-quality features from both black-and-white and colour images.
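To make the pipeline concrete, the following Python sketch is an illustrative approximation only, not the authors' implementation: it fits a multiscale Gaussian RBF model to image-derived input-output data by ordinary least squares (standing in for the NARX-MSRBF model identified with FROLS in the paper), stimulates the fitted model, and compresses its response with the DCT to obtain a compact feature vector. All function and variable names here are hypothetical.

```python
# Illustrative sketch (not the authors' code): multiscale Gaussian RBF model fitted
# by least squares (a stand-in for the NARX-MSRBF / FROLS identification), followed
# by DCT compression of the model response to form a feature vector for a classifier.
import numpy as np
from scipy.fft import dct

def rbf_design_matrix(X, centres, scales):
    """Gaussian RBF regressors evaluated at several kernel widths (the 'multiscale' part)."""
    cols = []
    for s in scales:
        for c in centres:
            d2 = np.sum((X - c) ** 2, axis=1)
            cols.append(np.exp(-d2 / (2.0 * s ** 2)))
    return np.column_stack(cols)

def fit_msrbf(X, y, centres, scales):
    """Output-layer weights by ordinary least squares (the paper uses FROLS instead)."""
    Phi = rbf_design_matrix(X, centres, scales)
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
    return w

def dct_features(signal, n_coeffs=32):
    """Keep the first n_coeffs DCT-II coefficients as a compact feature vector."""
    return dct(signal, norm="ortho")[:n_coeffs]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy MISO data: each row holds neighbouring pixel intensities, y is the centre pixel.
    X = rng.random((500, 8))
    y = np.sin(X.sum(axis=1)) + 0.05 * rng.standard_normal(500)

    centres = X[rng.choice(len(X), size=20, replace=False)]  # sampled RBF centres
    scales = [0.5, 1.0, 2.0]                                  # multiple kernel widths
    w = fit_msrbf(X, y, centres, scales)

    # "Stimulate" the identified model and compress its response with the DCT.
    probe = rng.random((256, 8))
    response = rbf_design_matrix(probe, centres, scales) @ w
    features = dct_features(response)
    print(features.shape)  # (32,) feature vector passed to the clustering classifier
```

The least-squares fit is used here only to keep the sketch short; FROLS would instead select regressors one at a time by their error reduction ratio, yielding a sparser model.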
