Analysing semi-supervised learning for image classification using compact networks in the biomedical context.
The development of mobile and edge applications that embed deep convolutional neural models has the potential to revolutionise healthcare. However, most deep learning models require computational resources that are not available on smartphones or edge devices, an issue that can be addressed with compact models, which require fewer resources than standard deep learning models. The drawback of such models is that they are usually less accurate than larger models. We propose to address the accuracy limitation of compact networks by applying semi-supervised learning techniques. In particular, we perform a thorough comparison of self-training methods, consistency regularisation techniques, and quantisation techniques. We present a detailed analysis of the results obtained by combining 11 compact networks and 6 semi-supervised processes applied to 10 biomedical datasets. We show that, by combining semi-supervised methods and compact networks, we can create compact models that are not only as accurate as standard-size models, but also faster and lighter. In addition, we have developed a Python library that facilitates the combination of compact networks and semi-supervised learning methods for image classification tasks. [ABSTRACT FROM AUTHOR]
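To illustrate how one of the compared semi-supervised processes could be paired with a compact network, the following is a minimal sketch of a single self-training (pseudo-labelling) round in PyTorch. All names here are illustrative assumptions (MobileNetV3-Small as the compact backbone, a 0.95 confidence threshold, random tensors standing in for biomedical images); this is not the API of the paper's Python library.

# Hypothetical sketch of one self-training round with a compact network.
# The backbone, threshold, and data are illustrative assumptions only.
import torch
import torch.nn.functional as F
from torch.utils.data import DataLoader, TensorDataset, ConcatDataset
from torchvision.models import mobilenet_v3_small

NUM_CLASSES = 10          # assumed number of classes in a biomedical dataset
CONF_THRESHOLD = 0.95     # keep only high-confidence pseudo-labels

def train(model, loader, epochs=1, lr=1e-3):
    # Standard supervised training loop.
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    model.train()
    for _ in range(epochs):
        for x, y in loader:
            opt.zero_grad()
            loss = F.cross_entropy(model(x), y)
            loss.backward()
            opt.step()

def pseudo_label(model, unlabeled_loader):
    # Predict on unlabeled images and keep only confident predictions.
    model.eval()
    xs, ys = [], []
    with torch.no_grad():
        for (x,) in unlabeled_loader:
            probs = F.softmax(model(x), dim=1)
            conf, pred = probs.max(dim=1)
            mask = conf >= CONF_THRESHOLD
            if mask.any():
                xs.append(x[mask])
                ys.append(pred[mask])
    if not xs:
        return None
    return TensorDataset(torch.cat(xs), torch.cat(ys))

# Compact backbone: MobileNetV3-Small (one of several possible compact architectures).
model = mobilenet_v3_small(num_classes=NUM_CLASSES)

# Toy tensors stand in for real labeled/unlabeled biomedical images.
labeled = TensorDataset(torch.randn(32, 3, 224, 224),
                        torch.randint(0, NUM_CLASSES, (32,)))
unlabeled = TensorDataset(torch.randn(64, 3, 224, 224))

train(model, DataLoader(labeled, batch_size=8))                     # 1) fit on labeled data
pseudo = pseudo_label(model, DataLoader(unlabeled, batch_size=8))   # 2) pseudo-label unlabeled data
if pseudo is not None:                                              # 3) retrain on the enlarged set
    train(model, DataLoader(ConcatDataset([labeled, pseudo]), batch_size=8))

In practice, this labelling-and-retraining cycle is repeated for several rounds; consistency-regularisation methods differ mainly in replacing step 2 with a loss that enforces agreement between predictions on differently augmented views of the same unlabeled image.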