Title:
An improved NSGA-II algorithm based on fuzzy logic and learning automata for automatically designing the convolutional neural network for image classification.
Source:
Multimedia Tools & Applications; Oct2025, Vol. 84 Issue 35, p44543-44582, 40p
Database:
Complementary Index

Abstract:

Recent advancements in deep convolutional neural networks (CNNs) have revolutionized the fields of image processing, object detection, and image classification. Nevertheless, their performance is strongly architecture-dependent. Manually designing optimal CNN architectures requires significant problem understanding and domain-specific expertise, which is not necessarily available to everyone. Furthermore, this process is often very time-consuming and error-prone. Consequently, Neural Architecture Search (NAS) has emerged as a promising solution that aims to design optimal networks automatically. Unfortunately, many existing NAS methods require substantial human intervention and suffer from prohibitive search costs, particularly for resource-constrained applications. In addition, the optimal depth for a network is unknown and highly dependent on the complexity of the problem and the available data. To this end, this paper proposes a Fuzzy and Learning Automata-Guided method called FL-CNN, which leverages the NSGA-II algorithm and is implemented in Python using the PyTorch framework. Specifically, FL-CNN utilizes fuzzy logic to adaptively adjust crossover and mutation rates, avoiding premature convergence, and learning automata to improve diversity. Moreover, it presents a hybrid strategy for evaluating generated networks to reduce the excessively high search costs of NAS. Notably, FL-CNN achieves error rates of 2.6%, 12.1%, and 22.6% on the CIFAR10, CIFAR100, and ImageNet datasets, respectively. Considering model complexity, FL-CNN generates networks with 0.17 M, 1.1 M, and 4.6 M parameters, respectively. Despite this performance, the search cost for the CIFAR10 and CIFAR100 datasets is only 0.32 and 2.3 GPU days, respectively. These results indicate that FL-CNN outperforms many existing evolutionary and non-evolutionary NAS methods. [ABSTRACT FROM AUTHOR]
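To illustrate the core idea of fuzzy-adaptive operator rates described in the abstract, the following is a minimal sketch: a fuzzy controller that maps a population-diversity measure to crossover and mutation probabilities, so that low diversity raises the mutation rate (countering premature convergence) while high diversity favors crossover. The membership functions, rule weights, and the diversity proxy are illustrative assumptions, not the paper's actual FL-CNN design.

```python
def population_diversity(fitnesses):
    """Crude diversity proxy in [0, 1]: relative spread of fitness values.
    (Illustrative assumption -- FL-CNN's actual measure may differ.)"""
    lo, hi = min(fitnesses), max(fitnesses)
    if hi == lo:
        return 0.0
    mean = sum(fitnesses) / len(fitnesses)
    return (hi - mean) / (hi - lo)

def triangular(x, a, b, c):
    """Triangular membership function supported on (a, c), peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_rates(diversity):
    """Map diversity to (crossover_rate, mutation_rate) via three fuzzy
    sets (low/medium/high) and a weighted-average defuzzification."""
    low = triangular(diversity, -0.5, 0.0, 0.5)
    med = triangular(diversity, 0.0, 0.5, 1.0)
    high = triangular(diversity, 0.5, 1.0, 1.5)
    total = low + med + high
    # Illustrative rule base: low diversity -> more mutation to escape
    # premature convergence; high diversity -> rely mostly on crossover.
    p_mutation = (low * 0.30 + med * 0.10 + high * 0.02) / total
    p_crossover = (low * 0.60 + med * 0.80 + high * 0.95) / total
    return p_crossover, p_mutation
```

In an NSGA-II loop, `fuzzy_rates` would be called once per generation and the returned probabilities fed to the variation operators, replacing fixed hyperparameters with a diversity-driven schedule.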

Copyright of Multimedia Tools & Applications is the property of Springer Nature and its content may not be copied or emailed to multiple sites without the copyright holder's express written permission. Additionally, content may not be used with any artificial intelligence tools or machine learning technologies. However, users may print, download, or email articles for individual use. This abstract may be abridged. No warranty is given about the accuracy of the copy. Users should refer to the original published version of the material for the full abstract. (Copyright applies to all Abstracts.)