One-step model agnostic meta-learning using two-phase switching optimization strategy

Title:
One-step model agnostic meta-learning using two-phase switching optimization strategy
Source:
Neural Computing and Applications. 34:13529-13537
Publisher Information:
Springer Science and Business Media LLC, 2022.
Publication Year:
2022
Document Type:
Academic Journal Article
Other literature type
Language:
English
ISSN:
1433-3058
0941-0643
DOI:
10.1007/s00521-022-07160-1
10.60692/gbq5x-3f971
10.60692/b1jv9-f1175
Rights:
CC BY
Accession Number:
edsair.doi.dedup.....83cd29e61d21cf827d7a50af4acf995e
Database:
OpenAIRE

Further Information

Conventional training mechanisms often suffer from limited classification performance because they require large numbers of training samples. To counter this issue, the field of meta-learning has shown great potential for fine-tuning and generalizing to new tasks using small datasets. As a variant derived from Model-Agnostic Meta-Learning (MAML), a one-step MAML incorporating a two-phase switching optimization strategy is proposed in this paper to improve performance in fewer iterations. One-step MAML conducts training with two loops, known as the inner and the outer loop. During the inner loop, a gradient update is performed only once per task. In the outer loop, the gradient is updated based on the losses accumulated over the evaluation set during each inner loop. Several experiments using the BERT-Tiny model are conducted on five benchmark datasets to analyze and compare the performance of one-step MAML. The evaluation shows that the best loss and accuracy are achieved when one-step MAML is coupled with the two-phase switching optimizer. It is also observed that this combination reaches its peak accuracy in the fewest training steps.
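
The two-loop procedure described in the abstract can be illustrated with a short sketch. Below is a minimal, first-order approximation of one-step MAML in PyTorch, written against a toy regression model; the task sampler sample_task is a hypothetical placeholder, and since the paper's two-phase switching optimizer is not detailed in this record, a plain Adam outer-loop optimizer stands in for it.

import copy
import torch
import torch.nn as nn

torch.manual_seed(0)

meta_model = nn.Linear(4, 1)  # toy stand-in for a model such as BERT-Tiny
meta_opt = torch.optim.Adam(meta_model.parameters(), lr=1e-3)  # outer loop
inner_lr = 1e-2
loss_fn = nn.MSELoss()

def sample_task():
    # Hypothetical task sampler: returns (support, evaluation) batches.
    w = torch.randn(4, 1)
    xs, xq = torch.randn(8, 4), torch.randn(8, 4)
    return (xs, xs @ w), (xq, xq @ w)

for meta_step in range(100):          # outer loop over meta-iterations
    meta_opt.zero_grad()
    for _ in range(4):                # tasks per meta-batch
        (xs, ys), (xq, yq) = sample_task()
        learner = copy.deepcopy(meta_model)       # task-specific copy

        # Inner loop: exactly ONE gradient update on the support set.
        params = list(learner.parameters())
        grads = torch.autograd.grad(loss_fn(learner(xs), ys), params)
        with torch.no_grad():
            for p, g in zip(params, grads):
                p -= inner_lr * g

        # Outer-loop contribution: the evaluation-set loss of the adapted
        # learner. First-order approximation: its gradients are accumulated
        # directly on the meta-model's parameters.
        grads = torch.autograd.grad(loss_fn(learner(xq), yq), params)
        for p, g in zip(meta_model.parameters(), grads):
            p.grad = g if p.grad is None else p.grad + g
    meta_opt.step()

A full second-order MAML would instead differentiate the evaluation loss through the inner update; the first-order variant above is used only to keep the sketch compact and is not claimed to reproduce the paper's exact method.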