Result: Comparison of Multi-Armed Bandit Algorithms in Advertising Recommendation Systems

Title:
Comparison of Multi-Armed Bandit Algorithms in Advertising Recommendation Systems
Authors:
Source:
Applied and Computational Engineering. 83:62-71
Publisher Information:
EWA Publishing, 2024.
Publication Year:
2024
Document Type:
Journal Article
ISSN:
2755-273X
2755-2721
DOI:
10.54254/2755-2721/83/2024glg0074
Accession Number:
edsair.doi...........b2e202cb7cfb3a204c3bb9e597e086ef
Database:
OpenAIRE

Further Information

In today's rapidly evolving online environment, advertising recommendation systems use multi-armed bandit algorithms such as dynamic collaborative filtering Thompson sampling (DCTS), upper confidence bound based on recommender system (UCB-RS), and the dynamic ε-greedy algorithm (DEG) to optimize ad displays and improve click-through rates (CTR). These algorithms must adapt under limited information and update their strategies from immediate feedback.

This study uses an experimental comparison to assess the performance of the DCTS, UCB-RS, and DEG algorithms on the click-through rate prediction dataset from Kaggle. Five experimental sets under varied parameter settings were analyzed using receiver operating characteristic (ROC) curves, accuracy, and area under the curve (AUC) as metrics.

Results show that the DEG algorithm consistently outperforms the others, achieving higher AUC values and demonstrating robust sample-identification capability. DEG also exhibits superior precision at high recall levels, underscoring its potential in dynamic advertising environments. Its dynamic adjustment strategy effectively balances exploration and exploitation, optimizing ad displays.

The findings suggest that DEG's adaptability and stability make it particularly well suited to dynamic ad recommendation scenarios. Future research should focus on optimizing DEG's parameter settings and possibly on integrating UCB-RS's exploration mechanism to further improve performance and develop more effective strategies for advertising recommendation systems.
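To illustrate the exploration–exploitation trade-off the abstract describes, the following is a minimal sketch of a dynamic ε-greedy bandit in which the exploration rate decays as click feedback accumulates. The class name, the inverse-decay schedule, and all parameter values are illustrative assumptions, not the paper's exact DEG formulation.

```python
import random


class DynamicEpsilonGreedy:
    """Sketch of a dynamic epsilon-greedy bandit for ad selection.

    Epsilon shrinks as feedback accumulates, shifting the policy from
    exploration toward exploitation. The decay schedule is an assumption
    for illustration only.
    """

    def __init__(self, n_arms, epsilon0=1.0, decay=0.01, seed=0):
        self.counts = [0] * n_arms          # impressions per ad (arm)
        self.values = [0.0] * n_arms        # running mean reward (estimated CTR)
        self.epsilon0 = epsilon0
        self.decay = decay
        self.t = 0                          # total feedback events seen
        self.rng = random.Random(seed)

    def epsilon(self):
        # Exploration rate decays with the amount of feedback observed.
        return self.epsilon0 / (1.0 + self.decay * self.t)

    def select_arm(self):
        if self.rng.random() < self.epsilon():
            # Explore: show a uniformly random ad.
            return self.rng.randrange(len(self.counts))
        # Exploit: show the ad with the highest estimated CTR.
        return max(range(len(self.counts)), key=lambda a: self.values[a])

    def update(self, arm, reward):
        # Incremental running-mean update of the arm's estimated CTR.
        self.t += 1
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]


# Usage: simulate two ads with (assumed) true CTRs of 0.05 and 0.20.
bandit = DynamicEpsilonGreedy(n_arms=2, seed=42)
clicks = random.Random(1)
true_ctr = [0.05, 0.20]
for _ in range(5000):
    arm = bandit.select_arm()
    bandit.update(arm, 1.0 if clicks.random() < true_ctr[arm] else 0.0)
```

After enough rounds, the policy concentrates impressions on the higher-CTR ad while the decayed ε still leaves a small residual exploration rate.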