Code Documentation Generation With BART: A Fine-Tuning Approach.
This study investigates the application of the BART model, fine-tuned with LoRA+, for the automated generation of Python code documentation. We address the growing complexity of software systems and the critical role of code documentation in enhancing software comprehension and maintenance. We highlight the challenges of manual code commenting, such as its tediousness and error-prone nature, which often lead to incomplete or outdated documentation. To tackle these issues, the research explores the potential of automatic code documentation generation using pre-trained transformer models. However, the high computational resources required by many of these models pose a limitation for developers with constrained hardware. To mitigate this, we employ Low-Rank Adaptation (LoRA+), a technique designed to fine-tune large models efficiently and reduce computational overhead. The study evaluates the effectiveness of fine-tuning BART with LoRA+ for Python code documentation generation, comparing its performance with state-of-the-art models such as CodeT5+ and StarCoder. The evaluation utilizes metrics such as BLEU and BERTScore to assess the quality of the generated documentation.
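The abstract does not specify the authors' implementation, but the workflow it describes can be illustrated with a minimal sketch. The example below assumes the Hugging Face transformers, peft, and evaluate libraries; the base checkpoint, adapter hyperparameters, and the toy code/docstring pair are illustrative placeholders rather than the study's actual configuration. Note that peft's LoraConfig implements standard LoRA; LoRA+ differs mainly in assigning a higher learning rate to the B adapter matrices during optimization, which would be configured in the optimizer rather than shown here.

```python
# Minimal sketch: low-rank adapter fine-tuning of BART for code-to-docstring
# generation, plus evaluation with the metrics named in the abstract.
# Checkpoint, hyperparameters, and example data are assumptions, not the
# authors' reported setup.
import evaluate
from transformers import BartForConditionalGeneration, BartTokenizerFast
from peft import LoraConfig, TaskType, get_peft_model

model_name = "facebook/bart-base"  # assumed base checkpoint
tokenizer = BartTokenizerFast.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

# Attach low-rank adapters to the attention projections; only these small
# matrices are trained, which keeps memory and compute requirements modest.
lora_config = LoraConfig(
    task_type=TaskType.SEQ_2_SEQ_LM,
    r=16,
    lora_alpha=32,
    lora_dropout=0.1,
    target_modules=["q_proj", "v_proj"],
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of BART's weights

def generate_doc(code_snippet: str) -> str:
    """Generate a natural-language docstring for a Python snippet."""
    inputs = tokenizer(code_snippet, return_tensors="pt",
                       truncation=True, max_length=512)
    output_ids = model.generate(**inputs, max_length=64, num_beams=4)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

# Quality of generated documentation measured with BLEU and BERTScore.
bleu = evaluate.load("bleu")
bertscore = evaluate.load("bertscore")
predictions = [generate_doc("def add(a, b):\n    return a + b")]
references = ["Return the sum of two numbers."]
print(bleu.compute(predictions=predictions,
                   references=[[r] for r in references]))
print(bertscore.compute(predictions=predictions,
                        references=references, lang="en"))
```

In practice the adapted model would first be trained on paired code/docstring data before generation and scoring; the sketch only shows how the adapters are attached and how the two metrics are computed.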