
Title:
DiffBreed: automatic differentiation enables efficient gradient-based optimization of breeding strategies
Authors:
Hamazaki, Kosuke; Iwata, Hiroyoshi; Tsuda, Koji
Source:
Bioinformatics, Nov 2025, Vol. 41, Issue 11, p. 1-10.
Database:
Academic Search Index

Abstract:

Motivation: Differentiable programming frameworks such as PyTorch and JAX have revolutionized biological modeling. A foremost merit is that separately programmed components can be combined so that their parameters are optimized jointly. Despite the proven value of simulation in agricultural applications, existing breeding simulators are non-differentiable, which hinders their integration into general deep learning systems.

Results: In this paper, we present DiffBreed, a differentiable breeding simulator. Its performance was evaluated on the gradient-based optimization of a progeny allocation strategy that maximizes genetic gain. Using gradient-based optimization, DiffBreed refined progeny allocation strategies and achieved higher genetic gains than a non-optimized equal-allocation approach. These findings highlight DiffBreed's capacity to compute gradient information correctly through automatic differentiation. With its innovative design, DiffBreed is expected to transform future breeding optimization by integrating into modern deep learning workflows.

Availability and implementation: The proposed automatic differentiation framework is implemented as the Python module "DiffBreed." The module and all scripts used in this study, including the gradient-based optimization, are available from the "KosukeHamazaki/GORA-PT" repository on GitHub, https://github.com/KosukeHamazaki/GORA-PT. While the simulated datasets from the present study are available from the same repository, the optimization results produced with PyTorch were not shared there due to their file sizes. Instead, all datasets, including the optimized ones, will be shared in a repository on Zenodo, https://doi.org/10.5281/zenodo.14046522. [ABSTRACT FROM AUTHOR]