Title:
Africanus I. Scalable, distributed and efficient radio data processing with Dask-MS and Codex Africanus.
Authors:
Perkins, S.J.1 (AUTHOR) simon.perkins@gmail.com, Kenyon, J.S.2 (AUTHOR), Andati, L.A.L.2 (AUTHOR), Bester, H.L.1,2 (AUTHOR), Smirnov, O.M.1,2,3 (AUTHOR), Hugo, B.V.1,2 (AUTHOR)
Source:
Astronomy & Computing. Jul2025, Vol. 52, pN.PAG-N.PAG. 1p.
Database:
Supplemental Index

The physical configuration of new radio interferometers such as MeerKAT, SKA, ngVLA and DSA-2000 informs software development in two important areas. Firstly, tractably processing the sheer quantity of data produced by new instruments necessitates subdividing it and processing it on multiple nodes. Secondly, the enhanced sensitivity of modern instruments, a product of improved engineering practices and greater data quantities, necessitates the development of new techniques to capitalize on it. This produces a critical tension in radio astronomy software development: a fully optimized pipeline is desirable for producing science products in a tractable amount of time, but the design requirements for such a pipeline are unlikely to be understood upfront, given the artefacts unveiled by greater instrument sensitivity. New techniques must therefore be continuously developed to address these artefacts and integrated into a full pipeline. As Knuth reminds us, "Premature optimization is the root of all evil". This necessitates a fundamental trade-off between a trifecta of (1) performant code, (2) flexibility and (3) ease of development. At one end of the spectrum, rigid design requirements are unlikely to capture the full scope of the problem; at the other, throw-away research code is unsuitable for production use. This work proposes a framework for developing radio astronomy techniques within the above trifecta. In doing so, we favour flexibility and ease of development over performance, but this does not necessarily mean that software developed within this framework is slow. Practically, this translates to using data formats and software from the Open Source community. For example, by using NumPy arrays and/or Pandas dataframes, a plethora of algorithms immediately becomes available to the scientific developer.
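As a minimal sketch of the interoperability claim above (the array shape, values and statistics are hypothetical, not taken from the paper): once visibilities are held in a plain NumPy array, the surrounding ecosystem's algorithms apply to them directly, with no bespoke data format required.

```python
import numpy as np

# Hypothetical (nrow, nchan) array of complex visibilities. Any library
# that accepts a NumPy array can now operate on this data directly.
vis = np.ones((100, 64), dtype=np.complex64)

# Ecosystem algorithms are immediately available: e.g. an FFT along the
# channel axis, or a simple amplitude statistic.
spectrum = np.fft.fft(vis, axis=1)
mean_amp = np.abs(vis).mean()
```

The same holds for Pandas dataframes: tabular selections, joins and group-by reductions come for free once the data is expressed in the common format.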
Focusing on performance: given the breakdown of Moore's Law in the 2010s and the resultant growth of both multi-core and distributed (including cloud) computing, a fundamental shift in the writing of radio astronomy algorithms and the storage of data is required. It is necessary to shard data over multiple processors and compute nodes, and to write algorithms that operate on these shards in parallel. The growth in data volumes compounds this requirement. Given the fundamental shift in compute architecture, we believe this is central to the performance of any framework going forward, and it is given special emphasis in this one. This paper describes two Python libraries, Dask-MS and codex africanus, which enable the development of distributed, high-performance radio astronomy code with Dask. Dask is a lightweight Python parallelization and distribution framework that seamlessly integrates with the PyData ecosystem to address radio astronomy "Big Data" challenges. [ABSTRACT FROM AUTHOR]
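The shard-and-process pattern the abstract describes can be sketched with the standard library alone (Dask automates this pattern, adding lazy task graphs and multi-node scheduling; the per-shard reduction here is a hypothetical stand-in, not an API from Dask-MS or codex africanus):

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

def shard_power(chunk):
    # Hypothetical per-shard reduction: total squared visibility amplitude.
    return np.sum(np.abs(chunk) ** 2)

rng = np.random.default_rng(42)
vis = rng.standard_normal(100_000) + 1j * rng.standard_normal(100_000)

# Shard the data axis into 8 chunks and reduce each shard in parallel.
shards = np.array_split(vis, 8)
with ThreadPoolExecutor() as pool:
    partials = list(pool.map(shard_power, shards))

# Combining the partial results reproduces the monolithic computation.
total = sum(partials)
assert np.isclose(total, np.sum(np.abs(vis) ** 2))
```

With Dask, the sharding (chunking), the per-chunk tasks and the final combination are expressed as a task graph instead, which a scheduler can then execute across threads, processes or a cluster of nodes.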