Title:
FedSat-LAM: Enabling Large AI Models on Resource-Constrained Satellites via Hierarchical Federated Learning
Authors:
Contributors:
AlGorithmes et Optimisation pour Réseaux Autonomes (AGORA); CITI Centre of Innovation in Telecommunications and Integration of services (CITI), Institut National des Sciences Appliquées de Lyon (INSA Lyon); Institut National de Recherche en Informatique et en Automatique (Inria), Centre Inria de Lyon; University of California, Berkeley (UC Berkeley); Drakkar, Laboratoire d'Informatique de Grenoble (LIG), Université Grenoble Alpes (UGA), CNRS, Institut polytechnique de Grenoble - Grenoble Institute of Technology (Grenoble INP), Inria; Consejo Nacional de Investigaciones Científicas y Técnicas (CONICET), Buenos Aires; Saarland University, Saarbrücken
Source:
ACM MobiCom 2025 - 31st Annual International Conference on Mobile Computing and Networking, pp. 19-24
Publisher Information:
CCSD; ACM, 2025.
Publication Year:
2025
Collection:
collection:UGA
collection:CNRS
collection:INRIA
collection:INSA-LYON
collection:INPG
collection:LIG
collection:LIG_SRCPR
collection:LIG_SRCPR_DRAKKAR
collection:INRIA2
collection:CITI
collection:INSA-GROUPE
collection:UDL
collection:UGA-EPE
collection:INRIA-LYS
collection:INRIA-ETATSUNIS
collection:INRIA-ALLEMAGNE
collection:TEST-UGA
Subject Terms:
satellite edge computing; federated learning; parameter-efficient fine-tuning; foundation models; LEO constellations; CCS Concepts: Computer systems organization → Embedded systems; Computing methodologies → Distributed artificial intelligence; Information systems → Spatial-temporal systems; [CHIM] Chemical Sciences
Subject Geographic:
Original Identifier:
HAL: hal-05423924
Document Type:
Conference paper (conferenceObject)
Language:
English
Relation:
info:eu-repo/semantics/altIdentifier/doi/10.1145/3737902.3768355
DOI:
10.1145/3737902.3768355
Access URL:
Rights:
info:eu-repo/semantics/OpenAccess
URL: http://creativecommons.org/licenses/by/
Accession Number:
edshal.hal.05423924v1
Database:
HAL
Additional Information
LEO satellites generate 5.2 PB of data per day but can downlink only 2.3 PB/day, leaving 57% of the data inaccessible. We present FedSat-LAM, which enables 632M-parameter foundation models on heterogeneous satellite constellations through: (1) multi-hop offloading that reduces latency by 41%, (2) tree-based aggregation with O(log N) complexity, and (3) PEFT that trains only 0.13% of parameters at 10.45 J/sample. Validated in 100-satellite simulations and on Jetson hardware, FedSat-LAM achieves 87.6% accuracy, 5.2% above FedProx, while cutting communication by 85% and energy by 95%.
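To illustrate the tree-based aggregation idea mentioned in the abstract, the following is a minimal, hypothetical sketch: satellites act as leaves of a binary tree and model updates are averaged pairwise, level by level, so an N-node constellation needs on the order of log₂ N aggregation levels rather than N sequential uploads to a single server. All function names here are illustrative assumptions; this is not the FedSat-LAM implementation.

```python
# Hypothetical sketch of tree-based federated aggregation.
# Updates are averaged pairwise until a single global model remains,
# giving ceil(log2(N)) aggregation levels for N participants.

def average(a, b):
    """Element-wise mean of two weight vectors (lists of floats)."""
    return [(x + y) / 2 for x, y in zip(a, b)]

def tree_aggregate(updates):
    """Pairwise-average per-satellite updates level by level.

    Returns the final aggregate and the number of tree levels used.
    """
    levels = 0
    while len(updates) > 1:
        nxt = []
        for i in range(0, len(updates) - 1, 2):
            nxt.append(average(updates[i], updates[i + 1]))
        if len(updates) % 2:      # odd node carries over to the next level
            nxt.append(updates[-1])
        updates = nxt
        levels += 1
    return updates[0], levels

# Example: 8 satellites with 1-parameter "models" 0.0 .. 7.0
models = [[float(i)] for i in range(8)]
global_model, depth = tree_aggregate(models)
# depth == 3 (= log2(8)); global_model == [3.5], the mean of 0..7
```

Note that unweighted pairwise averaging equals the exact global mean only for balanced trees (N a power of two) with equal per-node data sizes; a real system would carry sample counts and compute a weighted average at each merge.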