Extended Data Fig. 2: Illustrating a federated learning scenario by distributing Ark+’s training across three local sites.
From: A fully open AI foundation model applied to chest radiography

Ark+ can be federated by deploying a local Ark+ at each site, protecting privacy while distributing training. In this setup, each site trains its own Ark+ on all of its available data, using the same cyclic training strategy for the student and the same epoch-wise EMA to update the teacher. After completing a round of local training, every site sends its student weights to a central server, where the weights are averaged to aggregate the local models into a "master" model that consolidates knowledge from all sites. The master model is then distributed back to the local sites, enabling iterative rounds of learning and continual improvement of each site's teacher model. For simplicity, the projectors and multi-task heads are omitted from the illustration.
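The aggregation loop described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes an unweighted FedAvg-style average of student weights and an illustrative EMA decay of 0.99; all function names and the toy one-parameter models are hypothetical.

```python
import numpy as np

def average_weights(site_weights):
    """Aggregate per-site student weights into a master model by
    unweighted averaging (FedAvg-style; weighting by site data size
    would be a natural variant)."""
    keys = site_weights[0].keys()
    return {k: np.mean([w[k] for w in site_weights], axis=0) for k in keys}

def ema_update(teacher, student, decay=0.99):
    """Epoch-wise EMA teacher update: teacher <- d*teacher + (1-d)*student."""
    return {k: decay * teacher[k] + (1 - decay) * student[k] for k in teacher}

# Toy round with three sites, each holding a one-parameter "student".
students = [{"w": np.array([1.0])},
            {"w": np.array([2.0])},
            {"w": np.array([3.0])}]

# Central server averages student weights into a master model.
master = average_weights(students)

# The master is sent back; each site folds it into its local teacher via EMA.
teachers = [ema_update({"w": np.array([0.0])}, master) for _ in students]
```

In a real deployment the weight dictionaries would be full network state dicts and the loop would repeat over many communication rounds, with local cyclic training between aggregations.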