Table 5 Optimizers & FL architecture.

From: Privacy preserving skin cancer diagnosis through federated deep learning and explainable AI

| Category | Parameter/method | Description |
|---|---|---|
| Optimizer 1 | AdamW | Adam variant with decoupled weight decay |
| Optimizer 2 | SGD | Standard optimizer with momentum |
| Optimizer 3 | RMSprop | Moving average of squared gradients |
| No. of clients | 3 | Each client trains on its local skin cancer dataset |
| Communication rounds | 25 | Number of global aggregation rounds |
| Local training epochs | 20 | Training epochs per client before sending updates |
| Aggregation algorithm | FedAvg | Average model weights from clients to update the global model each round |
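The FL setup in the table (3 clients, 25 communication rounds, 20 local epochs, FedAvg aggregation) can be sketched as below. This is a minimal illustration, not the paper's implementation: the toy linear model, synthetic client data, and the `local_train` gradient-descent step are assumptions standing in for each client's local skin cancer training.

```python
import numpy as np

def local_train(weights, X, y, epochs=20, lr=0.01):
    # Stand-in for a client's 20 local epochs: plain gradient
    # descent on mean squared error for a toy linear model.
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def fedavg(client_updates, client_sizes):
    # FedAvg: average client weights, weighted by local dataset size.
    total = sum(client_sizes)
    return sum(n / total * w for w, n in zip(client_updates, client_sizes))

# Three clients, each holding its own (synthetic) local dataset.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

# 25 communication rounds: broadcast global weights, train locally,
# then aggregate the updates into a new global model.
global_w = np.zeros(2)
for _ in range(25):
    updates = [local_train(global_w, X, y) for X, y in clients]
    global_w = fedavg(updates, [len(y) for _, y in clients])

print(global_w)  # converges toward true_w
```

Only model weights leave each client; raw patient data stays local, which is the privacy-preserving property the paper's federated design relies on.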