
Model Pruning

FedSCR

FedSCR uses structured pruning to remove entire filters and channels from each client's update when their summed parameter values fall below a given threshold, so that the pruned structures need not be communicated to the server.

cd examples/model_pruning/fedscr
uv run fedscr.py -c fedscr_MNIST_lenet5.toml

Reference: Wu et al., "FedSCR: Structure-Based Communication Reduction for Federated Learning," IEEE Transactions on Parallel and Distributed Systems, 2021.
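As a rough illustration of the idea (not Plato's actual implementation), the sketch below applies FedSCR-style structured pruning to a single convolutional weight update: whole output filters and input channels whose summed absolute update values fall below a threshold are zeroed out. The function name `prune_update` and the threshold value are illustrative assumptions.

```python
import numpy as np

def prune_update(update: np.ndarray, threshold: float) -> np.ndarray:
    """Zero out entire filters/channels of a conv weight update whose
    summed absolute values fall below `threshold` (illustrative sketch)."""
    pruned = update.copy()
    # Filter importance: sum of absolute values over each output filter.
    filter_sums = np.abs(pruned).sum(axis=(1, 2, 3))
    pruned[filter_sums < threshold] = 0.0
    # Channel importance: sum of absolute values over each input channel.
    channel_sums = np.abs(pruned).sum(axis=(0, 2, 3))
    pruned[:, channel_sums < threshold] = 0.0
    return pruned

# Demo on a random update of shape (out_channels, in_channels, k, k).
rng = np.random.default_rng(0)
update = rng.normal(scale=0.01, size=(8, 4, 3, 3))
sparse = prune_update(update, threshold=0.3)
print(f"nonzero filters: {int((np.abs(sparse).sum(axis=(1, 2, 3)) > 0).sum())}/8")
```

Only the surviving structures (plus their indices) would then be uploaded, which is where the communication reduction comes from.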


Sub-FedAvg

Sub-FedAvg aims to obtain a personalized model for each client with non-i.i.d. local data. It iteratively prunes the parameters of each client's local model during local training, with the objective of removing the commonly shared parameters of local models while keeping the personalized ones. In addition to the original version for two-layer federated learning, a version for three-layer federated learning has been implemented as well.

For two-layer federated learning:

cd examples/model_pruning/sub_fedavg
uv run subfedavg.py -c subfedavg_MNIST_lenet5.toml

For three-layer federated learning:

cd examples/model_pruning/sub_fedavg
uv run subcs.py -c subcs_MNIST_lenet5.toml

Reference: Vahidian et al., "Personalized Federated Learning by Structured and Unstructured Pruning under Data Heterogeneity," in Proc. 41st IEEE International Conference on Distributed Computing Systems Workshops (ICDCSW), 2021.
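The iterative unstructured pruning step can be sketched as follows. This is a minimal illustration, not Plato's API: each round, a fixed percentage of the smallest-magnitude surviving weights is pruned via a binary mask, until a target sparsity is reached. The names `prune_by_percentile`, the 20% rate, and the 50% target are illustrative assumptions.

```python
import numpy as np

def prune_by_percentile(weights: np.ndarray, mask: np.ndarray, percent: float):
    """Zero out the lowest-magnitude surviving weights and update the
    binary mask accordingly (illustrative sketch)."""
    alive = weights[mask == 1]
    cutoff = np.percentile(np.abs(alive), percent)
    new_mask = np.where(np.abs(weights) < cutoff, 0, mask)
    return weights * new_mask, new_mask

# Demo: iteratively prune 20% of surviving weights per round,
# stopping once at most 50% of the weights survive.
weights = np.random.default_rng(1).normal(size=100)
mask = np.ones_like(weights)
while mask.mean() > 0.5:
    weights, mask = prune_by_percentile(weights, mask, 20)
print(f"surviving fraction: {mask.mean():.2f}")
```

In Sub-FedAvg, the server would then average only the parameters left unmasked across clients, so each client ends up with its own personalized subnetwork.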