
Code, results and presentation of my bachelor's thesis.


mainpyp/ComEff


Communication-efficient approaches to federated deep neural networks (BA thesis)

Written by Adrian Henkel, advised by Reza Naserigerdeh, and supervised by Dr. Josch Pauling and Prof. Dr. Jan Baumbach.

Hey, thank you for your interest in my thesis. 🎉

Please see the slides in the presentation folder for a quick summary of the work. The full thesis can be found here 🤓.

FedLearning

This project simulates and analyses three different communication-efficient approaches to federated machine learning:

  1. Gradient Quantization: Each gradient value is encoded with fewer bits before being sent from the client to the server, and vice versa.
  2. Gradient Sparsification: This approach transmits only those gradients that have changed beyond a certain threshold after the local updates; the rest are withheld.
  3. Multiple Local Updates: Each client runs the mini-batch SGD training algorithm several times within a single communication round before communicating its update.

All code that was used can be found here.
The configuration files for the final simulations can be found here.
The result files can be found here.

Below, the package structure is shown in a truncated UML diagram that focuses on the main functionality.

This work was graded with a 1.0 (highest grade).
