Model-agnostic federated mutual learning
Affiliation: College of Computer Science and Technology, National University of Defense Technology, Changsha 410073, China
CLC Number: TP181


Abstract:

Mainstream federated learning (FL) methods require gradient exchange and rest on the ideal assumption of independent and identically distributed (IID) data, which brings extra communication overhead, privacy leakage, and data inefficiency. A new FL framework called MAFML (model-agnostic federated mutual learning) is therefore proposed. MAFML shares only a small amount of low-dimensional information (for example, the soft labels output by a neural network in an image classification task) to achieve “mutual learning and mutual teaching” across participants. Moreover, MAFML needs no shared global model: each user can customize a private model, with no restriction on model structure or parameters. At the same time, MAFML uses a general technique for avoiding gradient interference, so that each participant’s model generalizes well to other domains without degrading performance on its own domain data. Experiments on multiple cross-domain datasets show that MAFML offers a promising solution for alliance businesses facing the “competition and cooperation” dilemma.
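The soft-label exchange described above follows the general pattern of mutual learning via knowledge distillation: each participant trains on its own labels while adding a divergence term that pulls its softened outputs toward a peer's. The sketch below illustrates that pattern only; the function names, loss composition, and temperature value are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of soft-label mutual learning, assuming a standard
# cross-entropy + KL-divergence distillation loss. Pure Python for clarity.
import math

def softmax(logits, temperature=1.0):
    """Turn raw logits into a probability distribution (soft labels)."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def kl_divergence(p, q):
    """KL(p || q): how far the local soft labels q are from the peer's p."""
    eps = 1e-12
    return sum(pi * math.log((pi + eps) / (qi + eps)) for pi, qi in zip(p, q))

def mutual_loss(own_logits, peer_soft, true_idx, temperature=2.0):
    """Supervised cross-entropy plus a distillation term from the peer.

    Only `peer_soft` (low-dimensional soft labels) crosses the participant
    boundary -- no gradients or model parameters are shared.
    """
    own_soft = softmax(own_logits, temperature)
    probs = softmax(own_logits)  # temperature 1 for the supervised term
    ce = -math.log(probs[true_idx] + 1e-12)
    return ce + kl_divergence(peer_soft, own_soft)

# Example round: a peer publishes its soft labels, the local model learns.
peer_soft = softmax([2.0, 0.5, -1.0], temperature=2.0)
loss = mutual_loss([1.5, 0.8, -0.5], peer_soft, true_idx=0)
```

Because each side only consumes the other's output distribution, the two models may have entirely different architectures, which is what makes the scheme model-agnostic.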

History
  • Received: June 30, 2021
  • Online: June 07, 2023
  • Published: June 28, 2023