
I am currently a postdoctoral researcher at L2S under the supervision of Florent Bouchard. My work focuses on federated learning over covariance matrices with Riemannian models, applied to health data (EEG).
Brief description of the research project: the goal is to implement and benchmark a federated learning approach on several EEG motor-imagery datasets, performing classification with sample covariance matrices as inputs to SPDnet. SPDnet is a lightweight Riemannian deep learning architecture that processes SPD matrices in a geometry-preserving way; its parameters live on the Stiefel manifold. As a baseline, we consider EEGnet, a Euclidean, convolution-based neural network architecture tailored to EEG signals.
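The covariance features mentioned above can be sketched in a few lines of NumPy. This is a minimal illustration, not the project's actual preprocessing pipeline: the trial array shape and the shrinkage regularization (added so every matrix is strictly positive definite) are assumptions.

```python
import numpy as np

def sample_covariances(trials, shrinkage=1e-3):
    """Sample covariance features for an SPDnet-style pipeline.

    trials: hypothetical EEG epochs of shape (n_trials, n_channels, n_times).
    Returns an array of shape (n_trials, n_channels, n_channels) of SPD matrices.
    """
    n_trials, n_channels, n_times = trials.shape
    covs = np.empty((n_trials, n_channels, n_channels))
    for i, x in enumerate(trials):
        x = x - x.mean(axis=1, keepdims=True)  # center each channel
        c = (x @ x.T) / (n_times - 1)          # sample covariance matrix
        # shrink toward a scaled identity so the result is strictly SPD
        covs[i] = (1 - shrinkage) * c \
            + shrinkage * (np.trace(c) / n_channels) * np.eye(n_channels)
    return covs
```

Each resulting matrix is symmetric with strictly positive eigenvalues, as required by geometry-aware layers operating on the SPD manifold.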
Federated learning with Riemannian models: the federated learning workflow relies on communication rounds between a central server and selected clients, which share a common model architecture. At each round, each client performs local training and sends its model weights to the central server; no data is ever shared among clients or between clients and the server. The challenge for the server is to aggregate the received weights into an updated global model, whose parameters are sent back to the clients for the next round of local training. For a Euclidean network like EEGnet, the most natural approach is to compute the arithmetic average of the received weights (FedAvg). For Riemannian weights, on the other hand, a geometry-preserving aggregation is needed. We consider two Riemannian aggregation schemes: one projects the extrinsic arithmetic mean back onto the manifold via a nearest-point projection; the other averages in the tangent space, mapping weights back and forth with liftings and retractions.
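The two aggregation schemes can be sketched for Stiefel-valued weights as follows. This is a minimal NumPy illustration under stated assumptions: each client weight is a single n×p matrix with orthonormal columns, the nearest-point projection is the polar factor of the ambient mean, and the lifting/retraction pair (a projection-based lifting at the current global model, a QR retraction) is one standard choice, not necessarily the one used in the project.

```python
import numpy as np

def project_stiefel(m):
    """Nearest-point projection onto the Stiefel manifold (polar factor of m)."""
    u, _, vt = np.linalg.svd(m, full_matrices=False)
    return u @ vt

def aggregate_projection(weights):
    """Scheme 1: arithmetic mean in the ambient space, projected back."""
    return project_stiefel(np.mean(weights, axis=0))

def tangent_project(x, z):
    """Orthogonal projection of an ambient matrix z onto the tangent space at x."""
    sym = (x.T @ z + z.T @ x) / 2
    return z - x @ sym

def aggregate_tangent(weights, x_ref):
    """Scheme 2: lift client weights to the tangent space at the current global
    model x_ref (projection-based lifting, an illustrative assumption),
    average there, then map back with a QR retraction."""
    xis = [tangent_project(x_ref, w - x_ref) for w in weights]
    q, r = np.linalg.qr(x_ref + np.mean(xis, axis=0))
    # flip column signs so R has a positive diagonal (canonical QR retraction)
    return q * np.sign(np.diag(r))
```

Both aggregates stay on the Stiefel manifold by construction, so the updated global model remains a valid set of SPDnet parameters to broadcast for the next round.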
Keywords: Federated Learning, Riemannian manifolds, Symmetric positive definite matrices, deep learning
CentraleSupélec,
3, rue Joliot Curie,
91190 Gif-sur-Yvette
©2025 L2S - All rights reserved, reproduction prohibited.
