Federated Data Centers for AI-Native 6G Networks Employing Renewable Energy Surplus
CentraleSupélec/L2S is hiring a postdoc to work on the integration of 6G networks, edge data centers, and renewable energy sources. The project focuses on employing dispersed data centers to run AI-native 6G applications that exploit the energy surplus created by renewable energy sources. This approach is related to "green energy demand response", which shifts data center workloads to different locations based on the availability of renewable energy. However, whereas those approaches condition the shifting of data center loads on SLA renegotiation (since the loads moved across data centers are time-sensitive), the current proposal revolves around the exploitation of idle servers close to renewable sources whenever there is an energy surplus. In this setup, instead of moving the surplus across transmission lines or storing it in batteries, it is absorbed close to its production by servers that are switched on to run energy-intensive machine learning algorithms.

Machine learning, especially deep learning, is computationally intensive and requires significant amounts of energy during the training phase. This has raised concerns about the environmental impact of machine learning, especially as the demand for machine learning applications continues to grow. In this context, renewable energy surpluses can be used to power machine learning workloads, in particular the training phase, which is not time-sensitive. This can be achieved either by moving the raw data needed for training to data centers close to the renewable energy surplus, or by employing federated learning or transfer learning to reduce the amount of data that must be moved and the energy this movement consumes. Federated learning allows multiple parties to collaboratively train a model without sharing their raw data: the model is trained locally on each device or server, and the resulting updates are aggregated into a global model.
This approach can significantly reduce the amount of data that needs to be transmitted, and thus the energy consumption associated with data transfer, while only the local servers with a renewable energy surplus need to run at any given time to train the model. Transfer learning, which leverages pre-trained models to solve new tasks, can also be employed in similar setups: a model trained on a large dataset is adapted to a new dataset by fine-tuning it on the new data. This can significantly reduce the data and compute resources needed compared to training a model from scratch, since the pre-trained model has already learned useful features that can be applied to the new task.
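The federated aggregation described above can be sketched in a few lines. The linear model, learning rate, and three-client split below are illustrative assumptions for the sketch, not project specifics; the point is that only model weights, never raw data, travel between the "data centers" and the aggregator.

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: a few gradient steps on a linear model (MSE loss)."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_average(client_weights, client_sizes):
    """FedAvg-style aggregation: average client models weighted by data size."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Toy run: three "data centers", each holding its own local dataset.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ true_w))

global_w = np.zeros(2)
for _ in range(20):
    # Each round: clients train locally, then send only their weights.
    updates = [local_update(global_w, X, y) for X, y in clients]
    global_w = federated_average(updates, [len(y) for _, y in clients])

print(np.round(global_w, 2))  # should approach true_w = [2.0, -1.0]
```

Per round, each client transmits a weight vector rather than its 50 raw samples; in the envisaged setup, only the clients co-located with an energy surplus would perform the local update step.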
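Similarly, the transfer learning idea can be sketched as fitting only a small new head on top of a frozen feature extractor. The fixed random projection below is a stand-in for features learned on a large source dataset, and all names and sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Frozen "pre-trained" feature extractor (a fixed projection standing in
# for layers learned on a large source dataset; it is never retrained).
W_pretrained = rng.normal(size=(10, 4))

def features(X):
    return np.tanh(X @ W_pretrained)

# Fine-tuning on the new task: fit only a small linear head on the
# target data, leaving the pre-trained weights untouched.
X_new = rng.normal(size=(100, 10))
y_new = np.sin(X_new[:, 0])
Phi = features(X_new)
head, *_ = np.linalg.lstsq(Phi, y_new, rcond=None)

y_pred = Phi @ head
print(head.shape)  # only 4 trainable parameters instead of the full model
```

Because only the head is trained, both the compute at the surplus-powered site and the data that must reach it are a small fraction of what training from scratch would require.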
Task 1: Identifying energy and business models for the viability of transmitting surplus green energy across transmission lines, versus transmitting raw data to data centers in the vicinity of the green surplus.
Task 2: Exploring the feasibility of moving raw data and training models running on virtual machines across data centers in response to green energy production.
Task 3: Providing early results on the feasibility of federated learning and transfer learning models running across federated data centers that switch on and off in response to green energy production.
The postdoc will be hired by CentraleSupélec (L2S laboratory, Gif-sur-Yvette, France) and co-supervised by Alexis Aravanis and Salah El Ayoubi. A PhD in applied mathematics or communications is required.