Distributed learning methods in machine-type communications
Thesis event information
Date and time of the thesis defence
Place of the thesis defence
TA105, Linnanmaa campus
Topic of the dissertation
Distributed learning methods in machine-type communications
Doctoral candidate
Master of Science Matheus Valente da Silva
Faculty and unit
University of Oulu Graduate School, Faculty of Information Technology and Electrical Engineering, CWC - Radiotechnology
Subject of study
Communications engineering
Opponent
Professor Sergey D. Andreev, Tampere University
Custos
Associate Professor Hirley Alves, University of Oulu
Improving Communication Efficiency Using Machine Learning
As we move into the 6G era, smart cities and smart industries rely more and more on devices that talk to each other, such as sensors, machines, and meters. These devices often have limited battery life and simple processors, so they need efficient ways to communicate.
This research explores how artificial intelligence, especially a type called machine learning, can help make these systems smarter and more efficient. One focus is on a method where devices learn from their environment and improve over time, even without a central controller. Another promising idea is federated learning, where devices train their own models locally and share only the results, protecting privacy and saving data.
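The federated learning idea described above can be sketched in a few lines of plain Python: each device fits a tiny model to its own private data, and only the learned parameters (never the data itself) are sent to a server that averages them. This follows the standard federated averaging (FedAvg) recipe purely as a minimal illustration; the linear model, learning rate, and data below are invented for the example and are not taken from the thesis.

```python
# Minimal federated averaging (FedAvg) sketch.
# Each device holds private data and trains a tiny linear model locally;
# only the learned weight is shared, and the server averages the weights.

def local_train(data, w=0.0, lr=0.04, epochs=50):
    """Fit y = w * x by gradient descent on one device's private data."""
    for _ in range(epochs):
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def federated_round(datasets, w_global=0.0):
    """One round: devices train locally, the server averages the weights."""
    local_weights = [local_train(data, w=w_global) for data in datasets]
    return sum(local_weights) / len(local_weights)

# Three devices, each observing noisy samples of the same trend y ≈ 2x.
datasets = [
    [(1.0, 2.1), (2.0, 4.0)],
    [(1.0, 1.9), (3.0, 6.2)],
    [(2.0, 3.9), (4.0, 8.1)],
]

w = 0.0
for _ in range(5):
    w = federated_round(datasets, w)

print(round(w, 2))  # close to the shared slope of 2
```

Note that no device ever reveals its raw measurements; the server only ever sees trained weights, which is the privacy benefit the paragraph above refers to.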
But even sharing these models can put pressure on the network. To ease that burden, the research investigates techniques that let multiple devices transmit their updates at the same time, which saves energy and extends battery life.
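One family of such techniques is over-the-air computation, where the wireless channel's natural superposition of simultaneous analog transmissions delivers the sum of the updates to the receiver directly. The summary above does not name the exact method used in the thesis, so the following plain-Python sketch is an illustrative assumption with an idealized channel, not the thesis's algorithm.

```python
import random

# Idealized over-the-air aggregation sketch: when devices transmit analog
# model updates simultaneously, the wireless channel superimposes (adds)
# the signals, so the receiver obtains the sum in a single channel use.

random.seed(0)
num_devices = 10
updates = [random.gauss(0.0, 1.0) for _ in range(num_devices)]

# Conventional scheme: one transmission per device -> N channel uses,
# after which the server adds the received updates itself.
sequential_channel_uses = num_devices

# Over-the-air scheme: everyone transmits at once -> 1 channel use.
# The channel performs the summation "for free"; a small receiver-noise
# term stands in for real-world imperfections.
simultaneous_channel_uses = 1
received = sum(updates) + random.gauss(0.0, 0.01)
average_update = received / num_devices

print(f"{sequential_channel_uses} channel uses reduced to "
      f"{simultaneous_channel_uses}; averaged update = {average_update:.3f}")
```

Collapsing N transmissions into one is where the energy saving comes from: each device is on the air for a fraction of the time the conventional scheme would require.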
The ultimate goal of this research is to build new systems that boost network performance and make sure these devices can work smarter, faster, and longer, all without draining their batteries or overwhelming the network.
Last updated: 7.4.2025