Finnish universities collaborate on the first extensive 6G white paper on large language models
Contributions are invited to enrich the dialogue on themes such as the integration of LLMs into 6G infrastructure, their impact on data security, sustainability improvements, and the development of new applications across sectors. Additionally, the White Paper will scrutinise the societal, ethical, and economic aspects of integrating LLMs into 6G networks.
“LLM and GPT technologies are entering 6G networks, opening up a world of possibilities for how applications and services are experienced in real-time environments,” says Professor Jaakko Sauvola, 6G Flagship Ecosystem Leader at the University of Oulu.
“This work is pioneering the integration of LLMs and 6G technology, aiming to radically improve network capabilities and support the development of next-generation applications. We envision a new interconnect for configuring, adapting and connecting large numbers of LLMs within and across the network,” explains Professor Sasu Tarkoma from the University of Helsinki.
Call for contributions and publication details
This initiative is a strategic move towards uniting interdisciplinary expertise to fully understand the role of LLMs in the 6G framework. Submissions from researchers and industry professionals will be accepted until January.
For comprehensive submission details, please refer to our Call for Contributions document.
The collective knowledge from these universities will guide the research community in creating a detailed, actionable strategy for the adoption of LLMs in 6G technology.
The completed White Paper is scheduled to be published at the 6GSymposium in Levi in April. The symposium will serve as a significant platform for sharing this comprehensive work with the global community. For further details on the symposium, please visit 6GSymposium Spring 2024.
Contribute to this effort to advance 6G technology, ensuring it is secure, sustainable, and beneficial for society at large. For more information on the topic, see the recent preprint by the editors.