News

[FOCUS 5G/6G] From 5G to 6G: the inexorable advance of generations

March 3, 2025 - Big Data & AI - Cybersecurity - Industry of the future - Media of the future - Intelligent mobility - Networks & IoT - Digital health - Smart City

In this dossier, you'll find out more about the technical challenges, as well as questions of optimization and sobriety, that accompany the transition from 5G to 6G. Between the explosion of the IoT, the emergence of the metaverse and the imperative of energy sobriety, we take a look at the future of telecommunications.

The rollout of 5G is not just about ultra-fast speeds and reduced latency; above all, it symbolizes the emergence of new uses that are gradually reshaping the global economy and our ways of life. As 5G coverage expands, the range of opportunities widens: connected factories, surgical robots, autonomous vehicles, safer and more sustainable cities... Behind these promises lies a major challenge: how can we design and manage more powerful networks while controlling their economic, environmental and societal impacts?

Olivier Boissier, professor of computer science at Mines Saint-Étienne, and Guillaume Lozenguez, researcher at IMT Nord Europe, provide essential insights into connected objects, which are increasingly present in our daily lives. Through the FITNESS project, the two scientists share their vision of future communication protocols and infrastructures designed to absorb the explosion of sensors. This technological context is also conducive to the deployment of virtual worlds: promising new areas of exploration. Marius Preda, a researcher in augmented reality at Télécom SudParis and involved in the 5GMetaverse project, explains how 5G and immersive technologies will support the evolution of collaborative virtual environments for industry.

The development of infrastructures capable of supporting such innovations also requires technological improvements. Djamal Zeghlache, professor of networks and services at Télécom SudParis and leader of the NF-MUST project, deciphers the concept of "slicing", or virtual network segmentation, a key technology for the implementation of multi-sector 5G, meeting the challenge of a variety of large-scale uses. This naturally raises the question of the energy and environmental limits of networks. For many years, Joe Wiart, a researcher at Télécom Paris specializing in dosimetry, has been studying exposure to electromagnetic fields generated by 5G infrastructures. His expertise is being put to good use in the JEN project, which explores how to reduce these impacts to a strict minimum, while optimizing energy consumption.

In 2025, the National Center for Networks and Systems for Digital Transformation was launched, with the aim of articulating the multi-school, multi-disciplinary and multi-partner activities of the Institut Mines-Télécom - in strong synergy with the industrial world - in the field of future communications networks and distributed systems for digital transformation.

While reducing energy consumption, managing diverse communications and innovating in the metaverse still offer many opportunities, the challenges of future networks such as 6G are vast and provide inexhaustible fertile ground for research. In 2023, the Institut Mines-Télécom was entrusted by the French Ministries of the Economy, Finance and Industrial and Digital Sovereignty, and of Higher Education and Research, with the design and management of the future France 6G platform.

Building the IoT networks of the future

The FITNESS project aims to develop networks capable of dynamically adapting to the needs of massive IoT, industrial IoT and connected transport. The project draws on the expertise of several IMT schools to meet the challenges of the densification of connected objects, application robustness, energy management and varied protocols, and to optimize network performance in real time.

A factory where every machine, every sensor and every vehicle is connected, sharing information in real time to optimize processes or ensure safe travel. This is one of the promises of the Internet of Things (IoT), which, backed by 5G, is set to transform the industrial and mobility sectors. Supported by the PEPR "5G and Networks of the Future" program, the FITNESS project, which brings together several schools from the Institut Mines-Télécom, CEA, CNRS and Inria, aims to turn this vision into reality by developing solutions adapted to the challenges of today and tomorrow.

The IoT systems developed as part of the project must meet the needs of mission-critical applications in both industry and mobility, where reliability, low latency and resilience are essential criteria. "Industry 4.0 relies on robust communications to guarantee the continuity of critical processes, such as production line management. A failure could have serious consequences for safety," explains Olivier Boissier, a member of the Laboratoire d'informatique, de modélisation et d'optimisation des systèmes (LIMOS CNRS UMR 6158), Professor of Computer Science at Mines Saint-Étienne, and co-leader of the FITNESS project. Similarly, in the mobility sector, IoT systems need to operate reliably in environments that are sometimes high-density, such as public transport or autonomous vehicles.

Massive IoT: how to manage density?

When it comes to the Internet of Things, dense deployments or "massive IoT" refer to environments where the concentration of connected objects is very high, such as in logistics warehouses, smart cities or transport infrastructures. The range of these IoT objects is extremely broad, and includes sensors, actuators, robots and communication devices. In a massive IoT scenario, the number of connected devices that need to coexist harmoniously can reach thousands, or even millions.

The challenges of providing enriched services despite the density of connected objects are numerous, but include energy management, interoperability and interference reduction. The challenge is to ensure that each object functions autonomously while communicating efficiently, without causing disruption to the network.

A typical example is the maintenance hangars of major companies (such as SNCF), where hundreds of sensors monitor the condition of trains. "These places are huge, and the large quantity of IoT objects creates a high demand on the physical resources that support communication. So we need to allocate frequency bands and coordinate who communicates when, so that everything works optimally," explains Olivier Boissier.

Energy issues and resource optimization

For large companies like SNCF, energy management is a central issue, and massive IoT is no exception. IoT objects are constrained devices, particularly in terms of energy resources. They are often battery-powered, and one of the key challenges is to ensure that the battery lasts as long as possible. One solution is to make them operate intermittently: "objects are put on standby when not in use, and only come back on when there are signals to be sent," explains Olivier Boissier. Another option: objects can communicate over shorter distances, with a less powerful signal. The information must then be captured close to the object, using a mobile robot for example.
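A minimal sketch, with purely illustrative current and capacity figures (assumptions for illustration, not FITNESS measurements), of how far the duty cycling described above can stretch battery life:

```python
# Hypothetical sketch: estimating battery life for a duty-cycled IoT sensor.
# All figures (currents, battery capacity) are illustrative assumptions.

def battery_life_hours(capacity_mah: float,
                       active_ma: float, sleep_ma: float,
                       duty_cycle: float) -> float:
    """Runtime in hours, given the average current draw under a duty cycle."""
    avg_ma = duty_cycle * active_ma + (1 - duty_cycle) * sleep_ma
    return capacity_mah / avg_ma

# A sensor drawing 20 mA when transmitting and 0.01 mA in standby,
# on a 2400 mAh battery:
always_on = battery_life_hours(2400, active_ma=20, sleep_ma=0.01, duty_cycle=1.0)
one_percent = battery_life_hours(2400, active_ma=20, sleep_ma=0.01, duty_cycle=0.01)

print(f"always on: {always_on:.0f} h")        # 120 h (5 days)
print(f"1% duty cycle: {one_percent:.0f} h")  # 11434 h (about 16 months)
```

The point of the sketch is that average current, not peak current, dominates battery life: a 1% duty cycle buys roughly two orders of magnitude of autonomy.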

These optimization issues also apply to computing and communication resources. IoT devices cannot process all data: some only capture information and transmit it, while others, such as mobile robots, move and act on their environment. Needs thus vary according to the scenario, requiring constant adjustments to transmission strategies.

This calls for a wide range of solutions, capacities and energy consumption. "Some techniques will offer very low latency, provided the data volume is not too large, while others will guarantee the processing of huge volumes, sometimes at the expense of speed," explains Guillaume Lozenguez, also a researcher in computer science at IMT Nord Europe and involved in one of the FITNESS project's work packages. "What's interesting is the dynamic nature of IoT networks, and how switching from one technique to another according to needs will affect the architecture," he adds.

A wide range of communication protocols

Once the need has been identified and the constraints established, the next step is to choose the right technology. Each protocol - including Wi-Fi, LoRa (Long Range, the radio layer behind LoRaWAN networks), Narrowband IoT (NB-IoT) and 5G - has its own specific features in terms of range, energy consumption and bandwidth. For example, LoRa is ideal for long-range, low-energy communications, while Wi-Fi is preferable for short-range, high-volume data exchanges.
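As a rough illustration of this matching exercise, the sketch below encodes order-of-magnitude characteristics for these protocols (illustrative values only, not normative figures) and filters them against an application's range and throughput needs:

```python
# Illustrative sketch of matching an application's constraints to a protocol.
# The figures below are rough orders of magnitude, not normative values.

PROTOCOLS = {
    # name: (typical range in m, typical throughput in kbps, relative energy cost)
    "LoRa":   (10_000,        50, "low"),
    "NB-IoT": (10_000,       100, "low"),
    "Wi-Fi":  (   100,   100_000, "high"),
    "5G":     ( 1_000, 1_000_000, "high"),
}

def candidates(min_range_m: float, min_kbps: float) -> list[str]:
    """Protocols whose typical range and throughput both cover the need."""
    return [name for name, (rng, kbps, _) in PROTOCOLS.items()
            if rng >= min_range_m and kbps >= min_kbps]

# A remote, low-rate sensor vs. a nearby, data-hungry device:
print(candidates(min_range_m=5_000, min_kbps=10))    # ['LoRa', 'NB-IoT']
print(candidates(min_range_m=50, min_kbps=20_000))   # ['Wi-Fi', '5G']
```

A real selection would also weigh energy cost, licensing, and deployment constraints; the table form simply makes the trade-off explicit.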

"Historically, companies have often adopted different technologies to meet their specific needs, each with its own advantages and disadvantages," explains Guillaume Lozenguez. By offering advantages over other protocols in terms of latency, throughput, management of massive communications and security, 5G may be the ideal standard. But the question of usage remains central: "We have to ask ourselves whether it's worth deploying these new functionalities if there's no use behind them," stresses Olivier Boissier. "Because such changes have a cost, both financial and environmental."

Dynamic switch: ensuring optimum connectivity

FITNESS teams are looking to combine these different modes of communication, and are working on systems capable of switching from one protocol to another according to real-time needs. "If a robot needs to transmit high-resolution images, it can switch to a Wi-Fi or 5G connection to guarantee sufficient throughput, then switch back to less energy-intensive communication once the transmission is complete," illustrates Guillaume Lozenguez.
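A toy sketch of such a switch, assuming a hypothetical robot with one low-power link and one high-rate link (the threshold and protocol pairing are illustrative, not FITNESS design values):

```python
# Hedged sketch of a per-transmission protocol switch: use the high-rate
# link only for large payloads, and fall back to the frugal one otherwise.

LOW_POWER = "LoRa"      # default, energy-frugal link (illustrative choice)
HIGH_RATE = "Wi-Fi"     # used only while a large payload is in flight

def pick_link(payload_bytes: int, high_rate_available: bool,
              threshold: int = 10_000) -> str:
    """Switch to the high-rate link only for large payloads, when reachable."""
    if payload_bytes > threshold and high_rate_available:
        return HIGH_RATE
    return LOW_POWER

# A robot sends a small status beacon, then a high-resolution image:
print(pick_link(200, high_rate_available=True))         # LoRa
print(pick_link(2_000_000, high_rate_available=True))   # Wi-Fi
print(pick_link(2_000_000, high_rate_available=False))  # LoRa (degraded fallback)
```

A production system would add hysteresis and association-time costs; the point here is only that the decision is made per transmission, not once at deployment.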

Where does AI fit into all this? It naturally has a role to play in this dynamic optimization. It can be deployed both in the lower layers, to improve bandwidth management, and in the higher layers, to make the most of transmission capacities when deploying applications. AI enables systems to better coordinate resources, using techniques such as machine learning or knowledge graphs. It thus provides a powerful additional tool for improving the performance of IoT networks and their flexibility in the face of the dynamic needs raised by industry and the mobility sector.

Mobile robots, a concrete use case

One use case explored by FITNESS is mobile robotics: more specifically, a fleet of robots capable of automatically deploying themselves in a space, such as a warehouse, and mapping the quality of connectivity within that space. "These robots will carry their own communications network, and will be able to identify the best areas of network coverage," says Guillaume Lozenguez. This approach will make it possible to adapt dynamically to the needs of the environment and ensure optimal connectivity, even in complex situations. For example, in an industrial environment where machines and obstacles may change location, the ability of robots to adapt and reorganize network coverage will be a major advantage.

The industry of the future invites itself into the metaverse

The metaverse is no longer limited to online games or social interactions: it is becoming a key technology for industry. Through the 5GMetaverse project, five Institut Mines-Télécom schools are seeking to adapt tomorrow's networks to the needs of augmented and virtual reality. In particular, the project aims to develop concrete solutions for industrial process optimization, remote assistance and man-machine collaboration.

Since the appearance of the first immersive online platforms in the 2000s, the metaverse has undergone several waves of evolution. Initially designed for entertainment or social interaction, these virtual environments are now attracting growing interest from industry. Their promise: to reproduce and synchronize physical environments in real time within a virtual world.

This promise is crucial, as it improves decision-making through accurate simulations, accelerates innovation and strengthens collaboration by making complex data more accessible and visual. Unlike "classic" industrial digital twins, the metaverse's evolving models can capture and integrate the slightest events (changes on a machine, a person passing by...) to guide operational decisions. The 5GMetaverse project, which brings together five schools from the Institut Mines-Télécom (IMT), several Airbus subsidiaries and Orange, seeks to prepare and promote 5G for the needs of the metaverse, particularly in an industrial context.

A dynamic version of industrial digital twins

Indeed, the metaverse offers many opportunities for industry, such as the simulation and optimization of processes prior to their application in the real world. "There's a lot to be gained from simulating the different settings of a machine to optimize its operation in a controlled and controllable virtual environment," argues Marius Preda, a researcher specializing in augmented reality at Télécom SudParis and involved in the 5GMetaverse project.

Tele-operation and tele-assistance are also promising applications, and are among the use cases of the 5GMetaverse project. For example, an operator equipped with augmented reality (AR) glasses can be guided by a remote expert thanks to a virtual representation of the industrial environment. "The expert moves through a real-time digital twin of the installation, and his interactions with the virtual elements are transmitted to the operator on site," Marius Preda explains.

In addition to efficiency gains, these technologies also enhance human-machine collaboration. As industry moves towards increased robotization, the metaverse offers an interface that facilitates human-robot interactions, while exploiting human adaptability in environments where robots remain limited. To achieve this, however, the metaverse must overcome major technological hurdles, and in particular satisfy increasingly demanding technical constraints in terms of latency and throughput.

Latency and throughput, key indicators of responsiveness

In a context where any action in the real world must be propagated instantaneously to all users of the metaverse, latency and throughput requirements are indeed particularly critical. "In a Zoom meeting, a quality video exchange requires around 1 Mbps per user. But in a metaverse, where all body movements have to be transmitted, we're talking about at least 20 Mbps per user for a realistic representation," explains Marius Preda. "Similarly, latency must be very low to ensure a fluid experience. If the animation of my body happens even half a second after what I say, it's very visible."
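The per-user figures quoted above translate directly into aggregate capacity requirements. A back-of-the-envelope sketch (ignoring protocol overhead and any multiplexing gains) illustrates the jump:

```python
# Back-of-the-envelope sketch using the per-user figures quoted above
# (~1 Mbps for a video call, ~20 Mbps for full-body metaverse capture).

def aggregate_mbps(users: int, per_user_mbps: float) -> float:
    """Naive aggregate demand: every user's stream reaches every session."""
    return users * per_user_mbps

video_call = aggregate_mbps(10, 1.0)
metaverse = aggregate_mbps(10, 20.0)
print(f"10-person video call: {video_call:.0f} Mbps")  # 10 Mbps
print(f"10-person metaverse:  {metaverse:.0f} Mbps")   # 200 Mbps
```

Even for a ten-person session, the linear scaling turns a modest video-conference load into a demand that strains a shared access link, which is why the project targets 5G rather than generic connectivity.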

These needs obviously vary depending on whether we're talking about virtual reality (VR) or AR. While VR immerses the user in an entirely virtual environment, AR superimposes virtual elements on reality. VR therefore requires a lot of content and high throughput, as it recreates the entire universe, whereas AR is less demanding but requires immediate synchronization with reality. In the 5GMetaverse project, VR is used for the total immersion of experts, and AR for operators in the field, posing different constraints on latency and throughput.

Optimizing flows and prioritizing critical data

The other challenge lies in data management. It's not just a question of ensuring rapid transmission, but also of identifying priority information. "In an industrial environment, some data, such as machine control, are more important than others," emphasizes Marius Preda. "Transmission protocols must integrate this prioritization to optimize data flows."

But here again, requirements vary according to use: for example, the needs of a teleoperator interacting with a machine are different from those of a trainer, for whom a little latency is tolerable. Similarly, some machines require a high degree of interactivity, while others are more autonomous. It is therefore necessary to involve business experts in labeling critical information so that the metaverse can process it as a priority, ensuring optimized transmission according to the importance of each piece of data.
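One simple way to realize this labeling, sketched here with hypothetical message classes and a standard priority queue (an illustration of the principle, not the project's actual protocol machinery):

```python
# Illustrative sketch of the prioritization described above: business
# experts label each message class with a priority, and the transmitter
# drains the most critical traffic first. Class names are hypothetical.

import heapq

PRIORITY = {"machine_control": 0, "telemetry": 1, "training_video": 2}

class PrioritizedSender:
    def __init__(self):
        self._queue = []
        self._seq = 0  # tie-breaker so equal priorities stay FIFO

    def submit(self, kind: str, payload: str) -> None:
        heapq.heappush(self._queue, (PRIORITY[kind], self._seq, payload))
        self._seq += 1

    def next_to_send(self) -> str:
        return heapq.heappop(self._queue)[2]

s = PrioritizedSender()
s.submit("training_video", "frame-17")
s.submit("machine_control", "stop-spindle")
s.submit("telemetry", "temp=74C")
print(s.next_to_send())  # stop-spindle (machine control preempts the rest)
```

The labeling table (`PRIORITY`) is exactly the part that, as the text notes, must come from business experts rather than from the network layer.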

Key technologies for a metaverse that meets industrial challenges

Several technologies are involved in meeting these requirements. Compression technologies play an important role in compacting information and minimizing transmission times. As for 5G, it enables better flow management thanks to network slicing, which intelligently partitions data flows according to needs. However, these technologies remain limited, and future developments, notably towards 6G, will be necessary to meet the various use cases and support widespread adoption.

In particular, these developments should facilitate interoperability between different metaverses. In the case of remote assistance, for example, the expert is likely to move from one metaverse to another to intervene on different sites: the management of their profile and access rights must be fluid. They should also support the "by design" integration of multiple users, essential in environments such as the virtual factory, which is expected to host around ten users, or the virtual store - the second use case in the 5GMetaverse project - which could potentially host thousands.

Finally, artificial intelligence (AI) is also an essential lever for exploiting the full potential of the industrial metaverse, intervening at several levels, such as data classification to optimize processing, or the representation of 3D content. "AI can, for example, generate a faithful 3D representation of a machine or part from a simple image, thus easing throughput constraints, since certain data will no longer be transmitted but generated," illustrates Marius Preda.

The "fifth wave" objective

While the industrial metaverse offers promising prospects, its widespread application remains a medium- to long-term objective. "The main aim of the metaverse is to immerse people in the virtual world, not to turn it into a second Zoom," says Marius Preda. "For this, the interfaces provided by virtual reality headsets are indispensable." However, current devices are still cumbersome and unsuitable for prolonged use in an industrial context.

The future of the metaverse in industry will therefore depend on several simultaneous advances: on the hardware front, the miniaturization of equipment and its widespread adoption; on the network front, the integration of new protocols, notably with 6G; and finally, advances in AI and modeling to enrich the available applications. We will have to wait for the "fifth wave" of the metaverse [see box], if we are to believe the researcher: "not for another 7 to 10 years, then."

From the beginnings of the Internet to Meta: evolution in waves

In the early 2000s, with the advent of the Internet and the first experiments in online gaming, the web was immediately seen as an ideal platform for developing 3D encounter worlds. But at the time, neither the networks nor the hardware were capable of delivering fluid experiences. A few years later, the concept of parallel universes returned to the spotlight with the successful Second Life application.

With no specific objective or scenario, the Second Life experience was essentially about exchanging with other users, discovering the universe, and so on. "The phenomenon went far beyond the geek and technophile community. Everyone wanted to have a place there and live a 'second life' - even banks were opening branches!" recalls Marius Preda. Unfortunately, the real caught up with the virtual, and the banking crisis of 2008 put paid to this fleeting craze.

Most recently, under the banner of Facebook, now known as "Meta", the metaverse has been making a comeback. However, despite the power of today's computers and networks, its deployment remains hampered by the unavailability of suitable hardware: "It's unlikely that this wave will establish the metaverse in everyday life, due to the limitations of visualization devices. But we can make progress on networks, as well as content creation and transmission," the researcher concedes. "Technological development is always incremental, and each wave brings its share of knowledge that helps to improve the next."

No more, no less: sober networks for 5G

Faced with the rise of 5G and the challenges of 6G, energy sobriety and exposure to electromagnetic fields are becoming major issues for tomorrow's networks. Between infrastructure optimization, mobile sensors and predictive models, scientists are striving to reconcile performance, reduced consumption and quality of service. The Just Enough Network project, co-sponsored by Télécom Paris, aims to build networks that are "just enough", adjusted to our needs, without excess or waste.

As mobile networks evolve into ever more powerful generations - from 4G to 5G, with 6G on the way - concerns are growing about their impact on our environment and health. For around forty years, each decade has brought a new generation of mobile networks, with progress focused above all on improving performance. For a long time, these developments have taken place without any real consideration for the energy consumption or electromagnetic emissions of these networks.

The first generations of mobile networks were designed solely to meet technical requirements in terms of quality of service. "When the 2G and 3G networks were launched, consumption was not an issue. The aim of GSM handsets was to transmit as much as possible in order to be picked up by base stations," recalls Joe Wiart, a researcher at Télécom Paris specializing in dosimetry. It wasn't until 4G that efforts were made to better control energy, notably by switching off certain equipment when demand was low, for example at night. But with the advent of 5G and the imminent arrival of 6G, these efforts may not be enough. The growing complexity of infrastructures and the new uses foreseen require us to go even further to reconcile performance and responsibility.

This is the background to the Just Enough Network (JEN) project, part of the PEPR "5G and Networks of the Future" program. Its proposal: create truly "agile" networks, capable of adjusting their consumption according to the precise needs of users, while guaranteeing optimum quality of service. "Connected objects and networks must therefore be efficient, to minimize both energy consumption and the levels of exposure to electromagnetic fields induced by these devices," points out the Télécom Paris researcher and project co-leader. "In short, we have to look for sobriety!" This is what the JEN project sets out to achieve, by imagining networks that consume and emit "just what's needed", without excess or waste.

A compromise between service quality, energy efficiency and emissions

Energy efficiency is therefore a central focus for the various CNRS, CEA and IMT research teams involved in the project. Several strategies are being implemented, from the optimization of IoT sensors to global infrastructure management. IoT sensors, for example, individually consume little, but represent a major energy challenge when deployed on a large scale. The aim is to maximize their autonomy while reducing their overall energy impact.

At the same time, teams are exploring approaches aimed at dynamically adapting the energy consumption of infrastructures. After all, "optimized energy consumption does not imply a system that is sober in every respect. Even if the system is efficient, if it's on all the time, the whole network consumes and emits," explains Joe Wiart. The various working groups are therefore developing models that adjust equipment to real needs, avoiding unnecessary consumption, but also prolonged exposure to electromagnetic fields (EMF), the health effects of which are still unknown. The aim is to find the best compromise between quality of service, consumption and exposure.

Minimizing exposure: a social issue and a headache

Exposure to EMF is a fundamental socio-technical issue for the deployment of 5G, and more generally for telecoms technologies. While no evidence of harmful effects has yet been established, the issue remains very much in the public eye. Indeed, trust is a significant factor in the social acceptability of a technology, and this trust is based in particular on the guarantee of its harmlessness. However, finding the optimum balance between minimum exposure and acceptable quality of service is actually a very delicate matter.

At first glance, reducing antenna density mechanically reduces network-induced exposure, but also affects quality of service. Inadequate coverage forces telephones to transmit at higher power levels to maintain the connection, thus canceling out the benefits of reducing the number of antennas. Another solution is to multiply the number of low-power transmitters, but this has repercussions on the overall energy footprint, since all these transmitters have to be built and operated. The JEN project therefore seeks to strike the best possible balance between these different constraints to offer a solution that is globally acceptable, both in terms of energy and health.
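The uplink-power side of this tradeoff can be sketched with a free-space path-loss model (a simplifying assumption; urban propagation typically decays faster, with exponents of 3 to 4, which only worsens the effect):

```python
# Hedged sketch of the antenna-density tradeoff under a free-space
# path-loss assumption. Distances and frequency are illustrative.

import math

def fspl_db(distance_m: float, freq_mhz: float) -> float:
    """Free-space path loss in dB (distance in metres, frequency in MHz)."""
    return 20 * math.log10(distance_m) + 20 * math.log10(freq_mhz) - 27.55

# Halving antenna density roughly doubles the typical phone-to-antenna
# distance, which under free-space propagation costs 6 dB (i.e. 4x)
# of extra uplink transmit power at a 3.5 GHz carrier:
extra_db = fspl_db(500, 3500) - fspl_db(250, 3500)
print(f"extra uplink power needed: {extra_db:.1f} dB")  # 6.0 dB
```

This is the mechanism behind the paradox in the text: fewer antennas push the phones themselves to radiate more, partly cancelling the expected reduction in exposure.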

EMF: bad reception

This balance is all the more difficult to strike because measuring EMF in the local environment poses fundamental methodological problems: "All the work currently being carried out is limited to one-off measurements aimed solely at compliance with standards. For example, if you make a phone call from a cellar - where the phone transmits more to reach the base station than it would from a garden - is the average radiated power well below the regulatory limits?" However, the aim of the JEN project is not to limit itself to compliance with threshold values, but "not to have more exposure than necessary," the researcher points out. Scientists are therefore trying to build advanced indicators to assess exposure in a global and dynamic way.

There are two possible approaches to capturing EMF levels: the use of fixed IoT networks to build a global model, or the use of distributed mobile sensors. The first approach, tested in a previous project around the Massy-Palaiseau (91) railway station, demonstrated the need for a very tight grid of sensors to obtain a reliable measurement, which proves costly and restrictive. In view of this, JEN's teams opted for the development of mobile sensors: devices mounted on vehicles, enabling real-time measurement of EMF exposure in urban environments.

Mobile solutions for dynamic monitoring

"The system we have developed consists of an antenna connected to a spectrum analyzer and a small PC. The antenna is fixed to the roof of a car, enabling us to capture exposure while driving along bus and taxi routes," describes Joe Wiart. While mobile sensing poses a few methodological questions - such as the speed at which the vehicle must travel to ensure acquisition quality - this solution is unquestionably more agile than deploying an IoT network: it requires far fewer sensors, while covering vast, dense areas.

Another advantage is that it enables us to track variations in exposure over time, and identify areas where levels are continuously high. "We can detect hot spots of exposure and readjust the network configuration to reduce these levels," says Joe Wiart. This flexibility is crucial to ensure that exposure levels remain well below recommended thresholds, even in highly concentrated environments. At the same time, the project is developing specific sensors to assess exposure induced by cell phones.
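A minimal sketch of how such drive-test samples might be binned into grid cells to flag hot spots (coordinates, cell size and the threshold are hypothetical; the project's actual exposure indicators are far more elaborate):

```python
# Illustrative sketch: aggregate mobile EMF measurements into grid cells
# and flag cells whose mean field strength exceeds a chosen threshold.

from collections import defaultdict

CELL_M = 100  # grid resolution in metres (illustrative)

def hot_spots(samples, threshold_vpm):
    """samples: iterable of (x_m, y_m, field_v_per_m).
    Returns {cell: mean field} for cells whose mean exceeds the threshold."""
    cells = defaultdict(list)
    for x, y, e in samples:
        cells[(int(x // CELL_M), int(y // CELL_M))].append(e)
    return {cell: sum(v) / len(v) for cell, v in cells.items()
            if sum(v) / len(v) > threshold_vpm}

# Four samples from a short drive; only the second cell is flagged:
drive = [(20, 30, 0.4), (80, 40, 0.5), (150, 60, 2.1), (160, 70, 1.9)]
print(hot_spots(drive, threshold_vpm=1.0))  # {(1, 0): 2.0}
```

Flagged cells are exactly the "hot spots" the quote refers to: places where the network configuration could be readjusted to bring levels back down.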

Ultimately, all these measurements will be used to design models and provide indicators to control user exposure and optimize equipment operation. In addition to managing periods of low use (particularly at night), so as to switch off unnecessary equipment, dynamic adaptation to demand in real time, while maintaining low levels of exposure, will be decisive for emerging uses. "For example, if there are autonomous vehicles on the road, we can't decide to turn everything off at night", Joe Wiart points out. This is what JEN's "just enough" approach is all about: adapting consumption and exposure according to use and local context, ensuring that every watt consumed, every electromagnetic field emitted, is necessary and justified.

Slicing: a MUST for multi-sector communications

In emergency situations such as a pile-up, coordination between rescue, health and transport services is essential. However, these players often use compartmentalized communication systems that hamper their efficiency. The NF-MUST project aims to create a dynamic, shared and flexible network architecture that enables fluid communication. It relies on network slicing and virtualization to meet the requirements of different sectors.

If you picture a pile-up on a freeway and its aftermath, you're probably not imagining all the telecommunications needs that such an event generates. And yet: emergency services need to coordinate, health services need to be informed in real time, transport operators need to reorganize traffic... All these players need to communicate effectively, both internally and with each other. Yet the systems used by these different bodies are often compartmentalized, slowing down decision-making and interventions in the field.

This situation highlights the need to set up network and service infrastructures capable of meeting the simultaneous demands of several players in different sectors. Among the flagship projects of the PEPR "5G and Networks of the Future" program, the NF-MUST project aims to meet part of this need by developing a service architecture adapted to the cooperation of multi-actor and multi-sector operations and communications.

It's a challenge because, in addition to specific technical requirements and the diversity of communication and security protocols, end-to-end coordination between sectors requires the involvement of several operators. "Knowing that a 5G or 6G operator already has heterogeneous technological segments at its disposal, it must concatenate them in order to establish end-to-end infrastructures [without interruption or discontinuity] capable of meeting users' needs," points out Djamal Zeghlache, Professor of Networks and Services at Télécom SudParis, initiator and leader of the NF-MUST project. "To interconnect several operators, we need to network networks! This means extending the notion of slicing to the multi-sector."

Borrowed from cloud computing, "slicing" has developed with 5G, driven by the virtualization of services and networks. Virtualization consists in dividing a single hardware base into independent virtual environments, thereby pooling physical resources. A router, for example, can be divided into compartments, each corresponding to a virtual router serving a particular user (tenant). Several tenants can then share the same equipment, while remaining isolated from each other. These virtualized environments offer a flexible, shared base on which to run a variety of applications or services - typically health or emergency services communications, without the need for dedicated hardware.

Slicing takes this concept a step further, by partitioning the entire network, including virtualized resources, into "slices" dedicated to "verticals" such as energy, healthcare or transportation. "The notion of sharing resources, such as computing, storage or memory, is extended to sharing services and network infrastructures", explains Djamal Zeghlache. In concrete terms, operators are using their 5G and 6G network infrastructures to offer customized slices to meet the requirements of different verticals or a particular tenant. "Like a mille-feuille in which each layer corresponds to a part of the network, dedicated to a sector or tenant", adds the researcher.

In this way, everyone benefits from their own slice, which includes not only the virtualization of network equipment, but also the dynamic allocation of specific resources (such as bandwidth or traffic-flow priority) according to their needs. This is what makes it possible, for example, to deploy 5G infrastructures at a smart port, where the players involved (the port authority, shipowners, crane operators, carriers...) share a network while benefiting from slices reserved for their respective operations. "These infrastructures are interconnected, in order to offer services tailored to the various players and stakeholders operating in the port environment," Zeghlache adds.
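The admission-control side of slicing can be caricatured in a few lines: tenants request isolated shares of a common capacity, and a request is refused rather than allowed to erode the other tenants' guarantees. Names and figures here are hypothetical, not drawn from NF-MUST:

```python
# Minimal sketch of the "mille-feuille" idea: tenants obtain isolated
# slices of a shared link capacity, with admission control so that one
# tenant's slice can never eat into another's guarantee.

class LinkSlicer:
    def __init__(self, capacity_mbps: float):
        self.capacity = capacity_mbps
        self.slices: dict[str, float] = {}

    def allocate(self, tenant: str, mbps: float) -> bool:
        """Grant a slice only if the shared capacity can still cover it."""
        if sum(self.slices.values()) + mbps > self.capacity:
            return False  # refuse rather than degrade existing slices
        self.slices[tenant] = mbps
        return True

# A smart-port scenario with three (hypothetical) tenants on one link:
link = LinkSlicer(capacity_mbps=1000)
print(link.allocate("port_authority", 400))   # True
print(link.allocate("crane_operators", 300))  # True
print(link.allocate("carriers", 400))         # False (would exceed capacity)
```

Real slicing spans radio, transport and core resources with per-slice QoS, but the guarantee-by-refusal behavior is the essence of the isolation the text describes.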

The challenge of multi-sector coordination

One of the ambitions of the NF-MUST project is therefore to extend the concept of slicing to a multi-sector scale. Sectors (energy, healthcare, transport, etc.) could then share a common infrastructure, while each benefiting from dedicated slices adapted to their specific needs.

This first involves understanding the context of a particular multi-sector use case: identifying the instantaneous needs of all the players, in order to transform them into a demand expressed to the networks. The networks then provide the environments in which the services required by the various sectors' business applications will run. Take the case of a motorway pile-up: the NF-MUST architecture will provide an end-to-end slice to ensure cooperation between the gendarmerie, the fire department, the emergency services, the motorway operator and so on; in other words, all the communication networks and services associated with the business operations of these players.

However, despite sharing network resources, these different players need to be able to isolate their sensitive data. NF-MUST's end-to-end architecture must therefore enable each to operate in its own environment, and to communicate flexibly with others. "Think of it as a building where each floor or space (staircase, room, corridor) is reserved for a different tenant, linked or separated from the others by gateways or partitions. In this environment, the configuration can be changed - partitions can be added, moved around, made watertight, etc. - so that each person can operate independently, or cooperate with a trusted partner," explains Djamal Zeghlache.
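The building analogy, with isolation by default and configurable gateways between trusted partners, can be expressed as a small Python sketch. Again, the class and tenant names are hypothetical illustrations, not NF-MUST's actual interfaces.

```python
class SliceEnvironment:
    """A tenant's isolated environment; data stays private unless a gateway is opened."""
    def __init__(self, tenant: str):
        self.tenant = tenant
        self._gateways: set[str] = set()          # trusted partners we may talk to
        self.inbox: list[tuple[str, str]] = []    # (sender, message) pairs received

    def open_gateway(self, partner: "SliceEnvironment") -> None:
        # Gateways are symmetric: both sides agree to cooperate.
        self._gateways.add(partner.tenant)
        partner._gateways.add(self.tenant)

    def send(self, partner: "SliceEnvironment", message: str) -> None:
        # Isolation by default: without a gateway, communication is refused.
        if partner.tenant not in self._gateways:
            raise PermissionError(f"{self.tenant} has no gateway to {partner.tenant}")
        partner.inbox.append((self.tenant, message))

# The pile-up scenario: two players cooperate through an explicit gateway.
gendarmerie = SliceEnvironment("gendarmerie")
fire_dept = SliceEnvironment("fire-department")
gendarmerie.open_gateway(fire_dept)
gendarmerie.send(fire_dept, "pile-up reported, lane closed")
```

Moving or removing a "partition" then amounts to opening or closing a gateway, without touching the underlying shared infrastructure.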

Towards dynamic resource management

The NF-MUST project's other ambition is to make slicing dynamic and automated, so as to adjust resources to users' immediate needs. Currently, slices are often pre-designed and pre-defined, then made available in a service catalog. These predefined slices are well identified: they meet specific needs and are chosen according to customer requirements. Dynamic slicing aims to go beyond this predefined approach.

This requires real-time analysis of the available infrastructure, to verify which resources are unallocated. These resources include both virtualized entities, which several customers can use simultaneously, and physical entities, which cannot be virtualized and must be shared successively between users. Slices are then configured according to requirements.
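The distinction between virtualized entities (usable by several customers at once) and physical entities (handed over to one user at a time) can be sketched as a small resource manager. This is an illustrative model under assumed names, not the project's real orchestrator.

```python
class Resource:
    """A network resource, either virtualizable (shareable) or physical (exclusive)."""
    def __init__(self, name: str, virtualizable: bool):
        self.name = name
        self.virtualizable = virtualizable
        self.holders: set[str] = set()  # tenants currently using the resource

class ResourceManager:
    """Checks, in real time, what is unallocated before a slice is configured."""
    def __init__(self, resources: list[Resource]):
        self.resources = {r.name: r for r in resources}

    def available(self, name: str) -> bool:
        r = self.resources[name]
        # Virtualized entities can serve several customers simultaneously;
        # physical entities must be freed before being reassigned.
        return r.virtualizable or not r.holders

    def acquire(self, name: str, tenant: str) -> bool:
        if not self.available(name):
            return False
        self.resources[name].holders.add(tenant)
        return True

    def release(self, name: str, tenant: str) -> None:
        self.resources[name].holders.discard(tenant)

# Hypothetical inventory: a shareable compute pool and an exclusive radio unit.
manager = ResourceManager([
    Resource("vCPU-pool", virtualizable=True),
    Resource("radio-antenna", virtualizable=False),
])
```

Two sectors can hold the compute pool at the same time, whereas the antenna must be released by one before the other can take it over, which is what "shared successively" means here.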

A catalog of customizable services

The next step is to make service catalogs available to users, so that they can put together their own slices according to their needs, like modular building blocks. When a customer expresses a need, whether technically or in natural language, the system translates and analyzes the request to identify the necessary components in the catalog. The catalog then functions as a knowledge base, containing the elementary building blocks from which the slice is composed. Each slice is built by assembling different elements found in the catalog, "like LEGO® bricks to be combined to meet the customer's request", as Djamal Zeghlache puts it, while ensuring that users are reliably authenticated and have the necessary access rights to their services.

In short, composing a slice involves discovering the services available, authenticating and understanding the customer's request, breaking it down into elementary bricks, and finding compatible components in the service catalog. Although the catalog is central to this process, there is still progress to be made in enriching this knowledge base and adapting it to multi-sector uses. "Even though many of us are working on the subject, service catalogs are not yet very extensive. The procedure is not fully automated. So, for the moment, it's a medium-term objective," concedes Djamal Zeghlache.
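The composition pipeline described above can be reduced to a short Python sketch: look up each requested brick in the catalog, check the customer's access rights, and assemble the result. The catalog entries, customer names and service names are all invented for illustration.

```python
# The catalog as a knowledge base of elementary service "bricks" (illustrative).
CATALOG = {
    "low-latency-link": {"latency_ms": 5},
    "bulk-storage": {"capacity_gb": 500},
    "video-stream": {"bandwidth_mbps": 50},
}

# Access rights: which bricks each authenticated customer may request.
AUTHORIZED = {"hospital": {"low-latency-link", "bulk-storage"}}

def compose_slice(customer: str, requested: list[str]) -> dict:
    """Assemble a slice from catalog bricks, checking rights along the way."""
    rights = AUTHORIZED.get(customer, set())
    slice_spec = {}
    for brick in requested:
        if brick not in CATALOG:
            raise KeyError(f"no compatible component for '{brick}' in the catalog")
        if brick not in rights:
            raise PermissionError(f"{customer} lacks access rights to '{brick}'")
        slice_spec[brick] = CATALOG[brick]
    return slice_spec
```

In the project's vision, the hard part lies upstream of this loop: translating a natural-language request into the list of bricks, and enriching the catalog itself, which is why the researcher describes full automation as a medium-term objective.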

NF-NAI, a complementary project

The NF-MUST project has been allocated 3.5 million euros, shared between Télécom SudParis, Inria, CEA-List and CNRS. It is closely linked with another project from the PEPR "5G and Networks of the Future" program, NF-NAI. Whereas NF-MUST focuses primarily on the management and coordination of multi-domain services, NF-NAI is concerned with setting up the network infrastructures needed to guarantee the execution of these services. In other words, NF-NAI provides the foundations, transport technologies and end-to-end interconnections on which NF-MUST relies to dynamically orchestrate services.

This complementarity enables NF-MUST to effectively transform user demands into concrete services, drawing on the robust infrastructures put in place by NF-NAI. In this way, the two projects ensure that physical infrastructure and service management work hand in hand, guaranteeing seamless interoperability and an effective response to multi-actor needs.



© 2022 Carnot Télécom & Société Numérique | Legal Notice