
Walking the line towards 100Tb/s

As the world continues to adapt to the pandemic, optical networks need to remain stable and reliable to provide the high levels of connectivity required for working and learning from home.

This is not something that will go away when we finally see an end to Covid; demand for bandwidth was rising well before 2020. However, it is true to say that the pandemic has accelerated the need for reliable connectivity and the networks that support it. Last year alone, the world spent a combined 1.3 billion years online, according to figures released by intelligence firm DataReportal.

This places a significant strain on bandwidth. When it comes to addressing this challenge, Marcin Bala, CEO at equipment vendor GBC Photonics, believes that the sector needs to find viable line system solutions, optimising networks to respond to connectivity demands in 2022.

He explained: ‘The pandemic meant that people struggled to interact in person and had to rely on virtual communication. For the digital sector, there were 4.66 billion internet users in January 2021, an increase of 7.3 per cent from 2020. Similarly, more than 53 per cent of the world population is now on social media. Coherent optics can solve capacity issues by using complex technology to modulate the amplitude, frequency and phase of the light in the fibre.’ 

This process, Bala said, increases network performance and flexibility to transport more information on the same fibre. ‘From fibre optics being able to transport just 10Gb/s at the beginning of coherent technology, it is now possible to transport up to 800Gb/s and even more in the future.’ 
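As a rough illustration of that progression (not a description of any particular product), a coherent signal’s net rate is approximately the symbol rate, times the bits carried per symbol on each of two polarisations, less forward error correction (FEC) overhead. The figures in this sketch are assumptions chosen to land near 400G and 800G:

```python
# Back-of-envelope coherent line-rate arithmetic (illustrative only).
# net_rate ≈ polarisations × symbol_rate × bits_per_symbol × fec_efficiency

def net_line_rate(baud_hz: float, bits_per_symbol: int,
                  fec_efficiency: float, polarisations: int = 2) -> float:
    """Approximate net data rate of a coherent optical signal, in bit/s."""
    return polarisations * baud_hz * bits_per_symbol * fec_efficiency

# Illustrative figures, roughly in line with a 400G-class signal:
# ~60 Gbaud, dual-polarisation 16QAM (4 bits/symbol), ~84% net after FEC.
rate_400g = net_line_rate(60e9, 4, 0.84)
print(f"~{rate_400g / 1e9:.0f} Gb/s")   # ~403 Gb/s

# Raising the symbol rate and modulation order (assumed here: ~95 Gbaud,
# 64QAM with heavier overhead) is how newer modems approach 800Gb/s.
rate_800g = net_line_rate(95e9, 6, 0.72)
print(f"~{rate_800g / 1e9:.0f} Gb/s")   # ~821 Gb/s
```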

Bala highlighted wavelength division multiplexing (WDM) technology as one way to help optimise networks. ‘According to market research company LightCounting, the value of WDM technology will increase significantly in 2022 and will continue to do so in the next five years to reach $18bn in sales,’ he said. ‘WDM technology allows operators to send data over the same medium using multiple light wavelengths, significantly increasing fibre capacity.’ 
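The capacity gain from WDM is simple multiplication: the usable band divided by the channel spacing gives the channel count, and each channel carries its own coherent signal. A minimal sketch, assuming roughly 4.8 THz of usable C-band spectrum and a fixed 100GHz grid:

```python
# Rough WDM fibre-capacity estimate under illustrative assumptions.
C_BAND_HZ = 4.8e12          # ~4.8 THz of usable C-band spectrum (approximate)

def fibre_capacity(band_hz: float, channel_spacing_hz: float,
                   rate_per_channel_bps: float) -> tuple[int, float]:
    """Return (channel count, aggregate capacity in bit/s)."""
    channels = int(band_hz // channel_spacing_hz)
    return channels, channels * rate_per_channel_bps

# e.g. 100 GHz-spaced channels, each carrying a 400Gb/s coherent signal:
n, total = fibre_capacity(C_BAND_HZ, 100e9, 400e9)
print(f"{n} channels, ~{total / 1e12:.1f} Tb/s per fibre")  # 48 channels, ~19.2 Tb/s
```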

Bala looked back to before the emergence of WDM systems, during which time, he said, the only solution for network operators to keep up with bandwidth demand was to lay more fibre – a disruptive and expensive process. ‘In the past 20 years,’ he said, ‘accelerating developments in multiplexing technologies have opened new opportunities to add capacity without necessarily having to lay more fibre. While it increases the capacity of fibre networks, WDM has also been recognised as a layer one transport technology in all tiers of the network, which is why it represents a great investment for operators in 2022.’

Coherent technology was another aspect highlighted by Bala as integral to network performance. He explained: ‘400G is a promising technology that allows for high-capacity connectivity and reliable bandwidth, with low operational expenses and a smaller carbon footprint. As part of this revolution, QSFP-DD ZR and ZR+ compatible transceivers have been developed to support 400G connectivity. 400G ZR and ZR+ support the QSFP-DD interface, which allows DWDM systems to be installed directly into network equipment. Operators that opt for a network infrastructure based on 400G QSFP-DD transceivers will save on both initial investment and the total cost of ownership (TCO) of WDM networks. More than 65 per cent of WDM network operators believe that the TCO of networks designed this way will be at least 20 per cent lower compared to traditional WDM networks.’ 

However, Bala pointed out that despite a significant investment in 400G technology, there is also a massive increase in the value of fifth-generation coherent technology. Although 800G modules are not yet commercially available, he explained, they will be able to transport 800 billion bits per second, increasing fibre capacity or extending wavelengths across any path through remotely adjustable line rates. ‘This new technology could be used in long-haul applications for over a thousand kilometres and submarine applications for more than 10,000 kilometres,’ he said. 
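One way to picture those remotely adjustable line rates: a flexible modem trades capacity for reach, stepping its modulation (and so its rate) down until the measured optical signal-to-noise ratio (OSNR) of the path is sufficient. The rate-selection sketch below uses invented OSNR thresholds purely to show the principle; real values differ per modem and vendor:

```python
# Sketch of how a remotely adjustable line rate might be chosen: step the
# rate down until the required OSNR fits the measured path. The threshold
# values below are purely illustrative, not taken from any datasheet.
RATE_OSNR_REQ_DB = [          # (line rate in Gb/s, required OSNR in dB)
    (800, 26.0),
    (600, 22.0),
    (400, 18.0),
    (200, 13.0),
]

def select_line_rate(measured_osnr_db: float, margin_db: float = 2.0) -> int:
    """Pick the highest line rate whose required OSNR, plus margin, is met."""
    for rate, required in RATE_OSNR_REQ_DB:
        if measured_osnr_db >= required + margin_db:
            return rate
    raise ValueError("insufficient OSNR for any supported rate")

print(select_line_rate(25.1))  # 600: a long path forces the rate down from 800
```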

Up for discussion 

A panel discussion at the latest NGON DCI virtual event on advances in optical line systems and the road to 100Tb/s also looked at the networks of the future and how they might be supported. Scott Wilkinson, panel chair and lead analyst at research firm Cignal AI, said: ‘There have already been lab experiments up to 100Tb/s, and there are now commercial transceivers and transponders that can go up to 800G.’ Some of these, he continued, had been reported to have achieved up to 1.6Tb/s. ‘100Tb/s is certainly within the realm of possibility in the next couple of years.’ 
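A back-of-envelope check suggests why: aggregate fibre capacity is roughly the usable spectrum multiplied by the spectral efficiency, so a C+L system (assumed here to offer about 9.6 THz) only approaches 100Tb/s at spectral efficiencies near 10b/s/Hz:

```python
# Back-of-envelope check on the 100 Tb/s-per-fibre target (illustrative).
# capacity ≈ usable spectrum × spectral efficiency
C_PLUS_L_HZ = 9.6e12        # ~2 × 4.8 THz for a C+L system (approximate)

for se in (5.0, 8.0, 10.0):  # spectral efficiency in b/s/Hz
    print(f"SE {se:>4} b/s/Hz -> ~{C_PLUS_L_HZ * se / 1e12:.0f} Tb/s")
# SE  5.0 b/s/Hz -> ~48 Tb/s
# SE  8.0 b/s/Hz -> ~77 Tb/s
# SE 10.0 b/s/Hz -> ~96 Tb/s
```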

But of course, networks and line systems will need to be able to support these speeds. The first topic covered by the panel on this subject was the openness of line systems and what that might look like in the future. According to Dejan Bilbija, director of sales system engineering at Ciena, there has definitely been a great deal of progress in the deployment of open and disaggregated optical systems. ‘In the beginning,’ he said, ‘we saw quite a lot of disaggregated services platforms. And now there is a choice of open disaggregated line systems from system vendors. We’ve been designing disaggregated systems differently, with the goal of increasing automation and opening up line systems. This is what our customers are asking us to provide. The new generation of line systems includes embedded intelligence for automated system turn-up and live system collaboration, together with northbound interfaces based on common data models. They include embedded instrumentation to recognise the presence of wavelengths and act only on individual wavelengths.’ 

From the perspective of service providers, Bilbija explained, deployment models increasingly involve deploying different optical transponders from different vendors over one open line system. ‘In addition to that,’ he said, ‘we did see some of the big service providers and content providers also performing integration or line interoperability between the different line systems. By doing that they are taking the responsibility on their side for the integration and the testing. So, how much are these systems open today and how much will they be open in the future?’ From a system vendor perspective, Bilbija detailed how Ciena is working towards having more open systems. ‘How much they will get into the line networks depends on the service providers and their willingness and readiness to provide these integrations.’ 

Szilard Zsigmond, director of product line management at Nokia, agreed that openness is an important consideration. ‘We have seen that happening in different areas of the network. There is also the drive to open up the network; customers are building their own controllers to support that. There have been efforts through Open ROADM to standardise on the line side, and it is a reality today that subsea spectrum sharing is becoming standard.’ ‘All of this is being supported,’ Zsigmond continued. ‘But still, we should not forget the complexity of doing it – because it’s great that we have the technology, but as soon as customers are buying just half of the package, the risk immediately shifts to their end. So there are still unsolved issues, like how to address field issues – whose fault is it? Which team is going to debug it? These are obvious questions.’ 


Always open 

Regarding his experience with deployments, Zsigmond admitted that it can be difficult to gauge whether a majority of providers have moved to open line systems, but said that those who do are driven by customers who have ‘an army of people to support it ... they know how to address those deployments.’ Bilbija agreed. ‘Willingness and readiness of the service provider,’ he explained, ‘is what determines whether they want to go in this direction, taking responsibility for all the issues that Szilard mentioned about these deployment models. Bigger service providers are more willing to look into this type of deployment, and smaller service providers with a smaller workforce are looking to get into the closed systems.’ ‘However,’ he said, ‘the systems, as they are built at this moment in time, are open. It’s up to the service provider to determine which model fits them best from the operational perspective.’ 

Turning to metro networks, the panel debated if and when C+L line systems might be deployed in metro networks, given that it could be less costly to build more fibre instead. Zsigmond explained that Nokia had introduced C+L back in 2016. At the time, he said, ‘it was fully designed for long-haul, but now we are on a third generation C+L system and have deployed a couple of hundred thousand C+L systems. So the deployment is definitely not just long-haul anymore. We have massive deployments on the metro side. The integration of the components – WSSs, amplifiers and OCMs – has enabled form factors to be fully compatible with metro deployments. We are working on a next generation C+L where we basically converge the C+L into a single band, and that would further enable cost reduction on a C+L system. So the argument that two C-band systems are cheaper than a C+L system is not true.’ 

The light side 

The view of Ioannis Tomkos, professor of optical communications at the Department of Electrical and Computer Engineering of the University of Patras, Greece, is that it depends on the operator’s network and the availability of fibre. ‘If you have a metro area where fibre is abundant because of existing deployments of FTTH or 5G fronthaul,’ he said, ‘then the obvious choice is to use the fibres rather than extend the capacity by using C+L band in the metro. If we look at another network segment, like long-haul or even submarine, then the situation again will change as to which one is the best solution. But for metro I believe that for some years to come, many operators will have available fibre to light up.’ 

‘The experience that we’ve seen so far from customer deployments,’ said Bilbija, ‘is that we do not see a “one-size-fits-all” solution for the metro, especially if you look at it from the perspective of the coherent 400G pluggables that are coming on the market. We do see adoption of these in metro networks, along with the line systems that are there to accommodate them. It really depends on the customer use case and what type of networks they are going to be deploying. But we do see the deployment of C+L systems in metro networks.’ 

Offering an operator’s perspective, Thomas Logiadis, of the core transport network planning section at OTE, explained: ‘From our perspective it depends on cost. If C+L becomes very cost-effective, maybe it’ll be attractive to us, but since we’re an incumbent operator we don’t rent any fibre. I think for the time being it is not the best choice for us. We have a lot of fibre to use.’

Looking to the future and what’s coming next, Tomkos said: ‘I believe the endgame will be SDM; it cannot be otherwise. We will not be able to satisfy the demands of 20 years from now just by using the available dimensions for scaling bandwidth. Eventually we will need to take advantage of all the available fibre infrastructure and minimise the cost and power consumption per bit by integrating amplifiers together and integrating WSSs together, so SDM will reduce the costs.’ 

Logiadis stated that in the mid-term future, C+L will likely be the way to go. ‘We have laid most of our fibre,’ he explained, ‘and it’s pretty expensive to lay fibre, so in order to cover the mid-term demands we will have to go to C+L and then see what is the most cost-effective way.’


