By Beth Harlen
Within telecoms circles, the term ‘network neutrality’ often evokes a strong reaction: few subjects in the industry are quite as polarising legally, politically and practically.
At its heart, network neutrality imagines a future where internet service providers (ISPs) treat all internet traffic fairly and equally, with no single service gaining priority over any other. The increasingly bandwidth-hungry demands and expectations of consumers are met, without any restrictions on the type, platform or location of content they can access.
While some in the industry view this as the goal to which the global telecoms community should strive, others are more hesitant, citing concerns over how ISPs could in fact be managing and prioritising traffic over their networks.
In order to ease congestion on their networks and thereby improve efficiency, ISPs are employing traffic management practices to intelligently match the available network resources with content delivery demands. This is designed to ensure that whether customers are sending email or streaming live video, they have a consistently good experience.
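Traffic management of this kind is typically built on scheduling disciplines such as strict-priority queueing, where latency-sensitive packets (live video, voice) are served before bulk traffic (email, downloads). The sketch below is purely illustrative – the class, the priority values and the packet labels are hypothetical, not any ISP’s actual configuration:

```python
import heapq
from itertools import count

class PriorityScheduler:
    """Toy strict-priority packet scheduler: lower number = higher priority."""

    def __init__(self):
        self._heap = []
        self._seq = count()  # FIFO tie-break within a priority class

    def enqueue(self, packet, priority):
        heapq.heappush(self._heap, (priority, next(self._seq), packet))

    def dequeue(self):
        return heapq.heappop(self._heap)[2]

sched = PriorityScheduler()
sched.enqueue("email-1", priority=2)        # bulk traffic, can wait
sched.enqueue("video-frame-1", priority=0)  # latency-sensitive, served first
sched.enqueue("video-frame-2", priority=0)
order = [sched.dequeue() for _ in range(3)]
# both video frames are dequeued before the email packet
```

The point of contention in the article is not this mechanism itself, but who decides which traffic gets which priority.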
This is not an inherently good or bad practice in itself; rather, the question is what happens if these methods are used in an anti-competitive way. There is no ignoring the fact that the potential risk exists for ISPs to employ traffic management techniques that ensure their own services and content have priority over those of their rivals. This effectively undermines the notion of network neutrality, and is the spark for much of the divide in opinion.
The solution, it would seem, falls then to regulatory bodies.
In order to fully examine the different approaches to traffic management (TM) – and subsequently judge their appropriateness – the UK telecoms regulator, Ofcom, commissioned an independent technical study on the subject so that it could “further understand the availability of techniques and tools that may be used to detect the presence of TM in broadband networks”.
Entitled ‘A Study of Traffic Management Detection Methods & Tools’, the literature review by Predictable Network Solutions frames the debate by asking a fundamental question: what criteria should traffic management detection methods and tools satisfy?
The report offers a comparative analysis of the identified TM methods and tools in terms of "their efficacy in detecting and quantifying the presence of TM in a given network; the impact on the network and the consumer in terms of generated traffic volume, quality of experience, etc.; and the need for a given tool or methodology to be integrated within, or executed outside, a given ISP’s network infrastructure".
It finds, however, that because of the inherently statistical nature of packet-based networks, confirming that internet service provision satisfies suitable criteria of fitness-for-purpose, transparency and fairness is challenging at best. "The absence of differential traffic management does not, by itself, guarantee fairness, nor does fairness guarantee fitness-for-purpose. Traffic management detection (TMD) is thus, at best, one component of an overall solution for measuring network service provision," it states.
The report adds that if the TM policies applied to end-user traffic were published, their implementation could be independently verified. This is something of a moot point, however, as ISPs will be reluctant to hand over accurate data in this regard. The measurement of fairness and fitness-for-purpose is also complicated by each end user’s unique quality of experience; depending on the application being used, what matters to one consumer may be less of a priority for another.
The emphasis therefore shifts to TMD tools that would be suitable for regulatory use within the UK. Ofcom holds that "an effective TM detection mechanism should operate with a sufficient degree of precision, repeatability, reproducibility, scalability and validity of tests. It should take into consideration the different ways by which access networks are architected and the variations in the digital delivery chain (peering and transit); and be able to locate the position in the digital delivery chain where TM is being applied".
The difficulty that arises here is that no current mechanism meets the key desired attributes of what Ofcom envisions a TM detection system to be. The report cites several reasons for this shortfall, such as the fact that techniques “aim only to detect the presence of differential TM within the broadband connection”. One critical constraint is that most of the currently available tools focus only on detecting a particular application of a particular TM technique, and that “even in combination they [TMD tools] do not cover all of the potential TM approaches that could be applied”. The report is forced to conclude that there is no tool or combination of tools currently available that is “suitable for practical use”.
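Most such tools work on the principle the report describes: generate comparable flows of different traffic types and compare their measured performance, flagging a sustained gap as possible differential treatment. A minimal, hypothetical sketch of that idea – the function name, threshold and sample figures are my own, and real tools must also control for congestion, routing and measurement noise:

```python
def differential_tm_suspected(samples_a, samples_b, threshold=0.2):
    """Flag possible differential TM when the mean throughput of two
    otherwise-comparable flows differs by more than `threshold`
    (as a fraction of the faster flow). Purely illustrative."""
    mean_a = sum(samples_a) / len(samples_a)
    mean_b = sum(samples_b) / len(samples_b)
    return abs(mean_a - mean_b) / max(mean_a, mean_b) > threshold

# hypothetical throughput samples (Mbit/s): a video-like flow vs a generic flow
video = [4.1, 3.9, 4.0, 4.2]
generic = [9.8, 10.1, 9.9, 10.0]
differential_tm_suspected(video, generic)  # True: large, sustained gap
```

As the report stresses, a positive result like this detects only one application of one technique; it says nothing about TM applied elsewhere in the delivery chain, or by other means.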
The detection of traffic management techniques, and the identification of the assumed intention behind them – note that I say ‘assumed’ because, as the report suggests, "by its nature, the intention behind any TM applied is unknowable; only the effects of TM are observable" – is far from straightforward. Furthermore, the report draws attention to the considerable gaps that exist in this area.
However, the report does highlight a network analysis tool dubbed ‘network tomography’ that uses the “performance of packets traversing a given network to infer details about the underlying network, its performance; and potentially the presence and location of TM”. This may provide a practical and effective solution in the UK, although more work needs to be done in order to assess its viability.
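The intuition behind network tomography is that observable packet behaviour carries information about unobservable network internals. One common variant infers a shared network element from correlated performance on different probe paths. A toy heuristic, under my own assumptions (the function names, threshold and delay figures are hypothetical; real tomography methods are statistically far more careful):

```python
from statistics import mean

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

def shared_bottleneck_likely(delays_a, delays_b, threshold=0.8):
    """Strongly correlated delay variation on two probe paths suggests a
    shared element -- a candidate location where TM could be applied."""
    return pearson(delays_a, delays_b) > threshold

path1 = [10, 14, 25, 12, 30, 11]  # ms, hypothetical probe delays
path2 = [11, 15, 24, 13, 29, 12]
shared_bottleneck_likely(path1, path2)  # True: the delays co-vary
```

This is only the inference step; locating TM in practice would require many such observations across the delivery chain, which is why the report calls for further work on viability.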
Whilst in no way signalling the end of the debate over network neutrality, the report does evaluate the critical issues surrounding network performance in a thoughtful and objective way. One of the final highlights dismisses the idea that traffic management and network neutrality are in any way mutually exclusive; rather, the report states that “traffic management controls how the quality impairment is allocated; and since quality impairment is always present and always distributed somehow or other, traffic management is always present”.
The debate continues.
- Beth Harlen is a freelance science writer based in Cambridge, UK.