Net neutrality is a clear concept in theory: in an open Internet, all users should be able to access all content without discrimination, and all content providers should be able to reach users in the same way. But how this translates into practice is less clear. This article looks at the issue of net neutrality and considers what a ‘neutral’ Internet might look like in the face of increasing demands on Internet service providers to manage the rapidly growing amount of traffic on their networks.
Demand for bandwidth is growing rapidly. More users come on-line, and they access more bandwidth-hungry services from a greater range of devices, creating unprecedented increases in data traffic.1 This growing demand may be only partially met by traditional methods for increasing capacity (such as traffic offloading; more efficient technologies; use of more radio spectrum and fibre). This means that the issue of ‘managing’ Internet traffic in some way is already unavoidable for most ISPs. There are many options for doing so, from blocking capacity-hungry sites to slowing down all traffic at peak times to guaranteeing quality of service for specific sites, with ‘leftover’ capacity used to carry all other traffic. Some of these affect all users in the same way, whilst others favour some traffic. What measures are permitted depends on how the relevant authorities decide to define the concept of net neutrality in the context of managed networks.
Why worry?
Whilst the concept of equal treatment of all traffic has some intuitive appeal, economics suggests some caution: allocating scarce resources on a first-come-first-served basis creates inefficiency and welfare losses, and giving priority to the demand from those users and service providers with the highest willingness to pay could generate substantial benefits for all. Internet service providers (ISPs) competing with each other should have good incentives to cater for the needs of different user types, offering a range of products that match the requirements of users in terms of usage intensity, guaranteed speeds, levels of access, etc.
If an ISP decided to block access to particular sites, for example, it would have to consider the impact that this would have on demand for its services, and the price it might be able to charge, given that alternative suppliers are available. Provided that customers are sufficiently well informed about the specific characteristics of the services on offer, they could then pick the proposition that best suits their needs, trading off limited bandwidth or blocked access to particular sites at particular times for a lower price, for example. To some extent, this is happening, with different usage limits being set for different broadband packages.
However, competition may not be fully effective, and there could be a number of problems that may need to be addressed through intervention.
First, customers may not be particularly well informed about the characteristics of the services on offer, and competition may focus on particular headline parameters that are not necessarily the most important ones. For example, there is ample evidence that the ‘up-to’ download speeds that currently feature prominently in the advertising of residential broadband packages often say little about the service that is actually available to customers, and that customers are largely unaware of these differences.
There may also be market power issues upstream in the provision of content. Some content providers of ‘must-have’ content may be perceived by ISPs as ‘too big to block’, regardless of the bandwidth they require. Smaller content providers – and those wishing to access their services – might suffer disproportionately if there were no rules governing the behaviour of ISPs under the banner of traffic management.
Last but not least, ISPs may have invested in their own service offerings that compete with services provided by independent third parties. They might be tempted to use strategies that ostensibly are intended to manage traffic on their networks in order to favour their own services at the expense of third parties.
For all of these reasons, it may be necessary to establish some fundamental principles with which traffic management strategies must comply.
Net neutrality in Europe …
The European Commission decided against introducing legislation to protect net neutrality in April 2011, at least for the time being. It has recognised the need for traffic management in some form, but considers that effective monitoring of ISPs blocking access to certain services in combination with media scrutiny and transparency of ISP offerings should be sufficient to protect an open and neutral Internet.
However, the Commission subsequently indicated that it would assess the potential need for additional guidance on net neutrality in light of the findings of an investigation into traffic management practices that it had conducted jointly with BEREC. Based on a large scale survey, the investigation found that the most frequently reported restrictions were the blocking and/or throttling of peer-to-peer (P2P) traffic and the blocking of Voice over IP (VoIP) traffic, each affecting at least 20% of subscribers.
It seems therefore quite likely that the European Commission will become involved further in the net neutrality debate in the near future.
Meanwhile, the Dutch authorities adopted net neutrality legislation in May 2012 that will prevent Dutch ISPs from charging their customers for access to particular services/websites such as YouTube or Skype, or slow down or block traffic to them. This happened in response to announcements by ISPs such as KPN of plans to charge additional fees for access to services such as WhatsApp and Skype that compete with their own text messaging services and voice calls.
In contrast, UK regulator Ofcom has commented in general terms on what it views as good practice on the one hand, and worrying signs in the market for Internet service provision on the other, without setting down in regulation what sort of market behaviour will be tolerated. Whilst Ofcom has recognised that both ‘best efforts’ Internet access and managed services have their place in the market, it relies at present on a self-regulatory approach within the industry.
Some national authorities have thus been more prepared than others to impose rules in relation to net neutrality, and the specifics of how the balance will be struck between providing unrestricted access to the Internet, managing existing capacity efficiently and creating the right investment incentives for new capacity are emerging only very slowly. Nevertheless, it is possible to identify a few basic principles with which net neutrality rules should comply.
Be clear
First, there needs to be more clarity and transparency about the services that consumers are receiving from their ISPs. One of the main concerns with traffic management strategies is that they might be used in ways that short-change customers by providing services that fall substantially short of what customers believe they are getting.
This may involve providing a clear benchmark against which managed services can be compared, based on an unambiguous service definition. For example, a service that gives unrestricted access to all (lawful) sites without any attempts on the part of the ISP to manage speed or bandwidth allocated to particular sites (perhaps labelled as ‘full Internet access’) would be a good starting point. In its guidelines on net neutrality, UK telecoms regulator Ofcom has suggested such an approach.2
Any deviation from ‘full Internet access’ in terms of blocking or slowing down traffic to lawful sites must be clearly disclosed to customers. In the first instance, ISPs that engage in such strategies would obviously not be able to claim that they offer ‘full Internet access’, which would go some way towards improving transparency. However, the alternative concept of ‘managed Internet access’ encompasses a wide range of ISP offerings, from the use of basic traffic management techniques during peak times that may in effect render data-hungry sites inoperable, to slowing down or outright blocking of websites that may compete with the ISP’s own services. To give consumers enough information to make informed choices without having to incur huge costs, ISPs must be explicit about their traffic management policies, and communicate these in ways that are understandable for customers.3 This may require some standardisation of the way in which traffic management strategies are described, although too much standardisation may be counterproductive as it could focus competition on a few key parameters and give customers a false sense of comparability.
Consider competition effects
A small number of hard rules, such as rules against blocking the services of competitors and blatantly discriminatory traffic management techniques, might be appropriate – but such rules should be applied only to ISPs that are in a position to affect competition or enjoy market power, not across the board. Imposing SMP ‘special responsibility’ style regulatory obligations on all ISPs may unduly limit capacity-constrained ISPs in their ability to innovate (for example, by providing high-speed products with less content coverage), potentially resulting in a lack of product choice for consumers. Provided customers know what their ISP is doing, and have an alternative option, competition should be an effective constraint. On the other hand, where the number of alternative ISPs is relatively small, some protection against abusive behaviour would be needed. In this respect, the two principles of the FCC’s ‘no unreasonable discrimination’ rule (namely preventing discrimination ‘that harms an actual or potential competitor’, such as providers of VoIP services, or that ‘impairs free expression’, such as hindering access to a blog whose message the ISP disagrees with)4 appear to set reasonable parameters for content that should not be blocked by ISPs. Beyond these rules, however, as much discretion as possible should be left with ISPs to generate their own product offerings, in which their traffic management policies represent an increasingly important feature of the service, subject to an overriding requirement to communicate these policies effectively to consumers.
Principles for charging content providers
In general terms, charges to content providers for a guaranteed quality of service might be required to comply with the following principles.
- The charging regime must be transparent. Charges could be calculated on the basis of the number of users of a service and the average capacity those users require, yielding a measure of the ‘burden’ that the service places upon an ISP, which can then be charged for per unit of ‘burden’.
- Charges must be non-discriminatory. While ISPs may opt to use a matrix of unit prices per ‘burden’ of a service, taking into account for example carriage at peak times and discounts for large content providers, the same rate card should be made available to all content providers wishing to guarantee the quality of their service.
- There should be a minimum level of ‘burden’ threshold for charging. In order not to stifle innovation, ISPs must be required to carry the traffic of content providers below a threshold level, as defined by the ‘burden’ they impose on ISPs, without additional charge.
Together these principles ensure simplicity of ISP charging regimes. They also provide a degree of predictability of charges for content providers, which will be necessary for strategic decision-making. Such decisions will include, for example, whether a content provider values being carried by an ISP for a given price in the first instance and whether to invest in more efficient technologies, reducing its ‘burden’ and cost of carriage, on an ongoing basis.
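The three principles above can be illustrated with a minimal sketch. All figures and names here (the per-unit rate, the free-carriage threshold, the definition of ‘burden’ as users multiplied by average capacity) are hypothetical assumptions chosen for illustration, not a prescribed charging model:

```python
# Hypothetical sketch of a burden-based charging regime.
# 'Burden' is assumed to be: number of users x average capacity required.
# The rate and threshold below are illustrative values, not real tariffs.

def service_burden(num_users: int, avg_capacity_mbps: float) -> float:
    """Measure of the load a content provider's service places on an ISP."""
    return num_users * avg_capacity_mbps

def carriage_charge(num_users: int, avg_capacity_mbps: float,
                    rate_per_unit: float, free_threshold: float) -> float:
    """Transparent charge: a published per-unit rate applied only to burden
    above a minimum threshold, which is carried without additional charge."""
    burden = service_burden(num_users, avg_capacity_mbps)
    chargeable = max(0.0, burden - free_threshold)
    return chargeable * rate_per_unit

# A small content provider below the threshold pays nothing,
# so innovation by low-bandwidth services is not stifled.
small = carriage_charge(1_000, 0.5, rate_per_unit=0.01, free_threshold=10_000)
print(small)  # 0.0

# A large, capacity-intensive service pays the same published rate
# per unit of burden above the threshold (non-discriminatory rate card).
large = carriage_charge(100_000, 2.0, rate_per_unit=0.01, free_threshold=10_000)
print(large)  # 1900.0
```

Because the rate card and threshold are published and applied uniformly, a content provider can predict its charge in advance, and can see directly how investing in more efficient delivery (reducing its average capacity requirement, and hence its ‘burden’) would lower its cost of carriage.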
Support various charging models
While in many cases it may be acceptable to both ISPs and content providers that all data traffic is slowed down at peak times, there will inevitably be cases where this would lead to data speeds that are insufficient for maintaining acceptable quality of service standards for data-heavy services. In such cases, ISPs should have the discretion to offer content providers the option to pay to have their data carried at speeds that will allow them to maintain a guaranteed level of quality for their services. Although it would of course also be possible for end users to pay for guaranteed minimum speeds for accessing particular content, having content providers make such payments may be more efficient as it helps overcome co-ordination issues and supports innovation. Introducing new, more data-hungry services is easier if effective delivery can be guaranteed without having to rely on potential customers individually paying for the minimum speed guarantee.
Where they offer capacity-intensive content providers the option of paying for carriage at guaranteed speeds, ISPs should set their charges in a transparent manner. ISPs who may enjoy market power should be required to offer the option to pay for such guaranteed speeds on a non-discriminatory basis, and must not be allowed to refuse carriage of traffic without extra payment, or slow down such traffic, below a certain minimum threshold.
Define a standard product as an anchor
Last but not least, it may be desirable to define a standard product (potentially based on the notion of ‘full Internet access’ at a minimum guaranteed speed) that all ISPs (or at least ISPs with market power) will need to offer. Such a product would provide an important safeguard and ensure that access at reasonable speeds is available for all. High-frequency, high-capacity Internet users have long been catered for by ISPs through premium service offerings, and in the context of net neutrality it is the less lucrative customers of ISPs whose access to the Internet needs to be protected. Further, in the context of a charging regime for content providers, access to Internet users must also be ensured for the large number of low-bandwidth content providers. The availability of such a basic product should ensure that such users do not get left behind as the possibilities brought about by the Internet and corresponding access requirements continue to grow.
What is clear is that net neutrality cannot be defined without taking into account the interests of all stakeholders, and will need to allow payments for assured quality of service from content providers generating significant Internet traffic as well as consumers for both access and high levels of usage. Looking at net neutrality as a lofty principle that guarantees unfettered access to all services at a high level of service quality for all users at all times means ignoring the economic realities of scarce capacity in a situation of growing demand, and is likely to lead to a worse experience for every Internet user.

- For example, after only a year in the market, the users of the seemingly ubiquitous iPhone 4 became the most data-hungry users of all smartphone manufacturers as adoption of the data-hungry device made its way into the hands of the most intense smartphone users. (See Total Telecom, ‘iPhone 4S is biggest network hog – study’, 6 January 2012). In 2011, Ofcom found that “DSL-based connections continued to deliver average download speeds that were much lower than the headline ‘up to’ speeds which are frequently used to advertise broadband services. ‘Up to’ 8Mbit/s and ‘up to’ 20/24Mbit/s ADSL connections delivered just 41% and 31% of headline speeds during the period, in line with results from previous research…” (see Ofcom, “UK fixed-line broadband performance November 2011”, published 2 February 2012). [↩]
- Ofcom (November 2011), “Approach to net neutrality” [↩]
- BEREC stated in its recent guidelines on transparency that an effective transparency policy would be one that is accessible, understandable, meaningful, comparable and accurate (see BEREC (December 2011), “Guidelines on Transparency in the Scope of Net Neutrality: Best Practices and Recommended Approaches”, BoR (11) 67). [↩]
- Paragraph 75 of Federal Communications Commission (FCC) Open Internet Rules. Note that this paragraph of the FCC’s Open Internet Rules also references concern with practices ‘that harm end users’. In this article, the issue of protecting end users is taken into account as part of separate recommended rules. [↩]