One of my long-term predictions has been that Internet Service Providers will ultimately disappear as we know them today.
They were a necessary middle-man when we were trying to coax our voice-grade network into the internet era; dial-up internet evolved from banks of modems providing access to bulletin boards and mail hosts into an interconnected inter-net.
Now that we are moving towards a network purpose-designed for data, one that treats voice as just another service, there is something of an assumption that ISPs as we have come to know them will remain an important part of the supply chain – the necessary link between retail customers and the telecommunications core.
We can see that evidenced in the EU’s assumption that open access networks will provide multiple layers of competition, and in the initial focus of Ofcom’s work, which began with a replacement for the layer 2 services that form the basis of today’s ISP offerings (in fact it doesn’t feel like they’ve moved much beyond layer 2 NGA access).
But it’s worth remembering what happened in the UK when the regulator mandated access to the first-mile copper network, the so-called local loop: ISPs rushed to compete at an infrastructure level, not on a virtualised network but using real networks. It suggests the industry’s preferred competitive battleground and the regulator’s may not be exactly the same.
In the days of dial-up modems, your ISP provided transit from their network to the internet, your email address and web-space. Today, most people prefer Gmail and consider Facebook their web host, leaving ISPs as little more than resellers of internet transit. No wonder they seek areas to differentiate!
Returning to today’s next generation transition, there has been a clear reluctance among ISPs to engage and commit to next generation developments. While much of the debate has been on the cost and complexity of creating new software interfaces to manage layer 2 Active Line Access (ALA – what BT calls GEA) services, lying behind this, I suspect, is a deeper preference to find a realistic substitute for local loop unbundling, where ISPs can retain their ability to compete using physical and not virtual networks.
If this is true, then perhaps it should not have been a surprise that the first formal, unequivocal request from a service provider to next generation network builders was for physical network access – Virgin’s offer to use a wavelength to extend their cable coverage to new areas where a full fibre-to-the-premises network exists.
Physical network control provides greater scope to form the service layer in your own image – to differentiate the customer experience, matching it to your brand and aims.
While the arguments from the rest of the service provider community for not joining the next generation party have focussed on the complexity of software system interconnection, this is really a facet of the cost and complexity of administering virtual networks – physical network interconnection is typically a much simpler process with fewer variables.
So was the work on ALA a waste?
Absolutely and unequivocally no!
A smart and flexible layer 2 framework is what will release the content providers – the cloud operators. While service providers appear to want to move down the network stack, their place will be filled by application and content providers. The capabilities of a smart next generation network will unleash the creativity of social media companies, cloud application developers and the content delivery companies.
ALA should be promoted to Sony and Google as much as, if not more than, to TalkTalk and Plusnet.
Am I bothered that some next generation networks appear vertically integrated? If their intentions are monopolistic, then very much so. If however, they are creating a platform for services and using ALA to actively encourage new service delivery models, then I’m less concerned – in many ways I suspect they will become the pioneers of a new internet era.
So what is the impact of all this?
If internet service provision does move further towards physical network provisioning, then we need to understand one key message: whoever lights the service owns the customers and controls their access to the digital world. This is the true root of the net neutrality debate.
While it is true that whoever builds the passive cabling has a natural geographical monopoly, whoever lights the service has a natural monopoly over people and businesses. That is one of the key strengths of ALA – it breaks the chains, putting control over the digital experience in the hands of customers and the services they value.
In this regard, it perhaps matters less who lights the service and much more how they light it. Getting this right will move the internet message away from bits and bytes and towards stuff that matters to us – the services we value.
So for Ofcom, two messages should be very clear:
- More progress needs to be made on passive infrastructure access. It’s not just about ducts and poles but a passive version of ALA – a consistent framework that allows today’s ISPs to unbundle cables and reinforce their apparent desire to deliver real networks, not virtual ones.
- ALA is a brilliant mechanism, but only if its purpose and opportunities are made clear. Whoever lights the cables should be using ALA, and a new level of service competition should be created where multiple content providers are able to take advantage of the intelligence built into ALA. Ofcom needs to put its long arms around the totality of its remit, and not treat broadband as different in some way to TV or content.
If we can get this right, the UK could become the first country to break the chains of the net neutrality debate and in the process create an exciting platform for the next wave of creative industries and social media. And we will have put to bed one of the key reasons the major ISPs aren’t fully engaging with this future.