Gigabit? Who needs it?

Recently two things got me thinking a little:

  • One of my main PCs needed a little maintenance
  • I visited the launch of Gigaclear’s Appleton network in Oxfordshire

My main Windows PC developed a memory fault and I needed to get a little support from Yoyotech, the excellent people who made it for me. When I got it back up and running, I benchmarked the machine using the Windows Experience Index and saw it was hovering between 7.8 and 7.9 – the index only goes to 7.9.

I’m yet to find a task this PC, when it’s feeling fit and well, can’t do in its sleep.

And this is one of two Yoyotech machines I have, so when I link databases between them, run queries on millions of records and transfer piles of data, they do it at gigabit speed in moments.

My network attached storage box joins in, delivering data only a tad slower than the solid state drives in my PCs.

So to get this out of the way, I guess that does make me a geek.

But the other event was the launch of Gigaclear’s gigabit network in the Oxfordshire village of Appleton. They’re delivering a gigabit-based broadband service to homes and businesses in the village, and people were encouraged to come along to the village hall to try it out – a Bring Your Own Device (BYOD) day.


National legislation with global impacts

The Internet blackout by many of the big names in response to proposed US legislation isn’t the first time lawmakers and internet pioneers have faced up to each other, and it’s also not the first time that national legislation, attempting to target a national issue, has had potentially significant impacts on the running of the international internet.

Almost exactly a year ago I wrote about the global challenges being posed by the US proposal for a domestic “Internet kill switch”; if the US Government were to switch off the US portions of the Internet it would not just deny UK citizens access to common services but may also kill entire portions of the UK’s internet access because of the global nature of internet peering.

There is no simple answer to this. Of course national Governments must act in their own self-interests, but when it comes to the Internet the impact is seldom felt by local citizens alone. Internet Governance, thus far, has been largely successful in developing a fairly egalitarian, global phenomenon outside of national governance, but we are entering a new world where national security, health and prosperity depend on the future running of the Internet – this makes it politically very important, not least to key knowledge economies like the UK.

US citizens have typically been more aware of this than those of many other nations – the importance of net neutrality is a deeply emotional, heartfelt thing in the US but has so far been largely missed by UK citizens and is totally ignored in Ireland, where the lack of transparency is actively marketed by the largest operator.

The reaction to the Internet blackout in response to the US SOPA proposals was interesting. It seemed to mark the awakening of debate beyond the US. I didn’t hear much from politicians outside of the US, but the interest from commentators went beyond simply bemoaning that they couldn’t look things up on Wikipedia. When Jonathan Agnew from BBC’s Test Match Special comments on Twitter about the importance of the internet and the problems SOPA may introduce, it must have become mainstream.

My own position is that while copyright of course needs to be protected, loosely drafted legislation can have far wider ramifications, and the implementation of internet legislation specifically will always have implications far beyond national boundaries. Any Government considering a move like this today has a responsibility to world citizens and not just to the self-interests of one sector of its local economy.

Today requires a generation of Internet-savvy politicians who can find new world solutions to old world problems like copyright.

Open is the best (only) policy – Ghost of Christmas Future

In my last post (Open is the best (only) policy) I gave a high-level view on why I think open access networks are important today, but that offers just a narrow glimpse of why open access will become the single most important thing network operators can do for their customers, and why the UK is unknowingly paving the way.

So a bold statement:

I think that Active Line Access (ALA) will become one of the most important features of public networks in the years to come – but it will take a little time for that to become apparent. I also know that so far very few people have understood this.

When I talk to people who build public networks they typically see ALA as the necessary replacement to PPP/L2TP; that it’s the technical remedy that allows them to hand off connections to ISPs in an NGA world. They are of course right in a very practical, narrow sense, but what the NICC did in codifying a long list of technical standards was much, much more than that.

When I talk to people who build campus networks their immediate response is to ask what all the fuss is about; ALA is a codified collection of standards that large corporates have been using for many years. Again, broadly true, but they have forgotten what their lives were like before they had these tools.

A Ghost of Christmas Past

Travelling back 15 years to the world of large corporates, a network manager’s lot was very difficult. They typically had the biggest budget in the IT department with the biggest sign-off, but they also found it the hardest to provide direct empirical evidence that any incremental increase in their budget would deliver a greater incremental impact on the business; granular return on investment calculations were impossible.

Around this time I started to talk about proximity to the business, and it went like this:

  • The applications people had a direct relationship to the business, so anything they did had a direct and immediate bearing on it; incremental change could be measured and valued.
  • The core software people, like database administrators, were closely coupled to the applications people, so although they were one step removed from the business and their systems might be shared, they were close enough to the business to measure their impact.
  • The server teams were further removed, and incremental investment was becoming more challenging because their world was two layers removed and increasingly shared; but by working closely with the applications and core software people they could typically prove enough incremental value to justify additional investment.
  • The network teams were by definition universally shared, with no direct connection to parts of the business, only to the business as a whole; at this time, budget meetings during major shifts in the business were a pretty unpleasant affair and something most network managers dreaded (or at least the ones focussed on the business did).

With Y2K looming, I started to focus on how I could bridge the void and improve my proximity to the business. It was also at this time that what I then called 3D networks were beginning to be possible. Traditional 2D networks were a trade-off between distance and speed, but 3D networks had a policy axis using a combination of VLANs and quality-of-service classes; combining these meant I now had granular control over the network and could finely adapt it in response to changing business needs – it was now possible to improve the network’s proximity to the business and therefore provide a direct and measurable impact. Budget meetings could now be constructive and less confrontational.
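For the technically curious, here is a minimal sketch in Python of that policy axis (the service names, VLAN IDs and bandwidth figures are purely illustrative, not from any real deployment): the “third dimension” is simply a table that maps each business service onto its own VLAN and quality-of-service treatment, so network behaviour, and therefore network spend, can be attributed service by service.

```python
# A minimal sketch of a "3D" policy table: the third axis maps each
# business service onto its own (VLAN, QoS class, bandwidth) treatment.
# All names and numbers below are illustrative, not from any deployment.
from dataclasses import dataclass

@dataclass
class ServicePolicy:
    vlan_id: int     # isolates the service's traffic
    qos_class: str   # e.g. a DiffServ-style forwarding class
    min_mbps: int    # bandwidth guaranteed to the service

POLICIES = {
    "finance_erp": ServicePolicy(vlan_id=10, qos_class="assured", min_mbps=50),
    "voice": ServicePolicy(vlan_id=20, qos_class="expedited", min_mbps=10),
    "bulk_backup": ServicePolicy(vlan_id=30, qos_class="best_effort", min_mbps=0),
}

# Because treatment is per service, incremental network spend can now be
# attributed to the business function that needs it.
for name, p in POLICIES.items():
    print(f"{name}: VLAN {p.vlan_id}, {p.qos_class}, >= {p.min_mbps} Mbps")
```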

It took time for the ideas of 3D networking to take hold, and my name for it never stuck, but today any private network manager of any merit should be able to have a direct dialogue with the business.

When the NICC created ALA, they codified the tools that private network managers use; they put in place the mechanisms to improve the proximity of public networks to people and businesses – and the impact of that will, in time, be far more profound.

A Ghost of Christmas Future

It often takes a single event to focus minds and create the conditions for a shift of this kind:

  • For private network managers it was Y2K, when vast sums were spent renovating application platforms and they needed to justify their budgets.
  • For public networks it will be the shift to NGA networks that we’re just beginning.

So when I talk about Service Providers I’m not being lazy and omitting “Internet” because I assume they’re synonymous; it’s because I think ISPs are in reality a general-purpose subset of Service Providers – and once “providers of service” become aware of what the NICC has done, the service provider market will become a whole lot richer and more exciting.

I had hoped the NHS might have been the pioneer in this space – the confluence of PSNs and the emergence of NGA is an opportunity that should be grabbed with both hands – but I suspect it will take a major commercial company to make the first move.

Who might the early movers be? The major cloud companies and content delivery networks (CDNs) are the obvious choices, and who better than Google (with YouTube) and Amazon Web Services (with LoveFilm)?

Imagine this:

Today Google offer a best-endeavours YouTube service over the top of other people’s transit networks; it works okay if your goal is to support three minutes of viewing per day, but it isn’t good enough for three hours per day. This is at the root of Google’s concerns over net neutrality.

In response, Google launch a Premium YouTube service for a few pounds a month, but instead of routing the service via an IP-based BGP interface onto your ISP’s network, it’s routed via an ALA VLAN hand-over point to your network operator. Quality is assured, so now you can watch three hours a day of broadcast-quality media, and Google can secure the rights to premium content as the risk of pixelation has been removed and the rights holders can feel confident their brand won’t be damaged.
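To make the contrast concrete, here is a hypothetical sketch (all names and figures are invented for illustration) of the difference between the two hand-offs: a best-efforts IP hand-off can promise nothing, while an ALA VLAN hand-over carries an assured bandwidth profile that a broadcast-quality stream can rely on.

```python
# Hypothetical sketch of the two hand-offs described above. "best_effort"
# stands in for today's IP/BGP transit hand-off; "ala_vlan" for an ALA
# VLAN hand-over carrying an assured bandwidth profile. All figures are
# invented for illustration.
HD_STREAM_MBPS = 8  # assumed bitrate of one broadcast-quality stream

def stream_is_assured(handoff: str, profile_mbps: float) -> bool:
    """Can the network promise the stream won't pixelate?"""
    if handoff == "ala_vlan":
        # Bandwidth is reserved at the hand-over point, so quality is
        # assured whenever the profile covers the stream's bitrate.
        return profile_mbps >= HD_STREAM_MBPS
    # Best-efforts transit: throughput depends on everyone else's traffic
    # at that moment, so nothing can be promised.
    return False

print(stream_is_assured("ala_vlan", 10))     # True - quality assured
print(stream_is_assured("best_effort", 10))  # False - no guarantee possible
```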

LoveFilm, backed by an ALA-based “Networks as a Service” offering from Amazon Web Services, is at least as well placed to be the pioneer, completely demolishing the current rigid assumption that viewing is either linear (broadcast) or non-linear (on-demand); their new streaming package that learns your viewing habits is the first baby step.

Today, this minute, this is a dream – a perfectly feasible dream – but as companies like LoveFilm evolve their services and explore, prod and push the capabilities and limitations of the underlying networks, I’m as confident as I can be that it will become a reality. When (not if) an organisation like Amazon Web Services gets its head around the capabilities of ALA, the world will change and imaginations will be unleashed.

Today we have a world of Over the Top (OTT) services – prepare for a world that combines OTT with RTS (round the side) services – and prepare for a future that blows your mind.

If you build your networks without ALA in mind then you are about to condemn your platform to obsolescence and your customers to boredom!

Start developing your networks with a proximity to your customers in mind and you will never look back!

Broadband doesn’t need high population density or PCs?

I just re-watched the brilliant BBC programme “The Joy of Stats”, where the infectious Hans Rosling encourages you to explore the world of statistics. I’ve been hunting for a long time for a better way to present the mass of data on broadband, and was left somewhat envious of the way Hans presents his world facts.

Good news!! He has a website from which you can download the tool he uses with such zeal, and you can create your own animated graphs on a whole range of global datasets, from poverty and health to broadband and the internet – yep, he has included just what I’ve been looking for!

I’ve not really had time to fully understand the mass of correlations the software supports yet, but I thought I’d share with you a couple of very quick snapshots.


The first was a chart plotting the degree of urbanisation against broadband take-up. I expected to see a nice clean line – the more urban the country, the higher the level of broadband. But it wasn’t quite as clean cut as that. Many of the countries that rank better than us in terms of broadband take-up are also less urban – in some cases significantly more rural.

So perhaps this is a case of logic over engineering – more rural areas demand broadband because they need it for shopping, healthcare, education and so on, simply because traditional face-to-face services are a long way away.

The second assumption was that you need a PC to drive up sufficient demand for broadband. Again, not quite true. A number of countries with lower PC ownership rates than the UK also have better broadband take-up. Is this a sign that other countries are finding more things to do with their broadband connection than just connect a PC, like tele-medicine? Or that perhaps we have been poor at marketing it? Do other countries offer a more compelling story, more than just “here’s your bandwidth, now get on with it”?

I don’t have the answers to these, just a growing number of questions – and the more I play with the Gapminder tools, the more questions I’ll have.

Homework: read the ALA documents

I received an email this week from the NICC’s Ethernet Working Group with links to the finalised Active Line Access (ALA) documents. This is very exciting news for all sorts of reasons.

ALA is the industry-agreed model designed specifically for the next generation broadband world, and at any number of levels it fundamentally changes the way broadband will work in the UK.

The documents are not an easy read (this is, after all, a set of engineering standards designed to be implemented by engineers) but their impact should be understood by everyone who has an opinion on the future shape of broadband, the internet or net neutrality.

I attended some of the early meetings, partly as an observer and partly because, like any opinionated techy, I wanted to help shape some of the early aspects. The Ethernet Working Group under Chris Gallon’s chairmanship is something of a technical dream team – the deeply technical architects and engineers from the major vendors and key network operators, tasked with working out how their organisations can interoperate.

The work they have done is undoubtedly impressive – they have taken diverse standards from the Broadband Forum, the Metro Ethernet Forum and other standards bodies from around the world and carefully and creatively sewn them together into a single framework which unlocks the potential of next generation broadband across a wide range of network architectures and technologies.

Whether you opt for GPON, point-to-point Ethernet or VDSL from a cabinet, ALA works and can hand over a connection in a seamless and universally consistent way to a service provider. And I see few reasons why it wouldn’t also work for many wireless technologies based around Ethernet and supporting VLANs.

Now that this brilliant piece of work is published, we have a duty to understand what is possible and to start to consider what is desirable. It is no longer reasonable to opine on net neutrality or the future of the internet without properly appreciating the impact of this work.

If you are a community, commercial organisation or public body thinking of building a network under the BD-UK framework you will need to be open access – by law – and that means you need to understand ALA – no ifs, no buts.

So before you say another word on any of this, if you are of a technical bent, your homework is to read it!

If you aren’t technical, turn to your favourite geek and do them a favour – tell them to read it and explain what it’s all about.

The documents are published on the NICC’s website.

Why we should care about the US Internet “Kill switch” proposals

There is a proposal running through the US Senate at the moment which would give the President powers to shut down critical internet infrastructure, the so-called “kill switch”. Apart from any concerns we might have at a distance about free speech and rights, there is an equally big issue which may be more critical to our own homeland security.

At the dawn of time, the internet was a peer network where each organisation with a network they wanted to open up linked, or inter-netted, with others on an equal basis. Since then major providers have moved into positions of power and the equality of peering has pretty much gone. Small providers often have to club together or pass through multiple hands to reach a universal audience, so a small number of US-based infrastructure items have become critical to us as well as to Americans.

Casting your mind back to the autumn of 2008, you might remember a few days when odd things happened on the internet: you could Skype some people and not others, or reach some websites but not others, while your friends and colleagues experienced the same but with completely different sites and services affected. This was caused by a spat between Sprint and Cogent in the US, where Sprint decided to shut down its peering relationship with Cogent (see here for a reminder).

Because peering is no longer egalitarian, a significant amount of UK internet traffic needed to pass through this peering point in order for UK internet users to reach UK services; that’s why you could Skype some people and not others, why some websites disappeared but not others, and why the direction in which your traffic passed through the peering point determined which services you could access.

I’m quite sure that proposals passing through the US legislature will have more safeguards than those in Egypt, and I’m sure the US President would act more calmly than Mubarak has, but surely our national security should be in our own hands?

While our diplomats should be ensuring we have assurances and safeguards as the law passes through Capitol Hill, we should also use this time as a moment of reflection, to make sure we have an internet that we can always use and that can’t be impacted by the decisions of others far away and beyond our control.

Broadcast Evolution

I’ve just come back from the Broadcast Evolution Summit, a three-day conference exploring how the world of TV and broadcasting is changing, where I was given an opportunity to present some ideas on how the changes in next generation broadband infrastructures will create new opportunities and business models for content providers. It’s impossible to turn down an opportunity to speak at a conference when it’s in Cannes just as you’re getting over the winter blues, but this was a very good event.

One of the most striking things, which really ought to have crossed my mind before, was that the broadcast and internet industries are such different beasts; at times it felt a little like we were looking at each other through a distorting lens, each with perfectly rational views of the other but very slightly wrong in important ways. This conference was the first opportunity I’d had to meet, listen and chew over the challenges and changes the broadcast industry is facing, and fascinating it was. I plan to start a mini-series of blog entries about some of the speakers and the impact they’ve had on my thinking, and some of the ideas which came from listening to an industry undergoing change.

One thing was clear though: this industry is facing many of the same pressures the music industry is facing – new technologies and ways of consuming media disrupting the ordered and well-established way of doing things. However, there was much less of the Feargal Sharkey position, resisting change and arguing others should prop up their way of life, and much more of the Billy Bragg world view that people will still want to watch quality content, albeit in very different ways, and that although they haven’t got all the answers just yet there is a willingness to collaborate and learn so their industry has a long and successful future.

As I collect my thoughts – and there were lots – the first is that this kind of event, where two industries get to explore their similarities and opportunities, is a welcome addition to the European conference circuit. If there were a music industry version of it I’d be there like a shot!

Where’s the demand for “superfast” broadband?

As we start to debate what “superfast” might mean in a broadband context, too often people return to the bigger “why” questions: Why do we need to invest at all? Where’s the demand?

The technology sector is an example of where Say’s economic principles tend to trump Adam Smith’s: intelligent supply tends to find its own demand, rather than proven demand being the overriding precondition for creating supply, as Smith asserted. How could we know we all wanted Skype before we had the infrastructure on which someone could develop it, or that YouTube would become the second biggest search engine after Google?

A common rebuttal is that this is dangerous “field of dreams”, “dot-com bubble” thinking, but that would be to oversimplify things, although it must be acknowledged that daft investments are rife in the sector, nowhere more than in the UK; even today, try saying “boo” to a venture capitalist if you don’t believe me.

Clearly not all things create demand, which is why we need smart people thinking about this – smart innovators, and not the kind of daft people who genuinely thought that the rules which govern the economics of the internet were fundamentally different from the economics of everything else – re-read “Blown to Bits” by Evans and Wurster if you need a refresher.

Smart people in the technology sector come in two flavours – technically smart and commercially smart – and it’s really important never to mix them up!!

Technically smart people innovate incrementally – eureka moments rarely happen in reality, with most breakthroughs coming from close collaborations and earlier research. Those collaborations are not just with their peers but also with commercial organisations that have the imagination to drive innovation but who need a market to address. IBM, for example, has announced a major new R&D facility in Australia, and the Government’s National Broadband Network is cited as a major influencing factor; IBM can see that the mere existence of the NBN will lead to new opportunities created by innovators free to imagine and with the tools to invest. The R&D money will seek out places with the imagination and the market. For a country which can’t make that case, the likely result is a slow, lingering death.

Commercially smart people, on the other hand, don’t just assume that supply will find its own demand unaided; they reach out to the market to understand how it might be most readily consumed. This is arguably the most critical factor separating successful fibre projects from those which fail.

In the UK we still present superfast broadband as just that – a very quick pipe to the internet. This kind of message will appeal to perhaps 10% of society, the early adopters, but the rest of the market is likely to wait until someone delivers a service which captures their imagination – the next Skype or YouTube. So when BT and Virgin say superfast broadband is a risky investment which has so far demonstrated low take-up levels, no-one should be surprised – they’ve not given anyone a reason to buy it.

Contrast this with some of the more successful European projects, where take-up can be north of 80%. A common factor among these is a commercial message that engages everyone and not just the early adopters – they give all kinds of people reasons to take a service. It could be healthcare, better or more local TV, gaming, education – any number of things which smart commercial people, left to imagine, ought to be able to conjure up. In fact one of the most successful projects I’ve visited makes a point of never mentioning bandwidth and rarely mentions the internet.

The UK is one of the most technically adept and gadget-conscious markets in the world. So if someone tells you there is no demand for superfast broadband, look at them with pity and move on – they don’t have the imagination!

Boosting the funnel

It was reported this week that a group of British scientists at Southampton University have developed a technique for keeping the light in fibre-optic cables nice and tidy and in sync. I thought I’d write a short blog on it because the importance of the discovery seems to have been missed by some commentators.

For my purposes, the internet is like a giant funnel: lots of stuff is poured in at the top at ever higher rates, into narrower and narrower pipes the further we get from home.

We are now pouring more in the top than ever before, which means we need to make sure the neck of the funnel doesn’t become the problem.

One solution is to use a leaky bucket – the genuine name given to the techniques which lie behind many traffic shaping tools – but that doesn’t solve the problem, it merely optimises the experience for services squeezed by the neck of the funnel (not that that’s necessarily a bad thing).
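For anyone who hasn’t met the leaky bucket, here is a minimal sketch in Python (the rates and sizes are illustrative): arrivals fill the bucket, which drains at a constant rate, so bursts beyond the bucket’s depth are dropped or delayed rather than being forced through the neck of the funnel.

```python
import time

class LeakyBucket:
    """Minimal leaky-bucket shaper: arrivals fill the bucket, which
    drains at a constant rate; arrivals that would overflow are dropped.
    Parameters are illustrative, not from any real shaping tool."""

    def __init__(self, capacity_bytes: float, drain_rate_bps: float):
        self.capacity = capacity_bytes
        self.drain_rate = drain_rate_bps / 8  # bytes per second
        self.level = 0.0
        self.last = time.monotonic()

    def offer(self, packet_bytes: int) -> bool:
        """Return True if the packet conforms (is forwarded)."""
        now = time.monotonic()
        # Drain the bucket for the time elapsed since the last packet.
        self.level = max(0.0, self.level - (now - self.last) * self.drain_rate)
        self.last = now
        if self.level + packet_bytes > self.capacity:
            return False  # burst exceeds the bucket: drop (or queue)
        self.level += packet_bytes
        return True

# A 1 Mbps shaper with room for a 15 kB burst.
bucket = LeakyBucket(capacity_bytes=15_000, drain_rate_bps=1_000_000)
sent = sum(bucket.offer(1500) for _ in range(20))  # a 20-packet burst
print(f"{sent} of 20 packets conformed")  # the rest exceed the burst depth
```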

Increasing bandwidth over short distances is easy but extending it over long distances is more problematic; we saw this in first generation broadband and laser light is no different. But, and this is a big but, as we move towards next generation access networks, with the speeds already being deployed around Europe, the pressure on long-haul inter-city and international links will become immense. Delivering 100 Gbps is challenging over transatlantic distances, and that’s only a hundred customers with gigabit broadband watching quad-HD 3D movies.

If we reach gigabit speeds in the home then rest assured the core will soon need terabit speeds. Delivering such bandwidth over tens of kilometres can be demonstrated, but not over hundreds or thousands – not in a single channel of usable bandwidth.
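The back-of-the-envelope sum makes the point (the subscriber count in the second step is illustrative):

```python
# The post's own figure: 100 homes at 1 Gbps each fill a 100 Gbps link.
per_home_gbps = 1
homes = 100
print(homes * per_home_gbps, "Gbps")  # 100 Gbps: one long-haul link filled

# Scale to an illustrative 10,000 homes all streaming at gigabit rates and
# the core needs terabits. Real contention ratios soften the numbers, but
# not the direction of travel.
homes = 10_000
print(homes * per_home_gbps / 1_000, "Tbps")  # 10.0 Tbps
```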

And here’s the problem. Fibre-optic cables are now so fine there isn’t much room for a beam of light to bounce off the wall of the fibre; so much so that over relatively short distances the effect is tiny and the signal emerges at the far end unscathed – but over long distances even small levels of bouncing around add up, corrupting the signal.
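A toy simulation shows how a per-kilometre effect that is invisible over a campus link becomes fatal over an ocean (every number below is invented for illustration; the real physics is far richer):

```python
import random

# Toy model of the effect described above: each kilometre adds a tiny
# random spread to a pulse's arrival time. Negligible over a campus,
# fatal over an ocean. Every number is invented for illustration.
BIT_PERIOD_PS = 10        # one bit at 100 Gbps lasts about 10 picoseconds
SPREAD_PER_KM_PS = 0.05   # assumed worst-case timing spread per kilometre

def pulse_spread_ps(km: int) -> float:
    """Accumulated timing spread after `km` kilometres of fibre."""
    return sum(random.uniform(0, SPREAD_PER_KM_PS) for _ in range(km))

for km in (10, 100, 6_000):  # campus, inter-city, transatlantic
    spread = pulse_spread_ps(km)
    verdict = "fine" if spread < BIT_PERIOD_PS else "corrupted"
    print(f"{km:>5} km: spread {spread:6.1f} ps -> {verdict}")
```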

The developments announced this week are aimed at correcting the bouncing and corruption over distance, paving the way for terabit speeds across the ocean so our gigabit-connected homes can still watch Hollywood/Bollywood films on our new 42″ quad-HD 3D TVs.

The whole space of photonics – the boundary where electronics meets light – is one which will move centre stage as we try to manage the funnel. Delivering high speeds to people’s homes is technically easy, but ensuring there is the intelligence and scale in the rest of the network to match is what will frame the problem. Visionaries like the people at Southampton University, and others like InTune Networks with their work on switching tuneable lasers, may not make good dinner party talk but they will be the people who ensure the future Internet keeps up with the uses imagination puts it to.

Ambition is the new agenda

Last Thursday I attended the Government’s Industry Day, where they laid out their key policy framework and work programme for broadband and the internet. If you hung around just long enough to hear Jeremy Hunt, Ed Vaizey and Caroline Spelman speak, with only one ear on what was being said while you rushed to submit your copy, you might be forgiven for thinking this was another platform for the new government to blame the old for a delay in delivering on a promise – BUT you’d be VERY wrong.

Before the election two phrases kept cropping up – “We’re in this together” and “Big Society”. For me, Thursday’s event was possibly the first time I’d seen a concrete example of what that meant in real terms. What was announced wasn’t a policy which handed large sums of money to a semi-state organisation to prescribe from on high how better broadband would be delivered. Instead we heard from Ministers explaining what their role was in defining and delivering the future, what we could reasonably expect from central Government, and what needed to come from others.

We heard how the Government will remove barriers to investment and create the structures necessary to support local communities in defining their own broadband futures, and how industry would be encouraged to support that process, enabling a smart division of skills that could solve all but the most intractable of broadband problems.

And we heard from a Minister with a vision of 50 Mbps symmetrical services reaching most people by the end of this parliament, delivered by the combined efforts of Government, industry and communities. I suspect that sent a few shivers through Whitehall but knowing the people involved I’m sure they are universally excited by the challenge.

Starting immediately is a month-long consultation seeking paper solutions to three paper broadband problems. These will be used to shape the Government’s support programmes, ensuring both commercial and community organisations receive the right kind of support in the right manner. At the same time, the English regions and the devolved assemblies are each being asked to construct a long-list of areas they want to benefit from next generation broadband. From this, Broadband Delivery UK (BD-UK) will announce the location of three real market testing projects in September and begin a tendering process to find the right mix of commercial and community players to make them a reality. From these projects they aim to learn about the impact of state aid, forms of broadband registration and demand stimulation, and infrastructure sharing open access models.

While this is going on, BD-UK will be negotiating with the EU towards a national state aid agreement which, for the first time since dial-up modems were in short trousers, will provide clear guidance to local authorities on what they can and can’t do. State Aid legislation has been a bigger block to UK investment in broadband than almost any other, with state-sponsored projects crippled by fear of challenge or paralysed by years of rulings before they can begin work. The first roadblock gone – and with it, a new process will be in place to unlock the public networks which already reach many of our most remote communities.

Secondly, work will push ahead on infrastructure sharing, including the opening up of BT’s ducts as well as other assets like sewers and culverts. This is a knotty problem and not a panacea, but it is an important element in making the UK an easier place to invest in. Second roadblock going.

With all this work hopefully complete – the lessons from the market testing projects learnt, infrastructure opened up, and state aid put to bed – the Government will announce the main programme of work next year to support local delivery of super-fast broadband, supported by what they termed “mid-level aggregation” to make it easier for service providers to link to homes and businesses. This time next year we will be well prepared for the main challenge ahead.

Did I hear all the answers on Thursday? No.
Does that worry me? Quite the opposite – I’m relieved!
Am I excited? Absolutely!

For the first time in a long while, ambition is back on the agenda. Whether we actually achieve at least 50 Mbps symmetrically to every corner of the UK doesn’t matter nearly so much as the way it will change the shape and aspirations of an industry, and the people and businesses that it serves. The journey matters as much as the arriving, and we are on our way.

 