2009 Year-End Wrap-Up


First, I would like to send best wishes for the best possible holiday season to each of you, along with a wish that the next year will be happy, safe, and profitable for you.

I recently wrote a wrap-up of 2009 for FierceWireless and mentioned a few things I thought had an impact on wireless during the year. What I presented as being important—the Apple iPhone 3GS, the new FCC, the beginnings of net neutrality, cloud computing, broadband stimulus funding, TV white space unlicensed spectrum, WiMAX and LTE, smartphones, app stores, ebooks, and more—were certainly prominent in 2009, but they were conceived of much earlier and will carry forward into 2010 and beyond. The only reason most of us write about technology at the end of one year and the beginning of the next is that it is expected of us, most likely because the new year represents new beginnings.

What Happened

As I reviewed my FierceWireless column to prepare to write this final COMMENTARY of 2009, I thought of other things that had happened and things that should have happened but did not. The most significant happening was something I view as a negative for the wireless industry. This was the first year of the new administration, a new FCC, and many new people at various levels within the government. It was a year when people and companies not directly involved with wireless effectively lobbied all parts of the government concerning wireless and how it should be, according to them, rather than how it is and how it must be due to certain laws of physics.

We had already lost some ground to these people in 2007 and 2008. Google was partially successful in its push to have some of the 700-MHz spectrum auctioned with open access requirements. Google and others fought hard and won more unlicensed spectrum known as TV White Space (which is basically unusable in urban areas, but they did not know that when they went after it). The wireless industry lost even more ground to the wired Internet industry in 2009. The new FCC Chairman announced his plans for a net neutrality ruling and the FCC published documents and asked for feedback. The FCC also said that net neutrality might be treated differently for wireless—the “might” depends upon whom you listen to within the FCC. I am not sure, exactly, what net neutrality is or what it is supposed to stop, even though I have read the FCC’s postings a number of times along with many of the comments that have been filed.

No matter what you use as an on-ramp to the Internet, at some point that on-ramp can and will become congested. Add to that the fact that every network has data-hogs—people who use their Internet connection to move lots of data. For example, one “home” served by a cable provider was running 3 or 4 servers and providing large amounts of video and other content across the web, all while the people next door were trying to access the Internet to read their mail. AT&T recently stated that about 3% of its smartphone customers account for 40% of the data usage. I have to ask how, exactly, net neutrality will fix these problems. I believe it could actually make things worse, especially in the wireless world.

Today, at least on the wired/cable side of the business, price points are used to determine how much data speed a customer is entitled to. If you pay the lowest price, you get the slowest speed. However, especially in a cable environment where the total bandwidth is shared in a neighborhood, the more people who are on the system, the slower it is for everyone. Isn’t that fair and as it should be? Systems only have so much capacity. If one user is on the system, he or she can have most of it. If the system is then shared by 2, 6, 8, or 10 people, each experiences the same speed and access. Normally, the bandwidth and data speeds seem fine since not everyone on the same on-ramp is using data at the same time. One of the advantages of packet systems is that packets are intermingled and, in normal circumstances, short-lived slowdowns are not even noticed. However, one or more neighbors downloading large video files can have an impact on your access.

This becomes even more pronounced with wireless where capacity is available on a cell sector basis. If you are the only person in the cell sector, again, you have most of the bandwidth (some is reserved for others coming into the cell). But as more people join you in the sector, everyone’s data speeds are reduced and should be the same for everyone. This assumes that the network is being managed, that it is a smart network, and that the network operator is able to manage the connections to the benefit of all of its users, not a selected few. In other words, whoever gets to the bandwidth first should not be permitted to hog it at the expense of others who come on later. I, for one, don’t see how net neutrality will ensure that this doesn’t happen.
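The equal-sharing arithmetic described above can be sketched in a few lines; the capacity figure here is purely hypothetical, chosen only to show how per-user throughput falls as more people become active on the same cable node or cell sector:

```python
# Minimal sketch (hypothetical numbers): how equal sharing of a
# fixed-capacity segment divides per-user throughput as more users
# become active at the same time.

def fair_share_mbps(total_capacity_mbps: float, active_users: int) -> float:
    """Equal share of a shared segment's capacity for each active user."""
    if active_users < 1:
        raise ValueError("need at least one active user")
    return total_capacity_mbps / active_users

SECTOR_CAPACITY = 20.0  # assumed capacity of one cell sector or cable node, Mbps

for users in (1, 2, 6, 8, 10):
    share = fair_share_mbps(SECTOR_CAPACITY, users)
    print(f"{users:2d} active users -> {share:5.2f} Mbps each")
```

A single user sees nearly the whole pipe; ten simultaneous users each see a tenth of it, which is exactly the managed, everyone-gets-the-same-share outcome the column argues for.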

I think the answer is Quality of Service with varying pricing levels for varying amounts of data and peak rate pricing. If it is 2 in the afternoon and you want to download a very large file, it might cost a few extra dollars, but if you defer the download until 2 am, it could be free. The market and networks should be able to be managed to smooth out network loading so all customers can have equal access. If net neutrality ends up restricting operators’ ability to properly manage their networks and limit the amount of data consumed by a single customer to the exclusion of others, it will make the situation worse. The FCC began moving on its plan in 2009, but 2010 will be the critical year.
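The peak-versus-off-peak pricing idea above amounts to a simple rate table; the hours and per-gigabyte rate here are invented for illustration, not anything an operator has published:

```python
# Minimal sketch (hypothetical hours and rates) of time-of-day pricing:
# a large download carries a surcharge during peak hours but is free overnight.

def download_surcharge(hour_of_day: int, file_gb: float,
                       peak_rate_per_gb: float = 0.50) -> float:
    """Surcharge in dollars; hours 8-22 count as 'peak' (assumed window)."""
    is_peak = 8 <= hour_of_day < 22
    return round(file_gb * peak_rate_per_gb, 2) if is_peak else 0.0

print(download_surcharge(14, 10.0))  # 2 pm, 10 GB download -> 5.0
print(download_surcharge(2, 10.0))   # 2 am, same download  -> 0.0
```

The point of the sketch is the shape of the incentive, not the numbers: deferring heavy traffic to off-peak hours smooths network loading without anyone being cut off.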

Those pushing for net neutrality, both wired and wireless, do not seem to understand that every single method of access to and from the Internet has a finite amount of capacity and, at some point along the route, it will become shared capacity. If it is shared, it should be shared equally. Network management is the way to make sure this is the case.

I have often wondered why the cable company that serves New York City was not permitted to rein in the few individuals who were abusing the system by using much more data than the average user when satellite Internet providers have been throttling back customers for many years. When using a satellite system, download speeds usually stay the same, but upload speeds are throttled back when the uplink is used for an extended period of time to send or receive large files.
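The satellite-style policy just described is essentially a usage threshold over a recent window; the threshold and speeds below are assumed values for illustration, not any provider's actual fair-access policy:

```python
# Minimal sketch (hypothetical thresholds) of a satellite-style throttle:
# sustained heavy uplink use over a recent window triggers a reduced upload
# speed, while download speed is left alone.

def upload_speed_kbps(bytes_sent_last_hour: int,
                      normal_kbps: int = 256,
                      throttled_kbps: int = 64,
                      threshold_bytes: int = 200_000_000) -> int:
    """Return the allowed upload speed given recent usage (assumed limits)."""
    if bytes_sent_last_hour > threshold_bytes:
        return throttled_kbps  # heavy recent use: throttle the uplink
    return normal_kbps         # normal use: full speed

print(upload_speed_kbps(10_000_000))   # light user  -> 256
print(upload_speed_kbps(250_000_000))  # heavy user  -> 64
```

Once the heavy user's recent usage drops back below the threshold, full speed returns automatically, which is what makes this a management tool rather than a penalty.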

The net neutrality issue should be of concern to both wired and wireless companies alike. To me, the key here is that those wanting to make the rules do not have a clear understanding of bandwidth limitations and capacity. Some education for these people is probably in order, especially since some of them seem to be under the impression that both the Internet and the wireless networks will continue to keep pace no matter how fast data traffic increases. This impression is not based on facts, which is part of the problem. The idea of net neutrality came about many years ago, but now it is in the limelight and I hope it does not rear its ugly head in 2010.

The fact remains that the wireless community is losing ground in the battle for the government mindset. If you spend any time reading the comments on the FCC site (www.broadband.gov), you will find a lot of complaining about wireless companies ripping off consumers. It appears as though no amount of logic or real data (which the new FCC says will rule its decision-making process) will convince those who want to complain that the United States has some of the lowest voice and data rates and that data rationing is already being instituted in other parts of the world. The real problem, of course, is that these types of sites only attract those who don’t like something. Rarely does anyone who has something positive to say add to those comments, so the result is a very one-sided view of the wireless broadband world. Again, education is sorely needed. The CTIA is doing some things to address this need, but the industry itself needs to become more proactive.

What Did Not Happen?

The biggest non-happening of 2009 was the absolute lack of progress on the much-needed public safety broadband system. The D Block should have been turned over to the Public Safety Spectrum Trust, funding should have been made available, the waivers for cities and regions that want to build out pilot systems should all have been approved, and the public safety community, working with commercial networks, should have begun putting their systems in the ground. It has been a long time since Katrina and longer still since 9/11. I wonder what it will take to move this forward. It seems a bit lopsided to me that we can provide fiber for a farmer in rural America but we cannot provide a broadband network for the public safety community, even in urban areas.

Considering the world economic situation in 2009, I think we should all be thankful that the wireless industry survived and, in some cases, thrived this past year. If the economy is improving, and I hope it is, we should see even more exciting progress in 2010, but we all need to redouble our efforts to help neophytes understand the nuances of wireless services, and especially broadband. There are too many self-appointed experts making pronouncements about this industry when they have neither all of the facts nor knowledge of spectrum usage and the laws of physics that govern the limits to what can be done with a finite amount of spectrum. I hope 2010 sees some more reasonable give and take between the various parties and that everyone keeps in mind that at the end of the day, customers and their wallets will have the final word.

Andrew M. Seybold
