
Verizon: Open Access and LTE

Friday, November 30, 2007
 

Verizon made two announcements this week that will have an impact on the world of wireless. The first was that by the end of 2008, Verizon Wireless will "provide customers the option to use, on its nationwide wireless network, wireless devices, software and applications not offered by the company." This announcement is being hailed as a "win" for the Internet community.

 

The second announcement is that Verizon Wireless will use LTE (Long Term Evolution) for its next generation instead of UMB (Ultra Mobile Broadband). This announcement is seen by the GSM/UMTS/LTE community as Verizon Wireless breaking with the CDMA community (CDMA2000 1X, EV-DO and UMB) and moving closer to its part owner, Vodafone, for next-generation services deployment.

 

The open access announcement is interesting, but devices will still have to meet the minimum technical standard set by the FCC and Verizon in order to qualify for activation on the network. This makes sense: when opening up their networks, operators' main concern is to keep out devices that could harm the network or cause problems for other customers. Verizon will be using its "state-of-the-art" testing lab to verify the proper operation of these devices.

 

At a recent WCA panel session held in Palo Alto, the Google representative stated that Google's Android operating system would work "above" the wireless interface layer and have no access to the software and hardware that controls the phone on the network. I am certain that confining the operating system and the device's capabilities in this way, isolated from the network technology, is the right approach to making devices "good citizens" on the network. As for software and applications, we will have to wait and see what specifications Verizon publishes.

 

Getting back to Verizon's next-generation technology choice: first, this will have little or no effect on Qualcomm since it holds intellectual property for both flavors of next-generation technology. Further, Verizon has stated that it will continue to run and expand its existing CDMA network and use LTE only on its AWS spectrum (mostly on the East Coast) and on whatever spectrum it wins in the 700-MHz auction, "if Verizon decides to bid," according to the Verizon Wireless analyst call on Wednesday.

 

Will this spell the end for UMB, the CDMA version of the next-generation wireless technology? I have seen a number of reports jumping to this conclusion, but I am not so sure. I believe that the future of wireless is about multiple-technology-capable chipsets in wireless devices, and that both LTE and UMB have a place moving forward. However, neither of these announcements was surprising. What I think the analysts and the press missed is this―

 

Dick Lynch, Executive Vice President and Chief Technology Officer for Verizon, made some statements that I think are the most important part of both of these announcements. First, Verizon is much more concerned about bandwidth issues than almost any other network operator in the world, and I happen to believe that its concerns are well founded, even with next-generation technology on tap for 2010.

 

On the analyst call, Dick answered one question by talking about use-based pricing: When you open a network, or when you sell data on an all-you-can-eat basis, how do you provide fair and equal access to all of your customers? I see this as a big issue and one that must be addressed because of the limitations of wireless bandwidth. One way is to make use of Quality of Service (QoS), selling data services at several different data rates and charging differently for them. This is already standard practice for both DSL and cable access. However, it still does not address the issue of the data hog who streams video for hours on end.

 

Dick's answer is something I believe will become the norm in wireless data pricing moving forward: You pay for what you use. If you use wireless data services to read and answer a dozen or so emails a day and perhaps visit a website or two, you will pay less for data than someone who is downloading a movie. If you are a typical customer, you will pay typical prices. However, if you really need that 50 MB file right now, you will be notified of additional charges and they will be added to your bill. If, on the other hand, you can wait until 2 a.m. when data traffic is much lighter, the cost to download that file will be lower.
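As a rough sketch of how such use-based, time-of-day pricing might be computed (the rates, the monthly allowance, and the off-peak treatment below are purely illustrative assumptions on my part, not anything Verizon has announced):

```python
# Illustrative use-based pricing with an off-peak discount.
# All rates, the allowance, and the peak/off-peak split are hypothetical.

PEAK_RATE_PER_MB = 0.05      # dollars per MB during busy hours (assumed)
OFF_PEAK_RATE_PER_MB = 0.01  # dollars per MB overnight (assumed)
INCLUDED_MB = 100            # monthly allowance covered by the base plan (assumed)

def monthly_data_charge(peak_mb: float, off_peak_mb: float) -> float:
    """Bill only usage beyond the allowance, applying the allowance
    to peak traffic first so shifted off-peak downloads stay cheap."""
    allowance = INCLUDED_MB
    covered_peak = min(peak_mb, allowance)
    allowance -= covered_peak
    covered_off_peak = min(off_peak_mb, allowance)
    billable_peak = peak_mb - covered_peak
    billable_off_peak = off_peak_mb - covered_off_peak
    return round(billable_peak * PEAK_RATE_PER_MB
                 + billable_off_peak * OFF_PEAK_RATE_PER_MB, 2)

# A light email-and-web user stays inside the allowance and pays nothing extra;
# grabbing a 50 MB file mid-afternoon costs more than waiting until 2 a.m.
print(monthly_data_charge(peak_mb=80, off_peak_mb=0))    # 0.0
print(monthly_data_charge(peak_mb=130, off_peak_mb=0))   # 1.5
print(monthly_data_charge(peak_mb=80, off_peak_mb=50))   # 0.3
```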

 

Market pricing is not only necessary to help keep bandwidth available, it is the fairest way to make sure that the bandwidth we do have is shared among all customers. According to a recent report in Broadband Properties Magazine, Internet traffic grew at a rate of 57% last year and Internet bandwidth grew 68%. There is no way that I know of to increase wireless bandwidth in the United States by 68% in five years, let alone one.
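To put those percentages in perspective, here is some back-of-the-envelope arithmetic of my own (these are not figures from the magazine):

```python
# Back-of-the-envelope arithmetic on the growth figure cited above.
# The 68% figure is the reported one-year growth in wired Internet bandwidth.

bandwidth_growth = 0.68

# Spreading 68% growth across five years is a compound annual rate of about 10.9%.
annual_rate = (1 + bandwidth_growth) ** (1 / 5) - 1
print(f"68% over five years ~= {annual_rate:.1%} per year")

# Matching 68% growth *every* year for five years would mean roughly 13x the capacity.
five_year_multiple = (1 + bandwidth_growth) ** 5
print(f"68% per year for five years ~= {five_year_multiple:.1f}x today's capacity")
```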

 

I think that open access within limits is good for all, but by "all" I mean all of the customers on a given wireless network, not only the 10% who believe they should have totally unrestricted access and who don't have any consideration for the rest of us as long as they get theirs!

 

COMMENTS:

Brian Taylor - 12/01/2007 11:22:35

How nice. Bandwidth for those who can afford it, and everyone else can suck eggs. Yes, Mr. Gottrocks - you may use the highway. All others, please stick to the cow path.

I can't help but notice that by using the particular example you chose to support this antiquated view, you deliberately trivialized the character of "every man's" need for speed - i.e., all he wants is a movie. He is not, for instance, an impoverished college student conducting needed research in order to complete an assignment. He is not, for instance, a struggling independent consultant seeking to upload a critical file to his first client. She is not a mother frantically looking for information about a household poison.

There are much better, and much more egalitarian, pricing models for bandwidth. Why don't you root around in the history of telephone services, newspapers and other forms of media to see if you can't discover what they might be?

Andrew Seybold - 12/01/2007 11:38:36

Brian--thanks for the comments--and yes, there are probably better ways, but the issue here is that we do not have unlimited bandwidth when using a wireless network. Let's take your example and move it to a Starbucks with a single access point. The back-haul is a T-1 with a total bandwidth of 1.544 Mbps. If you are the only person making use of the access point in the Starbucks, you get it all. If, however, there are 10 of you, including an impoverished student or one of the others you mentioned, you share the total bandwidth but you all get some of it. Now, if three of the 10 people start downloading streaming video, the other 7 are left with very little, if any, bandwidth. Now, if you move that model to a cell sector (120 degrees) and calculate the total area covered by that sector out to, say, 3 miles, there is the potential for many more customers at any given time. Without some form of management and/or a way to provide for equal access, the teen who is downloading movie after movie essentially precludes a decent speed for others in that same cell sector--including a mother frantically looking for information about a household poison.
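(For the numerically inclined, here is the rough arithmetic behind that access-point example; the per-stream video rate is my own assumption, used only to make the division concrete.)

```python
# Rough arithmetic for the shared T-1 example above.
# The per-stream video rate is an assumed, illustrative figure.

T1_MBPS = 1.544          # usable bandwidth of a single T-1 back-haul
USERS = 10
STREAMERS = 3
STREAM_RATE_MBPS = 0.4   # assumed rate of one streaming-video session

print(f"Alone on the access point: {T1_MBPS:.3f} Mbps")
print(f"Shared evenly by {USERS} users: {T1_MBPS / USERS:.3f} Mbps each")

# If three users stream video, the rest split whatever is left over.
leftover = max(T1_MBPS - STREAMERS * STREAM_RATE_MBPS, 0)
others = USERS - STREAMERS
print(f"With {STREAMERS} streams running: {leftover / others:.3f} Mbps for each of the other {others}")
```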
One of the reasons that Muni-Wi-Fi systems are failing is that their shared, unlicensed bandwidth is getting crowded and the networks are NOT mission critical.
I am not suggesting that pricing will solve all of the bandwidth issues, but I am suggesting that it is one way of handling the data hogs who are out there. Even the cable companies are having to enforce bandwidth limits because their cable systems are also shared bandwidth, and they have a lot more bandwidth than any one wireless network operator does.
Quality of Service will be important. If you pay for 786 down and 586 up, for example, the ONLY way to make sure you get that amount of bandwidth all of the time is to manage the shared bandwidth over a wireless network--the laws of physics cannot be broken. They have been bent, but even the new technologies coming on-line in 2010 do not break Shannon's law--perhaps while I am rooting around looking for other models you might want to visit a site which explains Shannon's law.
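(For reference, Shannon's law bounds a channel's capacity by its bandwidth and signal-to-noise ratio; the channel width and SNR below are arbitrary example numbers, not figures for any particular network.)

```python
# Shannon-Hartley limit: C = B * log2(1 + SNR)
# The bandwidth and SNR below are arbitrary illustrative inputs.
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Theoretical maximum error-free bit rate of a channel."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Example: a 5 MHz channel at 10 dB SNR (linear SNR = 10)
capacity = shannon_capacity_bps(5e6, 10.0)
print(f"{capacity / 1e6:.1f} Mbps")  # about 17.3 Mbps; no modulation scheme can do better
```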

Best regards

Andy

Brian Taylor - 12/01/2007 11:43:26

Incidentally, here's a couple of hints, because I don't wish to make unfair assumptions about your knowledge of media pricing models:

1. Network users have little or no control over what bandwidth they use when downloading, which is the majority of what they do. CONTENT PROVIDERS have the greatest control over file size and optimization. Just a few minutes ago, I accessed a PDF file via DSL. There was no notation of the file size. This file loaded...and loaded...and loaded...I thought the article must be at least 50 pages long. Nope - it was 7 pages long, and included two HUGE, unnecessarily high-res graphics. So, I should pay for that? You start charging me bandwidth fees and I will stop accessing many files unless I know in advance how big they are AND unless they are important enough for me to also calculate whether they will push me over into the "bandwidth penalty column". This includes your column, incidentally.

2. CONTENT PROVIDERS are the primary economic beneficiaries of bandwidth in the new world of Web 2.0, just as they are in the world of newspapers and television - not the users.

3. The most important phenomenon in the e-world today is social networking, which can be bandwidth-intensive. Like networks of every kind, the value of such networks is almost entirely proportional to the number of "nodes" on the network. Now, just stop for a minute, and think about overlaying this network - worldwide - with your pricing model. It amounts to nothing less repulsive than a de facto caste system.

I think you can take it from there....

Brian Taylor - 12/01/2007 11:52:31

Thanks, Andy, but I cut my teeth on Shannon and Erlang while many who might read this were kicking the slats out of their cribs.

The problem is, you go off track before you leave the station. Why should having "unlimited bandwidth" be any sort of criterion? The history of bandwidth over the past 20 years clearly demonstrates how wrong-headed it is to start with an existing finite "bucket" of bandwidth and try to calculate how many glasses you can fill in order to know what you should charge by the glass. You are much better off, in fact, to start from the opposite assumption: That bandwidth is an infinitely elastic commodity. While theoretically untrue, it is that assumption that is nearer to the truth than thinking of it as x bushels of corn in a silo.

Andrew Seybold - 12/01/2007 11:57:26

Brian--interesting discussion. However, using your method, and with your understanding of wireless, you do not take into account the fact that building a cell site in today's environment takes about 3 years in most areas. Yes, you can add pico, micro, and even femto cells and increase the back-haul costs, but GETTING more bandwidth when and where it is needed is not as simple as ordering up a new T1 or FiOS connection, as it is in the Internet world. There are only a few ways to add capacity to a wireless network, as you know, each taking a long period of time. So I disagree that bandwidth is an infinitely elastic commodity, unless you add time to the equation.

Brian Taylor - 12/01/2007 12:08:11

Technology is a funny thing, Andy, and I have great faith in the power of human ingenuity. We have not seen the best compression algorithm yet, for instance. I would frankly rather put carriers out at the edge of profitability possibilities, and thereby create the impetus for technological creativity, than to "satisfice" them. We saw what happened for decades with respect to communications technology when the pricing model "satisficed" AT&T.

Unfortunately, we must "add time" to the equation, Andy. Once in place, pricing models - and particularly those that are government-sanctioned - become entrenched and are then extraordinarily difficult to displace. We would, I remain convinced, be better off considering pricing models that reward carriers and content providers for creativity and responsibility than any models that in any way disenfranchise or differentiate among users who happen to be of different economic status. Whether I am rich or poor, I can walk up to any newsstand and buy a newspaper for 35 cents, although it has been calculated that the "bandwidth" of a typical daily newspaper is probably "worth" over 100 times that.



Bill Branan - 12/03/2007 13:44:11

Interesting that Brian correctly notes that bandwidth is an "infinitely elastic commodity", but then proceeds to argue for an infinitely inelastic pricing model.

David Boettger - 12/04/2007 01:36:57

"[T]his will have little or no effect on Qualcomm since it has intellectual property for both flavors of next-generation technology."

"I believe that the future of wireless is about multiple-technology-capable chipsets in wireless devices, and that both LTE and UMB have a place moving forward."

Andy, it seems you've fallen under the spell of the mighty Q. The writing has been on the wall for CDMA and its progeny for a long time now (I'm talking years, not months). UMB never had a chance. Qualcomm tried mightily but failed in having CDMA et al become the Windows of the cellular industry: a ubiquitous but still essentially proprietary "standard". The Verizon LTE announcement only confirmed what has been obvious for quite some time. Without Verizon (or Sprint) support in the US, and with the CDMA beachheads in Asia drifting toward 3GPP land, UMB is dead, plain and simple. Good riddance, I say; Qualcomm's "Don't worry about standardizing it; we'll sort it out in our ASICs" philosophy might have been fast, but it was neither robust nor extensible.

Qualcomm surely has its IP claws in WCDMA and HSPA, but LTE? I don't think so -- at least, not nearly as deep. This is seriously bad news over the long-term for Q. Sure, there'll be several (perhaps many) years during which CDMA-LTE dual-mode ASICs will be required, but after that -- once all of the CDMA spectrum is refarmed for LTE -- what is Qualcomm going to do? They're just another modem vendor at that point...

Brian Taylor - 12/04/2007 23:46:52

I can't quite get my arms around what you might mean by an "infinitely inelastic pricing model", Bill. At least, I don't recognize it in anything I said.

I used the 35-cent daily newspaper as an admittedly imperfect model of "egalitarian" bandwidth pricing. Perhaps I should elaborate.

1. Everyone pays 35 cents, but everyone also gets the same bandwidth.

2. As "news bandwidth need " goes, a single newspaper is sufficient in bandwidth to serve the news needs of most users. If all you need is the Times, fine. If you need the Post, too, you can pay for it, as well as the Tribune and the Herald and the...<whatever>. I certainly did not mean to imply otherwise.

3. User bandwidth pricing - even if you purchase a dozen papers - is subsidized by other streams of revenue. Just as with newspaper ads, it requires no flights of fancy to imagine what add-on services might comprise such streams for wireless carriers.

So, there's nothing "inelastic" about the pricing model.

My view of bandwidth is that it does not "belong" to the carriers, but to the people of this nation, just as do the radio and television frequencies along with the rest of the relevant EM spectrum. Granted, there is now subscription-based radio, and cable TV has forever been subscription-based, but isn't the main complaint with cable TV that keeps cropping up in government hearings the fact that it is pricing even basic bandwidth beyond the reach of lower-income people? One day, the cable TV industry will very likely find itself price-regulated if it doesn't wake up and smell the coffee. The history of price regulation shows that it is kind neither to competition nor to technological innovation, and we cannot allow this to happen in the wireless arena as well.

I believe we have done enough for carriers when we have placed them in a fairly noncompetitive position to create unique secondary streams of revenue beyond the bandwidth itself, and I have every confidence that these very smart fellers will figure out how to do just that. The basic pipe should be nice and fat, and should be subsidized by those who will use special services riding on top of the pipe.

Bill Branan - 12/06/2007 14:27:37

Well, I pretty much agree with you on the last two points.
Unfortunately, our representative government seems quite motivated to sell, rent, or give away a lot of spectrum based on political expediency.

I think the difference between the wireless carriers and the cable operators is that the cable operators still have essentially a geographic monopoly. Interesting to note that where competition exists, as between phone- and cable-based Internet access, prices continue to decline and are now based on peak available bandwidth...