Managing the Demand for Wireless Data
07.18.2011 by Andrew M. Seybold
As the demand for wireless broadband services continues to explode, network operators are building more capacity as fast as they can. However, they know that with their existing spectrum and even as they transition to fourth-generation broadband services they won’t be able to keep up with the demand. One advantage wireless network operators have over the Internet is that they can monitor the demand, make network adjustments, add more cell sites closer together to yield more capacity in a given area, and roll out faster and more spectrally efficient technologies. They can also make use of Wi-Fi and femtocells to off-load their main networks.
Adding more capacity means obtaining more spectrum, building cell sites closer together, or both. The issue with the first option is that to acquire more spectrum, wireless network operators must wait for the FCC to make more available, buy spectrum from a competitor, or merge with another network operator that also has spectrum. The second option, adding cell sites closer together, is something all of the network operators do on a regular basis. Neither option provides a quick fix to the capacity issues. New cell sites take months if not years to plan and build, and they must be approved by local planning commissions. Finding new sites in a large metro area is tough, and finding new sites in suburbia means dealing not only with the planning commission but also with residents who are asking for more capacity and more speed yet don’t want new cell sites built near them.
So these two options should be viewed as part of the long-term fix for capacity demand. However, the way demand is increasing, we will run out of capacity before more spectrum can be allocated or enough new cell sites can be built. One of the reasons AT&T wants to buy T-Mobile is that it would gain access to both additional spectrum and additional cell sites, and from that perspective the purchase makes a lot of sense (and it won’t hinder competition).
There is one more option using new technology available to network operators, and if I failed to mention it here I know I would hear from one of the cellular pioneers who is also a pioneer in the field of smart antennas. Smart antennas (which are not being deployed in large numbers except in LTE systems) are designed to track customers within a cell sector and, using a technique called beamforming, extend both the sector’s capacity and the distance customers can be from the cell site center while still having good data rates. The technology has been around for a number of years now, but for various reasons it has not gained wide acceptance in the industry, at least not yet.
Meanwhile, the FCC has promised to “find” an additional 300 MHz of spectrum suitable for broadband within five years and another 200 MHz in the five years after that. We cannot make spectrum; we can only reallocate what we have and use it more efficiently. Even if the FCC is successful, it will still be years before this spectrum can be placed into service. First the FCC must identify the spectrum, then figure out where to relocate existing users, and then put it out to auction. Only after all of that has been accomplished will the winners be able to build out the spectrum. That will take a few years, and then add more time to bring devices capable of using the spectrum to market. Once again, this is a long-term solution that won’t help in the short term.
To make matters worse, Credit Suisse just issued a report stating that today’s wireless broadband networks in the United States are already operating at 80% of their capacity. This not only has an impact on the availability of bandwidth for customers, it can also affect the ability of a network to hand off a data session from one cell sector to another. If the customer is mobile and moves from a cell sector that has capacity to one that does not, the session could be dropped or the data rate could be lowered. Data usage is at an all-time high and continues to grow. The same report indicates that in Europe the systems are at 65% of capacity and quickly moving toward 70%.
What then, is left for the network operators in the short term? How do they manage the bandwidth they have, how do they serve as many customers as possible with the best possible capacity and data speeds? Off-loading traffic from the wide-area network is one good way to distribute the broadband load, and it is being done today using Wi-Fi and femtocells (small in-building cell sites). Both methods really do help unload the wide-area networks since the backhaul for both Wi-Fi and femtocells is the customer’s own wired broadband connection. Wi-Fi or a femtocell in a home provides better indoor coverage and moves the data to and from the customer over the wired broadband connection, with only the last 100 feet traveling over a wireless link.
Only a few years ago, the major network operators did not want to consider using Wi-Fi because it is an unlicensed technology and they did not want their customers to move off the network. T-Mobile was the only network operator to embrace Wi-Fi early, and today thousands of Wi-Fi hotspots are seamlessly integrated into its network. AT&T and then Verizon finally decided that broadband demand was building so fast that Wi-Fi off-loading really did provide a way to help manage it. Today, both AT&T and Verizon off-load a lot of their broadband traffic whenever possible. This has been helped by the fact that most smartphones today include Wi-Fi capability as a standard feature, and there are two ways data can be off-loaded. In the case of the iPhone, for example, the device tells customers there is a Wi-Fi hotspot nearby and asks if they want to connect to it. In some cases the device then operates entirely independently of the wide-area network; in others the two networks are joined at the back end so services such as email and other information exchange can still be used.
The next level up is the type of system run by T-Mobile. In its case, anytime you enter an airport or other place where there is a Wi-Fi hotspot your device automatically moves over to the Wi-Fi network for both voice and data services. (The voice is not Voice over IP but GSM voice that is encoded using the UMA standard.) This automatically frees up the wide-area network in any area such as an airport where there is a high demand for both voice and data services.
There is a lot of interest in femtocells as well. A femtocell is like a Wi-Fi access point in your home or office, but it operates on the wide-area network’s spectrum. You connect it to your wired broadband service (DSL, cable, fiber) and once activated it is connected to the back end of the wide-area network. Femtocells are seen as one of the hot technologies this year because they do off-load the wide-area network, the customer usually pays for the femtocell, and it uses the customer’s wired Internet connection as backhaul to reach the operator’s network. Most femtocells offer both voice and data services, and most require the placement of a GPS antenna in order to meet the E911 mandate from the FCC.
There are a number of other ways to increase in-building coverage, including Distributed Antenna Systems, Bi-Directional Amplifiers, and similar technologies. However, because these systems must be within range of a donor cell sector, they improve in-building coverage without off-loading any traffic from the wide-area network.
All of the above options are being implemented by the network operators as they race to meet the surging demand for broadband services. However, in addition to the technology options they need other ways in which to balance the demand for broadband and the amount of broadband bandwidth they have available. Technology advancements and increases in capacity are long-term fixes while Wi-Fi and femtocells are immediate fixes—but only for a given home or building. What other options do the network operators have to try to balance the demand and capacity of their network?
The Answer Is Broadband Pricing
Only a few years ago it was commonplace for network operators to offer all-you-can-eat broadband data pricing models. This was done to encourage customers to embrace data services, and it worked very well. As speeds increased, devices became more broadband friendly and demand rose. Along with the flood of new customers attracted to wireless broadband has come a class of users known as “data hogs.” These customers use much more than their “share” of the capacity available on a given network. Only a year or so after AT&T and Apple launched the iPhone, AT&T reported that 4% of iPhone users were consuming 50% of its network’s capacity.
Now that broadband is an established form of mobile communications and all of the Internet companies have “discovered” wireless broadband, the demand is soaring and by necessity the network operators are finding, one by one, that they have to use data pricing to help manage the capacity on their networks. AT&T and Verizon have switched to tiered pricing levels based on “normal” data usage levels. The low tier is normally 2 GB of data per month and the high tier is 5 GB per month. These numbers are based on a lot of research by the network operators on data usage and they should be sufficient to meet most customers’ demand.
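As a concrete illustration of how a capped plan with overage fees works in practice, here is a minimal Python sketch; the 2 GB cap, base price, and per-GB overage rate are invented for the example and are not any operator’s actual rates.

```python
import math

# Hypothetical tiered data plan: the 2 GB cap, $25 base price, and
# $10-per-GB overage rate below are illustrative, not real carrier rates.

def monthly_bill(usage_gb, tier_gb=2, tier_price=25.0, overage_per_gb=10.0):
    """Monthly charge for a capped plan with per-GB overage fees."""
    if usage_gb <= tier_gb:
        return tier_price
    # Overage is assumed here to be billed per started GB beyond the cap.
    extra_gb = math.ceil(usage_gb - tier_gb)
    return tier_price + extra_gb * overage_per_gb

print(monthly_bill(1.5))  # within the 2 GB tier -> 25.0
print(monthly_bill(3.2))  # 1.2 GB over, billed as 2 GB -> 45.0
```

The point of the cap is visible in the numbers: a typical user never sees a charge beyond the tier price, while heavy usage quickly becomes expensive.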
Those who exceed these monthly limits are then charged overage fees on a per-GB basis. Many people think the network operators are simply being greedy, but in reality, the tiers are designed to accommodate 80-90% of all customers’ demand (today), and to better manage the data hogs that are using much more bandwidth every month. Remember that network capacity has to be broken down into cell sector coverage. A typical cell is divided into three 120-degree sectors and each sector has the same bandwidth capacity. If, for example, each sector in the network is capable of 15 Mbps of data and there is only one customer within that sector, he or she can use all of the available capacity. However, if there are ten people requesting data within that same cell sector, they will share the available bandwidth and capacity.
If all of these customers are using the network for email, surfing the Internet, and the like, the data packets are intermingled and each user has a good data experience. However, if several of the customers within the cell sector are streaming video, especially HD video, the amount of capacity available for the other customers in the same sector is diminished. If too many people are trying to stream content within that sector, performance appears to be poor for all of the customers, even those simply trying to send and receive email. Add to this the fact that the farther you are from the center of the cell sector, the lower your data speed and, therefore, the less capacity you have. Even with LTE, data rates at the edge of a cell sector fall off to around 256 Kbps, which is why network operators attempt to build sites close together to minimize the number of customers who have to operate at the edge of a sector.
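The sharing arithmetic described above can be sketched in a few lines of Python; the even split among users is a simplification (real schedulers also weight users by signal quality, so edge-of-sector users get less), and the 15 Mbps capacity is the example figure from the text.

```python
# Simplified fair-share model of one cell sector: every active user gets
# an equal slice of the sector's capacity. Real schedulers also weight
# users by signal quality, so this is an upper-bound sketch.

def per_user_rate_mbps(sector_capacity_mbps, active_users):
    """Average rate each active user sees with an even split."""
    if active_users == 0:
        return 0.0
    return sector_capacity_mbps / active_users

print(per_user_rate_mbps(15, 1))   # a lone user can use all 15 Mbps
print(per_user_rate_mbps(15, 10))  # ten users share it -> 1.5 Mbps each
```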
In order to understand the issue, here are some data points for video streaming in use today:
- 480p or DVD quality requires 2.3 Mbps
- 720p or HD quality requires 5.8 Mbps
- 1080p or full HD (Blu-ray) quality requires 10 Mbps
Putting that into perspective, if you are a customer in a cell sector with a capacity of 15 Mbps and there are three customers in the same sector streaming HD-quality video, there is no bandwidth left for anyone else! These numbers are based on MPEG-4 data compression; the newer H.264 codec (MPEG-4 Part 10/AVC) dramatically reduces the bandwidth required. In the early days of computers, our firm was constantly asked to help vendors find the next “killer application” that would drive the adoption of PCs and mobile computing devices. Back then, a number of applications were considered killer applications, including VisiCalc, word processing, and Ashton-Tate’s dBASE. Today we view a killer application differently: Streaming video is a killer application in that it could easily kill wireless (and wired) broadband capacity.
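To make the arithmetic concrete, here is a small Python sketch built on the bitrates listed above; the 15 Mbps sector capacity is the example figure used in this article, not a measured value.

```python
# Capacity check for one cell sector using the streaming bitrates above.
# The 15 Mbps sector capacity is the article's example figure.

SECTOR_CAPACITY_MBPS = 15.0
STREAM_MBPS = {"480p": 2.3, "720p": 5.8, "1080p": 10.0}

def remaining_mbps(streams):
    """Bandwidth left after the given streams; negative = oversubscribed."""
    used = sum(STREAM_MBPS[quality] * count for quality, count in streams.items())
    return SECTOR_CAPACITY_MBPS - used

# Three 720p HD streams demand 17.4 Mbps, more than the sector has:
print(round(remaining_mbps({"720p": 3}), 1))  # -> -2.4, sector exhausted
```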
Most of the network operators in the United States have started tiered pricing, which has almost always been the norm in the rest of the world. It is interesting to note that even though people gripe about the costs, we in the United States pay some of the lowest prices in the world for wireless voice and data services. I tell people they should not be surprised that we are entering the era of tiered pricing since most of us have been paying on a tiered basis for our electricity and water for many years. Tiered pricing is a common tool for managing any limited resource, so it is natural that it has found its way into the wireless world.
But more and different types of pricing await us as network operators strive to make as much bandwidth available to as many of their customers as they can. Some of the options I expect to see in the near future include one-person, multiple-device pricing that will save money for those of us who have multiple devices, though total data usage will still be tiered and aggregated across all of our devices. Likewise, I think we will see family data plans similar to today’s voice plans, and I would not be at all surprised to see time-of-day data pricing before long. It might work like this: Want to download a large file? At 2 pm you might have to pay a few bucks to download it, but at 2 am the download might be free. The additional cost will be worth it if we need the file right now, but if we can wait it will be possible to schedule the download for an off-peak time or even have the application measure network usage and download the file during slack times.
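A time-of-day pricing rule like the one imagined above might look something like this Python sketch; the peak window and the per-GB prices are entirely hypothetical, chosen only to mirror the 2 pm versus 2 am example.

```python
from datetime import time

# Hypothetical time-of-day data pricing: the peak window (8 am - 10 pm)
# and the per-GB prices are invented for illustration only.

PEAK_START, PEAK_END = time(8, 0), time(22, 0)
PEAK_PRICE_PER_GB, OFF_PEAK_PRICE_PER_GB = 2.00, 0.00

def download_cost(size_gb, start_time):
    """Cost of a download begun at start_time under time-of-day pricing."""
    on_peak = PEAK_START <= start_time < PEAK_END
    rate = PEAK_PRICE_PER_GB if on_peak else OFF_PEAK_PRICE_PER_GB
    return size_gb * rate

print(download_cost(1.0, time(14, 0)))  # 2 pm: costs a couple of dollars
print(download_cost(1.0, time(2, 0)))   # 2 am: free
```

An application could use the same rule in reverse: given a non-urgent file, wait until the clock falls outside the peak window before starting the transfer.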
In Europe there are discussions going on now, at least among the network operators and the EU, about requiring companies that are heavy traffic users on the wireless networks (YouTube, Netflix, etc.) to be required to pay the operators in order to help with build-out and capacity improvement costs. I, for one, am strongly in favor of this approach and believe that those who create the traffic should pay part of the costs to ensure there is enough capacity for all of the customers on a network. It is unfair for them to congest the networks and not have to pay for the privilege of doing so.
There are those who still believe that both Internet and wireless bandwidth will be abundant for years to come; cloud companies, Netflix, and many others are betting their businesses on it. Those who understand the realities of the limited bandwidth actually available today, the time it will take to increase it, and the sharp rise in demand for services are working diligently to expand the capabilities of our networks and infrastructure.
There are grumbles about the end of all-you-can-eat data pricing, but the reality is that network operators need to use all the tools available to them to manage the bandwidth they have. Technology upgrades, more spectrum, more cell sites, Wi-Fi and femtocell off-loading, and yes, tiered data pricing are all tools being deployed by the network operators as they try their best to balance wireless broadband capacity and demand. One thing is certain: More people need to realize that wireless bandwidth is in short supply and we need to use it wisely. More important, we need to understand that it may not always be there for us, no matter where we are and no matter how badly we need a wireless connection.
Andrew M. Seybold