Oct 23, 2009 (08:10 PM EDT)
What CIOs Must Know About The Wireless 'Spectrum Crisis'

Read the Original Article at InformationWeek

Julius Genachowski, chairman of the Federal Communications Commission, is worried. Not today, or even this year, but soon, he thinks the United States won't be able to meet the demand for broadband wireless data. "The biggest threat to the future of mobile in America is the looming spectrum crisis," he told a wireless conference earlier this month.

You should be worried, too--or, at the least, be paying attention. Mobile data will be an increasingly important piece of the enterprise IT architecture, and also part of more products that companies provide. CIOs need to identify network-centric trouble spots today, as new wireless opportunities arise and well before anything resembling a "crisis" hits.

Blame the iPhone. The first widespread use of wireless data began with the RIM BlackBerry in the late 1990s, but it's the iPhone that's made wireless data a mass-market phenomenon. It's changing what people--employees and customers--expect to be able to do with their smartphones and how they interact with companies. This one blockbuster device has generated enough network load that iPhone owners complain about the resulting poor service, including dropped calls, delayed messages, and slow download speeds.

Yet it's more than smartphone apps that should have CIOs thinking about wireless spectrum. Sprint this month created a business unit to sell network access for data devices other than phones. Those devices range from Amazon's Kindle book reader, to a water meter that sends usage data, to a cash register (see story, "Why Follett Runs Some Stores On 3G Networks").

But there are limitations to today's wireless networks, and they're going to get bigger. Don't count on so-called 4G technologies to save the day, at least not for many years. Until then, CIOs must understand the effect on applications when the network gets overloaded, and plan accordingly.

Demands On The Network

A good place to start is the actual demand compared with network capacity. Cisco, in a January report, predicted that mobile data traffic will more than double every year through 2013, at which time it will reach 2 exabytes--2 billion GB--per month. Think Cisco believes its own numbers? This month, it struck a $2.9 billion deal to buy Starent Networks, whose equipment helps wireless carriers deliver multimedia and other data-intensive features.

The wireless industry is petitioning the FCC to allocate, over the next six years, enough additional spectrum to more than double its licensed holdings. FCC Chairman Genachowski sounds sympathetic. "Even with innovative spectrum policies and innovative new technologies, experts believe we are way too likely to be caught short," he said at the CTIA conference this month.

IT managers should care because congested networks are slow and, worse, unpredictable. As throughput rates decrease and packet delays increase, some applications start behaving erratically, as we'll discuss.

It's not panic time. We haven't yet reached widespread saturation on today's 3G networks, but given current market growth, saturation is likely unless new spectrum is added in time. And given the multiyear process for identifying, auctioning, and deploying new wireless spectrum, it may not become available fast enough to accommodate the rising tide of smartphones and data-hogging applications. The Cisco report states that smartphones generate more than 30 times the traffic of basic cell phones, and laptops with wireless modems 450 times.

Operator Strategies

Data traffic now represents 25% of the revenues of wireless network operators globally, and it's a more profitable and faster-growing business than voice, so operators such as AT&T and Verizon are highly motivated to find technological ways to squeeze data capacity out of their existing networks.

One is to deploy new networks in spectrum they already own. Both AT&T and Verizon have spectrum in the 700-MHz band, previously used for UHF TV channels, where they will deploy a technology called Long Term Evolution (LTE). Verizon plans to start deploying LTE next year, and AT&T in 2011.

However, LTE isn't enough. While it could more than double operators' current mobile broadband capacity, it won't be fully deployed until about 2014. The operator in the strongest spectrum position right now to provide mobile broadband, and the one with the least fine print about what customers can do on its network, is Clearwire, with its WiMax network. That network, however, also has the smallest footprint, since deployment began only recently.

Operators also are upgrading 2G systems to 3G to wring more capacity out of the same spectrum. Over time, probably sometime in the middle of the next decade, these same 3G networks will be upgraded to more highly evolved versions of LTE or WiMax, what's envisioned as 4G. Near term, unfortunately, wireless technologies are approaching their theoretical limits, bounded by what's called the Shannon Bound, which dictates the maximum spectral efficiency a radio channel can achieve relative to noise.
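For readers who want the math behind that ceiling, the Shannon-Hartley theorem, which underlies the Shannon Bound, can be written as follows, where C is channel capacity in bits per second, B is channel bandwidth in hertz, and S/N is the signal-to-noise ratio:

    C = B \log_2\left(1 + \frac{S}{N}\right)
    \qquad\Longrightarrow\qquad
    \frac{C}{B} = \log_2\left(1 + \frac{S}{N}\right) \ \text{bits/s/Hz}

At a strong 20-dB signal-to-noise ratio (S/N = 100), that works out to roughly 6.7 bits per second per hertz of spectrum, which is why better modulation alone can't keep pace with traffic that doubles every year.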

Beyond spectrum and efficiency, operators also can increase the number of base stations. Smaller coverage areas equate to more available spectrum per person, the reason Wi-Fi can offer such high throughputs. Operators not only are deploying more large-area cells, called macrocells, but they're also deploying more microcells and picocells that cover a highly localized area--for example, a shopping mall.

A promising option in its earliest stages is femtocells, which cover very small areas and use ultra-low power via access points in homes or businesses that plug into subscribers' broadband Internet connections. Operators see femtocells as a great way to off-load data onto wired networks when people are in their homes or businesses.

Likewise, you may have noticed operators encouraging Wi-Fi use. Letting 3G subscribers use hotspots free isn't benevolence; operators are trying to off-load as much data as they can onto Wi-Fi.

Finally, one of the most important dials operators can turn to control volume is pricing. Wireless data plans for laptop connectivity remain expensive, at about $60 per month. Smartphone data plans are lower, at about $30, since phone usage typically consumes less data, but the iPhone is challenging that assumption. Use a phone as a modem and the operator will promptly require the higher-priced laptop plan. There's also the somewhat controversial 5-GB monthly cap on most U.S. laptop plans.

Business technology managers should be encouraged that operators are so active in managing demand and continuing to invest in their networks. Wireless networks often work reasonably well. But the toughest issue is lack of predictability, and it becomes more problematic as companies push more applications over smartphones to critical mobile workers such as service personnel and salespeople.

Enterprise Implications

An employee working with a wireless application once a day, perhaps uploading or downloading a batch file, may not care too much about delays or retries. But a worker with a productivity application accessing the network continually throughout the day needs a dependable system. This can be achieved, but not without targeted effort, and generally only with applications that were specifically designed for mobile connectivity.

Though quality-of-service mechanisms are defined in the wireless specifications, no wireless operator today offers QoS guarantees such as guaranteed bit rates, maximum latency, or traffic priority, so performance is purely best effort. U.S. operators quote typical throughput rates, which means that some reasonable percentage of users--perhaps 80% or even 90%--will experience that range. Business technology managers must realize that leaves as many as one in five users outside the typical range, higher or lower, with some getting relatively low throughput.

Most TCP/IP-based networking applications weren't designed to operate over wireless connections. Today's 3G and tomorrow's 4G networks can deliver IP packets reliably and efficiently, but in a congested situation, or with a very weak radio signal, throughput rates can decrease significantly, delays can increase, packets can drop, and connections can be lost entirely. Getting reconnected might require a different IP address, which can confuse an application in mid-transaction. Moving rapidly in a train or car also can stress the connection, since the quality of the radio signal fluctuates widely.

There are ways to minimize the pain of such difficulties. Those include background synchronization so users don't have to wait as the application communicates with the server; longer time-outs to tolerate wireless network variations; restarting previous data exchanges from the point of failure; and local caching and data compression to minimize the amount of data communicated.
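To make a couple of those techniques concrete, here's a minimal Python sketch, using the widely available requests library, of a file transfer that applies generous time-outs and restarts from the point of failure rather than from the beginning. The URL, retry count, and time-out values are placeholders for illustration, not settings from any particular product.

    import time
    import requests

    def resilient_download(url, dest, max_retries=8):
        """Download url to dest, resuming from the last byte received after a
        dropped connection and backing off between attempts."""
        offset = 0
        for attempt in range(max_retries):
            headers = {"Range": "bytes=%d-" % offset} if offset else {}
            try:
                # Generous connect/read time-outs tolerate wireless latency spikes.
                with requests.get(url, headers=headers, stream=True,
                                  timeout=(15, 120)) as resp:
                    resp.raise_for_status()
                    # 206 means the server honored the Range request; otherwise restart.
                    mode = "ab" if resp.status_code == 206 else "wb"
                    with open(dest, mode) as f:
                        for chunk in resp.iter_content(chunk_size=64 * 1024):
                            f.write(chunk)
                            offset = f.tell()
                return offset  # transfer complete
            except requests.RequestException:
                # Connection lost, possibly reassigned a new IP on reconnect:
                # wait, then resume from offset instead of starting over.
                time.sleep(min(2 ** attempt, 60))
        raise IOError("download failed after %d attempts" % max_retries)

Background synchronization amounts to running a routine like this on a worker thread so the user never waits on it directly; caching and compression layer on top of the same structure.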

One way to get that kind of robust communication is to build an application from the start to run over a wireless network, the way wireless e-mail on RIM's BlackBerry and thousands of iPhone apps were built. For legacy apps, however, mobile middleware is usually the answer.

Some mobile middleware, such as the mobile VPNs from NetMotion Wireless, can be installed to work with existing applications. Other products, from companies such as Antenna and Sybase, provide a development environment for building wireless-optimized applications. The benefits of middleware include not only the more robust communications transports mentioned above but also, with some packages, the ability to target multiple smartphone platforms from the same application code base.

Middleware, however, comes with trade-offs. It adds to the cost and system complexity, and it involves a learning curve for IT departments and integrators. And any applications developed using the middleware tend to be tightly bound with it, making it difficult to change middleware vendors down the road.

IT managers must understand what makes some applications inherently better for wireless than others. For example, SQL-based database applications can involve a large number of SQL statements traversing the air, and with wireless latency much higher than wireline, this can translate to very sluggish performance. On the other hand, a database mobility extension via middleware, or from the database companies themselves, generally involves an architecture where the mobile system uses efficient and resilient protocols to communicate with a mobile server, which then acts as a proxy for communication with the end server.
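As a rough illustration of that latency arithmetic, the hypothetical Python sketch below contrasts a chatty per-record SQL pattern with a single batched, compressed upload to a mobile proxy server. The endpoint, table, and 150-millisecond latency figure are made up purely to show the difference in round trips.

    import gzip
    import json
    import requests

    # Chatty approach: one SQL statement per record crosses the air.
    def update_orders_chatty(cursor, orders):
        for order in orders:   # 200 orders means 200 wireless round trips
            cursor.execute("UPDATE orders SET status=%s WHERE id=%s",
                           (order["status"], order["id"]))
        # At roughly 150 ms per 3G round trip, that's about 30 seconds
        # spent waiting on latency alone.

    # Proxied approach: the device sends one compressed batch to a mobile server,
    # which replays the statements against the back-end database over the LAN.
    def update_orders_batched(orders, sync_url="https://mobile-proxy.example.com/sync"):
        payload = gzip.compress(json.dumps(orders).encode("utf-8"))
        requests.post(sync_url, data=payload,
                      headers={"Content-Encoding": "gzip"}, timeout=(15, 120))
        # One round trip of roughly 150 ms, regardless of how many records changed.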

With the move to cloud computing, IT managers may wonder if Web applications will take care of many of these problems. Unlike native applications, Web apps are highly dependent on solid network performance for a good user experience. On the other hand, Web interactions are relatively stateless, meaning TCP connections don't remain open after the browser accesses a page. So network interruptions between page loads shouldn't affect user sessions. The result is that pages may load more slowly in the event of network congestion, but overall, sessions should be fairly resilient. Of course, in really congested situations, page loads themselves will time out.
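As a small, hypothetical example of that resilience: because each page request is independent and the session identity rides along in a cookie, a timed-out page load can simply be retried without re-establishing the user's session. The URL and credentials below are placeholders.

    import requests

    session = requests.Session()   # keeps the session cookie on the client
    session.post("https://app.example.com/login",
                 data={"user": "jdoe", "password": "..."}, timeout=30)

    def load_page(path, retries=3):
        """Fetch a page; a time-out only costs us a retry, not the login."""
        for _ in range(retries):
            try:
                return session.get("https://app.example.com" + path, timeout=30).text
            except requests.Timeout:
                pass   # congestion or a dead spot: just request the same page again
        raise requests.Timeout("page did not load, though the session is still valid")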

Applications that require constant high bandwidth are particularly vulnerable to disruption and thus a poor fit for wireless delivery today. A file download can tolerate variation in throughput fairly well, but streaming video (e.g., training, a repair procedure) is more susceptible, especially if the video viewer doesn't buffer data. Audio streaming, though, is practical--radio station streaming is very popular on the iPhone. Bottom line, the more real-time and high-bandwidth the app is, the more vulnerable it is.
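For a back-of-the-envelope feel for why buffering matters, the sketch below, with made-up throughput numbers, models a player that banks five seconds of a 400-Kbps stream before playing and then rides out two seconds of badly degraded throughput without stalling.

    VIDEO_BITRATE = 400   # Kbps the player consumes once playback starts
    throughput = [800, 800, 800, 100, 150, 800, 800]   # network Kbps, one sample per second
    PREBUFFER = VIDEO_BITRATE * 5   # bank five seconds of video before playing

    buffered, playing, stalls = 0, False, 0
    for rate in throughput:
        buffered += rate                  # kilobits arriving this second
        if not playing and buffered >= PREBUFFER:
            playing = True                # enough cushion banked: start playback
        if playing:
            buffered -= VIDEO_BITRATE     # kilobits consumed this second
            if buffered < 0:              # buffer ran dry: the picture freezes
                stalls, buffered = stalls + 1, 0
    print("stalls with a 5-second buffer:", stalls)   # 0 for this trace

An unbuffered player on the same trace would freeze during both sub-bitrate seconds, while a 64-Kbps audio stream wouldn't notice them at all, which is one reason streaming radio holds up better than video.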

IT managers really have two fundamental approaches available to them. For lightly used or noncritical applications, standard apps that aren't optimized for wireless probably will work sufficiently well to be viable. For critical applications, however, IT managers should consider optimizing their apps for wireless.

Field testing in representative coverage areas also is a must. Given variations in network performance, tests should be done across multiple iterations, in multiple locations, and during different times of the day. While there's no rule of thumb, testing projects I have conducted used six to 10 iterations of each application test case in at least six to 10 locations in a single metropolitan area to model expected performance. Usually, it's a good idea to test in more than one city if the application will be used in multiple cities. Testing should be with both 2G and 3G connections.
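For teams building such a test plan, a harness along the following lines is one reasonable starting point. It's a hedged sketch: the URL stands in for a real application test case, and the iteration count and locations are placeholders. The point is to repeat each case several times per location and report the spread, not just an average.

    import statistics
    import time
    import requests

    TEST_URL = "https://example.com/1MB-test-file"   # stand-in for an application test case
    ITERATIONS = 8                                   # six to 10 runs per location

    def run_location_test(location_name):
        """Time repeated downloads at one location and summarize the spread."""
        timings = []
        for _ in range(ITERATIONS):
            start = time.monotonic()
            try:
                requests.get(TEST_URL, timeout=(15, 120))
                timings.append(time.monotonic() - start)
            except requests.RequestException:
                timings.append(None)                 # record failures, too
        ok = [t for t in timings if t is not None]
        if ok:
            print("%s: %d/%d succeeded, median %.1f s, worst %.1f s"
                  % (location_name, len(ok), ITERATIONS, statistics.median(ok), max(ok)))
        else:
            print("%s: all attempts failed" % location_name)

    # Run at each representative site, ideally at different times of day.
    for spot in ["downtown office", "suburban warehouse", "customer site (in vehicle)"]:
        run_location_test(spot)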

Where Things Are Heading

Networks will get faster and capacity will increase, but neither will necessarily keep up with escalating demand. So this state of affairs will remain with us for a long time to come. There are plenty of great reasons to use mobile broadband to improve productivity. But radio will never be the same as wire.

In the short term, IT managers should monitor the net neutrality debate, which has recently shot back onto the political agenda. The strictest interpretation of net neutrality, for example, would effectively let any user do anything on the network, no matter how bandwidth-intensive. That would let businesses experiment with high-bandwidth uses, but it could also prevent operators from offering quality-of-service guarantees, since QoS works by giving some packets higher priority than others.

These issues are being hotly debated. Some net neutrality backers say quality-of-service guarantees can be accommodated within reasonable net neutrality regulation. The wireless industry is lobbying hard for network-management provisions that would let operators throttle back users who place excessive demands on the network. How these issues are resolved could affect how predictable network performance ultimately is.

Another problem to anticipate is that, as networks get faster, the variation in throughput will increase. This is because technologies such as LTE and WiMax can exploit good radio conditions for extremely high throughputs, using high-order modulation and advanced antenna techniques. But with a weak signal, they have to fall back significantly in throughput to get data across reliably.

A report by this author for 3G Americas, a wireless industry advocacy group for HSPA and LTE technologies whose members include vendors and operators, projected that LTE speeds will range from 2 Mbps to 12 Mbps in typical 10-MHz deployments. Should a user be outside a metro area on a 2G network such as EDGE, throughputs could be as low as 100 Kbps. Not having access to the fastest technology in every desired coverage area will remain a common condition indefinitely. Even today, 3G isn't ubiquitous, despite initial deployments that started at the beginning of the decade.

One area that may bring improvement is quality-of-service agreements, assuming operators decide to offer them and are able to. Technologies such as LTE can readily provide different classes of service, including traffic prioritization and even guaranteed bit rates. But that will take time and depend partly on regulation, and even then a user could roam onto a partner network, or onto a 3G or even 2G network, where the same options won't exist.

So IT managers should take matters into their own hands now and deploy systems and applications with realistic expectations of performance, and appropriate protective measures.

Peter Rysavy is the president of Rysavy Research, a consulting firm specializing in wireless.

Continue to the sidebar:
Why Follett Runs Some Stores On 3G Networks