Archive for the ‘Cellular’ Category

iPhones can auto-connect to rogue Wi-Fi networks, researchers warn

Friday, June 14th, 2013

Attackers can exploit behavior to collect passwords and other sensitive data.

Security researchers say they’ve uncovered a weakness in some iPhones that makes it easier to force nearby users to connect to Wi-Fi networks that steal passwords or perform other nefarious deeds.

The weakness is contained in configuration settings installed by AT&T, Vodafone, and more than a dozen other carriers that give the phones voice and Internet services, according to a blog post published Wednesday. Settings for AT&T iPhones, for instance, frequently instruct the devices to automatically connect to a Wi-Fi network called attwifi when the signal becomes available. Carriers make the Wi-Fi signals available in public places as a service to help subscribers get Internet connections that are fast and reliable. Attackers can take advantage of this behavior by setting up their own rogue Wi-Fi networks with the same names and then collecting sensitive data as it passes through their routers.

“The takeaway is clear,” the researchers from mobile phone security provider Skycure wrote. “Setting up such Wi-Fi networks would initiate an automatic attack on nearby customers of the carrier, even if they are using an out-of-the-box iOS device that never connected to any Wi-Fi network.”

The researchers said they tested their hypothesis by setting up several Wi-Fi networks in public areas that used the same SSIDs as official carrier networks. During a test at a restaurant in Tel Aviv, Israel on Tuesday, 60 people connected to an imposter network in the first minute, Adi Sharabani, Skycure’s CEO and cofounder, told Ars in an e-mail. During a presentation on Wednesday at the International Cyber Security Conference, the Skycure researchers set up a network that 448 people connected to during a two-and-a-half-hour period. The researchers didn’t expose people to any attacks during the experiments; they just showed how easy it was for them to connect to networks without knowing they had no affiliation to the carrier.

Sharabani said the settings that cause AT&T iPhones to automatically connect to certain networks can be found in the device’s profile.mobileconfig file. It’s not clear if phones from other carriers also store their configurations in the same location or somewhere else.

“Moreover, even if you take another iOS device and put an AT&T sim in it, the network will be automatically defined, and you’ll get the same behavior,” he said. He said smartphones running Google’s Android operating system don’t behave the same way.
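For readers curious what those carrier settings look like, an iOS configuration profile is just an XML property list that can be inspected off-device. Below is a minimal, illustrative Python sketch (not Skycure’s tooling) that lists the Wi-Fi payloads in an exported, unsigned .mobileconfig file; the payload type and key names (com.apple.wifi.managed, SSID_STR, AutoJoin) and the file path are assumptions based on Apple’s published configuration-profile format.

```python
# Illustrative sketch: list Wi-Fi networks a configuration profile will auto-join.
# Assumptions: the profile is an unsigned XML plist, Wi-Fi payloads use the
# "com.apple.wifi.managed" payload type, and the "SSID_STR" / "AutoJoin" keys
# follow Apple's documented configuration-profile format.
import plistlib

def list_wifi_payloads(path: str) -> None:
    with open(path, "rb") as f:
        profile = plistlib.load(f)
    for payload in profile.get("PayloadContent", []):
        if payload.get("PayloadType") == "com.apple.wifi.managed":
            ssid = payload.get("SSID_STR", "<unknown SSID>")
            auto_join = payload.get("AutoJoin", True)  # default behavior varies
            print(f"SSID: {ssid!r}  AutoJoin: {auto_join}")

if __name__ == "__main__":
    list_wifi_payloads("profile.mobileconfig")  # hypothetical exported profile
```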

Once attackers have forced a device to connect to a rogue network, they can run exploit software that bypasses the secure sockets layer Web encryption. From there, attackers can perform man-in-the-middle (MitM) attacks that allow them to observe passwords in transit and even forge links and other content on the websites users are visiting.

The most effective way to prevent iPhones from connecting to networks without the user’s knowledge is to turn off Wi-Fi whenever it’s not needed. Apps are also available that give users control over what SSIDs an iPhone will and won’t connect to. It’s unclear how iPhones running the upcoming iOS 7 will behave. As Ars reported Monday, Apple’s newest OS will support the Wi-Fi Alliance’s Hotspot 2.0 specification, which is designed to allow devices to hop from one Wi-Fi hotspot to another.

Given how easy it is for attackers to abuse Wi-Fi weaknesses, the Skycure research isn’t particularly shocking. Still, the ability of iPhones to connect to networks for the first time without requiring users to take explicit actions could be problematic, said Robert Graham, an independent security researcher who reviewed the Skycure blog post.

“A lot of apps still send stuff in the clear, and other apps don’t check the SSL certificate chain properly, meaning that Wi-Fi MitM is a huge problem,” said Graham, who is CEO of Errata Security. “That your phone comes pre-pwnable without your actions is a bad thing. Devices should come secure by default, not pwnable by default.”

Source:  arstechnica.com

Medical Devices Hard-Coded Passwords (ICS-ALERT-13-164-01)

Friday, June 14th, 2013

SUMMARY

Researchers Billy Rios and Terry McCorkle of Cylance have reported a hard-coded password vulnerability affecting roughly 300 medical devices across approximately 40 vendors. According to their report, the vulnerability could be exploited to potentially change critical settings and/or modify device firmware.

Because of the critical and unique status that medical devices occupy, ICS-CERT has been working in close cooperation with the Food and Drug Administration (FDA) in addressing these issues. ICS-CERT and the FDA have notified the affected vendors of the report and have asked the vendors to confirm the vulnerability and identify specific mitigations. ICS-CERT is issuing this alert to provide early notice of the report and identify baseline mitigations for reducing risks from these and other cybersecurity attacks. ICS-CERT and the FDA will follow up with specific advisories and information as appropriate.

The report included vulnerability details for the following vulnerability:

Vulnerability type: Hard-coded password
Remotely exploitable: Yes (device dependent)
Impact: Modification of critical settings and/or device firmware

The affected devices have hard-coded passwords that can be used to gain privileged access, for example the kind of access that would normally be reserved for a service technician. In some devices, this access could allow critical settings or the device firmware to be modified.
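To make the vulnerability class concrete, here is a deliberately simplified Python sketch, not taken from any actual device firmware, contrasting a hard-coded service password with a per-device, salted credential check; all names and values are hypothetical.

```python
# Hypothetical illustration of the vulnerability class; not real device code.
import hashlib, hmac, os

# Vulnerable pattern: the same service password is baked into every unit,
# so anyone who extracts it once can log in to every deployed device.
SERVICE_PASSWORD = "fieldservice123"  # hard-coded, identical across devices

def login_vulnerable(password: str) -> bool:
    return password == SERVICE_PASSWORD

# Safer pattern: each device stores a salted hash of a per-device credential
# that can be rotated, so one leaked secret doesn't expose the whole fleet.
def set_credential(password: str) -> tuple[bytes, bytes]:
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest  # persisted per device

def login_safer(password: str, salt: bytes, stored: bytes) -> bool:
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(digest, stored)
```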

The affected devices are manufactured by a broad range of vendors and fall into a broad range of categories including but not limited to:

  • Surgical and anesthesia devices,
  • Ventilators,
  • Drug infusion pumps,
  • External defibrillators,
  • Patient monitors, and
  • Laboratory and analysis equipment.

ICS-CERT and the FDA are not aware that this vulnerability has been exploited, nor are they aware of any patient injuries resulting from this potential cybersecurity vulnerability.

MITIGATION

ICS-CERT is currently coordinating with multiple vendors, the FDA, and the security researchers to identify specific mitigations across all devices. In the interim, ICS-CERT recommends that device manufacturers, healthcare facilities, and users of these devices take proactive measures to minimize the risk of exploitation of this and other vulnerabilities. The FDA has published recommendations and best practices to help prevent unauthorized access or modification to medical devices.

  • Take steps to limit unauthorized device access to trusted users only, particularly for those devices that are life-sustaining or could be directly connected to hospital networks.
    • Appropriate security controls may include: user authentication, for example, user ID and password, smartcard or biometric; strengthening password protection by avoiding hard‑coded passwords and limiting public access to passwords used for technical device access; physical locks; card readers; and guards.
  • Protect individual components from exploitation and develop strategies for active security protection appropriate for the device’s use environment. Such strategies should include timely deployment of routine, validated security patches and methods to restrict software or firmware updates to authenticated code (a minimal signature-check sketch follows this list). Note: The FDA typically does not need to review or approve medical device software changes made solely to strengthen cybersecurity.
  • Use design approaches that maintain a device’s critical functionality, even when security has been compromised, known as “fail-safe modes.”
  • Provide methods for retention and recovery after an incident where security has been compromised. Cybersecurity incidents are increasingly likely and manufacturers should consider incident response plans that address the possibility of degraded operation and efficient restoration and recovery.
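As one concrete illustration of restricting firmware updates to authenticated code, the hedged Python sketch below verifies a detached Ed25519 signature over a firmware image before it is accepted. This is a generic example, not an FDA or vendor procedure; the file names, the raw 32-byte public-key format, and the use of the third-party cryptography package are assumptions.

```python
# Generic sketch of accepting only signed firmware; not a vendor or FDA procedure.
# Assumes the third-party "cryptography" package and a vendor-provisioned raw
# 32-byte Ed25519 public key stored on the device (hypothetical paths).
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PublicKey
from cryptography.exceptions import InvalidSignature

def firmware_is_authentic(image_path: str, sig_path: str, pubkey_path: str) -> bool:
    with open(pubkey_path, "rb") as f:
        public_key = Ed25519PublicKey.from_public_bytes(f.read())
    with open(image_path, "rb") as f:
        image = f.read()
    with open(sig_path, "rb") as f:
        signature = f.read()
    try:
        public_key.verify(signature, image)  # raises on any mismatch
        return True
    except InvalidSignature:
        return False

# Usage sketch: refuse to flash anything that fails verification.
# if not firmware_is_authentic("update.bin", "update.sig", "vendor_ed25519.pub"):
#     raise SystemExit("Rejecting unsigned or tampered firmware image")
```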

For health care facilities: The FDA is recommending that you take steps to evaluate your network security and protect your hospital system. In evaluating network security, hospitals and health care facilities should consider:

  • Restricting unauthorized access to the network and networked medical devices.
  • Making certain appropriate antivirus software and firewalls are up-to-date.
  • Monitoring network activity for unauthorized use.
  • Protecting individual network components through routine and periodic evaluation, including updating security patches and disabling all unnecessary ports and services.
  • Contacting the specific device manufacturer if you think you may have a cybersecurity problem related to a medical device. If you are unable to determine the manufacturer or cannot contact the manufacturer, the FDA and DHS ICS-CERT may be able to assist in vulnerability reporting and resolution.
  • Developing and evaluating strategies to maintain critical functionality during adverse conditions.

ICS-CERT reminds health care facilities to perform proper impact analysis and risk assessment prior to taking defensive and protective measures.

ICS-CERT also provides a recommended practices section for control systems on the US-CERT Web site. Several recommended practices are available for reading or download, including Improving Industrial Control Systems Cybersecurity with Defense-in-Depth Strategies. Although medical devices are not industrial control systems, many of the recommendations from these documents are applicable.

Organizations that observe any suspected malicious activity should follow their established internal procedures and report their findings to ICS-CERT and FDA for tracking and correlation against other incidents.

The FDA has also announced a safety communication that highlights the points made in this alert. For additional information see: http://www.fda.gov/MedicalDevices/Safety/AlertsandNotices/ucm356423.htm

Source:  US-CERT

56% of American adults are now smartphone owners

Friday, June 7th, 2013

For the first time since the Pew Research Center’s Internet & American Life Project began systematically tracking smartphone adoption, a majority of Americans now own a smartphone of some kind. Our definition of a smartphone owner includes anyone who says “yes” to one—or both—of the following questions:

  • 55% of cell phone owners say that their phone is a smartphone.
  • 58% of cell phone owners say that their phone operates on a smartphone platform common to the U.S. market.

Taken together, 61% of cell owners said yes to at least one of these questions and are classified as smartphone owners. Because 91% of the adult population now owns some kind of cell phone, that means that 56% of all American adults are now smartphone adopters.  One third (35%) have some other kind of cell phone that is not a smartphone, and the remaining 9% of Americans do not own a cell phone at all.
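The arithmetic behind those figures is straightforward; the short Python sketch below reproduces it from the percentages quoted above (values rounded as in the report).

```python
# Reproducing the percentages quoted above (rounded as in the Pew report).
cell_owners = 0.91          # share of U.S. adults owning any cell phone
smartphone_of_cell = 0.61   # share of cell owners classified as smartphone owners

smartphone_adults = cell_owners * smartphone_of_cell      # ~0.56 of all adults
other_cell_adults = cell_owners - smartphone_adults       # ~0.35 of all adults
no_cell_adults = 1 - cell_owners                          # 0.09 of all adults

print(f"Smartphone owners: {smartphone_adults:.0%}")  # ~56%
print(f"Other cell phones: {other_cell_adults:.0%}")  # ~35%
print(f"No cell phone:     {no_cell_adults:.0%}")     # 9%
```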


Excerpt from:  pewinternet.org

911 tech that locates cell phone users in buildings ready to go

Friday, June 7th, 2013

FCC approves system despite interference with wireless ISPs and smart meters.

The Federal Communications Commission today authorized the maker of an enhanced 911 technology to begin commercial operations despite claims that the system will interfere with wireless Internet Service Providers and smart grid applications.

A company called NextNav and its subsidiary, Progeny, make a system designed to more accurately locate people inside buildings. This is important because the trend of cell phones replacing landlines makes it more difficult to identify the location of a person calling 911. If someone has a heart attack on the 30th floor of a giant building and is able to call 911 but unable to speak their location, NextNav’s system would theoretically find them with a network of transmitters placed throughout cities and towns. Cell phones would also require software upgrades to communicate with the NextNav network.

The proposal was controversial because NextNav’s enhanced 911 location service operates on the same chunk of spectrum as wireless Internet Service Providers, smart meters, toll readers like EZ-Pass, baby monitors, and more. In March, we detailed the proposal and concerns raised by makers of systems who believe NextNav technology will cause too much interference in this band, stretching from 902-928MHz.

Extensive testing showed that there was interference, but the FCC had not defined what an “unacceptable” level of interference would be. Despite approving NextNav’s proposal, the FCC once again declined to define unacceptable interference, saying “no uniform field testing method is appropriate considering the great array of devices that the Part 15 industry deploys in the 902-928 MHz, which are designed to address different needs and thus have no common design.”

Part 15 refers to FCC rules regarding unlicensed wireless transmissions. NextNav is a licensed service, but would co-exist with unlicensed ones in the 902-928MHz band.

The FCC further stated that “[w]e recognize that the potential exists for interference to certain devices or systems, but also are cognizant that the potential for interference to these devices already exists because a variety of different users operate in this spectrum. Based on the evidence before us, we find that the potential for increased interference within the 902-928 MHz band that could result from commercial operation of Progeny’s M-LMS (Multilateration Location and Monitoring Service) system will not create a significant detrimental effect overall on unlicensed operations in the band, and that the band therefore can continue to be used for such unlicensed operations consistent with their Part 15 status. We therefore conclude that Progeny has satisfied the field test requirement so that it may commence commercial operations.”

The FCC didn’t impose any limitations on NextNav’s operations but said it has to keep track of interference concerns and report back to the FCC. NextNav must establish a website and toll-free help desk to let users of unlicensed devices “seek assistance in investigating and mitigating potential interference issues.” NextNav must also file three reports with the FCC from December 2013 to December 2014 detailing any interference complaints received.

Source:  arstechnica.com

Bug in Samsung S3 grabs too many images, ups data use

Friday, May 31st, 2013

Researchers of the BenchLab project at UMass Amherst have discovered a bug in the browser of the Samsung S3.

If you browse a Web page that has multiple versions of the same image (for mobile, tablet, desktop, etc.), like most Wikipedia pages for example, instead of downloading one image at the right resolution, the phone downloads every version of it. A page that should be less than 100KB balloons to multiple megabytes! It looks like a bug in the implementation of the srcset HTML attribute, but all the details are in the paper to be presented at the IWQoS conference next week.
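For context, a browser that handles srcset correctly should fetch exactly one candidate, chosen to match the display’s pixel density. The Python sketch below is a simplified model of that select-one behavior, not the S3 browser’s code; it handles only the “x” density descriptors, and the markup string is hypothetical.

```python
# Simplified model of srcset selection (density "x" descriptors only).
# A conforming browser fetches ONE candidate; the reported bug fetches them all.
def pick_srcset_candidate(srcset: str, device_pixel_ratio: float) -> str:
    candidates = []
    for entry in srcset.split(","):
        parts = entry.split()
        url = parts[0]
        density = float(parts[1].rstrip("x")) if len(parts) > 1 else 1.0
        candidates.append((density, url))
    # Choose the smallest density that still meets the device's pixel ratio,
    # falling back to the largest available candidate if none do.
    suitable = [c for c in candidates if c[0] >= device_pixel_ratio]
    density, url = min(suitable) if suitable else max(candidates)
    return url

srcset = "thumb.jpg 1x, medium.jpg 1.5x, large.jpg 2x"  # hypothetical page markup
print(pick_srcset_candidate(srcset, device_pixel_ratio=2.0))  # fetches large.jpg only
```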

So far, Samsung hasn’t acknowledged the problem, though it seems to affect all S3 phones. You’d better have an unlimited data plan if you browse Wikipedia on an S3!

Source:  slashdot.org

Galaxy S4 will be first to support Verizon’s newer, faster LTE network

Thursday, May 30th, 2013

Back in late 2011, Verizon Wireless bought up $3.6 billion worth of Advanced Wireless Services (AWS) spectrum with an eye toward expanding its LTE network coverage. The carrier will use this spectrum to add LTE coverage on the 1700MHz and 2100MHz frequencies to the carrier’s existing LTE on the 700MHz band later this year, and Bloomberg is now reporting that Samsung’s Galaxy S 4 will be the first phone to support the new frequencies. The new frequency bands will supposedly boost speeds and reduce network congestion, especially in heavily populated areas.

If you’ve currently got another LTE-capable phone on Verizon’s network, though, chances are you won’t be able to take advantage of the network’s upgrades. Support for the band must be built into both the hardware and the software—the Verizon variant of the S 4 already has hardware support, and an update will apparently take care of the software side in the coming months.

We expect hardware and software support for AWS to appear in more devices as Verizon’s (and T-Mobile’s) AWS network is built out in the coming months. In the meantime, if you need a new phone now but don’t want to miss out on the improved LTE speeds, the S 4 appears to be your best bet.

Source:  arstechnica.com

Corning taps into optical fiber for better indoor wireless

Monday, May 20th, 2013

Bringing wireless indoors, which was once just a matter of antennas carrying a few cellular bands so people could get phone calls, has grown far more complex and demanding in the age of Wi-Fi, multiple radio bands and more powerful antennas.

DAS (distributed antenna systems) using coaxial cable have been the main solution to the problem, but they now face some limitations. To address them, Corning will introduce a DAS at this week’s CTIA Wireless trade show in Las Vegas that uses fiber instead of coax all the way from the remote cell antennas to the base station in the heart of a building.

Cable-based DAS hasn’t kept up with the new world, according to the optical networking vendor. Though Corning is associated more often with clear glass than with thin air, it entered the indoor wireless business in 2011 by buying DAS maker MobileAccess. That’s because Corning thinks optical fiber is the key to bringing more mobile capacity and coverage inside.

The system, called Corning Optical Network Evolution (ONE) Wireless Platform, can take the place of a DAS based fully or partly on coaxial cable, according to Bill Cune, vice president of strategy for Corning MobileAccess. Corning ONE will let mobile carriers, enterprises or building owners set up a neutral-host DAS for multiple carriers using many different frequencies.

Though small cells are starting to take its place in some buildings, DAS still has advantages over the newer technology, according to analyst Peter Jarich of Current Analysis. It can be easier to upgrade because only the antennas are distributed, so more of the changes can be carried out on centralized gear. Also, small cells are typically deployed by one mobile operator, and serving customers of other carriers has to be done through roaming agreements, he said.

However, some DAS products based on coaxial cable are limited in how they can handle high frequencies and MIMO (multiple-in, multiple-out) antennas, Jarich said. Some vendors are already promoting fiber for greater flexibility and capacity, he said.

Going all fiber — up to the wireless edge, at least — will make it easier and cheaper for indoor network operators to roll out systems that can deliver all the performance users have come to expect from wireless networks, according to Corning. That includes more easily adding coverage for more carriers, as well as feeding power and data to powerful Wi-Fi systems that can supplement cellular data service, the company says.

Wireless signals don’t travel the same way inside buildings as they do outdoors, so one antenna can’t always cover the interior, regardless of whether it’s mounted in the building or on a nearby tower. A DAS consists of many antennas spaced throughout a structure, all linked to a base station in a central location. Most types of DAS use coaxial cable to carry radio signals in from the distributed antennas.

However, those copper cables get more “lossy” as the frequencies they have to carry get higher, meaning they lose a lot of their signal on the way to the base station, Corning’s Cune said. That has left coax behind as new frequencies are adopted, he said. For example, coax isn’t good at carrying the 5GHz band, which is crucial in newer Wi-Fi equipment, Cune said.
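To see why higher bands strain coax, a rough rule of thumb is that a cable’s conductor (skin-effect) loss in dB grows with roughly the square root of frequency; real cables also have a dielectric loss term, so the Python sketch below is only a first-order estimate with an assumed reference figure, not a datasheet.

```python
import math

# First-order estimate only: conductor (skin-effect) loss in dB scales roughly
# with sqrt(frequency). Real cables add a dielectric term; numbers are assumed.
def scaled_loss_db(loss_db_at_ref: float, ref_mhz: float, target_mhz: float) -> float:
    return loss_db_at_ref * math.sqrt(target_mhz / ref_mhz)

reference_loss = 4.0  # assumed dB per 100 m for a coax run at 700 MHz
for band_mhz in (700, 1900, 2600, 5000):
    loss = scaled_loss_db(reference_loss, 700, band_mhz)
    print(f"{band_mhz:>4} MHz: ~{loss:.1f} dB per 100 m")
```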

MIMO, a technology that uses multiple antennas in one unit to carry separate “streams” over the same frequency, poses another big challenge for coax-based DAS, according to Corning. MIMO antennas, used for better performance, can be found in newer Wi-Fi gear based on IEEE 802.11n and 802.11ac, as well as in LTE. A coax-based DAS with MIMO antennas needs to have a separate half-inch-wide cable for every stream, which is a major cabling challenge, Cune said.

Corning ONE links each antenna to the base station over optical fiber, converting the radio signals to optical wavelengths until they reach the base station. Fiber has more capacity than coax, can handle higher frequencies, and requires just one cable from a MIMO antenna, Cune said. Because of fiber’s high capacity, it’s relatively easy to bring other mobile operators onto the DAS.

The system is based on optical fiber, but it can be extended over standard Ethernet wiring to provide backhaul for Wi-Fi access points. Each Corning ONE remote antenna unit that’s deployed around a building will have two Ethernet ports to hook up nearby Wi-Fi access points, which can use the fiber infrastructure for data transport to wired LAN equipment, Cune said.

Corning ONE is in beta testing at one enterprise and will have limited availability beginning in late June, after which orders can be placed, Cune said. It is expected to be generally available two to three months later. The company expects its main customers to be mobile operators, though most of those operators will arrange multi-carrier services, he said. Enterprises and large building owners increasingly will step in to buy and deploy the DAS, Cune said.

Source:  networkworld.com

Samsung achieves 1 Gbps data transfer using 5G network

Tuesday, May 14th, 2013

While many of us are just starting to enjoy the benefits of early 4G networks, Samsung is looking at what it would take to build a gigabit wireless network for 5G.

Unless you live in one of the few places Google has seen fit to give the gift of fiber so far, gigabit Internet is something of a pipe dream in the US. Over the next few years that will change, and slowly there will be a shift to gigabit all over the world. Meanwhile, mobile networks will continue to improve as we move closer to fully functional 4G networks with LTE. Eventually there will be a need to shift away from 4G and on to something better. When that happens, it looks like Samsung has the next G everyone will be looking for.

By using the 28GHz band, Samsung has been able to reliably transfer data at a speed of 1Gbps with the potential to deliver up to 10Gbps. While there’s currently no globally recognized spec for 5G mobile broadband, this is a significant increase over the maximum currently established for fully-deployed 4G.
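To put those rates in perspective, the sketch below converts them into rough transfer times for an illustrative 1 GB file over an ideal link; the LTE comparison rate is an assumption for illustration, not a figure from Samsung.

```python
# Rough transfer times for an illustrative 1 GB (8 Gb) file; ideal link, no overhead.
file_gigabits = 8.0
rates_gbps = {
    "Assumed LTE rate (illustrative)": 0.04,  # 40 Mbps, an assumption
    "Samsung 28GHz demo": 1.0,
    "Claimed potential": 10.0,
}
for label, rate in rates_gbps.items():
    print(f"{label}: ~{file_gigabits / rate:.1f} s")
```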

Because this is a demonstration, things like functional range, or whether the radios used could be embedded into mobile devices, weren’t taken into consideration. The proof of concept shows what is possible, but it’s not likely that we’ll be using this technology anytime soon. In fact, Samsung doesn’t expect 5G speeds to reach mobile devices until closer to 2020.

It’s difficult to imagine the need for that kind of performance in the palm of your hand as we sit here in 2013, but this glimpse at 5G speeds helps paint a picture of a world where the things we do on the Internet now are completely free from any kind of delay or interruptions. All you need now is autopilot for your jetpack so you can watch the news from 3D Google Glass on your 5G network.

Source:  geek.com

For first time, smartphone sales top other mobile phones in first quarter

Friday, April 26th, 2013

Six years after the sale of the first iPhone and 14 years after the first BlackBerry email pager was unveiled, smartphone shipments have outnumbered sales of other types of mobile phones, IDC reported late Thursday.

IDC said 216.2 million smartphones were shipped globally in the first quarter of 2013. The smartphone total accounted for 51.6% of all mobile phones shipped.

Shipments of other mobile phones, which IDC calls feature phones, totaled 202.4 million in the quarter. Total shipments of all mobile phones were 418.6 million, IDC said.

“The balance of smartphone power has shifted,” said IDC analyst Kevin Restivo in a statement. “Phone users want computers in their pockets. The days when phones were used primarily to make phone calls and send text messages are quickly fading away.”

IDC also noted the emergence of China-based companies, including Huawei, ZTE, Coolpad and Lenovo, among the leading smartphone vendors.

Those newcomers and others have displaced longtime mobile phone leaders Nokia from Finland, BlackBerry from Canada, and HTC from Taiwan, in the list of top five smartphone makers, IDC said.

BlackBerry was producing what was essentially a smartphone before Apple introduced the iPhone in June 2007.

The first BlackBerry device was an email pager, introduced in 1999. Those devices were subsequently combined with voice calling.

Nokia has long been a top producer of mobile phones, though it slipped off the top five list for the first quarter.

A year ago, it was common to see previous market leaders Nokia, BlackBerry and HTC among the top five, said Ramon Llamas, an analyst at IDC.

IDC ranked the top five smartphone vendors in the first quarter by units shipped as: Samsung (70.7 million); Apple (37.4 million); LG (10.3 million); Huawei (9.9 million); and ZTE (9.1 million). The rest made up 36.4% of the market.
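Those unit figures translate into the market shares below; the remainder works out to the 36.4% IDC cites for “the rest.”

```python
# Market shares implied by IDC's Q1 2013 smartphone shipment figures (millions).
total = 216.2
shipments = {"Samsung": 70.7, "Apple": 37.4, "LG": 10.3, "Huawei": 9.9, "ZTE": 9.1}

for vendor, units in shipments.items():
    print(f"{vendor}: {units / total:.1%}")

others = total - sum(shipments.values())
print(f"Others: {others / total:.1%}")  # ~36.4%, matching IDC's figure
```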

IDC ranked the top five vendors of feature phones and smartphones combined as: Samsung (27.5%); Nokia (14.8%); Apple (8.9%); LG (3.7%) and ZTE (3.2%). All others combined to hold 41.9% of the market.

LG showed a dramatic 110% year-over-year climb in smartphone shipments, while Huawei’s grew by 94% and Samsung’s by 61%. ZTE’s smartphone shipments grew by 49% and Apple’s by just 6.6%.

The last time Apple posted just a single digit year-over-year growth rate was in the third quarter of 2009. Apple has been in the second spot in smartphone rankings in each of the last five quarters, IDC noted.

Samsung, meanwhile, shipped more smartphones in the first quarter than the next four vendors combined, making it the “undisputed leader” in the worldwide smartphone market, IDC said.

Samsung’s next generation Galaxy S4 smartphone is about to go on sale, while Samsung is also building a new OS, called Tizen, that will run new smartphones later this year.

Source:  computerworld.com

DOJ identifies lower frequency spectrum as key to wireless competition

Sunday, April 14th, 2013

The Department of Justice has provided the FCC with new recommendations for governing spectrum auctions, and with a heavy emphasis on leveling the playing field, the findings are likely to draw the ire of AT&T and Verizon. In its briefing, the DOJ made its case that the nation’s two largest carriers currently hold market power, which is due to the heavy concentration of lower frequency spectrum (below 1,000MHz) allocated to the two incumbents.

According to DOJ officials, “This results in the two smaller nationwide carriers having a somewhat diminished ability to compete, particularly in rural areas, where the cost to build out coverage is higher with high-frequency spectrum.” Although the DOJ never came right out and said it, one can easily surmise that it’s guiding the FCC to establish rules that favor smaller carriers — namely Sprint and T-Mobile — in future low-frequency spectrum auctions. In the DOJ’s opinion, an incumbent carrier would need to demonstrate both compelling evidence of capacity constraints and an efficient use of its current licenses in order to gain additional lower frequency spectrum. Otherwise, the opportunity exists for AT&T and Verizon to snap up licenses simply in an attempt to harm competitors.

Given that the FCC and DOJ share the responsibility of ensuring competition in the marketplace, it seems unlikely that this latest brief will fall on deaf ears.

Source:  engadget.com

Cisco acquisition highlights growing interest in small cells

Thursday, April 4th, 2013

Operators and telecom equipment vendors are showing a growing interest in small cells, which aim to give users improved coverage and speeds.

In the latest development, Cisco Systems will acquire British small cell specialist Ubiquisys. While the concept of using smaller base stations or cells to improve coverage and mobile data speeds is not new, this year will see it take a big step forward, propelled by 4G small cells deployed for capacity upgrades, according to Stephane Teral, principal analyst at Infonetics Research.

“AT&T, Sprint, and Verizon Wireless in the U.S., Vodafone in Europe, LG U+ in South Korea, and NTT DoCoMo in Japan have all announced major small cell plans, driven by the need to enhance the capacity of saturated macro cellular networks,” he said in a recent research note.

Small cells are miniature cellular base stations. They provide a low-power signal much closer to mobile users than traditional macro networks, resulting in better voice quality, higher data performance and less toll on batteries, according to Ubiquisys, which has pioneered the technology.

“Eventually networks should have fewer coverage holes. So when you are deep down in the basement of a convention center for a meeting you should still have bars on your phone,” said Daryl Schoolar, principal analyst at Ovum, who thinks of small cells as a mobile network enhancer.

The initial installations will be focused on indoor spaces like convention centers as well as stadiums, hotels and airports where there are many users.

Installing small cells indoors also means connecting the equipment to the operator’s network won’t be a big issue, since there is always at least a DSL line, Teral said via email. Connecting small cells located outdoors can be trickier because power and network connections may not be as readily available. But vendors are trying to address at least the latter by offering wireless products, which was one of the big trends at this year’s Mobile World Congress.

For vendors, a business that is expected to be generating $2.7 billion by 2017 is at stake, according to Infonetics’ estimates.

While Ubiquisys will give Cisco the know-how, products and credibility it needs to compete, it will face tough competition from vendors like Alcatel-Lucent, Ericsson and Nokia Siemens Networks. Ericsson argues that small cells should be rolled out in a coordinated way with the rest of the network to mitigate interference, which is a legitimate position, according to Schoolar. Not having all sizes of base stations will make mitigating interference more of a challenge for Cisco, but its acquisition of Intucell earlier this year should help address that, he said.

Intucell’s software works with equipment from multiple vendors and can detect coverage, overload and other issues in real time and automatically make adjustments in response. For example, when too many users are connected to one base station, the system automatically adjusts coverage by getting assistance from nearby towers.

Even though interest in small cells is growing, it will take time for the vision they propose to be fully realized.

“What we’ll see this year are trials and limited deployments with commercial-ready gear, and next year you’ll actually see launches take place … The question used to be if you were going to need [small cells] or not, but now it’s more about how to best deploy them,” Schoolar said.

Source:  computerworld.com

Wireless dead zones aren’t just in rural areas

Thursday, April 4th, 2013

Wireless dead zones continue to give people headaches, not only in rural areas but also in cities and suburbs.

While the country’s major carriers advertise their widespread coverage, the level of coverage for voice and data is really more of a patchwork. To be sure, the U.S. has tremendous wireless coverage from a variety of carriers, but the Federal Communications Commission acknowledged last year that 19 million people, a significant minority, still don’t have good fixed broadband coverage that better wireless coverage could improve upon.

The U.S. leads the world in 4G LTE development, but there’s still a critical concern about whether the major carriers will ever sell smartphones that interoperate across all the spectrum bands being used for LTE by each carrier.

Those people who complain about spotty or slow or non-existent coverage aren’t all living out in the sticks.  Computerworld readers gave some examples of their poor service in comments to a story that appeared yesterday (April 2): “Why some U.S. homes and businesses still don’t have cellular service.”

The article described wireless service problems for businesses in Wyoming County, in West Virginia coal country. Wireless carriers are constrained by the mountainous geography there, but newer approaches, including small cells, haven’t provided much help. The Federal Communications Commission has identified more than a dozen counties with poorer service.

Readers from western Kentucky, Occidental, Calif., and 25 miles from Ann Arbor, Mich., described wireless service problems in their comments.

“I live 25 miles from Ann Arbor (a high tech area…) and have to stand on a hill to get a signal on my cell phone,” wrote Rob Karatzas in a comment. “We have NO high-speed Internet…”

Added Tony DeYoung: “It is not just low-income, rural or less-educated areas. I have a house in Occidental, Calif. This is a haven for hi-tech, affluent San Francisco residents who want to move their businesses out of the city. But we can’t. We have spotty high-speed Internet, zippo on Verizon or AT&T coverage. It is a total constraint on economic growth.”

In an interview, Glen Gore, president of Taloga Cable TV in Dewey, Okla., about 75 miles northwest of Oklahoma City, described how he had to switch from AT&T after 22 years to Verizon to get fast data speeds for his iPhone 5. He argued for more competition in more areas.

“Competition is always good and there ought to be more than one wireless carrier,” Gore said. “All the talk of competition goes out the window when you’re in a rural area like here. All the big carriers want to only work in big cities where they can recoup their investment, but here we have oil and gas, wind towers and agricultural users—all with cell phone data needs. There’s a lot of technical stuff in the rural areas anymore and it would be nice to have more than one carrier to consider.”

The comments readers bring to Computerworld in emails and phone calls typically talk about how the smartphone era and the hip TV marketing of smartphones have focused heavily on all the advantages that the cool new technology brings us—seemingly instant communications with data and voice.  Those advantages of having access to the Web on the go or voice service are clearly wondrous when there’s wireless service but essentially useless when there’s not good service, readers note.

The dilemma becomes more complex when workers have to live with phones they buy to use for work under Bring Your Own Device scenarios.  Service contracts lock customers in for two years, and while a sexy new smartphone might work at the office and home, it might not work when visiting a customer. I hear this concern all the time.

How can the problem of poor wireless service be fixed?  There are no easy answers and the FCC and state regulators have a raft of approaches that have been underway for years. In a country as geographically large as the U.S., it’s clear that service will always be sub-par in some areas.

The major wireless carriers are spending $25 billion annually to expand their networks, although most of their efforts are for expanding 4G LTE into more densely populated areas.  Public interest groups are backing proposals that would require coming LTE smartphones to work across different spectrum bands of different U.S. wireless carriers, although that will likely result in higher handset costs.

For individuals concerned about poor cell phone service, it might help to complain to the local wireless carriers. However, most readers who contact Computerworld say that process is not productive. We’re hearing from the ones who haven’t had much success.

The FCC has a formal complaint process that might help. It  includes filing a Form 2000B either online or by mail. There’s a “File a Consumer Complaint” tab on the www.fcc.gov home page on the right-hand column, which eventually leads to a “Billing, Privacy or Service Quality Complaint” page.   You have to be willing to describe your service problem with your number and carrier’s name. If you live where there’s no carrier, you might be out of luck but can call the FCC at 1-888-CALL-FCC (1-888-225-5322) or 1-888-TELL-FCC (1-888-835-5322).

To be sure, the FCC knows there are areas with wireless service problems and has taken proactive steps to address the more severe gaps in service in places such as Tribal Nations.

If you type a search for “Wireless service problems” on the home page, you are first directed to an FCC Encyclopedia entry that concerns “problems with long distance or wireless calling to rural areas.” (It’s interesting to note that the referenced explanation is for wireless “calling” and not for data.) That entry urges customers with problems to contact their carriers, or file a Form 2000B complaint.

It also notes that “carrier practices that lead to call completion failure and poor call quality may violate the Communications Act’s prohibition on unjust and unreasonable practices and violate a carrier’s obligations under the Act…”

But the FCC in its wisdom also added these words: “We recognize that there is still more to be done—and will be doing more. We share the concern about this problem and its impact on rural consumers and businesses, and are dedicated to ensuring that all Americans receive high-quality services.”

Despite that wording mentioning “rural,” various FCC officials have acknowledged that gaps in wireless coverage extend beyond rural areas, something all too apparent to Computerworld readers.

Source:  computerworld.com

The 49ers’ plan to build the greatest stadium Wi-Fi network of all time

Tuesday, March 19th, 2013

When the San Francisco 49ers’ new stadium opens for the 2014 NFL season, it is quite likely to have the best publicly accessible Wi-Fi network a sports facility in this country has ever known.

The 49ers are defending NFC champions, so 68,500 fans will inevitably walk into the stadium for each game. And every single one of them will be able to connect to the wireless network, simultaneously, without any limits on uploads or downloads. Smartphones and tablets will run into the limits of their own hardware long before they hit the limits of the 49ers’ wireless network.

Jon Brodkin

Until now, stadium executives have said it’s pretty much impossible to build a network that lets every single fan connect at once. They’ve blamed this on limits in the amount of spectrum available to Wi-Fi, despite their big budgets and the extremely sophisticated networking equipment that largesse allows them to purchase. Even if you build the network perfectly, it would choke if every fan tried to get on at once—at least according to conventional wisdom.

But the people building the 49ers’ wireless network do not have conventional sports technology backgrounds. Senior IT Director Dan Williams and team CTO Kunal Malik hail from Facebook, where they spent five years building one of the world’s largest and most efficient networks for the website. The same sensibilities that power large Internet businesses and content providers permeate Williams’ and Malik’s plan for Santa Clara Stadium, the 49ers’ nearly half-finished new home.

“We see the stadium as a large data center,” Williams told me when I visited the team’s new digs in Santa Clara.

I had previously interviewed Williams and Malik over the phone, and they told me they planned to make Wi-Fi so ubiquitous throughout the stadium that everyone could get on at once. I had never heard of such an ambitious plan before—how could this be possible?

Today’s networks are impressive—but not unlimited

An expansive Wi-Fi network at this year’s Super Bowl in the New Orleans Superdome was installed to allow as many as 30,000 fans to get online at once. This offloaded traffic from congested cellular networks and gave fans the ability to view streaming video or do other bandwidth-intensive tasks meant to enhance the in-game experience. (Don’t scoff—as we’ve noted before, three-plus-hour NFL games contain only 11 minutes of actual game action, or a bit more if you include the time quarterbacks spend shouting directions at teammates at the line of scrimmage. There is plenty of time to fill up.)

Superdome officials felt a network allowing 30,000 simultaneous connections would be just fine, given that the previous year’s Super Bowl saw only 8,260 at its peak. They were generally right, as the network performed well, even during part of the game’s power outage.

The New England Patriots installed a full-stadium Wi-Fi network this past season as well. It was never used by more than 10,000 or so people simultaneously, or by more than 16,000 people over the course of a full game. “Can 70,000 people get on the network at once? The answer to that is no,” said John Brams, director of hospitality and venues at the Patriots’ network vendor, Enterasys. “If everyone tried to do it all at once, that’s probably not going to happen.”

But as more fans bring smart devices into stadiums, activities like viewing instant replays or live camera angles available only to ticket holders will become increasingly common. It’ll put more people on the network at once and require bigger wireless pipes. So if Williams and Malik have their way, every single 49ers ticket holder will enjoy a wireless connection faster than any wide receiver sprinting toward the end zone.

“Is it really possible to give Wi-Fi to 68,500 fans at once?” I asked. I expected some hemming and hawing about how the 49ers will do their best and that not everyone will ever try to use the network at once anyway.

“Yes. We can support all 68,500,” Williams said emphatically.

How?

“How not?” he answered.

Won’t you have to limit the capacity each fan can get?

Again, absolutely not. “Within the stadium itself, there will probably be a terabit of capacity. The 68,500 will not be able to penetrate that. Our intentions in terms of Wi-Fi are to be able to provide a similar experience that you would receive with LTE services, which today is anywhere from 20 to 40 megabits per second, per user.

“The goal is to provide you with enough bandwidth that you would saturate your device before you saturate the network,” Williams said. “That’s what we expect to do.”
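Williams’ numbers can be sanity-checked with a little arithmetic: aggregate demand is users times per-user rate times the fraction of fans active at any instant. The concurrency figures in the sketch below are assumptions for illustration, not 49ers projections.

```python
# Back-of-the-envelope aggregate demand; concurrency factors are assumptions.
users = 68_500
per_user_mbps = 30            # midpoint of the 20-40 Mbps LTE-like target
backbone_tbps = 1.0           # "probably a terabit of capacity"

for concurrency in (0.10, 0.25, 0.50):   # fraction of fans active at once (assumed)
    demand_tbps = users * per_user_mbps * concurrency / 1_000_000
    headroom = backbone_tbps - demand_tbps
    print(f"{concurrency:.0%} active: ~{demand_tbps:.2f} Tbps demand "
          f"({'fits within' if headroom >= 0 else 'exceeds'} the backbone)")
```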

Fans won’t be limited by what section they’re in, either. If the 49ers offer an app that allows fans to order food from their seats, or if they offer a live video streaming app, they’ll be available to all fans.

“The mobile experience should not be limited to, ‘Hey, because you sit in a club seat you can see a replay, but because you don’t sit in a club seat you can’t see a replay,'” Malik said. “That’s not our philosophy. Our philosophy is to provide enhancement of the game experience to every fan.” (The one exception would be mobile features designed specifically for physical features of luxury boxes or club seats that aren’t available elsewhere in the stadium.)

It’s the design that counts

Current stadium Wi-Fi designs, even with hundreds of wireless access points distributed throughout a stadium, often can support only a quarter to a half of fans at once. They also often limit bandwidth for each user to prevent network slowdowns.

The Patriots offer fans a live video and instant replay app, with enough bandwidth to access video streams, upload photos to social networks, and use the Internet in general. Enterasys confirmed to Ars that the Patriots do enforce a bandwidth cap to prevent individual users from overloading the network, but Enterasys would not say exactly how big the cap is. The network has generally been a success, but some users of the Patriots app have taken to the Android app store to complain about the stadium Wi-Fi’s performance.

According to Williams, most current stadium networks are limited by a fundamental problem: sub-optimal location of wireless access points.

“A typical layout is overhead, one [access point] in front of the section, one behind the section, and they point towards each other,” he said. “This overhead design is widely used and provides enough coverage for those using the design.”

Williams would not reveal the exact layout of the 49ers’ design, perhaps to prevent the competition from catching on. How many access points will there be? “Zero to 1,500,” he said in a good-natured attempt to be both informative and vague.

That potentially doubles or quadruples the typical amount of stadium access points—the Super Bowl had 700 and the Patriots have 375. But this number isn’t the most important thing. “The number of access points will not give you any hint on whether the Wi-Fi is going to be great or not,” Malik said. “Other factors control that.”

If the plan is to generate more signal strength, just adding more access points to the back and front of a section won’t do that.

The Santa Clara Stadium design “will be unique to football stadiums,” Williams said. “The access points will be spread and distributed. It’s really the best way to put it. Having your antennas distributed evenly around fans.” The 49ers are testing designs in Candlestick Park and experimenting with different access points in a lab. The movement of fans and the impact of weather on Wi-Fi performance are among the factors under analysis.

“Think of a stadium where it’s an open bowl, it’s raining, people are yelling, standing, how do you replicate that in your testing to show that if people are jumping from their seats, how is Wi-Fi going to behave, what will happen to the mobile app?” Malik said. “There is a lot that goes on during a game that is hard to replicate in your conceptual simulation testing. That is one of the big challenges where we have to be very careful.”

“We will make great use of Candlestick over the next year as we continue to test,” Williams said. “We’re evaluating placement of APs and how that impacts RF absorption during the game with folks in their seats, with folks out of their seats.”

Wi-Fi will be available in the stands, in the suites, in the walkways, in the whole stadium. The team has not yet decided whether to make Wi-Fi available in outdoor areas such as concourses and parking lots.

The same could theoretically be done at the 53-year-old Candlestick Park, even though it was designed decades before Wi-Fi was invented. Although the stadium serves as a staging ground for some of the 49ers’ wireless network tests, public access is mainly limited to premium seating areas and the press box.

The reason Wi-Fi in Candlestick hasn’t been expanded is a practical one. With only one year left in the facility, the franchise has decided not to invest any more money in its network. But Williams said 100 percent Wi-Fi coverage with no bandwidth caps could be done in any type of stadium, no matter how old. He says the “spectrum shortage” in stadiums is just a myth.

With the new stadium still undergoing construction, it was too early for me to test anything resembling Santa Clara Stadium’s planned Wi-Fi network. For what it’s worth, I was able to connect to the 49ers’ guest Wi-Fi in their offices with no password, and no problems.

The 2.4GHz problem

There is one factor preventing better stadium Wi-Fi that even the 49ers may not be able to solve, however. Wi-Fi works on both the 2.4GHz and 5GHz bands. Generally, 5GHz is better because it offers more powerful signals, less crowded airwaves and more non-overlapping channels that can be devoted to Wi-Fi use.

The 2.4GHz band has 11 channels overall and only three that don’t overlap with each other. By using somewhat unconventionally small 20MHz channels in the 5GHz range, the 49ers will be able to use about eight non-overlapping channels. That’s despite building an outdoor stadium, which is more restricted than indoor stadiums due to federal requirements meant to prevent interference with systems like radar.

Each 49ers access point will be configured to offer service on one channel, and access points that are right next to each other would use different channels to prevent interference. So even if you’re surrounding fans with access points, as the 49ers plan to, they won’t interfere with each other.
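That “neighboring APs on different channels” idea reduces to a simple assignment problem: walk the list of access points and give each one a 5GHz channel that none of its immediate neighbors is already using. The channel list and the tiny neighbor map in the sketch below are illustrative assumptions, not the 49ers’ actual plan.

```python
# Illustrative greedy channel assignment; channels and topology are assumptions.
CHANNELS = [36, 40, 44, 48, 149, 153, 157, 161]  # eight non-overlapping 20MHz channels

def assign_channels(neighbors: dict[str, list[str]]) -> dict[str, int]:
    assignment: dict[str, int] = {}
    for ap in neighbors:
        used_nearby = {assignment[n] for n in neighbors[ap] if n in assignment}
        assignment[ap] = next(ch for ch in CHANNELS if ch not in used_nearby)
    return assignment

# Hypothetical cluster of access points around one seating section.
section = {
    "ap-101": ["ap-102", "ap-103"],
    "ap-102": ["ap-101", "ap-103", "ap-104"],
    "ap-103": ["ap-101", "ap-102", "ap-104"],
    "ap-104": ["ap-102", "ap-103"],
}
print(assign_channels(section))
# e.g. {'ap-101': 36, 'ap-102': 40, 'ap-103': 44, 'ap-104': 36}
```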

But what if most users’ devices are only capable of connecting to the limited and crowded 2.4GHz band? Enterasys said 80 percent of Patriots fans connecting to Wi-Fi this past season did so from devices supporting only the 2.4GHz band, and not the 5GHz one.

“You have to solve 2.4 right now to have a successful high-density public Wi-Fi,” Brams said.

The iPhone 5 and newer Android phones and tablets do support both the 2.4GHz and 5GHz bands, however. Williams said by the time Santa Clara Stadium opens in 2014, he expects 5GHz-capable devices to be in much wider use.

When asked if the 49ers would be able to support 100 percent of fans if most of them can only connect to 2.4GHz, Williams showed a little less bravado.

“For those 2.4 users we will certainly design it so that there’s less interference,” he said. “It is a more dense environment if you are strictly constrained in 2.4, but we are not constrained in 2.4. We’re not trying to answer the 2.4 problem, because we have 5 available.”

“It’s 2013, we have another year and a half of iteration,” he also said. “We’ll probably be on, what, the iPhone 7 by then? The move to 5GHz really just makes us lucky. We’re doing this at the right time.”

Building a stadium in Facebook’s image

Williams and Malik both joined the 49ers last May. Malik was hired first, and then brought his old Facebook friend, Williams, on board. Malik had been the head of IT at Facebook, while Williams was the website’s first network engineer and later a director. They both left the site, basically because they felt there was nothing left to accomplish. Williams did some consulting, and Malik initially planned to take some time off.

Williams was a long-time 49ers season ticket holder, but that was far from the only thing that sold him on coming to the NFL.

“I had been looking for something challenging and fun again,” Williams said. “Once you go through an experience like Facebook, it’s really hard to find something that’s similar. When Kunal came to me, I remember it like it was yesterday. He said, ‘If you’re looking for something like Facebook you’re not going to find it. Here’s a challenge.'”

“This is an opportunity to change the way the world consumes live sports in a stadium,” Malik said. “The technology problems live sports has today are unsolved and no one has ever done what we are attempting to do here. That’s what gets me out of bed every day.”

Williams and Malik have built the 49ers’ network in Facebook’s image. That means each service—Wi-Fi, point-of-sale, IPTV, etc.—gets its own autonomous domain, a different physical switching system to provide it bandwidth. That way, problems or slowdowns in one service do not affect another one.

“It’s tribal knowledge that’s only developed within large content providers, your Facebooks, your Googles, your Microsofts,” Williams said. “You’ll see the likes of these large content providers build a different network that is based on building blocks, where you can scale vertically as well as horizontally with open protocols and not proprietary protocols.

“This design philosophy is common within the content provider space but has yet to be applied to stadiums or venues. We are taking a design we have used in the past, and we are applying it here, which makes sense because there is a ton of content. I would say stadium networks are 10 years behind. It’s fun for us to be able to apply what we learned [at Facebook].”

The 49ers are still evaluating what Wi-Fi equipment they will use. The products available today would suit them fine, but by late 2014 there will likely be stadium-class access points capable of using the brand-new 802.11ac protocol, which allows greater throughput in the 5GHz range than the widely used 802.11n. 11ac consumer devices are rare today, but the 49ers will use 802.11ac access points to future-proof the stadium if appropriate gear is available. 11ac is backwards compatible with 11n, so supporting the new protocol doesn’t leave anyone out—the 49ers also plan to support previous standards such as 11a, 11b, and 11g.

802.11ac won’t really become crucial until 802.11n’s 5GHz capabilities are exhausted, said Daren Dulac, director of business development and technology alliances at Enterasys.

“Once we get into 5GHz, there’s so much more capacity there that 11ac doesn’t even become relevant until we’ve reached capacity in the 5GHz range,” he said. “We really think planning for growth right now in 5GHz is acceptable practice for the next couple of years.”

Santa Clara Stadium network construction is expected to begin in Q1 2014. Many miles of cabling will support the “zero to 1,500” access points, which connect back to 48 server closets or mini-data centers in the stadium that in turn tie back to the main data center.

“Based on service type you plug into your specific switch,” Williams said. “If you’re IPTV, you’re in an IPTV switch, if you’re Wi-Fi you’re in a Wi-Fi switch. If you’re in POS [point-of-sale], you’re in a POS switch. It will come down to a Wi-Fi cluster, an IPTV cluster, a POS cluster, all autonomous domains that are then aggregated by a very large fabric, that allows them to communicate lots of bandwidth throughput, and allows them to communicate to the Internet.”

Whereas Candlestick Park’s network uses Layer 2 bridging—with all of the Wi-Fi nodes essentially on a single LAN—Santa Clara Stadium will rely on Layer 3 IP routing, turning the stadium itself into an Internet-like network. “We will be Layer 3 driven, which means we do not have the issue of bridge loops, spanning tree problems, etc.,” Williams said.
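A Layer 3 design like that typically means each closet, and each service within it, gets its own routed subnet instead of sitting on one giant bridged LAN. The sketch below carves per-closet, per-service subnets out of an assumed private address block using Python’s standard ipaddress module; the block and the addressing plan are purely illustrative.

```python
# Illustrative Layer 3 addressing plan: one routed /24 per service per closet.
# The 10.64.0.0/16 block and the service list are assumptions; 48 closets per the article.
import ipaddress

stadium_block = ipaddress.ip_network("10.64.0.0/16")
subnets = stadium_block.subnets(new_prefix=24)   # 256 available /24s

services = ["wifi", "iptv", "pos"]
closets = [f"closet-{n:02d}" for n in range(1, 49)]   # 48 server closets

plan = {(closet, svc): next(subnets) for closet in closets for svc in services}

print(plan[("closet-01", "wifi")])   # 10.64.0.0/24
print(plan[("closet-48", "pos")])    # 10.64.143.0/24
print(len(plan), "routed subnets allocated")
```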

Keeping the network running smoothly

Wireless networks should be closely watched during games to identify interference from any unauthorized devices and identify usage trends that might result in changes to access points. At the Patriots’ Gillette Stadium, management tools show bandwidth usage, the number of fans connected to each access point, and even what types of devices they’re using (iPhone, Android, etc.). If an access point was overloaded by fans, network managers would get an alert. Altering radio power, changing antenna tilt, or adding radios may be required, but generally any major changes are made between games.
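That kind of alerting reduces to a simple check against a per-AP threshold once the management system exposes client counts. The sketch below is a generic illustration with made-up numbers, not the Enterasys tooling.

```python
# Generic overload check; thresholds and readings are made-up illustrations,
# not output from the Patriots' or 49ers' actual management systems.
CLIENTS_PER_AP_LIMIT = 50   # assumed comfortable ceiling per access point

def overloaded_aps(client_counts: dict[str, int], limit: int = CLIENTS_PER_AP_LIMIT):
    return [(ap, count) for ap, count in client_counts.items() if count > limit]

readings = {"ap-north-12": 23, "ap-catwalk-03": 61, "ap-suite-07": 48}
for ap, count in overloaded_aps(readings):
    print(f"ALERT: {ap} has {count} clients (limit {CLIENTS_PER_AP_LIMIT})")
```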

Dashboard view of the Patriots’ in-game connectivity (image credit: Enterasys).

“In terms of real-time correction, it depends on what the event is,” said John Burke, a senior architect at Enterasys. “Realistically, some of these APs are overhead. If an access point legitimately went down and it’s on the catwalk above 300 [the balcony sections] you’re not going to fix that in the game. That’s something that would have to wait.”

So far, the Patriots’ capacity has been enough. Fans have yet to overwhelm a single access point. Even if they did, there is some overlap among access points, allowing fans to get on in case one AP is overloaded (or just broken).

The 49ers will use similar management tools to watch network usage and adjust access point settings in real time during games. “We expect to overbuild and actually play with things throughout,” Williams said. “Though we are building the environment to support 100 percent capacity, we do not expect 100 percent capacity to be used, so we believe we will be able to move resources around as needed [during each game].”

The same sorts of security protections in place in New England will be used in Santa Clara. Business systems will be password-protected and encrypted, and there will be encrypted tunnels between access points and the back-end network. While that level of protection won’t extend to the public network, fans shouldn’t be able to attack each other, because peer-to-peer connections will not be allowed.

What if the worst happens and the power goes out? During the Super Bowl’s infamous power outage, Wi-Fi did stay on for at least a while. Williams and Malik acknowledged that no system is perfect, but they said that they plan for Wi-Fi uptime even if power is lost.

“We have generators in place, and we’ll have UPS systems, so from a communications standpoint our plan is to keep all the communication infrastructure up and online [during outages],” Williams said. “But all of this stuff is man-made.”

A small team that does it all

Believe it or not, the 49ers have a tech team of fewer than 10 people, yet the organization is designing and building everything itself. Sports teams often outsource network building to carriers or equipment vendors, but not the 49ers. Besides building its own Wi-Fi network, the team will build a carrier-neutral distributed antenna system to boost cellular signals within the stadium.

“We are control freaks,” Williams said with a laugh. He explained that doing everything themselves makes it easier to track down problems, accept responsibility, and fix things. They also feel the need to take ownership of the project because none of the existing networks in the rest of the league approach what they want to achieve. They see a lot of low-hanging fruit simply in solving easy problems other franchises haven’t addressed.

Not all the hardware must be in-house, though. The 49ers will use cloud services like Amazon’s Elastic Compute Cloud when it makes sense.

“Let’s say we want to integrate a POS system with ordering,” Malik said. “If you have an app that lets you order food, and there’s a point of sale system, all the APIs and integration need to sit in the cloud. There’s no reason for it to sit in our data center.”

There are cases where the cloud is clearly not appropriate, though. Say the team captures video on site and distributes it to fans’ devices—pushing that video to a faraway cloud data center in the middle of that process would slow things down dramatically. And ultimately, the 49ers have a greater vision than just providing Wi-Fi to fans.

When I toured a preview center meant to show off the stadium experience to potential ticket buyers, a mockup luxury suite had an iPad embedded in the wall with a custom application for controlling a projector. That provides a hint of what the 49ers might provide.

“Our view is whatever you have at home you should have in your suite,” Williams said. “If that means there’s an iPad on the wall or an application you can use, hopefully that’s available. Your life should be much easier in this stadium.”

And whatever applications are built should be cross-platform. As Malik said, the 49ers are moving away from proprietary technologies to standards-based systems so they can provide nifty mobile features to fans regardless of what device they use.

Williams and Malik are already working long hours, and their jobs will get even more time-intensive when network construction actually begins. But they wouldn’t have it any other way—particularly the longtime season ticket holder Williams.

When work is “tied to something that you love deeply, which is sports, and tied to your favorite team in the world, that’s awesome,” Williams said. “I’m crazy about it, man. I get super passionate.”

Source:  arstechnica.com

911 tech pinpoints people in buildings—but could disrupt wireless ISPs

Tuesday, March 19th, 2013

FCC decision could wreak havoc on ISPs, baby monitors, and smart meters.

Cell phones replacing landlines are making it difficult to accurately locate people who call 911 from inside buildings. If a person having a heart attack on the 30th floor of a giant building can call for help but is unable to speak their location, actually finding that person from cell phone and GPS location data is a challenge for emergency responders.

Thus, new technologies are being built to accurately locate people inside buildings. But a system that is perhaps the leading candidate for enhanced 911 geolocation is also controversial because it uses the same wireless frequencies as wireless Internet Service Providers, smart meters, toll readers like EZ-Pass, baby monitors, and various other devices.

NextNav, the company that makes the technology, is seeking permission from the Federal Communications Commission to start commercial operations. More than a dozen businesses and industry groups oppose NextNav (which holds FCC licenses through a subsidiary called Progeny), saying the 911 technology will wipe out devices and services used by millions of Americans.

Harold Feld, legal director for Public Knowledge, a public interest advocacy group for copyright, telecom, and Internet issues, provided the best summary of these FCC proceedings in a very long and detailed blog post:

Depending on whom you ask, the Progeny Waiver will either (a) totally wipe out the smart grid industry, annihilate wireless ISP service in urban areas, do untold millions of dollars of damage to the oil and gas industry, and wipe out hundreds of millions (possibly billions) of dollars in wireless products from baby monitors to garage door openers; (b) save thousands of lives annually by providing enhanced 9-1-1 geolocation so that EMTs and other first responders can find people inside apartment buildings and office complexes; (c) screw up EZ-Pass and other automatic toll readers, which use neighboring licensed spectrum; or (d) some combination of all of the above.

That’s not bad for a proceeding you probably never heard about.

All eyes on the FCC

While the Progeny proceeding has flown under the radar, the FCC may be inching toward a decision. The FCC’s public meeting next Wednesday will tackle the problem of improving 911 services. Feld says the FCC seems to be close to making a decision, although the FCC itself did not respond to our requests for comment this week. All the public documents related to the proceeding are available on the FCC website.

NextNav’s website says the company “was founded in 2007 to solve the indoor positioning problem.” But it has no revenue today, and it won’t unless the FCC approves its application or it finds another line of business.

The Wireless Internet Service Providers Association (WISPA) is worried that the FCC will rule in Progeny’s favor, despite tests that WISPA and others believe prove Progeny service would degrade performance of many existing devices or render them unusable altogether.

“The FCC appears poised to completely disregard technical reality, disregard the record in their own proceeding and give final approval to Progeny to do something that’s going to be very disruptive to the band that’s been in use for 20 years harmoniously by millions of users,” Jack Unger, WISPA’s technical consultant, told Ars.

The band in question is 902-928MHz. This band is similar to Wi-Fi in that it permits many unlicensed uses such as the ones mentioned earlier in this article. It also permits a select few licensed uses, including Progeny’s M-LMS (Multilateration Location and Monitoring Service), which forms the backbone of its enhanced 911 service.

NextNav has already set up a network of roughly 60 transmitters to cover a 900-square-mile area including San Francisco, Oakland, and San Jose, NextNav CEO Gary Parsons told Ars in a phone interview. NextNav has begun deployments in the rest of the top 40 markets in the country, but the Bay Area is the only one fully built out.

“We’ve been actually broadcasting in the San Francisco and Silicon Valley area now, portions of it, for over three years,” Parsons said. NextNav has FCC licenses allowing it to transmit, but it needs a further approval in order to begin commercial operations, he said.

Progeny technology may not solve the 911 problem

For the system to work, the GPS chips in the next generation of cell phones would need to be slightly modified to allow communication with the Progeny network. That’s just a software upgrade, but one that has to be done before a phone is built, Parsons said.

Why is this necessary? GPS is good at locating people outside, but not indoors, Parsons said. “What we bring to the party is a location accuracy that is much more precise than that which is currently available, with the ability to identify vertically what floor you’re on,” Parsons said. “It’s one thing knowing what block you’re in, but if you’re trying to send someone to a heart attack victim on the 89th floor of the Chrysler Building in New York, you better hope they can tell you where they are.”

Results for the Progeny system are promising, but perhaps not enough so to declare it a winner. An FCC advisory committee known as CSRIC (Communications Security, Reliability, and Interoperability Council) gave Progeny high marks compared to contenders Qualcomm and Polaris in a report dated February 19, 2013. (Unger provided a copy of this report to Ars.)

Progeny claims horizontal accuracy to within 20 meters and vertical accuracy to within 2 meters. But the CSRIC report said that even today’s best technology isn’t consistent enough.

“[P]rogress has been made in the ability to achieve significantly improved search rings in both a horizontal and vertical dimension,” CSRIC wrote. “However, even the best location technologies tested have not proven the ability to consistently identify the specific building and floor, which represents the required performance to meet Public Safety’s expressed needs. This is not likely to change over the next 12-24 months. Various technologies have projected improved performance in the future, but none of those claims have yet been proven through the test bed process.”

One set of test results, interpreted in many different ways

NextNav and its opponents collaborated on a series of tests to determine how the Progeny system would interact with WISP signals and smart meters. The test results were released last October. The numbers themselves aren’t in dispute, but each side interprets them very differently.

Among Progeny’s opposition is the Part 15 Coalition (Part 15 of the FCC rules regulates the operation of low-power devices on unlicensed spectrum). Besides those already mentioned, Part 15 technology includes devices that monitor the safety of gas and oil pipelines, hearing aids, Plantronics headsets, and emergency response devices made by Inovonics, said Henry Goldberg, counsel for the Part 15 Coalition.

Goldberg told Ars that the Progeny system has an 80 percent duty cycle (meaning it operates 80 percent of the time), and that Progeny’s 30-watt transmissions would overwhelm the 1-watt transmissions used by numerous Part 15 devices.

This is one example of where the two sides interpret the same results differently. Parsons said each Progeny transmitter operates only 10 percent of the time, explaining that the 80 percent figure is true only when you add up the transmissions of devices within range of each other.

“What [Progeny opponents] generally fail to note is the ones they are seeing that are far away have a very weak signal coming in,” Parsons said. “They might see one or two strong ones and six more that are miles away and at a much lower intensity level.”
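
The disagreement is easier to see with a toy model. Assuming each beacon really does transmit 10 percent of the time, the combined duty cycle a receiver sees depends on how the beacons’ transmissions line up in time; both formulas below are simplifications, not the parties’ actual test methodology.

```python
# Simplified duty-cycle arithmetic: each beacon transmits 10% of the time,
# but a receiver within range of several beacons sees a higher combined figure.
PER_BEACON_DUTY = 0.10

def combined_duty_time_slotted(beacons: int) -> float:
    """Beacons transmit in non-overlapping slots: duty cycles simply add."""
    return min(1.0, beacons * PER_BEACON_DUTY)

def combined_duty_independent(beacons: int) -> float:
    """Beacons transmit independently: chance at least one is on the air."""
    return 1.0 - (1.0 - PER_BEACON_DUTY) ** beacons

for n in (1, 8):
    print(n, "beacons:",
          f"slotted {combined_duty_time_slotted(n):.0%},",
          f"independent {combined_duty_independent(n):.0%}")
# Eight time-slotted beacons give the 80% figure cited by Progeny's opponents.
```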

Progeny operates on a total of 4MHz in the 902-928MHz band, roughly 920-922MHz and 926-928MHz, he said. Smart meters that periodically report statistics to utilities could occasionally miss one transmission due to interference from Progeny but get the data through the next time by hopping frequencies, Parsons said.

Smart meter maker Itron said in a recent filing that “Any radio receiver mounted outdoors is subject to multiple beacons, experiencing the effect of cumulative duty cycles which, as Itron has shown, would be 80-90% in densely deployed areas… testing shows that unlicensed devices cannot co-exist with the Progeny system on its frequencies, which means that, at the very least, Progeny’s operations will take away 4 MHz of spectrum from unlicensed use, that the compression effect will further degrade use of the remaining spectrum, and that some users will experience greater loss of the spectrum.”

Unger believes wireless Internet service providers are at the greatest risk of interference from Progeny systems. WISPs serve more than 3 million users nationwide, with perhaps a quarter of them on the 900MHz band, he said.

This service is primarily for rural areas where customers have no other options. WISP speed is already low, from 500Kbps to 3 or 4 Mbps, and at its worst, interference from Progeny could reduce download speeds by 47.9 percent and upload speeds by 41.5 percent, Unger said. Besides lower speeds, interference could result in lost connections, he said.
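
As a quick worked example using the figures Unger cites, here is what the worst-case download reduction would do to links at the low and high ends of those speeds; this is an illustration only, and real-world impact would vary by link.

```python
# Worst-case throughput arithmetic from the figures cited above.
WORST_CASE_DOWNLOAD_LOSS = 0.479  # 47.9 percent reduction

for mbps in (0.5, 3.0, 4.0):
    degraded = mbps * (1 - WORST_CASE_DOWNLOAD_LOSS)
    print(f"{mbps:.1f} Mbps download -> {degraded:.2f} Mbps in the worst case")
```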

The interference isn’t really that bad, Progeny says

Progeny put a more positive spin on those numbers in a filing. “In two of the co-frequency tests the BWA [broadband wireless access] throughput reduction did reach 47.9 and 49 percent,” Progeny said. “Most of the co-frequency tests documented much lower levels of BWA throughput reduction, however, with two co-frequency tests documenting reductions of just 2.5 and 8.3 percent. In fact, when the two worst case outliers are excluded from the results, the average throughput reduction for even the co-frequency tests drops to 16.33 percent.”

Unger said the WISPs’ 4-watt transmissions would be wiped out by Progeny’s much stronger ones. The 30 watts used by Progeny is measured in ERP (Effective Radiated Power), whereas the WISPs’ 4 watts is measured in EIRP (Effective Isotropic Radiated Power). Ultimately, this means the WISPs use 4 watts compared to Progeny’s 49.2 watts when measured on the same scale, Unger said.
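
The conversion behind that 49.2-watt figure is straightforward: EIRP is referenced to an isotropic radiator rather than a dipole, a difference of 2.15 dB, or a factor of roughly 1.64 in linear terms. A short check of the arithmetic:

```python
# ERP -> EIRP conversion: add the 2.15 dB gain of a half-wave dipole.
DIPOLE_GAIN_DB = 2.15

def erp_to_eirp_watts(erp_watts: float) -> float:
    """Convert ERP to EIRP (a factor of about 1.64)."""
    return erp_watts * 10 ** (DIPOLE_GAIN_DB / 10)

print(round(erp_to_eirp_watts(30.0), 1))  # Progeny: 30 W ERP -> ~49.2 W EIRP
# The WISPs' 4 W figure is already stated in EIRP, so it needs no conversion.
```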

Although Progeny uses just 4MHz of spectrum, it’s placed in such a way as to wipe out two of the three usable channels in the 902-928MHz band, WISPA argued in its most recent filing.

Progeny further counters that WISP equipment already has to avoid interference from other unlicensed devices, and that their networks could be designed to avoid interference from Progeny. “What the joint tests do show is that the impact of Progeny’s M-LMS network on BWA equipment is highly variable and can be affected significantly by the configuration of the BWA link, the choice and placement of antennas, and the proximity and direction toward Progeny’s M-LMS beacons,” Progeny wrote. “The test results also demonstrate that the impact of Progeny’s M-LMS network on BWA equipment is only a small fraction of the degradation that BWA networks already routinely experience from other users of the 902-928 MHz band.”

Parsons argued that the existence of Progeny’s network in the Bay Area without any complaints proves that it can co-exist with unlicensed devices. “It’s not like we’re asking to light up a network,” he said. “All of this interference potential that some of these parties are making political points about are not there in practical impact, because we’ve been operating for years.”

Unger counters that interference may well have gone undetected during those years of operation, because Progeny didn’t test for interference with existing devices except when the FCC demanded it. Of course, Progeny’s opponents believe those tests prove the network would disrupt devices in the band, and the opponents are numerous.

One Progeny, many opponents

Businesses and organizations filing opposition against Progeny, or at least demanding further testing, include Plantronics, Google, the Utilities Telecom Council, the Maryland Transportation Authority, the National Association of Regulatory Utility Commissioners, smart grid company Landis & Gyr, Inovonics, the New America Foundation, the American Petroleum Institute, the Alarm Industry Communications Committee, several individual utilities, EZ-Pass, and others.

A few members of Congress have weighed in. US Rep. Anna Eshoo (D-CA) wrote favorably of Progeny’s ability to improve 911 services but acknowledged that more work may be required to prevent interference with “other important spectrum users.”

US Sen. Maria Cantwell (D-WA) and Sen. Amy Klobuchar (D-MN) opposed Progeny’s request for a waiver.

“There are like 60 companies saying to the FCC there is a real problem and Progeny is the only one saying there’s no problem,” Unger said. “I’ve never seen this kind of an unbalanced record before.”

Goldberg said the Progeny proceeding reminds him of LightSquared, which wanted to build a nationwide LTE network but failed to gain FCC approval because of interference with GPS devices. Goldberg, who was counsel to LightSquared in that case, believes the Progeny proceeding could have a more favorable outcome for both sides if Progeny is willing to compromise.

“As originally proposed and as currently pushed by Progeny, it can’t live together with Part 15,” Goldberg said. “There’s a way for there to be more compatibility between Part 15 and Progeny but that means Progeny has to use lower power and less of a duty cycle. It has to look more like a Part 15 device.”

Parsons contends that Progeny has already compromised by using only one-way transmissions and using a duty cycle of 10 percent on each transmitter. “There was no need for us to put a 10 percent duty cycle in. We already gave up 90 percent,” he said. If Progeny reduced its power output to 4 watts, “we would have to put a lot more beacons, which frankly I’m not sure it really improves the situation much.”

The FCC will have to sort it all out. But the outcome seems to be up in the air because the FCC has not yet defined what an “unacceptable” level of interference would be in this case. The stakes are high for NextNav, for all its opponents, and for the millions of people using devices and services in the 902-928MHz band.

The Progeny case is also a perfect example of just how complicated an FCC proceeding can be. As Feld wrote, “For me, the Progeny Waiver is a microcosm of why it has become so damn hard to repurpose spectrum for new uses.”

Source:  arstechnica.com

Sensors lead to burst of tech creativity in government

Thursday, March 7th, 2013

Human and mechanical sensors are creating excitement in offices of government IT executives

LAS VEGAS — Here at an IBM conference, City of Boston CIO Bill Oates was telling the audience how citizens are using apps to improve city operations. But it was one of Boston’s latest apps, called Street Bump, that got the interest of one attendee, Gary Gilot, an engineer who heads the public works board in South Bend, Ind.

Information collected by the new app, which uses a smartphone’s accelerometer to record road conditions and send the data to public works workers, has already helped utilities to do a better job at making manhole covers even with the road, Oates said.
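
As a loose illustration of the idea, crowdsourced pothole detection can be as simple as flagging sharp vertical-acceleration spikes and reporting them with a GPS fix; the sketch below uses invented thresholds and field names, not Boston’s actual algorithm.

```python
# Illustrative Street Bump-style detection: report big vertical jolts with a GPS fix.
from typing import NamedTuple

class Sample(NamedTuple):
    lat: float
    lon: float
    vertical_accel_g: float  # deviation from 1 g, as read from the accelerometer

def detect_bumps(samples, threshold_g: float = 0.5):
    """Return (lat, lon) of samples whose vertical jolt exceeds the threshold."""
    return [(s.lat, s.lon) for s in samples if abs(s.vertical_accel_g) > threshold_g]

trip = [Sample(42.3601, -71.0589, 0.1), Sample(42.3605, -71.0582, 0.8)]
print(detect_bumps(trip))  # the 0.8 g jolt is the one that would be reported
```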

Street Bump will be the subject of a citywide publicity campaign this summer in an effort to attract more users, he added.

Gilot was struck by the app’s use of crowdsourcing to assess Boston roads.

South Bend has taken different approaches to the same problem.

It once had a half-dozen city supervisors spend six weeks each year driving every street in the city and rating them against a standard road-condition measure. Its latest effort was to hire a vendor to drive all South Bend streets and produce digital video for an analysis of pavement conditions.

But after hearing Oates explain how the Street Bump data was producing “big data” about road conditions, gathered by drivers who simply ran the app in their cars, Gilot had an admiring smile.

“We are behind them by a bunch,” said Gilot, who sees Boston’s app as a possible alternative to costly road surveys.

“I love the idea of the future — that you can avoid the expense by crowdsourcing,” said Gilot.

South Bend is not behind in the trend of using sensors to improve other operations.

For instance, the city has worked with IBM to create a wireless sensor system that detects changes in the sewer flow, and alerts the city to any problems detected. The system, which includes automated valves that can respond to issues, has reduced overflows and backups, said Gilot.
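
A minimal sketch of that kind of flow monitoring, with invented numbers and none of IBM’s actual analytics: compare each reading against a rolling baseline and alert when it deviates sharply.

```python
# Toy sewer-flow anomaly detector: alert when a reading strays far from a rolling baseline.
from collections import deque

class FlowMonitor:
    def __init__(self, window: int = 12, tolerance: float = 0.4):
        self.history = deque(maxlen=window)
        self.tolerance = tolerance  # allowed fractional deviation from baseline

    def reading(self, gallons_per_min: float) -> bool:
        """Record a reading; return True if it should raise an alert."""
        if len(self.history) == self.history.maxlen:
            baseline = sum(self.history) / len(self.history)
            if abs(gallons_per_min - baseline) > self.tolerance * baseline:
                return True  # e.g. trigger an automated valve or notify the city
        self.history.append(gallons_per_min)
        return False

monitor = FlowMonitor()
readings = [100, 102, 98, 101, 99, 100, 103, 97, 100, 102, 99, 101, 180]
print([monitor.reading(r) for r in readings][-1])  # the 180 gpm spike alerts
```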

Improving municipal operations is a major theme at the IBM conference. The company’s Smarter Planet initiative combines sensors, asset management, big data, mobile and cloud services into systems for managing government operations.

Boston and South Bend share in the use of sensors, one human-based and the other mechanical. The adoption of sensors, mobile apps and otherwise, appears to be leading to a burst of creativity in state and local governments.

Boston’s chief vehicle for connecting with residents is its Citizens Connect app. The city will release version 4.0 this summer, with changes that will make it easier for city workers to connect directly with residents.

Citizens Connect allows residents to report issues that need government action. Those issues might include a broken street light, trash, or graffiti. The reports are public.

Oates said the app encourages participation. To find out why people used the app, the city asked app users why they didn’t call the city about maintenance issues in the pre-app days.

The response, said Oates, was this: “When we call the city we feel like we’re complaining, but when we use this (the app), we feel like we’re helping.”

In discussing Street Bump, he said it’s entirely possible that analysis of the data may lead to new sources of information. Similarly, Gilot said the sewer data collection was making it possible to determine what “normal” was.

“You really don’t know what’s normal until you have this kind of modeling,” said Gilot.

The changes in Citizens Connect 4.0 will help personalize the connections that city residents make with government.

For instance, today a citizen sends in a pothole repair request and the city fills the pothole. With the update, the worker will be able to take a picture of the completed work and send it back to the constituent who sent the request.

The person who drew attention to the maintenance problem will be informed that “the case is closed, and here’s a picture and this is who did it for me,” said Oates.

The citizen will be able to respond with a “great job” acknowledgement, although Oates realizes negative feedback is also possible. “We think it puts pressure on the quality of the service delivery,” he said.

Boston gets about 20% of its maintenance “quality of life” requests via the app.

Boston’s effort is the forerunner of a Massachusetts state-wide initiative called Commonwealth Connect that was announced in December.

This state-wide app is being built by SeeClickFix, a startup whose app is already used in many cities and towns. The app is free. The firm offers a “premium dashboard” used by municipalities. It also has a free Web-based tool that is used by smaller towns, said Zack Beatty, head of media and content partnerships for the New Haven, Conn.-based firm.

Beatty said the app will be deployed in more than 50 Massachusetts communities, its first state-led deployment.

SeeClickFix uses cloud-based services to host its app, something South Bend is doing as well to manage its IBM sewer sensor system.

An in-house deployment would have required authorization to buy hardware, said Gilot. From a budgeting perspective, it was easier to move money from other accounts to pay for cloud-based services. In any event, running IT equipment is not the city’s core competence.

Source:  computerworld.com

FCC chairman vows continued spectrum expansion

Thursday, March 7th, 2013

‘The mobile infrastructure doesn’t work without spectrum,’ Genachowski says

The Federal Communications Commission remains focused on rapidly expanding spectrum for licensed and unlicensed use, and encouraging both research and products that will let it be used more efficiently, according to the commission’s boss.

That focus has served the U.S. well, according to FCC Chairman Julius Genachowski, who responded to questions during an event Wednesday at MIT in Cambridge, Mass. Four years ago, he told the audience, the U.S. was lagging in key wireless broadband indicators compared to Asia and Europe. Today, the nation has “leapfrogged other countries,” he said.

“Mobile innovation is U.S.-driven,” he said. “The percentage of global mobile devices that have an American OS has gone from under 20% to over 80%. Apps are American-driven. And in [mobile] infrastructure, America has more LTE customers than the rest of the world combined.” In the last two years alone, private investment in mobile networks has totaled about $65 billion.

Genachowski participated in a day of demonstrations, lectures and a late afternoon Q&A session hosted by Wireless@MIT, more formally known as the MIT Center for Wireless Networks and Mobile Computing. The Center was launched last October to be a “focal point for wireless research at MIT” with more than 50 MIT faculty members, research staff and grad students, working with seven founding partners: Amazon, Cisco, Intel, MediaTek, Microsoft Research, STMicroelectronics and Telefonica.

The country still faces spectrum challenges, Genachowski said, and policy makers need to be working on “freeing up more spectrum and having forward-looking spectrum policies. A smartphone puts a demand on spectrum that’s 25 times more than that of a feature phone.”

“The mobile infrastructure doesn’t work without spectrum,” he said. “No one anticipated the growth and demand we’re seeing now. It’s putting tremendous stress on the system. And we have to figure out ways to address that.”

Genachowski said there have been two major spectrum innovations in the last 30 years: the introduction of auctions to allocate spectrum and the introduction of unlicensed bands, which helped fuel the growth of Wi-Fi, Bluetooth, other RF technologies and the ecosystems of products and software that have sprung up around them.

“It seems inconceivable to me that these would be the last two innovations in spectrum policy,” he said.

Possible new ones include what he called “next generation unlicensed spectrum,” with higher ranges and lower frequencies, and “a lot more sharing of government spectrum for private use.”

The rationalizing of the 300 MHz of nationwide broadcast TV spectrum is an example of the kinds of changes he anticipates. Genachowski said the government has created incentives that persuade some of the broadcasters in every market to sell or share their spectrum. Then, the spectrum will be reorganized into something less than 300 MHz to meet the needs of the remaining broadcasters, with the difference being repackaged and, likely in 2014, auctioned for new uses.

He expects that a “significant part” of this spectrum will be reserved for unlicensed use.

At the same time, the FCC is moving forward to make 5,600 MHz of contiguous “white spaces” spectrum, also from TV bands, available for new uses. “A number of broadcasters have formed a group and they’re interested in a reverse auction and in participating in the FCC rulemaking” for it, Genachowski said.

This week, for example, the commission approved a Google project to collect information in a public database on white space spectrum (the gaps between TV bands) that can be used without intruding on protected transmissions like terrestrial TV and radio.

The latest opportunity to expand unlicensed spectrum was launched a few weeks ago, he reminded his audience, with the FCC announcement that it will expand the 5 GHz band by about 35%. [See “FCC will move to give more spectrum to Wi-Fi“] Separately, he said, “we need to create a new unlicensed platform that has different characteristics: higher power, higher range. It’s more complex. But at MIT and other places there’s wonderful research going on to sustain such a platform.”

The 5 GHz expansion will be “on the market in the next year or two.” The FCC will need to work with other agencies to allow spectrum sharing. “Our estimate is that about 60% of the usable spectrum for the kinds of uses we know and love is spectrum [today] controlled by the government,” he said. “But 60% is too much. Where we can clear and re-allocate and repack that spectrum into more efficient uses, we have to do that.”

Genachowski disagreed with an assertion that the value of unlicensed spectrum has “dramatically outstripped” that of licensed spectrum.

“They both provided a tremendous amount of value,” he said. “The ways they’re now working together are creating more value than either alone.”

The country needs policies that “incentivize” major capital investments in wireless broadband infrastructure. “It’s one thing to have wireless routers in our homes and offices, where they rely on existing wired infrastructure,” he said. “But to have wireless everywhere requires investment. We’ll see $35 billion of infrastructure investments this year, on top of $30 billion last year. And licensed spectrum made this possible.”

Source:  computerworld.com

U.S. mobile consumers spent $95B on data in 2012, topping what they spent on voice

Tuesday, March 5th, 2013

TIA report shows ‘historic transition’ for mobile industry.

Talk about a shift in behavior — or maybe that should be text about a shift.

It seemed only a matter of time, but today the Telecommunication Industry Association (TIA) said that 2012 marked the first time that U.S. wireless data spending topped voice spending. Also, according to the association’s 2013 ICT Market Review & Forecast report, there are more wireless subscriptions than there are adults in the country.

The “industry is squarely in the middle of an historic transition,” said Grant Seiffert, president of the association that represents high-tech manufacturers and suppliers of communications technology. “Wireless had a breakthrough year in 2012. … While wireless penetration will level off in the years ahead, infrastructure investments will continue surging in order to meet the heavy demand for mobile data.”

A number of factors will fuel a boom for the industry, he said, including more spending on cloud services and cybersecurity, and the continuing rise of smartphones and tablets.

Here are some specifics of the report:

  • Consumers spent $94.8 billion on mobile data services, versus $92.4 billion on voice.
  • Wireless penetration among adults reached 102.5 percent, and the TIA predicts that carriers will add 40.3 million subscribers over the next four years, for a penetration of 111.3 percent in 2016.
  • U.S. wireline spending was $39.1 billion in 2012, compared with $27 billion for wireless infrastructure. By 2016, wireline spending is expected to climb to $44.4 billion, while wireless will reach $38.4 billion.
  • The overall telecommunications industry experienced 7 percent worldwide growth in 2012, down 3 percent from 2011. While growth actually accelerated in the U.S. (from 5.9 percent in 2011 to 6.2 percent in 2012), international markets saw growth slow (from 11.3 percent in 2011 to 7.2 percent in 2012).

Source:  CNET

TD-LTE goes mainstream with a new performance promise

Friday, March 1st, 2013

China Mobile’s budding network is helping to pique interest in a different way of using scarce spectrum

A version of LTE that could give consumers more mobile bandwidth for downloading content or apps is moving from the margins to the mainstream at Mobile World Congress this week.

TD (Time-Division) LTE, which uses a single block of radio frequencies instead of the paired blocks used in typical FDD (Frequency-Division Duplexing) cellular networks, has shown up in many places at the world’s annual mobile gathering. Numerous carriers and vendors are building the technology into their gear and demonstrating uses for it, in a departure from the scant attention given TD-LTE a few years ago.

The big prize that shines over all this activity is the prospect of China Mobile’s planned national deployment of TD-LTE, which is still waiting on the Chinese government’s spectrum allocation but is already gathering steam with trial services in six cities. Yet carriers elsewhere are also using or planning to use TD-LTE, including Softbank in Japan, Sprint Nextel and Clearwire in the U.S., and operators in Brazil, Russia, India, Sweden, Saudi Arabia, and other countries. TD spectrum blocks are being set aside in yet other countries, including in a recent auction in the U.K.

“There’s a lot of momentum behind it, and it’s not all China,” said Ovum analyst Daryl Schoolar. Still, with more than 600 million subscribers, China Mobile is big enough to make TD-LTE attractive to network vendors, chip designers and device makers for a long time, he said. “The volume opportunity is going to keep everyone interested.”

At its booth at MWC, China Mobile showed off dozens of chips and devices designed for its planned network. They included smartphones from LG Electronics, Samsung Electronics, Huawei Technologies, ZTE and Quanta; USB dongles and personal hotspots from most of those vendors, and tablets from Huawei and Quanta. The display of chips included ones from big names such as Marvell Technology Group and Qualcomm. All of those devices support FDD as well as TD, and are backward compatible with 2G and 3G networks.

Alcatel-Lucent has already developed a TD-LTE small cell, through subsidiary Alcatel Shanghai Bell, that will be used to add capacity to China Mobile’s network in busy areas. Mindspeed Technologies, which supplied the silicon for it, showed off the cell at MWC.

Also at the show, Nokia Siemens Networks demonstrated a patented algorithm for balancing subscriber loads among LTE cells, including between TD and FDD equipment.

Advocates of TD-LTE say flexibility is its main advantage. Most LTE networks so far have been built with FDD (Frequency-Division Duplexing) technology, which uses two separate and equal-sized spectrum blocks, one for upstream and one for downstream traffic. Because TD-LTE uses just one large block, the frequencies within that block can be divided up in any way that makes sense for the way subscribers will use it.

That means a TD-LTE service could look more like home broadband, with a relatively thin pipe for sending email messages and URLs and a fatter one for downloading the pages that come with those URLs, as well as video, music, images and other content from the Internet.
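
A toy calculation makes the point: if downlink and uplink share one block in time, the operator chooses the split, whereas paired FDD spectrum fixes it at 50/50. (Real LTE TDD selects from a fixed set of uplink-downlink configurations; the 75/25 figure below simply mirrors the China Mobile/ZTE demonstration described later in this article.)

```python
# Simplified view of TD-LTE's asymmetric split versus FDD's fixed pairing.
def capacity_split(total_mhz: float, downlink_share: float) -> dict:
    """Effective spectrum devoted to each direction for a given time share."""
    return {"downlink_mhz": total_mhz * downlink_share,
            "uplink_mhz": total_mhz * (1 - downlink_share)}

print("TD, 20 MHz block tilted to downloads:", capacity_split(20.0, 0.75))
print("FDD, paired 10+10 MHz, fixed:        ", {"downlink_mhz": 10.0, "uplink_mhz": 10.0})
```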

China Mobile promotes this feature as one of the main things that will make its network better. The carrier could divide its spectrum differently in various areas depending on how the network might be used there, said Lei Cao, a China Mobile representative in the company’s MWC booth.

Some said TD-LTE saves carriers money and is just a better way to use spectrum.

“This is hotly debated, but the TD-LTE advocates will tell you that it can be deployed in cheaper unpaired spectrum and is more efficient when the downlink/uplink is asymmetric,” Tolaga Research analyst Phil Marshall said in an email interview. Dedicating the same amount of spectrum to uplinks as to downlinks leaves a lot of uplink spectrum unused, he said.

The biggest reason FDD is still used is tradition, according to Marshall. When cell phones were used mostly for voice, upstream and downstream traffic was equal.

“Most of the cellular spectrum is allocated in FDD and systems are deployed this way,” Marshall said. “The advocates of FDD will tell you that you get better performance consistency with FDD and it is easier to implement — particularly when coordinated with other FDD systems.”

Without the need for pairing, it’s also easier to cobble together various frequencies. In January, China Mobile and ZTE said they had demonstrated combining two separate TD-LTE spectrum blocks into one virtual block and assigning 75 percent of the whole to downstream traffic.

It’s not especially challenging to implement TD-LTE, Schoolar said. Nor is it hard to hand off subscribers from those networks to LTE FDD systems, according to China Mobile and others. Despite the dominance of FDD, most existing LTE base stations can be set up for TD use with a software upgrade or a new line card, Schoolar said. Sprint plans to mix FDD and TD networks by using the Clearwire TD-LTE network for extra capacity in busy areas, shifting users from one to the other as needed.

China Mobile Hong Kong has already launched a combined TD and FDD network. It puts subscribers on TD-LTE where it’s available, then shifts them onto FDD where possible, and puts them onto GSM when necessary. All these transitions are transparent to users, Lei said.

The pre-commercial network in mainland China is growing rapidly despite the fact that China Mobile can’t offer commercial service yet. There are about 20,000 base stations there today, and there will be 200,000 in 100 cities by the end of this year, Lei said. And China Mobile is not expected to be the only Chinese carrier to deploy TD-LTE.

That bodes well for a high-volume market that should make TD-LTE devices cheap and plentiful in other parts of the world, with the help of big silicon vendors, analysts said. “It really depends on guys like Qualcomm to make it happen,” Marshall said.

Source:  networkworld.com

Wireless LAN vendors target surging carrier Wi-Fi market

Monday, February 25th, 2013

Ruckus, Aruba products aim at large-scale, integrated Wi-Fi services

Two wireless LAN vendors are targeting the next big explosion in Wi-Fi growth: hotspots and hotzones created by carriers and other services providers.

Both Ruckus Wireless and Aruba Networks this week at the Mobile World Congress show in Barcelona outlined products aimed at this provider market. The goal is to be part of an emerging wave of hardware and software that can integrate Wi-Fi with core mobile networks.

As part of its reference design for carrier-based Wi-Fi services, Ruckus announced a new family of outdoor 802.11n access points, the ZoneFlex 7782 series. Four models offer different internal and external antenna configuration options. All have three transmit and three receive antennas supporting three data streams, for a maximum data rate of 900Mbps across the two bands. All include Ruckus’ patented BeamFlex adaptive antenna technology, designed to boost gain and reduce interference. There’s also a GPS receiver, which service providers can leverage for location-based services.


Deliberately bland in design, the new Ruckus ZoneFlex 7782 outdoor access point aims at high-performance carrier Wi-Fi networks: dual-band, 3-stream 802.11n with a data rate of nearly 1Gbps.
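
The headline rate is easy to reconstruct from standard 802.11n arithmetic, assuming 40MHz channels and a short guard interval (Ruckus’ own math may differ in the details):

```python
# How the 900 Mbps figure breaks down under standard 802.11n assumptions.
PER_STREAM_MBPS = 150   # 802.11n top rate per spatial stream at 40 MHz, short GI
STREAMS_PER_RADIO = 3
RADIOS = 2              # dual-band: one 2.4 GHz radio, one 5 GHz radio

per_radio = PER_STREAM_MBPS * STREAMS_PER_RADIO   # 450 Mbps per band
aggregate = per_radio * RADIOS                    # 900 Mbps across both bands
print(per_radio, "Mbps per radio,", aggregate, "Mbps aggregate")
```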

The company also unveiled a Wi-Fi traffic analysis application for carriers, called the SmartCell Insight analytics engine, which runs on Ruckus’ SmartCell 2000 Gateway, the appliance that bridges Wi-Fi and cellular networks. The software sifts out a wealth of data about access point usage, bandwidth, subscriber activity and other metrics, and packs them into a data warehouse. Pre-written and custom reports translate the raw data into information about how well the Wi-Fi network is performing. A battery of standard APIs lets carriers export the information to existing data-mining tools and interface with core network applications.

Finally, Ruckus announced SmartPoint, which adds to the ZoneFlex 7321-U access point a USB port that can accept a 3G, 4G, or WiMAX external dongle. The idea is to quickly and easily create a wireless backhaul option where a cable isn’t possible (such as a city bus). Ruckus automatically pushes to the access point the needed driver software for specific 3G/4G/WiMAX dongles. KDDI in Japan, with an extensive WiMAX network, can offer shop owners a Ruckus access point for hotspot Wi-Fi, with a WiMAX dongle for easy backhaul to the Internet.

Both the 7782 outdoor access point, priced at $3,000, and SmartPoint, at $400, are available now; the analytics application, with pricing based on the size of the network, will ship in the second quarter.

Aruba’s carrier play

Aruba, too, is recasting its WLAN architecture via software updates to address carrier requirements for creating a high-capacity, secure and reliable Wi-Fi service for mobile subscribers.

Dubbed Aruba HybridControl, the new code gives Aruba’s 7200 Mobility Controller massive scalability. Aruba says the software update will let the 7200 manage over 32,000 hotspots. That translates into over 100,000 individual access points, because each hotspot can have several of the vendor’s Aruba Instant access points. The scaling lowers carriers’ backend capital costs, cuts data center power demand, and requires less rack space, according to Aruba. The Aruba Instant model offloads cellular traffic locally to the Internet, while centralizing selected traffic such as billing and legal intercept via an IPSec connection to the 7200 controllers at the core.
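
That traffic split can be pictured as a simple classification step: most subscriber traffic breaks out locally, while a short list of categories is tunneled back to the controller. The sketch below is illustrative, not Aruba’s implementation.

```python
# Illustrative local-breakout-versus-centralized-tunnel decision.
CENTRALIZED_CATEGORIES = {"billing", "legal_intercept"}

def forward(flow_category: str) -> str:
    """Decide whether a flow breaks out locally or rides the IPSec tunnel."""
    if flow_category in CENTRALIZED_CATEGORIES:
        return "send via IPSec tunnel to the 7200 controller"  # centralized path
    return "break out locally to the Internet"                 # local offload

print(forward("web_browsing"), "|", forward("billing"))
```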

HybridControl offers “zero-touch activation” for factory-default access points, with no need for any manual pre-provisioning. Switched on, these access points contact the Aruba Activate cloud service to discover the carrier’s configuration management system and download their configuration. Then, the access points use an assigned X.509 certificate to authenticate with an Aruba controller and set up an IPSec tunnel.
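
Roughly, that zero-touch sequence looks like the sketch below, with every step stubbed out; the names are placeholders rather than Aruba’s API, and a real access point performs these steps in firmware.

```python
# Stubbed zero-touch bring-up flow: cloud discovery, config download,
# certificate-based authentication, IPSec tunnel. All names are placeholders.
def discover_via_activation_service(serial: str) -> str:
    return f"https://activate.example.net/config/{serial}"   # stand-in cloud lookup

def download_configuration(url: str) -> dict:
    return {"controller": "7200.core.example.net", "ssid": "carrier-wifi"}

def authenticate_and_tunnel(controller: str, serial: str) -> str:
    # stand-in for X.509 authentication followed by IPSec tunnel establishment
    return f"IPSec tunnel up: {serial} <-> {controller}"

def zero_touch_bringup(serial: str) -> str:
    config = download_configuration(discover_via_activation_service(serial))
    return authenticate_and_tunnel(config["controller"], serial)

print(zero_touch_bringup("AP-00-17-C5"))
```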

The HybridControl architecture leverages existing Aruba features such as:

  • AppRF, to identify and prioritize real-time applications, such as Microsoft Lync, to create different classes of service;
  • ClearPass Policy Management, a server application to authenticate new access points joining the mobile core network.

The carrier-focused HybridControl offering includes several products: the Aruba 7200 Mobility Controller, available now with prices starting at $38,000; Aruba Instant access points, available now with prices starting at about $400; Aruba Activate, available now and free of charge for Aruba customers. The software update for the 7200 will be available as a free Aruba OS upgrade in the second quarter.

Source:  networkworld.com

Aruba announces controller and software for hybrid wireless networks

Monday, February 25th, 2013

New 7200 Mobility Controller offloads cellular to Wi-Fi

Aruba Networks announced a Wi-Fi controller today that can create more efficient pathways for wireless traffic and control more than 32,000 Wi-Fi hotspots.

Aruba said the new 7200 Mobility Controller will use far less power and cost much less than competing technology from Cisco, the market leader.

Aruba’s announcement of the controller, made on the first day of Mobile World Congress, is part of a trend of new software and hardware that equipment makers are offering to service providers and large enterprises to make large Wi-Fi networks and Wi-Fi hotspots more efficient, partly by reducing the wireless demand on cellular networks.

The 7200 will start at $37,995. Two rack-unit 7200s (with one for redundancy) will serve about the same number of access points as seven Cisco 8500 controllers but at one-fortieth the cost, Manav Khurana, senior director of product marketing at Aruba, said in an interview.

The controller relies on new software called HybridControl Wi-Fi, which incorporates management capabilities for devices used by workers and guests inside organizations.

Last week, Cisco unveiled Quantum software and hardware to help carriers and enterprises improve wireless connections that are hybrid networks of 3G and 4G cellular and Wi-Fi. The devices will be demonstrated at Mobile World Congress in Barcelona this week.

Source:  computerworld.com