Archive for the ‘WiFi’ Category

The case for Wi-Fi in the Internet of Things

Tuesday, January 14th, 2014

Whether it’s the “connected home” or the “Internet of Things,” many everyday home appliances and devices will soon feature some form of Internet connectivity. What form should that connectivity take? We sat down with Edgar Figueroa, president and CEO of the Wi-Fi Alliance, to discuss his belief that Wi-Fi is the clear choice.

Options are plentiful when it comes to Internet connectivity, but some are easily disregarded for most Internet of Things designs. Ethernet and other wired solutions require additional equipment or more cabling than is typically found in even a modern home. Cellular connectivity is pointless for stationary home goods and still too power-hungry for wearable items. Proprietary and purpose-built solutions, like ZigBee, are either too closed off or require infrastructure parallel to what is already in our homes.

Bluetooth makes a pretty good case for itself, though inconsistent user experiences remain the norm for several reasons. The latest Bluetooth specifications provide very low-power data transfers and have very low overhead for maintaining a connection. The result is that the power profile for the connection is low whether you’re transacting data or not. Connection speeds are modest compared to the alternatives. But the biggest drawback for Bluetooth is inconsistency. Bluetooth has always felt kludgy; it’s an incomplete solution that will suffice until it improves. It helps that Bluetooth devices can often have their performance, reliability, and features improved through software updates, but the experience can still be frustrating.

Then there’s Wi-Fi.

Figueroa wanted to highlight a few key points from a study the Alliance commissioned. “Of those polled, more than half already have a non-traditional device with a Wi-Fi radio,” he said. Here, “non-traditional” covers a broad swath of products that includes appliances, thermostats, and lighting systems. Figueroa continued, “Ninety-one percent of those polled said they’d be more likely to buy a smart device if it came equipped with Wi-Fi.” The Alliance’s point: everyone already has a Wi-Fi network in their home. Why choose anything else?

One key consideration the study seems to ignore is power draw, which is one of Bluetooth’s biggest assets. Wi-Fi connections are active and power-hungry, even when they aren’t transacting large amounts of data. A separate study of power consumption per bit of data transferred showed Wi-Fi beating Bluetooth by orders of magnitude when large amounts of data are moving. But where Wi-Fi requires a constant supply of power simply to stay connected, Bluetooth requires almost none to maintain a connection.

In response to a question on the preference for low-power interfaces, Figueroa said simply, “Why?” In his eyes, the connected home isn’t necessarily a battery-powered home. Devices that connect to our Wi-Fi networks have traditionally had plugs, so why demand that they sip almost no power?

Bluetooth has its place in devices whose current draw must not exceed the capabilities of a watch battery. But even in small devices, Wi-Fi’s performance and its ability to create ad hoc networks and Wi-Fi Direct connections can improve the experience, even at the cost of increased power draw and battery size.

In the end, the compelling case for Wi-Fi in this space has more to do with what we want from our experiences than with which technology is more power-hungry. Simplicity in all things is preferred. Even after all these years, pairing Bluetooth devices is usually more complex than connecting a new device to your existing Wi-Fi network. Even in the car, where Bluetooth has long been dominant, the ability to connect multiple devices over Wi-Fi’s higher-bandwidth interface may ultimately win out. Still, despite Figueroa’s confidence, the world adopting an Internet of Things lifestyle is an increasingly green one that would prefer smaller power bills. Wi-Fi may ultimately need to complete its case by driving power consumption down far enough to reside in all our Internet of Things devices, from the biggest to the smallest.

Source:  arstechnica.com

High-gain patch antennas boost Wi-Fi capacity for Georgia Tech

Tuesday, November 5th, 2013

To boost its Wi-Fi capacity in packed lecture halls, Georgia Institute of Technology gave up trying to cram in more access points with conventional omni-directional antennas and juggle power settings and channel plans. Instead, it turned to new high-gain directional antennas from Tessco’s Ventev division.

Ventev’s new TerraWave High-Density Ceiling Mount Antenna, which looks almost exactly like the bottom half of a small pizza box, focuses the Wi-Fi signal from the ceiling-mounted Cisco access point into a precise cone-shaped pattern covering part of the lecture hall floor. Instead of the flaky, laggy connections professors had been complaining about, users now consistently get up to 144Mbps (if they have 802.11n client radios).

“Overall, the system performed much better” with the Ventev antennas, says William Lawrence, IT project manager principal with the university’s academic and research technologies group. “And there was a much more even distribution of clients across the room’s access points.”

Initially, these 802.11n access points were running 40-MHz channels, but Lawrence’s team eventually switched to the narrower 20 MHz. “We saw more consistent performance for clients in the 20-MHz channel, and I really don’t know why,” he says. “It seems like the clients were doing a lot of shifting between using 40 MHz and 20 MHz. With the narrower channel, it was very smooth and consistent: we got great video playback.”

With the narrower channel, 11n clients can’t achieve their maximum 11n throughput. But that doesn’t seem to have been a problem in these select locations, Lawrence says. “We’ve not seen that to be an issue, but we’re continuing to monitor it,” he says.

The Atlanta main campus has a fully deployed Cisco WLAN, with about 3,900 access points, nearly all supporting 11n, and 17 wireless controllers. Virtually all of the access points use a conventional omni-directional antenna, which radiates energy in all directions around the access point. But in high-density classrooms, faculty and students began complaining of flaky connections and slow speeds.

The problem, Lawrence says, was the surging number of Wi-Fi devices actively being used in big classrooms and lecture halls, coupled with Wi-Fi signals, especially in the 2.4-GHz band, stepping on each other over wide sections of the hall and creating co-channel interference.

One Georgia Tech network engineer spent a lot of time monitoring the problem areas and working with students and faculty. In a few cases, the problems could be traced to a client-side configuration problem. But “with 120 clients on one access point, performance really goes downhill,” Lawrence says. “With the omni-directional antenna, you can only pack the access points so close.”

Shifting users to the cleaner 5-GHz band was an obvious step but in practice was rarely feasible: many mobile devices still support only 2.4-GHz connections, and client radios often stubbornly stuck with a 2.4-GHz connection to a distant access point even when another was available much closer.

Consulting with Cisco, Georgia Tech decided to try some newer access points, with external antenna mounts, and selected one of Cisco’s certified partners, Tessco’s Ventev Wireless Infrastructure division, to supply the directional antennas. The TerraWave products also are compatible with access points from Aruba, Juniper, Meru, Motorola and others.

Patch antennas focus the radio beam within a specific area. (A couple of vendors, Ruckus Wireless and Xirrus, have developed their own built-in “smart” antennas that adjust and focus Wi-Fi signals on clients.) Depending on the beamwidth, the effect can be that of a floodlight or a spotlight, says Jeff Lime, Ventev’s vice president. Ventev’s newest TerraWave High-Density products focus the radio beam within narrower ranges than some competing products, and offer higher gain (in effect putting more oomph into the signal to drive it further), he says.

One model, with a maximum power rating of 20 watts, can have beamwidths of 18 or 28 degrees vertically and 24 or 40 degrees horizontally, with a gain of 10 or 11 dBi, depending on the frequency range. The second model, with a 50-watt maximum power rating, has a beamwidth in both dimensions of 35 degrees, at a still higher gain of 14 dBi to drive the spotlighted signal farther in really big areas like a stadium.

At Georgia Tech, each antenna focused the Wi-Fi signal from a specific overhead access point to cover a section of seats below it. Fewer users associate with each access point. The result is a kind of virtuous circle. “It gives more capacity per user, so more bandwidth, so a better user experience,” says Lime.
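The geometry behind that “spotlight” is straightforward. The sketch below is a rough, hypothetical calculation: only the 35-degree beamwidth and 14 dBi gain figures come from the article, while the mounting height, radio power and cable loss are assumed values for illustration.

```python
# Rough geometry for a ceiling-mounted directional antenna: a hypothetical
# sketch (not Ventev's or Cisco's math) relating beamwidth and mounting
# height to the floor area the "spotlight" covers, plus the resulting EIRP.
import math

def footprint_diameter(mount_height_m: float, beamwidth_deg: float) -> float:
    """Diameter of the coverage circle on the floor for a downward-pointing cone."""
    half_angle = math.radians(beamwidth_deg / 2)
    return 2 * mount_height_m * math.tan(half_angle)

def eirp_dbm(tx_power_dbm: float, cable_loss_db: float, antenna_gain_dbi: float) -> float:
    """Effective isotropic radiated power: transmit power minus losses plus gain."""
    return tx_power_dbm - cable_loss_db + antenna_gain_dbi

if __name__ == "__main__":
    # Illustrative numbers: a 6 m lecture-hall ceiling and the ~35-degree
    # beamwidth / 14 dBi gain quoted for the higher-gain TerraWave model.
    print(f"Footprint: {footprint_diameter(6.0, 35):.1f} m across")
    # 17 dBm radio output, 1 dB of cable loss, 14 dBi antenna gain (assumed).
    print(f"EIRP: {eirp_dbm(17, 1, 14):.0f} dBm")
```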

The antennas come with a quartet of 36-inch cables to connect to the access points. The idea is to give IT groups maximum flexibility, but the cables initially were awkward for the IT team installing the antennas. Lawrence says they experimented with different ways of neatly and quickly wrapping up the excess cable to keep it out of the way between the access point proper and the antenna panel. They also had to modify mounting clips to get them to hold in the metal grid that forms the dropped ceiling in some of the rooms. “Little things like that can cause you some unexpected issues,” Lawrence says.

The IT staff worked with Cisco engineers to reset a dedicated controller to handle the new “high density group” of access points; and the controller automatically handled configuration tasks like setting access point power levels and selecting channels.

Another issue is that when the patch antennas were ceiling-mounted in second- or third-story rooms, their downward-shooting signal cone reached into the radio space of access points on the floor below. Lawrence says they tweaked the position of the antennas in some cases to send the spotlight signal beaming at an angle. “I look at each room and ask, ‘How am I going to deploy these antennas to minimize signal bleed-through into other areas?’” he says. “Adding a high-gain antenna can have unintended consequences outside the space it’s intended for.”

But based on improved throughput and consistent signals, Lawrence says it’s likely the antennas will be used in a growing number of lecture halls and other spaces on the main and satellite campuses. “This is the best solution we’ve got for now,” he says.

Source:  networkworld.com

Wireless networks that follow you around a room, optimize themselves and even talk to each other out loud

Tuesday, October 8th, 2013

Graduate students at the MIT Computer Science and Artificial Intelligence Laboratory showed off their latest research at the university’s Wireless retreat on Monday, outlining software-defined MIMO, machine-generated TCP optimization, and a localized wireless networking technique that works through sound.

Swarun Kumar’s presentation on OpenRF – a Wi-Fi architecture designed to allow multiple access points to avoid mutual interference and focus signals on active clients – detailed how commodity hardware can be used to take advantage of features otherwise restricted to more specialized devices.

There were several constraints in the 802.11n wireless standard that had to be overcome, Kumar said, including a limitation on the total number of bits per subcarrier signal that could be manipulated, as well as restricting that manipulation to one out of every two such signals.

Simply disabling the Carrier Sense restrictions, however, proved an incomplete solution.

“Access points often send these beacon packets, which are meant for all clients in a network … you cannot null them at any point if you’re a client. Unfortunately, these packets will now collide” in the absence of Carrier Sense, he said.

The solution – which involved two separate transmit queues – enabled OpenRF to automatically apply its optimal settings across multiple access points, distributing the computational workload across the access points, rather than having to rely on a beefy central controller.

Kumar said the system can boost TCP throughput by a factor of 1.6 compared to bare-bones 802.11n.

*

Keith Winstein attacked the problem of TCP throughput slightly differently, however. Using a specialized algorithm called Remy – into which users can simply input network parameters and desired performance standards – he said that networks can essentially determine the best ways to configure themselves on their own.

“So these are the inputs, and the output is a congestion control algorithm,” he said. “Now this is not an easy process – this is replacing a human protocol designer. Now, it costs like $10 to get a new protocol on Amazon EC2.”

Remy works via the heuristic principle of concentrating its efforts on the use cases where a small change in the rules results in a major change in the outcome, allowing it to optimize effectively and to shift gears quickly if network conditions change.

“Computer generated end-to-end algorithms can actually outperform human generated in-network algorithms, and in addition, human generated end-to-end algorithms,” said Winstein.

Even though Remy wasn’t designed or optimized to handle wireless networks, it still handily outperforms human-generated competition, he added.

*

Peter Iannucci is a researcher looking into highly localized ways of providing wireless Internet, which he refers to as room area networks. Having dismissed a number of technologies as insufficient – Bluetooth was too clunky, NFC had limited uptake – he eventually settled on sound.

Iannucci’s acoustic network – which he has dubbed Blurt – uses high-frequency sounds to transmit the ones and zeroes of a network connection. It’s well-suited for a network confined by design to a small space.

“Acoustic networks provide great low-leakage properties, since doors and walls are intentionally sound-absorbent,” he said. “[They] work over moderate distances, using existing devices, and they don’t require any setup for ad hoc communications.”

Iannucci acknowledges that Blurt isn’t without its problems. Given that sound waves move about a million times slower than radio waves, speed is an issue – he said that Blurt can handle about 200 bits per second when using frequencies inaudible to humans, with more speed possible only at the cost of an audible whirring chirp, reminiscent of old telephone modems.
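For readers curious how such a link could work, here is a minimal sketch of binary frequency-shift keying over audio, the general approach an acoustic network like Blurt could take. The tone frequencies, framing and NumPy implementation are assumptions for illustration, not details of Iannucci’s system; only the roughly 200 bits-per-second figure comes from the article.

```python
# A minimal sketch of encoding bits as near-ultrasonic tones (binary FSK).
# Frequencies and framing are assumed; the real Blurt design is not described.
import numpy as np

SAMPLE_RATE = 44100          # standard audio sample rate
BIT_RATE = 200               # ~200 bits/s, as quoted in the article
F0, F1 = 18000.0, 18500.0    # two tones near the edge of human hearing (assumed)

def modulate(bits: str) -> np.ndarray:
    """Return an audio waveform with one tone burst per bit."""
    samples_per_bit = SAMPLE_RATE // BIT_RATE
    t = np.arange(samples_per_bit) / SAMPLE_RATE
    chunks = [np.sin(2 * np.pi * (F1 if b == "1" else F0) * t) for b in bits]
    return np.concatenate(chunks)

def demodulate(signal: np.ndarray) -> str:
    """Recover bits by comparing energy at the two tone frequencies per symbol."""
    samples_per_bit = SAMPLE_RATE // BIT_RATE
    t = np.arange(samples_per_bit) / SAMPLE_RATE
    ref0 = np.exp(-2j * np.pi * F0 * t)
    ref1 = np.exp(-2j * np.pi * F1 * t)
    bits = []
    for i in range(0, len(signal) - samples_per_bit + 1, samples_per_bit):
        sym = signal[i:i + samples_per_bit]
        bits.append("1" if abs(np.dot(sym, ref1)) > abs(np.dot(sym, ref0)) else "0")
    return "".join(bits)

if __name__ == "__main__":
    payload = "1011001110001111"
    assert demodulate(modulate(payload)) == payload   # round-trips on a clean channel
```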

But that’s really not the point – the idea would be more to do things like verifying that users of a business’s free Wi-Fi are actually sitting in the restaurant, or handling other tasks that involve heavily location-dependent network services.

Source: networkworld.com

802.11ac ‘gigabit Wi-Fi’ starts to show potential, limits

Monday, October 7th, 2013

Vendor tests and very early 802.11ac customers provide a reality check on “gigabit Wi-Fi” but also confirm much of its promise.

Vendors have been testing their 11ac products for months, yielding data that show how 11ac performs and what variables can affect performance. Some of the tests are under ideal laboratory-style conditions; others involve actual or simulated production networks. Among the results: consistent 400Mbps to 800Mbps throughput for 11ac clients in best-case situations, higher throughput than 11n as range increases, more clients serviced by each access point, and a boost in performance for existing 11n clients.

Wireless LAN vendors are stepping up 11ac product introductions, among them Aerohive, Aruba Networks, Cisco (including its Meraki cloud-based offering), Meru, Motorola Solutions, Ruckus, Ubiquiti, and Xirrus.

The IEEE 802.11ac standard does several things to triple the throughput of 11n. It builds on some of the technologies introduced in 802.11n; makes mandatory some 11n options; offers several ways to dramatically boost Wi-Fi throughput; and works solely in the under-used 5GHz band.

It’s a potent combination. “We are seeing over 800Mbps on the new Apple 11ac-equipped Macbook Air laptops, and 400Mbps on the 11ac phones, such as the new Samsung Galaxy S4, that [currently] make up the bulk of 11ac devices on campus,” says Mike Davis, systems programmer, University of Delaware, Newark, Delaware.

A long-time Aruba Networks WLAN customer, the university has installed 3,700 of Aruba’s new 11ac access points on campus this summer, in a new engineering building, two new dorms, and some large auditoriums. Currently, there are on average about 80 11ac clients online with a peak of 100, out of some 24,000 Wi-Fi clients on campus.

The 11ac network seems to bear up under load. “In a limited test with an 11ac Macbook Air, I was able to sustain 400Mbps on an 11ac access point that was loaded with over 120 clients at the time,” says Davis. Not all of the clients were “data hungry,” but the results showed “that the new 11ac access points could still supply better-than-11n data rates while servicing more clients than before,” Davis says.

The maximum data rates for 11ac are highly dependent on several variables. One is whether the 11ac radios are using 80-MHz-wide channels (11n got much of its throughput boost by being able to use 40-MHz channels). Another is whether the radios are able to use the 256-QAM modulation scheme, compared to 64-QAM for 11n. Both of these depend on how close the 11ac clients are to the access point. Too far, and the radios “step down” to narrower channels and lower modulations.

Another variable is the number of “spatial streams,” a technology introduced with 11n, supported by the client and access point radios. Chart #1, “802.11ac performance based on spatial streams,” shows the download throughput performance.

[Chart #1: 802.11ac performance based on spatial streams]

In perfect conditions, close to the access point, a three-stream 11ac radio can achieve the maximum raw data rate of 1.3Gbps. But no user will actually realize that in terms of usable throughput.

“Typically, if the client is close to the access point, you can expect to lose about 40% of the overall raw bit rate due to protocol overhead – acknowledgements, setup, beaconing and so on,” says Mathew Gast, director of product management for Aerohive Networks, which just announced its first 11ac products, the AP370 and AP390. Aerohive incorporates controller functions in a distributed access point architecture and provides a cloud-based management interface for IT groups.

“A single [11ac] client that’s very close to the access point in ideal conditions gets very good speed,” says Gast. “But that doesn’t reflect reality: you have electronic ‘noise,’ multiple contending clients, the presence of 11n clients. In some cases, the [11ac] speeds might not be much higher than 11n.”

Spatial-stream support, the third key variable, differs between access points and clients. Most of the new 11ac access points will support three streams, usually with three transmit and three receive antennas. But clients vary. At the University of Delaware, the new MacBook Air laptops support two streams, while the new Samsung Galaxy S4 and HTC One phones support one stream, via Broadcom’s BCM4335 11ac chipset.

Tests by Broadcom found that a single 11n data stream over a 40 MHz channel can deliver up to 60Mbps. By comparison, single-stream 11ac in an 80 MHz channel is “starting at well over 250Mbps,” says Chris Brown, director of business development for Broadcom’s wireless connectivity unit. Single-stream 11ac will max out at about 433Mbps.
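Those per-stream ceilings follow from standard 802.11 PHY arithmetic: data subcarriers, times bits per subcarrier, times coding rate, divided by the OFDM symbol time. The sketch below reproduces the quoted raw rates; real-world throughput is lower because of the protocol overhead Gast describes above.

```python
# Back-of-the-envelope PHY data rates for 802.11n/ac, reproducing the numbers
# quoted in the article (~433Mbps per 11ac stream at 80MHz, 1.3Gbps for three
# streams). This is standard 802.11 arithmetic, not vendor test data.

def phy_rate_mbps(data_subcarriers: int, bits_per_subcarrier: int,
                  coding_rate: float, streams: int,
                  symbol_time_us: float = 3.6) -> float:
    """Raw PHY rate: bits per OFDM symbol, per stream, divided by symbol time."""
    bits_per_symbol = data_subcarriers * bits_per_subcarrier * coding_rate
    return streams * bits_per_symbol / symbol_time_us  # bits per microsecond = Mbps

# 802.11n, 40MHz channel, 64-QAM (6 bits), rate-5/6 coding, short guard interval
print(phy_rate_mbps(108, 6, 5/6, streams=1))   # 150.0 -> one 11n stream (PHY rate; usable throughput is lower)
# 802.11ac, 80MHz channel, 256-QAM (8 bits), rate-5/6 coding
print(phy_rate_mbps(234, 8, 5/6, streams=1))   # ~433.3 -> one 11ac stream
print(phy_rate_mbps(234, 8, 5/6, streams=3))   # 1300.0 -> the headline 1.3Gbps
```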

These characteristics produce some interesting results. One is that throughput at any given distance from the access point is much better with 11ac than with 11n. “Even at 60 meters, single-stream 11ac outperforms all but the 2×2 11n at 40 MHz,” Brown says.

Another result is that 11ac access points can service a larger number of clients than 11n access points.

“We have replaced several dozen 11n APs with 11ac in a high-density lecture hall, with great success,” says University of Delaware’s Mike Davis. “While we are still restricting the maximum number of clients that can associate with the new APs, we are seeing them maintain client performance even as the client counts almost double from what the previous generation APs could service.”

Other features of 11ac help to sustain these capacity gains. Transmit beam forming (TBF), which was an optional feature in 11n, is mandatory and standardized in 11ac. “TBF lets you ‘concentrate’ the RF signal in a specific direction, for a specific client,” says Mark Jordan, director, technical marketing engineering, Aruba Networks. “TBF changes the phasing slightly to allow the signals to propagate at a higher effective radio power level. The result is a vastly improved throughput-over-distance.”

A second feature is low-density parity check (LDPC), a coding technique that improves the sensitivity of the receiving radio, in effect giving it better “hearing.”

The impact in Wi-Fi networks will be significant. Broadcom did extensive testing in a network set up in an office building, using both 11n and 11ac access points and clients. It specifically tested 11ac data rates and throughput with beam forming and low density parity check switched off and on, according to Brown.

Tests showed that 11ac connections with both TBF and LDPC turned on increasingly and dramatically outperformed 11n – and even 11ac with both features turned off – as the distance between client and access point increased. For example, at one test point, an 11n client achieved 32Mbps. At the same point, the 11ac client with TBF and LDPC turned “off” achieved about the same. But when both were turned “on,” the 11ac client soared to 102Mbps, more than three times the previous throughput.

Aruba found similar results. Its single-stream Galaxy S4 smartphone reached 238Mbps TCP downstream throughput at 15 feet, 235Mbps at 30 feet, and 193Mbps at 75 feet. At 120 feet, it was still 154Mbps. For the same distances upstream, the throughput rates were 235Mbps, 230Mbps, 168Mbps, and 87Mbps.

“We rechecked that several times, to make sure we were doing it right,” says Aruba’s Jordan. “We knew we couldn’t get the theoretical maximums. But now, we can support today’s clients with all the data they demand. And we can do it with the certainty of such high rates-at-range that we can come close to guaranteeing a high-quality [user] experience.”

There are still other implications with 11ac. Because of the much higher up and down throughput, 11ac mobile devices get on and off the Wi-Fi channel much faster compared to 11n, drawing less power from the battery. The more efficient network use will mean less “energy per bit,” and better battery life.

A related implication is that because this all happens much faster with 11ac, there’s more time for other clients to access the channel. In other words, network capacity increases by up to six times, according to Broadcom’s Brown. “That frees up time for other clients to transmit and receive,” he says.

That improvement can be used to reduce the number of access points covering a given area: in the Broadcom office test area, four Cisco 11n access points provided connectivity. A single 11ac access point could replace them, says Brown.

But more likely, IT groups will optimize 11ac networks for capacity, especially as the number of smartphones, tablets, laptops and other gear are outfitted with 11ac radios.

Even 11n clients will see improvement in 11ac networks, as University of Delaware has found.

“The performance of 11n clients on the 11ac APs has probably been the biggest, unexpected benefit,” says Mike Davis. “The 11n clients still make up 80% of the total number of clients and we’ve measured two times the performance of 11n clients on the new 11ac APs over the last generation [11n] APs.”

Wi-Fi uses carrier sense multiple access with collision avoidance (CSMA/CA), a cousin of Ethernet’s CSMA/CD, which essentially checks to see if a channel is being used and, if so, backs off, waits and tries again. “If we’re spending less time on the net, then there’s more airtime available, and so more opportunities for devices to access the media,” says Brown. “More available airtime translates into fewer collisions and backoffs. If an overburdened 11n access point is replaced with an 11ac access point, it will increase the network’s capacity.”
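A simplified airtime calculation illustrates Brown’s point. The fixed per-frame overhead below is an assumption (real networks use frame aggregation and variable contention), but it shows how a faster PHY rate shortens each transmission and frees the channel for other clients.

```python
# A rough airtime illustration: faster PHY rates mean each frame occupies the
# channel for less time, leaving more airtime for other clients. The per-frame
# overhead is a simplifying assumption, not measured data.

FRAME_BYTES = 1500          # a full-size data frame
OVERHEAD_US = 100           # assumed fixed cost: preamble, ACK, contention gaps

def airtime_us(phy_rate_mbps: float, frame_bytes: int = FRAME_BYTES) -> float:
    """Microseconds the channel is busy for one frame at a given PHY rate."""
    return frame_bytes * 8 / phy_rate_mbps + OVERHEAD_US

def frames_per_second(phy_rate_mbps: float) -> float:
    return 1_000_000 / airtime_us(phy_rate_mbps)

for label, rate in [("11n, 1 stream @ 40MHz", 150), ("11ac, 1 stream @ 80MHz", 433)]:
    print(f"{label}: {airtime_us(rate):.0f} us/frame, "
          f"~{frames_per_second(rate):,.0f} frames/s of channel capacity")
```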

In Aruba’s in-house testing, a MacBook Pro laptop with a three-stream 11n radio was connected first to the 11n Aruba AP-135, and then to the 11ac AP-225. As shown in Chart #2, “11ac will boost throughput in 11n clients,” the laptop’s performance was vastly better on the 11ac access point, especially as the range increased.

[Chart #2: 11ac will boost throughput in 11n clients]

These improvements are part of “wave 1” 11ac. In wave 2, starting perhaps later in 2014, new features will be added to 11ac radios: support for four to eight data streams, explicit transmit beam forming, an option for 160-MHz channels, and “multi-user MIMO,” which lets the access point talk to more than one 11ac client at the same time.

Source:  networkworld.com

Aruba announces cloud-based Wi-Fi management service

Tuesday, October 1st, 2013

Competes with Cisco-owned Meraki and Aerohive

Aruba Networks today announced a new Aruba Central cloud-based management service for Wi-Fi networks that could be valuable to companies with branch operations, schools and mid-sized networks where IT support is scarce.

Aruba still sells Wi-Fi access points but now is offering Aruba Central cloud management of local Wi-Fi zones, for which it charges $140 per AP annually.

The company also announced the new Aruba Instant 155 AP, a desktop model starting at $895 and available now, and the Instant 225 AP for $1,295, available later this month.

A new 3.3 version of the Instant OS is also available, and a new S1500 mobility access switch with 12 to 48 ports starting at $1,495 will ship in late 2013.

Cloud-based management of Wi-Fi is in its early stages and today constitutes about 5% of a $4 billion annual Wi-Fi market, Aruba said, citing findings by Dell’Oro Group. Aruba said it faces competition from Aerohive and Meraki, which Cisco purchased for $1.2 billion last November.

Cloud-based management of APs is ideally suited for centralizing management of branch offices or schools that don’t have their own IT staff.

“We have one interface for multiple sites, for those wanting to manage from a central platform,” said Sylvia Hooks, Aruba’s director of product marketing. “There’s remote monitoring and troubleshooting. We do alerting and reports, all in wizard-based formats, and you can group all the APs by location. We’re trying to offer sophisticated functions, but presented so a generalist could use them.”

Aruba relies on multiple cloud providers and multiple data centers to support Aruba Central, Hooks said.

The two new APs provide 450Mbps throughput in 802.11n for the 155 AP and 1.3Gbps for the 225 AP, Aruba said. Each AP in a Wi-Fi cluster running the Instant OS has the intelligence built in to assume controller functions. The first AP installed in a cluster selects itself as the master controller of the other APs, and if it somehow fails, the next most senior AP selects itself as the master.
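That failover behavior amounts to a simple seniority rule. The sketch below illustrates the rule in Python for clarity; it is a hypothetical model of the behavior described, not Aruba’s Instant OS implementation.

```python
# Minimal model of seniority-based master election in an AP cluster: the
# longest-running reachable AP takes over if the current master disappears.
from dataclasses import dataclass

@dataclass
class AccessPoint:
    name: str
    joined_at: float      # cluster join time; lower = more senior
    alive: bool = True

def elect_master(cluster: list[AccessPoint]) -> AccessPoint:
    """Pick the most senior AP that is still reachable."""
    candidates = [ap for ap in cluster if ap.alive]
    if not candidates:
        raise RuntimeError("no reachable APs in cluster")
    return min(candidates, key=lambda ap: ap.joined_at)

cluster = [AccessPoint("ap-1", 100.0), AccessPoint("ap-2", 160.0), AccessPoint("ap-3", 220.0)]
print(elect_master(cluster).name)   # ap-1 is master
cluster[0].alive = False            # master fails
print(elect_master(cluster).name)   # ap-2, the next most senior, takes over
```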

Source:  networkworld.com

NFL lagging on stadium Wi-Fi

Tuesday, September 3rd, 2013


The consumption of NFL football, America’s most popular sport, is built on game-day traditions.

This week fans will dress head-to-toe in team colors and try out new tailgate recipes in parking lots before filing into 16 NFL stadiums to cheer on their team — which, thanks to the league’s parity, will likely still be in the playoff hunt come December.

But a game-day ritual of the digital age — tracking scores, highlights and social-media chatter on a mobile device — isn’t possible inside many NFL venues because the crush of fans with smartphones can overload cellular networks.

The improved home-viewing experience — high-def TV, watching multiple games at once, real-time fantasy-football updates and interaction via social media — has left some NFL stadiums scrambling to catch up. It’s one of the reasons why, before rebounding last year, the NFL lost attendance between 2008 and 2011, forcing the league to alter television-blackout rules.

In May 2012, NFL Commissioner Roger Goodell announced an initiative to outfit all 31 NFL stadiums with Wi-Fi. But with the start of the 2013 regular season just days away, less than half of the NFL’s venues are Wi-Fi enabled and no stadiums have launched new Wi-Fi systems this year.

Part of the reason for the delay is that some stadium operators are waiting for the next generation of increased Wi-Fi speed before installing networks, said Paul Kapustka, editor in chief of Mobile Sports Report.

Another reason, Kapustka said, is that the cost of installing Wi-Fi will come out of the pockets of venue owners and operators who have traditionally not needed to invest in such costly projects. Instead, they receive public money to help build stadiums and television money for the right to broadcast games.

“Stadium owners and operators need to get their hands on the fact that they need to put in Wi-Fi like they need to put in plumbing,” Kapustka said.

Brian Lafemina, the NFL’s vice president of club business development, said the league is still searching for a telecommunications partner that can help tackle challenges of stadium location, design and tens of thousands of fans all trying to access the network at the same time.

“Yes, we are working on it as hard as we can,” he said. “But the technology just isn’t where it needs to be to deliver what we want to deliver.”

The league is unveiling a variety of technological enhancements at stadiums in 2013, including cameras in locker rooms, massive video boards that will show replays of every play, a “fantasy football lounge” with sleek technological amenities, the ability to listen to audio of other games from inside the stadium, team-specific fantasy games and free access to the league’s NFL Red Zone cable channel for season ticket holders.

Lafemina emphasized the league’s role as a storyteller and said it is striving to use technology to provide fans in stadiums with unique content.

“The most important people in that stadium are the 70,000 paying customers,” he said.

Jonathan Kraft, president of the New England Patriots and co-chair of the NFL’s digital media committee, told CNN Money in January that he hopes to have all stadiums equipped with Wi-Fi for the start of the 2015 season.

The Patriots helped lead the way last year by offering fans free Wi-Fi throughout Gillette Stadium in Foxboro, Massachusetts. The network was built by New Hampshire-based Enterasys Networks.

“We certainly encourage that any club would invest the way they have,” said Lafemina.

Eleven other stadiums currently have Wi-Fi capability: MetLife Stadium in northern New Jersey, the Georgia Dome in Atlanta, Lucas Oil Stadium in Indianapolis, Raymond James Stadium in Tampa, the Mercedes-Benz Superdome in New Orleans, Bank of America Stadium in Charlotte, Sun Life Stadium in Miami, AT&T Stadium in suburban Dallas, University of Phoenix Stadium in suburban Phoenix, Ford Field in Detroit and Soldier Field in Chicago.

The remaining stadiums have Wi-Fi in certain areas but mostly operate on wireless service provided by Verizon and/or AT&T. Many of these venues have installed distributed antenna systems (DAS) to increase wireless connectivity while they seek answers to the challenges of outfitting entire stadiums with Wi-Fi.

DAS connects cellular antennas to a common source, allowing wireless access in large buildings like stadiums.

Mobile Sports Report published its inaugural State of the Stadium Technology Survey this year, based on responses from more than 50 NFL, MLB, NBA, NHL, university, pro soccer, pro golf and car racing sites. The survey concluded DAS is currently more popular at venues because it boosts connectivity to mobile devices while dividing costs between carriers and the facility.

Cleveland Browns fans will benefit from a new DAS tower, installed by Verizon, and an upgraded AT&T tower this year at FirstEnergy Stadium. Browns President Alec Scheiner said the improved technology will serve as a test case for whether to install Wi-Fi in the future.

“If you are a consumer or a fan, you really just care about being able to get on your mobile device, and that’s what we’re trying to tackle,” he said during a July press conference.

Kapustka said DAS is a quick fix and is not a long-term strategy, especially when it comes to fans watching TV replays on their mobile devices.

“The video angle is the big thing for Wi-Fi,” he said. “Cellular just simply won’t be able to handle the bandwidth.”

He also pointed out that it is not in the best business interest of cellphone carriers to install Wi-Fi, as it would take customers off their networks.

Also complicating Kraft’s 2015 goal is the lack of league consensus about who will build Wi-Fi networks in all of its stadiums, and when.

By contrast, Major League Baseball named wireless-tech company Qualcomm its official technology partner in April, launching a two-year study to solve mobile-connectivity issues in its 30 stadiums. Kapustka said MLB was in a position to strike the overarching deal with Qualcomm because team owners made the league responsible for digital properties during the 1990s.

The NFL has a variety of rights deals, including DirecTV and Verizon, which make it more difficult for the league to agree on a single Wi-Fi plan, he said.

“My opinion is they (the NFL) will eventually have something more like MLB,” Kapustka said. “MLB has shown it is a great way to make money.”

Source:  CNN

Amazon is said to have tested a wireless network

Friday, August 23rd, 2013

Amazon.com Inc. (AMZN) has tested a new wireless network that would allow customers to connect its devices to the Internet, according to people with knowledge of the matter.

The wireless network, which was tested in Cupertino, California, used spectrum controlled by satellite communications company Globalstar Inc. (GSAT), said the people who asked not to be identified because the test was private.

The trial underlines how Amazon, the world’s largest e-commerce company, is moving beyond being a Web destination and hardware maker and digging deeper into the underlying technology for how people connect to the Internet. That would let Amazon create a more comprehensive user experience, encompassing how consumers get online, what device they use to connect to the Web and what they do on the Internet.

Leslie Letts, a spokeswoman for Amazon, didn’t respond to a request for comment. Katherine LeBlanc, a spokeswoman for Globalstar, declined to comment.

Amazon isn’t the only Internet company that has tested technology allowing it to be a Web gateway. Google Inc. (GOOG) has secured its own communications capabilities by bidding for wireless spectrum and building high-speed, fiber-based broadband networks in 17 cities, including Austin, Texas and Kansas City, Kansas. It also operates a Wi-Fi network in Mountain View, California, and recently agreed to provide wireless connectivity at Starbucks Corp. (SBUX)’s coffee shops.

Always Trying

Amazon continually tries various technologies, and it’s unclear if the wireless network testing is still taking place, said the people. The trial was in the vicinity of Amazon’s Lab126 research facilities in Cupertino, the people said. Lab126 designs and engineers Kindle devices.

“Given that Amazon’s becoming a big player in video, they could look into investing into forms of connectivity,” independent wireless analyst Chetan Sharma said in an interview.

Amazon has moved deeper into wireless services for several years, as it competes with tablet makers like Apple Inc. (AAPL) and with Google, which runs a rival application store. Amazon’s Kindle tablets and e-book readers have built-in wireless connectivity, and the company sells apps for mobile devices. Amazon had also worked on its own smartphone, Bloomberg reported last year.

Chief Executive Officer Jeff Bezos is aiming to make Amazon a one-stop shop for consumers online, a strategy that spurred a 27 percent increase in sales to $61.1 billion last year. It’s an approach investors have bought into, shown in Amazon’s stock price, which has more than doubled in the past three years.

Globalstar’s Spectrum

Globalstar is seeking regulatory approval to convert about 80 percent of its spectrum to terrestrial use. The Milpitas, California-based company applied to the Federal Communications Commission for permission to convert its satellite spectrum to provide Wi-Fi-like services in November 2012.

Globalstar met with FCC Chairwoman Mignon Clyburn in June, and a decision on whether the company can convert the spectrum could come within months. A company technical adviser conducted tests that showed the spectrum may be able to accommodate more traffic and offer faster speeds than traditional public Wi-Fi networks.

“We are now well positioned in the ongoing process with the FCC as we seek terrestrial authority for our spectrum,” Globalstar CEO James Monroe said during the company’s last earnings call.

Neil Grace, a spokesman for the FCC, declined to comment.

If granted FCC approval, Globalstar is considering leasing its spectrum, sharing service revenues with partners, and other business models, one of the people said. With wireless spectrum scarce, Globalstar’s converted spectrum could be of interest to carriers and cable companies, seeking to offload ballooning mobile traffic, as well as to technology companies.

The FCC issued the permit to trial wireless equipment using Globalstar’s spectrum to the satellite service provider’s technical adviser, Jarvinian Wireless Innovation Fund. In a letter to the FCC dated July 1, Jarvinian managing director John Dooley said his company is helping “a major technology company assess the significant performance benefits” of Globalstar’s spectrum.

Source:  bloomberg.com

Next up for WiFi

Thursday, August 22nd, 2013

Transitioning from the Wi-Fi-shy financial industry, Riverside Medical Center’s CSO Erik Devine remembers his shock at the healthcare industry’s wide embrace of the technology when he joined the hospital in 2011.

“In banking, Wi-Fi was almost a no-go because everything is so overly regulated. Wireless here is almost as critical as wired,” Devine still marvels. “It’s used for connectivity to heart pumps, defibrillators, nurse voice over IP call systems, surgery robots, remote stroke consultation systems, patient/guest access and more.”

To illustrate the level of dependence the organization has on Wi-Fi, Riverside Medical Center calls codes over the PA system — much like in medical emergencies — when the network goes down. “Wireless is such a multifaceted part of the network that it’s truly a big deal,” he says.

And getting bigger. Besides the fact that organizations are finding new ways to leverage Wi-Fi, workers have tasted the freedom of wireless, have benefited from the productivity boost, and are demanding increased range and better performance, particularly now that many are showing up with their own devices (the whole bring your own device thing). The industry is responding in kind, introducing new products and technologies, including gigabit Wi-Fi (see “Getting ready for gigabit Wi-Fi“), and it is up to IT to orchestrate this new mobile symphony.

“Traffic from wireless and mobile devices will exceed traffic from wired devices by 2017,” according to the Cisco Visual Networking Index. While only about a quarter of consumer IP traffic originated from non-PC devices in 2012, non-PC devices will account for almost half of consumer IP traffic by 2017, Cisco says.

IT gets it, says Tony Hernandez, principal in Grant Thornton’s business consulting practice. Wi-Fi is no longer an afterthought in IT build-outs. “The average office worker still might have a wired connection, but they also have the capability to use Wi-Fi across the enterprise,” says Hernandez, noting the shift has happened fast.

“Five years ago, a lot of enterprises were looking at Wi-Fi for common areas such as lobbies and cafeterias and put that traffic on an isolated segment of the network,” Hernandez says. “If users wanted access to corporate resources from wireless, they’d have to use a VPN.”

Hernandez credits several advances for Wi-Fi’s improved stature: enterprise-grade security; sophisticated, software-based controllers; and integrated network management.

Also in the mix: pressure from users who want mobility and flexibility for their corporate machines as well as the ability to access the network from their own devices, including smartphones, tablets and laptops.

Where some businesses have only recently converted from 802.11a/b/g to 802.11n, they now have to decide whether their next Wi-Fi purchases will support 802.11ac, the draft IEEE standard that addresses the need for gigabit speed. “The landscape is still 50/50 between 802.11g and 802.11n,” Hernandez says. “There are many businesses with older infrastructure that haven’t refreshed their Wi-Fi networks yet.”

What will push enterprises to move to 802.11ac? Heavier reliance on mobile access to video such as videoconferencing and video streaming, he says.

Crash of the downloads

David Heckaman, vice president of technology development at luxury hospitality chain Mandarin Oriental Hotel Group, remembers the exact moment he knew Wi-Fi had gained an equal footing with wired infrastructure in his industry.

A company had booked meeting room space at one of Mandarin Oriental’s 30 global properties to launch its new mobile app and answered all the hotel’s usual questions about anticipated network capacity demands. Not yet familiar with the impact of dense mobile usage, the IT team didn’t account for the fallout when the 200-plus crowd received free Apple iPads to immediately download and launch the new app. The network crashed. “It was a slap in the face: What was good enough before wouldn’t work. This was a whole new world,” Heckaman says.

Seven or eight years ago, Wi-Fi networks were designed to address coverage; capacity wasn’t given much thought. When Mandarin Oriental opened its New York City property in 2003, for example, IT installed two or three wireless access points in a closet on each floor and used a distributed antenna to extend coverage to the whole floor. At the time, wireless made up only 10% of total network usage. As the number climbed to 40%, capacity issues cropped up, forcing IT to rethink the entire architecture.

“We didn’t really know what capacity needs were until the Apple iPhone was released,” Heckaman says. Now, although a single access point could provide signal coverage for every five rooms, the hotel is putting access points in almost every room to connect back to an on-site controller.

Heckaman’s next plan involves adding centralized Wi-Fi control from headquarters for advanced reporting and policy management. Instead of simply reporting that on-site controllers delivered a certain number of sessions and supported X amount of overall bandwidth, he would be able to evaluate in real-time actual end-device performance. “We would be able to report on the quality of the connection and make adjustments accordingly,” he says.

Where he pinpoints service degradation, he’ll refresh access points with those that are 802.11ac-enabled. As guests bring more and more devices into their rooms and individually stream movies, play games or perform other bandwidth-intensive actions, he predicts the need for 802.11ac will come faster than anticipated.

“We have to make sure that the physical link out of the building, not the guest room access point, remains the weakest point and that the overall network is robust enough to handle it,” he says.

Getting schooled on wireless

Craig Canevit, IT administrator at the University of Tennessee at Knoxville, has had many aha! moments when it comes to Wi-Fi across the 27,000-student campus. For instance, when the team first engineered classrooms for wireless, it was difficult to predict demand. Certain professors would need higher capacity for their lectures than others, so IT would accommodate them. If those professors got reassigned to different rooms the next year, they would immediately notice performance issues.

“They had delays and interruption of service so we had to go back and redesign all classrooms with more access points and more capacity,” Canevit says.

The university also has struggled with the fact that students and faculty are now showing up with numerous devices. “We see at least three devices per person, including smartphones, tablets, gaming consoles, Apple TV and more,” he says. IT has the dual challenge of supporting the education enterprise during the day and residential demands at night.

The school’s primary issue has revolved around IP addresses, which the university found itself low on as device count skyrocketed. “Devices require IP addresses even when sitting in your pocket and we faced a terrible IP management issue,” he says. IT had to constantly scour the network for unused IP addresses to “feed the monster.”

Eventually the team came too close to capacity for comfort and had to act. Canevit didn’t think IPv6 was widely enough supported at the time, so the school went with Network Address Translation instead, hiding private IP addresses behind a single public address. A side effect of NAT is that mapping network and security issues to specific devices becomes more challenging, but Canevit says the effort is worth it.
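The trade-off Canevit describes is easy to see in a toy translation table: many private addresses share one public address, so tying an incident back to a device means consulting the NAT state. The sketch below is a hypothetical illustration, not UT Knoxville’s actual configuration.

```python
# A toy NAT table: (private ip, private port) flows are mapped to unique ports
# on a single shared public address. Purely illustrative.
import itertools

PUBLIC_IP = "203.0.113.10"            # documentation address standing in for the campus IP
_next_port = itertools.count(40000)   # public-side ports handed out in order
nat_table: dict[tuple[str, int], int] = {}   # (private ip, private port) -> public port

def translate_outbound(private_ip: str, private_port: int) -> tuple[str, int]:
    """Map an inside flow to the shared public address and a unique public port."""
    key = (private_ip, private_port)
    if key not in nat_table:
        nat_table[key] = next(_next_port)
    return PUBLIC_IP, nat_table[key]

def trace_inbound(public_port: int) -> tuple[str, int] | None:
    """Security/troubleshooting lookup: which device owned this public port?"""
    for key, port in nat_table.items():
        if port == public_port:
            return key
    return None

print(translate_outbound("10.20.1.55", 51000))   # ('203.0.113.10', 40000)
print(translate_outbound("10.20.7.12", 51000))   # ('203.0.113.10', 40001)
print(trace_inbound(40001))                      # ('10.20.7.12', 51000)
```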

Looking forward, the university faces the ongoing challenge of providing Wi-Fi coverage to every dorm room and classroom. That’s a bigger problem than capacity. “We only give 100Mbps on the wired network in residence halls and don’t come close to hitting capacity,” he says, so 802.11ac is really not on the drawing board. What’s more, 802.11ac would exacerbate his coverage problem. “To get 1Gbps, you’ll have to do channel bonding, which leaves fewer non-overlapping channels available and takes away from the density,” he says.
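The channel math behind that concern is simple division, as the rough sketch below shows. The amount of usable 5GHz spectrum is an assumed ballpark and varies by regulatory domain and DFS rules.

```python
# Rough channel arithmetic: bonding wider channels leaves fewer non-overlapping
# channels to reuse across dense dorms and classrooms.

def non_overlapping_channels(usable_spectrum_mhz: float, channel_width_mhz: int) -> int:
    return int(usable_spectrum_mhz // channel_width_mhz)

USABLE_5GHZ_MHZ = 500   # assumed ballpark of usable US 5GHz spectrum (with DFS)
for width in (20, 40, 80, 160):
    print(f"{width}MHz channels: ~{non_overlapping_channels(USABLE_5GHZ_MHZ, width)} "
          "non-overlapping")
```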

What he is intrigued by is software-defined networking. Students want to use their iPhones to control their Apple TVs and other such devices, which currently is impossible because the devices sit on separate subnets. “If you allowed this in a dorm, it would degrade quality for everyone,” he says. SDN could give wireless administrators a way around the problem by making it possible to add boatloads of virtual LANs. “Wireless will become more of a provisioning than an engineering issue,” Canevit predicts.

Hospital all-in with Wi-Fi

Armand Stansel, director of IT infrastructure at Houston’s The Methodist Hospital System, recalls a time when his biggest concern regarding Wi-Fi was making sure patient areas had access points. “That was in early 2000 when we were simply installing Internet hotspots for patients with laptops,” he says.

Today, the 1,600-bed, five-hospital system boasts 100% Wi-Fi coverage. Like Riverside Medical Center, The Methodist Hospital has integrated wireless deep into the clinical system to support medical devices such as IV pumps, portable imaging systems for radiology, physicians’ tablet-based consultations and more. The wireless network has 20,000 to 30,000 touches a day, which has doubled in the past few years, Stansel says.

And if IT has its way, that number will continue to ramp up. Stansel envisions a majority of employees working on the wireless network. He wants to transition back-office personnel to tablet-based docking systems when the devices are more “enterprise-ready” with better security and durability (battery life and the device itself).

Already he has been able to reduce wired capacity by more than half due to the rise of wireless. Patient rooms, which used to have numerous wired outlets, now only require a few for the wired patient phone and some telemetry devices.

When the hospital does a renovation or adds new space, Stansel spends as much time planning the wired plant as he does studying the implications for the Wi-Fi environment, looking at everything from what the walls are made of to possible sources of interference. And when it comes to even the simplest construction, such as moving a wall, he has to deploy a team to retest nearby access points. “Wireless does complicate things because you can’t leave access points static. But it’s such a necessity, we have to do it,” he says.

He also has to reassess his access point strategy on an ongoing basis, adding more or relocating others depending on demand and traffic patterns. “We always have to look at how the access point is interacting with devices. A smartphone connecting to Wi-Fi has different needs than a PC and we have to monitor that,” he says.

The Methodist Hospital takes advantage of a blend of 802.11b, .11g and .11n in the 2.4GHz and 5GHz bands. Channel bonding, he has found, poses challenges even for .11n, reducing the number of channels available for others. The higher the density, he says, the less likely he can take full advantage of .11n. He does use .11n for priority locations such as the ER, imaging, radiology and cardiology, where users require higher bandwidth.

Stansel is betting big that wireless will continue to grow. In fact, he believes that by 2015 it will surpass wired 3-to-1. “There may come a point where wired is unnecessary, but we’re just not there yet,” he says.

Turning on the ac

Stansel is, however, onboard with 802.11ac. The Methodist Hospital is an early adopter of Cisco’s 802.11ac wireless infrastructure. To start, he has targeted the same locations that receive 802.11n priority. If a patient has a cardiac catheterization procedure done, the physician who performed it can interactively review the results with the patient and family while the patient is still in the recovery room, referencing dye images from a wireless device such as a tablet. Normally, physicians have to verbally brief patients just out of surgery, then do likewise with the family, and wait until later to go over high-definition images from a desktop.

Current wireless technologies have strained to support access to real-time 3D imaging (also referred to as 4D), ultrasounds and more. Stansel expects better performance as 802.11ac is slowly introduced.

Riverside Medical Center’s Devine is more cautious about deploying 802.11ac, saying he is still a bit skeptical. “Can we get broader coverage with fewer access points? Can we get greater range than with 802.11n? That’s what is important to us,” he says.

In the meantime, Devine plans to deploy 20% to 25% more access points to support triangulation for locating equipment. That will let him replace the RFID system that stops high-value items such as Ascom wireless phones and heart pumps from walking out the door. “RFID is expensive and a whole other network to manage. If we can mimic what it does with Wi-Fi, we can streamline operations,” he says.

High-power access points currently are mounted in each hallway, but Devine wants to swap those out with low-power ones and put regular-strength access points in every room. If 802.11ac access points prove to be affordable, he’ll consider them, but won’t put off his immediate plans in favor of the technology.

The future of Wi-Fi

Enterprise Strategy Group Senior Analyst John Mazur says that Wi-Fi should be front and center in every IT executive’s plans. BYOD has tripled the number of Wi-Fi connected devices and new access points offer about five times the throughput and twice the range of legacy Wi-Fi access points. In other words, Mazur says, Wi-Fi is up to the bandwidth challenge.

He warns IT leaders not to be scared off by spending projections, which, according to ESG’s 2013 IT Spending Intentions Survey, will be at about 2012 levels and favor cost-cutting (like Devine’s plan to swap out RFID for Wi-Fi) rather than growth initiatives.

But now is the time, he says, to set the stage for 802.11ac, which is due to be ratified in 2014. “IT should require 802.11ac support from their vendors and get a commitment on the upgrade cost and terms before signing a deal. Chances are you won’t need 802.11ac’s additional bandwidth for a few years, but you shouldn’t be forced to do forklift upgrades/replacements of recent access points to get .11ac. It should be a relatively simple module or software upgrade to currently marketed access points.”

While 802.11ac isn’t even fully supported by wireless clients yet, Mazur recommends keeping your eye on the 802.11 sky. Another spec, 802.11ad, which operates in the 60GHz spectrum and is currently geared toward home entertainment connectivity and near-field HD video connectivity, could be — like other consumer Wi-Fi advances — entering the enterprise space sooner rather than later.

Source:  networkworld.com

7Gbps transmissions up to a mile will boost wireless Internet coverage

Friday, August 9th, 2013

FCC rule change lets industry send higher-power signals on 60GHz band.

The Federal Communications Commission (FCC) has changed its rules to allow higher-power outdoor operations on the 57-64 GHz band, enabling wireless transmissions “over distances up to a mile at data rates of 7Gbps,” the agency said.

This is the same swath of spectrum used by Wireless Gigabit technology, which allows fast streaming between devices in living rooms and offices. Transmissions at this frequency are easily blocked by walls and objects. However, the FCC said the transmissions will still be useful outdoors in line-of-sight applications. Unlike a home network, the 7Gbps of bandwidth would be shared among many users.

“Responding to a petition by the industry, the Commission increased the power permitted for outdoor operations between fixed points using highly directional antennas and tied the maximum power permitted to the precision of the antenna beam which determines its potential for causing interference to other users, including to indoor low-power networks,” an FCC announcement today said. “This rule change would permit outdoor devices to deliver high-capacity communication links over longer distances, enhancing the utility of the unlicensed 57-64 GHz band as a vehicle for broadband. It will also facilitate the use of this unlicensed spectrum as a backhaul alternative in densely populated areas where 4G and other wireless services are experiencing an ever‑increasing need for additional spectrum.”
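A quick link-budget estimate shows why the new rules are tied to highly directional antennas. The sketch below combines standard free-space path loss with an approximate figure for the oxygen absorption that is peculiar to the 60GHz band; the absorption constant is an illustrative approximation.

```python
# Why 60GHz links need high-gain directional antennas: free-space path loss
# over one mile, plus approximate oxygen absorption near 60GHz.
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss: 20*log10(4*pi*d*f/c)."""
    c = 3e8
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

MILE_M = 1609.0
O2_ABSORPTION_DB_PER_KM = 15.0      # roughly 15 dB/km near 60GHz (approximate)

loss = fspl_db(MILE_M, 60e9) + O2_ABSORPTION_DB_PER_KM * MILE_M / 1000
print(f"~{loss:.0f} dB of loss over a mile at 60GHz")
# Closing a multi-gigabit link budget at this distance takes tens of dBi of
# antenna gain at each end, which is what the FCC's power rules anticipate.
```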

Like Wi-Fi airwaves, the spectrum is unlicensed, meaning any manufacturer or service provider can use the spectrum provided that they comply with FCC rules. One possible use noted by the FCC is “extending the reach of fiber optic networks by providing broadband access to adjacent structures in commercial facilities.”

“Taken together, the new rules will enhance the use of unlicensed spectrum as a relatively low‑cost, high‑capacity short‑range backhaul alternative to connect wireless broadband networks and for other wireless applications,” the FCC said.

FCC watcher Harold Feld, senior vice president of Public Knowledge, told Ars that these transmissions will be “[g]ood for links around natural breaks in terrain or building to building in urban areas. In rural areas, they put these on grain silos.”

Rules for use of this spectrum indoors will not be changed.

Separately, the FCC today also adopted new rules for satellite services to “remov[e] unnecessary regulations and eas[e] administrative burdens.”

Source:  arstechnica.com

Oil, gas field sensors vulnerable to attack via radio waves

Friday, July 26th, 2013

Researchers with IOActive say they can shut down a plant from up to 40 miles away by attacking industrial sensors

Sensors widely used in the energy industry to monitor industrial processes are vulnerable to attack from 40 miles away using radio transmitters, according to alarming new research.

Researchers Lucas Apa and Carlos Mario Penagos of IOActive, a computer security firm, say they’ve found a host of software vulnerabilities in the sensors, which are used to monitor metrics such as temperature and pipeline pressure. The flaws could be fatal if abused by an attacker.

Apa and Penagos are scheduled to give a presentation next Thursday at the Black Hat security conference in Las Vegas but gave IDG News Service a preview of their research. They can’t reveal many details due to the severity of the problems.

“If you compromise a company on the Internet, you can cause a monetary loss,” Penagos said. “But in this case, [the impact] is immeasurable because you can cause loss of life.”

The U.S. and other nations have put increased focus in recent years on the safety of industrial control systems used in critical infrastructure such as nuclear power plants and energy and water utilities. The systems, now often connected to the Internet, may not have had thorough security audits, posing a risk of life-threatening attacks from afar.

Apa and Penagos studied sensors manufactured by three major wireless automation system manufacturers. The sensors typically communicate with a company’s home infrastructure using radio transmitters on the 900MHz or 2.4GHz bands, reporting critical details on operations from remote locations.

Apa and Penagos found that many of the sensors contained a host of weaknesses, ranging from weak cryptographic keys used to authenticate communication to software vulnerabilities and configuration errors.

For example, they found some families of sensors shipped with identical cryptographic keys. It means that several companies may be using devices that all share the same keys, putting them at a greater risk of attack if a key is compromised.
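
To see why identical keys are so dangerous, consider a minimal sketch of symmetric message authentication. The researchers did not disclose the scheme the sensors actually use; the key, the frame contents and the HMAC construction below are purely illustrative assumptions. The point is that with any scheme built on a key that ships identically in every unit, whoever extracts that key from one device can forge valid-looking readings for all of them.

    import hmac, hashlib

    # Hypothetical: the same key is burned into every sensor at the factory.
    SHARED_KEY = b"factory-default-key"  # identical across devices and customers

    def sign_reading(reading, key=SHARED_KEY):
        """Authenticate a sensor reading with an HMAC tag (illustrative only)."""
        return hmac.new(key, reading, hashlib.sha256).digest()

    def verify_reading(reading, tag, key=SHARED_KEY):
        return hmac.compare_digest(sign_reading(reading, key), tag)

    # A legitimate sensor reports normal pipeline pressure...
    genuine = b"pressure=740kPa"
    assert verify_reading(genuine, sign_reading(genuine))

    # ...but anyone who recovered the shared key from any one device can forge
    # a reading that the receiving system will accept as authentic.
    forged = b"pressure=0kPa"
    assert verify_reading(forged, sign_reading(forged))

Per-device keys would at least confine the damage from a single compromised unit, which is presumably part of what the firmware and configuration fixes described below will need to address.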

They tested various attacks against the sensors using the specific kind of radio antenna the sensors use to communicate with their home networks. They found it was possible to modify readings and disable sensors from up to 40 miles (64 kilometers) away. Since the attack isn’t conducted over the Internet, there’s no way to trace it, Penagos said.

In one scenario, the researchers concluded that by exploiting a memory corruption bug, all sensors could be disabled and a facility could be shut down.

Fixing the sensors, which will require firmware updates and configuration changes, won’t be easy or quick. “You need to be physically connected to the device to update them,” Penagos said.

Apa and Penagos won’t identify the vendors of the sensors since the problems are so serious. They’ve handed their findings to the U.S. Computer Emergency Readiness Team, which is notifying the affected companies.

“We care about the people working in the oil fields,” Penagos said.

Source:  computerworld.com

AT&T uses small cells to improve service in Disney parks

Tuesday, July 23rd, 2013

AT&T will soon show off how small cell technology can improve network capacity and coverage in Walt Disney theme parks.

If you’re a Disney theme park fan and you happen to be an AT&T wireless customer, here’s some good news: Your wireless coverage within the company’s two main resorts is going to get a heck of a lot better.

AT&T and Disney Parks are announcing an agreement Tuesday that will make AT&T the official wireless provider for Walt Disney World Resort and Disneyland Resort.

What does this mean? As part of the deal, AT&T will be improving service within the Walt Disney World and Disneyland Resorts by adding small cell technology that will chop up AT&T’s existing licensed wireless spectrum and reuse it in smaller chunks to better cover the resorts and add more capacity in high-volume areas. The company will also add free Wi-Fi hotspots, which AT&T customers visiting the resorts will be able to use to offload data traffic.

Specifically, AT&T will add more than 25 distributed antenna systems in an effort to add capacity. It will also add more than 350 small cells, which extend the availability of the network. AT&T is adding 10 new cell sites across the Walt Disney World resort to boost coverage and capacity. And it will add nearly 50 repeaters to help improve coverage of the network.

Chris Hill, AT&T’s senior vice president for advanced solutions, said that AT&T’s efforts to improve coverage in and around Disney resorts are part of a bigger effort the company is making to add capacity and improve coverage in highly trafficked areas. He said that even though AT&T had decent network coverage already within the Disney parks, customers often experienced issues in some buildings or in remote reaches of the resorts.

“The macro cell sites can only cover so much,” he said. “So you need to go to small cells to really get everywhere you need to be and to provide the capacity you need in areas with a high density of people.”

Hill said the idea of creating smaller cell sites that reuse existing licensed spectrum is a big trend among all wireless carriers right now. And he said, AT&T is deploying this small cell technology in several cities as well as other areas where large numbers of people gather, such as stadiums and arenas.

“We are deploying this technology widely across metro areas to increase density of our coverage,” he said. “And it’s not just us. There’s a big wave of small cell deployments where tens of thousands of these access points are being deployed all over the place.”

Cooperation with Disney is a key element in this deployment since the small cell technology requires that AT&T place access points on the Disney property. The footprint of the access points is very small. They typically look like large access points used for Wi-Fi. Hill said they can be easily disguised to fit in with the surroundings.

Unfortunately, wireless customers with service from other carriers won’t see the same level of improved service. The network upgrade and the small cell deployments will only work for AT&T wireless customers. AT&T has no plans to allow other major carriers to use the network for roaming.

Also as part of the deal, AT&T will take over responsibility for Disney’s corporate wireless services, providing services to some 25,000 Disney employees. And the companies have struck various marketing and branding agreements. As part of that aspect of the deal, AT&T will become an official sponsor of Disney-created soccer and runDisney events at the ESPN Wide World of Sports Complex. In addition, Disney will join AT&T in its “It Can Wait” public service campaign, which educates the public about the dangers of texting while driving.

Source:  CNET

Nation’s first campus ‘Super Wi-Fi’ network launches at West Virginia University

Friday, July 19th, 2013

West Virginia University today (July 9) became the first university in the United States to use vacant broadcast TV channels to provide the campus and nearby areas with wireless broadband Internet services.

The university has partnered with AIR.U, the Advanced Internet Regions consortium, to transform the “TV white spaces” frequencies left empty when television stations moved to digital broadcasting into much-needed connectivity for students and the surrounding community.

The initial phase of the network provides free public Wi-Fi access for students and faculty at the Personal Rapid Transit platforms, a 73-car tram system that transports more than 15,000 riders daily.

“Not only does the AIR.U deployment improve wireless connectivity for the PRT System, but also demonstrates the real potential of innovation and new technologies to deliver broadband coverage and capacity to rural areas and small towns to drive economic development and quality of life, and to compete with the rest of the world in the knowledge economy,” said WVU Chief Information Officer John Campbell.

“This may well offer a solution for the many West Virginia communities where broadband access continues to be an issue,” Campbell said, “and we are pleased to be able to be a test site for a solution that may benefit thousands of West Virginians.”

Sen. Jay Rockefeller, chairman of the Senate Committee on Commerce, Science and Transportation, said, “As chairman of the Senate Commerce Committee, I have made promoting high-speed Internet deployment throughout West Virginia, and around the nation, a priority. That is why I am excited by today’s announcement of the new innovative wireless broadband initiative on West Virginia University’s campus.

“Wireless broadband is an important part of bringing the economic, educational, and social benefits of broadband to all Americans,” he said.

“My Public Safety Spectrum legislation, which the president signed into law last year, helped to preserve and promote innovative wireless services,” Rockefeller said. “The lessons learned from this pilot project will be important as Congress continues to look for ways to expand broadband access and advance smart spectrum policy.”

Mignon Clyburn, acting chair of the Federal Communications Commission, praised the development, saying, “Innovative deployment of TV white spaces presents an exciting opportunity for underserved rural and low-income urban communities across the country. I commend AIR.U and West Virginia University on launching a unique pilot program that provides campus-wide Wi-Fi services using TV white space devices.

“This pilot will not only demonstrate how TV white space technologies can help bridge the digital divide, but also could offer valuable insights into how best to structure future deployments,” she said.

The network deployment is managed by AIR.U co-founder Declaration Networks Group LLC and represents a collaboration between AIR.U and the WVU Board of Governors; the West Virginia Network for Telecomputing, which provides the fiber optic Internet backhaul for the network; and Adaptrum Inc., a California start-up providing white space equipment designed to operate on vacant TV channels. AIR.U is affiliated with the Open Technology Institute at the New America Foundation, a non-partisan think tank based in Washington, D.C. Microsoft and Google both provided early support for AIR.U’s overall effort to spur innovation to upgrade the broadband available to underserved campuses and their surrounding communities.

“WVNET is proud to partner with AIR.U and WVU on this exciting new wireless broadband opportunity,” WVNET Director Judge Dan O’Hanlon said. “We are very pleased with this early success and look forward to expanding this last-mile wireless solution all across West Virginia.” O’Hanlon also serves as chairman of the West Virginia Broadband Council.

Because the unique propagation characteristics of TV band spectrum enable networks to broadcast Wi-Fi connections over several miles and over hilly and forested terrain, the Federal Communications Commission describes unlicensed access to vacant TV channels as enabling “Super Wi-Fi” services. For example, WVU can add additional Wi-Fi hotspots in other locations around campus where students congregate or lack connectivity today. Future applications include public Wi-Fi access on the PRT cars and machine-to-machine wireless data links supporting control functions of the PRT System.

AIR.U’s initial deployment, blanketing the WVU campus with Wi-Fi connectivity, demonstrates the equipment capabilities, the system throughput and performance of TV band frequencies to support broadband Internet applications. AIR.U intends to facilitate additional college community and rural broadband deployments in the future.

“The innovative WVU network demonstrates why it is critical that the FCC allows companies and communities to use vacant TV channel spectrum on an unlicensed basis,” said Michael Calabrese, director of the Wireless Future Project at the New America Foundation. “We expect that hundreds of rural and small town colleges and surrounding communities will soon take advantage of this very cost-effective technology to extend fast and affordable broadband connections where they are lacking.”

“Microsoft was built on the idea that technology should be accessible and affordable to everyone, and today access to a broadband connection is becoming increasingly important,” said Paul Mitchell, general manager/technology policy at Microsoft. “White spaces technology and efficient spectrum management have a huge potential for expanding affordable broadband access in underserved areas and we are pleased to be partnering with AIR.U and West Virginia University on this new launch.”

The AIR.U consortium includes organizations that represent over 500 colleges and universities nationwide, and includes the United Negro College Fund, the New England Board of Higher Education, the Corporation for Education Network Initiatives in California, the National Institute for Technology in Liberal Education, and Gig.U, a consortium of 37 major universities.

“We are delighted that AIR.U was born out of the Gig.U effort,” said Blair Levin, executive director of Gig.U and former executive director of the National Broadband Plan. “The communities that are home to our research universities and colleges across the country need next generation speeds to compete in the global economy and we firmly believe this effort can be a model for other communities.”

Founding partners of AIR.U include Microsoft, Google, the Open Technology Institute at the New America Foundation, the Appalachian Regional Commission, and Declaration Networks Group, LLC, a new firm established to plan, deploy and operate Super Wi-Fi networks.

“Super Wi-Fi presents a lower-cost, scalable approach to deliver high capacity wireless networks, and DNG is leading the way for a new broadband alternative to provide sustainable models that can be replicated and extended to towns and cities nationwide,” stated Bob Nichols, CEO of Declaration Networks Group, LLC and AIR.U co-founder.

Source:  wvu.edu

FCC approves Google’s ‘white space’ database operation

Sunday, June 30th, 2013

 

The database will allow unlicensed TV broadcast spectrum to be used for wireless broadband.

The Federal Communications Commission has approved Google’s plan to operate a database that would allow unlicensed TV broadcast spectrum to be used for wireless broadband and shared among many users.

Google, which was granted commission approval on Friday, is the latest company to complete the FCC’s 45-day testing phase. Spectrum Bridge and Telcordia have already completed their trials, and another 10 companies, including Microsoft, are working on similar databases. The new database will keep track of the TV broadcast frequencies in use so that wireless broadband devices can take advantage of the unlicensed space on the spectrum, also called “white space.”

In the U.S., the FCC has been working to free up spectrum for wireless carriers, which complain they lack adequate available spectrum to keep up with market demand for data services. The FCC approved new rules in 2010 for using unlicensed white space that included establishing databases to track clear frequencies and ensure that devices do not interfere with existing broadcast TV license holders. The databases contain information supplied by the FCC.
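
In practice, the workflow the rules describe is simple: before transmitting, a white-space device reports its location to a database and gets back the channels that no protected broadcaster occupies there. The sketch below is a hypothetical illustration of that lookup; the data, the function and the coordinates are invented for illustration and do not represent Google’s actual database interface.

    # Hypothetical sketch of the TV white space workflow: a device reports its
    # location, the database answers with the channels no licensed broadcaster
    # occupies there, and the device picks one of them to transmit on.

    # Invented example data: TV channel -> True if protected (in use) at this spot.
    CHANNELS_AT_LOCATION = {
        (38.05, -81.10): {21: True, 27: True, 31: False, 44: False, 48: True},
    }

    def available_channels(location):
        """Return the vacant TV channels at a location (illustrative lookup only)."""
        registry = CHANNELS_AT_LOCATION.get(location, {})
        return sorted(ch for ch, protected in registry.items() if not protected)

    # A white-space radio would perform this check before transmitting and
    # re-check periodically, so licensed broadcasters stay protected.
    print(available_channels((38.05, -81.10)))   # -> [31, 44]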

However, TV broadcasters have resisted the idea of unlicensed use, worried that allowing others to use white space, which is very close to the frequencies they occupy, could cause interference. What Google and others developing this database technology hope to show is that it is possible to share white space without creating interference.

The Web giant announced in March that it had launched a trial program that would tap white spaces to provide wireless broadband to 10 rural schools in South Africa.

Source:  CNET

Cheat sheet: What you need to know about 802.11ac

Friday, June 21st, 2013

Wi-Fi junkies, people addicted to streaming content, and Ethernet-cable haters are excited. There’s a new Wi-Fi protocol in town, and vendors are starting to push products based on the new standard out the door. It seems like a good time to meet 802.11ac, and see what all the excitement’s about.

What is 802.11ac?

802.11ac is a brand new, soon-to-be-ratified wireless networking standard under the IEEE 802.11 protocol. 802.11ac is the latest in a long line of protocols that started in 1999:

  • 802.11b provides up to 11 Mb/s per radio in the 2.4 GHz spectrum. (1999)
  • 802.11a provides up to 54 Mb/s per radio in the 5 GHz spectrum. (1999)
  • 802.11g provides up to 54 Mb/s per radio in the 2.4 GHz spectrum. (2003)
  • 802.11n provides up to 600 Mb/s per radio in the 2.4 GHz and 5.0 GHz spectrum. (2009)
  • 802.11ac provides up to 1000 Mb/s (multi-station) or 500 Mb/s (single-station) in the 5.0 GHz spectrum. (2013?)

802.11ac is a significant jump in technology and data-carrying capabilities. The following slide compares the specifications of 802.11n (the current protocol) with the proposed specs for 802.11ac.

(Slide courtesy of Meru Networks)

What is new and improved with 802.11ac?

For those wanting to delve deeper into the inner workings of 802.11ac, this Cisco white paper should satisfy you. For those not so inclined, here’s a short description of each major improvement.

Larger bandwidth channels: Channel width determines how much spectrum a single link can use at once. Larger channels are beneficial because they increase the rate at which data passes between two devices. 802.11n supports 20 MHz and 40 MHz channels. 802.11ac supports 20 MHz, 40 MHz, and 80 MHz channels, and has optional support for 160 MHz channels.

(Slide courtesy of Cisco)

More spatial streams: Spatial streaming is the magic behind MIMO technology, allowing multiple signals to be transmitted simultaneously from one device using different antennas. 802.11n can handle up to four streams, while 802.11ac bumps the number up to eight.

(Slide courtesy of Aruba)
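
Wider channels and more streams multiply together, which is where the headline rates come from. Here is a back-of-the-envelope calculation using commonly cited 802.11ac parameters (234 data subcarriers in an 80 MHz channel, 256-QAM modulation, rate-5/6 coding, and 3.6 µs symbols with the short guard interval); treat it as a sketch of the arithmetic rather than a quotation from the standard.

    # Back-of-the-envelope 802.11ac PHY rate (assumed, commonly cited parameters).
    data_subcarriers = 234       # 80 MHz channel
    bits_per_subcarrier = 8      # 256-QAM
    coding_rate = 5 / 6
    symbol_time_s = 3.6e-6       # 3.2 us symbol + 0.4 us short guard interval

    rate_per_stream = data_subcarriers * bits_per_subcarrier * coding_rate / symbol_time_s
    print(f"one spatial stream : {rate_per_stream / 1e6:.0f} Mb/s")       # ~433 Mb/s
    print(f"three streams      : {3 * rate_per_stream / 1e6:.0f} Mb/s")   # ~1300 Mb/s
    print(f"eight streams      : {8 * rate_per_stream / 1e6:.0f} Mb/s")   # ~3467 Mb/s

Real-world throughput lands well below these PHY rates once protocol overhead, contention and range are factored in.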

MU-MIMO: Multi-user MIMO allows a single 802.11ac device to transmit independent data streams to multiple stations at the same time.

(Slide courtesy of Aruba)

Beamforming: Beamforming is now standardized. The antennas and controlling circuitry coordinate the phase of the transmitted signal to focus RF energy toward the intended receiver, rather than radiating it in all directions like the omnidirectional antennas people are used to.

(Slide courtesy of Altera.)

What’s to like?

It’s been four years since 802.11n was ratified, and best guesses have 802.11ac being ratified by the end of 2013. In the meantime, the supporting technology has improved: better software, better radios, better antenna technology, and better packaging.

The improvement that has everyone charged up is the monstrous increase in data throughput. Theoretically, it puts Wi-Fi on par with gigabit wired connections. Even if real-world numbers fall short of that, tested throughput is leaps and bounds above what 802.11b could muster back in 1999.

Another improvement that should be of interest is Multi-User MIMO. Before MU-MIMO, 802.11 radios could only talk to one client at a time. With MU-MIMO, two or more conversations can happen concurrently, reducing latency.

Source:  techrepublic.com

With faster 5G Wi-Fi coming, Wi-Fi Alliance kicks off certification program

Thursday, June 20th, 2013

Process ensures 802.11ac devices work well with older Wi-Fi products

Although faster fifth-generation Wi-Fi is already available in some new wireless routers and even the new MacBook Air laptops, a new Wi-Fi Certified ac program is being launched today to ensure the newest devices interoperate with other Wi-Fi products.

The Wi-Fi Alliance announced the certification program for 802.11ac Wi-Fi (also known as 5G Wi-Fi). Mobile devices, tablets, laptops, networking gear and other hardware will be available in the last half of 2013 with a Wi-Fi Certified label, ensuring that the devices have been tested to interoperate with other 802.11ac products and older Wi-Fi products.

“The certification program ensures that users can purchase the latest device and not worry if it will work with a device of two years or even 10 years ago,” said Kevin Robinson, senior marketing manager for the Wi-Fi Alliance in an interview.

The faster Wi-Fi allows two to three times the speed of existing 802.11n technology, Robinson said. It will enhance the speed of movie downloads and other user needs in a home or workplace.

Robinson said that 802.11ac should allow a transfer of an HD movie to a tablet in under four minutes, and allow for multiple video streams inside a home at one time. “The average user will notice the difference,” he said, contrary to what some analysts have predicted.
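
As a rough sanity check on the movie claim, here is the arithmetic with assumed numbers; the file size and the sustained throughput are illustrative figures, not numbers from the Alliance.

    # Illustrative transfer-time arithmetic (assumed file size and throughput).
    movie_gigabytes = 4.0            # assumed size of an HD movie file
    throughput_mbps = 200.0          # assumed real-world 802.11ac throughput

    movie_megabits = movie_gigabytes * 8 * 1000
    seconds = movie_megabits / throughput_mbps
    print(f"{seconds / 60:.1f} minutes")          # ~2.7 minutes

    # Minimum sustained throughput needed to stay under four minutes:
    print(f"{movie_megabits / 240:.0f} Mb/s")     # ~133 Mb/s

Under those assumptions the transfer takes well under three minutes, and anything above roughly 133 Mb/s of sustained throughput keeps it under four.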

Theoretical maximum speeds on 802.11ac can reach 1.3 Gbps, three times 802.11n’s speeds of 450 Mbps. Older 802.11g supports theoretical speeds of up to 54 Mbps. Actual speeds will be far lower, depending mainly on the number of users and the type of data being transferred.

Aside from faster speeds, 802.11ac allows for more network capacity so that more devices can be simultaneously connected to a network. Because of the added network capacity with 802.11ac, Robinson said that movies can be streamed with less compression, enhancing their overall visual quality. Wi-Fi over 802.11ac also reduces network latency, resulting in fewer delays in streaming music and gaming applications.

Wi-Fi Direct, which is technology to allow device-to-device interoperability with 802.11n, is not yet part of the 802.11ac certification program, Robinson said.

The Wi-Fi Alliance predicts that many of the new routers made with 802.11ac will operate on both the 5 GHz and 2.4 GHz bands. That way, 802.11n traffic will be able to run over both bands, while 802.11ac traffic runs over 5 GHz. Robinson said that 2.4 GHz will remain sufficient for carrying data for many apps and uses, such as Web browsing. Migrating to 5 GHz allows wider spectrum channels with higher data throughputs, yielding higher performance. An advantage of 5 GHz is that it supports various channel widths (20 MHz, 40 MHz and 80 MHz), while 2.4 GHz allows only three non-overlapping 20 MHz channels.

The Wi-Fi Alliance said 11 chips and other components are being used to test new 802.11ac devices. They are from Broadcom, Intel, Marvell, Mediatek, Qualcomm and Realtek. A list of Wi-Fi Certified ac products is available at www.wi-ficertifiedac.com.

As an indication of the fast industry adoption of 802.11ac, Aruba Networks on May 21 announced new Wi-Fi access points supporting the technology and said more recently that the University of Delaware is a beta customer. Aruba is working toward Wi-Fi Certified ac certification of the new access points, a spokeswoman said.

Robinson predicted that many of the recently announced routers and other products will seek Wi-Fi 802.11ac certification.

Source:  computerworld.com

New attack cracks iPhone autogenerated hotspot passwords in seconds

Thursday, June 20th, 2013

Default password pool so small scientists need just 24 seconds to guess them all.

If you use your iPhone’s mobile hotspot feature on a current device, make sure you override the automatic password it offers to secure your connections. Otherwise, a team of researchers can crack it in less than half a minute by exploiting recently discovered weaknesses.

It turns out Apple’s iOS versions 6 and earlier pick from such a small pool of passwords by default that the researchers—who are from the computer science department of the Friedrich-Alexander University in Erlangen, Germany—need just 24 seconds to run through all the possible combinations. The time required assumes they’re using four AMD Radeon HD 7970 graphics cards to cycle through an optimized list of possible password candidates. It also doesn’t include the amount of time it takes to capture the four-way handshake that’s negotiated each time a wireless enabled device successfully connects to a WPA2, or Wi-Fi Protected Access 2, device. More often than not, though, the capture can be completed in under a minute. With possession of the underlying hash, an attacker is then free to perform an unlimited number of “offline” password guesses until the right one is tried.

The research has important security implications for anyone who uses their iPhone’s hotspot feature to share the device’s mobile Internet connectivity with other Wi-Fi-enabled gadgets. Adversaries who are within range of the network can exploit the weakness to quickly determine the default pre-shared key that’s supposed to prevent unauthorized people from joining. From there, attackers can leach off the connection, or worse, monitor or even spoof e-mail and other network data as it passes between connected devices and the iPhone acting as the access point.

“Taking our optimizations into consideration, we are now able to show that it is possible for an attacker to reveal a default password of an arbitrary iOS hotspot user within seconds,” the scientists wrote in a recently published research paper. “For that to happen, an attacker only needs to capture a WPA2 authentication handshake and to crack the pre-shared key using our optimized dictionary.”

By reverse engineering key parts of iOS powering iPhones, the researchers discovered that default hotspot passwords always contained a four- to six-letter word followed by a randomly generated four-digit number. All the words were contained in an open-source Scrabble word list available online. By using a single AMD Radeon HD 6990 GPU to append every possible four-digit number to each of the words, the researchers needed only 49 minutes to cycle through all possible combinations. Then they stumbled on a discovery that allowed them to drastically reduce the amount of time required.

The hotspot feature, they found, uses an observable series of programming calls to pick four- to six-letter words from an English-language dictionary included with iOS. By cataloging the default passwords issued after about 250,000 invocations, they determined that only 1,842 different words are selected. The discovery allowed them to drastically reduce the number of guesses needed to find the correct password. As a result, the required search space—that is, the total number of password candidates needed to guess a default password—is a little less than 18.5 million.

They were able to further reduce the time required after noticing that certain words on the reduced list are much more likely than others to be chosen. For instance, “suave,” “subbed,” “headed,” and seven other top-10 words were 10 times more likely to be selected as the base for a default password than others. The optimized list in the attack orders words by their relative frequency, so those most likely to be used are guessed first. Given that a four-GPU system is able to generate about 390,000 guesses each second, it takes about 24 seconds to arrive at the correct guess.
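
The numbers in the paper reduce to simple arithmetic. The sketch below reproduces it with the figures quoted above (1,842 candidate words, four-digit suffixes, about 390,000 guesses per second); the roughly 24-second figure corresponds to searching about half of the frequency-ordered list before hitting the right candidate.

    # Search-space arithmetic for iOS default hotspot passwords (figures from the article).
    words = 1842                  # distinct base words observed by the researchers
    suffixes = 10_000             # every four-digit number, 0000-9999
    guesses_per_second = 390_000  # four-GPU cracking rig

    candidates = words * suffixes
    print(f"search space      : {candidates:,} candidates")                    # 18,420,000
    print(f"exhaustive search : {candidates / guesses_per_second:.0f} s")      # ~47 s
    print(f"half the list     : {candidates / guesses_per_second / 2:.0f} s")  # ~24 s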

Among the many security features included in the WPA standard is its use of the relatively slow PBKDF2 function to generate hashes. As a result, the number of guesses that the researchers’ four-GPU system is capable of generating each second is measured in the hundreds of thousands, rather than in the millions or billions. The paper—titled “Usability vs. Security: The Everlasting Trade-Off in the Context of Apple iOS Mobile Hotspots”—demonstrates that slow hashing alone isn’t enough to stave off effective password cracks.

Also crucial is a selection of passwords that will require attackers to devote large amounts of time or computing resources to exhaust the required search space. Had Apple engineers designed a system that picked long default passwords with upper- and lower-case letters, numbers, and special characters, it could take centuries for crackers to cycle through every possibility. Alas, passwords such as “3(M$j;]fL[ZU%<1T” aren’t easy for most people to use in practical settings. Still, a randomly generated Wi-Fi password (say, “MPuUjxRpz0” or even “arNEsISIon”) will require considerably more time and resources to crack than the default passwords currently offered by iOS.

Readers who use their iPhone’s hotspot feature should override the default password offering and replace it with something that’s harder to guess. They should also take advantage of the hotspot feature’s ability to monitor how many people are connected to the Wi-Fi network. Those who use hotspot features on other mobile platforms would also do well to carefully monitor the passwords protecting their connections. By default, passwords offered by Microsoft’s Windows Phone 8 consist of only an eight-digit number, according to the researchers, and depending on the carrier, some Android handsets may also generate default passwords that are easy to crack.
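
When overriding the default, a random replacement is easy to produce. Here is a minimal sketch using Python’s standard secrets module; the length and alphabet are choices made for illustration, not a recommendation from the researchers.

    import secrets
    import string

    def random_hotspot_password(length=16):
        """Generate a random Wi-Fi passphrase from letters and digits."""
        alphabet = string.ascii_letters + string.digits
        return "".join(secrets.choice(alphabet) for _ in range(length))

    print(random_hotspot_password())   # e.g. 'q7VZmR2kP0sL9wXc'

With 62 possible characters per position, even a 10-character password of this kind has a search space of 62^10 (roughly 8 x 10^17 candidates), far beyond the 18.5 million defaults discussed above.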

Source:  arstechnica.com

Obama wants government to free up more wireless spectrum

Friday, June 14th, 2013

President Barack Obama is directing federal agencies to look for ways to eventually share more of their radio airwaves with the private sector as the growing use of smartphones and tablets ratchets up the demand for spectrum, according to a memo released on Friday.

With blocks of spectrum reserved by dozens of government agencies for national defense, law enforcement, weather forecasting and other purposes, wireless carriers and Internet providers are urging that more spectrum be opened up for commercial use.

The call comes as airwaves are becoming congested with the increase in gadgets and services that are heavily reliant on the ability to transport greater amounts of data.

“Although existing efforts will almost double the amount of spectrum available for wireless broadband, we must make available even more spectrum and create new avenues for wireless innovation,” Obama said in his presidential memo.  “One means of doing so is by allowing and encouraging shared access to spectrum that is currently allocated exclusively for Federal use.”

The memorandum, welcomed and lauded by the telecommunications industry, directs federal agencies to study how exactly they use the airwaves and how to make it easier to share them with the private sector.

The directive also sets up a Spectrum Policy Team that in six months will have to recommend incentives to encourage government agencies to share or give up their spectrum – something industry experts see as a critical step in opening more of the federally used airwaves to the private sector.

“Our traditional three-step process for reallocating federal spectrum — clearing federal users, relocating them and then auctioning the cleared spectrum for new use — is reaching its limits,” Jessica Rosenworcel, a Democratic member of the Federal Communications Commission, said in supporting Obama’s move.

The FCC is now working on rules for the biggest-ever auction of commercially used airwaves, in which TV stations would give up and wireless providers would buy highly attractive spectrum.  The auction is expected to take place in late 2014 or later.

The White House on Friday also released a report showing growth of broadband innovation and access, an area that the Obama administration has focused on because it is viewed as a critical tool for economic growth. To further the process, the White House now plans to invest $100 million into spectrum sharing and advanced communications.

Friday’s directive also “strongly encourages” the FCC to develop a program that would spur the creation and sale of radio receivers that would ensure that if spectrum is shared, different users do not interfere with each other.

“The steps taken today lay the groundwork for tomorrow’s broadband future,” said Vonya McCann, senior vice president of government affairs at Sprint Nextel Corp.

Source:  Reuters

iPhones can auto-connect to rogue Wi-Fi networks, researchers warn

Friday, June 14th, 2013

Attackers can exploit behavior to collect passwords and other sensitive data.

Security researchers say they’ve uncovered a weakness in some iPhones that makes it easier to force nearby users to connect to Wi-Fi networks that steal passwords or perform other nefarious deeds.

The weakness is contained in configuration settings installed by AT&T, Vodafone, and more than a dozen other carriers that give the phones voice and Internet services, according to a blog post published Wednesday. Settings for AT&T iPhones, for instance, frequently instruct the devices to automatically connect to a Wi-Fi network called attwifi when the signal becomes available. Carriers make the Wi-Fi signals available in public places as a service to help subscribers get Internet connections that are fast and reliable. Attackers can take advantage of this behavior by setting up their own rogue Wi-Fi networks with the same names and then collecting sensitive data as it passes through their routers.

“The takeaway is clear,” the researchers from mobile phone security provider Skycure wrote. “Setting up such Wi-Fi networks would initiate an automatic attack on nearby customers of the carrier, even if they are using an out-of-the-box iOS device that never connected to any Wi-Fi network.”

The researchers said they tested their hypothesis by setting up several Wi-Fi networks in public areas that used the same SSIDs as official carrier networks. During a test at a restaurant in Tel Aviv, Israel on Tuesday, 60 people connected to an imposter network in the first minute, Adi Sharabani, Skycure’s CEO and cofounder, told Ars in an e-mail. During a presentation on Wednesday at the International Cyber Security Conference, the Skycure researchers set up a network that 448 people connected to during a two-and-a-half-hour period. The researchers didn’t expose people to any attacks during the experiments; they just showed how easy it was for people to connect to networks without knowing those networks had no affiliation with the carrier.

Sharabani said the settings that cause AT&T iPhones to automatically connect to certain networks can be found in the device’s profile.mobileconfig file. It’s not clear if phones from other carriers also store their configurations in the same location or somewhere else.

“Moreover, even if you take another iOS device and put an AT&T sim in it, the network will be automatically defined, and you’ll get the same behavior,” he said. He said smartphones running Google’s Android operating system don’t behave the same way.

Once attackers have forced a device to connect to a rogue network, they can run exploit software that bypasses the secure sockets layer Web encryption. From there, attackers can perform man-in-the-middle (MitM) attacks that allow them to observe passwords in transit and even forge links and other content on the websites users are visiting.

The most effective way to prevent iPhones from connecting to networks without the user’s knowledge is to turn off Wi-Fi whenever it’s not needed. Apps are also available that give users control over what SSIDs an iPhone will and won’t connect to. It’s unclear how iPhones running the upcoming iOS 7 will behave. As Ars reported Monday, Apple’s newest OS will support the Wi-Fi Alliance’s Hotspot 2.0 specification, which is designed to allow devices to hop from one Wi-Fi hotspot to another.

Given how easy it is for attackers to abuse Wi-Fi weaknesses, the Skycure research isn’t particularly shocking. Still, the ability of iPhones to connect to networks for the first time without requiring users to take explicit actions could be problematic, said Robert Graham, an independent security researcher who reviewed the Skycure blog post.

“A lot of apps still send stuff in the clear, and other apps don’t check the SSL certificate chain properly, meaning that Wi-Fi MitM is a huge problem,” said Graham, who is CEO of Errata Security. “That your phone comes pre-pwnable without your actions is a bad thing. Devices should come secure by default, not pwnable by default.”
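
Graham’s point about certificate checking is something app developers can address directly. Below is a minimal sketch, using Python’s standard library, of a TLS connection that verifies the certificate chain and hostname; the host name is a placeholder, and this illustrates the general principle rather than anything from the Skycure research.

    import socket
    import ssl

    def fetch_securely(host="example.com"):
        # ssl.create_default_context() turns on certificate-chain and hostname
        # verification; a man-in-the-middle on a rogue hotspot cannot present a
        # valid certificate for the real host, so the handshake simply fails
        # instead of silently handing traffic to the attacker.
        context = ssl.create_default_context()
        with socket.create_connection((host, 443)) as sock:
            with context.wrap_socket(sock, server_hostname=host) as tls:
                print(tls.version(), tls.getpeercert()["subject"])

    fetch_securely()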

Source:  arstechnica.com

Corning taps into optical fiber for better indoor wireless

Monday, May 20th, 2013

Bringing wireless indoors, which was once just a matter of antennas carrying a few cellular bands so people could get phone calls, has grown far more complex and demanding in the age of Wi-Fi, multiple radio bands and more powerful antennas.

DAS (distributed antenna systems) using coaxial cable have been the main solution to the problem, but they now face some limitations. To address them, Corning will introduce a DAS at this week’s CTIA Wireless trade show in Las Vegas that uses fiber instead of coax all the way from the remote cell antennas to the base station in the heart of a building.

Cable-based DAS hasn’t kept up with the new world, according to the optical networking vendor. Though Corning is associated more often with clear glass than with thin air, it entered the indoor wireless business in 2011 by buying DAS maker MobileAccess. That’s because Corning thinks optical fiber is the key to bringing more mobile capacity and coverage inside.

The system, called Corning Optical Network Evolution (ONE) Wireless Platform, can take the place of a DAS based fully or partly on coaxial cable, according to Bill Cune, vice president of strategy for Corning MobileAccess. Corning ONE will let mobile carriers, enterprises or building owners set up a neutral-host DAS for multiple carriers using many different frequencies.

Though small cells are starting to take its place in some buildings, DAS still has advantages over the newer technology, according to analyst Peter Jarich of Current Analysis. It can be easier to upgrade because only the antennas are distributed, so more of the changes can be carried out on centralized gear. Also, small cells are typically deployed by one mobile operator, and serving customers of other carriers has to be done through roaming agreements, he said.

However, some DAS products based on coaxial cable are limited in how they can handle high frequencies and MIMO (multiple-in, multiple-out) antennas, Jarich said. Some vendors are already promoting fiber for greater flexibility and capacity, he said.

Going all fiber — up to the wireless edge, at least — will make it easier and cheaper for indoor network operators to roll out systems that can deliver all the performance users have come to expect from wireless networks, according to Corning. That includes more easily adding coverage for more carriers, as well as feeding power and data to powerful Wi-Fi systems that can supplement cellular data service, the company says.

Wireless signals don’t travel the same way inside buildings as they do outdoors, so one antenna can’t always cover the interior, regardless of whether it’s mounted in the building or on a nearby tower. A DAS consists of many antennas spaced throughout a structure, all linked to a base station in a central location. Most types of DAS use coaxial cable to carry radio signals in from the distributed antennas.

However, those copper cables get more “lossy” as the frequencies they have to carry get higher, meaning they lose a lot of their signal on the way to the base station, Corning’s Cune said. That has left coax behind as new frequencies are adopted, he said. For example, coax isn’t good at carrying the 5GHz band, which is crucial in newer Wi-Fi equipment, Cune said.
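
A rough loss-budget comparison illustrates the point. The attenuation figures below are illustrative ballpark values for a half-inch coax run, not vendor specifications, and the cable length is assumed; fiber loss over the same in-building distance is essentially negligible.

    # Illustrative attenuation figures (dB per 100 ft) for a half-inch coax run;
    # ballpark numbers chosen for the comparison, not vendor specifications.
    LOSS_DB_PER_100FT = {"900 MHz": 3.9, "2.4 GHz": 6.8, "5.8 GHz": 10.8}

    run_length_ft = 200   # an assumed antenna-to-headend cable run

    for band, loss in LOSS_DB_PER_100FT.items():
        total_db = loss * run_length_ft / 100
        surviving = 10 ** (-total_db / 10)
        print(f"{band}: {total_db:.1f} dB of loss, {surviving:.1%} of the power arrives")

Under those assumptions, less than 1 percent of the 5.8 GHz signal power survives the run, which is the kind of penalty that pushes Corning toward carrying the signal optically until the last few meters.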

MIMO, a technology that uses multiple antennas in one unit to carry separate “streams” over the same frequency, is another area where coax-based DAS runs into limits, according to Corning. MIMO antennas for better performance can be found in newer Wi-Fi gear based on IEEE 802.11n and 802.11ac, as well as in LTE. A coax-based DAS with MIMO antennas needs to have a separate half-inch-wide cable for every stream, which is a major cabling challenge, Cune said.

Corning ONE links each antenna to the base station over optical fiber, converting the radio signals to optical wavelengths until they reach the base station. Fiber has more capacity than coax, can handle higher frequencies, and requires just one cable from a MIMO antenna, Cune said. Because of fiber’s high capacity, it’s relatively easy to bring other mobile operators onto the DAS.

The system is based on optical fiber, but it can be extended over standard Ethernet wiring to provide backhaul for Wi-Fi access points. Each Corning ONE remote antenna unit that’s deployed around a building will have two Ethernet ports to hook up nearby Wi-Fi access points, which can use the fiber infrastructure for data transport to wired LAN equipment, Cune said.

Corning ONE is in beta testing at one enterprise and will have limited availability beginning in late June, after which orders can be placed, Cune said. It is expected to be generally available two to three months later. The company expects its main customers to be mobile operators, though most of those operators will arrange multi-carrier services, he said. Enterprises and large building owners increasingly will step in to buy and deploy the DAS, Cune said.

Source:  networkworld.com

Wireless networks may learn to live together by using energy pulses

Wednesday, April 24th, 2013

University-developed GapSense system could help prevent interference between Wi-Fi and other networks

Researchers at the University of Michigan have invented a way for different wireless networks crammed into the same space to say “excuse me” to one another.

Wi-Fi shares a frequency band with the popular Bluetooth and ZigBee systems, and all are often found in the same places together. But it’s hard to prevent interference among the three technologies because they can’t signal each other to coordinate the use of the spectrum. In addition, different generations of Wi-Fi sometimes fail to exchange coordination signals because they use wider or narrower radio bands. Both problems can slow down networks and break connections.

Michigan computer science professor Kang Shin and graduate student Xinyu Zhang, now an assistant professor at the University of Wisconsin, set out to tackle this problem in 2011. Last July, they invented GapSense, software that lets Wi-Fi, Bluetooth and ZigBee all send special energy pulses that can be used as traffic-control messages. GapSense is ready to implement in devices and access points if a standards body or a critical mass of vendors gets behind it, Shin said.

Wi-Fi LANs are a data lifeline for phones, tablets and PCs in countless homes, offices and public places. Bluetooth is a slower but less power-hungry protocol typically used in place of cords to connect peripherals, and ZigBee is an even lower powered system found in devices for home automation, health care and other purposes.

Each of the three wireless protocols has a mechanism for devices to coordinate the use of airtime, but they all are different from one another, Shin said.

“They can’t really speak the same language and understand each other at all,” Shin said.

Each also uses CSMA (carrier sense multiple access), a mechanism that instructs radios to hold off on transmissions if the airwaves are being used, but that system doesn’t always prevent interference, he said.

The main problem is Wi-Fi stepping on the toes of Bluetooth and ZigBee. Sometimes this happens just because it acts faster than other networks. For example, a Wi-Fi device using CSMA may not sense any danger of a collision with another transmission even though a nearby ZigBee device is about to start transmitting. That’s because ZigBee takes 16 times as long as Wi-Fi to emerge from idle mode and get the packets moving, Shin said.

Changing ZigBee’s performance to help it keep up with its Wi-Fi neighbors would defeat the purpose of ZigBee, which is to transmit and receive small amounts of data with very low power consumption and long battery life, Shin said.

Wi-Fi devices can even fail to communicate among themselves on dividing up resources. Successive generations of the Wi-Fi standard have allowed for larger chunks of spectrum in order to achieve higher speeds. As a result, if an 802.11b device using just 10MHz of bandwidth tries to tell the rest of a Wi-Fi network that it has packets to send, an 802.11n device that’s using 40MHz may not get that signal, Shin said. The 802.11b device then becomes a “hidden terminal,” Shin said. As a result, packets from the two devices may collide.

To get all these different devices to coordinate their use of spectrum, Shin and Zhang devised a totally new communication method. GapSense uses a series of energy pulses separated by gaps. The length of the gaps between pulses can be used to distinguish different types of messages, such as instructions to back off on transmissions until the coast is clear. The signals can be sent at the start of a communication or between packets.
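
As a conceptual sketch of the idea (not the researchers’ actual implementation, which lives in radio firmware), one can think of a GapSense message as nothing more than fixed-length pulses of energy separated by silent gaps, where the gap length carries the meaning. The message types and durations below are invented for illustration.

    # Conceptual sketch of gap-based signaling: any radio that can detect raw
    # energy can time the silent gaps between pulses, even if it cannot decode
    # the other network's modulation. All durations are illustrative.

    PULSE_US = 20  # every pulse is the same length; only the gaps carry meaning

    # Hypothetical message types, distinguished purely by gap duration (microseconds).
    GAP_TO_MESSAGE = {40: "back off", 80: "channel clear", 120: "wake up, data pending"}

    def encode(message):
        gap = next(g for g, m in GAP_TO_MESSAGE.items() if m == message)
        # pulse, gap, pulse -- the receiver times the quiet period in between
        return [("pulse", PULSE_US), ("gap", gap), ("pulse", PULSE_US)]

    def decode(signal):
        gaps = [duration for kind, duration in signal if kind == "gap"]
        return GAP_TO_MESSAGE.get(gaps[0], "unknown")

    frame = encode("back off")
    print(frame)            # [('pulse', 20), ('gap', 40), ('pulse', 20)]
    print(decode(frame))    # back off

Because generating and timing raw energy pulses is all that is required, Wi-Fi, Bluetooth and ZigBee radios can each produce and recognize these patterns without understanding one another’s modulation, which is the crux of the GapSense idea.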

GapSense might noticeably improve the experience of using Wi-Fi, Bluetooth and ZigBee. Network collisions can slow down networks and even cause broken connections or dropped calls. When Shin and Zhang tested wireless networks in a simulated office environment with moderate Wi-Fi traffic, they found a 45 percent rate of collisions between ZigBee and Wi-Fi. Using GapSense slashed that collision rate to 8 percent. Their tests of the “hidden terminal” problem showed a 40 percent collision rate, and GapSense reduced that nearly to zero, according to a press release.

One other possible use of GapSense is to let Wi-Fi devices stay alert with less power drain. The way Wi-Fi works now, idle receivers typically have to listen to an access point to be prepared for incoming traffic. With GapSense, the access point can send a series of repeated pulses and gaps that a receiver can recognize while running at a very low clock rate, Shin said. Without fully emerging from idle, the receiver can determine from the repeated messages that the access point is trying to send it data. This feature could reduce energy consumption of a Wi-Fi device by 44 percent, according to Shin.

Implementing GapSense would involve updating the firmware and device drivers of both devices and Wi-Fi access points. Most manufacturers would not do this for devices already in the field, so the technology will probably have to wait for hardware products to be refreshed, according to Shin.

A patent on the technology is pending. The ideal way to proliferate the technology would be through a formal standard, but even without that, it could become widely embraced if two or more major vendors license it, Shin said.

Source:  computerworld.com