Archive for July, 2011

Apple Laptops Vulnerable To Hack That Kills Or Corrupts Batteries

Monday, July 25th, 2011

Your laptop’s battery is smarter than it looks. And if a hacker like security researcher Charlie Miller gets his digital hands on it, it could become more evil than it appears, too.

At the Black Hat security conference in August, Miller plans to expose and provide a fix for a new breed of attack on Apple laptops that takes advantage of a little-studied weak point in their security: the chips that control their batteries.

Modern laptop batteries contain a microcontroller that monitors the power level of the unit, allowing the operating system and the charger to check on the battery’s charge and respond accordingly. That embedded chip means the lithium ion batteries can know when to stop charging even when the computer is powered off, and can regulate their own heat for safety purposes.
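
For a sense of what that chip exposes, here is a minimal sketch that reads a few standard Smart Battery System (SBS) registers over SMBus. It uses Python's smbus2 package on a Linux host; the bus number and the SBS default device address 0x0b are assumptions about any particular machine. It is a reader illustration of the interface, not Miller's tooling.

    # Minimal sketch: read standard Smart Battery System registers over SMBus.
    # Assumes a Linux host with the smbus2 package and a battery controller at
    # the SBS default address 0x0b on bus 0; details vary by machine.
    from smbus2 import SMBus

    SBS_ADDR = 0x0B             # default Smart Battery device address
    REG_TEMPERATURE = 0x08      # units of 0.1 K
    REG_VOLTAGE = 0x09          # millivolts
    REG_RELATIVE_CHARGE = 0x0D  # percent

    with SMBus(0) as bus:
        temp_k = bus.read_word_data(SBS_ADDR, REG_TEMPERATURE) / 10.0
        print(f"Temperature: {temp_k - 273.15:.1f} C")
        print(f"Voltage: {bus.read_word_data(SBS_ADDR, REG_VOLTAGE)} mV")
        print(f"Charge: {bus.read_word_data(SBS_ADDR, REG_RELATIVE_CHARGE)} %")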

When Miller examined those batteries in several Macbooks, Macbook Pros and Macbook Airs, however, he found a disturbing vulnerability. The batteries’ chips ship with default passwords, so anyone who discovers those passwords and learns to control the chips’ firmware can potentially hijack them to do anything the hacker wants. That includes permanently ruining batteries at will, and it may enable nastier tricks, like implanting hidden malware that reinfects the computer no matter how many times its software is reinstalled, or even causing the batteries to heat up, catch fire or explode. “These batteries just aren’t designed with the idea that people will mess with them,” Miller says. “What I’m showing is that it’s possible to use them to do something really bad.”

Miller discovered the two passwords used to access and alter Apple batteries by pulling apart and analyzing a 2009 software update that Apple instituted to fix a problem with Macbook batteries. Using those keys, he was soon able to reverse engineer the chip’s firmware and cause it to give whatever readings he wanted to the operating system and charger, or even rewrite the firmware completely to do his bidding.

From there, zapping the battery so that it’s no longer recognized by the computer becomes trivial: in fact, Miller permanently “bricked” seven batteries just in the course of his tinkering. (They cost about $130 to replace.) More interesting from a criminal perspective, he suggests, might be installing persistent malware on the chip that infects the rest of the computer to steal data, control its functions, or cause it to crash. Few IT administrators would think to check a battery’s firmware for the source of that infection, and if it went undiscovered, the chip could re-infect the computer again and again.

“You could put a whole hard drive in, reinstall the software, flash the BIOS, and every time it would reattack and screw you over. There would be no way to eradicate or detect it other than removing the battery,” says Miller.

That attack would require finding another vulnerability in the interface between the chip and the operating system. But Miller says that’s not much of a barrier. “Presumably Apple has never considered that as an attack vector, so it’s very possible it’s vulnerable.”

And the truly disturbing prospect of a hacker remotely blowing up a battery on command? Miller didn’t attempt that violent trick, but believes it might be possible. “I work out of my home, so I wasn’t super inclined to cause an explosion there,” he says.

In fact, the batteries he examined have other safeguards against explosions: fuses that contain an alloy that melts at high temperatures to break the circuit and prevent further charging. But Miller, who has worked for the National Security Agency and subsequently hacked everything from the iPhone to virtual worlds, believes it might still be possible. “You read stories about batteries in electronic devices that blow up without any interference,” he says. “If you have all this control, you can probably do it.”

Miller, currently a researcher with the consultancy Accuvant, isn’t the first to explore the danger of explosive batteries triggered by hackers. Barnaby Jack, a researcher with antivirus giant McAfee, says he worked on the problem in 2009, but “benched the research when I didn’t succeed in causing any lithium ion fires. Charlie has taken it a lot further and surpassed where I was at the time.”

Miller says he’s received messages from several other researchers asking him not to proceed with the battery work because it could be too dangerous. But Miller has worked to fix the problems he’s exposing. At Black Hat he plans to release a tool for Apple users called “Caulkgun” that changes their battery firmware’s passwords to a random string, preventing the default-password attack he used. Miller also sent Apple and Texas Instruments his research to make them aware of the vulnerability. I contacted Apple for comment but haven’t yet heard back from the company.

Implementing Miller’s “Caulkgun” prevents any other hacker from using the vulnerabilities he’s found. But it would also prevent Apple from using the battery’s default passwords to implement its own updates and fixes. Those who fear the possibilities of a hijacked chunk of charged chemicals in their laps might want to consider the tradeoff.

“No one has ever thought of this as a security boundary,” says Miller. “It’s hard to know for sure everything someone could do with this.”

Source:  forbes.com

Google adds malware warning to search results

Thursday, July 21st, 2011

Google announced today it is instituting a malware warning system on its search results page to alert users to the possibility that their computer is infected.

The Internet giant said it took the action after discovering unusual patterns of activity on the Web that it traced to a strain of malware that causes infected computers to send traffic to Google through proxy servers.

“Recently, we found some unusual search traffic while performing routine maintenance on one of our data centers,” Damian Menscher, a Google security engineer, said in a Google blog post. “After collaborating with security engineers at several companies that were sending this modified traffic, we determined that the computers exhibiting this behavior were infected with a particular strain of malicious software, or ‘malware.’”

The malware only affects computers running the Windows operating system, according to a post by Google engineer Matt Cutts. Systems can be tested by running a Web search for any word, he said.

Google said that as a result of its discovery, some users who come to Google through these specific intermediary servers will see a prominent notification at the top of their Google Web search results warning them of a possible infection.

“We hope that by taking steps to notify users whose traffic is coming through these proxies, we can help them update their antivirus software and remove the infections,” Menscher said.

Google’s Help Center also offers tips on scanning systems for malware and removing infections. Malware is often designed to disrupt normal computer operations or gather private information about the user.

Source:  CNET

Microsoft offers scientists data analytics tool

Monday, July 18th, 2011

Microsoft unveiled new technology today designed to give academics better tools to harness the vast quantities of data available to them.

“We’re living in a data deluge right now,” said Tony Hey, corporate vice president of Microsoft Research Connections.

Scientists generate massive amounts of data in their work in areas such as environmental science, particle physics, astronomy, and other disciplines. Analyzing that information becomes ever more cumbersome.

So Microsoft released Daytona, a tool kit that lets scientists run a wide variety of analytics and machine-learning algorithms on Windows Azure. The technology is intended to free those scientists from having to code their own software tools, giving them the ability to analyze their largest data collections and focus on their work.

“It allows scientists to do science without having to build software,” Hey said.

The genesis of Daytona came after a meeting hosted by the Office of Cyberinfrastructure at the National Science Foundation. Roger Barga, an architect in the eXtreme Computing Group in Microsoft Research, noted that the attendees asked for a computational runtime for data analysis and machine learning over their data sets. That led Microsoft to create the tool, which it is making freely available to that community.

The announcement comes during Microsoft’s 12th annual Faculty Research Summit, a gathering of computer scientists, academics, educators, and government officials at the company’s Redmond, Wash., headquarters. This year’s event, with 300 attendees, is focused on research around natural user interface technology, cloud computing, and machine learning.

Source:  CNET

Pentagon reveals cyber attack on U.S. industry as it unveils new strategy

Friday, July 15th, 2011

The Pentagon hammered home its new cyber policy Thursday by revealing a large, previously secret electronic attack on a U.S. defense contractor.

“In a single intrusion this March, 24,000 files were taken,” Deputy Defense Secretary William Lynn said at the release of an unclassified version of the new strategy to defend the U.S. military networks and critical national infrastructure.

“It is a significant concern that over the past decade, terabytes of data have been extracted by foreign intruders from corporate networks of defense companies,” Lynn said, adding that the March attack was the latest in a series of escalating attacks over the past five or six years.

He carefully avoided specifics of the March attack and would not reveal which company was hit – or which country was to blame. “It was large, it was done, we think, by a foreign intelligence service – a nation-state was behind it,” Lynn, the number-two official at the Pentagon, said during a speech and questions at the National Defense University.

This “digital thievery,” according to Lynn, is aimed at the most advanced weapons in the U.S. arsenal. “Cyber exploitation being perpetrated against the defense industry cuts across a wide swath of crucial military hardware, extending from missile tracking systems and satellite navigation devices to UAVs (unmanned aerial vehicles) and the Joint Strike Fighter,” Lynn said.

The Pentagon carefully emphasized the defensive parts of its new strategy. “Our first goal is to prevent war,” Lynn said.

But the new plan also makes clear that, if necessary, the United States will fight back. “The United States is prepared to defend itself,” Lynn said. “Just as our military organizes to defend against hostile acts from land, air and sea, we must also be prepared to respond to hostile acts in cyberspace,” he said. And that response could include what he called “a proportional and justified military response at the time and place of our choosing.”

A central challenge is to identify if and when a cyber attack would constitute an act of war that would prompt military action. “An act of war, at the end of the day, is in the eyes of the beholder,” Joint Chiefs Vice Chairman General James Cartwright said at the same rollout of the cyber strategy.

In addition to reliance on civilian power, communications and other critical civilian infrastructure networks, the Pentagon has a huge amount of electronic gear to protect – 15,000 networks and 7 million computers around the world. The WikiLeaks release of hundreds of thousands of military and diplomatic cables dramatically illustrated the inside-job vulnerability of Defense Department computers. And federal officials say that in 2008 a foreign intelligence agency penetrated the Defense Department’s classified computer system.

Both Cartwright and Lynn stressed that there is still catching up to do, as new technology and new vulnerabilities require new legislation and regulation.

And Lynn warned that threats will only worsen and become more sophisticated as rogue states and terrorists gain new cyber tools.

“The more malicious actors have not yet obtained the most harmful capabilities,” Lynn said. “But this situation will not hold forever. There will eventually be a marriage of capability and intent, where those who mean us harm will gain the ability to launch damaging cyber attacks.”

Source:  cnn.com

Beamforming your data: how WiGig will offer 7Gbps speeds

Thursday, July 14th, 2011

The Wireless Gigabit Alliance recently announced that it has published the certification-ready 1.1 specification of its wireless system, and it includes some new capabilities, like a framework for video connectors. But given that even 5GHz WiFi is notorious for spotty reception mere feet from the offending wireless router, how will WiGig, which uses an incredible 60GHz frequency, ever manage to transmit information to devices that aren’t literally pressed up against the router?

First, a quick rundown of what WiGig is. WiGig is a specification for hardware that uses 60GHz frequencies to transmit up to 7 gigabits of data per second over the air; for comparison, 802.11n WiFi tops out at a few hundred megabits per second. In other words, a download of an HD episode of Archer on WiGig would take mere seconds, even without perfect reception. The system has been in development for some years now. The WiGig Alliance recently pegged the launch of capable devices for the first half of 2012.
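
A rough back-of-the-envelope calculation shows the size of that gap; the 1.5GB episode size below is an assumed, illustrative figure.

    # Back-of-envelope transfer times; the 1.5 GB file size is illustrative.
    episode_bytes = 1.5e9
    for name, bits_per_sec in [("802.11n (300 Mbps)", 300e6), ("WiGig (7 Gbps)", 7e9)]:
        print(f"{name}: {episode_bytes * 8 / bits_per_sec:.0f} s")
    # 802.11n (300 Mbps): 40 s
    # WiGig (7 Gbps): 2 s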

WiGig is sufficiently advanced to have its own IEEE 802.11 standard, designated 802.11ad. Using it at the time of release will require some new hardware, both to send and to receive the signal, much as when 5GHz WiFi first made its way onto the market. Unlike 5GHz WiFi, though, WiGig’s design includes methods for avoiding the decay problems that higher-frequency transmissions usually have.

To overcome signal decay, WiGig uses a process called adaptive beamforming (it’s not the first or only system to do so, but is heavily reliant on it). With a combination of physical antennas on the devices and algorithms to tune the signal, WiGig devices effectively shoot their signals back and forth at each other in a narrow, targeted beam.

Antennas in the devices (say, a router) each cover a broad reception area in which the router can see devices. When a device that wants to use the 60GHz connection is brought into that area, it begins communicating with the antennas to fine-tune their signals and maximize connection speed.

The antennas do this by adjusting both the amplitudes and the phase shifts of their broadcast waves. Reception is optimized by minimizing the error between the antenna’s output and the expected signal, for example, or by maximizing the signal-to-interference ratio.

When the phase shifts and amplitudes of multiple wave sources are tweaked to work together and combine in the right way, they create “lobes” of excellent reception. An unfortunate side effect is that there are also null areas outside the lobes, with no coverage at all, which doesn’t bode well for WiGig’s ability to blanket a particular area with simultaneous reception, at least not without an army of routers and antennas.
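
To see where the lobes and nulls come from, here is a small numpy sketch of the underlying phased-array math: each element's phase is chosen so the waves add coherently toward a target angle, and the same interference that builds the main lobe carves out nulls elsewhere. The element count, spacing, and steering angle are illustrative assumptions, not WiGig parameters.

    # Sketch of beamforming geometry: a uniform linear array with
    # half-wavelength spacing, phased to steer its main lobe toward 30 degrees.
    import numpy as np

    n_elements = 8
    d = 0.5                    # element spacing in wavelengths
    target = np.deg2rad(30.0)  # desired beam direction

    # Phase weights that make every element add in phase toward the target.
    n = np.arange(n_elements)
    weights = np.exp(-2j * np.pi * d * n * np.sin(target))

    # Array factor: combined signal strength versus direction.
    angles = np.deg2rad(np.linspace(-90, 90, 721))
    steering = np.exp(2j * np.pi * d * np.outer(n, np.sin(angles)))
    af = np.abs(weights @ steering) / n_elements

    print(f"Gain toward 30 deg: {af[np.abs(angles - target).argmin()]:.2f}")  # ~1.0
    print(f"Deepest null: {af.min():.4f}")  # near zero: a dead spot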

As long as a device is within range of a particular antenna on the router, the antenna and receiver can run the digital optimization process fairly quickly to establish a concentrated signal beam. But “quickly” is as specific as the WiGig Alliance’s published materials get, so establishing a connection could take anywhere from a fraction of a second to multiple seconds, or longer.

The time it takes to establish the fastest beam could also depend on the quality of equipment you pick up. But even factoring that in, we’re not sure we’ll be able to stroll around holding our WiGig-capable devices and maintain the 60GHz signal. Still, the specifications say the equipment must be able to fall back on 2.4GHz and 5GHz signals in the event that it loses the higher-speed connection.

Mobility isn’t the only downfall of WiGig, though. According to the WiGig Alliance, compliant equipment needs line-of-sight to receiving devices for beamforming to work well. Even a person stepping between two communicating devices can break the signal, though according to a whitepaper by the group, WiGig-compliant equipment can bounce beams off walls and ceilings to reach between devices.

According to the specifications, devices can work over distances “beyond 10 meters,” but it seems walls and ceilings will be an even bigger obstacle for 60GHz WiFi than they already are for 2.4GHz and 5GHz signals. Bouncing the signal may work in some setups, but not all; no one will know whether a single router can cover more than one room until there is actual hardware to try.

One of the more interesting capabilities of WiGig is its built-in support for audio-visual equipment, including HDMI and DisplayPort interfaces. In theory, this means you could plug some kind of dongle into your TV’s HDMI port that participates in all of the signal-optimization processes, connect your computer to it over the superfast WiFi, and stream all the HD video you can get your hands on (provided your computer, router, and dongle are all 802.11ad-capable and within range of each other, of course).

In the same vein, the new specifications include compatibility with USB and PCIe interfaces. This way, potential users who want to get on board with WiGig immediately will be able to hook their computers up with dongles as well, instead of having to buy and install new wireless cards.

As of the second half of this year, manufacturers and adopters will be able to start testing and certifying their compatible devices with the 1.1 specification. WiGig dates have been pushed before, but we’d say 2012 is looking pretty good as the year we get our hands on some blazing, if still slightly fragile, WiFi.

Source:  arstechnica.com

Turning radio waves into power (with circuits printed on paper)

Monday, July 11th, 2011

Researchers at Georgia Tech have found a way to harvest energy from electromagnetic waves in the air. The harvesting devices are produced using an inkjet printer and can collect small amounts of power from a wide band of frequencies: everything from FM radio up to radar.

The technology isn’t new: researchers have floated concepts (and a few devices) that can harvest energy from ambient WiFi signals and other small sources, but these are usually able to pull power only from tiny slices of the electromagnetic spectrum (perhaps just a few kHz). The new system can draw energy from much wider electromagnetic swaths: 100MHz to 15GHz.

Even better, the sensors that harvest the energy are simple to make. To print the circuits on paper or paper-like polymers, the researchers use an inkjet printer and add an emulsion of nanoparticles. Circuits printed on polymers are currently less advanced, but the scientists say they have a wider range and can harvest energy from frequencies up to 60GHz.

Gadgets such as cell phones could one day use residual radio signals to supplement their own batteries, but the amounts of energy harvested are small (on the order of 50 milliwatts), and the system won’t currently make even small consumer devices self-sustaining.

However, if the energy is allowed to build up in a small capacitor, it could temporarily power low-energy intermittent devices like temperature sensors or could supplement other energy-gathering mechanisms like solar panels. The harvesters could also function as mission-critical stopgaps, allowing a system to maintain essential functions or send out a distress signal until it can be fixed.
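
A quick worked example shows how a small trickle of power becomes useful once banked in a capacitor; every component value below is an illustrative assumption, not a figure from the Georgia Tech work.

    # Illustrative duty-cycling math: bank harvested power in a capacitor,
    # then spend it in short sensor bursts. All component values are assumptions.
    harvested_w = 0.050           # 50 mW harvested (figure from the article)
    cap_f = 0.1                   # 0.1 F supercapacitor
    v_full, v_empty = 3.3, 1.8    # usable voltage window

    usable_j = 0.5 * cap_f * (v_full**2 - v_empty**2)      # E = (1/2) C V^2
    print(f"Usable energy: {usable_j:.2f} J")              # ~0.38 J
    print(f"Charge time: {usable_j / harvested_w:.1f} s")  # ~7.7 s

    burst_w, burst_s = 0.1, 0.5   # sensor wakes, measures, transmits
    print(f"Bursts per charge: {usable_j / (burst_w * burst_s):.1f}")  # ~7.6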

Source:  arstechnica.com

Microsoft Security Bulletin Advance Notification for July 2011

Friday, July 8th, 2011

Microsoft Security Bulletin Advance Notification issued: July 7, 2011
Microsoft Security Bulletins to be issued: July 12, 2011

Executive Summary

  • Bulletin 1: Critical (Remote Code Execution); requires restart; affects Microsoft Windows
  • Bulletin 2: Important (Elevation of Privilege); requires restart; affects Microsoft Windows
  • Bulletin 3: Important (Elevation of Privilege); requires restart; affects Microsoft Windows
  • Bulletin 4: Important (Remote Code Execution); may require restart; affects Microsoft Office

Affected Software

Windows Operating System and Components

Windows XP (aggregate severity: Bulletin 1 none, Bulletins 2 and 3 Important)
  • Windows XP Service Pack 3: Bulletin 1 not applicable; Bulletins 2 and 3 Important
  • Windows XP Professional x64 Edition Service Pack 2: Bulletin 1 not applicable; Bulletins 2 and 3 Important

Windows Server 2003 (aggregate severity: Bulletin 1 none, Bulletins 2 and 3 Important)
  • Windows Server 2003 Service Pack 2: Bulletin 1 not applicable; Bulletins 2 and 3 Important
  • Windows Server 2003 x64 Edition Service Pack 2: Bulletin 1 not applicable; Bulletins 2 and 3 Important
  • Windows Server 2003 with SP2 for Itanium-based Systems: Bulletin 1 not applicable; Bulletins 2 and 3 Important

Windows Vista (aggregate severity: Bulletin 1 Critical, Bulletins 2 and 3 Important)
  • Windows Vista Service Pack 1 and Service Pack 2: Bulletin 1 Critical; Bulletins 2 and 3 Important
  • Windows Vista x64 Edition Service Pack 1 and Service Pack 2: Bulletin 1 Critical; Bulletins 2 and 3 Important

Windows Server 2008 (aggregate severity: Bulletin 1 none, Bulletins 2 and 3 Important)
  • Windows Server 2008 for 32-bit Systems and Service Pack 2*: Bulletin 1 not applicable; Bulletins 2 and 3 Important
  • Windows Server 2008 for x64-based Systems and Service Pack 2*: Bulletin 1 not applicable; Bulletins 2 and 3 Important
  • Windows Server 2008 for Itanium-based Systems and Service Pack 2: Bulletin 1 not applicable; Bulletins 2 and 3 Important

Windows 7 (aggregate severity: Bulletin 1 Critical, Bulletins 2 and 3 Important)
  • Windows 7 for 32-bit Systems and Service Pack 1: Bulletin 1 Critical; Bulletins 2 and 3 Important
  • Windows 7 for x64-based Systems and Service Pack 1: Bulletin 1 Critical; Bulletins 2 and 3 Important

Windows Server 2008 R2 (aggregate severity: Bulletin 1 none, Bulletins 2 and 3 Important)
  • Windows Server 2008 R2 for x64-based Systems and Service Pack 1*: Bulletin 1 not applicable; Bulletins 2 and 3 Important
  • Windows Server 2008 R2 for Itanium-based Systems and Service Pack 1: Bulletin 1 not applicable; Bulletins 2 and 3 Important

Excerpt from:  microsoft.com

Microsoft: No botnet is indestructible

Thursday, July 7th, 2011

‘Nothing is impossible,’ says Microsoft attorney, countering claims that the TDL-4 botnet is untouchable

No botnet is invulnerable, a Microsoft lawyer involved with the Rustock takedown said, countering claims that another botnet was “practically indestructible.”

“If someone says that a botnet is indestructible, they are not being very creative legally or technically,” Richard Boscovich, a senior attorney with Microsoft’s Digital Crimes Unit, said Tuesday. “Nothing is impossible. That’s a pretty high standard.”

Instrumental in the effort that led to the seizure of Rustock’s command-and-control servers in March, Boscovich said Microsoft’s experience in takedowns of Waledac in early 2010 and of Coreflood and Rustock this year show that any botnet can be exterminated.

“To say that it can’t be done underestimates the ability of the good guys,” Boscovich said. “People seem to be saying that the bad guys are smarter, better. But the answer to that is ‘no.’”

Last week, Moscow-based Kaspersky Labs called the TDL-4 botnet “the most sophisticated threat today,” and argued that it was “practically indestructible” because of its advanced encryption and use of a public peer-to-peer (P2P) network as a fallback communications channel for the instructions issued to infected PCs.

Takedowns like those of Waledac, Rustock and Coreflood have relied on seizing the primary command-and-control (C&C) servers, then somehow blocking the botnet’s compromised computers from accessing alternate C&C domains for new instructions.

By doing both, takedowns decapitate the botnet, let researchers or authorities hijack the botnet, and prevent hackers from updating their malware or giving the bots new orders. That also gives users time to use antivirus software to clean their systems of the infections.

Kaspersky senior malware researcher Roel Schouwenberg said that TDL-4’s use of P2P made the botnet an extremely tough nut.

“Any attempt to take down the regular C&Cs can effectively be circumvented by the TDL group by updating the list of C&Cs through the P2P network,” Schouwenberg said last week. “The fact that TDL has two separate channels for communications will make any takedown very, very tough.”

Boscovich disagreed, noting that the February 2010 takedown of Waledac successfully suppressed that botnet’s P2P command channel.

“[Waledac] was a proof of concept that showed we are able to poison the peer-to-peer table of a botnet,” Boscovich said.

“Each takedown is different, each one is complicated in its own way,” said Boscovich. “Each one is going to be different, but that doesn’t mean that there cannot be a way to do this with any botnet.”

Alex Lanstein, a senior engineer with FireEye who worked with Microsoft on the Rustock takedown, said that the relationships Microsoft has built with others in the security field, with Internet service providers, and with government legal agencies like the U.S. Department of Justice and law enforcement were the most important factors in its ability to take down any botnet.

“It’s the trust relationships Microsoft has created” that have led to successful takedowns, said Lanstein. “And I think [the technique] speaks to any malware infrastructure where some kind of data feed exists. It really, really works.”

Those who disagree with Boscovich and Lanstein include not only Kaspersky’s Schouwenberg, but also Joe Stewart, director of malware research at Dell SecureWorks and an internationally known botnet expert.

“I wouldn’t say it’s perfectly indestructible, but it is pretty much indestructible,” Stewart said in an interview last week about TDL-4. “It does a very good job of maintaining itself.”

But SecureWorks also acknowledged Microsoft’s takedown chops, saying that its own statistics show that Rustock attacks have dropped tenfold since March.

“Since mid-March 2011, Dell SecureWorks’ CTU [Counter Threat Unit] research team has seen a significant decline in the number of attempted Rustock attacks, and we do attribute it to the comprehensive efforts of Microsoft,” a SecureWorks spokeswoman said Tuesday.

“With the Rustock takedown, Microsoft has built the framework for others to do the same,” Lanstein said. “This is definitely not the last botnet we’re going to go after.”

He declined to name the next likely target, saying that doing so would tip Microsoft and FireEye’s hand.

Source:  computerworld.com

Does Office 365 signal the end of Small Business Server?

Wednesday, July 6th, 2011

Office 365 is being touted as the perfect solution for the same small businesses currently using Microsoft’s Small Business Server to get Exchange and SharePoint. Will the cloud kill SBS?

On June 28, Microsoft officially launched Office 365, the cloud-based service that serves as the successor to BPOS (Business Productivity Online Suite). It’s designed to provide Exchange, SharePoint, and Lync (formerly Office Communications Server) to small and mid-sized businesses as hosted services. It also includes the Microsoft Office desktop applications (equivalent to Office Professional Plus edition). Office Web Apps, the online versions of Word, Excel and PowerPoint, can also be used.

There are three different plans available:

  • One for small businesses that don’t have an IT department
  • One for mid-size organizations and enterprises that do have IT staff, and
  • One that’s directly targeted at educational institutions

If the small-business version sounds a lot like the same market Microsoft targeted with Windows Small Business Server (SBS), that’s because it is. SBS 2011 is an “all-in-one” server product that integrates Windows Server with IIS Web Server, Microsoft Exchange Server, and Windows SharePoint Services (and, with the Premium add-on, SQL Server, Hyper-V, and Remote Desktop Services).

Microsoft released the latest version of SBS last December, but I’ve heard rumblings in the small business community about whether it might be the last one. Like the fears for the future of Forefront Threat Management Gateway 2010 (TMG), this is fueled by Microsoft’s big push toward cloud computing, and the cloud is something that many see as a particularly compelling option for small businesses — the very organizations that currently use SBS. Are those fears unfounded?

The small business dilemma

SBS 2011 Standard Edition is designed for (and limited to) no more than 75 users and devices (more about SBS 2011 Essentials later). Companies of that size often have no professional IT staff. They may have a person who does IT duties on a part-time basis along with his/her primary job, or they may have an independent contractor who comes in to do periodic network administrative tasks (and is called in a panic when something goes wrong). Either way, unless the business itself is IT-related, maintaining on-premise servers involves a substantial “hassle factor.”

SBS takes some of the complexity out of setting up and maintaining a functional company network with enterprise-class components such as email and collaboration, as well as lowering the licensing cost for small organizations. But even administering SBS is too much work for some companies with limited personnel resources. In fact, many small businesses have used hosted services for at least some of their IT functions for many years. Hosted web services and hosted email services have been most commonplace.

SBS to the cloud — only half in?

Microsoft addressed this by coming out with an edition of SBS 2011 called “SBS 2011 Essentials.” It’s a hybrid solution that aims to ease very small businesses into the cloud, by integrating on-premise and cloud-based software. The SBS on-site server acts as a domain controller that authenticates users and then passes them through to hosted Exchange and SharePoint services that are accessed over the Internet — i.e., Office 365. Thus you get single sign-on for on-premise and cloud-based applications. When it comes to administration, simplicity is the name of the game, with the server using a dashboard console that’s very similar to that of Windows Home Server.

Because it is limited to 25 or fewer users/devices (no CALs required), the Essentials edition won’t be an option for those organizations with between 25 and 75 users. They’ll need to stick with the Standard edition, which doesn’t provide the cloud integration.

Given Microsoft’s “all-in” commitment to the cloud, you have to wonder why they imposed the 25-user limit on the cloud-based version of SBS. What happens if a business with fewer than 25 users grows so that it now has 30 users? Does that mean they have to switch from the cloud-based networking model to an on-premise model? Assuming they want to keep using SBS, it would seem to. How does that make sense?

And to make matters worse, there appears to be no upgrade path from the Essentials edition to the Standard edition. Of course, you can still use Office 365 with a network based on SBS 2011 Standard edition, but you don’t get the out-of-the-box integration, and you pay for on-premise versions of applications that you aren’t going to use.

Is SBS Standard on the way out?

Some of the folks who have been happily using SBS to service their small businesses and who aren’t particularly interested in going to the cloud are worried that the debut of Office 365 signals the beginning of the end for their simple, low-cost server solution. There does seem to be more excitement around the Essentials product than the Standard, and I’ve noticed that the Microsoft web sites often put Essentials first when they discuss the editions. Is that a subtle clue that they’re planning to eventually throw the traditional (non-cloud) version of SBS under the bus?

That notion might seem a bit more outlandish if not for the demise, in June 2010, of SBS’s “mid-size sibling,” Essential Business Server. EBS (also known by its code name, Centro) was released in 2008 with much fanfare and was designed for organizations with 250-300 users. It included Windows Server 2008, Exchange Server 2007, ISA Server, and Forefront Security. One reason Microsoft gave for killing EBS was a desire to streamline its product lineup.

You can’t blame people for wondering if more streamlining is about to take place. With rumors swirling around the future (or possible lack thereof) of other non-cloud products such as TMG and indications that the company is trying to regain more focus by reining in its recent “finger-in-every-pie” strategy, SBS (or at least the Standard edition) looks like a logical candidate to get the axe.

SBS fans may take comfort in the statement in the EBS TechNet blog post that announced the end of that product, which assures us that “we are working hard to build the next version of SBS and look forward to a second decade of success with this award-winning small business offering” [emphasis mine]. Pessimists will point out that this was written before the “all-in-with-the-cloud” philosophy came to dominate.

Reasons to let it die

Many will argue that SBS Standard has outlived its usefulness. They believe the cloud is the only sensible choice for small businesses, and they point to:

  • cost savings (especially up-front costs and unexpected emergency maintenance costs, as well as personnel costs)
  • the ability to cut the cord of dependency on IT consultants (on which many small organizations rely because they don’t have in-house IT staff)
  • greater hardware/software reliability and less downtime, including the redundancy and frequent backups that cloud providers implement as part of standard operating procedure and that small businesses often can’t afford to implement (or don’t implement well) on their own
  • better security, due to standard practices and more money to invest in security measures on the part of cloud providers

Microsoft’s Office Division president, Kurt DelBene, expects that more than half of small businesses will adopt Office 365 within ten years. And you have to admit that for a small company, it’s pretty compelling. For $6 per month per user, you get a lot: an Exchange account (with a 25GB mailbox), Office Web Apps, SharePoint Online, even Lync instant messaging and online meetings. That last one is especially interesting: an OCS server is something few small businesses could afford, or had the expertise to support on-site, in the past.

Despite the early start enjoyed by Google Apps, many agree with Gene Marks at Forbes, who believes Microsoft will win the small business cloud war. But in a sense, on-premise solutions such as SBS compete with Microsoft’s cloud offerings. The company might not want to fragment its efforts by putting more development resources into SBS if that market is shrinking.

Reasons to keep it alive

All the arguments above sound good, and if IT decisions were made by robots, dropping SBS Standard might make a lot of sense. However, those decisions are made by humans, and there is still a great deal of human resistance and distrust of cloud computing — perhaps especially among small business owners.

Corporate decisions are made by managers who report to officers who report to boards that are concerned primarily with the bottom line. Those officers and managers have, by necessity, gotten used to delegating responsibilities and trusting others to carry out various tasks. They probably have experience with outsourcing some duties.

The entrepreneurs who run most small businesses tend to be control freaks (and I don’t say that in a derogatory way; I am one of those entrepreneurial control freaks). We worry a lot, and we worry more when we can’t see what’s going on. We hate to fly — not because we’re afraid of heights but because the plane is controlled by a pilot we don’t know, whose competency we can’t be sure of, sitting behind a locked cockpit door, and who we have no power to fire in mid-flight if he doesn’t do a good job.

Likewise, having our data and applications living in some remote data center and being managed and manipulated by persons unknown makes us very uncomfortable. We may come around to the cloud idea, but we’ll do it slowly. In the meantime, we like having a cost-effective, relatively easy-to-maintain solution to meet our somewhat modest IT needs.

Another reason Microsoft should keep investing in SBS is choice. Even if they weigh the options and decide that the cloud makes sense for them, small businesspeople like having choices. We don’t want to be tied in to just one possible way to do things, even if that would simplify our lives. If Microsoft doesn’t give us choices (that we can afford), we might just look elsewhere for them.

Conclusion

Office 365 is an exciting option for many small businesses, but I hope Microsoft doesn’t decide it’s such a great choice that it should be the only choice, and discontinue the Standard edition of SBS, which has served many companies well for many years. While, in general, I think Microsoft should stop trying to be all things to all customers and stay on a strategic business course that will unify its product offerings, I also think they need to recognize that the migration to the cloud may come about more slowly in the SMB market than they’re anticipating. In the meantime, more choices mean more customers and more customer goodwill. And speaking of choice, how about increasing the user limit for SBS Essentials so companies with up to 75 users can choose whichever of the SBS 2011 editions (cloud-based or not) best fits their needs?

Source:  techrepublic.com

Gigabit Wi-Fi Spec Edges Closer to Reality

Wednesday, July 6th, 2011

Gigabit Wi-Fi edged closer to reality this week with the release of the latest version of an industry specification. Available to members of the WiGig Alliance, the document will be the basis for an early round of interoperability tests in the fall.

The testing round will be the first time Alliance members can see how well their prototype products, using a short-range Wi-Fi radio that operates in the 60 GHz band, can connect with each other. The 1.1 version of the WiGig specification, which is now in sync with the standard being crafted by the IEEE 802.11ad working group, will create a Wi-Fi radio link that will support up to 7Gbps, over fairly short distances, such as one or more rooms in a home.

It’s intended to tie together a range of high-throughput gear, including consumer electronics such as high-definition TVs, computers and network storage.

The initial 1.0 release was published in May 2010. Since then, based on feedback from Alliance members, the group has fine-tuned it with a range of enhancements.

The 1.1 release is “certification ready,” according to WiGig chairman and president Ali Sadri, who also is head of Intel’s Mobile Wireless Group. It will be used as the basis for creating a series of certification tests, a joint project between the Alliance and the separate Wi-Fi Alliance. Sadri says the certification process should be ready during the second half of 2012 and will be overseen by the WFA. In the past, Wi-Fi products often have arrived on the market shortly after winning certification.

Also part of this week’s announcement are a new peripherals extension, and a new partnership to support HDMI products such as flat-panel TVs.

The WiGig Bus Extension (WBE) specification and the companion WiGig Serial Extension (WSE) support wireless connections with storage devices and other high-speed peripherals. The Bus Extension is now available to member companies; the Serial Extension is being finalized for release later in 2011.

The Alliance announced it has become an adopter of the HDMI specification, through HDMI Licensing LLC, in order to adapt the WiGig Display Extension spec for HDMI mapping; the adaptation will be released later this year. The Alliance previously released, with VESA, a similar adaptation for DisplayPort certification. Each will let WiGig devices wirelessly connect to DisplayPort monitors or HDMI TVs.

Source:  pcworld.com

Why is it so hard to push good .Net patches?

Wednesday, July 6th, 2011

June’s ‘Black Tuesday’ patches to Windows’ .Net Framework library have admins hopping mad — again

Why can’t Microsoft turn out decent patches for its sprawling .Net Framework? That’s what I — and about a million admins all over the world — want to know.

Last month’s Black Tuesday .Net patches, MS11-039 and MS11-044, set new lows, even for Microsoft, even for .Net patches. There’s a list of known problems with MS11-044, documented in KB 2538814, that’s as long as your arm. As long as Michael Jordan’s arm, for that matter — and those are just the problems Microsoft has fessed up to.

Susan Bradley updates a lot of different Windows systems and, as a Microsoft MVP for Small Business Server, catches a lot of flak from other admins who are trying to keep their boxes running. She puts it succinctly: “I think Microsoft ought to be ashamed of how difficult it is to keep .Net updated. I don’t come to this conclusion lightly.”

Indeed — in the past year, Microsoft has published seven Security Bulletins for .Net Framework 3.5 or 4 (MS10-041, MS10-060, MS10-070, MS10-077, MS11-028, MS11-039, and MS11-044). Most of the Security Bulletins have at least a handful of acknowledged, documented problems. MS11-028 was withdrawn and re-issued. MS10-077 was re-issued with a “detection change”; in other words, the installer didn’t work right; Microsoft pulled the patch and then posted a different version. MS10-070 was pulled and re-issued; the documentation is now up to version 4.1 and counting. MS10-041 was pulled, re-issued, then pulled and re-issued again. You get the picture.

Why is it so incredibly difficult to issue reliable patches for .Net?

For starters, .Net isn’t a nice, neat bundle. It’s a spread-out mess of issued, updated, and redacted components. In many cases, admins have to keep old versions around because new versions can break old apps. It isn’t unusual to find Windows PCs running three or more different copies of .Net Framework. I’ve seen versions 1.1, 2.0 SP 1, 2.0 SP 2, 3.0, 3.0 SP 2, 3.5, 3.5 SP 1, 4 Client, and 4 Full on fairly recent machines.

My production machine is currently running .Net Framework 2.0 SP 2, 3.0 SP 2, 3.5 SP 1, 4 Client, and 4 Full. That’s five different versions of .Net, all installed on the same Windows 7 PC. I don’t dare remove any of them manually, for fear of breaking an application that relies on a specific version.
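
If you would rather script that inventory than eyeball it, a short sketch along these lines lists the installed versions by walking the documented "NET Framework Setup\NDP" registry key. It is Windows-only, uses the standard-library winreg module, and is offered as a reader convenience, not Microsoft's utility.

    # Sketch: enumerate installed .NET Framework versions from the registry.
    import winreg

    NDP = r"SOFTWARE\Microsoft\NET Framework Setup\NDP"

    def walk(key, path):
        try:
            version = winreg.QueryValueEx(key, "Version")[0]
            print(f"{path}: {version}")
        except FileNotFoundError:
            pass  # intermediate key with no Version value (e.g. "v4")
        i = 0
        while True:
            try:
                sub = winreg.EnumKey(key, i)
            except OSError:
                break  # no more subkeys
            with winreg.OpenKey(key, sub) as child:
                walk(child, path + "\\" + sub)
            i += 1

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, NDP) as root:
        walk(root, "NDP")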

One of the problems is so common — receiving a Windows Update error code 0x643 or Windows Installer error code 1603 — that Microsoft points people to a single Knowledge Base article that describes how to dislodge the automatic update mechanism and try again. The Knowledge Base article says the trick works with .Net 1.0, 1.1, 2.0, 3.0, and 3.5. One can only wonder why it took Microsoft so long to fix the faulty installer logic.

Ultimately, the problem stems from the fact that .Net is more like an operating system within an operating system — a collection of sometimes-conflicting libraries and protocols that have evolved significantly over the past 10 years, with hooks deep into every nook and cranny of every Windows version. Still, if Microsoft can produce somewhat reliable patches to the Windows kernel, why does it keep dropping the ball with .Net?

While many of us wonder how Microsoft will support .Net in Windows 8 (see Joab Jackson’s analysis in Developer World), experienced admins also question how in the world Microsoft will patch it. Immersive apps with the new Windows Division-written CLR sure sound nice, but who’s going to keep the bloody thing working?

Curious about which versions of .Net are running on your computers? Download and run the .Net Framework Setup Verification Utility, no installation necessary. The drop-down box lists all of the detected versions.

Source:  infoworld.com

Security company infects client’s network with ‘Trojan mouse’

Sunday, July 3rd, 2011

By loading a USB mouse with malware and exploiting end-user blabbing, NetraGard succeeds in infecting a client’s network

Security consulting company NetraGard has demonstrated that something as seemingly innocuous as a USB mouse, along with tidbits of information freely available on the Internet, can provide a hacker quick and easy access to a seemingly secure IT environment.

In a blog post on the company’s website, NetraGard founder Adriel Desautels explained that his company was hired to test the security of a client’s network while adhering to some very stringent restrictions: the NetraGard team could target only one IP address, which offered no services and was bound to a firewall. Further, the team couldn’t use social engineering tactics, such as duping an employee into revealing information over the phone or via email, and it couldn’t physically access the client’s campus.

NetraGard’s solution: transform a Logitech USB mouse into an HID (hacker interface device) by installing in it a mini-controller and a micro flash drive loaded with custom malware. The blog post details the painstaking process of operating on the mouse.

Once plugged into a USB port, the altered mouse launched its payload, spreading the malware across the client’s network. NetraGard specified that the malware it used was a homegrown variety that doesn’t do any actual damage; it simply spreads like a virus so its progress can be tracked.

Desautels wrote that his team discovered the client used McAfee antivirus software, thanks to public Facebook posts made by the client’s employees. With that information, NetraGard was able to groom its homegrown malware to circumvent McAfee’s antivirus wares.

The final step: NetraGard purchased a list of its client’s employees from Jigsaw and used the info to choose a target employee to ship the mouse to. Desautels wrote that his team repackaged the mouse so that it appeared brand new and included in the shipment marketing materials so as to make it look like a promotional gadget. Three days later, he wrote, the device called home, indicating that the breach was a success.

The anecdote is a sobering reminder of how susceptible networks are today, even those of seemingly well-prepared companies. Notably, end-users were once again among the weakest links that enabled the attack: NetraGard was able to tailor its malware to circumvent the client’s antivirus software because employees blabbed publicly on Facebook that their company used McAfee.

Further, an end-user was responsible for plugging in a mouse received in the mail. Said mouse was well disguised as a legitimate promotional device, in much the same way that today’s phishing emails and scareware are designed to look as though they came from legitimate companies.

If anything, the scenario should prompt IT admins to develop or revise their in-house policies and controls — and to educate end-users accordingly — so as to reduce the chance of these types of breaches. It’s conceivable that cyber criminals in the near future will embrace similar techniques with any type of USB device.

Source:  infoworld.com

Security researchers discover ‘indestructible’ botnet

Friday, July 1st, 2011

More than four million PCs have been enrolled in a botnet security experts say is almost “indestructible”.

The botnet, known as TDL, targets Windows PCs and is difficult to detect and shut down.

Code that hijacks a PC hides in places security software rarely looks and the botnet is controlled using custom-made encryption.

Security researchers said recent botnet shutdowns had made TDL’s controllers harden it against investigation.

The 4.5 million PCs have become victims over the last three months following the appearance of the fourth version of the TDL virus.

The changes introduced in TDL-4 made it the “most sophisticated threat today,” wrote Kaspersky Labs security researchers Sergey Golovanov and Igor Soumenkov in a detailed analysis of the virus.

“The owners of TDL are essentially trying to create an ‘indestructible’ botnet that is protected against attacks, competitors, and anti-virus companies,” wrote the researchers.

Recent successes by security companies and law enforcement against botnets have led to spam levels dropping to about 75% of all e-mail sent, according to analysis by Symantec.

A botnet is a network of computers that have been infected by a virus that allows a hi-tech criminal to use them remotely. Often botnet controllers steal data from victims’ PCs or use the machines to send out spam or carry out other attacks.

The TDL virus spreads via booby-trapped websites and infects a machine by exploiting unpatched vulnerabilities. The virus has been found lurking on sites offering porn and pirated movies as well as those that let people store video and image files.

The virus installs itself in a system file known as the master boot record. This holds the list of instructions to get a computer started and is a good place to hide because it is rarely scanned by standard anti-virus programs.
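
To make that concrete: the master boot record is simply the first 512-byte sector of the disk, ending in the 0x55AA boot signature. Here is a minimal read-only sketch that inspects it; it needs root privileges, and the Linux device path is an assumption about your system.

    # Read-only sketch: inspect a disk's master boot record (first 512 bytes).
    # Requires root; the device path assumes a Linux system.
    DISK = "/dev/sda"

    with open(DISK, "rb") as disk:
        mbr = disk.read(512)

    signature_ok = mbr[510:512] == b"\x55\xaa"       # standard 0x55AA marker
    print("Boot signature present:", signature_ok)
    print("First bootstrap bytes:", mbr[:16].hex())  # code AV rarely inspects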

The biggest proportion of victims, 28%, are in the US but significant numbers are in India (7%) and the UK (5%). Smaller numbers, 3%, are found in France, Germany and Canada.

However, wrote the researchers, it is the way the botnet operates that makes it so hard to tackle and shut down.

The makers of TDL-4 have cooked up their own encryption system to protect communication between the botnet and those controlling it. This makes it hard to do any significant analysis of traffic between hijacked PCs and the botnet’s controllers.

In addition, TDL-4 sends out instructions to infected machines using a public peer-to-peer network rather than centralised command systems. This foils analysis because it removes the need for command servers that regularly communicate with infected machines.
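
A toy control-flow sketch (invented names, no real protocol or payload) makes the takedown problem concrete: a bot that can learn fresh command domains from its peers keeps receiving orders even after every centralised domain has been seized.

    # Toy illustration of a fallback channel; all names are invented.
    SEIZED = {"cc1.example", "cc2.example"}       # centralised domains taken down
    PEER_KNOWLEDGE = {"peer-a": ["cc3.example"]}  # P2P network still reachable

    def try_contact(name):
        """Stand-in for a network call: seized hosts return nothing."""
        if name in SEIZED:
            return None
        if name in PEER_KNOWLEDGE:
            return PEER_KNOWLEDGE[name]           # peers hand back fresh domains
        return "orders from " + name              # a live server answers

    def fetch_orders(domains, peers):
        for d in domains:
            if (orders := try_contact(d)):
                return orders
        for p in peers:                           # fallback channel
            if (fresh := try_contact(p)):
                return fetch_orders(fresh, peers)
        return None                               # only if both channels blocked

    print(fetch_orders(["cc1.example", "cc2.example"], ["peer-a"]))
    # -> "orders from cc3.example": seizing domains alone did not silence the bot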

“For all intents and purposes, [TDL-4] is very tough to remove,” Joe Stewart, director of malware research at Dell SecureWorks, told Computerworld. “It’s definitely one of the most sophisticated botnets out there.”

However, the sophistication of TDL-4 might aid in its downfall, said the Kaspersky researchers, who found bugs in the complex code. These let them pry into databases logging how many infections TDL-4 had racked up, and were aiding their investigation into its creators.

Source:  BBC