Jan 24, 2009

Top 10: Obama takes charge, Ballmer lays it on the line

January 24, 2009, 08:51 PM — IDG News Service

Barack Obama became the 44th, and first African-American, president of the U.S. on an Inauguration Day that captured the world's attention and provided a flurry of technology-related headlines. The post-inaugural buzz ended quickly, though, as dismal economic news piled up (again) this week. On a happier note, Apple did well with its quarterly financials and celebrated the 25th birthday of the Mac.

1. The Obama Administration tech watch: Barack Obama was sworn in as the 44th U.S. president Tuesday, with Inauguration Day coverage blanketing the Web, including sites that typically do not have much to do with political news. There were any number of tech-related stories pegged to the big day, including how mobile-phone networks and the Internet held up. A tech-savvy president -- Obama publicly declared he was not going to give up his BlackBerry despite security issues -- will undoubtedly mean more emphasis on tech policy and legislation.

2. Ballmer provides grim outlook as economy 'resets': Leave it to Steve Ballmer to cut to the chase -- the economic recession will drag out for a year or two, leading to a "reset" of the economy from which there will not be a "bounce" back to previous levels of prosperity. He delivered his grim assessment, which he did tinge with some notes of optimism about IT in general and Microsoft specifically, on the heels of news that his company's quarterly net income took a dive and that 5,000 employees are being laid off. Microsoft wasn't the only bearer of lousy financial news this week, as noted in number five.

3. Heartland data breach could be bigger than TJX's and Heartland breach raises questions about PCI standard's effectiveness: Heartland Payment Systems, which provides credit- and debit-card processing services, disclosed a massive data breach that could rival that of TJX, though specific numbers were unclear. However, Heartland processes 100 million card transactions per month, so analysts suspect that the breach was huge. (The TJX breach compromised more than 45 million cards.) The Heartland breach has led to renewed concerns that the Payment Card Industry data security standard required by Visa and MasterCard is not up to the task of ensuring data safety.

4. Microsoft to deliver first IE8 release candidate Monday: The first release candidate of Internet Explorer 8 will be out Monday, according to sources who know what Microsoft is up to with its next browser version. A release candidate, in Microsoft parlance, is software with all the features set to go and code that is stable. It also means that a final release of the software is imminent.

5. Wall Street beat: Recession whacks IT earnings: Besides Microsoft, Nokia, Sony, Advanced Micro Devices and Ericsson all reported grim quarterly earnings. We'll leave the details to those with the fortitude to click the link.

6. Apple reports record profit for first quarter and Google Q4 earnings plummet, revenue up 18 percent: We don't usually go on about financial results in Top 10, but this week is an exception, with the need to temper the bad news with a couple of positive nuggets. First, Apple defied the recession with record revenue and profit in its first quarter. Then, Google said that although its profit took a nosedive due to one-time investment write-downs, its revenue was up nearly 20 percent for the fourth quarter compared to the same period a year ago.

7. Craig Barrett to retire from Intel: Intel Chairman Craig Barrett will retire in May after 35 years with the company. In recent years he had eased away from day-to-day operations to focus on getting technology to emerging countries.

8. PewTube? Vatican to preach via YouTube and Pope praises potential of new technologies: The Vatican's YouTube channel is up and running, providing video of Pope Benedict XVI's activities as well as other links about the Catholic Church. The pope kicked things off with a speech praising new communications technologies for the good they can do, but warning about the possible pitfalls if they are misused.

9. 25 years of the Mac: Mac fans celebrated the 25th anniversary of their beloved Apple computers and Macworld offered a tribute along with a nice package of stories looking at the last quarter-century of Apple's business, including the best and the worst. Some advice on what the company should do soon and in the future is also mixed in.

10. Bull castration, snake eaters, opium and a whorehouse: the life of storage guru Dave Hitz: Dave Hitz, the founder and executive vice president of storage company NetApp, published a memoir called "How to Castrate a Bull: Unexpected Lessons on Risk, Growth, and Success in Business." He worked on a cattle ranch when he was in college, so the title is more literal than one might at first realize. We haven't read the book yet, but there was no way we could leave a headline like that out of Top 10.

Microsoft postpones Iowa data center

January 24, 2009, 08:43 PM — IDG News Service

A day after reporting flat revenue for its online services business, Microsoft said it is postponing construction on a planned data center in Iowa.

It's one of several cost-cutting measures the software giant announced along with its disappointing financial report Thursday, including laying off around 5,000 people, reducing its use of vendors and lowering marketing spending.

Microsoft will continue construction of its new data centers in Chicago and Dublin and open them based on the level of demand for its online services. The company will revisit its data center plans quarterly, wrote Arne Josefsberg, general manager of infrastructure services, and Michael Manos, general manager of data centers, in a Microsoft blog about its data center efforts.

They referenced Microsoft's cost-cutting efforts in the post, but optimistically described the postponement in Iowa as being a result of successful efforts to improve the efficiency of data center operations elsewhere. "Thanks to the efficiencies we've gained through these ongoing efforts, we will be able to delay the construction and opening of some of our facilities," they wrote.

Josefsberg and Manos said they expect more companies to rely on hosted services from companies such as Microsoft to save money, and that despite the economic downturn, Microsoft's online services businesses are growing. That's just barely true. Revenue from its online services in the quarter ending Dec. 31 was US$866 million, up just a hair from $863 million in the same period in 2007.

Microsoft revealed in August that it would build the facility in a suburb of Des Moines and use it to host consumer services such as Hotmail, Live Search and Windows Live applications. Like its Chicago data center, the facility would use shipping containers to house servers, the company said at the time.

Microsoft is not alone in reining in its data center expansion plans. Google late last year decided to delay building a facility it planned in Oklahoma.

Microsoft to merge Windows Live and Office Live

January 24, 2009, 08:55 PM — IDG News Service

Microsoft seems closer to rebranding its search engine and Live online services under the Kumo name, something that's been rumored to be in the works since last year.

The company confirmed Friday that it will merge its Office Live and Windows Live services into one unified online portal, but did not give a specific time frame for the move and declined to provide any other information.

"Microsoft has made the decision to converge Windows Live and Office Live, that includes Office Live Workspace and Office Live Small Business, into an integrated set of services delivered through one single destination," Microsoft said in a blog post on the Workspace Team Blog attributed to Kirk Gregerson, product manager for Office Live Workspace.

Microsoft's confirmation comes after a blogger revealed Thursday that Microsoft planned to integrate the services into one set of offerings when the next wave of those services is released.

Currently, there are two separate portals through which users reach Windows Live and Office Live. Office Live services are related to Microsoft's Office software, while Windows Live comprises applications and services that align more closely with functionality in the Windows OS.

Based on actions Microsoft has taken, it has been rumored since last year that it plans to rename its search engine and possibly its entire line of online services under the name Kumo, so the integration could be preparation for that move. Microsoft has declined to comment on any name change for its search engine or services.

However, on Dec. 4, 2008, Microsoft filed a trademark application for the Kumo name. The application is to trademark a host of software and services that include not only a search engine, but also advertising and telecommunications services, education, training, entertainment and the design and development of computer hardware and services.

Prior to that, according to Whois.Net, Microsoft registered the kumo.com domain, and through CSC, a company that manages domain names for corporations, also registered related domains that indicate the Kumo name could be used for other services. Those domains include: www.kumosearch.com, www.kumopics.com, www.kumowiki.com, www.kumogroups.com and www.kumotravel.com.

Kumo is a Japanese word that can be used to mean "cloud," "ceiling" or "sea spider" among other things, according to an online Japanese-to-English translation service.

Both Windows Live and Office Live, while somewhat recognizable by Microsoft customers and Web users, are fairly new brand names for the company. Microsoft revealed them in November 2005 in conjunction with a revamp of its online services. Still, the company's Online Services Business -- home to Windows Live, Office Live and the company's online advertising business -- is its least financially successful one, and Microsoft has struggled with developing an audience and solid base of users on the Web.

In an e-mail from its public relations team, Microsoft said the upcoming change does not affect current product cycles for its Office Live and Windows Live services, and the company will have more to share about specifics of the change "at a later date."

Happy 25th birthday to Macintosh

January 24, 2009, 08:37 PM — Computerworld

Well, today is the day. Twenty-five years ago on this day, a much-younger Steve Jobs introduced the Macintosh to 3,000 attendees at the Flint Center at De Anza College in Cupertino. It was the first commercially successful personal computer to feature a mouse and a graphical user interface rather than a command-line interface.

Jobs' trademark showmanship was in evidence even then.

"There wasn't a person in the room who didn't think this was history happening," recalled Richard Doherty, analyst with the Envisioneering Group, who was there.

This wasn't the first hint to the world that the Macintosh was coming. A few days earlier, Apple had aired its famous '1984' ad, spending $1.5 million to produce what would become the gold standard in Super Bowl ads. The ad ran only one other time on TV (and millions of times over in YouTube glory), but it changed not only the computer industry but the advertising industry as well.

The ad wasn't an easy sell. Apple's board wanted to kill it until Steve Jobs and John Sculley convinced them to run it. Steve Wozniak liked it so much that he even offered to foot the bill himself.

Computerworld's Ryan Faas put together a look inside what it was like building the Macintosh. If you can't get enough about the original Mac, check Folklore.org or Wikipedia.

The original Mac cost $2,495, with the following options:
ImageWriter printer: $595 ($495 if purchased with a Macintosh)
Numeric keypad: $129
Modem 300: $225
Modem 1200: $495
Carrying case: $99
3 1/2-inch disk box (10 disks): $49
MacWrite/MacPaint: $195 (included free with each Macintosh during the introductory period)
External drive: $495

As someone who has spent the better part of my life working and playing with Macs, I certainly hope they'll be around another 25 years. The platform is certainly as relevant and exciting as it has ever been.

Jan 23, 2009

Sun: Look for IDEs to tackle the cloud

Look for software development environments to make accommodations for cloud deployments, a Sun Microsystems official said on Thursday.

During the Cloud Connect event in Mountain View, Calif., Lew Tucker, Sun vice president and CTO for cloud computing, spoke of a button, of sorts, for deploying applications to the cloud. "I think that we'll see the IDEs [Integrated Development Environments], the tools that developers use today will start to accommodate the capabilities of even building and deploying applications out into a cloud service provider," Tucker said.

[ For more about the cloud, see InfoWorld's news feature on what cloud computing really means ]

Just as an IDE can deploy to an Apache Web server, IDEs can evolve to support cloud deployments, meaning the IDE would spin up a new instance in that cloud. Tucker said he anticipates a variety of tools emerging to accommodate the cloud, but added that Sun has not announced any such plans for the NetBeans IDE.

Cloud computing, Tucker said, represents a natural evolution from client-server to the Internet to the cloud. But he acknowledged that moving legacy applications to the cloud remains an issue, particularly around data ownership and privacy. "The enterprise by and large has real hard, fast policies around confidential information and protecting that information, archiving, that has kept that information trapped behind the firewall. And so moving apps into the cloud, whereby company confidential information is moving out into the cloud, the enterprise has real issues with that," Tucker said.

He anticipates cloud providers will respond by providing appropriate levels of security and auditing. But IT departments like to know where their data is, who has access to it, and how that data is protected, Tucker said.

Enterprises could move the cloud model behind the firewall, offering infrastructure as a service to internal customers and gaining advantages in scalability and cost, Tucker said.

Sun, he said, has not made any announcement about becoming a cloud provider itself, but the company will be involved in public and private clouds. The company has had a grid computing platform, Network.com. Grid has been geared toward such applications as high-performance computing and batch jobs, Tucker said. Cloud environments are for more general-purpose and Web-scale applications, he said.

Also playing into the cloud space is virtualization, Tucker noted. Virtualization had started out as a way for enterprises to do server consolidation, he said.

"Now, we're seeing virtualization being used by cloud service providers as a way to take applications and move them into the cloud," he said. An OS and LAMP stack can be moved entirely as a virtual machine into the cloud, Tucker said.

Tucker also said he expects clouds for different types of applications, including games.

Sun, meanwhile, has maintained Project Caroline, its platform-as-a-service research project, which provides a set of tools for use in the cloud. The project revolves around the notion of a framework of sorts for developing scalable applications.

Windows 7 for free?

Microsoft is planning to offer a free Windows 7 upgrade to Vista owners. The Malaysian Web site TechArp just reported that Microsoft has a draft of its Windows 7 Technical Guarantee Program, which includes details on what it's calling the "Windows 7 Upgrade Program."

Details are sketchy at this point, but strategically the program is designed to reduce the number of users putting off buying a new computer because they would rather jump directly from XP to Windows 7 than deal with Vista.

According to the site, the program is targeted at consumers and small-business users who bought new PCs preinstalled with Vista during a predetermined period of eligibility. Vista Home Basic isn't included in the program, however; only Vista Home Premium, Vista Business, and Vista Ultimate qualify. Larger enterprises with multiple upgrade requirements aren't eligible.

The upgrade plan is nothing more than a rumor at this point, although TechArp does claim to have seen a copy of a letter Microsoft sent out to its OEM partners. TechArp does have a good reputation for getting the scoop on Microsoft strategies.

Supreme Court deals death blow to antiporn law

The U.S. Department of Justice has been trying since 1998 to convince courts that a federal antiporn law targeting sexually explicit Web sites was constitutional.

No longer. On Wednesday, the U.S. Supreme Court rejected prosecutors' last-ditch defense of the Child Online Protection Act, meaning that the law will not be enforced.

COPA was enacted during the anti-Internet porn scares of the late 1990s, in part as a narrower answer to a previous Net censorship law that also met its demise in the courts. Under COPA, any commercial Web site operator that posts "material that is harmful to minors" faces up to six months in prison and a fine of up to $50,000.

The American Civil Liberties Union filed suit against the law in Philadelphia, saying the prohibition was so broad and vague that even traditional publishers could face fines and imprisonment. Plaintiffs included Salon.com, which occasionally publishes racy material; the California-based lesbian-gay A Different Light Bookstore; PlanetOut; and a now-defunct coalition that included CNET Networks (publisher of CNET News), The New York Times Co., and Reuters. (A CNET executive testified against the law in January 1999.)

"It is not the role of the government to decide what people can see and do on the Internet," ACLU staff attorney Chris Hansen said in a statement on Wednesday. "Those are personal decisions that should be made by individuals and their families."

As a side note, it was the Justice Department's ongoing defense of COPA in 2006 that led to its subpoena to Google asking for a "random sampling" of 1 million Internet addresses accessible through Google's popular search engine and a random sampling of 1 million search queries submitted to Google over a one-week period.

Since the initial proceedings, the case has bounced around the court system without reaching a resolution. During that time, the Supreme Court handed down two preliminary rulings, once in 2002 and again in 2004.

The first time, it sent the case back to an appeals court with instructions to broaden its legal analysis beyond the law's interaction with community standards; the second time, it wanted a review of whether "technological developments" have affected the law's constitutionality.

The Supreme Court's 2004 ruling against the Justice Department and in favor of the ACLU commanded a narrow 5-4 majority, with justices Stephen Breyer, William Rehnquist, Sandra Day O'Connor, and (separately) Antonin Scalia dissenting.

The majority opinion, written by Justice Anthony Kennedy, upheld a temporary injunction barring prosecutors from enforcing COPA.

It was Breyer's dissent that had some free-speech advocates worried. It said COPA places "minor burdens on some protected material--burdens that adults wishing to view the material may overcome at modest cost. At the same time, it significantly helps to achieve a compelling congressional goal, protecting children from exposure to commercial pornography. There is no serious, practically available 'less restrictive' way similarly to further this compelling interest. Hence the Act is constitutional." Scalia went even further.

But the court didn't seem to want to revisit COPA a third time. Wednesday's ruling was a mere refusal to even hear the case, issued without explanation.

Even among antiporn groups, support for COPA waned as the years progressed, and federal prosecutors focused on obscenity and child pornography.

Another reason for the erosion of support may be that because the law was written so long ago, it's surprisingly limited. It applies only to material delivered "by means of the World Wide Web" -- meaning that it doesn't cover peer-to-peer file sharing, the Usenet newsgroups that alarm New York's attorney general, games like Virtual Hottie 2, those naughty things happening in Second Life, videos watched via a third-party iPhone application, or streaming porn viewed through the VideoLAN Client, RealPlayer, or Windows Media Player desktop applications.

Broadband stimulus passes its first hurdle

The House Committee on Energy and Commerce, which has oversight of broadband and energy matters, approved its portion of the economic recovery legislation this week.

The broadband components of the legislation, which runs over 200 pages, authorize the National Telecommunications and Information Administration (NTIA) to allocate $2.85 billion for wireless and wireline broadband through a program of grants.

[ Last year, InfoWorld pondered if the U.S. needs a new broadband policy | Barack Obama had been including support for broadband rollout funding in his stimulus package since early this month ]

The entire legislation is expected to come to a floor vote in mid-February.

Approximately $1 billion will be targeted for wireless service with a goal that 25 percent of the $1 billion go to unserved areas and the remaining 75 percent for the advancement of broadband in underserved areas of the country.

The provision requires grant recipients building broadband infrastructure to do so in a technologically neutral way.

Grantees will be required to “provide access to the supported infrastructure on a neutral, reasonable basis to maximize use.”

The wording appears to support one aspect of what is called net neutrality.

Net neutrality has come to mean everything from opposing additional fees for better quality of service to ensuring network access for competing content-delivery services.

President Obama has stated that he favors net neutrality, but most of the carriers do not support it.

According to news reports, Steve Largent, president of the Cellular Telecommunications and Internet Association (CTIA), also sent a letter to Energy and Commerce Committee Chairman Henry Waxman stating that carriers would be hesitant to participate in the grant program unless the FCC more clearly defines what it means by "open access," another term that has many definitions.

According to the legislation, open access would ensure that private entities cannot restrict any content deemed lawful “that flows through taxpayer-funded broadband facilities.”

The CTIA has also been lobbying on behalf of the carriers for what is called a “shot-clock” approach to putting up towers and antennas to improve wireless access.

Under the shot-clock approach, a municipality would be required to approve a site within 75 days of an application. Currently, tower and antenna approval can take years as an application works its way through various environmental impact studies and public hearings.

The current legislation also calls for the NTIA to set minimum speed requirements for unserved and underserved areas, “reflecting what is technologically and economically feasible.”

The bill, in fact, sets specific benchmarks for “advanced broadband service” as delivering data to an end-user at a speed of at least 45 Mbps downstream and 15 Mbps upstream.

The term “advanced wireless broadband service” is defined as data transmitted at a speed of at least 3 Mbps downstream and at least 1 Mbps upstream, “over an end-to-end internet protocol wireless network.”

Basic broadband service is defined as delivering data to the end-user at least 5 Mbps downstream and 1 Mbps upstream.
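Taken together, the bill's three definitions reduce to simple threshold checks. A minimal sketch in Python; the function name and return labels are our own shorthand, not language from the bill:

```python
def classify_service(down_mbps, up_mbps, wireless=False):
    """Classify a connection against the bill's proposed speed benchmarks.

    Thresholds come from the definitions quoted above: advanced broadband
    is 45/15 Mbps, advanced wireless broadband is 3/1 Mbps, and basic
    broadband is 5/1 Mbps (downstream/upstream).
    """
    if wireless:
        if down_mbps >= 3 and up_mbps >= 1:
            return "advanced wireless broadband service"
        return "below wireless benchmark"
    if down_mbps >= 45 and up_mbps >= 15:
        return "advanced broadband service"
    if down_mbps >= 5 and up_mbps >= 1:
        return "basic broadband service"
    return "below basic benchmark"
```

Note the asymmetry the definitions create: a 6Mbps/1Mbps DSL line qualifies only as basic broadband, while the same speeds delivered wirelessly would comfortably clear the "advanced wireless" bar.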

The legislation states that almost any entity is eligible to apply for a grant, including satellite companies.

Although the entire stimulus package is on a fast track to a full vote in the House and Senate, some provisions in the broadband component appear to be less hurried.

For example, it calls for a broadband inventory map of the United States that "identifies and depicts the geographic extent to which broadband service capability is deployed and available from a commercial provider or public provider throughout each state." However, under the proposed legislation, the NTIA has two years after the enactment of the act to create the inventory map.

The bill does appear to go a long way toward keeping the public informed of broadband progress. It requires the NTIA to create a Web site that lists eligible entities that have applied for a grant, the areas the entity proposes to serve, plus the status of each application.

States will have 75 days after the legislation is approved to indicate what areas have the greatest priority for broadband.

Opinion: The top 10 standout Macs of the past 25 years

January 16, 2009 (Computerworld) Back before Apple Inc. made computers that fit in your pocket, it made computers that fit on your desk. Some were big-box machines, others were not-so-portable portables and still others were -- literally -- cube-shaped. But the first Macintosh, the one that started Apple's rise to iconic status, is to the computer industry what the wheel was to cave men.

It was launched during the Super Bowl on Jan. 22, 1984 -- in a minute-long commercial directed by Ridley Scott that became a classic of its own -- and went on sale two days later. It was the first of a string of Apple computers that would captivate users for the next quarter of a century.

Much has changed in technology over the course of the past 25 years, with Apple often at the center of the advances we now take for granted. To celebrate the Mac's 25th anniversary, I looked back over the years and picked 10 Apple computers that altered the company's course and changed the way the world works and communicates. My first pick, naturally, is the first Mac.

The Macintosh (1984)

The original Mac, with its compact all-in-one design, innovative mouse and user-friendly graphical user interface (GUI), changed the computer industry. Like the wheel, the Mac just made things convenient for the rest of us.

The original Macintosh 128K (photo: Marco Mioli, All About Apple, GNU FDL 1.2 license)

Most computers in the early 1980s were controlled exclusively through text commands, limiting their audience to true geeks. True, Apple had released a GUI with the introduction of the $9,995 Lisa in 1983, but the Mac, priced at $2,495, was the first computer to capture the attention of everyday people, who could now use a computer without learning a cryptic command-line language.

The mouse, coupled with a user interface that closely followed the physical "desktop" metaphor, allowed users to tackle tasks unheard of on rival computers using the Mac's two included applications, MacWrite and MacPaint. Thus was born desktop publishing. Coupled with PostScript software licensed from Adobe Systems Inc., Apple was also able to sell the Apple LaserWriter, which helped bring about WYSIWYG design, allowing artists to output precisely what was on the Mac's 9-in. black-and-white screen.

In case you forgot, the first Mac came with 128KB of RAM and zipped along with an 8-MHz processor. Reviewers were not always friendly, but the stories of those who helped bring it to life, collected at Folklore.org, offer a fascinating look at the first computer to capture mainstream attention.

The PowerBook 100 series (1991)

On Oct. 21, 1991, Apple unveiled its new portable lineup, which included the PowerBook 100, 140 and 170. These "good, better and best" models, the culmination of a joint venture between Apple and Sony Corp., featured a 10-in. monochrome screen and introduced a design that became the blueprint for all subsequent laptop designs from all computer manufacturers.

The PowerBook 100 (photo: Danamania, GNU FDL 1.2 license)

Apple's earlier attempt at a portable Macintosh -- aptly named the Macintosh Portable -- weighed in at a not-so-portable 16 lb. But the Macintosh Portable did introduce the trackball to mobile computing, in this case located to the right of the keyboard.

The PowerBook line placed the keyboard back toward the LCD screen, allowing room for users to rest their palms. It also conveniently allowed Apple to locate the trackball at the center of the palm rest. That made it easy for either left- or right-handed users to operate the machine.

The PowerBook series also introduced Target Disk Mode, which allowed the laptop to be used as a hard drive when connected to another Macintosh using the built-in SCSI port. It also came in a fashionable dark gray, breaking from the standard beige of the PC industry.

The PowerBook 100 series brought in $1 billion in revenue for Apple in its first year, and its impact is still felt to this day. If you're using a laptop with a trackball or track pad between your palms, you can thank the PowerBook 100 design. (If you've got a track pad, you can thank the PowerBook 500. In 1991, that particular model was still three years away.)

Conficker malware ups the ante

Roger A. Grimes explains why keeping up to date with patches can mean the difference between a functional system and a playground for hackers

I'm finding many Windows servers without the MS08-067 patch and with no specific mitigations applied. There hasn't been a very large malware outbreak (a la Code Red, SQL Slammer, etc.) in a few years, and perhaps that is leading to a false sense of security.

If you don't patch, the ever-transforming Conficker malware program could end up testing your security perimeter breach responses. Microsoft released the patch on Oct. 23, 2008, nearly two months ago. To remain unpatched at this point in time doesn't seem to be a great idea, but there are still plenty of vulnerable servers out there.

How the Conficker worm still wreaks havoc
The Conficker worm's main exploit vector is a buffer overflow in unpatched versions of the Windows Server service, which, along with the Workstation service, runs inside svchost.exe processes. Initial exploitation normally occurs remotely over NetBIOS/SMB ports 139 or 445 using a malformed RPC request. It can be accomplished with an unauthenticated connection on Windows XP Pro and Windows Server 2003, but requires an authenticated connection on Windows Vista and Windows Server 2008.
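Because those two ports are the worm's doorway, a quick way to gauge a host's exposure is simply to see whether they accept TCP connections. A minimal sketch (the hostname passed in is a placeholder; a reachable port does not prove vulnerability, only exposure):

```python
import socket

def smb_ports_reachable(host, ports=(139, 445), timeout=2.0):
    """Return the subset of NetBIOS/SMB ports on `host` that accept a TCP
    connection. A reachable port only means the service is exposed and
    worth checking for the MS08-067 patch, not that the host is vulnerable."""
    reachable = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            # connect_ex returns 0 on success, an errno on failure/timeout
            if sock.connect_ex((host, port)) == 0:
                reachable.append(port)
    return reachable
```

Run against an internal address range, an empty result for every host means the worm's primary vector is at least not directly reachable from where the scan ran.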

[ Take a slideshow tour of InfoWorld's 2009 Technology of the Year Award winners in Applications, Middleware, and Data Management | Application Development | Platforms and Virtualization | Systems and Storage | Networking and Security ]

Once a system is infected, Conficker takes a variety of actions, including exploiting several routes that have nothing to do with the Server service. It disables common anti-malware programs and uses DNS modifications to prevent local end-users from surfing to anti-malware-related Web sites (which might be one of the first clues that you're infected). It spreads to mapped file shares and any removable drives it identifies. Once there, it creates a subdirectory called Recycler (emulating the Recycle Bin) and places an Autorun.inf file inside, which may be auto-launched when the share or drive is visited.
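That Recycler-plus-Autorun.inf drop pattern is easy to sweep for on a mounted share or removable drive. A heuristic sketch only, assuming the layout described above; Windows volumes legitimately contain a RECYCLER folder, so treat any hit as a lead to inspect, not a verdict:

```python
import os

def find_autorun_in_recycler(root):
    """Walk `root` and flag any autorun.inf sitting inside a folder named
    'recycler' -- the drop pattern described above. Comparisons are
    case-insensitive, as Windows filesystems are."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        if os.path.basename(dirpath).lower() == "recycler":
            for name in filenames:
                if name.lower() == "autorun.inf":
                    hits.append(os.path.join(dirpath, name))
    return hits
```

Pointing it at each mapped drive letter or mount point in turn gives a crude first pass over exactly the locations the worm targets.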

It attempts to connect to remote admin drive mappings using hundreds of common, weak passwords, including simple combinations of numbers and letters. If you find an infection on your network, you'll probably want to check that list and see whether any of your passwords are on it. Using either exploit vector, Conficker is able to infect fully patched computers after first exploiting one unpatched computer on the network. Conficker isn't the first worm to do any of these things, but then, the most popular worms rarely do anything new.
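Defending against that dictionary attack mostly means making sure none of your admin passwords appear on such a list. A minimal audit sketch; the sample weak list and account names below are illustrative, not Conficker's actual list:

```python
def find_weak_passwords(accounts, weak_list):
    """Return the account names whose password appears (case-insensitively)
    in a list of common weak passwords of the kind Conficker iterates
    through. `accounts` maps account name -> password."""
    weak = {w.lower() for w in weak_list}
    return sorted(name for name, pw in accounts.items() if pw.lower() in weak)

# Illustrative sample only; a real audit should use a published
# weak-password list such as the one distributed with the worm analysis.
COMMON_WEAK = ["password", "admin", "123456", "abc123", "letmein"]
```

For example, an account whose password is merely "Admin" with a capital A would still be flagged, since the worm's guessing is effectively case-insensitive brute force.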

How admins are protecting servers from Conficker -- or not
I don't understand why more administrators aren't patching their Windows servers. Maybe they think the perimeter firewall will protect them because they don't allow ports 139 or 445 in from the Internet. But as anyone who lived through the MS-Blaster days can tell you, a perimeter firewall doesn't provide much protection in today's world of traveling laptops and remote VPN users.

I'm sure some of the problem is distrust of Microsoft patches. In the past, some Microsoft patches have caused operational issues, and although Microsoft has gotten a lot better about this, it's hard to forget being burned. That's understandable, and it's why any patch management strategy should have acceptable regression testing built into it.

Some of my clients don't have identical test environments in which to test patches. That, too, is understandable, although virtualization makes the task cheaper: the biggest VM vendors, certainly VMware and Microsoft, offer free utilities that snapshot a real, physical box into a test VM image. But even with an identical VM, unless it is exercised as if it were serving the production network, the test is never completely thorough.

Patching best practices
Always have a plan for rolling back patches after they are applied to the production network, and take a complete backup of each server before any patch is applied. Microsoft offers free technical support (866-PCSAFETY) for any security update-related issue. For mission-critical servers, consider clustering or network load-balancing services so that servers can be patched one at a time instead of all at once.
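The one-node-at-a-time approach amounts to a simple rollout loop. In this sketch, the drain, patch, verify, and rejoin helpers are hypothetical stand-ins for whatever your cluster or load-balancer tooling actually provides:

```python
def rolling_patch(nodes, drain, patch, verify, rejoin):
    """Patch each node in turn; halt immediately if a patched node fails checks."""
    patched = []
    for node in nodes:
        drain(node)           # shift load onto the remaining nodes
        patch(node)           # apply the update (the node may reboot here)
        if not verify(node):  # health check before taking live traffic again
            raise RuntimeError(f"{node} failed post-patch verification; halting rollout")
        rejoin(node)
        patched.append(node)
    return patched
```

The key property is that at most one node is ever out of service, and a bad patch stops the rollout before it spreads to the rest of the cluster.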

If you don't want to (or can't) apply the patches in a timely manner, implement the suggested workarounds and mitigations; Microsoft offers these in every security bulletin. If your environment is so thin on assets and resources that you simply cannot take the time to regression test before applying patches, back up the servers and apply the patches anyway. Today's Internet is too malicious to let unpatched servers hang around on your network: patches are reverse-engineered within minutes, and the resulting worms pour out onto the public Internet within hours.

I do a lot of on-site security reviews. Frequently I'm told that all patches are being applied in a timely manner, but when I audit sample servers, I find that isn't really the case. If you're an IT manager, do a little spot checking to see whether necessary patches are being applied. Use your regular patch management auditing tool, or simply use Microsoft's free Baseline Security Analyzer (MBSA).
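A spot check can be as simple as diffing the updates installed on a sampled server against the ones your policy requires. The lists here are illustrative; in practice the installed inventory would come from MBSA output or your patch tool's export:

```python
def missing_patches(installed, required):
    """Return required updates not present in the installed inventory, sorted."""
    return sorted(set(required) - set(installed))
```

For example, checking a server against a required list that includes KB958644 (the MS08-067 fix Conficker exploits) immediately shows whether the one patch that matters most right now is actually on the box.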

Many organizations cannot patch faster than a few weeks because of security policy or oversight boards. If that's the obstacle, take your case to management and ask that a faster response be allowed for high-risk scenarios. Your patching strategy must keep up with the times: waiting weeks to patch servers is no longer a best practice, it's an outlier. And if anyone is ever asked in court to defend how a system was maintained, lawyers and the courts fall back on best practices and due diligence as the baseline. Make sure you're not the one explaining why you deviated from best practices in patch management.

Open source in 2008: Everything but interest up

While 2008 has been a bleak year for the financial markets and the global economy, it has been very kind to open source, at least based on market share. A review of Net Applications data suggests that there has never been a better time for open source; however, as Google Trends data suggests, it's no longer enough to rest on one's open-source laurels.

  • Number of projects. In terms of sheer numbers of open-source projects, as well as traffic to those projects, open source was on a tear in 2008, with SourceForge alone increasing its hosted projects by 10 percent.
  • Browsers. Firefox has cracked 20 percent of the market, while Google Chrome has topped 1 percent. Internet Explorer, meanwhile, dropped month over month for nearly all of 2008. As IE falls toward 50 percent market share we will, as Glyn Moody suggests, get a taste of real browser competition again.
  • Operating systems. Windows, too, has been on a slide, though at 89 percent of the market, it's hardly in a weak position right now. Linux and Mac have both gained at its expense, with the latter taking a real bite (over 1 percent) out of the Redmond giant. But for Mac and Linux, the real market to watch isn't the desktop, but rather the mobile (or nearly mobile) market, where Mac and Linux have real consumer advantages.
  • Commercialization. In 2008, Red Hat and Novell saw Linux revenue (and, in Red Hat's case, JBoss revenue) jump dramatically, while Sun saw Open Storage and MySQL revenue climb significantly. But the real story may be in private data. My own company doubled its sales (again), and I know from conversations with executives at SugarCRM, MindTouch, JasperSoft, and other open-source vendors that their sales were on a tear, too. Individually, we're still talking about relatively small amounts of money (under $50 million), but collectively...? Commercial open source is coming into its own.
  • General interest. Intriguingly, "open source" as a search term has been on the decline for years, as shown below, but I think this is a positive development. In light of the above, I read this decline to suggest that people are less interested in generic open source and more demanding of specific open-source projects. The question, in other words, is "Does Hadoop meet my needs?" rather than "Is it open source?" That is a sign of maturity, not weakness, and of open source going mainstream, as Sun's Zack Urlocker suggests.

"Open Source" search on the decline

(Credit: Google Trends)

It has been a banner year for open source, just as it has been since at least 2000. I suspect that 2009 will be much the same...only better.