2009-01-17

Integration Watch: The end for Perl?

In 1996, I hired a writer/programmer named Randal Schwartz to write a column on Perl for Unix Review, the magazine I headed up at the time. I believe I was one of the first, if not the first, editor of a broad-circulation tech publication to feature regular coverage of Perl. The rationale was that I could see the rapid adoption of Perl in the Unix community—combined with the difficulty new users had in coming up to speed on the language.
##CONTINUE##
Schwartz was probably the biggest name in Perl at the time (his book was the equivalent of K&R to C or the Pickaxe book to Ruby) after Larry Wall, the language’s inventor. The column, Perl Advisor, was very popular during and after my tenure at Unix Review, and it lasted until the magazine folded several years later.

Had I stayed on at the magazine (and had it continued on), I would have eventually supplanted the Perl column with one on Python. As Python emerged, it was clear even in its early years that it offered a superset of Perl’s functionality and that it lacked Perl’s penchant for near-hieroglyphic syntax.

Much of Perl’s original popularity came from its clear superiority for CGI scripting, the glue that held early Web functionality together. The fact that it could be used for application development was a secondary aspect in those days, but one that attracted a loyal following. However, even then there was some perception that writing large applications in Perl was using the language for more than what it was designed to do.

Python, in contrast, came out of the box with a stronger orientation towards application development, and I found that many Perl application developers were excited all over again about doing development work in Python. Python appeared to most folks as the first truly modern dynamic language. My perception, then and now, was that Python’s capabilities would inevitably lead it to overtake Perl in popularity. This has now come to pass.

During much of the present decade, according to the Tiobe index {tiobe.com}, Perl has held a sizable lead in the marketplace. In 2003, for example, it was the fourth most popular language (after Java, C and C++). This year, it has dropped to eighth place, behind the previous three and (in order) PHP, Visual Basic, C# and, crucially, Python. Of the top 10 languages, none lost more ground in 2008 than Perl. If it loses as much again in 2009, it will fall to 11th place. Google and Ohloh.net show similar declines.

Though Python usage dropped slightly this year, I suspect that Python 3.0's release will boost the language’s popularity. This is a big release. And even though it breaks compatibility with previous versions, the new features it adds are important.

By contrast, Perl has been in a massive funk when it comes to updates. The now-infamous Perl 6 release is still a long way off—as it has been since its announcement in 2000. In September 2003(!), I wrote an installment of this column discussing the features of the soon-to-beta Perl 6. I referred to O’Reilly’s Perl 6 book, which had just appeared in its second edition. And still five years later, we’re years away. Languages that do not advance become stale, especially in the light of the numerous advances in competing languages. For example, since Perl 6 was announced, Java has enjoyed three major releases.

A second aspect of Python’s rise and Perl’s fall is that Python has become the language of choice for several key technology companies that are trendsetters: Google (in particular), BitTorrent, Yahoo and others. In addition, the community has created multiple versions of Python: IronPython (for .NET), Jython (for the JVM), and various versions for handheld devices and smartphones. Perl has only the original Perl implementation plus an experimental version in Haskell under development. Not quite the same.

The pain point in this is that Perl needs an update more than Java or Python ever did. Its arcane syntax might have been fine in the CGI-scripting era, but when users have alternatives such as Python and Ruby, difficult syntax is just a chore. To this end, you’ll note that the upcoming revisions to Ruby eliminate precisely those Perl-like syntactical elements.
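
To make the syntax point concrete, here is a small, hypothetical comparison (not drawn from the column itself): counting word frequencies in a text file. The compact Perl style shown in the comment leans on sigils and implicit variables; the Python version spells the same steps out with named objects.

```python
# Hypothetical illustration of the syntax gap discussed above.
# A typical compact Perl rendering might be:
#   perl -ne 'for (split) { $c{$_}++ } END { print "$_ $c{$_}\n" for sort keys %c }' input.txt
# The same task in Python, written out explicitly:
from collections import Counter

def word_counts(path):
    """Count how often each whitespace-separated word appears in a file."""
    counts = Counter()
    with open(path, encoding="utf-8") as f:
        for line in f:
            counts.update(line.split())
    return counts

if __name__ == "__main__":
    for word, n in sorted(word_counts("input.txt").items()):
        print(word, n)
```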

Despite its syntax and slow release cycle, Perl won’t die, of course. But it seems likely that greenfield Perl projects will dwindle and Perl will be consigned to maintenance of legacy codebases and the writing of quick-and-dirty admin scripts. That’s not a bad fate (Smalltalk would be happy with that much industry relevance), but it’s not growth.

-----------------------------
BY Andrew Binstock
Source:SDTimes

Andrew Binstock is the principal analyst at Pacific Data Works. Read his blog at binstock.blogspot.com.

Copyright © 1999-2009 BZ Media LLC, all rights reserved.

CES 2009: Top 10 Stories From Press Day

For ink-stained wretches like myself, CES begins a day early. The day before the official opening of CES is Press Day: a parade of scribes and bloggers from one press conference to another. As Christine Persaud notes in her coverage, Press Day was packed, with almost all events being standing-room-only.

Rather than give a blow-by-blow account of each press conference, I'll sketch some of the big themes. Based on Press Day, these have the potential to be the top 10 stories of CES.
##CONTINUE##
Thin is Even More In: Last year, several manufacturers showed ultra-thin flat panels, and that trend is accelerating for 2009. LG announced a 55-inch LCD that's 24.8 mm deep - under an inch.

Samsung's new Luxia series are the thinnest TVs on the market that incorporate a tuner and jack pack: they're just over one inch deep. Samsung will also offer an Ultra-Thin wall mount with a gap of only 0.6", so the TV hangs like a picture. Complementing the new TVs is the wall-mountable, one-inch-deep BDP-4600 Blu-ray player.

Panasonic showed a one-inch-thick plasma that will ship this year, and demonstrated what it says is the world's thinnest PDP: a 37-inch display that's less than 1/3 inch thick!

The Cord is Being Cut: Many of the new super-thin flat panels employ wireless HDMI for connection to the external media receiver/tuner/switcher. That feature is employed on LG's super-thin display. Wireless HDMI is also used to connect Panasonic's one-inch-thick Viera Z1 plasma to its matching set-top box.

LED Gains Ground: The number of LED-backlit LCD TVs is increasing dramatically. LED backlighting allows for dimming of the backlight in dark areas of the picture, for better blacks and shadows, and higher contrast ratios. LG's new super-thin 55-incher has 240 dimming zones and a specified dynamic contrast ratio of 2,000,000:1. Samsung employs mega-contrast LED backlighting and local dimming on its 6000-, 7000- and 8000-series Luxia LCDs. And in May, Toshiba will introduce the 46" and 55" Regza SV670-series models with full-array LED backlighting and local dimming.

LCDs Get Faster: In their high-end models, LCD manufacturers are offering 240 Hz processing to reduce blur even further than models with 120 Hz frame doubling. That feature is offered on premium models from LG, Samsung, Sharp, Sony and Toshiba.

Panasonic, which is firmly in the plasma camp on larger screen sizes, noted that the 600Hz sub-field processing on its Viera plasmas results in full 1080p resolution on scenes with motion, better than the performance of LCDs with 120 Hz frame-doubling.

Black is Just Basic: Manufacturers are moving away from plain-black bezels, not just on premium models, but on mainstream sets as well. Samsung kicked off the trend in 2008 with its Touch of Colour (ToC) flat panels, and in 2009 is adding slimmer, more colourful ToC models. Toshiba is offering LCDs with "deep lagoon design," in which the bezel has a three-dimensional finish with subtle gradation.

Blu Movies and Blu Boxes: Blu-ray is now hitting the mainstream. "I believe this will be a watershed year for Blu-ray," stated Stan Glasgow, President and Chief Operating Officer of Sony Electronics Inc. Glasgow said the combination of a growing software catalog and attractively priced players "creates a compelling consumer proposition."

The most interesting announcement came from Sharp: the company's forthcoming Aquos BD series (available in 32", 37", 42", 46" and 52" sizes) will have built-in side-mounted Blu-ray players. Reflecting a trend away from basic black cosmetics, the new panels have a subtle blue accent on the bottom. Sharp also announced two new Blu-ray players and two Blu-ray home theatre systems.

Samsung is introducing three standalone Blu-ray players and three Blu-ray HTIB systems. Of particular interest is the HT-BD8200, a Blu-ray-equipped sound bar with wireless subwoofer and virtual 5.1-channel sound.

There are also lots of Blu-ray products with Internet connectivity, some of which are outlined in the next section.

Everything's Connected: In 2008, we saw the beginnings of the connected TV: flat panels that bring a subset of the Internet to the home theatre. That trend is being entrenched big time this year, and not just on TVs, but on Blu-ray players and Blu-ray HTIBs. And networking companies are finding their own entries into the home theatre.

LG is introducing two new Blu-ray players and three Blu-ray home-theatre systems that let users download high-definition movies from Netflix; the new models can also display content from Cinema Now and YouTube. New flat panels from LG have support for Netflix, YouTube and Yahoo! Widgets: small TV-based Web applications for services like news and weather.

Sharp introduced Aquos Net on its SE94 and Limited Edition Aquos LCDs in 2008. That feature offered services like stock quotes and weather. Aquos Net is being enhanced in 2009, with services like NavTec traffic reporting, Rally Point sports and social networking, and Screen Dreams screen savers.

Samsung's 7000- and 8000-series Luxia LCD televisions will incorporate wireless networking (so you don't need an Ethernet cable drop or wireless bridge in your home theatre). The new TVs feature a built-in content library, Yahoo widgets, eBay auction tracking, and support for YouTube and Flickr. There are plans to add widgets that connect the TVs to social networking sites. Samsung's BDP-4600 Blu-ray player is Wi-Fi-ready (a small wireless dongle is used to connect it to the home network). With a network connection, the super-thin player can connect to the Netflix movie service and the Pandora music service.

Panasonic introduced Viera Cast on its premium 850-series plasmas in 2008. That feature delivers Internet services like YouTube, Picasa photo-sharing and Bloomberg news to the home theatre. For 2009, the company plans to offer Viera Cast throughout its Viera lineup, and also on three new standalone Blu-ray players, plus a new portable model, the DMP-515. Later in the year, support for the Amazon Video on Demand service will be added to Viera Cast, giving users access to Amazon's 40,000-title library. After renting a movie, viewers can watch it at their convenience on their Viera display, or on a PC or Mac.

Toshiba will offer TV Widgets, Windows Media Center Extender capability and support for Cinema Now on new LCD televisions and LCD/DVD combo units. Widgets from new content providers will be added to existing TVs as that content becomes available.

In late 2008, Sony made Hancock available to users of its Bravia Internet Video Link via the Net four weeks before its release on DVD and Blu-ray. For 2009, Sony is building Bravia Internet Video capability into select models, including the XBR9 series, which will ship in the spring. Bravia Internet Widgets, a new feature based on Yahoo's technology, includes Flickr, stock quotes and news services.

Netgear's new Integrated TV Player is a small set-top-box-like component that brings video from YouTube and other sites to the home theatre. The unit has a clever search function that complements its preset channels. Users can download torrent videos directly to the device and view them without a PC. Also supported is the Cinema Now movie-download service. Also new from Netgear is the Digital Entertainer Elite, which combines a media receiver for streaming content from a PC with a built-in hard drive. It can be used with Netgear's ReadyNAS Pro Pioneer Edition network drive, which can provide up to nine terabytes of storage.

At its press conference, Cisco announced the Linksys Media Hub, a US$299 device that automatically stores and finds all your media content, regardless of where it's located, and makes it available to devices inside and outside the home. The networking giant also unveiled a series of Linksys-branded home audio products, and its Cisco Eos platform, which lets media companies quickly develop artist and community Websites.

Dreaming in 3D: Panasonic held demonstrations of 3D high-definition TV during the show, and said it planned to propose a standard for 3D Blu-ray disc authoring during 2009, allowing for creation of 3D Blu-ray content beginning in 2010. "Panasonic doesn't think that 3D HD for the home is that far away at all," stated Yoshi Yamada, President and CEO of Panasonic North America. The noted cinema director James Cameron made an appearance at the Panasonic press conference via a recorded video message, in which he described his forthcoming 3D sci-fi movie Avatar. "3D isn't something you watch," Cameron stated. "It's a place you're taken to."

At its press conference, Sony described how its technology would be used by Fox to broadcast a U.S. College Bowl game in 3D to theatres across the U.S.

Going Green: Manufacturers are promoting the environmental benefits of their products this year, describing their eco-friendly manufacturing processes, and announcing or enhancing their recycling programs. Panasonic announced new plasmas that can produce a picture that's twice as bright as last year's models without consuming more power, or alternatively, a picture of the same brightness while consuming half the power of last year's models.

Makers of LED-backlit televisions pointed out the ecological benefits of their technology. LG's LED-backlit TVs are Energy Star 3.0-rated, the company noted. Sony made similar claims about its Eco Bravia TVs, which include the HCFL (hot cathode fluorescent lamp) backlit VE series. Arriving this summer, those models achieve Energy Star 3.0 status.

Sony is already operating its own recycling program, which handled 5,500 tons of waste last year. Sony said its goal is to recycle a pound of waste for every pound of equipment it sells. Panasonic, Sharp and Toshiba announced a joint recycling program, which will have 400 sites in the U.S. by the end of 2009, and 800 by the end of 2011.

The Next Generation: Last year, Sony introduced the first TV to use OLED (Organic Light-Emitting Diode) technology, an 11-inch model, and this year is announcing 21" and 27" models, plus an 11-incher that's a mere 0.9mm deep.

Not surprisingly, LCD stalwart Sharp maintains that OLED is still a niche product. "OLED is not ready for prime time," opined Doug Koshima, CEO and Chairman of Sharp Electronics Corp. "LCD is the best technology today and in the near future."

But interesting things are happening with OLED. At an evening press event, an OLED manufacturer's association demonstrated several interesting OLED panels, ranging in size from very small to about 15 inches. Corbin Church, Vice-President of Montreal-based Ignis Innovation Inc., described a new process that would allow OLED panels to be made using the same backplane designs used for LCD. The amorphous-silicon approach will drive up yields and drive down costs, predicted Church, whose company has developed the technology. "The next few years will see an awakening for OLED," he predicted.

On to the Show! Are these really the top 10 stories of CES? It's early days, with the show only having been open for an hour. So it's time for your correspondent to haul himself over to the Convention Centre, and see what's happening. Watch this space, and the next issue of the print edition of Marketnews, for lots more from CES.

-----------------------------
BY Christine Persaud
Source:here's how!

© Copyright 2008 Bomar Publishing. All rights reserved.

Living in the cloud…How did it go?

In short, pretty well. Google Docs has gotten faster and more robust and served brilliantly for my word processing needs. I still rely very heavily on my BlackBerry for communication, so that isn’t technically in the cloud, but I’m not using Apple Mail or some other client either.
##CONTINUE##
It was very clear from my week avoiding client software (productivity suites, mail clients, media players, etc.) wherever possible that the average K-12 student could live within their browser quite happily. With Google Docs (my online suite of choice because I’ve really bought into the Google ecosystem, but Zoho provides great tools as well), presentations, spreadsheets, and documents are utterly simple to produce. Despite the changes to Google’s marketing and sales of their Apps suite, their educational version of Google Apps for your domain remains free.

Blogger (and countless other tools) make it easy to produce documents online and share them as needed; Google Docs provides great sharing and collaboration tools as well.

Pandora gives me plenty of music, Twitter gives me quick communications, Gmail rocks, and I can access all of my other email accounts via a webmail interface, too. This is where the experiment started to fall apart for me, though, where it might not for most students (and, actually, most teachers).

First, I regularly access three email accounts, two of which must be accessed via my BlackBerry, a desktop mail client, or the webmail site to ensure archiving to meet FRCP requirements. I can have multiple webmail tabs open in the browser, but an email client just makes this so much easier, not to mention simplifying searches through thousands of emails. It also doesn’t help that our webmail interfaces for our two email domains completely stink.

I put it out to the Twitterverse, but if anyone knows of a website that can handle mail from multiple domains, let me know. A web-based email client along the lines of Meebo for IM would pretty well rock and make this experiment a lot better.

I also couldn’t find anything online that handled photo or video editing in any sort of efficient manner. Training documentation, as I tried to incorporate screen shots, became a real pain in Google Docs. Most online tools make it easy to incorporate video and images, but editing or tweaking them still really requires client software.

I ran into the last problem this morning. I’m working at a school right now where the router is dying a terrible death and the Internet connection is spotty at best. Obviously, cloud applications don’t work so well without an Internet connection. Internet connectivity is largely becoming ubiquitous, but this certainly points to the need for some local synchronization or cellular access to the web.

So what does this mean for students and teachers using cheap netbooks? It means that even for schools that turn to netbooks as an inexpensive way to get more computers into students’ hands, some dedicated facilities for more sophisticated computing are important. It also means that a bit of flash storage, whether an SD card or USB drive, could allow some multimedia files to be handled client-side or moved between dedicated PCs and the netbooks.

-----------------------------
BY Christopher Dawson
Source:ZDNet

Christopher Dawson is the technology director for the Athol-Royalston School District in northern Massachusetts. See his full profile and disclosure of his industry affiliations.

© 2009 CBS Interactive Inc. All rights reserved.

Defining and implementing your server optimization strategy

While there is always an increased enterprise demand for IT services, the current economic challenges facing most industries require data center managers to deliver better results with fewer resources. CIOs are expected to achieve 10 – 20 percent overall cost efficiencies in providing infrastructure services, and within the data center, the server environment provides great cost-saving opportunities. During a recent client engagement, TPI helped quickly achieve a 10 percent reduction in the client’s server population. The savings resulting from this action alone funded the resources to analyze the next steps in the cost reduction process.
##CONTINUE##
When considering server optimization holistically, companies often look at virtualization as the primary cost-cutting solution. Although virtualization can be one route to consolidating physical servers, other opportunities exist to further reduce costs. We recommend clients start by focusing on server elimination. After identifying servers that run old and infrequently used applications, you can quickly shut them down to provide immediate benefits. These servers do not require consolidation or virtualization since they are simply removed from the environment.

The commoditization of hardware, combined with technology advances in server deployment, has fueled an explosion in server growth. However, as companies have continued to purchase server equipment and software, older servers have frequently remained behind and have never been completely decommissioned.

To find these stranded servers, companies need to look beyond server utilization and examine network traffic or users’ access logs. If usage statistics are not available, investigate the service desk for incident or change management logs for activities on the servers in question.
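
As a rough sketch of how that search might be automated (the data sources and column names here are assumptions for illustration, not part of TPI's methodology), a script could cross-reference per-server activity exported from monitoring and the service desk, and flag machines that have been quiet for months:

```python
# Illustrative sketch only: flag servers that look abandoned, assuming you can
# export a CSV of per-server activity (hostname, last_network_connection,
# last_change_ticket) from monitoring or the service desk. Column names are hypothetical.
import csv
from datetime import datetime, timedelta

IDLE_THRESHOLD = timedelta(days=180)

def stranded_servers(inventory_csv, now=None):
    """Return hostnames with no recorded network or change activity in six months."""
    now = now or datetime.utcnow()
    candidates = []
    with open(inventory_csv, newline="") as f:
        for row in csv.DictReader(f):
            last_seen = max(
                datetime.fromisoformat(row["last_network_connection"]),
                datetime.fromisoformat(row["last_change_ticket"]),
            )
            if now - last_seen > IDLE_THRESHOLD:
                candidates.append(row["hostname"])
    return candidates

if __name__ == "__main__":
    for host in stranded_servers("server_inventory.csv"):
        print("Review for decommissioning:", host)
```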

Virtualization enables the migration of server instances (operating systems) to a consolidated hardware platform, which requires investment in new hardware. While virtualization has its place, application stacking — the migration of multiple applications to a single operating system (OS) — provides additional savings opportunity.

Combining applications and consolidating OS instances are two techniques for reducing OS overhead, software licensing, and system administration costs. To identify candidates for application stacking, look for applications or legacy servers running the same OS and software release levels. Software standards should be in place to maximize the impact of a common platform, making it possible to combine today’s servers and tomorrow’s applications. In addition to setting standards, providing a single development platform enables applications to grow in a shared environment.
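
Continuing the illustrative sketch above (again with assumed inventory columns), finding stacking candidates amounts to grouping the server inventory by OS and release level and looking for groups large enough to be worth consolidating:

```python
# Sketch only: group servers by (OS, release) to surface stacking candidates.
from collections import defaultdict
import csv

def stacking_candidates(inventory_csv, min_group=2):
    """Return {(os, release): [hostnames]} for platforms shared by several servers."""
    groups = defaultdict(list)
    with open(inventory_csv, newline="") as f:
        for row in csv.DictReader(f):
            groups[(row["os"], row["os_release"])].append(row["hostname"])
    return {key: hosts for key, hosts in groups.items() if len(hosts) >= min_group}

if __name__ == "__main__":
    for (os_name, release), hosts in stacking_candidates("server_inventory.csv").items():
        print(f"{os_name} {release}: {len(hosts)} servers could share one OS instance")
```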

Another significant opportunity is to consolidate test and development platforms. Why not use one server to host multiple test applications? Operating a consolidated environment requires stronger change management processes, tools for resources measurement, and increased technical skills. With these investments, a company can be in a better position to provide more efficient operations at a lower operating cost.

Whether your data center is viewed as a strategic asset, as a cost center, or as a combination of both, the fact remains that businesses in today’s economic climate must continue to find ways to drive down costs. The proliferation of servers throughout enterprise data centers makes consolidation and/or elimination an excellent way to optimize your environments by reducing the data center environmental footprint and lowering capital and operating costs.

-----------------------------
BY Terri Hart-Sears, Sr Advisor & Peter Doane, Dir, TPI
Source:EMQUS.com

Terri Hart-Sears, Senior Advisor, and Peter Doane, Director, advise clients on behalf of TPI’s CIO Services group and are subject matter experts on optimization. For more information, call Terri at +1 248 231 3331, or e-mail peter.doane@tpi.net.

© 2008 Boston Hannah International.

Linux Elitism: Fact or Fiction?

Newcomers to open source software might be intimidated by the insider nature of the communities, but they shouldn't mistake that for elitism, writes LinuxInsider columnist Jeremiah T. Gray. Stick with it, and the benefits will become clear, he advises.
##CONTINUE##
For users reared on GUI-oriented commercial operating systems, switching to open source POSIX-type OSes can be an onerous task. Whereas Linux, and FOSS in general, are built around the ideas of inclusion and sharing, the communities built up around the open source operating systems often face accusations of exclusive techno-elitism. Although there are assuredly a few smug members within the various FOSS communities and although shell fluency is more complex than simple commercial OS GUI administration, the scurrilous accusations of perceived superiority among open source communities and their members amount to little more than sour grapes.

In the tech world, communities are defined by their staunchest advocates. For instance, Apple (Nasdaq: AAPL) is well-known for sexy and sleek designs. Apple fanboys will often list physical attributes as key features of the products they vociferously defend (and you can hardly blame them for being turned on by the look and feel; multi-national corporations need defending from consumers like the Yellow River needs more toxic chemicals).

Elitists Abound

Apple makes flashy gadgets, and it gets a certain amount of cachet from the more fashionable segments of the tech world. In a similar but completely different way, Microsoft (Nasdaq: MSFT) finds its biggest fans by pandering to the obtuse. After Apple ran an ad campaign making fun of John Hodgman for "[being] a PC," Microsoft based its self-deprecating ad campaign on Apple's and recruited its defenders to make low-resolution videos declaring themselves PCs.

Both Apple and Microsoft have elitists in their ranks. Some of the elitists are loyal consumers and others are, or have been, top executives. Although these strident supporters will get into debates with each other or even with members of open source communities, they're never accused of being the reason their platform of choice isn't universal.

Different Rules

The rules are different for open source operating systems, however, in part for good reason. First of all, open source POSIX-oriented operating systems tend to bring out a different crowd. For an amusing look at what unites Linux users, for instance, compare the search terms and results here to the search terms and results here.

Rather than being guided by the fashionable elegance of iPhones or the status quo of Windows, many champions of Free and Open Source Software base their usage and support on philosophical reasoning. Considering the history of the GNU Project, it should be no surprise that one can find ideological purists in the ranks of FOSS users (some who would doubtlessly object to my use of the term "FOSS"). The purists who identify with free software and its guiding principles are no more of a threat to the open source movement than Apple fanboys are a threat to Apple's profits. The difference is purely aesthetic.

The Disconnect

So why do some newcomers walk away from Linux/BSD decrying open source operating systems and calling the community members highfalutin? Mostly because they failed to work through what Seth Godin identifies as "The Dip" and proceed to misconstrue community values and attitudes. To experience the best things Linux/BSD have to offer, users must reorient themselves and learn to think about computing a little differently. For a user accustomed to GUIs, the command line can seem daunting, even primitive, but after becoming more familiar, the user will recognize that the command line interface is an elegant, if not zen-like, experience. The same goes for building software from source. So what the wounded newcomers sometimes interpret as condescension is actually more along the lines of teaching someone to ride a bicycle. It's not difficult to do, but it can only be done if the rider pushes through the initial doubt and confusion in order to experience the benefits.

Most open source enthusiasts want more people to embrace Free and Open Source Software solutions, but just as the style of products is important to Apple aficionados, familiarity with the terminal and an appreciation of the under-the-hood mechanics matter to FOSS lovers. That said, FOSS has an added element absent from the corporate-backed technologies. Whereas fans of products made by rather large businesses need to appeal in aggregate (or through focus groups) to get noticed in the product design process, FOSS is a free-for-all. Anyone is free to bring anything to the table. While a lot of folks may get corporate logo tattoos and/or pontificate about what such-and-such company did right or wrong, few of them will ever have any actual input. On the other hand, if Joe Sixpack wants to make his own Linux- or BSD-based operating system with his own logo and software, he's free to do that. FOSS is based on empowerment and the appreciation of empowerment, and with empowerment comes responsibility.

Perhaps this is what the naysayers find objectionable.

-----------------------------
BY Jeremiah T. Gray, LinuxInsider
Source:TechnewsWorld


Jeremiah T. Gray is a LinuxInsider columnist, software developer, sysadmin and technology entrepreneur. He is a director of Intarcorp, publisher of the Linux-oriented educational comic book series, "Hackett and Bankwell."

Copyright 1998-2009 ECT News Network, Inc. All Rights Reserved.

Back to SOA business

At the end of the day, not much has changed in the last three weeks for SOA.

One of the things I've been noticing is that while budgets are being slashed and SOA is being downsized to more tactical purposes, projects are still under way, people are still moving the architecture ball forward, and it's back to business with SOA. It does not surprise me, but considering all of the blogs and conjectures around the "SOA be bad" stuff, you would think that we've moved on to the next thing.
##CONTINUE##
Truth be told, SOA is nothing more than an approach to architecture, and most enterprise architectures are still badly broken. We can sit around and talk about how complex and tough this is going to be to fix, perhaps quit, or get to work on even the smallest effort to move the ball forward.

Many small projects that you can call wins will also function to change the architecture. You just need to make sure you're moving towards something that's better than it was before, and keep the business in mind.

If there is a struggle, it's around improving the talent as well as the architecture. I've pointed out several times that there are not enough good SOA architects to go around. Training, mentoring, hiring -- it's a pretty easy solution. Now is the time to look into it; it actually reduces costs.

So, for those of you who think that SOA has stopped, clearly that's not the case. Concepts like SOA, while always debated, seem to be durable over time.

-----------------------------
BY Dave Linthicum
Source:InfoWorld

Copyright © 2007 InfoWorld, IDG Network. All rights reserved.

2009-01-16

Hyper-V Gets Fault Tolerance

There was never any doubt it would happen, but the argument that Hyper-V is "just a hypervisor" is fast becoming spurious. Last week, Microsoft added fault tolerance to its virtualization repertoire with the announcement that it had entered into a development and marketing agreement with Marathon Technologies.
##CONTINUE##
In many ways, the news isn't all that surprising. It was, after all, only a matter of time before Microsoft began partnering to grow an infrastructure around Hyper-V.

Marathon's everRun software family has been in the fault tolerance and high availability market since 1993. As virtualization gained traction, it began working its way into virtual environments, and last March it went whole hog and released a purely virtual play, everRun VM, a version with XenServer underpinnings specifically designed for virtual environments.

With this move, everRun goes beyond Xen and deep into Hyper-V territory. Marathon's close coupling with Citrix, and the design expertise that comes with it, makes it an ideal partner for Hyper-V.

Currently, everRun supports Windows Server 2003. Under the terms of the agreement, support for Windows Server 2008 will be added in the second quarter, Marathon President and CEO Gary Phillips told ServerWatch. But the real value for organizations is that everRun will be put on top of Hyper-V, "enabling customers to cross-leverage" the capabilities of both applications in a seamless environment.

Today, Windows Server 2008 customers can use failover clustering. This comes standard with the Enterprise and Datacenter Editions of Windows Server 2008 and is designed to eliminate single points of failure. Component- and system-level fault tolerance will be available via everRun Component Level Fault Tolerance in the second quarter of 2009. This level of fault-tolerance is for applications with a requirement for little or no downtime or data loss. In all cases, Marathon will provide the first line of support for everRun.

Marathon Chief Technology Officer Jerry Melnick explained the terms of the agreement concisely in a guest post on Microsoft's Windows Server Division WebLog.

In the post, Melnick talks about how everRun mitigates Windows' inherent unsuitability for fault-tolerant computing, despite its increasing use in that role. This is a driver because of the following:

First, more customers are relying on Windows Server to run their mission-critical applications, and the number of these applications is increasing. Second, with the growing popularity of server virtualization (where applications are being consolidated onto fewer servers) the impact of downtime is often magnified.

This brings us to the second part of the agreement. Engineering teams from the two companies will work together to build a platform in Windows that Marathon and other ISVs can use to deliver fault-tolerant solutions, Mike Schutz, director of product management for the Windows Server Division at Microsoft, told ServerWatch. Ideally, this will streamline setup and installation and enable seamless integration between levels of availability.

It's not hard to see the synergy in this alliance. It's also not a stretch to wonder if Marathon is an acquisition target. It is, after all, a venture-backed company, and going public these days is a far less profitable path than being purchased.

From a technological perspective, it's a decent fit. Marathon has eschewed VMware's platform in favor of those from Microsoft and Citrix, and is successfully building a customer base without supporting VMware's environment. Should this become a trend with other ISVs, it will not bode well for VMware.

Microsoft remains behind the eight-ball when it comes to a virtual infrastructure, and ramping up via partnerships and acquisitions may be far less costly and more expedient than building one from scratch. It's more than likely that similar deals will come out of Redmond as Microsoft expands its footprint.

-----------------------------
BY Amy Newman
Source:earthweb.com

Amy Newman is the managing editor of ServerWatch. She has been covering virtualization since 2001.

This article was first published on ServerWatch.com.

Gartner Reveals Five Business Intelligence Predictions for 2009 and Beyond

Analysts Discuss Business Intelligence Challenges and Opportunities at Gartner Business Intelligence Summit 2009, 20-22 January in The Hague, Netherlands

Egham, UK: Gartner, Inc. has revealed its five predictions for business intelligence (BI) covering 2009 through 2012. Speaking ahead of the Gartner Business Intelligence Summit 2009 in The Hague, analysts’ predictions ranged from business units exerting greater control over analytic applications, to the economic crisis forcing a renewed focus on information trust and transparency, to innovations such as collaborative decision making and trusted data providers.
##CONTINUE##
“Organisations will expect IT leaders in charge of BI and performance management initiatives to help transform and significantly improve their business,” said Nigel Rayner, research vice president of Gartner. “This year’s predictions focus on the need for BI and performance management to deliver greater business value.”

Through 2012, more than 35 per cent of the top 5,000 global companies will regularly fail to make insightful decisions about significant changes in their business and markets
The economic downturn forces businesses to be aware of changes in their organisations, re-think their strategies and operating plans and face demands from stakeholders and governments for greater transparency about finances, operations, decisions and core performance metrics. However, most organisations do not have the information, processes and tools needed to make informed, responsive decisions due to underinvestment in information infrastructure and business tools.

“IT leaders in companies with a strong culture of information-based management should create a task-force to respond to the changing information and analysis needs of executives,” said Bill Hostmann, research vice president and distinguished analyst at Gartner. “IT leaders in businesses without such a culture should document the costs and challenges of adjusting to new conditions and propose a business case for investing in the information infrastructure, process and tools to support decision making.”

By 2012, business units will control at least 40 per cent of the total budget for BI
Although IT organisations excel at building BI infrastructure, business users have lost confidence in IT's ability to deliver the information they need to make decisions. Business units drive analysis and performance management initiatives, mainly using spreadsheets to create dashboards full of metrics, plus analytic and packaged business applications to automate the process. Business units will increase spending on packaged analytic applications, including corporate performance management (CPM), online marketing analytics and predictive analytics that optimise processes, not just report on them.

“By making purchases independently of the IT organisation, business units risk creating silos of applications and information, which will limit cross-function analysis and add complexity and delay to corporate planning and the execution of changes,” said Mr Rayner. “IT organisations can overcome this by encouraging business units to use existing assets and by creating standards for purchasing classes of packaged analytic applications that minimise the impact of isolated functions.”

By 2010, 20 per cent of organisations will have an industry-specific analytic application delivered via software as a service (SaaS) as a standard component of their BI portfolio
Information aggregators will increasingly rely on SaaS to deliver domain-specific analytic applications built from the industry data they collect, shifting the balance of power in the BI platform market in their favour. Companies will only share their data with aggregators that can guarantee security and confidentiality, so, while hundreds of information aggregators offering SaaS analytic applications will emerge, a virtual monopoly will persist within each vertical niche because of the high barrier to entry.

“IT leaders should work with business users to identify the information aggregators in their industry and plan to incorporate a manageable number into their BI and performance management portfolio,” said Kurt Schlegel, research vice president at Gartner. “They should work with the information provider to ensure the information tapped by the SaaS analytic application can be integrated into their internal data warehouses.”

In 2009, collaborative decision making will emerge as a new product category that combines social software with BI Platform capabilities
The emergence of social software presents an opportunity for savvy IT leaders to exploit the groundswell of interest in informal collaboration. Instead of promoting a formal, top-down decision-making initiative, these IT leaders will tap people’s natural inclination to use social software to collaborate and make decisions.

“Social software allows users to tag assumptions made in the decision making process to the BI framework,” said Mr Schlegel. “For example, in deciding how much to invest in marketing a new product, users can tag the assumptions they made about the future sales of that product to a key performance indicator (KPI) that measures product sales. The BI platform could then send alerts to the user when the KPI surpassed a threshold so that the decision makers know when an assumption made in the decision-making process no longer holds true. This approach dramatically improves the business value of BI because it ties all the good stuff BI delivers (e.g. analytical insights, KPIs) directly to decisions made in the business.”
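
As a purely illustrative sketch of the mechanism Mr Schlegel describes (the class, names and thresholds below are invented, not taken from any vendor's product), a decision can carry the assumption it rests on as a KPI threshold, and the BI layer can notify the decision makers when the KPI no longer supports it:

```python
# Toy model of tagging a decision's assumption to a KPI and alerting on it.
from dataclasses import dataclass, field

@dataclass
class Decision:
    description: str
    kpi_name: str
    assumed_minimum: float                      # the assumption baked into the decision
    stakeholders: list = field(default_factory=list)

    def check(self, kpi_value: float) -> None:
        """Alert stakeholders once the tracked KPI breaks the assumed threshold."""
        if kpi_value < self.assumed_minimum:
            for person in self.stakeholders:
                print(f"ALERT to {person}: decision '{self.description}' assumed "
                      f"{self.kpi_name} >= {self.assumed_minimum}, but it is now {kpi_value}")

launch_budget = Decision(
    description="Invest $2M in marketing the new product",
    kpi_name="monthly_product_sales",
    assumed_minimum=10_000,
    stakeholders=["cmo@example.com"],
)
launch_budget.check(kpi_value=7_500)   # assumption no longer holds, so an alert fires
```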

By 2012, one-third of analytic applications applied to business processes will be delivered through coarse-grained application mashups
Businesses should not trust their megavendor to solve all their integration problems. Vendors move slowly to integrate the disparate code bases they have acquired. Reliance on one vendor also limits the ability to use best-of-breed capabilities and weakens the buyer’s negotiating position. At the same time, business units do not care about grand visions for service-oriented architecture (SOA), such as assembling composite applications by weaving together fine-grained services.

“IT leaders in Type A organisations who want to link analytics with business processes should use coarse-grained mashups of existing operational and analytical applications,” said Mr Schlegel. “Today, most use portals to integrate operational and analytical applications, but portals simply put the operational and analytical views side by side. Coarse-grained mashups overlay analytical insights, such as queries, scores, calculations, metrics and graphs, onto the graphical user interface of the operational application.”

“The current economic crisis shows the importance of trust and transparency in the information that organisations use to run their business. Integrate the analytical insights derived from this information into the decision-making processes throughout the company,” concluded Mr Rayner.

More information can be found in the report “Predicts 2009: Business Intelligence and Performance Management Will Deliver Greater Business Value”, available on Gartner’s website at http://www.gartner.com/DisplayDocument?ref=g_search&id=842713&subref=simplesearch

About Gartner Business Intelligence Summit 2009
Gartner analysts will further discuss BI market dynamics at the annual Gartner Business Intelligence Summit 2009 in The Hague, Netherlands, 20-22 January. To register, please contact Holly Stevens, Gartner PR, on +44 (0)1784 267738 or at holly.stevens@gartner.com. For further information on the Summit, please visit www.europe.gartner.com/bi

Gartner, Inc. (NYSE: IT) is the world’s leading information technology research and advisory company. Gartner delivers the technology-related insight necessary for its clients to make the right decisions, every day. From CIOs and senior IT leaders in corporations and government agencies, to business leaders in high-tech and telecom enterprises and professional services firms, to technology investors, Gartner is the indispensable partner to 60,000 clients in 10,000 distinct organizations. Through the resources of Gartner Research, Gartner Consulting and Gartner Events, Gartner works with every client to research, analyze and interpret the business of IT within the context of their individual role. Founded in 1979, Gartner is headquartered in Stamford, Connecticut, U.S.A., and has 4,000 associates, including 1,200 research analysts and consultants in 80 countries. For more information, visit www.gartner.com.

-----------------------------
BY mincho2008
Source:PR-CANADA.NET

© 2009 PR-CANADA.net.

Industry is pushing clouds, but the linings may not be silver for all

The fad-mad high-tech industry is touting the idea of "cloud computing" as if it's the answer to the business community's prayers for cheaper, easier computing, but not everyone is convinced.
##CONTINUE##
While cloud computing is the craze of the moment, the idea has been around for years. Instead of buying software packages to install on their own computers, customers rent computer software, processing power and data storage from a service provider's servers and access it on the internet. Employees browse to a URL, log in and work online.

The software running in the cloud — from a simple word processor or spreadsheet to a complex corporate application — can be accessed over the internet using a small laptop or even a cellphone.

The concept of cloud computing has taken such a firm grip on the imagination of the technology industry that in October analysts at Gartner Inc. predicted that despite the shaky economy, online software revenue will surpass $6.4-billion US in 2008. The market is expected to more than double by 2012, with annual revenue reaching $14.8-billion.

"Cloud computing is the story of our lifetime," Google CEO Eric Schmidt told IBM's 2008 Business Partnership Leadership Conference in Los Angeles. "Eventually, all devices will be on the network."

Part of the appeal of working "in the cloud" is that business can outsource the expensive headache of managing and maintaining the IT plumbing — both the software and the systems that run it — to a tech company (sometimes called an application service provider or ASP) that specializes in handling computing infrastructure. In budgetary terms, this turns capital expenses into operating expenses.

It's also cheaper than buying a software licence for every employee's computer, and easier on corporate IT departments, which no longer have to do software upgrades themselves. Moreover, the idea can reduce or eliminate the need for ever-growing corporate storage networks, since documents and files can be stored on the service provider's servers.

The idea of software, processing power and storage rented and used online also means ever-more-powerful personal computers are no longer necessary as the bedrock of a modern office. Customers often need only a basic computer with an internet connection to access programs running on servers in the cloud, which reduces the need for regular hardware upgrades. And program maintenance — updates, security patches and so on — is handled by the service provider.

All of this can cut corporate computing costs dramatically, supporters say, and in these perilous economic times companies should jump at any opportunity to save money.

Cloudy issues

At least that's the image presented at four major technology conferences in recent weeks: the Windows Professional Developers Conference, the TechEd EMEA Conference, the Web 2.0 Summit and the Windows Hardware Engineering Conference. In geek-speak, conference-goers were promised cloud computing would be a "paradigm shift."

The trouble is that some fear this silver cloud — so called because engineers like to draw a cloud to represent cyberspace — may have a dark lining.

First, there are concerns over data security and privacy. In cloud computing, data can be scattered over servers in different countries, and those countries might have less respect for the privacy or security of foreign data than would be acceptable to the customer.

Ray Ozzie, Microsoft's chief software architect, told CNET that "cloud computing is ultimately going to be 'do you trust this provider to have more to lose than I have to lose as a company if they mess me up?'"

Savvy corporate IT managers also fear the tech giants that control much of the cloud computing market — such as Salesforce.com Inc., IBM Corp., Apple Inc., Sun Microsystems Inc., Google Inc. and Microsoft Corp. — might offer services at appealing prices initially, but then cash in on the customer's commitment to a proprietary system over the long term. It's a practice called vendor lock-in.

In other words, once a business is running its programs and storing its information with a service provider, it becomes very difficult to move to a different provider. Customers may find themselves over a barrel if the provider starts jacking up its fees.

The ability to lock in customers more tightly is one of the reasons the world's top software makers are embracing this new approach to marketing their products. Microsoft has even taken to referring to cloud computing as its "post-Windows" era. In October it launched its Windows Azure Services Platform, a set of cloud-hosted services for developers; soon after, it announced a new generation of Windows Live.

In late October, Microsoft CEO Steve Ballmer sent out an e-mail to select customers summarizing his view of the change the cloud will bring. "Why can't we easily access the documents we create at work on our home PCs?" he wrote. "Why isn't all of the information that customers share with us available instantly in a single application? Why can't we create calendars that automatically merge our schedules at work and home?"

IBM, meanwhile, is building a ninth cloud-computing data centre, and computer maker Dell is focusing on offering products to the largest cloud-computing providers.

Amazon.com's Web Services offering has expanded to include the Elastic Compute Cloud (EC2), in which paying customers rent computers from Amazon.com to run their own computer applications. Google's App Engine is aimed at the same market, except that instead of renting virtual machines, developers run their applications on Google's managed platform.

Salesforce.com, the poster child for cloud computing, is expanding its online services so its customers can build, customize and run applications. Sun Microsystems, the struggling server maker, is banking on cloud computing to put wind into its sales as it revamps Network.com, its collection of online applications. Even Apple's iPhone was designed as a highly portable version of a local computer that works with data in the cloud.

Alternative approaches

With all this activity around cloud computing from major software vendors, the fear of vendor lock-in among potential customers is palpable.

Companies such as Adobe Systems Inc. and VMWare Inc. are aware that customers are skittish, and they're using the issue to gain marketing leverage for alternative approaches.

At a San Francisco conference in October, Adobe chief technology officer Kevin Lynch warned of the "balkanization" of the web among competing cloud-based computing technologies. Adobe is promoting the concept of online computing using its generic Flash player as the means of interacting with a company's existing servers rather than having third-party service providers host data and programs. Instead of running applications "in the cloud" through service providers, Adobe wants its customers to deal with their clients directly through customized online Flash programs that blur the line between the internet and the personal computer.

Adobe boasts about the success of Flash, the engine behind 80 per cent of web-based animations - company bigwigs like to bicker publicly about whether the Flash player is installed on 98 or 99 per cent of the world's computers. Flash applications can reside on either an employee or client's computer, or on a corporate server, and they don't require a specific operating system.

Companies would create their own Flash programs as a front end for their databases, taking advantage of the cloud-computing-like benefit of programs that can be accessed over the internet by low-powered desktops and laptops — but without giving up control of their data to a service provider.

"We just want to offer them the tools to do it themselves," Adobe vice-president Jim Guerard said.

The company has developed a platform called Adobe Integrated Runtime (AIR) designed to replace the browser with Flash-based desktop applications that can be run on Windows, Apple and soon Linux systems. (Like the Flash player, AIR is free; the company makes its money selling software development kits to programmers.) AIR-based Flash applications can offer features such as drag-and-drop to and from the user's own file system, access to the personal computer's clipboard for cutting and pasting information between applications, and encrypted local storage.

Paul Maritz, once third in line behind Bill Gates and Steve Ballmer at Microsoft before he jumped in July to virtualization company VMWare Inc., also favours an approach where companies run their own servers.

VMWare's ambitious and complex project is called the Virtual Data Centre Operating System. The virtual machine basically allows people to access computing resources on the network more easily and makes more efficient use of in-house computer servers.

"We are in a big transition from a device-centric world to an information-centric world," Maritz recently told Newsweek. "It's going to be about how do you make the information useful and available and make that the centre of people's lives instead of specific devices.

"Devices will have to cleave to the information rather than the other way around. IT infrastructure, the plumbing, will fade away for most users and businesses, and will increasingly be left to professional providers."

Open source

Not surprisingly, another familiar group is interested in cloud computing: the open source community.

This group has had success with its Linux operating system, and has three major cloud systems in development:

  • Hadoop, a Java-based system for data-intensive applications using large clusters of computers (IBM's Blue Cloud uses Hadoop, which is supported by Yahoo, Hewlett-Packard and Intel);
  • Eucalyptus (Elastic Utility Computing Architecture for Linking Your Program To Useful Systems), a cloud computing platform for research and testing;
  • 10gen, designed to help developers make their own cloud services.

And on Dec. 1, a company called Good OS (creators of a form of Linux called gOS), announced a new operating system called Cloud that boots computers straight into a browser. The company is aiming to load Cloud along with Windows XP in ultra-portable "netbook" computers in early 2009.

Momentum builds

Despite the concern of some customers over issues such as security and vendor lock-in, momentum is clearly building around cloud computing among the companies that develop software.

By 2012, for example, Gartner estimates that Web-based freeware such as Google Apps, Adobe Buzzword, ThinkFree and Zoho will account for a nine per cent market share of total software revenue.

With so many players vying for a chunk of the action in the cloud, a storm is surely brewing in the corporate software market.

-----------------------------
BY Jack Kapica
Source:CBC News

Copyright © CBC 2009

Search Engines 101: Paid Vs. Natural Search

When people become involved in Internet marketing, one of the issues that comes up is the difference between natural and paid search. Both strategies have their pros and cons, and both can be very effective as part of a marketing strategy.
##CONTINUE##
Paid search is when your ad shows up at the top of a Google search or along the right-hand side of the results page. These are called 'sponsored ads' and are paid positions. Every time anyone clicks on one of those ads, the owner of the ad pays Google. This is also known as "pay-per-click" advertising. The amount you pay is determined by several factors, including what you're willing to pay every time someone clicks on your ad.

Natural search refers to the links and descriptions that show up on the left-hand side of the results page when you type in a keyword. You wind up on the first page by having relevant content on your Web site and links to your site from other relevant sites. Getting to page one can be a long process. There are many companies that claim to be able to get you on the first page of Google. That may be true if the search term is very specific and no one else would ever search for it but you, or if they're using a "black hat" method that could get your site banned from the search engine.

Search engines such as Google, Yahoo and MSN are really just databases. When you do a Google search you're not searching the Web, you're searching Google’s database. There are two ways to get into these databases. One is to submit your site to the different search engines. Within six to nine weeks the search engine will index your site, capturing key elements from the code on your pages and from your content and storing them in the database. When a user types a keyword into a search engine, the engine's algorithms determine which links should be displayed for that search.

The other way to get added to a search engine's database is to have its software find you through a link from another Web site back to yours. The software, known as a "spider," will periodically "crawl" your site to see if you've updated it.

One important thing to know is that each page on your site is indexed individually, and each page stands on its own. Rankings are based on a combination of correct meta tags, content relevant to the keyword you're trying to rank for, and link popularity.
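To make that concrete, here is a toy sketch in Python of the basic idea: capture each page's title, meta tags and body text into a "database," then rank pages for a keyword by content matches plus link popularity. The pages, URLs and scoring weights below are invented purely for illustration; real search engines are vastly more sophisticated.

    # A toy index-and-rank sketch. Illustrative only: pages, tags and weights are made up.
    from html.parser import HTMLParser

    class PageParser(HTMLParser):
        """Collects the title, meta description/keywords and body text of one page."""
        def __init__(self):
            super().__init__()
            self.title, self.meta, self.text = "", "", []
            self._in_title = False

        def handle_starttag(self, tag, attrs):
            attrs = dict(attrs)
            if tag == "title":
                self._in_title = True
            if tag == "meta" and attrs.get("name") in ("description", "keywords"):
                self.meta += " " + attrs.get("content", "")

        def handle_endtag(self, tag):
            if tag == "title":
                self._in_title = False

        def handle_data(self, data):
            if self._in_title:
                self.title += data
            else:
                self.text.append(data)

    def index_page(html):
        """Capture the key elements of a page, as a crawler would, into one record."""
        p = PageParser()
        p.feed(html)
        return {"title": p.title, "meta": p.meta, "body": " ".join(p.text)}

    def rank(pages, inbound_links, keyword):
        """Score pages by keyword matches in title, meta tags and body, plus links."""
        kw = keyword.lower()
        scores = {
            url: 3 * doc["title"].lower().count(kw)
                 + 2 * doc["meta"].lower().count(kw)
                 + doc["body"].lower().count(kw)
                 + inbound_links.get(url, 0)        # crude "link popularity"
            for url, doc in pages.items()
        }
        return sorted(scores, key=scores.get, reverse=True)

    pages = {  # two made-up pages standing in for the search engine's database
        "example.com/widgets": index_page(
            "<html><head><title>Blue widgets</title>"
            "<meta name='description' content='Handmade blue widgets'></head>"
            "<body>We sell blue widgets.</body></html>"),
        "example.com/about": index_page(
            "<html><head><title>About us</title></head><body>Our story.</body></html>"),
    }
    print(rank(pages, {"example.com/widgets": 5, "example.com/about": 1}, "widgets"))

Running the sketch lists the widgets page first, since it matches the keyword in its title, meta tag and body and has more inbound links.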

The more easily a search engine can index the site and read the meta tags and content, the better. Problems arise when a site is built in Flash with little text content, because search engines cannot read or index Flash sites. Likewise, if the bulk of the relevant content is in PDF format, search engines cannot read the documents. If the search engines cannot index the relevant text, there won't be any rankings.

The bottom line is this: paid search means you pay for your position. The benefit is that if you have the funds, you'll get instant traffic; stop paying, and the traffic dries up quickly. Natural search is free traffic, but it's built over time. The advantage is that, done right, it can provide visitors for a long time to come.

Search traffic (paid or natural) is the BEST traffic to have because folks who are specifically looking for what you have are finding you. It doesn't get any better than that.

-----------------------------
BY Terry Stanfield
Source:ecommerce-guide.com

Terry Stanfield is a search-engine marketing (SEM) consultant with over 15 years of sales and marketing experience. His company, Clickadvantage, manages PPC and SEO efforts for his lead generation and ecommerce clients.

Adapted from Webreference.com.

Copyright 2009 Jupitermedia Corporation All Rights Reserved.

Muglia on the cloud, Azure and the economy

A long time ago, Bob Muglia worked on a Microsoft project designed to offer a variety of services in the cloud. That effort, known as Hailstorm, didn't exactly take off, and Muglia's career took a detour.

But both Muglia and Hailstorm are back. On 5 January, Microsoft elevated Muglia to divisional president, a recognition of the success he has enjoyed as head of Microsoft's server-software business.

As for Hailstorm, the name is gone, but many of the concepts are back as part of the Windows Azure platform that Microsoft announced in October. Ina Fried of ZDNet UK's sister site, CNET News.com, had the chance to talk with Muglia about Windows Azure, the cloud in general, as well as the economy.

Is this supposed to be a slow-motion rollout with Azure?
The way I sort of describe it is, it'll be phased — there's a whole broad set of services. You'll see some of those services go to production next year; exactly what and when, we're still working through. People are able to begin to develop right now, of course, on it, but it will happen over a period of time.

And the other thing right now is, people are still very much kicking the tyres. We have quite a bit of tyre kicking going on, a lot of people provisioned on the services right now and, so far, things have been going well.

What are the kinds of things that you think people will want to run in Azure?
In terms of the classes of applications, I think you'll see two initial ones, though it's fair to say people may have an interest in running any application in this environment that they would want to run on-premises. But the initial ones, I think, would be your web-style applications, which tend to be internet-connected and need geo-distribution.

The other class I think is really interesting is anything that involves working in partnership with others: supply chain sorts of applications, business-to-business, Electronic Data Interchange, those sorts of classes of applications in which you need to connect multiple organisations, and you need to deal with authentication, and you need to deal with network connectivity.

Today it's very complex, with virtual private networks and password management and a whole nasty set of problems to deal with. Azure has some built-in services to simplify those things, including a service bus to go through firewalls and connect things over the internet, and services that let people authenticate across systems. Those are basics that are fundamental, that everyone will really need in this kind of environment.

So, I think you'll see those sorts of things emerge initially, but then you could just imagine all sorts of things. You could imagine people using it for (high-performance computing) applications. That's an area we're looking at, and we certainly are having conversations with a number of academic and other organisations.

In terms of the movement toward the Microsoft-hosted versions of its server products, are there some interesting things that you have come across, as you've had to do the work to get ready for that?
There's a ton. I mean, one is, you need to move so that everything works across the internet, which is just the right thing to do, anyway. Another thing you see is the need to have what we call multi-tenancy. So, to get scale on these things, you can't be dedicating even a virtual machine to a company. You need to be able to support many users, many organisations within a single instance of an application, and so that's an attribute.
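As a purely illustrative sketch of that multi-tenancy idea (one application instance and one database serving many organisations), the Python fragment below scopes every read and write by a tenant identifier. The class, table and tenant names are invented; this is not Azure or Microsoft code.

    import sqlite3

    # Illustrative only: one shared application instance, many organisations.
    # Every query carries a tenant_id so customers never see each other's rows.
    class OrderStore:
        def __init__(self, path=":memory:"):
            self.db = sqlite3.connect(path)
            self.db.execute(
                "CREATE TABLE IF NOT EXISTS orders (tenant_id TEXT, order_id TEXT, total REAL)")

        def add_order(self, tenant_id, order_id, total):
            self.db.execute("INSERT INTO orders VALUES (?, ?, ?)",
                            (tenant_id, order_id, total))

        def orders_for(self, tenant_id):
            # The tenant filter is applied on every read; nothing runs unscoped.
            return self.db.execute(
                "SELECT order_id, total FROM orders WHERE tenant_id = ?",
                (tenant_id,)).fetchall()

    store = OrderStore()
    store.add_order("contoso", "A-1", 99.0)
    store.add_order("fabrikam", "B-7", 12.5)
    print(store.orders_for("contoso"))   # only Contoso's rows come back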

There's a whole set of really interesting regulatory things that you hit when you go across the world attached to this. Turns out, some of our products, like unified communications, have VoIP capabilities. Well, you go and take that to many countries of the world, and they call those telephone companies. They call you a telco and, all of a sudden, there's a conversation about being regulated like a telco, you know, in some countries around the world. That may or may not be pleasant.

There are issues about data and where data can reside and not reside, and so that's why when you spread geographically around the world, there's a wide variety of new issues that open up that are quite, you know, quite interesting — billing issues, because obviously, there are different issues with the way the banking systems in different countries work.

Let's talk about the economy. What are you seeing when you talk with customers?
People are afraid. I mean, I think we're all a bit afraid, at some fundamental level, because we don't know — no-one knows where this is going to land in the long run.

No-one really is clear as to how far the contraction is going to go and how long it's going to happen, and then there's a lack of clarity also as to how we get through this. Are we going to be tight six months of the year, then boom? Well, maybe. That would be kind of a good viewpoint of things right now. Or is it going to be a longer period of time, with a medium period of time with sort of a slow growth?

I sort of always come back to a belief that the fundamentals will drive all of these things and, ultimately, it means people have to produce things that others value that helps to drive the overall society forward and, you know, generate something that is of sustainable long-term value. Ultimately, one of the key things is, how can we make companies and individuals more productive and able to work together better?

So, I guess I have two questions somewhat related to that. One is, how about for you? As a business manager, obviously, you manage a fairly large business. What are the things that you might have done, had the economy continued, that you're not doing now? What are some of the things that remain priorities, and what are some of the things that you're going to let happen slower?
Well, certainly, there's no question that Microsoft's not immune to the circumstances. There's no question about that. So we have slowed our growth. We are still growing as a company, and [the server and tools business] will grow overall this year, though I admit that we did most of our growth in Q1. We actually were incredibly successful in bringing a lot of folks on in Q1, so we would have had to slow [hiring] under any circumstances because we're out-achieving our plan, but we've slowed considerably.

So if you take some of the areas in the database space, like some of these areas around business intelligence and data analysis, we're actually investing in some of those areas. But we're taking resources off some things that don't have the same kind of results and long-term potential for us to have returns, one of which was pretty public recently: OneCare, where we, you know, decided to refocus that effort into a much more narrowly focused free antimalware offering instead of providing a broader suite.

Are there other things about which you, as a business leader, are saying, this is going to have to wait a little or move slower?
Yeah. I mean, there certainly are. I think that we've looked inside, at what we're doing really in almost every one of our groups. If you look at almost every one of the things that we're doing and say, OK, there's a set of things we want to do in management, let's tighten the belt a little bit, as to where we're going. Yet we're continuing to invest in this whole virtualisation and management space, coherence with Azure, all those sorts of things we're continuing to invest in.

So in each one of our business areas, we've looked at how we could reallocate and refocus, and then across the board, we've made some fairly fundamental shifts like we did with OneCare.

-----------------------------
BY Ina Fried, CNET News.com
Source:ZDNet

Copyright © 2009 CBS Interactive. All rights reserved.

Facebook: what future for social networks?

Facebook has become the 'vanilla' social site millions use as readily as e-mail, while MySpace and YouTube dominate the consumer media, and LinkedIn the business market. That Facebook, like e-mail, is 'beyond fashion' may be key to its success: roughly half of its users log on every day, says founder and CEO Mark Zuckerberg, who established the company in 2004 while a student at Harvard.

Like the World Wide Web itself, Facebook has its roots in academic communication, but from its student beginnings just five years ago the Palo Alto-based company has grown to employ over 700 people - it is aiming for 1,000 or more in 2009 - and achieved 2008 revenues estimated at $300 million. So is the Facebook revolution unstoppable?

1 The social graph

The about-face in behaviour that social networking has inspired has been rapid and extraordinary. Ten years ago, the idea that millions of people would voluntarily share their private contacts with the world would have been unthinkable, especially in business. Facebook and its rivals trade on what we can term 'mass proximity' to - and privileged insight into - an individual's world. Beyond that, of course, Facebook and its peers are powerful, stable platforms for application development.

Facebook says it aims to be "the social graph" of the 21st century. Its usage has clearly become integrated into many people's lives, notably in the UK where its adoption is nearly three times that of its main rival, MySpace.

However, the exact number of global users is unclear: at the November 2008 Web 2.0 forum in San Francisco, Facebook claimed over 120 million users (18.4 million in the UK), while MySpace was quoted as having 118 million users globally (7.8 million in the UK). In January 2009, however, Zuckerberg announced that passing 150 million users had been a "significant milestone" for the company.

2 Growth but at what cost?

If all those figures are correct, they suggest Facebook is growing at over 10% a month, but this should be taken with a pinch of salt: duplicate profiles are legion, while TechCrunch traffic statistics put the company's US growth rate at just 3.8% and slowing. ComScore statistics put MySpace ahead in the US and worldwide, but suggest Facebook will become the largest social network by 2010.

It is not just about personal profiles, however. Well over 130,000 businesses make either formal or informal use of Facebook as a networking tool.

Despite the impressive uptake, the revenue models for it and other social networks remain unclear. In November, analysts at IDC found most users ignore adverts on social sites, despite the fact that more than 75% of them visit at least once a week and 61% spend over 30 minutes online each visit.

Social-network adverts also generate fewer click-throughs - 57% versus the web average of 79% - and only 11% of those translate into purchases (compared with 23% elsewhere). IDC found that only 3% of users would be willing to expose their private networks to advertisers, and so the ability to create any leverage from those millions of personal profiles remains untapped.

3 A measurement of true success

Put another way, 150 million users and $300 million in revenue means two dollars per user per annum. Of course, $300 million revenue after five years' trading is an astonishing achievement, but the underlying story reveals the social networks challenge: how to persuade all those users - as opposed to business partners - to spend money? Or does Facebook even care?

At the 2008 Web 2.0 forum, Facebook and MySpace took to the conference stage. MySpace CEO Chris DeWolfe saw expanding advertising as his goal, while Zuckerberg identified aggressive global expansion. "Growth is a strategic thing for us," said Zuckerberg. "We're not as focused on optimising revenue. People have said we're not thinking about it, but that's completely wrong. We have thousands and thousands of advertisers coming to the site and reaching people."

One part of Facebook's underlying business model was revealed just two days earlier at the annual Dreamforce summit of software as a service (SaaS) firm, Salesforce.com. The two companies announced a suite of tools to marry Salesforce.com's business productivity applications to what was described as the "interpersonal power" of social networks.

According to Facebook's COO Sheryl Sandberg: "Facebook's users are always eager to try new applications that can improve their ability to connect and share in a trusted environment... our work will give Salesforce.com's 100,000 developers the tools to create and deliver a new class of business applications for Facebook's 120 million [in November 2008] active users."

In other words, if businesses have both a stable platform and the community that platform serves, they can build native applications. Zuckerberg is betting that Facebook-style behaviour will spread to mainstream business.

-----------------------------
BY Chris Middleton
Source:ComputerWeekly.com

© Reed Business Information Ltd

Gaming Software Has Best Month Ever in December

So maybe video games are recession resistant after all.

December video game software sales jumped 15% from the same month last year, according to data released this afternoon from research firm NPD. Hardware sales were less robust, growing 2%, while accessories were up 8%. Overall, industry revenues totaled $5.29 billion in December, the industry’s best single month ever.

In the console hardware category, the Nintendo (NTDOY.PK) Wii dominated, selling 2.15 million units and beating the Microsoft (MSFT) Xbox 360, which sold 1.44 million units. Sony (SNE) sold 726,000 PlayStation 3 consoles in the month, and 410,000 PlayStation 2 units. In handhelds, Nintendo sold 3.04 million DS units, compared with 1.02 million Sony PSPs.

Also note that Nintendo had five of the top 10 best-selling software titles. Here’s a look at the month’s top 10 sellers:

  1. Wii Play w/Remote, Wii, Nintendo, 1.46 million units.
  2. Call of Duty: World At War, Xbox 360, Activision, 1.33 million.
  3. Wii Fit w/Balance Board, Wii, Nintendo, 999,000.
  4. Mario Kart w/wheel, Wii, Nintendo, 979,000.
  5. Guitar Hero World Tour, Wii, Activision, 850,000.
  6. Gears of War 2, Xbox 360, Microsoft, 745,000.
  7. Left 4 Dead, Xbox 360, Electronic Arts, 629,000.
  8. Mario Kart, DS, Nintendo, 540,000.
  9. Call of Duty: World At War, PS3, Activision, 533,000.
  10. Animal Crossing: City Folk, Wii, Nintendo, 497,000.

Nintendo also dominates the list of the best-selling games for all of 2008:

  1. Wii Play w/Remote, Wii, Nintendo, 5.28 million units.
  2. Mario Kart, Wii, Nintendo, 5 million.
  3. Wii Fit w/Balance Board, Wii, Nintendo, 4.53 million.
  4. Super Smash Bros: Brawl, Wii, Nintendo, 4.17 million.
  5. Grand Theft Auto IV, Xbox 360, Take Two, 3.29 million.
  6. Call of Duty: World at War, Xbox 360, Activision, 2.75 million.
  7. Gears of War 2, Xbox 360, Microsoft, 2.31 million.
  8. Grand Theft Auto IV, PS3, Take Two, 1.89 million.
  9. Madden NFL ‘09, Xbox 360, Electronic Arts, 1.87 million.
  10. Mario Kart, DS, Nintendo, 1.65 million.
-----------------------------
BY Eric Savitz
Source:Seeking Alpha

Technology firms in the recession - Here we go again

It cannot defy gravity, but the technology industry is faring better than it did in the previous downturn

THE news may turn out to be no more than rumour, but it is telling nonetheless. To cut costs, several blogs recently reported, Microsoft and IBM would each soon get rid of about 16,000 employees, 17% and 4% of their respective workforces. If true, these would be some of the biggest cuts in the history of the information-technology (IT) industry.

That such cuts are deemed credible is a sign of the industry’s plight. Hardly a day passes without reports of collapsing revenues and workers being laid off. This week Motorola said it would cut 4,000 jobs, and Seagate, a maker of hard disks, said it would reduce its staff by 800. The earnings season is likely to bring even more bad news. As The Economist went to press, Intel, the world’s largest chipmaker and an industry bellwether, was expected to report a drop in fourth-quarter revenues of more than 20% compared with a year earlier. Is the industry heading for a worse downturn than the one that followed the internet crash in 2001?

That would be quite a feat. In America, for instance, technology spending grew by nearly 16% in 2000, only to contract by 6% in 2001. “The IT industry simply imploded,” says Matt Asay, an industry veteran and executive of Alfresco, an open-source software firm. “It felt like the sector’s reason for being had disappeared.”

This time things are not yet that bad—and are unlikely to become so. In spite of the string of bad news, some forecasters still expect global IT spending to grow this year, at least when you allow for currency fluctuations. According to Forrester Research, a market-research firm, technology purchases will decline by 3% in 2009 when counted in dollars (see chart). But the dollar’s relative strength weighs heavily on the results of American firms by devaluing their foreign revenues. When measured in a basket of local currencies, weighted for each region’s share of the global IT market, Andrew Bartels of Forrester expects an increase of 3%.
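To see how the same forecast can be a decline in dollars yet growth in local currencies, here is a small hypothetical calculation in Python. The regional weights, growth rates and exchange-rate moves below are invented for illustration and are not Forrester's figures.

    # Hypothetical numbers only: dollar strength can turn local-currency growth
    # into a dollar-denominated decline. Not Forrester's actual data.
    regions = {
        # share of global IT spend, local-currency growth, currency move vs USD
        "North America": (0.40, 0.01,  0.00),
        "Europe":        (0.30, 0.03, -0.10),   # euro assumed to weaken 10% vs USD
        "Asia-Pacific":  (0.20, 0.06, -0.05),
        "Rest of world": (0.10, 0.05, -0.12),
    }

    local = sum(share * growth for share, growth, _ in regions.values())
    dollar = sum(share * ((1 + growth) * (1 + fx) - 1)
                 for share, growth, fx in regions.values())

    print(f"Weighted local-currency growth: {local:+.1%}")   # about +3.0%
    print(f"Growth measured in US dollars:  {dollar:+.1%}")  # about -2.4%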

There are many reasons why spending is more robust than during the last downturn. For a start, the IT market has become more global. Between 2003 and 2008, developed countries’ share of IT spending fell from 85% to 76%, according to the OECD’s recently published Information Technology Outlook. Demand from China and India is expected to continue to grow despite the gloomy economic outlook.

More importantly, last time around the IT industry was not the victim of an economic crisis, but its cause, says Graham Vickery, author of the OECD report. For years companies had spent far too much on technology, buying more e-commerce software than they could ever hope to use, for example. When the bubble burst they abruptly cut spending. Today IT departments are much less prone to wasting money. In fact, says Mark Raskino of Gartner, another market-research firm, most are quite lean. Further cuts in technology budgets would be difficult, he argues, since they would require many firms to reorganise themselves first. “IT is certainly not sacrosanct, but fairly low on the list of things to cut,” he says.

Tech firms, for their part, are in much better shape. Venture capitalists may again have wasted money by investing in too many internet start-ups, this time labelled “Web 2.0”. But, in general, the industry’s big companies are better managed and have more cash on hand. Heavyweights such as HP, IBM and Oracle are probably best placed to weather the storm, thanks to their broad product portfolios and international presence. Software and services will do better than hardware or semiconductors. Other relative winners will be technologies that promise quick savings, such as videoconferencing gear and “virtualisation” software, which allows firms to get more out of hardware they already own.

Still, the IT industry cannot defy gravity, says Brent Thill, an analyst at Citi Investment Research. And this recession could end up worse than the last if it drags on. In 2001 the crisis consisted of a collapse followed by a quick recovery, he says. Today looks like an inexorable march downward. Mr Thill sees it as ominous that in his latest survey of 200 chief information officers on both sides of the Atlantic, nearly half said they had not yet set their budget for 2009.

Yet even if the plot is different, the consequences are likely to be similar. In many ways the previous IT downturn marked the industry’s coming of age. In its wake, the industry was no longer mainly about “hot” new technologies that made maximal use of Moore’s law, which holds that roughly twice as much processing power is available at a given price every 18 months. Firms have since started to opt more for good-enough “cold” wares, which save them money and allow for more flexibility: commodity hardware, open-source software such as the Linux operating system and programs accessed over the internet, or “software as a service” (SaaS).

The crisis will only speed up this shift, not least because many of the cold technologies have themselves become more mature. SaaS and other computing services supplied online, and collectively called “cloud computing”, have become better and more widespread. In November Salesforce.com, the largest SaaS firm, beat analysts’ expectations, reporting sharply growing revenues and profits.

And open-source software has long since moved beyond Linux. All kinds of enterprise software are now available in this form, which in most cases means that firms pay for maintenance services but the programs themselves are free. This business model already seems to be benefiting from the downturn. Revenues at Alfresco, which makes software that helps manage web content, tripled in the last quarter, for instance, according to Mr Asay.

Will the industry therefore emerge selling commodity products? Actually, commoditisation and standardisation are creating new platforms for innovation. Cloud computing, for example, should allow small firms and even individuals to outsource computer management rather than doing it themselves—something that is only an option for big firms at present—and open up new opportunities for the industry by allowing it to reach into new fields.

The rapidly growing mobile internet gives a glimpse of what is to come. Only a year ago, downloading software to an iPhone or other “smart” mobile device was only for technophiles. Now it is commonplace. Over 300m applications, including games and utilities, have been downloaded from Apple’s App Store alone.

Were it not for the recession, says Gartner’s Mr Raskino, the industry would actually be ready for another of its high-growth phases, which it tends to enter every eight years or so. Since IT investments are being held back now, he predicts, they will pick up quickly when the economy recovers, be it in 2010 or later. And given the industry’s notoriously short memory, today’s jitters will quickly be forgotten.

-----------------------------
From The Economist print edition

Copyright © The Economist Newspaper Limited 2009. All rights reserved.

2009 GPS Market Up 25% Despite Poor Semiconductor Outlook, Says IMS Research

The 2009 outlook for many semiconductor manufacturers is worsening by the day. Share prices have taken a severe hit, with millions shaved from companies' market capitalization over the last 12 months. However, IMS Research projects that the global positioning system (GPS) market will be one of the few areas of growth in 2009.

A recent report from IMS Research forecasts that the GPS market will grow by over US$200 million between 2008 and 2009. IMS Research analyst Tom Arran said: "2008 was the breakout year for GPS in mobile phones. In 2009 GPS will begin to penetrate into a range of vertical markets, such as cameras, laptops, UMPCs, sporting equipment and first responder radios. This will help to drive shipment growth of over 25% YoY".

"OEMs in these markets can use GPS to differentiate their product, while also drive new service revenue streams. Furthermore, location is emerging as a key component of future offerings from companies such as Google, Microsoft, Apple, Nokia, Intel, Mozilla and Ericsson. This will enable a host of new services across all key vertical markets, which in turn will further drive the uptake of GPS."

Despite forecasting a significant increase in revenue in 2009, the research firm believes that the best is yet to come. Arran goes on to say: "2009 will not be a booming year for GPS in portable devices. Looking beyond the current economic turbulence, IMS Research is forecasting the overall market for GPS to demonstrate a 21.2% CAGR between 2008 and 2013.

"There is still a lot of untapped potential and the GPS market needs to mature before breaking the 500 million units per year barrier. One of the more general issues is poor performance in challenging environments. GPS manufacturers need to start seriously considering hybrid location in their offering. The report forecasts the uptake of WLAN location in each of these markets, both as a competitive and complementary technology."

Arran believes the GPS market remains competitive despite a number of acquisitions by larger semiconductor companies. He said that standalone GPS manufacturers will need to specialize to monetize, as each vertical market requires different hardware considerations. By developing a diverse portfolio of solutions tailored to specific markets, they can spread risk across a number of OEMs in different vertical markets -- ideal in troublesome times.

-----------------------------
BY Nikkei Electronics Asia
Source:TechOn

Copyright © 1995-2009 Nikkei Business Publications, Inc. All rights reserved. All editorial content and graphics on this Web site may not be reproduced in whole or in part without the express permission of the copyright owner.

X86 Market Stalls. Will Servers Be Immune?

Market analysts say that the PC market (which includes x86 servers) performed worse than expected during the last quarter.

The economic optimism seen just last week at CES evaporated this week. Market research firm Gartner Inc. has announced that 4Q08 showed the worst growth in the PC market since 2002—in other words, since the last recession. The worldwide market experienced only a 1.1% increase over 4Q07.

Gartner's rival, IDC, did not disagree, and released figures showing that PC sales in the US fell 10 percent in 4Q08 compared to 4Q07.

This is of concern in this forum since both Gartner and IDC count x86 servers (the mainstay of the SMB server world) as PCs. If the PC market becomes unprofitable and stagnant, entry-level server development will lag.

In this case, Gartner and IDC agreed that Dell had noticeable problems competing, and while Dell hung on to the top spot in the US market, Hewlett-Packard ranked number one in the world market.

IDC said that Dell took a major pounding in the US market, with shipments falling 16.4 percent between 4Q07 and 4Q08. But even though its US market share fell from 30.7 percent to 26.5 percent it still held on to the top rung, beating Hewlett-Packard.

Gartner said that HP saw its worldwide shipments grow 12.7 percent, giving it 18.4 percent of the world market. Dell saw its worldwide shipments rise 11.5 percent, giving it 14.3 percent of the world market, on the rung just below HP.

But while things could be healthier, keep a couple of things in mind:

The first is that things could be a lot worse—no one is predicting any vendors are going to go out of business. The PC market is not the buggy-whip market. People still need PCs—lots of them. The only question is the timing of their purchases.

The second is that these figures represent the past—albeit the recent past—and optimism is about the future. Things could still get better. In fact, it's pretty much certain that they will eventually get better. Again, it's all a matter of timing.

-----------------------------
BY Lamont Wood
Source:bMighty

Copyright © 2008 United Business Media Limited, All rights reserved.