Top 10 disappointing technologies

Pretty much every new product gets hyped as a potentially disruptive technology these days, and usually nobody outside of the company's marketing department actually believes it.

Every once in a while, however, a product comes along that everyone from the executives to the analysts to even the crusty old reporters thinks will change the IT world. Sadly, they are often misguided.

Sometimes the product really does set the industry on its ear, but all too often it falls flat on its face. This week we look back at those that did the latter: potential game-changing products that fizzled out.

Honourable mention: Biometrics

Iain Thomson: Biometrics was supposed to be the magic bullet that solved all our security needs. Look in any film where they are trying to be futuristic or high tech and you'll see people getting their body scanned as a security measure.

However, the reality has proved less than we were promised. Fingerprint readers are in wide circulation, but these days they are easily fooled with cheap materials, or by more direct means. Taiwanese robbers reportedly cut the finger off a man whose car had a fingerprint ignition, something that led scanner manufacturers to install a temperature sensor in future models to prevent a repeat.

Facial scanning was also touted as foolproof, and then quickly found to be anything but. Even DNA fingerprinting is now being questioned, either because the chemistry is defective or because of the lingering possibility that an individual's DNA may not be unique. Hell, they still haven't proved that fingerprints are even unique.

Maybe one day we'll come up with the ultimate biometric solution but I have my doubts.

Shaun Nichols: One of the problems with biometrics is that people don't really want it.

As much as we love movies about cyborgs and futuristic bio-scanning systems, few people are comfortable with actually allowing machines to analyse and classify us on that sort of level. While locks that require a palm or thumb print are emerging for high-security applications, the 'big brother' implications of taking the technology to the masses are too much for most of us.

As Iain mentioned, there are also some rather unpleasant ways to thwart such systems. Beyond the finger incident in Taiwan, anyone who bothered to sit through the film 'Demolition Man' remembers the, well, 'creative' way in which Wesley Snipes got through the retinal scanning machine. If someone is determined to get into my place of work or residence, I'd rather they do so by picking the lock than by hacking off a body part.

Honourable mention: Ubuntu

Shaun Nichols: We're no doubt going to catch some flak for this one, but deep down even the hard-core evangelists will agree that Ubuntu has thus far been something of a disappointment. While Linux has definitely caught on in the enterprise server and database market, the open-source OS has never really been able to break into the wider desktop market.

Those who do use Linux as the primary OS for their home or work PC are still by and large tech-savvy users who comprise what used to be known as the 'hobbyist' market. The larger end-user crowd has not been able to warm up to Linux.

Ubuntu was supposed to change that. When the OS was launched, I remember all of my Linux-advocate friends predicting that this would be the product to make the jump and challenge Microsoft in the consumer and workstation spaces. Nearly five years after its release, Ubuntu remains popular amongst Linux users, but has yet to really pick up any sort of real momentum in the greater desktop OS market.

Yes, getting rave reviews from the Linux community is nice, but get back to me when the housewives and pensioners, not just the IT pros and college students, start dumping Windows for Ubuntu.

Iain Thomson: Shaun nearly killed me with this suggestion. He and I come up with these lists over lunch in the office, in a convenient room with decent soundproofing, and I'd just taken a mouthful of Vietnamese pork sandwich when he mentioned his desire to put Ubuntu on the list. I narrowly avoided the need for the Heimlich manoeuvre.

But the more he explained his position the more I came to agree. Maybe it was just the overenthusiastic marketing or the fanboys who swarmed to the system, but Ubuntu really was supposed to change everything, whereas the operating system landscape looks very much the same these days.

Don't get me wrong, I like Ubuntu and have it running on a home system. But unless a major manufacturer starts preinstalling it, it's going to be confined to the Linux enthusiast and hobbyist market.

10. Virtual Reality

Iain Thomson: Few technologies have promised so much and delivered so little.

I tried out one of the first VR units in the early 1990s. It felt futuristic, but lacked a certain something. Comfort, for one: you were encumbered with a massive VR helmet and a handheld grip for direction, and that was it. Watching the VR rig in the film 'Disclosure' I wondered what the writers had been smoking.

VR is going to be possible one day, but that day is not now. It was massively overhyped, so much so that when it proved itself to be pretty useless companies dropped it like a hot coal. The bursting of the internet bubble killed off most VR developers and the current economic climate is doing the same.

I suspect that VR using external hardware is a no-go for quite some time to come. Far more likely is the success of VR using wetware: direct implants into the cortex. However, that's decades away at the current pace of progress.

Shaun Nichols: Most sci-fi movies from the early-to-mid 90s have aged about as well as a head of lettuce, mainly because they were all so centred on virtual reality. Movies like 'The Lawnmower Man', 'Johnny Mnemonic' and 'Strange Days' all get a chuckle now for their depictions of people sitting around in goofy helmets and punching imaginary keyboards.

Like many things on this list, VR was a great idea that just didn't have many uses. Other than novelty stands at shopping malls, the gaming uses were pretty limited, and the business applications for VR in its 1990s state were almost zero. Even today, the VR concept is limited to a couple of screens and a good graphics card.

Perhaps some day when the required hardware and software is cheaper and more plentiful, someone will find some better uses for virtual reality. As for now, we seem to be doing just fine with a screen, mouse and keyboard.

9. Alternative search engines

Shaun Nichols: We're closing in on a decade now of Google's reign atop the search world, and at this point we've come to accept that the site is more or less the de facto way to search for information on the web. Yahoo these days is circling the drain, and although Microsoft continues to prop it up, MSN search has fallen far behind.

But it wasn't always that way. In the late 90s, search engines and services seemed to be popping up left and right. Services like Hotbot, Lycos, Northernlight and Alta Vista were all vying for market share and constantly working to top each other.

Then came the dot-com crash, and from the ashes Yahoo and Google emerged as juggernauts along with MSN. Lately, it's turned into a one-horse race, and as we learned from the reign of Internet Explorer, that is not a good thing in terms of innovation.

It is disappointing that there are so few, if any, services out there that are giving Google search a run for its money and really pushing the company to step up its game. As we are on the cusp of the release of Wolfram Alpha, here's hoping that it and other sites can really bring some competition back to the search world.

Iain Thomson: I'll admit I'm desperate for something to overturn Google in search technology. When Google went public I held off buying shares, and advised a few others to do the same, because I was convinced a better search technology would displace Google and the company was over-hyped. I was wrong, on the first count at least.

Since then we've seen some pretenders to the crown, some of whom lasted about as long as a snowflake in a blast furnace. Remember Cuil, the Irish search engine that would make Google look old? It died a quick death and I fear unless Wolfram Alpha is extraordinary it will suffer the same fate.

8. Voice recognition

Iain Thomson: I could rant about voice recognition for pages, because I really want it to work. Sadly, the fact remains that it doesn't.

The amount of processing power efficient voice recognition needs is enormous, and it takes a heavy toll on memory as well. At Intel's Nehalem launch the company showed off voice recognition and said that at last there was a powerful enough processor to handle such applications. I'd have believed it, had I not heard the same thing from Intel a decade before.

The fact is voice recognition needs a revolution in intelligent software as well. It's no good having the hardware to drive the application if the software is so poor. Names, regional dialects and the usual lubricating grease of language seem beyond current software. Voice recognition seems to be one of the also-rans in technology for the moment.

Shaun Nichols: Not only was voice recognition a huge disappointment, it is also downright irritating. So much so that we recently named it one of our most annoying technologies.

Part of the problem is hardware. Getting a voice recognition system that works reliably is still a very expensive and time-consuming task. And when it doesn't work reliably, it is downright useless. Anyone who has ever had to deal with the voice recognition software for the local bank's phone line or a taxi service knows just how tough this can be. Lord help you if you're trying to operate one and you have a non-local accent.

I can see voice recognition being critical for such things as handicapped access, so I definitely think the technology should be developed further; I just wish more people would hold off on using it until someone can get it right.

7. Apple Lisa

Shaun Nichols: Steve Jobs has had precious few 'misses' in the course of his career as an executive, but the Lisa still stands as one of his and Apple's greatest failures. And it's a shame, because the system really was quite impressive.

Back in the time period between Apple's infamous 'raid' of Xerox PARC and the launch of the Macintosh, there was a little system called the Lisa. Originally designed as Apple's first foray into the graphical user interface world, the Lisa was an all-in-one system encased in a bulky plastic box.

The Lisa truly was an impressive system for the time, to a fault, even. Owning Apple's masterpiece system would set you back US$10,000. Not surprisingly, the Lisa did not sell too well and the company was sent back to the drawing board to develop the Macintosh.

Though the Mac benefited from the falling price of components, most developer accounts also suggest that the Mac team had to leave out some of the Lisa's best features (such as true multitasking) in order to keep the cost of the Mac down.

Iain Thomson: Lisa couldn't charitably be described as an Apple invention per se; it was a straight steal of Xerox's Alto machine, which the company correctly assumed would be too expensive to sell in volume.

Nevertheless Jobs wanted to build one, and what Jobs wants he usually gets. So the Lisa was built, although the name caused something of a stink. Officially Lisa stood for Local Integrated Software Architecture, but this was apparently contrived after the fact so that Jobs could name the system after his daughter. This led to people calling it 'Lisa: Invented Stupid Acronym.'

But the Lisa had more than a stupid name; it had a stupid price tag. For the price of eight Lisas you could buy the average US house, and it would have been a brave IT manager who suggested buying such an expensive bit of kit, particularly as there was precious little software to run on the thing.

In the end Apple dumped nearly 3,000 Lisas in a landfill in Utah, such was the lack of demand.

6. 10 Gigabit Ethernet

Iain Thomson: We've been told that this is the year of mass 10 Gigabit Ethernet take-up for three years now, and it still isn't here yet.

The fact is that, apart from some industries, no-one's that keen. Sure, the data centre market and high-end computing have moved on, but it's a hard sell to get a business to rip out a network infrastructure that's already working fine just to boost speed.

Ethernet is an old technology, but there's nothing wrong with that. What is wrong is expecting businesses to pay for something they don't need. Gigabit Ethernet works just fine for most of us, and I suspect I'm not alone in holding off on a lousy ten-fold speed increase when 100 Gigabit Ethernet is already on the way.

Shaun Nichols: Some technologies always seem to be in 'wait until next year' mode. The problem is that in the IT world, technologies have a very short shelf life, and if a company or group waits too long on something, they will find themselves with a product that has gone from 'cutting edge' to 'obsolete' without ever having really made it into the market.

10 Gigabit Ethernet suffers from a pretty fundamental problem: for the overwhelming majority of situations, the Ethernet speed is not the biggest bottleneck. I know that in our office few people would notice a switch to 10 Gigabit Ethernet, as the overwhelming majority of network traffic is web access, which is slowed by the internet connection to a speed the current setup can easily handle.
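To put rough numbers on that bottleneck argument, here is a minimal back-of-envelope sketch in Python; the uplink speed and page size are assumed figures for illustration only, not measurements from any real office network.

    # Back-of-envelope sketch of the bottleneck argument above.
    # The link speeds and page size are assumed figures, not measurements.
    GBIT = 1_000_000_000  # bits per second in one gigabit

    links = {
        "Internet uplink (assumed 20 Mbit/s)": 0.02 * GBIT,
        "Gigabit Ethernet LAN": 1 * GBIT,
        "10 Gigabit Ethernet LAN": 10 * GBIT,
    }

    def fetch_time_ms(size_mb: float, bits_per_second: float) -> float:
        """Milliseconds to move size_mb megabytes over a link, ignoring protocol overhead."""
        return size_mb * 8_000_000 / bits_per_second * 1000

    page_mb = 2.0  # a roughly typical web page, again just an assumption
    for name, speed in links.items():
        print(f"{name}: {fetch_time_ms(page_mb, speed):.1f} ms per {page_mb} MB page")

    # The two LAN legs are already negligible next to the internet hop, so moving
    # from Gigabit to 10 Gigabit Ethernet barely changes what office users notice.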

I'm sure at some point everyone will want to bump up network speeds in the enterprise, but by then, will 10 Gigabit even be considered an option?


5. FireWire

Shaun Nichols: Yes, it's still in use, but FireWire has really become just a fraction of what it was expected to be.

When the standard first launched, USB 1.0 was still dominant, and it appeared as though FireWire would be the one to unseat SCSI as the high-bandwidth peripheral connection of choice. Apple even went so far as to build internal FireWire ports into many of its high-end systems, anticipating that the system would even be used for RAID setups.

Of course, that never happened. USB 2.0 came along and was good enough to power the vast majority of printers, cameras and portable drives. Meanwhile, internal technologies such as SATA have emerged. Outside of a few models of high-end video cameras, FireWire isn't seen much these days.

In the end, it seems USB's own industry 'connections' proved too much for FireWire's promising connector.

Iain Thomson: I don't think it was industry connections, to be honest, Shaun; it was force of numbers.

The USB port is ubiquitous: it's on pretty much every system in the world. It's used to connect everything from mice to external drives, and while it has its flaws (the connector is ridiculously vulnerable to lateral pressure, for example) it's something people knew and trusted.

FireWire was technically superior in terms of throughput speed, even after USB 2.0 came along, but very few people had the right connection port. I know of at least three people who purchased shiny new portable video recorders and were stuffed when they realised they'd have to upgrade their systems to support FireWire.

It's a story as old as technology; the best system doesn't always win out if the moderately good competitor has force of numbers.

4. Bluetooth

Iain Thomson: I was tempted to include all forms of short-range wireless communication in this one, but Bluetooth really typifies the essence of the problem.

As an industry we are cursed by wires. They clutter up our desktops, make server rooms a tripwired deathtrap and get lost around the home when you need them most. So when the first of these technologies that looked workable came along (don't talk to me about IR), I was all for Bluetooth.

Sadly the reality fell far short of the promise. Different manufacturers stuck their own code into the Bluetooth stack and destroyed the very thing that was needed: compatibility. But, said the manufacturers, if you buy our products there are no compatibility issues. 'Go hang,' said consumers.

It's only now that Bluetooth is getting to be useful, and only then in very limited terms. Sure, it allows people to walk around babbling into headsets, but it could have been so much more.

Shaun Nichols: Bluetooth is especially problematic in San Francisco, because it makes it nearly impossible to tell who is hearing voices and who is just talking on the phone, and in this city their numbers seem about equal.

Bluetooth was another technology that fixed a problem most people didn't really have. Yes, not needing to connect a cable to the printer is somewhat more convenient, but you still need wires for things such as power, and since most setups have the printer either on the desk or connected to the network, its usefulness is fairly limited.

I can't rag on the technology too much, however. We use a Bluetooth microphone to record sound for all of our US videos, and I have to say that the little guy has been incredibly useful and reliable.

3. Itanium

Shaun Nichols: When Intel rolled out Itanium, the company thought it had its server platform for the foreseeable future taken care of. Unfortunately, it didn't quite take into account that the rest of the IT world doesn't exactly share Intel's enthusiasm for upgrading to new systems.

Perhaps the fatal flaw of Itanium was that the chip did not support 32-bit code. Given that very little software at the time was optimised to run on 64-bit chips, this led many of Itanium's would-be adopters to hold off on upgrading until developers decided to optimise their software for the new chips.

That wait turned out to be a bit longer than expected, and Intel has been forced to all but concede defeat on the Itanium project, at least for the near future.

Iain Thomson: I had real problems putting Itanium this high up the list, because on one level it works very well. It was a very fast chip, well designed for server work, with only one teensy fault: no-one wanted to move to 64-bit processing.

Itanium was a monstrous conceit on the part of Intel. It wasn't so much an "If you build it they will come" product as Intel saying to the rest of the industry "You're moving to 64-bit processing whether you like it or not." Not surprisingly the IT community told Intel where they could shove it.

Itanium's fate was sealed when AMD brought out the Opteron, a crossover 32/64-bit processor which, while not quite as powerful as Itanium, was a product IT managers could buy without having to upgrade their entire infrastructure. The market loved it and suddenly Intel was getting beaten by its chief rival.

Still, Itanium has carved out something of a niche in high-performance computing, and those who have a fully 64-bit infrastructure tend to rather like it. Too bad nobody else does.

2. Zune

Iain Thomson: Hand on my heart, when I first heard Microsoft was going to bring out a media player to rival the iPod, I was a little hopeful. Microsoft had the cash to really develop a system that would beat Apple.

Instead they seem to have given the design job to the same person behind Windows. What we got was a clunky player with all the elegant design of a road accident, and one that was loaded with so many lockdowns as to be totally useless. It was the greatest missed opportunity since Pilate reportedly washed his hands.

So what do we have with the Zune? It's a media player just like any other, only a little worse in some respects and a lot worse in others. Microsoft seems to insist that the Zune has a future, and has started a major advertising campaign to keep it alive. The campaign will fail, and so will the Zune.

Shaun Nichols: When the Zune was first introduced, Microsoft set up a big bin in the middle of its campus where employees could throw away their iPods. I believe it collected a total of three devices. That should have been a sign right there.

The problem with the Zune is that it lacks any sort of killer app. The Xbox had some big exclusive titles and a superior online community. There was nothing of this magnitude to establish the Zune over its competitors. It arguably functions as well as or better than the iPod in many ways, but not in any way good enough to really make people dump their iPods. The music service for the Zune was also lacking, and it didn't help that the thing was ugly as sin.

You'd hope that by now Microsoft would have just given up. But not so much. The company is still going at it, and it's starting to get desperate. The most recent attempt was a claim that the iPod actually costs US$30,000 to own and operate. When you have to make those sorts of outrageous statements about a competitor, you've clearly lost.

1. Windows Vista

Shaun Nichols: There's not much debate here. Even though Microsoft has been able to take a bit of the sting off with a thus-far smooth development for Windows 7, Vista is still fresh in everyone's mind as an epic failure.

In the near future, management students will be taught about the Vista launch as a textbook example of how not to release a piece of software.

The OS was supposed to be a huge milestone for Microsoft. After more than half a decade of success, Windows XP was entering its later years and the company had hyped up Windows Vista as the platform for the next several years.

Unfortunately, Vista hit with a laundry list of shortcomings. Security developers complained about the restrictive kernel protections. Customers griped about the extensive number of versions of Vista and the high price for the premium packages.

Vista's biggest problem, however, was its big appetite for hardware. Users who had grown accustomed to the low demands of XP were often shocked to realise how much hardware they would need to upgrade just to run Vista. Many of the high-end features were beyond the reach of even some brand-new systems.

The matter was further complicated when a lawsuit unearthed evidence that the company purposely lowered some of the suggested requirements at the behest of hardware vendors.

All in all, Vista has become the biggest flop in computing history. Fortunately for Microsoft, Windows 7 appears to be coming along much more smoothly. Then again, how can it be any worse?

Iain Thomson: Shaun, if I've told you once I've told you a thousand times: you never, ever ask how it could be worse when Microsoft's involved.

Microsoft spent literally years telling us how Vista was going to be the operating system to beat all the competition into the ground. As it turns out, Vista was more akin to the old joke about the software developer's wife who dies a virgin because her husband just sits at the end of the bed telling her how good it's going to be, rather than doing anything.

As time wore on more and more got cut out of Vista, from decent file subsystems to some of the security measures that were supposed to make it such a great step forward. By the time Vista came out it was a neutered mess. On the face of it all the upgrade gave you was a flashy interface (that required at least a gig of RAM) and a lot of annoying dialogue boxes.

Shaun has already mentioned the hardware issues with Vista, and they did more than anything to make the system unpopular with the IT crowd. Despite being over three years late, Microsoft didn't consult with third-party manufacturers over driver compatibility, and more than a few upgraders suddenly found their systems refusing to recognise cards and add-on components.

What made it worse was that Microsoft refused to accept there was a problem. At the launch a senior executive told me with a straight face that Vista could be configured to run on the same hardware as XP. He wasn't alone; the corporation seemed to think sticking its fingers in its ears and shouting 'La, la, la, I can't hear you!' was a solution.

Windows 7 looks to be a much better effort, but the Vista fiasco has done lasting harm to Microsoft's reputation. If you thought Windows ME was the worst Microsoft could do, think again.

-----------------------------
BY Shaun Nichols and Iain Thomson
Source: iTnews

Copyright © 2009 Haymarket Media.
