TORONTO — Utility computing will only become a reality when its main proponents — IBM, Sun Microsystems and Hewlett-Packard
— agree to work together and build technology on open standards. Without that foundation, it simply won’t work, Crawford Del Prete, IDC’s senior vice-president of hardware and communications in the U.S., told a gathering of IDC customers late last week in Toronto.
Despite the fact it doesn’t exist today, utility computing is often heralded as the next technology wave because it promises to relieve one of the biggest challenges companies face today: seamlessly accessing and scaling computing capacity on demand across geography, application and operating system.
In a presentation that focussed on where the IT industry is heading after a prolonged period of belt-tightening, Del Prete acknowledged the industry is in a down cycle, but pointed out the opportunities for vendors.
“This is truly not the end,” Del Prete said. “During each transition, there’s always a great deal of pain, but there’s also an opportunity for innovation.”
Del Prete said utility computing will unfold in three phases. The first phase is server consolidation, which gives companies better control over their assets.
The second phase, he said, will involve automating resources to allow them to be shared and then virtualizing those resources.
“Phase 3 will bring one homogeneous environment where assets aggregate and disaggregate as necessary,” Del Prete said.
Before this nirvana is realized, however, economic, technical and cultural challenges must be resolved. Efforts to achieve higher levels of automation will be sabotaged by IT departments, said Del Prete, if, for example, they feel the initiative threatens to eliminate human resources.
“If the IS department pushes back against these changes for fear it will cut jobs out, it won’t work,” he said.
According to Del Prete, companies will also be investing in Web services, an area closely related to utility computing and a market which represents about $20 million of spending in Canada today, but is projected to reach $170 million in 2006.
Admitting there’s a lot of confusion around what Web services are and how they can help companies, Del Prete said they simply amount to “automating what previously required human intervention.”
A Web service is software used by other software over Internet protocols and formats. It lets companies divide business logic into discrete modules and extend those modules, as needed, to the logic in other systems, such as those of suppliers or customers. Applications can be deployed more quickly and will run better in such an architecture, he said.
Del Prete cautioned that vendors in the Web services game must concentrate on standards such as Simple Object Access Protocol (SOAP) and Web Services Description Language (WSDL) to make it work.
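To make the idea concrete, here is a minimal sketch of the kind of exchange SOAP standardizes: one piece of software builds an XML request envelope, and the receiving service parses it with no human in the loop. The service name and namespace below are invented for illustration; the envelope is built and read with Python’s standard library.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
# Hypothetical service namespace, invented for illustration.
SVC_NS = "http://example.com/inventory"

def build_request(sku: str) -> bytes:
    """Build a minimal SOAP 1.1 envelope asking a (hypothetical)
    inventory service for the stock level of one SKU."""
    ET.register_namespace("soap", SOAP_NS)
    envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
    body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")
    call = ET.SubElement(body, f"{{{SVC_NS}}}GetStockLevel")
    ET.SubElement(call, f"{{{SVC_NS}}}sku").text = sku
    return ET.tostring(envelope, xml_declaration=True, encoding="utf-8")

def extract_sku(request: bytes) -> str:
    """What the receiving service would do: parse the envelope
    and pull out the requested SKU -- no human intervention."""
    root = ET.fromstring(request)
    return root.find(f".//{{{SVC_NS}}}sku").text

req = build_request("ABC-123")
print(extract_sku(req))
```

In practice the envelope would be sent over HTTP, and a WSDL document would describe the operation’s name and parameters so that other systems could call it without custom integration work.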
Practically speaking, Del Prete said companies need to think about the various databases they have in production that don’t work together and what could be gained by integrating them.
“Web services can make this happen, and much more cost-effectively than paying a consultant $1 million,” Del Prete said.
The third frontier where companies will invest IT dollars is in mobile and wireless computing, Del Prete said. He pointed to the 802.11 wireless networking protocol as an example of the corporate hunger for technology that moves with people.
Despite the complexity of 802.11 – 802.11a is not compatible with 802.11b – Del Prete said it’s proof that people will respond to something complicated rather than nothing at all.
“The focus on mobility today is backwards. We’ve been given all these devices with this functionality,” he said, but not enough thought has gone into where people can actually be productive with the technology. Hotspots make sense in airports, he said, but maybe not in a McDonald’s.
Comment: [email protected]