If a picture really is worth 1,000 words, a microchip is worth 1,000 pictures.
Many of the best pictures of processors in action came out three years ago, when more than 100 photographers captured images for a project called One Digital Day. The photographs included views of Singapore, where processors track thousands of shipping containers, and Puerto Rico, where chip-based devices are used to judge horses.
This project, shot over a 24-hour period and later published as a book by Times Books/Random House, remains the quintessential IT coffee-table book and a compelling demonstration of this industry's impact on day-to-day life.
I was surprised, therefore, to discover that on the 30th anniversary of the first Intel microprocessor last Thursday, the original Web page offering an online glimpse of One Digital Day had been removed. I couldn't find it in Intel's archives, either. It's sort of like throwing a birthday party for your son and learning that he burned the album with all his baby pictures.
On the other hand, no one could blame chipmakers for feeling less than celebratory. Less than a month ago, research firm Gartner Inc. predicted the processor segment would not begin to recover until 2003, and even then we're only looking at overall revenue increases of three per cent. Everyone has been feeling the pinch: Intel has put plans to open new fabrication plants on hold, while Advanced Micro Devices has closed some of its best-known overseas fabs.
Forecasting has never been easy, but the processor business has slowly approached a plateau from which it will be difficult for companies like Intel to see the kind of growth they have enjoyed for the last three decades. Experts point to video streaming and speech recognition as applications that will demand increased computing power, but these are limited markets, and will remain so even once the chips bring them up to speed. That's because in almost every other basic area, such as word processing, spreadsheets and other office programs, the chipmakers have done a good job of supplying the throughput customers require.
This doesn't mean there isn't anywhere for Intel and others to go. IBM has managed to put two processor cores on a single piece of silicon, a feat Hewlett-Packard is racing to match. There are rising concerns about power consumption, which Intel addressed at Comdex Fall last week through a partnership with the U.S. Environmental Protection Agency. Perhaps more significantly, everyone in the industry is trying to figure out how to reduce the complexity of transistors as chips are shrunk to boost performance.
It's hard to imagine where we'll most need increased processing capability. That makes people in this industry uncomfortable, as we are used to long-term forecasts that paint us a comforting, inspiring picture of the future. It may be helpful to remember that when design work on the Intel 4004 processor began in 1969, the chip was intended for calculators. No one was dreaming of One Digital Day.
I know there are bright people out there with ideas that will revolutionize application development as we know it, and the processor will be the talisman that helps make their dreams come true. When it happens, we'll have our cameras ready.