Canada’s “most powerful and energy efficient supercomputer” will soon be a reality thanks to a joint initiative by the University of Toronto and IBM.
The supercomputer – capable of doing 360 trillion calculations per second – will be built by the university’s SciNet Consortium (which includes the University of Toronto and associated research hospitals) and IBM Canada.
The machine will be the second largest system ever built on a university campus, and the largest supercomputer outside the U.S.
When fully completed, it will be able to store 60 times more data than the U.S. Library of Congress Web archive, and have 30 times more power than the system used by Environment Canada to forecast the weather.
The machine will draw an estimated two megawatts (2,000 kilowatts) of power – roughly equivalent to the electricity consumption of 1,500 homes, or to burning 2,275 litres of gas.
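As a rough sanity check on that comparison, here is a minimal sketch in Python; the two-megawatt figure comes from the article, while the per-home draw is derived here and represents an average, not an official number.

```python
# Rough check of the "1,500 homes" comparison. The 2,000 kW figure is quoted
# in the article; the per-home draw below is derived, not an official number.
supercomputer_kw = 2000
homes = 1500
print(f"Implied average draw per home: {supercomputer_kw / homes:.2f} kW")  # about 1.33 kW
```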
If that seems like a lot of juice, consider this: thanks to a special water-cooling system, the supercomputer will require considerably less energy to keep cool, making it far more energy efficient than other models on the market.
An older system of this size consumes around 2.5 to three megawatts, and requires just as much again to keep it from overheating.
The supercomputer will be housed in a 12,000-square-foot data centre in Vaughan, several kilometres north of Toronto, both because it needs the larger space and because it would put too great a drain on the city’s high-density power grid.
Typically, the energy required to cool a mainframe or supercomputer equals the amount of energy the machine itself consumes. But the water-cooled system on this supercomputer will eat up only about 200 kilowatts of power, providing an estimated 80 per cent energy savings, according to one of the system’s architects.
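For readers who want to check that arithmetic, here is a minimal sketch in Python using only the figures quoted above; the roughly 80 per cent figure cited likely reflects overheads not itemized here, so the numbers below are an approximation rather than IBM’s own calculation.

```python
# Back-of-the-envelope check of the cooling figures quoted in the article.
# The 2,000 kW machine load and 200 kW cooling load are from the article;
# the "conventional" baseline (cooling roughly equal to machine load) is the
# rule of thumb cited above, not a measurement.
machine_load_kw = 2000          # estimated power drawn by the supercomputer
conventional_cooling_kw = 2000  # typical case: cooling equals machine consumption
water_cooling_kw = 200          # quoted load of the water-cooled system

cooling_savings = 1 - water_cooling_kw / conventional_cooling_kw
total_savings = 1 - (machine_load_kw + water_cooling_kw) / (machine_load_kw + conventional_cooling_kw)

print(f"Savings on cooling alone: {cooling_savings:.0%}")        # 90%
print(f"Savings on total facility power: {total_savings:.0%}")   # 45%
```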
“On days when the outside temperature is below -7C – and there’s plenty of those days in Canada – we can even use the outside air to serve as a free cooling plant for the liquid in the system,” says Neil Bunn, technology architect at IBM Canada.
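Here is a highly simplified sketch of what Bunn describes, often called “free cooling”; the -7 C threshold is the only figure taken from his comment, and the function and names below are invented for illustration rather than drawn from IBM’s actual control system.

```python
# Hypothetical illustration of "free cooling": below a threshold outdoor
# temperature, outside air can chill the liquid loop directly instead of
# running mechanical chillers. Only the -7 C threshold comes from the article;
# everything else here is invented for illustration.
FREE_COOLING_THRESHOLD_C = -7.0

def select_cooling_mode(outside_temp_c: float) -> str:
    """Return which cooling source the plant would rely on."""
    if outside_temp_c <= FREE_COOLING_THRESHOLD_C:
        return "free-air"      # outside air cools the loop at little extra cost
    return "mechanical"        # fall back to the chiller plant

for temp_c in (-15.0, -7.0, 5.0):
    print(f"{temp_c} C -> {select_cooling_mode(temp_c)}")
```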
He said the new supercomputer also uses a hybrid design that links two different architectures for greater flexibility, and offers five petabytes of storage capacity (a petabyte is one quadrillion bytes, or 1,000 terabytes).
IBM’s iDataPlex system, with more than 32,000 processors, provides the massive computational power for extreme number crunching. Big Blue’s POWER6 system, with 3,000 processors, provides the capability to run a wide range of software applications for processor-intensive simulations.
“This hybrid system allows the supercomputer to [perform] numerous kinds of calculations required by various disciplines,” said Bunn.
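Putting the quoted numbers together gives a rough sense of scale; the per-processor figure below is derived from the article’s 360-trillion-calculation peak and the two processor counts, and is not a number published by IBM or SciNet.

```python
# Derived estimate only: spreads the quoted peak of 360 trillion calculations
# per second across the quoted processor counts of the two halves of the
# hybrid system. Not a figure published by IBM or SciNet.
peak_calcs_per_sec = 360e12     # 360 trillion calculations per second
idataplex_processors = 32_000   # iDataPlex side, for extreme number crunching
power6_processors = 3_000       # POWER6 side, for a wide range of simulation codes

total_processors = idataplex_processors + power6_processors
per_processor = peak_calcs_per_sec / total_processors / 1e9
print(f"Roughly {per_processor:.1f} billion calculations per second per processor")
```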
The supercomputer also supports green computing because it will eventually allow university departments to decommission their separate systems, he said.
The machine will be used by scientists engaged in research in aerospace, astrophysics, bio-informatics, chemical physics, medical imaging and other areas, according to Richard Peltier, physics professor and scientific director of the U of T’s SciNet Consortium.
At the moment, Peltier said, various university departments use their own systems for simulation and high-performance computing. Once the new supercomputer becomes fully operational next summer, these tasks can be carried out on the new machine.
This will eliminate much duplicated administrative and management cost and give departments access to a far more powerful system.
Peltier sees the new supercomputer as a boon to researchers working on highly complex simulations, such as climate change studies.
In an era when melting polar icecaps have replaced nuclear meltdowns and ballistic missile attacks as the universal harbinger of global annihilation, scientists are increasingly turning to supercomputers to predict climate change.
“Very often our research requires the integration of models that span over several hundreds of years – for instance, from the beginning of the Industrial Revolution to 100 years from now,” he said.
By analyzing models, scientists can better predict how factors – such as increasing or decreasing greenhouse gas emissions or solar energy – would affect the onset of global warming.
With current systems, it can take weeks or months to generate models that simulate hypothetical climatic conditions and changes on a global or local scale. The supercomputer will enable researchers to build more precise models within a matter of hours or days, said Peltier.
For example, while researchers can currently predict variations in surface temperature at points separated by 500 kilometres, the new computer will enable calculations at points less than 100 kilometres apart.
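A rough sketch of why that finer resolution demands so much more computing power; the 500-kilometre and 100-kilometre spacings are from Peltier’s example, while the scaling rule is the generic one for grid-based models, not a SciNet-specific figure.

```python
# Why finer grids cost so much more: shrinking the grid spacing multiplies the
# number of surface cells, and usually forces smaller time steps as well. The
# 500 km and 100 km spacings are from the article; the scaling rule is a
# generic rule of thumb for grid-based climate models.
coarse_km = 500
fine_km = 100

ratio = coarse_km / fine_km      # 5x finer spacing
more_cells = ratio ** 2          # 2-D surface grid: 25x more cells
more_work = ratio ** 3           # more cells times shorter time steps: ~125x more work

print(f"{more_cells:.0f}x more grid cells, roughly {more_work:.0f}x more computation")
```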
Being able to more precisely determine the climatic changes and their causes for a specific area over a longer time span, Peltier said, will help scientists and decision makers develop better policies and solutions.
“The need for high-performance computing or number crunching is fuelling the demand for supercomputers,” said Bill Terrill, a Tucson-based associate senior analyst with Info-Tech Research Group of London, Ont.
“Various corporations from pharmacies to financial firms, automakers and aircraft manufacturers to conglomerate farms are all using models to simulate reactions or predict outcomes.”
Creating virtual models on a computer is much faster and more economical than conducting actual lab experiments, said Jason Brenner, director of infrastructure hardware at IDC Canada in Toronto.
For example, virtual models created by supercomputers can simulate the growth and re-growth of cropland through various generations, under numerous hypothetical conditions within a matter of days or weeks.
“By using the extreme computational power of supercomputers to crunch through larger data sets, researchers are able to carry out simulations at a much faster pace than just five years ago,” he said.
High-performance computing has become more accessible to various industries as price points continue to come down, Brenner added.
He said you no longer need to shell out millions of dollars for a single machine such as a Cray. “For many applications, you can buy a much cheaper system that uses the popular clustered approach.”
There are, in fact, plans to eventually hook up the new U of T supercomputer with other supercomputers in the country to create an even more powerful machine, says Bunn of IBM.
Terrill, of Info-Tech, says clustering is a growing trend.
Much of supercomputing is moving towards harnessing the combined processing power of smaller machines, much as in cluster or cloud computing, he said.
He said universities and computer makers like U of T and IBM continue to collaborate on supercomputer programs because both parties benefit from the partnership.
“Universities get machines they otherwise couldn’t afford, and their researchers get a chance to work on leading-edge technology, which is a big draw for top-rate talent. Computer makers get cheap labour to develop software and applications for their machines, and they get to showcase their technology for next to nothing.”