The Canada Foundation for Innovation (CFI) has put the cherry on top of its long-standing commitment to funding high-performance computing with a $78 million investment to create a national network of supercomputing facilities.
Announced at an event at the University of Toronto late last month, the investment includes $60 million from the new National Platforms Fund, which was formed to fund infrastructure and resources that can be used by multiple institutions and scientific disciplines. Another $18 million is coming out of the Infrastructure Operating Fund. The Natural Sciences and Engineering Research Council of Canada (NSERC) also contributed $10 million over five years, which will go towards ongoing operating costs.
The project will link seven pre-existing HPC facilities, including B.C.’s WestGrid, Ontario’s SHARCNET and ACEnet in Eastern Canada. Under the plan, a researcher could submit a request for a certain job to be run for 48 hours, said CFI president and CEO Eliot Phillipson, and that project would be routed to whichever available supercomputers specialize in that area.
This information would zip across government-funded non-profit Canarie’s CA*net 4 optical network, according to Canarie president and CEO Andrew Bjerring. “What makes it different than other networks is that it will make a single platform out of different resources and integrate it much more tightly.” Doing a single job on multiple machines is already a reality, said Bjerring. “But we want to link together resources in ways more than computing. We want to be able to have the data in one place, the software come from another, and the visualization happen elsewhere.”
To keep the massive data loads from crashing the network, the traffic will travel over a dedicated network. Canarie will also work with members of some of the consortia, including WestGrid, to develop software and middleware that will give the network a more unified interface.
At the event, Phillipson compared the network’s pan-Canadian, innovative properties to the pivotal role the Trans-Canada railway played in building the country.
In an interview, Phillipson said that, having already invested around $100 million in the seven HPC consortia across the country, the CFI would be building on its investment (and reducing research duplication). “They would likely all be coming back for more money, so it made more sense (to fund them as a group). Then they wouldn’t be competing against each other. They could work as a whole, instead of as the sum of their parts,” he said.
In a funding first, the CFI actually offered the money upfront. It hosted a conference in Ottawa in the fall of 2005 that gathered the HPC consortia, along with HPC-focused representatives from the provincial and federal governments and the private sector. “We put the $60 million on the table for a national system. If they provided a solid application, and followed the proper review steps, and it was found (by a panel of international HPC experts) to have high merit, they would have the funding.”
Asked what it was about HPC that inspired the CFI to put the money before the proposal, Phillipson said, “It wasn’t something simply out there. We recognized that HPC is a fundamental, critical need for virtually all research.”
These sentiments were echoed at the event by Isabelle Blain, vice-president, research grants and scholarships, of NSERC: “High performance computing is one of the most important tools we have to drive innovation.”
Phillipson hopes to see the new network stay innovative by being ultra-efficient. He compares a standalone facility to a car that is only used part of the time and sits idle the rest. Factor in the countless specialties of the scientists, and many potential projects could go unrealized, run slower, or be done less thoroughly because their local HPC resources don’t fit the bill. But with the easy access of a cross-country network, researchers can tap in wherever works best.
Phillipson said, “They don’t care if their research is done through ACEnet or WestGrid — researchers just want the access, hardware, software, and technical expertise to do their research.”
Hugh Couchman, scientific director of SHARCNET and professor of physics and astronomy at McMaster University, said he is confident that the hardware side is already there, but stressed the importance of making sure that there are sufficient connections at respectable bandwidth to keep projects flowing.
Another key asset of the new network isn’t computers at all — it’s humans. “You can’t be an expert on every technical aspect — this network will be able to put researchers in touch with technical support staff,” said Couchman. While the pool of available HPC support staff is limited, the network should get a boost from the NSERC-provided operating funds, which will pay for more HPC support staff.