Two teams from Carleton University’s Immersive Media Studio (CIMS) have collaborated on a digital reconstruction of the Salk Institute for Biological Studies in La Jolla, Calif.
The teams, one in Ottawa and the other at the California Institute for Telecommunications and Information Technology at the University of California, San Diego, were participating in iGrid 2005, one of the largest aggregations of computing and data transmission bandwidth ever assembled for research. About 50 demonstrations were showcased.
CIMS director Michael Jemtrud, who described it as a watershed event, said the lab's participation was sponsored by Ottawa's CANARIE Inc., the nation's advanced Internet development organization. It used CA*net 4, Canada's high-speed research network, which is part of the Global Lambda Integrated Facility (GLIF), an international high-speed research network, to stage the demo.
"It's really bleeding-edge type technology for us. It's a 10-gigabit network, which is pretty extreme," he said. "The basic technology we're using isn't that different from what carriers such as Bell and Telus use at the switch level. It's the way you provision the network that is unique."
Although the technology is still used mostly for scientific purposes, particularly physics, CIMS wants to extend its development into design fields, including the architecture, aerospace and automotive industries.
The Canadian solution, which is based on user-controlled lightpaths (UCLP), includes software intended for articulated private networks, he added. UCLP technology, which lets users change the network configuration without a central network management facility, was first developed as a critical part of CA*net 4's advanced network program.
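The UCLP idea, letting the user rather than a central network operations centre own and re-point lightpaths, can be sketched as a minimal in-memory model. This is an illustrative sketch only; the class and method names are invented for the example and do not reflect the actual UCLP software.

```python
from dataclasses import dataclass, field

@dataclass
class Lightpath:
    """A user-owned wavelength channel between two endpoints."""
    src: str
    dst: str
    gbps: int

@dataclass
class UserNetwork:
    """An articulated private network assembled from user-controlled lightpaths."""
    paths: list = field(default_factory=list)

    def provision(self, src: str, dst: str, gbps: int) -> Lightpath:
        # The user allocates a lightpath directly; no central manager mediates.
        path = Lightpath(src, dst, gbps)
        self.paths.append(path)
        return path

    def reconfigure(self, path: Lightpath, new_dst: str) -> None:
        # The owner re-points the lightpath themselves, the key UCLP property.
        path.dst = new_dst

net = UserNetwork()
p = net.provision("Ottawa", "San Diego", 10)
net.reconfigure(p, "Chicago")
print(p.dst)  # Chicago
```

The point of the sketch is the control model: reconfiguration is a method on the user's own object, not a request to a shared management facility.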
Maxine Brown, co-chair of iGrid 2005, explained that in optical networks, bits travel as light through optical fibre. A lambda, she explained, is a wavelength of light. White light can be broken into colours, onto each of which data can be encoded. So instead of a single 10-gigabit channel on one fibre, you can have 40 in parallel, she explained.
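Brown's arithmetic works out as follows, using the figures she gives (40 lambdas at 10 gigabits per second each on a single fibre):

```python
# Wavelength-division multiplexing capacity, using the article's figures:
# 40 wavelengths (lambdas) in parallel, each carrying 10 Gb/s.
lambdas = 40
gbps_per_lambda = 10
total_gbps = lambdas * gbps_per_lambda
print(total_gbps)  # 400
```

That is, one fibre carries an aggregate 400 gigabits per second rather than 10.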
The CIMS teams, which include electronic and software engineers along with information technology and industrial design students, created the digital replica using photogrammetry, a technique that derives precise 3D measurements and models from photographs. The two teams worked together via videoconferencing.
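At its core, photogrammetry recovers depth by triangulating a point seen from two camera positions. A minimal sketch of the classic stereo relation (depth = focal length × baseline / disparity); the figures below are assumed for illustration, not taken from the CIMS project:

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic stereo-photogrammetry relation: depth = f * B / d,
    where f is focal length in pixels, B the distance between camera
    positions in metres, and d the pixel shift of the point between images."""
    return focal_px * baseline_m / disparity_px

# A point whose image shifts 50 px between cameras 0.5 m apart,
# with a 1000 px focal length, lies 10 m from the cameras.
print(stereo_depth(1000, 0.5, 50))  # 10.0
```

Real photogrammetry pipelines solve this over thousands of matched points and many photographs, but the depth-from-parallax principle is the same.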
"It simulates in a visually intensive way any sort of design process, whether automotive, aerospace or architecture," Jemtrud said. Recreating the Salk Institute was "more of an exercise for us, but the value of the content itself is to have a precise digital document of this building," he added. "We did the Rideau Chapel in Ottawa last year, which was destroyed in 1972. The Salk Institute is probably one of the most important modern buildings in North America, and this is an earthquake zone, so it could disappear today. To have that precise model has heritage value as well as reconstruction value, and they're going to use the model on the Salk Institute Web site."
The main challenge the teams faced was that the visualization applications they used are not yet optimized to run well on the new network, said Jemtrud. CIMS, which just received a CANARIE grant to fund 18 months of research, will spend the next couple of years customizing applications and developing new technologies and protocols for the high-speed network.
“The thing is this network is not going to be widely available to industry for another five to eight years probably,” said Jemtrud. “We have to start customizing those applications to respond to that network now. That’s why we’re doing the research. It’s like when the Internet first started, no one was ready to actually use it except for really high-end scientific applications, and it has taken 10 years to productively use it, so we’re just trying to get ahead of the ball.”
Jemtrud said this year was the first that the iGrid event has used content as a means of displaying its technical capabilities.
“That’s what these conferences typically lack,” he said. “They usually show pretty purposeless and lame imagery because the scientists don’t produce content, so it’s straight streaming of high definition video, or fish swimming across the screen. Ours had actual content of La Jolla, and it was architecture so everyone could relate to it.”
According to Brown, the iGrid events take place following major technological developments. The 2002 event in Amsterdam, for example, followed the extension of a 2.5-gigabit link from Amsterdam across the Atlantic Ocean.
"We look for the technological leap, when something has changed," she said. "Prior to that there were only megabits across the ocean."
Since then, high-speed research networks with aggregate bandwidth of 150 gigabits per second have been extended across North America.
The goal for iGrid participants is to develop technology that enables the network to be treated as a resource much the same way as the hardware is now, said Brown.
“In grid computing, all the computers and the data storage devices are all resources you want to be able to access, have accounts on and talk to, but currently the network just connects these resources; it’s not part of the resource, it’s just a way to connect resources,” said Brown. “In building the lambda grid we want to be able to treat the network as if it were as much of a resource as the devices it connects, so if you want to have a video teleconference between two sites, you schedule the network as well as the computers at both ends.”
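Brown's lambda-grid idea, booking the network link the same way you book the computers at both ends, can be sketched as a toy co-scheduler. This is an illustrative model only; the names and the time-slot scheme are invented for the example:

```python
from dataclasses import dataclass, field

@dataclass
class Resource:
    """Anything schedulable: a compute cluster, or the network link itself."""
    name: str
    bookings: list = field(default_factory=list)  # list of (start, end) slots

    def free(self, start: int, end: int) -> bool:
        # Free if the requested slot overlaps no existing booking.
        return all(end <= s or start >= e for s, e in self.bookings)

    def book(self, start: int, end: int) -> None:
        self.bookings.append((start, end))

def schedule_conference(site_a: Resource, site_b: Resource,
                        link: Resource, start: int, end: int) -> bool:
    """Treat the network as a first-class resource: the session is booked
    only if both endpoints AND the lambda are free for the slot."""
    resources = [site_a, site_b, link]
    if all(r.free(start, end) for r in resources):
        for r in resources:
            r.book(start, end)
        return True
    return False

ottawa = Resource("Ottawa cluster")
san_diego = Resource("San Diego cluster")
lam = Resource("transpacific lambda")
print(schedule_conference(ottawa, san_diego, lam, 9, 10))  # True
print(schedule_conference(ottawa, san_diego, lam, 9, 10))  # False: lambda is taken
```

The design point is that the link appears in the same resource list as the clusters, which is exactly the shift Brown describes from "the network just connects resources" to "the network is a resource."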
Once that happens, the focus will be on designing Web applications that make it easy for the applications to talk to all the various resources. “The applications people shouldn’t know about networks and accounts and authentication — this is what the middleware does,” she said.
And although the current pipe of 150 gigabits is hefty, especially compared to its 2.5-gigabit predecessor, even that will be taxed by the amount of data expected to be generated when CERN's Large Hadron Collider begins operations in 2007.
"All the physicists all over the world will want access to this data, so the challenge in some cases is how to move data from experiments worldwide to the people who want to analyse it," she said. "What they do now is travel to the source, as opposed to bringing the data to them."
Visualization applications also tax the grid.
"We had a demo here on 4K digital cinema. It was a 4,000-by-2,000-pixel screen, which is four times the data of high definition TV," said Brown. "We streamed real-time images using cameras and stored images between Tokyo and San Diego without the latency, so it looked like you were watching a movie, it looked like it was being shipped from next door."
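A back-of-envelope calculation shows why such a stream taxes the grid. Using the screen resolution Brown cites, plus assumed values for colour depth and frame rate (24 bits per pixel and 30 frames per second are not from the article):

```python
# Uncompressed data rate of the 4K stream described.
# Resolution is from the article; bit depth and frame rate are assumptions.
width, height = 4000, 2000
bits_per_pixel = 24   # assumed: 8-bit RGB
frames_per_sec = 30   # assumed frame rate
gbps = width * height * bits_per_pixel * frames_per_sec / 1e9
print(gbps)  # 5.76
```

Under these assumptions a single uncompressed stream needs roughly 5.76 gigabits per second, more than half of one 10-gigabit lambda.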
But it’s impossible to predict what to expect at the next iGrid event, she said.
“In two years a lot of these tools that are being developed now will be in production mode,” she said. “This community we’ve built is interested in advancements, so they can go to a conference and do show and tell, but iGrid is more of a workshop to bring all these smart people together, and they come because they’re working with their counterparts in other countries worldwide.”