q&a Reports of the death of the famous technology prediction have been greatly exaggerated, according to Justin Rattner. He talks to us about why it’s still relevant and what he envisions for the future of microprocessor development. ‘Spintronics,’ perhaps?
In his famous paper published in April 1965 in the journal Electronics, Gordon Moore wrote: “Integrated circuits will lead to such wonders as home computers — or at least terminals connected to a central computer — automatic controls for automobiles, and personal portable communications equipment.” Analyzing the future of the industry, he predicted that “reduced costs is one of the big attractions of integrated electronics, and the cost advantage continues to increase as the technology evolves toward the production of larger and larger circuit functions on a single semiconductor substrate. For simple circuits, the cost per component is nearly inversely proportional to the number of components.” This became known as Moore’s Law. Forty-two years later, it is still valid. But will it still hold, say, 10 years from now? Justin Rattner, Intel’s chief technology officer, answers that question in this interview.
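Moore’s cost observation can be put in simple terms. The sketch below is purely illustrative and not from the interview: the fixed die cost and the component counts are assumptions chosen only to show the shape of the relationship — if the cost of fabricating a simple circuit stays roughly fixed, cost per component falls inversely with the number of components on it.

```python
# Toy illustration of Moore's 1965 observation: for simple circuits, cost
# per component is nearly inversely proportional to the component count.
# The fixed die cost of 100.0 is an arbitrary assumption for illustration.

def cost_per_component(components: int, die_cost: float = 100.0) -> float:
    """If the total die cost stays roughly fixed, unit cost ~ die_cost / n."""
    return die_cost / components

for n in (10, 100, 1000):
    print(f"{n:>5} components -> {cost_per_component(n):.2f} per component")
```

Tenfold more components per die means one-tenth the cost per component, which is the economic engine behind the scaling Rattner discusses below.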
ITBusiness.ca: Moore’s Law is now 42 years old and still running. But isn’t it approaching its physical limits?
Justin Rattner: Whenever someone says that Moore’s Law is reaching its limits, I look back over the history of the industry. I’ve been working in this industry for more than 30 years, so I’ve been living with Moore’s Law for quite some time. You know, we never can say [what will happen] more than about 10 years ahead in terms of technology. And the reason we can’t say more than 10 years ahead is because we feel confident that by the end of that period we’ll see another 10 years ahead. If you look at our 45-nanometer development, you’ll see the number of problems we have dealt with to keep Moore’s Law on track.
Four or five years ago people said leakage would bring Moore’s Law to an end. The transition from silicon-gate transistors to high-k gate dielectrics and metal-gate transistors has brought a dramatic reduction in leakage. That’s just one example of how technical innovation has addressed what was thought to be a fundamental limit. And there are more of these innovations I could describe that we haven’t put into production yet. For example, the tri-gate transistor, which for the first time moves the transistor from a device built in the bulk silicon to one that sits above the bulk silicon. Again, that addresses one of the classic scaling problems and promises much better transistor performance.
So, what I say is that in 10 years the transistors we’re building may not look anything like the transistors we build today. That doesn’t mean it’s the end of Moore’s Law.
ITB: Is the announced 80-core chip an example of that change in transistor design?
JR: Absolutely! Six or seven years ago, Intel began talking about how we were running up against power limits, which made it very difficult to increase performance in the same ways that we had, because the power being dissipated by these processors would be more than we could cool in a cost-effective manner. So in 2001 we talked about this power wall, and we decided to pursue a new approach in processor design, which involved the use of more energy-efficient processors and then … multiple-core processors. So we have dual-core and quad-core today. And we’ll have eight-core and, you know, more cores.
ITB: And 16 and 32 and so on?
JR: Actually, I think what we’re going to see is core counts evolving at different rates in different product lines. I think mobile and desktop will grow at a relatively low rate; they will move to eight, 12, maybe 16 cores. And then at the high end I think we’re going to see dramatic increases, and that was really the motivation behind the design of the 80-core processor, with its very high computing capability.
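Rattner’s point that desktop core counts will grow slowly while the high end grows dramatically is often explained with Amdahl’s law, which he does not name here but which captures the trade-off: the speedup from extra cores is capped by a workload’s serial fraction. A minimal sketch, with workload fractions assumed purely for illustration:

```python
def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
    """Amdahl's law: overall speedup is limited by the serial fraction
    of the workload, no matter how many cores are thrown at it."""
    serial_fraction = 1.0 - parallel_fraction
    return 1.0 / (serial_fraction + parallel_fraction / cores)

# A typical desktop workload (assume 80% parallelizable) tops out near 5x
# however many cores you add, while a highly parallel workload (assume 99%)
# can still profit from something like an 80-core processor.
print(round(amdahl_speedup(0.80, 16), 2))  # modest gain on 16 cores
print(round(amdahl_speedup(0.99, 80), 2))  # large gain on 80 cores
```

Under these assumed fractions, 16 cores give the desktop workload only about a 4x speedup, which is one way to see why consumer parts would stop around 8 to 16 cores while high-end parts keep scaling.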
ITB: Do you think there is a future in computing without silicon?
JR: Well, it’s a very provocative question. You know, it’s hard to imagine right now that silicon won’t continue as a major component. It’s such a versatile material, and we continue to discover new ways to harness its capabilities. For example, we’re building a variety of optical devices in silicon. In fact, we just announced last month a 40-gigabit-per-second silicon optical modulator. So we can now take an optical signal and modulate it, impressing data on it at 40 gigabits per second, which is about as fast as anybody has done in any technology. Silicon is a very powerful material and I think it will remain a mainstay of semiconductors. And as we move to new transistor designs, or transistor architectures as we call them, we may actually introduce materials that are non-silicon in nature.
So, I’m talking about surface transistors. We might deposit other materials on the surface and build transistors out of them, or we might build devices that rely on quantum properties other than electronic charge. Everything we do today is still based on electronic charge, but we are actually researching things based on other quantum effects. People call it spintronics (short for “spin-based electronics”; spintronics relies on the spin of an electron to carry digital information, its 0s and 1s). It’s possible that spintronics will represent a future design if we can figure out how to harness the spin effect in useful circuits and devices.
ITB: Are you using spintronics as a synonym for quantum computing?
JR: No, no, they are two different things. I’m glad you asked. Quantum computing is actually a different kind of computing from what we know today, one based on the statistical behaviour found in quantum physics. It would be useful for some things but not very useful for the sorts of things we do today.
ITB: I think it would be useful for massive calculation but not for today’s desktop or laptop computing, right?
JR: Yes, that’s right. It offers the potential for a tremendous degree of parallelism, which is very efficient. But there are only certain problems that could be solved by the kind of computing quantum computers would carry out. What I was talking about was simply using one of the quantum properties — like, you know, charge being 1s, its absence being 0s. Spin is particularly interesting if we could control these spin effects in new devices of different sorts, and Intel is researching it as well. Returning to Moore’s Law, we may reach an end to charge-based electronics in 10 years or 15 years or 20 years, whatever it is. But we could replace charge-based electronics with [a] spin-based system. Whether you declare that the end of Moore’s Law, I don’t know. But as long as we can continue to improve the performance, the energy efficiency and the density, we might make that transition.
ITB: Now I would like to return to that 40-gigabit transmission, also known as silicon photonics. Could you explain what silicon photonics is?
JR: Sure. It’s becoming increasingly difficult to increase data rates when we try to move data over copper wire, whether it’s wire on a circuit board or links spanning large distances within a data center or even beyond. As everyone is well aware, copper is essentially gone from long-haul communications; there are optical fibres spanning the globe now. In fact, where I live the telephone company is bringing fibre optics to my house. So, it’s becoming the de facto standard for long-distance communication. And it is moving closer and closer, beginning to get into the data center. But the cost has been relatively high.
So again, four to five years ago we began to look at the possibility of building these new high-performance optical devices on silicon. We built high-performance silicon-germanium photodetectors. We built the modulators, starting at 10 gigabits per second and now reaching 40; we’ve done multiplexers and demultiplexers; and last year we developed a technology called the hybrid silicon laser, an electrically driven laser built in silicon. We are now at the point where we have light sources, modulators, multiplexers, demultiplexers and detectors. We have a complete end-to-end optical transceiver capability, but we have to put all the devices together on one chip, and that’s what we are working on right now. What we plan to demonstrate over the next year or so is a complete optical transceiver in silicon.
ITB: When will this new technology be available in the marketplace?
JR: Well, it’s possible that by the end of the decade we’ll have silicon photonic products in the market. If it’s not 2010, maybe 2011, but I think we are close now.
ITB: People want to know what their future computing devices will do and how soon they will do it. What prototypes are you working with to envision the applications of the next decade?
JR: We have been looking at a whole range of applications that require enormous amounts of computing power to be practical, to be viable. That’s consistent with what we were talking about earlier, about building machines with very high core counts like the 80-core processor. … We look at things like full-body motion tracking: the ability for video cameras to watch what you’re doing and translate your movements into the motion of a human figure in a computer. So as you move your arms the figure in the computer moves its arms, and if you blink your eyes the figure in the computer blinks its eyes. We’re looking at capturing facial expressions as well as whole-body motion. That might be the basis for a future kind of game, or it might be the way you teach someone how to dance. Your virtual partner is a computer.
Another application that generates a lot of interest is one where we feed the computer live video of a sporting match like football (not American football, international football), and it tracks all the individual players, watches the game play and automatically produces a highlight video. So you say, ‘I just want to see shots on goal’ or ‘I just want to see the penalty kicks’ [or] ‘I just want to watch this one player; show me all the plays where this player has the ball.’ The computer can do all of that automatically. This requires a tremendous amount of computing, as you can imagine: tracking all the players, analyzing their movements, whatever it is. So, very compute-intensive, but in five years it might not be unusual to find this feature in television sets. Those are a couple of the future applications we work with.
Now, we also work on health devices. We’ve been experimenting with a number of technologies that you would either carry with you or wear beneath your clothing. They basically monitor your heartbeat, your respiration, your physical activity, and all those kinds of things, and they provide you with information that could improve your lifestyle and lead to better health, or alert you to a medical condition that should take you to see a doctor.
ITB: Today we are beginning to see some really impressive robots, equipped with image and speech recognition. Some of them, especially those developed for the Department of Defense, can walk as if they were alive. Do you think the Terminator age is close, that the age of intelligent machines is coming?
JR: The age of intelligent machines, is that what you said? Well, I think that in some ways the answer has to be yes. The health platform I was talking about takes a whole variety of forms. Right now you can put it in your pocket or clip it on your belt or something like that. It gathers all this data and makes inferences. It can tell whether you’re sitting down or standing up, whether you’re indoors or outdoors, whether you’re climbing, going upstairs or downstairs, and then based on that it can make a whole range of other decisions about what you’re doing. I mean, once it learns patterns of your behaviour it can tell whether you’re at home, whether you’re at work, whether you are driving your car or sitting on the beach someplace.
I think this whole area of perceptual computing is ready to make very rapid advances. I think that over the next decade or so there will be devices, whether they’re robots or not, that exploit computational perception and exhibit very human-like behaviour. It is definitely going to happen.