A 90-minute lecture by former Intel chief architect Bob Colwell, given at Stanford University. Wow. Very interesting. Even if you are just a spectator, like me, and don't have any real engineering experience, this is fascinating. I mean, I guess you have to know a little bit about CPU design, but not too much; the talk is mostly high-level, general industry-direction stuff. If you know what 'transistor count' and 'pipeline stage' mean, you know enough to understand everything here.
Shorter version: the race for ever higher clock rates is a no-win situation, and this sort of thinking misses the (near) future direction of the industry.
He is not terribly complimentary towards his former employer and their CPU offerings.
The video is in .asf format.
- jim 3-05-2004 9:35 pm
I plan to watch the video sometime this weekend. But I checked out the first few minutes. I really liked the BBQ analogy. We're using high end Opteron and Xeon systems. The heat management of these things is quite a mess. I don't even want to be in the same room with the quad Opteron system. It sounds like a turbine. -Mark
- mark 3-06-2004 12:05 am
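The heat problem Mark describes is the same wall Colwell is pointing at. The standard first-order formula for switching power in CMOS is P = C · V² · f, and since supply voltage typically has to rise along with clock frequency, power grows closer to the cube of the clock. A rough sketch (the numbers below are made up purely to show the scaling trend, not taken from the talk):

```python
# Back-of-envelope: dynamic CMOS switching power is roughly
# P = C * V^2 * f (capacitance, supply voltage, clock frequency).
# In the classic regime, raising f requires raising V roughly in
# proportion, so power grows close to f^3 -- which is why the
# clock-rate race turns chips into BBQ grills.

def dynamic_power(capacitance, voltage, frequency):
    """Classic first-order switching-power approximation P = C * V^2 * f."""
    return capacitance * voltage**2 * frequency

# Normalized, hypothetical baseline (all values 1.0 for comparison).
base = dynamic_power(1.0, 1.0, 1.0)

# Double the clock, and assume voltage must scale with it.
doubled = dynamic_power(1.0, 2.0, 2.0)

print(doubled / base)  # ~8x the power for 2x the clock
```

Twice the clock for eight times the heat is a losing trade, which is the shorter version of the whole talk.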
I will be very curious to hear your reaction. He seems to talk so much sense. I really like to listen to smart people. Even though I'm no expert on the subject, just to get a sense of how he approaches problems is very interesting. You can see his mind work, the way he frames things, and to me it's very impressive.
- jim 3-06-2004 12:13 am
Wow. That was incredible. Makes me miss Stanford. (I never got a degree there, but did some grad level work in signal processing.)
Here are some random thoughts ...
I think he's right about demand for processing power beginning to flatten. I've got a 900 MHz P3 in my laptop that's just fine for 99% of what I want a personal computer for. This could have a major impact on the computer industry over the next few years. I recall in the early eighties having a conversation about disposable computers. A non-techie lamented at the waste. I argued that computers were going through an incredible evolution. If Ford had gone from the Model T to the Taurus in 10 years, would you blame people for scrapping a 3-year-old car? But that argument doesn't have the same impact 20 years later.
As raw processing throughput becomes less important, other considerations, such as power consumption, mobility, wireless, etc., will be deciding factors for consumer-grade processors.
While my PIII laptop kicks ass for personal computing, I have apps which can consume all the compute resources of a dual or quad processor server, and some harder apps which can bring a large cluster of PCs to its knees. There will always be a high end. But to a certain extent, we power gluttons are getting a free ride off the volume created by millions of personal computers. What happens to the high end if the consumer market flattens? Is clustering the future?
The IA32 versus Itanium discussion skirted around an interesting point. AMD went to 64 bits without screwing up the instruction set. Itanium will be killed by 64-bit chips from AMD (and now Intel) that are compatible with IA32.
I found the complexity argument compelling. The x86 is perhaps the world's most complicated kludge. Oops, I forgot about Windows. Okay, the second most. As Bob argues, at a certain point a clean-sheet design will blow the x86 out of the water. When that will come, and from whence, I don't know. But perhaps within 10 years, and not from Intel. An interesting tack would be to hollow out the Pentium market from below. That's a traditional method of displacing an entrenched giant. (The mainframe was killed by the mini, which was displaced by the workstation, which fell to the micro.) Might a clean-sheet design blow away the Pentium in power consumption relative to performance for mobile apps, and then move up the performance scale?
I enjoyed Bob's observations about tensions within high tech product organizations, especially: marketing vs. architectural engineering, architecture vs. implementation. Architecting a product is indeed one of the hardest things to do. I sketched up an architecture in 1993 that's still on the market. I don't know if I'll ever nail one that well again. But it's easiest to do in an early stage company when you don't have to explain to some short-sighted marketing dweeb the meaning of the verb "to architect".
High tech product architecture is not about meeting known, quantifiable needs -- although that must be encompassed. It's about establishing a framework which can adapt to unknown requirements, yet which is not overly burdened with "flexibility". There really aren't many people with the right mixture of right-brain/left-brain skills to do this, especially at the rarefied levels of processor architecture that Bob operates at. Yeah, I did a box with a good backplane, but a processor, that's something else.
- mark 3-06-2004 10:18 am