Ultra-Tiny Transistors = not that important now

Deepak


"The days of relying on shrinking transistors to achieve performance gains are over, and the chip industry needs to enter a new era of innovation where system-level features--including dual-core processors--are just as important as thinner transistor gates."
 
Deepak said:

"The days of relying on shrinking transistors to achieve performance gains are over, and the chip industry needs to enter a new era of innovation where system-level features--including dual-core processors--are just as important as thinner transistor gates."

Well, one day not too far away we'll hit the Limits of Shrinkage(TM), so that much is obvious. But by then dual-core processors will be the norm, if not outdated.
 
Yeah, this issue raised its head about six months ago now, around the time of the cancellation of Tejas. IBM made similar comments back then too.

The downside of approaches such as multi-core processors is that they are only a win for certain applications (a bit like SSE/SIMD stuff is), whereas clock-speed hikes benefit all apps.

Certainly makes for interesting times ahead though.
 
If and when dual-core becomes the norm for consumer systems, AMD and Intel had better hope the "Home edition" of Longhorn supports two CPUs; otherwise there'd be no benefit for those systems.
 
Even though it probably won't happen, what I believe the PC industry needs is to do away with the x86 instruction set and start from the ground up. Of course there are a lot of good reasons for sticking with x86 (e.g. backwards compatibility), but I think the gains we could get from moving away from x86 would outweigh them.
 
madmartyau said:
Even though it probably won't happen, what I believe the PC industry needs is to do away with the x86 instruction set and start from the ground up. Of course there are a lot of good reasons for sticking with x86 (e.g. backwards compatibility), but I think the gains we could get from moving away from x86 would outweigh them.

This has been tried. Several times. It failed for the one good reason that you mention: backwards compatibility.

You, as a programmer, might care about instruction sets. That's if you write compilers for a living that is. If you don't write compilers, why would you care about instruction sets, even as a programmer? What fraction of commercially relevant code is written in assembler these days?

Nobody else cares about instruction sets at all (read: end-users). They do care about backward compatibility, and they care a lot ($$$). This is why even Intel is having a hard time killing x86.

There was a massive barney about this on RWT a few months ago.
 