Hexus, eFuse, and AMD GPUs

This has got me scratching the old noggin.

http://www.hexus.net/content/item.php?item=7467

ATI's developed a new technology called eFuse, which is intended to provide more control over specific parts of the chip, i.e. make it easier to 'turn off' smaller sections, rather than large chunks of chip. It's obviously better to have more control, especially if it saves shutting down bits of silicon that are mostly capable of functioning properly.
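Just to picture the granularity difference (a purely hypothetical sketch in C; the bit layout and the per-pipe fuse idea are my own invention, not ATI's documented scheme): with coarse binning one bad pipeline drags its whole quad down with it, while per-unit fuse bits let you keep everything that still works.

```c
#include <stdint.h>
#include <stdio.h>

/* Hypothetical 16-bit fuse mask: one bit per pixel pipeline.
 * A blown bit (1) means that pipeline was permanently fused off
 * at test time; clear bits stay available for the shipping SKU. */
#define NUM_PIPES 16

static int count_enabled(uint16_t fuse_mask)
{
    int enabled = 0;
    for (int pipe = 0; pipe < NUM_PIPES; ++pipe)
        if (!(fuse_mask & (1u << pipe)))
            ++enabled;
    return enabled;
}

int main(void)
{
    /* Coarse binning: one bad pipe forces its whole quad (4 pipes) off. */
    uint16_t coarse_mask = 0x000F;
    /* Fine-grained, eFuse-style binning: only the actual bad pipe goes. */
    uint16_t fine_mask   = 0x0001;

    printf("coarse binning:  %d of %d pipes usable\n",
           count_enabled(coarse_mask), NUM_PIPES);
    printf("per-pipe fuses:  %d of %d pipes usable\n",
           count_enabled(fine_mask), NUM_PIPES);
    return 0;
}
```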

So, in terms of getting a saleable product from iffy wafers, it's great news. The only snag is that with a more versatile means of controlling the chips, we could see some very varied GPU configurations, leading to lots of different SKUs... as if the discrete graphics card market wasn't confusing enough with all the different products and xtreme hyperbole.

Well, that name "eFuse" rang a bell somewhere in the back of my skull, so I googled, and found some confirmation that ATI/AMD would have some background with a technology called "eFuse".


http://www-03.ibm.com/chips/news/2004/0730_efuse.html
http://www.eetimes.com/news/semi/sh...JUNN2JVN?articleID=26100962&_requestid=133047

And if you search our forums you'll find the occasional reference to it in the console forum regarding the Xenon CPU for Xbox360.

So what's going on with the Hexus report? Is AMD using this tech with GPUs now? Or is Hexus confused somehow? Or are there two separate techs by the same name?
 
Hmm, could this potentially be used to power down sections of a chip when idle? Or is that already well and truly taken care of?

The technology, called "eFuse," is said to combine software algorithms and microscopic electrical fuses to produce chips able to regulate and adapt to their own actions in response to changing conditions and system demands.

That's the paragraph that makes me think so.
 
Makes me think of a powerful GPU in, say, a notebook that can scale way back and behave as a much lower-end chip. That'd be really useful. I know chips perform similar actions right now, but nothing I can think of can scale back major parts of the chip.
 
Hmm, could this potentially be used to power down sections of a chip when idle? Or is that already well and truly taken care of?

A fuse is a one-way deal, or this tech has a really dumb name.
This sounds more like a much more intense yield-maximization method than a power-saving technique.

It sounds like they've created a system for monitoring circuit behavior during chip validation, one that looks at fine detail to detect defects and bad signals.
Instead of killing off half the chip or several pipelines, they can design alternate routes and work around problems more effectively without junking the chip.
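If that guess is right, the fuse map would act a bit like the spare-row redundancy memory makers already use: blow a few fuses at test time and a logical unit quietly gets steered to a spare physical one. Rough sketch only; the structure and names below are made up, not anything IBM or ATI has documented.

```c
#include <stdint.h>
#include <stdio.h>
#include <stdbool.h>

#define NUM_LOGICAL  8   /* units the rest of the chip expects to see */
#define NUM_PHYSICAL 9   /* one spare manufactured on the die         */

/* Hypothetical fuse state blown at wafer test: bad_unit marks a
 * defective physical unit, remap_valid says the spare was swapped in. */
struct fuse_map {
    int  bad_unit;
    bool remap_valid;
};

/* Route a logical unit index to a physical one, skipping the
 * fused-out defect by substituting the spare (index NUM_PHYSICAL-1). */
static int route(const struct fuse_map *f, int logical)
{
    if (f->remap_valid && logical == f->bad_unit)
        return NUM_PHYSICAL - 1;   /* alternate route to the spare */
    return logical;
}

int main(void)
{
    struct fuse_map f = { .bad_unit = 3, .remap_valid = true };

    for (int l = 0; l < NUM_LOGICAL; ++l)
        printf("logical unit %d -> physical unit %d\n", l, route(&f, l));
    return 0;
}
```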

In a running system, these chips might be able to salvage themselves without going completely offline in the case of a chip failure.
 
Assuming that there's really some relation here...

Considering that this is a patented technology from IBM, how much of a hurdle will it be for ATI to use it at TSMC?
 
Considering that this is a patented technology from IBM, how much of a hurdle will it be for ATI to use it at TSMC?

No hurdle, probably, other than a hefty fee.
IBM makes hundreds of millions per year licensing patents for just this kind of stuff. It's one of the main reasons they keep a bunch of pure fundamental research divisions operational. By being first at inventing stuff that nobody else has practical uses for yet, they can put stakes in the ground at very early stages. (Their own fabs aren't exactly the most profitable...)
 
A fuse is a one-way deal, or this tech has a really dumb name.
This sounds more like a much more intense yield-maximization method than a power-saving technique.

Agreed. They use electromigration as a way to blow the fuse. That's not a reversible phenomenon, as far as I know.
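In code terms the whole bank behaves like write-once bits: you can set them, never clear them. (Again, just a toy model of the general one-time-programmable idea, not IBM's actual interface.)

```c
#include <stdint.h>
#include <stdio.h>

/* Model of a one-time-programmable fuse bank: programming can only
 * set bits (electromigration blows the link), never clear them. */
static uint32_t fuse_bank = 0;

static void blow_fuse(int bit)
{
    fuse_bank |= (1u << bit);   /* OR-only: there is no way back */
}

int main(void)
{
    blow_fuse(2);
    blow_fuse(5);
    printf("fuse bank: 0x%08x\n", fuse_bank);

    /* "Unblowing" bit 2 simply isn't an operation the hardware
     * offers; the mask can only ever accumulate ones. */
    return 0;
}
```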
 