80nm sucks.
Thanks for the sig.
How concise of you!
Well, pardon my French, but ATI has had nothing but problems with it (***edit: and yes, I mean R600) and I haven't heard the most glowing impressions from NV's side for G84/86 either. It seems rather obvious what is at fault here.
Yes indeed. It is ATI who screwed up R600 on 80nm; there's nothing wrong with the 80nm process itself.
You know for a fact that there are no process-related issues? How?
Ok, I don't know it for a fact, but it seems clear that G84/86 are totally fine while R600 is not. I believe it's ATI's fault that R600 is bad, not the process's.
Speaking of which, why would the process have any serious problems to begin with? 80nm is just an optical shrink of 90nm, which means it should work nearly the same.
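For rough numbers, an optical shrink scales every linear dimension by the node ratio, so die area scales by its square. A quick back-of-the-envelope sketch (illustrative arithmetic only, not foundry data):

    # What a pure 90nm -> 80nm optical shrink buys, roughly.
    old_node, new_node = 90.0, 80.0
    linear_scale = new_node / old_node  # ~0.889 per linear dimension
    area_scale = linear_scale ** 2      # ~0.790: same design in ~79% of the area
    print(f"linear: {linear_scale:.3f}, area: {area_scale:.3f}")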
"Should work nearly the same." is engineering speak for it's going to mess everything up because you thought it wouldn't.
In addition, I doubt G84/86 tax the process nearly as much as R600 does. Does that make the 80nm process bad? Not necessarily, but it may be a bad match for R600. So it may be a combination of things.
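To put rough numbers on "taxing the process": R600 is reported at around 700M transistors on a ~420mm^2 die, versus roughly 290M and ~170mm^2 for G84. A toy Poisson yield model shows why a bigger die suffers far more from the same defects (the defect density here is an assumed illustrative value, not real TSMC data):

    from math import exp

    # Toy Poisson yield model: Y = exp(-D * A).
    # D = defects per cm^2 (hypothetical value), A = die area in cm^2.
    defect_density = 0.5
    dies = {
        "G84 (~169 mm^2)": 169 / 100.0,  # mm^2 -> cm^2
        "R600 (~420 mm^2)": 420 / 100.0,
    }
    for name, area_cm2 in dies.items():
        print(f"{name}: estimated yield ~{exp(-defect_density * area_cm2):.0%}")
    # -> G84 ~43%, R600 ~12% under these made-up assumptions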
See, we've heard that before, back in the NV30 dark ages, so allow us to be skeptical...
It too had a deadly combination: a new fabrication process (130nm low-k), a new API (DirectX 9), new memory technology (DDR2), and huge cooling requirements (for its time, of course).
And what happened?
It arrived later than the competing solution, was slower than expected, ran hot (not "wanted" hot, but "ouch!" hot), and was hard to find after the official launch, eventually living a very short shelf life once the NV35/37/38 series debuted.
Even the R300 didn't risk as much, sticking with the proven 150nm node and standard DDR1 coupled with a wider 256-bit bus.
The fact that it stuck to the DX9 spec (FP24, with an emphasis on shader processing speed and image quality) didn't hurt either.
Oh yes, the NV40 and R420 later came out on 130nm (they may have had low-k, I don't remember) and they were fine. NV35 wasn't too bad either, albeit not very exciting. That was proof that the fault lay with the NV30 itself and nothing else. I suspect the same thing with the R600. Somewhere, ATI screwed up, because even though 80nm may have its problems, it isn't exactly bleeding-edge stuff anymore.
Not really. The existence of good chips that came out on time in high volumes on the same node may just indicate that they figured out how to work with the process properly.
Yeah, NV40 and R420 came out on the 130nm node... without low-k, and almost two years later.
Geez, I sure hope it's not really running at those clocks in 2D. But all signs point to G84 being ready to go.
Ok, this could be it:
http://club.ocer.net/attachments/2007/03/2227_200703282131471.jpg
64 SPs, 720MHz/2200MHz (core/memory).
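If those clocks are real, and assuming G84 keeps a 128-bit memory bus (an assumption on my part; the shot only shows clocks), the bandwidth math comes out like this:

    # Quick bandwidth check for the rumored clocks.
    mem_clock_mhz = 2200    # effective (DDR) memory clock from the screenshot
    bus_width_bits = 128    # assumed, not shown in the screenshot
    bandwidth_gb_s = mem_clock_mhz * 1e6 * (bus_width_bits / 8) / 1e9
    print(f"~{bandwidth_gb_s:.1f} GB/s")  # ~35.2 GB/s with these assumptions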