So... what will G80/R600 be like?

_xxx_

I think it's time to officially begin speculating :)

I say both will have unified shaders, decoupled tex units, advanced memory controller and scheduler along the lines of R520 and at least 24 "pipes". HDR+AA and correct AF will be there as well. G80 will of course feature multiple clock domains, R600 maybe as well.

I also think we'll see a replay of this year (business-wise). G80 in late summer, R600 a few months later. I do expect ATI to have parts which are actually available this time around, but nV doing the same trick they did this year.

All just IMHO. What do you people think?

EDIT: to make it perfect, I think they'll both feature some physics calculations capabilities, too :p
 
If G80 has unified shaders I'd expect it to be after R600. They seemed to be fighting the notion while ATI was pushing for it.
 
_xxx_ said:
I think it's time to officially begin speculating :)

I say both will have unified shaders, decoupled tex units, advanced memory controller and scheduler along the lines of R520 and at least 24 "pipes". HDR+AA and correct AF will be there as well. G80 will of course feature multiple clock domains, R600 maybe as well.

I also think we'll see a replay of this year (business-wise). G80 in late summer, R600 a few months later. I do expect ATI to have parts which are actually available this time around, but nV doing the same trick they did this year.

All just IMHO. What do you people think?

Triple-slot cooling!!!!!!!
 
AlphaWolf said:
If G80 has unified shaders I'd expect it to be after R600. They seemed to be fighting the notion while ATI was pushing for it.

Nah, that's just politics like ATI downplaying SM3.0 until they had SM3.0 parts themselves, where it suddenly became the best thing since sliced bread.
 
_xxx_ said:
Nah, that's just politics like ATI downplaying SM3.0 until they had SM3.0 parts themselves, where it suddenly became the best thing since sliced bread.

As far as I can remember, ATi said they wouldn't do 3.0 until it could be done right, whereas nV didn't see the usefulness of unified shaders just yet (or something of the sort). In too much of a hurry, though.

_
K
 
Matasar said:
R600 supposed to hit late Q4 2006 or early 2007?
Since it will most probably be a DX10 part and DX10 will be introduced with Vista, it will be available shortly before the Vista launch (probably at least one or two months earlier). Vista is supposed to launch mid-2006, but will probably be a bit late since Beta 2 has slipped to January or February.
 
Kaizer said:
As far as I can remember, ATi said they wouldn't do 3.0 until it could be done right, whereas nV didn't see the usefulness of unified shaders just yet (or something of the sort). In too much of a hurry, though.

_
K

Well yeah, it's valid for both sides. But frankly, do you think nV did SM3.0 wrong in some way? Seeing the current bunch of benchmarks, I somehow can't see nV's SM3 being useless, nor do I think that unified shaders are useless.
 
From what it looks like, unified shader cores cannot be avoided in the long run; whether NVIDIA will arrive with such a solution at least for its first DX10 incarnation is another story. I personally wouldn't bet my money on it.
 
An analyst contacted me a little while back saying that some of the Wall Street guys who are in contact with them are now of the opinion that G80 will be unified. I suspect that this could well be the case now, but I don't think it was the initial plan.
 
I'm waiting for the ExGPU. My vision of this would be a crazy CrossFire/SLI config where it connects to a master card through an external cable (a la CrossFire) and is a self-contained rendering box. External GPU! With all the one-upmanship recently, I wouldn't be surprised.

Are we expecting more of a performance-based upgrade in R600 and G80, or more emphasis on features?
 
Dave Baumann said:
An analyst contacted me a little while back saying that some of the Wall Street guys who are in contact with them are now of the opinion that G80 will be unified. I suspect that this could well be the case now, but I don't think it was the initial plan.
Even if we know it could mean anything, we should remember that nvidia has actually patented some unified shading tech.
 
Someone said that R600 can't be released until 1 year after the Xbox launch because they signed some contract with MS?
 
I think it was probably in the roadmap. But I think NVidia PR is divorced from NVidia engineering.

For example, I don't believe NVidia "threw in" a 256-bit bus to the NV35 at the last moment after being "surprised" by the R300. I think they had investigated all the options available to them in terms of # of pipelines and bus size, and I think a 256-bit bus was in the internal roadmap; it was just a question of whether they could deliver it in time. They probably deemed the extra complexity as pushing an already delayed design even further out, so the decision was made to launch with 128 bits (and 4 ROPs). Just like ATI did with the R400/R420/R520/Xenos, they settled on the fastest thing they could deliver to market, by taking baby steps. However, in NVidia's case, their decision to go with a pared-down bus for the initial launch of the architecture put them at a PR disadvantage, so that's why you saw all that FUD about "we don't need a 256-bit bus", and then one quarter later, they launched a 256-bit bus.
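Just to put rough numbers on why the bus width mattered so much: peak bandwidth scales linearly with bus width at a given data rate. A back-of-the-envelope sketch (the clocks below are roughly NV30/NV35-class, but treat them as illustrative, not exact figures):

```python
# Rough peak-bandwidth arithmetic: bus width (bits) / 8 * effective data rate.
# Clocks are illustrative, roughly NV30/NV35-class parts.
def peak_bandwidth_gbs(bus_bits: int, effective_mts: float) -> float:
    """Peak bandwidth in GB/s for a bus of `bus_bits` at `effective_mts` MT/s."""
    return bus_bits / 8 * effective_mts * 1e6 / 1e9

print(peak_bandwidth_gbs(128, 1000))  # ~16 GB/s  (128-bit bus)
print(peak_bandwidth_gbs(256, 850))   # ~27 GB/s  (256-bit bus at a lower clock)
```

Even at a lower memory clock the wider bus wins comfortably, which is exactly why the 128-bit launch needed so much PR cover.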

I think NV has been badmouthing unified for fear that ATI would beat them to market with a unified architecture, but I don't think that internally they are forgoing a unified architecture. I think it's in the roadmap, has always been there, the only difference is, the degree to which the roadmap has been compressed by pressure from ATI and the hype around unified.
 
ever wonder what nVidia did/does with all the 3dfx tech they now own? i've often wondered how different nVidia's products would be w/o 3dfx, if they didn't own all the IP

idk, just thinking maybe we might see some un-used/un-shown 3dfx technology surface, combined with more engineering, sort of like nVidia seriously showing their teeth. not that the competition would go away, ATi has enough momentum to stay competitive imo

still, i'm wondering what ever happened to tiled gfx (ignoring CrossFire's use of tiles)
the Kyro II used it, and 3dfx's roadmap showed a product 4-5 gens out from Rampage, "Mojo", that was to use tiled graphics, but nothing else has been published on that...and btw, for what it had in terms of hardware the Kyro II was amazing...


but given 3dfx's use of tiles for memory, possibly a rendering solution wherein a small portion of the screen has its own RAM address range and a portion of the rendering power? then again, who knows?
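for anyone who never looked at how a tiler works, here's a minimal binning sketch (purely illustrative, nothing to do with whatever Mojo actually was): split the screen into small tiles, bin each triangle into the tiles its bounding box touches, and then a Kyro-style chip would rasterize and hidden-surface-remove each tile out of on-chip memory.

```python
# Minimal tile-binning sketch: assign each triangle, by its screen-space
# bounding box, to every 32x32 tile it overlaps. A Kyro-style TBDR would
# then rasterize and depth-resolve each tile from on-chip memory.
TILE = 32

def bin_triangles(triangles, screen_w, screen_h):
    tiles_x = (screen_w + TILE - 1) // TILE
    tiles_y = (screen_h + TILE - 1) // TILE
    bins = {(tx, ty): [] for ty in range(tiles_y) for tx in range(tiles_x)}
    for tri in triangles:                       # tri = [(x0, y0), (x1, y1), (x2, y2)]
        xs = [p[0] for p in tri]
        ys = [p[1] for p in tri]
        x_min, x_max = max(0, min(xs)), min(screen_w - 1, max(xs))
        y_min, y_max = max(0, min(ys)), min(screen_h - 1, max(ys))
        for ty in range(int(y_min) // TILE, int(y_max) // TILE + 1):
            for tx in range(int(x_min) // TILE, int(x_max) // TILE + 1):
                bins[(tx, ty)].append(tri)
    return bins

# e.g. bin_triangles([[(10, 10), (50, 12), (30, 30)]], 1024, 768)
# puts that triangle in tiles (0, 0) and (1, 0).
```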

as far as G80/R600...i'm thinking GDDR4 for ATi, and XDR or XDR II for G80, with a bias towards XDR II, along with DX10 support, maybe higher FP precision, maybe not, better CrossFire modes for ATi, and possibly a version of SLI that allows non-50/50 SMP (ASMP rendering, but w/o tiles?)
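on the non-50/50 SMP idea, a hypothetical sketch of what adaptive split-frame balancing could look like: nudge the split line each frame so the slower card gets less of the next one (names and numbers below are made up for illustration, not how SLI/CrossFire actually schedule):

```python
# Hypothetical adaptive split-frame load balancing: shift the horizontal
# split so the GPU that took longer last frame gets a smaller share next frame.
def update_split(split, ms_gpu_a, ms_gpu_b, gain=0.05):
    """`split` = fraction of the frame (0..1) rendered by GPU A."""
    imbalance = (ms_gpu_a - ms_gpu_b) / max(ms_gpu_a + ms_gpu_b, 1e-6)
    split -= gain * imbalance               # shrink A's share if A was slower
    return min(0.9, max(0.1, split))        # keep both GPUs doing some work

split = 0.5
for ms_a, ms_b in [(9.0, 5.0), (8.2, 5.8), (7.4, 6.6)]:   # made-up frame times
    split = update_split(split, ms_a, ms_b)
    print(round(split, 3))                  # A's share drifts down, B picks up more
```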
 