DaveBaumann said: G7x will support 3Dc.

So G7x=nv50 then?

digitalwanderer said: So G7x=nv50 then?

Why? 3Dc is nothing heavy or hard to implement in the NV4x pixel pipeline.
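The point about 3Dc being cheap to support: the format stores only the X and Y components of a tangent-space normal, and the pixel shader rebuilds Z from the unit-length constraint. A minimal Python sketch of that reconstruction math (function name and values are mine, for illustration):

```python
import math

def reconstruct_normal(x: float, y: float) -> tuple:
    """Rebuild a unit normal from the two channels a 3Dc texture stores.

    3Dc keeps only X and Y (each in [-1, 1]); the shader derives Z from
    x^2 + y^2 + z^2 = 1, which is why supporting it costs so little:
    roughly one extra sqrt per texture fetch.
    """
    z_sq = max(0.0, 1.0 - x * x - y * y)  # clamp guards against compression error
    return (x, y, math.sqrt(z_sq))

# Example: a normal tilted along X
n = reconstruct_normal(0.6, 0.0)
print(n)  # z comes out near 0.8, since 0.36 + 0.64 = 1
```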
digitalwanderer said: So G7x=nv50 then?

Dig, don't be stupid. Dave has repeatedly referred to G70, and now he's referring to G7x, which implies two things.
1. G70 is the next-gen NV enthusiast architecture.
2. G7x is a whole new line of chips--just like NV4x--so we can expect to see new mid-range GPUs, new low-end GPUs maybe, etc. It's a good thing for everybody.
Then again, the X might imply RAMPAGE! or GIGAPIXEL! (it's a joke and if you comment on it, I hope that Dave will break you)

The Baron said: Dig, don't be stupid. Dave has repeatedly referred to G70, and now he's referring to G7x, which implies two things.

Two words for ya buddy, "DONKEY KONGA"!

digitalwanderer said: Two words for ya buddy, "DONKEY KONGA"!

I will destroy you. Like, I'm driving up to Indiana now.
egore said: So I guess Futuremark will be adding 3Dc support then.

Oh hey, will R520 support DST?
The Baron said: Oh hey, will R520 support DST?
They already do, one way or the other (through pixel shader comparison); it's just slower.
They could probably also expose it in the API as NVIDIA does. Maybe they already do (for pixel shader 1.x?).
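"Through pixel shader comparison" means fetching the stored shadow-map depth yourself and comparing it against the fragment's depth, instead of letting dedicated DST hardware return the comparison result (optionally percentage-closer filtered). A toy Python sketch of the manual path, with invented names and data:

```python
def shadow_test_manual(shadow_map, u, v, fragment_depth, bias=0.005):
    """Return 1.0 if the fragment is lit, 0.0 if shadowed.

    This mimics what a pixel shader must do without hardware depth
    textures: an ordinary texture fetch plus an explicit compare.
    The bias fights self-shadowing ("shadow acne") from precision error.
    """
    stored = shadow_map[v][u]  # depth the light "saw" at this texel
    return 1.0 if fragment_depth - bias <= stored else 0.0

# 2x2 toy shadow map of stored light-space depths (made up for the example)
shadow_map = [[0.5, 0.9],
              [0.3, 0.7]]
print(shadow_test_manual(shadow_map, 0, 0, 0.4))  # fragment in front: lit
print(shadow_test_manual(shadow_map, 0, 1, 0.6))  # fragment behind: shadowed
```

With hardware DST, the compare (and a 2x2 filter of compare results) happens in the sampler for free, which is why the shader route works but is slower.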
Ostsol said: Doesn't ATI support depth textures already in OpenGL? The format doesn't contain stencil data, though...
geo said: So are we liking the odds of seeing 3dc in WGF now?

I always thought that was pretty much guaranteed.