What time on Tuesday does the G70 NDA expire?

Hmm... so considering it is similar to PS3's RSX, how does G70 compare to ATI's Xenos unified vertex/pixel architecture (which they don't intend to use until TBA)? The specs don't impress me that much and the SLI score is just plain confusing :oops: :?:
 
nAo said:
Unknown Soldier said:
So the GTX still only has 16 pipes? Sorry... I browsed really quickly past the diagram.
It shades 24 pixels per clock, but it can't output more than 16 shaded pixels per clock.

Yeah, so it's 16 pipes if we think of it the old way, right?
 
alexsok said:
Hmm... so considering it is similar to PS3's RSX, how does G70 compare to ATI's Xenos unified vertex/pixel architecture (which they don't intend to use until TBA)? The specs don't impress me that much and the SLI score is just plain confusing :oops: :?:

There is a comparison chart with R500 in the article.
 
U S, a quick read of B3D's 6800 and X800 reviews would show you both have separate pixel and vertex shaders, like every other programmable gamer's 3D card before them. Xenos will be the first to use "amalgamated" shaders.

The 6800U preview would also show you that nV has separated ROPs from pixel "pipes," thus 24 "pipes" (6 quads) and 16 ROPs.
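A quick back-of-envelope sketch of what the 24-pipe/16-ROP split means in practice (the 430 MHz core clock is the announced 7800 GTX spec; the flat per-clock model is a deliberate simplification):

```python
# Back-of-envelope: G70's 24 pixel shader "pipes" vs. 16 ROPs.
# Shading throughput and ROP output are decoupled, so the peak is set
# by whichever stage is narrower for a given workload.

CORE_CLOCK_MHZ = 430  # announced 7800 GTX core clock

def peak_rates(shader_pipes: int, rops: int, clock_mhz: int) -> dict:
    """Theoretical Gpixels/s shaded vs. written out per second."""
    return {
        "shaded_gpix_s": shader_pipes * clock_mhz / 1000,  # pixels shaded
        "output_gpix_s": rops * clock_mhz / 1000,          # pixels written
    }

g70 = peak_rates(shader_pipes=24, rops=16, clock_mhz=CORE_CLOCK_MHZ)
# With long pixel shaders the 24 pipes are the bottleneck and the ROPs
# idle; with trivial shaders the 16 ROPs cap output instead.
print(g70)
```

So "24 pipes" and "16 pipes" are both right depending on which stage you count; you only lose out on the extra shading power when the shader work per pixel is trivial.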

Why has performance improved so dramatically in CMR2005 and TR:AoD? Is it the separated texture processor or (what looks to be) the two full ALUs per pipe?

Edit: Heh, "anti-denticle."
 
_xxx_ said:
geo said:
that would move us into three full years since NV was the performance leader on a top card vs top card basis.


:? What?

He means that if the rumoured R520 score proves to be correct this will mean ATi retains the performance crown for the 3rd card generation.

Personally, I think that basing that on a synthetic benchmark score alone makes it a very debatable argument.
 
Mordenkainen said:
He means that if the rumoured R520 score proves to be correct this will mean ATi retains the performance crown for the 3rd card generation.

I got that, but since when does ATI have the performance crown? It depends on the game/benchmark, sometimes ATI is the winner and sometimes nV. Crowning either as the "winner" is ridiculous.
 
Pete said:

Will you people PLEASE stop posting links that are being flagged as "inappropriate" by the corporate filters here!! :devilish: I've clicked on about 6 of them in 5 mins so I expect some guys in black suits and dark shades to show up anytime now! :cry:
 
Unknown Soldier said:
Mordenkainen said:
Unknown Soldier said:
I'm surprised to see the normal GTX smack the crap outta the GTX SLI.

Is something wrong there?

US

You do know that the translated page you're linking to features an image of a slightly goofy and bulky guy with the caption "Heh heh, I came :P", right? :LOL:

Aghh!! No man!! .. Damn Link is too long for Google.

Fixed Thx Captain :D

Btw, re your original question: assuming these benches are accurate, Doom 3 at 1024, with the demo they used, is CPU limited for the 6800 SLI, GF7 and GF7 SLI. You can see that at 1600 the difference between the two opens up.
 
Pete said:
U S, a quick read of B3D's 6800 and X800 reviews would show you both have separate pixel and vertex shaders, like every other programmable gamer's 3D card before them. Xenos will be the first to use "amalgamated" shaders.

The 6800U preview would also show you that nV has separated ROPs from pixel "pipes," thus 24 "pipes" (6 quads) and 16 ROPs.

Pete .. ye .. I actually understand that, my bad for wording it so badly. What I was trying to get at was that they've never before separated the ALUs and said it's 24 pixel and 8 vertex (if they have .. then my bad .. I shouldn't really rush through the reviews). I'm not really an engine type of guy since I don't really understand all the ALUs, ROPs, etc. I'm getting there .. but am slow to take in the info.

US
 
alexsok said:
Hmm... so considering it is similar to PS3's RSX, how does G70 compare to ATI's Xenos unified vertex/pixel architecture (which they don't intend to use until TBA)? The specs don't impress me that much and the SLI score is just plain confusing :oops: :?:
Hardspell had a comparison table for G70, RSX, and Xenos on one of the 41 pages. Dunno how accurate it is WRT Xenos, though.

trini, it was just a Google translation of the Hardspell review, with accompanying funny terms, like "anti-denticle" for anti-aliasing, "shader establishment" for shader model, and "Fission Cell" for Splinter Cell.

U S, they've always listed pixel shaders separately from vertex shaders. And the GF6 has split ALUs, though I don't know if the GF7's are now fully functional and separate twins, or if nV has simply added extra functionality to one (nAo said both can now do MAD--see the pic under "How to Fuel the Pipeline" in that link).
 
Mordenkainen said:
Btw, re your original question: assuming these benches are accurate, Doom 3 at 1024, with the demo they used, is CPU limited for the 6800 SLI, GF7 and GF7 SLI. You can see that at 1600 the difference between the two opens up.

Mordenkainen .. Sorry my bad .. I've linked the incorrect page.

My original question was referring to this.

http://www.hardspell.com/hard/showcont.asp?news_id=14372&pageid=3058

Call Of Duty.

The Normal GTX craps all over the SLI. Shouldn't the SLI have the advantage?

My bad.

US
 
Pete said:
Hardspell had a comparison table for G70, RSX, and Xenos on one of the 41 pages. Dunno how accurate it is WRT Xenos, though.
It's accurate if we believe the numbers from ATI and MS.
 
DaveBaumann said:
Basing anything on BS rumour numbers is pretty shaky ground in the first place.

Ahh ye .. you're right of course.

/me sits on hands and waits for Dave's review and the 26th July to come by.

US
 
Pete said:
U S, they've always listed pixel shaders separately from vertex shaders. And the GF6 has split ALUs, though I don't know if the GF7's are now fully functional and separate twins, or if nV has simply add extra functionality to one (nAo said both can now do MAD--see the pic under "How to Fuel the Pipeline" in that link).

They are still not twins, but more equal than in NV40.
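One simplified way to count the difference the thread is describing (treating a MAD as two flops and a MUL as one, per component, and ignoring the mini-ALUs; the op assignments follow the thread's claim that NV40 had one MAD-capable unit and G70 has two):

```python
# Toy per-pipe ALU comparison, NV40-style vs. G70-style pipes.
# Assumption from the discussion: NV40 = one MUL unit + one MAD unit,
# G70 = two MAD-capable units. MAD = multiply-add = 2 flops, MUL = 1.

def flops_per_pipe_per_clock(alu_ops):
    """alu_ops: the op each ALU in the pipe can issue per clock."""
    return sum(2 if op == "MAD" else 1 for op in alu_ops)

nv40_pipe = flops_per_pipe_per_clock(["MUL", "MAD"])  # one full MAD unit
g70_pipe  = flops_per_pipe_per_clock(["MAD", "MAD"])  # both units do MAD
print(nv40_pipe, g70_pipe)
```

On this crude count each G70 pipe gains a third more ALU throughput even before the jump from 16 to 24 pipes, which would go some way toward explaining the shader-heavy benchmark gains mentioned earlier.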
 
In Call of Duty, I have noticed that in many parts a single-board 6800 GT will outperform an SLI GT set. That title is pretty CPU limited, even at high res and AA.

I was able to actually play CoD: UO at 1024x768 with 16X AA and it was still surprisingly smooth. Good looking title, but not terribly tough on the video card anymore. So it wouldn't surprise me at all if it performed better on a single card vs. SLI.
 