When on Tuesday does the G70 NDA expire?

geo said:
_xxx_ said:
geo said:
that would move us into three full years since NV was the performance leader on a top card vs top card basis.


:? What?

Which part are you having a problem with?

I think he is hinting that the "top card" for NVIDIA is the 6800 Ultra SLI. Ok, so it's not a single card, but it is a product that a user can go out and buy and make look like one card (or something like that).
 
JoshMST said:
I think he is hinting that the "top card" for NVIDIA is the 6800 Ultra SLI.

No, I'm just saying that there's no way to determine a "winner", since depending on the game or situation the picture is rather balanced.
 
Unknown Soldier said:
My original question was referring to this.

http://www.hardspell.com/hard/showcont.asp?news_id=14372&pageid=3058

Call Of Duty.

The Normal GTX craps all over the SLI. Shouldn't the SLI have the advantage?

Ah... I see. Well, yes. That is surprising. I think that's most likely an SLI driver problem, as the single 6800 also beats the 6800 SLI.

It's also surprising that even after all this time the game is still GPU limited.
 
So the GF7800's performance is not stellar .. and Dave says we should not expect stellar performance from the R520 either. So this round of GPUs/VPUs is gonna be a disappointment then, hey?

/me starts flicking boogies at DW .. you awake boy!!?? :D

US j/k about the boogies DW
 
_xxx_ said:
I got that, but since when does ATI have the performance crown? It depends on the game/benchmark, sometimes ATI is the winner and sometimes nV. Crowning either as the "winner" is ridiculous.

But ATI wins more benchmarks...

You could use the same logic to prove the 6800 Ultra SLI isn't better than the X850 XT PE because it loses in certain games.
 
Unknown Soldier said:
Mordenkainen said:
Btw re your original question, assuming these benches are accurate at 1024 D3, with the demo they used, is CPU limited for 6800 SLI, GF7 and GF7 SLI. You can see that at 1600 the difference between the two opens up.

Mordenkainen .. Sorry my bad .. I've linked the incorrect page.

My original question was referring to this.

http://www.hardspell.com/hard/showcont.asp?news_id=14372&pageid=3058

Call Of Duty.

The Normal GTX craps all over the SLI. Shouldn't the SLI have the advantage?

Call Of Duty is not one of the games supported by SLI; it will only run on one card even with an SLI setup.
 
Tim said:
Call Of Duty is not one of the games supported by SLI; it will only run on one card even with an SLI setup.

That doesn't explain why it's faster... or does the driver still have all the SLI overhead even when the game isn't supported?
 
Yay for NDA translating to "publish whenever" in Mandarin :rolleyes: :LOL:
 
Rys said:
Yay for NDA translating to "publish whenever" in Mandarin :rolleyes: :LOL:

Well TBH I haven't really learned anything new from this article (especially about the new features). Translations suck and there seem to be some CPU limitations. Don't worry, I'm quite sure Hexus' review will still make quite a splash ;)
 
_xxx_ said:
geo said:
Which part are you having a problem with?

"Performance leader". I explained it a couple of posts ago.

Sorry, missed that. I think Dave's point re single rumored bench is pretty on target tho, so let's just see.

Btw, I never used "winner", which is a much more broad kind of metric that brings in a whole bunch of other factors, and I was careful to put it in terms of NV's own stated goals in this area, which as I understand them are not to be "competitive", "arguably", "mixed", or anything of the kind.

Me, I'm happy with "competitive", as my original post said.

Edit: And, actually, re-reading my post, I didn't call ATI the performance leader either!
 
Aren't these new cards (G70/R520) supposed to be better with the hyped high dynamic range (HDR) feature? I read in a couple of places that this is one of the "next big things" in games. So I wouldn't want to buy a card that is crippled in this area... That is why I'm reluctant to go with a 6800 series right now, never mind ATI without SM3.

I have an old 9500 Pro that has been very faithful to me but is starting to show its age. I was thinking of holding out for a 7800 (quite the leap, I imagine). However, the GTX seems kind of pricey right now. Anyone have any idea when the 7800 GT is going to be released?

I'll be upgrading my whole computer at once but I've been holding out waiting for a next gen PCI-E card that I want to last a while. Thinking I'll need it for Oblivion, NWN2, Quake IV, etc when they come out... To play at decent frame rates at least anyway... :)


Will go with an A64 Venice core maybe and pray it overclocks like hell. The A64 X2 seems pricey right now too, so I might not jump on that wagon.
 
_xxx_ said:
Yeah, so it's 16 pipes when we think the old way, right?
That depends...
In the "old way", fragment pipelines ("pixel shaders") and ROPs were not separated, so depending on what you consider the main part of the "old pipelines", G70 has either "24 pipes", "16 pipes" or maybe "16 pipes plus 8 half-pipes". ;)

Or you just drop the "old pipes" and say 6 quad fragment pipes and 4 quad ROPs.
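The counting ambiguity above is easy to see in numbers. A minimal sketch, using the published GeForce 7800 GTX reference specs (24 fragment pipelines, 16 ROPs, 430 MHz core clock) — the one-operation-per-pipe-per-clock rate here is a simplifying assumption, not a claim about real shader throughput:

```python
# Why "how many pipes?" depends on what you count.
# GeForce 7800 GTX (G70) reference specs.
CORE_HZ = 430e6       # 430 MHz core clock
FRAGMENT_PIPES = 24   # fragment ("pixel shader") pipelines
ROPS = 16             # raster operation units

# Shading rate scales with the fragment pipelines
# (assuming one pixel shaded per pipe per clock, for illustration)...
shading_rate = FRAGMENT_PIPES * CORE_HZ

# ...but the rate at which finished pixels reach the framebuffer is
# capped by the ROPs -- the unit the "old" pipe count really measured.
fill_rate = ROPS * CORE_HZ

print(f"Peak shading rate: {shading_rate / 1e9:.2f} Gpixels/s")
print(f"Peak fill rate:    {fill_rate / 1e9:.2f} Gpixels/s")
```

So by shading work G70 looks like a "24-pipe" part, while by framebuffer writes it behaves like a "16-pipe" one — which is exactly why "6 quad fragment pipes and 4 quad ROPs" is the less ambiguous description.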

Unknown Soldier said:
Pete .. ye .. I actually understand that, my bad for wording it so badly. What I was trying to get at was that they've never before separated the ALUs and said it's 24 pixel and 8 vertex (if they have .. then my bad .. I shouldn't really rush through the reviews). I'm not really an engine type of guy since I don't really understand all about ALUs, ROPs etc. I'm getting there .. but am slow to capture the info.
ALUs are the core of the shader pipelines. So having separate vertex and fragment shader pipelines implies having separate ALUs.
 
So, seeing the drop in frame rate when AA is on, do you think Xenos will be more powerful, since AA won't make such a severe hit in performance?
 