So umm, dual GPU on one card is a nice little end around by Nvidia

superguy

Who cares if it's two? It's still going to be the fastest single card, and isn't that what these companies are after?

It's no different than if they had one superfast card for $799.
 
There is less software compatibility with two cores than with one. Sometimes the score with two cores isn't even higher than with one core, and often the gain is less than 30%.
 
Don't forget only being able to use half the memory on the card. Quad SLI would have substantial lag as well in AFR mode, but I guess dual GPU shouldn't be that bad. I remember the Rage Fury MAXX being ripped over this.

Any effect requiring persistent textures — image burn-in, motion trails, cloth/water/fur simulation, etc. — would really hammer SLI scaling, because those textures would have to be shuttled back and forth between the memory copies.

SLI would be worth it IMO if you can make a super fast (say, 50GB/s) interconnect between the chips. Then it should be possible to avoid the memory duplication. Maybe NVidia and/or ATI already have something like this in the works...
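The memory-duplication point above can be made concrete with a back-of-the-envelope sketch. All the figures and function names here are illustrative assumptions for the sake of the argument, not vendor specs:

```python
# Rough model of effective memory and inter-chip traffic under AFR SLI.
# Numbers are illustrative assumptions, not measured hardware behavior.

def effective_memory_mb(per_gpu_mb, num_gpus, shared_pool=False):
    """Under classic AFR every GPU mirrors the working set, so capacity
    does not add up; with a hypothetical fast shared interconnect it could."""
    return per_gpu_mb * num_gpus if shared_pool else per_gpu_mb

def shuttle_cost_ms(persistent_texture_mb, link_gb_per_s):
    """Time to copy a persistent render target (e.g. a motion-trail buffer)
    from the GPU that produced frame N to the one rendering frame N+1."""
    return persistent_texture_mb / 1024.0 / link_gb_per_s * 1000.0

print(effective_memory_mb(512, 2))         # mirrored: still 512 MB usable
print(effective_memory_mb(512, 2, True))   # hypothetical shared pool: 1024 MB
print(round(shuttle_cost_ms(16, 50), 2))   # 16 MB over a 50 GB/s link: ~0.31 ms
```

The last line shows why a ~50 GB/s link changes the picture: at that rate, shuttling a modest persistent texture costs a fraction of a millisecond per frame instead of wrecking scaling.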
 
So if you have quad SLI with two GPUs per card, 512 MB each, how much effective memory is that? 256 MB or 512 MB?


And if they start going to multi-core GPUs, how do you handle each core getting memory access? Do they each have to have separate buses, or one bus for all?
 
You could say all the same things about SLI. Did it become larger and larger in the marketplace or not?

Yeah, it did become more and more important. Sure, we'll reach a point of diminishing returns, but who knows where.
 
And Charlie says CPU is the bottleneck.

If that's true, it basically bodes huge success for the system, because it means the GPU is hugely powerful and only constrained by the CPU.

It would be far worse to be bottlenecked by some other intrinsic factor.
 
Xbot360 said:
So if you have quad SLI with two GPUs per card, 512 MB each, how much effective memory is that? 256 MB or 512 MB?


And if they start going to multi-core GPUs, how do you handle each core getting memory access? Do they each have to have separate buses, or one bus for all?


See, what the general and mostly oblivious money-throwing gamer doesn't understand is that this is a PR campaign more than an aim for great performance. Graphics processing units are already multi-cored internally. Nvidia and ATI are going to attempt to cash in on the (sorry!) idiots who think more always = much better. I know, especially on this board, I am by far not the only one who is a bit saddened to watch these companies show more and more that they would rather make a buck, and frankly I think both are getting a bit lazy. It's been quite a while since something completely unique has been done. Having the most hype and fancy (even if unneeded) junk in your high-end products sells more low/mid-range products for that company. They don't make souped-up cards to make a lot of money on the cards themselves.

In an industry where it has become common for products to be refreshed with 15-30%+ performance increases over the previous ones every 6 months (with maybe 2-3 titles in that span worth stressing that hardware on, if you're lucky), spending very large amounts of money is looking more and more like a joke.
 
Seems to me the better option for quad-AFR is to use the GPUs not for serial framerate increases, but to increase AA or add temporal AA. Another option is maybe "speculative AFR": if the game can determine from its inputs (controls, network) that the game state has updated, but not in a way that significantly invalidates the N+2th or N+3th frame, then it can use AFR; otherwise, it would discard those frames. Of course, the only problem is that this would make the framerate jerky, and it would fall down exactly where framerate drops the most — rotating the view.
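The "speculative AFR" idea above can be sketched as a validity check on pre-rendered frames. Everything here — the input fields, the drift threshold, the function name — is hypothetical, just to show the shape of the decision:

```python
# Hypothetical speculative-AFR dispatch: frames N+2/N+3 are rendered ahead
# of time, then discarded if new input invalidates the predicted game state.

def frame_still_valid(predicted_input, actual_input, threshold=0.05):
    """Treat a speculative frame as usable if the inputs it was rendered
    against differ only slightly from the inputs that actually arrived.
    A view rotation is the worst case: it invalidates nearly every pixel."""
    if actual_input["view_rotated"]:
        return False
    drift = abs(predicted_input["move"] - actual_input["move"])
    return drift <= threshold

predicted = {"move": 1.00, "view_rotated": False}
actual = {"move": 1.02, "view_rotated": False}
print(frame_still_valid(predicted, actual))       # True: present the frame

actual_turn = {"move": 1.00, "view_rotated": True}
print(frame_still_valid(predicted, actual_turn))  # False: discard, re-render
```

The second call illustrates the poster's own objection: any view rotation forces a discard, so the scheme degrades exactly when framerate matters most.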

Given a quad-7900GTX setup, you might be able to get away with 16x temporal antialiasing on older titles. Each card could render the frame 4 times, and with 4 cards, you'd get 16x.
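The 16x arithmetic above works out as follows, assuming each of the 4 GPUs renders the same frame 4 times with a different subpixel jitter. The evenly spaced grid offsets here are just an illustration, not Nvidia's actual sample pattern:

```python
# 4 GPUs x 4 passes each = 16 distinct jittered samples per displayed frame.
num_gpus = 4
passes_per_gpu = 4
total_samples = num_gpus * passes_per_gpu
print(total_samples)  # 16

# One simple (hypothetical) way to pick 16 subpixel offsets: a 4x4 grid,
# each offset centered in its cell of the pixel.
offsets = [((i + 0.5) / 4, (j + 0.5) / 4) for i in range(4) for j in range(4)]
print(len(offsets))   # 16 jitter positions, one per rendered pass
```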
 
I think someone ought to dig up those lovely quotes from Nvidia execs and PR back when the Voodoo 5 5500 was released... Along with the comments of the NV fans. :devilish:
 
I agree — one man's "CPU limited" is another man's "AA+AF for free." It depends what settings you choose.

What's more of a concern is the PSU and power delivery issue. Even with 2 GPUs and top-end multi-core CPUs there have been signs of trouble. Careful consideration should be given to ancillary components if a home-built quad-SLI machine is being put together ....
 
Mintmaster said:
Don't forget only being able to use half the memory on the card. Quad SLI would have substantial lag as well in AFR mode, but I guess dual GPU shouldn't be that bad.
Well, from what we've seen for Quad SLI so far, it looks like the typical mode will be a combination of AFR and SFR.
 
dizietsma said:
What's more of a concern is the PSU and power delivery issue. Even with 2 GPUs and top-end multi-core CPUs there have been signs of trouble. Careful consideration should be given to ancillary components if a home-built quad-SLI machine is being put together ....

Well, wouldn't this be a case where a GPU-specific PSU would be useful?
I mean, if you're going to shell out the amount of cash needed for such a system, another PSU to make sure you won't get problems with the graphics shouldn't be much of a put-off...
 
BRiT said:
There is less software compatibility with two cores than with one. Sometimes the score with two cores isn't even higher than with one core, and often the gain is less than 30%.

We should wait to see how it is implemented before jumping to conclusions. Perhaps it uses a different methodology for utilizing two cores simultaneously than the standard SLI or Crossfire systems.
 
Doesn't appear these duo cards will be available except through OEMs. With a die nearly half the size of their competition's, Nvidia could sell the duo cards as something that beats the X1900XTX straight up, and probably get away with a nice premium over the X1900XTX.
 
trumphsiao said:
Well, I heard a rumor that the dual-core G71 single card will be priced at between $600-650 USD :D

Pricing aside, will someone be able to buy them as single solutions?
 