NVIDIA Fermi: Architecture discussion

True, but this generation is unlike any other in that the hardware has now surpassed the software. The 5870 already allows all settings maxed out in nearly every game presently out at 24" and below and it's going to be a while before this changes. The only compelling reason, other than bragging rights, to even need something with substantially more performance is going to three screens, and AMD owns that niche. I'm thinking that will grow far faster than some first thought, considering the across-the-board and sometimes startling enthusiasm from the sites that have reviewed a three-screen set-up ... to SEE it in action apparently kindles an irresistible lust to have it. That means a shift from upgrading to the holy grail of a 30 inch screen to a three-screen set-up (and at substantially less cost to boot), and a shift from a Fermi-oriented solution (assuming it has a substantial performance edge) to an AMD Crossfire solution ... at least until Nvidia can come up with a multi-screen solution of its own.

For those already heavily into SLI/Crossfire who have a 30" screen, the logical next step is THREE 30" monitors; after all, they are by definition already chasing the absolute best gaming experience/edge possible, and a three-screen set-up provides a substantial advantage in head-to-head gaming. That leads directly and only to AMD. And for those like Coleen Kelly at TWiT who already run three 30" monitors, the next step is almost certainly Crossfired 5870s (now that Eyefinity supports Crossfire), and there simply is no reason to buy a Fermi board. It doesn't GET you anything.

Hate to break it to you, but for a lot of folks it's all about performance. As I said in a post that got removed, I own and use 2x GTX 260s in SLI; the 5870 offers very little of a performance boost over them to justify an upgrade. Fermi, on the other hand, should have the performance at launch to compel an upgrade. IF it doesn't, then I, like many people, may wait until games make the purchase justified, be it 5870 or Fermi.
 
And if AMD releases a 5890 card when Fermi is released and it matches Fermi's performance?
And is cheaper?
And is quieter?
And runs cooler?
And uses less power?
And does eyefinity?

If all the above were true, would you still buy a Fermi over a 5890?

Heck, who would?

Thus the conundrum Nvidia faces if Fermi is not meeting performance expectations and cannot trounce the expected 5890: why WOULD anyone buy it over a 5870/5890? And if one DID do so just because it's Nvidia, maintaining that pride is going to be a tough slog when one's peers respond with laughter and derision at buying a second-rate solution just because ...

Fermi not besting a 5890 is a worst case scenario, perhaps, but one well within the realm of the possible.

I would, just to keep my driver upgrades simple. No need to clean out drivers.
 
I would. For its Folding @ Home performance. F@H GPU client performance is about the only thing holding me back from buying a Radeon HD 5 series card to replace either my 8800 GTS 512 or GTX 285. That and the current pricing which places these cards above their initial MSRP.
Exactly why I have 5 285s running F@H 24/7.
 
True, but this generation is unlike any other in that the hardware has now surpassed the software. The 5870 already allows all settings maxed out in nearly every game presently out at 24" and below and it's going to be a while before this changes.

And you'll find that to be the case in reviews around the HD4870/GTX280 launch as well. People weren't satisfied then either.

The only compelling reason, other than bragging rights, to even need something with substantially more performance is going to three screens, and AMD owns that niche.

Sorry but you can't seriously think that Eyefinity is going to be AMD's ace this generation. Folks aren't going out in droves to buy two more monitors. Not to mention the software support just isn't there. It's a novelty at best for the foreseeable future, even more so than multi-GPU setups.
 
Ironically, I use only 2 monitors these days, but that's because I play 2 accounts in MMORPGs. And performance can still be an issue when you're accelerating 2 games. The quad-core CPUs are up to it, but for some games I'd like to disable SLI and just run each from an independent graphics card. One GTX 295 GPU core simply isn't enough for some of the latest MMORPGs at the highest settings.
 
I don't see this. Even with the small amount of information they gave us, it will easily be faster than the GTX 285. Look at this: Fermi has 128 TMUs instead of 80, 512 cores instead of 240, 48 ROPs instead of 32, and 384-bit GDDR5 instead of 512-bit GDDR3. And there is the other stuff, which is new over GT200.
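To put the memory change in perspective, here is a rough back-of-the-envelope bandwidth comparison. The effective data rates below are assumptions picked for illustration, not announced specs:

Code:
# Peak memory bandwidth: bus width (bits) / 8 * effective data rate (GT/s) = GB/s.
# The data rates are assumed round numbers for illustration, not confirmed clocks.
def bandwidth_gb_s(bus_width_bits, data_rate_gtps):
    return bus_width_bits / 8 * data_rate_gtps

gtx285 = bandwidth_gb_s(512, 2.5)  # GTX 285: 512-bit GDDR3 at ~2.5 GT/s effective
fermi  = bandwidth_gb_s(384, 4.0)  # Fermi: 384-bit GDDR5, assuming ~4.0 GT/s effective

print(f"GTX 285: {gtx285:.0f} GB/s")  # ~160 GB/s
print(f"Fermi:   {fermi:.0f} GB/s")   # ~192 GB/s with the assumed data rate

So the narrower bus is not a step backwards as long as the GDDR5 data rate lands anywhere near that range.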
NV30 and R600 had a lot more similarities to the older generation than Fermi has to GT200.

I don't believe that we will see R900 this year, because there is no new process node before Q4. And the delay of Fermi doesn't mean that it will affect the next generation. Look at NV35/NV40, or R580, or R670: they all came in the expected timeframe.

When was the last time ATI launched a new architecture on a new node? Just sayin'......

-Charlie
 
Exactly why I have 5 285s running F@H 24/7.

Sorry for the OT, but what exactly is the incentive for people to do that?

I run F@H as well with my spare cycles, but why would I want to buy like 5 GPUs and run them just for that? What is the gain to you?
 
Hmm, all that and nothing worthwhile :!: How long do you think driver development would take? Let's see, hmm, August was tape-out of A1? And they will be demoing the board at CES, not just benchmarks ;).

How could someone know that it would be pushed back if nV didn't even know how many respins it would take! Ridiculous!

Let's start out with the basics. Tape-out was not August; it was the third week of July.

Second, I explained my reasoning very carefully. You should read it, then you might be able to answer your own rhetorical sounding questions. You have yet to attempt to counter any of my points with facts, just off-handed ad-homs. Then again, all of your predictions have been laughably off so far, so I guess you are internally consistent.

-Charlie
 
Sorry for the OT, but what exactly is the incentive for people to do that?

I run F@H as well with my spare cycles, but why would I want to buy like 5 GPUs and run them just for that? What is the gain to you?
The chance to help cure diseases and the competition between members and teams.
There are folders who make my small farm look... small. :smile:
 
And you'll find that to be the case in reviews around the HD4870/GTX280 launch as well. People weren't satisfied then either.

Far fewer will be dissatisfied this generation than last. The 5870 is far more powerful relative to the games presently released than the 4870 was relative to the games in release when it came out. Considering console hardware upgrades are still years in the future, the 6000 series will continue that trend, and a point will be reached when only a multi-monitor set-up provides any reason at all to upgrade GPUs.

Sorry but you can't seriously think that Eyefinity is going to be AMD's ace this generation. Folks aren't going out in droves to buy two more monitors. Not to mention the software support just isn't there. It's a novelty at best for the foreseeable future, even more so than multi-GPU setups.

From everything I've read, it doesn't remain a novelty for most who have experienced it in person; it becomes a lusted-after must-have. As more people set up such a system, more people will have the chance to experience it in person, and so on. It's a matter of how compelling that first-hand experience is, and from what I've read, it is VERY compelling. Such a meme can catch on very rapidly, making Eyefinity THE upgrade to have. With AMD having already provided Eyefinity support for the 5970, it is unlikely to be long before that is extended to the rest of the 5000 series, likely with the next driver release, and most new games will have built-in support for Eyefinity.

I see no reason why 3-monitor Eyefinity setups wouldn't be experiencing exponential growth by June, and for some time after, among gaming enthusiasts, power users and even the average Joe with the money to spend on it.
 
I'm fairly sure it's gonna remain just as niche as SLI/CF and Stereo 3D (for now). If anything, I reckon 3D is the technology most likely to go mainstream at some point. It's even more compelling and at least that one doesn't require more desk space than most people have.
 
From everything I've read, it doesn't remain a novelty for most who have experienced it in person; it becomes a lusted-after must-have. As more people set up such a system, more people will have the chance to experience it in person, and so on. It's a matter of how compelling that first-hand experience is, and from what I've read, it is VERY compelling. Such a meme can catch on very rapidly, making Eyefinity THE upgrade to have. With AMD having already provided Eyefinity support for the 5970, it is unlikely to be long before that is extended to the rest of the 5000 series, likely with the next driver release, and most new games will have built-in support for Eyefinity.

I see no reason why 3-monitor Eyefinity setups wouldn't be experiencing exponential growth by June, and for some time after, among gaming enthusiasts, power users and even the average Joe with the money to spend on it.

NVIDIA will apparently be coming out with their own [supposedly improved] version of Eyefinity when GF100 launches.

All the other nonsense about "what if...?" scenarios is meaningless. NVIDIA will come out with their true next gen part, ATI/AMD will refresh current 58xx parts, NVIDIA will refresh GF1xx parts, and it's a cycle that goes back and forth.
 
I would. For its Folding @ Home performance. F@H GPU client performance is about the only thing holding me back from buying a Radeon HD 5 series card to replace either my 8800 GTS 512 or GTX 285. That and the current pricing which places these cards above their initial MSRP.
But...
It was said several times that comparing Radeon/GeForce performance in F@H is not "fair" due to different workloads, no?
So one will buy 5 cards (wtf), not because he cares about curing cancer but because of the "points" he gets ...
 
Far fewer will be dissatisfied this generation than last. The 5870 is far more powerful relative to the games presently released than the 4870 was relative to the games in release when it came out. Considering console hardware upgrades are still years in the future, the 6000 series will continue that trend, and a point will be reached when only a multi-monitor set-up provides any reason at all to upgrade GPUs.



From everything I've read, it doesn't remain a novelty for most who have experienced it in person; it becomes a lusted-after must-have. As more people set up such a system, more people will have the chance to experience it in person, and so on.
I would say that more people will have the chance to experience ATi's deficiencies even more. As people pointed out, the software for Eyefinity just isn't there; it only supports some games. In the majority of games, however, the picture gets stretched too far, enough to make all 3D objects look flat and ugly. Compare that to the situation when Nvidia released their 3D Vision, and you will see the difference: Nvidia had a list of game compatibilities with a rating system, and they supported a large number of old and new games.
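For what it's worth, the stretching complaint comes down to how a game handles field of view. Here's a quick sketch of the geometry, assuming the usual Hor+ relation hFOV = 2*atan(tan(vFOV/2) * aspect); the numbers are only illustrative:

Code:
import math

# Horizontal FOV a game should render for a given vertical FOV and aspect ratio (Hor+ scaling).
def horizontal_fov(vertical_fov_deg, aspect_ratio):
    v = math.radians(vertical_fov_deg)
    return math.degrees(2 * math.atan(math.tan(v / 2) * aspect_ratio))

single = 16 / 9        # one 1920x1080 screen
triple = 3 * 16 / 9    # three 1920x1080 screens side by side (5760x1080)

print(f"Single screen: {horizontal_fov(60, single):.0f} deg")  # ~91 deg
print(f"Triple screen: {horizontal_fov(60, triple):.0f} deg")  # ~144 deg

# A game without proper support keeps the ~91 deg image and stretches it across
# all three panels, which is exactly what makes objects look flat and distorted.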

And I wouldn't say that the HD 5870 can play every game out there maxed: it can't play Crysis, STALKER: Clear Sky, or even ArmA II that way. It has about the same performance as a GTX 295, which is still deficient in those areas.
 
I would say that more people will have the chance to experience ATi's deficiencies even more. As people pointed out, the software for Eyefinity just isn't there; it only supports some games. In the majority of games, however, the picture gets stretched too far, enough to make all 3D objects look flat and ugly ...
That isn't a list of games that work with Eyefinity; it's a list of games that are problematic with extra-wide display support like Eyefinity or TH2Go but are fixed by a program called Widescreenfixer. Many games simply work by selecting the correct resolution, no Widescreenfixer required. I don't have a single game that doesn't work with Eyefinity. Since getting it in early October, I haven't started a game in single-monitor mode.
 
NVIDIA will apparently be coming out with their own [supposedly improved] version of Eyefinity when GF100 launches.

All the other nonsense about "what if...?" scenarios is meaningless. NVIDIA will come out with their true next gen part, ATI/AMD will refresh current 58xx parts, NVIDIA will refresh GF1xx parts, and it's a cycle that goes back and forth.

Really ... ? Sounds quite fanciful, Nvidia being ready with an Eyefinity solution out of the clear blue sky, so soon after AMD told the world about it.

Each cycle is unique unto itself, and this cycle even more so, with the double whammy of one of the two major players branching out for the first time to include major non-graphics capabilities in their GPU, and graphics card capabilities moving well ahead of software requirements due to the focus on programming for consoles. These are both firsts, they are both major changes in the GPU market, and they will have commensurately major consequences. That cycle that 'goes back and forth' is not carved in stone; times and circumstances continually change, and that cycle may be in the beginning stages of breaking down or changing in ways not seen before.
 