NVIDIA Fermi: Architecture discussion

Discussion in 'Architecture and Products' started by Rys, Sep 30, 2009.

  1. XMAN26

    XMAN26 Banned

Hate to break it to you, but for a lot of folks, it's all about performance. As I said in a post that got removed, I own and use 2x GTX 260s in SLI, and the 5870 offers very little as a performance boost to make me upgrade. Fermi, on the other hand, should have the performance at launch to compel an upgrade. "IF" it doesn't, then I, like many people, may wait till games make the purchase justified. Be it 5870 or Fermi.
     
  2. XMAN26

    XMAN26 Banned

I would, just to keep my driver upgrade simple. No need to clean out drivers.
     
  3. Razor1

    Razor1 Veteran


The "hypothetical" situation you just presented wasn't what we were discussing, so again... :?:
     
  4. mapel110

    mapel110 Newcomer

  5. Kowan

    Kowan Newcomer

    Exactly why I have 5 285s F@H 24/7.
     
  6. trinibwoy

    trinibwoy Meh Legend

    And you'll find that to be the case in reviews around HD4870/GTX280 launch as well. People weren't satisfied then either.

Sorry, but you can't seriously think that Eyefinity is going to be AMD's ace this generation. Folks aren't going out in droves to buy two more monitors. Not to mention the software support just isn't there. It's a novelty at best for the foreseeable future, even more so than multi-GPU setups.
     
  7. ChrisRay

    ChrisRay R.I.P. 1983- Veteran

Ironically, I use only 2 monitors these days. But that's because I play 2 accounts in MMORPGs. And performance can still be an issue when you're accelerating 2 games. The quad-core CPUs are up to it. But in some games I'd like to disable SLI and just run from independent graphics cards. 1 GTX 295 GPU core simply isn't enough for some of the latest MMORPGs at the highest settings.
     
  8. When was the last time ATI launched a new architecture on a new node? Just sayin'......

    -Charlie
     
  9. compres

    compres Regular

Sorry for the OT, but what exactly is the incentive for people to do that?

    I run F@H as well with my spare cycles, but why would I want to buy like 5 GPUs and run them just for that? What is the gain to you?
     
Let's start out with the basics. Tape-out was not August; it was the third week of July.

Second, I explained my reasoning very carefully. You should read it; then you might be able to answer your own rhetorical-sounding questions. You have yet to counter any of my points with facts, just offhanded ad homs. Then again, all of your predictions have been laughably off so far, so I guess you are internally consistent.

    -Charlie
     
  11. Kowan

    Kowan Newcomer

    The chance to help cure diseases and the competition between members and teams.
    There are folders who make my small farm look... small. :smile:
     
  12. spigzone

    spigzone Banned

Far fewer will be dissatisfied this generation than last. The 5870 is far more powerful relative to the games presently released than the 4870 was relative to the games in release when it came out. Considering console hardware upgrades are still years in the future, the 6000 series will continue that trend, and a point will be reached when only a multi-monitor setup provides any reason at all to upgrade GPUs.

[QUOTE]Sorry but you can't seriously think that Eyefinity is going to be AMD's ace this generation. Folks aren't going out in droves to buy two more monitors. Not to mention the software support just isn't there. It's a novelty at best for the foreseeable future, even more so than multi-GPU setups.[/QUOTE]

From everything I've read, it doesn't remain a novelty for most who have experienced it in person; it becomes a lusted-after must-have. As more people set up such a system, more people will have the chance to experience it in person, and so on. It's a matter of how compelling that first-hand experience is, and from what I've read, it is VERY compelling. Such a meme can catch on very rapidly, making Eyefinity THE upgrade to have. With AMD having already provided Eyefinity support for the 5970, it is unlikely to be long before that is extended to the rest of the 5000 series, likely with the next driver release, and most of the new games will have built-in support for Eyefinity.

I see no reason why 3-monitor Eyefinity setups wouldn't be experiencing exponential growth by June and for some time after among gaming enthusiasts, power users, and even the average Joe with the money to spend on it.
     
    Last edited by a moderator: Jan 4, 2010
  13. Florin

    Florin Merrily dodgy Veteran Subscriber

    I'm fairly sure it's gonna remain just as niche as SLI/CF and Stereo 3D (for now). If anything, I reckon 3D is the technology most likely to go mainstream at some point. It's even more compelling and at least that one doesn't require more desk space than most people have.
     
  14. jimmyjames123

    jimmyjames123 Regular

    NVIDIA will apparently be coming out with their own [supposedly improved] version of Eyefinity when GF100 launches.

    All the other nonsense about "what if...?" scenarios is meaningless. NVIDIA will come out with their true next gen part, ATI/AMD will refresh current 58xx parts, NVIDIA will refresh GF1xx parts, and it's a cycle that goes back and forth.
     
  15. chavvdarrr

    chavvdarrr Veteran

    But...
It was said several times that Radeon/GF performance in F@H is not "fair" due to different workloads, no?
    So one will buy 5 cards (wtf), not because he cares about solving cancer but because of the "points" he gets...
     
    Last edited by a moderator: Jan 4, 2010
  16. DavidGraham

    DavidGraham Veteran

I would say that more people will have the chance to experience ATi's deficiencies even more. As people pointed out, the software for Eyefinity just isn't there; it only supports some games. In the majority of games, however, the picture gets stretched too far, enough to make all 3D objects look flat and ugly. Compare that to the situation when Nvidia released their 3D Vision, and you will see the difference: Nvidia had a list of game compatibilities with a rating system, and they supported a large number of old and new games.

    And I wouldn't say that the HD5870 can play every game out there maxed. It can't play Crysis or STALKER: Clear Sky, or even ArmA II. It has the same performance as a GTX 295, which is still deficient in those areas.
     
  17. thatdude90210

    thatdude90210 Regular

That isn't a list of games that work with Eyefinity; it's a list of games that are problematic with extra-wide display support like Eyefinity or TH2Go but are fixed by a program called Widescreenfixer. Many games simply work by selecting the correct resolution, no Widescreenfixer required. I don't have a single game that doesn't work with Eyefinity. Since getting it in early October, I haven't started a game in single-monitor mode.
     
  18. Ailuros

    Ailuros Epsilon plus three Legend Subscriber

Honest question: do the dates on a chip represent the real tape-out date?
     
  19. spigzone

    spigzone Banned

Really...? Sounds quite fanciful, Nvidia being ready with an Eyefinity solution so soon, out of the clear blue sky (since AMD told the world about it).

Each cycle is unique unto itself, and this cycle even more so, with the double whammy of one of the two major players for the first time branching out to include major non-GPU-specific capabilities in their GPU, and graphics card capabilities moving well ahead of the software requirements due to the focus on programming for consoles. These are both firsts, they are both major changes in the GPU market, and they will have commensurately major consequences. That cycle that 'goes back and forth' is not carved in stone; times and circumstances continually change, and that cycle may be in the beginning stages of breaking down or changing in ways not seen before.
     
  20. Ailuros

    Ailuros Epsilon plus three Legend Subscriber

    And when was the last time ATI jumped to a completely new generation from another without a refresh in between? Just askin'.... ;)
     