I want to upgrade my desktop when we're at +200% of my retro 8800GTX.
Haven't we been there for a while now already?
btw, edit button no worky for you?
1. If ATI disregarded power and heat, we would have the same performance and a smaller chip to boot.
2. Time matters. This chip is 6 months after Cypress.
3. I can argue some people will prefer PhysX while others will prefer Eyefinity.
Again, if ATI disregarded power and heat, we would have the same performance and a smaller chip to boot.
1) What do you mean?
2) yup
3) probably heavily shifted towards PhysX. They aren't really related IMO, and Eyefinity is definitely going to be very niche, assuming anyone prefers that kind of setup over a huge TV instead.
Point is that there is now a choice, and fortunately it's not up to you to decide that this choice shouldn't exist.
Your sweeping statement that every objective person must agree that these cards shouldn't exist is something completely different, of course.
Exactly how everyone without bias will see it. Guess we now have a rough estimate of the percentage of users with nVidia bias (on B3D) from the poll.
The fact that they still show up here arguing is quite telling; at least others have remained silent, as they should.
After having read through many reviews and giving it some time, my final take on Fermi is this:
There's not much about Fermi worth giving a thumbs up over. You can say "oh, but the performance is there!" but that comes at a price. The card punishes its users to extract that performance: severe heat build-up, huge power draw and a shrieking fan are what one must endure to experience it.
Nvidia disregarded practicality for a benchmark victory, and not a great one at that, considering this is the least they could do against competition whose products have been on the market for 6+ months. All this, and they still have the nerve to charge 25% more. It's just a poor offering all around.
Perhaps future iterations of Fermi will be more reasonable from a price, performance and practical-usage standpoint. This version, however, should have stayed in the labs...
I notice you keep lying about me saying the product should not exist. You have been spinning and spinning what I said, which was that this version should have stayed in the labs.
What's the difference between 'should've stayed in the labs' and 'product shouldn't exist'?
Sometimes we are at +300%, sometimes <+100%. I just deleted my post because I changed my mind about wanting to discuss it after looking at some charts. But be my guest.
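To put rough numbers on what those percentages mean, since "+200%" trips people up: it means triple the baseline, not double. A quick sketch with made-up scores (nothing here is from an actual benchmark):

```python
# Relative performance as a percentage gain over a baseline card.
def uplift_pct(new_score, old_score):
    return (new_score / old_score - 1.0) * 100.0

# Hypothetical scores, purely illustrative:
print(uplift_pct(90, 30))  # 200.0 -> "+200%" over an 8800GTX means 3x, not 2x
print(uplift_pct(55, 30))  # ~83.3 -> "<+100%", i.e. not even double
```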
Heat is noise. The point being that inside a case the temps go up, resulting in higher fan speeds, resulting in more noise. And stock cases have minimal SLF/NRC/DLF.
The costly power drain is a problem. The noisy fan is a problem. But the performance advantage over a 5870 is not as small as 13%. Dropping to 33fps versus 25fps is a significant advantage, because the lower number is the only one you notice.
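Frame times make that point clearer than fps does; here's a tiny sketch (just the arithmetic, nothing card-specific):

```python
# fps is a rate; what you actually feel is the per-frame render time.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (25, 33, 60):
    print(f"{fps} fps = {frame_time_ms(fps):.1f} ms per frame")
# 25 fps = 40.0 ms per frame
# 33 fps = 30.3 ms per frame
# 60 fps = 16.7 ms per frame
# Going from 25 to 33 fps shaves ~10 ms off every frame exactly where
# stutter is felt, which is why the lower number is the one you notice.
```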
But none of those charts show how long those dips last. If one card dips more than the other, but only for one frame, it's not important, and actually skews those charts. The graphs that show the frame rate changing over time are more useful. A stable, consistent frame rate is more important than one that swings dramatically, whether to peaks or dips.
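Here's roughly what I'd want a review to compute from the raw frame-time log instead of a min-fps bar (a sketch; the log and the 40 ms cutoff are made up):

```python
# Find dips below a playable threshold in a per-frame time log (ms) and
# report how long each one lasts: a one-frame spike skews a min-fps chart,
# but a dip you actually feel is one that persists.
def dips(frame_times_ms, threshold_ms=40.0):
    runs, start = [], None
    for i, t in enumerate(frame_times_ms + [0.0]):  # sentinel flushes a trailing run
        if t > threshold_ms:
            if start is None:
                start = i
        elif start is not None:
            runs.append((start, i - start, sum(frame_times_ms[start:i]) / 1000.0))
            start = None
    return runs  # (first frame index, frame count, duration in seconds)

# Made-up log: one single-frame hiccup vs. a sustained six-second dip.
log = [16.7] * 100 + [45.0] + [16.7] * 50 + [50.0] * 120 + [16.7] * 30
for frame, count, secs in dips(log):
    print(f"dip at frame {frame}: {count} frames, {secs:.2f} s")
# -> one 1-frame hiccup (~0.05 s) and one 120-frame dip lasting 6.00 s
```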
I think this is a key point. If ATI wants to, they can open up the power envelope, add another 50W and likely win the majority of the benchmark battles.
Yes, because Nvidia cards can't be OC'd?
I disagree, having played several games in the past where dips into the sub-25fps range have caused death or, even worse, an unpleasant gaming experience.
Yes, but how many people use their cards out in the open or with the case covers off? I'd venture to say far fewer than those who buy these things. Put the setup in a case, close it up, place it in the two places most people stick it (on the floor or on the desk), and measure the heat and noise output then. I don't know about you, but I don't play games with my freakin' ears 6" from my cards.
I doubt it. I saw a review with an HD 5870 overclocked to ~1GHz, and it was still getting beaten by the GTX 480 at default clocks.