Sure, Fermi may have some problems such as power, scalability, yields, whatever, but Nvidia has done something that's fundamentally creative and extended the concept of GPUs in so doing. It may take them another generation to perfect it, but without efforts like this, neither they nor ATI nor Intel would be as good tomorrow as they otherwise would be.
Thanks, that sums up my view of things quite nicely.
(Sorry guys, but because of the language barrier I'm not always able to express my views properly.)
In fact, it could very well be the case.
So you're basically saying that NV's engineers don't know what they're doing?
Add to that the possibility of unbalanced derivatives of the architecture.
It's unbalanced right now: quite often you get a higher triangle rate in the mid-range than in the high end, because setup has been a single unit running at roughly one triangle per clock, so a smaller part at the same or higher clock matches or beats the flagship. What Fermi does is address that imbalance by distributing setup across its GPCs.
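As a rough illustration of that imbalance (all clocks and rates below are approximate or outright assumed for the sake of the example; GF100's final clocks aren't public, so that one is a pure guess):

[code]
# Pre-Fermi GPUs set up roughly one triangle per clock, so peak setup rate
# tracks core clock, not chip size (approximate launch clocks).
juniper_tri_per_s = 1 * 850e6   # HD 5770-class part at ~850 MHz -> ~850 Mtri/s
cypress_tri_per_s = 1 * 850e6   # HD 5870 at the same ~850 MHz -> the same ~850 Mtri/s

# GF100 distributes setup/rasterization across its four GPCs, for a claimed
# peak of up to 4 triangles per clock; the 700 MHz figure is an assumption.
gf100_tri_per_s = 4 * 700e6     # ~2.8 Gtri/s theoretical peak
[/code]

Half the chip, same peak triangle rate on the pre-Fermi parts; that's the kind of imbalance being talked about, and per-GPC setup is what's supposed to remove it.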
How do these setup engines/rasterizers work when you disable half a GPC's SIMD units or a quarter of 2 of them?
Less work isn't a problem, so I think they work just fine. Do you have a reason to believe they don't?
That raises at least as many questions as it answers with regard to "will it be faster?".
If you want to raise some questions, no one can stop you. But not all questions are smart, you know.
Can I play the benchmark or otherwise gain enjoyment from it in any way? Does it push the hardware to its absolute peak, giving me insight into how my games will run in some way? Or is it just a pretty tech demo with a framerate counter and very little secondary meaning?
Unigine is as close to a real DX11 engine with heavy tessellation as you can get right now. It's not some kind of synthetic benchmark. And what's interesting is that it was developed on AMD's DX11 hardware. Sure, you can say that it's not a game and thus it's irrelevant. But then everything beyond what we have now in games is irrelevant. Cypress' DX11 is irrelevant too. And most of today's games run just fine even on an RV770, because they are console ports made for five-year-old hardware. There's no reason to buy GF100 or Cypress for them. So let's talk about the things that matter then?
OK, I'll play. I think you're (sometimes hilariously) biased.
As always, you may think whatever you like. But my not crying in disappointment over the GF100 graphics architecture doesn't make me biased, sorry. (The opposite actually does.)
Did you get those performance deltas for GF100 from a comprehensive review using a multitude of theoretical and game benchmarks, ideally from games you're interested in, at the resolutions you want to run at, using the IQ settings that you need as a minimum to enjoy the graphics fidelity, from an outlet that you can trust as much as is humanly possible to give you an accurate view of real-world performance? Or did they come from NVIDIA?
I've seen enough vendor-provided benchmarks to know, judging from them, what to expect in the real world. A vendor can pick its results, but it can't lie. So it's a matter of painting the whole picture from the information made available to us. Sure, a proper review is necessary, but only to prove whether your guess was right or wrong. And for the last 5 years my guess has been wrong only once -- with RV770.
Cypress has GF100 licked in some non-subtle ways, by big margins. You might not enjoy a modern Radeon architecturally (I struggle sometimes, so it's cool, you're in good company) but it's hard to argue with their raw single precision numbers in Cypress, big ROP performance and that large dollop of sampling and filtering.
Numbers by themselves are irrelevant; it's how you use them. You're asking me if I've seen a proper review of GF100, and then you're saying that Cypress is winning on the numbers. That's a contradiction. You need to test the sample yourself, just as I need to see a review, before making any assumptions from the number of units alone. But we already have more than that. We have some performance numbers. And from what I'm seeing here, people are saying "oh well, 64 TMUs are fewer than 80 -- that's settled then, it's worse already". Yeah, well, 240 SPs are fewer than 800 and 40 TMUs are fewer than 80, but that didn't mean much in the GT200 vs RV770 battle, did it?
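To put rough numbers on the GT200 vs RV770 point (launch-era specs from memory, so treat them as approximate, not as measured results):

[code]
# GTX 280 (GT200): 240 SPs at ~1296 MHz, MAD+MUL dual issue counted as 3 flops/clk
gt200_gflops = 240 * 1.296 * 3      # ~933 GFLOPS
# HD 4870 (RV770): 800 SPs at ~750 MHz, MAD counted as 2 flops/clk
rv770_gflops = 800 * 0.750 * 2      # ~1200 GFLOPS

# Texturing goes the other way: GT200 has 80 TMUs at ~602 MHz core,
# RV770 has 40 TMUs at ~750 MHz.
gt200_gtexels = 80 * 0.602          # ~48 Gtexels/s
rv770_gtexels = 40 * 0.750          # ~30 Gtexels/s
[/code]

On paper each chip "wins" a different column by a wide margin, yet the two ended up in the same performance class in games, which is exactly the point about not settling the argument from unit counts alone.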
No, they haven't. Where are my clocks?!
You don't know the planned delta? =)
Yeah, why did you bother?
Did I? I'm sorry my English isn't very good.
I'd like to see a post proving that, especially considering I used one for about a year.
I'm using a 5850 right now. How's that for a revelation?