NVIDIA GF100 & Friends speculation

More limited? Could you give an example of what a vec4 could do that a VLIW with the same 4 instruction slots couldn't?
You know, you're right. For some reason I was thinking the GPRs couldn't feed them at peak rate for independent ALU ops, but in retrospect I don't really know why I thought that. So the only remaining problem is the increased granularity for branch coherence.

The compiler would have to be really careful about how it builds the VLIW instructions from scalar threads as far as LDS accesses are concerned, so it doesn't create bank conflicts.
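To illustrate the kind of thing I mean (just a minimal CUDA sketch, with shared memory standing in for LDS and the textbook 32-bank / 4-byte figures; the tile transpose and the +1 padding trick are the classic example, not anything about how a VLIW packer would actually lay accesses out):

[code]
// Minimal sketch: the access pattern alone decides whether lanes hit
// separate banks or serialize on one. Shared memory here plays the role
// of LDS; assume the usual 32 banks, 4 bytes wide.
#define TILE 32

__global__ void transpose_tile(const float* in, float* out, int width)
{
    // Without the +1 padding, the column read below makes all 32 lanes of a
    // warp land in the same bank (a 32-way conflict). The padding skews each
    // row by one bank, so the column walk touches 32 different banks.
    __shared__ float tile[TILE][TILE + 1];

    // Assumes a square width x width image for simplicity.
    int x = blockIdx.x * TILE + threadIdx.x;
    int y = blockIdx.y * TILE + threadIdx.y;

    tile[threadIdx.y][threadIdx.x] = in[y * width + x];   // row-wise write: conflict-free
    __syncthreads();

    x = blockIdx.y * TILE + threadIdx.x;
    y = blockIdx.x * TILE + threadIdx.y;
    out[y * width + x] = tile[threadIdx.x][threadIdx.y];  // column-wise read: only OK thanks to the padding
}
[/code]

The conflict is purely a function of which addresses get issued together, which is exactly what the compiler would be juggling once it starts fusing independent scalar threads into one VLIW slot.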
 
Of course not... but the idea is still silly. At the very least, an HD5870 should be more than enough to run the game with all possible quality settings SMOOTHLY.

Wow, that's a really arbitrary yardstick....

You get almost the same performance drop from just turning on the DirectCompute depth of field as from tessellation :rolleyes:.

And AA too. All in all it seems not to be overdone. I'm not impressed at all with the result though. So far tessellation is only being used to smooth edges which is really underwhelming IMO. Call me once it actually adds detail.

Maybe this DX11 DirectCompute advanced depth of field was made more for the GTX4xx than the Radeons :rolleyes:.

How exactly do you write a compute shader that's friendlier to gtx4xx than the radeons? Or alternatively, how do you write a compute shader that's friendly to radeons?
 
Arbitrary? I'm sure he selected the fastest single GPU available on the market today...

Yes, but the "fastest single GPU available today" is an arbitrary target. Why shouldn't they target the "fastest single GPU available in six months, a year, two years"?
 
Wow, that's a really arbitrary yardstick....
Add some "in 1920x1200" and I totally agree with him.

Why the hell should a game that runs on a 3-year-old console struggle to run correctly on an all-new $1000+ PC using a mainstream screen res?

CF/SLI are there for extreme cases (30" monitors, multi-monitor setups, quad-HD screens, stereo...), not for an "optimal" result.
 
Add some "in 1920x1200" and I totally agree with him.

Why the hell should a game that runs on a 3-year-old console struggle to run correctly on an all-new $1000+ PC using a mainstream screen res?

As with Crysis, as long as the IQ justifies the performance hit, there's nothing to complain about. Would you have been happier if they just dumbed down the quality and renamed "Medium" to "High" to satisfy your hardware ego? You don't have to run it maxed out, you know?

Judge the performance based on what IQ you're getting, not on the name of the setting.
 
Yes, but the "fastest single GPU available today" is an arbitrary target. Why shouldn't they target the "fastest single GPU available in six months, a year, two years"?

Oh, I agree, as I stated a few posts back. As long as the IQ justifies the performance hit, I see nothing wrong with targeting future high-end configurations. Being used as a benchmark due to the stress it puts on a PC is almost like free advertising as well :)
 
How exactly do you write a compute shader that's friendlier to gtx4xx than the radeons? Or alternatively, how do you write a compute shader that's friendly to radeons?

I was just thinking that they could do it specifically for the GTX4xx (with the L2 cache, the compute work could run much faster there, as they showed with the raytracing demo). I mean the complexity of the DOF. So the Radeons would run it as fast as the GTX4xx in DX10, but as you turn on DX11 the NVIDIA cards would gain a massive lead.
On NVIDIA's site http://www.nvidia.co.uk/object/gf100_uk.html the advanced cinematic effects are represented with a Metro 2033 picture and depth of field. It's a TWIMTBP title after all.
It could be a good way for them to show off the GTX4xx's DX11 power if it was done the way they intended.
Dirt 2 (I think AVP too) had DirectCompute depth of field among other effects, and the leaked benchmarks showed similar FPS in DX11 to the Radeons.
Still, without GTX4xx cards and FPS numbers my whole theory is nonsense :oops:.
 
I was just thinking that they could do it specifically for the GTX4xx (with the L2 cache, the compute work could run much faster there, as they showed with the raytracing demo).

Well yeah Fermi can be faster at certain workloads due to its caches and other bits but that's not the same thing as saying the developer is specifically targeting one architecture over another. We should fully expect some workloads to benefit from the caches without any specific optimization.
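To put a concrete (and completely made-up) face on that, here's the kind of workload I have in mind: a DOF-style gather whose footprint varies per pixel, so it's awkward to stage in shared memory/LDS by hand, but whose overlapping reads a generic read cache soaks up without the developer writing anything vendor-specific. A rough CUDA sketch, not anything from Metro's actual code:

[code]
// Made-up depth-of-field style gather (not Metro's shader): each pixel
// averages a square neighbourhood whose size depends on its circle of
// confusion. Neighbouring pixels re-read mostly the same texels, but the
// footprint is data-dependent, so it's painful to prefetch by hand; a
// generic read cache (Fermi's L2, or a texture cache) absorbs the
// redundant fetches with no per-vendor work in the shader.
__global__ void dof_gather(const float4* color, const float* coc,
                           float4* out, int width, int height)
{
    int x = blockIdx.x * blockDim.x + threadIdx.x;
    int y = blockIdx.y * blockDim.y + threadIdx.y;
    if (x >= width || y >= height) return;

    int radius = (int)(coc[y * width + x] * 8.0f);    // data-dependent blur radius
    float4 acc = make_float4(0.0f, 0.0f, 0.0f, 0.0f);
    int taps = 0;

    for (int dy = -radius; dy <= radius; ++dy) {
        for (int dx = -radius; dx <= radius; ++dx) {
            int sx = min(max(x + dx, 0), width - 1);   // clamp to the image
            int sy = min(max(y + dy, 0), height - 1);
            float4 s = color[sy * width + sx];         // irregular, overlapping reads
            acc.x += s.x; acc.y += s.y; acc.z += s.z; acc.w += s.w;
            ++taps;
        }
    }

    float inv = 1.0f / (float)taps;
    out[y * width + x] = make_float4(acc.x * inv, acc.y * inv,
                                     acc.z * inv, acc.w * inv);
}
[/code]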

It's a TWIMTBP title after all.

It is, but based on the developer interviews and early performance numbers there's nothing to indicate that AMD was shafted in any way.
 
This game is completely unplayable on the Radeon HD5000 series (5870/5850) using max details (DX11 + 4xAA), let alone the GTX200 series!

Fun fact:

If a game can run at max settings on current hardware at 60+ FPS, people flame the developer for making "consolized games".

If a game brings current hardware to its knees, people flame the developer for making "unoptimized" games.

Damned if you push the hardware...damned if you don't?
 
Add some "in 1920x1200" and I totally agree with him.

Why the hell should a game that runs on a 3-year-old console struggle to run correctly on an all-new $1000+ PC using a mainstream screen res?

CF/SLI are there for extreme cases (30" monitors, multi-monitor setups, quad-HD screens, stereo...), not for an "optimal" result.

Erhm... did you just compare 720p, no AA, scaled-down IQ, etc. to the PC version running at high settings??

This is indeed silly season :oops:
 
As with Crysis, as long as the IQ justifies the performance hit, there's nothing to complain about. Would you have been happier if they just dumbed down the quality and renamed "Medium" to "High" to satisfy your hardware ego? You don't have to run it maxed out, you know?

Judge the performance based on what IQ you're getting, not on the name of the setting.
That is right, however the IQ is no different from Crysis, which is STILL the best-looking game today... so why should I have lower performance with the same IQ?
 
Double post.
I wanted to quote something else, and after I posted it I can't edit it :???:

Judge the performance based on what IQ you're getting, not on the name of the setting.

QFT!

BTW: how many posts do I need here before I'm able to edit my posts?
 
That is right, however the IQ is no different from Crysis, which is STILL the best-looking game today... so why should I have lower performance with the same IQ?

Well you're certainly entitled to your opinion but where is the consensus that Crysis is a better looking game? It's definitely a brighter, sunnier game set in a more vibrant environment. But do Crysis' interiors really look better (both aesthetically and technically) than what you've seen out of Metro 2033 so far?
 
The problem with Metro is the lack of adjustable options.

It's silly that you can't set the tessellation level; even just low, medium and high settings would make it a lot more playable on current hardware.

This is where Crysis succeeded and Metro has failed.

There are reports of people editing config files and turning off the DX11 DOF, or the tessellation, for massive performance boosts.
 
Well you're certainly entitled to your opinion but where is the consensus that Crysis is a better looking game? It's definitely a brighter, sunnier game set in a more vibrant environment. But do Crysis' interiors really look better (both aesthetically and technically) than what you've seen out of Metro 2033 so far?

I never understood the majority of complaints against Crysis' interiors. I guess I kind of understand them when they target the interiors of shacks that can be blown to pieces; it's understandable that those don't look "as good" (being destructible and subject to physics calculations and such, extreme detail may not be an option, otherwise performance would tank even more). But some go as far as labeling ALL of Crysis' interiors as bad, which I can't agree with. Two of the biggest interior scenes are inside the alien ship and inside the aircraft carrier, and they are both sublime, to say the least.
 
The problem with Metro is the lack of adjustable options.

It's silly that you can't set the tessellation level; even just low, medium and high settings would make it a lot more playable on current hardware.

This is where Crysis succeeded and Metro has failed.

There are reports of people editing config files and turning off the DX11 DOF, or the tessellation, for massive performance boosts.

Erhmm... people did all sorts of whacky hacks to get Crysis to perform better?
(e.g. they ran it in DX9 mode and tweaked INI files... for the same reason: they thought the DX10 mode ran too slowly)

But if you are surprised that extra features impact performance...oh well...
 
Let me get this straight. The NDA expires and the launch is the 26th, but reviewers don't get cards until 1 week later? So basically all the benchmarks for one week will be Nvidia's own? That's a pretty controlled launch, even better than having reviewers that follow guidelines.
 
Well you're certainly entitled to your opinion but where is the consensus that Crysis is a better looking game? It's definitely a brighter, sunnier game set in a more vibrant environment. But do Crysis' interiors really look better (both aesthetically and technically) than what you've seen out of Metro 2033 so far?
I see your point; however, I believe Crysis' interiors look just as good (aesthetically), and that was demonstrated in the alien ship levels and the mine level in Crysis Warhead.

On the technical side, though, it is different. Metro might have more accurate lighting models, maybe more accurate dynamic shadows, maybe better precision in the DOF. However, these are redundant additions; I don't need them to get better image quality. Approximations of them will work fine, and will probably give the same image quality with better performance.

I can't believe that something with the astonishing rendering power of the HD5870 is struggling to run a game where I spend most of the time in corridors. What would happen if this were an open-world city, then?

I am 100% sure that if the Xbox 360 had a GPU with the same brute force as the HD5870, games would have looked like a CGI movie by now.
 