[modhat]Do NOT do Pc vs. consoles![/modhat]
Two UT3 maps (PC) will be left out of the PS3 version because they wouldn't run smoothly enough:
The question is: why do you believe it?
That's more like it, but AFAIK RSX isn't so much weak as 'not as strong', though I may be wrong on that. It should in principle be just as good at efficiency measures as G70, unless they deliberately gimped those bits to make a chip just as big and expensive but less functional... I think the advantage PS3 has is that Cell is very efficient at these graphics optimization functions. Compare that to XB360, where Xenos is very good at them. Thus the two machines need different approaches to keep everything running at top speed.

Until the PS3, was the general idea of console and PC graphics based on just throwing everything at the graphics chip and letting it sort out what gets drawn and what doesn't?
..or, are functions like these things that developers would normally expect to be built into the graphics chip and the PS3 is unique and a bit backwards by not having a strong preprocessing system on the RSX itself?
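To make the question above concrete, here is a minimal sketch of the kind of triangle pre-culling being discussed, i.e. rejecting geometry on the CPU/SPU side before it is ever submitted to the GPU. The winding-order test is standard; the data and function names are illustrative, not any engine's actual API.

```python
# Illustrative sketch: CPU-side backface culling before GPU submission.
# A triangle wound counter-clockwise in screen space faces the viewer;
# clockwise (back-facing) triangles can be dropped before they ever
# reach the GPU's setup hardware.

def signed_area_2d(a, b, c):
    """Twice the signed area of triangle (a, b, c) in screen space."""
    return (b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])

def cull_backfaces(triangles):
    """Keep only front-facing (counter-clockwise) triangles."""
    return [t for t in triangles if signed_area_2d(*t) > 0.0]

tris = [
    ((0, 0), (10, 0), (0, 10)),   # CCW: front-facing, kept
    ((0, 0), (0, 10), (10, 0)),   # CW: back-facing, culled
]
visible = cull_backfaces(tris)
```

The GPU would reject these triangles anyway; doing it earlier just means its setup hardware never has to touch them.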
I'm afraid framerate performance is the real bottleneck of the PS3 architecture, rather than the shared RAM.
It should be no mystery why many PS3 games have framerate issues. A large part of the problem is pixel fillrate; it's a fairly simple thing to understand. Fillrate is a large part of the overall graphics performance equation, and bandwidth is another. Both factors combined, limited fillrate and limited bandwidth, account for lower-than-ideal framerates in most current-gen games, be it on PS3 or Xbox 360.
It seems we've gone backwards from last generation: fewer games have rock-solid framerates, and there are fewer 60fps games than at the same point in time last gen.
Sorry for the previous post; sometimes I have a variable mood. I'm going to edit it because, except for the first lines, it's a bunch of crap.

The question is why you feel the need to question everything without checking the facts out for yourself first... Mark Rein said it himself in an interview!
ihamoitc2005, a good motion blur can help hide a "low" framerate. PGR3 is a great example of this: everything just seemed smoother than it actually was, especially when it came out.

I disagree, my friend. I would not say backwards; instead I prefer to say clarification. Xbox/Xbox 360 showed that 30fps is an acceptable frame rate for many games and many customers. I think very few future games will be 60fps. Look at the success of PGR4, Halo 3, and Gears of War: all action games with low frame rates, but many consumers are still happy with them.
An additional note on this SPU module: it maps badly to the SPUs, as it basically navigates through hierarchies to collect bone data that will be used (and set) as vertex shader constants by another SPU module.
[*]Skin matrices: even after animation there is some work required to get them into the format used by the GPU vertex shader.
[*]Blend shapes: a custom module that handles facial animations.
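The hierarchy walk described above can be sketched in a few lines: each bone stores a local transform and a parent index, and the job flattens them into the palette of world-space matrices a vertex shader would consume. This is a simplified illustration (2x2 matrices stand in for the real 3x4 skinning matrices), not the actual SPU module.

```python
# Illustrative sketch: flattening a bone hierarchy into a matrix palette.
# world(bone) = world(parent) * local(bone); the flat palette is what
# gets uploaded as vertex shader constants.

def matmul2(a, b):
    """2x2 matrix product, standing in for real 3x4 bone matrices."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

IDENT = [[1.0, 0.0], [0.0, 1.0]]

def build_palette(local_xforms, parents):
    """parents[i] is the parent bone index, or -1 for the root.
    Bones are assumed sorted so every parent precedes its children."""
    palette = []
    for i, local in enumerate(local_xforms):
        p = parents[i]
        world = local if p < 0 else matmul2(palette[p], local)
        palette.append(world)
    return palette

# Root scales by 2; the child has an identity local transform,
# so it inherits the root's scale through the hierarchy.
palette = build_palette([[[2.0, 0.0], [0.0, 2.0]], IDENT], [-1, 0])
```

The pointer-chasing through the hierarchy is exactly what maps badly to the SPUs' DMA-driven local-store model.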
SPU culling doesn't save any bandwidth (aside from some corner cases). All vertices still have to be loaded from XDR, except now, instead of sending them all through FlexIO and having RSX's setup hardware eliminate hidden triangles, the SPUs are doing it (at a faster rate). It could actually take up more XDR bandwidth if you decide to temporarily store the results of SPU culling rather than feeding them directly to RSX (this would prevent the SPU from waiting on RSX in portions of high pixel load and low vertex load).

PS3 devs, would it be true to say that, to a large extent, PS3's disadvantage in framebuffer bandwidth can be alleviated by using some of the SPU time for detailed culling? Or is that pushing it too far?
If Microsoft and Sony had demanded GPUs with, say, at least twice the fillrate (8 Gpixels/s instead of 4) and double the graphics memory bandwidth, the framerate situation would be more reasonable. There would always be developers who would just spend that on even more advanced graphics, so those games might still have framerate issues, but *responsible* developers would use it to push 60fps, or at the very least 30fps with absolutely no dips below 30, a common problem in games today.
With framerate, suppose you have a distribution like this:
89% of the time the framerate is >60fps
10% of the time the framerate is 30fps-60fps
1% of the time the framerate is 15fps-30fps
0.1% of the time the framerate is <15fps.
What should the dev do? Halve graphical load to avoid the 1% of the time that it goes below 30fps? Quarter graphical load to ensure >99.9% of the time it's over 30fps? I say leave it as is, and maybe even increase the graphical load a bit because I wouldn't mind 2% of the time going below 30fps.
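One wrinkle in the distribution above: if those percentages count *frames* rather than wall-clock time, the slow frames eat a disproportionate share of time, since a 15fps frame lasts four times as long as a 60fps frame. A quick illustration (band midpoints are rough assumptions):

```python
# Sketch: time-weighting a per-frame framerate distribution.
# Each band: (fraction of frames, representative fps). A slow frame
# occupies more wall-clock time, so sub-30fps stretches feel longer
# than their frame count suggests.

bands = [
    (0.889, 70),   # >60fps band
    (0.100, 45),   # 30-60fps band
    (0.010, 22),   # 15-30fps band
    (0.001, 12),   # <15fps band
]

total_time = sum(frac / fps for frac, fps in bands)
slow_share = sum(frac / fps for frac, fps in bands if fps < 30) / total_time
print(f"share of wall-clock time below 30fps: {slow_share:.1%}")
```

Under these assumed midpoints, the ~1.1% of sub-30fps frames account for roughly 3.5% of the player's time, which is part of why rare dips are more noticeable than the raw percentages suggest.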
Maybe you could redesign the scenes that tax the engine too much?
4GPix/s is plenty for 720p or 1080p. For 720p and 60fps, you can fill the entire screen and/or 1024x1024 shadow maps a total of ~70 times per frame.
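The ~70 figure is easy to verify: divide the peak fillrate by the pixels that must be written per frame at 720p/60.

```python
# Sanity check of the ~70-fills claim: 4 GPix/s spread over a full
# 1280x720 screen at 60 frames per second.

FILLRATE  = 4e9           # peak pixels per second
W, H, FPS = 1280, 720, 60

fills_per_frame = FILLRATE / (W * H * FPS)
print(f"full-screen fills per frame: {fills_per_frame:.1f}")
```

That works out to just over 72 full-screen fills per frame at peak rate, consistent with the "~70 times" in the post.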
The problem is that the ROPs don't always reach that rate in reality. Sometimes bandwidth isn't enough. Sometimes they're waiting for the pixel shader. Sometimes both are waiting for the setup engine. You have to divide up your frame time among the various bottlenecks that each object brings about.
In any case, the reason framerates are low has nothing to do with the hardware.
It's entirely due to developers aiming for too many objects and/or too-long shaders. You could halve Xenos/RSX and still get 60fps at 1080p with 4xAA if you simplified the scene enough.
There's always a three-way tradeoff between scene complexity/quality, resolution, and framerate. Once the renderer and art is as good as it gets, you can't increase all three at the same time. The exact relationship between these three variables isn't fixed across all games, but the devs still subjectively decide what the optimal point is.
Framerate is all about decisions. It has little to do with hardware unless you're talking about a fixed rendering load. Unfortunately, rendering load is incredibly hard to judge without having full access to the assets, game engine, profiler, and hardware.
Well, because framerates on some arcade platforms were fixed to a high degree in games of the 1990s, I don't believe it's impossible to keep framerates steady and high for many games. Namco's System 22/23 families and Sega's Model 2 and Model 3 families ran more than 95% of their games at 60fps. One could argue that today's games are far more complex, and they are; but back in the mid-to-late 1990s, the arcade games of the day were also graphically the most complex games of their time, more complex than anything before them.
ERP said:If framerate is not a top priority (and frankly these days there is so much other stuff to worry about) it'll be inconsistent. Gone are the days where I could personally police every piece of artwork and content in a game.
As a PS2 owner, I am very disappointed with the number of 30fps titles on next-gen consoles so far. Some of them are still enjoyable but, as I've said numerous times, there should be no substitute for framerate! :/