What resolution and framerate should next-gen target *spawn

Status
Not open for further replies.
Imagine this scenario: Far Cry @ 1080p/60Hz vs. Crysis @ 720p/30Hz.

Crysis would look way better, and IMO would feel just as smooth, thanks to camera and per-object motion blur, which really take the edge off the jerkiness at lower frame rates.

Have you tried it at different frame rates? You might get a similar 'look' but you will get a really different 'feel'. And that is what is important when you play games.
 
RE5? - Yes, I did. Didn't play any better or worse than my PS3 version and looked pretty much the same too. If achieving something special means you have to resort to sub-720p, even in the next generation, I certainly won't be holding it against you.

I do not get what you mean about sub-720p in regard to frame rate. Could you expand, please?

Also, did you really get consistent 60 fps on PC RE5? And you tried the PS3 version around the same time and didn't feel any difference?
 
I think resolution will become the "mandate". 1080p will become standard, and there will always be games that go for 1080p and 60fps at the beginning of a gen, but as graphics get better and better developers will have to cut one of the two, and my guess is it will be the former; it's like that every gen.
 
And it's been a mistake every gen; there's no reason they shouldn't correct that mistake.
How was that a mistake? Every person on this earth will notice the difference between HD resolutions and older SD resolutions on their TVs; that's not debatable. But how many people will notice the difference between 30fps and 60fps? There is no doubt almost every developer will go for more eye candy and 30fps, because even if you can do 1080p60 at the beginning of a gen, you surely won't be able to pull it off three years later without the visual jump stalling. Developers will have to choose between the two, and it all comes down to what's more important to the majority, and that's always the visuals (look at SotC on PS2).
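For concreteness, the raw pixel-throughput gap between the two targets discussed above can be sketched in a few lines (back-of-envelope only; real shading cost doesn't scale perfectly linearly with pixel count):

```python
# Raw pixels the GPU must shade and scan out each second at each target.
def pixels_per_second(width, height, fps):
    return width * height * fps

full_hd_60 = pixels_per_second(1920, 1080, 60)  # 1080p60
hd_30 = pixels_per_second(1280, 720, 30)        # 720p30

print(full_hd_60)          # 124416000
print(hd_30)               # 27648000
print(full_hd_60 / hd_30)  # 4.5
```

So 1080p60 asks for 4.5x the pixel throughput of 720p30 before any of the eye candy is even considered.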
 
The fact that they removed the mandates tells you it was a mistake. The majority of people want the best-looking game a developer can make. That isn't done by imposing stupid artificial limits on their creative choices. And the majority of people don't know that many of the games this gen aren't actual HD resolutions, and many of them believe that some of those non-HD games are among the best looking.
 
That's why I wrote "mandate" in quotes. What I meant is that the majority of developers will aim for 1080p once Sony and MS get the hype train rolling; not all will achieve it, but that will be the aim.

What if they hadn't touted this gen as the HD one? Developers could easily have gone with 900x500 and more polygons, better lighting and post-processing, but that still wouldn't look quite right.
 
The SOTC HD remake can't manage 1080p60 even though it's a remaster of an SD PS2 game, but I wouldn't say PS3 came too early and gamers would have been better off waiting, making do with PS2 for another year or two. :p And if PS3 had come later, in 2008, then next gen would need to come even later still to get the same degree of performance increase. We'd be looking at next-gen in 2016!
According to the devs, SOTC's animation was designed for 30fps. That is why the game is 1080p/30fps, and why the S3D mode is also 30fps.
 
I'm not talking about forcing requirements. I'm talking about hardware capable enough that developers don't have to make the tradeoff.
I feel like everything I've said has been ignored. ;) There's no such thing! Until we have the resources to render photorealistic graphics at any resolution, there'll always be a need to trade resolution for shader power. Until then it will and should be down to the devs to decide. Some will want high framerate, high pixel shading (racers), while others will trim back on the shader requirements and can put more into resolution and IQ.
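That resolution-versus-shader-power tradeoff can be made concrete: for a fixed shader throughput, the per-pixel budget is just throughput divided by pixels per second. The 1000 GFLOPS figure below is purely illustrative, not any real console's spec:

```python
GPU_GFLOPS = 1000  # assumed shader throughput, purely illustrative

def flops_per_pixel(width, height, fps, gflops=GPU_GFLOPS):
    # Shader work available per pixel at a given resolution/framerate.
    return gflops * 1e9 / (width * height * fps)

for name, (w, h, fps) in {
    "1080p60": (1920, 1080, 60),
    "1080p30": (1920, 1080, 30),
    "720p60": (1280, 720, 60),
    "720p30": (1280, 720, 30),
}.items():
    print(f"{name}: {flops_per_pixel(w, h, fps):,.0f} FLOPs/pixel")
```

Dropping from 60fps to 30fps doubles the per-pixel budget, and dropping from 1080p to 720p multiplies it by 2.25 — exactly the knobs a racer and a cinematic shooter will turn in opposite directions.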

If they can't, maybe they should wait, or not bother. Maybe it won't be enough of a jump for people to buy consoles when they can spend money on other gadgets in this day and age.
Except even if Uncharted 4 doesn't hit 1080p60, it'll look bloody amazing on PS4, and we'll have better-looking games that play better. As others have said, if we could hit SD resolution with TV-quality realism, that'd be way better than Crysis on the most awesome PC rig at the most ridiculous resolution. There's still loads of room for hardcore games to improve.

MLAA was fairly unrelated to Larrabee for the record. It came from the labs/ray-tracing side. Larrabee can do MSAA quite well... in fact often better than conventional GPUs since it's a binning rasterizer (and thus can accumulate and resolve in-core).
Okay. Important correction for me!

And according to DF, S3D can't maintain 30 fps, ergo 1080p60 would not work either.
DF said:
Performance is the real issue here, though. In many respects, running Shadow of the Colossus in 3D is highly reminiscent of playing the PS2 game. We see the same problems with frames rendered over-budget causing sustained frame-rate drops: the smooth 30FPS of the 720p version regularly drops down to the 20 and 15FPS we experienced with the game's original release.
SOTC gives us a great example of the limits of visuals that can be achieved at 1080p60. Of course it is a remake, and we could have other improvements. But look at the other titles out there that have tried to hit 1080p60. Things like Wipeout need to resort to resolution scaling, and they're far from the best looking games in terms of stills. The motion is beautiful, but that's only of value to some games. Others can make do with lower framerates to make each frame prettier. And thankfully that choice remains with the developers. Uncharted looks as great as it does because Sony didn't mandate ND hit 1080p60!
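A quick note on why the DF finding implies 1080p60 is out of reach: stereoscopic 3D renders the scene twice per displayed frame, so the per-view time budget at 30fps S3D is the same as the mono budget at 60fps (a simplification that ignores work shared between the two eyes):

```python
def per_view_budget_ms(fps, views=1):
    # Milliseconds available to render each view of one displayed frame.
    return 1000.0 / fps / views

print(per_view_budget_ms(60))           # mono 60fps: ~16.7 ms/frame
print(per_view_budget_ms(30, views=2))  # S3D 30fps: ~16.7 ms/eye
```

If the engine can't hold two 720p views in 16.7ms each, it certainly can't hold one 1080p view at 60fps — the 1080p frame is over twice the pixels per 16.7ms slot.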
 
Ah, yes. Geometry Wars 2, as well. Only 2D/side-scrollers?
Function said Virtua Tennis. I dunno, since I guess you would exceed the eDRAM limit by a couple of times if you did it at 1080p with AA. I just don't know why that's even important. If you took the average of this gen you would probably get closer to no AA and sub-HD resolution.
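The eDRAM arithmetic behind that guess, assuming a 32-bit colour plus 32-bit depth per sample (8 bytes/sample — the common case, but not the only format):

```python
import math

EDRAM_MB = 10  # Xenos eDRAM size on 360

def framebuffer_mb(width, height, msaa_samples, bytes_per_sample=8):
    # Assumes 32-bit colour + 32-bit depth per sample (8 bytes total).
    return width * height * msaa_samples * bytes_per_sample / (1024 * 1024)

fb_720p = framebuffer_mb(1280, 720, 4)    # 28.125 MB
fb_1080p = framebuffer_mb(1920, 1080, 4)  # 63.28125 MB
print(math.ceil(fb_720p / EDRAM_MB))      # 3 tiles
print(math.ceil(fb_1080p / EDRAM_MB))     # 7 tiles
```

So 1080p with 4xAA is indeed several times over the 10MB, meaning predicated tiling would have to split the frame into roughly seven tiles instead of three.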
 
Yes, I think that just about concludes the list. Yes, averaging all the 360 games together should lower the overall resolution to sub-HD.

Anyway, I think the target will probably be 1080p/30fps in S3D. Although, according to some, we should shoot for SD with more processing per pixel. I strongly disagree with that path unless we go back to SDTVs. I won't pay for games that go that path.
 
Not only the 360; both consoles would probably end up sub-HD, especially the PS3. The PS3 has been getting worse multiplats since the beginning of this gen.

Yea, I would guess they are gunning for 1080p30fps for next gen.
 

All of the 'street' games are 1080p, but who cares; most (all?) of the games at 1080p aren't very good looking. They make a great argument for using lower resolutions.

And just for completeness, the PS3 games all averaged together will be sub-HD as well.

Anyway, I think the target will probably be 1080p/30fps in S3D. Although, according to some, we should shoot for SD with more processing per pixel. I strongly disagree with that path unless we go back to SDTVs. I won't pay for games that go that path.

I'm not sure I understand the sentiment? Do you think a DVD movie looks worse than any current games on an HDTV? If they could come anywhere close to DVD quality I would be elated. The actual value of how many pixels are displayed should be of almost no importance.
 
Play a console game ported over to PC. In most cases of multiplatform games, there's very little difference in shaders or effects, even in the PC version. Yet even at 1680x1050 (not even full HD), it's a night-and-day difference from the console. Resolution absolutely makes a difference in terms of sharpness and clarity.

You mentioned DVD movies, why? Why not mention Blu-rays? The only difference there is more pixels, and they also look a hell of a lot better than their lower-res counterparts.
 

The point being, DVD still looks a hell of a lot better than games. Adding resolution is absolutely an improvement, all else being equal, but that doesn't make it the best use of that processing power when you have the option of doing other things with it.

And I mentioned DVD because the person I was replying to mentioned SD resolutions so I used DVD as a comparison point for what was technically (ultimately) possible on SD.
 
That's not a fair comparison. Obviously the same thing @ a higher resolution will typically look better, but it takes more processing power too. The question is how best to use a fixed amount of available processing power. Compare for instance 720p w/ good 4x AA or something to 1080p w/ no AA. The 720p image almost always looks better in motion, with far fewer distracting edges and shimmering artifacts.
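The reason that comparison can favour 720p is that MSAA shades each pixel once but tests coverage at every sample, so 720p + 4xAA costs far less shading than 1080p while still sampling edges more densely (simplified — this ignores bandwidth and edge-heavy shading costs):

```python
def stats(width, height, msaa_samples):
    # Returns (pixels shaded, coverage samples tested) per frame.
    pixels = width * height
    return pixels, pixels * msaa_samples

shade_720, samples_720 = stats(1280, 720, 4)
shade_1080, samples_1080 = stats(1920, 1080, 1)
print(shade_720, samples_720)    # 921600 shaded, 3686400 edge samples
print(shade_1080, samples_1080)  # 2073600 shaded, 2073600 edge samples
```

Less than half the shading work, yet nearly twice the edge-sample density — which is why the 720p + AA image often holds up better in motion.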
 