The Next-gen Situation discussion *spawn

Just chose the GTX 670 as an example because, according to the figures in the link you provided, it draws less power than a GTX 560 Ti (which you did mention) while actually being faster ;). Could have chosen the HD 7950 as well, which, according to those figures, also draws less power than a GTX 560 Ti ;).

Was basically just wondering why you chose to mention the GTX 560 Ti when there are faster, less power-hungry GPUs in the link you provided.

Because you were asking about power draw under maximum load, and the 560 happened to be one of the cards that was mentioned as coming with the Tiki. The 670 almost certainly throttles, as does the 7950.

If you want to know about how PC GPUs throttle under load there's lots of info, but it's not relevant to consoles as they need to maintain a consistent level of performance regardless of load or temperature.
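To make that concrete, here's a toy sketch of the difference (all numbers invented for illustration; real GPU power management is far more sophisticated than this):

```python
# Toy sketch of PC boost clocks vs. a console's fixed clock.
# All numbers are invented for illustration; real GPU power
# management is far more sophisticated than this.

def pc_gpu_clock_mhz(temp_c, power_w, boost=1100, base=915,
                     temp_limit=80, power_limit=170):
    """PC-style boost: run fast in short bursts, then throttle back
    toward the base clock once a thermal or power limit is hit."""
    if temp_c >= temp_limit or power_w >= power_limit:
        return base   # sustained load: performance drops
    return boost      # burst load: advertised boost clock

def console_gpu_clock_mhz():
    """Console-style: one fixed clock the cooling must sustain
    indefinitely, so the worst case defines the design point."""
    return 800

# A PC card's peak figures therefore overstate what the same chip
# could sustain inside a small console enclosure with fixed cooling.
print(pc_gpu_clock_mhz(temp_c=85, power_w=160))  # 915 (throttled)
print(console_gpu_clock_mhz())                   # 800 (always)
```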

Yeah, but that question was referring to the reviewed system "Dr Evil" was posting about over there:

Fair enough, just as long as you understand that you can't compare max TDP of monster PC parts that throttle and then say they should automatically fit just fine in a console where they can't.

Probably no point in pretending that you're still interested in the Tiki now, though.
 
forcing developers to meet some arbitrary pixel standard will automagically result in a better looking end product, amirite?

Yes it would: running at native 1080p on a native 1080p panel improves IQ dramatically over 720p on a 1080p panel.

You console peeps need to get with the times and drop all this 720p crap; it was OK for the current consoles, but 1080p is the current HD standard that pretty much everything HD runs at.

Or are you trying to tell me that rendering at double the resolution doesn't make for a nicer-looking end product?
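For what it's worth, in raw pixel terms the jump is a bit more than double; a quick back-of-the-envelope check:

```python
# Quick pixel-count check: the 720p -> 1080p jump is actually a bit
# more than "double" in raw pixel terms.
print(1280 * 720)                  # 921,600 pixels
print(1920 * 1080)                 # 2,073,600 pixels
print(1920 * 1080 / (1280 * 720))  # 2.25x the pixels to fill
```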
 
Yes it would: running at native 1080p on a native 1080p panel improves IQ dramatically over 720p on a 1080p panel.

You console peeps need to get with the times and drop all this 720p crap; it was OK for the current consoles, but 1080p is the current HD standard that pretty much everything HD runs at.

Fully agreed.

But only increasing the rendering resolution of a current-gen game doesn't make it a "next-gen" game, does it?
 
Fully agreed.

But only increasing the rendering resolution of a current-gen game doesn't make it a "next-gen" game, does it?

Depends on how low the original game's resolution was to begin with. COD, for example, looks like a completely different game when rendered at native 1080p, because the resolution on consoles is so low and piss-poor that a lot of the finer detail in the assets is lost.

1080p would bring all the little details out.

You can instantly see the difference between the native 720p PC screen shot and the console shot.

[Image: PC-vs-PlayStation-3-BIG.png]


The difference at 1080p would be even greater still.
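A crude way to see why sub-native rendering loses fine detail is to simulate the round trip; this toy NumPy sketch (a single scanline, nearest-neighbour sampling assumed) renders the finest possible 1080p-wide pattern at 720p width and scales it back up:

```python
import numpy as np

# One 1920-pixel scanline with the finest possible detail: every
# pixel alternates. (A toy stand-in for fine asset detail.)
native = np.tile([0, 255], 960)

# Render at 720p width first (sample only 1280 positions)...
low = native[np.linspace(0, 1919, 1280).astype(int)]
# ...then let the display scale those 1280 samples back up to 1920.
upscaled = low[np.linspace(0, 1279, 1920).astype(int)]

# The fraction of adjacent pixels that still differ shows how much
# of the original detail survives the round trip.
print((native[:-1] != native[1:]).mean())      # 1.0: all detail present
print((upscaled[:-1] != upscaled[1:]).mean())  # far below 1.0: detail lost
```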
 
Depends on how low the original game's resolution was to begin with. COD, for example, looks like a completely different game when rendered at native 1080p, because the resolution on consoles is so low and piss-poor that a lot of the finer detail in the assets is lost.

1080p would bring all the little details out.

Not sure if serious :???:?

Asking cautiously now: Did you just say that you consider current-gen COD to be "next-gen" looking when rendered at 1080p :???:?

:oops::eek:

Why are you using a strawman?

You might want to read all the posts before claiming such things, because "almighty" actually made that statement just a few posts earlier, see:


Going from 1280x720 @ 30fps with no AA and AF to 2560x1600 @ 60fps with anything up to 32x AA and 16x AF is, in my eyes, a generational leap.


;)
 
user542745831 said:
You might want to read all the posts before claiming such things, because "almighty" actually made that statement just a few posts earlier, see:

Define "next-gen."

Clarification: I only ask because, from my perspective, going from 1280x720 @ 30fps (min) to 2560x1600 @ 60fps (min) takes approximately 10x the resources. Now, in GPU hardware terms a "generation" typically refers to an architectural change accompanied by a process reduction and a significant performance improvement. Generally, a generational leap in GPUs will provide somewhere around a 1.8x to 2x (best-case) improvement in performance. Thus, if you need something with 10x the performance, it would have to be at least 3 generations ahead (which would still only get you to ~8x). But all of that assumes perfect power scaling, which is very far from reality.
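The arithmetic behind that estimate, sketched out (ignoring the extra AA/AF cost, which only widens the gap):

```python
import math

# The arithmetic behind the "~10x the resources" estimate above.
pixels_720p  = 1280 * 720     # 921,600
pixels_1600p = 2560 * 1600    # 4,096,000
fps_ratio    = 60 / 30        # 2x the frames

resource_ratio = (pixels_1600p / pixels_720p) * fps_ratio
print(round(resource_ratio, 1))   # ~8.9x, before any AA/AF cost

# At ~2x per GPU generation (best case), three generations compound to:
print(2 ** 3)                     # 8x -- still short of ~9-10x
print(math.ceil(math.log2(resource_ratio)))  # generations needed: 4
```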


Now, if you want to cling to some vague undefined idea of what "next-gen" is, then I may submit that WiiU is a "next-gen" system.
 
This is Zelda: Ocarina of Time running at 1080p with high-resolution textures.

Does it look better? Yes. Does it look PS2 quality? No way.

That is what a true generational leap is.

 
Yes it would: running at native 1080p on a native 1080p panel improves IQ dramatically over 720p on a 1080p panel.

You console peeps need to get with the times and drop all this 720p crap; it was OK for the current consoles, but 1080p is the current HD standard that pretty much everything HD runs at.

Or are you trying to tell me that rendering at double the resolution doesn't make for a nicer-looking end product?

No, I'm trying to tell you that it's possible to make a game at 720p (or some other sub-1080p resolution) that looks better than one at 1080p, because of how they have chosen to use the resources. Choosing to render at a higher resolution might not be as useful as rendering at a higher frame rate or using higher poly counts (or many other things).

Saying you need 1080p and FXAA doesn't really say jack, because you can do that right now on current consoles; the end product might be Pong, but it meets some silly mark you've drawn in the sand. Pointless.
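To make the trade-off concrete, here's a toy budget calculation (hypothetical numbers; shading cost treated as simply proportional to pixels times frame rate, which real workloads only approximate):

```python
# Fixed GPU budget: shading work roughly scales with pixels x fps.
# Hypothetical budget; the point is the trade-off, not the figures.
budget = 1920 * 1080 * 30    # say the GPU can shade 1080p at 30fps

for w, h, fps in [(1920, 1080, 30), (1280, 720, 60), (1280, 720, 30)]:
    headroom = budget / (w * h * fps)
    print(f"{w}x{h} @ {fps}fps -> {headroom:.2f}x per-pixel budget")

# 1280x720 @ 30fps leaves 2.25x the per-pixel budget of 1080p30:
# surplus that can buy richer shading, higher poly counts, or AA.
```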
 
This is Zelda: Ocarina of Time running at 1080p with high-resolution textures.

Does it look better? Yes. Does it look PS2 quality? No way.

That is what a true generational leap is.


A Zelda GCN/Wii game to compare. I firmly side with those who think developers, and not some abstract standard, are the best path. There will be games that make good use of 1080p (flight games, RTS, etc. come to mind), but there will be games where the lower resolution of 720p will allow for a better end image. Heck, for some games (racing) I would choose lower resolution and a higher framerate over higher resolution every day of the week.
 
I generally agree that you can let it be decided by the developer, and let the customer decide whether the developer made the right call. However, one of the reasons we have 30fps or less in a lot of games is that there's a graphics rat race: if everyone had been used to whatever consoles can do at 60fps, our perception of good graphics would be the same as it is now. The peaks would be lower, but you only notice lower peaks when there are higher peaks to compare them with. In the meantime, the gameplay would very likely be better.

And the only way to change that could very well be through a platform-holder mandate...

Of course we all just hope that the consoles will be powerful enough to be able to do 1080p 60fps so easily that developers don't have to compromise for more framebuffer stuff ;)
 
This is Zelda: Ocarina of Time running at 1080p with high-resolution textures.

Does it look better? Yes. Does it look PS2 quality? No way.

That is what a true generational leap is.


lol, I almost fell out of my chair laughing... It would be a pity if developers had been going down that route for 15 solid years until now. Oh lord. :LOL:

The standard for this upcoming generation should be 1080p with a substantial change in visuals; the new generation should surpass the previous one quite a bit if it's going to propel itself on its own.

Now, there are plenty of exceptions to the rule when it comes to how clean it's got to look :)

If none of the future tech demos were genuine 1080p, then it's possible that the next gen of gaming won't necessarily need to be 1080p native anyway. It's an ideal goal, but what there SHOULD primarily be is a substantial decrease in aliasing. Whatever resources they will now have, the goal is to make the image as clean as possible. No more excuses for not using good multisampling methods.

Besides the target of real, professional-quality multisampling, the other important goal is to greatly increase mesh and object detail. Sharp textures are nice, but they can only go so far when they're on budgeted surfaces. There should be fewer "smoke and mirrors" methods.
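For a rough sense of what good multisampling costs, a back-of-the-envelope framebuffer estimate (simplified: RGBA8 color plus D24S8 depth/stencil assumed per sample, no compression, no extra render targets; real engines differ):

```python
# Back-of-the-envelope framebuffer cost of MSAA at 1920x1080.
w, h = 1920, 1080
bytes_per_sample = 4 + 4   # 4 B color + 4 B depth/stencil

for samples in (1, 2, 4, 8):
    mb = w * h * bytes_per_sample * samples / 2**20
    print(f"{samples}x MSAA: {mb:5.1f} MB")  # ~15.8 MB up to ~126.6 MB
```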

- - -

On a side note:

Did you know that Metroid Prime used actual polygons for cracked surfaces, as well as for depth inside its environments, and it still ran at 60fps? Now we've switched to texture-based environments as a performance convenience. They're good, but at certain angles they don't compare well as alternatives to polygonal etching.
 
I had this discussion years ago. People were arguing that the PS2 could never have good graphics because of the TV resolution. They thought Quake 3 was the best possible graphics ever, and that only the resolution could go up.

To them I said:
Which has better graphics: your PC running Quake 3 at 1600x1200, or my DVD player running The Matrix at 720x576?

Suffice to say, they shut the hell up. And when they saw Tekken Tag running a short time later on my brand-new PS2, they were almost crying because of the detail in the floor textures, the animation and the character models. So while there should be a high resolution next gen, I'd rather have Gran Turismo 5 at 1280x720 than Forza 2 or 3 at 4048x2024 or something. You get the point ;-)
 
I had this discussion years ago. People were arguing that the PS2 could never have good graphics because of the TV resolution. They thought Quake 3 was the best possible graphics ever, and that only the resolution could go up.

To them I said:
Which has better graphics: your PC running Quake 3 at 1600x1200, or my DVD player running The Matrix at 720x576?

Suffice to say, they shut the hell up. And when they saw Tekken Tag running a short time later on my brand-new PS2, they were almost crying because of the detail in the floor textures, the animation and the character models. So while there should be a high resolution next gen, I'd rather have Gran Turismo 5 at 1280x720 than Forza 2 or 3 at 4048x2024 or something. You get the point ;-)

Problem with that is the texture resolution was not significantly higher than the display resolution, so it made little to no difference; the same is not true for today's games, well, on PC at least.

Having 1kx1k textures or above on a low-resolution display is pointless, as you'll never really get to see all the detail.
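A quick texel-per-pixel estimate makes the point (hypothetical object size; real UV layouts and mip selection complicate this):

```python
# How much of a 1024x1024 texture can a 720p screen even show?
# Hypothetical object covering a quarter of the frame.
tex_texels    = 1024 * 1024          # ~1.05M texels in the texture
object_pixels = (1280 * 720) // 4    # ~230k pixels the object occupies

print(tex_texels / object_pixels)    # ~4.6 texels per screen pixel:
# most of the texture's detail is minified away at this resolution
```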
 
Problem with that is the texture resolution was not significantly higher than the display resolution, so it made little to no difference; the same is not true for today's games, well, on PC at least.

Having 1kx1k textures or above on a low-resolution display is pointless, as you'll never really get to see all the detail.

The same can still be said today:
The Matrix on DVD will look better than Halo 5, Gears of War 4, Uncharted 4, or whatever future sequel, at 8K.
 