Will next gen consoles focus on improving IQ at current HD resolutions?

Personally, I just want them to take what is currently the Xbox 360, rip out the motherboard and replace it with a 2012-2013 spec one, replace the dvd drive with blu-ray, and leave the controller exactly the same. Do you guys think the majority of gamers would agree with me on that?

Hardcore gamers, yes ... they don't need a controller with a screen; refocusing from the TV to the controller too often isn't a good idea anyway.
I also expect PS3 controllers to be compatible with the PS4; there's really no need for some crazy redesign.
 
Do you guys think the majority of gamers would agree with me on that?

Indeed.

The only caveat I'd suggest WRT hardware (not spec) would be making sure the IO isn't a limitation, as it has proven to be on the xb360 with its shared USB2.

Everything else I'd suggest the same.
Roughly the same box size as the original (but with the cooling fan as it is now in the Xbox 360 S, with a direct export of heat on the top/side instead of the back)
Same controller support (perhaps new split controller if they can figure it out).

Then invest the budget in the guts and apply that processing power to making 720/1080p games (with a preference on 720 and investing heavily in image quality).

Having said that, any investment made to bring in expensive accessory features will likely have an impact on the spec in the box and further push the 720p mantra.
 
For those who complain about CoD's rendering resolution, do you prefer the look of games which are 720p with no AA, like BF3? I think BF3 on consoles looks like far more of a mess than the recent CoD games. Also, I make the choice between AA and resolution all the time with PC games, and I consistently choose AA over a higher resolution with none whenever I have to settle for one or the other to keep a pleasant framerate.
 
Closed minded people prevent progress.

You've closed your mind off to the idea that one size doesn't fit all, and that it isn't likely to in the near future. Fortunately you aren't in a position to prevent progress though (so don't be too hard on yourself).
 
You've closed your mind off to the idea that one size doesn't fit all, and that it isn't likely to in the near future. Fortunately you aren't in a position to prevent progress though (so don't be too hard on yourself).

Another snide remark. How surprising.

Your contributions are mind blowing. On second thought, scratch the "mind" part.

kyleb said:
For those who complain about CoD's rendering resolution, do you prefer the look of games which are 720p with no AA, like BF3? I think BF3 on consoles looks like far more of a mess than the recent CoD games. Also, I make the choice between AA and resolution all the time with PC games, and I consistently choose AA over a higher resolution with none whenever I have to settle for one or the other to keep a pleasant framerate.

I disagree on BF3 visuals vs COD, but strictly from a single-player perspective. And only partly due to the visuals (better than COD IMO); it's more to do with everything else in the feel of the game. COD feels like one very scripted and canned event after another, but they make up for a lot of it with the events that they script and the atmosphere around them being very high quality.

With the success of both franchises, I'll be interested to see what they put together for nextgen. I'll go out on a limb and predict COD will be just as heavily scripted and controlled as this gen, but with higher quality assets. With their tight deadlines though, they may choose to let 1080p eat some processing as it will allow a lighter budget than 720p with even higher quality assets, shaders, physics, etc.

One curious demo that did strike me was Nintendo's 1080p Zelda bit. That may be the angle Nintendo takes to try and set WiiU apart from xb360/ps3 ... having the same games, but at 1080p as a feather in their cap. Implications of that for ps4/xb720 would likely be a marketing war of bulletpoints leading to 1080p standard which would be a shame.
 
Another snide remark. How surprising.

Your contributions are mind blowing. On second thought, scratch the "mind" part.

You just complained about snide remarks, and then immediately went on to try and make one. That's incredible.

Middleware and libraries are nothing new; suggesting that a console vendor should invest huge sums of money in researching, specifying, developing, testing and supporting a huge library of *everything* (and that developers should contribute to it for nothing), while simultaneously ignoring the fact that games face a large range of highly specific problems and see an even greater range of solutions, is not something you should be patting yourself on the head for.

Here's an ERP quote from the "Next Gen Gameplay and/or Graphics Differentiators" thread:

UGH!

I think you just have a very idealized view of software development. It really is not just a case of plugging together off-the-shelf components; every library has a support cost, and the bigger and more varied the functionality, the bigger the support cost. When 3rd party technology has gameplay or visual quality implications it's even more complicated. There is an enormous gulf between demonstrating something and getting it to the point where you can ship it.
 
One curious demo that did strike me was Nintendo's 1080p Zelda bit. That may be the angle Nintendo takes to try and set WiiU apart from xb360/ps3 ... having the same games, but at 1080p as a feather in their cap. Implications of that for ps4/xb720 would likely be a marketing war of bulletpoints leading to 1080p standard which would be a shame.

Correct me if I'm wrong but wasn't the WiiU Zelda tech demo 720p with no AA?

For those who complain about CoD's rendering resolution, do you prefer the look of games which are 720p with no AA, like BF3? I think BF3 on consoles looks like far more of a mess than the recent CoD games. Also, I make the choice between AA and resolution all the time with PC games, and I consistently choose AA over a higher resolution with none whenever I have to settle for one or the other to keep a pleasant framerate.

BF3 is 704p with PPAA on consoles, and it looks much sharper than any CoD game IMO. I've recently been playing a lot of Gears 3 and MW3 on my 360, and every time I pop in MW3 the difference in resolution is pretty apparent... MW3 sure looks blurry compared to Gears 3. Maybe that's just me, but I prefer a sharper image with some aliasing to a blurrier IQ with fewer jaggies.

Higher resolution makes a big difference IMO, I'd get Crysis at 1080p with no AA over 720p with 4xAA any day personally.

Don't get me wrong, though: 60fps is always awesome, and overall I think the CoD games look really good, especially considering the crazy setpieces the series has in the SP portion of its games. But here we're talking strictly about IQ and how it compares in that aspect to games like BF3. Having said that, I think we need to go with 1080p as the standard res next gen instead of 720p with better AA, and let developers choose what to compromise to get the vision in their head on screen.
 
That's incredible.

Indeed.

I don't intend to drag this thread off topic so let's leave comments from other threads in other threads and leave your snide and insulting remarks in your head.

Kameradschaft said:
Correct me if I'm wrong but wasn't the WiiU Zelda tech demo 720p with no AA?

hmm...

Reggie said:
We showed what Link (the main character of Zelda) might look like in a 1080p environment, and it got people pretty excited...

Again, the point is not about a comparison versus our competitors. What we’ve said is it will be 1080p. Check the box on the best graphics capability.
http://nintendoeverything.com/67223/
 
Indeed.

I don't intend to drag this thread off topic so let's leave comments from other threads in other threads and leave your snide and insulting remarks in your head.

But you are talking about middleware, libraries and tools in this thread. Here it is, on this page, in this thread:

Standard toolsets, libraries, techniques and methodologies will go a long way in helping to bridge this gap on games with commonalities to share the investment cost.

This can be applied to not only the interactivity of objects, but physics, AI, and more relevant to the thread topic, to 3d objects (models, textures, and "bones").

How does x look?
How does x act?
How does x react?
etc

The commonality of games and game worlds is increasing ... as processing power increases, expectations increase, and budgets increase.

The time for a massive library to be built is coming soon. Just a matter of someone having the vision, budget, and gumption to execute the concept.

It will drastically change the way games are perceived, and open the market to people that are otherwise non-gamers.

None of which requires 1080p to execute a convincing gameworld.

ERP's comment was/is directly relevant to the line that you are pursuing (once again) in here.
 
ERP's comment was/is directly relevant to the line that you are pursuing (once again) in here.

My post was in response to another in this thread as a potential means to elevate graphics:

blip said:
Do you guys think maybe they are just reaching a point where we've taken graphics as far as we can? We can make individual elements look great, but we can't shake the "video game" feeling of the whole thing.

Like some other user noted, watching a VHS video still looks more realistic than a modern video game. Sure, the resolution is low, but the textures, lighting, shadows, and geometry are all infinitely more complex because it's real life. Versus video games, which are at HD resolutions but still composed of mostly flat surfaces and a lot of "tricks" to give things more depth. I can't really pick up that blanket on that couch in that room and set it over on that table and have it drape off, with all the lighting and shadowing changes that involves. GTA 4 was still a bunch of empty fake cardboard buildings, a facade of a city and nothing more. Textures are artist creations which can never have the organic complexity and subtlety of randomness of real life. The end result still looks a lot more fake than that old VHS of your '87 family Christmas... there just isn't the complexity there, and something tells me it won't really be there next generation either... I feel like it's still going to be invisible barriers... "don't look behind that building because the textures drop off", this... that... etc., etc.


I don't know... something tells me this is a budget thing. It's too expensive to make all that complexity... video games need to be pumped out in a year or two, on a reasonable budget, hence probably the next gen trend towards "lower power usage, increased internet functionality... and a minor visual upgrade lol..."

And my response was essentially "a huge library of HQ content".

50 people making 50 3d cars in the same time/tool budget will never match 50 people working on one. Pooling asset creation instead of recreating assets per project will yield far more favorable results for games that use realistic content as the basis of their game world going forward. And as long as the 3d content isn't created with a hard poly/texture budget in mind, the same asset can be down-sampled for the intended use (a good idea anyway, as a mipmap of sorts for polygon objects, which many games already use in their scenes); then when hardware does catch up, the assets are already built and ready to go.
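The "mipmap of sorts for polygon objects" idea is essentially distance-based level of detail: author one master asset, derive progressively decimated versions, and pick one at render time. A minimal sketch of how that selection might look (all numbers and names here are illustrative, not from any real engine):

```python
# Minimal distance-based LOD selection: one high-detail master asset
# is down-sampled into a chain of coarser versions ahead of time,
# and the renderer picks a version by camera distance.

# Pre-built LOD chain: (max_distance, triangle_count) pairs,
# imagined as down-sampled versions of one master mesh.
LOD_CHAIN = [
    (10.0, 100_000),        # close up: full-detail mesh
    (50.0, 10_000),         # mid range: decimated mesh
    (float("inf"), 1_000),  # far away: coarse proxy
]

def select_lod(distance):
    """Return the triangle budget for an object at the given distance."""
    for max_dist, tris in LOD_CHAIN:
        if distance <= max_dist:
            return tris
    return LOD_CHAIN[-1][1]

print(select_lod(5.0))    # full detail
print(select_lod(30.0))   # decimated
print(select_lod(500.0))  # coarse proxy
```

The point of the original argument is that if the master asset is built without a hard budget, the chain can simply be re-derived with higher caps when future hardware allows it.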

And your response was "A Swiss army knife fits in your pocket!"

I think that's about as far as I'm willing to go off-topic with you on the matter as you don't seriously want to discuss my initial response proposal, so have fun coming up with whatever other quip you intend to respond with. I'm done responding to you.
 
My post was in response to another in this thread as a potential means to elevate graphics:

Nope, it wasn't just graphics you were talking about; you were also talking about physics, AI, etc. All of which is actually relevant to the thread, and to whether next gen consoles will just focus on graphics. I'd quote your post again and point this out in black and white, but I'd just get accused of things.

It's bad form to attack people while shouting about how offended you are. It's especially bad to do it three responses in a row.
 
Do you guys think maybe they are just reaching a point where we've taken graphics as far as we can? We can make individual elements look great, but we can't shake the "video game" feeling of the whole thing.

Like some other user noted, watching a VHS video still looks more realistic than a modern video game. Sure, the resolution is low, but the textures, lighting, shadows, and geometry are all infinitely more complex because it's real life. Versus video games, which are at HD resolutions but still composed of mostly flat surfaces and a lot of "tricks" to give things more depth. I can't really pick up that blanket on that couch in that room and set it over on that table and have it drape off, with all the lighting and shadowing changes that involves. GTA 4 was still a bunch of empty fake cardboard buildings, a facade of a city and nothing more. Textures are artist creations which can never have the organic complexity and subtlety of randomness of real life. The end result still looks a lot more fake than that old VHS of your '87 family Christmas... there just isn't the complexity there, and something tells me it won't really be there next generation either... I feel like it's still going to be invisible barriers... "don't look behind that building because the textures drop off", this... that... etc., etc.


I don't know... something tells me this is a budget thing. It's too expensive to make all that complexity... video games need to be pumped out in a year or two, on a reasonable budget, hence probably the next gen trend towards "lower power usage, increased internet functionality... and a minor visual upgrade lol..."

If the price for graphics is VHS-level resolution, I prefer hi-res and less complex graphics :)

I am curious how much of a difference the next gen is really going to be able to "show" us, considering that the biggest visual upgrade this gen was going high def. That trick has already been spent.
 
If the price for graphics is VHS-level resolution, I prefer hi-res and less complex graphics :)

I am curious how much of a difference the next gen is really going to be able to "show" us, considering that the biggest visual upgrade this gen was going high def. That trick has already been spent.

I think it'll depend on the individual user more than ever how much value they see in the bump in graphics, and that even for the same user it'll vary by game. That's why I think platform vendors need to give developers as much flexibility as possible to use their platform as they see fit - it's probably why I'm not in favour of arbitrary constraints on resolution, frame rate, or any other graphical feature.

It'll be really interesting to see how the technical aspects of this next generation of consoles are marketed. The MHz and Gflops war of E3 2005 wouldn't seem as important to win if reasonably affordable PCs are significantly outperforming consoles at the time of their introduction, as it removes a bit of the glory. Or maybe that doesn't actually matter for spinning purposes.
 
I am curious how much of a difference the next gen is really going to be able to "show" us, considering that the biggest visual upgrade this gen was going high def. That trick has already been spent.

How did you come to that conclusion? Which PS2 game (if it were in 720p) do you think would look as good as Uncharted 3 (or insert a game you think looks really good here)? I think how much work is done on each pixel matters at least as much (probably way more in most cases) as the bump in resolution in terms of overall graphic quality. Tessellation alone could lead to significant improvements in poly counts next generation, for much more realistic scenes.

The idea that "that trick has already been spent" is in error, because most of the best looking games of this generation aren't 1080p, so there's still an opportunity to push resolution if they want. However, there's plenty of room for significant visual improvements without a bump in resolution. I don't think we're anywhere near being so close to absolute graphical fidelity that people won't care about the difference on future hardware.
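For a sense of scale, the pixel math behind that resolution-vs-per-pixel-work trade-off (a quick back-of-the-envelope calculation, not a claim about any specific game):

```python
# Back-of-the-envelope pixel budgets: how much more per-frame work
# a native resolution bump demands, all else being equal.
resolutions = {
    "720p": (1280, 720),
    "1080p": (1920, 1080),
}

pixels = {name: w * h for name, (w, h) in resolutions.items()}
for name, count in pixels.items():
    print(f"{name}: {count:,} pixels")

# Going from 720p to 1080p multiplies the pixel count by 2.25, so at
# a fixed frame rate each pixel gets roughly 2.25x less GPU time.
ratio = pixels["1080p"] / pixels["720p"]
print(f"1080p / 720p = {ratio:.2f}x")  # → 2.25x
```

That 2.25x is the budget a developer can instead spend on shading, AA, or assets by staying at 720p, which is the core of the trade-off being argued in this thread.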
 
Perhaps the real marker for when graphics are good enough is when people don't notice the change to CGI cutscenes. In War in the North, the end cinematic is what the game should look like. If that happens next gen, woohoo!
 