Well it can only be maxed out in 2011 if The Last Guardian is released.
It's only maxed out when exclusives look twice as good as the best 360 ones and multiplatform titles run/look best on PS3. True story.
My question is: do you think developers are pushing near the achievable level of fidelity on current-gen consoles, or is there still a lot more juice to be gotten out of them? L.A. Noire seems like quite a huge leap from the very first-gen titles, for example.
The PS3's GPU is basically a 7800 GTX, which is ancient by today's standards, yet being a closed-box environment developers can really push the hardware to its limits. But do you think they are close to that limit yet?
I mostly disagree with the forums opinions of technical superiority any way, pretty graphics are much more a function of art direction and good choices during asset creation than they are a function of technical merit.
Well, explain that to some staunch fans and they aren't going to listen. You're exactly right, but it's almost always console users, and not developers, who do the policing here.

I think it's a meaningless question.
Developers certainly learn from previous experience with a platform, and 2nd titles are often a jump over 1st titles, though in practice that's as much a function of tight launch windows and late hardware visibility as it is anything else.
When I did my first game port circa 1989 I realized how little good technology or software engineering had to do with pretty games.
Sure there are new techniques developed, but these days I think that has less to do with hardware and more to do with algorithmic development.
I think what does improve through console life cycles is the tools, and that probably has as much effect on quality as anything else. Ask any AAA product developer how many lines of code are in the tools they use to build a game and how many are in the actual product.
Finally, someone else agrees.
I guess we'll see with an open-world game, Infamous 2, right around the corner. I believe they are shooting for 60fps at 720p. I think, as far as clock cycles are concerned, PS3 has just reached its limit with this batch of AAA 1st party games.

Where have you heard about Infamous 2 being 60fps? I seriously doubt they are shooting for 60fps in an open-world game. It certainly does not make much sense, and since R* games usually run at sub-30fps I can't imagine this running at twice the frame rate.
Code quality counts for a lot. Learning how to write efficient code for a piece of hardware, especially one like SPU code without the execution aids of x86, where you have to make micro-level choices that can severely cripple the speed of your processor, means a big difference in how much work you can do. That surely has to be an issue of developers learning to understand and design for the systems, rather than developer tools magically optimising their source code. If a new algorithm paper is released with generic demo C code, one developer copy/pasting that into their game won't get as much from their engine as another developer understanding the algorithm and crafting it from the ground up for the target machine, using every efficiency they can think of.
I imagine that's the main reason for games improving - developer know-how in targeting the machines. Once that know-how has reached a decent level of efficiency, the hardware is being exploited and it's then just a matter of picking which features and assets to make it look good. Until then, no amount of fancy lighting techniques and carefully crafted artwork will make a poorly written game deliver what it's fully capable of.
Don't you just HATE that?!!I wrote a big response to it and lost it, so you get the short version.
I agree absolutely with the art being mostly what people regard as 'good tech', in that a good-looking game is what people consider technically advanced; the OP even says as much. And perhaps the basic level of coding in games development is high, such that the performance loss due to bad choices or poor code isn't that great, and I'm holding a somewhat pessimistic view of how well console development is run based on standards in lots of industries that are slap-dash, slipshod, driven by lousy management chasing impossible targets, etc. (e.g. Dilbert). Still, that doesn't really contradict the OP - "My question is, do you think that developers are pushing near the achievable level of fidelity on current gen consoles or there is still a lot more juice to be gotten out of them." The "achievable fidelity" will be the most polys and textures and shaders at the highest resolutions, which is all about using the hardware efficiently. Whether such games are regarded as beauties or not is a parallel but different consideration.

Yes, software quality is important, but any decent assembly programmer (which I'll admit is a rarity these days) should be able to push out quality SPU code for a given algorithm. It's not that hard. The hard part is optimizing the data flow, and that usually does improve, certainly in the second release and incrementally thereafter...
No, because I think there are algorithmic leaps, and better compromises, to be had. And that doesn't include scope for artistic choices like radical art styles.
I know the guys who started Snowblind, and while Ezra is a very smart guy, the real value in that studio IMO is their art; they have 2 of the better artists I've ever worked with on staff.

Definitely (I presume you're including Brian Despain?). But that kinda highlights my argument. The same team worked on the same hardware for BGDA and CON. What was it that made CON so much prettier and more involved, with more characters and lighting and shadowing effects? Better use of hardware, no? But once that level of hardware use was reached, there wasn't much further to go regards 'better', only different. Like you say, a choice of render style can elevate a game to gorgeous without being technically demanding, or maybe it needs a technology to implement it, like LBP2's lighting.
Absolutely, but that's a common fault with human perception, is it not? A lot of what impresses people is kinda shallow and flashy. Many pop stars are crap singers held aloft by technology fixing their voices, but they look fancy and are dressed up and get the attention that a plain-vanilla, naturally gifted singer won't get. And Hollywood chucks in loud soundtracks and big explosions to add zing to limp stories and weak acting. As we've pretty much all noticed over the years on this board, innovation in games doesn't get you very far and Joe Public is more interested in what something looks like, so that's where the attention goes. Look at the "Game Technology - best of" thread and it's all about the visuals. No developer is ever going to get credit for a sweet little AI routine that simplifies their crowd behaviour simulation while freeing up CPU time for other tasks, or a cunning use of data packing to make streaming more efficient for their title, or a novel approach to terrain modelling. Such appreciation can only ever come from one's peers.

Very few things impress me technically; I'm old and cynical. I still think the original Gran Turismo is up there, not for polygon counts or graphics, but because I know how hard it was to put that simulation (albeit a fairly poor one) on that box.
I've seen the animation code for Madden and I think it's impressive, though you'd be hard pushed to find a message board fan who would agree.
I worked on Spore, a game probably with more clever technical solutions to somewhat unique problems than any other I've seen (I can't take credit for any of them), but very few people understand which parts were technically complex and which weren't.
I just get somewhat irritated when people equate technical quality with graphical quality; while they are certainly not orthogonal, they are not equivalent either.