The Next-gen Situation discussion *spawn

Completely agree with you. If you look at the minimum specs of PC games, even great-looking ones like The Witcher 2, the minimum requirement is usually around an ATI 38xx GPU, a high-end GPU but from 2008. I'm not a believer in engines that are that scalable (38xx series to 78xx: vastly different architectures, orders of magnitude apart in performance). When the min spec is a 7750 GPU, then we'll see next-gen games.

Is The Witcher 2 really such a great example though, since it's DX9 only?
 
I never realized it got a 360 port. My mistake.


Apparently not only that, but:


Digital Foundry said:
http://www.eurogamer.net/articles/digitalfoundry-tech-interview-metro-2033?page=2

Digital Foundry: Your early 4A tech demos showed you were working with PS3 too, but Metro 2033 is console-exclusive to Xbox 360. Why is that? Are there any technical reasons holding back the game from running on PS3?

Oles Shishkovstov: From the start we selected the most "difficult" platform to run on. A lot of decisions were made explicitly knowing the limits and quirks we'll face in the future. As for me personally, the PS3 GPU was the safe choice because I was involved in the early design stages of NV40 and it's like a homeland. Reading Sony's docs it was like, "Ha! They don't understand where those cycles are lost! They coded sub-optimal code-path in GCM for that thing!" And all of that kind of stuff.

But THQ was reluctant to take a risk with a new engine from a new studio on what was still perceived to be a very difficult platform to program for, especially when there was no business need to do it. As for now I think it was a wise decision to develop a PC and console version. It has allowed us to really focus on quality across the two platforms.

One thing to note is that we never ran Metro 2033 on PS3, we only architected for it. The studio has a lot of console gamers but not that many console developers, and Microsoft has put in a great effort to lower the entry barrier via their clearly superior tools, compilers, analysers, etc.

Overall, personally I think we both win. Our decision to architect for the "more difficult" platform paid off almost immediately. The whole game was ported to 360 in 19 working days. Although they weren't eight-hour days.


;)


Earlier points about overall visual fidelity still stand.


By the way:


Digital Foundry said:
http://www.eurogamer.net/articles/digitalfoundry-tech-interview-metro-2033?page=4

Digital Foundry: You've created state-of-the-art technology that is right up there with some of the best we've seen on console. Both Microsoft and Sony don't want to replace their consoles yet, so where do you see the software evolving from here? How can you improve on what you have achieved with the 4A engine?

Oles Shishkovstov: Well, the majority of our Metro 2033 game runs at 40 to 50 frames per second if we disable vertical synchronisation on 360. The majority of the levels have more than 100MB heap space left unused. That means we under-utilised the hardware a bit...


:eek:;)
 
STALKER games are PC centric and have pushed the PC forward...

Try to understand what I'm saying. I know PC-exclusive games push PC hardware forward. I never denied that. Even non-exclusive titles do it; I can name many just from this year.
What I am saying is that the rate of development, the speed at which PCs are being pushed, is hampered by the focus of the rest of the industry on console architectures. Don't you agree?
 
Try to understand what I'm saying. I know PC-exclusive games push PC hardware forward. I never denied that. Even non-exclusive titles do it; I can name many just from this year.
What I am saying is that the rate of development, the speed at which PCs are being pushed, is hampered by the focus of the rest of the industry on console architectures. Don't you agree?

I think there are multiple issues at work.
When I worked on PC you were hamstrung by whatever your sales group dictated as the min spec. Often the CPU wouldn't be bad, but the min-spec GPUs were always hideously bad.

And despite what some PC gamers seem to think, scalability only gets you so far; you have to target your assets at something, and it certainly isn't the shiny new GTX 680 even if you're PC-exclusive and targeting core gamers.

Obviously this is compounded by the money being in console games, so you'll see assets targeting consoles and only minor improvements for the PC port.
 
Just look at games like Torchlight II or Diablo 3. Not exactly demanding, and purposely designed to reach a large audience. They don't have a console port either, so there's no excuse on that end (although Torchlight II may well end up getting a 360 version again).

They are only now slowly moving away from DX9, because the consoles are still DX9 hardware. Even now that a high number of PCs are equipped with DX10-and-higher GPUs, the PC market alone is not large enough, or viewed as lucrative enough, to warrant huge investments aside from a few exceptions. Once the next consoles come out and hopefully manage to kill off the old gen swiftly, we should see devs starting to target higher-spec configurations.

I agree that the closed-box console environment brings huge (/Trump) benefits, but the difference in hardware power between a high-end PC and the old consoles is way too big for that to matter much at this point in time. I mean, those next-gen demos we saw, like the Elemental demo, 1313 and that Square one, were running on high-end but still reasonable PC configurations. The games could pretty much look like that today.

Reasonable? The GPU required alone costs as much as next-gen consoles will likely cost at launch.

I still generally agree, but the fact that we are slowly moving away from DX9 has very little to do with consoles and everything to do with what hardware and drivers are out there in the PC market. If anything, I think DX9 and CPU/GPU limitations have held back console game development as much as the reverse is true.
 
I still think next-generation games will look much better than any PC games out there currently. PC games right now are basically nothing more than higher-res versions of PS360 games with a DirectX 11 topping.

Bingo, and it's only because consoles have become the baseline of gaming. This is why I think the research and development behind the new consoles should reflect more refined, in-development tech, as that will no doubt bump up the PC gaming experience too and make high-end tech cheaper to obtain.
 
Reasonable? The GPU required alone costs as much as next-gen consoles will likely cost at launch.

I understand it's a matter of perspective, but I generally consider single-chip GPUs a "reasonable high end". There is definitely the possibility of going much higher than that in the PC world. In terms of heat, power consumption and the demands on other components like the case, PSU and cooling, a single 680 doesn't require anything unreasonable.

The price of the GTX 680 is purely a marketing decision. A 670 with slightly higher clocks gives the same performance, and the 660 Ti has almost the same manufacturing cost (same 294mm² die, same amount of memory), yet you can get that for under $300. Watch the performance level of the GTX 680 become the mid-range on the same 28nm manufacturing process before the next-gen consoles are out.

I still generally agree, but the fact that we are slowly moving away from DX9 has very little to do with consoles and everything to do with what hardware and drivers are out there in the PC market. If anything, I think DX9 and CPU/GPU limitations have held back console game development as much as the reverse is true.

http://store.steampowered.com/hwsurvey

According to the Steam survey, around 82% of users have a DX10+ capable system (OS + GPU) and another 11% have a DX10+ GPU on Windows XP, so over 90% actually have DX10+ hardware. If the PC market alone could sustain high-end game development, or were seen that way by publishers, you would see more of a tech arms race going on there.

In the console environment, the best multiplatform games are (depending on who you ask) as good as, or close in quality to, the exclusive games. I'm predicting that the 680 in my PC will suddenly run quite a lot prettier games once the next-gen multiplats start to pop up.
 
Yet is it worth contracting for a budget engine when you have a $20M-$40M launch title, probably a new IP, when you would pay more to potentially get a more fleshed-out engine and support (or just shift the resources toward upgrading your own in-house tech to keep a relatively smoother workflow)?
 
Yet is it worth contracting for a budget engine when you have a $20M-$40M launch title, probably a new IP, when you would pay more to potentially get a more fleshed-out engine and support (or just shift the resources toward upgrading your own in-house tech to keep a relatively smoother workflow)?

Because every studio has that kind of budget :rolleyes:
 
Because every studio has that kind of budget :rolleyes:

I am not sure how "every studio" relates to "that would make a good budget engine for next-generation launch titles."

To cut to the chase (beyond the moving target): launch titles tend to fit various trends. One is the big-budget "launch our new IP to market fast" mentality. These will have large development budgets and large marketing budgets, and using a low-end third-party engine they aren't currently using doesn't fit with that. Another major launch-title genre is current-gen properties re-targeted to the new hardware, so they will mostly be the same engines upgraded from the current gen. Then there is the low-budget rushware, which is becoming rarer as companies try to limit the number of crap launch titles and hand out only a limited number of dev kits early on.

Maybe you can create a scenario where Unigine fits the criteria for the development cycle of a launch title. Is Unigine going to support next-gen hardware on day one? Do they have dev kits or plans for early next-gen development, or are developers going to have to port the engine over just to use it?

Anyway, you can roll your eyes all you want, but the studios that can afford and get enough dev kits for a launch title won't be a slew of budget outfits. You are right that not every developer has the money for a huge new IP, but MS/Sony don't need to let them clog the launch window either. And as developers have noted here, a third-party engine is no guarantee of saved time. It may allow early asset development, but my guess is that Unigine, unless they are specifically targeting next gen and have everything up and running ASAP, won't be in a position to help these budget titles hit the launch window.

Of course you seem to believe otherwise :!:
 
I am not sure how "every studio" relates to "that would make a good budget engine for next-generation launch titles."

To cut to the chase (beyond the moving target): launch titles tend to fit various trends. One is the big-budget "launch our new IP to market fast" mentality. These will have large development budgets and large marketing budgets, and using a low-end third-party engine they aren't currently using doesn't fit with that. Another major launch-title genre is current-gen properties re-targeted to the new hardware, so they will mostly be the same engines upgraded from the current gen. Then there is the low-budget rushware, which is becoming rarer as companies try to limit the number of crap launch titles and hand out only a limited number of dev kits early on.

Maybe you can create a scenario where Unigine fits the criteria for the development cycle of a launch title. Is Unigine going to support next-gen hardware on day one? Do they have dev kits or plans for early next-gen development, or are developers going to have to port the engine over just to use it?

Anyway, you can roll your eyes all you want, but the studios that can afford and get enough dev kits for a launch title won't be a slew of budget outfits. You are right that not every developer has the money for a huge new IP, but MS/Sony don't need to let them clog the launch window either. And as developers have noted here, a third-party engine is no guarantee of saved time. It may allow early asset development, but my guess is that Unigine, unless they are specifically targeting next gen and have everything up and running ASAP, won't be in a position to help these budget titles hit the launch window.

Of course you seem to believe otherwise :!:

You really read into things too much..... learn to chill
 
Regarding what I said about IQ: if you come up with any way to quantify the difference between "my imagination/real life" and "what I actually rendered," that number will get smaller over time. For example, take a car model. In terms of geometry, the difference between the 3D mesh and the actual physical dimensions of the car gets smaller and smaller relative to the overall size of the car with each successive generation, and therefore less perceptible. Each five-fold increase in geometry has a smaller and smaller overall impact on measurable quality. The same goes for colors, lighting and texture resolution.

Edit: here's a thought experiment. Suppose you have 30 seconds of 1080p footage at 60 fps of a Ferrari driving around Laguna Seca, shot with a high-quality camera. You must recreate the same scene in real time on every console going back to the 3DO. Scale the rendered image to 1080p and 24 bpp using nearest-neighbor so we can compare directly. If you rendered at 30 fps, we just repeat frames to get to 60. Define your error function as:

error = ( sum over all pixels and frames((RGB_realtime - RGB_reality)^2) / sum(RGB_reality^2) )^(1/2)

This function measures the difference between the real-time rendering and the camera footage of reality, relative to the footage itself. As hardware power increases, the function will approach zero, and as it approaches zero, the differences will become less noticeable to human observers.

This can be directly applied to increasing screen resolution and AA while keeping the actual content fixed.
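
Just to make the idea concrete, here's a rough sketch of that error metric in Python/NumPy (my own toy code, not anything from an actual engine); it assumes both clips are available as arrays of shape (frames, height, width, 3):

Code:
import numpy as np

def relative_error(realtime, reality):
    # sqrt( sum over pixels/frames of (RGB_realtime - RGB_reality)^2 / sum of RGB_reality^2 )
    rt = realtime.astype(np.float64)
    ref = reality.astype(np.float64)
    return float(np.sqrt(((rt - ref) ** 2).sum() / (ref ** 2).sum()))

# Toy usage with small random stand-in frames (real footage would be 1080p at 60 fps).
rng = np.random.default_rng(0)
reality = rng.integers(0, 256, size=(8, 270, 480, 3))
realtime = np.clip(reality + rng.normal(0.0, 12.0, reality.shape), 0, 255)
print(relative_error(realtime, reality))  # drops toward 0 as the render approaches the footage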
 
Regarding what I said about IQ: if you come up with any way to quantify the difference between "my imagination/real life" and "what I actually rendered," that number will get smaller over time. For example, take a car model. In terms of geometry, the difference between the 3D mesh and the actual physical dimensions of the car gets smaller and smaller relative to the overall size of the car with each successive generation, and therefore less perceptible. Each five-fold increase in geometry has a smaller and smaller overall impact on measurable quality. The same goes for colors, lighting and texture resolution.

Edit: here's a thought experiment. Suppose you have 30 seconds of 1080p footage at 60 fps of a Ferrari driving around Laguna Seca, shot with a high-quality camera. You must recreate the same scene in real time on every console going back to the 3DO. Scale the rendered image to 1080p and 24 bpp using nearest-neighbor so we can compare directly. If you rendered at 30 fps, we just repeat frames to get to 60. Define your error function as:

error = ( sum over all pixels and frames((RGB_realtime - RGB_reality)^2) / sum(RGB_reality^2) )^(1/2)

This function measures the difference between the real-time rendering and the camera footage of reality, relative to the footage itself. As hardware power increases, the function will approach zero, and as it approaches zero, the differences will become less noticeable to human observers.

This can be directly applied to increasing screen resolution and AA while keeping the actual content fixed.


Ahh, my pet topic :p

We may hit diminishing returns someday, but it certainly appears (at least) another generation of vast increases is in store. Just the likes of Star Wars 1313 show that.

I am sure the same was said each past gen. Imagine how silly it seems now to say "the PS2 was the best it can get". I suspect it will seem similarly silly, and very rapidly, with regard to the Xbox 360. And someday, I suspect, Xbox Durango and PlayStation Orbis, which already seem weak compared to PCs.

Graphically demanding current console games frequently struggle to hit 30 fps, often run at slightly less than 720p with little AA or AF, have "image quality" widely scorned by B3D perfectionists, and so on, all in a mad effort to extract one more ounce of visual pizzazz from straining hardware. Those are some extremely low bars, and they show that a lot of headroom remains.

Sweeney, on the other hand, said 5000 teraflops is needed to simulate reality, and that until then rapid graphical advancement will continue.

There is some question whether we even move to a 1080p standard next gen (though I believe we will), let alone 4K. Why not? Because obviously we "need more power". That is another tell.

Another is all the concern and arguing over the Wii U's power level. There are many such tells.
 
We may hit diminishing returns someday
If you can quantify "returns" meaningfully, there are diminishing returns with every successive increase in hardware speed, going all the way back to the very first computer games. It's true of every aspect of computing, too, not just graphics. For example, in word-processing software, the quantifiable returns gained relative to increases in hardware power are already near zero compared to what they were in the 80s and 90s.

It seems pretty obvious that quantifiable returns in 2D games are already small, and I suspect they have started diminishing rapidly in certain types of 3D games.
Graphically demanding current console games frequently struggle to hit 30 fps, often run at slightly less than 720p with little AA or AF, have "image quality" widely scorned by B3D perfectionists, and so on, all in a mad effort to extract one more ounce of visual pizzazz from straining hardware.
You cannot quantify the scorn of perfectionists, as it is always infinite. You can, however, quantify the error between what you actually achieve and what you'd achieve in an ideal universe when talking about computer graphics, since graphics are intrinsically quantitative.

The quality gain from 1080p to 4K measured as image error, for example, is not nearly as large as the gain from 320x240 to 1080p. If you're using nearest-neighbor sampling of whatever image you're trying to draw (i.e. the worst possible sampling), the error converges like a Riemann sum.

You could also quantify it in economic terms, which is really what matters in the game industry. If the costs associated with creating and delivering higher-quality content rise above the revenue you gain from it, then your returns have gone from positive to negative.
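
To illustrate the resolution part of this, here's a quick toy experiment (again my own illustrative code, with a synthetic gradient image standing in for "reality", compared against a 2160p reference so nothing lines up exactly): nearest-neighbor sample the reference at a few resolutions, scale each back up, and watch the error shrink by less and less at each step:

Code:
import numpy as np

def nearest_resample(img, h, w):
    # Nearest-neighbor resampling to h x w (the "worst possible sampling" mentioned above).
    ys = np.arange(h) * img.shape[0] // h
    xs = np.arange(w) * img.shape[1] // w
    return img[ys][:, xs]

def relative_error(a, b):
    a = a.astype(np.float64)
    b = b.astype(np.float64)
    return float(np.sqrt(((a - b) ** 2).sum() / (b ** 2).sum()))

# Synthetic stand-in for the "ideal" image: a smooth 2160p gradient pattern.
H, W = 2160, 3840
yy, xx = np.mgrid[0:H, 0:W].astype(np.float32)
reference = (np.sin(xx / 37.0) + np.cos(yy / 53.0) + 2.0) * 63.0

for h, w in [(240, 426), (480, 854), (720, 1280), (1080, 1920)]:
    low = nearest_resample(reference, h, w)   # "render" at the lower resolution
    up = nearest_resample(low, H, W)          # scale back up for an apples-to-apples comparison
    print((h, w), round(relative_error(up, reference), 4))
# The error falls toward zero, but each resolution bump shaves off less error than the previous one.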
 
If you can quantify "returns" meaningfully, there are diminishing returns with every successive increase in hardware speed, going all the way back to the very first computer games. It's true of every aspect of computing, too, not just graphics. For example, in word-processing software, the quantifiable returns gained relative to increases in hardware power are already near zero compared to what they were in the 80s and 90s.

It seems pretty obvious that quantifiable returns in 2D games are already small, and I suspect they have started diminishing rapidly in certain types of 3D games.

You cannot quantify the scorn of perfectionists, as it is always infinite. You can, however, quantify the error between what you actually achieve and what you'd achieve in an ideal universe when talking about computer graphics, since graphics are intrinsically quantitative.

The quality gain from 1080p to 4K measured as image error, for example, is not nearly as large as the gain from 320x240 to 1080p. If you're using nearest-neighbor sampling of whatever image you're trying to draw (i.e. the worst possible sampling), the error converges like a Riemann sum.

You could also quantify it in economic terms, which is really what matters in the game industry. If the costs associated with creating and delivering higher-quality content rise above the revenue you gain from it, then your returns have gone from positive to negative.

Yet I have a 720p phone with a 4.8" screen, and it looks much better than the same phone with an 800x480 screen. Rumors abound that 1080p phones are the next step.

Hmm, so what res do I need to "max out" my 42" TV? Or my 27" PC monitor, which I'm sitting 18 inches from as I type this, and which is a lowly 1080p?

How do you measure what the eyes see as "image error"? I'm not entirely following you.

But the rest of my points stand anyway. We don't have enough power for 720p; many games can't even get there. We need a LOT more power. These games cut every corner imaginable in the lustful chase for one more ounce of graphics, and hundreds of people push them for one more ounce of graphics. 300 people each made Crysis 3, Assassin's Creed 3 and Halo 4.

It's hard for me to say where the "end" is, just that I'm sure we're not there, and I have a funny suspicion we're really far away. I kind of think that in ~2017 we are going to get ANOTHER huge jump after the next consoles. Say Durango ships with a 1.5-teraflop GPU; I can already see the beginnings of that being a weakling.

There's going to be some point where we look at an X360 game and say "that looks like shit", just like we do with a PS2 game now. There's likely going to be a point where we look at an Xbox Durango game and say "that looks like shit" too, even if it seems too far away to comprehend now.

I do agree some games seem more susceptible to diminishing returns. Car games, for one. FPS are the genre with the most headroom. RTS are maybe even more demanding.

Even in 2D, though, I think this gen's 2D is the best ever (the move to HD alone could do that). Why won't next gen better it?

I'm also not sure you can entirely "quantify" graphics. It's not about how close you can get to a photo, imo. But if it is, check out any in-game screens: they look like crap next to a photo. Yes, you can get a car model looking really nice, but then you have to draw the track, and a thousand trees, and 20 other cars at varying distances in perfect clarity, and the horizon, and everything else; it gets exponentially more complicated. In essence, I'm not sure graphics in motion can be compared to a photo.
 
Yet I have a 720p phone with a 4.8" screen, and it looks much better than the same phone with an 800x480 screen. Rumors abound that 1080p phones are the next step.

Hmm, so what res do I need to "max out" my 42" TV? Or my 27" PC monitor, which I'm sitting 18 inches from as I type this, and which is a lowly 1080p?

How do you measure what the eyes see as "image error"? I'm not entirely following you.

But the rest of my points stand anyway. We don't have enough power for 720p; many games can't even get there. We need a LOT more power. These games cut every corner imaginable in the lustful chase for one more ounce of graphics, and hundreds of people push them for one more ounce of graphics. 300 people each made Crysis 3, Assassin's Creed 3 and Halo 4.

It's hard for me to say where the "end" is, just that I'm sure we're not there, and I have a funny suspicion we're really far away. I kind of think that in ~2017 we are going to get ANOTHER huge jump after the next consoles. Say Durango ships with a 1.5-teraflop GPU; I can already see the beginnings of that being a weakling.

There's going to be some point where we look at an X360 game and say "that looks like shit", just like we do with a PS2 game now. There's likely going to be a point where we look at an Xbox Durango game and say "that looks like shit" too, even if it seems too far away to comprehend now.

I do agree some games seem more susceptible to diminishing returns. Car games, for one. FPS are the genre with the most headroom. RTS are maybe even more demanding.

Even in 2D, though, I think this gen's 2D is the best ever (the move to HD alone could do that). Why won't next gen better it?

I'm also not sure you can entirely "quantify" graphics. It's not about how close you can get to a photo, imo. But if it is, check out any in-game screens: they look like crap next to a photo. Yes, you can get a car model looking really nice, but then you have to draw the track, and a thousand trees, and 20 other cars at varying distances in perfect clarity, and the horizon, and everything else; it gets exponentially more complicated. In essence, I'm not sure graphics in motion can be compared to a photo.

I think the goal of the next consoles isn't so much to pursue photorealism, though some developers will try. I think their goal should be to aim for Pixar and DreamWorks levels of CGI.

Back to topic
-----------------------------
Regarding Project Durango and whether the parts inside it are going to stick, with lots of people around gaming-related sites talking about the next Xbox and saying we have legitimate evidence of the specifications:

Simply put, Durango right now is just a PC that supposedly reads the code of the next Xbox (or possibly the 360's code); the parts inside it are just there to get the programs running. The parts inside the kit are most likely current tech, intended to represent a fraction or partition of the final console, as that approach is much cheaper to distribute to developers.

Back in the day, Sony would link a bank of consoles together when they couldn't get the real specs into actual parts; these banks of consoles served as rough estimates of what the hardware would translate to. Most recently, during the PS3's development, Sony linked 16 PS2s back in 2001, and in '05 linked two Nvidia 6000-series cards to represent the PS3.
 
I am wondering if $499 is going to be an acceptable price point for next-generation consoles. At that price point, Sony and MS would be able to offer powerful hardware as well as a new control scheme.
 