Middle Generation Console Upgrade Discussion [Scorpio, 4Pro]

Status
Not open for further replies.
You're thinking about FPS games where AI runs around or squats behind a crate, but what about crowd AI and autonomous NPC interactions?
Like Fallout? Or like Ubisoft open worlds?

Yea, I can see that case, just a lot of AI running to create a realistic world. Real-time Civ, etc. Totally get it. This would be a big step towards a next-generation game: a world that could operate entirely on its own and grow without player interaction would be a neat simulation.

But once again, we accomplish these games on Jaguars today, and that's with a _huge_ load from the render block. Free up all of that CPU from render draw calls for AI and you've got a huge opportunity to really expand on it. We've not hit the limits of these systems quite yet.

And if we continually get into more complex worlds, isn't that where the cloud is fundamentally supposed to be the right asset to use?
 
Is it not a safe bet to expect the trend to continue for Scorpio and consoles beyond? What's a possible impetus behind a reversal of this trend?
Of course, but that doesn't mean the CPU can stand still. If the trend has been a quadrupling of GPU versus a doubling of CPU power, that should continue. Or if the ratio of CPU:GPU was 1:10, then 1:15, then 1:20, then for a 1:30 CPU:GPU ratio and a 16 TF GPU you'd need roughly 0.5 TF of CPU.
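The ratio arithmetic in that post can be sketched quickly; every figure below is illustrative, not a real hardware spec:

```python
# Implied CPU throughput for a given GPU size and CPU:GPU ratio.
# Purely illustrative numbers, matching the hypothetical ratios above.

def cpu_tf_needed(gpu_tf: float, ratio: float) -> float:
    """CPU TFLOPS implied by a CPU:GPU ratio of 1:ratio."""
    return gpu_tf / ratio

# If the ratio widens from 1:10 to 1:30 while the GPU reaches 16 TF:
for ratio in (10, 15, 20, 30):
    print(f"1:{ratio} with a 16 TF GPU -> {cpu_tf_needed(16, ratio):.2f} TF of CPU")
# The 1:30 case works out to roughly 0.53 TF of CPU.
```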

But once again, we accomplish these games on Jaguars today, and that's with a _huge_ load from the render block. Free up all of that CPU from render draw calls for AI and you've got a huge opportunity to really expand on it. We've not hit the limits of these systems quite yet.
That's probably the most applicable argument. Even if we can't map game code onto compute, moving graphics rendering tasks off the CPU onto GPU will free up a lot of CPU, meaning lots more available for game code.
And if we continually get into more complex worlds, isn't that where the cloud is fundamentally supposed to be the right asset to use?
I'm not convinced by the cloud. It makes sense for multiplayer but not necessarily single player.
 
Like Fallout? Or like Ubisoft open worlds?

Yea, I can see that case, just a lot of AI running to create a realistic world. Real-time Civ, etc. Totally get it. This would be a big step towards a next-generation game: a world that could operate entirely on its own and grow without player interaction would be a neat simulation.

But once again, we accomplish these games on Jaguars today, and that's with a _huge_ load from the render block. Free up all of that CPU from render draw calls for AI and you've got a huge opportunity to really expand on it. We've not hit the limits of these systems quite yet.

And if we continually get into more complex worlds, isn't that where the cloud is fundamentally supposed to be the right asset to use?

You still want to increase both. You want more local performance for things that wouldn't translate to the cloud well, and more cloud performance for things that would. I could see stuff like Nvidia's PhysX fog and grass and whatnot being something good for the cloud. Something that adds a lot to the visuals of the game but doesn't affect gameplay.
 
I'm not convinced by the cloud. It makes sense for multiplayer but not necessarily single player.

I think for the particular application that iroboto mentions it makes tons of sense. A huge, ridiculously detailed world simulation could live in its entirety on a server somewhere, utilizing that server's compute, memory and storage resources, running in conjunction with and syncing to a version of that world on the local console. The local version is just a window into the larger world, focused on what the player can currently perceive and affect. That seems very workable within current technological and infrastructural constraints, and could be incredibly immersive if done well.
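A minimal sketch of that split, assuming a server-side world and a console that only syncs a perceptual window (all class and method names here are hypothetical, not any real engine API):

```python
# Hedged sketch: the full simulation runs server-side; the console only
# receives the slice of the world the player can currently perceive.
from dataclasses import dataclass

@dataclass
class Entity:
    eid: int
    x: float
    y: float

class World:
    """Full simulation lives on the server; consoles sync a window of it."""
    def __init__(self, entities):
        self.entities = {e.eid: e for e in entities}

    def tick(self):
        # The server advances the whole world every tick, player or no player.
        for e in self.entities.values():
            e.x += 0.1  # stand-in for real AI/physics updates

    def window_for(self, px, py, radius):
        # Only entities within the player's perception radius get streamed down.
        return [e for e in self.entities.values()
                if (e.x - px) ** 2 + (e.y - py) ** 2 <= radius ** 2]
```

The design choice being illustrated: the console never holds the whole world, just a filtered view, so the server's memory and compute budget sets the simulation's scale.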
 
Of course, but that doesn't mean the CPU can stand still. If the trend has been a quadrupling of GPU versus a doubling of CPU power, that should continue.

I'm not talking about it standing still. But the degree to which it needs to increase is very much in question. In every consumer application for CPUs I can think of right now the demand for increased *sustained* CPU performance is stagnating. I make the sustained performance distinction because mobile chips are continuing to increase their peak performance, but this is pretty much so that those high-performance cores can turn on, get the work they are needed for done as quickly as possible and then turn off in favor of the low-power cores that run the great majority of the time.
 
I think for the particular application that iroboto mentions it makes tons of sense. A huge, ridiculously detailed world simulation could live in its entirety on a server somewhere, utilizing that server's compute, memory and storage resources, running in conjunction with and syncing to a version of that world on the local console. The local version is just a window into the larger world, focused on what the player can currently perceive and affect. That seems very workable within current technological and infrastructural constraints, and could be incredibly immersive if done well.
Adding on to this one, I guess the concept is that if the servers are linked (even if it's a single-player experience) you get this opportunity where the servers can learn from other players' events and use that knowledge in your world. Like a constantly learning organism. Could be really immersive! Think of Drivatar technology, but applied to all aspects of your game world.
 
You still want to increase both. You want more local performance for things that wouldn't translate to the cloud well, and more cloud performance for things that would. I could see stuff like Nvidia's PhysX fog and grass and whatnot being something good for the cloud. Something that adds a lot to the visuals of the game but doesn't affect gameplay.
But could these eventually be done by the GPU? If so, just keep beefing up the GPU!
 
If gameplay doesn't change massively then more CPU goodness won't be so required, meaning the PS4 could keep on trucking for years as an entry-level 1080p console.

You can do things with more CPU and GPU resources that don't impact gameplay but do impact the game. AC Unity has vastly higher pedestrian density than the Assassin's Creed games on 360/PS3, and GTA V on One/PS4 has much higher pedestrian and traffic density, with peds and vehicles persistent in the world at much longer distances. Both hugely contribute to making the world feel dense, lived-in and believable. Plus first-person mode, but I think that was more a RAM issue.

When you think about it, there aren't a lot of gameplay mechanics that are truly beyond the previous-gen consoles, but you'd have to make other sacrifices to realise them. 3D GTA debuted on PS2, yet there are very few mechanics in GTA V that aren't in San Andreas, and plenty of mechanics in San Andreas that aren't in V: body customisation, territories, gang wars etc. Mostly you see the same types of games with refinements.

But I take your point. And it is difficult to definitively define what a generation means and I bet it means different things to different people.
 
What do you do between now and eventually?
Cut what's unnecessary and focus on getting what's necessary to work. If it's really critical these guys will figure it out. There is a massive planning process for engine guys; the technical tests I had to do at Ubisoft were pretty indicative of what they're looking for in terms of programmers.
 
If shitty/inexperienced/under-resourced programmers could make a breakthrough game, I'd be happy to give them the CPU to do it.
 
Get another job

Cut what's unnecessary and focus on getting what's necessary to work. If it's really critical these guys will figure it out. There is a massive planning process for engine guys; the technical tests I had to do at Ubisoft were pretty indicative of what they're looking for in terms of programmers.

You could also, you know... increase the CPU power? Sony did it by clocking the Jaguar CPU higher.
 
You could also, you know... increase the CPU power? Sony did it by clocking the Jaguar CPU higher.
Sure, I don't think anyone disagrees with that.
Zen is something else entirely: probably technically possible, but financially feasible is another matter. We could list 101 ways a top-of-the-line CPU would be a better fit than a lesser one; no one would disagree with any of your points. More is always better in just about every single scenario we can possibly think of.

The question has always been whether a reduction in CPU power is worth a significant drop in price, with everything we have considered to make up for it; I'm sure most people would agree the answer is yes. It's a console at the end of the day, and consoles have limits on price. The strategy is to rectify the wrong that is the underpowered Xbox One; go too far trying to rectify that error and you likely expose yourself to another crucial one: pricing your device out of the market.
 
Sure, I don't think anyone disagrees with that.
Zen is something else entirely: probably technically possible, but financially feasible is another matter. We could list 101 ways a top-of-the-line CPU would be a better fit than a lesser one; no one would disagree with any of your points. More is always better in just about every single scenario we can possibly think of.

The question has always been whether a reduction in CPU power is worth a significant drop in price, with everything we have considered to make up for it; I'm sure most people would agree the answer is yes. It's a console at the end of the day, and consoles have limits on price. The strategy is to rectify the wrong that is the underpowered Xbox One; go too far trying to rectify that error and you likely expose yourself to another crucial one: pricing your device out of the market.
Wouldn't it depend on what the price drop is?

Would you trade a Scorpio with Zen (8-core/16-thread) for an 8-core Jaguar if it's a $15 difference? $20 difference? What is a significant drop in price?
 
Like Fallout? Or like Ubisoft open worlds?

Yea, I can see that case, just a lot of AI running to create a realistic world. Real-time Civ, etc. Totally get it. This would be a big step towards a next-generation game: a world that could operate entirely on its own and grow without player interaction would be a neat simulation.

But once again, we accomplish these games on Jaguars today, and that's with a _huge_ load from the render block. Free up all of that CPU from render draw calls for AI and you've got a huge opportunity to really expand on it. We've not hit the limits of these systems quite yet.

And if we continually get into more complex worlds, isn't that where the cloud is fundamentally supposed to be the right asset to use?

The cloud isn't free. The more work you move off local hardware, the less of that $59.99 you're going to see going into your pockets as profit in the long run.
 
Wouldn't it depend on what the price drop is?

Would you trade a Scorpio with Zen (8-core/16-thread) for an 8-core Jaguar if it's a $15 difference? $20 difference? What is a significant drop in price?

Hard to quantify, as we don't know what Zen would actually entail for a console. Zen is supposed to scale all the way down to notebooks, and I'd bet that's accomplished by sacrificing performance: fewer cores, less cache, and maybe ripping out features like SMT.

Scorpio might end up with a cut-down version of Zen that's not all that more performant than Cat cores. Why continue to invest in Cat cores at 14nm or 16nm when doing the work for Zen allows it to be applied to other lower-power, less performant Zen-based designs?
 
Hard to quantify, as we don't know what Zen would actually entail for a console. Zen is supposed to scale all the way down to notebooks, and I'd bet that's accomplished by sacrificing performance: fewer cores, less cache, and maybe ripping out features like SMT.

Scorpio might end up with a cut-down version of Zen that's not all that more performant than Cat cores. Why continue to invest in Cat cores at 14nm or 16nm when doing the work for Zen allows it to be applied to other lower-power, less performant Zen-based designs?
This is something I said many pages ago, although I disagree with "not all that more performant"; I guess it depends what that actually means to you.

I think there are changes that would be of benefit, like changes to the Garlic/Onion buses and general architectural updates.
They could cut down on caches and remove the advanced power-management stuff.

Remove SMT and end up with cores that still have better IPC, clocked at the same speed the Jaguar would be.

Or keep SMT and cut down on other bits that have a bigger impact on IPC.

Either way you could end up with a chip that's, say, 25-80% more performant, with other benefits. And yes, I'm picking everything out of my butt here. More for illustrative purposes, to show what I consider more performant.
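Those invented percentages follow from a rough model where relative CPU performance is IPC uplift times clock ratio; every input below is made up, matching the guesswork in the post:

```python
# Rough model: relative performance ~ IPC uplift x clock ratio.
# All figures invented for illustration, like the 25-80% range above.

def relative_perf(ipc_uplift: float, clock_ratio: float) -> float:
    return ipc_uplift * clock_ratio

# Cut-down Zen at Jaguar clocks with ~25% better IPC:
print(relative_perf(1.25, 1.0))  # 1.25 -> ~25% faster
# Bigger IPC gain plus a modest clock bump:
print(relative_perf(1.5, 1.2))   # 1.8 -> ~80% faster
```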
 
I think for the particular application that iroboto mentions it makes tons of sense. A huge, ridiculously detailed, world simulation that can live in it's entirety on a server somewhere utilizing that server's compute, memory and storage resources that runs in conjunction with and syncs to a version of that world running on the local console that is just a window into that larger world focused on what the player can currently perceive and effect seems very workable within the current technological and infrastructural restraints and could be incredibly immersive if done well.
That becomes multiplayer if multiple people are affecting it, even if you never meet them. You might have designs on doing something with a building, making it your HQ and ordering furniture, and then some other player can come along and bulldoze your building. We'd need fully simulated instances without player cross-over where it's single player.

It'd also be one hell of a gamble to design a console now around using the cloud for games when it hasn't actually happened yet! Personally I'd like some active proofs of concept in the wild showing it works before I'd commit a whole generation of hardware to relying on the cloud. Recently I've been repeatedly dumped out of Magicka 2 because it 'can't connect to the server' even mid-game playing solo, and last night three-player Diablo refused voice chat between the two clients but allowed it between both clients and the host, and it worked using PS4 Party chat. So network gaming is still far from perfect. I can't see Rockstar planning the next GTA on cloud compute without it being proven what and how much we can actually reliably achieve on it.
 