NGGP: NextGen Garbage Pile (aka: No one reads the topics or stays on topic) *spawn*

Well, for me, after considering different things, the main viewpoint I've settled on is that the gap won't be enough for most, if not all, multi-platform devs to justify increasing the budget to take advantage of it. The end result would be 3rd party games looking exactly the same regardless of the console, leaving 1st party titles to show any potential differences.

Yeah, I think this is true: the differences for MP titles, at least, will be relatively small, like GameCube to PS2 multiplats, or the earlier PS3 ports (GTA4, RDR, Skyrim, etc.) versus the 360 versions.

Remember also that the Wii U supposedly has 2x or 3x the GFLOPS of the Xbox 360, and we don't see any difference. Yes, there are lots of caveats, such as 1st gen software, different architectures, a slower CPU, etc., but I'm just saying theoretical GFLOPS don't tell the whole story. Given what we know, the PS4 will be stronger, but you might need DF articles to tell the difference again.

Actually, we now very much doubt it has 2x the FLOPS; the GPU is probably closer to 300 GFLOPS.
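As a quick sanity check on those multipliers, here's a back-of-envelope Python sketch using the standard peak-FLOPS formula; the 240 GFLOPS Xenos figure is the commonly cited spec, while the ~300 GF number is just the rumour above:

```python
# Peak theoretical throughput: ALUs * 2 FLOPs/cycle (multiply-add) * clock in GHz.
def peak_gflops(alus: int, ghz: float) -> float:
    return alus * 2 * ghz

xenos = peak_gflops(240, 0.5)   # Xbox 360 "Xenos": 240 ALUs @ 500 MHz = 240 GFLOPS
latte_rumour = 300              # the ~300 GF Wii U estimate quoted above

print(f"Xenos: {xenos:.0f} GFLOPS")
print(f"Rumoured ratio: {latte_rumour / xenos:.2f}x")  # ~1.25x, nowhere near 2-3x
```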

Latching onto individual figures is not a useful way to understand performance differences.
OK, let's say for a second that everything else in the hardware is identical, so you have a 50% performance difference, but only in ALU-limited situations.
When I render shadows, I'm ROP (or rather Z-fill) limited; if my shaders are texture heavy, the ALUs sit idle and wait on memory; and on current PC GPUs, if the vertex workload is dominant, the ALUs are for the most part grossly underutilized.
So what percentage of the time do the extra ALUs actually help?
If it's 40%, it's a 20% performance difference; if it's 80%, it's a 40% difference. I would guess it will end up being closer to the first than the second.
If the 720 has more ROPs (and enough associated bandwidth), or a larger register pool and can hide more latency, then it gets more of that difference back, because it runs other portions of the frame faster.

Now I'm not suggesting it's "faster" or, for that matter, "slower"; I'm saying it's just one aspect of a design.

Thanks for that great explanation, ERP.

Though as Rangers said, more CUs tend to mean you get more of everything else too (unless Sony has gimped their card or MS has improved theirs in these areas).

But I agree that the visible difference is likely to be only 10-20% or so, rather than 50%.
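To put rough numbers on ERP's point, here's a minimal frame-time sketch; the 50% ALU advantage and the 40%/80% fractions are his hypotheticals, and everything else is assumed identical:

```python
# Amdahl's-law-style model: only the ALU-limited share of the frame
# benefits from the extra ALUs; the rest of the frame is unchanged.
def frame_speedup(alu_fraction: float, alu_advantage: float = 1.5) -> float:
    new_time = (1 - alu_fraction) + alu_fraction / alu_advantage
    return 1 / new_time

for f in (0.4, 0.8):
    print(f"ALU-limited {f:.0%} of the frame -> {frame_speedup(f) - 1:.0%} faster overall")
# ALU-limited 40% of the frame -> 15% faster overall
# ALU-limited 80% of the frame -> 36% faster overall
```

Scaling only the ALU-limited portion of the frame time gives slightly lower numbers than multiplying the percentages directly, which fits the 10-20% estimate above.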
 
Even then, 1st party/exclusive titles can play to different strengths. The PS4 has more muscle, but the Xbox has better HSA integration. Unless Orbis really switched to Jaguar and ends up with a CPU identical to the Xbox's (!)

Supposedly, the PS4 would have Steamroller. But that got delayed. So it could have what's in Trinity, or Richland, which is Trinity 1.1. Or Jaguar would be the other contingency plan: slower, but with that HSA stuff, which could be what defines new consoles and general computing in the long term. That would have been a big reason to go with Steamroller in the first place.

Knowing exactly what the CPU in the PS4 is, is what I'm waiting for to know which is the better console chip. If Orbis has Jaguar or Steamroller, then Orbis is automatically better than Durango. (But really, if Orbis is n% faster, it will be n% more power hungry, methinks.)
 
leaving 1st party titles to show any potential differences.
Hardware differences aside, Sony arguably has the upper hand when it comes to first party studios - plus indie devs seem to lean more towards PSN when choosing a "start-up" platform for their games (both of which are all too often overlooked when speculating on which platform is better)...

Not saying that first party titles and downloadable games make all the difference (as a matter of fact, Microsoft has been very successful in offsetting its relative lack of first party power by putting a lot of money into exclusive content, etc.) - but considering hardware alone simply doesn't show the entire picture when it comes to consoles.
 
Supposedly, the PS4 would have Steamroller. But that got delayed. So it could have what's in Trinity, or Richland, which is Trinity 1.1. Or Jaguar would be the other contingency plan: slower, but with that HSA stuff, which could be what defines new consoles and general computing in the long term.

Knowing exactly what the CPU in the PS4 is, is what I'm waiting for to know which is the better console chip.

I'm pretty sure we can assume they've now switched to Jaguar instead of Steamroller; sweetvar said so 7 months ago and now DF has confirmed it.
 
I don't think there's any amount of special sauce in Durango that can bridge a 6 CU and 600 GFLOP gap.
Special hardware doesn't even need to close a rumoured gap in FLOPS. If there is some kind of unit that does all the lighting and shadowing tasks (as an example) in hardware and frees the GPU from doing that, the result can not only match that of the PS4 but surpass it by far.
To understand that: take a system with a CPU but no GPU, so software rendering is required. Then take a system with a GPU that has only half the FLOPS. The second system will still blow the first one out of the water when it comes to gaming tasks.
 
So Orbis and Durango would be incredibly close.
I've owned two consoles that were that close, bought in what you would call garage sales in the US: the Atari 2600, and the Coleco, if you consider it with the Atari 2600 compatibility module attached :)

Or, you had both the Game Boy and the Game Gear using a Z80, and both the Genesis and the Neo Geo using a 68000 + a Z80, though the Neo Geo had 10x bigger games and was better overall :)
 
Special hardware doesn't even need to close a rumoured gap in FLOPS. If there is some kind of unit that does all the lighting and shadowing tasks (as an example) in hardware and frees the GPU from doing that, the result can not only match that of the PS4 but surpass it by far.

If only someone had thought of a way to do lighting in hardware on a GPU. Some kind of "shader" unit.
 
Well, for me, after considering different things, the main viewpoint I've settled on is that the gap won't be enough for most, if not all, multi-platform devs to justify increasing the budget to take advantage of it. The end result would be 3rd party games looking exactly the same regardless of the console, leaving 1st party titles to show any potential differences.

Yeah. As I've posted a few times, with a 1.8 TF Orbis and a 5 GB RAM Durango, most games will be made for the lowest common denominator: 1.2 TF and 3.5 GB.
 

Well, you can sigh all you want and think that I believe it's magic.

Yet somehow you magically believe that Microsoft spent millions of dollars and tons of man-hours, with entire teams of some of the best minds in the industry from AMD (for whom this project was a bigger priority than Sony's or Nintendo's), IBM, and internal people at Microsoft, to produce a cheap console that is barely better than the Wii U.

I am not saying anything is better than Sony here. I am stating that sometimes you have to stick to the facts, and these are the facts.

The facts do not add up to what the rumors are relating. Sometimes you have to use some brain power and not just listen to what the rumors say, because they don't give a complete picture of things.

A lot of people are just believing what they read and not using any brain power or reason to see that, hey, we don't know the full story yet.
 
What does this "system GPU" do with its 1.2 TF, according to you?
If it has a "system" GPU doing minor low-power tasks, it is not going to be 1.2 TF, that is for certain. You need to wake up.
 
What does this "system GPU" do with its 1.2 TF, according to you?
If it has a "system" GPU doing minor low-power tasks, it is not going to be 1.2 TF, that is for certain. You need to wake up.

Okay, then go look at AMD's latest mobile GPUs, which are around a TF (from which it is not too much of a stretch to get 1.2 TF). Yeah, I need to wake up and you need to learn to read some specs. Whoops...
 
A Cape Verde chip would be total overkill for the purposes of a "system GPU" running some smaller low-power apps. That's a 1.5 billion transistor chip; the X360 GPU with eDRAM has a little over 200 million. If there is a Cape Verde in there, it's not going to be restricted to being some friggin' "system GPU"; it makes no sense whatsoever.
 
A Cape Verde chip would be total overkill for the purposes of a "system GPU" running some smaller low-power apps. That's a 1.5 billion transistor chip; the X360 GPU with eDRAM has a little over 200 million. If there is a Cape Verde in there, it's not going to be restricted to being some friggin' "system GPU"; it makes no sense whatsoever.

It could be just an APU that has been modified to give that performance. Microsoft did have two GPUs in that Yukon .pdf, so I wouldn't doubt it. It's not rocket science. ATI also had the "Console" label in the same documentation for Cape Verde. In 2013, I don't call 1 TF of performance overkill.

According to you, it's impossible for a console to have an APU and a discrete GPU.
I mean, that is more crazy than having ray tracing built in, for sure! LOL
 
That's a 1.5 billion transistor chip; the X360 GPU with eDRAM has a little over 200 million.

Actually, the total with eDRAM is 332M. Just to be pedantic.

Am I correct that 32 MB of eSRAM would be something like 320M transistors (10M transistors per MB)? Maybe add in some other junk to the eDRAM block, and it could be 400-500M total transistors added to the Cape Verde...
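For what it's worth, a back-of-envelope sketch of those cell counts (storage arrays only, ignoring tags, redundancy, and interface logic): the ~10M-per-MB guess lines up with 1T1C eDRAM cells, while true 6T eSRAM would cost roughly six times as much:

```python
# Transistor floor for 32 MB of on-die memory, by cell type (approximate).
BITS_PER_MB = 1024 * 1024 * 8

cells = {
    "eDRAM (1T1C)": 1,  # ~1 transistor (plus a capacitor) per bit
    "eSRAM (6T)": 6,    # classic six-transistor SRAM cell per bit
}

for name, t_per_bit in cells.items():
    total = 32 * BITS_PER_MB * t_per_bit
    print(f"32 MB {name}: ~{total / 1e6:.0f}M transistors")
# 32 MB eDRAM (1T1C): ~268M transistors
# 32 MB eSRAM (6T):   ~1611M transistors
```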
 
Well, for me, after considering different things, the main viewpoint I've settled on is that the gap won't be enough for most, if not all, multi-platform devs to justify increasing the budget to take advantage of it. The end result would be 3rd party games looking exactly the same regardless of the console, leaving 1st party titles to show any potential differences.

For the most part, that didn't happen this gen, and the gap was much smaller.

I expect the vs. threads to be just as alive and well next gen as they were this gen.
 