Next Generation Hardware Speculation with a Technical Spin [pre E3 2019]

So are we looking at:

Navi ALU (as in the patent) = 2 MUL + ADD (3 flops per ALU)
GCN ALU = MADD (2 flops)
The claims are kept broad so as to cover similar implementations, but one of the examples may be more like 2 FMAs plus a side ALU that has an ADD and miscellaneous logic.
The two core ALUs with FMA units support the most common operations, while the side ALU with no multiplier may work in concert with one or the other to implement part of a complex operation like a transcendental instruction.

Though there are 3 ALUs, there are only enough inputs for six operands--matching the two FMA units and a sustained throughput of 2 FMAs per clock. The side ALU needs to hope for an unused or shared input from a neighboring ALU, and the three ALUs arbitrate for two result outputs. While it might be possible to create an access pattern that can feed the 3 ALUs with read operands, there's no leeway for the result ports.
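
To make the port budget concrete, here's a rough sketch of how I'm reading it; the per-op read/write counts are my assumptions, not something the patent spells out:

```python
# Port budget for the speculated 2x FMA + side-ADD layout (my reading of the
# patent example; all numbers here are assumptions, not confirmed specs).

READ_PORTS = 6    # enough operand inputs for two FMAs (3 operands each)
WRITE_PORTS = 2   # two result outputs, arbitrated between the three ALUs

def fits(ops):
    """ops is a list of (reads, writes) tuples for the ops issued in one clock."""
    reads = sum(r for r, _ in ops)
    writes = sum(w for _, w in ops)
    return reads <= READ_PORTS and writes <= WRITE_PORTS

fma = (3, 1)   # a*b + c
add = (2, 1)   # side ALU, no multiplier

print(fits([fma, fma]))        # True:  sustained 2 FMAs per clock
print(fits([fma, fma, add]))   # False: the side ALU needs a shared or unused port
```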

Another curious point I am not sure how to reconcile is that while the patent doesn't say a given SIMD width is necessary, the example given has units narrower than GCN's traditional 16-wide SIMD. The registers and ports for the example SIMD block discuss widths and outputs that are 4-wide, with the output cache capable of supplying two SIMD4 operations per clock.
Having SIMD blocks and registers sized for 4-wide paths, and then having two core/full ALUs gets the output up to half of a GCN SIMD. I'm having some trouble parsing some of the language, and there's one summary line saying the number of units is equivalent, but I don't know how to get back to the throughput of SIMD16 with the numbers given.

I believe the patent also goes into the number of texture units per unit as well, though my brain isn't quite fully operational this morning.
It seems to point to a large CU type with two texture units and L1 caches, and a small one with one texture block and L1.
I'm not entirely sure how to reconcile this with some of the claims of keeping the ALU/TEX ratio the same as with other GPUs. Going by some of the suggested math throughput, having even one texture unit with GCN's 4-address capability would make the ALU/TEX ratio lower, much less doubling the number of texture units. I'm not sure of the purpose of having two adjacent L1s.
One possible interpretation is that the designers in this case aren't looking at the texture portion as a monolith, but rather each independent address processing block and filter unit as a texture unit in its own right. This might mean there could be multiple narrower texture units.

ergo, a compute unit

GCN CU = 4 x (16 x MADD)
Navi CU = 4 x 16 x (2 MUL + 1 ADD)
Not sure about the 16, and it looks to be sized to have 1 FMA per core/full ALU.
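
For what it's worth, the per-clock arithmetic those figures imply looks like this (keeping the 16-wide assumption I'm not sure about):

```python
# Per-CU, per-clock FLOPs implied by the two layouts above. The Navi figure is
# speculative, and the 16-wide SIMD is an assumption carried over from GCN.

simds_per_cu = 4
lanes_per_simd = 16

gcn_cu_flops  = simds_per_cu * lanes_per_simd * 2   # MADD = 2 flops/lane
navi_cu_flops = simds_per_cu * lanes_per_simd * 3   # 2 MUL + 1 ADD = 3 flops/lane

print(gcn_cu_flops, navi_cu_flops)   # 128 vs 192 flops per CU per clock
```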
 
It's called expanding your market. Consumers that don't own a 42"+ 4K TV are likely budget-minded and wouldn't be buying a $400/$500 console.

Well, that $400-500 console is going to cost half as much sooner or later, but I will never spend that amount of money on a console that is limited by a low-tier version of itself.

If the rumour is true, they are making another Xbox One.
 
You think a TF number is a marketing weapon? I think the games will sell the system, not technical specs. Most people don't know what a TF is.

No? What about “Most powerful console ever”?
People read. And people who write do know what a teraflop is. If an article states, “The PlayStation console is a mere improvement in graphical terms over what the Xbox One X was able to deliver, while the new Xbox console doubles what the X delivered”, would those people understand it?

I hear people talking about GPGPU and how it “speeds things up”, about Onion and Garlic and how they “manage to increase performance”. And I also hear them talking about flops. Some of those conversations make it clear that, as you say, people have no real idea what they are talking about or how those things work... but they do talk about them, and they know they are good things, because they just happened to read about them.

And you can bet that if one console is unable to double the performance of an old-generation one, and the other can, that console would make headlines as “The lowest generational jump ever. It won't even double the X's performance”.
And that... people do understand.

Just remember that those are numbers, and a bigger number will always be a bigger number in marketing terms.
 
Games shown at whatever launch event are going to have the biggest impact, methinks.

Guerrilla is definitely working on a launch title; I just wonder what IP they're going to go with.
I doubt it will be Horizon, so probably Killzone or a new IP. Xbox will have Halo.
 
A two-tier console makes sense if the low-tier console is a streaming device that at most supports some Windows Store games and costs $150 or less; otherwise I don't see anything good in this approach.
 
Well, that $400-500 console is going to cost half as much sooner or later, but I will never spend that amount of money on a console that is limited by a low-tier version of itself.

If the rumour is true, they are making another Xbox One.

If the lower-tier system has the same CPU and feature set, it's not a problem.

Most GPU time is taken up with workloads that scale with resolution.
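
As a back-of-envelope illustration, if the pixel-bound work really does dominate, the lower-tier resolution just falls out of the compute ratio. The 12/4 TF split below is the figure being tossed around in this thread, not a confirmed spec:

```python
# Rough resolution scaling, assuming GPU cost scales with pixel count.
# The TF numbers are the rumoured figures from this thread, nothing more.

import math

hi_w, hi_h = 3840, 2160            # hypothetical native-4K target on the big box
tflops_hi, tflops_lo = 12.0, 4.0

axis_scale = math.sqrt(tflops_lo / tflops_hi)   # per-axis scale for equal cost per frame
print(round(hi_w * axis_scale), round(hi_h * axis_scale))   # ~2217 x 1247, between 1080p and 1440p
```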
 
A two-tier console makes sense if the low-tier console is a streaming device that at most supports some Windows Store games and costs $150 or less; otherwise I don't see anything good in this approach.
There's a price point at which the mass market buys into consoles, and it usually takes a couple of years to reach it.
That's the point of the lower-entry SKU: getting there as fast as possible without hobbling the console that the hardcore wants.

Personally I can see the pros and cons for a single console and a dual one.
Really depends on your view:
  • Single console can hit a sweet spot in power and price between the lower- and upper-end models.
  • Dual consoles can squeeze out a single one by offering low-end and high-end models that appeal to a broader market.
Pick your poison.
 
There's a price point at which the mass market buys into consoles, and it usually takes a couple of years to reach it.
That's the point of the lower-entry SKU: getting there as fast as possible without hobbling the console that the hardcore wants.

Personally I can see the pros and cons for a single console and a dual one.
Really depends on your view:
  • Single console can hit a sweet spot in power and price between the lower- and upper-end models.
  • Dual consoles can squeeze out a single one by offering low-end and high-end models that appeal to a broader market.
Pick your poison.

Having a low end to support means limiting games.
 
If the lower-tier system has the same CPU and feature set, it's not a problem.

Most GPU time is taken up with workloads that scale with resolution.
So a developer wants to target 1440p checkerboarded to 4K on a 12 TF Xbox; what resolution will that be on a 4 TF Xbox? :) 720p in 2020-2026
 
So a developer wants to target 1440p checkerboarded to 4K on a 12 TF Xbox; what resolution will that be on a 4 TF Xbox? :) 720p in 2020-2026
Someone who bought it would know that they are getting the entry device, so I'm sure they would be okay with that.
If the PS5 is in between the two power-wise, 1080p in 2020-2026. Oh no :runaway:
 
Having a low end to support means limiting games.

And if the CPU and feature set are the same, why would that be? If you can run the same game and use the same techniques, with enough power to handle the target resolution, why would games be limited?

So a developer wants to target 1440p checkerboarded to 4K on a 12 TF Xbox; what resolution will that be on a 4 TF Xbox? :) 720p in 2020-2026

Well does this reconstruction technique work, or doesn't it?

And if not, why are you using it at any resolution?
 
The more devices to develop for, the less the multiplats will be optimized on average for each SKU. So it will depend on the games and the developers. We are already seeing this with the 4 (5 with Switch) models of consoles, IMO.

Now if we take the Switch into account, it's going to be worse. Take the Trials games as an example:

Trials Evolution ran at a locked 60fps on X360 (and, for some, looked as good or better than Rising on Switch). Later, Trials Fusion was a very stable 60fps on XB360, XB1 and PS4; the game was quickly optimized and now runs basically locked at 60fps on all consoles.

Now Trials Rising was developed for 5 SKUs, 5 years later, and overall it's a much less optimized game than the previous ones: 30fps with drops on Switch and unstable 60fps on some home consoles (allegedly).

The more SKUs, the less optimized the Trials games have become over the years. Some would say it's because @sebbbi is no longer working on those games, though... :LOL:

In 2020, if we have 3 next-gen consoles, then for cross-gen games (so for the first year or so) developers will have to make games for 7 different SKUs. :runaway:
 
If two systems are sufficiently similar then for most stages of optimisation they can effectively be treated as the same platform.

For example, the same CPU, with the same compiler, accessing the same API and with the same memory access overheads (e.g. same VM) can effectively be treated the same way. Optimise once and run it on either.

GPUs work in repeating blocks. Things like geometry processors, ROPs, L1 cache, LDS, SIMDs per CU all scale on a per block level, so they should benefit from the same optimisations. As long as you get stuff like the command processor and the memory bandwidth / L2 cache right, the same game with the same optimisations should scale really, really well between the two performance profiles with minimal additional work (if any). Get your high end version running as you want, then just adjust the resolution of the low end one to match the performance profile.

Choosing the right balance of LODs for the assets could end up being the most SKU-specific thing, and if you keep everything else the same but use an automated tool for streaming textures, even that might end up requiring little additional work.
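
A minimal sketch of what that per-SKU setup might look like, assuming two tiers that share everything except output resolution and a texture streaming budget (the names and numbers are placeholders, not real specs):

```python
# Hypothetical per-SKU profiles for an engine that is optimised once and then
# only re-targeted by resolution and streaming budget; everything else
# (shaders, CPU code, assets) is shared between the two tiers.

from dataclasses import dataclass

@dataclass(frozen=True)
class SkuProfile:
    name: str
    render_width: int
    render_height: int
    texture_pool_mb: int   # budget handed to the automated texture streamer

PROFILES = {
    "high_tier": SkuProfile("high_tier", 3840, 2160, 4096),
    "low_tier":  SkuProfile("low_tier",  1920, 1080, 2048),
}

def frame_config(sku: str) -> dict:
    p = PROFILES[sku]
    # Same engine code path on both SKUs; only these knobs differ.
    return {"render_target": (p.render_width, p.render_height),
            "streaming_budget_mb": p.texture_pool_mb}

for sku in PROFILES:
    print(sku, frame_config(sku))
```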
 
There's a price point at which the mass market buys into consoles, and it usually takes a couple of years to reach it.

I'd posit it is not just a price point for the mass market, but a value proposition, in which, for them, the popularity of the machine and the breadth of its exclusive library are huge factors.
What I mean by that is: I strongly believe the kind of gamer Joe who waits until the PS5 reaches $299 to buy it would still not buy it at launch even if there were a $299 SKU available then. This dude simply is not in a rush to buy the new cutting-edge toy. He'd rather wait for the thing to get more must-have software and hope they drop the price further to motivate him even more.
 
Compatibility + enhancement of the previous gen might sway some of the traditionally late folks.

But that's a thread for a different discussion. Or a discussion for a different thread.
 