That's a bottleneck, not diminishing returns.
What do you think causes diminishing returns (non-linear performance scaling)?
It would probably be trivial to create a CPU-limited scenario where a GCN-based card with more CUs performed no better than one with fewer. If your performance is being held back by the lack of one resource, throwing more of another resource at it isn't going to add to your performance.
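A toy model of that (all numbers invented, purely to illustrate the shape of the problem):

```python
# Toy model of a CPU-limited scene: frame time is whichever stage finishes last.
# All numbers are made up purely to illustrate the point.

CPU_MS = 20.0            # fixed CPU cost per frame (doesn't care how many CUs you have)
GPU_MS_AT_10_CUS = 25.0  # GPU cost per frame on a hypothetical 10-CU part

for cus in (10, 14, 18, 24, 32):
    gpu_ms = GPU_MS_AT_10_CUS * 10 / cus      # assume GPU work scales perfectly with CUs
    frame_ms = max(CPU_MS, gpu_ms)            # the slower stage sets the frame time
    print(f"{cus:2d} CUs: {frame_ms:5.1f} ms/frame ({1000 / frame_ms:4.0f} fps)")
```

Past the point where the GPU outruns the CPU (around 14 CUs in this made-up case), adding CUs changes nothing.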
Yeah, that is why the 7790 performs better than the 7770, and the 7850 better than the 7790..
7970 > 7950 > 7870 > 7850 > 7790 > 7770
This is a line-up on PC where each card has more or fewer CUs than the next, and in every scenario the one with more CUs performs better. This is 100% accurate, and we have tons and tons of benchmarks proving it without a shadow of a doubt.
Unless something is horribly wrong inside any of the next-gen consoles, this should hold pretty well.
This is what makes the whole theory of diminishing returns so hard to swallow.
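For reference, a quick sketch of the theoretical peak throughput of that line-up (CU counts and reference clocks are quoted from memory, so treat them as approximate; note the clocks also vary between these cards, so it isn't a pure CU-count progression):

```python
# Peak throughput for GCN = CUs * 64 ALUs * 2 ops (FMA) per clock * clock rate.
# CU counts and reference clocks from memory; factory-overclocked models differ.

cards = {            # (CUs, reference clock in MHz)
    "HD 7770": (10, 1000),
    "HD 7790": (14, 1000),
    "HD 7850": (16,  860),
    "HD 7870": (20, 1000),
    "HD 7950": (28,  800),
    "HD 7970": (32,  925),
}

for name, (cus, mhz) in cards.items():
    tflops = cus * 64 * 2 * mhz * 1e6 / 1e12
    print(f"{name}: {cus:2d} CUs @ {mhz} MHz -> {tflops:.2f} TFLOPS peak")
```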
So Jaguar is enough for 14 CUs, but after that you get diminishing returns?
But how many games are CPU-bound?
The whole balance argument doesn't even make sense. MS claims 14 CUs is the right balance, but they have 12 working CUs, not 14. Sony is trying to steer developers towards using compute; the 14+4 was just an example, and even Cerny said so.
Bottlenecks lead to diminishing returns. Workloads are variable, as are their dependencies on system resources. As more and more of your workloads are stalled by bottlenecks, your performance will tail off until all workloads are affected. Hence, diminishing returns.
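A minimal sketch of that tail-off, assuming some fixed fraction of each frame is stalled on a resource that doesn't scale with CUs (the fraction and frame times are invented):

```python
# Diminishing returns from a partial bottleneck: only part of the frame benefits
# from extra CUs, the stalled part stays fixed. All numbers are hypothetical.

STALLED_FRACTION = 0.3   # portion of the baseline frame that can't use extra CUs
BASE_MS = 30.0           # frame time on a 10-CU baseline

for cus in (10, 14, 18, 24, 32):
    scalable = BASE_MS * (1 - STALLED_FRACTION) * 10 / cus  # shrinks with more CUs
    stalled = BASE_MS * STALLED_FRACTION                    # stays fixed
    frame_ms = scalable + stalled
    print(f"{cus:2d} CUs: {frame_ms:5.1f} ms, {BASE_MS / frame_ms:.2f}x the 10-CU baseline")
```

Each extra batch of CUs still helps, but by less and less, which is the "tail off" rather than a hard wall.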
Sure, a bottleneck can lead to diminishing returns, but diminishing returns don't need a bottleneck to exist. They're different concepts. One (a bottleneck) is a performance metric; the other (diminishing returns) is a value proposition. A value proposition is a numerical expression of an opinion, so that makes it even harder to define.
Nothing has to be wrong, much less seriously.
Comparing PC parts is a useful start, but it's not the end result.
Were those cards benchmarked with powerful CPUs to make sure that only the GPUs were being tested? If not, that makes the comparisons of the cards to each other invalid, let alone comparisons to consoles.
Just an example; I'm not saying it's the CPU specifically that is underpowered or anything.
A console is built as a whole, not as individual parts.
The balance that Sony has gone for doesn't seem broken, wrong, or anything but well done to me.
Why would diminishing returns on 14+4 be wrong if their vision of the future is to make good use of GPGPU?
It's not like those CUs are going to be sitting idle; they could even be doing graphics-related work.
People really need to stop seeing things as broken, as issues, etc.
Discussing what led to the design, and whether x more bandwidth, CPU, a different resource allocation, etc. would have avoided diminishing returns for a specific use case, is different from saying there is something broken or wrong with it.
See the reference to it in the Sony slide (on the first page of this thread, I assume).
How is 14+4 diminishing returns? What result do you believe is diminished?
On top of the 14+4 CU split, does the PS4 have a GPU reserve for the OS (taken from the 14, similar to Xbox though likely smaller), putting the effective GPU throughput at 1.4 TFLOPS or less?
Is this why PS4 launch games aren't hitting the 60 fps or 1080p they were expected to?
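For reference, the arithmetic behind that figure: a GCN CU has 64 ALUs, each doing one FMA (2 FLOPs) per clock, and the PS4 GPU runs at 800 MHz. Whether anything is actually reserved from those 14 for the OS is speculation, but the split alone works out like this:

```python
# Theoretical peak throughput of the PS4 GPU under the 14+4 reading.
CLOCK_HZ = 800e6        # PS4 GPU clock
FLOPS_PER_CU = 64 * 2   # 64 ALUs per GCN CU, one FMA (2 FLOPs) per clock

for label, cus in (("all 18 CUs", 18), ("14 CUs (graphics)", 14), ("4 CUs (compute)", 4)):
    print(f"{label}: {cus * FLOPS_PER_CU * CLOCK_HZ / 1e12:.2f} TFLOPS")
```

That gives roughly 1.84 TFLOPS for all 18 CUs and about 1.43 TFLOPS for the 14; any OS reserve on top of that would only lower the graphics figure further.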