My understanding was that graphics tasks are so inherently parallel that throwing more cores at those problems tends to "Just Work". I'm sure it's not a perfect correspondence, but it's pretty dang close. Yes? No?
I don't think Albert P's recent post helped Microsoft's case much at all. While many of his points seemed valid, his first two seemed like pure hand-waving. Am I alone in thinking this? Am I misunderstanding how these things work?
Those first two points, heavily paraphrased, were:
- 50% more CUs doesn't mean 50% more-bettah "graphics".
- Boosting "each" CU by 6% is somehow better than boosting the entire GPU by that amount.
I don't understand what he's trying to imply in his second point, at all. I assume he's just confused, but I'd like to hear if what he's saying makes any more sense to the experts here.
To the hardcore non-family gamer, yes, you're right.
But to the family man who started playing games in the 70s and 80s, and who now has two point two children of different ages and a wife who enjoys social media and small games on her tablet, it's not such a hard sell, as everyone in his family can enjoy what the Xbox One offers.
Like I said, Microsoft are looking to a different market, the family. That's why the hardcore gamer is spitting his dummy out on the internet: the biggest company in gaming is saying "you're only part of our vision", whereas their rival is saying "we still love you".
Again, it's just my opinion on the available facts as I see them.
The two points are related. When you add cores, you always add some level of overhead to handle the extra core. Two cores can't do twice the work of one core, even in a highly parallel problem, since there are always places where a bottleneck occurs. Let's say each added core makes all your cores 99% as efficient as they were before you added the new core. At 12 cores, you're looking at each core being about 88% as efficient as it would be by itself. At 18 cores, each core would be about 83% of its best. This is the same reasoning behind why beefing up all your cores could be better than adding a new core.
I don't actually know what the real efficiency loss is per core added, but the principle is the same, irrespective of how small the difference between "perfect correspondence" and actual performance is.
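For the curious, here's that reasoning as a quick Python sketch. The 1% per-core penalty is just the illustrative number from the paragraph above, not anything measured:

```python
# Toy model of the "each extra core costs some efficiency" argument above.
# The 1% per-core penalty is purely illustrative, not a measured figure.

def effective_throughput(cores, per_core_penalty=0.01):
    """Total throughput in single-core units when every core runs at
    (1 - per_core_penalty) ** cores efficiency (echoing the figures above)."""
    efficiency = (1.0 - per_core_penalty) ** cores
    return cores * efficiency

for n in (12, 18):
    eff = (1.0 - 0.01) ** n
    print(f"{n} cores: per-core efficiency ~{eff:.1%}, "
          f"total ~{effective_throughput(n):.2f}x a single core")

# 50% more cores buys less than 50% more throughput under this model.
gain = effective_throughput(18) / effective_throughput(12) - 1.0
print(f"12 -> 18 cores: ~{gain:.0%} more throughput, not 50%")
```

Under that made-up penalty, going from 12 to 18 cores gets you roughly 41% more throughput rather than the naive 50%, which is the shape of the argument even if the real numbers differ.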
Lastly, and changing the subject, how crazy would it be if Microsoft is being prevented (via an NDA) from talking about some super-neato aspect of their hardware? Wouldn't that just perfectly fit in with the rest of this "rolling thunder" of a PR disaster they've been executing on? "We've released all the unimpressive parts of the specs, but are prevented from discussing the good parts. Please understand."
I'm not saying I believe that's what's happening, but by God, wouldn't that just be par for the course this year?!
He said the differences in third-party games will be single-digit framerate (0-9 FPS).
That'd be less than it was with some PS360 games.
Well, 0-9 FPS, IF we're talking about 30 FPS, is 0-30%, for what that's worth. I guess even the high end is under the 40% GPU FLOPS deficit.
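Just to put numbers on that, a minimal sketch. The TFLOPS figures are the commonly quoted launch specs, used here purely as assumptions:

```python
# Back-of-envelope: a "single digit FPS" gap at 30 FPS vs the raw FLOPS gap.
# The TFLOPS numbers are the commonly quoted launch figures, assumed here
# only for illustration.
PS4_TFLOPS = 1.84
XB1_TFLOPS = 1.31

fps_gap = 9 / 30                               # worst case of "0-9 FPS" at 30 FPS
flops_advantage = PS4_TFLOPS / XB1_TFLOPS - 1  # PS4's raw FLOPS advantage

print(f"Worst-case single-digit gap at 30 FPS: {fps_gap:.0%}")
print(f"Raw FLOPS advantage (PS4 over XB1):    {flops_advantage:.0%}")
```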
I will ask two questions of the detractors, honest questions.
1. What piece of information would you want that I could provide that would convince you there is not a huge delta in performance?
2. If it comes out after we launch that the difference between 3rd party games is maybe single-digit FPS between the two platforms, will I get an apology or concession?
He really should shut up.
He really shouldn't have suggested the single-digit FPS difference.
Agree with you there, especially as a single digit has a pretty big range.
One of those comments that probably sounded better in your head!
Not only that, but dig a little deeper and it's also a confession of inferior performance.
Which he shouldn't be putting himself in the position of confessing in the first place.
FWIW, my expectation is that the PS4 ought to have a performance advantage, but I wouldn't expect it to reflect the difference in CU counts. CUs are MASSIVELY underutilized on vertex-heavy workloads, and plenty of the frame will be ROP- or bandwidth-limited.
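A minimal Amdahl-style sketch of that point, with a purely hypothetical 50/50 split between shader-bound and ROP/bandwidth-bound work in a frame:

```python
# Toy model of the point above: only the shader (CU)-bound part of a frame
# scales with CU count; the ROP- and bandwidth-bound parts do not.
# The 50/50 split is an invented illustrative assumption, not data.

def frame_time(shader_ms, other_ms, cu_scale):
    """Estimated frame time when the shader-bound portion speeds up by
    cu_scale and the ROP/bandwidth-bound portion stays fixed."""
    return shader_ms / cu_scale + other_ms

base_shader, base_other = 16.7, 16.6   # ~33.3 ms total, i.e. a 30 FPS frame
t12 = frame_time(base_shader, base_other, 1.0)
t18 = frame_time(base_shader, base_other, 1.5)   # 50% more CUs

print(f"12 CUs: {t12:.1f} ms (~{1000 / t12:.0f} FPS)")
print(f"18 CUs: {t18:.1f} ms (~{1000 / t18:.0f} FPS)")
print(f"Speedup: {t12 / t18:.2f}x, well short of 1.5x")
```

Under that invented split, 50% more CUs buys only a handful of extra frames per second, which is roughly the "single digit FPS" territory being argued about above.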