*split* multiplatform console-world problems + Image Quality Debate

I'm surprised MS didn't camp their engineers at Infinity Ward to make sure they had the best version of the game. COD is one of the games that really made the 360, and given how big a feature it was at the initial reveal, I would have thought MS would have helped out with it.

How would they ensure they'd have the best version of the game with such a large GPU deficit? Besides, maybe they did send out guys to IW. I doubt they didn't try.

Anyways, I think it's smarter for MS to concentrate on broader things like getting exclusive titles. It has more impact.
 
The odds that they are not using the ESRAM at all are pretty much zero. The odds that the engine has not received significant changes to its rendering pipeline are also zero. Xbox One and PS4 are incredibly different from PS3 and 360. They'd most likely be porting from their PC engine, which, let's be honest, should be fairly good at taking advantage of AMD GPUs and x86 CPUs. I'm sure, like every dev, they'll find better ways to do things with each release, but I wouldn't expect utilization to be that bad.
 
I don't think they actually argued that. They said they boosted the CPU because there were cases where games were CPU limited. That does not mean that, if they could have had 50% more CUs or 100% more ROPs, they'd have taken the slight CPU boost instead.


They purposely attempted to portray additional CUs beyond the 12 as less important than the upclock of the whole chip here:

"Balance is so key to real effective performance. It's been really nice on Xbox One with Nick and his team and the system design folks have built a system where we've had the opportunity to check our balances on the system and make tweaks accordingly. Did we do a good job when we did all of our analysis a couple of years ago and simulations and guessing where games would be in terms of utilisation? Did we make the right balance decisions back then? And so raising the GPU clock is the result of going in and tweaking our balance. Every one of the Xbox One dev kits actually has 14 CUs on the silicon. Two of those CUs are reserved for redundancy in manufacturing. But we could go and do the experiment - if we were actually at 14 CUs what kind of performance benefit would we get versus 12? And if we raised the GPU clock what sort of performance advantage would we get? And we actually saw on the launch titles - we looked at a lot of titles in a lot of depth - we found that going to 14 CUs wasn't as effective as the 6.6 per cent clock upgrade that we did. Now everybody knows from the internet that going to 14 CUs should have given us almost 17 per cent more performance but in terms of actual measured games - what actually, ultimately counts - is that it was a better engineering decision to raise the clock. There are various bottlenecks you have in the pipeline that [can] cause you not to get the performance you want [if your design is out of balance]."


and they also said the CPU was more important to frame rate here:

"Another very important thing for us in terms of design on the system was to ensure that our game had smooth frame-rates. Interestingly, the biggest source of your frame-rate drops actually comes from the CPU, not the GPU. Adding the margin on the CPU... we actually had titles that were losing frames largely because they were CPU-bound in terms of their core threads. In providing what looks like a very little boost, it's actually a very significant win for us in making sure that we get the steady frame-rates on our console. And so that was a key design goal of ours - and we've got a lot of CPU offload going on."

They may be correct, and perhaps the tools are incomplete, but what we are seeing in these early days raises questions; that is my point.
 
Yeah, those quotes are pretty much in line with what I said. They chose an upclocked 12 CUs over 14 CUs. There is nothing there that says CPU > GPU.

They chose to upclock the CPU because they were getting dropped frames in CPU limited situations. There is nothing there that says CPU is more important than GPU in general. Put it this way, if you're CPU limited, and it's causing dropped frames, adding more GPU won't help (unless I guess you can do GPGPU for that workload). Reducing the amount of times you're CPU limited will help smooth out frame rate issues, but it does not say there aren't any limitations on the GPU side.
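To illustrate the point, here's a toy frame-time model (purely illustrative, not how any real engine schedules work): the frame ships when the slower of the two processors finishes, so extra GPU does nothing for a CPU-bound frame.

```python
# Toy model: per-frame cost is gated by whichever processor finishes last.
# Real engines overlap CPU and GPU work in more complex ways.
def frame_time_ms(cpu_ms: float, gpu_ms: float) -> float:
    return max(cpu_ms, gpu_ms)

# CPU-bound frame: 20 ms of CPU work, 14 ms of GPU work -> 50 fps, below a 60 fps target.
before = frame_time_ms(20.0, 14.0)
# A 50% faster GPU changes nothing for this frame.
more_gpu = frame_time_ms(20.0, 14.0 / 1.5)
# The real 1.6 -> 1.75 GHz CPU upclock shaves the frame to ~18.3 ms.
faster_cpu = frame_time_ms(20.0 * 1.6 / 1.75, 14.0)

print(before, more_gpu, faster_cpu)  # 20.0, 20.0, ~18.3
```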
 
I wonder how the X1 will compare to the X360 version.

People don't seem to bother about Full HD. I wonder what they'll say about X1 versus X360...
 
They chose to upclock the CPU because they were getting dropped frames in CPU limited situations.

Let's be honest, they chose an upclock because it was literally the only thing they could do. Any extra costs from lower yields would be temporary and relatively cheap. Any post-hoc public justification was just PR.
 
I wonder how the X1 will compare to the X360 version.

People don't seem to bother about Full HD. I wonder what they'll say about X1 versus X360...

You really think X1 version isn't going to mop the floor with the 360 version? Compare BF4 at 720p to BF3/4 on 360. I'd expect a big improvement.
 
The odds that they are not using the ESRAM at all are pretty much zero. The odds that the engine has not received significant changes to its rendering pipeline are also zero. Xbox One and PS4 are incredibly different from PS3 and 360. They'd most likely be porting from their PC engine, which, let's be honest, should be fairly good at taking advantage of AMD GPUs and x86 CPUs. I'm sure, like every dev, they'll find better ways to do things with each release, but I wouldn't expect utilization to be that bad.

The engine received next-gen treatment, but probably without console-specific next-gen optimization. If the game has the same fps on both consoles and the same pixel quality, then they probably aren't using the ESRAM. Or it might be that the X1 uses better AA to compensate for the lower resolution, which would make the difference more explainable. Either we ask the devs directly or wait for a direct comparison of the game. Personally, I wouldn't mind 720p with better AA.
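One hedged back-of-the-envelope reason why using the ESRAM doesn't automatically get you to 1080p: a full set of render targets at 1080p may simply not fit in 32MB at once. The numbers below assume a hypothetical five-target setup (four colour targets plus depth) at 4 bytes per pixel; the actual Ghosts framebuffer layout isn't public, and partial residency or spilling some targets to DDR3 is always an option.

```python
# Rough render-target budget vs the 32 MB of ESRAM.
# Assumes 4 bytes per pixel per target and a hypothetical 4-colour-target
# G-buffer plus depth; the real framebuffer layout is not public.
def target_mb(width: int, height: int, bytes_per_pixel: int = 4) -> float:
    return width * height * bytes_per_pixel / (1024 * 1024)

for w, h in [(1280, 720), (1600, 900), (1920, 1080)]:
    one = target_mb(w, h)
    five = 5 * one  # 4 colour targets + depth
    fits = "fits" if five <= 32 else "does not fit"
    print(f"{w}x{h}: {one:.1f} MB per target, {five:.1f} MB for five ({fits} in 32 MB)")
```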
 
The engine received next-gen treatment, but probably without console-specific next-gen optimization. If the game has the same fps on both consoles and the same pixel quality, then they probably aren't using the ESRAM. Or it might be that the X1 uses better AA to compensate for the lower resolution, which would make the difference more explainable. Either we ask the devs directly or wait for a direct comparison of the game. Personally, I wouldn't mind 720p with better AA.

Why would you assume they aren't using ESRAM if the pixel quality is about the same ...
 
Yeah, those quotes are pretty much in line with what I said. They chose an upclocked 12 CUs over 14 CUs. There is nothing there that says CPU > GPU.

They chose to upclock the CPU because they were getting dropped frames in CPU limited situations. There is nothing there that says CPU is more important than GPU in general. Put it this way, if you're CPU limited, and it's causing dropped frames, adding more GPU won't help (unless I guess you can do GPGPU for that workload). Reducing the amount of times you're CPU limited will help smooth out frame rate issues, but it does not say there aren't any limitations on the GPU side.

Why does the XB1 version have a lower frame rate? I'm not directing this at you personally; it's more or less a response to the article and the talking points MS has shared recently. As I already said, they could be correct or could be wrong, but it would be really interesting to know what the source of the lower resolution and lower frame rate was.

Regarding the article, I would say they did try to suggest that Sony was also targeting 14 CUs, but I can see how not everyone would interpret their comments the same way.

"If you go to VGleaks, they had some internal docs from our competition. Sony was actually agreeing with us. They said that their system was balanced for 14 CUs. They used that term: balance. Balance is so important in terms of your actual efficient design. Their additional four CUs are very beneficial for their additional GPGPU work. We've actually taken a very different tack on that. The experiments we did showed that we had headroom on CUs as well. In terms of balance, we did index more in terms of CUs than needed so we have CU overhead. There is room for our titles to grow over time in terms of CU utilisation, but getting back to us versus them, they're betting that the additional CUs are going to be very beneficial for GPGPU workloads. Whereas we've said that we find it very important to have bandwidth for the GPGPU workload and so this is one of the reasons why we've made the big bet on very high coherent read bandwidth that we have on our system."
 
The GPUs on these 2 systems are worlds apart IMO (18 vs 12 CUs, 32 vs 16 ROPs, gigabytes vs 32MB of high-speed memory), so I am not surprised. Looking at launch titles (some, not all), Sony felt that 1920x1080 was important in terms of product differentiation, at least for the near future.

2-4 software cycles into the lifespan we will see what happens to resolution on both these systems.
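Rough numbers, using the standard GCN peak-throughput formula (CUs x 64 lanes x 2 ops per FMA x clock) and the commonly cited 800MHz/853MHz clocks:

```python
# Peak single-precision throughput for a GCN-style GPU:
# CUs * 64 lanes * 2 ops (FMA) * clock.
# Clocks assumed: 800 MHz (PS4), 853 MHz (Xbox One).
def peak_tflops(cus: int, clock_mhz: int) -> float:
    return cus * 64 * 2 * clock_mhz * 1e6 / 1e12

ps4 = peak_tflops(18, 800)  # ~1.84 TFLOPS
xb1 = peak_tflops(12, 853)  # ~1.31 TFLOPS
print(f"PS4 {ps4:.2f} TFLOPS, XB1 {xb1:.2f} TFLOPS, ratio {ps4 / xb1:.2f}x")
```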
 
Why would you assume they aren't using ESRAM if the pixel quality is about the same ...
Because then the difference wouldn't be 720p vs 1080p (assuming the fps is also the same). Of course, there could be bottlenecks other than ESRAM utilization. But I would think the X1 is capable of running at least 2/3 of the PS4's resolution if they use the ESRAM.
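Quick pixel arithmetic behind that "2/3" expectation:

```python
# Pixel counts: 720p is well under 2/3 of 1080p.
full_hd = 1920 * 1080  # 2,073,600 pixels
hd = 1280 * 720        # 921,600 pixels

print(f"720p is {hd / full_hd:.0%} of 1080p")            # ~44%
print(f"1600x900 is {1600 * 900 / full_hd:.0%} of 1080p") # ~69%
print(f"2/3 of 1080p is ~{full_hd * 2 // 3:,} pixels")    # ~1,382,400, roughly 1600x864
```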
 
They purposely attempted to portray additional CUs beyond the 12 as less important than the upclock of the whole chip here:

"Balance is so key to real effective performance. It's been really nice on Xbox One with Nick and his team and the system design folks have built a system where we've had the opportunity to check our balances on the system and make tweaks accordingly. Did we do a good job when we did all of our analysis a couple of years ago and simulations and guessing where games would be in terms of utilisation? Did we make the right balance decisions back then? And so raising the GPU clock is the result of going in and tweaking our balance. Every one of the Xbox One dev kits actually has 14 CUs on the silicon. Two of those CUs are reserved for redundancy in manufacturing. But we could go and do the experiment - if we were actually at 14 CUs what kind of performance benefit would we get versus 12? And if we raised the GPU clock what sort of performance advantage would we get? And we actually saw on the launch titles - we looked at a lot of titles in a lot of depth - we found that going to 14 CUs wasn't as effective as the 6.6 per cent clock upgrade that we did. Now everybody knows from the internet that going to 14 CUs should have given us almost 17 per cent more performance but in terms of actual measured games - what actually, ultimately counts - is that it was a better engineering decision to raise the clock. There are various bottlenecks you have in the pipeline that [can] cause you not to get the performance you want [if your design is out of balance]."


and they also said the CPU was more important to frame rate here:

"Another very important thing for us in terms of design on the system was to ensure that our game had smooth frame-rates. Interestingly, the biggest source of your frame-rate drops actually comes from the CPU, not the GPU. Adding the margin on the CPU... we actually had titles that were losing frames largely because they were CPU-bound in terms of their core threads. In providing what looks like a very little boost, it's actually a very significant win for us in making sure that we get the steady frame-rates on our console. And so that was a key design goal of ours - and we've got a lot of CPU offload going on."

They may be correct, and perhaps the tools are incomplete, but what we are seeing in these early days raises questions; that is my point.
I think most people are aware that the Xbox One can do better than that. In fact, the hardware side of the console has some neat and interesting features.

I wonder how the X1 will compare to the X360 version.

People don't seem to bother about Full HD. I wonder what they'll say about X1 versus X360...
Yes, I wonder that too. Developers have said the console is something like 8x the Xbox 360, so I'd expect an 8x increase in visual fidelity.

Aside from that, many developers say the differences between the PS4 and Xbox One aren't huge, and Major Nelson hinted at the possibility of the Xbox One being close to the PS4, saying he couldn't wait until the truth came out.

The only truth I see for now is that some games aren't running at a superior resolution compared to the PS3 and X360. I wonder what Treyarch and Microsoft have to say in regards to this.
 
I don't think they actually argued that. They said they boosted the CPU because there were cases where games were CPU limited. That does not mean that, if they could have had 50% more CUs or 100% more ROPs, they'd have taken the slight CPU boost instead.

In the DF interview with Microsoft, they implied that they chose 16 ROPs for bandwidth reasons. They even implied that the PlayStation 4 didn't have enough memory bandwidth to take advantage of its 32 ROPs, and they gave a number for the bandwidth requirement of 32 ROPs. I don't remember the exact figure, but I know it was in the 200GB/s range.
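I don't have the exact figure from the interview either, but a rough estimate lands in that ballpark. Assuming 800MHz and 4 bytes of colour plus 4 bytes of depth traffic per pixel:

```python
# Rough peak bandwidth needed to keep 32 ROPs busy.
# Assumes 800 MHz, 4 bytes of colour written and 4 bytes of depth
# read/written per pixel; blending or MSAA pushes it higher still.
rops, clock_hz = 32, 800e6
bytes_per_pixel = 4 + 4  # colour write + depth traffic (rough)
peak_gb_s = rops * clock_hz * bytes_per_pixel / 1e9
print(f"~{peak_gb_s:.0f} GB/s")  # ~205 GB/s, in the ballpark of the quoted figure
```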
 
You know, the Xbox One supposedly having a great hardware scaler could turn out to be a curse as well as a blessing. Think about it: what reason does a third-party dev have to optimize a game for a higher resolution when they can drop to 720p, quickly solve their issues, and take comfort in the fact that the Xbox One will automatically upscale to 1080p? They know a majority of people won't be able to tell the difference. I think it would be wise of Microsoft to enforce a 900p minimum resolution for all games. If they don't, we could see a lot of multiplatform games go this route on the Xbox One.
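For anyone curious what the scaler is actually doing, a 720p to 1080p upscale is just a fixed 1.5x resample per axis. A minimal software sketch using bilinear filtering (a real hardware scaler would use a higher-quality multi-tap filter; this is only to show the mechanics):

```python
# Minimal bilinear 720p -> 1080p resample with NumPy, as a software
# stand-in for what a hardware scaler does (real scalers filter better).
import numpy as np

def bilinear_upscale(img: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    in_h, in_w = img.shape[:2]
    ys = np.linspace(0, in_h - 1, out_h)   # source row coordinate per output row
    xs = np.linspace(0, in_w - 1, out_w)   # source column coordinate per output column
    y0, x0 = np.floor(ys).astype(int), np.floor(xs).astype(int)
    y1, x1 = np.minimum(y0 + 1, in_h - 1), np.minimum(x0 + 1, in_w - 1)
    wy, wx = (ys - y0)[:, None], (xs - x0)[None, :]
    top = img[y0][:, x0] * (1 - wx) + img[y0][:, x1] * wx
    bottom = img[y1][:, x0] * (1 - wx) + img[y1][:, x1] * wx
    return top * (1 - wy) + bottom * wy

frame_720p = np.random.rand(720, 1280)  # stand-in for a rendered frame
frame_1080p = bilinear_upscale(frame_720p, 1080, 1920)
print(frame_720p.shape, "->", frame_1080p.shape)
```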
 
Aside from that, many developers say that the differences aren't huge between the PS4 and Xbox One and Major Nelson hinted at the possibility of the Xbox One to be close to the PS4 and said he couldn't wait til the truth came out.

You expect the guy who hangs out at malls for XB1 events to know what resolution the XB1 and PS4 are running at? Is this where you get your info from, a PR mouthpiece?
 
Of course I do! But will Joe Gamer see and appreciate the difference?

Joe Gamer probably won't see a huge difference in COD between 360 and next gen, because the game clearly looks cross-gen asset-wise. They are equally unlikely to see the 720p vs 1080p difference.
 
Of course I do! But will Joe Gamer see and appreciate the difference?

That's pretty much what they are counting on. Joe Gamer really can't see the difference in resolutions, so simply shade the pixels the same way and they won't be able to tell much difference at all from 1080p to 720p. Resolution is the easiest visual element to drop without anyone knowing any different. That's why we have a "resolution thread", yet don't have a "are those shadow maps blocky" thread or "are those textures blurry" thread, etc. That's because while the latter two are relatively easy to spot even by the untrained eye, resolution typically remains an unknown until someone measures it. Only when the unknown and unseen is meticulously measured and deemed < 1080p does Outrage(tm) ensue that the pixels they don't notice and can't see aren't there. I'd expect on forums this gen that resolution will become The Definitive Image Quality Measurement Metric(tm), meanwhile the real world won't notice or care.
 