terraincognita
^ The difference, however, is that from the ports we have seen, it's not just one or two games that render at these lower performance levels; it's a vast majority of launch titles.
Most Wanted and Trine 2 are the only two games so far that have been superior to their HD twin equivalents from a framerate, resolution, or texture perspective.
The other side to your argument would be: why would only these two games so far have a handle on the Wii U to that extent?
We also know that, especially in Criterion's case, they already had the game essentially finished; they just didn't want to ship before knowing Nintendo's online plans.
So does that say more about Nintendo's documentation, the hardware itself, or the developers?
It could be a mix of all 3.
To be fair, Most Wanted isn't out yet. If the finished game does indeed turn out to be superior to the PS3/360 versions, as the devs (and the video preview) claim, it's worth pointing out that both Criterion and Frozenbyte (in the case of Trine 2) developed their Wii U ports themselves. It doesn't sound like the work was outsourced to a secondary team, whereas many (if not most) of the other ports were. It's also worth pointing out that both studios ported the PC version rather than using the 360 version as a base (unless Criterion ported the 360 version and then added the assets from the PC version).
Also to be fair, the "majority of launch titles" is only a handful of titles. I know it's not saying much, but they were close to parity with the HD twins, and a couple of titles surpassed the PS3 version in certain places (while in certain other places the PS3 version pulled ahead). I would even go so far as to say that if these versions were what had originally appeared on PS3, they would be within what is considered parity (which isn't saying much considering the console's age, but still).
To follow up on my last post: while I commend your efforts to figure this thing out, I don't think these Call of Duty numbers prove your point. You are trying to say that if we were looking at a 320 SPU part, we'd be seeing vastly better frame rates and resolutions, correct? In reaffirming the statement I quoted before about the effect of resolution on GPU and CPU loads, I started to notice some peculiarities in those benchmarks.
Let's look at the HD 6450 for comparison. At 1024x768 we are seeing comfortable frame rates, which makes sense since the GPU is barely being taxed at that resolution. Take the next bump up in resolution/IQ and look what happens to the frame rate: it takes a nosedive. Is it any coincidence that CPU bottlenecks are easier to discern at low resolutions? So when comparing the lowest numbers in this chart against the average figures for BLOPS II on Wii U, it makes sense that we would see a difference, since Espresso is no i7. And then there are other performance factors on Nintendo's console, like the locked v-sync and the number of characters on screen (which seems to be a CPU thing). Meanwhile, the chart also shows the clear effect of memory bandwidth on performance.
It's pretty amazing that on the same card, the difference between 1280x1024 with 2x AA/8x AF and 1920x1200 with 4x AA/16x AF is only ~8 frames!
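For a rough sense of why such a small gap is telling, here's some back-of-the-envelope pixel math. Only the two resolutions quoted above go into it; the ~8 fps figure is just the gap from the chart being discussed, and AA/AF cost is ignored entirely:

```python
# Back-of-the-envelope pixel counts for the two settings quoted above.
# Ignoring the AA/AF bump only strengthens the point: the higher setting
# is doing even more work per frame than the raw pixel ratio suggests.

low_res  = 1280 * 1024   # 1,310,720 pixels per frame
high_res = 1920 * 1200   # 2,304,000 pixels per frame

print(f"pixel ratio: {high_res / low_res:.2f}x")  # ~1.76x

# If the card were purely GPU-limited, the frame rate would be expected to
# fall roughly in proportion to per-frame work; a gap of only ~8 fps hints
# that the bottleneck at these settings is the CPU or memory, not the ALUs.
```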
In short, while I agree that the jury is still out on whether it's a 160 SPU part, I don't think you can rule out a 320 SPU part just by making the fair observation that games thus far haven't automatically featured increased resolution and framerate. If getting the image quality to where they felt comfortable already resulted in a merely acceptable framerate, given everything else that affects performance, why would the developer then go ahead and increase the settings?
Good point about the CPU. The test setup used in that benchmark was an i7 920 overclocked to 3.8GHz. Even at stock (2.67GHz) I doubt even eight AMD Jaguar cores would surpass it. I could be wrong, though.
http://www.notebookcheck.net/AMD-Radeon-HD-7450M.57211.0.html
This is a 7450M, a 160:8:4 mobile GPU based on Caicos. If you scroll down you'll see average framerates listed for a few games, and if you hover the mouse over the settings it will tell you the resolution used (along with the AA and AF settings). The 7450M came in DDR3 and GDDR5 flavors, both clocked at 700MHz. One of the games tested is COD:BO (the original). According to this, the DDR3 version averages 52.8 fps @ 800x600 with no AA or AF on low settings. BO and BO2 use the same engine as the original MW, but each subsequent game adds to it (meaning the 5+ year-old MW would run faster than BO2 on the same PC).
BO2 on 360/Wii U is close to the PC's high settings (without AO and the Extra texture setting) @ 880x720 with 2x MSAA and probably 4x AF. The above link claims that @ 1024x768 on medium settings, the original BO averaged 36.5 fps. BO2 on Wii U does experience a drop in fps when there is a lot of action on screen, especially in a few realtime cutscenes (like the first one, as seen over at DF), and in the jungle it runs kind of mixed. For the most part, though, the single-player campaign stays close to the target 60 fps. Well, at least above 50 fps; I'm not using fps-measuring equipment, just going by eye, which really only notices dips below 50 fps.
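Just for scale, here's the pixel math on those render targets. This only counts pixels per frame and ignores the MSAA/AF cost as well as BO2 being a heavier game than the original BO:

```python
# Pixel counts for the render targets mentioned above. MSAA/AF cost and
# the BO-vs-BO2 engine differences are deliberately ignored here.

targets = {
    "BO2 on Wii U (880x720, 2x MSAA)": 880 * 720,   # 633,600
    "7450M bench, medium (1024x768)":  1024 * 768,  # 786,432
    "7450M bench, low (800x600)":      800 * 600,   # 480,000
}

wiiu = targets["BO2 on Wii U (880x720, 2x MSAA)"]
for name, px in targets.items():
    print(f"{name}: {px:,} px ({px / wiiu:.2f}x the Wii U target)")
```

So the Wii U render target is a bit smaller than the 1024x768 medium benchmark, but it's being pushed at higher settings and with MSAA on top.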
Keep in mind this was a DDR3 card, but compare it to the DDR3 version in the link that function posted. The laptop CPU was an AMD A6 3420M @ 1.5GHz. Even without the overhead that comes with Windows, I doubt 160 shader cores could pull that off. Latte is probably using a modern custom 7xx-based GPU, or else the leaked info wouldn't keep alluding to shader model 4+ (and the + doesn't mean 5). At the very least I would consider 256 shader cores using an unorthodox custom design, but I'm leaning toward 320 due to the block size. If it really were 160, I doubt those outsourced devs would have gotten as close to 360 parity as they did in the time they had to work on their ports. But I'm just a noob; I could be wrong.
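For raw ALU throughput, the usual VLIW5 arithmetic works out like this. Note the assumptions: the ~550MHz Latte clock is the commonly cited figure rather than anything confirmed in this thread, and raw GFLOPS ignores bandwidth, the CPU, and efficiency differences, so treat it as a rough yardstick only:

```python
# Theoretical single-precision throughput for the parts being compared,
# using the standard VLIW5 figure of 2 FLOPs (one MADD) per SPU per clock.
# The 550MHz Latte clock is the commonly cited figure, not confirmed here.

def gflops(spus, clock_ghz):
    return spus * 2 * clock_ghz

print(f"HD 7450M, 160 SPUs @ 0.70GHz: {gflops(160, 0.70):.0f} GFLOPS")  # 224
print(f"Latte if 160 SPUs @ 0.55GHz:  {gflops(160, 0.55):.0f} GFLOPS")  # 176
print(f"Latte if 320 SPUs @ 0.55GHz:  {gflops(320, 0.55):.0f} GFLOPS")  # 352
```

On that yardstick a 160 SPU Latte would actually sit below the 7450M in raw shader throughput, which is roughly the point I'm making above.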