^ Arguing for what we may or may not see at E3 seems more like conjecture than anything we've been talking about.
Never said it would; I was referring to the high-resolution textures, although the GPU being more powerful to a degree would contribute to that, as I've always assumed a 300-350 GFLOPS GPU. But I'd like to stress that I have no "interest" in making the Wii U out to be anything more than what the information we've gleaned so far supports.
I have a Wii U, and I like playing games on it. But I've been saying for a while that there's no magic in this setup. Anything Criterion has gotten out of it has come from careful optimization of the elements we've always known are there.
We've always known that 1GB of RAM should allow theoretically higher resolutions, and that the GPU, being a theoretical 320-SPU part, should perform much better than the old-ass Xenos and RSX setups. My questions have been about why that has not been the case outside of one or two exceptions.
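For reference, the back-of-the-envelope math behind that 300-350 GFLOPS assumption; the 320 SPUs and the ~550 MHz clock are rumored figures, not confirmed specs, and the Xenos numbers are just the commonly cited ones for comparison:

```cpp
// Rough peak-FLOPS arithmetic behind the "300-350 GFLOPS" assumption.
// 320 SPUs and ~550 MHz are rumored, not confirmed; Xenos figures are the
// commonly cited ones, included only for comparison.
#include <cstdio>

int main() {
    const double latte_alus    = 320.0;   // rumored shader units
    const double latte_clock   = 0.55e9;  // ~550 MHz (assumed)
    const double flops_per_alu = 2.0;     // one multiply-add per cycle

    const double latte_gflops = latte_alus * latte_clock * flops_per_alu / 1e9;

    const double xenos_alus  = 240.0;     // 48 vec5 units = 240 lanes
    const double xenos_clock = 0.5e9;     // 500 MHz
    const double xenos_gflops = xenos_alus * xenos_clock * flops_per_alu / 1e9;

    std::printf("Latte (rumored): ~%.0f GFLOPS\n", latte_gflops);  // ~352
    std::printf("Xenos:           ~%.0f GFLOPS\n", xenos_gflops);  // ~240
    return 0;
}
```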
We should also take care to remember that mirroring images incurs no performance penalty; only when the two screens show different content does the game's performance start to be impacted, which is why the GamePad usually only shows lightweight material. It's not a bullet point proving the Wii U is hiding anything spectacular.
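To be concrete about why mirroring is free, here's a hypothetical render loop; the function names are invented for illustration and this is not the actual Wii U SDK API:

```cpp
// Hypothetical render loop showing why a mirrored GamePad image is essentially
// free while a distinct second view is not. Names are made up for illustration;
// this is not the real Wii U SDK (GX2) API.
#include <cstdio>

struct Frame  { /* rendered image */ };
struct Camera { /* view parameters */ };

Frame render_scene(const Camera&) { return Frame{}; }  // full scene render (expensive)
void present_to_tv(const Frame&) {}                     // scan-out to the TV
void present_to_gamepad(const Frame&) {}                // stream to the GamePad

void run_frame(bool mirrored, const Camera& tv_cam, const Camera& pad_cam) {
    Frame tv = render_scene(tv_cam);       // one full render either way
    present_to_tv(tv);

    if (mirrored) {
        present_to_gamepad(tv);            // same image: no extra render cost
    } else {
        Frame pad = render_scene(pad_cam); // different view: a second full render,
        present_to_gamepad(pad);           // hence the "light material" usually shown
    }
}

int main() {
    Camera tv_cam{}, pad_cam{};
    run_frame(true,  tv_cam, pad_cam);     // mirror: ~1x render cost
    run_frame(false, tv_cam, pad_cam);     // asymmetric: ~2x render cost
    return 0;
}
```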
Also, the 360 has a tessellation unit, so I'm not sure of your angle there. Do you mean the Wii U is in a more capable position to use its tessellation unit? That makes more sense.
I won't argue about how powerful the Wii U is relative to the 360/PS3; they all have their places against each other.
As for exotics.
I call the Cell an exotic piece of hardware. We know that the Wii U's GPU is based on the RV700 series of GPUs, with custom arrays bolted onto it. That's not exotic to me.
Yes, I meant to say a more modern tessellation unit. AFAIK, Latte is rumored to house a fixed-function unit (à la the 360), but one that doesn't require explicit vertex fetches, & rendering cracks have been further reduced. When I speak of software at E3, it is certainly not conjecture. There are only two examples so far because proper documentation was not available before November, & Frozenbyte simply spent more time optimizing their engine to better exploit aspects of the Wii U hardware.
Nintendo's most recent SDK is still immature. Toolchains, developer support contracts, compilers, the various language support that allows access to its unique PPC-based architecture, etc. are still very much in their infancy. So I'm sure you can understand this, as well as its direct implications for multi-platform software development. Rayman Legends should further showcase these enhancements in Sept., especially since the Wii U is the lead platform.
Again, we are discussing ported software on non-proprietary engines, in all likelihood using the 360 as the base platform. It's akin to looking at the PS3 ME3 port vs. the 360 & thinking, "Is this all the system is capable of?" That's not an accurate capability metric when set against the software output of Naughty Dog or Santa Monica Studios. The titles I have knowledge of were built from the ground up, using proprietary engines built specifically to exploit the Wii U's hardware strengths.
Who here is making the Wii U out to be more than what it is? I'm in total agreement with you regarding the theoretical specs. I am well aware that mirroring the screen incurs no performance penalty; I merely brought this system aspect up. Although something that mirrors while also allowing seamless & instantaneous changes to the game world (as in NFS), as well as assisted car control (drifting etc.), is a bit disingenuous to label a simple mirror, à la Tekken TT. Regardless, I believe Black Ops II was simply a starting point, & these asymmetric gameplay experiences will only be further refined (certainly for 3rd parties as Nintendo's API matures). Also, all game scenarios will vary by complexity & implementation, so yes, it is indeed a bullet point for the system.
I also recognise the Wii U's weaker central processor (despite the system being of a very GPU-centric design), albeit not as inept as previously thought; I stated as much a very long time ago here. That is why I qualified the statement by saying "overall more capable system." It was not meant to be a point-by-point comparative evaluation of power vs. the current HD twins, as all systems have their deficiencies. Exotic, or rather unorthodox, designs can vary by degrees (yes, the PS3's Cell architecture had slipped my mind). And your description of Latte as an RV700 derivative with custom arrays "bolted on", as if an afterthought, I found quite telling. It's a tad more complex than your oversimplified GPU analysis. Anyhow, I hope the words of David Helgason, CEO of Unity, ring true:
http://www.cinemablend.com/games/Interview-Why-Unity-Engine-Good-Fit-Wii-U-47173.html
Gaming Blend: While the Wii U doesn't specifically use DirectX functionality, will the Unity Engine for the Wii U allow for DirectX 11 equivalent functionality in regards to shaders, soft and self shadowing as well as potential scalability for Shader 5.0 or higher?
Helgason: Yeah. We'll do a-We'll make it potentially possible to do so.
The RAM aspect was my viewpoint, but we did have a debate in here about that.
Higher-resolution textures, lighting and draw distance are all enabled by a larger amount of RAM.
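As a rough illustration of the texture side of that claim (uncompressed RGBA8, no mipmaps, purely back-of-the-envelope):

```cpp
// Back-of-the-envelope texture memory footprints, showing why higher-resolution
// texture sets demand more RAM. Uncompressed RGBA8, no mipmaps or compression.
#include <cstdio>

int main() {
    const int bytes_per_texel = 4;               // RGBA8
    const int sizes[] = {512, 1024, 2048};
    for (int s : sizes) {
        const double mib = static_cast<double>(s) * s * bytes_per_texel / (1024.0 * 1024.0);
        std::printf("%4d x %-4d texture: %6.1f MiB\n", s, s, mib);  // 1.0, 4.0, 16.0
    }
    // Each doubling of resolution quadruples the footprint, which is why a larger
    // RAM pool is what makes higher-resolution texture sets feasible.
    return 0;
}
```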
Again with the magical RAM enabling all effects. I believe you have been corrected twice now, excluding myself. Why you persist along this incorrect line of thought is puzzling; your agenda, perhaps?