Wii U hardware discussion and investigation *rename

OK, how can the ATI GPU emulate Hollywood's 1 MB texture cache?
Do they have to? I mean, just mapping it to main memory would still be faster (not to mention that the normal texture caches of any halfway modern GPU kick in automatically). I don't get your logic that the Wii U would need 30 times or so the bandwidth of the original Wii to run the old games. WTH?!?
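For what it's worth, the straightforward way to handle it is to back the 1 MB TMEM with an ordinary buffer in host RAM and let the host's own caches do the rest. A rough sketch of that idea (purely illustrative, not Dolphin's actual code; the class and all names are made up):

```cpp
// Purely illustrative sketch of emulating Hollywood's 1 MB texture memory
// (TMEM) as plain host RAM. Not Dolphin's actual code; all names are made up.
#include <array>
#include <cstddef>
#include <cstdint>
#include <cstring>

class EmulatedTmem {
public:
    static constexpr std::size_t kTmemSize = 1024 * 1024;  // 1 MB, as on Hollywood

    // Guest "texture load": copy a region from emulated main RAM into TMEM.
    void LoadFromMainRam(const std::uint8_t* main_ram, std::uint32_t src_addr,
                         std::uint32_t tmem_addr, std::uint32_t size) {
        if (tmem_addr > kTmemSize || size > kTmemSize - tmem_addr) {
            return;  // out-of-range load; a real emulator would log or clamp this
        }
        std::memcpy(tmem_.data() + tmem_addr, main_ram + src_addr, size);
    }

    // Host-side texture decoding/sampling just reads this buffer (or uploads it
    // to the host GPU); the host CPU/GPU caches make that fast on their own.
    const std::uint8_t* Data(std::uint32_t tmem_addr) const {
        return tmem_.data() + tmem_addr;
    }

private:
    std::array<std::uint8_t, kTmemSize> tmem_{};  // the whole "cache" lives in RAM
};
```

The point being: you don't have to replicate the original cache's bandwidth one-for-one, it only has to behave the same.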
 
I'd say, the same way PC emulators do it... you hardly need beefy hardware to run Wii games at high res today. Though I don't know exactly how probable such a solution actually is. Sony does it... so Nintendo should be able to do it, too.
 
Dolphin is actually pretty demanding. Some Wii/Cube rendering techniques don't map well to PC, so performance can tank in some cases, like F-Zero GX's Sand Ocean track. That game has highly variable performance even on my 4.3 GHz i5 with a 560 Ti. However, the Dolphin authors access PC hardware through a restrictive API and probably don't have access to all of the Wii/Cube hardware docs. It's also possible that Nintendo again has dedicated hardware to make things easier here.
 
I might be wrong here, but maybe - just maybe - not everyone agrees with you on this. Maybe they don't see 'exotic hardware' or on-paper advantages as something to base their enthusiasm for a piece of hardware on. Maybe they're optimistic because they know for a fact there'll be some beautiful games on the system, based on what we know is in there and what we know game developers can do with this range of hardware.

Yes, but then that is not enthusiasm for Nintendo hardware per se but for Nintendo developers and their games.

Hence my earlier point that it'd probably be best if Nintendo ended up like Sega and became a publisher rather than a platform owner (then we could get Nintendo games on high-end hardware).

It's unfair to call it "Kool-Aid drinking" when some people simply have a different opinion than you on a subject. Sure, some people are just huge Nintendo fans and won't let anybody diss "their" system - just like with any other console. But not all.

Actually the Kool-Aid drinking refers to people who (ignoring all the information we have) insist the Wii U will be able to compete (in terms of processing hardware) with the next gen machines from Sony and MS.
 
Anyone speak Japanese?

Makes no sense...


2 processors, 6 cores, 12 threads
Proc 1 (presumably Espresso): 920 MHz
Proc 2 (presumably ARM): 262 MHz
64-bit architecture, big endian
SMP: 4-way
Lvl 1: 512 KB
Lvl 2: 2 MB, 12-way
Lvl 3: 12 MB eDRAM
14 nm SOI process

Second part (graphics core):
Architecture: custom
Stream processors: 500
Engine clock: 710 MHz
Texture fill rate: ~29 billion/sec
Video RAM: 512 MB GDDR5
Memory clock: 1 GHz
Memory interface: 512-bit
Memory bandwidth: 153 GB/s
Max frame buffer size: 16k x 16k
Frame buffer memory: 1.4 GB
(these are all total BS btw)
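For what it's worth, the memory numbers don't even hang together. GDDR5 moves four bits per pin per cycle of the quoted memory clock, so a 512-bit bus at a 1 GHz memory clock works out to

512 bit x 4 x 1 GHz / 8 = 256 GB/s,

and if the 1 GHz is meant as the effective per-pin rate instead, it's only 64 GB/s. Hitting 153 GB/s on a 512-bit bus would need roughly 2.4 Gbps per pin, which matches neither reading.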
 
Probably fake.

14nm eDRAM....

It does exist though:

Gregory Pitner
14nm Fin-FET based eDRAM Design/Verification at IBM
Research Assistant at Rensselaer Polytechnic Institute
eDRAM Development Co-op at IBM
http://www.linkedin.com/pub/gregory-pitner/39/580/4b1


Building on its successes with SOI technology, IBM will move to finFETs based on silicon-on-insulator wafers at the 14nm node.
Gary Patton, vice president of IBM’s Semiconductor Research & Development Center, said IBM will use SOI for all of its 14nm products, including the server processors it uses internally and the Asics it makes for itself and external customers, including SoCs made for video game vendors. Some of those chips will be made internally and others at external foundry partners, Patton said.
 
That article's from 2010; any manufacturing of processors and revenues and so on aren't going to be Wuu-related, as that's way, way too far back to have any connection. The Wuu might not even have been fully specced out on paper back then.

There was a China-only console launched some while back, whatever it's called, in partnership with LG I believe. Maybe this talk refers to that one.
 
Look at the page of this very same thread that I also linked.
http://forum.beyond3d.com/showthread.php?t=60501&page=33

My apologies, gentlemen (got distracted at work); I meant specifically that the ARM9 line of processors was out of the "equation". Nintendo and ARM have indeed grown quite close as business partners, and they will again be providing an applications processor core, a proprietary solution specifically for Nintendo. Marvell will not, as it is not an SoC. I referenced Retro, bgassassin, because their project is for the "broader" demographic. It is not DK, LOZ, nor MP. It is extremely visually impressive, or so I am told. And I love those maniacs; back in my younger days I established the MC threads, along with CVX, iirc.

You've been around for a minute then. :D

And look at what you've done. Your two posts have spurred around 300 posts on this. :LOL:

Thanks for the clarification on who is providing the ARM.
 
Ok, I know it was a bit OT... but did a big chunk of the last page of this thread disappear (the Miiverse discussion)? Or is it just me?
 
Don't know if this has been posted, but this doesn't sound very encouraging. It's from the game director of Darksiders 2.

...
You know, so far the hardware's been on par with what we have with the current generation's. Based on what I understand, the, you know, the resolution and textures and polycounts and all that stuff, we're not going to be doing anything to uprez the game, but we'll take advantage of the controller for sure.
...

This is very, very old news. It was one of the first interviews that mentioned the Wii U. In fact, he was misunderstood: he meant that the Wii U version of Darksiders 2 won't have many graphical advantages over the PS360 version. But of course the Wii U is far more powerful than the PS360.

More up-to-date interviews with game developers:

http://www.develop-online.net/news/42303/The-Wii-U-is-surprisingly-powerful-says-Rayman-dev
http://www.nintendolife.com/news/2012/05/gearbox_wii_u_is_a_powerful_powerful_machine
http://www.polygon.com/gaming/2012/...nts-on-wii-u-will-have-richer-denser-graphics
http://www.notenoughshaders.com/2012/11/03/shinen-mega-interview-harnessing-the-wii-u-power/
 
Probably an accurate translation of the last rumor:


"Another banana to skin..."

Code name: Nadir
Architecture: z/Architecture
# Of Processors: IFL01 derivative + zIIP02 derivative
# Of Cores: 6 (Kernel mode 2 cores)
# Of Threads: 12
Operating frequency (processor 01) : 920MHz
Operating frequency (processor 02): 262MHz
Instruction set: 64 bit
Endianness (?): Big Endian

Floating-point format (IEEE 754): Binary128/Decimal128


Integer execution unit: 5 per core
SMP: 4-way per core
Level 1 (L1) cache: 512kb per core
Level 2 (L2) cache: Associative 2MB, 12-way set per core
Level 3 (L3) cache: 12MB per core (eDRAM)
Memory management unit (MMU): 2
Process Technology: 14 nm SOI

Code name: Zenith

Architecture: Custom
Stream processor: 500
Engine Clock: 710MHz
Texture Fill Rate (billion / sec): 29.44
Size / Type: 512 MB GDDR5 SGRAM
Memory clock: 1GHz
Memory Interface: 512-bit
Memory bandwidth: 153GB/s
Frame buffer: 15936 x 15936 pixels
Frame buffer memory: 1.4GB

ECC memory: Checks for these
API support
OpenGL (ES, SC)
COLLADA
OpenRL
Universal 3D
X3D
http://www.ign.com/boards/threads/wii-u-specs-in-japanese-now-with-less-thread-lag.452746814/page-2
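Quick sanity check on those figures: 29.44 billion texels/sec at a 710 MHz engine clock would imply about 41.5 texture units (29.44 / 0.71 ≈ 41.5), which isn't even a whole number, and a 15936 x 15936 frame buffer at 4 bytes per pixel is only about 1 GB, so the 1.4 GB figure doesn't correspond to any obvious pixel format either. (And z/Architecture, IFLs and zIIPs are IBM mainframe terms, which doesn't exactly inspire confidence.)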
 
Does it qualify as a rumor? It appears to me to be some random collection of someone's hallucinations. How is this connected to the Wii U anyway?
 