Wii U hardware discussion and investigation

It's a tricky situation to follow (for me at any rate!).

The 55nm UX7LSeD is listed on the website, as is the UX8LD at 40nm, which is described as still being in development. The UX8 process is listed on the website, but not the UX8GD.

Further complicating matters is the UX7LSeD spec sheet from 2007 that I've just opened up, which reckons up to 768 Mb and 1 GHz is doable (http://www.datasheetarchive.com/UX7LSeD-datasheet.html#). But that's probably old data.

Edit: That PDF also makes it look like NEC were working on a 45 nm eDRAM process??
 
NEC issued a press release regarding their 40nm eDRAM back in November 2007. It also addresses the 45nm issue:
NEC Electronics and Toshiba Corp. are co-developing a 45-nm process, as part of a previously-announced partnership. NEC is said to be tweaking the process and will roll out its own version, which will be a 40-nm offering.

Like the 55-nm process, NEC Electronics has also incorporated an embedded DRAM or eDRAM technology into the mix. In fact, the company has a pair of eDRAM offerings: UX8GD and UX8LD.

The UX8GD eDRAM technology boasts clock speeds up to 800-MHz and low operating power, making it ideal for digital video cameras, game consoles and other consumer applications.

Meanwhile, the UX8LD eDRAM technology features low leakage-current levels that reduce power consumption by as much as two-thirds compared to equivalent SRAM in the marketplace. This makes it ideal for use in mobile handsets and other portable devices that require low standby power, according to NEC Electronics (Kawasaki, Japan).
http://www.eetimes.com/General/PrintView/4100382
 
Yeah, I'd found the press releases about UX8GD and UX8LD from a few years back. It just seemed strange that while they're happy to talk about UX8 on their website, talk about various eDRAM products (even giving out spec sheets for UX7LSeD), and even list UX8LD as still in development, UX8GD isn't mentioned anywhere on their website. Even Google can't find a single reference to the code anywhere on their site, at all.

And that just seemed strange because it was a product they'd trumpeted years earlier alongside another form of 40nm eDRAM that actually has made it onto the website. It seems inconsistent.
 
There's little reason to have much about a technology on the site when there are only very few potential customers. It's exactly the same with Macronix: The ROMs Nintendo uses in 3DS carts are not mentioned anywhere on their website either, because there's simply no mass market for 4GB ROMs.

And don't forget that Renesas basically left the high end semiconductor field with the transition to 32nm. Why even continue trumpeting?
 
There's little reason to have much about a technology on the site when there are only very few potential customers.

Well, not having much about the technology would make sense, but not listing it at all, even in the press centre, just seems odd.

It's exactly the same with Macronix: The ROMs Nintendo uses in 3DS carts are not mentioned anywhere on their website either, because there's simply no mass market for 4GB ROMs.

In the sense that there's no mass market it's probably the same, but the cost of developing a 40 nm eDRAM-capable manufacturing line seems to be a much more significant investment. I just thought they would have mentioned it in the last couple of years, especially if it's now delivering huge volumes of reasonably sized processors. Even IBM aren't doing that on anything smaller than 45nm yet, afaik.

And don't forget that Renesas basically left the high end semiconductor field with the transition to 32nm. Why even continue trumpeting?

Well, they're advertising the low-power 40 nm eDRAM even though it's still "in development". Maybe with them leaving the high end, that'll never make it to availability. I would have thought there's a bigger market for low-power eDRAM than high-performance, though, so maybe it still will. Or maybe it is available (along with UX8GD) and the website just needs updating (even big corporations can have pretty shocking website policies).
 
Anybody reading about what's going on at NeoGAF? They paid $200 to get the Wii U GPU x-rayed. Will this give us confirmation of whether it's weaker or stronger than current gen?
 
Just to clarify, they aren't paying to have it X-rayed; they just pooled some money to purchase the images from Chipworks. The money has been collected (in just a matter of hours), and they're just waiting for verification of funds before the purchase is made. Should be interesting to hear what they find.
 
Assuming Chipworks takes good photos and the architecture is similar to what we already know about (it will be), then we should be able to get at least a straight Jigglyfloppage number out, plus shader and other base element counts.
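
To put a rough number on what counting those base elements gets you, here's a minimal sketch, assuming an AMD VLIW5-style layout (16 units of 5 ALU lanes per SIMD block, one MADD = 2 FLOPs per lane per clock). The block count and clock below are placeholders, not measured Wii U values:

```python
# Minimal sketch: turning die-shot counts into a raw FLOPS figure.
# Assumes an AMD VLIW5-style layout; block count and clock are hypothetical.

def shaders_from_die(simd_blocks: int, units_per_block: int = 16, lanes_per_unit: int = 5) -> int:
    """Total ALU lanes ("shaders") implied by the visible SIMD blocks."""
    return simd_blocks * units_per_block * lanes_per_unit

def peak_gflops(shader_lanes: int, clock_mhz: float) -> float:
    """Peak single-precision GFLOPS at 2 FLOPs (one MADD) per lane per clock."""
    return shader_lanes * 2 * clock_mhz / 1000.0

lanes = shaders_from_die(simd_blocks=4)    # e.g. 4 visible SIMD blocks -> 320 lanes
print(lanes, peak_gflops(lanes, 550))      # 320 lanes, 352.0 GFLOPS (hypothetical)
```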
 
...Or something like that. :) I'm no super expert on this sort of thing.

I think your expertise is well enough:) So basically, GPUs take framebuffer bw penalties when a 2x2 block doesn't have full coverage, or when 4 2x2 blocks cannot be dispatched to 4 different chips in a single clock. If I'm not mistaken it is the same approach a single DDR chip already uses right?
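
As a toy model of that partial-coverage penalty (numbers purely illustrative, not tied to any specific GPU): the rasterizer works in 2x2 quads, so a quad with only one or two covered pixels still costs a full quad of shading and framebuffer traffic:

```python
# Toy model of the 2x2 quad penalty: a quad with only 1-3 covered pixels
# still costs a full quad of work/bandwidth. Numbers are illustrative only.

def quad_efficiency(covered_pixels_per_quad: list[int]) -> float:
    """Fraction of shaded/written pixels that are actually visible."""
    useful = sum(covered_pixels_per_quad)
    paid_for = 4 * len(covered_pixels_per_quad)   # every touched quad costs 4 pixels
    return useful / paid_for

print(quad_efficiency([4, 4, 4, 4]))      # 1.0  -> interior of a large triangle
print(quad_efficiency([1, 2, 1, 2, 1]))   # 0.35 -> sliver triangle, heavy penalty
```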

The write gathering you mentioned seems to be a good solution and avoids multiple writes to a single chip. At first I thought it would be quite expensive (using high-speed scheduling and all), but setting it up as a reluctant cache with a write buffer on top of that might already help.
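
For what it's worth, here's a tiny sketch of the write-gathering idea, just to illustrate the concept: pixel writes that fall into the same burst-sized line are gathered and flushed as one wide transaction. The burst size and line count are assumptions for illustration, not how any of these chips actually implement it:

```python
# Sketch of write gathering: instead of sending each pixel write straight to
# memory, gather writes in the same burst-sized line and flush the line once.
# Purely illustrative; real hardware uses a small CAM, not a Python dict.

BURST_BYTES = 32   # assumed burst/line size

class WriteGatherBuffer:
    def __init__(self, max_lines: int = 8):
        self.lines = {}           # line address -> {offset: byte}
        self.max_lines = max_lines
        self.flushes = 0

    def write(self, addr: int, value: int):
        line = addr // BURST_BYTES
        if line not in self.lines and len(self.lines) >= self.max_lines:
            self.flush_one()
        self.lines.setdefault(line, {})[addr % BURST_BYTES] = value

    def flush_one(self):
        # Evict one gathered line as a single wide transaction.
        self.lines.pop(next(iter(self.lines)))
        self.flushes += 1

buf = WriteGatherBuffer()
for px in range(16):                # 16 pixel writes, 4 bytes apart
    buf.write(px * 4, px)
print(buf.flushes, len(buf.lines))  # 0 flushes so far, 2 gathered lines
```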

Another question that disturbs my mind: in what way do the shading units interact? Is it like all of the units rendering as many pixels as possible, performing a single operation per pixel per clock? Or is it more like the shaders are divided equally among the ROPs and set up as a pipeline, so that it has lots of latency but renders a complete shading program in a single cycle? In the latter case the fillrate wouldn't be influenced by shading operations. What is plausible? (BTW, sorry for being off-topic.)

You have to look at 5xxx series mobile binned parts on 40nm to find 400 shader GPUs that might fit into the Wii U power envelope. And even with 8 ROPs they could outperform the 360.
IMO AMD would be perfectly capable of instructing Nintendo on how to build a GPU that is affordable and effective in 2012, so god knows what they chose in the end. If they want to attract hardcore gamers, they'd better have; if I were an Xbox fanboy who might be seduced by a next-gen Wii, I'd certainly wait another two years for a 720 with these kinds of specs being speculated about! I agree on the ROP thing, but given the small clock-speed increase, the additional performance (if it's there at all) should come mainly from faster shading in order to achieve higher fillrates.
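
A quick back-of-envelope comparison for that "400 shaders, 8 ROPs" scenario shows why the gain would come from shading rather than raw fillrate. The 400-SP figures are pure assumptions; only the Xenos reference numbers (240 ALUs, 8 ROPs, 500 MHz) are the commonly cited ones:

```python
# Hedged comparison under assumed numbers: shading roughly doubles while
# fillrate barely moves if the ROP count stays at 8.

def gflops(alus, mhz):
    return alus * 2 * mhz / 1000      # 1 MADD per ALU per clock

def gpix_s(rops, mhz):
    return rops * mhz / 1000          # 1 pixel per ROP per clock

for name, alus, rops, mhz in [("Xenos (360)", 240, 8, 500),
                              ("hypothetical 400-SP part", 400, 8, 550)]:
    print(f"{name:26s} {gflops(alus, mhz):6.0f} GFLOPS  {gpix_s(rops, mhz):4.1f} Gpix/s")
# Xenos (360)                    240 GFLOPS   4.0 Gpix/s
# hypothetical 400-SP part       440 GFLOPS   4.4 Gpix/s
```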

Chances are that either the GPU is BW limited in some way or it doesn't actually have a lot of extra grunt. Or both.
Well, we shouldn't be disappointed if that were the case. :)

ERP; said:
because tiling it reduces the efficiency of the texture cache which is optimized for the swizzled case.
Mmm, good point. I'd say this is no issue for GC and Wii, since the data is converted when exported to main memory. Don't know if the same applies to the 360? Then again, the bandwidth required for the transfer might be the larger cost. Agree with the alignment thing.
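
For anyone following along, a swizzled layout just means something like Morton/Z-order: the x and y bits are interleaved so a small 2D footprint maps to consecutive addresses, which is what the texture cache is tuned for. A generic sketch, not the specific swizzle any of these GPUs actually use:

```python
# Small illustration of a "swizzled" (Morton / Z-order) texture layout:
# spatially close texels land at nearby linear addresses.

def morton_offset(x: int, y: int, bits: int = 16) -> int:
    """Interleave the bits of x and y into one linear offset."""
    out = 0
    for i in range(bits):
        out |= ((x >> i) & 1) << (2 * i)
        out |= ((y >> i) & 1) << (2 * i + 1)
    return out

# A 2x2 texel footprint maps to 4 consecutive offsets, whereas in a plain
# row-major layout the two rows would sit a full texture pitch apart.
print([morton_offset(x, y) for y in (0, 1) for x in (0, 1)])   # [0, 1, 2, 3]
```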

jlippo; said:
Also mipmaps are needed to get any decent performance from cache.
From a TMU's perspective it is; each mip level halves the resolution, so there's less data to read. But in the case of anisotropic filtering, or when the LOD is in between two mip levels, it results in additional external bandwidth requirements when both textures aren't in cache yet (you always need to read 1 byte, and theoretically 0.25 byte for the next LOD).
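
Quick check on that 1 + 0.25 figure, assuming roughly 1 byte per texel (e.g. block-compressed) and ideal mipmapping where each coarser level holds a quarter of the texels of the one below it:

```python
# Worst-case external texture traffic per screen pixel under the stated
# assumptions: ~1 new texel from the current mip level, plus a quarter of a
# texel from the next coarser level when blending between two levels.

def external_bytes_per_screen_pixel(bytes_per_texel: float = 1.0, trilinear: bool = True) -> float:
    base = bytes_per_texel              # ~1 texel of new data at LOD n
    next_lod = bytes_per_texel / 4.0    # the coarser level covers 4x the area
    return base + (next_lod if trilinear else 0.0)

print(external_bytes_per_screen_pixel())                 # 1.25 bytes/pixel worst case
print(external_bytes_per_screen_pixel(trilinear=False))  # 1.0 bytes/pixel
```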

Just to clarify, they aren't paying to have it X-rayed; they just pooled some money to purchase the images from Chipworks. The money has been collected (in just a matter of hours), and they're just waiting for verification of funds before the purchase is made. Should be interesting to hear what they find.
Interesting! Makes me kind of wonder why Chipworks made x-ray pics in the first place... I assume it's not because there's a large market for GPU circuitry posters!
 
Chipworks breaks down and analyzes chips for companies that want to know more about the technologies being used in them.
This is for people that stand to make money from this knowledge or use it for research, such as investors, and potentially other tech companies.

I don't have the cash to burn for something like that, although I wonder if Chipworks has restrictions on disseminating the information found in a purchase.
I'd think it would be frowned upon if a forum bought something they've reported and then tried spamming the internet with it.
The tools and work involved are frequently very expensive and very intensive.
 
They certainly have such restrictions in place. We'll see what happens...
 
What a great idea, I wouldn't mind giving 5 bucks to find out how weak the Wii U actually is and finally end fanboy dreams of 500 GFLOP GPUs and hardware underutilized by MP titles.
 
I hope someone will just post the images for everyone to see. Doesn't look like that's happening, though.

Take it with a grain of salt...
 
What a great idea, I wouldn't mind giving 5 bucks to find out how weak the Wii U actually is and finally end fanboy dreams of 500 GFLOP GPUs and hardware underutilized by MP titles.

You can always adapt your definition of weak to its actual performance. Why would you care about any die pics coming out anyways? $200 is nothing, anybody can order them.
 
Reading the NeoGAF Wii U tech thread, some are convinced it's more powerful than current gen. Why is it so different here? Fanboy speculation?
 
Because quite frankly, most of those folks are delusional cheerleaders for Nintendo opting to live in their own version of the world.
 
I think that's fair to say. Some are so obsessed with the "500 GFLOPS or more" talk and are utterly convinced that the Wii U has some super secret tech that'll make it comparable to Durango.

"Oh but it technically has the capability to do DX11 effects just like the other new consoles! Oh but it has GPGPU! Oh but it also has embedded ram!"

My view is, we would actually be seeing that substantial increase in GPU power on screen, even in half-baked ports, if it were there.

Even if the GPU is superior to the 360's and PS3's, it's A) bottlenecked by the other design choices, B) not much stronger anyway, or C) both.

I've always thought that the Wii U's GPU would be in the 300-350 GFLOP range. It's stronger than the 360's and PS3's 250 GFLOPS, I'm sure, just because it's based on a (more) modern architecture by default, and a GPU under 250 GFLOPS would probably be more expensive than just going with something like a cut-down 4650.
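
As a sanity check on that 300-350 GFLOP guess, here are the shader-count/clock combinations that land in the window under the usual 2 FLOPs per ALU per clock assumption. Every row here is speculation, not a confirmed configuration:

```python
# Which ALU count / clock pairs land in a 300-350 GFLOP window, assuming
# 2 FLOPs (one MADD) per ALU per clock. All combinations are hypothetical.

def gflops(alus: int, mhz: int) -> float:
    return alus * 2 * mhz / 1000.0

for alus in (240, 320, 400):
    for mhz in (500, 550, 600):
        g = gflops(alus, mhz)
        tag = "  <-- in the 300-350 range" if 300 <= g <= 350 else ""
        print(f"{alus} ALUs @ {mhz} MHz: {g:5.0f} GFLOPS{tag}")
```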
 