Fact: Nintendo to release HD console + controllers with built-in screen late 2012

It's the Radeon 4600 series, so that's 320 SPs, 32 TMUs, 8 ROPs. On 40nm, the equivalent desktop GPU would be akin to the 55xx/56xx series, which sports 400 SPs.

Anyways, you're all so keen on cutting down clocks of an rv770 class GPU. Clocks and power envelopes are a part of the equation, but these can always be tweaked. Do consider that rv770 is inherently a much larger chip than rv730. That is going to be an inherent factor in the BOM and will weigh some amount in what Nintendo wants to achieve with the WiiU. The rv770 is a gigantic leap in shading and pixel throughput power relative to Xenos. It's not just "a little". It's not even "ballpark".
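To put rough numbers on that leap, here's a back-of-the-envelope peak-MAD comparison. The clocks below are just the commonly cited desktop/console reference clocks, not anything confirmed for the WiiU, and peak FLOPS obviously isn't the whole story:

```python
# Rough peak shader throughput: ALU lanes * 2 FLOPs (MAD) per clock.
# Clocks are the usual reference figures, nothing WiiU-specific.
def peak_gflops(alu_lanes, clock_mhz):
    return alu_lanes * 2 * clock_mhz / 1000.0

print(peak_gflops(240, 500))  # Xenos: 48 vec4+1 ALUs = 240 lanes -> ~240 GFLOPS
print(peak_gflops(320, 750))  # rv730 at HD 4670 clocks           -> ~480 GFLOPS
print(peak_gflops(800, 625))  # rv770 at HD 4850 clocks           -> ~1000 GFLOPS
```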

At what point does lowering the clocks of an rv770 outweigh using a higher clocked low end part that is also a smaller chip (cheaper to manufacture)? You won't get more than double the clock, but what is good enough and how much do you expect Nintendo to spend? Will they have enough bandwidth to support rv770-like pixel throughput? How much will they spend for the memory? How many RAM chips fit inside the WiiU chassis? How many did they fit within the Wii? Was there much room left-over? How inexpensive do you believe GDDR5 2Gbit @ highest speed is? Does GDDR5 even make sense if they go with another edram for framebuffer usage? How much edram makes sense for them? How big will that chip end up being and what is its cost?

I agree also, and thanks for the insightful post AlStrong.
 
I agree. Furthermore, I doubt any manufacturer's first option when designing a console is to take a custom chip and simply reduce speeds to meet a power envelope. Slowing down a GPU means reducing performance. A more efficient option is shaving off silicon, which reduces power requirements and performance while also reducing chip size and the overall cost of each GPU.

Downclocking a GPU may make sense when you are talking about off-the-shelf parts and the smaller chips don't offer the performance you seek. It may also make sense where a reference design doesn't perform within expectations and a redesign isn't an option. But we are talking about a custom GPU, which gives Nintendo greater design flexibility, and simple downclocking is a wasteful option in comparison to using a smaller, cheaper part, which can be clocked to fall within the required power envelope and still perform on the same level as a downclocked 4850.


Downclocking is pretty normal in consoles, I think, simply for heat. You're not going to push the bleeding MHz edge like a dedicated PC card can.

Obvious example is Xenos+RSX clocked at 500MHz, when there were similar PC GPUs already doing 650MHz at the time.
 
My dream would be the RV770 clocked at 750 MHz on 28nm process, but that just probably isn't going to happen. We'll probably be lucky to see 2x the performance of Xenos at this point, but we'll wait and see how things shape up.
 
Downclocking is pretty normal in consoles, I think, simply for heat. You're not going to push the bleeding MHz edge like a dedicated PC card can.

Obvious example is Xenos+RSX clocked at 500MHz, when there were similar PC GPUs already doing 650MHz at the time.
The chosen clock rates were more because of yield than heat.
 
The chosen clock rates were more because of yield than heat.

How do you know that?

Anyways it seems pretty obvious: high end PC cards can be and often are loud, hot, power-sucking, etc., with huge fans; they can just generally be bad citizens much more so than a console. They get to live in a more open space as well.

If you get a high end card in next consoles, it won't be at the bleeding clock edge.
 
How do you know that?

Anyways it seems pretty obvious: high end PC cards can be and often are loud, hot, power-sucking, etc., with huge fans; they can just generally be bad citizens much more so than a console. They get to live in a more open space as well.

If you get a high end card in next consoles, it won't be at the bleeding clock edge.
You never get a high end chip in consoles. A high end architecture maybe, but not high end clocks because those chips cost too much.

Since the thread has been talking about rv770, consider that when it launched you had the 4870 and 4850. A console that launched at that time would have chosen the 4850, because the manufacturer wouldn't want to throw away half of the functional chips because they couldn't run at the targeted clock speed.

The fact that these lower clocked cards run cooler is a benefit, but not the primary reason to use them.
 
ALStrong,
Laptops don't overheat if they have been made properly.

I have had a 15 inch gaming laptop (same GPU as the Nvidia 9800) that never had overheating problems, and it was a 2009 model, only one year after the desktop part of the same speed came out.

The GPU had a TDP of 65W, while something like the Radeon HD 6950M (which should be as fast or faster than a high end RV770) only has a TDP of 50W!
50 Watts TDP should fit into the Wii U no problem.

Note that this is 100% from a tech standpoint.
From a price standpoint I think Nintendo does not want to spend the money for it and has wasted the money on that gimmicky controller instead.
 
ALStrong,
Laptops don't overheat if they have been made properly.

If... if... if... So what are they doing wrong now with the reports of the early units overheating? The other thing I'd like to point out is that we haven't even begun to consider the TDP for the CPU.

never had overheating problems
What were the ambient conditions? How long were you playing at 100% load? Can you say the same for every household and weather condition? Stuffing it in a home theater setup behind the window? I wish it were so easy as your single example. I don't have problems with my laptop in the winter. I sure as hell can if the house jumps to 30C and I want to play for several hours. I know when to quit when I can feel the bottom burning my finger. How loud was your laptop as well? Is that acceptable for a console or much less, the WiiU? How much was the laptop? How big of a heatsink are they going to need (i.e. extra costs, extra weight, extra shipping weight...)? Can it be done for under $300?

I don't mean to be snarky with these questions, but these are important considerations that go beyond just looking at "oh another device can do it IF there's enough money thrown at it, IF it has much sturdier construction, IF it's not playing games 100% of the time, IF we just reduce clock speeds, IF we ignore that there are hugely different ambient temperatures that people can game in, IF we ignore that Nintendo is quite possibly wasting its money producing a larger chip and just clocking it down.......".

And that's it. I'm done with the thread.
 
How loud was your laptop as well? Is that acceptable for a console or much less, the WiiU?
Considering how ridiculously loud the disc drive was in the XB360, I don't think having relatively loud cooling in the first (couple of) HW iterations is out of the question.
 
I remember the stink about Nvidia-chipped laptops self-destructing due to thermal issues a couple of years back.

I wouldn't want a console whose case gets as physically hot as my laptop does under full, prolonged load. Or that gets as noisy!
 
The point is that it is similarly spec'd on shaders, TMUs, ROPs, and is a 40nm rendition of what rv770 was.
Juniper is DX11, so of course it packs more.

So you'll agree that a 40nm shrink of a RV770 @ 500MHz would consume as much as, if not less than a Juniper @ 500MHz? And a Juniper @ 500MHz pulls ~22W.



Power consumption is still power consumption...

Power efficient in what sense? Idle clocks? Fine. But we're talking about something that's going to be full load.
Overheating at how many hours of operation?

The power numbers I've been mentioning are full load at continuous operation.
Unless you think "TDP" might be somehow related to "idle" or "load during 1m30s"...



Who says the GPU is the only contributor? There is a 45nm CPU packed away in there too. What sort of cooling is in there? Too many assumptions.
The discussion has been about the GPU, i.e. whether it could be possible to put a 25-30W GPU inside that case. If you add another 25-30W CPU, odds are that it would still be possible.

Pics have shown active cooling with a ~40mm fan. With rumours of overheating consoles and new units arriving this month, much may have changed in the meanwhile.



Anyways, you're all so keen on cutting down clocks of an rv770 class GPU. Clocks and power envelopes are a part of the equation, but these can always be tweaked. Do consider that rv770 is inherently a much larger chip than rv730. That is going to be an inherent factor in the BOM and will weigh some amount in what Nintendo wants to achieve with the WiiU.
(...)

At what point does lowering the clocks of an rv770 outweigh using a higher clocked low end part that is also a smaller chip (cheaper to manufacture)? You won't get more than double the clock, but what is good enough and how much do you expect Nintendo to spend? Will they have enough bandwidth to support rv770-like pixel throughput?

You should also take into account that power consumption doesn't scale linearly with clock speeds or transistor count.
Looking at the mobile Evergreen versions, the 500MHz Juniper (800-40-16) with GDDR3 consumes less than the 650MHz Redwood (400-20-8), also with GDDR3.

So there's a really high chance that a ~800MHz RV730 would consume a lot more power than a ~400MHz RV770 while the performance would be about the same.
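As a very crude illustration of that trend (dynamic power roughly ~ C·V²·f): the base watts below are the commonly quoted desktop board TDPs for the HD 4850 and HD 4670, and the voltages are purely illustrative assumptions, so treat this as a sketch rather than real figures.

```python
# Crude dynamic-power scaling: P ~ C * V^2 * f.
# Base watts are the commonly quoted desktop board TDPs (HD 4850 ~110W,
# HD 4670 ~59W); the voltages are illustrative assumptions only.
def scaled_power(base_w, f_new, f_old, v_new, v_old):
    return base_w * (f_new / f_old) * (v_new / v_old) ** 2

# rv770-class part taken from 625MHz down to ~400MHz with a small voltage drop:
print(scaled_power(110, 400, 625, 1.00, 1.20))  # -> ~49W
# rv730-class part pushed from 750MHz up to ~800MHz, needing a voltage bump:
print(scaled_power(59, 800, 750, 1.30, 1.25))   # -> ~68W
```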





How much will they spend for the memory? How many RAM chips fit inside the WiiU chassis? How many did they fit within the Wii? Was there much room left-over? How inexpensive do you believe GDDR5 2Gbit @ highest speed is? Does GDDR5 even make sense if they go with another edram for framebuffer usage? How much edram makes sense for them? How big will that chip end up being and what is its cost?

Yes, of course those (and a lot more) factors come into it.
Still, I stand by my opinion that, given the current rumours and info, it should be a reasonably down-clocked RV770.

BTW, I don't really know why you're mentioning it'll use GDDR5. For all I know, a 256bit DDR3 UMA could be perfectly possible. The extra PCB layers would come in handy for a supposedly small PCB.



If you can't bin the parts (i.e. Nintendo has no avenue to sell or use them), you aim low because any part that doesn't meet spec is useless.
No one's stopping a console from using a GPU or a CPU with a few disabled units for redundancy, like we've seen with the PS3.
 
How much will they spend for the memory? How many RAM chips fit inside the WiiU chassis? How many did they fit within the Wii? Was there much room left-over? How inexpensive do you believe GDDR5 2Gbit @ highest speed is? Does GDDR5 even make sense if they go with another edram for framebuffer usage? How much edram makes sense for them? How big will that chip end up being and what is its cost?

I think the eDRAM is for the CPU; it should improve the latency problems over the PS360 CPUs, even more so if they use GDDR5.

On the amount of RAM, GDDR5 is today's norm and probably the cheapest of all the RAM options going forward, so it makes sense to get 1GB (pretty usual in cheap cards too); latency wouldn't be a problem either. It's the best buy in the mid-term, after all.

On a 128-bit bus at today's speeds it would give ~60GB/s of bandwidth, which should be more than enough IMO.
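The ~60GB/s follows from simple arithmetic (bus width times effective data rate); the data rates below are just typical speeds of the era, not anything known about the WiiU:

```python
# Peak bandwidth in GB/s = (bus width in bits / 8) * effective data rate in Gbps.
def peak_bw_gbs(bus_bits, data_rate_gbps):
    return (bus_bits / 8) * data_rate_gbps

print(peak_bw_gbs(128, 3.6))  # 128-bit GDDR5 @ 3.6Gbps -> 57.6 GB/s
print(peak_bw_gbs(128, 4.0))  # 128-bit GDDR5 @ 4.0Gbps -> 64.0 GB/s
print(peak_bw_gbs(256, 1.6))  # the 256-bit DDR3-1600 UMA idea upthread -> 51.2 GB/s
```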
 
How do you know that?

Anyways it seems pretty obvious: high end PC cards can be and often are loud, hot, power-sucking, etc., with huge fans; they can just generally be bad citizens much more so than a console. They get to live in a more open space as well.

If you get a high end card in next consoles, it won't be at the bleeding clock edge.

Because it would be poor design otherwise.

Yes, you have high end PC cards that are clocked higher than what you find in most consoles. But those off-the-shelf high end cards are unfit for use in consoles because of the power consumption, the huge power envelope needed, and the inability of those chips to be manufactured in volume. If you are designing a console, the worst thing you can do is simply take one of those chips and significantly downclock it to meet your needs.

All you are doing is reducing performance for the sake of heat when, more than likely, there exists silicon that's smaller (thus cheaper), falls within your power envelope, and meets the same performance characteristics. It ends up being a waste of silicon that you still have to pay to manufacture.

You also have to remember that in late 2005 most ATI cards ran between 500 and 600MHz. The highest end cards were mostly cherry-picked GPUs that could run higher than normal, simply paired with faster memory in comparison to lesser cards with the same GPU clocked slower. I doubt it was very feasible to produce a high yield, high volume 90nm ATI GPU that clocked significantly faster than 500MHz in 2005.
 
This is more problematic for Nintendo:
http://www.gamesindustry.biz/articles/2011-07-20-iwatas-views-no-reflection-of-what-consumers-want

In his keynote address, Iwata claimed that craftsmanship in game development was dying due to the abundance of smaller, less expensive games on new digital platforms

Speaking at the Develop conference, Baverstock, who co-founded Kuju Entertainment, vehemently disagreed, saying that these new platforms have expanded the traditional skill-set required to make games.

The rise of Facebook, iOS and Steam have shifted the emphasis away from pleasing retail partners – something that Nintendo's business is built upon - and given developers more control over what they create and who they create for.

"I just don't agree," he said. "This lack of craftsmanship is really a reflection of Nintendo's point of view – they are completely obsessed with retail, and have been very successful in that."

However, by ceding so much power to their retail partners, the platform holders have led the industry towards a "narrow distribution pipe, with huge inventory risks and huge inventory costs." Baverstock believes that retail buyers don't make decisions based on craft or quality, but on who has "the biggest sign" at E3.
 
My Asus A42JA (14 inch laptop): i5-460, Mobility Radeon HD 5730/6500 series (that's what the ATI CCC says, weirdly).
Playing BFBC2 = after almost 1 hour the laptop overheats and hibernates itself.
Playing all other games = noisy; CPU maxes out at about 101C, GPU about 70C in some games, 90C in others.

That's in South-East Asia, where the everyday temperature is hot (about 35C at noon) and there's a lot of dust.

So if the new Wii has slower specs than that, in a box like the one displayed at E3, maybe the heat can be no problem with proper cooling. But the fan can still be loud if the console is stored in a tight space.
 