Fact: Nintendo to release HD console + controllers with built-in screen late 2012

RudeCurve, so you're suggesting that people working at Nintendo Technology Development for example are just sitting on their asses collecting money for doing nothing?
They're hardware focused R&D wing of Nintendo.
 
I'm pretty sure they're not doing nothing; they research the different technologies that are available so they can make good choices about which to incorporate into whatever device they plan to bring to market.
 
I think it's highly unlikely to see a single-chip CPU+GPU in the first iteration of the console, considering both are coming from different companies.

Developers have confirmed that the specs aren't final yet, there's a new prototype with different specs coming this month, and until E3, developers were working with underclocked versions.

Furthermore, IGN's sources (supposedly developers) claimed the GPU "will feature a tweaked design but a similar speed to the HD 4850".
That's 800 VLIW5 shaders, 40 TMUs, 16 ROPs @ 625MHz and 63.55GB/s memory bandwidth.
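That bandwidth figure follows directly from the HD 4850's reference memory configuration (256-bit GDDR3 at 993 MHz, i.e. 1986 MT/s effective); a quick sanity check, assuming those reference clocks:

```python
# Memory bandwidth = bus width in bytes * effective transfer rate
# Reference HD 4850 memory: 256-bit GDDR3 at 993 MHz (DDR -> 1986 MT/s)
bus_width_bits = 256
effective_mtps = 1986  # million transfers per second

bandwidth_gbs = (bus_width_bits / 8) * effective_mtps / 1000
print(f"{bandwidth_gbs:.2f} GB/s")  # -> 63.55 GB/s
```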

I'm more inclined to believe the GPU has the same number of units as a RV770, but the earliest SDKs had the GPU underclocked to ~350MHz, with the final version having something closer to 500->600MHz.

So now you are believing the Wii U GPU is based on the RV770?
 
As for the rest of your posts, I see that you've been thoroughly corrected regarding your DX7 vs DX8 Flipper claims...

Maybe if you actually read the posts, you'll reach a different conclusion?

So now you are believing the Wii U GPU is based on the RV770?
Adding up all the RV770 rumours, IGN's "emulated Wii U", developers from Colonial Marines claiming it'll have better visuals, statements about overheating consoles, demos at E3 being done with underclocked hardware, etc., I think that's the most logical conclusion for the time being.

A console wouldn't be likely to overheat with an RV730 in it, that's for sure.

Developers not claiming it'll be faster is also like a safeguard for them (not to mention all of them seem to be clearly NDA'ed). If they don't say the system is capable of more, people won't expect them to do more for the Wii U.

Besides, the Wii U could be a lot more powerful in "our eyes", and that still wouldn't be enough to show drastically better graphics, especially considering the PS360's performance potential is now being squeezed very hard.

Take Crysis 2, for example. If you run it with a HD4890 1GB on a PC, it won't be able to show drastically better graphics than the X360.
It'll do 1080p, but you'll only be able to run at medium-high settings, in DX9 mode.
There'd be a difference if you did a side-by-side comparison, but it wouldn't be an "OMG, it's on a whole different level!!" difference.
That said, take a Wii U with a 500MHz RV770. You now have low-level optimizations, but a 40% slower GPU, so the comparison maybe isn't too far off.
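The "40% slower" figure is just clock scaling: the HD 4890 in the example runs the same 800-shader layout (RV790) at 850 MHz, so dropping to a hypothetical 500 MHz part loses roughly 40% of throughput, all else equal:

```python
hd4890_clock = 850  # MHz (RV790: same 800-shader layout as RV770)
wiiu_clock = 500    # MHz, the speculated clock in this scenario

slowdown = 1 - wiiu_clock / hd4890_clock
print(f"{slowdown:.0%} slower")  # -> 41% slower
```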

In order to achieve a generational leap in graphics, you'd need a >8x more powerful system, as we saw with Xbox -> X360 and PS2 -> PS3. It's obvious the Wii U won't be that much more powerful than the other two.
 
RV770 would be 3-4X as powerful as PS360. At least, depending on clocks.
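That multiplier can be reproduced from peak shader ALU throughput (ALUs × 2 FLOPs per MADD × clock), using the thread's own convention of counting Xenos as ~240 scalar ALUs; a rough sketch only, since it ignores the Vec4+scalar vs VLIW5 efficiency differences debated below:

```python
def peak_gflops(alus, clock_mhz):
    # Peak = ALUs * 2 FLOPs (one multiply-add) per clock * clock rate
    return alus * 2 * clock_mhz / 1000

xenos = peak_gflops(240, 500)       # 360's Xenos at 500 MHz
rv770_low = peak_gflops(800, 500)   # RV770 at a conservative 500 MHz
rv770_4850 = peak_gflops(800, 625)  # RV770 at HD 4850 clocks

print(xenos)              # 240.0 GFLOPS
print(rv770_low / xenos)  # ~3.3x
print(rv770_4850 / xenos) # ~4.2x
```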

It also uses a 256-bit bus, another non-starter. If you are very very very very very lucky it could be a RV740...

The Crysis 2 4890 vs 360 analogy is bad just because of the vast difference in optimization on console vs PC.

Also, the 4890 is my personal card; it runs Crysis 2 at 1080p highest settings, looking a lot better than the 360. Even with those caveats.

An RV770 would be the true half-generation step I'm not expecting to see from the Wii U.
 
Yet an RV730 is still significantly more powerful than the GPUs we have in the 360/PS3, so developers would eventually be able to provide better-looking games than the current HD consoles. I guess we'll just have to wait for more information to leak out.
 
It also uses a 256-bit bus, another non-starter. If you are very very very very very lucky it could be a RV740...
And what would stop AMD from adapting a 128-bit bus to a RV770?


The Crysis 2 4890 vs 360 analogy is bad just because of the vast difference in optimization on console vs PC.
I mentioned the optimization difference, but I also mentioned the clocks should be a lot lower (850MHz vs ~500MHz) decreasing that difference.




Also, the 4890 is my personal card; it runs Crysis 2 at 1080p highest settings, looking a lot better than the 360. Even with those caveats.

Crysis 2 DX9 "highest settings" != Crysis 2 highest settings. Besides, I doubt you've ever made a side-by-side comparison:
http://www.youtube.com/watch?v=7Gj6fgnMvQw
 
Crysis 2 DX9 "highest settings" != Crysis 2 highest settings. Besides, I doubt you've ever made a side-by-side comparison:
http://www.youtube.com/watch?v=7Gj6fgnMvQw

Bear in mind those are also no longer the DX9 highest settings. Still looks quite a bit better to me, though.

Certainly the Wii U should be capable of that with proper optimisation.

Games built from the ground up for its power, though, should look quite a bit better, purely because that power will be spent where it will be noticed the most, while everything above low in Crysis 2 is quite an inefficient use of power. Funnily enough, it's the Ultra settings that have the biggest visual upgrade. Extreme -> Ultra is a bigger jump than Gamer -> Extreme IMO.
 
"When we got the new kits there were some things in the old build that wouldn't work with the new hardware and we had to wait for updates," Donald added. "So it's been a little tricky in that regard."

This means the HW is indeed quite different from a 4x00, no?
 
Or it means their (Nintendo's) software is spotty and they suffered regressions that needed to be fixed.
 
And what would stop AMD from adapting a 128-bit bus to a RV770?

I mentioned the optimization difference, but I also mentioned the clocks should be a lot lower (850MHz vs ~500MHz) decreasing that difference.

Crysis 2 DX9 "highest settings" != Crysis 2 highest settings. Besides, I doubt you've ever made a side-by-side comparison:
http://www.youtube.com/watch?v=7Gj6fgnMvQw

Well, I consistently use the benchmark that Xenos = ~240 SPs. In that case, 800 SPs for RV770 is >3X even at the same clocks. The "3-4X" comes from the fact an RV770 could be clocked up to 800 MHz (at which point you could be pushing nearly 6X Xenos power...).

I'm talking about Crysis 2 on the "hardcore" settings, which were the highest available until literally a couple of weeks ago, when the DX11 add-on came out, which isn't part of the stock game.

I actually have played the game on both platforms; have you, on either? :rolleyes: The old MP PC demo in that YouTube video didn't even have all the settings, and MP is not a place to compare. Not to mention the 1080p vs (less than) 720p part, which is double the pixels and kind of a big deal.

A 128-bit bus on an RV770 would just be more dev costs, when the natural idea would be to just use the RV740, which already has a 128-bit bus and only steps down a little to 640 shaders.

But I'm sure it's an RV730. Which should be well better than PS360 anyway (I use the 320 SPs vs 240 on Xenos, but I'm forgetting the RV730 is over 500 million transistors, so it should be more capable than I'm allowing for) UNLESS Nintendo nukes it with low clocks (talking something like 350 MHz), which is quite possible knowing Nintendo and that small form factor.

BTW, was this posted? It's deemed "old" on GAF but I found it interesting:

http://www.digitalspy.com/gaming/news/a330677/wii-u-development-has-been-tricky-says-vigil.html

"[Wii U] will be at least as powerful [as PS3], if not more, but honestly we don't really know because the hardware has been changing a lot," Donald said.

"We just got the generation two dev kits and there's no release date for the Wii U, so we don't know how long the hardware development process is going to go on for, when they're going to stop and what they're ultimately going to be happy with. So it has provided some instability when working on it."

I think that about sums up the likely knowledge of Wii U hardware we currently have.

Already it does not sound like Nintendo is being the smartest in dealing with devs, though, if they are complaining of difficulties due to a moving target. Possibly it can't be helped.

Ahh, megadrive posted it just above me; I didn't see it.
 
Well, I consistently use the benchmark that Xenos = ~240 SPs. In that case, 800 SPs for RV770 is >3X even at the same clocks. The "3-4X" comes from the fact an RV770 could be clocked up to 800 MHz (at which point you could be pushing nearly 6X Xenos power...).
(...)
But I'm sure it's an RV730. Which should be well better than PS360 anyway (I use the 320 SPs vs 240 on Xenos, but I'm forgetting the RV730 is over 500 million transistors, so it should be more capable than I'm allowing for)

Saying Xenos shaders = RV770 shaders is like saying nVidia Tesla shaders = RV770 shaders. They're not comparable.
One is Vec4+scalar, the other is VLIW5. It's been said several times that VLIW5 shaders are more efficient than Vec4+scalar (they clearly take more transistors per shader, too).

You've actually kind of reached that conclusion by yourself, since Xenos has a lower transistor count than the RV630/RV635 (120 VLIW5 shaders, 8 TMUs, 4 ROPs), even with the EDRAM included.


A 128-bit bus on an RV770 would just be more dev costs, when the natural idea would be to just use the RV740, which already has a 128-bit bus and only steps down a little to 640 shaders.

"More dev costs" than what?
Look for "HD4730" in the interwebs.


But I'm sure it's an RV730.
That's a weird certainty, since all the rumours that have mentioned an ATI codename so far have claimed it to be a RV770.


I actually have played the game on both platforms; have you, on either?
Yes, and I stand by what I said. There's a difference, but it won't blow you away, and it's certainly not a generational leap between them.
You could say so with the DX11 version with everything maxed out and the high-res texture pack, but not with the DX9 mode.
 
The HD 4730 I did not know of, but it has 640 shaders and a 128-bit bus anyway, so in essence it is a RV740. Probably a scrap part with defective functional units. Looks like it has a few more transistors too, so it would be a negative to use it instead of just a leaner RV730. I see it also only has 8 ROPs; I don't think that's enough. If you had this much power it would be pointless to target 720p anyway.

So you're basically replying with an oddball RV770 that's actually specced worse than an RV740. Kudos?

About Crysis 2, OK, we can sort of agree, I guess. We can agree C2 on a 4890 rig is going to be much better than the 360, and we can agree it's not a generational gap, perhaps other than the resolution. But there's still the optimization thing, which is the huge factor.

As for these shader unit comparisons and so on, I may be a noob, but in the past I've usually been pretty darn close with these armchair functional-unit comparisons, and I wager I'll be again. I used to do it with Xenos vs RSX and projected them as close to equals before those consoles came out, and that turned out correct...

The 514M transistors in the RV730, yes, give it a nice edge on PS360 (which is another reason I think it's the reasonable choice). I have to wonder how many are wrapped up in DX10-type stuff, though, and how much that can improve a console's graphics? This is where I have a total failing in tech knowledge... but, for example, it's how my "lean and mean" 4890 can run Crysis 2 in DX9 mode almost as well as some AMD GPUs with nearly twice the transistors and higher unit counts. For example, it's fairly close to an HD6870, which has 1.7 billion transistors and 1120 SPs, versus ~1B and 800... and sure, stepping up to DX11 is nice extra icing on the cake, but is it worth almost 2X the transistors? Not in the case of C2, anyway...

If I were you and I were getting an RV730, with devs struggling to articulate that there's any large difference from PS360 so far, and given the enclosure, I'd be worried that Nintendo might try to clock it below 500 MHz, which might bring it closer to parity with Xenos/RSX. I might be totally off base there; 500 MHz might already be one super ice-cool-running RV730/770, whatever you think it is...
 
For example, it's fairly close to an HD6870, which has 1.7 billion transistors and 1120 SPs, versus ~1B and 800... and sure, stepping up to DX11 is nice extra icing on the cake, but is it worth almost 2X the transistors? Not in the case of C2, anyway...


You decide

http://www.gametrailers.com/video/directx11-pc-crysis-2/716871


Personally it comes down to the price; the things I have seen from tessellation in demos are certainly cool and sometimes quite noticeable effects IMO, so I would like to see that in any next-gen console if the cost isn't too much.

But I agree the DX9-to-DX11 difference in Crysis 2 in-game is almost (if at all) unnoticeable... But I believe tessellation could make a difference in some kinds of games, or even give us some new kinds of visuals, like real-time transformations or waves and the like, that could lead to interesting new gameplay elements.
 
The HD 4730 I did not know of, but it has 640 shaders and a 128-bit bus anyway, so in essence it is a RV740. Probably a scrap part with defective functional units. Looks like it has a few more transistors too, so it would be a negative to use it instead of just a leaner RV730.

It's a RV770. Shaders are laser-cut, along with the respective TMUs, but it's a RV770 "in essence", not a RV740.


I see it also only has 8 ROPs; I don't think that's enough.

It has half the original ROPs because the memory channels are cut in half (2×64-bit as opposed to 4×64-bit in a full RV770).
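In other words, ROP count on these parts tracks 64-bit memory channels; a minimal sketch of that scaling (the 4-ROPs-per-channel ratio is the full RV770's, assumed here to hold for the cut-down variants):

```python
def rops_for_bus(bus_bits, channel_bits=64, rops_per_channel=4):
    # Each 64-bit memory channel feeds one quad of ROPs on RV770-class chips
    return (bus_bits // channel_bits) * rops_per_channel

print(rops_for_bus(256))  # full RV770 (HD 4850/4870): 16 ROPs
print(rops_for_bus(128))  # HD 4730 salvage part: 8 ROPs
```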


So you're basically replying with an oddball RV770 that's actually specced worse than an RV740. Kudos?

No, the RV770 "LE" reference was just to show you that making a RV770 with a 128-bit bus is easily attainable, unlike your suggestion that it'd take "more dev costs".


As for these shader unit comparisons and so on, I may be a noob, but in the past I've usually been pretty darn close with these armchair functional-unit comparisons, and I wager I'll be again. I used to do it with Xenos vs RSX and projected them as close to equals before those consoles came out, and that turned out correct...

"Correct", you say?
:oops:
 
I still don't understand why you'd want to use a salvage part versus an RV740 for the same specs (minus 8 ROPs). It would be inefficient. Don't you see that, in that case, you'd be better off with an RV740?
 
I still don't understand why you'd want to use a salvage part versus an RV740 for the same specs (minus 8 ROPs). It would be inefficient. Don't you see that, in that case, you'd be better off with an RV740?

It probably won't be an exact "off-the-shelf" part, either.
Rumours claim "based on" and "similar performance to" a RV770, and IGN's "Wii U emulator" article says it was similar to a Radeon HD4850 (625MHz core, 256-bit GDDR3).
RV730 and RV740 were never mentioned outside this thread.

That said, it's more likely to be a RV770 with 128-bit GDDR5 or even 256-bit DDR3 than anything else. It has nothing to do with what I want.

Even with a 128-bit bus, there could still be 16 ROPs if AMD re-arranged the memory channels to 4×32-bit instead of 4×64-bit. Or maybe 8 ROPs along with some eDRAM (like the X360) could be considered more efficient overall.

Either way, even though the chances for it to be an "off-the-shelf" part are slim, rumours point to having the number of functional units equivalent to a RV770.
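For scale, both bus options mentioned above land in the same bandwidth ballpark as the HD 4850's 63.55 GB/s; the GDDR5 and DDR3 transfer rates below are illustrative assumptions, not leaked specs:

```python
def bandwidth_gbs(bus_bits, mtps):
    # bus width in bytes * effective transfer rate (MT/s) -> GB/s
    return bus_bits / 8 * mtps / 1000

print(bandwidth_gbs(256, 1986))  # HD 4850 reference GDDR3 -> ~63.6 GB/s
print(bandwidth_gbs(128, 3600))  # 128-bit GDDR5 @ 900 MHz (assumed) -> 57.6 GB/s
print(bandwidth_gbs(256, 1600))  # 256-bit DDR3-1600 (assumed) -> 51.2 GB/s
```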
 
Because the mainstream media and NeoGAF don't understand just how powerful an HD4850 would be compared to the current consoles. They just think "oh, it's an old PC card so it can't be very good" or something.

Not realizing it's massively more powerful than what's in PS360. Massively. 3-4X at least, depending on clocks. On the order of half as powerful as current high-end AMD PC cards. And I think we could agree that if the Xbox 720 launched tomorrow, it probably wouldn't match said high-end AMD PC cards.

The IGN article where they built a PC to supposedly emulate Wii U was a joke. I'm completely disregarding it. They basically concluded stuff like "it'll run call of duty with slightly better AA according to our tests!". When a 4850 based console would blow PS360 out of the water.

Maybe IGN got the conclusion right, for all the wrong reasons, heh.

The only possible tidbit of information to come out of it that I tried to include in my thinking was: hey, it seems some dev told IGN the Wii U has something like a 4850 in it, and that counts for something, even though the article was so vague that even that can't be assumed. But I've disregarded even that in the face of plenty of other evidence in the other direction, such as Miyamoto's own comments, the footage we've seen, etc.
 