Fact: Nintendo to release HD console + controllers with built-in screen late 2012

If the Stream\Feel has anything close to an RV770 in shader+TMU+ROP count, then I'll be very surprised.
 
If the Stream\Feel has anything close to an RV770 in shader+TMU+ROP count, then I'll be very surprised.

I don't think it's out of the question. It was an upper mid-range GPU in 2008. It's low-end by 2012. We'll see in about 3 weeks. I know you got your hopes dashed when Sega didn't go with Lockheed Martin Real3D tech for their successor to the Saturn, but there's no reason to be overly pessimistic.
 
I wonder about IGN's article... wouldn't all that be far too power hungry and hot for a console?

Tri-core CPU, similar to XCPU but clocked OVER 3.4 GHz? :oops:

It'd have to be at 28nm or lower right? Otherwise it'd sound like a jet engine wouldn't it?
 
When the Saturn came out, I was too young to care about what was inside it :D


My only doubts about the RV770 are because it'd have at least 4x the GPU power of the other two, making it way more powerful rather than just "similar", as most rumours put it.

That article from IGN is a bit stupid, yes, but they haven't been wrong about specs, and they're claiming their sources confirm it's an RV770.

Of course, we're assuming IGN actually knows what an HD4850\RV770 is.
 
I wonder about IGN's article... wouldn't all that be far too power hungry and hot for a console?

Tri-core CPU, similar to XCPU but clocked OVER 3.4 GHz? :oops:

It'd have to be at 28nm or lower right? Otherwise it'd sound like a jet engine wouldn't it?

A Tri-core CPU means nothing. Neither does clock speed. The XCPU itself runs on an extremely low power budget. The Wii 2 could easily afford quite a bit more performance without spending a huge amount of power.
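To put some very rough numbers on that (all figures below are illustrative placeholders, not leaked specs): dynamic CPU power scales roughly with capacitance x voltage^2 x frequency, so a design shrunk a process node or two can clock higher and still draw far less.

```python
# Rough illustration only: dynamic power ~ capacitance * voltage^2 * frequency.
# The capacitance factors, voltages and clocks below are made-up placeholders,
# not actual XCPU or Wii 2 figures.

def dynamic_power(cap_rel, volts, freq_ghz):
    """Relative dynamic power in arbitrary units."""
    return cap_rel * volts ** 2 * freq_ghz

p_old = dynamic_power(cap_rel=1.0, volts=1.2, freq_ghz=3.2)  # hypothetical tri-core on an older process
p_new = dynamic_power(cap_rel=0.5, volts=1.0, freq_ghz=3.5)  # same design shrunk a node or two, clocked higher

print(f"relative power, old process:    {p_old:.2f}")
print(f"relative power, shrunk process: {p_new:.2f}")
print(f"shrunk chip draws roughly {p_new / p_old:.0%} of the original")
```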
 
I mean, if it truly is 4850 level with 1GB of RAM (although again, a 4850 seems almost overkill for 1GB of total RAM...), that's impressive, but I'm still skeptical.

If it is true, my guess is that it'll be a 4850 with a 128-bit bus and GDDR5, which is basically a lower-clocked 4870 with a 128-bit bus, much like RSX compared to its GeForce counterpart. It's a very decent machine IMO.

It can do proper 720p @ 60 fps in all PS360 console ports rather easily, or 1080p @ 30 fps. But 1080p @ 60 fps will be a challenge, in my experience.
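To put rough numbers on both points above (stock desktop clocks assumed; whatever clocks the console would actually use are pure guesswork on my part):

```python
# Back-of-the-envelope figures; desktop stock clocks assumed, console clocks unknown.

# Memory bandwidth in GB/s = (bus width in bytes) * effective transfer rate in GT/s.
hd4850_256bit_gddr3 = (256 / 8) * 1.986   # ~63.6 GB/s (993 MHz GDDR3)
hd4870_256bit_gddr5 = (256 / 8) * 3.6     # ~115.2 GB/s (900 MHz GDDR5)
gddr5_128bit        = (128 / 8) * 3.6     # ~57.6 GB/s, close to a stock 4850

print(f"HD 4850, 256-bit GDDR3:   {hd4850_256bit_gddr3:6.1f} GB/s")
print(f"HD 4870, 256-bit GDDR5:   {hd4870_256bit_gddr5:6.1f} GB/s")
print(f"128-bit GDDR5 @ 3.6 GT/s: {gddr5_128bit:6.1f} GB/s")

# Pixel throughput: why 1080p @ 60 fps is the hard target.
def pixel_rate(width, height, fps):
    return width * height * fps

print(f"1080p60 needs {pixel_rate(1920, 1080, 60) / pixel_rate(1280, 720, 60):.2f}x the pixel rate of 720p60")
print(f"1080p60 needs {pixel_rate(1920, 1080, 60) / pixel_rate(1920, 1080, 30):.2f}x the pixel rate of 1080p30")
```

So a 128-bit GDDR5 setup would land in roughly the same bandwidth ballpark as a retail 4850, which is why the comparison holds.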

I really think a proper next-gen system needs something with the equivalent performance of an X-fire 6990 setup with 8 GB of memory, paired with a decent multi-core CPU, to give the WOW factor that can triumph over the current gen. I don't think the current crop of games at ultra detail settings @ 1080p with a high amount of AA will do it for the core gamers; they'll just be happy with their PS360, while those that want to play games at those settings will most likely be PC gamers anyway. I doubt the current Wii crowd will upgrade because of graphics either; they didn't care when the Wii came out, and I doubt they care now.

But I think Nintendo is not going to try to win gamers over with just graphics, but with some other gimmick like they did with the Wii. I think what that gimmick will be is more interesting to discuss than the Wii 2's specs.
 
A Tri-core CPU means nothing. Neither does clock speed. The XCPU itself runs on an extremely low power budget. The Wii 2 could easily afford quite a bit more performance without spending a huge amount of power.

You're saying that because it's the *architecture* that matters, not the clock speed, right?
 
If it is true, my guess is that it'll be a 4850 with a 128-bit bus and GDDR5, which is basically a lower-clocked 4870 with a 128-bit bus, much like RSX compared to its GeForce counterpart. It's a very decent machine IMO.

It can do proper 720p @ 60 fps in all PS360 console ports rather easily, or 1080p @ 30 fps. But 1080p @ 60 fps will be a challenge, in my experience.
I'd say it all depends on the rest of the machine; if Xenos & RSX can run "close-to-HD" resolutions at 30 FPS, anything even near 4850 speeds shouldn't have any problems running at 1080p 60 FPS.
I mean, a 4850 should be around 5x+ faster than Xenos or RSX in pretty much everything, if not more.
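For reference, the standard published peak figures for RV770 (HD 4850) versus Xenos look like this; RSX is usually quoted in the same ballpark as Xenos or a bit behind, and none of this captures efficiency differences between the architectures:

```python
# Commonly quoted peak specs; real-world gaps vary a lot by workload.
hd4850_gflops = 800 * 2 * 0.625    # 800 SPs * 2 flops (MADD) * 0.625 GHz = 1000 GFLOPS
xenos_gflops  = 48 * 5 * 2 * 0.5   # 48 vec4+scalar ALUs * 5 lanes * 2 flops * 0.5 GHz = 240 GFLOPS

hd4850_gtexels = 40 * 0.625        # 40 TMUs * 625 MHz = 25 GTexels/s
xenos_gtexels  = 16 * 0.5          # 16 TMUs * 500 MHz = 8 GTexels/s

print(f"ALU:       {hd4850_gflops:.0f} vs {xenos_gflops:.0f} GFLOPS   -> {hd4850_gflops / xenos_gflops:.1f}x")
print(f"Texturing: {hd4850_gtexels:.0f} vs {xenos_gtexels:.0f} GTexels/s -> {hd4850_gtexels / xenos_gtexels:.1f}x")
```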
 
That said, we saw an era where one console had 2x as much RAM as the leader (Xbox vs. PS2) and often it wasn't taken advantage of in ports.

It mattered in the games that defined the Xbox, I bet. More RAM makes development easier and eases up streaming for large-world games. Even if the Wii 2 is "port ready", we could see Wii 2 versions that reflect more of what we see in PC versions of multiplatform games. I've already advocated a semi-two-tier approach where the PS3 and 360 games are at parity with each other, with the PC and Wii 2 versions a bit beyond. A Radeon 4670 would be hugely hampered by only 512 MB of total RAM.
 
However, if you are rendering complex 3D geometry and have a high amount of overdraw, then EDRAM surely helps a lot. And that's why it's there in the first place: why write something to main memory that soon gets overwritten? EDRAM is a great thing to have, and it's especially great for deferred renderers, since they write more data per pixel, and that's where the EDRAM bandwidth helps a lot.

Simple example. 1280x720p screen, 24f8 depth, double 16fx4 g-buffers (pretty common deferred setup), 4 x average scene overdraw (pretty common overdraw factor for complex scenes):

with EDRAM:
1. Render geometry to g-buffers = no main memory BW used (everything stays on chip)
2. Resolve g-buffers to main memory = (4+8+8)*1280*720 = 18.4MB of memory writes

without EDRAM:
1. Render geometry to g-buffers = 4*(4+8+8)*1280*720 = 73.7MB of memory writes. Also the z-buffering needs to read z-values four times for each pixel = 4*4*1280*720 = 14.7MB of memory reads in addition to writes.

With EDRAM the memory traffic was 18.4MB; without it, it was 88.4MB. EDRAM helps deferred rendering a lot.

I was just thinking about this quote in the context of the NES 6. If they go to EDRAM again as a buffer, like they did with the GameCube and Wii, then perhaps they could get away with using much cheaper / more power-efficient RAM alongside a portion of EDRAM? That should relieve a lot of the bandwidth pressure they would face otherwise, especially if the GPU and CPU share the same memory interface.
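For anyone who wants to poke at the quoted numbers, here's a quick sketch that reproduces them, using the same assumptions as the quote (1280x720, 4-byte depth/stencil, two 8-byte g-buffers, 4x overdraw, "MB" meaning 10^6 bytes):

```python
# Reproduces the quoted back-of-the-envelope EDRAM example.
PIXELS   = 1280 * 720
DEPTH    = 4        # 24-bit depth + 8-bit stencil
GBUFFERS = 8 + 8    # two 16Fx4 render targets
OVERDRAW = 4        # average scene overdraw
MB       = 1e6      # the quoted figures use decimal megabytes

# With EDRAM the geometry pass stays on-chip; only the final resolve hits main memory.
with_edram = (DEPTH + GBUFFERS) * PIXELS / MB

# Without EDRAM every overdrawn pixel writes depth + g-buffers to main memory,
# and z-testing also reads the stored depth back for each of those pixels.
writes  = OVERDRAW * (DEPTH + GBUFFERS) * PIXELS / MB
z_reads = OVERDRAW * DEPTH * PIXELS / MB

print(f"with EDRAM:    {with_edram:.1f} MB of main-memory traffic per frame")
print(f"without EDRAM: {writes:.1f} MB writes + {z_reads:.1f} MB reads per frame")
print(f"at 60 fps that's ~{with_edram * 60 / 1000:.1f} vs ~{(writes + z_reads) * 60 / 1000:.1f} GB/s just for the g-buffer pass")
```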
 
I'd say it all depends on the rest of the machine; if Xenos & RSX can run "close-to-HD" resolutions at 30 FPS, anything even near 4850 speeds shouldn't have any problems running at 1080p 60 FPS.
I mean, a 4850 should be around 5x+ faster than Xenos or RSX in pretty much everything, if not more.

Well, maybe if it was a good port and not a cash-in port. But on PC I just couldn't get games like SFIV, Lost Planet and DMC4 to hold a steady 60 fps at 1080p with the 4850 I have. It will get to 60 fps, but it dips all over the place. It's good at 720p + AA/AF though.
 
Well, maybe if it was a good port and not a cash-in port. But on PC I just couldn't get games like SFIV, Lost Planet and DMC4 to hold a steady 60 fps at 1080p with the 4850 I have. It will get to 60 fps, but it dips all over the place. It's good at 720p + AA/AF though.

I'm not sure about the other two, but Lost Planet at least runs at quite a bit higher settings than the console versions even without accounting for the higher resolution, so it's not directly comparable.

But yeah, a better-optimised console port would do even better than the PC versions. That said, I think all three of those games are quite well optimised on the PC.
 
But yeah, a better-optimised console port would do even better than the PC versions. That said, I think all three of those games are quite well optimised on the PC.

There's a significant difference between PC-optimised and console-optimised. Look at how well a 7900 GTX does in a PC versus something weaker in a console; I would suspect the same would apply to the NES 6 and whatever equivalent PC hardware they're using.
 
There's a significant difference between PC-optimised and console-optimised. Look at how well a 7900 GTX does in a PC versus something weaker in a console; I would suspect the same would apply to the NES 6 and whatever equivalent PC hardware they're using.

Yeah I agree. I was just pointing out that as PC ports go, those games aren't bad examples. Capcom tend to be one of the better studios for PC ports.
 
Is there a way to judge how much performance gain could be had if, say, Crysis was optimized for a closed platform based on the build IGN made up a few days ago? Could the Wii 2 run that at 1080p?
 
Regardless of the performance, it would be better for gaming if this thing comes out sooner rather than later and draws sales away from the other consoles, forcing more competition and innovation.

Frankly, to leave it to Sony and Microsoft to advance the next gen almost 10 years later is untenable.

You would think Nintendo has a better chance if they try to put out a system which will be competitive for at least 5 years instead of say just 2 years.

The IGN podcast made the point that Nintendo's gimmicks just don't last beyond a week or so. They weren't referring to the motion controllers (though that too has run its course) but to some of the other social features in the handhelds.
 
Strap-on cameras on the arm that only allow arm tracking are overkill. Unless you need pixel-perfect resolution to recreate something like playing a piano, a camera set back can give enough detail for major hand changes (open/closed), which'll be enough for pretty much every game, I'm sure.

Motion capture of hands and fingers in particular is still an unresolved issue for many reasons, especially with any optical method. Even the magnetic and mechanical approaches aren't good enough; only full gloves work to some extent, and even those are only okay if the virtual hand is an exact copy of the user's.
 