Wii U hardware discussion and investigation *rename

Status
Not open for further replies.
I think that we could manage with just 2 GB in the next generation if virtual texturing becomes the norm. Without virtual texturing something like 8 GB would be pretty good, but then again I don't personally want to see increased level loading times. Most current games have way too long loading screens already. More memory = more data needs to be loaded from HDD to fill it up.
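To make that trade-off concrete, here is a minimal, hypothetical sketch of why virtual texturing caps RAM use: a small fixed pool of resident pages backs an arbitrarily large virtual texture, and misses stream data in over the least-recently-used slot. The pool size, page ids and eviction policy are all illustrative, not any shipping engine's actual scheme.

```c
#include <stdint.h>

/* Hedged sketch (not any real engine's code): a fixed pool of physical
   pages backs an arbitrarily large virtual texture. Misses stream data
   in over the least-recently-used slot. */
#define POOL_SLOTS 4

typedef struct {
    int32_t  page_id;   /* which virtual page occupies this slot, -1 = empty */
    uint64_t last_use;  /* monotonic counter for LRU eviction */
} Slot;

Slot     vt_pool[POOL_SLOTS];
uint64_t vt_tick;
int      vt_disk_loads;  /* counts simulated HDD/optical streaming reads */

void vt_init(void) {
    for (int i = 0; i < POOL_SLOTS; i++) {
        vt_pool[i].page_id = -1;
        vt_pool[i].last_use = 0;
    }
    vt_tick = 0;
    vt_disk_loads = 0;
}

/* Return the pool slot holding page_id, streaming it in (and evicting the
   LRU page) on a miss. Resident RAM never exceeds POOL_SLOTS pages, no
   matter how large the virtual texture is. */
int vt_request(int32_t page_id) {
    int lru = 0;
    for (int i = 0; i < POOL_SLOTS; i++) {
        if (vt_pool[i].page_id == page_id) {   /* hit: just touch it */
            vt_pool[i].last_use = ++vt_tick;
            return i;
        }
        if (vt_pool[i].last_use < vt_pool[lru].last_use)
            lru = i;
    }
    vt_disk_loads++;                           /* miss: simulated disc read */
    vt_pool[lru].page_id = page_id;
    vt_pool[lru].last_use = ++vt_tick;
    return lru;
}
```

With a pool this small the cache thrashes constantly, of course; the point is only that RAM use is bounded by the pool while latency moves to the streaming path, which is exactly the loading-time trade-off being discussed.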

Maybe this should be another thread (Competing Next Gen Rendering Approaches and Resource (RAM) Projections), but to throw this out: this generation we saw a swing from forward rendering to various deferred rendering approaches. I am not sure how anticipated this was and, looking back, how that knowledge would have changed the 2005/2006 consoles had it been better understood at the time. I mention this because while virtual texturing may address some issues now, and indicate a lower need for memory, I would express some skepticism. Are developers really ready to completely anchor themselves to one technique and stick with 2 GB of memory until 2020? Especially when RAM is cheap, there are so many obvious benefits beyond just storing textures (and nobody says having a lot of memory forces long load times; that is a design issue, as virtual texturing is still an option), and historically memory footprint limits have been a point of tension. Don't get me wrong, I love what devs have done with virtual texturing, but it does appear to impose some limits technologically (just look at RAGE).

Just throwing that out ... now, if it were a toss-up between 2 GB of RAM plus a small fast SSD versus 4 GB of RAM and only an optical drive (and my impression is we would be lucky to get either scenario), the choice becomes easier. Anyhow, just thought I would toss that out and see what you think. If you disagree we will just have to be men and settle it in some online Trials in the coming weeks ;)
 
Couldn't that be a reason to go with split memory pools despite the disadvantages?

Until last week I only had 2 GB of slow DDR2 RAM in my gaming rig, but I never noticed it limiting gaming performance. GPUs, however, love fast memory. So for non-graphics-related stuff you can go with relatively slow memory, but you need the fast stuff for the graphics. In a regular game, how much of the total memory would generally be dedicated to non-graphics things? The cheapest 4 GB of DDR3 RAM I could find is 15 euros, so that is way under 10 for production costs. I don't know how much GDDR5 (or whatever is going to be used in next-gen consoles) costs, but wouldn't it make sense to provide fast memory only for the fraction normally used for graphics (1/3, or whatever the typical split is) and toss in a lot of slower memory for the rest?

As I see it, the system memory doesn't need to be so fast. This way you don't have to worry about OS stuff running in the background eating up your valuable RAM (well, not as much anyway), and you could use it as a cache.
 
Not going with GCN is not going to be due to die size, obviously.

Obviously.

Not going with GCN has to do with Nintendo being short-sighted and naively believing they didn't need to compete on hardware and could wing it with plastic and 4-year-old tech.

Having said that, just because GCN is only hitting the market now does not mean Nintendo had no clue of its existence when contracting AMD for their WiiU GPU.

A perfect example of this situation would be Xenos with its unified shader architecture, which didn't show up in PC cards until a year after the Xbox 360 launched.

The only thing stopping GCN in WiiU is Nintendo's lack of foresight.
 
The only problem that bothers me with your explanation is the seemingly absolute connection between on-screen pixels and the defined size requirement ... that's silly, since it all depends on texture resolution as well.
One more effort to avoid further silliness: maybe this will help you grasp the basic idea behind MegaTexture. (Read that page and the next.)

WRT the Wuu discussion, RAM amount isn't the bottleneck with MegaTexture. The bottleneck is the latency of the transfer from the MT storage medium (disc/HDD) to RAM. An SSD solves that, but whether a next-gen console will have one is a discussion for another thread (which exists).
 
Thanks Pete, I was too lazy to link to that, but I should have, as it is a B3D tech article. And as I remembered, the total MegaTexture texture buffer is even smaller than sebbbi's virtual texturing implementation (not that one is better or worse, just different). I mean, 13.5 MB for color and another 7 MB for specular and, boy oh boy, are we running out of RAM quickly :p
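For anyone wanting to sanity-check figures in that ballpark, the footprint of a virtual-texture physical cache is just width x height x bytes per texel; the inputs below are illustrative round numbers, not Rage's actual configuration.

```c
#include <stdint.h>

/* Rough footprint of a virtual-texture physical page cache:
   bytes = width * height * bytes_per_texel.
   The inputs used here are illustrative, not Rage's real settings. */
uint64_t cache_bytes(uint32_t w, uint32_t h, double bytes_per_texel) {
    return (uint64_t)((double)w * (double)h * bytes_per_texel);
}
```

A hypothetical 4096x4096 cache in DXT1 (0.5 bytes per texel) works out to 8 MiB, the same ballpark as the quoted figures: tiny next to a multi-gigabyte RAM budget.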
 
Obviously.

Not going with GCN has to do with Nintendo being short-sighted and naively believing they didn't need to compete on hardware and could wing it with plastic and 4-year-old tech.

Having said that, just because GCN is only hitting the market now does not mean Nintendo had no clue of its existence when contracting AMD for their WiiU GPU.

A perfect example of this situation would be Xenos with its unified shader architecture, which didn't show up in PC cards until a year after the Xbox 360 launched.

The only thing stopping GCN in WiiU is Nintendo's lack of foresight.

What tech are they going with then?
 
Fine, but idTech5 virtualisation covers more than just terrain textures; it's all of the textures for everything. The landscape is so diverse (no repeating patterns) that it takes a lot of storage.

Carmack said he wants to bring virtualisation to other things such as geometry.

For the RAM thing, surprisingly yes, the exe only used like 800-1600 MB on PC.

Supposedly a derivative of the 4870, which at this point is 4 years old. I really am just shaking my head.

based on the architecture - but we're not sure if that's true

It will obviously get upgraded: custom design, die shrink, and loads of other stuff, so it won't really be that out of date.
 
It will obviously get upgraded: custom design, die shrink, and loads of other stuff, so it won't really be that out of date.

If it were that obvious, why not go with a dx11 card in their dev kits to begin with? Were 6xxx** cards really that hard for Nintendo to come by that they couldn't find enough for their developers?

I find that rather hard to believe.

Custom part? Sure.

Custom part based on the latest tech so as not to be out of date much? Not according to the 4xxx rumor.


(**which would be the closest to representing their final spec if they truly are looking to incorporate GCN)


I hope I'm wrong and Nintendo can find a way to plug a top notch GPU (or even just a competent get-along solution such as HD7770) in their box so as to not be completely irrelevant on the tech side (leaving out that whole 'deeper and wider' part of his speech) as soon as they drop.

I don't want to see Nintendo go bankrupt, or drop out of hardware, but if they keep dropping crap on the hardware side, there's not much reason to support that part of their business, and at that point they'd need to pull a Sega just to survive. I just hope the internal devs can keep up with the outside world and keep enough pride in their work, without hardware, to continue to push forward.

It seemed that as soon as Sega dropped out of hardware, their internal devs lost all inspiration and the company became a shell of what it once was. Not just due to the lack of hardware but, more importantly, on the software side.

So here's to hoping Nintendo has some level of understanding the market they're competing in and producing a competent spec which doesn't target machines which are on the verge of becoming irrelevant.
 
If it were that obvious, why not go with a dx11 card in their dev kits to begin with? Were 6xxx** cards really that hard for Nintendo to come by that they couldn't find enough for their developers?

I find that rather hard to believe.

Custom part? Sure.

Custom part based on the latest tech so as not to be out of date much? Not according to the 4xxx rumor.

If the console is never going to need DX11 compliance and the GPU is a custom solution, why should they spend transistors and die space on something that isn't going to be used?




I don't want to see Nintendo go bankrupt, or drop out of hardware, but if they keep dropping crap on the hardware side, there's not much reason to support that part of their business, and at that point they'd need to pull a Sega just to survive. I just hope the internal devs can keep up with the outside world and keep enough pride in their work, without hardware, to continue to push forward.

It seemed that as soon as Sega dropped out of hardware, their internal devs lost all inspiration and the company became a shell of what it once was. Not just due to the lack of hardware but, more importantly, on the software side.

Are you seriously talking about the "we made more money during this generation than the other two competitors combined" Nintendo?

So you think the company that pretty much reinvented gaming controls and single-handedly brought console gaming to all audiences during this gen is suffering from "loss of inspiration".

Yeah, poor Nintendo. It's a downward spiral they can't control.
 
If the console is never going to need DX11 compliance and the GPU is a custom solution, why should they spend transistors and die space on something that isn't going to be used?

True, if they're not going to use DX11 features then there's no point in producing a chip capable of them. Hence their decision to base the Wii spec on the GC. They have shown no desire to keep up with technology, much less lead. And on this, I agree that is likely where they are headed.

And that's why I think they will be in serious trouble next gen, and WiiU will be their last console as we think of consoles today.


Are you seriously talking about the "we made more money during this generation than the other two competitors combined" Nintendo?

So you think the company that pretty much reinvented gaming controls and single-handedly brought console gaming to all audiences during this gen is suffering from "loss of inspiration".

Yeah, poor Nintendo. It's a downward spiral they can't control.

Just because they made money (a lot of it) with the Wii doesn't mean they will with WiiU (not sure I'd call it next-gen at this point).

Kinect killed the Wii roadmap. Motion gaming is done for them.

So now they're sitting in pseudo-portable land between a DS/iPad and a console, and WiiU has none of the advantages of either. WiiU is essentially Afro Ninja: a ton of confidence at first, and HORRIBLE execution.

And what I was referring to on inspiration is: what happens after they are knocked out of the hardware game? They have already been pretty ho-hum software-wise this gen.
 
Supposedly a derivative of the 4870, which at this point is 4 years old. I really am just shaking my head.

Supposedly a derivative of the HD4xxx in some of the dev kits; all that means for the final GPU is that it's very likely VLIW5-based.

If it were that obvious, why not go with a dx11 card in their dev kits to begin with? Were 6xxx** cards really that hard for Nintendo to come by that they couldn't find enough for their developers?

Maybe because the GPU they went with fits the specs they're going for (number of SPUs, TUs, ROPs, etc.) and the custom part they're developing won't even use DX anyway? The point is this isn't going to be an off-the-shelf part; let's actually wait and see what it is before assuming it's no more modern than an HD4xxx card.
 
True, if they're not going to use DX11 features then there's no point in producing a chip capable of them. Hence their decision to base the Wii spec on the GC. They have shown no desire to keep up with technology, much less lead. And on this, I agree that is likely where they are headed.

And that's why I think they will be in serious trouble next gen, and WiiU will be their last console as we think of consoles today.

They won't be using DirectX, full stop, and the Wii has very little to do with any of this. The Wii used a speed-bumped GC chip; this will be a custom chip designed solely for WiiU.

Also the last sentence there, is, well amazing... :LOL:
 
I'm seriously starting to think that even if WiiU put even the fastest PCs out there to shame, both speed- and tech-wise, TheChefO still wouldn't be happy with it, and Nintendo can't do anything right.
 
DirectX versions indicate a certain level of hardware features. NO console game runs on top of DirectX.
Someone finally pointed this out.
The way console hardware is accessed by developers doesn't have to abide by any specific API; remember that the Wii supported neither DX nor OGL, afaik. It's called low-level access: you can essentially override the API or bypass it completely (by writing machine code directly, of course), and that's how you "suck every bit of performance from the hardware".


So if the hardware supports it, you can technically have hardware tessellation even though the DX version might not support it. To put it in perspective: as long as the GPU hardware supports a feature, you can use it in games no matter what; the use of an API is not enforced.

This is something Carmack is frequently frustrated about: PC hardware just doesn't offer that low-level access, and it's all drivers and API, very unoptimized.

That's why people are surprised by how much can be pulled out of this gen's console hardware.

So WiiU will have support for DX- and OGL-level features, which in the console world only means faster and easier development; however, every console game would totally suck in performance if it weren't for a very simple thing called low-level access.

For PCs, the blame goes to the GPU manufacturers who won't open up that access, as well as the DX monopoly.
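A toy illustration of the difference being described: the "register block" here is just an array so the sketch is runnable, and both paths, all names, and the validation step are entirely hypothetical, not any real driver or console SDK.

```c
#include <stdint.h>

/* Simulated GPU register block. On real hardware this would be a
   memory-mapped I/O region; it's a plain array here so the sketch runs. */
#define NUM_REGS 16
uint32_t gpu_regs[NUM_REGS];
int api_rejected;  /* counts writes the "driver" refused */

/* The API/driver path: validation and bookkeeping sit between the game
   and the hardware. (Real drivers also batch, translate and reorder.) */
void api_set_register(unsigned reg, uint32_t value) {
    if (reg >= NUM_REGS) {  /* driver-style validation */
        api_rejected++;
        return;
    }
    gpu_regs[reg] = value;
}

/* The low-level path consoles allow: the write itself, nothing between.
   Faster, but nothing stops you from scribbling over the wrong register. */
static inline void raw_set_register(unsigned reg, uint32_t value) {
    gpu_regs[reg] = value;
}
```

The point of the post above is not that the API path is broken, but that on a console you can take the second path whenever the first is in the way; a feature the silicon has is reachable even if no API version ever exposed it.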
 
True, if they're not going to use DX11 features then there's no point in producing a chip capable of them. Hence their decision to base the Wii spec on the GC. They have shown no desire to keep up with technology, much less lead. And on this, I agree that is likely where they are headed.

And that's why I think they will be in serious trouble next gen, and WiiU will be their last console as we think of consoles today.




Just because they made money (a lot of it) with the Wii doesn't mean they will with WiiU (not sure I'd call it next-gen at this point).

Kinect killed the Wii roadmap. Motion gaming is done for them.

So now they're sitting in pseudo-portable land between a DS/iPad and a console, and WiiU has none of the advantages of either. WiiU is essentially Afro Ninja: a ton of confidence at first, and HORRIBLE execution.

And what I was referring to on inspiration is: what happens after they are knocked out of the hardware game? They have already been pretty ho-hum software-wise this gen.

"If Nintendo don't do what I want them to do, they're doomed."

:LOL:
 
Which process is it on? I used to think 40nm, but that gives a performance/power disadvantage and thus a slower GPU within the small WiiU power budget. Then again, maybe 40nm is cheap and highly available.

If it uses VLIW5 or VLIW4, why not; it's what is in current and future AMD APUs.
I expect low specs: if it's VLIW5, maybe 240 SPs; if it uses the Radeon 6970 architecture, then 256 SPs; both with 64-bit GDDR5.

That's more than good enough. For instance, I've seen a Sandy Bridge laptop with a renamed Radeon 5450 with DDR3. It's pretty good at running games at 768p; we tried Far Cry 2 at default settings. It's like an order of magnitude better than stuff like the X300 SE we had a few years ago.
The unobtainable Radeon 6450 with GDDR5 is twice as fast; something a bit above that compares well with the PS360, I guess.
 