Wii U hardware discussion and investigation

A good idea for providing more input to advance the gaming medium? Sure, advancing the medium is always a good thing. However, are we really under the impression that Nintendo is providing meaningful leadership for this industry?

Seriously?

If Sony were implementing it, it would be a different story. They actually do intend to push the medium forward and have done so in the past. Same for MS (though less as a leader and more as a competitor for leadership).

Nintendo at this point are simply trying to figure out a way to survive and stay as profitable as possible.

If that means selling a clip on a finger, great. If it means selling a plastic balance board, great. If it means selling a gamecube with a wiimote (and accessory analog plastic dongle), great.

And see the full list of other plastic goodie add-ons.

But when it comes down to fundamentally improving the interactive medium, Nintendo aren't interested.

It's sad how far they've fallen since their Silicon Graphics partnership trajectory. At one point (not that long ago) Nintendo was a pioneer, not just a profiteer.

Yes, as long as Nintendo opts to bring new gamers to the industry, make great games and find new ways to play games instead of making cinematic games and pushing for bankruptcy so techies can have super powerful consoles, they are not pushing the industry forward.

Don't mistake your personal desires as the only thing that is important for this medium.
 
Yes, as long as Nintendo opts to bring new gamers to the industry, make great games and find new ways to play games instead of making cinematic games and pushing for bankruptcy so techies can have super powerful consoles, they are not pushing the industry forward.

Don't mistake your personal desires as the only thing that is important for this medium.

Bringing new gamers to the medium, I agree with. But those gamers are quickly finding other avenues to invest their time as a direct result of Nintendo's lack of leadership and innovation.

Nothing personal about Nintendo's customers starting to head elsewhere.

That's due not only to their poor hardware choices, but also to their lack of innovation in software. Combined with third-party devs not finding sales success, that leads to a stagnant platform, which now looks set to be replaced with a platform of yesterday's tech and yesterday's interface.



I'm all for innovation. And if that means other aspects of the console are compromised to include those innovations that's fine. As long as it is truly innovative and meaningfully supported by 1st and 3rd party devs. Conversely, I'm not ok with it if the "innovation" is there simply as a distraction to avoid direct comparisons to the competition.

I don't think anyone here would like to see Nintendo go belly up in the chase for top tech. But that doesn't mean they should roll over with ancient tech either.

Again, I fail to see how a 120mm2 GPU would cause them to go bankrupt.
 
Again, I fail to see how a 120mm2 GPU would cause them to go bankrupt.

Not going with GCN is not going to be due to die size, obviously. Stop assuming they can just drop in a new architecture. That's just not how it works. If they haven't targeted that architecture for customization from the get-go, you don't just drag and drop the latest PC architecture in. The teams are going to be separate.
 
http://www.neogaf.com/forum/showpost.php?p=35648257&postcount=13282

No, how can I describe it ... the kits obviously have more RAM than retail (double, to be exact, or that is the goal). The RAM amount that you can use for your application is limited to a lower amount; the problem here is that Nintendo hasn't decided yet (not sure if the new kits or new SDKs will change this), so there is a range of memory possibilities. For now you can say that the application limit is the amount of the retail machine, but it can change in the future (not a big jump, so no crazy theories please).


hm
 
Originally Posted by lherre
No, how can I describe it ... the kits obviously have more RAM than retail (double, to be exact, or that is the goal). The RAM amount that you can use for your application is limited to a lower amount; the problem here is that Nintendo hasn't decided yet (not sure if the new kits or new SDKs will change this), so there is a range of memory possibilities. For now you can say that the application limit is the amount of the retail machine, but it can change in the future (not a big jump, so no crazy theories please).

So basically exactly what I said: Arkam was just talking about the actual amount of RAM the system can use (what lherre calls the application limit) rather than the raw amount in the dev kit. I always figured that; common sense.

People trying to discredit Arkam latched onto a minor, illusory inconsistency.

You'll notice lherre has been subtly backing up Arkam's controversial post.
 
YESSSS

THAT'S WHAT IM TALKING ABOUT !!!!!!!!

http://gonintendo.com/?c=5


The same source that has been supposedly leaking Wii U dev kit details recently is back with more rumored info. Thanks to Dim for the heads up!


- Will have more than 1 GB RAM
- Based on the Rev4 dev kit frame
- Amount of RAM is 'quite surprising', but less than 8 GB
Man this news made my day!


Well, that's the dev kit RAM, but ... that's a good start! Hopefully the source knows that dev kits get more RAM than retail.


Also, for those who still think 1 GB is enough: just look at the NeoGAF posts. Most people hope for 1.5 GB, and that would be fine for the Wii U to not be stagnant long-term. Obviously I'll be surprised if it gets above 1.5 GB, but that would be great. RAM is so important for games; the id Tech 5 engine especially requires a lot of it. The new technologies don't repeat the same textures as tiles, it's all unique art; that's why some are surprised by the need.

You also need to take into account that PC recommended specs assume the pagefile is enabled, which means the RAM figures they list understate the real requirement; games need more RAM these days than those specs suggest.
 
YESSSS
RAM is so important for games; the id Tech 5 engine especially requires a lot of it. The new technologies don't repeat the same textures as tiles, it's all unique art; that's why some are surprised by the need.

Eh?

You do understand how megatexture works, don't you?
 
YESSSS

THAT'S WHAT IM TALKING ABOUT !!!!!!!!

http://gonintendo.com/?c=5

Man this news made my day!

Also, for those who still think 1 GB is enough: just look at the NeoGAF posts. Most people hope for 1.5 GB, and that would be fine for the Wii U to not be stagnant long-term.

Don't get too excited yet.

That source (from NeoGAF) still needs to show something to establish credibility, IMO.

I still expect 1 GB at the least and 1.5 GB at the most.
 
Eh?

You do understand how megatexture works, don't you?


I do, the Carmack way; if we're mixing terms, I'm not to blame for it.

It's one big texture island cut into sectors or pieces; the pieces get streamed into view. With a lot of RAM those pieces can stay in the buffer; on consoles with little RAM it can't all stay resident, which is why Rage has so many texture pop-in issues on consoles (the 360 is mostly fine, the PS3 not quite). Texture pop-in comes down to slow buffers, I/O speeds, HDD loading, and RAM amount.


I also made the OP of this thread: http://www.techpowerup.com/forums/showthread.php?t=150325 - take a look so I don't have to explain it all over again. I've watched every Carmack interview I could find, and QuakeCon 2011 (five times over).
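For anyone following along, here is a minimal sketch of the tile-streaming idea being argued over, assuming a simple LRU residency scheme; every name and constant below is hypothetical, not id Tech 5's actual API.

```python
# Minimal sketch of the virtual texturing / megatexture residency loop.
# Hypothetical names and constants; not id Tech 5's actual API.
from collections import OrderedDict

TILE_SIZE = 128      # texels per tile edge, a typical page size
CACHE_TILES = 1024   # resident-tile budget: the small RAM cache

class TileCache:
    """LRU cache of texture tiles streamed in on demand."""
    def __init__(self, capacity=CACHE_TILES):
        self.tiles = OrderedDict()   # tile_id -> texel data, LRU order
        self.capacity = capacity

    def request(self, tile_id, load_from_disk):
        """Return a tile, streaming it in on a miss and evicting the LRU."""
        if tile_id in self.tiles:
            self.tiles.move_to_end(tile_id)   # mark as most recently used
            return self.tiles[tile_id]
        data = load_from_disk(tile_id)        # slow path: disk seek + decode
        self.tiles[tile_id] = data
        if len(self.tiles) > self.capacity:
            self.tiles.popitem(last=False)    # evict least recently used
        return data

# Each frame, a GPU feedback pass reports which tiles were sampled;
# tiles not yet resident go through the slow path above. Console
# texture pop-in is that slow path losing the race against the camera.
```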
 
Don't get too excited yet.

That source (from NeoGAF) still needs to show something to establish credibility, IMO.

I still expect 1 GB at the least and 1.5 GB at the most.


I saw the NeoGAF post now; gonintendo linked the wrong page, it's 259, not 260.

He does point out that the sources actually do take "the final retail" into account, but a good rumor is obviously better than no rumor.

But again, the sources aren't technical people to begin with :cry:
 
I do, the Carmack way; if we're mixing terms, I'm not to blame for it.
If you're thinking id Tech 5 needs more RAM for its unique texturing, then I'm not sure you do. Read Sebbbi's complete explanation here.

In summary, at 720p ~50 MB is enough for pixel-level texture fidelity, while <10 MB/s transfer speed is needed to keep the tiles updated in a timely fashion. The major issue is seek time when fetching new tiles, which is what you feel more RAM is needed to solve. Well, throw in some decent, cheap flash cache on the drives and RAM requirements can be kept pretty minimal when using VT. Thus even a 512 MB console from any manufacturer could have faultless textures next-gen for those games that can use megatexturing. You'd probably want more so as not to be limited in rendering options. Texturing certainly isn't a reason for wanting loads more RAM though, thanks to new approaches.
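To put rough numbers behind that summary, a back-of-envelope sketch follows; the layer count, bytes per texel, and slack factor are assumptions of mine, not sebbbi's exact accounting.

```python
# Back-of-envelope for a 1:1 texel-per-pixel virtual texture cache.
# Assumed constants: 3 layers (albedo/normal/specular), ~1 byte/texel
# after DXT-style compression, 8x slack for tile granularity, borders,
# mip levels and streaming headroom.

def vt_cache_bytes(width, height, layers=3, bytes_per_texel=1.0, slack=8.0):
    return width * height * layers * bytes_per_texel * slack

for name, (w, h) in {"720p": (1280, 720), "1080p": (1920, 1080)}.items():
    print(f"{name}: ~{vt_cache_bytes(w, h) / 2**20:.0f} MB resident cache")

# Prints ~21 MB for 720p and ~47 MB for 1080p with these constants;
# with extra overheads counted, a real engine lands around the ~50 MB
# figure quoted above. Note the 1080p/720p ratio is exactly 2.25x.
```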
 
If you're thinking id Tech 5 needs more RAM for its unique texturing, then I'm not sure you do. Read Sebbbi's complete explanation here.

In summary, at 720p ~50 MB is enough for pixel-level texture fidelity, while <10 MB/s transfer speed is needed to keep the tiles updated in a timely fashion. The major issue is seek time when fetching new tiles, which is what you feel more RAM is needed to solve. Well, throw in some decent, cheap flash cache on the drives and RAM requirements can be kept pretty minimal when using VT. Thus even a 512 MB console from any manufacturer could have faultless textures next-gen for those games that can use megatexturing. You'd probably want more so as not to be limited in rendering options. Texturing certainly isn't a reason for wanting loads more RAM though, thanks to new approaches.

The explanation is fine, but your opinion on RAM is only valid for the crap-looking game that it is; if you want that, that's subjective.

No thanks. 512 MB is not enough, not at all; the resolution is terribly low.


The game is ~21 GB, heavily compressed; the actual full-quality build is over 100 GB.
 
Stewox, then you have again failed to understand megatexturing.

The framebuffer footprint for the virtual textures is very small (tens of MBs; IIRC id's megatexturing footprint is smaller than the virtual texturing sebbbi and RedLynx lay out). A "normal" implementation, where textures (and their mip levels) are all held in memory, can take hundreds of MBs.

The issues with Rage's textures have NOTHING to do with RAM; the issues are primarily (a) physical storage (compression): more physical storage space would allow more detail; (b) streaming speed; and (c) GPU performance.

I get that you are a big fan ("THAT'S WHAT IM TALKING ABOUT !!!!!!!!"), and what is up with giving a nod to any source that says "less than 8 GB" of memory? No @$@# sherlock; throwing out a crazy bound like that pretty much invalidates a source all by itself. But please, I encourage you to read the links and information the long-time members here have posted before quipping back dogmatically. The flow here at B3D is information exchange and industry understanding, not rah-rah.

More RAM on the Xbox 360/PS3 would not necessarily have addressed the texture fidelity issues in RAGE. The optical drive speeds aren't necessarily killers either (re: hard drive installs), but the optical storage size issues were discussed by id Software years before RAGE released. I encourage you to read up on the issue before making the false association that more RAM would have solved the problem.

PS- I am all for tons of fast memory in next gen consoles.
 
Stewox, I highly suggest you take the time to re-read sebbbi's detailed account of megatexturing and ponder it some more before continuing to post about it here. You are missing the obvious takeaways.
 
The explanation is fine, but your opinion on RAM is only valid for the crap-looking game that it is; if you want that, that's subjective.

No thanks. 512 MB is not enough, not at all; the resolution is terribly low.
Do you mean that the screen resolution is low or that the texture resolution is low in Rage? Virtual texture streaming BW requirements and memory requirements scale pretty much linearly with the screen pixel count. The texture density doesn't matter much (if we are talking about worst case memory and streaming requirements). 720p is 921k pixels. 1080p is 2073k pixels. If next gen console games render at 1080p, you would need 2.25x as much memory to hold the virtual texture tile cache. That's around 50 megabytes of extra memory required to hold the textures (assuming your streaming targets a high quality 1:1 screen-to-texel mapping). It's not impossible with the current 512 megabyte consoles, but would require compromises (unless your game is really simple otherwise).

Extra RAM to hold a bigger virtual texture cache doesn't help the worst case much, assuming that you are already HDD bandwidth bound, and cannot prefetch a larger set of tiles because of that (all prefetching mechanics increase the HDD bandwidth requirement). Larger cache allows tiles to stay in the cache for longer time. Unfortunately a larger cache doesn't help much if the player advances in the game all the time and always sees new texture surfaces instead of revisiting the old ones. Extra memory alone doesn't help the worst case performance, and worst case scenarios are basically the only occasions you see visible artifacts (texture popping). Even for scenarios where the cache will be utilized (player going backwards in level), the extra memory used doesn't offer dramatic gains after certain point. Increasing cache size way beyond working set size will in the end only give logarithmic gains (this result is generic to caches, and not even an "oracle" can manage better).

More HDD bandwidth (for example an SSD) combined with extra memory would allow the developer to prefetch more pages in advance. However, prefetching is not a magic bullet, since most games are interactive, and player actions are often hard to predict in advance. We are basically guessing how the game state proceeds. Often good guesses take lots of CPU resources to calculate (you need to predict/simulate future physics and AI behaviour in advance), so we use simple guesses such as extrapolating camera/character movements using their current positions and movement vectors. This produces pretty good results for near-future predictions (just a few frames ahead of time). The problem is that the worst case scenarios (visible texture popping) often occur because of erratic user behaviour that is hard to predict (the user does something unpredictable that reveals lots of texture surfaces at once). Extra prefetching always considerably increases the amount of memory and bandwidth that is wasted: the more prefetching, the higher the percentage of loaded tiles that are never used. Assuming next gen consoles render at 1080p, that alone would require 2.25x the HDD bandwidth (compared to 720p) just to stream the visible textures (those guaranteed to be used), and nothing more. I don't think we have much extra HDD bandwidth to do speculative prefetching beyond that.

SSD in a future console would change things a lot. Our development PCs all have SSDs, and we have tested virtual texturing extensively on them. Basically an SSD has such fast seek times that virtual texturing requires no prefetching at all. We use the same small 50 MB cache on PC as well, and no matter what you do, you cannot see any texture popping. Physical media seek time is the most important thing for virtual texturing. Physical media bandwidth is the second most important. After those come the CPU and GPU performance, as worst case scenarios require bursts of tiles to be transcoded very quickly (the better the image compression ratio, the slower the algorithm). Tile cache size in memory is of course also important, to a certain point, but after reaching the threshold adding extra is just a waste of memory (and requires more CPU time for management). 50 MB -> 100 MB cache would help, but 100 MB -> 200 MB would only result in minimal gains (so the memory would be better used elsewhere).

I think that we could manage with just 2 GB in the next generation if virtual texturing becomes the norm. Without virtual texturing something like 8 GB would be pretty good, but then again I don't personally want to see increased level loading times. Most current games have way too long loading screens already. More memory = more data needs to be loaded from HDD to fill it up.
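The diminishing-returns point about cache size can be illustrated with a toy simulation; the grid size, view footprint, and movement model below are invented purely for illustration.

```python
# Toy LRU tile-cache simulation: a "player" random-walks over a tile
# grid; we measure the miss rate as cache capacity grows. All constants
# are invented; this only illustrates the shape of the curve.
import random
from collections import OrderedDict

def miss_rate(capacity, steps=5000, grid=128, view=8, seed=1):
    rng = random.Random(seed)
    cache = OrderedDict()       # tile -> True, kept in LRU order
    x = y = grid // 2
    misses = accesses = 0
    for _ in range(steps):
        # Erratic movement, like the hard-to-predict player above.
        x = (x + rng.choice((-1, 0, 1))) % grid
        y = (y + rng.choice((-1, 0, 1))) % grid
        for dx in range(view):              # the view*view visible tiles
            for dy in range(view):
                tile = ((x + dx) % grid, (y + dy) % grid)
                accesses += 1
                if tile in cache:
                    cache.move_to_end(tile)
                else:
                    misses += 1
                    cache[tile] = True
                    if len(cache) > capacity:
                        cache.popitem(last=False)
    return misses / accesses

for capacity in (64, 128, 256, 512, 1024, 2048):
    print(f"cache = {capacity:4d} tiles -> miss rate {miss_rate(capacity):.4f}")
```

With these made-up constants the miss rate falls steeply until the cache covers the working set (the visible view plus recent history) and then flattens out with each doubling; that is the shape of the argument above, not a measurement of any real engine.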
 
Stewox, then you have again failed to understand megatexturing.

The issues with Rage's textures have NOTHING to do with RAM; the issues are primarily (a) physical storage (compression): more physical storage space would allow more detail; (b) streaming speed; and (c) GPU performance.


No, I did not say that; it may seem like I did, but that wasn't my meaning.

Then you go on to say it has "nothing to do with RAM" ... this is silly, since you mention compression yourself. Well, if you want to run uncompressed Rage, that's 100 GB; I don't think you would be able to run that with 512 MB of memory.




Do you mean that the screen resolution is low or that the texture resolution is low in Rage? Virtual texture streaming BW requirements and memory requirements scale pretty much linearly with the screen pixel count. The texture density doesn't matter much (if we are talking about worst case memory and streaming requirements). 720p is 921k pixels. 1080p is 2073k pixels. If next gen console games render at 1080p, you would need 2.25x as much memory to hold the virtual texture tile cache. That's around 50 megabytes of extra memory required to hold the textures (assuming your streaming targets a high quality 1:1 screen-to-texel mapping). It's not impossible with the current 512 megabyte consoles, but would require compromises (unless your game is really simple otherwise).


I don't play console games except Nintendo first-party titles. I was playing Rage on PC, so obviously I wasn't talking about screen resolution.

The only problem which bothers me with your explanation is the seemingly absolute connection between on-screen pixels and a defined memory requirement ... that's silly, since it all depends on texture resolution as well.
If you're scaling Rage's dataset that is valid, but it seems very doubtful it would apply to any game with similar technology; plus we have no idea what id Tech 5 really does, since the SDK is not out yet, and you rely on your own implementation.
I'm sure id Software is not doing everything in the same manner, but that's not an excuse; my point still stands unless you explain that.

The Wii U simply needs more memory just because it's 1080p ... that alone requires 2x more memory at minimum, and if you want to actually make a next-gen console with non-crap texture resolution then you need another 1 GB for that, with virtual texturing on top.

Just ask yourself why Rage's single-player is so sparse on enemies; they rarely send more than five at you at once ...

Still, it looks better than other games, but since I'm a PC guy it's silly to say I was surprised; I was banking on them releasing a super-HD pack for PC, which turned out to fall way outside the 99% chance I predicted.
 
The only problem which bothers me with your explanation is the seemingly absolute connection between on-screen pixels and a defined memory requirement ... that's silly, since it all depends on texture resolution as well.

Nope, you still don't get it.

What do you think they store in the "virtual texture"?
 
The only problem which bothers me with your explanation is the seemingly absolute connection between on-screen pixels and a defined memory requirement ... that's silly, since it all depends on texture resolution as well.
No, it doesn't. If you increase the texture resolution of your megatexture source, such as for close-ups on a face, you increase the texture tilemap size and the number of tiles, but the number of tiles needed in RAM doesn't go up. This is the fundamental concept behind virtual texturing: you need one texel per pixel in RAM (plus the surrounding texels in each tile, of course).

You have an opportunity on B3D to hear from very knowledgeable folk like Sebbbi, who are happy to give up their time to educate us. I suggest you put aside your "I know best" mentality and pay attention, if your intention is to engage in discussion rather than just tell everyone how wrong they are.
 
Pretty simple: you want one texel per pixel, so a 1920×1080 frame only requires that many texels.
It's just not that easy to find the exact set of texels you need, so you have a coarse selection and you need a (relatively) big cache.
AFAIR Rage uses 8k² S3TC texture(s) as the cache. (Plural because there's probably one for albedo/colour, another for normals, and maybe a third for specular+gloss.)

[The obvious problem is loading the data you need quickly enough when it's not in cache, in which case an SSD is pretty much perfect.]
(If any console gets an SSD by default, it could make a massive image-quality difference.)
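For scale, the VRAM cost of those physical cache textures is easy to check; the per-layer compression formats below are guesses, since Rage's actual channel layout isn't specified here.

```python
# Size of 8k^2 S3TC physical cache textures. The format per layer is an
# assumption (Rage's real channel layout isn't given above).
TEXELS = 8192 * 8192          # one 8k x 8k cache texture

DXT1_BPT = 0.5                # S3TC/DXT1: 0.5 bytes per texel
DXT5_BPT = 1.0                # DXT5 (e.g. for normals): 1 byte per texel

albedo   = TEXELS * DXT1_BPT / 2**20    # 32 MB
normals  = TEXELS * DXT5_BPT / 2**20    # 64 MB
specular = TEXELS * DXT1_BPT / 2**20    # 32 MB

print(f"albedo {albedo:.0f} MB + normals {normals:.0f} MB + "
      f"specular {specular:.0f} MB = {albedo + normals + specular:.0f} MB")

# ~128 MB total for three full 8k^2 caches: small compared to keeping
# every unique texture and its whole mip chain resident, which is the
# point of virtual texturing.
```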
 