Predict: The Next Generation Console Tech

What do you think about the Tim Sweeney presentation in which he claims to be moving to a software rasterizer in UE4? He seems to imply huge bandwidth would be needed. However, I suspect that a software rendering path might be designed in order to exploit data locality.
Rather than thinking of EDRAM as it was implemented in Xbox 360, I believe we should discuss whether future graphics techniques justify fast main memory, or whether they will be better able to exploit data locality. In the latter case, an ultra-fast local memory to assist a not-so-fast main memory would probably be quite effective in terms of both performance and performance/cost compared with particularly expensive main memory. I am no graphics expert and I would like to hear your opinions about this, but I suspect that lack of data locality is not something inherent to rendering, but rather related to the current rasterization approach.
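
To make that concrete, here is a very rough sketch of the kind of thing I mean (sizes and structure are just placeholders, not a real renderer): a tiled software rasterizer keeps one small screen tile's working set in fast local memory, and slow main memory only sees a single write-out per tile.

[code]
// Very rough sketch (placeholder sizes/types): a tiled software renderer keeps
// each screen tile resident in fast local memory while it is being worked on,
// so overdraw and blending traffic never touch slow main memory.
#include <array>
#include <cstdint>
#include <cstdio>
#include <vector>

constexpr int kTile = 32;                          // 32x32 tile fits in a small local store
using Tile = std::array<uint32_t, kTile * kTile>;  // colour for one tile

int main() {
    const int width = 1280, height = 720;
    std::vector<uint32_t> framebuffer(static_cast<size_t>(width) * height);  // slow main memory

    for (int ty = 0; ty < height; ty += kTile) {
        for (int tx = 0; tx < width; tx += kTile) {
            Tile tile{};  // stays in fast local memory / cache for the whole tile
            // ... rasterise every triangle overlapping this tile into `tile`,
            //     so all per-pixel read-modify-write traffic stays local ...
            for (int y = 0; y < kTile && ty + y < height; ++y)      // one write-out per tile
                for (int x = 0; x < kTile && tx + x < width; ++x)
                    framebuffer[(ty + y) * width + (tx + x)] = tile[y * kTile + x];
        }
    }
    std::printf("%zu pixels written to main memory exactly once\n", framebuffer.size());
    return 0;
}
[/code]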
 
Well, that's an architectural difference between the platforms. Triple-buffering is hardly an unfair advantage - after all you can expect every scene in a console game to be tested against fixed hardware and optimized accordingly to hit a specific performance target, while obviously nothing like this can happen on the PC. And it does provide v-sync quality, just without the setbacks.

I'd say +1 for raw power vs fixed hardware optimization on this one ;)

Being able to force triple buffering is really nice on the PC. Another thing I like is being able to force a good amount of aniso filtering, and being able to turn trilinear optimisation off (it may have improved since the 7900 series, but I see no point in checking!).

Anything you can do to increase the accuracy of the pixel/fragment colour in representing what's behind it is a good thing, and it seems strange that, for the enormous benefit anisotropic filtering can bring, it's relatively shunned in console land. I believe that the 360 has issues with it (due to a small texture cache and a big cache miss penalty iirc - which might also explain what looks like bilinear filtering sometimes) and a simple hope I have for the next gen is for less costly use of it.

I feel the same way about MSAA - I don't think that post process filters are a substitute for losing less information in the first place, and as screens grow and resolutions continue to be kept low by the desire for higher quality pixel shaders, the need for maintaining as much information about polygon edge position as possible will remain. You can't always reconstruct this as accurately as you can calculate it using subsamples, and every time someone cheers about MLAA / FXAA killing MSAA I feel a bit sad. :(

So I hope next gen delivers fast aniso and fast MSAA. Apply post process filters to whatever you want, but let's get that information right (or less wrong) in the first place!
 
So I hope next gen delivers fast aniso and fast MSAA. Apply post process filters to whatever you want, but let's get that information right (or less wrong) in the first place!

So you'd want (for example) 'gears of war AA' that you can't even see, but at least "the information is right" ??
 
The idea of next gen consoles without a hard drive is so stupid it is even hard to comment. Did everyone collectively fall on their heads? :D

Do you really think that Live! and all its downloadable content will be unavailable by default? It should be clear that we are moving ever stronger towards digital distribution, and even if optical media is included, it will only be playing second fiddle to the HDD/Live! combination.

Currently only X360 has a single SKU without a HDD. HDD has become a de facto standard, and the 4GB X360 only a way to dilute costs before the inevitable upgrade to the 250GB HDD. What kind of a thought process would lead anyone to think that the next gen of consoles will lose HDD as a baseline feature? It makes NO sense. PSN and Live! are practically built upon the fact that you have a HDD for permanent content storage.

I understand that currently not all owners are connected to the internet, but come on - we are talking 2015-2025 timeframe here, and markets like Japan, USA, Europe and China. Fast broadband is a f*cking standard and a given.
 
So you'd want (for example) 'gears of war AA' that you can't even see, but at least "the information is right" ??
I can see the difference between no AA and 4-sample non-rectangular grid MSAA ... I'd prefer it if the worst case were the latter, rather than the former.
 
I believe the eDRAM is currently severely overrated; this gen it looked okay, but only because the competing console was bandwidth starved.
The competing console was bandwidth starved only because there was no feasible way to provide more bandwidth within the price. Similarly, without eDRAM the 360's workable BW would be well down on where it is now.
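
To put rough numbers on that (going from memory, so treat these as ballpark figures rather than gospel): Xenos's ROPs sit on the eDRAM die, and with 4xMSAA they can generate far more colour/Z traffic per clock than the external GDDR3 bus could ever supply.

[code]
// Ballpark figures from memory, not official specs: peak ROP traffic that the
// 360's eDRAM absorbs vs. what the external GDDR3 bus provides.
#include <cstdio>

int main() {
    const double rops          = 8;       // Xenos ROPs on the daughter die
    const double clock_hz      = 500e6;   // GPU clock
    const double samples       = 4;       // 4xMSAA samples per pixel per clock
    const double bytes_per_smp = 16;      // colour read+write (4+4) + Z read+write (4+4)

    double edram_peak = rops * clock_hz * samples * bytes_per_smp / 1e9;  // ~256 GB/s
    double gddr3_bw   = (128.0 / 8.0) * 1.4e9 / 1e9;                      // 128-bit @ 700 MHz DDR, ~22.4 GB/s

    std::printf("Peak ROP traffic into eDRAM: ~%.0f GB/s\n", edram_peak);
    std::printf("External GDDR3 bandwidth:    ~%.1f GB/s\n", gddr3_bw);
    return 0;
}
[/code]

That order-of-magnitude gap between what the ROPs can consume and what the external bus can deliver is basically the whole argument for the daughter die.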

Other than TSV and 3D ICs there is nothing on the horizon which would make external RAM substantially faster, Shifty ... so I don't quite see why you would expect external RAM to be fast enough.
I'm thinking XDR2, with fingers crossed. ;) I suppose use of GDDR5 will require split memory.
 
The idea of next gen consoles without a hard drive is so stupid it is even hard to comment. Did everyone collectively fall on their heads? :D

Do you really think that Live! and all its downloadable content will be unavailable by default? It should be clear that we are moving ever stronger towards digital distribution, and even if optical media is included, it will only be playing second fiddle to the HDD/Live! combination.

Currently only X360 has a single SKU without a HDD. HDD has become a de facto standard, and the 4GB X360 only a way to dilute costs before the inevitable upgrade to the 250GB HDD. What kind of a thought process would lead anyone to think that the next gen of consoles will lose HDD as a baseline feature? It makes NO sense. PSN and Live! are practically built upon the fact that you have a HDD for permanent content storage.

I understand that currently not all owners are connected to the internet, but come on - we are talking 2015-2025 timeframe here, and markets like Japan, USA, Europe and China. Fast broadband is a f*cking standard and a given.

Very simple: there is an Xbox 360 for $199, and it's a good seller. Further, it's fully functional.

Knocking 50 bucks off for the hard drive becomes a really big deal when you get really low on price. How well will a $149 360 sell? If Microsoft matches the PS3 price cut anytime this year, you will see it soon.

Not everybody is a hardcore gamer. The console with the least storage this gen (no hard drive available) was the Wii and it did ok for 3-4 years.

And as long as there's an upgrade path, I see no problem.

bkilian just said half of 360 owners don't even sign up for Live.

Another factor is cheapening flash. You already have a 4GB 360 which used to have no internal memory, then got 512 MB, and is now bumped to 4GB. It may see another bump or two this gen. 8GB, 16GB, or more should be a doddle to start off next gen in the HDD-less SKU.
 
I'm thinking XDR2, with fingers crossed. ;)
I think in orders of magnitude ... a factor of ~1.5 gets a big meh from me.

There is no off-the-shelf processor which can use XDR2 either ... so I don't see why GDDR5 would necessarily mean split memory pools; they're both equally usable with unified memory. Unified memory only makes sense with a unified SoC though IMO.
 
Unified memory only makes sense with a unified SoC though IMO.

Why? A unified memory pool has been a common feature of almost all of the consoles released over the last couple of generations and none of them featured SoC designs, at least not at launch.
 
I am curious why MS is still using 64MB chips for RAM (no change since 2005). Sony started with 8x32MB DDR3 + 4x64MB XDR, and now they are using 256MB chips for both DDR and XDR.

MS started switching to 1 Gb chips in late 2008 iirc, on the Falcon motherboard. The board design must have given them the flexibility to use both 512 Mb and 1Gb, depending on what was cheapest / available.
 
So I hope next gen delivers fast aniso and fast MSAA. Apply post process filters to whatever you want, but let's get that information right (or less wrong) in the first place!
So you'd want (for example) 'gears of war AA' that you can't even see, but at least "the information is right" ??

I don't understand what that question means, or how it relates to the quote. Help me out here.
 
What do you think about the Tim Sweeney presentation in which he claims to be moving to a software rasterizer in UE4? He seems to imply huge bandwidth would be needed. However, I suspect that a software rendering path might be designed in order to exploit data locality.
Rather than thinking of EDRAM as it was implemented in Xbox 360, I believe we should discuss whether future graphics techniques justify fast main memory, or whether they will be better able to exploit data locality. In the latter case, an ultra-fast local memory to assist a not-so-fast main memory would probably be quite effective in terms of both performance and performance/cost compared with particularly expensive main memory. I am no graphics expert and I would like to hear your opinions about this, but I suspect that lack of data locality is not something inherent to rendering, but rather related to the current rasterization approach.

Lack of data locality is inherent in rendering large textures. By their very nature, there is no time locality at all (in fact, there is anti-locality -- using a texel instantly makes it the least likely to be used data point in your entire data set, until your next frame).

So the question becomes: Is it possible to render beautiful scenes without using large textures? I don't know, but the industry certainly seems to be heading in the direction of more texture data, not less. For example, id tech 5, the engine used in Rage, essentially trades away almost everything else for more texture data, and people seem to like it.
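
To put a crude number on the anti-locality point (all figures below are my own assumptions, nothing measured): the unique texel footprint of a single frame already dwarfs any texture cache, and none of it is wanted again until the following frame.

[code]
// Back-of-envelope, all assumptions mine: 720p, roughly one unique texel per
// pixel thanks to mip-mapping, three texture layers, ~1 byte/texel average
// after block compression, and a typical small texture cache.
#include <cstdio>

int main() {
    const double pixels          = 1280.0 * 720.0;
    const double texels_per_pix  = 1.0;           // unique texels touched per pixel
    const double layers          = 3.0;           // e.g. albedo + normal + specular
    const double bytes_per_texel = 1.0;           // average after DXT compression
    const double cache_bytes     = 32.0 * 1024.0; // assumed texture L1 size

    double touched = pixels * texels_per_pix * layers * bytes_per_texel;
    std::printf("Unique texel data touched per frame: ~%.1f MB\n", touched / 1e6);
    std::printf("Texture cache size:                  ~%.0f KB\n", cache_bytes / 1024.0);
    // Each of those texels is used once and then not wanted again until the
    // next frame, so the cache only helps within a filter footprint -- there
    // is no frame-level temporal reuse for it to exploit.
    return 0;
}
[/code]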
 
Lack of data locality is inherent in rendering large textures. By their very nature, there is no time locality at all (in fact, there is anti-locality -- using a texel instantly makes it the least likely to be used data point in your entire data set, until your next frame).

Thanks for your answer. Do you have any estimate of how much memory bandwidth is spent reading textures vs reading/writing framebuffers vs reading/writing geometry?
 
Lack of data locality is inherent in rendering large textures. By their very nature, there is no time locality at all (in fact, there is anti-locality -- using a texel instantly makes it the least likely to be used data point in your entire data set, until your next frame).

I'm not sure this is true.

Data locality is important in that you want to keep your textures / vertex data cached for as long as possible before you throw it out. The hardware is built around this principle, as every vertex / pixel will typically want to touch the same data set (textures, uniforms etc...) multiple times per mesh fragment per frame. Batching w/ texture packing is also employed to increase the potential for this (a crude sketch of what I mean is below). Also MegaTexture is a perfect example of an entire system built around the idea.

If in your game you have a single texture sampled on a single draw call once per frame (that's not rendering your skybox) then you're probably doing it wrong...
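
Here's that crude sketch of the sort of batching I mean (all types and names are made up purely for illustration): group draw calls that share a texture so its texels get reused across many triangles while they're still cache-hot.

[code]
// Crude illustration only (made-up types/names): sorting draw calls by texture
// so everything sampling the same texture is submitted back-to-back, which is
// when the texture cache actually gets to do its job.
#include <algorithm>
#include <cstdio>
#include <vector>

struct DrawCall {
    int textureId;  // texture (or atlas page) the call samples
    int meshId;     // mesh fragment to draw
};

void submit(const DrawCall& dc) {
    std::printf("draw mesh %d with texture %d\n", dc.meshId, dc.textureId);
}

int main() {
    std::vector<DrawCall> calls = {
        {2, 0}, {0, 1}, {2, 2}, {1, 3}, {0, 4}, {1, 5},
    };

    // Group calls sharing a texture so its texels stay resident in the cache
    // for as long as possible before being evicted.
    std::sort(calls.begin(), calls.end(),
              [](const DrawCall& a, const DrawCall& b) { return a.textureId < b.textureId; });

    for (const auto& dc : calls) submit(dc);
    return 0;
}
[/code]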


So the question becomes: Is it possible to render beautiful scenes without using large textures? I don't know, but the industry certainly seems to be heading in the direction of more texture data, not less. For example, id tech 5, the engine used in Rage, essentially trades away almost everything else for more texture data, and people seem to like it.

Smaller textures make matters worse and can in many ways hurt performance drastically, as you reduce batch efficiency. Maybe this is a symptom of today's hardware architectures? But then the question becomes: when trying to draw huge, detailed and varied worlds, with data sets that comfortably run into the tens of GBs, you still need a hardware memory access model and caching hierarchy that can deal with that much data efficiently, so what would smaller textures (typically uniquely applied to geometry in the world) really solve?

Not much, I suspect...
 