Predict: The Next Generation Console Tech

Status
Not open for further replies.
This kind of thinking is why there are practically no native Full HD AAA games. X360 would be perfectly capable of proper Full HD (it's called using the 10MB daughter die in tiling mode I think, which helps to get over the 720p 2xAA limitation most heavy 3D games face), but what's the point if the textures will look simply awful due to RAM restrictions?

The decision to put 512MB in X360 instead of 1GB in the hopes of saving something around $15 per unit was essentially what cost MS the ability to do Full HD AAA games, in my opinion (arguments welcome). One would hope they learned from it.

That's crazy talk. RAM has little to do with rendering resolution beyond having enough for the framebuffer. Rendering res has everything to do with the pixel processing power of the GPU. What on earth made you think the reason for not going 1080p across the board was down to textures looking bad?? Every game on the 360 would look better at 1080p without any change to texture res :???:
 
The point is, development of VR products stopped because of liability concerns.

Stopped? But there is development of VR products ongoing now. Several HMDs are available for PC already. Sony demoed a new prototype HMD at CES 2011. I doubt they would be investing anything into the project if it was a certain dead end due to legal issues. Either they see enough merit in the tech that they feel it overcomes some of the issues, or they feel they can solve some of the issues to some extent.

One big issue to overcome is getting a high enough FOV; I wonder if they will be able to improve here.
 
Last edited by a moderator:
The cost in creating assets doesn't come from simplifying them for low spec. It comes from building them. 100x the content is going to take 100x the effort. Sure, you can easily get a 4x increase in RAM requirement just by using 2k textures instead of 1k textures, but texture res will only get you so far. 1 TB of content is astronomical. It'll require at least an order of magnitude more expense on assets. I'll point you to the 50GB potential of BRD, and the fact that since the beginning of this gen, game sizes have remained in the DVD capacity range. What exactly are you going to distribute your 1 TB RAM-consuming games on?? We'll spend two hours loading the game?? :p By 2020 we'll even be looking at streaming games like OnLive.
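The distribution arithmetic above can be sketched quickly. A rough illustration in Python; the 50 GB disc and 1 TB game figures are the post's own, while the 100 MB/s sustained read speed is my own illustrative assumption:

```python
# Rough numbers from the post: 50 GB Blu-ray capacity, a hypothetical 1 TB game.
BRD_CAPACITY_GB = 50
GAME_SIZE_GB = 1000

discs_needed = GAME_SIZE_GB / BRD_CAPACITY_GB
print(f"Blu-ray discs for a 1 TB game: {discs_needed:.0f}")   # 20

# Filling 1 TB of RAM from storage (100 MB/s is an optimistic assumption):
read_speed_mb_s = 100
load_hours = GAME_SIZE_GB * 1000 / read_speed_mb_s / 3600
print(f"Time to fill 1 TB of RAM: {load_hours:.1f} hours")    # ~2.8
```

So the "two hours loading" quip is, if anything, on the generous side for optical or hard-drive speeds of the era.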

Textures are the number one memory eater. The quantity of content rarely increases at all (or if it does, then by geometry instancing or just plain old repetition), while the quality increases exponentially (well, it used to before this gen).

In any case, a 2K texture should take around 16MB and a game may contain up to a thousand or more textures.

So a 4K texture would take 64MB and an 8K one 256MB. The difference?
15.6 GB of texture data vs 250 GB of texture data. That tells me that a game using 16K textures would take 976.5 GB of memory.
It does not seem that preposterous.
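The per-texture figures above follow from plain uncompressed RGBA8 math. A sketch; real games use block compression (DXT/BC, 4-8x smaller) and mipmap chains, which change these numbers considerably:

```python
def texture_mib(size):
    """MiB for an uncompressed size x size RGBA8 texture (4 bytes/texel)."""
    return size * size * 4 / (1024 * 1024)

# The post's per-texture figures fall straight out:
print(texture_mib(2048))    # 16.0
print(texture_mib(8192))    # 256.0
print(texture_mib(16384))   # 1024.0 -- a full GiB per 16K texture

# A thousand textures at each size:
for size in (2048, 8192, 16384):
    print(f"{size}: {1000 * texture_mib(size) / 1024:.1f} GiB")
```

A thousand 2K textures is the quoted 15.6 GiB; a thousand 16K textures lands at roughly a terabyte, in line with the post's ~976 GB figure.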

As for content delivery, the fastest affordable connection available in my town (100Mbit/s for $50/month without bandwidth cap) can theoretically download 1054 GB in 24 hrs.
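For what it's worth, the quoted daily figure checks out, assuming zero protocol overhead and mixing decimal megabytes with binary gigabytes the way the original calculation evidently did:

```python
# 100 Mbit/s line, running flat out for 24 hours (no overhead assumed).
line_mbit_s = 100
mb_per_day = line_mbit_s / 8 * 24 * 3600    # 12.5 MB/s -> 1,080,000 MB/day
print(f"~{mb_per_day / 1024:.0f} GB per day")
```

Real-world throughput would of course come in somewhat under the theoretical line rate.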

That's crazy talk. RAM has little to do with rendering resolution beyond having enough for the framebuffer. Rendering res has everything to do with the pixel processing power of the GPU. What on earth made you think the reason for not going 1080p across the board was down to textures looking bad?? Every game on the 360 would look better at 1080p without any change to texture res :???:

The higher the resolution, the higher the texture quality has to be for your eye to think things look OK, and not macroblocked or otherwise starting to look like brown jam that's been spread too thin over a piece of bread. I thought everyone knew that? :)

Just upping the resolution will make a game with low detail assets look even more low end... there's just nothing left to cover all these pixels (the effect is akin to watching 240p youtube video fullscreen).
 

You are way off.
In 2005, the average PC had 256 MB of VRAM and 1 GB of system RAM.
As of today, the average PC has 1 GB of VRAM and 4 GB of system RAM (look at the Steam hardware survey).

This is a more likely transition, considering that in six years we had a 4x jump.
2011: 1 GB VRAM , 4 GB system ram
2013: 2 GB VRAM, 8 GB system ram
2016: 4 GB VRAM, 16 GB system ram

Bandwidth and processing power are much more of a constraint on performance.
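The implied growth rate in those figures works out to a doubling every three years. A quick check using the post's own 2005 and 2011 system RAM numbers:

```python
import math

# System RAM figures from the post: 1 GB in 2005, 4 GB in 2011.
doublings = math.log2(4 / 1)                       # 2 doublings in 6 years
years_per_doubling = (2011 - 2005) / doublings
print(f"one doubling every {years_per_doubling:.0f} years")   # every 3 years
```

The post's own table (8 GB already by 2013) actually runs slightly ahead of that historical rate, so if anything it's an optimistic projection.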
 
The decision to put 512MB in X360 instead of 1GB in the hopes of saving something around $15 per unit was essentially what cost MS the ability to do Full HD AAA games, in my opinion (arguments welcome).
You could not be further away from it. 1080p requires 2.25x the pixel shader instances (pixels) compared to 720p. That's 2.25x the ALU (GPU maths) and 2.25x the TEX (GPU sampling units and memory bandwidth) to render the frame. So you need a much more powerful GPU to render the game at the same frame rate at 1080p. Current console GPUs are not powerful enough to run games at 1080p. You could run simple last-gen graphics at 1080p, but most developers would rather choose a lower-than-720p resolution to push as many fancy effects as possible.
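The 2.25x figure is just the ratio of raw pixel counts, which the fill-bound ALU, TEX, and bandwidth costs track roughly linearly:

```python
# Pixel counts at the two resolutions discussed above.
pixels_720p = 1280 * 720       # 921,600
pixels_1080p = 1920 * 1080     # 2,073,600
print(pixels_1080p / pixels_720p)   # 2.25
```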

With virtual texturing you do not need more than 100 megabytes of graphics memory to texture everything with as high detail as you can fit on the game disc. Our next game uses 2048x2048 textures on most objects, and our whole texturing system takes a constant 56 megabytes of memory to texture all objects in the world. We could use 4096x4096 textures everywhere, and still all textures would use exactly the same 56 megabytes of graphics memory, but the game would be too big to distribute (XBLA games have a 2 gigabyte size limit). Rage comes with three DVDs and I suspect their system uses the same 56 megabytes (or slightly more if they have more material layers).
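The constant footprint comes from the fact that a virtual texturing system only keeps a fixed-size physical page cache resident; the resolution of the textures on disc never touches that budget. A minimal sketch of that property; the cache dimensions, layer count, and compression rate below are illustrative assumptions, not the actual configuration from the post:

```python
def vt_cache_mib(cache_edge_texels, layers, bytes_per_texel):
    """Resident memory for a square physical page cache, in MiB.
    Note the disc-side texture resolution never appears here."""
    return cache_edge_texels ** 2 * layers * bytes_per_texel / 2**20

# e.g. a 4096x4096 physical cache, 3 material layers,
# DXT-style compression at ~1 byte per texel (all assumed values):
print(vt_cache_mib(4096, 3, 1))    # 48.0 MiB -- fixed, whatever is on disc
```

Swap every source texture from 2048x2048 to 4096x4096 and this number does not move; only the disc footprint grows, which is exactly the trade-off described above.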
 
Now you're applying PC-building logic to console building. Cost-effectiveness to the extreme, regarding key system components, is a nice thing to have if you can easily upgrade, but it's the wrong way to go if you want your box to last 10 years (and put out nice graphics while doing so).

Cost effectiveness to the extreme is essential for anyone making a console. Console vendors look at the costs and benefits over the projected life of the platform when judging the cost effectiveness of the hardware.

MS and Sony took enough upfront losses on their systems as it is!

This kind of thinking is why there are practically no native Full HD AAA games. X360 would be perfectly capable of proper Full HD (it's called using the 10MB daughter die in tiling mode I think, which helps to get over the 720p 2xAA limitation most heavy 3D games face), but what's the point if the textures will look simply awful due to RAM restrictions?

There are lots of 360 games that use tiling (1280 x 720 with 2x MSAA requires it); the holdup on higher resolutions is processing power, or rather that developers choose to focus on work per pixel over number of pixels.

As the PC shows, even console level textures don't stop a game looking much better at higher resolutions.

The decision to put 512MB in X360 instead of 1GB in the hopes of saving something around $15 per unit was essentially what cost MS the ability to do Full HD AAA games, in my opinion (arguments welcome). One would hope they learned from it.

Where does $15 for 512 MB (1400 MHz GDDR3) in 2005 come from? Even DDR(1) was costing several times that. Super high-end graphics cards were using 256 MB of GDDR3 at the time; if it was that cheap they'd have been swimming in the stuff (and not using $7.50 worth of memory on the super high end!). You're probably a factor of ten out on that, maybe more.
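The factor-of-ten point can be made concrete: taking the $15 claim at face value for the extra 512 MB, and assuming 512 Mbit (64 MB) GDDR3 parts of the era, the implied per-chip price is absurd:

```python
# What the claimed $15 saving would imply per chip, assuming 64 MB parts.
claimed_saving_usd = 15.0
extra_mb, chip_mb = 512, 64
extra_chips = extra_mb // chip_mb         # 8 more chips (16 total for 1 GB)
implied_usd_per_chip = claimed_saving_usd / extra_chips
print(f"${implied_usd_per_chip:.2f} per 64 MB GDDR3 chip")   # $1.88
```

Under two dollars for a high-speed GDDR3 chip in 2005 is well below what even commodity DDR was fetching, which is the whole point of the reply above.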

And that's not even taking into consideration the costs of having a mobo with 16 memory chips (!) soldered to the board with all the complexity (traces, power, physical size, cooling) that would add!
 
The higher the resolution, the higher the texture quality has to be for your eye to think things look OK, and not macroblocked or otherwise starting to look like brown jam that's been spread too thin over a piece of bread. I thought everyone knew that? :)

Just upping the resolution will make a game with low detail assets look even more low end... there's just nothing left to cover all these pixels (the effect is akin to watching 240p youtube video fullscreen).

So are you telling me Gears of War 3 or Forza 3 looks better at 720p than it would at 1080p? If you can tell me that with a straight face there's little hope :smile: Oh, and just for good measure: http://ps3media.ign.com/ps3/image/article/118/1188204/uncharted-3-drakes-deception-20110816001204718.jpg. You positive Uncharted 3 would look bad at 1080p? There is a reason bullshots use much higher resolutions: it makes current-gen assets look much better.

Even PS2 and Wii games look much, much better rendered at 1080p. I guess you don't have much experience with emulation. Current-gen assets are more than fine for rendering at 1080p; the reason they are not is the performance cost.
 
I wouldn't call 8GB anything near high end these days; you can pick it up for $50. 12-16GB is high end, with 24GB being possible but extreme high end.

Also, you could easily get 4GB for around $100 back around 2005. So while it was closer to high end, it did exist.
 
With virtual texturing you do not need more than 100 megabytes of graphics memory to texture everything with as high detail as you can fit on the game disc. Our next game uses 2048x2048 textures on most objects, and our whole texturing system takes a constant 56 megabytes of memory to texture all objects in the world.
For games using virtual texturing on PS3 with optimized streaming: do they still have to "waste" half of the 256MB of VRAM, while the 256MB of main RAM is screaming for more space? The free space sits in VRAM, where it's of no use to the game's business logic.
 
Textures are the number one memory eater. The quantity of content rarely increases at all (or if it does, then by geometry instancing or just plain old repetition), while the quality increases exponentially (well, it used to before this gen).
By that reasoning development costs haven't increased at all this generation. We just use the same amount of content as last gen, only instead of producing high-end assets and downgrading them to PS2 spec, we don't have to downgrade them as far for PS3 spec.

In any case, a 2K texture should take around 16MB and a game may contain up to a thousand or more textures.
No, it shouldn't.

So a 4K texture would take 64MB and an 8K one 256MB. The difference?
15.6 GB of texture data vs 250 GB of texture data. That tells me that a game using 16K textures would take 976.5 GB of memory. It does not seem that preposterous.
A game using 16k textures is utterly preposterous, as you won't see that much detail. Almost all your texture information wouldn't actually be rendered, making it useless. Movies, which are photorealistic in quality at 2k projection, tend to use 2k and 4k maps. The law of diminishing returns makes your pursuit of unqualified large numbers a pointless one.
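A quick texel-budget check shows why: a 1080p frame only has about 2 million pixels to fill, so even a full-screen close-up of a single surface can sample only a sliver of a 16K map in any one frame:

```python
# How much of a 16K texture can a single 1080p frame possibly show?
frame_pixels = 1920 * 1080          # 2,073,600
texels_16k = 16384 ** 2             # 268,435,456
fraction = frame_pixels / texels_16k
print(f"{fraction:.2%} of a 16K texture per frame")   # 0.77%
```

Over 99% of the stored texels are never on screen at once, which is the diminishing-returns argument in numbers.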

As for content delivery, the fastest affordable connection available in my town (100Mbit/s for $50/month without bandwidth cap) can theoretically download 1054 GB in 24 hrs.
Your own experience as a reference point is not at all representative of the rest of the world. The 16GB PC you built in March is nothing like a target for games developers. The 100 Mbps you have access to is nothing like the world average broadband speed. Also, that's downloading one TB's worth of content. Unless your plan is to store the whole game in RAM, you'd need several TBs of content, with each area/map filling a TB with your massive textures. Current console games use a good 8+ times the storage over the amount of RAM the consoles have. You also need to populate the RAM from storage. Games on PS1 were something like 300 MB capacity. PS2 games were more like 3 GB, a 10x increase. This generation we're not even seeing a 10x increase, and games are still getting painfully expensive to make even 8GB of content for.

But that's all moot. Even if the technology enables that, which it might well do with holographic disks or something, the fact remains it will be impossibly expensive to create a game using even 1 TB of assets, let alone 1 TB per level requiring a system that can fit that much in RAM. The only possible exception is if we ended up with something like voxels. Then 1 TB of RAM might be nice, but as none of the issues that have kept voxels from getting anywhere so far have been solved, that's a long stretch. I suppose you could also have a procedural world that models unique trees and grass and records where everything is, but there's no need to have that much data in RAM at a given moment. I suppose if 3D world modelling like the Kinect Fusion demo takes off, we could replace 2D and 3D photo data with volumetric data. I find that implausible.

Reality is most people don't use the RAM they have in their PCs. Next to no developers target the largest possible RAM level because it costs too much and there are no gains, save for the likes of CryTek who'll target power-users who want to max out their PCs. The current standard isn't anything like 16GB as you suggest, and your extrapolation to 1 TB as standard in 2020 is working from decidedly wonky reasoning. :p

Just upping the resolution will make a game with low detail assets look even more low end... there's just nothing left to cover all these pixels (the effect is akin to watching 240p youtube video fullscreen).
Most of those texels aren't drawn. Most of the time you're rendering smaller mipmaps. It's only occasionally, when up close to surfaces, that you need the full detail. Higher resolution isn't an issue, as 1080p is likely it for a long time. There's no benefit to higher resolutions in most people's homes, where they can't/don't want a 100" screen. 4k textures will be about as much as is perceptible. In a 1080p framebuffer, if you walk up to a character so their entire face fills the screen, that's 2 million pixels, or half a 2k texture. There's no rationale for replacing every texture with one 128 times larger than the screen can display!
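The face example, worked through:

```python
# A fullscreen close-up at 1080p versus the texels in a single 2K map.
screen_pixels = 1920 * 1080     # 2,073,600 pixels
texels_2k = 2048 * 2048         # 4,194,304 texels
print(round(screen_pixels / texels_2k, 2))   # 0.49 -- about half a 2K map
```

Even in that worst case, a 2K map already holds twice the texels the framebuffer can display.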
 
Examining the capacities and prices of consumer PC DDR memory in an attempt to try to predict the RAM provisions of next-gen consoles is maddeningly futile...

Reductions in DDR memory costs matter little when no console manufacturer would ever use it due to the rather poor bandwidth it provides (i.e. if you want to save cost by providing a unified memory pool for your CPU+GPU, your GPU will starve regularly on bog-standard DDR...)

Looking at pricing and price projections of both GDDR & XDR style memory technologies would serve as a far more useful basis for argument...
 
You are way off.
In 2005, the average PC had 256 MB of VRAM and 1 GB of system RAM.
As of today, the average PC has 1 GB of VRAM and 4 GB of system RAM (look at the Steam hardware survey).

This is a more likely transition, considering that in six years we had a 4x jump.
2011: 1 GB VRAM , 4 GB system ram
2013: 2 GB VRAM, 8 GB system ram
2016: 4 GB VRAM, 16 GB system ram

Bandwidth and processing power are much more of a constraint on performance.

Where did I say I was talking about *average*? I distinctly remember saying that my calculations applied to what is possible at what time, not when it becomes average.

What the hell does a console maker care about average? He's looking to build something that is technically cutting edge, feasible, and not priced beyond reason (but still prepared to eat a lot of the initial cost to encourage sales).

By the way, 16GB DDR3 costs only 73 EUR in my part of the woods. When building a 1000 EUR system, you don't pass up on that kind of futureproofing.

And if you think that you will buy a new computer in 2016 with just 16GB of RAM you must be looking at the netbook section.
 
After listening to a recent interview with Carmack I really am in agreement that the next big leap will be through display technology, specifically head mounted displays. Looking at the environments in games like, say, Skyrim or Rage, the way we are viewing games at the moment is the limiting factor, not the graphics themselves in many cases. People still scoff at VR as a pipe dream, but I really do think that next-gen graphics and environments will be at such a level that it's time for someone to take another proper crack at it. Sony seem to be doing some work in this area; let's hope they can come up with something good. The advancement in OLED display tech is promising for HMDs.

Yeah, I'm amazed that no one has really gone back and revisited VR tech. I remember the company that put together those simple VR pods ('Dactyl Nightmare', etc.)... they were using Amiga tech, for chrissakes!
 
I wouldn't call 8GB anything near high end these days; you can pick it up for $50. 12-16GB is high end, with 24GB being possible but extreme high end.

If you judge "high end" based on what is going into the systems most people are buying, then 8GB is high end and 4GB is still the most common. Even expensive laptops tend to have no more than 8GB, and a quick look at Dell's "Performance XPS" line shows desktops going for over a thousand bucks being fitted with 8GB of RAM.

The latest Steam Hardware Survey shows 4GB or less in 78% of systems; they don't even list anything beyond "5GB or higher". The 4GB segment is currently shown as growing faster than the 5GB+ segment.

Also, you could easily get 4GB for around $100 back around 2005. So while it was closer to high end, it did exist.

I built a PC in the middle of 2005, and I never saw prices even remotely that good (regretfully)! I can't get access to the historical price graphs on DRAMeXchange (lol, memberships), but I've found this from the end of 2005, when DDR2 prices were tanking iirc:

http://www.tomshardware.co.uk/ddr2-price-drop,news-18088.html

These were the kind of prices I remember from mid 2005:

http://www.behardware.com/news/7507/ddr2-price-cuts.html

Though with slumps and fire sales I don't know what deals others could find. I can't see 1400 MHz GDDR3 being dumped off at $3.50 per 64MB chip though.
 
Do any of you really think there are millions of people out there just waiting to jump on the chance to strap on an expensive nausea inducing helmet? Even if you get past the expense and nausea, I think you'd find a very limited market.
 