This kind of thinking is why there are practically no native Full HD AAA games. The X360 would be perfectly capable of proper Full HD (by using the 10MB daughter die in tiling mode, I think, which gets around the 720p 2xAA limitation most heavy 3D games face), but what's the point if the textures will look simply awful due to RAM restrictions?
The decision to put 512MB in the X360 instead of 1GB, in the hope of saving around $15 per unit, was essentially what cost MS the ability to do Full HD AAA games, in my opinion (arguments welcome). One would hope they learned from it.
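On the tiling point: here's a rough back-of-the-envelope sketch (in C++, assuming 4 bytes of colour plus 4 bytes of depth per sample, with MSAA multiplying the sample count; the 10MB is the daughter die, everything else is a simplifying assumption) of how render-target footprints stack up against the eDRAM:

```cpp
#include <cmath>
#include <cstdio>

// Back-of-the-envelope eDRAM footprint: assume 4 bytes of colour and
// 4 bytes of depth per sample, with MSAA multiplying the sample count.
// The 10 MB is the X360 daughter die; the rest is simplification.
int main() {
    const double edramBytes = 10.0 * 1024 * 1024;
    struct Mode { const char* name; int w, h, samples; };
    const Mode modes[] = {
        {"720p,  no AA", 1280,  720, 1},
        {"720p,  2xAA ", 1280,  720, 2},
        {"1080p, no AA", 1920, 1080, 1},
        {"1080p, 2xAA ", 1920, 1080, 2},
    };
    for (const Mode& m : modes) {
        double bytes = double(m.w) * m.h * m.samples * (4 + 4);
        int tiles = int(std::ceil(bytes / edramBytes));
        printf("%s: %5.1f MB -> %d tile(s)\n",
               m.name, bytes / (1024 * 1024), tiles);
    }
    return 0;
}
```

Under those assumptions, 720p with 2xAA already spills into two tiles, and 1080p needs tiling even without AA.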
The point is, development of VR products stopped because of liability concerns.
The cost of creating assets doesn't come from simplifying them for low spec; it comes from building them in the first place. 100x the content is going to take 100x the effort. Sure, you can easily get a 4x increase in RAM requirements just by using 2K textures instead of 1K textures, but texture res will only get you so far. 1 TB of content is astronomical; it'll require at least an order of magnitude more expense on assets. I'll point you to the 50GB potential of BRD, and the fact that since the beginning of this gen, game sizes have remained in the DVD capacity range. What exactly are you going to distribute your 1 TB RAM-consuming games on?? Will we spend two hours loading the game?? By 2020 we'll even be looking at streaming games like OnLive.
That's crazy talk. RAM has little to do with rendering resolution beyond having enough for the framebuffer. Rendering res has everything to do with the pixel processing power of the GPU. What on earth made you think the reason for not going 1080p across the board was down to textures looking bad?? Every game on the 360 would look better at 1080p without any change to texture res.
You could not be further from the truth. 1080p requires 2.25x the pixel shader instances (pixels) compared to 720p. That's 2.25x the ALU (GPU maths) and 2.25x the TEX (GPU sampling units and memory bandwidth) to render the frame. So you need a much more powerful GPU to render the same game at the same frame rate at 1080p. Current console GPUs are not powerful enough to run games at 1080p. You could run simple last-gen graphics at 1080p, but most developers would rather choose a resolution even lower than 720p to push as many fancy effects as possible.
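The 2.25x figure is just the ratio of the two pixel counts; a trivial check:

```cpp
#include <cstdio>

// Shading cost scales roughly linearly with pixel count, so the ratio
// of the two resolutions is the headline multiplier.
int main() {
    const double p1080 = 1920.0 * 1080.0;  // 2,073,600 pixels
    const double p720  = 1280.0 *  720.0;  //   921,600 pixels
    printf("1080p / 720p = %.2fx the pixels\n", p1080 / p720);  // 2.25x
    return 0;
}
```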
Now you're applying PC-building logic to console building. Cost-effectiveness to the extreme, regarding key system components, is a nice thing to have if you can easily upgrade, but it's the wrong way to go if you want your box to last 10 years (and put out nice graphics while doing so).
The higher the resolution, the higher the texture quality has to be for your eye to think that things look OK, rather than macroblocking or otherwise starting to look like brown jam spread too thin over a piece of bread. I thought everyone knew that?
Just upping the resolution will make a game with low-detail assets look even more low end... there's just nothing left to cover all these pixels (the effect is akin to watching a 240p YouTube video fullscreen).
Games using virtual texturing on PS3, with optimized streaming: do they have to "waste" half of the 256MB of VRAM while, at the same time, the 256MB of RAM is screaming for more space, but the free space is in VRAM and of no use to the game's business logic?

With virtual texturing you do not need more than 100 megabytes of graphics memory to texture everything with as high detail as you can fit on the game disc. Our next game uses 2048x2048 textures on most objects, and our whole texturing system takes a constant 56 megabytes of memory to texture all objects in the world.
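A minimal sketch of why a virtual texture cache can have a fixed footprint regardless of how much texture data ships on disc; the page size, cache dimensions and bytes-per-texel below are illustrative assumptions, not the actual system described above:

```cpp
#include <cstdio>

// Why a virtual texture cache has a fixed, resolution-bound footprint:
// only the pages the camera can currently see stay resident, no matter
// how many gigabytes of texture data sit on disc. Page size, cache
// dimensions and bytes-per-texel are illustrative assumptions only.
int main() {
    const long long pageTexels    = 128;  // assumed 128x128 pages
    const long long cachePages    = 32;   // assumed 32x32-page cache
    const long long bytesPerTexel = 4;    // uncompressed RGBA budget

    long long side  = cachePages * pageTexels;      // 4096 texels across
    long long bytes = side * side * bytesPerTexel;  // 64 MB resident
    printf("Resident cache: %.1f MB, regardless of disc content\n",
           bytes / (1024.0 * 1024.0));
    return 0;
}
```

With compressed texture formats the resident budget shrinks further, which is how a constant figure like the 56MB above becomes plausible.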
By that reasoning, development costs haven't increased at all this generation. We just use the same amount of content as last gen, only instead of producing high-end assets and downgrading them to PS2 spec, we don't have to downgrade them as far for PS3 spec.

Textures are the number one memory eater. The quantity of content rarely increases at all (or if it does, then by geometry instancing or just plain old repetition), while the quality increases exponentially (well, it used to before this gen).
No, it shouldn't.

In any case, a 2K texture should take around 16MB, and a game may contain up to a thousand or more textures.
A game using 16K textures is utterly preposterous, as you won't see that much detail. Almost all of your texture information wouldn't actually be rendered, making it useless. Movies, which are photorealistic in quality at 2K projection, tend to use 2K and 4K maps. The law of diminishing returns makes your pursuit of unqualified large numbers a pointless one.

So a 4K texture would take 64MB and an 8K one 256MB. The difference?
15.6GB of texture data vs 250GB of texture data. That tells me that a game using 16K textures would take 976.5GB of memory. It does not seem that preposterous.
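Sanity-checking the numbers in this exchange, assuming uncompressed 32-bit texels and a round thousand textures (block compression such as DXT would cut everything by 4-8x, on both sides of the argument):

```cpp
#include <cstdio>

// Uncompressed 32-bit NxN texture = N*N*4 bytes; scale by ~1000 textures.
// Block compression (DXT and friends) would cut all of these by 4-8x,
// which the argument above ignores on both sides.
int main() {
    const long long numTextures = 1000;
    for (long long n : {1024LL, 2048LL, 4096LL, 8192LL, 16384LL}) {
        long long one = n * n * 4;               // bytes for one texture
        double mbEach = one / (1024.0 * 1024.0);
        double gbAll  = double(one) * numTextures
                        / (1024.0 * 1024.0 * 1024.0);
        printf("%2lldK: %6.0f MB each, %7.1f GB for %lld textures\n",
               n / 1024, mbEach, gbAll, numTextures);
    }
    return 0;
}
```

The 1K-to-2K row shows the 4x jump mentioned earlier, the 2K and 8K rows reproduce the 15.6GB and 250GB figures, and the 16K row lands at ~1000GB, matching the ~976.5GB above to within rounding conventions.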
Your reference point of your own experience is not at all representative of the rest of the world. The 16GB PC you built in March is not anything like a target for games developers, and the 100 Mbps you have access to is nothing like the world-average broadband speed. Also, you are talking about downloading one TB's worth of content; unless your plan is to store the whole game in RAM, you'd need several TBs of content, with each area/map filling that TB with your massive textures. Current console games use a good 8+ times the storage over the amount of RAM the consoles have, and you also need to populate the RAM from storage. Games on PS1 were something like 300 MB in capacity. PS2 games were more like 3 GB, a 10x increase. This generation we're not even seeing a 10x increase, and games are still getting painfully expensive to make even 8GB of content for.

As for content delivery, the fastest affordable connection available in my town (100Mbit/s for $50/month without a bandwidth cap) can theoretically download 1054 GB in 24 hrs.
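For what it's worth, the 1054 GB figure checks out if "Mbit" is read as binary megabits; with decimal megabits the same sum gives roughly 1006 GB:

```cpp
#include <cstdio>

// 100 Mbit/s sustained for 24 hours, reading "Mbit" as binary megabits
// (the convention that reproduces the 1054 GB figure). With decimal
// megabits (10^6 bits) the same sum gives roughly 1006 GB.
int main() {
    const double bitsPerSec = 100.0 * 1024 * 1024;  // binary megabits
    const double seconds    = 24.0 * 60 * 60;
    double bytes = bitsPerSec / 8.0 * seconds;
    printf("Theoretical daily max: %.0f GB\n",
           bytes / (1024.0 * 1024.0 * 1024.0));  // ~1055 GB
    return 0;
}
```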
Most of those texels aren't drawn. Most of the time you're rendering smaller mipmaps; it's only occasionally, when up close to textures, that you need the detail. Higher resolution isn't an issue either, as 1080p is likely it for a long time. There's no benefit to higher resolutions in most people's homes, where they can't/don't want a 100" screen, so 4K textures will be as good as perceptible. In a 1080p framebuffer, if you walk up to a character so their entire face fills the screen, that's 2 million pixels, or half a 2K texture. There's no rationale for replacing every texture with one 128 times larger than the screen can display!
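The face example is easy to verify: a 1080p frame holds about half the texels of a single 2K map, and a 16K map holds roughly 130x more texels than the frame can show at once:

```cpp
#include <cstdio>

// Screen pixels vs texture texels: a 1080p frame is about half the
// texels of one 2K map, while a 16K map carries ~130x more texels
// than the frame can display at once.
int main() {
    const double frame  = 1920.0 * 1080.0;    // ~2.07M pixels
    const double tex2k  = 2048.0 * 2048.0;    // ~4.19M texels
    const double tex16k = 16384.0 * 16384.0;  // ~268M texels
    printf("1080p frame / 2K map:  %.2f\n", frame / tex2k);    // ~0.49
    printf("16K map / 1080p frame: %.0fx\n", tex16k / frame);  // ~129x
    return 0;
}
```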
XDR is very hard to predict, unfortunately, due to its low use. GDDR, however, should be predictable.
Nintendo? You have your Rambus history mixed up. Sony has been their big partner.
You are way off.
In 2005, the average PC had 256MB of VRAM and 1GB of system RAM.
As of today, the average PC has 1GB of VRAM and 4GB of system RAM (look at the Steam hardware survey).
This is a more likely transition, considering that in 7 years we had a 4x jump:
2011: 1 GB VRAM , 4 GB system ram
2013: 2 GB VRAM, 8 GB system ram
2016: 4 GB VRAM, 16 GB system ram
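Read as a growth rule, that table amounts to a doubling of both pools roughly every two to three years from the 2011 Steam-survey baseline; a sketch of that cadence (the cadence itself is an assumption for illustration, not a forecast):

```cpp
#include <cstdio>

// The table above read as a growth rule: both pools double roughly
// every two to three years from the 2011 baseline. The cadence is an
// assumption for illustration, not a forecast.
int main() {
    int year = 2011, vramGB = 1, sysGB = 4;
    printf("%d: %2d GB VRAM, %2d GB system RAM\n", year, vramGB, sysGB);
    for (int gap : {2, 3}) {  // 2011 -> 2013 -> 2016
        year   += gap;
        vramGB *= 2;
        sysGB  *= 2;
        printf("%d: %2d GB VRAM, %2d GB system RAM\n", year, vramGB, sysGB);
    }
    return 0;
}
```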
Bandwidth and processing power are much more of a constraint on performance.
After listening to a recent interview with Carmack, I really am in agreement that the next big leap will come through display technology, specifically head-mounted displays. Looking at the environments in games like, say, Skyrim or Rage, the way we are viewing games at the moment is the limiting factor, not the graphics themselves in many cases. People still scoff at VR as a pipe dream, but I really do think that for next gen, graphics and environments will be at such a level that it's time for someone to take another proper crack at it. Sony seem to be doing some work in this area; let's hope they can come up with something good. The advancement in OLED display tech is promising for HMDs.
I wouldn't call 8GB anything near high end these days; you can pick it up for $50. 12-16GB is high end, with 24GB being possible but extreme high end.
Also, you could easily get 4GB for around $100 back in 2005. So while it was closer to high end, it did exist.