function, thanks for catching my mistake regarding months. I was at work and other stuff demanded attention as well. So it's an 11x reduction over 39 months, i.e. 3.25 years.
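Quick back-of-the-envelope on what that rate implies, for concreteness (my own arithmetic on those two figures, nothing more):

import math

# Implied rate of DRAM price decline: 11x cheaper over 39 months.
months, factor = 39, 11
per_year = factor ** (12 / months)      # ~2.09x cheaper every year
halving = months / math.log2(factor)    # price halves every ~11.3 months
print(f"{per_year:.2f}x per year, halving every {halving:.1f} months")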
As for hoho's post .... Ooooh boy. Here goes.
> Depends on how well your streaming system works. Also, just adding extra RAM won't help nearly as much for IQ if you don't pair it up with higher bandwidth, and that costs an arm and a leg compared to just using bigger memory chips.
Let's be honest here - streaming in textures mid-game is why most multiplatform games look so goddamn ugly, and it is the sole reason behind texture pop-in. Almost every Unreal Engine 3 game suffers from it badly, regardless of platform. Rage will suffer from it as well, as Carmack himself has admitted.
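To make the mechanics concrete, here is a minimal sketch of how that kind of streamer behaves (my own illustration with hypothetical names and latency, not any particular engine): the full-res mip of a texture is only fetched once it's needed on screen, and until the disk read completes you are staring at the blurry low-res fallback.

import collections, time

# Minimal texture-streamer sketch: high-res mips are fetched on demand.
DISK_LATENCY_S = 1.5   # illustrative optical-drive/HDD seek + read time

class Streamer:
    def __init__(self):
        self.pending = collections.deque()   # (texture, request time) queue
        self.resident_hi = set()             # textures whose full-res mip is in VRAM

    def request(self, tex):
        # Called the first time the renderer needs a texture up close.
        already_queued = any(t == tex for t, _ in self.pending)
        if tex not in self.resident_hi and not already_queued:
            self.pending.append((tex, time.monotonic()))

    def update(self):
        # A load completes only after the disk latency has elapsed; until
        # then the renderer samples the blurry low-res mip, and that lag
        # is exactly the texture pop-in the player sees.
        while self.pending and time.monotonic() - self.pending[0][1] >= DISK_LATENCY_S:
            self.resident_hi.add(self.pending.popleft()[0])

s = Streamer()
s.request("brick_wall_2048")   # hypothetical texture name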
> The amount of unique data you can see on screen per-frame is pretty much only limited by VRAM bandwidth. Capacity is nice too, but not nearly as important or useful, at least providing you have enough for getting the basics to work (= as much as competitors).
What use is the amount of unique data on screen if it comes in with a visible lag because some slow optical drive or HDD has to stream it there? Unique data is useful when you don't have to stream it in: you only get a frame free of pop-in if all the necessary data already fits in VRAM, right?
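For scale, here's a rough worked number on the bandwidth ceiling hoho describes (illustrative figures in the ballpark of this generation's 128-bit GDDR3 setups, not exact specs):

# Upper bound on data the GPU can even touch in one frame, from bandwidth alone.
bandwidth_gb_s = 22.4    # illustrative 128-bit GDDR3 figure
fps = 30
per_frame_mb = bandwidth_gb_s * 1024 / fps
print(f"~{per_frame_mb:.0f} MB touched per frame at {fps} fps")   # ~765 MB
# Real unique-texel throughput is far lower: that same traffic also covers
# framebuffer and z-buffer writes, geometry, and re-reads of the same data.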
> True, but that will NEVER happen in a console world where you have clearly defined hardware. How many games had significantly worse IQ on PS2 than XB, considering the latter had tons more RAM? How big a part of the IQ difference on PS3 vs XB360 was caused by lack of RAM* and not by differently allocated performance**?
> *) IIRC the PS3 reserves more RAM for the OS, and it also has two memory banks instead of one unified chunk.
> **) The PS3 having a generally weaker GPU and a stronger CPU.
You're already seeing it. This whole generation has been about optimizing the PS3 to somehow cope with multiplatform titles it has not been the lead platform for. If you have the time, look up Digital Foundry's platform comparisons.
The PS3 loses on IQ frequently, and oftentimes in texture quality (see BioShock), even though RSX is a much stronger texturing GPU than Xenos (13,200 MTex/s vs 8,000 MTex/s). I can only chalk it up to the PS3's hard 256MB of VRAM vs the up to ~384MB of unified RAM the X360 can devote to graphics, a 50% difference.
> Again, let's assume MS or Sony had doubled the amount of RAM their machine had vs the competitor. Do you think they could have had significantly higher IQ as a result and not be bottlenecked by having to stream twice as much data through their GPU?
> This was addressed before, but it's quite funny how THE most powerful platform, PC, hasn't been the lead platform for pretty much the entire lifespan of the latest-gen consoles.
What I do know is that if not for that 256MB VRAM limitation, the PS3 would have handed the X360 its ass in texture quality, having a wild 65% advantage in texturing power!
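Those numbers do line up with the commonly cited specs, for what it's worth (a quick check using the usual TMU counts and announced clocks, which I can't verify beyond public spec sheets):

# Texture fillrate = texture units x core clock (announced figures).
rsx   = 24 * 550   # RSX:   24 TMUs @ 550 MHz -> 13,200 MTex/s
xenos = 16 * 500   # Xenos: 16 TMUs @ 500 MHz ->  8,000 MTex/s
print(rsx, xenos, f"{rsx / xenos - 1:.0%} advantage")   # 65% advantage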
> Texture resolution is only a tiny part of IQ. Shading is much more important, and bigger textures don't help you much there.
No amount of shading will reduce the blurriness of textures. It will only make them shinier.
> You were using a deeply flawed example. Consoles are not comparable to PCs, and as has been repeated several times already, simply having more RAM without increasing bandwidth and processing power will not give you higher IQ. The only reason your 320 vs 640MB example works is that no one expects any random PC GPU to handle anything thrown at it.
hoho, is that really you? What, so now console GPU and memory subsystems are not by and large the exact same technology used in graphics cards?? I know you know better than that.
The only difference is that once a console runs out of VRAM it goes to the HDD or, worse, the optical drive, while on the PC there is main RAM to hold that data, which is orders of magnitude faster than any HDD. Yet even that does not help.
So please tell me how you can fit 363MB of data into a 320MB buffer without reducing image quality (and that includes the pop-in textures that come with streaming)?
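To illustrate why that can't happen losslessly (my own numbers, assuming plain DXT1 textures, not taken from any particular game): the usual escape hatch is dropping each texture's top mip level, and that is by definition a reduction in image quality.

# Approximate footprint of a DXT1 texture (0.5 bytes/texel) with a mip chain.
def dxt1_mb(size, skip_top_mips=0):
    total = 0
    while size >= 4:                     # DXT1 works on 4x4 texel blocks
        if skip_top_mips > 0:
            skip_top_mips -= 1           # mip dropped: not stored in VRAM
        else:
            total += size * size * 0.5   # 0.5 bytes per texel
        size //= 2
    return total / 2**20

print(f"{dxt1_mb(2048):.2f} MB with full chain")    # ~2.67 MB
print(f"{dxt1_mb(2048, 1):.2f} MB minus top mip")   # ~0.67 MB, ~4x smaller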
> While the price of the chip is dropping, the cost of providing an N-bit-wide channel to it is pretty much constant, and overall it costs significantly more than the memory chip itself (more complex motherboard, memory controller, power usage, ...). So again, all you could get is more RAM at the same bandwidth: fewer loading screens and less need for streaming, but not much better IQ.
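The arithmetic behind that point is standard (illustrative clocks of my choosing): peak bandwidth comes from bus width times data rate, so denser chips add gigabytes without adding a single GB/s, while a wider bus means more traces, a pricier PCB and a bigger memory controller.

# Peak memory bandwidth = bus width (in bytes) x effective data rate.
def bandwidth_gb_s(bus_bits, effective_mhz):
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

print(bandwidth_gb_s(128, 1400))   # 22.4 GB/s on a 128-bit GDDR3 bus
print(bandwidth_gb_s(256, 1400))   # 44.8 GB/s needs double the traces;
                                   # doubling chip density instead leaves
                                   # you at 22.4 GB/s, just with more GB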
A game completely without texture or, worse, geometry pop-in, and with a normal draw distance?? An unimaginable luxury for the next gen?
What I don't understand is how you, who I know has lots of experience with 3D games, have not seen enough of the world to know that an HD 4850 with 1GB of VRAM today runs games much, much better than the original 512MB model, while having the exact same bandwidth and computational power?
> How nice of you to bring up a clearly undercut price point. For the next ~22 months the cheapest RAM was still significantly more expensive than that DDR2 there.
> Also, in the same list there was 512MB of SDR for $39, or 1GB for $78, on sale in August 2003. That doesn't quite match your nice prediction of price reduction.
Memory industry prices operate on volume. The more volume, the cheaper the parts. Once a memory type falls out of volume, its price goes straight up. That's nothing new.
Do you think that 100 million plus another 100 million consoles, times 4-8GB each, is enough volume to keep prices down over a 10-year cycle, factoring in that PC graphics cards will use the same memory type for a significant portion of that period?
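Just for scale, the demand those figures imply (my arithmetic on the post's own numbers):

# DRAM demand implied by two ~100M-unit console bases at 4-8 GB each.
consoles = 200_000_000
low, high = consoles * 4, consoles * 8   # in GB
print(f"{low / 1e9:.1f}-{high / 1e9:.1f} billion GB of DRAM over the cycle")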
> Find me a game that is GPU-compute-limited* on a GTX 560 and compare the 1GB vs 2GB cards. I'd love to see the IQ and performance difference between them when you crank up the texture resolutions.
> *) Because that's what really limits game IQ and that's what gets bottlenecked first on a console. Anecdotal "evidence" from a game that doesn't have any kind of half-decent streaming is not an example of what happens in the real world on consoles.
Find me a game on PS3 that can bottleneck the texturing capability of RSX. You can't, because the PS3's 256MB of VRAM cannot hold the amount of textures needed to do that in the first place.
These are the sad realities of the soon-to-be-last-gen.