Can bandwidth really be made redundant? I'd think no.
I never said that....
1080p, 60 fps, high-res GT5 is enough for me.
Or Crysis 1 at highest settings, 1080p, 60 fps, is also enough for me.
I have doubts that next gen consoles could ever manage that...
Well, it was a trade-off. They wanted one pool of RAM, but then the bandwidth wasn't enough, so they had to go with eDRAM. Besides, in 2005 they couldn't get more than 10 MB of eDRAM anyway.

But then the RSX has more bandwidth to RAM, as it has two buses feeding it, and you also have the downsides of eDRAM, which you've failed to list.
Personally, I feel that the eDRAM in the 360 was a complete waste of time; it's just too damn small.

Maybe if it had been implemented like the PS2's eDRAM, allowing the frame buffer and assets to be stored there, it would have been a better option. As it is, it's essentially nothing more than a scratchpad, which over the years has been the cause of many a game running below 720p due to it being only 10 MB.
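To put rough numbers on that 10 MB limit, here's a quick back-of-the-envelope sketch in Python. The 8 bytes per sample (32-bit colour plus 32-bit depth/stencil) is an assumption for illustration; real render-target formats vary.

```python
# Back-of-the-envelope framebuffer sizes vs the 360's 10 MB eDRAM.
# Assumes 32-bit colour + 32-bit depth/stencil per sample (8 bytes/sample);
# real titles use other formats too, so treat this as a sketch.

EDRAM_BYTES = 10 * 1024 * 1024  # 10 MB

def framebuffer_bytes(width, height, msaa=1, bytes_per_sample=8):
    """Colour + depth for one render target at the given MSAA level."""
    return width * height * msaa * bytes_per_sample

targets = {
    "720p, no AA":      (1280, 720, 1),
    "720p, 2xMSAA":     (1280, 720, 2),
    "720p, 4xMSAA":     (1280, 720, 4),
    "1024x576, 2xMSAA": (1024, 576, 2),
}

for label, (w, h, msaa) in targets.items():
    size = framebuffer_bytes(w, h, msaa)
    fits = "fits" if size <= EDRAM_BYTES else "needs tiling or a lower res"
    print(f"{label}: {size / 2**20:.1f} MB -> {fits}")
```

Under these assumptions, plain 720p fits with room to spare, anything with MSAA blows past 10 MB (hence predicated tiling), and dropping to a sub-HD resolution buys the headroom back.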
And with the industry focusing heavily on post-process AA and moving away from bandwidth- and memory-heavy MSAA, would eDRAM even be useful next gen?
By using post-process AA, they drastically reduce bandwidth consumption and thus make the high-bandwidth advantage of eDRAM, for the most part, redundant.
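For a rough sense of the bandwidth gap being claimed here, a hedged sketch: the overdraw factor, bytes per sample, and the "one extra full-screen pass" cost of post-process AA below are all illustrative assumptions, not measured numbers.

```python
# Rough per-frame framebuffer traffic at 720p / 60 fps.
# Assumptions: 8 bytes/sample (32-bit colour + 32-bit depth), overdraw
# factor of 3, and post-process AA costing one extra full-screen
# read + write of the resolved 32-bit colour buffer.

PIXELS = 1280 * 720
BYTES_PER_SAMPLE = 8
OVERDRAW = 3
FPS = 60

def gbps(bytes_per_frame):
    """Convert per-frame bytes to sustained GB/s at the target frame rate."""
    return bytes_per_frame * FPS / 1e9

# 4xMSAA: every framebuffer touch happens per sample.
msaa4_frame = PIXELS * 4 * BYTES_PER_SAMPLE * OVERDRAW

# Post-process AA: single-sample rendering plus one extra resolve pass.
postaa_frame = PIXELS * BYTES_PER_SAMPLE * OVERDRAW + 2 * PIXELS * 4

print(f"4xMSAA:          ~{gbps(msaa4_frame):.1f} GB/s")
print(f"post-process AA: ~{gbps(postaa_frame):.1f} GB/s")
```

Even with these crude numbers, the multisampled path multiplies framebuffer traffic by the sample count, while the post-process path only adds one cheap full-screen pass, which is the "redundant eDRAM" argument in a nutshell.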
Games heavy on transparencies generally cope better on the 360, and there are more sub-HD, no-AA games on PS3, so I don't exactly see your point.
it may be stupid to ask; what is the full form or meaning of IQ in games?

IQ = Image Quality. It's a measure of resolution, antialiasing, and texture filtering. If you consider a collection of random geometric objects like cubes and pyramids, textured with checkers, all thrown around a scene, then a poor-IQ image will have chunky or blurry pixels (upscaled), jaggies, and horrible texture aliasing. Perfect IQ would look like a photograph.

I would put that down to memory differences between the two machines and not processing power, especially with the 360 being the lead development platform for most third-party games.

What? Missing grass? Lower-res particles? Less smoke? That's a bandwidth problem, not a memory-amount problem.

And again, I doubt the 360 is the lead platform for most third-party games. At least it seems that EA, Ubisoft and other big houses have led on PS3 in the last few years.
My impression is that that's because PS3 -> 360 is easier than 360 -> PS3. And, sadly for MS, the PS3 toolchain is faster, which translates directly to dollars saved for a studio.

Yeah, probably because you have to go parallel early, and that way PC and 360 both benefit. That, and the fact that the memory/bandwidth setup in the 360 gives developers fewer headaches, so they can go a bit overboard (at least when the 360 had a clear memory advantage).
I always thought MS had a lead in ease of development tools...
Sony will opt for AMD's quad-core APU (accelerated processing unit) codenamed 'Liverpool,' according to multiple reports in June. It's tipped to be built on a 28-nanometer process. The smaller this number, the more transistors can be fitted into the same space on the chip, and the lower the power consumption, but the more complicated the chip is to build. For context, PS3's Cell processor shrank from 90nm to 45nm over the console's six-year life.
The clock speed is 3.2 GHz, which, while not lightning fast, will be supplemented by powerful graphics hardware: the Radeon HD 7970 (currently £300 on its own) is being linked to PS4.
Sony will be looking to assemble PS4 from 'off the shelf' PC parts, reducing costs and making it easier to program for. This is in contrast to PS3's Cell chip, which its creator Ken Kutaragi once envisioned appearing in freezers and TVs as part of a parallel processing network. And look how that worked out.
AMD's chips allow for easy porting of code, theoretically preventing the issues surrounding, say, the PS3 port of Skyrim compared to Xbox 360. It'd be easier for developers to get PS4 games up and running, without waiting years for them to learn its unique tricks.
PS4 is having a 7970!

They're regurgitating old rumours, and know nothing more than the rest of us.
Honestly, I have never been seriously worried about the processing power of the CPU/GPU bundles in next-gen consoles. They will be hugely more powerful than the current generation, and that's for sure, given the huge shrink from a 90nm process to 28nm plus the huge gains in GPU efficiency (looking at what a Vita or an iPhone 5 can do graphically with so little TDP and so few transistors).
My real concern (for all consoles, historically speaking) was, is, and will always be memory amount + bandwidth. Memory amount + bandwidth... that's what limits the use of a console's processing power, and that's what limits the amount and quality of art and textures in console games. That's also what limits the size of gameplay environments, the possibilities of gameplay, the number of animations, branching stories, and the diversity of gameplay mechanics and choices... in short, that's what limits the innovation and immersion of games.
Give the PS3 and Xbox 360 more RAM and more bandwidth, and for sure they could still amaze us in terms of gaming experiences for at least another 2-3 years...