Shifty Geezer said:
The term 'Free' is what's misleading. If they called it 'hard-coded', would that settle the issue? It's not free: it's cost the system transistors that could have been used for other areas. (Yep, we've had this debate before.)
Run Shifty, come back after the rant
/begin rant
I never did get involved in that flaming thread, but I might as well say it here:
The individuals arguing "It is not free!!!" and calling it misleading have intentionally taken what ATI said completely out of context and twisted it into fodder for the console debate. All ATI ever said was that, from a performance perspective, 2x AA has no performance hit and 4x AA has a 1-5% hit, so AA could realistically be called "free" from a performance perspective.
AA was never said to be "free" from a transistor-budget perspective. That was a made-up argument by Sony apologists to downplay the effects of the eDRAM--and the downplaying has been significant on the forum.
There has been no end to two recurring themes on B3D since E3:
1. Downplay the CELL SPEs in the PS3
2. Downplay the eDRAM / Unified Shaders in the Xbox 360
Sony and MS fans are equally guilty. Ironically, if people stopped and thought about how each design is built around its own philosophy and plays to its own strengths, I think they would appreciate both machines more.
Hint: note where the memory controller is in each system. One is CPU-centric, the other GPU-centric. The CPU-centric design has a revolutionary CPU; the GPU-centric design has a revolutionary GPU.
That is not to say that a thoughtful individual, like Shifty, cannot raise realistic concerns about a design (e.g. unified shader performance or ease of use, the tiling performance hit, whether FP10 will show significant artifacts, how realistic it is for developers to cleanly partition their code across 7 SPEs and have it perform adequately, and so forth).
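To put rough numbers on that tiling concern, here's a quick back-of-the-envelope sketch. The figures are my own assumptions for illustration (a 1280x720 target, 32 bits of color plus 32 bits of Z/stencil per sample); they show why 720p with MSAA overflows the 10MB of eDRAM and forces tiling:
[code]
#include <math.h>
#include <stdio.h>

int main(void) {
    /* Assumptions (mine, for illustration): 1280x720 render target,
       32-bit color + 32-bit Z/stencil per sample, 10MB of eDRAM. */
    const double edram_bytes = 10.0 * 1024 * 1024;
    const int width = 1280, height = 720;
    const int bytes_per_sample = 4 + 4;   /* color + Z/stencil */

    for (int samples = 1; samples <= 4; samples *= 2) {
        double fb = (double)width * height * bytes_per_sample * samples;
        int tiles = (int)ceil(fb / edram_bytes);   /* tiles needed to fit */
        printf("%dx AA: %4.1f MB framebuffer -> %d tile%s\n",
               samples, fb / (1024.0 * 1024.0), tiles, tiles > 1 ? "s" : "");
    }
    return 0;
}
[/code]
Under those assumptions, 1x fits in one pass (~7MB), 2x needs two tiles (~14MB), and 4x needs three (~28MB) -- which is exactly where the tiling cost Shifty worries about comes from.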
But as Shifty has noted, a lot of discussions have veered away from what was actually said, or from an even-handed evaluation of the information released, toward:
MS Fan: Poo Poo on Sony news/media
Sony Fan: Poo Poo on MS news/media
That includes the leap from "AA is free from a performance perspective on C1" to "AA is not free! It took transistors". That is a silly leap in logic that ATI/MS never made. IMO the fair way to frame the question is:
- "Was 80M transistors a good tradeoff to prevent AA, Z, Alpha, and other framebuffer tasks from being a system bottleneck when those transistors cost potential shader realeastate/performance?"
That framing prevents fanboy wars. The following, by contrast, 1. twists the facts and 2. invites an argument, and is really not a stance that promotes useful discussion:
- "AA is not free on the C1"
Some call it semantics, but then again, semantics is what this forum is all about
/end rant
Shifty Geezer said:
That's a marked difference to Xenos's situation. As PC software gets more complicated, those frame rates are going to drop. The G70 slides from nVidia that included the statement 'AA for Free' showed that on older, simpler software this is true - AA didn't affect performance. But on modern and future games, AA penalties were as high as ever.
Pretty fair summary of the G70 info. No one knows exactly what tweaking has gone into RSX, but at 550MHz and linked to CELL it should be a strong performer--much stronger than the G70. My guess is that, between the closed box and the tweaked design, AA will still carry a hit at higher resolutions in modern games, but less of one than the G70 numbers show.
I know, talk about the master of the obvious here
Shifty Geezer said:
I can see this being a bone of contention that no one can agree on
Why agree on something when we can spend the next 6 years arguing about it? Calling "truths" before a system even launches? Pffffft!!
IMO both companies have divergent philosophies but similar foundations. The PPC CPUs and NV/ATI GPUs are the similar foundations; ditto 512MB of memory, sound processing on the CPU, and so forth.

The big difference, IMO, is that MS's strengths may show up sooner on screen, at least in areas involving the framebuffer (AA, stencil shadows, Z, alpha blends). The unified shaders, how much vertex work to offload to the CPU, and the other ins and outs will take more time. Sony will get a good shot in the arm from a fairly standard GPU, but won't get the immediate perks the eDRAM offers.

Devs will master all of that with time (some even at launch), but where the PS3 will really begin to shine is when developers start compiling libraries of code threaded to run well on the SPEs. My guess is that last-gen PS3 games will look so much better than 1st-gen PS3 software that we won't recognize them as coming from the same system. It will take developers time to learn which code can be designed to make the best use of the system, and then to restructure it for the 7 SPEs, but once that is accomplished the games will look like night and day.
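To make that last point concrete: the sketch below is not Cell SDK code (every name in it is made up, and real SPE code would use the SPE runtime and DMA rather than pthreads). It's just a generic C illustration of the job-queue decomposition that SPE-friendly code tends toward -- break the frame into small independent jobs and let 7 workers drain the queue:
[code]
#include <pthread.h>
#include <stdio.h>

/* Hypothetical illustration only: 7 workers stand in for the 7 SPEs,
   and the job counter stands in for batches of skinning/physics/audio
   work carved out of the frame. */

#define NUM_WORKERS 7
#define NUM_JOBS    64

static int next_job = 0;  /* shared cursor into the job queue */
static pthread_mutex_t queue_lock = PTHREAD_MUTEX_INITIALIZER;

static void process_job(int job) {
    /* Stand-in for real work on one independent batch of data. */
    printf("processed job %d\n", job);
}

static void *worker(void *arg) {
    (void)arg;
    for (;;) {
        /* Grab the next job index under the lock, or quit if drained. */
        pthread_mutex_lock(&queue_lock);
        int job = next_job < NUM_JOBS ? next_job++ : -1;
        pthread_mutex_unlock(&queue_lock);
        if (job < 0)
            return NULL;
        process_job(job);
    }
}

int main(void) {
    pthread_t threads[NUM_WORKERS];
    for (int i = 0; i < NUM_WORKERS; i++)
        pthread_create(&threads[i], NULL, worker, NULL);
    for (int i = 0; i < NUM_WORKERS; i++)
        pthread_join(threads[i], NULL);
    return 0;
}
[/code]
The hard part, as the paragraph above says, isn't the queue -- it's carving game code into jobs that are genuinely independent and whose working data fits an SPE's 256KB local store.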
Sony and MS are really putting forth a great first step this generation, with the best hardware relative to the market that we have ever seen. Like all console launches there are hurdles, concessions, and disappointments, but overall I do not think we could have asked for anything more. MS and Sony fans are going to be in for a treat over the next 5 years.
And the divergent designs will allow all of us to argue which was better for the next 10 years!