"4xAA": Microsofts big mistake.

Titanio said:
b) are running on a card with less bandwidth to memory than RSX. I'm not saying that those necessarily balance out the first flaw, just that it's not a one-way street.

I think this second point is not realistic at all.

RSX has a bandwidth of 22.4GB/s to the 128MB of GDDR3. To the XDR, RSX has a bandwidth of 20GB/s write and 15GB/s read. So in a best-case situation RSX has either 42.4GB/s of write bandwidth or 37.4GB/s of read bandwidth.

To compare, a stock 7800GTX has 38.4GB/s of bandwidth.

So in the best-case scenario the RSX will have 4GB/s more bandwidth, and in the worst-case scenario (where RSX is given bandwidth priority over CELL) it will have 1GB/s less bandwidth. Without knowing the overhead of accessing XDR versus a dedicated pool, I think it is a little early to say definitively that RSX has more bandwidth than G70.

Furthermore, in the above example of maximum bandwidth (42GB/s) we are assuming RSX is eating all but 5GB/s of system bandwidth.

No one can say with a straight face that game designers are going to do that; and that just dovetails with the previous point: If you give CELL 5GB/s of bandwidth for system memory you are going to be CPU limited.

And with 5GB/s for CELL (which is going to need the bandwidth to get close to theoretical performance) you have less than a modern PC (6.4GB/s with DDR400). I don't think I need to connect the dots on how such a game would most likely look. The fact it is getting free AA because it is CPU bound probably wouldn't impress many gamers ;)
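
A quick back-of-envelope sketch of the numbers above (just an illustration; the 25.6GB/s XDR total is an assumption on my part, everything else is from the figures quoted in this post):

# Back-of-envelope totals from the figures above.
# The 25.6 GB/s XDR total is an assumed figure; the rest comes from the post.

GDDR3_LOCAL = 22.4                  # GB/s, RSX's dedicated GDDR3 pool
XDR_WRITE, XDR_READ = 20.0, 15.0    # GB/s, RSX to/from XDR
GTX_7800 = 38.4                     # GB/s, stock GeForce 7800 GTX
XDR_TOTAL = 25.6                    # GB/s, total XDR bandwidth (assumption)

best_write = GDDR3_LOCAL + XDR_WRITE    # 42.4 GB/s
best_read  = GDDR3_LOCAL + XDR_READ     # 37.4 GB/s

print(f"RSX best-case write: {best_write:.1f} GB/s ({best_write - GTX_7800:+.1f} vs 7800GTX)")
print(f"RSX best-case read:  {best_read:.1f} GB/s ({best_read - GTX_7800:+.1f} vs 7800GTX)")

# If RSX really ate that much XDR bandwidth, roughly this much is left for CELL,
# versus the 6.4 GB/s a DDR400 PC gets.
print(f"XDR left for CELL:   {XDR_TOTAL - XDR_WRITE:.1f} GB/s (DDR400 PC: 6.4 GB/s)")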


In reality, the G70 will have quite a bit more memory bandwidth than RSX in most games.
 
Brimstone said:
I'm not saying that the RSX is better. Nvidia's marketing has damaged one of the key aspects of Xenos. The A.A. capabilities of the RSX aren't even in the same ballpark as Xenos's, yet mainstream consumers reading the press are going to conclude there isn't much special about the second 10 MB core Xenos has.

I think you are misunderstanding the slides then. This is not an RSX vs. Xenos slide. Neither GPU is mentioned.

Out of the VERY slim percentage of consumers who see this slide, only the small population that visits forums like B3D would even bother to begin comparing RSX and Xenos.

So mainstream consumers are not even in the equation for this slide. Even my friends who are gamers would not be able to repeat the memory pool bandwidths OR the fact that Xenos has 10MB of eDRAM. Mainstream consumers are going to see the end product at Walmart and EB and other stores in demo units. That is all they know--especially as these are not super-hyped E3 Xenos vs. RSX slides. The presence, or lack, of AA will be telling on screen, as MS has mass-ordered HDTVs from Samsung for their kiosks.

As for us, just look at the threads that have discussed it. These slides are clearly misleading if we are comparing Xenos and RSX. The 720p games tend to be CPU bound, and modern games at 1080p take a huge hit with 4xAA. And as I hinted in my above post, the bandwidth situation in the PS3 is NOT better than a G70's. To believe that RSX will be LESS CPU bound and have more bandwidth is a dream world.

That said, in a dedicated box I am sure developers as smart as the guys here will figure this stuff out and make the best choices. The fact that RSX probably won't support HDR and MSAA at the same time could make the equation easier: AA or HDR. Only one bandwidth-sucking feature.

I think we will see a lot of PS3 games with AA, so don't get me wrong. But the idea AA is free on RSX is a joke.

Anyhow, I never saw anyone look at an MS slide and say, "It is fake but it is a huge PR win and will hurt Sony". I think it is understood that stuff like this, Maj. Nelson, and even E3 slides only appeal to a very small demographic--US! And even we are waiting for the GAMES.

All the numbers in the world mean squat. 3 symmetric cores and 6 HW threads, or 218GFLOPs, or 256GB/s bandwidth. Whatever. Show us how it looks in game. At this point we are still waiting for both companies to show us some real next-gen games. The jury is out until that changes.
 
Acert93 said:
I think this second point is not realistic at all.

RSX has a bandwidth of 22.4GB/s to the 128MB of GDDR3. To the XDR, RSX has a bandwidth of 20GB/s write and 15GB/s read. So in a best-case situation RSX has either 42.4GB/s of write bandwidth or 37.4GB/s of read bandwidth.

To compare, a stock 7800GTX has 38.4GB/s of bandwidth.

So in the best-case scenario the RSX will have 4GB/s more bandwidth, and in the worst-case scenario (where RSX is given bandwidth priority over CELL) it will have 1GB/s less bandwidth. Without knowing the overhead of accessing XDR versus a dedicated pool, I think it is a little early to say definitively that RSX has more bandwidth than G70.

There is an overhead, but that's more of a latency question than one of bandwidth. And there's 256MB of GDDR3, as I'm sure you'll remember.

Acert93 said:
Furthermore, in the above example of maximum bandwidth (42GB/s) we are assuming RSX is eating all but 5GB/s of system bandwidth.

No one can say with a straight face that game designers are going to do that; and that just dovetails with the previous point: If you give CELL 5GB/s of bandwidth for system memory you are going to be CPU limited.

And with 5GB/s for CELL (which is going to need the bandwidth to get close to theoretical performance) you have less than a modern PC (6.4GB/s with DDR400). I don't think I need to connect the dots on how such a game would most likely look. The fact it is getting free AA because it is CPU bound probably wouldn't impress many gamers ;)


In reality, the G70 will have quite a bit more memory bandwidth than RSX in most games.

I'll agree on that; I was just looking at totals and comparing. The first point stands, though. Any comparison of a PC to a closed box has got to be very, very qualified.
 
What's going on in this thread?

ATI develops a part that has free 2X and almost free 4X MSAA. MS advertises it.

Nvidia claims that 4X MSAA is almost free in Call of Duty or UT2K4 with their new part.

The result of this should be that MS was wrong to advertise one of the fortes of their new console? The mind boggles... really...
 
Vysez said:
The result of this should be that MS was wrong to advertise one of the fortes of their new console? The mind boggles... really...

As Dave would say:

"G70 AA performance in PC games circa 2003-2004 is not orthogonal to AA performance in next generation console games."(TM)

:LOL:
 
I really don't think it matters. In the end it will come down to which GAMES look better. Period.

Consumers mainly go by what they see on screen, so Nvidia can do PR all day long, but in the end they won't be getting free AA, and if that results in lower IQ the consumer will see it.
 
Well when it comes down to it, I think NVidia will go with the HDR and leave the AA to natural downsampling on the expected TV sets. Shouldn't be a big deal - but we'll see.

Anyway, those slides, those slides... I swear it's because NVidia was worried about a halo effect forming around ATI and their AA, and not as support for RSX. It just is too... too random.
 
Maybe some sort of pre-emptive strike?

Maybe they sense some upcoming benchmarks from MS.

I bet that MS will at some point release slides showing top-end games, and the AA hit for the G70 vs. Xenos. If G70 really is taking a 10-40% hit in FPS with AA on, in top end games, that sure would make a sweet comparison graph for MS's PR team.
 
> "while Xenos does 4xAA and FP10 (or FP16), there will be a distinct IQ difference"

We shall see 1) if there is much of a difference, my guess being there will not be, and 2) whether it even makes a difference in most people's decisions, which, going by the whole PS2 versus Xbox visuals comparison, it made very little.
 
Acert93 said:
Not all the facts.

In fact, at 1080p the G70 takes a ~40% performance hit with 4x MSAA in a couple of modern games like D3. So comparing how the G70 does on old DX7/DX8-level games is very misleading as to how well it will perform in games with much higher requirements.

The PS3 and 360 are expected to have games that are a couple of steps beyond D3. Just look at the high-detail / high-geometry Sony render targets. This will have an even more significant hit on performance.

And the eDRAM is not only for AA, but also Alpha blends, Z, and other backbuffer tasks.
Perfectly sums up my response.


Brimstone said:
When talking about A.A. and filtering, it can get pretty esoteric and put people to sleep. So far Nvidia has totally neutralized the Smart Memory Xenos has. Nvidia has taken a scalpel to the Xenos A.A. capabilities and carved it up with the skill of a surgeon.
I don't think the marketing has reached many who cannot make the educated analysis seen in this thread. And even if it extends beyond where it is now, people won't care too much. Why? You said it yourself: talking about AA puts people to sleep. If that's true, it doesn't matter if nVidia markets against Xenos' Smart Memory.
 
scooby_dooby said:
I bet that MS will at some point release slides showing top-end games, and the AA hit for the G70 vs. Xenos. If G70 really is taking a 10-40% hit in FPS with AA on, in top end games, that sure would make a sweet comparison graph for MS's PR team.

I doubt that comparison would come from MS unless they got the cooperation of third parties - and that's not something most would probably feel comfortable partaking in. Even then, it'd be a comparison of multiplatform games, not games built specifically around each format. edit - assuming you mean xenos vs rsx..

I also don't know if it's so clear-cut.. any hit (and the G70 hit you're quoting) would be relative to its own performance without AA, not to Xenos's performance. It's been mentioned that Xenos's AA is "free," but ultimately that's not the case - it has cost them dollars/silicon, dollars/silicon that could have been spent elsewhere in the GPU. They've enshrined that choice in hardware; on RSX it's up to the developer to make the tradeoff.
 
scooby_dooby said:
I bet that MS will at some point release slides showing top-end games, and the AA hit for the G70 vs. Xenos.
The Xenos is not a PC part, so how would MS run PC games on it?
And what about the x86 found in the PC and the PPC in the X360?

And what sense would that make from a PR point of view?
Xbox 360 runs PC games @xxfps and a PC with a GeForce 7800 runs PC games @ xxfps? So what? Should people therefore conclude that the Xbox 360 is a PC?
 
1) The AA being talked about, as far as we know, will not work with HDR on the RSX, since it isn't able to on the G70. Which means you get one or the other.

2) These games are all old, and the engines used for them are even older.

3) If you think "wow" graphics on the PS3 with free 4x FSAA will sell against what the X360 will have with free FSAA, then you have another thing coming.

It's really stupid to compare the two. We should wait until the RSX is in devs' hands and they start showing playable games before we comment on this.
 
Titanio said:
kyleb said:
Upping the resolution just makes more jaggies, albeit smaller ones. Sure, if they are small enough then they won't be noticed, but with consoles getting played on 50"+ screens, even 1080p has a rather low DPI compared to, say, a 21" monitor at 1600x1200.

I'm not disagreeing, but you're making the argument even more complex, and trying now to not only account for variables between games, but between gamers' setups too.

I'm not making anything more complex; I'm just pointing out how off base your assertion that AA isn't needed at 1080p is, by noting some rather obvious realities of the situation.
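
To put rough numbers on that DPI point (a quick sketch; treating the quoted sizes as diagonals at 16:9 and 4:3 respectively is my assumption):

import math

def ppi(width_px, height_px, diagonal_in):
    # pixels per inch along the display's diagonal
    return math.hypot(width_px, height_px) / diagonal_in

print(f'50" 1080p TV:      {ppi(1920, 1080, 50):.0f} PPI')   # ~44 PPI
print(f'21" 1600x1200 CRT: {ppi(1600, 1200, 21):.0f} PPI')   # ~95 PPI

So the big-screen 1080p case really is less than half the pixel density of the monitor example, even before you factor in viewing distance.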
 
Inane_Dork said:
Acert93 said:
Not all the facts.

In fact, at 1080p the G70 takes a ~40% performance hit with 4x MSAA in a couple of modern games like D3. So comparing how the G70 does on old DX7/DX8-level games is very misleading as to how well it will perform in games with much higher requirements.

The PS3 and 360 are expected to have games that are a couple of steps beyond D3. Just look at the high-detail / high-geometry Sony render targets. This will have an even more significant hit on performance.

And the eDRAM is not only for AA, but also Alpha blends, Z, and other backbuffer tasks.
Perfectly sums up my response.


Brimstone said:
When talking about A.A. and filtering, it can get pretty esoteric and put people to sleep. So far Nvidia has totally neutralized the Smart Memory Xenos has. Nvidia has taken a scalpel to the Xenos A.A. capabilities and carved it up with the skill of a surgeon.
I don't think the marketing has reached many who cannot make the educated analysis seen in this thread. And even if it extends beyond where it is now, people won't care too much. Why? You said it yourself: talking about AA puts people to sleep. If that's true, it doesn't matter if nVidia markets against Xenos' Smart Memory.

The GameSpot PlayStation 3 vs. Xbox 360 article is just as gloomy when it comes to A.A. and the Smart Memory core. It doesn't even mention that Xenos has 10 MB of eDRAM except in the final spec breakdown.

Apples to Apples on Graphics?
The Xbox 360's ATI graphics core also throws a wrench into our graphics comparison since it uses a newfangled Unified Shader Architecture that mixes pixel and vertex pipelines and makes comparison to older video card technology very difficult. The Xbox 360 graphics core may have 48 pipelines, but we don't know how powerful they are compared to dedicated pixel and vertex pipelines.

The PlayStation 3 has a pretty strong Nvidia graphics processor, but you can see how Sony may be afraid of the specification sheet comparison by the pipeline number conveniently omitted from the PS3 graphics specifications. We're guessing that the RSX graphics processor has a traditional, non-unified shader engine, so it likely has a smaller total "pipeline" number than the ATI chip. Even if the RSX's normal pipelines are more powerful than the Xbox 360's pipes, Sony doesn't want to risk printing a lower "pipeline" number since people won't understand that it isn't an apples-to-apples comparison.

So how many traditional pipelines does the RSX have? Sony has revealed that the RSX GPU has a 550MHz core clock and over 300 million transistors. Sony has also stated that the chip is more powerful than two GeForce 6800 Ultra cards put together. Your first guess might be that Nvidia simply doubled the pipeline number of the 6800 Ultra to make the RSX, but you also have to remember that the Ultra only clocked in at 400MHz. If the "double" performance measurement is based on fill-rate performance rather than hardware, the clock speed increase up to 550MHz is a clear sign that the hardware improvement isn't from a pure doubling of pipelines. We're guessing that the actual pipeline count is going to be 24, which is about right for 300 million transistors and, at 550MHz, gives just a slightly larger fill-rate than two GeForce 6800 Ultras clocked at 400MHz. Since the GeForce 6800 Ultra had 6 vertex pipelines, the RSX likely has 6 more vertex pipes in addition to the 24 pixel pipelines.

http://hardware.gamespot.com/Story-ST-15015-1985-4-4-x
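
For what it's worth, the fill-rate arithmetic behind that 24-pipeline guess roughly checks out (a back-of-envelope sketch; the one-pixel-per-pipe-per-clock model and the 6800 Ultra's 16 pixel pipelines are my assumptions):

def fillrate_gpix(pipes, clock_mhz):
    # simplistic fill rate: one pixel per pipeline per clock
    return pipes * clock_mhz / 1000.0   # Gpixels/s

two_ultras = 2 * fillrate_gpix(16, 400)   # 12.8 Gpix/s
rsx_guess  = fillrate_gpix(24, 550)       # 13.2 Gpix/s -- "slightly larger", as they say

print(f"2x GeForce 6800 Ultra: {two_ultras:.1f} Gpix/s")
print(f"RSX guess (24 pipes):  {rsx_guess:.1f} Gpix/s")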

Having 105 million transistors on a separate core isn't even mentioned. It just mentions a unified shader pipeline. This is a large, neutral multi-platform gaming site doing a PS3 vs. XB360 tech head-to-head comparison, and the magic of Xenos A.A. doesn't even get mentioned.



If you're paid to market a product, you should come up with a good way to communicate its strengths to the target audience as best you can. Microsoft is going to fork out a lot of money for the 10 MB Smart Memory core, and it's something that does give a substantial advantage in image quality. For a site like Gamespot to not even touch upon the subject is ridiculous.

Anand and Gamespot don't understand the A.A. situation because it's not conveyed to them.
 
From a PR standpoint, that's a huge performance advantage, and I'm sure they are looking for a way to get it across.

I guess it isn't likely for them to release PC benchmarks compared to the G70, as most people wouldn't have a clue that G70 = RSX anyway. And as you guys pointed out, there are big issues with that approach anyway.

So that probably won't happen, but I'm sure they are trying their best to come up with some way to show the consumers what a large performance hit the RSX may take w/ AA, and what a small hit the Xenos will take.

And Nvidia's charts are simply an attempt to squash the argument before it has begun.

As for talking about AA putting people to sleep: I don't know about that. When you talk about the prospect of no more "jaggies," people get very excited. Jaggies are hated by all, so most consumers do care about AA, even if they don't realize it.
 
I agree with Brimstone.

Bad job by MS here.


As I've mentioned a couple of times, I really think this is the reason for all the misconceptions and misrepresentation about the 256GB/s transfer rate.

MS has NO IDEA how to explain this advantage in a simple formula.

They needed a good name, like, "Reality Synthesizer". ;)

Oh, and as Acert said, maybe MS is counting on the fact that the 4-6 month(?) advantage will give them PLENTY of time to present their case on high-def Samsung kiosks.
 
kyleb said:
I'm not making anything more complex; I'm just pointing out how off base your assertion that AA isn't needed at 1080p is, by noting some rather obvious realities of the situation.

Read more carefully, I said it wasn't as necessary. It depends on your definition of "needed" too. Wanted? Sure.

As resolution scales up it becomes less of a problem. No AA at 1080p is a very different ballgame than no AA at 640x480, or even 720p (though the difference is obviously less). Beyond resolution there are other factors that can be considered with regard to perceived aliasing - factors that could help alleviate it further.

Talking about screen size, viewing distance, etc. - I'm not writing off their impact; of course they are factors, but they are variables outside of the developer's control. Perhaps they can target a "typical" screen size or whatever, but you'd have to define that. If you want to talk about the "typical" experience next-gen, none of this argument will be relevant anyway, really. No one with an SDTV will have any cause for complaint re. aliasing next-gen, I don't think.
 
Brimstone said:
Anand and Gamespot don't understand the A.A. situation because it's not conveyed to them.

Not true. Anand had an interview with ATI and I am sure he had access to the HardOCP and FiringSquad interviews as well. In all of them the eDRAM was explicitly mentioned. And Anand surely had access to Dave's information as well.

So Anand is very well aware of the information; it is just a matter of processing that information. Anand is a PC-centric site. As much as I like the site (I visit it daily), I would not take his view on the situation--from either side--as a real benchmark of how things are.

To put it a different way: I don't see you trumpeting the fact Anand calls into question certain aspects of CELL and the PS3 in general. It all depends on how you look at it.

As for GameSpot, what did you expect? They are a casual gaming site. From listening to their own audio shows, they themselves hardly understand the technical gibberish. Their stats were provided to them, and they really do not do a very informed job of dissecting them. Remember, IGN posted the Major Nelson info :oops:

GameSpot, IGN, GameSpy, etc... are all pawns. Tit for tat. Right now it is all hype for the machines... anything to keep consumers interested.

Come November 2005, all that will matter is what people can buy. These are not gaming PCs or PC gamers. These are mainstream console devices for mainstream consumers. Feature X and Feature Y are irrelevant in that consumers want good-looking games.

And the fact that we have people on these forums who argue they cannot see the difference between Xbox NCAA 2005 and PS2 NCAA 2005 (the former looks significantly better) makes me scratch my head at how these same people expect to see a big difference between 360 and PS3 games.

It really comes down to the games. But for those of us who love AA, I will take all I can get ;)
 
Won't console games have frame rates locked at 30 or 60?

IOW, you're not going to have an X360 game at 100 FPS while the PS3 version is at 50 FPS or vice versa.

If AA and all the other filtering prevents a game from hitting 30 or 60 FPS, the developer will turn off those detail-enhancing features.

Or, maybe one version will run at 60 FPS while the other version will run at 30 FPS but both will have comparable AA and HDR?
 