What do you expect for R650

What do you expect for HD Radeon X2950XTX

  • Faster than G80 Ultra by about 25-35% overall

    Votes: 23 16.4%
  • Faster than G80 Ultra by about 15-20% overall

    Votes: 18 12.9%
  • Faster than G80 Ultra by about 5-10% overall

    Votes: 18 12.9%
  • About the same as G80 Ultra

    Votes: 16 11.4%
  • Slower than G80 Ultra by about 5-10% overall

    Votes: 10 7.1%
  • Slower than G80 Ultra by about 15-25% overall

    Votes: 9 6.4%
  • I cannot guess right now

    Votes: 46 32.9%

  • Total voters
    140
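For anyone checking the numbers, the percentages above are simply each option's vote count divided by the 140 total voters; a quick sketch to verify, with the counts copied from the poll:

```python
# Verify the poll percentages: each option's votes / total voters.
votes = {
    "Faster than G80 Ultra by about 25-35%": 23,
    "Faster than G80 Ultra by about 15-20%": 18,
    "Faster than G80 Ultra by about 5-10%": 18,
    "About the same as G80 Ultra": 16,
    "Slower than G80 Ultra by about 5-10%": 10,
    "Slower than G80 Ultra by about 15-25%": 9,
    "I cannot guess right now": 46,
}

total = sum(votes.values())  # 140 voters
for option, count in votes.items():
    print(f"{option}: {count} votes ({count / total:.1%})")
```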
Don't say that out loud. The lads over at Rage are convinced that nVidia is doing Transparency MSAA whilst ATi is doing Quality ADAA (meaning SSAA), and that it was actually the driver fairy that brought the performance improvements, not the implementation of EATM. :D

Sorry, I have been gone a while.

And to get back to you on your false assumption: EATM did not replace SSAA.
 
He was talking about the alpha/beta Oblivion drivers, and they were doing it. That was before you were a beta tester, btw.
 

He was referring to driver 8.37.x.x (which is not like 7.5), and as far as I know it is also not an Oblivion driver, as there is no such thing as a driver made just for one game. Now, there are patches or hotfixes that get applied to drivers specifically for games at times, AKA the Chuck patch.

Now tell me this...

Why would ATi replace SSAA with a mode that has issues with semi-transparent and missing objects in fields? :p
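To illustrate the point about game-specific hotfixes, the general idea (this is not ATi's actual code, just a rough sketch with made-up profile and setting names) is that the driver detects the running executable and layers a per-application workaround on top of the regular driver path:

```python
# Rough, illustrative sketch of a per-game driver hotfix/profile lookup.
# Not ATi's actual implementation; the table and setting names are hypothetical.

# Hypothetical per-game overrides layered on top of the base driver settings,
# e.g. the Oblivion Adaptive AA speedup or the "Chuck patch" HDR+AA fix.
GAME_PROFILES = {
    "oblivion.exe": {"adaptive_aa_fast_path": True, "allow_hdr_with_aa": True},
}

def settings_for(exe_name, base_settings):
    """Return the base driver settings plus any per-game overrides."""
    merged = dict(base_settings)
    merged.update(GAME_PROFILES.get(exe_name.lower(), {}))
    return merged

# The same driver behaves normally for other games and applies the hotfix here:
print(settings_for("Oblivion.exe", {"adaptive_aa_fast_path": False,
                                    "allow_hdr_with_aa": False}))
```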
 

Well then, connect to these sites and find out why they wrote that they got drivers specific to Oblivion with the launch of the 2900 series:

Techreport
Digit-life
[H]

Alpha Driver – Oblivion Adaptive AA Improvements

Toward the end of wrapping up our evaluation a new alpha driver was dropped on us last minute. This alpha driver contained some performance improvements specific to Oblivion running with Adaptive AA that is a preview to what we will see in an upcoming driver release. Though we aren’t using Adaptive AA in Oblivion there is a slight performance increase with this alpha driver overall in Oblivion. However, in our testing it was not enough of a performance increase to change the settings we found playable.

With the 8.37.4 driver as you can see above we found 1600x1200 playable with 2X AA and 25% grass with HDR. When we installed the alpha 8.37.4.2 driver we still find these same settings as the highest playable game settings. We tried 4X AA but found performance to be unplayable in the same places we found it unplayable with the 8.37.4 driver. We tried pushing the grass up higher but found the same performance problem with grass.

Ah, the drama! The Radeon HD's new alpha driver allows it to just barely edge past the GeForce 8800 GTS 640MB OC.

And the driver that I know first-hand has this optimization is the 8.37.4.2.
 
Ahh, it's a driver that has specific improvements for Oblivion. You got me a bit confused calling it an Oblivion driver. I guess we have to call the 7.5 driver the CrossFire driver then. :D

From what I know/have been told/seen, EATM has not taken over from SSAA, and will not, due to some issues with it. It's simply a feature toggled through the registry at the moment, but hopefully in the future consumers will get the choice to use it as well as having the choice of SSAA.
 
And it shouldn't, imho! EATM may be a choice to improve performance in titles with heavy alpha use, where super-sampled is simply too much of a hit.

What would be nice for some is to see super-sampled adaptive AA improve in performance while still having the flexibility of EATM, too. This would be welcomed, imho!

EATM replacing super-sampled wouldn't work due to the limitations EATM has at times. Sure, EATM offers some nice quality for its performance and some very welcome IQ for the hit, but a more quality-oriented super-sampled mode may still be superior sample-for-sample when it comes to curbing shimmering.

Flexibility depends on the title and the resolution needed.
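For anyone wondering where the EATM-vs-super-sampled trade-off comes from: super-sampled adaptive AA shades and alpha-tests every sub-sample of an alpha-tested texel (grass, fences), while an EATM-style alpha-to-coverage approach shades once and turns that single alpha value into a coverage mask. A very simplified sketch of the idea (illustrative only, not the actual hardware or driver path):

```python
# Very simplified model of per-pixel work on an alpha-tested texel (grass, fences).
# Illustrative only; not the actual hardware or driver path.

SAMPLES = 4  # e.g. 4x AA

def supersampled_adaptive_aa(sub_sample_alphas, threshold=0.5):
    """Shade and alpha-test every sub-sample: SAMPLES texture fetches per pixel,
    but thin detail survives because each sub-sample gets its own alpha test."""
    covered = [alpha >= threshold for alpha in sub_sample_alphas]
    return sum(covered) / len(covered)  # resolved coverage, 0..1

def eatm_style_alpha_to_coverage(pixel_alpha):
    """Shade once and convert the single alpha value into a coverage mask:
    only one texture fetch per pixel, but the mask is an approximation, which
    is where the semi-transparent / missing-detail artifacts come from."""
    covered_samples = round(pixel_alpha * SAMPLES)
    return covered_samples / SAMPLES

# A thin grass blade that crosses only one of the four sub-samples:
print(supersampled_adaptive_aa([0.9, 0.1, 0.1, 0.1]))  # 0.25 -> blade kept
print(eatm_style_alpha_to_coverage(0.3))               # 0.25 here, but driven by one
                                                       # averaged alpha, so detail can vanish
```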
 
Bump.

R650 isn't on the roadmap anymore. R680 is.

Release is January 2008, together with RV620/RV635 (and maybe RV670, if they don't meet their December timeframe for that card). It's on the roadmap, and all that's currently known is that it will support DX10.1 (SM4.1) just like the other new cards. It will support PCIe 2.0 and have an 800+ MHz clock combined with 512 MB / 1 GB of GDDR3. It will also have DisplayPort and be manufactured on a smaller process. And of course it has Enhanced CrossFire. The launch date is currently set for January, and surprisingly it will have UVD.
 
I hope it will beat the crap out of nVidia's next part (although I could not stand the ATI fanboys going: ZOMG TEY STRKE BACK!!!111 I TOLD JOO!11)

Why? Because AMD needs the money - and we need the competition.
 
Bump.

R650 isn't on the roadmap anymore. R680 is.

Release is January 2008, together with RV620/RV635 (and maybe RV670, if they don't meet their December timeframe for that card). It's on the roadmap, and all that's currently known is that it will support DX10.1 (SM4.1) just like the other new cards. It will support PCIe 2.0 and have an 800+ MHz clock combined with 512 MB / 1 GB of GDDR3. It will also have DisplayPort and be manufactured on a smaller process. And of course it has Enhanced CrossFire. The launch date is currently set for January, and surprisingly it will have UVD.

Small corrections :)
AMD R700 will be DX10.1 (SM4.1), but the R6xx series will stay at DX10 (SM4)...
 
I hope it will beat the crap out of nVidia's next part (although I could not stand the ATI fanboys going: ZOMG TEY STRKE BACK!!!111 I TOLD JOO!11)

Why? Because AMD needs the money - and we need the competition.

You look too optimistic. Users with a brain know AMD needs money, but they won't buy products from AMD just because AMD needs money, when those products aren't competitive in some way. (Mostly, users buy cards based on the performance aspect first, with other things like price, power consumption, and features coming after.)

In reality, when AMD releases a performance-competitive product, that is already a step forward from the current situation. Why? Because they get user respect back, and can sell their products at better prices and in higher volume, instead of having to balance the lack of performance with lower prices (which is happening now with the whole HD 2K family). But for all of that, they need to release on time, and not months after the competition.
 
Duh, I need money too...

As a user, I couldn't care less which company produced the product I buy, but I'll certainly always go for the product with the better price/perf/power ratio. If that makes the competition go bankrupt, so what? It's their fault (btw, I'm not talking about ATI specifically; if they make a better product I'll buy that and make nV bleed just the same). No one buys a crappy product just to make the company happy; that's ridiculous.
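A toy illustration of that price/perf/power rule, with completely made-up card names and numbers:

```python
# Toy example of "buy whatever has the best price/perf/power ratio".
# All card names and numbers below are made up for illustration.
cards = {
    "Card A": {"price_usd": 399, "avg_fps": 60, "power_w": 180},
    "Card B": {"price_usd": 299, "avg_fps": 52, "power_w": 130},
}

def value_score(card):
    # Higher is better: frames per dollar, lightly penalized by power draw.
    return card["avg_fps"] / card["price_usd"] / (card["power_w"] / 100.0)

for name, card in sorted(cards.items(), key=lambda kv: -value_score(kv[1])):
    print(f"{name}: score {value_score(card):.3f}")
```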
 
Missing the holiday season again? I'm sorry, that deserves a :rolleyes:
The holiday season is irrelevant here. If you release in December, you won't have any OEM deals, nor full channel presence. So December or January. It just doesn't really matter.

The reason why early Q1 is important is the notebook refresh cycle, btw; not the full new platform, just the refresh where the CPUs will be replaced by Penryns and the GPU MXMs might get replaced too. If you miss that Q1 cycle, then you at least need to make the Q2 one, but there's still a substantial amount of share (guess: 20%+?) that you would lose compared to the scenario where you had both cycles in your pocket.

It looks like G92 and G98 will make that Q1 cycle, but G96 won't. This might give an advantage to AMD's 55nm RV630. It's easy to see missing the holidays as a doom scenario, but IMO, that's certainly not the right way to look at it. From a channel desktop POV, maybe partially so, but not in any other way that I can tell.
 
You look too optimistic. Users with a brain know AMD needs money, but they won't buy products from AMD just because AMD needs money, when those products aren't competitive in some way. (Mostly, users buy cards based on the performance aspect first, with other things like price, power consumption, and features coming after.)
Huh? That's why I said I hope it will be much stronger than nVIDIA's part at that time.
 