ATi 2900 XT 1GB DDR4 for pre-order...

My PC plays DiRT at far faster speeds than those posted. But I know that a 512MB card WILL NOT RUN ULTRA SHADOWS... the game crashes and exits. :LOL:

Crossfire makes any ultra details unusable (app crash). It seems Crossfire uses more memory than single-card configs.
 
I don't get a crash, but Crossfire doesn't make FPS any better in DiRT for me, same as in the Company of Heroes DX10 path or the Lost Planet DX10 version.
 
Well, considering that the real DiRT, retail, patched, closes itself with an "out of vidram" error, I highly doubt that those screens up there are really indicative of real performance in that app from that card.

You simply cannot select ultra shadows and get the app to run.

spajdr, I DO get a performance increase in Company of Heroes and Lost Planet (you are using the hotfix for that, are you not?), so something is wrong in your setup, driver-wise.

;)
 
Even in the review where Company of Heroes is mentioned (the one with the 1GB card in Crossfire and the 512MB version as well), there is only a 2 FPS increase in the DX10 version with Crossfire. I doubt that's Crossfire not working, since CoH over DX9 shows a huge increase. Are you sure you get more than 30 FPS at the start of the first mission in Lost Planet DX10? 3DMark06 and STALKER show an FPS boost too.
 
Using the hotfix? Yes, I get more than a 2 FPS increase (wow, 9 FPS :rolleyes:).


Listen, each app needs a profile for Crossfire. If tiled rendering is used, you'll notice only a small FPS increase, or none at all.

Sounds like you are using a driver that does not have a profile for that game. :yes:

I go from 42 FPS max in DX10 CoH to 72. Again, the reviewers are not using a good driver.
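
To illustrate the tiling point: in tiled/split-frame modes, both GPUs still process all of the geometry, so only the pixel work is actually shared between them. Here's a toy Amdahl-style model of that in Python; the fractions are made up purely for illustration, not measured from any game:

# Toy model: tiled Crossfire duplicates vertex/setup work on both GPUs,
# so only the pixel fraction of the frame time is parallelised.
# The fractions below are invented for illustration, not measured.
def tiled_speedup(pixel_fraction, gpus=2):
    duplicated = 1.0 - pixel_fraction       # work both GPUs repeat
    return 1.0 / (duplicated + pixel_fraction / gpus)

print(tiled_speedup(0.9))  # ~1.8x when the frame is mostly pixel work
print(tiled_speedup(0.3))  # ~1.2x when geometry-heavy: "small or no FPS increase"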


I don't get my info from reviews... frankly, I see a huge lack of honesty in a lot of reviews (BIASED), so they hold no value for me, nor should they for you.

I get my numbers from the PC sitting on the floor here next to me. If you'd like to come to my house and see the numbers for yourself, you are more than welcome. I own most games released for PC since the DESCENT and MYST days, and through all these years I've yet to find a writer I trust enough to make a purchase after reading their words.


TBH, a lot of people are saying Crossfire overclocking doesn't work either... even after I posted evidence pointing otherwise and easily explained why they cannot get the same results, they still stand firm in their opinion of how things work.

Sadly, reviews are the same way...a writer's opinion. It just cannot be otherwise, due to human nature.

So, regardless, I posted the truth, accept it or not, that's what it is.
 
Yeah, I'm using the hotfix one; truth is I never toyed with profiles. So for CoH DX10 you need to create a profile and tick the Crossfire option? Same for Lost Planet? Or what else do I need to do? Because Lost Planet isn't run directly from its .exe but through Steam.
 
Hmm, I got the retail box... maybe that's the difference?

CoH DX10 works fine with 7.6 here. Fresh install, patched to 1.4, then 1.7, then fiddled with the settings, then played. NO profile needed.

DiRT does not like Crossfire, no matter the settings. Even the lowest settings still cause a "low vidram" error. :rolleyes:



LoL.
 

DiRT works for me, with ultra shadows and a 512MB card, using the official 7.6s. Odd.

I know that DX10 CF wasn't enabled in recent x64 drivers; dunno if the 7.6s/8.39.5 fixed that, so it may explain why your CoH scales and his doesn't ;)
 

I think the retail version, if you mean Lost Planet, connects to Steam anyway and runs through it? What I found is that when I run the DX9 version of Lost Planet I get video corruption in the video playing under the menu where you choose multiplayer etc.; it doesn't occur in the DX10 version, because the second card isn't working. I found that out by quickly checking the temperatures through the AMD GPU utility.

Forgot to mention I'm running the Vista 32-bit driver. Do you guys see the wide/narrow tent AA options when running Crossfire, or only the 2/4/8/16x AA options plus temporal? Maybe something got screwed up, since I had a lot of trouble getting Crossfire enabled: when I installed the drivers, Crossfire was unticked; I enabled it, and it automatically unticked itself again. It was a long process before I got it to work, and probably still not 100%.
 
Maybe Crossfire for the 2900 XT is crippled on the 965X chipset; not sure what you guys have got, my mobo is an Asus P5B Deluxe.
 
DiRT works for me, with ultra shadows and a 512MB card, using the official 7.6s. Odd.


Single card? Everything else ultra? 4xAA, etc.? I can run with 0xAA, but it's really bad seeing jaggy shadows on the road when going 180 KPH. In Crossfire, the game runs, maybe one race only, then a GPU memory leak seems to become an issue. Dunno if it's the driver, the app, DX, or what, but ATI seems to think it's a non-issue and has dropped every support ticket I create on it. :rolleyes:

Hmm. For sure spajdr has issues here... he could not enable Crossfire in CCC... methinks this speaks volumes as to why he has issues I do not.


Updated DirectX lately, spajdr?

No matter the chipset here... I have Crossfired P35 and 975X... performance differences are negligible in games. ~400 points difference in 3DMark06. ;)
 
Updated DirectX now; not sure if I did it previously, so now I'm sure I'm using the latest binaries.
Can you please tell me, when you run the DX10 Company of Heroes benchmark... is it smooth? I mean the part where the airplane is going down; at one moment the screen stops completely for a second. My min FPS was 10 under DX10. Good that you've got both chipsets to compare. Some months ago, when they started supporting Crossfire on the 965X chipset, there were games that showed FPS drops of more than 50 percent compared to the 975X chipset, though that was with an X1900 XT CF + X1900 XTX. Maybe something similar is happening now, but the P35 is 16+4 too, so probably not ;)
 
Nah, everyone gets those drops. Personally I think it's due to the geometry instancing used for the grass (the grass is very close in view at that point).

As for the bandwidth issue... it might be part of it. The only reason for the differences in scores between 975X and P35, currently, in benchmarks, is a lower CPU score. The SM2.0 and 3.0 scores are within 80 points of each other... minimal difference.
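
For what it's worth, the raw slot bandwidth gap is easy to put numbers on. PCIe 1.x moves roughly 250MB/s per lane in each direction, so a quick sketch (peak figures only, ignoring protocol overhead):

# Peak PCIe 1.x bandwidth: ~250 MB/s per lane per direction.
# Peak numbers only; real transfers lose some to protocol overhead.
def pcie_gb_s(lanes, per_lane_mb_s=250):
    return lanes * per_lane_mb_s / 1000

print(pcie_gb_s(16))  # 4.0 GB/s - a full x16 slot
print(pcie_gb_s(8))   # 2.0 GB/s - 975X runs Crossfire as x8 + x8
print(pcie_gb_s(4))   # 1.0 GB/s - the second slot on P35/965 boards is x4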
 

With AA, single card (till tomorrow, when I'll re-enter Crossfire hell :) ), I've run a number of races without issue. But I'm using Vista64 and the official drivers, so that may add some confounding elements.
 
Not really. We are all probably using slightly different drivers, and are met with slightly different issues.


That really says a lot... drivers are definitely an issue with these cards. nV was faced with a similar problem, so ATI gets a slight break in having to deal with my expectations, or rather my expectations drop a wee bit.

Personally, I'm waiting until September for good drivers. That's 5 months of me owning these cards. If things improve, then I'll keep them, I suppose. If not, it's yet another crappy part that's lived with me for a while. Oh well. I'm beyond really caring about it any more. I'll deal with the inferior performance, and AMD becomes an inferior company. Not like we haven't seen this coming...
 
A resolution of 1280x1024 is really not enough to tell whether 1024MB > 512MB. 1GB of RAM is only gonna seriously come into play at 1600x1200 and above with at least 4xAA, or at 2560x1600. I wanna see comparisons at that res so I can decide which card to buy.
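
For a rough sense of why: at 2560x1600 with 4xAA, the render targets alone eat a big chunk of a 512MB card before a single texture is loaded. A back-of-the-envelope sketch (illustrative only; real drivers add extra overhead for alignment, HiZ, compression tags, etc.):

# Buffer sizes at 2560x1600 with 4xAA, 32-bit colour and depth.
# Rough illustration; driver overhead is not counted.
w, h, samples = 2560, 1600, 4
color   = w * h * 4 * samples   # multisampled colour buffer
depth   = w * h * 4 * samples   # multisampled depth/stencil buffer
resolve = w * h * 4 * 2         # resolved front + back buffers
print((color + depth + resolve) / 2**20)  # ~156 MB before any textures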


Here are the results for your decision:
HD 2900 XT 512MB costs $400
HD 2900 XT 1GB costs $550 ($150 extra for a small bump)

[benchmark chart: hdrfol2560.gif]


Based on this data, it looks like you really won’t see any benefits from the 1GB of memory present on the Diamond Viper Radeon HD 2900 XT 1GB with today’s current crop of games unless you’re playing with HDR+AA. Of course as any hardcore gamer will tell you, the next generation of games should utilize HDR+AA extensively, so Diamond’s 1GB Viper card would be a better long-term purchase than the stock Radeon HD 2900 XT 512MB if this is important to you.

http://www.firingsquad.com/hardware/diamond_radeon_2900_xt_1gb/page18.asp

If Diamond had instead clocked the core at 850MHz+ by default, it might be worth buying.
 
Too bad he didn't overclock the cards in the test; I wanted to see what difference +100MHz on the core and on the memory would make.
 
Despite a fairly substantial memory bandwidth increase, the GPU gains practically no performance. I think the reasoning is fairly obvious: the R600 is not bandwidth limited at all; its core is simply too underpowered, with the current crop of games, to make use of even 100GB/sec, never mind 128GB/sec.
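
Those two figures fall straight out of the 512-bit bus and the memory data rates. A quick sketch (the clocks are the commonly quoted specs for the two cards, so treat them as assumptions rather than measured values):

# Peak bandwidth = bus width in bytes x effective memory data rate.
# Clocks below are the commonly quoted specs, not measured values.
def bandwidth_gb_s(bus_bits, effective_mhz):
    return bus_bits / 8 * effective_mhz / 1000

print(bandwidth_gb_s(512, 1650))  # 512MB GDDR3 card: ~105.6 GB/s
print(bandwidth_gb_s(512, 2000))  # 1GB GDDR4 card:   ~128 GB/s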
 