First R420 review leak

Rican said:
Is it just me, or does it look like they are pretty evenly matched?

Yeah, from what I have seen in this supposedly leaked review it surely does look like a close race. I think I am going to hold off on making any judgments yet. I always look at a wide variety of reviews before I draw any conclusions.
 
Sabastian said:
Rican said:
Is it just me, or does it look like they are pretty evenly matched?

Yeah, from what I have seen in this supposedly leaked review it surely does look like a close race. I think I am going to hold off on making any judgments yet. I always look at a wide variety of reviews before I draw any conclusions.
The review is from THG. Therefore, you must assume that all ATI cards score about four times what THG is showing them as scoring. :)
 
Waltar said:
Cool, I can spend $100 more and get 3 more frames per second in AquaMark and 9 in Far Cry. Thanks ATI. :rolleyes:

Better to spend more and get more than spend more and get less. Thanks Nvidia. :rolleyes:
 
Doomtrooper said:
Max Payne 2 should be there...Breed should not...game is a POS.

Indeed - again, what kind of tech showcases are Breed and CoD? They're not.

Same goes for Halo - the game looks like a POS, basically.

:rolleyes: @ THG

someone leak a decent review!! :D
 
pocketmoon66 said:
Rican said:
Is it just me, or does it look like they are pretty evenly matched?

My initial thoughts also.

For me it's going to boil down to:

The ability of my PSU to power a dual-molex 6800 vs. the single-molex X800

And

PS3.0

Given the X800 XT vs the 6800 Ultra, I'm leaning towards the XT. Not all that interested in the Pro. The dark horse in this for me is the rumored 6800 Pro/GT. Sliding in between the 6800 and the 6800 Ultra, with the Ultra's full 16 pipes but slower clocks, I'm VERY interested in seeing where it's priced.
 
Sxotty said:
Hey, it is pretty darn good looking so far, but not what I had hoped for, ah well...

Now if the Pro were ahead of the Ultra like the XT is, I would be happy though.

Ummmm, why would that make you happy? The only way that would make me happy is if the Pro is $299.

Eronarn said:
Rican said:
Is it just me, or does it look like they are pretty evenly matched?

From those images, yeah. But wait until B3D's review goes up because that will provide the REAL benchmarks. :rolleyes:

And you know the REAL benchmarks will show something different because..... :rolleyes:

Looks like Joe was right about the ATI boys discounting anything that doesn't show NVIDIA wallowing in despair.
 
AlphaWolf said:
Why wouldn't the theoretical performance of the XT be 50% greater than the Pro's? 12 pipes vs 16, plus a higher clock speed. Or do you think that Dave meant 50% greater than the 6800?

16/12 ≈ 1.33
(525/475) * 1.33 ≈ 1.1 * 1.33 ≈ 1.46

So basically, if the XT is clocked ~50MHz higher than the Pro (525MHz vs 475MHz), it should have roughly ~1.5x the pixel throughput.

Aaron Spink
speaking for myself inc.
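For anyone who wants to check the arithmetic, here is a minimal sketch in Python. The clocks and pipe counts are the figures quoted in this thread (475MHz/12 pipes for the Pro, 525MHz/16 pipes for the XT), and the one-pixel-per-pipe-per-clock model is a simplification rather than a confirmed spec:

```python
# Back-of-the-envelope pixel throughput, X800 XT vs X800 Pro.
# Clocks and pipe counts are the figures quoted in this thread, not official specs.

def fillrate_mpixels(clock_mhz: int, pipes: int) -> int:
    """Theoretical peak fillrate in Mpixels/s, assuming one pixel per pipe per clock."""
    return clock_mhz * pipes

pro = fillrate_mpixels(475, 12)  # 5700 Mpixels/s
xt = fillrate_mpixels(525, 16)   # 8400 Mpixels/s

print(f"Pro: {pro} Mpix/s, XT: {xt} Mpix/s, ratio: {xt / pro:.2f}")  # ratio ~1.47
```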
 
trinibwoy said:
DemoCoder said:
trinibwoy said:
Interesting - ATI Pro 475/900/12 matching NV Ultra 400/1100/16 shows a superior architecture for ATI, but the ATI XT 525/1120/16 isn't much better?

Superior? How come when NVidia was going for high clocks and ATI was doing well with low clocks and high parallelism (4-pipe NV vs 8-pipe ATI), it was ATI with the superior architecture (wide, shallow pipes), and now that ATI has gone more for high clocks it again signals the superiority of ATI?

Is AMD, with its good per-clock performance vs the Intel P4, better? Performance and cost are what matter, not the architecture that was used to achieve them.

LOL. Damn :oops: Calm down, dude. I'm not 'claiming' ATI has a superior architecture. I leave that to guys like you, Chalnoth, Hellbinder, etc. And yes, I agree with you that in a 400 vs 475 16-pipeline contest the NV40 would appear more efficient than ATI. But this is 400*16 = 6400 vs 475*12 = 5700 raw theoretical fillrate. Of course there is a lot more at hand here, but wouldn't you say that points to ATI being more efficient rather than the other way around?
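Running trinibwoy's raw numbers through the same kind of quick sketch (again, MHz * pipes as a crude proxy for theoretical fillrate; the figures are the thread's, not confirmed specs):

```python
# Raw theoretical fillrate (Mpixels/s = MHz * pipes) for the configs quoted above.
nv_ultra = 400 * 16  # 6400 Mpixels/s
ati_pro = 475 * 12   # 5700 Mpixels/s

deficit = 1 - ati_pro / nv_ultra
print(f"The Pro has {deficit:.0%} less theoretical fillrate than the Ultra")
# ~11% less -- so if the Pro really does match the Ultra in games, it is
# getting more done per unit of theoretical fillrate.
```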




Let's see... you make some dumbass comments that are so off-the-wall stupid that they piss people off, then you post telling them to calm down?!?!

Looks to me like you are trying to piss people off just because NVDA's 6800 is bigger, hotter, louder, less OEM-friendly, more PSU-crippling, and heavier than the X800.

Did I mention it is also slower, has worse IQ, and requires hacked-out-the-ass drivers to stay competitive?



Wow, too bad about that NVDA stock going down 34% these last few weeks. Get out now before you lose another 20% on top of that.
 
If the scaling continues, I think you're gonna see the X800 around 30% faster than the 6800 in very high AA/AF modes (6x/16x), and definitely in 1600x1200 etc. :)
 
No, my math is not bad. Running temporal AA at 90fps requires your framerate to be the same as your monitor refresh rate most of the time.

When framerate fluctuates at higher resolutions, the images will still look good. When it fluctuates with temporal AA, you'll get noticeable flickering.

Frankly, I'd rather play at 1024x768 @ 90fps without temporal AA and vsync off. (and before I get accused of this being an anti-ATI issue, it's not. Temporal AA can be implemented on other cards too.)

Temporal AA looks usable for Call of Duty, UT2003, and a bunch of other 100+fps (with minimum FPS > 70fps) games. I think you're kidding yourself if you think it will be great on FarCry.
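For readers who haven't seen temporal AA described before, the flicker argument is easier to follow with a sketch. This is only an illustration of the general idea (alternating sub-pixel sample patterns on alternate frames), not ATI's actual driver logic, and the fallback threshold below is an assumption:

```python
# Illustrative sketch of temporal AA: alternate between two complementary 2-sample
# patterns on even/odd frames so that, at high framerates, the eye averages them
# into something closer to 4x AA. Not real driver code -- just the idea.

PATTERN_A = [(0.25, 0.25), (0.75, 0.75)]  # sub-pixel sample offsets, even frames
PATTERN_B = [(0.25, 0.75), (0.75, 0.25)]  # complementary offsets, odd frames

def sample_pattern(frame_index: int, fps: float, fallback_fps: float = 60.0):
    # Assumption: below some framerate the alternation is no longer averaged by
    # the eye and shows up as edge flicker, so a driver would plausibly fall back
    # to a fixed pattern. The exact threshold here is made up for illustration.
    if fps < fallback_fps:
        return PATTERN_A  # plain 2x AA, no temporal component, no flicker
    return PATTERN_A if frame_index % 2 == 0 else PATTERN_B

# At 90fps the patterns alternate (temporal AA); at 45fps you get the fixed set.
print(sample_pattern(0, 90), sample_pattern(1, 90), sample_pattern(1, 45))
```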
 
To all the guys saying that all TWIMTBP games are crap benchmarks: please provide an ATI-approved list for a 'fair' comparison :devilish:

Wish I was around when Nvidia was on top and they were the ones moaning about the 9700PRO :LOL:
 
trinibwoy said:
Sxotty said:
Hey, it is pretty darn good looking so far, but not what I had hoped for, ah well...

Now if the Pro were ahead of the Ultra like the XT is, I would be happy though.

Ummmm, why would that make you happy? The only way that would make me happy is if the Pro is $299.

Why not both? Then I would be even happier, no? :) Really though, if ATI kills Nvidia then Nvidia will have to sell the 6800 cheaper. I am much more excited about Doom 3 than Half-Life, so it would be in my interest to have the 6800 reduced in price if it performs better even in that select application.
 
dr3amz said:
DemoCoder said:
I think what Ailuros is saying is that if he were getting 90fps @ 1024x768 w/4xFSAA, he'd rather go to 1280x1024 w/6xFSAA @ 60fps than spend the 90fps on temporal AA.

With higher spatial antialiasing and resolutions, you don't get artifacts if the framerate is unstable. You also don't lose FPS on non-triple-buffered games because of vsync lock.

Either it's late or your maths don't add up :)

At 1024*768 he can go 4x TFSAA (8x effective) at 90fps - IMO that's better than running 1280x1024 @ 6xFSAA @ 60fps - especially when you get into `busy` scenes and the fps drops.

Also - the fps drop would be far bigger running 6xFSAA at 1280*1024 than it would with 4x TFSAA @ 1024*768.

:D

DemoCoder somewhat got my point, yet his example was a tad weird.

First off, to get things clear, 1024 already looks pathetic on my 21-incher here.

Second, how often do I get the chance to hit 90 fps with 4xAA (temporal or not makes no difference here)? If I didn't experience any edge crawling I wouldn't have a single reason to object to the method; every time I drop below, say, 45-50fps I get a crawling fest, and that's even if I hold still. Quake3 is boring and old, before anyone brings it up.

Third, higher resolutions are always better, even if you drop to just 2xAA. There's more to a scene than just polygon edges, and since the majority of my games are CPU-bound flight/racing sims (where, for me, any sort of antialiasing matters more), I use the highest possible resolution; usually no less than 1280*960 and in some better cases 1600*1200.
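To put rough numbers on the resolution-vs-AA tradeoff in the exchange above, here is a quick sample-count comparison of the two settings dr3amz mentions. Samples per frame is only a crude proxy for fill cost; it ignores bandwidth, colour compression and CPU limits:

```python
# Crude per-frame sample counts for the two settings discussed above.
low_res_taa = 1024 * 768 * 4   # 4x (temporal) AA at 1024x768
high_res_aa = 1280 * 1024 * 6  # 6x AA at 1280x1024

print(f"{low_res_taa / 1e6:.1f}M vs {high_res_aa / 1e6:.1f}M samples per frame "
      f"({high_res_aa / low_res_taa:.1f}x more work)")
# ~3.1M vs ~7.9M samples, i.e. roughly 2.5x the per-frame work at the higher setting
```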
 
trinibwoy said:
And you know the REAL benchmarks will show something different because..... :rolleyes:

Looks like Joe was right about the ATI boys discounting anything that doesn't show NVIDIA wallowing in despair.

I prefer fanATIc. And the only reason I'm discounting it is because it's THG, which is biased pretty much no matter what way you look at it. Not that it matters, mind you, I'm stuck with my 9700P for a long-ass time. Anyways, I think ATI will win again in many circumstances, and if the B3D benchmarks show otherwise I will post a recording of me saying that Nvidia is the king of graphics cards. :D
 
I actually can't run lower than 1280x1024 or 1600x1200 because of the LCD monitors I use. Running at a lower resolution forces the monitor to use upscaling interpolation, which frankly looks bad compared to native resolution.
 