Anandtech Oblivion Benches

Blacklash

Anyone read this rot from Anandtech?

"Our highest end X1900 XT CrossFire setup can't even run our most stressful real world Oblivion test at 1280 x 1024 with all of the detail settings set to their highest values. That's right, $1200 worth of GPUs will get you less than 50 fps at less than the highest image quality, and we're not talking about having AA enabled either."

Source: http://www.anandtech.com/video/showdoc.aspx?i=2746&p=3

I wonder how the people who spent $799-999 on 7800 GTX 512MB cards in SLI feel right about now? How about those who paid $500+ a card for 7900 GTXs, only to see them losing to X1800s in the most demanding situations? It is interesting that the Firingsquad numbers in the most demanding situations show the X1800s in CrossFire besting the current flagship nVidia cards in SLI. Heck, and ATI doesn't even use all of its aggressive optimizations. Do those minimums look unplayable to you fellows with X1900s in CrossFire? I find the game perfectly playable at 1600x1200 in CrossFire with HDR and 16xAF. I might add I have draw distance maxed and many other tweaks in place to improve IQ.

The bottom line is that 1280x1024 with no AA/AF makes nVidia look better in these dual-card configuration tests. I wonder when Anandtech will post another review of an ASUS motherboard? I can't wait.

/rude Anandtech

Contrast their rot with Firingsquad's dual-card min and average numbers in the most demanding areas at 1600x1200 with AF and/or with AA/AF.

[Firingsquad benchmark charts: 1600x1200 HDR results, and 1600x1200 4xAA/8xAF results]
 
You left out a very significant part of the quote.

John Reynolds said:
Gee, I didn't know I had to have 50fps or more when playing a RPG (on $1200 worth of hardware)

Although the game doesn't require it, I too would feel somewhat silly at those framerates with that caliber of hardware. Who was it that predicted a couple of months ago that future games would target dual-GPU configurations for max IQ? Well, welcome to the future!! :D
 
Yes I thought it was a very bad article, I mean, who on Earth is going to play this game on a high end GPU without all the settings turned up to maximum?

And I would hardly describe 50+ fps in the most stressful areas of the game as "can't even run".
 
Would it really have killed Anandtech if they did some extra AA+HDR testing?

The AA hit on the X1k isn't even that big on Oblivion.
 
wishiknew said:
Would it really have killed Anandtech if they did some extra AA+HDR testing?

The AA hit on the X1k isn't even that big on Oblivion.

How about any AA testing? The article, and particularly the conclusion, was a joke. They had the "Chuck patch" installed. I don't see any reason why they limited themselves to no AA, no HDR+AA, and 1280x1024.

Anandtech said:
For now, those of you hoping to run Oblivion at 1920x1200 with all the detail settings at maximum will need to wait for future GPU generations. But how do the current generation of cards fare?

Funny, I play at 1920x1200 with every detail turned up, and it runs just fine.

Anandtech said:
The performance offered by a pair of X1900 XTs simply can't be matched by any single card, but as good of a game as Oblivion is you'd have to have a pretty serious computer budget to accommodate the $1200 that a pair of X1900 CrossFire GPUs will set you back.

Do they only buy from CompUSA? You can get an X1900 CF setup for well under $1000. Newegg has a CrossFire Edition X1900 for $490, and an XT for $420. Both retail. Do they just try to mislead people, or do they only go by MSRP? Pretty misleading.

Anandtech said:
Moving to an X1800 XT will set you back more than $300

Again, misleading. Newegg has the 256MB XT for $260, and the 512MB XT for $300. More than $300? Hardly.

AT's reviews have gone downhill for a while now. They're easily the 4th or 5th best site to me now.
 
My weedy 6800GS SLI setup doesn't look too healthy, so I'm refraining from casting any comments on the top-range cards' seeming lack of performance in this game :D

It's not really my Austin Powers-style bag, to be honest; however, I am getting rather concerned about the Ghost Recon thingymabob I'm considering buying and whether I will have to run it at 800x600 with HDR on... sob sob.

I was hoping to make them last until G80 or R6xx :(
 
Who wants to wager that UE3 will look and perform much better than Oblivion? Oblivion's issues are not tied to it being cutting-edge-spank-my-viddy-card software (it's not; it looks fairly nice, but it's definitely not), but to the fact that mediocre coders stitched middleware products together, IMO. It could do a lot more with quite a bit less.

OTOH, I've never encountered unplayable conditions on my CF setup, even though I'm using 4xAA and 16x HQ-AF. Adaptive kills it, but that's to be expected.
 
Well, I see it worked for AT. They must be getting the hits they want, and some special green HW for later... Oh, and yeah, who cares if $1200 gets you 50 fps; I'm sure $89 will get you a jaggy, blurry mess at 800x600 with 30 fps... pffft
 
LOL guys. Nitpicking a bit too much here?

The performance they show fits what I've seen on 6800s and X850s from my experience, so whether or not they are dead on with their pricing is a moot point. Who relies on the press to find prices? They tell me how the cards run the game. That's all I needed.

I like how my X850XT PE can run with the best without HDR. Heh. My $200 dual-DVI X800GTO2, that is.

The 6800 runs the game so badly, it's amazing. My friend's 6800GT and my laptop's 6800 Go run it significantly worse at a lower resolution with lower detail than my X850. The 385/770 NV42 in my notebook can sometimes barely manage 18 fps at 1440x900 (native res), without AA, HDR, or Bloom, and with no grass and very few shadows.
 
swaaye said:
LOL guys. Nitpicking a bit too much here?

The performance they show fits what I've seen on 6800s and X850s from my experience, so whether or not they are dead on with their pricing is a moot point. Who relies on the press to find prices? They tell me how the cards run the game. That's all I needed.

It's not nitpicking, it's calling them out. It's not a moot point about being right or wrong on their pricing statements; AT gets a TON of hits from people looking for answers and suggestions in reviews. Making misleading (or just plain wrong) statements like they did is a huge disservice to their readers.

Just like only benching at 1280x1024, and with no AA. Who runs X1900s in CF, or GTXs in SLI, at that res with no AA? Show of hands, anyone? Telling you how the card runs the game is all you ask? Well, that would be nice. But their very limited benchmarks do not do anything for me. They do not show me a thing.
 
fallguy said:
Just like only benching at 1280x1024, and with no AA. Who runs X1900s in CF, or GTXs in SLI, at that res with no AA? Show of hands, anyone? Telling you how the card runs the game is all you ask? Well, that would be nice. But their very limited benchmarks do not do anything for me. They do not show me a thing.

You are throwing a hissy fit for no good reason. Why ask silly questions like "Who runs X1900s in CF, or GTXs in SLI, at that res with no AA?" as if Anandtech's numbers are through the roof at the tested settings? Gone are the days when dual GPUs chew through anything less than 16x12 4xAA.

Has any site done HDR+AA benches with the Chuck patch as yet to see how CF performance holds up?

Edit: Just saw EB's take - not too bad at all.
 
fallguy said:
It's not nitpicking, it's calling them out. It's not a moot point about being right or wrong on their pricing statements; AT gets a TON of hits from people looking for answers and suggestions in reviews. Making misleading (or just plain wrong) statements like they did is a huge disservice to their readers.

Just like only benching at 1280x1024, and with no AA. Who runs X1900s in CF, or GTXs in SLI, at that res with no AA? Show of hands, anyone? Telling you how the card runs the game is all you ask? Well, that would be nice. But their very limited benchmarks do not do anything for me. They do not show me a thing.


I'm really getting sick of people throwing out this damn question over and over, as if... how many people run SLI/CF/MultiChrome (OK, how many people run MultiChrome at all might be a valid one ;p) at 12x10? Considering the mass of screens sold by e-tailers and B&Ms are LCDs between 17" and 19", 99% of which have a native resolution of 12x10... Gone to any fests/LANs lately? An overwhelming majority of "core" gamers are running LCDs that fall into that group. Yeah, you still see the sadomasochist with the 21" TV... err, CRT that makes most table legs shake when placed upon them. I do agree that sans AA/AF etc. it is rather disheartening to those who just spent $1200 on a video upgrade (seeing as Newegg isn't available WORLDWIDE).

The prices quoted in the article seem to use typical MSRP pricing, if only to dramatise the situation, but again, Newegg is NOT the end-all-be-all retailer for everything PC related. Jump over to Pricewatch and the median prices for X1800 XTs range from $325-430; a hop over to Froogle isn't any better. The US may be the single largest market (debatable...), but this doesn't mean the rest of the world pays the low prices that US citizens enjoy. Even the friendly neighbors to the north aren't privy to Newegg. So what do you say to the cash-wielding denizen of Europe?

I'm not defending AT's reasoning (whatever it may be), just trying to make sense of it; admittedly the whole AA thing still has me baffled...

One area where AT has, IMO, excelled, and one that NO other sites have touched upon, is the effect of last-gen hardware (X800/6x00) and value/low-end mainstream parts such as the X1600/6600 coupled in CF/SLI configurations. It's more of a disservice to readers of ANY site to convey the message that CF/SLI and other MVP setups require the most expensive, top-end hardware available. There are MANY "geeks" who love the thought of playing in the niche that is SLI/CF etc. but don't have insanely deep pockets or an expense account that can pay for such extreme configurations.***

***(A bit OT: with the rumor that the upcoming NV drivers will add SLI support on Intel 975, it looks like, for once, there will be at least ONE option for a universal MVP solution, albeit without interproduct compatibility amongst competing products... Now if only ATI and NV would remove their heads from their respective posteriors!!!)

So in one sense, AT's numbers make ultra-expensive (and higher-performing) platforms seem a bit more necessary if you want to maximize your immersion in the gameplay experience. They also show that for those who don't live on the bleeding edge of hardware, those older products will allow you to at least play comfortably, just not with all the eye candy, and especially not near Oblivion gates ;-P
 
Did anyone notice AT explaining anywhere in that read exactly how they benchmarked Oblivion? I didn't, but maybe I missed it?
 
digitalwanderer said:
Did anyone notice AT explaining anywhere in that read exactly how they benchmarked Oblivion? I didn't, but maybe I missed it?
Anandtech said:
It only made sense that we benchmarked in each of those three areas, so we constructed manually scripted (read: walk-throughs by hand) benchmarks taking us through one of each type of area in Oblivion.
Anandtech said:
We measured frame rates using FRAPS and reported both the minimum and average frame rates in our charts (we left out maximum frame rates because they simply aren't as important and they made the graphs a little too difficult to read when we included them).
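For what it's worth, min/average figures from a FRAPS-style frametimes capture are easy to reproduce after the fact. Below is a minimal sketch in Python, assuming a hypothetical two-column log (frame index, timestamp in milliseconds); the exact layout of a real FRAPS dump may differ.

```python
# Minimal sketch: deriving minimum and average fps from a per-frame
# timestamp log, as in a FRAPS-style benchmark run.
# Assumes a hypothetical CSV layout of (frame index, timestamp in ms);
# check the actual capture file before relying on this.
import csv

def fps_stats(path):
    with open(path, newline="") as f:
        reader = csv.reader(f)
        next(reader)                                    # skip header row
        times = [float(row[1]) for row in reader if row]

    # Instantaneous fps for each frame: 1000 ms divided by its frame time.
    deltas = [b - a for a, b in zip(times, times[1:])]
    instantaneous = [1000.0 / d for d in deltas if d > 0]

    # Average fps: total frames rendered over total elapsed wall time.
    avg_fps = (len(times) - 1) * 1000.0 / (times[-1] - times[0])
    return min(instantaneous), avg_fps

if __name__ == "__main__":
    mn, avg = fps_stats("frametimes.csv")               # hypothetical filename
    print("min: %.1f fps, avg: %.1f fps" % (mn, avg))
```

The average comes from total frames over total time, while the minimum is the single worst per-frame rate, which is why minimums swing around so much more than averages in charts like Firingsquad's.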


Apart from the lack of AA testing, I don't see much to fault in AT's benches.


Oh, and their CPU testing is up now too: http://www.anandtech.com/cpuchipsets/showdoc.aspx?i=2747

I'm not that surprised about the dependence on CPU speed, considering Morrowind was pretty much entirely CPU dependent. And from tweaking around with Oblivion's settings, some of them obviously don't depend solely on graphics performance.
 
FrameBuffer said:
I'm really getting sick of people throwing out this damn question over and over, as if... how many people run SLI/CF/MultiChrome (OK, how many people run MultiChrome at all might be a valid one ;p) at 12x10? Considering the mass of screens sold by e-tailers and B&Ms are LCDs between 17" and 19", 99% of which have a native resolution of 12x10... Gone to any fests/LANs lately? An overwhelming majority of "core" gamers are running LCDs that fall into that group. Yeah, you still see the sadomasochist with the 21" TV... err, CRT that makes most table legs shake when placed upon them. I do agree that sans AA/AF etc. it is rather disheartening to those who just spent $1200 on a video upgrade (seeing as Newegg isn't available WORLDWIDE).

I agree with what you say about the resolution. 1600x1200 is practically dead, yet most sites still treat it like the holy grail. I want to see benchmarks at 1280x1024 (your average PC gamer) and 1920x1200 (for your high-end gamer).

Always with 4xFSAA and always with 16xAF (since it performs almost identically to 8x). That's only for high-end systems, of course; the lower end can do without those enhancements.

However, those things aren't really my problem with the AT article; my problem is that they didn't even run the game at maximum details! Who in their right mind owns a high-end graphics setup and cares about playing the game at anything less than the full detail settings? It may be a good way to test midrange cards, but certainly not high-end setups, and so for me it completely invalidates those scores.
 