ANova said: I noticed Newegg has the X6800 in stock already, for a mere ~$1300.
Yikes, I'm hoping that $300 is a premium for early arrival and not for other reasons.
Chalnoth said: Comment on the original article:
I'd just like to say that while I totally disagree with HardOCP's conclusions, I think it's important to have at least one article of this type. Gamers should understand that anything above a midrange CPU typically won't get you much gaming performance. It's also important that people notice that the framerate graphs at that high resolution were spot-on for basically every game.
That said, what lower-resolution game benchmarks do tell you about a CPU is how long it will last. After all, as newer games come out, they will demand more and more processing power to run at acceptable framerates. So you're certainly not buying nothing when you purchase a higher-performing CPU: you're buying longevity.
And, of course, there are other applications where CPU power is much more important, such as media encoding.
P.S. I'm still rooting for AMD. Intel's got this round, it seems, but I'm sincerely hoping for a quick turnaround.
geo said: Wheee. [H] is getting more aggressive in their language about pushing their benchmarking methodology as the One True Faith. "Lie" is getting tossed around there pretty liberally about others.
Wouldn't you want to see multi-GPU results before reaching the conclusions they did, though?
The AMD fanboi brigade has their Kyle t-shirts on this morning, no doubt.
Chalnoth said: So you're certainly not buying nothing when you purchase a higher-performing CPU: you're buying longevity.
I'm not sure I buy this argument. I mean sure, a faster CPU will last longer, but it will also cost more, and as you get toward the high-end cost increases more rapidly than performance. I don't see how the economics can work out.
Albuquerque said: Alright, so then let's just focus on the benchmark.
An Intel $500 processor at lower clock speed was just shown to be equal to a $1000 AMD processor, and you're unimpressed. Care to elaborate?
Actually, if you look at even more benches, the E6600, priced at a tad over $300, beats the FX-62 in most, if not all, situations.
The price difference between the two is very extreme with the Core 2 Extreme X6800 costing $999 and the Core 2 Duo E6700 at $530. Does it look like the price is justified between the two for gaming? We can safely say “no” as far as gaming goes with this gameplay testing we have performed.
But, if you look at the amount of difference between the AMD and Intel CPUs, you will see that it isn’t enough to amount to anything. The only game that we saw any real-world difference in was Oblivion, and even that was tiny. A little overclocking would clear that difference up.
dizietsma said: One thing Kyle has not taken into consideration is G80 and possible quad SLI. 4 x G80 will not be as frame-rate limited (you have to suppose), and we know dual SLI takes a fair bit of CPU power... does quad SLI take 2x that? If so, I think I would be happier with the most powerful CPU I could get my hands on, especially if it saved on power as well, so my mighty four G80s could have just a bit more.
nutball said: I'm not sure I buy this argument. I mean sure, a faster CPU will last longer, but it will also cost more, and as you get toward the high-end cost increases more rapidly than performance. I don't see how the economics can work out.
It's not common for two CPUs at roughly the same performance level from each manufacturer to have the same cost to the consumer, or for two CPUs at the same cost to have the same performance (particularly when you factor in user-specific upgrade costs like the motherboard and memory).
Gubbi said: While the HOCP approach is flawed (running a game at GPU-limited settings), they do one thing correctly: they don't use canned demos. Demo playback is little more than feeding geometry to the GPU (and a bit of skinning and shadow volume extrusion in Q4 games, but alas). Canned demos do not exercise the most important parts of a modern game engine: AI and physics.
But then it's not deterministic, is it? More importantly, if you reduce the resolution from 1600x1200 to 800x600 (so you're reducing the GPU load by a factor of four) and run a demo without a framerate limit, which is just feeding geometry, wouldn't you be increasing the CPU load significantly (up to a factor of four)?
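The factor-of-four claim above is just pixel arithmetic, plus the observation that per-frame CPU work scales with framerate. A quick sketch of that reasoning (illustrative only, not from any benchmark in this thread):

```python
# Sketch of the resolution argument: dropping from 1600x1200 to 800x600
# cuts per-frame pixel work by 4x, so an uncapped, GPU-bound demo can run
# up to ~4x the framerate -- and per-frame CPU work (geometry submission,
# skinning) then happens up to ~4x as often.

def pixels(width, height):
    return width * height

high = pixels(1600, 1200)   # 1,920,000 pixels per frame
low = pixels(800, 600)      # 480,000 pixels per frame

gpu_load_ratio = high / low
print(gpu_load_ratio)        # 4.0 -- GPU per-frame work ratio

# Upper bound on the CPU-load increase, assuming the demo was fully
# GPU-limited before and becomes CPU-limited no sooner than 4x the framerate.
cpu_load_factor = gpu_load_ratio
print(cpu_load_factor)       # 4.0
```

In practice the CPU load rises by less than the full factor if something else (driver overhead, vsync, the engine tick rate) caps the framerate first.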
Acert93 said: Talk about cherry picking testing situations to arrive at your desired results... again.
That's a bit of a problem I have with the article too, as well as his sudden switch in stance on the enthusiast crowd.
nutball said: I'm not sure I buy this argument. I mean sure, a faster CPU will last longer, but it will also cost more, and as you get toward the high-end cost increases more rapidly than performance. I don't see how the economics can work out.
There's a minimum performance delta which is worthwhile longevity-wise. A CPU with 10% extra performance gets you what in extra longevity? Pretty much zero I'd say, in practical terms. Both are going to be hung out to dry by whatever the next game is going to be; that extra 10% isn't going to make the unplayable playable.
So how far do you have to go? 25%? 50%? 100%? Want a processor twice as fast? You're going to pay a lot more than twice the money, if you want it now. Or you wait 18 months, upgrade and get twice the performance for the same money you just paid, less if you sell your old kit on eBay, less still if AMD and Intel are having one of their silly little price wars!
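The price-versus-longevity point can be made concrete with a toy cost-per-performance calculation. All numbers below are hypothetical, chosen only to illustrate the shape of the argument (flagship chips costing 3x for a ~25% performance edge), not taken from any benchmark here:

```python
# Toy price/performance comparison (hypothetical numbers, illustration only).

def cost_per_perf(price_usd, relative_perf):
    """Dollars paid per unit of relative performance."""
    return price_usd / relative_perf

midrange = cost_per_perf(300, 1.00)    # hypothetical midrange baseline
flagship = cost_per_perf(1000, 1.25)   # hypothetical flagship: +25% perf, >3x price

print(midrange)   # 300.0 dollars per unit of performance
print(flagship)   # 800.0 dollars per unit of performance
```

Under these assumptions the flagship costs roughly 2.7x more per unit of performance, which is why a 10-25% edge rarely buys enough extra longevity to justify the premium: the next demanding game obsoletes both chips at nearly the same time.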
poopypoo said: disagree. as a computer person (haha we all are, but i did it professionally a good bit, you know, just sayin) i'd say that the ppl i've met (aside from online e-penis dimwits) who bought high-end CPUs have all done it (against my advice) for one reason -- they want to slap it in (or me to do that, really) and not look at it again for 2 or 3 years at least.
Errr... yeah, I know people like this too. That doesn't make it a sensible or economically justifiable choice though. I mean, presumably you advised them against it because you know it's stupid?
Chalnoth said: Heh, and I've had 3 CPUs in the past 5 years, so I'd say wanting to keep a CPU for a little while describes me, too.