AMD confirms R680 is two chips on one board

I know that absolutely nobody ever seems to read my posts, but why the hell do you benchmark a multi-GPU system at MEDIUM DETAILS?!

I wrote about the multi-GPU Crysis patch and my benchmark findings in this Beyond3D post: http://forum.beyond3d.com/showpost.php?p=1116612&postcount=19

Cliff notes:
If you actually play with settings that you'd expect to use on a multi-GPU system, the increase is quite good. If you play on a 15" screen at medium detail, you don't freakin' need multiple GPUs in the first place. :rolleyes:


The point of that article was just to show whether there were any improvements across the board (heh, excuse the pun ;)). The article was comparing Crysis pre- and post-patch, so it's useful information for people out there. You'll note that on the last page they finally do show a hefty improvement with the patch, and CrossFire in particular showed the most gain (DX10, HQ, 1920x1200).

http://www.techspot.com/article/83-crysis-patch-performance-multigpu/page5.html
 
Well, you'll also notice that I too tested patched and unpatched, and I also stated what benchmark I used. We have no clue how he even came to those numbers; he doesn't say ANYWHERE how they were generated.
 
Ok, maybe I need to be more specific about what I find utterly wrong with that Techreport article:

He says he's on Vista 64 -- did he test with Crysis in 64-bit mode, or 32-bit mode? We can assume it was 64-bit, but we don't know.

He gives ONE NUMBER for each configuration. Is that minimum framerate? Is that average framerate? Is that maximum framerate? We can assume it's average, but we don't know.

That one number comes from what? One of the built-in benchmarks? Which one? A benchmark that he made up himself? Fine, what did it cover? Just standing on a ledge and watching the framerate counter? Looking at what? We can assume it's from something legitimate and repeatable, but we don't know.
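Just to illustrate how much that label matters, here's a trivial sketch (the frame times are made up, not from any real run) of how far apart those three numbers can be for the very same data:

```python
# Made-up frame times in milliseconds from a hypothetical benchmark run.
frame_times_ms = [18.2, 21.0, 16.7, 45.3, 19.9, 17.5, 88.1, 20.4]

instantaneous_fps = [1000.0 / t for t in frame_times_ms]
# Proper average FPS is total frames / total seconds, not the mean of the
# instantaneous per-frame values.
avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)

print(f"min FPS: {min(instantaneous_fps):5.1f}")  # the stutter you actually feel
print(f"avg FPS: {avg_fps:5.1f}")                 # what most reviews report
print(f"max FPS: {max(instantaneous_fps):5.1f}")  # mostly bragging rights
```

One run, three wildly different numbers; a review quoting just one of them without saying which is telling you very little.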

That performance review is about as useful as a wet pair of underpants.
 
Perhaps instead of ranting about how terrible you think it is, you could send in feedback.
 
I did, almost verbatim to what I wrote above. This is the first time I've seen it, hence why he's only now getting my email. That still doesn't make the review any more useful, so for all intents and purposes it's still pretty much worthless as it stands.

I have a feeling that one of two things happened:

Either he A) used the Benchmark_CPU mode, or
B) made his own benchmark that was heavily CPU-bottlenecked at lower resolutions.

The numbers don't really line up with ANYTHING I've seen otherwise.
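For what it's worth, a CPU bottleneck is easy to spot in published numbers: if the average FPS barely moves while the resolution climbs, the GPUs aren't the limiter. A toy sketch, with invented figures:

```python
# Hypothetical average FPS per resolution; the values are invented.
fps_by_resolution = {
    (1280, 1024): 42.0,
    (1680, 1050): 41.5,
    (1920, 1200): 40.8,
}

resolutions = sorted(fps_by_resolution, key=lambda r: r[0] * r[1])
lowest, highest = resolutions[0], resolutions[-1]
pixel_growth = (highest[0] * highest[1]) / (lowest[0] * lowest[1]) - 1
fps_drop = 1 - fps_by_resolution[highest] / fps_by_resolution[lowest]

# ~76% more pixels but only ~3% fewer frames: the CPU, not the GPU(s),
# is setting the pace, so multi-GPU scaling can't show up.
print(f"pixels: +{pixel_growth:.0%}, FPS: -{fps_drop:.0%}")
```

Numbers that stay flat like that across resolutions would explain why his results don't line up with anyone else's GPU-bound figures.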
 
Prices in Europe will probably end up at around €400, probably even lower judging from the AIB -> distributor prices (€305 without VAT).


Price for the Sapphire HD3870 X2 1GB without VAT and sales margins is €307,82. So I think we'll see the 1GB version of the HD3870 X2 retail at around €399.

Stores are getting their first batch of cards at the end of this week, ready for next week's launch (rumors about a triple launch together with the HD3400 and HD3600 seem to be correct). A hard launch it is.
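For anyone wondering how €307,82 ex-VAT turns into ~€399 on the shelf, the rough arithmetic is below. Note the 19% VAT rate (Germany at the time) is just my assumption for the sketch; VAT varies per country:

```python
price_ex_vat = 307.82              # Sapphire HD3870 X2 1GB, AIB -> distributor
VAT = 0.19                         # assumed rate; differs per EU country

price_inc_vat = price_ex_vat * (1 + VAT)     # ~366 EUR with VAT applied
implied_margin = 399.0 / price_inc_vat - 1   # what the shop keeps at 399 EUR

print(f"inc. VAT: {price_inc_vat:.2f} EUR")            # ~366.31 EUR
print(f"implied retail margin: {implied_margin:.1%}")  # ~8.9%
```

So €399 would leave the e-tailers a single-digit margin on top of VAT, which sounds plausible for a launch part.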
 
Any news on the possible 2GB versions? Will there be any available at launch?
 
Smells like someone is taking a bit more than necessary for themselves there. According to someone on Finland's biggest hardware forum, the price for the card without any taxes (VAT), postage, or e-tailer/retailer steps in between is €290,82 for him; different importers (or whatever they should be called) apparently take different sums per card for themselves.
 
Please don't confuse that TechSPOT article with the fine work done over at TechREPORT. :)

Regards,
SB
:oops: :oops: I did mistype that, my bad!

I received a reply back from Steven at TechSpot, here's what he had to say:

Hi Jason,
The multi-GPU hotfix was installed on the test system and we tested Crysis in 32-bit mode. The results that we recorded were the average frame rates, and we used the Assault mission for testing.

Thank you
Steven Walton
Tech Spot
www.techspot.com

Odd: why would you use a 64-bit OS to test an application in 32-bit mode, when the app comes in BOTH flavors straight out of the box? The average framerate part makes sense of course, but the Assault mission is the benchmark_CPU test -- wrong test to use if you're going for video card testing, isn't it? The benchmark_GPU run tests the Island map.

So again, not anything like the best testing methodology for what they were trying to test.
 
One reason might be the fact that the 64-bit version of Crysis runs slower for some reason.
 
I'll test that out sometime this week; I didn't realize there was any sort of performance change between the 32-bit and 64-bit versions. It makes me wonder what the "tradeoff" is...
 
vertex_shader said:
What happened with the idle 45 watt power consumption?
It grew up.

LoL :LOL:

At VR-Zone, 110W is referred to as the lowest TDP(?), though I'm not sure whether that means idle power or not. However, 45W might be possible if PowerPlay can switch off one chip on the card, leaving only one chip + the PCIe switch working at idle... 110W seems plausible for both chips and the PCIe switch working at the same time...

Just my guess.
 
1800MHz GDDR3 memory gives approx ~114GB/s :( of memory bandwidth combined for the two GPUs. Where is GDDR4?? GDDR4 at ~2400MHz would give ~154GB/s combined for the two GPUs :) :D
Yuck at GDDR3.

I'm not sure the "combined" numbers really mean anything; I don't know that you can explicitly add bandwidth together like that for multi-GPU setups like this one. Either way, I'd much prefer to have some GDDR4 at 1200MHz like my current 3870s have.
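For reference, here's the per-GPU arithmetic behind those figures, assuming a 256-bit bus per chip as on the plain HD 3870 (an assumption on my part). With AFR each GPU works on its own frame from its own copy of the data, so simply doubling the per-chip number is optimistic:

```python
def bandwidth_gb_s(effective_mhz: float, bus_bits: int = 256) -> float:
    """Peak memory bandwidth: effective clock times bus width in bytes."""
    return effective_mhz * 1e6 * (bus_bits / 8) / 1e9

gddr3 = bandwidth_gb_s(1800)   # ~57.6 GB/s per GPU -> ~115 GB/s "combined"
gddr4 = bandwidth_gb_s(2400)   # ~76.8 GB/s per GPU -> ~154 GB/s "combined"

print(f"GDDR3 @ 1800MHz: {gddr3:.1f} GB/s per GPU, {2 * gddr3:.1f} combined")
print(f"GDDR4 @ 2400MHz: {gddr4:.1f} GB/s per GPU, {2 * gddr4:.1f} combined")
```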
 
The Nvidia GeForce 7950GX2 utilized combined bandwidth -- it wasn't so bad.
 