AMD confirms R680 is two chips on one board

In most games yes. But Crysis? No.

I guarantee that you could not playably run Crysis at 2560x1600 with 8xMSAA/16xAF at very high settings even with quad 3870X2's.

At 30fps, that works out to 3.75 fps per 3870 assuming perfect scaling. And frankly, I don't think the 3870 could manage 3.75 fps at those settings (going off how my single 8800GTS 640 handles much lesser settings).
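For what it's worth, the arithmetic there is just the target frame rate divided by the GPU count, assuming ideal (and unrealistic) linear AFR scaling. A quick back-of-the-envelope sketch, with the 8-GPU count being the hypothetical quad 3870X2 setup:

```python
# Back-of-the-envelope: the frame rate each GPU must sustain on its own
# if multi-GPU scaling were perfectly linear (it never is in practice).
def fps_per_gpu(target_fps, num_gpus):
    """Per-GPU frame rate under hypothetical perfect scaling."""
    return target_fps / num_gpus

print(fps_per_gpu(30, 8))  # quad 3870X2 = 8 GPUs -> 3.75
print(fps_per_gpu(60, 8))  # the 60fps "ideal" -> 7.5
```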

And even then, that's assuming "CPU bound" to mean below 30fps. If anything I would say 60fps is the ideal. So really the question is: could that GPU setup hit a 60fps average?

The answer is a definite no.

On the other hand, at much lower graphics settings I'm sure a 9650 could easily maintain a 60fps average.
No doubt. But with that much video power to brute-force it, Crysis would be playable at much higher settings than it currently is on anything. For any other game, it would be a waste IMO.
 
Even if quad CrossFire with four Radeon 3870 X2s were possible, nobody is going to spend that much money on video power just to play Crysis. :LOL:
 
This CPU limited stuff is bull. Really. Are we back in the dark ages or something? Start running things with high enough settings at a high enough resolution, and then come back and tell us how that CPU limited thingie is coming along.

Do you actually check out CPU reviews? There's about zilch scaling with high-end CPUs beyond a certain point (read: over medium settings or whatever). If you're running a Pentium D, yeah, sure, you'll be quite CPU limited. But under normal circumstances, you're almost always GPU bound. Even with SLI, even with Tri-SLI, and even with a hypothetical octo-GPU.
 
http://www.hardocp.com/article.html?art=MTI2MiwsLGhlbnRodXNpYXN0

Scaling differs between AMD and Intel processors, it seems. I think we have a paradox: being GPU limited at high resolutions again while power to burn still sits untapped in the CPU. This is, IMO, because a lot of games still don't offer full threading.
 
In F.E.A.R. and in Oblivion? Not to mention that those other titles are fairly ancient in their own right. If you were to take the top-end GPUs from when Oblivion was released and perform the test, for example, you'd likely be GPU bound. It's the same here. The hypothetical scenario of 8-way multi-GPU rendering isn't for playing your three-year-old game at 1280x1024. For that it'd be a waste.

Once you start getting into modern times, it becomes very useful FOR A CERTAIN NICHE. There are people buying 30" Dells to game on, buying Skulltrail (when it's available) and Maximuses, QX9650s and QX9770s, paying a lot of money for top-notch watercooling setups and so on. And they'll play Crysis, and BioShock, and World in Conflict and so on. At the 30" Dell's native resolution. Preferably with AA and AF. And over there you need all the GPU oomph you can get.

The trouble with an 8-way setup stems from the difficulties of getting it to scale, the lag that AFR brings, and the power requirements, not from the fact that you'd be CPU limited. That's bull (for the target demographic of such a solution).

Case in point: with a 3.2GHz Pentium D, 2900XTs in CF were indeed CPU limited at 1280x1024/1440x900 in 3DMark; even with 4x AA on, the score didn't fluctuate much. Punch it up to 4.5GHz, and voila, you get a perceptible performance delta from no AA to 4x AA. A 4.5GHz Pentium D isn't exactly a speed demon by today's standards, and 3DMark06 isn't the most demanding thing out there; there are games that are more GPU intensive.
 
I'm sorry, I'm not convinced you've offered enough proof that at 2560x1600 you won't get CPU limited in modern games. I'm not talking about Crysis; I'm talking about games you can actually run at 2560x1600 with, say, CrossFireX 3870 X2s. They compared two GTXs in SLI on a 3GHz dual core. How can you prove that a CPU has enough power to feed two teraflops of GPUs (theoretical, but GPUs seem to get close enough to their potential) when the speed of out-of-the-box parts hasn't changed much in that time? Sure, we have dual vs. quad, but not all games CAN take advantage of that even now. So what I'm saying is that I think a game can become CPU limited even while the CPU has free cycles to spare.
 
Little aside, but I was wondering: now with the prospect of running a four-GPU setup in a consumer-level system, what are the odds of implementing some sort of "free" 4xRGSS, combining the power of all the devices, with a simple driver hack/trick?
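In case the idea isn't clear, here's a toy sketch of what such a driver trick might look like: each of the four GPUs renders the full frame with a different sub-pixel jitter from a 4x rotated-grid pattern, and the driver averages the results. The offsets are a commonly cited 4x RGSS pattern (in pixel units); everything here is a hypothetical illustration, not how CrossFire drivers actually behave:

```python
# Hypothetical "free" 4x RGSS across four GPUs: each GPU applies one
# rotated-grid sub-pixel offset to its projection, then the driver
# averages the four full-resolution frames into the final image.
RGSS_OFFSETS = [(0.125, 0.375), (0.375, -0.125), (-0.125, -0.375), (-0.375, 0.125)]

def jitter_for_gpu(gpu_index):
    """Sub-pixel (x, y) offset the given GPU would apply when rendering."""
    return RGSS_OFFSETS[gpu_index % 4]

def resolve(frames):
    """Average four jitter-rendered frames (equal-sized 2D pixel grids)."""
    h, w = len(frames[0]), len(frames[0][0])
    return [[sum(f[y][x] for f in frames) / len(frames) for x in range(w)]
            for y in range(h)]
```

The catch, of course, is that all four GPUs still do a full frame's worth of work, so the antialiasing is "free" only in the sense that no single GPU runs slower than it would alone.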
 
@Squilliam
You do realize that just one of the 3GHz cores in the E6850 would already be far faster than even the fastest 3.73GHz Pentium 4, right?

The clock frequency doesn't tell the whole story. We may have moved towards multiple cores, but each individual core has also significantly improved in performance.
 
Certainly do!

Well, look at FSX, Supreme Commander etc. :) I play a lot of strategy games. But on the FPS front, could you say that Crysis would perform at 2560x1600 @ 60fps on Very High with a current-generation dual core? That's multiple next-gen graphics chips right there.
 
Looks like the R680 review is out; this thread is done!
http://www.rage3d.com/board/showthread.php?threadid=33914731

[Benchmark chart: Call of Duty 4 at 2560x1600]


Looks to me like the Radeon 3870 X2 at $449 US is an attractive price compared to the GF8800 Ultra at $650 US.
http://techreport.com/articles.x/13967/6

[Benchmark chart: Half-Life 2 at 2560x1600]


Edit: I'm waiting for drivers that enable quad CrossFire with two Radeon 3870 X2s (four RV670 XT GPUs total). I wonder how well it will scale in FPS, because quad SLI on the GF7950GX2 didn't work well.
 
Should work much better on Vista due to limitations of AFR on XP. I guess we'll see in a couple of months. :smile:
 
I play SupCom too (as well as the even more badly optimized Forged Alliance) and at 1280x1024 with my humble 3850, I'm certainly limited by the GPU rather than my CPU (Q6600). Besides, SupCom happens to be able to use dual cores (it doesn't seem to use the quad cores very well even with the optimizer). It's mostly the shields though.

If I even attempted something ridiculous like 2560x1600, the GPU can only become more of a bottleneck.
 
I'd generally agree; I can make quite a few recent apps completely bottleneck at the GPU, even in a crossfire system -- even on an app that supports crossfire. I can drag FarCry into the well-below-30fps dirt with ease (gave it a passing glance last night in 24xAA + Adaptive MSAA mode :D looked awesome!)

So yeah, I can always find more room for video horsepower. Even on a four-year-old title ;)
 
SupCom is strangely GPU limited. I think it's because they are forced to render a lot more at once than other RTS games due to the zoom feature (even when zoomed right into the ground). I have an 8800GTX and it bogs down at 1920x1200 in big battles. I also have a 3850 but haven't tried it on it. I used to play occasionally with an 8600GTS and it ran ok at 1680x1050 until things got busy. Yeah the shields are especially demanding.

The game is definitely extremely CPU limited if you get into huge battles though. And in multiplayer, the game will only run as fast as the slowest comp can run it. The load is shared equally between machines apparently.

http://www.techpowerup.com/reviews/HIS/HD_3870_X2/15.html
I wonder what the deal is with X2's perf in SupCom anyway....
 
Probably not optimized. And yeah, in the large battles it does get CPU limited. But if you tried to run it at an insane resolution like 2560x1600, it'd STILL be GPU limited (at 2 FPS, probably).
 
Indeed, and in fact one drawback I have yet to see addressed is the inability to turn off crossfire if it's sucking down your performance. At least I have the opportunity to go turn off the checkmark in the control panel if I run into a game that sucks canal water with CF enabled; not so much luck for an X2 owner.

The X2, just like the upcoming GX2, will live or die by drivers. If they do a good job with drivers, the cards will be excellent for the price. If the drivers suck, these cards will get very little love.
 