AMD confirms R680 is two chips on one board

Discussion in 'Architecture and Products' started by nicolasb, Dec 14, 2007.

  1. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    4
    #421 Shtal, Jan 26, 2008
    Last edited by a moderator: Jan 26, 2008
  2. Kowan

    Newcomer

    Joined:
    Sep 6, 2007
    Messages:
    136
    Likes Received:
    0
    Location:
    California
No doubt. But with that much brute-force video power, Crysis would be playable at much higher settings than it currently is on anything. For any other game, it would be a waste IMO.
     
  3. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    4
Even if Quad-CrossFire with four Radeon 3870 X2s were possible, nobody is going to spend that much money on video power just to play Crysis. :lol:
     
  4. Kowan

    Newcomer

    Joined:
    Sep 6, 2007
    Messages:
    136
    Likes Received:
    0
    Location:
    California
LoL, oh there would be some who would. Just because they could. :lol:
     
  5. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    Tell that to anyone buying an Alienware system.
     
  6. AlexV

    AlexV Heteroscedasticitate
    Moderator Veteran

    Joined:
    Mar 15, 2005
    Messages:
    2,535
    Likes Received:
    144
This CPU-limited stuff is bull. Really. Are we back in the dark ages or something? Start running things with high enough settings at a high enough resolution, and then come back and tell us how that CPU-limited thingie is coming along.

Do you actually check out CPU reviews? There's about zilch scaling with high-end CPUs beyond a certain point (read: over medium settings or whatever). If you're running a Pentium D, yeah, sure, you'll be quite CPU limited. But under normal circumstances, you're almost always GPU bound. Even with SLI, even with Tri-SLI, and even with a hypothetical octo-GPU.
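The test AlexV describes (raise the GPU load and watch what happens to the framerate) can be sketched as a tiny classifier. This is an illustrative sketch, not anything from the thread; the function name, the sample framerates, and the 10% threshold are all assumptions.

```python
# Hypothetical sketch: classify a game as CPU- or GPU-bound from framerates
# measured at a low and a high resolution. The 10% threshold is an
# illustrative assumption, not a standard.

def bottleneck(fps_low_res: float, fps_high_res: float, tol: float = 0.10) -> str:
    """If the framerate barely moves when resolution (pure GPU load) rises,
    the CPU is the limiter; if it drops with resolution, the GPU is."""
    drop = (fps_low_res - fps_high_res) / fps_low_res
    return "CPU-bound" if drop < tol else "GPU-bound"

print(bottleneck(120.0, 118.0))  # barely drops at the higher res -> CPU-bound
print(bottleneck(120.0, 55.0))   # roughly halves                 -> GPU-bound
```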
     
  7. Squilliam

    Squilliam Beyond3d isn't defined yet
    Veteran

    Joined:
    Jan 11, 2008
    Messages:
    3,495
    Likes Received:
    114
    Location:
    New Zealand
    http://www.hardocp.com/article.html?art=MTI2MiwsLGhlbnRodXNpYXN0

Scaling between AMD and Intel processors: it seems there is a difference. I think we have a paradox of being CPU limited at high resolutions again, with power to burn still sitting untapped in the CPU. This is, IMO, because a lot of games still don't offer full threading.
     
  8. AlexV

    AlexV Heteroscedasticitate
    Moderator Veteran

    Joined:
    Mar 15, 2005
    Messages:
    2,535
    Likes Received:
    144
In F.E.A.R. and in Oblivion? Not to mention that those other titles are fairly ancient in their own right. If you were to take the top-end GPUs from when Oblivion was released and run the test, for example, you'd likely be GPU bound. It's the same here. The hypothetical scenario of 8-way multi-GPU rendering isn't for playing your three-year-old game at 1280x1024. For that it'd be a waste.

Once you get into modern titles, it becomes very useful FOR A CERTAIN NICHE. There are people buying 30" Dells to game on, buying Skulltrail (when it'll be available) and Maximuses, QX9650s and 9770s, paying a lot of money for top-notch watercooling setups and so on. And they'll play Crysis, and BioShock, and World in Conflict and so on. At the 30" Dell's native resolution. Preferably with AA and AF. And there you need all the GPU oomph you can get.

The trouble with an 8-way setup stems from the difficulty of getting it to scale, from the lag that AFR brings, and from power requirements, not from being CPU limited. That's bull (for the target demographic of such a solution).

Case in point: with a 3.2GHz Pentium D, 2900XTs in CF were indeed CPU limited at 1280x1024/1440x900 in 3DMark... even with 4x AA on, the score didn't fluctuate much. Punch it up to 4.5GHz and, voila, you get a perceivable performance delta from no AA to 4x AA. A 4.5GHz Pentium D isn't exactly a speed demon by today's standards, and 3DMark06 isn't the most demanding thing out there; there are games that are more GPU intensive.
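The AFR lag mentioned above comes down to simple arithmetic: alternate-frame rendering multiplies throughput, but each displayed frame still took one full single-GPU frame time to render, so latency measured in displayed frames grows with the GPU count. A rough sketch, assuming ideal N-way scaling and ignoring driver queueing overhead (the function name and sample numbers are illustrative):

```python
# Rough arithmetic sketch of AFR lag under an assumed ideal-scaling model:
# throughput rises N-fold, but every frame on screen was still rendered
# over one single-GPU frame time.

def afr_stats(single_gpu_frame_ms: float, n_gpus: int):
    fps = 1000.0 / single_gpu_frame_ms * n_gpus    # ideal AFR throughput
    latency_ms = single_gpu_frame_ms               # render time of each displayed frame
    latency_in_frames = latency_ms * fps / 1000.0  # equals n_gpus under this model
    return fps, latency_ms, latency_in_frames

fps, lat_ms, lat_frames = afr_stats(40.0, 8)  # 40 ms/frame per GPU, 8-way AFR
print(fps, lat_ms, lat_frames)  # 200 fps on screen, yet each frame is 40 ms (8 frames) old
```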
     
  9. Squilliam

    Squilliam Beyond3d isn't defined yet
    Veteran

    Joined:
    Jan 11, 2008
    Messages:
    3,495
    Likes Received:
    114
    Location:
    New Zealand
I'm sorry, I'm not convinced you've offered enough proof that at 2560x1600 you won't get CPU limited in modern games. I'm not talking about Crysis; I'm talking about games you can actually run at 2560x1600 with, say, CrossFireX 3870 X2s. They compared two GTXs in SLI on a 3GHz dual core. How can you prove that a CPU has enough power to feed two teraflops of GPUs (theoretical, but GPUs seem to get close enough to their potential) when the speed of out-of-the-box parts hasn't changed much in that time? Sure, we have dual vs. quad, but not all games CAN take advantage of that even now. So what I'm saying is that I think a game can become CPU limited even while the CPU has free cycles to spare.
     
  10. fellix

    Veteran

    Joined:
    Dec 4, 2004
    Messages:
    3,552
    Likes Received:
    514
    Location:
    Varna, Bulgaria
A little aside, but I was wondering: now that running a four-GPU setup in a consumer-level system is a real prospect, what are the odds of implementing some sort of "free" 4x RGSS, combining the power of all the devices, with a simple driver hack/trick?
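The idea floated here could work by giving each of the four GPUs a different sub-pixel jitter from a rotated-grid pattern and having the driver average the four full-frame results. A minimal sketch, assuming the classic 4x rotated-grid sample positions; the offsets, function name, and toy pixel values are illustrative, not anything AMD shipped:

```python
# Hypothetical sketch of driver-side 4x RGSS across 4 GPUs: each GPU renders
# the whole frame with one sub-pixel jitter, and the driver averages the
# four images. Offsets are the classic 4x rotated-grid positions, in
# fractions of a pixel.

RGSS_4X = [(-0.375, -0.125), (0.125, -0.375), (0.375, 0.125), (-0.125, 0.375)]

def resolve(samples):
    """Average one pixel's jittered samples (each an (r, g, b) tuple)."""
    n = len(samples)
    return tuple(sum(s[i] for s in samples) / n for i in range(3))

# Toy edge pixel: two GPUs' jitters land on geometry (white), two miss (black),
# so the resolved pixel comes out mid-grey.
print(resolve([(1, 1, 1), (1, 1, 1), (0, 0, 0), (0, 0, 0)]))
```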
     
  11. ChronoReverse

    Newcomer

    Joined:
    Apr 14, 2004
    Messages:
    245
    Likes Received:
    1
    @Squilliam
You do realize that just one of the 3GHz cores in an E6850 is already far faster than even the fastest 3.73GHz Pentium 4, right?

    The clock frequency doesn't tell the whole story. We may have moved towards multiple cores, but each individual core has also significantly improved in performance.
     
  12. Squilliam

    Squilliam Beyond3d isn't defined yet
    Veteran

    Joined:
    Jan 11, 2008
    Messages:
    3,495
    Likes Received:
    114
    Location:
    New Zealand
    Certainly do!

Well, look at FSX, Supreme Commander, etc. :) I play a lot of strategy games. But on the FPS front, could you say that Crysis would run at 2560x1600 @ 60fps on Very High with a current-generation dual core? That's multiple next-gen graphics chips right there.
     
  13. Shtal

    Veteran

    Joined:
    Jun 3, 2005
    Messages:
    1,344
    Likes Received:
    4
    #433 Shtal, Jan 29, 2008
    Last edited by a moderator: Jan 29, 2008
  14. aca

    aca
    Newcomer

    Joined:
    May 4, 2007
    Messages:
    44
    Likes Received:
    0
    Location:
    Delft, The Netherlands
    Should work much better on Vista due to limitations of AFR on XP. I guess we'll see in a couple of months. :smile:
     
  15. ChronoReverse

    Newcomer

    Joined:
    Apr 14, 2004
    Messages:
    245
    Likes Received:
    1
I play SupCom too (as well as the even more badly optimized Forged Alliance), and at 1280x1024 with my humble 3850 I'm certainly limited by the GPU rather than my CPU (a Q6600). Besides, SupCom happens to be able to use dual cores (it doesn't seem to use quad cores very well even with the optimizer). It's mostly the shields, though.

If I even attempted something ridiculous like 2560x1600, the GPU would only become more of a bottleneck.
     
  16. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    4,309
    Likes Received:
    1,107
    Location:
    35.1415,-90.056
I'd generally agree; I can make quite a few recent apps completely bottleneck at the GPU, even in a CrossFire system, even in an app that supports CrossFire. I can drag Far Cry into the well-below-30fps dirt with ease (gave it a passing glance last night in 24x AA + Adaptive MSAA mode :D looked awesome!)

So yeah, I can always find more room for video horsepower. Even on a four-year-old title ;)
     
  17. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    9,045
    Likes Received:
    1,119
    Location:
    WI, USA
SupCom is strangely GPU limited. I think it's because the zoom feature forces it to render a lot more at once than other RTS games (even when zoomed right into the ground). I have an 8800GTX and it bogs down at 1920x1200 in big battles. I also have a 3850 but haven't tried the game on it. I used to play occasionally with an 8600GTS and it ran OK at 1680x1050 until things got busy. Yeah, the shields are especially demanding.

The game is definitely extremely CPU limited if you get into huge battles, though. And in multiplayer, the game will only run as fast as the slowest machine can run it; the load is apparently shared equally between machines.

http://www.techpowerup.com/reviews/HIS/HD_3870_X2/15.html
I wonder what the deal is with the X2's performance in SupCom anyway....
     
    #437 swaaye, Jan 29, 2008
    Last edited by a moderator: Jan 29, 2008
  18. ChronoReverse

    Newcomer

    Joined:
    Apr 14, 2004
    Messages:
    245
    Likes Received:
    1
Probably not optimized. And yeah, in the large battles it does get CPU limited. But if you tried to run it at an insane resolution like 2560x1600, it'd STILL be GPU limited (at 2 FPS, probably).
     
  19. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    9,045
    Likes Received:
    1,119
    Location:
    WI, USA
    And thus the potential issues of CF and SLI become painfully apparent. It's performing like an 8600GT(S)!
     
  20. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    4,309
    Likes Received:
    1,107
    Location:
    35.1415,-90.056
Indeed; in fact, one drawback I have yet to see addressed is the inability to turn off CrossFire when it's dragging your performance down. At least I can go uncheck the box in the control panel if I run into a game that sucks canal water with CF enabled; not so much luck for an X2 owner.

    The X2, just like the upcoming GX2, will live or die by drivers. If they do a good job with drivers, the cards will be excellent for the price. If the drivers suck, these cards will get very little love.
     