Gigabyte creates first dual-GPU graphics card

Discussion in '3D Hardware, Software & Output Devices' started by phenix, Dec 16, 2004.

  1. Lecram25

    Newcomer

    Joined:
    Dec 3, 2003
    Messages:
    103
    Likes Received:
    0
    Indeed, just because the two prototypes don't have SLi connectors does not mean the final cards won't.
     
  2. dan2097

    Regular

    Joined:
    May 23, 2003
    Messages:
    323
    Likes Received:
    0
    But the resulting card would be ludicrous.

    It would have more shading power than SLI'd 6800 Ultras but would STILL have only 128 MB. It would be a complete waste. SLI'd 6800 GTs would own it at 1600x1200 with 4x FSAA and 16x AF.
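The "only 128mb" point follows from how SLI handles memory: resources are mirrored on each GPU rather than pooled, so two 128 MB cards still give one card's worth of usable texture space. A tiny sketch of that reasoning (my own illustration, not from the thread):

```python
# SLI mirrors textures/buffers on every GPU instead of pooling them,
# so usable memory stays at one card's worth regardless of GPU count.
def usable_memory_mb(per_gpu_mb: int, n_gpus: int, pooled: bool = False) -> int:
    return per_gpu_mb * n_gpus if pooled else per_gpu_mb

print(usable_memory_mb(128, 2))  # still effectively a 128 MB card for assets
```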
     
  3. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    I'd thought about the concept of SLI'ing multiple dual-core boards together, but if these are just using SLI on a single board anyway, then, given the number of titles that are already forced to AFR because split-screen rendering doesn't work or is not optimal, I'm not sure that adding even more latency would be a good idea.
     
  4. Coz

    Coz
    Newcomer

    Joined:
    Dec 16, 2004
    Messages:
    36
    Likes Received:
    1
    Location:
    Kent, England
    But AFR is supposed to be the preferred SLI mode - it has less GPU-to-GPU communication overhead than SFR, and less vertex load per GPU too.
     
  5. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
    AFR is not a catch-all solution, since render-to-texture ops can be kept for more than one frame, meaning that the board that initially rendered the texture will need to pass it to each other board that requires it in the frame it's rendering. Also, for each board you add, you are increasing the latency by an extra frame.
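The latency point can be sketched with a toy model (my own illustration, using the rule of thumb stated above that each extra board adds roughly one frame of latency): with N boards alternating frames, frames rotate round-robin across boards, and up to N frames are in flight between issue and display.

```python
# Toy AFR model (illustrative assumption, not vendor data):
# board (i % n) renders frame i, and with n boards alternating,
# up to n frames are in flight from issue to display.
def frames_in_flight(n_boards: int) -> int:
    # One extra frame of latency per additional board (rule of thumb above).
    return n_boards

def board_for_frame(frame: int, n_boards: int) -> int:
    # AFR assignment: frames rotate round-robin across boards.
    return frame % n_boards

print(frames_in_flight(2))                        # 2
print([board_for_frame(i, 2) for i in range(4)])  # [0, 1, 0, 1]
```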
     
  6. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    There isn't much software currently that penalises 128 MB cards, because without AA they are usually fine. Heck, even AA is usually fine if they stay at a lower res.

    Giving 3DMark the benefit of the doubt here: it's not called "Anti-Aliasing Mark03". I understand your concern, but it's not like they are lying about its performance. In many cases it probably would be much faster.
     
  7. fallguy

    Veteran

    Joined:
    Jun 17, 2003
    Messages:
    1,367
    Likes Received:
    11
    And who is going to buy a card such as this to play at 1024x768? Trying to showcase a card by running it in one synthetic benchmark, at a low res with no AA/AF, is pointless to me. Unless, of course, they're just trying to say, "it's faster".
     
  8. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,532
    Location:
    Winfield, IN USA
    Well since right now that kind of is Gigabyte's whole point... ;)

    I agree it's too early to declare it the new high-end card of choice though, gotta wait until it's actually available and some people can really put the sucker thru its paces. :)
     
  9. fallguy

    Veteran

    Joined:
    Jun 17, 2003
    Messages:
    1,367
    Likes Received:
    11
    Yeah, I know what they're doing, and I think it's shady.
     
  10. GrapeApe

    Newcomer

    Joined:
    Apr 3, 2004
    Messages:
    57
    Likes Received:
    2
    Location:
    Calgary, Canada
    While I don't deny it's a lot of PR prattle, and this is unlikely to represent 95+% of the gaming done on a card of this caliber, something like the HDR implementation in Far Cry stresses the GF6600GT too much, and causes even the GF6800U to struggle to keep a 60 fps average at 1024x768. So, depending on minimum fps dips, you might see some benefits, but of course only in specific and admittedly limited situations.

    This would represent a tiny slice of gaming nowadays, but would likely represent more of future titles/engines IMO (by which time, of course, other chips/cards will have come along). This also means you are trading off your 1600x1200 + AA/AF performance for a pigeon-holed niche. But if you are limited to 1024x768, either due to an LCD or an HDTV (which has its own AA aspects) as your primary/sole monitor, this could be a best-fit solution.

    Still, it's not worth it to me, but people are willing to do a lot of nutty things for their favourite games. It's also still not worth developing a card for such a niche market, but if it's as cheap to produce as is being touted, maybe they think it's worth the risk.

    Just a thought, and an answer to your question.
     
  11. Fox5

    Veteran

    Joined:
    Mar 22, 2002
    Messages:
    3,674
    Likes Received:
    5
    That was true under scan-line interleave, but is it true for the new methods? I'd assume it would still be true for alternate frame rendering, but how about split-frame rendering, which divides up the screen based on what will provide the most ideal performance - wouldn't that take advantage of the extra RAM?
     
  12. Druga Runda

    Druga Runda Sleepy Substitute
    Regular

    Joined:
    Sep 30, 2003
    Messages:
    647
    Likes Received:
    43

    :lol:

     
  13. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
    Too bad SLI doesn't seem to help much, if at all, with FarCry in HDR mode... http://www.firingsquad.com/hardware/nvidia_nforce_4_sli/page13.asp
    Granted, the article is a month old; maybe a newer driver would help.
    Other than the "wow, 2 chips" factor, I can't really see this card selling well - it depends on the game, resolution and AA/AF settings, but overall I'd value this below an X800 XL or 6800 GT.
     
  14. {Sniping}Waste

    Regular

    Joined:
    Jan 13, 2003
    Messages:
    833
    Likes Received:
    29
    Location:
    Garland TX
    I don't see SLI really helping gamers much. In benchmarks it will kick out high numbers, but in real-world settings like games, you'll be lucky to get a 40% FPS boost.
     
  15. CJ

    CJ
    Regular

    Joined:
    Apr 28, 2004
    Messages:
    816
    Likes Received:
    40
    Location:
    MSI Europe HQ
    I was wondering something. Maybe it's stupid, but since this card is developed by Gigabyte, what's stopping them from developing something like this with an ATi chip like the RV410? Aren't all ATi cores (higher than R300) supposed to be able to work together in multi-GPU configurations?

    If they managed to give this card a 256-bit memory interface, I don't see why they couldn't do the same for ATi's RV410. Or maybe even use the 12-pipe X800, which is in the same price range as the 6600GT and already uses a 256-bit memory interface?
     
  16. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
    I can't see how they'd be able to make that work as long as ATi doesn't provide SLI-enabled drivers... Just because the chips are supposed to work in multi-GPU configurations doesn't mean the driver supports this.
    Also, for the X800 the PCB layout would probably get complicated (= expensive); lots of traces are needed for 2x256-bit memory interfaces... And for other ATi ("medium performance") GPUs I don't see it as a viable option either (just like I don't see the dual 6600GT as a viable product).
     
  17. CJ

    CJ
    Regular

    Joined:
    Apr 28, 2004
    Messages:
    816
    Likes Received:
    40
    Location:
    MSI Europe HQ
    Why would they need to develop special SLI drivers? Don't they have those already? Evans & Sutherland have hooked up four R300s in their simFUSION 6000 series, and SGI has 2 to 32 R300s in their Onyx4 setups. Those solutions are obviously working one way or another (probably with some help from ATi)... So why can't Gigabyte create an ATi version of their dual-chip cards? I don't think it's viable either, but Gigabyte seems to think it is, and they have obviously put a lot of money into the 3D1 product. So I wouldn't be surprised if we saw Gigabyte announce an ATi solution sooner or later.
     
  18. Unknown Soldier

    Veteran

    Joined:
    Jul 28, 2002
    Messages:
    4,047
    Likes Received:
    1,670
    Anyone see this?

    http://www.ocworkbench.com/2004/gigabyte/GV-3D1/g1.htm

    Superior Features

     Powered by dual GeForce 6600GT.
     First 6600GT card be built with 256MB DDR III memory.
     First 6600GT card supports 256bit memory bandwidth.
     "3D1" takes advantage of the dual fan cooling system to control the airflow sucked and pushed out for remaining in low GPU temperature and reduces the noise of the fan.
     To achieve 3DMark 2003 performance over 14,000, and to create a new milestone of the graphics card.
     To support with GIGABYTE unique overclocking utility V-TUNER 2, and let customers can tweak two GPU simultaneously.
     Support HDTV output
     To bundle selling with GBT extra performance 8Σ series GA-K8NXP-SLI motherboard to provide the most deluxe gaming package.


    GPU Dual 6600GT
    Memory Size 256MB DDR III
    Memory Bandwidth 256bit
    Memory Clock 1120 MHz
    Core Clock 500 MHz
    Direct X 9.0C
    D-SUB Yes
    HDTV Yes
    DVT Port DVI-I

    --------------------------

    Looks like it is a 256-bit bus after all.

    Interesting.

    In the Comanche 4 test, though, it had a dip... CPU-affected?

    It does have a lot of grunt though... in most tests it has a significant jump in performance. Also, the 71.24 drivers help.

    US
     
  19. incurable

    Regular

    Joined:
    Apr 20, 2002
    Messages:
    547
    Likes Received:
    5
    Location:
    Germany
    Unless they've changed the ASICs, it's still two 128-bit wide memory buses, one on each NV43.

    Probably; I'd guess Comanche 4 is just not supported by the driver's SLI profiles (it's quite old and probably not high on the list for optimisation), resulting in only a single GPU working in fall-back mode while suffering from some CPU overhead.

    cu

    incurable
     
  20. Unknown Soldier

    Veteran

    Joined:
    Jul 28, 2002
    Messages:
    4,047
    Likes Received:
    1,670
    Well, that's what I understood. So why are they marketing it like that if it's wrong? Can't they just simply say 2x128-bit?

    Is it that hard?

    I mean, if you apply the same logic to the core and memory speeds, they should then list it as a 1000 MHz core and 2240 MHz memory.

    US
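US's point about the additive marketing can be made concrete with a little arithmetic. A sketch using the per-GPU numbers from the spec sheet quoted above (the "advertised" vs. "implied" labels are my own framing):

```python
# Per-GPU specs of the 3D1, taken from the Gigabyte sheet quoted above.
n_gpus = 2
core_mhz = 500   # each NV43 core clock
mem_mhz = 1120   # effective memory clock per GPU
bus_bits = 128   # each NV43 drives its own 128-bit bus
mem_mb = 128     # each GPU addresses its own 128 MB pool

# Gigabyte advertises the sum of the two independent buses/pools:
advertised_bus = n_gpus * bus_bits   # "256bit"
advertised_mem = n_gpus * mem_mb     # "256MB"

# By the same additive logic, they would also have to advertise:
implied_core = n_gpus * core_mhz     # 1000 MHz core
implied_mem_clock = n_gpus * mem_mhz # 2240 MHz memory

print(advertised_bus, advertised_mem, implied_core, implied_mem_clock)
# 256 256 1000 2240
```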
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.