512MB GeForce 7800 GTX

Discussion in 'Pre-release GPU Speculation' started by KimB, Oct 31, 2005.

Thread Status:
Not open for further replies.
  1. SugarCoat

    Veteran

    Joined:
    Jul 17, 2005
    Messages:
    2,091
    Likes Received:
    52
    Location:
    State of Illusionism

Game developers working on complex engines usually don't get to run the engine at full spec for a while. The Unreal 3 engine saw much of its work done on reference NV30s, if I remember correctly. Even the NV40 seems to have severe trouble where it lacks power in that engine. This is echoed by comments, such as those made recently at E3, about how much better it runs on the G70. I just don't think it depends on the hardware out at the time, but more on where the developer sees their engine in its finality and what the APIs offer along the way. In my opinion, the true possibilities of shader limitation weren't apparent until DX9, so it really shouldn't take longer than one or two years for this to show up on these cores in games. At that point ATI may be branded winner, savior, what have you, as not everyone has the luxury of upgrading yearly.
     
  2. Sunrise

    Regular

    Joined:
    Aug 18, 2002
    Messages:
    306
    Likes Received:
    21
Yields are certainly of major importance for your low-end designs, but TSMC offers a whole lot of different process targets for different requirements (they also do this with 80nm, their first half-node to offer such extensive options, whereas 110nm was initially high-volume first), so each is already best suited to the markets you want your designs in. In that sense, every design can be stressful to some extent, but there are also other reasons why IHVs have nowadays adopted the strategy of prototyping their new designs on the low-end parts first.

NV has the advantage of an architecture that is built to scale extremely well across the many different markets they want to compete in, with the performance GPUs need for games built to today's requirements. There may also be the potential to execute G7X on 90nm, but that remains to be seen (and I don't think the 6100 or 6150 are indicative of it).
     
    #842 Sunrise, Nov 13, 2005
    Last edited by a moderator: Nov 13, 2005
  3. Unknown Soldier

    Veteran

    Joined:
    Jul 28, 2002
    Messages:
    4,047
    Likes Received:
    1,669
Oops, yes... I keep forgetting the R520 includes the XL. Pity the XT wasn't available on launch day, though.
     
  4. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
Of course these titles weren't coded with R580 in mind; the titles being released now were coded with DX9 in mind, and current DX9 hardware has much higher ALU capability than the previous generations of parts those titles were developed on.

Bear in mind, though, that ATI's ALU/ROP/texture ratio has stayed largely unchanged from R300 all the way to R520, but in that time developers have gained greater program-length capabilities, first with SM2.x and then SM3.0, while NV40, G70, developer relations and next-gen consoles have all given them indications to "use shaders".

The point being, though, that for as many years as I can remember, it's usually a safe bet that most of the AAA titles, which are usually taxing on the system, are released before Christmas.
     
  5. fallguy

    Veteran

    Joined:
    Jun 17, 2003
    Messages:
    1,367
    Likes Received:
    11
Why didn't they make it so the hot air goes out of the case? :( It seems just about half of it does. I'm a bit anal about that. Hopefully my Silencer will go right on it.

    Yes it was, 3 days before it in fact.
     
  6. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,058
    Likes Received:
    3,116
    Location:
    New York
Well yeah, but how is that any different from the people who whine about IQ, price, performance, power consumption, noise, etc., and who will never purchase the hardware anyway? Availability is just another benchmark. How many of the people (myself included) who were debating the viability of SLI in the early days actually own SLI setups?
     
  7. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,511
    Likes Received:
    224
    Location:
    Chania
I understood your point. Kudos to ATI for raising the IQ threshold beyond doubt; what I'm questioning here is how much the average consumer really values the added investment, both in resources and in transistors, in the end. And I'm obviously not referring to hardcore fans who will buy either IHV's GPUs no matter what.

Besides, it's a lot easier to illustrate/prove raw performance than it is IQ ratings.
     
  8. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,090
    Likes Received:
    694
    Location:
    O Canada!
Yes, I know what you're saying, but that's why I'm talking about FSAA - with R520 they improved FSAA performance at the same bandwidth levels by some 10%-20% (for 4x AA) in current titles relative to their previous parts, and also put mechanisms in place to try to ensure those gains carry over to higher-performance memories.
     
  9. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,714
    Likes Received:
    2,135
    Location:
    London
    Back in spring I think R520 would have been a pretty hard sell against 7800GTX.

Most reviewers describe the IQ of the two as "equal", despite the fact that they're not. But, more convincingly, all the "old skool" games around back then would have shown the 7800GTX with an unassailable performance advantage, apart from the X1800XT showing about 10% better performance in SC:CT.

It's only with CoD2, FEAR and BF2 that the X1800XT is leaving the 7800GTX in the dust - all of which were released after R520 would originally have launched. So back then R520 would have been summarised as "not a good all-rounder - the 16-pipe architecture is looking old compared to 24 pipes".

The big unknown, still, is how much driver tuning is left. A spring launch for R520 would probably have meant the drivers were well tuned. But right now, apart from the headline games (HL2, D3, Q4, BF2, FEAR, SC:CT), I think R520 will be shown to be severely under-tuned - and may never catch up, either.

    5.11 gives a big boost to HL-2, apparently. I guess that means that even the headline games are still in need of more tuning.

    Where's Prey?...

    Jawed
     
  10. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,511
    Likes Received:
    224
    Location:
    Chania
I'm not so sure what exactly you mean by tuned/untuned in the end. There are enough similarities between R4xx and R5xx to ensure that performance is already at quite a high level.

Besides, it also comes down to how old the titles you're referring to are; the older they are, the more CPU-bound they'll turn out to be. What's there to tune for UT2k4, as a simple example?

    ***edit:

HL2 is amongst those games that are already CPU limited; despite that, though, it still counts among the so-called key applications for benchmarks.
     
  11. Rys

    Rys Graphics @ AMD
    Moderator Veteran Alpha

    Joined:
    Oct 9, 2003
    Messages:
    4,182
    Likes Received:
    1,579
    Location:
    Beyond3D HQ
    It's not CPU limited at the resolutions and IQ levels that these cards were arguably somewhat designed for.

    EDIT: Checking, yes it is. However Source games with HDR content aren't CPU limited. DoD and Lost Coast will tax these boards at high res.
     
  12. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,714
    Likes Received:
    2,135
    Location:
    London
    I strongly disagree.

    Though the pipeline layout is "the same", the out of order scheduling, which is inextricably linked to the memory controller, hence to memory bandwidth/latency, is a whole new ball game. Anything that looks like a "we tweaked the memory controller" improvement has a direct impact on scheduling as well as use of the texture cache and the operation of the ROPs.

    We've already seen an example of how R3xx...R4xx-specific tweaks are counter-productive for R520.

    I'm referring to things that aren't headline games - they aren't necessarily particularly old.

    Pariah:

    http://www.xbitlabs.com/articles/video/display/asus-en7800gtx_14.html

    Pacific Fighters:

    http://www.xbitlabs.com/articles/video/display/asus-en7800gtx_19.html

    Warhammer:

    http://www.xbitlabs.com/articles/video/display/asus-en7800gtx_22.html

    I like the breadth of XBit Labs' reviews.

Generally CPU-limited, yes. But there are plenty of places in the game where high levels of action bring frame rates down into the 50s or lower.

    And with transparency/adaptive AA on high quality, the hit is even higher.

    The fact is, most of the time in HL-2, there isn't much going on, so high frame rates (limited by the CPU) are a cinch.

That's why I recently suggested using Lost Coast instead - even when there isn't much going on, the GPU hit is considerable.

    Serious Sam 2 is another game with a big GPU hit. Hard to find it benchmarked, though. XBit Labs has just started:

    http://www.xbitlabs.com/articles/video/display/geforce-6800gs_14.html

    Jawed
     
  13. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,714
    Likes Received:
    2,135
    Location:
    London
Thinking about "action" for a second, games like CoD2 (judging from the demo, which I've enjoyed quite a lot - surprising for me, because I hated the CoD1 demo, probably because it was so intensely "on rails") seem to have a lot of nearly continuous action, which I dare say makes them better candidates for benchmarking.

    I suppose SS2 is much the same; with almost continuous action, it would make a good benchmark.

    Jawed
     
  14. nAo

    nAo Nutella Nutellae
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    4,400
    Likes Received:
    440
    Location:
    San Francisco
I can't understand why SS2 is used as a benchmark. IMHO it's a bad game with a subpar engine... c'mon, it's fugly!
     
  15. Ailuros

    Ailuros Epsilon plus three
    Legend Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    9,511
    Likes Received:
    224
    Location:
    Chania
Point being that older titles are already fast enough. What on God's green earth do you need more than, say, 60-70fps at 1600 with AA/AF for anyway? The next best answer is performance ratings in benchmarks, but when no one really uses them anymore it somewhat defeats the purpose.

ROFL - what kind of drivers are you linking me to anyway? ;)


Depends what you mean by minimum framerates exactly, and how any given game behaves in general. I'd personally rather have any driver team deal with obnoxious single-digit minimum framerates in recent games than with anything above 20 fps in years-old games.

Directly proportional to the amount of alpha-test textures.


    I use OGL in that one ;)
     
  16. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,714
    Likes Received:
    2,135
    Location:
    London
    The proliferation of winks makes that posting entirely meaningless to me.

    Jawed
     
    MuFu likes this.
  17. SugarCoat

    Veteran

    Joined:
    Jul 17, 2005
    Messages:
    2,091
    Likes Received:
    52
    Location:
    State of Illusionism

Why not? It's better than using Doom 3 as the only representation of ATI's overall OpenGL performance, as many review sites did in the past.

I think you have to be extremely careful here; we don't know what percentage impact having the extra 256MB of video RAM is having. I'm absolutely positive it's helping with high AA/AF.

    http://www.xbitlabs.com/articles/video/display/asus-7800gt_11.html
FEAR, for example: we see the XT trailing the GTX at all resolutions with the candy off; turn it on and they switch places. AA is extremely detrimental to performance on all cards, so I'm willing to bet the 512MB is being put to good use.
     
    #857 SugarCoat, Nov 13, 2005
    Last edited by a moderator: Nov 13, 2005
  18. Sunday

    Newcomer

    Joined:
    Feb 6, 2002
    Messages:
    194
    Likes Received:
    6
    Location:
    GMT+1
Couldn't agree more!!
    This game is ridiculous (or, if I may use the phrase, "stupid till death"), and as far as I'm aware no one will be using this engine in any other game, so that's one more reason not to use it in any card evaluation!
     
  19. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,714
    Likes Received:
    2,135
    Location:
    London
    I suppose more comments on how good or bad SS2 is might be appropriate in the interview thread:

    http://www.beyond3d.com/forum/showthread.php?t=21878

    There's a link to the interview there, too.

    It's interesting that the graphics engine seems to be feature-laden yet is getting a panning.

    Jawed
     
  20. pharma

    Veteran

    Joined:
    Mar 29, 2004
    Messages:
    4,891
    Likes Received:
    4,540
Not sure if this was posted... Hexus reviewed the Leadtek PX7800 GTX TDH MyVIVO Extreme 256MB and the SAPPHIRE RADEON X1800 XT 512MB.

Unheard-of frequencies... sounds like he knows something!! Can't wait! :shock:


    http://www.hexus.net/content/item.php?item=3899

    Pharma
     