Half Life 2 Benchmarks (From Valve)

Discussion in 'Architecture and Products' started by Dave Baumann, Sep 11, 2003.

  1. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Developers often aren't terribly vocal as a rule, and it usually takes something extreme, good or bad, to get them talking. In this case, though, I think the loudest talking is coming from software: 3dMk03, shader benchmarks, and DX9 games like HL2 (not forgetting Tomb Raider, etc.). Their software often speaks volumes even when they themselves don't. I think the stimulus that got Gabe talking the way he did was major and extreme: either people would come down on Valve and accuse them of partisanship or lousy programming, or Valve would explain things and try to educate people on why their software behaves as it does. I think they wisely opted for the latter--not necessarily because they wanted to, but because they felt they had to. Had nV3x been DX9-compliant from a hardware perspective, none of what we've witnessed this year would have happened, IMO. Developers would rather not choose among IHVs, but when one keeps pace with the standards of the API and one does not, they get forced into these decisions. I don't blame them--I'd be pissed, too.
     
  2. Pete

    Pete Moderate Nuisance
    Moderator Legend

    Joined:
    Feb 7, 2002
    Messages:
    5,777
    Likes Received:
    1,814
    Bjorn, after being around here so long surely you know that pixel shader units are essentially built into the R(v)3x0's pipeline? :) The RV350 has as much shader power per pipeline as the R350, in that it has the same pixel shader in each pipe--but only half the pipelines, thus half the fillrate. Same shader power per clock, though, as I understand it. If both cards were asked to render, say, one PS2.0-shaded pixel in one clock, I'd imagine the 9600P would score practically equal to the 9800P. The latter's extra memory bandwidth may become a factor with four PS2.0 shaded pixels in one clock: though it's still within the RV350's design capability to output four shaded pixels per clock, its 2x64 crossbar may limit it. (I'm speaking from no in-depth knowledge whatsoever, though, so my surface understanding of the memory switch to the pixel shaders and subsequent seemingly logical pairing of the two may be totally off-base.)

    The R350 does have twice as many vertex shader units as the RV350, but that's probably to balance its doubled fillrate--I'm not sure the RV350 would benefit much, if at all, from extra VSs.
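
    [Editor's note] The pipeline arithmetic above can be sketched in a few lines. This is a back-of-the-envelope illustration only: the clock speeds are assumptions (roughly the retail 9600 Pro / 9800 Pro core clocks), and the model ignores bandwidth, overdraw, and everything else that makes real fillrate lower than theoretical.

```python
# Toy model of "one shaded pixel per pipeline per clock".
# Clock figures below are assumed for illustration, not authoritative.

def pixel_fillrate_mpix(pipelines: int, core_clock_mhz: int) -> int:
    """Theoretical pixel fillrate in Mpixels/s: pipes * clock."""
    return pipelines * core_clock_mhz

rv350 = pixel_fillrate_mpix(pipelines=4, core_clock_mhz=400)  # 9600 Pro (assumed clock)
r350  = pixel_fillrate_mpix(pipelines=8, core_clock_mhz=380)  # 9800 Pro (assumed clock)

# Same shader power per pipe per clock, but half the pipes means
# roughly half the theoretical fillrate at comparable clocks.
print(rv350, r350)  # 1600 3040 (Mpixels/s)
```

    At similar clocks the per-pixel shading work is identical; only how many pixels finish per clock differs, which is Pete's point about the 9600P matching the 9800P on a single shaded pixel.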
     
  3. OpenGL guy

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    2,357
    Likes Received:
    28
    The RV350 has half the pixel shaders of the R350. That means half the fillrate and half the pixel shading power.
     
  4. Bjorn

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,775
    Likes Received:
    1
    Location:
    Luleå, Sweden
    Yep, I have been around here a long time :)

    PS Though that doesn't automatically mean that I have learned that much :) DS
     
  5. cthellis42

    cthellis42 Hoopy Frood
    Legend

    Joined:
    Jun 15, 2003
    Messages:
    5,890
    Likes Received:
    33
    Location:
    Out of my gourd
    You quoted exactly my chip, too. :cry: :cry: :cry:

    Must... upgrade... soon...!
     
  6. rwolf

    rwolf Rock Star
    Regular

    Joined:
    Oct 25, 2002
    Messages:
    968
    Likes Received:
    54
    Location:
    Canada
    So the NV3X has been out since January; why does Nvidia still have to code new drivers to make games work with reasonable performance?
     
  7. Rugor

    Newcomer

    Joined:
    May 27, 2003
    Messages:
    221
    Likes Received:
    0
    Because they guessed wrong.
     
  8. xGL

    xGL
    Newcomer

    Joined:
    Sep 6, 2003
    Messages:
    144
    Likes Received:
    0
    No, that isn't it.
    They don't even have a DX8 integrated solution (no pixel shaders); I'm not even sure they have a chipset that fully supports DX7...
     
  9. NapalmV

    Newcomer

    Joined:
    May 20, 2003
    Messages:
    7
    Likes Received:
    0
  10. L_i_n_k

    Newcomer

    Joined:
    Feb 26, 2003
    Messages:
    56
    Likes Received:
    0
    Location:
    Finland
    more info

    Has anyone read this?

    continues.... http://www.3dcenter.de/artikel/cinefx/index_e.php

    Opinions on it, with reference to the topic?
     
  11. Sharkfood

    Regular

    Joined:
    Feb 7, 2002
    Messages:
    702
    Likes Received:
    11
    Location:
    Bay Area, California
    I've been purposely trying to stay out of this whole debacle/farce as best I can, but the public response has been nothing short of overwhelming.

    I'm a little skeptical of all that has been said, mainly because, if we use history as a guide, the single most important points of any benchmark are generally the things the scores do NOT show. Graphs of numbers are, for all intents and purposes, useless information if there is no comprehensive analysis, or even a little research by third-party/unbiased individuals to try to explain the behavior being seen.

    In this case, given the already-explored limitations of the NV3X architecture, lower PS2.0 performance was fairly expected, but I don't think anyone expected it to be as bad as what we are seeing. Alleged rumors of screen-capture tampering and shader cheats muddle NVIDIA's standing even further. All in all, it's making hammering the NV3x a fairly easy, almost "stylish" thing to do.

    I've run both a 5900U and a 9700 Pro side by side, and yes--there is a pretty huge disparity between the two products, but not to the degree being seen with HL2. Especially given the changes in ShaderMark scores from the FX12->FP16 enhancements since the NV30 and whatnot.

    I think NVIDIA would be 100x better off if they just left the hardware alone and stopped with the alleged application/screen-capture hacks (if these pan out to be reproducible, verifiable findings) so that some real research can be done. I also hope Beyond3D, in its usual style, will do some research to help put a finger on the behavior being seen, through insight into the hardware and analysis of the final rendered images. We haven't seen any of this from the sites yet... just blind benchmark numbers posted with no exploration to find "why" or quantify the issues.

    Right now, we don't have anything tangible or substantial... it's a bit disappointing to see the same patterns repeat from years of old, just this time with the IHVs' names swapped. I thought 3D enthusiasts had grown a bit more skeptical/insightful, but unfortunately the same old habits have resurfaced: cheerleading graph figures that one cannot even verify, test, explore, or research on one's own.
     
  12. jvd

    jvd
    Banned

    Joined:
    Feb 13, 2002
    Messages:
    12,724
    Likes Received:
    9
    Location:
    new jersey
    Sharkfood - not for nothing, but this isn't the ATI Quack issue. This is just another lie in a year full of huge lies. It all started with the NV30 and its taping out. Then horrible delays, insane cooling fans on the high-end products, a very limited run of the high-end cards. Also see the 3DMark fiasco. Now we come to Half-Life 2, which is a real DX9 game. The NV30 does not do well on it with current drivers, even with Valve tweaking for the card. Then a statement from Valve basically saying the 50 Dets are garbage--same as what happened with 3DMark.

    So once again, this isn't one thing we can give them the benefit of the doubt on. This is just one more reason to be wary of them.
     
  13. Tahir2

    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    2,978
    Likes Received:
    86
    Location:
    Earth
    I think that was very well put jvd.

    I get a jaded, run-down feeling whenever I hear about NVIDIA, their drivers, and their PR. It is almost always something negative, when not so long ago NVIDIA drivers were the 'gold standard.'

    How things have changed in a relatively short amount of time.

    Word to the wise: it takes a lot of effort and time to gain trust. NVIDIA should be focused on winning that trust back, not on trying to make their cards look faster than they are. It is too late for that, NVIDIA. Wake up and do the right thing. Just execute, and you can redeem yourselves slowly. I want the old NVIDIA of the TNT days through to the GF3 days. I don't like the post-3dfx NVIDIA.
     
  14. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,987
    Likes Received:
    3,529
    Location:
    Winfield, IN USA
    Look at the bright side: all these people who are blindly loyal to nVidia in the face of all the evidence are going to get exactly what they deserve. 8)

    (Hey, it's my happy thought when the fanboys start swarming with nonsense lately. ;) )
     
  15. BRiT

    BRiT (>• •)>⌐■-■ (⌐■-■)
    Moderator Legend Alpha

    Joined:
    Feb 7, 2002
    Messages:
    20,502
    Likes Received:
    24,397
    What makes you think today's Nvidia is any different from yesteryear's Nvidia? The only difference I see is yesteryear's Nvidia was never caught...
     
  16. WaltC

    Veteran

    Joined:
    Jul 22, 2002
    Messages:
    2,710
    Likes Received:
    8
    Location:
    BelleVue Sanatorium, Billary, NY. Patient privile
    Agree completely. The difference now is that nVidia has some real competition--and they're just falling apart in the face of it. This is what happens to companies used to having it their way for a couple of years--when the competition rears its head with better products it takes them a long time to understand they can't weasel their way out of it with PR gimmicks, driver tricks, and gross misrepresentation.

    I've seen this happen time and again in this industry--companies get fat and start feeling entitled to the "number 1" spot, and shortly thereafter they get creamed, never even having seen it coming. A lot of it has to do with the milking syndrome, which is tightly coupled with the entitlement syndrome--they become legends in their own minds to the degree that they are no longer competitive. The amazing thing is that these companies often can't see it until their bad habits become so deeply ingrained that the corporation simply can't change, and they become casualties of the technology wars.

    The thing is, nVidia didn't have to be so obvious with 3dfx, because 3dfx was its own worst enemy--basically all nVidia had to do was sit back and watch them destroy themselves through overreaching, overexpansion, and overspending. ATi is a competitor of a different breed altogether, so I imagine nVidia is beginning to seriously flounder at this point. What has nVidia *really* got in the face of this kind of competition? I don't know, but we'll certainly find out within the next six months. nVidia's going to have its mettle tested--in the fire. I'm certain the tactics they used against 3dfx will get them nowhere in this race. If their products aren't number 1, then neither will they be. And for once I say it's about time companies stood or fell on the products they make--and to hell with PR.
     
  17. Sharkfood

    Regular

    Joined:
    Feb 7, 2002
    Messages:
    702
    Likes Received:
    11
    Location:
    Bay Area, California
    I agree 100%; that's the main message behind my post.

    The problem is that those who recognized the issues, and were smart enough to demand genuine factual information rather than blindly follow numbers and graphs, now seem to be using a different measuring stick for what is truth versus fiction.

    The single most important points to be discovered are generally what you DON'T see in the bars and graphs. It's interesting that some read this as another "Quack" issue when no such thing was mentioned. The only point being: without some solid, conclusive analysis, what has been provided is still mostly useless and meaningless until peer review and some research/logical progression of findings can be supplied.

    I've always looked to Beyond3D and its forum folks to open up this channel. It's one of the best parts of this userbase and the site's staff. I just think it's a little early to be popping any corks for any IHV, even though the end result will likely be what is being illustrated--it's just not awardable until some more delving has been done. :)
     
  18. cthellis42

    cthellis42 Hoopy Frood
    Legend

    Joined:
    Jun 15, 2003
    Messages:
    5,890
    Likes Received:
    33
    Location:
    Out of my gourd
    I think people were assuming a reference on your part with this line in your last post: "Right now, we don't have anything tangible or substantial... it's a bit disappointing to see the same patterns repeat from years of old, just this time the IHVs have swapped names." If you were referring to something else entirely, you probably just had to be less ambiguous in your wording. The "Quack" issue gets brought up a lot.

    At any rate, there really IS a lot of information out there regarding some of the fundamental "why"s. (For instance, this thing here.) The problem is that most reviewers don't have the tools, the access, the contacts able to part with all the details they need, or the equipment to get REALLY into the nitty-gritty, so they basically have to go by posted numbers, use official benches/reports, run some of their own tests, and then play out some conjecture. Devs are hardly ever forthright about the entire process they go through for one chipset or another, and the IHVs themselves tend to keep specifics to themselves as well, so it's hard to say ANYTHING "for sure"--not to mention the amazing amount of time and effort that has to be put into these kinds of studies.

    But really, all one NEEDS to do is look around more. There's plenty out there to absorb. ^_^
     
  19. incandescent

    Newcomer

    Joined:
    Nov 19, 2002
    Messages:
    15
    Likes Received:
    0
    Eh? Sure is a lot of fuss over something so trivial. All NVIDIA has to do is release the NV40 on time, and such that it is the undisputed performance leader. Things changed in an instant for ATi when the 9700 launched--and they can change just as quickly for NVIDIA.
     
  20. Pete

    Pete Moderate Nuisance
    Moderator Legend

    Joined:
    Feb 7, 2002
    Messages:
    5,777
    Likes Received:
    1,814
    Just to be clear: pixel shaders (meaning the hardware) aren't communal like vertex shaders, right? AFAIK, one pixel shader works on one pixel to be shaded--you can't use two pixel shaders to halve the computation time on a single pixel, because of the exclusive nature of the pipelines and thus the pixel shaders, correct?

    I left out "half the pixel shading power" b/c I thought it was obvious from my previous statement that pixel shaders are "one" per pipeline.

    I'm also assuming nV's pixel shaders are pipeline-exclusive, so that a 5600, with two multi-texture pipelines, has "two" shaders. Am I right? Or are pixel shaders on the FX line really a "sea", rather than parallel rivers like with ATi?

    (Perhaps I should read that 3DCenter article more closely....)
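
    [Editor's note] Pete's question above can be made concrete with a toy model: extra pixel pipelines raise throughput (pixels finished per clock), but, if a pixel cannot be split across shaders, they do not shorten the time any single pixel spends in its shader. All numbers below are illustrative, not measured figures for any real chip.

```python
import math

def clocks_to_shade(n_pixels: int, pipelines: int, shader_clocks: int) -> int:
    """Each pipeline shades its own pixel; one pixel's work can't be split.
    Total time = number of pipeline-wide batches * clocks per pixel."""
    batches = math.ceil(n_pixels / pipelines)
    return batches * shader_clocks

# A single pixel takes the same time whether there are 4 or 8 pipelines:
assert clocks_to_shade(1, 4, 20) == clocks_to_shade(1, 8, 20) == 20

# But a screenful of pixels finishes in half the clocks with twice the pipes:
print(clocks_to_shade(1024, 4, 20), clocks_to_shade(1024, 8, 20))  # 5120 2560
```

    In other words, under the pipeline-exclusive assumption, doubling the pipes halves total frame shading time without ever making any individual pixel's shader run faster--which is consistent with the "half the pipelines, half the shading power" framing earlier in the thread.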
     