X850 XT PE > 7800 GT in F.E.A.R.

Discussion in '3D Hardware, Software & Output Devices' started by Matas, Feb 20, 2006.

Thread Status:
Not open for further replies.
  1. karlotta

    karlotta pifft
    Veteran

    Joined:
    Jun 7, 2003
    Messages:
    1,292
    Likes Received:
    10
    Location:
    oregon
    Dude! First off, it told me to buy a 9600 AIW to get my Valve games! And with that I got just about everything Valve made, on Steam, before HL2 came out. HL2 was delayed for a year... The issue way back then was DX9 on the FX line. It was "fixed" on release, but only with a setting change in the console that dropped FX cards from the DX9 path down to DX8, etc. The point being, the 5200 through the 5700 non-Ultra just plain sucked at DX9. And we may see the same with X8xx cards in SM3-only games...
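    For context, the DX-path switch referred to above was exposed in Source engine builds of that era as both a launch option and a console variable; the exact spellings below are from those builds as I recall them, so treat them as an assumption rather than gospel:

    ```
    # Launch options on the hl2.exe command line:
    hl2.exe -dxlevel 81    # force the DirectX 8.1 path (Valve's default for GeForce FX)
    hl2.exe -dxlevel 90    # force the full DirectX 9 path

    # In-game console equivalent:
    #   mat_dxlevel 81
    #   mat_dxlevel 90
    ```

    Forcing the DX9 path on FX cards is exactly what exposed the poor full-precision SM2 performance discussed in this thread.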
     
  2. Matas

    Newcomer

    Joined:
    Feb 15, 2006
    Messages:
    159
    Likes Received:
    0
     
  3. HellasVagabond

    Newcomer

    Joined:
    Feb 19, 2006
    Messages:
    83
    Likes Received:
    0
    Location:
    Greece
    I'm just saying it's a bit fishy to see a graphics processor company advertise a game months and months prior to its release...
    And as if that weren't enough, they even announced they would bundle it with their latest GPUs, again months and months prior to its release...
    Imagine if that were to happen with Doom 3 and Nvidia... all hell would break loose, since even without either of those two things happening, people say that id favors Nvidia cards.
     
  4. DOGMA1138

    Regular

    Joined:
    Oct 8, 2004
    Messages:
    533
    Likes Received:
    0
    Location:
    IsraHELL
    id doesn't support nVIDIA cards; nVIDIA supports id games. Valve chose ATI because they had the only DX9.0 cards for quite some time. And yes, ATI paid them a lot of money and sponsored them for a long time. It's not fishy, it's just good business. FX cards suck at DX9.0; none of them can run decent DX9.0 code at full precision. That's not Valve's fault, and neither is the fact that every ATI card since the R300 has had excellent DX9 performance, which was proven by a lot more games than just HL2. Anyway, going back to HL2 performance, the R4xx and R5xx don't have much of a lead over their green counterparts. That was not due to magic patches, but due to the fact that NV4x and later cards have good DX9.0 support. ATI cards are still presumed to have higher shader power, so that can give them the lead in shader-intensive games.
    Valve could have used ATI-only features such as 3Dc; from my understanding it could have worked well with their pre-rendered radiosity-like lighting. But they didn't. And yes, they could have done HDR with FP16 blending that would run only on SM3.0 hardware, but since they wanted to make their engine more popular, they had no reason to limit such an "appealing" feature to the limited market of cards that supported it at the time.
     
  5. HellasVagabond

    Newcomer

    Joined:
    Feb 19, 2006
    Messages:
    83
    Likes Received:
    0
    Location:
    Greece
    I don't think I'm the only one who condemned Valve for favoring ATI cards at the time.
    All I remember from when HL2 was released is complaints from people with Nvidia cards, forcing Valve to release two patches before the game was playable on Nvidia hardware. Couldn't they have done that in the original release? Did they have to wait for all hell to break loose before fixing it? Of course they could have, but as you said, ATI sponsored the game.
    The bottom line is that what Valve did wasn't right, since it gave the advantage to ATI (for at least a month prior to the second patch release), so excuse me if I don't say bravo to Valve.
     
  6. Moloch

    Moloch God of Wicked Games
    Veteran

    Joined:
    Jun 20, 2002
    Messages:
    2,981
    Likes Received:
    72
    Valve gave the advantage to cards that actually had decent SM2 performance :roll:
    That includes the GF6...
     
  7. Blazkowicz

    Legend

    Joined:
    Dec 24, 2004
    Messages:
    5,607
    Likes Received:
    256
    The "id is paid by Nvidia" thing is overplayed IMO. If Nvidia cards run Doom 3 great, it's because of their superior OpenGL driver and their Z/stencil rate, thanks to double-pumped ROPs (present since NV3x).
    And it runs fast on NV3x in FP32 because the main shader (covering all surfaces, if I understand it correctly) is simple enough that a version of it runs on the GeForce 1's register combiners.
     
  8. Blacklash

    Newcomer

    Joined:
    Feb 26, 2004
    Messages:
    219
    Likes Received:
    3
    Is nVidia superior in OpenGL because they own the rights to certain OpenGL extensions and ATI won't pay for their use? Someone correct me on that, or confirm it.
     
  9. Subtlesnake

    Regular

    Joined:
    Mar 18, 2005
    Messages:
    347
    Likes Received:
    126
    What patches? I'm convinced nothing like this happened, and I've looked through the update logs and don't see any references to performance increases on Nvidia cards.

    Since Valve started treating the FX line as DirectX 8 cards, performance has been relatively strong. I can't see why it would be otherwise.

    But, here's a benchmark run 2 days after the game was released:

    http://www.xbitlabs.com/articles/video/display/half-life_6.html

    I don't see any Nvidia cards that are incapable of running the game at an acceptable framerate.
     
  10. DOGMA1138

    Regular

    Joined:
    Oct 8, 2004
    Messages:
    533
    Likes Received:
    0
    Location:
    IsraHELL
    Not really, since no one, or almost no one, actually uses them; most OpenGL games now use ARB extensions only. nVIDIA's lead in OpenGL is mostly down to driver advantages, and ATI has closed that gap, or is coming along rather nicely toward closing it. There aren't that many OpenGL games these days, and NV doesn't have a clear lead in general OpenGL apps any more IMO.

    Their superior performance in Doom 3 didn't have much to do with it being an OpenGL game. Their cards since the FX line were tailor-made for Doom 3 and similar games (mostly ultra-stencil-shadow-intensive ones), and ATI's high fill rates and superior shader power didn't help them much there. Doom 3 isn't known for its use of shaders; it has one master shader running all the time that is responsible for most, if not all, of the in-game effects. If I'm not mistaken, its per-pixel lighting, which is done in that shader, is what gives all the Doom 3 based games that (awful, IMO) plastic-wrap look.

    nVidia could still have better OpenGL drivers, but ATI has come a long way toward perfecting (as much as anything can be perfect in this field) theirs too. Still, with almost no OpenGL games out there, and with Vista it seems like there will be even fewer, if any, since it appears OpenGL will be completely emulated within D3D10, at least at launch; I haven't seen anything to confirm they'll ship Vista with an OpenGL ICD capable of running their GUI.
     
  11. Moloch

    Moloch God of Wicked Games
    Veteran

    Joined:
    Jun 20, 2002
    Messages:
    2,981
    Likes Received:
    72
    ATI has done double-pumped Z with FSAA since the 9700...
     
  12. ANova

    Veteran

    Joined:
    Apr 4, 2004
    Messages:
    2,226
    Likes Received:
    10
    The only updates that came out soon after the game was released were a fix for the stuttering issue a lot of people were having, and some crash fixes.

    All GeForce FX cards ran the DX8 path by default because they could not run DX9 at a playable framerate. There never were any special optimizations for ATI cards; Valve preferred ATI at the time, and ATI in turn paid them to bundle the game with their cards for marketing purposes. Maybe you should be concentrating on the other 90% of games that are TWIMTBP-endorsed.
     
  13. DOGMA1138

    Regular

    Joined:
    Oct 8, 2004
    Messages:
    533
    Likes Received:
    0
    Location:
    IsraHELL
    And HL2 wasn't even an "official" GITG game; no evil logo or anything =)
    It was just a general ATI promotion. nVIDIA has been paying publishers and developers for years.
     
  14. Blacklash

    Newcomer

    Joined:
    Feb 26, 2004
    Messages:
    219
    Likes Received:
    3
    Ah, OK. Well, I've had quite a few ATI cards and never had a problem running OpenGL; I was just curious about that rumor. I have an X1900 XTX now and love it. It was a nice kick up in performance from my AIW X1800 XL.
     
    #34 Blacklash, Feb 21, 2006
    Last edited by a moderator: Feb 21, 2006
  15. XMAN26

    Banned

    Joined:
    Feb 17, 2003
    Messages:
    702
    Likes Received:
    1
    No longer a clear lead in OpenGL apps?! The fact that ATI gets handed its ass in CoR or IL means nothing? Or that a card with 25% less bandwidth still wins, or is only 1-2% slower at ultra-high settings (7800 GTX 256 vs. X1800 XT)? ATI has OpenGL driver issues; they are simply masking the weakness with memory controllers and optimizations. Get real, man: ATI needs to fix their OpenGL drivers. With the bandwidth they have, they should be crushing NV cards; the fact that they don't tells the story.
     
  16. XMAN26

    Banned

    Joined:
    Feb 17, 2003
    Messages:
    702
    Likes Received:
    1
    The problem isn't that ATI can't run OpenGL-based games/apps; they can, and do so fairly well. Their problem is that they are lagging behind NV in OpenGL support and overall speed. So much so that, to make people think they are doing something with their drivers (which they haven't yet), they lean on memory controllers and optimizations. Truth be told, if they WOULD FIX THEIR OGL DRIVERS, THEY WOULD BEAT NV IN BOTH D3D AND OGL!
     
  17. dizietsma

    Banned

    Joined:
    Mar 1, 2004
    Messages:
    1,172
    Likes Received:
    13
  18. Moloch

    Moloch God of Wicked Games
    Veteran

    Joined:
    Jun 20, 2002
    Messages:
    2,981
    Likes Received:
    72
  19. ANova

    Veteran

    Joined:
    Apr 4, 2004
    Messages:
    2,226
    Likes Received:
    10
  20. Moloch

    Moloch God of Wicked Games
    Veteran

    Joined:
    Jun 20, 2002
    Messages:
    2,981
    Likes Received:
    72