RAGE: That's actually what you do when trying to get the PC version to work

Discussion in 'PC Gaming' started by Farid, Oct 6, 2011.

  1. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,987
    Likes Received:
    3,529
    Location:
    Winfield, IN USA
    Yeah, it fixed a whole lot of the flickering NPC problems for me; but I did see one last night and it depressed me. :(

    My son's rig running win 7 32 has been having awful crashing nightmares with it, and we got almost identical rigs. :???:
     
  2. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,176
    Location:
    La-la land
Steam just dropped a ~150MB patch onto my system, adding some additional graphics adjustments including aniso settings, video memory usage and vsync, and what seems like a whole bunch of driver bug workarounds.

Ridiculous that a game can be in development for six fucking years and still need so much propping-up at launch; after all, it didn't exactly fall out of the sky all of a sudden... Then again, at least something's being done about this whole mess. I still haven't booted it up because I'm rather ticked off AMD isn't supporting crossfire for Rage (the game's so wonky it requires special tweaks for that to work, too? :???:), but maybe I'll have a look tomorrow.

    My rig should be able to run this game fine even with just one GPU methinks.
     
  3. Ike Turner

    Veteran

    Joined:
    Jul 30, 2005
    Messages:
    2,110
    Likes Received:
    2,304
First Rage patch from id coming soon, and more drivers from AMD too, according to Todd Hollenshead:

Kinda weird / sad to see the friggin' id CEO tweeting this (and replying to tweets about the issues, and trying to help). Oh well...

    Edit: Grall beat me to it.. looks like the patch is now out on Steam..

    Full Patch changelog:

    As we can see there are problems on both sides (NVIDIA and AMD).
     
    #143 Ike Turner, Oct 9, 2011
    Last edited by a moderator: Oct 9, 2011
  4. Colourless

    Colourless Monochrome wench
    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,274
    Likes Received:
    30
    Location:
    Somewhere in outback South Australia
Well, that is a screwup on both ends. id is attempting to use an extension that isn't listed as supported, and the drivers are not failing gracefully when that unsupported extension gets used.
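A minimal sketch of the "graceful" side of this, assuming a classic GL 2.x-style space-separated extension string (the function name and strings here are illustrative, not id's or the drivers' actual code). Note that a naive strstr() is exactly how apps get this wrong: it false-positives on prefixes, e.g. finding GL_EXT_swap_control inside GL_EXT_swap_control_tear.

```c
#include <string.h>

/* Return 1 if `name` appears as a whole space-delimited token in
 * `ext_list` (e.g. the result of glGetString(GL_EXTENSIONS)).
 * A plain strstr() match is not enough: the hit must start at the
 * beginning or after a space, and end at a space or the terminator. */
static int has_extension(const char *ext_list, const char *name)
{
    size_t len = strlen(name);
    const char *p = ext_list;

    while ((p = strstr(p, name)) != NULL) {
        int starts_ok = (p == ext_list) || (p[-1] == ' ');
        int ends_ok   = (p[len] == ' ') || (p[len] == '\0');
        if (starts_ok && ends_ok)
            return 1;   /* exact token match: safe to use the extension */
        p += len;       /* prefix of a longer name; keep scanning */
    }
    return 0;           /* not advertised: fall back, don't call into it */
}
```

The point Colourless makes cuts both ways: the app should gate its fancy path on a check like this, and the driver should reject (not crash on) entry points it never advertised.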
     
  5. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    9,044
    Likes Received:
    1,116
    Location:
    WI, USA
    "Using an AMD graphics card you may" .......... rage? ;)
     
  6. Farid

    Farid Artist formely known as Vysez
    Veteran Subscriber

    Joined:
    Mar 22, 2004
    Messages:
    3,844
    Likes Received:
    108
    Location:
    Paris, France
What I don't understand is why id thought it would be a good idea to rely on specific drivers, full of optimizations, and on as-yet unimplemented functionality (swap-tear)?

I mean, I understand that JC and co used to work that way back in the day, when their engines were incredibly relevant to the discrete graphics vendors. But this is 2011: the last major game id released was in 2004, id Tech 5 will not be licensed to third parties, Rage will not move crowds (if it's a million seller on PC, when all is said and done after a year and a few sales, it will be impressive), and the game engine has the ancient X360/PS3 consoles as its baseline. So what's with the need for bleeding-edge drivers?

We all know that JC loves new technology and smart optimizations and thus wanted to incorporate some of that in Rage. But why rely on these drivers only? Couldn't they include a compatible path, to which the game would degrade if the more elaborate path is not available or, as we see here, broken?
     
  7. willardjuice

    willardjuice super willyjuice
    Moderator Veteran Alpha

    Joined:
    May 14, 2005
    Messages:
    1,386
    Likes Received:
    299
    Location:
    NY
    A bit off-topic, but rage is a decent example of why I prefer an "all-or-nothing" api over an api that allows caps/extensions.
     
  8. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    9,044
    Likes Received:
    1,116
    Location:
    WI, USA
It seems to me that because they made a game engine that doesn't work like any other, the video cards have been asked to do new things, and to do things differently than usual. They did use some bleeding-edge technology, and I don't think it's wrong to ask NV and AMD to make it work right. It was the same with Doom 3, actually. I remember JC warning that Doom 3 would use parts of video cards that hadn't been utilized before and would expose flaws and instabilities unlike other games.

The facts are that NVIDIA came through fine and it's working fine on the consoles. You don't absolutely need bleeding-edge NV drivers for it to run OK, as Sxotty has said. I believe that one party in particular really killed the perceived quality here. Maybe it was too much to ask AMD to get their OpenGL into shape for a single game. For all we know there have been AMD engineers banging their heads against the wall for months over it. ;)
     
  9. Sxotty

    Legend

    Joined:
    Dec 11, 2002
    Messages:
    5,496
    Likes Received:
    866
    Location:
    PA USA
Really, that's what you got from reading that? There are always some problems with any software on some PC configurations. The severity, probability, and other factors are pretty important things to keep in mind.

    I do agree with Farid that id should have realized the drivers would not actually be able to do what they were supposed to, but not that they would magically know AMD would put up the wrong driver.
I completely disagree with the idea that they should not include OpenGL options that are not yet implemented. That allows a better option once Nvidia and AMD get their act together. It seems silly to leave such things out just because they do not yet work; however, obviously something was screwed up if it blew everything up and crashed the game. I personally was very pleased with how stable the game has been for me. I can leave it running for hours and hours and have yet to have any sort of crash. Even the weird hang I got at first, because I had 3 keyboards, 3 mice, and 2 computers all hooked together by accident, did not crash it. It simply got a bit confused by the input stream.

    I just hope they did not release a buggy patch since the game has been nothing but perfect for me. It would suck if Steam decided I needed an update that ruined everything :)

    BTW from their forums
    :lol:
Of course there is always someone with a problem. Maybe he broke his HDD in between, or some random thing, but I was honestly a bit worried since I am sure they rushed it pretty hard. Hope all this mess doesn't mean Doom 4 won't bother trying to push PCs, since half the PCs don't like being pushed in OpenGL anyway.
     
    #149 Sxotty, Oct 9, 2011
    Last edited by a moderator: Oct 9, 2011
  10. Grall

    Grall Invisible Member
    Legend

    Joined:
    Apr 14, 2002
    Messages:
    10,801
    Likes Received:
    2,176
    Location:
    La-la land
    Doom4 is going to be OGL also? :shock:

What the funk! id doesn't even release Mac versions of their games anymore, so why are they sticking with that ancient, outdated API? Literally NOBODY else who isn't also into Mac development does, and the drivers show the result of this trend, of course.

Carmack needs to get with the times; even he has said D3D is superior these days, so start using it already, god dammit.
     
  11. Richard

    Richard Mord's imaginary friend
    Veteran

    Joined:
    Jan 22, 2004
    Messages:
    3,508
    Likes Received:
    40
    Location:
    PT, EU
    I sure hope so. I'm a million times more interested in the next DOOM than I ever was about Rage.

That's not what Vy is saying. He's saying you test for functionality before using it. I don't know the specifics of the new extension implementations, but IIRC it was a negative value for "smart" vsync. Before using it they should have tested the driver's response, and if the implementations didn't differentiate between a negative value and zero (quite possible), then they should have checked driver versions as a last resort.

    What you do NOT do is just assume a particular third-party implementation will be "there" for your code to use. If that was the MO of web developers we'd still be using Geocities with Netscape v17.5.

EDIT: It's quite perplexing now that I think about it some more. Carmack has recently been on a static-analysis crusade, which any half-decent non-games programmer already loves (same thing with his recent revelation about object composition instead of class inheritance - really, John? You don't say...) and yet, instead of statically checking whether it's safe to use a particular implementation, the code assumes it will be there and fails at runtime.
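The test-before-use approach Richard describes can be sketched as pure decision logic (illustrative names, not id's actual code): only request the tearing interval of -1 when the tear-control extension is actually advertised (WGL_EXT_swap_control_tear on Windows, GLX_EXT_swap_control_tear on X11), and fall back to plain vsync otherwise.

```c
/* Pick the interval to hand to wglSwapIntervalEXT()/glXSwapIntervalEXT().
 * -1 = adaptive ("smart") vsync: swap at vblank when on time, tear when late.
 *      Only valid when the *_swap_control_tear extension is advertised.
 *  1 = classic vsync.
 *  0 = vsync off.
 * Illustrative sketch; the tear_ext_supported flag would come from an
 * extension-string check done at context creation. */
static int choose_swap_interval(int want_vsync, int tear_ext_supported)
{
    if (!want_vsync)
        return 0;               /* user disabled vsync entirely */
    if (tear_ext_supported)
        return -1;              /* adaptive vsync is safe to request */
    return 1;                   /* graceful fallback: classic vsync */
}
```

The whole complaint in this thread reduces to that middle branch: if the driver lies about (or the app never checks) tear_ext_supported, the -1 request lands on code that was never there.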
     
  12. fellix

    Veteran

    Joined:
    Dec 4, 2004
    Messages:
    3,552
    Likes Received:
    514
    Location:
    Varna, Bulgaria
  13. Sxotty

    Legend

    Joined:
    Dec 11, 2002
    Messages:
    5,496
    Likes Received:
    866
    Location:
    PA USA
BTW I was using the WHQL drivers from August 8th the entire time and they worked completely perfectly. The reason I noticed is Steam just did what you guys wanted and told me to update. So when I started the game I had the prior WHQL drivers from June 1st and they ran perfectly; then I got the WHQL drivers from August 8th and they ran perfectly. I am sorry, but to me that just says their OpenGL was working far better. Hopefully the beta ones work too :)

    You think they are going to write an entirely new game engine? The days of writing a new game engine for each game are firmly over. Maybe after the next doom they will transition.

@ Richard, we really don't know what happened. Maybe at first it was written to query whether swap-tear was there, and AMD said no, and everything worked fine (as it does across the fence). I don't see it as clear that they were seeing the same behavior we see. If so, they should have just released the game with a disclaimer saying it would not work for crap on AMD (graphics) setups :)
     
    #153 Sxotty, Oct 9, 2011
    Last edited by a moderator: Oct 9, 2011
  14. sebbbi

    Veteran

    Joined:
    Nov 14, 2007
    Messages:
    2,924
    Likes Received:
    5,296
    Location:
    Helsinki, Finland
Soft vsync has been supported by the console hardware for ages. It's a really good feature, since it allows you to have zero tearing when the engine is able to reach the target frame rate (60/30), but it immediately disables vsync when the frame time is slightly more than allocated, instead of dropping fps directly to 30/15. It's basically the best of both worlds: the best frame rate possible plus minimal tearing. I am grateful that JC has so much influence that we finally have this feature on PC as well.
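The behaviour sebbbi describes can be reduced to a per-frame decision (a deliberate simplification: in reality the swap hardware/driver makes this call, and the 16.7 ms budget below assumes a 60 Hz target):

```c
/* Soft/adaptive vsync, conceptually: a frame that fits the budget waits
 * for vblank and presents cleanly; a frame that runs slightly long
 * presents immediately (with a small tear) instead of stalling a whole
 * refresh, which is what drops a vsynced 60 fps straight to 30. */
static int swap_waits_for_vblank(double frame_ms, double budget_ms)
{
    return frame_ms <= budget_ms;   /* 1 = wait (no tear), 0 = tear now */
}
```

So a 14 ms frame at 60 Hz presents tear-free, while an 18 ms frame tears once rather than costing a full extra 16.7 ms of latency.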

As a game developer, I have to agree 100% with you. I still remember times when I had to code vastly separate DX9 paths for Intel, ATI and Nvidia cards. Intel didn't support hardware vertex shaders, their SW vertex pipe only supported 32 bit float inputs (no 16 bit floats or integers), and it didn't support any kind of vertex texture fetch. ATI had a render-to-vertex-buffer API hack, and Nvidia supported vertex texture sampling, but their DX9 drivers only exposed 5 different supported formats (basically just 8888 and float formats, no compressed formats, etc.), though you could still use them (the hardware officially supported those only on DX10). MRT support was also missing from Intel chips, and there were lots of texture and vertex formats missing across all of them (and lots of vendor-specific hacks, for example for MSAA resolve and depth buffer reading). Basically you had to convert all buffers (vertices and textures) at load time to the hardware format, because every piece of hardware supported a slightly different set of caps.
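The convert-at-load-time strategy sebbbi describes can be sketched as caps-driven fallback logic (a hypothetical stand-in for a D3D9 CheckDeviceFormat() sweep at startup; the struct, names, and format list here are illustrative, not any real engine's code):

```c
#include <stddef.h>

/* Simplified caps table: which texture formats this device claims to
 * support, filled in once at startup by querying the API. Illustrative. */
struct device_caps {
    int supports_r32f;      /* 32-bit float format */
    int supports_a8r8g8b8;  /* plain 8888 uncompressed */
    int supports_dxt5;      /* block-compressed */
};

/* Choose the format to bake assets into at load time: prefer the compact
 * compressed format, fall back to uncompressed, fail cleanly otherwise. */
static const char *pick_texture_format(const struct device_caps *caps)
{
    if (caps->supports_dxt5)
        return "DXT5";        /* least memory and bandwidth */
    if (caps->supports_a8r8g8b8)
        return "A8R8G8B8";    /* safe uncompressed fallback */
    if (caps->supports_r32f)
        return "R32F";        /* last resort */
    return NULL;              /* nothing usable: report, don't crash */
}
```

One such decision per resource type, per vendor, is exactly the combinatorial mess the fixed DX10/11 feature levels later removed.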

Things definitely improved as DX9 got more mature and most hardware supported the same set of caps. DX10/DX11 finally solved this issue completely, and now we can really code just a single PC path and it works on all Nvidia/AMD/Intel hardware. We have had no problems with virtual texturing on DX10/11. It has worked without problems on all the Nvidia/AMD/Intel hardware we have tested it on (no blue screens, no driver crashes, and no textures getting corrupted).

Recently I realized that the situation in the mobile space with OpenGL ES 2.0 seems to be even worse than in the early DX9 days. MRT rendering is supported by most hardware, but ES 2.0 doesn't have official support for it and only Nvidia hardware has extensions for it (so MRT rendering on other hardware cannot be used). This means deferred rendering is not possible to do (efficiently) on ES 2.0. Virtual texturing gives even more headaches. Texture uploading and data downloading from the GPU are really slow... there's no way to just upload/download raw texture data to the GPU. The driver transforms it (on the CPU) to a hardware/software format (tiling/untiling & format conversion). Partial updates of compressed texture data are also not supported, nor is it possible to "cast" a compressed format as an integer format and do GPU texture compression (an ETC compression shader fits easily within the pixel shader instruction limits). Texture format support is even more limited. You even need an extension for 8888 formats. Some hardware supports float formats (and depth buffers) as textures, but not as render targets, and cannot filter them. And vertex format support for 16 bit floats and 10 bit formats requires extensions. There are lots of phones with exactly the same graphics hardware but vastly different sets of supported extensions. I am talking about recent high-end phones (released in 2011), not some old phones. The phone/tablet manufacturers are claiming console-level image quality for next year's products, but (standard) OpenGL ES 2.0 isn't going to bring that. Too many features are just missing (or partially supported). Nvidia has the best extension support (like it used to have in PC OpenGL), but not many developers are willing to release games that only support Tegra chips. OpenGL ES 2.0 is soon five years old. That's an eternity in mobile hardware (every year a 2-5x perf boost, plus many new generations of hardware).

Maybe id Software wanted to keep their options open (port it to Mac maybe in the future) and that's why they chose OpenGL, or maybe it's because they have used OpenGL more in their past PC games and were more familiar with it. I don't really understand...
     
  15. Ike Turner

    Veteran

    Joined:
    Jul 30, 2005
    Messages:
    2,110
    Likes Received:
    2,304
It will be interesting to see what Microsoft does with Windows Phone 7/8. As of right now it uses a new version of Direct3D Mobile (with a DX9.x feature set; IIRC it also supports Direct2D), but almost nothing is exposed to third-party developers, who can only use the predefined set of shaders that MS has created with XNA 4.x (BasicEffect, SkinnedEffect, EnvironmentMapEffect, DualTextureEffect, AlphaTestEffect). Still no word on when HLSL support will be enabled.
     
  16. Davros

    Legend

    Joined:
    Jun 7, 2004
    Messages:
    17,879
    Likes Received:
    5,330
What's Rage using on the 360, OGL?
     
  17. Richard

    Richard Mord's imaginary friend
    Veteran

    Joined:
    Jan 22, 2004
    Messages:
    3,508
    Likes Received:
    40
    Location:
    PT, EU
    They don't need to. They've said the renderer is pretty much API agnostic because it has to run on PC/PS3/XBOX. In fact JC himself said the only reason they stuck with OGL for Rage was inertia. The effort would not be inconsequential obviously, but it would likely pay off in the end.

I agree! As I've posted like 4 times now, they should have a message box saying "your drivers are too old" on game start, like BF3. At the very least, it would have made it clearer on whose forum people should go and bitch. ;)

    I should also repeat what I've said before: ATI's behaviour in this matter is almost criminal: to themselves, to id and to pc gaming as a whole.

    Xbox's Direct3D.
     
  18. Sxotty

    Legend

    Joined:
    Dec 11, 2002
    Messages:
    5,496
    Likes Received:
    866
    Location:
    PA USA
The update did not break the game, but I haven't tried the newer Nvidia drivers yet. I hate updating drivers when everything works perfectly. And it already does transcoding with the August 8th WHQL driver, so it was a bit of misinformation to say we needed the new ones.

Anyone think there is actually a reason I should update them? I looked in the release notes and Rage is mentioned once. It says improved compatibility and performance with Rage, then never says anything else in the 60 pages :) Does anyone know if it enables swap-tear? My frame rate has been pretty steady at 60, though, so I don't really need it, but it might be nice.

    PS. that stupid card game is addictive and a fun way to make game money.
     
  19. Florin

    Florin Merrily dodgy
    Veteran Subscriber

    Joined:
    Aug 27, 2003
    Messages:
    1,707
    Likes Received:
    345
    Location:
    The colonies
    I hadn't tried Rage before the first round of patches, but Steam does pop up warnings like that when starting Rage, at least now. It complained about my Nvidia drivers.
     
  20. tongue_of_colicab

    Veteran

    Joined:
    Oct 7, 2004
    Messages:
    3,773
    Likes Received:
    960
    Location:
    Japan
Lol, first boot and bad framerate and tons of texture pop-up right away. I'll leave this game for what it is. It's 2011, for god's sake. A game should just work.
     