Something wrong with the HL2 Story

Re: hELLBINDER

palmerston said:
But it's all senseless talk anyway. My original thread does not argue whether R3xx may run HL2 better. That's been argued to death in other threads. My point is that it's a poor business decision by Valve to alienate the vast bulk of their target customers by optimising for a minority product.

The problem is that you are basing your evidence on data collected from a game that was available 3-4 years ago - of course there are going to be lots of people running MXs on that. However, we are now looking at a title that is vastly different, with very different requirements. Do you expect Doom3 to run well on an MX? No, of course not, so why would you expect HL2 to be any different?
 
I was browsing the halflife2.net forum and came across some interesting info from Brian Jacobson.

Somebody asked him what difference a 256 MB card versus a 128 MB card will make in HL2.

Here's the answer:

Just got this reply from Brian Jacobson clearing up the issue of 128 MB vs 256 MB graphics cards:

The game will run fine on a 128 MB card. Most of the benefits of the 256 MB cards you'll see we expect will be more long-term, received via updates over Steam:

1) Something which happens immediately, we're able to store more data on the card instead of in AGP memory, which you might think would be a perf win, but so far, we're not finding ourselves to be AGP bus-bandwidth limited.

2) We expect to release a local-specular solution, perhaps at ship, perhaps over Steam at some point, which is a major texture memory consumer. In addition, we've made it easy for major HW vendors to write new shaders for us to take advantage of the extra memory.

3) A *lot* of memory is consumed by normal maps, since they can't be compressed well. We may ship uncompressed normal maps for the 256 MB cards if it turns out to be a big enough visual improvement. Future updates (not to mention mods) will contain high-end content that uses ever-larger amounts of normal maps.
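As a rough aside on point 3, here's a quick back-of-the-envelope sketch (my own illustrative numbers, nothing from Valve) of why uncompressed normal maps chew through texture memory so quickly:

Code:
# Rough, hypothetical numbers -- not Valve's actual texture budget.
# An uncompressed RGBA8 normal map costs 4 bytes per texel; DXT5-style
# block compression averages 1 byte per texel (a 4x4 block in 16 bytes).
def texture_mb(width, height, bytes_per_texel, mipmaps=True):
    """Approximate texture size in MB; a full mip chain adds about 1/3."""
    size = width * height * bytes_per_texel
    if mipmaps:
        size = size * 4 // 3
    return size / (1024 * 1024)

for edge in (512, 1024, 2048):
    raw = texture_mb(edge, edge, 4)  # uncompressed RGBA8
    dxt = texture_mb(edge, edge, 1)  # DXT5-compressed
    print(f"{edge}x{edge} normal map: {raw:.1f} MB uncompressed "
          f"vs {dxt:.1f} MB compressed")

A handful of 2048x2048 maps at roughly 21 MB each would eat a 128 MB card very quickly, which is presumably why the uncompressed versions are being held back for 256 MB cards.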


Can someone explain to me what he means by IHVs writing new shaders to take advantage of the extra memory? It sounds interesting. Could it mean that IHVs will be able to write better-looking shaders for HL2 if they like?

 
Re: hELLBINDER

palmerston said:
My point is that it's a poor business decision by Valve to alienate the vast bulk of their target customers by optimising for a minority product.

How do you know they're "alienating" the "vast bulk" of their customers? You STILL haven't proven your claim that Nvidia is the dominant DX9 chip vendor.

Btw, it's not Valve's fault that Nvidia's MSAA technique is flawed; packed textures have been used in loads of other titles, as others have indicated.

Let's also not forget that there are *some* consumers with Kyro or Matrox products - what about them too?

None of these products do MSAA, so this is not an issue for them.

Valve should think about more than just supporting ATI

Actually, they don't even support ATI, you crazy fanboy. If you took your head out of the fluffy clouds where you're currently floating about, you'd see Valve has said ATI isn't supported EITHER currently, because DX9 can't enable the kind of texture sampling needed on a selective basis!

You need to take off those selective reading glasses you put on before making your post...

Feeling sorry for the underdog is one thing; letting down most of your customers is another. Assuming, of course, that your customers - in this case Nvidia users - would like to have FSAA support.

You sure seem to do a lot of that assuming stuff, don't you? :)

What were Valve thinking of?

Well excuse them for making their game as good-looking as possible. *shakes head* Some people can never be pleased it seems...


*G*
 
Thanks, OpenGL guy, for pointing out my opinions vs. facts. I did warn people that these were my thoughts on the subject and not all facts, though. I thought the people here could do the parsing themselves, but no harm in you making it obvious. ;)

That said, I still stand by the "opinion" that a border around sub-textures would fix the problem. You are right that once in a while a pixel could turn out wrong with very steep triangles, but we are only talking about subpixel differences, and in my software rendering I didn't see that happening much.

Why do you think it's hacky? Because it's not supported by all IHVs? Maybe you think MRTs are hacky too.

Dude, chill out. Truth be known, I like ATI's high-end hardware better right now than Nvidia's. I'm basing my "hacky" claim on work I've been doing with a software raster engine, not on IHV support. Besides, in the future I bet all IHVs will support it. The reason I called centroid hacky is that you're changing filtering only on the edges of polygons, in ways that are impossible to predict (on the app's side). I believe in hardware doing predictable things.
 
Introducing this border around each subtexture would mean doubling the size of the texture pack, wouldn't it? If you've put four 512x512 textures into one 1024x1024 texture, and now you want a couple of pixels' border around each, you'll have to go up a size. Think of all that wasted space! To make optimal use of it they'd have to go through and see what could be fitted where in the new layouts, and you'd still have big gaps left over.
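A minimal sketch of that space problem, with made-up sizes rather than anything from the actual HL2 assets:

Code:
# Hypothetical example: four 512x512 subtextures packed into one atlas.
# Without borders they tile exactly into 1024x1024; add a small border
# around each and a power-of-two atlas has to jump to the next size up.
def next_pow2(n):
    p = 1
    while p < n:
        p *= 2
    return p

sub = 512                      # subtexture edge length
border = 2                     # texels of padding on every side
padded = sub + 2 * border      # 516

side = next_pow2(2 * padded)   # two padded subtextures per row: 1032 -> 2048
used = 4 * padded * padded
total = side * side
print(f"atlas: {side}x{side}, utilisation: {100 * used / total:.0f}%")

With those numbers the atlas jumps to 2048x2048 at roughly 25% utilisation unless the whole layout is repacked to fill the gaps.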

If ATI's cards can change their multisample pattern I don't see why they can't just detect hl2.exe (or whatever) being run and change it appropriately. Eventually the choice might work its way into DX 9.0b or whatever the next release is.
 
Re: hELLBINDER

Grall said:
...
Actually, they don't even support ATI, you crazy <bleep>. If you took your head out of the fluffy clouds where you're currently floating about, you'd see Valve has said ATI isn't supported EITHER currently, because DX9 can't enable the kind of texture sampling needed on a selective basis!

This comment well describes the current problem people are having wrapping their heads around the "ATi-nVidia issue." Right, it's not a question for developers of "supporting ATi" versus "supporting nVidia"--that has (and should have) nothing to do with it. The question is rather of how well the IHVs are supporting the API. The API is key and critical for the developer, and the differences stem from differing degrees of API support by the IHVs and not as the result of developers choosing to support one IHV over another.

I have always thought that Atari allowing nVidia to run ads promoting itself within UT2K3 (even though it's very easy to replace the nVidia logo with an ATi logo, or any other logo) was at best in poor taste and at worst highly misleading. Note the masses of the unwashed who immediately assumed the presence of the nVidia logo meant the game had been "optimized for nVidia hardware" when in fact it meant no such thing. If Atari wanted to sell nVidia (or anyone else) space on its program CD for nVidia to place a separate promotional advertisement, I'd have no problem with that, really. But to deliberately place such ads inside games themselves or on product boxes is an example of a software publisher deliberately allowing an IHV to at least attempt to create an illusion regarding something that doesn't exist--such as partiality in their software itself towards specific hardware made by a specific IHV. The truth is that the software is "partial" towards the API, and one IHV may come out better because the IHV does a better job supporting the API in its products--not because of any particular developer "optimization."

The only real exception to this I can think of is when a game developer builds his software engine around a particular IHV's OpenGL extensions, such as we see with Bioware's original NWN engine. However, this is definitely the exception to the rule, and the amount of effort Bioware had to expend to write in support for ATi's OpenGL extensions so that the game would support ATi's hardware proves the fallacy of this approach, in my view. While I don't blame Bioware for making this decision regarding the NWN engine at the time its development began (long before ATi shipped R300), it does conclusively prove that such decisions are short-sighted and that developers should generally avoid them, IMO.
 
Myrmecophagavir said:
If ATI's cards can change their multisample pattern I don't see why they can't just detect hl2.exe (or whatever) being run and change it appropriately. Eventually the choice might work its way into DX 9.0b or whatever the next release is.
It has nothing to do with sample pattern at all but how the texture sample is taken. In normal MSAA, the texture sample is always taken at the center of the pixel. The problem with this is that, in some cases, that sample can be from outside the polygon you are rendering. This is problematic in cases where nearby texels don't match the texels the current polygon is using.

If you sample at the centroid (meaning the "center" of the samples within the polygon), then you'll never run into this problem, because you'll always sample texels from within the polygon itself. You can't use this technique all the time, however, because of the extra filtering that happens along edges. For example, take a quad with a checkerboard texture. If you render this as a triangle strip, then you will see a "seam" down the middle of the quad where the two triangles meet. Of course, this is a worst case scenario.
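A toy illustration of the difference (my own sketch of what a software rasteriser might do, not how any driver actually implements it): with 4x MSAA, coverage is tested per sample, but the texture coordinate is normally evaluated at the pixel centre, which can land outside the triangle; centroid sampling averages the covered sample positions instead, so the lookup stays inside the polygon.

Code:
# Toy sketch: one 4x MSAA pixel lying along a triangle edge. The triangle
# and the sample offsets are made up for the example.
def inside(tri, p):
    """Point-in-triangle test: the three edge functions must not disagree in sign."""
    (ax, ay), (bx, by), (cx, cy) = tri
    px, py = p
    d1 = (px - bx) * (ay - by) - (ax - bx) * (py - by)
    d2 = (px - cx) * (by - cy) - (bx - cx) * (py - cy)
    d3 = (px - ax) * (cy - ay) - (cx - ax) * (py - ay)
    has_neg = d1 < 0 or d2 < 0 or d3 < 0
    has_pos = d1 > 0 or d2 > 0 or d3 > 0
    return not (has_neg and has_pos)

triangle = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]           # hypotenuse crosses the pixel
pixel_centre = (5.2, 5.2)                                    # just outside the triangle
samples = [(4.8, 4.8), (5.4, 4.5), (4.5, 5.4), (5.4, 5.4)]   # 4x MSAA sample positions

covered = [s for s in samples if inside(triangle, s)]
centroid = (sum(x for x, _ in covered) / len(covered),
            sum(y for _, y in covered) / len(covered))

print("pixel centre inside?", inside(triangle, pixel_centre))  # False: centre-based
print("covered samples:    ", covered)                         # sampling would fetch
print("centroid:           ", centroid)                        # texels from outside

Three of the four samples are covered, yet the pixel centre falls outside the triangle; that's exactly the case where a centre-based texture fetch can pull in neighbouring sub-texels from a packed texture, while the centroid stays safely inside.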
 
Well, I forced the GeForce FX 5900 Ultra to run 4xAA using supersampling instead of multisampling... The results weren't good. The framerates were downright HORRIBLE during the UT2003 flyby. On a side note, it didn't look all that bad either, kind of blurry though... I should do some more testing to see if it's a feasible solution to use on the card. The thing that doesn't make much sense is that the card has 256 MB of RAM... 4xSSAA shouldn't really be that much worse than 4xMSAA.
 
surfhurleydude said:
The thing that doesn't make much sense is that the card has 256 MB of RAM... 4xSSAA shouldn't really be that much worse than 4xMSAA.

RAM isn't the issue - you are effectively forced down to one pixel pipe with SSAA...
 
RAM isn't the issue - you are effectively forced down to one pixel pipe with SSAA...

I thought SSAA renders the scene at a higher resolution than is actually on screen in the frame buffer, then displays it at your desired resolution to get the anti-aliased effect. :?:
 
With SSAA each pixel pipe effectively becomes a subsample pipe. If you are running 4X FSAA then each of the pipes is producing one of the subsamples that makes up the final pixel, so all 4 pixel pipes on the 5900 are used to create the value of one final pixel.
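A minimal sketch of the arithmetic, using hypothetical figures rather than the 5900's real specs: the amount of video memory isn't what hurts, fill rate is, because supersampling divides effective pixel throughput by the sample count.

Code:
# Hypothetical 4-pipe card; the figures are illustrative, not NV35's actual specs.
pipes = 4
core_mhz = 400
aa_samples = 4

raw_fill = pipes * core_mhz        # Mpixels/s: one pixel per pipe per clock

# With 4x supersampling every subsample is shaded and textured like a full
# pixel, so all four pipes together finish only one displayed pixel per clock.
ssaa_fill = raw_fill / aa_samples

print(f"raw fill rate:             {raw_fill} Mpixels/s")
print(f"effective fill at 4x SSAA: {ssaa_fill:.0f} Mpixels/s")
# Multisampling avoids most of this cost: shading and texturing still run once
# per pixel, and only coverage/Z and framebuffer bandwidth scale with samples.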
 
reply

Grall - I am making an assumption that Nvidia no longer continues to ship GF4 products, based on:
a. the websites of their partners such as Asus, Leadtek, MSI, etc.
b. the fact they have continued to keep a dominant market share; for that see: http://www.globeandmail.com/servlet/story/RTGAM.20030428.watii428/BNStory/Technology/

Dave - no, I don't expect Doom3 to run well on an MX. But I do expect game developers to optimise for the graphics card I use, which is an Nvidia one today and likely an Nvidia one tomorrow. I expect my next card to be a 5900 just as soon as they release one for under $250, which I expect before Xmas. I want a 5900 because I trust Nvidia's drivers and because I happen to be looking forward to Doom3. It's annoying to me that Valve is making my life harder by not seeming to support Nvidia.

I *DO* accept that HL2 will run OK on my 5900, and I *DO* accept that the issue as described for HL2 FSAA support affects both ATI and Nvidia. But my concern, assuming this story is true, is that it *looks like* Valve is supporting ATI over Nvidia. This looks like a bad commercial decision for them.

I LIKE that "The Way It's Meant to be Played" is in games, because it lets me know that the developer is working with Nvidia, and it's Nvidia hardware I use. I don't see anything wrong with that - on the contrary, in the same way that life has been easier since the world adopted MS Office for documents, I see life as being easier when all PC games are optimised for a single 3D platform around Nvidia. Then I will never have to worry about my game being optimised for anything other than the hardware I will be using.

If ATI could kill Nvidia and become the dominant player, then that would be easier too, but that's unlikely, and in any case they still have lousy drivers, in my humble, unscientific, end-user, lousy-knowledge opinion.
 
DaveBaumann said:
With SSAA each pixel pipe effectively becomes a subsample pipe. If you are running 4X FSAA then each of the pipes is producing one of the subsamples that makes up the final pixel, so all 4 pixel pipes on the 5900 are used to create the value of one final pixel.

Does this mean that proper DX9-spec cards with 8 pipes will be able to do SSAA faster? (e.g. the 9800)

Also, would this have anything to do with the comments by Valve that the ATI cards have the "possibility" of implementing correct FSAA in HL2 whilst Nvidia cards can't?
 
Re: reply

palmerston said:
I LIKE that "The Way It's Meant to be Played" is in games, because it lets me know that the developer is working with Nvidia, and it's Nvidia hardware I use. I don't see anything wrong with that - on the contrary, in the same way that life has been easier since the world adopted MS Office for documents, I see life as being easier when all PC games are optimised for a single 3D platform around Nvidia. Then I will never have to worry about my game being optimised for anything other than the hardware I will be using.

You do know that TWIMTBP is just a marketing campaign? Several developers here have stated that Nvidia pay for the ad at the beginning of the game (so the publisher gets more cash) but there is no obligation for any optimising. There is no more extra help given by Nvidia for being in TWIMTBP, and it's likely that ATI give the same levels of support when it comes to helping developers optimise their code.

palmerston said:
If ATI could kill Nvidia and become the dominant player, then that would be easier too, but that's unlikely, and in any case they still have lousy drivers, in my humble, unscientific, end-user, lousy-knowledge opinion.

You would be wrong. ATI drivers are excellent - they have come a long, long way in the last 12 months with their Catalyst programme. They've been releasing solid WHQL drivers every six weeks, which is a lot better than Nvidia has been.
 
Re: reply

palmerston said:
Grall - I am making an assumption that Nvidia no longer continues to ship GF4 products, based on:
a. the websites of their partners such as Asus, Leadtek, MSI, etc.

Then I'll share this spanking fresh digitimes article with you:

digitimes said:
Albatron recently reported revenues of NT$280 million in June, of which NT$117 million were from graphics card sales, down 45% compared to May's graphics card sales of NT$211 million.

Albatron sold 50,000-60,000 graphics cards in June, down from an average of 70,000-80,000 cards per month earlier in the first half. However, Albatron in July may ship 80,000-90,000 graphics cards due to stronger European SI (system integrator) orders, according to sources.

Although profit margins for Nvidia's NV35 and NV31 chipset-based graphics cards are higher, Albatron's SI orders will mainly consist of NV18 and NV34-based graphics cards, according to sources.

So you see - NV18 (GeForce4 MX) is still a major product.
If you'd like to draw any conclusions about what a smashing success the GeForce FX series of products has been, feel free to do so, but I'd have to caution you on the small size of the sample. There have been indications that Albatron is not unique, though.

Entropy

PS. The entire article is quoted. Taken from here:
http://www.digitimes.com/NewsShow/N...D7A07AEF7048256D650046C38E&amp;query=ALBATRON
 
Kalbaz said:
Does this mean that proper DX9-spec cards with 8 pipes will be able to do SSAA faster? (e.g. the 9800)
I don't know in which way the number of pipes is supposed to be related to the DX9 spec.
But generally, a card with a higher fill rate will perform better than one with a lower fill rate when you enable SSAA, provided they are not totally bandwidth or geometry limited.

Also, would this have anything to do with the comments by Valve that the ATI cards have the "possibility" of implementing correct FSAA in HL2 whilst Nvidia cards can't?
No.
 
palmerston said:
No, I don't expect Doom3 to run well on an MX. But I do expect game developers to optimise for the graphics card I use, which is an Nvidia one today and likely an Nvidia one tomorrow. I expect my next card to be a 5900 just as soon as they release one for under $250, which I expect before Xmas.

So in other words, all the historical data you've presented bears no relation to what we're talking about here, by your own admission. Has it occurred to you that, if Valve were supporting ATI over NVIDIA, they would be serving their own best interests, because the market conditions for DX9 boards are vastly different from previous DX revisions?

However, I don’t think Valve are actively supporting one over another in this case as MSAA is broken under all conditions, they are just pointing out that ATI may technically have the hardware to do it (if they find some way of exposing it). And, had it occurred to you that some developers may just like ATI’s architecture better? This is an opinion that has been voiced on numerous occasions.

I LIKE that "The Way It's Meant to be Played" is in games, because it lets me know that the developer is working with Nvidia, and it's Nvidia hardware I use.

So, developers supporting NVIDIA is OK, but developers supporting others isn't? Perhaps ATI users like "Get in the Game"...
 
DaveBaumann said:
DemoCoder said:
Example? Only thing I can remotely see approaching it is some of the fire and water effects.

For DX9 level boards this is probably going to be one of the most shader intensive titles to date. It is riddled with shader code.

Evidence? Quake3 and UT2k3 were also "shader intensive", but most of the shaders fit into one or two passes on DX7/8. You seem to be suggesting that Valve is using effects branching and will have two sets of shaders, one for DX9 and one for everything else. Otherwise, older cards would have to emulate these "intense" shaders via multipass, and why would Valve do something that will alienate 99% of their market?

There are publicly available screenshots out there from the game levels (not the "tech" walkthrough part). Apart from the water, I would like someone to point out in one of these screenshots something that is truly a DX9-level pixel shader. I watched that E3 video dozens of times, and I saw very little shading that looked impressive. I saw impressively high-res textures. I saw impressive fire and smoke and water shaders. I saw a little bit of bump mapping here and there, but nothing we haven't seen before. (e.g. on the pheromone level, the bathroom tile has that overly shiny specular highlight look we have all come to know and love, not really a DX9-required effect)

I'm not saying the game doesn't look impressive, but 90% of its look seems to come from well designed geometry and textures. Halo on the X-Box looks more "shader intensive" IMHO.
 
Re: reply

Allow me to interject here. For all who know me, I seldom voice my opinion in forums, but when I do, I prefer to have a complete opinion on the subject at hand.

palmerston said:
Grall - I am making an assumption that Nvidia no longer continues to ship GF4 products, based on:
a. the websites of their partners such as Asus, Leadtek, MSI, etc.
b. the fact they have continued to keep a dominant market share; for that see: http://www.globeandmail.com/servlet/story/RTGAM.20030428.watii428/BNStory/Technology/

You should by now have taken notice that ATI's market share is increasing, not decreasing. The R3x0 line has helped a lot, since the 'average Joe' opinion of ATi has begun to change for the better (when circumstances allow it, of course). The Catalyst team has made a really good impression on the public, who now have something not just comparable to the Detonators but in fact much better (at the current stage, where the Detonators have been beta for months while every Catalyst release has been certified).

Check this response from Catalyst Maker to a rather malevolent review, in which I back him up 100%: http://www.rage3d.com/board/showthread.php?s=&threadid=33698698&highlight=not+clear+to+me

palmerston said:
Dave - no, I don't expect Doom3 to run well on an MX. But I do expect game developers to optimise for the graphics card I use, which is an Nvidia one today and likely an Nvidia one tomorrow. I expect my next card to be a 5900 just as soon as they release one for under $250, which I expect before Xmas. I want a 5900 because I trust Nvidia's drivers and because I happen to be looking forward to Doom3. It's annoying to me that Valve is making my life harder by not seeming to support Nvidia.

If I were you, I'd expect game developers to support 'fair play' and NOT optimise for one hardware vendor over another. Granted, you can't have a Trident and still expect to play UT2k3 at max details, but you seem to forget that, in the DX9 card userbase, ATi dominates (and we can safely rule out the GF FX 5200 from 'DX9', since there have been numerous debates over whether it is 'compatible' or 'compliant'...). I have a Radeon 9500 Pro here and I love it; I beta test the new Catalysts, and I see the progress being made...
ATI has evolved more in the software department in the last year than in the last 5 years altogether! I don't trust Nvidia's drivers (I did when I had a Radeon 7200 and ATI drivers were a pain, while the Detonators used to be the 'shining star of drivers'), since they refuse to work properly, lower quality on purpose while leaving the user defenceless about it, and now come in a super-duper-whoa encrypted form so no one can try to remove the 'optimizations'. If you trust that, then I suppose your attitude is 'Screw everyone else, I have Nvidia hardware so I rule.' (Well, you do sound like that.)

It's promising to me that ATI has followed the standard DX9 spec as closely as possible so far (in this DX9 generation) and can thus enable in its drivers the feature Valve calls for. If nVidia chose to follow a 'semi DX9, semi DX8' route in their hardware, they are to blame. So kudos to Valve for really pushing games forward.

palmerston said:
I *DO* accept that HL2 will run OK on my 5900, and I *DO* accept that the issue as described for HL2 FSAA support affects both ATI and Nvidia. But my concern, assuming this story is true, is that it *looks like* Valve is supporting ATI over Nvidia. This looks like a bad commercial decision for them.

I don't see this as a commercial issue. It's purely a hardware issue, and they never said nVidia can't try to fix the thing (they could emulate it or produce a similar result, I suppose, but they won't be able to fix it 100%; hardware restrictions apply). Valve is a company wanting to sell games, remember? They wouldn't leave Nvidia users high and dry, but I suppose AA will remain strictly DX9-related in that title, and since the NV3x architecture has many flaws, you can't blame Valve for that.


palmerston said:
I LIKE that "The Way It's Meant to be Played" is in games, because it lets me know that the developer is working with Nvidia, and it's Nvidia hardware I use. I don't see anything wrong with that - on the contrary, in the same way that life has been easier since the world adopted MS Office for documents, I see life as being easier when all PC games are optimised for a single 3D platform around Nvidia. Then I will never have to worry about my game being optimised for anything other than the hardware I will be using.

If ATI could kill Nvidia and become the dominant player, then that would be easier too, but that's unlikely, and in any case they still have lousy drivers, in my humble, unscientific, end-user, lousy-knowledge opinion.

If all 3D games were optimised for a single IHV, we would be in a worse state than consoles are. A monopoly is never good, and having two players in the market is far better (I'd like three, but S3 needs time to try to become the third, I suppose). If you don't want to worry about whether your game will work, buy a console. On PCs there will never be an era when all games work on a single card only (if we exclude the Glide era, since Glide did indeed work only on 3dfx cards).

Your humble, unscientific, lousy-knowledge opinion is so far-fetched that I have to wonder if you're smoking something hallucinogenic, to quote the words of an infamous Nvidia CEO (familiar with him?).

Edit: Clarity
 