Something wrong with the HL2 Story

Well, it doesn't sound specific to the NV3x; it sounds like all pre-R300 cards which implement multisampling will have this issue. He didn't mention whether the 7500 or 8500 has the hardware.

Supersampling should still work, however, and the NV35 should have enough bandwidth to do it at reasonable resolutions. Some people might prefer to play at 800x600 or 1024x768 with SSAA instead of 1600x1200 with no AA. Unless of course you have an LCD monitor like me, where running at any resolution other than the native one lowers quality.

Valve had better be getting an enormous performance boost out of this texture-packing trick; otherwise, they are trading off backwards compatibility with a large installed base of cards (GF3s, GF4s, 8500s, etc.) for little to no gain. Maybe this trick allows it to run faster on DX6-level cards, and since most people with DX6 and entry-level cards didn't use AA, Valve targeted the Source engine at an audience who mostly wouldn't be able to turn on AA anyway.
 
palmerston said:
vrecan, you seem very credible technically until you state that ati has the greatest installed dx9 base, thats not true at all.
Source?
ati sold approx 1.3-1.5m R3xxx parts. Sales have slowed dramatically the last qtr as they had planned to go wide with rv350 (9600) but issues with the move 0.13u have prevented that happening. In fact rv350 is so bust that this has caused the move to r360 (for a speed boost) and rv360 to fix some metal layer issues.
Source?
ati has done well with some oems with the 9200 but as we both know thats dx8. One wonders what will happen to those sales if nvidia were to suddenly make an aggressive price move.

On the other hand nvidia is reputedly selling millions of its geforcefx range inc both the 5200, 5600 and 5900. We can assume that most of these are 5200 and 5600 which you may not like but it does mean that nvidia is setting the pace for dx9 and not ati.
"Reputedly".
This is from April and nothing new. Does it break down how much of nvidia's marketshare came from the integrated chipset market? Note that nvidia's integrated chipsets are DX7-class.
This looks like valves problem more than nvidia's.
Sure, blame the software because the hardware is inadequate.

Take your FUD elsewhere.

-FUDie
 
DemoCoder said:
Supersampling should still work however and the Nv35 should have enough bandwidth to do it for reasonable resolutions. Some people might prefer to play in 800x600 or 1024x768 SSAA instead of 1600x1200 with no-AA. Unless of course you have an LCD monitor like me, where running at any resolution that is not native lowers quality.
Except that supersampling is demanding on fillrate, not necessarily bandwidth.
Valve better be getting an enormous boost of performance out of this texture packing trick, otherwise, they are trading off backwards compatibility with a large installed base of cards (GF3s, GF4s, 8500s, etc) for little to no gain. Maybe this trick allows it to run faster on DX6 level cards, and since most people with DX6 and entry level cards didn't use AA, Valve targeted the Source engine to an audience who mostly wouldn't be able to turn on AA anyway.
Are you sure that texture packing is the only problem here?

-FUDie
 
vrecan, you seem very credible technically until you state that ati has the greatest installed dx9 base, thats not true at all.

One of the only methodologies that actually gives some kind of representative sample of the DX9 spread, results submitted to Futuremark's ORB database, still indicates this: ATI does have the largest installed user base of DX9 parts according to the later reports published by Futuremark.

palmerston said:
ati sold approx 1.3-1.5m R3xxx parts. Sales have slowed dramatically the last qtr as they had planned to go wide with rv350 (9600) but issues with the move 0.13u have prevented that happening. In fact rv350 is so bust that this has caused the move to r360 (for a speed boost) and rv360 to fix some metal layer issues.

The only issue with RV350 was that they didn't order enough because they thought that NV31 would be around to offer some competition, and it wasn't really.
 
DemoCoder said:
Supersampling should still work however and the Nv35 should have enough bandwidth to do it for reasonable resolutions. Some people might prefer to play in 800x600 or 1024x768 SSAA instead of 1600x1200 with no-AA. Unless of course you have an LCD monitor like me, where running at any resolution that is not native lowers quality.

The title is most likely to be shader limited - you probably don't want to be doing supersampling in this situation.
 
FUDie said:
Valve better be getting an enormous boost of performance out of this texture packing trick, otherwise, they are trading off backwards compatibility with a large installed base of cards (GF3s, GF4s, 8500s, etc) for little to no gain. Maybe this trick allows it to run faster on DX6 level cards, and since most people with DX6 and entry level cards didn't use AA, Valve targeted the Source engine to an audience who mostly wouldn't be able to turn on AA anyway.
Are you sure that texture packing is the only problem here?

-FUDie

They state that texture packing is the problem. Ironically, I remember reading in one of Nvidia's presentations about the benefits of using texture packing (i.e. packing small textures into one large one) in terms of performance and better handling of textures in the driver. I guess they did not think about how MSAA would work with this technique at the time ;)
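For illustration, here is a rough sketch (in Python, with made-up names and sizes, not Valve's or Nvidia's actual code) of the packing idea: small textures are placed into one large atlas, and each surface's UVs are rescaled into its sub-rectangle. This is also exactly why a sample that lands just past a triangle edge can read texels from a neighboring packed texture:

```python
# Sketch of texture packing ("atlasing"): many small textures are placed
# into one large texture, and each mesh's UVs are rescaled into the
# sub-rectangle that holds its texture. All names/sizes are illustrative.

def pack_uv(u, v, tile_x, tile_y, tile_size, atlas_size):
    """Remap a local (u, v) in [0,1] into the atlas sub-rectangle."""
    scale = tile_size / atlas_size
    return (tile_x * scale + u * scale,
            tile_y * scale + v * scale)

# A 256x256 texture placed at tile (1, 0) of a 1024x1024 atlas:
u, v = pack_uv(0.5, 0.5, tile_x=1, tile_y=0, tile_size=256, atlas_size=1024)
print(u, v)  # 0.375 0.125

# The catch: with multisampling, a sample position can fall just outside
# the triangle, so the interpolated UV can land outside the tile and
# fetch texels from the *neighboring* packed texture (edge bleeding).
```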
 
DaveBaumann said:
DemoCoder said:
Supersampling should still work however and the Nv35 should have enough bandwidth to do it for reasonable resolutions. Some people might prefer to play in 800x600 or 1024x768 SSAA instead of 1600x1200 with no-AA. Unless of course you have an LCD monitor like me, where running at any resolution that is not native lowers quality.

The title is most likely to be shader limited - you probably don't want to be doing supersampling in this situation.

Unlikely. Most of HL2's environments I see in the videos are traditional DX7-style multitexture "shading", with the exception of a few, like water, and some demo surfaces (colored church glass, the "Predator" cloak effect). I doubt they would exceed the limitations of PS1.0. I see nothing that suggests DX9-level shading.

Unless you can produce some specific data, I find it highly unlikely that Valve targeted DX9, and the vast majority of their engine should run fine on DX7 and DX8 cards. I bet HL2 will be more geometry, memory, and CPU limited than shader limited.

It still stands to reason that this is Valve's problem, since they designed an engine that uses tricky performance techniques which don't work well with tens of millions of existing chips on the market, and isn't even legal according to DX9.

Even if the NV40 and R300 will support it under DX9.1 or DX10, it still leaves millions of people who have working MSAA cards out in the cold today.
 
palmerston said:
you miss the point

for all I know ati may have a better chip for hl2 but thats hardly going to help when most of the game buying public are buying nvidia products. its like valve backed betamax when the world was going to vcr.

I couldnt give a monkeys elbow what ati does, I trust nvidia and their drivers and I like the "the way its meant to be played" it makes it easier to choose games knowing that they'll look good on my pc.


spoken like a true f@nboy
 
Can NV3x cards even use supersampling without any multisampling? I remember the NV2x cards could, but I've never seen it with an NV3x.
 
gkar1 said:
palmerston said:
you miss the point

for all I know ati may have a better chip for hl2 but thats hardly going to help when most of the game buying public are buying nvidia products. its like valve backed betamax when the world was going to vcr.

I couldnt give a monkeys elbow what ati does, I trust nvidia and their drivers and I like the "the way its meant to be played" it makes it easier to choose games knowing that they'll look good on my pc.


spoken like a true f@nboy
At least he was nice enough to equate ATI with Betamax, which was most certainly the better format. :LOL:

(cry havoc and let slip the old school fanboys of war)
 
The Baron said:
Can NV3x cards even use supersampling without any multisampling? I remember the NV2x cards could, but I've never seen it with an NV3x.

All cards can do supersampling as long as they have the memory for a large framebuffer.
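In principle, the only other thing needed besides the larger framebuffer is the downsample step, which is just a box filter over the oversized image. A minimal sketch in Python (purely illustrative; real drivers do this in hardware):

```python
def downsample_2x(hi):
    """Box-filter a 2x2-supersampled framebuffer down to target size.
    `hi` is a list of rows of grayscale values, dimensions 2W x 2H."""
    h, w = len(hi) // 2, len(hi[0]) // 2
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            s = (hi[2*y][2*x] + hi[2*y][2*x+1] +
                 hi[2*y+1][2*x] + hi[2*y+1][2*x+1])
            row.append(s / 4.0)
        out.append(row)
    return out

# A hard black/white edge rendered at 2x resolution becomes a smoothed
# edge at target resolution -- that's the antialiasing:
hi = [[0, 0, 255, 255],
      [0, 0, 255, 255],
      [0, 255, 255, 255],
      [0, 255, 255, 255]]
print(downsample_2x(hi))  # [[0.0, 255.0], [127.5, 255.0]]
```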
 
DemoCoder said:
DaveBaumann said:
The title is most likely to be shader limited - you probably don't want to be doing supersampling in this situation.

Unlikely. Most of HL2's environments I see in the videos are traditional DX7 style multitexture "shading", with the exception of a few, like water, and some demo surfaces (church colored glass, "Predator" cloak effect) I doubt they would exceed the limitations of PS1.0. I see nothing that suggests DX9 level shading.
I do.
Unless you can produce some specific data, I find it highly unlikely that Valve targeted DX9, and the vast majority of their engine should run fine on DX7 and DX8 cards. I bet HL2 will be more geometry, memory, and CPU limited than shader limited.
Possibly "geometry, memory and CPU limited" on non-DX9 hardware.
It still stands to reason that this is Valve's problem, since they designed an engine that uses tricky performance techniques which don't work well with tens of millions of existing chips on the market, and isn't even legal according to DX9.

Even if the NV40 and R300 will support it under DX9.1 or DX10, it still leaves millions of people who have working MSAA cards out in the cold today.
Lots of games already have issues with MSAA, that doesn't stop people from playing them. The fact that there is a solution to the problems is a good thing, not a bad thing.
 
DemoCoder said:
The Baron said:
Can NV3x cards even use supersampling without any multisampling? I remember the NV2x cards could, but I've never seen it with an NV3x.

All cards can do supersampling as long as they have the memory for a large framebuffer.
Well yeah, but is there an option available through RivaTuner or something of the kind right now?
 
gkar1 said:
palmerston said:
you miss the point

for all I know ati may have a better chip for hl2 but thats hardly going to help when most of the game buying public are buying nvidia products. its like valve backed betamax when the world was going to vcr.

I couldnt give a monkeys elbow what ati does, I trust nvidia and their drivers and I like the "the way its meant to be played" it makes it easier to choose games knowing that they'll look good on my pc.


spoken like a true f@nboy
Or at least not fully educated on the subject. "The way it's meant to be played" just makes it easier on the publishers' pocketbooks, not on the audience.

I'd like to see some solid info backing the assertion that nV is selling more DX9 hardware. I'm not convinced the 5200 qualifies as DX9, BTW (I'll leave the 5600 and 5800 alone for now). nV may be shipping more volume now, but ATi still had a six-month head start. The one rumor of an OEM complaining of inadequate RV350 supplies doesn't fully convince me of problems with building the cards, as others here have said otherwise. Supply problems are a separate (but related) issue, though.
 
I can really only imagine how much ATI had to shell out for this:

[attached image: HL2ATi.jpg]
 
OpenGL guy said:
Unlikely. Most of HL2's environments I see in the videos are traditional DX7 style multitexture "shading", with the exception of a few, like water, and some demo surfaces (church colored glass, "Predator" cloak effect) I doubt they would exceed the limitations of PS1.0. I see nothing that suggests DX9 level shading.
I do.

Example? The only thing I can see remotely approaching it is some of the fire and water effects. For it to be relevant, it would have to be something quite common in the artwork to cause such a bottleneck (not the old ISV trick of sugaring a few showcase effects on top of old game artwork to show off one or two materials). I find it hard to believe that a game four years in the making would have a significant amount of content designed to be experienced on an API that just shipped at the beginning of the year.

I am willing to entertain the thought that a small fraction of the materials library might have some experimental DX9-only effects sprinkled about. Unless of course you're talking about effects which convert to multiple passes on lower hardware by design, and which newer (DX9) hardware can now automatically run in a single pass. However, if Valve's game doesn't run acceptably on DX7 and DX8 hardware using multipass for these effects, they will be in trouble anyway. If they want to make money, their game will have to run on existing hardware. I highly doubt they can force people to upgrade to a 9800 just to get playable framerates.



Nevertheless, my point stands that if the game can run at 1600x1200 on a DX7 or 8 card, it can run at 800x600 supersampled.
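The arithmetic behind that claim is easy to check: 2x2 ordered-grid supersampling at 800x600 shades exactly as many samples per frame as 1600x1200 with no AA (a back-of-envelope sketch only; real costs also depend on geometry, bandwidth, and CPU load):

```python
# 800x600 with 2x2 SSAA shades exactly as many samples per frame
# as 1600x1200 with no AA, so the raw fillrate cost is the same.
ssaa = 800 * 600 * 4   # samples shaded at 800x600 with 2x2 SSAA
plain = 1600 * 1200    # pixels shaded at 1600x1200, no AA
print(ssaa, plain, ssaa == plain)  # 1920000 1920000 True
```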
 
silhouette

They just state that current MSAA implementations generate artifacts with the way they use the textures, and that ATI hardware has a workaround for this (it supports another type of MSAA), but it is not exposed in the PS 2.0 specification.

...And we're back to my first question: what is HL2 using such that current FSAA methods (which have worked fine for the past four years!) become obsolete? What is "centroid sampling" in simple words? What is it for?

What is this "another type of MSAA" that ATI's cards support? I've never heard of such a thing, though I've seen almost all of ATI's latest cards.

And since when does FSAA need API support? Don't we just "force" it from the driver?

As for the future-generation cards, R420/NV40, which seem to be compatible with the PS 3.0 spec, they will have to support these AA modes anyway...

I'm still not sure that R420 will be shader 3.0 compatible...

Now, IIRC, HL2 has something like DX6 as its minimum video spec. And what about FSAA at that level of graphics? DX7? DX8? MSAA worked fine for these until today.

From my point of view it IS an issue with Valve, not NVIDIA. It was their choice to dump the majority of FSAA-capable hardware for something called "packed textures"...

DaveBaumann

The title is most likely to be shader limited - you probably don't want to be doing supersampling in this situation.

Why? Shader limited means not fillrate or bandwidth limited, which means you can use SSAA if you want, 2x1 for example. I doubt it'll be a very big performance hit on NV35 hardware.

OpenGL guy

Lots of games already have issues with MSAA

Lots? I can only think of Splinter Cell...

And what will this technique which conflicts with MSAA bring us? Has anyone seen something very beautiful or special in HL2's texturing?

[edit] Less emotions :) more grammar :(
 
Well, in general, it will break "cinematic"-style effects, but I have a feeling that the artifacts Valve is talking about are worse than the usual MSAA-induced artifacts.

Time for the industry to move on from MSAA and find something better that doesn't force programmers to code around it, but is transparent to the application, I guess.
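As a toy illustration of the centroid-sampling question raised earlier in the thread: with MSAA, attributes are normally interpolated at the pixel center, which for a partially covered pixel can lie outside the triangle, extrapolating the UV past the packed tile's edge. Centroid sampling instead evaluates at the centroid of the covered samples, which stays inside. A 1D Python model (all positions and ranges are illustrative, not from any real GPU):

```python
# 1D toy model of center vs. centroid sampling for one partially
# covered pixel. The primitive covers x < 0.4 of the pixel [0, 1],
# the texture coordinate is u(x) = x, and the packed tile only owns
# u in [0.0, 0.4). Values are illustrative only.

samples = [0.125, 0.375, 0.625, 0.875]     # 4x MSAA sample positions
covered = [x for x in samples if x < 0.4]  # samples inside the primitive

u_center = 0.5                             # interpolate at the pixel center
u_centroid = sum(covered) / len(covered)   # interpolate at covered centroid

print(u_center >= 0.4)    # True  -> fetches the *neighboring* tile
print(u_centroid >= 0.4)  # False -> stays inside the owning tile
```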
 