Reverend at The Pulpit #10

Status
Not open for further replies.

Reverend

Banned
It's been a while since the last edition but I'm sure no one misses this!

Not much has been happening for me site-wise. I haven't really been checking up with various developers on the progress of their upcoming games, and I've given up on doing so with Carmack and DOOM3... he always appears to have moods... maybe I'll pester him about post-DOOM3 stuff at a more appropriate time in the future.

One of the more interesting things that came out of my correspondence with developers recently has to do with a potentially A-list upcoming game and the level of shader support it will have. Get this:

Developer said:
... we'll have a high end renderer using SM 3.0 (and 2.0 now, ATI visited us <snip>)

... and then, while asking him for a copy of a demo of the game's WIP and talking some more about the game's graphics features:

Same developer said:
I guess you found out why we are not planning to do a SM 2.0 version... ;) (The business/market reason)

How disappointing is that? It's not difficult to guess the reason (and I have a feeling Joe will have something to say about this!).

Dave has said that I may get to review a 6800 Ultra. Crap... now I have to re-learn a lot of things...

As for NVIDIA's newly announced scalable stuff... this does indeed bring back memories of the great discussions I had with Scott Sellers about scanline interleaving. I thought some of you might enjoy some emails from Scott way back then (1999), perhaps as a refresher on scanline interleaving. All have to do with the ole dual-chip Voodoo5:

....well, designing for something that works for non-powers of 2 gets difficult with address calculations and things like that... Plus, it was just never seen as a requirement for the product. Working with numbers that are powers of 2 is easy because of the simple binary math inherent. I know this is sort of vague -- there is no real answer to this other than just the fact that we did not design in the capability to support a non-power-of-2 number of chips...

...SLI works by partitioning up the screen into scanline "bands" -- it does NOT divide up the screen into halves (a la the PGC or whatever that other hack solution was called...).

For the 2 chip configurations, there are various modes of operation:

Mode #1: SLI mode with no FSAA. In this mode, each chip, in parallel, renders scanline "bands". The particular band height is programmable, between 1 and 128 scanlines (power of 2). So, for example, one chip would render scanlines 1-32, while the next would render 33-64, etc. In this mode, then, each chip renders 2 pixels per clock, but since both are operating in parallel we achieve 4 pixels per clock throughput. Note that in this mode, since we're not performing FSAA, each chip only renders one subsample per pixel (in other words, only pixels are rendered, not sub-pixels/sub-samples).

Mode #2: SLI mode with 2-sample anti-aliasing (FSAA): In this mode, each chip, in parallel, renders scanline "bands." However, we can set each chip up to render 2 sub-samples per pixel. Therefore, since each chip is rendering 2 sub-samples per pixel, our effective fill-rate drops in half to 2 pixels per clock (each chip will render 1 pixel per clock since we're rendering 2 sub-samples, but since we have 2 chips running in parallel we achieve 2 pixels per clock sustained fillrate).

Mode #3: 4-sample anti-aliasing (this is also the mode which allows for the T-Buffer effects, as 4 subsamples are the minimum required to perform interesting T-Buffer special effects). In this mode, as with mode #2, each chip renders 2 sub-samples per pixel. So, in order to achieve a full 4 sub-samples per pixel we combine each chip's 2 sub-samples to form 4 sub-samples per pixel. In this mode of operation, we are not running in "SLI" per se, since each chip is actually working on the same pixel. The difference in this mode is that while each chip is working on the same pixel, each chip is working on different sub-samples within that pixel. So, each chip renders 2 sub-samples per pixel, and the 2 chips work in parallel to achieve a sustained 4 sub-samples generated per clock. So, in FSAA mode we achieve a sustained single pixel per clock fill-rate (each chip will render 1 pixel per clock since we're rendering 2 sub-samples each, and both work on the same pixel, but different sub-samples, to achieve a true single 4-subsample pixel per clock fillrate).
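In case anyone wants the band assignment and the fill-rate arithmetic spelled out, here's a quick sketch I knocked up myself -- purely my own illustration, not anything from 3dfx, and the chip count and band height are just example values:

Code:
#include <cstdio>

// Rough illustration only, not 3dfx code. With a power-of-2 chip count the
// band-to-chip modulo reduces to a cheap bitwise AND -- Scott's point about
// power-of-2 math being easy.
int chip_for_scanline(int scanline, int band_height, int num_chips) {
    int band = scanline / band_height;   // which band the scanline falls in
    return band & (num_chips - 1);       // round-robin bands across the chips
}

int main() {
    const int num_chips   = 2;    // dual-chip Voodoo5
    const int band_height = 32;   // programmable: 1..128, power of 2

    // Effective fill rate per mode: 2 chips x 2 pixels/clock, divided by the
    // number of sub-samples per pixel (1 = no FSAA, 2 = 2x FSAA, 4 = 4x FSAA).
    const int subsample_modes[] = {1, 2, 4};
    for (int subsamples : subsample_modes) {
        int sustained = (num_chips * 2) / subsamples;
        std::printf("%dx sub-samples -> %d pixel(s) per clock sustained\n",
                    subsamples, sustained);
    }

    // Which chip owns which band (0-based scanlines here, unlike the email).
    for (int y = 0; y < 128; y += band_height) {
        std::printf("scanlines %3d-%3d -> chip %d\n",
                    y, y + band_height - 1,
                    chip_for_scanline(y, band_height, num_chips));
    }
    return 0;
}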

Of course, I'm sure you all know NVIDIA's method isn't what 3dfx did with the dual-Voodoo2s and Voodoo5s (read about this in our 3D Hardware & Technology forum). Personally, this isn't terribly exciting to me. It's cool, for sure (I remember how super-cool I felt telling folks I had SLI'ed Voodoo2s and how fast Quake2 ran back then!).
 
I think it's extremely sad that developers are selling out like this.
When I find out the name of the game, I'm going to avoid it.

I think it would be funny if the game ran like crap on nVidia cards when using SM 3.0.

I'm sure this time, however, there won't be a need to replace the shaders. :LOL:

Reverend said:
It's been a while since the last edition but I'm sure no one misses this!

You're only saying this because you don't get compliments on your Pulpit thread.
You know darn well that the people of B3D and guests love reading your threads.
You're just feeling insecure. :)
 
I can't believe a game would NOT support pretty much the ENTIRE market of DX9 just to get some minor exclusive flashiness for Nvidia's latest chips - which can't even be bought by the vast majority of people since there are no low-cost products out yet.

Sounds like an extremely dumb choice to me; surely they can't get enough money out of NV's pocket to make up for the loss in sales.

I certainly wouldn't buy this game - whichever it is - even if I owned a GF6 card, because I don't appreciate being held hostage by software and hardware companies. Likewise, I avoid Nvidia-branded games too. Epic are IMO totally untrustworthy and I don't believe a word coming out of Tim Sweeney's mouth either. He's a bought NV puppet and will say anything that makes them look good.
 
Reverend said:
How disappointing is that? It's not difficult to guess the reason (and I have a feeling Joe will have something to say about this!).

Since you asked...how about you relay this message to said developer:

"Take your game and shove it up your ass."

That's where a crap game belongs, doesn't it? I mean, the game must be crappy... if the calculated "business reason" works out such that whatever money nvidia gave them would outweigh the loss of sales of the game to PS 2.0 cards from both nVidia and ATI, then the projected sales figures must suck schwetty balls.

Maybe you could then ask him how much money "Bridge It" made....
 
DaveBaumann said:
Sadly there is more to this little story which really makes me :( for the industry.

Can someone spill the beans on this? Obviously, if the developer in question believes this is an honest and reasonable thing that will benefit their business and customers, they should have no problem standing up and saying who they are and what they are doing.

I doubt they will though, which pretty much proves that both they and Nvidia know they are doing shady deals. Don't forget, this includes orphaning all the SM2.0 Nvidia cards that have been sold. In fact, this is worse than orphaning, as Nvidia is actually *paying* developers to *deliberately stop supporting* their own SM2.0 cards (as well as the competition's), in order to try to increase sales of their newer SM3.0 cards.
 
I missed your Pulpit pieces Rev, they're actually one of the things that drew me here in the first place. (See? It's your fault! ;) )

DaveBaumann said:
Sadly there is more to this little story which really makes me :( for the industry.
Dave, I'm not saying you owe us after your little weekend jerk-around thread....but you kind of owe us for your little weekend jerk-around thread...

Spill it! ;)
 
This won't be the developer's fault; this will be down to the publisher IMHO.
Guess the game won't be A-list anymore...
-dave-
developers -> mostly nice
publishers -> mostly evil

Still, it's a somewhat BIZARRE idea, surely? Normally developers would be told "screw that SM3.0 rubbish, we need the game to work on a GF2 MX" :)
 
davefb said:
This won't be the developer's fault; this will be down to the publisher IMHO.
Guess the game won't be A-list anymore...
-dave-
developers -> mostly nice
publishers -> mostly evil

Still, it's a somewhat BIZARRE idea, surely? Normally developers would be told "screw that SM3.0 rubbish, we need the game to work on a GF2 MX" :)

Even if this was a deal with the publisher, the developer would in turn be getting money out of it from the publisher. Otherwise the developer is cutting down the market for their game, but not getting the recompense from Nvidia that the publisher is seeing.
 
If the game is an A-title, then why would they need to make shady deals with nVidia?

Wouldn't the game's reputation (from marketing) allow it to make enough money? Or is this about keeping customers in the dark while still getting the dough from nVidia?
 
davefb said:
This won't be the developer's fault; this will be down to the publisher IMHO.
Guess the game won't be A-list anymore...
-dave-
developers -> mostly nice
publishers -> mostly evil

Still, it's a somewhat BIZARRE idea, surely? Normally developers would be told "screw that SM3.0 rubbish, we need the game to work on a GF2 MX" :)
Are you allowed to hint to us what the game is? (Do you know?)

K.I.L.E.R said:
If the game is an A title game then why would they need to make shady deals with nVidia?
Money. The bigger the game, the bigger the deal nVidia will offer for exclusivity with them. (TWIMTBB :( )
 
It sounds to me like ATI was going to help develop 2.0 version shaders, but nVidia's deal is actually prohibiting that.

Oh, but wait...it's nVidia "moving the industry forward"... :rolleyes:
 
[to digi] I ain't got a clue, but most publishers/developers, if given a good enough offer, would take the money and run. I don't think exclusivity is good, but if nvidia think it's a good idea then they'll probably have a good budget to use.... I'd imagine funding a game's development would be cheaper than advertising AND it does cut costs for the publisher/developer...
Let's face it, if the nvidia money was substantial then the only way this wouldn't work for the people involved would be if the sales were appalling... or maybe a wrapper came out, now that would be funny!

-dave-
 
I know which game and developer Rev was referring to, and the strange thing in my mind is that it's a sequel to an A-title game that's going to sell fairly well (unless of course the developer screws the sequel's design up).

The hardcore online crowd that follows these sorts of things is in the vast minority, so those who think that word of this, once leaked, is going to hurt sales are woefully wrong. The thousands of people who buy this sequel once it's released probably don't know what a pixel shader is, let alone be aware of Microsoft's API models and the differences between them.
 
John Reynolds said:
I know which game and developer Rev was referring to, and the strange thing in my mind is that it's a sequel to an A-title game that's going to sell fairly well (unless of course the developer screws the sequel's design up).

The hardcore online crowd that follows these sorts of things is in the vast minority, so those who think that word of this, once leaked, is going to hurt sales are woefully wrong. The thousands of people who buy this sequel once it's released probably don't know what a pixel shader is, let alone be aware of Microsoft's API models and the differences between them.
As long as it ain't a part of the Grand Theft Auto series.... :oops:
 
So it isn't a case of making this SM3.0 ONLY, it's a case of only doing one high-end version? i.e. all the other versions still exist?
*confused*
-dave-
 
I'm out of the loop on this, but the last I heard was that the sequel was going to support PS 1.1 and 3.0, with no fallback for 2.0. Which simply means all R3xx-R4xx hardware will run the game with 1.1 shaders. But I could be wrong.
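Just to illustrate what "no 2.0 fallback" would mean in practice -- a made-up, hypothetical path check on my part, nothing from the actual game:

Code:
// Hypothetical sketch only, not from the game in question. If the engine
// ships only PS 1.1 and PS 3.0 paths, SM2.0-only parts (R3xx/R4xx) never
// qualify for the high-end path and fall straight through to the 1.1 shaders.
enum ShaderPath { Path_PS11, Path_PS30 };

ShaderPath pickShaderPath(int psMajorVersion) {
    if (psMajorVersion >= 3)
        return Path_PS30;   // NV4x and anything later
    return Path_PS11;       // everything else, PS2.0 hardware included
}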

Does this surprise anyone? I'm not trying to sound like an apologist for NVIDIA, but why wouldn't, or perhaps shouldn't, they try to leverage the unique feature(s) their current chips have over the competition?
 