State of 3D Editorial

digitalwanderer said:
JoshMST said:
It is a nasty business, and there are no excuses for it, but that is the way it is. If ATI finds itself in a similar situation, it would probably do the same thing.
Well, except for the fact that when ATi was in a similar situation they did nothing of the sort. :rolleyes:

Do we have a mild case of revisionist history there? R200 featured hacked trilinear filtering for a number of driver releases in Q3, and there was the whole Quake/Quack issue that has been debated for a long time.
 
DaveBaumann said:
I think it is a valid question to ask whether the efforts they put into Cg were not better focused on supporting HLSL fully and creating a driver compiler much earlier than they did.

Agreed. Although there's more to it, I think; I'm not sure whether the new drivers do any VLIW work, but I'm pretty sure Cg couldn't be used for that, considering it just outputs D3D or OpenGL shaders.

One of the theories I heard was that the driver team simply had no idea of how the NV30 worked and just assumed it was some type of "super NV25". I'm VERY doubtful of that theory, considering where it came from (and more importantly, how much of the man's speculative mind might have gotten into it), but it does seem possible that similar problems happened at NVIDIA - communication was, and is, just so darn bad there...




Uttar
 
JoshMST,

Thanks for the reply to my post. I don't agree with it, but that's my problem, not yours :)

I look forward to seeing a revised article when it's ready. TY.
 
DaveBaumann said:
digitalwanderer said:
JoshMST said:
It is a nasty business, and there are no excuses for it, but that is the way it is. If ATI finds itself in a similar situation, it would probably do the same thing.
Well, except for the fact that when ATi was in a similar situation they did nothing of the sort. :rolleyes:

Do we have a mild case of revisionist history there? R200 featured hacked trilinear filtering for a number of driver releases in Q3, and there was the whole Quake/Quack issue that has been debated for a long time.
I'm not arguing/denying that, but I don't remember them attacking John Carmack and iD about their software being the problem the way nVidia attacked FutureMark.

I also remember ATi addressing and fixing the problem in a timely manner. ;)
 
Doomtrooper said:
The Half-Life 2 engine is the way I always wanted to see an engine made: it is scalable.
This keeps the enthusiast community happy with their R300s and NV35s, while the Dell freaks with their GF MX and Radeon 7500s can play too.

Microsoft has already stated that the HL2 engine and benchmark are representative of DX9 performance.

http://www.microsoft.com/presspass/press/2003/sep03/09-10HalfLifePR.asp

I'd agree, but according to Sxotty, it isn't a DX9 game because most of what it does can be approximated in its DX7/8 paths. I'm just questioning his stance on what makes a game truly DX9, as it seems nonsensical to me. HL2 won't suddenly become "more DX9" if Valve removes support for DX7/8, but according to Sxotty, it will.
 
Dio said:
nelg said:
Dio, join the marketing department now
Is that a compliment or an insult? :D Actually I've just been watching sireric. He really gets it right - solid engineering facts presented clearly.

Well, if that means you and/or Eric will have to leave your job(s) in the cafeteria, forget it. I would never suggest that anyone give up free doughnuts. Kidding aside, you, sireric, OpenGL guy and Simon do a great job of explaining the inner workings of graphics chips. I am at a loss to understand why the marketing department is so quiet in this regard.
 
Bouncing Zabaglione Bros. said:
I'd agree, but according to Sxotty, it isn't a DX9 game because most of what it does can be approximated in its DX7/8 paths. I'm just questioning his stance on what makes a game truly DX9, as it seems nonsensical to me. HL2 won't suddenly become "more DX9" if Valve removes support for DX7/8, but according to Sxotty, it will.

Is Half-Life 2 out? No.
I said nothing currently out is what I would consider a DX9 game. Please do not put words in my mouth, especially when they are incorrect.

Sorry I did not answer your question earlier, but I was busy with RL and have not perused this board since I posted.

I cannot yet tell you whether Half-Life 2 is a DX9 game. It seems to be one, but since it is not yet out and I have not seen what it actually does in the game, I cannot tell you. If it is like the tech demo, then of course it would be. What I said was:

"the idea of DX9 as a minimum even though a work around may be possible."

It has absolutely nothing to do with whether it supports a workaround for lower-level hardware; it has to do with whether significant portions are missing on lower-level hardware. For example, take Doom 3 (which is OGL, I know): if card X did not show shadows, I would say that is a significant drawback and therefore the game is above that level of card. In Half-Life 2 there may be significant features that are missing on DX8 cards; if so, then yes, I would consider it DX9. But if DX8 looks exactly the same and only runs slower, or if there are only two effects in the whole game missing, I would not consider it DX9.

This is really simple: if a developer adds one or two DX9-level shaders, that does not, IMO, suddenly make it a DX9 game; if they invest a significant amount of time in making content based on DX9 and then make workarounds for that content, it is DX9. It is a question of what came first: if they make a DX8 game and add DX9, it is DX8 with added stuff; if they make a DX9 game and add workarounds for lower-level hardware, it is DX9. Hopefully this is clear enough.
 
nelg said:
Dio said:
nelg said:
Dio, join the marketing department now
Is that a compliment or an insult? :D Actually I've just been watching sireric. He really gets it right - solid engineering facts presented clearly.

Well, if that means you and/or Eric will have to leave your job(s) in the cafeteria, forget it. I would never suggest that anyone give up free doughnuts. Kidding aside, you, sireric, OpenGL guy and Simon do a great job of explaining the inner workings of graphics chips. I am at a loss to understand why the marketing department is so quiet in this regard.
If you ever saw Dio in the flesh you would suggest that he give up the free doughnuts.

GP.
 
Genghis Presley said:
nelg said:
Dio said:
nelg said:
Dio, join the marketing department now
Is that a compliment or an insult? :D Actually I've just been watching sireric. He really gets it right - solid engineering facts presented clearly.

Well, if that means you and/or Eric will have to leave your job(s) in the cafeteria, forget it. I would never suggest that anyone give up free doughnuts. Kidding aside, you, sireric, OpenGL guy and Simon do a great job of explaining the inner workings of graphics chips. I am at a loss to understand why the marketing department is so quiet in this regard.
If you ever saw Dio in the flesh you would suggest that he give up the free doughnuts.

GP.
And what about yourself, GP? :D

(Probably, give up the free beer! ;))
 
OpenGL guy said:
Genghis Presley said:
nelg said:
Dio said:
nelg said:
Dio, join the marketing department now
Is that a compliment or an insult? :D Actually I've just been watching sireric. He really gets it right - solid engineering facts presented clearly.

Well, if that means you and/or Eric will have to leave your job(s) in the cafeteria, forget it. I would never suggest that anyone give up free doughnuts. Kidding aside, you, sireric, OpenGL guy and Simon do a great job of explaining the inner workings of graphics chips. I am at a loss to understand why the marketing department is so quiet in this regard.
If you ever saw Dio in the flesh you would suggest that he give up the free doughnuts.

GP.
And what about yourself, GP? :D

(Probably, give up the free beer! ;))
Never.

GP.
 
Dio said:
Actually I've just been watching sireric. He really gets it right - solid engineering facts presented clearly.
But isn't this because Eric is just that much more of a geek, and one who continuously wishes he had more hair?

;^) :^)
 
It wouldn't have to have FP16, only FP32.

Nvidia are trying to raise the minimum precision requirement to FP32 because it is VERY likely that the R420 will still only have FP24 precision; therefore Nvidia can advertise that they are the only DX9.1-compliant product.
 
I guess all the ATI people would get really pissed, but I couldn't care less. This already happened, right? DX8.1 was just for ATI, wasn't it?
 
OpenGL guy covered this early in the thread, but I'll bring it up again because it is fundamental to the pipelined architecture of a graphics chip. From the article ...
By doing this NVIDIA does not suffer a speed penalty as ATI does with the conversion
Because graphics pipelines are very long, it is easy to insert conversion logic into the pipeline. There is no performance cost; the only cost is the number of transistors required for the conversion.

Regarding the issue of going full FP32: someday it will happen. Heck, someday we'll probably find a reason to go higher. The timing of this issue is very important, though. If you go FP32 too early, you do nothing but waste transistor space because no games will take advantage of the extra precision. Meanwhile the competition uses their die space to fit extra rendering horsepower.
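
For a rough sense of scale, here is a quick back-of-the-envelope sketch in Python (assuming the commonly quoted mantissa widths of 10, 16 and 23 bits for FP16, FP24 and FP32; the "exact addressing" rule of thumb is my own approximation, not anything from the spec):

```python
# Rough comparison of per-operation rounding error for the three shader
# precisions being argued about. Mantissa widths are the commonly quoted
# ones; treat the output as illustrative, not authoritative.
formats = {"FP16": 10, "FP24": 16, "FP32": 23}

for name, mantissa_bits in formats.items():
    eps = 2.0 ** -(mantissa_bits + 1)   # relative rounding error of a single op
    texels = 2 ** mantissa_bits         # ~largest texture addressable without error
    print(f"{name}: eps ~ {eps:.1e}, exact addressing up to ~{texels} texels")
```

On numbers like these, FP16 is where visible banding tends to creep in on long per-pixel calculations, while the gap between FP24 and FP32 is much harder to see in current content.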

TheTaz said:
You say in your article that PS 3.0 will be in DX9.1, when it's more likely to be in DX 10. Now... whether DX9.1 PS 2.0+ (2.1?) is FP32 or not, I don't know, but I highly doubt it.
PS 3.0 will come before DX10.
 
NVidia's actions this year were very disappointing to me, and frankly I am surprised they haven't suffered a major class action.

Basically, I see that they lied to and deceived a market, made false claims, and rigged independent benchmarks to inflate the abilities of their products. To me this is acting deceptively for financial gain, and it is normally a civil if not a criminal act intended to defraud someone.

Forget why they did it, or that they made promises they may now be closer to delivering on; for a long while those mistruths were in the public domain, hurting consumers of their products ("opportunity loss") and their competitors (financial loss).

I see their opting out of MS HLSL and into Cg as trying to shift the standard to a proprietary one; it would have been the first move towards a monopoly in the market - they would have owned and controlled the key industry API for 3D development.

Of these two points, the first is clearly in the public domain - that they clearly did this, and that it was done to stay competitive, i.e. to reap financial gain. But they did this by deception for undue financial gain. In my country that is a crime and your company directors would be liable. I wonder if the same does not apply in America - it may sound fanatical, but why has no one sued NVidia's pants off? Why are they despised in some quarters, yet let off the hook? That's what I don't get - to me their actions were wrong and documented well enough to get them in a lot of hot water. How have they evaded this?

The article might have been in the right spirit, but it had many inaccuracies that have been well discussed here. I don't see why it's permissible or forgivable to let a company that was caught cheating and deceiving for so long off the hook.

Now NVidia might do magic with compiler optimisers, and by, say, mid-2004 the NV3x may be able to do all NVidia originally claimed - but if a video card has a shelf life of three years, then for 50% of it NVidia's cards didn't do what it was said they could do. NVidia didn't say they had the potential to be (unusably) DX9-capable - they said they were DX9-capable technology. Well, frankly, the hardware was five months late and the software to enable this is only appearing now - and it's weak at best.

I am still surprised that NVidia didn't put a lot more effort into compiler optimisation for their unorthodox architecture a hell of a lot sooner. I think their trade-offs between instructions issued (code) and temporary registers utilised (data) have to be heavily redesigned to achieve better NV3x performance by avoiding pipeline stalls. Traditional compiler optimisation techniques seem to be exactly the wrong approach for the NV3x's architecture.
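
As a toy sketch of the kind of trade-off I mean (the numbers - full speed with up to two live FP32 temporaries, throughput falling off as more registers stay live - are assumptions made up for the example, not anything NVidia has published):

```python
# Toy model of the register-pressure effect described above: the more FP32
# temporaries a fragment program keeps live, the fewer fragments fit in
# flight, so latency hiding (and throughput) degrades. All numbers are
# assumptions for illustration only.

def relative_throughput(live_fp32_regs, free_regs=2):
    """Rough relative throughput as a function of live FP32 temporaries."""
    if live_fp32_regs <= free_regs:
        return 1.0
    return free_regs / float(live_fp32_regs)

def estimated_cost(instruction_count, live_fp32_regs):
    """Estimated cycles: instruction count scaled by the register-pressure penalty."""
    return instruction_count / relative_throughput(live_fp32_regs)

# Schedule A: the "traditional" optimisation - hoist common subexpressions
# and keep them live in temporaries. Fewer instructions, more registers.
print(estimated_cost(instruction_count=20, live_fp32_regs=6))   # 60.0

# Schedule B: recompute values instead of keeping them live. More
# instructions, fewer registers - and it wins under this model.
print(estimated_cost(instruction_count=26, live_fp32_regs=2))   # 26.0
```

Under assumptions like that, the compiler should be spending instructions first and registers last, which is the opposite of what most optimisers are tuned to do.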
 
I thought PS3/VS3 was in DX9 as it stands.
 
As much love as the R300 gets, I still don't think it gets enuf. In retrospect, I think it is going to be rated up there with Voodoo 1 and TNT as a sea-change moment in the history of the business. All this "what's wrong with NV?" stuff really boils down to about 80% of the answer being "ATI". I think NV never believed in their hearts that ATI could do something like that, and got too focused on benchmarking their future products against their past products. I feel 99% sure that when they first sat down to talk about what the 5800 was going to have to be able to do, there was *never* a moment when one NV guy turned to another and said "Of course it will have to be 3x as fast as the current generation in AA to be competitive".

Remember when people (pre-R8500 days) sat around and talked about how ATI's offering would have to be at least 20% faster than NV's current king-of-the-hill for people to give them a chance and try them? Remember how, about the same time, some NV rep snickered publicly that it was fine with them that ATI was crowing about how the R8500 would be faster than a GF3?

So ATI learned their lesson that time, and really delivered a piece of hardware that stunned NV. Good-oh on them. NV is still catching up.
 
bloodbob said:
It wouldn't have to have FP16, only FP32.

Nvidia are trying to raise the minimum precision requirement to FP32 because it is VERY likely that the R420 will still only have FP24 precision; therefore Nvidia can advertise that they are the only DX9.1-compliant product.

Aside from the fact that there has yet to be any proof of any kind that there will be a Microsoft DirectX 9.1, counting the fact that Microsoft DirectX 9.0b already covers Pixel and Vertex Shaders 3.0 as well as 32-bit precision (although it is not a required part of the spec while 24-bit is), and tossing onto that the fact that the Microsoft DirectX team has stated publicly that there will be no updates to the DirectX standard until the time of Longhorn...


Makes me wonder, Bloodbob, where exactly you're pulling this out of?

Is it from notes from some Application Specific Shaders that indicate this?

So what if it's likely that the R400 (I'm still trying to figure out why you people keep calling it R420 when that completely breaks naming traditions and goes against some of the internal ATi documents; and now back to the point) carries 24-bit FP precision. Is this going to make a realistic quality difference that we can see? On the current processes of the day, that's a big no.

Nvidia can claim whatever they want, but only when the game scores start rolling in will it mean anything. Right now, it means absolutely nothing.

The Nv3x series can do FP 32-bit precision. But we all know that it runs like Stephen Hawking (or was that Halo that runs like Stephen Hawking?). Unless Nvidia can pull something out of their Application Specific Shaders... Nv4x is still going to lose, possibly even to the R350, in pure DirectX 9-specified shaders.


Now, to chew out Scotty or whatever his name is, who can't tell what a DirectX title is. Yo, kid. You just love throwing Halo around like it's some kind of DirectX title.

REALITY CHECK PLEASE!

Halo is a DirectX 8.05 title to begin with. The GPU found in the X-box is basically a Geforce3 with Geforce4 shaders and some DX 8.1 technology (never seen in the GF3-4 releases) thrown in. In relative power terms, the Nv2A is a little less powerful than a Geforce4 Ti 4200.

The main CPU is also a /CELERON/, not a Pentium III as Microsoft so loves to claim.

If Gearbox had done the job they were supposed to, anybody with a 733 MHz Pentium III (a real one) and a Geforce4 Ti 4200 should be able to run Halo just as well as the X-box, albeit losing maybe one or two shaders in the transfer. Anybody with a 733 MHz Pentium III (a real one) and a Radeon 8500 should be able to run Halo exactly as the X-box would run it.

Plain and simple, they can't. Systems that meet and beat the X-box part for part are fully incapable of running Halo. Even today's leading graphics cards, which dwarf the X-box GPU by an order of magnitude if not more, are barely able to break an average of 30 frames per second when playing the PC version.

I should know. I had to turn it all the way down to the X-box resolution and lock it at 30 fps on a ~2 GHz Athlon XP, 1 GB DDR333, Radeon 9800 Pro platform in order to be able to play the game.


The fact is this: Halo PC is not representative of any DirectX 8, 8.1, or 9 titles. It is a mess of jacked-up coding, and Gearbox should be ashamed of having released it upon the PC market.

Okay, rant over.
 
LOL that was quite humorous in light of the previous arguments.

Edit: to elaborate, I am simply saying you should try to read things completely.


For example, here is a post about Halo I made:
Sxotty said:
This is a major problem, it really is; don't blame him for having a crappy system. I wish we could demand engines that are efficient. HALO is horrible, and so is Morrowind, btw. If engines were as efficient as Q3 was, think what you could have for visuals on a current-gen machine at like 40 fps; it would be amazing.

If we keep getting more powerful hardware and companies simply keep providing shoddier and shoddier engines, then we will end up with nothing to show for all our fancy-pants hardware... yeah

That was where I was discussing how poorly HALO was done.


Sxotty said:
Well, if HALO is anything to go by, then the R300 is not going to run DX9 applications very well at all (of course, HALO isn't really DX9). Once real DX9 games come out, ...

The argument was that I was saying HALO is not DX9; it just has a few features thrown in...
 