New demo based on the graphics engine I'm working on

I shouldn't be getting into this, but...

The distinction that Rev was trying to make is whether Humus's demo is just a hobbyist programming venture into the Radeon 8500, or whether it's a general-purpose 3D engine. If it's the former, then cool beans! If it's the latter, well, then obviously things are... not so good. Right now NVIDIA is the baseline platform. Add to that the fact that (all religious debates aside) the GF3/GF4 and the 8500 are 90-95% similar cards and you should be able to see the problem. Again, assuming this is supposed to be more than a Radeon tech demo.
 
On 2002-02-14 18:34, Doomtrooper wrote:
I happen to disagree GREATLY with the thinking that you need to support OLD hardware... what's the point of buying new graphics cards if game developers continue to always code for hardware that is two years old?

Have you ever thought about the possibility that games are being produced to earn money? Therefore, they have to run on the hardware people are actually using... not just on the latest and greatest.
 
So when is the 'baseline platform' gonna change, Johnny? It certainly won't while coders CONTINUE to use only one brand all the time and ignore the features Brand X brings. ATI cards are CERTAINLY not a minority in the world; this "NVIDIA is the baseline platform" line is getting old real quick. There are two major players, so let's start playing this game on a 50/50 basis.
 
On 2002-02-14 18:34, Doomtrooper wrote:
I happen to disagree GREATLY with the thinking that you need to support OLD hardware... what's the point of buying new graphics cards if game developers continue to always code for hardware that is two years old?

Of course game developers can't target hardware that's only one year old. Too few would buy the game in the first place, and how the hell are developers going to build an engine on elusive hardware that's still a year away (given that a game takes at least two years to develop)?

Doom III is probably as close to an advanced engine [for its time] as you'll get. Why on Earth are you not happy that it takes advantage of your beloved PS 1.4?

I guess that I just don't get your agenda. More support for PS 1.4 is nice and all, but should gamers without an ATI 8500 suffer in this Holy Quest?

Regards, LeStoffer
 
On 2002-02-14 18:51, EgonOlsen wrote:
On 2002-02-14 18:34, Doomtrooper wrote:
I happen to disagree GREATLY with the thinking that you need to support OLD hardware... what's the point of buying new graphics cards if game developers continue to always code for hardware that is two years old?

Have you ever thought about the possibility that games are being produced to earn money? Therefore, they have to run on the hardware people are actually using... not just on the latest and greatest.

If a game developer started making a GAME right now using the latest and greatest features TODAY, then after the usual two-year development cycle a budget card (a GeForce 5 MX or Radeon 8800 LE) would run the features we consider high-end today like child's play...
 
On 2002-02-14 18:56, LeStoffer wrote:
On 2002-02-14 18:34, Doomtrooper wrote:
I happen to disagree GREATLY with the thinking that you need to support OLD hardware... what's the point of buying new graphics cards if game developers continue to always code for hardware that is two years old?

Of course game developers can't target hardware that's only one year old. Too few would buy the game in the first place, and how the hell are developers going to build an engine on elusive hardware that's still a year away (given that a game takes at least two years to develop)?

Doom III is probably as close to an advanced engine [for its time] as you'll get. Why on Earth are you not happy that it takes advantage of your beloved PS 1.4?

I guess that I just don't get your agenda. More support for PS 1.4 is nice and all, but should gamers without an ATI 8500 suffer in this Holy Quest?

Regards, LeStoffer

Just like GF2 owners couldn't see Nature, and just like Radeon owners couldn't see DroneZ with bump mapping, sacrifices are made all the time to accommodate one card.
Sorry if you can't understand that people with OTHER brands of video cards would like to see some of their cards' features too.

[ This Message was edited by: Doomtrooper on 2002-02-14 19:05 ]

Calm down, DM. Wavey

[ This Message was edited by: DaveBaumann on 2002-02-14 19:11 ]
 
Still, a commercial engine should run on a GF3, don't you think? I too want to see the features of new hardware supported by 3D engines, but not at the price of always having to buy the latest and greatest hardware available.

Besides, it's not true that ATI features don't get supported by game developers. There are already a few games that use TruForm, and more are coming.
 
On 2002-02-14 19:07, Xmas wrote:
Still, a commercial engine should run on a GF3, don't you think? I too want to see the features of new hardware supported by 3D engines, but not at the price of always having to buy the latest and greatest hardware available.

Besides, it's not true that ATI features don't get supported by game developers. There are already a few games that use TruForm, and more are coming.

A commercial engine will first have a lot more money and backing, and probably more than one programmer, Xmas. Humus stated some of these effects could be done with fallback options, so sure... the idea here is that the engine was built with the advanced features enabled.
I do agree, Xmas, that commercial engines need to run on generic hardware, but there should be a PENALTY for fallbacks, visually and most likely in performance too.

Consumer #1 goes out and buys Unreal 2 and a GeForce 8 at a cost of $600.

Consumer #2 goes out and buys Unreal 2 and a Trident Blade Ultra 2 at a cost of $60.

They both run the game and it looks the same, just playing a little faster on the GeForce 8... I disagree with that scenario.
 
Humus: Sorry for taking this so OT...

Doomtrooper: Two wrongs don't make a right. IMHO John Carmack is doing the only right thing, and that is to make different optimizations for different cards (PS 1.4 is great and thus should be treated with its own code path). All I'm trying to say is this: don't favor one vendor over the other!

Regards, LeStoffer

PS: Sorry Humus, I have a [stone-aged] GeForce DDR and thus can't help you out! ;)
 
On 2002-02-14 19:24, LeStoffer wrote:
Humus: Sorry for taking this so OT...

Doomtrooper: Two wrongs don't make a right. IMHO John Carmack is doing the only right thing, and that is to make different optimizations for different cards (PS 1.4 is great and thus should be treated with its own code path). All I'm trying to say is this: don't favor one vendor over the other!

Regards, LeStoffer

PS: Sorry Humus, I have a [stone-aged] GeForce DDR and thus can't help you out! ;)

ALL I'm trying to say is the same thing, and reality shows it's never been that way yet. One vendor is favored over the other; in fact, it's been mentioned here 20 times now by different people... 50/50, not 90/10, would be nice.
If Humus had never built this engine using advanced effects, I probably would never have seen these features used on my card... for that, THX Humus, it looks great.
 
A commercial engine will first have a lot more money and backing, and probably more than one programmer, Xmas.
Sure. Rev made this distinction, so why are you so upset about it?
 
Originally posted by Humus:
Well, I honestly do not know, since I don't have a GF3/4 card to test on, but since some guys at opengl.org complained recently about 3D textures being broken in several 2x.xx drivers, I figured it might be that. I'm not calling any fragment shader routines if they're not supported, though; I have tested it with a Radeon 7500 and it works there too. A GF3/4 should take the same path as the Radeon 7500.
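(The capability check Humus describes, skipping the fragment shader routines when the extension isn't advertised, usually boils down to scanning the extension string for a whole-token match. Here's a minimal sketch in C; `has_extension` is an illustrative helper, not code from the actual engine, and in a real program the string would come from `glGetString(GL_EXTENSIONS)`:)

```c
#include <string.h>

/* Return 1 if `name` appears as a complete, space-delimited token in the
 * extension string. A bare strstr() is not enough: it would also match
 * when `name` is merely a prefix of some longer extension name. */
static int has_extension(const char *extensions, const char *name)
{
    size_t len = strlen(name);
    const char *p = extensions;

    while ((p = strstr(p, name)) != NULL) {
        /* token must start at the string start or after a space... */
        int starts_ok = (p == extensions) || (p[-1] == ' ');
        /* ...and end at a space or at the end of the string */
        int ends_ok = (p[len] == ' ') || (p[len] == '\0');
        if (starts_ok && ends_ok)
            return 1;
        p += len;
    }
    return 0;
}
```

With a check like this an engine can pick the fragment-shader path where it exists and quietly take the plain multitexture path on a GF3 or Radeon 7500.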

Then I'd recommend you double-check to make sure that you aren't accidentally putting the driver in an undefined state (requesting MIP mapping and not uploading all MIP levels, for example).

Sometimes drivers handle this type of application bug gracefully, and other times the application just crashes.
 
On 2002-02-14 19:45, Xmas wrote:
A commercial engine will first have a lot more money and backing, and probably more than one programmer, Xmas.
Sure. Rev made this distinction, so why are you so upset about it?


On 2002-02-14 16:39, Reverend wrote:
Crashed on a Win98SE (256MB)/GF3 (64MB) with official 23.11 drivers.

If it is because of the GF3 and/or its drivers, it's not a good engine... not now, anyway.

Pretty hard to judge if you can't see it.
 
On 2002-02-14 18:34, Doomtrooper wrote:

Pretty bold statement to make without even seeing it, IMO. I happen to disagree GREATLY with the thinking that you need to support OLD hardware... what's the point of buying new graphics cards if game developers continue to always code for hardware that is two years old?

Since when was the GF3 "old" hardware? This engine doesn't run on my GF3. That makes the engine useless for 97% of all users. But I am sure it's just a bug in the engine.
 
Try about 20%: a recent survey of Counter-Strike players had GF MXs, GF2s, and Radeons leading with 80% of the votes; the last 20% was all the high-end cards.

This is based on a game that has 20,000 players a night online.
 
Maybe I can sum up all of the points being made in a way that doesn't offend anyone. ;)

1) There's nothing wrong with using a Radeon 8500 as the primary development platform. In fact, it can be a good thing, as Humus is likely pushing the envelope as far as technology goes, driving the industry forward.

2) If Humus' app is eventually intended to be an actual game engine, then at some point "proper" support for GeForce3-class hardware should be implemented. "Proper" doesn't mean "dumbing" the engine down to the "common denominator". It does mean trying to implement all of the effects of the "Radeon" engine. It may or may not mean that compromises in the quality or features of the "GeForce" engine need to be made for performance purposes, or for the sake of development time.

Hmmm....did that work? :smile:

[ This Message was edited by: Joe DeFuria on 2002-02-14 21:10 ]
 
This sounds good to me. Humus did not develop a fallback option because he does not have a GF3 to play with.

I think we need a new fund to buy a GF3 Ti200 for him (about $150 with H&S). :LOL:

Anyway, nice work Humus, and when you have a tested GF3 version I will download it ;)

[ This Message was edited by: pascal on 2002-02-14 21:07 ]

 
If Humus had never built this engine using advanced effects, I probably would never have seen these features used on my card... for that, THX Humus, it looks great.

ATI tech demos?
 
On 2002-02-14 20:18, Doomtrooper wrote:
Pretty hard to judge if you can't see it.

Maybe you didn't see this:
On 2002-02-14 17:34, Reverend wrote:
Joe, there's a smiley (and a winky) in my post.

Seriously though, anyone who aims to *make money* with anything (a game engine, specific graphics code, specific sound code) would definitely want to ensure their thing runs on a GF3. If it requires a "fallback", mention it as such with explanations, and so be it.

Humus may be doing something as a matter of pure "hobby" and "self-interest". But the minute he shows it to any developer or publisher, he won't, IMO, get anywhere, and as nice as it may look, his hard work will be either wasted or simply a matter of self-satisfaction.

You can't fight the market if you want to make money. If you don't want to make money and simply want to show, er, "the future", then that's fine.
 