Pixel or Vertex more important, looking forward?

Vertex or Pixel Shading Prowess of greater importance?

  • Pixel Shading
  • Balance between the two
  • or, like me, have no clue

  Total voters: 232
Ailuros said:
hovz said:
God, this is going to be a test of patience... who gives a fuck about trilinear ops... that has nothing to do with this.

I'm already showing more patience than I usually do, and no, I don't agree with you in many departments. Do you really need that attitude? You're not going to force your personal opinion on me at the end of the day.

Look. Would you rather play a game like Doom 3 at 1076 res with no AA or AF, or a game like Quake 3 at 1600 with high AA and AF? Does that make any sense to you?

Yes, it does make sense. Today I play Doom3 at 1280*1024 with AA/AF and Q3A at 2048*1536 with AA/AF. In fact, there isn't a single game I've played in the past few years at a resolution lower than 1152*864*32, and apart from some corner cases where MSAA wouldn't work, there was always a pinch of AA/AF added to the mix.

Obviously I didn't mean sit completely idle, but if you think any games on the old-ass Unreal Tournament 2003/4 engine are stressing these cards at all, you need help.

It still remains the best multiplayer FPS I have on my system right now, with a very lively and active 3rd-party mapping community. It was a perfectly feasible point, since according to you VS units sit entirely idle; actually they don't, because many games out there have T&L-optimized code these days.

If anything, you have 4 VS units @ 380MHz on your R350, delivering far more vertex throughput than a 3.0GHz P4.

I realize we don't have the power to totally phase out textures, but we have the power to do a lot more than we are now.

I still don't see the hardware available for that right now. I'd love to have far more demanding games available, but I'd also love to have the corresponding hardware too.

Your point about DX9-class cards is exactly what I mean. Developers code with the lowest common denominator in mind as the base of their engine, and just add higher-res textures and a few effects to scale up. That is NOT a scalable engine. A scalable engine would scale down the shaders, polygon complexity, etc., not just blur the textures and lower the res. How can you expect graphics to ever advance with the mentality of current developers?

You won't get any shader functionality out of a GF4MX today if you play FarCry or any other game containing shaders. As for textures, here's another reason why we need both more advanced hardware and an underlying API with dynamic on-chip LOD, to tax the CPU even less. That's another advantage of a Geometry Shader and/or PPP in WGF.


My comment about high poly was directed at everyone who says high poly isn't the wave of the future. Totally retarded train of thought.

I didn't see anyone even hinting at anything close to that; rather the contrary.

Yes, change is gradual, but there has been almost zero change in the last 4 years.

Almost zero is a good one, especially considering how many polygons games averaged in 2000 and how they looked overall.

If I had a reason to complain, it would be the marginal advancement in gameplay and original ideas in that department, but that's entirely OT.

I really don't care about GeForce 4 MX owners. You don't see PS1 owners crying because they can't play PS2 games, do you? If the hardware can't support shaders, they should be replaced with older rendering methods. That's part of the scalability.

I don't care if UT2k4 is a fun game to play, that's irrelevant. It still barely pushes the hardware. What are the PS units doing while running the game? Almost nothing?

My Doom 3 and Quake 3 question obviously went over your head. I fail to see how you missed it, though.

The current API is fine for more advanced graphics. Look what 3DMark03 did. They put all the vertex skinning on the video card and made use of the vertex shaders. The CPU is almost completely free for physics and AI without worrying about skinning every character. There isn't a game that has approached the graphical quality of the final 3 game tests. Doom 3 looks like a moderately downgraded Battle of Proxycon.

Let's talk about games of 2000 compared to games of today. Let's take two of the best-looking games, Sacrifice and Giants. Let's compare them to UT2k4, Painkiller, Far Cry. What's changed since then? Higher-res textures, more textures, a few effects? A modest increase in polygon complexity? Wow, in 5 years that's definitely a lot of progress.

Answer me this. If we are taking advantage of such powerful hardware, why is it that games developed on a GeForce 3 + Xbox system with a Pentium 733 look as good as most of our current games, just at a lower res with no AA or AF? Pretty sad that our newest games have less model and world complexity to them than games on a GeForce 3 + Xbox. Don't even get me started on animation. We just try to cover it up with high-res textures everywhere and add on lots of AA and AF. That is a waste of a 500-dollar video card.
 
Let's compare Halo 2 and UT2k4. I've tried to pick the more impressive shots of each game.

halo 2
http://www.gamespot.com/xbox/action/halo2/screens.html?page=84
http://media.xbox.ign.com/media/482/482228/img_2371770.html
http://media.xbox.ign.com/media/482/482228/img_2342625.html
http://media.xbox.ign.com/media/482/482228/img_2122195.html
http://media.xbox.ign.com/media/482/482228/img_2122197.html

ut2k4
http://media.pc.ign.com/media/566/566925/img_2026957.html
http://media.pc.ign.com/media/566/566925/img_2026936.html
http://media.pc.ign.com/media/566/566925/img_2026927.html
http://media.pc.ign.com/media/566/566925/img_2026928.html
http://media.pc.ign.com/media/566/566925/img_2026920.html

Sadly, those were some of the best UT2k4 shots I could find.

Even at the low res of the Xbox there's no comparison. It looks much, MUCH better. Now give it the high-res textures of a PC game, up the resolution to 1027, and add some AA and AF, and it's like comparing Dreamcast to Nintendo 64.
 
hovz, you're right that at some point we need to stop supporting older cards.

But what's the time limit? Should we only support 2 generations, 4 gens, 2 levels of DX?

I dunno. But you're right. Doom3 isn't great looking and I honestly don't think it pushes current hardware very hard.
 
jvd, it's not that we support older cards, it's that the game engine is designed around their limitations.
 
hovz said:
jvd, it's not that we support older cards, it's that the game engine is designed around their limitations.

Well, it's a double-edged sword.

If you make "jvd strikes back" for a 2005 release and it's based around R520 and NV4x tech, then what do those people with DX8 cards do? It will cost a lot more money to downgrade parts of the engine or make fallback paths for older cards than it would to just have the engine add more polygons or higher-res textures on the newer cards.
 
Ailuros said:
I doubt that the majority out there has dx9.0 class accelerators even today.
It's getting there slowly though. According to the latest Steam survey (Scali, take a look at how many Radeon 7500s there are), almost half of the Steam users who've logged on since Thursday (about the best sample of 'gamers' we have access to) have a DX9 card, or a little over a quarter if you don't want to count the FX line.
hovz said:
Let's compare Halo 2 and UT2k4. I've tried to pick the more impressive shots of each game.
So what you're demonstrating is that Xbox titles concentrate on shaders because the developers know the entire install-base supports them, and they're the best way of compensating for the low texture detail, woeful polycounts, tiny maps, and so on? It's in a PC game developer's best interests to make a game look as good as possible on the widest range of hardware possible, which means first and foremost catering to the low end. And how do you best cater to the low end? You design the game around the low end. It doesn't matter how flash the game looks at max settings or how well it utilises the latest Bitchin' Fast 3D 2000; if a cheap card can't produce something pretty close to the screenshots on the box, you start to piss people off and alienate your audience, and that's bad for business. It'd be great if developers made games that pushed the latest hardware, then top-end cards might really be worth investing in, but it just isn't economically feasible for them to do so.

Not to mention that UT2004 is about the worst example (next to Call of Duty perhaps) of current PC graphics that you could pick, and Halo 2 is the Xbox poster-boy; dare I say that was intentional? Hovz vs. Sweeney, Round 3.
 
jvd said:
hovz said:
jvd, it's not that we support older cards, it's that the game engine is designed around their limitations.

Well, it's a double-edged sword.

If you make "jvd strikes back" for a 2005 release and it's based around R520 and NV4x tech, then what do those people with DX8 cards do? It will cost a lot more money to downgrade parts of the engine or make fallback paths for older cards than it would to just have the engine add more polygons or higher-res textures on the newer cards.

Of course it will cost more money. On the other hand, we could just keep releasing cookie-cutter titles and console ports.
 
I think a point that is overlooked is that while there are various options for texture resolution and various pixelshader/fixedfunction paths, the same road is generally not traveled for geometry.
Like a set of lowres and highres textures, a game could also include a set of lowres and highres meshes, catering for various levels of 3d hardware.
With a modern game like Doom3 this takes very little effort, since the normalmapping already requires the artist to derive a lowres model from the original. He would just have to create two or more resolutions then. The tools for this are already available.
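To make the mesh-set idea concrete, here's a minimal sketch (the struct, tier numbers, and coverage thresholds are all hypothetical, purely to illustrate picking a pre-authored mesh resolution per hardware tier and on-screen size, not any engine's actual code):

```cpp
#include <vector>
#include <cstddef>

// Hypothetical mesh resource: one logical model stored at several
// pre-authored polygon resolutions (e.g. exported from the same
// high-poly source the normal map was baked from).
struct MeshLOD {
    int triangleCount;           // descending detail: [0] = highest
    // ... vertex/index buffers would live here ...
};

struct Model {
    std::vector<MeshLOD> lods;   // e.g. 12000, 4000, 1200 triangles
};

// Pick a LOD index from a coarse hardware tier chosen at startup
// (0 = high-end, 1 = mid, 2 = low-end) and the object's approximate
// on-screen coverage, so distant objects drop detail on any hardware.
std::size_t selectLOD(const Model& m, int hardwareTier, float screenCoverage)
{
    std::size_t lod = static_cast<std::size_t>(hardwareTier);
    if (screenCoverage < 0.10f) ++lod;    // small on screen: drop a level
    if (screenCoverage < 0.02f) ++lod;    // tiny on screen: drop another
    if (lod >= m.lods.size()) lod = m.lods.size() - 1;
    return lod;
}
```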
 
Fodder said:
hovz said:
Let's compare Halo 2 and UT2k4. I've tried to pick the more impressive shots of each game.
So what you're demonstrating is that Xbox titles concentrate on shaders because the developers know the entire install-base supports them, and they're the best way of compensating for the low texture detail, woeful polycounts, tiny maps, and so on? It's in a PC game developer's best interests to make a game look as good as possible on the widest range of hardware possible, which means first and foremost catering to the low end. And how do you best cater to the low end? You design the game around the low end. It doesn't matter how flash the game looks at max settings or how well it utilises the latest Bitchin' Fast 3D 2000; if a cheap card can't produce something pretty close to the screenshots on the box, you start to piss people off and alienate your audience, and that's bad for business.

Not to mention that UT2004 is about the worst example (next to Call of Duty perhaps) of current PC graphics that you could pick, and Halo 2 is the Xbox poster-boy; dare I say that was intentional? Hovz vs. Sweeney, Round 3.

What I'm demonstrating is how much better a game on outdated-ass hardware looks than games on our most powerful hardware.

How is it alienating their users? Nowhere on a GeForce MX box does it say you will get the highest graphical experience. Does Sony alienate their users when they make games for PS2 that look better than PS1? What about Microsoft when they make a new version of Windows that REQUIRES a faster system to run properly?
 
Scali said:
I think a point that is overlooked is that while there are various options for texture resolution and various pixelshader/fixedfunction paths, the same road is generally not traveled for geometry.
Like a set of lowres and highres textures, a game could also include a set of lowres and highres meshes, catering for various levels of 3d hardware.
With a modern game like Doom3 this takes very little effort, since the normalmapping already requires the artist to derive a lowres model from the original. He would just have to create two or more resolutions then. The tools for this are already available.

Scalability to me would be downgrading the complexity of the world and models, downgrading or substituting shaders, etc. Let's take Doom 3 for example. Everyone seems to be praising it for how technically advanced it is. The difference between this game on an NV40 and an NV20? Resolution, texture quality, and some effects. Even at the highest possible settings the game doesn't look that good. Everything is extremely low-poly, the textures are ugly, the particles look horrible, the specularity maps look awful. The shadows look good because they help to hide all of this. The flashlight effect is way off and doesn't even look like it's illuminating anything; rather it looks like you're just seeing only part of whatever you shine it on.

The fact that the Xbox port is basically the same with lower-resolution textures only proves my point.
 
hovz said:
What I'm demonstrating is how much better a game on outdated-ass hardware looks than games on our most powerful hardware.
But all it has going for it is liberal use of shaders? In every other respect it trails far behind UT2004, which is in itself fairly unimpressive? And what if you compare it to something like Far Cry, HL2 or Doom rather than a glorified expansion pack for a 2-year-old game on a 3-year-old engine? There's simply no contest.
hovz said:
how is it alienating their users?
What cards do the majority of gamers have? Shit ones. If you want your game to sell, which cards will it have to run on? Shit ones. Can you afford to make one game for the shit cards and one for the good cards? Odds are, no. Is it feasible to take the 'shit card game' and jack up some areas? Yes. Is it feasible to take the 'good card game' and scale down some areas? No.
 
I really don't care about GeForce 4 MX owners.

Obviously developers do, and the simple reason is that they still want to earn some money from their projects; otherwise they'd be running a charity after all. And yes, the high-end market is a small minority out there.

If the hardware can't support shaders, they should be replaced with older rendering methods. That's part of the scalability.

Developers aren't too fond of too many or too complicated rendering paths, simply because they cost too many resources.

My Doom 3 and Quake 3 question obviously went over your head. I fail to see how you missed it, though.

I answered your question; I just didn't give you the answer you wanted to hear and frankly it isn't going to happen either.

The current API is fine for more advanced graphics. Look what 3DMark03 did. They put all the vertex skinning on the video card and made use of the vertex shaders.

No, it isn't "fine". A simple glance at future APIs is a real eye-opener to those willing to keep an open mind. What exactly was so miraculous about 3DMark03 after all? They put an unnecessary load of vertex calculations into game tests 2 & 3 in order to emulate a Doom3-like case, which never occurs, and most likely never will occur, in any game using that engine. It's basically skinning every object multiple times in both tests. That's supposed to be an advantage? If it is, then I really wouldn't know what redundancy stands for.

The CPU is almost completely free for physics and AI without worrying about skinning every character. There isn't a game that has approached the graphical quality of the final 3 game tests. Doom 3 looks like a moderately downgraded Battle of Proxycon.

Skinning each object only once per frame would have been enough, and that's how Doom3 handles things. Removing as much redundancy as possible from game code is an advantage, not a disadvantage. I don't like Doom3 much, but that's for other reasons.
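To put the redundancy argument in concrete terms, here is a rough sketch of what matrix-palette skinning costs (plain C++ with hypothetical types; this is neither Doom3's nor Futuremark's actual code): each vertex needs a few matrix transforms and blends, so skinning once per frame and reusing the result across lighting/shadow passes is far cheaper than re-skinning the same mesh for every pass.

```cpp
#include <vector>
#include <cstddef>

// Minimal 4x3 bone matrix and vertex types for illustration.
struct Mat4x3 { float m[12]; };
struct Vec3   { float x, y, z; };

struct SkinnedVertex {
    Vec3  position;
    int   bone[4];     // indices into the bone palette
    float weight[4];   // blend weights, assumed to sum to 1
};

Vec3 transform(const Mat4x3& b, const Vec3& p)
{
    return { b.m[0]*p.x + b.m[1]*p.y + b.m[2] *p.z + b.m[3],
             b.m[4]*p.x + b.m[5]*p.y + b.m[6] *p.z + b.m[7],
             b.m[8]*p.x + b.m[9]*p.y + b.m[10]*p.z + b.m[11] };
}

// Skin the whole mesh ONCE per frame; every lighting/shadow pass then
// reuses 'out'. Re-running this per pass multiplies the per-vertex cost
// by the number of passes without changing the image.
void skinMesh(const std::vector<SkinnedVertex>& in,
              const std::vector<Mat4x3>& bonePalette,
              std::vector<Vec3>& out)
{
    out.resize(in.size());
    for (std::size_t i = 0; i < in.size(); ++i) {
        Vec3 acc{0.0f, 0.0f, 0.0f};
        for (int j = 0; j < 4; ++j) {
            const Vec3 p = transform(bonePalette[in[i].bone[j]], in[i].position);
            acc.x += p.x * in[i].weight[j];
            acc.y += p.y * in[i].weight[j];
            acc.z += p.z * in[i].weight[j];
        }
        out[i] = acc;
    }
}
```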

Answer me this. If we are taking advantage of such powerful hardware, why is it that games developed on a GeForce 3 + Xbox system with a Pentium 733 look as good as most of our current games, just at a lower res with no AA or AF? Pretty sad that our newest games have less model and world complexity to them than games on a GeForce 3 + Xbox. Don't even get me started on animation. We just try to cover it up with high-res textures everywhere and add on lots of AA and AF. That is a waste of a 500-dollar video card.

I'm not sure how long you can keep up that little crusade of yours, but in between all the lopsided arguments, half-truths and inaccuracies I really don't see much reason for so much passion either. I am not a developer. Get in touch with one and ask him the very same questions you're asking me. Even better, you could consult ISVs on how to run their business, or developers on how to create their games.

If you think that Xbox games look better than PC games, then that's fine by me. Buy a next-generation console and call it a day.

Sadly, those were some of the best UT2k4 shots I could find.

Do you really think I need an illustration of the game?

Even at the low res of the Xbox there's no comparison. It looks much, MUCH better.

To you, but not to me. Try as you might, you're not going to force either your opinion or your preferences down my throat.

Just for the record it's 1024*768.
 
Scali said:
Like a set of lowres and highres textures, a game could also include a set of lowres and highres meshes, catering for various levels of 3d hardware.
This does have problems with normal mapping, specifically that tangent-space normal mapping (required for most compression schemes, and often best for animated geometry) is invalid if the model is changed.

But still, games have had LOD for geometry for quite a long time. As an example, the Unreal engine has supported geometry LOD since around the days of the original UT. City of Heroes is another game that supports geometry LOD. It's just that it's typically used mostly to scale back the geometry of objects that are far away, not so much to reduce polycount on low-end hardware (however, it would pretty much do that automatically anyway, since lower resolution would naturally mean lower geometry LOD).
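As a side note on why the tangent-space caveat matters (a generic sketch, not any engine's code): the tangent frame at each vertex is derived from the mesh's own positions and UVs, so a normal map baked against one mesh's tangent basis won't line up with the basis of a lower- or higher-poly version of the same model.

```cpp
#include <cmath>

struct Vec3 { float x, y, z; };
struct Vec2 { float u, v; };

// Per-triangle tangent direction derived from the triangle's own
// positions and UVs (the standard construction). Because the result
// depends on this specific geometry and unwrap, swapping in a mesh of
// different resolution changes the tangent frames and invalidates a
// normal map baked against the original.
Vec3 triangleTangent(const Vec3& p0, const Vec3& p1, const Vec3& p2,
                     const Vec2& t0, const Vec2& t1, const Vec2& t2)
{
    const Vec3 e1{p1.x - p0.x, p1.y - p0.y, p1.z - p0.z};
    const Vec3 e2{p2.x - p0.x, p2.y - p0.y, p2.z - p0.z};
    const float du1 = t1.u - t0.u, dv1 = t1.v - t0.v;
    const float du2 = t2.u - t0.u, dv2 = t2.v - t0.v;

    const float det = du1 * dv2 - du2 * dv1;
    const float r   = (std::fabs(det) > 1e-8f) ? 1.0f / det : 0.0f;

    return { (dv2 * e1.x - dv1 * e2.x) * r,
             (dv2 * e1.y - dv1 * e2.y) * r,
             (dv2 * e1.z - dv1 * e2.z) * r };
}
```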
 
Scali said:
Well, I don't want to destroy the nice flamewar here... but I think both sides are saying pretty much the same, but approaching it from a different angle.

Nope, we aren't, and I'll elaborate...

I believe one side stresses the quality of textures and AA/AF to increase realism... and the other side stresses geometric detail.

Wrong again. I want both, not just one part increasing, preferably at the same time and gradually. Texture filtering and/or antialiasing alone will not bring any real progress, just as you won't come close to realism with higher geometry alone. What on earth happened to happy mediums?

But it seems that the others interpret it as trading texture/image quality for geometry.

Partially, yes. I just don't want the focus to fall on only one department. I would want a lot of things too, yet I'm also at least attempting to understand the natural restrictions and indirect constraints that sadly still exist in the graphics market, in order to come to more balanced conclusions.

However, if you look at the polycount today, and say, 4 years ago, it has pretty much stagnated, and that's rather sad. Agreed, the same polycount looks better today, because of the better textures and normalmaps and pixelshading, but still, the silhouettes are blocky, and animation is a tad limited. So yes, I would be happy if this area were explored, and texture quality and resolution remained at the current level for now.

The first really big game that contained pure T&L-optimised code was UT2k3. How many years after the introduction of T&L units on GPUs? It always takes time. I personally was quite disappointed by the almost non-existent DX8.1 shader support a couple of years back, while on the other hand hope started to rise again with the appearance of the R300 and developers quickly adopting even some DX9.0-class shaders.

It might sound foolish of me to put too much hope in WGF, yet its real target IMHO is to leave only AI and physics to the CPU, with the rest falling onto GPUs. On-chip adaptive tessellation isn't just a gimmick feature: dynamic LOD, geometry compression, stencil shadows calculated without the CPU, and the list could go on and on. I don't think early WGF hardware will bring a gigantic revolution either, yet at this point in time it at least sounds like a step in the right direction.

Another rather simplistic-sounding question: why are developers lately opting for parallax mapping instead of real displacement mapping via VS3.0? (1) It's cheaper to implement and probably delivers higher performance too, and (2) it can be realised on a much wider variety of accelerators out there.
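As a rough illustration of point (1) (a generic sketch of the basic parallax offset trick, not any particular game's shader; written as plain C++ although in practice it is a few per-pixel shader instructions):

```cpp
struct Vec2 { float x, y; };
struct Vec3 { float x, y, z; };

// Basic parallax mapping: instead of actually displacing geometry, shift
// the texture lookup along the (normalized, tangent-space) view direction
// by an amount proportional to the height sampled at the original
// coordinate. One extra texture fetch plus a couple of multiply-adds per
// pixel, versus real displacement mapping, which needs dense tessellation
// and vertex texture fetches (VS3.0-class hardware) to move actual vertices.
Vec2 parallaxOffset(Vec2 uv, Vec3 viewTS, float height, float scale, float bias)
{
    const float h = height * scale + bias;    // remap the [0,1] height sample
    return { uv.x + viewTS.x * h,
             uv.y + viewTS.y * h };
}
```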

As for high resolutions, for me they're currently only a side measure to reduce, partly in combination with other techniques, general aliasing and other annoying patterns. If we ever get as far as rendering games that look damn close to Pixar's Finding Nemo, for example, resolution would be the least of my concerns.
 
Chalnoth said:
Scali said:
Like a set of lowres and highres textures, a game could also include a set of lowres and highres meshes, catering for various levels of 3d hardware.
This does have problems with normal mapping, specifically that tangent-space normal mapping (required for most compression schemes, and often best for animated geometry) is invalid if the model is changed.

Plus, it would make multiplayer quite unfair what with per-poly collision detection. :LOL:
 
Ailuros said:
The current API is fine for more advanced graphics. Look what 3DMark03 did. They put all the vertex skinning on the video card and made use of the vertex shaders.

No, it isn't "fine". [...] They put an unnecessary load of vertex calculations into game tests 2 & 3 in order to emulate a Doom3-like case [...] It's basically skinning every object multiple times in both tests.

Skinning each object only once per frame would have been enough, and that's how Doom3 handles things. Removing as much redundancy as possible from game code is an advantage, not a disadvantage.

Doom 3 does skinning on the CPU, 3DMark on the GPU. That's my point. Skinning on the GPU is infinitely faster than on the CPU. Maybe then they wouldn't have shitty low-polygon worlds and characters.
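For what that difference amounts to on the host side, a rough sketch (the upload helpers are hypothetical stand-ins for whatever the graphics API provides, not real D3D/OpenGL calls): the CPU path re-blends and re-uploads the whole vertex buffer every frame, while the GPU path only uploads the bone palette and lets the vertex shader blend each vertex.

```cpp
#include <vector>
#include <cstddef>

struct Mat4x3 { float m[12]; };
struct Vertex { float pos[3]; int bone[4]; float weight[4]; };

// Hypothetical upload helpers standing in for the graphics API
// (dynamic vertex-buffer update / vertex-shader constant upload).
void uploadVertexBuffer(const void* /*data*/, std::size_t /*bytes*/) { /* API call */ }
void uploadBonePalette(const Mat4x3* /*bones*/, std::size_t /*count*/) { /* API call */ }

// CPU path: blend every vertex on the CPU each frame, then push the whole
// skinned mesh across the bus. Per-frame cost scales with vertex count.
void submitCpuSkinned(const std::vector<Vertex>& mesh,
                      const std::vector<Mat4x3>& bones,
                      std::vector<float>& skinnedPositions)
{
    (void)bones;
    skinnedPositions.resize(mesh.size() * 3);
    // ... per-vertex matrix blending as in the earlier sketch ...
    uploadVertexBuffer(skinnedPositions.data(),
                       skinnedPositions.size() * sizeof(float));
}

// GPU path: the bind-pose mesh already sits in video memory; per frame we
// only upload the bone palette (a few dozen matrices) and the vertex
// shader does the blending for every vertex.
void submitGpuSkinned(const std::vector<Mat4x3>& bones)
{
    uploadBonePalette(bones.data(), bones.size());
}
```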
 
Fodder said:
hovz said:
What I'm demonstrating is how much better a game on outdated-ass hardware looks than games on our most powerful hardware.
But all it has going for it is liberal use of shaders? In every other respect it trails far behind UT2004, which is in itself fairly unimpressive? And what if you compare it to something like Far Cry, HL2 or Doom rather than a glorified expansion pack for a 2-year-old game on a 3-year-old engine? There's simply no contest.
hovz said:
how is it alienating their users?
What cards do the majority of gamers have? Shit ones. If you want your game to sell, which cards will it have to run on? Shit ones. Can you afford to make one game for the shit cards and one for the good cards? Odds are, no. Is it feasible to take the 'shit card game' and jack up some areas? Yes. Is it feasible to take the 'good card game' and scale down some areas? No.

Halo 2 looks better than any of the current Unreal games in every aspect.
 