What is the problem with today's developers?

Come on, be fair. There are some great-looking games out at the moment. Dungeon Siege is one. I was most impressed when I fired that up for the first time. I haven't seen them yet, but I'm told Neverwinter Nights and Warcraft III are superb too. (Ah, I see pcchen doesn't rate NWN.)

Try Doom again; it looks pants compared to most of today's stuff.
 
Sure, Dungeon Siege looks good, but it requires at least a 1000 MHz processor with a decent video card to run.
Look at Diablo II: I was running an Athlon 1.4 GHz and a 64 MB Radeon card when it was released, and with 6 players casting spells it made my system look like a 386... bleh. Yet I ran Diablo 1 on a P166 and an ATI Xpert 98, and it ran well then too.

P166
Vs
Athlon 1.4

If anyone has played Madden 2002, I must say that game runs well on mild machines and looks amazing... to me, that is an example of a game that was coded to achieve optimum visuals and speed with current hardware.
 
Doomtrooper said:
Sure, Dungeon Siege looks good, but it requires at least a 1000 MHz processor with a decent video card to run.
Look at Diablo II: I was running an Athlon 1.4 GHz and a 64 MB Radeon card when it was released, and with 6 players casting spells it made my system look like a 386... bleh.

I've been thinking about starting up a new thread about this. Forget the rendering part of 3D for a moment. What about the rest of the 3D engine work, the stuff that resides on the host?

The host system roughly doubles in processing power every 18 months. Hmm. That only makes for a factor of ten in roughly 5 years. Not too impressive by graphics standards. It gets worse, though: some of the calculations it performs are O(n^2) rather than linear, and if those parts start to dominate, we only gain a factor of 3 in 5 years.
Add the trend that we want more interactive environments, and environments that are more "alive" for want of a better word, further increasing the load on the host. The outlook for the future starts to look comparatively grim.
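The arithmetic behind those two factors is easy to check; a quick sketch (plain Python, just the scaling math, nothing game-specific):

```python
import math

# Host CPU power doubles roughly every 18 months (the figure used above).
years = 5
speedup = 2 ** (years * 12 / 18)   # raw compute gain over 5 years, ~10x

# For an O(n^2) workload, the feasible problem size n only grows with
# the square root of the available compute.
n_growth = math.sqrt(speedup)      # ~3.2x in 5 years

print(f"raw compute gain over {years} years: {speedup:.1f}x")
print(f"O(n^2) problem-size gain:            {n_growth:.1f}x")
```

So even generous CPU scaling buys surprisingly little headroom once quadratic work (e.g. naive pairwise interaction tests) starts to dominate.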

How can we avoid getting bottlenecked by the host?

Entropy
 
Because not every game development company is an "id Software".

Because publishers demand that developers cut back on latest-and-greatest graphics so that the game will run on more machines, and with reasonable performance at that. And make money.

Developers themselves want to make money (re publisher influence above).

From a technical point of view:

Additional CPU power is sucked up by a lot of factors, including APIs, driver overheads, AI, sound, physics, etc. API and driver overheads are actually a consequence of the advanced technology being provided.
 
I think pcchen summed it up pretty well too. I also think there's another big factor that's not being taken into account here. Growing up as a kid, when playing games, you usually loved them all, or most of them. Every game you played was played with child-like wonder. We used our imagination where things were lacking, and generally it's much easier to have fun and be satisfied when playing games as a kid. Now that most of us are adults, it's hard to view games with child-like wonder anymore. Our criteria and expectations are very high, and we're let down easily.

I read a story a while back that made sense. It involves a movie critic who reviews movies for his job. With every movie he watched, he would always pick it apart and focus on the negative more than the positive. One night, he took his beautiful new girlfriend out to a movie on a date, and at the end, she asked how he liked it; he picked it apart and said how much it sucked. She felt bad because he didn't have a good time, and that kept being the case for future movies. After a while, he realized that he was ruining the experience for her, because she was enjoying the movies, yet he wasn't. Then he decided to change his perspective, to not pick movies apart, and to just try to enjoy them for what they were. In the end, he did .... and he got fired from his job because every movie he reviewed was great. :LOL:

So I decided to do that, and I've been enjoying a ton more games lately. I try to play them with some form of child-like wonder, and try to have fun. Sometimes that just can't be the case, and you end up playing a crud game, but that's just inevitable.

FWIW, the last game to wow me graphically is Morrowind. Gameplay-wise, that would have to go to Grand Theft Auto 3 for the PC. Right now, I'm enjoying NWN immensely, and Warcraft 3 too. To me, it's a great time to be a PC gamer. :)
 
For me, it was a graphical epiphany to see Quake 2 run for the first time, as the first game I loaded up on my Voodoo1. It's true: since those days of the first 3D-accelerated games, there simply hasn't been the same progress in software to match the advances in hardware. T&L never had its "killer app", and now we're waiting for shaders to be utilized.

I'd say Quake 3 was a significant step forward, both in appearance and technology, using significantly more complex geometry, higher-res textures, and more rendering passes. That was in 1999, and I honestly have yet to see it put to shame by any released PC game.

I think there's hope in Doom3 to push the envelope again, and even though we've gotten used to 1024x768 and up with AA, I think Carmack is onto something by aiming for lower resolutions with as much on-screen detail as is possible. I know he's said that he doesn't see all that much improvement beyond 640x480, and while I don't completely agree with that, this approach should utilize hardware to an extent that other developers just aren't trying to do.
 
GLQuake on a V1 was an eye-opener, but I honestly think seeing Tomb Raider a few months earlier was more impressive. Unreal was impressive, and timely enough to really show off SLI's ability to run smoothly at 1024x768 (who needed AA back then at such amazing resolutions? <g>). Morrowind would be impressive if its poor performance didn't kill so much of its potential immersivity (yes, I hate that word, too). A version of the UT 2003 demo I've seen is somewhat impressive, though some of the flora is too blocky. I'm basically not sure anything is going to visually wow me until Doom 3 is released, unless Thief 3 makes it out earlier and really pushes the Unreal Warfare engine.
 
Reverend said:
Because not every game development company is an "id Software".

Because publishers demand that developers cut back on latest-and-greatest graphics so that the game will run on more machines, and with reasonable performance at that. And make money.

Developers themselves want to make money (re publisher influence above).

From a technical point of view:

Additional CPU power is sucked up by a lot of factors, including APIs, driver overheads, AI, sound, physics, etc. API and driver overheads are actually a consequence of the advanced technology being provided.

I disagree here :)

WinXP is much more efficient than Win9x at memory management and task management. Sure, the GUI is more advanced and slightly more CPU intensive, but testing here, I installed XP on a PIII 333 and it runs as well as 9.x.
Sound is one I have a big issue with, as IMO Creative did a poor job with the original SB Live!: in testing on my old setup (Athlon 1.4) with UT, I was taking a 35% hit with EAX enabled. The entire idea behind the PCI sound card was to offload some CPU cycles to the sound card; I switched to a Turtle Beach Santa Cruz and now take maybe a 4% hit. So possibly hardware issues here. AI is my other complaint: some of the best AI so far was in MOHAA, yet it wasn't groundbreaking over Quake 2 (especially some of the mission packs), and that game ran on a P166.

I just have not seen the returns from some of the titles I expected, especially when you factor in:

16-bit operating systems to 32-bit
Bus speeds of 50, 60, 66, 75, 100, 133, 266, 533 MHz
Processor speeds over 2 GHz, up from as low as 40 MHz
Hard drive speeds from 4000 rpm to 10000 rpm, and transfer rates from 8 MB/s to 133 MB/s
Then video speeds, like the classic Voodoo2:

3Dfx Voodoo2
8 MB EDO DRAM
4MB Frame Buffer
8MB Texture Memory
25ns EDO DRAM (90MHz bus)


Ti4600
300MHz core clock
128-bit DDR memory bus running at 325MHz
10.4GB/s of raw memory bandwidth

I guess I expect too much :p
 
I think part of the issue is that the Voodoo architecture was actually a pretty smart/efficient implementation, and could handle some very visually impressive stuff. It took a while for the mainstream to push the limits of the Voodoo2/SLI, and only once the TNT2 was released was there anything faster than even a single V2. But around 1997 and '98, you had a few smart developers who knew what they were working with in current cards and knew that things would only get faster. id's Quake 2 and Epic's Unreal were certainly cutting-edge, forward-looking games, and other developers (and you might say Epic themselves with UT) have basically been riding on the level of technology introduced by these games for the past several years. I mean, when exactly are we going to ditch lightmaps? It's only been, what, 6 years now?

So with developers happy to spin new content onto engine technology of the Voodoo era, is it any wonder we aren't seeing much improvement?
 
Matt Burris said:
I think pcchen summed it up pretty well too. I also think there's another big factor that's not being taken into account here. Growing up as a kid, when playing games, you usually loved them all, or most of them. Every game you played was played with child-like wonder. We used our imagination where things were lacking, and generally it's much easier to have fun and be satisfied when playing games as a kid. Now that most of us are adults, it's hard to view games with child-like wonder anymore. Our criteria and expectations are very high, and we're let down easily.

I read a story a while back that made sense. It involves a movie critic who reviews movies for his job. With every movie he watched, he would always pick it apart and focus on the negative more than the positive. One night, he took his beautiful new girlfriend out to a movie on a date, and at the end, she asked how he liked it; he picked it apart and said how much it sucked. She felt bad because he didn't have a good time, and that kept being the case for future movies. After a while, he realized that he was ruining the experience for her, because she was enjoying the movies, yet he wasn't. Then he decided to change his perspective, to not pick movies apart, and to just try to enjoy them for what they were. In the end, he did .... and he got fired from his job because every movie he reviewed was great. :LOL:

So I decided to do that, and I've been enjoying a ton more games lately. I try to play them with some form of child-like wonder, and try to have fun. Sometimes that just can't be the case, and you end up playing a crud game, but that's just inevitable.

FWIW, the last game to wow me graphically is Morrowind. Gameplay-wise, that would have to go to Grand Theft Auto 3 for the PC. Right now, I'm enjoying NWN immensely, and Warcraft 3 too. To me, it's a great time to be a PC gamer. :)

I agree entirely. There is something we lose as adults. My children can play on an old Nintendo (the first one, whatever that is) and still enjoy it. Myself..... nah. You are right, though: you have to approach it with a different mindset... remember, you are playing.

Geek
 
You talk about the wow-factor of graphics in games...

But what about the GFX demo scene? I am a long-time fan of GFX demos, and there have been some really stunning graphics introduced over the years that have made their way to mainstream videogames (and vice versa, to be honest). Still, I see that demos use shaders very little or not at all (or am I wrong about this?). The GFX demo scene is all about experimenting, not about money making. That makes me wonder why we haven't seen Doom 3-style graphics in them. Or is it THAT hard to implement in the relatively short time frame that demos are made in?
 
I'm hoping that we'll see some really good demos once DX9-class hardware is released/announced. (I'm really hoping that Nvidia releases the FF TSW demo, but that's probably an impossibility given the licensing issues with Square.)
 
Where is a good place to download some gfx demos?
I have tons from a website that eventually went down because of bandwidth problems but that was over a year ago.
Still looking for an adequate replacement to that site.
 
Thanks guys.
Maybe I can make some use of my Broadband connection now.
:)
And sorry SlmDnk - didn't mean to disappoint you. However, a good rule in life is to never expect anything; that way you will never be disappointed. :)
 
SlmDnk said:
That makes me wonder why we haven't seen Doom 3-style graphics in them. Or is it THAT hard to implement in the relatively short time frame that demos are made in?

* d3-style per-pixel lighting is on one hand rather trivial, and on the other rather expensive for a sw rasterizer. coders of the demoscene mindset would rather come up with a real-time raytracer :)

* lots of the visual impressiveness of demos/intros is based on the perfect mixture of software _and_ artwork. so a given algorithm doesn't make it into a demo unless the artwork guys are ok with it and can make impressive artwork for it.

ps: nevertheless, have patience ;) i, being a lazy (former) game coder of the non-demoscene mindset, am working on a software dot3 + cubic mapping renderer in my spare time, which is meant to be suitable for 64k intros. AAMOF, it's also suitable for text-mode intros ;)
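For reference, the dot3 math mentioned above really is simple; a minimal per-pixel sketch (plain Python for clarity, assuming the normal is fetched from a tangent-space normal map and all vectors are already unit length; the function name is just illustrative):

```python
def dot3_lighting(normal, light_dir, diffuse):
    """Per-pixel dot3 (Lambertian) lighting: intensity = max(0, N . L).

    normal    -- per-pixel surface normal (e.g. from a normal map)
    light_dir -- direction from the surface point toward the light
    diffuse   -- base texel colour as (r, g, b) in 0..1
    All vectors are assumed to be unit length.
    """
    n_dot_l = max(0.0, sum(n * l for n, l in zip(normal, light_dir)))
    return tuple(c * n_dot_l for c in diffuse)

# Light hitting a flat surface head-on: full brightness of the texel.
print(dot3_lighting((0, 0, 1), (0, 0, 1), (1.0, 0.5, 0.25)))
```

The expense for a software rasterizer comes from doing this (plus normal fetch and renormalization) for every pixel of every lit surface, not from the math itself.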
 
NV25 said:
Games haven't improved much at all. UNREAL was the last time I was blown away by graphics. That was released in '98. I still have it installed (the original, not UT) and am constantly amazed by what I see. It looks way better than many games released today. The first time you leave the prison ship is a classic moment. Sure, the low-polygon enemies, trees, buildings, etc., do betray its age a bit, but this is a small nitpick given the graphical splendor on display. The sky alone, especially in the Harobed village area, with twin suns shining through the clouds, puts most modern games to shame. UNREAL in every sense of the word.

My system at the time was a P2 266 + Voodoo2. Not bad by any means, but I'm sure 2 GHz CPUs + a GF4 4600 should be able to produce visuals that completely destroy Unreal, or any other game you can think of. I'm still waiting for that game. Nothing has ever had the same impact on me, because almost all current games look the same, using the same graphics from the same engines, until they have a 'been there, done that' kind of feel.

I'm currently playing NWN, and while the game is decent, the graphics are mediocre at best: very simple, generic environments. Have you seen the forests in this? From the outside they're like one huge polygon with tree textures painted on... :D

Morrowind is quite nice, though. :)

You should try Unreal again (using the UT engine) with the S3TC compressed textures. It is great ;)
 
Entropy said:
Visual impression of quality is not a linear function of the resources put in. The detail you add gets smaller and more insignificant for each step.

Many good points in this thread, but this one really hits the mark.

What you're seeing is that with more powerful hardware, the onus of producing a graphically pleasing environment falls more and more upon the artists rather than the coders. And there's just a limited number of truly great artists to go around, and artwork in general can be a very time-consuming process, so we're back to the "games take too long to develop" syndrome as well. The reliance on the artists, rather than the engine programmer, to control the quality of a game's graphics will likely continue, especially as shaders get easier to work with.

I don't think this is unique to the PC in any case, albeit the larger budgets afforded to console developers may make quality art more prevalent (plus the fact that when we spend $500 upgrading our PCs, we expect to be blown away). As for the example given, GT3 is indeed fantastic on the PS2 - what other games on the platform look comparable? Not many. The size of GT3's development team is enormous, which may be what it takes to have a game so graphically pleasing while also offering gameplay depth. TVs hide quite a few graphical faults that stand out on a PC monitor as well; I'm amazed how _good_ so many old PC games look when running on my 32" Toshiba through S-Video.

Regarding Unreal - it truly lived up to its name, and I haven't been as impressed with any title as I was when that first appeared (from a graphical standpoint). With UT and the second CD of textures (why, oh why, couldn't more developers do that?) it still looks better than most games released today. Epic just set the bar far higher, so let's simply give Epic props rather than denigrate all current and past developers.

BTW - Pascal - have a link to the process used to include the UT engine+compressed textures in Unreal? I recall seeing it somewhere...
 