Diminishing returns with advancing 3D hardware?

In fact, it makes a lot of sense to me. It is the same architecture the X360 has, and the advantages are enormous: there are no CPU->GPU upload/download stalls, just one fast pool of memory common to both the GPU and the CPU. It lets me render thousands of animated crowd characters at 60+ fps in 1280x720 with 4xAA. The same system ported to PC manages a poor 25-40 fps at 800x600 without AA. The reason is that on PC it has to upload a lot of data to the GPU (skinned meshes) every frame, and that is slow as hell. If memory were unified as it is on the X360, many games would benefit drastically.
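To make the bottleneck concrete, here is a minimal C++ sketch of the PC path described above; the crowd size, vertex counts, and buffer handling are made-up illustrations, not the actual engine:

```cpp
// Minimal sketch of per-frame crowd skinning on PC (all numbers and names
// are hypothetical). The CPU skins each character, then must copy the
// result into a GPU-visible dynamic buffer across the bus every frame;
// on a unified-memory console the GPU reads the skinned vertices in
// place and this copy disappears.
#include <cstring>
#include <vector>

struct Vertex { float pos[3], normal[3], uv[2]; };  // 32 bytes

int main() {
    const int characters = 2000;         // crowd size
    const int vertsPerCharacter = 1500;  // low-poly crowd mesh
    std::vector<Vertex> skinned(vertsPerCharacter);
    std::vector<Vertex> gpuBuffer(vertsPerCharacter);  // stands in for a locked dynamic VB

    for (int c = 0; c < characters; ++c) {
        // ... CPU skinning writes into 'skinned' here ...

        // PC path: copy into the GPU-visible buffer (think
        // IDirect3DVertexBuffer9::Lock / memcpy / Unlock each frame).
        std::memcpy(gpuBuffer.data(), skinned.data(),
                    skinned.size() * sizeof(Vertex));
    }
    // 2000 * 1500 * 32 bytes is roughly 96 MB per frame; at 60 fps that is
    // several GB/s of bus traffic a unified-memory design never pays.
    return 0;
}
```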
 
_xxx_ said:
But that's all just lame low-end GPU's. I know of no integrated high-end GPU anywhere, nor do I expect to see any, since I also think it would make no sense.
Heat and power. That's a significant barrier to putting a high-end GPU on a motherboard.
 
I think it has more to do with everyone wanting to run higher and higher resolutions. If we were still running at 480p then I think we would see bigger jumps in graphics.

I'm thinking a lot of that power is wasted going for super high resolutions.
 
skilzygw said:
I think it has more to do with everyone wanting to run higher and higher resolutions. If we were still running at 480p then I think we would see bigger jumps in graphics.

I'm thinking a lot of that power is wasted going for super high resolutions.
If we were still running at 480p, then GPU manufacturers would have much less incentive to bother giving us better hardware.
 
Forget diminishing returns with 3D hardware; what about the diminishing returns with 3D software (aka games)?

Remember the first time you saw Doom and thought wow?

Nobody is ever going to forget the first time they went out of the spaceship in Unreal and just walked about a bit looking at the clouds... which were moving!

Far Cry... god, that water looks good, must have a paddle.

Since then, nothing... zilch... zero. HL2, pah; FEAR... doh...

NO WOW FACTOR.

I have one hope and that is Crysis. Seemingly I am going to have to upgrade my entire house and wife just so once again I can have my lower jaw bounce off my chest.


Maybe.
 
One of the biggest barriers to immersion for me personally is a simple thing (though maybe not so simple to fix). That is, blurry magnified textures. Real life just does not get blurry if you get close up... :oops:

So I guess I would like to see either much higher resolution textures (a near-field mip level?), or some kind of procedural (fractal?) content generation, such that the frequency content of the textures is always at least that of the screen's pixel pitch (or half it, per Nyquist?).
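To illustrate the procedural idea, here is a rough C++ sketch (every name and constant in it is hypothetical): when the computed texture LOD goes negative, i.e. the sampler is magnifying past mip 0, synthesize the missing high frequencies from band-limited noise instead of letting the top mip smear:

```cpp
// Hypothetical sketch of a "procedural near-field mip": when the required
// LOD is sharper than mip 0 (lod < 0), add one octave of hash noise per
// missing mip level so the texture's frequency content keeps up with the
// screen's pixel pitch instead of blurring out.
#include <cstdint>

// Cheap integer-hash value noise in [0,1) (illustrative only).
float hashNoise(int x, int y) {
    uint32_t h = uint32_t(x) * 374761393u + uint32_t(y) * 668265263u;
    h = (h ^ (h >> 13)) * 1274126177u;
    return float(h & 0xFFFFFF) / float(0x1000000);
}

// baseTexel: what the ordinary mip chain returns at mip 0 for (u, v).
// lod:       computed texture LOD; negative means magnification.
float sampleWithProceduralDetail(float baseTexel, float u, float v, float lod) {
    float result = baseTexel;
    float amplitude = 0.25f;    // strength of the synthesized detail
    float frequency = 1024.0f;  // one octave past a 1024-texel top mip
    for (float l = lod; l < 0.0f; l += 1.0f) {  // one octave per missing mip
        result += amplitude * (hashNoise(int(u * frequency), int(v * frequency)) - 0.5f);
        amplitude *= 0.5f;
        frequency *= 2.0f;
    }
    return result;
}
```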

ERK
 
ERK said:
One of the biggest barriers to immersion for me personally is a simple thing (though maybe not so simple to fix). That is, blurry magnified textures. Real life just does not get blurry if you get close up... :oops:

Real life does get blurry because of a thing called lens optics. :p

This has probably been discussed before, but whatever happened to detail textures? They were one of the things that made quite a few surfaces in UT and Halo nice to look at instead of just a blurry wash.
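For reference, the classic trick was just a second, tiling, high-frequency grayscale map modulated over the base and faded out with distance. A rough C++ sketch of the math (the 2x modulation is the usual scheme; the tiling rate and fade distance here are arbitrary):

```cpp
// Rough sketch of classic detail texturing (illustrative constants): a
// tiling, high-frequency grayscale detail map is modulated over the base
// texture, centered on 0.5 so that "modulate 2x" leaves average brightness
// unchanged, then faded out with distance so the tiling never shows.
#include <algorithm>

float applyDetailTexture(float baseTexel,   // base map sample at (u, v)
                         float detailTexel, // detail map sample at (u * 16, v * 16)
                         float distance)    // eye-to-surface distance
{
    // "Modulate 2x": a detail value of 0.5 leaves the base unchanged;
    // values above/below brighten/darken it locally.
    float combined = baseTexel * detailTexel * 2.0f;

    // Fade the effect out over ~8 units so distant surfaces are untouched.
    float fade = std::clamp(1.0f - distance / 8.0f, 0.0f, 1.0f);
    return baseTexel + (combined - baseTexel) * fade;
}
```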
 
rwolf said:
The majority buy canned systems from HP, Dell, and other companies. Computers should be more like appliances.

That's the low end, where a person never plays games at halfway decent settings and users keep their computers for five years or until the next version of Windows. What you said would be high end, which is nearly impossible due to several issues, and it wouldn't make sense money-wise anyway.
 
I don't believe this.

Look at the new consoles. They are gorgeous compared to the last consoles. Yet, there is still tons of room for improvement in them.

PC's can give a muddled picture since improvements are so incremental on them. Just because there is new high-end hardware doesn't mean the games don't target mid-range.
 
sonyps35 said:
I don't believe this.

Look at the new consoles. They are gorgeous compared to the last consoles. Yet, there is still tons of room for improvement in them.

PC's can give a muddled picture since improvements are so incremental on them. Just because there is new high-end hardware doesn't mean the games don't target mid-range.

See, you're comparing hardware that gets five years between upgrades (consoles) to an industry that launches a new series every year and a half (PCs). PC updates are very incremental, and the problem is that the hardware needed to do what we're seeing on the current generation of consoles, if used in a closed environment, has been out for PCs for a while now.

Of course the console is going to see huge improvements; they just got HUGE hardware improvements. But the jumps will soon start getting smaller due to the nature of the game. Graphics are starting to become limited by the artists and not the coding; it takes a very long time to produce all of the high-resolution content in next-generation games.
 
Skrying said:
See, you're comparing hardware that gets five years between upgrades (consoles) to an industry that launches a new series every year and a half (PCs). PC updates are very incremental, and the problem is that the hardware needed to do what we're seeing on the current generation of consoles, if used in a closed environment, has been out for PCs for a while now.

Of course the console is going to see huge improvements; they just got HUGE hardware improvements. But the jumps will soon start getting smaller due to the nature of the game. Graphics are starting to become limited by the artists and not the coding; it takes a very long time to produce all of the high-resolution content in next-generation games.

I don't believe so.

The "wow" factor in graphics died for me in about 1989. Does that mean, we were at diminishing returns with the TNT2?

I was absolutely stunned when I saw a Neo-Geo in the arcade for the first time. Maybe in about 1990. 16-bit Genesis wowed me nearly as much.

The wow factor died in about 1990 because of PC's and incremental improvement. Back then there were only consoles, or at least they were all I knew, before the internet age. So you got that huge jump from NES to Genesis. Now, that jump is spoiled because high end PC's are always incrementally improving. So what you see on a new console is already spoiled for you, it's high end PC games.

But that has been going on for 15 years; it's not new.

And another similar factor is the internet. We see so many pictures of the new stuff before it comes out that it dulls any wow even further. Take Unreal 3.0: it's almost like I'm already sick of that "game" when a title has yet to be released on it! The first screens of it, two years ago, had a slight wow factor, which is about all PC's are ever capable of due to constant incremental improvements... now it's harder to see it as amazing, even though it is.

I am sure I will get another slight wow when I see Gears of War in person for the first time. But again, slight wow is all we get, and that's not new, far from it.

It's basically too subjective the way you're framing the debate. For example, we could compare screens of UT to UT2004, then UT2004 to UT2007; the jump from '04 to '07 is probably greater. I know the original UT still looked pretty decent to me last I checked, and how old is that game? So then, is the pace of graphics increasing, because there was more improvement from '04 to '07 than from the original to '04?

Again, it's too subjective, but I see no evidence things are slowing down. Crysis is the latest to give me that slight wow.

If anything, it's amazing any wow factor is capable of being produced on PC's at all...

But I'm saying this is too subjective, and some comparison screenshots of year-old versus upcoming PC games would blow the whole "small improvements only" thing right out of the water.

I also could be crazy, but in some ways I think the pace of graphical returns is increasing. I say that because some of the well-done next-gen console games, like BIA or Rainbow Six Vegas, are gorgeous, which is simply never a feeling I got with 3D games before. They may have been "awesome" or technically impressive, but they aged terribly because of all the jaggies and whatnot. If you go back and look at 2D SNES graphics, they will still look "pretty" to your eye, whereas a PlayStation One game simply looks horrible because of the jaggies. So basically, this is the first gen I feel we can do "pretty" 3D, ever. In that sense the graphical returns can be argued to be increasing at a faster pace. In fact, I'm sure they are; we just don't notice it anymore.

I think this being the first gen where we got rid of the jaggies was a huge inflection point, and it can therefore be argued as increasing returns in consoles.

Also, the diminishing returns argument is incredibly old. The first time I recall it was in an old issue of VG&CE, probably from about '91, in which the columnist argued the SNES was at a point where further improvements would be diminishing and relatively useless, because our TVs simply couldn't display that much better images anymore.

That's right, he said that about the SNES. Look at UT2007 compared to the SNES.

It's an old argument that will constantly reappear as new. It may come true one day, but I don't think we're anywhere near that day.
 
Chalnoth said:
Yeah, I didn't read rwolf's original post before replying. I really should have. Yes, integrated graphics will always be low-end, and will likely move even closer to the CPU (I wouldn't be surprised to see some GPU's on the same die before too long). But that model makes zero sense for anything but really low-end graphics.

Now, I did say a long time ago that eventually inter-chip communication may make SOC (System on a Chip) designs perform higher than today's more distributed designs, but I'm no longer certain that's the case. For it to work, it would require large amounts of memory integrated into the die or the packaging, and I'm just no longer sure we'll get process technologies small enough for that to be an improvement over just having more logic and external memory.

I have four systems (Pentium ISA, Celeron ISA (ATX), Athlon PCI, Athlon X2 PCI Express). Each time I upgrade the GPU I have to replace the motherboard, power supply, memory, and case, and with SATA2 I have to replace the hard drives. Why not make a disposable unit with a smaller footprint?

Integrate the GPU into the mainboard, get rid of the slots, integrate GPU and CPU memory, and support expandability through the south bridge. Have one large heat sink.

With the exception of the CPU, the GPU board is far more expensive than the rest of the system. Even the CPU must be upgraded along with a new GPU in order to take advantage of the GPU's performance.
 
Skrying said:
That's the low end, where a person never plays games at halfway decent settings and users keep their computers for five years or until the next version of Windows. What you said would be high end, which is nearly impossible due to several issues, and it wouldn't make sense money-wise anyway.

The problem as I see it is that gaming systems are not available as commodity systems. I go to a store like Best Buy and look at the systems and they all have integrated graphics solutions with weak power supplies and cases that can't handle the heat of a modern GPU. It is these systems that are killing the PC as a gaming platform.

I want a PC that is like a console with high end graphics, but is more than a console.
 
I've updated my CPU and GPU a few times without having to update the motherboard. I also have upgraded the motherboard a couple of times without having to upgrade the GPU. There was only one time where I had to upgrade all four at once (motherboard, CPU, RAM, video card), which was when I upgraded from an Athlon XP 2000+ to an Athlon 64 3000+ (on a PCIe system).

While you personally may not have benefitted from incremental upgrades, I claim that's more of a personal choice. The vast majority of my upgrades have been very incremental.

There are also feasibility limitations on placing a high-end GPU on the motherboard, such as cost and heat considerations.

And finally, the statement that the CPU must be upgraded to take advantage of a new GPU is just utterly false. The two upgrades are nearly orthogonal to one another. A new GPU buys you higher resolutions and more enabled effects. A new CPU buys you higher framerates. You can indeed do very well with a very high-end GPU setup but a more mid-range CPU, and vice-versa.
 
rwolf said:
The problem as I see it is that gaming systems are not available as commodity systems. I go to a store like Best Buy and look at the systems and they all have integrated graphics solutions with weak power supplies and cases that can't handle the heat of a modern GPU. It is these systems that are killing the PC as a gaming platform.

I want a PC that is like a console with high end graphics, but is more than a console.
We do have integrated graphics setups currently that can play games decently, albeit at low resolutions. The GeForce 6100 and 6150 are examples.

There are also low-end GPU's that could be added into systems for little cost.

It's sad, really, that you can't buy a gaming system off the shelf for only a tiny amount more than the cheapest non-gaming systems. You should be able to: the upgrades required are very small.

But Windows Vista should help things in this regard. With the OS requiring a fair amount of 3D graphics power, gaming performance of your basic off-the-shelf PC's should finally become acceptable.
 
But then again, Chalnoth, you're not the typical gamer. I would call you an extreme enthusiast just based off your presence here. ;)

I am more of a mere mortal gamer who buys high end and uses it until it is driven into the ground.

My last upgrade was because my GeForce 3 can't do 3D at 1920 x 1200.
 
Back to the main subject. I think that while the visuals haven't changed all that much, the resolution has increased substantially. Newer graphics cards have allowed us to move from 14" to 30" monitors in just a few short years. In fact, I am not sure I would want monitors to get much larger than they are.
 
rwolf said:
But then again, Chalnoth, you're not the typical gamer. I would call you an extreme enthusiast just based off your presence here. ;)

I am more of a mere mortal gamer who buys high end and uses it until it is driven into the ground.

My last upgrade was because my GeForce 3 can't do 3D at 1920 x 1200.
Well, and as I've said, for the much more casual gamer, the solutions you're describing already exist. It's just that system builders haven't been building systems designed around budget gaming.
 
dizietsma said:
Far Cry... god, that water looks good, must have a paddle.

Since then, nothing... zilch... zero. HL2, pah; FEAR... doh...

NO WOW FACTOR.
HL2 wowed me a lot more than Far Cry; hell, nothing has wowed me as much as when I saw those first tech demos at E3, which was before we saw anything from Far Cry, btw.
 