No new video card designs for 6 months or so?

rms

Newcomer
I'm assuming no significant advances in videocard design will appear until near Xmas, is that about right?

I haven't seen any games lately that claim to stretch the hardware envelope, and the UE3 engine won't appear for a year at least.

Is there any reason to think my oc'd 6800GT won't last me -- even for high-end gaming -- well into next year?

Thanks!

rms
 
Well... the R520 is due out sometime around summer, and it should have an implementation of PS/VS 3.0 with tangible improvements over NV4x's implementation of PS/VS 3.0.


You may be correct that there are no games coming up in the near future that will really push your card... however, E3 will give us a better idea of what to expect in H2 2005 and early 2006, as there are always surprise announcements/showings at E3 ;)
 
What really troubles me is that new hardware features are thrown out so fast that software doesn't really keep up... what good is a card with the latest features, when the software that makes good use of it will not get released for quite some time, and when it does, said card will be considered obsolete?
 
Kombatant said:
what good is a card with the latest features, when the software that makes good use of it will not get released for quite some time, and when it does, said card will be considered obsolete?

Probably not much use to gamers, but maybe it allows developers to start working with the new features so they're ready when the next iteration of the hardware is available? So in the end we benefit, no?
 
trinibwoy said:
Kombatant said:
what good is a card with the latest features, when the software that makes good use of it will not get released for quite some time, and when it does, said card will be considered obsolete?

Probably not much use to gamers, but maybe it allows developers to start working with the new features so they're ready when the next iteration of the hardware is available? So in the end we benefit, no?

Depends on how you view it really; I guess I am more consumer-centric in my approach here.
 
It seems to me that Splinter Cell: Chaos Theory, FEAR and STALKER are all going to make use of the rendering power and/or Shader Model 3 capabilities of the next generation of Nvidia/ATI cards.

EQ2 also seems to be more than any of the current cards can comfortably handle with all the eye candy on.

It seems to me that SM3 makes it much easier for developers to use all the power and functionality of these new cards. Well, except for the fact that when you turn on all the eye candy the frame rates drop into the teens. If turning on all the eye candy forces you to run a game at 1024x768, then I think it's fair to say the card has pretty much run out of power.

And then there's the problem of optimisation. There's an argument for saying that SM3 code is harder to optimise, since dynamically branching code can result in all sorts of unexpected variations in performance. But really a programmer should understand the bounds of their dynamic code, and I notice that it's also possible to give the compiler hints about the bounds of dynamic code. So overall I'm unsure whether SM3 really would lead to significantly harder code to optimise.
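For anyone who hasn't poked at this, here's a rough, hypothetical sketch of the kind of hints I mean. The shader text is HLSL carried in a C++ string (the names, bounds and numbers are all invented for illustration, not taken from any shipping game); the [branch] and [unroll(n)] attributes are the sort of thing that tells the compiler whether to emit a real dynamic branch and what the worst-case loop bound is.

Code:
#include <cstdio>

// Hypothetical SM3 fragment held as a string, the usual way D3D9 apps carry
// shader source. Everything named here is made up for the sake of the example.
static const char* const kShaderSketch =
    "    [branch]                  // emit a real dynamic branch, skip the work\n"
    "    if (dist < shadowRange)\n"
    "        colour *= SampleShadow(uv);\n"
    "\n"
    "    [unroll(4)]               // unroll to at most 4 iterations\n"
    "    for (int i = 0; i < numLights; ++i)\n"
    "        colour += ShadeLight(i);\n";

int main() {
    // Just prints the sketch; actually compiling it would need the D3DX
    // shader compiler, which is outside the scope of this example.
    std::puts(kShaderSketch);
    return 0;
}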

I suppose it's interesting to compare Doom 3 and Chronicles of Riddick. The visuals in both games are very similar, technically (though I suspect most people would say CoR is more advanced), but D3 seems to run a lot better. I dare say id optimised their game better, even if it meant cutting back on features/richness. Obviously this is a comparison of OpenGL games... Anyway, the point I'm making is that different developers will have different ideas about the performance/features mix for a game, even when targeting the same hardware.

Jawed
 
Kombatant said:
What really troubles me is that new hardware features are thrown out so fast that software doesn't really keep up... what good is a card with the latest features, when the software that makes good use of it will not get released for quite some time, and when it does, said card will be considered obsolete?

Chicken and egg: more features means it's better to get them out there so games in the future look better. We are only just starting to see games support the options the R300 core had.
 
I think you can add another half a year after the hardware is announced before stores begin selling it, then add a couple of months for the card to become mainstream hardware. I'm actually for fast releases of hardware, since new features replace old and outdated ones and in the process make things easier.
 
You buy the latest and greatest for speed when running the older standards. Then your purchase increases the installed base of that latest and greatest, so that developers then start to shift onto it.
 
Jawed said:
EQ2 also seems to be more than any of the current cards can comfortably handle with all the eye candy on.
EQ2 is only an SM1 game, so it's not a game that should be used as a reason why hardware isn't fast enough.
 
rms said:
Is there any reason to think my oc'd 6800GT won't last me -- even for high-end gaming -- well into next year?
To be certain, we'd need to know what games will be like next year and what the cards will be like as well.

In my observation, high-end cards tend to remain useful for quite a while. IIRC, the GeForce 2 Ultra cost ~$500 but stayed in the top 5 for at least a couple of years. Similar results with the 9700 Pro: someone who bought it on release day could still be using it now, although it's now at mainstream performance.

I expect a 6800GT will have a good 2-3 years of usable life.
 
radeonic2 said:
Jawed said:
EQ2 also seems to be more than any of the current cards can comfortably handle with all the eye candy on.
EQ2 is only an SM1 game, so it's not a game that should be used as a reason why hardware isn't fast enough.

Nonsense. Any time the video card is the bottleneck, it is the component that needs to be reworked to solve the problem. Who cares what shader model is used? You want the full experience.
 
To me the only reason EQ2 is slow is horrible code and nothing else. Graphically the game is really subpar, even at its highest settings where it's running at 2-3 fps.
 
Kombatant said:
What really troubles me is that new hardware features are thrown out so fast that software doesn't really keep up... what good is a card with the latest features, when the software that makes good use of it will not get released for quite some time, and when it does, said card will be considered obsolete?
This is a symptom of the type of platform that is the PC. Slowing the introduction of new hardware features would not help in the least. Put simply, you can't develop software to take advantage of the hardware until that hardware is available. Sometimes games can be modified to take limited advantage of new hardware in a relatively short time, but it takes much longer to get a good implementation.

So, the cycle of implementation of new technologies is, and always will be, roughly the same as the development cycle of games, which is typically about two years at the moment.
 
vrecan said:
To me the only reason EQ2 is slow is horrible code and nothing else. Graphically the game is really subpar, even at its highest settings where it's running at 2-3 fps.
Well, until you get to some high-level zones. Everfrost, Permafrost, Lavastorm, and Sol Ro Temple typically have pretty good performance (well, Lavastorm slows down unnecessarily in some places, but it's still typically better than many lower-level zones), and all look simply amazing. Either the artists did these zones last and just had a much better idea of what they were doing, or they just put a whole lot more love into these zones. I don't really know, but they do look quite impressive.

Regardless, the #1 thing hampering performance (particularly on slower CPUs) is the use of nothing more than PS 1.1. This forces large amounts of multipass rendering with any significant amount of graphical detail turned on, and thus drags performance down dramatically on slower CPUs.
 
wireframe said:
radeonic2 said:
Jawed said:
EQ2 also seems to be more than any of the current cards can comfortably handle with all the eye candy on.
EQ2 is only an SM1 game, so it's not a game that should be used as a reason why hardware isn't fast enough.

Nonsense. Any time the video card is the bottleneck, it is the component that needs to be reworked to solve the problem. Who cares what shader model is used? You want the full experience.
Nonsense?
If a game is designed for DX8 hardware, and you have DX9 hardware, how is it optimal to run it in DX8?
Am I not correct in believing EQ2 would run better in DX9, since DX9 can do things in one pass that take many in DX8?
I find your way of thinking disturbing; I suppose the PC version of Halo is limited by hardware, not by poor code?
 
wireframe said:
Nonsense. Any time the video card is the bottleneck, it is the component that needs to be reworked to solve the problem. Who cares what shader model is used? You want the full experience.
I'll add on to radeonic2's rebuttal here.

Put simply, newer shader models are out there because they make complex calculations more efficient. EQ2 apparently attempts a number of complex calculations. Many newer games restrict the ability to turn these things on to SM2 and higher hardware, and for good reason: they are horribly inefficient with older shaders.

The newer shader models are much more flexible, and thus much more efficient at performing many of these operations. Yes, the same hardware would produce the same (or better) graphical effect at much higher performance in EQ2 if SM2 (or higher) were implemented.
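To make the "one pass versus many" point concrete, here is a rough, hypothetical sketch from the engine side. Everything in it (the structs, SetLightConstants, SubmitDrawCall and so on) is invented purely for illustration; this is not EQ2 code or any real engine's API.

Code:
#include <vector>

struct Light  { /* position, colour, etc. */ };
struct Object { /* mesh, material, etc. */ };

// Stand-ins for whatever the engine/driver would really do.
void SetLightConstants(const Light&) {}
void SetAllLightConstants(const std::vector<Light>&) {}
void SubmitDrawCall(const Object&) {}

// PS 1.1 shaders are too short to evaluate several lights at once, so the
// object is re-submitted once per light and the passes are blended together.
void DrawObjectSM1(const Object& obj, const std::vector<Light>& lights) {
    for (const Light& light : lights) {
        SetLightConstants(light);
        SubmitDrawCall(obj);        // one draw call per light, per object
    }
}

// An SM2+ pixel shader is long enough to handle all the lights in one go,
// so the object is submitted once no matter how many lights touch it.
void DrawObjectSM2(const Object& obj, const std::vector<Light>& lights) {
    SetAllLightConstants(lights);
    SubmitDrawCall(obj);            // one draw call, full effect in one pass
}

int main() {
    Object obj;
    std::vector<Light> lights(4);
    DrawObjectSM1(obj, lights);     // four submissions for the same pixels
    DrawObjectSM2(obj, lights);     // one submission
    return 0;
}

Same end result on screen, but the first path pays its setup and submission cost once per light instead of once per object.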
 
radeonic2 said:
wireframe said:
radeonic2 said:
Jawed said:
EQ2 also seems to be more than any of the current cards can comfortably handle with all the eye candy on.
EQ2 is only an SM1 game, so it's not a game that should be used as a reason why hardware isn't fast enough.

Nonsense. Any time the video card is the bottleneck, it is the component that needs to be reworked to solve the problem. Who cares what shader model is used? You want the full experience.
Nonsense?
If a game is designed for DX8 hardware, and you have DX9 hardware, how is it optimal to run it in DX8?
Am I not correct in believing EQ2 would run better in DX9, since DX9 can do things in one pass that take many in DX8?
I find your way of thinking disturbing; I suppose the PC version of Halo is limited by hardware, not by poor code?

This is also in response to Chalnoth's assistance in your rebuttal, so I'll mash it all together in this one post and answer more generally rather than point by point.

First I want to point out that it seems both (or all three, rather) of you are familiar with the actual game whereas I am not. I read "because it's SM 1.1" and responded to that. I did not read "Everquest 2 is horribly coded". Keep that in mind.

So you are saying that a game specifically coded for DirectX 8 should have criticism levelled at it because there exists a further 9.0 specification which would have allowed for higher precision and better performance? So, when running Source on a GeForce 6800 we can forgive any performance deficits because it would run much faster if it were coded for SM 3.0? This sounds very iffy to me. Hardware has to deal. Of course, if you are suggesting the devs for EverQuest 2 made some impossible design decisions, that's another matter.

Your point seems highly hypothetical when, in fact, you are dealing with a very real game and hardware. Sure there is bad code, but part of advancing hardware is dealing with it. I am sure that Halo for PC was not thought of as some masterpiece of efficient coding; I even think the developer/publisher was well aware of the inefficiencies and was banking on big-gun hardware to solve that problem for them. If hardware only becomes good at running new code or, heaven forbid, increasingly worse at running old code, then you have a problem. You may want to think of this in terms of Pentium 4/NetBurst and even Itanium from Intel. Both of these designs have a catch: they need new code to extract performance. That is not a good excuse if they perform terribly on legacy code.

Ok, so this was my long response to why I thought your statement was nonsense. Perhaps I didn't realize that you were saying things between the lines, being more familiar with the actual game. I took your quote verbatim and thought I called a spade a spade.

Consider what I quoted from you and then overlay it with a hypothetical "Half-Life 2 is only an SM 2.0 game, so it's not a game that should be used as a reason why hardware isn't fast enough," while thinking of SM 3.0 hardware. I hope you agree that this would be absurd.

Sorry if there was any confusion.

EDIT:

I should probably have made it clear that each successive DX level implies support for the preceding ones, and I think a lack of backwards support is unthinkable. You cannot decide to suddenly cut performance on older shader models just because you have a new one that solves all those problems more elegantly; you still have the legacy code to support. But I think this is all just a misunderstanding, where you meant that EverQuest 2 is messed up a la Halo and I read it as a general statement about SM1. The question then becomes: did EQ2's devs count on future hardware being fast enough to cope with this DX8 inefficiency? Did they do so wisely? Hopefully the hardware is fast enough to make these questions go away.
 
You're really misunderstanding what's going on here.

There are a large number of shaders in EQ2 that no sane developer would code in SM1. They're clearly only going to be remotely efficient on SM2 and higher hardware, and yet the game has them.

Here's a simple example:
On my old Athlon XP 2000+ and GeForce 6800, if I had shaders turned off, the performance was good. With shaders on, the performance was (typically) terrible.

Now, with my Athlon 64 3000+ and GeForce 6600GT (note that the vidcard is now slightly slower), there is typically no noticeable performance difference between shaders on and shaders off.

This is a situation that just shouldn't happen: shader performance ends up dependent upon the CPU. It is dependent upon the CPU because of the multiple passes being used; every extra pass is another batch the CPU and driver have to set up and submit.
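To put some rough, completely made-up numbers on why that lands on the CPU rather than the video card:

Code:
#include <cstdio>

int main() {
    // Invented figures for illustration only (not measured from EQ2).
    const int visibleObjects  = 400;
    const int lightsPerObject = 4;

    // One pass per light means the CPU/driver submits this many batches:
    const int sm1DrawCalls = visibleObjects * lightsPerObject;   // 1600 per frame
    // Folding the lights into one SM2+ shader submits this many:
    const int sm2DrawCalls = visibleObjects;                     // 400 per frame

    std::printf("SM1-style multipass: %d draw calls per frame\n", sm1DrawCalls);
    std::printf("SM2-style one pass:  %d draw calls per frame\n", sm2DrawCalls);
    return 0;
}

The per-pass setup and submission overhead scales with the pass count, which is why the slowdown tracks the CPU rather than the graphics card.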
 