The NVIDIA DX10 AGP Thread

AMD first has to release the R600, which they are currently struggling to do, then the mid-range and low-end PCI-E parts, and only after that AGP. That is going to take a long time, and I won't be waiting; I'll be ditching my AGP system soon.


If they wait much longer, G90 will be ready to ship when the R600 is. This is worse for them than the R520 situation was: back then they at least had a limited supply out in November '05, three months after the G70. G80 has been out since November, R600 was pushed back to May, so NVIDIA has six months to control the market, and will have mid-range and low-end parts by March, along with the 8900 cards as a refresh, with no word from AMD as of yet. I smell a problem there. I wouldn't wait on R600 if I were you; it looks like the Voodoo 5 6000 to me.
 
Thanks for the system info, Ateo, US and Albuquerque! You have tipped the scales, and I will now upgrade my trusty but tired 9800 Pro and hold on to the rest of the system for a couple more years. That R300 sure was good value; let's hope this first round of DX10 cards lasts as long.

(Sorry about the o/t...)
 
So if Nvidia is not going to release a G80 based AGP card, is the x1950 Pro 512 the fastest AGP card on the planet?

http://www.newegg.com/Product/Produ...cription=x1950+pro+agp&Submit=ENE&N=0&Ntk=all

Price / Card / Memory / Clocks (core/mem, MHz):
$229.99  x1950 Pro AGP  256MB  620/1480
$229.99  x1950 Pro AGP  512MB  580/1400
$239.99  x1950 Pro AGP  512MB  575/1380
$289.99  x1950 Pro AGP  512MB  620/1480

It all seems a bit confusing to me which card would actually outperform my x800 XT PE, and whether my old P4 3.06 HT could handle it anyway... sigh.
 

Yes, and your CPU will bottleneck it. You'll get better frames at higher resolutions, but particles and other CPU-heavy effects will bog the CPU down.
 

CPU being the bottleneck is not directly linked to AGP.

I'm sure there are a large number of PCI-E systems with CPUs slower than my water-cooled 4.2GHz Prescott. Even my RAM is faster than the "standard" DDR533 found in most of those same boxes.
 
I didn't say it was. I mean a Pentium 4 will bottleneck an x1950 Pro regardless of AGP vs PCIe; even your 4.2 should hold the x1950 Pro back a little.
 

Let's put my situation in perspective first: I own one of the scant few 24-pipe G71 512MB AGP cards made by Gainward. At the overclocks I'm able to pull on this card, it will beat nearly all of the 1950 Pro cards available currently, since the ATI options overclock so poorly (most hit a huge wall at about ~630MHz, and the Pro is a 12-pipe / 36-ALU design).

If I were going to play games at 1024x768, then I should be concerned about my 4.2Ghz box truly impacting my framerate. However, at the native 1680x1050 resolution of my LCD, the CPU has long since stopped being the bottleneck in essentially ALL of my games -- in terms of both minimum and maximum framerates.
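
To put a rough number on that resolution jump, here's a quick Python back-of-the-envelope (it ignores AA samples, overdraw and per-pixel shader cost):

pixels_low  = 1024 * 768    # 786,432 pixels per frame
pixels_high = 1680 * 1050   # 1,764,000 pixels per frame
print(pixels_high / pixels_low)   # ~2.24x the per-frame pixel work for the GPU;
                                  # the CPU's per-frame work doesn't scale with resolution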

I'm of the opinion that someone who buys one of these video cards is not someone looking for "budget", and as such isn't going to have a "budget" system. The people most likely to buy a card like this will fit into two categories:
A: They want to enable more eye-candy functions that they can't do on their current video hardware (high levels of AA, high quality AF, HDR functions)
B: They want to run at "acceptable" detail levels at native resolution on their new higher-res LCD screens that are becoming so inexpensive

I can't fathom very many people with P3 800's running out to buy this card to improve their framerate in FEAR.
 

By definition a Pentium 4 is a budget box; it has some of the worst design flaws of all time. A 4.2GHz Pentium 4 is about equal to, say, a 3400+ A64 in reality, and at the prices the A64 goes for, that's value or very low mid-range today. Also, there are things besides looks in a game: take into account physics, particles, and any other math calculations being done. That's where your CPU will bottleneck, and I don't care how good your video card is. I could show you simply from benchmarks that even a dual-core X2 isn't optimal for the G80, and you can get 20-30% better with a Core 2, and that's a fact. So saying your Pentium 4 does just fine is nothing but ignorance. Instead of getting a new GPU, replace your CPU and experience what your card could do with the proper hardware.
 
That's actually good news for me!

I don't have to buy my younger sis a new mobo to upgrade her comp.

She wants to upgrade to Vista and wants a DirectX 10 card for her b-day (in 6 months).
 
By definition a Pentium 4 is a budget box; it has some of the worst design flaws of all time.
Says you, but it's a matter of opinion. Let's just leave it at saying there are a large number of P4 owners that would disagree.

A 4.2GHz Pentium 4 is about equal to, say, a 3400+ A64 in reality
Do you have anything to back that up? I'm able to find quite a few game benchmarks of the Pentium 570 (3.8GHz, non-dual-core, with a slower FSB and a lower overall clock than mine) faring quite well against the A64 3800+ here, here, here, and here. You may not like the Prescott, but 4.2GHz != 3200.

Edit: Just for fun, you should tab around and look at how it fares against both of the A64 4000+ parts too :D

Also, there are things besides looks in a game: take into account physics, particles, and any other math calculations being done. That's where your CPU will bottleneck
Sure, if you're playing at low enough resolutions not to hamper the video card. But you keep missing the point -- I'm running at 1680x1050, and with even a minimum level of AA turned on, I'm hitting the fillrate limit of the card well before the CPU begins to impact my performance. Don't believe me? Why don't you start reading right about here, and for the next several pages. This is a review of an AMD 2500+ and a 3400+, along with a mid-range 7600GS card and a 1950 Pro AGP card, run at two resolutions, 1280x1024 and then 1600x1200.

If you read that link, you'll quickly see how even a single-core 3400+ is bottlenecked by the video card at a resolution of a "mere" 1600x1200 -- even the MINIMUM framerate goes down at the higher resolution, indicating that the entire spectrum is hampered by the lack of video horsepower.
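
Here's a quick Python sketch of that logic, with made-up timings; the only assumption is that the GPU's per-frame cost scales with pixel count while the CPU's per-frame cost doesn't:

# Toy model: a frame takes as long as the slower of the two processors.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

cpu_ms = 12.0                                        # assumed worst-case CPU time per frame
gpu_1280 = 20.0                                      # assumed worst-case GPU time at 1280x1024
gpu_1600 = gpu_1280 * (1600 * 1200) / (1280 * 1024)  # ~1.46x the pixels -> ~29.3 ms

print(fps(cpu_ms, gpu_1280))   # ~50 fps minimum, already GPU-limited
print(fps(cpu_ms, gpu_1600))   # ~34 fps minimum -- drops with resolution even though
                               # the CPU didn't get any slower

If the CPU were the limit instead, both numbers would come out the same, which is not what happens in that review.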

The rest of your post is just flame bait, and as such, I'm not responding to it.
 

First off, the Pentium 4 is several years old; it has been replaced by Core 2 and is not really worth it any more. Second, the low-end A64 is also budget now; unless we're talking 3700+ and 4000+, the single-core A64 is a budget chip. By definition, budget is under 100 dollars, and granted the Pentium 4 sells for about $109, it's overpriced given that the Pentium D is faster overall and those can clock nicely too. And reading that review means nothing to me. The fact is you would get higher framerates with a better CPU; don't believe me, go get a real CPU.
 
We know perfectly well about the CPU usage issue, but you know, throwing away the CPU and motherboard you spent hundreds on when what you first need is a better GPU for games sucks, no matter how you look at it. Furthermore: are you suggesting getting a Core 2 Duo and a 7600GS for high-resolution/AA/AF gaming?
Guess which upgrade will suck less for that purpose?
 
First off, the Pentium 4 is several years old; it has been replaced by Core 2 and is not really worth it any more. Second, the low-end A64 is also budget now; unless we're talking 3700+ and 4000+,
Well, funny enough, the 3.8GHz Prescott is trading blows with your A64 4000+. Using "logic", the additional 400MHz of processor speed and 320MHz of bus speed on my system would put me above the Pentium 570 that equals your vaunted A64 4000+, which then puts me out of your "budget" class. And you know what? Even if my entire computer IS budget class, that still doesn't mean my video card isn't the bottleneck.

And reading that review means nothing to me. The fact is you would get higher framerates with a better CPU; don't believe me, go get a real CPU.
So a review that points out how blatantly wrong you are is somehow immaterial? Sure, I'm willing to bet my maximum framerates might get better in several cases, but my maximum framerates aren't what's going to kill my gaming experience. What really presents a problem is minimum framerates, and if you had decided to stop and listen for a moment, you'd find that a faster CPU wasn't increasing the minimum framerates of the games being tested in most cases.

Can you guess why? Because, as I've been telling you, the minimum attainable framerate at these kinds of resolutions is not limited by the CPU, but by the video card. You can sit here and tell me how right you are all day, but until you can PROVE otherwise, the way the thread I linked (and you ignored) backs up my side, what you are saying is still wrong.
 

A CPU feeds data to the GPU; if the CPU is too slow, the data isn't getting fed to the GPU any faster, and even at higher res your framerates go up with a better CPU no matter what. Take this into account: a friend was using a 6800GT with a Sempron 2600+ on socket 754. He upgraded to an Athlon 64 3700+ on 754 and his framerates at 1600x1200 became playable with AA on, because his VPU was now getting fed data, and this was back in early '05. An x1950 Pro is a lot faster than a 6800GT, so you would want a faster CPU to feed it data. I'm not saying get a 7600GS and a Core 2 Duo; I'm saying get a Core 2 Duo and the Dual-VSTA board from ASRock, and experience what that card can really do at high res with AA and AF.
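
For what it's worth, that Sempron-to-A64 story fits the same sort of rough Python model; the timings below are invented purely to illustrate the CPU-limited case:

# When the CPU is the slower part, a CPU upgrade lifts the framerate even at high res.
def fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)   # a frame takes as long as the slower processor

gpu_ms = 25.0                  # hypothetical 6800GT cost per frame at 1600x1200 with AA
print(fps(40.0, gpu_ms))       # Sempron-class CPU: 25 fps, CPU-limited
print(fps(20.0, gpu_ms))       # A64 3700+-class CPU: 40 fps, now GPU-limited

Once the GPU is the slower part, though, further CPU upgrades stop moving the framerate, which is the other side of this argument.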
 
Don't sit here and try to explain CPU vs GPU bottlenecks; I've been doing this long enough to understand. The problem here is you not understanding... How did you measure that framerate increase? Tell me EXACTLY how you measured that framerate increase.

Because you probably did it just like everyone else: ran a timedemo and reported the result that comes out the end. The reality is that the result at the end is an AVERAGE of all the frames rendered. You continually and obstinately miss the point: minimum framerate is where your gaming experience truly suffers.

If I can hit an average of 100fps in a game, but my minimum framerate is 5, then the game will become unplayable at least one time through the run. Further, upgrading the CPU is only going to help if it's the bottleneck for that minimum framerate.

And what I'm continually telling you (and so is that benchmark link that you seemingly continue to ignore) is that upgrading the CPU isn't helping at high resolutions in the games tested. So again, how do you expect me to take you seriously when you continually miss this point?

If upgrading the CPU increases my maximum framerate, but does absolutely zero to help my minimum framerate, then the game will still be ultimately as playable as it was before.
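
To make the average-versus-minimum point concrete, here's a small Python sketch with invented frame times:

# A timedemo score is basically the average; the stutter you feel is the minimum.
frame_times_ms = [10, 10, 10, 10, 200, 10, 10, 10, 10, 10]   # one nasty spike

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)
min_fps = 1000.0 / max(frame_times_ms)

print(round(avg_fps), round(min_fps))   # ~34 fps average looks fine on paper,
                                        # but the 5 fps minimum is the hitch you notice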
 
I didn't run a timedemo; the games felt smoother, just smoother in general. I don't need FRAPS to tell me that a game is running smoother.
 
Game developer point of view:

Hi Demirug, didn't know you're making games now. What game/URL would that be?

More options = more possible configurations that need to be considered.

Sorry if this doesn't sound very customer friendly, but since I am responsible for the performance of our game, every new variable causes some more headaches.

In the case of the 7600 there would be no differences, as we currently expect only AGP bandwidth for this GPU class. But as part of our research we may find some use for PCIe on the D3D10 class. If NVIDIA now offers D3D10 hardware on AGP, I have to check every idea from someone on the team to see whether it will work with AGP too. Even if the attached feature were optional, we would need to detect such D3D10 AGP cards and disable the feature by default. I need to stay at this high level for the moment, as the game is still unannounced.
Well, in the first quote you mentioned only the word "performance" as a possible problem for a DX10 AGP card. Then you said it could be more troublesome than just possible "performance" problems. Your last sentence is quite telling, because it says your game is aimed squarely and primarily at PCI-E DX10 cards -- I don't know of any developer that has made this kind of crucial business decision this early into the age of PCI-E, and certainly not this early into Vista/DX10. IOW, I don't know of any developer that assumes they will make money from their game by expecting the majority of gamers to own a baseline system of Vista (+DX10) plus a PCI-E video card.
 