NVIDIA Kepler speculation thread

I have an SLI laptop here; it's dead from bumpgate, so I hope you can understand why I prefer to go with AMD products.
I can understand that.

I've had some rather bad experiences with ATI products in the past myself. One that sticks out in my mind was back in the days of the ATI Radeon 9700 Pro. I had one as an upgrade for my GeForce 4200 Ti, so it should have been a nice step up.

Except that at the time, I was playing City of Heroes and Star Wars: Knights of the Old Republic. The old GeForce 4200 Ti did better than the 9700 Pro at both games. I don't think I've ever quite forgiven ATI for that.

(My next card after that was a GeForce 6600 GT, which I was very happy with)
 
Oh, another thing that annoyed me with ATI drivers back in the day was the installation. The drivers were released fairly frequently, and I found myself switching drivers rather often. But this required two reboots: once to uninstall the old drivers, and again after installing the new ones, whereas with nVidia's drivers I could do this with just one reboot.

These days, you can actually install nVidia's drivers without rebooting at all. Furthermore, when it uninstalls the old display driver, it sets the display to a reasonable resolution (1600x1200 on my 1920x1200 display) so that my desktop icons and windows don't get all screwed up while it's installing. I also can completely customize the display scaling at different resolutions, though admittedly I can't use newer AMD drivers on my notebook to see if custom display scaling is offered there now as well. Custom display scaling is, however, very important for playing older games on new widescreen monitors (e.g. Diablo II).
 
Same goes for AMD drivers, so what is your point? Most of your opinions are based on experiences with a product (9700 Pro) that was released nearly 10 years ago?!
 
Well, that's good that they've fixed that. But like I said, I have an ATI laptop right now. The main problem is that I can't use drivers newer than about three years old in Windows. And those three-year-old drivers do not have support for display scaling customization, or most of the other options that I have become accustomed to with nVidia drivers.
 
their point

What is the point exactly? That reviewers received cherry picked samples that boost better than average? That's just whiny paranoid nonsense with no proof. Fact is that the feature works and soon enough there will be lots of end user feedback on whether it's a gimmick or not.

What I can say is that the average gamer is going to lap this shit up, especially with tools like precisionX out there. nVidia marketing wins again.
 
I think AMD and NV have near-parity on single-card drivers. Sometimes NV beats AMD to the finish line with newly released games and sometimes AMD has fewer single-card driver bugs. I would have to say, based solely on my experience, that AMD's CFX drivers are almost always behind NV's SLI drivers.

As for these two (7970/680) it seems to me that NV made a mistake going with 2GB as shown by all the higher-resolution, multimonitor benchmarks so the SLI driver advantage is moot for now.
 
This brings up an interesting question then. It appears that GPU Boost is known to reach 1058MHz and higher (1123MHz, as mentioned earlier in this thread). Did reviewers know what clock the card was actually running at in each game benchmarked? In a more controlled overclocking comparison the results seem to be more of a tie, with the indication that the 7970 can provide slightly better performance at a slightly lower overclock.
 
GPU boost is a stock feature of the card. If you take it out of the box and stick it in your machine that's the performance you get, end of story.

I'm not sure what a post from AMD's blog about LN2 over-clocking and clock-for-clock comparisons has to do with GPU boost.
 
I'm still trying to figure out who pays $500 for a card and plays at 1080p or less!

Raises hand :) My display for my gaming pc is a 65" 1920x1080, so no need for me to go higher res than that. For my work pc it's gpu assisted rendering performance that is most important, not resolution. So if a 680 speeds up my video encoding time then I'll get one for the office pc as well.
 
Well, that's good that they've fixed that. But like I said, I have an ATI laptop right now. The main problem is that I can't use drivers newer than about three years old in Windows. And those three-year-old drivers do not have support for display scaling customization, or most of the other options that I have become accustomed to with nVidia drivers.

Your laptop's video drivers must be supplied by your laptop manufacturer; ATI's default drivers purposefully block several manufacturers out of installation by manufacturer request. If you want, there are very simple ways to get ATI's drivers directly onto your laptop (like you can do for NV) and those might expose the later features that you were hoping to find. But your example isn't an ATI problem any more than it would be an NV problem...
 
Yes, that's why I'm not complaining about my laptop drivers specifically (that is, I'm not complaining about not being able to update them). Though I admit my impression of more recent ATI drivers may be somewhat colored by the fact that the ones I can use are rather old, they do lack features that I had become used to with nVidia drivers for years previously. I am going to be with only my laptop for a few weeks soon, though, so perhaps I'll take the time to figure out how to install some more up-to-date drivers. At least that will give me a better picture of the current status.

Now, in Linux, where I spend by far the most of my time on the laptop, I would say my experience is fully relevant. And basically it boils down to this:

1. For an AMD GPU, by far the way to go is to use the open source drivers, at least in my experience. This basically means no 3D games (though gaming in Linux is pretty difficult in the best of cases). The open source drivers are mostly functional, though there are some issues I have had when connecting/disconnecting extra displays. They are, at least, good enough to run the 3D-accelerated desktop. The use of the open source drivers does allow KMS, which dramatically speeds up sleep/wake times.

2. For an nVidia GPU, the vendor drivers are far and away better. The open source drivers are only barely functional, and not even good enough for the 3D-accelerated desktop. This does mean no KMS, which means waking from sleep mode easily takes 20 seconds or so (because the video driver has to be restarted). Aside from this annoyance, however, the drivers are generally more functional and more stable than either the ATI vendor or open source drivers, making gaming in Linux a real possibility (though not easy). In modern distributions, the added difficulty in installing the vendor drivers is much reduced, so it isn't a big deal.

So in the end, in Linux it's a choice between KMS (i.e. fast wake times) and higher general performance and functionality. Depending upon what I'm doing, I could go either way here. The laptop leans more towards KMS support (as I put it to sleep much more often), though the finicky multi-display setup is sometimes a nuisance, while the desktop leans more towards the better performance (as I play games in Linux more on my desktop).
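If anyone wants to check what their own box is doing, here's a rough Python sketch of where I'd look: it just reads /proc/modules to see which GPU driver is loaded and, for the open source drivers, checks the modeset parameter in sysfs (1 means KMS is on). The exact paths can vary by kernel and distro, so treat it as a sketch rather than gospel.

```python
# Rough sketch: report which GPU kernel module is loaded and, where the
# driver exposes it, whether KMS (kernel mode setting) is enabled.
# Paths are typical but can vary by distro/kernel - adjust as needed.
import os

GPU_MODULES = ["radeon", "nouveau", "nvidia", "fglrx"]

def loaded_modules():
    # /proc/modules lists every loaded kernel module, one per line.
    with open("/proc/modules") as f:
        return {line.split()[0] for line in f}

def kms_state(module):
    # The open source drivers expose a 'modeset' parameter; 1 means KMS is on.
    # The proprietary drivers of this era don't do KMS, so there's nothing to read.
    path = "/sys/module/{}/parameters/modeset".format(module)
    if os.path.exists(path):
        with open(path) as f:
            return f.read().strip()
    return "n/a"

if __name__ == "__main__":
    present = loaded_modules()
    for mod in GPU_MODULES:
        if mod in present:
            print("{}: loaded, modeset={}".format(mod, kms_state(mod)))
```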
 
Raises hand :) My display for my gaming pc is a 65" 1920x1080, so no need for me to go higher res than that. For my work pc it's gpu assisted rendering performance that is most important, not resolution. So if a 680 speeds up my video encoding time then I'll get one for the office pc as well.
This reminds me of one interesting bit in the 680 reviews: compute performance. Specifically, it seems like it's all over the place. It looks like nVidia simplified their instruction scheduler, shifting much of the optimization work from the hardware onto the software. This makes me wonder whether the compute performance is permanently crippled, or whether future drivers down the road will improve things. Will be interesting to see.

Will also be interesting to see what this means for their workstation line down the road.

P.S. I should mention, however, that encoding performance is one of those places where the 680 does well, though not as well as the 580.
 
CPU Boost has guaranteed values for all chips and doesn't boost further; there is no lottery here.
Nvidia guarantees a minimum of 5% for GPU Boost for every card. If it goes higher then that's a bonus and seems like a plus to me. Do you really want to leave performance on the table when playing games?

CPU Boost can be turned off.
Again, why disable performance?

And since you seem to want to disable performance-enhancing features for benchmarking, why stop at GPU Boost? Why not use older, less-tuned drivers? Why not disable caches? Why not reduce PCIe lanes? Slippery slope.
 
The problem is that reviews may have cards with Boost that goes up to 10%, but when people buy their own card, they may not get more than 5%, and even that 5% isn't really guaranteed to happen consistently in all games, at all times and under all conditions; just "sometimes".

Now a 5~10% discrepancy with reviews (at most) may not seem like much, but since the GTX 680 happens to be 0~10% faster than the 7970, it's understandable that some people might be uneasy about this.
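To put rough numbers on that, here's a quick back-of-the-envelope sketch in Python; the percentages are just the ranges being tossed around in this thread (a guaranteed 5% boost, ~10% on a good sample, a 0~10% lead over the 7970), not measured data:

```python
# Back-of-the-envelope: how much of a review's GTX 680 lead could come down
# to the review sample boosting better than a retail card. The figures are
# the rough ranges discussed in this thread, not measurements.
guaranteed_boost = 0.05          # minimum boost Nvidia is said to guarantee
review_boost     = 0.10          # a well-boosting review sample
lead_over_7970   = (0.00, 0.10)  # quoted stock lead, depending on the game

# If a retail card only hits the guaranteed boost while the review card hit
# ~10%, the buyer sees results up to this much below the review numbers:
worst_case_gap = (1 + review_boost) / (1 + guaranteed_boost) - 1
print("possible shortfall vs. review numbers: ~{:.1%}".format(worst_case_gap))
print("quoted lead over the 7970: {:.0%} to {:.0%}".format(*lead_over_7970))
# ~4.8% shortfall against a 0-10% lead - same order of magnitude, hence the unease.
```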
 
This reminds me of one interesting bit in the 680 reviews: compute performance. Specifically, it seems like it's all over the place. It looks like nVidia simplified their instruction scheduler, shifting much of the optimization work from the hardware onto the software. This makes me wonder whether the compute performance is permanently crippled, or whether future drivers down the road will improve things. Will be interesting to see.

Will also be interesting to see what this means for their workstation line down the road.

P.S. I should mention, however, that encoding performance is one of those places where the 680 does well, though not as well as the 580.

My guess is that Nvidia will have to hand-optimize certain compute tasks in the driver for the most 'important' apps for a while. I doubt that the on-the-fly compilation will do as good a job as a manually inspected workload that's been optimized in the driver.
 
Interesting, well it's definitely something I'll be testing myself at some point since I'll likely be getting a 680 for the gaming machine, and I can do GPU encode tests on it to compare to a 580. For me though it has to be Sony Vegas Pro for the encoding, because I sampled over a dozen GPU-assisted encoders and personally they all sucked because they cheated: they were much faster, but their GPU encoders produced inferior quality to software encoders, so I use none of them. Vegas Pro is unique in that its MainConcept GPU encoder produces encodes identical in quality to the software version, so you can have your cake and eat it too: quality GPU encodes with a speed assist. I have two overclocked i7-2600K machines that are identical in every respect except that the gaming version has a 580 and the office version has a 560 Ti (the 580 showed no increase in GPU encode speed over the 560 Ti). This will let me directly compare the 680's GPU encoding speed easily.
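For what it's worth, the timing side of a comparison like this is trivial; here's a rough Python sketch (the encoder command below is a made-up placeholder, substitute whatever you actually drive your renders with):

```python
# Minimal timing harness for comparing encode runs between machines/cards.
# ENCODE_CMD is a hypothetical placeholder - substitute the command line or
# render script you actually use to kick off the encode.
import subprocess
import time

ENCODE_CMD = ["my_encoder", "--input", "clip.avi", "--output", "out.mp4"]  # placeholder

def time_encode(cmd, runs=3):
    """Run the encode a few times and report the best and average wall time."""
    times = []
    for _ in range(runs):
        start = time.perf_counter()
        subprocess.run(cmd, check=True)  # raises if the encode fails
        times.append(time.perf_counter() - start)
    return min(times), sum(times) / len(times)

if __name__ == "__main__":
    runs = 3
    best, avg = time_encode(ENCODE_CMD, runs)
    print("best {:.1f}s, average {:.1f}s over {} runs".format(best, avg, runs))
```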
 
GPU boost is a stock feature of the card. If you take it out of the box and stick it in your machine that's the performance you get, end of story.

I'm not sure what a post from AMD's blog about LN2 over-clocking and clock-for-clock comparisons has to do with GPU boost.

We already know that GPU Boost is part of the card. What is being alluded to is that when both are overclocked, the AMD card is still slightly ahead at a slightly lower overclock. Since we aren't dealing with hot clocks, that's good to know. Also, since we know the card can boost past 1058MHz, did reviewers verify what boost clock it was actually running at? There is still a performance difference between 1058MHz and 1100MHz (as posted earlier in this thread).
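Just to put a ceiling on it: assuming performance scales at best linearly with core clock (a rough assumption), the gap between those two boost clocks works out to about 4%:

```python
# Rough upper bound on the 1058MHz vs 1100MHz question, assuming performance
# scales at most linearly with core clock (it usually scales a bit less).
advertised_boost = 1058  # MHz
observed_boost   = 1100  # MHz, the sort of clock some cards reportedly reach
print("at most ~{:.1%} from clock alone".format(observed_boost / advertised_boost - 1))
# ~4.0%, which is why it matters what clock a review card was actually running at.
```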
 
Let's assume the 7970 overclocks better and is faster at maximum overclock. That still has no bearing on the validity of GPU Boost as a stock feature of the 680. If I'm reading you correctly, you're saying that comparing the two cards at stock isn't fair and the only fair way to compare them is to overclock them to the max first?

I think these posts in that thread sum it up nicely.
When did we start comparing the clock speed of AMD's and Nvidia's respective GPUs? Such foolishness. Sigh... what's next, colors?
You can't cripple stock features of a card just because you don't like the results.
The problem is that reviews may have cards with Boost that goes up to 10%, but when people buy their own card, they may not get more than 5%, and even that 5% isn't really guaranteed to happen consistently in all games, at all times and under all conditions; just "sometimes".

It's just as likely that they get cards that boost higher than the ones in reviews. Variance goes both ways.
 
I think you mistakenly quoted me instead of someone else. I didn't make any mention of stock clocks. However, having posted about overclocking/boosting, I did ask if reviewers verified what the boost frequency was in game, since it can go higher and a higher boost does affect frame rates. :D

It's just as likely that they get cards that boost higher than the ones in reviews. Variance goes both ways.
But that's the point of me asking though. Since there can be some variance going both ways, were those results obtained at the 1058MHz boost or something else? I really don't know.
 
It's just as likely that they get cards that boost higher than the ones in reviews. Variance goes both ways.

Assuming NVIDIA doesn't cherry pick review samples, which is assuming a lot.

That said, I'm perfectly fine with Boost, I think it's a very good feature. I just think reviewers ought to make sure to warn their readers that their mileage may vary.
 