And of course, nobody really knows the relative performance of a GT530 versus a GT-6-something. I certainly don't. So it's much better to have a full 6xx line with some rebrands than a mix of 4xx, 5xx, and 6xx parts with no clue at all about their relative performance.
Rebranding is great for product line clarity, the opposite of 'digging your own grave'. If it were the latter, Nvidia would have stopped doing it a long time ago. After all, they've been doing it for, what, more than 4 years now?
Do you honestly think people will upgrade from a GT510 to a GT610?
Oh I think they've been doing it for a lot longer than that. Enthusiasts bitch about it—rightly so—but most people think the cards are new, buy them (often in OEM machines) and probably never find out that they weren't. I'm sure it's quite profitable.
If that's your concern, you should be more upset about AMD using numbers with 4 digits tricking consumers into thinking that they are better than Nvidia's products with only 3.
Of course not, but people looking to upgrade from something older might pick the GT610 over, say, some HD 6000, on account of the former being perceived as "newer"; which is probably why AMD has renamed some low-end HD 6000s to HD 7000s.
That's not an issue limited to rebranding though.
Some 'new' cards, such as the 9800GTX and HD 6850/6870, actually performed slightly worse than the similarly named cards they replaced.
I don't think whether a card is 'new' really matters; the performance and features that matter to end users do.
If that's your concern, you should be more upset about AMD using numbers with 4 digits tricking consumers into thinking that they are better than Nvidia's products with only 3.
Do you really think consumers know whether an HD 6000 was introduced before or after a GT610? I certainly don't. Why does it even matter when a particular piece of silicon was first introduced?
Actually it's nVidia that's to blame: both used 4 digits first, and nVidia changed to 3 digits later.
So were those tests made by adjusting the power slider individually per game, between 75% and 91%? (Or did you set a fixed value based on those findings?)
What is Nvidia really promising: that the card will sometimes be able to hit 980MHz, or that that's the average clock in normal games?
Having the test sample run 100MHz higher than specified seems quite suspicious.
Found the exact wording. It is:
Nvidia RG said: The “Boost Clock” is the average clock frequency the GPU will run under load in many typical non-TDP apps that require less GPU power consumption. On average, the typical Boost Clock provided by GPU Boost in GeForce GTX 680 is 1058MHz, an improvement of just over 5%. The Boost Clock is a typical clock level achieved while running a typical game in a typical environment.
http://tech.pnosker.com/2012/05/21/...lling-all-gtx-670-680-690-kepler-video-cards/
Just random click-whoring, or some truth to this?
True. I think it was Silent_Buddha who mentioned that someone he knew had "upgraded" from an HD 5870 to a 6870. So maybe the GT510 -> GT610 upgrade isn't all that far-fetched after all.