Nvidia BigK GK110 Kepler Speculation Thread

Heh, well I also have a 990X, but I didn't pay any premium over regular hexacores for it. In general, paying the extra to go from a 3930K to a 3960X gets you basically nothing, but paying a similar amount of money to go from a 680 to a Titan will get you quite a bit of performance.
 
I doubt that Titan is going to look particularly impressive in price/performance compared to, say, a 2 x 670 setup. It's a super-enthusiast product, where value is not that great. Unlike Intel's Extreme CPUs, though, it should provide a great single-GPU performance upgrade.

When the first rumors about Titan appeared, I asked myself whether Nvidia might, in the future, try to do something like Intel has done with its latest CPU series: create a separate "ultra high end" lineup and a "consumer high end" lineup (like LGA1155 and LGA2011).

An ultra-high-end lineup costing nearly $1000 (like Intel's Extreme CPUs) would let them keep the "104/114" SKUs at the high-end price point ($500), and put the bigger SKUs, which are hard to sell and not cheap to produce, at the top of the list.
(Marketing-wise, you build goodwill because people will judge the whole range by the "extreme high end" even if they end up buying a low-end card (the halo effect), and you keep selling a card at a high-end price without having to push it down into the mid-range bracket (GK114).)

It's like Intel since Sandy Bridge and LGA2011. The top choice for gamers and mainstream consumers was the 2600K/3770K. But if you run 3D benchmarks or professional software and need 6-8 cores, you have to look at the "Extreme" CPUs and pay the price for them (i7-3930K/3960X). This has never lowered the perceived value of the i7-2600K/3770K in consumers' minds (even though they are comparatively cheap), and Intel has been able to keep prices high on the 6-core CPUs.

We can even imagine Nvidia implementing something like the "K" series on Intel CPUs: voltage control, overclocking, and TDP control on the "extreme high end" lineup, with limited options on the lower "high end" SKUs.
 
When the first rumors about Titan appeared, I asked myself whether Nvidia might, in the future, try to do something like Intel has done with its latest CPU series: create a separate "ultra high end" lineup and a "consumer high end" lineup (like LGA1155 and LGA2011).

An ultra-high-end lineup costing nearly $1000 (like Intel's Extreme CPUs) would let them keep the "104/114" SKUs at the high-end price point ($500), and put the bigger SKUs, which are hard to sell and not cheap to produce, at the top of the list.
(Marketing-wise, you build goodwill because people will judge the whole range by the "extreme high end" even if they end up buying a low-end card (the halo effect), and you keep selling a card at a high-end price without having to push it down into the mid-range bracket (GK114).)

It's like Intel since Sandy Bridge and LGA2011. The top choice for gamers and mainstream consumers was the 2600K/3770K. But if you run 3D benchmarks or professional software and need 6-8 cores, you have to look at the "Extreme" CPUs and pay the price for them (i7-3930K/3960X). This has never lowered the perceived value of the i7-2600K/3770K in consumers' minds (even though they are comparatively cheap), and Intel has been able to keep prices high on the 6-core CPUs.

We can even imagine Nvidia implementing something like the "K" series on Intel CPUs: voltage control, overclocking, and TDP control on the "extreme high end" lineup, with limited options on the lower "high end" SKUs.

I hate the way Intel positions its CPU lineup, so I hope Nvidia never does that :D
 
Heh, well I also have a 990X, but I didn't pay any premium over regular hexacores for it. In general, paying the extra to go from a 3930K to a 3960X gets you basically nothing, but paying a similar amount of money to go from a 680 to a Titan will get you quite a bit of performance.

Well, I'm potentially going from a 285 to a Titan. It all depends on whether SLI'd factory-overclocked 680s perform better for less money than a single Titan. If the Titan beats the SLI setup, then and only then is it worth the investment in my mind.
 
Well, I'm potentially going from a 285 to a Titan. It all depends on whether SLI'd factory-overclocked 680s perform better for less money than a single Titan. If the Titan beats the SLI setup, then and only then is it worth the investment in my mind.

It's not going to beat SLI 680s, except perhaps in a few rare corner cases. I think the rumour stating it to be 85% of a 690 is the absolute upper limit of what it can be.
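
For a quick back-of-the-envelope check (Python; every number below is an assumption, with the 690 treated as roughly two slightly downclocked 680s and typical SLI scaling guessed at ~92%):

[code]
# Back-of-the-envelope check of the "85% of a GTX 690" rumour.
# All numbers are assumptions: the 690 is treated as roughly two
# slightly downclocked GTX 680s, and SLI scaling as ~92% on average.

gtx_680 = 1.00                # single GTX 680 as the baseline
gtx_690 = 1.90                # assumed: ~2x a 680 minus clock/scaling losses
sli_680 = 2 * 0.92 * gtx_680  # assumed: two 680s at ~92% average SLI scaling

titan_upper = 0.85 * gtx_690  # the rumoured ceiling

print(f"Titan upper limit: {titan_upper:.2f}x a GTX 680")
print(f"SLI 680s:          {sli_680:.2f}x a GTX 680")
# Roughly 1.6x vs 1.84x: even at the rumoured ceiling, a Titan falls
# short of 680 SLI except where SLI scaling itself breaks down.
[/code]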
 
It's like Intel since Sandy Bridge and LGA2011. The top choice for gamers and mainstream consumers was the 2600K/3770K. But if you run 3D benchmarks or professional software and need 6-8 cores, you have to look at the "Extreme" CPUs and pay the price for them (i7-3930K/3960X). This has never lowered the perceived value of the i7-2600K/3770K in consumers' minds (even though they are comparatively cheap), and Intel has been able to keep prices high on the 6-core CPUs.

If you consider that Intel's real high end is a pair of Xeon 2687W CPUs, at four grand (!), they play this really straight. But for the consumer, the performance/price ratio gets dramatically worse the higher you climb in the lineup.

Intel has even successfully convinced people that Celerons are crap. I constantly see threads where people ask what to put in a computer to watch movies, or for their mother to play card games, etc. I tell people to get a Celeron, but they usually insist on choosing between an i3 and an A10. Yet the latest Celeron is a 2.7GHz Ivy Bridge, replacing a 2.6GHz Sandy Bridge. Either one of those spanks my 2.9GHz Athlon II X2, though by how much exactly I don't know: websites don't bother benchmarking low-end CPUs.
The biggest bang for the buck is to be found in those Celerons (some Pentium G models maybe match them); they have strong single-thread performance. I wonder if they beat the AMD FX chips on that front :LOL:

As a rough estimate, let's say a Celeron has 10% of the performance of the high-end Xeon pair in highly multi-threaded workloads (accounting for Ivy vs Sandy, clocks, core counts, HT vs no HT, and the small performance hit of using two CPUs with ECC and/or registered memory). In other words, the Xeon pair is about 10x more powerful, and about 100x more expensive.
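
Putting rough numbers on that (a quick Python sketch; the prices and the 10x performance factor are assumptions, not measurements):

[code]
# Putting rough numbers on the estimate above. The prices and the
# 10x performance factor are assumptions, not measurements.

celeron   = {"price": 40.0,   "perf": 1.0}   # ~$40 Ivy Bridge Celeron, baseline
xeon_pair = {"price": 4000.0, "perf": 10.0}  # two Xeon 2687W, ~10x multithreaded

for name, cpu in (("Celeron", celeron), ("Xeon pair", xeon_pair)):
    value = 1000 * cpu["perf"] / cpu["price"]
    print(f"{name:>9}: {value:.1f} perf units per $1000")
# Celeron: 25.0, Xeon pair: 2.5 - about 10x the performance for
# about 100x the price, i.e. 10x worse performance per dollar.
[/code]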
 
If you consider that Intel's real high end is a pair of Xeon 2687W CPUs, at four grand (!), they play this really straight. But for the consumer, the performance/price ratio gets dramatically worse the higher you climb in the lineup.

Intel has even successfully convinced people that Celerons are crap. I constantly see threads where people ask what to put in a computer to watch movies, or for their mother to play card games, etc. I tell people to get a Celeron, but they usually insist on choosing between an i3 and an A10. Yet the latest Celeron is a 2.7GHz Ivy Bridge, replacing a 2.6GHz Sandy Bridge. Either one of those spanks my 2.9GHz Athlon II X2, though by how much exactly I don't know: websites don't bother benchmarking low-end CPUs.
The biggest bang for the buck is to be found in those Celerons (some Pentium G models maybe match them); they have strong single-thread performance. I wonder if they beat the AMD FX chips on that front :LOL:

As a rough estimate, let's say a Celeron has 10% of the performance of the high-end Xeon pair in highly multi-threaded workloads (accounting for Ivy vs Sandy, clocks, core counts, HT vs no HT, and the small performance hit of using two CPUs with ECC and/or registered memory). In other words, the Xeon pair is about 10x more powerful, and about 100x more expensive.

Just to clarify, I was not including the "professional/workstation" Xeons, just the standard CPUs you find in desktop computers, from SB to SB-E. Clearly, with Xeons and multi-processor workstation platforms we enter another world. But mapped onto GPUs, those would correspond more to the Quadro and Tesla parts.
 
BTW, don't get an affordable pair of Xeons: those only run at something like 2.3GHz with six cores/twelve threads each, so you'll end up with a PC slower than an i7 3930K at 4.x GHz.

Some people do this for a 3D workstation; they're wasting a grand to get the Xeon brand and ECC memory they don't need.

Just to clarify, I was not including the "professional/workstation" Xeons, just the standard CPUs you find in desktop computers, from SB to SB-E. Clearly, with Xeons and multi-processor workstation platforms we enter another world. But mapped onto GPUs, those would correspond more to the Quadro and Tesla parts.

Sure! We're not that far off, though, when some people want to run GeForce Titans in SLI.
 
I guess it depends on individual needs. I have a 2560x1600 30-inch display, and playing games at that resolution, as I'm sure you know, is quite taxing on any hardware. Memory does not seem to be the problem; outright horsepower of the chip is what matters.

Memory size will matter a lot in next-gen console games. I wouldn't recommend getting a GPU with less than 4GB if you want it to last through the next generation (which, in raw core power terms, SLI 680s or a Titan will likely manage with ease).
 
Memory size will matter a lot in next-gen console games. I wouldn't recommend getting a GPU with less than 4GB if you want it to last through the next generation (which, in raw core power terms, SLI 680s or a Titan will likely manage with ease).

Just as long as you get a pair of GTX 680s with 4 GB of RAM each, and not 2 GB each. :)

Regards,
SB
 
I will future-proof my purchase then, based on your suggestions.

Take that with a grain of salt, of course; that was just me poking fun at the SLI'd GTX 680s. 3 GB of memory will likely be fine if you consider next-gen console specs, and assuming that most AAA PC games will again be console ports.

With 3.5 GB dedicated to games on the next PlayStation and 5 GB (rumored) for games on the next Xbox, it's unlikely that cross-platform games will use more than 3 GB just for graphics. Of course, there's always hope for higher-quality assets in a PC port, but I'm not sure how likely that will be.

Hence, anything with 3 GB or more is likely safe for at least the next 2-3 years. But 4 GB is probably safer still.
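
As a quick sanity check on those numbers (Python; the console budgets are the rumored figures above, and the graphics share of each budget is pure guesswork):

[code]
# A "safe VRAM" estimate from the rumoured console budgets above.
# The graphics share of each game budget is pure guesswork.

game_budget_gb = {"next PlayStation": 3.5, "next Xbox": 5.0}  # rumoured
graphics_fraction = 0.6  # assumption: share of the budget holding graphics data

for console, budget in game_budget_gb.items():
    print(f"{console}: ~{budget * graphics_fraction:.1f} GB of graphics data")
# ~2.1 GB and ~3.0 GB respectively, which is why 3 GB of VRAM looks
# safe for straight console ports, while 4 GB leaves headroom for
# higher-quality PC assets.
[/code]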

Regards,
SB
 
With 3.5 GB dedicated to games on the next PlayStation and 5 GB (rumored) for games on the next Xbox, it's unlikely that cross-platform games will use more than 3 GB just for graphics. Of course, there's always hope for higher-quality assets in a PC port, but I'm not sure how likely that will be.
Unfortunately, higher-quality assets eat up memory inefficiently on the PC. For example, GTA IV and Max Payne 3 consume more than 1GB of memory at maximum settings; Max Payne 3 even eats up close to 1.8 GB at 720p with 8x AA.

Large multiplayer matches in Battlefield 3 also consume close to 2GB, sometimes even more than that.

Recent COD titles also consume more than 512MB due to high-resolution textures; older cards with only that much memory cannot run those games smoothly unless texture resolution is lowered.

I do think that for a while developers will not be enthusiastic about adding extra assets (mainly textures) for the PC, given that the new consoles will provide very good texture quality. That will not last forever, though, and when they finally do, boy, memory usage will go through the roof!
 
It's worth remembering that drivers and tools give an overestimated figure for consumed graphics memory; you may see it reported as "allocated memory".
Of course, there are still genuinely memory-hungry games, for good reasons (lots of high-res textures) or not-so-good ones (GTA IV's highest setting, which brute-force scales up some aspects of the rendering for a small gain, and is entirely optional).
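
If you want to see what the driver itself reports, here's a minimal sketch (Python, assuming an NVIDIA driver recent enough that nvidia-smi supports --query-gpu; note that this figure, too, is an allocation count, not what the game strictly needs):

[code]
import subprocess

# Ask the NVIDIA driver what it currently reports (assumes a driver
# recent enough that nvidia-smi supports --query-gpu, and that it is
# on the PATH). Like most tools, this reports *allocated* memory,
# which overestimates what a game strictly needs resident.
out = subprocess.check_output(
    ["nvidia-smi",
     "--query-gpu=memory.used,memory.total",
     "--format=csv,noheader,nounits"],
    text=True,
)
# One line per GPU, e.g. "1321, 3072"; take the first GPU.
used_mib, total_mib = (int(v) for v in out.splitlines()[0].split(","))
print(f"Allocated: {used_mib} MiB of {total_mib} MiB")
[/code]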

To be on the safe side, maybe get a 2GB card if you want a decent but affordable one (650 Ti, 7850), and a 3GB one if you want a bigger card (7950, 660 Ti).
 
To be on the safe side, maybe get a 2GB card if you want a decent but affordable one (650 Ti, 7850), and a 3GB one if you want a bigger card (7950, 660 Ti).
I got a 3GB 660 Ti (even though I play at 720p) just to be on the safe side. Still, I don't feel completely safe now that I know the consoles will have 3GB+ of video memory!
 
Wow, I did not realize Battlefield 3 and Max Payne 3 took that much video RAM. Whichever card I go with, I will definitely pick the largest memory option. I don't upgrade as often as I used to, so future-proofing my purchases is the most logical approach. Thanks, guys.
 

http://www.arabpcworld.com/?p=25911
http://www.overclockers.ru/hardnews/51993/GeForce_Titan_po_sledam_pervyh_benchmarkov.html
 