Nvidia BigK GK110 Kepler Speculation Thread

But why shift your entire product line pricing down when there is, supposedly, no real reason to? It seems pretty clear that their products are selling well and being able to position their products where they have is obviously quite an advantage.

I think we might see a slight shift in Nvidia's pricing due to AMD having the sales momentum at the moment, but that will probably come from more aggressive pricing of AIBs' custom cards rather than from Nvidia's MSRP.

AMD will move on, and already is... but I'm not sure the result will be what we expect (in terms of consumer prices, of course).
 
But why shift your entire product line pricing down when there is, supposedly, no real reason to? It seems pretty clear that their products are selling well and being able to position their products where they have is obviously quite an advantage.

The question is whether it's a short-term or long-term advantage. For the moment it could even be a necessary "evil", or let me call it just that. If you care more about revenues and less about increasing your volumes, what does that tell you?
 
All in all, Crysis 3 is the most demanding game we’ve ever tested. And even though the game’s visuals justify its GPU requirements, there is certainly no excuse for its CPU requirements.

Here I would tend to disagree with the authors of the article.
Kudos to Crytek for bringing current APUs to their knees. I mean, why should the 3770K have such a small die, be limited to 77W, and offer so few cores? I think they should put in the effort and double everything for the consumer market, and for such a high-end chip they should absolutely either make use of the graphics part for helpful calculations or get rid of it once and for all and replace it with proper CPU cores.
This is for the sake of progress: forget this energy-efficiency obsession and push progress forward.
 
Crytek should publish an article about their engine. It's hard to know for sure if these game sites know what they're talking about.

Crysis 1 had hardcore CPU requirements too, but partly because of problems like the sheer number of draw calls.
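To make that concrete, here's a minimal OpenGL-flavored sketch (hypothetical scene data; GL state setup and a GL 3+ loader like GLEW assumed) of why draw-call count hits the CPU rather than the GPU:

```cpp
// Sketch of why draw-call count is a CPU cost (hypothetical scene data;
// GL setup omitted, assumes a GL 3+ loader such as GLEW). Each glDraw*
// call pays CPU-side driver overhead, so the naive loop below becomes
// CPU-bound long before the GPU is busy.
#include <GL/glew.h>
#include <vector>

struct Object { GLuint vao; GLsizei indexCount; };

// Naive: one driver round-trip per object -- thousands of calls per frame.
void drawNaive(const std::vector<Object>& objects) {
    for (const Object& obj : objects) {
        glBindVertexArray(obj.vao);
        glDrawElements(GL_TRIANGLES, obj.indexCount, GL_UNSIGNED_INT, nullptr);
    }
}

// Batched: identical meshes submitted once, N copies drawn per call,
// cutting CPU overhead by roughly the instance count.
void drawInstanced(const Object& mesh, GLsizei instances) {
    glBindVertexArray(mesh.vao);
    glDrawElementsInstanced(GL_TRIANGLES, mesh.indexCount, GL_UNSIGNED_INT,
                            nullptr, instances);
}
```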

Kudos to Crytek for bringing current APUs to their knees. I mean, why should the 3770K have such a small die, be limited to 77W, and offer so few cores?
Because it has a big IGP inside, and the current reality is that a lot of the transistor budget is flushed away on it. Which is unfortunate, because I'm sure most of us don't even use that chunk of the die.

Ivy Bridge-EP goes up to 12 cores. Unfortunately it will undoubtedly cost a lot of money.
 
Exactly. What part of these so-called "masses" buys a 3770K and then uses the iGPU inside it? It's ridiculous, and I doubt anyone is doing it.

Actually, I suspect some universities and similar places might, but it's insane.
 
Exactly. What part of these so-called "masses" buys a 3770K and then uses the iGPU inside it? It's ridiculous, and I doubt anyone is doing it.

Actually, I suspect some universities and similar places might, but it's insane.
What I mean is that instead of an IGP in the chipset satisfying the need for graphics in all the budget, non-gamer machines out there, it's consuming transistors in the Ivy Bridge die. And since most of us use a graphics card, that semi-capable IGP is just a lot of wasted transistors; it's about the size of three CPU cores.
 
Intel is constrained by TDP if it adds more CPU cores. Look at the Xeon E5-2687W: only 3.1GHz but 150 watts, and that's a chip with the GPU removed. It's doable, but who wants a 150W CPU exactly? Not the OEMs (they won't pay for the big heatsink and multiple fans unless they're building a Mac Pro or an Alienware), so you'd only be selling to a fraction of a fraction of people: self-builders who buy high end.
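A back-of-envelope model of that constraint (illustrative constants, not measured silicon data): dynamic power scales roughly with cores × frequency × voltage², so doubling cores at fixed clocks roughly doubles power:

```cpp
// Rough dynamic-power model: P ~ k * cores * f * V^2 (illustrative
// constants, not measured silicon data). Shows why doubling cores at
// fixed clocks pushes a 77W part toward 150W territory.
#include <cstdio>

double dynPower(int cores, double ghz, double volts, double k) {
    return k * cores * ghz * volts * volts;
}

int main() {
    // Calibrate k so a 4-core, 3.5 GHz, 1.0 V chip lands at ~77 W.
    const double k = 77.0 / (4 * 3.5 * 1.0 * 1.0);

    std::printf("4c @ 3.5 GHz: %.0f W\n", dynPower(4, 3.5, 1.0, k));  // ~77 W
    std::printf("8c @ 3.5 GHz: %.0f W\n", dynPower(8, 3.5, 1.0, k));  // ~154 W
    // Keeping 8 cores near the old budget means dropping clocks and voltage:
    std::printf("8c @ 2.6 GHz, 0.85 V: %.0f W\n",
                dynPower(8, 2.6, 0.85, k));                           // ~83 W
}
```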

Well, you can go socket 2011 or dual C32... and if you don't like it, too bad; wait for the next-gen 2011 socket.
As for games themselves, developers can't cut themselves off from their customer base. We're barely seeing games that require at minimum a slower/older quad core or a very fast i3 to play, and you want games that require double that or else they run like shit? (i.e. a minimum CPU of a 2500K/3770K/FX-8150 etc.)

Talk about sawing off the branch you're sitting on...
Consoles are a counter-argument, maybe, but they should be matched well enough by Intel quads and AMD eight-cores.
 
We're just commenting on Crysis 3 and its CPU demands being supposedly physics related, and how that could be shifted to GPU. It's also not likely that game developers will focus on Tahiti and GK104 when they plan their games out. I have a hard time believing that performance would improve with added GPU load anyway (unless you're horribly CPU limited).
 
If the game can top 60fps on the fastest GPUs and CPUs available while maxing both out, then I'm not sure what logic there is in shifting the workload from one of those processors to the other. You're simply going to reduce overall performance while leaving part of your CPU idle. I'm much happier seeing all of a modern quad core used to its potential, freeing up as much rendering power on the GPU as possible, than having physics burdening the GPU just so the game is playable on an ancient dual core.
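A toy model of that point, with hypothetical numbers: since CPU and GPU work in parallel, frame time is roughly max(cpu_ms, gpu_ms), so shifting physics onto an already-saturated GPU just lowers the frame rate:

```cpp
// Toy frame-time model (illustrative numbers, not profiled data): CPU and
// GPU run in parallel, so frame time ~ max(cpu_ms, gpu_ms). Moving physics
// onto an already-maxed GPU lowers overall fps.
#include <algorithm>
#include <cstdio>

int main() {
    double cpu_ms = 14.0, gpu_ms = 16.0, physics_ms = 4.0;  // hypothetical split

    double before = std::max(cpu_ms, gpu_ms);                // 16 ms -> ~62 fps
    // Move physics off the CPU onto the GPU (assume similar cost there):
    double after = std::max(cpu_ms - physics_ms,
                            gpu_ms + physics_ms);            // 20 ms -> 50 fps

    std::printf("physics on CPU: %.1f ms (%.0f fps)\n", before, 1000.0 / before);
    std::printf("physics on GPU: %.1f ms (%.0f fps)\n", after, 1000.0 / after);
}
```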

If you find yourself CPU limited in a game like this, then there's an argument to be made that your system isn't very well balanced, at least not for this particular game. The article seems to be a case in point there: they're using the fastest dual-GPU card in the world with an old (albeit overclocked) Penryn-based CPU.
 
What I mean is instead of an IGP in the chipset satisfying the need for graphics in all of the budget non-gamer machines out there, it's consuming transistors in the Ivy Bridge die. And since most of us use a graphics card, that semi-capable IGP is just a lot of wasted transistors. It's about the size of 3 CPU cores.

This is what I keep going on about. It doesn't have to be wasted transistors. I think it's crazy that here we are, on the one hand arguing that physics should be farmed off to the GPU, and on the other hand arguing that we need more CPU cores because IGPs are useless and wasting space on the CPU die. Gee, I wonder what the obvious solution might be! This is exactly the direction AMD is heading, and hopefully (almost inevitably, I'd have thought) Intel will follow.

The biggest blockers are how capable current IGPs are in this regard (Trinity isn't all that far from Kaveri, for example), the capability of the API to handle this kind of workload split (no problem for PhysX or Hybrid SLI), and developer support.

A year or two from now, many high-end PCs will have a fairly capable "physics processor" sitting idle on the CPU die. I just hope we're not still arguing about highly stressed CPU cores vs. overloading the discrete GPU with physics calculations at that point.
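The plumbing for that split arguably exists already. For instance, a minimal OpenCL sketch (assuming an OpenCL 1.2 runtime, and using host-unified memory as a rough heuristic for "this is the on-die IGP") that finds a candidate device to queue physics kernels on while the discrete card renders:

```cpp
// Minimal OpenCL sketch (assumes an OpenCL 1.2 runtime is installed): find
// a GPU sharing memory with the host -- a reasonable heuristic for the
// on-die IGP -- so physics work could be queued there while the discrete
// GPU renders.
#include <CL/cl.h>
#include <cstdio>

int main() {
    cl_platform_id platforms[8];
    cl_uint numPlatforms = 0;
    clGetPlatformIDs(8, platforms, &numPlatforms);

    for (cl_uint p = 0; p < numPlatforms; ++p) {
        cl_device_id devices[8];
        cl_uint numDevices = 0;
        if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU, 8, devices,
                           &numDevices) != CL_SUCCESS)
            continue;
        for (cl_uint d = 0; d < numDevices; ++d) {
            cl_bool unified = CL_FALSE;
            clGetDeviceInfo(devices[d], CL_DEVICE_HOST_UNIFIED_MEMORY,
                            sizeof(unified), &unified, nullptr);
            char name[256] = {0};
            clGetDeviceInfo(devices[d], CL_DEVICE_NAME, sizeof(name), name,
                            nullptr);
            if (unified == CL_TRUE)
                std::printf("candidate physics device (IGP?): %s\n", name);
        }
    }
}
```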
 
I'm sorry, but the "use the full potential of my CPU" argument simply isn't valid in this case. When the top CPUs are struggling to deliver 40 FPS, you have a serious problem: your engine is suffering from a massive bottleneck that needs to be resolved, either through more optimization or by shifting the load elsewhere.

It's pretty obvious: CryEngine never had this kind of CPU dependency, not in its first version nor its second. The fact that it does now points to the nature of this newly created problem; it's as if they intentionally wanted to cripple the hardware to restore the glory days of the "Can it run Crysis?" era!
 
I think it's more important to understand why it's so CPU intensive. Does the end result justify the performance hit? Crysis 2 was pretty CPU intensive as well even with nothing happening on the screen.
 
I'm sorry, but the "use the full potential of my CPU" argument simply isn't valid in this case. When the top CPUs are struggling to deliver 40 FPS, you have a serious problem: your engine is suffering from a massive bottleneck that needs to be resolved, either through more optimization or by shifting the load elsewhere.
They didn't use a top CPU; they used a Q9650, a Core 2-era quad core, a 4½-year-old CPU.

Modern 3rd-gen Core i5s/i7s would be twice as fast.

It's pretty obvious: CryEngine never had this kind of CPU dependency, not in its first version nor its second!

CryEngine has always been CPU intensive. Far Cry almost maxed out my dual-core Athlon 64. Crysis almost maxed out my Core i7 920.

GPU physics is wonderful for eye candy, but the latency is too high for gameplay to be based on it. That might change in the future, especially with the architecture the new consoles have.
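In sketch form (toy stand-ins, not any particular engine's code), the latency constraint looks like this: waiting for same-frame readback serializes CPU and GPU, while the usual pipelined fix hands gameplay results that are a frame old:

```cpp
// Sketch of the GPU-physics latency constraint (toy stand-ins, not a real
// engine): same-frame readback serializes CPU and GPU, while the usual
// fix -- consuming last frame's results -- adds a frame of latency. That
// extra frame is fine for debris and particles, bad for anything the
// gameplay simulation must react to immediately.
struct PhysicsResults { int contacts = 0; /* positions, impulses, ... */ };

// Toy stand-ins for "enqueue kernels" and "block until the GPU finishes".
PhysicsResults kickGpuPhysics() { return PhysicsResults{}; }
PhysicsResults waitAndReadback(PhysicsResults r) { return r; }

void frameSynchronous() {
    // Correct results this frame, but the CPU idles during readback:
    // frame time becomes cpu + gpu instead of max(cpu, gpu).
    PhysicsResults now = waitAndReadback(kickGpuPhysics());
    (void)now;  // gameplay(now); render();
}

PhysicsResults g_lastFrame;  // one-frame-old results

void framePipelined() {
    // gameplay(g_lastFrame): reacts to *last* frame's physics -- latency.
    PhysicsResults inFlight = kickGpuPhysics();  // overlaps with rendering
    g_lastFrame = waitAndReadback(inFlight);
}

int main() { frameSynchronous(); framePipelined(); }
```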

Cheers
 
They didn't use a top CPU; they used a Q9650, a Core 2-era quad core, a 4½-year-old CPU.

Modern 3rd-gen Core i5s/i7s would be twice as fast.
Even a modern hexa-core CPU is not enough:
http://www.techspot.com/review/642-crysis-3-performance/page5.html


CryEngine has always been CPU intensive. Far Cry almost maxed out my dual-core Athlon 64. Crysis almost maxed out my Core i7 920.
They have been, but they were never bottlenecked by it, and they were certainly never as dependent on it as this.

GPU physics is wonderful for eye candy, but the latency is too high for gameplay to be based on it. That might change in the future, especially with the architecture the new consoles have.
That's why I think they should provide another route, or at least the option to turn that thing off completely.
 