Nvidia's 3000 Series RTX GPU [3090s with different memory capacity]

But then the 40x0 products will launch at higher prices, inflated by scalpers etc., and we'll be in the same position, waiting on stock and prices to drop.

I guess they simply don't want to sell me a product at all then. :LOL:
 
TBH I'm not optimistic about the next generation either. The cynic in me feels that both IHVs will think that there's been some acclimatisation to higher prices over the past year or two and they will take the opportunity to adjust the MSRPs upwards.

I'm still running a 1080Ti, I've given up on this round and will see what happens but there's no way I'm paying more than about £600 for a graphics card. Anything in the four-digit range can, in the famous words of some Ukrainian soldiers, go fuck itself.
 
I'm sure Intel is reading this forum and has come to the conclusion that they need to disrupt pricing with their upcoming release. 3070 performance is worth, what, maybe $350? Thinking GTX 970 days here. ;)
 
I'm sure Intel is reading this forum and has come to the conclusion that they need to disrupt pricing with their upcoming release. 3070 performance is worth, what, maybe $350? Thinking GTX 970 days here. ;)

While I'm excited to see whether Intel can compete at least within the performance bracket on down (enthusiast level isn't really necessary for Intel, IMO) and how they will price it, I'm still concerned about whether they will finally have good driver support. The little bits we've seen from A350-equipped laptops don't really give much hope that they'll be up to AMD and NV driver quality levels.

Crossing fingers that driver issues will be a large focus for Intel once their cards are available in Western markets because I'd really like to see a new player come in and disrupt things a bit.

Regards,
SB
 
While I'm excited to see whether Intel can compete at least within the performance bracket on down (enthusiast level isn't really necessary for Intel, IMO) and how they will price it, I'm still concerned about whether they will finally have good driver support.
They seem serious to me. They've unified their installer to support everything Skylake and newer in one easy-to-find download. I noticed they fixed up Vulkan and it works with Dolphin now. And whenever I feel the urge to play some oldie, it seems to work just fine on Intel HD. But performance in the latest releases? Will VR even work?? Who can say!!

Gonna need steeeeeep discounts to get me onboard, Intel. :D
 
I'm sure Intel is reading this forum and has come to the conclusion that they need to disrupt pricing with their upcoming release. 3070 performance is worth, what, maybe $350? Thinking GTX 970 days here. ;)

I would allow them $400. The 3070 is a solid 1440p card. The 970 at $329 was more of a 1080p card that flirted with 1440p in some games. And wafers are more expensive these days. The 4070 needs to be a solid 1440p card with RT on for under $450.
 
I would allow them $400. The 3070 is a solid 1440p card. The 970 at $329 was more of a 1080p card that flirted with 1440p in some games. And wafers are more expensive these days. The 4070 needs to be a solid 1440p card with RT on for under $450.
The 4070 won't be less than $600 MSRP because they know gamers will buy them.
 
The 4070 won't be less than $600 MSRP because they know gamers will buy them.
Do they? Turing sales struggled because it was perceived as too expensive.

Are you sure you're not forgetting that the people mainly paying for all these really expensive GPUs this past year were primarily cryptominers, not gamers?

That said, it really depends on what a 4070 actually is/how much faster it is. If it's like 20% faster than a 3090, then yes, there might be a market for it at $600. If it's only as fast as a 3090 for $600, that will feel pretty lousy, given a 3080 was $700 two years prior and isn't much slower.
 
4070 being faster than 3090Ti while costing $600 would still be a sizeable perf/price improvement when compared to 30 series.
So it is possible.

It is also long overdue for people to stop thinking that a random product name has some pricing range attached to it for some reason.
 
Do they? Turing sales struggled because it was perceived as too expensive.
I'm not sure it was just too expensive, but also a mistrust of paying higher price for additional features for raytracing and tensor cores with practically no perceived real world value. Several years on, gamers are now seeing the fruits of that investment in their games and are likely far more willing to pay a premium again.

That being said, there's going to be a flood of Ampere coming onto the market that a lot of folks still on Pascal (like myself) or Maxwell (like my wife) are going to upgrade to. Maybe even some lower-end Turing users.

The 3090 is a terrible comparison though, as the halo "Titan" cards have always had stupid pricing for the extra VRAM, just because they can, for what amounts to little performance over the top-end regular gaming card. So yeah, the 3080's MSRP and performance should really be the benchmark for the next gen, but certainly not the fuck-the-consumer-over-a-barrel 12GB version.
 
It is also long overdue for people to stop thinking that a random product name has some pricing range attached to it for some reason.

Yep, and it doesn’t help that Nvidia has been random with its SKU naming in the past few generations. It starts making a bit more sense if you look at die sizes (cost) and performance (value) instead.

The slight problem with that is that the 3080 was basically unavailable at its MSRP during its lifetime. That may change by the time of the 40 series launch, of course.

People’s reaction to 40 series pricing will likely be based on what Ampere is selling for at the time. And with the direction prices are heading you should be able to snag a 3070 Ti for under $600 by then.

The reality though is that transistors are getting more expensive. It’s not reasonable to expect a 400mm^2 5nm die to sell for the same price as a 400mm^2 16nm die.

Having said that I expect die sizes to be on the lower side for this upcoming generation. Doesn’t help to pack in a lot more transistors if you can’t cool them adequately. Laptop darlings like AD106 should be tiny.
 
It is also long overdue for people to stop thinking that a random product name has some pricing range attached to it for some reason.

The problem with that is that consistent naming exists for a reason. It's to more easily denote to the consumer (and sometimes more importantly retailers) the market segment the product is intended for. That in turn governs the prices that consumers expect to pay for it and retailers expect to charge for it.

Obviously everything about that has gone really wonky for the past 3 generations (Pascal, Turing, and Ampere) due to various crypto-mining bubbles. However, outside of the crypto-bubbles, retailers and consumers will generally want consistency with naming versus pricing.

The only time you see an upheaval is when the entire product stack is shifted. AMD did this back with the older Radeon 6xxx generation (not the current 6xxx gen, again a bad naming scheme), for example, and I hated that they did that. The 6870 didn't target the performance segment the way the 5870 did. Instead it targeted the performance bracket that the 5770 did, but at the price bracket of the 5870. Bleh.

Another example is with LG's lineup where the B series used to be the "budget" OLEDs but since they've shifted the price of everything upwards the A series is now the "budget" OLEDs.

So, yes, while there are examples where entire product stack shifts occur, it's generally more the exception than the rule. And usually a company will just change the entire naming scheme rather than shift the existing one, as both AMD/ATI and NV have done in the past when changing the naming schemes they used.

For myself, I always find shifting the product stack to be slimy. I hated it when AMD did it. I hated it when LG did it with their OLED TVs. And I'm always going to hate it.

Regards,
SB
 
The problem with that is that consistent naming exists for a reason. It's to more easily denote to the consumer (and sometimes more importantly retailers) the market segment the product is intended for. That in turn governs the prices that consumers expect to pay for it and retailers expect to charge for it.

That hasn't happened everywhere though. It's really only the xx70 series that had a big swing from Pascal days.

MSRP for the 1060 was $299, the 2060 was $349 and the 3060 is $329.
MSRP for the 1070 was $379, the 2070 was $499 and the 3070 is $499.
MSRP for the 1080 was $599, the 2080 was $699 and the 3080 is $699.
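To make the "only the xx70 swung" point concrete, here's a quick sketch that computes the Pascal-to-Ampere MSRP change per tier. The prices are commonly cited launch MSRPs in USD (assumed here; Founders Edition pricing differed):

```python
# Launch MSRPs per tier (USD), Pascal vs Ampere (assumed launch prices).
msrp = {
    "xx60": {"pascal": 299, "ampere": 329},
    "xx70": {"pascal": 379, "ampere": 499},
    "xx80": {"pascal": 599, "ampere": 699},
}

for tier, gens in msrp.items():
    # Percentage change from Pascal to Ampere for this tier.
    change = (gens["ampere"] - gens["pascal"]) / gens["pascal"] * 100
    print(f"{tier}: ${gens['pascal']} -> ${gens['ampere']} ({change:+.0f}%)")
```

On these numbers the xx60 and xx80 tiers drifted up roughly 10-17%, while the xx70 tier jumped about 32%, which is the outlier the post is describing.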
 
The problem with that is that consistent naming exists for a reason. It's to more easily denote to the consumer (and sometimes more importantly retailers) the market segment the product is intended for.
Nope. It's there precisely to make a consumer spend more money than they otherwise would, because "I always buy an x70 card!" Look at perf/price and forget the product names; they are there to confuse you.
 