Nvidia GeForce RTX 4070 Ti Reviews

So 10-25% faster than a 3080 for 15-20% more money. I can't wait to see how laughable the 4070 and below are. Either you're willing to buy a 4090 or don't bother this generation.

According to TPU it's 22-23.5% faster on average (Raster/RT) and it's 14% more expensive by MSRP. Plus you get DLSS3 and the performance advantages of SER, if it ever shows up outside of tech demos, not to mention the power efficiency and extra memory. So on price/performance it's pretty good compared to the 3080, which was previously considered very good. Obviously that's not accounting for the fact that we should be getting much better price/performance 2 years on with a new generation, but nevertheless, the 3080 doesn't make much sense vs this card unless you're getting it at a much bigger discount than the MSRP gap.
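A quick sketch of the perf-per-dollar arithmetic implied by those TPU figures (the speedups and price ratio are the ones quoted above, nothing else is assumed):

```python
# Perf-per-dollar sketch using the TPU figures quoted above:
# 4070Ti is 1.22x (raster) / 1.235x (RT) the 3080's speed at 1.14x the MSRP.

def perf_per_dollar_gain(speedup: float, price_ratio: float) -> float:
    """Relative perf/$ of the new card vs the old one."""
    return speedup / price_ratio

raster = perf_per_dollar_gain(1.22, 1.14)   # ~1.07, i.e. ~7% better perf/$
rt = perf_per_dollar_gain(1.235, 1.14)      # ~1.08, i.e. ~8% better perf/$
print(f"raster: {raster:.3f}, RT: {rt:.3f}")
```

So by these numbers the perf/$ gain over the 3080 is in the high single digits, which matches the "pretty good, but not generational" tone above.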
 
MSRP should also be adjusted for inflation. This mattered less in prior years, but inflation has been significant over the past 2 years.
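The inflation-adjustment point can be sketched with a back-of-envelope calculation; note that the ~14% cumulative figure below is an assumed placeholder, not an official CPI number:

```python
# Back-of-envelope inflation adjustment of the 3080's $699 MSRP.
# The 14% cumulative-inflation figure is an assumed placeholder for
# Sep 2020 -> Jan 2023; check official CPI data for the real number.

msrp_3080_2020 = 699
cumulative_inflation = 0.14  # assumption, not an official figure
adjusted = msrp_3080_2020 * (1 + cumulative_inflation)
print(f"$699 in 2020 ~ ${adjusted:.0f} in early-2023 dollars")
```

Under that assumption, the 3080's launch price lands within a few dollars of the 4070 Ti's $799 MSRP, which is what makes the inflation argument relevant here.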
 
It seems like a nice enough card, but the pricing remains out of touch, at least if Nvidia wants to sell these with any considerable volume. The primary reason most consumers get excited for new generations & node shrinks is because, for the last 3 decades, that has meant a performance / $ increase.

That hasn't happened. And Nvidia doesn't seem to have understood that the label on the card doesn't determine whether it is low, mid, or high end. The price does.

If the MSRP of the 4070 Ti was $699 (same as the 3080), it would at least be an improvement over the previous gen. I don't doubt that it will sell to some degree - to high-end buyers (would-be 3080 owners who never got one last gen) - but I'll be very surprised if Nvidia moves the quantity they are clearly expecting.

TBH, it looks like they got ripped off by TSMC while the crypto craze was going on and demand was high. They should spend their current capacity producing as many AD102s as possible and renegotiate their contract for future supply. Yes, inflation exists, but when wage increases don't match inflation, demand for non-essential products goes down... forcibly. And gaming GPUs are non-essential.
 
If it was $500 it would fly off the shelves, but $800 is a crazy price. A bit faster than the 3080 for a similar price increase, years later, is not particularly enticing.
What if you didn't bother with the previous generation? (Like most people usually do.)
It's roughly 3090Ti level in rasterisation at 1440p, which makes the 4070Ti about 2x faster than the 2070S at 1.6x higher MSRP ($800 vs $500, so 1.25x better value) 3.5 years later. That's about 1.065x, or 6.5%, better value year on year. Or, on average, it's about 20% faster than the 3080 in raster at a 15% higher price more than 2 years later.

*excluding inflation

Whether people think that's "good" or "worth it" will depend on the person: what they have now, their disposable income, whether they're willing to get a GPU second hand, how long they've been looking at GPUs, etc. Historically that level of value increase would be terrible, but maybe they're relatively new and don't have those comparison points. Plus, things are more expensive and still getting pricier, so maybe it's not fair to expect similar value gains, although that's another discussion.
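The year-on-year value arithmetic above can be sketched as follows (assuming the stated $500 2070S MSRP and 3.5 years between launches):

```python
# Year-on-year value sketch: 2x the 2070S's performance at 1.6x the price,
# 3.5 years apart, annualized via a geometric mean.

def annualized_value_gain(speedup: float, price_ratio: float, years: float) -> float:
    total = speedup / price_ratio      # 2.0 / 1.6 = 1.25x better value overall
    return total ** (1 / years)        # per-year multiplier

g = annualized_value_gain(2.0, 800 / 500, 3.5)
print(f"{(g - 1) * 100:.1f}% better value per year")  # ~6.6%
```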
 
Obviously that's not accounting for the fact that we should be getting much better price/performance 2 years on with a new generation, but nevertheless, the 3080 doesn't make much sense vs this card unless you're getting it at a much bigger discount than the MSRP gap.
Yeah this is quite a decent upgrade for Turing/Pascal owners, but I don't see many Ampere users wanting to upgrade. Usually anyone willing to spend $900+ on a GPU likely spent that last gen. I certainly wouldn't be buying a 3080 now instead of 4070Ti.
 
Read the edit.
Just saw it. Yes, I agree that wages haven't kept up with inflation (well, at least mine hasn't). It sucks.

Still, the cost of moving stuff around the world has gone up and has affected prices of everything from bread to cars to homes. So you can't hold GPUs up to a different standard.

But I get that as far as buyer purchasing decisions are concerned, the attractiveness of a $799 luxury good in 2022 is degraded vs. a $699 luxury good in 2020.

Thank you for addressing it in a coherent way. That's all I ask for.
 
if somebody would still ask me what GPU to buy for an upgrade. Which rarely happens these days.

Yeah, it's easier than ever picking the GPU you want. I've noticed family and friends able to make their choices themselves these days.
 
A couple of points after skimming through reviews.

1. The card obviously struggles in 4K with its 192-bit bus. This was already the case with 4080 but it's even more pronounced here.
This (coupled with "just" 12GB of VRAM) leads to interesting comparisons with the 7900XT - the whole "unlaunch" looks even more like a last-minute product repositioning to me now.
While it's arguably a better product for 1080p-1440p resolutions, the tables turn at 4K, where the 7900XT starts to look a lot better despite being a bit more expensive - even (or maybe especially, as this is where 12GB looks weak even today) considering RT performance.
This is a good result for AMD, I'd say, as many potential buyers in this price range will choose the card with more headroom for 4K.

2. The card's efficiency is worse than that of 4080 which is a weird result as you'd expect it to be the opposite.
It seems like the GPU is pushed to 1.1v almost constantly - which is more than in either 4080 or 4090:


[TPU clock-vs-voltage charts for the 4070Ti, 4080, and 4090]


This could mean that they are pushing more power trying to salvage more chips to offset the $100 "price drop".

Overall the card is just as "meh" as any new gen launch (with the exception of 4090) - perf/price isn't improving a lot and it's not an easy sell because of that.
 
the tables turn at 4K, where the 7900XT starts to look a lot better despite being a bit more expensive - even (or maybe especially, as this is where 12GB looks weak even today) considering RT performance.

I'm not really seeing that. According to the TPU review the 7900XT is 10% faster on average in raster at 4K and 7% slower in RT. This is while being 12.5% more expensive, drawing more power, and lacking DLSS2/3.

The ComputerBase averages on the previous page make it look even worse from a performance perspective, with it being just 7.7% faster in raster and almost 15% slower in RT.
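For what it's worth, the same perf-per-dollar framing applied to the 4K TPU numbers quoted above (the $899 vs $799 MSRPs are the two cards' list prices):

```python
# Perf-per-dollar at 4K using the TPU numbers quoted above:
# 7900XT is 1.10x (raster) / 0.93x (RT) the 4070Ti at $899 vs $799 MSRP.

def value_ratio(relative_perf: float, relative_price: float) -> float:
    return relative_perf / relative_price

price = 899 / 799                                        # ~1.125x more expensive
print(f"raster value: {value_ratio(1.10, price):.3f}")   # ~0.98, slightly worse perf/$
print(f"RT value: {value_ratio(0.93, price):.3f}")       # ~0.83
```

By this framing the 7900XT's 4K raster lead roughly cancels out against its higher price, and RT perf/$ clearly favors the 4070Ti.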
 
That's the thing: it's just 7% slower in RT while having 20GB of VRAM, which makes it a bit more future-proof for someone buying a GPU for $1000 (ish) and planning to plug it into a 4K TV, for example.
This average of course doesn't account for games with heavy RT usage, where it may be up to 50% slower - but there's a general feeling that such games won't run well on the 4070Ti either.
So in this particular choice scenario 7900XT is faring better than I'd expect IMO.
 

I'm not sure 12GB itself will become a limiter, as it seems like scaling past 12GB this generation would likely require heavy RT at native 4K (which would also require performance scaling), as opposed to just texture-related settings (which don't affect performance). For instance, Portal RTX can blow past the 12GB limit, I believe, but it isn't playable even on the 4090 at native resolution.

I feel 12GB is an important demarcation point in terms of VRAM for this entire console generation, much like 6GB was for the last one (which is why I think in practice the 4070Ti will be considered much better than the 3080 10GB). The major VRAM concern with this generation's GPUs will be the 8GB ones, which may even start struggling at 1080p.
 
This average of course doesn't account for games with heavy RT usage, where it may be up to 50% slower - but there's a general feeling that such games won't run well on the 4070Ti either.

I don't see why it wouldn't. It has near enough 3090Ti-level RT performance, which has always been more than sufficient for the heaviest RT titles, provided you're willing to use DLSS. And if those games force a resolution drop - even with upscaling - to maintain playable framerates on the 4070Ti (and by extension the 7900XT), then the 4070Ti's advantage will likely be even greater.
 
It's quite a bit worse than I thought it would be - a hard sell alongside the 4080. Rough start for the new gen of GPUs, although I guess that's to be expected after the pandemic + mining craze.
 
Unless a GPU is required for one's work, you absolutely can.... and the vast majority of people will. But I'll let Nvidia's quarterly report reflect that....

I'm curious, what do you suggest? Selling at a loss and having an even worse quarterly report?
 
What if you didn't bother with the previous generation? (Like most people usually do.)
I'd just keep on waiting. What about the 4070Ti would be enough to entice people who decided the 3080 wasn't worth it? I suppose there may be a small number of people remaining who wanted a 3080 but couldn't find one in stock at a reasonable price. I would think most people willing to spend this much on a GPU have likely already done so. I struggle to see what the target audience is for this card to sell in any volume.
 