What didn't you understand?
Stop. Right there.
What didn't you understand?
What the hell are you even talking about? The 8800 GT released practically 2 months before the HD 3000 series, so there was zero competitive pressure motivating the 8800 GT's low price: https://www.techpowerup.com/review/zotac-geforce-8800-gt/
The only competitors were the 2900 XT and Nvidia's own 8800 GTS 640 or 8800 GTX, all of which were more than $100 more expensive.
Same with Maxwell, where they priced it well below AMD's cards or their own previous generation, especially the 970: https://www.techpowerup.com/review/nvidia-geforce-gtx-980/
Well, hopefully there won't be scaling problems with RDNA2. An 80 CU part clocked high enough should be able to more than double the performance of the 5700 XT, which would make it competitive with the 3080. AMD has been really quiet, but information should start coming out now that Nvidia has announced their lineup.
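As a rough sanity check on that, here's a back-of-the-envelope FP32 estimate assuming throughput scales roughly linearly with CU count and clock (it won't in practice, which is exactly the scaling question). The 5700 XT numbers are its public specs; the 80 CU part and its 2.1 GHz clock are just assumed rumored figures.

```python
# Back-of-the-envelope FP32 throughput, assuming ~linear scaling with CU count and clock.
# The 5700 XT figures are public specs; the 80 CU / 2.1 GHz config is an assumed rumored part.
def fp32_tflops(cus: int, clock_ghz: float, shaders_per_cu: int = 64) -> float:
    """TFLOPS = shader count x 2 FLOPs per clock (FMA) x clock in GHz / 1000."""
    return cus * shaders_per_cu * 2 * clock_ghz / 1000.0

navi10 = fp32_tflops(40, 1.905)   # RX 5700 XT at its 1.905 GHz boost clock -> ~9.75 TF
big_navi = fp32_tflops(80, 2.1)   # hypothetical 80 CU RDNA2 part at a higher clock

print(f"5700 XT: {navi10:.2f} TF")
print(f"80 CU @ 2.1 GHz: {big_navi:.2f} TF ({big_navi / navi10:.2f}x the 5700 XT)")
```

On paper that's over 2x the raw ALU throughput; whether real game performance scales anywhere near that is the open question.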
Utter nonsense, DLSS isn't special in any way, devs will get there soon enough
Utter nonsense. It's not about "special", it's about time to market. 2 years after DLSS introduction, still not a single competitive solution on the horizon...
To be fair, DLSS hasn't been good until a few months ago. The first iteration was worse than a post processing filter. Although I doubt AMD would be able to provide a similar solution to DLSS 2.0 any time soon, other companies might have more of a chance.
I fully agree with this.
Probably similar to this gen with more HZD situations since Sony is planning to release more games on PC.
I'm not sure what HZD PC is indicative of. That game had a ton of patches on the PS4 and was/is prone to crashing. And that's on the closed platform! And that's even with the PS4's fancy crash video recording and debug upload system, which I thought was pretty cool lol. Hauling it over to the PC was sure to bring forth more joyful problems. It seems very Bethesda-like in its robustness.
In that you will need noticeably more powerful hardware to offer an equivalent experience.
https://horizon.fandom.com/wiki/Horizon_Zero_Dawn_updates#PlayStation_4_updates
Utter nonsense. It's not about "special", it's about time to market. 2 years after DLSS introduction, still not a single competitive solution on the horizon...
I'd say two years after its introduction, this solution is starting to deliver what was described in that two-year-old introduction. In selected games.
Utter nonsense, DLSS isn't special in any way, devs will get there soon enough,
the GPUs are within less than 10% of each other,
over a lifetime the Xsx is a far better investment as a 3070 will be outclassed by both consoles within a few years due to targeted optimizations,
and if you've only got $500 then you don't have enough money for an NVME SSD, 8+ core CPU, etc. etc.
While I agree that longer term, a 3070 will fall back due to lack of driver support from Nvidia and game developers moving on to more modern and powerful architectures
The only thing I disagree with in the wider context of your post.
Well, aside from the fact that it's the only machine-learning-based resolution upscaling solution on the market from any vendor, has been for the last 2 years, and that no-one else has even announced they're working on a competing solution at this stage (outside of a few high-level patents that may or may not lead to something).
And of course there's the fact that the only truly good version of it (DLSS 2.0) requires Tensor cores with huge INT8/INT4 throughput, giving the 3070 something like a 5x performance advantage over the XSX there.
But other than that, no, nothing special at all.
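To put a rough number on why that Tensor throughput matters for an ML upscaler, here's a purely illustrative count of the math in a single hypothetical convolution layer at 4K. The layer sizes below are made up for illustration and have nothing to do with DLSS's actual (proprietary) network.

```python
# Purely illustrative: why ML upscalers lean on tensor-core INT8/INT4 throughput.
# This is NOT DLSS's real network; it just counts the math for one hypothetical
# 3x3 convolution layer producing a 4K output.
h, w = 2160, 3840            # 4K output resolution
c_in, c_out, k = 32, 32, 3   # hypothetical channel counts and kernel size
fps = 60

macs_per_frame = h * w * c_in * c_out * k * k   # multiply-accumulates for this one layer
ops_per_second = macs_per_frame * 2 * fps       # 2 ops per MAC, sustained at 60 fps

print(f"{macs_per_frame / 1e9:.1f} GMACs per frame for a single layer")
print(f"~{ops_per_second / 1e12:.1f} TOPS at {fps} fps -- and a real network has many layers")
```

Even one made-up layer like that lands in the multi-TOPS range, which is why dedicated low-precision matrix hardware (or the lack of it) is relevant here.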
This is baseless speculation unsupported by any evidence at present. Our only data points so far for the performance of the XSX are the Gears 5 and Minecraft RTX demos. Both put it at or below (in RT) 2080-level performance. I acknowledge that two data points aren't enough to reach a decent conclusion, and I do expect it to move up from there, at least in regular rasterization, once more data points become available; but for the time being that's all we have, so it's not logical to assume something that contradicts the only two available data points without some other evidence. A 2080 Ti is almost 30% faster than a 2080 at 4K, and the 3070 is faster still, so claiming a 10% difference seems a stretch at best. And what of ray tracing performance? Or ALU-heavy workloads, where the 3070 has a 66% advantage over the XSX?
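That last figure is just paper FP32 math, but it checks out against the commonly quoted specs (5888 FP32 ALUs at a ~1.73 GHz boost for the 3070, 52 CUs of 64 ALUs at 1.825 GHz for the XSX); treat it as a ceiling rather than measured game performance.

```python
# Paper FP32 throughput from the commonly quoted specs; a ceiling, not measured game performance.
rtx_3070_tf = 5888 * 2 * 1.725 / 1000        # RTX 3070: 5888 FP32 ALUs at ~1.73 GHz boost -> ~20.3 TF
xsx_tf = (52 * 64) * 2 * 1.825 / 1000        # Series X: 52 CUs x 64 ALUs at 1.825 GHz -> ~12.15 TF

advantage = (rtx_3070_tf / xsx_tf - 1) * 100
print(f"3070 ~{rtx_3070_tf:.1f} TF vs XSX ~{xsx_tf:.1f} TF -> ~{advantage:.0f}% advantage on paper")
```

That comes out at roughly the two-thirds advantage quoted above.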
Not if you're after "max graphics" like you stated above. For the next 2-3 years "max graphics" is more than likely going to require a 3070 given a choice between the two, especially where Raytracing is involved. While I agree that longer term, a 3070 will fall back due to lack of driver support from Nvidia and game developers moving on to more modern and powerful architectures, that doesn't change the situation in the here and now. Nor is it a particular problem given that the vast majority of PC gamers who purchase an x070-class GPU don't intend to keep it for an entire console cycle and would usually upgrade to something much more powerful within 2-4 years.
So yes, as a value proposition over the long term, the XSX is likely better. But if you want "max graphics" you'll want to go with the 3070, and then reconsider your options again in a few years.
That goes without saying.
The tests were done in March 2020 without the final devkit, which was only delivered in June. We can compare at the end of the year with the Digital Foundry tests... I suppose the 3070 will be faster, but we have no idea how much faster it will be.
New GPU releases also take a couple of driver revisions to mature, so you could say we haven't seen the full potential there either.
And the Series X is already close enough to a 3070 to be a basic wash as far as that's concerned.
both consoles within a few years due to targeted optimizations
But if you want "max graphics" you'll want to go with the 3070, and then reconsider your options again in a few years.
Almost double the performance in rasterization for the 3070. Then there's RT, DLSS, etc. Not even close.
10TF vs a 20TF gpu, hell of an optimization there.
A 3070 will last an entire gen. A 7870 did, and it plays HZD fine. And that's closer to the console than the 3070 is.
10TF vs a 20TF gpu
CEASE.
We will wait until they release more drivers, but currently Ampere seems less efficient than Turing in real game performance per flop.
Don't we all love ALU piles.
Btw, I wonder if Big Navi could be more than 80 CUs?
Explicitly 80.
Would be barely enough to compete with the 3070, but not more.
Lol.
But I think all rumors were 80.
80 is the actual driver listing (2*5*2*4).
And only Arcturus is 128?
Yeah.