They stated that for $400 you shouldn't have to be lowering texture settings below console equivalents at low resolutions. I fail to see what's unreasonable about that.
> They stated that for $400 you shouldn't have to be lowering texture settings below console equivalents at low resolutions. I fail to see what's unreasonable about that.

There are no perceivable quality differences between High and Ultra textures in TLOU Part I; the latter seems to just allocate more VRAM (+1.5GB or so) for textures, as can be seen from monitoring. Thus High is the console-equivalent option there.
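For anyone who wants to reproduce that kind of monitoring, here's a minimal sketch using NVML through the nvidia-ml-py (pynvml) package. It assumes an NVIDIA card and Python on the test machine, and it only reports total allocation on the device, not what the game actually touches:

```python
# Poll dedicated VRAM allocation once per second while a game is running.
# Requires: pip install nvidia-ml-py  (imported as pynvml) and an NVIDIA GPU/driver.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        # NVML reports device-wide allocation, so compare readings taken at the
        # desktop vs. in-game with High vs. Ultra textures to see the delta.
        print(f"VRAM used: {mem.used / 2**30:.2f} / {mem.total / 2**30:.2f} GiB")
        time.sleep(1.0)
except KeyboardInterrupt:
    pynvml.nvmlShutdown()
```

Comparing a High run against an Ultra run this way is where the roughly +1.5GB allocation gap mentioned above shows up; allocation isn't the same thing as memory the game actively uses, so treat the absolute numbers with care.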
> And yet they've stopped talking about TLOU since the VRAM issue was addressed.

They haven't. They used it at 1440p to show the VRAM limitations of the 4060 Ti.
Crazy huh?
> Now they are using "bandwidth" as an argument. Look at the TLoU result from the 4060 Ti review. The card is slower than a 3060 Ti, but only in their benchmark. At Computerbase, Daniel Owen and co. the 4060 Ti is at least as fast or even faster.
> "Never change a running system".

HUB tested the heaviest section of TLOU, in the woods. Daniel Owen tested the prologue, so that could be why.
> What is this in reply to? Use quotes.

No one in particular, just my thoughts on the current HUB drama.
> There are no perceivable quality differences between High and Ultra textures in TLOU Part I; the latter seems to just allocate more VRAM (+1.5GB or so) for textures, as can be seen from monitoring. Thus High is the console-equivalent option there.

TLOU isn't the only game. He shows serious problems with RE, Callisto, Plague Tale and Forspoken. He mentions Hogwarts as well. 8GB is not enough VRAM for this card's performance profile. Going forward, things will more likely get worse rather than better.
It would destroy all of HUB's VRAM narrative in TLOU though, as High textures now, after all the patches, fit just fine in 8GB at 1080p-1440p.
Edit: or is it Very High? I don't remember. The point is that there is no reason to run Ultra textures there unless you're purposefully aiming to produce VRAM issues on 8GB GPUs.
> They stated that for $400 you shouldn't have to be lowering texture settings below console equivalents at low resolutions. I fail to see what's unreasonable about that.

Yeah. I think just historically, at this point we would have been expecting better (2.5 years after the PS5 launch), using PS3 or PS4 as comparison points.
> I think it's worth pointing out that at the point of the PS3's release it was using a GPU that was up there with the fastest PC GPUs on the planet, and there were only two or three PC GPUs available at the PS3's launch that were faster, being the 7800 GTX 512MB and the 7900 GTX.
> That was not the case with the PS4, which launched with an already 12-month-old mid-range PC GPU, so it was much easier to find a GPU that could easily and cheaply beat it.
> PS5 is closer to PS3 in respect to its GPU performance vs what was available at its launch on PC.

8800 GTX was available at PS3 launch for only $100 more and was well over twice as fast. Probably 2.5-3x. Not in one specific area, but in general performance. The additional features the G80 offered were also available in nearly every game.
> I had an 8800 GTX (well, two actually) and it wasn't 2.5-3x faster. Sometimes it was 2x and sometimes it was less. But my point still stands: PS3 was up there with the best when it released, whereas PS4 was miles off it.

8800 GTX was available at PS3 launch for only $100 more and was well over twice as fast. Probably 2.5-3x. Not in one specific area, but in general performance. Games the PS3 ran at sub-720p and 20-25 fps, the 8800 GTX ran at 1080p and 30+ fps.
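As a rough sanity check on that last comparison, pixel throughput alone already lands in that range (this ignores per-pixel shader cost entirely, so it's illustrative, not a benchmark):

```python
# Pixel-throughput comparison: ~720p at 25 fps vs. 1080p at 30 fps.
ps3_pixels_per_second = 1280 * 720 * 25    # ~23.0M pixels/s (often sub-720p in practice)
g80_pixels_per_second = 1920 * 1080 * 30   # ~62.2M pixels/s

print(f"{g80_pixels_per_second / ps3_pixels_per_second:.1f}x")  # ~2.7x
```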
> TLOU isn't the only game. He shows serious problems with RE, Callisto, Plague Tale and Forspoken. He mentions Hogwarts as well. 8GB is not enough VRAM for this card's performance profile. Going forward, things will more likely get worse rather than better.

RE has a similar texture buffer size setting which doesn't really improve quality above a certain point. Using it to prove the point is disingenuous.
> I wanted a GTX 1060 style chip here - but they did not manage that at all.

Again, it's nice to want things, but judging the current situation by what was happening 15 years ago (!) with the PS3 isn't really a good idea, since everything has changed now. It's like you're expecting something just because you think it should happen, and then reality happens and everyone is disappointed for some reason.
From the point of view of business, NVIDIA has to starve their gaming GPUs of VRAM: they don't want AI people buying gaming GPUs to do their AI work on, and AI workloads are hungry for more memory.
NVIDIA is also facing severe chip shortages at TSMC, as demand for their AI chips is through the roof. The market is willing to pay top dollar for those chips, so NVIDIA has little incentive to release any chips at low prices.
And it's not like the alternative is offering anything better: the RX 7600 is launching today with 8GB and a $270 price tag, while offering slightly better than 3060 12GB performance and landing well behind the 3060 Ti (essentially regular 4060-tier performance). The whole market is screwed.
> From the point of view of business, NVIDIA has to starve their gaming GPUs of VRAM: they don't want AI people buying gaming GPUs to do their AI work on, and AI workloads are hungry for more memory.

It's much simpler than that, and the illustration of why will arrive soon in the form of the 4060 Ti 16GB, with all its perf/price "glory".
> I am very sceptical Nvidia couldn't use the full AD103 for a 4070 Ti that is $799 with a good enough cooler + board (doubling die cost would be what, $80 more in BOM?)

What are you basing your skepticism on?
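For a rough sense of scale on the die-cost side of that question, the standard dies-per-wafer approximation gives a ballpark. The wafer price below is an assumed, rumored figure, yield losses are ignored, and only the die sizes are published numbers, so treat the dollar amounts as purely illustrative:

```python
import math

WAFER_COST_USD = 14_000    # assumed/rumored TSMC 4N wafer price, not a confirmed figure
WAFER_DIAMETER_MM = 300

def dies_per_wafer(die_area_mm2: float) -> int:
    """Classic dies-per-wafer approximation; ignores defect yield and scribe lines."""
    radius = WAFER_DIAMETER_MM / 2
    return int(math.pi * radius**2 / die_area_mm2
               - math.pi * WAFER_DIAMETER_MM / math.sqrt(2 * die_area_mm2))

for name, area in [("AD104 (~295 mm^2)", 294.5), ("AD103 (~379 mm^2)", 378.6)]:
    n = dies_per_wafer(area)
    print(f"{name}: ~{n} dies/wafer, ~${WAFER_COST_USD / n:.0f} per die")
# Under these assumptions the per-die gap is on the order of tens of dollars;
# memory, board, cooler and margin make up the rest of the BOM difference.
```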
> AI is not really a good reason though. Even 16GB is pretty limiting.

Indeed, even 24GB of VRAM is barely enough for optimized trainers in the Stable Diffusion space; 40GB is recommended.
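To put rough numbers on that: a common rule of thumb for mixed-precision training with Adam is about 16 bytes per parameter for weights, gradients and optimizer state, before counting activations. The model sizes below are approximate and only meant to illustrate why 16-24GB gets tight:

```python
# Rough VRAM estimate for mixed-precision training with Adam (activations excluded).
def training_state_gb(params_billions: float) -> float:
    n = params_billions * 1e9
    weights   = 2 * n    # fp16/bf16 weights
    gradients = 2 * n    # fp16/bf16 gradients
    optimizer = 12 * n   # fp32 master weights + two Adam moment buffers
    return (weights + gradients + optimizer) / 2**30

for label, size in [("~0.9B params (roughly SD 1.5 UNet-sized)", 0.9),
                    ("~2.6B params (roughly SDXL UNet-sized)", 2.6)]:
    print(f"{label}: ~{training_state_gb(size):.0f} GB before activations")
# ~13 GB and ~39 GB respectively -- activations, VAE and text encoders come on top,
# which is why full fine-tunes lean on LoRA, gradient checkpointing or 8-bit optimizers.
```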