NVidia Ada Speculation, Rumours and Discussion

The lockdowns and the mining craze are mostly over, so that should help, I guess? But yeah, I agree that prices may just stay the same. Honestly, the biggest factor will be competition. Intel is not there yet, and neither is AMD right now with their RT performance. If RDNA3 is still lagging a lot, nVidia won't feel any need to lower prices. Time will tell.

Competition, lol. Nvidia is effectively a monopoly now; they don't give a hoot about Intel's and AMD's GPU "competitors" and will continue selling overpriced products with an 80% market share.
 
Competition, lol. Nvidia is effectively a monopoly now; they don't give a hoot about Intel's and AMD's GPU "competitors" and will continue selling overpriced products with an 80% market share.
Then don't buy their overpriced products?
It's the only way to figure out if they are overpriced or overspecced. Probably both.
 
Then don't buy their overpriced products?
It's the only way to figure out if they are overpriced or overspecced. Probably both.

Exactly this. If people stop buying them, then NV will have to decrease prices, but as long as people are willing to pay more and more, NV (and other companies like Sony, Apple, AMD, Intel etc.) will gladly increase prices.
It's the same with phones: here in the EU we pay €1600 for a base model iPhone 14 Pro Max, 500 or so up from last year. That doesn't stop people from buying them en masse. They know they can ask more, so they do.

TAA is indeed worse. There’s no winning in this game. If I disable TAA using the config file, the image becomes noisy and the aliasing is all messed up.

TAA is just awful generally. I wish they’d get rid of it.

TAA is the worst of the bunch. DLSS Quality is quite good, but still not native quality in motion. I use it with Cyberpunk 2077, since that game is quite heavy on the GPU when you want that tad more RT fidelity.

The lockdowns and the mining craze are mostly over, so that should help, I guess? But yeah, I agree that prices may just stay the same. Honestly, the biggest factor will be competition. Intel is not there yet, and neither is AMD right now with their RT performance. If RDNA3 is still lagging a lot, nVidia won't feel any need to lower prices. Time will tell.

He has pointed out before that RT is a number one priority. His 3060 Ti is miles ahead of the PS5 in ray tracing, while also being faster at normal raster. He also mentioned before that he got the GPU at a nice deal, coupled with a Zen 3 R5 CPU, NVMe storage and enough main RAM. He then wants to get a PS5 instead of that system because a 3080/90 isn't giving him enough of a performance upgrade. I won't have to explain what's going on there.

Users on a 3060 Ti or better aren't going to see much of an upgrade because they already have a capable modern GPU, especially when you're looking at a 3080 or 3090, which are from the same generation.
 
More like Jensen is tired of making up excuses.

'Moore's law is dead' has little to do with $1600 GPUs. Moore's law may be slowing down somewhat in normal raster (tbh I think raster increases are still acceptable, about what they have been before), but RT and AI are where it's at these days, and there's room for improvement there each generation. You can't say 'Moore's law is dead' when you have a 4090 that is twice as fast in normal raster as a 3090 Ti, with 4x the RT performance, xx times the ML acceleration and other added features and efficiency increases (TSMC 4N). These GPUs clock a GHz higher...
Gamers were willing to pay scalpers' fantasy prices for the past two years; they have shown the willingness to pay for their GPUs.

It's the same with Apple increasing the price by a whopping 500/600 USD for the EU market due to 'reasons', and the same with Sony upping the prices on consoles, games and services. It's because they can, and they know people will buy into it.
 
Market dictates price. Jensen can claim whatever he wants. If shelves are packed, prices will come down. If empty, up. Always been that way.

I'm good with the 2080 Ti. This bubble of increasing prices (not just NV) will implode sometime; it's not a question of if, but when.
 
Unused hardware bits are normal ...

Tessellation (nearly no modern benchmark suite uses it today), mesh shaders (some vendors implement a separate geometry pipeline for them), programmable blending/geometry shaders (performance problems), other VR-accelerated rendering features (even if high-end PC VR is clearly dead), and many more ...

Ampere introduced HW-accelerated ray-traced motion blur too, even though the vast majority of developers are going to implement the effect as some post-process pass with motion vectors and call it a day. Hardware VRS might not even matter in the future, since it has less utility in a deferred renderer and it can't be used with compute shaders either ...

Who are we to judge IHVs (some of which pride themselves more than others on excess hardware)? They're their own experts, and they believe these features do give them a competitive advantage.
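As an aside, here is a minimal CPU-side sketch of the post-process motion blur approach mentioned above, i.e. blurring each pixel backwards along its motion vector. All names and buffer layouts here are hypothetical, just to illustrate the technique; a real implementation runs this as a pixel or compute shader and also handles depth and velocity discontinuities:

```cpp
#include <algorithm>
#include <cstddef>
#include <vector>

// Hypothetical framebuffer: tightly packed linear RGB floats, row-major.
struct Image {
    int width = 0, height = 0;
    std::vector<float> rgb; // width * height * 3
};

// Hypothetical per-pixel motion vectors in pixel units (dx, dy), as the
// geometry pass of a renderer would write them out.
struct MotionBuffer {
    int width = 0, height = 0;
    std::vector<float> uv; // width * height * 2
};

// Post-process motion blur: average a handful of samples taken
// backwards along each pixel's motion vector.
Image motionBlur(const Image& src, const MotionBuffer& mv, int samples = 8) {
    Image dst = src;
    for (int y = 0; y < src.height; ++y) {
        for (int x = 0; x < src.width; ++x) {
            const std::size_t p = static_cast<std::size_t>(y) * src.width + x;
            const float dx = mv.uv[p * 2 + 0];
            const float dy = mv.uv[p * 2 + 1];
            float acc[3] = {0.0f, 0.0f, 0.0f};
            for (int s = 0; s < samples; ++s) {
                const float t = static_cast<float>(s) / samples; // 0 .. <1
                const int sx = std::clamp(x - static_cast<int>(dx * t), 0, src.width - 1);
                const int sy = std::clamp(y - static_cast<int>(dy * t), 0, src.height - 1);
                const std::size_t q = (static_cast<std::size_t>(sy) * src.width + sx) * 3;
                for (int c = 0; c < 3; ++c) acc[c] += src.rgb[q + c];
            }
            for (int c = 0; c < 3; ++c) dst.rgb[p * 3 + c] = acc[c] / samples;
        }
    }
    return dst;
}
```

The hardware ray-traced path instead evaluates geometry motion during BVH traversal, which is more accurate but far more invasive to integrate, hence most developers settling for a pass like the above.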

Not sure what point you're trying to make. Nvidia advertised SER performance increases in real games. If it relies on a proprietary extension, it will see limited adoption in real games. The other examples you mentioned - mesh shaders, tessellation and VRS - are all available via standard APIs. Btw, I'm pretty sure tessellation is widely used; it's just not a special thing anymore, so nobody talks about it.
 
Having worked many years as a gfx artist, I can assure you we just don't know about the potentially many forms and effects of color blindness. So the problem is missing education. It was never a topic at the art school I attended, for example.

I had assumed some people have issues with detecting differences in hue, for example confusing red and green.
But those charts all have the same hue of green. So you can't see the difference in brightness? :O That can't be?

Let's compare: I agree the difference between the 2nd and 3rd chart is much too subtle. I have issues detecting the difference at all and agree that's a design flaw.
But I can see the difference between 1 and 2, and also 3 and 4. How's that for you?

It's also very hard to match the charts with the little squares on top. Another design flaw.
But because those are gradual changes relative to the background, that's hard for anybody, independent of color blindness.

Yah, you're probably right that it's a design issue and not my colour blindness. The problem with colour blindness is that you lose the ability to detect a lot of different hues. So any time I look at something and can't tell the difference, I just assume it's because I can't detect the difference in hues, but in this case it was probably just the brightness. I can easily distinguish the 1st and the 4th from the rest, but 2 and 3 look the same, especially in the small boxes above.
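To put the 'same hue, different brightness' point in numbers: for linear sRGB, relative luminance is Y = 0.2126 R + 0.7152 G + 0.0722 B (Rec. 709 weights). A quick sketch with made-up green shades (the RGB values are hypothetical, not taken from the actual charts):

```cpp
#include <cstdio>

// Relative luminance of a linear sRGB colour (Rec. 709 weights).
static float luminance(float r, float g, float b) {
    return 0.2126f * r + 0.7152f * g + 0.0722f * b;
}

int main() {
    // Two hypothetical chart greens: same hue, slightly different brightness.
    const float y1 = luminance(0.10f, 0.60f, 0.10f);
    const float y2 = luminance(0.13f, 0.70f, 0.13f);
    // If the luminance gap is only a few percent, telling the charts
    // apart is hard for anybody, colour blind or not.
    std::printf("Y1=%.3f  Y2=%.3f  diff=%.1f%%\n",
                y1, y2, 100.0f * (y2 - y1) / y1);
    return 0;
}
```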
 
Not sure what point you're trying to make. Nvidia advertised SER performance increases in real games. If it relies on a proprietary extension, it will see limited adoption in real games. The other examples you mentioned - mesh shaders, tessellation and VRS - are all available via standard APIs. Btw, I'm pretty sure tessellation is widely used; it's just not a special thing anymore, so nobody talks about it.

Not really sure, but I'd expect NVAPI adoption to be pretty high, because it gives you access to the driver and the GPU to query states and metrics. I'd be more concerned about the other two RT accelerators, because they require significant changes to the renderer.
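For illustration, a minimal sketch of that query side of NVAPI, assuming the NVAPI SDK headers and import library are available (error handling trimmed; the calls shown are the standard GPU enumeration entry points):

```cpp
#include <cstdio>
#include "nvapi.h" // from the NVAPI SDK

int main() {
    // Load and initialise the NVAPI runtime shipped with the driver.
    if (NvAPI_Initialize() != NVAPI_OK) {
        std::fprintf(stderr, "NVAPI not available\n");
        return 1;
    }

    // Enumerate physical GPUs and print their marketing names.
    NvPhysicalGpuHandle gpus[NVAPI_MAX_PHYSICAL_GPUS] = {};
    NvU32 count = 0;
    if (NvAPI_EnumPhysicalGPUs(gpus, &count) == NVAPI_OK) {
        for (NvU32 i = 0; i < count; ++i) {
            NvAPI_ShortString name = {};
            if (NvAPI_GPU_GetFullName(gpus[i], name) == NVAPI_OK)
                std::printf("GPU %u: %s\n", static_cast<unsigned>(i), name);
        }
    }

    NvAPI_Unload();
    return 0;
}
```

Opting into something like SER is a different story, since that goes through the D3D12/HLSL extension side of NVAPI and touches the shaders themselves.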
 

He honestly doesn't seem to give much of a crap about PC gaming, tbh. That came across in the GTC video, which gave a few minutes to GeForce and spent the vast majority talking about their AI applications. It seems pretty clear that's where Nvidia is putting the vast majority of its focus now, with gaming just being an afterthought.

Unless AMD or Intel come through for gamers then I can see this being my last generation as a PC gamer.

That said, I find it hard to believe AMD won't release something faster and cheaper than the Ampere range in a few weeks so I may well make the jump back over there if they do.
 
This could be AMD's chance to break through, if they price their cards much lower for equal raster performance and acceptable RT performance. That might put some pressure on NV for sure.

He honestly doesn't seem to give much of a crap about PC gaming, tbh. That came across in the GTC video, which gave a few minutes to GeForce and spent the vast majority talking about their AI applications. It seems pretty clear that's where Nvidia is putting the vast majority of its focus now, with gaming just being an afterthought.

Unless AMD or Intel come through for gamers then I can see this being my last generation as a PC gamer.

That said, I find it hard to believe AMD won't release something faster and cheaper than the Ampere range in a few weeks so I may well make the jump back over there if they do.

Yeah, though rather than switching to console, you could just as well go for a mid-range GPU for the PC, matching or somewhat exceeding the console specs/experience while keeping PC gaming's advantages.
And yes, I don't think AMD has to compete with Ada; their main competitor is Ampere, since Ada is too far away to become mainstream. NV has a mountain of Ampere backlog that needs to be cleared, which is probably part of the pricing problem.
 