NVidia Ada Speculation, Rumours and Discussion

This is not surprising. The trend in high-performance semiconductors is clear.

The current line of GPU-based accelerators is already at 400-500W, and that's with TCO in mind, unlike the gaming sector. The upcoming Hopper is rumored to be rated at 1kW. So why not a 0.85kW gaming card?

Youtubers will praise anything at least 10% faster than the competition. The RGB crowd will love it for its sheer size. Tech people will praise it since 'perf/W' will somehow be better than the 7/8nm generation.

Mind you, CPUs are heading the same direction. The stupidly overclocked 2013 Bulldozer-based "marketing stunt" was rated at 220W and everybody was shocked. Recent high-end desktop Intel CPUs have been hitting 250W and nobody cares. The upcoming server CPUs are approaching 500W in TCO-constrained environments.

In a few years the enthusiasts are simply going to replace their puny electric cords with beefier ones, like those used on 10kW welding machines.
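For fun, a back-of-envelope sketch of what an 850W card would mean at the wall. The CPU, peripheral, and PSU-efficiency figures are assumptions for illustration, not measurements:

```python
# Back-of-envelope: what an 850W GPU means at the wall.
GPU_W = 850            # rumored board power
CPU_W = 250            # assumed high-end desktop CPU under load
REST_W = 100           # assumed motherboard, drives, fans, pumps
PSU_EFFICIENCY = 0.90  # assumed roughly Gold-rated efficiency at this load

dc_load = GPU_W + CPU_W + REST_W
wall_draw = dc_load / PSU_EFFICIENCY

# Typical circuit capacities (continuous-load rules vary by region/code):
us_circuit = 120 * 15 * 0.8  # 15A US circuit with the 80% continuous rule
eu_circuit = 230 * 10        # common 10A European circuit

print(f"Wall draw: ~{wall_draw:.0f}W")
print(f"Headroom on a 15A/120V US circuit: ~{us_circuit - wall_draw:.0f}W")
print(f"Headroom on a 10A/230V EU circuit: ~{eu_circuit - wall_draw:.0f}W")
```

Under those assumptions a single such PC draws roughly 1.3kW at the wall and nearly saturates a North American 15A circuit on its own, which is the (only half-joking) point about beefier cords.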
 
Gotta say, Greymon seems to be throwing out endless amounts of weird and often conflicting rumors over the past months. They've obviously had some hits, so they clearly know somebody, but it's getting hard to keep taking their tweets seriously.

I'll just say that if Nvidia feel the need to go through the incredible engineering trouble of designing a consumer board and single-chip GPU that can handle 800W+ stock, then they are likely genuinely afraid of what AMD are doing with Navi 31.

Also, what PSU could handle this? Aren't even the new connectors built specifically for super-high-end GPUs only rated for about 600W?
 
Gotta say, Greymon seems to be throwing out endless amounts of weird and often conflicting rumors over the past months. They've obviously had some hits, so they clearly know somebody, but it's getting hard to keep taking their tweets seriously.
I wouldn't pay much attention to any of these rumors. Even kopite7kimi is guessing way too much at this point.

Also, what PSU could handle this? Aren't even the new connectors built specifically for super-high-end GPUs only rated for about 600W?
You can have two cables for 1200W of power.
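For reference, the rough math behind those figures, assuming the commonly reported 12VHPWR ratings (six 12V pin pairs at around 9.2A each, spec-capped at 600W sustained, plus up to 75W from the slot):

```python
# Capacity math for the 12VHPWR connector, using the commonly reported
# figures: six 12V pin pairs rated ~9.2A each, spec-capped at 600W.
VOLTS = 12.0
AMPS_PER_PIN = 9.2
POWER_PAIRS = 6
SPEC_CAP_W = 600  # sustained rating per connector
SLOT_W = 75       # the PCIe slot itself can supply up to 75W

electrical_limit = VOLTS * AMPS_PER_PIN * POWER_PAIRS  # ~662W raw
print(f"Electrical limit per connector: ~{electrical_limit:.0f}W")
print(f"One connector + slot: {SPEC_CAP_W + SLOT_W}W")
print(f"Two connectors + slot: {2 * SPEC_CAP_W + SLOT_W}W")
```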
 
500W is enough to warm a room in a well-insulated Scandinavian home when it's very cold (below 0°F). Not many people want an extra space heater in their study.

I would remove the undervolt from my 3080 to do exactly this, but there are a few games where I get coil whine if my GPU utilization is low and my clock boosts high.
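To put the space-heater comparison in numbers (the session length is an assumed figure, everything else follows from the fact that essentially all board power ends up as heat in the room):

```python
# Nearly all electrical power drawn by a GPU is dissipated as heat into
# the room. The session length below is an assumption for illustration.
CARD_W = 500       # board power under load
SESSION_HOURS = 4  # assumed evening gaming session

energy_kwh = CARD_W * SESSION_HOURS / 1000
print(f"{CARD_W}W for {SESSION_HOURS}h = {energy_kwh:.1f} kWh of heat")
# Plug-in space heaters typically run 500-1500W, so the card alone is
# literally a space heater on its low setting.
```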
 
850W is MCM, I guess.

It's kind of interesting. The 3080/3090 series cards are pretty much power-limited. Maybe the new cards are basically set up to never be power-limited. I would still expect that an 850W figure would have to come with a massive performance increase over a 3090.
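A quick sketch of the break-even math: for perf/W to improve at 850W versus, say, a 350W 3090, performance has to scale by more than the power ratio. The uplift values below are hypothetical:

```python
# Break-even sketch: perf/W only improves if performance scales faster
# than power. 350W is the 3090's board power; 850W is the rumor; the
# performance uplifts below are hypothetical.
OLD_W, NEW_W = 350, 850
power_ratio = NEW_W / OLD_W  # ~2.43x

for perf_uplift in (1.5, 2.0, 2.5, 3.0):
    change = perf_uplift / power_ratio
    verdict = "better" if change > 1 else "worse"
    print(f"{perf_uplift:.1f}x perf -> perf/W {change:.2f}x ({verdict})")
```

Even a 2x performance jump would leave perf/W worse than a 3090; it takes roughly 2.5x before the 'perf/W is better' line holds.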
 
Remember there are rendering workstations that come configured with multiple 3090 cards. Very possible they're addressing that niche with the 4090 and actually leaving a real gap for the Ti gaming card. This generation there wasn't much room for a Ti or Super card between the 3080 and 3090.
 
Can you elaborate further on what you mean here? Seems to be plenty of room for a 3080Ti, since they make one and I'm using one right now :)
Just because it exists doesn't mean that it needs to. It's purely a profiteering card, and whether you own one or not doesn't reinforce its need to exist.
 
That's a bit of a myopic take, I think. Taken to its logical conclusion: what does the 3090 do that's so much better than the 3080? Really, what does the 3080 do that the 3070 can't reasonably accomplish? We're talking low double-digit percentage upticks, at best.

The best bang for the buck is way down the range, in the 3050 area. Arguably anything above that and we're right back into profiteering. However, there's still additional performance to be had, and the 3080 Ti is certainly more performant than the 3080 -- nearly equivalent to the 3090, yet with a lesser price tag. Arguably, the 3080 Ti has more of a right to "exist" than the 3090 based on price versus performance.
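To make that concrete, a tiny perf-per-dollar sketch. The prices are launch MSRPs; the relative-performance indices are rough illustrative numbers, not benchmark results:

```python
# Illustrative perf-per-dollar comparison. Prices are launch MSRPs; the
# relative-performance indices are rough made-up numbers for illustration.
cards = {
    # name: (launch MSRP in $, relative perf index, 3050 = 100)
    "3050":    (249, 100),
    "3070":    (499, 190),
    "3080":    (699, 240),
    "3080 Ti": (1199, 265),
    "3090":    (1499, 275),
}

for name, (price, perf) in cards.items():
    print(f"{name:8s} perf/$ = {perf / price:.3f}")
```

By this rough measure perf/$ falls steadily up the stack, but the 3080 Ti still lands well ahead of the 3090, which is the point above.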
 
That's a bit of a myopic take, I think. Taken to its logical conclusion: what does the 3090 do that's so much better than the 3080? Really, what does the 3080 do that the 3070 can't reasonably accomplish? We're talking low double-digit percentage upticks, at best.

The best bang for the buck is way down the range, in the 3050 area. Arguably anything above that and we're right back into profiteering. However, there's still additional performance to be had, and the 3080 Ti is certainly more performant than the 3080 -- nearly equivalent to the 3090, yet with a lesser price tag. Arguably, the 3080 Ti has more of a right to "exist" than the 3090 based on price versus performance.

My point was that in the past, like with the 1080, the Ti variant had a large performance improvement. The 3090 was introduced and was not such a big increase over the 3080, and that didn't really leave a spot for the 3080 Ti to be largely different from either. If they segment the 4080 Ti to be a reasonable upgrade from the 4080, then that leaves maybe some absurd position for the 4090 that's meant to be more "professional" in use, with maybe some bizarre MCM, "dual card" type performance, and massive power consumption. I'm basing this largely on pro workstations that have dual or quad 3090s.
 