Nvidia's 3000 Series RTX GPU [3090s with different memory capacity]

Why not? As long as it can be cooled quietly, 600W is going to happen. There's no other option, as silicon physics isn't keeping up with our appetite for more performance.

Sure there are. Like I said, I'll give up on AAA games, as most of them aren't worth the time or energy cost. I'll miss out on some gems, but such is life. Most of the good games these days are indie titles, and they don't require graphics cards that consume as much electricity as, or more than, my entire PC.

Just because I "could" do it doesn't mean I "want" to do it. I really don't want to have to run the A/C just to keep my place comfortably cool while gaming. I don't need a furnace running under my desk anytime I want to game. And with how things are looking, I'd rather not contribute any more than I need to towards a burgeoning energy crisis, nor waste money on having a space heater running anytime I want to game.

This obviously doesn't only apply to NV, but to AMD and Intel as well. Hell, I avoid Intel's 12th-gen CPUs like the plague because I don't want that high power consumption during some workloads (I don't only game on my machine).

Regards,
SB
 
Edit - Never mind, I was reading it and the pins are actually different. You can use an adapter to go from 2x 8-pin to 12-pin as long as the card doesn't exceed 375W, though.
Sure. I just think it's unlikely that cards will be using anything but the new 12(16)-pin power plug a couple of years from now. The idea is to unify all these configurations of 6+8 plugs into one common standard, after all. AFAIU it doesn't even require providing the maximum 600W in all implementations, so the plug can be used even on "low end" 450-700W PSUs too.
 
Most cards will likely be 12-pin PCIe 5 in several years, with the exception of those which will use PCIe slot power only, of course.
It's 16-pin, though (I mean both literally and in terms of what's actually used; personally I'd prefer calling it 12+4-pin).
 
Well, consoles are still under 300W for the whole package.

Performance per watt usually gets worse the higher up the performance range you go. The low-end market (like the consoles) also offers better wattage figures, if you want to dive into that market for a GPU.

Most of the good games these days are indie titles, and they don't require graphics cards that consume as much electricity as, or more than, my entire PC.

Or you could settle for lower settings (console settings, maybe) and a mid-range GPU, and the wattage will obviously be less.
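To put some (entirely made-up) numbers on the perf/W point, a quick sketch; the relative-performance and board-power figures below are placeholders, not real benchmark data:

# Illustrative only: invented relative-performance and board-power numbers,
# not measurements of any real card.
gpus = {
    "console-class": {"relative_perf": 1.0, "board_power_w": 200},
    "mid-range":     {"relative_perf": 1.5, "board_power_w": 250},
    "flagship":      {"relative_perf": 2.2, "board_power_w": 450},
}

for name, spec in gpus.items():
    perf_per_watt = spec["relative_perf"] / spec["board_power_w"]
    print(f"{name:13s}: {perf_per_watt:.4f} relative perf per watt")

The flagship comes out worst per watt even though it is the fastest card in absolute terms, which is the usual shape of the curve.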
 
Performance per watt usually gets worse the higher up the performance range you go. The low-end market (like the consoles) also offers better wattage figures, if you want to dive into that market for a GPU.



Or you could settle for lower settings (console settings, maybe) and a mid-range GPU, and the wattage will obviously be less.

Yes, but the argument here seems to be that it is inevitable that all GPUs will eventually be very power hungry due to physical limitations.
 
Not all, just the ones which can be considered "high end".

I don’t see how it’s unique to the high end. For any given power budget there will be increasingly smaller gains each generation. So you will either have to wait longer to see noticeable gains after an upgrade or you will need to accept higher power consumption.

Chiplets aren't a silver bullet, especially at the lower end of the market. Any power saved by going wider and slower will be lost moving data between chips.
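A rough way to see that trade-off in numbers; the per-generation iso-power gains below are assumptions picked only to illustrate the shape of the curve, not projections for any real product line:

# Hypothetical iso-power (same TDP) gains per generation.
iso_power_gains = [0.30, 0.20, 0.15, 0.10]

perf = 1.0
for gen, gain in enumerate(iso_power_gains, start=1):
    perf *= 1 + gain
    print(f"gen {gen}: {perf:.2f}x baseline at the same power budget")

If you only upgrade once performance has roughly doubled at your chosen power budget, shrinking per-generation gains mean either waiting more generations or raising that budget.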
 
I don’t see how it’s unique to the high end. For any given power budget there will be increasingly smaller gains each generation. So you will either have to wait longer to see noticeable gains after an upgrade or you will need to accept higher power consumption.

Chiplets aren't a silver bullet, especially at the lower end of the market. Any power saved by going wider and slower will be lost moving data between chips.
The fact that the low end won't improve its performance as much as the high end doesn't make it not low end. It's a pricing category first and foremost.
 
The fact that the low end won't improve its performance as much as the high end doesn't make it not low end. It's a pricing category first and foremost.

Yes, but if the lowest common denominator is miles behind the top end, software will stagnate, just like we observed with console generations, especially during PS3 / X360 times.
 
Yes, but if the lowest common denominator is miles behind the top end, software will stagnate, just like we observed with console generations, especially during PS3 / X360 times.
Not really, since s/w uses console h/w as the lowest common denominator. The PC low end is rarely a target for any game being developed.
 
Chiplets aren't a silver bullet, especially at the lower end of the market. Any power saved by going wider and slower will be lost moving data between chips.
I agree; you can see that with AMD's MI250X: to beat the A100 at 400W they had to leapfrog it with chiplets at 500W. Chiplets in GPUs are done to save manufacturing cost, not to save power. In all cases we are heading toward a future where chips need more power.
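To make the arithmetic behind that comparison explicit (using only the wattages mentioned above, not any benchmark numbers):

# With the A100 at ~400 W and the MI250X at ~500 W, how much faster must the
# 500 W part be just to match the 400 W part on performance per watt?
a100_power_w = 400
mi250x_power_w = 500

required_speedup = mi250x_power_w / a100_power_w
print(f"The 500 W card needs to be at least {required_speedup:.2f}x faster "
      f"to match the 400 W card on performance per watt.")

Anything less than a 1.25x speedup and the chiplet part wins on absolute performance while losing on perf/W.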
 
The fact that the low end won't improve its performance as much as the high end doesn't make it not low end. It's a pricing category first and foremost.

I didn't say that it does. The point was that increasing power consumption will be a concern throughout the stack. Solid performance at under 150W is a thing of the past.
 
12 pins for power, 4 pins for communication. There can be 12-pin-only implementations, AFAIU.
I'm aware of the pin configuration, that's why I said I'd prefer calling it 12+4-pin.
The 12-pin Micro-Fit connector NVIDIA uses can apparently be compatible (with a limited power supply), but no one will be using those. Everyone uses 12VHPWR, which is called 16-pin by everyone including NVIDIA.
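For what it's worth, a rough sketch of what those 4 sideband pins are for: two of them (SENSE0/SENSE1) are strapped open or to ground by the cable/PSU to advertise a 150/300/450/600W capability to the card. The tiers themselves are in the spec; the exact open/ground combination per tier below is from memory, so treat the mapping as an assumption and check the PCIe CEM 5.0 / ATX 3.0 documents before relying on it:

# Sketch of the 12+4-pin (12VHPWR) sideband power advertisement.
# The SENSE0/SENSE1 open/ground mapping is an assumption -- verify against the spec.
SENSE_TO_WATTS = {
    ("gnd",  "gnd"):  600,  # fully capable cable/PSU
    ("open", "gnd"):  450,
    ("gnd",  "open"): 300,
    ("open", "open"): 150,  # lowest tier / conservative default
}

def advertised_limit(sense0: str, sense1: str) -> int:
    """Sustained power (in watts) the cable/PSU advertises to the card."""
    return SENSE_TO_WATTS[(sense0, sense1)]

print(advertised_limit("gnd", "gnd"))    # 600
print(advertised_limit("open", "open"))  # 150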
 
I agree; you can see that with AMD's MI250X: to beat the A100 at 400W they had to leapfrog it with chiplets at 500W. Chiplets in GPUs are done to save manufacturing cost, not to save power. In all cases we are heading toward a future where chips need more power.
To be fair (as far as vendor-supplied benchmarks can be trusted), the MI210 in its 300W PCIe flavour will probably be faster than the A100.
 