Most cards will likely be 12-pin PCIE5 in several years, with the exception of those which will use PCIE slot power only, of course.
That's fine, no? As long as they are just 1x 12-pin...
Why not? As long as it can be cooled quietly, 600 W is going to happen. There’s no other option, as silicon physics isn’t keeping up with our appetite for more performance.
Sure. I just think that it's unlikely that cards will be using anything but the new 12(16)-pin power plug a couple of years from now. The idea is to unify all these configurations of 6+8 plugs into one common standard, after all. AFAIU it doesn't even require providing the maximum 600 W in all implementations, so the plug can be used even with "low end" 450-700 W PSUs too.
Edit - Nevermind, I was reading it and the pins are actually different. You can use an adapter to go from 2x 8-pin to 12-pin as long as the card doesn't exceed 375 W, though.
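For anyone wondering where the 375 W cap comes from: it's just the existing PCIe power budgets added up. A quick back-of-envelope in Python, using the standard CEM figures (from memory, so double-check against the spec):

```python
# Rough sketch of the 375 W adapter limit, using the usual PCIe budgets.
SLOT_POWER_W = 75    # a PCIe x16 slot can supply up to 75 W
EIGHT_PIN_W = 150    # each 8-pin PCIe connector is rated for 150 W

# A 2x 8-pin to 12-pin adapter can therefore feed the card at most:
adapter_budget_w = 2 * EIGHT_PIN_W + SLOT_POWER_W
print(adapter_budget_w)  # 375 -> the cap quoted above
```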
It's 16-pin, though (I mean both literally and what's actually used; personally I'd prefer calling it 12+4-pin).
12 pins for power, 4 pins for communication. There can be 12-pin-only implementations, AFAIU.
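AFAIU the sideband is mostly two sense pins that the cable either grounds or leaves open to advertise its power limit, which is also why a card could get away with reading fewer pins. A rough sketch of the table as I remember it from the ATX 3.0 / PCIe CEM 5.0 coverage; the exact pin-to-wattage assignment is an assumption here, so check the spec:

```python
# Hypothetical sketch of the 12+4-pin sideband: the cable ties SENSE0/SENSE1
# to ground or leaves them open, and the card reads off its power limit.
# The exact pin-to-wattage mapping below is from memory, not the spec text.
SENSE_TO_LIMIT_W = {
    ("gnd",  "gnd"):  600,  # full 600 W cable
    ("gnd",  "open"): 450,
    ("open", "gnd"):  300,
    ("open", "open"): 150,  # also the safe default with no sideband wired
}

def max_power_w(sense0: str, sense1: str) -> int:
    """Power limit the card must respect for the given sense-pin states."""
    return SENSE_TO_LIMIT_W[(sense0, sense1)]

print(max_power_w("open", "open"))  # 150 -> a 12-pin-only card still works
```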
Well, consoles are still under 300W for the whole package.
Most of the good games these days are indie titles, and they don't require graphics cards that consume as much electricity as my entire PC, or more.
Performance per watt usually gets worse the higher up the performance range you go. The low-end market (like the consoles) also offers better wattage figures, if you want to dive into that market for a GPU.
Or you could settle for lower settings (console settings, maybe) and a mid-range GPU, and the wattage will obviously be less.
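The perf-per-watt point falls out of basic dynamic power scaling: switching power goes roughly as C·V²·f, and sustaining higher clocks needs higher voltage, so power grows much faster than performance. A toy model (the linear V(f) below is an illustrative assumption, not a measured curve):

```python
# Toy model: why performance per watt degrades as clocks are pushed up.
# Dynamic power ~ C * V^2 * f; if voltage must rise roughly with clock,
# power scales ~ f^3 while performance scales ~ f.
def relative_power(f: float) -> float:
    v = f             # assume V rises ~linearly with clock (illustrative)
    return v * v * f  # capacitance C folded into the units

for f in (1.0, 1.2, 1.5):
    print(f"clock x{f:.1f}: perf x{f:.1f}, power x{relative_power(f):.2f}, "
          f"perf/W x{f / relative_power(f):.2f}")
# clock x1.5 -> ~3.4x power for 1.5x perf; perf/W falls to ~0.44x
```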
Yes, but the argument here seems to be that it is inevitable that all GPUs will eventually be very power hungry due to physical limitations.
Not all, just the ones which can be considered "high end".
I don’t see how it’s unique to the high end. For any given power budget there will be increasingly smaller gains each generation, so you will either have to wait longer to see noticeable gains after an upgrade, or you will need to accept higher power consumption.
The fact that the low end won't improve its performance as much as the high end doesn't make it not low end. It's a pricing category first and foremost.
Chiplets aren't a silver bullet especially on the lower end of the market. Any power saved by going wider and slower will be lost moving data between chips.
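To put rough numbers on the data-movement cost (the pJ/bit figures below are ballpark assumptions in line with published interconnect numbers, not vendor specs):

```python
# Ballpark: what cross-chiplet traffic costs in watts.
# On-die wires are often quoted around ~0.1 pJ/bit and organic-substrate
# die-to-die links around ~1 pJ/bit; both are order-of-magnitude guesses.
PJ_PER_BIT_ON_DIE = 0.1
PJ_PER_BIT_DIE_TO_DIE = 1.0

def link_power_w(bandwidth_gb_s: float, pj_per_bit: float) -> float:
    bits_per_s = bandwidth_gb_s * 1e9 * 8
    return bits_per_s * pj_per_bit * 1e-12  # pJ/s -> W

bw = 1000.0  # 1 TB/s of traffic, a plausible GPU-scale figure
print(link_power_w(bw, PJ_PER_BIT_ON_DIE))      # ~0.8 W if kept on one die
print(link_power_w(bw, PJ_PER_BIT_DIE_TO_DIE))  # ~8 W once it crosses dies
```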
Yes, but if the lowest common denominator is miles behind the top end, software will stagnate, just like we observed with console generations, especially during the PS3 / X360 era.
Not really, since s/w uses console h/w as the lowest common denominator. PC low end is rarely a target for any game being developed.
I agree, you can see that in the case of the MI250X from AMD: to beat the A100 at 400 W they had to leapfrog it with chiplets at 500 W. Chiplets in GPUs are done to save manufacturing cost, not to save power. In all cases we are heading toward a future where chips need more power.
I'm aware of the pin configuration; that's why I said I'd prefer calling it 12+4-pin.
To be fair (as far as vendor-supplied benchmarks can be), the MI210 in its 300 W PCIe flavour will probably be faster than the A100.