Speculation: GPU Performance Comparisons of 2020 *Spawn*

I'm puzzled. Why would Nvidia distribute PCAT to select NDA partners just now?
As the Linus Tech Tips video stated, they (and likely others) have been asking for the longest time for a tool like this to be provided by the IHVs. It will be good to see reviewers with a common baseline for comparison, and hopefully they make it available to consumers.
 
As the Linus Tech Tips video stated, they (and likely others) have been asking for the longest time for a tool like this to be provided by the IHVs. It will be good to see reviewers with a common baseline for comparison, and hopefully they make it available to consumers.
Totally makes sense, when you could just go and buy the apparatus for power measurements on the open market.

The question is: would Nvidia make this available if they knew or suspected they would come in second place against AMD in this very metric?
 
The question is: would Nvidia make this available if they knew or suspected they would come in second place against AMD in this very metric?
Of course not.

They're changing the narrative from "look at how much better we are at this super important metric" to "we're not that far behind if you measure it in this specific way".

They wouldn't be sending these tools to reviewers if they weren't expecting to fall short in power efficiency. The current methods served their interests pretty well in the past, and now they don't.
 
Of course not.

They're changing the narrative from "look at how much better we are at this super important metric" to "we're not that far behind if you measure it in this specific way".

They wouldn't be sending these tools to reviewers if they weren't expecting to fall short in power efficiency. The current methods served their interests pretty well in the past, and now they don't.
Lower perf/watt would be better disguised in the noise that is system power consumption.
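A quick back-of-the-envelope sketch of that dilution effect (all wattages below are hypothetical placeholders, not measurements):

```python
# Hypothetical illustration: a fixed GPU power gap is easy to see at the
# card but becomes a much smaller relative signal at the wall, where
# run-to-run system noise can eat a good chunk of it.

gpu_a = 220.0           # W, hypothetical card A, measured at the card
gpu_b = 250.0           # W, hypothetical card B
rest_of_system = 150.0  # W, CPU/RAM/drives/fans, assumed identical

# At-the-card comparison: the 30 W gap is the entire signal.
print(f"at-card gap: {gpu_b - gpu_a:.0f} W "
      f"({(gpu_b - gpu_a) / gpu_a:.1%} of card A)")

# At-the-wall comparison: the same 30 W is a smaller fraction of the
# total, so a few watts of system-level noise blurs it much more.
sys_a, sys_b = gpu_a + rest_of_system, gpu_b + rest_of_system
print(f"at-wall gap: {sys_b - sys_a:.0f} W "
      f"({(sys_b - sys_a) / sys_a:.1%} of system A)")
```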
 
Of course not.

They're changing the narrative from "look at how much better we are at this super important metric" to "we're not that far behind if you measure it in this specific way".

They wouldn't be sending these tools to reviewers if they weren't expecting to fall short in power efficiency. The current methods served their interests pretty well in the past, and now they don't.

Exactly. Perhaps the higher system power consumption of RTX 3xxx looks even worse with PSU losses, so this lets them bypass that. And currently Ampere would be compared to RDNA1, so still in favour of NV. Maybe they thought it was a good idea. Maybe they think RDNA2 won't be that great. Could be anything.
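For reference, this is roughly how the PSU-loss effect works; measuring at the card, as PCAT does, sidesteps it entirely. The efficiency figures below are hypothetical:

```python
# Hypothetical sketch: a wall-socket reading includes the PSU's conversion
# losses, and efficiency varies with load, so at-the-wall numbers overstate
# the DC power the components actually draw by a load-dependent amount.

def dc_power(wall_watts: float, psu_efficiency: float) -> float:
    """DC power delivered to the system for a given AC wall reading."""
    return wall_watts * psu_efficiency

wall_reading = 500.0  # W at the wall (hypothetical)
# Assumed efficiency points, loosely shaped like an 80 Plus Gold curve.
for eff in (0.87, 0.90, 0.92):
    dc = dc_power(wall_reading, eff)
    print(f"eff {eff:.0%}: {dc:.0f} W DC, {wall_reading - dc:.0f} W lost in the PSU")
```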
 
I'm more interested to see what these tools would show for Maxwell 2.
The GTX 980/970 were listed as ~150 W cards, but their actual draw was closer to a typical 200 W TDP.

Did their new boost mess up their numbers, were the ratings just for the package, or was it on purpose? Anyway, whatever it was, they corrected it with Pascal.
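If per-card logging does get pointed at Maxwell, checking the rating is just a matter of averaging the samples; a minimal sketch with invented numbers (the 145 W figure is the GTX 970's official TDP):

```python
# Minimal sketch: compare a card's rated TDP against logged power samples.
# The samples are invented; a real run would come from PCAT or a similar
# riser/clamp setup logging card power during a gaming workload.

rated_tdp = 145.0  # W, the GTX 970's official rating
samples = [182.4, 196.1, 201.7, 188.9, 205.3, 199.0, 193.6]  # W, invented

avg = sum(samples) / len(samples)
print(f"rated: {rated_tdp:.0f} W, avg: {avg:.1f} W, peak: {max(samples):.1f} W")
print(f"average exceeds the rating by {avg / rated_tdp - 1:.0%}")
```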
 
And desktop GPU power consumption is also variable depending on workload. We do have a maximum wattage, as you say, so anything below it is obviously a plus. Either way, the discussion was about the XSX since it is also RDNA2.
Disagree; desktop chips always push maximum power consumption once you let the engine run at full uncapped fps. I can see my 2080 Ti always power limited at 250 W in any game at 4K resolution with max settings.

The XSX's 52 CU GPU consumes sub-150 W at 1.825 GHz
Sorry, that was never clearly stated, or even mentioned in the context of a specific workload or game. It's a conjecture for now. And again, ZERO relevance to desktop GPUs unless the workload (resolution/settings/fps) is stated.
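For what it's worth, anyone wanting to reproduce the power-limit observation on their own NVIDIA card can pull the counters through NVML; a minimal sketch using the pynvml bindings (pip install nvidia-ml-py), run while a game or benchmark is active:

```python
# Minimal sketch: sample an NVIDIA card's power draw against its enforced
# power limit via NVML, to see whether it is pinned at the limit.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU
limit_w = pynvml.nvmlDeviceGetEnforcedPowerLimit(handle) / 1000.0  # mW -> W

for _ in range(5):  # sample once a second for ~5 s
    draw_w = pynvml.nvmlDeviceGetPowerUsage(handle) / 1000.0  # mW -> W
    print(f"draw {draw_w:6.1f} W / limit {limit_w:.0f} W "
          f"({draw_w / limit_w:.0%})")
    time.sleep(1)

pynvml.nvmlShutdown()
```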
 
desktop chips always push maximum power consumption once you let the engine run at full uncapped fps
Consoles do it even harder.
Limited resources ≠ max IQ possible.
The estimated console number we're running with here is worst case, period.
MS is allocating power for the worst case.
And again, ZERO relevance to desktop GPUs unless the workload (resolution/settings/fps) is stated.
Jeez, just stop coping.
It's not funny.
 
As the Linus Tech Tips video stated, they (and likely others) have been asking for the longest time for a tool like this to be provided by the IHVs. It will be good to see reviewers with a common baseline for comparison, and hopefully they make it available to consumers.
But that's completely ridiculous and comes down to tech sites just being lazy. The equipment has been available forever, no matter what Linus claims; it just requires some extra work on the sites' part instead of a manufacturer handing it over on a silver platter, so only a few sites ever went that route.
 
One still has to wonder why AMD would bother releasing a large chip to compete with nVidia if they are practically guaranteed a profit by using those wafers for Zen 3 CPUs instead...
Everything points to them being capable of at least matching the RTX 3080, but would they want to?
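The wafer math behind that is easy to sanity-check with the standard dies-per-wafer approximation; every die size and per-die price below is a hypothetical placeholder, not actual AMD data, and yield is ignored:

```python
import math

WAFER_D = 300.0  # mm, standard wafer diameter

def dies_per_wafer(die_area_mm2: float) -> float:
    """Classic dies-per-wafer approximation with an edge-loss correction."""
    return (math.pi * (WAFER_D / 2) ** 2 / die_area_mm2
            - math.pi * WAFER_D / math.sqrt(2 * die_area_mm2))

# Placeholder die sizes and per-die revenue, purely for illustration.
for name, area, revenue in (("Zen 3 chiplet", 81.0, 70.0),
                            ("big RDNA2 die", 505.0, 250.0)):
    n = dies_per_wafer(area)
    print(f"{name}: ~{n:.0f} dies/wafer, ~${n * revenue:,.0f}/wafer")
```

With numbers in that ballpark the CPU chiplets come out ahead per wafer, which is the crux of the question.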
 
One still has to wonder why AMD would bother releasing a large chip to compete with nVidia
Imagine a good-ass marketing campaign without the hassle of actually running one.
if they are practically guaranteed a profit by using those wafers for Zen 3 CPUs instead...
Yeah.
but would they want to?
Hell yeah, drag them on stage and slit their throats.
Worked for Rome, worked for Matisse, worked for Castle Peak, and it sure as hell would work here.
The question of "matching" the 3080 with something that lacks DLSS and possibly has slower RT isn't as clear-cut as some here think.
Oh lord.
 
Imagine a good-ass marketing campaign without the hassle of actually running one.
I'm not exactly sure what you're referring to here... But it is quite interesting that after nVidia's announcement, a bunch of people suddenly want to see what AMD will bring to the table, myself included.

Hell yeah, drag them on stage and slit their throats.
You seem quite confident. Don't tell me the rumor of the biggest Navi die being 4x the 5700 XT is true lol.
 