> True, but everything in this thread has been IF so far.
I wish, but alas.
> I'm puzzled. Why would Nvidia distribute PCAT to select NDA partners just now?
As the Linus Tech Tips video stated, they (and likely others) have been asking for the longest time for a tool like this to be provided by the IHVs. It will be good to see reviewers with a common baseline for comparison, and I hope they make it available to consumers.
> As the Linus Tech Tips video stated, they (and likely others) have been asking for the longest time for a tool like this to be provided by the IHVs. ...
Totally makes sense, when you could just go and buy the apparatus for power measurements on the open market.
> The question is: would Nvidia make this available if they knew or suspected they'd come in second place against AMD in this very metric?
Why not? It's pretty obvious AMD wasn't going to provide an accurate tool for this metric.
> The question is: would Nvidia make this available if they knew or suspected they'd come in second place against AMD in this very metric?
Of course not.
> Of course not.
Lower perf/watt would be better disguised in the noise that is whole-system power consumption.
> Of course not.
They're changing the narrative from "look at how much better we are at this super important metric" to "we're not that far behind if you measure it in this specific way".
They wouldn't be sending these tools to reviewers if they weren't expecting to fall short in power efficiency. The current methods served their interests pretty well in the past, and now they don't.
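To put rough numbers on the "disguised in the noise" point, here's a minimal sketch; every wattage in it is made up purely for illustration:

```python
# Made-up numbers: how a card-level power gap shrinks when measured
# as total system power at the wall instead of card-only (PCAT-style).
CARD_A_W = 320.0          # hypothetical card-only draw, vendor A
CARD_B_W = 250.0          # hypothetical card-only draw, vendor B
REST_OF_SYSTEM_W = 250.0  # CPU, RAM, fans, PSU losses (assumed equal)

card_gap = CARD_A_W / CARD_B_W - 1
wall_gap = (CARD_A_W + REST_OF_SYSTEM_W) / (CARD_B_W + REST_OF_SYSTEM_W) - 1

print(f"Card-only: A draws {card_gap:.0%} more than B")    # 28% more
print(f"At the wall: A draws {wall_gap:.0%} more than B")  # 14% more
```

The same 70 W deficit reads as a 28% gap card-to-card but only 14% at the wall, which is exactly why a vendor expecting to trail would prefer wall measurements, and one expecting to lead would hand out card-level tools.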
> And desktop GPU power consumption is also variable depending on workload. We do have a maximum wattage as per your numbers, so anything below that is obviously a plus. Either way, the discussion was about the XSX since it is also RDNA2.
Disagree: desktop chips always push max power consumption once you let the engine run at full-blast fps. I can see my 2080 Ti always power-limited at 250 W in any game at 4K resolution, max settings.
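For anyone wanting to check the "always pinned at the power limit" claim on their own card, here's a hedged sketch using NVML's driver-reported telemetry (the very thing a hardware interposer like PCAT is meant to sidestep); it assumes the nvidia-ml-py package and an NVIDIA GPU at index 0:

```python
# Sample board power once a second and compare it against the enforced
# power limit (both reported by the driver in milliwatts).
import time
from pynvml import (nvmlInit, nvmlShutdown, nvmlDeviceGetHandleByIndex,
                    nvmlDeviceGetPowerUsage, nvmlDeviceGetEnforcedPowerLimit)

nvmlInit()
handle = nvmlDeviceGetHandleByIndex(0)                      # first GPU
limit_w = nvmlDeviceGetEnforcedPowerLimit(handle) / 1000.0  # mW -> W
for _ in range(10):                                         # ~10 s of samples
    draw_w = nvmlDeviceGetPowerUsage(handle) / 1000.0       # mW -> W
    print(f"{draw_w:6.1f} W / {limit_w:.0f} W limit ({draw_w / limit_w:.0%})")
    time.sleep(1)
nvmlShutdown()
```

Run a 4K game in the background; if the ratio sits at ~100% throughout, the card is power-limited as described.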
> The XSX's 52 CU GPU consumes under ~150 W at 1.825 GHz.
Sorry, that was never clearly stated, or even mentioned in the context of a specific workload or game. It's conjecture for now. And again, ZERO relevance to desktop GPUs unless the workload (resolution/settings/fps) is stated.
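For what the conjecture would imply if it held, a quick back-of-envelope using standard RDNA2 FP32 math; the 150 W input is the unverified number under dispute, not a measurement:

```python
# XSX GPU theoretical FP32 throughput, and the perf/W the rumor implies.
cus = 52                # compute units
lanes_per_cu = 64       # stream processors per CU
flops_per_clock = 2     # one FMA counts as two FP32 ops
clock_ghz = 1.825
gpu_power_w = 150.0     # the conjectured figure, not a measured one

tflops = cus * lanes_per_cu * flops_per_clock * clock_ghz / 1000.0
print(f"{tflops:.2f} TFLOPS FP32")                    # 12.15 TFLOPS
print(f"{tflops * 1000 / gpu_power_w:.0f} GFLOPS/W")  # ~81, IF 150 W holds
```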
> desktop chips always push max power consumption once you let the engine run at full-blast fps
Consoles do it even harder.
> And again, ZERO relevance to desktop GPUs unless the workload (resolution/settings/fps) is stated.
Jeez, just stop coping.
> As the Linus Tech Tips video stated, they (and likely others) have been asking for the longest time for a tool like this to be provided by the IHVs. ...
But that's completely ridiculous and comes down to tech sites just being lazy. The equipment has been available forever, no matter what Linus claims; it just requires some extra work on the sites' part instead of the manufacturers handing it over on a silver platter, so only a few sites ever went that route.
> instead of the manufacturers handing it over on a silver platter
That's the whole point here, isn't it?
> Everything points to them being capable of at least matching the RTX 3080, but would they want to?
The question of "matching" the 3080 with something without DLSS and possibly slower RT isn't as clear-cut as some here think.
> One still has to wonder why AMD would bother releasing a large chip to compete with nVidia
Imagine a good-ass marketing campaign, but without the hassle of actually doing one.
> if they are practically guaranteed to gain a profit if they use those wafers to create Zen 3 CPUs instead...
Yeah.
> but would they want to?
Hell yeah, drag them on stage and slit their throats.
> The question of "matching" the 3080 with something without DLSS and possibly slower RT isn't as clear-cut as some here think.
Oh lord.
> Imagine a good-ass marketing campaign, but without the hassle of actually doing one.
I'm not exactly sure what you're referring to here... But it is quite interesting that after nVidia's announcement, suddenly a bunch of people want to see what AMD will bring to the table, including me.
> Hell yeah, drag them on stage and slit their throats.
You seem quite confident. Don't tell me the rumor of the biggest Navi die being 4x the 5700 XT is true, lol.
> I'm not exactly sure what you're referring to here
Being the best works very well as far as pushing things down the stack goes.
> Don't tell me the rumor of the biggest Navi die being 4x the 5700 XT is true, lol.
That's silly, but the products are good (excellent? depending on how they price them and the memory configs).