> So, why is he testing 4K?

He tests all 3 of the common resolutions, but 4K helps minimize CPU limitations.
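To put the CPU-limitation point in concrete terms, here's a toy model (every number below is invented for illustration, not a measurement): delivered frame rate is roughly the minimum of what the CPU can simulate and what the GPU can render, and only the GPU side drops as resolution rises.

```python
# Toy model: delivered fps ~= min(CPU-bound fps, GPU-bound fps).
# All figures are hypothetical, chosen only to illustrate the effect.

cpu_fps = 150  # frames/s the CPU can prepare; roughly resolution-independent

gpu_fps = {"1080p": 240, "1440p": 160, "4K": 80}  # hypothetical GPU throughput

for res, gpu in gpu_fps.items():
    delivered = min(cpu_fps, gpu)
    limiter = "CPU" if gpu > cpu_fps else "GPU"
    print(f"{res}: {delivered} fps ({limiter}-limited)")

# 1080p: 150 fps (CPU-limited)
# 1440p: 150 fps (CPU-limited)
# 4K: 80 fps (GPU-limited)
```

At 1080p and 1440p the GPUs pile up against the same CPU ceiling; only at 4K do you actually see the GPU difference.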
The premise behind this thread is that HUB is biased towards AMD and possibly even puts out fake/altered benchmarks to make Nvidia look worse.
Or worse, chooses a number of games that support RT but tests them with RT disabled on $1000+ GPUs that are more than capable of playing them with RT enabled.
It's actually totally fine. Imagine a hypothetical reviewer that only wants maximum frame rate. They could review every game with ray tracing turned off. That's not an unfair test. They're just not testing ray tracing. If that's what they and their audience want, then they're actually doing their audience a service. Reviewers can have niches that cater to specific people. If you don't like it, don't read or watch it. Find a reviewer that tests the things you want to see tested.
Edit: These GPUs aren't even strictly for gaming. Most reviews are heavily biased towards gaming, while video editing and Blender are only lightly covered. How many of the gaming sites do any kind of deep learning benchmarking? It's totally reasonable to cater reviews towards personal interests and make recommendations based on how well something suits your own needs.
How can a selection of 30-50+ games with all the popular and AAA titles tested be chosen specifically to favor AMD?
> Because where a large proportion of those games support RT, they choose to only turn it on in the games where the hit to AMD isn't that big compared to Nvidia. They also test some of the games multiple times at different settings, but this duplication of testing more heavily favours AMD and seems to have little logic behind it.

Like Cyberpunk, Dying Light 2, Metro and Control? Are those titles that favor AMD's lesser RT? They tested one title, MW 2, at two settings profiles. I would guess it's because of how popular it is, as well as the ultra settings still falling a bit short of a truly high-framerate experience. Oh wait, he also tested Fortnite with and without the Nanite/Lumen features. So yeah, I really don't see this cherry-picking. Watching HUB content doesn't paint the disparity in RT performance between the vendors differently than we all know.
> HUB has done some dumb stuff. Like I think one of the 4090 reviews they did used a 5800X3D as the CPU, which isn't the fastest choice, and the 4090 can be CPU-limited in a lot of games. Probably would have made more sense to use a 13900K to really see the GPU scaling. But overall I don't think they're really bad. Mostly I'd just like them to show their test runs so I could actually compare.

It takes time to migrate all your benches to a new CPU because you have to retest all the GPUs you plan to have in your comparison.
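A back-of-the-envelope sketch of why that migration is slow (every count here is a hypothetical example, not anyone's actual workload):

```python
# Rough retest cost after swapping the test-bench CPU.
# All counts are hypothetical, for scale only.

gpus = 15            # GPUs kept on the comparison charts
games = 12           # games in the benchmark suite
resolutions = 3      # 1080p / 1440p / 4K
runs = 3             # repeated runs per data point
minutes_per_run = 5  # setup + benchmark pass

total_runs = gpus * games * resolutions * runs
hours = total_runs * minutes_per_run / 60
print(f"{total_runs} runs ~= {hours:.0f} hours of benching")
# 1620 runs ~= 135 hours
```

Even with modest numbers it's weeks of work, which is why reviewers don't swap platforms the day a new CPU ships.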
> Probably would have made more sense to use a 13900K to really see the GPU scaling.

The 13900K launched after the 4090.
> Like Cyberpunk, Dying Light 2, Metro and Control? Are those titles that favor AMD's lesser RT?

You've just perfectly illustrated my point. No, those titles don't favour AMD's lesser RT, which is exactly why, despite being included in the bench, they are tested without their RT enabled:

Metro Exodus Enhanced runs fine on a Series S with RT enabled, yet here he is testing the non-RT version on a 7900 XTX in a head-to-head with Nvidia. Can't imagine why...

> They tested one title, MW 2, at two settings profiles. I would guess it's because of how popular it is as well as the ultra settings still falling a bit short of a truly high framerate experience.

I'm sure that's the reasoning he'd give as well. And the fact that it's a massive performance outlier for AMD, which he's now counting twice in an average performance score that just happens to beat the Nvidia GPU by the tiniest of margins (at 4K), has nothing at all to do with it.
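To put numbers on the double-counting complaint (the margins below are invented for illustration, not taken from HUB's charts): whether you use a mean or a geomean, listing an outlier title twice pulls the average toward it.

```python
from statistics import geometric_mean

# Relative performance of GPU A vs GPU B per game (1.0 = parity).
# Values are made up to show the effect; the last entry is the outlier title.
margins = [0.97, 0.99, 1.01, 0.98, 1.15]

once  = geometric_mean(margins)
twice = geometric_mean(margins + [1.15])  # outlier counted a second time

print(f"counted once:  {once:.3f}")   # ~1.018
print(f"counted twice: {twice:.3f}")  # ~1.039
```

With margins this tight, counting one favorable title twice is enough to flip or pad an overall "win" by a couple of percent.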
> HUB included both original Metro and Metro EE. RT was disabled in Control, Cyberpunk and DL2 though.

In Riftbreaker you don't have to throw half your performance out the window for the RT effects it has.
Obviously they’re not serious about evaluating RT performance or IQ impact when they enable it in Riftbreaker but not in the games where it actually matters.
> In Riftbreaker you don't have to throw half your performance out the window for the RT effects it has.

Exactly.
Based on your past posts, the 'games where it matters' are those where it hurts performance the most. For many, including HUB, that kind of sacrifice is in most cases not acceptable for the added effects it brings.
I don't understand how this simple concept is so hard for some to grasp:
It's not about RT or no RT, it's about the performance penalty it brings vs the gains in IQ.
For some it may be just fine to get accurate reflections while losing half your performance, but for others it's not. Everyone has their own scales for it.
People need to accept the fact that their own opinions, contrary to popular belief, are in fact not universal truths.
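One way to see why people land in different places on that trade-off (fps figures below are hypothetical): the same "50% fps loss" costs very different amounts of frame time depending on where you start.

```python
# Frame-time cost of an RT toggle that halves the frame rate.
# The fps numbers are hypothetical; the point is the asymmetry.

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for base_fps in (144, 90, 60):
    rt_fps = base_fps / 2  # "half your performance out the window"
    added = frame_time_ms(rt_fps) - frame_time_ms(base_fps)
    print(f"{base_fps} -> {rt_fps:.0f} fps: +{added:.1f} ms per frame")

# 144 -> 72 fps: +6.9 ms per frame
# 90 -> 45 fps: +11.1 ms per frame
# 60 -> 30 fps: +16.7 ms per frame
```

Halving a 144 fps game still leaves a smooth experience; halving a 60 fps game does not, so the same RT effects can be a fine trade for one person and a dealbreaker for another.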
> You've just perfectly illustrated my point. No, those titles don't favour AMD's lesser RT, which is exactly why, despite being included in the bench, they are tested without their RT enabled: Metro Exodus Enhanced runs fine on a Series S with RT enabled, yet here he is testing the non-RT version on a 7900 XTX in a head-to-head with Nvidia. Can't imagine why...

He tested Metro EE as well. It's right there at the bottom, heavily weighted in Nvidia's favor. He tested the other titles I mentioned in the 7900 launch video. This witch hunt is really not very well thought out. He included World War Z, which is a disaster for AMD, with virtually equivalent margins to COD's, only in Nvidia's favor.
> Does that explain their decision to test Witcher 3 RT? It was universally panned as overly expensive.

No, it does not. Maybe they like to test high-profile new games at the lowest and highest settings. When the game isn't hot anymore, they settle on one setting.
> When is the architecture products section getting opened again, mods? The place is getting boring without hardware discussion.

Start an architecture thread somewhere else, maybe they won't notice.