> Every motherboard released in the last 2-3 years has PCIe 4.0, so every new system has at least a PCIe 4.0 slot.

I thought Intel skipped PCIe 4.0 and was on 3.0 until the very latest Alder Lake?
> I thought Intel skipped PCIe 4.0 and was on 3.0 until the very latest Alder Lake?

Nope, mobile has had it since Ice Lake and desktop since Rocket Lake (Comet Lake was also supposed to have it, but they axed it when they couldn't get it stable).
> The other thing to consider here is that reviewers tend to test with all settings maxed (well, except RT, but let's not go there), while individual setting adjustments are available. If memory pressure is causing most of the performance drop, then this particular card would need to sacrifice the more memory-sensitive settings (typically texture quality) to a greater degree than other cards to bring performance back up. At the same time, we could also debate whether those maxed texture settings are warranted in the first place.
>
> I feel, though, that this is in general an example of how the traditional review format can reflect academic performance but doesn't necessarily convey actual usability.
>
> On a semi-related note: has anyone looked into whether PCIe limitations affect games that rely more on streaming data to manage memory load, for example in the form of more texture pop-in?

Performance is work done per unit of time. If you want to comparatively quantify one variable, you need to keep the others constant.
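On the texture pop-in question above: as a rough, illustrative sketch (my own arithmetic, not from any review in the thread), you can compare the time needed to stream a burst of assets over a narrow link against the frame budget. The ~3.9 GB/s figure is the approximate theoretical usable bandwidth of a PCIe 3.0 x4 link; the burst sizes are made up for illustration.

```python
# Back-of-envelope: time to stream assets over the bus vs. the frame budget.
# Assumed numbers (illustrative, not measured): ~3.9 GB/s usable on a
# PCIe 3.0 x4 link, and a 60 fps target (~16.7 ms per frame).

LINK_GBPS = 3.9            # assumed usable PCIe 3.0 x4 bandwidth, GB/s
FRAME_BUDGET_MS = 1000 / 60  # ms per frame at 60 fps

def transfer_ms(megabytes: float) -> float:
    """Milliseconds to move `megabytes` across the assumed link."""
    return megabytes / (LINK_GBPS * 1000) * 1000

# e.g. a burst of freshly streamed textures of various (hypothetical) sizes
for mb in (8, 32, 128):
    t = transfer_ms(mb)
    print(f"{mb:>4} MB burst: {t:.1f} ms "
          f"({t / FRAME_BUDGET_MS:.1f} frame budgets at 60 fps)")
```

A 128 MB burst already eats roughly two whole 60 fps frame budgets on such a link, which is the kind of situation where an engine either stutters or falls back to lower-resolution mips (i.e., visible pop-in).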
> Using a single outlier result out of all the tests done is.

Apparently you missed "worst case" in my posting. RTX 3050 performance is lost on PCI Express 3.0 systems in some games.
> Apparently you missed "worst case" in my posting. RTX 3050 performance is lost on PCI Express 3.0 systems in some games.

Nope, I did not miss it, but the tests presented in this thread seem to indicate that this single minimum-fps result is very much an outlier. Are you into worst cases being treated as the basis for a discussion? Me, I'm not.
> Nope, I did not miss it, but the tests presented in this thread seem to indicate that this single minimum-fps result is very much an outlier. Are you into worst cases being treated as the basis for a discussion? Me, I'm not.

4 out of the 6 games listed have a performance loss due to PCI Express 3 at 1080p.
> The narrative that the RTX 3050 is immune to PCI Express 3 performance loss at 1080p is a lie.

No GPU is "immune" to a performance loss when running on a slower bus. The 3050 isn't much different from any other x16 PCIe GPU here. The 6500 XT, on the other hand, is quite a bit different.
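For context on the x16 vs. x4 distinction, here's a quick sketch of theoretical per-direction PCIe bandwidth, using the standard line rates and 128b/130b encoding (real-world throughput will be somewhat lower):

```python
# Rough theoretical PCIe bandwidth, to illustrate why an x4 card
# (like the 6500 XT) is hit much harder by dropping to PCIe 3.0
# than an x16 card (like the 3050). Figures are per direction,
# ignoring protocol overhead beyond line encoding.

# generation -> (raw line rate in GT/s per lane, encoding efficiency)
GENS = {
    3: (8.0, 128 / 130),   # PCIe 3.0: 8 GT/s, 128b/130b encoding
    4: (16.0, 128 / 130),  # PCIe 4.0: 16 GT/s, 128b/130b encoding
}

def bandwidth_gbps(gen: int, lanes: int) -> float:
    """Approximate usable bandwidth in GB/s for a link."""
    rate, eff = GENS[gen]
    return rate * eff / 8 * lanes  # GT/s -> GB/s per lane, times lane count

for gen, lanes in [(3, 16), (4, 16), (3, 4), (4, 4)]:
    print(f"PCIe {gen}.0 x{lanes}: ~{bandwidth_gbps(gen, lanes):.1f} GB/s")
```

An x16 card dropped to PCIe 3.0 still has roughly 15.8 GB/s to work with, while an x4 card falls to about 3.9 GB/s, which is why the narrower card degrades so much more on the older bus.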
> 4 out of the 6 games listed have a performance loss due to PCI Express 3 at 1080p.

Well, it's not any narrative of mine, or even one I particularly care about. To be honest, I haven't seen this narrative mentioned in the thread except by you, so I'm not sure what point you're trying to make.
> Well, it's not any narrative of mine, or even one I particularly care about. To be honest, I haven't seen this narrative mentioned in the thread except by you, so I'm not sure what point you're trying to make.

I've seen it mentioned in quite a few softball reviews where they say something like "you'll probably not have any issues with only 4 PCIe lanes gaming at 1080p, as you'll rarely exceed the VRAM limit" (I'm paraphrasing, not quoting any particular review), and I believe that's the narrative they're talking about.
> I repeat: the 6500 XT is a terrible product; I would damn near get anything else over it if possible.

GT 1010, huh?
> GT 1010, huh?

Wow, okay. Fair. I'd never heard of that monstrosity, but I'll also defend myself, 'cause I said "damn near" in there for wiggle room.
> AMD Radeon RX 6900 XT Graphics Card With 3.15 GHz Clocks Achieves Top Position In 3DMark Fire Strike & Time Spy Hall of Fame
> https://wccftech.com/amd-radeon-rx-...ghz-clocks-achieves-top-position-in-3dmarkrk/
> That's due to the new CPU used for the overall score, but I'm hoping for similar normal overclocked numbers with the 6nm refresh.

They've already confirmed RDNA3 for this year, and there's no roadmap with N6 RDNA2. You really think they're going to refresh the RDNA2 lineup?
> That's due to the new CPU used for the overall score, but I'm hoping for similar normal overclocked numbers with the 6nm refresh.

N7 to N6 won't give you much in terms of clocks, especially with what would be the exact same design.
> You shared your feelings many times, but do you have any other metric to determine what makes a product terrible?

Besides cost and performance? Yes.