Starfield to use FSR2 only, exclude DLSS2/3 and XeSS: concerns and implications *spawn*

Valhalla's specs indicated AMD would have a disproportionate performance advantage over Nvidia... and it did.

Mirage's specs indicate that Nvidia will have a disproportionate performance advantage over AMD, which I argue would be interesting given that Valhalla's AMD sponsorship doesn't carry over to Mirage.

You have said we should ignore the specs because they don't mean anything. That wasn't the case with Valhalla, but perhaps it will be with Mirage, in which case my point is moot. So, as I suggested, let's wait for the benchmarks to see which it is.
Nothing is going to be proven, since you fundamentally don't understand that these requirement sheets aren't as carefully curated as you seem to bizarrely think they are. In all the many, many years of PC gaming, these have NEVER been accurate, yet you still think we can use this specific one as a reliable judge of some specific difference in GPU performance between AMD and Nvidia.

I am honestly bewildered at how PC gamers could have never, ever learned anything from the past thousand times these weren't accurate, yet still treat every new requirement listing as gospel. It's absolutely insane to me. Like, I genuinely don't grasp why other people's brains don't seem to work here.
 
I'm not sure why you're so fixated on the spec sheet.

The interesting point will be whether the game ends up performing relatively better on NV hardware, given how Valhalla, which was AMD sponsored, performed.

The spec sheet is merely an indicator that that will be the case. You're arguing that it's not. Great, let's wait and see then.
 
Exactly! It will definitely be interesting to see how close performance is once benchmarks are available.
 
A ballpark guess would match a 3080 against a 6800 XT, not a 6900 XT. And in a game that performs the same as the previous one, why would they need to randomly guess?
A funny thing happened: Assassin's Creed Mirage got released, and the AMD advantage is indeed wiped out. A 3080 is almost as fast as a 6900 XT at 4K.

 
So I'm noting here that at 4K Ultra the 3080 averages 57 fps and the 6900 XT averages 60 fps.

And these were the game requirements:

2160p (4K), Ultra Preset, 60 FPS

  • CPU: Intel Core i5-11600K/AMD Ryzen 5 5600X
  • GPU: NVIDIA GeForce RTX 3080 10GB/AMD Radeon RX 6900 XT 16GB
  • RAM: 16GB (dual-channel mode)
  • OS: Windows 10/11
  • SSD Storage: 40 GB
It's almost as if the requirements were put together to accurately reflect the game's performance... 🤔
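
For a rough sense of scale, here's a quick sketch using only the two averages quoted above; treat it as back-of-the-envelope arithmetic, not a formal benchmark analysis.

# Quick sketch in Python: how big is the 4K Ultra gap quoted above?
rtx_3080_fps = 57.0    # reported 4K Ultra average for the RTX 3080
rx_6900xt_fps = 60.0   # reported 4K Ultra average for the RX 6900 XT

lead_pct = (rx_6900xt_fps - rtx_3080_fps) / rtx_3080_fps * 100
print(f"6900 XT lead over the 3080: {lead_pct:.1f}%")   # prints ~5.3%

That puts the two cards within roughly 5% of each other, with the 3080 just shy of the 60 fps target the sheet lists for both.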
 
Yep, the 3080 is again about as fast as the 6900 XT at 4K according to GameGPU. NVIDIA GPUs are ahead of AMD here; this game is not behaving like Valhalla at all.


 
I'm not the one fixated on the spec sheet! lol I'm arguing the exact opposite: that we shouldn't use it to start making any kind of assumptions or conclusions. We should ALWAYS just wait and see the actual results. These requirement listings have too poor a track record on accuracy to keep treating them as a reliable indicator of anything. If I had a weather app that was off by about 5°C almost every day, I wouldn't keep using that app to figure out tomorrow's temperature, even if it were spot-on on the odd occasion.
 