Current Generation Hardware Speculation with a Technical Spin [post launch 2021] [XBSX, PS5]

Status
Not open for further replies.
Could it be that in order to get the I/O complex on die and still remain within transistor and power budget they had to make some cuts to the FPU and other changes they thought were small?
The die shots show the IO section is on the other side of the die, and that there seems to be an area of unused silicon next to the FPU, which is close to if not matching the area "saved" by shrinking the FPU.
Unused die area negates the supposed savings in transistors, as the cost of the die doesn't change if the final dimensions are the same.

If the power constraint is that significant, it runs counter to some of the implications of Mark Cerny's presentation about making 256-bit instructions more sustainable, since the core isn't able to sustain even 128-bit instructions at the expected level.
What is odd on top of that: since the core does support 256-bit instructions at half throughput, that works out to the same total vector throughput as normal full-rate 128-bit execution. So why did 128-bit throughput get cut?
What is potentially saved at 256-bit is some of the scheduling and data routing for four 128-bit SSE ports, but at that point, how much more than a sliver of the power budget was saved?
 
IIRC Denuvo requires AVX, so any game that uses it for DRM would require AVX as well. I would assume the same game on console would not require AVX, as long as the actual game code doesn't use it either. But it's hard to tell whether a console game needs or leverages AVX just by looking at the same game's requirements on PC.
 
IIRC Denuvo requires AVX, so any game that uses it for DRM would require AVX as well. I would assume the same game on console would not require AVX, as long as the actual game code doesn't use it either. But it's hard to tell whether a console game needs or leverages AVX just by looking at the same game's requirements on PC.

Do any Microsoft Game Studio titles even use Denuvo? I can't recall any using it. Gears 5 didn't use it and neither did Forza Horizon 4. They used Arxan DRM.

Is there news about MS switching to Denuvo?

Regards,
SB
 
Just watched this review of AoE 4, and the game itself doesn't look like it's doing anything special. Looks like there's some destructible physics on the buildings but that might also be pre-rendered.

 
Do any Microsoft Game Studio titles even use Denuvo? I can't recall any using it. Gears 5 didn't use it and neither did Forza Horizon 4. They used Arxan DRM.

Is there news about MS switching to Denuvo?

Regards,
SB
I have no idea. I was just pointing out that just because a PC game has system requirements that include AVX, that doesn't mean the console version would even use those instructions.
 
I have no idea. I was just pointing out that just because a PC game has system requirements that include AVX, that doesn't mean the console version would even use those instructions.

That's true. But considering that MS deliberately left the AVX instructions basically untouched in the XBS consoles, it wouldn't surprise me at all if their first-party studios made use of them. Unlike 3rd party studios, they don't need to worry about whether it will or won't work on a rival console, or even necessarily on older CPUs on PC.

For example, Far Cry 6, which we know uses Denuvo, doesn't require AVX in its min CPU spec on PC. It goes even lower than AoE 4, to a Ryzen 3 1200, with no mention of an AVX requirement. That makes me skeptical that an AVX requirement would be used for DRM. Gears 5, which likely uses the same DRM as AoE 4, goes even lower, with only an AMD FX-6100 as the min spec, again with no mention of an AVX requirement.

There are a lot of games on PC that will use AVX instructions if they are available, but I can't think of many that actually require them, regardless of what DRM package they use.

This isn't to say that it's not used for DRM. But if it is, it'll be the most resource-hungry DRM scheme used in a game to date.

Regards,
SB
 
For example, Far Cry 6, which we know uses Denuvo, doesn't require AVX in its min CPU spec on PC. It goes even lower than AoE 4, to a Ryzen 3 1200, with no mention of an AVX requirement. That makes me skeptical that an AVX requirement would be used for DRM. Gears 5, which likely uses the same DRM as AoE 4, goes even lower, with only an AMD FX-6100 as the min spec, again with no mention of an AVX requirement.
All Ryzen and FX CPUs support AVX, though. So by requiring a CPU like a Ryzen 3 1200 or an FX-6100, you have AVX support already.
 
Not every game that requires AVX does so because of DRM. Many developers may have avoided AVX because it didn't do much to improve performance while excluding a lot of setups with older CPUs that had the necessary performance to run titles effectively but lacked AVX.

https://devblogs.microsoft.com/cppb...rovements-in-visual-studio-2019-version-16-2/
Here is the Infiltrator demo running with AVX: 2-3% gains when frame times are at their worst, which is hardly an improvement.

However, that may be a thing of the past. You have games like Dual Universe that use AVX for procedural generation and claim it is necessary for performance. Ubisoft makes performance claims too. Diablo 2 Resurrected had AVX added during optimization of the game, and for Star Citizen, AVX was added through a patch.
 
Possibly, but an i3 6300U with 2 cores and 4 threads (listed as min.req) is obviously not more powerful than a Phenom II x6 1100T even though one supports AVX and the other doesn't, so there has to be an alternative to the performance explanation, unless the game is poorly optimized.
 
Possibly, but an i3 6300U with 2 cores and 4 threads (listed as min.req) is obviously not more powerful than a Phenom II x6 1100T even though one supports AVX and the other doesn't, so there has to be an alternative to the performance explanation, unless the game is poorly optimized.
???
The i3/i5 6300U is much more powerful (in current tasks). The Phenom II architecture is just really old. After a while (over 10 years now) it doesn't make sense, even for developers, to look at such old hardware and try to support it. Maybe there are games out there that don't really need AVX but have the extension flag enabled in the compiler. But the fact is that the non-AVX CPU architectures are now 10+ years old. At some point those CPUs are simply no longer relevant to the market.
 
Possibly, but an i3 6300U with 2 cores and 4 threads (listed as min.req) is obviously not more powerful than a Phenom II x6 1100T even though one supports AVX and the other doesn't, so there has to be an alternative to the performance explanation, unless the game is poorly optimized.
Well, the 6-thread Phenom is going to be at a disadvantage in plenty of workloads optimized for 4 threads, as many games from its era were.
Lol, I'd like to see that laptop CPU run Cyberpunk at mostly 30+ fps framerates.

Here's a desktop i3 6100 running CP2077. Doesn't seem so bad.

should be possible
Intel Core i5-6300U Cyberpunk 2077 Performance - Can Core i5-6300U Run Cyberpunk 2077? - CPUAgent
That was tested with ultra settings. Your video used low settings, so even CPU-intensive settings were turned down. The bigger problem is that most of the time the 6300U, as a mobile CPU, is paired with a low-end GPU (if there is a discrete GPU at all).
That's an i5, not an i3. Although I think the only difference might be 100 MHz on the base clock and the i3's total lack of a boost clock.
 
That's not the point. The point is, there are still some CPUs without AVX capability that are still somewhat capable, even today. I think an 1100T could probably run AoE 4 if they offered an SSE code path, but it's their decision to cut off non-AVX CPUs. Assuming it's not Denuvo or some other resource-hogging DRM program to blame, that is.
 
That's not the point. The point is, there are still some CPUs without AVX capability that are still somewhat capable, even today. I think an 1100T could probably run AoE 4 if they offered an SSE code path, but it's their decision to cut off non-AVX CPUs. Assuming it's not Denuvo or some other resource-hogging DRM program to blame, that is.
Yes, they are. But they are just too old. If developers keep targeting CPUs that are no longer relevant to the market (there are not many users with a Phenom CPU today) and that use 10+ year old tech, the PC market will never reach a point where new technologies get used. It's also additional support: even if it's just a compiler flag, it means tests must be run.
 