Next Generation Hardware Speculation with a Technical Spin [post E3 2019, pre GDC 2020] [XBSX, PS5]

Because the Jaguars were kinda gimped even for their time, but yes.
I still contend that an 8-core Jaguar was probably the best choice available at the time, given the cost and power requirements.

It was a smart-aleck retort that somehow 30fps magically goes away because of next-generation console hardware, which is obviously false, since the vast majority of PCs (high-end ones as well) still have problems maintaining a solid 30fps in many modern-day games, and even in some prior-generation titles.
Can you name some titles? I mean, I know some older games are locked at 30FPS but I can't think of any that run sub 30FPS on what I would consider high end PC hardware without such an artificial ceiling.
 
Can you name some titles?
Just from what I remember, all these recent titles fail to hit 60fps at 1080p on midrange GPUs:
SW Jedi Fallen Order
RDR2
Ghost Recon Breakpoint
Control
(probably many more)
The problem is that benchmarks are usually run at ultra settings, so this may give an impractical impression.
 
Can you name some titles? I mean, I know some older games are locked at 30FPS but I can't think of any that run sub 30FPS on what I would consider high end PC hardware without such an artificial ceiling.

I wasn't talking about PC games being artificially capped at 30fps. I specifically stated that I have seen high-end PC hardware fail to maintain even 30fps in certain modern and prior-generation PC titles. Modern titles such as Control and RDR2, and older titles such as GTA V and The Witcher 3, can (and do) drop framerates under 30fps for sustained periods of time, even on modern 2018-2019 high-end AMD/Intel CPUs and AMD/Nvidia GPUs.

Point being, 30fps gaming isn't going anywhere anytime soon, especially with the next-generation of consoles.

FYI: by high-end PC gaming I mean unrestricted eye-candy settings, inclusive of maxed LoD, draw distance and density settings, running at 4K.
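
To put numbers on that: "a solid 30fps" is really a per-frame budget. A quick sketch of those budgets (plain Python; nothing here beyond the standard fps-to-milliseconds conversion):

```python
# Frame-time budget per target: budget_ms = 1000 / fps.
# A "solid 30fps" means every frame finishes inside 33.3 ms; one slow
# frame reads as a drop even if the average over a second is still 30+.
for fps in (30, 60, 120):
    print(f"{fps:>3} fps -> {1000.0 / fps:5.2f} ms per frame")
```

Halving the budget (30fps to 60fps) demands roughly double the throughput from whichever stage is the bottleneck, which is why maxed 4K settings push even high-end GPUs past the 33.3 ms line.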
 
Because the Jaguars were kinda gimped even for their time, but yes.



What does the PC have to do with this? Anyway, that would depend on what hardware one has, and on one's preferred settings.

For next-generation games, once they begin to push the CPU, reaching 60fps in games that target 30fps on next-generation consoles will probably require a 12-core CPU. I think the consoles will help AMD and Intel sell CPUs. :) All the people keeping "old" CPUs because the bar for 60fps was low will need to upgrade.
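
Worth noting that extra cores only buy a 30-to-60fps jump if the engine parallelizes well. A generic Amdahl's-law sketch of why (Python; the serial fractions here are made-up illustration values, not measurements of any real engine):

```python
# Amdahl's law: with a serial fraction s of the CPU frame, n cores give a
# speedup of 1 / (s + (1 - s) / n). Going from a 33.3 ms CPU frame to a
# 16.7 ms one needs a full 2x, which core count alone rarely delivers.
def speedup(s, n):
    return 1.0 / (s + (1.0 - s) / n)

for cores in (12, 16):
    for s in (0.0, 0.2, 0.4):
        rel = speedup(s, cores) / speedup(s, 8)  # gain over an 8-core CPU
        print(f"{cores} cores, serial={s:.0%}: {rel:.2f}x over 8 cores")
```

With even 20% of the frame serial, 12 cores are only ~1.13x faster than 8, so clock speed and per-core IPC will matter at least as much as the core count.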
 
Which is obviously false, since the vast majority of PCs (high-end ones as well) still have problems maintaining a solid 30fps in many modern-day games, and even in some prior-generation titles.

Depends on settings again. RDR2 runs medium/low on even the One X, with some settings even lower than low. You don't need ultra to get better than that at 60fps, even on older hardware. "Ultra" is going to give trouble in most situations, as DF has explained before.
 
As a point of reference, Control running at 1440p with max settings, including RTX maxed out, can drop frames at 30fps on my RTX 2080 (1700 MHz) with an Intel i7-8750H (6 cores, 3.2 GHz locked turbo). The game is entirely GPU-limited; it's basically redlining the GPU the entire time. I mostly console game, and that was the choice I settled on. Playing at 1080p with reduced ray tracing (without RTX AO) at 60fps is also an option, but I wanted the eye candy.

We already get similar performance profiles on the mid-gen consoles (quality and performance modes). I doubt much will change in that regard next gen.
 
Depends on settings again. RDR2 runs medium/low on even the One X, with some settings even lower than low. You don't need ultra to get better than that at 60fps, even on older hardware. "Ultra" is going to give trouble in most situations, as DF has explained before.

Once again, that was never my point. But let me make it clear: expect many 30fps games on next-generation systems.
 
Just from what I remember, all these recent titles fail to hit 60fps at 1080p on midrange GPUs:
SW Jedi Fallen Order
RDR2
Ghost Recon Breakpoint
Control
(probably many more)
The problem is that benchmarks are usually run at ultra settings, so this may give an impractical impression.
The claim was for "high-end" PCs, not midrange ones.

I wasn't talking about PC games being artificially capped at 30fps. I specifically stated that I have seen high-end PC hardware fail to maintain even 30fps in certain modern and prior-generation PC titles. Modern titles such as Control and RDR2, and older titles such as GTA V and The Witcher 3, can (and do) drop framerates under 30fps for sustained periods of time, even on modern 2018-2019 high-end AMD/Intel CPUs and AMD/Nvidia GPUs.

Point being, 30fps gaming isn't going anywhere anytime soon, especially with the next-generation of consoles.

FYI: by high-end PC gaming I mean unrestricted eye-candy settings, inclusive of maxed LoD, draw distance and density settings, running at 4K.
OK. I haven't played Control or RDR2 on PC, and I haven't experienced sub-30fps Witcher 3 or GTA V either, but I also don't have a 4K monitor. That's a believable list if we mandate 4K and max settings, though. Based on your initial post I was assuming console-like quality settings and resolution, but I would agree that 4K maxed settings can be prone to drops.
 
XB1 is the only console with 4 render backends. 1X and PS4 had 8. Pro had 16!

The XB1X has a surprisingly low pixel fillrate for a system that releases so many titles at native 4K resolution.
I really thought Scorpio had 12*4 ROPs, but turns out it's really just 8*4.
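
For context, pixel fillrate is just ROPs times core clock (with 4 ROPs per render backend on GCN), so the RB counts above translate directly into Gpixels/s. A rough sketch using the public clock speeds (Python; these are theoretical peaks, not achieved throughput):

```python
# Theoretical pixel fillrate = ROPs * clock. Render backends (RBs) carry
# 4 ROPs each on GCN. Peak numbers only; blending and bandwidth limits
# mean real throughput is lower.
consoles = {
    # name: (render_backends, core_clock_mhz)
    "XB1":     (4,  853),
    "PS4":     (8,  800),
    "XB1X":    (8, 1172),
    "PS4 Pro": (16, 911),
}

PIXELS_4K = 3840 * 2160  # ~8.3 Mpixels per full-screen pass

for name, (rbs, mhz) in consoles.items():
    rops = rbs * 4
    gpix_s = rops * mhz / 1000.0           # Gpixels/s, peak
    fills = gpix_s * 1e9 / 30 / PIXELS_4K  # full 4K fills per 30fps frame
    print(f"{name:8} {rops:2d} ROPs @ {mhz} MHz -> "
          f"{gpix_s:5.1f} Gpix/s (~{fills:.0f} full 4K fills per 30fps frame)")
```

So the X1X's 37.5 Gpix/s still allows on the order of 150 full-screen 4K fills per 30fps frame in theory, which is presumably why the low ROP count ends up mattering less in practice than shading and bandwidth.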
 
The XB1X has a surprisingly low pixel fillrate for a system that releases so many titles at native 4K resolution.
I really thought Scorpio had 12*4 ROPs, but turns out it's really just 8*4.
I recall there being heavy speculation that this would be the X1X's Achilles' heel, with regard to its memory setup (12 GB / 384-bit bus) and the available ROPs.
 
I am reposting this here so it can get better exposure.

According to an obscure discussion thread on Anandtech's forums, a member who is known to be a developer made some posts about the hardware RT in consoles. He said that the consoles will support a 32-bit snorm format and have a more programmable traversal stage; he claims current DXR hardware lacks both, and so will be at a disadvantage moving forward, requiring new DXR hardware. His points are contested in the thread, though, and he never explained the usefulness of the snorm format, so I don't know about the accuracy of his argument.

There are three problems with the current implementation.
- It doesn't support the 32-bit snorm format, because the fixed-function hardware is not designed around it. This is a huge limitation; it can sacrifice too much performance and memory, and the supported 32-bit float format doesn't really give you better results.
- The acceleration structures used are not public, so this can result in extremely wild performance variations depending on the scene. This needs to be solved.
- The ray traversal stage is extremely limited. It should be programmable.

He also claims that Xbox and PS5 will have different pipelines for RT.
The PS5 has a very different RT solution compared to what is implemented in DXR. Even the Xbox has an upgraded pipeline.

https://forums.anandtech.com/threads/ray-tracing-is-in-all-next-gen-consoles.2571546/#post-39954331

He also says that consoles will do custom BVHs.

I can't say too much about this, but the next step will be the custom BVHs.
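
For what it's worth, one plausible reading of the snorm point (my own guess, not anything he or the thread confirms) is quantizing BVH coordinates to signed-normalized fixed point relative to a node's bounding box, which shrinks the acceleration structure and keeps the bounds exact. A minimal sketch of that kind of quantization (Python; the helper functions and the 16-bit default are purely illustrative, and he specified 32-bit):

```python
# Hypothetical illustration: quantize a coordinate inside a node's AABB to
# signed-normalized fixed point (snorm) and decode it back. A BVH stored
# this way keeps small integers per node instead of full float vectors.

def encode_snorm(value, lo, hi, bits=16):
    """Map a value in [lo, hi] to a signed int in [-(2^(bits-1)-1), 2^(bits-1)-1]."""
    scale = (1 << (bits - 1)) - 1
    t = (value - lo) / (hi - lo) * 2.0 - 1.0   # remap to [-1, 1]
    return round(max(-1.0, min(1.0, t)) * scale)

def decode_snorm(q, lo, hi, bits=16):
    scale = (1 << (bits - 1)) - 1
    t = q / scale                              # back to [-1, 1]
    return (t + 1.0) / 2.0 * (hi - lo) + lo

aabb_lo, aabb_hi = -100.0, 300.0   # made-up node bounds along one axis
x = 123.456
q = encode_snorm(x, aabb_lo, aabb_hi)
print(q, decode_snorm(q, aabb_lo, aabb_hi))    # 3843, ~123.46
```

Whether that is actually what he means, and whether console RT hardware works that way, is exactly what's contested in the thread.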
 
I am reposting this here so it can get better exposure.

According to an obscure discussion thread on Anandtech's forums, a member who is known to be a developer made some posts about the hardware RT in consoles. He said that the consoles will support a 32-bit snorm format and have a more programmable traversal stage; he claims current DXR hardware lacks both, and so will be at a disadvantage moving forward, requiring new DXR hardware. His points are contested in the thread, though, and he never explained the usefulness of the snorm format, so I don't know about the accuracy of his argument.



He also claims that Xbox and PS5 will have different pipelines for RT.


https://forums.anandtech.com/threads/ray-tracing-is-in-all-next-gen-consoles.2571546/#post-39954331

He also says that consoles will do custom BVHs.
All of this seems within reason. I don't see anything here that smells entirely rotten given how much time they have to develop the next generation of ray tracing hardware.
 
All of this seems within reason. I don't see anything here that smells entirely rotten given how much time they have to develop the next generation of ray tracing hardware.
The bits about snorm could be a misunderstanding on his part, but the most interesting bit is the PS5 and Xbox having wildly different RT solutions. Will Sony make custom modifications for RT this time?
 
The bits about snorm could be a misunderstanding on his part, but the most interesting bit is the PS5 and Xbox having wildly different RT solutions. Will Sony make custom modifications for RT this time?
To me, the probability that they are using the same RT hardware base is still very high, given that they are using the same architecture base and have access to the same future technologies from AMD.

That sort of leaves the following possibilities:
a) both use the same RT hardware from RDNA2
b) one chooses to use RT hardware from RDNA2, and one does not
c) both choose to not use RDNA2

I think if I were to spread the probabilities, I would bet on A and B, but not C. If the statement by that developer is true, then B is likely the answer, because C would mean that there are at least three ways to approach RT acceleration, possibly a fourth. That doesn't make a lot of sense considering that, from a high-level standpoint, we see how close Nvidia's RT setup is to PowerVR's, for instance. That's already two methods, but they are close in nature. AMD provides their own; that's a third method. Then Sony and MS make their own ones as well? How many realistic high-performance methods remain that could be "wildly different"?

So naturally I will bet on (B) if what he says is true. Either you have an RT solution that you think is better than RDNA 2, or you just use RDNA 2, and you can go back and forth testing the two designs right up to the cut-off line. That's how I see it.

And if it is true, I will bet on MS being the different one and Sony rolling with RDNA 2. Nothing to do with talent or skill; just thinking back to
a) wanting to release in 2019,
b) rushing a lot of headcount for BC, such that they moved the Vega team over to Navi for Sony, and
c) perhaps being blindsided by the release of ray tracing hardware, since they planned on a 2019 release.

Basically, hardware needed to be finalized about now to begin console development for a launch next year. These types of changes may have a real impact here. Perhaps Sony did make their own RT solution, but with resources being shifted or whatnot, perhaps it doesn't outperform RDNA 2, so you just change the design to RDNA 2 because AMD is paying for it and it's going to deliver for 2020.

With MS, they would have known for some time that this was coming, since they were developing RT in tandem with the Xbox One X:
a) they may be looking at creating an RT solution that can be used in a variety of different applications,
b) they have shown that they have other groups working on different things (the Xbox One BC hardware, for instance) that were added to the final product, and
c) with AMD being generally behind on RT and DX features, perhaps they felt they wanted to roll their own solution anyway.

And while it's possible they didn't develop a solution that outperforms RDNA 2, knowing the types of things developers are asking for in the RT pipeline, it's possible they rolled their own solution to support more features down the road.

To be clear I am referring to the hardware. Not the API pipeline. I'm not sure what the developer has specified.
 
The claim was for "high end" PCs, not midrange

Obviously on PC it's all up to the user; one can go lower than low with mods or ini files to achieve 120+ fps in AAA titles. One can also enable full ray tracing with everything else maxed and experience 30fps. In general this gen, though, 60fps at console-equivalent settings or better wasn't/isn't a problem. 30fps isn't over on next-generation consoles, because devs are going to want graphics over fps.
 
I am reposting this here so it can get better exposure.

According to an obscure discussion thread on Anandtech's forums, a member who is known to be a developer made some posts about the hardware RT in consoles. He said that the consoles will support a 32-bit snorm format and have a more programmable traversal stage; he claims current DXR hardware lacks both, and so will be at a disadvantage moving forward, requiring new DXR hardware. His points are contested in the thread, though, and he never explained the usefulness of the snorm format, so I don't know about the accuracy of his argument.

He also claims that Xbox and PS5 will have different pipelines for RT.

https://forums.anandtech.com/threads/ray-tracing-is-in-all-next-gen-consoles.2571546/#post-39954331

He also says that consoles will do custom BVHs.

I'm guessing the person is getting DXR and RTX mixed up. I can't really say I haven't heard that RTX, in its current implementation, is a bit too black-boxy and rigid.
 
Relevant to the last couple of pages' next-gen FPS discussion, perhaps: according to this Gears 5 tech presentation, they made an "educated bet" that
'60fps is the new "base line". Line in the sand early.'
 
I am reposting this here so it can get better exposure.

According to an obscure discussion thread on Anandtech's forums, a member who is known to be a developer made some posts about the hardware RT in consoles. He said that the consoles will support a 32-bit snorm format and have a more programmable traversal stage; he claims current DXR hardware lacks both, and so will be at a disadvantage moving forward, requiring new DXR hardware. His points are contested in the thread, though, and he never explained the usefulness of the snorm format, so I don't know about the accuracy of his argument.



He also claims that Xbox and PS5 will have different pipelines for RT.


https://forums.anandtech.com/threads/ray-tracing-is-in-all-next-gen-consoles.2571546/#post-39954331

He also says that consoles will do custom BVHs.

We know Sony Interactive Entertainment has been working on an RT solution for the PS5 since at least 2016 (if I'm not mistaken), when they hired the lead PowerVR RT software engineer from Imagination Technologies. So whatever the PS5 RT solution is, it's definitely not a last-minute thought just to have an RT PR bullet point.
 
We know Sony Interactive Entertainment has been working on an RT solution for the PS5 since at least 2016 (if I'm not mistaken), when they hired the lead PowerVR RT software engineer from Imagination Technologies. So whatever the PS5 RT solution is, it's definitely not a last-minute thought just to have an RT PR bullet point.
Link for 2016? Both points are new info for me.
Sounds like an expensive endeavor.
More power plus licensing for RT tech could be fairly pricey. I'm not going to rule it out, but this sounds significantly more expensive.
 
The bits about snorm could be a misunderstanding on his part, but the most interesting bits is the PS5 and Xbox having wildly different RT solutions, Will Sony make custom modifications for RT this time?
That wouldn't be surprising considering they already did custom hardware modifications on both PS4 and Pro GPUs.
Link for 2016? Both points are new info for me.
Sounds like an expensive endeavor.
More power plus licensing for RT tech could be fairly pricey. I'm not going to rule it out, but this sounds significantly more expensive.
That's the thing: they hired the guy, so they own everything he did for them. I think @chris1515 followed that case closely.
 