Protected? What is this? Some kind of good vs evil fairy tale? AMD's graphics business will collapse with no console funding; they won't even survive with their lack of software features or hardware performance. They won't carve anything out, and if they go the exotic route (away from DirectX and API standards), they would die in a heartbeat, as they don't have the marketshare or the mindshare for anyone to care.
What exactly do you think PC graphics is worth to AMD anyway when their biggest customers left are Microsoft and Sony? Eventually, Sony is going to be the only voice they can hear, and their designs will soon change to exactly accommodate them while Microsoft will be content with whatever ...
Nobody cares; PC is scalable. You can run the game on low and be done with it, unlike consoles. If you have a 3050 you can have a 120fps experience; you can't do that on consoles. You can always buy cheap hardware that runs all the games on PC, and there are iGPUs that can run every PC game in the wild. That point is irrelevant.
And we already have technological jumps on PC, they are called Path Tracing now, and it's already bigger than any rasterization jump in the past decade.
If PC doesn't care about price accessibility anymore then I'm sure you won't mind the idea of Sony attempting to follow through on their recent statement to convert PC gamers to their own console platforms, and what better way to do that than to introduce exotic hardware so that PCs are left with inefficient brute-force solutions? They're perfectly content with the idea of AAA PC gaming being gatekept behind RTX 4090s in the future if it means that they can now show a justification to their shareholders to never release their games on PC again if they'd be met with poor sales ...
PC never experienced missing graphical effects in the past, no matter what exotic hardware was out there.
Deus Ex on PS2 had more detailed polygon meshes than its PC counterpart, for starters. There are also tons of instances, like older Disney-based games or FIFA 21, where the PC versions were clearly based on last-generation consoles. PC also never got subsurface scattering for the Tomb Raider reboot in comparison to the next-gen ports. Only the GC version of Tales of Symphonia supports 60Hz. MGS2/3 have missing or reduced rain and DoF on PC in comparison to PS2. I'm not going to go any further since you can take a look at a more detailed list
here ...
As for high hardware demands, this comes at the expense of consoles nuking themselves: getting rid of backward and forward compatibility (which you seem to value so much given your arguments for the Switch 2 to ditch NVIDIA) and raising the costs of game development/porting/remastering ... while also raising hardware costs. Engine costs will also explode, since console makers will have to fork special builds of popular engines to be able to use them (which you don't like).
Never have I claimed that AMD/console vendors can't iterate an older standardized architecture into a new exotic HW design (they can keep both BC and features incompatible with PC), but Nvidia has never really done this (whether exotic or not) and their current partner is well aware of this fact either way ...
You are basically all over the map on this one; the whole basis of your argument is doing the opposite of whatever NVIDIA does. If some vendor goes to NVIDIA, then oh, it's bad, backward compatibility is lost. If some other vendor is going AMD, but NVIDIA is still sweeping the market, then ditch backward compatibility, ditch AMD, and go do something else! If new APIs are coming out with new features, then hail GPU-driven rendering and hail AMD for sticking with the "good APIs"; but if that's not enough and NVIDIA is acing the game with strong ray tracing and machine learning features, then ditch the APIs, ditch everything, and go exotic! If NVIDIA is making a special UE5 build with upgraded features, then it's bad and no one will use it, but if we go exotic and fork out specific engine builds, then that's fine and dandy, as long as they are not NVIDIA builds! Be consistent, please.
And console vendors are supposed to believe that PC apologists somehow represent their best interests when the likes of you keep championing the very same HW vendor who keeps losing out on these very contracts?
GPU driven rendering is nice but is currently vaporware. I’m sure we will see multiple RT improvements from all vendors before GPU driven rendering takes off. Not sure why you’re framing them as competing technologies.
@Bold Explain how a yet-to-be-standardized theoretical/hypothetical feature will beat an already API-exposed feature to the punch in terms of app integration?
Custom console hardware would be a lot more exciting than the current “mini PC” configs. I’m not sure AMD is the one to deliver that though. How will they pay for it?
I guarantee you that this 'custom hardware' would be easier/cheaper to implement, within whatever constraints their die area budget entails, on top of current existing console architectures than making HW RT as 'robust' as an RTX 4090 (well on its way to the reticle limit) ...
Texture caches are tiny, especially on some architectures, so they are not a panacea. They won't help much if you need to trace through all screen pixels. Register spilling with RT is not a problem when the software is properly designed, except maybe for some architectures that do traversal in software and require more registers for the additional traversal kernels.
Spilling variables/arguments from registers just means you don't care about performance anymore. Shading languages were designed exactly to prevent that case as much as possible, and there are many GPU algorithms out there that are fundamentally memory-bandwidth limited ...
BVH is comparable in cost to a depth prepass, G-buffer fill, shadow fill, or other raster passes, and it may cost even less. Many would rather get rid of some raster passes rather than BVH. And you're suggesting adding even more raster passes for planar reflections on steroids, etc., which is ridiculous when you mention the BVH cost. At least BVH is unified, and you can replace a lot of raster passes with it, preferably all of them in the near future, I hope.
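To make the cost comparison concrete: like a depth prepass, a BVH build is a bounded number of passes over the primitives, and traversal then replaces per-pass rasterization work. A deliberately tiny median-split sketch (real engines build on the GPU with SAH heuristics, treelets, and refitting; every name below is made up for illustration):

```python
# Toy median-split BVH build plus slab-test traversal. Illustrative
# only: the point is that build() touches each primitive a bounded
# number of times, much like a raster prepass does.
from dataclasses import dataclass

@dataclass
class AABB:
    lo: tuple
    hi: tuple

    def union(self, other):
        return AABB(tuple(min(a, b) for a, b in zip(self.lo, other.lo)),
                    tuple(max(a, b) for a, b in zip(self.hi, other.hi)))

    def centroid(self):
        return tuple((a + b) / 2 for a, b in zip(self.lo, self.hi))

@dataclass
class Node:
    box: AABB
    left: "Node" = None
    right: "Node" = None
    prim: int = -1  # leaf nodes store a primitive index

def build(boxes, indices=None):
    """Top-down build: split primitives at the median centroid along
    the widest axis of their combined bounds."""
    if indices is None:
        indices = list(range(len(boxes)))
    if len(indices) == 1:
        return Node(boxes[indices[0]], prim=indices[0])
    bound = boxes[indices[0]]
    for i in indices[1:]:
        bound = bound.union(boxes[i])
    axis = max(range(3), key=lambda a: bound.hi[a] - bound.lo[a])
    indices.sort(key=lambda i: boxes[i].centroid()[axis])
    mid = len(indices) // 2
    return Node(bound, build(boxes, indices[:mid]), build(boxes, indices[mid:]))

def trace(node, origin, inv_dir):
    """Slab test; returns indices of every leaf box the ray enters
    (inverse-direction components must be finite and non-zero)."""
    tmin, tmax = 0.0, float("inf")
    for a in range(3):
        t1 = (node.box.lo[a] - origin[a]) * inv_dir[a]
        t2 = (node.box.hi[a] - origin[a]) * inv_dir[a]
        tmin, tmax = max(tmin, min(t1, t2)), min(tmax, max(t1, t2))
    if tmin > tmax:
        return []
    if node.prim >= 0:
        return [node.prim]
    return trace(node.left, origin, inv_dir) + trace(node.right, origin, inv_dir)

# A row of unit boxes along +x, one ray through all of them, one miss.
boxes = [AABB((i, 0.0, 0.0), (i + 1.0, 1.0, 1.0)) for i in range(8)]
root = build(boxes)
hits = trace(root, (-1.0, 0.5, 0.5), (1.0, 1e9, 1e9))
misses = trace(root, (-1.0, 5.0, 0.5), (1.0, 1e9, 1e9))
```

The "unified" point in the post above maps to the fact that the same `trace()` answers visibility, shadow, and reflection queries, where raster needs a separate specialized pass for each.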
Unless console holders have gone crazy, I have a hard time imagining them wanting to make games even more technically complex and difficult to develop with even more different raster subsystems to keep in sync with each other. Sony sleeps and dreams of developing games in half the time and cost, not the other way around. So, all those fantasies about creating custom raster hardware for all cases in life (reflections, shadows, and whatever else) are unrealistic not just from a technical standpoint but also have zero practical viability.
Tracing against a static pre-baked BVH is comparable to those passes. Not so much with geometrically full-featured, dynamically generated runtime BVHs ...
Console vendors have dreams of developing on RTX 4090s too, but there's no market for $1000+ consoles, so paying a few graphics programmers to craft more complex solutions is more sustainable than either having a dead platform or racking up billions in hardware losses ...
There have always been restrictions and rules for rendering correctness. What do they have to do with pipeline states and their hardware costs?
Restrictions = less programmability/more fixed-function HW states. Restrictive interfaces in gfx APIs often correlate with HW designs having these "more optimized" states, in the sense of enabling fixed-function units to speed up runtime execution. Nvidia is a well-documented poster child of implementing HW for just about everything ...
Well, maybe because this is how it fits best for the SER realisation. You're really just guessing and grasping at straws here. It says nothing about the hardware states or anything else.
SER exists because their HW can't get full/optimum speed with either callable shaders or real function calls, which implies that there exists some special HW state enabling a faster spilling mechanism for ray generation shaders than those other methods ...
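For what it's worth, the effect SER targets is easy to model even on a CPU: when a SIMD group contains rays that need different hit shaders, the group pays for each distinct shader serially, and reordering rays by shader/material ID before shading restores coherence. A toy cost model of that idea (warp width, material counts, and all names here are invented for the sketch; this is not the actual NVIDIA API):

```python
import random

WARP = 32  # hypothetical SIMD width for the model

def warp_shader_cost(material_ids):
    """Model of divergent execution: a warp holding k distinct
    material IDs runs k serialized hit-shader passes, so the total
    cost is the sum of distinct IDs per warp."""
    cost = 0
    for i in range(0, len(material_ids), WARP):
        cost += len(set(material_ids[i:i + WARP]))
    return cost

random.seed(0)
# Incoherent rays after a bounce: 8 materials scattered randomly.
rays = [random.randrange(8) for _ in range(1024)]

unsorted_cost = warp_shader_cost(rays)           # roughly 8 passes per warp
reordered_cost = warp_shader_cost(sorted(rays))  # SER-style grouping
```

Under this model the reordered stream costs close to the 32-warp minimum, while the incoherent stream pays nearly the full 8x divergence penalty; the hardware question is only whether the reorder itself is cheap enough to win.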
These mostly come from past-gen consoles, where PC has always lagged in this regard due to more powerful hardware (CPU/GPU), standardization hell, and fewer requirements for console-style optimizations, which typically come at the expense of GPU performance and stability on PC. It's no secret that RT was late to the party for current-gen consoles, so consoles didn't push the development of RT much for many reasons, as their RT HW is too weak to change something on PC. Once more performant RT solutions are available on consoles, which are expected by the end of the year, people will start to care more about supporting new advancements on PC as well.
Using last-generation consoles as an example is just a scapegoat when nearly all (?) of the graphically high-end AAA UE5 games are released with software Lumen only ...
RT integration in current games only went as far as it did because many games, or their technology, were tied to last-generation consoles. RT wouldn't be proliferating anywhere near as much as it does now if so many games weren't still being released on last-generation consoles or based on their technology, which often has lower scene complexity. There wouldn't be a lot of RT modes featured in games if they were built to maximize visual fidelity out of current consoles. "More performant" RT implementations aren't going to do anything for consoles in the future when they can't use much of the said overly contrived HW features ...