IHV Business strategies and consumer choice

there's a non-trivial probability of developers avoiding HWRT while using Nanite on UE5 ?
So far, every UE5 title with Lumen has included HWRT except two. Layers of Fear, Fortnite, Ant Ausventure, The Fabled Woods, and Desordre all use HW Lumen, and Immortals of Aveum is going to support HWRT in a future patch. On the other side, we only have Fort Solis with SWRT (so far, at least).

I won't respond to the rest of your rant, as it's just ramblings on your part. Early access games are just as viable to include as finished games. And yes, I am counting games with RT in the year they received RT, because before that they were never counted as RT games to begin with. GTA V, Resident Evil 7 and The Witcher 3 received ray tracing updates 6+ years after release; must I now list them as having ray tracing in 2015, before real-time ray tracing hardware even existed, rather than in 2022? Brilliant!

You are just grasping at straws to prove a point that doesn't even exist (few RT titles in 2023). Knocking two games out of my list won't change the conclusion.
 
there's a non-trivial probability of developers avoiding HWRT while using Nanite on UE5 ?
Why would they avoid HWRT when, with good, normal level design, they can achieve both better image quality and high performance?

We all know that if a UE5 game lacks HWRT, the reason is likely laziness, a lack of concern for users, or poor level design practices such as kitbashing, all of which can compromise both quality and performance.

So, if devs who don't care about their game's users avoid HWRT, it's probably for the best: if a developer doesn't prioritize optimizing their game, it's likely not worth adding RT or any other graphics-enhancing tech. Such games probably wouldn't utilize these technologies efficiently anyway.
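For context, and purely as a sketch from memory (treat the exact cvar names as assumptions, they may be slightly off), enabling HW Lumen in a UE5 project is mostly a few project settings / DefaultEngine.ini flags on top of what an SW Lumen title already ships with:

[/Script/Engine.RendererSettings]
; Lumen as the dynamic GI and reflection method (what SW Lumen titles already use)
r.DynamicGlobalIlluminationMethod=1
r.ReflectionMethod=1
; Add ray tracing support and let Lumen use hardware ray tracing when the GPU can
r.RayTracing=1
r.SkinCache.CompileShaders=1
r.Lumen.HardwareRayTracing=1

The engine-side switch is small; the real work is in testing, content that traces well, and the performance budget on target hardware.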
 
Why would they avoid HWRT when, with good, normal level design, they can achieve both better image quality and high performance?

We all know that if a UE5 game lacks HWRT, the reason is likely laziness, a lack of concern for users, or poor level design practices such as kitbashing, all of which can compromise both quality and performance.

So, if devs who don't care about their game's users avoid HWRT, it's probably for the best: if a developer doesn't prioritize optimizing their game, it's likely not worth adding RT or any other graphics-enhancing tech. Such games probably wouldn't utilize these technologies efficiently anyway.
Out of the 4 games so far that use Nanite, only 1 of them enables HWRT, which is Fortnite. The other UE5 games which don't use Nanite all look like low-budget asset flips. It's pretty clear to anyone that Nanite titles will be a better representation of AAA games using UE5 than games that don't use it ...

You can throw shade all you want at the developers for their content authoring practices, but even Epic Games is not going to outright stop them from 'kitbashing' since they realize it's an aspect of Nanite that can massively improve an artist's iteration times ...
 
Yes, but UE5 games with Nanite aren’t particularly impressive either…yet.
Sure, that might be a hot take that's shared by some others here, but that's not going to stop them from becoming the measuring stick in the immediate future, and they're not going to be superseded by some glorified tech demos with low-quality asset flips ...
 
Out of the 4 games so far that use Nanite, only 1 of them enables HWRT, which is Fortnite.
I understand why you shift your goalposts with each post to justify your narrative, but let's examine some facts first:

Fortnite - has HWRT and lacks kitbashing.
Desordre - has HWRT and, while it apparently exhibits some level of kitbashing by design, this is not sufficient to cripple HWRT.
Fort Solis - does not have HWRT and lacks kitbashing. However, its many metal surfaces would apparently have benefited from at least specular occlusion from characters via HWRT.
Remnant 2 - lacks both HW and SW RT, does not have kitbashing, and features mostly diffuse graphics, which would not significantly benefit from HWRT except for better draw distances.
Immortals of Aveum - employs SW Lumen, lacks kitbashing, and primarily showcases diffuse graphics. However, the developers are working on incorporating HWRT for improved distance GI.

So, among these 5 games with Nanite:
  • 2 feature HWRT, with one more being added later on.
  • 2 currently use SW RT only, but one will transition to the HWRT category in the future.
  • 1 game does not use Lumen at all, so is irrelevant.
There are zero games where kitbashing would prevent the introduction of HWRT for better quality graphics.

You can throw shade all you want at the developers for their content authoring practices
I'm not throwing any shade on devs. Kitbashing is generally considered a bad practice, and the fact that it doesn't work outside of UE5 makes it even more problematic for people learning level design in UE. I've yet to see a developer create a game using this approach, and I hope there won't be many.

Epic Games is not going to outright stop them from 'kitbashing' since they realize it's an aspect of Nanite that can massively improve an artist's iteration times ...
With current games running at 720p, I'm afraid that promoting new bad design practices that produce even lower-resolution games will benefit neither game developers nor UE5's developers. Apologies if this sounds harsh, but there are many issues to address in UE to make UE games better before embracing even less optimal development practices that make them worse.
 
It's clear that a few people will not self-moderate and bring their fanboyism into every thread.
It's not just fanboyism across IHVs, it's about ideology. Some think RT is the only way to achieve better visuals and that visuals matter the most; others think the cost is too high and that visuals are already overrated and exhausted, etc.
Even if everybody hid their IHV preferences and brand agendas, this conflict of ideology would remain.
I think we have to deal with it.
And since conflicts give a reason to discuss, no forum should close just because discussions become more intense. It's rather a major reason to keep those forums up, imo.

But I agree it's annoying that AMD is constantly talked down just because they wait to see what the market demands first, while NV attempts to shape and dictate the market to follow their personal vision of how games should work.
Other than RT performance, I see no big difference across their results. Both struggle to achieve those raw performance jumps we remember from the past, because Moore's Law is dead.

And while I'm at it, it also sucks that people divide GPUs into just 'rasterization perf.' and 'RT perf.'. That's really naive, and a very outdated view coming from the days when 3D accelerators could just draw textured triangles and nothing else.
Today GPUs are mainly general-purpose parallel processors, and parallel programming is useful for graphics and games too, of course. But in any discussion about gaming, people ignore all that and just focus on RT cores, ROPs, or other fixed-function features, which only matter significantly for applications that saturate those features while underutilizing others.
Related discussions tend to narrow down to such corner cases, focusing on a local maximum but missing the larger picture, I feel. But that's hard to avoid.

The market will regulate itself. Will AMD become insignificant because they fail to keep up with RT? Will NV become insignificant because they fail to serve a market preferring affordability over pomp?
We will see, but I guess it's neither.
 
Hold on, are people suggesting that RT adoption, performance and marketing have nothing to do with “AMD execution”? Or is it that RT is an Nvidia invention that has nothing to do with AMD’s ability to sell GPUs? RT certainly isn’t the reason for most/all of AMD’s woes, but come on.

What exactly is this thread for again?

And while I'm at it, it also sucks that people divide GPUs into just 'rasterization perf.' and 'RT perf.'. That's really naive, and a very outdated view coming from the days when 3D accelerators could just draw textured triangles and nothing else.

Yep, we should be talking about results and the tech that delivers those results.
 
It's not just fanboyism across IHVs, it's about ideology. Some think RT is the only way to achieve better visuals and that visuals matter the most; others think the cost is too high and that visuals are already overrated and exhausted, etc
It's still about fanboyism. Whatever Nvidia does is the way, therefore RT is the only way. That's why the same users argue so much against software Lumen in the UE thread. It's part of the agenda to promote Nvidia constantly.
 
That's why the same users argue so much against software Lumen in the UE thread.
Or maybe they argue against s/w Lumen because it is both slower and results in worse image quality than what you get by using h/w RT on GPUs which are capable of running it well. And it's in fact an agenda to downplay h/w RT constantly, because its adoption hurts your favorite brand.

See? This game of yours can be played both ways.
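For what it's worth, anyone with the UE5 editor (or a build that exposes the console) can compare the two directly. Assuming I'm remembering the cvar correctly:

r.Lumen.HardwareRayTracing 0 (software tracing against Lumen's distance fields and surface cache)
r.Lumen.HardwareRayTracing 1 (hardware ray tracing against the actual triangle geometry, where supported)

Glossy and mirror-like surfaces are usually where the difference in quality and cost shows up first.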
 
It's still about fanboyism. Whatever Nvidia does is the way, therefore RT is the only way. That's why the same users argue so much against software Lumen in the UE thread. It's part of the agenda to promote Nvidia constantly.

Replace Nvidia with AMD and Lumen with RT in that same sentence and you get the other side of the same coin.

None of that matters, of course, to AMD’s bottom line. Pretending that RT is just some Nvidia marketing thing that AMD can choose to ignore and still be successful is pure fantasy at this point. The cat’s out of the bag and it’s not going back in. There have been suggestions from some users here that AMD should actively sabotage the adoption of ray tracing in games. On this forum, of all places. That’s simply embarrassing, and it makes any pro-RT fanboyism from Nvidia fans pale in comparison. Clearly those people don’t actually care about graphics.

This is a symptom of the whole vibe around AMD’s graphics division of late. Instead of being associated with pushing tech forward, their biggest supporters are more concerned with stifling progress so they don’t look bad. And that’s all you need to know to explain their current market position.

Baffling how the so-called holy grail of rendering has been relegated to having a “side”. Imagine if we had sides for 3D vs 2D rendering here based on IHV preference. Utterly stupid.
 
Most people don’t seem to be buying AMD hardware.
Nvidia had a very strong mindshare advantage well before ray tracing/Turing came around. And Radeon has a pretty consistent habit of shooting themselves in the foot in ways that make them less appealing and that have nothing to do with a lack of ray tracing performance in particular. So yea, this comment alone really isn't that relevant.

And while yes, there are AMD fanboys who try to downplay ray tracing to defend their brand, there are lots of, and perhaps even more, Nvidia fanboys doing the opposite and trying to play it up as much as possible in order to make AMD look worse.

I'd have to agree with the person above that, while ray tracing might be the future, it's not yet some 'must have' element, given the huge relative performance costs even with the best RT hardware available. And it's really path tracing that's the 'holy grail', and we're nowhere near being able to do that in modern AAA games with mainstream hardware. Even a game like Portal from two generations ago is bringing top-line GPUs from today to their knees using path tracing.
 
I'd have to agree with the person above that, while ray tracing might be the future, it's not yet some 'must have' element, given the huge relative performance costs even with the best RT hardware available. And it's really path tracing that's the 'holy grail', and we're nowhere near being able to do that in modern AAA games with mainstream hardware. Even a game like Portal from two generations ago is bringing top-line GPUs from today to their knees using path tracing.

This sorta proves my point. Path tracing is already here in AAA games. Instead of being amazed that Cyberpunk PT and soon Alan Wake 2 PT even exist, some people prefer to pooh-pooh the progress with “well it’s not perfect so whatevs”, as if anything in graphics is perfect. If perfect means no upscaling and no denoising, well, good luck waiting for that. We’ll all be six feet under before then.

I’m not even sure there are “must have” features when it comes to graphics, as that’s a purely subjective thing. Some people turn off shadows and AA. Other people can’t stand the smallest of artifacts. Lots of people are happy with baked lighting. If AMD were delivering all of the “must have” features that gamers want, they surely would be doing a lot better.
 
The problem with the argument that AMD is being more practical and 'looking at the market reaction' to determine where to place their resources vs. Nvidia, who is trying to strongarm the industry into its AI-generated future, is that we just don't really see the results in products.

By that I mean: if AMD is consciously deciding to focus their architecture on raster performance atm, which still seems to be paramount in the minds of most gamers, so I don't necessarily think that would be a disastrous goal, then they have to deliver on that focus. If Nvidia is supposedly frittering away its transistor budget on tensor cores, then AMD's more conservative, brute-force approach should have resulted in offerings that are significantly superior on a price/performance scale in raster - but... they're usually not, at least not to the point where they really stand out. Routinely, the equivalent AMD offering for most price brackets will deliver a little better raster performance at a little better price. Maybe.

So the problem is, even if you believe Nvidia is too focused on RT/AI, their attention to this area still isn't significantly opening up the door for AMD to walk through with a devastating offering that really shines a spotlight on any huge advantage they have in traditional rendering. There are some cracks, yes - like Nvidia's cache-focused architecture for their midrange negatively impacting 4K performance, where it can fall below the previous gen - but otoh AMD's latest offerings aren't exactly setting new benchmarks in generational improvements either.

Maybe the 7700/7800 are it. At $100 cheaper, with 4 more GB of VRAM and the potential to outpace the 4070 in raster, the 7800 may be getting into 'kinda interesting' territory. Or maybe Nvidia does what they just did with the 16GB 4060 and drops its price by $50, and we're basically back on familiar ground again.
 
This sorta proves my point. Path tracing is already here in AAA games. Instead of being amazed that Cyberpunk PT and soon Alan Wake 2 PT even exist, some people prefer to pooh-pooh the progress with “well it’s not perfect so whatevs”, as if anything in graphics is perfect. If perfect means no upscaling and no denoising, well, good luck waiting for that. We’ll all be six feet under before then.

I’m not even sure there are “must have” features when it comes to graphics, as that’s a purely subjective thing. Some people turn off shadows and AA. Other people can’t stand the smallest of artifacts. Lots of people are happy with baked lighting. If AMD were delivering all of the “must have” features that gamers want, they surely would be doing a lot better.
Path tracing is NOT here in AAA games. lol

Cyberpunk 2077 is a last-gen game, and it requires an incredibly expensive GPU to get decent performance. Something hasn't arrived until it's something an ordinary person can use, and is happy to use, on the regular.

I can see I'm in for an 'I'm willing to argue anything at all' fight here, and that sounds exhausting to me. You're clearly, and quite ironically, here only to fight a specific side.
 
Path tracing is NOT here in AAA games. lol

Cyberpunk 2077 is a last-gen game, and it requires an incredibly expensive GPU to get decent performance. Something hasn't arrived until it's something an ordinary person can use, and is happy to use, on the regular.

I can see I'm in for an 'I'm willing to argue anything at all' fight here, and that sounds exhausting to me. You're clearly, and quite ironically, here only to fight a specific side.

The ability to play AAA games today with PT isn’t an opinion. Everyone has eyes. Again, more downplaying of advanced graphics tech, which is baffling given the forum you’re posting in.

Given your criteria for things that “have arrived” I’m curious to know what’s on your list of those things in modern games.

The problem with the argument that AMD is being more practical and 'looking at the market reaction' to determine where to place their resources vs. Nvidia, who is trying to strongarm the industry into its AI-generated future, is that we just don't really see the results in products.

Yup, if we actually focused on results and not philosophy, these discussions would be a lot more productive.
 
Cyberpunk 2077 is a last-gen game, and it requires an incredibly expensive GPU to get decent performance. Something hasn't arrived until it's something an ordinary person can use, and is happy to use, on the regular.
There's a big difference between when a game was released and what would be considered the game's technical capabilities regarding last/current gen. I really don't think CP2077 would ever be considered a "last gen" game just because it was released on the last gen. CP2077 was considered a graphical and technical showcase well before path tracing was implemented.
 
There's a big difference between when a game was released and what would be considered the game's technical capabilities regarding last/current gen. I really don't think CP2077 would ever be considered a "last gen" game just because it was released on the last gen. CP2077 was considered a graphical and technical showcase well before path tracing was implemented.

I'd describe it as a game with last-gen geometry and current/next-gen lighting (the RT path), which means it can sometimes look amazing (when the low geometry doesn't get in the way) or sometimes look awful (when the RT lighting exposes the low geometry).

Overall I'd put it somewhere between last and current gen, or maybe even current gen if the geometry doesn't bother the viewer with RT on. But then, I've been a proponent of the need for vastly increased geometric complexity in games for over a decade now. I had hopes for tessellation, but that never panned out due to the difficulty of implementation combined with less-than-stellar results when it was actually used in games (Crysis 2, for example).

Nanite might finally give us the geometry that RT-quality lighting desperately (IMO) needs. But then RT hardware needs to get better and significantly faster in order to deal with more complex geometry ... /sigh.

Regards,
SB
 