IHV Business strategies and consumer choice

I've met Scott and I liked him, but those were some serious turd-polishing answers. :(
Some of them just really dismiss the questions, imo.

I do have to agree that I don't care too much about efficiency as long as the hardware is within the same ballpark.

A leak suggests that the 7700XT scores ~17k in 3DMark TimeSpy, and allegedly the 7800XT lands somewhere in the 19k ballpark.
By comparison:

6700XT scores ~12.5k
4060Ti scores ~13.4k
3070 scores ~13.6k
6800 scores ~15k
6800XT scores ~17.2k
4070 scores ~17.7k
3080 scores ~17.8k
3080 Ti scores ~19k
6900XT scores ~20k

So the 12GB $449 7700XT scores around 3.5k points higher than the 4060 Ti (~27% faster), while costing 13% more than the 8GB model, or 10% less than the 16GB model. It really makes the 4060 Ti look even more stupid than it already does.

If the 19k is correct for the 7800XT, we're getting 3080 Ti performance at $499, which is good. It would be $100 cheaper than the 4070 while still beating it slightly. It seems a bit too good to be true, although if RDNA3 had perfect scaling per CU for some reason (11% more CUs for ~11% more performance), the scores would make sense. But scaling is normally not linear.
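For anyone who wants to sanity-check those percentages, here's a quick back-of-the-envelope sketch; the scores are the leaked/approximate figures above, and the prices are the MSRPs implied in this post ($449 7700XT, $399/$499 4060 Ti, $499 7800XT, $599 4070), so treat everything as rough:

```python
# Quick sanity check of the price/performance claims above.
# Scores are the leaked/approximate TimeSpy figures quoted in this post;
# prices are the MSRPs implied here. All of it is rough, not measured data.
scores = {"7700 XT": 17000, "7800 XT": 19000, "4060 Ti": 13400,
          "4070": 17700, "3080 Ti": 19000}
prices = {"7700 XT": 449, "7800 XT": 499, "4060 Ti": 399,  # 8 GB model
          "4070": 599}

def compare(a: str, b: str) -> None:
    """Print how card a stacks up against card b in score and (if known) price."""
    perf = scores[a] / scores[b] - 1
    msg = f"{a} vs {b}: {perf:+.0%} performance"
    if a in prices and b in prices:
        msg += f", {prices[a] / prices[b] - 1:+.0%} price"
    print(msg)

compare("7700 XT", "4060 Ti")   # roughly +27% perf for +13% price (8 GB model)
compare("7800 XT", "4070")      # roughly +7% perf for -17% price
compare("7800 XT", "3080 Ti")   # roughly on par
```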

It seems crazy too, for a 54 CU 7700XT to be able to keep up with the 72 CUs of the 6800XT, or even the 60 CUs of the 6800. The same applies to the 60 CUs of the 7800XT keeping up with the 80 CUs of the 6900XT. After all, the 7900XTX required 96 CUs to beat the 80 CU 6950XT. That's 20% more CUs for 18% more performance. Unless AMD fixed some sort of bug in RDNA3, the scaling portrayed in these leaks is bogus. It would seem weird if the 7800XT were able to even reach the 6800XT at all with ~17% fewer CUs, let alone beat it... AMD may have lied about their performance again, just like with the 7900 cards, and these leaks may be AMD themselves trying to market their cards.
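To put that scaling argument into numbers, here's a rough per-CU sketch using only the CU counts and the approximate scores quoted in this thread; it ignores clocks, memory bandwidth and everything else, so treat it as purely illustrative:

```python
# Per-CU sanity check of the leaked scores, using only the CU counts and
# approximate TimeSpy numbers quoted in this thread. Clocks, bandwidth etc.
# are ignored, so this is illustrative only.
cards = {  # name: (CUs, approx TimeSpy score)
    "6800":    (60, 15000),
    "6800 XT": (72, 17200),
    "6900 XT": (80, 20000),
    "7700 XT": (54, 17000),  # leak
    "7800 XT": (60, 19000),  # leak
}

def score_per_cu(name: str) -> float:
    cus, score = cards[name]
    return score / cus

print(f"7700 XT vs 6800 XT: {score_per_cu('7700 XT') / score_per_cu('6800 XT') - 1:+.0%} per CU")
print(f"7800 XT vs 6900 XT: {score_per_cu('7800 XT') / score_per_cu('6900 XT') - 1:+.0%} per CU")

# For the flagships, the post above only gives ratios: 96 vs 80 CUs (+20%)
# for ~18% more performance, i.e. slightly less work done per CU.
print(f"7900 XTX vs 6950 XT: {1.18 / 1.20 - 1:+.0%} per CU")
# The leaks would require a ~27-32% per-CU uplift for the mid-range parts,
# while the flagship showed roughly none; hence the skepticism.
```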

The problem is that it's AMD. Consumers just continue to buy Nvidia no matter what. AMD would need to have 4080 performance at 4060 Ti prices for anyone to even start to care.
 
The problem is that it's AMD. Consumers just continue to buy Nvidia no matter what. AMD would need to have 4080 performance at 4060 Ti prices for anyone to even start to care.
The problem is not that it's AMD. The problem is that people don't look at AMD provided benchmarks when they make their decisions and they do in fact use ray tracing and DLSS. AMD GPUs won't sell well on perf/price parity in rasterized games.
 
The problem is that it's AMD. Consumers just continue to buy Nvidia no matter what. AMD would need to have 4080 performance at 4060 Ti prices for anyone to even start to care.
Well, this has been discussed many times. AMD's business strategy has been like:
  • create a minimal GPU lineup piloting a high-tech aspect as reusable IP
  • minimize other R&D cost to good-enough levels
  • minimize SW dev/support cost
  • slap a price tag just a bit lower than nV's
  • care a lot about margins, not about the market
It hasn't worked in the past years. Maybe that is fine from AMD's PoV. If it is not, a strategy change should come, right?

Selling 4080 performance at a 4060 Ti price would be an interesting plot given AMD's Radeon position:
  • almost zero brand value - not Yugo levels, but not far off
  • long history of "broken SW" - like it or not, releasing a flagship product with completely broken VR support is bad
  • minimal marketing
  • lacking latest, shiny "frontier features" - DLSSx, RT, noise cancellation, etc.
  • almost no support in productivity apps - no OptiX, CUDA, duh
The lacking SW features alone should make a notable difference in expenses; all those hundreds (thousands?) of nV's SW engineers over multiple years should account for that difference.
 
The problem is that it's AMD. Consumers just continue to buy Nvidia no matter what. AMD would need to have 4080 performance at 4060 Ti prices for anyone to even start to care.
AMD is not going to succeed by keeping the very same playing field that disadvantages them, so they should focus on altering the playing field itself (the benchmarking set) to give themselves the advantage, which means getting reviewers to include more of COD:MW2, Far Cry 6, UE5 titles, and Starfield, and less of Control or Cyberpunk 2077 ...
 
The problem is not that it's AMD. The problem is that people don't look at AMD provided benchmarks when they make their decisions and they do in fact use ray tracing and DLSS. AMD GPUs won't sell well on perf/price parity in rasterized games.
Considering the majority of people on Steam still use the 1650, which doesn't have ray tracing, and the next one being the 3060, which is practically too slow for ray tracing, I don't think RT is as prevalent as you think. The majority tries it but more often than not turns it off, because the performance cost is too high. And if you think that the RTX 3060 can do RT, be reminded that many of AMD's cards have faster RT than the 3060, even when they suck at it.

The fact is that AMD is held to an impossible standard compared to nVidia. Even the driver arguments are still prevalent while their drivers are fine, and in some ways superior to nVidia's. It's simply mind share and brand recognition. Nothing more. It is up to AMD to prove their worth, but, they have likely given up putting that as a priority, since gamers have left them in the dust multiple times for no good reason.

Well, this has been discussed many times. AMD's business strategy has been like:
  • create a minimal GPU lineup piloting a high-tech aspect as reusable IP
  • minimize other R&D cost to good-enough levels
  • minimize SW dev/support cost
  • slap a price tag just a bit lower than nV's
  • care a lot about margins, not about the market
It hasn't worked in the past years. Maybe that is fine from AMD's PoV. If it is not, a strategy change should come, right?

Selling 4080 performance at a 4060 Ti price would be an interesting plot given AMD's Radeon position:
  • almost zero brand value - not Yugo levels, but not far off
  • long history of "broken SW" - like it or not, releasing a flagship product with completely broken VR support is bad
  • minimal marketing
  • lacking latest, shiny "frontier features" - DLSSx, RT, noise cancellation, etc.
  • almost no support in productivity apps - no OptiX, CUDA, duh
The lacking SW features alone should make a notable difference in expenses; all those hundreds (thousands?) of nV's SW engineers over multiple years should account for that difference.
You call it features, but in reality, they are business strategies to lock in gamers. They always have some new feature that ultimately ends up superseded or dead, but is used in the moment to create FOMO. Every generation has some shiny new feature to get gamers excited and convince them that they need it, which in reality does not truly enhance your gameplay experience, but simply costs you money. The more you spend the more you save, right?
We've seen it with PhysX, HairWorks, Ansel... Even the ones that are truly useful like G-Sync, end up being superseded or dead. RT and DLSS are simply their latest bait in their long track record, and people keep falling for it.
 
  • lacking latest, shiny "frontier features" - DLSSx, RT, noise cancellation, etc.
Huh? They've had one gen less with RT. They have noise cancellation (both CPU- and GPU-accelerated versions) and probably a lot of those "etc"s too.
  • almost no support in productivity apps - no OptiX, CUDA, duh
They do have a supposedly easy CUDA translator though, and ProRender includes RT?
 
You call it features, but in reality, they are business strategies to lock in gamers. They always have some new feature that ultimately ends up superseded or dead, but is used in the moment to create FOMO. Every generation has some shiny new feature to get gamers excited and convince them that they need it

Yep, very good business strategy.

...which in reality does not truly enhance your gameplay experience, but simply costs you money.

I assume this is a joke. If it's not a joke it explains a lot about AMD's current position.
 
Considering the majority of people on Steam still use the 1650, which doesn't have ray tracing, and the next one being the 3060, which is practically too slow for ray tracing, I don't think RT is as prevalent as you think. The majority tries it but more often than not turns it off, because the performance cost is too high. And if you think that the RTX 3060 can do RT, be reminded that many of AMD's cards have faster RT than the 3060, even when they suck at it.

The fact is that AMD is held to an impossible standard compared to nVidia. Even the driver arguments are still prevalent while their drivers are fine, and in some ways superior to nVidia's. It's simply mind share and brand recognition. Nothing more. It is up to AMD to prove their worth, but, they have likely given up putting that as a priority, since gamers have left them in the dust multiple times for no good reason.

You call it features, but in reality, they are business strategies to lock in gamers. They always have some new feature that ultimately ends up superseded or dead, but is used in the moment to create FOMO. Every generation has some shiny new feature to get gamers excited and convince them that they need it, which in reality does not truly enhance your gameplay experience, but simply costs you money. The more you spend the more you save, right?
We've seen it with PhysX, HairWorks, Ansel... Even the ones that are truly useful like G-Sync, end up being superseded or dead. RT and DLSS are simply their latest bait in their long track record, and people keep falling for it.
Even if RT is not prevalent, as you say, RT is still felt to be an important "tick on the box", just like HW T&L used to be.

The driver (firmware?) experience is still pretty strange: black screens, fully clocked VRAM with multiple monitors, power issues with video acceleration, VR being broken at the RDNA3 launch. All those issues are significant for X * $100 products, even more so when they repeat gen after gen. This is not a $10 burger but a $500+ piece of electronics we are talking about.

Well, that business strategy actually pushes the technology boundaries. It innovates. It brings new stuff to real-time GFX. It entertains. Does it improve the gameplay? Probably not, but nearly every AAA game is praised for "great looks", "impressive graphics", etc. So the non-gameplay aspects clearly do get praised.
 
The problem is not that it's AMD. The problem is that people don't look at AMD provided benchmarks when they make their decisions and they do in fact use ray tracing and DLSS. AMD GPUs won't sell well on perf/price parity in rasterized games.

In the end, any purchase decision is limited by people's budgets. Budgets which are shrinking because of the economic situation and political agendas (limiting access to resources for the dirty masses) beyond the current decade.

Personally I see no long-term future for current PC gaming (high-end PC + GPU with 1000+ watt PSUs) for exactly these reasons, but I don't see why AMD couldn't attack NV with significantly better price/performance products in the mid term. They just need the right momentum. With Intel/AMD we've seen how quickly good products can change perception.

A high-end APU above console performance, sold as a desktop product, might be the best long-term strategy.
 
reviewers to include more of COD:MW2, Far Cry 6, UE5 titles, and Starfield, and less of Control or Cyberpunk 2077 ...
LOL, in other words cheat and ignore DirectX features and games with advanced graphics to focus on pure rasterization, which is quickly becoming out of date.

This strategy can only backfire on AMD and do more harm than good, as gamers will buy a $1000 7900XTX and expect it to play everything with all the bells and whistles, then discover that it sucks at Path Tracing, VR, heavy Ray Tracing, Upscaling, Pro Rendering, Latency Reduction, etc. This will damage the brand to no end.

Considering the majority of people on Steam still use the 1650,
The majority of people on Steam actually have the RTX 3060; you need to combine the laptop 3060 numbers with the desktop 3060 numbers, as they are listed separately for the 3060 model, as opposed to other models.
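To illustrate the point about the survey listing (the share numbers below are made-up placeholders, not actual Steam survey figures):

```python
# Illustrative only: hypothetical survey shares, NOT real Steam numbers.
# The point is that the survey lists laptop and desktop RTX 3060 separately,
# so the two entries have to be summed before comparing with the GTX 1650.
survey_share = {
    "GTX 1650": 5.0,             # hypothetical %
    "RTX 3060 (desktop)": 4.5,   # hypothetical %
    "RTX 3060 (laptop)": 3.5,    # hypothetical %
}
rtx_3060 = survey_share["RTX 3060 (desktop)"] + survey_share["RTX 3060 (laptop)"]
print(f"RTX 3060 combined: {rtx_3060:.1f}%  vs  GTX 1650: {survey_share['GTX 1650']:.1f}%")
```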


Even the driver arguments are still prevalent while their drivers are fine
The driver argument still stands: at launch RDNA3 had terrible VR performance, worse than RDNA2 (and it still does, by the way), all because of drivers. RDNA3 also has terrible idle power consumption when connected to multiple displays, again because of drivers. The AMD rep talked about this yesterday and said the driver solution will be wonky. In several games the 7900XTX continues to have terrible performance compared to Ada because of driver issues; games like Forza Motorsport 7, Detroit, PUBG and Final Fantasy Remake, and even Forza Horizon 5, had problems that only got fixed last month.


You call it features, but in reality, they are business strategies to lock in gamers
And hardcore PC gamers are locked in because these features add to their PC experience; it elevates it above consoles. AMD, on the other hand, is happy serving PC gamers carbon copies of console versions with no added value except "moar frames". That's why AMD will remain the budget brand, while NVIDIA will remain the premium brand. Just compare Starfield vs Alan Wake 2: AMD added nothing to Starfield (not even FSR3 is supported there), while NVIDIA added Path Tracing, DLSS3.5, Frame Generation and Reflex. Even for esports shooters NVIDIA gives Reflex, while AMD gives nothing. PC gamers feel NVIDIA is the premium brand because they offer their best tech to make PC games something more: you play with advanced graphics that satisfy your itch.

We've seen it with PhysX, HairWorks
With no replacement in sight. Physics continues to suck in modern gaming, and hair rendering has barely improved. Where is their industry-wide replacement? Where is the AMD solution? Where is the console solution? All dead too. So the fact that those are dead is not a point against them in the first place. The progress of the industry itself is dying; without the influence of NVIDIA and others, certain aspects will die without even a competent replacement in sight.

Even the ones that are truly useful like G-Sync
G-Sync is alive and kicking; it has broad industry-wide support and a successful tiering system (G-Sync Compatible, G-Sync and G-Sync Ultimate).

RT and DLSS
RT is an industry standard that works through DirectX and Vulkan, and it is well supported on consoles. The days of equating RT with PhysX are over; they were never the same thing to begin with.

As for DLSS, it achieved the widest support of any tech out there (350 games), it's the best-quality upscaler in a world that has embraced upscaling wholeheartedly, its quality is improving each year even beyond native resolution (DLSS3.5), it offers the best AA solutions as well (DLAA, DLDSR), and it managed to achieve Frame Generation. So no, it's not the same thing as PhysX (which never managed more than 30 titles); DLSS operates on a different level entirely.
 
LOL, in other words cheat and ignore DirectX features and games with advanced graphics to focus on pure rasterization, which is quickly becoming out of date.

This strategy can only backfire on AMD and do more harm than good, as gamers will buy a $1000 7900XTX and expect it to play everything with all the bells and whistles, then discover that it sucks at Path Tracing, VR, heavy Ray Tracing, Upscaling, Pro Rendering, Latency Reduction, etc. This will damage the brand to no end.
Benchmarks are the only thing that truly matters in the end, so AMD taking the approach of consistently crowding out games with RT-centric technology for several more years is very much a viable strategy ...

Can you really keep making that claim in good faith, that non-RT performance isn't going to matter, after what we've recently seen with Starfield or especially the more recent UE5 titles? (not to mention the likes of Horizon Forbidden West and FFXVI, both of which will almost assuredly be included in benchmarks when they see a PC release)
 
Well, this has been discussed many times. AMD's business strategy has been like:
  • create a minimal GPU lineup piloting a high-tech aspect as reusable IP
  • minimize other R&D cost to good-enough levels
  • minimize SW dev/support cost
  • slap a price tag just a bit lower than nV's
  • care a lot about margins, not about the market
It hasn't worked in the past years. Maybe that is fine from AMD's PoV. If it is not, a strategy change should come, right?

Selling 4080 performance at a 4060 Ti price would be an interesting plot given AMD's Radeon position:
  • almost zero brand value - not Yugo levels, but not far off
  • long history of "broken SW" - like it or not, releasing a flagship product with completely broken VR support is bad
  • minimal marketing
  • lacking latest, shiny "frontier features" - DLSSx, RT, noise cancellation, etc.
  • almost no support in productivity apps - no OptiX, CUDA, duh
The lacking SW features alone should make a notable difference in expenses; all those hundreds (thousands?) of nV's SW engineers over multiple years should account for that difference.
I think a lot of this is from pre-Ryzen times when the company was dying. The graphics side just doesn't make enough money, so the funding it gets is limited.

To be perfectly honest, they need to increase ray tracing performance. Their regular rasterization performance is great, but as more titles move to ray tracing the performance issues are becoming more apparent.

I believe they really need to leverage the console market. I don't understand how it is possible that games like Ratchet & Clank, which were made for Zen 2/RDNA2, ship broken on AMD hardware on the PC. AMD should be creating optimization teams that go to as many developers as possible and implement optimizations for AMD hardware.

I am also really praying that the rumors of RDNA 3.5 and 4 being cancelled are because they are focusing completely on a whole new architecture that addresses ray tracing performance in a major way. But who knows.
 
The fact is that AMD is held to an impossible standard compared to nVidia.
So Nvidia is impossible? Okay then.

The general discourse here is that people will somehow buy Nvidia h/w no matter what, despite there already being lots of examples which prove this wrong, from Intel vs Ryzen to even GeForce vs R300. People will buy good h/w; the brand is very much irrelevant.
 
Can you really keep making that claim in good faith, that non-RT performance isn't going to matter
Can you in good faith buy an expensive high end GPU only to play high end games with miserable performance?

Both the 4080 and 7900XTX deliver good performance in rasterized games; you don't lose much by going with either. However, only the 4080 can play heavily ray traced and path traced games, while also delivering considerably better denoising/upscaling quality.
 
Can you in good faith buy an expensive high end GPU only to play high end games with miserable performance?
A benchmark is a benchmark regardless of whether or not it's using your ideal graphical features ...
Both the 4080 and 7900XTX deliver good performance in rasterized games; you don't lose much by going with either. However, only the 4080 can play heavily ray traced and path traced games, while also delivering considerably better denoising/upscaling quality.
Both the 4080 and the 7900XTX were nearly on par in rasterized performance, but the gap is now clearly growing wider in favour of the latter in recent releases, and given how few and far between RT games were this year, expect the gap to grow ever wider with the coming of AC Mirage, COD:MW3, and Lords of the Fallen, all just after Starfield ...

It is the responsibility of review outlets to provide a fair assessment of how other AAA games will perform on different hardware, not just RT performance, because consumers deserve to know better and be equipped with this information, whether you like it or not ...
 
given how few and far between RT games were this year
What? Returnal, Star Wars Jedi: Survivor, Hogwarts Legacy, Dead Space Remake, Forspoken, Deliver Us Mars, Elden Ring, Ratchet & Clank Rift Apart, The Lord of the Rings: Gollum, F1 2023, Layers of Fear, The Outlast Trials, Resident Evil 4, A Plague Tale: Requiem, Armored Core VI, Exoprimal, DESORDRE, Portal Prelude RTX.

Upcoming there will be Alan Wake 2, Avatar Pandora, Forza Motorsport, Cyberpunk 2077: Phantom Liberty, Immortals of Aveum and Diablo 4. And who knows what else? This certainly doesn't fit the definition of few and far between.

but the gap is now clearly growing wider in favour of the latter in recent releases
Recent? What is that supposed to mean? What is your definition of recent? Now? A few months ago? This year? It's a big undefined mystery. If you are trying to be fair, then at least start from the beginning of the year; if you do that, the pattern you claim disappears.

Neutral Games:
Fort Solis (UE5)
Armored Core VI
Atlas Fallen
The Last of Us Part 1
Jagged Alliance
Star Wars Jedi: Survivor
Street Fighter 6
The Outer Worlds Spacer's Choice Edition
Wo Long: Fallen Dynasty
Company of Heroes 3

2023 Games that prefer NVIDIA:
Tempest Rising
Ratchet and Clank
Baldur's Gate 3
Portal RTX Prelude
Trepang2
Satisfactory (UE5)
Hogwarts Legacy
Diablo 4
Alone in the Dark Prologue
Miasma Chronicles
The Lord of The Rings: Gollum
System Shock
The Outlast Trials
Layers of Fear (UE5)
F1 23
Age of Wonders 4
Sherlock Holmes The Awakened
EVERSPACE 2
Atomic Heart
Forspoken
Dead Space
Deliver Us Mars
Returnal
DESORDRE (UE5)

2023 Games that prefer AMD:
Remnant 2
Aliens Dark Descent
Amnesia The Bunker
Dead Island 2
Resident Evil 4
Immortals of Aveum
Starfield
 
Benchmarks are the only thing that truly matters in the end, so AMD taking the approach of consistently crowding out games with RT-centric technology for several more years is very much a viable strategy ...
You mean the strategy that put them in the ~10% AIB market share ballpark? Not a single objective person would call it a success...
 
With no replacement in sight. Physics continues to suck in modern gaming, and hair rendering has barely improved. Where is their industry-wide replacement? Where is the AMD solution? Where is the console solution? All dead too. So the fact that those are dead is not a point against them in the first place. The progress of the industry itself is dying; without the influence of NVIDIA and others, certain aspects will die without even a competent replacement in sight.
Control uses CPU-driven PhysX and it looks great and runs fast. Apart from extreme examples such as Teardown, I know hardly any game with physics and destruction that is comparable to Control. Too bad not many more titles use PhysX.
 
What? Returnal, Star Wars Jedi: Survivor, Hogwarts Legacy, Dead Space Remake, Forspoken, Deliver Us Mars, Elden Ring, Ratchet & Clank Rift Apart, The Lord of the Rings: Gollum, F1 2023, Layers of Fear, The Outlast Trials, Resident Evil 4, A Plague Tale: Requiem, Armored Core VI, Exoprimal, DESORDRE, Portal Prelude RTX.

Upcoming there will be Alan Wake 2, Avatar Pandora, Forza Motorsport, Cyberpunk 2077: Phantom Liberty, Immortals of Aveum and Diablo 4. And who knows what else? This certainly doesn't fit the definition of few and far between.
Do you keep cheating with your listings? Either you count early access releases in their initial year or in the year of their full release, but NEVER EVER include both Deliver Us Mars and The Outlast Trials simultaneously, for consistency's sake, because now everyone knows that you're just being dishonest ...

Both Elden Ring and A Plague Tale: Requiem were released last year, so they're irrelevant examples in the context of the current year. And are you really going to count ACVI when the feature is gated behind some avatar customization mode that no reviewer would consider benchmark-worthy?

Your reach is really showing when you list a mod on top of another mod, a non-standalone expansion (Phantom Liberty isn't even a separate game), and a Lord of the Rings game that no reviewer will ever touch with a 10ft pole ...

Also do you see the irony in listing several UE4 titles when there's a non-trivial probability of developers avoiding HWRT while using Nanite on UE5 ?
 