Vendor lockout of GPUs? *spawn*

I did not follow this discussion, but obviously the answer is: after iGPUs become mainstream, because discrete GPUs will have become too expensive in comparison.
I really see no other option for moving on anyway. The FPU merged into the CPU too, and nobody complains anymore.

Integration is the most sensible way to capture more market share. However, iGPUs are subject to the same realities of semiconductor manufacturing as dGPUs. There’s no end in sight for demand for better graphics which means faster hardware. So unless the software side also stagnates, it will be a long time before iGPUs are good enough for the mainstream. On the other hand, integrated FPUs very quickly became “good enough”.
 
There’s no end in sight for demand for better graphics which means faster hardware.
Yes, but besides that, it's also about economics, which is more important in the end.
So I believe the conclusion among game devs will be something like: we want further growth, and we even need it to compensate for increasing production costs. Thus we have to settle on a HW standard which is affordable and attractive to the masses.
Just my personal crystal ball.
On the other hand, integrated FPUs very quickly became “good enough”.
Consoles are good enough as well. On PC we only need to solve the VRAM bandwidth problem, either the same way consoles do, or by skipping over that and integrating memory with the CPU, as the M1 shows.

Personally, I don't see a big downgrade at all. With unified memory we can do fine-grained work distribution across CPU and GPU, and pick the better option without data-movement showstoppers.
Additionally, we have to research and use more efficient algorithms. GPU power won't guarantee progress for free anymore. We have to work harder, which aligns with the trend toward fewer engines, so overall the costs remain the same.
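To illustrate the fine-grained work-distribution idea: with unified memory a scheduler can pick CPU or GPU per task purely on estimated compute time, since neither choice requires a copy. This is only a minimal sketch; the `Task` type, the throughput constants, and the parallelism threshold are all made-up illustrative numbers, not any real API or measured figures.

```python
from dataclasses import dataclass

@dataclass
class Task:
    flops: float        # arithmetic work in the task
    parallelism: int    # independent work items the task exposes

# Assumed (hypothetical) sustained throughputs and GPU occupancy floor.
CPU_FLOPS = 5e10
GPU_FLOPS = 1e12
GPU_MIN_PARALLELISM = 1024  # below this the GPU runs mostly idle

def pick_device(task: Task) -> str:
    """Choose the device with the lower estimated runtime.

    With unified memory there is no transfer term in the cost model.
    With discrete VRAM we would have to add copy time to the GPU cost,
    which often flips the decision for small tasks.
    """
    cpu_time = task.flops / CPU_FLOPS
    if task.parallelism < GPU_MIN_PARALLELISM:
        return "cpu"  # too little parallelism to fill GPU lanes
    gpu_time = task.flops / GPU_FLOPS
    return "gpu" if gpu_time < cpu_time else "cpu"

print(pick_device(Task(flops=1e8, parallelism=64)))        # -> cpu
print(pick_device(Task(flops=1e12, parallelism=1 << 20)))  # -> gpu
```

The point of the sketch is that once the copy term disappears from the cost model, even small tasks can profitably move between devices, which is exactly the "no data movement showstoppers" argument.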
Just saying, I don't really think the majority of end users cares that much about top-notch gfx anymore. The industry seems to target a broader audience, so with this growth the weight of expectations on high-end visuals decreases.
The changes in game gfx in recent years were seemingly about two things: 1. cartoony gfx (Fortnite, or mature / family / girly / retro games); 2. RT on the high end. The former seems to have more impact, outside our geeky interests.

So I don't think we have a choice if we target just one mainstream.
Fragmenting the market into 'high end core gamers' and 'occasional low end gamers' might be nice too. But then maybe we can't serve all markets with the same games anymore. This could give us better games, but it's only possible if production costs can be reduced to compensate for the smaller markets.
 
There’s no end in sight for demand for better graphics which means faster hardware.

Are we 100% sure about this? Is your point "we need more graphics cards in the market because of the current shortage" or "we're going to need faster consumer graphics cards because software keeps getting held back by hardware performance"?
I keep seeing mention of inter-generational diminishing returns in actual gaming discussion subs, forums, and websites. DF themselves seem to be complaining about the lack of a "next-gen feel" in the latest videogame trailer reels.
The latest God of War Ragnarok trailer seems to have spun off a lot of discussions on whether or not videogames can look that much better than they do now on hardware that is economically viable to a critical mass of gamers, considering how much more expensive the newer fab nodes are, as well as the packaging for 3D stacking.
 
Consoles are good enough as well.

The standing of consoles in the market hasn't changed for decades. They've always been good enough, but that hasn't quenched the hunger for better graphics.

Additionally, we have to research and use more efficient algorithms. GPU power won't guarantee progress for free anymore. We have to work harder, which aligns with the trend toward fewer engines, so overall the costs remain the same.

All of those things also benefit powerful GPUs. There will be innovation on the hardware side as well. GPU chiplets will open up a whole new range of performance and power consumption.

Just saying, I don't really think the majority of end users cares that much about top-notch gfx anymore.

What makes you say that? The reaction to the recent Halo reveal tells a different story.
 
Are we 100% sure about this? Is your point "we need more graphics cards in the market because of the current shortage" or "we're going to need faster consumer graphics cards because software keeps getting held back by hardware performance"?

Neither. We need faster consumer graphics but current software isn't being held back. Software also needs to catch up.

I keep seeing mention of inter-generational diminishing returns in actual gaming discussion subs, forums, and websites. DF themselves seem to be complaining about the lack of a "next-gen feel" in the latest videogame trailer reels.
The latest God of War Ragnarok trailer seems to have spun off a lot of discussions on whether or not videogames can look that much better than they do now on hardware that is economically viable to a critical mass of gamers, considering how much more expensive the newer fab nodes are, as well as the packaging for 3D stacking.

How many of those games are actually doing next gen stuff? A lot of the "next gen" console games are just now catching up to features we've had on PC for years. So we've seen it all before.
 
Source, and when was this precedent established?
While filing their suit against Intel, NVIDIA also made formal complaints to the FTC, who was already building a case against Intel for actions against AMD. The FTC included some of their complaints in their own suit, and when that was settled last year NVIDIA received some protections against potential Intel actions. For all practical purposes, Intel is barred from making technical decisions that lock out 3rd party GPUs from their platforms for the next several years, enforced by requiring that they continue to offer PCI-Express connectivity, while at the same time barring Intel from making changes that would reduce GPU performance unless those changes specifically improve CPU performance.
https://www.anandtech.com/show/4122/intel-settles-with-nvidia-more-money-fewer-problems-no-x86

In other words NVIDIA is immune to any tactic Intel might use to lock them out.

For a "weak position", $1.5B was enough to make Nvidia fold all of their disputes and share their graphics IP ...
Once more, NVIDIA couldn't care less about the dying chipset market; they were already in the process of winding down the business. NVIDIA also gained access to CPU patents to build their ARM CPU.

I suggest you read a little about the matter and its history, because you seem to be unaware of both; you also seem to be unaware of the delicate legal and patent balances between hardware companies.
 
Intel doesn't have the leverage for this these days. Maybe if it were during the Bulldozer era, when they had >90% of the gaming CPU market. They are already losing market share in pretty much every sector, so they can't be making moves that would accelerate that. I do expect some Intel CPU + GPU integration feature eventually, but there would be way too much risk for Intel in trying to block other GPUs from their systems.

Anyway, looking at things from the legal perspective isn't going to sway corporations; legally there would be some repercussions, but Intel could just write those off. They made way more money off their antitrust practices than the eventual fines cost them. If they had the leverage for it, they'd be wasting potential equity by not pursuing some kind of vendor lock.
 
While filing their suit against Intel, NVIDIA also made formal complaints to the FTC, who was already building a case against Intel for actions against AMD. The FTC included some of their complaints in their own suit, and when that was settled last year NVIDIA received some protections against potential Intel actions. For all practical purposes, Intel is barred from making technical decisions that lock out 3rd party GPUs from their platforms for the next several years, enforced by requiring that they continue to offer PCI-Express connectivity, while at the same time barring Intel from making changes that would reduce GPU performance unless those changes specifically improve CPU performance.
https://www.anandtech.com/show/4122/intel-settles-with-nvidia-more-money-fewer-problems-no-x86

In other words NVIDIA is immune to any tactic Intel might use to lock them out.

Basically the opposite of what you stated, and nothing we didn't know before ...

It is, if you do it by blocking access to your platform. Intel will be fined and forced to give access to others, either free of charge or through a fee. It happened once and it will happen again.

The courts didn't determine that Intel was guilty of violating antitrust laws in this instance, so consequently they weren't even fined for it, and in that very same case you referenced, Intel still blocked out Nvidia by paying $1.5B to them ...

Once more, NVIDIA couldn't care less about the dying chipset market; they were already in the process of winding down the business. NVIDIA also gained access to CPU patents to build their ARM CPU.

Well, it's good to know that we've already established that Intel can and will block others out of a market, and Nvidia exploring ARM is an even bigger reason why Intel should proceed to block them once more. Maybe if Nvidia stopped all development of their own ARM cores, Intel could still offer compatibility, but otherwise they aren't interested in a competitor building their own platform just to subvert the dominance of x86 ...
 
One day, GPUs could become socketed as well, just like CPUs are today, which currently prevents interop from happening ...
 
Bundling has been around forever, though, and its impact is already factored into current market dynamics. And of course bundles only work if customers feel like they’re getting decent value. If Arc GPUs are any good, then Intel will certainly gain a lot of OEM market share. But only if they’re any good. And they won’t need to artificially fragment the market to do that.

I don't agree it's that straightforward, and this may be a matter of perspective.

Remember back to the AMD and Intel lawsuit, with respect to how Intel was applying OEM pressure to keep out AMD CPUs? Inherently, bundling in this sense would be doing the same thing. You could technically say neither Intel nor AMD (both can do this) is telling/incentivizing OEMs (or whomever) not to purchase Nvidia GPUs. They're just offering greater discounts if you use their GPUs instead. Glass half full/empty perspective, basically.

Again, this is an example of an advantage more on the "softer" side with respect to having vertical integration. Maybe it's just me, but I'm not seeing how having both the CPU and GPU sides is not an advantage. Yes, if the GPU-only vendor's offering is the clear winner then that can override any benefits, but what if it's only slightly better? You can offset that via the CPU side.

Also, circling back a bit, I want to say this shouldn't only be a concern now with Intel entering the market, as AMD shouldn't be overlooked either. This is why I never bought the notion that a "balanced" market had AMD with a near 50/50 share in either the CPU or GPU space. That would've given them enormous leverage in the x86 CPU/GPU segments, as opposed to being "balanced."
 
I don't agree it's that straightforward, and this may be a matter of perspective.

Remember back to the AMD and Intel lawsuit, with respect to how Intel was applying OEM pressure to keep out AMD CPUs? Inherently, bundling in this sense would be doing the same thing. You could technically say neither Intel nor AMD (both can do this) is telling/incentivizing OEMs (or whomever) not to purchase Nvidia GPUs. They're just offering greater discounts if you use their GPUs instead. Glass half full/empty perspective, basically.

Again, this is an example of an advantage more on the "softer" side with respect to having vertical integration. Maybe it's just me, but I'm not seeing how having both the CPU and GPU sides is not an advantage. Yes, if the GPU-only vendor's offering is the clear winner then that can override any benefits, but what if it's only slightly better? You can offset that via the CPU side.

Also, circling back a bit, I want to say this shouldn't only be a concern now with Intel entering the market, as AMD shouldn't be overlooked either. This is why I never bought the notion that a "balanced" market had AMD with a near 50/50 share in either the CPU or GPU space. That would've given them enormous leverage in the x86 CPU/GPU segments, as opposed to being "balanced."

I think we’re saying the same thing. Intel can make significant inroads without needing to create some Intel specific hardware interface.
 
The courts didn't determine that Intel was guilty
The FTC forced Intel to keep offering PCI Express for several years following their own investigation and NVIDIA's complaints; they didn't even need a court ruling to do that. It's simple antitrust law, to which you seem oblivious.

Well, it's good to know that we've already established that Intel can and will block others out of a market
No, we didn't establish that; Intel can't block any competitor unless that competitor surrenders willingly.
 