Will gaming move 100% to the cloud? *spawn

The latency issue isn't as insurmountable as people make it out to be. Keep in mind that, for the console market, it's only in the last few years that there's been somewhat mainstream messaging around how much latency TV post-processing adds, and more awareness/marketing around "game modes" to reduce it. Yet the average buyer wasn't put off by that latency or even aware of it.

But the framing of these types of questions is always too extreme. It's not going to be a zero-adoption or 100%-adoption scenario, at least not in a reasonable time frame. The better question is whether the console and equivalent market moves to streaming only for the majority, with local hardware simply being "mobile"-class hardware handling input/output, in the reasonable future.

Also, on the demographics question, the reality is that it's not about where the "people" are but where the "money" is. Apple's entire business strategy is the prime example of this: they aren't trying to cater to everyone but to people who are able and willing to spend. If the spending market is mostly based around people in developed urban areas, and those customers can handle streaming, that is what the market will cater to.
 

I think there's another hurdle, beyond just 'high latency', to overcome for a good streaming experience: the variable latency of internet packets. I know that my internet connection has enough bandwidth for streaming, and on a good day it probably has good enough latency to stream a game. But I would 100% not trust it to stream a game that required low-latency interactions at a high frequency.
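To make the variable-latency point concrete, here is a minimal sketch (the RTT samples are invented for illustration, not measurements): a connection can look fine on average while occasional spikes blow well past a 60 fps frame budget.

```python
# Rough sketch: average latency vs. jitter for game streaming.
# The RTT samples below are invented for illustration, not measurements.
from statistics import mean, quantiles

rtt_ms = [22, 24, 21, 23, 25, 22, 80, 23, 24, 21, 95, 22, 23, 26, 24, 22]

frame_budget_ms = 1000 / 60          # ~16.7 ms per frame at 60 fps
avg = mean(rtt_ms)
p99 = quantiles(rtt_ms, n=100)[98]   # 99th percentile
late = sum(r > 2 * frame_budget_ms for r in rtt_ms)

print(f"average RTT: {avg:.1f} ms")            # looks fine
print(f"99th percentile RTT: {p99:.1f} ms")    # the spikes are what you actually feel
print(f"samples over two frame budgets: {late}/{len(rtt_ms)}")
```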
 
The better question is whether the console and equivalent market moves to streaming only for the majority, with local hardware simply being "mobile"-class hardware handling input/output, in the reasonable future.
If the majority end up streaming games, wouldn't the console makers reconsider the economics of complex home consoles? Once the market swings significantly towards cloud gaming, I can see consoles becoming niche items at a high cost.
 
I think there's another hurdle, beyond just 'high latency', to overcome for a good streaming experience: the variable latency of internet packets. I know that my internet connection has enough bandwidth for streaming, and on a good day it probably has good enough latency to stream a game. But I would 100% not trust it to stream a game that required low-latency interactions at a high frequency.

We can say the same issues existed with the transition from fixed lines to wireless, yet a large part, if not most, of the market seems content with the consistency trade-offs.

It should be kept in mind that the audiences aren't really in the middle of the spectrum. I can certainly see "enthusiast" demographics not preferring a move to streaming only (I certainly don't), but it's the masses that will move the market. Local gaming may end up being low end only (mobile and casual), while high-end local gaming becomes a more niche and expensive hobby.

If the majority end up streaming games, wouldn't the console makers reconsider the economics of complex home consoles? Once the market swings significantly towards cloud gaming, I can see consoles becoming niche items at a high cost.

By that I was referring to the customer base, essentially the "middle" of gaming. Not necessarily strictly console gamers; some PC gamers would be in here as well. Low-end casual gaming, which has seen its growth on mobile devices, would likely remain local.

What I think would likely happen, if that scenario were to occur, is that current console hardware would effectively disappear. Any "consoles" that existed would be more along the lines of the "Steam Box" concept. Essentially, local high-fidelity gaming would be an expensive niche on both the hardware and the software side (at least for upfront-cost games; a bit more on this later). The latter is likely what would be more concerning to the people on here, since I'm guessing most are already acclimated to the idea of "expensive" hardware via PC gaming (which can also serve other functions/hobbies).

Local gaming would also be "low fidelity" on mobile devices.

The one other factor here is the growth of competitive gaming and esports. I can see this almost becoming a divergent market should the trajectory continue. It will not target expensive hardware on the local side (relatively speaking). The software model will likely slant entirely towards a service model, so at least upfront costs won't increase.
 
It depends on the game, of course, since a lot of esport-style games target mid-range hardware, but most competitive and professional esports gamers absolutely only game on high-end, expensive hardware. This is even truer for esports when you take monitors and peripherals into account, where many normal PC gamers will accept mid-range quality and refresh rates.
 
I'm firmly in the Cloud camp on this question. I wrote an article about this as a gaming journalist for SegaWeb during the Dreamcast era and I'm still of this mindset.

The main thing I think people forget in these discussions is that, even during peak hours, only 10% of console users are gaming at any one time. So, theoretically, you only need 30 million blades to service 300 million console gamers (which is about the actual size of the console market). That is a tremendous saving in hardware. Assuming an average hardware price of $300 over a generation (and this is rising), that's $9 billion per generation instead of the $90 billion needed to service the same community locally. Not to mention that you don't have to start your user base all over again: just improve the blades and migrate everyone over to the new hardware, perhaps with a premium subscription and so forth.

What are the caveats? All the things people have listed, like latency, locality, etc., but $81+ billion saved per generation is a lot of money with which to flesh out the network issues. Not to mention that with cloud gaming you could easily see the market expanding to 1 billion or more players, which will feed the beast even more.
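A quick back-of-the-envelope version of that math, taking the post's own assumptions (10% peak concurrency, $300 average hardware cost, 300 million console gamers) at face value:

```python
# Back-of-the-envelope for the blade-vs-console argument above.
# All inputs are the post's assumptions, not sourced figures, and this ignores
# datacenter/networking overhead and the fact that a blade likely costs more than a console.
console_gamers     = 300_000_000   # rough size of the console market
peak_concurrency   = 0.10          # share of users gaming at the same time
hardware_unit_cost = 300           # assumed average cost per console/blade over a generation

blades_needed = int(console_gamers * peak_concurrency)
cloud_hw_cost = blades_needed * hardware_unit_cost
local_hw_cost = console_gamers * hardware_unit_cost

print(f"blades needed:  {blades_needed:,}")                            # 30,000,000
print(f"cloud hardware: ${cloud_hw_cost / 1e9:.0f}B per generation")   # $9B
print(f"local hardware: ${local_hw_cost / 1e9:.0f}B per generation")   # $90B
print(f"difference:     ${(local_hw_cost - cloud_hw_cost) / 1e9:.0f}B")  # $81B
```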

All of this also helps in a world where process nodes aren't shrinking and you need large, supercooled, high-frequency blade farms to service everything, and where there are chip shortages, since 90% of the chips won't be needed. Even more if you solve the networking issues well enough that people in Asia at 6 pm can use hardware that is idle in NA at 6 am.

Also, think of situations where you are on an 8K TV and your neighbours are still on 1080p. Why should the hardware be the same? One blade can run multiple neighbours' gaming sessions, while yours might use a full blade. Scalability becomes more possible.
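As a rough illustration of that scalability point, here is a toy model that assumes, purely hypothetically, that a session's cost scales with the pixels it pushes, so a blade saturated by one 8K session could instead host many 1080p sessions:

```python
# Toy model of packing sessions of different output resolutions onto blades.
# Cost is assumed to scale with pixel count relative to 8K; real scaling won't be
# this linear, the numbers are only illustrative.
PIXELS_8K = 7680 * 4320

def blade_fraction(width: int, height: int) -> float:
    """Fraction of one blade a session needs, assuming cost scales with pixels pushed."""
    return (width * height) / PIXELS_8K

sessions = [(7680, 4320)] + [(1920, 1080)] * 12   # one 8K user, twelve 1080p neighbours
load = sum(blade_fraction(w, h) for w, h in sessions)

print(f"one 1080p session: {blade_fraction(1920, 1080):.3f} of a blade")  # ~0.06
print(f"total load: {load:.2f} blades for {len(sessions)} players")       # ~1.75 blades
```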

My guess is that it will take another 10-15 years to sort this out, but even the naysayers here will see the deal they can get with a GamePass dumb terminal (with phone and tablet versions as well) running 8K games for $25 per month, with monthly AAA releases and no upfront hardware cost, and fold like a house of cards. :)
 
Most people don’t have the means to do that even if they wanted to. Do you think there would be a large enough market for developers to target that?

I don't agree. I've been involved in PC gaming since the mid-80s. We have gone from big horizontal cases, to big vertical, to small machines for LANs, back to big vertical, and back again. There are plenty of gamers out there who will invest in larger machines to get the best experience, just as there are today. The thing is, today, within 18-24 months there is something much faster available at the same size as what you have.

If we hit a wall and the only way to increase power is to get bigger, then of course people will still buy it and devs will still target it. We've already experimented with this in the past.

The first Voodoo was an add-on, 3D-only board. Then they integrated 2D and 3D into one chip, but then they went back to multi-chip with, what was it, the Voodoo 5? At one point Nvidia and ATI had quad SLI/CrossFire. I remember the hotness when I was young was buying a motherboard with two sockets for a dual-processor setup.

It will all come back if that is the only way to increase power.

More to the point, all of that existed alongside process shrinks. It would be a lot easier if, say, the GeForce 8800 GTX were the last video card they made: over the course of years you could just buy 4 of them, or 8, or 16. Same with CPUs: it would be easy over the years to go from two Threadripper CPUs to four.
 
Name recognition of latency != perception of latency. Plenty of lay people don't know what latency is but will still feel it and say "this console sucks". I think cloud gaming will continue to grow for quite a while, but it will only see massive adoption competitive with home consoles if it's so much more profitable that quality becomes irrelevant.
 
I wonder if a thin client could use ML upscaling or some such to fix up a low(er)-bandwidth video feed, with potential AI gaming assists on the server end to accommodate irregular packets?
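As a toy illustration of that thin-client idea: the sketch below uses a nearest-neighbour upscale in plain NumPy as a stand-in where a real client would do video decoding and run a learned upscaler; everything here is a placeholder.

```python
# Toy stand-in for client-side upscaling of a low-bandwidth stream.
# A real thin client would decode a video stream and run an ML upscaler; here we
# just nearest-neighbour upscale a fake 480x270 frame to 1920x1080 with NumPy.
import numpy as np

def upscale_nearest(frame: np.ndarray, factor: int) -> np.ndarray:
    """Repeat pixels along height and width; placeholder for an ML upscaler."""
    return frame.repeat(factor, axis=0).repeat(factor, axis=1)

low_res = np.random.randint(0, 256, size=(270, 480, 3), dtype=np.uint8)  # fake received frame
high_res = upscale_nearest(low_res, 4)                                    # 1080x1920 output

print(low_res.shape, "->", high_res.shape)
```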
 
It depends on the game, of course, since a lot of esport-style games target mid-range hardware, but most competitive and professional esports gamers absolutely only game on high-end, expensive hardware. This is even truer for esports when you take monitors and peripherals into account, where many normal PC gamers will accept mid-range quality and refresh rates.

Esports games will do what they mostly do now: they'll target, and be runnable on, even low-end local hardware. They will, however, have high scalability, as they aren't designed so much to scale up visual fidelity with hardware as to scale up performance to be more competitive.

I don't agree. I've been involved in PC gaming since the mid-80s. We have gone from big horizontal cases, to big vertical, to small machines for LANs, back to big vertical, and back again. There are plenty of gamers out there who will invest in larger machines to get the best experience, just as there are today. The thing is, today, within 18-24 months there is something much faster available at the same size as what you have.

If we hit a wall and the only way to increase power is to get bigger, then of course people will still buy it and devs will still target it. We've already experimented with this in the past.

But how many people would? We already see this in the existing market: there are diminishing returns in how much extra people are willing to pay, otherwise everyone would just buy top-of-the-line PC systems and there would be no mid-range or lower PC systems, or consoles.

And if the customer base is small, why would developers target them? Unless, of course, there is more money to be made because that customer base is willing to spend. This has always been an interesting facet, especially with respect to the PC market: the content creators don't actually make any money from hardware sales. Unless, with the modern DLC business model, those with higher-end hardware were also willing to pay for higher-quality graphics on the software side. However, that concept seems to be taboo on the customer side, which I've always found interesting: some people who buy high-end PC hardware seem to expect those who don't benefit from said sales to cater to them more because they spent more on that hardware.
 

People are spending a shit ton on graphics cards right now. Cards that were $600 or $700 are going for $1500, and have been for the last year. I would say there is a huge market for it, actually.
 
The main thing I think people forget in these discussions is that, even during peak hours, only 10% of console users are gaming at any one time. So, theoretically, you only need 30 million blades to service 300 million console gamers (which is about the actual size of the console market). That is a tremendous saving in hardware. Assuming an average hardware price of $300 over a generation (and this is rising), that's $9 billion per generation instead of the $90 billion needed to service the same community locally. Not to mention that you don't have to start your user base all over again: just improve the blades and migrate everyone over to the new hardware, perhaps with a premium subscription and so forth.

What are the caveats? All the things people have listed, like latency, locality, etc., but $81+ billion saved per generation is a lot of money with which to flesh out the network issues. Not to mention that with cloud gaming you could easily see the market expanding to 1 billion or more players, which will feed the beast even more.

What happens during big game releases, however? Even now, when a new MMO or online game launches, servers crash. So you always have to overbuild rather than underbuild.

Also, the cost is all on the company running the cloud service, whereas the cost is shifted to the end user when you sell them a console.
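A tiny sketch of why launch spikes force over-building; the concurrency figures are invented purely for illustration:

```python
# Toy capacity planning: steady-state vs. launch-day concurrency.
# All concurrency figures are invented for illustration.
subscribers        = 300_000_000
normal_concurrency = 0.10   # typical peak share of users online at once
launch_concurrency = 0.25   # hypothetical spike when a huge game drops

steady_blades = int(subscribers * normal_concurrency)
launch_blades = int(subscribers * launch_concurrency)

print(f"steady-state: {steady_blades:,} blades")
print(f"launch spike: {launch_blades:,} blades "
      f"({launch_blades - steady_blades:,} extra sitting mostly idle the rest of the year)")
```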
 
People are spending a shit ton on graphics cards right now. Cards that were $600 or $700 are going for $1500, and have been for the last year. I would say there is a huge market for it, actually.
But it's about volume, not expenditure.

The discussion about cloud isn't about whether it will 100% take over local hardware for gaming. It's about whether all games going forward will be released on cloud, and that doesn't mean they won't be released on local hardware as well. There will be exceptions to the rule as always; perhaps there will be hardware exclusives and perhaps there will be cloud exclusives. But on the whole, I don't see the industry moving away from bringing their games to cloud.

[Image: Newzoo Games Market Revenues 2020]

I think the trend towards streaming is clear, with mobile currently the largest customer base worldwide; it already generates the majority of gaming industry revenue. I don't think publishing companies really care that gamers are willing to spend $2500 on a 3090; in the end, whether they have a 3090 or not, buying a single copy of Battlefield 2042 is the same profit to them. So I'm not sure why this argument is about whether or not cloud can be a viable alternative for pursuing profits, to the point that nearly 100% of games are also released to cloud.

I've yet to see a single reason brought forward that should convince a publisher not to move their games to target the 2B+ mobile devices out there, reaching a large variety of markets that they cannot deliver hardware to. How much does it cost to port to cloud? Probably not that much.
 
Stadia is a real-world example of cloud gaming with full-price games. That didn't exactly go over well. It hasn't been a very successful platform either.

What would be the major factor that makes cloud gaming suddenly viable for full-priced games? Because MS or Sony are providing the service? A lot more games available on the platform? Maybe.

Note that I'm discounting Game Pass here; I'm specifically talking about first parties and big publishers wanting to continue selling $70+ games.
 
I think Google failed because of its model and its general lack of understanding of the gaming market, and this, imo, is where MS or Sony can shine. They have established relationships with customers, publishers, and developers. They have extensive networks, and they have the muscle to market, support, develop, and promote titles.

Google could have done so much better if it had a service like Game Pass or PS Now: just pay for a service that gives you access to a large catalog of titles you can play daily. The typical mobile player isn't going to pay a $70 AAA price for a streaming experience that is locked to a single platform (when they are used to not paying for gaming at all). No one likes gaming so much that they'll buy each game both for local hardware and for streaming.

Google was just too early for its time, imo. The all-streaming-only future is still a ways out; there has to be a transition period. But I should be clear: cloud is not a slam dunk. It's an inevitable future, but that doesn't mean it's a 100% guaranteed profit for everyone who attempts it.

We've had a lot of failed consoles and CPU and GPU vendors, leading to the current landscape of the remaining big three / big two.
 
People are spending a shit ton on graphics cards right now. Cards that were $600 or $700 are going for $1500, and have been for the last year. I would say there is a huge market for it, actually.

It's not about whether or not anyone is willing to spend but about the relative size of each audience.

We've also already seen what happened with the shift to multi-platform game design and how the "target" focus for games became the consoles, with PC "hardware" gamers, especially the higher-end buyers, lamenting that they are not the target anymore compared to, say, over a decade ago.

Not to mention the other factors contributing to the current spending on PC hardware, especially GPUs. There is a segment of buyers currently very willing to build "large" PCs multiple graphics cards wide, but it certainly isn't for gaming.

But it's about volume, not expenditure.
I think the trend towards streaming is clear, with mobile currently the largest customer base worldwide; it already generates the majority of gaming industry revenue. I don't think publishing companies really care that gamers are willing to spend $2500 on a 3090; in the end, whether they have a 3090 or not, buying a single copy of Battlefield 2042 is the same profit to them.

What can change that is if the people willing to spend more on hardware are also willing to do so on software. This has always been an interesting aspect of PC gaming: how buyers (at least some) are willing to spend extra on hardware yet expect the software side (which makes nothing extra from said hardware) to cater to them more.
 

The original premise that started this thread was that Moore's law stopped, or at least fell off a cliff. In my opinion, if that happens, then people will end up with the best technology in their home; they will just devote more space to it. I would certainly have a closet dedicated to a high-end gaming rig, or at least a fridge-sized system.

If the only way to increase graphics and performance is to go bigger, with more video cards and more CPUs, instead of continuing to shrink, well, the cloud blades will have to adopt the same approach, wouldn't they? Which means developers will target that too.
 