Has consumer pressure ruined gaming for the next decade? *spawn

Likewise, as Shifty Geezer pointed out, if they had plastered this on the box itself, sure, those potential customers would have been pre-excluded. Bear in mind, the DRM would also exclude people who have broadband internet but don't like the idea of the console being mandatory-online (e.g. people taking their console to a summer house for two weeks with no connection, who would otherwise have 50 weeks of internet access, would opt for the PS4). Xbox would be limiting its target population for no valid reason.

Yes it "pre-excludes" people but it's not a publisher decision at that point, they know what the market is and they can't directly change it. This changes the publishers decision making process when deciding what features can be shipped.

The sticker on the game box is also less attractive to publishers because it leads to support issues when people don't see it and buy the game anyway.
 
How do you relate these business choices to cross-platform titles? If XB1 had maintained its always-on requirement, and publishers were looking at supporting one always-on console, one non-always-on console, and PC, how would they deal with the excluded market, or the compromises? And how is that any different now that XB1 is in exactly the same boat as PS4?

Seems to me it makes no difference. The only people affected are makers of platform exclusives, who can still target the cloud because MS wants to promote it as a differentiator. For everyone else, either they'd avoid the cloud because PS4+PC doesn't have it, making XB1's always-online an 'inconvenience' for those offline games, or they'd make their games cloud-dependent anyway (The Division) and put a requirement icon on the packaging.
 

The difference between PS4 and XB1 would certainly limit its use by 3rd parties doing cross-platform titles, though I think you'd still see some use it, even if only as the checkbox platform-exclusive feature.

And as I've said before, if I were MS I'd be happy to sell Azure time for PS4 or PC titles if publishers wanted to pay for it; on PC I'd even provide a compatible API. But that's a strategic decision, which very much depends on what you view the real long-term platform to be.

In the end it is what it is. I was more irritated at not being able to install a disc and throw it away.
 
I wouldn't say that. The title Below, by an independent developer (although in this case Microsoft may have decided to publish it in order to get around the publisher requirement for Xbox One games), is going to be cloud-based.

Made by Capybara Games (http://www.capybaragames.com/).

I'm personally interested to see what they do with it, and it being a roguelike makes me even more interested.

Regards,
SB

My money is on Capy using this cloud in ways indistinguishable from what we'd consider dedicated servers, multiplayer connectivity, or plain account storage.

Funny thing is, I think there are still plenty, and I mean plenty, of experiences that will never need an online connection waiting to be made. "Connectivity" and "cloud" are buzzwords; they have real-world benefits, but those benefits are not suddenly intrinsic to creating good games.

Just look at some of the big hits this last generation: you'll see just as many games that succeeded purely on their single-player, cloud-less incarnations as ones that succeeded on multiplayer.

The past decades of gaming have shown this, and they've also shown the dangers of pushing too much connectivity in the wrong ways (Diablo 3).

So all that talk from the OP about how the gaming community gets ruined, just because some mythical gaming-tech utopia isn't achieved by the MS technocracy, is silly. Games will be made regardless, and this generation will be a big transition toward the full switch to online-only.

I think the switch to digital-only is coming, but a 5-10 year transition period probably works better. And if that's still an issue, there's always the PC, and there are always the console online stores to download from.
 

What's the point of comparing it to games from a generation where the technology couldn't be fully embraced, because a publisher couldn't confidently know that all of its target market could use it?

Look at multiplayer games before Xbox basically made online play standard with Xbox Live Gold. Look at the state of video games before 3D acceleration became relatively standard. This is most easily seen in PC gaming when 3D acceleration was relatively new: good luck finding many games that were 3D accelerated until quite a few years after the first proficient consumer-level 3D accelerators were released (the 3dfx Voodoo Graphics along with the Rendition Verite). It was still relatively rare when Nvidia finally jumped in with a more standard 3D accelerator in the Riva series.

Look at the adoption of 256-color graphics in PC games before the majority of PCs contained 256-color hardware. Want to get one of those fancy games greenlit by a publisher or developer? Good luck trying to make that happen.

Of course, there's little evidence of things changing much before there's a critical mass of machines/users that can make use of them. Just look at how long it's taking for 64-bit games, as well as DX11-native games, to take off on PC, despite a good 64-bit OS and the first DX11 cards both having been released back in 2009.

Basically, until it is a standard, universal or nearly universal technology or piece of hardware, it will not see enough adoption to drive gaming forward.

Hell, look at how environmental audio flourished when Creative Labs made EAX 1.0 and 2.0 open-license, which led to wide adoption even on motherboards with onboard audio. Then look what happened to it when they made EAX 3.0-5.0 closed-license, meaning publishers could no longer rely on nearly all PC users having access to EAX.

The consoles have an advantage over PCs in moving game-changing technology forward, in that whenever there is a new generation, a console manufacturer can guarantee publishers that ALL of its consoles will support X features. That gets technology adopted relatively quickly, assuming it is available on ALL machines. That does not happen with optional things... ever. At least not in the history of console gaming.

If it isn't available on ALL machines from that console maker, then good luck getting games greenlit to use it: light-gun games, Move games, PS Eye games, etc. Kinect support in games would eventually have died off as well if MS hadn't made it standard for every single Xbox One. If they had made it optional, as Kinect 1.0 was, I can virtually guarantee that it would have failed miserably this generation due to lack of publisher support, it being an optional accessory, despite being technologically quite impressive and allowing for gameplay that would not exist without it.

Hopefully things change and we'll see widespread adoption of and experimentation with online compute, but I'm with Joker in thinking that by making it optional on all consoles, we've stuck our heads in the sand and relegated most of the wonderful possibilities to the generation of consoles after this one. Assuming there is a generation of consoles after this one.

Regards,
SB
 
No, I'm saying that now that it's optional, it will take far longer for its potential to be exploited, just like anything else that's optional. History has shown this many times.

They don't exist because it takes time. C'mon now, you know this, as I presume you are a coder. New ideas don't get coded overnight, so why are people suddenly expecting revolutionary ideas before the damn device is even being sold to the market? It takes years of joint worldwide effort for the killer ideas to hit, and it also takes standardized features. Now it will take much longer, because the connection can't always be counted on to be there, hence it will be treated like an optional item, just like multiplayer was treated on the PS2.
Sure, for third-party titles I'd expect a low adoption rate, but from MS I'd expect a few tech demos at the very least to entice both developers and consumers. However, there are none. Like I said, they designed the thing to be always-online from the beginning, so they should have worked on that from the beginning too. If that's the case and they have nothing to show for it, it seems to me like it's just vaporware.

The multiplayer space has moved forward at a glacially slow pace because it's been optional. I guess people are forgetting how primitive it was in the PS2 era. It limped along that entire generation because it was optional, before it finally took off in this generation. That's a really long time! Look at the PC space as well: they had the internet available forever, yet how many years did it take for someone to finally take a chance on an MMO using that optional internet connection? And that's on heavily connected PCs! Now imagine how that affects consoles.
Yeah, but multiplayer games, even multiplayer-only games, have been shown to be quite profitable, so I don't see why publishers would be against the idea of using the cloud extensively. I'm not talking about lots of games, just one. It seems more plausible to me that there's simply no good use for it just yet.
 
Sure, for third-party titles I'd expect a low adoption rate, but from MS I'd expect a few tech demos at the very least to entice both developers and consumers. However, there are none. Like I said, they designed the thing to be always-online from the beginning, so they should have worked on that from the beginning too. If that's the case and they have nothing to show for it, it seems to me like it's just vaporware.
There was the 300,000 asteroid orbits tech demo? Versus 40,000 possible on the console alone. Supposedly tracked at "~500,000 updates per second".

There hasn't been a shortage of suggestions from various developers.
 
...
Look at multiplayer games before Xbox basically made online play standard with Xbox Live Gold. Look at the state of video games before 3D acceleration became relatively standard. This is most easily seen in PC gaming when 3D acceleration was relatively new: good luck finding many games that were 3D accelerated until quite a few years after the first proficient consumer-level 3D accelerators were released (the 3dfx Voodoo Graphics along with the Rendition Verite). It was still relatively rare when Nvidia finally jumped in with a more standard 3D accelerator in the Riva series.

Look at the adoption of 256-color graphics in PC games before the majority of PCs contained 256-color hardware. Want to get one of those fancy games greenlit by a publisher or developer? Good luck trying to make that happen.

Of course, there's little evidence of things changing much before there's a critical mass of machines/users that can make use of them...

This is all true, but there is one factor you missed out: it wasn't the creation of the standards on their own that drove mass adoption. It was the technology itself reaching a mature enough standard to be effective and available to the mass market. And then it was developers like id who started the ball rolling with regard to how that tech was pushed. The platform holders did nothing but implement it, and in some cases extend it.

It's just the same now, the only difference being that a platform holder is simultaneously creating the tech, implementing the tech, and trying to mass-market the tech. All of which could have transpired smoothly had they not proved themselves the most ham-fisted of handlers of such an idea. Whether it would have worked is another matter.

When it's all mature, ubiquitous on every device, and second nature, then it will all make sense. But it's not just a question of the tech; it's also a question of the very infrastructure to run it. Imagine marketing your 3D card to a market where only 4 in 10 PCs were capable of running it 80% of the time.
 
Sure, for third-party titles I'd expect a low adoption rate, but from MS I'd expect a few tech demos at the very least to entice both developers and consumers. However, there are none. Like I said, they designed the thing to be always-online from the beginning, so they should have worked on that from the beginning too. If that's the case and they have nothing to show for it, it seems to me like it's just vaporware.

For 3rd parties, personally I'd expect them to keep it all secret, so I wouldn't expect them to reveal the next big thing even if they found it. They are better off shipping their game with it first and then doing a post-mortem where they can explain the coolness of their tech and take full credit for it. That seems more ideal than a 3rd party saying "Hey competition, we just discovered this cool new idea with the cloud, here it is, take it and ship a game with it before we do!"

For 1st party, I imagine Microsoft was preoccupied with getting the tools and APIs all sorted to let everyone get up to speed quickly. They probably have basic tech demos available, but the onus is on them to get the 3rd-party coders coding as easily and quickly as possible. It's like with the 360 before it came out: they didn't really show us anything revolutionary ahead of time on the console as far as algorithms or tech demos go, but they did blow us away with their toolset, which let us hit the ground running on day 1. Titanfall presumably served the dual purpose of being both their high-profile exclusive and helping them design the cloud API/tools along the way, but I really wouldn't expect much from the first batch of games.


Yeah, but multiplayer games, even multiplayer-only games, have been shown to be quite profitable, so I don't see why publishers would be against the idea of using the cloud extensively. I'm not talking about lots of games, just one. It seems more plausible to me that there's simply no good use for it just yet.

Multiplayer, though, hasn't changed much in many years. It's basically become a drop-in feature now for the most part. In other words, you don't have long, extensive meetings on multiplayer code anymore; there are design aspects, but code-side the basics of multiplayer are the same as they have been for a long time, not super complicated. Counter this with cloud-compute-related work, which is literally all new: there's nothing to fall back on, you have to rethink everything as far as how to engineer it, and you even have to figure out what the heck can be moved to the cloud to begin with. It's an all-new frontier, which translates to mucho man-hours required to get something out of it, likely with lots of failed ideas at first. Someone has to sign off on those programmer man-hours, and that's where it, alas, gets complicated once you are dealing with an aspect of the console that is no longer standard.
 
See, even there I don't agree. I think far too many people are thinking of the cloud in terms of milliseconds of latency, which it doesn't need to have to be useful. Heck, I'd go as far as making 24-hour-latency cloud ideas an interview question, to be perfectly honest. People need to disconnect themselves from treating the cloud as part of the real-time render pipeline; there are far more interesting ideas to be conjured up with minutes or even hours of latency. This is why you need everyone on board getting their minds cracking on the problem.

It's like when CDs first came on the gaming scene: most early ideas were about full-motion video, because a vast storage medium for videos was all people saw CDs as to begin with. Same with the cloud: everyone keeps thinking "OK, how does it help bring down the ~16 ms of a render frame?" To which I keep saying no, no, no, don't think of it that way! People need to jump out of the box and think far more broadly than that. The 24-hour internet check was perfectly adequate to get the cloud creative juices flowing, and as I've also argued before, it's what I would suggest all developers start with, i.e. very heavy-latency ideas measured in minutes or even hours.
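
To make the distinction concrete, here is a minimal sketch of that heavy-latency pattern, with std::async standing in for an actual cloud RPC and a hypothetical EvolveEconomy job as the offloaded work (this reflects no real console API, just the idea):

```cpp
#include <chrono>
#include <cstdio>
#include <future>
#include <thread>

struct EconomySnapshot { double avgPrice; };

// A slow world-simulation job we are happy to wait minutes or hours for.
// (Hypothetical placeholder for whatever a game would actually offload.)
EconomySnapshot EvolveEconomy(EconomySnapshot in) {
    std::this_thread::sleep_for(std::chrono::seconds(2)); // pretend this is hours
    return EconomySnapshot{ in.avgPrice * 1.03 };
}

int main() {
    EconomySnapshot world{100.0};

    // Fire the job off; std::async stands in for the cloud call here.
    auto pending = std::async(std::launch::async, EvolveEconomy, world);

    // The game loop never blocks on the result...
    for (int frame = 0; frame < 300; ++frame) {
        std::this_thread::sleep_for(std::chrono::milliseconds(16)); // render, physics, etc.

        // ...it just folds the answer in whenever it finally lands.
        if (pending.valid() &&
            pending.wait_for(std::chrono::seconds(0)) == std::future_status::ready) {
            world = pending.get(); // get() invalidates the future, so this runs once
            std::printf("frame %d: economy updated, avg price %.2f\n",
                        frame, world.avgPrice);
        }
    }
    return 0;
}
```

The point being that nothing in the loop cares whether the answer takes two seconds or two hours; the result gets merged into the world whenever it shows up.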

We haven't really gained anything from CDs/Blu-rays other than FMV and better sound/voice acting.
 
Compared to what? Champions of Norrath was 9 GB on PS2, on one single dual-layer DVD, mostly due to a type of megatexture technology creating nice, varied levels (much better, imho, than even Diablo 3). That wouldn't really have been possible on a Nintendo cartridge; heck, it wasn't possible even on the 360 on one disc. ;)

Even the choice of DVD for the 360 has held back some games, though hard drives have solved that issue at least to some extent. A surprising number of 360 games, even first-party exclusives, come on two discs, the content of one of which is then typically installed to the HDD.

Seriously, there's probably more than a DVD's worth of textures in just one of the later Uncharted games, so I'm not sure what you mean with your comment.
 

Yes, we have more storage, so we can put more stuff (textures, sounds and movies) in the games. It is not like you have to think about that; it is completely obvious.
 
We haven't really gained anything from CDs/Blu-rays other than FMV and better sound/voice acting.
...and better textures and models and more variety.

Yes, we have more storage, so we can put more stuff (textures, sounds and movies) in the games. It is not like you have to think about that; it is completely obvious.
:???: No more obvious than FMV and better sound/voice acting.
 
There was the 300,000 asteroid orbits tech demo? Versus 40,000 possible on the console alone. Supposedly tracked at "~500,000 updates per second".

There hasn't been a shortage of suggestions from various developers.

I must have missed it. Do you have a link?

For 3rd parties, personally I'd expect them to keep it all secret, so I wouldn't expect them to reveal the next big thing even if they found it. They are better off shipping their game with it first and then doing a post-mortem where they can explain the coolness of their tech and take full credit for it. That seems more ideal than a 3rd party saying "Hey competition, we just discovered this cool new idea with the cloud, here it is, take it and ship a game with it before we do!"
I don't know; just look at Ubi showing Watch_Dogs and The Division way ahead of time. Other companies would have copied them already. But the thing is, in order to do so they have to gauge consumer response, mostly in the form of sales. No matter how cool a feature is, devs/publishers wouldn't jump in without some hard data first, so I don't think that's actually the case here.

For 1st party, I imagine Microsoft was preoccupied with getting the tools and APIs all sorted to let everyone get up to speed quickly. They probably have basic tech demos available, but the onus is on them to get the 3rd-party coders coding as easily and quickly as possible. It's like with the 360 before it came out: they didn't really show us anything revolutionary ahead of time on the console as far as algorithms or tech demos go, but they did blow us away with their toolset, which let us hit the ground running on day 1. Titanfall presumably served the dual purpose of being both their high-profile exclusive and helping them design the cloud API/tools along the way, but I really wouldn't expect much from the first batch of games.
And the problem with that is that all they ended up with was a lot of PR talk and nothing to show off.

Multiplayer, though, hasn't changed much in many years. It's basically become a drop-in feature now for the most part. In other words, you don't have long, extensive meetings on multiplayer code anymore; there are design aspects, but code-side the basics of multiplayer are the same as they have been for a long time, not super complicated. Counter this with cloud-compute-related work, which is literally all new: there's nothing to fall back on, you have to rethink everything as far as how to engineer it, and you even have to figure out what the heck can be moved to the cloud to begin with. It's an all-new frontier, which translates to mucho man-hours required to get something out of it, likely with lots of failed ideas at first. Someone has to sign off on those programmer man-hours, and that's where it, alas, gets complicated once you are dealing with an aspect of the console that is no longer standard.

Yes, but I don't think MP has been stagnant due to the lack of the cloud, just developers and publishers playing it safe to make more money.
 
I must have missed it. Do you have a link?
It was discussed in the cloud augmentation thread. MS showed a demo of 40,000 spinning pink dots running on the local machine, and then 300,000 pink dots when they connected to the cloud, simulating NASA asteroid data. Apart from a couple of large figures, there were no details, and nothing could be learned from the demo.
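
No details of the split were ever published, but since orbital mechanics is about as latency-tolerant as a workload gets, a plausible, purely assumed reconstruction might look like the sketch below. kLocalBudget and kTotal just echo the demo's headline figures, and FetchCloudSnapshot is a hypothetical stub:

```cpp
#include <cstddef>
#include <vector>

struct Asteroid { double pos[3]; double vel[3]; };

constexpr std::size_t kLocalBudget = 40000;   // what the console integrated on its own
constexpr std::size_t kTotal       = 300000;  // what the demo showed with the cloud

// Hypothetical stub: the remaining positions would presumably stream down
// from Azure, computed server-side at their own, much slower cadence.
void FetchCloudSnapshot(std::vector<Asteroid>& remote) {
    // ...overwrite remote[i].pos with whatever snapshot last arrived...
}

void Tick(std::vector<Asteroid>& local, std::vector<Asteroid>& remote, double dt) {
    // Latency-sensitive half: integrate the local budget every frame.
    for (auto& a : local)
        for (int i = 0; i < 3; ++i)
            a.pos[i] += a.vel[i] * dt;

    // Latency-tolerant half: orbits drift slowly, so positions that are
    // seconds (or minutes) stale still look correct on screen.
    FetchCloudSnapshot(remote);
}

int main() {
    std::vector<Asteroid> local(kLocalBudget);
    std::vector<Asteroid> remote(kTotal - kLocalBudget);
    Tick(local, remote, 1.0 / 30.0);
}
```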
 
I don't know; just look at Ubi showing Watch_Dogs and The Division way ahead of time. Other companies would have copied them already. But the thing is, in order to do so they have to gauge consumer response, mostly in the form of sales. No matter how cool a feature is, devs/publishers wouldn't jump in without some hard data first, so I don't think that's actually the case here.

They showed new graphical features; they have to, because that's 99% of the reason for people to buy into the next gen, since the new consoles don't offer much more than that, and because their competition is showing off their graphics engines as well, so they have to compete alongside them. Gameplay-wise there wasn't a whole lot new in either Watch Dogs or The Division; they are both reminiscent of games past. In any case, I don't think anyone has yet figured out a good use for cloud compute, nor would I expect them to for quite some time, so in this case I doubt there's anything yet to show. You'd think the first step is just getting new games ready for the console launch; that alone is a daunting/expensive enough task as it is, let alone having to deal with new stuff like the cloud. Once that's done, they can take the time to figure out the new stuff, bat around new ideas, etc. Or at least I hope they do with cloud compute; now that it's optional, I'm not so sure.


And the problem with that is that all they ended up with was a lot of PR talk and nothing to show off.

Historically, how often is something new introduced along with revolutionary ideas on day one? I'm trying to think back, and nothing comes to mind. Think of video cards with hardware assist: do you remember the very first ones that came out, what seems like an eternity ago, and how they were PR'd to people? At first it was stuff like "look, you can drag your windows around at full speed in non-outline mode and it's all smooth." Wow! Not exactly earth-shattering... but like anything else, it would take time for people to figure out what to do with this new-fangled invention called the video card with hardware assist. The good stuff usually came years later, when someone finally figured out that really cool thing to do with that really cool invention.

Even with the internet itself, which I first used back in 1990 or thereabouts, it wasn't earth-shattering at first, but you could see the potential. Websites, etc. all came much later to the general public. Regarding cloud compute, I think people are being unfair here expecting earth-shattering new things on day one; give the developers a few years with it and then let's revisit and see where it's at.


Yes, but I don't think MP has been stagnant due to the lack of the cloud, just developers and publishers playing it safe to make more money.

Sure, I don't disagree there, but I was replying more to this comment you made:

L. Scofield said:
Yeah, but multiplayer games, even multiplayer-only games, have been shown to be quite profitable, so I don't see why publishers would be against the idea of using the cloud extensively.

...so, as to the reason multiplayer is still in games even though internet connections aren't guaranteed: code-wise it's become an almost drop-in affair, so there's little reason not to offer multiplayer. Using optional cloud extensively, though, is a different matter, because it's far from a drop-in thing.
 
They showed new graphical features; they have to, because that's 99% of the reason for people to buy into the next gen, since the new consoles don't offer much more than that...
That's silly talk. They show graphics because that's what most gamers (certainly those watching E3) want. It's games that have pushed the development of GPUs, of DirectX, and of new rendering techniques, because people like pretty stuff! If all the power of the next-gen consoles was spent on fancy world simulation and unique gameplay experiences with PS2-level visuals, everyone would switch off. The reason no one showed an amazing god game, or a new abstract puzzle game, is because the market doesn't support those games, so no one's going to waste time making them (at least in publishers' eyes). There isn't a half-dozen developers out there mumbling to themselves, "Damn, I'd really love to make a new Populous-type game; it'd sell tens of millions, but these poky little consoles aren't up to the job and the cloud can't be relied upon. I'll go make a shooter instead."

Gameplay-wise there wasn't a whole lot new in either Watch Dogs or The Division; they are both reminiscent of games past.
What gameplay changes can the cloud provide that local compute resources can't? We've had discussions in the past about things like dynamic game physics and emergent play and why they don't feature in games, and a considerable part of the reason is game-design issues (what if you cause an avalanche that blocks off a route necessary to advance the story?).

If you had started a thread a year ago on why games keep showing the same old things, you'd have got a lot of replies covering a lot of ground, none of which would have been 'lack of cloud'. One need only point to the variety achieved on the 16-bit computers to see it's not processing power that's holding originality back.
 
What gameplay changes can the cloud provide that local compute resources can't? We've had discussions in the past about things like dynamic game physics and emergent play and why they don't feature in games, and a considerable part of the reason is game-design issues (what if you cause an avalanche that blocks off a route necessary to advance the story?).

It isn't about what the cloud can do that a local machine can't. It's about the limited resources that a game programmed to run only on a local machine has to deal with.

You have a finite amount of resources with which to do everything in a game. Hence you see things like advanced sound algorithms getting virtually no resources dedicated to them, and non-combat AI, and even to an extent combat AI, getting so few resources dedicated to it.

Potentially, with the cloud you now have X amount of extra resources that are completely and totally unavailable on a local machine. Sure, the local machine could still do those things, but only at the expense of reducing the resources used for other things. Perhaps it's only 10% more computational power. Or 20%, or 30%. Whatever it is, it is still more than what you have with the local machine alone.

That goes way above and beyond what exists in most online-enabled games currently. It represents a potential paradigm shift in how you code a game, and in what things you can now enable that you couldn't before due to being limited to the resources available on the local machine.

The local machine is always going to be best at rendering graphics and immediate, latency-sensitive physics. So the more stuff you can move off the machine, the more resources you can dedicate locally to those things. If you move X% of your latency-insensitive calculations online, that's potentially X% more local resources that could be used for better physics, for example.

There's also the potential to draw in relevant real-time data from the cloud, or online resources, or whatever you want to call them, which an always-online connection would allow: things like having your game reflect real-world time, day, weather, traffic, etc., or having a living, constantly changing and updating world.

Regards,
SB
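
As a sketch of that last idea: the polling can live on a background thread at a minutes-long cadence while the game thread simply reads whatever arrived last. QueryRealWorldWeather here is a hypothetical stand-in for some real data feed, not an actual service API:

```cpp
#include <atomic>
#include <chrono>
#include <cstdio>
#include <thread>

enum class Weather { Clear, Rain, Snow };

std::atomic<Weather> g_liveWeather{Weather::Clear};
std::atomic<bool>    g_running{true};

// Hypothetical stand-in for an HTTP call to a real-world data feed.
Weather QueryRealWorldWeather() {
    return Weather::Rain;
}

// Real-world conditions change slowly, so a 10-minute poll is plenty.
void WeatherPoller() {
    while (g_running.load()) {
        g_liveWeather.store(QueryRealWorldWeather());
        std::this_thread::sleep_for(std::chrono::minutes(10));
    }
}

int main() {
    std::thread poller(WeatherPoller);

    // Game loop: drive the sky, wet roads, NPC behaviour, etc. off the
    // latest fetched value; never block on the network.
    if (g_liveWeather.load() == Weather::Rain)
        std::puts("spawning puddles");

    g_running.store(false);
    poller.detach(); // sketch only; a real game would signal and join on shutdown
    return 0;
}
```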
 