Middle Generation Console Upgrade Discussion [Scorpio, 4Pro]

Hmm
The Blu-ray Disc format originally launched with 25GB single-layer and 50GB dual-layer capacities, later adding 100GB and 128GB discs with the BDXL format in June 2010.
Contrasting

The initial 4K Blu-ray specification allows for three capacities: 50GB single-layer, 66GB dual-layer, and 100GB triple-layer, with 82Mbit/s, 108Mbit/s, and 128Mbit/s data read speeds, respectively.


Not much information on the Xbox One S drive, except that it's a Philips DG-6M5S.

Can't figure out the speed. Not sure how relevant it is.

 
I am sure they can compress at least some of that stuff and remove some of the unused assets.
They do! Everything's compressed (except meshes, I guess) - it loads faster and you can fit more in RAM. Unused assets should only be a small overhead. Who's going to spend time and money making 20 GBs of assets only to use half of them in the game? The leftovers are just the rejects: things they couldn't fit in time or cut for artistic reasons.
 
Games grow to the room available. It's always been this way.

With less storage available, games rely on a combination of both storing less (obviously) and engineering solutions to work around problems - though this can consume resources.

Chiptune vs pre-recorded audio; FMV vs in-engine cutscenes; one instance in storage vs many to improve access patterns; building up large textures from smaller textures and decals vs "megatexture"; physics-based animation vs huge amounts of pre-canned animation... there are lots of tradeoffs that can impact storage. Compressing an entire game package down vs allowing playback after a partial install would probably have an impact too - at least in terms of distribution size.

Only thing you can say for sure is that for a given set of techniques, more storage allows for more (or higher quality) content.
 
In addition to that though, note that the 'more storage' options are still compressed. Chip tunes versus mp3s; cutscenes versus Bink/h.264 video; smaller compressed textures versus compressed megatextures.
 
FMV vs in-engine cutscenes

Interestingly enough FMV has a higher storage cost than in-engine cutscenes, unless the assets used in the cutscenes do not exist anywhere else in the game (almost unheard of).

In-engine cutscenes evolved after FMV as a method to reduce not only cost (good CG is expensive) but game storage requirements as well, despite game sizes growing. For the first decade or two they offered far lower presentation quality than full FMV, but in recent years they have reached a somewhat acceptable quality in comparison.

Regards,
SB
 
In addition to that though, note that the 'more storage' options are still compressed. Chip tunes versus mp3s; cutscenes versus Bink/h.264 video; smaller compressed textures versus compressed megatextures.

Yeah, compression makes sense pretty much all the time. Consoles are so fast now that the cost of decompressing is much lower than the cost of waiting longer to transfer data into memory. I suspect that even meshes are compressed or packed now.

Even back in 1999 Shenmue was decompressing data on the fly on the CPU as it loaded assets into memory. Was one of the causes of slowdown in some areas, iirc.

Interestingly enough FMV has a higher storage cost than in-engine cutscenes, unless the assets used in the cutscenes do not exist anywhere else in the game (almost unheard of).

In-engine cutscenes evolved after FMV as a method to reduce not only cost (good CG is expensive) but game storage requirements as well, despite game sizes growing. For the first decade or two they offered far lower presentation quality than full FMV, but in recent years they have reached a somewhat acceptable quality in comparison.

I was using in-engine as an example of where you can save on storage. Even though engine cutscenes can chew through a lot of data these days, you'll probably still save a lot over 4K video.

I think of in-engine scenes as a continuation of where sprite-based games were back in the 1980s, with text, sprite animation (often repurposed and re-used from elsewhere) and sound effects used to convey the story. The same story could basically be got across in a few kilobytes that could otherwise take up tens of megabytes on CD.

I've always preferred in-engine due to the way it maintains visual consistency, though CGI animators have fewer limitations on what they can do.
 
All this still doesn't explain the day-1 20GB patches that don't add any assets and are just bug fixes.

That's going to come down to how patches are distributed and how game files are installed.

There are two typical methods of patching:
  1. Download files and overwrite existing files.
  2. Download a patch file. Check version and integrity of existing files. Patch the file to be correct.
Method 2 is generally going to result in a far smaller patch download, at the expense of computational resources and potentially a longer time investment - unless the file downloaded in method 1 is large, in which case method 2 can win on time as well.

Method 1 can be quick if game files are stored uncompressed or as many small individual files. However, many games store assets in large wad files; if method 1 is used, the entire wad file must be downloaded even if the fix or added content is only a few MB in size.

Method 2 doesn't see a significant increase in patch download size regardless of how the game files are installed on a machine, but it can be computationally intensive, as mentioned. Because of this, consoles historically used method 1. I don't know if the current-gen consoles still do it that way or not.

That can result in large downloads even if only a few files inside a wad were changed.

Both methods have advantages and disadvantages.

Method 1 can obviously result in far more data being downloaded than is strictly required, but it's fast (once the download is complete) and doesn't require X version of Y file to be in place in order to apply Z patch. It also uses almost no computational resources, which on consoles have historically been pretty limited. Since you are just overwriting whatever files exist, you can check which version is currently installed and download everything needed for the latest version. So it doesn't matter if you have version 1.0 of a game and the game is currently on version 3.2 after tens of patches.

Method 2 has an obvious benefit in reducing the size of downloads, hence you see it used almost ubiquitously in PC MMOs, where patches are frequent and files are often stored in large multi-GB wads. The downside is that once the download is finished there can be a lengthy patch period as the patch is applied. Additionally, when patching, you have to patch through each version. So, using the example above: if you are on version 1.0 and want to patch up to version 3.2, you'll most likely have to apply patch 1.1, then 1.12, then 1.5, then 2.0, then 2.1, etc. The result is that it could take an extremely long time to patch up to the latest version if you haven't been keeping up. Hence why occasionally MMOs will update the install package and recommend reinstalling the game rather than patching it, as that will be much faster (i.e., patching similar to method 1). Some developers will host multiple versions of the latest patch that can be applied to various previous versions, but it's somewhat rare.

Regards,
SB
 
Yeah, compression makes sense pretty much all the time. Consoles are so fast now that the cost of decompressing is much lower than the cost of waiting longer to transfer data into memory. I suspect that even meshes are compressed or packed now.
Everything has been compressed since PS2 days. Decompression is faster than reading data from HDD / disc. Compression thus both improves loading times and allows more content.

There are preprocessing methods to improve mesh compression ratios. For example, triangle indices can be rotated so that the lowest index comes first, and then you simply store the deltas (most high bits become zero -> it compresses very well with LZ-based compression). Vertex data should also be stored in SoA layout so that similar data sits near each other (vertex colors, for example, tend to be highly similar, UVs change smoothly, etc. - delta preprocessing also helps here).
 
Some developers will host multiple versions of the latest patch that can be applied to various previous versions, but it's somewhat rare.

There's no excuse for consoles not to do that. The point of consoles is offering a streamlined user experience. They should use every single trick in the book to make games, patches and content as small as possible, and to speed up their download.
I feel like the OSes of current consoles are still very sub-optimal and user-unfriendly.
 
With 4K game content they definitely need a new Blu-ray drive. A lot of games are already close to 50GB, and if they're serious about 4K you can expect 100GB+ content, and not everybody has VDSL or a similar connection.
Wouldn't that break forward compatibility? Unless (a) the standard Xbox One Blu-ray drive can read the first two layers of an Ultra HD Blu-ray disc, or (b) they ship two discs in the retail case.
 
Looks like someone misspoke about Scorpio, or they are going back on their word about native 4K for first-party titles.
Lots to read, but I'll cite the part:
Source: http://www.gamespot.com/articles/xbox-head-phil-spencer-talks-scorpio-ps4-pro-4k-re/1100-6444198/

Microsoft GM Shannon Loftis recently said all first-party games launched in the Scorpio time frame will be rendered natively at 4K instead of upscaled--do you think third-party games will generally follow suit?

I want to put the tools in the hands of the creators and let them decide. Even for our first-party games, and Shannon owns a large portion of our first-party portfolio and I know she's got a vision that she wants to drive in gaming, and I'm glad she's got her voice and is setting a vision for her teams… even on the launch of the original Xbox there was some push to say "Okay, shouldn't we mandate HD, that everything is 1080p?" and I just think that the best games aren't defined by their resolution or framerate, frankly. I know a lot of people say that 60fps is the holy grail of frame rate. But I'll just say, give the developers the power and the tools to develop the best realisation of their vision and their game, and they will make the right decisions.

On the third-party side, a lot of the teams already have a 4K version of the game because of PC and what they're building. And they'll decide on Scorpio and what they're going to go do. I definitely think we're going to see native 4K games, but you'll also see teams take different approaches and I think that's absolutely fine.

We shouldn't let gaming turn into an artform that's defined by a number. Nobody asks when you look at a painting, how many colours were used? Even the standards in the way movies are shot, there's also a lot of flexibility and artistic flavour in what's put in TV and movies and we should allow that same freedom in the gamespace and not try to excel or review things based on X plus Y equals how good something is.
 
Looks like someone misspoke about Scorpio, or they are going back on their word about native 4K for first-party titles.
Lots to read, but I'll cite the part:
Source: http://www.gamespot.com/articles/xbox-head-phil-spencer-talks-scorpio-ps4-pro-4k-re/1100-6444198/

Hmm, in the passage you quoted, I only see mention of how 3rd parties may approach 4K and that it is up to them.

He doesn't really touch on the first part of the question about 1st-party games, as that might be set in stone whether he agrees with it or not. From the tone of his response, though, it's implied that he doesn't want to force internal studios to adhere to a 4K mandate, but would rather allow them to choose. That may be out of his control, however, and thus he doesn't address it directly.

Regards,
SB
 


Looks like someone misspoke about Scorpio, or they are going back on their word about native 4K for first-party titles.
Lots to read, but I'll cite the part:
Source: http://www.gamespot.com/articles/xbox-head-phil-spencer-talks-scorpio-ps4-pro-4k-re/1100-6444198/
From the same interview, maybe not Zen, but it seems like Jaguar isn't going to be the CPU of the Scorpio.

"We'd looked at doing something that was higher performance this year, and I'd say the [PS4] Pro is about what we thought--with the GPU, CPU, memory that was here this year--that you could go do, and we decided that we wanted to do something different."
 
From that Gamespot interview:

"How is Xbox Scorpio development shaping up? You said announcing at E3 in June was in part to give developers time to get themselves familiar with the hardware. How is that coming along?

Really well. Actually, with the hardware timelines right now we're a little ahead of plan."

The Project Scorpio video said "Holiday 2017", which basically sounds like anywhere from November to December. Personally, I'd love for Microsoft to be able to get this out in mid-to-late September. I think that would be an awesome time to launch.
 
Do they even have a working dev kit with release hardware yet?

Unlikely. Developers probably have the specs available so that they can start working with them. As work will be done on PCs, it won't matter too much whether they have final hardware or not until it comes time to optimize their engines and their games.

Until then, if whatever they create is too optimistic for the hardware that ends up shipping, they can always scale back effects, etc. If the game sees release on PC, which is quite likely, it's not like any of those assets would go to waste.

Regards,
SB
 