I think Shifty posted some napkin math based on an older post that went, roughly, 62 MB per 16 ms frame to change everything... There is supposed to be something about only loading (SSD to RAM) the part of a texture that is actually going to be used, rather than the whole thing.
Consoles could get away with having the OS on the SSD, and save a little bit of RAM?
That's the same thing and the basis for my serviette maths. Sebbbi posted that an ideal virtual texturing engine (one that only loads the parts of textures you will need to draw the visible surfaces on screen) would need about 7 MB/s for 720p*. From that, I extrapolated 63 MB/s for 4K (worked numbers below). However, that's for games using virtual texturing, which are very few and far between. The idea even extended to virtual meshes last gen, but the tech never took off in a big way. I don't know why, but there it is. Most games still load textures up front, well in advance of using them. I don't think anyone can guess how engines will develop over the next gen between the consoles and PCs. I expect a fair degree of divergence, converging on ideal streaming solutions towards the generation's end and rolling into the next, but I could be very wrong.
* Note that Sebbbi implemented it in Trials (dunno which version), where user-created levels could include every single object in one level and it was all just streamed. So he kinda knows what it takes to pull off virtual texturing. And he's at Unity now AFAIK; fingers crossed we get his graphical brilliance into our favourite indie engine!
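For anyone who wants the worked version of that serviette maths: the only assumption beyond Sebbbi's 7 MB/s figure is that ideal streaming bandwidth scales with on-screen pixel count (4K has exactly 9x the pixels of 720p).

```python
# Serviette maths: scale Sebbbi's ~7 MB/s ideal virtual-texturing figure from 720p to 4K.
# Assumption: streaming bandwidth scales with on-screen pixel count.

PIXELS_720P = 1280 * 720     # 921,600 pixels
PIXELS_4K   = 3840 * 2160    # 8,294,400 pixels

bw_720p_mb_s = 7.0                    # Sebbbi's ideal-case 720p figure
scale = PIXELS_4K / PIXELS_720P       # = 9.0
bw_4k_mb_s = bw_720p_mb_s * scale     # = 63 MB/s sustained

# Note this is a steady-state rate for incremental updates; the "62 MB per 16 ms
# frame to change everything" figure quoted earlier is a full-scene-change burst,
# not a sustained rate.
per_frame_mb = bw_4k_mb_s / 60        # about 1 MB of new texels per 60 fps frame

print(f"4K ideal streaming: {bw_4k_mb_s:.0f} MB/s, ~{per_frame_mb:.2f} MB per 60 fps frame")
```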
On the tiled-resources hardware that was talked about but didn't get used much: if I remember correctly, the Tier 1 support that was in the consoles had some limitations, so in the end devs ended up rolling their own software version.
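For anyone wondering what "rolling their own software version" looks like in practice: a software virtual texturing scheme renders a feedback pass that records which texture tiles were actually sampled, then streams only the missing tiles. A minimal sketch of the CPU-side bookkeeping, assuming a 128-texel tile size and made-up names throughout (illustrative, not any particular engine's code):

```python
# Minimal sketch of software virtual texturing bookkeeping: a GPU feedback pass
# has written (texture_id, mip, u, v) samples for texels it actually touched;
# we dedupe those into tiles and stream only the tiles not already resident.
# Tile size and all names here are illustrative assumptions.

TILE_SIZE = 128  # texels per tile side

def tiles_requested(feedback):
    """feedback: iterable of (texture_id, mip, u_texel, v_texel) samples."""
    tiles = set()
    for tex, mip, u, v in feedback:
        tiles.add((tex, mip, u // TILE_SIZE, v // TILE_SIZE))
    return tiles

def schedule_streaming(feedback, resident, request_load):
    """Queue loads for visible tiles that are not yet in the tile cache."""
    for tile in tiles_requested(feedback):
        if tile not in resident:
            request_load(tile)   # e.g. kick off an async read of one compressed tile
            resident.add(tile)   # treat as in-flight so it isn't requested twice

# The point of the 7 MB/s figure above: only the handful of newly visible tiles
# per frame ever hit the disk, instead of whole mip chains loaded up front.
```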
Along the same lines, XBSX is 6+ GB/s when using the hardware decompression block. This was stated as what a developer would typically see, not a theoretical maximum. I view it as the minimum attainable with hardware decompression.
This is false. MS said their decompression chip has a theoretical maximum output of 6GB/s. 4.8GB/s is stated as the typical output. It's not clear if 6GB/s output is ever even physically attainable as the device is configured, though I assume it probably is as a best case scenario.
"Our second component is a high-speed hardware decompression block that can deliver over 6GB/s," reveals Andrew Goossen.
Sony said: A quick update on backward compatibility – With all of the amazing games in PS4’s catalog, we’ve devoted significant efforts to enable our fans to play their favorites on PS5. We believe that the overwhelming majority of the 4,000+ PS4 titles will be playable on PS5.
We’re expecting backward compatible titles will run at a boosted frequency on PS5 so that they can benefit from higher or more stable frame rates and potentially higher resolutions. We’re currently evaluating games on a title-by-title basis to spot any issues that need adjustment from the original software developers.
It's quite incredible that Sony has been forced to come out and clarify this because some people just want to misrepresent what Cerny said. What he said, btw, is that most of the 100 most-played games on PS4 work fine on PS5 and some needed rework; he did not say only 100 PS4 games work on PS5.
This is what happens when you use a GDC tech talk as your first comms package for a next-gen console. This is not Mark Cerny's fault, this is Sony PR's fault. Wasn't Sony centralising marketing amid layoffs last October supposed to improve communications?
How could anybody think this tech talk in isolation would be a good thing to market through all the regular consumer-facing comms?
They actually said it was capable of over 6 GB/s.
"Our second component is a high-speed hardware decompression block that can deliver over 6GB/s," reveals Andrew Goossen.
From the Digital Foundry reveal.
Except it won't be the minimum. If Microsoft are saying 6 GB/s is "typical", it means it can go slower under some circumstances and faster under others. It's likely a median of observed throughput from considerable testing. That's why they used "typical" and not minimum or guaranteed.
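To make the "typical is not a minimum" point concrete: if "typical" means something like the median of measured throughput, then roughly half of all observed transfers land below it. Toy numbers, entirely made up, just to show the statistics:

```python
# A "typical" (median) throughput figure says nothing about the floor.
# These sample values are invented; only the point about medians matters.
from statistics import median

observed_gb_s = [4.8, 5.4, 5.8, 6.0, 6.1, 6.3, 6.8, 7.5]  # hypothetical measurements

print(median(observed_gb_s))  # 6.05: the "typical" figure you would quote
print(min(observed_gb_s))     # 4.8: the actual worst case in this sample
print(max(observed_gb_s))     # 7.5: best case, e.g. highly compressible data
```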
Fast travel has a fixed amount of data that is required to be loaded, and it resulted in 8 seconds on the original PS4 (I suppose with an SSD, because I played Spider-Man, and if an HDD could do it in 8 seconds that's really short. I don't even recall it fast travelling in 8 seconds, and I put an SSD in my PS4 Pro from day 1).
I think the times quoted were for the stock PS4/Pro HDD. I was fully willing to put a big SSD in my Pro, but the Digital Foundry testing was disappointing - the times, not the article.
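The back-of-envelope here is just load time = data to read / sustained throughput, so with a fixed fast-travel payload the drive's sustained read rate sets a floor but isn't the whole story. The payload size and drive speeds below are my own rough placeholders (the actual Spider-Man numbers aren't public):

```python
# load time = payload / sustained throughput. The payload size is a placeholder
# and the drive speeds are rough ballpark figures, not measured values.

def load_time_s(payload_gb, throughput_mb_s):
    return payload_gb * 1024 / throughput_mb_s

PAYLOAD_GB = 0.8  # hypothetical fast-travel working set

print(load_time_s(PAYLOAD_GB, 100))  # ~8.2 s: stock 5400 rpm PS4 HDD ballpark
print(load_time_s(PAYLOAD_GB, 400))  # ~2.0 s: SATA SSD ballpark, if raw I/O were the only cost
# If a SATA SSD upgrade doesn't get anywhere near that, the bottleneck is likely
# CPU-side decompression and setup rather than raw read speed, which would fit
# the disappointing Digital Foundry upgrade results mentioned above.
```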
This is a nice clarification. Mark Cerny is an incredibly precise guy when it comes to communication, and I'm sure there were a few who wilfully misinterpreted the "we tested the top 100 PS4 games" comment, which was ripe for misinterpretation. This could have been avoided by just saying the "vast majority" of the whole library would be playable.
Those are some beastly speeds; now I see why MS sells those SSD memory cards, as most SSD drives for PC are way below that. Going to be interesting to see what huge open-world games can do with that. Back in 2001, I thought the first Halo had impressive open areas (thanks to the HDD?).
Can you provide a quote? I'm not seeing it.
He did, when he started talking about backwards compatibility and before he went into the section about boost mode and having tested the top 100 games. Plus, you know, the last year+ of them saying the PS5 was BC with PS4, full stop.
It must get a bit frustrating. He said exactly that, but people didn't understand because they can't interpret the context of the numbers; they just hear the numbers. "Less than a hundred games are BC?!"
The Microsoft SSD isn't really that fast
Before the reveals I was considering the PS5 for its SSD...
Liar.
Before the reveals I was considering the PS5 for its SSD, but after reading a lot and seeing comments like this, with the XSX/BCPack and their API/Velocity Engine and how consistent it is, they aren't far off in the end; the SSD isn't the reason anymore.