And yet that is exactly what a lot of people have been saying in the forums about PS5.
Dunno. What I've seen is much more of the other side: "SSDs in PCs are useless for gaming, therefore PS5 is made by fools."
And yet that is exactly what a lot of people have been saying in the forums about PS5.
No one is developing a game with a 2080 Ti as the baseline, after all.
Or even in the 2080 Ti region.
An SSG version of a gaming card could be a pricier way to get a storage subsystem into PCs with similar parameters to the customized console subsystems. It wouldn't require mass replacement of all the systems whose CPU lacks built-in decompression and extra DMA hardware and whose motherboard lacks a PCIe 4.0 NVMe slot. There would need to be some transfers over the PCIe bus to the graphics card, but those could be limited to swapping a game's asset partition in and out rather than constant transfers.
It'd be a value-add for AMD's hardware, at least.
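Some rough numbers on why the partition-swap approach could work; every figure here is a ballpark assumption of mine, not a measurement:

# Back-of-the-envelope for swapping an asset partition onto the card.
pcie4_x16 = 28e9        # ~28 GB/s of practical PCIe 4.0 x16 bandwidth (assumed)
asset_partition = 20e9  # say 20 GB of assets for the current level or area
console_stream = 5.5e9  # sustained rate a console-style subsystem targets

print(f"one-time swap: {asset_partition / pcie4_x16:.2f} s")                  # ~0.71 s
print(f"bus share if streamed constantly: {console_stream / pcie4_x16:.0%}")  # ~20%

So a one-time swap at level load costs well under a second, while constant streaming at console-like rates would tie up a fifth of the bus for the whole session.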
I can't tell if serious anymore.
Are you saying an SSD cannot save processing time on a GPU?
I can't tell if serious anymore.
No, SSDs will not increase your framerate by 50%.
Maybe they were testing Crysis.
Cerny said in the presentation that access to the game assets is mapped, and the dev doesn't even need to know if or how the data is compressed; they address the virtual uncompressed data layout, and it's all transparent.
Of course they still need to know whether it comes from a 2.4GB/s or a 5.5GB/s drive. Can't defy the laws of logic.
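For what that transparency could look like, here is a toy sketch; the block size, zlib, and every name in it are my guesses at the idea, not the actual PS5 I/O stack. The game asks for offsets in the uncompressed layout and never sees the compression:

import zlib

BLOCK = 64 * 1024  # uncompressed bytes per compressed block (assumed)

class MappedAssets:
    """Serve reads against the virtual uncompressed layout of a game's assets."""
    def __init__(self, blocks):
        self.blocks = blocks  # zlib-compressed chunks, each BLOCK bytes when inflated

    def read(self, offset, size):
        out = b""
        while size > 0:
            index, within = divmod(offset, BLOCK)
            data = zlib.decompress(self.blocks[index])  # the dedicated hardware's job
            take = min(size, BLOCK - within)
            out += data[within:within + take]
            offset, size = offset + take, size - take
        return out

On the console, the decompress step is what the dedicated unit does at full drive speed, which is why the dev never has to care how the data is packed.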
I can't tell if serious anymore.
No, SSDs will not increase your framerate by 50%.
Since XSX is not too far from that 2080 Ti, we actually have a 2070/2080 baseline minimum with the next-gen consoles.
I'm not sure we need to explain it a 100th time. It was perfectly fine the first 10 times, but now it's getting ridiculous.
100 times, inaccurately.
They had problems maintaining 2+3 for a maximum possible load, because it's something that nobody does in the hardware world.
Incorrect. Fixed clocks are common. Exhibit A) PS4: while under max load (e.g. God of War), the CPU and GPU will never dip below or go above 1.6GHz and 800MHz respectively. The developer is guaranteed that performance. Heat and fan noise is another story.
XBSX claims that they do it, but it's that claim which should be scrutinized, because that claim is unrealistic, not Sony's.
Incorrect. The claim is normal and there is no basis for your assertion.
Sony's claim is perfectly fine: when power-hungry operations are used too much, the GPU or CPU underclocks.
Agreed, except for the phrasing of "used too much". It's not a scenario where you can use 100% of the chip for short periods and then it slows down; the closer you get to 100%, the more it slows down. The thing which hasn't been disclosed, and the real point of contention, is what the clocks are at 100% utilization. The numbers that make sense are 2 and 3 (GHz, for the GPU and CPU), because that is what Cerny himself indicated they would be had they opted for "fixed" clocks.
It was the case for every GPU and CPU till now.
Incorrect. While variable clocks in and of themselves aren't new, in the console space clocks are generally deterministic, with a predefined consistent speed. PS5 keeps the deterministic part, in a way, by ignoring heat (heat-dependent clocks are how things generally work in the PC space). But even there, the minimum clocks under load are disclosed. The issue with Sony is that they have failed to disclose those max-load clocks, and so folks cling to the "most of the time" and "a couple percent" statements, which have no context of load to go with them.
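To illustrate the "closer to 100%, the more it slows down" shape, here is a toy power model. The constants are made up (only the 2.23GHz max clock is a public figure): dynamic power goes roughly as activity × V² × f, and voltage scales roughly with frequency, so P ~ k·u·f³.

# Toy model of a power-capped variable clock. Illustrative only:
# the budget and calibration point are assumptions, not Sony's numbers.
F_MAX = 2.23      # GHz, PS5 GPU max clock (public figure)
P_BUDGET = 1.0    # normalized power budget
K = P_BUDGET / (0.8 * F_MAX ** 3)  # assume 80% utilization is sustainable at F_MAX

def clock(utilization):
    """Highest clock (GHz) that fits the power budget at a given utilization."""
    return min(F_MAX, (P_BUDGET / (K * utilization)) ** (1 / 3))

for u in (0.5, 0.8, 0.9, 1.0):
    print(f"utilization {u:.0%}: {clock(u):.2f} GHz")

With these made-up constants, 100% utilization lands around 2.07GHz, i.e. in the region of the "2" figure above; the point is the shape of the curve, not the exact numbers.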
This year it's all about the Xbox One baseline for MSFT.
No sane console corporation is going to drop their 50 to 100M user base during the first months (this year), or even the first year.
I can't tell if serious anymore.
No, SSDs will not increase your framerate by 50%.
lol. I would not be so sure about that, although I would prefer not to advance a number...
But take a look at this example of frustum culling:
(Not managing to place an animated gif, so I’ll leave a link)
https://giphy.com/gifs/linarf-xUPGcgiYkD2EQ8jc5O
If an SSD allows you to avoid processing anything outside the cone of vision, you will save a lot of processing power.
And this is a good example of frustum culling. Other games are not so efficient.
I cannot say how much processing power you could save here if an SSD reduced the outside part to a minimum. But looking at that image, there are moments where it looks like the outside part is almost as big as the inner part.
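To make the cone-of-vision test concrete, here is a minimal 2D culling sketch (a simplification of my own; real engines test bounding volumes against the six planes of the 3D frustum):

import math

def visible(cam, forward, fov_deg, center, radius):
    """True if a circle (center, radius) overlaps the view cone."""
    dx, dy = center[0] - cam[0], center[1] - cam[1]
    dist = math.hypot(dx, dy)
    if dist <= radius:
        return True  # camera is inside the object's bounds
    angle = math.acos((dx * forward[0] + dy * forward[1]) / dist)
    # widen the cone by the angle the bounding radius subtends
    return angle <= math.radians(fov_deg / 2) + math.asin(min(1.0, radius / dist))

objects = [((10, 0), 1), ((0, 10), 1), ((-5, 0), 1)]
drawn = [c for c, r in objects if visible((0, 0), (1, 0), 90, c, r)]
print(drawn)  # only the object in front of the camera survives: [(10, 0)]

Everything that fails the test never costs the GPU anything; the question is how much the stuff outside the cone was costing before.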
Another case is when you are limited on geometry by HDD speed. An SSD could allow for extra geometry. This would not be a performance boost, but it would count as one due to the increased visuals.
And an extra-fast SSD can even do both.
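Rough per-frame streaming budgets show why geometry can end up drive-limited. The HDD and SATA rates are ballpark figures of mine; the 2.4 and 5.5GB/s raw rates are the ones quoted for the new consoles:

fps = 60
rates = {"HDD": 0.1e9, "SATA SSD": 0.55e9, "XSX raw": 2.4e9, "PS5 raw": 5.5e9}
for name, rate in rates.items():
    # bytes per frame the drive can sustain, shown in MB
    print(f"{name:8s}: {rate / fps / 1e6:5.1f} MB per frame at {fps} fps")

About 1.7MB per frame from an HDD versus roughly 92MB per frame from the fastest SSD; that gap is what shows up on screen as extra geometry.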
But it still means no "2080 as a baseline" until at least 2021 for XBSX.
And yet that is exactly what a lot of people have been saying in the forums about PS5.
lol.
Sure, I guess. If I/O is the limiter.
It's not going to make your 1080 operate like a 2080 S though.
Just didn’t understand the lol.
I gave an example... It's a valid one!
Just don’t forget that with an SSD you can change the way you create games, putting all the GPU power into what’s on screen and relying on the SSD to stream the rest.
But want to get away from that example? What about the SSD being used as virtual RAM? Like when ray tracing starts creating those large data sets that grow rapidly with additional geometry and high-resolution textures, using gigantic amounts of memory, and limiting what you can have on screen and what you can do with RT due not to GPU performance, but to memory limitations.
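A sketch of what "virtual RAM" could mean in practice: keep only the on-screen working set resident and demand-load everything else. The budget, the names, and the LRU policy are all mine for illustration; a real engine would manage GPU memory with async I/O instead:

from collections import OrderedDict

class AssetCache:
    """Resident set with LRU eviction, backed by a fast SSD."""
    def __init__(self, budget_mb, load_from_ssd):
        self.budget, self.used = budget_mb, 0
        self.resident = OrderedDict()   # asset id -> size in MB, LRU order
        self.load_from_ssd = load_from_ssd

    def get(self, asset_id, size_mb):
        if asset_id in self.resident:
            self.resident.move_to_end(asset_id)   # mark as recently used
            return
        while self.used + size_mb > self.budget:  # assumes any single asset fits
            _, freed = self.resident.popitem(last=False)  # evict off-screen LRU
            self.used -= freed
        self.load_from_ssd(asset_id)   # a fast drive makes this affordable per frame
        self.resident[asset_id] = size_mb
        self.used += size_mb

The faster the drive, the smaller the resident set can be for the same visuals, which is exactly the memory headroom those RT data sets would need.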