Great video! Some nice performance comparisons at nearly matched PS5 settings, and it's good to see the 3060 seemingly offering a largely PS5-equivalent experience (as close as it's possible to measure given the differences) - so much for needing an RTX 3070 for this, a card almost 50% faster than the 3060.
The CPU comparison using the 11400F and 12400F is interesting too - both fall a little below a locked 60fps. (It's still worth remembering that, according to Alex, the PS5 uses an object range of between 7 and 8, not exactly 8, and that could have a noticeable performance impact given Richard shows the difference between 8 and 10 here is around 18%.)
I still would have liked to see an 8-core CPU compared here though, as I can't help but feel that after giving up a whole core to decompression work, a 6-core CPU is at too great a core-count disadvantage vs the PS5 for equivalence, even if those cores are much faster.
Also interesting to see the RT impact on performance. It seems most of the game isn't actually stressing the RT capabilities that much.
First post on this forum after quite a long time lurking.
My take on the Spider-Man port to PC is that, contrary to the overall opinion in this thread, the PC as a gaming platform (not any individual high-end user, mind you) is in trouble. Consoles offer much better performance per watt than PCs - they are engineered that way.
You can always make things run more smoothly by moving from a general-purpose design (PC) to one built specifically to run game code as efficiently as it can possibly get.
Console APIs in particular (and here Sony really shines) are much leaner, with every unneeded layer removed.
That was, and is, true for the PS4. Richard Leadbetter's latest video shows this - the PS4 cannot be matched on PC with similar hardware. He runs the game at 900p and it still performs worse, while the base PS4 version renders at 1080p almost all of the time, with a hard-locked 30fps - and that's with Richard using a CPU that runs circles around the PS4's Jaguar cores.
The performance edge consoles have didn't manifest as much last generation, mainly because of the extraordinary advantage the PC community had with its CPUs over the Jaguar cores. The consoles' performance and API edge over PC was in a way "sacrificed" by using a CPU that would, under normal (PC) circumstances, be totally unfit for gaming.
Only the lower-friction environment on consoles (hardware- and software-wise) made it possible to build a whole generation of still very good-looking games on a netbook-class CPU, at a time when PC users had far more performant CPUs.
This time, however, everything is different!
This time a very good CPU is used, but it still sits in a lean, frictionless console environment. Even more so on the PS5, where its dedicated hardware decompression block frees up even more CPU time - something the PC simply cannot copy 1:1.
DirectStorage will help, but it will come at a cost.
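To give a rough sense of why a dedicated decompression block matters, here's a minimal sketch that times software decompression on one CPU core. It uses Python's zlib purely as a stand-in - real game streaming uses codecs like Kraken or GDeflate, and a shipping engine would be in C++ - so treat the numbers as illustrative of the cost category, not of any actual game or API:

```python
import os
import time
import zlib

# Stand-in payload: 64 MiB of repeating data (real game assets and
# real codecs will behave differently; this is only a proxy).
chunk = os.urandom(1024) * 65536
compressed = zlib.compress(chunk, level=6)

start = time.perf_counter()
restored = zlib.decompress(compressed)
elapsed = time.perf_counter() - start

mib = len(restored) / (1024 * 1024)
print(f"decompressed {mib:.0f} MiB in {elapsed * 1000:.1f} ms "
      f"({mib / elapsed / 1024:.2f} GiB/s on one core)")
assert restored == chunk
```

Whatever rate your machine reaches here, sustaining the multiple GB/s a PS5-style streaming workload demands purely in software would eat real CPU time every frame - which is exactly the work the console's hardware block offloads, and what GPU-assisted decompression on PC is meant to take over.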
I'll skip over a few points now and address my take on the question of whether an RTX 3070 is sufficient to play PS5 ports in the future at the same (or even better) detail settings.
My Answer is of course NO.
We can all see what it takes, CPU- and GPU-wise, to play a mere PS4 game uplifted by its PS5 patch at the same settings on PC. Gone are the times when far weaker PC hardware (read: GTX 750) could bring something to the table.
I saw, a page or so earlier in this thread, the idea that people with RTX 30 series or even RTX 20 series GPUs would be fine simply because RTX I/O is here to save the day. It was acknowledged that RTX I/O on the RTX 20 and 30 series doesn't even come with dedicated decompression hardware - Nvidia simply wants to use underutilised GPU resources. Someone stated that sacrificing a mere 2 TFLOPS would be enough to match the PS5's I/O throughput. Also wrong - RTX 30 cards from the 3070 upwards enjoy a rasterisation edge over the PS5, but it's not like they're in the next league. For RTX 20 cards? Forget it. I'd say that when the first second-wave PS5 exclusive arrives - one that really leverages the PS5's I/O block - the RTX 20 cards aren't going to cut it anymore, and there's a chance an RTX 3070 won't match a PS5 either.
Heck, I want to see a Ratchet & Clank: Rift Apart port to PC right now. Given how a PS4 game with a PS5 patch stresses PC hardware today, does anyone believe Rift Apart would run well on the same range of hardware? The game relies heavily on real-time decompression of several GB/s with every camera move - the "what is not in sight is not in memory" approach Cerny forecast in his Road to PS5 talk. And Rift Apart doesn't even make the PS5 break a sweat - we know that because people have swapped in external SSDs that are technically below Sony's recommendation for an SSD upgrade, and it still runs.
And please don't read this post as a stab at PC gaming in general. I game on one as well - but these things need to be addressed in all fairness.