PlayStation 5 [PS5] [Release November 12, 2020]

I do agree with you on the fill rate aspect now. I don't believe the bottleneck is straight computational performance either, though. Note how the 1070 is more than twice as fast as the 1060 despite having less than 50% more shader resources on an identical architecture. It seems the benchmark works in mysterious ways.

The bottleneck may be somewhere else, but we will soon have a game benchmark from Digital Foundry, where I expect the PS5 to perform much slower than a 2080 Ti; it will be interesting to compare it to a 5700 XT.

EDIT:

https://sluglibrary.com/SlugDemo.zip

There is a demo available on Windows, so it is possible to compare GPU results.

https://sluglibrary.com/

The demo uses Arial, but it is possible to change to any font.

Slug is distributed as a static library that has a basic C++ interface, and full source is included with every license. Slug is platform agnostic and runs on Windows, Mac, Linux, iOS, Android, and all major game consoles. It can be used with Vulkan, Direct3D 10/11/12, OpenGL 3.0–4.6, Metal, and WebGL2. The API is fully documented in the Slug User Manual.

The primary function of Slug is to take a Unicode string (encoded as UTF-8), lay out the corresponding glyphs, and generate a vertex buffer containing the data needed to draw them. When text is rendered, your application binds the vertex buffer, one of our glyph shaders, and two texture maps associated with the font. One texture holds all of the Bézier curve data, and the other texture holds spatial data structures that Slug uses for efficient rendering.

Slug can render each glyph as a single quad. This has the advantage that it's easy to clip to an irregular shape or project onto a curved surface in a 3D environment. It also means that vertex buffer storage requirements do not depend on the complexity of the glyphs. There is also an optimization, using a tighter polygon with between three and six vertices, that can be enabled to increase speed at moderate and large font sizes.
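As a rough illustration of the general technique of filling glyphs directly from Bézier curve data (this is my own CPU-side sketch, not Slug's actual API or shader code), here is a minimal example that classifies a point as inside or outside a glyph outline by counting horizontal-ray crossings against quadratic Bézier segments. It uses the even-odd fill rule for simplicity; a production renderer like Slug runs this kind of test per pixel in a shader, with acceleration structures stored in textures:

```python
# Sketch: per-point inside/outside test against quadratic Bezier outlines.
# Illustrative only; not the Slug library's implementation.

def bezier_crossings(p0, p1, p2, px, py, eps=1e-9):
    """Count crossings of the rightward ray from (px, py) with one
    quadratic Bezier segment p0 -> p1 -> p2."""
    # Solve B_y(t) = py, a quadratic in t.
    a = p0[1] - 2.0 * p1[1] + p2[1]
    b = 2.0 * (p1[1] - p0[1])
    c = p0[1] - py
    roots = []
    if abs(a) < eps:                 # segment is (nearly) linear in y
        if abs(b) > eps:
            roots.append(-c / b)
    else:
        disc = b * b - 4.0 * a * c
        if disc >= 0.0:
            s = disc ** 0.5
            roots.extend([(-b + s) / (2.0 * a), (-b - s) / (2.0 * a)])
    count = 0
    for t in roots:
        if 0.0 <= t < 1.0:
            # Crossing counts only if it lies to the right of the point.
            x = (1 - t) ** 2 * p0[0] + 2 * t * (1 - t) * p1[0] + t ** 2 * p2[0]
            if x > px:
                count += 1
    return count

def point_in_glyph(curves, px, py):
    """Even-odd fill rule: an odd number of crossings means 'inside'."""
    total = sum(bezier_crossings(p0, p1, p2, px, py) for p0, p1, p2 in curves)
    return total % 2 == 1
```

For example, a unit square expressed as four degenerate quadratic curves reports points inside it as filled and points beside it as empty. The appeal of doing this on the GPU is that the curve data lives in a texture once, and resolution independence comes for free.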
 
Launch PS3s had PS2 hardware. (In Europe)

At least the 60 GB model had it (not sure about the 40 GB one, or the 80 GB).

But some definitely had it; I have one of those too.

The 40 GB model didn't have card readers and some other items, I believe.

Japan and US models had full PS2 hardware for BC. The EU launch model had partial hardware. I do not remember if it was the PS2 CPU or GPU that got removed for the EU launch.
 
Japan and US models had full PS2 hardware for BC. The EU launch model had partial hardware. I do not remember if it was the PS2 CPU or GPU that got removed for the EU launch.

It was the CPU. But in the end, every PS3, even the ones without any PS2 hardware, received full software BC; with hacks you could use it for every title. DF did a video on this.
 
Yeah, the CPU was removed on the CECH-C/E, as well as the Rambus memory.

every PS3, even the ones without any PS2 hardware, received full software BC; with hacks you could use it for every title

ps2_netemu (the .self that was used for PSN Classics) was far, far away from full BC, even with community configuration patches. Even for the stuff that is considered fully playable, there can be various audio/visual glitches.
 
Yeah, the CPU was removed on the CECH-C/E, as well as the Rambus memory.



ps2_netemu (the .self that was used for PSN Classics) was far, far away from full BC, even with community configuration patches. Even for the stuff that is considered fully playable, there can be various audio/visual glitches.

In total numbers of what is playable (without glitches), it still dwarfs official Xbox emulation, which is often praised for its numbers. But you are correct: only the games released on PSN were fully tested and playable.

Here is a list:
https://www.psdevwiki.com/ps3/PS2_Classics_Emulator_Compatibility_List
 
I do agree with you on the fill rate aspect now. I don't believe the bottleneck is straight computational performance either, though. Note how the 1070 is more than twice as fast as the 1060 despite having less than 50% more shader resources on an identical architecture. It seems the benchmark works in mysterious ways.

What I think you're seeing here is the result of two distinct GPU resources being tested in an interleaved pattern: [fill rate], then [computational], then [fill rate], then [computational], and so on, with both GPU tasks initiated by the CPU. Ergo, what this benchmark demonstrates is less the overall theoretical computation or fill-rate performance and more, crucially, the CPU setup time of the individual operations required to achieve the result.
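That hypothesis can be illustrated with a toy cost model (the function and all numbers below are invented for illustration, not measured from the benchmark): if each interleaved pass pays a fixed CPU setup cost and only the GPU portion scales with GPU speed, then a GPU that is twice as fast barely reduces total frame time once setup dominates.

```python
# Toy cost model: interleaved benchmark passes where each dispatch pays a
# fixed CPU setup cost, and only the GPU work scales with GPU speed.
# Hypothetical numbers, for illustration only.

def frame_time_ms(num_dispatches, cpu_setup_ms, gpu_work_ms, gpu_speed):
    """Total time when every pass costs a fixed CPU setup plus GPU work
    that shrinks proportionally to GPU speed."""
    return num_dispatches * (cpu_setup_ms + gpu_work_ms / gpu_speed)

slow = frame_time_ms(1000, 0.05, 0.01, 1.0)  # baseline GPU: 60 ms
fast = frame_time_ms(1000, 0.05, 0.01, 2.0)  # 2x faster GPU: 55 ms
```

With these made-up numbers, doubling GPU throughput yields only about a 9% improvement, which is the kind of compressed scaling the benchmark results above hint at.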
 
Sony has been a long-time contributor to LLVM. It's a shame Microsoft doesn't also use LLVM, because that would lead to better performance for all code produced by the compiler.
LLVM traditionally compiles faster; it does not necessarily make the code run faster.
 
LLVM traditionally compiles faster; it does not necessarily make the code run faster.

Compile times definitely do not translate to runtime performance, but I've not seen any comparison of code performance between the two compilers used by Sony and Microsoft. What I'm saying is, the more people using, and contributing to the improvement of, compiler tech, the better for everybody. :yep2:
 
Compile times definitely do not translate to runtime performance, but I've not seen any comparison of code performance between the two compilers used by Sony and Microsoft. What I'm saying is, the more people using, and contributing to the improvement of, compiler tech, the better for everybody. :yep2:
Well, I disagree. Competition is good for compilers too.
 

Hm... weird, he is getting better temps than Gamer's Nexus, topping out at 97 °C on the memory when covering the console with a blanket. My guess is his results won't get much attention, as they are good.

 

Hm... weird, he is getting better temps than Gamer's Nexus, topping out at 97 °C on the memory when covering the console with a blanket. My guess is his results won't get much attention, as they are good.

Not surprised. The methodology of Gamer's Nexus has been questioned by some, and not only because of the thermal pads. We could see in his setup that the cables were preventing the shield from being correctly positioned.
 
It was the CPU. But in the end, every PS3, even the ones without any PS2 hardware, received full software BC; with hacks you could use it for every title. DF did a video on this.

Yes, but the discussion was whether "big" changes were made within the first year of the device's release. The change was to save on cost; it might be the same again for the PS5 DE.
I mean, if 6 nm for some reason makes the DE a viable product for Sony to sell (my thought is that they lose too much money on it currently), it might be a change Sony will make.
 