Baseless Next Generation Rumors with no Technical Merits [post E3 2019, pre GDC 2020] [XBSX, PS5]

Waiting for some new, concrete information on the PS5 be like... [waiting GIF]
 
For the past year I've been keeping monthly tabs on SIE's European, North American, and Japanese career hiring pages, hoping to see whether they are hiring for positions that could even HINT at being related to PS3 emulation. Unlike the MS Xbox division, where there are numerous such postings, so far I've seen nothing from SIE. A few years ago MS mentioned in a video tour that they have a team of 100+ developers and QA staff dedicated to their BC initiative. Now, Sony doesn't necessarily need 100+, but it's a huge undertaking in terms of coding and testing, and I've never seen even a handful of positions that could be attributed to emulation development. I encourage anyone to take a look at the hiring pages for all three regions and see for yourself.

Also, emulator author Lantus/Modern Vintage Gamer recently said on a podcast that there is NO WAY Sony is currently working on a PS3 emulator, because they are not putting any resources into making one (I assume he and the people he's connected to are looking for hints/evidence that they are making one, and that they have some idea of where and how to look).
Code:
* Here is a link to his lengthy credentials from helping work on numerous emulators, ports, and homebrew projects:
https://www.reddit.com/r/originalxbox/comments/7we5a7/hi_im_lantus_known_as_modern_vintage_gamer_these/

Yes at the end of the day it kinda sucks, but we are still getting BC for PS4.
 
Baldur's Gate works fine on PCSX2? I even played Champions of Norrath (a more demanding title?) on PCSX2, no problems at all. Most games work fine on PCSX2; yes, you need to find the right settings and it is a hassle sometimes if you want it perfect, but 90% of the time it is 'perfect', even better than a real PS2: higher res, better fps in some cases, filters and fixes, etc.
I meant I wasn't sure if that list was exclusively about PS2 emulation on PS4 or about emulation in general, with the PS4's PS2 Classics emulation mentioned in between (hence why I wondered why it said unplayable with no mention of PCSX2).
 
And the One S and One X already support VRR (variable refresh rate) and ALLM (auto low latency mode), two of the key HDMI 2.1 features.
 
36 CUs make hardware BC with the Pro trivial.

Cerny put 64 ROPs in the Pro when there isn't even enough bandwidth for 32, just so they could get BC working via the butterfly approach. It's not a stretch of the imagination to say that the PS5 design might be similarly hindered by BC. At the very least they could have done 22 WGPs with 20 enabled, like the 5700 XT. Just my analysis.
I think that's more of an architectural requirement for GCN. I questioned whether there would be 64 ROPs once the Pro's method of backwards compatibility was revealed, since GCN needs ROPs associated with any new shader engines, and it seemed like it would add complication for little gain to change that. The original PS4 half would need 32 ROPs, but there was no precedent for GCN to allow CUs to send things out of their shader engine and AMD's tiling method and historical practice only had symmetric ROP allocations.


Why? What is it about RDNA that means it can run GCN code perfectly as long as you have an identical number of CUs? The number 36 is understandably being fixated on as relevant, but there should be a far better technical argument for BC than the numbers simply matching: identifying what the problems are in running PS4 code on RDNA and what the solutions are.

I'm not saying it isn't good for BC, but I want to see technical justification. ;) Why can't you just throw an arbitrary number of CUs at the problem and have the GPU schedulers handle the workloads?

I ran across one suggestion that I cannot verify that Sony had lower-level options for allocating wavefronts to CUs, or specific numbers of CUs, and there may have been assumptions built into the code about the maximum number available. Code that launched up to the maximum number of available CUs might have included kernels past the PS4's 18 that had some dependence or synchronization issue that didn't manifest because the original PS4 couldn't launch the later wavefronts at the same time.
Possibly, bitfields developed for 18 CUs or counter variables that only assumed 18 simultaneous updaters might hit ambiguities if the hardware can jump to a point beyond their representations. The Pro also jumps over a power of 2 when it went to 36, although that seems like there would be some very specific code to get nailed by that.
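To make the bitfield/counter concern concrete, here's a minimal hypothetical sketch in C. The 18-slot completion mask and its width are purely my own illustration of the failure mode, not anything confirmed about Sony's libraries:
Code:
#include <stdint.h>
#include <stdio.h>

/* Hypothetical illustration: synchronization state sized for the base
   PS4's 18 CUs. The wait condition only examines bits 0-17, so work
   dispatched to a CU beyond the assumed 18 is never waited on. */
#define ASSUMED_MAX_CUS 18u

static uint32_t cu_done_mask;  /* bit i set when CU i reports its work done */
static const uint32_t ALL_DONE = (1u << ASSUMED_MAX_CUS) - 1u;

static int everyone_finished(void)
{
    /* Ignores bits 18..31: CUs 18-35 on a wider GPU fall outside the check. */
    return (cu_done_mask & ALL_DONE) == ALL_DONE;
}

int main(void)
{
    cu_done_mask = ALL_DONE;  /* CUs 0-17 report completion...               */
                              /* ...but work also launched on CU 20 has not  */
                              /* finished, and its bit is never examined.    */
    printf("finished (prematurely): %d\n", everyone_finished());
    return 0;
}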

I would have expected that code developed for the Pro would have been given the guidance to more explicitly check its hardware limits, so why there would be a similar need for the PS5 to match on the upper end isn't as clear. I could understand it if Sony made its guidance simpler by promising that they could assume at least 36.
I believe the Xbox One had some options for controlling CU allocation strategies, so perhaps the PS4 had some more low-level method that could get them in trouble.

That was my thought as well, but then why run it at 2.0 GHz? And why does a comment in one of the tests with 18 WGPs say "full chip is used", when that was not a BC test? Then there is the 288 GT/s number which, again, clearly indicates 36 CUs.
For backwards compatibility on a different microarchitecture, I'm actually curious as to whether having an exact match in counts and clocks would be sufficient. CU count might require a match for other reasons, but unless the hardware is painstakingly matching the cycle behavior of the original hardware, I'd expect that additional clock speed at each BC level would be needed to help get over any areas where hardware doesn't fully match.
Navi does drop a number of instructions, which, if any were used, would need additional hardware put back into Navi or some kind of trap-and-emulate method that would cost clock cycles. One potential trouble spot for well-optimized GCN vector code is a stream of dependent vector instructions that managed to hit high utilization on Southern Islands. In its first generation, RDNA has 25% longer latency for dependent instructions, which is one place where bumping the clock would be a ready option for compensating.
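As a back-of-the-envelope illustration of how a clock bump covers that latency gap (a sketch only: the 4-cycle GCN vs. 5-cycle first-gen RDNA dependent-issue figures are assumed from the 25% number above, and 911 MHz is the Pro's GPU clock; none of this comes from any official BC documentation):
Code:
#include <stdio.h>

/* Toy model: throughput on a chain of purely dependent vector ops scales
   as clock / issue latency. The latency figures are assumptions for this
   sketch: GCN issues a dependent VALU op every 4 cycles, first-gen RDNA
   every 5 (the 25% figure above). */
int main(void)
{
    const double gcn_latency  = 4.0;    /* cycles between dependent ops (assumed) */
    const double rdna_latency = 5.0;    /* 25% longer (assumed)                   */
    const double gcn_clock    = 0.911;  /* GHz, PS4 Pro GPU clock                 */

    /* Clock RDNA would need just to break even on a fully dependent chain. */
    double breakeven = gcn_clock * (rdna_latency / gcn_latency);
    printf("Break-even RDNA clock for a dependent chain: %.3f GHz\n", breakeven);

    /* Anything above ~1.14 GHz (e.g. the 2.0 GHz in the leaked tests) leaves
       headroom for other mismatches, such as trapping and emulating dropped
       instructions. */
    return 0;
}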


The hardware doesn't exist to run the program, so it is not strict backwards compatibility. But the downloads don't modify game code; they run a separate 360 OS with modified drivers to run the game. The game code itself is left unchanged, so by and large I would not call this an illusion. The original code is unchanged; there just needs to be a modified 360 wrapper added to allow the title to run.
I was under the impression that there was some transcompilation for the CPU and GPU code, at least as indicated from the following article that mentioned translating x86 and translating shaders:
https://www.eurogamer.net/articles/...x-one-x-back-compat-how-does-it-actually-work
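For what it's worth, here is a minimal conceptual sketch of ahead-of-time translation, using a two-instruction guest ISA invented purely for illustration. It only shows how the original binary can stay untouched while the code that actually executes is newly generated; Microsoft's real toolchain (PowerPC-to-x86 plus shader translation, per the article) is obviously far more involved:
Code:
#include <stdio.h>
#include <stddef.h>

/* Toy "guest" ISA, invented for illustration only. Each guest instruction
   is rewritten once into host operations before the game runs; the guest
   program itself is never modified. */
enum guest_op { G_ADD, G_MUL };

struct guest_insn { enum guest_op op; int dst, a, b; };

/* Stand-in for emitting host x86 or host shader code: here we just print
   the host-side operation each guest instruction maps to. */
static void translate(const struct guest_insn *prog, size_t n)
{
    for (size_t i = 0; i < n; ++i) {
        const struct guest_insn *g = &prog[i];
        printf("host: r%d = r%d %s r%d\n",
               g->dst, g->a, (g->op == G_ADD) ? "+" : "*", g->b);
    }
}

int main(void)
{
    const struct guest_insn prog[] = {   /* the untouched "game binary" */
        { G_ADD, 0, 1, 2 },
        { G_MUL, 3, 0, 0 },
    };
    translate(prog, sizeof prog / sizeof prog[0]);
    return 0;
}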
 

Thanks for the great response.

Do you think the 4+TF Navi Lockhart is capable of XB1X BC? From the GitHub leak it doesn't seem like it. Do they need to match on the exact teraflops and/or CU count?
 

I missed seeing what was on Github prior to its being removed, so I'm not sure what the context was. Scorpio is at 6 TF, which is substantially higher than 4 TF (not sure how much the + is).
If the question is whether such a GPU can run existing software with the same settings for all titles, I'd suspect not. However, Microsoft's method of BC has more intervention by their BC group and tools, which could optimize some things. If the other rumors about Lockhart targeting more modest resolutions or streaming are true, then Microsoft could potentially adjust more problematic titles.
Some of the barriers experienced for backwards compatibility for prior generations are likely less of a problem for the current gen, as I assume Microsoft has had enough experience with hitting licensing problems that might prevent the recompilation of existing software and made agreements for the current gen with that in mind.

Maybe Sony's emphasis on hardware BC is a combination of its weaker software capabilities and a longer history of older IPs that might now be caught in licensing limbo.
 
Even if it could, I’m sure they’ll want to differentiate the BC according to the SKU level. Premium feature for previously premium SKU.

Durango/Edmonton -> Lockhart
Scorpio aware -> Anaconda

Maybe throw in some measure of AF while boost takes care of dynamic res/framerate.

And if they are able, OS-level SSAA, although it could be even more troublesome than x-enhanced BC considering the increased complexity in rendering engines over the years.
 
Some of the barriers experienced for backwards compatibility for prior generations are likely less of a problem for the current gen, as I assume Microsoft has had enough experience with hitting licensing problems that might prevent the recompilation of existing software and made agreements for the current gen with that in mind.

Thanks.

I am looking at the benchmarks for the 5500 XT. At 4.8 TF, it's slightly faster than the 6 TF 580 in the majority of the benchmarks.
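The paper specs line up with that; here's a quick sanity-check calculation using the commonly quoted shader counts and clocks (treat the exact clock figures as approximate):
Code:
#include <stdio.h>

/* FP32 TFLOPS = shader ALUs x 2 ops per FMA x clock (GHz) / 1000.
   Shader counts and clocks below are the commonly quoted retail specs. */
static double tflops(int shaders, double clock_ghz)
{
    return shaders * 2.0 * clock_ghz / 1000.0;
}

int main(void)
{
    printf("RX 5500 XT (game clock ~1717 MHz): %.1f TF\n", tflops(1408, 1.717));
    printf("RX 580    (boost clock ~1340 MHz): %.1f TF\n", tflops(2304, 1.340));
    return 0;
}
A ~4.8 TF RDNA part keeping pace with a ~6.2 TF Polaris part is the per-flop efficiency gap that makes the Lockhart-vs-One X BC comparison less clear-cut than the raw teraflop numbers suggest.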
 
I was under the impression that there was some transcompilation for the CPU and GPU code, at least as indicated from the following article that mentioned translating x86 and translating shaders:
https://www.eurogamer.net/articles/...x-one-x-back-compat-how-does-it-actually-work
Yes, you are correct as per their article. I guess a recompile would count as a code change in that sense, at least at the assembly level. Curious how they managed to transcompile so many titles.
 
I maybe could have worded it better, but essentially they were selling to the Kinect crowd and had lost momentum in releasing decent exclusives, whereas the PS3 had not.

I agree about the exclusives drying up, but they still had the usual Halo, Gears, and Forza games coming out and plenty of 3rd-party big hitters. I think the platform was doing well till the end.

Conversely, Sony had turned the car-crash PS3 completely around and were the ones with true momentum going into this gen.

People overemphasize Sony's self-imposed difficulties on this matter (excluding financials, lol). They had fixed their problems before the platform turned one year old by launching the 40GB model for $399 in 2007. I'm not saying the launch fumble didn't count, but it's overemphasized. Momentum can be gauged over different timelines. Sony halved its install base from the PS2 whereas MS more than tripled theirs; that has to be counted as momentum. People can blame the launch and the high price of the PS3 as the reasons, but as stated those factors existed only for a short time, even if the echoes of them were heard for long. The 360 earned its place.

Either way, the PS3 sold more than the 360 the whole time it was out, so it's hard to really take your initial comment the way it was intended.

I think it's more than debatable whether that argument is true for 2010-2011. That, however, or even the precise momentum factors in 2013, was not my base argument. I was saying that MS had built enough momentum and presence that, had they launched an Xbox One that was more like their other consoles, in essence a price-competitive 2.7-3 TF machine, it would have had enough gravity to vastly transform the marketshare numbers we are seeing today.

IMO they could and should have done it, but their focus was on other things. The people that made those decisions are now long gone. MS has been back on track for a while, but during MS's fumble Sony secured a very strong position in the market and did a lot of things right, so it's still hard to compete with them.
 
PSP, yes, maybe. PSX? I wouldn't call it any special achievement. :p
Which is why I didn't make anything of PSX emulation. ;)

EDIT: Btw what is that list? It says PS2 Classics Emulator Compatibility List (on PS4).
Jailbroken PS4s can run PS2 games on the emulator, so we're seeing the emulator's capabilities 'raw' in this list of games people have tried.

Is this an emulation compatibility list in general regardless of platform?
No, it's the results from the PS4's internal PS2 emulator, with games unofficially played on it. BGDA is noted as unplayable, so when someone tried that game, the PS4 emulator failed. The same game works well on PCSX2, as do many, many others, on an unofficial emulator. It was speculated that Sony, with its insider information, could do better than a bunch of enthusiasts reverse engineering and the like. Sony possibly could, but they haven't wanted to with the PS4.
 
I ran across one suggestion that I cannot verify that Sony had lower-level options for allocating wavefronts to CUs, or specific numbers of CUs, and there may have been assumptions built into the code about the maximum number available. Code that launched up to the maximum number of available CUs might have included kernels past the PS4's 18 that had some dependence or synchronization issue that didn't manifest because the original PS4 couldn't launch the later wavefronts at the same time.
Possibly, bitfields developed for 18 CUs or counter variables that only assumed 18 simultaneous updaters might hit ambiguities if the hardware can jump to a point beyond their representations. The Pro also jumps over a power of 2 when it went to 36, although that seems like there would be some very specific code to get nailed by that.
Stuff like that I'd have thought would be better off handled in customised hardware, such as having a PS4 execution mode that maps a whole load of stuff differently. Even if so, the wavefronts are executed completely differently on RDNA than on GCN, so the code flow will be different. If the wrong number of CUs is enough to break stuff, wouldn't the change to RDNA be even more impactful for code working at that low a level? And what are the GCN-compatibility features of RDNA, and are they removed in RDNA 2? :-?
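One concrete example of 'executed completely differently' is wave width: GCN always runs 64-wide wavefronts with a 64-bit EXEC/ballot mask, while RDNA natively prefers 32-wide waves and keeps a wave64 mode around, which is exactly the kind of GCN-compatibility hook being asked about. A tiny host-side sketch (illustrative only, not GPU code) of why a baked-in wave width matters:
Code:
#include <stdio.h>

/* How many waves does a dispatch of `threads` work-items need?
   Any shader or scheduling logic that hard-codes 64 here would
   mis-count occupancy, LDS partitioning, or ballot-mask bits
   if the hardware actually runs 32-wide waves. */
static unsigned waves_needed(unsigned threads, unsigned wave_width)
{
    return (threads + wave_width - 1) / wave_width;
}

int main(void)
{
    const unsigned threads = 1000;  /* arbitrary example dispatch */
    printf("GCN  (wave64): %u waves\n", waves_needed(threads, 64)); /* 16 */
    printf("RDNA (wave32): %u waves\n", waves_needed(threads, 32)); /* 32 */
    return 0;
}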
 
My prediction is they show TLOU II running in the PS5's BC boost mode at 4K/60fps on an HDMI 2.1 A9H OLED and call it a day... :sleep:
Most likely a montage of current and upcoming games, with a good chance of a new reveal. They are, after all, rebooting their combined content offering (for the umpteenth time), so game content will be the order of the day. The one hardware-related item I'm keeping an eye out for is a PSVR2 announcement. Any PS5 info will likely be of the "With the launch of our next-generation console this year..." variety.

Other than that, I'm looking for any perceived panic on Sony's part, which could show up as all divisions coming to the rescue of the distressed mothership. This, though, would be a very subtle difference from divisions simply jumping on the PS coattails, so it will no doubt generate many pages of posts here to divine the difference :).
 