My prediction is they show TLOU II running in PS5's BC Boost mode at 4K/60fps on an HDMI 2.1 A9H OLED and call it a day...
*Here is a link to his lengthy credentials: he has helped work on numerous emulators, ports, and homebrew projects.
https://www.reddit.com/r/originalxbox/comments/7we5a7/hi_im_lantus_known_as_modern_vintage_gamer_these/
I meant I wasn't sure if that list was exclusively about PS2 emulation on PS4 or if it was emulation in general, with PS4 Classic emulation mentioned in between (hence why I wondered why it said unplayable, with no mention of PCSX2).

Baldur's Gate works fine on PCSX2? I even played Champions of Norrath (a more demanding title?) on PCSX2, no problems at all. Most games work fine on PCSX2; yes, you need to find the right settings, and it is sometimes a hassle if you want perfect, but 90% of the time it is 'perfect', even better than a real PS2: higher res, better FPS in some cases, filters and fixes, etc.
Yeah I wonder if one of the consoles won't have HDMI 2.1.
I think that's more of an architectural requirement for GCN. I questioned whether there would be 64 ROPs once the Pro's method of backwards compatibility was revealed, since GCN needs ROPs associated with any new shader engines, and it seemed like it would add complication for little gain to change that. The original PS4 half would need 32 ROPs, but there was no precedent for GCN to allow CUs to send things out of their shader engine and AMD's tiling method and historical practice only had symmetric ROP allocations.

36 CUs makes hardware BC with the Pro trivial.
Cerny put 64 ROPs in the Pro when there isn't even enough bandwidth for 32, just so they could get BC working via the butterfly layout. It's not a stretch of the imagination to say that the PS5 design might likewise be hindered by BC. At the very least they could have done 22 WGPs with 20 enabled, like the 5700 XT. Just my analysis.
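The bandwidth point can be sanity-checked with rough arithmetic. A sketch, assuming the commonly published PS4 Pro figures (~911 MHz GPU clock, 217.6 GB/s total memory bandwidth), which are not stated in the thread itself:

```python
# Back-of-envelope peak ROP write traffic on the PS4 Pro.
# Assumed figures: 64 ROPs, ~0.911 GHz GPU clock, 4-byte colour writes.
rops = 64
clock_ghz = 0.911
bytes_per_pixel = 4  # 32-bit colour, no blending or MSAA

peak_rop_write_gbs = rops * clock_ghz * bytes_per_pixel
print(round(peak_rop_write_gbs, 1))  # ~233.2 GB/s vs 217.6 GB/s total
```

Peak ROP writes alone would exceed the chip's entire memory bandwidth, so the extra 32 ROPs can never be fed for fill rate; their plausible purpose is the butterfly BC layout.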
Why? What is it about RDNA that lets it run GCN code perfectly as long as you have an identical number of CUs? This 36 number is understandably being fixated on as related, but there should be a far better technical argument for BC than the numbers matching: identifying what the problems are in running PS4 code on RDNA, and what the solutions are.
I'm not saying it isn't good for BC, but I want to see technical justification. Why can't you just throw an arbitrary number of CUs at the problem and have the GPU schedulers handle the workloads?
For backwards compatibility on a different microarchitecture, I'm actually curious as to whether having an exact match in counts and clocks would be sufficient. CU count might require a match for other reasons, but unless the hardware is painstakingly matching the cycle behavior of the original hardware, I'd expect that additional clock speed at each BC level would be needed to help get over any areas where hardware doesn't fully match.

That was my thought as well, but then why run it at 2.0 GHz? And why have a comment in one of the tests with 18 WGPs that the "full chip is used", when that was not a BC test? Then there is the 288 GT/s number, which again clearly indicates 36 CUs.
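The 288 GT/s figure falls straight out of texture-unit arithmetic. A quick check, assuming the standard 4 texture units per CU on GCN/RDNA and the 2.0 GHz clock from the leak being discussed:

```python
# Texture fillrate = CUs x texture units per CU x clock (GHz) in GT/s.
texture_units_per_cu = 4   # standard for both GCN and RDNA CUs
clock_ghz = 2.0            # clock from the leaked test

implied_cus = 288 / (texture_units_per_cu * clock_ghz)
print(implied_cus)  # 36.0 -- 288 GT/s at 2.0 GHz implies 36 CUs
```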
I was under the impression that there was some transcompilation for the CPU and GPU code, at least as indicated from the following article that mentioned translating x86 and translating shaders:

The hardware doesn't exist to run the program, so it is not strict backwards compatibility. But the downloads don't modify game code; they run a separate 360 OS with modified drivers to run the game. The game code itself is left unchanged, so by and large I would not call this an illusion. Original code is unchanged; what needs to be added is a modified 360 wrapper to support running the title.
I think that's more of an architectural requirement for GCN. I questioned whether there would be 64 ROPs once the Pro's method of backwards compatibility was revealed, since GCN needs ROPs associated with any new shader engines, and it seemed like it would add complication for little gain to change that. The original PS4 half would need 32 ROPs, but there was no precedent for GCN to allow CUs to send things out of their shader engine and AMD's tiling method and historical practice only had symmetric ROP allocations.
I ran across one suggestion that I cannot verify that Sony had lower-level options for allocating wavefronts to CUs, or specific numbers of CUs, and there may have been assumptions built into the code about the maximum number available. Code that launched up to the maximum number of available CUs might have included kernels past the PS4's 18 that had some dependence or synchronization issue that didn't manifest because the original PS4 couldn't launch the later wavefronts at the same time.
Possibly, bitfields developed for 18 CUs or counter variables that only assumed 18 simultaneous updaters might hit ambiguities if the hardware can jump to a point beyond their representations. The Pro also jumps over a power of 2 when it went to 36, although that seems like there would be some very specific code to get nailed by that.
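As a purely hypothetical sketch of how such an assumption could bite (the function and field width here are invented for illustration, not anything documented for the PS4): a CU-availability mask sized in the PS4 era works fine up to 32 CUs and silently truncates beyond that.

```python
# Hypothetical: a CU-availability bitmask stored in a 32-bit field.
# Fine for the PS4's 18 CUs; a chip exposing 36 silently loses CUs 32-35.
def cu_mask_32bit(num_cus: int) -> int:
    mask = 0
    for cu in range(num_cus):
        mask = (mask | (1 << cu)) & 0xFFFFFFFF  # truncated to 32 bits
    return mask

print(bin(cu_mask_32bit(18)).count("1"))  # 18 -- all CUs represented
print(bin(cu_mask_32bit(36)).count("1"))  # 32 -- four CUs fall off the end
```

This is exactly the "jumping over a power of 2" hazard: 18 never stresses the representation, 36 does.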
I would have expected that code developed for the Pro would have been given the guidance to more explicitly check its hardware limits, so why there would be a similar need for the PS5 to match on the upper end isn't as clear. I could understand it if Sony made its guidance simpler by promising that they could assume at least 36.
I believe the Xbox One had some options for controlling CU allocation strategies, so perhaps the PS4 had some more low-level method that could get them in trouble.
For backwards compatibility on a different microarchitecture, I'm actually curious as to whether having an exact match in counts and clocks would be sufficient. CU count might require a match for other reasons, but unless the hardware is painstakingly matching the cycle behavior of the original hardware, I'd expect that additional clock speed at each BC level would be needed to help get over any areas where hardware doesn't fully match.
Navi does drop a number of instructions, which if any were used would need additional hardware put back into Navi or some kind of trap and emulate method that would cost clock cycles. One potential area that well-optimized GCN vector code could hit is a stream of dependent vector instructions that managed to hit high utilization on Southern Islands. In its first generation, RDNA is 25% longer-latency for dependent instructions, which is one place where bumping the clock would be a ready option for compensating.
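That 25% figure maps directly onto a clock bump for fully dependent chains. A sketch, assuming the commonly cited 4-cycle (GCN) vs 5-cycle (first-gen RDNA) dependent vector ALU cadences, and using the base PS4's 800 MHz GPU clock as the example:

```python
# To keep a fully dependent instruction chain at the same wall-clock
# rate, the clock must rise in proportion to the per-instruction latency.
GCN_DEP_CYCLES = 4    # dependent vector op cadence on GCN
RDNA_DEP_CYCLES = 5   # first-gen RDNA: 25% longer

def matching_clock_mhz(gcn_clock_mhz: float) -> float:
    return gcn_clock_mhz * RDNA_DEP_CYCLES / GCN_DEP_CYCLES

print(matching_clock_mhz(800))  # 1000.0 -- 800 MHz GCN needs ~1 GHz RDNA
```

Real shaders are rarely 100% dependent chains, so this is a worst-case bound, but it shows why a clock bump is the easy compensation lever.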
I was under the impression that there was some transcompilation for the CPU and GPU code, at least as indicated from the following article that mentioned translating x86 and translating shaders:
https://www.eurogamer.net/articles/...x-one-x-back-compat-how-does-it-actually-work
Thanks for the great response.
From the GitHub leak, do you think the 4+ TF Navi Lockhart is capable of Xbox One X BC? From the leak it doesn't seem like it. Do they need to match on the exact teraflops and/or CU count?
Some of the barriers to backwards compatibility in prior generations are likely less of a problem for the current gen, as I assume Microsoft has hit enough licensing problems that could prevent the recompilation of existing software to have made agreements for the current gen with that in mind.
Yes, you are correct as per their article. I guess a recompile would count as a code change in that sense, at least at the assembly level. Curious how they managed to transcompile so many titles.

I was under the impression that there was some transcompilation for the CPU and GPU code, at least as indicated from the following article that mentioned translating x86 and translating shaders:
https://www.eurogamer.net/articles/...x-one-x-back-compat-how-does-it-actually-work
I maybe could have worded it better, but essentially they were selling to the Kinect crowd and had lost momentum in releasing decent exclusives, whereas PS3 had not.
Conversely, Sony had turned the car-crash PS3 completely around and were the ones with true momentum going into this gen.
Either way, the PS3 outsold Microsoft's console for the entire time it was out, so it's hard to take your initial comment the way it was intended.
Which is why I didn't make anything of PSX emulation.

PSP, yes, maybe. PSX? I wouldn't call it any special achievement.
Jailbroken PS4s can run PS2 games on the emulator, so we're seeing the emulator's capabilities 'raw' in this list of games people have tried.

EDIT: Btw, what is that list? It says PS2 Classics Emulator Compatibility List (on PS4).
No, it's the internal PS2 emulator's results with games unofficially played on it. BGDA is noted as unplayable, so when someone tried that game, the PS4 emulator failed. The same game works well on PCSX2, as do many, many others, on an unofficial emulator. It was speculated that Sony, with its insider information, could do better than a bunch of enthusiasts reverse engineering and the like. Sony possibly could, but they haven't wanted to with the PS4.

Is this an emulation compatibility list in general, regardless of platform?
Stuff like that I'd have thought would be better off handled in customised hardware, such as having a PS4 execution mode that maps a whole load of stuff differently. Even if so, wavefronts are executed completely differently on RDNA than on GCN, so the code flow will be different. If the wrong number of CUs is enough to break stuff, wouldn't the change to RDNA be even more impactful for code working at that low a level? And what are the GCN-compatibility features of RDNA, and are they removed in RDNA 2?

I ran across one suggestion that I cannot verify that Sony had lower-level options for allocating wavefronts to CUs, or specific numbers of CUs, and there may have been assumptions built into the code about the maximum number available. Code that launched up to the maximum number of available CUs might have included kernels past the PS4's 18 that had some dependence or synchronization issue that didn't manifest because the original PS4 couldn't launch the later wavefronts at the same time.
Possibly, bitfields developed for 18 CUs or counter variables that only assumed 18 simultaneous updaters might hit ambiguities if the hardware can jump to a point beyond their representations. The Pro also jumps over a power of 2 when it went to 36, although that seems like there would be some very specific code to get nailed by that.
that would be extremely boring, I’d rather not see anything at all
Most likely a montage of current and upcoming games, with a good chance of a new reveal. They are, after all, rebooting their combined content offering (for the umpteenth time), so game content will be the order of the day. The one hardware-related item I'm keeping an eye out for is a PSVR2 announcement. Any PS5 info will likely be of the "With the launch of our next-generation console this year..." variety.

My prediction is they show TLOU II running in PS5's BC Boost mode at 4K/60fps on an HDMI 2.1 A9H OLED and call it a day...