Why is that interesting for BC? RDTSC nowadays often runs off a separate reference clock rather than counting actual CPU cycles. If the CPU is fixed at 1.6 GHz in the PS4, it will be the same in the 4Pro, since it's the same processor. Or is that indicative of some internal working? I'm not sure what AMD's RDTSC is driven by.
At this point, modern hardware purposefully keeps RDTSC ticking at a fixed reference rate separate from the CPU clock. The Pentium M was one generation that had variable clocks but no fixed RDTSC tick rate, which caused problems. Later generations moved to counters that do not increment based on a core's clock.
I think AMD's modern chips use a northbridge-based fixed increment. Not so coincidentally, AMD's introduction of HSA had a requirement for a globally visible and monotonically increasing timing method, which is one of the few elements that HSA hardware-compliant architectures would need to have even as they discard many of the software or platform elements of HSA.
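For what it's worth, whether the TSC is of the fixed-rate ("invariant") variety is advertised through CPUID. A minimal sketch in C, assuming a GCC/Clang toolchain on x86-64 (the leaf 0x80000007 / EDX bit 8 encoding is documented in the Intel and AMD manuals):

```c
#include <stdio.h>
#include <cpuid.h>      /* __get_cpuid (GCC/Clang builtin wrapper) */
#include <x86intrin.h>  /* __rdtsc */

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    /* CPUID leaf 0x80000007: Advanced Power Management Information.
       EDX bit 8 = "invariant TSC": the counter ticks at a constant
       rate regardless of the core's current frequency or P/C-state. */
    if (__get_cpuid(0x80000007, &eax, &ebx, &ecx, &edx))
        printf("Invariant TSC: %s\n", (edx & (1u << 8)) ? "yes" : "no");

    /* Two raw reads of the counter; on an invariant-TSC part the delta
       reflects reference-clock ticks, not executed core cycles. */
    unsigned long long t0 = __rdtsc();
    unsigned long long t1 = __rdtsc();
    printf("TSC delta between back-to-back reads: %llu\n", t1 - t0);
    return 0;
}
```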
Jaguar can do everything a P3 (with SSE) could; it's only a matter of speed. You can't optimize your code so deeply for a P3 that it couldn't run on Jaguar (barring bugs), because even assembly instructions are translated into micro-ops by the CPU itself. It's like an API in the OS: the API shouldn't change, just get expanded. What can change is the speed of the calculation. There can be some variations (bugs), but those can usually be resolved via a firmware/BIOS update.
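To illustrate that point, here's a short hedged sketch in C (the function and data are made up for the example, not taken from any actual Xbox code): the same SSE intrinsics a P3-era compiler would turn into SSE instructions assemble to the same opcodes for Jaguar, which simply decodes them into its own micro-ops at its own speed.

```c
#include <stdio.h>
#include <xmmintrin.h>  /* SSE intrinsics: present on P3-era parts and on Jaguar */

/* Adds four packed floats. The ADDPS encoding emitted here is identical
   whether the executing part is a Pentium III or a Jaguar core; only the
   internal micro-ops and the throughput differ. */
static void add4(const float *a, const float *b, float *out)
{
    __m128 va = _mm_loadu_ps(a);
    __m128 vb = _mm_loadu_ps(b);
    _mm_storeu_ps(out, _mm_add_ps(va, vb));
}

int main(void)
{
    float a[4] = {1.0f, 2.0f, 3.0f, 4.0f};
    float b[4] = {5.0f, 6.0f, 7.0f, 8.0f};
    float r[4];
    add4(a, b, r);
    printf("%g %g %g %g\n", r[0], r[1], r[2], r[3]);
    return 0;
}
```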
That's why MS has DirectX (Direct2D/3D) for GPUs and Sony has GNM (and so on): there is no equivalent of x86 at the hardware level for GPUs, so the driver decides how the commands/data will be sent to the GPU.
I'm replying somewhat late after watching the DigitalFoundry video on Xbox One's backwards compatibility (referenced in https://forum.beyond3d.com/posts/2013012/). Without a transcript, and pending a full article, I interpreted some statements on original Xbox compatibility to mean that there's translation of the x86 binaries as well.
One reason I could see for this--besides perhaps generating more optimal code to compensate for other overhead--is that while it's true that Jaguar should be able to run any code the P3 could, x86-64's compatibility with 32-bit x86 relies on mutually exclusive hardware modes. x86 is not forward-compatible with x86-64 at an architectural level, and the newer architecture was not designed to let the two function concurrently in the same context.
A definition of backwards compatibility that doesn't include active platform intervention wouldn't work here. x86-64 repurposes a number of opcodes (the REX prefix bytes are single-byte INC/DEC ALU instructions in 32-bit mode), drops some of the P3's own legacy backwards-compatibility elements (unclear how often Xbox code would have hit those), and assumes system-level changes, since its addressing and conventions do not align completely with 32-bit mode.
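To make the opcode-reuse point concrete, here's a small illustrative C snippet (the byte sequence is just an example I picked, not anything from actual Xbox code): the same three bytes decode as two instructions in 32-bit mode but as one REX-prefixed instruction in 64-bit mode, so anything translating or executing the code has to know which mode it was written for.

```c
#include <stdio.h>

/* The same machine-code bytes mean different things in the two modes:
 *
 *   40 01 C8
 *
 * 32-bit mode:  40      -> inc eax
 *               01 C8   -> add eax, ecx
 *
 * 64-bit mode:  40      -> REX prefix (no bits set)
 *               01 C8   -> add eax, ecx
 */
static const unsigned char code[] = { 0x40, 0x01, 0xC8 };

int main(void)
{
    for (unsigned i = 0; i < sizeof code; ++i)
        printf("%02X ", code[i]);
    printf("\n");
    return 0;
}
```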
If it were just a case of WOW64 (although even that is somewhat weaker than the compatibility some consoles have had) or of dedicating the system to one mode or the other, that would be fine. However, neither console vendor is likely to recode its platform, create an even more complex system context, or give up security/modern features to switch Jaguar down into 32-bit mode.
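As a rough analogy for what the WOW64 route looks like on the PC side, here's a small hedged C sketch using the documented Win32 call IsWow64Process, which reports whether a 32-bit process is being hosted by the WOW64 layer on a 64-bit OS (this is just the standard Windows API, not anything console-specific):

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    BOOL isWow64 = FALSE;

    /* IsWow64Process reports whether this process is a 32-bit process
       running under the WOW64 layer on a 64-bit Windows install. */
    if (IsWow64Process(GetCurrentProcess(), &isWow64))
        printf("Running under WOW64: %s\n", isWow64 ? "yes" : "no");
    else
        printf("IsWow64Process failed: %lu\n", GetLastError());

    return 0;
}
```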
Beyond that, the platform may not even be able to readily do so, since there's a good chance that the ongoing translation and emulation activity for the GPU's compatibility relies on 64-bit mode (memory addressing, protection, VM, instruction/hardware emulation, the extra registers, modern extensions, etc.).