Why has the Xbox not been emulated up to now?

People can read about the various explanations. Certainly the CPU wasn't 64-bit. Curiously, as a 68000 it gets touted as a 32-bit processor, while the ST and Amiga with the same CPU were considered 16-bit machines. Some Jaguar processors were 64-bit though. Ultimately, the bits metric was considered important, like the old MHz metric, until finally something laid the smackdown and people moved on (to find some other number to crow about: mega/gigaflops, I'm looking at you!).

I don't think anyone counted the 68k in the bits calculation for the Jaguar. It had two 32-bit RISC processors labelled GPU and DSP. They had to manage code/data in a limited amount of dedicated scratchpad RAM, but they were otherwise fully autonomous CPUs. If anything, the 68k held the system back, because as long as it was running it wasted a lot of bus time. Games were literally better off spending most of the time with the 68k put to sleep. I think there were plans to include a better 68k descendant that had a cache to alleviate this problem, but they didn't work out.

The reason Jaguar was called 64-bit (not that it's a good one) is that it had a 64-bit shared system bus and custom graphics processors that could perform full 64-bit accesses on it. The 32-bit RISCs could also do a dual register load/store with 64-bit accesses.

Hardware guys might consider the 68000 a 16-bit CPU: the data bus is 16-bit, and the ALU handled 16 bits at a time (32-bit ops took twice as long). However, the registers were 32 bits and the programming model is 32-bit, hence the architecture is 32-bit.

The physical bit width is an implementation detail. Otherwise we'd have:
1. The 8088 (from the original PC) would be an 8-bit CPU while the 8086 would be 16-bit.
2. The 386SX would be a 16-bit CPU.
3. The Exynos 7420's cores (4x Cortex-A57 + 4x Cortex-A53) would be 32-bit processors.

I.e., plainly wrong.
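To put that in emulator terms: a core can present the full 32-bit programming model even though every memory access goes over a 16-bit bus; the narrower bus only shows up as extra cycles, not in the values the program sees. Here's a minimal C sketch of the idea (the function names and the flat 4-cycles-per-word charge are my own illustrative assumptions, not lifted from any real 68000 core):

```c
#include <stdint.h>
#include <stdio.h>

/* Toy 68000-style memory interface: the bus only ever moves 16 bits,
 * and each word access is charged a fixed number of clocks. */
#define CYCLES_PER_WORD_ACCESS 4

static uint8_t ram[0x10000];
static uint64_t cycles;

static uint16_t bus_read16(uint32_t addr)
{
    cycles += CYCLES_PER_WORD_ACCESS;
    /* 68000 is big-endian: high byte first. */
    return (uint16_t)((ram[addr & 0xFFFF] << 8) | ram[(addr + 1) & 0xFFFF]);
}

/* A longword (.l) read is a single operation on a 32-bit register as far
 * as the programming model is concerned, but the implementation performs
 * two 16-bit bus accesses, so it simply costs twice as many cycles. */
static uint32_t read32(uint32_t addr)
{
    uint32_t hi = bus_read16(addr);
    uint32_t lo = bus_read16(addr + 2);
    return (hi << 16) | lo;
}

int main(void)
{
    ram[0x100] = 0x12; ram[0x101] = 0x34;
    ram[0x102] = 0x56; ram[0x103] = 0x78;
    printf("read32 -> 0x%08X in %llu cycles\n",
           read32(0x100), (unsigned long long)cycles);
    return 0;
}
```

Either way the program just gets a 32-bit value back; only the cycle count betrays the narrower bus, which is exactly why the physical width is an implementation detail.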

Cheers

Fair points on the 8088 and 386SX, but the Cortex-A57 has (two) 64-bit ALUs and a 128-bit load/store path to L1 cache (the load/store-pair instructions have 1/cycle throughput even with 64-bit registers), so I'm not sure what you'd be referring to. Don't know about the A53, but it probably also has 64-bit ALUs. That doesn't apply to the whole Exynos 7420, but that's an SoC, not a CPU.
 
I don't really want to beat a dead horse here but what games and what effects could only work if the bus width is emulated properly on a per-cycle basis?
Pretty much all of them?
 
Abstracting distinct graphics hardware in a PS2 is the easy bit because you're working with graphics primitives. You can deconstruct that and reconstruct it on a different renderer, throwing in additional features like AA/AF or targeting a different resolution. The PS2 is shifting 48 GB/sec, and the distinct elements need the right data in the right place at the right time. On a real PS2, if it takes 20 clocks to throw data into DRAM to work on, then the emulator has to do that in 20 emulated cycles as well. Only the system isn't moving data around in mere 32 or 64 bits; it's moving 2,560 bits of data in one gulp.
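Concretely, what a timing-aware emulator ends up doing is charging emulated cycles to bulk transfers rather than letting them happen for free: the host copies the data instantly, but dependent units shouldn't get to consume it until the emulated clock has advanced by the real cost. A rough sketch of that idea in C (the names, bus width and per-beat cost are invented for illustration; this is not how PCSX2 or any real emulator is structured):

```c
#include <stdint.h>
#include <string.h>
#include <stdio.h>

/* Illustrative DMA channel: it copies a block of data and reports the
 * emulated cycle at which the transfer would really have finished, so a
 * scheduler can hold back anything that depends on the data until then. */

#define BUS_WIDTH_BYTES 16   /* pretend the bus moves 16 bytes per beat */
#define CYCLES_PER_BEAT  1

typedef struct {
    uint64_t now;            /* current emulated cycle */
} scheduler_t;

static uint64_t dma_copy(scheduler_t *sched, uint8_t *dst,
                         const uint8_t *src, size_t len)
{
    size_t beats = (len + BUS_WIDTH_BYTES - 1) / BUS_WIDTH_BYTES;

    memcpy(dst, src, len);                           /* host copy is instant... */
    sched->now += (uint64_t)beats * CYCLES_PER_BEAT; /* ...emulated time isn't  */
    return sched->now;       /* earliest cycle the data may be consumed */
}

int main(void)
{
    scheduler_t sched = { 0 };
    static uint8_t vram[4096], texture[4096];

    uint64_t ready_at = dma_copy(&sched, vram, texture, sizeof(texture));
    printf("transfer completes at emulated cycle %llu\n",
           (unsigned long long)ready_at);
    return 0;
}
```

The copy itself is trivial; the hard part is making every consumer of that data actually respect the returned time.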
 
Plenty of framebuffer effects still aren't emulated properly in PCSX2 (if at all); Haunting Ground and MGS3, for example. Cycle-accurate PS2 emulation at playable speed seems eons away.
 
Abstracting distinct graphics hardware in a PS2 is the easy bit because you're working with graphics primitives. You can deconstruct that and reconstruct it on a different renderer, throwing in additional features like AA/AF or targeting a different resolution. The PS2 is shifting 48 GB/sec, and the distinct elements need the right data in the right place at the right time. On a real PS2, if it takes 20 clocks to throw data into DRAM to work on, then the emulator has to do that in 20 emulated cycles as well. Only the system isn't moving data around in mere 32 or 64 bits; it's moving 2,560 bits of data in one gulp.

Okay, but what exact effects (specific examples, please) require cycle-accurate emulation and emulation of the PS2's exact bus configuration to work? You're not answering this; you just keep insisting that if you don't do it exactly like the PS2 does, it'll fall apart.

I haven't emulated PS2, but I can tell you that on PS1, timing-accurate emulation of the GPU (how it interfaces with its SDRAM, how the caches work, what the timing overheads are on a per-primitive and per-line basis) just doesn't play into whether what's on the screen looks correct at all.
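For what it's worth, the timing model and the rendering don't even have to live in the same code: the renderer produces the pixels however it likes, and a separate cost function estimates how long the real GPU would have been busy so the rest of the system gets paced roughly right. A toy C sketch of that separation (the constants are made up; they are not PCSX-ReARMed's or any real PS1 GPU's numbers):

```c
#include <stdint.h>
#include <stdio.h>

/* Toy split between correctness and timing: draw_flat_rect() produces the
 * pixels, rect_cycle_cost() only estimates how long the hardware would have
 * been busy.  The image never depends on the estimate. */

#define FB_W 1024
#define FB_H 512
static uint16_t framebuffer[FB_W * FB_H];

static void draw_flat_rect(int x, int y, int w, int h, uint16_t color)
{
    for (int j = y; j < y + h; j++)
        for (int i = x; i < x + w; i++)
            framebuffer[j * FB_W + i] = color;
}

/* Invented cost model: a fixed setup charge, plus a per-line charge, plus a
 * per-pixel fill charge. */
static uint32_t rect_cycle_cost(int w, int h)
{
    const uint32_t setup = 32, per_line = 4, per_pixel = 1;
    return setup + (uint32_t)h * (per_line + (uint32_t)w * per_pixel);
}

int main(void)
{
    draw_flat_rect(100, 50, 64, 32, 0x7FFF);
    printf("estimated GPU busy time: %u cycles\n", rect_cycle_cost(64, 32));
    return 0;
}
```

If the estimate is off, the game might run a little fast or slow, but the framebuffer contents come out the same, which is why timing accuracy and visual correctness are largely separate problems on PS1.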

Plenty of framebuffer effects still aren't emulated properly in PCSX2 (if at all); Haunting Ground and MGS3, for example. Cycle-accurate PS2 emulation at playable speed seems eons away.

And which of these effects requires cycle-accurate emulation? Which of them don't work with the software renderer? Can you give an explanation why, or are you assuming that it's because of timing? I was using GPU rendering as an extreme example where a lot of things work; I know it doesn't cover everything (one of the reasons why I've done software renderers for PS1 and DS), but it shows that it gets you a lot of the way a lot of the time. DSoup said outright that pretty much all games only work correctly with cycle-accurate GPU emulation.
 
And which of these effects requires cycle-accurate emulation?

They don't. Those two sentences were meant to be separate. I was just thinking about how cycle-accurate SNES, TG16 and Genesis emulator projects have come together in the last few years and how much processing power they need to achieve that. Given that PS1, Saturn and N64 emulators still aren't even close to mature, cycle-accurate PS2 emulation at playable frame rates seems so far away.
 
While there might be some technical challenges, I think by far the biggest reason for not having an Xbox emulator on PC is the lack of motivation: it's by far the console with the fewest exclusive games, and almost every single game is playable on PC, PS2, GC or DC. As for an official emulator on Xbox One, it's probably some of that too (though less so; playing the Xbox version of GTA3 on Xbox One, like what happened with PS2 on PS4, would be valid), but maybe also licensing issues. Still, they made it happen on the 360, and with the superior XBone hardware and more compatible CPU it should be easy to have games running at full speed and higher resolution. It's a shame it's not a thing.
 
Maybe they did; it was around the same time as UltraHLE. Not sure why they'd have bothered, seeing as Project Unreality had made very little progress; AFAIK it only showed an opening framebuffer logo from one commercial game. If Nintendo was aggressively going after emulators with legal threats, I think it was at best limited to N64 emulators over a year or two.



Emulators for GBA (gpSP), PC-Engine (Temper), Nintendo DS (DraStic), and the GPU part of a PS1 emulator (PCSX-ReARMed). They're all targeting mobile platforms and ARM in particular. I've done some other stuff that didn't really merit releases.
In which programming language did you write them? Cheers!
 
Just like Panzer Dragoon Orta, Gunvalkyrie had a lot of nice detail hidden by low resolution. No longer!


MS really needs to get this on the BC program (but also with control remapping ;)).
 