Xenon, PS3, Revolution ...

^^ The point aaaaa00 is trying to make is that desktop CPUs need out-of-order execution because of unoptimised, crappy, spaghetti code. But for a specialised games console, transistors don't necessarily need to be wasted on out-of-order logic in the CPU. Finely tuned code can be crafted for in-order CPUs, thereby saving those transistors for cost, or spending them on more performance.

In this case, all three consoles have a good chance of sharing tech from IBM's in-order PPC core, for CELL's PPE, Xenon's cores and Rev's (but without the VMX units, IMO).
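
For illustration, a toy sketch of my own (nothing from IBM or any dev kit) of the kind of hand-scheduling "finely tuned code" implies. The naive dot product is one long serial chain of loads, multiplies and adds, so a simple in-order pipeline spends most of its time stalled; splitting it into four independent accumulators gives the pipeline something to issue every cycle. An out-of-order core can hide some of that latency on its own, which is exactly the transistor cost a console CPU can skip if the code is scheduled like this by hand:

Code:
// Toy sketch: hand-scheduling for an in-order pipeline (illustrative only).
#include <cstdio>
#include <vector>

// Naive: every iteration is one dependency chain, so an in-order core
// stalls on the load/multiply before it can do the add.
float dot_naive(const float* a, const float* b, int n)
{
    float s = 0.0f;
    for (int i = 0; i < n; ++i)
        s += a[i] * b[i];
    return s;
}

// Hand-scheduled: four independent accumulators keep the pipeline busy
// while earlier loads and FP ops are still completing.
float dot_unrolled(const float* a, const float* b, int n)
{
    float s0 = 0, s1 = 0, s2 = 0, s3 = 0;
    int i = 0;
    for (; i + 3 < n; i += 4) {
        s0 += a[i + 0] * b[i + 0];
        s1 += a[i + 1] * b[i + 1];
        s2 += a[i + 2] * b[i + 2];
        s3 += a[i + 3] * b[i + 3];
    }
    for (; i < n; ++i)   // leftover elements
        s0 += a[i] * b[i];
    return (s0 + s1) + (s2 + s3);
}

int main()
{
    std::vector<float> a(1024, 1.0f), b(1024, 2.0f);
    std::printf("%f %f\n", dot_naive(a.data(), b.data(), 1024),
                dot_unrolled(a.data(), b.data(), 1024));
}

(The same trick applies whether the unrolling is done by the programmer or by the compiler; the point is that on an in-order core somebody has to do it.)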
 
Andy said:
aaaaa00 said:
Console CPUs tended to have a different focus and different performance characteristics. (Xbox used a desktop CPU - Pentium 3, and it was the only console last generation to do so.)

Actually, the processor in the XBox is a custom Pentium 3/Celeron hybrid, so it wasn't completely a desktop CPU, as it was customised for the XBox.

There were 2 P3 cores, the Coppermine and the Tualatin, not counting the cache/bus-downgraded Celeron variations. The XCPU being yet another variation of the cache/bus configuration, it had nothing not found in the rest of the P3 family. So I think one can safely put the XCPU with the rest of the desktop (read: multi-purpose) line.
 
Jaws said:
^^ The point aaaaa00 is trying to make is that desktop CPUs need out-of-order execution because of unoptimised, crappy, spaghetti code. But for a specialised games console, transistors don't necessarily need to be wasted on out-of-order logic in the CPU. Finely tuned code can be crafted for in-order CPUs, thereby saving those transistors for cost, or spending them on more performance.

To be clear, this is not necessarily a bad thing -- using a desktop CPU makes developers' lives a lot easier. Maybe you trade off some potential performance and/or cost, but the performance you do get is easier to extract.
 
aaaaa00 said:
Jaws said:
^^ The point aaaaa00 is trying to make is that desktop CPUs need out-of-order execution because of unoptimised, crappy, spaghetti code. But for a specialised games console, transistors don't necessarily need to be wasted on out-of-order logic in the CPU. Finely tuned code can be crafted for in-order CPUs, thereby saving those transistors for cost, or spending them on more performance.

To be clear, this is not necessarily a bad thing -- using a desktop CPU makes developers' lives a lot easier. Maybe you trade off some potential performance and/or cost, but the performance you do get is easier to extract.

I can still remember the time when out-of-order was considered a feature for the server market.. it was somewhere around the PPro time : )
 
Jaws said:
^^ The point aaaaa00 is trying to make is that desktop CPUs need out-of-order execution because of unoptimised, crappy, spaghetti code. But for a specialised games console, transistors don't necessarily need to be wasted on out-of-order logic in the CPU. Finely tuned code can be crafted for in-order CPUs, thereby saving those transistors for cost, or spending them on more performance.

In this case, all three consoles have a good chance of sharing tech from IBM's in-order PPC core, for CELL's PPE, Xenon's cores and Rev's (but without the VMX units, IMO).

You could have finely tuned code on a console, but many devs don't.
Why should quality games be limited because the developers lack the programming talent or will to fight with an architecture?
 
darkblu said:
Andy said:
aaaaa00 said:
Console CPUs tended to have a different focus and different performance characteristics. (Xbox used a desktop CPU - Pentium 3, and it was the only console last generation to do so.)

Actually, the processor in the XBox is a custom Pentium 3/Celeron hybrid, so it wasn't completely a desktop CPU, as it was customised for the XBox.

There were 2 P3 cores, the Coppermine and the Tualatin, not counting the cache/bus-downgraded Celeron variations. The XCPU being yet another variation of the cache/bus configuration, it had nothing not found in the rest of the P3 family. So I think one can safely put the XCPU with the rest of the desktop (read: multi-purpose) line.

Ahh yes, it tickles me to hash over what really is the difference between Celerons and Pentiums. Naturally, you could reason there is no difference between a Celeron, a Pentium, and a Xeon grade core. Essentially, they are the exact same cores with the exact same functionality, but you'd be crazy to discount the differences in cache sizes. You would no sooner consider a Celeron a Pentium than call a Xeon just another Pentium. You can be sure a 128 kB cache will limit a few things compared to a 256 kB cache, just as a 1024 kB cache in a Xeon server chip is quite a different ballpark from your garden variety Pentium desktop chip. Unless it is to be suggested that 2/4/8-way cache associativity is capable of saving your bacon from any sort of cache size limitation, then all the XB CPU is, is a Celeron with 128 kB of L2 cache and some associativity tweaks to make the size not hurt so bad. This was the cheapest, yet reasonably performing, chip MS could get out of Intel for their console project. No way was it a choice for brute performance or even "desktop level" performance (otherwise, it would really have been "a Pentium" in the XB in every sense of what makes a Pentium a Pentium).
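
To put a number on the "cache size limits a few things" point, here's a little working-set sweep I'd use (my own sketch, sizes and hop counts are arbitrary): chase pointers around a buffer and watch the cost per hop jump once the buffer stops fitting in 128 kB or 256 kB of L2:

Code:
// Toy working-set sweep: pointer-chasing cost vs. buffer size (illustrative only).
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

static void ns_per_hop(std::size_t bytes)
{
    const std::size_t n = std::max<std::size_t>(bytes / sizeof(std::size_t), 2);
    std::vector<std::size_t> order(n), next(n);
    std::iota(order.begin(), order.end(), std::size_t{0});
    std::shuffle(order.begin(), order.end(), std::mt19937_64{42});
    for (std::size_t k = 0; k + 1 < n; ++k)   // link the shuffled indices into one big cycle
        next[order[k]] = order[k + 1];
    next[order[n - 1]] = order[0];

    std::size_t i = 0;
    const std::size_t hops = 2000000;
    const auto t0 = std::chrono::steady_clock::now();
    for (std::size_t h = 0; h < hops; ++h)
        i = next[i];                          // each load depends on the previous one
    const auto t1 = std::chrono::steady_clock::now();

    std::printf("%8zu kB: %6.2f ns/hop (last=%zu)\n", bytes / 1024,
                std::chrono::duration<double, std::nano>(t1 - t0).count() / hops, i);
}

int main()
{
    for (std::size_t kb : {64, 128, 192, 256, 512, 1024, 4096})
        ns_per_hop(kb * 1024);                // the jump shows where the cache runs out
}

Once the working set spills out of L2, the cost per hop goes from a couple of nanoseconds to main-memory latency. That's the gap between a 128 kB part and a 256 kB part showing up in practice; associativity tweaks soften it, they don't remove it.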
 
aaaaa00 said:
Whoops. I was under the impression that Gekko was in-order, but ok, maybe I'm wrong.
Not in-order - it's a modified 750CX core, so it inherits OoO execution and other stuff that's there for the desktops (like the 52-bit address space).
 
Speaking of old wounds.. was I the only one disappointed Xbox1 didn't use an AMD Athlon core? Back in 99-00 Athlons were a lot better bang for the buck than P3's, and even a little faster clock for clock. As I remember, didn't Intel steal the Xbox deal at the last moment? Wonder what speed Athlon was gonna be used.

Even more interesting. I would've thought AMD would have fought hard to get a chip into one of the upcoming next-gen machines.
 
Teasy said:
Can't see Revolution having less RAM than XBox 2 unless it uses lower latency RAM, like GC did with its 1T-SRAM.

Just to check, the G5 is a dual-core processor, right? So basically you're saying that you think Revolution will have a 4-core 3GHz PPC CPU with 2MB cache vs a 3-core 3GHz PPC CPU with 1MB cache for XBox 2? That would surprise a few people here if it happened. Then again, who cares if this is just guesswork :)

Actually it's 3 dual-core PPCs in X2

and 2 single-core G5s with full-blown VMX units

I don't understand the PS3 diagram. Why the 512 on one end and 256 on the other?
 
randycat99 said:
darkblu said:
Andy said:
aaaaa00 said:
Console CPUs tended to have a different focus and different performance characteristics. (Xbox used a desktop CPU - Pentium 3, and it was the only console last generation to do so.)

Actually, the processor in the XBox is a custom Pentium 3/Celeron hybrid, so it wasn't completely a desktop CPU, as it was customised for the XBox.

There were 2 P3 cores, the Coppermine and the Tualatin, not counting the cache/bus-downgraded Celeron variations. The XCPU being yet another variation of the cache/bus configuration, it had nothing not found in the rest of the P3 family. So I think one can safely put the XCPU with the rest of the desktop (read: multi-purpose) line.

Ahh yes, it tickles me to hash over what really is the difference between Celerons and Pentiums. Naturally, you could reason there is no difference between a Celeron, a Pentium, and a Xeon grade core. Essentially, they are the exact same cores with the exact same functionality, but you'd be crazy to discount the differences in cache sizes. You would no sooner consider a Celeron a Pentium than call a Xeon just another Pentium. You can be sure a 128 kB cache will limit a few things compared to a 256 kB cache, just as a 1024 kB cache in a Xeon server chip is quite a different ballpark from your garden variety Pentium desktop chip. Unless it is to be suggested that 2/4/8-way cache associativity is capable of saving your bacon from any sort of cache size limitation, then all the XB CPU is, is a Celeron with 128 kB of L2 cache and some associativity tweaks to make the size not hurt so bad. This was the cheapest, yet reasonably performing, chip MS could get out of Intel for their console project. No way was it a choice for brute performance or even "desktop level" performance (otherwise, it would really have been "a Pentium" in the XB in every sense of what makes a Pentium a Pentium).

Why didn't MS use a Duron? It probably would have performed better, since Durons weren't quite as crippled as Celerons. I think 128KB L1 cache plus 64KB L2 cache comes out to about the same total cache as a Celeron, but L1 cache is better, right?

Even more interesting. I would've thought AMD would have fought hard to get a chip into one of the upcoming next-gen machines.

Think AMD, then or now, has the capacity to supply enough chips for that? It probably would have meant an almost complete disappearance from the retail market for AMD.
BTW, the way I heard the story was that MS was able to get a much cheaper deal from Intel than from AMD. It went something like: Intel had a bunch of leftover P3/Celeron chips, and was willing to offload them onto Microsoft at a cheap price for the initial batch.

And I'd guess a 600MHz Athlon, which was later upped to the 733MHz P3. Depending on the P3 you're talking about and the Athlon you're talking about, Athlons either destroyed the P3s or they were fairly close in performance (with the P3 having an edge at times). I'd say the Athlon really pulled away from the later P3s when it got an integrated L2 cache instead of external, and DDR RAM.
 
blakjedi said:
I don't understand the PS3 diagram. Why the 512 on one end and 256 on the other?
it can be thought of kinda like PCs with their "system RAM" (AKA "main RAM") and "video RAM"; e.g. Nvidia has just announced a 512MB GeForce 6800 Ultra (the card itself has 512 MB of RAM onboard)... and most PC gamers these days have anywhere from 512MB to 2GB of system RAM in their PCs

.. in the case of the diagram posted in the first post of this thread, it's like the "system RAM" is the 256MB chunk, and the "video RAM" is the 512MB chunk...
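
To make the split concrete, a toy budget sketch (mine, not from any leaked spec): the two pools in the diagram have separate budgets, and one can't borrow from the other, unlike a unified design where the CPU and GPU share one:

Code:
// Toy sketch of split memory budgets, using the diagram's 512 MB / 256 MB figures.
#include <cstdio>

struct Pool {
    const char*   name;
    unsigned long capacity;   // bytes
    unsigned long used;

    bool alloc(unsigned long bytes)
    {
        if (used + bytes > capacity) {
            std::printf("%s pool: refusing %lu MB (only %lu MB left)\n",
                        name, bytes >> 20, (capacity - used) >> 20);
            return false;     // can't spill into the other pool
        }
        used += bytes;
        return true;
    }
};

int main()
{
    const unsigned long MB = 1024ul * 1024ul;
    Pool video{"video",  512 * MB, 0};   // the 512 MB side of the diagram
    Pool sys  {"system", 256 * MB, 0};   // the 256 MB side

    video.alloc(300 * MB);   // e.g. textures and render targets
    sys.alloc(200 * MB);     // e.g. code, sound, world data
    sys.alloc(100 * MB);     // fails: the system pool is full even though
                             // the video pool still has room spare
}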
 
Wunderchu said:
blakjedi said:
I don't understand the PS3 diagram. Why the 512 on one end and 256 on the other?
it can be thought of kinda like PCs with their "system RAM" and "video RAM"; e.g. Nvidia has just announced a 512MB GeForce 6800 Ultra (the card itself has 512 MB of RAM onboard)... and most PC gamers these days have anywhere from 512MB to 2GB of system RAM in their PCs

Sorry, I almost forgot not everyone uses UMA... but that's A LOT of memory...
 
Fox5 said:
Why didn't MS use a Duron? It probably would have performed better, since Durons weren't quite as crippled as Celerons. I think 128KB L1 cache plus 64KB L2 cache comes out to about the same total cache as a Celeron, but L1 cache is better, right?

Vague memory tells me that is, indeed, what AMD had bid as a CPU. Maybe it was an Athlon, but it escapes my memory. The Durons were quite strong, despite the reduced cache. A Celeron was not even comparable, performance-wise (referencing a very old Tom's Hardware article). It was suggested that there was utterly no compelling reason to pick a Celeron over a Duron, unless you planned to do some extreme overclocking (and 733 MHz is certainly short of that strategy).

It was nearly in the bag that the XB CPU was going to be an AMD part. However, Intel threw in an offer that could not be ignored, and the rest is history. MS chose cost savings with the Celeron, and AMD's "performance" option was out the door.
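
For the cache arithmetic in Fox5's question, the figures usually quoted (from memory, so treat as approximate): the Duron carried 128 kB of L1 (64 kB instruction + 64 kB data) plus 64 kB of L2, roughly 192 kB on-die in total, while the Coppermine-128 Celeron had 32 kB of L1 plus 128 kB of L2, roughly 160 kB. So the totals are in the same ballpark rather than identical, and the Duron keeps far more of its budget in the faster L1, which is a big part of why it benchmarked so much better.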
 
blakjedi wrote:

Actually it's 3 dual-core PPCs in X2

Unless you're a developer/insider and know more than most people do, the Xbox 2/Xenon CPU system is *not* 3 dual cores, but a triple-core CPU. That's one tri-core CPU.

1 chip / die, with 3 cores in it.

not 3 dies/CPUs with 2 cores each, and not 1 die/CPU with 6 cores.


You are also confusing *threads* with cores. Each core in the Xenon CPU can do 2 threads, for a total of 6 threads, but there are only 3 cores, on 1 chip.


That's according to the leaked Xenon document and Xenon block diagram, and most other reports about the Xenon CPU.

Obviously that could be wrong, or could have become outdated, and maybe things have changed.

But never was Xenon's CPU system going to be 3 CPUs with 2 cores each. That is a BIG misunderstanding.

[image: xbox2_scheme_bg.gif]


See, you can only see 3 cores on 1 chip. And BTW, the 'VPU' in each core cannot be counted as a core itself (just to nip that angle in the bud).

The VPUs are VMX / SIMD / FP / AltiVec units of some sort

(probably equivalent to Gekko's SIMD unit and the VMX unit within the PPE/PU of the Cell processor).
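
If it helps, a trivial sketch (mine, not from the leaked doc) of what the cores-vs-threads distinction means to software: on a 3-core chip with 2-way multithreading the OS reports 6 hardware threads, and a game would typically spin up one worker per hardware thread even though there are only 3 physical cores underneath:

Code:
// Toy sketch: one worker per reported hardware thread (illustrative only).
#include <cstdio>
#include <thread>
#include <vector>

int main()
{
    // On the chip described above this reports 6 (3 cores x 2 threads each),
    // not 3 and not 12; it can report 0 on odd systems, hence the fallback.
    unsigned hwThreads = std::thread::hardware_concurrency();
    if (hwThreads == 0) hwThreads = 1;
    std::printf("hardware threads: %u\n", hwThreads);

    std::vector<std::thread> workers;
    for (unsigned t = 0; t < hwThreads; ++t)
        workers.emplace_back([t] {
            std::printf("worker %u running\n", t);   // e.g. one job-queue consumer each
        });
    for (auto& w : workers)
        w.join();
}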
 
Fox5 said:
Jaws said:
^^ The point aaaaa00 is trying to make is that desktop CPUs need out-of-order execution because of unoptimised, crappy, spaghetti code. But for a specialised games console, transistors don't necessarily need to be wasted on out-of-order logic in the CPU. Finely tuned code can be crafted for in-order CPUs, thereby saving those transistors for cost, or spending them on more performance.

In this case, all three consoles have a good chance of sharing tech from IBM's in-order PPC core, for CELL's PPE, Xenon's cores and Rev's (but without the VMX units, IMO).

You could have finely tuned code on a console, but many devs don't.
Why should quality games be limited because the developers lack the programming talent or will to fight with an architecture?

So,

Fox5 said:
You could have finely tuned code on a console, but many devs don't
...

Lazy-ass devs...

Fox5 said:
...
Why should quality games be limited because the developers lack the programming talent...

Stupid-ass devs...

Fox5 said:
...
or will to fight with an architecture?

Lazy-and-stupid-ass devs...


There are devs with differing skillsets. A noob walking out of college who's an ace in high-level languages isn't going to walk into a studio and start optimising low-level code. Studios will have and apply high/low-level programmers where necessary.

Also, easy to develop != good games

Consoles are closed, specialised embedded platforms, and there will ALWAYS be devs willing to push the platform to its limits by scraping their teeth against the metal, if not to gain a performance advantage over games from other studios, then simply to test their skills for FUN.

Desktop/server CPUs are there in an open platform, to sell to a pick-n-mix market with diversity in hardware and software. They're designed to run unoptimised crapware on servers and desktops and maintain backwards compatibility. Sony is choosing an in-order CPU; MS is choosing an in-order CPU; and Nintendo MAY choose an in-order CPU. This is not to say out-of-order CPUs wouldn't have their advantages in a games console; it's just that top-tier programmers tend to work in the games industry, not designing corporate work-ware at Joe Bloggs PLC.
 