Revolution specs - the good side

darkblu said:
when was the last time you saw a desktop application that had requirements for a _particular_ cpu id, _particular_ gpu, _particular_ north & south bridges, all running at _particular_ clocks. guess what, the majority of the console software _has_ those requirements inherently

That still sounds false to me. This isn't an N64 (where certain games used microcoding) or something more archaic we are talking about... the Gamecube was a clean and modern programming platform. Care to elaborate on any possible reason that GC devs would have explicitly been counting cycles and hinging their code on a certain instruction ordering or cycle-accurate completion time? It makes no sense when most of the code would have been done in high-level C++ anyway. And I can see no reason to worry about the clocking of the north and south bridge when you only have to deal with DMA.
 
From Revolution Report:
On the topic of horsepower, Harrison said a souped-up GameCube would have most likely performed as poorly as the original. Hence why Nintendo chose a different route. He did assure, though, that Revolution will be a multiple of GameCube's technical capacities.

:rolleyes:


PS: Can anyone with the Game Informer issue confirm this?
 
Ingenu said:
From Revolution Report:


:rolleyes:


PS: Can anyone with the Game Informer issue confirm this?

I don't have the issue, but others with it have reported this. I'd agree with what he said, I think. Just to be clear, he's not talking about technical performance here, but market performance. If Nintendo had simply tried to make a console in the traditional mould and focussed on improved hardware, it probably would have done no better than GC. There are already two consoles doing that. They needed a USP.
 
darkblu said:
ok, this is getting wild, so i'll be rather blunt in this post.

parallels between backward compatibility on desktops and backward compatibility on consoles are naive at best. the two environments are _largely_ different from the POV of the software. when was the last time you saw a desktop application that had requirements for a _particular_ cpu id, _particular_ gpu, _particular_ north & south bridges, all running at _particular_ clocks. guess what, the majority of the console software _has_ those requirements inherently (possibly with the exception of those loathed desktop-originated and heavily multiplatform ports). in contrast to that, all desktop software is largely _portable_ - even that which runs on one platform and one os only - all it cares about is certain OS services and a cpu instruction set - but timings are carefully taken care of (explicitly or implicitly). ask microsoft how easy it is to be backward compatible on an arbitrarily-deviating console - they could've gone for static re-compilation and device-access patching and be done with it - they went for a full sw emulator for a reason. ask nintendo why the DS hw mimics the GBA at 101% when running the latter's titles. ask sony how the ps is emulated. and refrain from making bold backward compatibility statements unless you have sufficient sw experience (i'm looking at you, oden)


It's possible to overclock and underclock most consoles without causing major issues. You can overclock N64 by about 50% and improve performance of some games that way.

On Genesis/Mega Drive you can swap out the original 68000 CPU for a faster one and get rid of slow downs.

NES emulator authors have underclocked their systems to get accurate voltage levels from the video output so that they can have a proper palette (NES does not use RGB).

Some software will crash with the higher speeds, as some base their framerate on knowing how slow it will get, but most will generally work.
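Reznor's last point - basing the framerate on knowing how fast the hardware is - can be sketched in a few lines of C. This is purely illustrative, with hypothetical numbers rather than real console code: the title derives its frame budget from a hard-coded stock clock instead of reading a timer, so raising the clock simply makes the whole game run proportionally faster.

```c
#include <assert.h>

/* Hypothetical sketch: the title bakes the stock CPU clock into a
 * per-frame cycle budget instead of pacing itself off a real timer. */
#define STOCK_HZ         93750000L          /* e.g. N64's 93.75 MHz R4300 */
#define CYCLES_PER_FRAME (STOCK_HZ / 60L)   /* hard-coded 60 fps budget   */

/* Game-logic updates per second actually delivered on a given clock. */
long updates_per_second(long actual_hz)
{
    return actual_hz / CYCLES_PER_FRAME;
}
```

On the stock clock this delivers the intended 60 updates/sec; at a 50% overclock the same code delivers 90, i.e. the whole game runs half again as fast - harmless for some titles, a breaker for others.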
 
Reznor007 said:
It's possible to overclock and underclock most consoles without causing major issues. You can overclock N64 by about 50% and improve performance of some games that way.

On Genesis/Mega Drive you can swap out the original 68000 CPU for a faster one and get rid of slow downs.

NES emulator authors have underclocked their systems to get accurate voltage levels from the video output so that they can have a proper palette (NES does not use RGB).

Some software will crash with the higher speeds, as some base their framerate on knowing how slow it will get, but most will generally work.

under/over-clocking the _whole_ system is much less problematic than boosting separate components (even though you are not 100% safe with the former either). that's the whole point i've been trying to propagate in this discussion. if you'd read the whole thread you'd have seen it. re the use of 'particular clocks' in the post you quoted - it was poor wording for the clock ratios among the system components.
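The clock-ratio point can be made concrete with a toy C example (all numbers are hypothetical, not from real hardware): a title converts a deadline measured in DSP cycles into CPU cycles using a hard-coded ratio. A whole-system overclock keeps the ratio - and the conversion - intact; boosting only the CPU silently invalidates it.

```c
#include <assert.h>

/* Hypothetical sketch: a hard-coded CPU:DSP clock ratio is used to
 * convert a DSP-cycle deadline into CPU cycles.                     */
#define CPU_CYCLES_PER_DSP_CYCLE 6L   /* assumes CPU runs 6x the DSP */

long cpu_deadline(long dsp_cycles)
{
    return dsp_cycles * CPU_CYCLES_PER_DSP_CYCLE;
}

/* The baked-in conversion is only correct while the real ratio on
 * the shipped hardware matches the assumed one.                     */
int deadline_correct(long dsp_cycles, double real_cpu_per_dsp)
{
    return cpu_deadline(dsp_cycles) == (long)(dsp_cycles * real_cpu_per_dsp);
}
```

A 2x whole-system overclock leaves the ratio at 6:1, so the conversion still holds; doubling only the CPU makes the real ratio 12:1, and every such deadline lands at half the intended time.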
 
Bohdy said:
That still sounds false to me. This isn't an N64 (where certain games used microcoding) or something more archaic we are talking about... the Gamecube was a clean and modern programming platform. Care to elaborate on any possible reason that GC devs would have explicitly been counting cycles and hinging their code on a certain instruction ordering or cycle-accurate completion time? It makes no sense when most of the code would have been done in high-level C++ anyway. And I can see no reason to worry about the clocking of the north and south bridge when you only have to deal with DMA.

did you read all my posts in this thread? i gave a rudimentary example to oden somewhere on the first or second page of this thread.
 
I did read it, but it was just a speculative scenario. What kind of quality software wouldn't check that a DMA has completed? I can't see it being common with Gamecube games.
 
Reznor007 said:
You can overclock N64 by about 50% and improve performance of some games that way.
Do you have any details of how that is accomplished? I've always wanted to try it myself. From what I figure, it involves replacing one of the two oscillators on the mobo (the other would be the video reference clock, I believe), which should be a very simple task, especially since the board itself only has top and bottom layers. Unfortunately I never managed to find out which oscillator, and what a suitable replacement would be...
 
Bohdy said:
I did read it, but it was just a speculative scenario. What kind of quality software wouldn't check that a DMA has completed? I can't see it being common with Gamecube games.

it's a speculative scenario indeed, but it's one of the various possible scenarios where, intentionally or as a bug, such behaviour would break your title when run on a disproportionately boosted cpu. which would be a very non-speculative reason for nintendo to re-run all their titles through a full QA cycle, and eventually end up with a compatibility list a la ms.
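The DMA scenario can be modeled in a few lines of C (a pure simulation with made-up numbers - not real GC code): the game kicks off a transfer and, instead of polling a completion flag, burns a known number of instructions before touching the buffer. The wait is only long enough at a particular CPU:bus clock ratio.

```c
#include <assert.h>

/* Pure simulation with made-up numbers: a DMA transfer completes
 * after a fixed number of bus-clock cycles, while the game "waits"
 * by executing a fixed number of one-cycle CPU instructions.        */
typedef struct {
    long bus_cycles_needed;   /* DMA latency in bus cycles */
} dma_t;

/* 1 if the buffer is ready when the game finally reads it. The bus
 * time covered by the wait shrinks as the CPU:bus ratio grows.      */
int buffer_ready(dma_t dma, long wait_insns, double cpu_per_bus_cycle)
{
    double bus_cycles_elapsed = wait_insns / cpu_per_bus_cycle;
    return bus_cycles_elapsed >= dma.bus_cycles_needed;
}
```

At a GC-like 3:1 CPU:bus ratio, 3000 instructions cover a 1000-bus-cycle transfer; triple only the CPU clock and the same 3000 instructions now span ~333 bus cycles, so the game reads a half-filled buffer - exactly the kind of latent bug that forces a full QA pass.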
 
darkblu said:
which would be a very non-speculative reason for nintendo to re-run all their titles through a full QA cycle, and eventually end up with a compatibility list a la ms.

That was to be expected anyway. This is not emulation we are talking about though, but binary compatible hardware (with probably some translation), so unlike the case of 360, the compatibility will be closer to the PS2's 99% or whatever.
 
darkblu said:
under/over-clocking the _whole_ system is much less problematic than boosting separate components
Surely, if the impression that they're going with direct hw compatibility is correct, the Rev will run on GC spec when in compatibility mode. Akin to how PS2 runs the IOP/SPU chipset and optical drive at lower clocks for PS1 games too.
With the constant hinting about extra ram being "external" I am more and more convinced Rev chipset is basically GC on a chip with two clock speed settings (one for compatibility).

And of course nothing is 100% - even in 8bit days C128/Speccy128 were not 100% backward compatible, even though they included all legacy hardware.

Guden said:
Binary compatibility relies on whether the results of two different architectures are identical, not whether the architectures themselves are identical.
Darkblu already addressed much of this - just wanted to add that while you can get programs to execute easily enough, backward compatibility in a console requires identical end-result performance, so while going all wacky with a new CPU could certainly work, it means a lot more work to get compatibility up to spec.

Anyway I think it's all a moot point. The very fact that Rev CPU is rumoured at sub 800Mhz clocks pretty much leaves no doubt that it's from the same 750x family as Gekko, so any performance differences between the two (outside clock speeds) would be trivial.
 
Just out of curiosity: what if, in Broadway, everything else is equal but the CPU is multicore (and you can't turn any core off) - would BC be affected?

Fafalada said:
The very fact that Rev CPU is rumoured at sub 800Mhz clocks pretty much leaves no doubt that it's from the same 750x family as Gekko, so any performance differences between the two (outside clock speeds) would be trivial.

I guess it depends on how far you stretch the term "extension of Gekko" (since they speak about Broadway), but they (Ubi/RS) said that they want to surpass F.E.A.R.'s AI. I guess that's only possible with at least the same AI features as F.E.A.R., or better, yet Monolith stated that performance was a problem with their AI (in the beginning). Should that mean there is something more to this CPU than just a clock bump / minor feature tweaks? After all, you said that a game like HL2 would need 5-10x what Gekko can offer, and this game seems to give more than HL2 :?: (It also comes with features that, at the moment, I don't remember seeing in any GC game (or at least not all in one): destructible environments, a good deal of FX like the little explosions, more physics objects, a new animation system for the player(s)...)


Unless this is possible because of 1T-SRAM (wouldn't it get a good deal of performance gain with a P4EE and the like in that case?), I don't see why this one could do it if Gekko can't.

Any thoughts from someone?

With the constant hinting about extra ram being "external"

Will this (being external) have any downside effect on latency/BW?
 
Fafalada said:
Anyway I think it's all a moot point. The very fact that Rev CPU is rumoured at sub 800Mhz clocks pretty much leaves no doubt that it's from the same 750x family as Gekko

Why do you think that? It could be a more advanced cpu but simply clocked lower to reduce heat and power consumption.
 
Fafalada said:
Anyway I think it's all a moot point. The very fact that Rev CPU is rumoured at sub 800Mhz clocks pretty much leaves no doubt that it's from the same 750x family as Gekko, so any performance differences between the two (outside clock speeds) would be trivial.

I normally agree with you Faf but I really can't see how you can come to such a conclusion in this case..
 
darkblu said:
under/over-clocking the _whole_ system is much less problematic than boosting separate components (even though you are not 100% safe with the former either). that's the whole point i've been trying to propagate in this discussion. if you'd read the whole thread you'd have seen it. re the use of 'particular clocks' in the post you quoted - it was poor wording for the clock ratios among the system components.

In the case of N64, the overclock there only affects the main CPU, not the RSP, so it's not a whole system overclock. Same with Genesis, you only overclock the main CPU, not the Z80.
 
Guden Oden said:
Do you have any details of how that is accomplished? I've always wanted to try it myself. From what I figure, it involves replacing one of the two oscillators on the mobo (the other would be the video reference clock, I believe), which should be a very simple task, especially since the board itself only has top and bottom layers. Unfortunately I never managed to find out which oscillator, and what a suitable replacement would be...

You have to set 2 pins on the main CPU to various combinations of high/low to set the multiplier of the R4300.

http://www.gamesx.com/misctech/n64oc.htm
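For reference, the arithmetic behind the mod (the actual pin-to-multiplier encodings are in the linked write-up; the constants here are just the well-known N64 figures): the R4300's core clock is its input clock times the selected multiplier.

```c
#include <assert.h>

/* The R4300 core clock is the input (master) clock times a
 * multiplier chosen via the CPU's clock-ratio pins. The N64 feeds
 * it roughly 62.5 MHz; the stock 1.5x multiplier yields 93.75 MHz. */
double core_clock_mhz(double input_mhz, double multiplier)
{
    return input_mhz * multiplier;
}
```

So the stock setting is 62.5 x 1.5 = 93.75 MHz, and a 2x multiplier would give 125 MHz, about a 33% overclock - which multipliers the pins can actually select is covered by the gamesx page.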

Just for fun, here's someone who installed a 68010 in a Genesis:
http://nfggames.com/forum/index.php?showtopic=1893
 
Reznor007 said:
In the case of N64, the overclock there only affects the main CPU, not the RSP, so it's not a whole system overclock. Same with Genesis, you only overclock the main CPU, not the Z80.

and i guess you know what the respective consoles' library break ratio is under those overclocked conditions, right? and by breaking i don't mean inability to run at all (as ector mentioned, many gc titles "run" even under bogus timings emulation - of course nobody really QA-ed them in that case) - but any glitches that would not pass a production-level QA cycle. not to mention that a break ratio may not give the whole picture - what if 2% break but those are major sellers?
 
Teasy said:
I normally agree with you Faf but I really can't see how you can come to such a conclusion in this case..

What else would it be? Seems like the only conclusion that can be drawn in this case. In fact, I wouldn't be surprised to find that it is a 750CXe, since that processor tops out around 700 MHz anyway.
 