Xbox 2 coming in Nov-Dec 2005 - Revolution could be stronger

Qroach said:
Nintendo never gave out theoretical numbers. That's the point I'm making.

Quincy,
Nintendo did release the 40 million polys/sec unlit figure. I don't know to what extent this differs from a theoretical output.
 
GwymWeepa said:
I've seen a large swath of games from each platform. Though some PS2 games can compete with anything on the other two machines, most fall short in my experience.

You stated:

GwymWeepa said:
The PS2 had 16 pixel pipelines, which is only now being matched by video cards, and its video memory bandwidth was over 40GB/s, which has yet to be matched... but what does that give you? Sub-GameCube-looking graphics... I don't know wtf they did with the thing internally; on paper the PS2 was going to be a monster lol.

Obviously, you were questioning what the impressive achievements of the PS2 architecture (16 pixel pipelines, bandwidth, etc.) amounted to, and I named you exactly that. The point is that there are games that are very impressive and technically unmatched precisely because they target those specific advantages of the hardware. Of course that doesn't mean all games are coded that efficiently with the same amount of effort, nor should we ignore that the most popular console usually also features the most low-quality software. Compare the big-name games, though, and you'll see that they end up at the higher end.
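For what it's worth, the "over 40GB/s" figure checks out against the Graphics Synthesizer's commonly cited eDRAM bus layout (1024-bit read, 1024-bit write, and 512-bit texture paths at 150MHz). Those widths come from Sony's public spec sheets, not from this thread, so treat this as a sanity check rather than gospel:

```python
GS_CLOCK_HZ = 150_000_000  # PS2 Graphics Synthesizer clock

# Commonly cited eDRAM bus widths, in bits (from Sony's public specs):
# 1024-bit read + 1024-bit write + 512-bit texture path.
bus_bits = 1024 + 1024 + 512

bandwidth_gbs = bus_bits / 8 * GS_CLOCK_HZ / 1e9
print(f"GS eDRAM bandwidth: {bandwidth_gbs:.0f} GB/s")  # 48 GB/s
```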
 
For Qroach regarding poly-output & system efficiency. And no, not cost efficiency either. Courtesy of Fox5:

http://cube.ign.com/articles/088/088713p1.html

This must be the EA Canada article, but apparently I was way off with the numbers... I could have sworn I heard those figures somewhere, but Xbox and PS2 aren't even mentioned in it.


"Gamecube development hardware running with eight texture effect layers + all other effects on: Approximately five million polygons per second
Gamecube development hardware running with four texture effect layers + all other effects on: Approximately 14 million polygons per second."

Wording regarding the remaining benchmark tests was vague, but evidently the company also did experiments with the Gamecube development hardware running at least four hardware lights and other effects with impressive results of approximately 17 million polygons per second. Sources we spoke with said this is not only entirely possible, but highly conservative.



So all effects + 8 texture layers = 5 million polygons.
All effects with only 4 texture layers = 14 million (wow, it takes a bit of a dive at 8... memory limited, maybe?).
And 17 million under what could probably pass for in-game conditions.
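To put those throughput figures in perspective, here's a quick back-of-the-envelope conversion into per-frame budgets. The 30fps and 60fps targets are my own illustrative assumptions, not anything EA stated:

```python
# Convert the quoted polys/sec figures into rough per-frame budgets.
# The 30/60fps frame rates are assumed targets, not EA's numbers.
benchmarks = {
    "8 texture layers + effects": 5_000_000,
    "4 texture layers + effects": 14_000_000,
    "4 hardware lights + effects": 17_000_000,
}

for label, polys_per_sec in benchmarks.items():
    print(f"{label}: ~{polys_per_sec // 30:,} polys/frame @30fps, "
          f"~{polys_per_sec // 60:,} @60fps")
```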
BTW, early EA games on GameCube were sometimes better than the PS2 version, sometimes not. Agent Under Fire was much better, and NightFire was better too.
Still pretty sure I remember an artificial graphics demo out of EA that did 25 million pps though...

http://gameztech.8m.com/consolewar1.htm
From here....
Quote:
Also, developers have recently stated that the Gamecube can push more than 20 million polygons per sec.


http://cube.ign.com/articles/094/094556p1.html
Here's a Factor 5 article; I just thought it was interesting that the only thing he noted the GameCube had over the Xbox was memory bandwidth (which means better textures). Memory access times too.

http://www.segatech.com/gamecube/overview/
Scroll down a bit and there's a blurb about Factor 5 stating they could do 20 million polygons per second with all effects, with "effects" meaning texture layers rather than actual effects. (Umm... does the GameCube have a hard limit on texture layers?) Maybe at its old clock speed it could do 25? (But then it'd have a bottleneck in the CPU...)

And although I find this Anandtech article ill-informed regarding the GC's specifics (like the TEV, and operations like hardware lights being done in parallel with other functions) and skewed in the Xbox's favor, it still makes some interesting points regarding efficiency:

The basics of this PPC 750CXe derivative (codenamed Gekko) are fairly simple; the PowerPC core features a 4-stage basic integer pipeline which is mostly responsible for the very low clock speeds the core is able to achieve. Most important for gaming performance however are more precise floating point calculations and the Gekko's floating point pipeline is 7 stages long. Since the Gekko is a native RISC processor it does not suffer the same fate as its Xbox counterpart in that it doesn't have to spend much time in the fetch/decoding stages of the pipeline. Immediately upon fetching the RISC instructions to be executed, they are dispatched and one clock cycle later, they are ready to be sent to the execution units.

The PowerPC architecture is a 64-bit architecture with a 32-bit subset which in the case of the Gekko processor, is what is used. The CPU supports 32-bit addresses and features two 32-bit Integer ALUs; separate to that is a 64-bit FPU that is capable of working on either 64-bit floats or two 32-bit floats using its thirty two 64-bit FP registers. This abundance of operating registers is mirrored in the 32 General Purpose Registers (GPRs) that the processor has, dwarfing the Xbox's x86-limited offering (8 GPRs).

In the case of the GameCube, the CPU is clocked at 485MHz, or 3 times its 162MHz FSB frequency. The benefit of a shorter pipeline is of course, an increased number of instructions that can be processed in those limited number of clocks.

The role of North Bridge is played by Flipper in that it features a 64-bit interface to the Gekko CPU running at 162MHz. The entire Flipper chip runs at 162MHz which lends itself to much lower latency operation since all bus clocks operate in synch with one another.

Based on the operating frequency of the core (162MHz) you can tell that the Flipper graphics core isn't a fill-rate monster, but what it is able to do is portray itself as a very efficient GPU. The efficiency comes from the use of embedded DRAM.

The 2MB Z-buffer/frame buffer is extremely helpful since we already know from our experimentation with HyperZ and deferred rendering architectures that Z-buffer accesses are very memory bandwidth intensive. This on-die Z-buffer completely removes all of those accesses from hogging the limited amount of main memory bandwidth the Flipper GPU is granted. In terms of specifics, there are 4 1T-SRAM devices that make up this 2MB. There is a 96-bit wide interface to each one of these devices offering a total of 7.8GB/s of bandwidth which rivals the highest end Radeon 8500 and GeForce3 Ti 500 in terms of how much bandwidth is available to the Z-buffer. Z-buffer checks should occur very quickly on the Flipper GPU as a result of this very fast 1T-SRAM. Also, the current surface being drawn is stored in this 2MB buffer and then later sent off to external memory for display. Because of this, dependency on bandwidth to main memory is reduced.

The 1MB texture cache helps texture load performance, but the impact isn't nearly as big as the 2MB Z-buffer's. There are 32 1T-SRAM devices (256Kbit each), each of which has its own 16-bit bus, offering 10.4GB/s of bandwidth to this cache.
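The bandwidth figures in the quoted passage follow directly from the bus widths and Flipper's 162MHz clock, assuming one transfer per clock and decimal gigabytes (which appears to be how the article counts them). A quick sanity check:

```python
FLIPPER_CLOCK_HZ = 162_000_000  # the whole chip runs synchronously at 162MHz

def embedded_bandwidth_gbs(devices: int, bus_bits: int) -> float:
    """Aggregate bandwidth in decimal GB/s, assuming one transfer per clock."""
    bytes_per_clock = devices * bus_bits / 8
    return bytes_per_clock * FLIPPER_CLOCK_HZ / 1e9

zbuf = embedded_bandwidth_gbs(devices=4, bus_bits=96)   # 2MB Z/frame buffer
tex = embedded_bandwidth_gbs(devices=32, bus_bits=16)   # 1MB texture cache

print(f"Z-buffer bandwidth:      {zbuf:.1f} GB/s")  # ~7.8, as quoted
print(f"Texture cache bandwidth: {tex:.1f} GB/s")   # ~10.4, as quoted
```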

This is design-engineered efficiency, not cost efficiency. The Xbox's major advantages over the GC are its programmable vertex shaders and larger RAM pool. Raw poly output means nothing until effects are applied to those polys (shading, texturing, lighting, shadowing, self-shadowing, bump mapping, etc.). The point is, given more R&D time, access to basically the same partners, and possibly slightly newer technology, why wouldn't/couldn't the Revolution be technically superior to Xenon? Your attempting to justify this by citing Nintendo's past console efforts is moot. Iwata is at the helm now, not Yamauchi.
 
GwymWeepa said:
I haven't seen ZOE 2, I've been wanting to though. But anyhoo, PS2 from what I've seen wouldn't be able to handle Ninja Gaiden, but some games really take art design and fudge nearly as impressive graphics, like the Jak series.

True ATM, but I wouldn't be surprised if in the [near] future PS2 games matched NG either technically and/or fudged via art direction (IMHO).

If the net graphical result seems the same to the eye, does it matter whether it's fudged or not?

Also, for ZOE2, you really have to play it through to appreciate it.
 
To anyone who hasn't played ZOE2: please, do yourself a favour, go rent it or buy it (it should be cheap now). Play through it to the end. You deserve it. Your eyes and your brain deserve it. I'm stressing the "to the end" part because the last stage is mesmerising.

Then, if you still think the PS2 isn't very impressive considering when it was released (4-5 freaking years ago now), that's great. At least you'll have played one of the best-looking games of this generation. Which also happens to be a VERY good game too.
 
I tried to get someone to list a benchmark game for the Xbox, but responses were not that impressive.

I take it that Ninja Gaiden is such a title?

60fps, prog scan, impressive poly counts, beautiful textures, large levels, many characters on screen at once, particle effects, level streaming, etc?

How much better looking is NG compared to, say, Splinter Cell 2, DOA Volleyball, or Panzer Dragoon?

Is Chronicles of Riddick another title that really shows off the Xbox?
 
PC-Engine said:
Rygar on PS2 is the closest to NG you'll ever get. Both are by Tecmo.
IMO Onimusha 3 looks like a better "PS2 NG" candidate.
I don't understand what makes you think Rygar is even remotely close to NG. Even the gameplay is totally different.
Their only common denominator is Tecmo.
 
london-boy said:
Deepak said:
PC-Engine said:
Rygar on PS2 is the closest to NG you'll ever get. Both are by Tecmo.

You mean to say that Rygar is best looking PS2 game?

No way! I just meant it's the closest kind of game you can find on PS2. It's really an average game on its own...

It's not a game from Team Ninja is it?

Well, then take better examples, Dead or Alive 2: Hardcore for instance.
 
Li Mu Bai said:

And although I find this Anandtech article ill-informed regarding the GC's specifics (like the TEV, and operations like hardware lights being done in parallel with other functions) and skewed in the Xbox's favor, it still makes some interesting points regarding efficiency:

The best source for hardware lights done in parallel would be the official console benchmarks EA released, not unsupported comments they made back in 2000 regarding overclocked GameCube devkits. BTW, why didn't you post a link to the Anandtech article?

For one, this information from EA regarding the GameCube on IGN would have to have come from the original, higher-clocked GameCube development hardware if it was a year ahead of the release. Honestly, the best numbers released were in the PDF EA created that used real benchmarks simulating what they would be doing in REAL games.

This is design engineered efficiency, not cost efficient.

I've been talking about the GameCube as a whole, not the individual parts. How is this not cost efficiency when it's the cheapest console to build as a whole? I'll grant you that a PowerPC processor will always beat an Intel chip in a design contest on efficiency and cost, but remember that Nintendo didn't design the PowerPC processor, and developers are far more used to getting maximum game performance out of Intel processors.

The GC overall has a great, elegant design, but the point Teasy was trying to make, that it's THE MOST efficient from a performance standpoint (i.e. that its real performance doesn't drop far from the released numbers), isn't supported by this argument, because nobody ever saw theoretical numbers posted in public to even be able to say how far off its performance is.

The Xbox's major advantages over the GC are its programmable vertex shaders and larger RAM pool. Raw poly output means nothing until effects are applied to those polys (shading, texturing, lighting, shadowing, self-shadowing, bump mapping, etc.).

The Xbox is still faster with the "effects" you mentioned. You basically supported my point by showing what the GameCube's performance numbers are with 8 texture layers. The Xbox seems to be more optimized for 2-4 texture layers, which is what developers tend to use. I mean, who cares if the GameCube is more optimized for 8 texture layers when hardly any developers are doing that? I even said before that you'll always find a few cases where one set of hardware will outpace the other in a specific benchmark.

The point is, given more R&D time, access to basically the same partners, and possibly slightly newer technology, why wouldn't/couldn't the Revolution be technically superior to Xenon? Your attempting to justify this by citing Nintendo's past console efforts is moot. Iwata is at the helm now, not Yamauchi.

I'm not saying it's impossible. I've been saying that since Nintendo is complaining about the push for more and more realistic graphics, they don't see the need to try and win any sort of hardware race. You can agree or disagree for all I care. Iwata is Yamauchi's second in charge; even though Yamauchi is retired, he's still on the board of directors and influences what Nintendo does. This is a totally different argument than the one with Teasy right now.
 
I haven't bothered to read all the posts, but has anyone brought up RE4 on GCN? Good gracious almighty, other (3rd party) devs need their asses kicked if Capcom can put out a final product that looks and runs as good as what was at E3. No excuses. Only Half-Life 2 looks better from what my eyes have seen. As good as Halo 2's bump mapping is, and even taking into account Metroid Prime 2's upgraded smooth-ass graphics (which look glorious btw), RE4 is just tops imo. Like I said, other devs need an ass kicking if Capcom gets RE4 near 30fps with graphics like that. Lil ole Cube that could.
 
Yep, RE4, the new Metroid, and maybe the Zelda demo... those are IMO the best-looking games on any current console, especially RE4.
In my eyes the GC does look more "powerful" than the Xbox or PS2, making it IMO the most "powerful" console this gen.

GCN 'AAA' games look good and polished, and they run fast(er than Xbox 'AAA' games).
 