X-Box hacker interview

marconelly! said:
What I was discussing has nothing to do with your discussion about bandwidth and performance. I was just pointing out how 60FPS updates work on a regular TV, why games use it, and why it looks better / has no flicker. It started with Rogue Leader being 60FPS...

Wasn't RS a part of the debate about performance? It seems to me you did include RS in your side of the debate, or was that just ipso facto?
 
OK, so what are you arguing exactly? I mean, it's not like anyone here doesn't know that Xbox has the most graphics features, biggest memory, etc. I was just saying PS2 and GC also have some very good-looking upcoming games and generally can do good-looking graphics, in part because artistry makes up a big portion of one's impression of the visuals.

Almost as a side thing, I mentioned that RL does DOT3 bump mapping and updates at 60 frames/sec.

I don't know how to prove it's 60FPS, but numerous reviews have acknowledged this. Also, take a look at the features list on this page (more than likely provided by the developers):

http://cube.ign.com/articles/135/135337p1.html

Features
- Pilot the best vehicles from Episodes IV-VI
- Take part in the most famous battles and scenes of the original trilogy
- Eleven primary campaign missions include the attack on the Death Stars, a battle on Hoth, dogfights over Tatooine, raids over Bespin, and many more
- Real-time environmental effects -- watch the twin suns of Tatooine rise and set or race your vehicle to complete a mission before darkness falls
- Experience amazing levels of textures, geometry, and lighting effects at a blazing 60 frames per second
- Single-player only
 
Sheesh! talk about causing a firestorm! :oops:

care to back this up?

What is there to back up?

Faf, I have to disagree with you. How are you judging this? If you want to rule out the benchmarks, how are you capable of making a comparison?

Experience...

Regardless, benchmarks only show what you're willing to see. Even SPEC CPU isn't necessarily a very good processor benchmark; it's really more of a compiler bench. It becomes very obvious when compiler engineers slip optimizations into their compilers that recognize certain data patterns in a couple of the algorithms, making their processor fly on that subset and artificially inflating their scores. SPEC can also be rather brutal on architectures without extensive OOE resources if they don't have clairvoyant C and Fortran compilers.

Anyhow, I think Faf explained my point more eloquently... :p

Well, a 5.1 Dolby Digital stream is only 384 kbit/s (or so says Gordian Knot).

Dolby Digital, Pro-Logic II, DTS Interactive, et al are basically data transport mechanisms off of the system.
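
To put that in perspective, here's a quick back-of-the-envelope comparison (just a sketch; the 48 kHz / 16-bit / 6-channel figures are my assumptions) of that compressed output stream versus the same audio as raw PCM:

#include <stdio.h>

int main(void)
{
    /* Assumed figures: 48 kHz, 16-bit samples, 6 channels (5.1). */
    const double dd_bitrate  = 384000.0;            /* Dolby Digital 5.1, bits/s */
    const double pcm_bitrate = 6 * 48000.0 * 16.0;  /* raw PCM, bits/s           */

    printf("DD 5.1 stream : %6.1f KB/s\n", dd_bitrate  / 8.0 / 1024.0);
    printf("Raw 5.1 PCM   : %6.1f KB/s\n", pcm_bitrate / 8.0 / 1024.0);
    /* ~47 KB/s vs ~563 KB/s -- the encoded stream leaving the system is tiny
       compared with the uncompressed audio being mixed inside it. */
    return 0;
}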

How can you honestly say that the PS2 could perform just as well at DOT3 bump mapping (using the developers' convoluted methods) as the Xbox? I am sure you will say, "Well, it can't -- but it could use another method." Sorry, you have already admitted the Xbox has a performance feature that enables it to win out over the PS2.

He hasn't said any such thing. Since you bring up bump maps, you won't get any argument from me about the usefulness of the Xbox's and the GCN's DOT3 capabilities. It sure as hell makes global illumination models a lot more straightforward. But since you do bring up bump maps, why not explore displacement maps? I know my little meshify algorithm, which I had delved into to compress arbitrarily large object meshes, had the interesting side effect of being readily adaptable to displacement mapping...

Likewise, tangent-space normal mapping can not only yield an easy, fast per-pixel lighting model, but it also works well for morphing geometry, it's lean on texture space, and it leaves the tangent available for various lighting models.
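
For anyone who hasn't played with it, here's a minimal CPU-side sketch of that per-pixel diffuse term with a tangent-space normal map (all the vectors and the normal-map texel below are made-up illustration values, not data from any actual title):

#include <stdio.h>
#include <math.h>

typedef struct { float x, y, z; } vec3;

static float dot3(vec3 a, vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

static vec3 normalize3(vec3 v)
{
    float len = sqrtf(dot3(v, v));
    vec3 r = { v.x/len, v.y/len, v.z/len };
    return r;
}

int main(void)
{
    /* Per-vertex basis (illustration values): tangent, bitangent, normal. */
    vec3 T = {1, 0, 0}, B = {0, 1, 0}, N = {0, 0, 1};

    /* World-space light direction, brought into tangent space once per
       vertex and then interpolated -- the per-pixel work is just a dot. */
    vec3 L = normalize3((vec3){0.3f, 0.4f, 0.85f});
    vec3 Lts = { dot3(L, T), dot3(L, B), dot3(L, N) };

    /* A texel from the normal map: bytes 0..255 encode components -1..1. */
    unsigned char texel[3] = {140, 120, 250};
    vec3 n = normalize3((vec3){ texel[0]/127.5f - 1.0f,
                                texel[1]/127.5f - 1.0f,
                                texel[2]/127.5f - 1.0f });

    float diffuse = fmaxf(0.0f, dot3(n, Lts));  /* the DOT3 in "DOT3 bump mapping" */
    printf("diffuse = %.3f\n", diffuse);
    return 0;
}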

In the PRESENT, the PS2 is the lead development system for the majority of cross-platform games (which make up roughly 75% of the games on MS's and Sony's systems), thus any extra power in the Xbox simply goes to a giant memory card, possibly a better frame rate, and slightly sharper textures.

Even in the case of it being the lead platform, the premise that a project is going to be cross-platform doesn't bode well for a developer getting really creative on the lead platform if that work can't be reasonably ported to the other platforms. I mean, you're not going to go out and roll your own custom EE Lisp compiler for a project if it's also bound for the Xbox and GCN, and if you do, it's likely you wouldn't be using it for the project.

That's why I get rather irritated when people try to use EA's tool as a hardware benchmark, when (as Faf pointed out) it's really more a benchmark of how effective their asset compiler is across multiple (vastly different) targets (which, I might point out, is incredibly impressive).

Nope. Still curious to hear from Archie what he meant about the hundreds of megs/sec of audio!

Regarding audio, part of the response was in regard to the sub-processors and sub-busses (e.g. the S-bus and HT) not only being audio paths but also I/O paths. The other part was a bit of irritation at the relative disregard for the importance of audio and the effects of a complex audio model. (I guess I shouldn't expect too much; after all, this *is* B3D /* graphical, that is */.)

Yes, high-resolution, multi-channel audio can be a bandwidth hog (depending on what sort of processing you're performing on it -- hence the heavy usage of fast SCSI arrays). Of course, games (especially console games) tend to rely on lower-resolution samples that are usually at least moved around in an ADPCM format.

Of course there's more to audio bandwidth than just the raw data. You know you do have to move it around, perform calculations, store and buffer it; that does consume hardware bandwidth.

How about considering a more complex audio model where you have dozens, or hundreds, of emitters (a la a State of Emergency-type crowd, although not necessarily running around in mad fashion), where you have a batch of complex sounds being processed that stochastically generate even more sounds (crowds, rain, etc...)? IIR vs. FIR filtering and various compression/decompression routines, while not processing-intensive, still generate bus activity. How about a complex scene where you've got a lot of emitters in a 'busy' room (lots of obstructions) and you're throwing up a bunch of AABBs to calculate sound obstruction? In some cases you're even going to be invoking the CPU, regardless of how powerful the MCPX is. Of course, I don't specialize in audio programming either, though, so...
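
Just to make the obstruction part concrete, here's a rough sketch of counting how many AABBs sit between an emitter and the listener using a simple slab test (the data layout and the two example boxes are my own assumptions; how the count maps to a filter parameter is up to the audio programmer):

#include <stdio.h>
#include <math.h>

typedef struct { float x, y, z; } vec3;
typedef struct { vec3 min, max; } aabb;

/* Slab test: does the segment listener->emitter pass through the box? */
static int segment_hits_aabb(vec3 a, vec3 b, aabb box)
{
    float t0 = 0.0f, t1 = 1.0f;
    float orig[3] = { a.x, a.y, a.z };
    float dir[3]  = { b.x - a.x, b.y - a.y, b.z - a.z };
    float lo[3]   = { box.min.x, box.min.y, box.min.z };
    float hi[3]   = { box.max.x, box.max.y, box.max.z };

    for (int i = 0; i < 3; i++) {
        if (fabsf(dir[i]) < 1e-6f) {
            if (orig[i] < lo[i] || orig[i] > hi[i]) return 0;
        } else {
            float ta = (lo[i] - orig[i]) / dir[i];
            float tb = (hi[i] - orig[i]) / dir[i];
            if (ta > tb) { float tmp = ta; ta = tb; tb = tmp; }
            if (ta > t0) t0 = ta;
            if (tb < t1) t1 = tb;
            if (t0 > t1) return 0;
        }
    }
    return 1;
}

int main(void)
{
    vec3 listener = {0, 0, 0}, emitter = {10, 0, 0};
    aabb walls[] = { {{3, -1, -1}, {4, 1, 1}},    /* blocks the path */
                     {{3,  5, -1}, {4, 7, 1}} };  /* off to the side */
    int occluders = 0;

    for (int i = 0; i < 2; i++)
        occluders += segment_hits_aabb(listener, emitter, walls[i]);

    /* More occluders -> stronger muffling/attenuation on that emitter. */
    printf("occluders between listener and emitter: %d\n", occluders);
    return 0;
}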
 
Of course there's more to audio bandwidth than just the raw data. You know you do have to move it around, perform calculations, store and buffer it; that does consume hardware bandwidth.

Yeah, but that has little to do with the audio hardware itself, which at most streams audio via DMA, and applies effects to it.

Moving it around in memory, performing calculations on it (beyond what the audio hardware is capable of natively) and such -- that's all the domain of the CPU (and possibly DMA controllers for the actual bulk transfer), not the audio hardware itself.

And the point is, when the audio chip is playing back these streams, presuming it has some scratchpad DSP RAM and some decently sized FIFOs, it's not hard to imagine that the memory accesses for it would be well-aligned, extremely regular, and easily prefetched or otherwise optimized so as to have a minimal impact on the overall performance of the system.

How about considering a more complex audio model where you have dozens, or hundreds, of emitters (a la a State of Emergency-type crowd, although not necessarily running around in mad fashion), where you have a batch of complex sounds being processed that stochastically generate even more sounds (crowds, rain, etc...)? IIR vs. FIR filtering and various compression/decompression routines, while not processing-intensive, still generate bus activity. How about a complex scene where you've got a lot of emitters in a 'busy' room (lots of obstructions) and you're throwing up a bunch of AABBs to calculate sound obstruction? In some cases you're even going to be invoking the CPU, regardless of how powerful the MCPX is. Of course, I don't specialize in audio programming either, though, so...

Again, we were discussing the memory bandwidth consumed by the audio hardware itself.

If you're invoking the CPU to manipulate your sample data dynamically (for example by algorithmically generating sounds), then it's not the audio hardware we're talking about, that's CPU cycles and CPU memory bandwidth you're consuming.

The MCPX is capable of applying obstruction filters and doing reverb, echo, HRTF, and other pretty sophisticated effects processing to the audio streams it is playing back -- however you have to set the parameters, and the CPU has to do that, not the audio hardware.

Determining how to set the parameters can get as sophisticated and complex as you want (Remember Aureal's WaveTracing?), but is strictly the domain of the host CPU, not the audio hardware.
 
Let's see, if fps isn't a valid measure of performance, then why do 99% of the benchmarking apps on the market for PCs include fps?

Keep in mind that a system has many bottlenecks depending on the game or software that's running on it.

Being able to run a game with all effects on at 60fps instead of 30fps requires higher overall performance.
 
Legion said:
That is irrelevant. The N64 did still have a power lead. How can you say it didn't make a difference? Can you play Zelda: OoT or Majora's Mask and say this? Did the graphics make a difference? Yes, I would say they did. If the N64 was just as powerful as the PSX and turned out a product that was just as good, what would have been the incentive to buy it?

The reason people bought the N64 was Zelda, Mario, GE, not the specs of the N64. The PS1 was underpowered in the eyes of tech fans, but in the end that was pointless because it had the games. If you think that the gap in graphics within this generation is a reason to buy an Xbox over a PS2, OK for you, but you are in the minority.

I disagree. What about all those people who bought the PS2 to have a DVD player? What about some of those people who bought the N64? Can you say that 100% of the people who bought an N64 bought it because of the Nintendo label? I would say a pretty good number bought it because of its "superior" nature and its "cool graphics."

No. The N64 was not considered superior, or even cool in every department, by the mass market. There are always people who will consider this to be valid, but they are the minority.

To me this is sophistry. Graphics may not have helped it outsell the PSX, but the truth is the PSX outsold the N64 for very valid reasons. This is not to say that its power didn't help it sell better than it would have without it.

Of course, but since you cannot evaluate the number of people who bought the N64 only for the technology, this is not a fact at all (since you like facts ;) ). The only fact you have is that the PS1 outsold the N64 3 to 1 while being, at the same time, the inferior technology, with a gap comparable to the gap between the Xbox and PS2.
 
Probably the same reason you haven't been banned, Cybermerc :LOL:. Cybermerc, honestly, you say things at times that just make me want to place you in the same category as captainhowdy.
 
The reason people bought the N64 was Zelda, Mario, GE, not the specs of the N64. The PS1 was underpowered in the eyes of tech fans, but in the end that was pointless because it had the games. If you think that the gap in graphics within this generation is a reason to buy an Xbox over a PS2, OK for you, but you are in the minority.

I entirely disagree. Mario 64 was very much a tech demo of the hardware. I don't see how you can deny the "wow" effect of the game. Zelda most likely couldn't have been done on another system without some form of modifications here and there. Again, I disagree with you. Part of the draw of the new Nintendo game was in fact the draw of having a new Zelda on the N64. Was this simply because it was another game in the Zelda line? I can't say that everyone had just one single reason to buy that game. I didn't state anything about the graphics being a factor for me.


No. The N64 was not considered superior, or even cool in every department, by the mass market. There are always people who will consider this to be valid, but they are the minority.

Even if some did consider the graphics important, that still supports what I was saying. I didn't say the vast majority of gamers bought the games due to their appearance. I said that some did buy it because that was a factor that drew them in.


Of course, but since you cannot evaluate the number of people who bought the N64 only for the technology, this is not a fact at all (since you like facts). The only fact you have is that the PS1 outsold the N64 3 to 1 while being, at the same time, the inferior technology, with a gap comparable to the gap between the Xbox and PS2.

Now this is becoming absolutely ridiculous. I never stated that the reason EVERYONE bought the N64 was its power. Never did I even state that the majority felt this way. You took it upon yourself to assume this. I did say the draw for some to buy these games (not necessarily to buy the game) would be the games' appearance. Titles like Mario and Zelda had already made a name for themselves. The progression from 2D to 3D created some interest in itself. I hardly believe such a step was possible on any of the N64's ancestors. What about the comment I made earlier? Simply because the hardware didn't enable it to sell more than the PSX doesn't mean it didn't have some success at helping the N64 sell as well as it did.
 
PC Engine, I completely agree with you. I am trying to figure out the meaning behind the FPS statement that was made earlier. I am also curious about this TV rendering process. Mathematically it seems to be using about the same frame buffer bandwidth as a game rendering 30 full images every 2 refreshes. I am very interested in finding out what the overall results would be.

BTW, why does Benskywalker believe the game is running at 30 fps? Why do some of these forum-goers believe the game is running at 60 fps?
 
marconelly! I never disagreed with you on what effects RL is using. I am just wondering why you think they are comparable.

PS: I am not trying to say that any of you are feeble-minded. If you are assuming this, then that is on you.
 
Mathematically it seems to be using about the same frame buffer bandwidth as a game rendering 30 full images every 2 refreshes. I am very interested in finding out what the overall results would be.
Without going into the frame buffer bandwidth calculations, I think it's pretty obvious that a 60FPS game has to process, texture, and render twice the amount of geometry in the same amount of time.
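
A quick back-of-the-envelope on what that means per frame (just a sketch; the 640x480, 32-bit colour + 24-bit Z framebuffer figures are assumptions, not any particular console's setup):

#include <stdio.h>

int main(void)
{
    const double ms_per_frame_30 = 1000.0 / 30.0;   /* ~33.3 ms */
    const double ms_per_frame_60 = 1000.0 / 60.0;   /* ~16.7 ms */

    /* Assumed framebuffer: 640x480, 4 bytes colour + 3 bytes Z per pixel,
       one write per pixel (no overdraw counted). */
    const double bytes_per_frame = 640.0 * 480.0 * (4.0 + 3.0);

    printf("30 fps: %.1f ms/frame, %.1f MB/s fill\n",
           ms_per_frame_30, bytes_per_frame * 30.0 / (1024.0 * 1024.0));
    printf("60 fps: %.1f ms/frame, %.1f MB/s fill\n",
           ms_per_frame_60, bytes_per_frame * 60.0 / (1024.0 * 1024.0));
    /* Same scene at 60 fps: half the time per frame, twice the per-second
       geometry and fill work. */
    return 0;
}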


marconelly! I never disagreed with you on what effects RL is using. I am just wondering why you think they are comparable.
I have no idea if they are comparable, except visually, but I just thought DOT3 is DOT3 and 60FPS is 60FPS, so it made logical sense to compare...
 
Do we have a reasonable figure for the geometry? How would the geometry and textures be affected by the downsampling?
 
Legion said:
I entirely disagree. Mario 64 was very much a tech demo of the hardware. I don't see how you can deny the "wow" effect of the game.

I'm not denying anything about M64. The fact is that at the end of the gen, all the superior N64 games (most of which I own) were not enough in the eyes of the majority.

Even if some did consider the graphics important, that still supports what I was saying. I didn't say the vast majority of gamers bought the games due to their appearance. I said that some did buy it because that was a factor that drew them in.

You will always find people buying games for their appearance. They are not the majority. People buy games based on marketing and hype.

Simply because the hardware didn't enable it to sell more than the PSX doesn't mean it didn't have some success at helping the N64 sell as well as it did.

You will always find people buying consoles for their graphics, so if finding "some" is enough to validate your point, OK. I recall that the start of our quote exchange was about you saying that people will recognize the difference between the Xbox and PS2 like they did for the PS1/N64. Looking at the past, the fact is that the majority does not care or does not see it.
 
Only pixel intensity values (RGBA) are sent through the RAMDAC to the display. All other data is dealt with prior to rasterization and is not in any way affected by display settings. And the GameCube uses a three-line filter when downsampling to screen resolution.
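
As a rough illustration of what a vertical three-line filter does before the image goes out to an interlaced display (a sketch with assumed 1/4, 1/2, 1/4 weights; the actual GameCube coefficients are programmable, not what's shown here):

#include <stdio.h>

#define W 4   /* tiny example width  */
#define H 6   /* tiny example height */

int main(void)
{
    /* One 8-bit channel of a rendered frame (illustration values). */
    unsigned char frame[H][W] = {
        { 10,  10,  10,  10},
        {200, 200, 200, 200},   /* a bright one-line feature that would flicker */
        { 10,  10,  10,  10},
        { 10,  10,  10,  10},
        { 10,  10,  10,  10},
        { 10,  10,  10,  10},
    };

    /* Each output line blends the line above, itself, and the line below
       with assumed weights 1/4, 1/2, 1/4 before going to the display. */
    for (int y = 0; y < H; y++) {
        int up = (y > 0)     ? y - 1 : y;
        int dn = (y < H - 1) ? y + 1 : y;
        for (int x = 0; x < W; x++) {
            int out = (frame[up][x] + 2 * frame[y][x] + frame[dn][x]) / 4;
            printf("%4d", out);
        }
        printf("\n");
    }
    return 0;
}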
 
So, Legion, you read the specs on a game before you buy it?

Clearly the Xbox games don't look "50%" better than the Cube and PS2 games. It's not the specs, it's how it looks, and with the current generation the gap just isn't big enough for the Xbox to make a difference. And those who want the best graphics buy a high-end PC anyway.
 
I got to play on an N64 the other day and loaded up Mario Kart... ahh, how I marvelled at how bad it really looked, yet I remember it being so pretty...

Hey, Nintendo was one of the first companies to make 'mip mapping' and 'bilinear filtering' sound cool to a 16-year-old kid who knew nothing about it except that it made everything look blurry and pastel-coloured.

Hehe...
 