Provocative comment by Id member about PS2 (and Gamecube)!

function said:
This is getting a bit crazy IMO.

We're talking about tens or hundreds of thousands of polygons a frame in modern games. And people can tell which machine can comfortably use higher polygon counts by looking at screenshots of a game they like...?
It's not like you can't spot individual polygons anymore; we're not doing REYES rendering (yet).
Squeak said:
I didn't say they were not. I said that few Xbox games actually use them.

Anisotropic maybe but trilinear?? How do you know this???

I use my eyes.

In the nicest way possible, I suspect you don't know what you're looking for. I haven't seen anything on the Xbox that isn't using trilinear filtering. If you aren't using trilinear, the points where the textures change between mip-map levels are pretty obvious. And without mip maps you get the kind of texture aliasing (distance shimmer) common on PS2 and some DC stuff.
I can see the mip line in plenty of games, it’s just that on some textures it isn’t as obvious (maybe because they have less contrast?).

Squeak said:
Not trilinear, but aniso. With aniso you're decreasing the resolution of the texture perpendicularly to the tilt of the polygon; the same thing can be done with a little creative use of clamp.

With aniso you're sampling texels in a non-square pattern, usually something like a rectangle, in the direction that the textured surface points away from the camera. I've never heard it described as decreasing the resolution of a texture before.
Unless I've completely misunderstood the principles of anisotropic filtering, you're effectively downsampling the texture along the axis perpendicular to the one along which the polygon is tilted toward or away from the camera. (To my knowledge, a rectangular sampling pattern is still the one used in real-time applications, although a trapezoid, or better still an ellipse, would be preferable.)

I've never actually come across this "clamp" operation on PS2 before, but I don't see how 'stretching' a texture (as you describe it) can do the same thing as aniso. Presumably that would just distort the texture ...
It is just a simple wrap method, where the texture is stretched to fit the available space. Standard on pretty much every piece of modern 3d hardware I believe.
How would it work? Well, one way to do it would be to render the texture to a long 2D polygon, bilinear filter it, and then return the texture to its original dimensions when rendering.
Squeak said:
london-boy said:
not to be off topic, :LOL: :rolleyes: but why are there still people who argue about hardware superiority? I have a PS2, not an Xbox (for obvious financial reasons, nothing else), and i'm the first to acknowledge that hardware that came out 18 months after PS2 will of course be superior.

Much of the technology in the Xbox is the same age as, or older than, the technology in the PS2, and most importantly the PS2 is overall better optimised for realtime 3D.

And the NV2A wasn't designed around doing realtime 3D graphics? :eek:
Of course it was (although it is probably optimised for higher resolutions than what a TV offers, due to its PC heritage). It is the other parts of the system that are less than ideal: "Swiss army knife" CPU, high-latency RAM, and not enough bandwidth.
 
Squeak said:
It is the other parts of the system that are less than ideal: "Swiss army knife" CPU, high-latency RAM, and not enough bandwidth.

PS2 RDRAM has higher latency than Xbox DDR and offers half the bandwidth :rolleyes:
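
For reference, the usual back-of-the-envelope numbers (standard published specs, assuming I'm remembering them right): PS2 main RAM is two 16-bit RDRAM channels at 800 MT/s, so 2 x 2 bytes x 800M = 3.2 GB/s. Xbox main RAM is a 128-bit DDR bus at 200 MHz (400 MT/s), so 16 bytes x 400M = 6.4 GB/s. Exactly half.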
 
Zeross said:
PS2 RDRAM has higher latency than Xbox DDR and offers half the bandwidth :rolleyes:

Where has this been documented? AFAIK, it has only been determined that Intel i850-based RDRAM has higher latency than DDR. The PS2 implementation of RDRAM is not synonymous with i850-based RDRAM. Being a simpler and more direct design, I imagine the PS2 implementation is easily comparable to Xbox DDR in latency, if not better. This is the sort of configuration where RDRAM was truly born to live up to its expectations, not the more well-known PC architecture version.

As has already been noted by others, embedded DRAM will naturally have superior performance to DDR. As such, the PS2 has it where it will count the most (in the GS), and the Xbox does not have it in an analogous manner. It [Xbox] has to rely on the relatively ponderous DDR main memory for all video operations, hence the notion that a bandwidth issue potentially throttles video performance on the Xbox in the first place.
 
Paul said:
PS2 RDRAM has higher latency than Xbox DDR and offers half the bandwidth

GS VRAM (eDRAM): 48 GB/s

Yeah I know that, but we were talking about main RAM, and NV2A has a texture cache with even higher bandwidth, given it's a lot smaller of course... And don't forget texture compression and bandwidth-saving techniques. I've never seen an Xbox developer complaining about bandwidth issues. An interesting article: http://www.gamasutra.com/gdc2003/features/20030307/dobson_01.htm
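
To put numbers on it (standard published figures, if I have them right): the GS gets its 48 GB/s from a very wide on-die bus, 2,560 bits at 150 MHz, i.e. 320 bytes x 150M = 48 GB/s. And on the compression side, DXT1 packs a 4x4 block of texels into 64 bits, i.e. 4 bits per texel versus 32 for raw RGBA - an 8:1 saving on texture bandwidth before the cache even comes into it.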
 
randycat99 said:
Zeross said:
PS2 RDRAM has higher latency than Xbox DDR and offers half the bandwidth :rolleyes:

Where has this been documented? AFAIK, it has only been determined that Intel i850-based RDRAM has higher latency than DDR. The PS2 implementation of RDRAM is not synonymous with i850-based RDRAM. Being a simpler and more direct design, I imagine the PS2 implementation is easily comparable to Xbox DDR in latency, if not better. This is the sort of configuration where RDRAM was truly born to live up to its expectations, not the more well-known PC architecture version.



i'm not sure why MS would use memory that is actually SLOWER than memory that came out 18 months before their product. but i digress.
the point is, when it comes to the whole spectrum of performance and bottlenecks, memory latency in both consoles is one of the last things we should be worrying about....
remember that these memories are fast enough for what the processors attached to them are going to process.
really, although latency can make things faster to a certain degree, a quicker and easier boost in performance would come from sorting out other, much more irritating bottlenecks in both consoles. bandwidth first. and, unless i got this all wrong, bandwidth has more to do with the buses between memory and processor than memory speed itself.
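
(rule of thumb: peak bandwidth = bus width in bytes x transfers per second. a wider bus buys you just as much as a faster clock.)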
 
london-boy said:
randycat99 said:
Zeross said:
PS2 RDRAM has higher latency than Xbox DDR and offers half the bandwidth :rolleyes:

Where has this been documented? AFAIK, it has only been determined that Intel i850-based RDRAM has higher latency than DDR. The PS2 implementation of RDRAM is not synonymous with i850-based RDRAM. Being a simpler and more direct design, I imagine the PS2 implementation is easily comparable to Xbox DDR in latency, if not better. This is the sort of configuration where RDRAM was truly born to live up to its expectations, not the more well-known PC architecture version.



i'm not sure why MS would use memory that is actually SLOWER than memory that came out 18 months before their product. but i digress.
the point is, when it comes to the whole spectrum of performance and bottlenecks, memory latency in both consoles is one of the last things we should be worrying about....
remember that these memories are fast enough for what the processors attached to them are going to process.
really, although latency can make things faster to a certain degree, a quicker and easier boost in performance would come from sorting out other, much more irritating bottlenecks in both consoles. bandwidth first. and, unless i got this all wrong, bandwidth has more to do with the buses between memory and processor than memory speed itself.

It's not that it is slower, per se. It's just that the PS2 implementation is much more direct: the goal is simply to hardwire two 16 MB memory chips to the memory bus. An Intel P4 implementation, OTOH, has to mate the bus to 2 or 4 memory card carriers which may or may not be occupied, plus each memory card has to further combine multiple memory modules to come up with 64/128/256/512 MB net capacities. It is far more complex, and therefore there are many more areas where some "slack" has to exist to make sure everything synchronizes properly. That "slack" will manifest itself as some sort of latency. It's not that the memory itself is slower.
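
To put a rough number on that "slack": at 200 MHz a clock cycle is 5 ns, so every extra cycle of buffering or synchronization the signal crosses on its way through slots and carriers is another 5 ns of latency before the memory cell itself is even touched.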
 
Squeak said:
function said:
This is getting a bit crazy IMO.

We're talking about tens or hundreds of thousands of polygons a frame in modern games. And people can tell which machine can comfortably use higher polygon counts by looking at screenshots of a game they like...?
It's not like you can't spot individual polygons anymore; we're not doing REYES rendering (yet).

Spotting individual vertices is more an indication of the way that a game's polygon budget has been used, or of the skill of the modellers, than of how many polygons are being used in the rendering of a frame.

Let's say you can spot 10 vertices in a frame in one game and 20 in a frame from another game. Does that mean the first game is using twice as many polygons per second?

No matter how much you like the look of Jak and Daxter 2, it doesn't [necessarily] mean the PS2 can draw more polygons than the Xbox. :)

Squeak said:
I didn't say they were not. I said that few Xbox games actually use them.

Anisotropic maybe but trilinear?? How do you know this???

I use my eyes.

In the nicest way possible, I suspect you don't know what you're looking for. I haven't seen anything on the Xbox that isn't using trilinear filtering. If you aren't using trilinear, the points where the textures change between mip-map levels are pretty obvious. And without mip maps you get the kind of texture aliasing (distance shimmer) common on PS2 and some DC stuff.
I can see the mip line in plenty of games, it’s just that on some textures it isn’t as obvious (maybe because they have less contrast?).

Could you point out some of these games? I'd like to check them out.

Mip lines may be less visible on some textures than others - possibly where there is low contrast between the texels within the 2x2 texel area to be averaged for the next mip level. After years of gaming with the DC and on old PC cards (where I turned off trilinear filtering to gain performance), I can't believe they've suddenly become practically invisible, though.
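
(For what it's worth, that averaging step is just a 2x2 box filter. A toy sketch, single-channel texture, power-of-two sizes assumed - not any particular console's hardware:)

/* Build the next mip level by averaging each 2x2 block of texels. */
void next_mip_level(const unsigned char *src, int w, int h,
                    unsigned char *dst)
{
    for (int y = 0; y < h / 2; y++)
        for (int x = 0; x < w / 2; x++)
            dst[y * (w / 2) + x] = (unsigned char)(
                (src[(2 * y)     * w + (2 * x)]     +
                 src[(2 * y)     * w + (2 * x + 1)] +
                 src[(2 * y + 1) * w + (2 * x)]     +
                 src[(2 * y + 1) * w + (2 * x + 1)]) / 4);
}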

Dithering can be used to hide the edges of the mip-map levels, but it's a very rough way to try and hide the problem. Besides, I've never known anyone who felt it necessary to turn off trilinear filtering on their GF4 to try and boost frame rate.
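
And to be clear about what I mean by trilinear, here's a rough sketch of the idea in C. Texture and bilinear_sample() are hypothetical placeholders for the fixed-function hardware, not anyone's real API:

/* Trilinear = blend between bilinear samples from the two nearest
   mip levels, weighted by the fractional level-of-detail. */
typedef struct Texture Texture;
float bilinear_sample(const Texture *tex, int level, float u, float v);

float trilinear_sample(const Texture *tex, float u, float v, float lod)
{
    int   level = (int)lod;      /* finer of the two mip levels */
    float frac  = lod - level;   /* how far toward the coarser level */
    float fine   = bilinear_sample(tex, level,     u, v);
    float coarse = bilinear_sample(tex, level + 1, u, v);
    /* Bilinear-only filtering skips this blend and snaps to one
       level, which is exactly what produces the visible mip line. */
    return fine + frac * (coarse - fine);
}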

Squeak said:
Not trilinear, but aniso. With aniso you're decreasing the resolution of the texture perpendicularly to the tilt of the polygon; the same thing can be done with a little creative use of clamp.

With aniso you're sampling texels in a non-square pattern, usually something like a rectangle, in the direction that the textured surface points away from the camera. I've never heard it described as decreasing the resolution of a texture before.

Unless I've completely misunderstood the principles of anisotropic filtering, you're effectively downsampling the texture along the axis perpendicular to the one along which the polygon is tilted toward or away from the camera. (To my knowledge, a rectangular sampling pattern is still the one used in real-time applications, although a trapezoid, or better still an ellipse, would be preferable.)

You don't actually alter the texture, but you sample texels from it in a pattern that will more accurately allow you to display what's supposed to be 'under' a pixel. You may consider this to be the same thing as downsampling the texture (in a similar way to how SSAA works), but I like to consider the samples you're taking and the actual source of the samples to be separate things.
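
Schematically, it's something like this (same hypothetical Texture and bilinear_sample() placeholders as before; real hardware derives the tap count from the degree of anisotropy, capped at its maximum):

/* Schematic anisotropic sampling: average several bilinear taps
   spaced along the long axis (du, dv) of the pixel's footprint in
   texture space. */
typedef struct Texture Texture;
float bilinear_sample(const Texture *tex, int level, float u, float v);

float aniso_sample(const Texture *tex, int level, float u, float v,
                   float du, float dv, int taps)
{
    float sum = 0.0f;
    for (int i = 0; i < taps; i++) {
        /* Walk from one end of the footprint's long axis to the other. */
        float t = (taps > 1) ? (float)i / (taps - 1) - 0.5f : 0.0f;
        sum += bilinear_sample(tex, level, u + t * du, v + t * dv);
    }
    return sum / taps;
}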

I've never actually come across this "clamp" operation on PS2 before, but I don't see how 'stretching' a texture (as you describe it) can do the same thing as aniso. Presumably that would just distort the texture ...
It is just a simple wrap method, where the texture is stretched to fit the available space. Standard on pretty much every piece of modern 3d hardware I believe.
How would it work? Well, one way to do it would be to render the texture to a long 2D polygon, bilinear filter it, and then return the texture to its original dimensions when rendering.

I'm just having a little trouble getting my head round the description; it's late over here. You're saying that you stretch the texture (storing it at a higher resolution), bilinear filter it, then downsample it to the original dimensions before applying it as normal?

Don't have time to think about how effective this might be, but it would seem to be an awfully expensive way of filtering textures, especially if it had to be done to whole textures rather than just the texels local to the pixel you wish to texture (those within the sampling pattern).
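
If I've followed the trick at all, the net effect is a box filter applied along a single axis of the whole texture. A toy CPU version of that (single channel, definitely not actual PS2 code):

/* Pre-filter a texture along the v axis only - roughly what rendering
   it stretched with bilinear filtering and shrinking it back buys you.
   Done per-texture, not per-pixel, so unlike real aniso it can't adapt
   to the actual footprint of each screen pixel. */
void downsample_v(const unsigned char *src, int w, int h,
                  unsigned char *dst, int factor)
{
    for (int y = 0; y < h / factor; y++)
        for (int x = 0; x < w; x++) {
            int sum = 0;
            for (int k = 0; k < factor; k++)
                sum += src[(y * factor + k) * w + x];
            dst[y * w + x] = (unsigned char)(sum / factor);
        }
}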

Squeak said:
london-boy said:
not to be off topic, :LOL: :rolleyes: but why are there still people who argue about hardware superiority? I have a PS2, not an Xbox (for obvious financial reasons, nothing else), and i'm the first to acknowledge that hardware that came out 18 months after PS2 will of course be superior.

Much of the technology in the Xbox is the same age as, or older than, the technology in the PS2, and most importantly the PS2 is overall better optimised for realtime 3D.

And the NV2A wasn't designed around doing realtime 3D graphics? :eek:
Of course it was (although it is probably optimised for higher resolutions than what a TV offers, due to its PC heritage). It is the other parts of the system that are less than ideal: "Swiss army knife" CPU, high-latency RAM, and not enough bandwidth.

I don't think the Xbox CPU is at all bad, as it happens [in terms of what it can actually do, ignoring size, heat and cost!]. When you look at what's doing the same kind of work in the PS2 and GC, it actually comes out looking pretty strong. Plus it gives MS some headroom for making the Xbox into more than just a simple games console.

P.S. These nested quotes took a while to get right! Shoulda probably just chopped everything down instead ...
 
Hmmm, I've got a question...

If we were to do a new Rez, in other words Rez 2.0, with massive amounts of particles and polys... which would be better, Xbox or PS2?

Or would they be pretty close, since they both would gain a lot from avoiding texturing altogether (no?)?

As for what I want to see.... I want a glimpse of the next-gen... a glimpse of the future... 70+K highly animated models with supah texturing.... just like the one from last gen.... if I'm not mistaken it can be done... but will it be :?:

ed
 
randycat99 said:
london-boy said:
randycat99 said:
Zeross said:
PS2 RDRAM has higher latency than Xbox DDR and offers half the bandwidth :rolleyes:

Where has this been documented? AFAIK, it has only been determined that Intel i850-based RDRAM has higher latency than DDR. The PS2 implementation of RDRAM is not synonymous with i850-based RDRAM. Being a simpler and more direct design, I imagine the PS2 implementation is easily comparable to Xbox DDR in latency, if not better. This is the sort of configuration where RDRAM was truly born to live up to its expectations, not the more well-known PC architecture version.



i'm not sure why MS would use memory that is actually SLOWER than memory that came out 18 months before their product. but i digress.
the point is, when it comes to the whole spectrum of performance and bottlenecks, memory latency in both consoles is one of the last things we should be worrying about....
remember that these memories are fast enough for what the processors attached to them are going to process.
really, although latency can make things faster to a certain degree, a quicker and easier boost in performance would come from sorting out other, much more irritating bottlenecks in both consoles. bandwidth first. and, unless i got this all wrong, bandwidth has more to do with the buses between memory and processor than memory speed itself.

It's not that it is slower, per se. It's just that the PS2 implementation is much more direct: the goal is simply to hardwire two 16 MB memory chips to the memory bus. An Intel P4 implementation, OTOH, has to mate the bus to 2 or 4 memory card carriers which may or may not be occupied, plus each memory card has to further combine multiple memory modules to come up with 64/128/256/512 MB net capacities. It is far more complex, and therefore there are many more areas where some "slack" has to exist to make sure everything synchronizes properly. That "slack" will manifest itself as some sort of latency. It's not that the memory itself is slower.

Are you implying that the Xbox uses DIMMs instead of surface mount DDRAMs on the motherboard???
 
No. You should also note that my post was more about Intel's i840/850 implementation as it relates to RDRAM than what sort of DDR goes into an Xbox.
 
randycat99 said:
No. You should also note that my post was more about Intel's i840/850 implementation than what goes into an Xbox.

...therefore your post is irrelevant with regards to Xbox DDR vs PS2 RDRAM latency.
 
You speak too soon. I was simply commenting that the RDRAM latency typical of what you find in a PC isn't necessarily the same as what you would find in a PS2. Thus the claim that the Xbox necessarily has lower main-memory latency than the PS2, just because the PS2 uses "icky" RDRAM, is not exactly clear-cut.
 
randycat99 said:
You speak too soon. I was simply commenting that the RDRAM latency typical of what you find in a PC isn't necessarily the same as what you would find in a PS2. Thus the claim that the Xbox necessarily has lower main-memory latency than the PS2, just because the PS2 uses "icky" RDRAM, is not exactly clear-cut.

Fair enough, though you have to admit that Xbox DDR has very nice bandwidth compared to PS2 RDRAM, which I completely overlooked. Thanks Zeross :D

Again, what is your point?

As I've already said, I can't see the great cost-to-usefulness ratio in putting a HDD in the machine as standard. It is still the biggest cost MS has to swallow in the production of the box.
And the point about the built-in Ethernet port is flawed, as you still have to put down extra money to actually use it for anything other than just link-up gaming.

When did the PS2 HDD and Ethernet get bundled with US PS2s? How many PS2s were already bought before this bundle? What happens to those people who already own a PS2 and want the Ethernet or HDD? What about Japan and Europe? What MS has to absorb in costs is unimportant to a consumer as long as the consumer doesn't have to pay more for the HDD and Ethernet. Regarding online gaming, on Xbox you get the hardware at no extra cost and you have to pay to play online games. How does the online gaming service work on PS2? Is it free??
 
Nice link Zeross.

I agree, the bandwidth 'issue' on Xbox is thrown around a lot here, with nothing to back it up. On the other hand, these articles (like the one you linked to at Gamasutra) usually cite the XCPU as the limiting factor, again usually related to animation.
 
PC-Engine said:
When did the PS2 HDD and Ethernet get bundled with US PS2s? How many PS2s were already bought before this bundle? What happens to those people who already own a PS2 and want the Ethernet or HDD? What about Japan and Europe? What MS has to absorb in costs is unimportant to a consumer as long as the consumer doesn't have to pay more for the HDD and Ethernet. Regarding online gaming, on Xbox you get the hardware at no extra cost and you have to pay to play online games. How does the online gaming service work on PS2? Is it free??

He was just stating the simple version, where you can't wholly lash out at the PS2's lack of a broadband adaptor (though it's bundled now) as an "extra cost" but overlook Live costs on the Xbox itself. It costs the consumer to buy the Broadband Adaptor, and it costs the consumer to join Xbox Live. <shrugs>

HDD is still not bundled with the PS2, and unlikely to be so for a while, since they're wrapping it and FF11 together as a $99 deal right now. We have yet to see how appealing non-FF/MMORPG fans find it, nor how much use they put the HDD to in general. (Though they already announced games with downloadable content and other such stuff coming down the pipe--probably what they were waiting to see before launching the HDD now. Obviously they've had this thing around and available for quite some time for Linux setups. Guess they had to get more devs on board and wait for a big title to need it to start off strong.) Hard drive is most DEFINITELY an Xbox advantage. (Though as was pointed out, probably a goodly sum to absorb.)

The adaptor, though, is free outside of the initial cost (which is less than the subscription price of Xbox Live, in fact), and so far the only games which have any cost attached are MMORPGs, which by nature have to. (Though there's a chance Microsoft is letting True Fantasy Online be free for Live subscribers, which is pretty cool. If ALL such games were bundled as part of the Live price that would be pretty cool, but I don't think this will extend to anything not developed by MS themselves. And it's still uncertain whether TFO will have an extra subscription or not.)

Certainly developers CAN charge extra for their online services, but I don't see them wanting to for a while. It's certainly been non-existent on PCs (excepting MMOs, of course), though of course Valve is trying Steam out now... (But that's a pretty different model in general, as well.) So for now there are plenty of trade-offs, and no obvious "cost winner" as far as online capacity goes. (And since it can always change at the drop of a hat, this will remain a "for now" decision.)

Consumers are used to "absorbing costs," though. It's the price they pay for early adoption. Heh... Consoles go down in price, games get cheaper, bonuses and packages get added, new peripherals come along... <shrugs> We're used to it by now on all fronts. Most CERTAINLY those of us with computers! We get that in spades. <grins>



Oh, and thanks for the link, Zeross. Interesting read. ^_^
 
HDD is still not bundled with the PS2, and unlikely to be so for a while, since they're wrapping it and FF11 together as a $99 deal right now. We have yet to see how appealing non-FF/MMORPG fans find it

Something tells me even FF fans will hate it ;) Generic PC MMORPG with Chocobos does not a Final Fantasy make!
 
Yeah I know that, but we were talking about main RAM, and NV2A has a texture cache with even higher bandwidth, given it's a lot smaller of course...
I've read somewhere that that cache is 32 GB/s (so it's not faster), but feel free to correct me.

How does the online gaming service work on PS2? Is it free??
Yes, it's free, and has been since the beginning. Unlike with the Xbox, if you buy a PS2 now (for $200 you get a console and a network adapter) you actually DO get online gaming out of the box, just like with the Dreamcast, as you don't have to pay anything to play games online.

Even before, it was cheaper to get online with the PS2 than with the Xbox ($40 for the adapter vs $50 for an Xbox Live subscription)
 
zurich said:
Something tells me even FF fans will hate it ;) Generic PC MMORPG with Chocobos does not a Final Fantasy make!

True, though I'm staggered that at this point there are still people who don't understand what an MMORPG is. Some will learn to love it, though. As much as they won't be advancing to super-duper-ridiculous levels in short order, what people CAN do is impressive (lots of big numbers about, and lord knows FF players like those. ;) ), and the lack of storyline direction is compensated for by LIVING in the world and interacting with others. Many avoid MMORPGs as "not for them," and I can certainly see plenty of console RPGers getting disillusioned, but it's still appealing in its own way to a variety of folks, so it should do all right.

Most interesting about FF11, I think, is that it will let console and PC gamers exist on the same servers... Right now I'm not sure what to think, though... it could go either way. Hehe... But it should at least be interesting! ;)
 
I know what the attraction is to MMORPGs.. I've been there and done that with EQ some time ago. Problem is, nothing has changed. One would think after some 3-4 years or so that the genre would evolve, but alas :?
 