Some questions and thoughts about next gen...

Leeco

Newcomer
I have some questions for people who really understand the design of the next gen consoles.
If I remember correctly, the saying is "a system is only as powerful as its weakest link", isn't it?
From most opinions it looks like the Xbox 360 has a very well-thought-out design without any flaws (if that is possible...).
The PS3, on the other hand, gives me the impression of being very good at floating-point computation, but I haven't seen anyone saying how great the general design is.
So, my question is: what are the weakest links in the design of both the PS3 and the Xbox 360 that we know of? (I am not talking about the HDD, which is clearly a strong point, but more technical points like bandwidth, etc.)
 
I asked this regarding the XB360. The chief limiting factors by consensus are the bandwidth to main memory (22 GB/s shared between CPU and GPU) and the cache size (1 MB shared between 3 cores / 6 threads). Not to say these are particular problems; just the areas where the system isn't strongest. Overall it does look like a very balanced system.
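
To put a rough number on what that bandwidth means in practice, here's a quick back-of-envelope sketch in Python (the 22.4 GB/s figure and the 60 fps target are assumptions for illustration):

# Rough per-frame traffic budget for a unified 22.4 GB/s bus shared by CPU and GPU.
BANDWIDTH_BYTES_PER_SEC = 22.4e9   # ~22 GB/s main memory bandwidth
TARGET_FPS = 60                    # assumed frame rate target

bytes_per_frame = BANDWIDTH_BYTES_PER_SEC / TARGET_FPS
print(f"~{bytes_per_frame / 1e6:.0f} MB of memory traffic available per 60 fps frame")
# -> roughly 373 MB per frame to cover textures, geometry, CPU working data, etc.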

It's too early to say for PS3 as we don't know where the GPU fits in, and we're lacking feedback from devs on how it is to code for. Suggestions are that it'll be hard to adapt to and write for, and that the GPU won't be able to manage the same level of image quality as Xenos, as there's no known efficient AA mechanism. Until we see the whole box it's all pie-in-the-sky conjecture at the moment though.
 
Shifty Geezer said:
it's all pie-in-the-sky conjecture at the moment though.

mmmmmm...pie-in-the-sky...

I have a feeling when the PS3 comes out, there will be new Portal-LCD technology that will allow you to transport yourself through an LCD into someone else's monitor... which in turn will allow you to strangle said person... this will be the main form of combat over the internet between PS3 gamers and Xbox 360 gamers... Revolution gamers will watch and laugh...
 
Shifty: So explain it for a noob - surely AA isn't the be-all and end-all of graphics quality, is it? What about at "ultra" high resolutions - is it as relevant? Maybe I'm wrong (and as indicated I have noooo idea when it comes to the tech side), but from the write-ups I've seen on the G70, it's very powerful (but with a couple of flaws?) - it's weird that they'd spec in all the features they have but not have enough capacity for varying levels of AA.

As for the Cell - hmm. Strikes me very much as if it's gonna be a lot of "untapped potential" again. Those devs that can be bothered to code with SPEs in mind will get phenomenal results, but those that don't will reach lesser heights. Although I have to say I was impressed with the quality of the Unreal demo from E3 (bear in mind I don't have a high-end PC!)
 
slider said:
Shifty: So explain it for a noob - surely AA isn't the be-all and end-all of graphics quality, is it? What about at "ultra" high resolutions - is it as relevant? Maybe I'm wrong (and as indicated I have noooo idea when it comes to the tech side), but from the write-ups I've seen on the G70, it's very powerful (but with a couple of flaws?) - it's weird that they'd spec in all the features they have but not have enough capacity for varying levels of AA.

As for the Cell - hmm. Strikes me very much as if it's gonna be a lot of "untapped potential" again. Those devs that can be bothered to code with SPEs in mind will get phenomenal results, but those that don't will reach lesser heights. Although I have to say I was impressed with the quality of the Unreal demo from E3 (bear in mind I don't have a high-end PC!)

The thing about ultra-high resolutions is you need a TV to support them. 1080p seems sufficient, but 1080p TVs are just starting to hit the market at a reasonable price. I wonder if 1080i is good enough to mask whatever lack of AA there is...
 
So even at 1080i AA would be an issue? Sheesh. Totally unsighted on this stuff as I've got a PS2 on my humble telly & a strictly non-gaming PC.

Just as well I look for more from gaming than just pretty graphics!
 
The chief limiting factors by consensus are the bandwidth to main memory (22 GB/s shared between CPU and GPU) and the cache size (1 MB shared between 3 cores / 6 threads).


I don't see why people on the net think the bandwidth to main memory is a problem. Isn't the majority of bandwidth to main memory on a console mostly used by the frame buffer? And in this case the frame buffer is located inside the eDRAM, thus leaving the full 22 GB/s of bandwidth for everything else?

Also, WHY do people think the 1 MB shared cache is a problem when you can lock a portion of the cache so there's no contention between devices? ERP or DeanoC, please correct me if I'm wrong.
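
For what it's worth, here's a rough sketch in Python of the kind of colour/depth traffic the eDRAM would be soaking up (the overdraw figure and buffer formats are assumptions, not measured numbers):

# Back-of-envelope estimate of 720p frame buffer traffic kept inside the eDRAM.
WIDTH, HEIGHT = 1280, 720
BYTES_COLOUR = 4      # 32-bit colour write per shaded pixel
BYTES_DEPTH = 4       # 32-bit depth/stencil
OVERDRAW = 3.0        # assumed average overdraw
FPS = 60

# per shaded pixel: one depth read, one depth write, one colour write (no AA, no blending)
per_frame = WIDTH * HEIGHT * OVERDRAW * (2 * BYTES_DEPTH + BYTES_COLOUR)
print(f"~{per_frame * FPS / 1e9:.1f} GB/s of frame buffer traffic at 60 fps")
# -> ~2 GB/s even in this modest case; add 4xAA, alpha blending and higher overdraw
#    and it balloons, which is why keeping it off the 22 GB/s main bus helps.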
 
While thrashing could be an issue with the cache, I think people are concerned there's simply not enough... and you're going to end up having the CPU cores pulling a lot more from main memory, which obviously is a lot slower.

As to the bandwidth... dunno really... it's 10.8 GB/s up/down so it seems enough... but perhaps heavy use of vertex transformation based on physics sims might require lots of bandwidth... I imagine it would, though I have no real concept of a number.
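
To get a feel for the scale, here's a rough sketch in Python (every number is an assumption picked purely for illustration):

# Rough cost of streaming CPU-transformed vertices to the GPU each frame.
VERTS_PER_FRAME = 1_000_000   # vertices touched by physics/animation per frame (assumed)
BYTES_PER_VERTEX = 32         # e.g. position + normal + UVs (assumed layout)
FPS = 60

write_traffic = VERTS_PER_FRAME * BYTES_PER_VERTEX * FPS
print(f"~{write_traffic / 1e9:.1f} GB/s just to write the transformed vertices out")
# -> ~1.9 GB/s; reading the source data roughly doubles it, so it's a real chunk of a
#    10.8 GB/s link but not obviously a wall.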
 
1) A lot of the memory bandwidth hit is reduced because of the eDRAM.

2) I would have liked to see more cache in the CPU. I'm really hoping that devs bitched about this too and MS did some last-minute changes and added some more cache, but that's not going to happen.
 
Just get a decent TV with DNR. It clears those PS2 jaggies right up, though it seems to blur the picture a bit. ;)

But really it will most likely come down to personal preference on whether you think the amount of AA, or lack thereof, is sufficient. I'd say that both consoles' weaknesses are probably graphics memory bandwidth.
 
Qroach said:
The chief limiting factors by consensus are the bandwidth to main memory (22 GB/s shared between CPU and GPU) and the cache size (1 MB shared between 3 cores / 6 threads).


I don't see why people on the net think the bandwidth to main memory is a problem. Isn't the majority of bandwidth to main memory on a console mostly used by the frame buffer? And in this case the frame buffer is located inside the eDRAM, thus leaving the full 22 GB/s of bandwidth for everything else?

Also, WHY do people think the 1 MB shared cache is a problem when you can lock a portion of the cache so there's no contention between devices? ERP or DeanoC, please correct me if I'm wrong.

Without eDRAM, the Xbox 360 would have been significantly bandwidth-limited compared to the PS3, so including eDRAM was a logical choice.
The PS3, by having dedicated memories for the CPU and GPU, wouldn't need eDRAM as much as the Xbox 360 does... so it's just a design decision.
 
slider said:
So even at 1080i AA would be an issue? Sheesh. Totally unsighted on this stuff as I've got a PS2 on my humble telly & a strictly non-gaming PC.

Just as well I look for more from gaming than just pretty graphics!

When you go to higher resolutions, it only makes AA more necessary.

If you thought the jaggies in GTA: SA were bad on a normal TV, you should see how terrible they are at 480p.

At 1080i the jaggies would be ridiculous.
 
BlueTsunami said:
The thing about ultra-high resolutions is you need a TV to support them. 1080p seems sufficient, but 1080p TVs are just starting to hit the market at a reasonable price. I wonder if 1080i is good enough to mask whatever lack of AA there is...

The thing is, those without HDTVs would probably see an even bigger benefit from higher resolutions in terms of AA. A high resolution shrunk down to a lower resolution for an SDTV is effectively AA. Even 1080p -> 720p would similarly introduce some AA (whilst still benefiting from the higher resolution).

scooby_dooby said:
When you go to higher resolutions, it only makes AA more necessary.

Not true, unless I'm completely misunderstanding aliasing theory. More resolution means more samples, which is the most direct solution to aliasing. With enough resolution you would not need AA at all (not that we'll have "enough" resolution this gen to remove the need for AA, of course, but it'll be better than last gen).

At 1080p aliasing does not seem to be a big problem, especially when you've got other post-processing effects going on like DOF and motion blur. I'd still take AA, but I wouldn't consider its presence to be absolutely critical. (Well, I'm judging this by the Heavenly Sword demo at E3 - 1080p, no AA.)
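
Here's a toy sketch in Python of why more samples per pixel (whether from AA or from simply rendering at a higher resolution and scaling down) smooths an edge; the edge and the sample counts are made up purely for illustration:

def coverage(px, py, samples):
    # Fraction of sub-samples in pixel (px, py) that land under the shallow edge y = x/4.
    hits = 0
    for sy in range(samples):
        for sx in range(samples):
            x = px + (sx + 0.5) / samples
            y = py + (sy + 0.5) / samples
            if y < x / 4:
                hits += 1
    return hits / samples ** 2

row = 3
print(" 1 sample/pixel:", [coverage(px, row, 1) for px in range(11, 18)])
print("16 samples/pixel:", [coverage(px, row, 4) for px in range(11, 18)])
# One sample per pixel snaps straight from 0.0 to 1.0 (a hard jaggy step);
# sixteen samples ramp through 0.125, 0.375, 0.625, 0.875 (a smoothed edge).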
 
Jaggies are noticeably worse on all my XBOX games when I switch to HD.

Really? But that's so counter-intuitive. In the simplest terms: think of that old riddle where some dude is asked by a king to draw a circle with straight lines. I'm not a mathematician, but... the more lines you have to continually angle off from the preceding line, the smoother the circle. Oh, someone explain it better please. :?
 
It's because you're going from interlaced to progressive scan; the resolution hasn't increased at all going from 480i to 480p.
 
scooby_dooby said:
Jaggies are noticeably worse on all my XBOX games when I switch to HD.

But Xbox games ran at native 480p (or native 480i, maybe depending on the game), so it should look worse; but if a game runs at native 1080p (that is, 1920x1080), it might be a little different story.
 
JasonLD said:
scooby_dooby said:
Jaggies are noticeably worse on all my XBOX games when I switch to HD.

But Xbox games ran at native 480p (or native 480i, maybe depending on the game), so it should look worse; but if a game runs at native 1080p (that is, 1920x1080), it might be a little different story.

Yeah... since the image is being output progressively it makes the image sharper, but it's the same resolution, so any jaggies that were there to begin with just get magnified (I believe that's what happened in Chronicles of Riddick). If the game ran at 720p then the edges of objects would look noticeably better than at 480p.
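
A quick sketch in Python that puts numbers on the scaling (assuming common render targets; actual games vary):

# How much a native render gets stretched when shown on a 1080-line panel.
modes = {
    "480p (typical Xbox)": (640, 480),
    "720p":                (1280, 720),
    "1080p":               (1920, 1080),
}
PANEL_LINES = 1080
for name, (w, h) in modes.items():
    scale = PANEL_LINES / h
    print(f"{name:>19}: {w}x{h} ({w * h / 1e6:.2f} Mpixels), "
          f"each rendered line covers ~{scale:.2f} display lines")
# A 480-line image is stretched 2.25x, so every existing stair-step gets 2.25x taller
# on screen; a native 720p or 1080p render has far more pixels along each edge instead.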
 
I would've liked ~50+ GB/sec main memory bandwidth and 1 MB L2 cache *per* core, 3 MB L2 total - but with a console that retails for about $300, there always have to be trade-offs.
 