Xbox 360 Case Study: Reasons why 2nd generation console titles will excel

Blakjedi I think you're getting ahead of yourself with those assertions - especially since we know Cell is being underutilized at this point, but that aside I just have to say great post Acert - two thumbs up!
 
Thread closed due to pointless tit-for-tat arguing. I can see several people getting temporary bans from the console section soon.

Edit: Okay, the thread has now been reopened, having had time to remove the junk from it.
 
scificube said:
It's not wasted it's an exchange.

That's why I said 'wasted' < note quotes. ;)

Let's be fair here. Take a look at Mass Effect, Too Human, PGR3, or GOW. The jump seems quite good to me and this is still at a point in time when devs haven't come to terms with the HW.

But this is true for the launch of every platform.

We need to remember the learning curve here is HUGE and if not the greatest in video-game history it's certainly up there.

How do you figure that? We've gone from 2D to 3D before, and it's not like this is the first time there's been more than one processor to battle with either...

FWIW: I do feel there is a lot of potential left in the X360 design, and I'm looking forward to the procedural stuff, but I don't see any revolutions yet.
 
Along the lines of increasing graphical effects from launch title to gen 2, what would be an example of a change from within the same engine? The cores would be better utilized, so better physics of course (Rain, hair, skin, water). Better shaders too. Smoother AA perhaps (I can see 6xAA in some circumstances).

But what would be the visual end-result of better utilization of the overall architecture? Can anyone give a specific example (besides physics) of what improvements to look for in Gen 2 titles compared to what we see in launch titles for the X360? Or have I answered my own question? ;)
 
This happens with all consoles, but don't expect an earth-to-moon jump. Few games give DOA 3 a run for its money and it was an Xbox launch game. We also remember how Halo at E3 2001 was full of jaggies and slowdown, because the Xbox was also rushed to market.

All consoles will improve with time, from the Xbox 360 to the Revolution.
 
xbdestroya said:
Blakjedi I think you're getting ahead of yourself with those assertions - especially since we know Cell is being underutilized at this point, but that aside I just have to say great post Acert - two thumbs up!

Truth is we know ALL CPUs are being underutilized at this point. :!:

But we also know that the 7800/RSX in PS3 will be HIGHLY utilized when we first see it because of developer familiarity with its "normal" rendering process, while Xenos is barely if at all being used right now in the way that makes it unique... which is really what Acert was saying would comprise the jump between generation 1 x360 and generation 2 x360.
 
Acert93 said:
Every console launch I can remember has demonstrated a significant disparity between the quality of launch titles, and more importantly, between the best looking launch titles and 2nd and 3rd generation software. This thread is being created for your feedback, pro and con, on this disparity in graphic quality.

I am not so sure this trend will continue actually. :oops:

By pushing more and more of the design burden on software developers, hardware makers are able to come up with new gadgets faster than they can be programmed -- optimally that is.

What this means is that tomorrow's innovations are decreasing the shelf life of today's hardware ... while software development cycles are increasing.

Xbox 360 and PlayStation 3 (and a host of CPUs and GPUs for the PC) should have less time to nurture a software evolution than their predecessors before being pushed into obsolescence.

Consequently, not only is the first software generation more likely to underutilize new hardware, but the dwindling number of generations that follow are less and less likely to be distinguishable from the first -- at least not as profoundly as they were in the previous hardware generation. :idea:

Well that's my two cents anyway.
 
standing ovation said:
By pushing more and more of the design burden on software developers, hardware makers are able to come up with new gadgets faster than they can be programmed -- optimally that is.

What this means is that tomorrow's innovations are decreasing the shelf life of today's hardware ... while software development cycles are increasing.


But standing ovation, that's the thing with consoles. The big advantage is they will be with us for five or more years. Devs have time to utilize their power. Consoles aren't like PCs, where things change year to year and you have 15 different configurations to program for. The PS3 and X360 will probably still be pushed years from now. PC GPUs won't.
 
You're absolutely right mckmas8808 because consumers buy games, not flops or rendering passes. The problem with PC hardware power is that it never translates to the user experience in a timely fashion. It took 3 years for PC devs to catch up to Xbox graphics with releases of Doom 3 and Half-Life 2, even though PC hardware was technically better about 6 months after the Xbox launch. PC gaming thrives mainly through gameplay (RTS, FPS, and MMORPG control advantages), not through technical advances. Even today PC devs are targeting the 95% of users that don't have the latest hardware, not the few that do.
 
Johnny Awesome said:
You're absolutely right mckmas8808 because consumers buy games, not flops or rendering passes. The problem with PC hardware power is that it never translates to the user experience in a timely fashion. It took 3 years for PC devs to catch up to Xbox graphics with releases of Doom 3 and Half-Life 2, even though PC hardware was technically better about 6 months after the Xbox launch. PC gaming thrives mainly through gameplay (RTS, FPS, and MMORPG control advantages), not through technical advances. Even today PC devs are targeting the 95% of users that don't have the latest hardware, not the few that do.

The problem this gen is that the graphics subsystems in these consoles are DX9. There have been DX9 cards on the market for what, 4 years now? So the games are going to start catching up. Games in 2006 will be heavily targeting DX9 for graphics; the Unreal 3 engine will hit on the PC and many games will take advantage of it.

We have powerful cards now like the R520, which by itself has 512 megs of RAM, the consoles' full system RAM. The main difference will be the CPUs. But how much of a difference is there? In the next year or so x86-64 will be very widespread and dual cores will start getting market penetration. Developers will start targeting 3000+ CPUs and 1 gig of RAM as the recommended specs.

So how long do these consoles really have the graphics edge? It may be a much smaller graphical lead than previous generations. That's because, as I stated, they hit up the IHVs for chips at the end of an API generation.
 
jvd said:
The problem this gen is that the graphics subsystems in these consoles are DX9. There have been DX9 cards on the market for what, 4 years now?
No, DX9 cards have been on the market 3 years, not 4. Fall 2002 saw the launch of the Radeon 9700Pro.

So the games are going to start catching up. Games in 2006 will be heavily targeting DX9 for graphics; the Unreal 3 engine will hit on the PC and many games will take advantage of it.
UE3 is a DX9 engine, but it is targeting mainstream cards as the "baseline". As Epic has said themselves, a 6600GT should play it fine. Further, Epic is aiming at an SM2.0 featureset as the baseline--that is pretty far behind what we are seeing now in the hardware.

And judging from Steampowered.com's user stats, respectable GPUs are less common than the bottom-feeding FX5200, 6200, 9600SE, etc. type cards. The PC market is smaller and the majority of people are still using outdated hardware (features) that underperforms (performance).

And it is not as clear-cut from a market penetration perspective that DX9 is "here" now. We have yet to see a major DX9-only release. We should see that... in 2006 like you said. But we do not know how many games will require DX9 cards. And waiting almost 4 years after the API's release for a game that demands a minimum-spec DX9 card :???: Below I will touch on how even that is behind the consoles (Xenos/RSX are waaaaay ahead of SM2.0).

DX9 is not a monolithic API, and acceptable performance from one or the other IHV has lagged behind each API release/update, fragmenting the market and stalling it for a year or two. E.g., only looking at flagship models (ignoring the mainstream bottom feeders):

In late 2002 the first DX9 SM2.0 GPU arrived: the Radeon 9700 (R300). NV "co-released" the FX 5800 (NV30), which was a poor DX9 card. This fragmented the market--you had GPUs that supported DX9 SM2.0 well (ATI) and GPUs that did not (NV).

More of the same in 2003 with the R360 (9800) and NV35 (FX5900). Solid SM2.0 support from ATI, poor SM2.0 support from NV.

In 2004 we saw the first DX9 SM3.0 GPU. With this card came API-compatible Geometry Instancing, Flow Control & Branching, Vertex Texturing, and as a plus FP16 blending (along with other perks). SM3.0 also raised a lot of the limits on shader code length and so forth. Branching is kind of poor in NV40, but it is a fully compliant SM3.0 GPU--even if some SM3.0 features are not very usable in practical terms.

ATI's offering, R420 (X800), was still an SM2.0 card. No FP16 blending, no API-supported GI, no Vertex Texturing, no branching or flow control.

Fall 2005 has seen SM3.0 GPUs from both vendors. G70 still has weaker flow control, but it is there. ATI's cards still don't do vertex texturing because they feel it is too slow.

Basically, these are neutered SM3.0 cards--the full spec is supported but it is slow in certain areas.

So from a developer perspective, on the PC, you are going to have to support SM2.0 unless you are doing a console game and don't care much about PC sales or don't have the time to redo a lot of shaders.
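
To make that concrete, here's a minimal sketch (my own illustration in D3D9-era C++, not code from any shipping engine, and with a hypothetical ShaderPath enum) of the caps check a PC title has to run before it can commit to an SM3.0 path. A console developer targeting Xenos or RSX never has to write this branch at all:

```cpp
#include <windows.h>
#include <d3d9.h>

// Hypothetical render-path selector used only for this illustration.
enum ShaderPath { PATH_SM2, PATH_SM3 };

ShaderPath PickShaderPath(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    // Only take the SM3.0 path (dynamic branching, vertex texturing, longer
    // shaders) if both pipelines report 3.0 support.
    if (caps.VertexShaderVersion >= D3DVS_VERSION(3, 0) &&
        caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0))
        return PATH_SM3;

    // Everything else -- R300/R420-class parts included -- gets the SM2.0
    // fallback shaders, which is why SM2.0 stays the de facto baseline.
    return PATH_SM2;
}
```

And the SM2.0 branch is the one most of the install base actually hits.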

So we may see our first SM3.0-required game in 2007. Another 3 years after the API release :???:

And to tie this back in there are two key points:

Performance.

Features.

Performance: Xenos (and RSX) are definitely ahead of the game in performance. Putting aside the closed-box paradigm vs. the open platform (which is significant in the end product), ATI won't be releasing a GPU with the shading power of Xenos until 2006 and won't architecturally have a GPU as efficient until a year after the Xbox 360 launches. We are expecting 48 fragment shaders in R580, which should come out in the spring. In certain situations the R580 should perform better (PS-limited tasks), but in vertex-heavy games, bandwidth-sensitive games, etc... Xenos will *still* have a performance edge.

In contrast, the GF4 Ti series launched 6 months after the Xbox with more performance and features than the NV2A.

Features: no contest. Xenos has all these features *standard*, so all 20-50M Xbox 360 units will have them. Hardware tessellation, HOS, FP10/16 blending, 3Dc, Vertex Texturing, Flow Control and Branching, MEMEXPORT, etc... Toss in the eDRAM and the bare Xenos spec is light years ahead of the standard PC game/PC API. DX9 SM3.0 and SM3.0 hardware are a bit behind in features (and in the ability to perform those features in game).

Further we know Xenos does most of this really well.

Xenos is not a mainstream GPU, it is a flagship-class GPU. Further, it has a good handful of very useful features it does fast. We have not seen displacement mapping as a key feature in PC games. Why? Vertex texturing is slow and is not on all SM3.0 hardware. Xenos is going to rock in this area, with all 48 ALUs having vertex texturing support and with hardware tessellation. How about HDR+MSAA? Even the high-end GPUs struggle with this and/or cannot do both at the same time. HOS? Not even on the radar.
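
As a side note, here's a minimal sketch (again my own illustration, with the display format assumed to be X8R8G8B8) of why displacement mapping is such a gamble on the PC: before sampling a heightmap in the vertex shader, the engine has to ask the driver whether that format is even legal as a vertex texture, and on a lot of current hardware the answer is no, or "yes, but slowly":

```cpp
#include <windows.h>
#include <d3d9.h>

bool CanUseVertexTexture(IDirect3D9* d3d, D3DFORMAT heightmapFormat)
{
    // D3DUSAGE_QUERY_VERTEXTEXTURE asks: can a texture of this format be
    // bound to a vertex shader sampler on this adapter?
    HRESULT hr = d3d->CheckDeviceFormat(
        D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
        D3DFMT_X8R8G8B8,                  // assumed display format
        D3DUSAGE_QUERY_VERTEXTEXTURE,
        D3DRTYPE_TEXTURE,
        heightmapFormat);

    return SUCCEEDED(hr);
}
```

On Xenos the question never comes up: all 48 ALUs can do vertex texturing, so the art pipeline can rely on it.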

A developer making a DX9 game right now may target something like a 6600GT as a baseline. It has 8 PS units and SM3.0 with poor branching, and it is incapable of HDR in games due to bandwidth/ROP limitations. So basically you are looking at an SM2.0 game with Radeon 9800/GF 6600GT class performance. This is basically what UE3 targets.

Xenos gives it a smackdown with 8x the performance and a ton more features that are usable. A game targeting the Xbox 360's or PS3's GPU *as a baseline of development speed and features* is going to look a LOT better than a PC port.

The good news is that the new consoles will be pushing the PC to advance. Developers will want to port their games over to the PC, and so older PC stuff will be phased out due to the console pressure.

The gap between the consoles and PCs is quite large right now. It took almost 3 years to get more advanced games on the PC last time. I expect that process to be repeated. The gap is wider this time (Vista is the one thing that, IMO, will close the gap quicker).

We have powerful cards now like the R520, which by itself has 512 megs of RAM, the consoles' full system RAM.
You ignore that the high-end cards, on the PC, are "trend setters" and developers do not design with them as the minimum spec. 256MB cards are finally getting a decent workout in some new games, yet the majority of products sold are 128MB cards.

Developers are not going to disregard over 50% of new 2005 GPU sales (i.e. the number of new 128MB cards sold) to support 512MB as a standard.

Consoles and PCs have reverse paradigms. PCs use the bottom-feeder, low-end POS cards as the "baseline" for features and performance--everything else is tacked on. Console developers design with the target hardware in mind.
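
To illustrate the "tacked on" half of that (purely my own sketch, with made-up memory thresholds), a PC engine typically scales texture detail off whatever the card reports at runtime, so a 512MB board just gets nicer assets instead of becoming the design baseline:

```cpp
#include <windows.h>
#include <d3d9.h>

// Returns a texture detail level: 0 = low, 1 = medium, 2 = high.
// The thresholds are invented for this example.
int PickTextureDetail(IDirect3DDevice9* device)
{
    // Approximate free texture memory in megabytes (the runtime rounds this).
    UINT freeMB = device->GetAvailableTextureMem() / (1024u * 1024u);

    if (freeMB >= 400) return 2;  // 512MB boards: uncompromised assets
    if (freeMB >= 180) return 1;  // 256MB class: drop the largest mips
    return 0;                     // 64-128MB baseline the game is built around
}
```

The baseline content still has to fit the 128MB card; the bigger boards only change which mip levels get loaded.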

The main difference will be the CPUs. But how much of a difference is there? In the next year or so x86-64 will be very widespread and dual cores will start getting market penetration.
Intel dominates sales and the majority of their chips sold over the last 24 months have not been 64-bit. Further, a lot of companies have seen no need to upgrade CPUs over the last 24-36 months due to the stagnant market. The CPUs hit a wall and software is not requiring more performance.

You may be surprised at how significant Celeron and Sempron processors are to the market. Less than 35% of the *gaming PC market* is above 3.0GHz-class CPUs. A similar situation exists for memory.

As a businessman, a developer, you are nuts to cut off 60% of the gaming market. The market moves in baby steps on the PC front. We saw this with BF2. It required PS1.4 support, and you should hear all the GF4 Ti owners, whose cards are only PS1.3 capable, upset and swearing they will never buy another DICE game. Quite a few people just bought other games. True, some upgraded, but upgrades mean less money spent on games (a point you have made in relation to consoles: the more spent on the console, the less spent on games).

In this respect UE3 is designed with DX9 as a base, but it won't require SM3.0. Why? SM3.0 features are still weak on all but the newest GPUs, and those GPUs compose less than 10% of the market install base.

SM3.0 is fast on the new consoles and is 100% of the install base.

Developers will start targeting 3000+ CPUs and 1 gig of RAM as the recommended specs.
Eventually, but not in 2005 and probably not in 2006 for the memory. Doom 3 required 384MB in 2004, BF2 required 512MB in 2005, yet most games will work with less. It is hard to imagine a lot of developers targeting 1GB when games can simply reduce memory use by lowering texture quality. And the fact that the new consoles are 512MB systems will also figure into that targeting.

That's because, as I stated, they hit up the IHVs for chips at the end of an API generation.
1. SM3.0 is just getting decent *flagship* hardware (forget that the mainstream still sucks eggs).

2. Xenos is, for all practical intents and purposes, a DX10-class GPU minus redundant hardwired features, Avivo, etc. The only significant feature we know Xenos is missing is a geometry shader, which, ironically, is not needed because Xenos can use Xenon as a slave. With cache locking, compressed vertex streaming, and a CPU with a dot product instruction, this feature was unnecessary.

So I don't get where MS is hitting up ATI for an "end of an API generation" GPU.

It has rocking branching and flow control and has more features than any DX9-class GPU. To put it simply, it performs shaders better than any GPU on the market. And that is the baseline.

Even when R580 or R600 or G80 come out, the PC will be handicapped by the mainstream and performance markets. The X1600 has 4 TMUs and 12 PS ALUs. Xenos has 16 TMUs and 48 USA ALUs.

It is going to be a while before PC devs start designing games with Xenos class GPUs as the baseline.

So IMO the gap between this generation of GPUs and consoles is larger than in the Xbox1/GF4 generation. Xenos leads its contemporaries in performance and features by more than the NV2A did in its day.
 
Of course Acert's covered all the bases here, but I'll add my spot of 'anecdotal' evidence, as I do in such cases. Hack'n'slash dungeon crawlers being my preferred genre this gen, I point back to CON on PS2, released in 2004 on now 5-year-old technology, and the latest genre release on PC, Dungeon Siege II, released a couple o' months ago.

CON
http://media.ps2.ign.com/media/568/568803/img_1917892.html

DS2
http://media.pc.ign.com/media/569/569719/img_2759139.html

Visually they are comparable, but one's running on 5-year-old tech while the PC version is running on vastly more powerful hardware.

Whatever the future of PC tech, not every game harnesses it effectively. In my experience (which is very limited), PC tech is, in use, several years behind what it's capable of. So even with more powerful hardware than an XB360 or PS3, there's definitely the possibility that your favourite genre (as long as it's not FPS, which is a tech role model for the PC!) won't look any better on PC software.
 
Acert93 said:
It is going to be a while before PC devs start designing games with Xenos class GPUs as the baseline.

So IMO the gap between this generation of GPUs and consoles is larger than in the Xbox1/GF4 generation. Xenos leads its contemporaries in performance and features by more than the NV2A did in its day.

We'll have to see.

I know that Madden 2002 on my PC, which was only 350 MHz and had a 64 MB GF2 MX card, looked better than the PS2 version.

No hardware T&L, so probably more polys in the console version. But I could run at a modest 1024x768 and it looked way better. Not just because of the jaggies on the console version, but because the higher resolution on a monitor made the outlines of the players stand out better. Plus there were some lighting effects, as they simulated the sun breaking through cloud cover.

Of course this time we'll have resolutions comparable to the PC and a lot more shader effects. But it wouldn't surprise me if there are PC gamers boasting about running Madden 2006 on PC at 1600x1200 at some ungodly frame rate and it looking better than the X360 version, despite the latter being a new engine.
 
mckmas8808 said:
But standing ovation, that's the thing with consoles. The big advantage is they will be with us for five or more years. Devs have time to utilize their power. Consoles aren't like PCs, where things change year to year and you have 15 different configurations to program for. The PS3 and X360 will probably still be pushed years from now. PC GPUs won't.

Console lifecycles ARE getting shorter. PlayStation was with us for more than a decade, PlayStation 2 for nearly as long. And PlayStation 3? Well, I doubt it will have the same longevity before being steamrolled by a successor.

Rabid innovation and competition are shortening hardware lifecycles. ;)

What's more, tomorrow's gadgets are becoming profoundly more complicated than today's -- not only hardware but software too ... especially software. If this were an English class, then making games on PSone would be like writing 1-page book reports in middle school. Composition on PlayStation 3, however, sounds more like a doctoral thesis.

Think about it. If you're wrestling an opponent who is getting more complicated with each iteration, it will take you longer and longer to pin him. :mrgreen:

And looking further down the road, it's only a matter of time before software development for a single generation consumes the entire hardware lifecycle.
 
Shifty Geezer said:
Of course Acert's covered all the bases here, but I'll add my spot of 'anecdotal' evidence, as I do in such cases. Hack'n'slash dungeon crawlers being my preferred genre this gen, I point back to CON on PS2, released in 2004 on now 5-year-old technology, and the latest genre release on PC, Dungeon Siege II, released a couple o' months ago.

CON
http://media.ps2.ign.com/media/568/568803/img_1917892.html

DS2
http://media.pc.ign.com/media/569/569719/img_2759139.html

Visually they are comparable, but one's running on 5-year-old tech while the PC version is running on vastly more powerful hardware.
A rather silly comparison. To hand-pick one game that is somewhat in the same genre and use it as a basis for judging hardware capabilities?

Especially when the game you're comparing has been (rightly, IMO) criticized for the fact that its graphics barely changed from the original title years ago. DS2 looks like crap for a game released 2 years ago on the PC, let alone now - CON was always one of the PS2's most impressive titles visually and rightfully received raves for its graphics. How about the original Unreal Tournament on the PS2? PCs with TNT2 cards ran it better and made it look more attractive - so I guess the PC wins then! OMG!!

That being said, I do agree with Acert that the hardware gap is likely bigger now than when the Xbox debuted, but I don't necessarily think this is a bad thing, as the PC has suffered lately because a lot of the titles are console ports from the PS2/Xbox which barely utilized the PC hardware in the translation. Titles released in mid-2006 for the 360 may exhibit the hardware/software gap between the PC and the next gen more prominently than the launch titles, so as I've said before, from now until Vista it will be console heaven.

The difference now, as opposed to when the Xbox launched, is that MS seems to understand that keeping gaming on the PC is important; with the Xbox launch, MS basically shuffled PC gaming into the corner and let it wither. There are just too many companies involved in PC gaming to sit by and watch Sony and MS get all the pie, and MS has always preferred to make its money off software, not hardware. DX10 is a good start, and MS looks to be using its muscle to force compliance from the OEMs, so hopefully in the future we can avoid the mess that Acert so aptly described in his last post. The PC needs to get more console-like in its operation if it is expected to warrant top-tier titles anymore, and I think MS "gets it".
 
standing ovation said:
Console lifecycles ARE getting shorter.
This console generation was relatively short by comparison due to MS wanting to dump the billion-dollar losses from the Xbox. There's little information indicating publishers really wanted to move to a new generation now; if MS weren't in the game or hadn't suffered such ridiculous losses on the Xbox, I seriously doubt we would see the next gen hit the streets until sometime in 2007.

So no, I wouldn't say it's a foregone conclusion that lifecycles are getting increasingly shorter; I think this past gen was a special case.
 
This generation's lifecycle was not short; it spanned 1999-2005.

MS entered the market 2, arguably 3, years after the current cycle began (if you count the Dreamcast)!

It's not that they shortened the console lifecycle, just that they joined 2 years after it had already begun.
 