I will cut to my conclusion now: after going paragraph by paragraph through this post, I see (1) nothing new information-wise and (2) a lot of opinion that is very subjective.
There is nothing new here. There is no new technical data and his "statements" are about as subjective as the speculation we have here on a daily basis. I am not impressed... with that said... my breakdown:
Almost everything said here dovetails with the leak. So the technical discussion is nothing new; the only "new" stuff is his interpretation of the architecture. I am not overly impressed because people made similar statements about CELL based on patents and such. And a lot of people were predicting CELL to hit certain milestones and were dead on--yet were also just as wrong in many other areas (e.g. some were right on 1PE:8SPE @ 4GHz = 256GFLOPs, yet were dead set it would have other features like eDRAM) so you have to take some things with a grain of salt. Just because someone makes a claim and it comes true does not mean they are "in the know", especially when they agree with previously "leaked" info.
The main value to his post is the interpretation of the architecture. So here are some questions:
The fact that they choose to centralize their FSB or share a single L2 cache among 3 processors shows some real lack of insight. The biggest flub would have to be that 10 MB eDRAM on the GPU -- which I'm told is really MS's idea (both MS and ATI told me that much) -- that just says they didn't even think about resolution.
1. L2 cache is in the leak/patents I believe. Is the shared L2 cache bad? Some here have already disagreed. Obviously he has a right to an opinion, but is this an opinion or a fact? Another relevant question: this is a gaming console with a VERY rigid design structure, not a PC. Is the weakness of a shared L2 cache downplayed or exaggerated within such an environment?
IBM has shared cache on some of their dual-core chips, as will Intel's new chips (do not shoot me, I got that off a Google search), while it looks like AMD is going with independent caches. I have read blurbs saying either way is better. BUT, is it reasonable for a console to go the AMD route? Intel/IBM think a shared cache is OK at this point--that says something. Also, desktop chips are pretty expensive, so I am wondering if this is a case of, "In an ideal world we get feature X," but reality tells us a $300 console cannot have every 'key' technical feature on the horizon. Compromises have to be made somewhere along the road... the question is how significant this compromise is. The fact that IBM/Intel chips (will) have shared cache, and that others here have said they do not see this as a problem, makes me think this is an opinion.
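One side note on multiple cores sharing cache: whether the L2 is shared or split, the usual mitigation for cores fighting over the same lines is keeping each core's hot data on its own cache line. A minimal C sketch of the idea, assuming a 64-byte line size (a common figure, not anything console-specific):

```c
#include <assert.h>
#include <stddef.h>

#define CACHE_LINE 64  /* assumed line size; real hardware varies */

/* Naive layout: both counters most likely land on one cache line,
 * so two cores updating them contend for the same line. */
struct naive_counters {
    long a;
    long b;
};

/* Padded layout: each counter gets a full line to itself, so each
 * core can hammer its own counter without disturbing the other. */
struct padded_counters {
    long a;
    char pad_a[CACHE_LINE - sizeof(long)];
    long b;
    char pad_b[CACHE_LINE - sizeof(long)];
};
```

The trade-off is classic console economics: the padded layout burns memory to buy speed, which is exactly the kind of compromise a fixed $300 design forces you to weigh.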
My take: Old technical facts; His opinion.
2. 10MB of eDRAM is in the leak. From the discussions here it seems very clear that 10.5MB of eDRAM would be enough for 720p, and there are very feasible workarounds if this is a limitation (and it may not be, if the chip is FP32 as Dave indicated may be a possibility). Again, a perfect world says we have unlimited die space and transistor count, and that heat/speed/yields are not an issue. Real life tells us compromises set in at some point for a $300 console.
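To put rough numbers on that (plain arithmetic on my part, not insider spec): a 720p framebuffer at 32-bit color plus a 32-bit depth/stencil buffer comes to about 7MB, which fits in 10MB; turn on 4x multisampling and it balloons to roughly 28MB, which is where tiling-style workarounds would come in. A quick sizing helper in C:

```c
#include <assert.h>

/* Back-of-the-envelope framebuffer sizing (generic arithmetic, not
 * insider spec): total bytes for color + depth buffers at a given
 * resolution and multisample count. */
static long framebuffer_bytes(long width, long height,
                              long color_bytes_per_px,
                              long depth_bytes_per_px,
                              long msaa_samples)
{
    return width * height
         * (color_bytes_per_px + depth_bytes_per_px)
         * msaa_samples;
}
```

720p at 4+4 bytes per pixel is 7,372,800 bytes (~7.0MB), comfortably under 10MB; the same setup at 4x MSAA is ~28MB.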
My take: Old technical facts; His opinion.
Hardware-wise Xbox2 is getting disappointing the more I look at it... and I know I shouldn't really be saying that since I'm actually developing on it.
This is the "tone" of his post. He is disappointed, for whatever reason, and most of his points relate to this.
As for his comment: people here who are said to be working on X2 are VERY excited. So who do you believe? Both can be true--but both are opinions. Based on the pretty open discussion (and the accuracy of a certain number of people here) I would have to say I put more weight on what the people here who are working on the system say.
My take: His opinion.
let's just say it's Moore's Law looking perfectly normal and healthy.
Pure silliness. Since when have $300 consoles broken Moore's Law? Even past consoles have "wowed" at their demos, but a year LATER, when they actually released, they were well within the realm of what is standard (and this happens all the time... Intel will demo a chip on 'X' process and a year later release a mainstream product).
Three other points: (1) X2 will probably be using the 90nm process. With a 2005 launch there is no other option--that is as good as it gets. (2) Consoles are fixed designs. The ten(s) of millions of X2s that will be shipped will all be the same, therefore every piece of software can exploit the strengths of the system, unlike a PC or Mac. (3) The system is defying Moore's Law in that it has 3 CPUs. Well, maybe not defying it, but if you look at the condition of CPU development over the last 2-3 years you will see that putting 3 CPUs into a system sidesteps the problems chip makers are having with heat, yields, and the ability to shrink. We should have 8GHz P4s right now... the fact that X2 will have multiple CPUs shows some foresight (both in design limitations AND the fact that it will help push multithreaded games onto the PC).
My take: His opinion, and quite a silly opinion at that.
But if you think of the difference between PS1 and PS2, you should see about the same growth from Xbox to Xbox2, but at the same time, taking into account the difference in resolution, content, shader complexity and everything else put together.
I disagree. The PS was one of the first 3D consoles. Looking back, the PS was BRUTAL: fixed function and very limited features. Definitely a first stab at making basic 3D. The PS and N64 made some basic mistakes (e.g. the N64 had a 4k texture cache!) but how were they to know how these limitations would play out? The PS2 got 5 years of SEEING how the market and technology would develop, and of feedback on the weaknesses of the earlier designs. This was also coupled with the breakneck pace of process shrinks.
As we all know, it is getting harder and harder to shrink chips (there is conjecture that in the 2010/11 timeframe consoles will ship on a 45nm process, which would only increase the transistor count about 4-fold). And the fact is chip makers have a better handle on what works, and what does not work, in 3D rendering. There are fewer and fewer weak spots and design mistakes, so the change will be more evolutionary than revolutionary.
And he makes an important comment at the end about resolution, content, shader complexity, etc...
1. PS1/PS2 ran at the same resolution. PS3 will support HDTV. That alone requires more power.
2. We expect much bigger worlds, with a plethora of objects, with more dynamic/interactive content in the next gen. That requires more power and memory.
3. PS1/PS2 had fairly fixed-function features. To get the great rendering effects we expect today, shaders come into play--but the better looking the game, the more shader work is required. This is a feature that will actually help cover up the lack of progress when comparing PS2=>PS3 to PS1=>PS2. Designers get to choose how to tailor the hardware to their game design more than in the past.
So add in the limitations the market has, then change the resolution format and demand much larger interactive/realistic worlds, and we can see that we are not asking for just new car/shooter/fighting games with prettier graphics; we are asking for a new level of realism that was not required between the first and second gen of 3D consoles. So comparisons are only skin deep in many ways.
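The resolution point alone is easy to quantify (plain arithmetic, nothing console-specific): 1280x720 has exactly three times the pixels of 640x480, so merely holding the line on image quality at 720p costs 3x the fill and shading work per frame before anything else improves.

```c
#include <assert.h>

/* Pixel count at a given resolution -- the raw multiplier behind
 * the "HDTV alone requires more power" point above. */
static long pixel_count(long width, long height)
{
    return width * height;
}
```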
My take: His opinion.
The thing is that SIMD is very important to getting any major performance out of PPC processors these days. Without it, they're basically Celerons. So avoiding pipeline stalls and concerning yourself with *instruction latency* is going to be huge on all 3 consoles with this upcoming generation. In some ways, that actually means we've gone back to the '60s in terms of programming. It's just that it's the 60s with 3 million line codebases
As he notes, this is not only a problem the X2 faces. So his "disappointment" in this area is not an X2-only thing, but a general one with all three consoles. Which begs the question: WHY is he so disappointed with this? There really is no other reasonable option at this point. These are $300 boxes that have to cut corners.
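For anyone wondering what "concerning yourself with instruction latency" looks like in practice, here is a generic C sketch (my own illustration, not anything from his post): a single accumulator forms a serial dependency chain where each add must wait for the previous one to finish, while splitting the sum across independent accumulators lets a deeply pipelined core overlap the operations instead of stalling.

```c
/* Dot product two ways: same math, very different pipeline behavior. */

/* One accumulator: every iteration's add depends on the previous
 * iteration's result, so a high-latency FPU sits mostly idle. */
static float dot_chained(const float *a, const float *b, int n)
{
    float sum = 0.0f;
    for (int i = 0; i < n; i++)
        sum += a[i] * b[i];
    return sum;
}

/* Four independent accumulators: four dependency chains in flight
 * at once, hiding instruction latency. Assumes n is a multiple of 4. */
static float dot_unrolled(const float *a, const float *b, int n)
{
    float s0 = 0.0f, s1 = 0.0f, s2 = 0.0f, s3 = 0.0f;
    for (int i = 0; i < n; i += 4) {
        s0 += a[i]     * b[i];
        s1 += a[i + 1] * b[i + 1];
        s2 += a[i + 2] * b[i + 2];
        s3 += a[i + 3] * b[i + 3];
    }
    return (s0 + s1) + (s2 + s3);
}
```

Compilers generally will not make this transformation for you on floating point (it changes the rounding order), which is exactly why this kind of latency bookkeeping lands on the programmer.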
My take: Welcome to console reality for the next 5-6 years. Try to avoid the hard bump on your way in.
I should also note that based on what I'm hearing from every studio I've been to, I'd have to say that, at least for the first generation of games on next-gen consoles, you will not see anything running at 60 fps. There is not one studio I've talked to who hasn't said they're shooting only for 30 fps. Some have even said that for next-gen, they won't shoot higher than 30 fps ever again.
Please note: He did not say X2, he says NEXT GEN.
So I must ask: is there some technical limitation preventing games from hitting 60fps?! The only thing I can think of is 1080i/p limitations. But let's get real: for the first couple of years people will be running these games at 480i/p, while the games WILL support 720p and maybe more. So if there are GPU pixel-rendering limitations and 720p runs @ 30fps, 480i/p users will get 60fps. Second, I think most of us expect the first 2 software generations to be mostly rehashes of PC or current console games. These most likely WON'T tank the CPUs.
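The fill-rate side of that argument checks out with simple arithmetic (my numbers, not his): 480p at 60fps actually pushes fewer pixels per second than 720p at 30fps, so a game that is pixel-bound at 30fps in 720p has headroom to hit 60fps at 480p.

```c
#include <assert.h>

/* Pixels per second at a given resolution and framerate -- the
 * quantity a fill-rate-bound game is actually limited by. */
static long pixels_per_second(long width, long height, long fps)
{
    return width * height * fps;
}
```

480p @ 60fps works out to about 18.4M pixels/sec versus about 27.6M pixels/sec for 720p @ 30fps.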
I fail to see how what he says is true in general. He does not state WHO, or HOW MANY, devs he talked to, so the point is moot. I hope someone remembers this statement when we see the first PS3/X2/Rev software. If a healthy percentage (let's say 35%) are not running at a solid 60fps @ 480i/p, I will be shocked. What is the point of better graphics if they are choppy? There will always be developers who overestimate what they can get out of a system (or get development time cut short, or just aim more for pretty still shots than smooth framerates) and ship choppy games--that will always be the case. But I can hardly believe, based on what we know about the X2 and PS3, that 1st gen next-gen console games will not be running at 60fps.
If anything it tells us a little bit about the developers he claims to know.
My take: I will believe it when I see it. Until then, and based on the impressions of other developers who are impressed with next-gen power, this is pure speculation without any evidence, based on his personal experience--which is ALWAYS a bad barometer of any objective issue.
As for PS3... well, it looks as though PS3 will be the hardware king this time around. Just as Xbox had the powerful hardware in current-gen.
Wow, PS3 will be the HW king? What breaking news!
All I think I should add to this is that Sony and MS have really different design philosophies this generation. I think each set of hardware will cater to different developers, game genres, and budgets. I think we are at a juncture in HW development where we can no longer say "X is better/more powerful than Y," because those statements are founded in a very fixed environment. When developers all made 2D side scrollers it was fair to talk about who could push more sprites and more color depth. Today we are looking at development teams ranging from 20 people to hundreds of people, and development budgets from less than a million to close to 50M. We are looking at games with very small landscapes and fixed gameplay alongside games with sweeping dynamic worlds. And NONE of these approaches is better than another at guaranteeing a FUN game that sells well. Not every big budget game outsells/outperforms a small budget game. And this does not even begin to touch on the subject of development tools, libraries, and ease of use--power is more than throwing muscle at a situation; it is also intelligently making the task quicker and easier.
So when we talk about the HW king, it is important to consider what we are saying. From a technical # standpoint I think there is no problem saying PS3 will be the king.
My question is: which of the 3 consoles will allow the most developers to get the most out of their games? The console whose hardware allows the most quality games is the most powerful, IMO. Other people will look at it as, "The console with the single best looking game is the most powerful." And others will say, "The console that has the most quality games, regardless of developer issues (i.e. large install base = more developers = more games = more quality games despite HW issues), is the most powerful." Again, we are talking about a very subjective subject. I am not sure we can say anymore, with any definitiveness, that one console is head and shoulders above the rest--at least not at this point.
Maybe a year after their release we can start drawing these conclusions, BUT until we can see what developers can do with the HW, and what that HW will be, a lot of questions remain.
My take: We already knew the PS3, from a technical standpoint, would be the most "powerful". Nothing new here.
PS3's will probably have some features that Xbox2's doesn't and vice versa.
Wait, I thought he was on the inside? If he is a developer he should know what the X2 is, and with PS3 to be demonstrated in a month he should have a pretty good grasp on what the 18+ month (as claimed) PS3 GPU project should hold. If I was "in the know" I think I could be a bit more firm than "probably". Heck, people here under NDA say more than that!!
My take: His self proclaimed synopsis of his post: Conjecture just like everything here on B3D!!
Of course, not everything he said was negative, but we already knew this stuff:
Microsoft definitely makes great developer tools and documentation, and it would be silly to think that XNA will not amount to much. -- There was a recent "speculation" thread that pretty much dumped on XNA based on conjecture (i.e. from people who are not working with it), but the developers quoted who do have access to the tools have been impressed. That aside, MS is a software company, and of course they will leverage their forte; this should help developers.
"In that sense, [X2] will probably be easier to develop for." -- A very overlooked fact. To get similar performance % out of a more powerful chip requires more time, more money, and/or easier development.
Overall, I found not a SINGLE item not found in a patent or the "leak". I am not sure this is even newsworthy--this is like someone linking to a Vince, Pana, Dean, nAo, Faf, etc... (I would say Dave, but all he would say is "Yes"!) post here giving their summary of some hardware based on the info we know. Actually, I find each of the people I listed above to have a pretty good handle on the HW specs from the leaks, and able to give some pretty insightful feedback.
Beyond that, a lot of what he says is his OPINION, and many times that opinion flies in the face of the general sentiment here. OK, he is not impressed. But others are. So who to believe? I have a hard time believing some of his claims are as universal as some people want to imply. E.g. he may be telling the truth about the 30fps based on who he talked to, but I highly doubt it will apply to the majority of developers--I refuse to believe it is something that will hold industry wide. And I just want to point out again that he did not limit this to the X2, so his comments are just as valid for the PS3. I just cannot fathom X2 and PS3 titles, with the CPU power they have AND the GPUs they have, being limited to 30fps. If in fall of 2006 70% of games are running at 30fps I will apologize, but until then what he says makes no sense to me.
But all of this doesn't matter--I will believe the software. If the X2 is a disappointment as he claims, the software will bear that out. Until then, his speculation is just his opinion on what appears to be already-public info.