Ars Technica part 2: inside the Xbox 360

Since I posted that AI bit a little while ago, I just want to add, on a similar note, that at this point in the Ars Technica forum thread related to the article:

1) Several individuals have made the case that AI and physics are actually destined to do quite well in a heavily parallelized architecture, and argued it well enough that Hannibal says he will amend and edit his article. (the primary reason for my posting this)

2) 'The GameMaster' has joined said forum in a vain attempt to pursue his questions on VMX128 and to hear results on Cell and 360 dot product capabilities that are more to his liking

3) And Hannibal takes a dig at J. Allard's lack of knowledge as far as computer architectures go

Frankly, I think Allard either doesn't understand microprocessor architecture, or he's just feeding some journalist a line. Even if IBM did start with the 970 core (which they didn't), stripping out all the dynamic execution and making it a two-issue machine would make it a fundamentally different core, and there's no way you could say that it was "basically the same."

I mean, dispatch grouping? the group completion table? the reservation stations? hello! If you ripped all that stuff out of the 970 and then called it "basically the same core," you'd be lying. Also, if you ripped all that stuff out, there's no way you'd get enough ILP out of anything to make a core that wide anything more than a big waste.

So Allard is full of it any way you slice it.

Catch all the action here
 
Rockster said:
I don't think Hannibal's closing statements should be so literally dissected. To me they were merely generalizations made to qualify statements made earlier in the article in an effort to avoid any fanboi criticism. It's a simple attempt at playing both sides of the fence.

I also think he is framing his opinion within the context of CPUs in general. He never downplays their performance in certain situations, just that they are not competitive in certain tasks compared to a more complex GP processor like a P4 or AMD64. You cannot have everything in a limited amount of die space/transistor budget. Sacrifices and compromises are made. The benefit to a console is that it is a closed box... so the only real question is how feasible the workarounds are and whether they were the right compromises for the intended software. Since MS and Sony both went a similar direction (yet a very different one too) it seems they are thinking along the same lines, just in different ways.
 
ralexand said:
No need to get snippy. He's just wondering if the 360 processor is based on cell ie. PPE or the 970 ppc.

Maybe I'm wrong, but it definitely looked like an affirmation and not a question. And this is the second thread where he's implying that Xenon is based on the PPC970 while everybody's telling him that this is not the case. This is beginning to become annoying.
 
xbdestroya said:
2) 'The GameMaster' has joined said forum in a vain attempt to pursue his questions on VMX128 and to hear results on Cell and 360 dot product capabilities that are more to his liking

Priceless :LOL: All the information he needs is right here!
 
xbdestroya said:
...
2) 'The GameMaster' has joined said forum in a vain attempt to pursue his questions on VMX128 and to hear results on Cell and 360 dot product capabilities that are more to his liking
...

LOL!

I just answered that there with a link back to this forum! 8)
 
MfA said:
Given that they talk about a software managed branch target buffer I presume the branch hint will say the equivalent of "the branch at PC + X will likely jump to Y" rather than "the next branch will likely not be taken".

(Only X being an immediate operand ... hence dynamic.)

Well, it would be cool to have a hint mechanism like that.
A pseudocode example:

Code:
...
mDummy = mLocal2World*mWorld2Camera

if (bShake ) 
{
   mDummy *= mShakeCamera
}

mProjection = mDummy*mCamera2Projection
...

a not too dumb compiler would produce something like this:

Code:
Hint_BranchNotTaken(bShake)

mDummy = mLocal2World*mWorld2Camera

Branch_IfNotTrue(bShake, NoShake)

mDummy *= mShakeCamera

NoShake:
mProjection = mDummy*mCamera2Projection

If the first assignment to mDummy takes long enough to process to hide all the cycles needed to prefetch the correct path, this code would run without any pipeline bubbles even if bShake is changed on a random basis (better than with a branch predictor ;) )
 
This works for collision detection too, where the expected case is that branches for "hits" are not taken (collisions are rare compared to the size of your database), and other branches are always taken (the main iteration loop, which will surely be unrolled a lot too). So you might face an 8-cycle penalty every time a hit is detected, but if you use loop unrolling and predication coupled with the fact that collisions are the exception, not the rule, your penalty won't be that bad.
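On conventional compilers today you can express the same "this branch is rarely taken" intent with GCC/Clang's __builtin_expect, which steers the compiler's static branch layout much like the static hint above. A minimal sketch (the LIKELY/UNLIKELY macro names and count_hits function are my own, not from the discussion):

```c
#include <assert.h>
#include <stddef.h>

/* On GCC/Clang these map to __builtin_expect; on other compilers they
   fall back to the plain condition. */
#if defined(__GNUC__)
#define LIKELY(x)   __builtin_expect(!!(x), 1)
#define UNLIKELY(x) __builtin_expect(!!(x), 0)
#else
#define LIKELY(x)   (x)
#define UNLIKELY(x) (x)
#endif

/* Count "hits" against a squared-distance array: the collision branch is
   hinted not-taken because hits are rare, and the loop branch is the one
   the hardware will predict as taken. */
static size_t count_hits(const float *dist2, size_t n, float radius2)
{
    size_t hits = 0;
    for (size_t i = 0; i < n; ++i) {
        if (UNLIKELY(dist2[i] < radius2))  /* rare case: a collision */
            ++hits;
    }
    return hits;
}
```

This is only a hint to the compiler's code layout, not a hardware branch-hint instruction, but the payoff is the same shape: the common no-hit path runs straight-line, and you pay the misprediction cost only on the rare hit.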
 
DemoCoder said:
On what basis do you think Microsoft, which is a recent contender in the games market, has more experience and wisdom than Sony?

Isn't DirectX as old as the PlayStation? Voodoo cards ran on Windows systems, right? :rolleyes:
 
DirectX was a piece of junk and vastly inferior to OGL or Glide at the time it was introduced. Voodoo cards ran on Windows, but during that era most games did not use Direct3D, and instead used either software rasterization or proprietary APIs like Glide. If I accept your hypothesis, then Silicon Graphics, HP, and Sun should have EVEN MORE relevant experience. Voodoo cards ran *in spite of Windows*, not with it. I used Voodoo on NT4, where Direct3D was useless and unable to support HW acceleration at all. If it weren't for Glide and OpenGL, I wouldn't have been able to game at all.

But given the fact that MS designed many, many crappy revisions of their game APIs (WinG anyone? DX1-3...) before they learned enough from OpenGL to make something good, architecting the API does not imply they have any idea what game workloads in the next gen will be, or even the current gen. For example, if they had a clue, they would have implemented much of DX9's support for instancing and WGF's user-mode DX a long time ago.

Given Microsoft's past performance, and demonstrated failure to learn from those who had gone before them when designing their APIs, I think we can safely dispel any rumors of their status as a game technology Oracle.
 
DemoCoder said:
I used Voodoo on NT4 where Direct3D was useless and unable to support HW acceleration at all. If it weren't for Glide and OpenGL, I wouldn't have been able to game at all.

To be fair, NT4 was not a consumer oriented OS in the Voodoo days.

The only reason NT4 even had OpenGL was that there were some high-end CAD and modelling apps that used it.

NT4 was not the focus for a consumer oriented 3D accelerator like Voodoo.
 
Yes, but before I had used windows, I was used to using Unix and OS/2, where I was able to have the benefits of memory protection, security, and resource tracking.

I never bought into the tripe that consumers could not have a real kernel in their OS. Much of the HW of the day could have supported it. I know guys who work on the MS NT kernel team who chalk up the long delay in getting a proper kernel into the hands of consumers to the low quality of programmers in the MS division responsible for the GUI shell and application-level APIs that sit on top of NT. Backwards compatibility with DOS was a red herring, since most DOS apps could be made to run virtualized, and those that didn't could run dual-boot, just like Win95's original "boot to DOS" mode.
 
DemoCoder said:
Yes, but before I had used windows, I was used to using Unix and OS/2, where I was able to have the benefits of memory protection, security, and resource tracking.

Agreed, though OS/2 didn't have anything much in the way of security (I ran it for years before I switched to NT4). You had to run OS/2 Server to get ACLs, and OS/2 didn't have the notion of user sessions either.
 
blakjedi said:
DemoCoder said:
On what basis do you think Microsoft, which is a recent contender in the games market, has more experience and wisdom than Sony?

Isn't DirectX as old as the PlayStation? Voodoo cards ran on Windows systems, right? :rolleyes:

Actually Microsoft has been involved in 3D gaming since before Nintendo had a console!

http://fshistory.simflight.com/fsh/versions.htm

Of course, they bought their way in, but I guess that is the Microsoft way.

:LOL:
 
xbdestroya said:
3) And Hannibal takes a dig at J. Allard's lack of knowledge as far as computer architectures go

J. Allard was one of the primary authors of Winsock... I think that means he might know a thing or two about computer architectures. Just because somebody is in PR mode trying to get stuff across to the public doesn't mean they don't know the details.

The problem is that people try to dissect every little PR post as gospel truth, whereas it's usually just meant as an impression. Same thing for all the PR about the PS3.
 