Official Microsoft PR thread

Titanio said:
Anyone wanna guess where they came up with this figure? Looking purely at the main ALUs in the vertex and pixel shaders, I make it to be at least 255 GFLOPS for RSX...
Um... no offense, but it really doesn't matter. FLOP counts are already goofy without asking MS about Sony's performance. :p

He also mentions it is not their intention to have any vertex processing done on the CPU, although I guess he's drawing a line between vertex transformation and vertex creation/generation...?
That's the only way it makes sense to me. No vertex shading on CPU. Animation and vertex creation are still expected to be on CPU, I would guess.
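
For what it's worth, here's one plausible route to Titanio's ~255 GFLOPS figure, assuming RSX mirrors G70's ALU layout at the announced 550 MHz - the unit counts below are an assumption, since nVidia haven't published RSX's internals:

```python
# Back-of-envelope RSX FLOP count (assumed G70-style layout, 550 MHz).
# Unit counts below are guesses, not confirmed specs.
clock_ghz = 0.55

# 24 fragment pipes x 2 vec4 ALUs, each doing a MADD (2 flops per component)
pixel_gflops = 24 * 2 * 4 * 2 * clock_ghz    # = 211.2

# 8 vertex pipes, each a vec4 + scalar MADD unit (5 components x 2 flops)
vertex_gflops = 8 * 5 * 2 * clock_ghz        # = 44.0

print(pixel_gflops + vertex_gflops)          # -> 255.2 GFLOPS
```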
 
Mefisutoferesu said:
Again, if it's standard across all sales why would your friend need it for saves?

And if you're on the go, why would you need your console HDD? I've never taken my PC HDD with me. There's no iPod-esque functionality, so that's out. Trading saves with your friends seems fair, but you won't be in reach of your console then.

Looks... I guess... seems like kind of a weird reason though... like saying I play games without a memory card because it ruins the seamless look on my PS2.

Size... how desperate for space are you?

Honestly, there isn't any solid reason aside from a split launch that makes this logical. I seriously think MS may indeed sell 2 different models.

Those were just examples off the top of my head. I'm sure someone can come up with a reason to remove their drive. Let's say I go to a friend's house and want to take my saves with me, so I bring my drive, but my brother wants to play my Xbox while I'm gone. It's not really important _why_ someone would remove their drive, simply that they can, and it may or may not be in a position to be reattached. What if it gets stolen, lost or broken?
 
Without our Embedded DRAM on the GPU and the huge 256 Gbytes/second dedicated bus from the GPU core to the EDRAM, it is impossible to render true next-generation scene complexity at high-resolution with anti-aliasing.
They've done it again! :oops:
So he's saying it's impossible for XB360 to render true next-gen games? Coz they sure as hell ain't got 256 GB/s from GPU core to eDRAM.

Do MS just not know their own hardware, or are they using a 'bald-faced fib' campaign about their hardware?

Interviewers should ban metrics from interviewees, because they only ever provide bogus/dubious figures. Let 'em spout subjective claims like 'greatest games ever, more realistic than reality, blah blah' and leave out all the flop/op/BW figures if they're just gonna talk BS. This goes for all FUD merchants, Sony, nVIDIA... the whole shooting match. Think I'll buy Nintendo next gen just because they don't act like politicians!
 
Shifty Geezer said:
Without our Embedded DRAM on the GPU and the huge 256 Gbytes/second dedicated bus from the GPU core to the EDRAM, it is impossible to render true next-generation scene complexity at high-resolution with anti-aliasing.
They've done it again! :oops:
So he's saying it's impossible for XB360 to render true next-gen games? Coz they sure as hell ain't got 256 GB/s from GPU core to eDRAM.

Do MS just not know their own hardware, or are they using a 'bald-faced fib' campaign about their hardware?

Interviewers should ban metrics from interviewees, because they only ever provide bogus/dubious figures. Let 'em spout subjective claims like 'greatest games ever, more realistic than reality, blah blah' and leave out all the flop/op/BW figures if they're just gonna talk BS. This goes for all FUD merchants, Sony, nVIDIA... the whole shooting match. Think I'll buy Nintendo next gen just because they don't act like politicians!

It's interesting how you get these very well educated people in very important positions, and yet when they're interviewed they sound like they haven't a clue what they're talking about sometimes. The guy holds a master's and a bachelor's degree in EE from Stanford. He is also "responsible for the design, engineering, testing and manufacturing of all Xbox consoles worldwide. This includes silicon design, manufacturing partnerships, and Xbox new-product ideation and incubation." Clearly he isn't an idiot and clearly he knows exactly what the Xbox is capable of, so why the stupid comments?
 
I still see issues with even that reasoning, or any other, to be honest... but I don't wanna pointlessly argue such a small detail to death... it's not a big deal after all. It could just be something as simple as what you said, so I won't push it.

EDIT - I take that back, this is FAR from inconsequential!!! If the HDD isn't necessary, that means all saves must fit on a memory card, no HDD streaming, downloaded content must fit on a memory card, and so on. This is important... so I'm back to disputing it... ^^

That said, what bothers me is I can't see MS trading in all the advantages that a HDD brings to the table. If they're saying you won't need the HDD then, for example, games will have to be designed with load times that are playable without HDD streaming... the HDD may be used later to remove loading entirely, but the game has to play without it, and so on.

I can't see MS trading in advantages like that for the sake of letting your little brother play while you're trading saves with your friend. Your little bro wouldn't be able to save anything since there'd be no HDD, and if you've got a memory card then why not just put the save on that? Heck of a lot better than walking around with your HDD.

There has to be a substantial reason to make the advantages of the HDD moot to MS, right? So, then what advantage is there??
 
SanGreal said:
Clearly he isn't an idiot and clearly he knows exactly what the Xbox is capable of, so why the stupid comments?
Mine, or his? ;) MS are touting a bandwidth figure of 256 GB/s between the main GPU and the eDRAM daughter die. They raise this point in every interview, it seems. This is false information. The bandwidth between GPU and daughter die is 32 GB/s in, 16 GB/s out (or thereabouts - I'm lousy with figures!). Read Wavey Dave's Xenos article. Look at the schematics. The 256 GB/s BW figure is internal to the daughter die, between its logic and local storage.

Anyway, this marks an end of my personal crusade. MS will continue to spread this FUD (like the others), people will believe them because they're the experts and they obviously won't lie in order to give a false impression of their hardware, and my bleating following every PR quote is lost into the vacuous expanse of the 'net.

It's a lost cause and I'll leave it be :rolleyes:
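
For anyone who wants to check where the 256 GB/s actually lives, here's a rough reconstruction from the figures in the Xenos article - the read-modify-write accounting is my own assumption:

```python
# Xenos daughter die: the 256 GB/s is between the ROP logic and the eDRAM
# cells, entirely inside the daughter die. The accounting below is a guess
# based on the published unit counts.
rops = 8              # pixels per clock
msaa_samples = 4      # 4x multisampling
bytes_per_sample = 8  # 32-bit colour + 32-bit Z/stencil
read_and_write = 2    # each sample is read, blended/tested, written back
clock_ghz = 0.5

internal_bw = rops * msaa_samples * bytes_per_sample * read_and_write * clock_ghz
print(internal_bw)    # -> 256.0 GB/s, internal to the daughter die

# The link from the GPU core to the daughter die, per the schematics,
# is only ~32 GB/s in and ~16 GB/s out.
```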
 
One quick note that I haven't seen noted yet...

He mentioned that the Wireless is designed to use 4 controllers and 4 wireless headsets... This means that the WUSB isn't limited to 4 items and might also include future products (like a wireless camera) beyond the 4 & 4.
 
Wicked_Vengence said:
One quick note that I haven't seen noted yet...

He mentioned that the Wireless is designed to use 4 controllers and 4 wireless headsets... This means that the WUSB isn't limited to 4 items and might also include future products (like a wireless camera) beyond the 4 & 4.
AFAIK the Wireless USB standard was just recently finalized, and the first products are going to be released after the Xbox 360 is.
 
Shifty Geezer said:
SanGreal said:
Clearly he isn't an idiot and clearly he knows exactly what the Xbox is capable of, so why the stupid comments?
Mine, or his? ;) MS are touting a bandwidth figure of 256 GB/s between the main GPU and the eDRAM daughter die. They raise this point in every interview, it seems. This is false information. The bandwidth between GPU and daughter die is 32 GB/s in, 16 GB/s out (or thereabouts - I'm lousy with figures!). Read Wavey Dave's Xenos article. Look at the schematics. The 256 GB/s BW figure is internal to the daughter die, between its logic and local storage.

Anyway, this marks an end of my personal crusade. MS will continue to spread this FUD (like the others), people will believe them because they're the experts and they obviously won't lie in order to give a false impression of their hardware, and my bleating following every PR quote is lost into the vacuous expanse of the 'net.

It's a lost cause and I'll leave it be :rolleyes:

I meant his
 
Inane_Dork said:
Titanio said:
Anyone wanna guess where they came up with this figure? Looking purely at the main ALUs in the vertex and pixel shaders, I make it to be at least 255 GFLOPS for RSX...
Um... no offense, but it really doesn't matter. FLOP counts are already goofy without asking MS about Sony's performance. :p

Agreed, but this is by far the most conservative figure that's been thrown out. I guess I shouldn't be paying too much attention to ATi's analysis of an Nvidia chip, but I'd still like to see their derivation to see if it makes any sense at all, or if they just worked the figures to come out lower than Xenos (without more info, I'll assume the latter!).
 
Shifty Geezer said:
Mine, or his? ;) MS are touting a bandwidth figure of 256 GB/s between the main GPU and the eDRAM daughter die. They raise this point in every interview, it seems. This is false information. The bandwidth between GPU and daughter die is 32 GB/s in, 16 GB/s out (or thereabouts - I'm lousy with figures!). Read Wavey Dave's Xenos article. Look at the schematics. The 256 GB/s BW figure is internal to the daughter die, between its logic and local storage.

Anyway, this marks an end of my personal crusade. MS will continue to spread this FUD (like the others), people will believe them because they're the experts and they obviously won't lie in order to give a false impression of their hardware, and my bleating following every PR quote is lost into the vacuous expanse of the 'net.

It's a lost cause and I'll leave it be :rolleyes:

It's a matter of fighting fire with fire.

Sony exaggerates their specs, so MS says what the hell, we might as well play dirty too.
 
Just wondering, when the two graphics chips ("Xenos" and the "3dram") are combined onto one physical chip, will you guys stop taking such umbrage at MS and ATI describing the GPU core as having a 256 GB/s connection to the embedded DRAM?

I don't really care how many physical parts the GPU (as a "conceptual unit") is broken up into, two or fifty, it's what it can do that matters. If you could chop the PS2's GS into two units it wouldn't lessen what its embedded memory allowed it to achieve.
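
The GS analogy holds up numerically, too - its famous 48 GB/s was likewise internal to the chip, coming from its wide eDRAM buses. A rough sketch from the commonly quoted figures:

```python
# PS2 Graphics Synthesizer eDRAM bandwidth, from the commonly quoted figures:
# 1024-bit write + 1024-bit read + 512-bit texture buses, all at 150 MHz.
bus_bits = 1024 + 1024 + 512   # 2560 bits transferred per clock
clock_ghz = 0.15

gs_bw = (bus_bits / 8) * clock_ghz
print(gs_bw)                   # -> 48.0 GB/s, all internal to the GS die
```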
 
function said:
Just wondering, when the two graphics chips ("Xenos" and the "3dram") are combined onto one physical chip, will you guys stop taking such umbrage at MS and ATI describing the GPU core as having a 256 GB/s connection to the embedded DRAM?

I don't really care how many physical parts the GPU (as a "conceptual unit") is broken up into, two or fifty, it's what it can do that matters. If you could chop the PS2's GS into two units it wouldn't lessen what its embedded memory allowed it to achieve.

That isn't exactly the problem people are having. It is that ATI/MS seem to think that the GPU shader core has a 256 GB/s pipe to the '3dram' -- but it doesn't, it has a ~32 GB/s pipe to it, and the 256 GB/s pipe is between the actual eDRAM and the blenders (the 'smart' part of the eDRAM).

At least, that is my understanding of the situation.
 
Bobbler said:
function said:
Just wondering, when the two graphics chips ("Xenos" and the "3dram") are combined onto one physical chip, will you guys stop taking such umbrage at MS and ATI describing the GPU core as having a 256 GB/s connection to the embedded DRAM?

I don't really care how many physical parts the GPU (as a "conceptual unit") is broken up into, two or fifty, it's what it can do that matters. If you could chop the PS2's GS into two units it wouldn't lessen what its embedded memory allowed it to achieve.

That isn't exactly the problem people are having. It is that ATI/MS seem to think that the GPU shader core has a 256 GB/s pipe to the '3dram' -- but it doesn't, it has a ~32 GB/s pipe to it, and the 256 GB/s pipe is between the actual eDRAM and the blenders (the 'smart' part of the eDRAM).

At least, that is my understanding of the situation.

Where did they say that again?

The last I saw, they were trying to describe how using the 256 GB/s between the daughter die's logic and RAM will significantly reduce the need for bandwidth in other areas, compared to traditional architectures.

It's awfully tough for them to try to explain that to the average gamer looking at specs.


I know they have said it a couple of times previously, but I thought they were now trying to explain it properly.
 
New interview with Todd Holmdahl from MS

This guy completely digs into everything that the PS3 is about. He states many times all the mistakes that Sony made when creating the PS3. I will bold all the remarks that show that the X360 will crush the PS3 in every way. (In his eyes anyway)

Q All those connections (composite, component, s-video and VGA) have one thing in common: they carry the video signal in an analog format. What about digital connections such DVI or HDMI? Will there be A/V packs with DVI and/or HDMI connections? If not, will you consider offering that in the future?

Todd Holmdahl: Xbox 360 will support HD component video output, which is compatible with nearly every HD-ready TV on the market today. We’re poised to hit the sweet spot of the HD market at launch and as the market matures, and we will provide an HDMI for our customers when it makes sense. The reality is, you don’t need HDMI for HD gaming.

Q I understand the choice of 720p over 1080i because of the fast-motion action sequences involved in gaming; a progressive scan picture is better than an interlaced one for this case, but lately we have seen more and more 1080p capable displays, especially DLP-based flat screens (although 1080p doesn’t belong to ATSC’s specifications for HDTV). The question is, will the Xbox 360 be able to offer a 1080p signal? Is the Xbox 360 GPU powerful enough to render games up to that resolution at playable framerates?

Todd Holmdahl: Xbox 360 offers choice for both the game developer and the end consumer. The game developer can create their game in any resolution. The consumer can request any output resolution (480i, 480p, 720p, 1080i). The Xbox 360’s advanced video scaler will scale the game’s native resolution to the end consumer’s requested resolution with extremely high quality output. Bottom line, the games will look amazing.

Xbox 360 does not support 1080p at this time. It’s an incremental improvement at an astronomical expense, and we don’t see consumers clamoring for 1080p TVs yet. We will continue to evaluate the market and deliver the capability when and if customers want it.

As for the GPU, the Xbox 360 has the most powerful GPU of any gaming console; even better, we made specific design choices to enable game developers to render real-time complex scenes at high resolutions with high-quality anti-aliasing. Without our Embedded DRAM on the GPU and the huge 256 Gbytes/second dedicated bus from the GPU core to the EDRAM, it is impossible to render true next-generation scene complexity at high-resolution with anti-aliasing.

Q The Xbox 360 has three general purpose cores and three VMX vector units (one per core) while the PS3 has one CPU core, one vector unit, and seven SPEs. These differences found in both consoles’ architectures result in the Xbox 360 having more general purpose power than the PS3 while the PlayStation 3 has more floating-point performance (2 TFLOPs against 1 TFLOP of the Xbox 360 to be precise). Isn’t this a major advantage of the PS3 over the Xbox 360?

Todd Holmdahl: No. The PS3’s vaunted teraflop “advantage” is only an advantage on paper, because the PS3’s performance will be limited by its architecture. Real games are about 80% general purpose code and about 20% floating point code. Xbox 360 has three 3.2 GHz general purpose processors to Sony’s one, and it is these main cores, not special purpose SPEs, that are best suited for game code.

The PS3’s design requires high levels of floating point performance, because the PS3 GPU is unable to do automatic load balancing between pixels and vertices, so performance will drop off during vertex processing. The PS3’s design requires that the CPU take up the slack. On Xbox 360, we do not plan for the CPU to do any vertex processing at all, which leaves all of the processor’s power for game simulation code.

Floating point performance is much more relevant on the GPU, where Xbox 360 has more floating point performance than PS3. Not only does the Xbox 360 GPU have greater raw shader power than the PS3 GPU (240 GFLOPS versus an estimated 228.8 GFLOPS on PS3), it will also use more of its power. The Xbox 360’s unified shader model will automatically optimize graphics for each game (vertex or pixel shading), without the developer having to write any extra code. Xbox 360 will also have embedded DRAM to avoid bandwidth bottlenecks and to give developers “free” anti-aliasing to eliminate jagged edges in every game.

Both systems have 512 MB of memory, but we gave developers the flexibility to decide how they use it, while Sony’s split memory architecture is more limited in its options. (We actually looked at the split memory architecture and decided not to use it because of those limitations).

Sony’s emphasis on certain areas of the PS3’s performance is a nice way to distract attention from the fact that they seem to have no response to Xbox Live. Only Xbox 360 has the hardware, software and services to enable the complete gaming and entertainment experience.

Q I don’t know if this is more of a software question, but it is related to the CPUs. You know how current programming languages such as C++ were not developed with multithreading software in mind, making it very difficult to write multithreaded applications. What specific tools (like Intel provides Pentium IV developers the OpenMP API) does Microsoft provide developers to unlock the power of the multi-threaded, multiple core Xbox 360 CPU?

Todd Holmdahl: We’ve done several things to make the multi-threaded transition easier for developers. We brought OpenMP to Xbox 360 to give developers an easy toolkit for getting started with multi-threaded code. We also made our processing cores symmetrical so that you don’t have to completely change your development paradigm to take advantage of them (i.e. convert the algorithms to stream processing to run on multiple DSP-like processing elements). Finally, we are updating our world-class developer tools like PIX and Visual Studio to make analyzing data from multiple threads much simpler for developers.

Q Something that is more of a curiosity; you know, there have been multiple-core processors for a long time in servers, and now they are arriving in desktop computers in the form of the Athlon 64 X2 and the Pentium D dual-core processors. The number of cores in multiple-core CPUs is always a power of two. Why does the Xbox 360 have a three-core CPU and not a dual or a quad-core CPU?

Todd Holmdahl: The Xbox 360 selected three cores, each with two threads, specifically for the next generation game workload. We have analyzed hundreds of games for Xbox 1, and this gives us a clear insight into how many hardware threads are useful. We then talked to game developers to make sure our design matched their next-generation aspirations for their architectures. It turns out, from a game development perspective, six threads is a good upper bound. There is no reason why a power of two is important for CPU cores other than size and power constraints the system might have.

Q Can the “well beyond Shader Model 3.0” hardware capabilities of the Xbox 360 GPU be associated with those features (a unified shader model) that will debut with Longhorn and Windows Graphics Foundation 2.0?

Todd Holmdahl: We are aligned with DirectX and their roadmap. So, yes, you will see support for our hardware capabilities in LH (via DirectX) down the road.

Q In the specs, 4X MSAA is always mentioned, but can developers choose a higher order of anti-aliasing for their games?

Todd Holmdahl: The hardware supports up to 4X MSAA. However, developers can perform supersampling and hence render to a larger frame buffer as an alternative. This would be done in software. An important thing to understand is that most games end up turning anti-aliasing off due to the performance penalties from standard architectures. With Xbox 360 we designed the GPU from the ground up so that enabling anti-aliasing would not create any performance hit for developers.

Q What about anisotropic filtering? What would be the standard and what other levels of anisotropic filtering will developers be able to achieve in real world situations?

Todd Holmdahl: We support a custom adaptive anisotropic filtering. So, the “level” is not really relevant here.

Q Can we really compare the 100 billion shader ops per second (RSX+Cell) of the PlayStation 3 with the 48 billion shader ops per sec of the Xbox 360 GPU, considering your shaders are unified and theirs aren’t?

Todd Holmdahl: The unified shader model is definitely a win due to the flexible architecture. Most of the rendering will be done on the GPU, and that’s where we have a clear advantage. Also, if you take into account the simultaneous texture fetches, control flow operations and programmable vertex fetch operations, you get 80 billion operations per second. And this doesn’t even take into account work that could be done on our 3 CPU cores.

Q One of the features that Sony highlighted at their press briefing was the RSX ability to render colors with 128-bit floating-point per pixel precision, a useful feature when rendering high dynamic range lighting. What is the level of per pixel precision that the Xbox 360 GPU can render?

Todd Holmdahl: It’s not sufficient to add pixel depth. The hardware must be able to support the bandwidth requirements associated with that pixel depth. It’s necessary in order to have a balanced system. In a real-world scenario, we have a much higher usable data/pixel bandwidth due to our embedded DRAM. Also, you need to design for the usable sweet spot between color depth and human perception on modern display devices; 128 bits is not only impractical from a bandwidth perspective, the extra detail over 64 bits has marginal impact at best in real-world rendering situations.

Q So far, developers working on both platforms tell us that hardware wise, the PlayStation 3 is more powerful than the Xbox 360. Are you confident that Microsoft’s advantage in the software and services areas can close any lead that Sony might have with its hardware?

Todd Holmdahl: First of all, I dispute the notion that the PS3 is more powerful than Xbox 360. We outperform where it matters, and our system is much more balanced, making sure we can actually harness all the power of the system, unlike Sony. We believe that their teraflop calculations are wrong, and we put them at a similar teraflop number as Xbox 360. So when you consider that the hardware is actually about equal, we absolutely believe that software and services will be a huge advantage for us in the next generation.

Our tools help developers get the maximum power out of the box, therein enabling them to develop higher quality games. And I don’t know how anybody can claim leadership in this generation of gaming without an online gaming solution. Now in its third generation, Xbox Live is a community of nearly 2 million and growing, spanning 24 countries worldwide. Frankly, Sony’s got a lot of work to do if they want to come close to matching what we offer.

Q What do you think of the PlayStation 3 ability to deliver two 1080p video signals? That’s not just a hardware feature; it has definitely an impact in applications, for example, dual-display instead of split-screen multiplayer. Will you consider adding a dual-display feature to the Xbox 360 to match Sony’s offer? Is the Xbox 360 GPU powerful enough to deliver, let’s say, two 720p video signals?

Todd Holmdahl: The Xbox 360 GPU is definitely powerful enough to render two 720p video frames. However, it is important to note that rendering two separate views will degrade the quality of each individual view as the same system resources are now being used to deliver two frames instead of one. Given the tradeoff, we do not see this as a strong consumer demand. If the situation changes, we will re-evaluate our position on this. Looking at the PS3 bus architectures it is hard to understand how the GPU will have sufficient bandwidth to drive one 1080p output buffer with the scene complexity of a next-generation game. Driving two such outputs would mean simplifying the graphics scene to a point where it would definitely not be perceived as “next-gen.”

Q In that same interview, Kutaragi suggested that the Xbox 360 is an extension of the first Xbox, extending the product line as a general-purpose computer. Kutaragi said that "The approach of adopting multiple, all-purpose processors will just raise integer calculation capabilities that will only benefit general applications. This approach will increase the machine's capabilities as an all-purpose computer, but it won't change the type of entertainment." He also said “We welcome the fact that Microsoft invests in this area but, by only improving the graphic performance and increasing the output resolution of the current Xbox, [Xbox 360] won't change anything from the current world of videogame consoles.” What can you say after Kutaragi’s comments?

Todd Holmdahl: Well, as we have a more powerful GPU in Xbox 360 than Sony added to the PS3, we would have to agree that increasing graphics processing power is important. The reality is that the Xbox 360 hardware is a huge leap forward in gaming technology. We have increased integer performance as Mr. Kutaragi states, which is important for game performance. We also believe Sony’s PS3 teraflop calculations are wrong, and that their performance is actually closer to ours when we do the math. This gives us greater integer performance, greater memory bandwidth, and greater floating point performance on the GPU with much better utilization. There is no doubt that our hardware will push gaming forward, but when it comes to fundamentally changing the type of entertainment experience, I think Xbox Live and system software features like the Xbox Guide give us a clear edge. Powerful hardware coupled with software and services make Xbox 360 the only console that can deliver a complete gaming and entertainment experience.

So what do you think? I know, I know it's PR, but how much of what he is saying is true? How much of it is false? And is this any different from what Ken K. has been doing? And finally, why isn't this interview on every website saying "Todd says the Xbox 360 is stronger than the PS3 in every category"?
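
As a partial answer to the "how much is true" question, the GPU GFLOPS claims at least can be sanity-checked against the published unit counts - the RSX side is necessarily a guess, since ATI haven't shown their working:

```python
# Xenos: 48 unified ALUs, each vec4 + scalar MADD = 10 flops/clock, at 500 MHz
xenos_gflops = 48 * 10 * 0.5
print(xenos_gflops)               # -> 240.0, matching the interview's figure

# RSX: working backwards from ATI's 228.8 GFLOPS estimate at 550 MHz
implied_per_clock = 228.8 / 0.55
print(implied_per_clock)          # -> 416 flops/clock, well short of the
                                  # ~464 a full G70-style MADD count gives,
                                  # so ATI have clearly discounted something
```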
 
I hardly think he's tearing into the PS3 - more like defending the 360's design decisions and why he feels they were the right ones. That's not to say I agree with this guy, it's just that I far prefer this sort of interview style to Allard's nonsense.
 
mckmas8808 said:
It sounds like he is tearing a hole into Sony to me. I mean the man says that they are better at everything.

It sounds to me that, since these systems aren't being developed in a vacuum but in a marketplace, he is just reiterating what he sees as the strengths of their system and how they relate to the competition. In their opinion.

They obviously did their homework on their design, and since Sony has several times now used X360 as a comparison (see the slides at E3 and KK's interviews), he was just countering those points in what sound to me like reasoned explanations.
 