TXB on Xbox2 = 51.2GB/s bandwidth??

I was under the impression DX9 is going to be tied to Longhorn. Otherwise, when Longhorn comes out there will be virtually no PCs capable of running it, especially entry-level integrated-graphics PCs, which are the majority sold. There is also some talk that the reason behind its delay (is there a delay?) is that Intel doesn't have a solution that supports DX9 coming anytime soon. DX10 may be scheduled for a similar timeframe to Longhorn, but I'm sure DX9 is the API tied to it as a requirement.

I seem to remember that either Parhelia(?) or the P10 was thought to have the hardware required for it, and they are only partial DX9 parts.

I could be wrong; my memory ain't the best... :D
 
sir doris said:
I was under the impression DX9 is going to be tied to Longhorn. Otherwise, when Longhorn comes out there will be virtually no PCs capable of running it, especially entry-level integrated-graphics PCs, which are the majority sold. There is also some talk that the reason behind its delay (is there a delay?) is that Intel doesn't have a solution that supports DX9 coming anytime soon. DX10 may be scheduled for a similar timeframe to Longhorn, but I'm sure DX9 is the API tied to it as a requirement.

I seem to remember that either Parhelia(?) or the P10 was thought to have the hardware required for it, and they are only partial DX9 parts.

I could be wrong; my memory ain't the best... :D

Yes, Longhorn will be based on DX9, but that doesn't mean DX10 won't launch alongside Longhorn. That's what they are saying (that it will launch along with it).
 
Xbox2 will most likely be fully DX10, but I'm pretty sure launch titles will have mostly DX9-level graphics.
 
Grall said:
Panajev2001a said:
Deadmeat, that would be probably barely enough at HDTV resolutions.

Worse still, at just 640*480, a framebuffer without any AA at all occupies 4800 kilobytes at deep color depth. 5 MB is obviously entirely insufficient!

*G*

Deep color depth? 24-32 bits is not too deep for the back-buffer.

I could have responded like this:

That is why Flipper, with 2.12 MB of frame + Z-buffer, has such a hard time :p

Or the GS with 4 MB :p

But I won't as I know you were trying to "burn" the good ol' Panajev.

I am not against doing what ATI might do and putting a minimal frame-buffer and Z-buffer in e-DRAM.

I am just saying that if they plan to support 720p by default, without having to move the Z-buffer into system RAM, they should add a bit more e-DRAM.

At 24 bits each, they take up enough space:

((1,280 * 720) * (24 / 8) bytes * 2 buffers) / 1,024 / 1,024 = 5.27 MB

That is the requirement for 720p with the back-buffer (to which you render) and the Z-buffer stored in e-DRAM, each at 24 bits.

720p is a widescreen HDTV mode, hence the 1280x720 resolution.
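
For anyone who wants to plug in other resolutions or depths, here is the same arithmetic as a small sketch (the parameters are mine, nothing official):

    # Size of on-chip buffers at a given resolution and depth (illustrative).
    def buffers_mb(width, height, bits_per_pixel, num_buffers):
        """Total size in MB of num_buffers buffers at this resolution/depth."""
        total_bytes = width * height * (bits_per_pixel // 8) * num_buffers
        return total_bytes / 1024 / 1024

    # 720p, one 24-bit back-buffer plus one 24-bit Z-buffer:
    print(buffers_mb(1280, 720, 24, 2))  # ~5.27 MB, matching the figure above

    # For comparison, the same two buffers at 640x480:
    print(buffers_mb(640, 480, 24, 2))   # ~1.76 MB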
 
Xbox2 will most likely be fully DX10, but I'm pretty sure launch titles will have mostly DX9-level graphics.

GOD NO!!! I want, at the very very very least, a demo of Gollum's head (and no NV cheapo demo; I want it to be indistinguishable or slightly better)!!!
 
If XGPU2 has eDRAM, I would like to see 12-16 MB. No way 5 MB would be enough; that's barely more than the GS in PS2 (4 MB) or Flipper in GC (3.12 MB).

The GS3 in PS3 is meant to have 32-64 MB.
 
If e-DRAM is going onto the XGPU, it's going to be in the form of some type of cache. My guess is that they will shove 1-2 MB at it.

Then again... I thought Xbox2 would use something like 512 MB of RAM in a UMA structure again? Why would they ditch UMA?
 
Re: ...

DeadmeatGA said:
I agree with one thing: XGPU2 must have an eDRAM frame buffer or it will be toast. Nothing massive like Sony is sticking in their GS3, but just enough to hold one frame buffer, Flipper-style. 5 MB should be enough to hold 1 Z-buffer and 4 subsample buffers.

If they go with MoSys 1T-SRAM-Q for embedded memory, getting over 5 MB of RAM on the die shouldn't be much of a challenge. TSMC (or possibly UMC) should be able to handle that. The only other big player I could see fabbing a VPU for Microsoft is Micron, and Micron clearly would have no problem including a substantial amount of eDRAM on a die.
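
Putting rough numbers on that 5 MB (assuming 640x480 with 32-bit color subsamples and a single 24-bit Z; the depths are my assumptions, not his):

    # Budget check on '1 Z-buffer and 4 subsample buffers' in 5 MB.
    # Assumed: 640x480, 4x 32-bit color subsamples, one 24-bit Z-buffer.
    pixels = 640 * 480
    subsample_color_mb = pixels * 4 * 4 / 1024 / 1024  # 4 subsamples x 4 bytes: ~4.69 MB
    z_buffer_mb = pixels * 3 / 1024 / 1024             # single 24-bit Z: ~0.88 MB
    print(subsample_color_mb + z_buffer_mb)            # ~5.57 MB: already a squeeze at 480p

So even at 480p it only just fits (or doesn't, if the Z is multisampled too), and 720p would need far more.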
 
Panajev: I saw this posted on another board. It's about using an MCM type of package for R500 / NV50, and especially for PS3, like what we have with POWER4: four twin-core CPUs on one chip package.

...just never mind the ray-tracing comment, OK :)


I keep thinking that the PS3 is going to use some sort of multi-chip module like the POWER4. I mean everything gets squeezed into a package that fits into your hand: the CPUs, the GPU, and even the Rambus memory! That would allow for hundreds of GB/s of bandwidth to main memory at relatively low latencies. eDRAM is an expensive option with relatively poor yields compared to separate processor and memory dies. My mind keeps coming back to the idea that Sony is going to push ray tracing on the PS3. If the rumors of 8 PPC chips with 8 vector units per PPC hold true, they might be able to do just that. The idea of 8 PPC chips would fit into the MCM plan very nicely. The only problem would be cooling the beast.

eDRAM is a rumored addition to the R500/NV50 generation of chips in small amounts. It wouldn't surprise me if ATI or nVidia goes to MCMs for their high-end graphics chips. An MCM can easily be designed to accommodate a 512-bit wide memory bus running at speeds greater than 1 GHz. The performance gains would be the justification for the added expense.

The shader quality of the R500/NV50 might be comparable to the ray tracing found on CPUs. The chips may become just as programmable as a CPU in this respect. In other words, the X-Box 2 is going to be doing stuff on its GPU that the PS3 uses several CPUs to do.

It will be interesting to see how things play out in the future. I can't wait till all the rumors and speculation stop when they start launching the next generation of hardware and the facts are revealed.
 
Paul said:
If e-DRAM is going onto the XGPU, it's going to be in the form of some type of cache. My guess is that they will shove 1-2 MB at it.

Then again... I thought Xbox2 would use something like 512 MB of RAM in a UMA structure again? Why would they ditch UMA?


I'm not sure how Microsoft can have an outstanding-performance console with a UMA. Graphics will need massive bandwidth, while a CPU needs very low latency, as do networked games. Online games have 16 players or more playing at the same time. It's critical for each console to receive and send packets as fast as possible; otherwise you start to notice lag (high-latency effects like players warping around, and shooting a player but not really hitting them). RLDRAM-II comes across as the perfect solution for an online gaming console.

The latency of GDDR-3 is much higher, but the cost/bandwidth/density ratio makes it a good fit for a VPU.

What kind of RAM could Microsoft include to make a good UMA?
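
To put a number on why graphics hogs a shared bus, here is a back-of-the-envelope sketch; every figure is an illustrative assumption, not a leaked spec:

    # Rough UMA traffic from the GPU alone (all numbers assumed).
    fps = 60
    pixels = 1280 * 720
    overdraw = 3                      # assumed average overdraw
    bytes_per_pixel_pass = 4 + 4 + 4  # color read + write plus Z, very rough
    gpu_gb_s = pixels * overdraw * bytes_per_pixel_pass * fps / 1e9
    print(gpu_gb_s)  # ~2.0 GB/s for raw framebuffer traffic alone,
                     # before any texture or geometry reads

That is framebuffer traffic only; add textures and geometry and the GPU can starve the CPU on a shared bus, which is exactly what e-DRAM is meant to relieve.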
 
While I have not heard of RLDRAM-II before, or at least I don't remember reading about it, I totally agree with your post, Brimestone.

I honestly hope MS ditches the UMA approach for the next Xbox console.


I hope for a killer multi-core CPU from Intel or even AMD, not some run-of-the-mill CPU like the one Xbox had. That, combined with a massively powerful VPU that has extra processing units compared to the standard R500/R550 PC VPUs (like PowerVR2DC over PowerVR2).
 
Hey, don't ask me how they will do it. But UMA was a great feature of Xbox and supposedly good for developers, so I figured they would still use it.

Although network packets are more on the Internet-connection side of things (latency, bandwidth)... I'm not sure what that has to do with the CPU.
 
Yes, Paul's ultimately correct. IMHO the odds are 8:2 in favor of a UMA-type architecture. Segmented memory reminds me of the people talking about multichip solutions: just not happening.

And, just to follow up quickly on those commenting that the difference is between eDRAM and logic: that isn't necessarily correct in practice. If ATI or nVidia had a history of producing ICs approaching 300mm^2, I'd agree. But, as history shows, SCEI had almost a 100mm^2 advantage over the other two with the GS. Even the R300, which is large by their standards, is rumored to be below or approaching 200mm^2. That goes a long way toward masking the eDRAM footprint, and it just shows how important a strong grasp of the litho/process is.
 
The last planned PC-desktop graphics core with eDRAM would have had only 3 MB of it... so you don't need as much as you think. The only thing is that you need to do a few things differently than they are done now.


It is possible to make a quite fast eDRAM chip without trading away many features. Architecturally it will look quite different from today's chips, but hey, who gives a damn if it runs games twice as fast as your neighbour's console. ;)
 
Didn't the Rendition / Micron V4400E (V for Verité) have 12 MB of eDRAM back in 1999-2000? It was 120 or 125 million transistors, as many as today's NV30.
 
Megadrive1988 said:
Didn't the Rendition / Micron V4400E (V for Verité) have 12 MB of eDRAM back in 1999-2000? It was 120 or 125 million transistors, as many as today's NV30.
I don't remember the transistor count, but it was supposed to have 12 MB of eDRAM. IIRC, one of the reasons it didn't make it was extreme heat. It would have been blazin' fast at the time, though. :)
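
The two numbers are at least consistent with each other: at one transistor per DRAM bit (1T1C cells, counting only the transistors), the eDRAM alone accounts for most of that total. A rough check:

    # 12 MB of eDRAM at 1 transistor per bit (1T1C DRAM cells).
    edram_transistors = 12 * 1024 * 1024 * 8
    print(edram_transistors / 1e6)  # ~100.7 million, so a 120-125M budget
                                    # leaves only ~20-25M for the actual logic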
 
If ATI or nVidia had a history of producing ICs approaching 300mm^2, I'd agree

You know, Vince, ATI can churn out a 300 mm^2 chip, if MS approved ;)

I know DMGA wouldn't approve :D
 
While I'm a huge fan of the Xbox and I believe the next iteration will be a great machine with great games, I think it will ultimately fail if Microsoft doesn't include a mouse and keyboard as part of the default retail package. Sony and Nintendo can get away with gamepads only, since their audiences have been raised on platform, sports, and racing games. A large part of the Xbox user base, however, consists of refugees from PC gaming (myself included) who prefer typical PC fare like FPS and RTS games. Most of the games we like have evolved in conjunction with a mouse and keyboard as their control mechanism. I love Halo and Brute Force, but I could never get that slight bit of disgust out of the back of my mind that came with having to play them with dual sticks. I think Microsoft needs to consider this matter carefully, as I believe it's of equal importance to the graphics hardware for the next Xbox. I know I'll just return to PC gaming exclusively if the sole control device for the Xbox 2 is a gamepad.

Cost doesn't have to be an issue if they'd just be willing to get creative with the solution. For example, they could include an optical mouse that plugs into a data port (memory card slot) on the gamepad. For movement, the gamepad itself could be used. As long as it's shaped right, they could include an adapter that snaps onto the gamepad and provides a flat surface to steady the pad. Then you could just use the thumbstick for directional movement. In such a scheme, the only additional hardware that must be included is the optical mouse (and a cheap piece of plastic that connects to the gamepad). Since this hardware would be included with every Xbox unit, all FPS games appearing on the console could be designed with the mouse as the sole means of aiming.

This is just a quick idea. I could probably come up with a dozen other possible setups that would allow MS to include better control for certain games without going to great expense. The point is that I don't think it will matter if the Xbox is twice as powerful as the PS3. Many of us Xboxers have had a taste of gamepad control and we don't care for it, whereas the main fans of the Japanese consoles will return to their systems' next machines even with just a gamepad. Microsoft is in danger of losing the console war (and billions of dollars) outright if they don't consider the ancillary issues along with the hardware specs. Of course, if their intention is just to make a living-room 'media center' with the Xbox 2, then none of this matters anyway. I have a PC and TiVo for that stuff. I don't need a console to serve all my 'multimedia needs'. A lot of the comments coming from Microsoft and Sony sound very much like that is what they intend with their next consoles. I think consumers are much smarter than these corporations give them credit for. I doubt many people will want to sign up for a godlike media machine when they can instead pick and choose the exact components they want and need from a range of different vendors.

I guess all this will play out in the next few months, and we'll see exactly what we can expect in 2005-6.
 
I don't know why you guys expect VS4.0/PS4.0. VS3.0/PS3.0 has pretty much everything a general-purpose instruction set needs. I don't see many additions to it besides upping the limits on registers, loops, and call nesting. This is unlike the huge differential between PS1.0 and PS2.0.

What future hardware needs is to run VS/PS3.0 much faster. DX10 will utilize PS3.0. That's why MS created VS3.0/PS3.0: so they wouldn't need to keep revising the instruction set. The 3.0 shaders set the bar at least two hardware generations ahead in terms of pipeline flexibility. NV40/R420 will be the first implementations (MAYBE). By the time Xbox2 rolls around, what they need is a second-generation implementation of PS3.0, not some ill-defined PS4.0 that people don't even have a clue as to why they would need.


(Could someone suggest exactly what they expect PS4.0 to have that would justify it as PS4.0 rather than PS3.1, that is a worthwhile addition, and doesn't kill performance?)
 