Stop trying to bring us back to logic. Logic has no place in this portion. The X at the end of it adds a 10. XTX has two (100), XT has one (10), and XL has one (10).
Can you imagine if XFX got into AMD's game ?
"XFX X2900 XTX XXX"
No need to get so defensive. Most of us could think of many situations where a GPU would NEED more than a 512-bit wide bus to local memory, as I'm sure could you. You can't think of a way to saturate ~179 GB/s (at 2.8 GHz effective GDDR4)? I think it's kind of a pointless exercise, though.
Actually it's quite hard to saturate that much bandwidth before you hit a shader or fillrate bottleneck. As shaders get more complex, less bandwidth is necessary; post-render effects do take up bandwidth, though. How many types of post-render effects are we using in today's games? I can really only think of 2 or 3 that will become mainstream: HDR, motion blur, and (I wouldn't put deferred shading into this, but since the Unreal 3 engine has it...) deferred shading. Then we have AA and AF, which take up bandwidth. Oops, forgot depth of field, so that's 3-4.
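As a rough sanity check on how much bandwidth a single full-screen post-render pass actually eats, here's a back-of-the-envelope sketch. All figures (resolution, FP16 HDR target, one read plus one write per pass, 60 fps) are illustrative assumptions, not R600 numbers:

```python
# Estimate framebuffer bandwidth consumed by one full-screen
# post-processing pass. Purely illustrative assumptions.

def fb_pass_bandwidth_gb_s(width, height, bytes_per_pixel, accesses, fps):
    """GB/s moved by one pass: pixels * bytes * (reads + writes) * fps."""
    return width * height * bytes_per_pixel * accesses * fps / 1e9

# Assumed: 1920x1200, FP16 HDR color (8 bytes/pixel),
# 1 read + 1 write per pass, 60 fps
per_pass = fb_pass_bandwidth_gb_s(1920, 1200, 8, 2, 60)
print(f"{per_pass:.2f} GB/s per pass")  # about 2.21 GB/s
```

Even a handful of such passes is a small slice of ~179 GB/s, which fits the point that shader/fillrate limits tend to bite first; MSAA resolves and AF multiply these numbers, of course.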
AMD Releases Final "R600" Specs
Interesting title but the article is actually a bit light on the specs, nothing that wasn't already known...
Two 8-pin (2x4) VGA power connectors are featured on Radeon X2900, but the connectors are also backwards compatible with 6-pin power supply cables.
INQ reported that the R600 architecture will utilize the PCI Express x16 connection to the fullest by delivering both video and audio content over HDMI. The R600 board will be shipped with at least one DVI-to-HDMI dongle, with DVI serving as the bandwidth provider for video and audio. The R600 chip has the bandwidth to drive not one, but two HDMI ports at the same time at a resolution of 1920x1080 (1080p) or 2560x1440 (1440p).
So eh.....hmmm....:smile:
All this talk about R600 having HDMI sounds sweet, but when I heard HD audio, the first thing that clicked in my mind was this old, old slide leaked long ago and dismissed as fake (because some stated it was sloppy looking...). When this slide was released, I got really excited about the HD audio part and the possibility of R600 having a sound chip. I mean come on!! LOOK AT MY NAME!!
Then we hear (imo) solid info on R600 using HDMI in this tidbit...
Notice the part I highlighted in red; it's what strikes me the most. This part completely reminds me of that old slide, which stated the same thing.
here is the slide...
It actually makes perfect sense, IMHO. Look at GPU boards. They are basically motherboards on a motherboard: a processor, RAM, and a separate PCB. The difference is that a GPU board is an entertainment/multimedia board, so it makes sense to add a sound chip to it. Might as well integrate it all in one package.
I'd love to see this. I'm tired of onboard sound, and if I have to pay 600+ dollars for this card, I want to get audio too.
[IMG]http://i21.photobucket.com/albums/b299/Genocide737/zplaa92921teqi4kd2.jpg[/IMG]
Approximately one month later, the company will launch the GDDR3 version of the card. This card, dubbed the Radeon X2900 XT, features 512MB of GDDR3 and lower clock frequencies than the X2900 XTX. The X2900 XT is also one of the first Radeons to feature heatpipes on the reference design.
Heh, I forgot it was 128 scalar in G80. For some reason I was thinking 64.
Then figure that they're doubled, assume they're a good deal more efficient, and add that ATI might once again be somewhat texture-choked, and we get more reasonable competition (which could still allow ATI to be a good deal faster).
If you hypothetically had a 1024-bit wide bus at 1.4 GHz (2.8 GHz effective DDR), the resulting bandwidth would be 358.4 GB/s. As for whether I could find ways to saturate, say, 180 GB/s of bandwidth... sure: 16x (real) MSAA at high resolutions. But before GPUs go that far, they'd first need the analogous amount of pixel/zixel fillrate, or probably ROPs capable of more than single-cycle 4xMSAA. Theoretically, 4 loops through the ROPs doesn't sound that "efficient" to me. Besides, a 1GB framebuffer would most likely suffer under such stunts too.
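The bandwidth figures being thrown around here all come from the same arithmetic: bus width in bytes times the effective data rate. A quick sketch (the bus widths and clocks are the hypotheticals from this thread, not confirmed R600 specs):

```python
# Peak theoretical memory bandwidth:
# (bus width in bits / 8) bytes per transfer * effective transfers per ns
# gives GB/s directly.

def peak_bandwidth_gb_s(bus_width_bits, effective_rate_gt_s):
    return bus_width_bits / 8 * effective_rate_gt_s

# 512-bit bus, 2.8 GT/s effective GDDR4 (1.4 GHz DDR): ~179.2 GB/s
print(peak_bandwidth_gb_s(512, 2.8))

# Hypothetical 1024-bit bus, same memory: ~358.4 GB/s
print(peak_bandwidth_gb_s(1024, 2.8))
```

Doubling the bus width doubles peak bandwidth at the same memory clock, which is why the 1024-bit figure lands at exactly twice the 512-bit one.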
ATI guidance claims the X2900 XTX retail card comes as a two-slot, 9.5" design with a vapor chamber cooler. Vapor chambers are already found on high-end CPU coolers, so it would be no surprise to see such cooling on a high-end GPU either. The OEM version of the card is a 12" layout and features a quiet fan cooler.
No quiet fan for retail versions?