The LAST R600 Rumours & Speculation Thread

Edited my post to include links, apparently while you were posting your comments :). I had originally included the 8600 GT in my comments, which I believe will also have a 256-bit bus, but I can't point to anyone else rumor-mongering that, so I won't claim it.

For the record, if the 8600 Ultra has the specs it is projected to have (512MB of RAM on a 256-bit bus, all for ~$200), that will be my next card. I have unexpectedly moved to an SFF platform, so a GTX-sized card will be unsuitable for me. That also rules out any R600 card sporting a handle, but I wouldn't have bought something that big for my tower either (unless it was really cheap and fast).

I am also interested to see what R600 will bring in the ~$200 - $275 price range. That's what I am willing to spend for a DX10 part. I can't wait forever, though - whoever gets a good DX10 part out in that price range first is getting my business, unless there is some compelling reason not to do so, like it's 15 inches long or takes up three slots.

Look at the date on that article. ;)
I think Uttar already debunked that as just bad info from the original chinese source.

Of course, it's too soon to say for sure, but the earlier rumor of 128bit + high clock GDDR3/GDDR4 seems more likely than a 256bit bus with lower clocked memory.

But, since R600 has a 512bit external bus, we can't rule out a midrange R6xx with 256bit bus either.
This keeps getting weirder and weirder every day. :D
 
vr-zone said:
ATi has finally provided some updates to their eagerly waiting partners regarding the schedule of their upcoming DX10 cards. R600 will be launched at CeBIT, but card availability is in April, and we heard that the official name could be Radeon X2900. http://www.vr-zone.com/?i=4572

Maybe the reason AMD will call it the X2900, as opposed to the X2800, is that calling it X2800 would mean they are late and ATI is merely catching up to Nvidia, while calling it X2900 shows the world that they are ahead of Nvidia....
 
I don't think G70 was any great surprise. Nice part and all, but no great surprise. If R520 had been out in April as expected, it'd have been a fine competitive situation. R580 was actually a bit late from the original schedule because of R520 being very late.

Sorry, I meant to say that the first surprise was the NV40, which was released before the R420 and had SM3.0. The NV40 was still a huge surprise following the NV3x debacle; I certainly wasn't expecting a good, fully compliant SM3.0 part. The G70 built on that, and there was no news (at least to the public) until just a few days before release; with that release, consumers could actually get the card on shelves, which is still something ATI needs to do on launch day. Then the G70 was again released before ATI's R520 (which of course had the PCB and core problems and needed the R580 to fix several of them).

The G80 was again a surprise because of 1) the early release, 2) fully unified shaders, 3) being flipping fast compared to previous cards, and 4) being in stock at launch, again.

So, can ATI surprise us with the R600? I would expect it to thrash the G80 because of a few factors: 1) better, faster memory; 2) the GPU should be better overall?; 3) higher core speeds?

I'll be surprised and disappointed if the R600 is only 5-10% faster than the G80 (in both DX9 and DX10), since it will most probably also use more power in the end.

Thus my speculation that Nvidia might have something waiting in the wings again to surprise us.

Presumably R600, like G80, will rough up DX9 class cards in DX9 on Vista quite handily. That's a given.

I would think so since the head honcho claimed it would be the fastest DX9 card ever a few months ago. ;)

US
 
Look at the date on that article. ;)
I think Uttar already debunked that as just bad info from the original chinese source.

Of course, it's too soon to say for sure, but the earlier rumor of 128bit + high clock GDDR3/GDDR4 seems more likely than a 256bit bus with lower clocked memory.

But, since R600 has a 512bit external bus, we can't rule out a midrange R6xx with 256bit bus either.
This keeps getting weirder and weirder every day. :D

Here's the Inq's rehash

Someone here started the rumor that it was a 128-bit bus ;). I haven't seen it repeated since this article, but the "debunking" occurred at the same place the rumor originated. If you can wait until tomorrow, I can dig up the 15 other links that reference translated Chinese sites, the link to Nordic Hardware, and the VR-Zone rehash that both debunks and then restates that 8600 Ultra is 256-bit bus.
 
I haven't seen it repeated since this article, but the "debunking" occurred at the same place the rumor originated. If you can wait until tomorrow, I can dig up the 15 other links that reference translated Chinese sites, the link to Nordic Hardware, and the VR-Zone rehash that both debunks and then restates that 8600 Ultra is 256-bit bus.
I fail to see how some random sites repeating the same stuff over and over again makes it any more reliable... :)
 
I fail to see the same thing, which is why I didn't post it. :D

I also still believe that the 8600 Ultra will be on a 256-bit bus, and that there is a possibility that the GT will be, as well.

We'll see.

We've had so much bad and unvalidated information lately that it shouldn't come as a surprise to anybody that some of the things they believed aren't true. :D (but in this case if 8600 Ultra is not on a 256-bit bus it will be a huge surprise to me)
 
It would be prudent to wait until it actually comes out or until we get some specs from nVidia before stating this categorically. There is no data to prove or disprove it. That's what is rumored and that rumor recently has gained momentum. That also does not prove or disprove it. :)
 
What's going on with RV570 and RV560?:
Hmm, use the 4 ring stops of RV570 and only solder in 4 RAM chips, 1 per ring stop, to get the 128-bit bus (4*1*32)?
But then, since you are already disabling functional units, you might as well disable 2 stops, leaving 2 stops with 2 chips each (2*2*32), and run the ring in only one direction to reduce power requirements.
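The ring-stop arithmetic above can be written out explicitly. This is just a back-of-the-envelope sketch; the 32-bit memory channel per chip is the assumption the post itself makes:

```python
# Total external memory bus width, assuming each RAM chip hangs off a
# ring stop via its own 32-bit channel (the post's working assumption).
def bus_width_bits(ring_stops, chips_per_stop, channel_bits=32):
    return ring_stops * chips_per_stop * channel_bits

# Both configurations land at the same 128-bit bus:
print(bus_width_bits(4, 1))  # 4 stops, 1 chip each  -> 128
print(bus_width_bits(2, 2))  # 2 stops, 2 chips each -> 128
```

The point being that the two layouts trade ring-stop count against chips per stop while keeping the total bus width constant.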

Either way, I don't really see how they justify giving the RV560 a separate designation when my X1900 GT is still an R580 :???:

What's with the blur-vision?
R600 engineers have been watching The Ring too much?
 
Let's get this thread back on topic; new details from VR-Zone:

AMD remains secretive about their R600, but we still managed to dig out a little more information. We can expect to see R600, RV610 and RV630 demos @ CeBIT, but no performance tests will be revealed there. The R600 launch now appears to be slated for the end of March, and there are 3 SKUs: XTX, XT and XL. ATi briefly mentioned to their partners a 2nd-generation Unified Shader structure and a new hardware CrossFire for the R600 family which can be bridgeless yet faster. We are not sure yet how this new HW CF works. The retail card is half an inch longer at 9.5" and the OEM/SI version remains the same at 12". It comes with a 512-bit memory bus and 1GB of GDDR4 at more than 2GHz frequency. The core clock remains a mystery as yet.
 
Whee, a 9.5" XTX would make some people happy, I think. Personally, I still want to see it.
 
Gee whiz, Polly, 128 GB/s of bandwidth sounds mighty fine, mighty fine indeed.

Allow me to recap possible R600 XTX bandwidth figures, based on available Samsung GDDR4 parts (I've "rounded" them up a bit for easy reading):


R600 (512 bit bus):

- 1.0 GHz GDDR4 (x2): 128 GB/s
- 1.1 GHz GDDR4 (x2): 141 GB/s
- 1.2 GHz GDDR4 (x2): 154 GB/s
- 1.4 GHz GDDR4 (x2): 179 GB/s


Just for curiosity, here are the numbers for both the existing G80 GTX (384 bit bus), as well as if it was using the same GDDR4 chips detailed above, while keeping the same bus width.

G80 (384 bit bus):

- 0.9 GHz GDDR3 (x2): 86 GB/s (current version)
- 1.0 GHz GDDR4 (x2): 96 GB/s
- 1.1 GHz GDDR4 (x2): 106 GB/s
- 1.2 GHz GDDR4 (x2): 115 GB/s
- 1.4 GHz GDDR4 (x2): 134 GB/s



edit
Forgot to mention this, but the 1.0 GHz GDDR4 part (as used in the current X1950 XTX) is no longer listed on the official Samsung semiconductor website; it was replaced by a 1.0 GHz GDDR3 chip.
So GDDR4 seems to be restricted to 1.1, 1.2 and 1.4 GHz, unless someone has a pile of older GDDR4 stock in storage for PCB integration.
My personal bet would be on 1.2 GHz chips for the top R600 XTX, but I guess final decisions can be made very late, based on the number of different SKUs and RAM prices.
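The figures in the recap above follow directly from bus width and the double data rate of GDDR3/GDDR4 (the "x2" in the listings). A quick sketch to reproduce them:

```python
# Peak memory bandwidth in GB/s:
# bus width (bits) / 8 -> bytes per transfer,
# times memory clock (GHz), times 2 for double data rate.
def peak_bandwidth_gbps(bus_bits, clock_ghz):
    return bus_bits / 8 * clock_ghz * 2

# Rumored R600 XTX: 512-bit bus, 1.0 GHz GDDR4
print(peak_bandwidth_gbps(512, 1.0))  # 128.0
# Shipping G80 GTX: 384-bit bus, 0.9 GHz GDDR3
print(peak_bandwidth_gbps(384, 0.9))  # 86.4 (listed as ~86 above)
```

The remaining entries in both tables fall out of the same formula with the other clock speeds plugged in.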
 
My personal bet would be on 1.2 GHz chips for the top R600 XTX, but I guess final decisions can be made very late, based on the number of different SKUs and RAM prices.

What difference is it going to make, 1.1 or 1.2 GHz or whatever in that range? It will not make any difference in performance for the R600 XTX.
 
What difference is it going to make, 1.1 or 1.2 GHz or whatever in that range? It will not make any difference in performance for the R600 XTX.

At this point, I doubt it's much more than "having to be the best".
Remember, even a single 8800 GTX is CPU-limited in most scenarios; only with combinations of AA + AF + insane resolutions do we see GPU limitations in current and near-market software titles.

However, it would be cool if part of that single-GPU bandwidth could be used concurrently with 3D rendering on the same card, say, for physics processing loads...
 
At this point, I doubt it's much more than "having to be the best".
Remember, even a single 8800 GTX is CPU-limited in most scenarios; only with combinations of AA + AF + insane resolutions do we see GPU limitations in current and near-market software titles.

However, it would be cool if part of that single-GPU bandwidth could be used concurrently with 3D rendering on the same card, say, for physics processing loads...

What do you consider insane resolutions? I don't know why people keep saying we don't need anything faster than the G80 below 2560x1600, when we clearly do.
 