The LAST R600 Rumours & Speculation Thread

That 'red cooler' OEM pic is a false-colour Photoshop job.
That, or ATI have a new PCB manufacturing process that makes the PCB red at the left and black at the right...

I'd say it's actually just a black PCB and black cooler, the same as in the other pics of that model.


They're all red PCBs.... The black part is for the fan and the card holder.
 
In fact, that makes the most sense.

One 8-pin + PCIe slot = 225W.
Two 6-pin + PCIe slot = 225W.

It's highly possible that ATi put a backwards-compatible 8-pin in there to suit the needs of people who can use a single 8-pin cord.

Good call. That does make a lot of sense. I didn't agree with the Geo 'it uses one or the other' scenario, because 150W just didn't make sense, but the Geo '8-pin, OR the 6-pin plus the 8-pin used as a 6-pin' does seem to make quite a bit of sense. :smile:

BTW: Sure, either of these methods is only rated up to 225W, but surely there must be a comfort zone built into the spec, and one would think 240W would seem possible. Would going over the rated spec for the slot/plugs make it non-compliant with PCI-SIG?
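
A quick back-of-the-envelope sketch of those budgets in Python (these are just the spec ceilings being discussed; nothing here is confirmed for R600):

PCIE_SLOT_W = 75   # PCIe x16 slot limit
SIX_PIN_W   = 75   # 6-pin PCIe power connector
EIGHT_PIN_W = 150  # 8-pin PCIe power connector

one_6pin = PCIE_SLOT_W + SIX_PIN_W      # 150 W
one_8pin = PCIE_SLOT_W + EIGHT_PIN_W    # 225 W
two_6pin = PCIE_SLOT_W + 2 * SIX_PIN_W  # 225 W

print(one_6pin, one_8pin, two_6pin)  # 150 225 225

So a backwards-compatible 8-pin gives the same 225W ceiling either way, and 240W only works if there's headroom above the rated spec.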
 


haha like spell checking my post eh? lol, it's ok. I should really slow down and double check from now on. :p


I think it's a little crazy to think ATi is going for an 8-pin and a 6-pin at the same time. This is why I truly think the 8-pin is backwards-compatible with a 6-pin cord. This method helps cater to people whose PSUs have 8-pin connectors, so people have the choice of using either two 6-pin connectors or one 8-pin.

So tell me how the XL board draws 180 watts, according to VR-Zone, with only one 6-pin power connector? That's impossible: 75W from the slot plus 75W from the 6-pin is only 150W. Which leads me to think VR-Zone is not thinking straight right now.

Also, the specs that were posted are from a random forum poster at [H] who was taking a guess, not VR-Zone. :p

:???:
 
[image: R600 info.jpg]


http://www.ocworkbench.com/news/news.php
 
Wow, what a monster! With optimized drivers, this thing should rock all over. Now if only we could get some DX10 games to go with it :( Likely I'll have to get an x3800xtx2x2theMAX!!!!!! in the fall, when hopefully we get some games!
 
hehe, you see that as well? :smile:

What's more interesting is that it has the same MSRP as the single XTX. :p

That XTX2 board (boards? lol) looks, according to the chart, like two GTO boards together. :cool:

Not quite.
Apparently, the GTO maxes out at 1.4GHz GDDR3, while the X2800XTX2 has 2.0GHz GDDR4, x 2...

Maybe 2*256-bit and 2*512MB in the X2800XTX2, versus 512-bit and 1GB for the X2800XTX, means they've designed a preemptive weapon against the G80 refresh, one that uses the raw shading ability (96*2) but can't be used in unfavorable benchmarks, i.e., the ones that don't scale well with multi-GPU setups.
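
To put rough numbers on that split (assuming the quoted GDDR4 speed is an effective 2.0 GT/s data rate; Python just for the arithmetic):

def gbs(bus_bits, effective_gtps):
    # bytes per transfer times billions of transfers per second
    return bus_bits / 8 * effective_gtps

per_gpu = gbs(256, 2.0)   # 64.0 GB/s for each GPU's own 512MB pool
total   = 2 * per_gpu     # 128.0 GB/s on paper, but not one shared pool

Each GPU only sees its own half of the memory, which is the usual multi-GPU catch.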

Pricing them similarly and releasing them at more or less the same time could mean fracturing the high end into specialized options, based on the type of software used.

I wonder if Nvidia is working on a follow-up to the 7950 GX2, or if they have abandoned it in favor of the rumored "Triple" SLI/SLI 2.0.
The Quadro Plex, however, could figure into a long-term development strategy, just not on the "home entertainment PC" front.
 
That fx57 link is eerily similar (3DMark scores exactly the same!) to this post at [H] - http://www.hardforum.com/showpost.php?p=1030617253&postcount=3

Probably the same guy - at least he could be consistent with the core clock: sometimes it's 700, sometimes it's 800...
Or sometimes 750. It's clear a lot of those numbers are off; I don't see how MADD + MUL makes it a 4-way configuration. More likely it sticks with Vec4, like every card since the R300. With an 800MHz clock, that comes out to the ~614 GFLOPs specified, as Arnold Beckenbauer said. And I have no idea where they came up with those fillrate or vertex numbers.
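
For what it's worth, the ~614 figure does drop straight out of a plain Vec4 MADD at 800MHz, if you assume 96 units (the unit count is my assumption from the rumored chart):

units      = 96   # rumored shader unit count (assumption)
vec_width  = 4    # Vec4 ALUs, like every card since the R300
madd_flops = 2    # one MADD counts as a multiply plus an add
clock_ghz  = 0.8  # 800MHz core clock

print(units * vec_width * madd_flops * clock_ghz)  # 614.4 GFLOPs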
 
If this chart is correct, the lower-end cards look very good for the price. Getting a 32-shader-unit card for $60 is a big jump from what you can get now, like an X1300, which has a total of 6 shader units (pixel and vertex combined) for $60.
 
This looks like stuff from a random wish list; I'll eat plain mayonnaise (which I hate most) if this comes true. ;)
 
Just a question:

How much will the R600 XTX attract people who don't really understand video cards and only know the basics?

Like when the fancy red-green ATI/AMD box itself says in BIG letters:
1GB GDDR4 video memory at 1.2GHz (153.6 GB/s)
512-bit memory controller
~750-800MHz GPU core

And then they look at the Nvidia G80 GTX:
768MB GDDR3 video memory at 900MHz (86.4 GB/s)
384-bit memory controller
575MHz GPU core

And they'll probably buy ATI over Nvidia at the high end.
Because GDDR4 vs. GDDR3: ooooh, probably double the speed and almost double the bandwidth!
Then memory size, 1024MB vs. 768MB: ooooh, more memory! And 512-bit vs. 384-bit controller. Wow!!!! Cool, 800MHz vs. 575MHz GPU core. (Just like Intel P4 vs. AMD XP.)

Because from what I've seen of how people buy new video cards, that's what they look at first....
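
For the record, the box numbers do at least check out arithmetically (assuming the 1.2GHz GDDR4 clock means 2.4 GT/s effective, and 900MHz GDDR3 means 1.8 GT/s):

def gbs(bus_bits, effective_gtps):
    return bus_bits / 8 * effective_gtps  # bytes per transfer * GT/s

print(gbs(512, 2.4))  # 153.6 GB/s, the rumored R600 XTX figure
print(gbs(384, 1.8))  # 86.4 GB/s for the G80 GTX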
 
Well, the thing that sticks out to me is the 1GB of RAM; that should be a big win for selling to noobs.
 
If this chart is correct, the lower-end cards look very good for the price. Getting a 32-shader-unit card for $60 is a big jump from what you can get now, like an X1300, which has a total of 6 shader units (pixel and vertex combined) for $60.

You mean 8 PS/VS units; those "32 shader units" are "only" 32 scalar ALUs.

But for the price, the performance will be great.


Regarding the high power draw: could that mean the 2800 XTX2 needs the 250W and the normal single-chip cards need less? Maybe 1/2 to 2/3 of that 250W.

But IMHO it does not look good for ATi/AMD if they need a GX2-type card to compete with the G80. At least that's how I read the chart.
 
But IMHO it does not look good for ATi/AMD if they need a GX2-type card to compete with the G80. At least that's how I read the chart.
I didn't see this argument come up this generation when Nvidia released the GX2. Not that I'm saying an XTX2 will be a reality. :oops:
 
I didn't see this argument come up this generation when Nvidia released the GX2.
It was there nevertheless. :) Nvidia's GX2 was the fastest single-board graphics solution out there, but due to the inherent limitations of multi-GPU rendering, quite a few people deemed it an act of desperation.
 
And they'll probably buy ATI over Nvidia at the high end.

But since pretty much everyone on this board agrees the R600 will be faster, why wouldn't they buy it?
 