How about some speculations about AMD R680_65nm

R680
DX10.1
65nm
1100MHz core clock
1GB of 2.5GHz GDDR5 (based off DDR3)
:LOL:

:p

In all seriousness, I agree with that... except, of course, the clockspeeds/memory. It seems ATi is targeting somewhere in the 700-800MHz range for the R600, so one would assume this would be close to the 1GHz mark.

The questions that are burning though are:

1. Will it be a straight shrink to 65nm, i.e. just to get the die size down (to < 300mm2), or will they change the architecture by adding more shaders/TMUs/whatever, or by redefining those shaders (perhaps something along the lines of 64->96, or vec4->vec5), or will perhaps both be possible and done?

2. Are those old rumors of R600 running at 1.2v to the core true? Because if they are, and R600 runs hot even at that spec (because of its size, setup, clocks, whatever, at least on A12), it would seem that spec leaves a lot of room for boosting come later revisions. We know ATi has played that game in the past, leveraging parts based on stock core voltage (XL/XT). Haven't high-end GPUs in the past been in the 1.4-1.5v range, with mid-range at 1.2v? If R600XT/X comes out at 1.2v, perhaps R680 could be a straight shrink to 65nm with 1.4v stock or something, with the smaller area allowing the heat to be reasonable for such a voltage, resulting in massive clocks.
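A back-of-the-envelope way to sanity-check that voltage argument: dynamic power scales roughly as C*V^2*f. A toy comparison (every number below is an illustrative assumption, not a leaked spec):

```python
# Toy dynamic-power comparison: P ~ C * V^2 * f.
# Every number here is an illustrative assumption, not a real spec.
def rel_power(cap: float, volts: float, freq_mhz: float) -> float:
    """Relative dynamic power, in arbitrary units."""
    return cap * volts**2 * freq_mhz

r600 = rel_power(cap=1.0, volts=1.2, freq_mhz=750)   # hypothetical 80nm part
r680 = rel_power(cap=0.7, volts=1.4, freq_mhz=1100)  # hypothetical 65nm shrink

print(f"relative power vs. R600: {r680 / r600:.2f}x")
```

Even if the shrink cut switched capacitance by ~30%, pushing both voltage and clock that far would still land around 1.4x the power in this toy model, so the real headroom depends on how much the process and revisions actually save.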

3. What spec of GDDR4 will be available in Q3/Q4 of this year? 3.2GHz? 3.6GHz? We've heard both will be available, and that yields are good, yet adoption seems slow, Samsung is slow to increase bins, and Hynix has yet to even come out with anything.
 
The problem is that RAM can be very expensive. So unless you want a $999 card with performance barely higher than that of a $499 card, I think it doesn't make sense for the consumer market... :)
Uttar

The only thing I'm still wondering about: the extra 1GB of memory might not cost ~$499.
I was thinking more like ~$249 for the extra 1GB of RAM.

If R680 or something like it - a 2900XTX-series card - costs $649.99 with 1GB of RAM, then adding another 1GB at $249.99 for the memory brings it to about $899.99 for the video card. "Just a logic calculation"
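The "logic calculation" above, as a quick sketch (the card and memory prices are the speculative figures from this post, not anything announced):

```python
# Speculative pricing from the post above; none of these figures
# are announced prices.
base_card_usd = 649.99   # hypothetical R680 / 2900XTX with 1GB
extra_1gb_usd = 249.99   # guessed price of a second 1GB of memory

total_usd = base_card_usd + extra_1gb_usd
print(f"2GB card estimate: ${total_usd:.2f}")  # -> $899.98, i.e. "about $899.99"
```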

[Edit: I agree that it would not be wise/smart for the consumer market. "It does not make sense."]
 
:p
1. Will it be a straight shrink to 65nm, i.e. just to get the die size down (to < 300mm2), or will they change the architecture by adding more shaders/TMUs/whatever, or by redefining those shaders (perhaps something along the lines of 64->96, or vec4->vec5), or will perhaps both be possible and done?

I was thinking more about what the competition will bring to the table... or about how much time ATI will have to bring some technologies from R700 into R680.

Just like Nvidia going from NV45 (6800 series) to NV47, with the GPU core renamed to G70 (7800 series).
 
Just like Nvidia going from NV45 (6800 series) to NV47, with the GPU core renamed to G70 (7800 series).

Renaming the products had nothing to do with competition or architecture; going from NV50 to G80 doesn't give you a 50% speed boost.
If AMD announces it as an FX2800, it won't run faster than an R600.
 
Renaming the products had nothing to do with competition or architecture; going from NV50 to G80 doesn't give you a 50% speed boost.
If AMD announces it as an FX2800, it won't run faster than an R600.

That is not what I mean; you are missing my point!

[Edit: NV35/GF5900 same thing, NV45/GF6800 same thing, NV47/G70 same thing, also NV50/G80 same thing]

Nvidia would normally have gone generation after generation as usual: NV10, NV20, NV30, NV40, NV50.
NVIDIA's G70 (7800 series) was technically supposed to be NV50 instead of NV47, following the previous generation's NV45. If Nvidia had followed the old way, G80 would be NV60, to match R600.

Example:
NV10 = R100
NV20 = R200
NV30 = R300
NV40 = R420
NV50 = R520
NV60 = R600

The possibility could be R680 fighting G90 and R650 fighting G81. :) Or it could be the opposite, with R700 fighting G90.
 
Nvidia would normally have gone generation after generation as usual: NV10, NV20, NV30, NV40, NV50.
NVIDIA's G70 (7800 series) was technically supposed to be NV50 instead of NV47, following the previous generation's NV45. If Nvidia had followed the old way, G80 would be NV60, to match R600.

The possibility could be R680 fighting G90 and R650 fighting G81. :) Or it could be the opposite, with R700 fighting G90.

No, I'm sorry, you fell victim to the marketing machine.
Read the G70 technology preview (http://www.beyond3d.com/previews/nvidia/g70/)

The name G70 was used as a clean slate for nV's product numbering scheme; NV47 is G70, and no theory will change that. NV50 is G80.

NVIDIA's G70 (7800 series) was technically supposed to be NV50 instead of NV47
No

following the previous generation's NV45. If Nvidia had followed the old way, G80 would be NV60, to match R600.
NV40 was the GeForce 6 line, not NV45; NV45 was native PCI Express instead of AGP, that's all. G70 really isn't all that much different from NV40, hence the small step from NV40 to NV47.
 
different from NV40, hence the small step from NV40 to NV47.

Are you saying it's like the small step from ATI R300 to R420?

Yes, ATI's R3xx, R4xx and R5xx share the same technologies, but performance doubles from one to the next. ATI's R600 will share technologies with the R500 Xenos (Xbox 360)....
R600 should be the next big step, just like ATI's R300 was from R200.

NV40/NV45 share the same technologies with NV47, but performance doubles from NV40.
NV50/G80 is a big step into the future.

We will have to see if ATI's R600-to-R680 transition will be like NV40 to NV47/G70.
 
NV45 was native PCI Express instead of AGP, that's all.

Small correction: NV45 was not native PCI-Express.
I have one and can confirm it's still the good ol' NV40 AGP8x, bridged to PCIe with the BR02 chip on-package.
The fact that it shared the package with the GPU was what gave Nvidia the opportunity to call it "NV45", not to mention reusing most of the PCB components from the AGP line.

But the outcome was acceptable, since the name is still "6800 GT/6800 Ultra", and the translation chip is no perceived bottleneck in the apps I've been using. :D
 
Err, didn't the PCIe variant not have the PureVideo problem? Wouldn't that indicate it was a different chip?
 
Are you saying it's like the small step from ATI R300 to R420?

It's somewhere right in between R300-R350 and R520-R580.


NV40/NV45 share the same technologies with NV47, but performance doubles from NV40.
Performance doubles? Hardly at all. Maybe some benchmarks will be better, but the only thing that comes close to doubling is shader and texture performance, simply because G70 has 24 units and NV40 has 16.
The pixel fill rate etc. hardly increased at all, and memory bandwidth increased 10%.
Really, check the reviews done here and then come back; G70 was no big step at all.
http://www.beyond3d.com/previews/nvidia/g70/index.php?p=07
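To put rough numbers on that, here's a sketch using the commonly quoted launch specs of the 7800 GTX (G70) and 6800 Ultra (NV40); treat the figures as approximate:

```python
# Rough throughput scaling, G70 (7800 GTX) vs. NV40 (6800 Ultra).
# Unit counts and clocks are the commonly quoted launch specs.
nv40 = {"shaders": 16, "rops": 16, "core_mhz": 400, "mem_mhz_eff": 1100}
g70 = {"shaders": 24, "rops": 16, "core_mhz": 430, "mem_mhz_eff": 1200}

shader_gain = (g70["shaders"] * g70["core_mhz"]) / (nv40["shaders"] * nv40["core_mhz"])
fill_gain = (g70["rops"] * g70["core_mhz"]) / (nv40["rops"] * nv40["core_mhz"])
bw_gain = g70["mem_mhz_eff"] / nv40["mem_mhz_eff"]  # both on a 256-bit bus

print(f"shader/texture throughput: +{100 * (shader_gain - 1):.1f}%")
print(f"pixel fill rate:           +{100 * (fill_gain - 1):.1f}%")
print(f"memory bandwidth:          +{100 * (bw_gain - 1):.1f}%")
```

That works out to roughly +61% shader/texture throughput, +8% fill rate, and +9% bandwidth, which lines up with the point above: only the shader/texture side moved much.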



We will have to see if ATI's R600-to-R680 transition will be like NV40 to NV47/G70.

I still think it's R650 btw.
 
Err, didn't the PCIe variant not have the PureVideo problem? Wouldn't that indicate it was a different chip?

You mean the borked Windows Media Acceleration? That was NV40, but I have lost track of which models had the problem and which didn't. It seems like all spins of NV45 were working and NV40 ones were not, thus making it an AGP-only problem. (What a clever way to get you to buy a PCIe board, eh?)
 
Err, didn't the PCIe variant not have the PureVideo problem? Wouldn't that indicate it was a different chip?

Different revision for sure (even if only due to the extra chip on package).
PureVideo was announced before the PCIe version came out, so it's easy to infer that they did one.

But that does not invalidate the fact that it's still an NV40, or else they would have used a truly native chip instead of the bridge.
 
It's somewhere right in between R300-R350 and R520-R580.

R520 vs. R580

If you're comparing a game that relies more on texture units as opposed to pixel processors, then since the R520 and R580 both have 16 texture units, that game will run about the same on either.

R300 vs. R350

They are both essentially the same chip; the difference between them is higher memory clock speed and GPU frequency. Also, SmoothVision was updated from ver. 2.0 to 2.1 in the Catalyst drivers for the new R350 core hardware ("small hardware update")....
 
Err, didn't the PCIe variant not have the PureVideo problem? Wouldn't that indicate it was a different chip?

You might be thinking of the non-GT 6800, which was NV40 on AGP but NV41 and NV42 on PCIe.

There's also an NV48, which is still an NV40, right? What is it for? (I remember reading it was for the 6800 Ultra 512MB, but that seems stupid. BTW, did anyone really buy that 700-euro, late and useless card?)
 
There is absolutely no reason to have 2 gigs of memory on a gfx card at this time or in the next 2 years. Period. 1 gig, maybe, in a year. Very few games need more than 256MB to work well; 512MB is for extreme high resolutions, which most of us don't play at (probably less than 10% do), and you can make almost any game work well with 512MB. 2 gigs of textures is stupid, unneeded and unheard of...

Game designers build for what the average user has, which now and for the next year will be 256MB.
 
Game designers build for what the average user has, which now and for the next year will be 256MB.

Doom 3 Ultra Quality *cough*

If game designers built for what the average user has, everything would be super smooth on my 64MB shared-memory i915G.

Software sells hardware, NOT the other way around.
 
Software sells hardware, NOT the other way around.

It sells a small amount of hardware to a small number of people.


Code:
VRAM  (945,512 Users)
32 MB     21,868     2.31 %
64 MB     88,783     9.39 %
96 MB      4,959     0.52 %
128 MB   321,428    34.00 %
256 MB   396,133    41.90 %
512 MB    75,612     8.00 %
Other     36,729     3.88 %
From here.
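Those percentages are just each bucket's count over the 945,512-user total; a quick sketch to reproduce the table:

```python
# Recompute the survey shares from the raw counts in the table above
# (945,512 respondents in total).
counts = {
    "32 MB": 21_868, "64 MB": 88_783, "96 MB": 4_959,
    "128 MB": 321_428, "256 MB": 396_133, "512 MB": 75_612,
    "Other": 36_729,
}
total = sum(counts.values())  # 945,512
for vram, n in counts.items():
    print(f"{vram:>7}  {100 * n / total:5.2f} %")
```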

Doom 3 ultra-quality didn't represent the baseline hardware configuration for which the game was designed. If it had, they wouldn't have sold many copies. D3UQ is added bonus content for the small number of people whose cards can cope with it; perhaps it makes good benchmark fodder too, which keeps the game in the public eye for years to come.
 
It sells a small amount of hardware to a small number of people.
Like all the people looking for Vista-compliant PCs?

Steam survey

The Steam survey is a pile of steaming dog droppings. Are you sure GeForce 8800s outsell all i915GM or i945GM chips? (Hint: my company owns 2,500 Intel-graphics-based laptops.)

Code:
DX10 graphics card vs. something you wouldn't even advise your mother to buy
NVIDIA GeForce 8800	5,964
Mobile Intel 915GM/GMS, 910GML	5,323
Mobile Intel 945GM Express Chipset Family	4,984

People buy performance cards nowadays...
Two or three generations ago 512MB cards already sold like hotcakes (512MB 9600s); it's people still on GeForce 4s that mess up the statistics. They can't run DX9 games, but they can run a lot of Steam's catalogue, and they make others think we actually like to see our games optimized for 64 and 128MB of VRAM :(
 
The Steam survey is a pile of steaming dog droppings. Are you sure GeForce 8800s outsell all i915GM or i945GM chips? (Hint: my company owns 2,500 Intel-graphics-based laptops.)

I do agree that software sells hardware, but I'm really trying to understand why we should care about the configurations of corporate workstations. :???: I'm quite sure "average user" means average gamer, not average computer user.
 