The LAST R600 Rumours & Speculation Thread

There are two graphs in the link, but that's beside the point.

A very simple example: my UPS (1500VA) has a small front-panel LCD that shows, besides battery capacity, the power load the system draws from it. Imagine five fields to fill: with the former 7800GTX@490/685MHz it would fill one field in 2D and two fields at full load. Now with the 8800GTX@default it's constantly stuck at two fields without any fluctuation up or down.

It might sound a tad naive and simplistic overall, but it's an alternative to the 10W-up, 20W-down hairsplitting. As long as the thing is as silent as the G70 (in an already as-silent-as-possible system) I hardly have any reason to bother much about it. If someone interested in either R600 or G80 wants to sit all day measuring exactly how many watts each part burns, they're welcome to split hairs.

Yes, there is a second graph in the link, but I was only pointing out that the only graph shown in your post was the idle one (the one with less meaning for me), and someone could think it was the "right one" to show differences in maximum power draw. Anyway, I agree with you: anyone who has the money to buy a G80 or R600 (or even an SLI or CrossFire setup of these!) should also have the money for a highly rated power supply, a monstrous case and big fans (or even liquid cooling), and shouldn't care about a 10-20W difference when buying a 120+W card.
 
Yes, I'm talking in an application sense (so, yes, software devs can save the planet by using less I/O and more shaders! :p ).
 
Power is not quite as simple as that. I/O is actually one of the biggest causes of power consumption (both at the board level and the chip level): even with full utilisation of the shader core, power consumption can vary wildly depending on the amount of I/O going on to and from the chip. Massive shaders that keep the core nice and busy while cutting down I/O could actually reduce power draw relative to other scenarios.
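To make that concrete, here's a rough sketch (hypothetical CUDA kernels and names, nothing R600- or G80-specific): both kernels can keep the chip fully occupied, but the first is dominated by off-chip memory traffic while the second does almost all of its work in registers, so the I/O-related part of their power draw differs a lot.

```cuda
#include <cuda_runtime.h>

// Bandwidth-heavy kernel: every thread reads and writes global memory,
// so the memory interface and DRAM stay busy the whole time.
__global__ void streaming_copy(const float* in, float* out, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        out[i] = in[i] * 2.0f;
}

// ALU-heavy kernel: one load and one store, then a long run of
// register-only math -- the "massive shader that keeps the core busy
// while cutting down I/O" case.
__global__ void alu_heavy(float* data, int n, int iters)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    float x = data[i];
    for (int k = 0; k < iters; ++k)
        x = x * 1.000001f + 0.5f;   // stays in registers, no memory traffic
    data[i] = x;
}

int main()
{
    const int n = 1 << 24;
    float *in, *out;
    cudaMalloc(&in,  n * sizeof(float));
    cudaMalloc(&out, n * sizeof(float));

    dim3 block(256), grid((n + 255) / 256);
    streaming_copy<<<grid, block>>>(in, out, n);   // memory-bound: I/O dominates
    alu_heavy<<<grid, block>>>(out, n, 4096);      // compute-bound: little I/O
    cudaDeviceSynchronize();

    cudaFree(in);
    cudaFree(out);
    return 0;
}
```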

Slight difference: he said more power consumption. Agreed on the points above, but wouldn't the reduced overall driver overhead also play a role in not necessarily increasing consumption under D3D10?
 
Is there any chance of R600 being developed on 65nm and that that is the reason for all these delays? Could AMD keep it so secret that no one would know? Just a conspiracy theory, but that would be something, wouldn't it? Showing off 80nm just to fool everyone? :D
 
I don't see how ATI would have developed for an AMD process when it didn't know it would have access until recently, and that would be just about the only way it could be the case.

If ATI didn't originally target the 65nm process, then reworking the core and masks for it would take much longer than the delays we're seeing. And they certainly wouldn't design the same core for both 80nm and 65nm at the same time; if they had, neither would even be taped out right now.

It's also doubtful something like that could be kept secret, especially from board partners who would have samples. You can't lie to them and think they can just plunk a 65nm core into a package meant for a larger chip.
 
Right before a new DirectX revision, a company released a surprising, out-of-nowhere GPU that almost no one saw coming. They stuck with a tried-and-true process, but still managed to put out a huge chip running at very competitive clock speeds. Their competitor, on the other hand, decided that they needed a new process for their chip, which had around 20 million more transistors. Yet their design was months late, used exotic, never-before-seen cooling, and ran at previously unheard-of speeds.
 
Is there any chance of R600 being developed on 65nm and that that is the reason for all these delays? Could AMD keep it so secret that no one would know? Just a conspiracy theory, but that would be something, wouldn't it? Showing off 80nm just to fool everyone? :D
It's extremely unlikely that there's a high-speed version of the 65nm process that's ready enough for ATI's/AMD's development efforts. The high-speed variant usually comes some time after the standard version of a process.

http://www.edn.com/article/CA6336194.html?ref=nbsa

Interestingly, that roadmap indicates 45nm low-power comes ahead of 45nm general-purpose. There are also way more grades (G, GT, LP, LPG) than I was expecting :LOL:

Jawed
 
Fundamentally, I/O reads and non-destructive writes can be made to consume zero energy; it's erasing information that consumes energy. :) You just need to replace all your RAM with WOM (write-once memory; write once per boot would be good, if it could be bulk-erased efficiently) and run computations backwards when you need to garbage collect. :)
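For reference, the physics behind that joke is Landauer's principle: only irreversible (information-erasing) operations carry a fundamental energy cost. A back-of-the-envelope figure at room temperature:

```latex
E_{\text{erase}} \ge k_B T \ln 2
  \approx (1.38\times10^{-23}\,\mathrm{J/K})(300\,\mathrm{K})(0.693)
  \approx 2.9\times10^{-21}\,\mathrm{J\ per\ bit}
```

Real DRAM and GPU I/O sit many orders of magnitude above that bound, of course; the joke is about the theoretical floor, not anything achievable on a graphics card.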
 
Wouldn't it be logical to assume that AMD knew well ahead of time that they would have marketing issues when Intel released the Core chips? I'm sure they had a good idea whether the merger would go through or not, and even if it had been rejected, there wouldn't have been anything stopping ATI from inking a deal to create chips with/for AMD. If the deal hadn't gone through, AMD would still likely have tried to acquire rights or contract with ATI to get their stream processors designed anyway.

Besides, if they do want to integrate GPU/stream capability onto their CPUs, they might as well start fabbing the chips sooner rather than later so they have an idea of exactly what they can expect out of them. Secondly, just how much fab space would it take to produce R600s? Assume the mid/low-range chips are still on 80nm and only the high end went to 65nm.

Also, with that die shot that was released a while ago, was it just assumed that it was an 80nm part? If R600 is roughly the size of G80 and the transistor estimates were based on the die size and nothing else, the estimates could be way off the mark. People have been speculating about how R600 would use all that bandwidth. It appears the bandwidth should nearly double, but what if they went 65nm and nearly doubled the processing power as well?
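As a rough, idealized sanity check (assuming transistor density scales with the square of the feature-size ratio, which real processes only approximate):

```latex
\frac{\text{density}_{65\,\mathrm{nm}}}{\text{density}_{80\,\mathrm{nm}}}
\approx \left(\frac{80}{65}\right)^{2} \approx 1.5
```

So the same die area at 65nm could in principle hold roughly 1.5x the transistors an 80nm-based estimate would suggest, which is why a wrong process assumption throws the count estimates off so badly.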

I would imagine strong sales for ATI would have a coattail effect on their other businesses. It would make sense that a lot of people wanting to use ATI graphics would stick with "ATI/AMD"-based chipsets and CPUs. Knowing that their CPUs would likely have marketing problems, this seems like a good way to try to sell them. Besides, if they did go 512-bit and 65nm, they could have had a good idea that Nvidia wouldn't be able to come close in performance until they moved to a smaller process.
 
http://www.anandtech.com/tradeshows/showdoc.aspx?i=2905

One of the more interesting developments we did notice was on the power supply side where the majority of manufacturers were showing the new 8-pin PCI Express 2.0 cables. The majority of these cables featured a 6+2 pin arrangement that allows them to be used with current graphic cards or upcoming products. The first product to utilize this new power standard will be the AMD R600 that will require both an 8-pin and 6-pin PCI Express connector to power the card. We asked about potential cable adapters for current power supplies and were told by most of the suppliers that they did not know if an adapter would work or not until they received final specifications from AMD. Enermax will have an 8-pin cable available for their current Liberty series that will allow the use of a single R600 or future NVIDIA graphics card. Pricing was not set but is expected to be below $10.
 
Well, "we don't know yet" pretty much means what it says. Let's not borrow trouble just yet on that front. It would be out of character for the history of such things for it to be the case that an adapter would not be available.
 
Awesome. The more cables dangling from my card juicing it up, the better. I just hope the final version still has that massive handle/bracket like in that VRZone pic, that looks neat. :LOL:
 
Alright then. So if that's that, R600 consumes >225W. Agreed? If it drew less than that, one 8-pin or two 6-pin connectors (like the 8800GTX) would be sufficient, since either adds 150W to the 75W from the slot. You can argue with me about the exact power consumption of either card, but the same logic holds for G80: if it were built to consume at most 150W, it would only have one 6-pin connector (75+75). Since it has two, we assume >150W and <225W, which is true (total board power under load falls between those numbers). Is it not then safe to assume that R600 is >225W and <300W, given the 8-pin (150W), 6-pin (75W) and slot (75W)?
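Spelling out the connector budgets that reasoning rests on (75W from the PCIe slot, 75W per 6-pin, 150W per 8-pin):

```latex
\begin{aligned}
\text{slot} + 2\times\text{6-pin} &= 75 + 75 + 75 = 225\,\mathrm{W}\\
\text{slot} + \text{6-pin} + \text{8-pin} &= 75 + 75 + 150 = 300\,\mathrm{W}
\end{aligned}
```

So a board that could live within 225W wouldn't need the 8-pin at all; requiring it implies a specified ceiling somewhere above 225W and at or below 300W (a ceiling, mind you, not a measured draw).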

No, I'm not talking total system consumption. ;)
 