AMD: R9xx Speculation

Changing clock speeds is not something you do on a whim, nor is it something that can necessarily be done quickly. Depending on where you are in a qualification cycle, changing clocks will have major ramifications that can result in potentially months of schedule alteration.

Can you let us know what the cause of the 6900 series delay is?
 
But wouldn't you change clock speeds if you knew the cards could do better than what you reckon is safe?

i.e. according to rumours the XT will be clocked up to 900, but say it were to be released at 825 to keep it safe, and then after the 580's release you decide that 875 or 900 would be better.

Hell, you might actually have a demon chip on your hands and decide to downclock it, so as to make overclockers happy that they can get even more performance out of the chip.

haha, that's all silly though. I'm pretty sure you guys know what you are doing.
 
Can you let us know what the cause of the 6900 series delay is?

You missed this?

"Please find an announcement from AMD below regarding the change of Cayman NDA:

Demand for the ATI Radeon HD 5800 series continues to be very strong, the ATI Radeon HD 5970 remains the fastest graphics card in the world and the newest members of the AMD graphics family, the AMD Radeon HD 6850 and HD 6870, have set new standards for performance at their respective price points, and are available in volume.

With that in mind, we are going to take a bit more time before shipping the AMD Radeon HD 6900 series. As of today, the NDA lift for information relating to the AMD Radeon HD 6950 and HD 6970 will be week 50. We will be providing additional information on these products, including the exact date and time of the NDA lift, in the weeks prior to launch."
 
I find it hard to believe AMD was surprised by GF110 in any way that would compel them to delay the launch. Most likely scenario is the TI shortage.
 
Yep, Kaotik did mention that.

We know it's VLIW4

In other news, VR-Zone says the cards are indeed delayed, and according to Sampsa, one of his non-AMD sources says the same.
http://vr-zone.com/articles/amd-rad...age-launch-date-to-be-re-confirmed/10276.html

However, the issue is not yields or anything like that, but simply too low availability of one specific driver MOSFET from Texas Instruments, which is "so new there's no info about it available, not even from TI".
The HD 6800 uses the same part, and apparently there just aren't enough of them at the moment.
 
I find it hard to believe AMD was surprised by GF110 in any way that would compel them to delay the launch. Most likely scenario is the TI shortage.

I'd bet on a manufacturing or driver issue. I don't buy the official reason, and I fear that at some level AMD faked the sample arrangements with the press earlier this week to counter the first delay rumors, so that while publishing the GTX 580 review we'd still think the launch would go as planned. That would be a card you play only once, of course.
 
So when are we kidnapping Wavey and torturing some good intel out of him?
There goes his Prius now!

:)
 
I bet it's not a GPU manufacturing problem in a 'needs a respin' sense. Not with so much experience with 40nm. Driver issues also seem fishy to me. Drivers delaying a launch, srsly? The TI MOSFET sounds weird, but I'd believe that sooner than the first two options.

Could also be a TSMC capacity problem. Maybe they decided to allocate capacity to the current line of products for now. Let's not forget Zacate and Ontario.

From Dave's hint we can safely say it's not due to a clock change ;-)
 
I bet it's not a GPU manufacturing problem in a 'needs a respin' sense.

No amount of experience with a given manufacturing process will keep you safe from damning bugs. That said, it's probably the sort of thing that they would have caught much earlier, early enough not to have this little SNAFU… Unless of course it was intentional.
 
I find it hard to believe AMD was surprised by GF110 in any way that would compel them to delay the launch. Most likely scenario is the TI shortage.
Why not? If Charlie's supposed deep-inside-nvidia source had no clue about the GTX580 then AMD could have been caught blind.

It's now looking more like Charlie's deep-inside-nvidia-source is either himself or AMD thus the blind leading the blind.
 
Why not? If Charlie's supposed deep-inside-nvidia source had no clue about the GTX580 then AMD could have been caught blind.

It's now looking more like Charlie's deep-inside-nvidia-source is either himself or AMD thus the blind leading the blind.

Nvidia's specs for the original 512 SP beast w/ similar clocks were talked about a year ago

You don't think AMD knew or had a pretty good idea what the GF100 was supposed to be?

Unlike Nvidia, AMD already knew what the competitor's targets were in the past... why do people think AMD would aim for their second-gen DX11 card to be worse than where their competitor was supposed to be a year earlier?
 
I am sure the delay has something to do with the GTX 580 launch. Maybe AMD didn't expect the GTX 580 to be available at launch and hence thought that launching Cayman early, even without sufficient stock, would be alright. Or maybe it didn't expect the GTX 580 to be 15% faster than the GTX 480.

Or maybe they thought the GTX 580 would completely obliterate the HD 5970 and Cayman would be required to counter it. And now that it hasn't turned out that way, they believe it's better to cut prices and let the HD 59xx inventory get cleared before unleashing Cayman.

For the last couple of years, AMD has been flawless with GPU launches. It doesn't seem likely that they slipped this time due to driver glitches, TI problems or TSMC yield issues, and the timing is quite crucial, because for the first time in a year Nvidia has a compelling product that is better than the HD 5870 (the GTX 480 was too loud and hot) and the HD 5970.
 
Why not? If Charlie's supposed deep-inside-nvidia source had no clue about the GTX580 then AMD could have been caught blind.

It's now looking more like Charlie's deep-inside-nvidia-source is either himself or AMD thus the blind leading the blind.

AMD surprised by a 20% performance bump? Maybe positively surprised, yes. I think AMD expected GTX 580-level performance a year ago. At least that's how they reacted when they finally saw the GTX 480 reviews. A complete lack of being impressed, to put it mildly.

'We are even more confident now.' Their words on the CC.

The only thing from nV that caught them off guard, IMHO, is the GTX 460.
 
Or maybe they thought the GTX 580 would completely obliterate the HD 5970 and Cayman would be required to counter it. And now that it hasn't turned out that way, they believe it's better to cut prices and let the HD 59xx inventory get cleared before unleashing Cayman.
My money's on this.
 
So when are we kidnapping Wavey and torturing some good intel out of him?
There goes his Prius now!

:)

I'm with Sodium Pentothal. It doesn't leave any marks, so Dave can go into work acting like nothing has happened whilst we make off with all the goods on not only this generation but perhaps the next generation of AMD graphical goodness.

He deserves it IMO for launching a massive counter-espionage action against us. He started it! :p
 
Nvidia's specs for the original 512 SP beast w/ similar clocks were talked about a year ago

You don't think AMD knew or had a pretty good idea what the GF100 was supposed to be?
I wasn't talking about the specs of the GTX 580, but about the fact that it was actually a real part that would be hard-launched in early November.

That was a complete surprise...

See Charlie's rant here to see how he and AMD were both caught off guard.

http://www.semiaccurate.com/2010/10/18/nvidia-gtx580-paper-launch-next-month
 
AMD might've been unsure when it would drop, but I doubt they were caught completely by surprise. However, seeing as how Charlie proclaimed doom and gloom a year ago, I don't put much stock in him actually knowing a whole lot.
 
I think that Barts contains 1280 SPs.
Yes I know very well that this is a "high level" overview but it seems to work. ;)

Although I'm not that good at math, you're right there, and it isn't just a high-level overview. But then EG_Redwood and EG_Juniper have only one setup engine overseeing their work, while EG_Cypress and NI_Barts have two, so to me that's the more proper way of looking at what has been cut down.

Barts is just a cut-down EG_Cypress part, with no "new NI shaders" (which would take more die space per SP); all the tessellation improvements in Barts come plainly from its higher clock, so there's no real improvement there (or need for die area), and with all the stuff they cut off (320 SPs, DP support, the 2nd CFX and a reduced MC) ... it simply should be 1280 SPs :oops:

Question is: does using a power-of-two SIMD count use that much more die space than a power-of-two-minus-one count?
(...)
Is it better to have coarse grained redundancy with a merely bigger die, fine grained redundancy or no redundancy at all?

I agree with you there. Redundancy, if it translates into merely 15-20 mm² (a 7-9% bigger die), is a nifty feature for the GPU business.
And who claimed that 1280 SPs of "new shaders" require 280 mm²?
Let's hope that nVidia will soon introduce the speculated GTX 475 cards on fully working GF104 chips, and that AMD will be forced to abandon the 6850 with its laser-cut 160 SPs in favor of a fully working HD 6870.
What makes this chip so large?! It doesn't even support FP64, nor does it even have 1280 SPs, and supposedly it sports a somewhat smaller Redwood-style MC (x2 to become 256-bit; not supporting differential GDDR5 signaling makes it 50% smaller than the one in Cypress/Juniper).

So what's the legend behind it? It's not as if just one more tessellation dispatcher, aka the RPE (what does that stand for?), needs to consume enormous die space.


but it seems nobody wishes to confirm my thoughts

Large, considering only 70% of the engines are active instead of 80% (ed: compared to Cypress)


As it now seems, AMD doesn't need to answer a fully working GF104 chip either, since it is still non-existent as a real product (ex. GTX 475 -> now GTX 560?).
And since the only reason for redundancy in Barts would be to address some mythical future nVidia product, and there's no requirement for it, we probably won't see that "fully enabled Barts" either. But who knows, maybe we'll see an HD 6890 after all, and that would be great :LOL:
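
For what it's worth, here's a rough back-of-envelope of the SP/area argument above, as a small Python sketch. The SP counts (1120/1280/1600, 80 per SIMD) are the ones already discussed in the thread; the per-SIMD area and die size are my own placeholder guesses, not anything from AMD:

# Back-of-envelope check of the SP/area numbers discussed above (plain Python).
# SP counts come from the thread; per-SIMD area and die size are assumptions.

SP_PER_SIMD = 80               # Cypress/Barts SIMD = 16 lanes x VLIW5

cypress_simds = 20             # 1600 SPs
barts_xt_simds = 14            # HD 6870 as shipped, 1120 SPs
full_barts_simds = 16          # the hypothetical "should be 1280 SPs" case

print("Cypress SPs:           ", cypress_simds * SP_PER_SIMD)     # 1600
print("Barts XT SPs:          ", barts_xt_simds * SP_PER_SIMD)    # 1120
print("Hypothetical Barts SPs:", full_barts_simds * SP_PER_SIMD)  # 1280

# The "70% instead of 80% active engines" comparison vs Cypress
print("Active SIMDs vs Cypress, shipping: %.0f%%" % (100.0 * barts_xt_simds / cypress_simds))
print("Active SIMDs vs Cypress, 16 SIMDs: %.0f%%" % (100.0 * full_barts_simds / cypress_simds))

# What two spare (redundant) SIMDs might cost in area -- assumed figures only
area_per_simd_mm2 = 9.0        # assumption, not a measured number
assumed_die_mm2 = 230.0        # assumption for the denominator
spare_simds = full_barts_simds - barts_xt_simds
extra_mm2 = spare_simds * area_per_simd_mm2
print("Extra area for %d spare SIMDs: %.0f mm^2 (~%.1f%% of an assumed %.0f mm^2 die)"
      % (spare_simds, extra_mm2, 100.0 * extra_mm2 / assumed_die_mm2, assumed_die_mm2))

With those made-up per-SIMD figures, two disabled SIMDs land right in the 15-20 mm² / 7-9% range quoted earlier, so the idea isn't crazy on area grounds; whether the spare SIMDs physically exist on Barts is of course the open question.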


I'd bet on a manufacturing or driver issue.

My bet is on AMD clearing out the piled-up HD 5800 cards from clogged warehouse inventory and further down the retail stream before the newcomers arrive. It's not like they're really afraid the vaporware GTX 580 will spoil their sales ;) And Q3 showed some slowdown in sales of the ATi GPU line, so it's the last moment to sell them at discount prices, and still for no less than what the HD 6800 cards go for.
 