ATI at E3

webmedic said:
overclocked_enthusiasm said:
webmedic said:
Wonder if the PE is that 600 MHz version that was rumored but most thought wasn't true. :rolleyes: :oops:

Core or memory?

The rumor was for a 600 MHz core, and others said they may use 833 MHz memory. I'm not guessing about either, as I don't want to be wrong, and I haven't heard anything at all about the PE version of the card.

Wasn't the 833 MHz rumor from Micron discounted? I was under the impression that only Samsung had GDDR3 available at this time, and most/all of it was 500 MHz. Isn't a 600 MHz core the theoretical maximum for low-k .13 micron? What is the theoretical maximum for .11 micron?
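
A note on the numbers being thrown around in this thread: GDDR3 is double-data-rate, so the same part can be quoted by its physical clock or by its doubled effective transfer rate. A minimal sketch of the convention, in Python; reading the 500 MHz figure above as a physical clock is an assumption, not something stated in the thread:

# GDDR3 is double-data-rate: two transfers per clock cycle. So a part
# specced at a 500 MHz physical clock moves data at an effective
# 1000 MHz. Which convention a rumored figure like "833" uses is
# anyone's guess.
def effective_mhz(physical_mhz):
    return physical_mhz * 2  # data on both clock edges

print(effective_mhz(500))  # 1000 -- the Samsung parts mentioned above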
 
I don't think there's any way they'll have enough 833 MHz GDDR3 to make launch, not by a long shot.

If that is to happen, I think it will be after nVidia launches their real Ultra. ;)

Last stats I heard were 525/1120 or so for the XT, but don't go telling anyone 'til Monday please. :)
 
Um, no. Micron invented GDDR3 and is supposedly making as much 833 as they can. Samsung is way behind the curve in GDDR3, and the rumor is that the only reason they have it at all is that they swiped the plans for it from Micron. Micron has tightened up security a whole bunch and is very unhappy about it all.
 
overclocked_enthusiasm said:
Wasn't the 833 MHz rumor from Micron discounted? I was under the impression that only Samsung had GDDR3 available at this time, and most/all of it was 500 MHz. Isn't a 600 MHz core the theoretical maximum for low-k .13 micron? What is the theoretical maximum for .11 micron?

Natoma said:
Remember the day when we thought it'd be nuts that the 9700 Pro was running 300+ MHz core clocks at 0.15 micron? :)

I wouldn't put any theoretical maximum limits on ATI. They've shown us over the years that they can be amazingly adept at squeezing literally everything out of a process before moving on to the next one.

That's the main reason I told one of my friends a couple of years ago that, going forward, the company most poised to do well was ATI: they weren't dependent on the latest technologies and processes to fuel their bigger and better GPUs the way Nvidia was. With the slowing pace of process improvement and the maturation of the industry, efficiency becomes king rather than brute force. ATI has a huge leg up in this regard because they spent years in the mobile sector optimizing for power consumption rather than speed, while Nvidia was doing the opposite in the desktop sector.

Now we see the fruits of those early labors.
 
webmedic said:
Um, no. Micron invented GDDR3 and is supposedly making as much 833 as they can. Samsung is way behind the curve in GDDR3, and the rumor is that the only reason they have it at all is that they swiped the plans for it from Micron. Micron has tightened up security a whole bunch and is very unhappy about it all.

The Micron website said that GDDR3 was not available yet. There was also an article (don't remember where) that said only Samsung GDDR3 would be used in this first round. The 6800 Ultra certainly uses Samsung and we will have to see about ATI. I am aware that ATI helped develop the standard with Micron.
 
overclocked_enthusiasm said:
webmedic said:
Um, no. Micron invented GDDR3 and is supposedly making as much 833 as they can. Samsung is way behind the curve in GDDR3, and the rumor is that the only reason they have it at all is that they swiped the plans for it from Micron. Micron has tightened up security a whole bunch and is very unhappy about it all.

The Micron website said that GDDR3 was not available yet. There was also an article (don't remember where) that said only Samsung GDDR3 would be used in this first round. The 6800 Ultra certainly uses Samsung and we will have to see about ATI. I am aware that ATI helped develop the standard with Micron.

Not being made != not available. Of course it's not going to be available if somebody is buying all you can make and you don't have any more to sell to other parties. Mind you, I don't know either way.
 
digitalwanderer said:
I don't think there's any way they'll have enough 833 MHz GDDR3 to make launch, not by a long shot.

If that is to happen, I think it will be after nVidia launches their real Ultra. ;)

Last stats I heard were 525/1120 or so for the XT, but don't go telling anyone 'til Monday please. :)

I LOVE that extra 20 MHz over the Ultra memory clock. From all indications, ATI is not going to play second fiddle to NVDA in the clock department this round. Higher core/memory clocks on all products is my guess for ATI.
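
For the curious, that extra 20 MHz works out to well under 1 GB/s of additional peak bandwidth. A quick back-of-the-envelope sketch; the effective clocks are the ones quoted in this thread, but the 256-bit bus width for both cards is an assumption, not something stated here:

# Peak memory bandwidth = effective transfer rate * bus width in bytes.
# Assumed: both high-end parts use a 256-bit memory bus.
def peak_bandwidth_gb_s(effective_mhz, bus_width_bits=256):
    return effective_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print(peak_bandwidth_gb_s(1120))  # ~35.84 GB/s, rumored XT
print(peak_bandwidth_gb_s(1100))  # ~35.20 GB/s, 6800 Ultra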
 
thop said:
I guess the card will score pretty well in 3DMark03, so I guess the PE refers to Penis Extender :oops:

That's a bit cliché, don't you think? Personally, I think ATi will go with something more original, like the phrase I prefer: Pecker Extrapolator.

:D
 
webmedic said:
Um, no. Micron invented GDDR3 and is supposedly making as much 833 as they can. Samsung is way behind the curve in GDDR3, and the rumor is that the only reason they have it at all is that they swiped the plans for it from Micron. Micron has tightened up security a whole bunch and is very unhappy about it all.

There is no 833 MHz GDDR3 officially; there aren't even specs for it.
You can forget the thought that 800 MHz GDDR3 will be available before Q3/04. Until you hear something else "officially", I wouldn't bet on it.

It seems very likely for a refresh, though, and Samsung currently has plans to go higher than that, but that will be available even later.
 
Eolirin said:
What if it's the R500? I mean, unlikely, but...

No way, no way in hell! :oops:

I think the R500 will come out this year, but BARELY this year, like around Xmas time... and people think I'm being a wide-eyed optimist in that estimate!

No way the R500 is their E3 surprise, just NO WAY!
 
Natoma said:
I wouldn't put any theoretical maximum limits on ATI. They've shown us over the years that they can be amazingly adept at squeezing literally everything out of a process before moving on to the next one.

We're only talking about one series of products though (9500-9800). You could just as well say that Nvidia has shown over the years that they are quick to adopt new processes and do it successfully. But wait, then the NV3X was released, and we all know how that turned out. That doesn't mean I think ATI will "fail", but I wouldn't put too much faith in the past. We've learned that much from the NV3X.
 
webmedic said:
Um, no. Micron invented GDDR3 and is supposedly making as much 833 as they can. Samsung is way behind the curve in GDDR3, and the rumor is that the only reason they have it at all is that they swiped the plans for it from Micron. Micron has tightened up security a whole bunch and is very unhappy about it all.

Oh dear. The specs were open from day one; all JEDEC members have access. Micron is one such member, as are a bunch of other DRAM makers.

Rys
 
Bjorn said:
Natoma said:
I wouldn't put any theoretical maximum limits on ATI. They've shown us over the years that they can be amazingly adept at squeezing literally everything out of a process before moving on to the next one.

We're only talking about one series of products though (9500-9800). You could just as well say that Nvidia has shown over the years that they are quick to adopt new processes and do it successfully. But wait, then the NV3X was released, and we all know how that turned out. That doesn't mean I think ATI will "fail", but I wouldn't put too much faith in the past. We've learned that much from the NV3X.

Not necessarily so, Bjorn. ATI has been adept at making low-power, feature-laden parts since the Rage days. What has mainly been the downfall of ATI in every generation until the 9700 Pro? Drivers.

Nvidia has actually had many problems with process in the past. Look at the TNT debacle, for instance. Nvidia has always been a slave to the latest and greatest process for each of their products. It bit them in the ass with the TNT, and again with the NV30. So in each case there is a history of ATI squeezing everything out of a process, because they start off with a philosophy of "Keep this within a certain power limit" (usually because they've always targeted the OEM market), and of Nvidia saying "Dump everything possible into the design and we'll scale back from there."

It's two different philosophies that have played out over the years. It's just that ATI's philosophy is winning out now because of the maturation of the industry in both process and execution.
 
Natoma said:
Not necessarily so, Bjorn. ATI has been adept at making low-power, feature-laden parts since the Rage days. What has mainly been the downfall of ATI in every generation until the 9700 Pro? Drivers.

The Rage series wasn't really anything to brag about, if I remember correctly. The Radeon 8500 lacked MSAA, and the 7500 was rather late to the market with regard to its features. And neither of them was the GeForce killer it was supposed to be (8500 vs GF3 and 7500 vs GF2). Sure, the drivers at the time might have been bad, but that was hardly their only problem. And sometimes bad drivers are connected to "twitchy" hardware :)
 
The Rage Fury was actually a pretty decent card. It ran 32-bit games a little better than the TNT it was competing against at the time. My coworker had a Rage Fury, I had a TNT, and the rest of them had Voodoo 2 SLI. :LOL:

Performance-wise, no, the 8500 wasn't a GeForce killer. In fact, I had the GF2 GTS until I upgraded to my 9800 Pro. But it did have more features than the GF3/GF4 it was competing against. I don't remember the 7500 all that well.

Frankly, the only reason I didn't get the 8500 was the drivers. I didn't trust ATI to make good drivers, even though the image quality was, IMO, far superior to that of the Nvidia cards at the time. And I stayed away from the 9700 Pro until it was proven that ATI knew what they were doing in the driver department. Hence the reason I got the 9800 Pro. :)
 