AMD: R8xx Speculation

How soon will Nvidia respond with GT300 to the upcoming ATI RV870 GPU lineup?

  • Within 1 or 2 weeks

    Votes: 1 0.6%
  • Within a month

    Votes: 5 3.2%
  • Within a couple of months

    Votes: 28 18.1%
  • Very late this year

    Votes: 52 33.5%
  • Not until next year

    Votes: 69 44.5%

  • Total voters: 155
  • Poll closed.
There is no indication that 40nm at TSMC had problems reaching clock speeds; it seems to have had problems producing usable chips. So smaller and faster might be more efficient.

Just look at the 4770/4750: the GPU clock seems to be no problem at all, as there is no low-clocked version out there.
 
$399 for the 5870? Not so nice. I thought it was supposed to be $299.

They'll sell, but at $399, not like hotcakes.

If these prices are real, though, it also sets some very high performance expectations. Even with DX11, you would think the 5850 would have to be well faster than the 4890 to command $299, since the 4890 can often be had at $170 after rebates. Well faster than the 4890 is screaming.
 
Maybe they want to book profits until NV can come up with something. Plus, why cannibalize the 4870 and its siblings? If this is real and NV is delayed to 2010, AMD is gonna have some real manna on their hands.
 
There is no indication that 40nm at TSMC had problems reaching clock speeds; it seems to have had problems producing usable chips. So smaller and faster might be more efficient.

Just look at the 4770/4750: the GPU clock seems to be no problem at all, as there is no low-clocked version out there.

Hmmm... I thought it was mainly due to leakage.
They were having a hard time hitting the needed clock speeds within the voltage/TDP ranges.

Previous rumors, heard after TSMC supposedly fixed the 40nm process and RV870 was getting "good" yields, were that they were seeing around a 30% improvement in clocks for a given power.
 
I personally wouldn't bet on anything just yet. Besides, where's the major theoretical difference in terms of yields between, say, a part with 1280 SPs @ 900 MHz and a part with 1600 SPs @ 700 MHz? What tells me that lower complexity cannot, in theory, be traded for higher frequencies, or vice versa?

As long as we don't know how much additional die area the DX11 requirements account for, or how many TMUs, ROPs, etc. there are, the guessing game can go on forever.

Let's just take the TMU part of the speculative math: you get roughly similar texel fillrates whether you have 80 TMUs @ 700 MHz or 64 TMUs @ 900 MHz (see the quick check after this post). The first scenario, though, could mean that the TMUs capture ~20% more die area. Now wouldn't those TMUs also need some additional logic for some of the DX11 requirements?

Frankly, I don't know how you guys work out those equations like it's a simple 1+1=2, but for me, personally, for the time being it's adequate to know that Cypress has a maximum theoretical arithmetic throughput of over 2 TFLOPs.
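For what it's worth, here's a quick sanity check of that fillrate comparison: a minimal sketch assuming texel fillrate is simply TMUs × core clock, with the TMU counts and clocks being the speculated figures from above, not confirmed specs.

```python
# A quick sanity check of the texel fillrate comparison above.
# Assumption (speculation, not a confirmed spec): fillrate = TMUs * core clock.
def gtexels_per_s(tmus: int, clock_mhz: int) -> float:
    return tmus * clock_mhz / 1000.0  # MTexels/s -> GTexels/s

print(gtexels_per_s(80, 700))   # 56.0 GTexels/s
print(gtexels_per_s(64, 900))   # 57.6 GTexels/s -- roughly similar, as said above
```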

Look, I'm not sure about anything, of course. But history says that it's normally harder to reach high clock speeds on a new process, and in the past, new GPUs on a new process were not clocked much higher than slightly older GPUs made on an older process. RV790 was a singularity, as it was specifically targeted for high clocks, but that was realized on a process that had been proven for more than a year, after some refinement. If the initial rumored 40nm problems were not a fairy tale, most of the effort should have been spent on the yield issue rather than on getting the best obtainable clock speed. Moreover, power consumption usually rises with clock speed (as higher clocks often need higher voltage), so a bigger chip at a lower clock speed could sometimes be easier to obtain. If the hints about Cypress's die size are right, DX11, RBE, and texturing improvements are unlikely to be responsible for such a big increase in size, considering the scaling.
Of course there are many "if"s, but this is the "AMD: R8xx Speculation" thread and we should reason about what is rumored to be true.
If we want to reason about the "2 teraflops" wording (but again, the sentence said "more than 2 teraflops"), we can see that 2 teraflops could be reached by 800 SPs @ 1.25 GHz, 1200 SPs @ 850 MHz, 1600 SPs @ 625 MHz, 2000 SPs @ 500 MHz, or 2400 SPs @ 417 MHz. This assumes the architecture is quite similar to RV770, of course.
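As a footnote, all of those combinations fall out of the same simple relation, FLOPs = SPs × 2 × clock (one MADD, i.e. 2 FLOPs, per SP per clock, assuming the RV770-style ALU mentioned above). A purely illustrative sketch that reproduces the list:

```python
# Clock needed to hit 2 TFLOPs at various SP counts, assuming an
# RV770-style ALU where each SP issues one MADD (= 2 FLOPs) per clock.
TARGET_FLOPS = 2.0e12  # 2 TFLOPs

for sps in (800, 1200, 1600, 2000, 2400):
    clock_mhz = TARGET_FLOPS / (sps * 2) / 1e6
    print(f"{sps} SPs -> ~{clock_mhz:.0f} MHz")
# 800 -> 1250, 1200 -> ~833, 1600 -> 625, 2000 -> 500, 2400 -> ~417 MHz
```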


There is no indication that 40nm at TSMC had problems reaching clock speeds; it seems to have had problems producing usable chips. So smaller and faster might be more efficient.

Just look at the 4770/4750: the GPU clock seems to be no problem at all, as there is no low-clocked version out there.

The 4770 is 750 MHz (same as RV770) and does not seem to overclock to 900+ MHz all that easily.
 
Maybe they want to book profits until NV can come up with something. Plus, why cannibalize the 4870 and its siblings? If this is real and NV is delayed to 2010, AMD is gonna have some real manna on their hands.

Not to mention it's very possible that they won't meet demand at $299 (HD 4770 availability is still low even after all this time). With everybody chomping at the bit for shiny new GPUs, AMD will still sell all the Evergreens they can make, even at $399. Nvidia being MIA is icing on the cake and makes it all the more worthwhile.
 
Not to mention it's very possible that they won't meet demand at $299 (HD 4770 availability is still low even after all this time). With everybody chomping at the bit for shiny new GPUs, AMD will still sell all the Evergreens they can make, even at $399. Nvidia being MIA is icing on the cake and makes it all the more worthwhile.

And by the looks of it, they'll still have time to drop prices before NV gets their DX11 stuff out. Unless NV has us horribly misled...
 
The availability of the 4770 has increased quite a bit lately. There are six different brands in stock at Newegg, and four brands in stock at Mwave.

Sure, it's improved a bit, but most are on allocation (a limit of 1 or 2 per customer). So it's definitely not the status quo for cards in that price bracket - there's usually plentiful supply to both OEMs and retail.

And by the looks of it, they'll still have time to drop prices before NV gets their DX11 stuff out. Unless NV has us horribly misled...

Yep exactly. Sell all you can make for the highest price that you can get until there's competition and then adjust to suit.
 
$299 confirmed? I doubt PowerColor would throw in anything but a top-of-the-line card as the big top prize.
[Image: 20090902pc.jpg]


I couldn't find the specific image on PowerColor's site though, so it might have been removed as an unintentional leak:
http://www.powercolor.com/Global/NewsInfo.asp?id=701
 
Ever since GF4MX/GF4 ;) since the GF2MX had problems even with TV-out, which needed an extra TV-out chip, just like the GF3s.
No, GF2MX indeed had two display controllers. In fact, some quick googling says it even had two integrated TMDS transmitters (though, just like R200, only one DAC). I dunno what problems it had with some of its outputs, like TV-out (well, TV-out is a bug in itself), and there certainly could have been bugs in the implementation, but it definitely had two independent display controllers.
But anyway, I really think 3 display controllers is a nice feature. I'm really wondering which card will have it in the end, and whether Nvidia will follow.
 
How many RV740-based salvage parts have gone on sale?

Jawed

:LOL: Every salvage part is sold as a fully fledged HD 4770, but we all know that ain't the truth. The HD 4770 is just a first kitty, and they could have sold out everything if they hadn't failed so badly in deployment. Around here, HD 4770s are priced at least 15% above the 4850 they're supposed to substitute for as the cheaper mainstream card :LOL: And not with that fancy review cooler, either, but with some poor aluminum crap worse than fully aluminum C2D slim parts. And the price is over $110, not below $100 as it's supposed to be given this poor equipment, the cheapest coils, and half of the intended capacitors installed on the board.

It's a weird way for greedy retailers to make money. And there are too many street stories about how much better RV740 is than RV770 (in terms of TDP), making retailers put an even bigger premium on an already overpriced part, which makes a great ROI for ATI. I somehow doubt even the HD 4870 at launch, with its shortages, made such a 100%+ ROI for ATI. But that's the reason the shelves are now so full of them, like the overpriced X1600 XT parts that at the time cost about as much as an X1900 GT.
 
^ 'Get the 8 Experience' now? Where does that come from? :)

@Kaotik: How good are you with Photoshop? :)

Good enough, but I picked that up from Finland's biggest hardware site (Muropaketti.com; I'm sure you're familiar with the name Sampsa Kurri ;) I'm a moderator on Muropaketti's forum).
 
No, GF2MX indeed had two display controllers. In fact, some quick googling says it even had two integrated TMDS transmitters (though, just like R200, only one DAC). I dunno what problems it had with some of its outputs, like TV-out (well, TV-out is a bug in itself), and there certainly could have been bugs in the implementation, but it definitely had two independent display controllers.
But anyway, I really think 3 display controllers is a nice feature. I'm really wondering which card will have it in the end, and whether Nvidia will follow.

Is this some kind of post-count thing? :LOL:

Every Radeon since RV100 had TWO DACs; at first they were only 350 MHz, and from RV250/R300 onwards they became 400 MHz. And the GF2s were obsolete parts that in those days were good enough for gamers even though they had just one D-sub output. Only Matrox's G400/G450 had dual-head displays before RV100. What googling says, I don't even want to imagine. They created that weird MX4000, which came out when the FX5200 failed on quality, and what series that was I really don't know, but I know many cross-reference the GF4MX as "GF MX", so they're all supposed to be GF2s, as they indeed were, just on a different process and with different "features" added to these new so-called GF4MX revisions to muddle the market.
TV-out on all cards before the GF4 Ti was total crap, including on the GF4MX, which was just a polished-up GF2MX with speed bumps and DDR support (the main difference at the time), since the GF420MX was nothing more than a rebranded GF400MX with SDR only.

As for Nvidia following, I certainly hope for it, but as time passes I'll stay disappointed, as I have been ever since the 7900 GT and Nvidia. As I said back when the GF5 FX was current: they had the great GF256, then the GF4 Ti, and they'd probably have a great GF7 (which happened)... follow the wabbit. But now I'm totally confused by the lack of morale in Nvidia's lines. What do they expect, that Intel will buy their shares so they can settle into billionaire pensions? IMNSHO they suck too much, as I've said ever since G70: overpriced chips and no real mainstream part until a year after launch, or maybe never in G200's case.
 
No. (I know you admit your deviant definition later...)

RV740's better power/temps now seem to be in the same sense that RV770 also has better power/temps now than the initial chips did 14 months ago. That stuff happens simply because of node maturity.

C'mon, Jawed. I take my hat off to turtle, who makes exactly my point, but much more eloquently. And c'mon (x2) about node maturity. Maybe if you mean CHIP maturity. At the time of RV770's deployment, 55nm was a nine-month-old process; all they could have done was better arrange the problematic parts of RV770, but they didn't do that, they just got rid of the crappy first revisions while the hype lasted, and then sold more reliable parts later (real maturity only arrived as RV790). Just like at the time of the X1950 Pro, whose brother, the X1950 GT, had maybe even better chips than the X1950 Pro had initially; or the better stock of badly reputed R600 chips that were sold off afterwards as deeply-below-price crappy HD 2900 GT parts with only 240 SPs and HUGE OC POTENTIAL, even over 800 MHz without any voltmods.
 
Every Radeon since RV100 had TWO DACs; at first they were only 350 MHz, and from RV250/R300 onwards they became 400 MHz.
Nope, R200 only had one (internal) DAC. IIRC you could in fact get cards which had both a VGA and a DVI-I connector but no external DAC, hence DVI-I-to-VGA adapters didn't actually work. That lone DAC was already 400 MHz, though :)

And the GF2s were obsolete parts that in those days were good enough for gamers even though they had just one D-sub output. Only Matrox's G400/G450 had dual-head displays before RV100.
Hmm, in fact I think the GF2MX was released a bit earlier than the RV100-based cards (Radeon VE), though it might have been close... I'm not arguing about output (TV) quality, that's another issue, but the presence of two display controllers really is the key: there were cards with several outputs before, they just couldn't be used at the same time (at least not independently). In any case it looks like both ATI and Nvidia decided it was a feature they needed around the same time (after Matrox did it).
Though of course Nvidia missed this feature for some reason on the (later released) GF3.
 