AMD: R7xx Speculation

Supposedly, they do. You'd have to ask ATI how exactly they managed to do pretty much the same work in less space or increase transistor density, but a single RV770 ALU takes up less die area than a single RV670 ALU.

Clearly lots and lots of hamsters were used. As modular as the chip seems to be, I wouldn't be surprised if there was a whole lot of custom work on certain parts. If they start with one of those ten pipelines (for lack of a better word) and fine-tune it as much as they can, it could in turn be used on almost every SKU on a similar process. After that they simply network them all together.

Still, the setup strikes me as eerily similar to quad-core CPU configurations: cores connected to an onboard memory controller, with high-speed (HyperTransport) links connecting to other CPUs, memory, and devices.

Heck, smacking just one of those pipes onto an actual CPU would give you a whole lot of horsepower to play with. It might even be a direction AMD is heading in the future. They'd just need TSMC to manufacture their CPUs for it to work.
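
To put rough numbers on that modularity, here's a toy sketch (Python; the block layout matches the 16-unit x 5-wide SIMD cores reported for RV770, but the class names and SKU list are just mine for illustration, not ATI's actual floorplan):

```python
from dataclasses import dataclass

@dataclass
class SimdBlock:
    """One "pipeline" in the sense used above: 16 units of 5-wide
    VLIW ALUs, as reported for RV770's SIMD cores."""
    units: int = 16
    alus_per_unit: int = 5

@dataclass
class Gpu:
    name: str
    blocks: int      # copies of the tuned block networked together
    clock_mhz: int

    def stream_processors(self, block=SimdBlock()):
        return self.blocks * block.units * block.alus_per_unit

    def peak_gflops(self, block=SimdBlock()):
        # Count a multiply-add as 2 flops per ALU per clock.
        return self.stream_processors(block) * 2 * self.clock_mhz / 1000

# Same tuned block, different SKUs:
for gpu in (Gpu("HD4850", 10, 625), Gpu("HD4870", 10, 750)):
    print(gpu.name, gpu.stream_processors(), "SPs,", gpu.peak_gflops(), "GFLOPS")
```

That reproduces the advertised 800 SPs and 1.0/1.2 TFLOPS figures from ten copies of one block, which is the whole appeal of the approach.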
 
With reviews of HD4850 hitting, what's left to speculate about on the R7xx? Anything other than the X2 configurations and pricing? How surprising the edge-detect AA modes will be?
 
Fixed your quote. ;)

Fact is, we should not let companies get away with FUD/BS/spin/lies when those of us with an understanding can educate the people who don't know what is going on. We simply shouldn't accept companies lying to us to get our money.

It's obvious when we see PR say "the number of cores isn't meaningful" and then flip around to "we have more cores than Intel" when it's convenient. Which is true and which is untrue? Can we just discount everything Nvidia claims because we don't know what is true and what is lies?

Well said, and of course the issue isn't about "semantics" at all; it's about attempting to create perceptual linkages in the minds of consumers, and whether such linkages actually exist in reality rarely concerns the people who write such marketing copy. In such cases, it is my opinion that the goal of the negative marketeer is to create perceptions that are not grounded in reality rather than ones that are. I agree with you completely.

As a rule, negative marketing fails, which makes it all the more surprising when we see companies continuously fall back on it whenever a competitor catches them by surprise. Two major examples of relatively recent negative marketing that I can think of, notable for their degree of failure, were these:

(1) When AMD was burning the midnight oil to push the first Opterons out of the door and to bring x86-64 into the mainstream, Intel's PR response was to mount a nearly year-long, massive negative PR campaign that often amounted to precisely the following quote:

"You don't need 64-bits on the desktop." Short, sweet, to the point and incontrovertibly wrong.

First Opteron and then the A64 proved that mantra wrong, and then Intel itself placed the final nail in that particular coffin when it launched Core 2.

(2) When ATi struck out of the blue with R300 to leapfrog nVidia's discrete 3d gpu technology a few years ago, nVidia responded with a knee-jerk spate of negative advertising that was so pedantic, so intense, so acrimonious, and so erroneous that few of us who lived through it have forgotten it. And that negative advertising campaign lasted only so long as it took nVidia to stop talking about how "wrong" ATi's approach to 3d was when nVidia was finally able to bring its own similar products to market.

Neither of these negative advertising campaigns was successful, and in my opinion, both did far more to bolster the competition than they did to bolster the companies that created and funded the negative PR.

The moral of the story is that negative advertising doesn't work and is a waste of time and money. So why do companies keep repeating the same basic mistake from time to time? Why not just forgo the negative ads and simply run ads accentuating the positives of the products you wish to sell?

I think that hubris plays a large part in all of this. Companies large or small are no better or worse than the human beings who run them. If you are a company that has been sitting at the top of the heap for a time, then a certain complacency sets in: you begin to view everything you do as "right" in some fashion, and you tend to see your competitors as "wrong," reasoning that this must be so because you are so much more successful commercially than they are.

And so naturally, when the other guy, or the "little" guy, manages to blindside you with a product it had never occurred to you to make, the immediate apprehension is that he's "wrong" somehow, or else that he has "unfairly" usurped your rightful position in the scheme of things and upset the apple cart through some mysterious and obviously dishonest machinations you haven't yet been able to figure out...;) And so the negative ad campaigns begin to flow freely while you scramble to understand what he saw that you did not.

So that's one explanation for negative advertising that I think is all too human. The fact is, though, that the real fault was your own, since had you not been so enamored of your own imagined market position, you might very well have seen the very thing he saw long before he saw it himself.

To that end, you have two different types of companies: innovators and milkers. Now, certainly, all companies both innovate and milk the technologies they bring to market. The difference between companies, however, is one of degree. Some companies spend most of their time innovating while other companies spend most of their time milking. And there you have it. The milker thinks that the "right" way to do things is to spend 20% of his time innovating and 80% of his time milking, while the ratio may well be the reverse for the innovator.

It's long been my opinion about AMD as a company that it has never had the luxury of spending most of its time milking as opposed to innovating, and so the great spurts in its fortunes have come because of this necessary emphasis in corporate philosophy on innovating rather than milking.

I also think that with its new-found success versus Intel with Opteron/A64, AMD became a bit too comfortable with that position and began making assumptions about Intel's competitive intentions that perhaps it should not have made--and I think that AMD was perhaps remiss in slacking off on its Phenom development when instead it should have been accelerating it. I think the lesson has not been lost on AMD, that the price of success is eternal innovation, and I think this renewed philosophical emphasis within AMD as a whole is being demonstrated again at present with these new ATi gpus, to be followed later this year with its newer 45nm Phenom cpu product lines. The lesson here, I think, is that while imitation is a sincere form of flattery indeed, there are some traits your competitors have that it is best to eschew if at all possible...:D
 
Whoa, whoa,

hang on a minute.

What are they suggesting about the environment in the Ruby demo?

http://www.youtube.com/watch?v=ROAJMfeRGD4&feature=related


It sounds to me like each frame is a 'fancy' version of the QuickTime 360 picture thing, but with each pixel having a depth, making it more like a voxel. And then z-buffer compositing the 3D elements on top... robot, Ruby, etc.

Notice he says it's rendered with no polygons, even though he said the environment is CG assets? So it's pre-rendered, right? He did say the city contains GI and photon maps. Not real-time type stuff at the mo, is it?

So let me get this straight: Cinema 2.0 is linear film that you can look 360 degrees around in, with real-time elements composited on top, à la Resident Evil 2?

Please tell me I'm totally wrong and talking shit, cos that would make me feel a lot better. I can't help thinking there is something weird going on here.

EDIT: Notice that in that given frame the camera stays in a fixed location looking around the city. Much like those stupid 3D pictures in QuickTime?
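
FWIW, here's a minimal sketch (Python/numpy, with made-up buffer contents and sizes) of the kind of RGB+depth compositing I'm guessing at above: a pre-rendered backdrop that carries a depth value per pixel, z-tested against a real-time element:

```python
import numpy as np

H, W = 720, 1280

# Pre-rendered backdrop: color plus per-pixel depth (the "voxel-ish" part).
bg_rgb = np.zeros((H, W, 3), dtype=np.float32)
bg_depth = np.full((H, W), 100.0, dtype=np.float32)  # the distant city

# Real-time element (Ruby, the robot...) rasterized to its own color/depth.
fg_rgb = np.random.rand(H, W, 3).astype(np.float32)
fg_depth = np.full((H, W), np.inf, dtype=np.float32)
fg_depth[300:500, 500:700] = 10.0  # she stands near the camera

# Classic z-buffer composite: per pixel, keep whichever fragment is closer.
closer = fg_depth < bg_depth
out = np.where(closer[..., None], fg_rgb, bg_rgb)
```

If that's all it is, the "no polygons" line makes sense for the backdrop while the real-time elements are still ordinary geometry.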
 
Looks like the 9800GTX+ leads the ATI 4850 in most games they tested at FiringSquad.

Also, 8800GT SLI looks pretty good compared to the 4850 CF in Anand's review.

Looks like you could buy either Nvidia or ATI this time in the midrange and get the same thing for the same price.
 
With reviews of HD4850 hitting, what's left to speculate about on the R7xx? Anything other than the X2 configurations and pricing? How surprising the edge-detect AA modes will be?

Everything seems... just plain odd. In terms of performance, that is.
 
The TweakTown 4850 preview pits it against a BFG 9800GTX OCX, which is clocked slightly higher than the proposed 9800GTX+ (755/1890/2300 vs. 738/1836/2200 core/SP/RAM, so +3%/+3%/+4.5%), if people want another data point.
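
Quick sanity check on those deltas (just arithmetic on the numbers quoted above):

```python
ocx = (755, 1890, 2300)   # BFG 9800GTX OCX: core / shader / memory, MHz
plus = (738, 1836, 2200)  # proposed 9800GTX+
for a, b, what in zip(ocx, plus, ("core", "SP", "RAM")):
    print(f"{what}: +{(a / b - 1) * 100:.1f}%")
# core: +2.3%, SP: +2.9%, RAM: +4.5%
```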
 
Looks like the 9800GTX+ leads the ATI 4850 in most games they tested at FiringSquad.

Also, 8800GT SLI looks pretty good compared to the 4850 CF in Anand's review.

Looks like you could buy either Nvidia or ATI this time in the midrange and get the same thing for the same price.

It's just an overclocked 9800GTX. Overclock the 4850 and things will switch. Plus the GTX+ costs $100 more. 4850 wins.
 
It's just an overclocked 9800GTX. Overclock the 4850 and things will switch. Plus the GTX+ costs $100 more. 4850 wins.

The 9800GTX retails for $229, so that would be $30 more.

It is also done at 55nm, so it is not just an OC'd 9800GTX.

It is cheaper to produce.

I think the die is actually a little smaller than the 4850/70's.
 
Let's crank up the AA and see how the 4850 and 9800GTX+ stack up.

For me the 4850/70 is looking like a great buy. An E8400, an X38, 2GB of RAM, and a 48xx card is a killer bang-for-buck package!
 
I couldn't even imagine using dual 30" monitors. I much prefer a 30" monitor paired up with a 20" monitor in portrait view. Otherwise the viewing area gets to be so wide as to be distinctly uncomfortable to use. At least for me.

Don't knock it till you've tried it. If the 4870s don't have dual dual-link, they've just lost a customer in me.

BTW, which setup would you prefer?

http://www.pogdesign.co.uk/stuff/desk0r
The ridiculous number of machines?

http://www.pogdesign.co.uk/stuff/newnewdesk
A larger number of monitors?

http://www.pogdesign.co.uk/stuff/desk60
More pixels per screen? (yeah yeah, don't hassle me about the mess)
 
Don't knock it till you've tried it. If the 4870s don't have dual dual-link, they've just lost a customer in me.

BTW, which setup would you prefer?

http://www.pogdesign.co.uk/stuff/desk0r
The ridiculous number of machines?

http://www.pogdesign.co.uk/stuff/newnewdesk
A larger number of monitors?

http://www.pogdesign.co.uk/stuff/desk60
More pixels per screen? (yeah yeah, don't hassle me about the mess)

Can anyone give me a drool smiley? :oops:

:drool:
(PS. Found a drool smiley)

:D
 
That is, if it's in stock. Ahh, but it's not. :LOL:

A hypothetical card that is expected to retail for $229 versus a card that can be had for as little as $170 today. Hmm.

And one that has proven to survive 4xAA + 16xAF and higher resolutions, whereas we don't know if G92b will survive, since G92 chokes.
 
It's just an overclocked 9800GTX. Overclock the 4850 and things will switch. Plus the GTX+ costs $100 more. 4850 wins.

So far the 4850 does not seem like such a great overclocker, though. Some sites got 700MHz no problem, which is the max the current OC software will allow, but another site only got 675/682 on two different brands of 4850 and concluded most of the better chips must be binned for the 4870. The chip also seems to run really hot, which could be a negative for OCing.

I am wondering how this 9800GTX+ will OC. If it's truly a 55nm part, maybe it will have more headroom. But it won't even be available for three to four weeks, a pretty big negative in itself.

It's really close between the 4850 and 9800GTX, depending on where final prices settle out with rebates, OCing capabilities, future driver improvements, etc.
 