AMD: R7xx Speculation

Status
Not open for further replies.
Why? Because ATi must always have a superior part? Does being honest about what your design goals were amount to admitting a monumental failure? Why is it so beyond the realm of imagination that, during the design phase, R600 was exactly how they envisioned a high-end part for that timeframe should look?

The R600 was poor because the G80 was great...and ATi sure as hell had no involvement with the second part.

Forget G80 then, compare directly to R580 and you'll see R600 didn't hit performance targets (other than in synthetic tests).

RV770 is what R600 should've been. A year later than it already was.
 
RV770 is what R600 should've been. A year later than it already was.

It fits more properly into the segment that R600 was supposedly meant to target, at least. R600 was too much of a power-burning furnace not to have been an attempt at the top.

It's a wild crazy monster at certain tasks, that's for sure. They just mis-targeted where games would be.
 
Well, as I see it, the rumored specs of RV770 are where games are ending up.

R600 was targeting ATI's speculation about where gaming trends would go: much higher math/shader usage alongside flat or slowly growing texture use.

Had game development proceeded as ATI was gambling it would, the payoff might have been huge. As it was, Nvidia, with a more traditional architecture, ended up much closer to the mark.
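The "bet" being described can be put in rough numbers. The unit counts below are the commonly quoted ones (R580: 48 pixel shaders against 16 texture units; R600: 64 VLIW-5 units, i.e. 320 SPs, against 16 texture units) and should be treated as assumptions, not anything from this thread:

```python
# ALU:TEX ratio sketch for the math-vs-texturing gamble discussed above.
# Unit counts are the commonly quoted figures, treated here as assumptions.
chips = {
    "R580": {"alu_units": 48, "tex_units": 16},  # 48 pixel shader processors
    "R600": {"alu_units": 64, "tex_units": 16},  # 64 VLIW-5 units (320 SPs)
}

for name, c in chips.items():
    ratio = c["alu_units"] / c["tex_units"]
    print(f"{name}: ALU:TEX = {ratio:.0f}:1")
```

On those figures the ratio climbs from 3:1 to 4:1, which only pays off if shader load grows faster than texture load, exactly the gamble described above.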

Other than heat and process issues, I'm pretty sure R600 hit the mark ATI was aiming for.

Unfortunately the mark ATI was aiming for ended up not being on target as to where games would be when it finally came out.

So yes, it ended up being a failure in that ATI guessed wrong all those years ago when development of R600 started. However, it was a successful product for what it was designed to do.

Unfortunately, that's only reflected in the professional market and not in the gaming market. Which isn't going to get you very far.

Luckily, with RV670 they were able to take the architecture and turn it into a rather attractive and fairly successful product by reconfiguring it for the market segment its performance most closely matched.

Regards,
SB
 
Forget G80 then, compare directly to R580 and you'll see R600 didn't hit performance targets (other than in synthetic tests).

This being apparent in what particular scenarios? Let's not get into the AA debate, as that can be argued either way. In recent titles, the delta between R600 and R580 is less than what was customary with generational shifts? Really? Could you please direct me to proof of that? Mind you, not some pre-release investigation on beta drivers, but one including somewhat decent, post-release drivers.
 
Luckily, with RV670 they were able to take the architecture and turn it into a rather attractive and fairly successful product by reconfiguring it for the market segment its performance most closely matched.

Haha that has to be the fanciest way of saying "lowered the price" that I've ever seen :LOL:
 
Battlefield 2142 doesn't have N-way enabled for anything beyond 2-way AFR. Probably because it wasn't tested yet. But you can enable 4-way/3-way easily by changing a single numerical value in nvapps. Oh wait. That's only a single 9800GX2. :S
 
If it's true they are making another dual-chip AFR-based GPU, would shared memory be the biggest benefit?
 
I'm sure it would at least alleviate VRAM restrictions a bit at higher resolutions, as well as save transferring buffers between the individual GPUs.

That said, even a PCIe 2.0 bridge on the X2 could help scaling a bit. But I've read in some magazine that they're quite expensive, at about 75 dollars...
 
Other than heat and process issues, I'm pretty sure R600 hit the mark ATI was aiming for.
I disagree because the available bandwidth essentially goes to waste, which is why the drastic bandwidth cut for RV670 had very little impact on absolute performance in games (there may be some synthetics that show a big drop).
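Rough numbers behind that bandwidth cut (memory clocks here are approximate reference figures for the HD 2900 XT and HD 3870, assumed rather than taken from this thread):

```python
# Peak memory bandwidth, R600 (HD 2900 XT) vs RV670 (HD 3870).
# Effective transfer rates are approximate reference clocks (assumptions).
def peak_bandwidth_gbs(bus_width_bits: int, effective_mts: float) -> float:
    """Bytes per transfer * transfers per second, in GB/s."""
    return (bus_width_bits / 8) * effective_mts * 1e6 / 1e9

r600 = peak_bandwidth_gbs(512, 1650)   # 512-bit GDDR3 -> ~105.6 GB/s
rv670 = peak_bandwidth_gbs(256, 2250)  # 256-bit GDDR4 -> ~72.0 GB/s
print(f"R600:  {r600:.1f} GB/s")
print(f"RV670: {rv670:.1f} GB/s")
print(f"RV670 retains {rv670 / r600:.0%} of R600's bandwidth")
```

Losing roughly a third of the bandwidth with little impact on game performance is what suggests R600's 512-bit bus was largely wasted.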

Jawed
 
That said, even a PCIe 2.0 bridge on the X2 could help scaling a bit. But I've read in some magazine that they're quite expensive, at about 75 dollars...

Sorry if I'm being dense, but why would that be the case?
 
Because on current reference boards, both RV670 GPUs on an HD 3870 X2 communicate with each other through a 3-port, PCIe-1.1-compliant 48-lane switch (Link).

Yes, and? What's the precise limiting factor in that?
 
Here's a random thought: what if there is no 2xRV770? Basically, RV770 and R700 are one and the same and it's a high-end single-chip GPU. I don't really think that's very likely but I wonder if others here think it's possible at all.
 
Here's a random thought: what if there is no 2xRV770? Basically, RV770 and R700 are one and the same and it's a high-end single-chip GPU. I don't really think that's very likely but I wonder if others here think it's possible at all.
A chip so big and fast that we don't need 2 of them on a single board? :oops:

Or, two won't fit on a single board? :LOL:

I'm sceptical that ATI will abandon the X2 configuration - if nothing else there's still a marketing advantage in offering such a board. Although the "short life" of 7950GX2 and (apparently) 9800GX2 might offer a different perspective on the value of X2.

I dare say there won't be much surprise if the X2 board takes a while to appear after the single-chip board's launch - but I'm struggling to conceive of a reason for it never appearing. Unless it's superseded.

Jawed
 