AMD: R8xx Speculation

How soon will Nvidia respond with the GT300 to the upcoming ATI RV870 GPU lineup?

  • Within 1 or 2 weeks: 1 vote (0.6%)
  • Within a month: 5 votes (3.2%)
  • Within a couple of months: 28 votes (18.1%)
  • Very late this year: 52 votes (33.5%)
  • Not until next year: 69 votes (44.5%)

  • Total voters: 155
  • Poll closed.
So what is the general consensus on the 5870 vs. whatever nVidia's new best single-chip card is? And how does the X2 compare to that new nVidia part?

Trying to get a sense of scale this time around.
 
The Eyefinity looks pretty cool. I have so many LCD monitors; I'd stack them all together, but I don't have the desk space.
 
BTW: How sure are we that RV870 will be a 256-bit chip? I mean, the die is rotated, the card has 16 memory chips, and they are showing Eyefinity at insane resolutions. Could it be 512-bit, or is that just a little bit too much dreaming?
 
I'm soooooo glad to see ATI abandoning their proclaimed plans to focus on the sweet spot (200-300 USD) first and address the enthusiast market with dual GPU products.

But there's no practical value to blindly chasing a "sweet spot" strategy. It all depends on the competitive environment (an argument many have made in the past). Your disappointment is a product of a belief in ATI's generosity :LOL: Besides, the 5850 is still $299.....

@wombat, yep I wouldn't be surprised in the least. I'm not sure how they could avoid bandwidth starvation at 256-bit.
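
To put the 256-bit question in rough numbers, here's a quick back-of-the-envelope sketch in Python; the 4.8 Gbps effective GDDR5 data rate is just an assumed figure for illustration, not a confirmed spec:

```python
# Back-of-the-envelope peak memory bandwidth for the 256-bit vs 512-bit question.
# The 4.8 Gbps/pin effective GDDR5 data rate is an assumption, not a confirmed spec.

def peak_bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    """Theoretical peak bandwidth in GB/s: bus width (bits) * per-pin data rate / 8."""
    return bus_width_bits * data_rate_gbps / 8.0

for bus in (256, 512):
    print(f"{bus}-bit @ 4.8 Gbps/pin: {peak_bandwidth_gb_s(bus, 4.8):.1f} GB/s")
# 256-bit @ 4.8 Gbps/pin: 153.6 GB/s
# 512-bit @ 4.8 Gbps/pin: 307.2 GB/s
```

Whether the 256-bit figure counts as "starved" obviously depends on how much the rest of the chip has grown.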
 
hd5870_3.jpg
lawghr

Chiphell: 1900x1200, 4xAA, 16xAF, DX10, Very High.

Posted this some time ago; it has all the cards posted in the 5870/Crysis thread:
HD5870crysis.png


Worth noting is that the HD5870 was OC'd (to unknown clocks) and that the other cards were tested with a QX9650 @ 4GHz, while the HD5870 used a Phenom II 955BE.
 
Yes and I'm sure we'll all be applauding how great the prices are when the x2 launches at 699-799 USD. Again climbing back into the stratosphere of pricing.

I'm soooooo glad to see ATI abandoning their proclaimed plans to focus on the sweet spot (200-300 USD) first and address the enthusiast market with dual GPU products. /sarcasm.

Meh, whatever, I'm just disappointed.

Regards,
SB
Yes, I too am disappointed that I cannot buy a Veyron at $100,000. I'm disappointed that I cannot purchase the Samsung LED-backlit 50" TV that's about 5cm thick for $1,000. I'm disappointed that I cannot buy that 4500 Sq Ft house in San Juan Capistrano, California for the price that was paid for it in 1992 -- it would make my relocation SO much easier from Doo-Dah, Kentucky.

I mean, how much does this stuff REALLY cost, let's be honest? They're just jacking up the price because they maintain the upper echelon of their segment. Those unrelenting capitalistic bastards.

In reality, my disappointment is mostly sarcasm, but if it were real, it would be due to unrealistic expectations. Just as your disappointment is also founded in unrealistic expectations. If you want the upper echelon of performance, you will pay for it. Be disappointed all you like, but be realistic.
 
2.jpg


Confirmed details: 850MHz core clock, and anyone saying different is wrong. The default model has 1GB, not 2GB like some claimed, and the price is "<$400", not "$399" (though $399 is most likely, I suppose).
 
Posted this some time ago; it has all the cards posted in the 5870/Crysis thread:
HD5870crysis.png


Worth noting is that the HD5870 was OC'd (to unknown clocks) and that the other cards were tested with a QX9650 @ 4GHz, while the HD5870 used a Phenom II 955BE.

I don't believe those results for a second. I want to be proven wrong, but that looks waaaay too optimistic.
 
Besides, the 5850 is still $299.....

Ok, that's just a dumb argument.

Yes, I too am disappointed that I cannot buy a Veyron at $100,000. I'm disappointed that I cannot purchase the Samsung LED-backlit 50" TV that's about 5cm thick for $1,000. I'm disappointed that I cannot buy that 4500 Sq Ft house in San Juan Capistrano, California for the price that was paid for it in 1992 -- it would make my relocation SO much easier from Doo-Dah, Kentucky.

I mean, how much does this stuff REALLY cost, let's be honest? They're just jacking up the price because they maintain the upper echelon of their segment. Those unrelenting capitalistic bastards.

In reality, my disappointment is mostly sarcasm, but if it were real, it would be due to unrealistic expectations. Just as your disappointment is also founded in unrealistic expectations. If you want the upper echelon of performance, you will pay for it. Be disappointed all you like, but be realistic.

So are these analogies.
 
Why's it dumb? Just because you assert it? Since he's using examples of the best available at the moment (without cutting corners, like using a dual-chip card would be), what's wrong with the analogy?

Isn't it more dumb to just assert something without any sort of explanation?
 
I'll wait on benches, but whining about $399 for a card that almost doubles the perf of a GTX295 seems a bit, uh, silly?
 
Ok, that's just a dumb argument.

I agree, it is quite dumb to ascribe some sense of "wrong" to an item with no justification other than "it should be cheaper."

Please tell me which of these points denotes why it should be cheaper:

  • It will not be competitive unless it's at a lower price
  • It will not sell unless it's at a lower price
  • You personally won't buy it unless it's at a lower price
  • Random people on the internet will hate on it unless it's at a lower price

If your answer was one or more of the LAST TWO items, then your argument is the dumb one. If your answer was one or more of the FIRST TWO items, then your argument may hold merit. In order to determine that merit, you'll need to expound on your answers for the first two.
 
HDMI uses an electrical link identical to DVI's, and DP uses a very similar one. The difference is that DVI specs a clock-frequency limit, so to do 25x16 you need two of the electrical links. HDMI (>=1.3, not 1.0) and DP run at higher clock frequencies, so they can do it with just one.

So yes, if you want 6 25x16 displays you need HDMI 1.3 or DP. Do they actually make active adapters that take a single HDMI link in and produce 2 DVI links out? How does that play with HDCP?
There are a couple of active DP -> dual-link DVI adapters; Apple sells one, for instance. It seems to be problematic at times, and I've no idea how well it plays with HDCP... I guess with these adapters the output is really running in DP mode, and the adapter translates DP to dual-link DVI. I don't know of any active HDMI 1.3 -> dual-link DVI adapters; I don't think they exist.
The trouble with so-called HDMI 1.3 devices is that they only support a subset of HDMI 1.3, usually not including the higher clock frequencies. In fact, all 30" monitors I know of only accept their native resolution (2560x1600) over (dual-link) DVI (and DP, if they have that input), with the HDMI inputs, where present, ranging from useless (only 1280x800 on monitors without a scaler) to crappy (1920x1200 on monitors with scaling). Most graphics cards with HDMI output don't support the higher clock frequencies either (I'm not sure there's a single card that currently does).
In retrospect, dual-link DVI looks like a mistake. It's incompatible with HDMI (except via the type B connector, and there's exactly zero consumer-grade hardware out there using that), and the signal can't be carried over a DP port (too many data wire pairs, I guess). Then again, maybe the higher clock frequencies otherwise needed for that kind of resolution weren't easily achievable when it was introduced...
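
For what it's worth, here's a rough sketch of the clock math behind the single-link vs dual-link issue. The ~8% blanking overhead is an approximation (reduced-blanking timings), and the limit constants should be read as nominal figures:

```python
# Rough illustration of why 2560x1600 @ 60Hz exceeds a single DVI link.
# Blanking overhead is approximated; the limit constants are nominal figures.

SINGLE_LINK_TMDS_MHZ = 165.0   # single-link DVI / early HDMI clock ceiling
HDMI_1_3_TMDS_MHZ = 340.0      # HDMI 1.3 raises the clock ceiling

def approx_pixel_clock_mhz(h: int, v: int, refresh_hz: float,
                           blanking_overhead: float = 0.08) -> float:
    """Approximate pixel clock in MHz, assuming ~8% blanking (reduced blanking)."""
    return h * v * refresh_hz * (1.0 + blanking_overhead) / 1e6

clk = approx_pixel_clock_mhz(2560, 1600, 60)
print(f"~{clk:.0f} MHz pixel clock needed")                   # ~265 MHz
print("fits one 165 MHz link?", clk <= SINGLE_LINK_TMDS_MHZ)  # False -> need two links
print("fits one HDMI 1.3 link?", clk <= HDMI_1_3_TMDS_MHZ)    # True
```

So a single 165 MHz TMDS link tops out well short of 25x16 at 60Hz, which is exactly why it takes dual-link DVI, HDMI 1.3-class clocks, or DP.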
 
I agree! People had more money back then!

Unless you mean something else, in which case you are more than welcome to explain.

ATI was in a very bad place prior to the launch of 48xx, getting smacked all over and forever by G80 and follow-on parts. Some of the bi-polar types here were predicting the end of ATI (same people now predicting the end of NV by the way).

4870 pricing was a way to claw back some much-needed market share and mind share, and it worked very well. Now that they're back and looking reasonably good, there's no real reason to repeat that. They would be stupid and irresponsible not to try to exploit the advantage they seem to have, especially given the mauling their CPU division is going to take for the foreseeable future.

Sorry if that means that their halo has slipped in your eyes. It wasn't a real halo anyway.
 