AMD: R8xx Speculation

Discussion in 'Architecture and Products' started by Shtal, Jul 19, 2008.


How soon will Nvidia respond with GT300 to the upcoming ATI RV870 GPU lineup?

Poll closed Oct 14, 2009.
  1. Within 1 or 2 weeks

    1 vote(s)
    0.6%
  2. Within a month

    5 vote(s)
    3.2%
  3. Within a couple of months

    28 vote(s)
    18.1%
  4. Very late this year

    52 vote(s)
    33.5%
  5. Not until next year

    69 vote(s)
    44.5%
  1. Chris123234

    Regular

    Joined:
    Jan 22, 2003
    Messages:
    306
    Likes Received:
    0
    So what is the general consensus on the 5870 vs. whatever the new nVidia best single-chip card is? And how does the X2 compare to that new nVidia card?

    Trying to get a sense of scale this time around.
     
  2. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    The Eyefinity looks pretty cool. I have so many LCD monitors. I'd stack them all together but I don't have the desk space.
     
  3. w0mbat

    Newcomer

    Joined:
    Nov 18, 2006
    Messages:
    234
    Likes Received:
    5
    BTW: How sure are we that RV870 will be a 256-bit chip? I mean the die is rotated, the card has 16 memory chips, and they are showing Eyefinity with insane resolutions. Could it be 512-bit or is this just a little bit too much dreaming?
     
  4. eddieobscurant

    Newcomer

    Joined:
    Dec 5, 2008
    Messages:
    8
    Likes Received:
    0
  5. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,578
    Likes Received:
    622
    Location:
    New York
    But there's no practical value to blindly chasing a "sweet spot" strategy. It all depends on the competitive environment (an argument many have made in the past). Your disappointment is a product of a belief in ATI's generosity :lol: Besides, the 5850 is still $299.....

    @wombat, yep I wouldn't be surprised in the least. I'm not sure how they could avoid bandwidth starvation at 256-bit.
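
    The bandwidth-starvation worry is easy to sanity-check with back-of-envelope math. A minimal sketch; the 4.8 Gbps per-pin GDDR5 data rate below is an assumed speed grade of the era, not a confirmed spec:

```python
# Peak memory bandwidth: bus width in bytes (bits / 8) times the
# effective per-pin data rate in Gbps gives GB/s.
def mem_bandwidth_gbps(bus_width_bits, data_rate_gbps):
    """Peak memory bandwidth in GB/s for a given bus width and data rate."""
    return bus_width_bits / 8 * data_rate_gbps

# 4.8 Gbps/pin is an assumption, not a confirmed RV870 spec.
print(mem_bandwidth_gbps(256, 4.8))  # 153.6 GB/s on a 256-bit bus
print(mem_bandwidth_gbps(512, 4.8))  # 307.2 GB/s on a 512-bit bus
```

    Doubling the bus width doubles peak bandwidth at the same memory clock, which is why the 16-chip board photos fuel the 512-bit speculation.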
     
  6. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    9,045
    Likes Received:
    2,921
    Location:
    Finland
    Posted this some time ago, has all the cards posted in the 5870/crysis thread

    Worth noting is that the HD5870 was overclocked (to unknown clocks) and that the other cards were tested with a QX9650 @ 4GHz, while the HD5870 used a Phenom II 955BE
     
  7. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    3,864
    Likes Received:
    363
    Location:
    35.1415,-90.056
    Yes, I too am disappointed that I cannot buy a Veyron at $100,000. I'm disappointed that I cannot purchase the Samsung LED-backlit 50" TV that's about 5cm thick for $1,000. I'm disappointed that I cannot buy that 4500 Sq Ft house in San Juan Capistrano, California for the price that was paid for it in 1992 -- it would make my relocation SO much easier from Doo-Dah, Kentucky.

    I mean, how much does this stuff REALLY cost, let's be honest? They're just jacking up the price because they maintain the upper echelon of their segment. Those unrelenting capitalistic bastards.

    In reality, my disappointment is mostly sarcasm, but if it were real, it would be due to unrealistic expectations. Just as your disappointment is also founded in unrealistic expectations. If you want the upper echelon of performance, you will pay for it. Be disappointed all you like, but be realistic.
     
  8. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    9,045
    Likes Received:
    2,921
    Location:
    Finland

    Confirmed details: 850MHz core clock; anyone saying different is wrong. The default model has 1GB, not 2GB like some claimed, and the price is "<$400", not "$399" (though $399 is most likely, I suppose)
     
  9. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,792
    Likes Received:
    1,075
    Location:
    Guess...
    I don't believe those results for a second. I want to be proven wrong but that looks waaaay too optimistic.
     
  10. Chris123234

    Regular

    Joined:
    Jan 22, 2003
    Messages:
    306
    Likes Received:
    0
    Ok, that's just a dumb argument.

    So are these analogies.
     
  11. ChronoReverse

    Newcomer

    Joined:
    Apr 14, 2004
    Messages:
    245
    Likes Received:
    1
    Why's it dumb? Just because you assert it? Since he's using examples of the best available at the moment (without cutting corners like using a dual-chip card), what's wrong with the analogy?

    Isn't it dumber to just assert something without any sort of explanation?
     
  12. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,578
    Likes Received:
    622
    Location:
    New York
    And that is your well reasoned analysis? Lol, give me a break :lol: Cry all you want but $399 for the fastest single-GPU card on the planet is a damn bargain.
     
  13. Mize

    Mize 3dfx Fan
    Moderator Legend Veteran

    Joined:
    Feb 6, 2002
    Messages:
    5,079
    Likes Received:
    1,149
    Location:
    Cincinnati, Ohio USA
    I'll wait on benches, but whining about $399 for a card that almost doubles the perf of a GTX295 seems a bit, uh, silly?
     
  14. Albuquerque

    Albuquerque Red-headed step child
    Veteran

    Joined:
    Jun 17, 2004
    Messages:
    3,864
    Likes Received:
    363
    Location:
    35.1415,-90.056
    I agree, it is quite dumb to assert some sense of "wrong" to an item with no other definition than "it should be cheaper."

    Please tell me which of these points denotes why it should be cheaper:

    • It will not be competitive unless it's at a lower price
    • It will not sell unless it's at a lower price
    • You personally won't buy it unless it's at a lower price
    • Random people on the internet will hate on it unless it's at a lower price

    If your answer was one or more of the LAST TWO items, then your argument is the dumb one. If your answer was one or more of the FIRST TWO items, then your argument may hold merit. In order to determine that merit, you'll need to expound on your answers for the first two.
     
  15. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    9,045
    Likes Received:
    2,921
    Location:
    Finland
  16. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,018
    Likes Received:
    114
    There are a couple of active DP -> dual-link DVI adapters; Apple sells one, for instance. It seems to be problematic at times and I've no idea how well it plays with HDCP... I guess with these adapters the output really runs in DP mode, and the adapter genuinely translates DP to dual-link DVI. I don't know of any active HDMI 1.3 -> dual-link DVI adapters; I don't think they exist.
    The trouble with so-called HDMI 1.3 devices is that they only support a subset of HDMI, and usually not the higher clock frequencies. In fact, all 30" monitors I know of only accept their native (2560x1600) resolution over (dual-link) DVI (and DP if they have that input), with the HDMI inputs ranging from useless (only 1280x800 on monitors without a scaler) to crappy (1920x1200 on monitors with scaling), if they have an HDMI input at all. Most graphics cards with HDMI output don't support the higher clock frequencies either (I'm not sure there's currently a single card which does).
    In retrospect, dual-link DVI looks like a mistake. It's incompatible with HDMI (except via the type B connector, and there's exactly zero consumer-grade hardware out there using that), and the signal can't be carried over a DP port (too many data wire pairs, I guess). Though maybe the higher clock frequencies otherwise needed for that kind of resolution weren't easily achievable when it was introduced...
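
    The single-link vs. dual-link limit is just pixel-clock arithmetic. A rough sketch; the raster totals below are assumed CVT reduced-blanking figures for 2560x1600 @ 60Hz, and the 165 MHz per-link TMDS ceiling comes from the DVI spec:

```python
# Required pixel clock for a raster: total pixels per frame times refresh rate.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Required pixel clock in MHz for the given total raster and refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

SINGLE_LINK_DVI_MAX = 165.0            # MHz, per-TMDS-link cap from the DVI spec
DUAL_LINK_DVI_MAX = 2 * SINGLE_LINK_DVI_MAX

# 2720x1646 are assumed CVT reduced-blanking totals for 2560x1600.
clk = pixel_clock_mhz(2720, 1646, 60)
print(clk)                            # ~268.6 MHz
print(clk <= SINGLE_LINK_DVI_MAX)     # False: a single link can't carry it
print(clk <= DUAL_LINK_DVI_MAX)       # True: dual-link can
```

    This is why a 30" panel falls back to a much lower mode on a 165 MHz-limited HDMI input: the link, not the monitor, is the bottleneck.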
     
  17. nutball

    Veteran Subscriber

    Joined:
    Jan 10, 2003
    Messages:
    2,230
    Likes Received:
    565
    Location:
    en.gb.uk
    ATI was in a very bad place prior to the launch of 48xx, getting smacked all over and forever by G80 and follow-on parts. Some of the bi-polar types here were predicting the end of ATI (same people now predicting the end of NV by the way).

    4870 pricing was a way to get back some much needed market-share and mind-share - and it worked very well. Now they're back and looking reasonably good, there's no real reason to repeat that. They would be stupid and irresponsible not to try to exploit the advantage they seem to have, especially given the mauling their CPU division is going to be taking for the foreseeable future.

    Sorry if that means that their halo has slipped in your eyes. It wasn't a real halo anyway.
     
  18. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    9,045
    Likes Received:
    2,921
    Location:
    Finland
  19. w0mbat

    Newcomer

    Joined:
    Nov 18, 2006
    Messages:
    234
    Likes Received:
    5
    I think Juniper is 256-bit and Cypress 512-bit. I hope I'm right :D
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.