First GTX295 preview(s) out

Discussion in '3D Hardware, Software & Output Devices' started by Kaotik, Dec 17, 2008.

  1. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    They are close. There are a few instances where the GTX's superior clocks and RAM can be helpful. I've seen it mostly in Crysis and Fallout 3, but that's mostly with 8x MS at super-high-end resolutions. Otherwise, with 16x/8x/4x with transparency SS, you wouldn't be able to pick one out from the other running side by side. The real performance gains for me didn't come until Quad SLI. But I can't talk any more about that :p

    Brent said he was stuck with 8x MS on titles like Fallout 3 and Far Cry. I did send him a private message on ways to get 16xAA and 16xQ properly working, as well as transparency SS. He hasn't replied to me yet though :)
     
  2. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,716
    Likes Received:
    2,137
    Location:
    London
    XBit is one of the rare sites that routinely uses transparency/adaptive sampling for textures with transparency.

    http://www.xbitlabs.com/articles/video/display/7games-fall2008_2.html#sect0

    Jawed
     
  3. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    Well at least my crystal ball works :)
    Months ago I said that nVidia would probably make an SLI-on-a-stick card out of the GT200 series once they moved to 55 nm. And there it is.

    ATi supporters were saying it wasn't possible because of heat and power consumption issues, but they ignored the fact that ATi uses 55 nm for the 4870 as well, and nVidia hadn't played that trump card yet.
    And it seems that nVidia has made a clean sweep: lower power consumption than the 4870X2, better performance, and at the same price.
    Which makes the whole argument that nVidia's dies are bigger and more expensive rather moot. Yes, they're bigger, and they might be more expensive... but the overall card is no more expensive than ATi's, and there are no power consumption or heat issues from the larger die either. So who cares, really?
     
  4. Kaotik

    Kaotik Drunk Member
    Legend

    Joined:
    Apr 16, 2003
    Messages:
    10,245
    Likes Received:
    4,465
    Location:
    Finland
    Board manufacturers care; their profits get cut quite short when the chips are expensive but they still need to sell the cards cheap.
     
  5. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    But that only holds if nVidia sells the chips at high prices to their partners. Perhaps nVidia absorbs the difference in cost themselves... or perhaps nVidia's yields are better than ATi's, making the chip cost the same regardless...
    I'd like to see some info on how much nVidia and ATi charge their board manufacturers, what the rest of the board costs to design, and how the total cost of manufacturing relates to the retail prices.

    It could well be that there's only a $10-$20 difference in raw costs (or even that nVidia's board design and choice of other components, such as memory, compensate for the chip cost altogether), which could easily be made up by more sales. In which case, the board manufacturers wouldn't care.
     
  6. CJ

    CJ
    Regular

    Joined:
    Apr 28, 2004
    Messages:
    816
    Likes Received:
    40
    Location:
    MSI Europe HQ
    1. The GTX295 isn't available at retail yet, so comparing prices of an unavailable product and a 5-6 month old product is a bit of a moot point.
    2. The X2 has been hovering at the same price point since its launch in August. What makes you think AMD will keep the price at $499 on January 8th, 2009?
    3. You're basing the 'better performance' claim on the "NV wants you to use these games" benchmarks. Just wait until January to see how it stacks up in other popular games...
    4. You would expect a product launched 6 months after its competitor to be faster... and yet it isn't convincingly faster in every preview benchmark and even loses in some... it seems a bit like arriving a day after the fair.
     
  7. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    12,059
    Likes Received:
    3,119
    Location:
    New York
    What's the current MSRP? $549 or something like that, right?
     
  8. SirPauly

    Regular

    Joined:
    Feb 16, 2002
    Messages:
    491
    Likes Received:
    14
    Hehe, looking forward to these settings/gains.
     
  9. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    Wow, you seem to go out of your way to try and 'correct' me.
    You fail.
     
  10. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
    Well, they weren't ignoring nVidia's move to 55 nm, but don't forget nVidia's chip is still (approximately) twice as big even at 55 nm.
    I dunno, I'm not sold yet on the lower power consumption (under load, at least); were there any numbers somewhere? Better performance, yes, but really, given the complexity of the chip, it should be faster. Same price? That remains to be seen. I suspect AMD has way more room to adjust pricing with these cards (unless GDDR5 prices are very high, but I've no idea there).
    Well, you can't deny that power/heat put some limits on the GTX295. The clock is only the same as the GTX260's, which is way lower than what the 55 nm chip would be capable of (so nVidia likely lowered the voltage quite a bit, which can make a huge difference in power consumption; a rough sketch of that effect follows below). The same goes for the memory clock, and I suspect the sacrifice of 1 (x2) memory channel is also due more to power concerns than to cost.
    Yes, in the end it might still be faster than AMD's solution (though I suspect just barely, if you look at a wider selection of apps), but really, given the chip complexity, it's not much. I bet AMD makes way more money on its card if they are sold at near the same price.
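    As a rough illustration of why a lower clock plus a voltage drop matters so much, here is a minimal sketch using the standard dynamic-power approximation P ~ C * f * V^2. The specific clock and voltage figures are illustrative assumptions, not nVidia's published GTX295 specs.

    ```python
    # Back-of-the-envelope dynamic power estimate: P ~ C * f * V^2.
    # All clock and voltage figures below are illustrative assumptions,
    # not nVidia's published GTX295 specifications.

    def dynamic_power_ratio(f_new, v_new, f_old, v_old):
        """Ratio of dynamic power after a clock/voltage change (P ~ f * V^2)."""
        return (f_new / f_old) * (v_new / v_old) ** 2

    # Hypothetical example: hold the core at a GTX260-like 576 MHz instead
    # of a possible 650 MHz, and drop the core voltage from 1.18 V to 1.05 V.
    ratio = dynamic_power_ratio(576, 1.05, 650, 1.18)
    print(f"Dynamic power falls to ~{ratio:.0%} of the higher-clocked part")
    # -> ~70%, which is why a modest voltage cut can make a huge
    #    difference in total board power
    ```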
     
  11. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    I was talking about heat and power consumption though. Die size doesn't necessarily have anything to do with either.

    Yes, I don't recall where, but I've seen at least one (p)review with numbers, and the GTX295 came out lower in all tested applications.

    We'll have to see. nVidia doesn't necessarily have to stick to the price they've suggested now; they just based it on ATi's current price, I suppose. There are usually HUGE margins on such high-end cards anyway.
    nVidia priced their GTX260 really competitively against ATi as well. With 55 nm they can only improve on performance/price.
     
  12. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    It's not uncommon for nVidia to put just enough into their cards to make them attractive. You've got to remember that, unlike the 4870X2, the GTX 295 is neither a GTX 260 nor a GTX 280; it's in between. The power draw on the card is very reasonable, and I make this comparison against GTX 260/280 SLI power usage. What makes you believe the power consumption figures are so wrong?

    But the GTX295 is just like the 9800GX2 in that its core clocks are below those of the high-end single GPUs of the time ((9800GTX/GTX 280/285)). But it also has the same stream processor count as the high end. ((GTX280/GTX 260 = 9800GTX/8800GT)) Nothing they've done here is really out of the ordinary compared to their past multi-GPU cards.

    As far as margins go... when did everyone become so bleeding-heart about nVidia's and ATI's margins? If the cards get priced the same and stay at the same price, then all that matters in the long run is the pros and cons of each individual card, and that's up to the user to decide. I'll admit that I don't buy a lot of graphics cards; I get them from Nvidia for testing. But I wouldn't apply this kind of logic to anything else I'm buying.

    *Edit* I just checked into the backplate design of the engineering samples, and there are no plans to have the back PCB covered.
     
    #72 ChrisRay, Dec 19, 2008
    Last edited by a moderator: Dec 19, 2008
  13. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,022
    Likes Received:
    122
    Some reviews had slides from nVidia which showed slightly lower numbers. Tom's Hardware actually measured power draw, and claimed significantly lower draw than the HD4870X2 under full load (and slightly lower at idle, though I especially don't trust the idle numbers for prototype hardware, as things like idle clocks could change). That's good, though in the past there was quite some variance in measured power draw for the same card across different reviews. In any case, going by the official TDP figures, both solutions should be similar (and the HD4870X2, like other HD48xx products, isn't especially good at idle, so the GTX295 might well end up better). Other GT200 products, though, always seemed to stay quite a bit below their TDP for some reason, so it could indeed end up using less power in real-life scenarios (I wonder, though, how you'd even reach TDP with these cards; maybe you'd need to use that idle double-precision unit?)

    True, the GTX260 is priced competitively - but it most certainly wasn't planned at that price.
    I'm only concerned with margins (as a consumer) insofar as higher margins obviously allow for more pricing flexibility. Thus AMD may just lower the price to a level where it makes no sense for nVidia to follow, hence the HD4870X2 ending up cheaper - and there's nothing wrong for anyone with a card that's a bit cheaper but performs a bit worse.
    As for performance, it seems to match expectations quite well: I think the GTX260 Core 216 against the HD 4870 1GB was called pretty much a draw, so a "GTX260 Core 240" should be about 10% faster, and thus the GTX295 about 10% faster than the HD4870X2 (the quick arithmetic check below spells this out). If it's not, that's likely due to driver differences in handling SLI/CrossFire (at least I don't think there's anything in the SLI/CrossFire technology itself that makes one or the other scale better), which could evaporate with any driver revision, or in some rare cases to the GTX295 running out of memory.
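    As a quick sanity check on that ~10% figure, here is a minimal sketch. It assumes performance scales roughly linearly with shader count at equal clocks, which is a simplification; memory bandwidth and SLI/CrossFire driver scaling also matter.

    ```python
    # Sanity check on the "~10% faster" estimate above, assuming performance
    # scales roughly linearly with shader count at equal clocks (a
    # simplification: bandwidth and SLI/CrossFire scaling also matter).

    cores_216 = 216  # GTX260 Core 216 (called roughly a draw vs. the HD4870 1GB)
    cores_240 = 240  # full GT200 shader count per GPU on the GTX295

    speedup = cores_240 / cores_216 - 1
    print(f"Expected per-GPU advantage: ~{speedup:.0%}")  # ~11%
    ```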

    Oh, btw, one last rant: what's up with the GTX295 name? I don't like it - it gives the impression that this is just a faster version of the GTX260/280/285, whereas it's SLI on a stick. Not that nVidia's product names have made a whole lot of sense lately, but the GX2 moniker was OK.
     
  14. CJ

    CJ
    Regular

    Joined:
    Apr 28, 2004
    Messages:
    816
    Likes Received:
    40
    Location:
    MSI Europe HQ
    No, just pointing out your premature assumptions/conclusions.
     
  15. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    9,237
    Likes Received:
    4,260
    Location:
    Guess...
    I totally agree. I'm always confused why people are so concerned about the size of GT200, or what it costs NV to make. Why the hell does that matter to us as consumers, as long as it's priced the same as the competition? If anything, it's a good thing: we're getting more chip for our money!
     
  16. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    No, you think they're premature, but you have no way of knowing what I know.
     
  17. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    Idle clocks are the same as the GTX 260/280's, so it's very unlikely they will change. Same power saving as any other GT200 hardware.
     
  18. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    Isn't it obvious?
    There have been huge discussions about something as insignificant as nVidia's die size... but do you hear people complaining about AMD's lack of GPGPU solutions, physics acceleration, or their laughable Avivo transcoder? Hardly.
     
  19. CJ

    CJ
    Regular

    Joined:
    Apr 28, 2004
    Messages:
    816
    Likes Received:
    40
    Location:
    MSI Europe HQ
    Well, ain't that funny. I could say the exact same thing...

    No reason to be all uptight now, is there? :roll:
     
  20. Speccy

    Newcomer

    Joined:
    Oct 11, 2002
    Messages:
    86
    Likes Received:
    6
    I was under the impression that this is a technology-oriented discussion forum and that these types of things are considered and discussed here. If that's not the kind of thing you're interested in discussing, then there are all kinds of other forums that simply don't look any deeper.
     