ATI RV740 review/preview

Discussion in 'Architecture and Products' started by LunchBox, Feb 25, 2009.

  1. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
  2. CJ

    CJ
    Regular

    Joined:
    Apr 28, 2004
    Messages:
    816
    Likes Received:
    40
    Location:
    MSI Europe HQ
    That's a bit like comparing apples to oranges isn't it?

    Both companies have always released new generations under a higher number, and it was usually pretty obvious that an x8xx card from the previous generation would be faster than an x3xx or x6xx from the new generation.

    However... in all these cases they used new GPUs*. In the current case with the GTS250 and GTS240, they have been using the same old G92 GPU for nearly 3 "generations", all the way from the 8800 to the 9800 to the GTS200 series.

    Anandtech put it nicely here:

     
    #122 CJ, Mar 3, 2009
    Last edited by a moderator: Mar 3, 2009
  3. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,583
    Likes Received:
    704
    Location:
    Guess...
    But I see no issue with using the old architecture if both the feature set and performance are near enough identical to what they would have been on a mid range version of the new architecture (gamers don't give a damn what version of CUDA the GPU supports).

    In comparison to a true GT2xx based mid range part, the price is right, the performance is right, and the feature set is right. So what is the consumer losing?

    Yes, it was obvious in past generations to anyone with a basic level of GPU knowledge that an x8xx card from the previous generation would be faster than an x3xx or x6xx from the new generation, but exactly the same is true here. I don't see anyone who knows how GPU naming schemes and generational performance jumps work automatically assuming the GTS 250 will be faster than the 9800GTX+.

    Anandtech complains that NV is treating its customers like idiots, and yet in this very thread we have talked about how prevalent that is amongst average joe gamers.

    The fact is that from a consumer perspective, there is virtually no difference between G92b and a mid range GT2xx so if consumers are shying away from G92b because it has an old name, then I don't see an issue in renaming it to something appropriate to its performance and feature set.

    Note that some of the other things they are doing, i.e. the fast/slow versions and specifying which games to use in reviews, are actions I strongly disagree with.
     
  4. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,492
    Likes Received:
    515
    Location:
    New York
    Yet that's entirely irrelevant. Performance, features and price are the only things that matter. We all share the frustration vented in the Anand article but their logic is flawed IMO. If Nvidia had taped out a "new GPU" with the exact same performance as G92, called it the GTS 250 and some dolt upgraded from his 9800GTX is that somehow more acceptable because Nvidia wasted money on taping out a new chip?

    The whole "new chip" argument is completely ridiculous in the context of G92 and GT200. There isn't a big enough architectural or feature difference to make the venture worthwhile. Especially for the consumer.
     
  5. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    At least in this case, someone who upgraded their 9800GTX to a GTS 250 would have the option to SLI them together. I know, I know, a lot of people here don't like SLI. But these cards can SLI together, and it's not immediately obvious.
     
  6. mboeller

    Regular

    Joined:
    Feb 7, 2002
    Messages:
    922
    Likes Received:
    1
    Location:
    Germany
    Is this chip really an RV770? I'm not 100% sure, because the RV770 has a size of 260mm² according to all the old articles, while the chip on the jpg has a size of 264.5mm². That could be a simple mismatch, but it could also mean that the chip is not an RV770 but an RV790, which according to the rumours is a slightly improved RV770. So maybe AMD needed a new layout for the RV790 to improve clock speed etc., and therefore the chip grew a little in size.

    I also found it a little strange that AMD gave the exact size of both (new?) chips on the jpg. Why would they do that? The size of the RV770 is well known.

    Also, why haven't we seen an M4850 yet? The RV770 has been available since Q2/08, so I would have expected notebooks with the M4850 much earlier than now. So maybe AMD needed a new revision of the RV770 to improve its power consumption, and in the end the RV770 grew into the RV790 with all the improvements necessary for the notebook market.

    I know very well that this is all hypothetical, but well, it's my hypothesis. :)
     
  7. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    Thanks for pointing that out, Chris. A lot of people were wondering if you need a BIOS flash or something for that. Thanks for clearing that up!
     
  8. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    Well, it's a photoshopped marketing slide, but according to AMD that's a Mobility HD4850.
    There is more than one laptop with the 4850 inside. Besides the MSI one there's also this one

    Why are there no reviews? Dunno.
     
  9. jimmyjames123

    Regular

    Joined:
    Apr 14, 2004
    Messages:
    810
    Likes Received:
    3
    Let me get this straight: some people are bitter about a rebranding, even though the new "rebranded" card is 25% cheaper than the previous card of same/similar performance, has much lower power consumption, is physically much smaller, and has the option for more on-board memory (which comes in handy at higher resolutions and/or with SLI-based systems)???

    Ummm, ok

    So what, everything would be fine and dandy with no complaints if NVIDIA had simply created a totally cut-down version of GT200 with half the performance of a GTX 280 and named it GTS 250? Like what has been done with midrange video cards in some prior generations?

    I don't get that.

    Rebranding was probably the quickest and least expensive route for NVIDIA to combat AMD's new strategy in the short run. Clearly NVIDIA could have come out with a GT200-based midrange card, but at what extra performance, what extra cost, and what extra delay in time to market?

    Frankly, there is nothing wrong with minimizing extra R&D expenditure, and nothing wrong with bringing as good or better performance and lower power consumption to lower price points. What matters is the end result, not the means of getting there. It's not like anyone with even half a brain could mistake the GTS 250 for a performance upgrade to the 9800 GTX+ based on the name alone.

    NVIDIA ultimately must have deemed that they would be better off pouring more resources into GT300 development (and beyond).
     
  10. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    1,396
    Likes Received:
    132
    Location:
    msk.ru/spb.ru
    Code:
    GeForce 8800M GTX
    Stream Processors 	96
    Core Clock (MHz) 	500
    Shader Clock (MHz) 	1250
    Memory Clock (MHz) 	800
    Maximum Memory 	512MB
    Memory Interface 	256-bit
    Code:
    GeForce GTX 260M
    Processor Cores	112
    Gigaflops	462
    Processor Clock (MHz)	1375 MHz
    Texture Fill Rate (billion/sec)	31 
    Memory Specs:
    Memory Clock (MHz)	Up to 950 MHz
    Standard Memory Config	1  GB GDDR3
    Memory Interface Width	256-bit 
    Memory Bandwidth (GB/sec)	61
    So the second one is worse than the first one, yes?
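    As a sanity check on the 61 GB/sec figure in the second spec block, here is a quick back-of-envelope sketch (the helper name is mine, not from any spec sheet; GDDR3 is double data rate, so the effective transfer rate is twice the memory clock):

```python
def gddr3_bandwidth_gb_s(mem_clock_mhz: float, bus_width_bits: int) -> float:
    """Peak memory bandwidth in GB/s for a double-data-rate memory."""
    transfers_per_sec = mem_clock_mhz * 1e6 * 2   # DDR: two transfers per clock
    bytes_per_transfer = bus_width_bits / 8       # bus width in bytes
    return transfers_per_sec * bytes_per_transfer / 1e9

# GTX 260M: 950 MHz GDDR3 on a 256-bit bus -> ~60.8 GB/s (quoted as 61)
print(gddr3_bandwidth_gb_s(950, 256))
# 8800M GTX: 800 MHz GDDR3 on the same 256-bit bus -> ~51.2 GB/s
print(gddr3_bandwidth_gb_s(800, 256))
```

    So even on raw bandwidth the GTX 260M comes out ahead of the 8800M GTX, never mind the extra processor cores.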

    I know that we didn't have any problems doing whatever we wanted in our review of the GTS 250. That's enough for me. You may believe Charlie (who somehow had NV announcing an x86 CPU today).

    I've said this numerous times: it's POINTLESS to use 8x MSAA on G8x+ GPUs. You have nice CSAA modes which are generally quite a bit faster. So the only unbiased comparison of G8x+ and RV7x0, in my opinion, is in 4x MSAA modes.

    The new 48x0 pricing is quite bad for NVIDIA, that's a fact.
    But I've seen reviews of the GTS 250, yeah. Have you? If you don't use 8x MSAA (I've said why that's a biased comparison), then the GTS 250 is even with the 4850 in ATI-favouring benchmarks and faster than the 4850 in NV-favouring benchmarks. And that's faster overall from where I'm standing.
    And it's quite clear that the GTS 250 can't compete with the 4870 at the same price level at all.

    You need to refresh your memory. 3870 isn't and simply can't be faster than 2900.

    It introduced only DX10.1 support, which is useless right now even on RV770x2 (the only game which uses it somewhat intensively -- Clear Sky 1.5.07 -- is unplayable with DX10.1 features enabled at any sane resolution: http://www.ixbt.com/video/itogi-video/test/0902_scsbench.shtml)
    As for half the price and a reduction in power consumption -- that's funny, because the GTS 250 is doing exactly the same compared to the 9800GTX+. But what NV's doing is bad and what AMD's done is good, right?

    They've kept the performance level, which is ten times more important to the end user than what AMD did with the 2900->3870 transition.

    Lots of rumours? From where? From Charlie? -)
    What you've heard right now originated from AMD, not NV. It's RV740 which is late and needs another spin before it can come to market. Yesterday's announcement from AMD was a paper launch intended to spoil the hard launch of NV's new mobile offerings -- strange that no one mentioned that.
    GT212 isn't the RV740 competitor. It may well be cancelled, but certainly not because of problems with 40G. All the other GT21x chips (one of which should be the direct RV740 competitor) are still coming.
     
    #130 DegustatoR, Mar 4, 2009
    Last edited by a moderator: Mar 4, 2009
  11. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    Dude, I wrote a 1, not a 2.

    No, they're reviewing a $179 GTS 250 OC 1GB model against a $129 stock HD4850.

    I've had an HD2900XT and would've loved to trade it in for an HD3870. There are only a couple of games in which the 2900XT is actually faster. Heck, the 2900XT is barely faster than a 1900XTX at times. Your view is very askew.

    Assassin's Creed, Far Cry... No, the GTS250 does not cut power draw in half, and it doesn't cut the price in half either. Heck, it doesn't even go down $20. And it certainly does not introduce new features.

    Maybe because the GTS250 isn't a hard launch either?
     
  12. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    10,873
    Likes Received:
    767
    Location:
    London
    How can NVidia hard launch something that's not new?

    Cherry picking G92b for a SKU that at best could have legitimately been called GTS250M, not GTX280M, is not a hard launch.

    Where can I buy a laptop with GTX280M?

    http://www.techreport.com/discussions.x/16507

    GTS250 isn't even a hard launch - there's literally no excuse there - it's not as if the opening day of CeBIT was announced last week and GTS250 was put together in a hurry.

    Jawed
     
  13. DegustatoR

    Veteran

    Joined:
    Mar 12, 2002
    Messages:
    1,396
    Likes Received:
    132
    Location:
    msk.ru/spb.ru
    So you're saying that naming your mobile GPUs the same as discrete desktop parts while they are slower is bad? Then you should blame ATI for that; it's their doing, and they were doing it before there was any mobile NVIDIA GPU. NV is simply doing the same thing ATI/AMD is doing here.

    From the reviews I saw, everyone was reviewing either the GTS 250 or the GTS 250 plus a GTS 250 OC Edition.

    I have a 2900 right now as well as a 3870 X2. The 2900 is way faster than one of the 3870 X2's GPUs nearly everywhere, especially with any kind of AA. My view is what I see on my screen, sorry.

    So "half" is good, -20% is bad?

    The GTS 250 launch is as hard as it gets, because it's mostly the same 9800GTX+ -- you can go and get one right now.
    And I was talking about NV's new mobile offerings, not the GTS 250. AMD didn't announce any new desktop cards yesterday; they announced new mobile GPUs. And while the new G-line from NV is available right now, you'll have to wait until Q2 for AMD's offering.
     
  14. v_rr

    Newcomer

    Joined:
    Apr 30, 2007
    Messages:
    147
    Likes Received:
    0
    Yes it can. In some cases it's faster, in some it's slower. It depends on the game...



    The GTS250 is reducing power consumption by 50% at idle and load? :lol::lol:
    Dude, you must be dreaming.

    As far as DX10.1 goes, I remember ATI had 4 partnerships for new games with DX10.1 support. They have a deal with Blizzard as well, and Win 7 takes advantage of DX10.1 in Aero Peek.
    Also, DX11 will incorporate DX10.1 and tessellation, so by DX11's time this ATI GPU will be able to enable some DX11 features.



    RV740 is up and running on desktop and mobile. Is it late? That depends on the definition of late. Compared to Nvidia, it is very early.

    Nvidia + New = :lol:
    These days, be careful about what you buy from Nvidia. In case you don't know, the GTX280M and GTX260M are G92 based, not GT200 based.
    That amounts to eating the same crap again.

    As for AMD, it showed notebooks with the HD 4850, HD 4870 and HD 4870X2 (these are really new, not a fake name hiding an RV6xx architecture). It didn't spoil anything with a paper launch; it only showed plans for the first 40nm mobile GPU.

    The only chip taped out at 40nm so far is GT218, which is the very low end.
    The other chips are in an unknown state.

    RV740 taped out at the end of 2008, and yields are so good that the launch was pulled in from May to April, with desktop and mobile arriving at the same time. It looks to me like a very successful and early entry.
     
    #134 v_rr, Mar 4, 2009
    Last edited by a moderator: Mar 4, 2009
  15. mczak

    Veteran

    Joined:
    Oct 24, 2002
    Messages:
    3,016
    Likes Received:
    112
    Well, the older one had the option for more memory too (though it wasn't the reference design).
    Apart from that, personally I don't really mind a new name; it's just that the name suggests it's something it isn't. Something like 9810 GTX would have been more appropriate (or, if nvidia wanted that GTS moniker, maybe even GTS 1xx something). Viewed from a different perspective, a rename is actually good, since it lets you easily buy the newer card, which is usually what you want (certainly if the newer one isn't more expensive), in this case if only because of the lower power consumption (granted, some of the later 9800 GTX+ cards might be close, but the new one still seems to be a bit better). (WD, for instance, does the opposite: they sell completely different harddisks under the exact same model number, and it's a royal pain to be sure of getting the better one.)

    You're right that there isn't exactly a huge difference in the capabilities of GT200 and G92 in the areas that matter, but yes, I think everybody would prefer it if names suggesting the same generation actually came from the same generation.

    I dunno if they could have come out with something competitive based on GT200 actually (you can't ignore time to market and cost).

    I dunno, but I think that yes, some less informed people will think that.

    I really wonder what those GT2xx derivatives look like though...
     
  16. DeF

    DeF
    Newcomer

    Joined:
    May 3, 2007
    Messages:
    162
    Likes Received:
    20
    It is as useful as PhysX at the moment.

    OK, the step from the 8800GT to the 9800GT was made with just a simple sticker change and no price change (the 9800GT was even priced a bit higher than the 8800GT here in Poland). For you that's "keeping the performance level" (please keep in mind that a lot of 9800GT boards had a 65nm G92 on them). On the other hand, AMD's RV670 decreased die size, price and power consumption, and added DX 10.1 and UVD. But for you, what nV did is ten times more important for the end user than what AMD did. Can you explain how you came to that conclusion?

    What most people are trying to say is that review sites should inform users about what is good/bad/new/old/worth buying. And what nV is doing right now is just plain bad for the customer. They are trying to use misinformation to boost their sales: they are not offering anything new or better than their previous parts, but they try to present their old parts as something new. When reviewers see this, they should report it, not just pass along information handed to them by the manufacturer. I am aware that both sides have done something like this in the past, but that doesn't mean it is now ok to do it once more.
    Customers base their purchase decisions on reviews they find on the internet. They have the right to know everything about the product, and it's the reviewer's duty to present all the information necessary to make a good purchase decision. And please remember that it is very hard to change a first impression in the GPU industry. Really good marketing can do miracles for the manufacturer (nvidia's FX lineup) but can also be very bad for the end user.
     
  17. fbomber

    Newcomer

    Joined:
    Jun 9, 2004
    Messages:
    156
    Likes Received:
    17
    They should have kept the 9800GTX name and lowered the price to compete. Renaming it has just one effect: confusing consumers into buying the "new video card".
     
  18. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,492
    Likes Received:
    515
    Location:
    New York
    Those same people would think that regardless of whether it was G92b or some brand new GT2xx variant under the hood. That's the point many are missing. The chip under the heatsink is irrelevant to everybody but us.
     
  19. Subtlesnake

    Regular

    Joined:
    Mar 18, 2005
    Messages:
    270
    Likes Received:
    3
  20. v_rr

    Newcomer

    Joined:
    Apr 30, 2007
    Messages:
    147
    Likes Received:
    0
    Another:

    http://www.dailytech.com/article.aspx?newsid=14480

    I rest my case about who is early or late to 40nm when ATI is kicking off RV740 in one month on desktop + mobile.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.