ATI RV740 review/preview

Discussion in 'Architecture and Products' started by LunchBox, Feb 25, 2009.

  1. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    Because it and the GT130/160 are being launched as NEW chips, with new benchmarks and reviews all around the web.

    The best example would be Apple putting a GT130M in their new workstations. A consumer does not know this "all new graphics powerhouse" is getting beaten to a pulp by his four-generation-old 8800GTX.
     
  2. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,583
    Likes Received:
    704
    Location:
    Guess...
    Well, mobile naming schemes are a mess on both sides, and I don't at all agree with how either company handles them.

    But I won't group the 250 with the GT130/160 because its name properly reflects its performance and feature set in the market.

    So what if NV is wrapping a product launch around the new name? They are not trying to hide the fact that this is a re-branded 9800GTX+ in the slightest (their own slides make this clear). And anyone who actually reads the reviews will realise straight away what this GPU is, so NV gains nothing other than creating awareness of the re-branding / price drop.

    The only people who won't realise the 250 is a re-branded 9800GTX+ are the people who don't read the reviews. No product launch = fewer/no reviews, so how exactly is the product launch an underhanded tactic from NV? If anything it's just making the knowledge of exactly what the 250 is more public.
     
  3. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,481
    Likes Received:
    500
    Location:
    New York
    And this same consumer would magically know that if it was a new chip? How does the poor guy know that his 7900 is faster than the brand new 8400GS? Can't believe ppl are still using this flawed logic.
     
  4. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    He does not know for certain but he can assume that the card which is branded as two generations newer (8X, 9X, GT1) should be faster than his current card since he's looking at the same range of machines.
     
  5. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26
    This is actually something Nvidia is trying to address with their new naming schemes. Currently it is unfriendly to the consumer to go just by numbering. However, Nvidia wants to put more emphasis on the extension rather than the numbers this time.

    Think

    "GTX Performance"
    "GTS Midrange"
    "GS Mainstream"

    The general idea is that someone with, say, a GeForce GTX 280 won't assume a GeForce "GS" 320 is better based on numbering alone. I think you'll see the extension come to mean a lot more.

    Think back to the GeForce "MX/Ti" days, where the difference between MX and Ti definitely mattered.
     
  6. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,481
    Likes Received:
    500
    Location:
    New York
    neliz: That's never been a valid assumption since the dawn of the GPU market. Assuming that every card in a new generation is faster than every card of the last is asking for trouble. That goes for most products, not just GPUs.
     
  7. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    Indeed. And that's why it happens over and over again, with the most shining example being the 5200. Two weeks ago I was talking to a guy who asked if he should stay with his FX5200 or if an HD3850 was worth the upgrade. People don't know, and marketing/brand awareness really, really helps sell the wannabe products.

    You can call it ignorance on a buyer's part, but it's hard convincing someone that "it's better because it's a GeForce" just doesn't fly.
     
  8. fbomber

    Newcomer

    Joined:
    Jun 9, 2004
    Messages:
    156
    Likes Received:
    17
    Because a person who already owns a 9800GTX will buy a GTS 250 thinking it's better (because it's supposedly newer)?
     
  9. trinibwoy

    trinibwoy Meh
    Legend

    Joined:
    Mar 17, 2004
    Messages:
    10,481
    Likes Received:
    500
    Location:
    New York
    Except that anybody who purchases graphics cards that often will not blindly pick up a GTS 250 and take it to the cashier. Upgrading every year is deeply in enthusiast territory. Everyone I know keeps cards for 3-5 years and even they ask questions before buying.
     
  10. silent_guy

    Veteran Subscriber

    Joined:
    Mar 7, 2006
    Messages:
    3,754
    Likes Received:
    1,379
    In chip lingo, a redesign typically means material changes to the original source code. For RTL, that's your Verilog code; for analog blocks, that's the low-level circuit design. If you keep the RTL the same but have to resynthesize, it's not called a redesign. Going from 65nm to 40nm with no RTL changes is not a big deal: you point your synthesis tool to a different library file and off you go.

    Years ago, I singlehandedly ported a large digital design from 0.25um to 180nm in about a week before sending it off to the back end. A pure cost reduction with not a single bit of functional change. For digital logic, it's sometimes that simple. In this particular case, the fab library had similar PLLs, so it was mostly a matter of search-and-replacing the few instances.
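
    To make the "no RTL changes" point concrete, here is a minimal hypothetical Verilog sketch (the module and its names are mine, not from any real design). Technology-independent RTL like this never names a process node, so retargeting it from 65nm to 40nm means handing the synthesis tool a different standard-cell library file, not editing the source:

        // Hypothetical technology-independent RTL. No process node appears
        // anywhere; the synthesis tool maps these constructs onto whatever
        // standard-cell library it is pointed at (65nm, 40nm, ...).
        module accum #(
            parameter WIDTH = 16
        ) (
            input  wire             clk,
            input  wire             rst_n,  // active-low async reset
            input  wire [WIDTH-1:0] din,
            output reg  [WIDTH-1:0] sum
        );
            always @(posedge clk or negedge rst_n) begin
                if (!rst_n)
                    sum <= {WIDTH{1'b0}};
                else
                    sum <= sum + din;  // adder cells come from the library; the RTL never names them
            end
        endmodule

    The node-specific part lives in the synthesis script (for example, the target_library setting in a Synopsys-style flow), which is why a pure digital shrink can be that cheap. Analog blocks have no such abstraction layer, hence the partial redesign discussed below.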

    But it's different for custom analog blocks: there the characteristics of the process can be so different that you have to make major changes. (For custom digital logic, you also need to make changes, but at least you can keep the functionality identical.)

    So it's more accurate to talk about a partial redesign.

    Now say you want to make your first chip in 40nm. What's going to be more painful? Take an existing digital design and only design new analog cells... or start with a completely new architecture but still design new analog cells?

    See? To claim that the port to 40nm was a disaster and that therefore a switch was made to a full new digital design is just ridiculous.
     
  11. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
    Interesting, now I understand your other post :) thx!
     
  12. Mariner

    Veteran

    Joined:
    Feb 6, 2002
    Messages:
    1,626
    Likes Received:
    261

    As fbomber mentions above, renaming a card to make it appear to be related to a newer family of cards gives the impression that it is an improvement on older models.

    It's all very well and good to say that a customer should research their purchases correctly and that it is their own problem if they don't buy the correct card, but we know that the "enthusiasts" (such as people who read these boards) are very much in the minority.

    I feel this renaming of cards by the big IHVs is an actively disingenuous attempt to sell more cards to the ignorant. Perhaps that's just down to my low opinion of executives and marketing drones though. :smile:
     
  13. Shadowmage

    Newcomer

    Joined:
    Sep 30, 2005
    Messages:
    60
    Likes Received:
    3
    I agree with what you said, but I do believe that NVIDIA does quite a bit of custom logic (shaders) as well as a lot of manual place and route. Since they already have 40nm GT21x coming up in the pipeline, maybe they considered doing the straight shrink but determined that it would take too long (thus running into the GT21x release schedule) or that it wasn't worth the effort.
     
  14. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    16,620
    Likes Received:
    5,633
    Except that for most people the GTX/GTS/GS itself is confusing. Every time I run into a user new to computer gaming I have to take time to explain this. And then add to that whether it has a (+) or not. And then whether it has a (216) or not. Then all the other prefixes or suffixes that may get randomly added, and it can get mighty confusing.

    ATI's current naming scheme is significantly easier to understand. Within any current generation, a card's relative standing is denoted just by a number: the higher the number, the more powerful the card. Although they still have the (x2) dual-chip cards. I'm not sure whether I'm more fond of that or of a straight number with no indication that it's a dual-chip card (295).

    Of course, as with both, you still have to explain that the current generation isn't comparable to a past generation in any significant way when trying to compare different performance levels. E.g. comparing a 3870 to a 4870 is valid since it's the same performance category; comparing a 3870 to a 4830 isn't. Again, the simple numbering scheme makes this relatively easy to explain.

    I look forward to the day that Nvidia gets rid of the whole GTX/GTS/GT/GS and (+)/(#)/(etc). Then again, given how constantly Nvidia tries to one-up the competition, I'm not sure Nvidia CAN use a simple naming scheme. :p

    Regards,
    SB
     
  15. Subtlesnake

    Regular

    Joined:
    Mar 18, 2005
    Messages:
    270
    Likes Received:
    3
    But if you do *any* research on the card, you'll realise it's just a rebrand. A simple comparison of the old and 'new' specs is enough, whether or not you understand what those specs mean.

    The only people caught out are those that will spend $129+ on a new graphics card simply because of the name. But how many people know they want to upgrade, but aren't prepared to find out anything about what they're upgrading to?
     
  16. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,583
    Likes Received:
    704
    Location:
    Guess...
    If that's true then it's true for every generation of GPU from both AMD and NV. The Radeon 4550 is not better than the Radeon 3870, and yet it would appear to be from its name alone. Do you therefore have a problem with the Radeon 4550? The same argument can be made for dozens of different low- and mid-range GPUs going back over a decade.

    It doesn't matter if the 250 is based on G92b or GT2xx; from a consumer point of view it is near enough identical, so it's completely comparable to the example above.

    As has been said many times already, if someone is ignorant enough to think a mid-range GPU of generation x is always going to be faster than a top-end GPU of generation x-1, then that person will make that mistake regardless of whether they are looking to buy a 250 based on G92b or a "new design" 250 based on GT2xx with the same performance.
     
  17. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,583
    Likes Received:
    704
    Location:
    Guess...
    But don't you think they would do that anyway even if the 250 was based on a new GT2xx design (that performed no better than the current 250)?
     
  18. CJ

    CJ
    Regular

    Joined:
    Apr 28, 2004
    Messages:
    816
    Likes Received:
    40
    Location:
    MSI Europe HQ
    Usually a new design also has other improvements which might be worth it to some people. For instance, look at the AMD high-end cards: X1950 -> HD2900 brought DX10 support; HD2900 -> HD3800 brought much lower power consumption, UVD and DX10.1; HD3800 -> HD4800 brought highly improved performance.

    So it wouldn't be unreasonable to expect new 'features' in a GTS 250 based around a GT2xx design. As it stands, it's just old wine in new bottles no matter how nVidia or anyone else tries to spin it. And even partners aren't very happy with it.
     
  19. pjbliverpool

    pjbliverpool B3D Scallywag
    Legend

    Joined:
    May 8, 2005
    Messages:
    7,583
    Likes Received:
    704
    Location:
    Guess...
    Yes and 9800GTX -> GTX280 brought highly improved performance.

    But we are talking about 9800GTX -> GTS 250, and there is no reason to expect improved performance in that case, any more than there is a reason to expect improved performance from HD3800 -> HD4600.

    And a GTS 250 based on GT2xx would have brought no meaningful improvements over the G92b version IMO. Although it would probably have come with a higher price tag in order to recoup the R&D spent on it.
     
  20. A.L.M.

    Newcomer

    Joined:
    Jun 2, 2008
    Messages:
    144
    Likes Received:
    0
    Location:
    Looking for a place to call home
    I think that CJ was talking about improvements in general... So the HD4600 did bring improvements (other than strictly performance-related ones) over the HD3800: smaller, lower power consumption, cheaper, easier to squeeze into notebooks, etc...
    The GTS250 brings no improvement over the 9800GTX+, other than its name.
     