Radeon HD 4770 Reviews

Discussion in '3D Hardware, Software & Output Devices' started by Arty, Apr 28, 2009.

  1. Arty

    Arty KEPLER
    Veteran

    Joined:
    Jun 16, 2005
    Messages:
    1,906
    Likes Received:
    55
    All reviews go here. Post your favourite reviews and I'll add them here.

    Anandtech -> http://www.anandtech.com/video/showdoc.aspx?i=3553

    I'm sorry, what? :nope:
     
  2. homerdog

    homerdog donator of the year
    Legend Subscriber

    Joined:
    Jul 25, 2008
    Messages:
    6,294
    Likes Received:
    1,075
    Location:
    still camping with a mauler
    The name is perfect; I don't know what Mr. Wilson is on about :???:
     
  3. ShaidarHaran

    ShaidarHaran hardware monkey
    Veteran

    Joined:
    Mar 31, 2007
    Messages:
    4,027
    Likes Received:
    90
    Derek Wilson is a diehard NV fan. It's apparent in his writing.

    That being said, the only graphics cards I use right now are Nvidia: a GTX 285 in my primary rig, an 8800 GT in my secondary rig, and a 9600 GT in my g/f's desktop, which I use to play Titan Quest co-op with her when we're at her place.
     
  4. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,452
    Likes Received:
    10,357
    Whoa...idle power is respectable and load power is WAY down. That's pretty impressive.

    Faster than the 4830, and it sometimes challenges the 4850. It's just too bad they didn't call it the 4840, or even 4835.

    4835 might have made more sense, as there is precedent at AMD: the Phenom II AM3 parts, for example, have model numbers ending in 5.

    Still a very nice card.

    If AMD/ATI continues their current push of getting more performance into lower price brackets, there may come a time when game devs don't even have to give much thought to targeting extremely low-powered systems.

    Then again, maybe they don't want the x8xx models to be too closely associated with the 100 USD price point. Thus 4770 might make more sense for this price bracket.

    Regards,
    SB
     
  5. I.S.T.

    Veteran

    Joined:
    Feb 21, 2004
    Messages:
    3,174
    Likes Received:
    389
    Yeah, this thing seems like the new GeForce 9600 GT, only better once you take into consideration when it was released.
     
  6. Spaceman-Spiff

    Regular

    Joined:
    Aug 9, 2004
    Messages:
    299
    Likes Received:
    3
    Location:
    .bc.ca
    Since ATI is discontinuing the 4830, the name shouldn't be a problem anymore...
     
  7. Sound_Card

    Regular

    Joined:
    Nov 24, 2006
    Messages:
    936
    Likes Received:
    4
    Location:
    San Antonio, TX
    And it's not like they named it something higher than the HD 4830, either.
     
  8. I.S.T.

    Veteran

    Joined:
    Feb 21, 2004
    Messages:
    3,174
    Likes Received:
    389
  9. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,452
    Likes Received:
    10,357
    Just read the Techreport review and for the first time ever...

    I think I'm starting to actually HATE a graphics card company.

    WARNING: Full blown rant coming, please skip if you don't want to listen (read) to me raving like a looney. :p

    /rant on

    More and more I see how large a performance increase (or IQ increase in the case of Stalker Clear Sky) you can get from just going from DX10.0 to DX10.1...

    Take the HAWX benchmark in the Techreport review, for example. Going from DX10.0 to 10.1 put the card at least 1-2 performance categories higher. That's like getting a free upgrade to a far more powerful video card...

    And when I see things like that, I can't help wondering how the gaming landscape right now might be different if Nvidia had taken their head out of their arse and supported DX10.1.

    If we had not only ATI devrel but also Nvidia devrel working with developers to implement DX10.1, then quite a few more games would have the potential to run quite a bit faster no matter whose hardware you had.

    Hell, it might finally get people off XP. :p

    /rant off...

    Heck, the last time I remember a shift with such a large "in your face" effect was the move from DX8.0 to DX9.0.

    The sad thing is, we'll see this nice performance boost when all cards support DX11, and everyone will be proclaiming DX11 as the next coming... All the while, we could have had that for the past year or so if only Nvidia had also supported DX10.1.

    I really am finding it hard for the first time in my life to not actually feel hatred towards a computer hardware company...

    Regards,
    SB

    PS - Bleh...
     
  10. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    9,045
    Likes Received:
    1,120
    Location:
    WI, USA
    I dunno. When DX11 cards are around, we aren't really going to care about DX10.1 anymore, or about the retro cards that didn't support it. It'll be like DX8.1's short time of being worth mentioning, I think: GF4 vs. R8500/9000-9250. The GF4 was the better card in the end anyway, even without DX8.1 support. I don't think the majority of gamers are missing even DX10 support at this point, with most cards being used for DX9 gaming.

    I think DX11 cards are coming this year, too, so the time of DX10.1 is coming to an end. Maybe there'll be a DX11.1, which will be the next support annoyance!
     
  11. Silent_Buddha

    Legend

    Joined:
    Mar 13, 2007
    Messages:
    19,452
    Likes Received:
    10,357
    That's not the point. Had Nvidia taken their head out of their arses, EVERYONE would be enjoying faster speeds across the board in games as not only ATI devrel but also Nvidia devrel would be pushing and helping with DX10.1.

    But as it is, Nvidia is discouraging the use of DX10.1 since they don't support it. Thus EVERYONE gets to thank them for slower gaming.

    And it's not like they only missed it with one generation of cards, à la DX8.1. We've gone through two generations since the introduction of DX10.1, which actually does have a tangible performance increase over 9.x and 10.0.

    Enough that it's like getting a whole new video card.

    Sigh... I'll get over it eventually, but right now, Nvidia is on the short bus for me. Thank god FireGL is competitive now so I don't HAVE to get a Quadro again when I upgrade my workstation.

    Regards,
    SB
     
  12. swaaye

    swaaye Entirely Suboptimal
    Legend

    Joined:
    Mar 15, 2003
    Messages:
    9,045
    Likes Received:
    1,120
    Location:
    WI, USA
    Maybe it's telling of NVIDIA's priorities. It's obvious that they are going all out with CUDA and friends, more than ATI is pushing GPGPU, IMO. And since it takes years to design a GPU, perhaps D3D 10.1 just wasn't a priority at all back when they were starting up the R&D for the current chips.
     
  13. Tchock

    Regular

    Joined:
    Mar 4, 2008
    Messages:
    849
    Likes Received:
    2
    Location:
    PVG
    Looks like Relic stopped using nVidia DX10 shader libraries in DoW II?

    Normally nVidia cards win in Essence Engine (CoH) tests.
     
  14. neliz

    neliz GIGABYTE Man
    Veteran

    Joined:
    Mar 30, 2005
    Messages:
    4,904
    Likes Received:
    23
    Location:
    In the know
    I think they rewrote a lot of their engine. Way back when the first DX10 patch came out, DX10 was shyte for performance, and it ran even worse on ATI hardware. I think they've just had time to tweak the engine toward less NV-centric DX10 routines, and performance is up across the board.
     
  15. AlexV

    AlexV Heteroscedasticitate
    Moderator Veteran

    Joined:
    Mar 15, 2005
    Messages:
    2,535
    Likes Received:
    144
    Isn't DOW II DX9 only?
     
  16. Malo

    Malo Yak Mechanicum
    Legend Subscriber

    Joined:
    Feb 9, 2002
    Messages:
    8,933
    Likes Received:
    5,538
    Location:
    Pennsylvania
    It would be interesting to see how they compare with the shadows set one notch lower on both cards. The latest patch introduced new shadowing modes, which cause a huge performance drop.
     
  17. John Reynolds

    John Reynolds Ecce homo
    Veteran

    Joined:
    Feb 7, 2002
    Messages:
    4,491
    Likes Received:
    267
    Location:
    Westeros
    But why should they be envious? I bought a GTX 285 that I use in my main gaming rig but its PhysX support is just a marketing bullet point so far IMO. Seems to me that GPU-based physics support is just a tangled pissing match from which no one benefits.
     
  18. Scali

    Regular

    Joined:
    Nov 19, 2003
    Messages:
    2,127
    Likes Received:
    0
    I'm not saying they should be envious. But I see no reason why they have to drag PhysX down in every single thread, other than them being envious. It's getting rather tiresome.

    Aside from that, I think GPU-based physics will be a benefit to gamers in the future, perhaps just not yet today. But you can't deny that a PPU or GPU greatly increases the potential for physics processing compared to a CPU. At some point it will all come together with the right API, hardware, and game usage; it seems inevitable.
     
  19. Entropy

    Veteran

    Joined:
    Feb 8, 2002
    Messages:
    3,360
    Likes Received:
    1,377
    Scali. Seriously.

    Btw, I sidegraded to the 4770, and am happy to report that it performs well; in spite of its cheapo fan setup, it runs both cooler and quieter under load than the 4870 with its (quite good) reference fan. And the Catalyst Control Center lets you adjust fan speed, so if you like your computer really quiet, the thermal margins allow you to drop the fan speed further relative to stock. The compromise, unsurprisingly, lies in AA/AF performance, due to the lower bandwidth compared to the 4870.

    If you run XP like most people and have a DX9-level card, I can report that the 4770 performs just as well as the 7900 GTX, with 20 Watts lower power draw, and performs significantly better in games that are heavy on shaders. Still, if you have a G71 card, the benefits of the 4770 are fairly limited in practical terms for most games under XP. The factor-of-three increase in transistor count gives a small return; it would be interesting to see a dedicated DX9-level chip at 40nm.

    Compared to the 4870, I don't regret the sidegrade (apart from some as-yet not quite resolved problems with PCI-E 16x), but I can't help being curious about whatever nVidia will produce on 40nm. ATI has potentially left itself somewhat vulnerable on the bandwidth front, and with the lowered production values of the retail HD 4770. Right now, though, it's a good product, and IMHO the best performance/price and performance/watt compromise out there.
     
  20. Jawed

    Legend

    Joined:
    Oct 2, 2004
    Messages:
    11,729
    Likes Received:
    2,142
    Location:
    London
    Unless NVidia goes with GDDR5 on their 40nm chips, it'll be 190mm²+ GPUs chasing RV770, just to get close in terms of bandwidth.

    Now there are rumours that NVidia won't be introducing GDDR5 until GT300, and that they're leaving AMD unchallenged at this price/performance point on the desktop. It seems NVidia's 40nm efforts will be laptop-only, where heat is a more important metric than performance.

    Jawed
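
    For readers following the bandwidth argument, peak memory bandwidth is just memory clock × transfers per clock × bus width in bytes. Here's a quick sketch of the arithmetic (editor's example; the clocks used are the commonly cited reference figures, so treat the exact numbers as assumptions):

```python
def peak_bandwidth_gb_s(mem_clock_mhz, transfers_per_clock, bus_width_bits):
    """Peak memory bandwidth in GB/s: clock * data rate * bus width in bytes."""
    return mem_clock_mhz * 1e6 * transfers_per_clock * (bus_width_bits // 8) / 1e9

# HD 4770 reference config: 128-bit GDDR5 at 800 MHz (GDDR5 moves 4 transfers per clock)
print(peak_bandwidth_gb_s(800, 4, 128))   # 51.2 GB/s

# The same 128-bit bus with double-data-rate GDDR3 at 1000 MHz, for comparison
print(peak_bandwidth_gb_s(1000, 2, 128))  # 32.0 GB/s
```

    This is why GDDR5 lets a narrow (cheap) bus keep up: without it, matching RV770-class bandwidth means a wider bus, and a wider bus means a bigger die.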
     