Nvidia G71 - rumours, questions and whatnot

Discussion in 'Pre-release GPU Speculation' started by ToxicTaZ, Dec 4, 2005.

Thread Status:
Not open for further replies.
  1. _xxx_

    Banned

    Joined:
    Aug 3, 2004
    Messages:
    5,008
    Likes Received:
    86
    Location:
    Stuttgart, Germany
    Mobo and CPU. Can't remember the drivers, that was some 3 months ago.
     
  2. Arty

    Arty KEPLER
    Veteran

    Joined:
    Jun 16, 2005
    Messages:
    1,906
    Likes Received:
    55
    Picture of the Leadtek 7900GT, the distributor feels that it will be a hard launch but may not have enough quantity.. :|
     
  3. Moloch

    Moloch God of Wicked Games
    Veteran

    Joined:
    Jun 20, 2002
    Messages:
    2,981
    Likes Received:
    72
Well, I gain no 3DMarks in 05 going from 2.4 to 2.5, but I gain about 200 marks from OCing my 7800 GT from 510/1164 to 520/1215.
    Totally GPU limited++
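Moloch's "GPU limited" conclusion can be sanity-checked with a little arithmetic (a sketch; only the clock figures quoted in the post are used — the percentages show how small an overclock it took to move the score while the CPU bump did nothing):

```python
# Percentage gains from the overclock quoted above (figures from the post).
core_old, core_new = 510, 520      # MHz, 7800 GT core clock
mem_old, mem_new = 1164, 1215      # MHz, effective memory clock

core_gain = (core_new - core_old) / core_old * 100
mem_gain = (mem_new - mem_old) / mem_old * 100

print(f"core: +{core_gain:.1f}%, memory: +{mem_gain:.1f}%")
# A ~2-4% GPU overclock yields ~200 marks; the CPU overclock yields none.
```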
     
  4. IbaneZ

    Regular

    Joined:
    Apr 15, 2003
    Messages:
    743
    Likes Received:
    17
    Cute cooler, but is it effective? :)
     
  5. HAL

    HAL
    Newcomer

    Joined:
    Nov 12, 2005
    Messages:
    103
    Likes Received:
    2
  6. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    215
    Location:
    Uffda-land
Can't quite tell for sure if that's a double from that picture... but the double-layer of heat pipes sure looks like it must be.
     
  7. Razor1

    Veteran

    Joined:
    Jul 24, 2004
    Messages:
    4,232
    Likes Received:
    749
    Location:
    NY, NY
That is a big bump up for just upping the clocks. The 7800 GTX 512 is only a couple hundred points higher than a 7800 GTX 256, and percentage-wise the upclock on the 7800 GTX 512 is larger yet yields a smaller percentage increase in 3DMark06 scores.

    Edit: nm, it seems to be just an up-clocked G70. Should have looked at the reviews before I posted ;)
     
    #1707 Razor1, Mar 5, 2006
    Last edited by a moderator: Mar 5, 2006
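Razor1's percentage argument can be illustrated with rough numbers (a sketch; the stock clocks and the baseline 3DMark06 score below are assumptions from memory of the era's specs, not from the post):

```python
# Clock uplift vs. score uplift, 7800 GTX 256MB -> 512MB.
# ASSUMPTIONS: 430 MHz and 550 MHz stock core clocks, and a ~5000-point
# baseline 3DMark06 score; the post only says "a couple hundred points".
gtx256_core, gtx512_core = 430, 550    # MHz (assumed stock clocks)
clock_uplift = (gtx512_core - gtx256_core) / gtx256_core * 100

score_base, score_gain = 5000, 200     # assumed baseline, quoted gain
score_uplift = score_gain / score_base * 100

print(f"clock: +{clock_uplift:.0f}%, score: +{score_uplift:.0f}%")
# A much larger clock uplift than score uplift suggests the benchmark
# is not purely GPU-clock bound at those settings.
```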
  8. HAL

    HAL
    Newcomer

    Joined:
    Nov 12, 2005
    Messages:
    103
    Likes Received:
    2
  9. dizietsma

    Banned

    Joined:
    Mar 1, 2004
    Messages:
    1,172
    Likes Received:
    13
That die size is half the size of the R580, if true. My measuring rule was not too far off earlier in this thread.. the 7600 looks like 125mm, so that cannot be a picture of the 7600 as was thought.
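The "half the size" estimate can be put into numbers (a sketch; the R580 area below is an assumption based on commonly cited figures of the time, not something stated in the post):

```python
# Implied G71 die area from the "half of R580" claim.
# ASSUMPTION: R580 is ~352 mm^2 (a commonly cited figure, not from the post).
r580_area = 352                 # mm^2, assumed R580 die area
g71_area = r580_area / 2        # the post's "half the size" claim

print(f"implied G71 die: ~{g71_area:.0f} mm^2")
```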
     
  10. Moloch

    Moloch God of Wicked Games
    Veteran

    Joined:
    Jun 20, 2002
    Messages:
    2,981
    Likes Received:
    72
  11. wireframe

    Veteran

    Joined:
    Jul 14, 2004
    Messages:
    1,347
    Likes Received:
    33
It's somewhat scary that people on an NV-centric forum are still not sure about the G70's (not G71's) ALU configuration. "It's 48... it's 24... it's 16... it's a half-ALU..."

    So, by the looks of this it seems that G71 is "just" a die shrink. The size looks surreal. Even before this I was eyeing the numbers and wondering how things would turn out in the Eternal Battle (ATI v NV) in the real physical world. On the surface, Nvidia has a huge advantage. They use fewer transistors at lower clocks and achieve something like absolute performance parity. Nvidia either chopped off all the right corners or the R5xx line has a lot of untapped performance lurking in the cellar (the remote possibility that the overall NV design is superior is reserved for a parenthesis). If G71 really is tiny and performs on par with R580, I smell possibly good times for the consumers (price war initiated by Nvidia) and some tough ones for ATI. This is somewhat strange because, assuming the G71 has no added features or fixes, I believe the ATI offering is overall more compelling (better AF mainly).
     
  12. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    You'd rather have that than the HDR+AA? Because personally I'd say that's the R5xx's most compelling advantage.

(OpenGL and Linux support being the G7x's.)
     
  13. Hubert

    Newcomer

    Joined:
    Sep 16, 2003
    Messages:
    151
    Likes Received:
    0
    Location:
    Transsylvania
Yup. R580 has the IQ advantage, and unless G71 has been the subject of a very unlikely architectural redesign, it will not be able to compete in this area. So it will compete in other areas: price, performance without HDR+AA, maybe lower power consumption.

    Funny how the one who generated all the HDR madness might now be responsible for its extinction. HDR is just not that hyped since ATI can do it (and do it better), is it?
     
  14. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Its extinction? HDR is going to become more and more important over the next couple of years.
     
  15. wireframe

    Veteran

    Joined:
    Jul 14, 2004
    Messages:
    1,347
    Likes Received:
    33
    Absolutely. Anti-aliasing/filtering are fundamental building blocks of producing high quality images. I'd much rather have all the basics down than any fancy extras before moving on. That is even true when considering raw performance to some extent, but that is a very difficult equation to nail down (because frame rate can have a massive impact on perceived image quality). I have nothing against HDR+AA, but there are two reasons why I think this is an overstated feature of the R5xx.

    First of all, we already have titles today, games based on the Source engine, where HDR and AA can coexist on any SM 2.0 hardware. It is merely a matter of speed. What is to say that other developers will not go down this same route?

    Secondly, I don't think this combination and even other features on top of that will become meaningful until Direct3D 10 is here and this will mean all new hardware anyways.

Sure, I want my Age of Empires III, Splinter Cell, and Far Cry with HDR and AA, but I still place this in perspective of overall image quality, and this is why I place AF so high on R5xx's list. What's the point of a geometry anti-aliased HDR image if we still have poor anisotropic filtering? Those remaining flaws in the image will become eyesores very quickly and stand out even more the better the rest of the image looks.

    I consider this "marketing blurb" on the R5xx as a bonus, but bonuses mean very little if you don't receive your base pay. If we reversed the situation by simply moving HDR+AA to the Nvidia side, I would still find the R5xx more compelling because without good AF that HDR+AA will mean very little to me. First things first.
     
  16. wireframe

    Veteran

    Joined:
    Jul 14, 2004
    Messages:
    1,347
    Likes Received:
    33
    Chalnoth already said it, but this statement is so far out I feel I must repeat his sentiment. HDR will eventually be a fundamental. All titles will be HDR and it won't be something you toggle. It's here to stay and be fully exploited even if the hype has died down (which I think is a good thing, especially if it is countered on the developers' side because good HDR is not supposed to be "noticed". You should only notice its absence when it's removed.)
     
  17. RobertR1

    RobertR1 Pro
    Legend

    Joined:
    Nov 2, 2005
    Messages:
    5,852
    Likes Received:
    1,297
It's important to have similar feature sets. If one company introduces a new IQ-improving feature, it's silly to state that the other company has no "need" to bring it to market. If both manufacturers had similar HDR+AA capabilities, developers would be much more inclined to add MSAA+HDR and release patches for existing games where this is possible.

    Right now, nvidia cards take a huge hit with just AA alone; add HDR to the mix and you can see why nvidia is convincing people that this feature is "not needed." As soon as nvidia tweaks their AA to be a lot more efficient, expect MSAA+HDR to suddenly be "needed." Consumers suffer the most when one company introduces features that the other can't/won't match, since it gives developers a lot less incentive to add and support the feature in their games.
     
  18. Mintmaster

    Veteran

    Joined:
    Mar 31, 2002
    Messages:
    3,897
    Likes Received:
    87
    This is what I've been saying for a while.

    ATI spent waaaaay too many transistors to get fast dynamic branching, which was the primary reason for their architectural decisions. Good for developers and advancing the state of 3D graphics, but bad for price/performance and for ATI's bottom line. It's time to sell ATI stock if you have any, because they just can't compete with the same bill of materials. NVidia will kill their margins throughout the lineup.

    ATI is so screwed, especially with the X1600XT. The 7600 will just blow it out of the water in any current game. I don't think the leaked 3DMark06 benchmark does justice to how big of a blowout this will be.

    Looking at those slides, I just knew NVidia would market the 7900 as a 48 ALU part. The FEAR fps is a bit shocking, but my guess is that's at an ultra high resolution without AA, possibly with "soft shadows" enabled. Maybe they have new drivers also, because given the heavy stenciling in that game, NVidia should be doing better right now.
     
  19. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    This argument doesn't apply to MSAA+HDR: it's basically nothing more than a toggle in the code. Challenge/time to implement is essentially nil.
     
  20. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Because it's not merely a matter of speed, but also one of quality, ease of programming, and ease of developing art assets.
     

  • About Us

    Beyond3D has been around for over a decade and prides itself on being the best place on the web for in-depth, technically-driven discussion and analysis of 3D graphics hardware. If you love pixels and transistors, you've come to the right place!

    Beyond3D is proudly published by GPU Tools Ltd.