Nvidia G71 - rumours, questions and whatnot

Discussion in 'Pre-release GPU Speculation' started by ToxicTaZ, Dec 4, 2005.

Thread Status:
Not open for further replies.
  1. Fox5

    Veteran

    Joined:
    Mar 22, 2002
    Messages:
    3,674
    Likes Received:
    5
    Varies from card to card and vendor to vendor.

    However, on the whole, the 6800s (it actually started with the FX series) have much better 2D than the earlier Nvidia cards.
    My old GeForce 3 had horrible image quality, 2D or 3D. Thankfully I'm on DVI now and don't have to worry about poor RAMDACs. Back in the day I considered my GeForce 3 a huge downgrade from my Voodoo 3: the image quality was just plain worse in every way, and on the early drivers the card had at the time, performance was just as terrible as it had been on my Voodoo. I doubt I was CPU-limited, since I had a 1.4 GHz T-Bird, though I was still on Windows 98 at the time, so maybe the 98 drivers weren't up to snuff.

    In fact, my old 3DMark scores report a fill rate of 348.5 MTexels/s single-texturing and 634.2 MTexels/s multi-texturing. Wow, that's just awful for a GeForce 3, which should be able to do roughly double that. (It's also interesting to see that my 2.2 GHz Barton gives about 4 to 6x the performance of the 1.4 GHz T-Bird in the more CPU-limited tests.)
    I've got later results from a 1.8 GHz Athlon XP on a new motherboard (DDR, with better AGP) that show the GeForce 3 at around its proper speed, instead of scores merely twice what my Voodoo 3 could do.
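
    To put numbers on "should be able to do roughly double that", here is a back-of-the-envelope sketch in Python. It assumes the stock GeForce 3 configuration (200 MHz core, four pixel pipelines with two TMUs each); the measured figures are the 3DMark numbers quoted above.

    # Theoretical GeForce 3 fill rate: pipelines x TMUs x core clock.
    CORE_CLOCK_MHZ = 200   # stock GeForce 3 core clock
    PIPELINES = 4          # pixel pipelines
    TMUS_PER_PIPE = 2      # texture units per pipeline

    single_tex = PIPELINES * CORE_CLOCK_MHZ                  # 800 MTexels/s
    multi_tex = PIPELINES * TMUS_PER_PIPE * CORE_CLOCK_MHZ   # 1600 MTexels/s

    # 3DMark figures reported above:
    measured = {"single-texturing": (348.5, single_tex),
                "multi-texturing": (634.2, multi_tex)}

    for test, (got, theory) in measured.items():
        print(f"{test}: {got} of {theory} MTexels/s ({got / theory:.0%})")

    Both measured figures land at roughly 40-45% of theoretical, which squares with the card running at around half its proper speed on that platform.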

    BTW, I'd like to note that ATI's third-party boards can have 2D image quality just as poor as Nvidia's. Nvidia and ATI are stricter with their board partners on quality than they used to be, but they still leave a large range. If you're buying a $20 card just for 2D work, don't be surprised if it isn't as good at 2D as a more expensive card. (And don't be surprised if your expensive card from a cheap brand isn't good at 2D either; I've heard of even 7800 GTs with subpar image quality.)
     
  2. Rys

    Rys PowerVR
    Moderator Veteran Alpha

    Joined:
    Oct 9, 2003
    Messages:
    4,163
    Likes Received:
    1,453
    Location:
    Beyond3D HQ
    So the DACs and filters for analogue used on all 7800 boards aren't exactly the same for all AIBs right now? Surprising.
     
  3. Fox5

    Veteran

    Joined:
    Mar 22, 2002
    Messages:
    3,674
    Likes Received:
    5
    Nope, though they have to be within certain limits now. Still, I've heard a couple of reports of people choosing a cheaper brand of X800 or 7800 and getting stuck with poor image quality.
     
  4. no-X

    Veteran

    Joined:
    May 28, 2005
    Messages:
    2,333
    Likes Received:
    290
    Crappy reference filters from nVidia?
    http://www.xbitlabs.com/articles/video/display/geforce7800-gt_4.html
     
  5. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    I find that somewhat odd. I've not noticed any 2D quality issues with any video card in the last few years, and I've used:
    Gainward GeForce4 Ti 4200
    ATI Radeon 9700 Pro
    eVGA GeForce 6800
    MSI GeForce 6600 GT
    BFG GeForce 7800 GT OC

    I actually expected a bit of a quality drop on the 7800 GT OC, since it uses dual DVI and I had to use an adapter, but I couldn't detect anything.
     
  6. Fodder

    Fodder Stealth Nerd
    Veteran

    Joined:
    Jul 12, 2003
    Messages:
    1,112
    Likes Received:
    9
    Location:
    Sunny Melbourne
    I assume it's DVI-I, in which case the extra pins carry an analogue signal anyway, so the only potential loss would come from the extra layer of connection.
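
    For reference, a minimal sketch of where that analogue signal sits on the connector. These are the standard DVI-I pin assignments, not specific to any particular board.

    # Standard DVI-I analogue contacts: the four pins clustered around
    # the flat ground blade beside the main digital pin grid.
    DVI_I_ANALOGUE_PINS = {
        "C1": "analogue red",
        "C2": "analogue green",
        "C3": "analogue blue",
        "C4": "analogue horizontal sync",
        "C5": "analogue ground (red/green/blue return)",
    }
    # Analogue vertical sync is carried on pin 8 of the main grid. A
    # passive DVI-to-VGA adapter is therefore just a re-wiring job:
    # the RAMDAC's output reaches the VGA plug electrically unchanged,
    # apart from the extra contact interface.

    for pin, signal in DVI_I_ANALOGUE_PINS.items():
        print(f"{pin}: {signal}")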
     
  7. Skinner

    Regular

    Joined:
    Sep 13, 2003
    Messages:
    871
    Likes Received:
    9
    Location:
    Zwijndrecht/Rotterdam, Netherlands and Phobos
    R580 is January the 24th.
     
  8. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    213
    Location:
    Uffda-land
    You have a source you can point at, or you're doing a Karnac thing?
     
  9. kemosabe

    Veteran

    Joined:
    Jun 19, 2003
    Messages:
    1,001
    Likes Received:
    16
    Location:
    Montreal, Canada
    It's last week's Chinese gospel. You're slipping, geo. :lol:
     
  10. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    213
    Location:
    Uffda-land
    Oooh, I think I did see that. Somebody linked it hereabouts. That seems ballparkish for what folks are expecting.
     
  11. KimB

    Legend

    Joined:
    May 28, 2002
    Messages:
    12,928
    Likes Received:
    230
    Location:
    Seattle, WA
    Yeah, to tell you the truth I expected something like that. And I suppose it is also a high-end 7800 GT, so it doesn't really apply to the complaint that some people with lower-end brands are seeing problems.
     
  12. Wallslide

    Newcomer

    Joined:
    Aug 7, 2002
    Messages:
    5
    Likes Received:
    0
    I personally would and do disable graphics settings until I get playable framerates at 1600x1200. My 21.3" LCD monitor's native resolution is 1600x1200, and the crispness of playing at the native resolution, even with little to no AA, is much better IMO than playing with high texture filtering and AA at a scaled (non-native) resolution.

    I played Counter-Strike very competitively for a number of years, and moved to Counter-Strike: Source once it came out. I do everything I can to increase my 1600x1200 framerate, including trying out lower DX modes, lowering all geometry and texture settings, and playing with the graphics driver settings.

    For reference, I have a 6800 GT overclocked to Ultra speeds, an ATI X800 Pro, and an Athlon 64 3500+ with 1 GB of RAM, so I could run the game with a lot of eye candy if I wanted to. It's just that I choose not to, because once my frame rate dips below 45 FPS I can really start to feel the game's responsiveness dwindle. Even below 60 FPS, I can often feel some sluggishness.

    By no means am I saying that all gamers follow my lead in lowering their eye candy for performance at high resolutions, but I know I'm not alone in this. As more of my friends switched to LCD monitors (they are so much easier to drag to LAN parties and tournaments), they took the same perspective as me, trying to eke out as much performance as possible at their monitors' native resolutions.

    I have to say that whether I'm playing a competitive multiplayer game or a single-player game does influence my stance on this issue. In a single-player game I'm willing to sacrifice some FPS for increased eye candy to enhance the game's immersiveness, but in neither case am I willing to go below 1600x1200.
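
    To put frame times on those thresholds, a minimal sketch; the frame rates are the ones mentioned above.

    # Frame time in milliseconds at a given frame rate: t = 1000 / fps.
    for fps in (30, 45, 60, 100):
        print(f"{fps:>3} fps -> {1000.0 / fps:5.1f} ms per frame")

    # 45 fps is ~22 ms per frame versus ~17 ms at 60 fps; that extra
    # ~5 ms per frame (plus any input-to-display latency on top) is
    # the sluggishness being described.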
     
  13. Skinner

    Regular

    Joined:
    Sep 13, 2003
    Messages:
    871
    Likes Received:
    9
    Location:
    Zwijndrecht/Rotterdam, Netherlands and Phobos

    I heard it from an insider in the industry (an employee of a vendor), and he has been right many times.
     
    Geo likes this.
  14. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    213
    Location:
    Uffda-land
    Cool, I'll take that over hkepc any day.
     
  15. Rys

    Rys PowerVR
    Moderator Veteran Alpha

    Joined:
    Oct 9, 2003
    Messages:
    4,163
    Likes Received:
    1,453
    Location:
    Beyond3D HQ
    The analogue signal is still generated, before it hits those pins, by a DAC and filter(s). If either or both of those are parts sourced by the board vendor, quality could be worse, per vendor and per SKU.
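
    To illustrate why filter quality matters more as resolution climbs, a rough sketch: the pixel clock the filter has to pass rises with resolution and refresh rate. The flat 30% blanking overhead below is an approximation; exact GTF/CVT timings differ slightly.

    BLANKING_OVERHEAD = 1.3  # ~25-30% of the raster is blanking

    def pixel_clock_mhz(width, height, refresh_hz, overhead=BLANKING_OVERHEAD):
        """Approximate analogue pixel clock in MHz for a display mode."""
        return width * height * refresh_hz * overhead / 1e6

    for w, h, hz in ((1024, 768, 85), (1280, 1024, 85), (1600, 1200, 85)):
        print(f"{w}x{h} @ {hz} Hz -> ~{pixel_clock_mhz(w, h, hz):.0f} MHz pixel clock")

    # An output low-pass filter with a cutoff too close to these
    # frequencies attenuates pixel-to-pixel transitions and smears
    # fine 2D detail, which is why cheap filter components show up
    # most at high resolutions and refresh rates.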
     
  16. Fodder

    Fodder Stealth Nerd
    Veteran

    Joined:
    Jul 12, 2003
    Messages:
    1,112
    Likes Received:
    9
    Location:
    Sunny Melbourne
    Sure, but it wouldn't be any worse than the signal from the D-SUB on that card.
     
  17. Dave Baumann

    Dave Baumann Gamerscore Wh...
    Moderator Legend

    Joined:
    Jan 29, 2002
    Messages:
    14,081
    Likes Received:
    651
    Location:
    O Canada!
    G71 gets a brief mention at Digitimes today - not really much new though:

     
  18. Geo

    Geo Mostly Harmless
    Legend

    Joined:
    Apr 22, 2002
    Messages:
    9,116
    Likes Received:
    213
    Location:
    Uffda-land
    Back to "NV50", eh? Saw a whisper somewhere they might go that route on naming. Not sure I like all that churning of code names. Well, no, that's not true --I dislike it intensely. :lol: It makes convos harder to have.
     
  19. ToxicTaZ

    Newcomer

    Joined:
    Mar 6, 2003
    Messages:
    52
    Likes Received:
    2
    Location:
    Edmonton
    Nvidia G71, 90 nm @ 750 MHz (GeForce 7900 Ultra 512).

    Is the NV50 not the G80?

    I thought the G80 was the NV50, or is that still coming?

    I still need more info on Nvidia's G71 (the R580 killer).
     
  20. kemosabe

    Veteran

    Joined:
    Jun 19, 2003
    Messages:
    1,001
    Likes Received:
    16
    Location:
    Montreal, Canada
    Killer? If G71 is no more than G70 with two extra quads and higher clocks (and I don't mean that in a belittling way), I don't think it will cast any large shadows over R580. :neutral:
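
    For scale, a back-of-the-envelope sketch comparing raw pixel-shader throughput, assuming the rumoured 750 MHz clock from earlier in the thread and taking "two extra quads" literally; G70's stock configuration (six quads at 430 MHz on the 7800 GTX) is the baseline.

    # Crude throughput scaling: quads x 4 pipes per quad x core clock.
    g70_quads, g70_clock_mhz = 6, 430   # GeForce 7800 GTX stock
    g71_quads, g71_clock_mhz = 8, 750   # rumoured G71: two extra quads, 90 nm clock

    g70 = g70_quads * 4 * g70_clock_mhz  # pipe-MHz, arbitrary units
    g71 = g71_quads * 4 * g71_clock_mhz

    print(f"G70: {g70} pipe-MHz")
    print(f"G71 (rumoured): {g71} pipe-MHz ({g71 / g70:.2f}x on paper)")

    On paper that's over 2x, but raw shader rate says nothing about memory bandwidth, which is often the limiting factor at high resolutions.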
     