R420 = R300

Discussion in 'Architecture and Products' started by Dimahnbloe, Apr 7, 2004.

  1. Evanescence

    Newcomer

    Joined:
    Mar 19, 2004
    Messages:
    20
    Likes Received:
    0
    Location:
    USA | Germany
    This R420 ad on the ATI webpage is really FUNNY!

    Reminds me of something like that:

    [IMG]
     
  2. Illissius

    Newcomer

    Joined:
    Apr 3, 2004
    Messages:
    3
    Likes Received:
    0
    Location:
    Hungary
    Doesn't nVidia have their Personal Cinema line? As far as I can gather, that's pretty much the counterpart to ATi's AIW cards (although admittedly less widespread / marketed).
     
  3. epicstruggle

    epicstruggle Passenger on Serenity
    Veteran

    Joined:
    Jul 24, 2002
    Messages:
    1,903
    Likes Received:
    45
    Location:
    Object in Space
    You just missed a great time to buy ATI stock; for that matter, even NV was a steal. I'd wait a while before putting money into either company. But that's just me, what do I know. ;)

    later,
    epic
     
  4. ANova

    Veteran

    Joined:
    Apr 4, 2004
    Messages:
    2,226
    Likes Received:
    10
    From all indications it does seem as though the NV40 will be faster than the R420, but that is mere speculation. Nothing is set in stone just yet. And even if the rumours are true, you also have to factor in the other details. I know I would rather have a somewhat slower card if that means a lot less power consumption. There is also the price to take into account.
     
  5. 3dcgi

    Veteran Subscriber

    Joined:
    Feb 7, 2002
    Messages:
    2,493
    Likes Received:
    474
    I won't roast you, but I will say you're wrong about the clock speeds. This might change more in the future if GPUs/VPUs rely more on branching. Besides the fact that VPUs and CPUs can't be directly compared here, increasing the clock speed doesn't make a CPU more efficient; increasing the pipeline depth does.
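    The clock-vs-efficiency point can be sketched with a toy throughput model (the numbers and the penalty model are my own illustration, not anything measured in this thread): a deeper pipeline buys a higher clock but lengthens the branch-misprediction penalty, so effective IPC drops and throughput grows less than the clock does.

    ```python
    # Toy model (illustrative numbers only): effective throughput = clock * IPC,
    # where IPC shrinks as the misprediction penalty (~pipeline depth) grows.

    def throughput_mips(clock_mhz, base_ipc, branch_rate, mispredict_rate, penalty_cycles):
        """Instructions/sec in millions, with a simple branch-penalty model."""
        # Average cycles per instruction lost to branch mispredictions.
        stall_cpi = branch_rate * mispredict_rate * penalty_cycles
        ipc = base_ipc / (1.0 + base_ipc * stall_cpi)
        return clock_mhz * ipc

    # Shallow pipeline: lower clock, short misprediction penalty.
    shallow = throughput_mips(2000, 1.0, 0.2, 0.1, 10)
    # Deep pipeline: 50% higher clock, but double the penalty.
    deep = throughput_mips(3000, 1.0, 0.2, 0.1, 20)
    print(f"shallow: {shallow:.0f} MIPS, deep: {deep:.0f} MIPS")
    # → shallow: 1667 MIPS, deep: 2143 MIPS
    ```

    The deep design is still faster in absolute terms, but a 50% clock bump bought only ~29% more throughput, i.e. per-MHz efficiency went down, which is the distinction the post is making.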
     
  6. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,532
    Location:
    Winfield, IN USA
    Thanks, I'm just thinking out loud...I'd do a LOT more homework before investing; I suck with money, so I tend to over-research.

    Can I still find it hilarious that people are already giving the next-gen title to nVidia based on leaked nVidia PR? Does no one remember the period just before the nV30 was originally supposed to be released back in the fall of '02?

    This isn't a case of "it isn't over until it's over", it's a case of "it's over before it's even begun"! :lol:
     
  7. Nick Spolec

    Newcomer

    Joined:
    Apr 7, 2004
    Messages:
    199
    Likes Received:
    0
    I agree. I think too many people want to give Nvidia the benefit of the doubt with the next cycle.

    Not me.

    After 2003, Nvidia doesn't deserve it. ATI deserves it, for putting out one hell of a product.

    Nvidia should have to PROVE itself first before getting a vote of support from everyone.

    I personally will wait and see what Nvidia comes out with before I kiss their feet again, and in what context (read: cheating drivers) their hardware gets its performance numbers.
     
  8. Rugor

    Newcomer

    Joined:
    May 27, 2003
    Messages:
    221
    Likes Received:
    0
    I'm waiting for both benchmarks and analysis thereof before I award the crown. However, in Nvidia's favor, I will say that I expect the NV40 to be faster than an R9800XT.
     
  9. ChrisRay

    ChrisRay R.I.P. 1983-
    Veteran

    Joined:
    Nov 25, 2002
    Messages:
    2,234
    Likes Received:
    26

    But those are the first truly shader-intensive games... I'm sure your R300 is up to it.
     
  10. Pete

    Pete Moderate Nuisance
    Moderator Legend

    Joined:
    Feb 7, 2002
    Messages:
    5,777
    Likes Received:
    1,814
    Well, if nV starts comparing the number of transistors to people in city XYZ on April 13th, then we'll know to duck and cover. ;)
     
  11. anaqer

    Veteran

    Joined:
    Jan 25, 2004
    Messages:
    1,287
    Likes Received:
    1
    Umma... FarCry being shader-intensive might not be the core problem here... I'll go out on a limb and say that T2k has become accustomed to cranking up AF/AA based on performance in previous titles, and FarCry is a bitch with AF/AA. (That's just a hunch though, don't shoot me if I'm off the mark.)


    As for DX:IW ... forget it man, that engine is sooo fucked up with the hacked-in dynamic shadow system, it's beyond pathetic.
    If only they left that out and concentrated on gameplay mechanics... but that's OT.

    | EDIT : stupid typo |
     
  12. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,532
    Location:
    Winfield, IN USA
    I think you're on the mark, FarCry runs pretty well on my 9700 Pro as long as I keep it at 4xAA 4xAF and no v-sync.
     
  13. Quitch

    Veteran

    Joined:
    Jun 11, 2003
    Messages:
    1,521
    Likes Received:
    4
    Location:
    UK
    Ah, but don't you see the logic, dig. Last time nVidia couldn't lose because they were the best. This time they can't lose because they lost last time, and nVidia don't like not being the best.

    Talk about advance purchasing through faith. It's amazing how much weight the nVidia brand still carries in the geek arena.
     
  14. digitalwanderer

    digitalwanderer Dangerously Mirthful
    Legend

    Joined:
    Feb 19, 2002
    Messages:
    18,992
    Likes Received:
    3,532
    Location:
    Winfield, IN USA
    Sad, ain't it? I'm still working on spreading the word, people will learn better than to trust in brand names one day and actually buy based on the hardware....


    ....when pigs fly. :roll: :(
     