NVIDIA shows signs ... [2008 - 2017]

Discussion in 'Graphics and Semiconductor Industry' started by Geo, Jul 2, 2008.

Thread Status:
Not open for further replies.
  1. digitalwanderer

    digitalwanderer Dangerously Mirthful Legend

    Has anyone started any "Intel is gonna buy out nVidia!" rumors lately? :|
     
  2. Arty

    Arty KEPLER Veteran

    You forgot IMO, which is far from the general opinion. Renaming as a whole isn't something that most people like; it creates mass confusion. And renaming a slower product as a faster one can be categorized as stretching the truth. Am I doing it right, Razor?

    Where? First let AMD get the damn things out and then start complaining. :roll:
     
  3. Arty

    Arty KEPLER Veteran

    This post is so funny. I don't think even Nvidia PR would bother with such a stretch .. :lol:
     
  4. Sontin

    Sontin Banned

    You are right. At this time they had two high-end parts: the 9800GTX+ and the GTX280. Makes sense. :roll:
    But I am waiting for the guy who will show me the difference between G92 and GT200.
    Until then, there is no difference between selling Juniper as Cypress and selling G92 as GT200.
     
  5. neliz

    neliz GIGABYTE Man Veteran

    I thought Juniper was sold as Broadway?

    What is it with nvidiots and their semantics these days?
     
  6. Sontin

    Sontin Banned

    Yeah, and they will call Broadway the 58xx mobile. :lol:
    But wait: they will rename Juniper to Broadway? :shock:

    http://forum.beyond3d.com/showpost.php?p=1373752&postcount=4904

    So it's okay to fool the customer with a Juniper chip, but it's very bad to do the same with G92?
     
  7. Dave Baumann

    Dave Baumann Gamerscore Wh... Moderator Legend

    Notebook graphics have always had separate codenames from desktop graphics; in previous years they have been using Mxx numberings. Notebook graphics product names have not been the same as desktop graphics names either - there may have been correlation and similarities, but they are not, and never have been, tied. Segmentation for notebooks is different from desktop: what would be considered a "performance" desktop part can often be seen as an "enthusiast" part by notebook vendors, and likewise down the line you often see a disjoint between the parts and their segments.
     
  8. Arty

    Arty KEPLER Veteran

    G92b is a straight shrink of G92, which was a modified shrink of the original G80. Calling it a G200 part is indeed funny. I don't think even the likes of razor, triniboy, florin, xman etc. would do it. :roll:
     
  9. Sontin

    Sontin Banned

    And G200 is a modified version of G92. And when will I see a list of the differences between G92 and GT200? :lol:
     
  10. Psycho

    Psycho Regular

    I guess this is a good place to start ;) http://beyond3d.com/content/reviews/51

    But part of the GTX2x0M rename problem is also that the G92-based 9800M (GT/GTX) already exists, and the difference between that and the "new" part is pretty minimal - nowhere near the G92->GT200 difference.
     
  11. ShaidarHaran

    ShaidarHaran hardware monkey Veteran

    My eyes are bleeding reading this page. Someone make the bad man go away.
     
  12. digitalwanderer

    digitalwanderer Dangerously Mirthful Legend

    Very probably never. :yep2:
     
  13. ChrisRay

    ChrisRay R.I.P. 1983- Veteran

    The only real tangible consumer benefit (i.e. not OEM benefit) is the power-saving tech of GT200 versus G92, plus the improved geometry shader. There are a few CUDA benefits. I'm not sure how mobile G92 fares in the power-saving department, but I wouldn't be surprised if it's improved.
     
  14. trinibwoy

    trinibwoy Meh Legend

    I'll admit I'm at a disadvantage here as I can't really bring myself to feel outrage over the renaming of a graphics card.

    Agreed, but that only works if the complaint has merit. Just in this thread alone many of the points raised are focusing on architecture, process node and other details that are completely irrelevant to the consumer. Does Nvidia ask a fair price for these renamed parts given their feature set and performance? If so none of that other stuff is relevant.

    I just don't know who the victim is in this case. Are there really people out there who care about die size and semiconductor processes yet base their purchasing decisions on model numbers? I don't care if my TV uses a 5 year old chip and I'm sure most people feel the same way about their graphics cards.
     
  15. neliz

    neliz GIGABYTE Man Veteran

    http://www.nvidia.com/object/product_geforce_gts_360m_us.html

    G92 gets rebranded as the GTS360M. Can anyone with a straight face still say this is good for consumers?

    You've got a complaint now, trini. G92M/GTX280M/GTS360M = DX10; the rest of the mobile 3xx parts are DX10.1.

    It hurts consumers: you could hope for DX11, since it's the all-new 300 series, or at least 10.1.
    It hurts the DX10.1 adoption rate (if there is any, given their stubbornness).
     
  16. Silent_Buddha

    Silent_Buddha Legend

    Wow, just... wow... So G92 is now branded as the exact same generation as Fermi will be, implying similar architecture and features.

    So it's not even going to be based on GT215? So all the way back to DX10.0, and not even DX10.1? Much less DX11?

    Yeah, way to think of your customers...

    Regards,
    SB
     
  17. neliz

    neliz GIGABYTE Man Veteran

  18. trinibwoy

    trinibwoy Meh Legend

    Not sure why I should complain about Nvidia's incompetence; they don't work for me :) Whether the part is named 360, 260 or 160, it is still the same part. Again, you're painting a picture of a customer who sees 3xx and assumes DX11 but isn't smart enough to read the spec sheet? Who are these people? I can appreciate that it's fun to highlight Nvidia's utter failure to execute on many fronts as of late, but I don't get the need to pretend to care about the poor uninformed customer (who isn't getting hurt by any of this, as far as I can tell).
     
  19. Sontin

    Sontin Banned

    Why does nobody read the feature tab?
    And I don't think they will use 2000MHz GDDR3 memory when the GTX280 has only 950MHz. :lol:
    And the new GT335 will use four full clusters and one cluster with only one vec8 unit? http://www.nvidia.com/object/product_geforce_gt_335m_us.html
    Your hate for nVidia is really funny.
    There is no logical reason for them to use G92b instead of GT215 when they only need the GT215 specification.
     
  20. Tchock

    Tchock Regular

    /face
    /palm
    /facepalm
     