Has anyone started any "Intel is gonna buy out nVidia!" rumors lately? :|
You forgot the "IMO", which is far from being the general opinion. Renaming as a whole isn't something most people like; it creates mass confusion. And renaming a slower product as a faster one can be categorized as stretching the truth. Am I doing it right, Razor?
Lol, however would you entertain yourself if Nvidia ever shaped up?
With respect to the GTS360M, I wish people would just accept it as Nvidia's standard operating procedure. The feigned outrage about the renaming is getting far more annoying than the renaming itself. It's a victimless crime after all (i.e. the people who actually care what graphics card is in their laptop are going to know the deal).
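For anyone who wants to check for themselves: the driver reports the real PCI vendor/device ID no matter what the marketing name says, and that ID can be looked up to see which chip is actually inside. A minimal Windows sketch using standard DXGI adapter enumeration (purely illustrative; link against dxgi.lib, nothing here is specific to any one card):

#include <dxgi.h>
#include <cwchar>

int main() {
    IDXGIFactory *factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void **)&factory)))
        return 1;

    IDXGIAdapter *adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC desc;
        adapter->GetDesc(&desc);
        // VendorId 0x10DE is Nvidia; the DeviceId identifies the actual
        // silicon, whatever marketing name the driver string carries.
        std::wprintf(L"%ls  vendor=0x%04X device=0x%04X\n",
                     desc.Description, desc.VendorId, desc.DeviceId);
        adapter->Release();
    }
    factory->Release();
    return 0;
}

The device ID is what enthusiast databases key on, which is exactly why the renaming only fools people who never look.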
Or selling Juniper as 58xx mobile.
Where? First let AMD get the damn things out and then start complaining.
This post is so funny. I don't think even Nvidia PR would bother with such a stretch...
And which one was the performance part? Oh wait: they have been selling GT200 cards since June 2008.
G92b came in June. At the same time, nVidia launched the GT200 cards.
This post is so funny. I don't think even Nvidia PR would bother with such a stretch...
You are right. At that time they had two high-end parts: the 9800GTX+ and the GTX280. Makes sense.
But I am still waiting for the guy who will show me the difference between G92 and GT200.
Until then, there is no difference between selling Juniper as Cypress and selling G92 as GT200.
I thought Juniper was sold as Broadway?
http://forum.beyond3d.com/showpost.php?p=1373752&postcount=4904
* Broadway XT -> HD5870, 128-bit, GDDR5, 45W-60W
* Broadway Pro -> HD5850, 128-bit, GDDR5, 30W-40W
* Broadway LP -> HD5830, 128-bit, (G)DDR3, 29W
Broadway: Juniper-based, 800 SPs for HD5800, GDDR5/GDDR3/DDR3, 1 billion+ transistors
G92b is a straight shrink of G92, which was a modified shrink of the original G80. Calling it a GT200 part is indeed funny. I don't think even the likes of razor, triniboy, florin, xman etc. would do it.
Very probably never. And GT200 is a modified version of G92. And when will I see a list of the differences between G92 and GT200?
I am constantly surprised by the amount of slack you are prepared to cut Nvidia for what are pretty dishonest practices, practices we've seen and heard about from insiders for years now, and which are Nvidia's standard operating procedure.
Pointing it out (or "complaining" if you prefer) does change things. It gets the message out there, it gives people an understanding of how they are being conned by dubious business practices, and it helps people make informed choices to spend their money elsewhere on better products and with companies that don't treat their customers with such a level of disrespect.
http://www.nvidia.com/object/product_geforce_gts_360m_us.html
G92 gets rebranded as GTS360M. Can anyone with a straight face still say this is good for consumers?
You got a complaint now, trini. G92M/GTX280M/GTS360M = DX10; the rest of the mobile 3xx parts are DX10.1.
It hurts consumers: you could hope for DX11, since it's the all-new 300 series, or at least for DX10.1.
It hurts the DX10.1 adoption rate (if there is any, given their stubbornness).
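That mixed lineup is exactly what forces developers to probe at runtime instead of trusting the series number. A minimal sketch of the usual fallback dance using the stock D3D 10.1 API (Windows-only, illustrative; link against d3d10_1.lib):

#include <d3d10_1.h>
#include <cstdio>

int main() {
    // Try the 10.1 feature level first, then fall back to plain 10.0,
    // because the "300" name alone guarantees neither.
    const D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,
        D3D10_FEATURE_LEVEL_10_0,
    };
    for (D3D10_FEATURE_LEVEL1 level : levels) {
        ID3D10Device1 *device = nullptr;
        if (SUCCEEDED(D3D10CreateDevice1(nullptr, D3D10_DRIVER_TYPE_HARDWARE,
                                         nullptr, 0, level,
                                         D3D10_1_SDK_VERSION, &device))) {
            std::printf("Device created at feature level 0x%x\n", level);
            device->Release();
            return 0;
        }
    }
    std::printf("No DX10-class device available\n");
    return 1;
}

If the naming were honest, an app could key features off the series number; as it stands, the name tells you nothing the feature-level probe doesn't have to re-check anyway.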
High Performance GeForce DirectX 10.1 Graphics Processor
NVIDIA® GeForce ®GTS enthusiast class GPUs include a powerful DirectX 10.1, Shader Model 4.1 graphics processor, offering full compatibility with past and current game titles with all the texture detail, high dynamic range lighting and visual special effects the game developer intended the consumer to see. Water effects, soft shadows, facial details, explosions, surface textures and intricate geometry create cinematic virtual worlds filled with adrenalin pumping excitement. Of course all these special effects run at high resolution and playable frame rates for immersive heart-pounding action.
Why does nobody read the feature tab?
And I don't think they will use 2000 MHz GDDR3 memory when the GTX280 only has 950 MHz.
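For what it's worth, GDDR3 is double-pumped, and spec sheets very often quote the effective data rate, which is twice the physical clock, so the "2000 MHz" and the "950 MHz" may not be the same kind of number at all. A quick back-of-the-envelope sketch (the bus widths are assumptions; the clocks are the figures quoted in this thread):

#include <cstdio>

// GDDR3 transfers on both clock edges: effective rate = 2 x physical clock.
static double effective_mt_s(double physical_clock_mhz) {
    return 2.0 * physical_clock_mhz;
}

// Peak bandwidth in GB/s = transfer rate (MT/s) x bus width (bytes) / 1000.
static double bandwidth_gb_s(double rate_mt_s, int bus_width_bits) {
    return rate_mt_s * (bus_width_bits / 8.0) / 1000.0;
}

int main() {
    // If the "2000 MHz" on the GTS360M page is the effective rate, the
    // physical clock would be 1000 MHz (128-bit mobile bus assumed).
    std::printf("GTS360M: %.0f MT/s -> %.1f GB/s\n",
                effective_mt_s(1000.0), bandwidth_gb_s(2000.0, 128));
    // The GTX280 figure quoted above: 950 MHz physical on a 512-bit bus.
    std::printf("GTX280:  %.0f MT/s -> %.1f GB/s\n",
                effective_mt_s(950.0), bandwidth_gb_s(1900.0, 512));
    return 0;
}

Under that reading, the mobile part's "2000 MHz" is a 1000 MHz physical clock, which is entirely plausible, and comparing it directly against the GTX280's physical clock is apples to oranges.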