NVIDIA shows signs ... [2008 - 2017]

Lol, however would you entertain yourself if Nvidia ever shaped up? :LOL:

With respect to the GTS360M, I wish people would just accept it as Nvidia's standard operating procedure. The pretend outrage about the renaming is getting far more annoying than the renaming itself. It's a victimless crime, after all (i.e. the people who actually care what graphics card is in their laptop are going to know the deal).
You forgot the "IMO," which is far from the general opinion. Renaming as a whole isn't something most people like; it creates mass confusion. And renaming a slower product as a faster one can be categorized as stretching the truth. Am I doing it right, Razor?

Or selling Juniper as 58xx mobile. :LOL:
Where? First let AMD get the damn things out and then start complaining. :rolleyes:
 
And which was the performance part? Oh wait: they have only been selling GT200 cards since June 2008. :rolleyes:
G92b came in June. At the same time, nVidia launched the GT200 cards.
This post is so funny. I don't think even Nvidia PR would bother with such a stretch... :LOL:
 
You are right. At that time they had two high-end parts: the 9800GTX+ and the GTX280. Makes sense. :rolleyes:
But I am waiting for the guy who will show me the difference between G92 and GT200.
Until then, there is no difference between selling Juniper as Cypress and selling G92 as GT200.
 
I thought Juniper was sold as Broadway?

What's it with nvidiots and their semantics these days?
 
Yeah, and they will call Broadway 58xx mobile. :LOL:
But wait: they will rename Juniper to Broadway? :oops:

* Broadway XT -> HD5870, 128-bit, GDDR5, 45W-60W
* Broadway Pro -> HD5850, 128-bit, GDDR5, 30W-40W
* Broadway LP -> HD5830, 128-bit, (G)DDR3, 29W

Broadway: Juniper-based, 800 SPs for HD5800; GDDR5, GDDR3 and DDR3; 1 billion+ transistors
http://forum.beyond3d.com/showpost.php?p=1373752&postcount=4904

So it's okay to fool the customer with a Juniper chip, but it's very bad to do the same with G92?
 
Notebook graphics have always had separate codenames from desktop graphics; in previous years they have been using Mxx numberings. Notebook graphics product names have not been the same as desktop graphics names either - there may have been correlation and similarities, but they are not, and never have been, tied. Segmentation for notebooks is different from desktop, and what would be considered a "performance" desktop part can often be seen as an "enthusiast" part by notebook vendors; likewise, down the line you often see a disjoint in the parts and their segments.
 
You are right. At that time they had two high-end parts: the 9800GTX+ and the GTX280. Makes sense. :rolleyes:
But I am waiting for the guy who will show me the difference between G92 and GT200.
Until then, there is no difference between selling Juniper as Cypress and selling G92 as GT200.
G92b is a straight shrink of G92, which was a modified shrink of the original G80. Calling it a G200 part is indeed funny. I don't think even the likes of razor, triniboy, florin, xman, etc. would do it. :rolleyes:
 
And GT200 is a modified version of G92. And when will I see a list of the differences between G92 and GT200? :LOL:
 
The only real tangible consumer benefit (i.e. not OEM benefit) is the power-saving tech of GT200 versus G92 and the improved geometry shader. There are a few CUDA benefits. I'm not sure how mobile G92 fares in the power-saving department, but I wouldn't be surprised if it's improved.
 
I am constantly surprised by the amount of slack you are prepared to cut Nvidia for what are pretty dishonest practices - practices we've seen and heard about from insiders for years now, and which are Nvidia's standard operating procedure.

I'll admit I'm at a disadvantage here as I can't really bring myself to feel outrage over the renaming of a graphics card.

Pointing it out (or "complaining" if you prefer) does change things. It gets the message out there, it gives people an understanding of how they are being conned by dubious business practices, and it helps people make informed choices to spend their money elsewhere on better products and with companies that don't treat their customers with such a level of disrespect.

Agreed, but that only works if the complaint has merit. In this thread alone, many of the points raised focus on architecture, process node and other details that are completely irrelevant to the consumer. Does Nvidia ask a fair price for these renamed parts given their feature set and performance? If so, none of that other stuff is relevant.

I just don't know who the victim is in this case. Are there really people out there who care about die size and semiconductor processes yet base their purchasing decisions on model numbers? I don't care if my TV uses a 5-year-old chip, and I'm sure most people feel the same way about their graphics cards.
 
http://www.nvidia.com/object/product_geforce_gts_360m_us.html

G92 gets rebranded as the GTS360M. Can anyone still say with a straight face that this is good for consumers?

You got a complaint now, trini. G92M/GTX280M/GTS360M = DX10; the rest of the mobile 3xx parts are DX10.1.

It hurts consumers: you could hope for DX11, since it's the all-new 300 series, or at least 10.1.
It hurts the DX10.1 adoption rate (if there is any, because of their stubbornness).
 
Wow, just... wow... So G92 is now branded as the exact same generation as Fermi will be, implying a similar architecture and feature set.

So it's not even going to be based on GT215? So all the way back to DX10.0, and not even DX10.1? Much less DX11?

Yeah, way to think of your customers...

Regards,
SB
 
You got a complaint now, trini.

It hurts consumers: you could hope for DX11, since it's the all-new 300 series, or at least 10.1. It hurts the DX10.1 adoption rate (if there is any, because of their stubbornness).

Not sure why I should complain about Nvidia's incompetence; they don't work for me :) Whether the part is named 360, 260 or 160, it is still the same part. Again, you're painting a picture of a customer who sees 3xx and assumes DX11 but isn't smart enough to read the spec sheet? Who are these people? I can appreciate that it's fun to highlight Nvidia's utter failure to execute on many fronts as of late, but I don't get the need to pretend to care about the poor uninformed customer (who isn't getting hurt by any of this as far as I can tell).
 
http://www.nvidia.com/object/product_geforce_gts_360m_us.html

G92 gets rebranded as the GTS360M. Can anyone still say with a straight face that this is good for consumers?

You got a complaint now, trini. G92M/GTX280M/GTS360M = DX10; the rest of the mobile 3xx parts are DX10.1.

It hurts consumers: you could hope for DX11, since it's the all-new 300 series, or at least 10.1.
It hurts the DX10.1 adoption rate (if there is any, because of their stubbornness).

Why doesn't anybody read the feature tab?
High Performance GeForce DirectX 10.1 Graphics Processor
NVIDIA® GeForce® GTS enthusiast class GPUs include a powerful DirectX 10.1, Shader Model 4.1 graphics processor, offering full compatibility with past and current game titles with all the texture detail, high dynamic range lighting and visual special effects the game developer intended the consumer to see. Water effects, soft shadows, facial details, explosions, surface textures and intricate geometry create cinematic virtual worlds filled with adrenalin pumping excitement. Of course all these special effects run at high resolution and playable frame rates for immersive heart-pounding action.
And I don't think they will use 2000MHz GDDR3 memory when the GTX280 has only 950MHz. :LOL:
And the new GT335M will use 4 full clusters and 1 cluster with only one vec8 unit? http://www.nvidia.com/object/product_geforce_gt_335m_us.html
Your hate for nVidia is really funny.
There is no logical reason for them to use G92b instead of GT215 when they only need the GT215 specification.
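
(A side note on checking this yourself: rather than trusting the number on the box, you can ask the driver what the silicon actually supports. Below is a minimal sketch, assuming a Windows machine with the Direct3D 11 runtime installed; the feature-level list and the output formatting are just illustrative, not taken from anyone's post here. A DX10.0 part like G92 should report 10.0, a GT21x part 10.1, and a DX11 part 11.0.)

// Illustrative sketch only: query the highest Direct3D feature level the
// installed GPU exposes, independent of its marketing name.
// Assumes Windows with the D3D11 runtime; links against d3d11.lib.
#include <windows.h>
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    // Ask for the levels we care about, highest first.
    const D3D_FEATURE_LEVEL requested[] = {
        D3D_FEATURE_LEVEL_11_0,   // DX11 class (Fermi, Cypress, ...)
        D3D_FEATURE_LEVEL_10_1,   // DX10.1 class (GT21x, RV7xx, ...)
        D3D_FEATURE_LEVEL_10_0,   // DX10.0 class (G80/G92, ...)
    };
    D3D_FEATURE_LEVEL got = D3D_FEATURE_LEVEL_10_0;
    ID3D11Device* device = nullptr;

    HRESULT hr = D3D11CreateDevice(
        nullptr,                       // default adapter
        D3D_DRIVER_TYPE_HARDWARE,
        nullptr, 0,
        requested, sizeof(requested) / sizeof(requested[0]),
        D3D11_SDK_VERSION,
        &device, &got, nullptr);       // no immediate context needed

    if (SUCCEEDED(hr)) {
        // Feature levels are encoded as 0xMm00, e.g. 0xA100 == 10.1.
        std::printf("Highest supported feature level: %d.%d\n",
                    (got >> 12) & 0xF, (got >> 8) & 0xF);
        device->Release();
    } else {
        std::printf("D3D11CreateDevice failed: 0x%08lX\n",
                    static_cast<unsigned long>(hr));
    }
    return 0;
}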
 