NVIDIA Maxwell Speculation Thread

About the class action question, or how they have handled this whole story?

Because let's be honest, with or without a class action, Nvidia is now on a damage-control slide for their image...

Maybe this was a mistake at first, but in the end nobody wanted to correct it (I absolutely don't believe that nobody saw it; Nvidia is not Apple lol).
As for the class action: I also don't feel sorry for their technical marketing team. There are probably issues across the entire company and the marketing team shouldn't take all the blame, but one must take responsibility for what one says and does.

Nvidia still hasn't addressed this through any official channel (press release), and it may bite them later.
 
You mean this version requires the fewest assumptions?
Well, maybe you could put it that way also. I just feel calling it a lie is the overall simplest, easiest explanation.

If they wanted to lie, shouldn't they try harder to succeed? < ... >
And even then, why take such risks with a product that does not really need to look better?
I could only speculate as to their motives, which would be pointless. So I won't. All I'll say is that NV employs a lot of highly paid, extremely smart, technically well-versed people. That nobody over there caught this error (assuming that's what it is, rather than a deliberate lie), over the course of several months, feels very odd indeed. Do they live in a bubble over at NV HQ? No, of course not.

So of course they knew. Assume they didn't, and you'd have to conclude they're all a bunch of incompetent imbeciles over there, and that can't be, because how could they then develop such outstanding products in the first place? ;)

They're awesome, that's not an issue. It's just that they have shitty morals, mostly at the top of the company I'd say, that's all. They're monopolistic (they love proprietary crap which they can use to tax other manufacturers or their customers), and they have a tendency to try and pull fast ones on us when they think we're not looking.
 
With all the sudden attention on memory size (or the relative lack thereof), it's going to be interesting to see what happens when the first HBM GPUs have only HBM1 and thus 'only' 4GB. ;)
 
The only rumor I've heard is one of using HBM.

A few people did speculate about a "mix and match" VRAM setup, since HBMv1 is somewhat capacity-limited.
I think the general consensus, or at least my opinion, is that it would be quite complicated and would limit the power benefits of HBM.
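For context on why first-generation HBM tops out at 4GB, here is a back-of-envelope sketch using the commonly quoted HBM1 figures (assumed here, not taken from any confirmed product spec): four stacks of 1 GB each, a 1024-bit interface per stack, and 1 Gbps per pin.

```python
# Rough HBM1 arithmetic. All figures are assumptions based on the
# commonly quoted first-generation HBM numbers, not product specs.
STACKS = 4                # stacks on the interposer
GB_PER_STACK = 1          # HBM1 capacity per stack
BUS_BITS_PER_STACK = 1024 # interface width per stack
GBPS_PER_PIN = 1.0        # data rate per pin

capacity_gb = STACKS * GB_PER_STACK                       # total VRAM
bw_per_stack = BUS_BITS_PER_STACK * GBPS_PER_PIN / 8      # GB/s per stack
total_bw = STACKS * bw_per_stack                          # aggregate GB/s

print(capacity_gb, "GB,", total_bw, "GB/s")  # 4 GB, 512.0 GB/s
```

With those assumptions the capacity ceiling is 4 GB, which is what the post above is pointing at: to get more you'd need more stacks or the "mix and match" scheme people speculated about.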
 
A class action suit here would be the epitome of ridiculousness. All you have to do is decide if you'd still buy the SKU with its measured price/performance if you'd known about it up front. How could anyone that bought it originally now say they wouldn't, with a straight face? It's an incredible discrete graphics card with excellent performance for the money and frankly we've never had it better.

They've handled it fairly farcically, but given the state of the technology press today they had few options other than the ridiculous diagram and the call with Jonah. I feel sorry for their technical marketing team, who were thrown right under the bus for no reason.

Reviews didn't necessarily test cases that required more than 3.5GB of memory. So consumers would have expected graceful scaling up to 4.0GB, when in fact that is not how the card behaves.
 
Reviews didn't necessarily test cases that required more than 3.5GB of memory. So consumers would have expected graceful scaling up to 4.0GB, when in fact that is not how the card behaves.
I posit that even if they had known how it performs under those conditions, it'd still be bought. You'd just know to manage your game settings accordingly, just like you do to fit a game to the rest of the card's performance characteristics.
 
Thanks for taking the time to do this. From what I understand you underclocked the GTX 980 to match the GTX 970's theoretical performance and compared the two in memory-constrained cases. Based on your results for Watch Dogs at 4K, the 970's reduced bandwidth beyond 3.5GB really does hurt quite a bit. Of course, even the 980 can't get playable framerates, so it's not ideal, but then again few games really require more than 3.5GB at playable framerates, at least for now.
Yep, you got that right.
In Full HD, where memory allocation is less challenging, the card still does pretty well, while in UHD, where the 970 really needs to go all the way to 3.9-ish GiB, things get worse. Playable framerates there are an issue, correct, but OTOH it's actually pretty hard to get reproducibly into that space between 3.5 and 4.0 GiB where the card/driver can't simply handle allocation more strictly and stay below that line. But we're working on that too. :)

I posit that even if they had known how it performs under those conditions, it'd still be bought. You'd just know to manage your game settings accordingly, just like you do to fit a game to the rest of the card's performance characteristics.
While I generally agree, I think that some people might not have perceived the 970 as being as tempting an offer as it seemed before this limitation became known. Whether or not this would have influenced their purchase decisions I obviously cannot fathom.
 
I posit that even if they had known how it performs under those conditions, it'd still be bought. You'd just know to manage your game settings accordingly, just like you do to fit a game to the rest of the card's performance characteristics.

I don't disagree. However, it would have taken only a minor asterisk to save them from this rather painful story. Example:

http://www.nvidia.com/object/tesla-servers.html
2 With ECC on, 12.5% of the GPU memory is used for ECC bits. For example, 6 GB total memory yields 5.25 GB of user-available memory with ECC on.
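The footnote's arithmetic is simple: a fixed fraction of raw VRAM is reserved for ECC bits, and the 6 GB / 5.25 GB example corresponds to a 12.5% reservation. A one-liner to check it (the function name is mine, for illustration):

```python
# ECC carve-out arithmetic from the Tesla footnote: a fixed fraction
# of total VRAM is reserved for ECC bits when ECC is enabled.
def usable_memory_gb(total_gb, ecc_fraction=0.125):
    """Usable VRAM after the ECC reservation (12.5% assumed)."""
    return total_gb * (1.0 - ecc_fraction)

print(usable_memory_gb(6.0))  # 5.25
```

The point of the post stands either way: a single footnote of this kind on the 970's spec sheet would likely have defused the whole affair.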
 
My little devil's-advocate voice whispering inside is going... nah, it's easier/cheaper to say: "Your honor, it was all an internal miscommunication by our d'oh marketing department; it was never our client's intention to hope this would go unnoticed so the product wouldn't appear less appealing to our beloved and respected customers."
 
Considering the amount of GPU brainpower on this thread it is amusing that people who actually use the cards spotted it rather than you guys.

Perhaps they actually use the cards rather than just talk forever on what is under the hood?

Rather a sad conclusion on how Beyond3D has gone downhill since Dave left I think.
 
It would be impractical and unreasonable to require that posters buy every piece of hardware they want to discuss. This subforum is dedicated to talking about what is under the hood, so that criticism is pointless.
This being an enthusiast forum, it is also possible that some of those most likely to hit this case may have bought non-salvage models, different architectures, or different IHVs not affected by this, or do not typically run at settings that hit this corner case. It's a numbers game and the population is not that huge.
 
Considering the amount of GPU brainpower on this thread it is amusing that people who actually use the cards spotted it rather than you guys.
You find it amusing that those who use a product are the ones who notice issues with it? And not those who have to go by paper specs?

I don't call that amusing, but logical.

Perhaps they actually use the cards rather than just talk forever on what is under the hood?
You figured it out!

Rather a sad conclusion on how Beyond3D has gone downhill since Dave left I think.
Dave being or not being here has not changed the fact that I don't play games and thus have no need for a big GPU.
 
This issue was discovered by accident by the kind of peeps who usually play/bench/tweak their games all day with a dozen meticulously observed real-time stats in an OSD running almost as a fixture on their monitors. It's quite a different crop of users from what this forum usually attracts.
 
A class action suit here would be the epitome of ridiculousness. All you have to do is decide if you'd still buy the SKU with its measured price/performance if you'd known about it up front. How could anyone that bought it originally now say they wouldn't, with a straight face? It's an incredible discrete graphics card with excellent performance for the money and frankly we've never had it better.

They've handled it fairly farcically, but given the state of the technology press today they had few options other than the ridiculous diagram and the call with Jonah. I feel sorry for their technical marketing team, who were thrown right under the bus for no reason.

Let people decide what's important for them.

For many who don't upgrade as often, a card that is not future-proof is not an option. And they eventually wouldn't have bought the GTX 970 if they had known it has fewer ROPs or less usable memory than what Nvidia announced. Mistake or lie, Nvidia is fully responsible for their product's advertising.

Does Nvidia want to avoid a class action? How about providing a refund to those who demand to return their cards because of this "situation"?

That way, one can "decide if you'd still buy the SKU with its measured price/performance if you'd known about it up front".
 
Does Nvidia want to avoid a class action? How about providing a refund to those who demand to return their cards because of this "situation"?

That way, one can "decide if you'd still buy the SKU with its measured price/performance if you'd known about it up front".

Using your logic, wouldn't the end user then have to pay Nvidia going forward when new drivers improve the "price/performance"?
 
Nvidia has said they will give refunds to anyone who wants one, I think.
 
Very clear case of false advertising. They were very explicit when they said the card had 64 ROPs and 4GB on a 256-bit bus. The 64 ROPs claim is 100% false, and the rest also seems false to me: the correct figures are up to 224 bits of bus width for the 3.5GB and up to 32 bits for the other 512MB, and never 256-bit speed (adding the 512MB to the 3.5GB drops the bandwidth rather than adding to it, as it would on a 980, so 256-bit operation cannot be achieved).
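The bandwidth gap being described can be sketched numerically. Assuming the 970's stock 7 Gbps GDDR5 data rate, and the widely reported partition layout (3.5 GB on a 224-bit path, 0.5 GB on a 32-bit path, with the two not accessible simultaneously), the peak rate in either partition falls short of what a true 256-bit bus would deliver:

```python
# Illustrative GTX 970 memory bandwidth arithmetic, assuming the
# card's stock 7 Gbps GDDR5 data rate and the reported 224-bit /
# 32-bit partition split. Because the two partitions reportedly
# cannot be read at the same time, their rates do not add up.
DATA_RATE_GBPS = 7.0  # assumed effective GDDR5 data rate per pin

def bandwidth_gbs(bus_bits):
    """Peak bandwidth in GB/s for a given bus width."""
    return bus_bits / 8 * DATA_RATE_GBPS

full_256  = bandwidth_gbs(256)  # what a real 256-bit bus would give
fast_part = bandwidth_gbs(224)  # the 3.5 GB partition
slow_part = bandwidth_gbs(32)   # the 0.5 GB partition

print(full_256, fast_part, slow_part)  # 224.0 196.0 28.0
```

So under these assumptions the card peaks at 196 GB/s in its fast partition and drops to 28 GB/s in the slow one, never reaching the 224 GB/s that the advertised 256-bit figure implied.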
 