Nvidia Ampere Discussion [2020-05-14]

I've never missed a downvote button until now. Reading a comment like this here, or in fact outside of the likes of wccftech, 4chan, etc. is heartbreaking to me.

Why, are you saying one could buy more than what was listed for the price of an Ampere card?
 
A lot.
All that juicy BOM bloat, mmmmmhm.
Delicious.
Not wrong, that is the rumor. But point taken, there will be lesser cards released, but no rumor on those prices yet.

Yes, it is wrong, because you didn't say 3090. And you can't make me not dislike that kind of fallacious comment, more typical of console fanboys over at those places.

PS: I'm talking about and disliking the comment. I've nothing against you or anything.
 
Not wrong, that is the rumor. But point taken, there will be lesser cards released, but no rumor on those prices yet.
The lesser cards also have housefire-mandated BOM bloat.
Jeez, the prices are not going down; PC gaming will go back to being a sikrit club.
 
Nah, Nvidia can just shift the model numbers down so consumers can feel like they're getting a good deal. You'll get some forum outrage and a few journalists pointing out the discrepancy compared to previous gen, but what are they going to do? Buy an AMD instead?
 
Well, I will be completely floored and made a fool of. A completely non-standard memory standard with a "surprise motherf*cker!" announcement. What a damned weird thing to do.

Especially as a 384-bit bus with 18 Gbps GDDR6 could get the high-end 3090 enough bandwidth by itself, at least going by the leaked performance. But hell, maybe this means they're doing the weird cut-down bus thing again. RTX Titan 2 with 24 GB RAM, 3090 with 10/20 GB? Or 11/22?

Will be interesting to see the announcement now. And I wonder if GDDR6X yield is low enough, or it's expensive enough versus normal, that the surprise announcement somehow makes sense.
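The back-of-the-envelope bandwidth math above is easy to check. A minimal sketch (the 384-bit / 18 Gbps figures are the thread's rumors, and the 320-bit cut-down config is a hypothetical for illustration, not a confirmed spec):

```python
# Peak GDDR6 bandwidth: bus width in bits / 8 (bits per byte) times the
# per-pin data rate in Gbps gives GB/s.
def bandwidth_gbps(bus_bits, speed_gbps_per_pin):
    """Return peak memory bandwidth in GB/s for a given bus and pin speed."""
    return bus_bits / 8 * speed_gbps_per_pin

print(bandwidth_gbps(384, 18))  # rumored 384-bit @ 18 Gbps -> 864.0 GB/s
print(bandwidth_gbps(320, 19))  # hypothetical cut-down 320-bit @ 19 Gbps -> 760.0 GB/s
```

So a plain 384-bit / 18 Gbps setup already lands at 864 GB/s, which is the basis for the "enough bandwidth by itself" point above.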
 
Nah, Nvidia can just shift the model numbers down so consumers can feel like they're getting a good deal. You'll get some forum outrage and a few journalists pointing out the discrepancy compared to previous gen, but what are they going to do? Buy an AMD instead?

Lol.
 
Second die on 3090 ?

Rumors are really going everywhere lol
There's supposedly some unidentified chip on the backside; it could be some power-delivery-related chip or something like that (at least that's far more likely than any sort of second die with memory controllers or something).
 
Well, I will be completely floored and made a fool of. A completely non-standard memory standard with a "surprise motherf*cker!" announcement. What a damned weird thing to do.

Is it a surprise though? I have to imagine AMD and memory industry insiders got wind of this a long time ago.
 
Just out of the blue (heavily intoxicated as usual): if you wanted to mount your memory chips only on the backside of the PCB, which would be faster: direct traces through the PCB, or some fat pipes to a repeater chip in the middle and then traces from there to the memory dies?
 
Is it a surprise though? I have to imagine AMD and memory industry insiders got wind of this a long time ago.

I mean, quite possibly. Which is why it weirds me out so much. It's like, why, what's the point? Who's really so blown away by a 17% jump in bandwidth over 18 Gbps, for assumedly more money, that you had to keep it a secret until the product announcement? The average consumer isn't; just like with power consumption, they don't know and don't care.
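For the record, that "17%" checks out against the rumored numbers. A quick sanity check (18 Gbps GDDR6 and 21 Gbps GDDR6X are both figures quoted in this thread, not confirmed specs):

```python
# Relative bandwidth gain of the rumored 21 Gbps GDDR6X over 18 Gbps GDDR6.
base_gbps = 18
fast_gbps = 21
jump = (fast_gbps - base_gbps) / base_gbps
print(f"{jump:.1%}")  # -> 16.7%, i.e. the ~17% cited above
```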
 
I mean, quite possibly. Which is why it weirds me out so much. It's like, why, what's the point? Who's really so blown away by a 17% jump in bandwidth over 18 Gbps, for assumedly more money, that you had to keep it a secret until the product announcement? The average consumer isn't; just like with power consumption, they don't know and don't care.
Well, those 18 Gbps chips don't even seem to be available, since everyone is using 14 (and one card 16) Gbps, but apparently these 19 and/or 21 Gbps chips actually are.
But it is indeed curious how they could keep this under wraps, and if JEDEC wasn't involved, how will it react to one memory manufacturer going solo and using their naming convention?
 