Speculation and Rumors: Nvidia Blackwell ...

You're arguing semantics in response to a post from a moderator pointing out that such an argument really isn't adding to the conversation?

Revisit what exactly the debate is. What is it about die size that affects whatever it is people are discussing? Notions of 'high' and 'low' end don't need to enter into it, and so don't need defining to a point everyone can agree on to push the debate forwards. ;)
Without agreeing to such basic concepts, we literally cannot ever argue that Nvidia is deliberately exploiting naming to push lower tier products to a higher tier, thus being able to charge more for less.

We all know very well that's exactly what they're doing, but if we're going to play this silly semantic devil's-advocate contrarianism game where we're not allowed to mention how die size relates to a product's tier in the range stack, then we literally cannot even accuse them of it, because nobody could actually provide any evidence, and any such argument could be written off as entitled whining from people who have no room to complain.

Now if we just don't want any talk of pricing or Nvidia's scummy practices or whatever here, fine, but you should make that clear instead.
 
I've been testing out Lossless Scaling's 3x frame generation and I'm quite impressed with it. A base framerate in the low 40s is good enough to play, and I'm usually picky about input lag, though with integrated FG like DLSS/FSR the input latency would be higher.
Anyway, the gist is that Nvidia can promote Blackwell as doing 3x or even 4x FG, which will add to the performance difference and be displayed prominently in the slides. With how ubiquitous upscaling has become in the past few years, FG will be adopted the same way, and with how good the experience is with it on, it'll be seen more and more as 'real' performance.
Bigger multiples are neat, but you're running into big diminishing returns there in terms of visual fluidity, and they become much less useful unless you're running 240Hz+ displays. And the people with such displays who want to run games at 150fps+ are probably doing so for competitive-type games, where they actually do want the input lag benefits of genuinely high framerates.

I'd like to see the emphasis be on getting better information/reconstruction from lower framerates. One of the great things about DLSS2 vs FSR2 is how much better DLSS2 works from a lower resolution target. Something similar with framegen would really open up tons of overhead to push graphics and settings (like path tracing) to greater extremes, or give somewhat weaker hardware and consoles a good 60fps experience without heavy compromises. It might not be as good for fancy marketing 'number goes up' charts, but it would probably be more useful to more people.
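To put rough numbers on the diminishing-returns point above, here's a small back-of-the-envelope sketch (plain arithmetic, not tied to any particular FG implementation) of how little extra smoothness each additional generated frame buys:

```python
# Frame-time arithmetic for frame generation multiples. Output frame time
# shrinks with each step, but the absolute gain gets small fast, which is
# the "diminishing returns" in perceived fluidity (input latency still
# tracks the base framerate).

def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for base_fps in (40, 60):
    print(f"base {base_fps} fps = {frame_time_ms(base_fps):.1f} ms per rendered frame")
    for mult in (2, 3, 4):
        out_fps = base_fps * mult
        gain = frame_time_ms(base_fps * (mult - 1)) - frame_time_ms(out_fps)
        print(f"  {mult}x -> {out_fps} fps displayed, "
              f"{frame_time_ms(out_fps):.1f} ms per frame "
              f"(only {gain:.1f} ms smoother than {mult - 1}x)")
```

Going from 2x to 3x at a 40fps base only shaves about 4ms per displayed frame, and 3x to 4x about 2ms, while anything past 3x on a 60fps base already exceeds common 144-165Hz panels.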
 
Without agreeing to such basic concepts, we literally cannot ever argue that Nvidia is deliberately exploiting naming to push lower tier products to a higher tier, thus being able to charge more for less.

We all know very well that's exactly what they're doing, [...]

Now if we just don't want any talk of pricing or Nvidia's scummy practices or whatever here
I would personally find that very off-topic and prone to a lot of noise, but I can't see why it couldn't have its own dedicated topic if needed.

Also, why not avoid loaded terms and keep things factual? It's one thing to express a negative opinion (at times it might be welcome honesty) about prices and whatnot, and entirely another to label a company, or even just the practice, as evil.
 
I would personally find that very off-topic and prone to a lot of noise, but I can't see why it couldn't have its own dedicated topic if needed.

Also, why not avoid loaded terms and keep things factual? It's one thing to express a negative opinion (at times it might be welcome honesty) about prices and whatnot, and entirely another to label a company, or even just the practice, as evil.
Absolutely nowhere did I say anything about anybody being evil, come on now. Don't accuse me of using loaded terms while exaggerating what I was saying using... a loaded term.
 
Absolutely nowhere did I say anything about anybody being evil, come on now. Don't accuse me of using loaded terms while exaggerating what I was saying using... a loaded term.
You used "Nvidia's scummy practices", which implies assigning a moral qualifier to the company (or at least implies a pattern of reproachable actions). I replaced that with "evil" just for simplification; sorry if that made it sound like an accusation.
To me any such moral qualifiers are equally bad, as they distract from the factual discussion.

Unless we're explicitly discussing things like: company did X thing, is it right or wrong for the consumer?

Of course it might make sense and be valuable to deliberate the moral value of a given action, but not disguised as a side remark, piggybacking on some related factual point.
 
Consumers don’t like paying more for “any” reason. This is at odds with companies trying to make as much money as possible which is literally their primary reason for existence. The conflict is inevitable.

Blackwell prices will be determined based on whatever makes Nvidia the most money. Part of that is offering value to the buyer so they actually want to buy the thing. Nvidia is competing with themselves at this point so at every price point they will have to offer something to entice buyers in that segment.
 
Consumers don’t like paying more for “any” reason. This is at odds with companies trying to make as much money as possible which is literally their primary reason for existence. The conflict is inevitable.

Blackwell prices will be determined based on whatever makes Nvidia the most money. Part of that is offering value to the buyer so they actually want to buy the thing. Nvidia is competing with themselves at this point so at every price point they will have to offer something to entice buyers in that segment.

Oddly, customers can enjoy paying more. Price can reflect the psychological value a customer places upon a product, in that a higher-priced product must be worth more. The star example of this is the high-end fashion industry: put the word "Gucci" on a product and charge 10x more, and a certain class of people will be more willing to buy it than the same product at a tenth of the price.

Humans aren't necessarily "rational" when it comes to purchases. Nvidia exploits this with its PR. Instead of comparing benchmarks, which are a fungible commodity (no differentiation), they like to work with developers on modes that benefit their hardware versus others, such as in Alan Wake 2, or put out a lot of PR about how much better DLSS is than competitors, despite a blind test probably finding little difference in a lot of titles for the average consumer. My favorite is their FUD (fear, uncertainty, doubt) campaign that AMD cards "don't work well", when data shows Nvidia ends up with more driver bugs and crashes than AMD does.

Nvidia is well aware they can make their customers pay an irrationally high amount of money for a psychological premium and are happy to exploit it; AMD appears to be entirely unaware this effect even exists.
 
Oddly, customers can enjoy paying more. Price can reflect the psychological value a customer places upon a product, in that a higher-priced product must be worth more. The star example of this is the high-end fashion industry: put the word "Gucci" on a product and charge 10x more, and a certain class of people will be more willing to buy it than the same product at a tenth of the price.

Humans aren't necessarily "rational" when it comes to purchases. Nvidia exploits this with its PR. Instead of comparing benchmarks, which are a fungible commodity (no differentiation), they like to work with developers on modes that benefit their hardware versus others, such as in Alan Wake 2, or put out a lot of PR about how much better DLSS is than competitors, despite a blind test probably finding little difference in a lot of titles for the average consumer. My favorite is their FUD (fear, uncertainty, doubt) campaign that AMD cards "don't work well", when data shows Nvidia ends up with more driver bugs and crashes than AMD does.

Nvidia is well aware they can make their customers pay an irrationally high amount of money for a psychological premium and are happy to exploit it; AMD appears to be entirely unaware this effect even exists.
Wrong.. Nvidia just sells better lemonade than AMD.. you can't force people to buy AMD, they just don't like it.. Nvidia is a more trusted brand than AMD.
 
Oddly, customers can enjoy paying more. Price can reflect the psychological value a customer places upon a product, in that a higher-priced product must be worth more. The star example of this is the high-end fashion industry: put the word "Gucci" on a product and charge 10x more, and a certain class of people will be more willing to buy it than the same product at a tenth of the price.

Humans aren't necessarily "rational" when it comes to purchases. Nvidia exploits this with its PR. Instead of comparing benchmarks, which are a fungible commodity (no differentiation), they like to work with developers on modes that benefit their hardware versus others, such as in Alan Wake 2, or put out a lot of PR about how much better DLSS is than competitors, despite a blind test probably finding little difference in a lot of titles for the average consumer. My favorite is their FUD (fear, uncertainty, doubt) campaign that AMD cards "don't work well", when data shows Nvidia ends up with more driver bugs and crashes than AMD does.

Nvidia is well aware they can make their customers pay an irrationally high amount of money for a psychological premium and are happy to exploit it; AMD appears to be entirely unaware this effect even exists.
I'm sorry but this whole post is pure nonsense. Customers buy products based on their qualities. If they see these as higher/better than those from other suppliers, then they'll buy them even despite higher prices. This is the only reason why a more expensive product can sell the same as or better than its competing counterpart. It has zero to do with "enjoyment", "PR", or even "brand", although the latter does matter, just not in the case of such well-known and proven-to-be-highly-reliable companies as those which make dGPUs these days.
 
Mod Mode: The thread is architecture speculation. Shifty was in here earlier reminding folks and cleaning up; now I'm in here reminding folks and cleaning up.

If you want to discuss a topic unrelated to Blackwell architecture and rumors, then I very much encourage you to create a new thread. Next person to step in some doo-doo and stink up this thread is getting some time off.
 
Bigger multiples are neat, but you're running into big diminishing returns there in terms of visual fluidity, and they become much less useful unless you're running 240Hz+ displays. And the people with such displays who want to run games at 150fps+ are probably doing so for competitive-type games, where they actually do want the input lag benefits of genuinely high framerates.

I'd like to see the emphasis be on getting better information/reconstruction from lower framerates. One of the great things about DLSS2 vs FSR2 is how much better DLSS2 works from a lower resolution target. Something similar with framegen would really open up tons of overhead to push graphics and settings (like path tracing) to greater extremes, or give somewhat weaker hardware and consoles a good 60fps experience without heavy compromises. It might not be as good for fancy marketing 'number goes up' charts, but it would probably be more useful to more people.

While I'd like to see 8K or at least >4K resolutions becoming the focus with new cards, it's likely we'll remain on 4K with higher refresh rates. What with the flurry of OLED and 4K 240Hz monitors this year, I'd expect 4K 240Hz+ to be a real possibility with DP2.0 next year.
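As a rough sanity check on that 4K 240Hz expectation, here's some back-of-the-envelope link arithmetic; the DP 2.0 figures (UHBR20, 128b/132b encoding) are quoted from memory, so treat them as approximate:

```python
# Uncompressed bandwidth estimate for 4K @ 240 Hz, 10-bit RGB.
# Ignores blanking intervals, so the real requirement is somewhat higher;
# DSC would add plenty of margin on top.

pixels = 3840 * 2160
bits_per_pixel = 30                 # 10 bits per channel, RGB
refresh_hz = 240

payload_gbps = pixels * bits_per_pixel * refresh_hz / 1e9
print(f"active pixel data: {payload_gbps:.1f} Gbit/s")               # ~59.7 Gbit/s

# DP 2.0 UHBR20: 4 lanes x 20 Gbit/s raw, 128b/132b encoding
dp20_effective_gbps = 4 * 20 * (128 / 132)
print(f"DP 2.0 UHBR20 effective: {dp20_effective_gbps:.1f} Gbit/s")  # ~77.6 Gbit/s
```

Even before DSC there's headroom left over, so native 4K 240Hz over DP 2.0 looks plausible.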

Come to think of it, before DLSS3 you wouldn't really bother with the capabilities of the OFA, but now it ought to be treated as part of the architecture and speculated upon.🤔
 
While I'd like to see 8K or at least >4K resolutions becoming the focus with new cards, it's likely we'll remain on 4K with higher refresh rates.
And thank god for that! I can't imagine why anyone would want 8K anywhere as of now. You'd need to go with >40" on desktop and >100" as a TV to see any benefits IMO. It would just be a waste of processing power.
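For what it's worth, the size argument is just pixel-pitch geometry; here's a quick sketch (the size thresholds themselves are obviously still a matter of taste):

```python
# Pixels per inch for 16:9 panels of a given diagonal.
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    return math.hypot(width_px, height_px) / diagonal_in

for name, (w, h) in {"4K": (3840, 2160), "8K": (7680, 4320)}.items():
    for size_in in (27, 32, 40):
        print(f'{name} at {size_in}": {ppi(w, h, size_in):.0f} ppi')
```

An 8K panel doesn't drop to ~160 ppi (what 4K gives at 27") until around a 55" diagonal, which is roughly the reasoning behind the size thresholds above.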
 
And thank god for that! I can't imagine why anyone would want 8K anywhere as of now. You'd need to go with >40" on desktop and >100" as a TV to see any benefits IMO. It would just be a waste of processing power.
I was just mulling this same thought, and I have another group of niche players who would absolutely "want" 8K display support: VR and AR headset users. When it's placed just a few centimeters from your eyeball, that sort of pixel density becomes useful. Otherwise, I generally agree with you in that consumer display devices are in a bit of a race to the extreme for the pro-sumer end of things. I'd rather see display manufacturers work harder on getting HDR "right" versus just cramming more pixels into the same space.

However, that said, it wouldn't surprise me if NVIDIA Blackwell is targeting an 8K output frame. Whether any of us like it or not, NVIDIA wants those fat margins, and (one of) the better ways to sell overkill-expensive cards is to tout them with overkill capabilities. Now, that said, if it folds 2x faster than my 4090 at the same or less power... ;)
 
I can see why people would want 8K for a desktop monitor (27-42" sized, so about "normal"), but all such monitors should come with a "dual mode" function IMO, which would allow you to switch them to 1/4 of the resolution, possibly with 2x the Hz.
This way you could get both the PPI density for desktop things (like web browsing even) and a lower resolution which doesn't require a $5000 GPU for gaming applications.
I myself would probably get one of these in a ~40" size. But alas, there are none so far, and whatever 8K is on the market is completely unattractive.
VR, sure, but I think it's already a settled dispute that the current approach to VR doesn't sell, so I doubt anyone would target that.
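To put the dual-mode idea in throughput terms (simple pixel-rate arithmetic, ignoring blanking and link details, and assuming a 120Hz native 8K mode purely for illustration):

```python
# Raw pixel throughput: an 8K panel dropping to quarter resolution (4K)
# could double its refresh rate and still push half the pixels per second.

modes = {
    "8K @ 120 Hz": (7680 * 4320, 120),
    "4K @ 240 Hz": (3840 * 2160, 240),
}
for name, (pixels, hz) in modes.items():
    print(f"{name}: {pixels * hz / 1e9:.2f} Gpixel/s")   # ~3.98 vs ~1.99
```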
 
I don't think the 4090 would have enough memory to do DLSS3 frame generation at 8K. Judging by how much it uses at 4K, I think at 8K frame generation alone could use in excess of 12GB, leaving not enough memory (<12GB) to run the game at 8K.
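A rough sketch of that scaling argument, assuming FG's extra VRAM cost grows roughly linearly with pixel count and taking ~3.3GB at 4K as the reference point (both are assumptions, the latter from the Cyberpunk figure mentioned further down):

```python
# Back-of-the-envelope: scale frame generation's extra VRAM cost by pixel count.
# The 4K reference cost and the linear-scaling assumption are both rough guesses.

PIXELS_4K = 3840 * 2160
PIXELS_8K = 7680 * 4320            # 4x the pixels of 4K

fg_cost_4k_gb = 3.3                # assumed representative 4K figure
fg_cost_8k_gb = fg_cost_4k_gb * (PIXELS_8K / PIXELS_4K)

print(f"estimated FG cost at 8K: ~{fg_cost_8k_gb:.1f} GB")                      # ~13.2 GB
print(f"left over on a 24 GB card: ~{24 - fg_cost_8k_gb:.1f} GB for the game")  # ~10.8 GB
```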
 
If I had an 8K screen, I'd happily be a guinea pig to test it :)

Back to Blackwell -- I agree with DegustatoR regarding VR being a reaaaaally niche use case at this point in time, yet I also wager NVIDIA could sell another handful more Blackwell parts if some heavy VR users thought it would improve their experience. Honestly, the 4090 was a really nice step up even for my (crappy by current standards) Oculus Quest when playing MSFS2020. Again though, back to DegustatoR's point, I doubt Blackwell is "targeting" this demographic; rather I presume NVIDIA might mention it only because they can.

All that to say, I wager talk about 8K is surely gonna show up for Blackwell parts, certainly at the top end.
 
Impossible. A single 8K full screen buffer is only 250MB.
It's hard to find VRAM testing for framegen. According to this it's using 3.3GB at 4K in Cyberpunk.

[attached VRAM usage chart]

I might do some testing in Hellblade 2 on my 4070 some time this week. I can only do 1080p. I could use DSR to test higher resolutions but that's probably not good testing methodology.
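If it helps with that testing, here's a minimal sketch of a VRAM poller using the NVML Python bindings (nvidia-ml-py / pynvml); note it reports whole-GPU memory in use rather than a per-game breakdown, so you'd compare peaks with frame generation toggled on and off:

```python
# Minimal VRAM poller using NVML (pip install nvidia-ml-py).
# Run alongside the game and note the peak with FG on vs. off.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)   # first GPU

peak_gb = 0.0
try:
    while True:
        used_gb = pynvml.nvmlDeviceGetMemoryInfo(handle).used / 1024**3
        peak_gb = max(peak_gb, used_gb)
        print(f"used: {used_gb:5.2f} GB   peak: {peak_gb:5.2f} GB", end="\r")
        time.sleep(1.0)
except KeyboardInterrupt:
    print(f"\npeak VRAM observed: {peak_gb:.2f} GB")
finally:
    pynvml.nvmlShutdown()
```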
 