Nvidia shows signs in [2023]

  • Thread starter Deleted member 2197
I wonder if Nvidia will use 24Gb modules, meaning they could in theory do a 30GB cut-down 5090, a full-fat 36GB 5090 Ti and a 72GB Quadro.
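For what it's worth, the arithmetic behind those capacities works out if you assume one memory module per 32-bit channel (a 24Gb module is 3GB) and clamshell mounting doubling the module count on workstation cards. A minimal sketch under those assumptions:

```python
# Sketch: VRAM capacity from bus width and module density.
# Assumes one module per 32-bit channel; a 24 Gb module is 3 GB.
def vram_gb(bus_width_bits, module_gb=3, clamshell=False):
    channels = bus_width_bits // 32               # 32-bit channels on the bus
    modules = channels * (2 if clamshell else 1)  # clamshell doubles module count
    return modules * module_gb

print(vram_gb(320))                  # cut-down 320-bit part -> 30
print(vram_gb(384))                  # full 384-bit part -> 36
print(vram_gb(384, clamshell=True))  # clamshell workstation card -> 72
```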
I doubt that we'll see VRAM size changes in the high end. 16+ GB is well enough for now. It's the low end where bus widths are limiting memory sizes below a comfortable level.
 
Do we have figures for 8K gaming frame buffer VRAM usage? I'm sure it's a lot, but by how much compared to 4K frame buffers? Can we simply multiply the 4K frame buffer size by four, given that 8K pushes exactly four times as many pixels as 4K, or is it more complicated than that?
 
It's more complicated than that. Textures, for example, are a certain resolution regardless of the rendering resolution, so they're a more or less fixed cost.
 
The frame buffer isn't the main consumer of VRAM these days, and while going from 4K to 8K does add a sizeable fixed cost in VRAM usage, it isn't really a problem for 16GB GPUs. So there's no apparent need to go higher than that if you're aiming at 8K (which is in itself a very questionable target to aim at).

The answer to the question of whether we'll see the high end going with more VRAM next gen will depend on how much these x1.5 capacity chips will cost. If it's x1.5 the cost or less, then it's possible. If it's more than x1.5, then there's no real need.
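The render-target part of the question does scale exactly 4x, since 7680x4320 is exactly four times the pixels of 3840x2160. A quick calculation, assuming a simple 4 bytes/pixel RGBA8 target (real engines use multiple targets in varying formats, which is where the "more complicated" part comes in):

```python
# Raw render-target size at 4K vs 8K, assuming 4 bytes per pixel (RGBA8).
def buffer_mb(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

mb_4k = buffer_mb(3840, 2160)   # ~31.6 MB per target at 4K
mb_8k = buffer_mb(7680, 4320)   # ~126.6 MB per target at 8K
print(mb_8k / mb_4k)            # 4.0 -- the pixel count scales exactly 4x
```

So the per-target cost quadruples, but since even several 8K targets total well under a gigabyte, it's a bounded fixed cost rather than a reason by itself to exceed 16GB.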
 
I believe Nvidia will market the 5090 as the definitive 8K gaming GPU. The 4090 does 8K, but on a case-by-case basis, so if they're positioning the 5090 as their 8K card, that raises the question of whether we need more VRAM for 8K gaming or not. Assuming it's needed for 8K, and given that lots of gamers prefer 4K with higher framerates (myself included), Nvidia might release two 5090 SKUs (one with more memory) like they did with many GPUs before.
 
That's a pretty awesome application for AI. It's kinda crazy that we're living through the AI revolution now, which I honestly believe will make all previous revolutions, with the possible exception of the agricultural one, pale in comparison.

For people of the generation I think many here belong to, i.e. those who can remember a time before mobile phones, the Internet and WiFi, we are living through by far the most remarkable transition to happen within a single human lifetime in all of human history.

How privileged are we!

Considering I grew up in a time before personal computers you could buy for your home existed, I have to feel that the home computing revolution trumps the AI revolution. :p But AI is certainly cool.

Regards,
SB
 

SK Hynix has started recruiting design personnel for logic semiconductors, such as CPUs and GPUs, reports Joongang.co.kr. The company is apparently looking to stack HBM4 directly on processors, which will change not only the way logic and memory devices are typically interconnected, but also the way they are made. In fact, if SK Hynix succeeds, this may largely change how the foundry industry works.
 
Q3 FY24 financial results continue the general trend of "number go up even more". 3x revenue y/y and 13.5x GAAP / 7x non-GAAP net income y/y isn't a bad result, all things considered.


GAAP ($ in millions, except earnings per share)

                              Q3 FY24    Q2 FY24    Q3 FY23    Q/Q         Y/Y
Revenue                       $18,120    $13,507    $5,931     Up 34%      Up 206%
Gross margin                  74.0%      70.1%      53.6%      Up 3.9 pts  Up 20.4 pts
Operating expenses            $2,983     $2,662     $2,576     Up 12%      Up 16%
Operating income              $10,417    $6,800     $601       Up 53%      Up 1,633%
Net income                    $9,243     $6,188     $680       Up 49%      Up 1,259%
Diluted earnings per share    $3.71      $2.48      $0.27      Up 50%      Up 1,274%

Non-GAAP ($ in millions, except earnings per share)

                              Q3 FY24    Q2 FY24    Q3 FY23    Q/Q         Y/Y
Revenue                       $18,120    $13,507    $5,931     Up 34%      Up 206%
Gross margin                  75.0%      71.2%      56.1%      Up 3.8 pts  Up 18.9 pts
Operating expenses            $2,026     $1,838     $1,793     Up 10%      Up 13%
Operating income              $11,557    $7,776     $1,536     Up 49%      Up 652%
Net income                    $10,020    $6,740     $1,456     Up 49%      Up 588%
Diluted earnings per share    $4.02      $2.70      $0.58      Up 49%      Up 593%
Data Center: Third-quarter revenue was a record $14.51 billion, up 41% from the previous quarter and up 279% from a year ago.
Gaming: Third-quarter revenue was $2.86 billion, up 15% from the previous quarter and up 81% from a year ago.

DC about 5x gaming revenue, DC + gaming is 95.87% of Nvidia's total revenue
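Those shares follow directly from the reported segment figures (segment revenues are rounded to the nearest $10M in the press release, so the share comes out at roughly 95.9%):

```python
# Check the segment-share arithmetic from Nvidia's reported Q3 FY24 figures.
total = 18120    # total revenue, $M
dc = 14510       # Data Center revenue, $M
gaming = 2860    # Gaming revenue, $M

print(round(dc / gaming, 2))                  # ~5.07 -> "about 5x" gaming
print(round((dc + gaming) / total * 100, 1))  # ~95.9% of total revenue
```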

Q2 FY24 for comparison https://nvidianews.nvidia.com/news/nvidia-announces-financial-results-for-second-quarter-fiscal-2024
 
Nvidia could give away their GPU and still have obscene margins
Their GPUs are the basis for all of their revenue and margins; they don't sell anything else.
And if you mean just GeForces, then sure, they could do it, and then they'd just stop making them altogether, because the only point in making any product is getting profit from it.
 
Insane numbers, but unsustainable long term. Once the datacenters are full of AI hardware, they'll go back to normal upgrade cycles. The only question is how long it will take to get there. Nvidia's best chance at sustaining this sort of revenue is to get into the AI cloud services game. Hardware alone won't do it.
 