Current Generation Games Analysis Technical Discussion [2023] [XBSX|S, PS5, PC]

That’s fair. Portal benefits from optimizations like SER which tilt the scale in Lovelace’s favor so maybe it is an outlier. Future architectures could have even more tricks in store though.



I’m also banking on accelerated investment in RT hardware relative to other parts of the chip.

I have the opposite view on future scalability of raytracing hw. The hardware is in its infancy and the software is even less mature. We’re at the GeForce 3 level of RT right now.

The main thing dampening my optimism is the continued poor utilization of graphics hardware. RT adds yet another source of divergence and latency. In that respect I do agree we’re facing diminishing returns unless transistors are also spent on making better use of the massive hardware resources already available. There’s no reason a 4090 should be getting only 66fps in Dead Space. Absolutely ridiculous.
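To make the divergence point concrete, here's a toy CUDA sketch (purely illustrative, not real traversal code; the kernel and data are made up): when each ray in a warp does a data-dependent amount of work, the whole warp runs at the pace of its slowest lane while the rest sit idle, which is exactly the utilization problem SER-style reordering is meant to mitigate.

```cuda
// Toy illustration of ray divergence, not a real BVH traversal.
// Each thread ("ray") loops a data-dependent number of steps, so a warp of
// 32 threads takes as long as its slowest lane while the other lanes idle.
#include <cuda_runtime.h>

__global__ void divergentTraversal(const int* stepsPerRay, float* out, int rayCount)
{
    int ray = blockIdx.x * blockDim.x + threadIdx.x;
    if (ray >= rayCount) return;

    float accum = 0.0f;
    // Data-dependent trip count: neighboring rays that hit different geometry
    // walk different paths, and the SIMD hardware serializes the difference.
    for (int step = 0; step < stepsPerRay[ray]; ++step) {
        accum += __sinf(accum + (float)step);  // stand-in for a node visit / hit test
    }
    out[ray] = accum;
}
```

Sorting or regrouping rays so that the lanes of a warp take similar paths (roughly what SER does on Lovelace) is one way to claw some of that utilization back.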
Using a performance outlier isn't a good representation of future progress because most software won't align with that performance trend. In a hypothetical future where performance does continue to scale beyond 1 nm, will that be at a cost structure that allows a large enough market for consoles to be produced? If the only option is to sell $5K GPUs drawing 2 kW of power, it's irrelevant because there won't be a large enough addressable market to support games being made for it.
 
Using a performance outlier isn't a good representation of future progress because most software won't align with that performance trend. In a hypothetical future where performance does continue to scale beyond 1 nm, will that be at a cost structure that allows a large enough market for consoles to be produced? If the only option is to sell $5K GPUs drawing 2 kW of power, it's irrelevant because there won't be a large enough addressable market to support games being made for it.

The market for GPUs isn’t going to price itself out of existence. TSMC made $35 billion last year, so there’s clearly room to negotiate.
 
The market for GPUs isn’t going to price itself out of existence. TSMC made $35 billion last year, so there’s clearly room to negotiate.
And $17bn of that came from TSMC's biggest customer, Apple, alone, with another $10bn from ARM or ARM-derived SoCs; then there's RAM. GPUs are a pretty small part of TSMC's business.
 
Late to the discussion, but in terms of deprecating rasterization support, what are we even talking about specifically on the hardware side? And how much of that could be adequately run in software via the shader cores as performance grows (essentially negating the need for deprecation)? Haven't there been rumors (or patent readings?) regarding compute-based ROPs?
 
The market for GPUs isn’t going to price itself out of existence. TSMC made $35 billion last year, so there’s clearly room to negotiate.
This is at odds with the common stance here that 5nm is so expensive Nvidia can't lower prices of 4000 series GPUs. Outside of the 4090 they weren't even able to offer a worthwhile improvement in performance or value. It’s only going to get more expensive for Nvidia on smaller nodes.
 
Outside of the 4090 they weren't even able to offer a worthwhile improvement in performance or value. It’s only going to get more expensive for Nvidia on smaller nodes.

I would disagree with that. In the UK the RTX4070ti is just under double the cost of an RTX3060ti but offers double the ray tracing performance or more.

So in some respects the price/performance ratio is about the same or slightly better even though the absolute price has increased.
 
This is at odds with the common stance here that 5nm is so expensive Nvidia can't lower prices of 4000 series GPUs.
Well that's cuz this is a ridiculous, garbage argument with little basis in reality.

Nvidia is simply hoping that gamers will ultimately 'give in' so they can set their current practices as precedent going forward. They may take a bit of a sting at the moment, but it'll pay off big-time later if they can normalize such insane pricing. Sure, it'll probably lead to a bit fewer GPUs sold overall, but a smaller hit here can be made up for with their hugely increased margins.

Basically playing a giant game of chicken with the gaming market at the moment. And I'm seriously worried they will win, cuz gamers/consumers are historically quite weak-willed.
 
Another day.. another


Yeah, I'm surprised Private Division opted to go with their own developers to put out this update; most likely it's all driven by minimizing costs while attempting to maximize extracted profits.
 
Late to the discussion, but in terms of deprecating rasterization support, what are we even talking about specifically on the hardware side? And how much of that could be adequately run in software via the shader cores as performance grows (essentially negating the need for deprecation)? Haven't there been rumors (or patent readings?) regarding compute-based ROPs?

Nanite is proof that rasterization can already be done via compute without the help of fixed-function ROPs, by using 64-bit atomics instead. It’s not as fast as a hardware rasterizer for general usage but maybe that doesn’t matter. Raster hardware either improves to better handle small triangles or gets replaced by pure compute sooner or later.
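For anyone wondering what "rasterizing in compute with 64-bit atomics" looks like in practice, here's a minimal CUDA sketch of the general trick, not Nanite's actual code (UE5 does this in HLSL, and the triangle setup/coverage stage, names and packing layout here are my own assumptions): pack depth and a payload into one 64-bit word and let a single atomic max per sample act as both the depth test and the "ROP".

```cuda
// Minimal sketch of a ROP-less, compute-only rasterization resolve:
// each covered sample writes (depth, payload) packed into one 64-bit value
// with atomicMax, so the nearest surface wins per pixel without any
// fixed-function blend/ROP hardware. Assumes reversed-Z (closer = larger).
#include <cstdint>
#include <cuda_runtime.h>

// Hypothetical packing: depth in the high 32 bits, triangle/cluster ID in the
// low 32 bits. With depth in the most significant bits, an unsigned max on
// the packed value also performs the depth test.
__device__ __forceinline__ unsigned long long packVisible(float reversedZ, uint32_t triangleId)
{
    // For non-negative floats the IEEE-754 bit pattern is monotonic,
    // so comparing the integer bits compares the depth.
    uint32_t depthBits = __float_as_uint(reversedZ);
    return (static_cast<unsigned long long>(depthBits) << 32) | triangleId;
}

// One thread per covered sample; triangle setup / coverage is omitted.
__global__ void visibilityResolve(unsigned long long* visBuffer, int width,
                                  const int2* pixel, const float* reversedZ,
                                  const uint32_t* triangleId, int sampleCount)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= sampleCount) return;

    int dst = pixel[i].y * width + pixel[i].x;
    // The entire "ROP": one 64-bit atomic max per sample.
    atomicMax(&visBuffer[dst], packVisible(reversedZ[i], triangleId[i]));
}
```

Shading then happens in a later pass by reading the winning ID back out of the visibility buffer, which is why this style of rasterizer needs no blending hardware at all for opaque geometry.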

This is at odds with the common stance here that 5nm is so expensive Nvidia can't lower prices of 4000 series GPUs. Outside of the 4090 they weren't even able to offer a worthwhile improvement in performance or value. It’s only going to get more expensive for Nvidia on smaller nodes.

The majority of GPUs being sold today aren’t 4000 series. They don’t represent the market. In any case there was clearly an improvement in value. The $800 4070 Ti delivers the same performance as the $1500 3090 and has more features and lower power consumption. It’s miles better than what you could get for $800 last year.
 
Basically playing a giant game of chicken with the gaming market at the moment. And I'm seriously worried they will win, cuz gamers/consumers are historically quite weak-willed.

It really doesn’t have anything to do with will. Everyone always wants to pay less for stuff. For something like a GPU, though, the second you hand over cash you’re admitting that the price is fair “for you”. You can’t buy a toy and then grumble that it’s overpriced, because if it were you wouldn’t have bought it. I have no sympathy for people who don’t walk the talk.

Higher GPU prices, whether from retailers or scalpers, can’t happen without buyers.
 
The majority of GPUs being sold today aren’t 4000 series. They don’t represent the market. In any case there was clearly an improvement in value. The $800 4070 Ti delivers the same performance as the $1500 3090 and has more features and lower power consumption. It’s miles better than what you could get for $800 last year.
The 4070ti is 15-20% faster while costing 15-20% more than a 3080 released 2 years prior.
 
Worst case, they could reduce the raster performance in favour of increasing the RT.

It's not as if the high-end GPUs can't afford to give up a bit of raster performance, as I feel the level we have on offer at the moment will easily last the rest of this console generation.

I can't see my 4070ti needing to be replaced in a few years because it can't run multiplat games at 60fps.
Good luck selling a new GPU that performs worse in raster than older GPUs. As it stands, most gamers are not pushing for ray tracing and I'm not sure most devs are either; it looks to be totally vendor-driven. The average person can't tell the difference and doesn't know what benefits it offers them. Today's GPUs deliver visuals like a really attractive sex worker who is good at faking an orgasm: when you finally meet someone attractive who is not faking it, you can't tell the difference. Pardon the analogy.
 
Good luck selling a new GPU that performs worse in raster than older GPUs.

That can’t really happen since GPUs don’t spend a lot of time actually rasterizing triangles. Most of the time is spent on shading/compute and moving buffers around. Any future GPU would be faster at compute, memory access etc and therefore still win.

Today's GPUs deliver visuals like a really attractive sex worker who is good at faking an orgasm: when you finally meet someone attractive who is not faking it, you can't tell the difference. Pardon the analogy.

The more accurate analogy is that today’s games give you some real good blow up doll action and you love it because you’ve never been with a real woman.
 
The 4070ti is 15-20% faster while costing 15-20% more than a 3080 released 2 years prior.

Yeah, that's the thing. Look at the original 3090 reviews; the general gist was that yes, it was the fastest card, but also largely just a halo product with a horrible price/performance ratio that even for high-end purchasers wasn't really worth it over the 3080.

Using it as the basis for a value comparison is taking Nvidia's marketing at face value. The 3090 was always an extremely niche product that offered minor advantages (in gaming) over a card less than half its price. Outside of gaming, sure, it had its uses where that 24GB of memory could be utilized, but then again the 4070ti has half that as well, so it's still not a good GPU to compare it with on that front either.
 
Yeah, that's the thing. Look at the original 3090 reviews; the general gist was that yes, it was the fastest card, but also largely just a halo product with a horrible price/performance ratio that even for high-end purchasers wasn't really worth it over the 3080.

Using it as the basis for a value comparison is taking Nvidia's marketing at face value. The 3090 was always an extremely niche product that offered minor advantages (in gaming) over a card less than half its price. Outside of gaming, sure, it had its uses where that 24GB of memory could be utilized, but then again the 4070ti has half that as well, so it's still not a good GPU to compare it with on that front either.

That’s true, the 3090 isn’t a good point of comparison. I defaulted to it since that’s my current card. Compared to the 3080 it’s a wash at 15% more money for 15% more performance + some more bells and whistles. I wouldn’t blame 5nm for that though. AMD is selling a lot more 5nm silicon and memory for the same price in the 7900xt.
 
The majority of GPUs being sold today aren’t 4000 series. They don’t represent the market. In any case there was clearly an improvement in value. The $800 4070 Ti delivers the same performance as the $1500 3090 and has more features and lower power consumption. It’s miles better than what you could get for $800 last year.
To be fair, that $1500 wasn't purely down to what they could pack into the hardware; it was also because of mining and what the market would pay.
 
Today's GPUs deliver visuals like a really attractive sex worker who is good at faking an orgasm: when you finally meet someone attractive who is not faking it, you can't tell the difference. Pardon the analogy.
Brauh. We are on the internet debating the performance of the individual components in the rendering pipeline of modern GPUs. We are never going to experience this.
 
The 4070ti is 15-20% faster while costing 15-20% more than a 3080 released 2 years prior.

15-20% based on what? MSRP? Retailers' pricing after ignoring MSRP? Scalper pricing?

The last two crypto runs allowed Nvidia to observe how much PC gamers really value their hardware and it was much higher than MSRP. Hence the change in pricing?
 