Digital Foundry Article Technical Discussion [2022]

Well, it's to do with DF's article, so... although I don't recall anyone in the thread saying anything about PCs declining. Just that pricing could stand to come down a bit, so the tech improvements aren't littered with controversy. But yeah, I think everyone has said their piece.

It's leading into that territory; look at the last post in the 4070 Ti topic. We can't seem to handle GPU discussions all that well.
 
Ultimately the consumers will decide if they think these GPUs are badly priced. If worst comes to worst, AMD could always do a redo of the 400/500/1600 line, where they focused on budget value cards with a good balance rather than pure horsepower.
Nvidia could just take the laptop version of the 4090, which allegedly consumes 150W, and make a desktop variant of it, along with the rest of the line. Power efficiency is once again -like in Nvidia's Pascal era- something AMD has to work on. Intel said that with Battlemage their flagship GPU won't consume more than 225W, which, they say, is the sweet spot.

Dunno about that, but it certainly is for a 550W PSU like mine: my entire PC at max load consumes about 275W, and PSUs are most efficient when the load is around half their rated wattage.
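To put rough numbers on that, here's a minimal sketch. The efficiency curve is an illustrative 80 Plus Gold-style one (roughly 87/90/87% at 20/50/100% load); real units vary, so treat the figures as assumptions:

```python
# Rough sketch of why ~50% load is the PSU sweet spot.
# Efficiency points are illustrative, loosely based on the 80 Plus
# Gold targets (87% @ 20%, 90% @ 50%, 87% @ 100% load) -- real
# units differ, so these are assumptions, not measurements.

CAPACITY_W = 550  # rated DC output of the PSU
CURVE = [(0.20, 0.87), (0.50, 0.90), (1.00, 0.87)]  # (load fraction, efficiency)

def efficiency(load_w: float) -> float:
    """Linearly interpolate efficiency from the assumed curve."""
    frac = load_w / CAPACITY_W
    if frac <= CURVE[0][0]:
        return CURVE[0][1]
    for (f0, e0), (f1, e1) in zip(CURVE, CURVE[1:]):
        if frac <= f1:
            return e0 + (e1 - e0) * (frac - f0) / (f1 - f0)
    return CURVE[-1][1]

dc_load = 275.0  # what the PC's components draw from the PSU
eff = efficiency(dc_load)
print(f"{dc_load:.0f}W DC load -> {dc_load / eff:.0f}W at the wall "
      f"({eff:.0%} efficient, {dc_load / CAPACITY_W:.0%} load)")
# 275W DC load -> 306W at the wall (90% efficient, 50% load)
```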
 
Other quarters from previous years don't show a massive sales slowdown prior to new cards releasing.
In the shift from GeForce 1xxx to 2xxx, and 2xxx to 3xxx, prices dropped quite a bit, but that doesn't seem to have happened this time.
 
In the shift from GeForce 1xxx to 2xxx, and 2xxx to 3xxx, prices dropped quite a bit, but that doesn't seem to have happened this time.
Well, price drops only happen at the retail level for products in the prior gen that have been obsoleted by those in the new gen, and this is staggered over several months. With Ampere -> Ada, Nvidia had already slashed GA102 prices a couple of months in advance (and way upstream in the supply chain), so we're not seeing much movement at retail now. It's interesting to recall how unusually broad GA102's reach was in the product stack, which is perhaps a testament to how relatively affordable those dies must have been.

Given the current state of the silicon industry I'm not sure if Ada will completely supplant Ampere all the way down to the $300 range. It's possible that Nvidia could continue to build cost-optimized/rebadged Ampere products out of cheap harvested silicon instead of attempting to push pricey 4N silicon through the entire product stack. Although these days I wonder how a rebadge will be prosecuted by the grievance-fueled techtuber/reddit community.
 
Roll on Intel Battlemage.
With an extra year of driver work, and their goal of competing from nowhere, they will have to be extremely competitive on price for a long time to try and grab even a little market share.
 
The article only used sales figures up to Q3 2022, which is before the new cards were released. This is a predictable sales downturn that happens in the run-up to the release of new generations of silicon, because there will be a vacuum of purchases as people wait for new cards.

This is mindless slow-news-week reporting. Why didn't the new GPUs sell better before they were released? :unsure:

Yes and no. Yes, it was before the 4080 was released. However, no: historically, regardless of whether there is a new card incoming, Q3 represents one of the highest quarters in any given year due to the demand from the back-to-school season. Some examples:

[quarterly shipment figures from previous years]
You can go back as far as you want, but it is unheard of for Q3 Discrete GPU shipments to drop below Q2 shipment levels in any given year. Basically, this is the first time this has ever happened in recorded history for Discrete GPU shipments.
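To spell out the comparison being made: given an analyst's quarterly shipment series, the anomaly is simply any year where Q3 comes in below Q2. A tiny sketch, using made-up placeholder numbers rather than real shipment data:

```python
# Flag years where Q3 discrete GPU shipments fell below Q2.
# These figures are made-up placeholders purely to illustrate the
# comparison -- they are NOT real shipment data.

shipments_m = {  # millions of units per quarter (hypothetical)
    2019: {"Q2": 8.0, "Q3": 9.6},
    2020: {"Q2": 9.1, "Q3": 11.2},
    2021: {"Q2": 11.8, "Q3": 12.4},
    2022: {"Q2": 10.4, "Q3": 6.9},
}

for year, q in sorted(shipments_m.items()):
    change = (q["Q3"] - q["Q2"]) / q["Q2"]
    note = "  <-- Q3 below Q2 (the anomaly)" if q["Q3"] < q["Q2"] else ""
    print(f"{year}: Q2 {q['Q2']:.1f}M -> Q3 {q['Q3']:.1f}M ({change:+.0%}){note}")
```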

Q4 shipments should be higher than Q3, but I'm not entirely sure it'll be higher than Q2. And that's a problem because Q2 is historically the lowest shipment volume quarter in any given year.

Even worse than that, it's a record breaking YoY drop in Q3 discrete GPU shipments.

It's basically an indication that GPU prices without cryptocurrency returns to support them (i.e., a buyer thinking they can always either resell the GPU thanks to the crypto boom or recoup some of the cost by mining), combined with a massive downturn in the global economy, mean that for the first time ever GPU shipments have dropped going from Q2 to Q3... which is down to the absurdly high prices for GPUs (even before the 4000 series hit).

Regards,
SB
 
In the shift from GeForce 1xxx to 2xxx, and 2xxx to 3xxx, prices dropped quite a bit, but that doesn't seem to have happened this time.
Moore's Law biting us now? I read somewhere, maybe on here, that SRAM no longer shrinking with new nodes is basically the big hurdle at the moment.
 
We don’t actually know that lower prices would spur significantly higher demand. It might very well be the case that the demand for PC hardware is just not there at all.
Yeah, I think a lot of the prospective buyers were clamoring to buy hardware as prices started ramping up, because nobody was sure how long this period of shortages and crypto crap was going to go on for... so many people either bought their hardware already and spent more than they were expecting to, resulting in less interest... or they simply tapped out altogether.

The reality is that there just isn't much pushing GPU hardware adoption anymore. Consoles utilizing PC-centric hardware architectures, and consolidation of the industry, have essentially lowered the impact that new hardware releases once had. OK... so it's faster than what I have... but what I have is still crushing current games... so where's the incentive??

Those are basically the issues as I see them. Of course, the economy being what it is doesn't help at all.
 
Yeah, I think a lot of the prospective buyers were clamoring to buy hardware as prices started ramping up, because nobody was sure how long this period of shortages and crypto crap was going to go on for... so many people either bought their hardware already and spent more than they were expecting to, resulting in less interest... or they simply tapped out altogether.

The reality is that there just isn't much pushing GPU hardware adoption anymore. Consoles utilizing PC-centric hardware architectures, and consolidation of the industry, have essentially lowered the impact that new hardware releases once had. OK... so it's faster than what I have... but what I have is still crushing current games... so where's the incentive??

Those are basically the issues as I see them. Of course, the economy being what it is doesn't help at all.
The PC-centric hardware of consoles makes either consoles or PCs less interesting, I agree. This reminds me of the new TVs. I got a TV a few months ago 'cos of the hype and 'cos it'd be my first 4K TV. Yet whenever I can, I play on my good ol' 1440p 165Hz monitor with FreeSync Premium Pro.

Now at CES 2023 Samsung has shown their new TV lineup. Better panels, more peak brightness for HDR, QD-OLED, and gaming features like 144Hz support, now with FreeSync Premium Pro -a first for OLED TVs-. Current TVs just look more and more like monitors. Also, to me, that TV is more of the same: sure, better than what I have, but maybe what I have is okay enough...


Is VRR not supposed to be as good as FreeSync Premium Pro?

Monitors have the disadvantage that there's less competition, and prices per feature are usually a lot higher. You can find quite good 4K TVs in the 300-400€ range; now try to find a decent 4K monitor at that price... With TVs, for every expensive TV you make, there is going to be a lot of competition from brands from different countries.

The next step for TVs, imho, should be 240Hz support.

Once curiosity is sated, and having an okay 4K TV, my next screen is going to be a good 4K/8K monitor when the opportunity arises in the next few years -no hurry for that-. This hammerhead -as a fellow B3D forumite described it- :D monitor just reminds me of how consoles were back in the day -an ultrawide and a 4K screen in one (7680x2160 resolution, 57", wth?)-. Maybe not appealing to everyone, but the hardware was very unique.

 
I disagree with the idea that what's in the consoles makes them less interesting. In fact, having off-the-shelf hardware makes it more interesting, I think, because it comes down to how developers optimize, rather than something like the PS3, where the hardware was weaker than what's on PC and hard to work with on top of that due to being non-standardized.

I think console hardware being unique as a selling point is overrated, taken from nostalgia for a bygone era. But even during those times it's not like the hardware in consoles was outpacing what the PC was doing, and even if it got close, it wasn't for long. There is just no merit to such an approach anymore, and there hasn't been for a long while.

Do I think it would be interesting if Sony put some ultra-fast eDRAM on their APU, like Cerny said they could have done for the PS4? Sure. But I don't need any additional complexity to feel console gaming is worth it.
 
I disagree with the idea that what's in the consoles makes them less interesting. In fact, having off-the-shelf hardware makes it more interesting, I think, because it comes down to how developers optimize, rather than something like the PS3, where the hardware was weaker than what's on PC and hard to work with on top of that due to being non-standardized.

I think console hardware being unique as a selling point is overrated, taken from nostalgia for a bygone era. But even during those times it's not like the hardware in consoles was outpacing what the PC was doing, and even if it got close, it wasn't for long. There is just no merit to such an approach anymore, and there hasn't been for a long while.

Do I think it would be interesting if Sony put some ultra-fast eDRAM on their APU, like Cerny said they could have done for the PS4? Sure. But I don't need any additional complexity to feel console gaming is worth it.
Nah, they're definitely less interesting. Times were WAY more interesting when consoles (and arcade hardware) were pushing new graphics features and memory configurations, allowing for things never done before. It was way more interesting when a title would release on two or more platforms of the same generation and have completely different assets and level designs due to the strengths and weaknesses of the hardware.

Of course, I'm not saying I'd want that anymore, to be clear!! ...but it was definitely way more interesting. Now things are very stale across all platforms, basically just resolution and performance differences... We're literally counting pixels and milliseconds of frame time here acting like there are massive differences :LOL:

I LOVE going back and watching comparisons and retrospectives of older titles which had very meaningful differences between platforms depending on the capabilities of the hardware. Again, I wouldn't want that to be the reality today... but it was a much more interesting time.
 
Maybe in hindsight. But hardware has become so powerful in general that diminishing returns were an inevitable conclusion; all the hardware now runs games to a minimum standard.

I mean, Nintendo is making sure at least some of that old-school "flair" continues, in both good and bad ways.
 
Things were more interesting before because of Dennard scaling. The diminishing returns are in the silicon; the architecture, platform, and content all follow from that. Hardware today isn't "so powerful"; it's just not scaling enough to warrant radically more demanding applications. If we had 40GHz CPUs a decade ago and 400GHz CPUs today, things would be very different, because the hardware would be very different. 1.1x to 1.3x every few years isn't very different; it's a slightly better version of the same thing you had before. Anything you could do before, you can now do slightly better. Anything you couldn't do before, you probably still can't.
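To make the gap concrete, a quick compounding sketch using the post's own numbers; the two-year generation cadence is my assumption:

```python
# Compound the per-generation scaling factors from the post over a
# decade, assuming one hardware generation every two years (my
# assumption, for illustration).

GENERATIONS = 5  # roughly a decade at a 2-year cadence

for per_gen in (1.1, 1.2, 1.3):
    print(f"{per_gen:.1f}x per gen -> {per_gen ** GENERATIONS:.1f}x per decade")
# 1.1x -> 1.6x, 1.2x -> 2.5x, 1.3x -> 3.7x per decade

# The hypothetical Dennard-style world (4 -> 40 -> 400 GHz) is 10x
# per decade, i.e. about 1.58x every single generation.
print(f"10x per decade -> {10 ** (1 / GENERATIONS):.2f}x per gen")
```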
 
It was quite interesting in the Emotion Engine and GS days -eDRAM, 2560-bit and 128-bit buses- but it's clearly better with generic off-the-shelf hardware, for the devs and ultimately for the end user.
 
It was quite interesting in the Emotion Engine and GS days -eDRAM, 2560-bit and 128-bit buses- but it's clearly better with generic off-the-shelf hardware, for the devs and ultimately for the end user.

That was an awesome time, and the PS2 is still the best-designed console ever in terms of hardware (and that isn't a joke).
 