Digital Foundry Article Technical Discussion [2022]

Status
Not open for further replies.
I can see Sony stopping highly customized I/O for the PS6, or at least not improving what they currently have.
PS5 is already highly customized. They have their own controller, their own I/O logic, onboard decompression blocks, and hardware memory abstraction over stored data. It feels like they've already covered the core requirements, but they could always make I/O faster, support new compression formats in hardware, and so on.
 
The days of highly customized/exotic hardware are over. It's x86, RDNA 2 and GDDR memory, and an NVMe interface with some customizations now. No more Cell and Emotion Engine craziness.
 
With the advent of the SSD as a baseline, the era of custom/exotic hardware to power next-gen graphics is over. Going forward it will be about exotic/novel software solutions.
There may be some accelerator work, but I see this happening in the console space as borrowing what's already there as opposed to innovating something new.
 
Probably, but kudos to Sony for pushing a top-notch I/O system, and now, in VR, foveated rendering with eye tracking for the masses. Introducing something top-class in a $400 console is probably not easy, but it's quite important for making it standard.
 
Yeah, I'm still interested to see what can come of it. I'm not sure if it's an over-engineered solution or if this can be a very competent replacement for VRAM going forward. (It already is… but I'm just curious to see how far they can push this.)

I wonder what the VRAM amounts will be for the next generation, or if this solution enables a much longer tail for backwards compatibility after the PS6 arrives.
 
this can be a very competent replacement for vram going forward. (It already is… but just curious to see how far they can push this)

The NVMe in the PS5 being a replacement for VRAM? It's not even competitive with fast DDR4 RAM, right? Even at peak burst performance, you'd be looking at DDR4 speeds with latency disadvantages (amongst others). VRAM as in GDDR6 and faster variants is not even in the same class. Or did you mean something completely different?
 
It can never be used as bandwidth for active work. We typically needed more VRAM to hold more textures and data than necessary, because I/O was too slow to load them in just in time.

If your scene requires more than 16 GB of VRAM, meaning all of it is actively being used to render, then you need more VRAM to render the scene. Is that typical, however? I don't know or believe so.
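To put a rough number on that "hold more than needed" effect, here's a back-of-envelope sketch (hypothetical round figures, not measured console specs): whatever the I/O can't stream within the reaction window has to sit in VRAM speculatively.

```python
# Back-of-envelope only: round, hypothetical figures, not measured console specs.
# Anything the I/O cannot deliver within the reaction window must already be
# resident in VRAM as a speculative cache.

def speculative_cache_bytes(need_bytes: float, window_s: float, bw_bytes_s: float) -> float:
    """Bytes that must be pre-resident because streaming them would be too slow."""
    streamable = bw_bytes_s * window_s
    return max(0.0, need_bytes - streamable)

GIB = 1024 ** 3
need = 4 * GIB    # say 4 GiB of new assets could be needed within the next 2 s
window = 2.0

hdd = speculative_cache_bytes(need, window, 100e6)   # ~100 MB/s HDD
ssd = speculative_cache_bytes(need, window, 5.5e9)   # ~5.5 GB/s raw NVMe (PS5 class)
print(f"HDD: {hdd / GIB:.2f} GiB must already sit in VRAM")
print(f"SSD: {ssd / GIB:.2f} GiB must already sit in VRAM")
```

With the HDD, almost the entire 4 GiB has to be pre-cached; with the SSD, essentially none of it does, which is the VRAM that gets freed up for immediate rendering.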
 
OK, I understand: the SSD substitutes for more VRAM/RAM then. Though I can imagine it also depends on what developers and studios are doing; VRAM requirements have gone up since the start of gaming. Maybe there's a barrier somewhere, no idea ;)
 
It may well come back as performance scaling reaches its end, which could be soon. I hope it does.
I doubt it. The industry is moving in the direction of everything being on everything.

That would be a monumental step backwards.. and I don't think developers would be up for that again.

It would definitely make things more interesting though, for sure.
 
In a theoretical future where it's the only avenue for increased performance, devs would be up for it IMO. While the mass industry homogenization is great for CEOs and shareholders, it isn't great for us gamers.
 
I think there's plenty of other things in the way of developers realizing their ambitions, more important than just hardware power. Better tools, bigger budgets, and of course more time.. are some of the more pressing issues resulting in developers having to temper ambitions.

I mean... much more powerful hardware than consoles already currently exists.. and by the end of this year, that power is going to extend even further... So the actual technology itself doesn't seem to be a problem currently, it's the fact that tech takes time to become as cost effective as consoles need to be, that's the problem. And I doubt Sony or Nintendo... (certainly not Microsoft) is going to be trying to build completely custom, expensive chips which are extremely powerful, but new, and difficult to code for. Sony learned a hard lesson from that, and as I said, I don't think they're eager to go down that road anytime soon.

Everything points to the industry consolidating, and that goes from development studios to even hardware itself. The consoles are more similar than ever before. Besides some slight custom tweaks with considerations to their own wants and desires.. they're largely similar.

I say all that though.. but I definitely agree with you that it isn't great for us gamers. Things were MUCH more interesting when console hardware and technology was really diverse, with new processors and graphics accelerators with new features in hardware. Completely different architectures from completely different companies. You also had amazing arcade technology pushing a lot of those advancements. It was an amazing time for sure.. I just think that the push now is to have everything standardized so that they can target and support the largest markets possible.
 
The NVMe in the PS5 being a replacement for VRAM? It's not even competitive with fast DDR4 RAM, right? Even at peak burst performance, you'd be looking at DDR4 speeds with latency disadvantages (amongst others). VRAM as in GDDR6 and faster variants is not even in the same class. Or did you mean something completely different?

Yes, it's a replacement for VRAM. A huge portion of VRAM is dedicated to potential future memory needs because HDDs are too slow. So data that won't be needed for another 15 to 20 seconds sits in VRAM, which is worsened by the fact that some of that data will never be used at all.

SSDs, even in their current form, alleviate some of that future need and wasteful storage, and allow more of the VRAM to be used for the more immediate needs of the GPU.
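That "15 to 20 seconds" point can be sanity-checked with simple division, using assumed round-number bandwidths: the lead time at which a load must be issued is just asset size over I/O bandwidth.

```python
# Illustrative arithmetic with assumed round-number bandwidths, not benchmarks.

def prefetch_lead_seconds(asset_bytes: float, bw_bytes_s: float) -> float:
    """How far in advance a load must start for the data to arrive in time."""
    return asset_bytes / bw_bytes_s

GIB = 1024 ** 3
chunk = 2 * GIB  # a hypothetical 2 GiB slice of upcoming level data

print(f"HDD (~100 MB/s):  issue the load {prefetch_lead_seconds(chunk, 100e6):.1f} s early")
print(f"NVMe (~5.5 GB/s): issue the load {prefetch_lead_seconds(chunk, 5.5e9):.2f} s early")
```

With the HDD the lead time lands right around that 15-20+ second range, and everything loaded that early parks in VRAM in the meantime; with the NVMe drive it drops well under a second.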
 
In a theoretical future where it's the only avenue for increased performance, devs would be up for it IMO. While the mass industry homogenization is great for CEOs and shareholders, it isn't great for us gamers.
I don't know. The unification of hardware standards and cross platform middleware have put us in sort of an indie renaissance. We weren't seeing indie games like this in the heyday of custom 3d hardware (PS1 and PS2 generation), and I would argue that the lower barrier to entry has given us many more experimental, fresh ideas of what a game could be, compared to the franchise heavy, yearly releases that the larger publishers are going for.
 
I can see Sony stopping highly customized I/O for the PS6, or at least not improving what they currently have. They'd be better off designing a PC-like box for better compatibility with their multiplatform strategy. At some point they'll probably even start using DirectX on their PlayStation.

I don’t. With the expansion of more and more products needing fast memory and the shrinking of silicon becoming more and more expensive, I see consoles at the forefront of technology advancement especially when it comes to memory. Consoles are too cheap to brute force it. I expect the core components to be the same/similar but I expect MS and Sony will incorporate nonstandard ways to get more for less.


What's great for gamers and what's great for gamers who also like to discuss gaming hardware on an academic basis are two different things.

$100 million in development dollars devoted to porting across a half dozen different exotic designs will never buy you the same level of visuals that a more homogenized market would offer. Plus, the current complexity of gaming code doesn't lend itself well to a market full of exotic hardware designs.

Discussion here would be a lot more interesting, but that would come with the caveat that IQ would suffer as a result, and the number of bugs would increase significantly.
 
I don't know. The unification of hardware standards and cross platform middleware have put us in sort of an indie renaissance. We weren't seeing indie games like this in the heyday of custom 3d hardware (PS1 and PS2 generation), and I would argue that the lower barrier to entry has given us many more experimental, fresh ideas of what a game could be, compared to the franchise heavy, yearly releases that the larger publishers are going for.
Back in those days we didn't need indie games for new ideas; mid- and high-end developers gave them to us. Now if you want something that isn't yet another mundane open-world snooze fest, you have to go to pixel art or other very dated-looking indie games. There are a small handful of studios that still make quality games at a higher level, but they are few and far between these days. Even multiplayer games have been homogenized down to primarily battle royale titles. It's crazy how this open-world trend has consumed and almost completely ruined multiple genres.
 

This isn't a young, immature market anymore. It was cheaper to create cutting-edge games because code complexity was a lot simpler then. It was easy to have new ideas because you didn't have 50 years' worth of people coming up with new ideas before you. It was easy to compete with exotic hardware designs because you didn't have companies with 25-40 years of knowledge and experience pumping billions of dollars into designing hardware.

Plus, you didn't have an extra 20-30 years' worth of gaming experience yourself. It was a lot easier to impress you back then. LOL.

Pushing out more exotic hardware isn't going to recreate your experience of yesteryear.
 
Yes, it’s a replacement of VRAM.

Complementing, not a whole replacement. The PS5 still has VRAM. Besides, you wouldn't want DDR3 speeds (sustained) or DDR4 speeds (peak burst), coupled with even higher latency, to fully replace any RAM.
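To show why it's "not in the same class," here's the time to move one full 16 GB pool at nominal, assumed peak bandwidths for each tier (vendor round numbers, not benchmark results):

```python
# Nominal, assumed peak bandwidths (round numbers, not benchmark results).

def seconds_to_move(capacity_bytes: float, bw_bytes_s: float) -> float:
    """Time to transfer the whole pool once at peak bandwidth."""
    return capacity_bytes / bw_bytes_s

pool = 16e9  # a 16 GB memory pool, PS5-sized

tiers = {
    "NVMe SSD (~5.5 GB/s)":      5.5e9,
    "DDR4-3200 (~25 GB/s/ch)":   25e9,
    "GDDR6 (~448 GB/s)":         448e9,
}
for name, bw in tiers.items():
    print(f"{name}: {seconds_to_move(pool, bw) * 1000:.1f} ms")
```

Each tier is roughly an order of magnitude apart: seconds for the SSD, hundreds of milliseconds for DDR4, tens of milliseconds for GDDR6, and that's before counting the latency gap.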

I see consoles at the forefront of technology advancement

Seems that it's the other way around now. They have custom AMD CPUs and GPUs; they basically went to AMD and customized some of what AMD had to offer in the PC space (RDNA 1/2, Zen 2, PCIe 4.0, standard GDDR6, etc.), in some cases even cut down (halved cache, lower clocks, stripped-out Infinity Cache).

If they were at the forefront, they wouldn't be behind in RT and ML-reconstruction hardware tech. Even for the SSD, if DirectStorage/RTX IO is anything to go by, using a part of GPU compute to accelerate NVMe 4/5 I/O seems more flexible/expandable than a fixed-function controller.
 