Digital Foundry Article Technical Discussion [2023]

Status
Not open for further replies.
Nothing worse than an ugly game needing more than 8 GB of VRAM... A Plague Tale: Requiem is levels above The Last of Us and it's not a VRAM hog... Naughty Dog's engine is not a very good engine.
C'mon now. Naughty Dog have a great engine, which is tailored to PlayStation, and I wouldn't be calling TLOU Part 1 ugly; that's ridiculous. It's simply a game/engine designed around a different architecture. There's nothing, absolutely nothing wrong with that.

I just think it could have been ported to PC more optimally, and the devs could have used more time to do so. I really feel like in this case the PC version was rushed out around the time the show was ending, and it just didn't hit the mark, but Sony put it out anyway. It deserves the criticism it's getting, but we'll see where Naughty Dog can get it to, given more time. They've already communicated upcoming patches and committed to further patches beyond, so they realize there's a lot of work to be done.
 
The port has been a bit of a disaster and I really feel for ND given the damage this must be doing to their previously spotless reputation. But to their credit they're clearly working very hard to fix the issues. The real question is: given the likely constraints of the core engine design being laser-focused on a completely different architecture, how far can they actually go with the improvements, with what is likely now a very limited budget? They certainly aren't going to rewrite the thing to utilise DirectStorage GPU decompression or replace the entire streaming/memory management system with something more PC-optimised - which is what the game fundamentally needs.
 
Nothing worse than an ugly game needing more than 8 GB of VRAM... A Plague Tale: Requiem is levels above The Last of Us and it's not a VRAM hog... Naughty Dog's engine is not a very good engine.
If you think TLOU Part 1 is ugly by any means, you need to get your eyes checked.

It also should be said that, at least comparing the console versions of Plague Tale and TLOU, they actually look broadly comparable graphics-wise to me, although TLOU has far more polish in its character model animations and cutscenes. A difference being that one runs at 60fps and the other at 30 with the same resolution target.

Although to be fair, I don't know how much the sheer number of rats drags down the FPS in Requiem.
 
If you think TLOU Part 1 is ugly by any means, you need to get your eyes checked.
Indeed.

[TLOU Part 1 screenshots]
 
Nothing worse than an ugly game needing more than 8 GB of VRAM... A Plague Tale: Requiem is levels above The Last of Us and it's not a VRAM hog... Naughty Dog's engine is not a very good engine.
What am I reading? TLOU is easily the most impressive game on the market right now. The geometric density, lighting and texture detail are off the charts for literally every little asset in this game. It looks like artwork come to life, all the time. How can you possibly say it looks ugly?
 
-PCs have split memory
-You have to copy data across the PCIe bus, with added latency
-You have the intrinsic overhead of the OS
-You have multiple vendors, APIs, and drivers
-You have less specialized tools and documentation

-Consoles have unified memory
-CPU and GPU can both access memory directly
-Less overhead, more streamlined OS designed around gaming
-Single vendor, specialized APIs for the hardware, and guarantees from the drivers
-Specialized tools and documentation
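The asymmetry in those two lists can be made concrete with a toy cost model. This is a back-of-envelope sketch with illustrative numbers (the PCIe bandwidth and latency figures are assumptions, not measurements of any real system): on a discrete-GPU PC an asset has to be copied across the bus into VRAM, while on a unified-memory console the GPU can read it in place.

```python
def pcie_upload_ms(size_mb: float, pcie_gb_s: float = 12.0,
                   fixed_latency_ms: float = 0.05) -> float:
    """Time to copy `size_mb` of asset data across the PCIe bus into VRAM
    (illustrative bandwidth/latency figures, not benchmarks)."""
    return fixed_latency_ms + (size_mb / 1024.0) / pcie_gb_s * 1000.0

def unified_upload_ms(size_mb: float) -> float:
    """On a unified-memory console the GPU reads the data in place: no copy."""
    return 0.0

texture_mb = 64  # e.g. one large texture's mip chain
print(f"PCIe copy:    {pcie_upload_ms(texture_mb):.2f} ms")
print(f"Unified pool: {unified_upload_ms(texture_mb):.2f} ms")
```

Multiply that per-asset cost across everything a streaming engine touches per second and it's clear why a port designed around the unified model has to restructure its upload paths rather than just recompile.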

Console development is easier than PC development... news at 11. Thanks MLID and random indie developer!

Keep in mind that showcasing what the consoles are good at while only showcasing what the PCs are bad at isn't exactly being impartial either. BTW - I get why you are doing it in order to refute that other person; however, it could lead people to the wrong conclusions.

So, it should also be pointed out that

PC
  • Because of the split memory pools there is never any bandwidth contention between the CPU and GPU. Thus you can generally rely on the stated bandwidth available for the CPU and GPU regardless of whether or not they are both doing intensive memory reads and writes.
    • This obviously comes at the expense of requiring more developer skill and expertise in order to manage 2 pools of memory, but the benefit is that the bandwidth is always there regardless of how you are using it.
  • Multiple vendors, APIs, and drivers also mean that PC can react more quickly to changes in technology than consoles
    • For example, we are currently "stuck" at relatively simplistic RT implementations in games because that is what consoles are limited to.
    • Unfortunately, due to consoles being the primary platforms for revenue generation for most AAA studios this means that exploiting new advances in technology usually doesn't happen until there's a new console generation.
    • This isn't so much a drawback of PCs as it is a drawback of AAA developers getting most of their revenue from a fixed hardware platform.
  • There may be fewer specialized tools, but there are also more tools available.
    • I'd consider this a wash.
Console
  • Unified memory means there is always the potential for memory contention, so neither the CPU nor the GPU can always be guaranteed the full available memory bandwidth.
    • So, unlike PC, if both the CPU and GPU have to access memory at the same time the bandwidth will have to be split between them.
      • And while AMD continues to work on reducing the bandwidth penalty associated with memory contention, it's still there, so the overall bandwidth will be even less when both the CPU and GPU need to access it.
    • So, instead of managing 2 memory pools as on PC, console developers have to more carefully manage their CPU and GPU memory access patterns depending on the bandwidth needs at any point in the game.
    • An additional benefit for console manufacturers is that you don't need as much memory as you do in a system with split memory pools.
  • Single vendor, fixed hardware, fixed APIs specialized for a very specific scenario, etc. means it can certainly excel at a specific task (in this case games).
    • But it also means that it's less useful, or not even usable, for many things other than gaming. Which is fine, since no one is buying a console to do something other than gaming.
    • It also limits you WRT rendering technology as your hardware is stuck at X point in time and nothing you can do will change that until the console manufacturer releases a new console.
      • Good if you want a nice safe consistent hardware platform to code to with minimal required effort (compared to PC, not in absolute terms)
      • Bad if you want to take advantage of new advances in technology that could lead to a significant visual improvement in your game.
  • Specialized tools and documentation also means you are reliant on the quality of tools and documentation that the console maker provides.
    • Sony have gotten significantly better at this with the PS4 and PS5 after crapping the bed with the PS3.
    • Historically, this has been a major and significant drawback of console development.
    • Unlike PC, you couldn't just choose to go with better tools if a 3rd party released a better SDE/SDK.
    • Hence, you will always be reliant on the console maker to provide good tools, and this isn't guaranteed.
      • For example, look at how Microsoft's provided tools regressed at the start of the XBO and XBS generations from where they were during the PS3/X360 generation, when MS's provided tools were vastly superior to what Sony provided.
It's far too naive to say "console good" - "PC bad" when it comes to games development. Depending on what the developer is after, or their requirements, PC could be the best (for example, if you want to push RT or need uncontested memory bandwidth) or console could be the best (for example, a simpler development environment or less skilled programmers required, both of which reduce your development costs).
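The contention point above can be sketched numerically. The total bandwidth and penalty figures below are illustrative assumptions (not measured console numbers); the point is only that on unified memory the GPU's usable bandwidth drops whenever the CPU is also hitting the pool:

```python
def effective_gpu_bw(total_gb_s: float, cpu_demand_gb_s: float,
                     contention_penalty: float = 0.10) -> float:
    """Bandwidth left for the GPU on a unified pool: the CPU takes its share,
    and an arbitration penalty applies only while both clients are active."""
    if cpu_demand_gb_s <= 0:
        return total_gb_s  # GPU alone sees the full headline figure
    usable = total_gb_s * (1.0 - contention_penalty)
    return max(usable - cpu_demand_gb_s, 0.0)

print(effective_gpu_bw(448.0, 0.0))   # GPU alone: the full 448 GB/s
print(effective_gpu_bw(448.0, 48.0))  # CPU busy: 448*0.9 - 48 = 355.2 GB/s
```

On a split-memory PC the GPU's figure stays constant regardless of CPU traffic, which is exactly the trade described in the list above.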

Regards,
SB
 
Keep in mind that showcasing what the consoles are good at while only showcasing what the PCs are bad at isn't exactly being impartial either. BTW - I get why you are doing it in order to refute that other person; however, it could lead people to the wrong conclusions.

So, it should also be pointed out that

PC

Console
I agree about tools lock-in, but I don't think your points about split vs unified memory or fixed hardware/API limitations are true at all in practice. I can't think of a single rendering innovation (aside from hardware RT, which was explicitly a hardware feature catching up to known approaches, not a new technique or anything) that didn't come to consoles first. "Requires more skill" isn't really a thing; the thing is "introduces more complexity". Having to manage bandwidth around resources you entirely control, that are entirely the same every time, is much less scary than having to manage anything on PC, where there can be a million configurations and states and environments.
 
Nah, you've gone and taken the other extreme of this position, which is also ridiculous. TLOU Part 1 is decent looking, but it's not even actually next-gen, visually. It's a small step up from TLOU2 and little more.

Again, something like Plague Tale Requiem looks pretty clearly better.
Aren't you doing the same thing? You can claim it is a small step up from TLOU2, but TLOU2 and Part 1 are still some of the best looking games on the market. Even if you want to argue about Plague Tale (which I would), it's still one game. Out of how many?
 
PC is clearly a more complicated space for development. You have a huge landscape of hardware, a heavy operating system, more abstraction in APIs, and more pitfalls in terms of I/O between devices and memory. Then you have unique challenges like shader compilation. Things like DirectStorage are a cool improvement, but it's going to be a long time before they can be targeted exclusively. Same with things like Resizable BAR. It's just a reality that PC development is harder. You have to support all of the PCs that don't have those features. It has nothing to do with the talent of the studios. Naughty Dog is full of incredibly talented people. PC is just hard. Not saying that excuses the state of the game at release. It probably should have been released later. It's just the reality of the platform that it's more difficult.
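The "support all the PCs that don't have those features" problem boils down to carrying multiple I/O paths behind a capability check. This is a hypothetical sketch (the flag names and path labels are made up, not any real engine's API), just to show why one console code path becomes several on PC:

```python
def pick_asset_path(has_directstorage: bool, has_rebar: bool) -> str:
    """Choose an asset-loading strategy from detected platform capabilities
    (hypothetical flags for illustration)."""
    if has_directstorage:
        return "nvme -> gpu decompress"                  # newest hardware only
    if has_rebar:
        return "nvme -> cpu decompress -> mapped vram"
    return "nvme -> cpu decompress -> staged pcie copy"  # lowest common denominator

# Every branch below has to ship and be QA'd; a console port has exactly one.
for flags in [(True, True), (False, True), (False, False)]:
    print(pick_asset_path(*flags))
```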
 
Nah, you've gone and taken the other extreme of this position, which is also ridiculous. TLOU Part 1 is decent looking, but it's not even actually next-gen, visually. It's a small step up from TLOU2 and little more.

Again, something like Plague Tale Requiem looks pretty clearly better.

I can't speak to whether APT is 'clearly better', but a big part of the critique of how TLOU Part 1 performs on the PC is how it compares visually to TLOU2. You can of course select cut-scene screenshots (real-time in both games) and wide scenery shots from TLOU Part 1 that no doubt look great, partly due to being a decent distance away from some of the weaker texturing, but also because there are some extremely talented artists involved and laborious attention to detail.

But here's TLOU Part 2. 1440p max, so they're at a disadvantage out of the gate, but this is a game that can produce these visuals while running on a Jaguar CPU + 1060-class GPU (in the case of the Pro) and streaming from a 5400rpm laptop HDD. It will run these at a locked 60 - from that same laptop HDD - on a PS5, restricted to the same working ~5GB total memory set. The levels in this game are larger, sometimes significantly so, than in Part 1.

This is what's not computing for some people. Yes, Part 1 looks better, primarily in texturing and skin shaders. But in so many aspects they are so, so close (again, with Part 2 having the larger levels), and it can achieve that on a system that has a glacially slow I/O system tied to a much smaller memory pool by comparison.

You can argue that's irrelevant if the game's streaming was built from the ground up for the PS5, but the end results are what people will pay attention to, not the technical underpinnings. None of this indicates the game is doing something that would be impossible on a system without the PS5's exact I/O architecture; the very existence of TLOU2 running on hardware a world apart from the PS5 indicates that it is, in fact, largely possible. It just wasn't done in this case for the PC.
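The streaming argument above survives some quick arithmetic. Assuming ~80 MB/s sustained reads for a 5400rpm laptop drive and a ~5 GB working set (both figures are rough assumptions for the estimate, not measurements of TLOU2), the drive can replace a meaningful slice of memory within a normal traversal window:

```python
hdd_mb_s = 80.0            # assumed sustained sequential read, 5400rpm laptop HDD
working_set_mb = 5 * 1024  # ~5 GB usable game memory
refresh_fraction = 0.10    # fraction of the working set replaced per window

seconds_to_refresh = (working_set_mb * refresh_fraction) / hdd_mb_s
print(f"{seconds_to_refresh:.1f} s to swap 10% of the working set")  # 6.4 s
```

In other words, with aggressive prefetch ahead of the player even a slow HDD can keep a constrained memory pool fed, which is presumably how TLOU2 manages it on last-gen hardware.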

[TLOU Part 2 screenshots]

Aren't you doing the same thing? You can claim it is a small step up from TLOU2, but TLOU2 and Part 1 are still some of the best looking games on the market. Even if you want to argue about Plague Tale (which I would), it's still one game. Out of how many?

That's the point though - TLOU Part 1's visual quality is often used as a defense for the hardware it requires on PC. A Plague Tale is being used as a counterpoint to that: it can look like it does while requiring far less. TLOU2 argues that position as well, and I would say more strongly, as it comes from the same studio and looks like that on even weaker hardware.
 
PC is clearly a more complicated space for development. You have a huge landscape of hardware, a heavy operating system, more abstraction in APIs, and more pitfalls in terms of I/O between devices and memory. Then you have unique challenges like shader compilation. Things like DirectStorage are a cool improvement, but it's going to be a long time before they can be targeted exclusively. Same with things like Resizable BAR. It's just a reality that PC development is harder. You have to support all of the PCs that don't have those features. It has nothing to do with the talent of the studios. Naughty Dog is full of incredibly talented people.

Of PC developers? Yes, PC development is harder, sometimes significantly so - which is why Sony went out and acquired Nixxes. Simply due to market share, there are likely far more developers experienced with AAA engines on console than on PC.

Naughty Dog are full of incredibly talented console developers. That doesn't mean they're incompetent because TLOU doesn't run well on a 1060, but it does speak to perhaps not having the knowledge base (and development time) required for navigating these potential PC porting pitfalls when your game is running ~75% more performant in GPU-limited scenarios than a 2070 Super (!)

[benchmark chart]

Like Insomniac are brilliant developers too! And it's pretty much a guarantee that Spider-Man would have been a complete disaster had Nixxes not worked on it and redesigned its streaming system to be more accommodating to the PC.
 
@Flappy Pannus I have no doubt that the technical people at Naughty Dog could learn the ins and outs of PC development. It's just a question of time and/or money. Companies like Nixxes are great because they're already experienced and it prevents you from having to divert your resources. You wouldn't necessarily want to split your team at Insomniac between PlayStation and PC. You leave your Insomniac people on console and you let the Nixxes people run with PC, for example. It's not that the Insomniac people aren't knowledgeable enough or capable of learning. Iron Galaxy fits in somewhere on this PC port. They don't have a great reputation, but it still may just be a time/money question more than one of talent or experience. Again, I just think PC is a more complicated space than console. It probably takes more time to develop something than it would on a single console platform. I think in this case the game just needed more time.

Edit: Going back to the content of the Digital Foundry video, Alex did bring up the fact that The Last of Us relies heavily on artist-controlled lightmaps. I think it's true for both the original and the sequel that the environment lighting is handled with baked lightmaps that are probably of very high quality. That is fairly unique in the game world these days. Most games are shifting to real-time, dynamic lighting, but it makes a lot of sense for a game like this to bake out lightmaps, because it doesn't have a time of day and everything is incredibly artist-controlled. Everything about the game is curated to the extreme by artists. It's one reason the VRAM hit may be huge independently of texture quality.
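The VRAM point at the end is easy to ballpark: baked lightmaps are just additional textures, so their cost lands on top of the material textures regardless of the texture-quality setting. The atlas count, size, and compression rate below are guesses for illustration, not Naughty Dog's actual data:

```python
def lightmap_mb(atlas_px: int, atlases: int, bits_per_texel: int = 8) -> float:
    """VRAM consumed by `atlases` square lightmap atlases of `atlas_px` texels
    per side, at a block-compressed rate (e.g. BC6H HDR is 8 bits/texel)."""
    bytes_total = atlases * atlas_px * atlas_px * bits_per_texel / 8
    return bytes_total / (1024 * 1024)

print(f"{lightmap_mb(4096, 12):.0f} MB")  # twelve 4K HDR atlases = 192 MB
```

Scale the atlas count up for a game with this much hand-placed geometry and the baked-lighting data alone can plausibly eat a noticeable slice of an 8 GB card.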
 
Of PC developers? Yes, PC development is harder, sometimes significantly so - which is why Sony went out and acquired Nixxes. Simply due to market share, there are likely far more developers experienced with AAA engines on console than on PC.

Naughty Dog are full of incredibly talented console developers. That doesn't mean they're incompetent because TLOU doesn't run well on a 1060, but it does speak to perhaps not having the knowledge base (and development time) required for navigating these potential PC porting pitfalls when your game is running ~75% more performant in GPU-limited scenarios than a 2070 Super (!)

Like Insomniac are brilliant developers too! And it's pretty much a guarantee that Spider-Man would have been a complete disaster had Nixxes not worked on it and redesigned its streaming system to be more accommodating to the PC.
Well, we have to consider that these very talented people are focused on the console, and they aren't about to be taken off their projects to work on a PC port of an old game. So they may very well have the knowledge to make an absolutely incredible PC version, but either aren't on the team or wouldn't have been given the time to do the job properly.
 
I can't think of a single rendering innovation (aside from hardware RT, which was explicitly a hardware feature catching up to known approaches, not a new technique or anything) that didn't come to consoles first.

What would you class as a rendering innovation?

I would have thought that programmable vertex and pixel shaders were one of the biggest shifts in how graphics processors handled rendering. The entire course that console GPUs ended up taking was established in the PC space. Tensor cores and DLSS are a pretty big step change in how final images are generated too - though perhaps you could argue this work happens post-rendering. Machine learning and inference are scarily big in terms of their potential.

Consoles certainly have seen a lot of innovation in hardware and in terms of how you draw your scene, but I think the PC has too. Though I suppose what one considers a rendering innovation might be different for different people.
 
@Flappy Pannus I have no doubt that the technical people at Naughty Dog could learn the ins and outs of PC development. It's just a question of time and/or money. Companies like Nixxes are great because they're already experienced and it prevents you from having to divert your resources. You wouldn't necessarily want to split your team at Insomniac between PlayStation and PC. You leave your Insomniac people on console and you let the Nixxes people run with PC, for example. It's not that the Insomniac people aren't knowledgeable enough or capable of learning. Iron Galaxy fits in somewhere on this PC port. They don't have a great reputation, but it still may just be a time/money question more than one of talent or experience. Again, I just think PC is a more complicated space than console. It probably takes more time to develop something than it would on a single console platform. I think in this case the game just needed more time.

I don't think anyone, at least the PC advocates here, disagrees with that summary at all.

Ultimately it does come down to resources. If your architecture is very powerful but, market-share-wise, for the games being targeted, you're a minority platform, you're eventually going to have to be flexible enough to accommodate implementation approaches used on the larger platform; expecting every development team to have a Nixxes at the ready to redesign major parts of your engine is not realistic for a lot of publishers. Hence DirectStorage and the like. I would be far more sour on the future of the PC platform if no movement were happening in this area and it was just expected we'd have 32 GB VRAM cards and 16-core CPUs to deal with it.

That being said, if any game were to get this star treatment from Sony, you would expect it would be this one. I've maintained this is primarily a failure of Sony management, and they very clearly knew it. I'm not of the mindset that this is a disastrous 'black mark' for Sony, as it won't affect the vast bulk of their sales at all - TLOU3 is going to be a huge hit even if they never fix this port - but it's certainly a bizarre slip-up for a company that's made such a big pronouncement of embracing this platform. With the added pressure of trying to hit close to the TV series window, it makes even less sense that you would not at least pull in your best-in-class porting team to aid in this.

Well, we have to consider that these very talented people are focused on the console, and they aren't about to be taken off their projects to work on a PC port of an old game. So they may very well have the knowledge to make an absolutely incredible PC version, but either aren't on the team or wouldn't have been given the time to do the job properly.

Yes, which is why I mentioned development time. Although that goes against what Naughty Dog has said, of course - both in who's actually doing the port and in the attention to detail (read their 'delay' announcement; it's painful in retrospect).

It's interesting to note, btw, that the patch notes mention Iron Galaxy assisting. So clearly they require outside developers more accustomed to the PC (though, uh, maybe not exceptionally gifted ones).

We at Naughty Dog and our partners at Iron Galaxy are closely watching player reports to support future improvements and patches. We are actively optimizing, working on game stability, and implementing additional fixes which will all be included in regularly released future updates.
 
What would you class as a rendering innovation?

I would have thought that programmable vertex and pixel shaders were one of the biggest shifts in how graphics processors handled rendering. The entire course that console GPUs ended up taking was established in the PC space. Tensor cores and DLSS are a pretty big step change in how final images are generated too - though perhaps you could argue this work happens post-rendering. Machine learning and inference are scarily big in terms of their potential.

Consoles certainly have seen a lot of innovation in hardware and in terms of how you draw your scene, but I think the PC has too. Though I suppose what one considers a rendering innovation might be different for different people.
Oh yeah, programmable pixel shaders were easily the biggest innovation ever in graphics; I kinda assumed we were all talking about the time since then. DLSS is a cool hardware tweak on an existing technique (temporal upscaling/AA) which started on consoles. For the purpose of "how is the platform for developers" I'm thinking of things that change development - new techniques, new approaches, etc. - not hardware acceleration or better hardware alternatives to widely used techniques.
 
Keep in mind that showcasing what the consoles are good at while only showcasing what the PCs are bad at isn't exactly being impartial either. BTW - I get why you are doing it in order to refute that other person; however, it could lead people to the wrong conclusions.

So, it should also be pointed out that

PC

Console
This is great and all, but if you follow the flow of the discussion, you'll see it was not a PC vs console debate. It was a discussion about properly constructed arguments and straw men.
 