Digital Foundry Article Technical Discussion Archive [2015]

Yup, it's the same old song and dance: gamers won't spend money on console hardware, hence they get new boxes with weak hardware that will struggle with visuals for the entire eight-year generation. That's how it is. I figured gamers knew that by now; consoles are built to be dirt cheap, not to be powerful.

I agree partially (the spending money part). If Sony and MS could somehow figure out how to support multiple systems (a value system and a premium system), that would give developers the needed legroom to scale texture assets and IQ settings for premium owners without any additional development time or cost. That would satisfy the console gamers wanting (and willing) to spend more on better hardware... without going the PC route.
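On the software side that isn't a big ask, by the way. Something like the sketch below is roughly all an engine would need: a per-SKU quality preset read once at boot. The SKU names and the numbers here are entirely made up by me, not anything Sony or MS have announced:

```cpp
#include <cstdint>

// Hypothetical hardware tiers for a single console family.
enum class ConsoleSku { Value, Premium };

// Per-tier image-quality settings the engine reads once at boot.
struct QualityPreset {
    uint32_t renderWidth;
    uint32_t renderHeight;
    uint32_t anisotropy;      // texture filtering level
    uint32_t shadowMapSize;   // square shadow map resolution
    bool     highResTextures; // stream the top mip level or not
};

// Same game code, same assets on disc; only the preset differs per SKU.
QualityPreset PresetFor(ConsoleSku sku) {
    switch (sku) {
        case ConsoleSku::Premium:
            return {1920, 1080, 16, 2048, true};
        case ConsoleSku::Value:
        default:
            return {1600, 900, 4, 1024, false};
    }
}
```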
 
According to this presentation, Naughty Dog's game logic on PS3 was single-threaded (not the SPU job stuff). They had to move it into fibers and spread it across six threads (it's not like they could keep the game logic single-threaded and jobify just the rest for the purposes of the remaster). Even after they jobified everything, including render logic, the CPU still took 25ms. They had to get creative (and change how their engine handled frames) to get under 16ms. The amount of tweaking they had to do paints a bleak picture of the current-gen CPUs, IMHO.
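For anyone who hasn't seen the talk, "moving game logic into jobs" means roughly this kind of thing. This is just my own bare-bones sketch using plain threads instead of ND's fiber scheduler (fibers let jobs yield and resume mid-work, which I'm skipping), but it shows the basic idea of splitting one big serial update into independent jobs spread across worker cores:

```cpp
#include <atomic>
#include <functional>
#include <thread>
#include <vector>

// Very rough stand-in for a job system: a list of independent update
// jobs consumed by N worker threads. The real thing schedules fibers
// so jobs can yield and resume; that part is omitted here.
void RunJobs(std::vector<std::function<void()>>& jobs, unsigned workerCount) {
    std::atomic<size_t> next{0};
    std::vector<std::thread> workers;
    for (unsigned i = 0; i < workerCount; ++i) {
        workers.emplace_back([&] {
            for (size_t j = next++; j < jobs.size(); j = next++) {
                jobs[j]();  // each job updates an isolated slice of game state
            }
        });
    }
    for (auto& w : workers) w.join();
}

// Usage idea: instead of one big serial update call per frame, the update
// is broken into jobs (AI, animation, physics, ...) and spread across the
// six available cores.
```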

What do you mean by "not the SPU stuff"? Because I am not sure you are correct there.
 
The CPUs are pretty bad, and it's not just customers who are unhappy with them.

In the console gaming ecosystem, the PS4/XB1 Jaguar CPU is far more capable than the XB360's Xenon CPU, and a notch or so better than the PS3's Cell. Raw performance, anyhow...
[Image: ps4gpu_x1gpu.jpg]


From a PC (CPU) standpoint, sure they look "weak". But that's to be expected... especially given the price range and demographic of gamers MS and Sony were (and are) shooting for.
 
According to this presentation, Naughty Dog's game logic on PS3 was single-threaded (not the SPU job stuff). They had to move it into fibers and spread it across six threads (it's not like they could keep the game logic single-threaded and jobify just the rest for the purposes of the remaster). Even after they jobified everything, including render logic, the CPU still took 25ms. They had to get creative (and change how their engine handled frames) to get under 16ms. The amount of tweaking they had to do paints a bleak picture of the current-gen CPUs, IMHO.

It doesn't have anything to do with current gen CPUs... it has to do with the fact that they were aiming to accomplish the same work in half the frame time. 60fps is hard. A lot of people have figured that out this gen, and trying to reach it has almost tanked a few titles.
 
What do you mean by "not the SPU stuff"? Because I am not sure you are correct there.

I mean they state specifically that their gameplay code was single-threaded. The SPUs weren't used for gameplay code; "engine systems" ran on the SPUs (and those were already multithreaded). They ran the game logic and then the command buffer generation within the same frame, so it wasn't even pipelined like Guerrilla's engine, where game logic and render logic work on different frames so they can run in parallel (and this is what ND needed to do to get under 16ms).

Indeed, they are now twice as fast on the CPU side, but before making the major change of running game and render logic in parallel (and after jobifying everything related to game and render logic), they were still at 25ms, which is only a few ms faster than what Cell was doing on PS3. With a seven-year gap between consoles, I'd expect the PS4's CPU to do better (I guess running at 3.2GHz with those beastly SPUs makes Cell still a very formidable processor).
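To make the "game and render logic on different frames" point concrete, here's a stripped-down sketch of that kind of frame pipelining (my own illustration, not ND's or Guerrilla's actual code). The game thread simulates frame N+1 while the render thread is still turning frame N into a command buffer, so the two costs overlap instead of adding up:

```cpp
#include <condition_variable>
#include <mutex>
#include <queue>
#include <thread>

struct FrameState { int frameIndex; /* simulation results for one frame */ };

std::mutex mtx;
std::condition_variable cv;
std::queue<FrameState> ready;  // frames simulated but not yet rendered
bool done = false;

// Game-logic thread: simulates frame N+1 while the render thread is still
// turning frame N into a command buffer. (A real engine would block here
// if it got more than one frame ahead, to keep input latency in check.)
void GameThread(int frameCount) {
    for (int n = 0; n < frameCount; ++n) {
        FrameState f{n};
        // ... run gameplay/AI/animation jobs for frame n here ...
        std::lock_guard<std::mutex> lock(mtx);
        ready.push(f);
        cv.notify_one();
    }
    std::lock_guard<std::mutex> lock(mtx);
    done = true;
    cv.notify_one();
}

// Render thread: consumes finished frames and builds command buffers.
void RenderThread() {
    for (;;) {
        std::unique_lock<std::mutex> lock(mtx);
        cv.wait(lock, [] { return !ready.empty() || done; });
        if (ready.empty()) return;
        FrameState f = ready.front();
        ready.pop();
        lock.unlock();
        // ... generate and submit the command buffer for f.frameIndex ...
    }
}

int main() {
    std::thread game(GameThread, 1000), render(RenderThread);
    game.join();
    render.join();
}
```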

It doesn't have anything to do with current gen CPUs... it has to do with the fact that they were aiming to accomplish the same work in half the frame time. 60fps is hard. A lot of people have figured that out this gen, and trying to reach it has almost tanked a few titles.

They wanted to do the same work in half the time on hardware that is seven years newer. Shortbread's performance slide is telling (although the Xbox figure doesn't look right, it wasn't something that held it back, because the SPUs were already busy trying to make up for the lacking RSX).
 
As much as gamers want to complain about the PS4/XB1 CPUs being poor choices, we're still in the early stages of this generation. Developers and engine optimization get better over time, not overnight.

That being said, if console gamers want better hardware, then the underlying issue must be addressed: how much is too much when it comes to purchasing console hardware? And from what we have seen in the past (3DO, Neo Geo, PS3, XB1), console gamers aren't very receptive to high prices.
I am not sure if you are implying that I am "bitching" or "complaining". There is an obvious difference between what we were experiencing in the past and what we are experiencing now. In the past, console makers were selling at a loss, and the consoles were outperforming their price.
Now we see games that aren't pushing the envelope, yet the performance isn't stable. It's not a matter of bitching; it is a simple matter of observation.
 
I am not sure if you are implying that I am "bitching" or "complaining". There is an obvious difference between what we were experiencing in the past and what we are experiencing now. In the past, console makers were selling at a loss, and the consoles were outperforming their price.
Now we see games that aren't pushing the envelope, yet the performance isn't stable. It's not a matter of bitching; it is a simple matter of observation.
Please explain what you were experiencing less than two years in, last gen, that was so stable. Lair (because I was experiencing those performance issues)? Personally, this gen's issues seem more like PC-type issues (microstuttering and the like).

It's obvious that developers need more time with these console architectures. Plus, there are still performance improvements locked away in reserves (CPU and RAM reservations).
 
Please explain what you were experiencing less than two years in, last gen, that was so stable. Lair (because I was experiencing those performance issues)? Personally, this gen's issues seem more like PC-type issues (microstuttering and the like).

It's obvious that developers need more time with these console architectures. Plus, there are still performance improvements locked away in reserves (CPU and RAM reservations).
I don't think you are following me. The examples I mentioned are games that aren't supposed to push the envelope. If you are selectively choosing last gen's worst examples, you aren't contributing a counter-argument.
 
http://www.eurogamer.net/articles/d...-install-the-witcher-3-day-one-patch#comments

After a weekend of testing The Witcher 3 on Xbox One, it's fair to say installing its day one patch (version 1.01) is something of a double-edged sword. On the one hand, the 588MB file improves frame-rates slightly during play, while fixing minor bugs scattered across the game. In many ways it's a more polished experience with the patch - notably we have less geometry pop-in during cut-scenes, fewer instances of flickering shadows, and a great many more tweaks elsewhere.

Xbox One's dynamic resolution scaling may also help to uphold this level of performance - in effect before and after the patch. In theory, this allows the framebuffer to switch between a 1600x900 resolution to a native 1920x1080 on the fly, seemingly based on GPU load at any given point. However, in practice this doesn't switch as much as we'd expected - The Witcher 3 is predominantly a 900p game, and the only scenes we've found to run at a full 1080p are the in-engine rendered title screen, and the video cut-scenes. Even reducing the GPU load by looking directly up to the sky shows the game is still rendering at a native 900p. We do notice some indoors scenes rendering at what seems to be a higher resolution than 900p, but even here, it is clearly not a full, native 1080p output.
 
DF said:
But the downsides pack a punch too. It's apparent after switching between the game's default and patched states that these improvements come at a cost. Chief among these is the aggressive stuttering during pre-rendered cut-scenes.

In the Gamersyde PS4 vs. XBO video, BlimBlim notes that the Xbox One version also drops frames when scrolling around the map. Hopefully this is a minor glitch common to both versions that CDPR can track down and eliminate, because these sure are weird places for stutters to creep in. On the plus side, it's not going to affect your enjoyment of the game, unlike the eternal loading bug and the Gwent-crashing bug, which seem common to both consoles.
 
I am not sure if you are implying that I am "bitching" or "complaining". There is an obvious difference between what we were experiencing in the past and what we are experiencing now. In the past, console makers were selling at a loss, and the consoles were outperforming their price.

Now we see games that aren't pushing the envelope, yet the performance isn't stable. It's not a matter of bitching; it is a simple matter of observation.

It wasn't aimed at you... gamers in general.

Anyhow, Sony/MS/Nintendo for the most part want their hardware profitable right out of the gate. The only way this is going to change for console gamers wanting premium hardware (premium performance) is if: a) gamers move the acceptable price range ($299-399) to a more premium ($499-599) pricing scheme, or b) the manufacturers provide two different SKU models (a core model and a premium model) that are compatible hardware- and SDK-wise, so that supporting both wouldn't cost developers any additional time or money.
 
I don't think you are following me. The examples I mentioned are games that aren't supposed to push the envelope. If you are selectively choosing last gen's worst examples, you aren't contributing a counter-argument.
Please explain why you believe certain games "aren't supposed to push the envelope". This is based on what, exactly? If they aren't supposed to push the envelope, why aren't you talking about those developers' engines? Other games, ones that are possibly pushing developers' current knowledge of the architecture, don't seem to have most of those issues.

Was Lair supposed to be pushing the PS3 last gen? Knowing what we know now, do you believe that? Then what about all the Madden games, etc., that ran at half the frame rate (or at a resolution of 576p) at this same point in history? Are those enough examples?
 
http://www.eurogamer.net/articles/digitalfoundry-2014-the-witcher-3-tech-analysis
http://www.eurogamer.net/articles/2...-combat-was-deliberately-easy-cd-projekt-says

According to these two articles, the 2014 E3 build could run on an Xbox One and was also 900p @ 30fps. Since the retail version has almost no resolution boost, why did CDP drop the 2014 version? Apparently the 2014 build looks more impressive than the final retail version.
 
Xbox One's dynamic resolution scaling may also help to uphold this level of performance - in effect before and after the patch. In theory, this allows the framebuffer to switch between a 1600x900 resolution to a native 1920x1080 on the fly, seemingly based on GPU load at any given point. However, in practice this doesn't switch as much as we'd expected - The Witcher 3 is predominantly a 900p game, and the only scenes we've found to run at a full 1080p are the in-engine rendered title screen, and the video cut-scenes. Even reducing the GPU load by looking directly up to the sky shows the game is still rendering at a native 900p. We do notice some indoors scenes rendering at what seems to be a higher resolution than 900p, but even here, it is clearly not a full, native 1080p output.

Was the 1080p PR even warranted? I wonder if MS pushed CDPR on this issue. 900p should have just been accepted.
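For what it's worth, dynamic resolution heuristics are usually something this simple (a sketch with made-up thresholds and step sizes, not whatever CDPR actually ships): measure the last frame's GPU time against the 33.3ms budget and nudge the render target height between 900 and 1080 accordingly:

```cpp
#include <algorithm>
#include <cstdint>

// Hypothetical dynamic-resolution controller: pick next frame's render
// height from last frame's measured GPU time against the 33.3ms budget.
// Thresholds and step size are illustrative, not The Witcher 3's values.
uint32_t NextRenderHeight(float gpuMsLastFrame, uint32_t currentHeight) {
    const float    budgetMs = 33.3f;  // 30fps target
    const uint32_t minH     = 900;    // 1600x900 floor
    const uint32_t maxH     = 1080;   // 1920x1080 ceiling
    const uint32_t step     = 36;     // resize granularity

    uint32_t next = currentHeight;
    if (gpuMsLastFrame > budgetMs * 0.95f)       // nearly over budget: drop
        next = currentHeight - step;
    else if (gpuMsLastFrame < budgetMs * 0.80f)  // lots of headroom: raise
        next = currentHeight + step;
    return std::clamp(next, minH, maxH);         // width scales with 16:9
}
```

If the 900p path already sits close to the frame budget in normal play, a scaler like this would have no reason to ever step up outside trivial scenes like the title screen, which might explain what DF observed.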
 
http://www.eurogamer.net/articles/digitalfoundry-2014-the-witcher-3-tech-analysis
http://www.eurogamer.net/articles/2...-combat-was-deliberately-easy-cd-projekt-says

According to these two articles, the 2014 E3 build could run on an Xbox One and was also 900p @ 30fps. Since the retail version has almost no resolution boost, why did CDP drop the 2014 version? Apparently the 2014 build looks more impressive than the final retail version.
Maybe they were using the Nvidia GTX equipped dev kits again...
 
That's how it's been for one generation (this one). Previous generations have, by and large, had very strong hardware that went toe-to-toe with decent PCs in terms of results for years after release. And it's ridiculous to think that a console can be designed to be technologically 'strong' eight years after release, at least without being £2000 at launch.

Years of gaming have given me a very different impression than the one you have.

Yeah, consoles at launch look somewhat good on paper against available PC GPUs. However, by the time the first crop of games that really express the power of those consoles arrives, newer PC GPUs usually beat consoles on specs by a country mile.

How does the NV2A in the OG Xbox compare to an ATI Radeon 9700? Look at Morrowind on the OG Xbox versus the PC and tell me the Xbox version holds up well in comparison. If this were last gen, the 2900 XT would have been released four days ago. Do the 360's GPU specs really compare well to the performance offered by that card?

Consoles haven't gotten by on having comparable hardware. They have gotten by on first-class access to development dollars, thanks to being the platform of choice for most publishers of AAA games.

Consoles don't have to be designed to be "technologically strong eight years after release", and that's because most game design and development, especially for big, costly projects, tends to be centered on the current generation of consoles, whether those consoles are two years old or eight years old. That's also why the performance delta between console hardware and PC hardware (which grows every year) is never truly expressed in the visual delta between console games and their PC ports.
 
Years of gaming have given me a very different impression than the one you have.

Yeah, consoles at launch look somewhat good on paper against available PC GPUs. However, by the time the first crop of games that really express the power of those consoles arrives, newer PC GPUs usually beat consoles on specs by a country mile.
It's not about specs but about what's accomplished on screen. Baldur's Gate Dark Alliance/CON on PS2 looked far better than Dungeon Siege on a Ti4200 (or whatever GeForce 4 I had at the time). Obviously raw specs won't be comparable, because specs double every couple of years; such is the rate of technological progress. There's no way a console can match, in hardware, a decent PC built on tech from 4+ years after its release. Yet console games have typically been in the same ballpark as mainstream PC games (especially at the same price bracket!) because the hardware was loss-leading and efficiently used with highly optimised software, and, importantly, console gamers have been constantly impressed by the visuals of late-in-the-life-cycle titles that really pushed the envelope. We weren't playing Uncharted 3 and saying, "dang, this is shit versus a decent modern PC, I wish I hadn't been so cheap and had bought a £500 console." Nor were we playing PS1 in 2002 and thinking it should be looking better than it was, as if the hardware should magically have improved over time.
 
Please explain why you believe certain games "aren't supposed to push the envelope". This is based on what, exactly? If they aren't supposed to push the envelope, why aren't you talking about those developers' engines? Other games, ones that are possibly pushing developers' current knowledge of the architecture, don't seem to have most of those issues.

Was Lair supposed to be pushing the PS3 last gen? Knowing what we know now, do you believe that? Then what about all the Madden games, etc., that ran at half the frame rate (or at a resolution of 576p) at this same point in history? Are those enough examples?
Again, if you are selectively choosing the worst examples, you aren't contributing to the discussion. Lair is that example; you mentioned it twice. But even if I accept your Lair example, you know perfectly well that comparing last gen's Lair to this gen's New 'n' Tasty is silly. The former was under huge development pressure and was indeed trying to do a lot more for its time than New 'n' Tasty on the PS4, such as huge environments with massive battles. It faced more challenges and difficulties. Back then we also had a lot more superb examples to compare to, so we know the worst examples you selectively pick were the exception to the rule. Regardless, if you want to discuss this with me, you should drop the douche attitude.

Here is my original post:

What I don't understand is why these consoles struggle to reach 60fps with titles that aren't very demanding. There is either a significant bottleneck somewhere, or optimization is very hard.
Even remastered games don't seem to reach 1080p and a stable 60fps, let alone a reboot.
Discuss it with me in that original context. Pay attention to these two points: 1) I did mention the possibility of difficult optimization; 2) remasters don't push the envelope.
It was food for thought; I expressed my pondering. If you want to share what you think the issue may be, there are better ways to do it than with this attitude.
 
It's not about specs but about what's accomplished on screen. Baldur's Gate Dark Alliance/CON on PS2 looked far better than Dungeon Siege on a Ti4200 (or whatever GeForce 4 I had at the time). Obviously raw specs won't be comparable, because specs double every couple of years; such is the rate of technological progress. There's no way a console can match, in hardware, a decent PC built on tech from 4+ years after its release. Yet console games have typically been in the same ballpark as mainstream PC games (especially at the same price bracket!) because the hardware was loss-leading and efficiently used with highly optimised software, and, importantly, console gamers have been constantly impressed by the visuals of late-in-the-life-cycle titles that really pushed the envelope. We weren't playing Uncharted 3 and saying, "dang, this is shit versus a decent modern PC, I wish I hadn't been so cheap and had bought a £500 console." Nor were we playing PS1 in 2002 and thinking it should be looking better than it was, as if the hardware should magically have improved over time.

You are comparing a product that was Gas Powered Games' first game, built for the PC, with a product that was the third game by Snowblind Studios. You can't simply boil the visual difference down to hardware specs. Besides the different target hardware, you also have to account for differences in skill level, resources and support.

This is my point: the delta you see between console games and PC games isn't simply due to the differences in hardware performance. Consoles don't have to worry about matching the performance of a PC four years in the future, because in four years publishers and developers will still be pouring the bulk of their resources into developing on consoles.

Games, especially AAA games, are bankrolled with the aim of targeting the widest part of the market. Loss-leading hardware or not, that won't change. Most of the resources will be spent trying to squeeze as much performance out of the consoles as possible. PC ports will get a hefty portion of tweaks like higher resolution, higher-res textures and better AA, but nothing that warrants heavy investment.

Put it this way: PCs with high-end CPUs and 1.3-1.8 TFLOPS GPUs sporting 150 GB/s of bandwidth have been around since before 2010, but no one here was going, "I can't wait to buy an XB1 or a PS4 so my games can look like PC games from three years ago!!!"
 
In the Gamersyde PS4 vs. XBO video, BlimBlim notes that the Xbox One version also drops frames when scrolling around the map. Hopefully this is a minor glitch common to both versions that CDPR can track down and eliminate, because these sure are weird places for stutters to creep in. On the plus side, it's not going to affect your enjoyment of the game, unlike the eternal loading bug and the Gwent-crashing bug, which seem common to both consoles.
Minor glitch, yes. The Witcher 3 runs smooth as silk (if by that we define smooth as 30fps) on the Xbox One. And I am very sensitive to framerates; in fact it took me some time to adapt, because most of my games run at 60fps. Unlike Far Cry 4, I can easily stand it, and now I am happily playing this incredible game. More tomorrow... can't wait to play again.

edit: note that the DF article on TW3 never complained about the in-game framerate, other than preaching their usual locked-30fps holy grail crusade.
 