Digital Foundry Article Technical Discussion [2019]

Which effects and techniques are generally diminished on the base Xbox One compared to the base PS4, and which are likely to be in future titles?

Or is resolution the only difference?
Most games keep effects similar but sacrifice resolution/frame rate, but I guess it's because games have matured to a point where they want a "canon" look, with just varying degrees of fine rendering fidelity. They don't want to be fine-tuning individual assets, particle effects, lighting set-ups or shaders for each platform.
But I bet all those things would look better on every platform had the base Xbox One been more competitive performance-wise. I feel like multiplatform devs target the lowest common denominator with a slightly reckless leniency for low frame rates on the base machine, and whatever they can't optimise by the end of the dev cycle they compensate for with an aggressive resolution drop.
Or they use the base Xbox One as the measuring stick for where everything should be able to run, even if that means accepting a shit resolution and some frame drops, and the PS4 as where everything should run at 1080p with a minimally solid frame rate; then for the X and Pro they just twist knobs until they see what they can get away with.
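To make the knob-twisting concrete, here is a minimal sketch of a per-platform dynamic-resolution heuristic, assuming a purely pixel-bound workload; the budgets, scale bounds and the 36 ms sample are all invented for illustration, not taken from any real engine:

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

// Hypothetical per-platform knobs; real engines expose many more.
struct PlatformBudget {
    const char* name;
    float targetFrameMs;  // 33.3 for 30 fps, 16.6 for 60 fps
    float minScale;       // how far below native the platform may drop
    float maxScale;       // 1.0 = native 1080p; >1.0 for Pro/X targets
};

// Assume GPU time scales linearly with pixel count (a crude but common
// heuristic for pixel-bound titles). Scale is per-axis, hence the sqrt.
float nextResolutionScale(const PlatformBudget& p, float scale, float gpuMs) {
    float next = scale * static_cast<float>(std::sqrt(p.targetFrameMs / gpuMs));
    return std::clamp(next, p.minScale, p.maxScale);
}

int main() {
    const PlatformBudget platforms[] = {
        {"Xbox One",   33.3f, 0.75f, 1.00f},  // allowed to drop well below 1080p
        {"PS4",        33.3f, 0.90f, 1.00f},  // expected to hold near 1080p
        {"PS4 Pro",    33.3f, 1.00f, 1.50f},  // from here, knobs only go up
        {"Xbox One X", 33.3f, 1.00f, 2.00f},
    };
    for (const auto& p : platforms)  // pretend the last frame took 36 ms at scale 1.0
        std::printf("%s -> next scale %.2f\n", p.name, nextResolutionScale(p, 1.0f, 36.0f));
}
```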
 
Some tools deficiency might be a byproduct of corporate agenda. Let's not forget that Microsoft ultimately aims to make the Xbox a trojan horse for the Microsoft Store and services, not so much a dedicated platform. Easy portability between Xbox and Windows might be the priority, rather than the best and most efficient solution for a given piece of hardware. Here's a rare quote that touches on the subject (of course it's old and there is constant development, but still...)
DF says they heard about performance problems with some tools on the Xbox (look around the end of the video), so I quoted a very experienced programmer who suggested that some problems came from Microsoft choosing a suboptimal solution for the fixed-spec Xbox API, one that is also used for rendering on the PC side; DX11 helps with broad compatibility, but at a performance cost. Why would they do this? To please other sides of the company and ease the porting of code, future services and whatnot, even at the cost of some console performance. The second post was a short, colourful synopsis of their corporate motivation for that decision.

I don't believe the problem is exclusively DirectX related. Yes, DirectX allows for a broad, sweeping scope of compatibility between applications and game engines across PC hardware and other platforms (phones, tablets, etc.). But of course, this was to eliminate the need for developers to create individualized toolchains/APIs for the vast sea of PC configurations and consumer electronics. That's mostly PC-specific, though, not Xbox-specific.

DirectX on the Xbox is very much tailored to its specific needs, including the ability to program "close to the metal" while maintaining a level of cross-platform ecosystem cohesiveness. The problem may have stemmed from older SDK DirectX libraries, from developers using ESRAM inefficiently, or from a combination of both. But without a developer stating the exact issue with DirectX, rather than hallway whispers, it's all conjecture at this point.
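On the ESRAM point, some back-of-the-envelope arithmetic shows how easily a naive render-target layout stops fitting; the 32 MB figure is public, but the G-buffer layout below is a made-up example, not any shipped title's:

```cpp
#include <cstdio>

// Bytes for one full-screen render target.
long long rtBytes(int w, int h, int bytesPerPixel) {
    return 1LL * w * h * bytesPerPixel;
}

int main() {
    const double MiB = 1024.0 * 1024.0;
    const double esram = 32 * MiB;  // Xbox One's fast on-die ESRAM
    // Hypothetical deferred G-buffer: 4x RGBA8 colour targets + 32-bit depth.
    const double gbuffer1080 = 5.0 * rtBytes(1920, 1080, 4);  // ~39.6 MiB: doesn't fit
    const double gbuffer900  = 5.0 * rtBytes(1600,  900, 4);  // ~27.5 MiB: fits
    std::printf("ESRAM: %.0f MiB\n", esram / MiB);
    std::printf("1080p G-buffer: %.1f MiB\n", gbuffer1080 / MiB);
    std::printf("900p  G-buffer: %.1f MiB\n", gbuffer900 / MiB);
    // Either some targets spill to the much slower DDR3 pool, or resolution
    // drops until the working set fits: one plausible flavour of "using
    // ESRAM in an inefficient way", and of sub-1080p output on the base box.
}
```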

As for the Xbox brand being a (PC) trojan horse into people's living rooms: that's pretty laughable and quite ignorant. Microsoft saw an opportunity to expand their OS presence, applications and services outside the PC space. When the Dreamcast didn't meet those expectations, Microsoft retooled their vision towards providing such games, apps and services through their own brand of hardware and ecosystem. That's not to say Microsoft didn't feel internal pressure to compete for living-room dominance against Sony's PS2. In the end, though, Microsoft's goals (then and now) have always been about expanding their business beyond the PC space.
 
Compared to the PS4, not the mid-gen refresh consoles?

Onrush is 60 fps on the base PS4 and 30 fps on the Xbox One.
Yes. I don't think the game has been enhanced on the One X, so it should still be capped at 30 fps. But I remember the team that ported the game to Xbox One got help from Nixxes.
 
Most games keep effects similar but sacrifice resolution/frame rate, but I guess it's because games have matured to a point where they want a "canon" look, with just varying degrees of fine rendering fidelity. They don't want to be fine-tuning individual assets, particle effects, lighting set-ups or shaders for each platform.
But I bet all those things would look better on every platform had the base Xbox One been more competitive performance-wise. I feel like multiplatform devs target the lowest common denominator with a slightly reckless leniency for low frame rates on the base machine, and whatever they can't optimise by the end of the dev cycle they compensate for with an aggressive resolution drop.
Or they use the base Xbox One as the measuring stick for where everything should be able to run, even if that means accepting a shit resolution and some frame drops, and the PS4 as where everything should run at 1080p with a minimally solid frame rate; then for the X and Pro they just twist knobs until they see what they can get away with.

It's all very obvious, of course. In today's multiplatform reality, engines are architected to minimise code maintenance, potential work, support burden and angry-nerd rage. Take the least common denominator and set the knobs from there, even if some good things get lost on some platforms. Examples: checkerboarding/temporal injection — how many third-party games reach the IQ of the best first-party implementations on the Pro? Or AMD cards — how many PC games reach the level of performance of the DICE/id engines? I wish there was less of this "commoditisation" nowadays and more of a demoscene-like angle.
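For reference, the core idea behind the checkerboarding mentioned above, as a toy sketch: each frame shades only half the pixels in a checkerboard pattern and fills the holes from the previous frame. Real implementations (PS4 Pro titles, for instance) add motion-vector reprojection and ID buffers; everything here is simplified for illustration:

```cpp
#include <vector>

// Toy checkerboard reconstruction. For simplicity the "half-rate" buffer is
// stored at full-resolution coordinates; real CBR packs it and reprojects
// history with motion vectors instead of copying it verbatim.
struct Image {
    int w = 0, h = 0;
    std::vector<unsigned> px;  // packed RGBA
    unsigned& at(int x, int y) { return px[y * w + x]; }
};

// Alternate which diagonal of every 2x2 quad is freshly shaded each frame.
bool shadedThisFrame(int x, int y, int frame) {
    return ((x + y) & 1) == (frame & 1);
}

void reconstruct(Image& out, Image& halfRate, Image& history, int frame) {
    for (int y = 0; y < out.h; ++y)
        for (int x = 0; x < out.w; ++x)
            out.at(x, y) = shadedThisFrame(x, y, frame)
                               ? halfRate.at(x, y)   // shaded this frame
                               : history.at(x, y);   // hole: reuse last frame
}
```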



As for the Xbox brand being a (PC) trojan horse into people's living rooms: that's pretty laughable and quite ignorant. Microsoft saw an opportunity to expand their OS presence, applications and services outside the PC space. When the Dreamcast didn't meet those expectations, Microsoft retooled their vision towards providing such games, apps and services through their own brand of hardware and ecosystem. That's not to say Microsoft didn't feel internal pressure to compete for living-room dominance against Sony's PS2. In the end, though, Microsoft's goals (then and now) have always been about expanding their business beyond the PC space.
PC trojan horse? That's a little too literal. In today's CEO mindset it's more like a Microsoft-subscription trojan horse, plus other online opportunities: "share your Space Marine achievements with friends on LinkedIn and get a discount on Office 365". After many years of masquerading as a magical console, Xbox is a brand they can use to strengthen their core businesses, especially in a world of Android and iOS. That's how things look to me, and the same meaning can be read almost literally in non-gaming interviews with Microsoft VPs; it has nothing to do with Dreamcast fallout today.

It can also be understood literally. Where are Xbox exclusives going? Back to PC, to benefit that platform. Exclusives crafted for a unique platform? Not good. A console-targeted game? Not good for the gamers. Keyboard and mouse on console? Good for gamers; cross-platform gaming too. Killer games after years of pushing the architecture, with budgets granted by a massive user base, all of which finally shows the value of this limited piece of hardware? No... but don't buy a GPU... buy an entire new console... more pixels, it's the best place to play. By this logic it's getting to the point where there is nothing unique and dedicated about this console, and that seems to be the point: a hybrid of the worst and least wanted aspects of the PC and console worlds. Worse, slowly, subtly, step by step, with Spencer's trustworthy smile, they are trying to seed this in people's minds as the expected standard, diluting console advantages, ultimately benefiting Microsoft.



...DirectX on the Xbox is very much tailored to its specific needs, including the ability to program "close to the metal" while maintaining a level of cross-platform ecosystem cohesiveness. The problem may have stemmed from older SDK DirectX libraries, from developers using ESRAM inefficiently, or from a combination of both. But without a developer stating the exact issue with DirectX, rather than hallway whispers, it's all conjecture at this point.
Again, all of this should be true, and I wish these things weren't an NDA minefield. One thing is obvious to any B3D reader: keeping things compatible across different architectures requires more abstraction and more careful, general, higher-level thinking. There is no way around it, and it can have performance implications for closed-spec hardware like a console. And there were such implications, even in the crucial launch period, as one very brave dev (so brave it can only be true) dared to state. It's naive to think there weren't any such compromises on the Xbox side in the push towards the "One Microsoft" era; they should be expected even later, as the linked interview mentions, but there clearly were issues at launch. This is one more subtle aspect of the rambling in the paragraph above.
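A contrived illustration of the abstraction tax being described: a portable API must re-validate and translate generic state into hardware packets on every draw, while a fixed-spec console path can bake the packets once and just copy them. Both paths and all the packet values below are invented; neither resembles a real D3D or XDK interface:

```cpp
#include <cstdint>
#include <vector>

// Portable path: state is generic, so every draw pays CPU time to validate
// it and translate it into hardware packets at submission time.
struct GenericState { int blendMode, depthFunc, topology; };

void drawPortable(const GenericState& s, std::vector<std::uint32_t>& cmd) {
    // ...imagine validation and encoding-table lookups here...
    cmd.push_back(0xA000u | static_cast<std::uint32_t>(s.blendMode));
    cmd.push_back(0xB000u | static_cast<std::uint32_t>(s.depthFunc));
    cmd.push_back(0xC000u | static_cast<std::uint32_t>(s.topology));
    cmd.push_back(0xD7A3u);  // fictional draw packet
}

// Fixed-spec path: the exact GPU is known up front, so the packets are
// precomputed once and the per-draw cost is a small copy.
struct BakedState { std::uint32_t packets[4]; };

void drawBaked(const BakedState& s, std::vector<std::uint32_t>& cmd) {
    cmd.insert(cmd.end(), s.packets, s.packets + 4);
}
```

Multiply the difference between those two paths by tens of thousands of draws per frame and the portability-versus-fixed-hardware trade-off stops being academic.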


Bonus slide (there are some opinions): [image: v6QwHw9.png]
 
Agreed. Still... nitpicking here, but if you play classic games from the last decade and older, my 1050 Ti is pretty capable of running them at 4K.

The most recent game my 1050 Ti ran at 4K was Ori and the Blind Forest (though I think there was something funky going on with the frame rate, or maybe that was me; I don't have DF staff by my side to tell. It was still quite playable, and I managed to beat some complex zones of the game).
 
That could make for an interesting DF or DF Retro video. I mean, take 1050 Ti-level hardware or similar (1060, RX 570, etc.), run classic games or modern non-demanding games at 4K, and check the frame rates.
 
Agreed. Still... nitpicking here, but if you play classic games from the last decade and older, my 1050 Ti is pretty capable of running them at 4K.

The most recent game my 1050 Ti ran at 4K was Ori and the Blind Forest (though I think there was something funky going on with the frame rate, or maybe that was me; I don't have DF staff by my side to tell. It was still quite playable, and I managed to beat some complex zones of the game).
Really? That game would always slow down in about half a dozen areas when I played it on a sub-1080p monitor with a 1050 Ti myself. My impression was that those areas were especially full of overlapping transparent lighting/fog sprites, but that's mostly speculation, honestly.
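The overlapping-transparency theory is at least plausible on bandwidth grounds. Some rough arithmetic, treating alpha blending as purely bandwidth-bound; the 30-layer overdraw figure is a pure assumption, and only the ~112 GB/s 1050 Ti bandwidth is a real spec:

```cpp
#include <cstdio>

int main() {
    // Alpha blending reads and writes the framebuffer: ~8 bytes per RGBA8
    // pixel touched, ignoring framebuffer compression and caches.
    const double bytesPerBlend = 8.0;
    const double bandwidth = 112e9;  // GTX 1050 Ti: 128-bit GDDR5 @ 7 Gbps
    const double fps = 60.0;
    const int layers = 30;           // assumed worst-case sprite overdraw

    const struct { const char* name; double pixels; } res[] = {
        {"1080p", 1920.0 * 1080.0},
        {"4K",    3840.0 * 2160.0},
    };
    for (const auto& r : res) {
        double needed = r.pixels * layers * bytesPerBlend * fps;
        std::printf("%s at %dx overdraw: %.0f%% of memory bandwidth\n",
                    r.name, layers, 100.0 * needed / bandwidth);
    }
    // Prints ~27% at 1080p but ~107% at 4K: a fog-heavy scene that fits the
    // budget at low resolution can saturate the bus completely at 4K.
}
```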
 
Really? That game would always slow down in about half a dozen areas when I played it on a sub-1080p monitor with a 1050 Ti myself. My impression was that those areas were especially full of overlapping transparent lighting/fog sprites, but that's mostly speculation, honestly.
Trust me on this one. I've beaten the game on a 1050 Ti at a stable 1080p 60 fps, and it is also playable at 4K.

The computer where I beat it is the laptop I'm using to write this post:

Intel i7-7700HQ
1050 Ti 4GB
16 GB of RAM
256 GB SSD + 1 TB HDD
 
Trust me on this one. I've beaten the game on a 1050 Ti at a stable 1080p 60 fps, and it is also playable at 4K.

The computer where I beat it is the laptop I'm using to write this post:

Intel i7-7700HQ
1050 Ti 4GB
16 GB of RAM
256 GB SSD + 1 TB HDD
I don't doubt you. I'm just pissed to learn it was slowing down on my computer, since I've played much more demanding stuff on it with no problems. Only Ori and Hob were problem titles, and from what I saw online, that second one also struggles on Nvidia cards for many people. I only recently went back to gaming on PC more often, and with more demanding stuff, but I'm too old to go back to fiddling with drivers and this or that knob. As more of a console guy, when I do sit down to play a videogame I just want to play without giving that stuff much thought. Even though I'm interested in graphics and technology, I don't want to engage that part of my mind when I'm actually gaming, which might be kind of paradoxical coming from a Beyond3Der.
 
That could make for an interesting DF or DF Retro video. I mean, take 1050 Ti-level hardware or similar (1060, RX 570, etc.), run classic games or modern non-demanding games at 4K, and check the frame rates.
Well, it always depends on how demanding a title is. Not every game is a Crysis ;)
 
Well, it always depends on how demanding a title is. Not every game is a Crysis ;)
Right... and that's something that deeply disappoints me about tech sites. They usually say things like "The 1050 Ti is not meant for 4K" or "The 1060 is great for 1080p/1440p 60 fps; for 4K you need a 1080, 1080 Ti or Vega...".

And that absolute truth is not true. I can't run The Witcher 3 at 4K with a smooth frame rate on the 1050 Ti, but it runs, and the game holds a stable 1080p 60 fps if you know where to tweak. And none of that means you can't run Oblivion at 4K 60 fps without a hitch, and so on.
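A crude way to see why the blanket verdicts fall apart: if a title is pixel-bound, frame time scales roughly with pixel count, and 4K has 4x the pixels of 1080p. The sample frame rates below are hypothetical, just to show the two regimes:

```cpp
#include <cstdio>

// First-order estimate for a pixel-bound title: 4K renders 4x the pixels
// of 1080p, so expect roughly a quarter of the frame rate.
double estimate4kFps(double fps1080p) { return fps1080p / 4.0; }

int main() {
    const struct { const char* title; double fps1080p; } samples[] = {
        {"older title with huge 1080p headroom (hypothetical)", 300.0},
        {"2015 AAA title that just holds 60 (hypothetical)",     60.0},
    };
    for (const auto& s : samples)
        std::printf("%s: ~%.0f fps at 4K\n", s.title, estimate4kFps(s.fps1080p));
    // ~75 fps vs ~15 fps on the same card: whether a GPU is "meant for 4K"
    // depends on the game at least as much as on the GPU tier.
}
```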
 
Right... and that's something that deeply disappoints me about tech sites. They usually say things like "The 1050 Ti is not meant for 4K" or "The 1060 is great for 1080p/1440p 60 fps; for 4K you need a 1080, 1080 Ti or Vega...".

And that absolute truth is not true. I can't run The Witcher 3 at 4K with a smooth frame rate on the 1050 Ti, but it runs, and the game holds a stable 1080p 60 fps if you know where to tweak. And none of that means you can't run Oblivion at 4K 60 fps without a hitch, and so on.
Well, tech sites want to tell you what a card is good at with current games at high details, and a 1050 Ti is not really meant to play current games at 4K.
Old games don't really matter from that point of view. Hey, my Voodoo 5 can run Half-Life at 1600x1200, but who cares about that old game (well... actually, about that old hardware ^^). Full HD was nothing really new for PC gamers, but it always depends on the card and the game you test.

If reviewers write "4K is no problem" (with low details), there are enough people out there who will buy such a card expecting to play at 4K with all details enabled.
And in the end we get "review" videos of a PC with a low-end CPU and a 1050 card where the reviewer states that it runs games as well as a PS4 Pro/Xbox One X and is (somehow) cheaper... those videos are really misleading.

But well, I'm still happy with my RX 570 4GB on a 1080p display :) (there are enough people/reviewers saying this card's 4 GB isn't enough to run current games even at 1080p... so far everything runs well).
 
Well, tech sites want to tell you what a card is good at with current games at high details, and a 1050 Ti is not really meant to play current games at 4K.
Old games don't really matter from that point of view. Hey, my Voodoo 5 can run Half-Life at 1600x1200, but who cares about that old game (well... actually, about that old hardware ^^). Full HD was nothing really new for PC gamers, but it always depends on the card and the game you test.

If reviewers write "4K is no problem" (with low details), there are enough people out there who will buy such a card expecting to play at 4K with all details enabled.
And in the end we get "review" videos of a PC with a low-end CPU and a 1050 card where the reviewer states that it runs games as well as a PS4 Pro/Xbox One X and is (somehow) cheaper... those videos are really misleading.

But well, I'm still happy with my RX 570 4GB on a 1080p display :) (there are enough people/reviewers saying this card's 4 GB isn't enough to run current games even at 1080p... so far everything runs well).
The RX 570 is, IMHO, the best AMD GPU in terms of performance/power consumption/price. I have one, and 1080p 60 fps is basically guaranteed on almost everything, as noted in the DF review back in the day, which matches my personal experience.

I'm quite surprised to learn that a Voodoo 5 can run a game at 1600x1200. The question is... how smoothly? My last Voodoo card was a 3; too bad it lacked 32-bit colour precision, but it performed very well.
 
I don't doubt you. I'm just pissed to learn it was slowing down on my computer, since I've played much more demanding stuff on it with no problems. Only Ori and Hob were problem titles, and from what I saw online, that second one also struggles on Nvidia cards for many people. I only recently went back to gaming on PC more often, and with more demanding stuff, but I'm too old to go back to fiddling with drivers and this or that knob. As more of a console guy, when I do sit down to play a videogame I just want to play without giving that stuff much thought. Even though I'm interested in graphics and technology, I don't want to engage that part of my mind when I'm actually gaming, which might be kind of paradoxical coming from a Beyond3Der.
Hehe, quite ironic, isn't it?

Also, trust me on this one: you don't need to tweak anything if you don't want to. I don't even fiddle with the settings of the 1050 Ti; it being a laptop part, I don't touch the clocks and leave everything at factory defaults, and as for drivers, I just install the new ones whenever GeForce Experience tells me they're out, and that's it.

It's like updating a console's firmware. Solid CPU, solid GPU, you're good to go.
 