Digital Foundry Article Technical Discussion [2024]

I saw this too. The Pro is putting up some fairly good results. I'm hoping Sony will revise the PSSR tech further before the generation is over. Image quality is a bigger priority than performance for me. I know some people like 60 FPS, but I hate sacrificing image and rendering quality for it. I'm OK with just 30 FPS as long as it's constant -- for third-person shooters. First person is another story.
Or you can just play it on PC where you can have both.
 
In real life, if you turn on a bright light in a dark room the entire room will instantly* light up. In games that use RTGI solutions with slow accumulation, only part of the room will instantly light up and the rest will only light up after a noticeable delay. If you turn off the light, it will instantly turn dark in real life, but there will be a delay with RTGI and accumulation. This is actually noticeable in Metro Exodus EE.

*Technically there will be a delay based on the speed of light, but that might as well be instant as far as human perception is concerned
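The lag described above can be sketched numerically. This is a minimal, hypothetical model, not code from any engine: temporal accumulators commonly blend each new frame's lighting into a history buffer with an exponential moving average, so an abrupt lighting change takes many frames to fully appear.

```python
# Hypothetical sketch of temporal accumulation lag in RTGI.
# Each frame, only a small fraction (alpha) of the newly sampled
# lighting is blended into the history buffer.

def accumulate(history, current, alpha=0.05):
    """Exponential moving average, a common temporal accumulation scheme."""
    return history + alpha * (current - history)

# Room is dark (0.0), then the light switches on (1.0).
radiance = 0.0
frames_to_90_percent = 0
while radiance < 0.9:
    radiance = accumulate(radiance, 1.0)
    frames_to_90_percent += 1

# With alpha = 0.05 it takes 45 frames (~0.75 s at 60 fps) for the
# accumulated lighting to reach 90% of the new value.
print(frames_to_90_percent)  # → 45
```

The same math explains the delay when turning a light off: the history buffer decays toward darkness over dozens of frames instead of snapping there instantly.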
I experienced this today in Indiana Jones and the Great Circle. The first time was in the fountain puzzle in the Vatican: behind the statues and the door there is very little light when you first rotate the structure.

There is some light coming from the sides of the entrance, but when you get just behind the wall and the statues, the area goes totally dark in a millisecond instead of gradually darkening. Then, when you come out of that dark spot, the area gets lit so fast that it looks strange.

I also experienced something similar in one of the tents in Gizeh (Egypt).
 
The XSX has more performance than the 2080.

In raster, yes, a little more, but generally the same ballpark. In RT and ML (which is responsible for the superior image quality in most titles) it is comfortably slower. But the 2080 is more than six years old now. Its modern equivalent is the 4060, which is itself over 18 months old, and very soon even the 5060 should be faster than that.
 
It's really sad that it took six years for an entry-level card to perform better than the old high-end card. I remember when a new x60 card performed like the previous generation's x80 card. That's how it should be.
 
Below2D If you look at the specs of the XSX and the 2080, you can see that the XSX is better in every aspect except that it doesn't have tensor cores. The PS5 is another story; its performance is lower. I wouldn't put them on the same level.
 

How about looking at real world performance at matched settings instead.

Looking at the few videos DF have done, that implies the 2080 is superior to the PS5 and XSX at matched raster settings, and faster with RT enabled.
 
Consoles don't show all they can in their first years. It was the same with the 360 and the One. And as we can see now, the 2080 has become history while the XSX has just started to show what it can do. Results will be better in the coming two years when Gears E-Day, Fable, Perfect Dark, and other games are released.
 

360 was outclassed within one year and One was dead from a performance point of view before it even released.
 
We just have different opinions about that, so I think there's no point in continuing to argue about it.

You have no argument. Within the 360's first year on sale the 8800 GTX released, and at matched settings the 8800 GTX offered better frame rates in every game for the duration of the 360's life.

And if we look at multi-GPU solutions, the 360 was outclassed from launch.

And the Xbox One was such a joke at launch in terms of GPU performance that I wouldn't even use it to try to prove any kind of point.

Gen-on-gen GPU performance increases on PC were crazy during the 360's life.

To put it into context for you: between the PS3 and PS4 releasing, Nvidia increased its Tflops by a factor of 17x.

If we apply that same scaling to today's hardware, we would finish this console generation with PC GPUs rated at 504 Tflops. The 4090 has 82.5 Tflops, which shows how much hardware progress has slowed down and how far off that pace we are.
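A quick back-of-envelope check of that scaling claim. The 17x figure and the 82.5 Tflops RTX 4090 number are from the post above; the ~29.6 Tflops baseline is my assumption, inferred from 504 / 17 (roughly an RTX 3080, a console-launch-era GPU).

```python
# Hypothetical arithmetic sketch; only the 17x and 82.5 figures
# come from the post, the baseline is inferred from 504 / 17.
ps3_to_ps4_scaling = 17            # Nvidia Tflops growth across that console gap
launch_era_pc_tflops = 504 / 17    # implied baseline ≈ 29.6 Tflops
projected_end_of_gen = launch_era_pc_tflops * ps3_to_ps4_scaling  # 504.0
rtx_4090_tflops = 82.5

# How far short of the historical pace the 4090 falls.
shortfall = projected_end_of_gen / rtx_4090_tflops
print(round(launch_era_pc_tflops, 1), round(shortfall, 1))  # → 29.6 6.1
```

In other words, matching the PS3-to-PS4-era pace would require a GPU roughly six times faster than a 4090.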
 
Consoles don't show all they can in their first years. It was the same with the 360 and the One. And as we can see now, the 2080 has become history while the XSX has just started to show what it can do. Results will be better in the coming two years when Gears E-Day, Fable, Perfect Dark, and other games are released.
It’s been 4 years. How long do we need to wait?
We just have different opinions about that, so I think there's no point in continuing to argue about it.
What do you mean, opinion? The data is out there. The RTX 2070 Super, Series X, and PS5 GPUs are all roughly similar in performance. The 2080 is usually a bit faster than them, but not by much, and the consoles can win here and there.
 
What do you mean "unless"? The 2080 is also better at RT than them. They can only win when it runs out of VRAM.
I mean, it has pretty much been established that the console APUs are around 2070 performance, less in RT.
His argument is weird and seems disconnected from reality.
 