Lower quality AO. It will be solved with ray tracing.

Honestly, I'm not sure what caused the flickering on my throat in the opening shots of the video. I used the same lights and bulbs I always use, and my previous videos never had that problem. Maybe I accidentally switched my camera into some mode without noticing!
They mentioned in the video that a full Pro patch still wouldn't be able to hit 1080p60, which I found an odd statement. Why not? And then in the next sentence Linneman was talking about a 4K60 mode for PS5...
Neat. Guess that also confirms how GPU-limited Bloodborne is as he dropped the resolution down to 720p and got a much more stable 60fps. Some areas are still in the 40s.
As much grief as people give this generation's CPUs, most games on console are GPU-limited. Otherwise, dynamically dropping the rendering resolution wouldn't steady out the framerate; it's a technique used quite often this gen.
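For anyone curious, here's a minimal sketch of how that kind of dynamic resolution controller typically works; the struct name, thresholds, and step sizes are illustrative assumptions, not taken from any particular engine:

```cpp
#include <algorithm>

// Illustrative dynamic-resolution controller: if the GPU is the bottleneck,
// shrinking the render target brings frametime back under budget. If the CPU
// is the bottleneck, no amount of resolution scaling helps.
struct DynResController {
    float scale = 1.0f;                   // fraction of native resolution per axis
    static constexpr float kMin = 0.7f;   // floor at 70% per axis (hypothetical)
    static constexpr float kMax = 1.0f;

    void Update(float gpuFrameMs, float budgetMs) {
        // Nudge the scale toward whatever keeps GPU time on budget.
        if (gpuFrameMs > budgetMs)              scale -= 0.02f;
        else if (gpuFrameMs < budgetMs * 0.9f)  scale += 0.01f;
        scale = std::clamp(scale, kMin, kMax);
    }

    int RenderWidth(int nativeWidth) const {
        return static_cast<int>(nativeWidth * scale);
    }
};
```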
That's because there are some sections that still appear to be CPU-limited.
While I don't disagree with anything you said, this has been basically true of most consoles since the beginning. Consoles have traditionally been bound by their graphics processors, and almost all of them shipped with CPUs that were well below what would be considered high end in the PC space. The fact that developers had to optimize around the limitations of the platform isn't new, either.

But that's the whole thing, my guy... engines and games BECAME GPU-dependent early on this gen because they were optimized and formed around taking advantage of the consoles. If the consoles had stronger CPUs, that would not have been the case.
So it's not a matter of the CPUs being underestimated; it's that devs are intentionally designing their games not to put too much stress on the CPU because of how weak it is.
If you don't do that, you get games like Assassin's Creed Unity and Just Cause 3. Really terrible affairs. Not to mention all the games that end up with bad performance because devs didn't parallelize workloads enough. With such low clock speeds and IPC, spreading the work across all the cores with parallel operations is pretty much mandatory (see the sketch after this post); the only real tools devs had were lowering CPU demands and parallelizing for the multi-core format to get as much performance as possible under those limitations.
That's why next gen is so good... the CPUs in the new consoles are a world away from Jaguar, and games can start doing real CPU crunching again.
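For what it's worth, here's a minimal sketch of the parallel-for style work splitting that post describes; the function name and chunking scheme are my own illustration under those assumptions, not any shipped engine's job system:

```cpp
#include <algorithm>
#include <functional>
#include <thread>
#include <vector>

// Split a workload into one chunk per worker core. With low per-core clocks
// and IPC, a single thread can't finish a frame's work in time, so the work
// has to be spread across as many cores as possible.
void ParallelFor(int count, int numWorkers,
                 const std::function<void(int begin, int end)>& job) {
    std::vector<std::thread> workers;
    const int chunk = (count + numWorkers - 1) / numWorkers;
    for (int w = 0; w < numWorkers; ++w) {
        const int begin = w * chunk;
        const int end   = std::min(count, begin + chunk);
        if (begin >= end) break;
        workers.emplace_back(job, begin, end);
    }
    for (auto& t : workers) t.join();
}

// Hypothetical usage: update all AI agents across 6 worker cores
// (consoles typically reserve some cores for the OS):
// ParallelFor((int)agents.size(), 6, [&](int b, int e) {
//     for (int i = b; i < e; ++i) agents[i].Update(dt);
// });
```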
He still had to enable Boost Mode to hit 60, which is interesting, because in the snippets they showed at base clocks it looked like ~40fps (~3:14 in the video). Boost only gives a 30% CPU uplift, which would only get you to ~52fps, so I guess that cleared some stalls? I wonder how much just adding the specialized de/compression hardware MS and Sony are touting for next gen would help current-gen consoles, even with HDDs. Would it help the 99th percentile more than the average frametime?
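Working that estimate through in frametime terms (assuming the ~40fps sections are purely CPU-bound, which is the back-of-envelope assumption in that post):

```cpp
#include <cstdio>

// Back-of-envelope: a 30% CPU clock uplift shortens a CPU-bound frame
// proportionally, so 40fps (25ms) becomes ~52fps (~19.2ms) -- still short
// of the 16.7ms needed for a locked 60fps.
int main() {
    double baseFps  = 40.0;
    double baseMs   = 1000.0 / baseFps;   // 25.0 ms per frame
    double boostMs  = baseMs / 1.30;      // ~19.2 ms with a 30% uplift
    double boostFps = 1000.0 / boostMs;   // ~52 fps
    std::printf("base: %.1f ms, boosted: %.1f ms (%.0f fps)\n",
                baseMs, boostMs, boostFps);
    return 0;
}
```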
The problem also stems from the DX12 branch they are using; it's not optimized and it performs badly compared to DX11.

An excellent video from @Dictator about RT. A great find, featuring the Technical Art Director of The Coalition.
There will come a day when the traditional T&L model will be tossed and engines will only support RT lighting. I suspect that when that day arrives, efficiency will be much greater than it is today.