PS4 Pro is the ultimate Sony machine.
Until PS5.
I have to ask: is supersampling a feature we can natively enable on PC? That is, render at 1440p and have it downscale to 1080p? The results would appear to be better than native, if this thread is anything to go by.
GCN1/2 GPUs can do 1440->1080 or 1600->1200 scaling. GCN3/4 can do 4K->1080p scaling.
I think all Maxwells can do 4K->1080p, most probably Pascals do it too. I'm not sure about Kepler.
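For anyone wondering what the downscale step actually does: with ordered-grid supersampling it's conceptually just a box filter, averaging each block of rendered pixels into one output pixel. A minimal NumPy sketch of the 4K->1080p case (my own illustration, not what any driver literally does; real implementations like DSR use fancier filters, e.g. DSR's adjustable Gaussian smoothing):

```python
import numpy as np

def box_downscale(img: np.ndarray, factor: int) -> np.ndarray:
    """Average each factor x factor block of pixels into one output pixel.

    img is an (H, W, C) array; H and W must be divisible by factor.
    """
    h, w, c = img.shape
    assert h % factor == 0 and w % factor == 0
    # Reshape so each block gets its own pair of axes, then average them.
    blocks = img.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

# 4K -> 1080p is an exact 2x2 downscale.
render = np.random.rand(2160, 3840, 3)   # stand-in for a 4K framebuffer
output = box_downscale(render, 2)        # 1080p result
print(output.shape)                      # (1080, 1920, 3)
```

Note that 4K->1080p is an even 2x2 average, which is part of why it looks so clean, while 1440p->1080p is a non-integer 1.5x ratio that needs a weighted filter; that may be why scaler support varies by GPU generation.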
You mean for a Sony console? Because Xbox Scorpio hits next year and PCs can already do higher quality than both consoles.
Thanks! The feature I'm looking for is called DSR, not OGSSAA. Good info post.
Vaporware until it hits the shelves.
We've been able to force supersampling through third-party tools like Nvidia Inspector for a long time; DSR just brought that capability into the main driver control panel, making it a hell of a lot more convenient. The functionality has been available since at least Kepler, and I'd be surprised if all GCN GPUs don't have the same capability. DSR's a great feature, I use it a lot with my GTX 1070 and 1080p monitor. I was testing ROTTR the other day and can render at 4K on the highest graphics preset at a solid 30fps. Downscaling that back to 1080p looks pretty sweet.
Vaporware until it hits the shelves.
The effectiveness of the term "vapourware" has been eroded to the point that it now simply means "unreleased".
Uhm, PCs that are more of an "ultimate 1080p machine" than the 4Pro already exist and are already on the shelves.
Apparently the original PS4 also had DDR3, but it was only 256MB.

Memory on the south bridge is 512MB of DDR3-1600, and it's super narrow: 8 bits wide. There must be another one on the other side, dual rank. So it's probably 1GB.
I meant just the chip we see on that side. That chip is an 8-bit package (78 balls). The south bridge of the original PS4 was 16 bits of DDR3-1600, and I'm assuming it still is, so there would need to be another 8-bit, 4Gbit chip mirrored on the other side. So why didn't they use a single 16-bit/8Gbit chip? Probably less expensive to use 4Gbit parts because of yield and availability, and 8-bit packaging is what DIMMs use, so huge volume.
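To make the capacity and bandwidth arithmetic behind that explicit, a quick sketch (the chip count, widths, and speed are the post's assumptions, not confirmed specs):

```python
# Two 4Gbit chips, each with an 8-bit interface, mirrored on both sides
# of the board (the post's assumption), giving a 16-bit bus in total.
chips = 2
capacity_gbit_per_chip = 4
bus_bits_per_chip = 8
transfer_rate_mt_s = 1600          # DDR3-1600

total_capacity_gb = chips * capacity_gbit_per_chip / 8           # 1.0 GB
total_bus_bits = chips * bus_bits_per_chip                       # 16 bits
bandwidth_gb_s = total_bus_bits / 8 * transfer_rate_mt_s / 1000  # 3.2 GB/s

print(f"{total_capacity_gb} GB over a {total_bus_bits}-bit bus, "
      f"~{bandwidth_gb_s} GB/s")
```

Either way, a pool that slow and narrow compared with the main GDDR5 fits the suspected role of holding backgrounded apps rather than game data.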
I'm guessing this is the "extra 1GB for apps" that Cerny mentioned, which is also responsible for the extra 512MB of GDDR5 that the GPU can access. Since the original PS4 already had 256MB of DDR3, it's not an extra 1GB, but rather a net 768MB.
Why do you assume it's using an 8-bit bus?