PS5 Pro *spawn

I can't say $700 is a surprising price for the US, and I'm glad I won't have to pay sales tax on it if I decide to purchase it. The price was always going to be this high at launch, because enough people are willing to buy the machine at a premium. The timing could line up with a price cut around the time GTA VI launches, to keep sales going; if they don't do that, I don't foresee the PS5 Pro being much of a success. Make the money from early adopters while they can, then lower the price within a year to keep the sales momentum going.

Without knowing the actual specifics of the hardware changes, other than more CUs, it's hard to gauge how much more worth it this is.
 
I'm sure the PS6 will be aimed at the mass market, with at least one version of the PS6 at around $400 (even if it's a lite console like the Series S), but the lack of vision for the future is worrying.
It's either we have more complex hardware designs due to following economically unsustainable PC graphics technology trends or we can develop custom hardware with far fewer transistors. Those are our only choices left because high-end technology isn't getting any cheaper ...
Straight from an ex-PlayStation game developer who is now an AMD hardware dev.

He's NOT a game developer at all according to his LinkedIn profile, and he works for their data center GPU business. Not exactly a Radeon graphics employee proper, and you can't infer much from that statement (implementation details) other than the fact that it runs on AMD HW ...
 
It's either we have more complex hardware designs due to following economically unsustainable PC graphics technology trends or we can develop custom hardware with far fewer transistors. Those are our only choices left because high-end technology isn't getting any cheaper ...

He's NOT a game developer at all according to his LinkedIn profile, and he works for their data center GPU business. Not exactly a Radeon graphics employee proper, and you can't infer much from that statement (implementation details) other than the fact that it runs on AMD HW ...
I think XDNA is a fairly reasonable piece of kit to support AI. It's not as silicon- or power-heavy as the tensor cores found in deep learning GPUs. XDNA has been out for a few generations, and for semi-custom solutions this seems like the right fit here.

I can't math out 300 TOPS of INT8 using 60 CUs; it's just too large a number.
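For what it's worth, here's the back-of-the-envelope version. The 512 INT8 ops/CU/clock rate and the 2.18 GHz clock below are assumptions (roughly RDNA3-class WMMA throughput), not confirmed Pro specs:

```python
# Back-of-the-envelope peak throughput: CUs * ops/CU/clock * clock.
# The per-CU rate and clock are assumptions, not confirmed PS5 Pro specs.
def peak_tops(cus, ops_per_cu_per_clock, clock_ghz):
    return cus * ops_per_cu_per_clock * clock_ghz / 1000.0

dense = peak_tops(60, 512, 2.18)     # ~67 TOPS
sparse = dense * 2                   # ~134 TOPS even with 2:1 structured sparsity
# Per-CU rate you'd need at the same clock to hit the claimed 300 TOPS:
needed = 300 * 1000 / (60 * 2.18)    # ~2294 ops/CU/clock
print(round(dense, 1), round(sparse, 1), round(needed))
```

Under those assumptions, even a 2:1 sparsity multiplier lands well short of 300, which suggests either a much higher per-CU rate, a different counting convention, or a separate accelerator block.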
 
Didn’t MS claim as well that the Xbox would be more powerful, with more chips and rays and tracing and whatnot? So it’s possible that other people believed it too. After all, MS is a real company, and they would never lie, because their investors would sue them.

Btw

Cerny was saying that players shouldn’t have to choose between framerate and quality anymore, but it looks like GT7 now has gameplay raytracing OR an 8K mode..
So now they choose between graphics and more graphics? lol

If PSSR upscaling turns out to be great, the PS5 Pro could run games at a lower internal resolution with more raytracing, then output at a higher resolution, and get a higher framerate as a result as well.
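The gap an upscaler has to cover is just a pixel-count ratio. The resolutions below are illustrative, not confirmed modes:

```python
# Ratio of output pixels to internally rendered pixels: the share of the
# final image the upscaler has to reconstruct grows with this number.
def pixel_ratio(src, dst):
    return (dst[0] * dst[1]) / (src[0] * src[1])

internal = (2560, 1440)  # hypothetical internal render resolution
output = (3840, 2160)    # 4K output target
print(pixel_ratio(internal, output))  # 2.25: a 2.25x pixel gap to bridge
```

The quadratic scaling is why dropping internal resolution frees up so much GPU time for raytracing: 1440p is only ~44% of the pixels of native 4K.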

I am afraid DF is going to need to buy an 8K TV and 8K capture devices to test Sony's claims.
Isn’t the PS5 incapable of outputting 8K? Unless they’ve upgraded the HDMI interface I think it’s going to be 8K downscaled to 4K for output.
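The raw link budget supports that doubt. A rough check, counting active-pixel data only (no blanking overhead; the 16b/18b coding factor is the HDMI 2.1 FRL line coding):

```python
# Raw video data rate for active pixels only, ignoring blanking intervals.
def raw_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

eight_k_60 = raw_gbps(7680, 4320, 60, 24)  # 8-bit RGB 8K60: ~47.8 Gbps
# HDMI 2.1 FRL signals 48 Gbps, but 16b/18b coding lowers the payload ceiling:
hdmi21_payload = 48 * 16 / 18              # ~42.7 Gbps
print(eight_k_60 > hdmi21_payload)         # True: 8K60 RGB needs DSC or 4:2:0
```

So even a full 48 Gbps HDMI 2.1 port can't carry uncompressed 8K60 RGB; it needs Display Stream Compression or chroma subsampling, and whether the PS5 Pro's output path supports that is exactly the open question.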
 
It's either we have more complex hardware designs due to following economically unsustainable PC graphics technology trends or we can develop custom hardware with far fewer transistors. Those are our only choices left because high-end technology isn't getting any cheaper ...

He's NOT a game developer at all according to his LinkedIn profile, and he works for their data center GPU business. Not exactly a Radeon graphics employee proper, and you can't infer much from that statement (implementation details) other than the fact that it runs on AMD HW ...
What exactly should this custom hardware do, with fewer transistors, that is better than AI upscaling and RT cores?
I don't think there is any magic solution to this. Maybe getting off TSMC and sacrificing some efficiency.
 
He's NOT a game developer at all according to his LinkedIn profile
Sigh, look at his Twitter account: "Former Xbox & PlayStation hardware dev, now making AI GPUs"

he works for their data center GPU business
He is an AMD official who is working on developing AI GPUs. You know, AI, which is what PSSR uses?

you can't infer much from that statement (implementation details) other than the fact that it runs on AMD HW
You don't need to infer anything; the question was obvious and clear-cut, and the answer was even more obvious, with no room for maneuvering.

Even if what you're stating is true, that still leaves you and everyone else being wrong about the RT perf details ...
If you think I thought for a second that the PS5 Pro would be 3x faster in ray-traced scenes, then you obviously don't know me very well. Those perf numbers were clearly labeled "RT specific tests", nothing more.
 
I don’t really get this view. I get less friction when using my PC on my TV/couch setup. What is this friction you are talking about?
That's not even theoretically possible, forget about it being realistically possible. There's the friction of multiple stores, numerous game issues, Windows-related issues, driver-related issues, etc. On console, from a cold boot, you are two controller button presses away from booting your game; from rest mode, it's a single button press. Just this week I've had to troubleshoot memory with memtest86, replace a less-than-a-year-old 1000 W PSU, and chase down random mouse lag and random keyboard lag. PC gaming when everything works is friction-filled, let alone when things don't work.

Like I said, only the Steam Deck presents an acceptable experience, especially because you can download pre-compiled shaders.
 
I don’t really get this view. I get less friction when using my PC on my TV/couch setup. What is this friction you are talking about?

It's in large part going to be dependent on what you're used to and familiar with, and want to spend time on.

I personally find a PC easier to use than a console, but that's in large part because I'm not used to controllers as an input device.
 
I think XDNA is a fairly reasonable piece of kit to support AI. It's not as silicon- or power-heavy as the tensor cores found in deep learning GPUs. XDNA has been out for a few generations, and for semi-custom solutions this seems like the right fit here.
XDNA and many other NPUs feature a flaw in their memory model: they can only address a small amount of "local memory", and you have to do explicit DMA transfers if the working set is larger than what the hardware can physically store ...

I remain skeptical that a temporal upscaler can be reasonably implemented if there are no existing references to look at ...
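The explicit-DMA constraint described above amounts to manually tiling the working set through the scratchpad. A toy sketch of the pattern (all names here are illustrative; real NPU runtimes expose vendor-specific transfer APIs):

```python
LOCAL_MEM_ELEMS = 4096  # pretend scratchpad capacity, in elements

def dma_in(src, start, count):
    # Stand-in for an explicit host-memory -> local-memory DMA transfer.
    return src[start:start + count]

def run_kernel_tiled(data, kernel):
    out = []
    for start in range(0, len(data), LOCAL_MEM_ELEMS):
        tile = dma_in(data, start, LOCAL_MEM_ELEMS)  # copy tile into "local"
        out.extend(kernel(x) for x in tile)          # compute while resident
    return out                                       # results transferred back

doubled = run_kernel_tiled(list(range(10000)), lambda x: x * 2)
```

In real hardware the transfers are usually double-buffered so DMA overlaps compute, but the point stands: none of this is transparent to the programmer the way a cached, unified address space is.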
What exactly should this custom hardware do, with fewer transistors, that is better than AI upscaling and RT cores?
I don't think there is any magic solution to this. Maybe getting off TSMC and sacrificing some efficiency.
Have you forgotten that high-end IC manufacturing has been largely unprofitable outside of a "winner-takes-all" competition model? Until the world goes back to the trend of globalization and the so-called "world's factory" regains access to the critical technologies needed to produce them, TSMC will never realistically be toppled, much less face competitive pressure ...

There's really nothing 'clever' about RT or AI upscaling implementations. All they have involved is a race to the bottom: dumping as many transistors as possible into your HW designs while trying to scrape out every last bit of memory performance by pushing ever higher clocks and moving to new memory standards. Do you see a future where these technologies can be economically scalable when the inherent trends (poor cache size scaling, no widespread consumer proliferation of HBM, etc.) suggest otherwise? There doesn't need to be a magic solution, but we need to find some alternative graphics technology out there that CAN WORK WELL with those trends ...

At this rate, we'll never see price cuts if we don't go back to reasonable HW designs that use fewer transistors and simpler memory architectures. We may as well consider the PS5 Pro a very early and somewhat disappointing preview of the "next generation", with no more gains in value left to be had ...
 
XDNA and many other NPUs feature a flaw in their memory model: they can only address a small amount of "local memory", and you have to do explicit DMA transfers if the working set is larger than what the hardware can physically store ...
Unfortunately, as you and others here know, Sony is way too tight-lipped on this, and all the documentation is watermarked with your login credentials.

We will never know; that type of information would never be released, and it is likely only to come out when you ask Sony for assistance because you aren't getting the performance you're looking for.
 
Sigh, look at his twitter account: "Former Xbox & PlayStation hardware dev, now making AI GPUs"
His LinkedIn profile suggests that he worked at IBM AFTER the release of the first HD twins. At most he's only *indirectly* responsible, for overseeing design ports of their CPUs to newer logic process nodes. He had no involvement with their original designs, case closed ...
He is an AMD official who is working on developing AI GPUs. You know, AI, which is what PSSR uses?
He works in the Instinct division, not the Radeon division. He also doesn't have any listed skills in graphics programming or past experience with developing fixed-function graphics IPs. Their Instinct accelerators aren't really 'GPUs' anymore in the proper sense of the term ...

I think you should look into people's past history before declaring them an authoritative source, especially when you've likely got the wrong guy here, someone who hasn't contributed anything related to graphics technology ...
You don't need to infer anything; the question was obvious and clear-cut, and the answer was even more obvious, with no room for maneuvering.
He just threw out the term "HW blocks" without any other descriptor, which could refer to anything on the SoC, from integrated NPUs to plain shader units ...
 
Any of you going to buy the PS5 Pro?
Hell no. If it was $499 for the digital edition, then yes, but I'm going to be buying the PS5 "slim" instead. They're selling a box that costs them around $400 to make for $699. The only major change is the GPU; the CPU is the same, and the amount of memory is the same, albeit with higher memory bandwidth. Still, memory prices have dropped significantly since 2020, and so has the cost of storage: the 2TB SSD doesn't cost anywhere close to the ~$100 the 1TB SSD cost in the OG PS5. So the pricing is more about greed. I hadn't bought a PS5 yet because of how large the thing is; I think I may get the base PS5 slim or wait for it to get slimmer. I can't even justify buying a PS5 Pro to play GTA 6, tbh. I'll just play it on my Series X.
 
Even when you try to watch a movie through your PC while sitting on the couch with the mouse, Windows is going to say you need to change your Windows password.

...what

Or you will have the mouse cursor sitting on the screen when trying to watch a movie, so you minimize the window and put it back to fullscreen, not knowing whether that will even work.

Dude, jesus. I actually agree with the simplicity argument, I game on my TV with my PC the majority of the time and there are a host of annoyances for sure (largely brought about by the yin/yang nature of 'choice'), but these examples in particular are ridiculous. People know how to move a mouse cursor and hit the little icon to make a video fullscreen - people have been doing it for decades!
 
...what



Dude, jesus. I actually agree with the simplicity argument, I game on my TV with my PC the majority of the time and there are a host of annoyances for sure (largely brought about by the yin/yang nature of 'choice'), but these examples in particular are ridiculous. People know how to move a mouse cursor and hit the little icon to make a video fullscreen - people have been doing it for decades!
When you are watching on a 120-inch screen, you will definitely see a mouse cursor that doesn't disappear during fullscreen video.

You are merely saying it does not bother you. Sure, in which case I will say: nobody should be listening to your opinion on this.
 