Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

No, the opposite. It will always be boosted; the clock speeds can only go down, not up. The power budget is fixed, so it can't draw more power when it needs it: there is a fixed amount of power the board will supply, at which both the CPU and GPU are able to maintain their clock rate caps. Under certain loads, when the GPU or CPU needs more power than can be supplied, it must borrow from the other, causing a downclock in one or the other. If the load is too high, both will start downclocking toward their base clocks (which were not announced).

Iroboto... Do you mean that if they decrease they cannot go up again?? Of course they will go up. ;)
 
Yea they will go back up to their cap.
But boost mode is about sitting at a base clock and going up to a max cap when the load is light and there is power to spare to increase frequency.
This is how it's done on GPUs today.

PS5 has its power budget set high enough that the base frequency is already high, and when heavy loads are introduced the clock goes down.

In this sense it's the opposite: one is trying to go up as much as it can, and the other is trying to keep from going down.
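
To make the contrast concrete, here's a toy sketch of the two behaviours. The budget, clocks and the crude linear power model are made up for illustration and are not anything Sony or AMD have described: a PC-style boost starts at a base clock and steps up while there is power headroom, while a fixed-budget scheme starts at the cap and steps down only when the estimated draw would exceed the budget.

```c
#include <stdio.h>

/* Toy model only: the budget, clocks and the linear power estimate below are
 * made up for illustration; real hardware uses activity counters, not this. */

#define POWER_BUDGET_W 200.0

/* Very rough assumption: power scales with clock * activity. */
static double power_draw(double clock_ghz, double activity /* 0..1 */)
{
    return clock_ghz * activity * 100.0;
}

/* PC-style boost: start at a base clock, step up while headroom remains. */
static double pc_boost_clock(double base, double cap, double activity)
{
    double clk = base;
    while (clk + 0.05 <= cap && power_draw(clk + 0.05, activity) <= POWER_BUDGET_W)
        clk += 0.05;
    return clk;
}

/* Fixed-budget scheme: start at the cap, step down until the estimated draw
 * fits the fixed budget; the clock never goes above the cap. */
static double fixed_budget_clock(double cap, double activity)
{
    double clk = cap;
    while (clk > 0.0 && power_draw(clk, activity) > POWER_BUDGET_W)
        clk -= 0.05;
    return clk;
}

int main(void)
{
    double light = 0.5, heavy = 0.95;   /* hypothetical workload intensities */

    printf("PC boost,     light load: %.2f GHz\n", pc_boost_clock(1.60, 2.23, light));
    printf("PC boost,     heavy load: %.2f GHz\n", pc_boost_clock(1.60, 2.23, heavy));
    printf("Fixed budget, light load: %.2f GHz\n", fixed_budget_clock(2.23, light));
    printf("Fixed budget, heavy load: %.2f GHz\n", fixed_budget_clock(2.23, heavy));
    return 0;
}
```

Under a light load both end up at or near the cap; under a heavy load the PC-style scheme has boosted less and the fixed-budget scheme has dropped below its cap, which is the "going up vs. trying not to go down" distinction.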
 

Nope... That's a boost clock speed... That happens on PC!

This is not boost clock speed, it's "continuous boost"! The CPU and GPU are boosting to the required clock levels all the time, with no base clock. That's how I understand it!

The comparison to PC is the reason I believe people think PS5 will downclock to 9.2 TFLOPS.
 
On Alex's post that was totally misinterpreted by wccftech for a clickbait title:

His point is largely that to increase the variety of content, developers will have to rely on procedural generation versus storing pre-made assets on the SSD. Look at the cost of a game like RDR2: what would it cost to double or triple the amount of unique plants, animals, people, buildings, etc.? There is maybe some middle ground where you could create more base assets and stitch them together for variability in clothing, or for certain animals like horses and dogs that come in a variety of colours. But I think his point is not that you can't stream all of these assets from an SSD, but that you'd have to create all of them first, and that's where the problem is. Procedural generation will probably have to be the way to go for open-world games if people expect a certain next-gen density. People can disagree, but I don't think he's necessarily wrong. There could be a lot of gains on the development side that could allow people to make more assets faster, like having procedural creation tools up front so the results can be stored on the SSD, versus having procedural generation in memory.

There's a lot to discuss here without having a fanboy war over it.
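
As a purely hypothetical illustration of the "stitch base assets together for variability" idea: a small set of hand-made meshes combined with a seeded pick of variations yields many distinct-looking assets, and that combination step can run offline (baked and stored on the SSD) or at load time in memory. The asset names and counts below are invented.

```c
#include <stdio.h>

/* Hypothetical sketch: combining a handful of hand-authored base assets with
 * seeded variation. Run offline, the results could be baked to disk and
 * streamed; run at load time, they live only in memory. */

typedef struct {
    const char *base_mesh;   /* hand-made asset */
    const char *coat;        /* procedurally picked variation */
} HorseVariant;

static const char *base_meshes[] = { "horse_draft", "horse_arabian", "horse_pony" };
static const char *coats[]       = { "bay", "chestnut", "grey", "black", "palomino" };

static HorseVariant make_horse(unsigned int seed)
{
    HorseVariant h;
    h.base_mesh = base_meshes[seed % 3];
    h.coat      = coats[(seed / 3) % 5];
    return h;
}

int main(void)
{
    /* 15 distinct-looking horses from only 3 meshes and 5 coats. */
    for (unsigned int seed = 0; seed < 15; ++seed) {
        HorseVariant h = make_horse(seed);
        printf("seed %2u -> %s / %s\n", seed, h.base_mesh, h.coat);
    }
    return 0;
}
```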
 
This is true. I have a VRR monitor and I love it. What percentage of users do you think will have VRR TVs? Probably very, very few. If anything they'll have uncapped modes that allow you to run above 30 fps or above 60 fps. They'll still design with headroom for a stable 30 or 60. That 2% frequency boost on the GPU, if that's really the variable range, won't even gain you 1 fps.

Very few now, but by the end of the generation, everyone who cares about framerate? All the new TVs from LG, for example, have had VRR for a while. Consequently I'm looking to perhaps get the new LG 48" gaming OLED to play console games on (or just use my 27" VRR-enabled PC display if the console can connect to it). LG is doing a 48" OLED with gaming branding. It's about the perfect size for gaming for me. For movies, bigger would be better.

https://www.engadget.com/2020-03-17-lg-2020-4k-oled-pricing.html
 
Nope... That's a boost clock speed... That happens on PC!

This is not boost clock speed, it's "continuous boost"! The CPU and GPU are boosting to the required clock levels all the time, with no base clock. That's how I understand it!
Yes... so it is the opposite of how GPU boost works.
One is trying to go up. The other is trying not to go down.
 
Who is this Andrew Maximov and why are they considered more reputable than Digital Foundry?

Actual developer and not a journalist. Maybe?

If the load is too high, both will start downclocking toward their base clocks.

I'm not sure I've understood it correctly, but AFAIK Cerny was hinting that unless you use AVX, the CPU will always be underpowered.
And I have no problem with that.
 
Xbox Series X TeraFlops puts it above 2080 Super indeed

No. It does not.
Turing has a full-speed parallel integer pipeline, and on average it's used alongside the FP pipeline about 36% of the time.
You could look at it as if Turing cards have +36% to their FLOPS in real games (on paper they have 2x the FLOPS because of that pipeline).
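
Rough back-of-the-envelope arithmetic for that claim, using illustrative paper figures (assumed for the example, not measured): treating the concurrently issued INT work as extra effective throughput lifts a 2080 Super well past the Series X paper number.

```c
#include <stdio.h>

int main(void)
{
    /* Illustrative figures only. */
    double rtx2080s_fp32_tflops = 11.2;   /* paper FP32 rate of a 2080 Super */
    double xbsx_fp32_tflops     = 12.15;  /* paper FP32 rate of the Series X GPU */
    double int_coissue_rate     = 0.36;   /* claimed avg INT work issued alongside FP */

    /* Treat concurrently issued INT work as extra effective throughput. */
    double effective = rtx2080s_fp32_tflops * (1.0 + int_coissue_rate);

    printf("2080 Super paper FP32:  %.2f TFLOPS\n", rtx2080s_fp32_tflops);
    printf("2080 Super 'effective': %.2f TFLOPS\n", effective);  /* ~15.2 */
    printf("Series X paper FP32:    %.2f TFLOPS\n", xbsx_fp32_tflops);
    return 0;
}
```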

He never actually wrote that. He said that procedural generation would be necessary, not that it would "replace all the art in games."

I think you've lost the context of the discussion.
 
No. It does not.
Turing has a full-speed parallel integer pipeline, and on average it's used alongside the FP pipeline about 36% of the time.
You could look at it as if Turing cards have +36% to their FLOPS in real games (on paper they have 2x the FLOPS because of that pipeline).

I didn't realize INT was used so much in graphics. I would have assumed it would be much more heavily skewed toward FP. Do you have any links or reading on where they're using it?
 
Actual developer and not a journalist. Maybe?
I'm not sure I've understood it correctly, but AFAIK Cerny was hinting that unless you use AVX, the CPU will always be underpowered.
And I have no problem with that.

No, he did not say that!
He talked about AVX instructions as a workload example: using them would pull the CPU to the max, but they would not be used all the time.
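
For anyone wondering what an AVX-heavy workload looks like in practice, here's a minimal, purely illustrative example (mine, not anything from Cerny's talk): a loop built on 256-bit AVX2/FMA intrinsics that keeps the wide vector units saturated, which is the kind of dense, sustained work that pushes CPU power draw toward its limit. Compile with something like -mavx2 -mfma.

```c
#include <immintrin.h>
#include <stdio.h>
#include <stddef.h>

/* y[i] = a*x[i] + y[i] using 256-bit FMA; n must be a multiple of 8 here.
 * Sustained loops like this keep the wide vector units busy every cycle. */
static void saxpy_avx2(float a, const float *x, float *y, size_t n)
{
    __m256 va = _mm256_set1_ps(a);
    for (size_t i = 0; i < n; i += 8) {
        __m256 vx = _mm256_loadu_ps(x + i);
        __m256 vy = _mm256_loadu_ps(y + i);
        vy = _mm256_fmadd_ps(va, vx, vy);  /* vy = a*vx + vy */
        _mm256_storeu_ps(y + i, vy);
    }
}

int main(void)
{
    float x[16], y[16];
    for (int i = 0; i < 16; ++i) { x[i] = (float)i; y[i] = 1.0f; }

    saxpy_avx2(2.0f, x, y, 16);           /* dense wide-vector work */
    printf("y[15] = %.1f\n", y[15]);      /* 2*15 + 1 = 31.0 */
    return 0;
}
```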
 