What do you prefer for games: Framerates and Resolutions? [2020]

How much memory does a game have access to on PS5?

The XSX, after removing the OS allocation, is left with 3GB of slower memory. Deduct from that:
the game engine,
audio,
AI,
level data,
whatever else. How much do you expect to be left of that 3GB? (Rough arithmetic sketched below.)
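Purely as a back-of-envelope illustration of that budgeting, here's a tiny sketch; every game-side allocation size in it is a made-up assumption, not a known figure, and only the 3GB slow-pool number comes from the post above.

Code:
#include <cstdio>

// Back-of-envelope budget for the XSX's slower memory pool.
// All game-side allocation sizes are illustrative guesses, not real figures.
int main() {
    const double slow_pool_gb  = 3.0;  // slower pool after the OS reservation (figure from the post)

    const double engine_gb     = 0.8;  // engine / executable / working data (assumed)
    const double audio_gb      = 0.5;  // audio banks and streaming buffers (assumed)
    const double ai_gb         = 0.3;  // AI and navigation data (assumed)
    const double level_data_gb = 1.0;  // level / streaming data (assumed)

    const double remaining_gb =
        slow_pool_gb - (engine_gb + audio_gb + ai_gb + level_data_gb);

    std::printf("Left over in the slow pool: %.1f GB\n", remaining_gb); // 0.4 GB with these guesses
    return 0;
}

With those (entirely hypothetical) numbers, only a few hundred MB of the slow pool would remain, which is the shape of the argument being made.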

Are the patents incompatible with RDNA2's implementation?
Or can they modify/customize it?
I really don't know, but if the PS5's OS usage is smaller and you want to maximize the RAM allocated to graphics while keeping the game logic simple, what if there is such a game? Just speculating.
 
I mean, what if both companies are using RDNA 1 because they need that GCN compatibility, upgraded as much as they could to RDNA 2, but could not port everything over, with this being one of the features that didn't make it?
I think that's too much of an ask from AMD. As well as creating RDNA2, they are supposed to create two branches of RDNA1 for the consoles? With the efficiency of RDNA2? We also had AMD say the next-gen consoles are using their latest GPU architecture. I think an absence of VRS HW on PS5 would either have to be because RDNA2 doesn't have it, or because Sony took it out, which is pretty nonsensical. Ergo I think the HW is in there, and either it's present as a feature but not talked about, or prohibited as a feature on a software level. Those seem some fairly straight-forward scenarios versus the somewhat convoluted alternatives.
 
Fair enough; "the SDK is incomplete" is probably the correct answer if it's not there currently. Features can arrive later, even after launch, as more resources are handed back to developers over time.

I'm not sure, to be honest. Occam's razor isn't applying here. The simplest, most straightforward answer was for both to use AMD's implementation. Sony is quiet about something they have no legal need to be quiet about, and MS rolled their own.
 
Um...has MS told us they have audio hardware decompression or anything? AFAIK they've just said they have audio HW. I don't think lack of info at this point is really indicative of anything. That is, I don't think everything said so far is everything there is. I'm not aware of where Sony have skirted around the question. Maybe if there's a clear quoted example, we can see if they are being elusive or not.
 
They currently support 3D audio encoding/decoding for Dolby Atmos and Windows Sonic, and in the future DTS:X.

And they have their Project Acoustics. That's pretty much all I could find on audio.
 
Does anyone remember what benefits/changes the MS VRS patent promoted?
Seems like tier 2; perhaps it handles some additional use cases and/or edge cases. But its functionality is sort of unknown outside of the patent.
 
Tier 2 isn't really much of a benefit, unless we think that RDNA2 isn't tier 2, and I find that hard to believe.
I was hoping someone who read it could see what it would provide beyond what we know of any standard implementation: efficiency, wider grid selection, etc.
Maybe, like you say, it's just tier 2 but handling some kind of edge case better. Hardly worth trumpeting the fact it's patented, but then again, why not.
 
Tier 2 has a huge benefit over Tier 1.

Being able to use an image to determine shading areas is a massive improvement over selecting an arbitrary area. It allows fine-grained VRS to be applied only in the specific areas where it should be.

For a card to be certified DX12U, it must support Tier 2; Tier 1 is insufficient.
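To put that in concrete terms, here is a minimal sketch of driving image-based (Tier 2) VRS through the public D3D12 API. The function and variable names (ApplyImageBasedVRS, cmdList, rateImage) are my own illustrative choices, and this only shows the standard DX12 Ultimate-facing interface, not whatever the MS patent or a console SDK might add on top.

Code:
#include <d3d12.h>

// Tier 2 lets a screen-space "shading rate image" (an R8_UINT texture with one
// texel per hardware tile) pick the shading rate per region, rather than one
// rate for the whole draw.
void ApplyImageBasedVRS(ID3D12GraphicsCommandList5* cmdList,
                        ID3D12Resource* rateImage)
{
    // Combiners merge the per-draw rate, the per-primitive rate (SV_ShadingRate)
    // and the rate image. With these settings the rate image simply wins.
    D3D12_SHADING_RATE_COMBINER combiners[2] = {
        D3D12_SHADING_RATE_COMBINER_PASSTHROUGH,  // per-draw vs. per-primitive
        D3D12_SHADING_RATE_COMBINER_OVERRIDE      // result vs. rate image
    };

    cmdList->RSSetShadingRate(D3D12_SHADING_RATE_1X1, combiners); // base rate: full rate
    cmdList->RSSetShadingRateImage(rateImage);                    // Tier 2 only

    // ...record draws here. Tiles whose texel holds e.g. D3D12_SHADING_RATE_2X2
    // get shaded at quarter rate; tiles marked 1X1 stay at full rate.
}

Tier 1, by contrast, only exposes the per-draw RSSetShadingRate call, which is why it can't deliver the "only specifically where it should be" savings described above.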
 
Choice makes a lot more sense when you look at the history of game development and the impact of screenshots in print. The move to video advertising somewhat helps, but attention is still highly reliant on static image sharing in screenshots in reviews and also in social media sharing.

Your game may be a better game for targeting 60fps, but it may well sell less than if it were 30fps. People share tweets of gorgeous screenshots, but no one's going to share a tweet saying, "love that this game is 60fps."

That's why my opinion is that the 60hz framerate should be a mandate, and this would naturally create a new bar for graphics.

This whole conundrum happens because developers cannot behave themselves. The entire PS3/X360 generation was a disgrace, aside from some developers.
 
I also think that targeting 30fps potentially gives you a much higher chance to score better with the critics. You have GOTY material like God of War, RDR2, TLOU2, UC4, The Witcher 3, Bloodborne and Breath of the Wild all at 30fps as their primary target frame rate, yet the critics and the majority of gamers alike eat them up like cakes without a sliver of uproar. This indicates people are generally fine with 30fps being the norm as long as it's locked without severe dipping.

Now imagine cutting their base resolution in half to 720p and reducing the interactivity, asset quality, etc., but you get 60fps. I bet a lot of base-console folks would not sit well with that at all; the critics would be looking at the latest Naughty Dog games as if it were back in the PS3 days, due to the extreme blurriness on their full HD and 4K TVs. The same applies to next gen: you can't have your game looking subpar in an already highly competitive, visuals-driven industry. More pretty pixels get you more attention, period. Only the hardcore B3Ders and forum goers crave smoothness over eye candy.

Reporters will eat every bit of sh*t the big developers give them in the hope of good access to future titles. It's all about clicks, and it snowballs. Big developers bear more of the responsibility for this.
 
Yep, that's my point: I expect RDNA2 to be tier 2.

Actually, I forgot about that; it would be an even bigger fail for AMD if RDNA2 isn't DX12U. In fact, I find that impossible to even consider (until it happens :D).
RDNA 2 is DX12U. That's the confusing part. Either it wasn't ready in time, or it wasn't enough for MS.
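For what it's worth, whether a GPU exposes DX12U-level VRS is something an application can query directly through the documented capability check; a small sketch below (ReportVrsTier and the already-created device are illustrative assumptions):

Code:
#include <windows.h>
#include <d3d12.h>
#include <cstdio>

// Query which VRS tier a D3D12 device reports; Tier 2 is the DX12 Ultimate bar.
void ReportVrsTier(ID3D12Device* device)
{
    D3D12_FEATURE_DATA_D3D12_OPTIONS6 options6 = {};
    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_D3D12_OPTIONS6,
                                              &options6, sizeof(options6))))
    {
        switch (options6.VariableShadingRateTier)
        {
        case D3D12_VARIABLE_SHADING_RATE_TIER_2:
            // Per-draw, per-primitive and screen-space image-based VRS.
            std::printf("VRS Tier 2, tile size %ux%u\n",
                        options6.ShadingRateImageTileSize,
                        options6.ShadingRateImageTileSize);
            break;
        case D3D12_VARIABLE_SHADING_RATE_TIER_1:
            std::printf("VRS Tier 1 (per-draw shading rate only)\n");
            break;
        default:
            std::printf("No VRS support reported\n");
            break;
        }
    }
}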
 
Brain fart regarding DX12U (put it down to the crazy heat we're having in the UK).

That's why I said previously that RDNA2 has VRS, and why i expect PS5 to also.

For me it's more about what, if anything, the patent provides; if it's in hardware, I expect it to be a customization of RDNA2 rather than a replacement.
 
Possibly the fact that Microsoft has their own patented VRS implementation?
Is that hardware though? I'm imagining a situation where MS's DX API enables VRS as long as the hardware has it. The hardware can then support VRS on MS platforms, but other platforms won't be able to use that hardware unless they can get around MS's patents.

I'm unfamiliar with the content of said patents. You just see, "MS have VRS patented," and we all know that the same ideas manage to get used elsewhere despite patents, so the existence of patents doesn't stop the concept being elsewhere; it's the content of those patents that may or may not stop VRS being used.
 