Inuhanyou
Veteran
> I know that on the Xbox One you could put 360 discs in it, IIRC.
There is a patent, but as with any patent, that doesn't mean it will actually become a thing. Would be cool, though. Can't remember if it would work for pre-X1 though.
> I think the extra $100 for a PS5 Digital Edition is well worth it.
At this point in time there isn't a reason to buy a PS4/Pro/Xbox One/One X with the Series S out there. You might have the occasional game that performs slightly better on the Pro or One X, but going forward there won't be new games for them. You also lose out on the NVMe drive in the S.
If you have a PS4/Xbox One, the Series S is the way to go for a low-priced upgrade, unless you really want the best possible performance or Sony exclusives.
If you have a PS4 Pro/One X you might want to buy the PS5 or Series X, but if you want to save money the Series S is a good pickup.
The closer to $200 the S stays, the better a deal it is. I think this will become even more evident when we get redesign shrinks on 5nm; the Series S is going to start getting really small.
> I think the extra $100 for a PS5 Digital Edition is well worth it.
I wonder what the next PS5 version will cost.
> I think the extra $100 for a PS5 Digital Edition is well worth it.
This will be a topic that ends badly.
> This will be a topic that ends badly.
Do we have any info on how well the Series S is selling? 33% more money for over double the performance seems like a no-brainer to me. If you plan on Game Pass, I suppose that would be the only decision changer.
> Do we have any info on how well the Series S is selling? 33% more money for over double the performance seems like a no-brainer to me. If you plan on Game Pass, I suppose that would be the only decision changer.
All I'll say is that many people obviously disagree, considering how well it's selling.
And I don't think it's only because of availability, either.
I'm still not sure I follow. What you're describing above seems to be the more general ability of consoles to extract more performance from a given set of hardware due to their fixed nature, which is fair enough. Translated to a real-world example, we could imagine a shadow setting which on Medium looks worse than the console's, while on High it looks better. But Medium has the same cost on PC as the better-looking setting does on console, while High has, say, a 50% higher cost. I think that's what you're describing above, which is fair enough, but then surely there is still a place for a setting between the two that is as good as the console setting but only costs, say, 20% more performance on the PC? That way people can get at least console-level quality for the minimum possible performance outlay.
I'm struggling to think of a scenario like the one above, where you have settings both above and below the console's, in which something in the middle that matches the console wouldn't offer a compromise between the two.
I can at least envision a scenario, though, where for example a specific sub-component of an effect is more costly on PC, and thus the effect at any quality level will be more costly. On PC the developer might therefore choose to increase the quality level of a different sub-component of that effect, because it's effectively free. You then have what appears to be a very costly effect on PC that looks better than the console version but doesn't scale down. A crude analogy might be RT on a slow CPU with a 40-series GPU, where the BVH creation (and thus overall performance) is slower than the console's, and so RT resolution is ramped up beyond the console's. But lowering RT resolution in that scenario gains you nothing, and so the option isn't presented to the user.
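The Medium/High trade-off described a couple of posts up is easy to state numerically. Here is a minimal sketch with entirely made-up quality scores and PC frame-time costs (nothing below is measured from a real game): the point is that without an in-between tier, matching console quality forces you onto the +50% setting.

```python
# Hypothetical quality tiers for one effect (e.g. shadows).
# quality: relative visual quality (console-matching = 1.0).
# pc_cost: relative PC frame-time cost (Medium = 1.0 baseline).
# All numbers are invented to illustrate the forum argument.
settings = {
    "medium":        {"quality": 0.8, "pc_cost": 1.00},  # worse than console, same cost
    "console-match": {"quality": 1.0, "pc_cost": 1.20},  # the hypothetical in-between tier
    "high":          {"quality": 1.2, "pc_cost": 1.50},  # better than console, +50% cost
}

def cheapest_at_least(target_quality):
    """Return the cheapest setting that meets or beats a quality target."""
    candidates = [(s["pc_cost"], name) for name, s in settings.items()
                  if s["quality"] >= target_quality]
    return min(candidates)[1] if candidates else None

# With the in-between tier present, console quality costs +20% instead of +50%.
print(cheapest_at_least(1.0))  # -> console-match
```

Remove the `"console-match"` entry and the same call returns `"high"`, which is exactly the "minimum outlay for console quality" gap being argued about.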
> This will be a topic that ends badly.
> All I'll say is that many people obviously disagree, considering how well it's selling.
> And I don't think it's only because of availability, either.
What does this mean? The Series S is a great bargain buy, but I don't think sales say anything about the PS5 not being worth its price.
> Games should provide good performance and resolution scaling options. Doing this via monitor mode changes is pure legacy at this point.
This is why I'm genuinely inquiring how you would have phrased Alex's critique in that section. I'm not asking for a form letter we can all spam MS with, but instead of saying "Games should provide all resolutions and refresh rates", something like...?
> I think the extra $100 for a PS5 Digital Edition is well worth it.
Yeah, but for the extra $100 I think the Xbox Series X is well worth it.
> Do we have any info on how well Series S is selling?
How well it's selling, and that it's also bringing in a lot of people who've never owned an Xbox before, is based on comments by Phil, earnings calls, and whenever there are sales breakdowns, etc.
> What does this mean? Series S is a great bargain buy, but I don't think sales say anything about the PS5 not being worth its price.
I never said either of the PS5s (or any console) isn't worth the price.
> How well it's selling, and that it's also bringing in a lot of people who've never owned an Xbox before, is based on comments by Phil, earnings calls, and whenever there are sales breakdowns, etc.
Sorry, I misunderstood your post.
Pretty well documented.
I never said either of the PS5s (or any console) isn't worth the price.
I'm saying he may not think the XSS is worth it, but a lot of people do, based on how it's selling.
This is precisely what I'm talking about. In almost all cases games already provide quality settings that span both above and below the consoles. But coming up with "optimal" settings is always coming up with a tradeoff of the best use of performance across *all* of the various quality knobs... they are not really independent in practice. Thus if something is far more expensive on one platform it is likely to be turned down/disabled in favor of something that is relatively cheaper but provides a bigger visual impact.
These sorts of situations are actually fairly common, and thus while it's intellectually interesting to ask for "console settings" where it makes sense, IMO it's more important to get presets that are optimized *for PCs* on PCs. Of course this is an imperfect process in and of itself because PC configurations vary widely, but there are definitely systematic cases where something will be worse or better on all PCs just due to the nature of the available code paths on each platform.
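The "trade-off across *all* of the quality knobs" argument above can be sketched as a tiny greedy budget allocator: spend a fixed frame-time budget on whichever quality step currently buys the most visual impact per millisecond. Everything here (the knob names, per-step costs, and impact scores) is invented purely for illustration, not taken from any real engine:

```python
# Each knob has successive upgrade steps: (extra_ms_per_step, visual_impact).
# Hypothetical numbers chosen only to demonstrate the allocation idea.
knobs = {
    "shadows":       [(1.0, 5), (2.0, 3)],
    "reflections":   [(3.0, 8), (4.0, 2)],
    "crowd_density": [(0.5, 1)],
}

def pick_settings(budget_ms):
    """Greedily raise whichever knob gives the best impact-per-ms next."""
    levels = {k: 0 for k in knobs}
    spent = 0.0
    while True:
        best = None  # (impact/ms ratio, knob, cost) of the best affordable step
        for k, steps in knobs.items():
            if levels[k] < len(steps):
                cost, impact = steps[levels[k]]
                if spent + cost <= budget_ms:
                    ratio = impact / cost
                    if best is None or ratio > best[0]:
                        best = (ratio, k, cost)
        if best is None:
            return levels, spent
        _, k, cost = best
        levels[k] += 1
        spent += cost

# If one knob (say reflections, standing in for RT) were far more expensive
# on PC, the greedy pass would naturally leave it lower and spend elsewhere.
print(pick_settings(5.0))
```

With the numbers above and a 5 ms budget, the allocator takes one step of each knob and stops, because the remaining upgrades no longer fit the budget. Making one knob's steps relatively pricier shifts the budget to the others, which is exactly why "optimal" presets differ per platform.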
> To give a real world example of this we can look at Spiderman on the PC. The RT object distance is very costly on PC vs console and while the PC does match the console setting (10) it doesn't go any higher.
10 on console is only for the Fidelity setting, which maxes out at 40 fps. If you are willing to do 40 fps on PC, then it is very "cheap"! The Performance mode (targeting 60 fps+) is between 7 and 8 on PC.
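The Fidelity/Performance split above comes straight from frame-time arithmetic: a 40 fps target leaves roughly 50% more time per frame than a 60 fps target, which is why the same RT object distance can be affordable in one mode and not the other. A trivial sketch:

```python
# Frame-time budget for a given fps target: budget_ms = 1000 / fps.
def frame_budget_ms(fps):
    return 1000.0 / fps

# 40 fps (Fidelity-style mode) vs 60 fps (Performance-style mode):
# 25 ms vs ~16.67 ms per frame, i.e. ~50% more time to spend at 40 fps.
print(round(frame_budget_ms(40), 2))  # 25.0
print(round(frame_budget_ms(60), 2))  # 16.67
```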
> Yup, I'm playing the game with "optimized ray tracing" mode and it still ENHANCES the image quality a lot, while bringing huge uplifts in performance. I don't care if the higher settings look better, I just want settings tailored to my hardware. I could understand the sentiment if the said optimized settings weren't worth the hassle in terms of image-quality improvement, but they are.
Thousands of games do that. In FIFA 23, going from Ultra to Low the difference is not that big; in fact the game is CPU-limited, at Ultra it maxes out at 115-119 fps on my computer, and so it does at Low settings. Playing with optimized settings in Elden Ring doesn't make much of a difference versus having everything set to Maximum. The most noticeable difference to me is the grass density, and even so it's not that easy to notice.
"low", in most cases NOWADAYS, became a semantics trap laid out by developers to lure people into thinking they need better and more expensive hardware to play games. Let's be real here; if something is to be low, it must looks hideous/bad quality. Most modern games nowadays do not even look "that" much different from bad at such low settings, which is funny. back then, I mean pre-2010, low settings really meant "low". graphics would go haywire. I remember setting GTA IV and watch it evolve into something completely different, hideous and ugly. Nowadays I set something like GOTG or RDR 2 to low EXCEPT texture settings, and game looks... %70 same. Oh, a couple trees and grasses are not there, oh that shadow is just a bit more jagged. Nowadays settings are too granular between low and ultra. Back then, low meant really low. Shadows would be gone, ACTUAL performance gains were there, stuff looked ugly, draw distance would be limited to 3 meters in front of you etc. Nowadays, one would expect RDR2 draw distance to go haywire at low but no, all mountains, trees etc. present themselves with all of their glory even at lowest preset BAR texture quality (crucial).
Nowadays, most settings are minuscule changes that you need an 800% zoom to find the differences in. This is why I personally think the term "low" should be used sparingly. Call it "original", "normal", or at least "medium" or something. "Low" makes people think it is a bad thing. For example, "low" crowd density in Spider-Man makes no sense, as the streets are still full of NPCs. I think I can add one more to Alex's list: name the settings PROPERLY. If you call a setting LOW, it MUST adhere to the semantics of that word. It must bring the ramifications it represents. The same goes the other way: sometimes Ultra is just a buzzword in certain games.
I'm saying this not because I dislike such settings. It's just that the people who NEED to use those settings cannot bring themselves to use them because of the naming scheme. They automatically think of low settings as hideous things that should not be touched, despite having low-tier or mediocre hardware. I cannot blame them.
God of War proves my theory on this. My friend happily played the game on the "original" preset with great performance. He has a 1070, mind you, and is an addict of "high/ultra" settings, often cursing at games for bad performance on such settings. The thought of using medium settings is a sin to him. Even though I urged him to turn certain settings down to medium or low in RDR2 for great performance gains, he couldn't bring himself to do so and forced himself to play at 45 FPS, even though 60 FPS would have been attainable with the optimized settings. Yet the term "original" was enough of a safe haven for him to consider those settings playable. See? Psychology is in play here.
I agree with them too. "Low" and "medium" carry some serious connotations, at least for the regular end user, especially if said user experienced what low/medium meant back in the 2000s. Some of my friends I was able to enlighten, and I owe this to DF's analyses of optimized settings. Seeing what these so-called "ultra" and "medium" settings do in close-ups was a great help for some of them.
But sadly, they can only reach so many people. I think the underlying mentality behind naming schemes has to change. I think calling reasonable medium settings "Normal" would be a great way of doing it. I saw some games do this, and people were happy with such settings. You can always call the higher tiers Ultra, Extreme, Awesome, etc. Just call the reasonable settings Normal, Original, or something that sounds positive rather than bad. Neither "low" nor "medium" sounds positive; they sound negative to most users. They sound like "hehe, you HAVE to use this hideous setting to get playable performance. You cannot play on high, hehe."