I don't have an HDR capable screen, can it still impact SDR? Sorry, I don't understand HDR things much.

I've seen that same orange glow in other games where HDR tone mapping isn't working properly.
Arbitrary resolution counting doesn't really matter to what generation the hardware is, as long as the resulting image looks passable and/or it allows for a stable fps. It's no longer even next gen, it's current gen. You're acting like DLSS, TSR, FSR and XeSS are not vital to the gaming experience now. They are, and that applies to all hardware.

Jedi was designed to run at 30 FPS. This performance mode is a joke. "Next Gen" console and then upscaling from ~1080p with FSR performance and reduced quality.
I think console gaming has to go back to simpler times.
Another fantastic straw man. That has never been my argument. My argument has been that in the context of the discussion occurring here on b3d, your point is pedantic as the audience here already knows it. I've been in support of you correcting misinformation at the source.

Sigh... okay, to try and bring this back to something resembling a technical discussion, let me explain why this kind of misreporting of technical details does indeed have the potential to impact people's purchasing decisions.
Scenario 1
- Reviewer claims that a 4090/7900XTX is unable to maintain 60fps in a game while in reality it is bottlenecked by, let's say, a 5950X to a 50fps average, and without that bottleneck the GPU could comfortably average 120fps.
- Consumer who owns (for example) a 3060 along with a 5600X concludes that since their GPU is less than 1/3 as fast as the GPU that cannot hit 60fps, the game will be unplayable on their system.
- Consumer decides not to purchase the game on those grounds.
- Consumer has been misinformed, because in fact their GPU was quite capable of hitting playable frame rates in the game, while their CPU, being only roughly 10-15% slower for gaming, would still have been in the 40fps range, enough for a solid 30fps lock (or higher with VRR).

Scenario 2
- Reviewer claims that a 4090/7900XTX is barely able to maintain 60fps in a game while in reality it is bottlenecked by, let's say, a 7800X3D, and without that bottleneck the GPU could easily exceed 100fps.
- Consumer who owns (for example) a 3080 along with a 3700X concludes that since their GPU is only around half as fast as the GPU that is just about hitting 60fps, they need to upgrade their GPU to something faster.
- Consumer throws $1200 down on a 4080, buys the game and loads it up.
- Consumer is massively bottlenecked by their 3700X to around 30fps because they were misinformed by the reviewer. If they had spent 1/4 of that $1200 on a new CPU (5800X3D) instead of the GPU, they could have achieved the same ~80% of the reviewer's system performance that they expected to get from their GPU upgrade. Instead, they spent 4x as much and got no performance increase at all over their older GPU.

Obviously, given the variety of hardware combinations out there, it would be possible to come up with hundreds of different variations on the above scenarios that all lead to the consumer making a poor choice because they were misinformed.
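The reasoning in both scenarios comes down to one relation: delivered frame rate is roughly the minimum of the CPU-bound and GPU-bound rates. A quick sketch of that model, with purely illustrative fps numbers (none of these are real benchmarks for any game or part):

```python
# Frame rate is capped by the slower of the two pipeline stages.
# All fps values below are illustrative assumptions, not measurements.

def effective_fps(cpu_fps: float, gpu_fps: float) -> float:
    """A system can't render faster than its most bottlenecked stage."""
    return min(cpu_fps, gpu_fps)

# Scenario 2 style example: the consumer is CPU limited, so a GPU
# upgrade alone changes nothing.
old_rig  = effective_fps(cpu_fps=30, gpu_fps=55)    # e.g. 3700X + 3080 -> 30
gpu_swap = effective_fps(cpu_fps=30, gpu_fps=110)   # $1200 on a 4080   -> still 30
cpu_swap = effective_fps(cpu_fps=48, gpu_fps=55)    # cheaper 5800X3D   -> 48

print(old_rig, gpu_swap, cpu_swap)  # 30 30 48
```

The same model explains Scenario 1: a reviewer's CPU-bound 50fps result says nothing about how a much slower GPU would fare, because the GPU was never the limiting stage in that test.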
Let's be honest, if we boil this right down to basics, you're arguing that reviewers putting out false/misleading information doesn't matter. Are you really willing to die on that hill?
This is a joke, right? The devs can't be that bad... I don't know if it's just me, but it appears that there's a generational shift going on in the games industry in terms of talent. This might not be correct, but I get the impression that the talent from the PS360 era is starting to retire or go find new challenges, and to me the new talent is not remotely up to scratch. Saints Row, Fallen Order, Gotham Knights, etc. comprise a list of technically unimpressive efforts that are unnecessarily demanding.

The write up says the performance mode is reconstructed from between 972p and 1080p up to 1440p. Is the internal res actually even lower?
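For a sense of what those quoted resolution figures mean in pixel terms, here is a quick sketch (assuming a 16:9 frame; the heights are the ones from the write up):

```python
# Rough upscaling math for the quoted figures. Resolutions are frame
# heights, assuming a 16:9 aspect ratio (width = height * 16 / 9).

def pixels(height: int) -> int:
    """Total pixel count of a 16:9 frame with the given height."""
    return height * (height * 16 // 9)

for internal in (972, 1080):
    output = 1440
    axis_ratio = output / internal
    pixel_ratio = pixels(output) / pixels(internal)
    print(f"{internal}p -> {output}p: {axis_ratio:.2f}x per axis, "
          f"{pixel_ratio:.2f}x the pixels")
# 972p -> 1440p: 1.48x per axis, 2.19x the pixels
# 1080p -> 1440p: 1.33x per axis, 1.78x the pixels
```

So even the best case quoted (1080p) is reconstructing almost 1.8x the pixels it actually renders, and the worst case well over 2x.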
It does look shockingly bad at times in performance mode, with loads of aliasing and shimmering, and it has loads of what I assume are FSR artifacts when you move the camera.
If you are happy with 30fps then both consoles seem ok.
I think devs are being crunched to death, overworked, and overruled by publishers that want games out faster, while the complexity and scale of games is not slowing down but vastly increasing with every title.
It's definitely a problem. But I won't fault the human labor for being "unskilled" it's a bad take.
The problem with this is who knows what the "consequences" are internally for Respawn not making this date or financial quarter. The first time I heard compensation was being withheld over arbitrary Metacritic review scores after the fact, I knew there was tons of shit underneath the counter we were not seeing in how devs are pressured by publishers.

I think that's true, but also in this case EA asked Respawn if they needed more time. They declined.
I don’t think all the blame can be put on the publishers when there are developers who manage to ship technically competent games in reasonable time frames. There is a huge range in skill/talent when looking at any complex endeavor, why do people think coding would be any different. It’s no coincidence that the best games always come from the same small pool of developers.I think devs are being crunched to death, overworked, overruled by publishers which want games out faster with the complexity and scale of games not slowing down and instead vastly increasing with every title.
Of course, if it was just a matter of Respawn overestimating themselves, that's fine. But it's very hard to believe they could not have known about the issues inherent in the game's SKUs; it looks more like a mindset of pushing it out and fixing it post launch.
Either way I don't understand why the Metacritic score is so high.
User scores are very sus a lot of the time (see Burning Shores).

Lol the user rating is 1.4!
I think it's a matter of reviewers not being as technically inclined, playing the game on the "most optimal" sku at launch and stuff like that. Unless the issues actually make the game unplayable to that degree, fps drops like on PS5 aren't gonna affect the evaluation of the game itself
Here's my overclocked i5 12400f in the same spot on Ultra settings (And still CPU limited)
[screenshot attachment]
That doesn't seem very obvious to me? CPU utilization is probably pretty bad on consoles as well.

In short the game was obviously designed to run on consoles @ 30fps. That combination seems to do very well.
Yes, at that low resolution the performance mode is very likely also CPU limited on consoles. It's probably because of the default RT reflections, surprisingly also present in that mode. This is my personal guess.

The game has weird glowy orange hair in lit scenes on Series S and usually on non ray traced cards. What's up with that?
I was happily playing but now I'm at a location where my hair gets orange glow whenever I move. Ray tracing somehow fixes it. Any ideas?