> I would not worry. Considering they only have to worry about Xbox, they should have a much better time optimizing for them. For my part, I'd hope the Series S doesn't have performance issues, because that's what I will eventually be playing it on.
A lot of the issues, I would say, aren't due to multi-platform development.
> That's not a thing.
Happens every boot for me in any recent Call of Duty game and Uncharted 4, and I'm hearing it happens in Hogwarts Legacy as well.
Great Dead Space PC breakdown video!
With the way 2022 ended and 2023 started for PC ports, I am worried for Starfield.
Currently I have a PS5, so I can play some of the titles I'm interested in over there, where the issues are noticeably lessened and I don't have to wait 1-4 minutes for shader compilation or cache checks every time I boot a game.
That said, I don't own a Series X. In 2020 I went with an X570/3800X-based PC, and in 2021 I upgraded my older GPU to an RTX 3060. It was already expensive and hard to get, and I wasn't willing to spend more (I bought it at retail, but prices back then were higher than MSRP).
So the PC is good, but it doesn't have anywhere near enough power to brute-force past any optimization issues... and I don't plan on upgrading again anytime soon.
...I am worried for Starfield.
> Wonder what's causing cutscenes to take up so much VRAM -- loading a lot more shader variants?
Maybe loading a bunch of uncompressed vertex animation.
> Maybe loading a bunch of uncompressed vertex animation.
Huh, maybe. That's more plausible than my guess, but still pretty hard to believe.
> Great video. It would have been nice to see how RDNA 2 cards fared on some of these issues with their larger frame buffers.
In this case it's just not included due to time -- I have to move on to other work, and the video is already 24 minutes long without them... and the video was already "late" (not really "late", but out after release day).
A big issue with my work, I think, is that it requires a lot more time and effort than a console video -- and people in our audience are extremely fickle. If a video covering PC comes out at a normal pace after a game's release, people just do not watch it, as the next new shiny thing is there for them to latch onto. That happened with Dead Space and, to a degree, Forspoken.
I got Dead Space review code right at release time, but I was still working on Forspoken because I likewise got Forspoken code right at release time. That means I work on my reviews sequentially while consumers already have both games in their hands. And every day a video goes unreleased, it gets fewer viewers and makes less money, because of how fickle our audience is.
> ... and people in our audience are extremely fickle.
I feel personally attacked. Just kidding, great video!
Interesting talk about DLSS 3. Contrary to the video, I found DLSS 3 to be unusable in The Witcher 3 but fine in Portal. I need to have a play around with W3 some more, though, as I think I can get it running better. I was getting a pretty poor framerate without FG (30s to 40s, I think, from memory), and turning DLSS 3 on increased things into the 50s or so, but the latency and pacing were horrendous.
Cheers. I want to avoid a full platform upgrade again at the moment, though. I don't like to swap the mobo out too often (nothing's gone wrong before, but I'm always conscious of it!).
I'm toying with the idea of picking up a used 5800X3D on eBay, which is the fastest CPU my current system will support and would likely give me an easy 50%+ improvement. But for now I'm not hitting any real issues on this CPU outside of The Witcher, and I can live with those, tbh, with frame generation in place. I'm not actually playing the game anyway, just playing with settings.
Yeah, I can get it a fair way slower than that too. That said, it's less the city that I had a problem with (which, outside of the flickering HUD elements, was much better with FG, albeit far from ideal), and more an area in a small village I was testing, which was likely a mix of both CPU- and GPU-bound. There, when I turned on FG, the frametimes went crazy and input latency went through the roof -- unplayably so (and I'm pretty tolerant in that respect). Eyeballing it, I'd say 200-300 ms! However, after resetting the game that settled down a lot, so it was probably just a bug related to changing the settings, perhaps in combo with RTSS, which at that point I think may still have had a frame cap engaged, albeit a generous one.
They said they were pushing a billion polygons in VR? Seriously?
"Having to check game caches every time you load a game"?
I've been PC gaming for over 30 years and not only have I never done it, I don't even know what it is.