Yeah, Fortnite already takes advantage of a lot of this, I think.
Fortnite looks a lot better than it used to. A lot of people see gameplay on low graphics settings, or they think of what it looked like five years ago. It’s very nice now. It’s also had motion matching for quite a while now, I think, when...
Do you think the margins on GPUs have grown or stayed about the same since prices went up? Honest question. I don't know how much profit Nvidia turns on a 4080 or 4090, or how that's changed from, say, a 2080 or 2080 Ti. I would guess their margins have gone up and they're comfortable...
I basically started this line of the thread. I'm also an Nvidia owner and haven't considered switching to AMD or Intel. I'm waiting on the 5070/5080 to see pricing, performance, etc. When I look at the GPU market where I live, the RTX 4080 is still about $1300 CAD, which is $300 more than I paid for...
We need AMD and Intel to wake the f' up. At some point it won't make sense for Nvidia to sell consumer GPUs at all, because every dollar they put into AI will return far more than any GPU they could sell. I'm interested in maybe getting a 5070 or 5080, but I have a feeling they'll be...
I guess Unity hasn't given up on DOTS. Livestream tomorrow, unless the livestream is to announce it's dead lol.
https://www.youtube.com/live/3pLun0GKSAs?si=z6TmCc1LvrU0jues
@trinibwoy It will be interesting to see CDPR's use of UE5 for sure, but there's always the option that they basically don't use Lumen and roll their own ray tracing/GI. UE5 gives you a lot, but you have the option to fully customize any part of the engine that you want. They don't strike me as...
Moon Studios has an incredible art team. Game looks fantastic and has a very unique visual style. The curved world and the subtle lighting being centred on the player are really smart design choices, and they help highlight the great artwork in the game.
I feel like something went wrong on the Microsoft side, where they had plans for things to take advantage of their little AI tweaks (INT4/INT8?) on the Xbox and they didn't work out. Either that, or they straight up just made tiny enhancements without any real plans, because machine learning and...
Yeah, you can enable CPU wait and GPU wait in the overlay. You can show an average or a raw number (plus other options), and you can graph it.
I don't use GPU wait or CPU wait. Maybe I should, I don't know. I basically just plot GPU and CPU busy relative to the total frame time. I use all raw...
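For anyone curious what "GPU and CPU busy relative to total frame time" looks like in practice, here's a minimal sketch that computes those fractions from a captured CSV. The column names (`FrameTime`, `CPUBusy`, `GPUBusy`) are assumptions based on PresentMon-style capture output, so check them against your actual file.

```python
# Sketch: per-frame CPU/GPU busy as a fraction of total frame time.
# Column names below are assumed, not guaranteed to match your capture.
import csv
import io

# Stand-in for a real capture file; times are in milliseconds.
SAMPLE = """FrameTime,CPUBusy,GPUBusy
16.7,6.2,14.1
16.7,5.9,15.3
33.4,7.0,30.2
"""

def busy_fractions(csv_text):
    """Return a list of (cpu_busy/frame_time, gpu_busy/frame_time) per frame."""
    out = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        ft = float(row["FrameTime"])
        out.append((float(row["CPUBusy"]) / ft, float(row["GPUBusy"]) / ft))
    return out

for cpu, gpu in busy_fractions(SAMPLE):
    print(f"cpu {cpu:.0%}  gpu {gpu:.0%}")
```

Plotting those two series against 100% of the frame makes it obvious at a glance which side is the bottleneck.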
Next-gen should target 1080p60 upscaled to 4K, so all of the rendering power can be put towards more complex algorithms instead of raw pixel counts, and VRAM requirements can stay relatively flat.
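The arithmetic behind that: rendering internally at 1080p means shading only a quarter of the pixels of native 4K each frame, which is the budget that gets freed up.

```python
# Pixel-count ratio of native 4K vs. a 1080p internal resolution.
native_4k = 3840 * 2160       # 8,294,400 pixels
internal_1080p = 1920 * 1080  # 2,073,600 pixels

print(native_4k / internal_1080p)  # → 4.0
```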
🍿
Back to DF, I'm really interested to see what they get out of PresentMon. I've played around with...
I appreciated the amount of practical effects and sets that were used. The only set I didn't really like was Filly, but it still wasn't bad. Maybe it was just the lighting in those scenes, but it's the only one that didn't look quite the same as the rest of the show. Otherwise, I think they did...
I don’t see why accurately reporting resolution is suddenly a problem. And yes, letterboxing is a different output resolution.
Sure, forum warriors might use it as a talking point, but that's no different than anything else.
And the resolution is relevant when talking about performance. I do run...