“No one’s saying devs suck, they’re just not as good as they once were.” How is this anything other than pure conjecture? Sure, question and even complain when what you’re seeing isn’t as impressive or performant as you expect; just don’t paint with the “devs suck” brush. The next useful step in a technical forum isn’t to figure out why the devs weren’t good enough, but what compromises may have led to what you’re seeing. Talent may be a factor, but it’s generally a black box to us (and time, money, and leadership can all be contributors).

“I don’t think anyone is saying devs suck, but we are criticizing their implementations. Personally, I do not think the level of talent is as high as it was in the past, for a variety of reasons.”
Starfield isn’t bad just because it’s not as impressive as Cyberpunk in some respects. Keep in mind that Cyberpunk was pushed out the door by leadership to meet financial deadlines and shipped at least slightly undercooked even on PC (police spawning behind the player, pedestrians despawning if you turned your back, traffic skating around on rails: see all the GTA comparison videos).
(And as highly praised as Cyberpunk was and is visually, CDPR still decided to abandon their in-house engine for their next game. Talent flight? Leadership priorities? Market realities?)
You indirectly questioned Digital Foundry’s qualifications to weigh in on a game’s technical merits (how many other “YouTube outlets” are “celebrated on here”?), yet it’s likely their analysis is at least somewhat informed by dialogue with developers. They posited Starfield’s adherence to the Bethesda standard of object (semi-)permanence as a potential limitation, and here we are with many benchmarks showing a decent correlation between framerate and memory access/latency.
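That correlation makes intuitive sense to me. Here’s a toy C++ sketch (purely hypothetical, nothing to do with actual Creation Engine internals) of why per-frame work over a large pile of persistent, heap-scattered objects tends to be bound by memory latency rather than raw compute:

```cpp
// Toy sketch (hypothetical, not engine code): updating many persistent objects
// each frame. Compares a packed, contiguous layout against the same objects
// individually allocated and visited in shuffled order, which approximates
// pointer-chasing through a fragmented heap.
#include <algorithm>
#include <chrono>
#include <cstdio>
#include <memory>
#include <random>
#include <vector>

struct PersistentObject {
    float pos[3];
    float rot[4];
    int   cellId;
    int   flags;
};

int main() {
    constexpr int kObjects = 200000;  // e.g. everything ever dropped or moved
    constexpr int kFrames  = 200;

    // Case A: objects packed contiguously (cache/prefetcher friendly).
    std::vector<PersistentObject> packed(kObjects);

    // Case B: same objects allocated separately, traversal order shuffled.
    std::vector<std::unique_ptr<PersistentObject>> scattered;
    scattered.reserve(kObjects);
    for (int i = 0; i < kObjects; ++i)
        scattered.push_back(std::make_unique<PersistentObject>());
    std::shuffle(scattered.begin(), scattered.end(), std::mt19937{42});

    float sink = 0.f;  // printed at the end so the work isn't optimized away

    auto timePerFrame = [&](auto&& frameUpdate) {
        auto t0 = std::chrono::steady_clock::now();
        for (int f = 0; f < kFrames; ++f) frameUpdate();
        auto t1 = std::chrono::steady_clock::now();
        return std::chrono::duration<double, std::milli>(t1 - t0).count() / kFrames;
    };

    double msPacked = timePerFrame([&] {
        for (auto& o : packed) sink += o.pos[0] + static_cast<float>(o.flags);
    });
    double msScattered = timePerFrame([&] {
        for (auto& p : scattered) sink += p->pos[0] + static_cast<float>(p->flags);
    });

    std::printf("avg update: packed %.3f ms, scattered %.3f ms (sink=%f)\n",
                msPacked, msScattered, sink);
}
```

On most hardware I’d expect the scattered traversal to be noticeably slower per “frame,” purely because of where the data lives rather than how much math is done. Scale that up to tracking the state of every object a player has touched across every cell, and it’s at least plausible why Starfield benchmarks track memory performance so closely.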