Do you think there will be a mid gen refresh console from Sony and Microsoft?

Well, it's not a 'test', just a very obvious and pretty undeniable observation. My point here is that there is no worthwhile way to actually 'test' or measure this at all.
Disagree. If your observations are meaningful, then observing the choices of thousands and thousands of console gamers will be more meaningful. Seeing the performance/quality choices console gamers make in games that support them will give a little useful insight into what gamers prefer.
 
GTA is known for going big or going home, and that comes with a cost. I would be very sad if they watered down their ambitions just to satisfy this '60fps or bust' mentality.
Games can offer both fidelity and performance, and that is what games are doing. Why are publishers making an effort to include a 60fps mode?

It's an absolute mystery. :nope:
 
Games can offer both fidelity and performance, and that is what games are doing. Why are publishers making an effort to include a 60fps mode?

It's an absolute mystery. :nope:
Because overall the big money is currently being made on >60fps gaming (PC, mobile, and COD, Fortnite and sports games on consoles). With their wallets, consumers have already decided what's best for them, and for the vast majority it's >60fps gaming.

30fps gaming is becoming a relic of the past.
 
GTA VI says otherwise.

It's probably smart to target 30 fps on existing XSX and PS5 knowing that this game is going to last 10+ years and they can fleece people for 60 fps versions later. Get the visuals beautiful at 30 fps and then ramp up the speed later.
 
GTA VI says otherwise.

It's probably smart to target 30 fps on existing XSX and PS5 knowing that this game is going to last 10+ years and they can fleece people for 60 fps versions later. Get the visuals beautiful at 30 fps and then ramp up the speed later.

Insomniac have even confirmed they've seen little sales difference between 30 and 60fps; most people aren't "Gamurs!" obsessed over metrics.

Besides, GTA VI is probably CPU bound. If you're GPU bound, at least at 4K resolutions, it's plausible you can just lower rendering resolution/LOD settings/etc. until it runs at 60, e.g. Spider-Man 2. But if you're CPU bound, that cost is often a fixed baseline; there's nothing to scale down, and you're stuck at your target.

Unlike GTA, with its physics- and AI-strewn open world, most games (especially so far this generation) don't use the CPU for all that much and just aren't as likely to be CPU bound.
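
To put that in toy-model terms (an illustrative sketch, not how any real engine schedules work; all the numbers are made up):

```python
# Toy model: frame time is gated by whichever of CPU or GPU takes longer.
# Dropping render resolution only shrinks the GPU side of that equation.
def frame_time_ms(cpu_ms: float, gpu_ms: float, resolution_scale: float = 1.0) -> float:
    # GPU cost scales (very roughly) with pixel count; CPU cost does not.
    return max(cpu_ms, gpu_ms * resolution_scale)

# GPU-bound game: halving the pixel count roughly doubles the framerate.
print(frame_time_ms(cpu_ms=10.0, gpu_ms=30.0, resolution_scale=0.5))  # 15.0ms -> ~66fps

# CPU-bound game (heavy simulation/AI): the resolution drop changes nothing.
print(frame_time_ms(cpu_ms=30.0, gpu_ms=20.0, resolution_scale=0.5))  # 30.0ms -> stuck at ~33fps
```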
 
GTA VI would be the perfect title for a PS5 Pro and a mid-gen Xbox Series refresh to offer a 60fps mode at launch. Releasing either mid-gen console around when GTA VI launches would be great.
 
Games can offer both fidelity and performance, and that is what games are doing. Why are publishers making an effort to include a 60fps mode?

It's an absolute mystery. :nope:
I really shouldn't have to explain that this isn't how it works. If it were that simple, then every game in the XB1/PS4 generation could have had a 30fps and a 60fps mode as well, but they didn't. It's not like nobody had thought of this ("Woah, why don't we just also make the game 60fps!?"). It's that you're going to build your game differently if you're targeting 30fps versus having to target 60fps.

We are not just talking fidelity here, either. Again, shouldn't have to explain that there's more to a game's processor demands than just basic graphics features. Obviously any game that has very low CPU demands can scale graphics and performance better.

Please don't waste both of our time here making me explain how games aren't infinitely scalable, and that having a 30fps/33ms framerate target allows them to fundamentally do more than with a 60fps/16ms target. This isn't some small difference. That's actually a really big boost in frametime headroom to play with.
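
The budget difference is plain arithmetic (a quick illustration, nothing more):

```python
# Per-frame time budget at each target framerate.
for fps in (30, 60):
    budget_ms = 1000 / fps
    print(f"{fps}fps -> {budget_ms:.1f}ms per frame")
# 30fps -> 33.3ms per frame
# 60fps -> 16.7ms per frame
# A 30fps target gives roughly double the per-frame headroom for
# simulation, AI, streaming and rendering combined.
```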

Disagree. If your observations are meaningful, then observing the choices of thousands and thousands of console gamers will be more meaningful. Seeing the performance/quality choices console gamers make in games that support them will give a little useful insight into what gamers prefer.
They aren't my observations. This isn't some personal anecdote. We can both observe that console gamers are entirely fine with 30fps gaming. So much so that even today, a 30fps game like Tears of the Kingdom can be considered one of the very best games ever made by critics and gamers alike. At absolutely no point has anybody said that not being 60fps is some mark against the game, or that it doesn't deserve its unabashed praise. This is in addition to countless other examples of 30fps games being completely beloved. I just keep bringing up TotK because it's a 2023 game, showing that there hasn't actually been any large shift in standards when it comes down to it.

You are proposing something different - some kind of 'measured test' that can say something about what people prefer, but I've spent the last several responses already pointing out the many reasons such a test doesn't work and that you can't measure this. I don't want to keep talking in circles here, but I don't know how else to get that across. This isn't measurable. If a test is flawed to such a degree that even tentative conclusions can't be made, you discard the test; you don't say, "Well, it's the best we have".

I guess it's gonna sound quite arrogant to say, but you can't 'prove' me wrong here. It'll sound even more arrogant when I say that even most console gamers who say "60fps or bust" are fooling themselves and would still be fine playing 30fps games when it comes down to it. Because it's 100% possible that people build up ideas of standards rather than actual real standards. Input lag is another area where I think many gamers massively overestimate their sensitivity levels (though this actually would be testable). I think there's a reason we see most of these sorts of comments coming from people in online enthusiast communities (basically any online gaming forum where people love to talk about games), which doesn't represent the average gamer any more than Twitter represents the average voter.
 
I guess it's gonna sound quite arrogant to say, but you can't 'prove' me wrong here.

Likewise you can't "prove" you are correct. Everyone is talking about hypotheticals. There are no absolutes here. No 100% statements can be made.

Because it's 100% possible that people build up ideas of standards rather than actual real standards.

I mean, duh, for something to be possible in the first place would mean it's 100% possible. Yes, some people can't tell the difference. Some people can. None of that is being disputed. Some people are willing to play at 30 FPS with varying degrees of satisfaction (the game itself) combined with varying degrees of dissatisfaction (the lowered graphical quality of 30 FPS and/or lowered control quality and/or lack of crisp controls, etc.).

Being unhappy at having to play at 30 FPS doesn't mean that they don't like the game itself, just that they really dislike playing it at 30 FPS, whether for gameplay reasons or graphical IQ reasons.

If there is no choice, then you can't make any judgements about whether X gamer would prefer that game at 30 FPS or 60 FPS. If a game was only offered at 60 FPS, and everyone is playing that, you can't make the statement that the majority of gamers prefer 60 FPS. Likewise if you can only play a game at 30 FPS, you can't make a statement that the majority of gamers prefer 30 FPS.

It's why, as Shifty Geezer has noted, it'd be nice if we had access to publisher/console holder databases showing the split between people playing at 30/40/60/unlocked settings on console. It'd at least be a good foundation from which to start.

Just because a gamer is held hostage by a developer/publisher's choice to force just one framerate does not mean that the gamer actually enjoys that framerate even if they like the game itself.

Regards,
SB
 
GTA VI says otherwise.

It's probably smart to target 30 fps on existing XSX and PS5 knowing that this game is going to last 10+ years and they can fleece people for 60 fps versions later. Get the visuals beautiful at 30 fps and then ramp up the speed later.
Only for a limited period, while they milk those console peasants and their weak Zen 2 CPUs. But eventually the bulk of GTA 6 sales will be on >60fps platforms, and many will double dip to play at >60fps. With the diminishing returns of graphics, the switch away from 30fps as the sweet spot was inevitable.

The day the Digital Foundry crew, '30fps motion blur lovers', favored playing Horizon Forbidden West using the patched 60fps performance mode was the day 'bells and whistles 30fps gaming' died.
 
Wow, so PS5 Pro is rumored (post I just read on Ree Era) to be 56 CUs at just 2GHz? 14.33 TFLOPs. And 16GB RAM?

These seem to be the same type of leaks that correctly predicted PS5/XSX specs before release, as well

  • Viola is fabbed on TSMC N4P.
  • GFX1115
  • Viola's CPU is maintaining the zen2 architecture found in the existing PS5 for compatibility, but the frequency will once again be dynamic with a peak of 4.4GHz. 64 KB of L1 cache per core, 512 KB of L2 cache per core, and 8 MB of L3 shared (4 MB per CCX).
  • Viola's die has 30 WGPs when fully enabled, but only 28 WGPs (56 CUs) will be enabled for the silicon in retail PS5 Pro units.
  • Trinity is the culmination of three key technologies. Fast storage (hardware accelerated compression and decompression, already an existing key PS5 technology), accelerated ray tracing, and upscaling.
  • Architecture is RDNA3, but it's taking ray tracing improvements from RDNA4. BVH traversal will be handled by dedicated RT hardware rather than fully relying on the shaders. It will also include thread reordering to reduce data and execution divergence, something akin to Ada Lovelace SER and Intel Arc's TSU.
  • 3584 shaders, 224 TMUs, and 96 ROPs.
  • 16GB of 18Gbps GDDR6 on a 256-bit memory bus, for 576 GB/s of memory bandwidth.
  • The GPU frequency target is 2.0 GHz. This lands the dual-issue TFLOPs at 28.67 TFLOPs peak (3584 shaders × 2 ops per FMA × 2 for dual issue × 2.0 GHz), or 14.33 TFLOPs if we ignore the dual-issue factor.
  • 50-60% rasterization uplift over Oberon and Oberon Plus, over twice the raw RT performance.
  • XDNA2 NPU will be featured for the purpose of accelerating Sony's bespoke temporal machine-learning upscaling technique. This will be one of the core focuses of the PS5 Pro, like we saw with checkerboard rendering for the PS4 Pro. Temporally stable upscaled 4K output at higher than 30 FPS is the goal.
  • September 2024 reveal
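
For what it's worth, those headline numbers are internally consistent (a quick sanity check using only figures from the leak itself):

```python
shaders, clock_ghz = 3584, 2.0

# FP32 throughput: shaders x 2 ops per FMA x clock, then x2 for RDNA3 dual issue.
tflops = shaders * 2 * clock_ghz / 1000   # 14.336 -> the quoted 14.33 TFLOPs
tflops_dual = tflops * 2                  # 28.672 -> the quoted 28.67 TFLOPs peak

# Memory bandwidth: 18 Gbps per pin x 256-bit bus / 8 bits per byte.
bandwidth_gbs = 18 * 256 / 8              # 576.0 -> the quoted 576 GB/s

print(tflops, tflops_dual, bandwidth_gbs)
```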


Yeah, at that point if I'm Xbox I don't even worry about it. And if I'm Sony, I've gotta price it no higher than $499.

The "XDNA 2 NPU" could be some kind of wild card, I guess, though.

Apologies, I have no idea how much of this has been discussed in pages past.
 
Likewise you can't "prove" you are correct. Everyone is talking about hypotheticals.
I'm not talking about hypotheticals at all, though. Y'all are really not grasping the dynamics of this discussion here.

I've already proven I'm correct. There are countless 30fps console games that are wildly loved. There are no "It's an otherwise great game, but it's only 30fps..." qualifiers for these games; they are simply beloved. This isn't some personal take, I'm just stating objective reality here.

And I bring up Tears of the Kingdom because it proves that even in 2023, 30fps has not become some hindrance to console gamers loving a game. And I also think it's terribly obvious that if Breath of the Wild/Tears of the Kingdom had had to be 60fps games, they would not have been able to make these games what they were. You can of course press on this uncertainty and make bad faith arguments knowing full well what I'm saying here is 100% reasonable, but I'll hope you won't do so in order to save us all time.
 
Being unhappy at having to play at 30 FPS doesn't mean that they don't like the game itself, just that they really dislike playing it at 30 FPS, whether for gameplay reasons or graphical IQ reasons.
First off, sorry for having to double post this. I'm bad at editing quotes; the editor is not always very cooperative.

But anyway, you're talking about everything I already addressed. They aren't UNHAPPY to play at 30fps. There is no indication of this at all. I've literally talked about this relentlessly in these comments and you're just bizarrely ignoring it. Nowhere were these beloved 30fps-as-default hits knocked for merely being 30fps. There was no indication of any 'unhappiness' at all. Whatsoever. None. Zip. Zilch. All these PS4 blockbuster exclusive hits, for example, never had a word from critic or gamer about only being 30fps detracting from the experience.

It's much easier to be unhappy about 30fps when the compromises aren't that big and, importantly, an actual 60fps option even exists. Which is huge, because it means the game wasn't designed around a 30fps target on said hardware.

I can't stress this enough: it is NOT possible to measure whether console gamers prefer 30fps or 60fps, because it is not possible to demonstrate the reality of what developers have to give up to ensure their game runs at 60fps.
 
Wow, so PS5 Pro is rumored (post I just read on Ree Era) to be 56 CUs at just 2GHz? 14.33 TFLOPs. And 16GB RAM?

These seem to be the same type of leaks that correctly predicted PS5/XSX specs before release, as well




Yeah, at that point if I'm Xbox I don't even worry about it. And if I'm Sony, I've gotta price it no higher than $499.

The "XDNA 2 NPU" could be some kind of wild card, I guess, though.

Apologies, I have no idea how much of this has been discussed in pages past.
Most of that is probably just FUD from an Xbox fan. Kepler already strongly hinted the Pro will have 60 active CUs (he explains how), Henderson was also very specific about 60 CUs, and 2GHz at 4nm is also fantasy when Sony can already do 2.23GHz at 7nm.

Clocks will most likely hit >2.8GHz using a dynamic clock system. The parts that are likely true are taken from Kepler's tweets/hints. 96 ROPs is also completely unnecessary for Sony, and we pretty much know from Kepler that the base architecture will be RDNA3.5, not RDNA3. Finally, even an insider wouldn't know specific clocks like this.
 
Clocks will most likely hit >2.8GHz using a dynamic clock system. The parts that are likely true are taken from Kepler's tweets/hints. 96 ROPs is also completely unnecessary for Sony, and we pretty much know from Kepler that the base architecture will be RDNA3.5, not RDNA3. Finally, even an insider wouldn't know specific clocks like this.

Sony will likely keep the clocks down to help with thermals and power draw; the PS4 Pro didn't see a huge increase in GPU clock speed over the base PS4.

And given RDNA3's power consumption is shit, there's every reason to believe they'll hamstring clocks to try and keep it within the power/perf sweet spot.
 
Wow, so PS5 Pro is rumored (post I just read on Ree Era) to be 56 CUs at just 2GHz? 14.33 TFLOPs. And 16GB RAM?
I made a post in the GTA 6 trailer thread noting that some of the leaked footage displayed stats suggesting it was running on a 2080 Ti, and that maybe that was around where they expect the Pro console to land power-wise. The specs in your leak are somewhere around 2080 Ti level, interestingly enough.
 
Most of that is probably just FUD from an Xbox fan. Kepler already strongly hinted the Pro will have 60 active CUs (he explains how), Henderson was also very specific about 60 CUs, and 2GHz at 4nm is also fantasy when Sony can already do 2.23GHz at 7nm.

Clocks will most likely hit >2.8GHz using a dynamic clock system. The parts that are likely true are taken from Kepler's tweets/hints. 96 ROPs is also completely unnecessary for Sony, and we pretty much know from Kepler that the base architecture will be RDNA3.5, not RDNA3. Finally, even an insider wouldn't know specific clocks like this.

Yes this is FUD...
 
I really shouldn't have to explain that this isn't how it works. If it were that simple, then every game in the XB1/PS4 generation could have had a 30fps and a 60fps mode as well, but they didn't.
You're only explaining why you don't understand. Both PS4 and Xbox One had relatively weak CPUs, and choosing to use a significant chunk of the CPU to drive twice as many frames could really impact gameplay options and other mechanics - assuming the GPU could hold up. PS5 and Xbox Series have much better CPUs where driving more frames still leaves plenty of CPU time for the rest of the game.
 
Wow, so PS5 Pro is rumored (post I just read on Ree Era) to be 56 CUs at just 2GHz? 14.33 TFLOPs. And 16GB RAM?
It is weaker than a 7700 XT, but the article says it has 1.5~1.6x the performance of the PS5?

Besides, if RDNA3 is power hungry, why not use a shrunk RDNA2 on the latest process node?
 
I guess it's gonna sound quite arrogant to say, but you can't 'prove' me wrong here.
I'm not trying to. We know some gamers don't care about higher FPS. We know console gamers are content to buy and play 30 fps games. I'm simply proposing a way to see how gamers value lower and higher framerates, to a degree better than the absolutely no insight whatsoever we presently work from. ;)

If the outcome is heavily skewed one way or the other, that would indicate a preference that seeing people buy 30 fps games can't. Notably, if almost everyone chooses 30 fps, it shows console gamers don't care one jot for framerate. If nigh everyone chooses 60 fps, it shows console gamers would prefer smoother framerates for the same general experience.
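
In code terms the analysis itself would be trivial once you had the data; the hard part is getting it. A sketch over an invented telemetry table (the field names and sample figures are entirely hypothetical):

```python
# Hypothetical per-player telemetry: which mode each player actually chose.
from collections import Counter

telemetry = {"p1": "performance", "p2": "quality", "p3": "performance",
             "p4": "performance", "p5": "quality"}  # invented sample data

split = Counter(telemetry.values())
total = sum(split.values())
for mode, count in split.most_common():
    print(f"{mode}: {100 * count / total:.0f}% of players")
# A heavy skew either way would say something that sales of 30fps-only
# games, where players have no choice, never can.
```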
 
Wow, so PS5 Pro is rumored (post I just read on Ree Era) to be 56 CUs at just 2GHz? 14.33 TFLOPs. And 16GB RAM?

These seem to be the same type of leaks that correctly predicted PS5/XSX specs before release, as well




Yeah, at that point if I'm Xbox I don't even worry about it. And if I'm Sony, I've gotta price it no higher than $499.

The "XDNA 2 NPU" could be some kind of wild card, I guess, though.

Apologies, I have no idea how much of this has been discussed in pages past.
That all seems reasonable enough. Except for the 2GHz figure.

I seem to recall early news that with PS5 dev kits, you could select from a few performance modes:
1 - the dynamic model touted by Cerny
2 - CPU fixed at 3.5GHz and GPU at 1.8GHz
3 - GPU fixed at 2.23GHz and the CPU at something notably lower than 3.5GHz (although I can't remember what figure that was exactly)

So, perhaps that's the GPU's minimum clockspeed, in the same sense that 1.8GHz is the PS5's minimum? It'd fit with the "free" ~1.15x clockspeed increase you can generally expect from a 7nm > 5nm shrink.
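
The arithmetic on that lines up (assuming that ~1.15x rule of thumb holds):

```python
ps5_floor_ghz, ps5_peak_ghz = 1.8, 2.23  # base PS5 GPU clocks
node_gain = 1.15                         # assumed "free" uplift from the shrink

print(round(ps5_floor_ghz * node_gain, 2))  # ~2.07 GHz: right next to the rumored 2.0 GHz figure
print(round(ps5_peak_ghz * node_gain, 2))   # ~2.56 GHz: a plausible dynamic peak by the same logic
```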

Or maybe AMD's dual-issue architecture is simply too power hungry. Heck, maybe Sony don't want to spend money on liquid metal for cooling the Pro, and they're clocking it more modestly in keeping with their cooling solution?

Too many unknowns at the moment, but if those Pro specs are correct, I'll be pretty happy with them. Especially if they incorporate Infinity Cache, as that seems to have done wonders for AMD's discrete GPUs.
 