Current Generation Hardware Speculation with a Technical Spin [post GDC 2020] [XBSX, PS5]

Realistically, how much do 10, 20, or 30% more flops matter? It feels like we're hitting the point where development-team quality and the time invested matter more than brute force, especially if more effort enables higher-quality textures and geometric detail through better-optimized streaming and assets.

Diminishing returns are a real thing unless they're combated with higher production values. Will we end up in episodic land, where making giant games at the highest quality is no longer realistic? I believe there were rumors that the next GTA would be more episodic to bring development time down.
 
What if you have the brute force, the efficiencies, the innovations and the talent?
I think that's what MS targeted and got. The PS5 aimed for some innovations, but it's unknown whether Sony achieved all the efficiencies and technological innovations they could have.
 

I didn't want to make a versus statement; that just leads to circles and silly arguments like Halo versus Horizon. I was wondering, in a PlayStation context: if the PS5 had, say, 20% more flops but developers didn't get any more time or resources, would the games be visibly different anywhere other than still images with 200% crops? We already know developers are pushed to the max, and it's unlikely development budgets will grow much (if at all) over the previous gen.
 
Things like AI-driven asset creation, as Nvidia has demoed, will probably be used to help bring production time down. Whether that leads to more or less interesting environments than hand-crafted ones remains to be seen, however.
 
Minecraft RTX is a perfect example of low production values, but with extremely high graphical fidelity.
 

Minecraft would be a great outlier. It would be interesting if someone took the PC version with VRR on and made a comparison video on a specific card: one run at default settings, another with 20% fewer flops and the settings tweaked a little so the fps is the same but the fidelity is lower. Would the average person notice the difference between the videos, and would it be significant?
 
That's sort of been the challenge of every discussion here about graphical fidelity.

We choose a baseline, say Gears 5, crank it to ultra at 4K60, and say: well, this is all the hardware is capable of, so it can't possibly do more, because neither of these consoles should be able to outperform a 2080 Ti.

Except that consoles have never run ultra settings, ever. Most of the time they run low/medium, and sometimes high. And @DavidGraham put out some good posts where most people were completely unable to spot the differences between low, medium, high, and ultra settings, yet ultra nuked performance by insane amounts for visually marginal gains in fidelity.

If any of this sounds familiar, and it should, then suddenly RT makes more sense, not less. If you're going to take a big performance hit anyway, you may as well take it without having to hack every single scene to make it look right. Without the hacks you keep your budget elsewhere: games can allocate accordingly and prioritize where effort goes, and world builders don't need to fiddle with a bunch of things to make sure everything looks 'right'. Instead of high-budget teams being the only ones able to output high-fidelity graphics because they have the resources (money, time, and hardware) to constantly fiddle, re-bake, and redo areas and scenes over and over to mimic lighting, you may as well just run real lighting, toss all that rework out the door, and focus on something else.

I call this the equalizer. This is where studios with insane budgets and support suddenly won't be as special, because someone like FYQD can come along solo, with some contract help and asset purchases, and make something like Bright Memory Infinite.

But yeah, Minecraft is a big outlier ;) It's path traced. I'm not expecting many games this coming gen to have any sort of path tracing, except perhaps stylized games.

For me, ray tracing is the path I want games to take. Any small/solo/indie developer is looking at that and seeing an all-new way to showcase games, with whole new puzzle systems enabled by simulated light physics... and realizing they can be competitive with the bigger studios on a tiny budget with stylized art. So many indies right now make games like Meat Boy, TowerFall, etc., and even making those bitmaps is time-consuming; it takes years to make those types of games.

It would take a fraction of the time to put out blocky, ray-traced, low-detail models, and people could still really enjoy the graphics. I'm seriously eyeing UE4 right now. Unity is also a thing. Running your own ground-up RT engine is also very much a possibility now.
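To put a rough number on that re-bake tax, here's a toy cost model; every figure in it is an invented assumption purely for illustration, nothing from a real pipeline:

```python
# Toy iteration-cost model for baked vs. ray-traced lighting.
# Every number here is an invented assumption, for illustration only.

BAKE_MINUTES = 45      # assumed re-bake wait after each lighting-relevant edit
EDITS_PER_SCENE = 120  # assumed artist iterations to make one scene look 'right'
SCENES = 30            # assumed scene count for a mid-sized game

# Baked pipeline: every edit pays the bake before the artist can judge it.
baked_hours = BAKE_MINUTES * EDITS_PER_SCENE * SCENES / 60

# RT pipeline: the edit shows up next frame; the cost moves to the GPU at runtime.
rt_hours = 0

print(f"baked lighting: ~{baked_hours:,.0f} artist-hours spent waiting on bakes")
print(f"ray tracing:    ~{rt_hours} artist-hours waiting (the cost is paid per frame instead)")
```

Real pipelines preview at lower bake quality, of course, but the shape of the tradeoff is the point.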
 
Because, as stated in the video, it took him several sleepless weeks of work to patch the game so it wouldn't run at double speed. Also, this uses the PS4 Pro's boost mode to hit its targets at 720p. From Software most likely didn't get funding from Sony to do it, or to patch the game to be visually worse than on the base PS4 just to get to 60fps.
 
Tweeting yosp is a good way to let Sony know you want something; the #PS4NoDRM thing worked. Do the same with Bloodborne 4K60 on PS5 and you'll have Sony's attention.

Don't forget that if it's true Demon's Souls is what Bluepoint is working on, it's not like Sony isn't willing to invest in old properties, specifically From Software ones.

Quite a lot of us are upgrading from the base consoles, so the difference is pretty significant.

Imagine being someone who currently has a base Xbox One and goes to a Series X: that's roughly a 9.3x increase in flops. We know the efficiency gains are going to be significant too; didn't Cerny state they'd be about 60% faster flop-for-flop? That's going to be a huge difference, on top of a 4x increase in resolution.
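A quick sanity check on those numbers, using the public TFLOPs specs (the flop-for-flop figure is just the rough number above, not a confirmed spec):

```python
# Back-of-envelope math for the base Xbox One -> Series X jump.
xbox_one_tf = 1.31   # base Xbox One, peak FP32 TFLOPs
series_x_tf = 12.15  # Xbox Series X, peak FP32 TFLOPs

print(f"raw flops: {series_x_tf / xbox_one_tf:.1f}x")       # ~9.3x

# Resolution jump from 1080p to 4K.
print(f"pixels: {(3840 * 2160) / (1920 * 1080):.0f}x")      # 4x

# If the ~60% flop-for-flop figure holds, the effective gap grows further.
print(f"with +60% per flop: {1.6 * series_x_tf / xbox_one_tf:.1f}x")  # ~14.8x
```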

A lot of you guys are hardcore PC gamers with high-resolution monitors and top-of-the-range GPUs. The majority of gamers are going to see huge changes; B3Ders, not so much.

I'm excited for it. Most of the games shown during the Microsoft presentation looked amazing to me.

I'm going from a base PS4 to next-gen and buying my first 4K TV* at the same time. For me, it's a huge leap.

*I have a traditional house with antique furniture, so not the most high-tech-friendly environment. It'll all be going in my boys' room.

This is me as well. I hated the concept of the Pro iterations, and the Pro never changed my mind on that.

I bought my 500GB launch PS4 on day one and it's still kicking; even playing on a 4K TV is a nice experience. I'm expecting the PS5 to be a revelatory experience.
 
As long as there are PC ports, there will be lower targets than the XB1.
Not for much longer, IMO.
Give it two years and almost all mobile APUs on the market will comfortably surpass it, as some already do (Renoir with 7/8 Vega CUs and LPDDR4X).
 
You need to cycle the hardware out of use, which takes time. I expect there are still tons of 1050 mobile GPUs in use.
 
The 1050 is a ~1.8 TFLOPs Pascal card with 112GB/s of bandwidth; it's comfortably above the XBOne. Perhaps you meant the MX150 or lower, which is a ~1.1 TFLOPs GPU, but I wouldn't call even that below the XBOne, considering Pascal cards get more performance per TFLOP than GCN.

Plus, there are many PC games released in the past couple of years that are almost impossible to run even on a >1 TFLOPs Raven Ridge (Shadow of the Tomb Raider, AC: Odyssey), let alone the old Gen9 graphics still shipping on Comet Lake CPUs today. The same way many games today simply won't run on a Gen9 GPU, in two years many games won't run on a Raven Ridge / Picasso or Ice Lake.
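For reference, a sketch of the peak-FP32 arithmetic behind those comparisons; the clocks are approximate boost figures, and as noted above, Pascal gets more real performance per TFLOP than GCN, so treat these as upper bounds rather than benchmarks:

```python
# Peak FP32 throughput = shaders x 2 ops/clock (FMA) x clock.
# Clocks are approximate boost figures, so treat results as upper bounds.

def tflops(shaders: int, clock_ghz: float) -> float:
    return shaders * 2 * clock_ghz / 1000

gpus = {
    "Xbox One (GCN)":      tflops(768, 0.853),  # ~1.31
    "GTX 1050 mobile":     tflops(640, 1.455),  # ~1.86
    "MX150 (Pascal)":      tflops(384, 1.468),  # ~1.13
    "Renoir Vega 8 (APU)": tflops(512, 1.750),  # ~1.79
}

for name, tf in gpus.items():
    print(f"{name:22} {tf:.2f} TFLOPs")
```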
 
I feel there's a disconnect going on in this last little discussion about minimum specs. All I know is that a mate of mine can barely run Warzone on his i5 4590K with 16GB RAM and a GTX 680, and that's a hell of a lot stronger than an Xbox One.
 
They look like higher-res versions of console games, or PC versions at max settings. The next-gen vibe is pretty much non-existent so far, visually.

As long as these games are planned for the PS4/Xbox One generation too, we won't see what the new consoles can do beyond higher resolution. That's why I can't wait to see the first PS5 exclusive designed with only the SSD in mind.
 
You need to cycle the hardware out of use, which takes time. I expect there are still tons of 1050 mobile GPUs in use.

But there's a difference between a game being officially supported on a GPU (i.e. the developers specifically targeting it as 'base hardware') and games just happening to run on it. Granted, most, probably all, games on the market today will run in some form on a GTX 1050, but how many AAA games list it as the minimum requirement?

And how many will continue to do so next gen?
 
One guy patched Bloodborne to run at 60fps on a base PS4. Let that sink in. The game could have run at 60fps the whole time on the BASE PS4, and we never even got a Pro patch, let alone a fix for the bad frame pacing, from From Software.

Moral of the story: we're all doomed.

It pushed roughly the same number of pixels per second (1080p30 vs 720p60).
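The quick arithmetic behind that, for the curious:

```python
# Pixel throughput: 1080p30 (stock Bloodborne) vs. 720p60 (the fan patch).
stock = 1920 * 1080 * 30  # 62,208,000 pixels/s
patch = 1280 * 720 * 60   # 55,296,000 pixels/s

print(f"1080p30: {stock / 1e6:.1f} Mpixels/s")
print(f"720p60:  {patch / 1e6:.1f} Mpixels/s ({patch / stock:.0%} of 1080p30)")
```

So the patch actually pushes about 11% fewer pixels per second than the stock 1080p30 mode.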
 