How should devs handle ports between consoles? *spawn

Ever tried increasing the resolution while gaming on your PC?

Ever tried increasing the AA?

Or even reducing them?

Do you think devs thought long and hard about how to optimise these standard options?

Just about every other game manages; there's no reason to think it's actually hard to do.
 
Saying it over and over doesn't make it any more true.

There is absolutely no reason why, even when aiming for "parity", they cannot just enable better AA/AF, because the headroom is physically there; this isn't some theoretical nonsense like using the ESRAM two ways, where sometimes it'll actually do both at the same time by luck. If this game releases with parity, there's 500 GFLOPS sitting there doing nothing on the PS4; this is fact.

Maybe the Xbox version hits the mid 20s and the PS4 never misses a frame; then let's pick an arbitrary figure of 250 GFLOPS sitting there untapped.

I just hope Sony get such a lead that they refuse to release any game that aims for parity and put a stop to this rubbish.

Let's assume Microsoft realise the clusterfuck they made this time round and release an "Xbox One + 1" with a much better GPU than the PS5. The precedent will have been set, parity will be the norm, and if we aren't all banned, we can have the exact same argument with the roles reversed.

As a technical forum, we should be moving away from the No True Scotsman fallacy. In light of any new information, the defence always shifts so that the conclusion stays the same; it's a strongly held opinion that the PS4 will always hold a 44% power differential over the Xbox One. They don't know how, they don't know for what kind of workloads, they don't even know why it was performing better previously; the whole opinion is summed up by looking at the spec sheet.

Under your exact reasoning there are 500 GFLOPS unused (since the Xbox One is maxed out, as you say). That's unlikely: no console has hit sustained 100% ALU utilisation, the Xbox One has never spun its fans loud enough to be audible in any game I've played, and bandwidth has never really hit 192 GB/s on the ESRAM either. I don't see how you can magically throw numbers into the air like this.

tl;dr: There is still headroom on both consoles to move higher, but you just don't care to see this point. If they set the target at 900p, so be it; let them finish the game, and they can revisit resolution and graphics at a later point.
 
Ever tried increasing the resolution while gaming on your PC?

Ever tried increasing the AA?

Or even reducing them?

Do you think devs thought long and hard about how to optimise these standard options?

Just about every other game manages; there's no reason to think it's actually hard to do.

Turn up the MSAA so you can use more Gigaflops!

But not on the Xbone, because it's just luck and entirely unpredictable whether you can activate the bandwidth.
 
Ever tried increasing the resolution while gaming on your PC?

Ever tried increasing the AA?

Or even reducing them?

Do you think devs thought long and hard about how to optimise these standard options?

Just about every other game manages; there's no reason to think it's actually hard to do.

I have; did you not read my last post? Some games are CPU bound no matter what, and increasing graphical options decreases performance even when CPU bound.

When you're playing Civ 5 with four other people, 16 city states, and rampaging barbarians late in the game, when everyone has 20 or 30 large cities each and tons of units on the map, you will be both CPU and GPU limited.

I had an FX-8150 with a 6950, and going to a 7950 did not help my performance in that game; I still couldn't turn on more features. Replacing the CPU with a faster one was the key to increasing performance. Once I did that I was getting better frame rates and I was able to turn on graphical options that I couldn't previously.
 
I didn't read your post as I was mid-post when you made it. Anyway, I am in a unique position to check being CPU limited in virtually every game I play: my CPU is crap and my GPU is pretty good, so after work I shall do a little testing with the games installed in my Steam library.

If I am wrong that enabling AA and increasing the resolution doesn't hit the frame rate when CPU limited, I will apologise to you.

I will test what I have at 900p (1600x900) and 1080p.
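
Something like the sketch below is what I have in mind for judging the result: average the captured frame times at each resolution and see whether the FPS barely moves (CPU limited) or drops roughly in line with the ~44% extra pixels (GPU limited). The frame times in the snippet are placeholders, not real captures.

```cpp
// Hypothetical sketch: comparing average FPS from two frame-time captures
// (e.g. from a frame-time logger) at 900p and 1080p to see whether a game
// looks CPU limited. Frame times are in milliseconds; the data is made up.
#include <iostream>
#include <numeric>
#include <vector>

double averageFps(const std::vector<double>& frameTimesMs)
{
    double totalMs = std::accumulate(frameTimesMs.begin(), frameTimesMs.end(), 0.0);
    return 1000.0 * frameTimesMs.size() / totalMs;
}

int main()
{
    // Placeholder captures; in a real test these come from logged frame times.
    std::vector<double> at900p  = {16.2, 16.8, 17.1, 16.5, 16.9};
    std::vector<double> at1080p = {16.4, 17.0, 17.3, 16.6, 17.1};

    double fps900  = averageFps(at900p);
    double fps1080 = averageFps(at1080p);

    std::cout << "900p:  " << fps900  << " fps\n"
              << "1080p: " << fps1080 << " fps\n";

    // If FPS barely moves when the pixel count rises ~44%, the CPU is the
    // likely limiter; a large drop points at the GPU instead.
    std::cout << ((fps900 - fps1080) / fps900 < 0.05
                      ? "Looks CPU limited\n"
                      : "Looks GPU limited\n");
}
```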
 
Saying it over and over doesn't make it any more true.

There is absolutely no reason why, even when aiming for "parity", they cannot just enable better AA/AF, because the headroom is physically there; this isn't some theoretical nonsense like using the ESRAM two ways, where sometimes it'll actually do both at the same time by luck. If this game releases with parity, there's 500 GFLOPS sitting there doing nothing on the PS4; this is fact.

Maybe the Xbox version hits the mid 20s and the PS4 never misses a frame; then let's pick an arbitrary figure of 250 GFLOPS sitting there untapped.

I just hope Sony get such a lead that they refuse to release any game that aims for parity and put a stop to this rubbish.

Let's assume Microsoft realise the clusterfuck they made this time round and release an "Xbox One + 1" with a much better GPU than the PS5. The precedent will have been set, parity will be the norm, and if we aren't all banned, we can have the exact same argument with the roles reversed.

We've already had the roles reversed. Last gen, slight differences in resolution were a legitimate discussion point; now for some it's not noticeable, while for others it suddenly matters. Some Xbox enthusiasts seem to be revelling in the mantra "if I can't have 1080p, no one will", while some PlayStation fans suddenly care about it.
 
An SDK cannot gift power that isn't there in the first place.

An SDK can allow a developer to reclaim performance that was obscured by a previous SDK, and SDK improvements can allow a developer to get closer to 100% utilisation of the target hardware.


There is absolutely no reason why, even when aiming for "parity", they cannot just enable better AA/AF, because the headroom is physically there; this isn't some theoretical nonsense like using the ESRAM two ways, where sometimes it'll actually do both at the same time by luck. If this game releases with parity, there's 500 GFLOPS sitting there doing nothing on the PS4; this is fact.

You have the CPU/GPU utilisation data to back up this 'fact' I take it?
 
@DJ12
It also depends on how powerful your GPU is.
It's possible that although you're bottlenecked by the CPU at 900p, at 1080p your GPU isn't strong enough, so the frame rate goes lower.

Some games like BF4 also happily eat both CPU and GPU. Unlock more cores and FPS goes up 100%*; lower the resolution and FPS also goes up.
*(Not really 100%, but going from 2 cores to 4 cores almost doubles the FPS for me.)
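
As a rough illustration of that crossover, you can model the frame cost as whichever of the CPU or GPU is slower, with only the GPU cost scaling with pixel count. All the millisecond figures below are invented, and real GPU cost doesn't scale perfectly linearly with resolution; it's only meant to show how the bottleneck can flip between 900p and 1080p.

```cpp
// Rough illustration (not a real profile): per-frame cost modelled as the
// slower of the CPU and GPU, with GPU cost scaled by pixel count.
// The millisecond figures are invented purely to show the crossover.
#include <algorithm>
#include <cstdio>

int main()
{
    const double cpuMs       = 18.0;  // assumed fixed CPU cost per frame
    const double gpuMsAt900p = 14.0;  // assumed GPU cost at 1600x900

    const double pixels900p  = 1600.0 * 900.0;
    const double pixels1080p = 1920.0 * 1080.0;

    // Assume GPU cost scales roughly linearly with pixel count (~44% more).
    const double gpuMsAt1080p = gpuMsAt900p * (pixels1080p / pixels900p);

    const double frame900  = std::max(cpuMs, gpuMsAt900p);   // CPU is the limit
    const double frame1080 = std::max(cpuMs, gpuMsAt1080p);  // GPU takes over

    std::printf("900p:  %.1f ms/frame (%.0f fps)\n", frame900, 1000.0 / frame900);
    std::printf("1080p: %.1f ms/frame (%.0f fps)\n", frame1080, 1000.0 / frame1080);
}
```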
 
The GPU is a 7870, the Tahiti variant; not a beast by today's standards but certainly way above the CPU's capabilities.
 
So if I can turn up the resolution and AA with little or no hit in performance, I prove them wrong.
 
I said it before and I will say it again: a 1.8 TFLOP machine sounds a lot better than a 1.3 TFLOP machine, but the difference in the rest of the system is almost non-existent and in some cases may favour the graphically slower machine. We are going to see both platforms start running games at 900p, and we might see both go down to 720p by the end of the generation.

Yeah, I've wondered about that. Really not sure which way it'll go. I think it's equally likely 1080P will remain common, but it'll be interesting to watch.
 
Let's assume Microsoft realise the clusterfuck they made this time round and release an "Xbox One + 1" with a much better GPU than the PS5. The precedent will have been set, parity will be the norm, and if we aren't all banned, we can have the exact same argument with the roles reversed.

You realize this isn't a new problem, right? The Xbox stomped over the PS2 to a significantly greater degree than the PS4 does over the X1, yet most of the multiplats back then were fairly identical across the two.

I've said before that you could probably make an argument that building slightly stronger hardware in any given gen is just a waste of money, because it'll be nerfed down to the (presumably cheaper to manufacture) weaker hardware anyway.

I wouldn't agree with that, and I think the PS4's hot sales show why, but there's some validity to the argument. Again, for example, the Xbox was mostly confined to running PS2 ports in the early 2000s.
 
So if I can turn up the resolution and AA with little or no hit in performance, I prove them wrong.


No hit would be correct. Little hit could be a much larger issue once things get going.

Consoles cap frame rates at 30 or 60, so the game needs to run above 30 or 60 to ensure the frame rate is solid. During your tests you should see how far the frames dip once you get into heavy action.
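
A quick sketch of why that headroom matters, assuming a 60 Hz display with vsync where a frame that misses its slot waits for the next vblank: being even 1 ms over the 33.3 ms budget costs a whole extra refresh interval.

```cpp
// Sketch of why a capped/vsynced game needs headroom: on a 60 Hz display a
// frame that misses its 16.7 ms (or 33.3 ms) slot waits for the next vblank,
// so being slightly over budget costs a whole extra refresh interval.
#include <cmath>
#include <cstdio>

int main()
{
    const double refreshMs = 1000.0 / 60.0;           // 60 Hz display
    const double frameTimesMs[] = {30.0, 33.0, 34.0, 40.0};

    for (double t : frameTimesMs)
    {
        // Round the frame time up to the next multiple of the refresh interval.
        double displayedMs = std::ceil(t / refreshMs) * refreshMs;
        std::printf("render %.1f ms -> displayed every %.1f ms (%.1f fps)\n",
                    t, displayedMs, 1000.0 / displayedMs);
    }
}
```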
 
You have the CPU/GPU utilisation data to back up this 'fact' I take it?

I think it is simple math. If they are CPU bound, then that implies they are not GPU bound. If that is the case, then both consoles can get by using no more than 1.3 TF from the GPU: 1.8 - 1.3 = 0.5.

Of course I don't believe the resolution has anything to do with the CPU; even the producer didn't draw that connection, he was just speaking about the frame rate.
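
For reference, the commonly quoted theoretical peaks work out roughly like this (paper figures only; sustained throughput is a separate question):

```cpp
// The "simple math" above, spelled out with the commonly quoted theoretical
// peaks (PS4: 18 CUs @ 800 MHz, Xbox One: 12 CUs @ 853 MHz, 64 ALUs per CU,
// 2 FLOPs per ALU per clock). Peak figures only; sustained utilisation differs.
#include <cstdio>

int main()
{
    const double ps4Gflops = 18 * 64 * 2 * 0.800;   // ~1843 GFLOPS
    const double xb1Gflops = 12 * 64 * 2 * 0.853;   // ~1310 GFLOPS

    std::printf("PS4: %.0f GFLOPS\n", ps4Gflops);
    std::printf("XB1: %.0f GFLOPS\n", xb1Gflops);
    std::printf("Gap: %.0f GFLOPS (PS4 is ~%.0f%% higher on paper)\n",
                ps4Gflops - xb1Gflops,
                100.0 * (ps4Gflops / xb1Gflops - 1.0));
}
```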
 
Turn up the MSAA so you can use more Gigaflops!

But not on the Xbone, because it's just luck and entirely unpredictable whether you can activate the bandwidth.

There's a lot in common between the systems at a more foundational level that people often don't notice.

They're both reading data off the same Blu-ray discs with the same capacity and the same speeds.

They're both reading data off hard drives of the same speed.

They both have more or less the same CPU.

They both have 8 GB of RAM.

Again these are the foundations of the systems.

It isn't like one system is using DVD, has no internal HDD, 4 GB of RAM, and a 4-core 1.2 GHz Jaguar CPU. Just something to keep in mind, IMO.
 
Then it was understandable: the PS2 had been out for a couple of years and was a sales juggernaut.

If both had released at the same time, I'd argue things would have been different. And back then there really was something the PS2 was way better at than its more powerful competitor.

It's not entirely an apples-to-apples comparison with what's going on this gen.
 
On my phone, dammit, but I wanted to point out that Watch Dogs on PS4 didn't maintain 30 fps at 900p.

1080p would not have been pretty, just as 900p would not have been on the Bone.

You don't know that. The drops on PS4 occur mainly during heavy CPU tasks (like explosions and a lot of physics or particles computed by the CPU).

If the game were GPU limited, then the drops would occur because of the GPU, so an increase in resolution would dramatically worsen the drops.

But like AC Unity, Watch Dogs was a heavily CPU-limited game, at least on PS4.
 
So if I can turn up the resolution and AA with little or no hit in performance, I prove them wrong.

The PS4 is like a PC APU: it hasn't got separate memory pools for the CPU and GPU. The CPU and GPU affect each other's memory access and therefore performance.
 
The PS4 is like a PC APU: it hasn't got separate memory pools for the CPU and GPU. The CPU and GPU affect each other's memory access and therefore performance.
It's the same setup on Xbox One. But generally you'll be writing code that avoids lots of random accesses to memory. Ideally you'll want most of your hot code and data to live in the 4 MB of L2 cache (2 MB of L2 per quad-core cluster).

This is optimisation 101.
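
A minimal illustration of that point: the same sum over the same data, walked contiguously versus in a shuffled order. The workload is invented and the exact cost depends on the hardware, but the access-pattern difference is the "101" part, and on a shared memory pool the cache-hostile version also burns bandwidth the GPU could be using.

```cpp
// Minimal illustration of the "optimisation 101" point: the same summation
// over the same data, once walking memory contiguously (cache/prefetch
// friendly) and once in a shuffled order (lots of cache misses). On a shared
// memory pool the second pattern also wastes bandwidth the GPU could use.
#include <algorithm>
#include <cstdio>
#include <numeric>
#include <random>
#include <vector>

int main()
{
    const std::size_t count = 1 << 22;          // ~4M floats, far bigger than L2
    std::vector<float> data(count, 1.0f);

    // Sequential walk: hardware prefetchers keep the working set streaming in.
    float sequentialSum = std::accumulate(data.begin(), data.end(), 0.0f);

    // Randomised walk: same work, but nearly every access misses the cache.
    std::vector<std::size_t> order(count);
    std::iota(order.begin(), order.end(), 0);
    std::shuffle(order.begin(), order.end(), std::mt19937{42});

    float shuffledSum = 0.0f;
    for (std::size_t i : order)
        shuffledSum += data[i];

    std::printf("sums: %.0f vs %.0f (identical result, very different cost)\n",
                sequentialSum, shuffledSum);
}
```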
 