[Allegedly Leaked] Battlefield 4 Sticks to 720p/60 FPS on Next-Gen Consoles

I understand a competitive MP may feel better with 60fps, but I really don't think people care about that for the SP, especially since it's just a 5-6 hour ride.

I really care about 60 fps for single player. Rage shows how it should be done. 30 fps is nasty.
 
Sorry, but to me, stable 30fps >>> unstable 60fps. Personally I'm more annoyed by drops in framerate, whatever that framerate is when things are 'calm'.
 
I really care about 60 fps for single player. Rage shows how it should be done. 30 fps is nasty.

A dynamic framebuffer is even nastier. Nothing like when you're looking at a sharp wall and then all of a sudden you turn around and see a low-res, jaggy, muddy outdoor view.

Anyway, if I were still using my old 46" TV it really wouldn't matter, but now most of my 720p games look like a jaggy mess on my new 65". Please feel my pain; 1080p is much needed.
 
A dynamic framebuffer is even nastier. Nothing like when you're looking at a sharp wall and then all of a sudden you turn around and see a low-res, jaggy, muddy outdoor view.

Whether dynamic framebuffers are even noticeable depends on how much the buffer reduces by and for how long. And dynamic framebuffers aren't limited to 60 fps games. If you've reached the point where a dynamic framebuffer has turned into a "low res jaggy, muddy ... view" then chances are that without it your frame rate would be in the toilet.
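
Roughly what that looks like under the hood, as a minimal sketch (the structure, the 25% step, the 0.75 floor and the 1280x720 base are all made up for illustration, not how any actual engine does it): the renderer measures how long the GPU took last frame and shrinks or grows the render target so the next frame fits its budget, clamped so it never drops below some floor.

[code]
// Minimal sketch of driving a dynamic framebuffer (hypothetical numbers).
#include <algorithm>
#include <cmath>
#include <cstdio>

struct DynamicResolution {
    float scale    = 1.0f;   // fraction of the base width/height (1.0 = 1280x720 here)
    float minScale = 0.75f;  // illustrative floor so the image never degrades past it
    float budgetMs = 16.7f;  // 60 fps target

    void update(float gpuFrameMs) {
        // Pixel count scales with scale^2, so the scale that would have hit
        // the budget is roughly scale * sqrt(budget / measured time).
        float target = scale * std::sqrt(budgetMs / gpuFrameMs);
        // Move gradually so the resolution change isn't a visible "pop".
        scale += 0.25f * (target - scale);
        scale = std::clamp(scale, minScale, 1.0f);
    }
};

int main() {
    DynamicResolution dr;
    const float simulatedGpuMs[] = {15.0f, 18.5f, 22.0f, 19.0f, 16.0f, 14.0f};
    for (float t : simulatedGpuMs) {
        dr.update(t);
        std::printf("gpu %.1f ms -> render at %dx%d\n",
                    t, int(1280 * dr.scale), int(720 * dr.scale));
    }
}
[/code]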

Anyway, if I were still using my old 46" TV it really wouldn't matter, but now most of my 720p games look like a jaggy mess on my new 65". Please feel my pain; 1080p is much needed.

I started out gaming in the '80s when games were 60 fps on 14-inch TVs. Now there are TVs the size of a wall with frame rates half that, like a giant frikkin' flick book. Feel my pain. Every time I sit down to play a game it's like having haemorrhoids. Rage was like one of those doughnut-shaped cushions that eased my pain. Then I saw the 30 fps PS4 Killzone video and started bleeding out of my ass.

Next gen am cry 4 u.
 
A dynamic framebuffer is even nastier. Nothing like when you're looking at a sharp wall and then all of a sudden you turn around and see a low-res, jaggy, muddy outdoor view.

That description sounds more like that of virtual texturing than that of dynamic framebuffers.
 
But only 720p...am cry as well

I hear the PC Gaming Taliban is looking to indoctrinate some new recruits ...

720p will be okay as long as there's some sub-pixel edge anti-aliasing going on. Apply a post-process AA filter to an unresolved 4x MSAA buffer and edges would probably be cleaner than if you just did the post-process on a standard 1080p buffer. Probably. I think. And you'd definitely reduce pixel pop.
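
For what it's worth, a toy CPU-side illustration of that idea (everything here, thresholds included, is invented; a real version would live in a resolve shader): run the edge detection while the four samples per pixel still exist, then average them down, instead of box-resolving first and filtering the already-merged 720p image.

[code]
// Toy sketch: edge-aware resolve of a 4x MSAA pixel using per-sample data.
#include <algorithm>
#include <array>

struct Color { float r, g, b; };

// Rec. 601 luma, good enough for edge detection here.
static float luma(const Color& c) { return 0.299f * c.r + 0.587f * c.g + 0.114f * c.b; }

// One 720p pixel with its 4 unresolved MSAA colour samples.
using MsaaPixel = std::array<Color, 4>;

// Plain box resolve of one pixel (what a normal MSAA resolve does).
static Color boxResolve(const MsaaPixel& p) {
    Color c{0, 0, 0};
    for (const Color& s : p) { c.r += s.r * 0.25f; c.g += s.g * 0.25f; c.b += s.b * 0.25f; }
    return c;
}

// Edge-aware resolve: the spread between sample lumas tells us whether the
// pixel straddles a geometric edge (sub-pixel information a post filter on an
// already-resolved buffer no longer has) and drives how strongly we blend
// toward the neighbouring pixels' box-resolved colours.
Color resolveEdgeAware(const MsaaPixel& p, const Color& leftBox, const Color& rightBox) {
    Color avg = boxResolve(p);
    float lo = 1e9f, hi = -1e9f;
    for (const Color& s : p) {
        float l = luma(s);
        lo = std::min(lo, l);
        hi = std::max(hi, l);
    }
    float edge = std::min(1.0f, (hi - lo) * 2.0f);  // 0 = flat interior, 1 = hard edge
    float w = 0.5f * edge;                          // only smooth where samples disagree
    Color nbr{(leftBox.r + rightBox.r) * 0.5f,
              (leftBox.g + rightBox.g) * 0.5f,
              (leftBox.b + rightBox.b) * 0.5f};
    return {avg.r * (1 - w) + nbr.r * w,
            avg.g * (1 - w) + nbr.g * w,
            avg.b * (1 - w) + nbr.b * w};
}
[/code]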
 
That description sounds more like that of virtual texturing than that of dynamic framebuffers.

Yup, that basically describes texture pop-in, whether from virtual texturing or texture streaming. Nothing to do with the resolution of the frame buffer.
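
Roughly what's going on behind that pop-in, as a toy sketch (names and numbers made up): only a low mip is guaranteed resident, and the sharp mips arrive a few frames after the surface becomes visible, so you briefly sample the muddy one.

[code]
// Toy sketch of texture streaming pop-in (illustrative numbers only).
#include <cstdio>

struct StreamedTexture {
    int residentMip = 5;        // only a tiny mip is always in memory
    int desiredMip  = 0;        // what the current view actually needs
    int framesUntilLoaded = 0;  // simulated async disk/IO latency
};

void onBecameVisible(StreamedTexture& t, int neededMip) {
    if (neededMip < t.residentMip && t.framesUntilLoaded == 0) {
        t.desiredMip = neededMip;
        t.framesUntilLoaded = 20;           // ~a third of a second at 60 fps
    }
}

void tick(StreamedTexture& t) {
    if (t.framesUntilLoaded > 0 && --t.framesUntilLoaded == 0)
        t.residentMip = t.desiredMip;       // the visible "pop" described above
}

int main() {
    StreamedTexture outdoorView;
    onBecameVisible(outdoorView, 0);        // player turns around
    for (int frame = 0; frame < 25; ++frame) {
        tick(outdoorView);
        std::printf("frame %2d: sampling mip %d\n", frame, outdoorView.residentMip);
    }
}
[/code]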

Regards,
SB
 
I hear the PC Gaming Taliban is looking to indoctrinate some new recruits ...

720p will be okay as long as there's some sub-pixel edge anti-aliasing going on. Apply a post-process AA filter to an unresolved 4x MSAA buffer and edges would probably be cleaner than if you just did the post-process on a standard 1080p buffer. Probably. I think. And you'd definitely reduce pixel pop.

Yeah, already got a note from Davros :mrgreen:
 
850p looks pretty good on my set from 8-10 ft away. I could live with that in exchange for more of a visual boost on any titles wishing to "push it to the limit".
 
720p will be okay as long as there's some sub-pixel edge anti-aliasing going on. Apply a post-process AA filter to an unresolved 4x MSAA buffer and edges would probably be cleaner than if you just did the post-process on a standard 1080p buffer. Probably. I think. And you'd definitely reduce pixel pop.

It's not just about edge aliasing though. It's about the general sharpness of the image and clarity of fine detail as well. 1080p with no AA at all looks better on my 1080p 27" monitor than 720p with 16xQ AA (8x MSAA + 16x Edge AA). There's more actual aliasing in the 1080p image, but it generally looks sharper and clearer. That's using either the display's inbuilt scaler or the one built into my 670.
 
It's not just about edge aliasing though. It's about the general sharpness of the image and clarity of fine detail as well. 1080p with no AA at all looks better on my 1080p 27" monitor than 720p with 16xQ AA (8x MSAA + 16x Edge AA). There's more actual aliasing in the 1080p image, but it generally looks sharper and clearer. That's using either the display's inbuilt scaler or the one built into my 670.

A monitor and a TV are not the same thing because of the size of the screen and the distance you sit away from it.
 
Sorry, but to me, stable 30fps >>> unstable 60fps. Personally I'm more annoyed by drops in framerate, whatever that framerate is when things are 'calm'.

There's no such thing as stable 30fps, unless they aren't pushing the system to the limit. What ends up happening is that unless there's a bit of leeway, 30fps can quickly become 22fps, but 60fps can become 45fps. Which feels better then?
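
Quick arithmetic behind those numbers (purely illustrative): the same relative overrun, a scene costing roughly 35% more than the frame budget, turns a 33.3 ms (30fps) frame into about 45 ms (~22fps) and a 16.7 ms (60fps) frame into about 22 ms (~45fps).

[code]
// Frame-time arithmetic for a heavy scene that costs ~35% more than budgeted.
#include <cstdio>

int main() {
    const double targetsFps[] = {30.0, 60.0};
    const double overrunFactor = 1.35;   // illustrative spike in per-frame cost
    for (double fps : targetsFps) {
        double budgetMs = 1000.0 / fps;
        double actualMs = budgetMs * overrunFactor;
        std::printf("%2.0f fps target: %.1f ms budget -> %.1f ms -> %.1f fps\n",
                    fps, budgetMs, actualMs, 1000.0 / actualMs);
    }
}
[/code]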
 
There's no such thing as stable 30fps, unless they aren't pushing the system to the limit. What ends up happening is that unless there's a bit of leeway, 30fps can quickly become 22fps, but 60fps can become 45fps. Which feels better then?

Don't be silly. Of course there is such a thing as a stable 30fps, regardless of the reasons. And to me, personally, I'd rather have that than a 60fps game which starts dropping frames as soon as something starts happening. That's all.
 
A monitor and a TV are not the same thing because of the size of the screen and the distance you sit away from it.

True, at the distance I view it my monitor appears roughly 4x larger (2x the diagonal) than my 50" TV at my normal viewing distance of 12 feet. But the point still stands: low resolution + good AA is not a straight substitute for high resolution with no AA, especially where scaling is involved. Even on my TV, with 1/4 the viewing area, I can see the difference.
 
It's not just about edge aliasing though. It's about the general sharpness of the image and clarity of fine detail as well. 1080p with no AA at all looks better on my 1080p 27" monitor than 720p with 16xQ AA (8x MSAA + 16x Edge AA). There's more actual aliasing in the 1080p image, but it generally looks sharper and clearer. That's using either the display's inbuilt scaler or the one built into my 670.

A higher-res image is always going to be more detailed and sharper than an upscaled one, AA or not, but without reasonable AA there is going to be shimmering and jaggies, which are very visually distracting and un-organic and can harm the overall impression.
Also, in real-world scenarios for next-gen games, a 1080p title is going to have at least some post-AA like MLAA or TXAA. In that case, half the sharpness you win by rendering at native res you lose again to the image-based AA, so between a blurry image with no actual sub-pixel sampling (which still leads to shimmer and bad-looking thin edges) and a slightly blurrier but more stable overall image, I guess 720p might be the best option.
I was myself hoping for 1080p across the board this gen, but thinking about it, I realised great AA at 720p trumps native 1080p with poor AA.
 
I think a dynamic framebuffer is the way to go; if the lowest res it can drop to is 720p then it's never going to turn into a blurry mess, like RAGE on PS3 can at times.

And since both PS4 and 720 will have display planes, which they can use to render the UI independently of the game resolution, changes to res will be far less noticeable than they are now (since the big giveaway is that the UI/text looks far worse when the res drops).
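
A hedged sketch of that display-plane idea (the plane API here is invented, not either console's actual one): the 3D scene renders into a plane whose resolution can change frame to frame, the HUD/text renders into a separate fixed 1080p plane, and the display hardware scales and composites them at scanout, so dropping the game res never softens the UI.

[code]
// Sketch of UI and scene living on separate display planes (hypothetical API).
#include <cstdio>

struct Plane {
    int width, height;   // resolution this plane is rendered at
};

void presentFrame(const Plane& game, const Plane& ui) {
    // Conceptually what the hardware compositor does each frame:
    std::printf("scale game plane %dx%d -> 1920x1080, overlay UI plane %dx%d untouched\n",
                game.width, game.height, ui.width, ui.height);
}

int main() {
    Plane ui{1920, 1080};     // HUD/text always at native res
    Plane game{1280, 720};    // scene plane, free to vary per frame
    presentFrame(game, ui);
    game = {1600, 900};       // a lighter scene: raise the scene res, UI unaffected
    presentFrame(game, ui);
}
[/code]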
 