[Allegedly Leaked] Battlefield 4 Sticks to 720p/60 FPS on Next-Gen Consoles

Sigh, are we going to have this discussion again? If the player can, for example, aim better/faster/more accurately, you can design the levels differently and have different speeds of player/camera movement.

Aiming better is more about the player's skill than frame rate: a poor player is poor at 60fps as well as 30fps.
Faster aim/camera has nothing to do with frame rate: if max acceleration is set at 50, it won't magically go up to 100 just because you are playing at 60fps.
Accuracy has more to do with sensitivity than frame rate as well.
Level design is more influenced by RAM than frame rate.
Character movement speed has nothing to do with frame rate: you can make a slow character even at 60fps.

Detail, input lag and image fluidity are influenced by frame rate.
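The frame-rate-independence point above can be sketched in a few lines. This is a hypothetical snippet (the `integrate_yaw` name and the 180 deg/s sensitivity are made up for illustration, not taken from any real engine): if camera rotation is scaled by each frame's delta time, the total turn over a second is identical at 30fps and 60fps.

```python
def integrate_yaw(stick_input, sensitivity_deg_per_s, fps, seconds=1.0):
    """Total camera turn after `seconds` of stick deflection.

    Rotation is scaled by each frame's delta time, so the result is
    the same at any frame rate -- fps alone doesn't speed up the camera.
    """
    dt = 1.0 / fps
    yaw = 0.0
    for _ in range(int(seconds * fps)):
        yaw += stick_input * sensitivity_deg_per_s * dt
    return yaw
```

At this made-up 180 deg/s sensitivity, one second of full deflection turns 180 degrees whether the loop ticks 30 or 60 times.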
 
As someone who also has a gaming PC attached to a half-decent TV (50" 1080p from around 12 feet away), I fully agree with Billy: the difference between 720p and 1080p is surprisingly noticeable. I've done a lot of testing on this due to the trade-off of 720p/60fps or 1080p/30fps using 3D via HDMI, and despite really appreciating 60fps, I'd choose 1080p every time due to what I personally see as a severe loss of image quality.

Every clan I have been in chose 60Hz over resolution. You can have your opinion -- 1080p30 vs. 720p60 vs. 720p+eyecandy is a preference thing -- but in each case there is something "surprisingly noticeable." As a self-proclaimed PC expert, I am really surprised you would fail to note that almost all competitive PC gamers have historically chosen frequency over resolution. Btw, that isn't an opinion, that is an observation from being involved in competitive PC gaming for years and observing it in quite a few games & forums.

Specifically, for Battlefield, I know the PC trend, if you had to choose, is frequency over resolution or other eyecandy. Frequency impacts gameplay, and while you lose frame resolution you gain temporal resolution.

Aiming better is more about the player's skill than frame rate: a poor player is poor at 60fps as well as 30fps.
Faster aim/camera has nothing to do with frame rate: if max acceleration is set at 50, it won't magically go up to 100 just because you are playing at 60fps.
Accuracy has more to do with sensitivity than frame rate as well.
Level design is more influenced by RAM than frame rate.
Character movement speed has nothing to do with frame rate: you can make a slow character even at 60fps.

Detail, input lag and image fluidity are influenced by frame rate.

Input lag, image fluidity, and detail have a direct influence on a gamer's ability to aim.

Don't believe me? Try a 15Hz game with 250ms latency. The reverse is also true (e.g. playing Quake on a CRT running at 120Hz+). The question is where the sweet spot is -- and importantly, not all things are directly tied to each other. You can have an engine and user input run at a different refresh rate than the renderer.
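That last sentence is usually done with a fixed-timestep loop. A minimal sketch, assuming hypothetical rates (120Hz simulation, 30Hz renderer -- not how Frostbite or any particular engine actually does it): each rendered frame adds its elapsed time to an accumulator, and the simulation consumes it in fixed-size steps.

```python
def run_loop(sim_hz, render_hz, seconds):
    """Step a fixed-rate simulation underneath a renderer that runs
    at a different rate.  Returns (sim_steps, rendered_frames)."""
    sim_dt = 1.0 / sim_hz
    frames = int(seconds * render_hz)
    accumulator = 0.0
    sim_steps = 0
    for _ in range(frames):
        accumulator += 1.0 / render_hz   # time produced by this frame
        while accumulator >= sim_dt:     # consume it in fixed sim steps
            sim_steps += 1
            accumulator -= sim_dt
    return sim_steps, frames
```

Over one second this gives roughly 120 simulation/input steps but only 30 rendered frames, which is exactly the decoupling described above.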
 
Every clan I have been in chose 60Hz over resolution. You can have your opinion -- 1080p30 vs. 720p60 vs. 720p+eyecandy is a preference thing -- but in each case there is something "surprisingly noticeable." As a self-proclaimed PC expert, I am really surprised you would fail to note that almost all competitive PC gamers have historically chosen frequency over resolution. Btw, that isn't an opinion, that is an observation from being involved in competitive PC gaming for years and observing it in quite a few games & forums.

Specifically, for Battlefield, I know the PC trend, if you had to choose, is frequency over resolution or other eyecandy. Frequency impacts gameplay, and while you lose frame resolution you gain temporal resolution.

You're talking about multiplayer, where a high framerate for fast responses and a competitive edge is far more important than visuals. I'm talking about single player, where that's not as much of a concern and visuals take on more importance.
 
Sigh, are we going to have this discussion again? If the player can, for example, aim better/faster/more accurately, you can design the levels differently and have different speeds of player/camera movement.
That's the same gameplay. If you're playing football, running around like maniacs or at a more leisurely pace, it's the same gameplay. If you're playing dominoes, going great guns or taking it easy, it's the same gameplay. If you fire up Borderlands 2 on the PC and run it at 15 fps at 4K or 120 fps at 720p, it's the same gameplay - same guns, same skills, same monsters. If you are moving and shooting a gun with exactly the same controls and same view, it's the same gameplay whatever resolution and framerate you run it at. It's a different experience, but the same gameplay. I do not believe 60 fps vs 30 fps is going to change level design or any aspect of an FPS, especially one grounded in realistic human warfare like Battlefield 4. It'll play like a realistic contemporary theatre of war at whatever framerate and resolution it's executed in.
 
Every clan I have been in chose 60Hz over resolution. You can have your opinion -- 1080p30 vs. 720p60 vs. 720p+eyecandy is a preference thing -- but in each case there is something "surprisingly noticeable." As a self-proclaimed PC expert, I am really surprised you would fail to note that almost all competitive PC gamers have historically chosen frequency over resolution. Btw, that isn't an opinion, that is an observation from being involved in competitive PC gaming for years and observing it in quite a few games & forums.

Specifically, for Battlefield, I know the PC trend, if you had to choose, is frequency over resolution or other eyecandy. Frequency impacts gameplay, and while you lose frame resolution you gain temporal resolution.

Input lag, image fluidity, and detail have a direct influence on a gamer's ability to aim.

Don't believe me? Try a 15Hz game with 250ms latency. The reverse is also true (e.g. playing Quake on a CRT running at 120Hz+). The question is where the sweet spot is -- and importantly, not all things are directly tied to each other. You can have an engine and user input run at a different refresh rate than the renderer.

I choose both on my PC: 1080p and 60Hz. I just turn down the settings that (for me personally) have less visual impact until I get the framerate (if possible; with Far Cry 3 it seems I was CPU-bound and never achieved a steady 60Hz).

Of course this is not possible on a console, and one just has to trust the devs to make the right decisions...
 
That's the same gameplay. If you're playing football, running around like maniacs or at a more leisurely pace, it's the same gameplay. If you're playing dominoes, going great guns or taking it easy, it's the same gameplay. If you fire up Borderlands 2 on the PC and run it at 15 fps at 4K or 120 fps at 720p, it's the same gameplay - same guns, same skills, same monsters. If you are moving and shooting a gun with exactly the same controls and same view, it's the same gameplay whatever resolution and framerate you run it at. It's a different experience, but the same gameplay. I do not believe 60 fps vs 30 fps is going to change level design or any aspect of an FPS, especially one grounded in realistic human warfare like Battlefield 4. It'll play like a realistic contemporary theatre of war at whatever framerate and resolution it's executed in.

I disagree. Sonic 2 moving at half the speed would not feel like Sonic 2 at all. Playing BF3 at 100x100 resolution at 10 fps would be a totally different game, because you would not be able to see anything, and you would not be able to aim, move or do anything with any sense of accuracy.
 
Input lag, image fluidity, and detail have a direct influence on a gamer's ability to aim.

Don't believe me? Try a 15Hz game with 250ms latency. The reverse is also true (e.g. playing Quake on a CRT running at 120Hz+). The question is where the sweet spot is -- and importantly, not all things are directly tied to each other. You can have an engine and user input run at a different refresh rate than the renderer.

I am not arguing with that, but IQ & performance are not gameplay or game mechanics.
Also, BF4 at 1080p would benefit from a higher level of detail than 720p, so there is a trade-off.
If this rumor is true, then DICE has opted for higher performance rather than higher IQ, which is a debatable choice; if I had to choose between 1080p 30fps and 720p 60fps, I would choose 1080p 30fps, simply because gaming at 30fps for me is not a problem.
 
Character movement speed has nothing to do with frame rate: you can make a slow character even at 60fps.

Of course. But you cannot have fast games at low frame rates, because everything becomes blurry and disorienting.
 
It is much easier to aim with a higher frame rate. Try it yourself if you do not believe me.

First, stop assuming I never played at 60fps.
Second, much depends on how aiming is tweaked.
If a game has poor aim sensitivity and poor acceleration, it can run at 60fps but aiming will still be poor and unresponsive.
Running at 60fps is not enough to have smooth controls and a smooth camera.
 
But you cannot have fast games at low frame rates, because everything becomes blurry and disorienting.

A higher frame rate makes fast-paced games more playable and visually more appealing (lower input lag and more fluid images), but if done well, a fast-paced game at 30fps can still be very playable and not necessarily blurred or disorienting.
I didn't play it myself, but I am told that DmC at 30fps is great.

Those attributes are game mechanics enablers.

Higher IQ & performance are beneficial for the game but are not "game mechanics enablers".
As Shifty Geezer said, the gameplay is identical at 720p/1080p and at 30/60fps.
It's not like BF4 at 30fps will have different gameplay or game mechanics than at 60fps.

Playability is influenced by IQ & performance and by many other elements.

Of course not. But it enables you to have a faster smooth camera. You can have a slow smooth camera at lower frame rates.
60fps doesn't enable a faster or smoother camera: acceleration & sensitivity determine how fast and smooth a camera is (much also depends on how the camera system is designed).
60fps determines how responsive the controls are and makes images look more fluid.
If you set your mouse or analog stick sensitivity and acceleration very low, aiming/camera will never be faster and smoother just because you play at 60fps.
 
I disagree. Sonic 2 moving at half the speed would not feel like Sonic 2 at all. Playing BF3 at 100x100 resolution at 10 fps would be a totally different game, because you would not be able to see anything, and you would not be able to aim, move or do anything with any sense of accuracy.
Different experiences, same gameplay. I'm using 'gameplay' to mean the mechanics of playing the game - the buttons you press, the visual cues, etc. I make this distinction because it doesn't make sense to lump the rules and mechanics of gaming in with execution. Sonic will still have the same gameplay at 30 fps, even if it doesn't feel as smooth and becomes much harder to react to.

Also, we're not talking about Sonic or a game designed for speed here. We're talking about Battlefield 4, a modern warfare game. The game being designed - running around streets, sneaking through undergrowth, blind-firing and aimed firing and everything else - will be running at human speeds and not Sonic The Hedgehog speeds. The gameplay will be suitably slow enough that such comparisons don't matter. If keenism wants to play at higher resolution and lower framerate, he's still playing the same game. And what he loses in temporal resolution, he gains in image resolution, so he'll have a better chance at spotting enemies at a distance in place of his better chance to react to targets.
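That image-resolution advantage is easy to put rough numbers on. A back-of-the-envelope sketch (the 0.5-degree target size and 90-degree FOV are assumed values, and this uses a crude linear approximation rather than a proper perspective projection):

```python
def pixels_on_target(target_deg, hfov_deg, screen_width_px):
    # Linear small-angle approximation: the target's fraction of the
    # horizontal field of view, times the horizontal pixel count.
    return screen_width_px * target_deg / hfov_deg

# A distant soldier subtending ~0.5 degrees in a 90-degree FOV:
at_720p = pixels_on_target(0.5, 90.0, 1280)   # ~7 px wide
at_1080p = pixels_on_target(0.5, 90.0, 1920)  # ~11 px wide
```

That 50% linear bump in pixels on target is what helps long-range spotting, while the frame-rate side of the trade helps tracking and reaction.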

Unless the intention is to turn Battlefield into an Unreal Tournament style twitch shooter, the gameplay (aim, shoot, move, cover, explore) won't be affected by framerate or resolution.
 
Isn't there a thread for this already? Can we not just get back to talking about EA being evil?
There is a thread about framerate, but this thread is about Battlefield's rumoured framerate and resolution, so the discussion is inevitably going to cover some of the same ground.
 
Considering that DICE is adding stuff like dynamic weather and a destructible environment, I'd say achieving all of that at a solid 720p/60fps is admirable.

Keep in mind that they'll probably improve future Battlefield games in other ways, and doing this might cut down on or eliminate inconsistent frame rates and tearing.

The current consoles couldn't achieve similar results without making serious compromises. Should we really be pushing those limits and making the same compromises, only to have them bite us in the butt again?
 