Battlefield 3 announced

Does this mean we'll finally have a game that properly uses more than 2GB of RAM?
Between the OS, driver and application? Yes. The application itself, perhaps 4GB but not more :) They've said they're only doing a 32-bit version at the moment since it doesn't make any difference in practice right now. However a 64-bit OS does, since the total footprint of the various components can exceed 2GB.

Will it make use of all available RAM so it doesn't flush already-loaded textures and data every time we load a new map?
Doubt it, but given that they are using some form of virtual texturing it shouldn't need to load all the textures for the level "up front" anyway - they will be streamed in nicely as you need them.

I can't imagine a GTX 570/480 not being enough for "max" settings, but it depends a bit on what you mean by that. 2560x1600 (or more!) with 4x MSAA? That's gonna be pricey no matter how you look at it :) For 1080p, 4xMSAA I'd venture a guess that both of those cards will be plenty. This is all just speculation on my part though based on some very rough information on what they're doing. Note that for non-screen space AA (i.e. MSAA) in this game you're going to want a good chunk of VRAM.

Keep in mind though that "max settings" is pretty subjective... they can easily include settings that would bring even the highest end GPU to a crawl, but the quality increase wouldn't be worth it. Thus put less stock in "max settings" and more in where the high-end sweet spot is.
 

Yeah, I don't like it when people talk about 'max' settings. You do not want to run at max; you want to run at (just as Andrew says) a good sweet spot between quality & performance for your machine (even if it is very high-end). The higher you crank up the quality, the more you run into diminishing returns and potentially dramatic performance hits.

Not sure how best to communicate this to gamers though; most everyone typically just cranks the game up to the highest possible detail when they get it and then complains that it crawls on their non-godlike machine. Even reviewers do this and can label a game 'non-optimized' or 'too demanding' even though all the game developer did was spend time adding an extra-detailed mode that is only meant for a small but (for us) important minority.

Autodetection of settings only goes so far; you still want to be able to customize things yourself, and not prevent users who know what they are doing from cranking up the specific detail they want (& are willing to pay for without complaining about it).

This is something I'm thinking about right now: how to properly set up our PC graphics settings for BF3 to avoid the above scenarios as much as possible, while still providing extra value & eye candy for the users (& ourselves :) ) that really have a machine that can handle super high quality visuals.

Any concrete & constructive suggestions are welcome.
 

ok, I just want to play the settings used in the Tank battle trailer...that is all!
which PC specs/graphics card did you use for scenes shown in this trailer??
 

It would be amazing if there was a visual preview of IQ and performance that updated in real-time as you adjusted settings. The most frustrating part of tweaking PC options is trying to see the IQ benefit and the performance impact of each setting. If that process was streamlined it would go a long way to helping folks find the sweet spot for their hardware.

Another simpler approach is to simply eliminate any feature that has a large performance hit for negligible IQ benefit. Those are the things that really cause the "unoptimized" gripes :)
 
All of our graphics settings do change in real-time as you edit them, though the UI still covers a portion of the screen, so it can be difficult to see behind it. Ideally I would like the settings screen to be a bit custom and have a very minimal UI that maximizes the view of the game background behind it so you can see your changes clearly. Not sure if that is possible with our UI design though.

And yeah, I do agree about removing features that have a fully negligible quality impact :) Though that can be quite subjective.
 
Repi, I game at 1440x900. How would a GTX260 handle that resolution?

In many modern games I'm actually CPU limited (E6750 @3.2GHz). You think this will be the case with BF3?
 
One simple way of preventing common users from maxing things out to ridiculous degrees would be to only expose the truly maximum options in a config file. As with most games, when you adjust settings in-game, those changes are written to an ini file or similar. Have the sliders etc. only go up to a certain amount, but allow a bit more for those who really know what they're doing.

A perfect example is AA. If you support supersampling, leave it out of the in-game controls but inform the community how to enable it via the ini file instead. Keep MSAA/FXAA or whatever you're going to support in-game.
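The split suggested above could be sketched roughly like this (the setting names, caps, and file format are all made up for illustration, not anything from the game):

```python
# Hypothetical sketch: in-game sliders clamp to "safe" maximums, while
# values read from an ini-style settings file are allowed to go higher.
# All keys and limits here are invented for illustration.

UI_LIMITS = {"aa_samples": 4, "resolution_scale": 1.0}    # what the sliders expose
FILE_LIMITS = {"aa_samples": 8, "resolution_scale": 2.0}  # what the config file accepts

def apply_setting(name, value, from_ui):
    """Clamp a requested value to the slider cap or the looser file cap."""
    cap = UI_LIMITS[name] if from_ui else FILE_LIMITS[name]
    return min(value, cap)

# A slider request for 8x AA gets clamped to 4x...
print(apply_setting("aa_samples", 8, from_ui=True))
# ...but the same value coming from the config file is honoured.
print(apply_setting("aa_samples", 8, from_ui=False))
```

The point of the two caps is that review benchmarks driven through the UI never see the extreme values, while enthusiasts editing the file still can.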
 
No that would be a PITA. Just put that stuff in the advanced options section with a warning that it could massively reduce FPS.
 
Yeah I'm tempted to agree with Malo to some extent. While I'm obviously one of the users who wants to see all the pretties, I'm well aware of the strange perception that gamers have about "max" settings. Metro 2033 was a great recent example of how not to do it... no matter what you label the maxes (Ultra or whatever), people will pick them, and even a very small amount of playing with Metro's settings reveals that the highest-end settings are well beyond the point of diminishing returns. Literally, you can halve your frame rate with zero perceivable increase in quality.

Thus I'd really just encourage you guys to make your best call on what is going to be reasonable for current high end and perhaps future cards and make that the "max". Assume compute goes up and bandwidth stays about the same :p

A few other suggestions while we're on the topic:
- Make the shadow quality/resolution setting a multiple of the screen resolution. So "Low" could be 1/2 the screen resolution, medium the screen resolution, and high 2x the screen resolution, or something. The jagginess of the shadows is relative to the screen resolution, so it makes sense to relate the two directly, and it gives the user a better idea of the cost of increasing or decreasing the resolution. It also avoids silly scenarios like having way too much or way too little shadow data for a given resolution.
- Hide options that are obviously beyond current hardware in a config file or something as Malo suggests. If even the highest end systems don't spit out 30-60Hz at "max", people get a bad impression from reviews. The reality is that review sites and benchmarks *will* just set it to max and click go, so I'd encourage you to make "max" somewhat reasonable for high-end cards. You could even expose this stuff in the UI down the road with updates.
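The shadow-resolution idea in the first bullet is basically one multiply; a minimal sketch (the quality names and multipliers are my own illustration, not the game's):

```python
# Sketch of "shadow resolution as a multiple of screen resolution".
# The multipliers below are illustrative placeholders.

SHADOW_SCALE = {"low": 0.5, "medium": 1.0, "high": 2.0}

def shadow_map_size(screen_w, screen_h, quality):
    """Scale the shadow buffer with the screen so shadow jagginess stays
    roughly proportional at any display resolution."""
    s = SHADOW_SCALE[quality]
    return int(screen_w * s), int(screen_h * s)

print(shadow_map_size(1920, 1080, "high"))  # 2x screen res
print(shadow_map_size(1920, 1080, "low"))   # 1/2 screen res
```

Since the cost is expressed relative to the screen, the user can reason about it the same way they reason about render resolution.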

Cool to hear that the settings updates are "live" - that's always really helpful. With respect to the UI covering it, perhaps you could just provide a "preview" button in that menu that hides the UI while a key is held down, so you can see the effect of whatever options you have changed (and the performance). A "before/after" button would be even better.
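The "before/after" toggle amounts to keeping two copies of the settings and switching between them while the key is held; a tiny sketch (the setting names are invented):

```python
# Minimal sketch of a before/after preview toggle: while the preview key
# is held, render with the pending (edited) settings; otherwise render
# with the last applied ones. Setting names here are placeholders.

applied = {"msaa": 2, "ssao": False}
pending = dict(applied)  # copy that the settings menu edits

def active_settings(preview_held):
    """Pick which settings snapshot the renderer should use this frame."""
    return pending if preview_held else applied

pending["msaa"] = 4                          # user moves a slider
print(active_settings(True)["msaa"])         # previewing the edit
print(active_settings(False)["msaa"])        # "before": still the applied value
```

Committing the change is then just `applied.update(pending)`, and cancelling is the reverse copy.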
 
We don't comment on specs or performance yet as the game is not done.

But highly recommend a quad core, just as with Bad Company 2.
 
Seconded.

He doesn't seem to want to comment on it... I understand that some things can't be spoken here at this point in time, but is pad support one of them and if so why? I have to assume he knows the situation about it.

Some comment however vague would be appreciated.
 
Given that it's a cross platform title pad support should be a no-brainer. At least for the 360 controller. No way they will enable aim-assist though.
 
I'd hope so, but with games like Mass Effect 2 and Bioshock 2 around with no pad support, I don't think it's a no brainer for a lot of the devs. This game is looking so good that I might consider playing it with a keyboard, but that'll be a rare exception.
 
What if you come out-of-the-box with a basic UI and then offer a download people opt into that is the 'tweaker UI' that replaces the simple UI with all the potentially machine-destroying tweaks and twiddles and 'max' settings? I'm sure you'll have DLC and the interfaces to manage all that, so it could be a 'free DLC' in terms of it being a plugin?

Another idea might be not labeling things 'high', 'low', 'max', 'ultra', or other apparently psychologically damaging things, but making the labels relative. The auto-detected setting could be 'max' or 'optimal', the entries above it 'slow' etc. (or frowny faces, crosses, a red gradient, or something), and the ones below 'fast', 'competitive/responsive', 'greased lightning', etc.
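The relative-labeling idea could be sketched as a simple mapping around the auto-detected level (the label strings and level count here are invented for illustration):

```python
# Sketch: label quality levels relative to the auto-detected "optimal"
# index instead of using absolute names like "Ultra". Labels are placeholders.

LABELS_ABOVE = ["slower", "much slower"]   # further above optimal = scarier
LABELS_BELOW = ["faster", "fastest"]       # further below optimal = faster

def label_levels(levels, detected):
    """Return a {level: label} dict anchored at the detected index."""
    out = {}
    for i, level in enumerate(levels):
        if i == detected:
            out[level] = "optimal"
        elif i > detected:
            out[level] = LABELS_ABOVE[min(i - detected - 1, len(LABELS_ABOVE) - 1)]
        else:
            out[level] = LABELS_BELOW[min(detected - i - 1, len(LABELS_BELOW) - 1)]
    return out

print(label_levels(["1", "2", "3", "4", "5"], detected=2))
```

The same presets get different labels on different machines, which is exactly the point: "max" stops being an absolute target to chase.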
 