*spin* "Low Settings" on PC ports vs Consoles

ultragpu

According to Gametech, The Witcher 3: Wild Hunt is a very demanding title; one that a single GTX 780Ti struggles to run when higher levels of AA are enabled. As the Russian gaming website noted, the PC version currently runs with all its bells and whistles enabled (plus 8xMSAA) at 35-45fps on a GTX 780Ti at 1080p. Naturally, a single 780Ti is able to hit the 60fps sweet spot in the game's current form provided MSAA is lowered.
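Quick napkin math on those figures, in Python (nothing assumed beyond the framerates quoted above):

# Frame-time budgets for the quoted framerates (just arithmetic).
for fps in (35, 45, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 35 fps -> 28.6 ms, 45 fps -> 22.2 ms, 60 fps -> 16.7 ms
# So 8xMSAA would have to cost roughly 6-12 ms per frame at 1080p for
# dropping it to lift a 780Ti from 35-45 fps to a locked 60.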

Moreover, it is said that the PS4 version will run at 900p/30fps while the Xbox One version will run at 720p/30fps. What's also interesting is that the console versions of the game will match the minimum graphics settings of the PC version. Gametech claimed that The Witcher 3 on consoles will look similar to The Witcher 2.
http://www.gametech.ru/news/39979/ (it's in Russian)

I remember when they said they could go "nuts" with the new consoles and there would be no visual discrepancy between the consoles and PC. Now, if CDPR doesn't want to go bankrupt, they wouldn't blatantly make do with the minimum settings on consoles, especially the PS4 version.
 
The Witcher 3 is one of those cases where the console version was made for GCN GPUs but the PC version got infected by GameWorks (an advanced stage of TWIMTBP disease?).

As far as I remember, that didn't bode well for people playing Assassin's Creed 4 with an AMD graphics card.


I wonder if ubersampling will make a return

Ubersampling should make the game more interesting to whoever plays it now with a high-end system. I don't think it was used at all back in 2011, when the best you could have was a couple of GTX 580s or HD 6970s.


I remember when they said they could go "nuts" with the new consoles and there would be no visual discrepancy between the consoles and PC. Now, if CDPR doesn't want to go bankrupt, they wouldn't blatantly make do with the minimum settings on consoles, especially the PS4 version.

Why would they go bankrupt if there was a large visual discrepancy between consoles and PC? That has always happened. As long as the console versions look fine, there's no problem.
The real problem would come if you needed something as powerful as an HD7870 desktop or HD7970/8970M laptop to play the game on low settings.
That would alienate most of their notebook clients.
 
Why would they go bankrupt if there was a large visual discrepancy between consoles and PC? That has always happened. As long as the console versions look fine, there's no problem.
The real problem would come if you needed something as powerful as an HD7870 desktop or HD7970/8970M laptop to play the game on low settings.
That would alienate most of their notebook clients.

Now I'm not sure how fine the minimum settings will look on consoles; that's the big question mark here. People are much more informed about their games these days thanks to DF, forums and online comparisons, so the prospect of only getting the bare minimum of the PC version would discourage them from buying the game.
 
Now I'm not sure how fine the minimum settings will look on consoles; that's the big question mark here. People are much more informed about their games these days thanks to DF, forums and online comparisons, so the prospect of only getting the bare minimum of the PC version would discourage them from buying the game.

So basically PC gamers with more capable systems should miss out on better graphics so as not to hurt the feelings of console gamers?

Does the same apply to PS4 games, so as not to hurt the XB1 guys' feelings?
 
So basically PC gamers with more capable systems should miss out on better graphics so as not to hurt the feelings of console gamers?

Does the same apply to PS4 games, so as not to hurt the XB1 guys' feelings?

No. But how far CD can drive graphics on a high-end PC can be heavily influenced by how well their titles are received on consoles. Consoles have a larger userbase and higher retail prices, which makes them a great source of revenue. Thus, the more console copies you sell, the bigger the investment you can make in taking greater advantage of, and delivering better performance on, "all" the hardware you support.

To many, "bare minimum PC settings" implies not taking full advantage of the hardware on consoles. Hardly a reality you want your potential console consumers to assume, and contrary to what other major console devs practice.
 
To many, "bare minimum PC settings" implies not taking full advantage of the hardware on consoles. Hardly a reality you want your potential console consumers to assume, and contrary to what other major console devs practice.

All it says is that PC on low = console settings. I personally don't believe it, but if it were true, that has no connection to how well the consoles have been utilised unless you account for the PC required specs at a given quality level.

If it takes an R7 270 to run the PC version at "low" + 900p @ 30fps and an R9 390X (which should be around by the time the game launches) to run at ultra + 1080p @ 60fps, then that would be perfectly in line with the statement that console settings = low on the PC, without any requirement for the consoles not to be taken full advantage of.
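For what it's worth, a rough sketch of that scenario in numbers. The 270's ~2.4 TFLOPS is its published spec; the 390X doesn't exist yet, so its ~7 TFLOPS below is purely a guess:

# Compare the resolution/framerate jump against the raw compute jump.
def mpix_per_s(w, h, fps):
    return w * h * fps / 1e6  # megapixels pushed per second

low_target = mpix_per_s(1600, 900, 30)     # 900p @ 30fps
ultra_target = mpix_per_s(1920, 1080, 60)  # 1080p @ 60fps
print(f"pixel throughput gap: {ultra_target / low_target:.2f}x")  # ~2.88x
print(f"assumed compute gap:  {7.0 / 2.4:.2f}x")  # hypothetical ~7 TFLOPS 390X
# The raw compute gap would just about cover the resolution/framerate jump
# alone, so the low -> ultra settings increase would have to come on top.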
 
All it says is that PC on low = console settings. I personally don't believe it, but if it were true, that has no connection to how well the consoles have been utilised unless you account for the PC required specs at a given quality level.

If it takes an R7 270 to run the PC version at "low" + 900p @ 30fps and an R9 390X (which should be around by the time the game launches) to run at ultra + 1080p @ 60fps, then that would be perfectly in line with the statement that console settings = low on the PC, without any requirement for the consoles not to be taken full advantage of.

Given the general practice of PC development, what PC games have a minimum spec of one-gen-old midrange or high-end cards? R7s are midrange cards, and an R9 270 is in the high-end tier, one tier below enthusiast.

Throw in the nature of fixed-hardware development and low-level access APIs on consoles (will W3 support Mantle or DX12 out of the box?), and saying "low PC settings" is hardly going to encourage any positive or neutral thoughts.

I don't think it's true either, but I find it highly unlikely that CD or anybody else would generally take what you describe above as "low PC settings".
 
Throw in the nature of fixed-hardware development and low-level access APIs on consoles
Which amounts to literally nothing, considering high-end PCs overpower consoles at least 4 to 1 right now. Imagine the situation after 2 years!

And if recent developments are any indication, careful software optimization on PCs is able to overcome any "CPU" API advantage the consoles might enjoy.

The Metro games ran at low/medium settings on the X360, the Crysis games ran at medium "at best", and the same goes for Battlefield 4. So it's not like this would be the first time. If the game is geared toward high-end PCs as a priority, then consoles will get the short end of the stick. However, I still don't believe W3 will be able to do that.
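For reference, here's how the 4:1 claim looks against approximate paper TFLOPS at launch (note this ignores API overhead, which is exactly what's being argued about):

# Paper-spec ratios between high-end PC GPUs and the consoles.
tflops = {"GTX 780 Ti": 5.0, "780 Ti SLI": 10.0, "PS4": 1.84, "Xbox One": 1.31}
for pc in ("GTX 780 Ti", "780 Ti SLI"):
    for console in ("PS4", "Xbox One"):
        print(f"{pc} vs {console}: {tflops[pc] / tflops[console]:.1f}x")
# ~2.7x / ~3.8x for a single card, ~5.4x / ~7.6x for SLI; so "at least
# 4 to 1" really needs a dual-GPU setup, at least on paper.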
 
Given the general practice of PC development, what PC games have a minimum spec of one-gen-old midrange or high-end cards? R7s are midrange cards, and an R9 270 is in the high-end tier, one tier below enthusiast.

What PC games have there been so far which are next-gen only? And how many of those are releasing in 2015 with known requirements?

By 2015 the 270 will be using a GPU core that's around 3 years old and was only ever launched as an upper-midrange card in the first place. Is it that hard to believe it would be at the lower end of the spec requirements for a game that has traditionally pushed PC GPUs to their limits at launch?

Also, I didn't say it was the minimum spec. Naturally the game would be playable on slower GPUs if you can make do with <900p and a solid 30fps. Following my earlier reasoning, the min spec could actually be an R7 260 with a 720p experience at 30fps (XB1 settings, according to the "rumour").

Is an R7 260 really so unbelievable as the minimum spec for a 2015 high-end, next-gen-only game? Because if it were, then it would fit in perfectly with the XB1 version being equivalent to the PC on low settings at 720p/30fps.

Throw in the nature of fixed-hardware development and low-level access APIs on consoles (will W3 support Mantle or DX12 out of the box?), and saying "low PC settings" is hardly going to encourage any positive or neutral thoughts.

All that applies to CPU requirements, which could still be very high for the game. There's no real evidence to speak of for this having much effect on GPU requirements. The game's unlikely to support Mantle and almost certainly not DX12 unless it's coming much later than expected, so I expect fairly high minimum CPU requirements.
 
Which amounts to literally nothing, considering high-end PCs overpower consoles at least 4 to 1 right now. Imagine the situation after 2 years!

And if recent developments are any indication, careful software optimization on PCs is able to overcome any "CPU" API advantage the consoles might enjoy.

The Metro games ran at low/medium settings on the X360, the Crysis games ran at medium "at best", and the same goes for Battlefield 4. So it's not like this would be the first time. If the game is geared toward high-end PCs as a priority, then consoles will get the short end of the stick. However, I still don't believe W3 will be able to do that.

Considering the gist of the discussion, I have no idea what point you are trying to make.

I am not saying the consoles will run at enthusiast levels.

Mentioning a bunch of PC titles that ran at "med" settings on hardware that was 4-6 years old is hardly proof that an Xbox One or PS4 should run a PC title slated for 2015 at low settings. It's all PC hardware. What PC dev would use an Xbox One or PS4, while avoiding their low-level access, as a minimum spec? You are basically talking about serving a fraction of the PC userbase.

That's not a PC title; that's a console title that provides additional support for high-end PC rigs.
 
Which amounts to literally nothing, considering high-end PCs overpower consoles at least 4 to 1 right now. Imagine the situation after 2 years!

I don't think it's true... a Titan/780 usually produces slightly more than 2x the frames of an ATI 7850.
Even SLI/Crossfire configs are not that fast.
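That figure roughly matches the paper specs, for what it's worth (approximate single-precision TFLOPS for each card):

# Theoretical throughput ratios vs the HD 7850.
tflops = {"GTX Titan": 4.5, "GTX 780": 4.0, "HD 7850": 1.76}
for card in ("GTX Titan", "GTX 780"):
    print(f"{card} / HD 7850: {tflops[card] / tflops['HD 7850']:.1f}x")
# ~2.6x and ~2.3x theoretical, so a bit over 2x in delivered frames at
# matched settings is about what you'd expect.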
 
By 2015 the 270 will be using a GPU core that's around 3 years old and was only ever launched as an upper-midrange card in the first place.

The core tech may be 3 years old, but a 270 isn't even a year old. So why would any dev use that as a minimum spec? We're talking about a card that's barely 6 months old for a title slated for 2015, which begins in 8 months.

Battlefield 4 supports midrange cards from 2007.
 
Mentioning a bunch of PC titles that ran at "med" settings on hardware that was 4-6 years old is hardly proof that an Xbox One or PS4 should run a PC title slated for 2015 at low settings. It's all PC hardware. What PC dev would use an Xbox One or PS4, while avoiding their low-level access, as a minimum spec? You are basically talking about serving a fraction of the PC userbase.
The situation is different than before: in the days of the X360 and PS3, consoles were better than high-end PCs right out of the gate. It took a couple of years for PCs to overpower them and for PC games to reflect that.

Now, out of the gate, consoles are way behind, so don't be surprised if games in 2015 start to utilize PC hardware to the point of leaving consoles biting the dust.

I don't think it's true... a Titan/780 usually produces slightly more than 2x the frames of an ATI 7850.
Even SLI/Crossfire configs are not that fast.
Yes, they do that at higher resolutions and higher visual quality settings, with deferred MSAA or even more. Also note that the gains from bigger hardware will not be linear (look, for example, at the X360 → XO transition).
 
The core tech may be 3 years old, but a 270 isn't even a year old. So why would any dev use that as a minimum spec? We're talking about a card that's barely 6 months old for a title slated for 2015, which begins in 8 months.

I don't see why that matters. They could easily list the 7850 or 7790 as the minimum spec instead and it would mean exactly the same thing. It hardly matters that those same GPUs have been renamed a year after launch; they'll still be 3 years old. And the 260/7790 is still only going to be around 1/4 of a high-end GPU in 2015.
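In numbers (the 7790's ~1.8 TFLOPS is its published spec; the ~7 TFLOPS for an unreleased 2015 flagship is purely an assumption):

# Fraction of an assumed ~7 TFLOPS 2015 high-end GPU.
print(f"{1.8 / 7.0:.2f} of a hypothetical 2015 flagship")  # ~0.26, roughly 1/4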

Battlefield 4 supports midrange cards from 2007.

Battlefield 4 is a 2013 game, designed to run on the 360/PS3 as well, so it's not really a good comparison. It's expected that games designed only for next-gen targets will come with much higher minimum specs than last-gen games.
 
Dual 780 Tis are roughly 10 TFLOPS. I expect about a 13 TFLOPS SLI (or Crossfire) setup by the time The Witcher 3 is released. That gives us a nice 1.3 to 13 TFLOPS range to work with. It's important to remember that The Witcher 3 is (or will be) one of the first multiplatform AAA games that only support current-gen consoles and PC. Thus it makes perfect sense to avoid a situation where lower-tier PCs would drag the design down, since we're finally free from the old-gen consoles.

I don't think 1.3 TFLOPS is too much to ask in mid-2015 or later. It's likely that if The Witcher 3 is a must-play game, some will upgrade their rigs or buy a console for it. Or they'll wait a few years and buy it cheap in a Steam sale.

For example, Crysis was released in 2007 and it took years for even the highest-powered systems to run it at enthusiast details, 1080p and 60 fps. I couldn't do it with my 670 (a 2012 high/midrange card).
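Spelling that range out (the 780 Ti's ~5 TFLOPS and the XB1's ~1.31 TFLOPS are launch specs; the 13 TFLOPS 2015 setup is a projection, not a released product):

# Ratio of top-end PC setups to the weakest console target.
xb1 = 1.31
dual_780ti = 2 * 5.0
projected_2015_sli = 13.0
print(f"today's spread:   {dual_780ti / xb1:.1f}x")          # ~7.6x
print(f"projected spread: {projected_2015_sli / xb1:.1f}x")  # ~9.9x, call it 10x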
 
I am talking about the difference at the exact same resolution and settings, e.g.
http://www.purepc.pl/karty_graficzn...ectcu_ii_oc_najlepsza_z_najlepszych?page=0,10
I see, but you still have to consider that the equation is not linear. In fact, producing more frames puts more load on the hardware than mere resolution or quality increases; it often takes more than double the hardware resources to produce double the frames.

See this, for example:

We reach double the frame rate compared to graphically demanding titles such as Infamous Second Son, Killzone (SP), Assassin's Creed and Ryse: Son of Rome.

Rough mapping:
720p @ 60 fps is slightly more demanding on the GPU than 1080p @ 30 fps
900p @ 60 fps is almost 50% more demanding on the GPU than 1080p @ 30 fps
1080p @ 60 fps is twice as demanding on the GPU as 1080p @ 30 fps

For our game, a stable 60 fps is the most important thing. Pushing twice as many frames requires twice as many GPU and CPU cycles. This makes our game very demanding on the hardware.
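Worth sanity-checking that mapping against raw pixel throughput (pure arithmetic, nothing assumed beyond the resolutions):

# Pixel throughput of each target relative to 1080p @ 30fps.
base = 1920 * 1080 * 30  # 1080p @ 30fps
targets = {"720p @ 60": (1280, 720, 60),
           "900p @ 60": (1600, 900, 60),
           "1080p @ 60": (1920, 1080, 60)}
for label, (w, h, fps) in targets.items():
    print(f"{label}: {w * h * fps / base:.2f}x of 1080p @ 30")
# 0.89x, 1.39x and 2.00x in pixels alone; the dev's figures run slightly
# higher because every extra frame also pays per-frame geometry/CPU cost,
# not just shading cost.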
 
Now, out of the gate, consoles are way behind, so don't be surprised if games in 2015 start to utilize PC hardware to the point of leaving consoles biting the dust.
So why do some of the PS360 (8-year-old hardware) titles look better than most of today's PC games?

I've got a new PC; where can I play something as impressive as, say, Infamous?

(crickets) but but but the hardware's more powerful, I know, um, Battlefield 4 :)
 