Are PCs holding back the console experience? (Witcher3 spawn)

I need to be clear on one thing here: to me, The Witcher 3 looks fantastic, and the only reason I'm probably not a day-one buyer is that I'm still having a great time with GTA V.
I'm not arguing that the game looks better or worse than what they showed at E3 2013. I don't think the 2013 comparisons are relevant, as it seems to be practically the same great-looking game. The screenshots seem to have been taken at different times of day, so the different lighting conditions aren't really comparable. This is definitely not the same situation we saw with Colonial Marines or Watch Dogs.
Maybe the problem is that - because of Gearbox and Ubisoft - many people are now very vigilant about demo-to-gold differences. It's a consumer reaction, and you'd do well to expect it on every AAA release for years to come. The Internet doesn't forgive or forget easily, and both developers and publishers should keep that in mind, for their own sake.


In my posts, I'm talking specifically about the general idea of developers including advanced IQ options in their games that are impossible to run at decent framerates on the hardware available at release.
I think it's a good idea, generally.



Rate of streaming assets? You'd have to load in more (significantly more on a 180-degree turn), which would choke the streaming engine and require something to compensate, like more aggressive precaching.
Still, the option to have it (again, above the pre-determined Ultra options) wouldn't hurt.
There's a bottleneck in the streaming engine? Then the framerate will suffer whenever the player does a 180º turn, or the player will notice stuff popping up on the horizon. And then the most hardcore PC gamers will long for faster SSDs or humongous amounts of RAM to install the game on a RAM drive, just to push that slider to the max.
I think PC gamers love advanced IQ options, impossible-to-meet settings and the choice to either get more framerate or better looks. If they didn't, why would they spend so much money on hardware and time on solving driver/OS/installation issues?
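To put the streaming argument in concrete terms, here's a minimal sketch in Python of why an extreme draw-distance slider hurts: the resident data grows roughly with the square of the distance, and a 180-degree snap turn can invalidate a big chunk of it at once. All the numbers (cell size, megabytes per cell, disk speed) are made up for illustration, not taken from any real engine.

import math

# Hypothetical numbers, purely for illustration -- not taken from any real engine.
CELL_SIZE_M = 64.0     # world streamed in square cells of 64 m
MB_PER_CELL = 2.0      # average asset payload per cell (meshes + textures)
DISK_MB_PER_S = 150.0  # sustained read speed of the storage device

def resident_megabytes(draw_distance_m):
    """Data that must stay loaded for everything within the draw distance."""
    cells = math.pi * (draw_distance_m / CELL_SIZE_M) ** 2  # cells inside the circle
    return cells * MB_PER_CELL

def seconds_to_refill_after_180(draw_distance_m):
    """Worst case: a 180-degree snap turn invalidates roughly half the resident set
    (assuming the streamer only keeps a forward-facing set cached)."""
    return 0.5 * resident_megabytes(draw_distance_m) / DISK_MB_PER_S

for dd in (800, 1200, 2400):  # "High", "Ultra", and a hypothetical beyond-Ultra slider
    print(f"{dd:>5} m: ~{resident_megabytes(dd):7.0f} MB resident, "
          f"~{seconds_to_refill_after_180(dd):4.1f} s to refill after a 180-degree turn")

With these made-up figures, tripling the draw distance roughly multiplies the resident data by nine, which is exactly the sort of thing that ends up needing a faster SSD or a RAM drive.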
 
The funny thing is, the only ones crying (like babies) are the ones trying to flex their high-end e-penis. Personally, I think TW3 looks good on all platforms, especially on the PC. What's going to be funny is that all those high-end crybabies claiming a downgrade are going to be crying about performance issues or scaling back settings once it's released.

Anyhow, returning this thread to console gamers...
Oh yes, exposing the false advertising of a company = crying like a baby. I hope you're getting a paycheck for this.

Right now, though, the consoles sit higher than what most developers would want to target if they were limited to supporting the PC space, so I think we're again in a situation where the PC is holding back consoles more than the other way around, in the sense that if you were developing a game and wanted a large enough audience, you would target minimum hardware far lower than a PS4 or Xbox One.
That's absolutely false:

http://store.steampowered.com/hwsurvey/videocard/

80% have DX11 cards. The API is not a constraint.
 
Based on the Steam Survey, only 2.24% of users have 3GB of VRAM.
That's a pretty low percentage.
 
Funny how PS4 users get admonished for talking about XB1 parity... yet, here we are with the PC crowd crying foul. Anyhow, the game looks great on all platforms... :yep2:

PC edition for me though :mrgreen:

Well, it's what, 1.8 TFLOPS for the PS4 vs 1.3 TFLOPS for the Xbox One? Now PC users are approaching 10 TFLOPS. That's quite a large difference in power compared to the difference between the consoles.

I suspect that by Q1 2016, when the Rift hits, I'll have close to 20 TFLOPS of GPU power in my PC.
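For what it's worth, here's the arithmetic behind that comparison, using the rough figures quoted above (the 10 and 20 TFLOPS PC numbers are the poster's own estimates, not measured specs):

# Rough peak FP32 figures as quoted above, in TFLOPS.
ps4, xbox_one = 1.8, 1.3
pc_now, pc_2016 = 10.0, 20.0  # assumed high-end / multi-GPU PC estimates

print(f"PS4 vs Xbox One : {ps4 / xbox_one:.2f}x")   # ~1.38x between the consoles
print(f"PC vs PS4 (now) : {pc_now / ps4:.2f}x")     # ~5.6x
print(f"PC vs PS4 (2016): {pc_2016 / ps4:.2f}x")    # ~11.1x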
 
Based on the Steam Survey, only 2.24% of users have 3GB of VRAM.
That's a pretty low percentage.

And about 6% have 4GB...

About 8% have 3GB or more, and around 50% have 1GB or less. Percentages by themselves don't mean a lot unless you know how much each group is spending. I wouldn't be surprised if the 3GB+ group actually generates more revenue than the 1GB-or-less group.

I'd say that for a game like The Witcher 3, aiming for the 3GB+ group while making sure the game runs decently on 2GB cards as well, netting 30%+ of the market, is actually pretty good. If you're still running a 1GB card, you're probably not that likely to buy a game like The Witcher 3 at full price.
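A toy example of that revenue-weighting point, with entirely invented spend-per-user figures, just to show why raw ownership percentages can be misleading:

# Hypothetical VRAM segments: (share of surveyed users, assumed yearly spend on
# full-priced AAA titles). The spend figures are invented for illustration only.
segments = {
    "3GB+ VRAM":  (0.08, 120.0),  # small group, buys big releases at full price
    "2GB VRAM":   (0.25, 60.0),
    "<=1GB VRAM": (0.50, 15.0),   # large group, mostly older/cheaper titles
}

total = sum(share * spend for share, spend in segments.values())
for name, (share, spend) in segments.items():
    print(f"{name:>10}: {share:4.0%} of users, {share * spend / total:5.1%} of estimated revenue")

In this made-up split, the 8% high-end slice already out-spends the 50% low-end slice, which is the point being made here.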
 
And about 6% have 4GB...

About 8% have 3GB or more, and around 50% have 1GB or less. Percentages by themselves don't mean a lot unless you know how much each group is spending. I wouldn't be surprised if the 3GB+ group actually generates more revenue than the 1GB-or-less group.

I'd say that for a game like The Witcher 3, aiming for the 3GB+ group while making sure the game runs decently on 2GB cards as well, netting 30%+ of the market, is actually pretty good. If you're still running a 1GB card, you're probably not that likely to buy a game like The Witcher 3 at full price.

I also wonder how much cheap Windows tablets are affecting the stats. I have a 1GB Windows tablet that works perfectly well for streaming from my main PC, so sometimes I'll sit on the couch and play something while my gf watches TV.
 
I'll try to find a link, but CDPR have said on more than one occasion, and quite recently somewhere, that The Witcher 3 is a massive project for them given the relatively small team working on it. I guess this is relative to other huge open-world games like GTA V, Watch Dogs, Far Cry and Assassin's Creed, all of which have teams that absolutely dwarf the Polish developer's.
From the interviews I've read, what they often claim is that their budget is a lot smaller than that of games developed by large American, French, German or British teams.
I think that has more to do with the discrepancy in average salaries between those countries and Poland than anything else. Polish developers earn somewhere between a third and a quarter of what their counterparts in those countries do.
Also, CDPR are practically their own publisher, so they don't have the same tendency to spend huge proportions of their budget on marketing like the big publishers do.
They can't compete on marketing budgets, so I'd guess they're putting all their chips into making a great game and letting the critics do the marketing for them.

And there are people wanting 4K to be the standard next gen... Imagine the same lighting and draw distances next gen...
4K is just a resolution, which PC gamers have been able to change at will for decades. Let's not pretend that 4K, like its "HD" predecessor, is anything more than that.


Based on the Steam Survey, only 2.24% of users have 3GB of VRAM.
Why should ANY game on PC target that when so few are going to benefit?
And you're conveniently leaving out that 5.7% have 4GB?
How do the proportions of people with 3GB+ cards matter if you don't know how many people are taking the survey?
Steam probably has over 70 million users (it topped 65M a year ago). If half of those are taking the survey, that's 35 million. 8% of users with 3GB+ VRAM means there are 2.8 million users with high-end graphics cards.
Plus, those 2.8 million users are a lot more likely to buy games at full price than the ones with 2GB midrange cards, and those with 1GB or less aren't likely to buy full-priced games at all.
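To make that back-of-the-envelope math explicit (the 70 million accounts and the 50% survey-participation guess come straight from the post above; neither is a published figure):

# Back-of-the-envelope sizing of the 3GB+ VRAM segment; every input is a rough guess.
steam_users = 70_000_000  # "probably over 70 million" accounts
survey_share = 0.5        # assume roughly half of them get surveyed
vram_3gb_plus = 0.08      # ~8% report 3GB or more of VRAM

surveyed = steam_users * survey_share
high_end = surveyed * vram_3gb_plus
print(f"Surveyed users : {surveyed / 1e6:.1f} million")   # 35.0 million
print(f"3GB+ VRAM users: {high_end / 1e6:.1f} million")   # 2.8 million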

And also regarding the Steam Survey, here's an anecdotal experience that bothers me a lot.
I have three systems: a laptop with a 2GB GTX 650M, an HTPC with a 2GB GTX 660 Ti and my desktop with a 4GB R9 290X.
For the past year I've played games mostly on my desktop, yet I've never been asked to do the survey on the system with the R9 290X.
Last month I logged into Steam on my laptop once, after months of not using it, and the first thing that appeared after the much-needed update was the friggin' Steam Survey.
I cancelled the survey because I figured it would be stupid to let the statistics count a system I don't play games on, but there are probably a lot of people who just accept it on the wrong PC (Windows tablets, work PCs, etc.).
So I wonder how much we can count on the Steam Survey for what really matters: which demographics spend money on games.
 
Reading this thread saddens me because I feel like I'm holding you guys back.
So I must say I'm sorry to the PC uber race and the console peasants for holding you guys back...
Sorry guys

-Kaveri gamer
 
How can consoles hold back PCs when I read in every gaming forum how much better PC gaming is, with versions so vastly superior to consoles, and "lol consoles" etc...?

Pure contradiction.
 
Well, there's really no way around this one. Some assets (in this case, that vertical edge on the building) were really downgraded between the 2014 gameplay demos and the release:

[Comparison screenshot of the building edge: QcOH0nT.png]
 
Oh yes, exposing the false advertising of a company = crying like a baby. I hope you're getting a paycheck for this.


That's absolutely false:

http://store.steampowered.com/hwsurvey/videocard/

80% have DX11 cards. The API is not a constraint.

Really? DirectX 11 is a CPU-usage nightmare, remember? ;) And if 10% have 3GB of VRAM or more, that's still only a fraction of the PS4 install base alone, which is over 22 million units with theoretically at least 4GB of VRAM and no hard boundary between GPU and CPU usage of that RAM. Development being easier on PC, I am convinced that at this very moment PC specs are holding back consoles more than console specs are holding back PC games. Though just maybe the Xbox One is holding back CU programming usage. ;)

I don't know nearly enough about the API programming side to say for sure, but consoles, and PlayStation in particular (the PS4, but the PS3 just as well), have basically always had DirectX 12-style low-level access (and still more open than that). But purely spec-wise, taking the top 20 million PS4s against the top 20 million PCs, it's likely the average PS4's performance will be higher for at least this year and next, say?
 
10 years' time sounds a bit exaggerated, but does the prospect of an AAA game that will only be maxed out in 3-4 years sound that bad for the PC market?

Because, to be honest, Crysis 1 certainly did bring Crytek a lot of fame and free publicity (and sales!) when it released as a game that could only be maxed out sometime in the future. "But will it play Crysis?" became a long-running meme.
Plus, the game is almost 8 years old but it still looks awesome today, and Warhead is still a part of the benchmark suites for dozens of websites, representing the pinnacle of what DX10 is capable of.

Chris Roberts has claimed that he wants Star Citizen to be the next "Will it play Crysis?" game. But Chris Roberts wants the game to be the next everything, which makes me wonder if it'll even become the next anything-at-all.

The Witcher 3 had the conditions to become the next Crysis. They certainly seemed to have the right assets for it.
Not quite. It will take a long while before an open-world game can achieve Crysis-quality graphics. When you have to draw and create textures and assets for thousands and thousands of square kilometres of terrain, I don't think you can make the game appealing and realistic at the same time if you don't have tons of money. :(

Star Citizen is coming to the Xbox One as well, and what I've seen of it is out of this world (check the screengrabs I shared in the Star Citizen thread in the console games forum).

Maybe when close-to-metal APIs similar to DirectX 12 are the norm, most hardware will make use of their features, and then we could talk... :)

It's not the consoles; it's about creating something for the stable average rather than for a few fortunate people who don't mind running games at 20 fps as long as they look like the E3 trailers.
 
Oh yes, exposing the false advertising of a company = crying like a baby. I hope you're getting a paycheck for this.


That's absolutely false:

http://store.steampowered.com/hwsurvey/videocard/

80% have DX11 cards. The API is not a constraint.
It's not always the cards holding back the experience, but the drivers... I am going to post a video of how much better Project Cars runs on Windows 10 than on Windows 8.1, using DX11, on AMD hardware, just because the Windows 10 AMD drivers are substantially improved.

Continuing from @ToTTenTranz's post, I shall add more images, edited by an MS Paint user:

[Comparison screenshots: ozAzLD2.jpg, admvLGj.jpg, gLQRd7N.jpg]
 
Not quite. It will take a long while before an open-world game can achieve Crysis-quality graphics. When you have to draw and create textures and assets for thousands and thousands of square kilometres of terrain, I don't think you can make the game appealing and realistic at the same time if you don't have tons of money.

Crysis is an open world game...



As for the other comparison images, I didn't post them because the quality differences are rather arguable, at least from looking at the pictures.
Those comparison screenshots were taken at different times of the day and/or not in the exact same spot...
Maybe the assets really are the same in both, and the 2013-2014 previews were simply handpicked from ideal angles and lighting conditions.

As for the vertical edge in the picture I showed, there's really no possible confusion: they took the asset out of the game.
 
As I stated before, I believe the initial TW3 2013 E3 reveal was (and is) more of a scripted piece for cinematic purposes, with very controlled lighting, shadowing, and gobs of effects that probably required offline re-rendering to speed things up (framerate). Honestly, it's Killzone 2, Watch Dogs, ACU and MKX all over again... but this is nothing new to the industry.

If you guys (TW3 downgrade headhunters) can provide live 2013 E3 video showing CD Projekt RED staff or attendees actually playing the reveal (with a gamepad), then I will bow out gracefully from this conversation... :yep2:
 
The "consoles holding back PC" argument is silly. Development cost is what's holding back the entire industry. Consoles move the development baseline, and since platform exclusivity for huge-budget titles is nearing its end (unless it's a first-party game), this is the only way forward. Pushing the envelope doesn't make economic sense if only a small fraction of the consumer base can benefit from it, and this is coming from a person who uses a PC as their primary platform. The people complaining about the Witcher downgrades make up a very, very small fraction of consumers who could actually run the game at a decent frame rate at the quality the early bullshots implied.
 
As I stated before, I believe the initial TW3 2013 E3 reveal was (and is) more of a scripted piece for cinematic purposes, with very controlled lighting, shadowing, and gobs of effects that probably required offline re-rendering to speed things up (framerate). Honestly, it's Killzone 2, Watch Dogs, ACU and MKX all over again... but this is nothing new to the industry.

If you guys (TW3 downgrade headhunters) can provide live 2013 E3 video showing CD Projekt RED staff or attendees actually playing the reveal (with a gamepad), then I will bow out gracefully from this conversation... :yep2:
This situation reminds me of MS showing Forza 5 on Jimmy Kimmel's show (iirc) and people then thinking it was downgraded.

I wonder how these scenes translate into the actual game when it's released:


Ah...and this... :)

[Screenshot: walwys.png]


@ToTTenTranz, some parts of Crysis could be considered open world by the usual definition. Far Cry 2 was open world too. Perhaps what played to those games' advantage was the kind of repetitive assets they used?
 